Gelbard-Sagiv, Hagar; Faivre, Nathan; Mudrik, Liad; Koch, Christof
2016-01-01
The scope and limits of unconscious processing are a matter of ongoing debate. Lately, continuous flash suppression (CFS), a technique for suppressing visual stimuli, has been widely used to demonstrate surprisingly high-level processing of invisible stimuli. Yet, recent studies showed that CFS might actually allow low-level features of the stimulus to escape suppression and be consciously perceived. The influence of such low-level awareness on high-level processing might easily go unnoticed, as studies usually only probe the visibility of the feature of interest, and not that of lower-level features. For instance, face identity is held to be processed unconsciously since subjects who fail to judge the identity of suppressed faces still show identity priming effects. Here we challenge these results, showing that such high-level priming effects are indeed induced by faces whose identity is invisible, but critically, only when a lower-level feature, such as color or location, is visible. No evidence for identity processing was found when subjects had no conscious access to any feature of the suppressed face. These results suggest that high-level processing of an image might be enabled by, or co-occur with, conscious access to some of its low-level features, even when these features are not relevant to the processed dimension. Accordingly, they call for further investigation of lower-level awareness during CFS, and reevaluation of other unconscious high-level processing findings.
Parallel Processing at the High School Level.
ERIC Educational Resources Information Center
Sheary, Kathryn Anne
This study investigated the ability of high school students to cognitively understand and implement parallel processing. Data indicates that most parallel processing is being taught at the university level. Instructional modules on C, Linux, and the parallel processing language, P4, were designed to show that high school students are highly…
Implementing An Image Understanding System Architecture Using Pipe
NASA Astrophysics Data System (ADS)
Luck, Randall L.
1988-03-01
This paper will describe PIPE and how it can be used to implement an image understanding system. Image understanding is the process of developing a description of an image in order to make decisions about its contents. The tasks of image understanding are generally split into low level vision and high level vision. Low level vision is performed by PIPE -a high performance parallel processor with an architecture specifically designed for processing video images at up to 60 fields per second. High level vision is performed by one of several types of serial or parallel computers - depending on the application. An additional processor called ISMAP performs the conversion from iconic image space to symbolic feature space. ISMAP plugs into one of PIPE's slots and is memory mapped into the high level processor. Thus it forms the high speed link between the low and high level vision processors. The mechanisms for bottom-up, data driven processing and top-down, model driven processing are discussed.
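A minimal sketch of the low-level/high-level split described above, with the iconic-to-symbolic conversion step between them; all function names and data structures here are invented for illustration and do not reflect the actual PIPE or ISMAP interfaces.

```python
# Illustrative two-stage image-understanding pipeline (not the PIPE/ISMAP API):
# a low-level stage produces iconic (pixel-based) maps, a symbolic-extraction
# stage converts them into features, and a high-level stage reasons over them.
from dataclasses import dataclass, field

@dataclass
class SymbolicFeature:
    label: str                      # e.g. "edge_segment", "region"
    position: tuple                 # image coordinates
    attributes: dict = field(default_factory=dict)

def low_level_vision(frame):
    """Pixel-level processing (filtering, edge detection) on one video field.
    In PIPE this runs on dedicated hardware at up to 60 fields per second."""
    return {"edges": [(10, 12), (14, 15)]}          # placeholder iconic map

def iconic_to_symbolic(iconic_maps):
    """Convert iconic maps into symbolic features (the role ISMAP plays)."""
    return [SymbolicFeature("edge_segment", pos) for pos in iconic_maps["edges"]]

def high_level_vision(features, model):
    """Model-driven interpretation on a general-purpose processor."""
    return {"description": f"{len(features)} features matched against {model}"}

frame = object()                    # stand-in for one video field
print(high_level_vision(iconic_to_symbolic(low_level_vision(frame)), model="object models"))
```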
NASA Astrophysics Data System (ADS)
Nawani, Jigna; Rixius, Julia; Neuhaus, Birgit J.
2016-08-01
Empirical analysis of secondary biology classrooms revealed that, on average, 68% of teaching time in Germany revolved around processing tasks. Quality of instruction can thus be assessed by analyzing the quality of tasks used in classroom discourse. This quasi-experimental study analyzed how teachers used tasks in 38 videotaped biology lessons pertaining to the topic 'blood and circulatory system'. Two fundamental characteristics used to analyze tasks include: (1) required cognitive level of processing (e.g. low level information processing: repetition, summary, define, classify; and high level information processing: interpret-analyze data, formulate hypothesis, etc.) and (2) complexity of task content (e.g. whether tasks require use of factual, linking or concept level content). Additionally, students' cognitive knowledge structure about the topic 'blood and circulatory system' was measured using student-drawn concept maps (N = 970 students). Finally, linear multilevel models were created with high-level cognitive processing tasks and higher content complexity tasks as class-level predictors and students' prior knowledge, students' interest in biology, and students' interest in biology activities as control covariates. Results showed a positive influence of high-level cognitive processing tasks (β = 0.07; p < .01) on students' cognitive knowledge structure. However, there was no observed effect of higher content complexity tasks on students' cognitive knowledge structure. The presented findings encourage the use of high-level cognitive processing tasks in biology instruction.
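For illustration, a two-level model of the general kind described (students nested within classes) could be written as below; the symbols are chosen here for exposition and are not taken from the paper.

```latex
% y_{ij}: cognitive knowledge-structure score of student i in class j
% Class-level predictors: share of high-level cognitive processing tasks and of
% higher content-complexity tasks; student-level control covariates: prior
% knowledge and the two interest measures.
\[
y_{ij} = \gamma_{00}
       + \gamma_{01}\,\text{HighLevelTasks}_{j}
       + \gamma_{02}\,\text{ContentComplexity}_{j}
       + \beta_{1}\,\text{PriorKnowledge}_{ij}
       + \beta_{2}\,\text{InterestBiology}_{ij}
       + \beta_{3}\,\text{InterestActivities}_{ij}
       + u_{j} + e_{ij},
\qquad u_{j} \sim \mathcal{N}(0,\tau^{2}),\; e_{ij} \sim \mathcal{N}(0,\sigma^{2})
\]
```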
High-Level Waste System Process Interface Description
DOE Office of Scientific and Technical Information (OSTI.GOV)
d'Entremont, P.D.
1999-01-14
The High-Level Waste System is a set of six different processes interconnected by pipelines. These processes function as one large treatment plant that receives, stores, and treats high-level wastes from various generators at SRS and converts them into forms suitable for final disposal. The three major forms are borosilicate glass, which will be eventually disposed of in a Federal Repository, Saltstone to be buried on site, and treated water effluent that is released to the environment.
NASA Astrophysics Data System (ADS)
Haqiqiansyah, G.; Sugiharto, E.
2018-04-01
This research was conducted to identify and examine the empowerment of a women's fish-product processing group in the District of Sanga-Sanga in 2017. The method used was a survey, consisting of direct observation and interviews with respondents. Both primary and secondary data were collected, then processed, tabulated, and presented in tables and graphs. The degree of women's empowerment was measured on a three-level Likert scale, with score 1 = low, score 2 = less, and score 3 = high. The results showed that the overall empowerment of the women's fish-product processing group was high (score 42.75). By component, the level of awareness or willingness to change within the processing enterprise group, an indicator of empowerment, was categorized as high (91.67%); the capability to increase opportunities for gaining access was high (66.67%); the capability to overcome obstacles tended to be categorized as less (50%); and the capability to collaborate was high (66.67%). This indicates that the level of coastal women's empowerment can be relied upon to bring about change.
The levels of perceptual processing and the neural correlates of increasing subjective visibility.
Binder, Marek; Gociewicz, Krzysztof; Windey, Bert; Koculak, Marcin; Finc, Karolina; Nikadon, Jan; Derda, Monika; Cleeremans, Axel
2017-10-01
According to the levels-of-processing hypothesis, transitions from unconscious to conscious perception may depend on stimulus processing level, with more gradual changes for low-level stimuli and more dichotomous changes for high-level stimuli. In an event-related fMRI study we explored this hypothesis using a visual backward masking procedure. Task requirements manipulated level of processing. Participants reported the magnitude of the target digit in the high-level task, its color in the low-level task, and rated subjective visibility of stimuli using the Perceptual Awareness Scale. Intermediate stimulus visibility was reported more frequently in the low-level task, confirming prior behavioral results. Visible targets recruited insulo-fronto-parietal regions in both tasks. Task effects were observed in visual areas, with higher activity in the low-level task across all visibility levels. Thus, the influence of level of processing on conscious perception may be mediated by attentional modulation of activity in regions representing features of consciously experienced stimuli. Copyright © 2017 Elsevier Inc. All rights reserved.
Metacognitive Analysis of Pre-Service Teachers of Chemistry in Posting Questions
NASA Astrophysics Data System (ADS)
Santoso, T.; Yuanita, L.
2017-04-01
Questions addressed to a topic can engage metacognitive functions for monitoring a person's thinking process. This study aims to describe the structure of students' questions in terms of thinking level and level of chemistry understanding, and to describe how students use their metacognitive knowledge when asking questions. The research is a case study in chemistry learning involving 87 students. The analysis revealed that the thinking-level structure of student questions consists of knowledge questions, understanding and application questions, and higher-order thinking questions; the chemistry-understanding levels of student questions are symbolic, macro, macro-micro, macro-process, micro-process, and macro-micro-process. Students' questioning skills in response to scientific articles were of higher quality than their questioning skills in response to the teaching materials. Analysis of six student interviews showed that student questions reflect metacognitive processes in three categories: (1) low-level metacognitive processes, in which questions are composed by focusing on a particular phrase or changing the wording; (2) intermediate-level metacognitive processes, in which posing a question requires knowledge and understanding; and (3) high-level metacognitive processes, in which questions are posed by identifying the central topic or abstracting the essence of scientific articles.
Neuropsychological Components of Imagery Processing, Final Technical Report.
ERIC Educational Resources Information Center
Kosslyn, Stephen M.
High-level visual processes make use of stored information, and are invoked during object identification, navigation, tracking, and visual mental imagery. The work presented in this document has resulted in a theory of the component "processing subsystems" used in high-level vision. This theory was developed by considering…
Enhanced Perceptual Processing of Speech in Autism
ERIC Educational Resources Information Center
Jarvinen-Pasley, Anna; Wallace, Gregory L.; Ramus, Franck; Happe, Francesca; Heaton, Pamela
2008-01-01
Theories of autism have proposed that a bias towards low-level perceptual information, or a featural/surface-biased information-processing style, may compromise higher-level language processing in such individuals. Two experiments, utilizing linguistic stimuli with competing low-level/perceptual and high-level/semantic information, tested…
Adapting high-level language programs for parallel processing using data flow
NASA Technical Reports Server (NTRS)
Standley, Hilda M.
1988-01-01
EASY-FLOW, a very high-level data flow language, is introduced for the purpose of adapting programs written in a conventional high-level language to a parallel environment. The level of parallelism provided is of the large-grained variety in which parallel activities take place between subprograms or processes. A program written in EASY-FLOW is a set of subprogram calls as units, structured by iteration, branching, and distribution constructs. A data flow graph may be deduced from an EASY-FLOW program.
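A rough sketch of the large-grained data-flow idea, written in Python rather than EASY-FLOW syntax (which is not shown in the abstract): each node of the graph is a whole subprogram call that may run as soon as its inputs are available, so independent subprograms execute in parallel.

```python
# Hypothetical large-grained data-flow execution: nodes are subprogram calls,
# edges carry data between them, and any node whose inputs are ready may run.
from concurrent.futures import ThreadPoolExecutor

def read_sensor():      return [1, 2, 3, 4]
def smooth(xs):         return [x / 2 for x in xs]
def detect_peaks(xs):   return [x for x in xs if x > 1]
def report(a, b):       return {"smoothed": a, "peaks": b}

# Data-flow graph: node -> (subprogram, names of upstream nodes)
graph = {
    "raw":    (read_sensor, []),
    "smooth": (smooth, ["raw"]),
    "peaks":  (detect_peaks, ["raw"]),       # independent of "smooth": can run in parallel
    "report": (report, ["smooth", "peaks"]),
}

def run(graph):
    done = {}
    pending = dict(graph)
    with ThreadPoolExecutor() as pool:
        while pending:
            # Fire every node whose upstream results are already available.
            ready = [n for n, (_, deps) in pending.items() if all(d in done for d in deps)]
            futures = {n: pool.submit(pending[n][0], *[done[d] for d in pending[n][1]])
                       for n in ready}
            for n, f in futures.items():
                done[n] = f.result()
                del pending[n]
    return done["report"]

print(run(graph))
```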
Uckoo, Ram M; Jayaprakasha, Guddadarangavvanahally K; Balasubramaniam, V M; Patil, Bhimanagouda S
2012-09-01
Grapefruits (Citrus paradisi Macfad) contain several phytochemicals known to have health-maintaining properties. Due to consumers' interest in obtaining high levels of these phytochemicals, it is important to understand the changes in their levels caused by common household processing techniques. Therefore, mature Texas "Rio Red" grapefruits were processed by some of the common household processing practices such as blending, juicing, and hand squeezing techniques and analyzed for their phytochemical content by high performance liquid chromatography (HPLC). Results suggest that grapefruit juice processed by blending had significantly (P < 0.05) higher levels of flavonoids (narirutin, naringin, hesperidin, neohesperidin, didymin, and poncirin) and limonin compared to juicing and hand squeezing. No significant variation in their content was noticed in the juice processed by juicing and hand squeezing. Ascorbic acid and citric acid were significantly (P < 0.05) higher in juice processed by juicing and blending, respectively. Furthermore, hand-squeezed fruit juice had significantly higher contents of dihydroxybergamottin (DHB) than juice processed by juicing and blending. Bergamottin and 5-methoxy-7-geranyloxycoumarin (5-M-7-GC) were significantly higher in blended juice compared to juicing and hand squeezing. Therefore, consuming grapefruit juice processed by blending may provide higher levels of health-beneficial phytochemicals such as naringin, narirutin, and poncirin. In contrast, juice processed by hand squeezing and juicing provides lower levels of limonin, bergamottin, and 5-M-7-GC. These results suggest that processing techniques significantly influence the levels of phytochemicals and blending is a better technique for obtaining higher levels of health-beneficial phytochemicals from grapefruits. Practical Application: Blending, squeezing, and juicing are common household processing techniques used for obtaining fresh grapefruit juice. Understanding the levels of health-beneficial phytochemicals present in the juice processed by these techniques would enable consumers to make a better choice to obtain high levels of these compounds. © 2012 Institute of Food Technologists®
2015-09-01
This report made use of posttest processing techniques to provide packet-level time tagging with an accuracy close to 3 µs relative to Coordinated Universal Time for each set of test records.
Internal curvature signal and noise in low- and high-level vision
Grabowecky, Marcia; Kim, Yee Joon; Suzuki, Satoru
2011-01-01
How does internal processing contribute to visual pattern perception? By modeling visual search performance, we estimated internal signal and noise relevant to perception of curvature, a basic feature important for encoding of three-dimensional surfaces and objects. We used isolated, sparse, crowded, and face contexts to determine how internal curvature signal and noise depended on image crowding, lateral feature interactions, and level of pattern processing. Observers reported the curvature of a briefly flashed segment, which was presented alone (without lateral interaction) or among multiple straight segments (with lateral interaction). Each segment was presented with no context (engaging low-to-intermediate-level curvature processing), embedded within a face context as the mouth (engaging high-level face processing), or embedded within an inverted-scrambled-face context as a control for crowding. Using a simple, biologically plausible model of curvature perception, we estimated internal curvature signal and noise as the mean and standard deviation, respectively, of the Gaussian-distributed population activity of local curvature-tuned channels that best simulated behavioral curvature responses. Internal noise was increased by crowding but not by face context (irrespective of lateral interactions), suggesting prevention of noise accumulation in high-level pattern processing. In contrast, internal curvature signal was unaffected by crowding but modulated by lateral interactions. Lateral interactions (with straight segments) increased curvature signal when no contextual elements were added, but equivalent interactions reduced curvature signal when each segment was presented within a face. These opposing effects of lateral interactions are consistent with the phenomena of local-feature contrast in low-level processing and global-feature averaging in high-level processing. PMID:21209356
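A minimal sketch of the modeling idea described above; the channel tuning, gain, and noise values below are invented for illustration and are not the fitted parameters of the study.

```python
# Illustrative population-code model: curvature-tuned channels respond to a
# stimulus, the population activity is Gaussian-distributed with some mean
# (internal signal) and standard deviation (internal noise), and the reported
# curvature is read out from the noisy population response.
import numpy as np

rng = np.random.default_rng(0)
preferred = np.linspace(-1.0, 1.0, 21)     # curvatures the channels are tuned to

def population_response(true_curvature, signal_gain=1.0, internal_noise=0.2, tuning_width=0.3):
    clean = signal_gain * np.exp(-(preferred - true_curvature) ** 2 / (2 * tuning_width ** 2))
    return clean + rng.normal(0.0, internal_noise, size=preferred.shape)

def decoded_curvature(response):
    return preferred[np.argmax(response)]  # simple peak read-out

trials = [decoded_curvature(population_response(0.4)) for _ in range(1000)]
print("mean report:", np.mean(trials), "report SD:", np.std(trials))
```

Raising the assumed internal noise broadens the distribution of simulated reports, which is the sense in which behavioral response variability constrains the noise estimate.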
EmptyHeaded: A Relational Engine for Graph Processing
Aberger, Christopher R.; Tu, Susan; Olukotun, Kunle; Ré, Christopher
2016-01-01
There are two types of high-performance graph processing engines: low- and high-level engines. Low-level engines (Galois, PowerGraph, Snap) provide optimized data structures and computation models but require users to write low-level imperative code, hence ensuring that efficiency is the burden of the user. In high-level engines, users write in query languages like datalog (SociaLite) or SQL (Grail). High-level engines are easier to use but are orders of magnitude slower than the low-level graph engines. We present EmptyHeaded, a high-level engine that supports a rich datalog-like query language and achieves performance comparable to that of low-level engines. At the core of EmptyHeaded’s design is a new class of join algorithms that satisfy strong theoretical guarantees but have thus far not achieved performance comparable to that of specialized graph processing engines. To achieve high performance, EmptyHeaded introduces a new join engine architecture, including a novel query optimizer and data layouts that leverage single-instruction multiple data (SIMD) parallelism. With this architecture, EmptyHeaded outperforms high-level approaches by up to three orders of magnitude on graph pattern queries, PageRank, and Single-Source Shortest Paths (SSSP) and is an order of magnitude faster than many low-level baselines. We validate that EmptyHeaded competes with the best-of-breed low-level engine (Galois), achieving comparable performance on PageRank and at most 3× worse performance on SSSP. PMID:28077912
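As an illustration of the class of join algorithms referred to (worst-case optimal, attribute-at-a-time joins), the sketch below evaluates a triangle pattern, roughly "Triangle(a,b,c) :- Edge(a,b), Edge(b,c), Edge(a,c)" in datalog-like notation. It is not EmptyHeaded's code or query syntax, only a toy version of the underlying join strategy.

```python
# Attribute-at-a-time triangle join: bind one variable at a time and intersect
# candidate sets, instead of materializing pairwise joins.
edges = [(1, 2), (2, 3), (1, 3), (3, 4)]
adj = {}
for a, b in edges:
    adj.setdefault(a, set()).add(b)

def triangles(adj):
    out = []
    for a in adj:                                   # bind a
        for b in adj[a]:                            # bind b with Edge(a, b)
            # bind c by intersecting candidates from Edge(b, c) and Edge(a, c)
            for c in adj.get(b, set()) & adj[a]:
                out.append((a, b, c))
    return out

print(triangles(adj))   # [(1, 2, 3)]
```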
Occupational Noise Reduction in CNC Striping Process
NASA Astrophysics Data System (ADS)
Mahmad Khairai, Kamarulzaman; Shamime Salleh, Nurul; Razlan Yusoff, Ahmad
2018-03-01
Noise-induced hearing loss from high-level exposure is a common occupational hazard. In the CNC striping process, employees exposed to high noise levels over a long period, such as an 8-hour shift, risk hearing loss as well as physical and psychological stress that reduces productivity. In this paper, the high noise levels of the CNC striping process are measured and reduced toward the permissible noise exposure. In the first condition all machines were shut down; in the second condition all CNC machines were operating. For both conditions, noise exposures were measured to identify the noise problems and their sources, and measured again after improvements were made to evaluate the effectiveness of the reduction. The initial average noise level in the first condition was 95.797 dB(A); after a leak in the pneumatic system was repaired, the noise dropped to 55.517 dB(A). The average noise level in the second condition was 109.340 dB(A); after six machines were gathered in one area and the area was enclosed with a plastic curtain, the noise dropped to 95.209 dB(A). In conclusion, the noise exposure of the CNC striping machines is high, exceeds the permissible noise exposure, and can be reduced to acceptable levels. Reducing the noise level in CNC striping processes enhanced productivity in the industry.
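As background for how several sources add up to one exposure level, the sketch below combines levels on the decibel scale (the standard 10·log10 of summed intensities); the individual source levels are assumed for illustration and are not measurements from the study.

```python
# Combining sound levels: total level is 10*log10 of the summed linear
# intensities, so removing one dominant source can cut the total sharply.
import math

def combine_db(levels_db):
    return 10 * math.log10(sum(10 ** (L / 10) for L in levels_db))

machines = [92.0] * 6        # six machines, assumed 92 dB(A) each
leak = 95.0                  # a leaking pneumatic line, assumed level

print(round(combine_db(machines + [leak]), 1))   # total with the leak
print(round(combine_db(machines), 1))            # total after fixing the leak
```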
Temporal Processing Capacity in High-Level Visual Cortex Is Domain Specific.
Stigliani, Anthony; Weiner, Kevin S; Grill-Spector, Kalanit
2015-09-09
Prevailing hierarchical models propose that temporal processing capacity--the amount of information that a brain region processes in a unit time--decreases at higher stages in the ventral stream regardless of domain. However, it is unknown if temporal processing capacities are domain general or domain specific in human high-level visual cortex. Using a novel fMRI paradigm, we measured temporal capacities of functional regions in high-level visual cortex. Contrary to hierarchical models, our data reveal domain-specific processing capacities as follows: (1) regions processing information from different domains have differential temporal capacities within each stage of the visual hierarchy and (2) domain-specific regions display the same temporal capacity regardless of their position in the processing hierarchy. In general, character-selective regions have the lowest capacity, face- and place-selective regions have an intermediate capacity, and body-selective regions have the highest capacity. Notably, domain-specific temporal processing capacities are not apparent in V1 and have perceptual implications. Behavioral testing revealed that the encoding capacity of body images is higher than that of characters, faces, and places, and there is a correspondence between peak encoding rates and cortical capacities for characters and bodies. The present evidence supports a model in which the natural statistics of temporal information in the visual world may affect domain-specific temporal processing and encoding capacities. These findings suggest that the functional organization of high-level visual cortex may be constrained by temporal characteristics of stimuli in the natural world, and this temporal capacity is a characteristic of domain-specific networks in high-level visual cortex. Significance statement: Visual stimuli bombard us at different rates every day. For example, words and scenes are typically stationary and vary at slow rates. In contrast, bodies are dynamic and typically change at faster rates. Using a novel fMRI paradigm, we measured temporal processing capacities of functional regions in human high-level visual cortex. Contrary to prevailing theories, we find that different regions have different processing capacities, which have behavioral implications. In general, character-selective regions have the lowest capacity, face- and place-selective regions have an intermediate capacity, and body-selective regions have the highest capacity. These results suggest that temporal processing capacity is a characteristic of domain-specific networks in high-level visual cortex and contributes to the segregation of cortical regions. Copyright © 2015 the authors 0270-6474/15/3512412-13$15.00/0.
van Boxtel, Jeroen J. A.; Lu, Hongjing
2013-01-01
People with Autism Spectrum Disorder (ASD) are hypothesized to have poor high-level processing but superior low-level processing, causing impaired social recognition, and a focus on non-social stimulus contingencies. Biological motion perception provides an ideal domain to investigate exactly how ASD modulates the interaction between low and high-level processing, because it involves multiple processing stages, and carries many important social cues. We investigated individual differences among typically developing observers in biological motion processing, and whether such individual differences associate with the number of autistic traits. In Experiment 1, we found that individuals with fewer autistic traits were automatically and involuntarily attracted to global biological motion information, whereas individuals with more autistic traits did not show this pre-attentional distraction. We employed an action adaptation paradigm in the second study to show that individuals with more autistic traits were able to compensate for deficits in global processing with an increased involvement in local processing. Our findings can be interpreted within a predictive coding framework, which characterizes the functional relationship between local and global processing stages, and explains how these stages contribute to the perceptual difficulties associated with ASD. PMID:23630514
Anxiety, anticipation and contextual information: A test of attentional control theory.
Cocks, Adam J; Jackson, Robin C; Bishop, Daniel T; Williams, A Mark
2016-09-01
We tested the assumptions of Attentional Control Theory (ACT) by examining the impact of anxiety on anticipation using a dynamic, time-constrained task. Moreover, we examined the involvement of high- and low-level cognitive processes in anticipation and how their importance may interact with anxiety. Skilled and less-skilled tennis players anticipated the shots of opponents under low- and high-anxiety conditions. Participants viewed three types of video stimuli, each depicting different levels of contextual information. Performance effectiveness (response accuracy) and processing efficiency (response accuracy divided by corresponding mental effort) were measured. Skilled players recorded higher levels of response accuracy and processing efficiency compared to less-skilled counterparts. Processing efficiency significantly decreased under high- compared to low-anxiety conditions. No difference in response accuracy was observed. When reviewing directional errors, anxiety was most detrimental to performance in the condition conveying only contextual information, suggesting that anxiety may have a greater impact on high-level (top-down) cognitive processes, potentially due to a shift in attentional control. Our findings provide partial support for ACT; anxiety elicited greater decrements in processing efficiency than performance effectiveness, possibly due to predominance of the stimulus-driven attentional system.
High pressure liquid level monitor
Bean, Vern E.; Long, Frederick G.
1984-01-01
A liquid level monitor for tracking the level of a coal slurry in a high-pressure vessel including a toroidal-shaped float with magnetically permeable bands thereon disposed within the vessel, two pairs of magnetic field generators and detectors disposed outside the vessel adjacent the top and bottom thereof and magnetically coupled to the magnetically permeable bands on the float, and signal processing circuitry for combining signals from the top and bottom detectors for generating a monotonically increasing analog control signal which is a function of liquid level. The control signal may be utilized to operate high-pressure control valves associated with processes in which the high-pressure vessel is used.
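A hypothetical sketch of the signal-combination step (not the patented circuitry): the top and bottom detector signals are merged into a single value that increases monotonically with liquid level.

```python
# Hypothetical combination of two detector signals into one monotonic control
# value: as the float rises, coupling to the bottom detector falls and coupling
# to the top detector grows, so their normalized difference tracks level.
def control_signal(bottom_detector, top_detector):
    total = bottom_detector + top_detector
    return 0.5 if total == 0 else top_detector / total   # 0 (empty) .. 1 (full)

print(control_signal(bottom_detector=0.9, top_detector=0.1))  # low level
print(control_signal(bottom_detector=0.1, top_detector=0.9))  # high level
```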
Online sensing and control of oil in process wastewater
NASA Astrophysics Data System (ADS)
Khomchenko, Irina B.; Soukhomlinoff, Alexander D.; Mitchell, T. F.; Selenow, Alexander E.
2002-02-01
Industrial processes that must eliminate high concentrations of oil from their waste streams find it extremely difficult to measure and control the water purification process. Most oil separation processes involve chemical separation using highly corrosive caustics, acids, surfactants, and emulsifiers. Included in the output of this chemical treatment process are highly adhesive tar-like globules, emulsified and surface oils, and other emulsified chemicals, in addition to suspended solids. The level of oil/hydrocarbon concentration in the wastewater process may fluctuate from 1 ppm to 10,000 ppm, depending upon the specifications of the industry and the level of water quality control. The authors have developed a sensing technology that provides the accuracy of scatter/absorption sensing in a contactless environment by combining these methodologies with reflective measurement. The sensitivity of the sensor may be modified by changing the fluid level control in the flow cell, allowing for a broad range of accurate measurement from 1 ppm to 10,000 ppm. Because this sensing system has been designed to work in a highly invasive environment, it can be placed close to the process source to allow for accurate real-time measurement and control.
Higher levels of depression are associated with reduced global bias in visual processing.
de Fockert, Jan W; Cooper, Andrew
2014-04-01
Negative moods have been associated with a tendency to prioritise local details in visual processing. The current study investigated the relation between depression and visual processing using the Navon task, a standard task of local and global processing. In the Navon task, global stimuli are presented that are made up of many local parts, and the participants are instructed to report the identity of either a global or a local target shape. Participants with a low self-reported level of depression showed evidence of the expected global processing bias, and were significantly faster at responding to the global, compared with the local level. By contrast, no such difference was observed in participants with high levels of depression. The reduction of the global bias associated with high levels of depression was only observed in the overall speed of responses to global (versus local) targets, and not in the level of interference produced by the global (versus local) distractors. These results are in line with recent findings of a dissociation between local/global processing bias and interference from local/global distractors, and support the claim that depression is associated with a reduction in the tendency to prioritise global-level processing.
The Action Execution Process Implemented in Different Cognitive Architectures: A Review
NASA Astrophysics Data System (ADS)
Dong, Daqi; Franklin, Stan
2014-12-01
An agent achieves its goals by interacting with its environment, cyclically choosing and executing suitable actions. An action execution process is a reasonable and critical part of an entire cognitive architecture, because the process of generating executable motor commands is not only driven by low-level environmental information, but is also initiated and affected by the agent's high-level mental processes. This review focuses on cognitive models of action, or more specifically, of the action execution process, as implemented in a set of popular cognitive architectures. We examine the representations and procedures inside the action execution process, as well as the cooperation between action execution and other high-level cognitive modules. We finally conclude with some general observations regarding the nature of action execution.
Automated defect spatial signature analysis for semiconductor manufacturing process
Tobin, Jr., Kenneth W.; Gleason, Shaun S.; Karnowski, Thomas P.; Sari-Sarraf, Hamed
1999-01-01
An apparatus and method for performing automated defect spatial signature analysis on a data set representing defect coordinates and wafer processing information includes categorizing data from the data set into a plurality of high level categories, classifying the categorized data contained in each high level category into user-labeled signature events, and correlating the categorized, classified signature events to a present or incipient anomalous process condition.
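A toy sketch of the described categorize-classify-correlate flow; the record fields, labels, and rules below are invented for illustration and are not part of the patented method.

```python
# Hypothetical flow: categorize raw defect records, classify each category into
# a user-labeled spatial-signature event, then correlate events with a process
# condition that may need attention.
defects = [
    {"x": 0.10, "y": 0.20, "tool": "etch_A"},
    {"x": 0.12, "y": 0.21, "tool": "etch_A"},
    {"x": 0.80, "y": 0.90, "tool": "litho_B"},
]

def categorize(records):
    by_tool = {}
    for r in records:
        by_tool.setdefault(r["tool"], []).append(r)
    return by_tool

def classify(cluster):
    # Toy rule: several defects attributed to the same tool form a signature event.
    return "cluster_signature" if len(cluster) > 1 else "isolated_defect"

def correlate(signatures):
    return {tool: f"inspect {tool}" for tool, sig in signatures.items()
            if sig == "cluster_signature"}

categories = categorize(defects)
signatures = {tool: classify(recs) for tool, recs in categories.items()}
print(correlate(signatures))   # {'etch_A': 'inspect etch_A'}
```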
Zhang, Xiaomeng; Bartol, Kathryn M
2010-09-01
Integrating theories addressing attention and activation with creativity literature, we found an inverted U-shaped relationship between creative process engagement and overall job performance among professionals in complex jobs in an information technology firm. Work experience moderated the curvilinear relationship, with low-experience employees generally exhibiting higher levels of overall job performance at low to moderate levels of creative process engagement and high-experience employees demonstrating higher overall performance at moderate to high levels of creative process engagement. Creative performance partially mediated the relationship between creative process engagement and job performance. These relationships were tested within a moderated mediation framework. Copyright 2010 APA, all rights reserved
Meijer, Willemien A; Van Gerven, Pascal W; de Groot, Renate H; Van Boxtel, Martin P; Jolles, Jelle
2007-10-01
The aim of the present study was to examine whether deeper processing of words during encoding in middle-aged adults leads to a smaller increase in word-learning performance and a smaller decrease in retrieval effort than in young adults. It was also assessed whether high education attenuates age-related differences in performance. Accuracy of recall and recognition, and reaction times of recognition, after performing incidental and intentional learning tasks were compared between 40 young (25-35) and 40 middle-aged (50-60) adults with low and high educational levels. Age differences in recall increased with depth of processing, whereas age differences in accuracy and reaction times of recognition did not differ across levels. High education does not moderate age-related differences in performance. These findings suggest a smaller benefit of deep processing in middle age, when no retrieval cues are available.
The Effects of Test Anxiety on Learning at Superficial and Deep Levels of Processing.
ERIC Educational Resources Information Center
Weinstein, Claire E.; And Others
1982-01-01
Using a deep-level processing strategy, low test-anxious college students performed significantly better than high test-anxious students in learning a paired-associate word list. Using a superficial-level processing strategy resulted in no significant difference in performance. A cognitive-attentional theory and test anxiety mechanisms are…
Near-optimal integration of facial form and motion.
Dobs, Katharina; Ma, Wei Ji; Reddy, Leila
2017-09-08
Human perception consists of the continuous integration of sensory cues pertaining to the same object. While it has been fairly well shown that humans use an optimal strategy when integrating low-level cues proportional to their relative reliability, the integration processes underlying high-level perception are much less understood. Here we investigate cue integration in a complex high-level perceptual system, the human face processing system. We tested cue integration of facial form and motion in an identity categorization task and found that an optimal model could successfully predict subjects' identity choices. Our results suggest that optimal cue integration may be implemented across different levels of the visual processing hierarchy.
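The "optimal strategy" referred to here is usually the maximum-likelihood cue-combination rule, in which each cue is weighted by its reliability (inverse variance). A minimal sketch of that rule follows; it is not the paper's fitted model, and the numbers are arbitrary.

```python
# Reliability-weighted cue combination: each cue's weight is r_i = 1/sigma_i^2,
# the combined estimate is the weighted average, and the combined variance
# 1/(r_form + r_motion) is never worse than either cue alone.
def integrate(estimate_form, sigma_form, estimate_motion, sigma_motion):
    r_form, r_motion = 1 / sigma_form ** 2, 1 / sigma_motion ** 2
    combined = (r_form * estimate_form + r_motion * estimate_motion) / (r_form + r_motion)
    combined_sigma = (1 / (r_form + r_motion)) ** 0.5
    return combined, combined_sigma

# Example: a reliable form cue pulls the combined identity estimate toward itself.
print(integrate(estimate_form=0.6, sigma_form=0.2, estimate_motion=0.3, sigma_motion=0.4))
```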
The effect of spatial attention on invisible stimuli.
Shin, Kilho; Stolte, Moritz; Chong, Sang Chul
2009-10-01
The influence of selective attention on visual processing is widespread. Recent studies have demonstrated that spatial attention can affect processing of invisible stimuli. However, it has been suggested that this effect is limited to low-level features, such as line orientations. The present experiments investigated whether spatial attention can influence both low-level (contrast threshold) and high-level (gender discrimination) adaptation, using the same method of attentional modulation for both types of stimuli. We found that spatial attention was able to increase the amount of adaptation to low- as well as to high-level invisible stimuli. These results suggest that attention can influence perceptual processes independent of visual awareness.
Cognitive Processes and Learner Strategies in the Acquisition of Motor Skills
1978-12-01
children: Capacity or processing deficits? Memory and Cognition, 1976, 4, 559-572. Craik, F. I. M., & Lockhart, R. S. Levels of processing: A framework...learning and memory research. In F. I. M. Craik & L. S. Cermak (Eds.), Levels of processing and theories of memory. Hillsdale, NJ: Erlbaum, 1978...functions. Cognitive activities are described at a highly theoretical (technical) level as well as in a pragmatic manner. Differences in processing
Deschrijver, Eliane; Wiersema, Jan R; Brass, Marcel
2017-04-01
For more than 15 years, motor interference paradigms have been used to investigate the influence of action observation on action execution. Most research on so-called automatic imitation has focused on variables that play a modulating role or investigated potential confounding factors. Interestingly, furthermore, a number of functional magnetic resonance imaging (fMRI) studies have tried to shed light on the functional mechanisms and neural correlates involved in imitation inhibition. However, these fMRI studies, presumably due to poor temporal resolution, have primarily focused on high-level processes and have neglected the potential role of low-level motor and perceptual processes. In the current EEG study, we therefore aimed to disentangle the influence of low-level perceptual and motoric mechanisms from high-level cognitive mechanisms. We focused on potential congruency differences in the visual N190 - a component related to the processing of biological motion, the Readiness Potential - a component related to motor preparation, and the high-level P3 component. Interestingly, we detected congruency effects in each of these components, suggesting that the interference effect in an automatic imitation paradigm is not only related to high-level processes such as self-other distinction but also to more low-level influences of perception on action and action on perception. Moreover, we documented relationships of the neural effects with (autistic) behavior.
Personal Striving Level and Self-Evaluation Process.
ERIC Educational Resources Information Center
Orias, John; Leung, Lisa; Dosanj, Shikha; McAnlis, JoAnna; Levy, Gal; Sheposh, John P.
Three studies were conducted to determine if goal striving level was related to accurate self-knowledge. The purpose of the research was to determine if the tendency of high strivers to confront stressful stimuli extends to self-evaluation processes. Three experiments were designed to investigate whether high strivers differ from low strivers in…
Multiple Theory Formation in High-Level Perception. Technical Report No. 38.
ERIC Educational Resources Information Center
Woods, William A.
This paper is concerned with the process of human reading as a high-level perceptual task. Drawing on insights from artificial-intelligence research--specifically, research in natural language processing and continuous speech understanding--the paper attempts to present a fairly concrete picture of the kinds of hypothesis formation and inference…
Advanced glycation endproducts in 35 types of seafood products consumed in eastern China
NASA Astrophysics Data System (ADS)
Wang, Jing; Li, Zhenxing; Pavase, Ramesh Tushar; Lin, Hong; Zou, Long; Wen, Jie; Lv, Liangtao
2016-08-01
Advanced glycation endproducts (AGEs) have been recognized as hazards in processed foods that can induce chronic diseases such as cardiovascular disease, diabetes, and diabetic nephropathy. In this study, we investigated the AGEs contents of 35 types of industrial seafood products that are consumed frequently in eastern China. Total fluorescent AGEs level and Nɛ-carboxymethyl-lysine (CML) content were evaluated by fluorescence spectrophotometry and gas chromatography-mass spectrometry (GC-MS), respectively. The level of total fluorescent AGEs in seafood samples ranged from 39.37 to 1178.3 AU, and was higher in canned and packaged instant aquatic products that were processed at high temperatures. The CML content in seafood samples ranged from 44.8 to 439.1 mg per kg dried sample, and was higher in roasted seafood samples. The total fluorescent AGEs and CML content increased when seafood underwent high-temperature processing, but did not show an obvious correlation. The present study suggested that commonly consumed seafood contains different levels of AGEs, and the seafood processed at high temperatures always displays a high level of either AGEs or CML.
Level indicator for pressure vessels
Not Available
1982-04-28
A liquid-level monitor for tracking the level of a coal slurry in a high-pressure vessel including a toroidal-shaped float with magnetically permeable bands thereon disposed within the vessel, two pairs of magnetic-field generators and detectors disposed outside the vessel adjacent the top and bottom thereof and magnetically coupled to the magnetically permeable bands on the float, and signal-processing circuitry for combining signals from the top and bottom detectors for generating a monotonically increasing analog control signal which is a function of liquid level. The control signal may be utilized to operate high-pressure control valves associated with processes in which the high-pressure vessel is used.
NASA Astrophysics Data System (ADS)
Shuai, W.; Jaffe, P. R.
2017-12-01
Effective ammonium (NH4+) removal has been a challenge in wastewater treatment processes. Aeration, which is required for the conventional NH4+ removal approach based on ammonium-oxidizing bacteria, is an energy-intensive step in the operation of a wastewater treatment plant. The efficiency of NH4+ oxidation in natural systems is also limited by oxygen transfer in water and sediments. The objective of this study is to enhance NH4+ removal by applying a novel microbial process, anaerobic NH4+ oxidation coupled to iron (Fe) reduction (also known as Feammox), in constructed wetlands (CW). Our studies have shown that an Acidimicrobiaceae bacterium named A6 can carry out the Feammox process using ferric Fe (Fe(III)) minerals such as ferrihydrite as its electron acceptor. To investigate the properties of the Feammox process in CW as well as the influence of electrodes, the Feammox bacterium A6 was inoculated into planted CW mesocosms with electrodes installed at multiple depths. CW mesocosms were operated with a high-NH4+ nutrient solution as inflow under high or low sediment Fe(III) levels. During operation, NH4+ and ferrous Fe concentrations, pore water pH, voltages between electrodes, oxidation-reduction potential, and dissolved oxygen were measured. At the end of the experiment, CW sediment samples at different depths were taken, DNA was extracted, and quantitative polymerase chain reaction and pyrosequencing were performed to analyze the microbial communities. The results show that the high-Fe CW mesocosm had a much higher NH4+ removal ability than the low-Fe CW mesocosm after Fe-reducing conditions developed, indicating that the enhanced NH4+ removal can be attributed to elevated Feammox activity in the high-Fe mesocosm. The microbial community structures differed between high- and low-Fe CW mesocosms and between locations on or away from the installed electrodes. The voltage between cathode and anode increased after injection of the A6 enrichment culture in the low-Fe CW mesocosm but remained stable in the high-Fe mesocosm, indicating that A6 may use the electrodes as an electron acceptor when Fe(III) is scarce. Applying the Feammox process in Fe-rich CW is promising as a cost- and energy-efficient NH4+ removal approach, and the electrogenic activity of A6 may also be useful in enhancing the Feammox process.
ERIC Educational Resources Information Center
Boets, Bart; Wouters, Jan; van Wieringen, Astrid; Ghesquiere, Pol
2007-01-01
This study investigates whether the core bottleneck of literacy-impairment should be situated at the phonological level or at a more basic sensory level, as postulated by supporters of the auditory temporal processing theory. Phonological ability, speech perception and low-level auditory processing were assessed in a group of 5-year-old pre-school…
Montalvo, Itziar; Gutiérrez-Zotes, Alfonso; Creus, Marta; Monseny, Rosa; Ortega, Laura; Franch, Joan; Lawrie, Stephen M.; Reynolds, Rebecca M.; Vilella, Elisabet; Labad, Javier
2014-01-01
Hyperprolactinaemia, a common side effect of some antipsychotic drugs, is also present in drug-naïve psychotic patients and subjects at risk for psychosis. Recent studies in non-psychiatric populations suggest that increased prolactin may have negative effects on cognition. The aim of our study was to explore whether high plasma prolactin levels are associated with poorer cognitive functioning in subjects with early psychoses. We studied 107 participants: 29 healthy subjects and 78 subjects with an early psychosis (55 psychotic disorders with <3 years of illness, 23 high-risk subjects). Cognitive assessment was performed with the MATRICS Cognitive Consensus Cognitive Battery, and prolactin levels were determined as well as total cortisol levels in plasma. Psychopathological status was assessed and the use of psychopharmacological treatments (antipsychotics, antidepressants, benzodiazepines) recorded. Prolactin levels were negatively associated with cognitive performance in processing speed, in patients with a psychotic disorder and high-risk subjects. In the latter group, increased prolactin levels were also associated with impaired reasoning and problem solving and poorer general cognition. In a multiple linear regression analysis conducted in both high-risk and psychotic patients, controlling for potential confounders, prolactin and benzodiazepines were independently related to poorer cognitive performance in the speed of processing domain. A mediation analysis showed that both prolactin and benzodiazepine treatment act as mediators of the relationship between risperidone/paliperidone treatment and speed of processing. These results suggest that increased prolactin levels are associated with impaired processing speed in early psychosis. If these results are confirmed in future studies, strategies targeting reduction of prolactin levels may improve cognition in this population. PMID:24586772
High level language for measurement complex control based on the computer E-100I
NASA Technical Reports Server (NTRS)
Zubkov, B. V.
1980-01-01
A high level language was designed to control the process of conducting an experiment using the computer "Elektronika-100I". Program examples are given to control the measuring and actuating devices. The procedure of including these programs in the suggested high level language is described.
Process for solidifying high-level nuclear waste
Ross, Wayne A.
1978-01-01
The addition of a small amount of reducing agent to a mixture of a high-level radioactive waste calcine and glass frit before the mixture is melted will produce a more homogeneous glass which is leach-resistant and suitable for long-term storage of high-level radioactive waste products.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elliott, Douglas C.; Hart, Todd R.; Neuenschwander, Gary G.
Through the use of a metal catalyst, gasification of wet algae slurries can be accomplished with high levels of carbon conversion to gas at relatively low temperature (350 C). In a pressurized-water environment (20 MPa), near-total conversion of the organic structure of the algae to gases has been achieved in the presence of a supported ruthenium metal catalyst. The process is essentially steam reforming, as there is no added oxidizer or reagent other than water. In addition, the gas produced is a medium-heating value gas due to the synthesis of high levels of methane, as dictated by thermodynamic equilibrium. As opposed to earlier work, biomass trace components were removed by processing steps so that they did not cause processing difficulties in the fixed catalyst bed tubular reactor system. As a result, the algae feedstocks, even those with high ash contents, were much more reliably processed. High conversions were obtained even with high slurry concentrations. Consistent catalyst operation in these short-term tests suggested good stability and minimal poisoning effects. High methane content in the product gas was noted with significant carbon dioxide captured in the aqueous byproduct in combination with alkali constituents and the ammonia byproduct derived from proteins in the algae. High conversion of algae to gas products was found with low levels of byproduct water contamination and low to moderate loss of carbon in the mineral separation step.
Low-level information and high-level perception: the case of speech in noise.
Nahum, Mor; Nelken, Israel; Ahissar, Merav
2008-05-20
Auditory information is processed in a fine-to-crude hierarchical scheme, from low-level acoustic information to high-level abstract representations, such as phonological labels. We now ask whether fine acoustic information, which is not retained at high levels, can still be used to extract speech from noise. Previous theories suggested either full availability of low-level information or availability that is limited by task difficulty. We propose a third alternative, based on the Reverse Hierarchy Theory (RHT), originally derived to describe the relations between the processing hierarchy and visual perception. RHT asserts that only the higher levels of the hierarchy are immediately available for perception. Direct access to low-level information requires specific conditions, and can be achieved only at the cost of concurrent comprehension. We tested the predictions of these three views in a series of experiments in which we measured the benefits from utilizing low-level binaural information for speech perception, and compared it to that predicted from a model of the early auditory system. Only auditory RHT could account for the full pattern of the results, suggesting that similar defaults and tradeoffs underlie the relations between hierarchical processing and perception in the visual and auditory modalities.
Positive Disintegration as a Process of Symmetry Breaking.
Laycraft, Krystyna
2017-04-01
This article presents an analysis of the positive disintegration as a process of symmetry breaking. Symmetry breaking plays a major role in self-organized patterns formation and correlates directly to increasing complexity and function specialization. According to Dabrowski, a creator of the Theory of Positive Disintegration, the change from lower to higher levels of human development requires a major restructuring of an individual's psychological makeup. Each level of human development is a relatively stable and coherent configuration of emotional-cognitive patterns called developmental dynamisms. Their main function is to restructure a mental structure by breaking the symmetry of a low level and bringing differentiation and then integration to higher levels. The positive disintegration is then the process of transitions from a lower level of high symmetry and low complexity to higher levels of low symmetry and high complexity of mental structure.
Thermal quenching effect of an infrared deep level in Mg-doped p-type GaN films
NASA Astrophysics Data System (ADS)
Kim, Keunjoo; Chung, Sang Jo
2002-03-01
The thermal quenching of an infrared deep level at 1.2-1.5 eV has been investigated in Mg-doped p-type GaN films using one- and two-step annealing processes and photocurrent measurements. The deep level appeared in the one-step annealing process at a relatively high temperature of 900 °C, but disappeared in the two-step annealing process with a low-temperature step and a subsequent high-temperature step. A persistent photocurrent remained in the sample containing the deep level, while it terminated in the sample without the deep level. This indicates that the deep level is a neutral hole center located above the quasi-Fermi level, estimated at an energy of E_F(p) = 0.1-0.15 eV above the valence band at a hole carrier concentration of 2.0-2.5×10^17/cm^3.
Temporal distance and person memory: thinking about the future changes memory for the past.
Wyer, Natalie A; Perfect, Timothy J; Pahl, Sabine
2010-06-01
Psychological distance has been shown to influence how people construe an event such that greater distance produces high-level construal (characterized by global or holistic processing) and lesser distance produces low-level construal (characterized by detailed or feature-based processing). The present research tested the hypothesis that construal level has carryover effects on how information about an event is retrieved from memory. Two experiments manipulated temporal distance and found that greater distance (high-level construal) improves face recognition and increases retrieval of the abstract features of an event, whereas lesser distance (low-level construal) impairs face recognition and increases retrieval of the concrete details of an event. The findings have implications for transfer-inappropriate processing accounts of face recognition and event memory, and suggest potential applications in forensic settings.
Willinger, Ulrike; Hergovich, Andreas; Schmoeger, Michaela; Deckert, Matthias; Stoettner, Susanne; Bunda, Iris; Witting, Andrea; Seidler, Melanie; Moser, Reinhilde; Kacena, Stefanie; Jaeckle, David; Loader, Benjamin; Mueller, Christian; Auff, Eduard
2017-05-01
Humour processing is a complex information-processing task that is dependent on cognitive and emotional aspects which presumably influence frame-shifting and conceptual blending, mental operations that underlie humour processing. The aim of the current study was to find distinctive groups of subjects with respect to black humour processing, intellectual capacities, mood disturbance and aggressiveness. A total of 156 adults rated black humour cartoons and completed measures of verbal and nonverbal intelligence, mood disturbance and aggressiveness. Cluster analysis yielded three groups with the following properties: (1) moderate black humour preference and moderate comprehension; average nonverbal and verbal intelligence; low mood disturbance and moderate aggressiveness; (2) low black humour preference and moderate comprehension; average nonverbal and verbal intelligence; high mood disturbance and high aggressiveness; and (3) high black humour preference and high comprehension; high nonverbal and verbal intelligence; no mood disturbance and low aggressiveness. Age and gender did not differ significantly between groups, but differences in education level were found. Black humour preference and comprehension are positively associated with higher verbal and nonverbal intelligence as well as higher levels of education. Emotional instability and higher aggressiveness apparently lead to decreased levels of pleasure when dealing with black humour. These results support the hypothesis that humour processing involves cognitive as well as affective components and suggest that these variables influence the execution of frame-shifting and conceptual blending in the course of humour processing.
A second generation 50 Mbps VLSI level zero processing system prototype
NASA Technical Reports Server (NTRS)
Harris, Jonathan C.; Shi, Jeff; Speciale, Nick; Bennett, Toby
1994-01-01
Level Zero Processing (LZP) generally refers to telemetry data processing functions performed at ground facilities to remove all communication artifacts from instrument data. These functions typically include frame synchronization, error detection and correction, packet reassembly and sorting, playback reversal, merging, time-ordering, overlap deletion, and production of annotated data sets. The Data Systems Technologies Division (DSTD) at Goddard Space Flight Center (GSFC) has been developing high-performance Very Large Scale Integration Level Zero Processing Systems (VLSI LZPS) since 1989. The first VLSI LZPS prototype demonstrated 20 Megabits per second (Mbps) capability in 1992. With a new generation of high-density Application-Specific Integrated Circuits (ASIC) and a Mass Storage System (MSS) based on the High-Performance Parallel Interface (HiPPI), a second prototype has been built that achieves full 50 Mbps performance. This paper describes the second generation LZPS prototype based upon VLSI technologies.
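Level Zero Processing is essentially a pipeline of deterministic bookkeeping operations on packet streams. The following is a minimal sketch (in Python, not the VLSI/ASIC implementation described above) of two of the listed functions, time-ordering and overlap deletion, assuming each packet carries an application identifier, a sequence count, and a time tag; all names are illustrative.

```python
from typing import NamedTuple, List

class Packet(NamedTuple):
    apid: int        # application process identifier (which instrument)
    seq: int         # source sequence count
    time: float      # spacecraft time tag
    payload: bytes

def level_zero_sort(packets: List[Packet]) -> List[Packet]:
    """Time-order packets and delete overlaps (duplicate sequence counts per APID),
    two of the LZP functions listed in the abstract."""
    ordered = sorted(packets, key=lambda p: (p.apid, p.time, p.seq))
    deduped, seen = [], set()
    for p in ordered:
        key = (p.apid, p.seq)
        if key in seen:          # overlap from merged playback/real-time streams
            continue
        seen.add(key)
        deduped.append(p)
    return deduped

# Example: a playback stream merged with a real-time stream, out of order and overlapping.
stream = [Packet(1, 7, 102.0, b"B"), Packet(1, 6, 101.0, b"A"), Packet(1, 7, 102.0, b"B")]
print(level_zero_sort(stream))   # -> seq 6 then seq 7, duplicate removed
```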
Sleep Disrupts High-Level Speech Parsing Despite Significant Basic Auditory Processing.
Makov, Shiri; Sharon, Omer; Ding, Nai; Ben-Shachar, Michal; Nir, Yuval; Zion Golumbic, Elana
2017-08-09
The extent to which the sleeping brain processes sensory information remains unclear. This is particularly true for continuous and complex stimuli such as speech, in which information is organized into hierarchically embedded structures. Recently, novel metrics for assessing the neural representation of continuous speech have been developed using noninvasive brain recordings that have thus far only been tested during wakefulness. Here we investigated, for the first time, the sleeping brain's capacity to process continuous speech at different hierarchical levels using a newly developed Concurrent Hierarchical Tracking (CHT) approach that allows monitoring the neural representation and processing-depth of continuous speech online. Speech sequences were compiled with syllables, words, phrases, and sentences occurring at fixed time intervals such that different linguistic levels correspond to distinct frequencies. This enabled us to distinguish their neural signatures in brain activity. We compared the neural tracking of intelligible versus unintelligible (scrambled and foreign) speech across states of wakefulness and sleep using high-density EEG in humans. We found that neural tracking of stimulus acoustics was comparable across wakefulness and sleep and similar across all conditions regardless of speech intelligibility. In contrast, neural tracking of higher-order linguistic constructs (words, phrases, and sentences) was only observed for intelligible speech during wakefulness and could not be detected at all during nonrapid eye movement or rapid eye movement sleep. These results suggest that, whereas low-level auditory processing is relatively preserved during sleep, higher-level hierarchical linguistic parsing is severely disrupted, thereby revealing the capacity and limits of language processing during sleep. SIGNIFICANCE STATEMENT Despite the persistence of some sensory processing during sleep, it is unclear whether high-level cognitive processes such as speech parsing are also preserved. We used a novel approach for studying the depth of speech processing across wakefulness and sleep while tracking neuronal activity with EEG. We found that responses to the auditory sound stream remained intact; however, the sleeping brain did not show signs of hierarchical parsing of the continuous stream of syllables into words, phrases, and sentences. The results suggest that sleep imposes a functional barrier between basic sensory processing and high-level cognitive processing. This paradigm also holds promise for studying residual cognitive abilities in a wide array of unresponsive states. Copyright © 2017 the authors 0270-6474/17/377772-10$15.00/0.
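Because the CHT paradigm presents syllables, words, phrases, and sentences at fixed rates, neural tracking of each linguistic level can be read out as spectral power (or phase coherence) at the corresponding frequency. A minimal illustration of that frequency-tagging logic, assuming a 4 Hz syllable rate, 2 Hz word rate, and 1 Hz phrase rate and a single EEG channel (illustrative only, not the authors' analysis code), might look like this:

```python
import numpy as np

fs = 250.0                       # sampling rate (Hz), assumed
t = np.arange(0, 60, 1 / fs)     # 60 s of one EEG channel

# Toy signal: strong 4 Hz (syllable-rate) tracking, weak 1 Hz (phrase-rate) tracking, plus noise.
eeg = 1.0 * np.sin(2 * np.pi * 4 * t) + 0.2 * np.sin(2 * np.pi * 1 * t) + np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(eeg)) / t.size
freqs = np.fft.rfftfreq(t.size, d=1 / fs)

for label, f in [("syllable", 4.0), ("word", 2.0), ("phrase", 1.0)]:
    idx = np.argmin(np.abs(freqs - f))
    # Compare power at the tagged frequency against neighbouring bins (a common normalisation).
    neighbours = spectrum[max(idx - 5, 0):idx].mean()
    print(f"{label}-rate ({f} Hz): peak={spectrum[idx]:.3f}, neighbour mean={neighbours:.3f}")
```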
Lumetta, Gregg J; Braley, Jenifer C; Peterson, James M; Bryan, Samuel A; Levitskaia, Tatiana G
2012-06-05
Removing phosphate from alkaline high-level waste sludges at the Department of Energy's Hanford Site in Washington State is necessary to increase the waste loading in the borosilicate glass waste form that will be used to immobilize the highly radioactive fraction of these wastes. We are developing a process which first leaches phosphate from the high-level waste solids with aqueous sodium hydroxide, and then isolates the phosphate by precipitation with calcium oxide. Tests with actual tank waste confirmed that this process is an effective method of phosphate removal from the sludge and offers an additional option for managing the phosphorus in the Hanford tank waste solids. The presence of vibrationally active species, such as nitrate and phosphate ions, in the tank waste processing streams makes the phosphate removal process an ideal candidate for monitoring by Raman or infrared spectroscopic means. As a proof-of-principle demonstration, Raman and Fourier transform infrared (FTIR) spectra were acquired for all phases during a test of the process with actual tank waste. Quantitative determination of phosphate, nitrate, and sulfate in the liquid phases was achieved by Raman spectroscopy, demonstrating the applicability of Raman spectroscopy for the monitoring of these species in the tank waste process streams.
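Quantitative determination of anions such as phosphate, nitrate, and sulfate from Raman spectra typically rests on a linear relationship between band intensity (or area) and concentration. A hedged sketch of such a univariate calibration, with made-up peak areas and concentrations standing in for the actual tank-waste data, is shown below:

```python
import numpy as np

# Hypothetical calibration standards: phosphate concentration (mol/L) vs. Raman band area
conc = np.array([0.05, 0.10, 0.20, 0.40, 0.80])
area = np.array([1.1, 2.0, 4.2, 8.1, 16.3])        # e.g., the ~938 cm^-1 phosphate band; values illustrative

slope, intercept = np.polyfit(conc, area, 1)        # least-squares linear calibration
predict_conc = lambda a: (a - intercept) / slope    # invert the calibration curve for unknowns

unknown_area = 6.5
print(f"Estimated phosphate concentration: {predict_conc(unknown_area):.3f} mol/L")
```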
Pangalos, George
2001-01-01
Background: The Internet provides many advantages when used for interaction and data sharing among health care providers, patients, and researchers. However, the advantages provided by the Internet come with a significantly greater element of risk to the confidentiality, integrity, and availability of information. It is therefore essential that Health Care Establishments processing and exchanging medical data use an appropriate security policy. Objective: To develop a High Level Security Policy for the processing of medical data and their transmission through the Internet, which is a set of high-level statements intended to guide Health Care Establishment personnel who process and manage sensitive health care information. Methods: We developed the policy based on a detailed study of the existing framework in the EU countries, USA, and Canada, and on consultations with users in the context of the Intranet Health Clinic project. More specifically, this paper has taken into account the major directives, technical reports, law, and recommendations that are related to the protection of individuals with regard to the processing of personal data, and the protection of privacy and medical data on the Internet. Results: We present a High Level Security Policy for Health Care Establishments, which includes a set of 7 principles and 45 guidelines detailed in this paper. The proposed principles and guidelines have been made as generic and open to specific implementations as possible, to provide for maximum flexibility and adaptability to local environments. The High Level Security Policy establishes the basic security requirements that must be addressed to use the Internet to safely transmit patient and other sensitive health care information. Conclusions: The High Level Security Policy is primarily intended for large Health Care Establishments in Europe, USA, and Canada. It is clear however that the general framework presented here can only serve as reference material for developing an appropriate High Level Security Policy in a specific implementation environment. When implemented in specific environments, these principles and guidelines must also be complemented by measures, which are more specific. Even when a High Level Security Policy already exists in an institution, it is advisable that the management of the Health Care Establishment periodically revisits it to see whether it should be modified or augmented. PMID:11720956
Ilioudis, C; Pangalos, G
2001-01-01
The Internet provides many advantages when used for interaction and data sharing among health care providers, patients, and researchers. However, the advantages provided by the Internet come with a significantly greater element of risk to the confidentiality, integrity, and availability of information. It is therefore essential that Health Care Establishments processing and exchanging medical data use an appropriate security policy. To develop a High Level Security Policy for the processing of medical data and their transmission through the Internet, which is a set of high-level statements intended to guide Health Care Establishment personnel who process and manage sensitive health care information. We developed the policy based on a detailed study of the existing framework in the EU countries, USA, and Canada, and on consultations with users in the context of the Intranet Health Clinic project. More specifically, this paper has taken into account the major directives, technical reports, law, and recommendations that are related to the protection of individuals with regard to the processing of personal data, and the protection of privacy and medical data on the Internet. We present a High Level Security Policy for Health Care Establishments, which includes a set of 7 principles and 45 guidelines detailed in this paper. The proposed principles and guidelines have been made as generic and open to specific implementations as possible, to provide for maximum flexibility and adaptability to local environments. The High Level Security Policy establishes the basic security requirements that must be addressed to use the Internet to safely transmit patient and other sensitive health care information. The High Level Security Policy is primarily intended for large Health Care Establishments in Europe, USA, and Canada. It is clear however that the general framework presented here can only serve as reference material for developing an appropriate High Level Security Policy in a specific implementation environment. When implemented in specific environments, these principles and guidelines must also be complemented by measures, which are more specific. Even when a High Level Security Policy already exists in an institution, it is advisable that the management of the Health Care Establishment periodically revisits it to see whether it should be modified or augmented.
ERIC Educational Resources Information Center
Huang, Yueh-Min; Shadiev, Rustam; Sun, Ai; Hwang, Wu-Yuin; Liu, Tzu-Yu
2017-01-01
For this study the researchers designed learning activities to enhance students' high level cognitive processes. Students learned new information in a classroom setting and then applied and analyzed their new knowledge in familiar authentic contexts by taking pictures of objects found there, describing them, and sharing their homework with peers.…
Breaking continuous flash suppression: competing for consciousness on the pre-semantic battlefield
Gayet, Surya; Van der Stigchel, Stefan; Paffen, Chris L. E.
2014-01-01
Traditionally, interocular suppression is believed to disrupt high-level (i.e., semantic or conceptual) processing of the suppressed visual input. The development of a new experimental paradigm, breaking continuous flash suppression (b-CFS), has caused a resurgence of studies demonstrating high-level processing of visual information in the absence of visual awareness. In this method, the time it takes for interocularly suppressed stimuli to breach the threshold of visibility is regarded as a measure of access to awareness. The aim of the current review is twofold. First, we provide an overview of the literature using this b-CFS method, while making a distinction between two types of studies: those in which suppression durations are compared between different stimulus classes (such as upright faces versus inverted faces), and those in which suppression durations are compared for stimuli that either match or mismatch concurrently available information (such as a colored target that either matches or mismatches a color retained in working memory). Second, we aim at dissociating high-level processing from low-level (i.e., crude visual) processing of the suppressed stimuli. For this purpose, we include a thorough review of the control conditions that are used in these experiments. Additionally, we provide recommendations for proper control conditions that we deem crucial for disentangling high-level from low-level effects. Based on this review, we argue that crude visual processing suffices for explaining differences in breakthrough times reported using b-CFS. As such, we conclude that there is as yet no reason to assume that interocularly suppressed stimuli receive full semantic analysis. PMID:24904476
Mercury Phase II Study - Mercury Behavior across the High-Level Waste Evaporator System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bannochie, C. J.; Crawford, C. L.; Jackson, D. G.
2016-06-17
The Mercury Program team’s effort continues to develop more fundamental information concerning mercury behavior across the liquid waste facilities and unit operations. Previously, the team examined the mercury chemistry across salt processing, including the Actinide Removal Process/Modular Caustic Side Solvent Extraction Unit (ARP/MCU), and the Defense Waste Processing Facility (DWPF) flowsheets. This report documents the data and understanding of mercury across the high level waste 2H and 3H evaporator systems.
Effects of consumer food preparation on acrylamide formation.
Jackson, Lauren S; Al-Taher, Fadwa
2005-01-01
Acrylamide is formed in high-carbohydrate foods during high temperature processes such as frying, baking, roasting and extrusion. Although acrylamide is known to form during industrial processing of food, high levels of the chemical have been found in home-cooked foods, mainly potato- and grain-based products. This chapter will focus on the effects of cooking conditions (e.g. time/temperature) on acrylamide formation in consumer-prepared foods, the use of surface color (browning) as an indicator of acrylamide levels in some foods, and methods for reducing acrylamide levels in home-prepared foods. As with commercially processed foods, acrylamide levels in home-prepared foods tend to increase with cooking time and temperature. In experiments conducted at the NCFST, we found that acrylamide levels in cooked food depended greatly on the cooking conditions and the degree of "doneness", as measured by the level of surface browning. For example, French fries fried at 150-190 degrees C for up to 10 min had acrylamide levels of 55 to 2130 microg/kg (wet weight), with the highest levels in the most processed (highest frying times/temperatures) and the most highly browned fries. Similarly, more acrylamide was formed in "dark" toasted bread slices (43.7-610.7 microg/kg wet weight), than "light" (8.27-217.5 microg/kg) or "medium" (10.9-213.7 microg/kg) toasted slices. Analysis of the surface color by colorimetry indicated that some components of surface color ("a" and "L" values) correlated highly with acrylamide levels. This indicates that the degree of surface browning could be used as an indicator of acrylamide formation during cooking. Soaking raw potato slices in water before frying was effective at reducing acrylamide levels in French fries. Additional studies are needed to develop practical methods for reducing acrylamide formation in home-prepared foods without changing the acceptability of these foods.
Horizontal tuning for faces originates in high-level Fusiform Face Area.
Goffaux, Valerie; Duecker, Felix; Hausfeld, Lars; Schiltz, Christine; Goebel, Rainer
2016-01-29
Recent work indicates that the specialization of face visual perception relies on the privileged processing of horizontal angles of facial information. This suggests that stimulus properties assumed to be fully resolved in primary visual cortex (V1; e.g., orientation) in fact determine human vision until high-level stages of processing. To address this hypothesis, the present fMRI study explored the orientation sensitivity of V1 and high-level face-specialized ventral regions such as the Occipital Face Area (OFA) and Fusiform Face Area (FFA) to different angles of face information. Participants viewed face images filtered to retain information at horizontal, vertical or oblique angles. Filtered images were viewed upright, inverted and (phase-)scrambled. FFA responded most strongly to the horizontal range of upright face information; its activation pattern reliably separated horizontal from oblique ranges, but only when faces were upright. Moreover, activation patterns induced in the right FFA and the OFA by upright and inverted faces could only be separated based on horizontal information. This indicates that the specialized processing of upright face information in the OFA and FFA essentially relies on the encoding of horizontal facial cues. This pattern was not passively inherited from V1, which was found to respond less strongly to horizontal than other orientations likely due to adaptive whitening. Moreover, we found that orientation decoding accuracy in V1 was impaired for stimuli containing no meaningful shape. By showing that primary coding in V1 is influenced by high-order stimulus structure and that high-level processing is tuned to selective ranges of primary information, the present work suggests that primary and high-level levels of the visual system interact in order to modulate the processing of certain ranges of primary information depending on their relevance with respect to the stimulus and task at hand. Copyright © 2015 Elsevier Ltd. All rights reserved.
Michaeli, Yael; Sinik, Keren; Haus-Cohen, Maya; Reiter, Yoram
2012-04-01
Short-lived protein translation products are proposed to be a major source of substrates for major histocompatibility complex (MHC) class I antigen processing and presentation; however, a direct link between protein stability and the presentation level of MHC class I-peptide complexes has not been made. We have recently discovered that the peptide Tyr(369-377), derived from the tyrosinase protein, is highly presented by HLA-A2 on the surface of melanoma cells. To examine the molecular mechanisms responsible for this presentation, we compared characteristics of tyrosinase in melanoma cell lines that present high or low levels of HLA-A2-Tyr(369-377) complexes. We found no correlation between mRNA levels and the levels of HLA-A2-Tyr(369-377) presentation. Co-localization experiments revealed that, in cell lines presenting low levels of HLA-A2-Tyr(369-377) complexes, tyrosinase co-localizes with LAMP-1, a melanosome marker, whereas in cell lines presenting high HLA-A2-Tyr(369-377) levels, tyrosinase localizes to the endoplasmic reticulum. We also observed differences in tyrosinase molecular weight and glycosylation composition as well as major differences in protein stability (t1/2). By stabilizing the tyrosinase protein, we observed a dramatic decrease in HLA-A2-tyrosinase presentation. Our findings suggest that aberrant processing and instability of tyrosinase are responsible for the high presentation of HLA-A2-Tyr(369-377) complexes and thus shed new light on the relationship between intracellular processing, stability of proteins, and MHC-restricted peptide presentation. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Santos, Juliana Lane Paixão Dos; Samapundo, Simbarashe; Biyikli, Ayse; Van Impe, Jan; Akkermans, Simen; Höfte, Monica; Abatih, Emmanuel Nji; Sant'Ana, Anderson S; Devlieghere, Frank
2018-05-19
Heat-resistant moulds (HRMs) are well known for their ability to survive pasteurization and spoil high-acid food products, which is of great concern for processors of fruit-based products worldwide. Whilst the majority of the studies on HRMs over the last decades have addressed their inactivation, few data are currently available regarding their contamination levels in fruit and fruit-based products. Thus, this study aimed to quantify and identify heat-resistant fungal ascospores from samples collected throughout the processing of pasteurized high-acid fruit products. In addition, an assessment on the effect of processing on the contamination levels of HRMs in these products was carried out. A total of 332 samples from 111 batches were analyzed from three processing plants (=three processing lines): strawberry puree (n = 88, Belgium), concentrated orange juice (n = 90, Brazil) and apple puree (n = 154, the Netherlands). HRMs were detected in 96.4% (107/111) of the batches and 59.3% (197/332) of the analyzed samples. HRMs were present in 90.9% of the samples from the strawberry puree processing line (1-215 ascospores/100 g), 46.7% of the samples from the orange juice processing line (1-200 ascospores/100 g) and 48.7% of samples from the apple puree processing line (1-84 ascospores/100 g). Despite the high occurrence, the majority (76.8%, 255/332) of the samples were either not contaminated or presented low levels of HRMs (<10 ascospores/100 g). For both strawberry puree and concentrated orange juice, processing had no statistically significant effect on the levels of HRMs (p > 0.05). On the contrary, a significant reduction (p < 0.05) in HRMs levels was observed during the processing of apple puree. Twelve species were identified belonging to four genera - Byssochlamys, Aspergillus with Neosartorya-type ascospores, Talaromyces and Rasamsonia. N. fumigata (23.6%), N. fischeri (19.1%) and B. nivea (5.5%) were the predominant species in pasteurized products. The quantitative data (contamination levels of HRMs) were fitted to exponential distributions and will ultimately be included as input to spoilage risk assessment models which would allow better control of the spoilage of heat treated fruit products caused by heat-resistant moulds. Copyright © 2018 Elsevier B.V. All rights reserved.
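The abstract notes that the contamination levels were fitted to exponential distributions as inputs to spoilage risk assessment models. A minimal sketch of that fitting step (illustrative counts, not the study's data) using SciPy could be:

```python
import numpy as np
from scipy import stats

# Hypothetical ascospore counts per 100 g in positive samples (the reported range was roughly 1-215)
counts = np.array([1, 1, 2, 3, 3, 5, 8, 12, 20, 45, 84, 215], dtype=float)

# Fit an exponential distribution (location fixed at 0) by maximum likelihood
loc, scale = stats.expon.fit(counts, floc=0)
print(f"Fitted mean (scale) = {scale:.1f} ascospores/100 g")

# Probability that a sample exceeds, say, 100 ascospores/100 g under the fitted model
print(f"P(count > 100) = {stats.expon.sf(100, loc=loc, scale=scale):.3f}")
```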
ERIC Educational Resources Information Center
Ekici, Didem Inel
2016-01-01
This study aimed to determine Turkish junior high-school students' perceptions of the general problem-solving process. The Turkish junior high-school students' perceptions of the general problem-solving process were examined in relation to their gender, grade level, age and their grade point with regards to the science course identified in the…
Behavioral inhibition and anxiety: The moderating roles of inhibitory control and attention shifting
White, Lauren K.; McDermott, Jennifer Martin; Degnan, Kathryn A.; Henderson, Heather A.; Fox, Nathan A.
2013-01-01
Behavioral inhibition (BI), a temperament identified in early childhood, is associated with social reticence in childhood and an increased risk for anxiety problems in adolescence and adulthood. However, not all behaviorally inhibited children remain reticent or develop an anxiety disorder. One possible mechanism accounting for the variability in the developmental trajectories of BI is a child’s ability to successfully recruit cognitive processes involved in the regulation of negative reactivity. However, separate cognitive processes may differentially moderate the association between BI and later anxiety problems. The goal of the current study was to examine how two cognitive processes - attention shifting and inhibitory control - laboratory assessed at 48 months of age moderated the association between 24-month BI and anxiety symptoms in the preschool years. Results revealed that high levels of attention shifting decreased the risk for anxiety symptoms in children with high levels of BI, whereas high levels of inhibitory control increased this risk for anxiety symptoms. These findings suggest that different cognitive processes may influence relative levels of risk or adaptation depending upon a child’s temperamental reactivity. PMID:21301953
White, Lauren K; McDermott, Jennifer Martin; Degnan, Kathryn A; Henderson, Heather A; Fox, Nathan A
2011-07-01
Behavioral inhibition (BI), a temperament identified in early childhood, is associated with social reticence in childhood and an increased risk for anxiety problems in adolescence and adulthood. However, not all behaviorally inhibited children remain reticent or develop an anxiety disorder. One possible mechanism accounting for the variability in the developmental trajectories of BI is a child's ability to successfully recruit cognitive processes involved in the regulation of negative reactivity. However, separate cognitive processes may differentially moderate the association between BI and later anxiety problems. The goal of the current study was to examine how two cognitive processes-attention shifting and inhibitory control-laboratory assessed at 48 months of age moderated the association between 24-month BI and anxiety symptoms in the preschool years. Results revealed that high levels of attention shifting decreased the risk for anxiety problems in children with high levels of BI, whereas high levels of inhibitory control increased this risk for anxiety symptoms. These findings suggest that different cognitive processes may influence relative levels of risk or adaptation depending upon a child's temperamental reactivity.
Neural Correlates of Subliminal Language Processing
Axelrod, Vadim; Bar, Moshe; Rees, Geraint; Yovel, Galit
2015-01-01
Language is a high-level cognitive function, so exploring the neural correlates of unconscious language processing is essential for understanding the limits of unconscious processing in general. The results of several functional magnetic resonance imaging studies have suggested that unconscious lexical and semantic processing is confined to the posterior temporal lobe, without involvement of the frontal lobe—the regions that are indispensable for conscious language processing. However, previous studies employed a similarly designed masked priming paradigm with briefly presented single and contextually unrelated words. It is thus possible, that the stimulation level was insufficiently strong to be detected in the high-level frontal regions. Here, in a high-resolution fMRI and multivariate pattern analysis study we explored the neural correlates of subliminal language processing using a novel paradigm, where written meaningful sentences were suppressed from awareness for extended duration using continuous flash suppression. We found that subjectively and objectively invisible meaningful sentences and unpronounceable nonwords could be discriminated not only in the left posterior superior temporal sulcus (STS), but critically, also in the left middle frontal gyrus. We conclude that frontal lobes play a role in unconscious language processing and that activation of the frontal lobes per se might not be sufficient for achieving conscious awareness. PMID:24557638
ERIC Educational Resources Information Center
Hollander, Cara; de Andrade, Victor Manuel
2014-01-01
Schools located near to airports are exposed to high levels of noise which can cause cognitive, health, and hearing problems. Therefore, this study sought to explore whether this noise may cause auditory language processing (ALP) problems in primary school learners. Sixty-one children attending schools exposed to high levels of noise were matched…
How High School Students Select a College.
ERIC Educational Resources Information Center
Gilmour, Joseph E., Jr.; And Others
The college selection process used by high school students was studied and a paradigm that describes the process was developed, based on marketing theory concerning consumer behavior. Primarily college freshmen and high school seniors were interviewed, and a few high school juniors and upper-level college students were surveyed to determine…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berry, K.R.; Hansen, F.R.; Napolitano, L.M.
1992-01-01
DART (DSP Array for Reconfigurable Tasks) is a parallel architecture of two high-performance DSP (digital signal processing) chips with the flexibility to handle a wide range of real-time applications. Each of the 32-bit floating-point DSP processors in DART is programmable in a high-level language ("C" or Ada). We have added extensions to the real-time operating system used by DART in order to support parallel processing. The combination of high-level language programmability, a real-time operating system, and parallel processing support significantly reduces the development cost of application software for signal processing and control applications. We have demonstrated this capability by using DART to reconstruct images in the prototype VIP (Video Imaging Projectile) groundstation.
Towards Implementation of a Generalized Architecture for High-Level Quantum Programming Language
NASA Astrophysics Data System (ADS)
Ameen, El-Mahdy M.; Ali, Hesham A.; Salem, Mofreh M.; Badawy, Mahmoud
2017-08-01
This paper investigates a novel architecture for the problem of quantum computer programming. A generalized architecture for a high-level quantum programming language has been proposed, enabling the evolution from complicated quantum-specific programming to high-level, quantum-independent programming. The proposed architecture receives high-level source code and automatically transforms it into the equivalent quantum representation. This architecture involves two layers: the programmer layer and the compilation layer. These layers are implemented in three main stages: pre-classification, classification, and post-classification. The basic building block of each stage is divided into subsequent phases, and each phase performs the required transformations from one representation to another. A verification process using a case study investigated the ability of the compiler to perform all transformation processes. Experimental results showed that the proposed compiler achieves a correspondence correlation coefficient of about R ≈ 1 between outputs and targets. A clear improvement was also obtained in the time consumed by the optimization process compared to other techniques: in online optimization, the consumed time increases exponentially with the required accuracy, whereas in the proposed offline optimization it increases only gradually.
Serum irisin and myostatin levels after 2 weeks of high-altitude climbing.
Śliwicka, Ewa; Cisoń, Tomasz; Kasprzak, Zbigniew; Nowak, Alicja; Pilaczyńska-Szcześniak, Łucja
2017-01-01
Exposure to high-altitude hypoxia causes physiological and metabolic adaptive changes by disturbing homeostasis. Hypoxia-related changes in skeletal muscle affect the closely interconnected energy and regeneration processes. The balance between protein synthesis and degradation in the skeletal muscle is regulated by several molecules such as myostatin, cytokines, vitamin D, and irisin. This study investigates changes in irisin and myostatin levels in male climbers after a 2-week high-altitude expedition, and their association with 25(OH)D and indices of inflammatory processes. The study was performed in 8 men aged between 23 and 31 years, who participated in a 2-week climbing expedition in the Alps. The measurements of body composition and serum concentrations of irisin, myostatin, 25(OH)D, interleukin-6, myoglobin, high-sensitivity C-reactive protein, osteoprotegerin, and high-sensitivity soluble receptor activator of NF-κB ligand (sRANKL) were performed before and after the expedition. A 2-week exposure to hypobaric hypoxia caused a significant decrease in body mass, body mass index (BMI), fat-free mass, and in irisin and 25-hydroxyvitamin D levels. On the other hand, significant increases in the levels of myoglobin, high-sensitivity C-reactive protein, interleukin-6, and osteoprotegerin were noted. The observed correlations of irisin with 25(OH)D levels, as well as of myostatin levels with inflammatory markers and the OPG/RANKL ratio, indicate that these myokines may be involved in the energy-related processes and skeletal muscle regeneration in response to 2-week exposure to hypobaric hypoxia.
Processed foods and the nutrition transition: evidence from Asia.
Baker, P; Friel, S
2014-07-01
This paper elucidates the role of processed foods and beverages in the 'nutrition transition' underway in Asia. Processed foods tend to be high in nutrients associated with obesity and diet-related non-communicable diseases: refined sugar, salt, saturated and trans-fats. This paper identifies the most significant 'product vectors' for these nutrients and describes changes in their consumption in a selection of Asian countries. Sugar, salt and fat consumption from processed foods has plateaued in high-income countries, but has rapidly increased in the lower-middle and upper-middle-income countries. Relative to sugar and salt, fat consumption in the upper-middle- and lower-middle-income countries is converging most rapidly with that of high-income countries. Carbonated soft drinks, baked goods, and oils and fats are the most significant vectors for sugar, salt and fat respectively. At the regional level there appears to be convergence in consumption patterns of processed foods, but country-level divergences including high levels of consumption of oils and fats in Malaysia, and soft drinks in the Philippines and Thailand. This analysis suggests that more action is needed by policy-makers to prevent or mitigate processed food consumption. Comprehensive policy and regulatory approaches are most likely to be effective in achieving these goals. © 2014 The Authors. obesity reviews © 2014 World Obesity.
Nonlinear Associations Between Co-Rumination and Both Social Support and Depression Symptoms.
Ames-Sikora, Alyssa M; Donohue, Meghan Rose; Tully, Erin C
2017-08-18
Co-ruminating about one's problems appears to involve both beneficial self-disclosure and harmful rumination, suggesting that moderate levels may be the most adaptive. This study used nonlinear regression to determine whether moderate levels of self-reported co-rumination in relationships with a sibling, parent, friend, and romantic partner are linked to the highest levels of self-perceived social support and lowest levels of self-reported depression symptoms in 175 emerging adults (77% female; M = 19.66 years). As expected, moderate co-rumination was associated with high social support across all four relationship types, but, somewhat unexpectedly, high levels of co-rumination were also associated with high social support. As predicted, moderate levels of co-rumination with friends and siblings were associated with low levels of depression. Contrary to hypotheses, high levels of co-rumination were associated with high depression within romantic relationships. Co-rumination with a parent did not have a linear or quadratic association with depression. These findings suggest that high co-ruminating in supportive relationships and to a lesser extent low co-ruminating in unsupportive relationships are maladaptive interpersonal processes but that co-rumination's relation to depression depends on the co-ruminating partner. Psychotherapies for depression may target these maladaptive processes by supporting clients' development of balanced self-focused negative talk.
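Testing whether moderate co-rumination is optimal amounts to adding a quadratic term to the regression and checking its sign and significance. A hedged sketch of that model (simulated data; variable names are illustrative, not the study's dataset) using statsmodels:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
corum = rng.uniform(1, 5, size=175)                        # co-rumination scores
# Simulate an inverted-U relation with social support plus noise
support = 2 + 1.5 * corum - 0.25 * corum**2 + rng.normal(0, 0.5, size=175)

X = sm.add_constant(np.column_stack([corum, corum**2]))    # linear + quadratic terms
model = sm.OLS(support, X).fit()
print(model.params)   # a significant negative quadratic coefficient indicates a curvilinear (inverted-U) effect
```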
Bilingualism and Processing of Elementary Cognitive Tasks by Chicano Adolescents.
ERIC Educational Resources Information Center
Nanez, Jose E., Sr.; Padilla, Raymond V.
1995-01-01
Two experiments conducted with 49 Chicano high school students, aged 15-17, found that balanced-proficient bilingual speakers had slower rates of cognitive information processing at the level of short-term memory than did unbalanced bilingual speakers. The groups did not differ in rates of cognitive information processing at the simpler level of…
Expectations of Faculty, Parents, and Students for Due Process in Campus Disciplinary Hearings.
ERIC Educational Resources Information Center
Janosik, Steven M.
2001-01-01
A sample of 464 faculty members, parents, and students responded to a questionnaire that assessed their expectations for due process in campus disciplinary hearings. Respondents indicated they expected high levels of due process would be provided in suspension-level campus disciplinary hearings. The three groups differed on specific due process…
Applications of massively parallel computers in telemetry processing
NASA Technical Reports Server (NTRS)
El-Ghazawi, Tarek A.; Pritchard, Jim; Knoble, Gordon
1994-01-01
Telemetry processing refers to the reconstruction of full-resolution raw instrumentation data with the artifacts of space and ground recording and transmission removed. Being the first processing phase of satellite data, this process is also referred to as level-zero processing. This study is aimed at investigating the use of massively parallel computing technology in providing level-zero processing to spaceflights that adhere to the recommendations of the Consultative Committee for Space Data Systems (CCSDS). The workload characteristics of level-zero processing are used to identify processing requirements in high-performance computing systems. An example of level-zero functions on a SIMD MPP, such as the MasPar, is discussed. The requirements in this paper are based in part on the Earth Observing System (EOS) Data and Operation System (EDOS).
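Frame synchronization, the first level-zero step, amounts to locating a fixed attached sync marker (ASM) in the raw stream and slicing fixed-length transfer frames from it; a SIMD machine such as the MasPar would carry out this search over many offsets in parallel. A simplified byte-aligned sketch follows (the CCSDS ASM value 0x1ACFFC1D is real; the frame length and data are illustrative):

```python
ASM = bytes.fromhex("1ACFFC1D")   # CCSDS attached sync marker
FRAME_LEN = 32                    # bytes per transfer frame, illustrative (real frames are larger)

def frame_sync(stream: bytes):
    """Yield fixed-length frames found after each sync marker (byte-aligned search for simplicity)."""
    i = 0
    while (i := stream.find(ASM, i)) != -1:
        start = i + len(ASM)
        frame = stream[start:start + FRAME_LEN]
        if len(frame) == FRAME_LEN:
            yield frame
        i = start + FRAME_LEN

raw = b"\x00" * 5 + ASM + bytes(range(32)) + ASM + bytes(range(32, 64))
print(len(list(frame_sync(raw))))   # -> 2 frames recovered
```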
A global "imaging'' view on systems approaches in immunology.
Ludewig, Burkhard; Stein, Jens V; Sharpe, James; Cervantes-Barragan, Luisa; Thiel, Volker; Bocharov, Gennady
2012-12-01
The immune system exhibits an enormous complexity. High throughput methods such as the "-omic" technologies generate vast amounts of data that facilitate dissection of immunological processes at ever finer resolution. Using high-resolution data-driven systems analysis, causal relationships between complex molecular processes and particular immunological phenotypes can be constructed. However, processes in tissues, organs, and the organism itself (so-called higher level processes) also control and regulate the molecular (lower level) processes. Reverse systems engineering approaches, which focus on the examination of the structure, dynamics and control of the immune system, can help to understand the construction principles of the immune system. Such integrative mechanistic models can properly describe, explain, and predict the behavior of the immune system in health and disease by combining both higher and lower level processes. Moving from molecular and cellular levels to a multiscale systems understanding requires the development of methodologies that integrate data from different biological levels into multiscale mechanistic models. In particular, 3D imaging techniques and 4D modeling of the spatiotemporal dynamics of immune processes within lymphoid tissues are central for such integrative approaches. Both dynamic and global organ imaging technologies will be instrumental in facilitating comprehensive multiscale systems immunology analyses as discussed in this review. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
High pressure processing and its application to the challenge of virus-contaminated foods
USDA-ARS?s Scientific Manuscript database
High pressure processing (HPP) is an increasingly popular non-thermal food processing technology. Study of HPP’s potential to inactivate foodborne viruses has defined general pressure levels required to inactivate hepatitis A virus, norovirus surrogates, and human norovirus itself within foods such...
P/M Processing of Rare Earth Modified High Strength Steels.
1980-12-01
Technical report on powder metallurgy (P/M) processing of rare earth modified high strength steels, prepared by A. A. Sheinker (TRW Inc., Materials Technology, Cleveland, OH) for the Office of Naval Research, December 1980.
Verification and Validation in a Rapid Software Development Process
NASA Technical Reports Server (NTRS)
Callahan, John R.; Easterbrook, Steve M.
1997-01-01
The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in the use of new methods in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing on the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.
Data Processing for High School Students
ERIC Educational Resources Information Center
Spiegelberg, Emma Jo
1974-01-01
Data processing should be taught at the high school level so students may develop a general understanding and appreciation for the capabilities and the limitations of these automated data processing systems. Card machines, wiring, logic, flowcharting, and Cobol programing are to be taught, with behavioral objectives for each section listed. (SC)
Marković, Slobodan
2012-01-01
In this paper aesthetic experience is defined as an experience qualitatively different from everyday experience and similar to other exceptional states of mind. Three crucial characteristics of aesthetic experience are discussed: fascination with an aesthetic object (high arousal and attention), appraisal of the symbolic reality of an object (high cognitive engagement), and a strong feeling of unity with the object of aesthetic fascination and aesthetic appraisal. In the proposed model, two parallel levels of aesthetic information processing are distinguished. On the first level, two sub-levels of narrative are processed: story (theme) and symbolism (deeper meanings). The second level includes two sub-levels: perceptual associations (implicit meanings of an object's physical features) and detection of compositional regularities. Two sub-levels are defined as crucial for aesthetic experience, the appraisal of symbolism and the detection of compositional regularities. These sub-levels require specific cognitive and personality dispositions, such as expertise, creative thinking, and openness to experience. Finally, feedback from emotional processing is included in the model: appraisals of everyday emotions are specified as a matter of narrative content (e.g., empathy with characters), whereas the aesthetic emotion is defined as an affective evaluation in the process of symbolism appraisal or the detection of compositional regularities. PMID:23145263
Van Ettinger-Veenstra, Helene; McAllister, Anita; Lundberg, Peter; Karlsson, Thomas; Engström, Maria
2016-01-01
This study investigates the relation between individual language ability and neural semantic processing abilities. Our aim was to explore whether high-level language ability would correlate to decreased activation in language-specific regions or rather increased activation in supporting language regions during processing of sentences. Moreover, we were interested if observed neural activation patterns are modulated by semantic incongruency similarly to previously observed changes upon syntactic congruency modulation. We investigated 27 healthy adults with a sentence reading task-which tapped language comprehension and inference, and modulated sentence congruency-employing functional magnetic resonance imaging (fMRI). We assessed the relation between neural activation, congruency modulation, and test performance on a high-level language ability assessment with multiple regression analysis. Our results showed increased activation in the left-hemispheric angular gyrus extending to the temporal lobe related to high language ability. This effect was independent of semantic congruency, and no significant relation between language ability and incongruency modulation was observed. Furthermore, there was a significant increase of activation in the inferior frontal gyrus (IFG) bilaterally when the sentences were incongruent, indicating that processing incongruent sentences was more demanding than processing congruent sentences and required increased activation in language regions. The correlation of high-level language ability with increased rather than decreased activation in the left angular gyrus, a region specific for language processing, is opposed to what the neural efficiency hypothesis would predict. We can conclude that no evidence is found for an interaction between semantic congruency related brain activation and high-level language performance, even though the semantic incongruent condition shows to be more demanding and evoking more neural activation.
Van Ettinger-Veenstra, Helene; McAllister, Anita; Lundberg, Peter; Karlsson, Thomas; Engström, Maria
2016-01-01
This study investigates the relation between individual language ability and neural semantic processing abilities. Our aim was to explore whether high-level language ability would correlate to decreased activation in language-specific regions or rather increased activation in supporting language regions during processing of sentences. Moreover, we were interested if observed neural activation patterns are modulated by semantic incongruency similarly to previously observed changes upon syntactic congruency modulation. We investigated 27 healthy adults with a sentence reading task—which tapped language comprehension and inference, and modulated sentence congruency—employing functional magnetic resonance imaging (fMRI). We assessed the relation between neural activation, congruency modulation, and test performance on a high-level language ability assessment with multiple regression analysis. Our results showed increased activation in the left-hemispheric angular gyrus extending to the temporal lobe related to high language ability. This effect was independent of semantic congruency, and no significant relation between language ability and incongruency modulation was observed. Furthermore, there was a significant increase of activation in the inferior frontal gyrus (IFG) bilaterally when the sentences were incongruent, indicating that processing incongruent sentences was more demanding than processing congruent sentences and required increased activation in language regions. The correlation of high-level language ability with increased rather than decreased activation in the left angular gyrus, a region specific for language processing, is opposed to what the neural efficiency hypothesis would predict. We can conclude that no evidence is found for an interaction between semantic congruency related brain activation and high-level language performance, even though the semantic incongruent condition shows to be more demanding and evoking more neural activation. PMID:27014040
Thermal analysis of heat and power plant with high temperature reactor and intermediate steam cycle
NASA Astrophysics Data System (ADS)
Fic, Adam; Składzień, Jan; Gabriel, Michał
2015-03-01
Thermal analysis of a heat and power plant with a high temperature gas cooled nuclear reactor is presented. The main aim of the considered system is to supply a technological process with heat at a suitably high temperature level. The considered unit is also used to produce electricity. The high temperature helium cooled nuclear reactor is the primary heat source in the system, which consists of: the reactor cooling cycle, the steam cycle and the gas heat pump cycle. Helium used as a carrier in the first cycle (classic Brayton cycle), which includes the reactor, delivers heat in a steam generator to produce superheated steam with the required parameters of the intermediate cycle. The intermediate cycle is provided to transport energy from the reactor installation to the process installation requiring high temperature heat. The distance between the reactor and the process installation is assumed to be short and negligible, or alternatively equal to 1 km, in the analysis. The system is also equipped with a high temperature argon heat pump to obtain the temperature level of a heat carrier required by a high temperature process. Thus, the steam of the intermediate cycle supplies the lower heat exchanger of the heat pump, a process heat exchanger at the medium temperature level and a classical steam turbine system (Rankine cycle). The main purpose of the research was to evaluate the effectiveness of the system considered and to assess whether such a three-cycle cogeneration system is reasonable. Multivariant calculations have been carried out employing the developed mathematical model. The results have been presented in the form of the energy efficiency and exergy efficiency of the system as a function of the temperature drop in the high temperature process heat exchanger and the reactor pressure.
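For reference, the two figures of merit reported for the cogeneration system can be written in a standard textbook form (not reproduced from the paper; the symbols below are assumptions) as:

```latex
% Energy efficiency: all useful outputs (electric power and delivered process heat) over reactor thermal power
\eta_{en} = \frac{N_{el} + \dot{Q}_{HT} + \dot{Q}_{MT}}{\dot{Q}_{reactor}}

% Exergy efficiency: the same outputs weighted by their exergy content;
% heat delivered at temperature T is multiplied by the Carnot factor (1 - T_0/T), T_0 being the ambient temperature
\eta_{ex} = \frac{N_{el} + \dot{Q}_{HT}\left(1 - \frac{T_0}{T_{HT}}\right) + \dot{Q}_{MT}\left(1 - \frac{T_0}{T_{MT}}\right)}{\dot{Q}_{reactor}}
```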
An Analysis of High School Students' Performance on Five Integrated Science Process Skills
NASA Astrophysics Data System (ADS)
Beaumont-Walters, Yvonne; Soyibo, Kola
2001-02-01
This study determined Jamaican high school students' level of performance on five integrated science process skills and whether there were statistically significant differences in their performance linked to their gender, grade level, school location, school type, student type and socio-economic background (SEB). The 305 subjects comprised 133 males, 172 females, 146 ninth graders, 159 10th graders, 150 traditional and 155 comprehensive high school students, 164 students from the Reform of Secondary Education (ROSE) project and 141 non-ROSE students, 166 urban and 139 rural students, and 110 students from a high SEB and 195 from a low SEB. Data were collected with the authors' constructed integrated science process skills test. The results indicated that the subjects' mean score was low and unsatisfactory; their performance in decreasing order was: interpreting data, recording data, generalising, formulating hypotheses and identifying variables. There were statistically significant differences in their performance based on their grade level, school type, student type, and SEB in favour of the 10th graders, traditional high school students, ROSE students and students from a high SEB. There was a positive, statistically significant and fairly strong relationship between their performance and school type, but weak relationships among their student type, grade level and SEB and performance.
A Framework for Translating a High Level Security Policy into Low Level Security Mechanisms
NASA Astrophysics Data System (ADS)
Hassan, Ahmed A.; Bahgat, Waleed M.
2010-01-01
Security policies have different components; firewall, active directory, and IDS are some examples of these components. Enforcement of network security policies through low level security mechanisms faces some essential difficulties, of which consistency, verification, and maintenance are the major ones. One approach to overcoming these difficulties is to automate the process of translating a high level security policy into low level security mechanisms. This paper introduces a framework for an automated process that translates a high level security policy into low level security mechanisms. The framework is described in terms of three phases; in the first phase, all network assets are categorized according to their roles in network security and the relations between them are identified to constitute the network security model. This proposed model is based on organization based access control (OrBAC). However, the proposed model extends the OrBAC model to include not only access control policy but also some other administrative security policies, such as auditing policy. Besides, the proposed model enables matching of each rule of the high level security policy with the corresponding ones of the low level security policy. Through the second phase of the proposed framework, the high level security policy is mapped onto the network security model; the second phase could thus be considered a translation of the high level security policy into an intermediate model level. Finally, the intermediate model level is translated automatically into low level security mechanisms. The paper illustrates the applicability of the proposed approach through an application example.
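The overall idea, mapping a high-level rule stated over roles and views onto concrete device configurations, can be illustrated with a toy translator. The rule format and the iptables-like output syntax below are invented for illustration and are not the paper's implementation:

```python
# Toy OrBAC-style high-level rules: (decision, role, activity, view)
high_level_policy = [
    {"decision": "permit", "role": "dns_server",  "activity": "query", "view": "external_dns"},
    {"decision": "deny",   "role": "workstation", "activity": "query", "view": "external_dns"},
]

# Mapping from abstract entities to concrete network objects (the "network security model")
role_hosts = {"dns_server": ["10.0.0.53"], "workstation": ["10.0.1.0/24"]}
view_endpoints = {"external_dns": ("0.0.0.0/0", 53, "udp")}

def to_firewall_rules(policy):
    """Translate each high-level rule into a low-level, iptables-like firewall rule."""
    rules = []
    for r in policy:
        dst, port, proto = view_endpoints[r["view"]]
        action = "ACCEPT" if r["decision"] == "permit" else "DROP"
        for src in role_hosts[r["role"]]:
            rules.append(f"-A FORWARD -s {src} -d {dst} -p {proto} --dport {port} -j {action}")
    return rules

print("\n".join(to_firewall_rules(high_level_policy)))
```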
Concept for Highly Mechanized Data Processing, Project 111.
A concept is developed for a highly mechanized maintenance data processing system capable of deriving factors, influences, and correlations to raise...the level of logistics knowledge and lead to the design of a management-control system. (Author)
A programmable computational image sensor for high-speed vision
NASA Astrophysics Data System (ADS)
Yang, Jie; Shi, Cong; Long, Xitian; Wu, Nanjian
2013-08-01
In this paper we present a programmable computational image sensor for high-speed vision. This computational image sensor contains four main blocks: an image pixel array, a massively parallel processing element (PE) array, a row processor (RP) array and a RISC core. The pixel-parallel PE array is responsible for transferring, storing and processing raw image data in a SIMD fashion with its own programming language. The RPs form a one-dimensional array of simplified RISC cores that can carry out complex arithmetic and logic operations. The PE array and RP array can complete a great amount of computation within a few instruction cycles and therefore satisfy low- and middle-level high-speed image processing requirements. The RISC core controls the whole system operation and executes some high-level image processing algorithms. We utilize a simplified AHB bus as the system bus to connect our major components. A programming language and the corresponding tool chain for this computational image sensor are also developed.
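The division of labour described above (pixel-parallel PEs for low-level operations, RPs for row-wise reductions, a RISC core for high-level control) can be mimicked in software. A rough NumPy sketch of the kind of low-/mid-level kernel the PE and RP arrays would execute, here a 3×3 smoothing followed by a per-row feature reduction (illustrative only, not the sensor's instruction set):

```python
import numpy as np

def pe_array_smooth(img: np.ndarray) -> np.ndarray:
    """Pixel-parallel 3x3 box filter: every 'PE' combines its pixel with its 8 neighbours."""
    padded = np.pad(img, 1, mode="edge")
    acc = np.zeros_like(img, dtype=np.float32)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            acc += padded[1 + dy : 1 + dy + img.shape[0], 1 + dx : 1 + dx + img.shape[1]]
    return acc / 9.0

def rp_array_row_features(img: np.ndarray) -> np.ndarray:
    """Row processors: one reduction result per image row (here, the row maximum)."""
    return img.max(axis=1)

frame = np.random.randint(0, 256, (64, 64)).astype(np.float32)   # toy 64x64 frame
smoothed = pe_array_smooth(frame)
print(rp_array_row_features(smoothed).shape)   # -> (64,) one feature per row, handed on to the RISC core
```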
NASA Astrophysics Data System (ADS)
Lopez, J. P.; de Almeida, A. J. F.; Tabosa, J. W. R.
2018-03-01
We report on the observation of subharmonic resonances in high-order wave mixing associated with the quantized vibrational levels of atoms trapped in a one-dimensional optical lattice created by two intense, nearly counterpropagating coupling beams. These subharmonic resonances, occurring at ±1/2 and ±1/3 of the frequency separation between adjacent vibrational levels, are observed through phase-matched, angularly resolved six- and eight-wave mixing processes. We investigate how these resonances evolve with the intensity of the incident probe beam, which couples with one of the coupling beams to create anharmonic coherence gratings between adjacent vibrational levels. Our experimental results also show evidence of high-order processes associated with coherence involving nonadjacent vibrational levels. Moreover, we also demonstrate that these induced high-order coherences can be stored in the medium and the associated optical information retrieved after a controlled storage time.
Van der Molen, Melle J W; Poppelaars, Eefje S; Van Hartingsveldt, Caroline T A; Harrewijn, Anita; Gunther Moor, Bregtje; Westenberg, P Michiel
2013-01-01
Cognitive models posit that the fear of negative evaluation (FNE) is a hallmark feature of social anxiety. As such, individuals with high FNE may show biased information processing when faced with social evaluation. The aim of the current study was to examine the neural underpinnings of anticipating and processing social-evaluative feedback, and its correlates with FNE. We used a social judgment paradigm in which female participants (N = 31) were asked to indicate whether they believed to be socially accepted or rejected by their peers. Anticipatory attention was indexed by the stimulus preceding negativity (SPN), while the feedback-related negativity and P3 were used to index the processing of social-evaluative feedback. Results provided evidence of an optimism bias in social peer evaluation, as participants more often predicted to be socially accepted than rejected. Participants with high levels of FNE needed more time to provide their judgments about the social-evaluative outcome. While anticipating social-evaluative feedback, SPN amplitudes were larger for anticipated social acceptance than for social rejection feedback. Interestingly, the SPN during anticipated social acceptance was larger in participants with high levels of FNE. None of the feedback-related brain potentials correlated with the FNE. Together, the results provided evidence of biased information processing in individuals with high levels of FNE when anticipating (rather than processing) social-evaluative feedback. The delayed response times in high FNE individuals were interpreted to reflect augmented vigilance imposed by the upcoming social-evaluative threat. Possibly, the SPN constitutes a neural marker of this vigilance in females with higher FNE levels, particularly when anticipating social acceptance feedback.
Shared neural circuits for mentalizing about the self and others.
Lombardo, Michael V; Chakrabarti, Bhismadev; Bullmore, Edward T; Wheelwright, Sally J; Sadek, Susan A; Suckling, John; Baron-Cohen, Simon
2010-07-01
Although many examples exist for shared neural representations of self and other, it is unknown how such shared representations interact with the rest of the brain. Furthermore, do high-level inference-based shared mentalizing representations interact with lower level embodied/simulation-based shared representations? We used functional neuroimaging (fMRI) and a functional connectivity approach to assess these questions during high-level inference-based mentalizing. Shared mentalizing representations in ventromedial prefrontal cortex, posterior cingulate/precuneus, and temporo-parietal junction (TPJ) all exhibited identical functional connectivity patterns during mentalizing of both self and other. Connectivity patterns were distributed across low-level embodied neural systems such as the frontal operculum/ventral premotor cortex, the anterior insula, the primary sensorimotor cortex, and the presupplementary motor area. These results demonstrate that identical neural circuits are implementing processes involved in mentalizing of both self and other and that the nature of such processes may be the integration of low-level embodied processes within higher level inference-based mentalizing.
Werner, Stefan; Breus, Oksana; Symonenko, Yuri; Marillonnet, Sylvestre; Gleba, Yuri
2011-01-01
We describe here a unique ethanol-inducible process for expression of recombinant proteins in transgenic plants. The process is based on inducible release of viral RNA replicons from stably integrated DNA proreplicons. A simple treatment with ethanol releases the replicon leading to RNA amplification and high-level protein production. To achieve tight control of replicon activation and spread in the uninduced state, the viral vector has been deconstructed, and its two components, the replicon and the cell-to-cell movement protein, have each been placed separately under the control of an inducible promoter. Transgenic Nicotiana benthamiana plants incorporating this double-inducible system demonstrate negligible background expression, high (over 0.5 × 10^4-fold) induction multiples, and high absolute levels of protein expression upon induction (up to 4.3 mg/g fresh biomass). The process can be easily scaled up, supports expression of practically important recombinant proteins, and thus can be directly used for industrial manufacturing. PMID:21825158
Lackner, Ryan J.; Fresco, David M.
2016-01-01
Awareness of the body (i.e., interoceptive awareness) and self-referential thought represent two distinct, yet habitually integrated aspects of self. A recent neuroanatomical and processing model for depression and anxiety incorporates the connections between increased but low fidelity afferent interoceptive input with self-referential and belief-based states. A deeper understanding of how self-referential processes are integrated with interoceptive processes may ultimately aid in our understanding of altered, maladaptive views of the self – a shared experience of individuals with mood and anxiety disorders. Thus, the purpose of the current study was to examine how negative self-referential processing (i.e., brooding rumination) relates to interoception in the context of affective psychopathology. Undergraduate students (N = 82) completed an interoception task (heartbeat counting) in addition to self-reported measures of rumination and depression and anxiety symptoms. Results indicated an interaction effect of brooding rumination and interoceptive awareness on depression and anxiety-related distress. Specifically, high levels of brooding rumination coupled with low levels of interoceptive awareness were associated with the highest levels of depression and anxiety-related distress, whereas low levels of brooding rumination coupled with high levels of interoceptive awareness were associated with lower levels of depression and anxiety-related distress. The findings provide further support for the conceptualization of anxiety and depression as conditions involving the integration of interoceptive processes and negative self-referential processes. PMID:27567108
Aguilar-Mahecha, Adriana; Kuzyk, Michael A.; Domanski, Dominik; Borchers, Christoph H.; Basik, Mark
2012-01-01
Blood sample processing and handling can have a significant impact on the stability and levels of proteins measured in biomarker studies. Such pre-analytical variability needs to be well understood in the context of the different proteomics platforms available for biomarker discovery and validation. In the present study we evaluated different types of blood collection tubes including the BD P100 tube containing protease inhibitors as well as CTAD tubes, which prevent platelet activation. We studied the effect of different processing protocols as well as delays in tube processing on the levels of 55 mid and high abundance plasma proteins using novel multiple-reaction monitoring-mass spectrometry (MRM-MS) assays as well as 27 low abundance cytokines using a commercially available multiplexed bead-based immunoassay. The use of P100 tubes containing protease inhibitors only conferred proteolytic protection for 4 cytokines and only one MRM-MS-measured peptide. Mid and high abundance proteins measured by MRM are highly stable in plasma left unprocessed for up to six hours although platelet activation can also impact the levels of these proteins. The levels of cytokines were elevated when tubes were centrifuged at cold temperature, while low levels were detected when samples were collected in CTAD tubes. Delays in centrifugation also had an impact on the levels of cytokines measured depending on the type of collection tube used. Our findings can help in the development of guidelines for blood collection and processing for proteomic biomarker studies. PMID:22701622
Neural Correlates of Subliminal Language Processing.
Axelrod, Vadim; Bar, Moshe; Rees, Geraint; Yovel, Galit
2015-08-01
Language is a high-level cognitive function, so exploring the neural correlates of unconscious language processing is essential for understanding the limits of unconscious processing in general. The results of several functional magnetic resonance imaging studies have suggested that unconscious lexical and semantic processing is confined to the posterior temporal lobe, without involvement of the frontal lobe, the region that is indispensable for conscious language processing. However, previous studies employed a similarly designed masked priming paradigm with briefly presented single and contextually unrelated words. It is thus possible that the stimulation level was insufficiently strong to be detected in the high-level frontal regions. Here, in a high-resolution fMRI and multivariate pattern analysis study, we explored the neural correlates of subliminal language processing using a novel paradigm in which written meaningful sentences were suppressed from awareness for an extended duration using continuous flash suppression. We found that subjectively and objectively invisible meaningful sentences and unpronounceable nonwords could be discriminated not only in the left posterior superior temporal sulcus (STS), but critically, also in the left middle frontal gyrus. We conclude that the frontal lobes play a role in unconscious language processing and that activation of the frontal lobes per se might not be sufficient for achieving conscious awareness. © The Author 2014. Published by Oxford University Press.
High-performance image processing on the desktop
NASA Astrophysics Data System (ADS)
Jordan, Stephen D.
1996-04-01
The suitability of computers to the task of medical image visualization for the purposes of primary diagnosis and treatment planning depends on three factors: speed, image quality, and price. To be widely accepted, the technology must increase the efficiency of the diagnostic and planning processes. This requires processing and displaying medical images of various modalities in real time, with accuracy and clarity, on an affordable system. Our approach to meeting this challenge began with market research to understand customer image processing needs. These needs were translated into system-level requirements, which in turn were used to determine which image processing functions should be implemented in hardware. The result is a computer architecture for 2D image processing that is both high-speed and cost-effective. The architectural solution is based on the high-performance PA-RISC workstation with an HCRX graphics accelerator. The image processing enhancements are incorporated into the image visualization accelerator (IVX) which attaches to the HCRX graphics subsystem. The IVX includes a custom VLSI chip which has a programmable convolver, a window/level mapper, and an interpolator supporting nearest-neighbor, bi-linear, and bi-cubic modes. This combination of features can be used to enable simultaneous convolution, pan, zoom, rotate, and window/level control on 1 k by 1 k by 16-bit medical images at 40 frames/second.
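The window/level mapping stage that the IVX chip implements in hardware is a standard transfer function; the sketch below is only a software reference for that mapping (not the chip's convolver or interpolator), with the window and level values chosen purely for illustration.

```python
import numpy as np

def window_level(image16, window, level, out_max=255):
    """Map a 16-bit medical image to display range with a window/level transfer function."""
    lo = level - window / 2.0
    hi = level + window / 2.0
    out = (image16.astype(np.float32) - lo) / (hi - lo)    # linear ramp inside the window
    return (np.clip(out, 0.0, 1.0) * out_max).astype(np.uint8)

img = np.random.randint(0, 4096, size=(1024, 1024), dtype=np.uint16)  # synthetic 1k x 1k study
display = window_level(img, window=400, level=1040)                    # illustrative settings
```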
ERIC Educational Resources Information Center
National Heart, Lung, and Blood Inst. (DHHS/NIH), Bethesda, MD.
Studies have shown that high blood cholesterol levels play a role in the development of coronary heart disease in adults, and that the process leading to atherosclerosis begins in childhood. To address the problem of high cholesterol levels in children, the Panel on Blood Cholesterol Levels recommends complementary approaches for individuals and…
High level cognitive information processing in neural networks
NASA Technical Reports Server (NTRS)
Barnden, John A.; Fields, Christopher A.
1992-01-01
Two related research efforts were addressed: (1) high-level connectionist cognitive modeling; and (2) local neural circuit modeling. The goals of the first effort were to develop connectionist models of high-level cognitive processes such as problem solving or natural language understanding, and to understand the computational requirements of such models. The goals of the second effort were to develop biologically realistic models of local neural circuits, and to understand the computational behavior of such models. In keeping with the nature of NASA's Innovative Research Program, all the work conducted under the grant was highly innovative. For instance, the following ideas, all summarized, are contributions to the study of connectionist/neural networks: (1) the temporal-winner-take-all, relative-position encoding, and pattern-similarity association techniques; (2) the importation of logical combinators into connectionism; (3) the use of analogy-based reasoning as a bridge across the gap between the traditional symbolic paradigm and the connectionist paradigm; and (4) the application of connectionism to the domain of belief representation/reasoning. The work on local neural circuit modeling also departs significantly from the work of related researchers. In particular, its concentration on low-level neural phenomena that could support high-level cognitive processing is unusual within the area of biological local circuit modeling, and also serves to expand the horizons of the artificial neural net field.
Singha, Poonam; Muthukumarappan, Kasiviswanathan; Krishnan, Padmanaban
2018-01-01
A combination of different levels of distillers dried grains processed for food application (FDDG), garbanzo flour, and corn grits was chosen as a source of high-protein and high-fiber extruded snacks. A four-factor central composite rotatable design was adopted to study the effect of FDDG level, moisture content of blends, extrusion temperature, and screw speed on the apparent viscosity, mass flow rate (MFR), torque, and specific mechanical energy (SME) during the extrusion process. With an increase in the extrusion temperature from 100 to 140°C, apparent viscosity, specific mechanical energy, and torque decreased. An increase in FDDG level resulted in an increase in apparent viscosity, SME, and torque. FDDG had no significant effect (p > .5) on mass flow rate. SME also increased with increasing screw speed, which could be due to the higher shear rates at higher screw speeds. Screw speed and moisture content had a significant negative effect (p < .05) on the torque. The apparent viscosity of the dough inside the extruder and the system parameters were affected by the processing conditions. This study will be useful for control of the extrusion process of blends containing these ingredients for the development of high-protein, high-fiber extruded snacks.
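The abstract treats specific mechanical energy (SME) as a measured system parameter without giving its definition. A commonly used definition in extrusion, assumed here rather than taken from the paper, relates SME to screw speed, net torque, and mass flow rate; it is at least consistent with the reported trends (SME rising with screw speed and torque). The operating point below is hypothetical.

```python
import math

def specific_mechanical_energy(screw_speed_rpm, torque_nm, mass_flow_kg_h):
    """Assumed common definition: SME = 2*pi*N*torque / mass flow rate, returned in kJ/kg."""
    n_rev_s = screw_speed_rpm / 60.0
    m_kg_s = mass_flow_kg_h / 3600.0
    return 2.0 * math.pi * n_rev_s * torque_nm / m_kg_s / 1000.0

# Purely illustrative operating point:
print(round(specific_mechanical_energy(screw_speed_rpm=200, torque_nm=12, mass_flow_kg_h=5.0), 1))
```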
Deng, Gui-Fang; Li, Ke; Ma, Jing; Liu, Fen; Dai, Jing-Jing; Li, Hua-Bin
2011-01-01
The level of aluminium in 178 processed food samples from Shenzhen city in China was evaluated using inductively coupled plasma-mass spectrometry. Some processed foods contained a concentration of up to 1226 mg/kg, which is about 12 times the Chinese food standard. To establish the main source in these foods, Al levels in the raw materials were determined. However, aluminium concentrations in the raw materials were low (0.10-451.5 mg/kg). Therefore, aluminium levels in the food additives used in these foods were determined, and it was found that some food additives contained a high concentration of aluminium (0.005-57.4 g/kg). The results suggested that, in the interest of public health, food additives containing high concentrations of aluminium should be replaced by those containing less. This study has provided new information on aluminium levels in Chinese processed foods, raw materials and a selection of food additives.
Brody, Gene H; Dorsey, Shannon; Forehand, Rex; Armistead, Lisa
2002-01-01
The unique contributions that parenting processes (high levels of monitoring with a supportive, involved mother-child relationship) and classroom processes (high levels of organization, rule clarity, and student involvement) make to children's self-regulation and adjustment were examined with a sample of 277 single-parent African American families. A multi-informant design involving mothers, teachers, and 7- to 15-year-old children was used. Structural equation modeling indicated that parenting and classroom processes contributed uniquely to children's adjustment through the children's development of self-regulation. Additional analyses suggested that classroom processes can serve a protective-stabilizing function when parenting processes are compromised, and vice versa. Further research is needed to examine processes in both family and school contexts that promote child competence and resilience.
Identification of Sources of Endotoxin Exposure as Input for Effective Exposure Control Strategies.
van Duuren-Stuurman, Birgit; Gröllers-Mulderij, Mariska; van de Runstraat, Annemieke; Duisterwinkel, Anton; Terwoert, Jeroen; Spaan, Suzanne
2018-02-13
The aim of the present study was to investigate the levels of endotoxins on product samples from potatoes, onions, and seeds, representing a relevant part of the agro-food industry in the Netherlands, in order to gather valuable insights into possibilities for exposure control measures early in the industrial processing of these products. Endotoxin levels on 330 product samples from companies representing the potato, onion, and seed (processing) industry (four potato-packaging companies, five potato-processing companies, five onion-packaging companies, and four seed-processing companies) were assessed using the Limulus Amoebocyte Lysate (LAL) assay. As variations in growth conditions (type of soil, growth type) and product characteristics (surface roughness, dustiness, size, species) are assumed to influence the level of endotoxin on products, different types and growth conditions were considered when collecting the samples. Additionally, waste material, rotten products, felt material (used for drying), and process water were collected. A large variation in the endotoxin levels was found on samples of potatoes, onions, and seeds (overall geometric standard deviation 17), ranging from 0.7 EU g-1 to 16,400,000 EU g-1. The highest geometric mean endotoxin levels were found in plant material (319600 EU g-1), followed by soil material (49100 EU g-1) and the outer side of products (9300 EU g-1), indicating that removal of plant and soil material early in the process would be an effective exposure control strategy. The high levels of endotoxins found in the limited number of samples from rotten onions indicate that these rotten onions should also be removed early in the process. The mean endotoxin level found in waste material (only available for seed processing) is similar to that found in soil material, although the range is much larger. On uncleaned seeds, higher endotoxin levels were found than on cleaned seeds, indicating that cleaning processes are important control measures and also that the waste material should be handled with care. Although endotoxin levels in batches of to-be-processed potatoes, onions, and seeds vary quite dramatically, it could be concluded that rotten products, plant material, and waste material contain particularly high endotoxin levels. This information was used to propose control measures to reduce workers' exposure to endotoxins during the production process. © The Author(s) 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
High- and low-level hierarchical classification algorithm based on source separation process
NASA Astrophysics Data System (ADS)
Loghmari, Mohamed Anis; Karray, Emna; Naceur, Mohamed Saber
2016-10-01
High-dimensional data applications have earned great attention in recent years. We focus on remote sensing data analysis in high-dimensional space, such as hyperspectral data. From a methodological viewpoint, remote sensing data analysis is not a trivial task. Its complexity is caused by many factors, such as large spectral or spatial variability as well as the curse of dimensionality. The latter describes the problem of data sparseness. In this particular ill-posed problem, a reliable classification approach requires appropriate modeling of the classification process. The proposed approach is based on a hierarchical clustering algorithm in order to deal with remote sensing data in high-dimensional space. Indeed, one obvious method to perform dimensionality reduction is to use the independent component analysis process as a preprocessing step. The first particularity of our method is the special structure of its cluster tree. Most hierarchical algorithms associate leaves with individual clusters and start from a large number of individual classes equal to the number of pixels; in our approach, however, leaves are associated with the most relevant sources, which are represented along mutually independent axes so as to specifically represent some land covers associated with a limited number of clusters. These sources contribute to the refinement of the clustering by providing complementary rather than redundant information. The second particularity of our approach is that at each level of the cluster tree we combine both a high-level divisive clustering and a low-level agglomerative clustering. This approach reduces the computational cost, since the high-level divisive clustering is controlled by a simple Boolean operator, and optimizes the clustering results, since the low-level agglomerative clustering is guided by the most relevant independent sources. At each new step we thus obtain a finer partition that feeds back into the clustering process, enhancing semantic capabilities and yielding good identification rates.
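As a rough software analogue of the pipeline described above, the sketch below uses standard scikit-learn components: ICA as the dimensionality-reduction step, a divisive (k-means) split at the top of the tree, and an agglomerative refinement within each branch. The component counts and cluster numbers are illustrative, and the authors' actual tree construction and Boolean control are not reproduced.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.cluster import KMeans, AgglomerativeClustering

# X: hyperspectral cube flattened to (n_pixels, n_bands); synthetic stand-in here.
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 64))

# Step 1 (dimensionality reduction): project pixels onto a few independent sources.
sources = FastICA(n_components=6, random_state=0).fit_transform(X)

# Step 2 (high-level divisive step): coarse split into broad land-cover groups.
coarse = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(sources)

# Step 3 (low-level agglomerative step): refine each coarse group in source space.
labels = np.empty(len(X), dtype=int)
next_label = 0
for c in np.unique(coarse):
    idx = np.where(coarse == c)[0]
    fine = AgglomerativeClustering(n_clusters=2).fit_predict(sources[idx])
    labels[idx] = fine + next_label
    next_label += fine.max() + 1
```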
Measuring and assessing maintainability at the end of high level design
NASA Technical Reports Server (NTRS)
Briand, Lionel C.; Morasca, Sandro; Basili, Victor R.
1993-01-01
Software architecture appears to be one of the main factors affecting software maintainability. Therefore, in order to predict and assess maintainability early in the development process, we need to be able to measure the high-level design characteristics that affect the change process. To this end, we propose a measurement approach that is based on precise assumptions derived from the change process, is grounded in Object-Oriented Design principles, and is partially language independent. We define metrics for cohesion, coupling, and visibility in order to capture the difficulty of isolating, understanding, designing and validating changes.
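The paper defines its own cohesion, coupling, and visibility metrics, and their formulas are not given in the abstract; the snippet below only illustrates the general idea of a design-level coupling count (fan-out over a hypothetical class-dependency table), not the authors' metrics.

```python
# Illustrative only: count, per class, how many other classes it references ("fan-out" coupling).
design = {
    "Order":    {"uses": ["Customer", "Invoice", "Product"]},
    "Invoice":  {"uses": ["Customer"]},
    "Customer": {"uses": []},
    "Product":  {"uses": []},
}

def fanout_coupling(design):
    """Return {class: number of distinct classes it depends on}."""
    return {cls: len(set(info["uses"])) for cls, info in design.items()}

print(fanout_coupling(design))   # {'Order': 3, 'Invoice': 1, 'Customer': 0, 'Product': 0}
```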
NASA Astrophysics Data System (ADS)
Park, Keecheol; Oh, Kyungsuk
2017-09-01
In order to investigate the effect of leveling conditions on residual stress evolution during the leveling of hot rolled high strength steels, the in-plane residual stresses of sheet processed under controlled conditions at a skin-pass mill and at levelers were measured by the cutting method. The residual stress was localized near the edge of the sheet, and the affected region expanded as the sheet thickness increased. The magnitude of residual stress within the sheet decreased as the deformation imposed during leveling increased, but the residual stress was not removed completely. The magnitude of camber occurring in a cut plate could be predicted from the residual stress distribution. A numerical algorithm was developed for analysing the effect of leveling conditions on residual stress; it accounts for plastic deformation during leveling, tension, work roll bending, and the initial state of the sheet (residual stress and curl distribution). The validity of the simulated results was verified by comparison with the experimentally measured residual stress and curl in a sheet.
High pressure processing's potential to inactivate norovirus and other foodborne viruses
USDA-ARS?s Scientific Manuscript database
High pressure processing (HPP) can inactivate human norovirus. However, not all viruses are equally susceptible to HPP. Pressure treatment parameters such as required pressure levels, initial pressurization temperatures, and pressurization times substantially affect inactivation. How food matrix ...
The moderating effect of motivation on health-related decision-making.
Berezowska, Aleksandra; Fischer, Arnout R H; Trijp, Hans C M van
2017-06-01
This study identifies how autonomous and controlled motivation moderate the cognitive process that drives the adoption of personalised nutrition services. The cognitive process comprises perceptions of privacy risk, personalisation benefit, and their determinants. Depending on their level of autonomous and controlled motivation, participants (N = 3453) were assigned to one of four motivational orientations, which resulted in a 2 (low/high autonomous motivation) × 2 (low/high controlled motivation) quasi-experimental design. High levels of autonomous motivation strengthened the extent to which: (1) the benefits of engaging with a service determined the outcome of a risk-benefit trade-off; (2) the effectiveness of a service determined benefit perceptions. High levels of controlled motivation influenced the extent to which: (1) the risk of privacy loss determined the outcome of a risk-benefit trade-off; (2) controlling personal information after disclosure and perceiving the disclosed personal information as sensitive determined the risk of potential privacy loss. To encourage the adoption of personalised dietary recommendations, for individuals with high levels of autonomous motivation the emphasis should be on benefits and their determinants. For those with high levels of controlled motivation, it is important to focus on risk-related issues such as information sensitivity.
Hybrid photonic signal processing
NASA Astrophysics Data System (ADS)
Ghauri, Farzan Naseer
This thesis proposes research on novel hybrid photonic signal processing systems in the areas of optical communications, test and measurement, RF signal processing, and extreme-environment optical sensors. It will be shown that the use of innovative hybrid techniques allows the design of photonic signal processing systems with superior performance parameters and enhanced capabilities. These applications can be divided into the domains of analog-digital hybrid signal processing applications and free-space and fiber-coupled hybrid optical sensors. The analog-digital hybrid signal processing applications include a high-performance analog-digital hybrid MEMS variable optical attenuator that can simultaneously provide high dynamic range as well as high-resolution attenuation controls; an analog-digital hybrid MEMS beam profiler that allows high-power, watt-level laser beam profiling and provides both submicron-level high resolution and wide-area profiling coverage; and all-optical transversal RF filters that operate on the principle of broadband optical spectral control using MEMS and/or Acousto-Optic Tunable Filter (AOTF) devices, which can provide continuous, digital, or hybrid signal time delay and weight selection. The hybrid optical sensors presented in the thesis are extreme-environment pressure sensors and dual temperature-pressure sensors. The sensors employ hybrid free-space and fiber-coupled techniques for remotely monitoring a system subjected simultaneously to extremely high temperatures and pressures.
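The optical transversal RF filters mentioned above follow the tapped-delay-line model: each tap contributes a delayed, weighted copy of the input, and the weights and delays set the RF response. The sketch below shows that signal-processing model in software only; the tap values are illustrative and nothing here models the MEMS/AOTF hardware itself.

```python
import numpy as np

def transversal_filter(x, weights, delays):
    """Tapped-delay-line model: y[n] = sum_k w_k * x[n - d_k] (delays in samples)."""
    y = np.zeros_like(x, dtype=float)
    for w, d in zip(weights, delays):
        y[d:] += w * x[:len(x) - d]
    return y

# Illustrative taps: uniform delays and weights give a comb-like, low-pass RF response.
fs = 1e9                                    # 1 GS/s sampling of the RF signal
t = np.arange(0, 1e-6, 1 / fs)
x = np.sin(2 * np.pi * 50e6 * t)            # 50 MHz test tone
y = transversal_filter(x, weights=[0.25, 0.25, 0.25, 0.25], delays=[0, 4, 8, 12])
```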
A SIMPLE CELLULAR AUTOMATON MODEL FOR HIGH-LEVEL VEGETATION DYNAMICS
We have produced a simple two-dimensional (ground-plan) cellular automaton model of vegetation dynamics specifically to investigate high-level community processes. The model is probabilistic, with individual plant behavior determined by physiologically-based rules derived from a w...
Ahmed, Lubna; de Fockert, Jan W
2012-10-01
Selective attention to relevant targets has been shown to depend on the availability of working memory (WM). Under conditions of high WM load, processing of irrelevant distractors is enhanced. Here we showed that this detrimental effect of WM load on selective attention efficiency is reversed when the task requires global- rather than local-level processing. Participants were asked to attend to either the local or the global level of a hierarchical Navon stimulus while keeping either a low or a high load in WM. In line with previous findings, during attention to the local level, distractors at the global level produced more interference under high than under low WM load. By contrast, loading WM had the opposite effect of improving selective attention during attention to the global level. The findings demonstrate that the impact of WM load on selective attention is not invariant, but rather is dependent on the level of the to-be-attended information.
Preliminary technical data summary No. 3 for the Defense Waste Processing Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Landon, L.F.
1980-05-01
This document presents an update on the best information presently available for the purpose of establishing the basis for the design of a Defense Waste Processing Facility. The objective of this project is to provide a facility to fix the radionuclides present in Savannah River Plant (SRP) high-level liquid waste in a high-integrity form (glass). Flowsheets and material balances reflect the alternate CAB case including the incorporation of low-level supernate in concrete. (DLC)
Anger Assessment in Rural High School Students
ERIC Educational Resources Information Center
Lamb, Jacqueline M.; Puskar, Kathryn R.; Sereika, Susan; Patterson, Kathy; Kaufmann, Judith A.
2003-01-01
Anger and aggression in school children are a major concern in American society today. Students with high anger levels and poor cognitive processing skills are at risk for poor relationships, underachievement in school, and health problems. This article describes characteristics of children who are at risk for high anger levels and aggression as…
NASA Astrophysics Data System (ADS)
Benkrid, K.; Belkacemi, S.; Sukhsawas, S.
2005-06-01
This paper proposes an integrated framework for the high-level design of high-performance signal processing algorithm implementations on FPGAs. The framework emerged from a constant need to rapidly implement increasingly complicated algorithms on FPGAs while maintaining the high performance needed in many real-time digital signal processing applications. This is particularly important for application developers who often rely on iterative and interactive development methodologies. The central idea behind the proposed framework is to dynamically integrate high-performance structural hardware description languages with higher-level hardware languages in order to help satisfy the dual requirement of high-level design and high-performance implementation. The paper illustrates this by integrating two environments: Celoxica's Handel-C language and HIDE, a structural hardware environment developed at the Queen's University of Belfast. On the one hand, Handel-C has been proven to be very useful in the rapid design and prototyping of FPGA circuits, especially control-intensive ones. On the other hand, HIDE has been used extensively, and successfully, in the generation of highly optimised parameterisable FPGA cores. In this paper, this is illustrated in the construction of a scalable and fully parameterisable core for image algebra's five core neighbourhood operations, where fully floorplanned, efficient FPGA configurations, in the form of EDIF netlists, are generated automatically for instances of the core. In the proposed combined framework, highly optimised data paths are invoked dynamically from within Handel-C and are synthesized using HIDE. Although the idea might seem simple prima facie, it could have serious implications for the design of future generations of hardware description languages.
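As a plain-software reference for what such a parameterisable neighbourhood core computes (the HIDE/Handel-C implementation itself is hardware and is not reproduced here), a generic 3×3 neighbourhood operation can be written with the reduction operator as a parameter; erosion and dilation are shown as example instantiations.

```python
import numpy as np

def neighbourhood_op(img, reduce_fn):
    """Apply a 3x3 neighbourhood reduction (e.g. min, max, sum) at every interior pixel."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2), dtype=float)
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = reduce_fn(img[i:i + 3, j:j + 3])
    return out

img = np.random.randint(0, 256, size=(16, 16)).astype(float)
eroded  = neighbourhood_op(img, np.min)    # grayscale erosion
dilated = neighbourhood_op(img, np.max)    # grayscale dilation
```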
Characteristics of Friction Stir Processed UHMW Polyethylene Based Composite
NASA Astrophysics Data System (ADS)
Hussain, G.; Khan, I.
2018-01-01
Ultra-high molecular weight polyethylene (UHMWPE) based composites are widely used in the biomedical and food industries because of their biocompatibility and enhanced properties. The aim of this study was to fabricate a UHMWPE/nHA composite through heat-assisted Friction Stir Processing. The rotational speed (ω), feed rate (f), volume fraction of nHA (v), and shoulder temperature (T) were selected as the process parameters. Macroscopic and microscopic analysis revealed that these parameters have significant effects on the distribution of the reinforcing material, defect formation, and material mixing. Defects were observed especially at low levels of (ω, T) and high levels of (f, v). A low level of v with medium levels of the other parameters resulted in better mixing and minimal defects. A 10% increase in strength with only a 1% reduction in percent elongation was observed at this set of conditions. Moreover, the resulting hardness of the composite was higher than that of the parent material.
Neural correlates of depth of strategic reasoning in medial prefrontal cortex
Coricelli, Giorgio; Nagel, Rosemarie
2009-01-01
We used functional MRI (fMRI) to investigate human mental processes in a competitive interactive setting, the "beauty contest" game. This game is well-suited for investigating whether and how a player's mental processing incorporates the thinking process of others in strategic reasoning. We apply a cognitive hierarchy model to classify subjects' choices in the experimental game according to the degree of strategic reasoning so that we can identify the neural substrates of different levels of strategizing. According to this model, high-level reasoners expect the others to behave strategically, whereas low-level reasoners choose based on the expectation that others will choose randomly. The data show that high-level reasoning and a measure of strategic IQ (related to winning in the game) correlate with the neural activity in the medial prefrontal cortex, demonstrating its crucial role in successful mentalizing. This supports a cognitive hierarchy model of human brain and behavior. PMID:19470476
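The beauty contest game referred to here is usually the p-beauty contest, in which the winning number is the one closest to a fraction p of the group average. Under a level-k reading of the cognitive hierarchy idea (the parameters below are illustrative, not taken from the study), a level-0 player is expected to pick around the midpoint and each higher level best-responds to the level below.

```python
def level_k_choices(p=2/3, midpoint=50.0, max_level=4):
    """Illustrative level-k choices in a p-beauty contest: level k best-responds to level k-1."""
    choices = {0: midpoint}                      # level-0: random play, expected value = midpoint
    for k in range(1, max_level + 1):
        choices[k] = p * choices[k - 1]          # level-k targets p times the level-(k-1) choice
    return choices

print(level_k_choices())   # {0: 50.0, 1: 33.3..., 2: 22.2..., 3: 14.8..., 4: 9.87...}
```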
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1982-09-01
The U.S. Department of Energy (DOE) is considering the selection of a strategy for the long-term management of the defense high-level wastes at the Idaho Chemical Processing Plant (ICPP). This report describes the environmental impacts of alternative strategies. These alternative strategies include leaving the calcine in its present form at the Idaho National Engineering Laboratory (INEL), or retrieving and modifying the calcine to a more durable waste form and disposing of it either at the INEL or in an offsite repository. This report addresses only the alternatives for a program to manage the high-level waste generated at the ICPP. 24 figures, 60 tables.
Holmberg, Leif
2007-11-01
A health-care organization simultaneously belongs to two different institutional value patterns: a professional and an administrative value pattern. At the administrative level, medical problem-solving processes are generally perceived as the efficient application of familiar chains of activities to well-defined problems; and a low task uncertainty is therefore assumed at the work-floor level. This assumption is further reinforced through clinical pathways and other administrative guidelines. However, studies have shown that in clinical practice such administrative guidelines are often considered inadequate and difficult to implement mainly because physicians generally perceive task uncertainty to be high and that the guidelines do not cover the scope of encountered deviations. The current administrative level guidelines impose uniform structural features that meet the requirement for low task uncertainty. Within these structural constraints, physicians must organize medical problem-solving processes to meet any task uncertainty that may be encountered. Medical problem-solving processes with low task uncertainty need to be organized independently of processes with high task uncertainty. Each process must be evaluated according to different performance standards and needs to have autonomous administrative guideline models. Although clinical pathways seem appropriate when there is low task uncertainty, other kinds of guidelines are required when the task uncertainty is high.
Attention to Hierarchical Level Influences Attentional Selection of Spatial Scale
ERIC Educational Resources Information Center
Flevaris, Anastasia V.; Bentin, Shlomo; Robertson, Lynn C.
2011-01-01
Ample evidence suggests that global perception may involve low spatial frequency (LSF) processing and that local perception may involve high spatial frequency (HSF) processing (Shulman, Sullivan, Gish, & Sakoda, 1986; Shulman & Wilson, 1987; Robertson, 1996). It is debated whether SF selection is a low-level mechanism associating global…
ERIC Educational Resources Information Center
Hayward, Dana A.; Shore, David I.; Ristic, Jelena; Kovshoff, Hanna; Iarocci, Grace; Mottron, Laurent; Burack, Jacob A.
2012-01-01
We utilized a hierarchical figures task to determine the default level of perceptual processing and the flexibility of visual processing in a group of high-functioning young adults with autism (n = 12) and a group of typically developing young adults matched on chronological age and IQ (n = 12). In one task, participants attended to one level of the…
ERIC Educational Resources Information Center
Krebs, Saskia Susanne; Roebers, Claudia Maria
2012-01-01
This multi-phase study examined the influence of retrieval processes on children's metacognitive processes in relation to and in interaction with achievement level and age. First, N = 150 9/10- and 11/12-year old high and low achievers watched an educational film and predicted their test performance. Children then solved a cloze test regarding the…
High-powered CO2 lasers and noise control
NASA Astrophysics Data System (ADS)
Honkasalo, Antero; Kuronen, Juhani
High-power CO2 lasers are being more and more widely used for welding, drilling and cutting in machine shops. In the near future, different kinds of surface treatments will also become routine practice with laser units. The industries benefitting most from high-power lasers will be the automotive industry, shipbuilding, the offshore industry, the aerospace industry, and the nuclear and chemical processing industries. Metal processing lasers are interesting from the point of view of noise control because the working tool is a laser beam. It is reasonable to suppose that the use of such laser beams will lead to lower noise levels than those connected with traditional metal processing methods and equipment. In the following presentation, the noise levels and possible noise-control problems associated with the use of high-powered CO2 lasers are studied.
Theoretical approaches to lightness and perception.
Gilchrist, Alan
2015-01-01
Theories of lightness, like theories of perception in general, can be categorized as high-level, low-level, and mid-level. However, I will argue that in practice there are only two categories: one-stage mid-level theories, and two-stage low-high theories. Low-level theories usually include a high-level component and high-level theories include a low-level component, the distinction being mainly one of emphasis. Two-stage theories are the modern incarnation of the persistent sensation/perception dichotomy according to which an early experience of raw sensations, faithful to the proximal stimulus, is followed by a process of cognitive interpretation, typically based on past experience. Like phlogiston or the ether, raw sensations seem like they must exist, but there is no clear evidence for them. Proximal stimulus matches are postperceptual, not read off an early sensory stage. Visual angle matches are achieved by a cognitive process of flattening the visual world. Likewise, brightness (luminance) matches depend on a cognitive process of flattening the illumination. Brightness is not the input to lightness; brightness is slower than lightness. Evidence for an early (< 200 ms) mosaic stage is shaky. As for cognitive influences on perception, the many claims tend to fall apart upon close inspection of the evidence. Much of the evidence for the current revival of the 'new look' is probably better explained by (1) a natural desire of (some) subjects to please the experimenter, and (2) the ease of intuiting an experimental hypothesis. High-level theories of lightness are overkill. The visual system does not need to know the amount of illumination, merely which surfaces share the same illumination. This leaves mid-level theories derived from the gestalt school. Here the debate seems to revolve around layer models and framework models. Layer models fit our visual experience of a pattern of illumination projected onto a pattern of reflectance, while framework models provide a better account of illusions and failures of constancy. Evidence for and against these approaches is reviewed.
Friction stir processing on high carbon steel U12
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tarasov, S. Yu., E-mail: tsy@ispms.ru; Rubtsov, V. E., E-mail: rvy@ispms.ru; National Research Tomsk Polytechnic University, Tomsk, 634050
2015-10-27
Friction stir processing (FSP) of high carbon steel (U12) samples has been carried out using a milling machine and tools made of cemented tungsten carbide. The FSP tool was made with dimensions of 5×5×1.5 mm. The microstructural characterization of the resulting stir zone and heat-affected zone has been carried out. Microhardness at the level of 700 MPa has been obtained in the stir zone, with a microstructure consisting of large grains and a cementite network. This high level of microhardness is explained by a bainitic reaction developing from decarburization of austenitic grains during cementite network formation.
West Valley demonstration project: Alternative processes for solidifying the high-level wastes
NASA Astrophysics Data System (ADS)
Holton, L. K.; Larson, D. E.; Partain, W. L.; Treat, R. L.
1981-10-01
Two pretreatment approaches and several waste form processes for radioactive wastes were selected for evaluation. The two waste treatment approaches were the salt/sludge separation process and the combined waste process. Both terminal and interim waste form processes were studied.
EEG oscillations entrain their phase to high-level features of speech sound.
Zoefel, Benedikt; VanRullen, Rufin
2016-01-01
Phase entrainment of neural oscillations, the brain's adjustment to rhythmic stimulation, is a central component in recent theories of speech comprehension: the alignment between brain oscillations and speech sound improves speech intelligibility. However, phase entrainment to everyday speech sound could also be explained by oscillations passively following the low-level periodicities (e.g., in sound amplitude and spectral content) of auditory stimulation, and not by an adjustment to the speech rhythm per se. Recently, using novel speech/noise mixture stimuli, we have shown that behavioral performance can entrain to speech sound even when high-level features (including phonetic information) are not accompanied by fluctuations in sound amplitude and spectral content. In the present study, we report that neural phase entrainment might underlie our behavioral findings. We observed phase-locking between electroencephalogram (EEG) and speech sound in response not only to original (unprocessed) speech but also to our constructed "high-level" speech/noise mixture stimuli. Phase entrainment to original speech and speech/noise sound did not differ in the degree of entrainment, but rather in the actual phase difference between EEG signal and sound. Phase entrainment was not abolished when speech/noise stimuli were presented in reverse (which disrupts semantic processing), indicating that acoustic (rather than linguistic) high-level features play a major role in the observed neural entrainment. Our results provide further evidence for phase entrainment as a potential mechanism underlying speech processing and segmentation, and for the involvement of high-level processes in the adjustment to the rhythm of speech. Copyright © 2015 Elsevier Inc. All rights reserved.
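Phase entrainment of this kind is commonly quantified with a phase-locking value (PLV) between band-limited EEG and the speech envelope. The sketch below shows that generic computation using a Hilbert transform; it is an assumed, simplified stand-in for the authors' analysis, with the sampling rate, frequency band, and signals all synthetic.

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

def phase_locking_value(eeg, envelope, fs, band=(3.0, 8.0)):
    """Generic PLV between EEG and a speech envelope within a given frequency band."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    ph_eeg = np.angle(hilbert(filtfilt(b, a, eeg)))
    ph_env = np.angle(hilbert(filtfilt(b, a, envelope)))
    return np.abs(np.mean(np.exp(1j * (ph_eeg - ph_env))))

fs = 250                                    # Hz, illustrative
t = np.arange(0, 10, 1 / fs)
envelope = np.sin(2 * np.pi * 4 * t)        # stand-in 4 Hz speech envelope
eeg = 0.5 * np.sin(2 * np.pi * 4 * t + 0.8) + np.random.randn(len(t))  # entrained signal + noise
print(phase_locking_value(eeg, envelope, fs))
```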
SIMULANT DEVELOPMENT FOR SAVANNAH RIVER SITE HIGH LEVEL WASTE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stone, M; Russell Eibling, R; David Koopman, D
2007-09-04
The Defense Waste Processing Facility (DWPF) at the Savannah River Site vitrifies High Level Waste (HLW) for repository interment. The process consists of three major steps: waste pretreatment, vitrification, and canister decontamination/sealing. The HLW consists of insoluble metal hydroxides (primarily iron, aluminum, magnesium, manganese, and uranium) and soluble sodium salts (carbonate, hydroxide, nitrite, nitrate, and sulfate). The HLW is processed in large batches through DWPF; DWPF has recently completed processing Sludge Batch 3 (SB3) and is currently processing Sludge Batch 4 (SB4). The composition of metal species in SB4 is shown in Table 1 as a function of the ratio of a metal to iron. Simulants remove radioactive species and renormalize the remaining species. Supernate composition is shown in Table 2.
Toward large-area roll-to-roll printed nanophotonic sensors
NASA Astrophysics Data System (ADS)
Karioja, Pentti; Hiltunen, Jussi; Aikio, Sanna M.; Alajoki, Teemu; Tuominen, Jarkko; Hiltunen, Marianne; Siitonen, Samuli; Kontturi, Ville; Böhlen, Karl; Hauser, Rene; Charlton, Martin; Boersma, Arjen; Lieberzeit, Peter; Felder, Thorsten; Eustace, David; Haskal, Eliav
2014-05-01
Polymers have become an important material group in fabricating discrete photonic components and integrated optical devices. This is due to their good properties: high optical transmittance, versatile processability at relatively low temperatures, and potential for low-cost production. Recently, nanoimprinting, or nanoimprint lithography (NIL), has attracted plenty of research interest. In NIL, a mold is pressed against a substrate coated with a moldable material. After deformation of the material, the mold is separated and a replica of the mold is formed. Compared with conventional lithographic methods, imprinting is simple to carry out, requires less-complicated equipment, and can provide high resolution with high throughput. Nanoimprint lithography has shown potential to become a method for low-cost, high-throughput fabrication of nanostructures. We show the development process of nano-structured, large-area multi-parameter sensors using Photonic Crystal (PC) and Surface Enhanced Raman Scattering (SERS) methodologies for environmental and pharmaceutical applications. We address these challenges by developing roll-to-roll (R2R) UV-nanoimprint fabrication methods. Our development steps are the following: Firstly, proof-of-concept structures are fabricated with wafer-level processes in Si-based materials. Secondly, master molds of successful designs are fabricated and used to transfer the nanophotonic structures into polymer materials using sheet-level UV-nanoimprinting. Thirdly, the sheet-level nanoimprinting processes are transferred to roll-to-roll fabrication. In order to enhance roll-to-roll manufacturing capabilities, silicone-based polymer material development was carried out. In the different development phases, Photonic Crystal and SERS sensor structures of increasing complexity were fabricated in polymer materials in order to refine the sheet-level and roll-to-roll manufacturing processes. In addition, chemical and molecular imprint (MIP) functionalization methods were applied in the sensor demonstrators. In this paper, the process flow for fabricating large-area nanophotonic structures by sheet-level and roll-to-roll UV-nanoimprinting is reported.
The Solid Phase Curing Time Effect of Asbuton with Texapon Emulsifier at the Optimum Bitumen Content
NASA Astrophysics Data System (ADS)
Sarwono, D.; Surya D, R.; Setyawan, A.; Djumari
2017-07-01
Buton asphalt (asbuton) has not been utilized optimally in Indonesia. Its utilization rate is still low because processed asbuton products are still impractical to use and require high processing costs. This research aimed to obtain a practical asphalt product from asbuton through an extraction process that does not require expensive processing. The work was carried out experimentally in the laboratory. The emulsified asbuton was composed of asbuton 5/20 grain, premium, texapon, HCl, and aquades (distilled water). The solid phase was the mixture of asbuton 5/20 grain and premium, with a mixing time of 3 minutes; the liquid phase consisted of texapon, HCl, and aquades. An aging step followed the solid-phase mixing so that the reaction and bonding within the solid-phase mixture could become more complete and yield a higher asphalt solubility level. The aging times were 30, 60, 90, 120, and 150 minutes. The solid and liquid phases were then mixed to produce the emulsified asbuton, which was extracted for 25 minutes. The asphalt solubility level, water content, and asphalt characteristics were tested on the extraction product with the optimum asphalt level. The analysis showed that the highest asphalt solubility level in the extracted asbuton, 94.77%, was obtained at an aging time of 120 minutes. The water content test showed that the water content of the emulsified asbuton decreased as the solid-phase aging time increased. Examination of the asphalt characteristics of the extraction product with the optimum asphalt solubility level yielded specimens with a rigid, strong texture, so the resulting ductility and penetration values were not sufficient.
Etchepare, Aurore; Prouteau, Antoinette
2018-04-01
Social cognition has received growing interest in many conditions in recent years. However, this construct still suffers from a considerable lack of consensus, especially regarding the dimensions to be studied and the resulting methodology of clinical assessment. Our review aims to clarify the distinctiveness of the dimensions of social cognition. Based on Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statements, a systematic review was conducted to explore the factor structure of social cognition in the adult general and clinical populations. The initial search provided 441 articles published between January 1982 and March 2017. Eleven studies were included, all conducted in psychiatric populations and/or healthy participants. Most studies were in favor of a two-factor solution. Four studies drew a distinction between low-level (e.g., facial emotion/prosody recognition) and high-level (e.g., theory of mind) information processing. Four others reported a distinction between affective (e.g., facial emotion/prosody recognition) and cognitive (e.g., false beliefs) information processing. Interestingly, attributional style was frequently reported as an additional separate factor of social cognition. Results of factor analyses add further support for the relevance of models differentiating level of information processing (low- vs. high-level) from nature of processed information (affective vs. cognitive). These results add to a significant body of empirical evidence from developmental, clinical research and neuroimaging studies. We argue the relevance of integrating low- versus high-level processing with affective and cognitive processing in a two-dimensional model of social cognition that would be useful for future research and clinical practice. (JINS, 2018, 24, 391-404).
García-Capdevila, Sílvia; Portell-Cortés, Isabel; Torras-Garcia, Meritxell; Coll-Andreu, Margalida; Costa-Miserachs, David
2009-09-14
The effect of long-term voluntary exercise (running wheel) on anxiety-like behaviour (plus maze and open field) and learning and memory processes (object recognition and two-way active avoidance) was examined in Wistar rats. Because major individual differences in running wheel behaviour were observed, the data were analysed considering the exercising animals both as a whole and grouped according to the time spent in the running wheel (low, high, and very-high running). Although some variables related to anxiety-like behaviour seem to reflect an anxiogenic-compatible effect, the complete set of variables can be interpreted as an enhancement of defensive and risk-assessment behaviours in exercised animals, without major differences depending on the exercise level. Effects on learning and memory processes were dependent on task and level of exercise. Two-way avoidance was not affected in either the acquisition or the retention session, while retention of the object recognition task was affected. In this latter task, an enhancement in low-running subjects and an impairment in high- and very-high-running animals were observed.
Rachiplusia nu larva as a biofactory to achieve high level expression of horseradish peroxidase.
Romero, Lucía Virginia; Targovnik, Alexandra Marisa; Wolman, Federico Javier; Cascone, Osvaldo; Miranda, María Victoria
2011-05-01
A process based on orally-infected Rachiplusia nu larvae as biological factories for expression and one-step purification of horseradish peroxidase isozyme C (HRP-C) is described. The process allows obtaining high levels of pure HRP-C by membrane chromatography purification. The introduction of the partial polyhedrin homology sequence element in the target gene increased the HRP-C expression level by 2.8-fold, whereas it increased 1.8-fold when the larvae were reared at 27 °C instead of at 24 °C, amounting to a 4.6-fold overall increase in the expression level. Additionally, HRP-C purification by membrane chromatography at a high flow rate greatly increased the productivity without affecting the resolution. The V(max) and K(m) values of the recombinant HRP-C were similar to those of the HRP from Armoracia rusticana roots. © Springer Science+Business Media B.V. 2011
Intelligent Processing Equipment Research Supported by the National Science Foundation
NASA Technical Reports Server (NTRS)
Rao, Suren B.
1992-01-01
The research in progress on processes, workstations, and systems has the goal of developing a high level of understanding of the issues involved. This will enable the incorporation of a level of intelligence that will allow the creation of autonomous manufacturing systems that operate in an optimum manner, under a wide range of conditions. The emphasis of the research has been on the development of highly productive and flexible techniques to address current and future problems in manufacturing and processing. Several of these projects have resulted in well-defined and established models that can now be implemented in the application arena in the next few years.
Geers, Ann E; Hayes, Heather
2011-02-01
This study had three goals: (1) to document the literacy skills of deaf adolescents who received cochlear implants (CIs) as preschoolers; (2) to examine reading growth from elementary grades to high school; (3) to assess the contribution of early literacy levels and phonological processing skills, among other factors, to literacy levels in high school. A battery of reading, spelling, expository writing, and phonological processing assessments were administered to 112 high school (CI-HS) students, ages 15.5 to 18.5 yrs, who had participated in a reading assessment battery in early elementary grades (CI-E), ages 8.0 to 9.9 yrs. The CI-HS students' performance was compared with either a control group of hearing peers (N = 46) or hearing norms provided by the assessment developer. Many of the CI-HS students (47 to 66%) performed within or above the average range for hearing peers on reading tests. When compared with their CI-E performance, good early readers were also good readers in high school. Importantly, the majority of CI-HS students maintained their reading levels over time compared with hearing peers, indicating that the gap in performance was, at the very least, not widening for most students. Written expression and phonological processing tasks posed a great deal of difficulty for the CI-HS students. They were poorer spellers, poorer expository writers, and displayed poorer phonological knowledge than hearing age-mates. Phonological processing skills were a critical predictor of high school literacy skills (reading, spelling, and expository writing), accounting for 39% of variance remaining after controlling for child, family, and implant characteristics. Many children who receive CIs as preschoolers achieve age-appropriate literacy levels as adolescents. However, significant delays in spelling and written expression are evident compared with hearing peers. For children with CIs, the development of phonological processing skills is not just important for early reading skills, such as decoding, but is critical for later literacy success as well.
NASA Astrophysics Data System (ADS)
Huang, J. C.; Wright, W. V.
1982-04-01
The Defense Waste Processing Facility (DWPF) for immobilizing nuclear high level waste (HLW) is scheduled to be built. High level waste is produced when reactor components are subjected to chemical separation operations. Two candidates for immobilizing this HLW are borosilicate glass and crystalline ceramic, either being contained in weld sealed stainless steel canisters. A number of technical analyses are being conducted to support a selection between these two waste forms. The risks associated with the manufacture and interim storage of these two forms in the DWPF are compared. Process information used in the risk analysis was taken primarily from a DWPF processibility analysis. The DWPF environmental analysis provided much of the necessary environmental information.
ERIC Educational Resources Information Center
Koolen, Sophieke; Vissers, Constance Th. W. M.; Hendriks, Angelique W. C. J.; Egger, Jos I. M.; Verhoeven, Ludo
2012-01-01
This study examined the hypothesis of an atypical interaction between attention and language in ASD. A dual-task experiment with three conditions was designed, in which sentences were presented that contained errors requiring attentional focus either at (a) low level, or (b) high level, or (c) both levels of language. Speed and accuracy for error…
Beigneux, Anne P.; Davies, Brandon S. J.; Gin, Peter; Weinstein, Michael M.; Farber, Emily; Qiao, Xin; Peale, Franklin; Bunting, Stuart; Walzem, Rosemary L.; Wong, Jinny S.; Blaner, William S.; Ding, Zhi-Ming; Melford, Kristan; Wongsiriroj, Nuttaporn; Shu, Xiao; de Sauvage, Fred; Ryan, Robert O.; Fong, Loren G.; Bensadoun, André; Young, Stephen G.
2007-01-01
The triglycerides in chylomicrons are hydrolyzed by lipoprotein lipase (LpL) along the luminal surface of the capillaries. However, the endothelial cell molecule that facilitates chylomicron processing by LpL has not yet been defined. Here, we show that glycosylphosphatidylinositol-anchored high density lipoprotein–binding protein 1 (GPIHBP1) plays a critical role in the lipolytic processing of chylomicrons. Gpihbp1-deficient mice exhibit a striking accumulation of chylomicrons in the plasma, even on a low-fat diet, resulting in milky plasma and plasma triglyceride levels as high as 5,000 mg/dl. Normally, Gpihbp1 is expressed highly in heart and adipose tissue, the same tissues that express high levels of LpL. In these tissues, GPIHBP1 is located on the luminal face of the capillary endothelium. Expression of GPIHBP1 in cultured cells confers the ability to bind both LpL and chylomicrons. These studies strongly suggest that GPIHBP1 is an important platform for the LpL-mediated processing of chylomicrons in capillaries. PMID:17403372
Process for measuring low cadmium levels in blood and other biological specimens
Peterson, David P.; Huff, Edmund A.; Bhattacharyya, Maryka H.
1994-01-01
A process for measuring low levels of cadmium in blood and other biological specimens is provided without interference from high levels of alkali metal contaminants by forming an aqueous solution and without contamination by environmental cadmium absent the proteins from the specimen, selectively removing cadmium from the aqueous solution on an anion exchange resin, thereby removing the alkali metal contaminants, resolubilizing cadmium from the resin to form a second solution and analyzing the second solution for cadmium, the process being carried out in a cadmium-free environment.
Process for measuring low cadmium levels in blood and other biological specimens
Peterson, David P.; Huff, Edmund A.; Bhattacharyya, Maryka H.
1994-05-03
A process for measuring low levels of cadmium in blood and other biological specimens is provided without interference from high levels of alkali metal contaminants by forming an aqueous solution and without contamination by environmental cadmium absent the proteins from the specimen, selectively removing cadmium from the aqueous solution on an anion exchange resin, thereby removing the alkali metal contaminants, resolubilizing cadmium from the resin to form a second solution and analyzing the second solution for cadmium, the process being carried out in a cadmium-free environment.
"Assessment Drives Learning": Do Assessments Promote High-Level Cognitive Processing?
ERIC Educational Resources Information Center
Bezuidenhout, M. J.; Alt, H.
2011-01-01
Students tend to learn in the way they know, or think, they will be assessed. Therefore, to ensure deep, meaningful learning, assessments must be geared to promote cognitive processing that requires complex, contextualised thinking to construct meaning and create knowledge. Bloom's taxonomy of cognitive levels is used worldwide to assist in…
NASA Astrophysics Data System (ADS)
Witantyo; Setyawan, David
2018-03-01
In the lead-acid battery industry, grid casting is a process with high levels of defects and thickness variation. The DMAIC (Define-Measure-Analyse-Improve-Control) method and its tools were used to improve the casting process. In the Define stage, a project charter and the SIPOC (Supplier Input Process Output Customer) method were used to map the existing problem. In the Measure stage, data were collected on the types and numbers of defects and on the grid thickness variation; the retrieved data were then processed and analysed using the 5 Whys and FMEA methods. In the Analyse stage, grids exhibiting fragile and crack-type defects were examined under a microscope, which revealed Pb oxide inclusions in the grid. The analysis of the grid casting process pointed to an excessively large temperature difference between the molten metal and the mould, as well as a corking process that lacked a standard. In the Improve stage, corrective actions reduced the grid thickness variation and lowered the defect/unit level from 9.184% to 0.492%. In the Control stage, a new working standard was established and a control process was put in place for the improved casting process.
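As a small illustration of the kind of metric tracked above, the sketch below shows how a defect/unit level can be computed from inspection data. The counts are hypothetical; only the before/after percentages echo the figures quoted in the abstract.

```python
# Illustrative sketch only: computing a defect/unit level before and after a
# DMAIC improvement cycle. The inspection counts are hypothetical; only the
# resulting percentages mirror the figures quoted in the abstract.

def defect_rate(defective_grids: int, total_grids: int) -> float:
    """Return the defect/unit level as a percentage of cast grids."""
    return 100.0 * defective_grids / total_grids

before = defect_rate(defective_grids=919, total_grids=10_000)  # ~9.2 %
after = defect_rate(defective_grids=49, total_grids=10_000)    # ~0.5 %
print(f"defect/unit before: {before:.3f} %, after: {after:.3f} %")
```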
Effective low-level processing for interferometric image enhancement
NASA Astrophysics Data System (ADS)
Joo, Wonjong; Cha, Soyoung S.
1995-09-01
The hybrid operation of digital image processing and a knowledge-based AI system has been recognized as a desirable approach to the automated evaluation of noise-ridden interferograms. Early noise/data reduction, before the phase is extracted, is essential for the success of the knowledge-based processing. In this paper, new concepts for effective, interactive low-level processing operators, namely a background-matched filter and a directional-smoothing filter, are developed and tested with transonic aerodynamic interferograms. The results indicate that these new operators have promising advantages in noise/data reduction over conventional ones, leading to successful high-level, intelligent phase extraction.
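A minimal sketch of the idea behind one of these operators is given below. It is an assumption-laden illustration, not the paper's implementation: it smooths an interferogram along a single, fixed fringe orientation with a 1-D averaging kernel, so noise is reduced without blurring across the fringes.

```python
# Sketch (not the paper's operator): directional smoothing of an interferogram
# by convolving with a 1-D averaging kernel oriented along the fringes.
import numpy as np
from scipy.ndimage import convolve

def directional_smooth(image: np.ndarray, angle_deg: float, length: int = 9) -> np.ndarray:
    """Average along a line oriented at angle_deg (degrees from the x-axis)."""
    half = length // 2
    theta = np.deg2rad(angle_deg)
    kernel = np.zeros((length, length))
    for t in range(-half, half + 1):
        r = int(round(half + t * np.sin(theta)))   # row offset along the line
        c = int(round(half + t * np.cos(theta)))   # column offset along the line
        kernel[r, c] = 1.0
    kernel /= kernel.sum()
    return convolve(image, kernel, mode="nearest")

# Example: vertical fringes (intensity varies only with x), smoothed along the
# fringe direction (90 degrees), which leaves the fringe pattern intact.
y, x = np.mgrid[0:128, 0:128]
fringes = np.sin(0.25 * x)
noisy = fringes + 0.3 * np.random.default_rng(0).standard_normal(fringes.shape)
smoothed = directional_smooth(noisy, angle_deg=90.0)
```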
Stevenson, Ryan A; Sun, Sol Z; Hazlett, Naomi; Cant, Jonathan S; Barense, Morgan D; Ferber, Susanne
2018-04-01
Atypical sensory perception is one of the most ubiquitous symptoms of autism, including a tendency towards a local-processing bias. We investigated whether local-processing biases were associated with global-processing impairments on a global/local attentional-scope paradigm in conjunction with a composite-face task. Behavioural results were related to individuals' levels of autistic traits, specifically the Attention to Detail subscale of the Autism Quotient, and the Sensory Profile Questionnaire. Individuals showing high rates of Attention to Detail were more susceptible to global attentional-scope manipulations, suggesting that local-processing biases associated with Attention to Detail do not come at the cost of a global-processing deficit, but reflect a difference in default global versus local bias. This relationship operated at the attentional/perceptual level, but not response criterion.
Denys, S; Van Loey, A M; Hendrickx, M E
2000-01-01
A numerical heat transfer model for predicting product temperature profiles during high-pressure thawing processes was recently proposed by the authors. In the present work, the predictive capacity of the model was considerably improved by taking into account the pressure dependence of the latent heat of the product that was used (Tylose). The effect of pressure on the latent heat of Tylose was experimentally determined by a series of freezing experiments conducted at different pressure levels. By combining a numerical heat transfer model for freezing processes with a least sum of squares optimization procedure, the corresponding latent heat at each pressure level was estimated, and the obtained pressure relation was incorporated in the original high-pressure thawing model. Excellent agreement with the experimental temperature profiles for both high-pressure freezing and thawing was observed.
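The estimation step described above can be illustrated with a deliberately simplified sketch. The toy lumped freezing model and the brute-force search below are assumptions for illustration only (the study fitted a full numerical heat-transfer model); they show the pattern of choosing the latent-heat value whose simulated temperature profile minimises the sum of squared deviations from a measured one.

```python
# Sketch only: a toy lumped freezing model plus a least-sum-of-squares search
# for the latent heat. The real study fitted a full numerical heat-transfer
# model; parameter names and values here are hypothetical.
import numpy as np

def simulate(latent_heat, t, T0=20.0, T_env=-30.0, Tf=-1.0, band=1.0,
             cp=3.7e3, k=0.02):
    """Toy freezing curve: latent heat is released over a narrow band below Tf."""
    T = np.empty_like(t)
    T[0] = T0
    dt = t[1] - t[0]
    for i in range(1, len(t)):
        c_app = cp + (latent_heat / band if Tf - band < T[i - 1] <= Tf else 0.0)
        T[i] = T[i - 1] + dt * k * (T_env - T[i - 1]) * cp / c_app
    return T

t = np.linspace(0.0, 900.0, 901)                                   # time, s
measured = simulate(250e3, t) + np.random.default_rng(1).normal(0, 0.1, t.size)

candidates = np.linspace(100e3, 400e3, 301)                        # J/kg
sse = [np.sum((simulate(L, t) - measured) ** 2) for L in candidates]
best = candidates[int(np.argmin(sse))]
print(f"estimated latent heat ~ {best / 1e3:.0f} kJ/kg")
```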
Shifts in information processing level: the speed theory of intelligence revisited.
Sircar, S S
2000-06-01
A hypothesis is proposed here to reconcile the inconsistencies observed in the IQ-P3 latency relation. The hypothesis stems from the observation that task-induced increase in P3 latency correlates positively with IQ scores. It is hypothesised that: (a) there are several parallel information processing pathways of varying complexity which are associated with the generation of P3 waves of varying latencies; (b) with increasing workload, there is a shift in the 'information processing level' through progressive recruitment of more complex polysynaptic pathways with greater processing power and inhibition of the oligosynaptic pathways; (c) high-IQ subjects have a greater reserve of higher level processing pathways; (d) a given 'task-load' imposes a greater 'mental workload' in subjects with lower IQ than in those with higher IQ. According to this hypothesis, a meaningful comparison of the P3 correlates of IQ is possible only when the information processing level is pushed to its limits.
Contamination pathways of spore-forming bacteria in a vegetable cannery.
Durand, Loïc; Planchon, Stella; Guinebretiere, Marie-Hélène; André, Stéphane; Carlin, Frédéric; Remize, Fabienne
2015-06-02
Spoilage of low-acid canned food during prolonged storage at high temperatures is caused by heat resistant thermophilic spores of strict or facultative bacteria. Here, we performed a bacterial survey over two consecutive years on the processing line of a French company manufacturing canned mixed green peas and carrots. In total, 341 samples were collected, including raw vegetables, green peas and carrots at different steps of processing, cover brine, and process environment samples. Thermophilic and highly-heat-resistant thermophilic spores growing anaerobically were counted. During vegetable preparation, anaerobic spore counts were significantly decreased, and tended to remain unchanged further downstream in the process. Large variation of spore levels in products immediately before the sterilization process could be explained by occasionally high spore levels on surfaces and in debris of vegetable combined with long residence times in conditions suitable for growth and sporulation. Vegetable processing was also associated with an increase in the prevalence of highly-heat-resistant species, probably due to cross-contamination of peas via blanching water. Geobacillus stearothermophilus M13-PCR genotypic profiling on 112 isolates determined 23 profile-types and confirmed process-driven cross-contamination. Taken together, these findings clarify the scheme of contamination pathway by thermophilic spore-forming bacteria in a vegetable cannery. Copyright © 2015 Elsevier B.V. All rights reserved.
Parallel Architectures and Parallel Algorithms for Integrated Vision Systems. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Choudhary, Alok Nidhi
1989-01-01
Computer vision is regarded as one of the most complex and computationally intensive problems. An integrated vision system (IVS) is a system that uses vision algorithms from all levels of processing to perform a high-level application (e.g., object recognition). An IVS normally involves algorithms from low-level, intermediate-level, and high-level vision. Designing parallel architectures for vision systems is of tremendous interest to researchers. Several issues in parallel architectures and parallel algorithms for integrated vision systems are addressed.
Small numbers are sensed directly, high numbers constructed from size and density.
Zimmermann, Eckart
2018-04-01
Two theories compete to explain how we estimate the numerosity of visual object sets. The first suggests that apparent numerosity is derived from an analysis of lower-level features such as the size and density of the set. The second suggests that numbers are sensed directly. Consistent with the latter claim is the existence of neurons in parietal cortex that are specialized for processing the numerosity of elements in the visual scene. However, recent evidence suggests that only low numbers can be sensed directly, whereas the perception of high numbers is supported by the analysis of low-level features. Processing of low and high numbers, being located at different levels of the neural hierarchy, should involve different receptive field sizes. Here, I tested this idea with visual adaptation by measuring the spatial spread of number adaptation for low and high numerosities. A focused adaptation spread for high numerosities suggested the involvement of early neural levels, where receptive fields are comparatively small, and the broad spread for low numerosities was consistent with processing by number neurons, which have larger receptive fields. These results provide evidence for the claim that different mechanisms generate the perception of visual numerosity: whereas low numbers are sensed directly as a primary visual attribute, the estimation of high numbers likely depends on the area over which the objects are spread. Copyright © 2017 Elsevier B.V. All rights reserved.
Ultra-processed foods and the limits of product reformulation.
Scrinis, Gyorgy; Monteiro, Carlos Augusto
2018-01-01
The nutritional reformulation of processed food and beverage products has been promoted as an important means of addressing the nutritional imbalances in contemporary dietary patterns. The focus of most reformulation policies is the reduction in quantities of nutrients-to-limit - Na, free sugars, SFA, trans-fatty acids and total energy. The present commentary examines the limitations of what we refer to as 'nutrients-to-limit reformulation' policies and practices, particularly when applied to ultra-processed foods and drink products. Beyond these nutrients-to-limit, there are a range of other potentially harmful processed and industrially produced ingredients used in the production of ultra-processed products that are not usually removed during reformulation. The sources of nutrients-to-limit in these products may be replaced with other highly processed ingredients and additives, rather than with whole or minimally processed foods. Reformulation policies may also legitimise current levels of consumption of ultra-processed products in high-income countries and increased levels of consumption in emerging markets in the global South.
Wang, Fan; Du, Bao-Lei; Cui, Zheng-Wei; Xu, Li-Ping; Li, Chun-Yang
2017-03-01
The aim of this study was to investigate the effects of high hydrostatic pressure and thermal processing on the microbiological quality, bioactive compounds, antioxidant activity, and volatile profile of mulberry juice. High hydrostatic pressure processing at 500 MPa for 10 min reduced the total viable count from 4.38 log cfu/ml to a nondetectable level and completely inactivated yeasts and molds in raw mulberry juice, ensuring microbiological safety comparable to thermal processing at 85 °C for 15 min. High hydrostatic pressure processing maintained significantly (p < 0.05) higher contents of total phenolics, total flavonoids and resveratrol, and higher antioxidant activity of mulberry juice than thermal processing. The main volatile compounds of mulberry juice were aldehydes, alcohols, and ketones. High hydrostatic pressure processing enhanced the volatile compound concentrations of mulberry juice while thermal processing reduced them in comparison with the control. These results suggest that high hydrostatic pressure processing could be an alternative to conventional thermal processing for the production of high-quality mulberry juice.
Pacis, Efren; Yu, Marcella; Autsen, Jennifer; Bayer, Robert; Li, Feng
2011-10-01
The glycosylation profile of therapeutic antibodies is routinely analyzed throughout development to monitor the impact of process parameters and to ensure consistency, efficacy, and safety for clinical and commercial batches of therapeutic products. In this study, unusually high levels of the mannose-5 (Man5) glycoform were observed during the early development of a therapeutic antibody produced from a Chinese hamster ovary (CHO) cell line, model cell line A. Follow-up studies indicated that the antibody Man5 level increased throughout the course of cell culture production as a result of increasing cell culture medium osmolality and extended culture duration. With model cell line A, Man5 glycosylation increased more than twofold, from 12% to 28%, in the fed-batch process through a combination of high basal and feed media osmolality and increased run duration. The osmolality and culture duration effects were also observed for four other CHO antibody-producing cell lines by adding NaCl to both basal and feed media and extending the culture duration of the cell culture process. Moreover, reduction of the Man5 level in model cell line A was achieved by supplementing MnCl2 at appropriate concentrations. To further understand the role of glycosyltransferases in Man5 levels, N-acetylglucosaminyltransferase I (GnT-I) mRNA levels at different osmolality conditions were measured. It has been hypothesized that specific enzyme activity in the glycosylation pathway could have been altered in this fed-batch process. Copyright © 2011 Wiley Periodicals, Inc.
ERIC Educational Resources Information Center
Adu-Gyamfi, Kenneth; Ampiah, Joseph Ghartey
2016-01-01
Science education at the Basic School (Primary and Junior High School) serves as the foundation upon which higher levels of science education are pivoted. This ethnographic study sought to investigate the teaching of Integrated Science at the Junior High School (JHS) level in the classrooms of two science teachers in two schools of differing…
Theron, Chrispian W; Berrios, Julio; Delvigne, Frank; Fickers, Patrick
2018-01-01
The methylotrophic yeast Komagataella (Pichia) pastoris has become one of the most utilized cell factories for the production of recombinant proteins over the last three decades. This success story is linked to its specific physiological traits, i.e., the ability to grow at high cell density in inexpensive culture medium and to secrete proteins at high yield. Exploiting methanol metabolism is at the core of most P. pastoris-based processes but comes with its own challenges. Co-feeding cultures with glycerol/sorbitol and methanol is a promising approach, which can benefit from improved understanding and prediction of the metabolic response. The development of profitable processes relies on the construction and selection of efficient producing strains from less efficient ones, but also depends on the ability to master the bioreactor process itself; more specifically, on how a bioreactor process can be monitored and controlled to obtain a high production yield. In this review, new perspectives are detailed regarding a multi-faceted approach to recombinant protein production processes by P. pastoris, including gaining improved understanding of the metabolic pathways involved, accounting for variations in transcriptional and translational efficiency at the single-cell level, and efficient monitoring and control of methanol levels at the bioreactor level.
Cappa, Carola; Lucisano, Mara; Barbosa-Cánovas, Gustavo V; Mariotti, Manuela
2016-07-01
The impact of high pressure (HP) processing on corn starch, rice flour and waxy rice flour was investigated as a function of pressure level (400 MPa; 600 MPa), pressure holding time (5 min; 10 min), and temperature (20 °C; 40 °C). Samples were pre-conditioned (final moisture level: 40 g/100 g) before HP treatments. Both the HP-treated and the untreated raw materials were evaluated for pasting properties and solvent retention capacity, and investigated by differential scanning calorimetry, X-ray diffractometry and environmental scanning electron microscopy. Different pasting behaviors and solvent retention capacities were evidenced according to the applied pressure. Corn starch presented a slower gelatinization trend when treated at 600 MPa. Corn starch and rice flour treated at 600 MPa showed a higher retention capacity of carbonate and lactic acid solvents, respectively. Differential scanning calorimetry and environmental scanning electron microscopy investigations highlighted that HP affected the starch structure of rice flour and corn starch. Few variations were evidenced in waxy rice flour. These results can assist in advancing HP processing knowledge, as they demonstrate the possibility of successfully processing raw samples at a very high sample-to-water concentration. This work investigates the effect of high pressure as a potential technique to modify the processing characteristics of starchy materials without using high temperature. In this case the starches were processed in powder form - and not as a slurry as in previously reported studies - showing the flexibility of the HP treatment. The relevance for industrial application is the possibility to change the structure of flour starches, and thus modify the processability of the mentioned products. Copyright © 2016 Elsevier Ltd. All rights reserved.
Atomic Processes for XUV Lasers: Alkali Atoms and Ions
NASA Astrophysics Data System (ADS)
Dimiduk, David Paul
The development of extreme ultraviolet (XUV) lasers is dependent upon knowledge of processes in highly excited atoms. Described here are spectroscopy experiments which have identified and characterized certain autoionizing energy levels in core-excited alkali atoms and ions. Such levels, termed quasi-metastable, have desirable characteristics as upper levels for efficient, powerful XUV lasers. Quasi-metastable levels give rise to some of the most intense emission lines in the XUV spectra of core-excited alkalis. Laser experiments utilizing these levels have proved to be useful in characterizing other core-excited levels. Three experiments to study quasi-metastable levels are reported. The first experiment is vacuum ultraviolet (VUV) absorption spectroscopy on the Cs 109 nm transitions using high-resolution laser techniques. This experiment confirms the identification of transitions to a quasi-metastable level, estimates transition oscillator strengths, and estimates the hyperfine splitting of the quasi-metastable level. The second experiment, XUV emission spectroscopy of Ca II and Sr II in a microwave-heated plasma, identifies transitions from quasi-metastable levels in these ions and provides confirming evidence of their radiative, rather than autoionizing, character. In the third experiment, core-excited Ca II ions are produced by inner-shell photoionization of Ca with soft x-rays from a laser-produced plasma. This preliminary experiment demonstrated a method of creating large numbers of these highly excited ions for future spectroscopic experiments. Experimental and theoretical evidence suggests that the Ca II 3p⁵3d4s ⁴F°₃/₂ quasi-metastable level may be directly pumped via a dipole ionization process from the Ca I ground state. The direct process is permitted by J conservation and occurs due to configuration mixing in the final state, and possibly the initial state as well. The experiments identifying and characterizing quasi-metastable levels are compared to calculations using the Hartree-Fock code RCN/RCG. Calculated parameters include energy levels, wavefunctions, and transition rates. Based on an extension of this code, earlier unexplained experiments showing strong two-electron radiative transitions from quasi-metastable levels are now understood.
Code of Federal Regulations, 2011 CFR
2011-07-01
... production or activity level. (1) If the expected mix of products serves as the basis for the batch mass... from the high-level calibration gas is at least 20 times the standard deviation of the response from... 25A, 40 CFR part 60, appendix A, is acceptable if the response from the high-level calibration gas is...
Code of Federal Regulations, 2011 CFR
2011-07-01
... limitation is not dependent upon any past production or activity level. (1) If the expected mix of products... acceptable if the response from the high-level calibration gas is at least 20 times the standard deviation of..., appendix A is acceptable if the response from the high-level calibration gas is at least 20 times the...
Murphy, Steven C; Martin, Nicole H; Barbano, David M; Wiedmann, Martin
2016-12-01
This article provides an overview of the influence of raw milk quality on the quality of processed dairy products and offers a perspective on the merits of investing in quality. Dairy farmers are frequently offered monetary premium incentives to provide high-quality milk to processors. These incentives are most often based on raw milk somatic cell and bacteria count levels well below the regulatory public health-based limits. Justification for these incentive payments can be based on improved processed product quality and manufacturing efficiencies that provide the processor with a return on their investment for high-quality raw milk. In some cases, this return on investment is difficult to measure. Raw milks with high levels of somatic cells and bacteria are associated with increased enzyme activity that can result in product defects. Use of raw milk with somatic cell counts >100,000cells/mL has been shown to reduce cheese yields, and higher levels, generally >400,000 cells/mL, have been associated with textural and flavor defects in cheese and other products. Although most research indicates that fairly high total bacteria counts (>1,000,000 cfu/mL) in raw milk are needed to cause defects in most processed dairy products, receiving high-quality milk from the farm allows some flexibility for handling raw milk, which can increase efficiencies and reduce the risk of raw milk reaching bacterial levels of concern. Monitoring total bacterial numbers in regard to raw milk quality is imperative, but determining levels of specific types of bacteria present has gained increasing importance. For example, spores of certain spore-forming bacteria present in raw milk at very low levels (e.g., <1/mL) can survive pasteurization and grow in milk and cheese products to levels that result in defects. With the exception of meeting product specifications often required for milk powders, testing for specific spore-forming groups is currently not used in quality incentive programs in the United States but is used in other countries (e.g., the Netherlands). Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Determining robot actions for tasks requiring sensor interaction
NASA Technical Reports Server (NTRS)
Budenske, John; Gini, Maria
1989-01-01
The performance of non-trivial tasks by a mobile robot has been a long-term objective of robotics research. One of the major stumbling blocks to this goal is the conversion of high-level planning goals and commands into the actuator and sensor processing controls. In order for a mobile robot to accomplish a non-trivial task, the task must be described in terms of primitive actions of the robot's actuators. Most non-trivial tasks require the robot to interact with its environment, thus necessitating coordination of sensor processing and actuator control to accomplish the task. The main contention is that the transformation from the high-level description of the task to the primitive actions should be performed primarily at execution time, when knowledge about the environment can be obtained through sensors. It is proposed to produce the detailed plan of primitive actions by using a collection of low-level planning components that contain domain-specific knowledge and knowledge about the available sensors, actuators, and sensor/actuator processing. This collection will perform signal and control processing as well as serve as a control interface between an actual mobile robot and a high-level planning system. Previous research has shown the usefulness of high-level planning systems for planning the coordination of activities so as to achieve a goal, but none have been fully applied to actual mobile robots due to the complexity of interacting with sensors and actuators. This control interface is currently being implemented on a LABMATE mobile robot connected to a SUN workstation and is being developed so as to enable the LABMATE to perform non-trivial, sensor-intensive tasks as specified by a planning system.
Tradeoffs in the design of a system for high level language interpretation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Osorio, F.C.C.; Patt, Y.N.
The problem of designing a system for high-level language interpretation (HLLI) is considered. First, a model of the design process is presented in which several styles of design, e.g., Turing machine interpretation, CISC architecture interpretation and RISC architecture interpretation, are treated uniformly. Second, the most significant characteristics of HLLI are analysed in the context of different design styles, and some guidelines are presented on how to identify the most suitable design style for a given high-level language problem. 12 references.
Grinding and classification of pine bark for use as plywood adhesive filler
Thomas L. Eberhardt; Karen G. Reed
2005-01-01
Prior efforts to incorporate bark or bark extracts into composites have met with only limited success because of poor performance relative to existing products and/or economic barriers stemming from high levels of processing. We are currently investigating applications for southern yellow pine (SYP) bark that require intermediate levels of processing, one being the use...
ERIC Educational Resources Information Center
Gökçen, Elif; Frederickson, Norah; Petrides, K. V.
2016-01-01
Autism spectrum disorder (ASD) is characterised by profound difficulties in empathic processing and executive control. Whilst the links between these processes have been frequently investigated in populations with autism, few studies have examined them at the subclinical level. In addition, the contribution of alexithymia, a trait characterised by…
Baldwin, Carryl L; Struckman-Johnson, David
2002-01-15
Speech displays and verbal response technologies are increasingly being used in complex, high workload environments that require the simultaneous performance of visual and manual tasks. Examples of such environments include the flight decks of modern aircraft, advanced transport telematics systems providing in-vehicle route guidance and navigational information, and mobile communication equipment in emergency and public safety vehicles. Previous research has established an optimum range for speech intelligibility. However, the potential for variations in presentation levels within this range to affect attentional resources and cognitive processing of speech material has not been examined previously. Results of the current experimental investigation demonstrate that as presentation level increases within this 'optimum' range, participants in high workload situations make fewer sentence-processing errors and generally respond faster. Processing errors were more sensitive to changes in presentation level than were measures of reaction time. Implications of these findings are discussed in terms of their application for the design of speech communications displays in complex multi-task environments.
Techakanon, Chukwan; Gradziel, Thomas M; Zhang, Lu; Barrett, Diane M
2016-09-28
Fruit maturity is an important factor associated with final product quality, and it may have an effect on the level of browning in peaches that are high pressure processed (HPP). Peaches from three different maturities, as determined by firmness (M1 = 50-55 N, M2 = 35-40 N, and M3 = 15-20 N), were subjected to pressure levels at 0.1, 200, and 400 MPa for 10 min. The damage from HPP treatment results in loss of fruit integrity and the development of browning during storage. Increasing pressure levels of HPP treatment resulted in greater damage, particularly in the more mature peaches, as determined by shifts in transverse relaxation time (T2) of the vacuolar component and by light microscopy. The discoloration of peach slices of different maturities processed at the same pressure was comparable, indicating that the effect of pressure level is greater than that of maturity in the development of browning.
Yoon, Chiyul; Noh, Seungwoo; Lee, Jung Chan; Ko, Sung Ho; Ahn, Wonsik; Kim, Hee Chan
2014-03-01
The continuous autotransfusion system has been widely used in surgical operations. It is known that if oil is added to blood, and this mixture is then processed by an autotransfusion device, the added oil is removed and reinfusion of fat is prevented by the device. However, there is no detailed report on the influence of the particular washing program selected on the levels of blood components including blood fat after continuous autotransfusion using such a system. Fresh bovine blood samples were processed by a commercial continuous autotransfusion device using the "emergency," "quality," and "high-quality" programs, applied in random order. Complete blood count (CBC) and serum chemistry were analyzed to determine how the blood processing performance of the device changes with the washing program applied. There was no significant difference in the CBC results obtained with the three washing programs. Although all of the blood lipids in the processed blood were decreased compared to those in the blood before processing, the levels of triglyceride, phospholipid, and total cholesterol after processing via the emergency program were significantly higher than those present after processing via the quality and high-quality programs. Although the continuous autotransfusion device provided consistent hematocrit quality, the levels of some blood lipid components showed significant differences among the washing programs.
Effects of technological processes on enniatin levels in pasta.
Serrano, Ana B; Font, Guillermina; Mañes, Jordi; Ferrer, Emilia
2016-03-30
Potential human health risks posed by enniatins (ENs) require their control, primarily in cereal products, creating a demand for harvesting, food processing and storage techniques capable of preventing, reducing and/or eliminating the contamination. In this study, different methodologies of pasta processing simulating traditional and industrial processes were developed in order to follow the fate of the mycotoxin ENs. The levels of ENs were studied at different steps of pasta processing. The effect of temperature during processing was evaluated in two types of pasta (white and whole-grain pasta). Mycotoxin analysis was performed by LC-MS/MS. High reductions (up to 50% and 80%) were achieved during pasta drying at 45-55°C and 70-90°C, respectively. Treatments at low temperature (25°C) did not change EN levels. Pasta composition did not have a significant effect on the stability of ENs. The effect of temperature allowed a marked mycotoxin reduction during pasta processing. Generally, ENA1 and ENB showed higher thermal stability than did ENA and ENB1. The findings from the present study suggest that pasta processing at medium-high temperatures is a potential tool to remove an important fraction of ENs from the initial durum wheat semolina. © 2015 Society of Chemical Industry.
Maier, Maximilian B; Lenz, Christian A; Vogel, Rudi F
2017-01-01
The effect of high pressure thermal (HPT) processing on the inactivation of spores of proteolytic type B Clostridium botulinum TMW 2.357 in four differently composed low-acid foods (green peas with ham, steamed sole, vegetable soup, braised veal) was studied in an industrially feasible pressure range and temperatures between 100 and 120°C. Inactivation curves exhibited rapid inactivation during compression and decompression followed by strong tailing effects. The highest inactivation (approx. 6-log cycle reduction) was obtained in braised veal at 600 MPa and 110°C after 300 s pressure-holding time. In general, inactivation curves exhibited similar negative exponential shapes, but maximum achievable inactivation levels were lower in foods with higher fat contents. At high treatment temperatures, spore inactivation was more effective at lower pressure levels (300 vs. 600 MPa), which indicates a non-linear pressure/temperature-dependence of the HPT spore inactivation efficiency. A comparison of spore inactivation levels achievable using HPT treatments versus a conventional heat sterilization treatment (121.1°C, 3 min) illustrates the potential of combining high pressures and temperatures to replace conventional retorting with the possibility to reduce the process temperature or shorten the processing time. Finally, experiments using varying spore inoculation levels suggested the presence of a resistant fraction comprising approximately 0.01% of a spore population as reason for the pronounced tailing effects in survivor curves. The loss of the high resistance properties upon cultivation indicates that those differences develop during sporulation and are not linked to permanent modifications at the genetic level.
Process-based Cost Estimation for Ramjet/Scramjet Engines
NASA Technical Reports Server (NTRS)
Singh, Brijendra; Torres, Felix; Nesman, Miles; Reynolds, John
2003-01-01
Process-based cost estimation plays a key role in effecting cultural change that integrates distributed science, technology and engineering teams to rapidly create innovative and affordable products. Working together, NASA Glenn Research Center and Boeing Canoga Park have developed a methodology of process-based cost estimation bridging the methodologies of high-level parametric models and detailed bottom-up estimation. The NASA GRC/Boeing CP process-based cost model provides a probabilistic structure of layered cost drivers. High-level inputs characterize mission requirements, system performance, and relevant economic factors. Design alternatives are extracted from a standard, product-specific work breakdown structure to pre-load lower-level cost driver inputs and generate the cost-risk analysis. As the product design progresses and matures, the lower-level, more detailed cost drivers can be re-accessed and the projected variation of input values narrowed, thereby generating a progressively more accurate estimate of cost-risk. Incorporated into the process-based cost model are techniques for decision analysis, specifically the analytic hierarchy process (AHP) and functional utility analysis. Design alternatives may then be evaluated not just on cost-risk, but also on user-defined performance and schedule criteria. This implementation of full trade-study support contributes significantly to the realization of the integrated development environment. The process-based cost estimation model generates development and manufacturing cost estimates. The development team plans to expand the manufacturing process base from approximately 80 manufacturing processes to over 250 processes. Operation and support cost modeling is also envisioned. Process-based estimation considers the materials, resources, and processes in establishing cost-risk and, rather than depending on weight as an input, actually estimates weight along with cost and schedule.
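The AHP step mentioned above can be illustrated with a short, generic sketch. The pairwise comparison matrix and criterion names below are hypothetical, not taken from the NASA GRC/Boeing model; the sketch only shows the standard principal-eigenvector derivation of priority weights and Saaty's consistency ratio.

```python
# Generic AHP sketch (hypothetical criteria and judgements): derive priority
# weights from a pairwise comparison matrix via its principal eigenvector and
# check Saaty's consistency ratio.
import numpy as np

def ahp_weights(pairwise):
    """Return (normalised priority weights, consistency ratio)."""
    eigvals, eigvecs = np.linalg.eig(pairwise)
    k = int(np.argmax(eigvals.real))
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()
    n = pairwise.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)            # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.12)   # Saaty's random index
    return weights, ci / ri

# Hypothetical comparison of three criteria: cost-risk, performance, schedule.
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 2.0],
              [1 / 5, 1 / 2, 1.0]])
w, cr = ahp_weights(A)
print("weights:", np.round(w, 3), "consistency ratio:", round(cr, 3))
```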
High-level waste borosilicate glass: A compendium of corrosion characteristics. Volume 3
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cunnane, J.C.; Bates, J.K.; Bradley, C.R.
1994-03-01
The objective of this document is to summarize scientific information pertinent to evaluating the extent to which high-level waste borosilicate glass corrosion and the associated radionuclide release processes are understood for the range of environmental conditions to which waste glass may be exposed in service. Alteration processes occurring within the bulk of the glass (e.g., devitrification and radiation-induced changes) are discussed insofar as they affect glass corrosion. Volume III contains a bibliography of glass corrosion studies, including studies that are not cited in Volumes I and II.
Shi, Yanwei; Ling, Wencui; Qiang, Zhimin
2013-01-01
The effect of chlorine dioxide (ClO2) oxidation on the formation of disinfection by-products (DBPs) during sequential (ClO2 pre-oxidation for 30 min) and simultaneous disinfection processes with free chlorine (FC) or monochloramine (MCA) was investigated. The formation of DBPs from synthetic humic acid (HA) water and three natural surface waters containing low bromide levels (11-27 microg/L) was comparatively examined in the FC-based (single FC, sequential ClO2-FC, and simultaneous ClO2/FC) and MCA-based (single MCA, ClO2-MCA, and ClO2/MCA) disinfection processes. The results showed that much more DBPs were formed from the synthetic HA water than from the three natural surface waters with comparative levels of dissolved organic carbon. In the FC-based processes, ClO2 oxidation could reduce trihalomethanes (THMs) by 27-35% and haloacetic acids (HAAs) by 14-22% in the three natural surface waters, but increased THMs by 19% and HAAs by 31% in the synthetic HA water after an FC contact time of 48 h. In the MCA-based processes, similar trends were observed although DBPs were produced at a much lower level. There was an insignificant difference in DBPs formation between the sequential and simultaneous processes. The presence of a high level of bromide (320 microg/L) remarkably promoted the DBPs formation in the FC-based processes. Therefore, the simultaneous disinfection process of ClO2/MCA is recommended particularly for waters with a high bromide level.
NASA Astrophysics Data System (ADS)
Mungov, G.; Dunbar, P. K.; Stroker, K. J.; Sweeney, A.
2016-12-01
The National Oceanic and Atmospheric Administration (NOAA) National Centers for Environmental Information (NCEI) is the data repository for high-resolution, integrated water-level data supporting tsunami research, risk assessment and mitigation to protect life and property along the coasts. NCEI responsibilities include, but are not limited to, processing, archiving and distributing coastal water-level data from different sources in support of studies of tsunami and storm-surge inundation, sea-level change, climate variability, etc. High-resolution data for global historical tsunami events are collected from the Deep-ocean Assessment and Reporting of Tsunami (DART®) tsunameter network maintained by NOAA's National Data Buoy Center (NDBC), from coastal tide gauges maintained by NOAA's Center for Operational Oceanographic Products and Services (CO-OPS) and the Tsunami Warning Centers, from historic marigrams and images, from bathymetric data, and from other national and international sources. The NCEI-CO water-level database is developed in close collaboration with all data providers along with NOAA's Pacific Marine Environmental Laboratory. We outline here the present state of water-level data processing with regard to the increasing need for high-precision, homogeneous and "clean" tsunami records built from data of different sources and different sampling intervals. Two tidal models are compared: Foreman's improved oceanographic model (2009) and the Akaike Bayesian Information Criterion approach applied by Tamura et al. (1991). The effects of filtering and the limits of its application are also discussed, along with the method used for de-spiking the raw time series.
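As an illustration of one processing step mentioned above, the sketch below shows a simple running-median/MAD de-spiking filter. It is an assumption made for illustration only, not NCEI's operational procedure; the window length, threshold and synthetic record are all hypothetical.

```python
# De-spiking sketch (illustrative only, not the operational NCEI method):
# flag points that deviate from a running median by more than a few scaled
# median absolute deviations, then mask them.
import numpy as np

def despike(series: np.ndarray, window: int = 11, n_mad: float = 5.0) -> np.ndarray:
    """Return a copy of a 1-min sea-level series with spikes set to NaN."""
    half = window // 2
    padded = np.pad(series, half, mode="edge")
    cleaned = series.astype(float).copy()
    for i in range(series.size):
        local = padded[i:i + window]
        med = np.median(local)
        mad = np.median(np.abs(local - med)) or 1e-6   # avoid a zero MAD
        if abs(series[i] - med) > n_mad * 1.4826 * mad:
            cleaned[i] = np.nan
    return cleaned

# Example: a synthetic 12-hour record with two artificial spikes.
t = np.arange(0, 720)                        # minutes
level = 0.5 * np.sin(2 * np.pi * t / 360)    # simple tide-like signal, metres
level[100] += 1.5                            # spike
level[400] -= 2.0                            # spike
print(np.where(np.isnan(despike(level)))[0])  # -> [100 400]
```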
Raizes, Meytal; Elkana, Odelia; Franko, Motty; Ravona Springer, Ramit; Segev, Shlomo; Beeri, Michal Schnaider
2016-01-01
We explored the association of plasma glucose levels within the normal range with processing speed in high functioning young elderly, free of type 2 diabetes mellitus (T2DM). A sample of 41 participants (mean age = 64.7, SD = 10; glucose 94.5 mg/dL, SD = 9.3), were examined with a computerized cognitive battery. Hierarchical linear regression analysis showed that higher plasma glucose levels, albeit within the normal range (<110 mg/dL), were associated with longer reaction times (p < 0.01). These findings suggest that even in the subclinical range and in the absence of T2DM, monitoring plasma glucose levels may have an impact on cognitive function.
Ndidi, Uche Samuel; Ndidi, Charity Unekwuojo; Olagunju, Abbas; Muhammad, Aliyu; Billy, Francis Graham; Okpe, Oche
2014-01-01
This research was aimed at evaluating the proximate composition, level of anti-nutrients, and the mineral composition of raw and processed Sphenostylis stenocarpa seeds and at examining the effect of processing on the parameters. From the proximate composition analysis, the ash content showed no significant difference (P > 0.05) between the processed and unprocessed (raw) samples. However, there was significant difference (P < 0.05) in the levels of moisture, crude lipid, nitrogen-free extract, gross energy, true protein, and crude fiber between the processed and unprocessed S. stenocarpa. Analyses of the antinutrient composition show that the processed S. stenocarpa registered significant reduction in levels of hydrogen cyanide, trypsin inhibitor, phytate, oxalate, and tannins compared to the unprocessed. Evaluation of the mineral composition showed that the level of sodium, calcium, and potassium was high in both the processed and unprocessed sample (150–400 mg/100 g). However, the level of iron, copper, zinc, and magnesium was low in both processed and unprocessed samples (2–45 mg/100 g). The correlation analysis showed that tannins and oxalate affected the levels of ash and nitrogen-free extract of processed and unprocessed seeds. These results suggest that the consumption of S. stenocarpa will go a long way in reducing the level of malnutrition in northern Nigeria. PMID:24967265
Colossal Tooling Design: 3D Simulation for Ergonomic Analysis
NASA Technical Reports Server (NTRS)
Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid
2003-01-01
The application of high-level 3D simulation software to the design phase of colossal mandrel tooling for composite aerospace fuel tanks was accomplished to discover and resolve safety and human engineering problems. The analyses were conducted to determine safety, ergonomic and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. Three-dimensional graphics high-level software, incorporating various ergonomic analysis algorithms, was utilized to determine if the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in the identification of material handling equipment and devices for the mandrel tooling assembly/disassembly process.
Rapid Disaster Damage Estimation
NASA Astrophysics Data System (ADS)
Vu, T. T.
2012-07-01
The experiences from recent disaster events showed that detailed information derived from high-resolution satellite images could accommodate the requirements of damage analysts and disaster management practitioners. The richer information contained in such high-resolution images, however, increases the complexity of image analysis. As a result, few image analysis solutions can be practically used under time pressure in the context of post-disaster and emergency response. To fill the gap in the employment of remote sensing in disaster response, this research develops a rapid high-resolution satellite mapping solution built upon a dual-scale contextual framework to support damage estimation after a catastrophe. The target objects are buildings (or building blocks) and their condition. On the coarse processing level, statistical region merging is deployed to group pixels into a number of coarse clusters. Based on a majority rule over vegetation, water and shadow indices, it is possible to eliminate the irrelevant clusters; the remaining clusters likely consist of building structures and others. On the fine processing level, within each remaining cluster, smaller objects are formed using morphological analysis. Numerous indicators, including spectral, textural and shape indices, are computed to be used in a rule-based object classification. The computation time of raster-based analysis depends strongly on the image size, or in other words on the number of processed pixels. Breaking the analysis into two processing levels helps to reduce the number of processed pixels and the redundancy of processing irrelevant information. In addition, it allows a data- and task-based parallel implementation. The performance is demonstrated with QuickBird images of Phanga, Thailand, an area affected by the 2004 Indian Ocean tsunami. The developed solution will be implemented on different platforms as well as in a web processing service for operational use.
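A minimal sketch of the coarse-level elimination step is given below. The band names, index thresholds and toy clusters are hypothetical illustrations, not the paper's actual rules; the sketch only shows how a majority rule over per-pixel vegetation, water and shadow indicators can drop irrelevant clusters.

```python
# Sketch (hypothetical thresholds and bands, not the paper's exact rules) of
# the coarse-level step: compute per-pixel vegetation, water and shadow
# indicators, then drop any cluster in which a majority of pixels is
# vegetation, water or shadow, keeping only likely built-up clusters.
import numpy as np

def keep_candidate_clusters(red, nir, brightness, labels,
                            ndvi_thr=0.3, water_thr=-0.1, shadow_thr=0.15):
    """Return the label ids of clusters that survive the majority rule."""
    ndvi = (nir - red) / (nir + red + 1e-6)
    is_vegetation = ndvi > ndvi_thr
    is_water = ndvi < water_thr
    is_shadow = brightness < shadow_thr
    irrelevant = is_vegetation | is_water | is_shadow
    kept = []
    for lab in np.unique(labels):
        mask = labels == lab
        if irrelevant[mask].mean() <= 0.5:      # majority rule
            kept.append(int(lab))
    return kept

# Tiny synthetic example: two 4x4 clusters, one vegetated, one bright/built-up.
labels = np.repeat([0, 1], 8).reshape(4, 4)
red = np.where(labels == 0, 0.1, 0.4)
nir = np.where(labels == 0, 0.6, 0.45)
brightness = np.where(labels == 0, 0.3, 0.6)
print(keep_candidate_clusters(red, nir, brightness, labels))   # -> [1]
```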
Conducting Original Research at the High School Level--the Students' Perspective.
ERIC Educational Resources Information Center
Scott, Marcus; VanNoord, Greg
1996-01-01
High school students discuss the process of conducting original scientific research in a high school biology course, including developing an idea, obtaining financial support, collecting data, and presenting findings. (MKR)
Cheng, Rebecca Wing-yi; Lam, Shui-fong; Chan, Joanne Chung-yan
2008-06-01
There has been an ongoing debate about the inconsistent effects of heterogeneous ability grouping on students in small group work such as project-based learning. The present research investigated the roles of group heterogeneity and processes in project-based learning. At the student level, we examined the interaction effect between students' within-group achievement and group processes on their self- and collective efficacy. At the group level, we examined how group heterogeneity was associated with the average self- and collective efficacy reported by the groups. The participants were 1,921 Hong Kong secondary students in 367 project-based learning groups. Student achievement was determined by school examination marks. Group processes, self-efficacy and collective efficacy were measured by a student-report questionnaire. Hierarchical linear modelling was used to analyse the nested data. When individual students in each group were taken as the unit of analysis, results indicated an interaction effect of group processes and students' within-group achievement on the discrepancy between collective- and self-efficacy. When compared with low achievers, high achievers reported lower collective efficacy than self-efficacy when group processes were of low quality. However, both low and high achievers reported higher collective efficacy than self-efficacy when group processes were of high quality. With 367 groups taken as the unit of analysis, the results showed that group heterogeneity, group gender composition and group size were not related to the discrepancy between collective- and self-efficacy reported by the students. Group heterogeneity was not a determinant factor in students' learning efficacy. Instead, the quality of group processes played a pivotal role because both high and low achievers were able to benefit when group processes were of high quality.
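The nested analysis described above can be sketched generically. The variable names and synthetic data below are hypothetical and the model is deliberately simplified (a random intercept per group with a cross-level interaction term); it only illustrates the hierarchical-linear-modelling pattern, not the authors' exact specification.

```python
# Generic HLM sketch (synthetic data, hypothetical variable names): students
# nested in project groups, with a cross-level interaction between
# within-group achievement and group process quality.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
groups, per_group = 60, 5
group_id = np.repeat(np.arange(groups), per_group)
process_quality = np.repeat(rng.normal(0, 1, groups), per_group)    # group level
achievement = rng.normal(0, 1, groups * per_group)                   # student level
group_effect = np.repeat(rng.normal(0, 0.5, groups), per_group)      # random intercepts
efficacy_gap = (0.3 * process_quality - 0.2 * achievement
                + 0.25 * achievement * process_quality
                + group_effect + rng.normal(0, 1, groups * per_group))

df = pd.DataFrame({"group_id": group_id, "process_quality": process_quality,
                   "achievement": achievement, "efficacy_gap": efficacy_gap})

model = smf.mixedlm("efficacy_gap ~ achievement * process_quality",
                    df, groups=df["group_id"]).fit()
print(model.summary())
```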
Mendez, Michelle A
2015-01-01
Background: “Processed foods” are defined as any foods other than raw agricultural commodities and can be categorized by the extent of changes occurring in foods as a result of processing. Conclusions about the association between the degree of food processing and nutritional quality are discrepant. Objective: We aimed to determine 2000–2012 trends in the contribution of processed and convenience food categories to purchases by US households and to compare saturated fat, sugar, and sodium content of purchases across levels of processing and convenience. Design: We analyzed purchases of consumer packaged goods for 157,142 households from the 2000–2012 Homescan Panel. We explicitly defined categories for classifying products by degree of industrial processing and separately by convenience of preparation. We classified >1.2 million products through use of barcode-specific descriptions and ingredient lists. Median saturated fat, sugar, and sodium content and the likelihood that purchases exceeded maximum daily intake recommendations for these components were compared across levels of processing or convenience by using quantile and logistic regression. Results: More than three-fourths of energy in purchases by US households came from moderately (15.9%) and highly processed (61.0%) foods and beverages in 2012 (939 kcal/d per capita). Trends between 2000 and 2012 were stable. When classifying foods by convenience, ready-to-eat (68.1%) and ready-to-heat (15.2%) products supplied the majority of energy in purchases. The adjusted proportion of household-level food purchases exceeding 10% kcal from saturated fat, 15% kcal from sugar, and 2400 mg sodium/2000 kcal simultaneously was significantly higher for highly processed (60.4%) and ready-to-eat (27.1%) food purchases than for purchases of less-processed foods (5.6%) or foods requiring cooking/preparation (4.9%). Conclusions: Highly processed food purchases are a dominant, unshifting part of US purchasing patterns, but highly processed foods may have higher saturated fat, sugar, and sodium content than less-processed foods. Wide variation in nutrient content suggests food choices within categories may be important. PMID:25948666
Poti, Jennifer M; Mendez, Michelle A; Ng, Shu Wen; Popkin, Barry M
2015-06-01
"Processed foods" are defined as any foods other than raw agricultural commodities and can be categorized by the extent of changes occurring in foods as a result of processing. Conclusions about the association between the degree of food processing and nutritional quality are discrepant. We aimed to determine 2000-2012 trends in the contribution of processed and convenience food categories to purchases by US households and to compare saturated fat, sugar, and sodium content of purchases across levels of processing and convenience. We analyzed purchases of consumer packaged goods for 157,142 households from the 2000-2012 Homescan Panel. We explicitly defined categories for classifying products by degree of industrial processing and separately by convenience of preparation. We classified >1.2 million products through use of barcode-specific descriptions and ingredient lists. Median saturated fat, sugar, and sodium content and the likelihood that purchases exceeded maximum daily intake recommendations for these components were compared across levels of processing or convenience by using quantile and logistic regression. More than three-fourths of energy in purchases by US households came from moderately (15.9%) and highly processed (61.0%) foods and beverages in 2012 (939 kcal/d per capita). Trends between 2000 and 2012 were stable. When classifying foods by convenience, ready-to-eat (68.1%) and ready-to-heat (15.2%) products supplied the majority of energy in purchases. The adjusted proportion of household-level food purchases exceeding 10% kcal from saturated fat, 15% kcal from sugar, and 2400 mg sodium/2000 kcal simultaneously was significantly higher for highly processed (60.4%) and ready-to-eat (27.1%) food purchases than for purchases of less-processed foods (5.6%) or foods requiring cooking/preparation (4.9%). Highly processed food purchases are a dominant, unshifting part of US purchasing patterns, but highly processed foods may have higher saturated fat, sugar, and sodium content than less-processed foods. Wide variation in nutrient content suggests food choices within categories may be important. © 2015 American Society for Nutrition.
Sea level oscillations over minute timescales: a global perspective
NASA Astrophysics Data System (ADS)
Vilibic, Ivica; Sepic, Jadranka
2016-04-01
Sea level oscillations occurring over minutes to a few hours are an important contributor to sea level extremes, and knowledge of their behaviour is essential for proper quantification of coastal marine hazards. Tsunamis, meteotsunamis, infra-gravity waves and harbour oscillations may even dominate sea level extremes in certain areas and thus pose a great danger to humans and coastal infrastructure. Aside from tsunamis, which, owing to their enormous impact on coastlines, are a well-researched phenomenon, the importance of other high-frequency oscillations to sea level extremes is still underrated, as no systematic long-term measurements have been carried out at minute timescales. Recently, the Intergovernmental Oceanographic Commission (IOC) established the Sea Level Monitoring Facility portal (http://www.ioc-sealevelmonitoring.org), making 1-min sea level data publicly available for several hundred tide gauge sites in the World Ocean. Thereafter, a global assessment of oscillations over tsunami timescales became possible; however, the portal contains raw sea level data only, unchecked for spikes, shifts, drifts and other instrument malfunctions. We present a quality assessment of these data, estimates of sea level variances and contributions of high-frequency processes to the extremes throughout the World Ocean. This is accompanied by an assessment of the atmospheric conditions and processes which generate intense high-frequency oscillations.
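A minimal sketch of the kind of quality control and high-frequency variance estimate described above, assuming 1-min records sampled 60 times per hour; the despiking threshold, cutoff period and function names are illustrative assumptions rather than the authors' procedure.

import numpy as np
from scipy import signal

def despike(levels, threshold=5.0):
    # Flag points deviating from the record median by more than
    # threshold * MAD and replace them with NaN.
    x = np.asarray(levels, dtype=float)
    med = np.median(x)
    mad = np.median(np.abs(x - med)) + 1e-9
    clean = x.copy()
    clean[np.abs(x - med) > threshold * mad] = np.nan
    return clean

def high_frequency_variance(levels, fs_per_hour=60.0, cutoff_hours=2.0):
    # Variance of the component with periods shorter than cutoff_hours,
    # i.e. the tsunami/meteotsunami band of a 1-min tide gauge record.
    x = np.asarray(levels, dtype=float)
    x = np.nan_to_num(x, nan=np.nanmean(x))
    b, a = signal.butter(4, 1.0 / cutoff_hours, btype="high", fs=fs_per_hour)
    return float(np.var(signal.filtfilt(b, a, x)))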
Boets, Bart; Wouters, Jan; van Wieringen, Astrid; Ghesquière, Pol
2007-04-09
This study investigates whether the core bottleneck of literacy impairment should be situated at the phonological level or at a more basic sensory level, as postulated by supporters of the auditory temporal processing theory. Phonological ability, speech perception and low-level auditory processing were assessed in a group of 5-year-old pre-school children at high family risk for dyslexia, compared to a group of well-matched low-risk control children. Based on family risk status and first-grade literacy achievement, children were categorized into groups and pre-school data were retrospectively reanalyzed. On average, children showing both increased family risk and literacy impairment at the end of first grade presented significant pre-school deficits in phonological awareness, rapid automatized naming, speech-in-noise perception and frequency modulation detection. The concurrent presence of these deficits before receiving any formal reading instruction might suggest a causal relation with problematic literacy development. However, a closer inspection of the individual data indicates that the core of the literacy problem is situated at the level of higher-order phonological processing. Although auditory and speech perception problems are relatively over-represented in literacy-impaired subjects and might possibly aggravate the phonological and literacy problem, it is unlikely that they are at the basis of these problems. At a neurobiological level, the results are interpreted as evidence for dysfunctional processing along the auditory-to-articulation stream that is implied in phonological processing, in combination with a relatively intact or inconsistently impaired functioning of the auditory-to-meaning stream that subserves auditory processing and speech perception.
Active vision in satellite scene analysis
NASA Technical Reports Server (NTRS)
Naillon, Martine
1994-01-01
In earth observation or planetary exploration it is necessary to have more and more autonomous systems, able to adapt to unpredictable situations. This imposes the use, in artificial systems, of new concepts in cognition, based on the fact that perception should not be separated from the recognition and decision-making levels. This means that low-level signal processing (perception level) should interact with symbolic and high-level processing (decision level). This paper describes the new concept of active vision, implemented in Distributed Artificial Intelligence by Dassault Aviation following a 'structuralist' principle. An application to spatial image interpretation is given, oriented toward flexible robotics.
NASA Astrophysics Data System (ADS)
Maciel, M. J.; Costa, C. G.; Silva, M. F.; Gonçalves, S. B.; Peixoto, A. C.; Ribeiro, A. Fernando; Wolffenbuttel, R. F.; Correia, J. H.
2016-08-01
This paper reports on the development of a technology for the wafer-level fabrication of an optical Michelson interferometer, which is an essential component in a micro opto-electromechanical system (MOEMS) for a miniaturized optical coherence tomography (OCT) system. The MOEMS consists of a titanium dioxide/silicon dioxide dielectric beam splitter and chromium/gold micro-mirrors. These optical components are deposited on 45° tilted surfaces to allow the horizontal/vertical separation of the incident beam in the final micro-integrated system. The fabrication process consists of 45° saw dicing of a glass substrate and the subsequent deposition of dielectric multilayers and metal layers. The 45° saw dicing is fully characterized in this paper, which also includes an analysis of the roughness. The optimum process results in surfaces with a roughness of 19.76 nm (rms). The saw dicing process for a high-quality final surface is a compromise between the dicing blade's grit size (#1200) and the cutting speed (0.3 mm s-1). The proposed wafer-level fabrication allows rapid and low-cost processing, high compactness and the possibility of wafer-level alignment/assembly with other optical micro components for OCT integrated imaging.
ERIC Educational Resources Information Center
Hayden, Howard C.
1995-01-01
Presents a method to calculate the amount of high-level radioactive waste by taking into consideration the following factors: the fission process that yields the waste, identification of the waste, the energy required to run a 1-GWe plant for one year, and the uranium mass required to produce that energy. Briefly discusses waste disposal and…
Student Motivations as Predictors of High-Level Cognitions in Project-Based Classrooms
ERIC Educational Resources Information Center
Stolk, Jonathan; Harari, Janie
2014-01-01
It is well established that active learning helps students engage in high-level thinking strategies and develop improved cognitive skills. Motivation and self-regulated learning research, however, illustrates that cognitive engagement is an effortful process that is related to students' valuing of the learning tasks, adoption of internalized goal…
Driving Objectives and High-level Requirements for KP-Lab Technologies
ERIC Educational Resources Information Center
Lakkala, Minna; Paavola, Sami; Toikka, Seppo; Bauters, Merja; Markannen, Hannu; de Groot, Reuma; Ben Ami, Zvi; Baurens, Benoit; Jadin, Tanja; Richter, Christoph; Zoserl, Eva; Batatia, Hadj; Paralic, Jan; Babic, Frantisek; Damsa, Crina; Sins, Patrick; Moen, Anne; Norenes, Svein Olav; Bugnon, Alexandra; Karlgren, Klas; Kotzinons, Dimitris
2008-01-01
One of the central goals of the KP-Lab project is to co-design pedagogical methods and technologies for knowledge creation and practice transformation in an integrative and reciprocal manner. In order to facilitate this process user tasks, driving objectives and high-level requirements have been introduced as conceptual tools to mediate between…
Definitely maybe: can unconscious processes perform the same functions as conscious processes?
Hesselmann, Guido; Moors, Pieter
2015-01-01
Hassin recently proposed the “Yes It Can” (YIC) principle to describe the division of labor between conscious and unconscious processes in human cognition. According to this principle, unconscious processes can carry out every fundamental high-level cognitive function that conscious processes can perform. In our commentary, we argue that the author presents an overly idealized review of the literature in support of the YIC principle. Furthermore, we point out that the dissimilar trends observed in social and cognitive psychology, with respect to published evidence of strong unconscious effects, can better be explained by the way awareness is defined and measured in the two research fields. Finally, we show that the experimental paradigm chosen by Hassin to rule out remaining objections against the YIC principle is unsuited to verify the new default notion that all high-level cognitive functions can unfold unconsciously. PMID:25999896
NASA Astrophysics Data System (ADS)
Sengupta, Pranesh; Kaushik, C. P.; Kale, G. B.; Das, D.; Raj, K.; Sharma, B. P.
2009-08-01
Understanding the material behaviour under service conditions is essential to enhance the life span of the alloy 690 process pot used in the vitrification of high-level nuclear waste. During the vitrification process, interaction of alloy 690 with the borosilicate melt takes place for a substantial time period. The present experimental studies show that such interactions may result in Cr carbide precipitation along grain boundaries, Cr depletion in the austenitic matrix and intergranular attack close to the alloy 690/borosilicate melt pool interfaces. The width of the Cr-depleted zone within alloy 690 is found to follow kinetics of the type x = 10.9 × 10^-6 + 1 × 10^-8 t^(1/2) m. Based on the experimental results, it is recommended that compositional modification of the alloy 690 process pot adjacent to the borosilicate melt pool needs to be considered seriously in any effort to reduce and/or prevent process pot failures.
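A direct evaluation of the Cr-depleted-zone kinetics quoted above, x = 10.9e-6 + 1e-8 * sqrt(t) metres; the time unit of t is not stated in the abstract, so the sample times below are illustrative only.

import numpy as np

def cr_depleted_width(t):
    # Width of the Cr-depleted zone (m) after interaction time t,
    # per the parabolic-type kinetics reported above.
    return 10.9e-6 + 1e-8 * np.sqrt(t)

for t in (1e4, 1e5, 1e6):
    print(f"t = {t:.0e}: width = {cr_depleted_width(t) * 1e6:.2f} micrometres")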
Behavior of radioactive iodine and technetium in the spray calcination of high-level waste
NASA Astrophysics Data System (ADS)
Knox, C. A.; Farnsworth, R. K.
1981-08-01
The Remote Laboratory-Scale Waste Treatment Facility (RLSWTF) was designed and built as a part of the High-Level Waste Immobilization Program (now the High-Level Waste Process Development Program) at the Pacific Northwest Laboratory. The facility, installed in a radiochemical cell, is described; in it, small volumes of radioactive liquid wastes can be solidified, the process off-gas can be analyzed, and methods for decontaminating this off-gas can be tested. During the spray calcination of commercial high-level liquid waste spiked with Tc-99 and I-131, a 31 wt% loss of I-131 past the sintered-metal filters was observed. These filters and the venturi scrubber were very efficient in removing particulates and Tc-99 from the off-gas stream. Liquid scrubbers were not efficient in removing I-131, as 25% of the total lost went to the building off-gas system; therefore, solid adsorbents are needed to remove iodine. For all future operations where iodine is present, a silver zeolite adsorber is to be used.
The hows and whys of face memory: level of construal influences the recognition of human faces
Wyer, Natalie A.; Hollins, Timothy J.; Pahl, Sabine; Roper, Jean
2015-01-01
Three experiments investigated the influence of level of construal (i.e., the interpretation of actions in terms of their meaning or their details) on different stages of face memory. We employed a standard multiple-face recognition paradigm, with half of the faces inverted at test. Construal level was manipulated prior to recognition (Experiment 1), during study (Experiment 2) or both (Experiment 3). The results support a general advantage for high-level construal over low-level construal at both study and test, and suggest that matching processing style between study and recognition confers no advantage. These experiments provide additional evidence in support of a link between semantic processing (i.e., construal) and visual (i.e., face) processing. We conclude with a discussion of implications for current theories relating to both construal and face processing. PMID:26500586
The development of a general purpose ARM-based processing unit for the ATLAS TileCal sROD
NASA Astrophysics Data System (ADS)
Cox, M. A.; Reed, R.; Mellado, B.
2015-01-01
After Phase-II upgrades in 2022, the data output from the LHC ATLAS Tile Calorimeter will increase significantly. ARM processors are common in mobile devices due to their low cost, low energy consumption and high performance. It is proposed that a cost-effective, high data throughput Processing Unit (PU) can be developed by using several consumer ARM processors in a cluster configuration to allow aggregated processing performance and data throughput while maintaining minimal software design difficulty for the end-user. This PU could be used for a variety of high-level functions on the high-throughput raw data such as spectral analysis and histograms to detect possible issues in the detector at a low level. High-throughput I/O interfaces are not typical in consumer ARM System on Chips but high data throughput capabilities are feasible via the novel use of PCI-Express as the I/O interface to the ARM processors. An overview of the PU is given and the results for performance and throughput testing of four different ARM Cortex System on Chips are presented.
Plasma technologies application for building materials surface modification
NASA Astrophysics Data System (ADS)
Volokitin, G. G.; Skripnikova, N. K.; Volokitin, O. G.; Shehovtzov, V. V.; Luchkin, A. G.; Kashapov, N. F.
2016-01-01
Low-temperature arc plasma was used to process building surface materials, such as silicate brick, sand lime brick, concrete and wood. It was shown that modification of building surface materials with low-temperature plasma positively affects frost resistance, water permeability and chemical resistance, with high adhesion strength. Short-time plasma processing is more economical than traditional thermal processing methods. Plasma processing makes the wood surface uniquely waterproof and gives it high operational properties and dimensional and geometrical stability. It also increases compression resistance and decreases the level of internal stresses in the material.
Embedded Implementation of VHR Satellite Image Segmentation
Li, Chao; Balla-Arabé, Souleymane; Ginhac, Dominique; Yang, Fan
2016-01-01
Processing and analysis of Very High Resolution (VHR) satellite images provide a mass of crucial information, which can be used for urban planning, security issues or environmental monitoring. However, they are computationally expensive and, thus, time consuming, while some of the applications, such as natural disaster monitoring and prevention, require high efficiency performance. Fortunately, parallel computing techniques and embedded systems have made great progress in recent years, and a series of massively parallel image processing devices, such as digital signal processors or Field Programmable Gate Arrays (FPGAs), have been made available to engineers at a very convenient price and demonstrate significant advantages in terms of running cost, embeddability, power consumption, flexibility, etc. In this work, we designed a texture region segmentation method for very high resolution satellite images by using the level set algorithm and the multi-kernel theory in a high-abstraction C environment and realized its register-transfer-level implementation with the help of a newly proposed high-level-synthesis-based design flow. The evaluation experiments demonstrate that the proposed design can produce high quality image segmentation with a significant running-cost advantage. PMID:27240370
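A minimal level-set-style segmentation sketch using scikit-image's morphological Chan-Vese implementation; this is a generic stand-in for the multi-kernel level-set method described above, not the authors' algorithm, and the input file name is hypothetical.

from skimage import io, img_as_float
from skimage.segmentation import morphological_chan_vese

# Load one grayscale VHR tile (hypothetical file).
image = img_as_float(io.imread("vhr_tile.tif", as_gray=True))

# Evolve the level set for a fixed number of iterations from a
# checkerboard initialisation; the result is a binary region mask.
mask = morphological_chan_vese(image, 100, init_level_set="checkerboard",
                               smoothing=2)
print("segmented fraction:", float(mask.mean()))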
Memory Scanning, Introversion-Extraversion, and Levels of Processing.
ERIC Educational Resources Information Center
Eysenck, Michael W.; Eysenck, M. Christine
1979-01-01
Investigated was the hypothesis that high arousal increases processing of physical characteristics and reduces processing of semantic characteristics. While introverts and extroverts had equivalent scanning rates for physical features, introverts were significantly slower in searching for semantic features of category membership, indicating…
Hohlfeld, Annette; Martín-Loeches, Manuel; Sommer, Werner
2015-01-01
The present study contributes to the discussion on the automaticity of semantic processing. Whereas most previous research investigated semantic processing at the word level, the present study addressed semantic processing during sentence reading. A dual task paradigm was combined with the recording of event-related brain potentials. Previous research on word-level processing reported different patterns of interference with the N400 by additional tasks: attenuation of amplitude or delay of latency. In the present study, we presented Spanish sentences that were semantically correct or contained a semantic violation in a critical word. At different intervals preceding the critical word a tone was presented that required a high-priority choice response. At short intervals/high temporal overlap between the tasks, the mean amplitude of the N400 was reduced relative to long intervals/low temporal overlap, but there were no shifts of peak latency. We propose that processing at the sentence level exerts a protective effect against the additional task. This is in accord with the attentional sensitization model (Kiefer & Martens, 2010), which suggests that semantic processing is an automatic process that can be enhanced by the currently activated task set. The present experimental sentences also induced a P600, which is taken as an index of integrative processing. Additional task effects are comparable to those in the N400 time window and are briefly discussed. PMID:26203312
Simulating neutron star mergers as r-process sources in ultrafaint dwarf galaxies
NASA Astrophysics Data System (ADS)
Safarzadeh, Mohammadtaher; Scannapieco, Evan
2017-10-01
To explain the high observed abundances of r-process elements in local ultrafaint dwarf (UFD) galaxies, we perform cosmological zoom simulations that include r-process production from neutron star mergers (NSMs). We model star formation stochastically and simulate two different haloes with total masses ≈108 M⊙ at z = 6. We find that the final distribution of [Eu/H] versus [Fe/H] is relatively insensitive to the energy by which the r-process material is ejected into the interstellar medium, but strongly sensitive to the environment in which the NSM event occurs. In one halo, the NSM event takes place at the centre of the stellar distribution, leading to high levels of r-process enrichment such as seen in a local UFD, Reticulum II (Ret II). In a second halo, the NSM event takes place outside of the densest part of the galaxy, leading to a more extended r-process distribution. The subsequent star formation occurs in an interstellar medium with shallow levels of r-process enrichment that results in stars with low levels of [Eu/H] compared to Ret II stars even when the maximum possible r-process mass is assumed to be ejected. This suggests that the natal kicks of neutron stars may also play an important role in determining the r-process abundances in UFD galaxies, a topic that warrants further theoretical investigation.
Modeling Neutron stars as r-process sources in Ultra Faint Dwarf galaxies
NASA Astrophysics Data System (ADS)
Safarzadeh, Mohammadtaher; Scannapieco, Evan
2018-06-01
To explain the high observed abundances of r-process elements in local ultrafaint dwarf (UFD) galaxies, we perform cosmological zoom simulations that include r-process production from neutron star mergers (NSMs). We model star formation stochastically and simulate two different haloes with total masses ≈108 M⊙ at z = 6. We find that the final distribution of [Eu/H] versus [Fe/H] is relatively insensitive to the energy by which the r-process material is ejected into the interstellar medium, but strongly sensitive to the environment in which the NSM event occurs. In one halo, the NSM event takes place at the centre of the stellar distribution, leading to high levels of r-process enrichment such as seen in a local UFD, Reticulum II (Ret II). In a second halo, the NSM event takes place outside of the densest part of the galaxy, leading to a more extended r-process distribution. The subsequent star formation occurs in an interstellar medium with shallow levels of r-process enrichment that results in stars with low levels of [Eu/H] compared to Ret II stars even when the maximum possible r-process mass is assumed to be ejected. This suggests that the natal kicks of neutron stars may also play an important role in determining the r-process abundances in UFD galaxies, a topic that warrants further theoretical investigation.
Flank wear analysing of high speed end milling for hardened steel D2 using Taguchi Method
NASA Astrophysics Data System (ADS)
Hazza Faizi Al-Hazza, Muataz; Ibrahim, Nur Asmawiyah bt; Adesta, Erry T. Y.; Khan, Ahsan Ali; Abdullah Sidek, Atiah Bt.
2017-03-01
One of the main challenges for any manufacturer is how to decrease the machining cost without affecting the final quality of the product. One of the new advanced machining processes in industry is the high speed hard end milling process, which merges three advanced machining processes: high speed milling, hard milling and dry milling. However, one of the most important challenges in this process is to control the flank wear rate. Therefore, the flank wear rate during machining should be analyzed in order to determine the best cutting levels that will not affect the final quality of the product. In this research, the Taguchi method has been used to investigate the effects of cutting speed, feed rate and depth of cut, and to determine the best levels for minimizing the flank wear rate up to a total length of 0.3 mm, based on the ISO standard, so as to maintain the finishing requirements.
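A sketch of the smaller-the-better Taguchi signal-to-noise calculation used to rank cutting parameters against flank wear; the orthogonal-array runs, factor levels and wear values below are made-up placeholders, not the study's data.

import numpy as np

# Each run: (speed level, feed level, depth-of-cut level, measured flank wear in mm).
runs = [
    (1, 1, 1, [0.12, 0.14]),
    (1, 2, 2, [0.18, 0.17]),
    (2, 1, 2, [0.22, 0.20]),
    (2, 2, 1, [0.15, 0.16]),
]

def sn_smaller_is_better(y):
    # Taguchi smaller-the-better S/N ratio in dB.
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

# Average S/N per level of each factor; the level with the highest S/N
# is the preferred setting for minimising flank wear.
for factor, name in enumerate(["speed", "feed", "depth"]):
    for level in (1, 2):
        sn = [sn_smaller_is_better(y) for *levels, y in runs if levels[factor] == level]
        print(f"{name} level {level}: mean S/N = {np.mean(sn):.2f} dB")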
Difference among Levels of Inquiry: Process Skills Improvement at Senior High School in Indonesia
ERIC Educational Resources Information Center
Hardianti, Tuti; Kuswanto, Heru
2017-01-01
The objective of the research concerned here was to discover the difference in effectiveness among Levels 2, 3, and 4 of inquiry learning in improving students' process skills. The research was a quasi-experimental study using the pretest-posttest non-equivalent control group research design. Three sample groups were selected by means of cluster…
ERIC Educational Resources Information Center
Crowe, Jacquelyn
This study investigated computer and word processing operator skills necessary for employment in today's high technology office. The study was comprised of seven major phases: (1) identification of existing community college computer operator programs in the state of Washington; (2) attendance at an information management seminar; (3) production…
ERIC Educational Resources Information Center
Liu, Yanni; Cherkassky, Vladimir L.; Minshew, Nancy J.; Just, Marcel Adam
2011-01-01
Previous behavioral studies have shown that individuals with autism are less hindered by interference from global processing during the performance of lower-level perceptual tasks, such as finding embedded figures. The primary goal of this study was to examine the brain manifestation of such atypicality in high-functioning autism using fMRI.…
Establishment of a high accuracy geoid correction model and geodata edge match
NASA Astrophysics Data System (ADS)
Xi, Ruifeng
This research has developed a theoretical and practical methodology for efficiently and accurately determining sub-decimeter-level regional geoids and centimeter-level local geoids to meet regional surveying and local engineering requirements. This research also provides a highly accurate static DGPS network data pre-processing, post-processing and adjustment method, and a procedure for a large GPS network like the state-level HRAN project. The research also developed an efficient and accurate methodology to join soil coverages in GIS ARC/INFO. A total of 181 GPS stations was pre-processed and post-processed to obtain an absolute accuracy better than 1.5 cm at 95% of the stations, with all stations having a 0.5 ppm average relative accuracy. A total of 167 GPS stations in and around Iowa have been included in the adjustment. After evaluating GEOID96 and GEOID99, a more accurate and suitable geoid model was established for Iowa. This new Iowa regional geoid model improved the accuracy from the sub-decimeter 10-20 centimeter level to 5-10 centimeter. The local kinematic geoid model, developed using Kalman filtering, gives results better than the third-order leveling accuracy requirement, with a 1.5 cm standard deviation.
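A minimal scalar Kalman filter of the kind that could be used to smooth kinematic geoid-height observations, as mentioned above; the random-walk state model, noise variances and the simulated observation profile are illustrative assumptions, not the dissertation's filter design.

import numpy as np

def kalman_1d(observations, process_var=1e-6, obs_var=2.25e-4):
    # Random-walk state model: N_k = N_{k-1} + w_k, observed as z_k = N_k + v_k.
    estimate, variance = float(observations[0]), 1.0
    smoothed = []
    for z in observations:
        # Predict: the state carries over, uncertainty grows by the process noise.
        variance += process_var
        # Update: blend prediction and observation according to their precisions.
        gain = variance / (variance + obs_var)
        estimate += gain * (z - estimate)
        variance *= (1.0 - gain)
        smoothed.append(estimate)
    return np.array(smoothed)

# Example: noisy geoid undulations (metres) along a GPS/levelling profile.
noisy = 28.45 + 0.015 * np.random.default_rng(1).normal(size=200)
print(kalman_1d(noisy)[-5:])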
Raffety, B D; Smith, R E; Ptacek, J T
1997-04-01
Participants completed anxiety and coping diaries during 10 periods that began 7 days before an academic stressor and continued through the evening after the stressor. Profile analysis was used to examine the anxiety and coping processes in relation to 2 trait anxiety grouping variables: debilitating and facilitating test anxiety (D-TA and F-TA). Anxiety and coping changed over time, and high and low levels of D-TA and F-TA were associated with different daily patterns of anxiety and coping. Participants with a debilitative, as opposed to facilitative, trait anxiety style had lower examination scores, higher anxiety, and less problem-solving coping. Covarying F-TA, high D-TA was associated with a pattern of higher levels of tension, worry, distraction, and avoidant coping, as well as lower levels of proactive coping. Covarying D-TA, high F-TA was associated with higher levels of tension (but not worry or distraction), support seeking, proactive and problem-solving coping.
Russell, Brian; Yang, Yanhong; Handlogten, Michael; Hudak, Suzanne; Cao, Mingyan; Wang, Jihong; Robbins, David; Ahuja, Sanjeev; Zhu, Min
2017-01-01
Antibody disulfide bond reduction during monoclonal antibody (mAb) production is a phenomenon that has been attributed to the reducing enzymes from CHO cells acting on the mAb during the harvest process. However, the impact of antibody reduction on the downstream purification process has not been studied. During the production of an IgG2 mAb, antibody reduction was observed in the harvested cell culture fluid (HCCF), resulting in high fragment levels. In addition, aggregate levels increased during the low pH treatment step in the purification process. A correlation between the level of free thiol in the HCCF (as a result of antibody reduction) and aggregation during the low pH step was established, wherein higher levels of free thiol in the starting sample resulted in increased levels of aggregates during low pH treatment. The elevated levels of free thiol were not reduced over the course of purification, resulting in carry‐over of high free thiol content into the formulated drug substance. When the drug substance with high free thiols was monitored for product degradation at room temperature and 2–8°C, faster rates of aggregation were observed compared to the drug substance generated from HCCF that was purified immediately after harvest. Further, when antibody reduction mitigations (e.g., chilling, aeration, and addition of cystine) were applied, HCCF could be held for an extended period of time while providing the same product quality/stability as material that had been purified immediately after harvest. Biotechnol. Bioeng. 2017;114: 1264–1274. © 2017 The Authors. Biotechnology and Bioengineering Published by Wiley Periodicals Inc. PMID:28186329
10 CFR 72.158 - Control of special processes.
Code of Federal Regulations, 2013 CFR
2013-01-01
... NUCLEAR FUEL, HIGH-LEVEL RADIOACTIVE WASTE, AND REACTOR-RELATED GREATER THAN CLASS C WASTE Quality..., and applicant for a CoC shall establish measures to ensure that special processes, including welding...
10 CFR 72.158 - Control of special processes.
Code of Federal Regulations, 2011 CFR
2011-01-01
... NUCLEAR FUEL, HIGH-LEVEL RADIOACTIVE WASTE, AND REACTOR-RELATED GREATER THAN CLASS C WASTE Quality..., and applicant for a CoC shall establish measures to ensure that special processes, including welding...
10 CFR 72.158 - Control of special processes.
Code of Federal Regulations, 2012 CFR
2012-01-01
... NUCLEAR FUEL, HIGH-LEVEL RADIOACTIVE WASTE, AND REACTOR-RELATED GREATER THAN CLASS C WASTE Quality..., and applicant for a CoC shall establish measures to ensure that special processes, including welding...
10 CFR 72.158 - Control of special processes.
Code of Federal Regulations, 2014 CFR
2014-01-01
... NUCLEAR FUEL, HIGH-LEVEL RADIOACTIVE WASTE, AND REACTOR-RELATED GREATER THAN CLASS C WASTE Quality..., and applicant for a CoC shall establish measures to ensure that special processes, including welding...
[Surveillance cultures after high-level disinfection of flexible endoscopes in a general hospital].
Robles, Christian; Turín, Christie; Villar, Alicia; Huerta-Mercado, Jorge; Samalvides, Frine
2014-04-01
Flexible endoscopes are instruments with a complex structure which are used in invasive gastroenterological procedures; therefore, high-level disinfection (HLD) is recommended as an appropriate reprocessing method. However, most hospitals do not perform quality control to assess the compliance and results of the disinfection process. To evaluate the effectiveness of the flexible endoscopes' decontamination after high-level disinfection by surveillance cultures and to assess compliance with the reprocessing guidelines. Descriptive study conducted in January 2013 in the Gastroenterological Unit of a tertiary hospital. 30 endoscopic procedures were randomly selected. Compliance with guidelines was evaluated and surveillance cultures for common bacteria were performed after the disinfection process. On the observational assessment, compliance with the guidelines was as follows: pre-cleaning 9 (30%), cleaning 5 (16.7%), rinse 3 (10%), first drying 30 (100%), disinfection 30 (100%), final rinse 0 (0%) and final drying 30 (100%), demonstrating that only 3 of the 7 stages of the disinfection process were optimally performed. In the microbiological evaluation, 2 (6.7%) of the 30 procedures had a positive culture obtained from the surface of the endoscope. Furthermore, 1 (4.2%) of the 24 biopsy forceps gave a positive culture. The organisms isolated were different Pseudomonas species. High-level disinfection procedures were not optimally performed, with positive cultures of Pseudomonas species found in 6.7% of procedures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mee, V.V.D.; Meelker, H.; Schelde, R.V.D.
1999-01-01
In this investigation, an attempt is made to further the understanding of factors influencing the hydrogen content in duplex stainless steel gas tungsten arc (GTA) and gas metal arc (GMA) welds as well as to what extent it affects hydrogen-induced cracking susceptibility. The results indicated that susceptibility to hydrogen cracking using the GTA or GMA process appears to be limited. In practice, maintaining a moisture level below 10 ppm in the shielding gas is of less importance than the choice of welding parameters. Even a moisture level of 1000 ppm in the shielding gas, in combination with the correct weldingmore » parameters, will result in a sufficient low hydrogen content in the weld. Similarly, a moisture level in the shielding gas below 10 ppm does not necessarily result in low hydrogen content in the weld metal. Although very high ferrite levels were combined with high restrain and high hydrogen content, none of the GMA and GTA welds cracked. Susceptibility to hydrogen cracking is concluded to be limited.« less
Torsional ultrasonic wave based level measurement system
Holcomb, David E [Oak Ridge, TN; Kisner, Roger A [Knoxville, TN
2012-07-10
A level measurement system suitable for use in a high temperature and pressure environment to measure the level of coolant fluid within the environment, the system including a volume of coolant fluid located in a coolant region of the high temperature and pressure environment and having a level therein; an ultrasonic waveguide blade that is positioned within the desired coolant region of the high temperature and pressure environment; a magnetostrictive electrical assembly located within the high temperature and pressure environment and configured to operate in the environment and cooperate with the waveguide blade to launch and receive ultrasonic waves; and an external signal processing system located outside of the high temperature and pressure environment and configured for communicating with the electrical assembly located within the high temperature and pressure environment.
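An illustrative calculation (not the patented method) of how a fluid level could be inferred from the transit time of a torsional pulse along a waveguide blade: the wave travels more slowly over the immersed length, so the total delay encodes the level. The blade length and wave speeds below are hypothetical numbers.

def level_from_transit_time(t_measured, blade_length, v_dry, v_wet):
    # Solve t = (L - h)/v_dry + h/v_wet for the immersed length h,
    # then clamp the result to the physical range [0, L].
    h = (t_measured - blade_length / v_dry) / (1.0 / v_wet - 1.0 / v_dry)
    return min(max(h, 0.0), blade_length)

# One-way transit of 0.72 ms over a 2 m blade with assumed wave speeds (m/s).
print(level_from_transit_time(0.72e-3, 2.0, 3100.0, 2500.0))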
Adaptive Management Approach to Oil and Gas Activities in Areas Occupied by Pacific Walrus
NASA Astrophysics Data System (ADS)
Ireland, D.; Broker, K.; San Filippo, V.; Brzuzy, L.; Morse, L.
2016-12-01
During Shell's 2015 exploration drilling program in the Chukchi Sea, activities were conducted in accordance with a Letter of Authorization issued by the United States Fish and Wildlife Service that allowed the incidental harassment of Pacific Walrus and Polar Bears under the Marine Mammal Protection Act. As a part of the request for authorization, Shell proposed a process to monitor and assess the potential for activities to interact with walruses on ice, especially if ice posed a potential threat to the drill site. The process assimilated near real-time information from multiple data sources including vessel-based observations, aerial surveys, satellite-linked GPS tags on walrus, and satellite imagery of ice conditions and movements. These data were reviewed daily and assessed in the context of planned activities to assign a risk level (low, medium, or high). The risk level was communicated to all assets in the field and to decision makers during morning briefings. A low risk level meant that planned activities could occur without further review. A medium risk level meant that some operations had a greater potential of interacting with walrus on ice and that additional discussion of those activities was required to determine the relative risk of potential impacts compared to the importance of the planned activity. A high risk level meant that the planned activities were necessary and walrus on ice were likely to be encountered. Assignment of a high risk level triggered contact with agency personnel and directly incorporated them into the assessment and decision-making process. This process made effective use of relevant available information to provide meaningful assessments at temporal and spatial scales that allowed approved activities to proceed while minimizing potential impacts. Moreover, this process provides a valuable alternative to large-scale restriction areas with coarse temporal resolution, without reducing protection to target species.
Adaptive Management Approach to Oil and Gas Activities in Areas Occupied by Pacific Walrus
NASA Astrophysics Data System (ADS)
Ireland, D.; Broker, K.; San Filippo, V.; Brzuzy, L.; Morse, L.
2016-02-01
During Shell's 2015 exploration drilling program in the Chukchi Sea, activities were conducted in accordance with a Letter of Authorization issued by the United States Fish and Wildlife Service that allowed the incidental harassment of Pacific Walrus and Polar Bears under the Marine Mammal Protection Act. As a part of the request for authorization, Shell proposed a process to monitor and assess the potential for activities to interact with walruses on ice, especially if ice posed a potential threat to the drill site. The process assimilated near real-time information from multiple data sources including vessel-based observations, aerial surveys, satellite-linked GPS tags on walrus, and satellite imagery of ice conditions and movements. These data were reviewed daily and assessed in the context of planned activities to assign a risk level (low, medium, or high). The risk level was communicated to all assets in the field and to decision makers during morning briefings. A low risk level meant that planned activities could occur without further review. A medium risk level meant that some operations had a greater potential of interacting with walrus on ice and that additional discussion of those activities was required to determine the relative risk of potential impacts compared to the importance of the planned activity. A high risk level meant that the planned activities were necessary and walrus on ice were likely to be encountered. Assignment of a high risk level triggered contact with agency personnel and directly incorporated them into the assessment and decision-making process. This process made effective use of relevant available information to provide meaningful assessments at temporal and spatial scales that allowed approved activities to proceed while minimizing potential impacts. Moreover, this process provides a valuable alternative to large-scale restriction areas with coarse temporal resolution, without reducing protection to target species.
NASA Astrophysics Data System (ADS)
Knysh, Yu A.; Xanthopoulou, G. G.
2018-01-01
The object of the study is a catalytic combustion chamber that provides a highly efficient combustion process through the combined use of several effects: heat recovery from combustion, microvortex heat transfer, catalytic reaction and acoustic resonance. High efficiency is provided by a complex of related technologies: technologies for transferring heat from the combustion products to the initial mixture (recuperation), catalytic process technology, technology for calculating effective combustion processes based on microvortex matrices, technology for designing metamaterial structures, and technology for obtaining the required product topology by laser fusion of metal powder compositions. The mesoscale-level structure supports the combustion process through a microvortex effect with a high intensity of heat and mass transfer. A high surface area (an extremely high area-to-volume ratio), created by the nanoscale periodic structure, ensures the efficiency of the catalytic reactions. The produced metamaterial is the first multiscale product of this new concept; by combining periodic topologies at different scale levels, it provides a qualitatively new set of product properties. This research aims to solve two pressing global problems simultaneously: ensuring the environmental safety of transport systems and the power industry, and the economical and rational use of energy resources, providing humanity with energy now and in the foreseeable future.
Process connectivity in a naturally prograding river delta
NASA Astrophysics Data System (ADS)
Sendrowski, Alicia; Passalacqua, Paola
2017-03-01
River deltas are lowland systems that can display high hydrological connectivity. This connectivity can be structural (morphological connections), functional (control of fluxes), and process connectivity (information flow from system drivers to sinks). In this work, we quantify hydrological process connectivity in Wax Lake Delta, coastal Louisiana, by analyzing couplings among external drivers (discharge, tides, and wind) and water levels recorded at five islands and one channel over summer 2014. We quantify process connections with information theory, a branch of mathematics concerned with the communication of information. We represent process connections as a network; variables serve as network nodes and couplings as network links describing the strength, direction, and time scale of information flow. Comparing process connections at long (105 days) and short (10 days) time scales, we show that tides exhibit daily synchronization with water level, with decreasing strength from downstream to upstream, and that tides transfer information as tides transition from spring to neap. Discharge synchronizes with water level and the time scale of its information transfer compares well to physical travel times through the system, computed with a hydrodynamic model. Information transfer and physical transport show similar spatial patterns, although information transfer time scales are larger than physical travel times. Wind events associated with water level setup lead to increased process connectivity with highly variable information transfer time scales. We discuss the information theory results in the context of the hydrologic behavior of the delta, the role of vegetation as a connector/disconnector on islands, and the applicability of process networks as tools for delta modeling results.
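A sketch of an information-theoretic coupling estimate between an external driver (e.g., discharge) and island water level, in the spirit of the process network described above; the binning choice and lag range are assumptions, and plain lagged mutual information is used here rather than the full transfer-entropy methodology.

import numpy as np

def mutual_information(x, y, bins=8):
    # Mutual information (bits) between two series, from a 2-D histogram.
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

def strongest_coupling_lag(driver, response, max_lag=48):
    # Return the lag (in samples) at which the information shared between the
    # driver and the later water level response is largest, plus its value.
    scores = [mutual_information(driver[: -lag or None], response[lag:])
              for lag in range(max_lag + 1)]
    return int(np.argmax(scores)), float(max(scores))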
NASA Astrophysics Data System (ADS)
Purwoko, Saad, Noor Shah; Tajudin, Nor'ain Mohd
2017-05-01
This study aims to: i) develop problem-solving questions on the Linear Equations System of Two Variables (LESTV) based on the levels of the IPT Model; ii) explain the level of students' skill in information processing when solving LESTV problems; iii) explain students' skill in information processing when solving LESTV problems; and iv) explain students' cognitive processes when solving LESTV problems. This study involves three phases: i) development of LESTV problem questions based on the Tessmer Model; ii) a quantitative survey method for analyzing students' skill level in information processing; and iii) a qualitative case study method for analyzing students' cognitive processes. The population of the study was 545 eighth-grade students, represented by a sample of 170 students from five Junior High Schools in the Hilir Barat Zone, Palembang (Indonesia), chosen using cluster sampling. Fifteen students among them were drawn as a sample for the interview sessions, which continued until the information obtained was saturated. The data were collected using the LESTV problem-solving test and the interview protocol. The quantitative data were analyzed using descriptive statistics, while the qualitative data were analyzed using content analysis. The findings of this study indicated that students' cognitive processes were only at the step of identifying external sources and fluently carrying out algorithms in short-term memory. Only 15.29% of students could retrieve type A information and 5.88% of students could retrieve type B information from long-term memory. The implication was that the developed LESTV problems had validated the IPT Model in modelling students' assessment at different levels of the hierarchy.
Lead iron phosphate glass as a containment medium for disposal of high-level nuclear waste
Boatner, Lynn A.; Sales, Brian C.
1989-01-01
Lead-iron phosphate glasses containing a high level of Fe2O3 for use as a storage medium for high-level radioactive nuclear waste. By combining lead-iron phosphate glass with various types of simulated high-level nuclear waste, a highly corrosion resistant, homogeneous, easily processed glass can be formed. For corroding solutions at 90 °C, with solution pH values in the range between 5 and 9, the corrosion rate of the lead-iron phosphate nuclear waste glass is at least 10^2 to 10^3 times lower than the corrosion rate of a comparable borosilicate nuclear waste glass. The presence of Fe2O3 in forming the lead-iron phosphate glass is critical. Lead-iron phosphate nuclear waste glasses can be prepared at temperatures as low as 800 °C, since they exhibit very low melt viscosities in the 800 to 1050 °C temperature range. These waste-loaded glasses do not readily devitrify at temperatures as high as 550 °C and are not adversely affected by large doses of gamma radiation in H2O at 135 °C. The lead-iron phosphate waste glasses can be prepared with minimal modification of the technology developed for processing borosilicate glass nuclear wasteforms.
Effects of Sequences of Cognitions on Group Performance Over Time
Molenaar, Inge; Chiu, Ming Ming
2017-01-01
Extending past research showing that sequences of low cognitions (low-level processing of information) and high cognitions (high-level processing of information through questions and elaborations) influence the likelihoods of subsequent high and low cognitions, this study examines whether sequences of cognitions are related to group performance over time; 54 primary school students (18 triads) discussed and wrote an essay about living in another country (32,375 turns of talk). Content analysis and statistical discourse analysis showed that within each lesson, groups with more low cognitions or more sequences of low cognition followed by high cognition added more essay words. Groups with more high cognitions, sequences of low cognition followed by low cognition, or sequences of high cognition followed by an action followed by low cognition, showed different words and sequences, suggestive of new ideas. The links between cognition sequences and group performance over time can inform facilitation and assessment of student discussions. PMID:28490854
Effects of Sequences of Cognitions on Group Performance Over Time.
Molenaar, Inge; Chiu, Ming Ming
2017-04-01
Extending past research showing that sequences of low cognitions (low-level processing of information) and high cognitions (high-level processing of information through questions and elaborations) influence the likelihoods of subsequent high and low cognitions, this study examines whether sequences of cognitions are related to group performance over time; 54 primary school students (18 triads) discussed and wrote an essay about living in another country (32,375 turns of talk). Content analysis and statistical discourse analysis showed that within each lesson, groups with more low cognitions or more sequences of low cognition followed by high cognition added more essay words. Groups with more high cognitions, sequences of low cognition followed by low cognition, or sequences of high cognition followed by an action followed by low cognition, showed different words and sequences, suggestive of new ideas. The links between cognition sequences and group performance over time can inform facilitation and assessment of student discussions.
The Role of Independent V&V in Upstream Software Development Processes
NASA Technical Reports Server (NTRS)
Easterbrook, Steve
1996-01-01
This paper describes the role of Verification and Validation (V&V) during the requirements and high level design processes, and in particular the role of Independent V&V (IV&V). The job of IV&V during these phases is to ensure that the requirements are complete, consistent and valid, and to ensure that the high level design meets the requirements. This contrasts with the role of Quality Assurance (QA), which ensures that appropriate standards and process models are defined and applied. This paper describes the current state of practice for IV&V, concentrating on the process model used in NASA projects. We describe a case study, showing the processes by which problem reporting and tracking takes place, and how IV&V feeds into decision making by the development team. We then describe the problems faced in implementing IV&V. We conclude that despite a well defined process model, and tools to support it, IV&V is still beset by communication and coordination problems.
High-autonomy control of space resource processing plants
NASA Technical Reports Server (NTRS)
Schooley, Larry C.; Zeigler, Bernard P.; Cellier, Francois E.; Wang, Fei-Yue
1993-01-01
A highly autonomous intelligent command/control architecture has been developed for planetary surface base industrial process plants and Space Station Freedom experimental facilities. The architecture makes use of a high-level task-oriented mode with supervisory control from one or several remote sites, and integrates advanced network communications concepts and state-of-the-art man/machine interfaces with the most advanced autonomous intelligent control. Attention is given to the full-dynamics model of a Martian oxygen-production plant, event-based/fuzzy-logic process control, and fault management practices.
The reliability of continuous brain responses during naturalistic listening to music.
Burunat, Iballa; Toiviainen, Petri; Alluri, Vinoo; Bogert, Brigitte; Ristaniemi, Tapani; Sams, Mikko; Brattico, Elvira
2016-01-01
Low-level (timbral) and high-level (tonal and rhythmical) musical features during continuous listening to music, studied by functional magnetic resonance imaging (fMRI), have been shown to elicit large-scale responses in cognitive, motor, and limbic brain networks. Using a similar methodological approach and a similar group of participants, we aimed to study the replicability of previous findings. Participants' fMRI responses during continuous listening of a tango Nuevo piece were correlated voxelwise against the time series of a set of perceptually validated musical features computationally extracted from the music. The replicability of previous results and the present study was assessed by two approaches: (a) correlating the respective activation maps, and (b) computing the overlap of active voxels between datasets at variable levels of ranked significance. Activity elicited by timbral features was better replicable than activity elicited by tonal and rhythmical ones. These results indicate more reliable processing mechanisms for low-level musical features as compared to more high-level features. The processing of such high-level features is probably more sensitive to the state and traits of the listeners, as well as of their background in music. Copyright © 2015 Elsevier Inc. All rights reserved.
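A sketch of the two replicability checks described above: (a) voxelwise correlation between two activation maps and (b) overlap of the top-ranked voxels at a chosen significance fraction; inputs are flattened statistic maps over a common brain mask, and the data here are simulated rather than the study's fMRI results.

import numpy as np

def map_correlation(map_a, map_b):
    # Voxelwise Pearson correlation between two flattened activation maps.
    return float(np.corrcoef(map_a, map_b)[0, 1])

def top_voxel_overlap(map_a, map_b, fraction=0.05):
    # Dice-style overlap of the top `fraction` of voxels in each map.
    k = max(1, int(fraction * map_a.size))
    top_a = set(np.argsort(map_a)[-k:])
    top_b = set(np.argsort(map_b)[-k:])
    return 2 * len(top_a & top_b) / (len(top_a) + len(top_b))

rng = np.random.default_rng(0)
shared = rng.normal(size=50_000)            # common signal across datasets
study1 = shared + rng.normal(scale=0.8, size=50_000)
study2 = shared + rng.normal(scale=0.8, size=50_000)
print(map_correlation(study1, study2), top_voxel_overlap(study1, study2))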
Profile of mathematics anxiety of 7th graders
NASA Astrophysics Data System (ADS)
Udil, Patrisius Afrisno; Kusmayadi, Tri Atmojo; Riyadi
2017-08-01
Mathematics anxiety is one of the important factors affecting students' mathematics achievement. The present research investigates the profile of students' mathematics anxiety. This research focuses on the analysis and description of students' mathematics anxiety level in general and its dominant domain and aspect. Qualitative research with a case study strategy was used. The subjects were 15 students of the 7th grade, chosen with purposive sampling. The data were the students' mathematics anxiety scale results, interview records, and observations made during both mathematics learning activities and tests. The students were asked to complete the mathematics anxiety scale before being interviewed and observed. The results show that, in general, students' mathematics anxiety was at a moderate level. In addition, students' mathematics anxiety during mathematics tests was at a high level, but it was at a moderate level during the mathematics learning process. Regarding the anxiety domains, students showed high mathematics anxiety in the cognitive domain, while it was at a moderate level for the psychological and physiological domains; during the mathematics learning process, it was at a low level for the psychological domain. Therefore, it can be concluded that students have serious and high anxiety regarding mathematics in the cognitive domain and in the mathematics test aspect.
Prototype architecture for a VLSI level zero processing system. [Space Station Freedom
NASA Technical Reports Server (NTRS)
Shi, Jianfei; Grebowsky, Gerald J.; Horner, Ward P.; Chesney, James R.
1989-01-01
The prototype architecture and implementation of a high-speed level zero processing (LZP) system are discussed. Due to the new processing algorithm and VLSI technology, the prototype LZP system features compact size, low cost, high processing throughput, easy maintainability and increased reliability. Although extensive control functions are implemented in hardware, the programmability of processing tasks makes it possible to adapt the system to different data formats and processing requirements. The LZP system can handle up to 8 virtual channels and 24 sources with a combined data volume of 15 Gbytes per orbit. For greater demands, multiple LZP systems can be configured in parallel, each called a processing channel and assigned a subset of the virtual channels. The telemetry data stream is steered into the different processing channels in accordance with the virtual channel IDs. This super system can cope with a virtually unlimited number of virtual channels and sources. In the near future, it is expected that new disk farms with data rates exceeding 150 Mbps will be available from commercial vendors due to advances in disk drive technology.
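A toy sketch of steering telemetry frames into parallel processing channels by virtual channel ID, as described above; the frame representation and the channel assignment map are invented for illustration.

from collections import defaultdict

# Hypothetical assignment of 8 virtual channels to 2 processing channels.
VC_TO_PROCESSING_CHANNEL = {vc: vc % 2 for vc in range(8)}

def route_frames(frames):
    # frames: iterable of (virtual_channel_id, payload_bytes) tuples.
    queues = defaultdict(list)
    for vc_id, payload in frames:
        queues[VC_TO_PROCESSING_CHANNEL[vc_id]].append(payload)
    return queues

frames = [(0, b"\x01\x02"), (3, b"\x03"), (4, b"\x04\x05")]
print({channel: len(queue) for channel, queue in route_frames(frames).items()})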
Nickel Base Superalloy Turbine Disk
NASA Technical Reports Server (NTRS)
Gabb, Timothy P. (Inventor); Gauda, John (Inventor); Telesman, Ignacy (Inventor); Kantzos, Pete T. (Inventor)
2005-01-01
A low solvus, high refractory alloy having unusually versatile processing mechanical property capabilities for advanced disks and rotors in gas turbine engines. The nickel base superalloy has a composition consisting essentially of, in weight percent, 3.0-4.0 N, 0.02-0.04 B, 0.02-0.05 C, 12.0-14.0 Cr, 19.0-22.0 Co, 2.0-3.5 Mo, greater than 1.0 to 2.1 Nb, 1.3 to 2.1 Ta, 3.0-4.0 Ti, 4.1 to 5.0 W, 0.03-0.06 Zr, and balance essentially Ni and incidental impurities. The superalloy combines ease of processing with high temperature capabilities to be suitable for use in various turbine engine disk, impeller, and shaft applications. The Co and Cr levels of the superalloy can provide low solvus temperature for high processing versatility. The W, Mo, Ta, and Nb refractory element levels of the superalloy can provide sustained strength, creep, and dwell crack growth resistance at high temperatures.
NASA Astrophysics Data System (ADS)
Chen, Xiaolong; Honda, Hiroshi; Kuroda, Seiji; Araki, Hiroshi; Murakami, Hideyuki; Watanabe, Makoto; Sakka, Yoshio
2016-12-01
Effects of the ceramic powder size used for suspension as well as several processing parameters in suspension plasma spraying of YSZ were investigated experimentally, aiming to fabricate highly segmented microstructures for thermal barrier coating (TBC) applications. Particle image velocimetry (PIV) was used to observe the atomization process and the velocity distribution of atomized droplets and ceramic particles travelling toward the substrates. The tested parameters included the secondary plasma gas (He versus H2), suspension injection flow rate, and substrate surface roughness. Results indicated that a plasma jet with a relatively higher content of He or H2 as the secondary plasma gas was critical to produce highly segmented YSZ TBCs with a crack density up to 12 cracks/mm. The optimized suspension flow rate played an important role to realize coatings with a reduced porosity level and improved adhesion. An increased powder size and higher operation power level were beneficial for the formation of highly segmented coatings onto substrates with a wider range of surface roughness.
The FORCE - A highly portable parallel programming language
NASA Technical Reports Server (NTRS)
Jordan, Harry F.; Benten, Muhammad S.; Alaghband, Gita; Jakob, Ruediger
1989-01-01
This paper explains why the FORCE parallel programming language is easily portable among six different shared-memory multiprocessors, and how a two-level macro preprocessor makes it possible to hide low-level machine dependencies and to build machine-independent high-level constructs on top of them. These FORCE constructs make it possible to write portable parallel programs largely independent of the number of processes and the specific shared-memory multiprocessor executing them.
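The two-level idea, machine-independent high-level constructs built on a thin machine-dependent layer, can be sketched outside of Fortran. The following Python sketch is only an analogy to the FORCE's macro approach: a prescheduled parallel loop and a barrier built on top of a low-level threading layer; the names and scheduling policy are assumptions for illustration.

```python
# Sketch of building machine-independent parallel constructs on top of a thin
# low-level layer. Illustrative analogy only: the FORCE itself is implemented
# as Fortran macros expanded by a two-level preprocessor, not as Python.
import threading

class Force:
    def __init__(self, nprocs):
        self.nprocs = nprocs                       # number of cooperating processes
        self._barrier = threading.Barrier(nprocs)  # machine-dependent primitive

    def barrier(self):
        """High-level construct: wait until all processes arrive."""
        self._barrier.wait()

    def forall(self, work, n):
        """Prescheduled parallel loop: process p takes indices p, p+P, p+2P, ..."""
        def body(p):
            for i in range(p, n, self.nprocs):
                work(i)
            self.barrier()

        threads = [threading.Thread(target=body, args=(p,)) for p in range(self.nprocs)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()

if __name__ == "__main__":
    out = [0] * 10
    Force(4).forall(lambda i: out.__setitem__(i, i * i), len(out))
    print(out)  # squares computed by 4 cooperating threads
```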
The FORCE: A highly portable parallel programming language
NASA Technical Reports Server (NTRS)
Jordan, Harry F.; Benten, Muhammad S.; Alaghband, Gita; Jakob, Ruediger
1989-01-01
Here, it is explained why the FORCE parallel programming language is easily portable among six different shared-memory multiprocessors, and how a two-level macro preprocessor makes it possible to hide low-level machine dependencies and to build machine-independent high-level constructs on top of them. These FORCE constructs make it possible to write portable parallel programs largely independent of the number of processes and the specific shared-memory multiprocessor executing them.
Energy Transfer to Upper Trophic Levels on a Small Offshore Bank
2009-09-30
Observations from Platts Bank and other feeding hotspots in the Gulf of Maine show that high levels of feeding activity are ephemeral—sometimes...feeding hotspots in the Gulf of Maine show that high levels of feeding activity are ephemeral—sometimes very active, often not. Differences can exist...defining patterns of biodiversity in the oceans and the processes that shape them. The Gulf of Maine program has the additional aim of describing how
Olsen, Shira A; Beck, J Gayle
2012-01-01
This study investigated the effects of high and low levels of dissociation on information processing for analogue trauma and neutral stimuli. Fifty-four undergraduate females who reported high and low levels of trait dissociation were presented with two films, one depicting traumatic events, the other containing neutral material. Participants completed a divided attention task (yielding a proxy measure of attention), as well as explicit memory (free-recall) and implicit memory (word-stem completion) tasks for both films. Results indicated that the high DES group showed less attention and had poorer recall for the analogue trauma stimuli, relative to the neutral stimuli and the low DES group. These findings suggest that high levels of trait dissociation are associated with reductions in attention and memory for analogue trauma stimuli, relative to neutral stimuli and relative to low trait dissociation. Implications for the role of cognitive factors in the etiology of negative post-trauma responses are discussed. Copyright © 2011 Elsevier Ltd. All rights reserved.
USDA-ARS?s Scientific Manuscript database
In these studies liquid hot water (LHW) pretreated and enzymatically hydrolyzed Sweet Sorghum Bagasse (SSB) hydrolyzates were fermented in a fed-batch reactor. As reported in the preceding paper, the culture was not able to ferment the hydrolyzate I in a batch process due to presence of high level o...
ERIC Educational Resources Information Center
Rahman, Abdul; Ahmar, Ansari Saleh
2016-01-01
Several studies suggest that most students are not at the same level of development (Slavin, 2008). From the concrete operation level to the formal operation level, students experience lateness in the transition phase. Consequently, students find it difficult to solve mathematics problems. The research method is qualitative and descriptive-explorative…
How gender-expectancy affects the processing of "them".
Doherty, Alice; Conklin, Kathy
2017-04-01
How sensitive is pronoun processing to expectancies based on real-world knowledge and language usage? The current study links research on the integration of gender stereotypes and number-mismatch to explore this question. It focuses on the use of them to refer to antecedents of different levels of gender-expectancy (low-cyclist, high-mechanic, known-spokeswoman). In a rating task, them is considered increasingly unnatural with greater gender-expectancy. However, participants might not be able to differentiate high-expectancy and gender-known antecedents online because they initially search for plural antecedents (e.g., Sanford & Filik), and they make all-or-nothing gender inferences. An eye-tracking study reveals early differences in the processing of them with antecedents of high gender-expectancy compared with gender-known antecedents. This suggests that participants have rapid access to the expected gender of the antecedent and the level of that expectancy.
Solar silicon via improved and expanded metallurgical silicon technology
NASA Technical Reports Server (NTRS)
Hunt, L. P.; Dosaj, V. D.; Mccormick, J. R.
1977-01-01
A completed preliminary survey of silica sources indicates that sufficient quantities of high-purity quartz are available in the U.S. and Canada to meet goals. Supply can easily meet demand for this little-sought commodity. Charcoal, as a reductant for silica, can be purified to a sufficient level by high-temperature fluorocarbon treatment and vacuum processing. High-temperature treatment causes partial graphitization, which can lead to difficulty in smelting. Smelting of Arkansas quartz and purified charcoal produced kilogram quantities of silicon having impurity levels generally much lower than in MG-Si. The goal of increasing the boron resistivity from 0.03 ohm-cm in metallurgical silicon to 0.3 ohm-cm in solar silicon was half met. A cost analysis of the solidification process indicates $3.50-7.25/kg Si for the Czochralski-type process and $1.50-4.25/kg Si for the Bridgman-type technique.
Hasegawa, Naoya; Kitamura, Hideaki; Murakami, Hiroatsu; Kameyama, Shigeki; Sasagawa, Mutsuo; Egawa, Jun; Tamura, Ryu; Endo, Taro; Someya, Toshiyuki
2013-01-01
Individuals with autistic spectrum disorder (ASD) demonstrate an impaired ability to infer the mental states of others from their gaze. Thus, investigating the relationship between ASD and eye gaze processing is crucial for understanding the neural basis of social impairments seen in individuals with ASD. In addition, characteristics of ASD are observed in more comprehensive visual perception tasks. These visual characteristics of ASD have been well-explained in terms of the atypical relationship between high- and low-level gaze processing in ASD. We studied neural activity during gaze processing in individuals with ASD using magnetoencephalography, with a focus on the relationship between high- and low-level gaze processing both temporally and spatially. Minimum Current Estimate analysis was applied to perform source analysis of magnetic responses to gaze stimuli. The source analysis showed that later activity in the primary visual area (V1) was affected by gaze direction only in the ASD group. Conversely, the right posterior superior temporal sulcus, which is a brain region that processes gaze as a social signal, in the typically developed group showed a tendency toward greater activation during direct compared with averted gaze processing. These results suggest that later activity in V1 relating to gaze processing is altered or possibly enhanced in high-functioning individuals with ASD, which may underpin the social cognitive impairments in these individuals. © 2013 S. Karger AG, Basel.
A Social Approach to High-Level Context Generation for Supporting Context-Aware M-Learning
ERIC Educational Resources Information Center
Pan, Xu-Wei; Ding, Ling; Zhu, Xi-Yong; Yang, Zhao-Xiang
2017-01-01
In m-learning environments, context-awareness is for wide use where learners' situations are varied, dynamic and unpredictable. We are facing the challenge of requirements of both generality and depth in generating and processing high-level context. In this paper, we present a social approach which exploits social dynamics and social computing for…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bickford, D.F.; Congdon, J.W.; Oblath, S.B.
1987-01-01
At the U.S. Department of Energy's Savannah River Plant, corrosion of carbon steel storage tanks containing alkaline, high-level radioactive waste is controlled by specification of limits on waste composition and temperature. Processes for the preparation of waste for final disposal will result in waste with low corrosion inhibitor concentrations and, in some cases, high aromatic organic concentrations, neither of which are characteristic of previous operations. Laboratory tests, conducted to determine minimum corrosion inhibitor levels, indicated pitting of carbon steel near the waterline for proposed storage conditions. In situ electrochemical measurements of full-scale radioactive process demonstrations have been conducted to assess the validity of laboratory tests. Probes included pH, Eh (potential relative to a standard hydrogen electrode), tank potential, and alloy coupons. In situ results are compared to those of the laboratory tests, with particular regard given to simulated solution composition.
Near-field environment/processes working group summary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murphy, W.M.
1995-09-01
This article is a summary of the proceedings of a group discussion which took place at the Workshop on the Role of Natural Analogs in Geologic Disposal of High-Level Nuclear Waste in San Antonio, Texas on July 22-25, 1991. The working group concentrated on the subject of the near-field environment to geologic repositories for high-level nuclear waste. The near-field environment may be affected by thermal perturbations from the waste, and by disturbances caused by the introduction of exotic materials during construction of the repository. This group also discussed the application of modelling of performance-related processes.
Design of signal reception and processing system of embedded ultrasonic endoscope
NASA Astrophysics Data System (ADS)
Li, Ming; Yu, Feng; Zhang, Ruiqiang; Li, Yan; Chen, Xiaodong; Yu, Daoyin
2009-11-01
Embedded Ultrasonic Endoscope, based on an embedded microprocessor and an embedded real-time operating system, sends a micro ultrasonic probe into the coelom through the biopsy channel of the Electronic Endoscope to obtain the fault histology features of digestive organs by rotary scanning, and acquires pictures of the alimentary canal mucosal surface. At the same time, ultrasonic signals are processed by the signal reception and processing system, forming images of the full histology of the digestive organs. The Signal Reception and Processing System is an important component of the Embedded Ultrasonic Endoscope. However, the traditional design, which uses multi-level amplifiers and special digital processing circuits to implement signal reception and processing, no longer satisfies the high-performance, miniaturization, and low-power requirements of embedded systems, and the high noise introduced by multi-level amplifiers makes the extraction of small signals difficult. Therefore, this paper presents a method of signal reception and processing based on a double variable-gain amplifier and an FPGA, increasing the flexibility and dynamic range of the Signal Reception and Processing System, improving the system noise level, and reducing power consumption. Finally, we set up the embedded experiment system, using a transducer with a center frequency of 8 MHz to scan membrane samples, and display the image of the ultrasonic echo reflected by each layer of the membrane, with a frame rate of 5 Hz, verifying the correctness of the system.
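One role of the variable-gain stage is to compensate the depth-dependent attenuation of the ultrasonic echo. The sketch below illustrates that idea numerically (time-gain compensation); the sampling rate, attenuation coefficient, and gain law are assumptions and do not reflect the paper's circuit design. Only the 8 MHz centre frequency comes from the abstract.

```python
# Sketch of depth-dependent (time-gain) compensation for ultrasonic echoes,
# the role played in hardware by the variable-gain amplifier stage. The
# sampling rate, attenuation coefficient, and gain law are assumptions.
import numpy as np

FS = 40e6               # sampling rate, Hz (assumed)
F0 = 8e6                # transducer centre frequency, Hz
C = 1540.0              # assumed speed of sound, m/s
ALPHA_DB_CM_MHZ = 0.5   # assumed attenuation, dB/(cm*MHz)

def tgc_gain(n_samples):
    """Gain curve that undoes round-trip attenuation versus echo depth."""
    t = np.arange(n_samples) / FS
    depth_cm = 100.0 * C * t / 2.0                            # one-way depth in cm
    loss_db = 2.0 * ALPHA_DB_CM_MHZ * depth_cm * (F0 / 1e6)   # round-trip loss
    return 10.0 ** (loss_db / 20.0)

def compensate(echo):
    """Apply the depth-dependent gain to a recorded echo line."""
    return echo * tgc_gain(len(echo))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    echo = rng.normal(scale=1e-3, size=4000) * np.exp(-np.arange(4000) / 1500.0)
    print(compensate(echo)[:5])
```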
de la Fuente, Jesús; Sander, Paul; Martínez-Vicente, José M; Vera, Mariano; Garzón, Angélica; Fadda, Salvattore
2017-01-01
The Theory of Self- vs. Externally-Regulated Learning™ (SRL vs. ERL) proposed different types of relationships among levels of variables in Personal Self-Regulation (PSR) and Regulatory Teaching (RT) to predict the meta-cognitive, meta-motivational and -emotional variables of learning, and of Academic Achievement in Higher Education. The aim of this investigation was to empirically validate the model of the combined effect of low-medium-high levels in PSR and RT on the dependent variables. For the analysis of combinations, a selected sample of 544 undergraduate students from two Spanish universities was used. Data collection was obtained from validated instruments, in Spanish versions. Using an ex-post-facto design, different Univariate and Multivariate Analyses (3 × 1, 3 × 3, and 4 × 1) were conducted. Results provide evidence for a consistent effect of low-medium-high levels of PSR and of RT, thus giving significant partial confirmation of the proposed rational model. As predicted, (1) the levels of PSR and RT positively and significantly affected the levels of learning approaches, resilience, engagement, academic confidence, test anxiety, and procedural and attitudinal academic achievement; (2) the most favorable type of interaction was a high level of PSR with a high level of RT. The limitations and implications of these results in the design of effective teaching are analyzed, to improve university teaching-learning processes.
de la Fuente, Jesús; Sander, Paul; Martínez-Vicente, José M.; Vera, Mariano; Garzón, Angélica; Fadda, Salvattore
2017-01-01
The Theory of Self- vs. Externally-Regulated Learning™ (SRL vs. ERL) proposed different types of relationships among levels of variables in Personal Self-Regulation (PSR) and Regulatory Teaching (RT) to predict the meta-cognitive, meta-motivational and -emotional variables of learning, and of Academic Achievement in Higher Education. The aim of this investigation was to empirically validate the model of the combined effect of low-medium-high levels in PSR and RT on the dependent variables. For the analysis of combinations, a selected sample of 544 undergraduate students from two Spanish universities was used. Data collection was obtained from validated instruments, in Spanish versions. Using an ex-post-facto design, different Univariate and Multivariate Analyses (3 × 1, 3 × 3, and 4 × 1) were conducted. Results provide evidence for a consistent effect of low-medium-high levels of PSR and of RT, thus giving significant partial confirmation of the proposed rational model. As predicted, (1) the levels of PSR and RT positively and significantly affected the levels of learning approaches, resilience, engagement, academic confidence, test anxiety, and procedural and attitudinal academic achievement; (2) the most favorable type of interaction was a high level of PSR with a high level of RT. The limitations and implications of these results in the design of effective teaching are analyzed, to improve university teaching-learning processes. PMID:28280473
Material Processing Opportunities Utilizing a Free Electron Laser
NASA Astrophysics Data System (ADS)
Todd, Alan
1996-11-01
Many properties of photocathode-driven Free Electron Lasers (FEL) are extremely attractive for material processing applications. These include: 1) broad-band tunability across the IR and UV spectra, which permits wavelength optimization, depth deposition control and utilization of resonance phenomena; 2) picosecond pulse structure with continuous nanosecond spacing for optimum deposition efficiency and minimal collateral damage; 3) high peak and average radiated power for economic processing in quantity; and 4) high brightness for spatially defined energy deposition and intense energy density in small spots. We discuss five areas where IR or UV material processing will find application if the economics are favorable: polymer, metal and electronic material processing, micromachining, and defense applications. Specific examples in the IR and UV, such as surface texturing of polymers for improved look and feel, and anti-microbial food packaging films, which have been demonstrated using UV excimer lamps and lasers, will be given. Unfortunately, although the process utility is readily proven, the power levels and costs of lamps and lasers do not scale to production margins. However, from these examples, application-specific cost targets ranging from 0.1¢/kJ to 10¢/kJ of delivered radiation at power levels from 10 kW to 500 kW have been developed and are used to define strawman FEL processing systems. Since FEL radiation energy extraction from the generating electron beam is typically a few percent, at these high average power levels economic considerations dictate the use of a superconducting RF accelerator with energy recovery to minimize cavity and beam dump power loss. Such a 1 kW IR FEL, funded by the US Navy, is presently under construction at the Thomas Jefferson National Accelerator Facility. This dual-use device, scheduled to generate first light in late 1997, will test both the viability of high-power FELs for shipboard self-defense against cruise missiles and, for the first time, provide an industrial testbed capable of processing various materials in market evaluation quantities.
Exogenous cortisol facilitates responses to social threat under high provocation.
Bertsch, Katja; Böhnke, Robina; Kruk, Menno R; Richter, Steffen; Naumann, Ewald
2011-04-01
Stress is one of the most important promoters of aggression. Human and animal studies have found associations between basal and acute levels of the stress hormone cortisol and (abnormal) aggression. Irrespective of the direction of these changes--i.e., increased or decreased aggressive behavior--the results of these studies suggest dramatic alterations in the processing of threat-related social information. Therefore, the effects of cortisol and provocation on social information processing were addressed by the present study. After a placebo-controlled pharmacological manipulation of acute cortisol levels, we exposed healthy individuals to high or low levels of provocation in a competitive aggression paradigm. Influences of cortisol and provocation on emotional face processing were then investigated with reaction times and event-related potentials (ERPs) in an emotional Stroop task. In line with previous results, enhanced early and later positive, posterior ERP components indicated a provocation-induced enhanced relevance for all kinds of social information. Cortisol, however, reduced an early frontocentral bias for angry faces and--despite the provocation-enhancing relevance--led to faster reactions for all facial expressions in highly provoked participants. The results thus support the moderating role of social information processing in the 'vicious circle of stress and aggression'. Copyright © 2010 Elsevier Inc. All rights reserved.
Food processing by high hydrostatic pressure.
Yamamoto, Kazutaka
2017-04-01
High hydrostatic pressure (HHP) processing, as a nonthermal process, can be used to inactivate microbes while minimizing chemical reactions in food. In this regard, HHP levels of 100 MPa (986.9 atm / 1019.7 kgf/cm²) and above are applied to food. Conventional thermal processing damages food components relating to color, flavor, and nutrition via enhanced chemical reactions. However, HHP processing minimizes this damage and inactivates microbes, enabling the production of high quality, safe foods. The first commercial HHP-processed foods were launched in 1990 as fruit products such as jams, and some other products have since been commercialized: retort rice products (enhanced water impregnation), cooked hams and sausages (shelf life extension), soy sauce with minimized salt (short-time fermentation owing to enhanced enzymatic reactions), and beverages (shelf life extension). The characteristics of HHP food processing are reviewed from the viewpoints of nonthermal processing, history, research and development, physical and biochemical changes, and processing equipment.
Implementation of a VLSI Level Zero Processing system utilizing the functional component approach
NASA Technical Reports Server (NTRS)
Shi, Jianfei; Horner, Ward P.; Grebowsky, Gerald J.; Chesney, James R.
1991-01-01
A high rate Level Zero Processing system is currently being prototyped at NASA/Goddard Space Flight Center (GSFC). Based on state-of-the-art VLSI technology and the functional component approach, the new system promises capabilities of handling multiple Virtual Channels and Applications with a combined data rate of up to 20 Megabits per second (Mbps) at low cost.
NASA Astrophysics Data System (ADS)
Snapir, Zohar; Eberbach, Catherine; Ben-Zvi-Assaraf, Orit; Hmelo-Silver, Cindy; Tripto, Jaklin
2017-10-01
Science education today has become increasingly focused on research into complex natural, social and technological systems. In this study, we examined the development of high-school biology students' systems understanding of the human body, in a three-year longitudinal study. The development of the students' system understanding was evaluated using the Components Mechanisms Phenomena (CMP) framework for conceptual representation. We coded and analysed the repertory grid personal constructs of 67 high-school biology students at 4 points throughout the study. Our data analysis builds on the assumption that systems understanding entails a perception of all the system categories, including structures within the system (its Components), specific processes and interactions at the macro and micro levels (Mechanisms), and the Phenomena that present the macro scale of processes and patterns within a system. Our findings suggest that as the learning process progressed, the systems understanding of our students became more advanced, moving forward within each of the major CMP categories. Moreover, there was an increase in the mechanism complexity presented by the students, manifested by more students describing mechanisms at the molecular level. Thus, the 'mechanism' category and the micro level are critical components that enable students to understand system-level phenomena such as homeostasis.
Nasri Nasrabadi, Mohammad Reza; Razavi, Seyed Hadi
2010-04-01
In this work, we applied statistical experimental design to a fed-batch process for optimization of tricarboxylic acid (TCA) cycle intermediates in order to achieve high-level production of canthaxanthin from Dietzia natronolimnaea HS-1 cultured in beet molasses. A fractional factorial design (screening test) was first conducted on five TCA cycle intermediates. Out of the five TCA cycle intermediates investigated via screening tests, alpha-ketoglutarate, oxaloacetate and succinate were selected based on their statistically significant (P<0.05) and positive effects on canthaxanthin production. These significant factors were optimized by means of response surface methodology (RSM) in order to achieve high-level production of canthaxanthin. The experimental results of the RSM were fitted with a second-order polynomial equation by means of a multiple regression technique to identify the relationship between canthaxanthin production and the three TCA cycle intermediates. By means of this statistical design under a fed-batch process, the optimum conditions required to achieve the highest level of canthaxanthin (13,172 ± 25 µg l⁻¹) were determined as follows: alpha-ketoglutarate, 9.69 mM; oxaloacetate, 8.68 mM; succinate, 8.51 mM. Copyright 2009 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
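The second-order polynomial fitted in the RSM step can be illustrated with a short least-squares sketch. The design-matrix expansion (intercept, linear, interaction, and quadratic terms) is the standard RSM form; the sample data below are placeholders, not the study's canthaxanthin measurements.

```python
# Sketch of fitting a second-order (quadratic with interactions) response
# surface by ordinary least squares, as used in RSM. Sample data are
# placeholders, not measurements from the study.
import numpy as np

def quadratic_design(X):
    """Expand factors x1..xk into [1, xi, xi*xj (i<j), xi^2] columns."""
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]
    cols += [X[:, i] ** 2 for i in range(k)]
    return np.column_stack(cols)

def fit_rsm(X, y):
    """Return least-squares coefficients of the second-order model."""
    A = quadratic_design(X)
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.uniform(2.0, 12.0, size=(20, 3))   # three intermediates, mM (placeholder)
    y = 9000 + 300 * X[:, 0] - 20 * X[:, 0] ** 2 + 150 * X[:, 1] + rng.normal(0, 50, 20)
    print(fit_rsm(X, y).round(2))
```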
Technical and economical evaluation of carbon dioxide capture and conversion to methanol process
NASA Astrophysics Data System (ADS)
Putra, Aditya Anugerah; Juwari, Handogo, Renanto
2017-05-01
The phenomenon of global warming, indicated by an increase in the earth's surface temperature, is caused by high levels of greenhouse gases in the atmosphere. Carbon dioxide, which increases year by year because of the high demand for energy, is the largest contributor to greenhouse gases. One of the most widely applied solutions to mitigate carbon dioxide levels is post-combustion carbon capture technology. Although this technology can absorb up to 90% of the carbon dioxide produced, there are concerns that carbon dioxide stored underground may be released over time. Utilizing captured carbon dioxide could therefore be a promising solution: it can be converted into more valuable materials, such as methanol. This research evaluates the conversion process of captured carbon dioxide to methanol, both technically and economically. It is found that methanol can technically be produced from captured carbon dioxide; the product stream is 25.6905 kg/s with 99.69% methanol purity. The economic evaluation of the whole conversion process shows that the process is economically feasible. The capture and conversion process has a total annual cost of 176,101,157.69, which can be offset by revenue gained from methanol product sales.
Keenan, Derek F; Brunton, Nigel; Gormley, Ronan; Butler, Francis
2011-01-26
The aim of the present study was the evaluation of the effect of high hydrostatic pressure (HHP) processing on the levels of polyphenolic compounds and selected quality attributes of fruit smoothies compared to fresh and mild conventional pasteurization processing. Fruit smoothie samples were thermally (P(70) > 10 min) or HHP processed (450 MPa/1, 3, or 5 min/20 °C) (HHP1, HHP3, and HHP5, respectively). The polyphenolic content, color difference (ΔE), sensory acceptability, and rheological (G'; G''; G*) properties of the smoothies were assessed over a storage period of 30 days at 4 °C. Processing had a significant effect (p < 0.001) on the levels of polyphenolic compounds in smoothies. However, this effect was not consistent for all compound types. HHP processed samples (HHP1 and HHP3) had higher (p < 0.001) levels of phenolic compounds, for example, procyanidin B1 and hesperidin, than HHP5 samples. Levels of flavanones and hydroxycinnamic acid compounds decreased (p < 0.001) after 30 days of storage at 2-4 °C. Decreases were particularly notable between days 10 and 20 (hesperidin) and days 20 and 30 (chlorogenic acid) (p < 0.001). There was a wide variation in ΔE values recorded over the 30 day storage period (p < 0.001), with fresh and thermally processed smoothies exhibiting lower color change than their HHP counterparts (p < 0.001). No effect was observed for the type of process on complex modulus (G*) data, but all smoothies became less rigid during the storage period (p < 0.001). Despite minor product deterioration during storage (p < 0.001), sensory acceptability scores showed no preference for either fresh or processed (thermal/HHP) smoothies, which were deemed acceptable (>3) by panelists.
Towards Portable Large-Scale Image Processing with High-Performance Computing.
Huo, Yuankai; Blaber, Justin; Damon, Stephen M; Boyd, Brian D; Bao, Shunxing; Parvathaneni, Prasanna; Noguera, Camilo Bermudez; Chaganti, Shikha; Nath, Vishwesh; Greer, Jasmine M; Lyu, Ilwoo; French, William R; Newton, Allen T; Rogers, Baxter P; Landman, Bennett A
2018-05-03
High-throughput, large-scale medical image computing demands tight integration of high-performance computing (HPC) infrastructure for data storage, job distribution, and image processing. The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has constructed a large-scale image storage and processing infrastructure that is composed of (1) a large-scale image database using the eXtensible Neuroimaging Archive Toolkit (XNAT), (2) a content-aware job scheduling platform using the Distributed Automation for XNAT pipeline automation tool (DAX), and (3) a wide variety of encapsulated image processing pipelines called "spiders." The VUIIS CCI medical image data storage and processing infrastructure has housed and processed nearly half a million medical image volumes with the Vanderbilt Advanced Computing Center for Research and Education (ACCRE), which is the HPC facility at Vanderbilt University. The initial deployment was natively deployed (i.e., direct installations on a bare-metal server) within the ACCRE hardware and software environments, which led to issues of portability and sustainability. First, it could be laborious to deploy the entire VUIIS CCI medical image data storage and processing infrastructure to another HPC center with varying hardware infrastructure, library availability, and software permission policies. Second, the spiders were not developed in an isolated manner, which has led to software dependency issues during system upgrades or remote software installation. To address such issues, herein, we describe recent innovations using containerization techniques with XNAT/DAX, which are used to isolate the VUIIS CCI medical image data storage and processing infrastructure from the underlying hardware and software environments. The newly presented XNAT/DAX solution has the following new features: (1) multi-level portability from the system level to the application level, (2) flexible and dynamic software development and expansion, and (3) scalable spider deployment compatible with HPC clusters and local workstations.
Process Development for Hydrothermal Liquefaction of Algae Feedstocks in a Continuous-Flow Reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elliott, Douglas C.; Hart, Todd R.; Schmidt, Andrew J.
Wet algae slurries can be converted into an upgradeable biocrude by hydrothermal liquefaction (HTL). High levels of carbon conversion to gravity-separable biocrude product were accomplished at relatively low temperature (350 °C) in a continuous-flow, pressurized (sub-critical liquid water) environment (20 MPa). As opposed to earlier work in batch reactors reported by others, direct oil recovery was achieved without the use of a solvent and biomass trace components were removed by processing steps so that they did not cause process difficulties. High conversions were obtained even with high slurry concentrations of up to 35 wt% of dry solids. Catalytic hydrotreating was effectively applied for hydrodeoxygenation, hydrodenitrogenation, and hydrodesulfurization of the biocrude to form liquid hydrocarbon fuel. Catalytic hydrothermal gasification was effectively applied for HTL byproduct water cleanup and fuel gas production from water soluble organics, allowing the water to be considered for recycle of nutrients to the algae growth ponds. As a result, high conversion of algae to liquid hydrocarbon and gas products was found with low levels of organic contamination in the byproduct water. All three process steps were accomplished in bench-scale, continuous-flow reactor systems such that design data for process scale-up was generated.
NASA Astrophysics Data System (ADS)
McKelvey, David; Menary, Gary; Martin, Peter; Yan, Shiyong
2017-10-01
The thermoforming process involves a previously extruded sheet of material being reheated to a softened state below the melting temperature and then forced into a mould either by a plug, air pressure or a combination of both. Thermoplastics such as polystyrene (PS) and polypropylene (PP) are commonly processed via thermoforming for products in the packaging industry. However, high density polyethylene (HDPE) is generally not processed via thermoforming and yet HDPE is extensively processed throughout the packaging industry. The aim of this study was to investigate the potential of thermoforming HDPE. The objectives were to firstly investigate the mechanical response under comparable loading conditions and secondly, to investigate the final mechanical properties post-forming. Obtaining in-process stress-strain behavior during thermoforming is extremely challenging if not impossible. To overcome this limitation the processing conditions were replicated offline using the QUB biaxial stretcher. Typical processing conditions that the material will experience during the process are high strain levels, high strain rates between 0.1-10 s⁻¹ and high temperatures in the solid phase (1). Dynamic Mechanical Analysis (DMA) was used to investigate the processing range of the HDPE grade used in this study; a peak in the tan delta curve was observed just below the peak melting temperature and hence, a forming temperature was selected in this range. HDPE was biaxially stretched at 128 °C at a strain rate of 4 s⁻¹, under equal biaxial deformation (EB). The results showed that a level of biaxial orientation was induced, which was accompanied by an increase in the modulus from 606 MPa in the non-stretched sample to 1212 MPa in the stretched sample.
NASA Astrophysics Data System (ADS)
Galdos, L.; Saenz de Argandoña, E.; Mendiguren, J.; Silvestre, E.
2017-09-01
Roll levelling is a flattening process used to remove the residual stresses and imperfections of metal strips by means of plastic deformations. During the process, the metal sheet is subjected to cyclic tension-compression deformations leading to a flat product. The process is especially important to avoid final geometrical errors when coils are cold formed or when thick plates are cut by laser. In recent years, due to the appearance of high strength materials such as Ultra High Strength Steels, machine design engineers are demanding reliable tools for the dimensioning of the levelling facilities. As in other metal forming fields, finite element analysis seems to be the most widely used solution to understand the occurring phenomena and to calculate the processing loads. In this paper, the roll levelling process of the third generation Fortiform 1050 steel is numerically analysed. The process has been studied using the MSC MARC software and two different material laws. A pure isotropic hardening law has been used and set as the baseline study. In the second part, tension-compression tests have been carried out to analyse the cyclic behaviour of the steel. With the obtained data, a new material model using a combined isotropic-kinematic hardening formulation has been fitted. Finally, the influence of the material model on the numerical results has been analysed by comparing the pure isotropic model and the combined isotropic-kinematic hardening model.
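The difference between a pure isotropic law and a combined isotropic-kinematic law shows up under the cyclic tension-compression loading typical of roll levelling. The one-dimensional return-mapping sketch below uses linear isotropic and kinematic hardening with placeholder constants; it is not the fitted Fortiform 1050 model or the MSC MARC formulation.

```python
# 1D strain-driven return mapping with combined linear isotropic-kinematic
# hardening, the kind of cyclic model contrasted with a pure isotropic law.
# Material constants are placeholders, not fitted values.
import numpy as np

E, SY0 = 210e3, 700.0       # Young's modulus and initial yield stress [MPa]
H_ISO, H_KIN = 1.0e3, 20e3  # isotropic / kinematic hardening moduli [MPa]

def cyclic_response(strain_path):
    eps_p = alpha = back = 0.0
    stresses = []
    for eps in strain_path:
        trial = E * (eps - eps_p)               # elastic predictor
        xi = trial - back
        f = abs(xi) - (SY0 + H_ISO * alpha)     # yield function
        if f > 0.0:                             # plastic correction
            dgamma = f / (E + H_ISO + H_KIN)
            sign = np.sign(xi)
            eps_p += dgamma * sign
            alpha += dgamma
            back += H_KIN * dgamma * sign
            trial = E * (eps - eps_p)           # corrected stress
        stresses.append(trial)
    return np.array(stresses)

if __name__ == "__main__":
    # one tension-compression cycle to +/- 2% strain
    path = np.concatenate([np.linspace(0, 0.02, 50),
                           np.linspace(0.02, -0.02, 100),
                           np.linspace(-0.02, 0.02, 100)])
    print(cyclic_response(path)[[49, 149, 249]])
```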
Facility Monitoring: A Qualitative Theory for Sensor Fusion
NASA Technical Reports Server (NTRS)
Figueroa, Fernando
2001-01-01
Data fusion and sensor management approaches have largely been implemented with centralized and hierarchical architectures. Numerical and statistical methods are the most common data fusion methods found in these systems. Given the proliferation and low cost of processing power, there is now an emphasis on designing distributed and decentralized systems. These systems use analytical/quantitative techniques or qualitative reasoning methods for data fusion. Based on other work by the author, a sensor may be treated as a highly autonomous (decentralized) unit. Each highly autonomous sensor (HAS) is capable of extracting qualitative behaviours from its data. For example, it detects spikes, disturbances, noise levels, off-limit excursions, step changes, drift, and other typical measured trends. In this context, this paper describes a distributed sensor fusion paradigm and theory where each sensor in the system is a HAS. Hence, given the rich qualitative information from each HAS, a paradigm and formal definitions are given so that sensors and processes can reason and make decisions at the qualitative level. This approach to sensor fusion makes possible the implementation of intuitive (effective) methods to monitor, diagnose, and compensate processes/systems and their sensors. This paradigm facilitates a balanced distribution of intelligence (code and/or hardware) to the sensor level, the process/system level, and a higher controller level. The primary application of interest is in intelligent health management of rocket engine test stands.
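A highly autonomous sensor's qualitative feature extraction can be sketched in a few lines. The thresholds, detection rules, and event names below are illustrative assumptions, not the paper's formal definitions.

```python
# Sketch of a "highly autonomous sensor" extracting qualitative behaviours
# (off-limit excursion, spike, drift) from its own readings. Thresholds and
# event names are illustrative assumptions.
import numpy as np

class HighlyAutonomousSensor:
    def __init__(self, low, high, spike_sigma=5.0, drift_slope=0.05):
        self.low, self.high = low, high   # engineering limits
        self.spike_sigma = spike_sigma    # spike threshold in sigmas of the diffs
        self.drift_slope = drift_slope    # drift threshold, units per sample

    def qualify(self, x):
        """Return qualitative events detected in the signal x."""
        x = np.asarray(x, dtype=float)
        events = []
        if np.any((x < self.low) | (x > self.high)):
            events.append("off-limit excursion")
        d = np.diff(x)                    # spikes: abrupt sample-to-sample jumps
        if d.size and np.any(np.abs(d) > self.spike_sigma * (np.std(d) + 1e-12)):
            events.append("spike")
        slope = np.polyfit(np.arange(x.size), x, 1)[0]   # drift: sustained trend
        if abs(slope) > self.drift_slope:
            events.append("drift")
        return events

if __name__ == "__main__":
    t = np.arange(200)
    signal = 10 + 0.1 * t + np.random.default_rng(2).normal(0, 0.2, 200)
    signal[120] += 15.0                   # inject a spike
    print(HighlyAutonomousSensor(low=0, high=60).qualify(signal))  # spike and drift
```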
Membrane processes in biotechnology: an overview.
Charcosset, Catherine
2006-01-01
Membrane processes are increasingly reported for various applications in both upstream and downstream technology, such as the established ultrafiltration and microfiltration, and emerging processes such as membrane bioreactors, membrane chromatography, and membrane contactors for the preparation of emulsions and particles. Membrane systems exploit the inherent properties of high selectivity, high surface-area-per-unit-volume, and their potential for controlling the level of contact and/or mixing between two phases. This review presents these various membrane processes by focusing more precisely on membrane materials, module design, operating parameters and the large range of possible applications.
Similarities in neural activations of face and Chinese character discrimination.
Liu, Jiangang; Tian, Jie; Li, Jun; Gong, Qiyong; Lee, Kang
2009-02-18
This study compared Chinese participants' visual discrimination of Chinese faces with that of Chinese characters, which are highly similar to faces on a variety of dimensions. Both Chinese faces and characters activated the bilateral middle fusiform with high levels of correlations. These findings suggest that although the expertise systems for faces and written symbols are known to be anatomically differentiated at the later stages of processing to serve face processing or written-symbol-specific processing purposes, they may share similar neural structures in the ventral occipitotemporal cortex at the stages of visual processing.
Multispectral simulation environment for modeling low-light-level sensor systems
NASA Astrophysics Data System (ADS)
Ientilucci, Emmett J.; Brown, Scott D.; Schott, John R.; Raqueno, Rolando V.
1998-11-01
Image intensifying cameras have been found to be extremely useful in low-light-level (LLL) scenarios including military night vision and civilian rescue operations. These sensors utilize the available visible region photons and an amplification process to produce high contrast imagery. It has been demonstrated that processing techniques can further enhance the quality of this imagery. For example, fusion with matching thermal IR imagery can improve image content when very little visible region contrast is available. To aid in the improvement of current algorithms and the development of new ones, a high fidelity simulation environment capable of producing radiometrically correct multi-band imagery for low-light-level conditions is desired. This paper describes a modeling environment attempting to meet these criteria by addressing the task as two individual components: (1) prediction of a low-light-level radiance field from an arbitrary scene, and (2) simulation of the output from a low-light-level sensor for a given radiance field. The radiance prediction engine utilized in this environment is the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model which is a first principles based multi-spectral synthetic image generation model capable of producing an arbitrary number of bands in the 0.28 to 20 micrometer region. The DIRSIG model is utilized to produce high spatial and spectral resolution radiance field images. These images are then processed by a user configurable multi-stage low-light-level sensor model that applies the appropriate noise and modulation transfer function (MTF) at each stage in the image processing chain. This includes the ability to reproduce common intensifying sensor artifacts such as saturation and 'blooming.' Additionally, co-registered imagery in other spectral bands may be simultaneously generated for testing fusion and exploitation algorithms. This paper discusses specific aspects of the DIRSIG radiance prediction for low-light-level conditions including the incorporation of natural and man-made sources which emphasizes the importance of accurate BRDF. A description of the implementation of each stage in the image processing and capture chain for the LLL model is also presented. Finally, simulated images are presented and qualitatively compared to lab acquired imagery from a commercial system.
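A staged sensor model of the kind described, with noise and an MTF applied at each step of the chain, can be sketched compactly. The sketch below is not DIRSIG or the paper's user-configurable LLL model: it applies a Gaussian-blur MTF, Poisson shot noise, an intensifier gain, and clipping for saturation, and every parameter value is assumed.

```python
# Minimal staged low-light-level sensor sketch: optics MTF as a Gaussian blur,
# photon shot noise, intensifier gain, and saturation by clipping. Illustrative
# chain only; blooming and other intensifier artifacts are not modeled.
import numpy as np
from scipy.ndimage import gaussian_filter

def simulate_lll(radiance, exposure=1e-3, qe=0.2, gain=5000.0,
                 mtf_sigma=1.5, saturation_level=6.5e4, rng=None):
    """radiance: 2D array of photon flux per pixel per second (assumed units)."""
    rng = rng or np.random.default_rng()
    blurred = gaussian_filter(radiance, sigma=mtf_sigma)  # optics MTF stage
    mean_pe = blurred * exposure * qe                     # expected photoelectrons
    photoelectrons = rng.poisson(mean_pe)                 # shot-noise stage
    counts = photoelectrons * gain                        # intensifier gain stage
    return np.clip(counts, 0, saturation_level)           # saturation (no blooming)

if __name__ == "__main__":
    scene = np.full((64, 64), 2.0e4)   # dim, uniform background
    scene[20:30, 20:30] = 5.0e5        # bright source that should saturate
    frame = simulate_lll(scene)
    print(frame.min(), frame.max())
```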
Occupational injuries and sick leaves in household moving works.
Hwan Park, Myoung; Jeong, Byung Yong
2017-09-01
This study is concerned with household moving works and the characteristics of occupational injuries and sick leaves in each step of the moving process. Accident data for 392 occupational accidents were categorized by the moving processes in which the accidents occurred, and possible incidents and sick leaves were assessed for each moving process and hazard factor. Accidents occurring during specific moving processes showed different characteristics depending on the type of accident and agency of accidents. The most critical form in the level of risk management was falls from a height in the 'lifting by ladder truck' process. Incidents ranked as a 'High' level of risk management were in the forms of slips, being struck by objects and musculoskeletal disorders in the 'manual materials handling' process. Also, falls in 'loading/unloading', being struck by objects during 'lifting by ladder truck' and driving accidents in the process of 'transport' were ranked 'High'. The findings of this study can be used to develop more effective accident prevention policy reflecting different circumstances and conditions to reduce occupational accidents in household moving works.
Investigations of Reactive Processes at Temperatures Relevant to the Hypersonic Flight Regime
2014-10-31
molecule is constructed based on high-level ab-initio calculations and interpolated using the reproducing kernel Hilbert space (RKHS) method and...a potential energy surface (PES) for the ground state of the NO2 molecule is constructed based on high-level ab initio calculations and interpolated...between O(3P) and NO(2Π) at higher temperatures relevant to the hypersonic flight regime of reentering spacecraft. At a more fundamental level, we
Ashkenazi, Sarit
2018-02-05
Current theoretical approaches suggest that mathematical anxiety (MA) manifests itself as a weakness in quantity manipulations. This study is the first to examine automatic versus intentional processing of numerical information using the numerical Stroop paradigm in participants with high MA. To manipulate anxiety levels, we combined the numerical Stroop task with an affective priming paradigm. We took a group of college students with high MA and compared their performance to a group of participants with low MA. Under low anxiety conditions (neutral priming), participants with high MA showed relatively intact number processing abilities. However, under high anxiety conditions (mathematical priming), participants with high MA showed (1) higher processing of the non-numerical irrelevant information, which aligns with the theoretical view regarding deficits in selective attention in anxiety and (2) an abnormal numerical distance effect. These results demonstrate that abnormal, basic numerical processing in MA is context related.
Pragmatic Comprehension of High and Low Level Language Learners
ERIC Educational Resources Information Center
Garcia, Paula
2004-01-01
This study compares the performances of 16 advanced and 19 beginning English language learners on a listening comprehension task that focused on linguistic and pragmatic processing. Processing pragmatic meaning differs from processing linguistic meaning because pragmatic meaning requires the listener to understand not only linguistic information,…
Potential uses for cuphea oil processing byproducts and processed oils
USDA-ARS?s Scientific Manuscript database
Cuphea spp. have seeds that contain high levels of medium-chain fatty acids and have the potential to be commercially cultivated. In the course of processing and refining Cuphea oil, a number of byproducts are generated. Developing commercial uses for these byproducts would improve the economics of...
Chen, Qingkui; Zhao, Deyu; Wang, Jingjuan
2017-01-01
This paper aims to develop a low-cost, high-performance and high-reliability computing system to process large-scale data using common data mining algorithms in the Internet of Things (IoT) computing environment. Considering the characteristics of IoT data processing, similar to mainstream high performance computing, we use a GPU (Graphics Processing Unit) cluster to achieve better IoT services. Firstly, we present an energy consumption calculation method (ECCM) based on WSNs. Then, using the CUDA (Compute Unified Device Architecture) Programming model, we propose a Two-level Parallel Optimization Model (TLPOM) which exploits reasonable resource planning and common compiler optimization techniques to obtain the best blocks and threads configuration considering the resource constraints of each node. The key to this part is dynamic coupling Thread-Level Parallelism (TLP) and Instruction-Level Parallelism (ILP) to improve the performance of the algorithms without additional energy consumption. Finally, combining the ECCM and the TLPOM, we use the Reliable GPU Cluster Architecture (RGCA) to obtain a high-reliability computing system considering the nodes’ diversity, algorithm characteristics, etc. The results show that the performance of the algorithms significantly increased by 34.1%, 33.96% and 24.07% for Fermi, Kepler and Maxwell on average with TLPOM and the RGCA ensures that our IoT computing system provides low-cost and high-reliability services. PMID:28777325
Fang, Yuling; Chen, Qingkui; Xiong, Neal N; Zhao, Deyu; Wang, Jingjuan
2017-08-04
This paper aims to develop a low-cost, high-performance and high-reliability computing system to process large-scale data using common data mining algorithms in the Internet of Things (IoT) computing environment. Considering the characteristics of IoT data processing, similar to mainstream high performance computing, we use a GPU (Graphics Processing Unit) cluster to achieve better IoT services. Firstly, we present an energy consumption calculation method (ECCM) based on WSNs. Then, using the CUDA (Compute Unified Device Architecture) Programming model, we propose a Two-level Parallel Optimization Model (TLPOM) which exploits reasonable resource planning and common compiler optimization techniques to obtain the best blocks and threads configuration considering the resource constraints of each node. The key to this part is dynamic coupling Thread-Level Parallelism (TLP) and Instruction-Level Parallelism (ILP) to improve the performance of the algorithms without additional energy consumption. Finally, combining the ECCM and the TLPOM, we use the Reliable GPU Cluster Architecture (RGCA) to obtain a high-reliability computing system considering the nodes' diversity, algorithm characteristics, etc. The results show that the performance of the algorithms significantly increased by 34.1%, 33.96% and 24.07% for Fermi, Kepler and Maxwell on average with TLPOM and the RGCA ensures that our IoT computing system provides low-cost and high-reliability services.
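The idea of choosing a blocks/threads launch configuration under per-node resource constraints can be sketched with a simple occupancy heuristic. The resource limits and the scoring rule below are generic assumptions, not the paper's TLPOM or any particular GPU's datasheet values.

```python
# Heuristic sketch of choosing a (threads-per-block, blocks) launch
# configuration under simple per-SM resource limits, in the spirit of picking
# a "best blocks and threads configuration". Limits and scoring are generic
# assumptions for illustration.
def best_config(n_items, regs_per_thread, smem_per_block,
                max_threads_per_sm=2048, max_regs_per_sm=65536,
                max_smem_per_sm=49152, warp=32, max_block=1024):
    best = None
    for threads in range(warp, max_block + 1, warp):
        # how many blocks of this size fit on one SM
        fit = min(max_threads_per_sm // threads,
                  max_regs_per_sm // (regs_per_thread * threads),
                  max_smem_per_sm // smem_per_block if smem_per_block else 1 << 30)
        if fit == 0:
            continue
        occupancy = fit * threads / max_threads_per_sm   # fraction of thread slots used
        blocks = (n_items + threads - 1) // threads      # cover all work items
        if best is None or occupancy > best["occupancy"]:
            best = {"threads_per_block": threads, "blocks": blocks,
                    "occupancy": occupancy}
    return best

if __name__ == "__main__":
    print(best_config(n_items=1_000_000, regs_per_thread=40, smem_per_block=8192))
```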
Cadesky, Lee; Walkling-Ribeiro, Markus; Kriner, Kyle T; Karwe, Mukund V; Moraru, Carmen I
2017-09-01
Reconstituted micellar casein concentrates and milk protein concentrates of 2.5 and 10% (wt/vol) protein concentration were subjected to high-pressure processing at pressures from 150 to 450 MPa, for 15 min, at ambient temperature. The structural changes induced in milk proteins by high-pressure processing were investigated using a range of physical, physicochemical, and chemical methods, including dynamic light scattering, rheology, mid-infrared spectroscopy, scanning electron microscopy, proteomics, and soluble mineral analyses. The experimental data clearly indicate pressure-induced changes of casein micelles, as well as denaturation of serum proteins. Calcium-binding αS1- and αS2-casein levels increased in the soluble phase after all pressure treatments. Pressurization up to 350 MPa also increased levels of soluble calcium and phosphorus, in all samples and concentrations, whereas treatment at 450 MPa reduced the levels of soluble Ca and P. Experimental data suggest dissociation of calcium phosphate and subsequent casein micelle destabilization as a result of pressure treatment. Treatment of 10% micellar casein concentrate and 10% milk protein concentrate samples at 450 MPa resulted in weak, physical gels, which featured aggregates of uniformly distributed casein substructures of 15 to 20 nm in diameter. Serum proteins were significantly denatured by pressures above 250 MPa. These results provide information on pressure-induced changes in high-concentration protein systems, and may inform the development of new milk protein-based foods with novel textures and potentially high nutritional quality, of particular interest being the soft gel structures formed at high pressure levels. The Authors. Published by the Federation of Animal Science Societies and Elsevier Inc. on behalf of the American Dairy Science Association®. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/3.0/).
A Task-Dependent Causal Role for Low-Level Visual Processes in Spoken Word Comprehension
ERIC Educational Resources Information Center
Ostarek, Markus; Huettig, Falk
2017-01-01
It is well established that the comprehension of spoken words referring to object concepts relies on high-level visual areas in the ventral stream that build increasingly abstract representations. It is much less clear whether basic low-level visual representations are also involved. Here we asked in what task situations low-level visual…
Acquisition of a High Voltage/High resolution Transmission Electron Microscope.
1988-08-21
microstructural design starts at the nanometer level. One such method is colloidal processing of materials with ultrafine particles in which particle...applications in the colloidal processing of ceramics with ultrafine particles. Afterwards, nanometer-sized particles will be synthesized and...STRUCTURAL CONTROL WITH ULTRAFINE PARTICLES Jun Liu, Mehmet Sarikaya, and I. A. Aksay, Department of Materials Science and Engineering. Advanced
ERIC Educational Resources Information Center
Koran, Selcuk
2015-01-01
Teacher motivation is one of the primary variables of students' high performance. It is experienced that students whose teachers are highly motivated are more engaged in the learning process. Therefore, it's mostly the teacher who determines the level of success or failure in achieving institution's goal in the educational process. Thus, teachers…
Dan Neary
2016-01-01
Forested catchments throughout the world are known for producing high quality water for human use. In the 20th Century, experimental forest catchment studies played a key role in studying the processes contributing to high water quality. The hydrologic processes investigated on these paired catchments have provided the science base for examining water quality...
High pressure processing and its application to the challenge of virus-contaminated foods.
Kingsley, David H
2013-03-01
High pressure processing (HPP) is an increasingly popular non-thermal food processing technology. Study of HPP's potential to inactivate foodborne viruses has defined general pressure levels required to inactivate hepatitis A virus, norovirus surrogates, and human norovirus itself within foods such as shellfish and produce. The sensitivity of a number of different picornaviruses to HPP is variable. Experiments suggest that HPP inactivates viruses via denaturation of capsid proteins which render the virus incapable of binding to its receptor on the surface of its host cell. Beyond the primary consideration of treatment pressure level, the effects of extending treatment times, temperature of initial pressure application, and matrix composition have been identified as critical parameters for designing HPP inactivation strategies. Research described here can serve as a preliminary guide to whether a current commercial process could be effective against HuNoV or HAV.
van Berkel, Jantien; Boot, Cécile R L; Proper, Karin I; Bongers, Paulien M; van der Beek, Allard J
2013-01-01
To evaluate the process of the implementation of an intervention aimed at improving work engagement and energy balance, and to explore associations between process measures and compliance. Process measures were assessed using a combination of quantitative and qualitative methods. The mindfulness training was attended at least once by 81.3% of subjects, and 54.5% were highly compliant. With regard to e-coaching and homework exercises, 6.3% and 8.0%, respectively, were highly compliant. The training was appreciated with a 7.5 score and e-coaching with a 6.8 score. Appreciation of training and e-coaching, satisfaction with trainer and coach, and practical facilitation were significantly associated with compliance. The intervention was implemented well on the level of the mindfulness training, but poorly on the level of e-coaching and homework time investment. To increase compliance, attention should be paid to satisfaction and trainer-participant relationship.
Taris, Toon W; Feij, Jan A
2004-11-01
The present 3-wave longitudinal study was an examination of job-related learning and strain as a function of job demand and job control. The participants were 311 newcomers to their jobs. On the basis of R. A. Karasek and T. Theorell's (1990) demand-control model, the authors predicted that high demand and high job control would lead to high levels of learning; low demand and low job control should lead to low levels of learning; high demand and low job control should lead to high levels of strain; and low demand and high job control should lead to low levels of strain. The relation between strain and learning was also examined. The authors tested the hypotheses using ANCOVA and structural equation modeling. The results revealed that high levels of strain have an adverse effect on learning; the reverse effect was not confirmed. It appears that Karasek and Theorell's model is very relevant when examining work socialization processes.
ERIC Educational Resources Information Center
Collins, Loel; Collins, Dave
2017-01-01
This article continues a theme of previous investigations by the authors and examines the focus of in-action reflection as a component of professional judgement and decision-making (PJDM) processes in high-level adventure sports coaching. We utilised a thematic analysis approach to investigate the decision-making practices of a sample of…
Adaptation and Age-Related Expectations of Older Gay and Lesbian Adults.
ERIC Educational Resources Information Center
Quam, Jean K.; Whitford, Gary S.
1992-01-01
Respondents in a study of lesbian women and gay men over age 50 who indicated high levels of involvement in the gay community reported acceptance of the aging process and high levels of life satisfaction, despite predictable problems associated with aging and sexual orientation. Being active in the gay community was an asset to accepting one's…
A preliminary analysis of Florida State Park satisfaction survey data
Andrew Holdnak; Stephen Holland; Erin Parks
2002-01-01
This study is part of a five-year quality review process for Florida State Parks. It attempts to document the feelings visitors have about the parks they visit. The preliminary findings are very similar to results found in a similar study conducted in 1995 in which high levels of overall satisfaction were found. Despite high levels of overall satisfaction there were...
Sensitivity to musical structure in the human brain
McDermott, Josh H.; Norman-Haignere, Sam; Kanwisher, Nancy
2012-01-01
Evidence from brain-damaged patients suggests that regions in the temporal lobes, distinct from those engaged in lower-level auditory analysis, process the pitch and rhythmic structure in music. In contrast, neuroimaging studies targeting the representation of music structure have primarily implicated regions in the inferior frontal cortices. Combining individual-subject fMRI analyses with a scrambling method that manipulated musical structure, we provide evidence of brain regions sensitive to musical structure bilaterally in the temporal lobes, thus reconciling the neuroimaging and patient findings. We further show that these regions are sensitive to the scrambling of both pitch and rhythmic structure but are insensitive to high-level linguistic structure. Our results suggest the existence of brain regions with representations of musical structure that are distinct from high-level linguistic representations and lower-level acoustic representations. These regions provide targets for future research investigating possible neural specialization for music or its associated mental processes. PMID:23019005
Maintaining consistency between planning hierarchies: Techniques and applications
NASA Technical Reports Server (NTRS)
Zoch, David R.
1987-01-01
In many planning and scheduling environments, it is desirable to view and manipulate plans at different levels of abstraction, giving users the option of working with either a very detailed representation of the plan or a high-level, more abstract version of it. Generating a detailed plan from a more abstract plan requires domain-specific planning/scheduling knowledge; the reverse process of generating a high-level plan from a detailed plan (Reverse Plan Maintenance, or RPM) requires that the system remember the actions it took based on its domain-specific knowledge and its reasons for taking those actions. This reverse plan maintenance process is described as implemented in a specific planning and scheduling tool, the Mission Operations Planning Assistant (MOPA), along with applications of RPM to other planning and scheduling problems, emphasizing the knowledge needed to maintain the correspondence between the different hierarchical planning levels.
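A minimal sketch of the bookkeeping such a system might perform: each time the planner refines an abstract activity into detailed actions, it records the refinement and its rationale so that the abstract plan can later be recovered from the detailed one. The class and field names below are hypothetical and do not come from MOPA.

```python
from dataclasses import dataclass, field

@dataclass
class Refinement:
    """Record of one abstract-to-detailed planning decision."""
    abstract_activity: str    # e.g. "downlink science data"
    detailed_actions: list    # actions the planner generated for it
    rationale: str            # why this expansion was chosen

@dataclass
class PlanMemory:
    refinements: list = field(default_factory=list)

    def record(self, abstract_activity, detailed_actions, rationale):
        self.refinements.append(
            Refinement(abstract_activity, detailed_actions, rationale))

    def abstract_plan(self):
        """Reverse plan maintenance: recover the high-level plan
        from the remembered refinements."""
        return [r.abstract_activity for r in self.refinements]

memory = PlanMemory()
memory.record("downlink science data",
              ["slew antenna", "lock signal", "transmit file"],
              "ground station visible during this window")
print(memory.abstract_plan())
```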
Strategic Minimization of High Level Waste from Pyroprocessing of Spent Nuclear Fuel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simpson, Michael F.; Benedict, Robert W.
The pyroprocessing of spent nuclear fuel results in two high-level waste streams--ceramic and metal waste. Ceramic waste contains active metal fission product-loaded salt from the electrorefining, while the metal waste contains cladding hulls and undissolved noble metals. While pyroprocessing was successfully demonstrated for treatment of spent fuel from Experimental Breeder Reactor-II in 1999, it was done so without a specific objective to minimize high-level waste generation. The ceramic waste process uses “throw-away” technology that is not optimized with respect to volume of waste generated. In looking past treatment of EBR-II fuel, it is critical to minimize waste generation for technology developed under the Global Nuclear Energy Partnership (GNEP). While the metal waste cannot be readily reduced, there are viable routes towards minimizing the ceramic waste. Fission products that generate high amounts of heat, such as Cs and Sr, can be separated from other active metal fission products and placed into short-term, shallow disposal. The remaining active metal fission products can be concentrated into the ceramic waste form using an ion exchange process. It has been estimated that ion exchange can reduce ceramic high-level waste quantities by as much as a factor of 3 relative to throw-away technology.
Kosilo, Maciej; Wuerger, Sophie M.; Craddock, Matt; Jennings, Ben J.; Hunt, Amelia R.; Martinovic, Jasna
2013-01-01
Until recently induced gamma-band activity (GBA) was considered a neural marker of cortical object representation. However, induced GBA in the electroencephalogram (EEG) is susceptible to artifacts caused by miniature fixational saccades. Recent studies have demonstrated that fixational saccades also reflect high-level representational processes. Do high-level as opposed to low-level factors influence fixational saccades? What is the effect of these factors on artifact-free GBA? To investigate this, we conducted separate eye tracking and EEG experiments using identical designs. Participants classified line drawings as objects or non-objects. To introduce low-level differences, contours were defined along different directions in cardinal color space: S-cone-isolating, intermediate isoluminant, or a full-color stimulus, the latter containing an additional achromatic component. Prior to the classification task, object discrimination thresholds were measured and stimuli were scaled to matching suprathreshold levels for each participant. In both experiments, behavioral performance was best for full-color stimuli and worst for S-cone isolating stimuli. Saccade rates 200–700 ms after stimulus onset were modulated independently by low and high-level factors, being higher for full-color stimuli than for S-cone isolating stimuli and higher for objects. Low-amplitude evoked GBA and total GBA were observed in very few conditions, showing that paradigms with isoluminant stimuli may not be ideal for eliciting such responses. We conclude that cortical loops involved in the processing of objects are preferentially excited by stimuli that contain achromatic information. Their activation can lead to relatively early exploratory eye movements even for foveally-presented stimuli. PMID:24391611
PyGirl: Generating Whole-System VMs from High-Level Prototypes Using PyPy
NASA Astrophysics Data System (ADS)
Bruni, Camillo; Verwaest, Toon
Virtual machines (VMs) emulating hardware devices are generally implemented in low-level languages for performance reasons. This results in unmaintainable systems that are difficult to understand. In this paper we report on our experience using the PyPy toolchain to improve the portability and reduce the complexity of whole-system VM implementations. As a case study we implement a VM prototype for a Nintendo Game Boy, called PyGirl, in which the high-level model is separated from low-level VM implementation issues. We shed light on the process of refactoring from a low-level VM implementation in Java to a high-level model in RPython. We show that our whole-system VM written with PyPy is significantly less complex than standard implementations, without substantial loss in performance.
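As an illustration of the separation the authors describe, a whole-system emulator can keep its high-level behavioral model (a fetch-decode-execute loop over an opcode table) free of low-level concerns such as bit-packing or translation hints. The sketch below is a generic, hypothetical example in Python; it is not code from PyGirl or RPython.

```python
class CPU:
    """High-level behavioral model of a toy 8-bit CPU core."""

    def __init__(self, program):
        self.memory = list(program) + [0] * (256 - len(program))
        self.acc = 0      # accumulator
        self.pc = 0       # program counter
        # The opcode table keeps the model declarative; a low-level
        # backend could later specialize each handler for speed.
        self.opcodes = {
            0x01: self.op_load_immediate,
            0x02: self.op_add_immediate,
            0xFF: self.op_halt,
        }
        self.running = True

    def fetch(self):
        byte = self.memory[self.pc]
        self.pc += 1
        return byte

    def op_load_immediate(self):
        self.acc = self.fetch()

    def op_add_immediate(self):
        self.acc = (self.acc + self.fetch()) & 0xFF

    def op_halt(self):
        self.running = False

    def run(self):
        while self.running:
            self.opcodes[self.fetch()]()

cpu = CPU([0x01, 0x2A, 0x02, 0x08, 0xFF])   # load 42, add 8, halt
cpu.run()
print(cpu.acc)   # 50
```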
Zaranyika, M F; Nyati, W
2017-10-01
The aim of the present work was to demonstrate the existence of metal-metal interactions in plants and their implications for the absorption of toxic elements like Cr. Typha capensis, a good accumulator of heavy metals, was chosen for the study. Levels of Fe, Cr, Ni, Cd, Pb, Cu and Zn were determined in the soil and roots, rhizomes, stems and leaves of T. capensis from three Sites A, B and C polluted by effluent from a chrome ore processing plant, a gold ore processing plant, and a nickel ore processing plant, respectively. The levels of Cr were extremely high at Site A at 5415 and 786-16,047 μg g⁻¹ dry weight in the soil and the plant, respectively, while the levels of Ni were high at Site C at 176 and 24-891 μg g⁻¹ in the soil and the plant, respectively. The levels of Fe were high at all three sites at 2502-7500 and 906-13,833 μg g⁻¹ in the soil and plant, respectively. For the rest of the metals, levels were modest at 8.5-148 and 2-264 μg g⁻¹ in the soil and plant, respectively. Pearson's correlation analysis confirmed mutual synergistic metal-metal interactions in the uptake of Zn, Cu, Co, Ni, Fe, and Cr, which are attributed to the similarity in the radii and coordination geometry of the cations of these elements. The implications of such metal-metal interactions (or effects of one metal on the behaviour of another) on the uptake of Cr, a toxic element, and possible Cr detoxification mechanism within the plant, are discussed.
Opposite Roles of Furin and PC5A in N-Cadherin Processing
Maret, Deborah; Sadr, Mohamad Seyed; Sadr, Emad Seyed; Colman, David R; Del Maestro, Rolando F; Seidah, Nabil G
2012-01-01
We recently demonstrated that lack of Furin-processing of the N-cadherin precursor (proNCAD) in highly invasive melanoma and brain tumor cells results in the cell-surface expression of a nonadhesive protein favoring cell migration and invasion in vitro. Quantitative polymerase chain reaction analysis of malignant human brain tumor cells revealed that of all proprotein convertases (PCs) only the levels of Furin and PC5A are modulated, being inversely (Furin) or directly (PC5A) correlated with brain tumor invasive capacity. Intriguingly, the N-terminal sequence following the Furin-activated NCAD site (RQKR↓DW161, mouse nomenclature) reveals a second putative PC-processing site (RIRSDR↓DK189) located in the first extracellular domain. Cleavage at this site would abolish the adhesive functions of NCAD because of the loss of the critical Trp161. This was confirmed upon analysis of the fate of the endogenous prosegment of proNCAD in human malignant glioma cells expressing high levels of Furin and low levels of PC5A (U343) or high levels of PC5A and negligible Furin levels (U251). Cellular analyses revealed that Furin is the best activating convertase releasing an ∼17-kDa prosegment, whereas PC5A is the major inactivating enzyme resulting in the secretion of an ∼20-kDa product. Like expression of proNCAD at the cell surface, cleavage of the NCAD molecule at RIRSDR↓DK189 renders the U251 cancer cells less adhesive to one another and more migratory. Our work modifies the present view on posttranslational processing and surface expression of classic cadherins and clarifies how NCAD possesses a range of adhesive potentials and plays a critical role in tumor progression. PMID:23097623
A pedagogical derivation of the matrix element method in particle physics data analysis
NASA Astrophysics Data System (ADS)
Sumowidagdo, Suharyo
2018-03-01
The matrix element method provides a direct connection between the underlying theory of particle physics processes and detector-level physical observables. I am presenting a pedagogically-oriented derivation of the matrix element method, drawing from elementary concepts in probability theory, statistics, and the process of experimental measurements. The level of treatment should be suitable for beginning research students in phenomenology and experimental high energy physics.
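For orientation, the quantity at the heart of the method is a per-event likelihood obtained by folding the squared matrix element over phase space with a detector transfer function; a commonly quoted form (notation assumed here, not taken from this paper) is

P(x \mid \alpha) = \frac{1}{\sigma_{\mathrm{vis}}(\alpha)} \int \mathrm{d}\Phi(y)\, \bigl|\mathcal{M}(y;\alpha)\bigr|^{2}\, W(x \mid y)\, \varepsilon(y),

where x is the reconstructed event, y the parton-level configuration, \alpha the theory parameters, W(x \mid y) the transfer function, \varepsilon(y) the acceptance, and \sigma_{\mathrm{vis}}(\alpha) the visible cross section that normalizes the likelihood.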
Groen, Iris I A; Silson, Edward H; Baker, Chris I
2017-02-19
Visual scene analysis in humans has been characterized by the presence of regions in extrastriate cortex that are selectively responsive to scenes compared with objects or faces. While these regions have often been interpreted as representing high-level properties of scenes (e.g. category), they also exhibit substantial sensitivity to low-level (e.g. spatial frequency) and mid-level (e.g. spatial layout) properties, and it is unclear how these disparate findings can be united in a single framework. In this opinion piece, we suggest that this problem can be resolved by questioning the utility of the classical low- to high-level framework of visual perception for scene processing, and discuss why low- and mid-level properties may be particularly diagnostic for the behavioural goals specific to scene perception as compared to object recognition. In particular, we highlight the contributions of low-level vision to scene representation by reviewing (i) retinotopic biases and receptive field properties of scene-selective regions and (ii) the temporal dynamics of scene perception that demonstrate overlap of low- and mid-level feature representations with those of scene category. We discuss the relevance of these findings for scene perception and suggest a more expansive framework for visual scene analysis.This article is part of the themed issue 'Auditory and visual scene analysis'. © 2017 The Author(s).
Brouard, Agnes; Fagon, Jean Yves; Daniels, Charles E
2011-01-01
This project was designed to identify actions relating to medication error prevention and patient safety improvement implemented in North American hospitals that could be adopted in Parisian hospitals in France. A literature review and analysis of the medication-use process in North American hospitals, together with a validation survey of hospital pharmacist managers in the San Diego area, was performed to assess the main features of the hospital medication-use process. The literature analysis and the survey responses highlighted the main differences between the two countries at three levels: nationwide, hospital, and pharmaceutical service. On this basis, proposals to optimize the medication-use process in the French system include the following topics: implementing expanded use of information technology and robotics; increasing pharmaceutical human resources to allow expansion of clinical pharmacy activities; focusing on high-risk medications and high-risk patient populations; and developing a collective sense of responsibility for medication error prevention in hospital settings, involving medical, pharmaceutical and administrative teams. Along with the strong emphasis that should be put on the identified topics to improve the quality and safety of hospital care in France, consideration of patient safety as a priority at the nationwide level needs to be reinforced.
Emotion unfolded by motion: a role for parietal lobe in decoding dynamic facial expressions.
Sarkheil, Pegah; Goebel, Rainer; Schneider, Frank; Mathiak, Klaus
2013-12-01
Facial expressions convey important emotional and social information and are frequently applied in investigations of human affective processing. Dynamic faces may provide higher ecological validity for examining perceptual and cognitive processing of facial expressions. Higher-order processing of emotional faces was addressed by systematically varying the task and the virtual face models. Blood oxygenation level-dependent activation was assessed using functional magnetic resonance imaging in 20 healthy volunteers while viewing and evaluating either the emotion or the gender intensity of dynamic face stimuli. A general linear model analysis revealed that high valence activated a network of motion-responsive areas, indicating that visual motion areas support perceptual coding of the motion-based intensity of facial expressions. The comparison of the emotion task with the gender discrimination task revealed increased activation of the inferior parietal lobule, which highlights the involvement of parietal areas in processing high-level features of faces. Dynamic emotional stimuli may help to emphasize functions of the hypothesized 'extended' over the 'core' system for face processing.
Risk Assessment for Stonecutting Enterprises
NASA Astrophysics Data System (ADS)
Aleksandrova, A. J.; Timofeeva, S. S.
2017-04-01
Working conditions at enterprises and artisanal workshops for the processing of jewelry and ornamental stones were considered. The main stages of the technological process for processing stone raw materials were outlined, and dangerous operations in the extraction of stone and its processing were identified. The harmful and dangerous production factors affecting stonecutters are characterized; the most hazardous were found to be elevated levels of noise and vibration, as well as chemical reagents. The results of a special assessment of the working conditions of stone-cutting plant workers were studied. Professions with high occupational risk were identified, and an analysis of occupational risks and occupational injuries was carried out. Risk assessment was performed using several methods, and professions with high and medium risk indicators were identified from the results of the evaluation. The application of risk assessment methods made it possible to justify rational measures for reducing risks to the lowest practicable level. The quantitative risk indicators obtained for workers of the stone-cutting enterprises are the result of this work.
Dancing with the Muses: dissociation and flow.
Thomson, Paula; Jaque, S Victoria
2012-01-01
This study investigated dissociative psychological processes and flow (dispositional and state) in a group of professional and pre-professional dancers (n=74). In this study, high scores for global (Mdn=4.14) and autotelic (Mdn=4.50) flow suggest that dancing was inherently integrating and rewarding, although 17.6% of the dancers were identified as possibly having clinical levels of dissociation (Dissociative Experiences Scale-Taxon cutoff score≥20). The results of the multivariate analysis of variance indicated that subjects with high levels of dissociation had significantly lower levels of global flow (p<.05). Stepwise linear regression analyses demonstrated that dispositional flow negatively predicted the dissociative constructs of depersonalization and taxon (p<.05) but did not significantly predict the variance in absorption/imagination (p>.05). As hypothesized, dissociation and flow seem to operate as different mental processes.
Ruiz-Capillas, C; Aller-Guiote, P; Carballo, J; Colmenero, F Jiménez
2006-12-27
Changes in biogenic amine formation and nitrite depletion in meat batters as affected by pressure-temperature combinations (300 MPa/30 min/7, 20, and 40 degrees C), cooking process (70 degrees C/30 min), and storage (54 days/2 degrees C) were studied. Changes in residual nitrite concentration in raw meat batters were conditioned by the temperature and not by the pressure applied. The cooking process decreased (P < 0.05) the residual nitrite concentration in all samples. High-pressure processing and cooking treatment increased (P < 0.05) the nitrate content. Whereas protein-bound nitrite concentration decreased with pressure processing, no effect was observed with the heating process of meat batters. High-pressure processing conditions had no effect on the rate of residual nitrite loss throughout the storage. The application of high pressure decreased (P < 0.05) the concentration of some biogenic amines (tyramine, agmatine, and spermine). Irrespective of the high-pressure processing conditions, biogenic amine levels generally did not change or increased throughout storage, although quantitatively this effect was not very important.
The Role of Jet Adjustment Processes in Subtropical Dust Storms
NASA Astrophysics Data System (ADS)
Pokharel, Ashok Kumar; Kaplan, Michael L.; Fiedler, Stephanie
2017-11-01
Meso-α/β/γ scale atmospheric processes of jet dynamics responsible for generating Harmattan, Saudi Arabian, and Bodélé Depression dust storms are analyzed with observations and high-resolution modeling. The analysis of the role of jet adjustment processes in each dust storm shows similarities as follows: (1) the presence of a well-organized baroclinic synoptic scale system, (2) cross mountain flows that produced a leeside inversion layer prior to the large-scale dust storm, (3) the presence of thermal wind imbalance in the exit region of the midtropospheric jet streak in the lee of the respective mountains shortly after the time of the inversion formation, (4) dust storm formation accompanied by large magnitude ageostrophic isallobaric low-level winds as part of the meso-β scale adjustment process, (5) substantial low-level turbulence kinetic energy (TKE), and (6) emission and uplift of mineral dust in the lee of nearby mountains. The thermally forced meso-γ scale adjustment processes, which occurred in the canyons/small valleys, may have been the cause of numerous observed dust streaks leading to the entry of the dust into the atmosphere due to the presence of significant vertical motion and TKE generation. This study points to the importance of meso-β to meso-γ scale adjustment processes at low atmospheric levels due to an imbalance within the exit region of an upper level jet streak for the formation of severe dust storms. The low level TKE, which is one of the prerequisites to deflate the dust from the surface, cannot be detected with the low resolution data sets; so our results show that a high spatial resolution is required for better representing TKE as a proxy for dust emission.
Pérez, Alberto J; González-Peña, Rolando J; Braga, Roberto; Perles, Ángel; Pérez-Marín, Eva; García-Diego, Fernando J
2018-01-11
Dynamic laser speckle (DLS) is used as a reliable sensor of activity for all types of materials. Traditional applications are based on high-rate captures (usually greater than 10 frames-per-second, fps). Even for drying processes in conservation treatments, where there is a high level of activity in the first moments after the application and slower activity after some minutes or hours, the process is based on the acquisition of images at a time rate that is the same in moments of high and low activity. In this work, we present an alternative approach to track the drying process of protective layers and other painting conservation processes that take a long time to reduce their levels of activity. We illuminate, using three different wavelength lasers, a temporary protector (cyclododecane) and a varnish, and monitor them using a low fps rate during long-term drying. The results are compared to the traditional method. This work also presents a monitoring method that uses portable equipment. The results present the feasibility of using the portable device and show the improved sensitivity of the dynamic laser speckle when sensing the long-term process for drying cyclododecane and varnish in conservation.
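A crude but common way to summarize speckle activity from a stack of frames is the mean absolute intensity difference between consecutive images. The sketch below is a hypothetical simplification, not the authors' pipeline; it only illustrates how an activity index could be tracked at a low frame rate during a long drying process.

```python
import numpy as np

def activity_index(frames):
    """Mean absolute difference between consecutive speckle frames.
    frames: array-like of shape (n_frames, height, width), grayscale."""
    frames = np.asarray(frames, dtype=float)
    diffs = np.abs(np.diff(frames, axis=0))
    return diffs.mean(axis=(1, 2))   # one activity value per frame pair

# Synthetic example: speckle activity decays as a layer dries.
rng = np.random.default_rng(0)
stack = [rng.normal(128, 30 * np.exp(-t / 20), size=(64, 64))
         for t in range(60)]          # e.g. one frame per minute
print(activity_index(stack)[:5])
```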
Sigurdardottir, Heida Maria; Fridriksdottir, Liv Elisabet; Gudjonsdottir, Sigridur; Kristjánsson, Árni
2018-06-01
Evidence of interdependencies between face and word processing mechanisms suggests possible links between reading problems and abnormal face processing. In two experiments we assessed such high-level visual deficits in people with a history of reading problems. Experiment 1 showed that people who were worse at face matching had greater reading problems. In experiment 2, matched dyslexic and typical readers were tested, and difficulties with face matching were consistently found to predict dyslexia over and above both novel-object matching and matching of noise patterns that shared low-level visual properties with faces. Furthermore, ADHD measures could not account for face matching problems. We speculate that reading difficulties in dyslexia are partially caused by specific deficits in high-level visual processing, in particular for visual object categories such as faces and words with which people have extensive experience. Copyright © 2018 Elsevier B.V. All rights reserved.
The decay of highly excited open strings
NASA Technical Reports Server (NTRS)
Mitchell, D.; Turok, N.; Wilkinson, R.; Jetzer, P.
1988-01-01
The decay rates of leading edge Regge trajectory states are calculated for very high level number in open bosonic string theories, ignoring tachyon final states. The optical theorem simplifies the analysis while enabling identification of the different mass level decay channels. The main result is that (in four dimensions) the greatest single channel is the emission of a single photon and a state of the next mass level down. A simple asymptotic formula for arbitrarily high level number is given for this process. Also calculated is the total decay rate exactly up to N=100. It shows little variation over this range but appears to decrease for larger N. The formalism is checked in examples and the decay rate of the first excited level calculated for open superstring theories. The calculation may also have implications for high spin meson resonances.
Kurasawa, Shintaro; Koyama, Shouhei; Ishizawa, Hiroaki; Fujimoto, Keisaku; Chino, Shun
2017-11-23
This paper describes and verifies a non-invasive blood glucose measurement method using a fiber Bragg grating (FBG) sensor system. The FBG sensor is installed on the radial artery, and the strain (pulse wave) that is propagated from the heartbeat is measured. The measured pulse wave signal was used as a collection of feature vectors for multivariate analysis aiming to determine the blood glucose level. The time axis of the pulse wave signal was normalized by two signal processing methods: the shortest-time-cut process and the 1-s-normalization process. The measurement accuracy of the calculated blood glucose level was compared between these signal processing methods. It was impossible to calculate a blood glucose level exceeding 200 mg/dL with the calibration curve constructed by the shortest-time-cut process. With the 1-s-normalization process, the measurement accuracy of the blood glucose level was improved, and a blood glucose level exceeding 200 mg/dL could be calculated. By examining the loading vector of each calibration curve used to calculate the blood glucose level with high measurement accuracy, we found that the gradient of the peak of the pulse wave in the acceleration plethysmogram had a large influence.
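A minimal sketch of the kind of time-axis normalization described: every detected pulse is resampled onto a common grid before being assembled into feature vectors for the multivariate calibration. Variable names and the resampling length below are assumptions, not the authors' code.

```python
import numpy as np

def normalize_pulse(t, signal, n_samples=100):
    """Resample one pulse wave onto a fixed-length, normalized time axis."""
    t = np.asarray(t, dtype=float)
    signal = np.asarray(signal, dtype=float)
    t_norm = (t - t[0]) / (t[-1] - t[0])      # map pulse duration to [0, 1]
    grid = np.linspace(0.0, 1.0, n_samples)
    return np.interp(grid, t_norm, signal)    # fixed-length feature vector

# Two pulses of different durations become comparable feature vectors.
pulse_a = normalize_pulse(np.linspace(0, 0.8, 80),
                          np.sin(np.linspace(0, np.pi, 80)))
pulse_b = normalize_pulse(np.linspace(0, 1.1, 110),
                          np.sin(np.linspace(0, np.pi, 110)))
X = np.vstack([pulse_a, pulse_b])             # rows feed a PLS-type regression
print(X.shape)   # (2, 100)
```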
Final report on cermet high-level waste forms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kobisk, E.H.; Quinby, T.C.; Aaron, W.S.
1981-08-01
Cermets are being developed as an alternate method for the fixation of defense and commercial high level radioactive waste in a terminal disposal form. Following initial feasibility assessments of this waste form, consisting of ceramic particles dispersed in an iron-nickel base alloy, significantly improved processing methods were developed. The characterization of cermets has continued through property determinations on samples prepared by various methods from a variety of simulated and actual high-level wastes. This report describes the status of development of the cermet waste form as it has evolved since 1977. 6 tables, 18 figures.
NASA Astrophysics Data System (ADS)
Tahri, Meryem; Maanan, Mohamed; Hakdaoui, Mustapha
2016-04-01
This paper presents a method to assess vulnerability to coastal risks such as coastal erosion or marine submersion by applying the Fuzzy Analytic Hierarchy Process (FAHP) and spatial analysis techniques within a Geographic Information System (GIS). The coast of Mohammedia, located in Morocco, was chosen as the study site to implement and validate the proposed framework by applying a GIS-FAHP based methodology. The coastal risk vulnerability mapping follows multi-parametric causative factors such as sea level rise, significant wave height, tidal range, coastal erosion, elevation, geomorphology and distance to an urban area. The Fuzzy Analytic Hierarchy Process methodology enables the calculation of the corresponding criteria weights. The result shows that the coastline of Mohammedia is characterized by moderate, high and very high levels of vulnerability to coastal risk. The high vulnerability areas are situated in the east at the Monika and Sablette beaches. This technical approach relies on the efficiency of the Geographic Information System tool combined with the Fuzzy Analytic Hierarchy Process to help decision makers find optimal strategies to minimize coastal risks.
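To illustrate the weighting step, the sketch below shows a conventional (crisp) AHP weight calculation from a pairwise comparison matrix using the geometric-mean approximation; the fuzzy variant in the paper operates on fuzzy numbers instead, and the criteria and values here are illustrative assumptions.

```python
import numpy as np

# Pairwise comparison matrix for three example criteria
# (sea level rise, wave height, coastal erosion); values are illustrative.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Geometric-mean approximation of the principal eigenvector.
geo_means = np.prod(A, axis=1) ** (1.0 / A.shape[0])
weights = geo_means / geo_means.sum()
print(weights)   # approximately [0.65, 0.23, 0.12]

# Vulnerability score of one map cell = weighted sum of normalized criteria.
cell_scores = np.array([0.8, 0.4, 0.6])   # normalized criterion values
print(float(weights @ cell_scores))
```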
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1977-06-01
The pilot plant is developed for ERDA low-level contact-handled transuranic waste, ERDA remote-handled intermediate-level transuranic waste, and for high-level waste experiments. All wastes placed in the WIPP arrive at the site processed and packaged; no waste processing is done at the WIPP. All wastes placed into the WIPP are retrievable. The proposed site for WIPP lies 26 miles east of Carlsbad, New Mexico. This document includes the executive summary and a detailed description of the facilities and systems. (DLC)
NASA Technical Reports Server (NTRS)
Johansson, Sveneric; Carpenter, Kenneth G.
1988-01-01
Two fluorescence processes operating in atmospheres of cool stars, symbiotic stars, and the Sun are presented. Two emission lines, at 1347.03 and 1360.17 A, are identified as fluorescence lines of Cr II and Fe II. The lines are due to transitions from highly excited levels, which are populated radiatively by the hydrogen Lyman alpha line due to accidental wavelength coincidences. Three energy levels, one in Cr II and two in Fe II, are reported.
McGregor, Lucy S; Melvin, Glenn A; Newman, Louise K
2015-07-01
In recent years there has been increased debate and critique of the focus on psychopathology in general, and posttraumatic stress disorder (PTSD) in particular, as a predominant consequence of the refugee experience. This study was conducted to broaden the conceptualization and examination of the outcomes of the refugee experience by jointly examining how adaptive processes, psychosocial factors, and psychopathology are implicated. A mixed-methods approach was used to specifically examine whether adolescents' (N = 10) accounts of their refugee and resettlement experiences differed according to their level, "high" or "low," of PTSD symptomatology. The superordinate themes of cultural belongingness and identification, psychological functioning, family unit functioning and relationships, and friendships and interpersonal processes, were identified as having particular relevance for the study's participants and in distinguishing between participants with high and low levels of PTSD symptomatology. Findings were characterized by marked differences between adolescents' accounts according to their symptomatology levels, and may thereby inform important avenues for future research as well as clinical prevention and intervention programs with refugee youth. (c) 2015 APA, all rights reserved.
González-Cebrino, Francisco; Durán, Rocío; Delgado-Adámez, Jonathan; Contador, Rebeca; Bernabé, Rosario Ramírez
2016-04-01
Physicochemical parameters, bioactive compounds' content (carotenoids and total phenols), total antioxidant activity, and enzymatic activity of polyphenol oxidase (PPO) were evaluated after high pressure processing (HPP) on a pumpkin purée (cv. 'Butternut'). Three pressure levels (400, 500, and 600 MPa) were combined with three holding times (200, 400, and 600 s). The applied treatments reduced the levels of total aerobic mesophilic (TAM), total psychrophilic and psychrotrophic bacteria (TPP), and molds and yeasts (M&Y). All applied treatments did not affect enzymatic activity of PPO. Pressure level increased CIE L* values, which could enhance the lightness perception of high pressure (HP)-treated purées. No differences were found between the untreated and HP-treated purées regarding total phenols and carotenoids content (lutein, α-carotene, and β-carotene) and total antioxidant activity. HPP did not affect most quality parameters and maintained the levels of bioactive compounds. However, it did not achieve the complete inhibition of PPO, which could reduce the shelf-life of the pumpkin purée. © The Author(s) 2015.
Data Processing at the High School Level.
ERIC Educational Resources Information Center
Richmond, Sue
1981-01-01
The teaching of data processing in the secondary school is examined, including teachers (certification, work experience), textbooks (selection, concentration), community (advisory committees, career exploration), students (recruitment, aptitude tests), instruction methods (simulation, audiovisuals, field trips), course content (machine technology,…
Tang, Li Juan; Chen, Xiao; Wen, Tian Yu; Yang, Shuang; Zhao, Jun Jie; Qiao, Hong Wei; Hou, Yu; Yang, Hua Gui
2018-02-26
A highly transparent NiO layer was prepared by a solution processing method with nickel(II) 2-ethylhexanoate in a non-polar solvent and utilized as the HTM in perovskite solar cells. Excellent optical transmittance and a well-matched energy level led to a higher power conversion efficiency (PCE, 18.15 %) than that of the conventional sol-gel-processed NiO-based device (12.98 %). © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Yu, Leiming; Nina-Paravecino, Fanny; Kaeli, David; Fang, Qianqian
2018-01-01
We present a highly scalable Monte Carlo (MC) three-dimensional photon transport simulation platform designed for heterogeneous computing systems. Through the development of a massively parallel MC algorithm using the Open Computing Language framework, this research extends our existing graphics processing unit (GPU)-accelerated MC technique to a highly scalable vendor-independent heterogeneous computing environment, achieving significantly improved performance and software portability. A number of parallel computing techniques are investigated to achieve portable performance over a wide range of computing hardware. Furthermore, multiple thread-level and device-level load-balancing strategies are developed to obtain efficient simulations using multiple central processing units and GPUs.
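The core kernel in such simulations is a photon random walk. The highly simplified, single-threaded Python sketch below only conveys the structure of the algorithm (exponential step sampling, absorption weighting, scattering in a homogeneous slab); the actual platform distributes millions of such walks across CPU/GPU devices via OpenCL, and all parameter values here are assumptions.

```python
import numpy as np

def run_photons(n_photons=5000, mu_a=0.1, mu_s=10.0, thickness=1.0, seed=0):
    """Toy MC photon transport in a homogeneous slab (cm, 1/cm units).
    Returns the fraction of launched photon weight absorbed in the slab."""
    rng = np.random.default_rng(seed)
    mu_t = mu_a + mu_s
    absorbed = 0.0
    for _ in range(n_photons):
        z, uz, weight = 0.0, 1.0, 1.0            # start at surface, heading inward
        while weight > 1e-4:
            step = -np.log(rng.random()) / mu_t  # exponential free path
            z += uz * step
            if z < 0.0 or z > thickness:         # photon escaped the slab
                break
            absorbed += weight * (mu_a / mu_t)   # deposit the absorbed fraction
            weight *= mu_s / mu_t                # survive with scattering albedo
            uz = 2.0 * rng.random() - 1.0        # crude scatter of the z-direction only
    return absorbed / n_photons

print(run_photons())
```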
Levelling in the German Verb Paradigm
ERIC Educational Resources Information Center
Newman, John
1974-01-01
Levelling processes in the history of the German verb paradigm from Old High German to the present are discussed. It is asserted that the theory of transformational generative grammar provides a proper framework for the study of linguistic change. (RM)
Affective ERP Processing in a Visual Oddball Task: Arousal, Valence, and Gender
Rozenkrants, Bella; Polich, John
2008-01-01
Objective: To assess affective event-related brain potentials (ERPs) using visual pictures that were highly distinct on arousal level/valence category ratings and a response task. Methods: Images from the International Affective Pictures System (IAPS) were selected to obtain distinct affective arousal (low, high) and valence (negative, positive) rating levels. The pictures were used as target stimuli in an oddball paradigm, with a visual pattern as the standard stimulus. Participants were instructed to press a button whenever a picture occurred and to ignore the standard. Task performance and response time did not differ across conditions. Results: High-arousal compared to low-arousal stimuli produced larger amplitudes for the N2, P3, early slow wave, and late slow wave components. Valence amplitude effects were weak overall and originated primarily from the later waveform components and interactions with electrode position. Gender differences were negligible. Conclusion: The findings suggest that arousal level is the primary determinant of affective oddball processing, and valence minimally influences ERP amplitude. Significance: Affective processing engages selective attentional mechanisms that are primarily sensitive to the arousal properties of emotional stimuli. The application and nature of task demands are important considerations for interpreting these effects. PMID:18783987
Biological-Physical Feedbacks Determine Coastal Environmental Response to Climate Change
NASA Astrophysics Data System (ADS)
Moore, L. J.; Duran Vinent, O.; Walters, D.; Fagherazzi, S.; Mariotti, G.; Young, D.; Wolner, C. V.
2012-12-01
As low-lying coastal landforms, transitional between marine and terrestrial realms, barrier islands are especially sensitive to changing environmental conditions. Interactions among biological and physical processes appear to play a critical role in determining how these landscapes will evolve in the future as sea level rises, storm intensity increases and plant species composition changes. Within a new conceptual framework, barrier islands tend to exist in one of two primary states. "Low" islands have little relief above sea level and are dominated by external processes, responding quickly on short time scales to changes in forcing (e.g., storms, sea level rise, etc.), migrating rapidly and generally being low in ecological diversity and productivity. In contrast, "high" islands are less vulnerable to storms, tend to be dominated by internal processes (e.g., sand trapping by vegetation), require long time periods to respond to changes in forcing, migrate slowly (if at all) and host a range of plant species and morphological environments including shrubs, small trees and vegetated secondary and tertiary dunes with intervening swales. The continued existence of barrier island landforms will depend on the degree to which islands can maintain elevation above sea level while also responding to changes in forcing by migrating landward. A long-term morphological-behavior model exploring coupled barrier-marsh evolution and a new ecomorphodynamic model representing the formation/recovery of dunes as a function of storms, shed light on the role of interactions among biological and physical processes on barrier island response to climate change. Results suggest that connections between the marsh and barrier realms, which are mediated by biological processes in the marsh environment, are highly sensitive to factors such as sea level rise rate, antecedent morphology and marsh composition. Results also suggest that feedbacks between sediment transport and vegetation involved in dune building may allow small, gradual changes in storms to cause abrupt, nonlinear transitions from the high to low island state.
Yang, Jun; Chen, Xiaorong; Zhu, Changlan; Peng, Xiaosong; He, Xiaopeng; Fu, Junru; Ouyang, Linjuan; Bian, Jianmin; Hu, Lifang; Sun, Xiaotang; Xu, Jie; He, Haohua
2015-01-01
Rice reproductive development is sensitive to high temperature and soil nitrogen supply, both of which are predicted to be increased threats to rice crop yield. Rice spikelet development is a critical process that determines yield, yet little is known about the transcriptional regulation of rice spikelet development in response to the combination of heat stress and low nitrogen availability. Here, we profiled gene expression of rice spikelet development during meiosis under heat stress and different nitrogen levels using RNA-seq. We subjected plants to four treatments: 1) NN: normal nitrogen level (165 kg ha-1) with normal temperature (30°C); 2) HH: high nitrogen level (264 kg ha-1) with high temperature (37°C); 3) NH: normal nitrogen level and high temperature; and 4) HN: high nitrogen level and normal temperature. The de novo transcriptome assembly resulted in 52,250,482 clean reads aligned with 76,103 unigenes, which were then used to compare differentially expressed genes (DEGs) in the different treatments. Comparing gene expression in samples with the same nitrogen levels but different temperatures, we identified 70 temperature-responsive DEGs in normal nitrogen levels (NN vs NH) and 135 DEGs in high nitrogen levels (HN vs HH), with 27 overlapping DEGs. We identified 17 and seven nitrogen-responsive DEGs by comparing changes in nitrogen levels in lower temperature (NN vs HN) and higher temperature (NH vs HH), with one common DEG. The temperature-responsive genes were principally associated with cytochrome, heat shock protein, peroxidase, and ubiquitin, while the nitrogen-responsive genes were mainly involved in glutamine synthetase, amino acid transporter, pollen development, and plant hormone. Rice spikelet fertility was significantly reduced under high temperature, but less reduced under high-nitrogen treatment. In the high temperature treatments, we observed downregulation of genes involved in spikelet development, such as pollen tube growth, pollen maturation, especially sporopollenin biosynthetic process, and pollen exine formation. Moreover, we observed higher expression levels of the co-expressed DEGs in HN vs HH compared to NN vs NH. These included the six downregulated genes (one pollen maturation and five pollen exine formation genes), as well as the four upregulated DEGs in response to heat. This suggests that high-nitrogen treatment may enhance the gene expression levels to mitigate aspects of heat-stress. The spikelet genes identified in this study may play important roles in response to the combined effects of high temperature and high nitrogen, and may serve as candidates for crop improvement.
High-accurate optical fiber liquid level sensor
NASA Astrophysics Data System (ADS)
Sun, Dexing; Chen, Shouliu; Pan, Chao; Jin, Henghuan
1991-08-01
A highly accurate optical fiber liquid level sensor is presented. A single-chip microcomputer is used to process the signal and control the sensor. This kind of sensor is characterized by intrinsic safety and is explosion-proof, so it can be applied in any liquid level detection setting, especially in the oil and chemical industries. The theory and experiments on how to improve the measurement accuracy are described. The relative error over a measurement range of 10 m is within 0.01%.
High-Level Prediction Signals in a Low-Level Area of the Macaque Face-Processing Hierarchy.
Schwiedrzik, Caspar M; Freiwald, Winrich A
2017-09-27
Theories like predictive coding propose that lower-order brain areas compare their inputs to predictions derived from higher-order representations and signal their deviation as a prediction error. Here, we investigate whether the macaque face-processing system, a three-level hierarchy in the ventral stream, employs such a coding strategy. We show that after statistical learning of specific face sequences, the lower-level face area ML computes the deviation of actual from predicted stimuli. But these signals do not reflect the tuning characteristic of ML. Rather, they exhibit identity specificity and view invariance, the tuning properties of higher-level face areas AL and AM. Thus, learning appears to endow lower-level areas with the capability to test predictions at a higher level of abstraction than what is afforded by the feedforward sweep. These results provide evidence for computational architectures like predictive coding and suggest a new quality of functional organization of information-processing hierarchies beyond pure feedforward schemes. Copyright © 2017 Elsevier Inc. All rights reserved.
When the ends outweigh the means: mood and level of identification in depression.
Watkins, Edward R; Moberly, Nicholas J; Moulds, Michelle L
2011-11-01
Research in healthy controls has found that mood influences cognitive processing via level of action identification: happy moods are associated with global and abstract processing; sad moods are associated with local and concrete processing. However, this pattern seems inconsistent with the high level of abstract processing observed in depressed patients, leading Watkins (2008, 2010) to hypothesise that the association between mood and level of goal/action identification is impaired in depression. We tested this hypothesis by measuring level of identification on the Behavioural Identification Form after happy and sad mood inductions in never-depressed controls and currently depressed patients. Participants used increasingly concrete action identifications as they became sadder and less happy, but this effect was moderated by depression status. Consistent with Watkins' (2008) hypothesis, increases in sad mood and decreases in happiness were associated with shifts towards the use of more concrete action identifications in never-depressed individuals, but not in depressed patients. These findings suggest that the putatively adaptive association between mood and level of identification is impaired in major depression.
Archiving, processing, and disseminating ASTER products at the USGS EROS Data Center
Jones, B.; Tolk, B.; ,
2002-01-01
The U.S. Geological Survey EROS Data Center archives, processes, and disseminates Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data products. The ASTER instrument is one of five sensors onboard the Earth Observing System's Terra satellite launched December 18, 1999. ASTER collects broad spectral coverage with high spatial resolution at near infrared, shortwave infrared, and thermal infrared wavelengths with ground resolutions of 15, 30, and 90 meters, respectively. The ASTER data are used in many ways to understand local and regional earth-surface processes. Applications include land-surface climatology, volcanology, hazards monitoring, geology, agronomy, land cover change, and hydrology. The ASTER data are available for purchase from the ASTER Ground Data System in Japan and from the Land Processes Distributed Active Archive Center in the United States, which receives level 1A and level 1B data from Japan on a routine basis. These products are archived and made available to the public within 48 hours of receipt. The level 1A and level 1B data are used to generate higher level products that include routine and on-demand decorrelation stretch, brightness temperature at the sensor, emissivity, surface reflectance, surface kinetic temperature, surface radiance, polar surface and cloud classification, and digital elevation models. This paper describes the processes and procedures used to archive, process, and disseminate standard and on-demand higher level ASTER products at the Land Processes Distributed Active Archive Center.
Biochemistry at High School and University Levels in Saudi Arabia.
ERIC Educational Resources Information Center
Abu-Salah, Khalid M.; And Others
1988-01-01
Describes the assessment process for students in Saudi Arabia who are interested in pursuing a higher education in biochemistry. Provides recommendations for improving biochemistry education in both high schools and universities. (TW)
Object Categorization in Finer Levels Relies More on Higher Spatial Frequencies and Takes Longer.
Ashtiani, Matin N; Kheradpisheh, Saeed R; Masquelier, Timothée; Ganjtabesh, Mohammad
2017-01-01
The human visual system contains a hierarchical sequence of modules that take part in visual perception at different levels of abstraction, i.e., superordinate, basic, and subordinate levels. One important question is to identify the "entry" level at which the visual representation is commenced in the process of object recognition. For a long time, it was believed that the basic level had a temporal advantage over two others. This claim has been challenged recently. Here we used a series of psychophysics experiments, based on a rapid presentation paradigm, as well as two computational models, with bandpass filtered images of five object classes to study the processing order of the categorization levels. In these experiments, we investigated the type of visual information required for categorizing objects in each level by varying the spatial frequency bands of the input image. The results of our psychophysics experiments and computational models are consistent. They indicate that the different spatial frequency information had different effects on object categorization in each level. In the absence of high frequency information, subordinate and basic level categorization are performed less accurately, while the superordinate level is performed well. This means that low frequency information is sufficient for superordinate level, but not for the basic and subordinate levels. These finer levels rely more on high frequency information, which appears to take longer to be processed, leading to longer reaction times. Finally, to avoid the ceiling effect, we evaluated the robustness of the results by adding different amounts of noise to the input images and repeating the experiments. As expected, the categorization accuracy decreased and the reaction time increased significantly, but the trends were the same. This shows that our results are not due to a ceiling effect. The compatibility between our psychophysical and computational results suggests that the temporal advantage of the superordinate (resp. basic) level to basic (resp. subordinate) level is mainly due to the computational constraints (the visual system processes higher spatial frequencies more slowly, and categorization in finer levels depends more on these higher spatial frequencies).
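The low-/high-spatial-frequency manipulation described can be reproduced with a simple Fourier-domain bandpass filter; the sketch below is a generic illustration, and the cutoff values are arbitrary placeholders rather than those used in the study.

```python
import numpy as np

def bandpass_filter(image, low_cpi, high_cpi):
    """Keep only spatial frequencies between low_cpi and high_cpi
    (expressed in cycles per image for simplicity)."""
    image = np.asarray(image, dtype=float)
    f = np.fft.fftshift(np.fft.fft2(image))
    h, w = image.shape
    fy = np.fft.fftshift(np.fft.fftfreq(h)) * h   # cycles per image, vertical
    fx = np.fft.fftshift(np.fft.fftfreq(w)) * w   # cycles per image, horizontal
    radius = np.sqrt(fy[:, None] ** 2 + fx[None, :] ** 2)
    mask = (radius >= low_cpi) & (radius <= high_cpi)
    return np.real(np.fft.ifft2(np.fft.ifftshift(f * mask)))

rng = np.random.default_rng(1)
img = rng.random((128, 128))
low_sf = bandpass_filter(img, 0, 8)     # coarse structure only
high_sf = bandpass_filter(img, 24, 64)  # fine structure only
print(low_sf.shape, high_sf.shape)
```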
NASA Astrophysics Data System (ADS)
Roy, K.; Peltier, W. R.
2017-12-01
Our understanding of the Earth-Ice-Ocean interactions that have accompanied the large glaciation-deglaciation process characteristic of the last half of the Pleistocene has benefited significantly from the development of high-quality models of the Glacial Isostatic Adjustment (GIA) process. These models provide fundamental insight on the large changes in sea level and land ice cover over this time period, as well as key constraints on the viscosity structure of the Earth's interior. Their development has benefited from the recent availability of high-quality constraints from regions of forebulge collapse. In particular, over North America, the joint use of high-quality sea level data from the U.S. East coast, together with the vast network of precise space-geodetic observations of crustal motion existing over most of the interior of the continent, has led to the latest ICE-7G_NA (VM7) model (Roy & Peltier, GJI, 2017). In this paper, exciting opportunities provided by such high-quality observations related to the GIA process will be discussed, not only in the context of the continuing effort to refine global models of this phenomenon, but also in terms of the fundamental insight they may provide on outstanding issues in high-pressure geophysics, paleoclimatology or hydrogeology. Specific examples where such high-quality observations can be used (either separately, or using a combination of independent sources) will be presented, focusing particularly on constraints from the North American continent and from the Mediterranean basin. This work will demonstrate that, given the high-quality of currently available constraints on the GIA process, considerable further geophysical insight can be obtained based upon the use of spherically-symmetric models of the viscosity structure of the planet.
Business Data Processing: A Teacher's Guide.
ERIC Educational Resources Information Center
Virginia State Dept. of Education, Richmond. Business Education Service.
The curriculum guide, which was prepared to serve as an aid to all teachers of business data processing, gives a complete outline for a high-school level course in both Common Business Oriented Language (COBOL) and Report Program Generator (RPG). Parts one and two of the guide together comprise an introduction to data processing, which deals with…
The Urban Adaptation and Adaptation Process of Urban Migrant Children: A Qualitative Study
ERIC Educational Resources Information Center
Liu, Yang; Fang, Xiaoyi; Cai, Rong; Wu, Yang; Zhang, Yaofang
2009-01-01
This article employs qualitative research methods to explore the urban adaptation and adaptation processes of Chinese migrant children. Through twenty-one in-depth interviews with migrant children, the researchers discovered: The participant migrant children showed a fairly high level of adaptation to the city; their process of urban adaptation…
Spike processing with a graphene excitable laser
Shastri, Bhavin J.; Nahmias, Mitchell A.; Tait, Alexander N.; Rodriguez, Alejandro W.; Wu, Ben; Prucnal, Paul R.
2016-01-01
Novel materials and devices in photonics have the potential to revolutionize optical information processing, beyond conventional binary-logic approaches. Laser systems offer a rich repertoire of useful dynamical behaviors, including the excitable dynamics also found in the time-resolved “spiking” of neurons. Spiking reconciles the expressiveness and efficiency of analog processing with the robustness and scalability of digital processing. We demonstrate a unified platform for spike processing with a graphene-coupled laser system. We show that this platform can simultaneously exhibit logic-level restoration, cascadability and input-output isolation—fundamental challenges in optical information processing. We also implement low-level spike-processing tasks that are critical for higher level processing: temporal pattern detection and stable recurrent memory. We study these properties in the context of a fiber laser system and also propose and simulate an analogous integrated device. The addition of graphene leads to a number of advantages which stem from its unique properties, including high absorption and fast carrier relaxation. These could lead to significant speed and efficiency improvements in unconventional laser processing devices, and ongoing research on graphene microfabrication promises compatibility with integrated laser platforms. PMID:26753897
Quantification of chromatin condensation level by image processing.
Irianto, Jerome; Lee, David A; Knight, Martin M
2014-03-01
The level of chromatin condensation is related to the silencing/activation of chromosomal territories and therefore impacts on gene expression. Chromatin condensation changes during cell cycle, progression and differentiation, and is influenced by various physicochemical and epigenetic factors. This study describes a validated experimental technique to quantify chromatin condensation. A novel image processing procedure is developed using Sobel edge detection to quantify the level of chromatin condensation from nuclei images taken by confocal microscopy. The algorithm was developed in MATLAB and used to quantify different levels of chromatin condensation in chondrocyte nuclei achieved through alteration in osmotic pressure. The resulting chromatin condensation parameter (CCP) is in good agreement with independent multi-observer qualitative visual assessment. This image processing technique thereby provides a validated unbiased parameter for rapid and highly reproducible quantification of the level of chromatin condensation. Copyright © 2013 IPEM. Published by Elsevier Ltd. All rights reserved.
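A sketch of how an edge-density measure of this kind can be computed with a Sobel filter is given below. It is a simplified Python stand-in for the published MATLAB procedure, and the threshold is an assumed value rather than the validated one.

```python
import numpy as np
from scipy import ndimage

def chromatin_condensation_parameter(nucleus_img, edge_threshold=0.2):
    """Edge density within a nucleus image as a proxy for chromatin
    condensation: more condensed chromatin produces more intensity edges."""
    img = np.asarray(nucleus_img, dtype=float)
    img = (img - img.min()) / (np.ptp(img) + 1e-12)  # normalize to [0, 1]
    gx = ndimage.sobel(img, axis=1)
    gy = ndimage.sobel(img, axis=0)
    magnitude = np.hypot(gx, gy)
    edges = magnitude > edge_threshold * magnitude.max()
    return edges.sum() / edges.size                  # fraction of edge pixels

rng = np.random.default_rng(2)
print(chromatin_condensation_parameter(rng.random((256, 256))))
```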
NASA Astrophysics Data System (ADS)
Pastushkov, V. G.; Molchanov, A. V.; Serebryakov, V. P.; Smelova, T. V.; Shestoperov, I. N.
2000-07-01
The paper discusses specific features of the technology, equipment and control of a single-stage RAMW decontamination and melting process in an induction furnace equipped with a "cold" crucible. Calculated and experimental data are given on melting high-activity-level stainless steel and Zr simulating high-activity-level metal waste. The work is under way at SSC RF VNIINM.
NASA Technical Reports Server (NTRS)
Chien, S.
1994-01-01
This paper describes work on the Multimission VICAR Planner (MVP) system to automatically construct executable image processing procedures for custom image processing requests at the JPL Multimission Image Processing Lab (MIPL). The paper focuses on two issues. First, the large search spaces caused by complex plans required the use of hand-encoded control information. To address this in a manner similar to that used by human experts, MVP uses a decomposition-based planner to implement hierarchical/skeletal planning at the higher level and then uses a classical operator-based planner to solve subproblems in contexts defined by the high-level decomposition.
Martínez-Onandi, Nerea; Rivas-Cañedo, Ana; Ávila, Marta; Garde, Sonia; Nuñez, Manuel; Picon, Antonia
2017-09-01
The volatile fraction of 30 Iberian dry-cured hams of different physicochemical characteristics and the effect of high pressure processing (HPP) at 600 MPa on volatile compounds were investigated. According to the analysis of variance carried out on the levels of 122 volatile compounds, intramuscular fat content influenced the levels of 8 benzene compounds, 5 carboxylic acids, 2 ketones, 2 furanones, 1 alcohol, 1 aldehyde and 1 sulfur compound, salt concentration influenced the levels of 1 aldehyde and 1 ketone, salt-in-lean ratio had no effect on volatile compounds, and water activity influenced the levels of 3 sulfur compounds, 1 alcohol and 1 aldehyde. HPP-treated samples of Iberian ham had higher levels of 4 compounds and lower levels of 31 compounds than untreated samples. A higher influence of HPP treatment on volatile compounds than physicochemical characteristics was observed for Iberian ham. Therefore, HPP treatment conditions should be optimized in order to diminish its possible effect on Iberian ham odor and aroma characteristics. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Omura, Yasuhisa; Mori, Yoshiaki; Sato, Shingo; Mallik, Abhijit
2018-04-01
This paper discusses the role of the trap-assisted tunneling process in controlling the ON- and OFF-state current levels and its impact on the current-voltage characteristics of a tunnel field-effect transistor. Significant impacts of high-density traps in the source region are observed and discussed in detail. With regard to recent studies on isoelectronic traps, it is found that the deep-level density must be minimized to suppress the OFF-state leakage current, as is well known, whereas shallow levels can be utilized to control the ON-state current level. A possible mechanism is discussed based on simulation results.
Geerkens, Christian Hubert; Schweiggert, Ralf Martin; Steingass, Herbert; Boguhn, Jeannette; Rodehutscord, Markus; Carle, Reinhold
2013-06-19
Several food processing byproducts were assessed as potential feed and feed supplements. Since their chemical composition revealed a high nutritional potential for ruminants, the Hohenheim in vitro gas test was used to investigate total gas, methane, and volatile fatty acid production as well as protozoal numbers after ruminal digestion of different substrate levels. Processing byproducts used were low- and high-esterified citrus and apple pectins, integral mango peels, and depectinized mango peels. In addition, the effect of a phenolic mango peel extract and pure gallic acid was investigated. The highest decrease in methane production (19%) was achieved by supplementing high levels of low-esterified citrus pectin to the hay-based diet. Interestingly, total gas production was not affected at the same time. Showing valuable nutritional potential, all byproducts exhibited, e.g., high metabolizable energy (11.9-12.8 MJ/kg DM). In conclusion, all byproducts, particularly low-esterified citrus pectin, revealed promising potential as feed and feed supplements.
Automatic Near-Real-Time Image Processing Chain for Very High Resolution Optical Satellite Data
NASA Astrophysics Data System (ADS)
Ostir, K.; Cotar, K.; Marsetic, A.; Pehani, P.; Perse, M.; Zaksek, K.; Zaletelj, J.; Rodic, T.
2015-04-01
In response to the increasing need for automatic and fast satellite image processing, SPACE-SI has developed and implemented a fully automatic image processing chain, STORM, that performs all processing steps from sensor-corrected optical images (level 1) to web-delivered map-ready images and products without operator intervention. Initial development was tailored to high resolution RapidEye images, and all crucial and most challenging parts of the planned full processing chain were developed: a module for automatic image orthorectification based on a physical sensor model and supported by an algorithm for automatic detection of ground control points (GCPs); an atmospheric correction module; a topographic correction module that combines a physical approach with the Minnaert method and utilizes an anisotropic illumination model; and modules for high-level product generation. Various parts of the chain were also implemented for WorldView-2, THEOS, Pleiades, SPOT 6, Landsat 5-8, and PROBA-V. Support for the full-frame sensor currently under development by SPACE-SI is planned. The proposed paper focuses on the adaptation of the STORM processing chain to very high resolution multispectral images. The development concentrated on the sub-module for automatic detection of GCPs. The initially implemented two-step algorithm, which worked only with rasterized vector roads and delivered GCPs with sub-pixel accuracy for the RapidEye images, was improved with the introduction of a third step: super-fine positioning of each GCP based on a reference raster chip. The added step exploits the high spatial resolution of the reference raster to improve the final matching results and to achieve pixel accuracy on very high resolution optical satellite data as well.
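The STORM sub-module itself is not detailed here; as a rough illustration of "super-fine positioning of each GCP based on a reference raster chip", a normalized cross-correlation search around an approximate location might look like the following Python sketch. The function name, window sizes, and the NCC criterion are assumptions, not the SPACE-SI implementation.

```python
import numpy as np

def refine_gcp(image, chip, approx_rc, search=5):
    """Toy 'super-fine positioning' step: slide a small reference chip over a
    search window centred on an approximate GCP location and return the offset
    with the highest normalized cross-correlation. Illustrative only."""
    ch, cw = chip.shape
    r0, c0 = approx_rc
    best, best_rc = -np.inf, approx_rc
    cz = (chip - chip.mean()) / (chip.std() + 1e-12)
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            r, c = r0 + dr, c0 + dc
            win = image[r:r + ch, c:c + cw].astype(float)
            wz = (win - win.mean()) / (win.std() + 1e-12)
            ncc = (cz * wz).mean()
            if ncc > best:
                best, best_rc = ncc, (r, c)
    return best_rc, best

rng = np.random.default_rng(2)
scene = rng.random((200, 200))
chip = scene[83:99, 121:137].copy()            # 16x16 reference chip
print(refine_gcp(scene, chip, (80, 118)))       # recovers (83, 121) with NCC ~ 1.0
```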
Mirroring and beyond: coupled dynamics as a generalized framework for modelling social interactions
Hasson, Uri; Frith, Chris D.
2016-01-01
When people observe one another, behavioural alignment can be detected at many levels, from the physical to the mental. Likewise, when people process the same highly complex stimulus sequences, such as films and stories, alignment is detected in the elicited brain activity. In early sensory areas, shared neural patterns are coupled to the low-level properties of the stimulus (shape, motion, volume, etc.), while in high-order brain areas, shared neural patterns are coupled to high-level aspects of the stimulus, such as meaning. Successful social interactions require such alignments (both behavioural and neural), as communication cannot occur without shared understanding. However, we need to go beyond simple, symmetric (mirror) alignment once we start interacting. Interactions are dynamic processes, which involve continuous mutual adaptation, development of complementary behaviour and division of labour such as leader–follower roles. Here, we argue that interacting individuals are dynamically coupled rather than simply aligned. This broader framework for understanding interactions can encompass both processes by which behaviour and brain activity mirror each other (neural alignment), and situations in which behaviour and brain activity in one participant are coupled (but not mirrored) to the dynamics in the other participant. To apply these more sophisticated accounts of social interactions to the study of the underlying neural processes, we need to develop new experimental paradigms and novel methods of data analysis. PMID:27069044
A Statistical Perspective on Highly Accelerated Testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomas, Edward V.
Highly accelerated life testing has been heavily promoted at Sandia (and elsewhere) as a means to rapidly identify product weaknesses caused by flaws in the product's design or manufacturing process. During product development, a small number of units are forced to fail at high stress. The failed units are then examined to determine the root causes of failure. The identification of the root causes of product failures exposed by highly accelerated life testing can instigate changes to the product's design and/or manufacturing process that result in a product with increased reliability. It is widely viewed that this qualitative use of highly accelerated life testing (often associated with the acronym HALT) can be useful. However, highly accelerated life testing has also been proposed as a quantitative means for "demonstrating" the reliability of a product where unreliability is associated with loss of margin via an identified and dominating failure mechanism. It is assumed that the dominant failure mechanism can be accelerated by changing the level of a stress factor that is assumed to be related to the dominant failure mode. In extreme cases, a minimal number of units (often from a pre-production lot) are subjected to a single highly accelerated stress relative to normal use. If no (or, sufficiently few) units fail at this high stress level, some might claim that a certain level of reliability has been demonstrated (relative to normal use conditions). Underlying this claim are assumptions regarding the level of knowledge associated with the relationship between the stress level and the probability of failure. The primary purpose of this document is to discuss (from a statistical perspective) the efficacy of using accelerated life testing protocols (and, in particular, "highly accelerated" protocols) to make quantitative inferences concerning the performance of a product (e.g., reliability) when in fact there is lack-of-knowledge and uncertainty concerning the assumed relationship between the stress level and performance. In addition, this document contains recommendations for conducting more informative accelerated tests.
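For context on the quantitative claim discussed above, the textbook zero-failure ("success-run") relation between sample size n, confidence level C, and demonstrated reliability at the test condition is (this is the standard formula, not a figure from the report):

R_{\mathrm{demo}} = (1 - C)^{1/n}

so that, for example, 22 units tested without failure demonstrate roughly R = 0.90 at 90% confidence at that stress level. The report's caution is that translating such a figure to normal-use reliability presupposes a known stress-performance relationship, which is precisely where the lack of knowledge lies.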
An Economic comparison of sludge irradiation and alternative methods of municipal sludge treatment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahlstrom, S.B.; McGuire, H.E.
1977-11-01
The relative economics of radiation treatment and other sludge treatment processes are reported. The desirability of radiation treatment is assessed in terms of cost and the quality of the treated sludge product. The major conclusions of this study are: radiation treatment is a high-level disinfection process. Therefore, it should only be considered if high levels of disinfection are required for widespread reuse of the sludge; the handling, transporting and pathogen growback problems associated with disinfected wet sludge make it less attractive for reuse than dry sludge; irradiation of composted sludge produces a product of similar quality at less cost than any thermal treatment and/or flash drying treatment option for situations where a high degree of disinfection is required; and heavy metal concerns, especially cadmium, may limit the reuse of sludge despite high disinfection levels. It is recommended that radiation treatment of sludge, particularly dry sludge, continue to be studied. A sensitivity analysis investigating the optimal conditions under which sludge irradiation operates should be instigated. Furthermore, costs of adding sludge irradiation to existing sludge treatment schemes should be determined.
Comparison of Point Matching Techniques for Road Network Matching
NASA Astrophysics Data System (ADS)
Hackeloeer, A.; Klasing, K.; Krisp, J. M.; Meng, L.
2013-05-01
Map conflation investigates the unique identification of geographical entities across different maps depicting the same geographic region. It involves a matching process which aims to find commonalities between geographic features. A specific subdomain of conflation called Road Network Matching establishes correspondences between road networks of different maps on multiple layers of abstraction, ranging from elementary point locations to high-level structures such as road segments or even subgraphs derived from the induced graph of a road network. The process of identifying points located on different maps by means of geometrical, topological and semantic information is called point matching. This paper provides an overview of various techniques for point matching, which is a fundamental requirement for subsequent matching steps focusing on complex high-level entities in geospatial networks. Common point matching approaches as well as certain combinations of these are described, classified and evaluated. Furthermore, a novel similarity metric called the Exact Angular Index is introduced, which considers both topological and geometrical aspects. The results offer a basis for further research on a bottom-up matching process for complex map features, which must rely upon findings derived from suitable point matching algorithms. In the context of Road Network Matching, reliable point matches provide an immediate starting point for finding matches between line segments describing the geometry and topology of road networks, which may in turn be used for performing a structural high-level matching on the network level.
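The Exact Angular Index itself is not specified in this abstract; as a baseline for what "point matching" means operationally, a generic mutual-nearest-neighbour matcher with a distance threshold can be sketched in Python as below. The function name, threshold, and toy coordinates are assumptions, and this is deliberately simpler than the combined topological/geometrical metric proposed in the paper.

```python
import numpy as np

def match_points(a, b, max_dist=10.0):
    """Toy geometric point matcher: each point in map A is paired with its
    nearest neighbour in map B if the match is mutual and within max_dist.
    A generic baseline, not the Exact Angular Index from the paper."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)   # pairwise distances
    a_to_b = d.argmin(axis=1)
    b_to_a = d.argmin(axis=0)
    return [(i, j) for i, j in enumerate(a_to_b)
            if b_to_a[j] == i and d[i, j] <= max_dist]           # mutual-nearest check

map_a = np.array([[0.0, 0.0], [50.0, 10.0], [120.0, 40.0]])
map_b = map_a + np.array([2.0, -1.5])           # same junctions, small geodetic shift
print(match_points(map_a, map_b))                # [(0, 0), (1, 1), (2, 2)]
```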
Surface Layer Processes And Nocturnal Low Level Jet Development--An Observational Study During Pecan
2016-12-01
Thesis by Michael K. Beall, Naval Postgraduate School, December 2016. …research project and collected high-resolution stable boundary layer data as it evolved through the night. The objective of this study was to use this…
VAB Platform K(2) Lift & Install into Highbay 3
2016-03-07
A 250-ton crane is used to lower the second half of the K-level work platforms for NASA’s Space Launch System (SLS) rocket into High Bay 3 inside the Vehicle Assembly Building at NASA's Kennedy Space Center in Florida. The platform will be secured about 86 feet above the VAB floor, on tower E of the high bay. The K work platforms will provide access to the SLS core stage and solid rocket boosters during processing and stacking operations on the mobile launcher. The Ground Systems Development and Operations Program is overseeing upgrades and modifications to High Bay 3 to support processing of the SLS and Orion spacecraft. A total of 10 levels of new platforms, 20 platform halves altogether, will surround the SLS rocket and Orion spacecraft.
Distracted and confused?: selective attention under load.
Lavie, Nilli
2005-02-01
The ability to remain focused on goal-relevant stimuli in the presence of potentially interfering distractors is crucial for any coherent cognitive function. However, simply instructing people to ignore goal-irrelevant stimuli is not sufficient for preventing their processing. Recent research reveals that distractor processing depends critically on the level and type of load involved in the processing of goal-relevant information. Whereas high perceptual load can eliminate distractor processing, high load on "frontal" cognitive control processes increases distractor processing. These findings provide a resolution to the long-standing early and late selection debate within a load theory of attention that accommodates behavioural and neuroimaging data within a framework that integrates attention research with executive function.
Zhan, Mei; Crane, Matthew M; Entchev, Eugeni V; Caballero, Antonio; Fernandes de Abreu, Diana Andrea; Ch'ng, QueeLim; Lu, Hang
2015-04-01
Quantitative imaging has become a vital technique in biological discovery and clinical diagnostics; a plethora of tools have recently been developed to enable new and accelerated forms of biological investigation. Increasingly, the capacity for high-throughput experimentation provided by new imaging modalities, contrast techniques, microscopy tools, microfluidics and computer controlled systems shifts the experimental bottleneck from the level of physical manipulation and raw data collection to automated recognition and data processing. Yet, despite their broad importance, image analysis solutions to address these needs have been narrowly tailored. Here, we present a generalizable formulation for autonomous identification of specific biological structures that is applicable for many problems. The process flow architecture we present here utilizes standard image processing techniques and the multi-tiered application of classification models such as support vector machines (SVM). These low-level functions are readily available in a large array of image processing software packages and programming languages. Our framework is thus both easy to implement at the modular level and provides specific high-level architecture to guide the solution of more complicated image-processing problems. We demonstrate the utility of the classification routine by developing two specific classifiers as a toolset for automation and cell identification in the model organism Caenorhabditis elegans. To serve a common need for automated high-resolution imaging and behavior applications in the C. elegans research community, we contribute a ready-to-use classifier for the identification of the head of the animal under bright field imaging. Furthermore, we extend our framework to address the pervasive problem of cell-specific identification under fluorescent imaging, which is critical for biological investigation in multicellular organisms or tissues. Using these examples as a guide, we envision the broad utility of the framework for diverse problems across different length scales and imaging methods.
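The published C. elegans classifiers are not included in the abstract; the following Python sketch only illustrates the general pattern of one tier of such a pipeline: extract simple features from image patches, then train an SVM to label them. The features, array shapes, and the "head vs. background" framing are hypothetical stand-ins, not the authors' toolset.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def region_features(patch):
    """Simple intensity/texture features for an image patch (toy stand-in for
    the feature extraction stage of a multi-tiered classification pipeline)."""
    g = np.gradient(patch.astype(float))
    return [patch.mean(), patch.std(), np.hypot(*g).mean()]

rng = np.random.default_rng(3)
# Hypothetical training data: "head" patches are brighter and smoother than background.
heads = [rng.normal(0.8, 0.05, (16, 16)) for _ in range(50)]
background = [rng.normal(0.4, 0.25, (16, 16)) for _ in range(50)]
X = np.array([region_features(p) for p in heads + background])
y = np.array([1] * 50 + [0] * 50)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))   # one classification tier
clf.fit(X, y)
print(clf.predict([region_features(rng.normal(0.8, 0.05, (16, 16)))]))  # expected: [1]
```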
Effects of Crowding and Attention on High-Levels of Motion Processing and Motion Adaptation
Pavan, Andrea; Greenlee, Mark W.
2015-01-01
The motion after-effect (MAE) persists in crowding conditions, i.e., when the adaptation direction cannot be reliably perceived. The MAE originating from complex moving patterns spreads into non-adapted sectors of a multi-sector adapting display (i.e., phantom MAE). In the present study we used global rotating patterns to measure the strength of the conventional and phantom MAEs in crowded and non-crowded conditions, and when attention was directed to the adapting stimulus and when it was diverted away from the adapting stimulus. The results show that: (i) the phantom MAE is weaker than the conventional MAE, for both non-crowded and crowded conditions, and when attention was focused on the adapting stimulus and when it was diverted from it, (ii) conventional and phantom MAEs in the crowded condition are weaker than in the non-crowded condition. Analysis conducted to assess the effect of crowding on high-level of motion adaptation suggests that crowding is likely to affect the awareness of the adapting stimulus rather than degrading its sensory representation, (iii) for high-level of motion processing the attentional manipulation does not affect the strength of either conventional or phantom MAEs, neither in the non-crowded nor in the crowded conditions. These results suggest that high-level MAEs do not depend on attention and that at high-level of motion adaptation the effects of crowding are not modulated by attention. PMID:25615577
Evaluation of the FIR Example using Xilinx Vivado High-Level Synthesis Compiler
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jin, Zheming; Finkel, Hal; Yoshii, Kazutomo
Compared to central processing units (CPUs) and graphics processing units (GPUs), field programmable gate arrays (FPGAs) have major advantages in reconfigurability and performance achieved per watt. This development flow has been augmented with high-level synthesis (HLS) flow that can convert programs written in a high-level programming language to Hardware Description Language (HDL). Using high-level programming languages such as C, C++, and OpenCL for FPGA-based development could allow software developers, who have little FPGA knowledge, to take advantage of the FPGA-based application acceleration. This improves developer productivity and makes the FPGA-based acceleration accessible to hardware and software developers. Xilinx Vivado HLS compiler is a high-level synthesis tool that enables C, C++ and System C specification to be directly targeted into Xilinx FPGAs without the need to create RTL manually. The white paper [1] published recently by Xilinx uses a finite impulse response (FIR) example to demonstrate the variable-precision features in the Vivado HLS compiler and the resource and power benefits of converting floating point to fixed point for a design. To get a better understanding of variable-precision features in terms of resource usage and performance, this report presents the experimental results of evaluating the FIR example using Vivado HLS 2017.1 and a Kintex Ultrascale FPGA. In addition, we evaluated the half-precision floating-point data type against the double-precision and single-precision data type and present the detailed results.
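Outside of an actual HLS flow, the floating- versus fixed-point trade-off the white paper explores can be previewed in software. The Python sketch below quantizes FIR coefficients and input samples to a given number of fractional bits and compares the output against the floating-point reference; in Vivado HLS itself this experiment would instead use arbitrary-precision C/C++ types, and all filter parameters here are illustrative.

```python
import numpy as np

def fir(x, coeffs):
    """Direct-form FIR filter: y[n] = sum_k coeffs[k] * x[n-k]."""
    return np.convolve(x, coeffs)[: len(x)]

def quantize(values, frac_bits):
    """Round to a fixed-point grid with the given number of fractional bits,
    mimicking the float-to-fixed conversion explored in HLS design space."""
    scale = 2 ** frac_bits
    return np.round(np.asarray(values) * scale) / scale

rng = np.random.default_rng(4)
coeffs = np.sinc(np.linspace(-4, 4, 33)) * np.hamming(33)   # toy low-pass FIR taps
x = rng.normal(size=2048)

y_float = fir(x, coeffs)
for bits in (8, 12, 16):
    y_fixed = fir(quantize(x, bits), quantize(coeffs, bits))
    err = np.max(np.abs(y_fixed - y_float))
    print(f"{bits}-bit fractional precision: max abs error = {err:.2e}")
```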
High-efficiency power production from natural gas with carbon capture
NASA Astrophysics Data System (ADS)
Adams, Thomas A.; Barton, Paul I.
A unique electricity generation process uses natural gas and solid oxide fuel cells at high electrical efficiency (74% HHV) and zero atmospheric emissions. The process contains a steam reformer heat-integrated with the fuel cells to provide the heat necessary for reforming. The fuel cells are powered with H2 and avoid carbon deposition issues. 100% CO2 capture is achieved downstream of the fuel cells with very little energy penalty using a multi-stage flash cascade process, where high-purity water is produced as a side product. Alternative reforming techniques such as CO2 reforming, autothermal reforming, and partial oxidation are considered. The capital and energy costs of the proposed process are considered to determine the levelized cost of electricity, which is low when compared to other similar carbon capture-enabled processes.
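The levelized cost of electricity referred to here is conventionally defined (standard definition, not a figure from this paper) as discounted lifetime cost divided by discounted lifetime generation:

LCOE = \frac{\sum_{t=0}^{T} (I_t + M_t + F_t)\,(1+r)^{-t}}{\sum_{t=0}^{T} E_t\,(1+r)^{-t}}

where I_t, M_t and F_t are capital, operating and fuel expenditures in year t, E_t is the electricity generated in year t, and r is the discount rate.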
McRobert, Allistair Paul; Causer, Joe; Vassiliadis, John; Watterson, Leonie; Kwan, James; Williams, Mark A
2013-06-01
It is well documented that adaptations in cognitive processes with increasing skill levels support decision making in multiple domains. We examined skill-based differences in cognitive processes in emergency medicine physicians, and whether performance was significantly influenced by the removal of contextual information related to a patient's medical history. Skilled (n=9) and less skilled (n=9) emergency medicine physicians responded to high-fidelity simulated scenarios under high- and low-context information conditions. Skilled physicians demonstrated higher diagnostic accuracy irrespective of condition, and were less affected by the removal of context-specific information compared with less skilled physicians. The skilled physicians generated more options, and selected better quality options during diagnostic reasoning compared with less skilled counterparts. These cognitive processes were active irrespective of the level of context-specific information presented, although high-context information enhanced understanding of the patients' symptoms resulting in higher diagnostic accuracy. Our findings have implications for scenario design and the manipulation of contextual information during simulation training.
NASA Astrophysics Data System (ADS)
Xie, Yiwei; Geng, Zihan; Zhuang, Leimeng; Burla, Maurizio; Taddei, Caterina; Hoekman, Marcel; Leinse, Arne; Roeloffzen, Chris G. H.; Boller, Klaus-J.; Lowery, Arthur J.
2017-12-01
Integrated optical signal processors have been identified as a powerful engine for optical processing of microwave signals. They enable wideband and stable signal processing operations on miniaturized chips with ultimate control precision. As a promising application, such processors enable photonic implementations of reconfigurable radio frequency (RF) filters with wide design flexibility, large bandwidth, and high-frequency selectivity. This is a key technology for photonic-assisted RF front ends that opens a path to overcoming the bandwidth limitation of current digital electronics. Here, the recent progress of integrated optical signal processors for implementing such RF filters is reviewed. We highlight the use of a low-loss, high-index-contrast stoichiometric silicon nitride waveguide which promises to serve as a practical material platform for realizing high-performance optical signal processors and points toward photonic RF filters with digital signal processing (DSP)-level flexibility, hundreds-GHz bandwidth, MHz-band frequency selectivity, and full system integration on a chip scale.
Delta-Doping at Wafer Level for High Throughput, High Yield Fabrication of Silicon Imaging Arrays
NASA Technical Reports Server (NTRS)
Hoenk, Michael E. (Inventor); Nikzad, Shoulch (Inventor); Jones, Todd J. (Inventor); Greer, Frank (Inventor); Carver, Alexander G. (Inventor)
2014-01-01
Systems and methods for producing high quantum efficiency silicon devices. A silicon MBE has a preparation chamber that provides for cleaning silicon surfaces using an oxygen plasma to remove impurities and a gaseous (dry) NH3 + NF3 room-temperature oxide removal process that leaves the silicon surface hydrogen terminated. Devices can be fabricated on silicon wafers up to 8 inches in diameter using these cleaning procedures and MBE processing, including delta doping.
Research, development and pilot production of high output thin silicon solar cells
NASA Technical Reports Server (NTRS)
Iles, P. A.
1976-01-01
Work was performed to define and apply processes that could lead to high output from thin (2-8 mil) silicon solar cells. The overall problems are outlined, and two satisfactory process sequences were developed. These sequences led to good-output cells over a thickness range extending to just below 4 mils; although the initial contract scope was reduced, one of these sequences proved capable of operating beyond a pilot line level, yielding good-quality 4-6 mil cells of high output.
Chebat, Jean-Charles; Vercollier, Sarah Drissi; Gélinas-Chebat, Claire
2003-06-01
The effects of drama versus lecture format in public service advertisements are studied in a 2 (format) x 2 (malaria vs AIDS) factorial design. Two structural equation models are built (one for each level of self-relevance), showing two distinct patterns. In both low and high self-relevant situations, empathy plays a key role. Under low self-relevance conditions, drama enhances information processing through empathy. Under high self-relevant conditions, the advertisement format has neither significant cognitive nor empathetic effects. The information processing generated by the highly relevant topic affects viewers' empathy, which in turn affects the attitude toward the advertisement and the behavioral intent. As predicted by the Elaboration Likelihood Model, the advertisement format enhances the attitudes and information processing mostly under low self-relevant conditions. Under low self-relevant conditions, empathy enhances information processing while under high self-relevance, the converse relation holds.
Hughey, Justin R; Keen, Justin M; Miller, Dave A; Brough, Chris; McGinity, James W
2012-11-15
The primary aim of the present study was to investigate the ability of hydroxypropyl and methoxyl substituted cellulose ethers to stabilize supersaturated concentrations of itraconazole (ITZ), a poorly water-soluble weak base, after an acid-to-neutral pH transition. A secondary aim of the study was to evaluate the effect of fusion processes on polymer stability and molecular weight. Polymer screening studies showed that stabilization of ITZ supersaturation was related to the molecular weight of the polymer and levels of hydroxypropyl and methoxyl substitution. METHOCEL E50LV (E50LV), which is characterized as having a high melt viscosity, was selected for solid dispersion formulation studies. Hot-melt extrusion processing of E50LV based compositions resulted in high torque loads, low material throughput and polymer degradation. KinetiSol Dispersing, a novel fusion based processing technique, was evaluated as a method to prepare the solid dispersions with reduced levels of polymer degradation. An experimental design revealed that polymer molecular weight was sensitive to shearing forces and high temperatures. However, optimal processing conditions resulted in significantly reduced E50LV degradation relative to HME processing. The technique was effectively utilized to prepare homogenous solid solutions of E50LV and ITZ, characterized as having a single glass transition temperature over a wide range of drug loadings. All prepared compositions provided for a high degree of ITZ supersaturation stabilization. Copyright © 2012 Elsevier B.V. All rights reserved.
Martinez Steele, Euridice; Canella, Daniela Silva; Monteiro, Carlos Augusto
2018-01-01
Objectives To compare ultra-processed food consumption across sociodemographic groups and over time (2007–2008, 2009–2010, 2011–2012) in the USA. Design Cross-sectional study. Setting National Health and Nutrition Examination Survey (NHANES) 2007–2012. Participants All individuals aged ≥2 years with at least one 24-hour dietary recall were included (n=23 847). Main outcome measures Average dietary contribution of ultra-processed foods (expressed as a percentage of the total caloric value of the diet), obtained after classifying all food items according to extent and purpose of industrial food processing using NOVA classification. Data analysis Linear regression was used to evaluate the association between sociodemographic characteristics or NHANES cycles and dietary contribution of ultra-processed foods. Results Almost 60% of calories consumed in the period 2007–2012 came from ultra-processed foods. Consumption of ultra-processed foods decreased with age and income level, was higher for non-Hispanic whites or non-Hispanic blacks than for other race/ethnicity groups and lower for people with college than for lower levels of education, all differences being statistically significant. Overall contribution of ultra-processed foods increased significantly between NHANES cycles (nearly 1% point per cycle), the same being observed among males, adolescents and high school education-level individuals. Conclusions Ultra-processed food consumption in the USA in the period 2007–2012 was overall high, greater among non-Hispanic whites or non-Hispanic blacks, less educated, younger, lower-income strata and increased across time. PMID:29525772
Domínguez, Ximena; Vitiello, Virginia E; Fuccillo, Janna M; Greenfield, Daryl B; Bulotsky-Shearer, Rebecca J
2011-04-01
Research suggests that promoting adaptive approaches to learning early in childhood may help close the gap between advantaged and disadvantaged children. Recent research has identified specific child-level and classroom-level variables that are significantly associated with preschoolers' approaches to learning. However, further research is needed to understand the interactive effects of these variables and determine whether classroom-level variables buffer the detrimental effects of child-level risk variables. Using a largely urban and minority sample (N=275) of preschool children, the present study examined the additive and interactive effects of children's context-specific problem behaviors and classroom process quality dimensions on children's approaches to learning. Teachers rated children's problem behavior and approaches to learning and independent assessors conducted classroom observations to assess process quality. Problem behaviors in structured learning situations and in peer and teacher interactions were found to negatively predict variance in approaches to learning. Classroom process quality domains did not independently predict variance in approaches to learning. Nonetheless, classroom process quality played an important role in these associations; high emotional support buffered the detrimental effects of problem behavior, whereas high instructional support exacerbated them. The findings of this study have important implications for classroom practices aimed at helping children who exhibit problem behaviors. Copyright © 2010 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
High resolution 4-D spectroscopy with sparse concentric shell sampling and FFT-CLEAN.
Coggins, Brian E; Zhou, Pei
2008-12-01
Recent efforts to reduce the measurement time for multidimensional NMR experiments have fostered the development of a variety of new procedures for sampling and data processing. We recently described concentric ring sampling for 3-D NMR experiments, which is superior to radial sampling as input for processing by a multidimensional discrete Fourier transform. Here, we report the extension of this approach to 4-D spectroscopy as Randomized Concentric Shell Sampling (RCSS), where sampling points for the indirect dimensions are positioned on concentric shells, and where random rotations in the angular space are used to avoid coherent artifacts. With simulations, we show that RCSS produces a very low level of artifacts, even with a very limited number of sampling points. The RCSS sampling patterns can be adapted to fine rectangular grids to permit use of the Fast Fourier Transform in data processing, without an apparent increase in the artifact level. These artifacts can be further reduced to the noise level using the iterative CLEAN algorithm developed in radioastronomy. We demonstrate these methods on the high resolution 4-D HCCH-TOCSY spectrum of protein G's B1 domain, using only 1.2% of the sampling that would be needed conventionally for this resolution. The use of a multidimensional FFT instead of the slow DFT for initial data processing and for subsequent CLEAN significantly reduces the calculation time, yielding an artifact level that is on par with the level of the true spectral noise.
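The CLEAN step referred to here is, at heart, iterative peak-subtraction deconvolution borrowed from radioastronomy. A minimal 1-D Python sketch of a Hogbom-style loop is given below; it assumes a known artifact (point-spread) pattern, uses illustrative gain and threshold values, and is far simpler than the authors' 4-D implementation.

```python
import numpy as np

def clean_1d(dirty, psf, gain=0.2, threshold=0.05, max_iter=1000):
    """Hogbom-style CLEAN on a 1-D spectrum: repeatedly locate the strongest
    residual peak, subtract a scaled copy of the PSF centred there, and record
    the subtracted flux as a clean component."""
    residual = dirty.astype(float).copy()
    components = np.zeros_like(residual)
    centre = int(np.argmax(np.abs(psf)))                 # PSF peak position
    for _ in range(max_iter):
        k = int(np.argmax(np.abs(residual)))
        amp = residual[k]
        if abs(amp) < threshold:
            break
        components[k] += gain * amp
        residual -= gain * amp * np.roll(psf, k - centre)  # wrap-around error ignored in this toy
    return components, residual

# Toy example: two true peaks observed through a sinc-like sampling artifact pattern.
n = 256
psf = np.sinc(np.linspace(-8, 8, n))
truth = np.zeros(n); truth[100] = 1.0; truth[170] = 0.5
dirty = np.convolve(truth, psf, mode="same")
components, _ = clean_1d(dirty, psf)
print(np.where(components > 0.1)[0])                     # expected to recover indices 100 and 170
```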
FUSE: a profit maximization approach for functional summarization of biological networks.
Seah, Boon-Siew; Bhowmick, Sourav S; Dewey, C Forbes; Yu, Hanry
2012-03-21
The availability of large-scale curated protein interaction datasets has given rise to the opportunity to investigate higher-level organization and modularity within the protein interaction network (PPI) using graph theoretic analysis. Despite the recent progress, systems-level analysis of PPIs remains a daunting task as it is challenging to make sense out of the deluge of high-dimensional interaction data. Specifically, techniques that automatically abstract and summarize PPIs at multiple resolutions to provide high-level views of their functional landscape are still lacking. We present a novel data-driven and generic algorithm called FUSE (Functional Summary Generator) that generates functional maps of a PPI at different levels of organization, from broad process-process level interactions to in-depth complex-complex level interactions, through a profit maximization approach that exploits the Minimum Description Length (MDL) principle to maximize information gain of the summary graph while satisfying the level-of-detail constraint. We evaluate the performance of FUSE on several real-world PPIs. We also compare FUSE to state-of-the-art graph clustering methods with GO term enrichment by constructing the biological process landscape of the PPIs. Using the AD network as our case study, we further demonstrate the ability of FUSE to quickly summarize the network and identify many different processes and complexes that regulate it. Finally, we study the higher-order connectivity of the human PPI. By simultaneously evaluating interaction and annotation data, FUSE abstracts higher-order interaction maps by reducing the details of the underlying PPI to form a functional summary graph of interconnected functional clusters. Our results demonstrate its effectiveness and superiority over state-of-the-art graph clustering methods with GO term enrichment.
Energy index decomposition methodology at the plant level
NASA Astrophysics Data System (ADS)
Kumphai, Wisit
Scope and method of study. The dissertation explores the use of a high-level energy intensity index as a facility-level energy performance monitoring indicator, with the goal of developing a methodology for an economically based energy performance monitoring system that incorporates production information. The performance measure closely monitors energy usage, production quantity, and product mix and determines the production efficiency as part of an ongoing process that would enable facility managers to keep track of, and in the future predict, when to perform a recommissioning process. The study focuses on the use of the index decomposition methodology and explores several high-level (industry, sector, and country level) energy utilization indexes, namely Additive Log Mean Divisia, Multiplicative Log Mean Divisia, and Additive Refined Laspeyres. One level of index decomposition is performed. The indexes are decomposed into Intensity and Product mix effects. These indexes are tested on a flow shop brick manufacturing plant model in three different climates in the United States. The indexes obtained are analyzed by fitting an ARIMA model and testing for dependency between the two decomposed indexes. Findings and conclusions. The results indicate that the Additive Refined Laspeyres index decomposition methodology is suitable for use in a flow shop, non-air-conditioned production environment as an energy performance monitoring indicator. It is likely that this research can be further expanded into predicting when to perform a recommissioning process.
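The decomposition formulas are not spelled out in the abstract. As a hedged sketch of the kind of two-factor additive decomposition described (total energy E = sum over products of production times intensity, split into an intensity effect and a product-mix effect, with the interaction term allocated evenly as in the "complete" refinement of the Laspeyres index), one might write the following Python; the plant data are hypothetical.

```python
import numpy as np

def refined_laspeyres(p0, i0, p1, i1):
    """Two-factor additive decomposition of the change in energy use
    E = sum_i P_i * I_i between period 0 and period 1, with the joint
    (interaction) term split evenly between the two effects."""
    p0, i0, p1, i1 = map(np.asarray, (p0, i0, p1, i1))
    dp, di = p1 - p0, i1 - i0
    intensity_effect = np.sum(di * p0 + 0.5 * di * dp)
    mix_effect = np.sum(dp * i0 + 0.5 * di * dp)
    total_change = np.sum(p1 * i1) - np.sum(p0 * i0)
    assert np.isclose(intensity_effect + mix_effect, total_change)  # decomposition is complete
    return intensity_effect, mix_effect

# Hypothetical plant with two products: production (units) and energy intensity (kWh/unit).
print(refined_laspeyres(p0=[100, 50], i0=[2.0, 5.0], p1=[120, 40], i1=[1.8, 5.2]))
```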
VAB Platform K(2) Lift & Install into Highbay 3
2016-03-07
Preparations are underway to lift the second half of the K-level work platforms for NASA’s Space Launch System (SLS) rocket up from High Bay 4 inside the Vehicle Assembly Building at NASA's Kennedy Space Center in Florida. The platform will be lifted up and over the transfer aisle and then lowered into High Bay 3 for installation. It will be secured about 86 feet above the VAB floor, on tower E of the high bay. The K work platforms will provide access to the SLS core stage and solid rocket boosters during processing and stacking operations on the mobile launcher. The Ground Systems Development and Operations Program is overseeing upgrades and modifications to High Bay 3 to support processing of the SLS and Orion spacecraft. A total of 10 levels of new platforms, 20 platform halves altogether, will surround the SLS rocket and Orion spacecraft.
VAB Platform K(2) Lift & Install into Highbay 3
2016-03-07
A 250-ton crane is used to lift the second half of the K-level work platforms for NASA’s Space Launch System (SLS) rocket high above the transfer aisle inside the Vehicle Assembly Building at NASA's Kennedy Space Center in Florida. The platform is being lifted up for transfer into High Bay 3 for installation. The platform will be secured about 86 feet above the VAB floor, on tower E of the high bay. The K work platforms will provide access to the SLS core stage and solid rocket boosters during processing and stacking operations on the mobile launcher. The Ground Systems Development and Operations Program is overseeing upgrades and modifications to High Bay 3 to support processing of the SLS and Orion spacecraft. A total of 10 levels of new platforms, 20 platform halves altogether, will surround the SLS rocket and Orion spacecraft.
VAB Platform K(2) Lift & Install into Highbay 3
2016-03-07
A 250-ton crane is used to lift the second half of the K-level work platforms for NASA’s Space Launch System (SLS) rocket up from High Bay 4 inside the Vehicle Assembly Building at NASA's Kennedy Space Center in Florida. The platform will be lifted up and over the transfer aisle and then lowered into High Bay 3 for installation. It will be secured about 86 feet above the VAB floor, on tower E of the high bay. The K work platforms will provide access to the SLS core stage and solid rocket boosters during processing and stacking operations on the mobile launcher. The Ground Systems Development and Operations Program is overseeing upgrades and modifications to High Bay 3 to support processing of the SLS and Orion spacecraft. A total of 10 levels of new platforms, 20 platform halves altogether, will surround the SLS rocket and Orion spacecraft.
VAB Platform K(2) Lift & Install into Highbay 3
2016-03-07
A 250-ton crane is used to lift the second half of the K-level work platforms for NASA’s Space Launch System (SLS) rocket up from High Bay 4 inside the Vehicle Assembly Building at NASA's Kennedy Space Center in Florida. The platform is being lifted up and over the transfer aisle and will be lowered into High Bay 3 for installation. It will be secured about 86 feet above the VAB floor, on tower E of the high bay. The K work platforms will provide access to the SLS core stage and solid rocket boosters during processing and stacking operations on the mobile launcher. The Ground Systems Development and Operations Program is overseeing upgrades and modifications to High Bay 3 to support processing of the SLS and Orion spacecraft. A total of 10 levels of new platforms, 20 platform halves altogether, will surround the SLS rocket and Orion spacecraft.
Imbir, Kamil; Spustek, Tomasz; Bernatowicz, Gabriela; Duda, Joanna; Żygierewicz, Jarosław
2017-01-01
The arousal level of words presented in a Stroop task was found to affect their interference with the required naming of the words’ color. Based on a dual-process approach, we propose that there are two aspects to activation: arousal and subjective significance. Arousal is crucial for automatic processing. Subjective significance is specific to controlled processing. Based on this conceptual model, we predicted that arousal would enhance interference in a Stroop task, as attention would be allocated to the meaning of the inhibited word. High subjective significance should have the opposite effect, i.e., it should enhance the controlled and explicit part of Stroop task processing, which is color naming. We found that response latencies were modulated by the interaction between the arousal and subjective significance levels of words. The longest reaction times were observed for highly arousing words of medium subjective significance level. Arousal shaped event-related potentials in the 150–290 ms time range, while effects of subjective significance were found for the 50–150, 150–290, and 290–530 ms time ranges. PMID:29311872
Gor, Kira
2014-01-01
Second language learners perform worse than native speakers under adverse listening conditions, such as speech in noise (SPIN). No data are available on heritage language speakers’ (early naturalistic interrupted learners’) ability to perceive SPIN. The current study fills this gap and investigates the perception of Russian speech in multi-talker babble noise by the matched groups of high- and low-proficiency heritage speakers (HSs) and late second language learners of Russian who were native speakers of English. The study includes a control group of Russian native speakers. It manipulates the noise level (high and low), and context cloze probability (high and low). The results of the SPIN task are compared to the tasks testing the control of phonology, AXB discrimination and picture-word discrimination, and lexical knowledge, a word translation task, in the same participants. The increased phonological sensitivity of HSs interacted with their ability to rely on top–down processing in sentence integration, use contextual cues, and build expectancies in the high-noise/high-context condition in a bootstrapping fashion. HSs outperformed oral proficiency-matched late second language learners on SPIN task and two tests of phonological sensitivity. The outcomes of the SPIN experiment support both the early naturalistic advantage and the role of proficiency in HSs. HSs’ ability to take advantage of the high-predictability context in the high-noise condition was mitigated by their level of proficiency. Only high-proficiency HSs, but not any other non-native group, took advantage of the high-predictability context that became available with better phonological processing skills in high-noise. The study thus confirms high-proficiency (but not low-proficiency) HSs’ nativelike ability to combine bottom–up and top–down cues in processing SPIN. PMID:25566130
Impact of glycolate anion on aqueous corrosion in DWPF and downstream facilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mickalonis, J. I.
2015-12-15
Glycolic acid is being evaluated as an alternate reductant in the preparation of high level waste for the Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS). During processing, the glycolic acid may not be completely consumed, with small quantities of the glycolate anion being carried forward to other high level waste (HLW) facilities. The impact of the glycolate anion on the corrosion of the materials of construction (MoC) throughout the waste processing system has not been previously evaluated. A literature review revealed that corrosion data were not available for the MoCs in glycolic-bearing solutions applicable to SRS systems. Data on the material compatibility with only glycolic acid or its derivative products were identified; however, data were limited for solutions containing glycolic acid or the glycolate anion.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Guochang; Chen, George, E-mail: gc@ecs.soton.ac.uk, E-mail: sli@mail.xjtu.edu.cn; School of Electronic and Computer Science, University of Southampton, Southampton SO17 1BJ
Charge transport properties in nanodielectrics present different tendencies for different loading concentrations. The exact mechanisms responsible for charge transport in nanodielectrics have not been detailed, especially for high loading concentrations. A charge transport model for nanodielectrics has been proposed based on a quantum tunneling mechanism and dual-level traps. In the model, the thermally assisted hopping (TAH) process for the shallow traps and the tunneling process for the deep traps are considered. For different loading concentrations, the dominant charge transport mechanisms are different. The quantum tunneling mechanism plays a major role in determining the charge conduction in nanodielectrics with high loading concentrations, while for low loading concentrations the thermal hopping mechanism dominates the charge conduction process. The model can explain the observed conductivity behavior in nanodielectrics with different loading concentrations.
Ocular hazards of industrial spot welding.
Chou, B R; Cullen, A P
1996-06-01
Any welding process is perceived to be a radiation hazard to the eye. Site visits were made to an automotive assembly plant to assess the levels of optical radiation and other hazards on the production line. Measurements were taken with a scanning spectro-radiometer and optical power and energy meters at operating working distances at spot welding stations where nonrobotic procedures were performed. Ultraviolet (UV) irradiance levels produced while spot welding with electrodes operating at 10 to 15 kA and 10 to 20 V were several orders of magnitude below recommended safety limits for industrial exposure. Flashes were rich in visible light and infrared (IR) radiation, but not at hazardous levels. The principal hazards in manual spot welding with high-current electrodes are high-speed droplets of molten metal produced by the process. These are easily defended against by wraparound polycarbonate eye shields.
A patient workflow management system built on guidelines.
Dazzi, L.; Fassino, C.; Saracco, R.; Quaglini, S.; Stefanelli, M.
1997-01-01
To provide high quality, shared, and distributed medical care, clinical and organizational issues need to be integrated. This work describes a methodology for developing a Patient Workflow Management System, based on a detailed model of both the medical work process and the organizational structure. We assume that the medical work process is represented through clinical practice guidelines, and that an ontological description of the organization is available. Thus, we developed tools 1) to acquire the medical knowledge contained in a guideline, 2) to translate the derived formalized guideline into a computational formalism, namely a Petri Net, and 3) to maintain different representation levels. The high-level representation guarantees that the Patient Workflow follows the guideline prescriptions, while the low-level representation takes into account the specific characteristics of the organization and allows resources to be allocated for managing a specific patient in daily practice. PMID:9357606
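The guideline-to-net translation itself is not shown in the abstract; the sketch below only illustrates the target formalism, a minimal place/transition Petri net in Python whose place and transition names are hypothetical guideline steps, not the system described in the paper.

```python
class PetriNet:
    """Minimal place/transition net: a transition fires when every input place
    holds at least one token, consuming them and marking its output places."""
    def __init__(self, marking):
        self.marking = dict(marking)          # place -> token count
        self.transitions = {}                 # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) > 0 for p in inputs)

    def fire(self, name):
        inputs, outputs = self.transitions[name]
        if not self.enabled(name):
            raise ValueError(f"{name} is not enabled")
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Hypothetical guideline fragment: a lab test must be reported before therapy starts.
net = PetriNet({"patient_admitted": 1, "lab_slot_free": 1})
net.add_transition("order_lab_test", ["patient_admitted", "lab_slot_free"], ["test_pending"])
net.add_transition("report_result", ["test_pending"], ["result_available"])
net.fire("order_lab_test")
net.fire("report_result")
print(net.marking)   # result_available holds one token; upstream places are empty
```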
Production and Distribution of NASA MODIS Remote Sensing Products
NASA Technical Reports Server (NTRS)
Wolfe, Robert
2007-01-01
The two Moderate Resolution Imaging Spectroradiometer (MODIS) instruments on-board NASA's Earth Observing System (EOS) Terra and Aqua satellites make key measurements for understanding the Earth's terrestrial ecosystems. Global time-series of terrestrial geophysical parameters have been produced from MODIS/Terra for over 7 years and from MODIS/Aqua for more than 4 1/2 years. These well-calibrated instruments, a team of scientists, and large data production, archive and distribution systems have allowed for the development of a new suite of high quality product variables at spatial resolutions as fine as 250 m in support of global change research and natural resource applications. This talk describes the MODIS Science team's products, with a focus on the terrestrial (land) products, the data processing approach and the process for monitoring and improving the product quality. The original MODIS science team was formed in 1989. The team's primary role is the development and implementation of the geophysical algorithms. In addition, the team provided feedback on the design and pre-launch testing of the instrument and helped guide the development of the data processing system. The key challenges the science team dealt with before launch were the development of algorithms for a new instrument and providing guidance for the large and complex multi-discipline processing system. Land, Ocean and Atmosphere discipline teams drove the processing system requirements, particularly in the area of the processing loads and volumes needed to produce daily geophysical maps of the Earth at resolutions as fine as 250 m. The processing system had to handle a large number of data products, large data volumes and processing loads, and complex processing requirements. Prior to MODIS, daily global maps from heritage instruments, such as the Advanced Very High Resolution Radiometer (AVHRR), were not produced at resolutions finer than 5 km. The processing solution evolved into a combination of processing the lower-level (Level 1) products and the higher-level discipline-specific Land and Atmosphere products in the MODIS Science Investigator-led Processing System (SIPS), the MODIS Adaptive Processing System (MODAPS), and archive and distribution of the Land products to the user community by two of NASA's EOS Distributed Active Archive Centers (DAACs). Recently, a part of MODAPS, the Level 1 and Atmosphere Archive and Distribution System (LAADS), took over the role of archiving and distributing the Level 1 and Atmosphere products to the user community.
Li, Shu; Du, Xue-Lei; Li, Qi; Xuan, Yan-Hua; Wang, Yun; Rao, Li-Lin
2016-01-01
Two kinds of probability expressions, verbal and numerical, have been used to characterize the uncertainty that people face. However, the question of whether verbal and numerical probabilities are cognitively processed in a similar manner remains unresolved. From a levels-of-processing perspective, verbal and numerical probabilities may be processed differently during early sensory processing but similarly in later semantic-associated operations. This event-related potential (ERP) study investigated the neural processing of verbal and numerical probabilities in risky choices. The results showed that verbal probability and numerical probability elicited different N1 amplitudes but that verbal and numerical probabilities elicited similar N2 and P3 waveforms in response to different levels of probability (high to low). These results were consistent with a levels-of-processing framework and suggest some internal consistency between the cognitive processing of verbal and numerical probabilities in risky choices. Our findings shed light on possible mechanism underlying probability expression and may provide the neural evidence to support the translation of verbal to numerical probabilities (or vice versa). PMID:26834612
Segregation and persistence of form in the lateral occipital complex.
Ferber, Susanne; Humphrey, G Keith; Vilis, Tutis
2005-01-01
While the lateral occipital complex (LOC) has been shown to be implicated in object recognition, it is unclear whether this brain area is responsive to low-level stimulus-driven features or high-level representational processes. We used scrambled shape-from-motion displays to disambiguate the presence of contours from figure-ground segregation and to measure the strength of the binding process for shapes without contours. We found persisting brain activation in the LOC for scrambled displays after the motion stopped, indicating that this brain area subserves and maintains figure-ground segregation processes, a low-level function in the object processing hierarchy. In our second experiment, we found that the figure-ground segregation process has some form of spatial constancy, indicating top-down influences. The persisting activation after the motion stops suggests an intermediate role in object recognition processes for this brain area and might provide further evidence for the idea that the lateral occipital complex subserves mnemonic functions mediating between iconic and short-term memory.
ERIC Educational Resources Information Center
Murray, Chris
2002-01-01
Looks at equipment, process, and training aspects of backpack vacuum cleaners that facilitate good ergonomics and high productivity levels, focusing on: designing new equipment for bodies and productivity; creating comfortable backpack harnesses; improving the work process via team training; and providing ergonomic training to ensure that backpack…
Sustainability Research: Biofuels, Processes and Supply Chains
Presentation will talk about sustainability at the EPA, summarily covering high level efforts and focusing in more detail on research in metrics for liquid biofuels and tools to evaluate sustainable processes. The presentation will also briefly touch on a new area of research, t...
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Miko, Joseph; Bradley, Damon; Heinzen, Katherine
2008-01-01
NASA Hubble Space Telescope (HST) and upcoming cosmology science missions carry instruments with multiple focal planes populated with many large sensor detector arrays. These sensors are passively cooled to low temperatures for low-light-level (L3) and near-infrared (NIR) signal detection, and the sensor readout electronics circuitry must perform at extremely low noise levels to enable new required science measurements. Because we are at the technological edge of enhanced performance for sensors and readout electronics circuitry, as determined by the thermal noise level at a given temperature in the analog domain, we must find new ways of further compensating for the noise in the digital signal domain. To facilitate this new approach, state-of-the-art sensors are augmented at their array hardware boundaries by non-illuminated reference pixels, which can be used to reduce noise attributed to the sensors. There are a few proposed methodologies for processing, in the digital domain, the information carried by reference pixels, as employed by the Hubble Space Telescope and the James Webb Space Telescope Projects. These methods involve using spatial and temporal statistical parameters derived from boundary reference pixel information to enhance the active (non-reference) pixel signals. To move beyond this heritage methodology, we apply the NASA-developed technology known as the Hilbert-Huang Transform Data Processing System (HHT-DPS) to reference pixel information processing, for utilization in reconfigurable hardware on-board a spaceflight instrument or in post-processing on the ground. The methodology examines signal processing for a 2-D domain, in which high-variance components of the thermal noise are carried by both active and reference pixels, similar to the processing of low-voltage differential signals and the subtraction of a single analog reference pixel from all active pixels on the sensor. Heritage methods using the aforementioned statistical parameters in the digital domain (such as statistical averaging of the reference pixels themselves) zero out the high-variance components, and the counterpart components in the active pixels remain uncorrected. This paper describes how the new methodology was demonstrated through analysis of fast-varying noise components using the HHT-DPS tool developed at NASA and the high-level programming language MATLAB (trademark of MathWorks Inc.), as well as alternative methods for correcting for the high-variance noise component, using HgCdTe sensor data. NASA Hubble Space Telescope data post-processing, as well as on-board instrument data processing from all sensor channels for future deep-space cosmology projects, would benefit from this effort.
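As a rough illustration of the heritage correction that the paper moves beyond, the sketch below subtracts a per-row average of boundary reference pixels from the active pixels of a simulated frame; the frame size, reference-column layout and noise model are hypothetical, and the HHT-DPS decomposition of the reference-pixel series is not reproduced here.

```python
import numpy as np

# Heritage reference-pixel correction (sketch): non-illuminated columns at the
# array boundary are averaged per row and subtracted from the active pixels to
# remove common-mode drift. Frame size, drift and the 4-column borders are
# hypothetical; the paper replaces this averaging with HHT-based processing.
rng = np.random.default_rng(0)
frame = rng.normal(1000.0, 3.0, size=(2048, 2048))        # simulated raw frame
frame += np.linspace(0.0, 50.0, 2048)[:, None]            # slow row-wise drift

reference = np.concatenate([frame[:, :4], frame[:, -4:]], axis=1)  # boundary pixels
active = frame[:, 4:-4]

row_offset = reference.mean(axis=1, keepdims=True)        # per-row statistical average
corrected = active - row_offset

print("row-to-row drift before:", float(np.ptp(active.mean(axis=1))))
print("row-to-row drift after: ", float(np.ptp(corrected.mean(axis=1))))
```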
Inert gas narcosis and the encoding and retrieval of long-term memory.
Kneller, Wendy; Hobbs, Malcolm
2013-12-01
Prior research has indicated that inert gas narcosis (IGN) causes decrements in free recall memory performance and that these result from disruption of either encoding or self-guided search in the retrieval process. In a recent study we provided evidence, using a Levels of Processing approach, for the hypothesis that IGN affects the encoding of new information. The current study sought to replicate these results with an improved methodology. The effect of ambient pressure (111.5-212.8 kPa/1-11 msw vs. 456-516.8 kPa/35-41 msw) and level of processing (shallow vs. deep) on free recall memory performance was measured in 34 divers in the context of an underwater field experiment. Free recall was significantly worse at high ambient pressure compared to low ambient pressure in the deep processing condition (low pressure: M = 5.6; SD = 2.7; high pressure: M = 3.3; SD = 1.4), but not in the shallow processing condition (low pressure: M = 3.9; SD = 1.7; high pressure: M = 3.1; SD = 1.8), indicating that IGN impaired memory ability in the deep processing condition. In the shallow water, deep processing improved recall over shallow processing but, significantly, this effect was eliminated in the deep water. In contrast to our earlier study, this supported the hypothesis that IGN affects the self-guided search of information and not encoding. It is suggested that IGN may affect both encoding and self-guided search, and further research is recommended.
Papera, Massimiliano; Richards, Anne
2016-05-01
Exogenous allocation of attentional resources allows the visual system to encode and maintain representations of stimuli in visual working memory (VWM). However, limits in the processing capacity to allocate resources can prevent unexpected visual stimuli from gaining access to VWM and thereby to consciousness. Using a novel approach to create unbiased stimuli of increasing saliency, we investigated visual processing during a visual search task in individuals who show a high or low propensity to neglect unexpected stimuli. When propensity to inattention is high, ERP recordings show diminished amplification, concomitant with a decrease in theta-band power, during the N1 latency, followed by poor target enhancement during the N2 latency. Furthermore, a later modulation in the P3 latency was also found in individuals showing a propensity to visual neglect, suggesting that more effort is required for conscious maintenance of visual information in VWM. Effects during early stages of processing (N80 and P1) were also observed, suggesting that sensitivity to contrasts and medium-to-high spatial frequencies may be modulated by low-level saliency (although no statistical group differences were found). In accordance with the Global Workspace Model, our data indicate that a lack of resources in low-level processors and visual attention may be responsible for the failure to "ignite" a state of high-level activity spread across several brain areas that is necessary for stimuli to access awareness. These findings may aid in the development of diagnostic tests and interventions to detect or reduce the propensity to visually neglect unexpected stimuli. © 2016 Society for Psychophysiological Research.
NASA Astrophysics Data System (ADS)
Petrucci, B.; Huc, M.; Feuvrier, T.; Ruffel, C.; Hagolle, O.; Lonjou, V.; Desjardins, C.
2015-10-01
For the production of Level-2A products during Sentinel-2 commissioning in the Technical Expertise Center Sentinel-2 at CNES, CESBIO proposed to adapt the Venus Level-2 processor, taking advantage of the similarities between the two missions: image acquisition at high frequency (2 days for Venus, 5 days with the two Sentinel-2 satellites), high resolution (5 m for Venus; 10, 20 and 60 m for Sentinel-2), and image acquisition under constant viewing conditions. The Multi-Mission Atmospheric Correction and Cloud Screening (MACCS) tool was born: based on the CNES Orfeo Toolbox library, the Venμs processor, which was already able to process Formosat2 and VENμS data, was adapted to process Sentinel-2 and Landsat 5-7 data; since then, a great effort has been made reviewing the MACCS software architecture in order to ease the addition of new missions that also acquire images at high resolution, with high revisit and under constant viewing angles, such as Spot4/Take5 and Landsat 8. The recursive and multi-temporal algorithm is implemented in a core that is the same for all the sensors and that combines several processing steps: estimation of the cloud cover, cloud shadow, water, snow and shadow masks, of the water vapor content and aerosol optical thickness, and atmospheric correction. This core is accessed via a number of plug-ins where the specificities of the sensor and of the user project are taken into account: product format, algorithmic processing chaining, and parameters. After a presentation of the MACCS architecture and functionalities, the paper will give an overview of the production facilities integrating MACCS and the associated specificities: interest in this tool has grown worldwide, and MACCS will be used for extensive production within the THEIA land data center and the Agri-S2 project. Finally, the paper will focus on the use of MACCS during the Sentinel-2 In-Orbit Test phase, showing the first Level-2A products.
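The core-plus-plug-in layout described above can be pictured with a small sketch: a sensor-agnostic Level-2 chain driven through a per-sensor plug-in. The class names, step list and Sentinel-2 band aliases below are illustrative assumptions, not the actual MACCS interfaces.

```python
from dataclasses import dataclass
from typing import Callable, Dict

# Sketch of the "one multi-temporal core, one plug-in per sensor" idea.
# Names, steps and parameters are hypothetical, not the real MACCS API.

@dataclass
class SensorPlugin:
    name: str
    band_aliases: Dict[str, str]            # generic band name -> product band
    resolution_m: float
    read_product: Callable[[str], dict]      # sensor-specific reader

class Level2Core:
    """Sensor-agnostic chain: masks -> water vapour -> AOT -> surface reflectance."""

    STEPS = ["cloud_mask", "shadow_mask", "water_snow_mask",
             "water_vapour", "aerosol_optical_thickness", "surface_reflectance"]

    def __init__(self, plugin: SensorPlugin):
        self.plugin = plugin

    def process(self, l1_path: str) -> dict:
        product = self.plugin.read_product(l1_path)          # sensor-specific reading
        results = {"sensor": self.plugin.name, "input": product}
        for step in self.STEPS:
            # Each step would also use the previous date's composite (the
            # multi-temporal recursion); here we only record the chaining order.
            results[step] = f"computed at {self.plugin.resolution_m} m"
        return results

sentinel2 = SensorPlugin("Sentinel-2", {"red": "B4", "nir": "B8"}, 10.0,
                         read_product=lambda path: {"path": path})
print(Level2Core(sentinel2).process("S2A_MSIL1C_example.SAFE"))
```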
Limpo, Teresa; Alves, Rui A; Fidalgo, Raquel
2014-06-01
It is well established that the activity of producing a text is a complex one involving three main cognitive processes: planning, translating, and revising. Although these processes are crucial in skilled writing, beginning and developing writers seem to struggle with them, mainly with planning and revising. This study aimed to trace the development of the high-level writing processes of planning and revising from Grades 4 to 9, and to examine whether these skills predict writing quality in younger and older students (Grades 4-6 vs. 7-9), after controlling for gender, school achievement, age, handwriting fluency, spelling, and text structure. Participants were 381 students from Grades 4 to 9 (age 9-15). Students were asked to plan and write a story and to revise another story by detecting and correcting mechanical and substantive errors. From Grades 4 to 9, we found a growing trend in students' ability to plan and revise, despite the observed decreases and stationary periods from Grades 4 to 5 and 6 to 7. Moreover, whereas younger students' planning and revising skills made no contribution to the quality of their writing, in older students these high-level skills contributed to writing quality above and beyond the control predictors. The findings of this study seem to indicate that, despite the increase in planning and revising ability, these skills are not yet fully operational in school-age children. Indeed, given the contribution of these high-level skills to older students' writing, supplementary instruction and practice should be provided from early on. © 2013 The British Psychological Society.
Processing of Fear and Anger Facial Expressions: The Role of Spatial Frequency
Comfort, William E.; Wang, Meng; Benton, Christopher P.; Zana, Yossi
2013-01-01
Spatial frequency (SF) components encode a portion of the affective value expressed in face images. The aim of this study was to estimate the relative weight of specific frequency spectrum bandwidths on the discrimination of anger and fear facial expressions. The general paradigm was a classification of the expression of faces morphed at varying proportions between anger and fear images, in which SF adaptation and SF subtraction are expected to shift the classification of facial emotion. A series of three experiments was conducted. In Experiment 1 subjects classified morphed face images that were unfiltered or filtered to remove either low (<8 cycles/face), middle (12–28 cycles/face), or high (>32 cycles/face) SF components. In Experiment 2 subjects were adapted to unfiltered or filtered prototypical (non-morphed) fear face images and subsequently classified morphed face images. In Experiment 3 subjects were adapted to unfiltered or filtered prototypical fear face images with the phase component randomized before classifying morphed face images. Removing mid-frequency components from the target images shifted classification toward fear. The same shift was observed under adaptation to unfiltered and to low- and middle-range filtered fear images. However, when the phase spectrum of the same adaptation stimuli was randomized, no adaptation effect was observed. These results suggest that medium SF components support the perception of fear more than anger at both low and high levels of processing. They also suggest that the effect at the high-level processing stage is related more to high-level featural and/or configural information than to the low-level frequency spectrum. PMID:23637687
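A minimal sketch of the band filtering used in Experiment 1 is given below: a radial mask in the Fourier domain keeps or removes spectral bands defined in cycles/face. The synthetic noise image stands in for the face stimuli, and the implementation details are assumptions rather than the authors' exact code.

```python
import numpy as np

# Spatial-frequency band filtering (sketch): remove bands defined in cycles/face
# with a radial mask in the Fourier domain. The random image is a placeholder
# for the face stimuli; one image width is treated as one face width.
N = 256                                         # image is N x N pixels = 1 face width
y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]
radius_cpf = np.sqrt(x ** 2 + y ** 2)           # radial frequency in cycles/face
image = np.random.default_rng(1).normal(size=(N, N))

def filter_band(img, low_cpf=None, high_cpf=None):
    """Keep only frequencies between low_cpf and high_cpf (cycles/face)."""
    spectrum = np.fft.fftshift(np.fft.fft2(img))
    keep = np.ones_like(radius_cpf, dtype=bool)
    if low_cpf is not None:
        keep &= radius_cpf >= low_cpf
    if high_cpf is not None:
        keep &= radius_cpf <= high_cpf
    return np.real(np.fft.ifft2(np.fft.ifftshift(spectrum * keep)))

no_low = filter_band(image, low_cpf=8)            # low SF (<8 c/f) removed
no_mid = image - filter_band(image, 12, 28)       # middle band (12-28 c/f) removed
no_high = filter_band(image, high_cpf=32)         # high SF (>32 c/f) removed
```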
Advances in Proteomics Data Analysis and Display Using an Accurate Mass and Time Tag Approach
Zimmer, Jennifer S.D.; Monroe, Matthew E.; Qian, Wei-Jun; Smith, Richard D.
2007-01-01
Proteomics has recently demonstrated utility in understanding cellular processes on the molecular level as a component of systems biology approaches and for identifying potential biomarkers of various disease states. The large amount of data generated by utilizing high efficiency (e.g., chromatographic) separations coupled to high mass accuracy mass spectrometry for high-throughput proteomics analyses presents challenges related to data processing, analysis, and display. This review focuses on recent advances in nanoLC-FTICR-MS-based proteomics approaches and the accompanying data processing tools that have been developed to display and interpret the large volumes of data being produced. PMID:16429408
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Moor, Emmanuel
The present project investigated Quenching and Partitioning (Q&P) to process cold rolled steels to develop high strength sheet steels that exhibit superior ductility compared to available grades with the intent to allow forming of high strength parts at room temperature to provide an alternative to hot stamping of parts. Hot stamping of boron alloyed steel is the current technology to manufacture thinner gauge sections in automotive structures to guarantee anti-intrusion during collisions whilst improving fuel efficiency by decreasing vehicle weight. Hot stamping involves reheating steel to 900 °C or higher followed by deformation and quenching in the die to produce ultra-high strength materials. Hot stamping requires significant energy to reheat the steel and is less productive than traditional room temperature stamping operations. Stamping at elevated temperature was developed due to the lack of available steels with strength levels of interest possessing sufficient ductility enabling traditional room temperature forming. This process is seeing growing demand within the automotive industry and, given the reheating step in this operation, increased energy consumption during part manufacturing results. The present research program focused on the development of steel grades via Q&P processing that exhibit high strength and formability enabling room temperature forming to replace hot stamping. The main project objective consisted of developing sheet steels exhibiting minimum ultimate tensile strength levels of 1200 MPa in combination with minimum tensile elongation levels of 15 pct using Q&P processing through judicious alloy design and heat treating parameter definition. In addition, detailed microstructural characterization and study of properties, processing and microstructure interrelationships were pursued to develop strategies to further enhance tensile properties. In order to accomplish these objectives, alloy design was conducted towards achieving the target properties. Twelve alloys were designed and laboratory produced involving melting, alloying, casting, hot rolling, and cold rolling to obtain sheet steels of approximately 1 mm thickness. Q&P processing of the samples was then conducted. Target properties were achieved and substantially exceeded demonstrating success in the developed and employed alloy design approaches. The best combinations of tensile properties were found at approximately 1550 MPa with a total elongation in excess of 20 pct clearly showing the potential for replacement of hot stamping to produce advanced high strength steels.
Ultra high speed image processing techniques. [electronic packaging techniques
NASA Technical Reports Server (NTRS)
Anthony, T.; Hoeschele, D. F.; Connery, R.; Ehland, J.; Billings, J.
1981-01-01
Packaging techniques for ultra high speed image processing were developed. These techniques involve the development of a signal feedthrough technique through LSI/VLSI sapphire substrates. This allows the stacking of LSI/VLSI circuit substrates in a 3-dimensional package with greatly reduced length of interconnecting lines between the LSI/VLSI circuits. The reduced parasitic capacitances result in higher LSI/VLSI computational speeds at significantly reduced power consumption levels.
Holzman, Jacob B; Valentiner, David P
2016-03-01
Cognitive-behavioral models highlight the conjoint roles of self-focused attention (SFA), post-event processing (PEP), and performance appraisals in the maintenance of social anxiety. SFA, PEP, and biased performance appraisals are related to social anxiety; however, limited research has examined how SFA affects information processing following social events. The current study examined whether SFA affects the relationships between performance appraisals and PEP following a social event. 137 participants with high (n = 72) or low (n = 65) social anxiety were randomly assigned to conditions of high SFA or low SFA while engaging in a standardized social performance. Subsequent performance appraisals and PEP were measured. Immediate performance appraisals were not affected by SFA. High levels of SFA led to a stronger, inverse relationship between immediate positive performance appraisals and subsequent negative PEP. High levels of SFA also led to a stronger, inverse relationship between negative PEP and changes in positive performance appraisals. Future research should examine whether the current findings, which involved a standardized social performance event, extend to interaction events as well as to a clinical sample. These findings suggest that SFA affects the processing of positive information following a social performance event. SFA is particularly important for understanding how negative PEP undermines positive performance appraisals. Published by Elsevier Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larson, D.E.
1996-09-01
This report provides a collection of annotated bibliographies for documents prepared under the Hanford High-Level Waste Vitrification (Plant) Program. The bibliographies are for documents from Fiscal Year 1983 through Fiscal Year 1995, and include work conducted at or under the direction of the Pacific Northwest National Laboratory. The bibliographies included focus on the technology developed over the specified time period for vitrifying Hanford pretreated high-level waste. The following subject areas are included: General Documentation; Program Documentation; High-Level Waste Characterization; Glass Formulation and Characterization; Feed Preparation; Radioactive Feed Preparation and Glass Properties Testing; Full-Scale Feed Preparation Testing; Equipment Materials Testing; Melter Performance Assessment and Evaluations; Liquid-Fed Ceramic Melter; Cold Crucible Melter; Stirred Melter; High-Temperature Melter; Melter Off-Gas Treatment; Vitrification Waste Treatment; Process, Product Control and Modeling; Analytical; and Canister Closure, Decontamination, and Handling.
Preserved subliminal processing and impaired conscious access in schizophrenia
Del Cul, Antoine; Dehaene, Stanislas; Leboyer, Marion
2006-01-01
Background: Studies of visual backward masking have frequently revealed an elevated masking threshold in schizophrenia. This finding has frequently been interpreted as indicating a low-level visual deficit. However, more recent models suggest that masking may also involve late and higher-level integrative processes, while leaving intact early “bottom-up” visual processing. Objectives: We tested the hypothesis that the backward masking deficit in schizophrenia corresponds to a deficit in the late stages of conscious perception, whereas the subliminal processing of masked stimuli is fully preserved. Method: 28 patients with schizophrenia and 28 normal controls performed two backward-masking experiments. We used Arabic digits as stimuli and varied quasi-continuously the interval with a subsequent mask, thus allowing us to progressively “unmask” the stimuli. We finely quantified their degree of visibility using both objective and subjective measures to evaluate the threshold duration for access to consciousness. We also studied the priming effect caused by the variably masked numbers on a comparison task performed on a subsequently presented and highly visible target number. Results: The threshold delay between digit and mask necessary for the conscious perception of the masked stimulus was longer in patients compared to control subjects. This higher consciousness threshold in patients was confirmed by an objective and a subjective measure, and both measures were highly correlated for patients as well as for controls. However, subliminal priming of masked numbers was effective and identical in patients compared to controls. Conclusions: Access to conscious report of masked stimuli is impaired in schizophrenia, while fast bottom-up processing of the same stimuli, as assessed by subliminal priming, is preserved. These findings suggest a high-level origin of the masking deficit in schizophrenia, although they leave open for further research its exact relation to previously identified bottom-up visual processing abnormalities. PMID:17146006
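One common way to extract such a threshold from the objective measure is to fit a sigmoid to accuracy as a function of the digit-mask delay and read off the delay at a criterion accuracy; the sketch below illustrates this with made-up data, a two-alternative chance level and a 75%-correct criterion, all of which are assumptions rather than the study's actual values.

```python
import numpy as np
from scipy.optimize import curve_fit

# Psychometric threshold estimation (sketch). SOAs and accuracies are invented;
# chance level (50%) and the 75% criterion are assumptions, not study values.
soa_ms = np.array([16, 33, 50, 66, 83, 100, 116, 133], dtype=float)
pct_correct = np.array([0.52, 0.55, 0.63, 0.72, 0.82, 0.90, 0.95, 0.97])

def logistic(soa, midpoint, slope):
    # accuracy rises from 50% (two-alternative chance) towards 100%
    return 0.5 + 0.5 / (1.0 + np.exp(-slope * (soa - midpoint)))

(midpoint, slope), _ = curve_fit(logistic, soa_ms, pct_correct, p0=[70.0, 0.05])
# With this parameterisation the 75%-correct threshold is the fitted midpoint.
print(f"estimated threshold SOA ~ {midpoint:.0f} ms (slope {slope:.3f} per ms)")
```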
ERIC Educational Resources Information Center
Olson, Carol Booth, Comp.
The concept of writing as process has revolutionized the way many view composition, and this book is organized by the stages of that process. Each section begins with a well-known author presenting specific techniques, followed by commentaries which include testimonials, applications of writing techniques, and descriptions of strategy…
The Development of a High-Throughput/Combinatorial Workflow for the Study of Porous Polymer Networks
2012-04-05
Variables studied included polymer network cross-link density, poragen composition, poragen level, and cure temperature; a total of 216 unique compositions were prepared. Changes in opacity of the blends as they cured allowed for the identification of compositional and process variables that enabled the production of porous networks.
Adapting High-Throughput Screening Methods and Assays for Biocontainment Laboratories
Tigabu, Bersabeh; White, E. Lucile; Bostwick, Robert; Tower, Nichole; Bukreyev, Alexander; Rockx, Barry; LeDuc, James W.; Noah, James W.
2015-01-01
High-throughput screening (HTS) has been integrated into the drug discovery process, and multiple assay formats have been widely used in many different disease areas but with limited focus on infectious agents. In recent years, there has been an increase in the number of HTS campaigns using infectious wild-type pathogens rather than surrogates or biochemical pathogen-derived targets. Concurrently, enhanced emerging pathogen surveillance and increased human mobility have resulted in an increase in the emergence and dissemination of infectious human pathogens with serious public health, economic, and social implications at global levels. Adapting the HTS drug discovery process to biocontainment laboratories to develop new drugs for these previously uncharacterized and highly pathogenic agents is now feasible, but HTS at higher biosafety levels (BSL) presents a number of unique challenges. HTS has been conducted with multiple bacterial and viral pathogens at both BSL-2 and BSL-3, and pilot screens have recently been extended to BSL-4 environments for both Nipah and Ebola viruses. These recent successful efforts demonstrate that HTS can be safely conducted at the highest levels of biological containment. This review outlines the specific issues that must be considered in the execution of an HTS drug discovery program for high-containment pathogens. We present an overview of the requirements for HTS in high-level biocontainment laboratories. PMID:25710545
Adaptive Multi-scale PHM for Robotic Assembly Processes
Choo, Benjamin Y.; Beling, Peter A.; LaViers, Amy E.; Marvel, Jeremy A.; Weiss, Brian A.
2017-01-01
Adaptive multiscale prognostics and health management (AM-PHM) is a methodology designed to support PHM in smart manufacturing systems. As a rule, PHM information is not used in high-level decision-making in manufacturing systems. AM-PHM leverages and integrates component-level PHM information with hierarchical relationships across the component, machine, work cell, and production line levels in a manufacturing system. The AM-PHM methodology enables the creation of actionable prognostic and diagnostic intelligence up and down the manufacturing process hierarchy. Decisions are made with the knowledge of the current and projected health state of the system at decision points along the nodes of the hierarchical structure. A description of the AM-PHM methodology with a simulated canonical robotic assembly process is presented. PMID:28664161
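A toy sketch of the hierarchical roll-up is shown below: component-level health indices are propagated up through machine, work cell and production line nodes so that decisions at each level can see the projected health state. Treating the parent's health as the minimum of its children is an assumed aggregation rule, not the paper's specific model.

```python
from dataclasses import dataclass, field
from typing import List

# Propagating component-level PHM information up the hierarchy
# (component -> machine -> work cell -> production line). Using the weakest
# child as the parent's health state is an assumption, not the AM-PHM model.

@dataclass
class Node:
    name: str
    health: float = 1.0                 # 1.0 = fully healthy, 0.0 = failed
    children: List["Node"] = field(default_factory=list)

    def rollup(self) -> float:
        if self.children:
            self.health = min(child.rollup() for child in self.children)
        return self.health

gripper = Node("gripper bearing", health=0.62)
joint = Node("joint 3 motor", health=0.91)
robot = Node("assembly robot", children=[gripper, joint])
cell = Node("work cell A", children=[robot, Node("conveyor", health=0.88)])
line = Node("production line", children=[cell])

print(f"line health = {line.rollup():.2f}")   # 0.62, driven by the worst component
# A decision point at the work-cell node could now trigger maintenance or
# re-route work before the degraded gripper affects the whole line.
```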
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bardal, M.A.; Darwen, N.J.
2008-07-01
Cold war plutonium production led to extensive amounts of radioactive waste stored in tanks at the Department of Energy's (DOE) Hanford site. Bechtel National, Inc. is building the largest nuclear Waste Treatment Plant in the world, located at the Department of Energy's Hanford site, to immobilize the millions of gallons of radioactive waste. The site comprises five main facilities: Pretreatment, High-Level Waste vitrification, Low-Activity Waste vitrification, an Analytical Lab, and the Balance of Facilities. The pretreatment facility will separate the high- and low-level waste. The high-level waste will then proceed to the HLW facility for vitrification. Vitrification is a process of utilizing a melter to mix molten glass with radioactive waste to form a stable product for storage. The melter cave is designated as the High Level Waste Melter Cave Support Handling System (HSH). There are several key processes that occur in the HSH cell that are necessary for vitrification, including feed preparation, mixing, pouring, cooling, and all maintenance and repair of the process equipment. Due to the cell's high radiation levels, remote handling equipment provided by PaR Systems, Inc. is required to install and remove all equipment in the HSH cell. The remote handling crane is composed of a bridge and trolley. The trolley supports a telescoping tube set that rigidly deploys a TR 4350 manipulator arm with seven degrees of freedom. A rotating, extending, and retracting slewing hoist is mounted to the bottom of the trolley and is centered about the telescoping tube set. Both the manipulator and slewer are unique to this cell. The slewer can reach into corners, and the manipulator's cross-pivoting wrist provides better operational dexterity and camera viewing angles at the end of the arm. Since the crane functions will be operated remotely, the entire cell and crane have been modeled with 3-D software. Model simulations have been used to confirm operational and maintenance functional and timing studies throughout the design process. Since no humans can go in or out of the cell, several recovery options have been designed into the system, including jack-down wheels for the bridge and trolley, recovery drums for the manipulator hoist, and a wire rope cable cutter for the slewer jib hoist. If the entire crane fails in cell, the large-diameter cable reel that provides power, signal, and control to the crane can be used to retrieve the crane from the cell into the crane maintenance area. (authors)
Wening, Stefanie; Keith, Nina; Abele, Andrea E
2016-06-01
In negotiations, a focus on interests (why negotiators want something) is key to integrative agreements. Yet, many negotiators spontaneously focus on positions (what they want), with suboptimal outcomes. Our research applies construal-level theory to negotiations and proposes that a high construal level instigates a focus on interests during negotiations which, in turn, positively affects outcomes. In particular, we tested the notion that the effect of construal level on outcomes was mediated by information exchange and judgement accuracy. Finally, we expected the mere mode of presentation of task material to affect construal levels and manipulated construal levels using concrete versus abstract negotiation tasks. In two experiments, participants negotiated in dyads in either a high- or low-construal-level condition. In Study 1, high-construal-level dyads outperformed dyads in the low-construal-level condition; this main effect was mediated by information exchange. Study 2 replicated both the main and mediation effects using judgement accuracy as mediator and additionally yielded a positive effect of a high construal level on a second, more complex negotiation task. These results not only provide empirical evidence for the theoretically proposed link between construal levels and negotiation outcomes but also shed light on the processes underlying this effect. © 2015 The British Psychological Society.
Aircraft Structural Mass Property Prediction Using Conceptual-Level Structural Analysis
NASA Technical Reports Server (NTRS)
Sexstone, Matthew G.
1998-01-01
This paper describes a methodology that extends the use of the Equivalent LAminated Plate Solution (ELAPS) structural analysis code from conceptual-level aircraft structural analysis to conceptual-level aircraft mass property analysis. Mass property analysis in aircraft structures has historically depended upon parametric weight equations at the conceptual design level and Finite Element Analysis (FEA) at the detailed design level. ELAPS allows for the modeling of detailed geometry, metallic and composite materials, and non-structural mass coupled with analytical structural sizing to produce high-fidelity mass property analyses representing fully configured vehicles early in the design process. This capability is especially valuable for unusual configuration and advanced concept development where existing parametric weight equations are inapplicable and FEA is too time consuming for conceptual design. This paper contrasts the use of ELAPS relative to empirical weight equations and FEA. ELAPS modeling techniques are described and the ELAPS-based mass property analysis process is detailed. Examples of mass property stochastic calculations produced during a recent systems study are provided. This study involved the analysis of three remotely piloted aircraft required to carry scientific payloads to very high altitudes at subsonic speeds. Due to the extreme nature of this high-altitude flight regime, few existing vehicle designs are available for use in performance and weight prediction. ELAPS was employed within a concurrent engineering analysis process that simultaneously produces aerodynamic, structural, and static aeroelastic results for input to aircraft performance analyses. The ELAPS models produced for each concept were also used to provide stochastic analyses of wing structural mass properties. The results of this effort indicate that ELAPS is an efficient means to conduct multidisciplinary trade studies at the conceptual design level.
The antioxidant level of Alaska’s wild berries: high, higher and highest
Dinstel, Roxie Rodgers; Cascio, Julie; Koukel, Sonja
2013-01-01
Background: In the last few years, antioxidants have become the stars of the nutritional world. Antioxidants are important in terms of their ability to protect against oxidative cell damage that can lead to conditions, such as Alzheimer’s disease, cancer and heart disease – conditions also linked with chronic inflammation. The antioxidant and anti-inflammatory effects of Alaska’s wild berries may have the potential to help prevent these diseases. Objective: To discover the antioxidant levels of Alaska wild berries and the ways these antioxidant levels translate when preservation methods are applied to the berry. Design: This research centred on both the raw berries and products made from the berries. In the first year, a variety of wild berries were tested to discover their oxygen radical absorption capacity (ORAC) in the raw berries. The second level of the research project processed 4 different berries – blueberries, lingonberries, salmonberries, highbush cranberries – into 8 or 9 products made from these berries. The products were tested for both ORAC as well as specific antioxidants. Results: The Alaska wild berries collected and tested in the first experiment ranged from 3 to 5 times higher in ORAC value than cultivated berries from the lower 48 states. For instance, cultivated blueberries have an ORAC scale of 30. Alaska wild dwarf blueberries measure 85. This is also higher than lower 48 wild blueberries, which had a score of 61. All of the Alaskan berries tested have a level of antioxidant considered nutritionally valuable, ranging from 19 for watermelon berries to 206 for lingonberries on the ORAC scale. With the processed products made from 4 Alaska wild berries, one of the unexpected outcomes of the research was that the berries continued to have levels of antioxidants considered high, despite the effects of commonly used heat-processing techniques. When berries were dehydrated, per gram ORAC values increased. Conclusion: Alaska wild berries have extraordinarily high antioxidant levels. Though cooking lowered the antioxidant level, and adding ingredients such as sugar diluted the antioxidant concentration, products made from berries are high sources of antioxidants. PMID:23977647
Labour Market Flexibility. Report by a High-Level Group of Experts to the Secretary-General.
ERIC Educational Resources Information Center
Organisation for Economic Cooperation and Development, Paris (France).
Economic policies that address current imbalances are a necessary, though not a sufficient, condition for sustainable growth. Labor markets are important for the growth process: as the level of economic activity increases, they function better, and as they function better, the level of economic activity increases further. Labor market…
Nyirenda, D B; Chiwona-Karltun, L; Chitundu, M; Haggblade, S; Brimer, L
2011-03-01
The cassava belt area in Southern Africa is experiencing an unforeseen surge in cassava production, processing and consumption. Little documentation exists on the effects of this surge on processing procedures and on the prevailing levels of cyanogenic glucosides in products consumed and in products commercially available on the market. Risk assessments disclose that effects harmful to the developing central nervous system (CNS) may be observed at a lower exposure than previously anticipated. We interviewed farmers in Zambia and Malawi about their cultivars, processing procedures and perceptions concerning cassava and chemical food safety. Chips, mixed biscuits and flour, procured from households and markets in three regions of Zambia (Luapula-North, Western and Southern) as well as products from the Northern, Central and Southern regions of Malawi, were analyzed for total cyanogenic potential (CNp). Processed products from Luapula showed a low CNp, <10 mg HCN equiv./kg air-dried weight, while samples from Mongu, Western Province, exhibited high levels of CNp, varying from 50 to 290 mg HCN equiv./kg. Even the lowest of these levels is five times higher than the recommended safety level of 10 mg/kg set for cassava flour. Our results call for concerted efforts in promoting gender-oriented processing technologies. Copyright © 2010 Elsevier Ltd. All rights reserved.
A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems
NASA Astrophysics Data System (ADS)
Li, Yu; Oberweis, Andreas
Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets, as the basic methodology for business process modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.
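For readers unfamiliar with the formalism, the sketch below implements a minimal place/transition Petri net and fires two transitions of a toy development process; XML nets extend this basic idea by typing tokens as XML documents. The example net and its naming are illustrative only.

```python
# Minimal place/transition Petri net (sketch). The example process
# ("requirements -> design -> implementation") is illustrative only.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)          # place -> token count
        self.transitions = {}                 # name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition '{name}' is not enabled")
        inputs, outputs = self.transitions[name]
        for place, n in inputs.items():
            self.marking[place] -= n
        for place, n in outputs.items():
            self.marking[place] = self.marking.get(place, 0) + n

net = PetriNet({"requirements": 1, "design": 0, "implementation": 0})
net.add_transition("specify", {"requirements": 1}, {"design": 1})
net.add_transition("build", {"design": 1}, {"implementation": 1})
net.fire("specify")
net.fire("build")
print(net.marking)   # {'requirements': 0, 'design': 0, 'implementation': 1}
```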
Sample Size for Tablet Compression and Capsule Filling Events During Process Validation.
Charoo, Naseem Ahmad; Durivage, Mark; Rahman, Ziyaur; Ayad, Mohamad Haitham
2017-12-01
During solid dosage form manufacturing, the uniformity of dosage units (UDU) is ensured by testing samples at 2 stages, that is, the blend stage and the tablet compression or capsule/powder filling stage. The aim of this work is to propose a sample size selection approach based on quality risk management principles for the process performance qualification (PPQ) and continued process verification (CPV) stages by linking UDU to potential formulation and process risk factors. The Bayes success run theorem appeared to be the most appropriate approach among the various methods considered in this work for computing the sample size for PPQ. The sample sizes for high-risk (reliability level of 99%), medium-risk (reliability level of 95%), and low-risk factors (reliability level of 90%) were estimated to be 299, 59, and 29, respectively. Risk-based assignment of reliability levels was supported by the fact that at a low defect rate, the confidence to detect out-of-specification units decreases, which must be compensated for by an increase in sample size to enhance the confidence in estimation. Based on the level of knowledge acquired during PPQ and the level of knowledge further required to comprehend the process, the sample size for CPV was calculated using Bayesian statistics to accomplish a reduced sampling design for CPV. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
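The reported sample sizes are consistent with the zero-failure (success run) formula n = ln(1 − C)/ln(R) evaluated at a 95% confidence level; that confidence value is inferred here from the quoted numbers rather than stated explicitly, so treat it as an assumption.

```python
import math

# Success run (zero-failure) sample size: smallest n with 1 - R**n >= C,
# i.e. n >= ln(1 - C) / ln(R). The 95% confidence level is an inference that
# reproduces the sample sizes quoted above (299 / 59 / 29), not a stated value.
def success_run_n(reliability: float, confidence: float = 0.95) -> int:
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

for risk, reliability in [("high", 0.99), ("medium", 0.95), ("low", 0.90)]:
    print(f"{risk:>6} risk, R = {reliability:.2f}: n = {success_run_n(reliability)}")
# high risk, R = 0.99: n = 299
# medium risk, R = 0.95: n = 59
# low risk, R = 0.90: n = 29
```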
Monitoring Calcium in Trout Eggs Exposed to Hydrazine.
1981-07-10
differentiation processes (Chapman, 1980). The high level of calcium in the notochord can be attributed to poor circulation which is characteristic of this... healthy muscle of the same individual. The notochord of the 8.0 mg/P group showed a higher calcium level than the control group. The chorion did not... calcium can alter the process. The above would suggest that even if some hydrazine is converted to N2 and thereby produce the gas bubble disease
The purchase decision process and involvement of the elderly regarding nonprescription products.
Reisenwitz, T H; Wimbish, G J
1997-01-01
The elderly or senior citizen is a large and growing market segment that purchases a disproportionate amount of health care products, particularly nonprescription products. This study attempts to examine the elderly's level of involvement (high versus low) and their purchase decision process regarding nonprescription or over-the-counter (OTC) products. Frequencies and percentages are calculated to indicate level of involvement as well as purchase decision behavior. Previous research is critiqued and managerial implications are discussed.
Description of waste pretreatment and interfacing systems dynamic simulation model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garbrick, D.J.; Zimmerman, B.D.
1995-05-01
The Waste Pretreatment and Interfacing Systems Dynamic Simulation Model was created to investigate the required pretreatment facility processing rates for both high-level and low-level waste so that the vitrification of tank waste can be completed according to the milestones defined in the Tri-Party Agreement (TPA). In order to achieve this objective, the processes upstream and downstream of the pretreatment facilities must also be included. The simulation model starts with retrieval of tank waste and ends with vitrification for both low-level and high-level wastes. This report describes the results of three simulation cases: one based on suggested average facility processing rates, one with facility rates determined so that approximately 6 new DSTs are required, and one with facility rates determined so that approximately no new DSTs are required. It appears, based on the simulation results, that reasonable facility processing rates can be selected so that no new DSTs are required by the TWRS program. However, this conclusion must be viewed with respect to the modeling assumptions, described in detail in the report. Also included in the report, in an appendix, are the results of two sensitivity cases: one with glass plant water recycle streams recycled versus not recycled, and one employing the TPA SST retrieval schedule versus a more uniform SST retrieval schedule. Both recycling and retrieval schedule appear to have a significant impact on overall tank usage.
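A toy discrete-time material balance conveys the kind of trade-off the simulation explores: retrieval fills double-shell tank (DST) storage, pretreatment draws it down, and vitrification consumes the pretreated feed, so the chosen facility rates determine how much interim tank space is needed. Every rate, capacity and the horizon below are hypothetical placeholders, not values from the report.

```python
# Toy material balance in the spirit of the dynamic simulation described above.
# All rates, volumes and the simulation horizon are hypothetical.
RETRIEVAL_RATE = 0.8        # Mgal/month retrieved from single-shell tanks
PRETREAT_RATE = 0.6         # Mgal/month processed by pretreatment
VITRIFY_RATE = 0.5          # Mgal/month (feed equivalent) vitrified
DST_CAPACITY = 1.1          # Mgal usable per double-shell tank

dst_inventory = 0.0         # waste waiting for pretreatment
feed_inventory = 0.0        # pretreated waste waiting for vitrification
peak_dst = 0.0

for month in range(1, 121):                       # 10-year horizon
    dst_inventory += RETRIEVAL_RATE
    pretreated = min(PRETREAT_RATE, dst_inventory)
    dst_inventory -= pretreated
    feed_inventory += pretreated
    feed_inventory -= min(VITRIFY_RATE, feed_inventory)
    peak_dst = max(peak_dst, dst_inventory)

print(f"peak stored volume: {peak_dst:.1f} Mgal "
      f"(~{peak_dst / DST_CAPACITY:.0f} DST equivalents)")
# Raising PRETREAT_RATE toward RETRIEVAL_RATE shrinks the peak and hence the
# number of new tanks needed, which is the trade-off such a model explores.
```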
Processes governing transient responses of the deep ocean buoyancy budget to a doubling of CO2
NASA Astrophysics Data System (ADS)
Palter, J. B.; Griffies, S. M.; Hunter Samuels, B. L.; Galbraith, E. D.; Gnanadesikan, A.
2012-12-01
Recent observational analyses suggest there is a temporal trend and high-frequency variability in deep ocean buoyancy over the last twenty years, a phenomenon reproduced even in low-mixing models. Here we use an Earth system model (GFDL's ESM2M) to evaluate the physical processes that influence the buoyancy (and thus steric sea level) budget of the deep ocean in quasi-steady state and under a doubling of CO2. A new suite of model diagnostics allows us to quantitatively assess every process that influences the buoyancy budget and its temporal evolution, revealing surprising dynamics governing both the equilibrium budget and its transient response to climate change. The results suggest that the temporal evolution of the deep ocean contribution to sea level rise is due to a diversity of processes at high latitudes, whose net effect is then advected in the Eulerian mean flow to mid and low latitudes. In the Southern Ocean, a slowdown in convection and a spin-up of the residual mean advection are approximately equal players in the deep steric sea level rise. In the North Atlantic, the region of greatest deep steric sea level variability in our simulations, a decrease in the mixing of cold, dense waters from the marginal seas and a reduction in open ocean convection cause an accumulation of buoyancy in the deep subpolar gyre, which is then advected equatorward.
Brown, Meredith; Kuperberg, Gina R.
2015-01-01
Language and thought dysfunction are central to the schizophrenia syndrome. They are evident in the major symptoms of psychosis itself, particularly as disorganized language output (positive thought disorder) and auditory verbal hallucinations (AVHs), and they also manifest as abnormalities in both high-level semantic and contextual processing and low-level perception. However, the literatures characterizing these abnormalities have largely been separate and have sometimes provided mutually exclusive accounts of aberrant language in schizophrenia. In this review, we propose that recent generative probabilistic frameworks of language processing can provide crucial insights that link these four lines of research. We first outline neural and cognitive evidence that real-time language comprehension and production normally involve internal generative circuits that propagate probabilistic predictions to perceptual cortices — predictions that are incrementally updated based on prediction error signals as new inputs are encountered. We then explain how disruptions to these circuits may compromise communicative abilities in schizophrenia by reducing the efficiency and robustness of both high-level language processing and low-level speech perception. We also argue that such disruptions may contribute to the phenomenology of thought-disordered speech and false perceptual inferences in the language system (i.e., AVHs). This perspective suggests a number of productive avenues for future research that may elucidate not only the mechanisms of language abnormalities in schizophrenia, but also promising directions for cognitive rehabilitation. PMID:26640435
Zinc toxicity among galvanization workers in the iron and steel industry.
El Safty, Amal; El Mahgoub, Khalid; Helal, Sawsan; Abdel Maksoud, Neveen
2008-10-01
Galvanization is the process of coating steel or cast iron pieces with zinc, allowing complete protection against corrosion. The ultimate goal of this work was to assess the effect of occupational exposure to zinc in the galvanization process on different metals in the human body and to detect the association between zinc exposure and its effect on the respiratory system. This study was conducted in 111 subjects in one of the major companies in the iron and steel industry. There were 61 subjects (workers) who were involved in the galvanization process. Fifty adult men were chosen as a matched reference group from other departments of the company. All workers were interviewed using a special questionnaire on occupational history and chest diseases. Ventilatory functions and chest X rays were assessed in all examined workers. Also, complete blood counts were performed, and serum zinc, iron, copper, calcium, and magnesium levels were tested. This study illustrated the relation between zinc exposure in the galvanization process and high zinc levels among exposed workers, which was associated with a high prevalence rate of metal fume fever (MFF) and low blood copper and calcium levels. There was no statistically significant difference between the exposed and control groups with regards to the magnesium level. No long-term effect of metals exposure was detected on ventilatory functions or chest X rays among the exposed workers.
NASA Technical Reports Server (NTRS)
Doshi, Rajkumar S.; Lam, Raymond; White, James E.
1989-01-01
Intermediate and high level processing operations are performed on vision data for the organization of images into more meaningful, higher-level topological representations by means of a region-based route planner (RBRP). The RBRP operates in terrain scenarios where some or most of the terrain is occluded, proceeding without a priori maps on the basis of two-dimensional representations and gradient-and-roughness information. Route planning is accomplished by three successive abstractions and yields a detailed point-by-point path by searching only within the boundaries of relatively small regions.
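The coarse-to-fine idea can be sketched as a two-stage search: first plan over a small region-adjacency graph, then run a detailed grid search restricted to the cells of the selected regions. The map, region boundaries and costs below are hypothetical and only illustrate the layered-abstraction approach, not the RBRP's actual algorithms.

```python
from heapq import heappop, heappush

# Coarse-to-fine, region-based planning (sketch): a route over a region graph
# constrains a detailed grid search. The map and regions are hypothetical.
REGION_GRAPH = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}
CELL_REGION = {(x, y): ("A" if x < 3 else "B" if x < 6 else "C")
               for x in range(9) for y in range(3)}

def region_route(start_region, goal_region):
    """Breadth-first route over the region graph (the high-level abstraction)."""
    frontier, seen = [(start_region, [start_region])], {start_region}
    while frontier:
        region, path = frontier.pop(0)
        if region == goal_region:
            return path
        for nxt in REGION_GRAPH[region]:
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, path + [nxt]))
    return None

def detailed_path(start, goal, allowed_regions):
    """Uniform-cost search over grid cells, restricted to the chosen regions."""
    queue, seen = [(0, start, [start])], set()
    while queue:
        cost, cell, path = heappop(queue)
        if cell == goal:
            return path
        if cell in seen:
            continue
        seen.add(cell)
        x, y = cell
        for nxt in [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]:
            if CELL_REGION.get(nxt) in allowed_regions and nxt not in seen:
                heappush(queue, (cost + 1, nxt, path + [nxt]))
    return None

regions = region_route("A", "C")
print("region route:", regions)
print("point-by-point path:", detailed_path((0, 1), (8, 1), set(regions)))
```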
Geornaras, Ifigenia; Kunene, Nokuthula F.; von Holy, Alexander; Hastings, John W.
1999-01-01
Molecular typing has been used previously to identify and trace dissemination of pathogenic and spoilage bacteria associated with food processing. Amplified fragment length polymorphism (AFLP) is a novel DNA fingerprinting technique which is considered highly reproducible and has high discriminatory power. This technique was used to fingerprint 88 Pseudomonas fluorescens and Pseudomonas putida strains that were previously isolated from plate counts of carcasses at six processing stages and various equipment surfaces and environmental sources of a poultry abattoir. Clustering of the AFLP patterns revealed a high level of diversity among the strains. Six clusters (clusters I through VI) were delineated at an arbitrary Dice coefficient level of 0.65; clusters III (31 strains) and IV (28 strains) were the largest clusters. More than one-half (52.3%) of the strains obtained from carcass samples, which may have represented the resident carcass population, grouped together in cluster III. By contrast, 43.2% of the strains from most of the equipment surfaces and environmental sources grouped together in cluster IV. In most cases, the clusters in which carcass strains from processing stages grouped corresponded to the clusters in which strains from the associated equipment surfaces and/or environmental sources were found. This provided evidence that there was cross-contamination between carcasses and the abattoir environment at the DNA level. The AFLP data also showed that strains were being disseminated from the beginning to the end of the poultry processing operation, since many strains associated with carcasses at the packaging stage were members of the same clusters as strains obtained from carcasses after the defeathering stage. PMID:10473382
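The clustering step can be sketched as follows: binary band-presence fingerprints are compared with the Dice coefficient and grouped by average-linkage (UPGMA-style) clustering with a 0.65 similarity cut-off. The five toy fingerprints are invented for illustration, and the exact linkage method used in the study is an assumption here.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist

# Grouping binary AFLP band patterns with the Dice coefficient and a 0.65
# similarity cut-off (sketch). The fingerprints are made up; real profiles
# contain many more bands and strains.
fingerprints = np.array([
    [1, 1, 0, 1, 0, 1, 1, 0],
    [1, 1, 0, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0, 1, 1],
    [0, 1, 1, 0, 1, 1, 1, 1],
    [1, 0, 0, 1, 0, 1, 1, 0],
], dtype=bool)

# SciPy's "dice" metric is a dissimilarity: 1 - Dice coefficient.
distances = pdist(fingerprints, metric="dice")
tree = linkage(distances, method="average")            # UPGMA-style clustering
clusters = fcluster(tree, t=1 - 0.65, criterion="distance")
print("cluster assignment per strain:", clusters)
```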
NASA Astrophysics Data System (ADS)
Yahya, N. M.; Zahid, M. N. O.
2018-03-01
This study was conducted to assess work-related musculoskeletal disorders (WMDs) among workers in core assembly production at an electronic components manufacturing company located in Pekan, Pahang, Malaysia. The aim of the study was to identify the WMD risk factors and risk levels. A questionnaire survey based on the modified Nordic Musculoskeletal Disorder Questionnaire was distributed to the workers to identify WMD risk factors. Postural analysis was then conducted to measure the corresponding WMD risk levels. The analyses were based on two ergonomic assessment tools: Rapid Upper Limb Assessment (RULA) and Rapid Entire Body Assessment (REBA). The study found that 30 of 36 respondents suffered from WMDs, especially at the shoulders, wrists and lower back. WMD risks were identified in the unloading, pressing and winding processes. In terms of risk level, the REBA and RULA assessments indicated a high risk level for the unloading and pressing processes. Thus, this study established the WMD risk factors and risk levels for core assembly production in an electronic components manufacturing company in a Malaysian setting.
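For summarising many assessments, the grand scores from the two tools are usually mapped onto published action levels; the helper below encodes the commonly quoted bands, but both the band boundaries and the example task scores are assumptions to be checked against the original RULA and REBA scoring sheets rather than values reported in this study.

```python
# Map final RULA/REBA grand scores to commonly cited action levels.
# Band boundaries follow the published tools as usually quoted, but treat them
# as an assumption; the example task scores are illustrative, not study data.
def reba_risk(score: int) -> str:
    if score <= 1:
        return "negligible risk"
    if score <= 3:
        return "low risk - change may be needed"
    if score <= 7:
        return "medium risk - investigate further, change soon"
    if score <= 10:
        return "high risk - investigate and implement change"
    return "very high risk - implement change now"

def rula_risk(score: int) -> str:
    if score <= 2:
        return "acceptable posture"
    if score <= 4:
        return "further investigation - change may be needed"
    if score <= 6:
        return "further investigation - change soon"
    return "investigate and implement change immediately"

for task, reba, rula in [("unloading", 9, 7), ("pressing", 8, 7), ("winding", 5, 4)]:
    print(f"{task:>9}: REBA {reba} -> {reba_risk(reba)}; RULA {rula} -> {rula_risk(rula)}")
```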
Tal, Orna; Rassin, Michal
2018-05-01
Evaluating the impact of the accreditation process, on the basis of achievements, benefits and barriers, from the viewpoint of the leaders of hospital accreditation in comparison to hospital staff members. The implementation of standards for accreditation aims to improve the safety and quality of treatment. Partaking in this process has raised dilemmas regarding the actual benefits of accreditation in relation to the efforts invested in its achievement. Examining the standpoints of the leaders of the process can reflect the influence of this mechanism both on hospital activity and on hospital staff. A survey was conducted among two groups. The first group, the JCI accreditation leaders group, included 35 participants (the steering committee, 15 chapter heads and the hospital management) and 71 participants from the extended headquarters (senior physicians, nurses and administration staff). The second group included 564 hospital personnel from the medical, nursing, alternative medicine, administrative and housekeeping staff. The questionnaire included 46 statements in five fields: the effectiveness of and benefit from the process, weaknesses, barriers, leadership and administration of the accreditation. All the respondents to the survey perceived the process as leverage for implementing significant changes at all levels of the organization. There were high levels of agreement on the benefit of the process regarding its effective and affective contribution: high morale, feelings of accomplishment and team pride, and improvement in communication, cooperation and social cohesion. The weaknesses of the process, including financial costs, bureaucracy, paper overflow and work overload, were awarded relatively low scores. The advantages of the process were ranked high in both groups; the accreditation leaders group rated the benefits of the process significantly higher for the organization as a whole, as well as for the individual. The hospital staff rated significantly higher the contribution of the process at the department level and the opportunity to promote accomplishments that had not been reached in the past. The survey raised organizational discussion, which minimized the objections to the process of change. Focusing on chosen aspects bridged between managers and on-site staff in finding effective solutions. In order to promote successful inter-organizational processes, the hospital requires both leadership and a well-formulated strategic program. The secondary gains from a broad process encompassing the whole organization, such as accreditation, are expressed in the form of social cohesion, cooperation, group pride and high staff morale.
A task-dependent causal role for low-level visual processes in spoken word comprehension.
Ostarek, Markus; Huettig, Falk
2017-08-01
It is well established that the comprehension of spoken words referring to object concepts relies on high-level visual areas in the ventral stream that build increasingly abstract representations. It is much less clear whether basic low-level visual representations are also involved. Here we asked in what task situations low-level visual representations contribute functionally to concrete word comprehension using an interference paradigm. We interfered with basic visual processing while participants performed a concreteness task (Experiment 1), a lexical-decision task (Experiment 2), and a word class judgment task (Experiment 3). We found that visual noise interfered more with concrete versus abstract word processing, but only when the task required visual information to be accessed. This suggests that basic visual processes can be causally involved in language comprehension, but that their recruitment is not automatic and rather depends on the type of information that is required in a given task situation. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Evaluation of intrusion detection technologies for high speed rail grade crossings : final report.
DOT National Transportation Integrated Search
2003-12-01
The rail industry is in the process of developing a prototype system for high speed rail. One of the concerns when using high speed rail is the danger of obstructions on the track. This level of danger is much higher than with traditional railway veh...
Development of High Level Trigger Software for Belle II at SuperKEKB
NASA Astrophysics Data System (ADS)
Lee, S.; Itoh, R.; Katayama, N.; Mineo, S.
2011-12-01
The Belle collaboration has been trying for 10 years to reveal the mystery of the current matter-dominated universe. However, much greater statistics are required to search for New Physics through quantum loops in decays of B mesons. In order to increase the experimental sensitivity, the next-generation B-factory, SuperKEKB, is planned. The design luminosity of SuperKEKB is 8 x 10^35 cm^-2 s^-1, a factor of 40 above KEKB's peak luminosity. At this high luminosity, the level 1 trigger of the Belle II experiment will stream events of 300 kB size at a 30 kHz rate. To reduce the data flow to a manageable level, a high-level trigger (HLT) is needed, which will be implemented using the full offline reconstruction on a large-scale PC farm. There, physics-level event selection is performed, reducing the event rate by a factor of ~10 to a few kHz. To execute the reconstruction the HLT uses the offline event processing framework basf2, which has parallel processing capabilities used for multi-core processing and PC clusters. The event data handling in the HLT is fully object oriented, utilizing ROOT I/O with a new method of object passing over UNIX socket connections. Also under consideration is the use of the HLT output to reduce the pixel detector event size by only saving hits associated with a track, resulting in an additional data reduction of ~100 for the pixel detector. In this contribution, the design and implementation of the Belle II HLT are presented together with a report of preliminary testing results.
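A quick back-of-the-envelope check of the data-flow figures quoted above (300 kB events at 30 kHz into the HLT, reduced by roughly a factor of 10). The numbers come directly from the abstract; the script itself is just illustrative arithmetic.

```python
event_size_bytes = 300e3       # level 1 trigger event size quoted above
level1_rate_hz = 30e3          # level 1 trigger output rate
hlt_reduction_factor = 10      # approximate physics-selection reduction

hlt_input_bandwidth = event_size_bytes * level1_rate_hz   # bytes per second
storage_rate_hz = level1_rate_hz / hlt_reduction_factor   # "a few kHz"

print(f"HLT input: {hlt_input_bandwidth / 1e9:.1f} GB/s")
print(f"Rate to storage: {storage_rate_hz / 1e3:.0f} kHz")
```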
Sohn, Chan Wok; Kim, Hyunae; You, Bo Ram; Kim, Min Jee; Kim, Hyo Jin; Lee, Ji Yeon; Sok, Dai-Eun; Kim, Jin Hee; Lee, Kun Jong; Kim, Mee Ree
2012-05-01
Garlic protects against degenerative diseases such as hyperlipidemia and cardiovascular diseases. However, raw garlic has a strong pungency, which is unpleasant. In this study, we examined the effect of high temperature/high pressure-processed garlic on plasma lipid profiles in rats. Sprague-Dawley rats were fed a normal control diet, a high cholesterol (0.5% cholesterol) diet (HCD) only, or a high cholesterol diet supplemented with 0.5% high temperature/high pressure-processed garlic (HCP) or raw garlic (HCR) for 10 weeks. The body weights of the rats fed the garlic-supplemented diets decreased, mostly because of reduced fat pad weights. Plasma levels of total cholesterol (TC), low-density lipoprotein cholesterol, and triglyceride (TG) in the HCP and HCR groups decreased significantly compared with those in the HCD group. Additionally, fecal TC and TG increased significantly in the HCP and HCR groups. It is notable that no significant differences in plasma or fecal lipid profiles were observed between the HCP and HCR groups. High temperature/high pressure-processed garlic contained a higher amount of S-allyl cysteine than raw garlic (P<.05). The results suggest that high temperature/high pressure-processed garlic may be useful as a functional food to improve lipid profiles.
Sohn, Chan Wok; Kim, Hyunae; You, Bo Ram; Kim, Min Jee; Kim, Hyo Jin; Lee, Ji Yeon; Sok, Dai-Eun; Kim, Jin Hee; Lee, Kun Jong
2012-01-01
Abstract Garlic protects against degenerative diseases such as hyperlipidemia and cardiovascular diseases. However, raw garlic has a strong pungency, which is unpleasant. In this study, we examined the effect of high temperature/high pressure-processed garlic on plasma lipid profiles in rats. Sprague–Dawley rats were fed a normal control diet, a high cholesterol (0.5% cholesterol) diet (HCD) only, or a high cholesterol diet supplemented with 0.5% high temperature/high pressure-processed garlic (HCP) or raw garlic (HCR) for 10 weeks. The body weights of the rats fed the garlic-supplemented diets decreased, mostly because of reduced fat pad weights. Plasma levels of total cholesterol (TC), low-density lipoprotein cholesterol, and triglyceride (TG) in the HCP and HCR groups decreased significantly compared with those in the HCD group. Additionally, fecal TC and TG increased significantly in the HCP and HCR groups. It is notable that no significant differences in plasma or fecal lipid profiles were observed between the HCP and HCR groups. High temperature/high pressure-processed garlic contained a higher amount of S-allyl cysteine than raw garlic (P<.05). The results suggest that high temperature/high pressure-processed garlic may be useful as a functional food to improve lipid profiles. PMID:22404600
Measurement of actinides and strontium-90 in high activity waste
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maxwell, S.L. III; Nelson, M.R.
1994-08-01
The reliable measurement of trace radionuclides in high activity waste is important to support waste processing activities at SRS (F and H Area Waste Tanks, Extended Sludge Processing (ESP) and In-Tank Precipitation (ITP) processing). Separation techniques are needed to remove high levels of gamma activity and alpha/beta interferences prior to analytical measurement. Using new extraction chromatographic resins from EiChrom Industries, Inc., the SRS Central Laboratory has developed new high speed separation methods that enable measurement of neptunium, thorium, uranium, plutonium, americium and strontium-90 in high activity waste solutions. Small particle size resin and applied vacuum are used to reduce analysis times and enhance column performance. Extraction chromatographic resins are easy to use and eliminate the generation of contaminated liquid organic waste.
High level of CA 125 due to large endometrioma.
Phupong, Vorapong; Chen, Orawan; Ultchaswadi, Pornthip
2004-09-01
CA 125 is a tumor-associated antigen. High levels are usually associated with ovarian malignancies, whereas smaller increases in the levels are associated with benign gynecologic conditions. The authors report a high level of CA 125 in a case of large ovarian endometrioma. A 45-year-old nulliparous Thai woman presented with an increase of her abdominal girth for 7 months. Transabdominal ultrasonogram demonstrated a large ovarian cyst and multiple small leiomyoma uteri, and the serum CA 125 level was 1,006 U/ml. The preoperative diagnosis was ovarian cancer with leiomyoma uteri. Exploratory laparotomy was performed. There were a large right ovarian endometrioma, a small left ovarian endometrioma and multiple small leiomyomas. Total abdominal hysterectomy and bilateral salpingo-oophorectomy were performed and histopathology confirmed the diagnosis of endometrioma and leiomyoma. The serum CA 125 level declined to non-detectable levels by the 4th week. She was well at discharge and throughout her 4-week follow-up period. Although a very high level of CA 125 is associated with a malignant process, it can also be found in benign conditions such as a large endometrioma. This case emphasizes the association of high levels of CA 125 with benign gynecologic conditions.
Pérez, Alberto J.; Braga, Roberto; Perles, Ángel; Pérez–Marín, Eva; García-Diego, Fernando J.
2018-01-01
Dynamic laser speckle (DLS) is used as a reliable sensor of activity for all types of materials. Traditional applications are based on high-rate captures (usually greater than 10 frames per second, fps). Even for drying processes in conservation treatments, where there is a high level of activity in the first moments after application and slower activity after some minutes or hours, the process is based on the acquisition of images at the same time rate in moments of high and low activity. In this work, we present an alternative approach to track the drying process of protective layers and other painting conservation processes that take a long time to reduce their levels of activity. We illuminate, using three different wavelength lasers, a temporary protector (cyclododecane) and a varnish, and monitor them using a low fps rate during long-term drying. The results are compared to the traditional method. This work also presents a monitoring method that uses portable equipment. The results demonstrate the feasibility of using the portable device and show the improved sensitivity of dynamic laser speckle when sensing the long-term drying of cyclododecane and varnish in conservation. PMID:29324692
Real-time high-level video understanding using data warehouse
NASA Astrophysics Data System (ADS)
Lienard, Bruno; Desurmont, Xavier; Barrie, Bertrand; Delaigle, Jean-Francois
2006-02-01
High-level video content analysis such as video surveillance is often limited by the computational aspects of automatic image understanding, i.e. it requires huge computing resources for reasoning processes like categorization and huge amounts of data to represent knowledge of objects, scenarios and other models. This article explains how to design and develop a "near real-time adaptive image datamart", used first as a decisional support system for vision algorithms and then as a mass storage system. Using the RDF specification as the storage format for vision algorithm meta-data, we can optimise the data warehouse concepts for video analysis, add processes able to adapt the current model, and pre-process data to speed up queries. In this way, when new data are sent from a sensor to the data warehouse for long-term storage, using remote procedure calls embedded in object-oriented interfaces to simplify queries, they are processed and the in-memory data model is updated. After some processing, possible interpretations of these data can be returned to the sensor. To demonstrate this new approach, we will present typical scenarios applied to this architecture, such as people tracking and event detection in a multi-camera network. Finally we will show how this system becomes a high-semantic data container for external data mining.
[Children with hyperthyroidism due to elevated hCG levels].
Jöbsis, Jasper J; van Trotsenburg, A S Paul; Merks, Johannes H M; Kamp, Gerdine A
2014-01-01
We describe two children with hyperthyroidism secondary to elevated hCG levels: one patient with gestational trophoblastic disease and one patient with choriocarcinoma. hCG resembles other glycoproteins that can lead to hyperthyroidism through TSH receptor activation. Also, through its LH-mimicking effect, hCG can induce high oestradiol levels, resulting in stormy pubertal development. False negative hCG tests due to the high-dose hook effect may complicate the diagnostic process. In patients with antibody-negative thyrotoxicosis, the diagnosis of hCG-induced hyperthyroidism must be considered.
Human-rated Safety Certification of a High Voltage Robonaut Lithium-ion Battery
NASA Technical Reports Server (NTRS)
Jeevarajan, Judith; Yayathi, S.; Johnson, M.; Waligora, T.; Verdeyen, W.
2013-01-01
NASA's rigorous certification process is being followed for the R2 high voltage battery program for use of R2 on the International Space Station (ISS). Rigorous development testing at appropriate levels, up to credible off-nominal conditions, and review of the test data led to design improvements for safety at the virtual cell, cartridge and battery levels. Tests were carried out at all levels to confirm that both hardware and software controls work. Stringent flight acceptance testing of the flight battery will be completed before launch for mission use on the ISS.
A high speed implementation of the random decrement algorithm
NASA Technical Reports Server (NTRS)
Kiraly, L. J.
1982-01-01
The algorithm is useful for measuring net system damping levels in stochastic processes and for the development of equivalent linearized system response models. The algorithm works by summing together all subrecords which occur after a predefined threshold level is crossed. The random decrement signature is normally developed by scanning stored data and adding subrecords together. The high speed implementation of the random decrement algorithm exploits the digital character of sampled data and uses fixed record lengths of 2^n samples to greatly speed up the process. The contribution of each data point to the random decrement signature is calculated only once and in the same sequence as the data were taken. A hardware implementation of the algorithm using random logic is diagrammed, and the process is shown to be limited only by the record size and the threshold crossing frequency of the sampled data. With a hardware cycle time of 200 ns and a 1024-point signature, a threshold crossing frequency of 5000 Hertz can be processed and a stably averaged signature presented in real time.
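A minimal software sketch of the averaging idea behind the random decrement signature described above: every fixed-length subrecord starting at a threshold crossing is summed and the result averaged. The 1024-point record length mirrors the figure in the abstract; the signal generation, threshold value and NumPy implementation are illustrative assumptions rather than the hardware design.

```python
import numpy as np

def random_decrement_signature(x, threshold, n_points=1024):
    """Average of all n_points-long subrecords that start where the
    sampled signal crosses the threshold level (upward crossings)."""
    x = np.asarray(x, dtype=float)
    crossings = np.where((x[:-1] < threshold) & (x[1:] >= threshold))[0] + 1
    crossings = crossings[crossings + n_points <= x.size]   # keep full subrecords only
    if crossings.size == 0:
        raise ValueError("no threshold crossings with a complete subrecord")
    signature = np.zeros(n_points)
    for start in crossings:            # each subrecord contributes once to the sum
        signature += x[start:start + n_points]
    return signature / crossings.size

# usage: noisy 5 Hz oscillation sampled at 1 kHz
rng = np.random.default_rng(0)
t = np.arange(0.0, 60.0, 1e-3)
x = np.sin(2 * np.pi * 5 * t) + 0.3 * rng.standard_normal(t.size)
sig = random_decrement_signature(x, threshold=0.5)
```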
Uranus: a rapid prototyping tool for FPGA embedded computer vision
NASA Astrophysics Data System (ADS)
Rosales-Hernández, Victor; Castillo-Jimenez, Liz; Viveros-Velez, Gilberto; Zuñiga-Grajeda, Virgilio; Treviño Torres, Abel; Arias-Estrada, M.
2007-01-01
The starting point for all successful system development is simulation. Performing high-level simulation of a system can help to identify, isolate and fix design problems. This work presents Uranus, a software tool for simulation and evaluation of image processing algorithms, with support for migrating them to an FPGA environment for algorithm acceleration and embedded processing purposes. The tool includes an integrated library of previously coded operators in software and provides the necessary support to read and display image sequences as well as video files. The user can use the previously compiled soft-operators in a high-level processing chain and code his own operators. In addition to the prototyping tool, Uranus offers an FPGA-based hardware architecture with the same organization as the software prototyping part. The hardware architecture contains a library of FPGA IP cores for image processing that are connected to a PowerPC-based system. The Uranus environment is intended for rapid prototyping of machine vision and migration to an FPGA accelerator platform, and it is distributed for academic purposes.
Fujiyoshi, Tomoharu; Ikami, Takahito; Kikukawa, Koji; Kobayashi, Masato; Takai, Rina; Kozaki, Daisuke; Yamamoto, Atsushi
2018-02-01
The preservatives benzoic acid and sorbic acid are generally quantified with separation techniques, such as HPLC or GC. Here we describe a new method for determining these compounds in processed food samples based on the narrowing of UV-visible spectral band width achieved by derivative processing, which permits more selective identification and determination of target analytes in complex matrices. After samples were purified by microdialysis, UV spectra of the sample solutions were measured and fourth-order derivatives of the spectra were calculated. The amplitude between the maximum and minimum values in the high-order derivative spectrum was used for the determination of benzoic acid and sorbic acid. Benzoic acid and sorbic acid levels in several commercially available processed foods were measured by HPLC and the proposed spectrometric method. The levels obtained by the two methods were highly correlated (r^2 > 0.97) for both preservatives. Copyright © 2017 Elsevier Ltd. All rights reserved.
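A sketch of the derivative-spectrum measurement described above: compute a fourth-order derivative of an absorbance spectrum and take the peak-to-trough amplitude. Savitzky-Golay differentiation and the chosen window/polynomial settings are assumptions for illustration, not the paper's actual processing parameters.

```python
import numpy as np
from scipy.signal import savgol_filter

def fourth_derivative_amplitude(wavelength_nm, absorbance,
                                window_length=15, polyorder=5):
    """Amplitude between the maximum and minimum of the 4th-order
    derivative spectrum, assuming uniformly spaced wavelengths."""
    step = wavelength_nm[1] - wavelength_nm[0]
    d4 = savgol_filter(absorbance, window_length, polyorder,
                       deriv=4, delta=step)
    return d4.max() - d4.min()

# usage with a synthetic Gaussian absorption band near 227 nm
wl = np.arange(200.0, 300.0, 0.5)
spectrum = 0.8 * np.exp(-((wl - 227.0) / 8.0) ** 2)
print(fourth_derivative_amplitude(wl, spectrum))
```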
NASA Astrophysics Data System (ADS)
Lee, Edmund W. J.; Ho, Shirley S.
2015-05-01
The public's level of familiarity with nanotechnology partly determines its acceptance or rejection of the technology. This study examines the differential influence of public attention to science news in the media and of reflective integration on perceived familiarity with nanotechnology among people in the higher and lower socioeconomic status (SES) groups in Singapore. Significant three-way interactions among the education, science news attention, and reflective integration variables were found. Attention to television science news narrowed the gap in perceived familiarity with nanotechnology between the higher and lower SES groups for those who engaged in high elaborative processing. Science newspaper attention, on the other hand, widened the familiarity gap between the higher and lower SES groups among those who engaged in high elaborative processing. A two-way interaction between education and elaborative processing was also found: elaborative processing closed the familiarity gap between the higher and lower SES groups. Theoretical and practical implications are discussed.
On Discipline: The Products and Process.
ERIC Educational Resources Information Center
Frasher, James
1982-01-01
One explanation for unexpectedly low stress levels among assistant principals may lie in "administrative attribution theory." The demand for school discipline by the public, school boards, teachers, and students should induce high levels of stress in assistant principals, because they are usually responsible for discipline enforcement,…
Friedman, Naomi P; Miyake, Akira
2017-01-01
Executive functions (EFs) are high-level cognitive processes, often associated with the frontal lobes, that control lower level processes in the service of goal-directed behavior. They include abilities such as response inhibition, interference control, working memory updating, and set shifting. EFs show a general pattern of shared but distinct functions, a pattern described as "unity and diversity". We review studies of EF unity and diversity at the behavioral and genetic levels, focusing on studies of normal individual differences and what they reveal about the functional organization of these cognitive abilities. In particular, we review evidence that across multiple ages and populations, commonly studied EFs (a) are robustly correlated but separable when measured with latent variables; (b) are not the same as general intelligence or g; (c) are highly heritable at the latent level and seemingly also highly polygenic; and (d) activate both common and specific neural areas and can be linked to individual differences in neural activation, volume, and connectivity. We highlight how considering individual differences at the behavioral and neural levels can add considerable insight to the investigation of the functional organization of the brain, and conclude with some key points about individual differences to consider when interpreting neuropsychological patterns of dissociation. Copyright © 2016 Elsevier Ltd. All rights reserved.
Numerical simulation of the processes in the normal incidence tube for high acoustic pressure levels
NASA Astrophysics Data System (ADS)
Fedotov, E. S.; Khramtsov, I. V.; Kustov, O. Yu.
2016-10-01
Numerical simulation of the acoustic processes in an impedance tube at high levels of acoustic pressure is one way to address the problem of noise suppression by liners. The study used a liner specimen consisting of a single cylindrical Helmholtz resonator. The real and imaginary parts of the liner acoustic impedance and the sound absorption coefficient were evaluated for sound pressure levels of 130, 140 and 150 dB. The numerical simulation used experimental data obtained in an impedance tube with normal-incidence waves. At the first stage of the numerical simulation, the linearized Navier-Stokes equations were used, which describe the imaginary part of the liner impedance well regardless of the sound pressure level. These equations were solved by the finite element method in the COMSOL Multiphysics program in an axisymmetric formulation. At the second stage, the complete Navier-Stokes equations were solved by direct numerical simulation in ANSYS CFX in an axisymmetric formulation. As a result, acceptable agreement between the numerical simulation and experiment was obtained.
Sensor readout detector circuit
Chu, Dahlon D.; Thelen, Jr., Donald C.
1998-01-01
A sensor readout detector circuit is disclosed that is capable of detecting sensor signals down to a few nanoamperes or less in a high (microampere) background noise level. The circuit operates at a very low standby power level and is triggerable by a sensor event signal that is above a predetermined threshold level. A plurality of sensor readout detector circuits can be formed on a substrate as an integrated circuit (IC). These circuits can operate to process data from an array of sensors in parallel, with only data from active sensors being processed for digitization and analysis. This allows the IC to operate at a low power level with a high data throughput for the active sensors. The circuit may be used with many different types of sensors, including photodetectors, capacitance sensors, chemically-sensitive sensors or combinations thereof to provide a capability for recording transient events or for recording data for a predetermined period of time following an event trigger. The sensor readout detector circuit has applications for portable or satellite-based sensor systems.
Sensor readout detector circuit
Chu, D.D.; Thelen, D.C. Jr.
1998-08-11
A sensor readout detector circuit is disclosed that is capable of detecting sensor signals down to a few nanoamperes or less in a high (microampere) background noise level. The circuit operates at a very low standby power level and is triggerable by a sensor event signal that is above a predetermined threshold level. A plurality of sensor readout detector circuits can be formed on a substrate as an integrated circuit (IC). These circuits can operate to process data from an array of sensors in parallel, with only data from active sensors being processed for digitization and analysis. This allows the IC to operate at a low power level with a high data throughput for the active sensors. The circuit may be used with many different types of sensors, including photodetectors, capacitance sensors, chemically-sensitive sensors or combinations thereof to provide a capability for recording transient events or for recording data for a predetermined period of time following an event trigger. The sensor readout detector circuit has applications for portable or satellite-based sensor systems. 6 figs.
Chen, Heng; Chen, Xinying
2018-01-01
Language is a complex adaptive system, but how does it change? For investigating this process, four diachronic Chinese word co-occurrence networks have been built based on texts that were written during the last 2,000 years. By comparing the network indicators that are associated with the hierarchical features in language networks, we learn that the hierarchy of Chinese lexical networks has indeed evolved over time at three different levels. The connections of words at the micro level are continually weakening; the number of words in the meso-level communities has increased significantly; and the network is expanding at the macro level. This means that more and more words tend to be connected to medium-central words and form different communities. Meanwhile, fewer high-central words link these communities into a highly efficient small-world network. Understanding this process may be crucial for understanding the increasing structural complexity of the language system. PMID:29489837
Chen, Heng; Chen, Xinying; Liu, Haitao
2018-01-01
Language is a complex adaptive system, but how does it change? For investigating this process, four diachronic Chinese word co-occurrence networks have been built based on texts that were written during the last 2,000 years. By comparing the network indicators that are associated with the hierarchical features in language networks, we learn that the hierarchy of Chinese lexical networks has indeed evolved over time at three different levels. The connections of words at the micro level are continually weakening; the number of words in the meso-level communities has increased significantly; and the network is expanding at the macro level. This means that more and more words tend to be connected to medium-central words and form different communities. Meanwhile, fewer high-central words link these communities into a highly efficient small-world network. Understanding this process may be crucial for understanding the increasing structural complexity of the language system.
Assessment of Advanced Coal Gasification Processes
NASA Technical Reports Server (NTRS)
McCarthy, John; Ferrall, Joseph; Charng, Thomas; Houseman, John
1981-01-01
This report represents a technical assessment of the following advanced coal gasification processes: AVCO High Throughput Gasification (HTG) Process; Bell Single-Stage High Mass Flux (HMF) Process; Cities Service/Rockwell (CS/R) Hydrogasification Process; Exxon Catalytic Coal Gasification (CCG) Process. Each process is evaluated for its potential to produce SNG from a bituminous coal. In addition to identifying the new technology these processes represent, key similarities/differences, strengths/weaknesses, and potential improvements to each process are identified. The AVCO HTG and the Bell HMF gasifiers share similarities with respect to short residence time (SRT), high throughput rate, slagging operation, and syngas as the initial raw product gas. The CS/R Hydrogasifier is also SRT but is non-slagging and produces a raw gas high in methane content. The Exxon CCG gasifier is a long residence time, catalytic, fluid-bed reactor producing all of the raw product methane in the gasifier. The report makes the following assessments: 1) while each process has significant potential as a coal gasifier, the CS/R and Exxon processes are better suited for SNG production; 2) the Exxon process is the closest to a commercial level for near-term SNG production; and 3) the SRT processes require significant development, including scale-up and turndown demonstration, char processing and/or utilization demonstration, and reactor control and safety features development.
Borsani, Julia; Budde, Claudio O; Porrini, Lucía; Lauxmann, Martin A; Lombardo, Verónica A; Murray, Ricardo; Andreo, Carlos S; Drincovich, María F; Lara, María V
2009-01-01
Peach (Prunus persica L. Batsch) is a climacteric fruit that ripens after harvest, prior to human consumption. Organic acids and soluble sugars contribute to the overall organoleptic quality of fresh peach; thus, the integrated study of the metabolic pathways controlling the levels of these compounds is of great relevance. Therefore, in this work, several metabolites and enzymes involved in carbon metabolism were analysed during the post-harvest ripening of peach fruit cv 'Dixiland'. Depending on the enzyme studied, activity, protein level by western blot, or transcript level by quantitative real time-PCR were analysed. Even though sorbitol did not accumulate at a high level in relation to sucrose at harvest, it was rapidly consumed once the fruit was separated from the tree. During the ripening process, sucrose degradation was accompanied by an increase of glucose and fructose. Specific transcripts encoding neutral invertases (NIs) were up-regulated or down-regulated, indicating differential functions for each putative NI isoform. Phosphoenolpyruvate carboxylase was markedly induced, and may participate as a glycolytic shunt, since the malate level did not increase during post-harvest ripening. The fermentative pathway was highly induced, with increases in both the acetaldehyde level and the enzymes involved in this process. In addition, proteins differentially expressed during the post-harvest ripening process were also analysed. Overall, the present study identified enzymes and pathways operating during the post-harvest ripening of peach fruit, which may contribute to further identification of varieties with altered levels of enzymes/metabolites or in the evaluation of post-harvest treatments to produce fruit of better organoleptic attributes.
Sustained Attention in Mild Alzheimer’s Disease
Berardi, Anna Maria; Parasuraman, Raja; Haxby, James V.
2008-01-01
The vigilance decrement in perceptual sensitivity was examined in 10 patients with mild Alzheimer’s disease (AD) and 20 age-matched controls. A visual high-event rate digit-discrimination task lasting 7.2 min. (six 1.2 min blocks) was presented at different levels of stimulus degradation. Previous studies have shown that sensitivity decrements (d′) over time at high-stimulus degradation result from demands on effortful processing. For all degradation levels, the overall level of vigilance (d′) was lower in AD patients than in controls. All participants showed sensitivity decrement over blocks, with greater decrement at higher degradation levels. AD patients exhibited greater sensitivity decrement over time at the highest degradation level they all could perform relative to control participants. There were no concomitant changes in either response bias (C) or response times. The results indicate that mild AD patients have overall lower levels of vigilance under conditions that require both automatic and effortful processing. Mild AD patients also exhibit a deficit in the maintenance of vigilance over time under effortful processing conditions. Although the sample of AD patients was small, results further suggest that both possible and probable AD patients had greater sensitivity decrement over time at the highest degradation level than did control participants, but only probable AD patients had lower overall levels of vigilance. In the possible AD patients as a group, the decrement in vigilance occurred in the absence of concurrent deficits on standard attentional tasks, such as the Stroop and Trail Making tests, suggesting that deficits in vigilance over time may appear earlier than deficits in selective attention. PMID:15992254
Wang, Xiao-Yu; Li, Shuai; Wang, Guang; Ma, Zheng-Lai; Chuai, Manli; Cao, Liu; Yang, Xuesong
2015-01-01
High glucose levels induced by maternal diabetes can lead to defects in neural crest development during embryogenesis, but the cellular mechanism is still not understood. In this study, we observed a defect in the chick cranial skeleton, especially in the development of the parietal bone, which is derived from cranial neural crest cells (CNCC), in the presence of high glucose levels. In the early chick embryo, we found that high glucose levels inhibited the development of CNCC; cell proliferation, however, was not significantly affected. Nevertheless, apoptotic CNCC increased in the presence of high levels of glucose. In addition, the expression of apoptosis- and autophagy-relevant genes was elevated by high glucose treatment. Next, the application of beads soaked in either an autophagy stimulator (Tunicamycin) or inhibitor (Hydroxychloroquine) functionally demonstrated that autophagy was involved in regulating the production of CNCC in the presence of high glucose levels. Our observations suggest that the ERK pathway, rather than the mTOR pathway, most likely participates in mediating the autophagy induced by high glucose. Taken together, our observations indicate that exposure to high levels of glucose can inhibit the survival of CNCC by affecting cell apoptosis, which might result from dysregulation of the autophagic process. PMID:26671447
Radioactive waste management in France: safety demonstration fundamentals.
Ouzounian, G; Voinis, S; Boissier, F
2012-01-01
The main challenge in development of the safety case for deep geological disposal is associated with the long periods of time over which high- and intermediate-level long-lived wastes remain hazardous. A wide range of events and processes may occur over hundreds of thousands of years. These events and processes are characterised by specific timescales. For example, the timescale for heat generation is much shorter than any geological timescale. Therefore, to reach a high level of reliability in the safety case, it is essential to have a thorough understanding of the sequence of events and processes likely to occur over the lifetime of the repository. It then becomes possible to assess the capability of the repository to fulfil its safety functions. However, due to the long periods of time and the complexity of the events and processes likely to occur, uncertainties related to all processes, data, and models need to be understood and addressed. Assessment is required over the lifetime of the radionuclides contained in the radioactive waste. Copyright © 2012. Published by Elsevier Ltd.
Bourne, Victoria J; Vladeanu, Matei
2011-04-01
Recent neuropsychological studies have attempted to distinguish between different types of anxiety by contrasting patterns of brain organisation or activation; however, lateralisation for processing emotional stimuli has received relatively little attention. This study examines the relationship between strength of lateralisation for the processing of facial expressions of emotion and three measures of anxiety: state anxiety, trait anxiety and social anxiety. Across all six of the basic emotions (anger, disgust, fear, happiness, sadness, surprise) the same patterns of association were found. Participants with high levels of trait anxiety were more strongly lateralised to the right hemisphere for processing facial emotion. In contrast, participants with high levels of self-reported physiological arousal in response to social anxiety were more weakly lateralised to the right hemisphere, or even lateralised to the left hemisphere, for the processing of facial emotion. There were also sex differences in these associations: the relationships were evident for males only. The finding of distinct patterns of lateralisation for trait anxiety and self-reported physiological arousal suggests different neural circuitry for trait and social anxiety. Copyright © 2011. Published by Elsevier Ltd.
ABM Drag_Pass Report Generator
NASA Technical Reports Server (NTRS)
Fisher, Forest; Gladden, Roy; Khanampornpan, Teerapat
2008-01-01
dragREPORT software was developed in parallel with abmREPORT, which is described in the preceding article. Both programs were built on the capabilities created during that process. This tool generates a drag_pass report that summarizes vital information from the MRO aerobraking drag_pass build process, both to facilitate sequence reviews and to provide a high-level summarization of the sequence for mission management. The script extracts information from the ENV, SSF, FRF, SCMFmax, and OPTG files, presenting it in a single, easy-to-check report providing the majority of parameters needed for cross-check and verification as part of the sequence review process. Prior to dragREPORT, all the needed information was spread across a number of different files, each in a different format. This software is a Perl script that extracts vital summarization information and build-process details from a number of source files into a single, concise report format used to aid the MPST sequence review process and to provide a high-level summarization of the sequence for mission management reference. This software could be adapted for future aerobraking missions to provide similar reports, review and summarization information.
NASA Astrophysics Data System (ADS)
Hoeke, R. K.; Reyns, J.; O'Grady, J.; Becker, J. M.; Merrifield, M. A.; Roelvink, J. A.
2016-02-01
Oceanic islands are widely perceived as vulnerable to sea level rise and are characterized by steep nearshore topography and fringing reefs. In such settings, near shore dynamics and (non-tidal) water level variability tends to be dominated by wind-wave processes. These processes are highly sensitive to reef morphology and roughness and to regional wave climate. Thus sea level extremes tend to be highly localized and their likelihood can be expected to change in the future (beyond simple extrapolation of sea level rise scenarios): e.g. sea level rise may increase the effective mean depth of reef crests and flats and ocean acidification and/or increased temperatures may lead to changes in reef structure. The problem is sufficiently complex that analytic or numerical approaches are necessary to estimate current hazards and explore potential future changes. In this study, we evaluate the capacity of several analytic/empirical approaches and phase-averaged and phase-resolved numerical models at sites in the insular tropical Pacific. We consider their ability to predict time-averaged wave setup and instantaneous water level exceedance probability (or dynamic wave run-up) as well as computational cost; where possible, we compare the model results with in situ observations from a number of previous studies. Preliminary results indicate analytic approaches are by far the most computationally efficient, but tend to perform poorly when alongshore straight and parallel morphology cannot be assumed. Phase-averaged models tend to perform well with respect to wave setup in such situations, but are unable to predict processes related to individual waves or wave groups, such as infragravity motions or wave run-up. Phase-resolved models tend to perform best, but come at high computational cost, an important consideration when exploring possible future scenarios. A new approach of combining an unstructured computational grid with a quasi-phase averaged approach (i.e. only phase resolving motions below a frequency cutoff) shows promise as a good compromise between computational efficiency and resolving processes such as wave runup and overtopping in more complex bathymetric situations.
Fayed, Mohamed H; Abdel-Rahman, Sayed I; Alanazi, Fars K; Ahmed, Mahrous O; Tawfeek, Hesham M; Al-Shedfat, Ramadan I
2017-01-01
Application of quality by design (QbD) to the high shear granulation process is critical and requires recognizing the correlation between the granulation process parameters and the properties of the intermediate (granules) and the corresponding final product (tablets). The present work examined the influence of water amount (X1) and wet massing time (X2) as independent process variables on the critical quality attributes of granules and the corresponding tablets using a design of experiments (DoE) technique. A two-factor, three-level (3^2) full factorial design was performed; each of these variables was investigated at three levels to characterize their strength and interaction. The dried granules were analyzed for their size distribution, density and flow pattern. Additionally, the produced tablets were investigated for weight uniformity, crushing strength, friability and percent capping, disintegration time and drug dissolution. A statistically significant impact (p < 0.05) of water amount was identified for granule growth, percent fines, distribution width and flow behavior. Granule density and compressibility were found to be significantly influenced (p < 0.05) by the two operating conditions. Also, water amount had a significant effect (p < 0.05) on tablet weight uniformity, friability and percent capping. Moreover, tablet disintegration time and drug dissolution appeared to be significantly influenced (p < 0.05) by the two process variables. The relationship of the process parameters with the critical quality attributes of the granules and the final tablet product was thus identified and correlated. Ultimately, a judicious selection of process parameters in the high shear granulation process will allow a product of desirable quality to be obtained.
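For illustration, the layout of a two-factor, three-level (3^2) full factorial design like the one described above can be generated as follows; the coded levels and run order are generic, and the actual water amounts and wet massing times used in the study are not reproduced here.

```python
from itertools import product

# coded levels: -1 = low, 0 = centre, +1 = high for each factor
levels = (-1, 0, 1)
design = [{"run": i + 1, "X1_water_amount": x1, "X2_wet_massing_time": x2}
          for i, (x1, x2) in enumerate(product(levels, repeat=2))]

for point in design:        # 3 x 3 = 9 experimental runs
    print(point)
```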
Injection Process Control of the Well at the Hydrodynamic Research of Coalbed
NASA Astrophysics Data System (ADS)
Odnokopylov, I. G.; Galtseva, O. V.; Krasnov, I. Yu; Smirnov, A. O.; Karpov, M. S.; Surzhikova, O. A.; Kuznetsov, V. V.; Li, J.
2017-04-01
This work presents the results of a study of the water injection process into a well during hydrodynamic research using an unregulated high-pressure pump. The injection process must be accompanied by holding certain hydraulic parameters at a constant level for some time. Various options for the use of mechatronic units to automate the water injection process are considered. A scheme for reducing the load on the pump and equipment in the hydraulic system, and also for improving the accuracy of the control system, is shown. Simulation results for the injection process into the well with fixed pressure and flow rate, and recommendations for the use of the proposed schemes depending on the technological process, are given.
Martínez-Onandi, N; Rivas-Cañedo, A; Picon, A; Nuñez, M
2016-12-01
One hundred and three volatile compounds were detected by solid-phase microextraction followed by gas chromatography-mass spectrometry in 30 ripened Serrano dry-cured hams, submitted or not to high pressure processing (HPP) and afterwards held for 5 months at 4°C. The effect of ham physicochemical parameters and HPP (600 MPa for 6 min) on volatile compounds was assessed. Physicochemical parameters primarily affected the levels of acids, alcohols, alkanes, esters, benzene compounds, sulfur compounds and some miscellaneous compounds. Intramuscular fat content was the physicochemical parameter with the most pronounced effect on the volatile fraction of untreated Serrano ham after refrigerated storage, influencing the levels of 38 volatile compounds, while aw, salt content and salt-in-lean ratio respectively influenced the levels of 4, 4 and 5 volatile compounds. HPP treatment affected 21 volatile compounds, resulting in higher levels of alkanes and ketones and lower levels of esters and secondary alcohols, which might affect Serrano ham odor and aroma after 5 months of refrigerated storage. Copyright © 2016 Elsevier Ltd. All rights reserved.
Impaired recognition of faces and objects in dyslexia: Evidence for ventral stream dysfunction?
Sigurdardottir, Heida Maria; Ívarsson, Eysteinn; Kristinsdóttir, Kristjana; Kristjánsson, Árni
2015-09-01
The objective of this study was to establish whether or not dyslexics are impaired at the recognition of faces and other complex nonword visual objects. This would be expected based on a meta-analysis revealing that children and adult dyslexics show functional abnormalities within the left fusiform gyrus, a brain region high up in the ventral visual stream, which is thought to support the recognition of words, faces, and other objects. 20 adult dyslexics (M = 29 years) and 20 matched typical readers (M = 29 years) participated in the study. One dyslexic-typical reader pair was excluded based on Adult Reading History Questionnaire scores and IS-FORM reading scores. Performance was measured on 3 high-level visual processing tasks: the Cambridge Face Memory Test, the Vanderbilt Holistic Face Processing Test, and the Vanderbilt Expertise Test. People with dyslexia are impaired in their recognition of faces and other visually complex objects. Their holistic processing of faces appears to be intact, suggesting that dyslexics may instead be specifically impaired at part-based processing of visual objects. The difficulty that people with dyslexia experience with reading might be the most salient manifestation of a more general high-level visual deficit. (c) 2015 APA, all rights reserved).
Source origin of trace elements in PM from regional background, urban and industrial sites of Spain
NASA Astrophysics Data System (ADS)
Querol, X.; Viana, M.; Alastuey, A.; Amato, F.; Moreno, T.; Castillo, S.; Pey, J.; de la Rosa, J.; Sánchez de la Campa, A.; Artíñano, B.; Salvador, P.; García Dos Santos, S.; Fernández-Patier, R.; Moreno-Grau, S.; Negral, L.; Minguillón, M. C.; Monfort, E.; Gil, J. I.; Inza, A.; Ortega, L. A.; Santamaría, J. M.; Zabalza, J.
Despite their significant role in source apportionment analysis, studies dedicated to the identification of tracer elements of emission sources of atmospheric particulate matter based on air quality data are relatively scarce. The studies describing tracer elements of specific sources currently available in the literature mostly focus on emissions from traffic or large-scale combustion processes (e.g. power plants), but not on specific industrial processes. Furthermore, marker elements are not usually determined at receptor sites, but during emission. In our study, trace element concentrations in PM 10 and PM 2.5 were determined at 33 monitoring stations in Spain throughout the period 1995-2006. Industrial emissions from different forms of metallurgy (steel, stainless steel, copper, zinc), ceramic and petrochemical industries were evaluated. Results obtained at sites with no significant industrial development allowed us to define usual concentration ranges for a number of trace elements in rural and urban background environments. At industrial and traffic hotspots, average trace metal concentrations were highest, exceeding rural background levels by even one order of magnitude in the cases of Cr, Mn, Cu, Zn, As, Sn, W, V, Ni, Cs and Pb. Steel production emissions were linked to high levels of Cr, Mn, Ni, Zn, Mo, Cd, Se and Sn (and probably Pb). Copper metallurgy areas showed high levels of As, Bi, Ga and Cu. Zinc metallurgy was characterised by high levels of Zn and Cd. Glazed ceramic production areas were linked to high levels of Zn, As, Se, Zr, Cs, Tl, Li, Co and Pb. High levels of Ni and V (in association) were tracers of petrochemical plants and/or fuel-oil combustion. At one site under the influence of heavy vessel traffic these elements could be considered tracers (although not exclusively) of shipping emissions. Levels of Zn-Ba and Cu-Sb were relatively high in urban areas when compared with industrialised regions due to tyre and brake abrasion, respectively.
Catalytic reaction processes revealed by scanning probe microscopy. [corrected].
Jiang, Peng; Bao, Xinhe; Salmeron, Miquel
2015-05-19
Heterogeneous catalysis is of great importance for modern society. About 80% of the chemicals are produced by catalytic reactions. Green energy production and utilization as well as environmental protection also need efficient catalysts. Understanding the reaction mechanisms is crucial to improve the existing catalysts and develop new ones with better activity, selectivity, and stability. Three components are involved in one catalytic reaction: reactant, product, and catalyst. The catalytic reaction process consists of a series of elementary steps: adsorption, diffusion, reaction, and desorption. During reaction, the catalyst surface can change at the atomic level, with roughening, sintering, and segregation processes occurring dynamically in response to the reaction conditions. Therefore, it is imperative to obtain atomic-scale information for understanding catalytic reactions. Scanning probe microscopy (SPM) is a very appropriate tool for catalytic research at the atomic scale because of its unique atomic-resolution capability. A distinguishing feature of SPM, compared to other surface characterization techniques, such as X-ray photoelectron spectroscopy, is that there is no intrinsic limitation for SPM to work under realistic reaction conditions (usually high temperature and high pressure). Therefore, since it was introduced in 1981, scanning tunneling microscopy (STM) has been widely used to investigate the adsorption, diffusion, reaction, and desorption processes on solid catalyst surfaces at the atomic level. STM can also monitor dynamic changes of catalyst surfaces during reactions. These invaluable microscopic insights have not only deepened the understanding of catalytic processes, but also provided important guidance for the development of new catalysts. This Account will focus on elementary reaction processes revealed by SPM. First, we will demonstrate the power of SPM to investigate the adsorption and diffusion process of reactants on catalyst surfaces at the atomic level. Then the dynamic processes, including surface reconstruction, roughening, sintering, and phase separation, studied by SPM will be discussed. Furthermore, SPM provides valuable insights toward identifying the active sites and understanding the reaction mechanisms. We also illustrate here how both ultrahigh vacuum STM and high pressure STM provide valuable information, expanding the understanding provided by traditional surface science. We conclude with highlighting remarkable recent progress in noncontact atomic force microscopy (NC-AFM) and inelastic electron tunneling spectroscopy (IETS), and their impact on single-chemical-bond level characterization for catalytic reaction processes in the future.
Pre-PDK block-level PPAC assessment of technology options for sub-7nm high-performance logic
NASA Astrophysics Data System (ADS)
Liebmann, L.; Northrop, G.; Facchini, M.; Riviere Cazaux, L.; Baum, Z.; Nakamoto, N.; Sun, K.; Chanemougame, D.; Han, G.; Gerousis, V.
2018-03-01
This paper describes a rigorous yet flexible standard cell place-and-route flow that is used to quantify block-level power, performance, and area trade-offs driven by two unique cell architectures and their associated design rule differences. The two architectures examined in this paper differ primarily in their use of different power-distribution-networks to achieve the desired circuit performance for high-performance logic designs. The paper shows the importance of incorporating block-level routability experiments in the early phases of design-technology co-optimization by reviewing a series of routing trials that explore different aspects of the technology definition. Since the electrical and physical parameters leading to critical process assumptions and design rules are unique to specific integration schemes and design objectives, it is understood that the goal of this work is not to promote one cell-architecture over another, but rather to convey the importance of exploring critical trade-offs long before the process details of the technology node are finalized to a point where a process design kit can be published.
Meier, Robin; Moll, Klaus-Peter; Krumme, Markus; Kleinebudde, Peter
2017-06-01
In a previous study, a change of the fill-level in the barrel exerted a huge influence on the twin-screw granulation (TSG) process of a high drug loaded, simplified formulation. The present work investigated this influence systematically. The specific feed load (SFL), indicating the mass per revolution, was applied as a surrogate parameter for the fill-level, and its correlation to the real volumetric fill level of an extruder could be demonstrated by a newly developed method. A design of experiments was conducted to examine the combined influence of SFL and screw speed on the process and on critical quality attributes of granules and tablets. The same formulation was granulated at constant liquid level with the same screw configuration and led to distinctively different results when only the fill-level and the screw speed were changed. The power consumption of the extruder increased at higher SFLs, with hardly any influence of screw speed. At low SFL the median residence time was mainly fill-level dependent, and at higher SFL mainly screw speed dependent. Optimal values for the product characteristics were found at medium values of the SFL. Granule size distributions shifted from a mono-modal and narrow shape to broader and even bimodal distributions of larger median granule sizes when exceeding or falling below a certain fill-level. Deviating from the optimum fill-level, the tensile strength of tablets decreased by about 25% and the disintegration times of tablets increased by more than one third. At low fill-levels, material accumulation in front of the kneading zone was detected by pressure measurements and was assumed to be responsible for the unfavorable product performance. At high fill-levels, granule consolidation, due to a higher propensity of contact and the resulting higher material temperature, was held responsible for the inferior product performance. The fill-level was found to be an important factor in the assessment and development of twin-screw granulation processes as it impacted process and product attributes enormously. Copyright © 2017 Elsevier B.V. All rights reserved.
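A small helper illustrating the specific feed load (SFL) used above as a surrogate for barrel fill level, i.e. the mass fed per screw revolution; the unit convention (g per revolution) and the example numbers are assumptions, not values from the study.

```python
def specific_feed_load(feed_rate_kg_per_h, screw_speed_rpm):
    """Mass fed per screw revolution, in grams per revolution."""
    grams_per_minute = feed_rate_kg_per_h * 1000.0 / 60.0
    return grams_per_minute / screw_speed_rpm

# example: 2 kg/h powder feed at 200 rpm -> about 0.17 g per revolution
print(specific_feed_load(2.0, 200))
```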
Tolle, John C; Becker, Calvin L; Califano, Jean C; Chang, Jane L; Gernhardt, Kevin; Napier, James J; Wittenberger, Steven J; Yuan, Judy
2009-01-01
Understanding impurity rejection in a drug substance crystallization process is valuable for establishing purity specifications for the starting materials used in the process. Impurity rejection has been determined for all known ABT-510 impurities and for many of the reasonable & conceivable impurities. Based on this study, a very high purity specification (e.g., > 99.7%) can be set for ABT-510 with a high level of confidence.
Decreased medial prefrontal cortex activation during self-referential processing in bipolar mania.
Herold, Dorrit; Usnich, Tatiana; Spengler, Stephanie; Sajonz, Bastian; Bauer, Michael; Bermpohl, Felix
2017-09-01
Patients with bipolar disorder in mania exhibit symptoms pointing towards altered self-referential processing, such as decreased self-focus, flight of ideas and high distractibility. In depression, the opposite pattern of symptoms has been connected to increased activation of medial prefrontal cortex (mPFC) during self-referential processing. In this study, we hypothesized that (1) patients with mania will exhibit decreased activation in the mPFC during self-referential processing and (2) will be more alexithymic and that levels of alexithymia will correlate negatively with mPFC activation. The neural response to standardized pictures was compared in 14 patients with bipolar I disorder in mania to 14 healthy controls using blood oxygen level dependent contrast magnetic resonance imaging. Participants were asked to indicate with button press during the scanning session for each picture whether the pictures personally related to them or not. Toronto alexithymia scale (TAS) scores were recorded from all participants. In the group analysis, patients with mania exhibited decreased activation in a predefined region of interest in the mPFC during self-referential processing compared to healthy controls. Patients with mania showed significantly higher levels of alexithymia, attributable to difficulties in identifying and describing emotions. Activation in the mPFC correlated negatively with levels of alexithymia. Results presented here should be replicated in a larger group, potentially including unmedicated patients. The finding of decreased mPFC activation during self-referential processing in mania may reflect decreased self-focus and high distractibility. Support for this view comes from the negative correlation between higher alexithymia scores and decreased mPFC activation. These findings represent an opposite clinical and neuroimaging pattern to findings in depression. Copyright © 2017. Published by Elsevier B.V.
Plyler, Patrick N; Reber, Monika Bertges; Kovach, Amanda; Galloway, Elisabeth; Humphrey, Elizabeth
2013-02-01
Multichannel wide dynamic range compression (WDRC) and ChannelFree processing have similar goals yet differ significantly in terms of signal processing. Multichannel WDRC devices divide the input signal into separate frequency bands, determine a separate level within each band, and apply compression in each band based on that level. ChannelFree processing detects the wideband signal level and adjusts gain on the basis of that level up to 20,000 times per second. Although both signal processing strategies are currently available in hearing aids, it is unclear if differences in these signal processing strategies affect the performance and/or preference of the end user. The purpose of the research was to determine the effects of multichannel wide dynamic range compression and ChannelFree processing on performance and/or preference of listeners using open-canal hearing instruments. An experimental study using a repeated-measures design was conducted. Fourteen adult listeners with mild sloping to moderately severe sensorineural hearing loss participated (mean age 67 yr). Participants completed two 5-wk trial periods for each signal processing strategy. Probe-microphone, behavioral, and subjective measures were obtained unaided and aided at the end of each trial period. Behavioral and subjective results for both signal processing strategies were significantly better than unaided results; however, behavioral and subjective results were not significantly different between the signal processing strategies. Multichannel WDRC and ChannelFree processing are both effective signal processing strategies that provide significant benefit for hearing instrument users. Overall preference between the strategies may be related to the degree of hearing loss of the user, high-frequency in-situ levels, and/or acceptance of background noise. American Academy of Audiology.
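As a rough illustration of the difference between the two strategies, the following Python sketch (not from the study; hearing aids implement this in real time on dedicated DSP hardware) applies a simple static compression curve either per band, as in multichannel WDRC, or to the wideband signal. The band edges, compression threshold, and ratio are arbitrary illustrative values, and the wideband branch omits the very fast gain updates (up to 20,000 per second) used by ChannelFree processing.

import numpy as np
from scipy.signal import butter, sosfilt

def band_levels(x, fs, edges):
    """Split the signal into bands and estimate an RMS level (dB) per band."""
    bands, levels = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        b = sosfilt(sos, x)
        bands.append(b)
        levels.append(20 * np.log10(np.sqrt(np.mean(b ** 2)) + 1e-12))
    return bands, levels

def compression_gain_db(level_db, threshold_db=-40.0, ratio=3.0):
    """Above the threshold, output grows only 1/ratio dB per input dB."""
    over = max(level_db - threshold_db, 0.0)
    return -over * (1.0 - 1.0 / ratio)

fs = 16000
t = np.arange(fs) / fs
x = 0.1 * np.sin(2 * np.pi * 500 * t) + 0.5 * np.sin(2 * np.pi * 3000 * t)

# multichannel WDRC: a separate level and gain for each frequency band
bands, levels = band_levels(x, fs, edges=[100, 1000, 4000])
y_multichannel = sum(b * 10 ** (compression_gain_db(L) / 20) for b, L in zip(bands, levels))

# wideband ("channel-free"-style) idea: one broadband level drives a single gain
wideband_level = 20 * np.log10(np.sqrt(np.mean(x ** 2)))
y_wideband = x * 10 ** (compression_gain_db(wideband_level) / 20)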
Monitoring sodium levels in commercially processed and restaurant foods - dataset and webpages.
USDA-ARS?s Scientific Manuscript database
Nutrient Data Laboratory (NDL), Agriculture Research Service (ARS) in collaboration with Food Surveys Research Group, ARS, and the Centers for Disease Control and Prevention has been monitoring commercially processed and restaurant foods in the United States since 2010. About 125 highly consumed, s...
Materials and Processes Technology.
ERIC Educational Resources Information Center
Ritz, John M.; And Others
This instructional resource guide is intended to assist the industrial arts (IA) teacher in implementing a comprehensive Materials and Processes Technology program at the technical level in Virginia high schools. The course is designed to help students make informed educational and occupational choices and prepare them for advanced technical or…
Collins, Loel; Collins, Dave
2015-01-01
This study examined the integration of professional judgement and decision-making processes in adventure sports coaching. The study utilised a thematic analysis approach to investigate the decision-making practices of a sample of high-level adventure sports coaches over a series of sessions. Results revealed that, in order to make judgements and decisions in practice, expert coaches employ a range of practical and pedagogic management strategies to create and opportunistically use time for decision-making. These approaches include span of control and time management strategies to facilitate the decision-making process regarding risk management, venue selection, aims, objectives, session content, and differentiation of the coaching process. The implication for coaches, coach education, and accreditation is the recognition and training of the approaches that "create time" for the judgements in practice, namely "creating space to think". The paper concludes by offering a template for a more expertise-focused progression in adventure sports coaching.
[Early mother-infant interaction and factors negatively affecting parenting].
Cerezo, María Angeles; Trenado, Rosa María; Pons-Salvador, Gemma
2006-08-01
The social information-processing model contributes to identifying the psychological processes underlying the construct "sensitivity" in early mother-child interaction. Negative emotional states associated with inadequate self-regulation in coping with stressors affect the mother's attention skills and the processing of the baby's signals. This leads to less synchronous parental practices, particularly unsatisfactory when the baby is unhappy or crying, because the required self-regulation is not provided. This micro-social research studies the sequential profile of maternal reactions to the baby's positive/neutral vs. difficult behaviours and compares them in two groups of dyads, one with mothers who reported high levels of distress and other factors negatively affecting parenting, and the other with mothers who reported low levels. The unfavourable circumstances of the high stress group and their negative effects on interaction were observed in some indiscriminate maternal responses and particularly as they reacted to their baby's difficult behaviour, when the mother's regulatory role is more necessary.
Enabling multi-level relevance feedback on PubMed by integrating rank learning into DBMS.
Yu, Hwanjo; Kim, Taehoon; Oh, Jinoh; Ko, Ilhwan; Kim, Sungchul; Han, Wook-Shin
2010-04-16
Finding relevant articles from PubMed is challenging because it is hard to express the user's specific intention in the given query interface, and a keyword query typically retrieves a large number of results. Researchers have applied machine learning techniques to find relevant articles by ranking the articles according to the learned relevance function. However, the process of learning and ranking is usually done offline without being integrated with the keyword queries, and the users have to provide a large number of training documents to get a reasonable learning accuracy. This paper proposes a novel multi-level relevance feedback system for PubMed, called RefMed, which supports both ad-hoc keyword queries and a multi-level relevance feedback in real time on PubMed. RefMed supports a multi-level relevance feedback by using the RankSVM as the learning method, and thus it achieves higher accuracy with less feedback. RefMed "tightly" integrates the RankSVM into RDBMS to support both keyword queries and the multi-level relevance feedback in real time; the tight coupling of the RankSVM and DBMS substantially improves the processing time. An efficient parameter selection method for the RankSVM is also proposed, which tunes the RankSVM parameter without performing validation. Thereby, RefMed achieves a high learning accuracy in real time without performing a validation process. RefMed is accessible at http://dm.postech.ac.kr/refmed. RefMed is the first multi-level relevance feedback system for PubMed, which achieves a high accuracy with less feedback. It effectively learns an accurate relevance function from the user's feedback and efficiently processes the function to return relevant articles in real time.
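For readers unfamiliar with RankSVM, its core pairwise idea can be sketched in a few lines of Python. This is a toy illustration only: RefMed's actual implementation is coupled to the RDBMS engine, and the feature values, relevance levels, and use of scikit-learn's LinearSVC below are assumptions made for the example.

import numpy as np
from itertools import combinations
from sklearn.svm import LinearSVC

def pairwise_transform(X, y):
    """Build difference vectors for RankSVM-style training: every pair of
    documents with different relevance levels yields one training example
    labeled by which document is more relevant."""
    X_pairs, y_pairs = [], []
    for i, j in combinations(range(len(y)), 2):
        if y[i] == y[j]:
            continue  # ties carry no ordering information
        X_pairs.append(X[i] - X[j])
        y_pairs.append(1 if y[i] > y[j] else -1)
    return np.asarray(X_pairs), np.asarray(y_pairs)

# toy data: 5 documents, 3 features, multi-level relevance feedback (0, 1, 2)
X = np.array([[0.2, 1.0, 0.1],
              [0.9, 0.3, 0.4],
              [0.5, 0.5, 0.5],
              [0.1, 0.2, 0.9],
              [0.7, 0.8, 0.2]])
relevance = np.array([0, 2, 1, 0, 2])

X_p, y_p = pairwise_transform(X, relevance)
model = LinearSVC(C=1.0).fit(X_p, y_p)      # linear SVM on pairwise preferences
scores = X @ model.coef_.ravel()            # learned relevance function
ranking = np.argsort(-scores)               # documents ordered by predicted relevance
print(ranking)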
Solan, Martin; Hauton, Chris; Godbold, Jasmin A.; Wood, Christina L.; Leighton, Timothy G.; White, Paul
2016-01-01
Coastal and shelf environments support high levels of biodiversity that are vital in mediating ecosystem processes, but they are also subject to noise associated with mounting levels of offshore human activity. This has the potential to alter the way in which species interact with their environment, compromising the mediation of important ecosystem properties. Here, we show that exposure to underwater broadband sound fields that resemble offshore shipping and construction activity can alter sediment-dwelling invertebrate contributions to fluid and particle transport - key processes in mediating benthic nutrient cycling. Despite high levels of intra-specific variability in physiological response, we find that changes in the behaviour of some functionally important species can be dependent on the class of broadband sound (continuous or impulsive). Our study provides evidence that exposing coastal environments to anthropogenic sound fields is likely to have much wider ecosystem consequences than are presently acknowledged. PMID:26847483
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aderholdt, Ferrol; Caldwell, Blake A.; Hicks, Susan Elaine
High performance computing environments are often used for a wide variety of workloads ranging from simulation, data transformation and analysis, and complex workflows to name just a few. These systems may process data at various security levels but in so doing are often enclaved at the highest security posture. This approach places significant restrictions on the users of the system even when processing data at a lower security level and exposes data at higher levels of confidentiality to a much broader population than otherwise necessary. The traditional approach of isolation, while effective in establishing security enclaves, poses significant challenges for the use of shared infrastructure in HPC environments. This report details the current state of the art in virtualization, reconfigurable network enclaving via Software Defined Networking (SDN), and storage architectures and bridging techniques for creating secure enclaves in HPC environments.
Down-regulation of respiration in pear fruit depends on temperature.
Ho, Quang Tri; Hertog, Maarten L A T M; Verboven, Pieter; Ambaw, Alemayehu; Rogge, Seppe; Verlinden, Bert E; Nicolaï, Bart M
2018-04-09
The respiration rate of plant tissues decreases when the amount of available O2 is reduced. There is, however, a debate on whether the respiration rate is controlled either by diffusion limitation of oxygen or through regulatory processes at the level of the transcriptome. We used experimental and modelling approaches to demonstrate that both diffusion limitation and metabolic regulation affect the response of respiration of bulky plant organs such as fruit to reduced O2 levels in the surrounding atmosphere. Diffusion limitation greatly affects fruit respiration at high temperature, but at low temperature respiration is reduced through a regulatory process, presumably a response to a signal generated by a plant oxygen sensor. The response of respiration to O2 is time dependent and is highly sensitive, particularly at low O2 levels in the surrounding atmosphere. Down-regulation of the respiration at low temperatures may save internal O2 and relieve hypoxic conditions in the fruit.
Peker, Musa; Şen, Baha; Gürüler, Hüseyin
2015-02-01
The effect of anesthesia on the patient is referred to as depth of anesthesia. Rapid classification of the appropriate depth level of anesthesia is a matter of great importance in surgical operations. Similarly, accelerating classification algorithms is important for the rapid solution of problems in the field of biomedical signal processing. However, numerous time-consuming mathematical operations are required during the training and testing stages of classification algorithms, especially in neural networks. In this study, to accelerate the process, the parallel programming and computing platform Nvidia CUDA, which facilitates dramatic increases in computing performance by harnessing the power of the graphics processing unit (GPU), was utilized. The system was employed to detect the anesthetic depth level on a related electroencephalogram (EEG) data set. This dataset is rather complex and large. Moreover, achieving more anesthetic depth levels with a rapid response is critical in anesthesia. The proposed parallelization method yielded highly accurate classification results in a shorter time.
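The abstract does not include the implementation, but the general pattern of moving a classifier's heavy matrix arithmetic onto the GPU can be sketched with CuPy, a CUDA-backed NumPy-like library. The network shape, feature dimension, and four depth-of-anesthesia classes below are illustrative assumptions, not the authors' model, and a CUDA-capable GPU is required.

import numpy as np
import cupy as cp  # assumes a CUDA-capable GPU and an installed CuPy build

def forward(X, W1, b1, W2, b2):
    """One-hidden-layer forward pass; runs on the GPU when given CuPy arrays."""
    h = cp.tanh(X @ W1 + b1)
    return h @ W2 + b2

rng = np.random.default_rng(0)
X_cpu = rng.standard_normal((10000, 64)).astype(np.float32)        # e.g. EEG feature vectors
W1 = cp.asarray(rng.standard_normal((64, 128)).astype(np.float32))
b1 = cp.zeros(128, dtype=cp.float32)
W2 = cp.asarray(rng.standard_normal((128, 4)).astype(np.float32))  # 4 hypothetical depth levels
b2 = cp.zeros(4, dtype=cp.float32)

logits = forward(cp.asarray(X_cpu), W1, b1, W2, b2)    # batched evaluation on the GPU
depth_level = cp.asnumpy(cp.argmax(logits, axis=1))    # copy predictions back to the host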
Effects of graded taurine levels on juvenile cobia
USDA-ARS?s Scientific Manuscript database
Taurine, which has multiple important physiological roles in teleost fish and mammals, is an amino acid not found in alternative protein sources not derived from animals. Although taurine is found in fish-meal-based feeds, its high water solubility leads to lower taurine levels in reduction-process-...
ERIC Educational Resources Information Center
Olivier, Dianne F.; Huffman, Jane B.
2016-01-01
As the Professional Learning Community (PLC) process becomes embedded within schools, the level of district support has a direct impact on whether schools have the ability to re-culture and sustain highly effective collaborative practices. The purpose of this article is to share a professional learning community conceptual framework from the US,…
Research in Stochastic Processes.
1983-10-01
increases. A more detailed investigation of the exceedances themselves (rather than just the cluster centers) was undertaken, together with J. Hüsler and... J. Hüsler and M.R. Leadbetter, Compound Poisson limit theorems for high level exceedances by stationary sequences, Center for Stochastic Processes... stability by a random linear operator. C.D. Hardin, General (asymmetric) stable variables and processes. T. Hsing, J. Hüsler and M.R. Leadbetter, Compound
Greer, J; Smailes, D; Spencer, H; Freeston, M; Dudley, R
2016-03-01
Biased processing of negatively valenced, and particularly threat-related material plays an important role in the development of paranoid thinking. This has been demonstrated by superior memory for threat-related information in patients with persecutory delusions and in non-clinical paranoia-prone participants. This study examined how emotional material was recalled having been encoded in relation to one self or to another person, in people high or low in paranoid ideation. It was predicted that people high in paranoia would recall more threat related material about others than people low in paranoia owing to being particularly alert to threats from other people. Participants who reported high (N = 30) or low (N = 30) levels of sub-clinical paranoid thinking were presented with a series of threat-related and positive words and were asked to process them in terms of the self, or in terms of a fictional character. As predicted, when words were processed in terms of another person, the high paranoia group recalled more threat-related words than positive words, but when words had been processed in terms of the self, recall of threat-related and positive words did not differ. In contrast, there was no interaction between word-valence and referent in the low paranoia group. These findings are drawn from an analogue sample. Replication in a sample of clinical participants who report persecutory delusions is required. People high in sub-clinical paranoid ideation recalled threat preferentially in relation to other people. Such information processing biases may help understand the development and maintenance of persecutory beliefs. Copyright © 2015 Elsevier Ltd. All rights reserved.
Dilution: a theoretical burden or just load? A reply to Tsal and Benoni (2010).
Lavie, Nilli; Torralbo, Ana
2010-12-01
Load theory of attention proposes that distractor processing is reduced in tasks with high perceptual load that exhaust attentional capacity within task-relevant processing. In contrast, tasks of low perceptual load leave spare capacity that spills over, resulting in the perception of task-irrelevant, potentially distracting stimuli. Tsal and Benoni (2010) find that distractor response competition effects can be reduced under conditions with a high search set size but low perceptual load (due to a singleton color target). They claim that the usual effect of search set size on distractor processing is not due to attentional load but instead attribute this to lower level visual interference. Here, we propose an account for their findings within load theory. We argue that in tasks of low perceptual load but high set size, an irrelevant distractor competes with the search nontargets for remaining capacity. Thus, distractor processing is reduced under conditions in which the search nontargets receive the spillover of capacity instead of the irrelevant distractor. We report a new experiment testing this prediction. Our new results demonstrate that, when peripheral distractor processing is reduced, it is the search nontargets nearest to the target that are perceived instead. Our findings provide new evidence for the spare capacity spillover hypothesis made by load theory and rule out accounts in terms of lower level visual interference (or mere "dilution") for cases of reduced distractor processing under low load in displays of high set size. We also discuss additional evidence that discounts the viability of Tsal and Benoni's dilution account as an alternative to perceptual load.
ERIC Educational Resources Information Center
Lee, Shinyoung; Kim, Heui-Baik
2014-01-01
The purpose of this study is to identify the epistemological features and model qualities depending on model evaluation levels and to explore the reasoning process behind high-level evaluation through small group interaction about blood circulation. Nine groups of three to four students in the eighth grade participated in the modeling practice.…
Silk, Jennifer S; Vanderbilt-Adriance, Ella; Shaw, Daniel S; Forbes, Erika E; Whalen, Diana J; Ryan, Neal D; Dahl, Ronald E
2007-01-01
This article offers a multilevel perspective on resilience to depression, with a focus on interactions among social and neurobehavioral systems involved in emotional reactivity and regulation. We discuss models of cross-contextual mediation and moderation by which the social context influences or modifies the effects of resilience processes at the biological level, or the biological context influences or modifies the effects of resilience processes at the social level. We highlight the socialization of emotion regulation as a candidate process contributing to resilience against depression at the social context level. We discuss several factors and their interactions across levels (including genetic factors, stress reactivity, positive affect, neural systems of reward, and sleep) as candidate processes contributing to resilience against depression at the neurobehavioral level. We then present some preliminary supportive findings from two studies of children and adolescents at high risk for depression. Study 1 shows that elevated neighborhood level adversity has the potential to constrain or limit the benefits of protective factors at other levels. Study 2 indicates that ease and quickness in falling asleep and a greater amount of time in deep Stage 4 sleep may be protective against the development of depressive disorders for children. The paper concludes with a discussion of clinical implications of this approach.
Commercial Disinfectants During Disinfection Process Validation: More Failures than Success.
Chatterjee, Shiv Sekhar; Chumber, Sushil Kumar; Khanduri, Uma
2016-08-01
Disinfection process validation is mandatory before introduction of a new disinfectant in hospital services. Commercial disinfection brands often question existing hospital policy claiming greater efficacy and lack of toxicity of their products. Inadvertent inadequate disinfection leads to morbidity, patient's economic burden, and the risk of mortality. To evaluate commercial disinfectants for high, intermediate and low-level disinfection so as to identify utility for our routine situations. This laboratory based experiment was conducted at St Stephen Hospital, Delhi during July-September 2013. Twelve commercial disinfectants: Sanidex®, Sanocid®, Cidex®, SekuSept Aktiv®, BIB Forte®, Alprojet W®, Desnet®, Sanihygiene®, Incidin®, D125®, Lonzagard®, and Glutishield® were tested. Time-kill assay (suspension test) was performed against six indicator bacteria (Escherichia coli, Staphylococcus aureus, Pseudomonas aeruginosa, Salmonella Typhi, Bacillus cereus, and Mycobacterium fortuitum). Low and high inocula (final concentrations 1.5X10(6) and 9X10(6) cfu/ml) of the first five bacteria were tested, while only the low inoculum of M. fortuitum was tested. Cidex® (2.4% Glutaraldehyde) performed best as a high-level disinfectant, while newer quaternary ammonium compounds (QACs) (Incidin®, D125®, and Lonzagard®) were good at low-level disinfection. Sanidex® (0.55% ortho-phthalaldehyde), though mycobactericidal, took 10 minutes for sporicidal activity. The older QAC-containing BIB Forte® and Desnet® took 20 minutes to fully inhibit P. aeruginosa. All disinfectants effectively reduced S. Typhi to zero counts within 5 minutes. Cidex® is a good high-level disinfectant, while newer QACs (Incidin®, D125®, and Lonzagard®) were capable low-level disinfectants.
Understanding the Rising Phase of the PM2.5 Concentration Evolution in Large China Cities
Lv, Baolei; Cai, Jun; Xu, Bing; Bai, Yuqi
2017-01-01
Long-term air quality observations are seldom analyzed from a dynamic view. This study analyzed fine particulate matter (PM2.5) pollution processes using long-term PM2.5 observations in three Chinese cities. Pollution processes were defined as linearly growing PM2.5 concentrations following the criteria of coefficient of determination R2 > 0.8 and duration time T ≥ 18 hrs. The linear slopes quantified pollution levels as PM2.5 concentration rising rates (PMRR, μg/(m3·hr)). In total, 741, 210, and 193 pollution processes were identified in Beijing (BJ), Shanghai (SH), and Guangzhou (GZ), respectively. Then the relationships between PMRR and wind speed, wind direction, 24-hr backward points, gaseous pollutant (CO, NO2 and SO2) concentrations, and regional PM2.5 levels were studied. Inverse relationships existed between PMRR and wind speed. The wind directions and 24-hr backward points converged in specific directions indicating long-range transport. Gaseous pollutant concentrations increased at variable rates in the three cities with growing PMRR values. PM2.5 levels at the upwind regions of BJ and SH increased at high PMRRs. Regional transport dominated the PM2.5 pollution processes of SH. In BJ, both local contributions and regional transport increased during high-PMRR pollution processes. In GZ, PM2.5 pollution processes were mainly caused by local emissions. PMID:28440282
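The screening rule described above (linear growth with R2 > 0.8 over at least 18 hours, with the slope giving the PMRR) can be expressed compactly in Python. The greedy window-extension strategy and the synthetic hourly series below are illustrative assumptions; the paper does not spell out its exact segmentation procedure.

import numpy as np
from scipy.stats import linregress

def find_pollution_processes(pm25, r2_min=0.8, min_hours=18):
    """Scan an hourly PM2.5 series for stretches of linear growth.
    Returns (start, end, PMRR) for each window of at least min_hours
    whose linear fit has a positive slope and R^2 above r2_min."""
    n = len(pm25)
    processes, start = [], 0
    while start <= n - min_hours:
        best = None
        for end in range(start + min_hours, n + 1):
            t = np.arange(end - start)
            fit = linregress(t, pm25[start:end])
            if fit.slope > 0 and fit.rvalue ** 2 > r2_min:
                best = (start, end, fit.slope)   # keep extending while the criteria hold
            else:
                break
        if best is None:
            start += 1
        else:
            processes.append(best)
            start = best[1]
    return processes

rng = np.random.default_rng(0)
hourly = np.concatenate([np.full(6, 20.0),
                         20.0 + 6.0 * np.arange(24) + rng.normal(0, 3, 24),
                         np.full(6, 160.0)])
for s, e, pmrr in find_pollution_processes(hourly):
    print(f"hours {s}-{e}: PMRR = {pmrr:.1f} ug/(m3*hr)")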
GOCE gravity field simulation based on actual mission scenario
NASA Astrophysics Data System (ADS)
Pail, R.; Goiginger, H.; Mayrhofer, R.; Höck, E.; Schuh, W.-D.; Brockmann, J. M.; Krasbutter, I.; Fecher, T.; Gruber, T.
2009-04-01
In the framework of the ESA-funded project "GOCE High-level Processing Facility", an operational hardware and software system for the scientific processing (Level 1B to Level 2) of GOCE data has been set up by the European GOCE Gravity Consortium EGG-C. One key component of this software system is the computation of a spherical harmonic model of the Earth's gravity field and the corresponding full variance-covariance matrix from the precise GOCE orbit and calibrated and corrected satellite gravity gradiometry (SGG) data. In the framework of the time-wise approach, a combination of several processing strategies for the optimum exploitation of the information content of the GOCE data has been set up: The Quick-Look Gravity Field Analysis is applied to derive a fast diagnosis of the GOCE system performance and to monitor the quality of the input data. In the Core Solver processing, a rigorous high-precision solution of the very large normal equation systems is derived by applying parallel processing techniques on a PC cluster. Before real GOCE data become available, the expected GOCE gravity field performance is evaluated by means of a realistic numerical case study based on the actual GOCE orbit and mission scenario and on simulation data stemming from the most recent ESA end-to-end simulation. Results from this simulation as well as recently developed features of the software system are presented. Additionally, some aspects of data combination with complementary data sources are addressed.
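At its core, the Core Solver assembles and solves least-squares normal equations for the spherical-harmonic coefficients and also delivers their full variance-covariance matrix. The toy Python sketch below shows that structure at trivially small scale; the dimensions, random design matrix, and noise level are placeholders, and the operational system distributes these steps over a PC cluster rather than solving them in a single NumPy call.

import numpy as np

n_obs, n_coeff = 5000, 50                          # toy sizes; the real problem is far larger
rng = np.random.default_rng(1)
A = rng.standard_normal((n_obs, n_coeff))          # design matrix (partials of observations w.r.t. coefficients)
x_true = rng.standard_normal(n_coeff)
y = A @ x_true + rng.normal(0, 0.01, n_obs)        # simulated orbit/gradiometer observations

N = A.T @ A                                        # normal equation matrix
b = A.T @ y
x_hat = np.linalg.solve(N, b)                      # estimated harmonic coefficients
sigma2 = np.sum((y - A @ x_hat) ** 2) / (n_obs - n_coeff)
cov = sigma2 * np.linalg.inv(N)                    # full variance-covariance matrix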
NASA Astrophysics Data System (ADS)
Li, Y.; Acharya, K.; Chen, D.; Stone, M.; Yu, Z.; Young, M.; Zhu, J.; Shafer, D. S.; Warwick, J. J.
2009-12-01
Sustained drought in the western United States since 2000 has led to a significant drop (about 35 meters) in the water level of Lake Mead, the largest reservoir by volume in the United States. The drought combined with rapid urban development in southern Nevada and emergence of invasive species has threatened the water quality and ecological processes in Lake Mead. A three-dimensional hydrodynamic model, Environmental Fluid Dynamics Code (EFDC), was applied to investigate lake circulation and temperature stratification in parts of Lake Mead (Las Vegas Bay and Boulder Basin) under changing water levels. Besides the inflow from Las Vegas Wash and the Colorado River, the model considered atmospheric changes as well as the boundary conditions restricted by the operation of Hoover Dam. The model was calibrated and verified by using observed data including water level, velocity, and temperature from 2003 and 2005. The model was applied to study the hydrodynamic processes at water level 366.8 m (year 2000) and at water level 338.2 m (year 2008). The high-stage simulation described the pre-drought lake hydrodynamic processes while the low-stage simulation highlighted the drawdown impact on such processes. The results showed that both inflow and wind-driven mixing processes played major roles in the thermal stratification and lake circulation in both cases. However, the atmospheric boundary played a more important role than inflow temperature on thermal stratification of Lake Mead during water level decline. Further, the thermal stratification regime and flow circulation pattern in shallow lake regions (e.g., the Boulder Basin area) were most impacted. The temperature of the lake at the high stage was more sensitive to inflow temperatures than at the low stage. Furthermore, flow velocities decreased with the decreasing water level due to reduction in wind impacts, particularly in shallow areas of the lake. Such changes in temperature and lake current due to the present drought have a strong influence on contaminant and nutrient dynamics and the ecosystem of the lake.
Klumpers, Floris; Everaerd, Daphne; Kooijman, Sabine C.; van Wingen, Guido A.; Fernández, Guillén
2016-01-01
Stress exposure is known to precipitate psychological disorders. However, large differences exist in how individuals respond to stressful situations. A major marker for stress sensitivity is hypothalamus–pituitary–adrenal (HPA)-axis function. Here, we studied how interindividual variance in both basal cortisol levels and stress-induced cortisol responses predicts differences in neural vigilance processing during stress exposure. Implementing a randomized, counterbalanced, crossover design, 120 healthy male participants were exposed to a stress-induction and control procedure, followed by an emotional perception task (viewing fearful and happy faces) during fMRI scanning. Stress sensitivity was assessed using physiological (salivary cortisol levels) and psychological measures (trait questionnaires). High stress-induced cortisol responses were associated with increased stress sensitivity as assessed by psychological questionnaires, a stronger stress-induced increase in medial temporal activity and greater differential amygdala responses to fearful as opposed to happy faces under control conditions. In contrast, high basal cortisol levels were related to relative stress resilience as reflected by higher extraversion scores, a lower stress-induced increase in amygdala activity and enhanced differential processing of fearful compared with happy faces under stress. These findings seem to reflect a critical role for HPA-axis signaling in stress coping; higher basal levels indicate stress resilience, whereas higher cortisol responsivity to stress might facilitate recovery in those individuals prone to react sensitively to stress. PMID:26668010
Lovelock, Catherine E.; Bennion, Vicki; Grinham, Alistair; Cahoon, Donald R.
2011-01-01
Increases in the elevation of the soil surfaces of mangroves and salt marshes are key to the maintenance of these habitats with accelerating sea level rise. Understanding the processes that give rise to increases in soil surface elevation provides science for management of landscapes for sustainable coastal wetlands. Here, we tested whether the soil surface elevation of mangroves and salt marshes in Moreton Bay is keeping up with local rates of sea level rise (2.358 mm y-1) and whether accretion on the soil surface was the most important process for keeping up with sea level rise. We found variability in surface elevation gains, with sandy areas in the eastern bay having the highest surface elevation gains in both mangrove and salt marsh (5.9 and 1.9 mm y-1), whereas in the muddier western bay rates of surface elevation gain were lower (1.4 and -0.3 mm y-1 in mangrove and salt marsh, respectively). Both sides of the bay had similar rates of surface accretion (~7–9 mm y-1 in the mangrove and 1–3 mm y-1 in the salt marsh), but mangrove soils in the western bay were subsiding at a rate of approximately 8 mm y-1, possibly due to compaction of organic sediments. Over the study period, surface elevation increments were sensitive to position in the intertidal zone (higher when lower in the intertidal) and also to variation in mean sea level (higher at high sea level). Although surface accretion was the most important process for keeping up with sea level rise in the eastern bay, subsidence largely negated gains made through surface accretion in the western bay, indicating a high vulnerability to sea level rise in these forests.
Perfusion seed cultures improve biopharmaceutical fed-batch production capacity and product quality.
Yang, William C; Lu, Jiuyi; Kwiatkowski, Chris; Yuan, Hang; Kshirsagar, Rashmi; Ryll, Thomas; Huang, Yao-Ming
2014-01-01
Volumetric productivity and product quality are two key performance indicators for any biopharmaceutical cell culture process. In this work, we showed proof-of-concept for improving both through the use of alternating tangential flow perfusion seed cultures coupled with high-seed fed-batch production cultures. First, we optimized the perfusion N-1 stage, the seed train bioreactor stage immediately prior to the production bioreactor stage, to minimize the consumption of perfusion media for one CHO cell line and then successfully applied the optimized perfusion process to a different CHO cell line. Exponential growth was observed throughout the N-1 duration, reaching >40 × 10(6) vc/mL at the end of the perfusion N-1 stage. The cultures were subsequently split into high-seed (10 × 10(6) vc/mL) fed-batch production cultures. This strategy significantly shortened the culture duration. The high-seed fed-batch production processes for cell lines A and B reached 5 g/L titer in 12 days, while their respective low-seed processes reached the same titer in 17 days. The shortened production culture duration potentially generates a 30% increase in manufacturing capacity while yielding comparable product quality. When perfusion N-1 and high-seed fed-batch production were applied to cell line C, higher levels of the active protein were obtained, compared to the low-seed process. This, combined with correspondingly lower levels of the inactive species, can enhance the overall process yield for the active species. Using three different CHO cell lines, we showed that perfusion seed cultures can optimize capacity utilization and improve process efficiency by increasing volumetric productivity while maintaining or improving product quality. © 2014 American Institute of Chemical Engineers.
Higgins, D L; O'Reilly, K; Tashima, N; Crain, C; Beeker, C; Goldbaum, G; Elifson, C S; Galavotti, C; Guenther-Grey, C
1996-01-01
The AIDS Community Demonstration Projects provided community-level HIV prevention interventions to historically hard-to-reach groups at high risk for HIV infection. The projects operated under a common research protocol which encompassed formative research, intervention delivery, process evaluation, and outcome evaluation. A formative research process specifically focusing on intervention development was devised to assist project staff in identifying, prioritizing, accessing, and understanding the intervention target groups. This process was central to the creation of interventions that were acceptable and unique to the target populations. Intended to be rapid, the process took 6 months to complete. Drawn from the disciplines of anthropology, community psychology, sociology, and public health, the formative research process followed distinct steps which included (a) defining the populations at high-risk for HIV; (b) gathering information about these populations through interviews with persons who were outside of, but who had contact with, the target groups (such as staff from the health department and alcohol and drug treatment facilities, as well as persons who interacted in an informal manner with the target groups, such as clerks in neighborhood grocery stores and bartenders); (c) interviewing people with access to the target populations (gatekeepers), and conducting observations in areas where these high-risk groups were reported to gather (from previous interviews); (d) interviewing members of these groups at high risk for HIV infection or transmission; and (e) systematically integrating information throughout the process. Semistructured interview schedules were used for all data collection in this process. This standardized systematic method yielded valuable information about the focal groups in each demonstration project site. The method, if adopted by others, would assist community intervention specialists in developing interventions that are culturally appropriate and meaningful to their respective target populations. PMID:8862154
Arrizon, J; Gschaedler, A
2007-04-01
To study the effect of the addition of different nitrogen sources at high sugar concentration in the tequila fermentation process. Fermentations were performed at high sugar concentration (170 g l(-1)) using Agave tequilana Weber blue variety with and without added nitrogen from different sources (ammonium sulfate; glutamic acid; a mixture of ammonium sulfate and amino acids) during the exponential phase of growth. All the additions increased the fermentation rate and alcohol efficiency. The level of synthesis of volatile compounds depended on the source added. The concentrations of amyl alcohols and isobutanol decreased, while the concentrations of propanol and acetaldehyde increased. The most efficient nitrogen sources for fermentation rate were ammonium sulfate and the mixture of ammonium sulfate and amino acids. The level of volatile compounds produced depended upon the type of nitrogen source. The synthesis of some volatile compounds increased while others decreased with nitrogen addition. The addition of nitrogen could be a strategy for improving the fermentation rate and efficiency in the tequila fermentation process at high Agave tequilana sugar concentration. Furthermore, the sensory quality of the final product may change because the synthesis of the volatile compounds is modified.
Rodrigues, Camila S; Renshaw, Keith D
2010-10-01
Studies have identified coping processes as one potential factor influencing PTSD in veterans. This study examined the associations between coping, combat exposure, and PTSD among 218 National Guard veterans deployed overseas since 2001. Problem-focused coping was unrelated to combat exposure and PTSD symptoms. In contrast, increased levels of emotion focused coping (EFC) were found in veterans who reported higher levels of combat exposure. Moreover, the severity of combat was a curvilinear moderator of the relation between coping process and PTSD, such that EFC was unrelated to PTSD symptom severity at low levels of combat, associated with higher symptom severity at moderate levels of combat, and associated with lower symptom severity at high levels of combat. These findings indicate that the type and severity of trauma may moderate the association of coping and psychological outcomes, and that these associations might not be linear. Published by Elsevier Ltd.
Monich, Victor A; Bavrina, Anna P; Malinovskaya, Svetlana L
2018-01-01
Exposure of living tissues to high-intensity red or near-infrared light can produce oxidative stress effects both in the target zone and in adjacent ones. The protein oxidative modification (POM) products can be used as reliable and early markers of oxidative stress. The contents of modified proteins in the investigated specimens can be evaluated by the 2,4-dinitrophenylhydrazine assay (the DNPH assay). Low-intensity red light is able to decrease the activity of oxidative processes, and the DNPH assay data about the POM products in the biological tissues could show both an oxidative stress level and an efficiency of physical agent protection against the oxidative processes. Two control groups of white rats were irradiated by laser light, the first control group by red light and the second one by near-infrared radiation (NIR). Two experimental groups were treated consecutively with laser light and red low-level light-emitting diode (LED) radiation. One of them was exposed to red laser light + LED and the other to NIR + LED. The fifth group was intact. Each group included ten animals. The effect of laser light was studied by methods of protein oxidative modifications. We measured levels of both induced and spontaneous POM products by the DNPH assay. The dramatic increase in levels of POM products in the control group samples when compared with the intact group data as well as the sharp decrease in the POM products in the experimental groups treated with LED low-level light were statistically significant (p ≤ 0.05). Exposure of skeletal muscles to high-intensity red and near-infrared laser light causes oxidative stress that continues for not less than 3 days. The method of measurement of POM product contents by the DNPH assay is a reliable test of an oxidative process rate. Red low-intensity LED radiation can provide rehabilitation of skeletal muscle tissues treated with high-intensity laser light.
Software-based high-level synthesis design of FPGA beamformers for synthetic aperture imaging.
Amaro, Joao; Yiu, Billy Y S; Falcao, Gabriel; Gomes, Marco A C; Yu, Alfred C H
2015-05-01
Field-programmable gate arrays (FPGAs) can potentially be configured as beamforming platforms for ultrasound imaging, but a long design time and skilled expertise in hardware programming are typically required. In this article, we present a novel approach to the efficient design of FPGA beamformers for synthetic aperture (SA) imaging via the use of software-based high-level synthesis techniques. Software kernels (coded in OpenCL) were first developed to stage-wise handle SA beamforming operations, and their corresponding FPGA logic circuitry was emulated through a high-level synthesis framework. After design space analysis, the fine-tuned OpenCL kernels were compiled into register transfer level descriptions to configure an FPGA as a beamformer module. The processing performance of this beamformer was assessed through a series of offline emulation experiments that sought to derive beamformed images from SA channel-domain raw data (40-MHz sampling rate, 12 bit resolution). With 128 channels, our FPGA-based SA beamformer can achieve 41 frames per second (fps) processing throughput (3.44 × 10(8) pixels per second for frame size of 256 × 256 pixels) at 31.5 W power consumption (1.30 fps/W power efficiency). It utilized 86.9% of the FPGA fabric and operated at a 196.5 MHz clock frequency (after optimization). Based on these findings, we anticipate that FPGA and high-level synthesis can together foster rapid prototyping of real-time ultrasound processor modules at low power consumption budgets.
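For orientation, the per-pixel delay-and-sum operation that such OpenCL kernels implement stage-wise can be written as a plain NumPy reference. This is an illustrative sketch only: the aperture geometry, sound speed, pixel grid, and single-transmit-event framing are assumptions, and nothing here reflects the authors' kernel code or fixed-point details.

import numpy as np

def sa_delay_and_sum(rf, fs, c, elem_x, tx_x, pixels):
    """Delay-and-sum one synthetic-aperture transmit event.
    rf: (n_elements, n_samples) channel data recorded after firing the element
    at lateral position tx_x; pixels: (n_pix, 2) array of (x, z) image points."""
    n_elem, n_samp = rf.shape
    image = np.zeros(len(pixels))
    for k, (px, pz) in enumerate(pixels):
        t_tx = np.hypot(px - tx_x, pz) / c            # transmit path delay to the pixel
        t_rx = np.hypot(px - elem_x, pz) / c          # receive path delay for every element
        idx = np.round((t_tx + t_rx) * fs).astype(int)
        valid = idx < n_samp
        image[k] = rf[np.arange(n_elem)[valid], idx[valid]].sum()
    return image

fs, c = 40e6, 1540.0                                  # 40 MHz sampling, speed of sound (m/s)
elem_x = np.linspace(-0.01, 0.01, 128)                # 128-element aperture, 20 mm wide
rf = np.random.randn(128, 4096)                       # placeholder channel data
pixels = np.array([(x, z) for z in np.linspace(0.01, 0.04, 64)
                   for x in np.linspace(-0.01, 0.01, 64)])
low_res_image = sa_delay_and_sum(rf, fs, c, elem_x, tx_x=0.0, pixels=pixels)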
Mark J. Ambrose
2018-01-01
Tree mortality is a natural process in all forest ecosystems. High mortality can be an indicator of forest health problems. On a regional scale, high mortality levels may indicate widespread insect or disease impacts. High mortality may also occur if a large proportion of the forest in a particular region is made up of older, senescent stands. The approach...
Primary Literature as a Basis for a High-School Biology Curriculum.
ERIC Educational Resources Information Center
Yarden, Anat; Brill, Gilat; Falk, Hedda
2001-01-01
Adopts primary literature as a means of developing scientific literacy among high-school biology majors. Reports on the development and implementation of a primary literature-based curriculum in developmental biology. Discusses the process of adapting original research articles to the high-school level, as well as a conversational model developed…
Judicious Discipline: A Constitutional Approach for Public High Schools.
ERIC Educational Resources Information Center
Grandmont, Richard P.
2003-01-01
Examines the practices in a large public high school where constitutional language and democratic citizenship education--judicious discipline--are introduced into the decision-making processes of the classroom. Data analysis suggests that a considerable number of students felt they possessed a high level of respect and responsibility as a result.…
Mark J. Ambrose
2012-01-01
Tree mortality is a natural process in all forest ecosystems. However, extremely high mortality also can be an indicator of forest health issues. On a regional scale, high mortality levels may indicate widespread insect or disease problems. High mortality may also occur if a large proportion of the forest in a particular region is made up of older, senescent stands....
FERMI: a digital Front End and Readout MIcrosystem for high resolution calorimetry
NASA Astrophysics Data System (ADS)
Alexanian, H.; Appelquist, G.; Bailly, P.; Benetta, R.; Berglund, S.; Bezamat, J.; Blouzon, F.; Bohm, C.; Breveglieri, L.; Brigati, S.; Cattaneo, P. W.; Dadda, L.; David, J.; Engström, M.; Genat, J. F.; Givoletti, M.; Goggi, V. G.; Gong, S.; Grieco, G. M.; Hansen, M.; Hentzell, H.; Holmberg, T.; Höglund, I.; Inkinen, S. J.; Kerek, A.; Landi, C.; Ledortz, O.; Lippi, M.; Lofstedt, B.; Lund-Jensen, B.; Maloberti, F.; Mutz, S.; Nayman, P.; Piuri, V.; Polesello, G.; Sami, M.; Savoy-Navarro, A.; Schwemling, P.; Stefanelli, R.; Sundblad, R.; Svensson, C.; Torelli, G.; Vanuxem, J. P.; Yamdagni, N.; Yuan, J.; Ödmark, A.; Fermi Collaboration
1995-02-01
We present a digital solution for the front-end electronics of high resolution calorimeters at future colliders. It is based on analogue signal compression, high speed A/D converters, a fully programmable pipeline and a digital signal processing (DSP) chain with local intelligence and system supervision. This digital solution is aimed at providing maximal front-end processing power by performing waveform analysis using DSP methods. For the system integration of the multichannel device a multi-chip, silicon-on-silicon multi-chip module (MCM) has been adopted. This solution allows a high level of integration of complex analogue and digital functions, with excellent flexibility in mixing technologies for the different functional blocks. This type of multichip integration provides a high degree of reliability and programmability at both the function and the system level, with the additional possibility of customising the microsystem to detector-specific requirements. For enhanced reliability in high radiation environments, fault tolerance strategies, i.e. redundancy, reconfigurability, majority voting and coding for error detection and correction, are integrated into the design.
Accounting For Uncertainty in The Application Of High Throughput Datasets
The use of high throughput screening (HTS) datasets will need to adequately account for uncertainties in the data generation process and propagate these uncertainties through to ultimate use. Uncertainty arises at multiple levels in the construction of predictors using in vitro ...
Güler, I; Burunkaya, M
2002-01-01
Relative humidity levels of an incubator were measured and controlled. An ultrasonic nebulizer system as an active humidifier was used to humidify the incubator environment. An integrated circuit-type humidity sensor was used to measure the humidity level of the incubator environment. Measurement and control processes were achieved by a PIC microcontroller. The high-performance and high-speed PIC provided the flexibility of the system. The developed system can be used effectively for the intensive care of newborns and/or premature babies. Since the humidifier generates an aerosol in ambient conditions, it is possible to provide the high relative humidity level for therapeutic and diagnostic purposes in medicine.
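A minimal sketch of the measure-and-control loop, written in Python rather than PIC firmware and with hypothetical read_rh, nebulizer_on, and nebulizer_off hooks (the paper gives no code listing), might look like this simple on/off controller with a hysteresis band:

import time

def control_humidity(read_rh, nebulizer_on, nebulizer_off,
                     target=80.0, band=2.0, period_s=1.0):
    """Turn the nebulizer on below target - band and off above target + band,
    sampling the humidity sensor once per control period."""
    while True:
        rh = read_rh()              # % relative humidity from the integrated sensor
        if rh < target - band:
            nebulizer_on()
        elif rh > target + band:
            nebulizer_off()
        time.sleep(period_s)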
Aryee, Samuel; Walumbwa, Fred O; Seidu, Emmanuel Y M; Otaye, Lilian E
2012-03-01
We proposed and tested a multilevel model, underpinned by empowerment theory, that examines the processes linking high-performance work systems (HPWS) and performance outcomes at the individual and organizational levels of analyses. Data were obtained from 37 branches of 2 banking institutions in Ghana. Results of hierarchical regression analysis revealed that branch-level HPWS relates to empowerment climate. Additionally, results of hierarchical linear modeling that examined the hypothesized cross-level relationships revealed 3 salient findings. First, experienced HPWS and empowerment climate partially mediate the influence of branch-level HPWS on psychological empowerment. Second, psychological empowerment partially mediates the influence of empowerment climate and experienced HPWS on service performance. Third, service orientation moderates the psychological empowerment-service performance relationship such that the relationship is stronger for those high rather than low in service orientation. Last, ordinary least squares regression results revealed that branch-level HPWS influences branch-level market performance through cross-level and individual-level influences on service performance that emerges at the branch level as aggregated service performance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruiz, Steven Adriel
The following discussion contains a high-level description of methods used to implement software for data processing. It describes the directory structures and file handling required to use Excel's Visual Basic for Applications programming language and how to identify shot, test and capture types to appropriately process data. It also describes how to interface with the software.
The Importance of Negotiation for Policy Dialogue: Latin American Training Experiences
ERIC Educational Resources Information Center
Jaramillo, Maria Clara
2004-01-01
Over the past several decades, Latin American countries have supported processes of bringing public policy decisions on education closer to the people concerned. Participation at all levels of decision-making processes has generally been highly valued. Nonetheless, these decentralization efforts came about without governments taking the necessary…
Impact of Therapist Self-Disclosure.
ERIC Educational Resources Information Center
O'Hare, Christopher
The effects of first-session interviewer self-disclosures that differed in three levels of intimacy--low, medium and high--and three kinds of temporal focus--historical (past tense and external to the interview process), current (present tense and external to the interview process), and existential in which the interviewer disclosed immediate…
Prototyping scalable digital signal processing systems for radio astronomy using dataflow models
NASA Astrophysics Data System (ADS)
Sane, N.; Ford, J.; Harris, A. I.; Bhattacharyya, S. S.
2012-05-01
There is a growing trend toward using high-level tools for design and implementation of radio astronomy digital signal processing (DSP) systems. Such tools, for example, those from the Collaboration for Astronomy Signal Processing and Electronics Research (CASPER), are usually platform-specific, and lack high-level, platform-independent, portable, scalable application specifications. This limits the designer's ability to experiment with designs at a high level of abstraction and early in the development cycle. We address some of these issues using a model-based design approach employing dataflow models. We demonstrate this approach by applying it to the design of a tunable digital downconverter (TDD) used for narrow-bandwidth spectroscopy. Our design is targeted toward an FPGA platform, called the Interconnect Break-out Board (IBOB), that is available from CASPER. We use the term TDD to refer to a digital downconverter for which the decimation factor and center frequency can be reconfigured without the need for regenerating the hardware code. Such a design is currently not available in the CASPER DSP library. The work presented in this paper focuses on two aspects. First, we introduce and demonstrate a dataflow-based design approach using the dataflow interchange format (DIF) tool for high-level application specification, and we integrate this approach with the CASPER tool flow. Secondly, we explore the trade-off between the flexibility of TDD designs and the low hardware cost of fixed-configuration digital downconverter (FDD) designs that use the available CASPER DSP library. We further explore this trade-off in the context of a two-stage downconversion scheme employing a combination of TDD or FDD designs.
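The distinction between a tunable and a fixed-configuration downconverter comes down to whether the mixing frequency and decimation factor are run-time parameters. The NumPy/SciPy sketch below shows the underlying signal flow (mix to baseband, low-pass filter, decimate); the sample rate, test tone, and FIR-based decimator are illustrative assumptions and do not correspond to the IBOB/CASPER hardware blocks.

import numpy as np
from scipy.signal import firwin, lfilter

def tunable_ddc(x, fs, f_center, dec_factor, ntaps=129):
    """Mix the band of interest to 0 Hz, low-pass filter, and decimate.
    f_center and dec_factor are ordinary arguments, so retuning requires
    no regeneration of the processing chain."""
    n = np.arange(len(x))
    baseband = x * np.exp(-2j * np.pi * f_center * n / fs)           # complex mixer
    lp = firwin(ntaps, cutoff=0.8 * (fs / (2 * dec_factor)), fs=fs)  # anti-alias low-pass
    return lfilter(lp, 1.0, baseband)[::dec_factor]                  # filter and downsample

fs = 1.0e6                                                           # assumed input sample rate
t = np.arange(200000) / fs
x = np.cos(2 * np.pi * 200e3 * t) + 0.05 * np.random.randn(t.size)
y = tunable_ddc(x, fs, f_center=200e3, dec_factor=16)                # narrow-band output at fs/16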
Dessery, Yoann; Pallari, Jari
2018-01-01
Use of additive manufacturing is growing rapidly in the orthotics field. This technology allows orthotics to be designed directly on digital scans of limbs. However, little information is available about scanners and 3D scans. The aim of this study is to look at the agreement between manual measurements, high-level and low-cost handheld 3D scanners. We took two manual measurements and three 3D scans with each scanner from 14 lower limbs. The lower limbs were divided into 17 sections of 30 mm each from 180 mm above the mid-patella to 300 mm below. Time to record and to process the three 3D scans for the two scanner methods was compared with Student's t-test, while Bland-Altman plots were used to study agreement between circumferences of each section from the three methods. The record time was 97 s shorter with the high-level scanner than with the low-cost one (p = .02), while the process time was nine times quicker with the low-cost scanner (p < .01). An overestimation of 2.5 mm was found with the high-level scanner compared to manual measurement, but with a better repeatability between measurements. The low-cost scanner tended to overestimate the circumferences from 0.1% to 1.5%, overestimation being greater for smaller circumferences. In conclusion, 3D scanners provide more information about the shape of the lower limb, but the reliability depends on the 3D scanner and the size of the scanned segment. Low-cost scanners could be useful for clinicians because of the simple and fast process, but attention should be focused on accuracy, which depends on the scanned body segment.
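For reference, the Bland-Altman agreement analysis used here reduces to the mean difference (bias) between two measurement methods and its 95% limits of agreement. The sketch below uses made-up circumference values rather than the study data.

import numpy as np

def bland_altman(method_a, method_b):
    """Return bias and 95% limits of agreement between two measurement methods."""
    diff = np.asarray(method_a, float) - np.asarray(method_b, float)
    bias = diff.mean()
    spread = 1.96 * diff.std(ddof=1)
    return bias, bias - spread, bias + spread

manual = np.array([402.0, 388.5, 410.2, 395.7, 401.3])    # hypothetical circumferences (mm)
scanner = np.array([404.1, 390.9, 412.8, 397.5, 404.0])
bias, low, high = bland_altman(scanner, manual)
print(f"bias = {bias:.1f} mm, limits of agreement = [{low:.1f}, {high:.1f}] mm")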
Solidification of Savannah River plant high level waste
NASA Astrophysics Data System (ADS)
Maher, R.; Shafranek, L. F.; Kelley, J. A.; Zeyfang, R. W.
1981-11-01
Authorization for construction of the Defense Waste Processing Facility (DWPF) is expected in FY-83. The optimum time for stage 2 authorization is about three years later. Detailed design and construction will require approximately five years for stage 1, with stage 2 construction completed about two to three years later. Production of canisters of waste glass would begin in 1988, and the existing backlog of high level waste sludge stored at SRP would be worked off by about the year 2000. Stage 2 operation could begin in 1990. The technology and engineering are ready for construction and eventual operation of the DWPF for immobilizing high level radioactive waste at Savannah River Plant (SRP). Proceeding with this project will provide the public, and the leadership of this country, with a crucial demonstration that a major quantity of existing high level nuclear wastes can be safely and permanently immobilized.
Roush, David J; Myrold, Adam; Burnham, Michael S; Hughes, Joseph V
2015-01-01
Virus filtration (VF) is a key step in an overall viral clearance process since it has been demonstrated to effectively clear a wide range of mammalian viruses with a log reduction value (LRV) > 4. The potential to achieve higher LRV from virus retentive filters has historically been examined using bacteriophage surrogates, which commonly demonstrated a potential of > 9 LRV when using high titer spikes (e.g. 10^10 PFU/mL). However, as the filter loading increases, one typically experiences significant decreases in performance and LRV. The 9 LRV value is markedly higher than the current expected range of 4-5 LRV when utilizing mammalian retroviruses on virus removal filters (Miesegaes et al., Dev Biol (Basel) 2010;133:3-101). Recent values have been reported in the literature (Stuckey et al., Biotech Progr 2014;30:79-85) of LRV in excess of 6 for PPV and XMuLV, although this result appears to be atypical. LRV for VF with therapeutic proteins could be limited by several factors, including process limits (flux decay, load matrix), virus spike level and the analytical methods used for virus detection (i.e. the Limits of Quantitation), as well as the virus spike quality. Research was conducted using the Xenotropic Murine Leukemia Virus (XMuLV) for its direct relevance to the most commonly cited document, the International Conference on Harmonisation (ICH) Q5A (International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use, Geneva, Switzerland, 1999), for viral safety evaluations. A unique aspect of this work is the independent evaluation of the impact of retrovirus quality and virus spike level on VF performance and LRV. The VF studies used XMuLV preparations purified either by ultracentrifugation (Ultra 1) or by chromatographic processes that yielded a more highly purified virus stock (Ultra 2). Two monoclonal antibodies (Mabs) with markedly different filtration characteristics and with similar levels of aggregate (<1.5%) were evaluated with the Ultra 1 and Ultra 2 virus preparations utilizing the Planova 20N, a small virus removal filter. Impurities in the virus preparation ultimately limited filter loading, as measured by determining the volumetric loading at which 75% flux decay relative to initial conditions is observed (V75). This observation occurred with both Mabs, with the difference in virus purity more pronounced when very high spike levels were used (>5 vol/vol %). Significant differences were seen in process performance over a number of lots of the less-pure Ultra 1 virus preparations. Experiments utilizing a developmental lot of the chromatographically purified XMuLV (Ultra 2 Development lot) that had elevated levels of host cell residuals (vs. the final Ultra 2 preparations) suggest that these contaminant residuals can impact virus filter fouling, even if the virus prep is essentially monodisperse. Process studies utilizing an Ultra 2 virus with substantially less host cell residuals and highly monodispersed virus particles demonstrated superior performance and an LRV in excess of 7.7 log10. A model was constructed demonstrating the linear dependence of filtration flux on filter loading, which can be used to predict the V75 for a range of virus spike level conditions using this highly purified virus. Fine-tuning the virus spike level with this model can ultimately maximize the LRV for the virus filter step, essentially adding the LRV equivalent of another process step (i.e. protein A or CEX chromatography).
© 2014 American Institute of Chemical Engineers.
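The linear flux-versus-loading model mentioned above lends itself to a short numerical sketch; the fitting approach and all data below are assumptions for illustration, not the published model or its coefficients.

```python
# Sketch of a linear flux-decay model: fit flux vs. volumetric loading, then
# predict V75, the loading at which flux has dropped to 25% of its initial
# value (75% flux decay). Data and units are illustrative only.
import numpy as np

def fit_flux_model(loading, flux):
    """Least-squares fit of flux = J0 + slope * loading."""
    slope, J0 = np.polyfit(loading, flux, 1)
    return J0, slope

def predict_v75(J0, slope):
    """Solve J0 + slope * V = 0.25 * J0 for V (slope is negative)."""
    return -0.75 * J0 / slope

loading = np.array([0.0, 50.0, 100.0, 150.0, 200.0])   # L/m^2 loaded
flux    = np.array([60.0, 52.0, 45.0, 37.0, 30.0])     # L/m^2/h observed
J0, slope = fit_flux_model(loading, flux)
print(f"Predicted V75 ≈ {predict_v75(J0, slope):.0f} L/m^2")
```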
A Dynamic Interactive Theory of Person Construal
ERIC Educational Resources Information Center
Freeman, Jonathan B.; Ambady, Nalini
2011-01-01
A dynamic interactive theory of person construal is proposed. It assumes that the perception of other people is accomplished by a dynamical system involving continuous interaction between social categories, stereotypes, high-level cognitive states, and the low-level processing of facial, vocal, and bodily cues. This system permits lower-level…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-26
... protective equipment to protect miners from harmful levels of noise that can result in hearing loss. However..., relies on drills, crushers, compressors, conveyors, trucks, loaders, and other heavy-duty equipment for the excavation, haulage, and processing of material. This equipment creates high sound levels...
77 FR 16865 - Proposed Extension of Existing Information Collection; Occupational Noise Exposure
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-22
..., administrative controls, and personal protective equipment to protect miners from harmful levels of noise that..., compressors, conveyors, trucks, loaders, and other heavy-duty equipment for the excavation, haulage, and processing of material. This equipment creates high sound levels, exposing machine operators as well as...
How awareness changes the relative weights of evidence during human decision-making.
de Lange, Floris P; van Gaal, Simon; Lamme, Victor A F; Dehaene, Stanislas
2011-11-01
Human decisions are based on accumulating evidence over time for different options. Here we ask a simple question: How is the accumulation of evidence affected by the level of awareness of the information? We examined the influence of awareness on decision-making using combined behavioral methods and magneto-encephalography (MEG). Participants were required to make decisions by accumulating evidence over a series of visually presented arrow stimuli whose visibility was modulated by masking. Behavioral results showed that participants could accumulate evidence under both high and low visibility. However, a top-down strategic modulation of the flow of incoming evidence was only present for stimuli with high visibility: once enough evidence had been accrued, participants strategically reduced the impact of new incoming stimuli. Also, decision-making speed and confidence were strongly modulated by the strength of the evidence for high-visible but not low-visible evidence, even though direct priming effects were identical for both types of stimuli. Neural recordings revealed that, while initial perceptual processing was independent of visibility, there was stronger top-down amplification for stimuli with high visibility than low visibility. Furthermore, neural markers of evidence accumulation over occipito-parietal cortex showed a strategic bias only for highly visible sensory information, speeding up processing and reducing neural computations related to the decision process. Our results indicate that the level of awareness of information changes decision-making: while accumulation of evidence already exists under low visibility conditions, high visibility allows evidence to be accumulated up to a higher level, leading to important strategical top-down changes in decision-making. Our results therefore suggest a potential role of awareness in deploying flexible strategies for biasing information acquisition in line with one's expectations and goals.
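To make the strategic down-weighting of late evidence concrete, here is an illustrative accumulator sketch; it is not the authors' model, and the bound, floor weight, and stimulus probabilities are invented for the example.

```python
# Toy evidence accumulator: each arrow adds evidence, and in the 'strategic'
# (high-visibility) case the weight of new samples shrinks as the running total
# approaches the decision bound, mimicking reduced impact of late stimuli.
import numpy as np

def accumulate(evidence, bound=3.0, strategic=True):
    total, weights = 0.0, []
    for e in evidence:
        w = max(0.2, 1.0 - abs(total) / bound) if strategic else 1.0
        weights.append(w)
        total += w * e
    return total, weights

rng = np.random.default_rng(0)
arrows = rng.choice([+1.0, -1.0], size=8, p=[0.7, 0.3])  # noisy arrow stream
high_visibility = accumulate(arrows, strategic=True)      # top-down weighting
low_visibility = accumulate(arrows, strategic=False)      # uniform weighting
```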
Wang, Xiaolong; Gao, Dawen
2016-01-01
The one-stage partial nitritation and anammox process (PN/A) has been a promising microbial process to remove ammonia from wastewater, especially wastewater with a low carbon/nitrogen ratio. Its main failure mode is deterioration caused by overgrowth of nitrite-oxidizing bacteria (NOB), resulting in effluent nitrate build-up in the PN/A process. This study presented an in-situ restoration strategy for suppressing NOB activity in a one-stage granular PN/A system that had deteriorated over 2 months, using elevated concentrations of substrates (ammonia and nitrite) under a limited dissolved oxygen level. The results showed that NOB activity was successfully suppressed after 56 days of restoration, and the ratio of produced nitrate to consumed ammonium was ultimately reduced from 36.8% to 7%. On day 66 the nitrogen removal rate reached 1.2 kg N/(m3·d). The high FA level (5–40 mg/L) and low dissolved oxygen (<0.13 mg/L) were responsible for NOB suppression. Quantitative PCR (qPCR) analysis showed that, after this restoration, anammox bacteria grew substantially, AOB remained stable, Nitrospira increased, and Nitrobacter declined. A high amount of NOB still persisted in the granules, which was not easy to wash out and threatened the deammonification performance. PMID:27881860
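Because free ammonia (FA) is the key suppression variable quoted above, the sketch below computes FA with the standard Anthonisen (1976) relation; whether the study used exactly this relation is an assumption, and the pH, temperature, and TAN values are illustrative.

```python
# Free ammonia (FA, as NH3 in mg/L) from total ammonia nitrogen (TAN), pH and
# temperature, using the commonly cited Anthonisen (1976) equilibrium relation.
import math

def free_ammonia_mg_L(tan_mgN_L, pH, temp_C):
    kb_over_kw = math.exp(6344.0 / (273.0 + temp_C))
    return (17.0 / 14.0) * tan_mgN_L * 10.0**pH / (kb_over_kw + 10.0**pH)

# e.g. 200 mg N/L TAN at pH 7.8 and 30 °C lands inside the 5-40 mg/L FA range.
print(f"{free_ammonia_mg_L(200.0, 7.8, 30.0):.1f} mg/L")   # ≈ 12 mg/L
```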
Automaticity of phonological and semantic processing during visual word recognition.
Pattamadilok, Chotiga; Chanoine, Valérie; Pallier, Christophe; Anton, Jean-Luc; Nazarian, Bruno; Belin, Pascal; Ziegler, Johannes C
2017-04-01
Reading involves activation of phonological and semantic knowledge. Yet, the automaticity of the activation of these representations remains subject to debate. The present study addressed this issue by examining how different brain areas involved in language processing responded to a manipulation of bottom-up (level of visibility) and top-down information (task demands) applied to written words. The analyses showed that the same brain areas were activated in response to written words whether the task was symbol detection, rime detection, or semantic judgment. This network included posterior, temporal and prefrontal regions, which clearly suggests the involvement of orthographic, semantic and phonological/articulatory processing in all tasks. However, we also found interactions between task and stimulus visibility, which reflected the fact that the strength of the neural responses to written words in several high-level language areas varied across tasks. Together, our findings suggest that the involvement of phonological and semantic processing in reading is supported by two complementary mechanisms. First, an automatic mechanism that results from a task-independent spread of activation throughout a network in which orthography is linked to phonology and semantics. Second, a mechanism that further fine-tunes the sensitivity of high-level language areas to the sensory input in a task-dependent manner. Copyright © 2017 Elsevier Inc. All rights reserved.
Electrospun amplified fiber optics.
Morello, Giovanni; Camposeo, Andrea; Moffa, Maria; Pisignano, Dario
2015-03-11
All-optical signal processing is the focus of much research aiming to obtain effective alternatives to existing data transmission platforms. Amplification of light in fiber optics, such as in Erbium-doped fiber amplifiers, is especially important for efficient signal transmission. However, the complex fabrication methods involving high-temperature processes performed in a highly pure environment slow the fabrication process and make amplified components expensive with respect to an ideal, high-throughput, room-temperature production. Here, we report on near-infrared polymer fiber amplifiers working over a band of ∼20 nm. The fibers are cheap, spun with a process carried out entirely at room temperature, and shown to exhibit amplified spontaneous emission with good gain coefficients and low levels of optical losses (a few cm^-1). The amplification process is favored by high fiber quality and low self-absorption. These performance figures appear to be suitable for short-distance operations, and the large variety of commercially available doping dyes might allow for effective multiwavelength operation by electrospun amplified fiber optics.
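As a back-of-the-envelope companion to the gain and loss coefficients quoted above (in cm^-1), the sketch below computes single-pass net gain in dB; the specific coefficient and length values are invented and do not come from the paper.

```python
# Single-pass net optical gain in dB: G = exp((g - alpha) * L), converted to dB.
# Gain coefficient g and loss coefficient alpha are in cm^-1, length L in cm.
import math

def net_gain_dB(gain_coeff, loss_coeff, length_cm):
    return 10.0 * math.log10(math.exp((gain_coeff - loss_coeff) * length_cm))

# Hypothetical values: g = 8 cm^-1, losses of a few cm^-1, a 5 mm fiber span.
print(f"{net_gain_dB(8.0, 3.0, 0.5):.1f} dB")   # ≈ 10.9 dB
```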