Sample records for uhde-pfirrmann process

  1. The relationship between quantitative measures of disc height and disc signal intensity with Pfirrmann score of disc degeneration.

    PubMed

    Salamat, Sara; Hutchings, John; Kwong, Clemens; Magnussen, John; Hancock, Mark J

    2016-01-01

    To assess the relationship of quantitative measures of disc height and signal intensity with the Pfirrmann disc degeneration scoring system, and to test the inter-rater reliability of the quantitative measures. Participants were 76 people who had recently recovered from their last episode of acute low back pain and underwent an MRI scan on a single 3T machine. At all 380 lumbar discs, quantitative measures of disc height and signal intensity were made by 2 independent raters and compared to Pfirrmann scores from a single radiologist. For the quantitative measures of disc height and signal intensity, a "raw" score and 2 adjusted ratios were calculated, and the relationship with Pfirrmann scores was assessed. The inter-rater reliability of the quantitative measures was also investigated. There was a strong linear relationship between quantitative disc signal intensity and Pfirrmann scores for grades 1-4, but not between grades 4 and 5. For disc height, only Pfirrmann grade 5 had significantly reduced disc height compared to all other grades. Results were similar regardless of whether raw or adjusted scores were used. Inter-rater reliability for the quantitative measures was excellent (ICC > 0.97). Quantitative measures of disc signal intensity were strongly related to Pfirrmann scores from grade 1 to 4; however, disc height only differentiated between grade 4 and 5 Pfirrmann scores. Using adjusted ratios for quantitative measures of disc height or signal intensity did not significantly alter the relationship with Pfirrmann scores.

  2. Space Station Fisheye Fly-Through: UHD

    NASA Image and Video Library

    2016-10-27

    Join us for a fly-through of the International Space Station. Produced by Harmonic exclusively for NASA TV UHD, the footage was shot in Ultra High Definition (4K) using a fisheye lens for extreme focus and depth of field.

  3. Ultra-high definition (8K UHD) endoscope: our first clinical success.

    PubMed

    Yamashita, Hiromasa; Aoki, Hisae; Tanioka, Kenkichi; Mori, Toshiyuki; Chiba, Toshio

    2016-01-01

    We have started clinical application of 8K ultra-high definition (UHD; 7680 × 4320 pixels) imaging technology, which offers 16-fold higher resolution than the current 2K high-definition (HD; 1920 × 1080 pixels) technology, in an endoscope for advanced laparoscopic surgery. Based on preliminary testing experience and with subsequent technical and system improvements, we then performed two cases of cholecystectomy and achieved clinical success with an 8K UHD endoscopic system, which consisted of an 8K camera, a 30-degree angled rigid endoscope with a lens adapter, a pair of 300-W xenon light sources, an 85-inch 8K LCD and an 8K video recorder. These experimental and clinical studies demonstrated the engineering and clinical feasibility of the 8K UHD endoscope, giving us a positive outlook on its prospective use in clinical practice. 8K UHD endoscopy promises to open up new possibilities for intricate procedures, including anastomoses of thin nerves and blood vessels, as well as more confident surgical resection of a diversity of cancer tissues. Compared to the current 2K imaging technology, 8K endoscopic imaging is very likely to lead to major changes in the future of medical practice.
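
The "16-fold" claim follows directly from the pixel counts quoted in the abstract; a quick arithmetic check:

```python
# Pixel counts for the two formats quoted in the abstract
uhd_8k = 7680 * 4320   # 8K UHD
hd_2k = 1920 * 1080    # 2K "full HD"

ratio = uhd_8k / hd_2k
print(ratio)  # 16.0: 8K UHD carries 16 times the pixels of 2K HD
```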

  4. Engineering a Live UHD Program from the International Space Station

    NASA Technical Reports Server (NTRS)

    Grubbs, Rodney; George, Sandy

    2017-01-01

    The first-ever live downlink of Ultra-High Definition (UHD) video from the International Space Station (ISS) was the highlight of a “Super Session” at the National Association of Broadcasters (NAB) Show in April 2017. Ultra-High Definition is four times the resolution of “full HD” or “1080p” video. Also referred to as “4K”, the Ultra-High Definition video downlink from the ISS all the way to the Las Vegas Convention Center required considerable planning, pushed the limits of conventional video distribution from a spacecraft, and was the first use of High Efficiency Video Coding (HEVC) from a spacecraft. The live event at NAB will serve as a pathfinder for more routine downlinks of UHD as well as the use of HEVC for conventional HD downlinks to save bandwidth. A similar demonstration was conducted in 2006 with the Discovery Channel to demonstrate the ability to stream HDTV from the ISS. This paper describes the overall workflow and routing of the UHD video, how audio was synchronized even though the video and audio were received many seconds apart, and how the demonstration paves the way not only for more efficient video distribution from the ISS but also for more complex video distribution from deep space. The paper also describes how a “live” event was staged when the UHD video coming from the ISS had a latency of 10+ seconds. In addition, the paper touches on the unique collaboration between the inherently governmental aspects of the ISS, the commercial partners Amazon and Elemental, and the National Association of Broadcasters.

  5. The Pfirrmann classification of lumbar intervertebral disc degeneration: an independent inter- and intra-observer agreement assessment.

    PubMed

    Urrutia, Julio; Besa, Pablo; Campos, Mauricio; Cikutovic, Pablo; Cabezon, Mario; Molina, Marcelo; Cruz, Juan Pablo

    2016-09-01

    Grading intervertebral disc degeneration (IDD) is important in the evaluation of many degenerative conditions, including patients with low back pain. Magnetic resonance imaging (MRI) is considered the best imaging instrument to evaluate IDD. The Pfirrmann classification is commonly used to grade IDD; the authors who described this classification reported adequate agreement when using it, but independent agreement studies of this grading system have been scarce. The aim of this study was to perform an independent inter- and intra-observer agreement study using the Pfirrmann classification. T2-weighted sagittal images of 79 patients consecutively studied with lumbar spine MRI were classified with the Pfirrmann grading system by six evaluators (three spine surgeons and three radiologists). After a 6-week interval, the 79 cases were presented to the same evaluators in a random sequence for repeat evaluation. The intra-class correlation coefficient (ICC) and the weighted kappa (wκ) were used to determine inter- and intra-observer agreement. Inter-observer agreement was excellent, with an ICC = 0.94 (0.93-0.95) and wκ = 0.83 (0.74-0.91). There were no differences between spine surgeons and radiologists, and likewise no differences in agreement when evaluating the different lumbar discs. Most differences among observers were of only one grade. Intra-observer agreement was also excellent, with ICC = 0.86 (0.83-0.89) and wκ = 0.89 (0.85-0.93). In this independent study, the Pfirrmann classification demonstrated adequate agreement among different observers and by the same observer on separate occasions, and it facilitates communication between radiologists and spine surgeons.
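
The weighted kappa statistic used above for ordinal Pfirrmann grades can be sketched in a few lines. This is a generic illustration of the statistic, not the authors' code, and the quadratic weighting scheme is an assumption; the abstract does not say which weights were used.

```python
def weighted_kappa(rater1, rater2, categories=(1, 2, 3, 4, 5), weights="quadratic"):
    """Chance-corrected agreement between two raters' ordinal grades."""
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(rater1)
    # Observed joint distribution of grade pairs
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(rater1, rater2):
        obs[idx[a]][idx[b]] += 1.0 / n
    # Marginals give the joint distribution expected under chance agreement
    m1 = [sum(row) for row in obs]
    m2 = [sum(obs[i][j] for i in range(k)) for j in range(k)]
    num = den = 0.0
    for i in range(k):
        for j in range(k):
            d = abs(i - j) / (k - 1)
            w = d * d if weights == "quadratic" else d  # penalty grows with disagreement
            num += w * obs[i][j]
            den += w * m1[i] * m2[j]
    return 1.0 - num / den
```

Identical ratings give 1.0, and statistically independent ratings give a value near 0, which is why wκ values above 0.8, as reported here, are read as excellent agreement.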

  6. Usefulness of Four Different Echinococcus granulosus Recombinant Antigens for Serodiagnosis of Unilocular Hydatid Disease (UHD) and Postsurgical Follow-Up of Patients Treated for UHD

    PubMed Central

    Hernández-González, Ana; Muro, Antonio; Barrera, Inmaculada; Ramos, Guillermo; Orduña, Antonio; Siles-Lucas, Mar

    2008-01-01

    Four different recombinant antigens derived from Echinococcus granulosus, designated B1t, B2t, E14t, and C317, were tested with enzyme-linked immunosorbent assays (ELISAs) for the detection of specific immunoglobulin G (IgG) in patients with unilocular hydatid disease (UHD). The results were compared to those obtained with hydatid fluid and were subjected to receiver operating characteristic analysis. The diagnostic performance of the above-listed proteins was defined with respect to their specificity, sensitivity, and predictive values (PV); the influence of cyst location; and their usefulness in the follow-up of surgical treatment for UHD and in determining whether patients have been surgically cured of UHD. The best diagnostic results were obtained with the anti-B2t IgG ELISA, with 91.2% sensitivity, 93% specificity, and high positive and negative PV (89.4 and 94.2, respectively). In addition, this diagnostic tool proved useful for the follow-up of surgically treated UHD patients. The anti-B2t IgG ELISA may find an application in the serodiagnosis of UHD in clinical laboratories. PMID:17989342

  7. Cervical arthroplasty for moderate to severe disc degeneration: clinical and radiological assessments after a minimum follow-up of 18 months--Pfirrmann grade and cervical arthroplasty.

    PubMed

    Oh, Chang Hyun; Kim, Do Yeon; Ji, Gyu Yeul; Kim, Yeo Ju; Yoon, Seung Hwan; Hyun, Dongkeun; Kim, Eun Young; Park, Hyeonseon; Park, Hyeong-Chun

    2014-07-01

    Clinical outcomes and radiological results after cervical arthroplasty have been reported in many articles, yet relatively few studies have examined cervical arthroplasty in severe degenerative cervical disc disease. Sixty patients who underwent cervical arthroplasty (Mobi-C®) between April 2006 and November 2011, with a minimum follow-up of 18 months, were enrolled in this study. Patients were divided into two groups according to the Pfirrmann classification on preoperative cervical MR images: group A (Pfirrmann disc grade III, n=38) and group B (Pfirrmann disc grades IV or V, n=22). Visual analogue scale (VAS) scores for neck and arm pain, the modified Oswestry Disability Index (mODI) score, and radiological results including cervical range of motion (ROM) were assessed before and after surgery. Mean VAS and mODI scores decreased after surgery from 5.1 and 57.6 to 2.7 and 31.5 in group A, and from 6.1 and 59.9 to 3.7 and 38.4 in group B, respectively. In both groups, VAS and mODI scores improved significantly postoperatively (p<0.001), although no significant intergroup differences were found. Cervical dynamic ROM was preserved or gradually improved up to 18 months after cervical arthroplasty in both groups, and global, segmental and adjacent ROM was similar for both groups during follow-up. No cases of device subsidence or extrusion were recorded. Clinical and radiological results following cervical arthroplasty in patients with severe degenerative cervical disc disease did not differ from those in patients with mild degenerative cervical disc disease after 18 months of follow-up.

  8. Joint denoising, demosaicing, and chromatic aberration correction for UHD video

    NASA Astrophysics Data System (ADS)

    Jovanov, Ljubomir; Philips, Wilfried; Damstra, Klaas Jan; Ellenbroek, Frank

    2017-09-01

    High-resolution video capture is crucial for numerous applications such as surveillance, security, industrial inspection, medical imaging and digital entertainment. Over the last two decades we have witnessed a dramatic increase in the spatial resolution and maximal frame rate of video capture devices, and further increases in resolution pose numerous challenges. As pixel size shrinks, each pixel gathers less light, raising the noise level. The smaller pixel size also makes lens imperfections more pronounced, which applies especially to chromatic aberrations; even when high-quality lenses are used, some chromatic aberration artefacts remain. Noise increases further at higher frame rates. To reduce the complexity and price of the camera, a single sensor captures all three colors by relying on a Color Filter Array, so to obtain a full-resolution color image the missing color components have to be interpolated, i.e. demosaicked, a task more challenging than at lower resolutions because of the increased noise and aberrations. In this paper, we propose a new method that jointly performs chromatic aberration correction, denoising and demosaicking. By reducing all artefacts jointly, we reduce the overall complexity of the system and the introduction of new artefacts. To suppress possible flicker we also perform temporal video enhancement. We evaluate the proposed method on a number of publicly available UHD sequences and on sequences recorded in our studio.
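
The Color Filter Array sampling described above can be illustrated schematically. The snippet below simulates an RGGB Bayer mosaic from a full-color image; it is a hypothetical sketch of the sampling step only, not the authors' joint reconstruction method.

```python
def bayer_mosaic(rgb):
    """Sample an H x W x 3 image (nested lists) through an RGGB Bayer pattern.

    Each sensor pixel keeps exactly one color channel; recovering the two
    missing channels per pixel is the demosaicking problem discussed above.
    """
    h, w = len(rgb), len(rgb[0])
    mosaic = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if y % 2 == 0 and x % 2 == 0:
                c = 0  # red site
            elif y % 2 == 1 and x % 2 == 1:
                c = 2  # blue site
            else:
                c = 1  # green sites (half of all pixels)
            mosaic[y][x] = rgb[y][x][c]
    return mosaic
```

Two thirds of the color information is discarded at capture, which is why demosaicking quality, and its interaction with noise and aberrations, matters so much at UHD resolutions.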

  9. Automated assembly of camera modules using active alignment with up to six degrees of freedom

    NASA Astrophysics Data System (ADS)

    Bräuniger, K.; Stickler, D.; Winters, D.; Volmer, C.; Jahn, M.; Krey, S.

    2014-03-01

    With the advent of Ultra High Definition (UHD) cameras, accurate alignment of the optical system with respect to the UHD image sensor becomes increasingly important. Even with a perfect objective lens, image quality will deteriorate when the lens is poorly aligned to the sensor. The Modulation Transfer Function (MTF) is the most widely accepted test for evaluating imaging quality. The first part describes how the alignment errors that lead to low imaging quality can be measured. Collimators with crosshairs at defined field positions, or a test chart, are used as object generators for infinite-finite or finite-finite conjugation, respectively. The process of accurately aligning the image sensor to the optical system is then described: the focus position, shift, tilt and rotation of the image sensor are automatically corrected to obtain an optimized MTF at all field positions, including the center. The software algorithm that grabs images, calculates the MTF and adjusts the image sensor in six degrees of freedom within less than 30 seconds per UHD camera module is described. The resulting accuracy of the image sensor rotation is better than 2 arcmin, and the positional alignment accuracy in x, y and z is better than 2 μm. Finally, the process of gluing and UV-curing, and how it is managed within the integrated process, is described.

  10. The QoE implications of ultra-high definition video adaptation strategies

    NASA Astrophysics Data System (ADS)

    Nightingale, James; Awobuluyi, Olatunde; Wang, Qi; Alcaraz-Calero, Jose M.; Grecos, Christos

    2016-04-01

    As the capabilities of high-end consumer devices increase, streaming and playback of Ultra-High Definition (UHD) video is set to become commonplace. The move to these new, higher-resolution video services is one of the main factors contributing to the predicted continued growth of video-related traffic on the Internet. This massive increase in bandwidth requirements, even when mitigated by the use of new video compression standards such as H.265, will place an ever-increasing burden on network service providers. This will be especially true in mobile environments, where users have come to expect ubiquitous access to content. Consequently, delivering UHD and Full UHD (FUHD) video content is one of the key drivers for future Fifth Generation (5G) mobile networks. One often-voiced but as yet unanswered question is whether users of mobile devices with modest screen sizes (e.g. smartphones or smaller tablets) will actually benefit, in terms of an improved user experience, from consuming the much higher bandwidth required to watch online UHD video. In this paper, we use scalable H.265-encoded video streams to conduct a subjective evaluation of the impact on a user's perception of video quality across a comprehensive range of adaptation strategies, covering each of the three adaptation domains, for UHD and FUHD video. The results of our subjective study provide insightful and useful indications of which methods of adapting UHD and FUHD streams have the least impact on users' perceived QoE. In particular, it was observed that, in over 70% of cases, users were unable to distinguish between full HD (1080p) and UHD (4K) videos when they were unaware of which version was being shown to them. Our results from this evaluation can be used to provide adaptation rule sets that will facilitate fast, QoE-aware in-network adaptation of video streams in support of real-time adaptation objectives. Undoubtedly they will also promote discussion around how network service providers manage

  11. Quantitative T2 evaluation at 3.0T compared to morphological grading of the lumbar intervertebral disc: a standardized evaluation approach in patients with low back pain.

    PubMed

    Stelzeneder, David; Welsch, Goetz Hannes; Kovács, Balázs Krisztián; Goed, Sabine; Paternostro-Sluga, Tatjana; Vlychou, Marianna; Friedrich, Klaus; Mamisch, Tallal Charles; Trattnig, Siegfried

    2012-02-01

    The purpose of our investigation was to compare quantitative T2 relaxation time measurement evaluation of lumbar intervertebral discs with morphological grading in young to middle-aged patients with low back pain, using a standardized region-of-interest evaluation approach. Three hundred thirty lumbar discs from 66 patients (mean age, 39 years) with low back pain were examined on a 3.0T MR unit. Sagittal T1-FSE, sagittal, coronal, and axial T2-weighted FSE for morphological MRI, as well as a multi-echo spin-echo sequence for T2 mapping, were performed. Morphologically, all discs were classified according to Pfirrmann et al. Equally sized rectangular regions of interest (ROIs) for the annulus fibrosus were selected anteriorly and posteriorly in the outermost 20% of the disc. The space between was defined as the nucleus pulposus. To assess the reproducibility of this evaluation, inter- and intraobserver statistics were performed. The Pfirrmann scoring of 330 discs showed the following results: grade I: six discs (1.8%); grade II: 189 (57.3%); grade III: 96 (29.1%); grade IV: 38 (11.5%); and grade V: one (0.3%). The mean T2 values (in milliseconds) for the anterior and the posterior annulus, and the nucleus pulposus for the respective Pfirrmann groups were: I: 57/30/239; II: 44/67/129; III: 42/51/82; and IV: 42/44/56. The nucleus pulposus T2 values showed a stepwise decrease from Pfirrmann grade I to IV. The posterior annulus showed the highest T2 values in Pfirrmann group II, while the anterior annulus showed relatively constant T2 values in all Pfirrmann groups. The inter- and intraobserver analysis yielded intraclass correlation coefficients (ICC) for average measures in a range from 0.82 (anterior annulus) to 0.99 (nucleus). Our standardized method of region-specific quantitative T2 relaxation time evaluation seems to be able to characterize different degrees of disc degeneration quantitatively. The reproducibility of our ROI measurements is sufficient to
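
The region-of-interest scheme described above (the outermost 20% of the disc as anterior and posterior annulus, the space between as nucleus) can be sketched on a one-dimensional anterior-to-posterior T2 profile. The function below is an illustrative assumption about how such a split could be coded, not the authors' implementation:

```python
def split_disc_rois(t2_profile, annulus_fraction=0.20):
    """Split an anterior-to-posterior list of T2 values into three ROIs."""
    n = len(t2_profile)
    k = max(1, round(annulus_fraction * n))  # width of each annulus ROI
    anterior_af = t2_profile[:k]
    posterior_af = t2_profile[n - k:]
    nucleus = t2_profile[k:n - k]            # everything in between
    return anterior_af, nucleus, posterior_af
```

A standardized split like this is what makes the per-region T2 averages (anterior annulus / nucleus / posterior annulus) reproducible across raters.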

  12. Experimental physical methods and theories--then and now.

    PubMed

    Schulte, Jurgen

    2015-10-01

    A first evaluation of fundamental research into the physics and physiology of ultra-high dilutions (UHDs) was conducted by the author in 1994(1). In this paper we revisit methods and theories from that time and follow their paths through their evolution and contribution to new knowledge in UHD research since then. The physical methods and theories discussed in our 1994 anthology on UHD(1) form the basis for tracing ideas and findings along their path of further development and impact on new knowledge in UHD. Experimental approaches to probing physical changes in homeopathic preparations have become more sophisticated over the past two decades, as has the desire to report results to a scientific standard on par with the specialist literature. The same cannot be said about the underlying theoretical models and simulations. Grand challenges in science often take a more targeted and concerted approach: formulate a research question, then look for answers. A concerted effort to focus on one hypothesized physical aspect of a well-defined homeopathic preparation may help align experimental methods with theoretical models and, in doing so, help to gain a deeper understanding of the whole body of insights and data produced. Copyright © 2015 The Faculty of Homeopathy. Published by Elsevier Ltd. All rights reserved.

  13. Stunning Aurora Borealis from Space - Ultra-High Definition 4K

    NASA Image and Video Library

    2016-04-17

    NASA Television’s newest offering, NASA TV UHD, brings ultra-high definition video to a new level with the kind of imagery only the world’s leader in space exploration could provide. Harmonic produced this show exclusively for NASA TV UHD, using time-lapses shot from the International Space Station showing both the Aurora Borealis and Aurora Australis, phenomena that occur when electrically charged particles (electrons and protons) in the Earth's magnetic field collide with neutral atoms in the upper atmosphere.

  14. Quantitative T2 Magnetic Resonance Imaging Compared to Morphological Grading of the Early Cervical Intervertebral Disc Degeneration: An Evaluation Approach in Asymptomatic Young Adults

    PubMed Central

    Han, Zhihua; Shao, Lixin; Xie, Yan; Wu, Jianhong; Zhang, Yan; Xin, Hongkui; Ren, Aijun; Guo, Yong; Wang, Deli; He, Qing; Ruan, Dike

    2014-01-01

    Objective The objective of this study was to evaluate the efficacy of quantitative T2 magnetic resonance imaging (MRI) for quantifying early cervical intervertebral disc (IVD) degeneration in asymptomatic young adults by correlating the T2 value with Pfirrmann grade, sex, and anatomic level. Methods Seventy asymptomatic young subjects (34 men and 36 women; mean age, 22.80±2.11 yr; range, 18–25 years) underwent 3.0-T MRI to obtain morphological data (one T1-fast spin echo (FSE) and three-plane T2-FSE, used to assign a Pfirrmann grade (I–V)) and for T2 mapping (multi-echo spin echo). T2 values in the nucleus pulposus (NP, n = 350) and anulus fibrosus (AF, n = 700) were obtained. Differences in T2 values between sexes and anatomic levels were evaluated, and linear correlation analysis of T2 values versus degenerative grade was conducted. Findings Cervical IVDs of healthy young adults were commonly determined to be at Pfirrmann grades I and II. T2 values of NPs were significantly higher than those of AF at all anatomic levels (P<0.000). The NP, anterior AF and posterior AF values did not differ significantly between genders at the same anatomic level (P>0.05). T2 values decreased linearly with degenerative grade. Linear correlation analysis revealed a strong negative association between the Pfirrmann grade and the T2 values of the NP (P = 0.000) but not the T2 values of the AF (P = 0.854). However, non-degenerated discs (Pfirrmann grades I and II) showed a wide range of T2 relaxation times. T2 values according to disc degeneration level classification were as follows: grade I (>62.03 ms), grade II (54.60–62.03 ms), grade III (<54.60 ms). Conclusions T2 quantitation provides a more sensitive and robust approach for detecting and characterizing the early stage of cervical IVD degeneration, and for establishing a reliable quantitative reference in healthy young adults. PMID:24498384

  15. Quantitative T2 magnetic resonance imaging compared to morphological grading of the early cervical intervertebral disc degeneration: an evaluation approach in asymptomatic young adults.

    PubMed

    Chen, Chun; Huang, Minghua; Han, Zhihua; Shao, Lixin; Xie, Yan; Wu, Jianhong; Zhang, Yan; Xin, Hongkui; Ren, Aijun; Guo, Yong; Wang, Deli; He, Qing; Ruan, Dike

    2014-01-01

    The objective of this study was to evaluate the efficacy of quantitative T2 magnetic resonance imaging (MRI) for quantifying early cervical intervertebral disc (IVD) degeneration in asymptomatic young adults by correlating the T2 value with Pfirrmann grade, sex, and anatomic level. Seventy asymptomatic young subjects (34 men and 36 women; mean age, 22.80±2.11 yr; range, 18-25 years) underwent 3.0-T MRI to obtain morphological data (one T1-fast spin echo (FSE) and three-plane T2-FSE, used to assign a Pfirrmann grade (I-V)) and for T2 mapping (multi-echo spin echo). T2 values in the nucleus pulposus (NP, n = 350) and anulus fibrosus (AF, n = 700) were obtained. Differences in T2 values between sexes and anatomic levels were evaluated, and linear correlation analysis of T2 values versus degenerative grade was conducted. Cervical IVDs of healthy young adults were commonly determined to be at Pfirrmann grades I and II. T2 values of NPs were significantly higher than those of AF at all anatomic levels (P<0.000). The NP, anterior AF and posterior AF values did not differ significantly between genders at the same anatomic level (P>0.05). T2 values decreased linearly with degenerative grade. Linear correlation analysis revealed a strong negative association between the Pfirrmann grade and the T2 values of the NP (P = 0.000) but not the T2 values of the AF (P = 0.854). However, non-degenerated discs (Pfirrmann grades I and II) showed a wide range of T2 relaxation times. T2 values according to disc degeneration level classification were as follows: grade I (>62.03 ms), grade II (54.60-62.03 ms), grade III (<54.60 ms). T2 quantitation provides a more sensitive and robust approach for detecting and characterizing the early stage of cervical IVD degeneration, and for establishing a reliable quantitative reference in healthy young adults.
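
The reported nucleus pulposus T2 cut-offs translate into a simple classifier. A minimal sketch using the thresholds quoted in the abstract; the handling of values exactly at 54.60 and 62.03 ms is an assumption, since the abstract leaves the boundaries ambiguous:

```python
def pfirrmann_grade_from_t2(t2_ms):
    """Map a nucleus pulposus T2 value (ms) to an early Pfirrmann grade."""
    if t2_ms > 62.03:
        return 1  # grade I
    if t2_ms >= 54.60:
        return 2  # grade II
    return 3      # grade III (or worse)
```

Note the abstract also cautions that non-degenerated discs show a wide range of T2 times, so such thresholds should be read as study-specific, not universal.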

  16. Live Ultra-High Definition from the International Space Station

    NASA Technical Reports Server (NTRS)

    Grubbs, Rodney; George, Sandy

    2017-01-01

    The first-ever live downlink of Ultra-High Definition (UHD) video from the International Space Station (ISS) was the highlight of a 'Super Session' at the National Association of Broadcasters (NAB) in April 2017. The Ultra-High Definition video downlink from the ISS all the way to the Las Vegas Convention Center required considerable planning, pushed the limits of conventional video distribution from a spacecraft, and was the first use of High Efficiency Video Coding (HEVC) from a spacecraft. The live event at NAB will serve as a pathfinder for more routine downlinks of UHD as well as the use of HEVC for conventional HD downlinks to save bandwidth. HEVC may also enable live Virtual Reality video downlinks from the ISS. This paper describes the overall workflow and routing of the UHD video, how audio was synchronized even though the video and audio were received many seconds apart, and how the demonstration paves the way not only for more efficient video distribution from the ISS but also for more complex video distribution from deep space. The paper also describes how a 'live' event was staged when the UHD video coming from the ISS had a latency of 10+ seconds. Finally, the paper discusses how NASA is leveraging commercial technologies for use on orbit, in contrast to creating technology as was required during the Apollo Moon Program and the early space age.

  17. Noninvasive Assessment of Biochemical and Mechanical Properties of Lumbar Discs Through Quantitative Magnetic Resonance Imaging in Asymptomatic Volunteers.

    PubMed

    Foltz, Mary H; Kage, Craig C; Johnson, Casey P; Ellingson, Arin M

    2017-11-01

    Intervertebral disc degeneration is a prevalent phenomenon associated with back pain. It is of critical clinical interest to discriminate disc health and identify early stages of degeneration. Traditional clinical T2-weighted magnetic resonance imaging (MRI), assessed using the Pfirrmann classification system, is subjective and fails to adequately capture initial degenerative changes. Emerging quantitative MRI techniques offer a solution. Specifically, T2* mapping images water mobility in the macromolecular network, and our preliminary ex vivo work shows high predictability of the disc's glycosaminoglycan content (s-GAG) and residual mechanics. The present study expands upon this work to predict the biochemical and biomechanical properties in vivo and assess their relationship with both age and Pfirrmann grade. Eleven asymptomatic subjects (range: 18-62 yrs) were enrolled and imaged using a 3T MRI scanner. T2-weighted images (Pfirrmann grade) and quantitative T2* maps (predict s-GAG and residual stress) were acquired. Surface maps based on the distribution of these properties were generated and integrated to quantify the surface volume. Correlational analyses were conducted to establish the relationship between each metric of disc health derived from the quantitative T2* maps with both age and Pfirrmann grade, where an inverse trend was observed. Furthermore, the nucleus pulposus (NP) signal in conjunction with volumetric surface maps provided the ability to discern differences during initial stages of disc degeneration. This study highlights the ability of T2* mapping to noninvasively assess the s-GAG content, residual stress, and distributions throughout the entire disc, which may provide a powerful diagnostic tool for disc health assessment.

  18. Uniqueness of the anterior dentition three-dimensionally assessed for forensic bitemark analysis.

    PubMed

    Franco, A; Willems, G; Souza, Phc; Coucke, W; Thevissen, P

    2017-02-01

    The uniqueness of the human dentition (UHD) is an important concept in the comparative process of bitemark analysis, in which the incisal edges of a suspect's teeth are matched with bitemarks collected from a victim's body or a crime scene. Despite playing an essential part in excluding suspects, the UHD contained in the involved incisal tooth edges remains an assumption at the bitemark level. The present study aimed, first, to investigate three-dimensionally (3D) the UHD within different quantities of dental material from the incisal edges and, second, to test these outcomes in a two-dimensional (2D) simulation. Four hundred forty-five dental casts were collected to compose 4 study groups: I - randomly selected subjects; II - orthodontically treated subjects; III - twins; and IV - orthodontically treated twins. Additionally, 20 dental casts were included to create threshold groups from subjects whose dental impressions were taken at 2 different moments (Group V). All dental casts were digitized with an automated motion device (XCAD 3D®, XCADCAM Technology®, São Paulo, SP, Brazil). The digital cast files (DCF) were imported into the Geomagic Studio® (3D Systems®, Rock Hill, SC, USA) software package (GS) for cropping, automated superimposition and pair-wise comparisons. All DCF were cropped, retaining 3 mm (part 1), 2 mm (part 2) and 1 mm (part 3) from the incisal edges of the anterior teeth. For 2D validation, slices of 1 mm not including the incisal edges (part 4) were also cropped. These procedures were repeated in Group V, creating specific thresholds for each of the study parts. The 4 study groups were compared with their respective thresholds using an ANOVA test at a 5% significance level. Groups I, II and III did not differ from the corresponding threshold (Group V) in any study part (p > 0.05). Scientific evidence to support the UHD was not observed in the current study. Bitemark analysis should not be disregarded

  19. Experimental demonstration of OpenFlow-enabled media ecosystem architecture for high-end applications over metro and core networks.

    PubMed

    Ntofon, Okung-Dike; Channegowda, Mayur P; Efstathiou, Nikolaos; Rashidi Fard, Mehdi; Nejabati, Reza; Hunter, David K; Simeonidou, Dimitra

    2013-02-25

    In this paper, a novel Software-Defined Networking (SDN) architecture is proposed for high-end Ultra High Definition (UHD) media applications. UHD media applications require huge amounts of bandwidth that can only be met with high-capacity optical networks. In addition, there are requirements for control frameworks capable of delivering effective application performance with efficient network utilization. A novel SDN-based Controller that tightly integrates application-awareness with network control and management is proposed for such applications. An OpenFlow-enabled test-bed demonstrator is reported with performance evaluations of advanced online and offline media- and network-aware schedulers.

  20. Correlation Between Expression of High Temperature Requirement Serine Protease A1 (HtrA1) in Nucleus Pulposus and T2 Value of Magnetic Resonance Imaging.

    PubMed

    Li, Dapeng; Yue, Jiawei; Jiang, Lu; Huang, Yonghui; Sun, Jifu; Wu, Yan

    2017-04-22

    BACKGROUND Degrading enzymes play an important role in the process of disc degeneration. The objective of this study was to investigate the correlation between the expression of high temperature requirement serine protease A1 (HtrA1) in the nucleus pulposus and the T2 value of the nucleus pulposus region in magnetic resonance imaging (MRI). MATERIAL AND METHODS Thirty-six patients who had undergone surgical excision of the nucleus pulposus were examined by MRI before surgery. Pfirrmann grading of the target intervertebral disc was performed according to the sagittal T2-weighted imaging, and the T2 value of the target nucleus pulposus was measured according to the median sagittal T2 mapping. The correlation between the Pfirrmann grade and the T2 value was analyzed. The expression of HtrA1 in the nucleus pulposus was analyzed by RT-PCR and Western blot. The correlation between the expression of HtrA1 and the T2 value was analyzed. RESULTS The T2 value of the nucleus pulposus region was 33.11-167.91 ms, with an average of 86.64±38.73 ms. According to Spearman correlation analysis, there was a rank correlation between T2 value and Pfirrmann grade (P<0.0001), and the correlation coefficient (rs)=-0.93617. There was a linear correlation between the mRNA level of HtrA1 and T2 value in nucleus pulposus tissues (a=3.88, b=-0.019, F=112.63, P<0.0001), normalized regression coefficient=-0.88. There was a linear correlation between the expression level of HtrA1 protein and the T2 value in the nucleus pulposus tissues (a=3.30, b=-0.016, F=93.15, P<0.0001) and normalized regression coefficient=-0.86. CONCLUSIONS The expression of HtrA1 was strongly related to the T2 value, suggesting that HtrA1 plays an important role in the pathological process of intervertebral disc degeneration.
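The two statistical steps in this abstract, a Spearman rank correlation of T2 against grade and a linear regression of HtrA1 level on T2, can be sketched with SciPy; the synthetic data below only imitates the reported ranges and is not the study's measurements:

```python
import numpy as np
from scipy.stats import spearmanr, linregress

rng = np.random.default_rng(0)
# Synthetic stand-ins: T2 values (ms) in the reported 33-168 ms range
# and a negatively correlated "expression level" imitating the
# reported fit a=3.88, b=-0.019 (illustration only).
t2 = rng.uniform(33, 168, size=36)
expression = 3.88 - 0.019 * t2 + rng.normal(0, 0.15, size=36)

rho, p_rank = spearmanr(expression, t2)   # rank (Spearman) correlation
fit = linregress(t2, expression)          # linear model: a + b * t2

print(f"Spearman rho = {rho:.2f} (p = {p_rank:.3g})")
print(f"intercept a = {fit.intercept:.2f}, slope b = {fit.slope:.4f}")
```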

  1. 3D Reconstruction from UAV-Based Hyperspectral Images

    NASA Astrophysics Data System (ADS)

    Liu, L.; Xu, L.; Peng, J.

    2018-04-01

    Reconstructing a 3D profile from a set of UAV-based images yields hyperspectral information as well as the 3D coordinates of any point on the profile. Our images were captured with the Cubert UHD185 (UHD) hyperspectral camera, a new type of high-speed onboard imaging spectrometer that acquires a hyperspectral image and a panchromatic image simultaneously. The panchromatic image has a higher spatial resolution than the hyperspectral image, but each hyperspectral image provides considerable information on the spatial spectral distribution of the object. There is therefore an opportunity to derive a high-quality 3D point cloud from the panchromatic images and rich spectral information from the hyperspectral images. The purpose of this paper is to introduce our processing chain, which derives a database providing hyperspectral information and the 3D position of each point. First, we adopt VisualSFM, a free and open-source software package based on the structure-from-motion (SfM) algorithm, to recover a 3D point cloud from the panchromatic images; we then obtain the spectral information of each point from the hyperspectral images with a self-developed program written in MATLAB. The product can be used to support further research and applications.
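The second step of the chain, attaching a spectrum to each reconstructed point, can be sketched by scaling the point's panchromatic pixel coordinates down to the hyperspectral grid. The 4× resolution ratio, the grid size and the band count below are illustrative assumptions, not the UHD185's exact figures:

```python
import numpy as np

rng = np.random.default_rng(7)
cube = rng.random((250, 250, 125))   # hyperspectral cube: rows, cols, bands

def spectrum_at(pan_rc, scale=4):
    """Sample the hyperspectral cube at a panchromatic pixel (row, col).

    Assumes the panchromatic image is `scale` times the hyperspectral
    resolution; coordinates are clamped to the cube's bounds.
    """
    r, c = (min(int(v // scale), dim - 1)
            for v, dim in zip(pan_rc, cube.shape[:2]))
    return cube[r, c, :]

spec = spectrum_at((999, 123))   # a point's pixel in a 1000x1000 pan image
print(spec.shape)                # (125,)
```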

  2. An efficient interpolation filter VLSI architecture for HEVC standard

    NASA Astrophysics Data System (ADS)

    Zhou, Wei; Zhou, Xin; Lian, Xiaocong; Liu, Zhenyu; Liu, Xiaoxiang

    2015-12-01

    The next-generation video coding standard, High-Efficiency Video Coding (HEVC), is especially efficient for coding high-resolution video such as 8K ultra-high-definition (UHD) video. Fractional motion estimation in HEVC presents a significant challenge in clock latency and area cost, as it consumes more than 40% of the total encoding time and thus results in high computational complexity. With the aim of supporting 8K-UHD video applications, an efficient interpolation filter VLSI architecture for HEVC is proposed in this paper. First, a new interpolation filter algorithm based on an 8-pixel interpolation unit is proposed; it saves 19.7% of processing time on average with acceptable coding-quality degradation. Based on the proposed algorithm, an efficient interpolation filter VLSI architecture, composed of a reused interpolation data path, an efficient memory organization, and a reconfigurable pipelined interpolation filter engine, is presented to reduce the hardware area and achieve high throughput. The final VLSI implementation requires only 37.2k gates in a standard 90-nm CMOS technology at an operating frequency of 240 MHz. The proposed architecture can be reused for either half-pixel or quarter-pixel interpolation, reducing the area cost by about 131,040 bits of RAM. The processing latency of the proposed VLSI architecture supports real-time processing of 4:2:0 format 7680 × 4320@78fps video sequences.
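For reference, HEVC's luma half-sample interpolation is an 8-tap FIR filter normalized by 64; a minimal scalar sketch is below. Border handling is simplified here to edge replication, which approximates but does not exactly reproduce the standard's reference-sample clipping:

```python
import numpy as np

# HEVC luma half-sample interpolation filter coefficients (8-tap),
# normalized by 64 after filtering.
HALF_PEL = np.array([-1, 4, -11, 40, 40, -11, 4, -1])

def interp_half_pel(row: np.ndarray) -> np.ndarray:
    """Horizontal half-pel samples for one row of 8-bit luma pixels."""
    padded = np.pad(row.astype(np.int32), (3, 4), mode="edge")
    out = np.array([np.dot(HALF_PEL, padded[i:i + 8])
                    for i in range(len(row))])
    # Round, normalize by 64, clamp to the 8-bit sample range.
    return np.clip((out + 32) >> 6, 0, 255)

print(interp_half_pel(np.array([100, 100, 100, 100, 100, 100])))
```

On a flat row the filter is the identity (its taps sum to 64), which is a quick sanity check of the coefficients.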

  3. Single-layer HDR video coding with SDR backward compatibility

    NASA Astrophysics Data System (ADS)

    Lasserre, S.; François, E.; Le Léannec, F.; Touzé, D.

    2016-09-01

    The migration from High Definition (HD) TV to Ultra High Definition (UHD) is already underway. In addition to an increase in picture spatial resolution, UHD will bring more color and higher contrast by introducing Wide Color Gamut (WCG) and High Dynamic Range (HDR) video. As both Standard Dynamic Range (SDR) and HDR devices will coexist in the ecosystem, the transition from SDR to HDR will require distribution solutions supporting some level of backward compatibility. This paper presents a new HDR content distribution scheme, named SL-HDR1, using a single-layer codec design and providing SDR compatibility. The solution is based on a pre-encoding HDR-to-SDR conversion that generates a backward-compatible SDR video with dynamic side metadata. The resulting SDR video is then compressed, distributed and decoded using standard-compliant decoders (e.g. HEVC Main 10 compliant). The decoded SDR video can be rendered directly on SDR displays without adaptation. Dynamic metadata of limited size are generated by the pre-processing and used to reconstruct the HDR signal from the decoded SDR video, using a post-processing that is the functional inverse of the pre-processing. Both HDR quality and artistic intent are preserved. Pre- and post-processing are applied independently per picture, do not involve any inter-pixel dependency, and are codec agnostic. Compression performance and SDR quality are shown to be solidly improved compared with the non-backward-compatible and backward-compatible approaches using the Perceptual Quantization (PQ) and Hybrid Log-Gamma (HLG) Opto-Electronic Transfer Functions (OETFs), respectively.
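The core idea of a single-layer scheme with dynamic metadata can be illustrated with a toy, exactly invertible tone map. This is a didactic stand-in, not the actual SL-HDR1 pre/post-processing:

```python
import numpy as np

def to_sdr(hdr, peak):
    """Toy tone map: compress linear HDR (nits) toward a 100-nit SDR range.

    The per-picture exponent plays the role of the "dynamic metadata"
    carried alongside the SDR stream. Pixel-wise, no inter-pixel deps.
    """
    gamma = np.log(100.0) / np.log(peak)
    return hdr ** gamma, gamma

def to_hdr(sdr, gamma):
    """Post-processing: the functional inverse of to_sdr."""
    return sdr ** (1.0 / gamma)

hdr = np.array([1.0, 250.0, 1000.0])          # linear luminance, nits
sdr, meta = to_sdr(hdr, peak=1000.0)          # backward-compatible SDR
restored = to_hdr(sdr, meta)                  # HDR reconstruction
print(np.allclose(restored, hdr))             # True (absent compression)
```

In the real scheme the lossy codec sits between the two steps, so reconstruction fidelity also depends on the compression applied to the SDR layer.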

  4. Level of Education as a Risk Factor for Extensive Prevalence of Cervical Intervertebral Disc Degenerative Changes and Chronic Neck Pain.

    PubMed

    Markotić, Vedran; Zubac, Damir; Miljko, Miro; Šimić, Goran; Zalihić, Amra; Bogdan, Gojko; Radančević, Dorijan; Šimić, Ana Dugandžić; Mašković, Josip

    2017-09-01

    The aim of this study was to document the prevalence of degenerative intervertebral disc changes in patients who previously reported symptoms of neck pain and to determine the influence of education level on degenerative intervertebral disc changes and subsequent chronic neck pain. One hundred and twelve patients (aged 48.5±12.7 years) were randomly selected from the University Hospital in Mostar, Bosnia and Herzegovina, and submitted to magnetic resonance imaging (MRI) of the cervical spine. A 3.0-T MRI scanner (Siemens Skyra, Erlangen, Germany) was used to obtain cervical spine images. Patients were separated into two groups based on their education level: low education level (LLE) and high education level (HLE). The Pfirrmann classification was used to document intervertebral disc degeneration, while self-reported chronic neck pain was evaluated using the previously validated Oswestry questionnaire. The full logistic regression model containing all predictors was statistically significant (χ²(3)=12.2, p=0.02) and was able to distinguish between respondents with and without chronic neck pain. The model explained between 10.0% (Cox-Snell R²) and 13.8% (Nagelkerke R²) of common variance with the Pfirrmann classification, and it correctly classified 69.6% of patients. The probability of a patient being classified in the high or low group of degenerative disc changes according to the Pfirrmann scale was associated with education level (Wald test: 5.5, p=0.02). Based on the Pfirrmann assessment scale, the HLE group differed significantly from the LLE group in the degree of degenerative changes of the cervical intervertebral discs (U=1,077.5, p=0.001). A moderate level of intervertebral disc degenerative changes (grades II and III) was equally distributed among all patients, while the overall results suggest a higher level of education as a risk factor for cervical disc degenerative changes, regardless of age

  5. Preliminary experience with 4K ultra-high definition endoscope: analysis of pros and cons in skull base surgery.

    PubMed

    Rigante, M; La Rocca, G; Lauretti, L; D'Alessandris, G Q; Mangiola, A; Anile, C; Olivi, A; Paludetti, G

    2017-06-01

    During the last two decades, endoscopic skull base surgery has seen continuous technical and technological development. 3D endoscopy and high-definition (HD) endoscopy have provided great advances in terms of visualisation and spatial resolution. Ultra-high-definition (UHD) 4K systems, recently introduced into clinical practice, will shape the next steps forward, especially in the skull base surgery field. Patients were operated on through transnasal transsphenoidal endoscopic approaches performed using an Olympus NBI 4K UHD endoscope with a 4 mm 0° Ultra Telescope and a 300 W xenon lamp (CLV-S400) predisposed for narrow band imaging (NBI) technology, connected through a camera head to a high-quality control unit (OTV-S400 - VISERA 4K UHD) (Olympus Corporation, Tokyo, Japan). Two screens were used: one 31" monitor (LMD-X310S) and one main ultra-HD 55" screen optimised for UHD image reproduction (LMD-X550S). In selected cases, we used a navigation system (StealthStation S7, Medtronic, Minneapolis, MN, US). We evaluated 22 pituitary adenomas (86.3% macroadenomas; 13.7% microadenomas); 50% were non-functional (NF), 22.8% GH-, 18.2% ACTH- and 9% PRL-secreting. Three of the 22 were recurrences. In 91% of cases we achieved total removal, and in 9% near-total resection. Mean follow-up was 187 days, and average length of hospitalisation was 3.09 ± 0.61 days. Surgical duration was 128.18 ± 30.74 minutes. We experienced only 1 case of intraoperative low-flow fistula, with no further complications. None of the cases required any post- or intraoperative blood transfusion. The visualisation and high resolution of the operative field provided a very detailed view of all anatomical structures and pathologies, allowing an improvement in the safety and efficacy of the surgical procedure. The operative time was similar to that of standard 2D HD and 3D procedures, and the physical strain was comparable in terms of ergonomics and weight. © Copyright by Società Italiana di Otorinolaringologia

  6. Reliability of macroscopic grading of intervertebral disk degeneration in dogs by use of the Thompson system and comparison with low-field magnetic resonance imaging findings.

    PubMed

    Bergknut, Niklas; Grinwis, Guy; Pickee, Emile; Auriemma, Edoardo; Lagerstedt, Anne-Sofie; Hagman, Ragnvi; Hazewinkel, Herman A W; Meij, Björn P

    2011-07-01

    To evaluate the reliability of the Thompson system for grading the gross pathological changes of intervertebral disk (IVD) degeneration in dogs and to investigate the agreement between gross pathological findings and low-field (0.2-T) magnetic resonance imaging (MRI) findings. Vertebral columns from cadavers of 19 dogs of various ages, breeds, and origins; 182 intervertebral segments were collected. Sagittal T2-weighted MRI of the T11 through S1 portion of the vertebral column was performed within 24 hours after the dogs were euthanized. The vertebral columns were subsequently divided in the midsagittal plane, and high-resolution photographs were obtained of each intervertebral segment (end plate-disk-end plate). The MR images and photographs were graded separately in a blinded manner by 4 observers using both Pfirrmann and Thompson grading criteria. The interobserver agreement for Thompson scores ranged from 0.76 to 0.88, and the intraobserver agreement ranged from 0.88 to 0.94 (Cohen weighted κ analysis). Agreement between scores for the Pfirrmann and Thompson grading criteria was κ = 0.70. Grading of IVD degeneration in dogs by use of the Thompson system resulted in high interobserver and intraobserver agreement, and scores for the Thompson system had substantial agreement with low-field MRI findings graded by use of the Pfirrmann system. This suggests that low-field MRI can be used to diagnose IVD degeneration in dogs.
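Cohen's weighted κ, used above for inter- and intraobserver agreement, can be computed directly from two raters' grade vectors; a minimal implementation (linear or quadratic disagreement weights) on made-up grade lists:

```python
import numpy as np

def weighted_kappa(r1, r2, n_cat, weights="linear"):
    """Cohen's weighted kappa for two raters using grades 1..n_cat."""
    obs = np.zeros((n_cat, n_cat))
    for a, b in zip(r1, r2):
        obs[a - 1, b - 1] += 1
    obs /= obs.sum()                                   # observed proportions
    exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))   # chance expectation
    i, j = np.indices((n_cat, n_cat))
    w = np.abs(i - j) if weights == "linear" else (i - j) ** 2
    return 1.0 - (w * obs).sum() / (w * exp).sum()

# Perfect agreement gives kappa = 1; one near-miss lowers it slightly.
print(weighted_kappa([1, 2, 3, 4, 5], [1, 2, 3, 4, 5], 5))        # 1.0
print(weighted_kappa([1, 2, 3, 4, 5, 3], [1, 2, 3, 4, 4, 3], 5))
```

Unlike unweighted κ, near-miss grades (e.g. Thompson 4 vs. 5) are penalized less than distant disagreements, which suits ordinal scales like Thompson and Pfirrmann.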

  7. Comparison of pedicle screw-based dynamic stabilization and fusion surgery in the treatment of radiographic adjacent-segment degeneration: a retrospective analysis of single L5-S1 degenerative spondylosis covering 4 years.

    PubMed

    Han, Yu; Sun, Jianguang; Luo, Chenghan; Huang, Shilei; Li, Liren; Ji, Xiang; Duan, Xiaozong; Wang, Zhenqing; Pi, Guofu

    2016-12-01

    OBJECTIVE Pedicle screw-based dynamic spinal stabilization systems (PDSs) were devised to decrease, theoretically, the risk of long-term complications such as adjacent-segment degeneration (ASD) after lumbar fusion surgery. However, to date, there have been few studies that fully proved that a PDS can reduce the risk of ASD. The purpose of this study was to examine whether a PDS can influence the incidence of ASD and to discuss the surgical coping strategy for L5-S1 segmental spondylosis with preexisting L4-5 degeneration with no related symptoms or signs. METHODS This study retrospectively compared 62 cases of L5-S1 segmental spondylosis in patients who underwent posterior lumbar interbody fusion (n = 31) or K-Rod dynamic stabilization (n = 31) with a minimum of 4 years' follow-up. The authors measured the intervertebral heights and spinopelvic parameters on standing lateral radiographs and evaluated preexisting ASD on preoperative MR images using the modified Pfirrmann grading system. Radiographic ASD was evaluated according to the results of radiography during follow-up. RESULTS All 62 patients achieved remission of their neurological symptoms without surgical complications. The Kaplan-Meier curve and Cox proportional-hazards model showed no statistically significant differences between the 2 surgical groups in the incidence of radiographic ASD (p > 0.05). In contrast, the incidence of radiographic ASD was 8.75 times (95% CI 1.955-39.140; p = 0.005) higher in the patients with a preoperative modified Pfirrmann grade higher than 3 than it was in patients with a modified Pfirrmann grade of 3 or lower. In addition, no statistical significance was found for other risk factors such as age, sex, and spinopelvic parameters. CONCLUSIONS Pedicle screw-based dynamic spinal stabilization systems were not found to be superior to posterior lumbar interbody fusion in preventing radiographic ASD (L4-5) during the midterm follow-up. Preexisting ASD with a modified Pfirrmann

  8. High dynamic range subjective testing

    NASA Astrophysics Data System (ADS)

    Allan, Brahim; Nilsson, Mike

    2016-09-01

    This paper describes a set of subjective tests that the authors carried out to assess end-user perception of video encoded with High Dynamic Range technology when viewed in a typical home environment. Viewers scored individual single clips of content presented in High Definition (HD) and Ultra High Definition (UHD), in Standard Dynamic Range (SDR) and in High Dynamic Range (HDR) using both the Perceptual Quantizer (PQ) and Hybrid Log-Gamma (HLG) transfer characteristics, and presented in SDR as the backward-compatible rendering of the HLG representation. The quality of SDR HD was improved by approximately equal amounts by either increasing the dynamic range or increasing the resolution to UHD. A further, smaller increase in quality was observed in the viewers' Mean Opinion Scores when both the dynamic range and the resolution were increased, but this was not quite statistically significant.

  9. Characterization of high density SiPM non-linearity and energy resolution for prompt gamma imaging applications

    NASA Astrophysics Data System (ADS)

    Regazzoni, V.; Acerbi, F.; Cozzi, G.; Ferri, A.; Fiorini, C.; Paternoster, G.; Piemonte, C.; Rucatti, D.; Zappalà, G.; Zorzi, N.; Gola, A.

    2017-07-01

    Fondazione Bruno Kessler (FBK) (Trento, Italy) has recently introduced High Density (HD) and Ultra High Density (UHD) SiPMs, featuring very small micro-cell pitch. High cell density is a very important factor for improving the linearity of the SiPM in high-dynamic-range applications, such as scintillation light readout in high-energy gamma-ray spectroscopy and in prompt gamma imaging for proton therapy. The energy resolution at high energies is a trade-off between the excess noise factor caused by the non-linearity of the SiPM and the photon detection efficiency of the detector. To study these effects, we developed a new setup that simulates the LYSO light emission in response to gamma photons up to 30 MeV, using a pulsed light source. We measured the non-linearity and energy resolution vs. energy of the FBK RGB-HD and RGB-UHD SiPM technologies. We considered five different cell sizes, ranging from 10 μm up to 25 μm. With the UHD technology we observed a remarkable reduction of the SiPM non-linearity: less than 5% at 5 MeV with 10 μm cells, compared with a non-linearity of 50% for 25 μm-cell HD-SiPMs. With the same setup, we also measured the different components of the energy resolution (intrinsic, statistical, detector and electronic noise) vs. cell size, over-voltage and energy, and we separated the different sources of excess noise factor.
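The cell-count dependence of the non-linearity can be sketched with the standard single-exponential SiPM saturation model; the device area and cell counts below are illustrative, not FBK's exact figures:

```python
import math

def fired_cells(n_photons: float, n_cells: int) -> float:
    """Standard SiPM saturation model: N_fired = N_cells * (1 - exp(-N_ph/N_cells)).

    Ignores cell recharge during the light pulse, so it is only a
    first-order description of the pulsed-source measurement.
    """
    return n_cells * (1.0 - math.exp(-n_photons / n_cells))

def nonlinearity(n_photons: float, n_cells: int) -> float:
    """Fractional deviation of the response from a linear detector."""
    return 1.0 - fired_cells(n_photons, n_cells) / n_photons

# Illustrative 3x3 mm^2 device: ~14,400 cells at 25 um pitch versus
# ~90,000 cells at 10 um pitch, read out at the same light level.
for pitch_um, n_cells in [(25, 14_400), (10, 90_000)]:
    print(pitch_um, round(nonlinearity(30_000, n_cells), 3))
```

Smaller cells mean more cells per unit area, so the same photon flux spreads over more cells and the saturation term shrinks, matching the trend the abstract reports.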

  10. Resolution analysis of archive films for the purpose of their optimal digitization and distribution

    NASA Astrophysics Data System (ADS)

    Fliegel, Karel; Vítek, Stanislav; Páta, Petr; Myslík, Jiří; Pecák, Josef; Jícha, Marek

    2017-09-01

    With recent high demand for ultra-high-definition (UHD) content to be screened in high-end digital movie theaters but also in the home environment, film archives full of movies in high-definition and above are in the scope of UHD content providers. Movies captured with the traditional film technology represent a virtually unlimited source of UHD content. The goal to maintain complete image information is also related to the choice of scanning resolution and spatial resolution for further distribution. It might seem that scanning the film material in the highest possible resolution using state-of-the-art film scanners and also its distribution in this resolution is the right choice. The information content of the digitized images is however limited, and various degradations moreover lead to its further reduction. Digital distribution of the content in the highest image resolution might be therefore unnecessary or uneconomical. In other cases, the highest possible resolution is inevitable if we want to preserve fine scene details or film grain structure for archiving purposes. This paper deals with the image detail content analysis of archive film records. The resolution limit in captured scene image and factors which lower the final resolution are discussed. Methods are proposed to determine the spatial details of the film picture based on the analysis of its digitized image data. These procedures allow determining recommendations for optimal distribution of digitized video content intended for various display devices with lower resolutions. Obtained results are illustrated on spatial downsampling use case scenario, and performance evaluation of the proposed techniques is presented.

  11. A high-resolution map of the H1 locus harbouring resistance to the potato cyst nematode Globodera rostochiensis.

    PubMed

    Bakker, Erin; Achenbach, Ute; Bakker, Jeroen; van Vliet, Joke; Peleman, Johan; Segers, Bart; van der Heijden, Stefan; van der Linde, Piet; Graveland, Robert; Hutten, Ronald; van Eck, Herman; Coppoolse, Eric; van der Vossen, Edwin; Bakker, Jaap; Goverse, Aska

    2004-06-01

    The resistance gene H1 confers resistance to the potato cyst nematode Globodera rostochiensis and is located at the distal end of the long arm of chromosome V of potato. For marker enrichment of the H1 locus, a bulked segregant analysis (BSA) was carried out using 704 AFLP primer combinations. A second source of markers tightly linked to H1 is the ultra-high-density (UHD) genetic map of the potato cross SH x RH. This map has been produced with 387 AFLP primer combinations and consists of 10,365 AFLP markers in 1,118 bins (http://www.dpw.wageningen-ur.nl/uhd/). Comparing these two methods revealed that BSA resulted in one marker/cM and the UHD map in four markers/cM in the H1 interval. Subsequently, a high-resolution genetic map of the H1 locus has been developed using a segregating F(1) SH x RH population consisting of 1,209 genotypes. Two PCR-based markers were designed at either side of the H1 gene to screen the 1,209 genotypes for recombination events. In the high-resolution genetic map, two of the four co-segregating AFLP markers could be separated from the H1 gene. Marker EM1 is located at a distance of 0.2 cM, and marker EM14 is located at a distance of 0.8 cM. The other two co-segregating markers CM1 (in coupling) and EM15 (in repulsion) could not be separated from the H1 gene.

  12. MRI histogram analysis enables objective and continuous classification of intervertebral disc degeneration.

    PubMed

    Waldenberg, Christian; Hebelka, Hanna; Brisby, Helena; Lagerstrand, Kerstin Magdalena

    2018-05-01

    Magnetic resonance imaging (MRI) is the best diagnostic imaging method for low back pain. However, the technique is currently not utilized to its full capacity, often failing to depict painful intervertebral discs (IVDs), potentially because of the coarse degeneration classification system used clinically today. MR image histograms, which reflect IVD heterogeneity, may offer sensitive imaging biomarkers for IVD degeneration classification. This study investigates the feasibility of using histogram analysis as a means of objective and continuous grading of IVD degeneration. Forty-nine IVDs in ten low back pain patients (six males, 25-69 years) were examined with MRI (T2-weighted images and T2-maps). Each IVD was semi-automatically segmented on three mid-sagittal slices. Histogram features of the IVD were extracted from the defined regions of interest and correlated with Pfirrmann grade. Both T2-weighted images and T2-maps displayed similar histogram features. Histograms of well-hydrated IVDs displayed two separate peaks, representing the annulus fibrosus and the nucleus pulposus. Degenerated IVDs displayed decreased peak separation, and the separation was shown to correlate strongly with Pfirrmann grade (P < 0.05). In addition, some degenerated IVDs within the same Pfirrmann grade displayed diametrically different histogram appearances. Histogram features correlated well with IVD degeneration, suggesting that IVD histogram analysis is a suitable tool for objective and continuous IVD degeneration classification. As histogram analysis revealed IVD heterogeneity, it may become a clinical tool for characterization of regional IVD degeneration effects. To elucidate the usefulness of histogram analysis in patient management, IVD histogram features of asymptomatic and symptomatic individuals need to be compared.
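The two-peak structure and its separation can be quantified directly from an intensity histogram; a sketch on synthetic bimodal T2 data, where every number (modes, spreads, bin layout, prominence threshold) is illustrative rather than taken from the study:

```python
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(42)
# Synthetic T2 values (ms) for a well-hydrated disc: one mode standing
# in for annulus fibrosus, one for nucleus pulposus.
t2 = np.concatenate([rng.normal(60, 10, 2500),    # "annulus" mode
                     rng.normal(140, 10, 2500)])  # "nucleus" mode

counts, edges = np.histogram(t2, bins=40, range=(0, 200))
centers = (edges[:-1] + edges[1:]) / 2

# Peak separation as a degeneration feature: degenerated discs would
# show the two peaks merging (separation shrinking toward zero).
peaks, _ = find_peaks(counts, prominence=200)
separation = centers[peaks[-1]] - centers[peaks[0]]
print(len(peaks), separation)
```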

  13. Eradication therapy for peptic ulcer disease in Helicobacter pylori-positive people.

    PubMed

    Ford, Alexander C; Gurusamy, Kurinchi Selvan; Delaney, Brendan; Forman, David; Moayyedi, Paul

    2016-04-19

    eradication compared with ulcer healing drug, placebo or no treatment. Trials were included if they reported assessment from two weeks onwards. We collected data on ulcer healing, recurrence, relief of symptoms and adverse effects. We calculated the risk ratio (RR) with 95% confidence intervals (CI) using both fixed-effect and random-effects models with Review Manager software (RevMan 5.3), based on intention-to-treat analysis as far as possible. A total of 55 trials were included for one or more outcomes for this review. In duodenal ulcer healing, eradication therapy was superior to ulcer healing drug (UHD) (34 trials, 3910 participants, RR of ulcer persisting = 0.66, 95% confidence interval (CI) 0.58 to 0.76; 381/2286 (adjusted proportion: 12.4%) in eradication therapy plus UHD versus 304/1624 (18.7%) in UHD; low quality evidence) and to no treatment (two trials, 207 participants, RR 0.37, 95% CI 0.26 to 0.53; 30/125 (adjusted proportion: 21.7%) in eradication therapy versus 48/82 (58.5%) in no treatment; low quality evidence). In gastric ulcer healing, the differences were imprecise between eradication therapy and UHD (15 trials, 1974 participants, RR 1.23, 95% CI 0.90 to 1.68; 220/1192 (adjusted proportion: 16.0%) in eradication therapy plus UHD versus 102/782 (13.0%) in UHD; very low quality evidence). In preventing duodenal ulcer recurrence, the differences were imprecise between maintenance therapy with H. pylori eradication therapy and maintenance therapy with UHD (four trials, 319 participants, RR of ulcer recurring 0.73, 95% CI 0.42 to 1.25; 19/159 (adjusted proportion: 11.9%) in eradication therapy versus 26/160 (16.3%) in UHD; very low quality evidence), but eradication therapy was superior to no treatment (27 trials, 2509 participants, RR 0.20, 95% CI 0.15 to 0.26; 215/1501 (adjusted proportion: 12.9%) in eradication therapy versus 649/1008 (64.4%) in no treatment; very low quality evidence). In preventing gastric ulcer recurrence, eradication therapy was superior to no
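A risk ratio with its 95% CI for a single 2 × 2 table can be computed as below. Note that this naive single-table calculation on the duodenal-ulcer counts quoted above does not reproduce the review's pooled RR of 0.66, which comes from a fixed/random-effects meta-analysis across 34 trials:

```python
import math

def risk_ratio(a, n1, c, n2):
    """Risk ratio and Wald 95% CI for one 2x2 table.

    a/n1 = events/total in group 1, c/n2 = events/total in group 2.
    The CI is built on the log scale: log(RR) +/- 1.96 * SE, with
    SE = sqrt(1/a - 1/n1 + 1/c - 1/n2).
    """
    rr = (a / n1) / (c / n2)
    se = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)
    lo, hi = (rr * math.exp(s * 1.96 * se) for s in (-1, 1))
    return rr, lo, hi

# Raw duodenal-ulcer healing counts quoted in the abstract.
rr, lo, hi = risk_ratio(381, 2286, 304, 1624)
print(f"single-table RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```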

  14. The Critical Aspects of No Child Left Behind for an Urban Teacher Education Program

    ERIC Educational Resources Information Center

    Chen, Irene; Paige, Susan; Bhattacharjee, Maria

    2004-01-01

    The faculty in the Department of Urban Education (DUE) at the University of Houston Downtown (UHD) have developed critical awareness of issues related to the responsibility of both teachers and teacher educators for addressing the mandates of the No Child Left Behind Act of 2001. The purpose of this article is to provide relevant information on…

  15. Radiographic and MRI characteristics of lumbar disseminated idiopathic spinal hyperostosis and spondylosis deformans in dogs.

    PubMed

    Togni, A; Kranenburg, H J C; Morgan, J P; Steffen, F

    2014-07-01

    To evaluate clinical signs, describe lesions and differences in the magnetic resonance imaging appearance of spinal new bone formations classified as disseminated idiopathic spinal hyperostosis and/or spondylosis deformans on radiographs and compare degeneration status of the intervertebral discs using the Pfirrmann scale. Retrospective analysis of 18 dogs presented with spinal disorders using information from radiographic and magnetic resonance imaging examinations. All dogs were found to be affected with both disseminated idiopathic spinal hyperostosis and spondylosis deformans. Neurological signs due to foraminal stenosis associated with disseminated idiopathic spinal hyperostosis were found in two dogs. Spondylosis deformans was associated with foraminal stenosis and/or disc protrusion in 15 cases. The Pfirrmann score on magnetic resonance imaging was significantly higher in spondylosis deformans compared with disseminated idiopathic spinal hyperostosis and signal intensity of new bone due to disseminated idiopathic spinal hyperostosis was significantly higher compared to spondylosis deformans. Differences between disseminated idiopathic spinal hyperostosis and spondylosis deformans found on magnetic resonance imaging contribute to an increased differentiation between the two entities. Clinically relevant lesions in association with disseminated idiopathic spinal hyperostosis were rare compared to those seen with spondylosis deformans. © 2014 British Small Animal Veterinary Association.

  16. An Ultra-High Discrimination Y Chromosome Short Tandem Repeat Multiplex DNA Typing System

    PubMed Central

    Hanson, Erin K.; Ballantyne, Jack

    2007-01-01

    In forensic casework, Y chromosome short tandem repeat markers (Y-STRs) are often used to identify a male donor DNA profile in the presence of excess quantities of female DNA, such as is found in many sexual assault investigations. Commercially available Y-STR multiplexes incorporating 12–17 loci are currently used in forensic casework (Promega's PowerPlex® Y and Applied Biosystems' AmpFlSTR® Yfiler®). Despite the robustness of these commercial multiplex Y-STR systems and the ability to discriminate two male individuals in most cases, the coincidence match probabilities between unrelated males are modest compared with the standard set of autosomal STR markers. Hence there is still a need to develop new multiplex systems to supplement these for those cases where additional discriminatory power is desired or where there is a coincidental Y-STR match between potential male participants. Over 400 Y-STR loci have been identified on the Y chromosome. While these have the potential to increase the discrimination potential afforded by the commercially available kits, many have not been well characterized. In the present work, 91 loci were tested for their relative ability to increase the discrimination potential of the commonly used ‘core’ Y-STR loci. The result of this extensive evaluation was the development of an ultra high discrimination (UHD) multiplex DNA typing system that allows for the robust co-amplification of 14 non-core Y-STR loci. Population studies with a mixed African American and American Caucasian sample set (n = 572) indicated that the overall discriminatory potential of the UHD multiplex was superior to all commercial kits tested. The combined use of the UHD multiplex and the Applied Biosystems' AmpFlSTR® Yfiler® kit resulted in 100% discrimination of all individuals within the sample set, which presages its potential to maximally augment currently available forensic casework markers. It could also find applications in human evolutionary

  17. Quantitative in vivo MRI evaluation of lumbar facet joints and intervertebral discs using axial T2 mapping.

    PubMed

    Stelzeneder, David; Messner, Alina; Vlychou, Marianna; Welsch, Goetz H; Scheurecker, Georg; Goed, Sabine; Pieber, Karin; Pflueger, Verena; Friedrich, Klaus M; Trattnig, Siegfried

    2011-11-01

    To assess the feasibility of T2 mapping of lumbar facet joints and intervertebral discs in a single imaging slab and to compare the findings with morphological grading. Sixty lumbar spine segments from 10 low back pain patients and 5 healthy volunteers were examined by axial T2 mapping and morphological MRI at 3.0 Tesla. Regions of interest were drawn on a single slice for the facet joints and the intervertebral discs (nucleus pulposus, anterior and posterior annulus fibrosus). The Weishaupt grading was used for the facet joints, and the Pfirrmann score was used for morphological disc grading ("normal" vs. "abnormal" discs). The inter-rater agreement was excellent for the facet joint T2 evaluation (r = 0.85) but poor for the morphological Weishaupt grading (kappa = 0.15). The preliminary results show similar facet joint T2 values in segments with normal and abnormal Pfirrmann scores. There was no difference in mean T2 values between facet joints in different Weishaupt grading groups. Facet joint T2 values showed a weak correlation with T2 values of the posterior annulus (r = 0.32). This study demonstrates the feasibility of a combined T2 mapping approach for the facet joints and intervertebral discs using a single axial slab.

  18. The Effect of Very High versus Very Low Sustained Loading on the Lower Back and Knees in Middle Life

    PubMed Central

    Milgrom, Yael; Constantini, Naama; Applbaum, Yaakov; Radeva-Petrova, Denitsa; Finestone, Aharon S.

    2013-01-01

    To evaluate the effect of the extremes of long-term high and low physical activity on musculoskeletal health in middle age, a historical cohort study was performed. The MRI knee and back findings of 25 randomly selected subjects who were inducted into the armed forces in 1983 and served at least 3 years as elite infantry soldiers were compared, 25 years later, with those of 20 randomly selected subjects who were deferred from army service for full-time religious studies at the same time. Both cohorts were from the same common genome. The two primary outcome measures were degenerative lumbar disc disease evaluated by the Pfirrmann score and degenerative knee changes evaluated by the WORMS score. At the 25-year follow-up, the mean Pfirrmann score (8.6) for the L1 to S1 levels of the elite infantry group was significantly higher than that of the sedentary group (6.7) (P = 0.003). There was no statistically significant difference between the WORMS knee scores of the two cohorts (P = 0.7). In spite of the much greater musculoskeletal loading history of the elite infantry cohort, only their lumbar spines, but not their knees, showed increased degenerative changes at middle age by MRI criteria. PMID:24093109

  19. Lumbar Spine Musculoskeletal Physiology and Biomechanics During Simulated Military Operations

    DTIC Science & Technology

    2016-06-01

    was found at L5-S1. A caudal increase in fat fraction of the multifidus was observed, with no significant increase in the erector spinae. No...proportional to Pfirrmann grade. No significant differences were found between volume, fat fraction, T2, or DTI in any of the muscles in subjects with disc...Figure 8: Fat fraction of the erector spinae and multifidus muscles at each vertebral level. The erector spinae has elevated fat fraction compared to

  20. In Situ Corrosion and Heat Loss Assessment of Two Nonstandard Underground Heat Distribution System Piping Designs: Supplement-Appendices for Final Report on Project F07-AR01

    DTIC Science & Technology

    2011-06-01

    negative mission impacts. This report documents the assessment of two similar nonstandard UHDS piping system designs — one at Fort Carson, CO, and one at...psig and monitored for 2 hours to determine whether the conduit piping system is protected from ground water infiltration and its degrading impacts...Conduits to/from this pit were tested from adjacent pits. 2. Supply, Return drains tested on 8/15/07: All Dry

  1. [Transpedicular dynamic stabilization in the treatment of lumbar stenosis. Four-year follow-up].

    PubMed

    Reyes-Sánchez, Alejandro; Sánchez-Bringas, Guadalupe; Zarate-Kalfopulos, Barón; Alpizar-Aguirre, Armando; Lara-Padilla, Eleazar; Rosales-Olivares, Luis Miguel

    2013-01-01

    To evaluate the efficacy and safety of dynamic fixation in patients with lumbar stenosis by comparing the two-year assessment with four years of follow-up. Prospective, longitudinal, self-controlled study with deliberate, sequential intervention in patients with lumbar stenosis treated with posterior dynamic stabilization of the Acuflex type, with a final evaluation at four years of follow-up. The 18 patients who completed two-year follow-up served as the baseline for comparison: 14 female and 4 male, mean age 44.05 years. Low back pain measured on a visual numerical scale averaged 2.84 at 24 months and 3.26 at 48 months. The Oswestry functional score was 24% at two years and 22.44% at four years (p = 0.373). On magnetic resonance imaging, 15 patients showed no change in Pfirrmann classification and three increased by one grade. Two patients showed Modic changes, one from type 0 to type III and another to type I. We observed that five patients required a second surgery for removal of the implant. There was no change between 2 and 4 years in Oswestry functionality or in pain on the visual numerical scale. Mean disc height changed with statistical significance over the comparative period. The intervertebral discs changed in 3 patients, with a direct relationship between the Pfirrmann and Modic scales. The remaining patients maintained disc hydration and normal disc height.

  2. Evaluation of color mapping algorithms in different color spaces

    NASA Astrophysics Data System (ADS)

    Bronner, Timothée-Florian; Boitard, Ronan; Pourazad, Mahsa T.; Nasiopoulos, Panos; Ebrahimi, Touradj

    2016-09-01

    The color gamut supported by current commercial displays is only a subset of the full spectrum of colors visible to the human eye. In High-Definition (HD) television technology, the scope of the supported colors covers 35.9% of the full visible gamut. For comparison, Ultra High-Definition (UHD) television, which is currently being deployed on the market, extends this range to 75.8%. However, when reproducing content with a wider color gamut than that of a television, typically UHD content on an HD television, some original color information may lie outside the reproduction capabilities of the television. Efficient gamut mapping techniques are required in order to fit the colors of any source content into the gamut of a given display. The goal of gamut mapping is to minimize the distortion, in terms of perceptual quality, when converting video from one color gamut to another. It is assumed that the efficiency of gamut mapping depends on the color space in which it is computed. In this article, we evaluate 14 gamut mapping techniques: 12 combinations of two projection methods across six color spaces, as well as R'G'B' Clipping and wrong-gamut interpretation. Objective results, using the CIEDE2000 metric, show that R'G'B' Clipping is slightly outperformed by only one combination of color space and projection method. However, analysis of images shows that R'G'B' Clipping can result in loss of contrast in highly saturated images, greatly impairing the quality of the mapped image.
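    The R'G'B' Clipping baseline evaluated above amounts to clipping each channel independently; a minimal sketch, assuming pixel values already converted into the target display's normalized primaries (the sample pixel is hypothetical, not from the paper's test set):

    ```python
    import numpy as np

    def rgb_clip(img):
        """Baseline gamut mapping: clip each R'G'B' channel independently
        into the target display's [0, 1] range. Out-of-gamut colors are
        projected to the gamut boundary channel by channel, which can
        flatten contrast in highly saturated regions."""
        return np.clip(img, 0.0, 1.0)

    # A wide-gamut pixel expressed in the target display's primaries may
    # have components outside [0, 1]; clipping maps it onto the boundary.
    wide = np.array([[1.18, -0.05, 0.42]])
    print(rgb_clip(wide))
    ```

    Because each channel is clipped on its own, the hue and saturation of an out-of-gamut pixel can shift; the projection-based methods compared in the article trade computation for better preservation of those perceptual attributes.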

  3. Ultra High Definition Video from the International Space Station (Reel 1)

    NASA Image and Video Library

    2015-06-15

    The view of life in space is getting a major boost with the introduction of 4K Ultra High-Definition (UHD) video, providing an unprecedented look at what it's like to live and work aboard the International Space Station. This important new capability will allow researchers to acquire high resolution - high frame rate video to provide new insight into the vast array of experiments taking place every day. It will also bestow the most breathtaking views of planet Earth and space station activities ever acquired for consumption by those still dreaming of making the trip to outer space.

  4. A randomised controlled trial of structured nurse-led outpatient clinic follow-up for dyspeptic patients after direct access gastroscopy.

    PubMed

    Chan, David; Harris, Scott; Roderick, Paul; Brown, David; Patel, Praful

    2009-02-06

    Dyspepsia is a common disorder in the community, with many patients referred for diagnostic gastroscopy by their General Practitioner (GP). The National Institute for Clinical Excellence (NICE) recommends follow-up after investigation for cost-effective management, including lifestyle advice and drug use. An alternative strategy may be the use of a gastro-intestinal nurse practitioner (GNP) instead of the GP. The objective of this study is to compare the effectiveness and costs of systematic GNP-led follow-up with usual care by GPs in dyspeptic patients following gastroscopy. Direct access adult dyspeptic patients referred for gastroscopy, without serious pathology, were followed up in a structured nurse-led outpatient clinic. Outcome measures used to compare the two study cohorts (GNP versus GP) included the Glasgow dyspepsia severity (Gladys) score, Health Status Short Form 12 (SF12), and ulcer healing drug (UHD) use and costs. One hundred and seventy-five patients were eligible after gastroscopy; 89 were randomised to GNP follow-up and 86 to GP follow-up. Follow-up at 6 months was 81/89 (91%) in the GNP arm and 79/86 (92%) in the GP arm. On an intention-to-treat analysis, adjusted mean differences (95% CI) at follow-up between nurse and GP follow-up were: Gladys score 2.30 (1.4-3.2), p < 0.001; SF12 140.6 (96.5-184.8), p < 0.001; and UHD costs £39.60 (£24.20-£55.10), p < 0.001, all in favour of nurse follow-up. A standardised and structured follow-up by one gastrointestinal nurse practitioner was effective and may save drug costs in patients after gastroscopy. These findings need replication in other centres.

  5. Ultra high-definition video: convergence toward 100Gbps and beyond for digital A/V connectivity with fiber optics

    NASA Astrophysics Data System (ADS)

    Parekh, Devang; Nguyen, Nguyen X.

    2018-02-01

    The recent advent of Ultra-high-definition television (also known as Ultra HD television, Ultra HD, UHDTV, UHD and Super Hi-Vision) has accelerated demand for a Fiber-in-the-Premises video communication (VCOM) solution that converges toward 100 Gbps and beyond. Hybrid Active Optical Cables (AOC) are a holistic connectivity platform well suited for this "last yard" connectivity, as they combine both copper and fiber optics to deliver the high data rates and power transmission needed. While technically feasible yet challenging to manufacture, hybrid AOC could be a holy-grail fiber-optics solution that dwarfs the volume of both telecom and datacom connections in the foreseeable future.

  6. [Assessment of the correlation between histological degeneration and radiological and clinical parameters in a series of patients who underwent lumbar disc herniation surgery].

    PubMed

    Munarriz, Pablo M; Paredes, Igor; Alén, José F; Castaño-Leon, Ana M; Cepeda, Santiago; Hernandez-Lain, Aurelio; Lagares, Alfonso

    The use of histological degeneration scores in surgically treated herniated lumbar discs is not common in clinical practice and has been primarily restricted to research. The objective of this study is to evaluate whether a higher grade of histological degeneration is associated with clinical or radiological parameters. Retrospective consecutive analysis of 122 patients who underwent single-segment lumbar disc herniation surgery. Clinical information was available for all patients, while the histological study and preoperative magnetic resonance imaging were also retrieved for 75 patients. Clinical variables included age, duration of symptoms, neurological deficits, and affected deep tendon reflexes. The preoperative magnetic resonance imaging was evaluated using the Modic and Pfirrmann scores for the affected segment by 2 independent observers. Histological degeneration was evaluated using Weiler's score; the presence of inflammatory infiltrates and neovascularization, not included in the score, was also studied. Correlation and chi-square tests were used to assess the association between histological variables and clinical or radiological variables. Interobserver agreement was also evaluated for the MRI variables using weighted kappa. No statistically significant correlation was found between histological variables (histological degeneration score, inflammatory infiltrates or neovascularization) and clinical or radiological variables. Interobserver agreement for the radiological scores resulted in a kappa of 0.79 for the Pfirrmann scale and 0.65 for the Modic scale, both statistically significant. In our series of patients, we could not demonstrate any correlation between the degree of histological degeneration or the presence of inflammatory infiltrates and radiological degeneration scales or clinical variables such as the patient's age or duration of symptoms. Copyright © 2017 Sociedad Española de Neurocirugía.
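    The weighted kappa used above for interobserver agreement on ordinal scales can be sketched directly; a minimal NumPy implementation with quadratic weights and hypothetical observer gradings (not the study's data):

    ```python
    import numpy as np

    def quadratic_weighted_kappa(a, b, n_cat):
        """Cohen's kappa with quadratic disagreement weights for ordinal scales.
        a, b: integer scores from two observers, coded 0..n_cat-1."""
        O = np.zeros((n_cat, n_cat))          # observed confusion matrix
        for i, j in zip(a, b):
            O[i, j] += 1
        # Expected matrix under independence, from the marginal distributions.
        E = np.outer(O.sum(axis=1), O.sum(axis=0)) / O.sum()
        idx = np.arange(n_cat)
        W = (idx[:, None] - idx[None, :]) ** 2 / (n_cat - 1) ** 2  # quadratic weights
        return 1.0 - (W * O).sum() / (W * E).sum()

    # Hypothetical Pfirrmann grades (I..V coded 0..4) from two observers.
    obs1 = [1, 2, 2, 3, 4, 1, 0, 3, 2, 4]
    obs2 = [1, 2, 3, 3, 4, 1, 0, 2, 2, 4]
    print(f"weighted kappa = {quadratic_weighted_kappa(obs1, obs2, 5):.2f}")
    ```

    Off-by-one disagreements are penalized lightly under quadratic weighting, which is why it is preferred over unweighted kappa for graded scales such as Pfirrmann or Modic.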

  7. "Ultra High Dilution 1994" revisited 2015--the state of follow-up research.

    PubMed

    Endler, P Christian; Schulte, Jurgen; Stock-Schroeer, Beate; Stephen, Saundra

    2015-10-01

    The "Ultra High Dilution 1994" project was an endeavour to take stock of the findings and theories on homeopathic extreme dilutions that were under research at the time in areas of biology, biophysics, physics and medicine. The project finally materialized into an anthology assembling contributions of leading scientists in the field. Over the following two decades, it became widely quoted within the homeopathic community and also known in other research communities. The aim of the present project was to revisit and review the 1994 studies from the perspective of 2015. The original authors from 1994, or close laboratory colleagues, were asked to contribute papers covering their research efforts and findings in the period from 1994 up to 2015. These contributions were edited and cross-referenced, and a selection of further contributions was added. About a dozen contributions reported on follow-up experiments and studies, including further developments in theory. Only a few of the models that had seemed promising in 1994 had not been followed up later. Most models presented in the original publication had meanwhile been submitted to intra-laboratory, multicentre or independent scrutiny. The results of the follow-up research seemed to have rewarded the efforts. Furthermore, contributions were provided on new models that had been inspired by the original ones or that may be candidates for further in-depth ultra high dilution (UHD) research. The project "Ultra High Dilution 1994 revisited 2015" is the latest output of what might be considered the "buena vista social club" of homeopathy research. However, it presents new developments and results of the older, established experimental models as well as a general survey of the state of UHD research. Copyright © 2015 The Faculty of Homeopathy. Published by Elsevier Ltd. All rights reserved.

  8. Quantitative evaluation of lumbar intervertebral disc degeneration by axial T2* mapping.

    PubMed

    Huang, Leitao; Liu, Yuan; Ding, Yi; Wu, Xia; Zhang, Ning; Lai, Qi; Zeng, Xianjun; Wan, Zongmiao; Dai, Min; Zhang, Bin

    2017-12-01

    To quantitatively evaluate the clinical value and demonstrate the potential benefits of biochemical axial T2* mapping-based grading of early stages of degenerative disc disease (DDD) using 3.0-T magnetic resonance imaging (MRI) in a clinical setting. Fifty patients with low back pain and 20 healthy volunteers (control) underwent standard MRI protocols including axial T2* mapping. All the intervertebral discs (IVDs) were classified morphologically. Lumbar IVDs were graded using the Pfirrmann score (I to IV). The T2* values of the anterior annulus fibrosus (AF), posterior AF, and nucleus pulposus (NP) of each lumbar IVD were measured. The differences between groups were analyzed regarding specific T2* patterns at different regions of interest. The T2* values of the NP and posterior AF in the patient group were significantly lower than those in the control group (P < .01). The T2* value of the anterior AF was not significantly different between the patients and the controls (P > .05). The mean T2* values of the lumbar IVD in the patient group were significantly lower, especially the posterior AF, followed by the NP, and finally, the anterior AF. In the anterior AF, comparison of grade I with grade III and grade I with grade IV showed statistically significant differences (P = .07 and P = .08, respectively). Similarly, in the NP, comparison of grade I with grade III, grade I with grade IV, grade II with grade III, and grade II with grade IV showed statistically significant differences (P < .001). In the posterior AF, comparison of grade II with grade IV showed a statistically significant difference (P = .032). T2* values decreased linearly with increasing degeneration based on the Pfirrmann scoring system (ρ < -0.5, P < .001). Changes in the T2* value can signify early degenerative IVD diseases. Hence, T2* mapping can be used as a diagnostic tool for quantitative assessment of IVD degeneration. Copyright © 2017 The Authors. Published by Wolters Kluwer Health, Inc.

  9. Quantitative evaluation of lumbar intervertebral disc degeneration by axial T2∗ mapping

    PubMed Central

    Huang, Leitao; Liu, Yuan; Ding, Yi; Wu, Xia; Zhang, Ning; Lai, Qi; Zeng, Xianjun; Wan, Zongmiao; Dai, Min; Zhang, Bin

    2017-01-01

    Abstract To quantitatively evaluate the clinical value and demonstrate the potential benefits of biochemical axial T2∗ mapping-based grading of early stages of degenerative disc disease (DDD) using 3.0-T magnetic resonance imaging (MRI) in a clinical setting. Fifty patients with low back pain and 20 healthy volunteers (control) underwent standard MRI protocols including axial T2∗ mapping. All the intervertebral discs (IVDs) were classified morphologically. Lumbar IVDs were graded using Pfirrmann score (I to IV). The T2∗ values of the anterior annulus fibrosus (AF), posterior AF, and nucleus pulposus (NP) of each lumbar IVD were measured. The differences between groups were analyzed regarding specific T2∗ pattern at different regions of interest. The T2∗ values of the NP and posterior AF in the patient group were significantly lower than those in the control group (P < .01). The T2∗ value of the anterior AF was not significantly different between the patients and the controls (P > .05). The mean T2∗values of the lumbar IVD in the patient group were significantly lower, especially the posterior AF, followed by the NP, and finally, the anterior AF. In the anterior AF, comparison of grade I with grade III and grade I with grade IV showed statistically significant differences (P = .07 and P = .08, respectively). Similarly, in the NP, comparison of grade I with grade III, grade I with grade IV, grade II with grade III, and grade II with grade IV showed statistically significant differences (P < .001). In the posterior AF, comparison of grade II with grade IV showed a statistically significant difference (P = .032). T2∗ values decreased linearly with increasing degeneration based on the Pfirrmann scoring system (ρ < −0.5, P < .001). Changes in the T2∗ value can signify early degenerative IVD diseases. Hence, T2∗ mapping can be used as a diagnostic tool for quantitative assessment of IVD degeneration. PMID:29390547
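    The reported monotone decrease of T2* with Pfirrmann grade corresponds to a Spearman rank correlation; a minimal sketch with hypothetical per-disc values (not the study's data):

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    # Hypothetical per-disc data: Pfirrmann grade (I..IV) and NP T2* value (ms).
    grades = np.array([1, 1, 2, 2, 2, 3, 3, 3, 4, 4])
    t2star = np.array([38.2, 35.9, 31.4, 33.0, 29.8, 25.1, 26.7, 23.9, 19.5, 21.2])

    # Spearman's rho handles the ordinal grading scale and ties naturally.
    rho, p = spearmanr(grades, t2star)
    print(f"rho = {rho:.2f}, p = {p:.4f}")
    ```

    A rho below -0.5, as in the abstract, indicates that higher Pfirrmann grades are consistently associated with lower T2* values.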

  10. MUCESS-Supported Ozone Studies in Upstate New York and along the Texas Gulf Coast

    NASA Astrophysics Data System (ADS)

    Hromis, A.; Balimuttajjo, M.; Johnson, A.; Wright, J. M.; Idowu, A.; Vieyra, D.; Musselwhite, D.; Morris, P. A.

    2010-12-01

    The Minority University Consortium for Earth and Space Sciences (MUCESS) supports yearly atmospheric science workshops at its member institutions. The NSF-funded program has enabled the universities and colleges that are part of MUCESS, which include Medgar Evers College (City University of New York), the University of Houston-Downtown and South Carolina State University, to develop and support atmospheric studies. The goal of the annual workshops is to instruct the students on the basics of atmospheric science and provide them with hands-on experience in preparing and calibrating the instruments for measuring atmospheric parameters. The instruments are subsequently attached to weather balloons. The data are obtained with an ENSCI ECC ozonesonde, which measures ozone concentrations to parts per billion, and an iMET radiosonde, which records temperature, pressure, relative humidity, and GPS altitude and position. In March 2010, Medgar Evers hosted the workshop in Paradox, NY. Students and faculty from the three institutions attended the 3-day workshop. Subsequent to the annual workshop, students from the University of Houston-Downtown (UHD) conducted a series of four Sunday launches from the campus during the summer. The data from both the workshop and the UHD launches were subsequently analyzed to compare ozone profiles within the troposphere and stratosphere. Comparing rural (Paradox, NY) and urban (Houston, TX) ozone profiles provides an invaluable experience. An excellent example is the March Paradox temperature profile, as the data indicate a mid-tropospheric temperature inversion. Coincident with this inversion, there is a significant rise in ozone concentrations, the source of which is likely of non-local provenance. In contrast, the Houston summer data tell a different story, as ground-level ozone there is produced by industrial and transportation-related sources whose levels vary. Weekend ground-level ozone levels on Sundays are usually relatively low because of reduced traffic.

  11. Microstructure of ultra high performance concrete containing lithium slag.

    PubMed

    He, Zhi-Hai; Du, Shi-Gui; Chen, Deng

    2018-04-03

    Lithium slag (LS) is discharged as a byproduct of lithium carbonate production, and it is urgent to find an efficient way to recycle LS in order to protect the environment and save resources. Many available supplementary cementitious materials for partial replacement of cement and/or silica fume (SF) can be used to prepare ultra high performance concrete (UHPC). The effect of LS, used as a supplementary cementitious material to replace SF partially by weight (0%, 5%, 10% and 15% of binder), on the compressive strength and microstructure evolution of UHPC has been studied experimentally by multiple techniques including mercury intrusion porosimetry, scanning electron microscopy and nanoindentation. The results show that the use of LS degrades the microstructure of UHPC at early ages; however, at an appropriate content it improves the microstructure of UHPC at later ages. The hydration products of UHPC are dominated by ultra-high-density calcium-silicate-hydrate (UHD C-S-H), and the interfacial transition zone (ITZ) in UHPC has a microstructure as compact as the matrix. The use of LS improves the degree of hydration of UHPC and increases the elastic modulus of the ITZ. LS is a promising substitute for SF in the preparation of UHPC. Copyright © 2018 Elsevier B.V. All rights reserved.

  12. Effect of Lumbar Disc Degeneration and Low-Back Pain on the Lumbar Lordosis in Supine and Standing: A Cross-Sectional MRI Study.

    PubMed

    Hansen, Bjarke B; Bendix, Tom; Grindsted, Jacob; Bliddal, Henning; Christensen, Robin; Hansen, Philip; Riis, Robert G C; Boesen, Mikael

    2015-11-01

    Cross-sectional study. To examine the influence of low-back pain (LBP) and lumbar disc degeneration (LDD) on the lumbar lordosis in weight-bearing positional magnetic resonance imaging (pMRI). The lumbar lordosis increases with a change of position from supine to standing and is known as an essential contributor to dynamic changes. However, the lordosis may be affected by disc degeneration and pain. Patients with LBP >40 on a 0 to 100 mm Visual Analog Scale (VAS) both during activity and rest, and a sex- and age-decade-matched control group without LBP, were scanned in the supine and standing positions in a 0.25-T open MRI unit. LDD was graded using Pfirrmann's grading scale. Subsequently, the L2-to-S1 lumbar lordosis angle (LA) was measured. Thirty-eight patients with an average VAS of 58 (±13.8) mm during rest and 75 (±5.0) mm during activities, and 38 healthy controls, were included. MRI findings were common in both groups, whereas the summation of the Pfirrmann grades (LDD score) was significantly higher in the patients (MD 1.44; 95% confidence interval (CI) 0.80 to 2.10; P < 0.001). The patients were less lordotic than the controls in both the supine (MD -6.4°; 95% CI -11.4 to -1.3) and standing positions (MD -5.6°; 95% CI -10.7 to -0.7); however, the changes between the positions (ΔLA) were the same (MD 0.8°; 95% CI -1.8 to 3.3). Using a generalized linear model, the LDD score was associated with age (P < 0.001) for both groups. The LDD score and ΔLA were negatively associated in the control group (P < 0.001), also after adjustment for gender and age (β-coefficient: -2.66; 95% CI -4.3 to -1.0; P = 0.002). Patients may be less lordotic in both the supine and standing positions, whereas change in the lordosis between the positions may be independent of pain. Decreasing lordosis change seems to be associated with age-related increasing disc degeneration in healthy individuals. Level of Evidence: 2.

  13. Single Image Super-Resolution Using Global Regression Based on Multiple Local Linear Mappings.

    PubMed

    Choi, Jae-Seok; Kim, Munchurl

    2017-03-01

    Super-resolution (SR) has become more vital because of its capability to generate high-quality ultra-high definition (UHD) high-resolution (HR) images from low-resolution (LR) input images. Conventional SR methods entail high computational complexity, which makes them difficult to implement for up-scaling of full-high-definition input images into UHD-resolution images. Nevertheless, our previous super-interpolation (SI) method showed a good compromise between Peak Signal-to-Noise Ratio (PSNR) performance and computational complexity. However, since SI only utilizes simple linear mappings, it may fail to precisely reconstruct HR patches with complex texture. In this paper, we present a novel SR method, which inherits the large-to-small patch conversion scheme from SI but uses global regression based on local linear mappings (GLM). Thus, our new SR method is called GLM-SI. In GLM-SI, each LR input patch is divided into 25 overlapped subpatches. Next, based on the local properties of these subpatches, 25 different local linear mappings are applied to the current LR input patch to generate 25 HR patch candidates, which are then regressed into one final HR patch using a global regressor. The local linear mappings are learned cluster-wise in our off-line training phase. The main contribution of this paper is as follows: previously, linear-mapping-based conventional SR methods, including SI, applied only one simple yet coarse linear mapping to each patch to reconstruct its HR version. On the contrary, for each LR input patch, our GLM-SI is the first to apply a combination of multiple local linear mappings, where each local linear mapping is found according to the local properties of the current LR patch. Therefore, it can better approximate nonlinear LR-to-HR mappings for HR patches with complex texture. Experiment results show that the proposed GLM-SI method outperforms most of the state-of-the-art methods, and shows comparable PSNR performance with much lower computational complexity.
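    The core of linear-mapping-based SR, fitting an LR-to-HR linear mapping by least squares offline and applying it as a single matrix product at run time, can be sketched on synthetic data (the dimensions and data below are illustrative, not the paper's configuration; GLM-SI learns one such mapping per cluster and then regresses 25 candidates globally):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy training set: flattened LR patch features and corresponding HR patches.
    n, lr_dim, hr_dim = 500, 9, 16          # e.g. 3x3 LR patches -> 4x4 HR patches
    X = rng.standard_normal((n, lr_dim))    # LR patch features
    W_true = rng.standard_normal((lr_dim, hr_dim))
    Y = X @ W_true + 0.01 * rng.standard_normal((n, hr_dim))  # noisy HR targets

    # Offline phase: closed-form least-squares fit of the linear mapping.
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)

    # Online phase: an LR patch is mapped to its HR estimate by one matrix product.
    hr_patch = X[0] @ W
    print(hr_patch.shape)
    ```

    The cheap online step (a matrix product per patch) is what keeps the computational complexity of this family of methods low compared with iterative reconstruction or deep models.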

  14. Jeff’s Earth - 4K

    NASA Image and Video Library

    2017-01-17

    The first time you see Planet Earth from space, it’s stunning; when you’ve spent 534 days in space—more than any other American—it still is! On his most recent trip to the International Space Station, NASA astronaut Jeff Williams brought an Ultra High Definition video camera that he pointed at the planet 250 miles below; here he shares some of those images, and talks about the beauty of the planet, the variety of things to see, and the value of sharing that perspective with everyone who can’t go to orbit in person. HD download link: https://archive.org/details/TheSpaceProgram UHD content download link: https://archive.org/details/NASA-Ultra-High-Definition _______________________________________ FOLLOW THE SPACE STATION! Twitter: https://twitter.com/Space_Station Facebook: https://www.facebook.com/ISS Instagram: https://instagram.com/iss/ YouTube: https://youtu.be/-nmNhKRzy4w

  15. Effects of Leak Detection/Location on Underground Heat Distribution System (UHDS) Life Cycle Costs: A Probabilistic Model

    DTIC Science & Technology

    1991-12-01

    ADA167556 (USACERL, March 1986). E. Segan and C. Chen; K. Cooper et al. Hot spots associated with leaks are detected with either a digital temperature...

  16. UAV-based mapping, back analysis and trajectory modeling of a coseismic rockfall in Lefkada island, Greece

    NASA Astrophysics Data System (ADS)

    Saroglou, Charalampos; Asteriou, Pavlos; Zekkos, Dimitrios; Tsiambaos, George; Clark, Marin; Manousakis, John

    2018-01-01

    We present field evidence and a kinematic study of a rock block mobilized in the Ponti area by a Mw = 6.5 earthquake near the island of Lefkada on 17 November 2015. A detailed survey was conducted using an unmanned aerial vehicle (UAV) with an ultrahigh definition (UHD) camera, which produced a high-resolution orthophoto and a digital terrain model (DTM). The sequence of impact marks from the rock trajectory on the ground surface was identified from the orthophoto and field verified. Earthquake characteristics were used to estimate the acceleration of the rock slope and the initial condition of the detached block. Using the impact points from the measured rockfall trajectory, an analytical reconstruction of the trajectory was undertaken, which led to insights on the coefficients of restitution (CORs). The measured trajectory was compared with modeled rockfall trajectories using recommended parameters. However, the actual trajectory could not be accurately predicted, revealing limitations of existing rockfall analysis software used in engineering practice.
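    The analytical reconstruction of CORs from impact points can be sketched for the drag-free 2D case: assuming parabolic flight between successive impacts on horizontal ground, the velocities just before and after an impact follow from kinematics (the impact points and timings below are hypothetical, not the Lefkada survey data):

    ```python
    import numpy as np

    G = 9.81  # gravitational acceleration, m/s^2

    def flight_velocities(p0, p1, t):
        """Assuming parabolic free flight (no drag) between two impact points
        p0 -> p1 over time t, return the launch velocity at p0 and the
        arrival velocity at p1 (2D: horizontal x, vertical z)."""
        dx, dz = p1[0] - p0[0], p1[1] - p0[1]
        vx = dx / t
        vz0 = dz / t + 0.5 * G * t        # vertical velocity leaving p0
        vz1 = vz0 - G * t                 # vertical velocity arriving at p1
        return np.array([vx, vz0]), np.array([vx, vz1])

    # Hypothetical consecutive impact points on horizontal ground (x, z in m).
    v_out, v_in_next = flight_velocities((0.0, 0.0), (6.0, 0.0), 1.2)
    v_out2, _ = flight_velocities((6.0, 0.0), (10.5, 0.0), 0.9)

    # Normal and tangential coefficients of restitution at the second impact.
    cor_n = abs(v_out2[1] / v_in_next[1])
    cor_t = abs(v_out2[0] / v_in_next[0])
    print(f"COR_n = {cor_n:.2f}, COR_t = {cor_t:.2f}")
    ```

    On sloped terrain the same idea applies after rotating velocities into the local surface frame; real back analyses like the one above must also account for the slope geometry extracted from the UAV-derived DTM.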

  17. Correlation study between facet joint cartilage and intervertebral discs in early lumbar vertebral degeneration using T2, T2* and T1ρ mapping

    PubMed Central

    Zhang, Yi; Hu, Jianzhong; Duan, Chunyue; Hu, Ping; Lu, Hongbin; Peng, Xianjing

    2017-01-01

    Recent advancements in magnetic resonance imaging have allowed for the early detection of biochemical changes in intervertebral discs and articular cartilage. Here, we assessed the feasibility of axial T2, T2* and T1ρ mapping of the lumbar facet joints (LFJs) to determine correlations between cartilage and intervertebral discs (IVDs) in early lumbar vertebral degeneration. We recruited 22 volunteers and examined 202 LFJs and 101 IVDs with morphological (sagittal and axial FSE T2-weighted imaging) and axial biochemical (T2, T2* and T1ρ mapping) sequences using a 3.0T MRI scanner. IVDs were graded using the Pfirrmann system. Mapping values of LFJs were recorded according to the degeneration grades of IVDs at the same level. The feasibility of T2, T2* and T1ρ in IVDs and LFJs was analyzed by comparing these mapping values across subjects with different rates of degeneration using Kruskal-Wallis tests. A Pearson’s correlation analysis was used to compare T2, T2* and T1ρ values of discs and LFJs. We found excellent reproducibility in the T2, T2* and T1ρ values for the nucleus pulposus (NP), anterior and posterior annulus fibrosus (PAF), and LFJ cartilage (intraclass correlation coefficients 0.806–0.955). T2, T2* and T1ρ mapping performed well against Pfirrmann grade in the NP with IVD degeneration (all P<0.01). LFJ T2* values were significantly different between grades I and IV (PL = 0.032, PR = 0.026), as were T1ρ values between grades II and III (PL = 0.002, PR = 0.006) and grades III and IV (PL = 0.006, PR = 0.001). Correlations were moderately negative for T1ρ values between LFJ cartilage and NP (rL = −0.574, rR = −0.551), and between LFJ cartilage and PAF (rL = −0.551, rR = −0.499). T1ρ values of LFJ cartilage were weakly correlated with T2 (r = 0.007) and T2* (r = −0.158) values. Overall, we show that axial T1ρ effectively assesses early LFJ cartilage degeneration. Using T1ρ analysis, we propose a link between LFJ degeneration and degeneration of the IVD NP or PAF.

  18. Correlation of matrix metalloproteinase (MMP)-1, -2, -3, and -9 expressions with demographic and radiological features in primary lumbar intervertebral disc disease.

    PubMed

    Basaran, Recep; Senol, Mehmet; Ozkanli, Seyma; Efendioglu, Mustafa; Kaner, Tuncay

    2017-07-01

    Degeneration of the IVD is a progressive and irreversible process and can be evaluated by immunohistochemical examination or radiological grading. MMPs are a family of proteolytic enzymes involved in the degradation of the matrix components of the IVD. We aimed to compare MMP-1, -2, -3, and -9 expressions with demographic features, visual analogue scale (VAS), Oswestry Disability Index (ODI) and radiological (MRI) grades. The study involved 60 participants. We retrospectively recorded data on age, complaint, radiological imaging, expression levels of MMP-1, -2, -3, and -9, ODI and VAS for back pain. Intervertebral disc degeneration was graded on a 0-5 scale according to the Pfirrmann classification. The median age was 52.09±12.74 years. There were statistically significant associations between age and both MMP-1 and MMP-2 expression. There was a close correlation between grade and MMP-9, and we also found a correlation between VAS and MMP-9 expression. In addition, there were relationships between the expression of MMP-2 and that of MMP-1, MMP-3 and MMP-9. In conclusion, the expressions of MMP-1 and -2 increase with aging. There was no relationship between the radiological evaluation of IVDD and aging. Increased expression of MMPs was positively associated with IVDD. The mechanism of the relationship with MMPs remains unexplained. This study adds to our understanding of the interaction between MMPs and IVDD. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Frame Decoder for Consultative Committee for Space Data Systems (CCSDS)

    NASA Technical Reports Server (NTRS)

    Reyes, Miguel A. De Jesus

    2014-01-01

    GNU Radio is a free and open source development toolkit that provides signal processing to implement software radios. It can be used with low-cost external RF hardware to create software defined radios, or without hardware in a simulation-like environment. GNU Radio applications are primarily written in Python and C++. The Universal Software Radio Peripheral (USRP) is a computer-hosted software radio designed by Ettus Research. The USRP connects to a host computer via high-speed Gigabit Ethernet. Using the open source Universal Hardware Driver (UHD), we can run GNU Radio applications using the USRP. An SDR is a "radio in which some or all physical layer functions are software defined" (IEEE definition). A radio is any kind of device that wirelessly transmits or receives radio frequency (RF) signals in the radio frequency spectrum. An SDR is a radio communication system where components that have typically been implemented in hardware are implemented in software. GNU Radio has a generic packet decoder block that is not optimized for CCSDS frames. Using this generic packet decoder will add bytes to the CCSDS frames and will not permit bit error correction using Reed-Solomon codes. The CCSDS frames consist of 256 bytes, including a 32-bit sync marker (0x1ACFFC1D). These frames are generated by the Space Data Processor, and GNU Radio performs the modulation and framing operations, including frame synchronization.
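
    As a rough illustration of the framing described above, the sketch below scans a byte stream for the 32-bit sync marker (0x1ACFFC1D) to delimit 256-byte frames. This is a simplified, byte-aligned stand-in for what a real decoder block would do (actual frame synchronization works at the bit level and is followed by Reed-Solomon decoding); the stream contents and function names are invented for illustration.

```python
# Hypothetical sketch: locating the CCSDS 32-bit attached sync marker
# (0x1ACFFC1D) in a raw byte stream to delimit 256-byte frames.
# Frame length and marker value are taken from the abstract above;
# the stream contents here are made up for illustration.

SYNC_MARKER = bytes.fromhex("1ACFFC1D")
FRAME_LEN = 256  # bytes, including the 4-byte sync marker

def find_frames(stream: bytes):
    """Return the payloads of all complete frames found in the stream."""
    frames = []
    i = stream.find(SYNC_MARKER)
    while i != -1 and i + FRAME_LEN <= len(stream):
        # Payload is everything after the marker, up to the frame boundary.
        frames.append(stream[i + 4 : i + FRAME_LEN])
        i = stream.find(SYNC_MARKER, i + FRAME_LEN)
    return frames

# Build a toy stream: a few noise bytes, then two frames back to back.
payload_a = bytes(range(252))
payload_b = bytes(252)
stream = b"\x00\xff\x42" + SYNC_MARKER + payload_a + SYNC_MARKER + payload_b

frames = find_frames(stream)
print(len(frames))             # 2
print(frames[0] == payload_a)  # True
```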

  20. A new high temperature reactor for operando XAS: Application for the dry reforming of methane over Ni/ZrO2 catalyst.

    PubMed

    Aguilar-Tapia, Antonio; Ould-Chikh, Samy; Lahera, Eric; Prat, Alain; Delnet, William; Proux, Olivier; Kieffer, Isabelle; Basset, Jean-Marie; Takanabe, Kazuhiro; Hazemann, Jean-Louis

    2018-03-01

    The construction of a high-temperature reaction cell for operando X-ray absorption spectroscopy characterization is reported. A dedicated cell was designed to operate as a plug-flow reactor using powder samples requiring gas flow and thermal treatment at high temperatures. The cell was successfully used in the reaction of dry reforming of methane (DRM). We present X-ray absorption results in the fluorescence detection mode on a 0.4 wt. % Ni/ZrO2 catalyst under realistic conditions at 750 °C, reproducing the conditions used for a conventional dynamic microreactor for the DRM reaction. The setup includes a gas distribution system that can be fully remotely operated. The reaction cell offers the possibility of transmission and fluorescence detection modes. The complete setup dedicated to the study of catalysts is permanently installed on the Collaborating Research Groups French Absorption spectroscopy beamline in Material and Environmental sciences (CRG-FAME) and French Absorption spectroscopy beamline in Material and Environmental sciences at Ultra-High Dilution (FAME-UHD) beamlines (BM30B and BM16) at the European Synchrotron Radiation Facility in Grenoble, France.

  1. A new high temperature reactor for operando XAS: Application for the dry reforming of methane over Ni/ZrO2 catalyst

    NASA Astrophysics Data System (ADS)

    Aguilar-Tapia, Antonio; Ould-Chikh, Samy; Lahera, Eric; Prat, Alain; Delnet, William; Proux, Olivier; Kieffer, Isabelle; Basset, Jean-Marie; Takanabe, Kazuhiro; Hazemann, Jean-Louis

    2018-03-01

    The construction of a high-temperature reaction cell for operando X-ray absorption spectroscopy characterization is reported. A dedicated cell was designed to operate as a plug-flow reactor using powder samples requiring gas flow and thermal treatment at high temperatures. The cell was successfully used in the reaction of dry reforming of methane (DRM). We present X-ray absorption results in the fluorescence detection mode on a 0.4 wt. % Ni/ZrO2 catalyst under realistic conditions at 750 °C, reproducing the conditions used for a conventional dynamic microreactor for the DRM reaction. The setup includes a gas distribution system that can be fully remotely operated. The reaction cell offers the possibility of transmission and fluorescence detection modes. The complete setup dedicated to the study of catalysts is permanently installed on the Collaborating Research Groups French Absorption spectroscopy beamline in Material and Environmental sciences (CRG-FAME) and French Absorption spectroscopy beamline in Material and Environmental sciences at Ultra-High Dilution (FAME-UHD) beamlines (BM30B and BM16) at the European Synchrotron Radiation Facility in Grenoble, France.

  2. Comparison between multi-constellation ambiguity-fixed PPP and RTK for maritime precise navigation

    NASA Astrophysics Data System (ADS)

    Tegedor, Javier; Liu, Xianglin; Ørpen, Ole; Treffers, Niels; Goode, Matthew; Øvstedal, Ola

    2015-06-01

    In order to achieve high-accuracy positioning, either Real-Time Kinematic (RTK) or Precise Point Positioning (PPP) techniques can be used. While RTK normally delivers higher accuracy with shorter convergence times, PPP has been an attractive technology for maritime applications, as it delivers uniform positioning performance without the direct need of a nearby reference station. Traditional PPP has been based on ambiguity-float solutions using the GPS and Glonass constellations. However, the addition of new satellite systems, such as Galileo and BeiDou, and the possibility of fixing integer carrier-phase ambiguities (PPP-AR) make it possible to increase PPP accuracy. In this article, a performance assessment of RTK, PPP and PPP-AR has been carried out using GNSS data collected from two antennas installed on a ferry navigating in Oslo (Norway). RTK solutions have been generated using short, medium and long baselines (up to 290 km). For the generation of PPP-AR solutions, Uncalibrated Hardware Delays (UHDs) for GPS, Galileo and BeiDou have been estimated using reference stations in Oslo and Onsala. The performances of RTK and of multi-constellation PPP and PPP-AR are presented.

  3. Polycrystalline structures formed in evaporating droplets as a parameter to test the action of Zincum metallicum 30c in a wheat seed model.

    PubMed

    Kokornaczyk, Maria Olga; Baumgartner, Stephan; Betti, Lucietta

    2016-05-01

    Polycrystalline structures formed inside evaporating droplets of different biological fluids have been shown to be sensitive to various influences, including ultra-high dilutions (UHDs), thus representing a new approach potentially useful for basic research in homeopathy. In the present study, we tested the efficacy of Zincum metallicum 30c versus lactose 30c and water in a wheat seed model. Stressed and non-stressed wheat seeds were watered with the three treatments. Seed-leakage droplets were evaporated, and the polycrystalline structures formed inside the droplet residues were analyzed for their local connected fractal dimensions (LCFDs) (a measure of complexity) using the software ImageJ. We found significant differences in the LCFD values of polycrystalline structures obtained from stressed seeds following the treatments (p<0.0001); Zincum metallicum 30c lowered the structures' complexity compared to lactose 30c and water. In non-stressed seeds, no significant differences were found. The droplet evaporation method (DEM) might represent a potentially useful tool for basic research in homeopathy. Furthermore, our results suggest a sensitization of the stressed model towards the treatment action, which conforms to previous findings. Copyright © 2015 The Faculty of Homeopathy. Published by Elsevier Ltd. All rights reserved.

  4. Comparison of compression efficiency between HEVC/H.265 and VP9 based on subjective assessments

    NASA Astrophysics Data System (ADS)

    Řeřábek, Martin; Ebrahimi, Touradj

    2014-09-01

    The current increasing effort of broadcast providers to transmit UHD (Ultra High Definition) content is likely to increase demand for ultra-high-definition televisions (UHDTVs). To compress UHDTV content, several alternative encoding mechanisms exist. In addition to internationally recognized standards, open access proprietary options, such as the VP9 video encoding scheme, have recently appeared and are gaining popularity. One of the main goals of these encoders is to efficiently compress video sequences beyond HDTV resolution for various scenarios, such as broadcasting or internet streaming. In this paper, a broadcast-scenario rate-distortion performance analysis and mutual comparison of one of the latest video coding standards, H.265/HEVC, with the recently released proprietary video coding scheme VP9 is presented. In addition, currently one of the most popular and widespread encoders, H.264/AVC, has been included in the evaluation to serve as a comparison baseline. The comparison is performed by means of subjective evaluations showing actual differences between encoding algorithms in terms of perceived quality. The results indicate a general dominance of the HEVC-based encoding algorithm in comparison to the other alternatives, with VP9 and AVC showing similar performance.

  5. The optical design of ultra-short throw system for panel emitted theater video system

    NASA Astrophysics Data System (ADS)

    Huang, Jiun-Woei

    2015-07-01

    In the past decade, the evolution of display formats from HD (High Definition) through Full HD (1920×1080) to UHD (4K×2K) has pushed the display industry in two directions: liquid crystal displays (LCDs), ranging from 10 inches to 100 inches and more, and projectors. Although LCDs are widely used in the market, producing such displays requires large capital expenditure and raises environmental pollution and protection concerns [1]. Projection systems are therefore worth considering, offering wider viewing access, flexibility of location, energy savings and environmental benefits. The topic of this work is to design and fabricate a short-throw liquid crystal on silicon (LCoS) projection system for cinema. It provides a projection lens system including a telecentric lens, fitted to the emissive LCoS panel, that collimates light to enlarge the field angle. The optical path is then guided by a symmetric lens. Light from the LCoS passes through the lens and reflects off an aspherical mirror to form a low-distortion image on a blank wall or screen for home cinema. The throw ratio is less than 0.33.
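
    For context on the 0.33 figure: throw ratio is conventionally defined as throw distance divided by image width, so a ratio below 0.33 means the projector can sit very close to the wall. A minimal sketch (the screen widths are example values, not taken from the paper):

```python
# Back-of-the-envelope check of what a throw ratio below 0.33 implies.
# Throw ratio = throw distance / image width; the 0.33 figure is from the
# abstract above, the screen sizes below are invented examples.

def throw_distance(image_width_m: float, throw_ratio: float) -> float:
    """Distance from lens to screen needed for a given image width."""
    return image_width_m * throw_ratio

for width in (1.5, 2.0, 3.0):
    d = throw_distance(width, 0.33)
    print(f"{width:.1f} m wide image -> projector within {d:.2f} m of the wall")
```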

  6. Open versus percutaneous instrumentation in thoracolumbar fractures: magnetic resonance imaging comparison of paravertebral muscles after implant removal.

    PubMed

    Ntilikina, Yves; Bahlau, David; Garnon, Julien; Schuller, Sébastien; Walter, Axel; Schaeffer, Mickaël; Steib, Jean-Paul; Charles, Yann Philippe

    2017-08-01

    OBJECTIVE Percutaneous instrumentation in thoracolumbar fractures is intended to decrease paravertebral muscle damage by avoiding dissection. The aim of this study was to compare muscles at instrumented levels in patients who were treated by open or percutaneous surgery. METHODS Twenty-seven patients underwent open instrumentation, and 65 were treated percutaneously. A standardized MRI protocol using axial T1-weighted sequences was performed at a minimum 1-year follow-up after implant removal. Two independent observers measured cross-sectional areas (CSAs, in cm²) and region of interest (ROI) signal intensity (in pixels) of paravertebral muscles by using OsiriX at the fracture level, and at cranial and caudal instrumented pedicle levels. An interobserver comparison was made using the Bland-Altman method. Reference ROI muscle was assessed in the psoas and ROI fat subcutaneously. The ratio ROI-CSA/ROI-fat was compared for patients treated with open versus percutaneous procedures by using a linear mixed model. A linear regression analyzed additional factors: age, sex, body mass index (BMI), Pfirrmann grade of adjacent discs, and duration of instrumentation in situ. RESULTS The interobserver agreement was good for all CSAs. The average CSA for the entire spine was 15.7 cm² in the open surgery group and 18.5 cm² in the percutaneous group (p = 0.0234). The average ROI-fat and ROI-muscle signal intensities were comparable: 497.1 versus 483.9 pixels for ROI-fat and 120.4 versus 111.7 pixels for ROI-muscle in open versus percutaneous groups. The ROI-CSA varied between 154 and 226 for open, and between 154 and 195 for percutaneous procedures, depending on instrumented levels. A significant difference of the ROI-CSA/ROI-fat ratio (0.4 vs 0.3) was present at fracture levels T12-L1 (p = 0.0329) and at adjacent cranial (p = 0.0139) and caudal (p = 0.0100) instrumented levels. Differences were not significant at thoracic levels. 
When adjusting based on age, BMI, and Pfirrmann

  7. ISSLS PRIZE IN BIOENGINEERING SCIENCE 2017: Automation of reading of radiological features from magnetic resonance images (MRIs) of the lumbar spine without human intervention is comparable with an expert radiologist.

    PubMed

    Jamaludin, Amir; Lootus, Meelis; Kadir, Timor; Zisserman, Andrew; Urban, Jill; Battié, Michele C; Fairbank, Jeremy; McCall, Iain

    2017-05-01

    Investigation of the automation of radiological features from magnetic resonance images (MRIs) of the lumbar spine. To automate the process of grading lumbar intervertebral discs and vertebral bodies from MRIs. MR imaging is the most common imaging technique used in investigating low back pain (LBP). Various features of degradation, based on MRIs, are commonly recorded and graded, e.g., Modic change and Pfirrmann grading of intervertebral discs. Consistent scoring and grading is important for developing robust clinical systems and research. Automation facilitates this consistency and reduces the time of radiological analysis considerably and hence the expense. 12,018 intervertebral discs, from 2009 patients, were graded by a radiologist and were then used to train: (1) a system to detect and label vertebrae and discs in a given scan, and (2) a convolutional neural network (CNN) model that predicts several radiological gradings. The performance of the model, in terms of class average accuracy, was compared with the intra-observer class average accuracy of the radiologist. The detection system achieved 95.6% accuracy in terms of disc detection and labeling. The model is able to produce predictions of multiple pathological gradings that consistently matched those of the radiologist. The model identifies 'Evidence Hotspots' that are the voxels that most contribute to the degradation scores. Automation of radiological grading is now on par with human performance. The system can be beneficial in aiding clinical diagnoses in terms of objectivity of gradings and the speed of analysis. It can also draw the attention of a radiologist to regions of degradation. This objectivity and speed is an important stepping stone in the investigation of the relationship between MRIs and clinical diagnoses of back pain in large cohorts. Level 3.
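
    A note on the metric used above: class-average accuracy weights each grade equally, which matters when grade distributions are imbalanced (low grades are far more common than high ones). The sketch below is a generic illustration with made-up labels, not the paper's code:

```python
# Class-average accuracy vs. plain accuracy on an imbalanced toy example.
# Labels below are invented; in the paper's setting they would be
# radiological gradings such as Pfirrmann grades.

from collections import defaultdict

def class_average_accuracy(y_true, y_pred):
    """Mean of per-class accuracies, so rare grades count as much as common ones."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for t, p in zip(y_true, y_pred):
        total[t] += 1
        correct[t] += int(t == p)
    return sum(correct[c] / total[c] for c in total) / len(total)

# 8 grade-1 discs (all predicted correctly), 2 grade-5 discs (both missed):
y_true = [1] * 8 + [5] * 2
y_pred = [1] * 8 + [1] * 2

plain = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
print(plain)                                   # 0.8
print(class_average_accuracy(y_true, y_pred))  # 0.5
```

    Plain accuracy looks good here only because the common class dominates; the class-average figure exposes the missed rare grade.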

  8. Markov Processes in Image Processing

    NASA Astrophysics Data System (ADS)

    Petrov, E. P.; Kharina, N. L.

    2018-05-01

    Digital images are used as information carriers in many sciences and technologies, and there is a trend toward increasing the number of bits per pixel in order to capture more information. In this paper, methods of compression and contour detection based on two-dimensional Markov chains are proposed. Increasing the bit depth allows fine object details to be resolved more precisely, but it significantly complicates image processing. The proposed methods match well-known analogues in efficiency and surpass them in processing speed. An image is separated into binary images, each of which is processed in parallel, so processing time does not grow as the bit depth increases. A further advantage of the methods is low energy consumption: only logical operations are used, with no arithmetic computations. The methods can be useful for processing images of any class and purpose in systems with limited time and energy resources.
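
    The separation into binary images described above corresponds to bit-plane decomposition: an n-bit image becomes n binary planes that can be processed independently. A minimal sketch (the pixel values and helper names are illustrative, not from the paper):

```python
# Bit-plane decomposition of an 8-bit image into binary planes and back.
# Pixel values below are made up for illustration.

def to_bit_planes(pixels, bits=8):
    """Split n-bit pixel values into `bits` binary planes, LSB plane first."""
    return [[(p >> b) & 1 for p in pixels] for b in range(bits)]

def from_bit_planes(planes):
    """Reassemble pixel values from binary planes (inverse of to_bit_planes)."""
    n = len(planes[0])
    return [sum(planes[b][i] << b for b in range(len(planes))) for i in range(n)]

pixels = [0, 255, 37, 128]                # a tiny 8-bit "image", flattened
planes = to_bit_planes(pixels)
print(planes[0])                          # LSB plane: [0, 1, 1, 0]
print(from_bit_planes(planes) == pixels)  # True
```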

  9. [Effect of exercise load on apparent diffusion coefficient and fractional anisotropy of normal lumbar intervertebral discs in diffusion tensor imaging].

    PubMed

    Zhong, Xiu; Qiu, Shijun

    2015-06-01

    To investigate the effect of exercise load on the apparent diffusion coefficient (ADC) and fractional anisotropy (FA) of normal lumbar intervertebral discs in magnetic resonance (MR) diffusion tensor imaging (DTI). Thirty healthy volunteers (24 males and 6 females, aged 19 to 25 years) underwent MR T2WI and DTI examinations of the lumbar intervertebral discs before and after exercise load. Pfirrmann grading was evaluated with T2WI, and the B0, ADC and FA maps were reconstructed from the DTI data to investigate the changes in ADC and FA after exercise. Of the 30 volunteers (150 intervertebral discs) receiving the examination, 27 with discs of Pfirrmann grade II were included for analysis. In these 27 volunteers, the average ADC and FA before exercise were (1.99 ± 0.18)×10⁻³ mm²/s and 0.155 ± 0.059, respectively. After exercise, ADC was lowered significantly to (1.93 ± 0.17)×10⁻³ mm²/s (P<0.05) and FA increased slightly to 0.1623 ± 0.017 (P>0.05). DTI allows quantitative analysis of the changes in water molecular diffusion and anisotropy of the lumbar intervertebral discs after exercise load, which can cause a decreased ADC and an increased FA value; the change in ADC is the more sensitive to exercise load.
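
    For context on the reported ADC values: under the standard mono-exponential diffusion model, S_b = S_0·exp(−b·ADC), the ADC can be estimated from signals acquired at two b-values. The signal numbers below are invented for illustration, not taken from the study:

```python
# Two-point ADC estimate from the mono-exponential diffusion model.
# Signal values and the b-value below are hypothetical examples.

import math

def adc(s0: float, sb: float, b: float) -> float:
    """Apparent diffusion coefficient (mm^2/s) from b=0 and b-value signals."""
    return math.log(s0 / sb) / b

# e.g. b = 500 s/mm^2, with the signal dropping from 1000 to 370:
print(adc(1000.0, 370.0, 500.0))  # ~1.99e-3 mm^2/s, in the range reported above
```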

  10. Process and Post-Process: A Discursive History.

    ERIC Educational Resources Information Center

    Matsuda, Paul Kei

    2003-01-01

    Examines the history of process and post-process in composition studies, focusing on ways in which terms, such as "current-traditional rhetoric,""process," and "post-process" have contributed to the discursive construction of reality. Argues that use of the term post-process in the context of second language writing needs to be guided by a…

  11. Perceptual processing affects conceptual processing.

    PubMed

    Van Dantzig, Saskia; Pecher, Diane; Zeelenberg, René; Barsalou, Lawrence W

    2008-04-05

    According to the Perceptual Symbols Theory of cognition (Barsalou, 1999), modality-specific simulations underlie the representation of concepts. A strong prediction of this view is that perceptual processing affects conceptual processing. In this study, participants performed a perceptual detection task and a conceptual property-verification task in alternation. Responses on the property-verification task were slower for those trials that were preceded by a perceptual trial in a different modality than for those that were preceded by a perceptual trial in the same modality. This finding of a modality-switch effect across perceptual processing and conceptual processing supports the hypothesis that perceptual and conceptual representations are partially based on the same systems. 2008 Cognitive Science Society, Inc.

  12. Process Correlation Analysis Model for Process Improvement Identification

    PubMed Central

    Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses; based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in those process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort and expertise required. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data. PMID:24977170

  13. Process correlation analysis model for process improvement identification.

    PubMed

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses; based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in those process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort and expertise required. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  14. 5 CFR 1653.13 - Processing legal processes.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 3 2014-01-01 2014-01-01 false Processing legal processes. 1653.13 Section 1653.13 Administrative Personnel FEDERAL RETIREMENT THRIFT INVESTMENT BOARD COURT ORDERS AND LEGAL PROCESSES AFFECTING THRIFT SAVINGS PLAN ACCOUNTS Legal Process for the Enforcement of a Participant's Legal...

  15. 5 CFR 1653.13 - Processing legal processes.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 3 2013-01-01 2013-01-01 false Processing legal processes. 1653.13 Section 1653.13 Administrative Personnel FEDERAL RETIREMENT THRIFT INVESTMENT BOARD COURT ORDERS AND LEGAL PROCESSES AFFECTING THRIFT SAVINGS PLAN ACCOUNTS Legal Process for the Enforcement of a Participant's Legal...

  16. Management of processes of electrochemical dimensional processing

    NASA Astrophysics Data System (ADS)

    Akhmetov, I. D.; Zakirova, A. R.; Sadykov, Z. B.

    2017-09-01

    In various industries, many high-precision parts are produced from hard-to-machine, scarce materials. Such parts can be formed only by non-contact processing, or with minimal mechanical force, which is achievable by using, for example, electrochemical machining. At the present stage of development of metalworking processes, the management of electrochemical machining and its automation are important issues. This article presents some indicators and factors of the electrochemical machining process.

  17. Perceptual Processing Affects Conceptual Processing

    ERIC Educational Resources Information Center

    van Dantzig, Saskia; Pecher, Diane; Zeelenberg, Rene; Barsalou, Lawrence W.

    2008-01-01

    According to the Perceptual Symbols Theory of cognition (Barsalou, 1999), modality-specific simulations underlie the representation of concepts. A strong prediction of this view is that perceptual processing affects conceptual processing. In this study, participants performed a perceptual detection task and a conceptual property-verification task…

  18. SAR processing using SHARC signal processing systems

    NASA Astrophysics Data System (ADS)

    Huxtable, Barton D.; Jackson, Christopher R.; Skaron, Steve A.

    1998-09-01

    Synthetic aperture radar (SAR) is uniquely suited to help solve the Search and Rescue problem since it can be utilized by day or night and through both dense fog and thick cloud cover. Other papers in this session, and in this session in 1997, describe the various SAR image processing algorithms that are being developed and evaluated within the Search and Rescue Program. All of these approaches to using SAR data require substantial amounts of digital signal processing: for the SAR image formation, and possibly for the subsequent image processing. In recognition of the demanding processing that will be required for an operational Search and Rescue Data Processing System (SARDPS), NASA/Goddard Space Flight Center and NASA/Stennis Space Center are conducting a technology demonstration utilizing SHARC multi-chip modules from Boeing to perform SAR image formation processing.

  19. Improving the process of process modelling by the use of domain process patterns

    NASA Astrophysics Data System (ADS)

    Koschmider, Agnes; Reijers, Hajo A.

    2015-01-01

    The use of business process models has become prevalent in a wide area of enterprise applications. But while their popularity is expanding, concerns are growing with respect to their proper creation and maintenance. An obvious way to boost the efficiency of creating high-quality business process models would be to reuse relevant parts of existing models. At this point, however, limited support exists to guide process modellers towards the usage of appropriate model content. In this paper, a set of content-oriented patterns is presented, which is extracted from a large set of process models from the order management and manufacturing production domains. The patterns are derived using a newly proposed set of algorithms, which are being discussed in this paper. The authors demonstrate how such Domain Process Patterns, in combination with information on their historic usage, can support process modellers in generating new models. To support the wider dissemination and development of Domain Process Patterns within and beyond the studied domains, an accompanying website has been set up.

  20. High-throughput process development: I. Process chromatography.

    PubMed

    Rathore, Anurag S; Bhambure, Rahul

    2014-01-01

    Chromatographic separation serves as "a workhorse" for downstream process development and plays a key role in the removal of product-related, host cell-related, and process-related impurities. Complex and poorly characterized raw materials and feed material, low feed concentration, product instability, and poor mechanistic understanding of the processes are some of the critical challenges that are faced during development of a chromatographic step. Traditional process development is performed as a trial-and-error-based evaluation and often leads to a suboptimal process. A high-throughput process development (HTPD) platform involves an integration of miniaturization, automation, and parallelization and provides a systematic approach for time- and resource-efficient chromatography process development. Creation of such platforms requires integration of mechanistic knowledge of the process with various statistical tools for data analysis. The relevance of such a platform is high in view of the constraints with respect to time and resources that the biopharma industry faces today. This protocol describes the steps involved in performing HTPD of a process chromatography step. It describes the operation of a commercially available device (PreDictor™ plates from GE Healthcare), available in 96-well format with 2 or 6 μL well sizes. We also discuss the challenges that one faces when performing such experiments as well as possible solutions to alleviate them. Besides describing the operation of the device, the protocol also presents an approach for statistical analysis of the data that is gathered from such a platform. A case study involving use of the protocol for examining ion-exchange chromatography of granulocyte colony-stimulating factor (GCSF), a therapeutic product, is briefly discussed. This is intended to demonstrate the usefulness of this protocol in generating data that is representative of the data obtained at the traditional lab scale. The agreement in the

  1. Process-based tolerance assessment of connecting rod machining process

    NASA Astrophysics Data System (ADS)

    Sharma, G. V. S. S.; Rao, P. Srinivasa; Surendra Babu, B.

    2016-06-01

    Process tolerancing based on the process capability studies is the optimistic and pragmatic approach of determining the manufacturing process tolerances. On adopting the define-measure-analyze-improve-control approach, the process potential capability index (Cp) and the process performance capability index (Cpk) values of identified process characteristics of the connecting rod machining process are achieved to be greater than the industry benchmark of 1.33, i.e., four sigma level. The tolerance chain diagram methodology is applied to the connecting rod in order to verify the manufacturing process tolerances at various operations of the connecting rod manufacturing process. This paper bridges the gap between the existing dimensional tolerances obtained via tolerance charting and process capability studies of the connecting rod component. Finally, the process tolerancing comparison has been done by adopting a tolerance capability expert software.
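
    The two capability indices named above have standard definitions: Cp = (USL − LSL)/(6σ) and Cpk = min(USL − μ, μ − LSL)/(3σ). A minimal sketch (the specification limits and process statistics are made up, not taken from the connecting rod study):

```python
# Standard process capability indices; spec limits and process statistics
# below are hypothetical, not from the connecting rod study.

def cp(usl: float, lsl: float, sigma: float) -> float:
    """Process potential capability: tolerance width vs. 6-sigma process spread."""
    return (usl - lsl) / (6 * sigma)

def cpk(usl: float, lsl: float, mu: float, sigma: float) -> float:
    """Process performance capability: like Cp, but penalizes an off-center mean."""
    return min(usl - mu, mu - lsl) / (3 * sigma)

# A centered process whose tolerance band spans +/-4 sigma reaches the
# four-sigma benchmark Cp = Cpk = 1.33 cited in the abstract:
print(round(cp(54.0, 46.0, 1.0), 2))         # 1.33
print(round(cpk(54.0, 46.0, 50.0, 1.0), 2))  # 1.33
```

    Note that Cpk equals Cp only when the process mean sits exactly at the midpoint of the specification limits; any mean shift lowers Cpk while leaving Cp unchanged.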

  2. Z-correction, a method for achieving ultraprecise self-calibration on large area coordinate measurement machines for photomasks

    NASA Astrophysics Data System (ADS)

    Ekberg, Peter; Stiblert, Lars; Mattsson, Lars

    2014-05-01

    High-quality photomasks are a prerequisite for the production of flat panel TVs, tablets and other kinds of high-resolution displays. During the past years, the resolution demand has accelerated more and more, and today the high-definition standard HD, 1920 × 1080 pixels, is well established, while the next-generation so-called ultra-high-definition UHD or 4K display is already entering the market. Highly advanced mask writers are used to produce the photomasks needed for the production of such displays. The dimensional tolerance in X and Y on absolute pattern placement on these photomasks, with sizes of square meters, has been in the range of 200-300 nm (3σ), but is now on the way to be <150 nm (3σ). To verify these photomasks, 2D ultra-precision coordinate measurement machines are used with even tighter tolerance requirements. The metrology tool MMS15000 is today the world standard tool used for the verification of large area photomasks. This paper will present a method called Z-correction that has been developed for the purpose of improving the absolute X, Y placement accuracy of features on the photomask in the writing process. However, Z-correction is also a prerequisite for achieving X and Y uncertainty levels <90 nm (3σ) in the self-calibration process of the MMS15000 stage area of 1.4 × 1.5 m². For uncertainty specifications below 200 nm (3σ) over such a large area, the calibration object used, here an 8-16 mm thick quartz plate of size approximately a square meter, cannot be treated as a rigid body. The reason for this is that the absolute shape of the plate will be affected by gravity and will therefore not be the same at different places on the measurement machine stage when it is used in the self-calibration process. This mechanical deformation will stretch or compress the top surface (i.e. the image side) of the plate where the pattern resides, and therefore spatially deform the mask pattern in the X- and Y-directions. Errors due

  3. Value-driven process management: using value to improve processes.

    PubMed

    Melnyk, S A; Christensen, R T

    2000-08-01

    Every firm can be viewed as consisting of various processes. These processes affect everything the firm does, from accepting orders and designing products to scheduling production. In many firms, the management of processes reflects considerations of efficiency (cost) rather than effectiveness (value). In this article, we introduce a well-structured process for managing processes that begins not with the process but with the customer, the product, and the concept of value. The approach progresses through a number of steps, including defining value, generating the appropriate metrics, identifying the critical processes, mapping and assessing the performance of these processes, and identifying long- and short-term areas for action. What makes this approach so powerful is that it explicitly links the customer to the process and evaluates the process in terms of its ability to serve customers effectively.

  4. 5 CFR 1653.13 - Processing legal processes.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... legal processes from the TSP is governed solely by the Federal Employees' Retirement System Act, 5 U.S.C... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Processing legal processes. 1653.13 Section 1653.13 Administrative Personnel FEDERAL RETIREMENT THRIFT INVESTMENT BOARD COURT ORDERS AND LEGAL...

  5. Coal liquefaction process with enhanced process solvent

    DOEpatents

    Givens, Edwin N.; Kang, Dohee

    1984-01-01

    In an improved coal liquefaction process, including a critical solvent deashing stage, high value product recovery is improved and enhanced process-derived solvent is provided by recycling second separator underflow in the critical solvent deashing stage to the coal slurry mix, for inclusion in the process solvent pool.

  6. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

    Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty and ignore the model uncertainty of process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating model averaging methods into the framework of variance-based global sensitivity analysis, given that model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general and can be applied to a wide range of problems in hydrology and beyond.

  7. Defense Waste Processing Facility Process Enhancements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bricker, Jonathan

    2010-11-01

    Jonathan Bricker provides an overview of process enhancements currently being done at the Defense Waste Processing Facility (DWPF) at SRS. Some of these enhancements include: melter bubblers; reduction in water use, and alternate reductant.

  8. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE PAGES

    Dai, Heng; Ye, Ming; Walker, Anthony P.; ...

    2017-03-28

    A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
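    The variance decomposition behind the process sensitivity index can be sketched with a small Monte Carlo experiment. Everything below (the two recharge models, the two conductivity models, their parameter ranges, the equal model weights, and the toy output function) is invented for illustration; only the structure, where the index is the variance of the conditional mean over one process divided by the total output variance, follows the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical recharge models (precipitation -> recharge) and two
# hypothetical geology models (hydraulic conductivity); model weights and
# parameter ranges are assumptions for this sketch.
precip = 100.0

def sample_recharge(n):
    m = rng.integers(0, 2, n)            # process-model indicator, weight 0.5
    a = rng.uniform(0.1, 0.3, n)         # each model's own random parameter
    return np.where(m == 0, a * precip, a * precip ** 0.8)

def sample_geology(n):
    m = rng.integers(0, 2, n)
    k = rng.uniform(0.5, 1.5, n)
    return np.where(m == 0, k, np.exp(k))

def output(R, K):
    return R / K                         # toy head-like model response

# Process sensitivity of recharge: variance of the conditional mean
# (averaging over the geology process) divided by total output variance.
n_out, n_in = 500, 500
R = sample_recharge(n_out)
cond_mean = np.array([output(r, sample_geology(n_in)).mean() for r in R])

joint = output(sample_recharge(n_out * n_in), sample_geology(n_out * n_in))
PS_recharge = cond_mean.var() / joint.var()
print(f"process sensitivity of recharge ~ {PS_recharge:.2f}")
```

    Because the model indicator is sampled along with the parameters, the index pools model and parametric uncertainty in a single number, which is the point of the method.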

  9. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

    A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.

  10. Undergraduate Research Program in Atmospheric Science: Houston Ozone Studies

    NASA Astrophysics Data System (ADS)

    Morris, P. A.; Balimuttajjo, M.; Damon, D.; Herridge, A.; Hromis, A. G.; Litwin, D.; Wright, J. M.

    2011-12-01

    The Minority University Consortium for Earth and Space Sciences (MUCESS), composed of the University of Houston-Downtown (UHD), Medgar Evers College (City University of New York), and South Carolina State University, runs an undergraduate atmospheric science program funded by NSF. The program's goal is to increase the participation of minority universities in STEM activities and careers by providing students with the knowledge and skills needed to perform weather balloon launches and to interpret ozone and temperature variations in the troposphere and stratosphere. Ozone profiles up to 30 km altitude are obtained via an instrument payload attached to a weather balloon. The payload instrumentation consists of an EN-SCI ECC ozonesonde and an iMET radiosonde. The data are transmitted to a base station in real time and include pressure, temperature, humidity, and GPS coordinates. This presentation compares our 2011 Houston data to data collected by either UHD or the University of Houston (UH). Our launches are primarily on Sunday, and UH's on Friday. Our primary objective is to identify ground-level ozone variations on Sunday and compare them with weekday levels, as tropospheric ozone is largely controlled by anthropogenic activities. Ozone levels vary with the time of year, temperature, rain, wind direction, chemical plant activities, private and commercial traffic patterns, etc. Our limited Friday launches, supported by UH data, indicate that ground-level ozone is generally elevated in contrast to Sunday data. For example, our Friday July 2011 launch detected elevated low-altitude ozone, with ground-level ozone of 42 nb that increased to 46 nb from 500 m to 1 km. Other peaks are at 2.7 km (44 nb) and 6 km (41 nb), decreasing to 17 nb at the tropopause (12 km). Overall, Sunday low-altitude ozone levels are generally lower. Our Sunday ground-level ozone data range from a low of 25 nb on July 11 to a high of 50 nb on August 1. 
A combination of

  11. Analysis of Hospital Processes with Process Mining Techniques.

    PubMed

    Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises

    2015-01-01

    Process mining allows discovering, monitoring, and improving the processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for the analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed functions in the system to be redefined and a proper flow of information to be proposed. The study exposed the need to incorporate process mining techniques into hospital systems to analyze process execution. Moreover, we illustrate the component's application in making clinical and administrative decisions for the management of hospital activities.

  12. Hyperspectral processing in graphical processing units

    NASA Astrophysics Data System (ADS)

    Winter, Michael E.; Winter, Edwin M.

    2011-06-01

    With the advent of the commercial 3D video card in the mid-1990s, we have seen an order-of-magnitude performance increase with each generation of new video cards. While these cards were designed primarily for visualization and video games, it became apparent after a short while that they could be used for scientific purposes. These Graphical Processing Units (GPUs) are rapidly being incorporated into data processing tasks usually reserved for general-purpose computers. Many image processing problems have been found to scale well to modern GPU systems. We have implemented four popular hyperspectral processing algorithms (N-FINDR, linear unmixing, Principal Components, and the RX anomaly detection algorithm). These algorithms show an across-the-board speedup of at least a factor of 10, with some special cases showing extreme speedups of a hundred times or more.
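    Linear unmixing, one of the four algorithms mentioned, is a good example of why hyperspectral work maps so well to GPUs: each pixel's abundance estimate is an independent small least-squares solve. The sketch below shows the unconstrained variant in NumPy on synthetic data; the endmember spectra, abundances, and noise level are all invented, and a GPU implementation would simply run the per-pixel solves in parallel.

```python
import numpy as np

rng = np.random.default_rng(1)

bands, n_end, n_pix = 50, 3, 1000
E = rng.random((bands, n_end))                      # synthetic endmember spectra
A_true = rng.dirichlet(np.ones(n_end), n_pix).T     # abundances summing to 1
X = E @ A_true + 0.001 * rng.standard_normal((bands, n_pix))  # mixed pixels

# Solve E @ A = X for all pixels at once. Each column of X is an independent
# least-squares problem, which is what makes the workload embarrassingly
# parallel on a GPU.
A_hat, *_ = np.linalg.lstsq(E, X, rcond=None)

err = np.abs(A_hat - A_true).max()
print(f"max abundance error: {err:.4f}")
```

    Production unmixing usually adds non-negativity or sum-to-one constraints; those variants are still per-pixel independent, so the parallelism argument is unchanged.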

  13. Voyager image processing at the Image Processing Laboratory

    NASA Astrophysics Data System (ADS)

    Jepsen, P. L.; Mosher, J. A.; Yagi, G. M.; Avis, C. C.; Lorre, J. J.; Garneau, G. W.

    1980-09-01

    This paper discusses new digital processing techniques as applied to the Voyager Imaging Subsystem and devised to explore atmospheric dynamics, spectral variations, and the morphology of Jupiter, Saturn and their satellites. Radiometric and geometric decalibration processes, the modulation transfer function, and processes to determine and remove photometric properties of the atmosphere and surface of Jupiter and its satellites are examined. It is exhibited that selected images can be processed into 'approach at constant longitude' time lapse movies which are useful in observing atmospheric changes of Jupiter. Photographs are included to illustrate various image processing techniques.

  14. Voyager image processing at the Image Processing Laboratory

    NASA Technical Reports Server (NTRS)

    Jepsen, P. L.; Mosher, J. A.; Yagi, G. M.; Avis, C. C.; Lorre, J. J.; Garneau, G. W.

    1980-01-01

    This paper discusses new digital processing techniques as applied to the Voyager Imaging Subsystem and devised to explore atmospheric dynamics, spectral variations, and the morphology of Jupiter, Saturn and their satellites. Radiometric and geometric decalibration processes, the modulation transfer function, and processes to determine and remove photometric properties of the atmosphere and surface of Jupiter and its satellites are examined. It is exhibited that selected images can be processed into 'approach at constant longitude' time lapse movies which are useful in observing atmospheric changes of Jupiter. Photographs are included to illustrate various image processing techniques.

  15. Assessment of hospital processes using a process mining technique: Outpatient process analysis at a tertiary hospital.

    PubMed

    Yoo, Sooyoung; Cho, Minsu; Kim, Eunhye; Kim, Seok; Sim, Yerim; Yoo, Donghyun; Hwang, Hee; Song, Minseok

    2016-04-01

    Many hospitals are increasing their efforts to improve processes because processes play an important role in enhancing work efficiency and reducing costs. However, to date, a quantitative tool has not been available to examine the before and after effects of processes and environmental changes, other than the use of indirect indicators, such as mortality rate and readmission rate. This study used process mining technology to analyze process changes based on changes in the hospital environment, such as the construction of a new building, and to measure the effects of environmental changes in terms of consultation wait time, time spent per task, and outpatient care processes. Using process mining technology, electronic health record (EHR) log data of outpatient care before and after constructing a new building were analyzed, and the effectiveness of the technology in terms of the process was evaluated. Using the process mining technique, we found that the total time spent in outpatient care did not increase significantly compared to that before the construction of a new building, considering that the number of outpatients increased, and the consultation wait time decreased. These results suggest that the operation of the outpatient clinic was effective after changes were implemented in the hospital environment. We further identified improvements in processes using the process mining technique, thereby demonstrating the usefulness of this technique for analyzing complex hospital processes at a low cost. This study confirmed the effectiveness of process mining technology at an actual hospital site. In future studies, the use of process mining technology will be expanded by applying this approach to a larger variety of process change situations. Copyright © 2016. Published by Elsevier Ireland Ltd.
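    The kind of measurement the study performs, such as consultation wait time derived from EHR logs, can be illustrated with a minimal event-log analysis. The log below is invented (it is not the hospital's data), and real process mining tools discover full control-flow models rather than just transition times, but the core input of (case, activity, timestamp) triples is the same.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical outpatient event log: (case id, activity, timestamp).
log = [
    ("p1", "registration", "2016-03-01 09:00"),
    ("p1", "consultation", "2016-03-01 09:40"),
    ("p1", "payment",      "2016-03-01 10:00"),
    ("p2", "registration", "2016-03-01 09:10"),
    ("p2", "consultation", "2016-03-01 09:35"),
    ("p2", "payment",      "2016-03-01 09:50"),
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M")

# Group events by case, then measure the wait between consecutive activities.
cases = defaultdict(list)
for cid, act, ts in log:
    cases[cid].append((parse(ts), act))

waits = defaultdict(list)
for events in cases.values():
    events.sort()
    for (t0, a0), (t1, a1) in zip(events, events[1:]):
        waits[(a0, a1)].append((t1 - t0).total_seconds() / 60)

for (a0, a1), mins in sorted(waits.items()):
    print(f"{a0} -> {a1}: mean wait {sum(mins)/len(mins):.1f} min")
```

    Comparing these per-transition means before and after an environmental change (such as the new building in the study) is exactly the before/after quantification the abstract describes.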

  16. Monitoring autocorrelated process: A geometric Brownian motion process approach

    NASA Astrophysics Data System (ADS)

    Li, Lee Siaw; Djauhari, Maman A.

    2013-09-01

    Autocorrelated process control is common in today's industrial process control practice. The current practice is to eliminate the autocorrelation by using an appropriate model, such as a Box-Jenkins model, and then to conduct process control based on the residuals. In this paper we show that many time series are governed by a geometric Brownian motion (GBM) process. In this case, by using the properties of a GBM process, we need only an appropriate transformation and a model of the transformed data to satisfy the conditions required in traditional process control. An industrial example of a cocoa powder production process in a Malaysian company is presented and discussed to illustrate the advantages of the GBM approach.
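    The transformation the authors exploit can be sketched directly: the log-returns of a GBM are i.i.d. normal, so after taking differences of logs a standard Shewhart individuals chart applies to an otherwise strongly autocorrelated series. The drift, volatility, and series length below are illustrative, not the cocoa powder data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate an autocorrelated quality characteristic following a GBM.
mu, sigma, n = 0.001, 0.02, 500
steps = (mu - 0.5 * sigma**2) + sigma * rng.standard_normal(n)
x = 100 * np.exp(np.cumsum(steps))          # raw series: strongly autocorrelated

# GBM log-returns are i.i.d. normal, so conventional 3-sigma limits apply.
r = np.diff(np.log(x))
center, s = r.mean(), r.std(ddof=1)
ucl, lcl = center + 3 * s, center - 3 * s

out_of_control = int(np.sum((r > ucl) | (r < lcl)))
print(f"signals on transformed data: {out_of_control}")
```

    Charting the raw series x with the same 3-sigma recipe would signal constantly because of the autocorrelation; the transformation removes the need for residual-based Box-Jenkins modeling when the GBM assumption holds.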

  17. Integrated Process Modeling-A Process Validation Life Cycle Companion.

    PubMed

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-17

    During the regulatory-requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters across multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the lifecycle concept. These applications are shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows anticipating out-of-specification (OOS) events, identifying critical process parameters, and taking risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
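    The core idea, propagating parameter variation through stacked, interacting unit operations by Monte Carlo to estimate an out-of-specification probability, can be sketched in a few lines. The two unit operations, every distribution, the titer-yield interaction, and the specification limits below are all assumed for illustration; they are not the CMO's process or data.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 100_000

# Hypothetical two-unit-operation process: a fermentation titer feeds a
# purification step whose yield depends (weakly) on the load it receives,
# which is how the sketch captures unit-operation interaction.
titer = rng.normal(5.0, 0.4, N)                                  # g/L
step_yield = np.clip(rng.normal(0.85, 0.05, N)
                     - 0.02 * (titer - 5.0), 0.0, 1.0)
purity = rng.normal(98.5, 0.5, N)                                # %

final_amount = titer * step_yield

# Assumed specifications: final amount >= 3.0 g/L and purity >= 97.0 %.
oos = (final_amount < 3.0) | (purity < 97.0)
p_oos = oos.mean()
print(f"estimated OOS probability ~ {p_oos:.3%}")
```

    With the simulated distribution of CQAs in hand, the same samples also support criticality ranking, e.g. by checking how much the OOS rate moves when one parameter's variance is shrunk.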

  18. MRI evaluation of spontaneous intervertebral disc degeneration in the alpaca cervical spine.

    PubMed

    Stolworthy, Dean K; Bowden, Anton E; Roeder, Beverly L; Robinson, Todd F; Holland, Jacob G; Christensen, S Loyd; Beatty, Amanda M; Bridgewater, Laura C; Eggett, Dennis L; Wendel, John D; Stieger-Vanegas, Susanne M; Taylor, Meredith D

    2015-12-01

    Animal models have historically provided an appropriate benchmark for understanding human pathology, treatment, and healing, but few animals are known to naturally develop intervertebral disc degeneration. The study of degenerative disc disease and its treatment would greatly benefit from a more comprehensive, and comparable animal model. Alpacas have recently been presented as a potential large animal model of intervertebral disc degeneration due to similarities in spinal posture, disc size, biomechanical flexibility, and natural disc pathology. This research further investigated alpacas by determining the prevalence of intervertebral disc degeneration among an aging alpaca population. Twenty healthy female alpacas comprised two age subgroups (5 young: 2-6 years; and 15 older: 10+ years) and were rated according to the Pfirrmann-grade for degeneration of the cervical intervertebral discs. Incidence rates of degeneration showed strong correlations with age and spinal level: younger alpacas were nearly immune to developing disc degeneration, and in older animals, disc degeneration had an increased incidence rate and severity at lower cervical levels. Advanced disc degeneration was present in at least one of the cervical intervertebral discs of 47% of the older alpacas, and it was most common at the two lowest cervical intervertebral discs. The prevalence of intervertebral disc degeneration encourages further investigation and application of the lower cervical spine of alpacas and similar camelids as a large animal model of intervertebral disc degeneration. © 2015 Orthopaedic Research Society. Published by Wiley Periodicals, Inc.

  19. Meat Processing.

    ERIC Educational Resources Information Center

    Legacy, Jim; And Others

    This publication provides an introduction to meat processing for adult students in vocational and technical education programs. Organized in four chapters, the booklet provides a brief overview of the meat processing industry and the techniques of meat processing and butchering. The first chapter introduces the meat processing industry and…

  20. Acetic acid and aromatics units planned in China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alperowicz, N.

    1993-01-27

    The Shanghai Wujing Chemical Complex (SWCC; Shanghai) is proceeding with construction of an acetic acid plant. The 100,000-m.t./year unit will use BP Chemicals carbonylation technology, originally developed by Monsanto. John Brown has been selected by China National Technical Import Corp. (CNTIC) to supply the plant, Chinese sources say. The UK contractor, which competed against Mitsui Engineering & Shipbuilding (Tokyo) and Lurgi (Frankfurt), has built a similar plant for BP in the UK, although using different technology. The new plant will require 54,000 m.t./year of methanol, which is available onsite. Carbon monoxide will be delivered from a new plant. The acetic acid unit will join two other acetic acid plants in China supplied some time ago by Uhde (Dortmund). SWCC is due to be integrated with two adjacent complexes to form Shanghai Pacific Chemical. Meanwhile, four groups are competing to supply a UOP-process aromatics complex for Jilin Chemical Industrial Corp. They are Toyo Engineering, Lurgi, Lucky/Foster Wheeler, and Eurotechnica. The complex will include plants with annual capacities for 115,000 m.t. of benzene, 90,000 m.t. of ortho-xylene, 93,000 m.t. of mixed xylenes, and 20,000 m.t. of toluene. The plants will form part of a $2-billion petrochemical complex based on a 300,000-m.t./year ethylene plant awarded last year to a consortium of Samsung Engineering and Linde. Downstream plants will have annual capacities for 120,000 m.t. of linear low-density polyethylene, 80,000 m.t. of ethylene oxide, 100,000 m.t. of ethylene glycol, 80,000 m.t. of phenol, 100,000 m.t. of acrylonitrile, 20,000 m.t. of sodium cyanide, 40,000 m.t. of phthalic anhydride, 40,000 m.t. of ethylene propylene rubber, 20,000 m.t. of styrene butadiene styrene, and 30,000 m.t. of acrylic fiber.

  1. Process for improving metal production in steelmaking processes

    DOEpatents

    Pal, Uday B.; Gazula, Gopala K. M.; Hasham, Ali

    1996-01-01

    A process and apparatus for improving metal production in ironmaking and steelmaking processes is disclosed. The use of an inert metallic conductor in the slag-containing crucible and the addition of a transition metal oxide to the slag are the disclosed process improvements.

  2. Process for improving metal production in steelmaking processes

    DOEpatents

    Pal, U.B.; Gazula, G.K.M.; Hasham, A.

    1996-06-18

    A process and apparatus for improving metal production in ironmaking and steelmaking processes is disclosed. The use of an inert metallic conductor in the slag-containing crucible and the addition of a transition metal oxide to the slag are the disclosed process improvements. 6 figs.

  3. Process-Response Modeling and the Scientific Process.

    ERIC Educational Resources Information Center

    Fichter, Lynn S.

    1988-01-01

    Discusses the process-response model (PRM) in its theoretical and practical forms. Describes how geologists attempt to reconstruct the process from the response (the geologic phenomenon) being studied. (TW)

  4. Cognitive Processes in Discourse Comprehension: Passive Processes, Reader-Initiated Processes, and Evolving Mental Representations

    ERIC Educational Resources Information Center

    van den Broek, Paul; Helder, Anne

    2017-01-01

    As readers move through a text, they engage in various types of processes that, if all goes well, result in a mental representation that captures their interpretation of the text. With each new text segment the reader engages in passive and, at times, reader-initiated processes. These processes are strongly influenced by the readers'…

  5. In-process and post-process measurements of drill wear for control of the drilling process

    NASA Astrophysics Data System (ADS)

    Liu, Tien-I.; Liu, George; Gao, Zhiyu

    2011-12-01

    Optical inspection with a precision toolmakers' microscope was used in this research for post-process measurement of drill wear. An indirect index, cutting force, is used for in-process drill wear measurement. Using in-process measurements to estimate drill wear for control purposes can decrease the operation cost and enhance product quality and safety. The challenge is to correlate the in-process cutting force measurements with the post-process optical inspection of drill wear. To find the most important feature, the energy principle was used in this research: only the cutting force feature showing the highest sensitivity to drill wear should be selected. The best feature selected is the peak torque in the drilling process. Neuro-fuzzy systems were used for correlation purposes. The Adaptive-Network-Based Fuzzy Inference System (ANFIS) can construct fuzzy rules with membership functions to generate an input-output mapping. A 1x6 ANFIS architecture with product-of-sigmoid membership functions can measure drill wear in-process with an error as low as 0.15%, which is extremely important for control of the drilling process. Furthermore, the measurement of drill wear was performed under different drilling conditions, showing that ANFIS has the capability of generalization.
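    The feature-selection step, picking the cutting-force feature most sensitive to wear, can be illustrated with synthetic data. The sketch below ranks two candidate features by absolute correlation with measured wear rather than by the paper's energy principle, and the wear-torque relationship and noise levels are invented; it is a stand-in for the idea, not the authors' procedure.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic drilling experiments: wear drives peak torque up strongly, while
# a second candidate feature (mean thrust) is only weakly related to wear.
wear = np.linspace(0.0, 0.3, 40)                          # mm flank wear
peak_torque = 2.0 + 8.0 * wear + 0.05 * rng.standard_normal(40)
mean_thrust = 150 + 10 * wear + 15 * rng.standard_normal(40)

# Select the feature with the highest sensitivity to wear.
feats = {"peak_torque": peak_torque, "mean_thrust": mean_thrust}
best = max(feats, key=lambda k: abs(np.corrcoef(feats[k], wear)[0, 1]))
print("most wear-sensitive feature:", best)
```

    Once the best feature is fixed, a learned mapping from that feature back to wear (ANFIS in the paper, but any regressor in principle) provides the in-process wear estimate used for control.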

  6. Reflow process stabilization by chemical characteristics and process conditions

    NASA Astrophysics Data System (ADS)

    Kim, Myoung-Soo; Park, Jeong-Hyun; Kim, Hak-Joon; Kim, Il-Hyung; Jeon, Jae-Ha; Gil, Myung-Goon; Kim, Bong-Ho

    2002-07-01

    With device rules shrinking below 130 nm, the patterning of smaller contact holes with enough process margin is required for mass production. Therefore, a shrinking technology using a thermal reflow process has been applied for smaller contact hole formation. In this paper, we have investigated the effects of chemical characteristics, such as the molecular weight and blocking ratio of the resin, the cross-linker amount, and the solvent type and composition, on the reflow behavior of the resist, and found the optimized chemical composition for reflow-applicable conditions. Several process conditions, such as resist coating thickness and a multi-step thermal reflow method, were also evaluated to stabilize the pattern profile and improve CD uniformity after the reflow process. The experimental results confirmed that the effect of the cross-linker in the resist on reflow properties such as reflow temperature and reflow rate is very critical, as it controls the pattern profile during reflow processing; it also gave stable CD uniformity and improved resist properties for top loss, film shrinkage, and etch selectivity. Applying a lower resist coating thickness produced a symmetric pattern profile, even at the edge, with a wider process margin, and the introduction of a two-step baking method for the reflow process also yielded uniform CD values. We believe that applying a resist containing cross-linker under optimized process conditions is necessary for smaller contact hole patterning in mass production at design rules below 130 nm.

  7. Process-in-Network: A Comprehensive Network Processing Approach

    PubMed Central

    Urzaiz, Gabriel; Villa, David; Villanueva, Felix; Lopez, Juan Carlos

    2012-01-01

    A solid and versatile communications platform is very important in modern Ambient Intelligence (AmI) applications, which usually require the transmission of large amounts of multimedia information over a highly heterogeneous network. This article focuses on the concept of Process-in-Network (PIN), which is defined as the possibility that the network processes information as it is being transmitted, and introduces a more comprehensive approach than current network processing technologies. PIN can take advantage of waiting times in queues of routers, idle processing capacity in intermediate nodes, and the information that passes through the network. PMID:22969390

  8. Statistical Process Control for KSC Processing

    NASA Technical Reports Server (NTRS)

    Ford, Roger G.; Delgado, Hector; Tilley, Randy

    1996-01-01

    The 1996 Summer Faculty Fellowship Program and Kennedy Space Center (KSC) served as the basis for a research effort into statistical process control for KSC processing. The effort entailed several tasks and goals. The first was to develop a customized statistical process control (SPC) course for the Safety and Mission Assurance Trends Analysis Group. The actual teaching of this course took place over several weeks. In addition, an Internet version of the same course complete with animation and video excerpts from the course when it was taught at KSC was developed. The application of SPC to shuttle processing took up the rest of the summer research project. This effort entailed the evaluation of SPC use at KSC, both present and potential, due to the change in roles for NASA and the Single Flight Operations Contractor (SFOC). Individual consulting on SPC use was accomplished as well as an evaluation of SPC software for KSC use in the future. A final accomplishment of the orientation of the author to NASA changes, terminology, data format, and new NASA task definitions will allow future consultation when the needs arise.

  9. Logistics Process Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2008-03-31

    LPAT is the integrated system combining the ANL-developed Enhanced Logistics Intra Theater Support Tool (ELIST), sponsored by SDDC-TEA, and the Fort Future Virtual Installation Tool, sponsored by CERL. The Fort Future Simulation Engine was an application written in the ANL Repast Simphony framework and used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program.

  10. Processing module operating methods, processing modules, and communications systems

    DOEpatents

    McCown, Steven Harvey; Derr, Kurt W.; Moore, Troy

    2014-09-09

    A processing module operating method includes using a processing module physically connected to a wireless communications device, requesting that the wireless communications device retrieve encrypted code from a web site and receiving the encrypted code from the wireless communications device. The wireless communications device is unable to decrypt the encrypted code. The method further includes using the processing module, decrypting the encrypted code, executing the decrypted code, and preventing the wireless communications device from accessing the decrypted code. Another processing module operating method includes using a processing module physically connected to a host device, executing an application within the processing module, allowing the application to exchange user interaction data communicated using a user interface of the host device with the host device, and allowing the application to use the host device as a communications device for exchanging information with a remote device distinct from the host device.

  11. The Research Process on Converter Steelmaking Process by Using Limestone

    NASA Astrophysics Data System (ADS)

    Tang, Biao; Li, Xing-yi; Cheng, Han-chi; Wang, Jing; Zhang, Yun-long

    2017-08-01

    Compared with the traditional converter steelmaking process, the steelmaking process with limestone partly replaces lime with limestone. Many researchers have studied the new process; related work covers material balance calculation, the behaviour of limestone in the slag, limestone powder injection in the converter, and the application of limestone in iron and steel enterprises. The results show that the surplus heat of the converter can meet the needs of limestone calcination, and that the new process can reduce energy loss in the whole steelmaking process, reduce carbon dioxide emissions, and improve the quality of the gas.

  12. Supercritical crystallization: The RESS-process and the GAS-process

    NASA Astrophysics Data System (ADS)

    Berends, Edwin M.

    1994-09-01

    This Ph.D. thesis describes the development of two novel crystallization processes utilizing supercritical fluids either as a solvent (the RESS-process) or as an anti-solvent (the GAS-process). In the RESS-process, precipitation of the solute is performed by expansion of the solution over a nozzle to produce ultra-fine, monodisperse particles without any solvent inclusions. In the GAS-process, a high pressure gas is dissolved into the liquid phase solvent, where it causes a volumetric expansion of this liquid solvent and lowers the equilibrium solubility. Particle size, particle size distribution, and other particle characteristics such as shape, internal structure, and the residual amount of solvent in the particles are expected to be influenced by the liquid phase expansion profile.

  13. Membrane processes

    NASA Astrophysics Data System (ADS)

    Staszak, Katarzyna

    2017-11-01

    Membrane processes have played an important role in industrial separation. These technologies can be found in all industrial areas, such as food, beverages, metallurgy, pulp and paper, textile, pharmaceutical, automotive, biotechnology, and chemical industries, as well as in water treatment for domestic and industrial applications. Although these processes have been known since the twentieth century, many studies still focus on testing new membrane materials and determining the conditions for optimal selectivity, i.e. the optimum transmembrane pressure (TMP) or permeate flux that minimizes fouling. Moreover, researchers have proposed calculation methods to predict membrane process properties. In this article, laboratory-scale experiments on membrane separation techniques, as well as their validation by calculation methods, are presented. Because the membrane is the "heart" of the process, experimental and computational methods for its characterization are also described.

  14. Process Analytical Technology (PAT): batch-to-batch reproducibility of fermentation processes by robust process operational design and control.

    PubMed

    Gnoth, S; Jenzsch, M; Simutis, R; Lübbert, A

    2007-10-31

    The Process Analytical Technology (PAT) initiative of the FDA is a reaction to the increasing discrepancy between current possibilities in process supervision and control of pharmaceutical production processes and their current application in industrial manufacturing. With rigid approval practices based on standard operational procedures, adaptation of production reactors to the state of the art was more or less inhibited for many years. Now PAT paves the way for continuous process and product improvements through improved process supervision based on knowledge-based data analysis, "Quality-by-Design" concepts, and, finally, through feedback control. Examples of up-to-date implementations of this concept are presented. They are taken from one key group of processes in recombinant pharmaceutical protein manufacturing, the cultivation of genetically modified Escherichia coli bacteria.

  15. Gas processing handbook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1982-04-01

    Brief details are given of processes including: BGC-Lurgi slagging gasification, COGAS, Exxon catalytic coal gasification, FW-Stoic 2-stage, GI two stage, HYGAS, Koppers-Totzek, Lurgi pressure gasification, Saarberg-Otto, Shell, Texaco, U-Gas, W-D.IGI, Wellman-Galusha, Westinghouse, and Winkler coal gasification processes; the Rectisol process; the Catacarb and the Benfield processes for removing CO2, H2S and COS from gases produced by the partial oxidation of coal; the Selectamine DD, Selexol solvent, and Sulfinol gas cleaning processes; the sulphur-tolerant shift (SSK) process; and the Super-meth process for the production of high-Btu gas from synthesis gas.

  16. Development of functionally-oriented technological processes of electroerosive processing

    NASA Astrophysics Data System (ADS)

    Syanov, S. Yu

    2018-03-01

    The stages in developing functionally oriented technological processes of electroerosive processing are described, from the separation of the surfaces of parts and their service functions to the determination of the parameters of the electric erosion process that will provide not only the quality parameters of the surface layer but also the required operational properties.

  17. Emotional Processing, Interaction Process, and Outcome in Clarification-Oriented Psychotherapy for Personality Disorders: A Process-Outcome Analysis.

    PubMed

    Kramer, Ueli; Pascual-Leone, Antonio; Rohde, Kristina B; Sachse, Rainer

    2016-06-01

    It is important to understand the change processes involved in psychotherapies for patients with personality disorders (PDs). One patient process that promises to be useful in relation to the outcome of psychotherapy is emotional processing. In the present process-outcome analysis, we examine this question by using a sequential model of emotional processing and by additionally taking into account a therapist's appropriate responsiveness to a patient's presentation in clarification-oriented psychotherapy (COP), a humanistic-experiential form of therapy. The present study involved 39 patients with a range of PDs undergoing COP. Session 25 was assessed as part of the working phase of each therapy by external raters in terms of emotional processing using the Classification of Affective-Meaning States (CAMS) and in terms of the overall quality of therapist-patient interaction using the Process-Content-Relationship Scale (BIBS). Treatment outcome was assessed pre- and post-therapy using the Global Severity Index (GSI) of the SCL-90-R and the BDI. Results indicate that the good outcome cases showed more self-compassion, more rejecting anger, and a higher quality of therapist-patient interaction compared to poorer outcome cases. For good outcome cases, emotional processing predicted 18% of symptom change at the end of treatment, which was not found for poor outcome cases. These results are discussed within the framework of an integrative understanding of emotional processing as an underlying mechanism of change in COP, and perhaps in other effective therapy approaches for PDs.

  18. Process Materialization Using Templates and Rules to Design Flexible Process Models

    NASA Astrophysics Data System (ADS)

    Kumar, Akhil; Yao, Wen

    The main idea in this paper is to show how flexible processes can be designed by combining generic process templates and business rules. We instantiate a process by applying rules to specific case data, and running a materialization algorithm. The customized process instance is then executed in an existing workflow engine. We present an architecture and also give an algorithm for process materialization. The rules are written in a logic-based language like Prolog. Our focus is on capturing deeper process knowledge and achieving a holistic approach to robust process design that encompasses control flow, resources and data, as well as makes it easier to accommodate changes to business policy.
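    The materialization idea described above, instantiating a generic template by applying rules to case data, can be illustrated with a minimal sketch. The template, rule set, and case fields below are hypothetical, and plain Python predicates stand in for the Prolog-style rules the authors use:

    ```python
    def materialize(template, rules, case):
        """Materialize a process instance from a generic template:
        keep each step whose guard rule (if any) holds for the case data.
        Steps without a rule are unconditional."""
        return [step for step in template
                if rules.get(step, lambda c: True)(case)]

    # Hypothetical order-handling template; each rule plays the role of a
    # Prolog clause deciding whether its step belongs in this instance.
    template = ["receive_order", "credit_check", "ship", "invoice"]
    rules = {"credit_check": lambda c: c["amount"] > 1000}

    small = materialize(template, rules, {"amount": 200})   # skips credit_check
    large = materialize(template, rules, {"amount": 5000})  # keeps all steps
    ```

    The customized instance (`small` or `large`) would then be handed to a workflow engine for execution, as the paper's architecture describes.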

  19. Depth-of-processing effects on priming in stem completion: tests of the voluntary-contamination, conceptual-processing, and lexical-processing hypotheses.

    PubMed

    Richardson-Klavehn, A; Gardiner, J M

    1998-05-01

    Depth-of-processing effects on incidental perceptual memory tests could reflect (a) contamination by voluntary retrieval, (b) sensitivity of involuntary retrieval to prior conceptual processing, or (c) a deficit in lexical processing during graphemic study tasks that affects involuntary retrieval. The authors devised an extension of incidental test methodology--making conjunctive predictions about response times as well as response proportions--to discriminate among these alternatives. They used graphemic, phonemic, and semantic study tasks, and a word-stem completion test with incidental, intentional, and inclusion instructions. Semantic study processing was superior to phonemic study processing in the intentional and inclusion tests, but semantic and phonemic study processing produced equal priming in the incidental test, showing that priming was uncontaminated by voluntary retrieval--a conclusion reinforced by the response-time data--and that priming was insensitive to prior conceptual processing. The incidental test nevertheless showed a priming deficit following graphemic study processing, supporting the lexical-processing hypothesis. Adding a lexical decision to the 3 study tasks eliminated the priming deficit following graphemic study processing, but did not influence priming following phonemic and semantic processing. The results provide the first clear evidence that depth-of-processing effects on perceptual priming can reflect lexical processes, rather than voluntary contamination or conceptual processes.

  20. Assessment of Process Capability: the case of Soft Drinks Processing Unit

    NASA Astrophysics Data System (ADS)

    Sri Yogi, Kottala

    2018-03-01

    Process capability studies have a significant impact in investigating process variation, which is important in achieving product quality characteristics. Capability indices measure the inherent variability of a process and thus help improve process performance radically. The main objective of this paper is to understand whether the process stays within specification at a soft drinks processing unit producing premier brands marketed in India. A few selected critical parameters in soft drinks processing were considered for this study: gas volume concentration, brix concentration, and crock torque. Relevant statistical parameters were assessed from a process capability perspective: short-term and long-term capability. The assessment used real-time data from a soft drinks bottling company located in the state of Chhattisgarh, India. The analysis identified reasons for variation in the process, validated them using ANOVA, and predicted waste in monetary terms via a Taguchi cost function, which the organization can use to improve process parameters. This work substantially benefitted the organization in understanding the variation of the selected critical parameters toward achieving zero rejection.
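    The short-term capability indices mentioned above follow standard definitions: Cp = (USL − LSL)/6σ and Cpk = min(USL − mean, mean − LSL)/3σ. A minimal sketch, with hypothetical measurement data and specification limits (the paper's actual data are not reproduced here):

    ```python
    import statistics

    def process_capability(samples, lsl, usl):
        """Estimate short-term capability indices.

        Cp  = (USL - LSL) / (6 * sigma)               # potential capability
        Cpk = min(USL - mean, mean - LSL) / (3 * sigma)  # actual capability
        """
        mean = statistics.fmean(samples)
        sigma = statistics.stdev(samples)  # sample standard deviation
        cp = (usl - lsl) / (6 * sigma)
        cpk = min(usl - mean, mean - lsl) / (3 * sigma)
        return cp, cpk

    # Hypothetical gas-volume readings against assumed limits 3.4-4.0
    data = [3.68, 3.71, 3.70, 3.73, 3.69, 3.72, 3.70, 3.71]
    cp, cpk = process_capability(data, lsl=3.4, usl=4.0)
    ```

    Cpk never exceeds Cp; a gap between the two indicates the process mean is off-center within the specification band.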

  1. Integrated stationary Ornstein-Uhlenbeck process, and double integral processes

    NASA Astrophysics Data System (ADS)

    Abundo, Mario; Pirozzi, Enrica

    2018-03-01

    We find a representation of the integral of the stationary Ornstein-Uhlenbeck (ISOU) process in terms of Brownian motion Bt; moreover, we show that, under certain conditions on the functions f and g, the double integral process (DIP) D(t) = ∫βt g(s) (∫αs f(u) dBu) ds can be thought of as the integral of a suitable Gauss-Markov process. Some theoretical and application details are given; among them, we provide a simulation formula based on that representation by which sample paths, probability densities and first-passage times of the ISOU process are obtained; the first-passage times of the DIP are also studied.
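    A direct way to obtain sample paths of an OU process and its time integral, not the paper's representation-based formula but a plain Euler-Maruyama discretization, can be sketched as follows; the parameter values are illustrative assumptions:

    ```python
    import math
    import random

    def simulate_isou(theta, mu, sigma, x0, dt, n, seed=0):
        """Euler-Maruyama path of an Ornstein-Uhlenbeck process
        dX = theta*(mu - X) dt + sigma dB, together with its time
        integral Y(t) = int_0^t X(s) ds (the ISOU process)."""
        rng = random.Random(seed)
        x, y = x0, 0.0
        xs, ys = [x], [y]
        for _ in range(n):
            x += theta * (mu - x) * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
            y += x * dt  # rectangle rule; adequate for a sketch
            xs.append(x)
            ys.append(y)
        return xs, ys

    xs, ys = simulate_isou(theta=1.0, mu=0.0, sigma=0.5, x0=1.0, dt=0.01, n=1000)
    ```

    First-passage times can then be estimated empirically by recording the first index at which a path crosses a given boundary.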

  2. Neurological Evidence Linguistic Processes Precede Perceptual Simulation in Conceptual Processing

    PubMed Central

    Louwerse, Max; Hutchinson, Sterling

    2012-01-01

    There is increasing evidence from response time experiments that language statistics and perceptual simulations both play a role in conceptual processing. In an EEG experiment we compared neural activity in cortical regions commonly associated with linguistic processing and visual perceptual processing to determine to what extent symbolic and embodied accounts of cognition applied. Participants were asked to determine the semantic relationship of word pairs (e.g., sky – ground) or to determine their iconic relationship (i.e., if the presentation of the pair matched their expected physical relationship). A linguistic bias was found toward the semantic judgment task and a perceptual bias was found toward the iconicity judgment task. More importantly, conceptual processing involved activation in brain regions associated with both linguistic and perceptual processes. When comparing the relative activation of linguistic cortical regions with perceptual cortical regions, the effect sizes for linguistic cortical regions were larger than those for the perceptual cortical regions early in a trial with the reverse being true later in a trial. These results map upon findings from other experimental literature and provide further evidence that processing of concept words relies both on language statistics and on perceptual simulations, whereby linguistic processes precede perceptual simulation processes. PMID:23133427

  3. Neurological evidence linguistic processes precede perceptual simulation in conceptual processing.

    PubMed

    Louwerse, Max; Hutchinson, Sterling

    2012-01-01

    There is increasing evidence from response time experiments that language statistics and perceptual simulations both play a role in conceptual processing. In an EEG experiment we compared neural activity in cortical regions commonly associated with linguistic processing and visual perceptual processing to determine to what extent symbolic and embodied accounts of cognition applied. Participants were asked to determine the semantic relationship of word pairs (e.g., sky - ground) or to determine their iconic relationship (i.e., if the presentation of the pair matched their expected physical relationship). A linguistic bias was found toward the semantic judgment task and a perceptual bias was found toward the iconicity judgment task. More importantly, conceptual processing involved activation in brain regions associated with both linguistic and perceptual processes. When comparing the relative activation of linguistic cortical regions with perceptual cortical regions, the effect sizes for linguistic cortical regions were larger than those for the perceptual cortical regions early in a trial with the reverse being true later in a trial. These results map upon findings from other experimental literature and provide further evidence that processing of concept words relies both on language statistics and on perceptual simulations, whereby linguistic processes precede perceptual simulation processes.

  4. Marshaling and Acquiring Resources for the Process Improvement Process

    DTIC Science & Technology

    1993-06-01

    stakeholders. (Geber, 1990) D. IDENTIFYING SUPPLIERS Suppliers are just as crucial to setting requirements for processes as are customers. Although...output (Geber, 1990, p. 32). Before gathering resources for process improvement, the functional manager must ensure that the relationship of internal...him patent information and clerical people process his applications. (Geber, 1990, pp. 29-34) To get the full benefit of a white-collar worker as a

  5. Extensible packet processing architecture

    DOEpatents

    Robertson, Perry J.; Hamlet, Jason R.; Pierson, Lyndon G.; Olsberg, Ronald R.; Chun, Guy D.

    2013-08-20

    A technique for distributed packet processing includes sequentially passing packets associated with packet flows between a plurality of processing engines along a flow-through data bus linking the plurality of processing engines in series. At least one packet within a given packet flow is marked by a given processing engine to signify to the other processing engines that the given processing engine has claimed the given packet flow for processing. A processing function is applied to each of the packet flows within the processing engines and the processed packets are output on a time-shared, arbitrated data bus coupled to the plurality of processing engines.

  6. Shuttle Processing

    NASA Technical Reports Server (NTRS)

    Guodace, Kimberly A.

    2010-01-01

    This slide presentation details the shuttle processing flow, which starts with wheel stop and ends with launch. After landing, the orbiter is rolled into the Orbiter Processing Facility (OPF), where processing is performed; it is then rolled over to the Vehicle Assembly Building (VAB), where it is mated with the propellant tanks and payloads are installed. A different flow is detailed if the weather at Kennedy Space Center requires a landing at Dryden.

  7. Adaptive-optics optical coherence tomography processing using a graphics processing unit.

    PubMed

    Shafer, Brandon A; Kriske, Jeffery E; Kocaoglu, Omer P; Turner, Timothy L; Liu, Zhuolin; Lee, John Jaehwan; Miller, Donald T

    2014-01-01

    Graphics processing units are increasingly being used for scientific computing for their powerful parallel processing abilities and moderate price compared to supercomputers and computing grids. In this paper we have used a general purpose graphics processing unit to process adaptive-optics optical coherence tomography (AOOCT) images in real time. Increasing the processing speed of AOOCT is an essential step in moving the super high resolution technology closer to clinical viability.

  8. Processing approaches to cognition: the impetus from the levels-of-processing framework.

    PubMed

    Roediger, Henry L; Gallo, David A; Geraci, Lisa

    2002-01-01

    Processing approaches to cognition have a long history, from act psychology to the present, but perhaps their greatest boost was given by the success and dominance of the levels-of-processing framework. We review the history of processing approaches, and explore the influence of the levels-of-processing approach, the procedural approach advocated by Paul Kolers, and the transfer-appropriate processing framework. Processing approaches emphasise the procedures of mind and the idea that memory storage can be usefully conceptualised as residing in the same neural units that originally processed information at the time of encoding. Processing approaches emphasise the unity and interrelatedness of cognitive processes and maintain that they can be dissected into separate faculties only by neglecting the richness of mental life. We end by pointing to future directions for processing approaches.

  9. Nonthermal processing technologies as food safety intervention processes

    USDA-ARS?s Scientific Manuscript database

    Foods should provide sensorial satisfaction and nutrition to people. Yet, foodborne pathogens cause significant illness and loss of life to humankind every year. A processing intervention step may be necessary prior to consumption to ensure the safety of foods. Nonthermal processing technologi...

  10. Batch Statistical Process Monitoring Approach to a Cocrystallization Process.

    PubMed

    Sarraguça, Mafalda C; Ribeiro, Paulo R S; Dos Santos, Adenilson O; Lopes, João A

    2015-12-01

    Cocrystals are defined as crystalline structures composed of two or more compounds that are solid at room temperature held together by noncovalent bonds. Their main advantages are the increase of solubility, bioavailability, permeability, stability, and at the same time retaining active pharmaceutical ingredient bioactivity. The cocrystallization between furosemide and nicotinamide by solvent evaporation was monitored on-line using near-infrared spectroscopy (NIRS) as a process analytical technology tool. The near-infrared spectra were analyzed using principal component analysis. Batch statistical process monitoring was used to create control charts to perceive the process trajectory and define control limits. Normal and non-normal operating condition batches were performed and monitored with NIRS. The use of NIRS associated with batch statistical process models allowed the detection of abnormal variations in critical process parameters, like the amount of solvent or amount of initial components present in the cocrystallization. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
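    The control-chart idea in the abstract above, learning limits from normal operating condition (NOC) batches and flagging departures, can be illustrated with a simplified univariate analogue; the paper's actual approach builds multivariate charts from PCA scores on NIR spectra. The data below are hypothetical:

    ```python
    import statistics

    def control_limits(noc_values):
        """Shewhart-style limits (mean +/- 3 sigma) estimated from
        a summary statistic of normal-operating-condition batches."""
        mean = statistics.fmean(noc_values)
        sigma = statistics.stdev(noc_values)
        return mean - 3 * sigma, mean + 3 * sigma

    def flag_batches(new_values, lcl, ucl):
        """Return indices of batches whose statistic leaves the limits."""
        return [i for i, v in enumerate(new_values) if not (lcl <= v <= ucl)]

    noc = [10.1, 9.9, 10.0, 10.2, 9.8, 10.05, 9.95, 10.1]  # reference batches
    lcl, ucl = control_limits(noc)
    faults = flag_batches([10.0, 12.5, 9.9], lcl, ucl)  # 12.5 is a planted fault
    ```

    In the multivariate setting, the monitored statistic would instead be something like Hotelling's T² of the batch trajectory in PCA score space, but the flagging logic is the same.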

  11. Process of discharging charge-build up in slag steelmaking processes

    DOEpatents

    Pal, Uday B.; Gazula, Gopala K. M.; Hasham, Ali

    1994-01-01

    A process and apparatus for improving metal production in ironmaking and steelmaking processes is disclosed. The use of an inert metallic conductor in the slag-containing crucible and the addition of a transition metal oxide to the slag are the disclosed process improvements.

  12. The process audit.

    PubMed

    Hammer, Michael

    2007-04-01

    Few executives question the idea that by redesigning business processes--work that runs from end to end across an enterprise--they can achieve extraordinary improvements in cost, quality, speed, profitability, and other key areas. Yet in spite of their intentions and investments, many executives flounder, unsure about what exactly needs to be changed, by how much, and when. As a result, many organizations make little progress--if any at all--in their attempts to transform business processes. Michael Hammer has spent the past five years working with a group of leading companies to develop the Process and Enterprise Maturity Model (PEMM), a new framework that helps executives comprehend, formulate, and assess process-based transformation efforts. He has identified two distinct groups of characteristics that are needed for business processes to perform exceptionally well over a long period of time. Process enablers, which affect individual processes, determine how well a process is able to function. They are mutually interdependent--if any are missing, the others will be ineffective. However, enablers are not enough to develop high-performance processes; they only provide the potential to deliver high performance. A company must also possess or establish organizational capabilities that allow the business to offer a supportive environment. Together, the enablers and the capabilities provide an effective way for companies to plan and evaluate process-based transformations. PEMM is different from other frameworks, such as Capability Maturity Model Integration (CMMI), because it applies to all industries and all processes. The author describes how several companies--including Michelin, CSAA, Tetra Pak, Shell, Clorox, and Schneider National--have successfully used PEMM in various ways and at different stages to evaluate the progress of their process-based transformation efforts.

  13. Internal process: what is abstraction and distortion process?

    NASA Astrophysics Data System (ADS)

    Fiantika, F. R.; Budayasa, I. K.; Lukito, A.

    2018-03-01

    Geometry is one of the branches of mathematics that plays a major role in the development of science and technology. Thus, knowing geometry concepts is needed for students from their early basic level of thinking. A preliminary study showed that elementary students have difficulty perceiving the parallelogram shape in a 2-dimensional drawing of a cube as a square face. This difficulty means the students cannot solve geometrical problems correctly. The problem is related to internal thinking processes in geometry. We explored students' internal thinking processes in geometry, particularly in distinguishing the square and parallelogram shapes. How students process their internal thinking through distortion and abstraction is the main aim of this study. Analysis of a geometrical test and in-depth interviews are used to obtain the data. The result of this study is that there are two types each of distortion and abstraction that the students used in their internal thinking processes.

  14. Transforming Collaborative Process Models into Interface Process Models by Applying an MDA Approach

    NASA Astrophysics Data System (ADS)

    Lazarte, Ivanna M.; Chiotti, Omar; Villarreal, Pablo D.

    Collaborative business models among enterprises require defining collaborative business processes. Enterprises implement B2B collaborations to execute these processes. In B2B collaborations the integration and interoperability of processes and systems of the enterprises are required to support the execution of collaborative processes. From a collaborative process model, which describes the global view of the enterprise interactions, each enterprise must define the interface process that represents the role it performs in the collaborative process in order to implement the process in a Business Process Management System. Hence, in this work we propose a method for the automatic generation of the interface process model of each enterprise from a collaborative process model. This method is based on a Model-Driven Architecture to transform collaborative process models into interface process models. By applying this method, interface processes are guaranteed to be interoperable and defined according to a collaborative process.

  15. Process Monitoring Evaluation and Implementation for the Wood Abrasive Machining Process

    PubMed Central

    Saloni, Daniel E.; Lemaster, Richard L.; Jackson, Steven D.

    2010-01-01

    Wood processing industries have continuously developed and improved technologies and processes to transform wood to obtain better final product quality and thus increase profits. Abrasive machining is one of the most important of these processes and therefore merits special attention and study. The objective of this work was to evaluate and demonstrate a process monitoring system for use in the abrasive machining of wood and wood based products. The system developed increases the life of the belt by detecting (using process monitoring sensors) and removing (by cleaning) the abrasive loading during the machining process. This study focused on abrasive belt machining processes and included substantial background work, which provided a solid base for understanding the behavior of the abrasive, and the different ways that the abrasive machining process can be monitored. In addition, the background research showed that abrasive belts can effectively be cleaned by the appropriate cleaning technique. The process monitoring system developed included acoustic emission sensors which tended to be sensitive to belt wear, as well as platen vibration, but not loading, and optical sensors which were sensitive to abrasive loading. PMID:22163477

  16. 21 CFR 1271.220 - Processing and process controls.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Processing and process controls. 1271.220 Section 1271.220 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) REGULATIONS UNDER CERTAIN OTHER ACTS ADMINISTERED BY THE FOOD AND DRUG ADMINISTRATION HUMAN CELLS...

  17. Pre- and Post-Processing Tools to Streamline the CFD Process

    NASA Technical Reports Server (NTRS)

    Dorney, Suzanne Miller

    2002-01-01

    This viewgraph presentation provides information on software development tools to facilitate the use of CFD (Computational Fluid Dynamics) codes. The specific CFD codes FDNS and CORSAIR are profiled, and uses for software development tools with these codes during pre-processing, interim-processing, and post-processing are explained.

  18. Processing Depth, Elaboration of Encoding, Memory Stores, and Expended Processing Capacity.

    ERIC Educational Resources Information Center

    Eysenck, Michael W.; Eysenck, M. Christine

    1979-01-01

    The effects of several factors on expended processing capacity were measured. Expended processing capacity was greater when information was retrieved from secondary memory than from primary memory, when processing was of a deep, semantic nature than when it was shallow and physical, and when processing was more elaborate. (Author/GDC)

  19. 2D segmentation of intervertebral discs and its degree of degeneration from T2-weighted magnetic resonance images

    NASA Astrophysics Data System (ADS)

    Castro-Mateos, Isaac; Pozo, José Maria; Lazary, Aron; Frangi, Alejandro F.

    2014-03-01

    Low back pain (LBP) is a disorder suffered by a large population around the world. A key factor causing this illness is Intervertebral Disc (IVD) degeneration, whose early diagnosis could help in preventing this widespread condition. Clinicians base their diagnosis on visual inspection of 2D slices of Magnetic Resonance (MR) images, which is subject to large interobserver variability. In this work, an automatic classification method is presented, which provides the Pfirrmann degree of degeneration from a mid-sagittal MR slice. The proposed method utilizes Active Contour Models, with a new geometrical energy, to achieve an initial segmentation, which is further improved using fuzzy C-means. Then, IVDs are classified according to their degree of degeneration. This classification is attained by employing Adaboost on five specific features: the mean and the variance of the probability map of the nucleus using two different approaches and the eccentricity of the fitting ellipse to the contour of the IVD. The classification method was evaluated using a cohort of 150 intervertebral discs assessed by three experts, resulting in a mean specificity (93%) and sensitivity (83%) similar to the one provided by every expert with respect to the most voted value. The segmentation accuracy was evaluated using the Dice Similarity Index (DSI) and Root Mean Square Error (RMSE) of the point-to-contour distance. The mean DSI ± 2 standard deviations was 91.7% ± 5.6%, the mean RMSE was 0.82 mm and the 95th percentile was 1.36 mm. These results were found accurate when compared to the state-of-the-art.

  20. Turbine blade processing

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Space processing of directionally solidified eutectic-alloy type turbine blades is envisioned as a simple remelt operation in which precast blades are remelted in a preformed mold. Process systems based on induction melting, continuous resistance furnaces, and batch resistance furnaces were evaluated. The batch resistance furnace type process using a multiblade mold is considered to offer the best possibility for turbine blade processing.

  1. Euglena Transcript Processing.

    PubMed

    McWatters, David C; Russell, Anthony G

    2017-01-01

    RNA transcript processing is an important stage in the gene expression pathway of all organisms and is subject to various mechanisms of control that influence the final levels of gene products. RNA processing involves events such as nuclease-mediated cleavage, removal of intervening sequences referred to as introns and modifications to RNA structure (nucleoside modification and editing). In Euglena, RNA transcript processing was initially examined in chloroplasts because of historical interest in the secondary endosymbiotic origin of this organelle in this organism. More recent efforts to examine mitochondrial genome structure and RNA maturation have been stimulated by the discovery of unusual processing pathways in other Euglenozoans such as kinetoplastids and diplonemids. Eukaryotes containing large genomes are now known to typically contain large collections of introns and regulatory RNAs involved in RNA processing events, and Euglena gracilis in particular has a relatively large genome for a protist. Studies examining the structure of nuclear genes and the mechanisms involved in nuclear RNA processing have revealed that indeed Euglena contains large numbers of introns in the limited set of genes so far examined and also possesses large numbers of specific classes of regulatory and processing RNAs, such as small nucleolar RNAs (snoRNAs). Most interestingly, these studies have also revealed that Euglena possesses novel processing pathways generating highly fragmented cytosolic ribosomal RNAs and subunits and non-conventional intron classes removed by unknown splicing mechanisms. This unexpected diversity in RNA processing pathways emphasizes the importance of identifying the components involved in these processing mechanisms and their evolutionary emergence in Euglena species.

  2. Post-processing of metal matrix composites by friction stir processing

    NASA Astrophysics Data System (ADS)

    Sharma, Vipin; Singla, Yogesh; Gupta, Yashpal; Raghuwanshi, Jitendra

    2018-05-01

    In metal matrix composites, non-uniform distribution of reinforcement particles adversely affects the mechanical properties. It is therefore of great interest to explore post-processing techniques that can eliminate this heterogeneity in particle distribution. Friction stir processing is a relatively new technique used for post-processing of metal matrix composites to improve the homogeneity of particle distribution. In friction stir processing, the synergistic effect of stirring, extrusion, and forging results in grain refinement, reduction of reinforcement particle size, uniformity of particle distribution, reduced microstructural heterogeneity, and elimination of defects.

  3. The roles of a process development group in biopharmaceutical process startup.

    PubMed

    Goochee, Charles F

    2002-01-01

    The transfer of processes for biotherapeutic products into final manufacturing facilities was frequently problematic during the 1980s and early 1990s, resulting in costly delays to licensure (Pisano 1997). While plant startups for this class of products can become chaotic affairs, this is not an inherent or intrinsic feature. Major classes of process startup problems have been identified and mechanisms have been developed to reduce their likelihood of occurrence. These classes of process startup problems and resolution mechanisms are the major topic of this article. With proper planning and sufficient staffing, the probability of a smooth process startup for a biopharmaceutical product can be very high, i.e., successful process performance will often be achieved within the first two full-scale process lots in the plant. The primary focus of this article is the role of the Process Development Group in helping to assure this high probability of success.

  4. Process-based costing.

    PubMed

    Lee, Robert H; Bott, Marjorie J; Forbes, Sarah; Redford, Linda; Swagerty, Daniel L; Taunton, Roma Lee

    2003-01-01

    Understanding how quality improvement affects costs is important. Unfortunately, low-cost, reliable ways of measuring direct costs are scarce. This article builds on the principles of process improvement to develop a costing strategy that meets both criteria. Process-based costing has 4 steps: developing a flowchart, estimating resource use, valuing resources, and calculating direct costs. To illustrate the technique, this article uses it to cost the care planning process in 3 long-term care facilities. We conclude that process-based costing is easy to implement; generates reliable, valid data; and allows nursing managers to assess the costs of new or modified processes.
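The four costing steps can be sketched in code. A hypothetical example (the step names, times, and rates below are invented for illustration, not taken from the study's long-term care data):

```python
# Hypothetical care-planning process: each flowchart step gets an estimated
# resource use (minutes) and a resource value (cost per minute). Step names,
# times, and rates are invented for illustration.
steps = [
    # (step, minutes per episode, staff cost in $/minute)
    ("Assess resident",        30, 0.75),
    ("Draft care plan",        45, 0.75),
    ("Team review meeting",    20, 1.10),
    ("Document and file plan", 15, 0.40),
]

def direct_cost(process_steps):
    """Step 4: direct cost = sum over steps of (resource use x resource value)."""
    return sum(minutes * rate for _, minutes, rate in process_steps)

print(f"${direct_cost(steps):.2f}")
```

Steps 1 through 3 of the strategy (flowcharting, estimating resource use, valuing resources) produce the table; the final calculation is the one-line sum.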

  5. Biomass process handbook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1983-01-01

    Descriptions are given of 42 processes which use biomass to produce chemical products. Marketing and economic background, process description, flow sheets, costs, major equipment, and availability of technology are given for each of the 42 processes. Some of the chemicals discussed are: ethanol, ethylene, acetaldehyde, butanol, butadiene, acetone, citric acid, gluconates, itaconic acid, lactic acid, xanthan gum, sorbitol, starch polymers, fatty acids, fatty alcohols, glycerol, soap, azelaic acid, pelargonic acid, nylon-11, jojoba oil, furfural, furfuryl alcohol, tetrahydrofuran, cellulose polymers, products from pulping wastes, and methane. Processes include acid hydrolysis, enzymatic hydrolysis, fermentation, distillation, the Purox process, and anaerobic digestion.

  6. Laser Processing of Multilayered Thermal Spray Coatings: Optimal Processing Parameters

    NASA Astrophysics Data System (ADS)

    Tewolde, Mahder; Zhang, Tao; Lee, Hwasoo; Sampath, Sanjay; Hwang, David; Longtin, Jon

    2017-12-01

    Laser processing offers an innovative approach for the fabrication and transformation of a wide range of materials. As a rapid, non-contact, and precision material removal technology, lasers are natural tools to process thermal spray coatings. Recently, a thermoelectric generator (TEG) was fabricated using thermal spray and laser processing. The TEG device represents a multilayer, multimaterial functional thermal spray structure, with laser processing serving an essential role in its fabrication. Several unique challenges are presented when processing such multilayer coatings, and the focus of this work is on the selection of laser processing parameters for optimal feature quality and device performance. A parametric study is carried out using three short-pulse lasers, where laser power, repetition rate and processing speed are varied to determine the laser parameters that result in high-quality features. The resulting laser patterns are characterized using optical and scanning electron microscopy, energy-dispersive x-ray spectroscopy, and electrical isolation tests between patterned regions. The underlying laser interaction and material removal mechanisms that affect the feature quality are discussed. Feature quality was found to improve both by using a multiscanning approach and an optional assist gas of air or nitrogen. Electrically isolated regions were also patterned in a cylindrical test specimen.

  7. Application of statistical process control and process capability analysis procedures in orbiter processing activities at the Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.

    1994-01-01

    Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.

  8. Effect of processing parameters on FDM process

    NASA Astrophysics Data System (ADS)

    Chari, V. Srinivasa; Venkatesh, P. R.; Krupashankar, Dinesh, Veena

    2018-04-01

    This paper focuses on the process parameters of fused deposition modeling (FDM). Infill, resolution, and temperature are the process variables considered for the experimental studies; compression strength, hardness, and microstructure are the outcome parameters. The experimental study is based on Taguchi's L9 orthogonal array, which is used to build nine different models and to obtain the effective output results for the parameters under consideration. The material used for this experimental study is polylactic acid (PLA).
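A Taguchi L9 design assigns three factors at three levels to nine runs so that every pair of factor levels appears equally often. A sketch of how the nine builds could be enumerated (the factor levels are illustrative placeholders, not the values used in the study):

```python
# Standard Taguchi L9(3^4) orthogonal array, first three columns, one row per
# FDM build. The factor levels below are illustrative placeholders, not the
# values used in the study.
L9 = [
    (1, 1, 1), (1, 2, 2), (1, 3, 3),
    (2, 1, 2), (2, 2, 3), (2, 3, 1),
    (3, 1, 3), (3, 2, 1), (3, 3, 2),
]

levels = {
    "infill_pct":    {1: 20,  2: 50,  3: 80},
    "resolution_mm": {1: 0.1, 2: 0.2, 3: 0.3},
    "temp_C":        {1: 190, 2: 205, 3: 220},
}

runs = [
    {"infill_pct": levels["infill_pct"][a],
     "resolution_mm": levels["resolution_mm"][b],
     "temp_C": levels["temp_C"][c]}
    for a, b, c in L9
]

for run in runs:
    print(run)
```

Nine runs cover three factors at three levels instead of the 27 required by a full factorial, which is why the design is attractive for physical builds.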

  9. Westinghouse modular grinding process - improvement for follow on processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fehrmann, Henning

    2013-07-01

    In nuclear power plants (NPPs), ion exchange (IX) resins are used in several systems for water treatment. The resins can be in bead or powdered form. For waste treatment of spent IX resins, two methods are generally used: direct immobilization (e.g. with cement, bitumen, polymer, or a High Integrity Container (HIC)) and thermal treatment (e.g. drying, oxidation, or pyrolysis). Bead resins have some properties (e.g. particle size and density) that can have negative impacts on subsequent waste treatment processes, such as flotation of bead resins in the cementation process, sedimentation in pipelines during transportation, and poor compaction properties for Hot Resin Supercompaction (HRSC). Reducing the particle size of the bead resins can have beneficial effects, enhancing further treatment processes and overcoming the aforementioned effects. Westinghouse Electric Company has developed a modular grinding process to crush/grind the bead resins. This modular process is designed for flexible use and enables a selective adjustment of particle size to tailor the grinding system to customer needs. The system can be equipped with a crusher integrated in the process tank and, if necessary, a colloid mill. The crusher reduces the bead resin particle size and converts the bead resins to a pumpable suspension with lower sedimentation properties. With the colloid mill, the resins can be ground to a powder. Compared to existing grinding systems, this equipment is designed to minimize radiation exposure of the worker during operation and maintenance. Using the crushed and/or ground bead resins has several beneficial effects, such as facilitating the cementation process and recipe development, enhancing the oxidation of resins, and improving the Hot Resin Supercompaction volume reduction performance. (authors)

  10. Materials processing in space, 1980 science planning document. [crystal growth, containerless processing, solidification, bioprocessing, and ultrahigh vacuum processes

    NASA Technical Reports Server (NTRS)

    Naumann, R. J.

    1980-01-01

    The scientific aspects of the Materials Processing in Space program are described with emphasis on the major categories of interest: (1) crystal growth; (2) solidification of metals, alloys, and composites; (3) fluids and chemical processes; (4) containerless processing, glasses, and refractories; (5) ultrahigh vacuum processes; and (6) bioprocessing. An index is provided for each of these areas. The possible contributions that materials science experiments in space can make to the various disciplines are summarized, and the necessity for performing experiments in space is justified. What has been learned from previous experiments relating to space processing, current investigations, and remaining issues that require resolution are discussed. Recommendations for the future direction of the program are included.

  11. Process simulation during the design process makes the difference: process simulations applied to a traditional design.

    PubMed

    Traversari, Roberto; Goedhart, Rien; Schraagen, Jan Maarten

    2013-01-01

    The objective is the evaluation of a traditionally designed operating room using simulation of various surgical workflows. A literature search showed that there is no evidence for an optimal operating room layout regarding the position and size of an ultraclean ventilation (UCV) canopy with a separate preparation room for laying out instruments and in which patients are induced in the operating room itself. Nor was literature found reporting on process simulation being used for this application. Many technical guidelines and designs have mainly evolved over time, and there is no evidence on whether the proposed measures are also effective for the optimization of the layout for workflows. The study was conducted by applying observational techniques to simulated typical surgical procedures. Process simulations, which included complete surgical teams and the equipment required for the intervention, were carried out for four typical interventions. Four observers used a form to record conflicts with the clean area boundaries and the height of the supply bridge. Preferences for particular layouts were discussed with the surgical team after each simulated procedure. We established that a clean area measuring 3 × 3 m and a supply bridge height of 2.05 m was satisfactory for most situations, provided a movable operating table is used. The only cases in which conflicts with the supply bridge were observed were during the use of a surgical robot (Da Vinci) and a surgical microscope. During multiple trauma interventions, bottlenecks regarding the dimensions of the clean area will probably arise. The process simulation of four typical interventions led to significantly different operating room layouts than were arrived at through the traditional design process. Keywords: evidence-based design, human factors, work environment, operating room, traditional design, process simulation, surgical workflows.

  12. The standards process: X3 information processing systems

    NASA Technical Reports Server (NTRS)

    Emard, Jean-Paul

    1993-01-01

    The topics are presented in viewgraph form and include the following: International Organization for Standards (ISO); International Electrotechnical Committee (IEC); ISO/IEC Joint Technical Committee 1 (JTC-1); U.S. interface to JTC-1; ANSI; national organizations; U.S. standards development processes; national and international standards developing organizations; regional organizations; and X3 information processing systems.

  13. Kidney transplantation process in Brazil represented in business process modeling notation.

    PubMed

    Peres Penteado, A; Molina Cohrs, F; Diniz Hummel, A; Erbs, J; Maciel, R F; Feijó Ortolani, C L; de Aguiar Roza, B; Torres Pisa, I

    2015-05-01

    Kidney transplantation is considered to be the best treatment for people with chronic kidney failure, because it improves patients' quality of life and increases their length of survival compared with patients undergoing dialysis. The kidney transplantation process in Brazil is defined through laws, decrees, ordinances, and resolutions, but there is no visual representation of this process. The aim of this study was to analyze official documents to construct a representation of the kidney transplantation process in Brazil with the use of business process modeling notation (BPMN). The methodology for this study was based on an exploratory observational study, document analysis, and construction of process diagrams with the use of BPMN. Two rounds of validation by specialists were conducted. The result is a BPMN representation of the kidney transplantation process in Brazil. We analyzed 2 digital documents, which yielded 2 processes with a total of 45 activities and events, 6 organizations involved, and 6 different stages of the process. The constructed representation makes it easier to understand the business rules of kidney transplantation and can be used by the health care professionals involved in the various activities within this process. Construction of a representation with language appropriate for the Brazilian lay public is underway. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Situation awareness acquired from monitoring process plants - the Process Overview concept and measure.

    PubMed

    Lau, Nathan; Jamieson, Greg A; Skraaning, Gyrd

    2016-07-01

    We introduce Process Overview, a situation awareness characterisation of the knowledge derived from monitoring process plants. Process Overview is based on observational studies of process control work in the literature. The characterisation is applied to develop a query-based measure called the Process Overview Measure. The goal of the measure is to improve coupling between situation and awareness according to process plant properties and operator cognitive work. A companion article presents the empirical evaluation of the Process Overview Measure in a realistic process control setting. The Process Overview Measure demonstrated sensitivity and validity by revealing significant effects of experimental manipulations that corroborated with other empirical results. The measure also demonstrated adequate inter-rater reliability and practicality for measuring SA based on data collected by process experts. Practitioner Summary: The Process Overview Measure is a query-based measure for assessing operator situation awareness from monitoring process plants in representative settings.

  15. Evidence of automatic processing in sequence learning using process-dissociation

    PubMed Central

    Mong, Heather M.; McCabe, David P.; Clegg, Benjamin A.

    2012-01-01

    This paper proposes a way to apply process dissociation to sequence learning, extending the approach used by Destrebecqz and Cleeremans (2001). Participants were trained on two sequences separated from each other by a short break. Following training, participants self-reported their knowledge of the sequences. A recognition test was then performed which required discrimination of the two trained sequences, either under instructions to call any sequence encountered in the experiment “old” (the inclusion condition) or to call only sequence fragments from one half of the experiment “old” (the exclusion condition). The recognition test elicited automatic and controlled process estimates using the process-dissociation procedure, and suggested both processes were involved. Examining the underlying processes supporting performance may provide more information on the fundamental aspects of the implicit and explicit constructs than has been attainable through awareness testing. PMID:22679465
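The automatic and controlled estimates referred to above are commonly derived with Jacoby's process-dissociation equations. A minimal sketch under that assumption (the response rates are invented for illustration; the study's exact computation may differ):

```python
def pdp_estimates(p_inclusion: float, p_exclusion: float):
    """Jacoby's process-dissociation equations (a standard formulation; the
    study's exact computation may differ):
        C = I - E          controlled contribution
        A = E / (1 - C)    automatic contribution (defined when C < 1)
    """
    c = p_inclusion - p_exclusion
    a = p_exclusion / (1.0 - c) if c < 1.0 else float("nan")
    return c, a

# Invented example rates of calling trained fragments "old" under each instruction
c, a = pdp_estimates(p_inclusion=0.80, p_exclusion=0.30)
print(f"controlled={c:.2f}, automatic={a:.2f}")
```

The intuition: calling an excluded fragment "old" despite instructions not to is attributed to automatic influence escaping control, so the gap between inclusion and exclusion rates indexes the controlled process.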

  16. Information Technology Process Improvement Decision-Making: An Exploratory Study from the Perspective of Process Owners and Process Managers

    ERIC Educational Resources Information Center

    Lamp, Sandra A.

    2012-01-01

    There is information available in the literature that discusses information technology (IT) governance and investment decision making from an executive-level perception, yet there is little information available that offers the perspective of process owners and process managers pertaining to their role in IT process improvement and investment…

  17. Economic-Oriented Stochastic Optimization in Advanced Process Control of Chemical Processes

    PubMed Central

    Dobos, László; Király, András; Abonyi, János

    2012-01-01

    Finding the optimal operating region of chemical processes is an inevitable step toward improving economic performance. Usually the optimal operating region is situated close to process constraints related to product quality or process safety requirements. Higher profit can be realized only by assuring a relatively low frequency of violation of these constraints. A multilevel stochastic optimization framework is proposed to determine the optimal setpoint values of control loops with respect to predetermined risk levels, uncertainties, and costs of violation of process constraints. The proposed framework is realized as direct search-type optimization of Monte-Carlo simulation of the controlled process. The concept is illustrated throughout by a well-known benchmark problem related to the control of a linear dynamical system and the model predictive control of a more complex nonlinear polymerization process. PMID:23213298
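The proposed framework, a direct search over setpoint values wrapped around Monte-Carlo simulation of the controlled process, can be illustrated on a toy problem (this is not the paper's benchmark; the profit model, noise level, and risk threshold below are invented):

```python
import random

random.seed(0)

# Toy problem (not the paper's benchmark): profit grows with the setpoint,
# but the realized output x = setpoint + noise must stay below a quality
# constraint X_MAX, violated with probability at most RISK.
X_MAX, RISK, SIGMA, N_MC = 10.0, 0.05, 0.5, 5000

def violation_prob(setpoint):
    """Monte-Carlo estimate of P(setpoint + noise > X_MAX)."""
    hits = sum(1 for _ in range(N_MC)
               if random.gauss(setpoint, SIGMA) > X_MAX)
    return hits / N_MC

def profit(setpoint):
    return setpoint  # higher setpoint, higher profit in this toy model

# Direct search over a grid of candidate setpoints: keep those that meet the
# predetermined risk level, then pick the most profitable one.
candidates = [8.0 + 0.1 * i for i in range(20)]
feasible = [s for s in candidates if violation_prob(s) <= RISK]
best = max(feasible, key=profit)
print(f"optimal setpoint = {best:.1f}")
```

As the abstract notes, the optimum lands close to the constraint: the search pushes the setpoint up until the estimated violation probability hits the allowed risk level.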

  18. Pre-processing and post-processing in group-cluster mergers

    NASA Astrophysics Data System (ADS)

    Vijayaraghavan, R.; Ricker, P. M.

    2013-11-01

    Galaxies in clusters are more likely to be of early type and to have lower star formation rates than galaxies in the field. Recent observations and simulations suggest that cluster galaxies may be `pre-processed' by group or filament environments and that galaxies that fall into a cluster as part of a larger group can stay coherent within the cluster for up to one orbital period (`post-processing'). We investigate these ideas by means of a cosmological N-body simulation and idealized N-body plus hydrodynamics simulations of a group-cluster merger. We find that group environments can contribute significantly to galaxy pre-processing by means of enhanced galaxy-galaxy merger rates, removal of galaxies' hot halo gas by ram pressure stripping and tidal truncation of their galaxies. Tidal distortion of the group during infall does not contribute to pre-processing. Post-processing is also shown to be effective: galaxy-galaxy collisions are enhanced during a group's pericentric passage within a cluster, the merger shock enhances the ram pressure on group and cluster galaxies and an increase in local density during the merger leads to greater galactic tidal truncation.

  19. 43 CFR 2804.19 - How will BLM process my Processing Category 6 application?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    (a) For Processing Category 6 applications, you and BLM must enter into a written agreement that describes how BLM will process your application. The final agreement...

  20. 43 CFR 2804.19 - How will BLM process my Processing Category 6 application?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    (a) For Processing Category 6 applications, you and BLM must enter into a written agreement that describes how BLM will process your application. The final agreement...

  1. Electrotechnologies to process foods

    USDA-ARS?s Scientific Manuscript database

    Electrical energy is used to process foods. In conventional food processing plants, electricity drives mechanical devices and controls the degree of processing. In recent years, several processing technologies have been developed to process foods directly with electricity. Electrotechnologies use...

  2. Optical signal processing

    NASA Technical Reports Server (NTRS)

    Casasent, D.

    1978-01-01

    The article discusses several optical configurations used for signal processing. Electronic-to-optical transducers are outlined, noting fixed window transducers and moving window acousto-optic transducers. Folded spectrum techniques are considered, with reference to wideband RF signal analysis, fetal electroencephalogram analysis, engine vibration analysis, signal buried in noise, and spatial filtering. Various methods for radar signal processing are described, such as phased-array antennas, the optical processing of phased-array data, pulsed Doppler and FM radar systems, a multichannel one-dimensional optical correlator, correlations with long coded waveforms, and Doppler signal processing. Means for noncoherent optical signal processing are noted, including an optical correlator for speech recognition and a noncoherent optical correlator.
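The correlation operations listed above, such as correlations with long coded waveforms, have a direct digital analogue. A sketch of recovering a known coded waveform buried in noise by cross-correlation (synthetic data; the optical systems in the article perform the same operation with light):

```python
import numpy as np

rng = np.random.default_rng(42)

# A known binary code is embedded in a noisy trace at an unknown position and
# recovered by cross-correlation, the operation an optical correlator performs
# with light. All data here are synthetic.
code = rng.choice([-1.0, 1.0], size=64)   # long coded waveform
trace = rng.normal(0.0, 1.0, 1024)        # noisy received signal
true_offset = 400
trace[true_offset:true_offset + code.size] += code

corr = np.correlate(trace, code, mode="valid")
print(int(np.argmax(corr)))
```

The correlation peak stands out because the code's energy adds coherently at the true offset while noise correlations average toward zero, the same principle that lets these systems find a signal buried in noise.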

  3. PROCESS IMPROVEMENT STUDIES ON THE BATTELLE HYDROTHERMAL COAL PROCESS

    EPA Science Inventory

    The report gives results of a study to improve the economic viability of the Battelle Hydrothermal (HT) Coal Process by reducing the costs associated with liquid/solid separation and leachant regeneration. Laboratory experiments were conducted to evaluate process improvements for...

  4. Process yield improvements with process control terminal for varian serial ion implanters

    NASA Astrophysics Data System (ADS)

    Higashi, Harry; Soni, Ameeta; Martinez, Larry; Week, Ken

    Implant processes in a modern wafer production fab are extremely complex. There can be several types of misprocessing, i.e. wrong dose or species, double implants and missed implants. Process Control Terminals (PCT) for Varian 350Ds installed at Intel fabs were found to substantially reduce the number of misprocessing steps. This paper describes those misprocessing steps and their subsequent reduction with use of PCTs. Reliable and simple process control with serial process ion implanters has been in increasing demand. A well designed process control terminal greatly increases device yield by monitoring all pertinent implanter functions and enabling process engineering personnel to set up process recipes for simple and accurate system operation. By programming user-selectable interlocks, implant errors are reduced and those that occur are logged for further analysis and prevention. A process control terminal should also be compatible with office personal computers for greater flexibility in system use and data analysis. The impact from the capability of a process control terminal is increased productivity, ergo higher device yield.

  5. Accelerated design of bioconversion processes using automated microscale processing techniques.

    PubMed

    Lye, Gary J; Ayazi-Shamlou, Parviz; Baganz, Frank; Dalby, Paul A; Woodley, John M

    2003-01-01

    Microscale processing techniques are rapidly emerging as a means to increase the speed of bioprocess design and reduce material requirements. Automation of these techniques can reduce labour intensity and enable a wider range of process variables to be examined. This article examines recent research on various individual microscale unit operations including microbial fermentation, bioconversion and product recovery techniques. It also explores the potential of automated whole process sequences operated in microwell formats. The power of the whole process approach is illustrated by reference to a particular bioconversion, namely the Baeyer-Villiger oxidation of bicyclo[3.2.0]hept-2-en-6-one for the production of optically pure lactones.

  6. An Integrated Model of Emotion Processes and Cognition in Social Information Processing.

    ERIC Educational Resources Information Center

    Lemerise, Elizabeth A.; Arsenio, William F.

    2000-01-01

    Interprets literature on contributions of social cognitive and emotion processes to children's social competence in the context of an integrated model of emotion processes and cognition in social information processing. Provides neurophysiological and functional evidence for the centrality of emotion processes in personal-social decision making.…

  7. Process for making unsaturated hydrocarbons using microchannel process technology

    DOEpatents

    Tonkovich, Anna Lee [Dublin, OH; Yuschak, Thomas [Lewis Center, OH; LaPlante, Timothy J [Columbus, OH; Rankin, Scott [Columbus, OH; Perry, Steven T [Galloway, OH; Fitzgerald, Sean Patrick [Columbus, OH; Simmons, Wayne W [Dublin, OH; Mazanec, Terry; Daymo, Eric

    2011-04-12

    The disclosed invention relates to a process for converting a feed composition comprising one or more hydrocarbons to a product comprising one or more unsaturated hydrocarbons, the process comprising: flowing the feed composition and steam in contact with each other in a microchannel reactor at a temperature in the range from about 200 °C to about 1200 °C to convert the feed composition to the product, the process being characterized by the absence of catalyst for converting the one or more hydrocarbons to one or more unsaturated hydrocarbons. Hydrogen and/or oxygen may be combined with the feed composition and steam.

  8. INTERIOR PHOTO OF MAIN PROCESSING BUILDING PROCESS MAKEUP AREA (CPP601) ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR PHOTO OF MAIN PROCESSING BUILDING PROCESS MAKEUP AREA (CPP-601) LOOKING SOUTHWEST. PHOTO TAKEN FROM NORTHEAST CORNER. INL PHOTO NUMBER HD-50-4-2. Mike Crane, Photographer, 6/2005 - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID

  9. INTERIOR PHOTO OF MAIN PROCESSING BUILDING PROCESS MAKEUP AREA (CPP601) ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR PHOTO OF MAIN PROCESSING BUILDING PROCESS MAKEUP AREA (CPP-601) LOOKING NORTH. PHOTO TAKEN FROM SOUTHWEST CORNER. INL PHOTO NUMBER HD-50-1-3. Mike Crane, Photographer, 6/2005 - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID

  10. Development of a Process Signature for Manufacturing Processes with Thermal Loads

    NASA Astrophysics Data System (ADS)

    Frerichs, Friedhelm; Meyer, Heiner; Strunk, Rebecca; Kolkwitz, Benjamin; Epp, Jeremy

    2018-06-01

    The newly proposed concept of Process Signatures enables the comparison of seemingly different manufacturing processes via a process-independent approach based on the analysis of the loading condition and the resulting material modification. This contribution compares recently published results, based on numerical data, for the development of Process Signatures for sole surface and volume heating without phase transformations with experimental data. The numerical approach applies moving heat source theory in combination with energetic quantities. The external thermal loading of both processes was characterized by the resulting temperature development, which correlates with a change in the residual stress state. The numerical investigations show that surface and volume heating are interchangeable for certain parameter regimes with regard to the changes in the residual stress state. Temperature gradients and thermal diffusion are mainly responsible for the modifications considered. The surface- and volume-heating models applied here correspond to shallow cut grinding and induction heating, respectively. The comparison of numerical and experimental data reveals similarities, but also some systematic deviations of the residual stresses at the surface. The evaluation and final discussion support the assertion that very fast stress relaxation processes occur within the subsurface region. A consequence would be that these stress relaxation processes, which are not yet included in the numerical models, must be included in Process Signatures for sole thermal impacts.

  11. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Fletcher, James C. (Inventor); Pratt, J. Richard (Inventor); St.clair, Terry L. (Inventor); Stoakley, Diane M. (Inventor); Burks, Harold D. (Inventor)

    1992-01-01

    A process for preparing polyimides having enhanced melt flow properties is described. The process consists of heating a mixture of a high molecular weight poly-(amic acid) or polyimide with a low molecular weight amic acid or imide additive in the range of 0.05 to 15 percent by weight of additive. The polyimide powders so obtained show improved processability, as evidenced by lower melt viscosity by capillary rheometry. Likewise, films prepared from mixtures of polymers with additives show improved processability with earlier onset of stretching by TMA.

  12. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Pratt, J. Richard (Inventor); St. Clair, Terry L. (Inventor); Stoakley, Diane M. (Inventor); Burks, Harold D. (Inventor)

    1993-01-01

    A process for preparing polyimides having enhanced melt flow properties is described. The process consists of heating a mixture of a high molecular weight poly-(amic acid) or polyimide with a low molecular weight amic acid or imide additive in the range of 0.05 to 15 percent by weight of the additive. The polyimide powders so obtained show improved processability, as evidenced by lower melt viscosity by capillary rheometry. Likewise, films prepared from mixtures of polymers with additives show improved processability with earlier onset of stretching by TMA.

  13. INTERIOR PHOTO OF MAIN PROCESSING BUILDING (CPP601) PROCESS MAKEUP AREA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR PHOTO OF MAIN PROCESSING BUILDING (CPP-601) PROCESS MAKEUP AREA LOOKING SOUTH. PHOTO TAKEN FROM CENTER OF WEST WALL. INL PHOTO NUMBER HD-50-1-4. Mike Crane, Photographer, 6/2005 - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID

  14. INTERIOR PHOTO OF MAIN PROCESSING BUILDING PROCESS MAKEUP AREA (CPP601) ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR PHOTO OF MAIN PROCESSING BUILDING PROCESS MAKEUP AREA (CPP-601) LOOKING NORTHWEST. PHOTO TAKEN FROM MIDDLE OF CORRIDOR. INL PHOTO NUMBER HD-50-2-3. Mike Crane, Photographer, 6/2005 - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID

  15. INTERIOR PHOTO OF MAIN PROCESSING BUILDING PROCESS MAKEUP AREA (CPP601) ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR PHOTO OF MAIN PROCESSING BUILDING PROCESS MAKEUP AREA (CPP-601) LOOKING SOUTH. PHOTO TAKEN FROM MIDDLE OF CORRIDOR. INL PHOTO NUMBER HD-50-3-2. Mike Crane, Photographer, 6/2005 - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID

  16. Assessment of Intervertebral Disc Degeneration Based on Quantitative MRI Analysis: an in vivo study

    PubMed Central

    Grunert, Peter; Hudson, Katherine D.; Macielak, Michael R.; Aronowitz, Eric; Borde, Brandon H.; Alimi, Marjan; Njoku, Innocent; Ballon, Douglas; Tsiouris, Apostolos John; Bonassar, Lawrence J.; Härtl, Roger

    2015-01-01

    Study Design: Animal experimental study. Objective: To evaluate a novel quantitative imaging technique for assessing disc degeneration. Summary of Background Data: T2-relaxation time (T2-RT) measurements have been used to quantitatively assess disc degeneration. T2 values correlate with the water content of intervertebral disc tissue and thereby allow for the indirect measurement of nucleus pulposus (NP) hydration. Methods: We developed an algorithm to filter out MRI voxels not representing NP tissue based on T2-RT values. Filtered NP voxels were used to measure nuclear size by their count and nuclear hydration by their mean T2-RT. This technique was applied to 24 rat-tail intervertebral discs (IVDs), which had been punctured with an 18-gauge needle according to different techniques to induce varying degrees of degeneration. NP voxel count and average T2-RT were used as parameters to assess the degeneration process at 1 and 3 months post puncture. NP voxel counts were evaluated against X-ray disc height measurements and qualitative MRI studies based on the Pfirrmann grading system. Tails were collected for histology to correlate NP voxel counts to histological disc degeneration grades and to NP cross-sectional area measurements. Results: NP voxel count measurements showed strong correlations to qualitative MRI analyses (R2=0.79, p<0.0001), histological degeneration grades (R2=0.902, p<0.0001), and histological NP cross-sectional area measurements (R2=0.887, p<0.0001). In contrast to NP voxel counts, the mean T2-RT for each punctured group remained constant between months 1 and 3. The mean T2-RTs for the punctured groups did not show a statistically significant difference from those of healthy IVDs at either time point (63.55 ms ± 5.88 ms at month 1 and 62.61 ms ± 5.02 ms at month 3). Conclusion: The NP voxel count proved to be a valid parameter to quantitatively assess disc degeneration in a needle puncture model. The mean NP T2-RT does not change significantly in needle
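    The voxel-filtering step described in the Methods can be sketched as follows; the T2-RT window (50-120 ms) and the toy map are hypothetical illustrations, not the thresholds the authors derived:

```python
import numpy as np

def np_voxel_metrics(t2_map, t2_min=50.0, t2_max=120.0):
    """Filter a T2-relaxation-time map (ms) down to voxels plausibly
    representing nucleus pulposus (NP) tissue, then summarize them.

    t2_min/t2_max are hypothetical NP cutoffs; the study derives its own.
    Returns (voxel_count, mean_t2): size proxy and hydration proxy.
    """
    t2 = np.asarray(t2_map, dtype=float)
    np_mask = (t2 >= t2_min) & (t2 <= t2_max)   # keep only NP-like voxels
    count = int(np_mask.sum())
    mean_t2 = float(t2[np_mask].mean()) if count else float("nan")
    return count, mean_t2

# Toy 4x4 T2 map: four voxels fall inside the NP window
demo = np.array([[30, 60, 65, 20],
                 [10, 70, 80, 15],
                 [ 5, 40, 45, 12],
                 [ 8, 25, 30,  9]])
```

    On the toy map, `np_voxel_metrics(demo)` counts the four in-window voxels and averages their T2, mirroring how voxel count tracks nuclear size while mean T2-RT tracks hydration.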

  17. Process for separating nitrogen from methane using microchannel process technology

    DOEpatents

    Tonkovich, Anna Lee [Marysville, OH]; Qiu, Dongming [Dublin, OH]; Dritz, Terence Andrew [Worthington, OH]; Neagle, Paul [Westerville, OH]; Litt, Robert Dwayne [Westerville, OH]; Arora, Ravi [Dublin, OH]; Lamont, Michael Jay [Hilliard, OH]; Pagnotto, Kristina M. [Cincinnati, OH]

    2007-07-31

    The disclosed invention relates to a process for separating methane or nitrogen from a fluid mixture comprising methane and nitrogen, the process comprising: (A) flowing the fluid mixture into a microchannel separator, the microchannel separator comprising a plurality of process microchannels containing a sorption medium, the fluid mixture being maintained in the microchannel separator until at least part of the methane or nitrogen is sorbed by the sorption medium, and removing non-sorbed parts of the fluid mixture from the microchannel separator; and (B) desorbing the methane or nitrogen from the sorption medium and removing the desorbed methane or nitrogen from the microchannel separator. The process is suitable for upgrading methane from coal mines, landfills, and other sub-quality sources.

  18. FLOOR PLAN OF MAIN PROCESSING BUILDING (CPP601) BASEMENT SHOWING PROCESS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    FLOOR PLAN OF MAIN PROCESSING BUILDING (CPP-601) BASEMENT SHOWING PROCESS CORRIDOR AND EIGHTEEN CELLS. TO LEFT IS LABORATORY BUILDING (CPP-602). INL DRAWING NUMBER 200-0601-00-706-051981. ALTERNATE ID NUMBER CPP-E-1981. - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID

  19. Fermentation process using specific oxygen uptake rates as a process control

    DOEpatents

    Van Hoek, Pim; Aristidou, Aristos; Rush, Brian J.

    2016-08-30

    Specific oxygen uptake rate (OUR) is used as a process control parameter in fermentation processes. OUR is determined during at least the production phase of a fermentation process, and process parameters are adjusted to maintain the OUR within desired ranges. The invention is particularly applicable when the fermentation is conducted using a microorganism having a natural PDC pathway that has been disrupted so that it no longer functions. Microorganisms of this sort often produce poorly under strictly anaerobic conditions. Microaeration controlled by monitoring OUR allows the performance of the microorganism to be optimized.
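    A minimal sketch of the control idea, assuming a simple band controller with illustrative set-points and units (the patent does not specify these numbers):

```python
def adjust_aeration(our, airflow, our_low=2.0, our_high=4.0, step=0.1,
                    min_flow=0.0, max_flow=2.0):
    """One step of a band controller keeping specific OUR
    (mmol O2 / g DCW / h) inside a target window by nudging airflow (vvm).

    All set-points and units here are illustrative assumptions.
    """
    if our < our_low:        # uptake below target: supply more oxygen
        airflow = min(airflow + step, max_flow)
    elif our > our_high:     # uptake above target: throttle back toward microaeration
        airflow = max(airflow - step, min_flow)
    return airflow           # inside the window: leave airflow unchanged
```

    Calling this once per measurement cycle during the production phase keeps OUR inside the desired band without driving the culture fully aerobic or fully anaerobic.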

  20. Fermentation process using specific oxygen uptake rates as a process control

    DOEpatents

    Van Hoek, Pim [Minnetonka, MN]; Aristidou, Aristos [Maple Grove, MN]; Rush, Brian [Minneapolis, MN]

    2011-05-10

    Specific oxygen uptake rate (OUR) is used as a process control parameter in fermentation processes. OUR is determined during at least the production phase of a fermentation process, and process parameters are adjusted to maintain the OUR within desired ranges. The invention is particularly applicable when the fermentation is conducted using a microorganism having a natural PDC pathway that has been disrupted so that it no longer functions. Microorganisms of this sort often produce poorly under strictly anaerobic conditions. Microaeration controlled by monitoring OUR allows the performance of the microorganism to be optimized.

  1. Fermentation process using specific oxygen uptake rates as a process control

    DOEpatents

    Van Hoek, Pim [Minnetonka, MN]; Aristidou, Aristos [Maple Grove, MN]; Rush, Brian [Minneapolis, MN]

    2007-06-19

    Specific oxygen uptake rate (OUR) is used as a process control parameter in fermentation processes. OUR is determined during at least the production phase of a fermentation process, and process parameters are adjusted to maintain the OUR within desired ranges. The invention is particularly applicable when the fermentation is conducted using a microorganism having a natural PDC pathway that has been disrupted so that it no longer functions. Microorganisms of this sort often produce poorly under strictly anaerobic conditions. Microaeration controlled by monitoring OUR allows the performance of the microorganism to be optimized.

  2. Fermentation process using specific oxygen uptake rates as a process control

    DOEpatents

    Van Hoek, Pim; Aristidou, Aristos; Rush, Brian

    2014-09-09

    Specific oxygen uptake rate (OUR) is used as a process control parameter in fermentation processes. OUR is determined during at least the production phase of a fermentation process, and process parameters are adjusted to maintain the OUR within desired ranges. The invention is particularly applicable when the fermentation is conducted using a microorganism having a natural PDC pathway that has been disrupted so that it no longer functions. Microorganisms of this sort often produce poorly under strictly anaerobic conditions. Microaeration controlled by monitoring OUR allows the performance of the microorganism to be optimized.

  3. Peat Processing

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Humics, Inc. had already patented its process for separating wet peat into components and processing them when it consulted NERAC regarding possible applications. The NERAC search revealed numerous uses for humic acid extracted from peat. The product improves seed germination, stimulates root development, and improves crop yields. There are also potential applications in sewage disposal and horticultural peat.

  4. Picturing Quantum Processes

    NASA Astrophysics Data System (ADS)

    Coecke, Bob; Kissinger, Aleks

    2017-03-01

    Preface; 1. Introduction; 2. Guide to reading this textbook; 3. Processes as diagrams; 4. String diagrams; 5. Hilbert space from diagrams; 6. Quantum processes; 7. Quantum measurement; 8. Picturing classical-quantum processes; 9. Picturing phases and complementarity; 10. Quantum theory: the full picture; 11. Quantum foundations; 12. Quantum computation; 13. Quantum resources; 14. Quantomatic; Appendix A. Some notations; References; Index.

  5. Visual Processing in Rapid-Chase Systems: Image Processing, Attention, and Awareness

    PubMed Central

    Schmidt, Thomas; Haberkamp, Anke; Veltkamp, G. Marina; Weber, Andreas; Seydell-Greenwald, Anna; Schmidt, Filipp

    2011-01-01

    Visual stimuli can be classified so rapidly that their analysis may be based on a single sweep of feedforward processing through the visuomotor system. Behavioral criteria for feedforward processing can be evaluated in response priming tasks where speeded pointing or keypress responses are performed toward target stimuli which are preceded by prime stimuli. We apply this method to several classes of complex stimuli. (1) When participants classify natural images into animals or non-animals, the time course of their pointing responses indicates that prime and target signals remain strictly sequential throughout all processing stages, meeting stringent behavioral criteria for feedforward processing (rapid-chase criteria). (2) Such priming effects are boosted by selective visual attention for positions, shapes, and colors, in a way consistent with bottom-up enhancement of visuomotor processing, even when primes cannot be consciously identified. (3) Speeded processing of phobic images is observed in participants specifically fearful of spiders or snakes, suggesting enhancement of feedforward processing by long-term perceptual learning. (4) When the perceived brightness of primes in complex displays is altered by means of illumination or transparency illusions, priming effects in speeded keypress responses can systematically contradict subjective brightness judgments, such that one prime appears brighter than the other but activates motor responses as if it was darker. We propose that response priming captures the output of the first feedforward pass of visual signals through the visuomotor system, and that this output lacks some characteristic features of more elaborate, recurrent processing. This way, visuomotor measures may become dissociated from several aspects of conscious vision. We argue that “fast” visuomotor measures predominantly driven by feedforward processing should supplement “slow” psychophysical measures predominantly based on visual awareness

  6. Neural competition as a developmental process: Early hemispheric specialization for word processing delays specialization for face processing

    PubMed Central

    Li, Su; Lee, Kang; Zhao, Jing; Yang, Zhi; He, Sheng; Weng, Xuchu

    2013-01-01

    Little is known about the impact of learning to read on early neural development for word processing and its collateral effects on neural development in non-word domains. Here, we examined the effect of early exposure to reading on neural responses to both word and face processing in preschool children with the use of the Event Related Potential (ERP) methodology. We specifically linked children’s reading experience (indexed by their sight vocabulary) to two major neural markers: the amplitude differences between the left and right N170 on the bilateral posterior scalp sites and the hemispheric spectrum power differences in the γ band on the same scalp sites. The results showed that the left-lateralization of both the word N170 and the spectrum power in the γ band were significantly positively related to vocabulary. In contrast, vocabulary and the word left-lateralization both had a strong negative direct effect on the face right-lateralization. Also, vocabulary negatively correlated with the right-lateralized face spectrum power in the γ band even after the effects of age and the word spectrum power were partialled out. The present study provides direct evidence regarding the role of reading experience in the neural specialization of word and face processing above and beyond the effect of maturation. The present findings taken together suggest that the neural development of visual word processing competes with that of face processing before the process of neural specialization has been consolidated. PMID:23462239

  7. Process evaluation distributed system

    NASA Technical Reports Server (NTRS)

    Moffatt, Christopher L. (Inventor)

    2006-01-01

    The distributed system includes a database server, an administration module, a process evaluation module, and a data display module. The administration module is in communication with the database server for providing observation criteria information to the database server. The process evaluation module is in communication with the database server for obtaining the observation criteria information from the database server and collecting process data based on the observation criteria information. The process evaluation module utilizes a personal digital assistant (PDA). The data display module is in communication with the database server and includes a website for viewing collected process data in a desired metrics form; it also provides editing and modification of the collected process data. The connectivity established by the database server to the administration module, the process evaluation module, and the data display module minimizes the requirement for manual input of the collected process data.

  8. Word Processing and the Writing Process: Enhancement or Distraction?

    ERIC Educational Resources Information Center

    Dalton, David W.; Watson, James F.

    This study examined the effects of a year-long word processing program on learners' holistic writing skills. Based on results of a writing pretest, 80 seventh grade students were designated as relatively high or low in prior writing achievement and assigned to one of two groups: a word processing treatment and a conventional writing process…

  9. Polycrystalline semiconductor processing

    DOEpatents

    Glaeser, Andreas M.; Haggerty, John S.; Danforth, Stephen C.

    1983-01-01

    A process for forming large-grain polycrystalline films from amorphous films for use as photovoltaic devices. The process operates on the amorphous film and uses the driving force inherent to the transition from the amorphous state to the crystalline state as the force which drives the grain growth process. The resultant polycrystalline film is characterized by a grain size that is greater than the thickness of the film. A thin amorphous film is deposited on a substrate. The formation of a plurality of crystalline embryos is induced in the amorphous film at predetermined spaced apart locations and nucleation is inhibited elsewhere in the film. The crystalline embryos are caused to grow in the amorphous film, without further nucleation occurring in the film, until the growth of the embryos is halted by impingement on adjacently growing embryos. The process is applicable to both batch and continuous processing techniques. In either type of process, the thin amorphous film is sequentially doped with p and n type dopants. Doping is effected either before or after the formation and growth of the crystalline embryos in the amorphous film, or during a continuously proceeding crystallization step.

  10. Polycrystalline semiconductor processing

    DOEpatents

    Glaeser, A.M.; Haggerty, J.S.; Danforth, S.C.

    1983-04-05

    A process is described for forming large-grain polycrystalline films from amorphous films for use as photovoltaic devices. The process operates on the amorphous film and uses the driving force inherent to the transition from the amorphous state to the crystalline state as the force which drives the grain growth process. The resultant polycrystalline film is characterized by a grain size that is greater than the thickness of the film. A thin amorphous film is deposited on a substrate. The formation of a plurality of crystalline embryos is induced in the amorphous film at predetermined spaced apart locations and nucleation is inhibited elsewhere in the film. The crystalline embryos are caused to grow in the amorphous film, without further nucleation occurring in the film, until the growth of the embryos is halted by impingement on adjacently growing embryos. The process is applicable to both batch and continuous processing techniques. In either type of process, the thin amorphous film is sequentially doped with p and n type dopants. Doping is effected either before or after the formation and growth of the crystalline embryos in the amorphous film, or during a continuously proceeding crystallization step. 10 figs.

  11. Helium process cycle

    DOEpatents

    Ganni, Venkatarao

    2008-08-12

    A unique process cycle and apparatus design separates the consumer (cryogenic) load return flow from most of the recycle return flow of a refrigerator and/or liquefier process cycle. The refrigerator and/or liquefier process recycle return flow is recompressed by a multi-stage compressor set and the consumer load return flow is recompressed by an independent consumer load compressor set that maintains a desirable constant suction pressure using a consumer load bypass control valve and the consumer load return pressure control valve that controls the consumer load compressor's suction pressure. The discharge pressure of this consumer load compressor is thereby allowed to float at the intermediate pressure in between the first and second stage recycle compressor sets. Utilizing the unique gas management valve regulation, the unique process cycle and apparatus design in which the consumer load return flow is separate from the recycle return flow, the pressure ratios of each recycle compressor stage and all main pressures associated with the recycle return flow are allowed to vary naturally, thus providing a naturally regulated and balanced floating pressure process cycle that maintains optimal efficiency at design and off-design process cycle capacity and conditions automatically.

  12. Helium process cycle

    DOEpatents

    Ganni, Venkatarao

    2007-10-09

    A unique process cycle and apparatus design separates the consumer (cryogenic) load return flow from most of the recycle return flow of a refrigerator and/or liquefier process cycle. The refrigerator and/or liquefier process recycle return flow is recompressed by a multi-stage compressor set and the consumer load return flow is recompressed by an independent consumer load compressor set that maintains a desirable constant suction pressure using a consumer load bypass control valve and the consumer load return pressure control valve that controls the consumer load compressor's suction pressure. The discharge pressure of this consumer load compressor is thereby allowed to float at the intermediate pressure in between the first and second stage recycle compressor sets. Utilizing the unique gas management valve regulation, the unique process cycle and apparatus design in which the consumer load return flow is separate from the recycle return flow, the pressure ratios of each recycle compressor stage and all main pressures associated with the recycle return flow are allowed to vary naturally, thus providing a naturally regulated and balanced floating pressure process cycle that maintains optimal efficiency at design and off-design process cycle capacity and conditions automatically.

  13. Improving the Document Development Process: Integrating Relational Data and Statistical Process Control.

    ERIC Educational Resources Information Center

    Miller, John

    1994-01-01

    Presents an approach to document numbering, document titling, and process measurement which, when used with fundamental techniques of statistical process control, reveals meaningful process-element variation as well as nominal productivity models. (SR)

  14. Thermochemical water decomposition processes

    NASA Technical Reports Server (NTRS)

    Chao, R. E.

    1974-01-01

    Thermochemical processes which lead to the production of hydrogen and oxygen from water without the consumption of any other material have a number of advantages when compared to other processes such as water electrolysis. It is possible to operate a sequence of chemical steps with net work requirements equal to zero at temperatures well below the temperature required for water dissociation in a single step. Various types of procedures are discussed, giving attention to halide processes, reverse Deacon processes, iron oxide and carbon oxide processes, and metal and alkali metal processes. Economic questions are also considered.
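    The "net work equal to zero" argument rests on a simple bookkeeping constraint: the steps of any closed cycle must sum to the overall water-splitting reaction, while each individual step, run at its own temperature, can be made spontaneous on its own (the standard enthalpy value below is ordinary thermochemical data, not taken from this paper):

```latex
\sum_i \Delta H_i \;=\; \Delta H_{\mathrm{H_2O(l)\,\to\,H_2 + \tfrac{1}{2}O_2}}
\;\approx\; +286\ \mathrm{kJ\,mol^{-1}},
\qquad
\Delta G_i(T_i) \;=\; \Delta H_i - T_i\,\Delta S_i \;\le\; 0
\quad \text{for each step } i .
```

    An endothermic step with positive entropy change can thus be driven purely thermally at a sufficiently high $T_i$, which is how such cycles avoid the electrical work input that electrolysis requires.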

  15. Business Development Process

    DTIC Science & Technology

    2001-10-31

    DISTRIBUTION STATEMENT A: Approved for Public Release; Distribution Unlimited. Attorney Docket No. 83042. BUSINESS DEVELOPMENT PROCESS. STATEMENT OF GOVERNMENT INTEREST: The invention described herein may be manufactured and used by or for the... (1) Field of the Invention: This invention generally relates to a business development process for assessing new business ideas

  16. [Definition and stabilization of processes I. Management processes and support in a Urology Department].

    PubMed

    Pascual, Carlos; Luján, Marcos; Mora, José Ramón; Chiva, Vicente; Gamarra, Manuela

    2015-01-01

    The implementation of total quality management models in clinical departments can be well adapted to the ISO 9004:2009 model. An essential part of implementing these models is the establishment of processes and their stabilization. There are four types of processes: key, management, support, and operative (clinical). Management processes have four parts: a process stabilization form, a process procedures form, a medical activities cost estimation form, and a process flow chart. In this paper we detail the creation of an essential process in a surgical department: the management of the surgery waiting list.

  17. 7 Processes that Enable NASA Software Engineering Technologies: Value-Added Process Engineering

    NASA Technical Reports Server (NTRS)

    Housch, Helen; Godfrey, Sally

    2011-01-01

    The presentation reviews Agency process requirements and the purpose, benefits, and experiences of seven software engineering processes. The processes include: product integration, configuration management, verification, software assurance, measurement and analysis, requirements management, and planning and monitoring.

  18. Implementation of a process analytical technology system in a freeze-drying process using Raman spectroscopy for in-line process monitoring.

    PubMed

    De Beer, T R M; Allesø, M; Goethals, F; Coppens, A; Heyden, Y Vander; De Diego, H Lopez; Rantanen, J; Verpoort, F; Vervaet, C; Remon, J P; Baeyens, W R G

    2007-11-01

    The aim of the present study was to propose a strategy for implementing a Process Analytical Technology system in freeze-drying processes. Mannitol solutions, some of them containing NaCl, were used as model systems for freeze-drying. Noninvasive, in-line Raman measurements were performed continuously during lyophilization of the solutions to monitor in real time the mannitol solid state, the end points of the different process steps (freezing, primary drying, secondary drying), and physical phenomena occurring during the process. At-line near-infrared (NIR) and X-ray powder diffractometry (XRPD) measurements were done to confirm the Raman conclusions and to uncover additional information. The spectra collected during the processes were analyzed using principal component analysis and multivariate curve resolution. A two-level full factorial design was used to study the influence of process (freezing rate) and formulation variables (concentration of mannitol, concentration of NaCl, volume of freeze-dried sample) upon freeze-drying. Raman spectroscopy was able to monitor (i) the mannitol solid state (amorphous, alpha, beta, delta, and hemihydrate), (ii) several process step end points (end of mannitol crystallization during freezing, primary drying), and (iii) physical phenomena occurring during freeze-drying (onset of ice nucleation, onset of mannitol crystallization during the freezing step, onset of ice sublimation). NIR proved to be a more sensitive tool for monitoring sublimation than Raman spectroscopy, while XRPD helped to identify the mannitol hemihydrate in the samples. The experimental design results showed that several process and formulation variables significantly influence different aspects of lyophilization and that the two are interrelated. Raman spectroscopy (in-line) and NIR spectroscopy and XRPD (at-line) not only allowed the real-time monitoring of mannitol freeze-drying processes but also helped us (in combination with experimental design)
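    The principal component analysis applied to the collected spectra can be sketched generically; this is plain SVD-based PCA on a samples-by-wavelengths matrix, not the authors' specific preprocessing or software:

```python
import numpy as np

def pca_scores(spectra, n_components=2):
    """Project a (samples x wavelengths) matrix of spectra onto its first
    principal components via SVD (generic PCA sketch)."""
    X = np.asarray(spectra, dtype=float)
    Xc = X - X.mean(axis=0)            # mean-center each wavelength channel
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T    # scores in the reduced space
```

    Plotting such scores against process time is the usual way trends like crystallization onsets become visible in in-line spectral data.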

  19. Real-Time Plasma Process Condition Sensing and Abnormal Process Detection

    PubMed Central

    Yang, Ryan; Chen, Rongshun

    2010-01-01

    The plasma process is often used in the fabrication of semiconductor wafers. However, due to the lack of real-time etching control, this may result in unacceptable process performance and thus lead to significant waste and lower wafer yield. In order to maximize product wafer yield, timely and accurate detection of process faults or abnormalities in a plasma reactor is needed. Optical emission spectroscopy (OES) is one of the most frequently used metrologies for in-situ process monitoring. Even though OES has the advantage of non-invasiveness, it provides a huge amount of information. As a result, the data analysis of OES becomes a big challenge. To accomplish real-time detection, this work employed the sigma matching technique on the time series of the OES full-spectrum intensity. First, a response model of a healthy plasma spectrum was developed. Then, we defined a matching rate as an indicator for comparing the difference between a tested wafer's response and the healthy sigma model. The experimental results showed that this proposed method can detect process faults in real time, even in plasma etching tools. PMID:22219683
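    One plausible reading of such a matching-rate indicator is the fraction of spectral channels whose intensity stays within a k-sigma band of the healthy reference; the value of k and the banding rule below are our assumptions, not the paper's exact definition:

```python
import numpy as np

def matching_rate(test_spectrum, healthy_mean, healthy_sigma, k=3.0):
    """Fraction of OES wavelength channels whose intensity lies within
    k * sigma of a healthy-reference model (simplified sketch of a
    'sigma matching' style comparison; k is an assumption)."""
    test = np.asarray(test_spectrum, dtype=float)
    within = np.abs(test - healthy_mean) <= k * healthy_sigma
    return float(within.mean())

# Toy example: 4 of 5 channels fall inside the healthy band
rate = matching_rate([10.0, 11.0, 9.5, 30.0, 10.2],
                     healthy_mean=np.full(5, 10.0),
                     healthy_sigma=np.full(5, 0.5))
```

    A wafer whose rate drops below some alarm threshold would then be flagged as a process fault in real time.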

  20. Feller processes: the next generation in modeling. Brownian motion, Lévy processes and beyond.

    PubMed

    Böttcher, Björn

    2010-12-03

    We present a simple construction method for Feller processes and a framework for the generation of sample paths of Feller processes. The construction is based on state space dependent mixing of Lévy processes. Brownian motion is one of the most frequently used continuous time Markov processes in applications. In recent years also Lévy processes, of which Brownian motion is a special case, have become increasingly popular. Lévy processes are spatially homogeneous, but empirical data often suggest the use of spatially inhomogeneous processes. Thus it seems necessary to go to the next level of generalization: Feller processes. These include Lévy processes and in particular Brownian motion as special cases but allow spatial inhomogeneities. Many properties of Feller processes are known, but proving their very existence is, in general, very technical. Moreover, an applicable framework for the generation of sample paths of a Feller process was missing. We explain, with practitioners in mind, how to overcome both of these obstacles. In particular, our simulation technique makes it possible to apply Monte Carlo methods to Feller processes.
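    The flavor of a spatially inhomogeneous process can be illustrated with a toy Euler scheme for a diffusion whose coefficient depends on the current state; the concrete choice sigma(x) = 1/(1 + x^2) is our arbitrary example, not the paper's construction:

```python
import math
import random

def simulate_state_dependent_diffusion(x0=0.0, T=1.0, n=1000, seed=42):
    """Euler scheme for a toy state-dependent diffusion
    dX_t = sigma(X_t) dW_t with sigma(x) = 1 / (1 + x**2):
    a spatially inhomogeneous process in the spirit of state-space
    dependent mixing (the specific sigma is an illustrative assumption)."""
    rng = random.Random(seed)
    dt = T / n
    x = x0
    path = [x]
    for _ in range(n):
        sigma = 1.0 / (1.0 + x * x)   # diffusion coefficient varies with state
        x += sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        path.append(x)
    return path
```

    Averaging functionals of many such simulated paths is exactly the kind of Monte Carlo use the abstract has in mind.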

  1. Feller Processes: The Next Generation in Modeling. Brownian Motion, Lévy Processes and Beyond

    PubMed Central

    Böttcher, Björn

    2010-01-01

    We present a simple construction method for Feller processes and a framework for the generation of sample paths of Feller processes. The construction is based on state space dependent mixing of Lévy processes. Brownian motion is one of the most frequently used continuous time Markov processes in applications. In recent years also Lévy processes, of which Brownian motion is a special case, have become increasingly popular. Lévy processes are spatially homogeneous, but empirical data often suggest the use of spatially inhomogeneous processes. Thus it seems necessary to go to the next level of generalization: Feller processes. These include Lévy processes and in particular Brownian motion as special cases but allow spatial inhomogeneities. Many properties of Feller processes are known, but proving their very existence is, in general, very technical. Moreover, an applicable framework for the generation of sample paths of a Feller process was missing. We explain, with practitioners in mind, how to overcome both of these obstacles. In particular, our simulation technique makes it possible to apply Monte Carlo methods to Feller processes. PMID:21151931

  2. Neural competition as a developmental process: early hemispheric specialization for word processing delays specialization for face processing.

    PubMed

    Li, Su; Lee, Kang; Zhao, Jing; Yang, Zhi; He, Sheng; Weng, Xuchu

    2013-04-01

    Little is known about the impact of learning to read on early neural development for word processing and its collateral effects on neural development in non-word domains. Here, we examined the effect of early exposure to reading on neural responses to both word and face processing in preschool children with the use of the Event Related Potential (ERP) methodology. We specifically linked children's reading experience (indexed by their sight vocabulary) to two major neural markers: the amplitude differences between the left and right N170 on the bilateral posterior scalp sites and the hemispheric spectrum power differences in the γ band on the same scalp sites. The results showed that the left-lateralization of both the word N170 and the spectrum power in the γ band were significantly positively related to vocabulary. In contrast, vocabulary and the word left-lateralization both had a strong negative direct effect on the face right-lateralization. Also, vocabulary negatively correlated with the right-lateralized face spectrum power in the γ band even after the effects of age and the word spectrum power were partialled out. The present study provides direct evidence regarding the role of reading experience in the neural specialization of word and face processing above and beyond the effect of maturation. The present findings taken together suggest that the neural development of visual word processing competes with that of face processing before the process of neural specialization has been consolidated. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Process margin enhancement for 0.25-μm metal etch process

    NASA Astrophysics Data System (ADS)

    Lee, Chung Y.; Ma, Wei Wen; Lim, Eng H.; Cheng, Alex T.; Joy, Raymond; Ross, Matthew F.; Wong, Selmer S.; Marlowe, Trey

    2000-06-01

This study evaluates electron beam stabilization of UV6, a positive-tone deep-UV (DUV) resist from Shipley, for a 0.25 micrometer metal etch application. Results are compared between untreated resist and resist treated with different levels of electron beam stabilization. The electron beam processing was carried out in an ElectronCure™ flood electron beam exposure system from Honeywell International Inc., Electron Vision. The ElectronCure™ system utilizes a flood electron beam source which is larger in diameter than the substrate being processed, and is capable of variable energy so that the electron range is matched to the resist film thickness. Changes in the UV6 resist material as a result of the electron beam stabilization are monitored via spectroscopic ellipsometry for film thickness and index of refraction changes and FTIR for analysis of chemical changes. Thermal flow stability is evaluated by applying hot plate bakes of 150 degrees Celsius and 200 degrees Celsius to patterned resist wafers with no treatment and with an electron beam dose level of 2000 µC/cm². A significant improvement in the thermal flow stability of the patterned UV6 resist features is achieved with the electron beam stabilization process. Etch process performance of the UV6 resist was evaluated by performing a metal pattern transfer process on wafers with untreated resist and comparing these with etch results on wafers with different levels of electron beam stabilization. The etch processing was carried out in an Applied Materials reactor with an etch chemistry including BCl3 and Cl2. All wafers were etched under the same conditions, and the resist was treated after etch to prevent further erosion before SEM analysis. Post metal etch SEM cross-sections show the enhancement in etch resistance provided by the electron beam stabilization process. Enhanced process margin is achieved as a result of the improved etch resistance, and is observed in reduced resist side

  4. Power processing

    NASA Technical Reports Server (NTRS)

    Schwarz, F. C.

    1971-01-01

    Processing of electric power has been presented as a discipline that draws on almost every field of electrical engineering, including system and control theory, communications theory, electronic network design, and power component technology. The cost of power processing equipment, which often equals that of expensive, sophisticated, and unconventional sources of electrical energy, such as solar batteries, is a significant consideration in the choice of electric power systems.

  5. Dynamic control of remelting processes

    DOEpatents

    Bertram, Lee A.; Williamson, Rodney L.; Melgaard, David K.; Beaman, Joseph J.; Evans, David G.

    2000-01-01

    An apparatus and method of controlling a remelting process by providing measured process variable values to a process controller; estimating process variable values using a process model of a remelting process; and outputting estimated process variable values from the process controller. Feedback and feedforward control devices receive the estimated process variable values and adjust inputs to the remelting process. Electrode weight, electrode mass, electrode gap, process current, process voltage, electrode position, electrode temperature, electrode thermal boundary layer thickness, electrode velocity, electrode acceleration, slag temperature, melting efficiency, cooling water temperature, cooling water flow rate, crucible temperature profile, slag skin temperature, and/or drip short events are employed, as are parameters representing physical constraints of electroslag remelting or vacuum arc remelting, as applicable.
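The patent pairs measured process variables with model-based estimates. A minimal predictor-corrector (Luenberger-style) observer for a hypothetical scalar process model, offered only as a sketch of the estimate-then-correct idea rather than the patented controller, could look like:

```python
def observer_step(x_est, u, y_meas, a=0.9, b=0.1, L=0.5):
    """Predict the next state from the model x[k+1] = a*x[k] + b*u[k],
    then correct the prediction toward the new measurement y_meas."""
    x_pred = a * x_est + b * u
    return x_pred + L * (y_meas - x_pred)

# toy plant with the same dynamics; the estimate converges to the true state
x_true, x_est, u = 1.0, 0.0, 2.0
for _ in range(20):
    x_true = 0.9 * x_true + 0.1 * u
    x_est = observer_step(x_est, u, x_true)  # noise-free measurement here
```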

  6. In-Process Thermal Imaging of the Electron Beam Freeform Fabrication Process

    NASA Technical Reports Server (NTRS)

    Taminger, Karen M.; Domack, Christopher S.; Zalameda, Joseph N.; Taminger, Brian L.; Hafley, Robert A.; Burke, Eric R.

    2016-01-01

    Researchers at NASA Langley Research Center have been developing the Electron Beam Freeform Fabrication (EBF3) metal additive manufacturing process for the past 15 years. In this process, an electron beam is used as a heat source to create a small molten pool on a substrate into which wire is fed. The electron beam and wire feed assembly are translated with respect to the substrate to follow a predetermined tool path. This process is repeated in a layer-wise fashion to fabricate metal structural components. In-process imaging has been integrated into the EBF3 system using a near-infrared (NIR) camera. The images are processed to provide thermal and spatial measurements that have been incorporated into a closed-loop control system to maintain consistent thermal conditions throughout the build. Other information in the thermal images is being used to assess quality in real time by detecting flaws in prior layers of the deposit. NIR camera incorporation into the system has improved the consistency of the deposited material and provides the potential for real-time flaw detection which, ultimately, could lead to the manufacture of better, more reliable components using this additive manufacturing process.

  7. Hierarchical process memory: memory as an integral component of information processing

    PubMed Central

    Hasson, Uri; Chen, Janice; Honey, Christopher J.

    2015-01-01

Models of working memory commonly focus on how information is encoded into and retrieved from storage at specific moments. However, in the majority of real-life processes, past information is used continuously to process incoming information across multiple timescales. Considering single unit, electrocorticography, and functional imaging data, we argue that (i) virtually all cortical circuits can accumulate information over time, and (ii) the timescales of accumulation vary hierarchically, from early sensory areas with short processing timescales (tens to hundreds of milliseconds) to higher-order areas with long processing timescales (many seconds to minutes). In this hierarchical systems perspective, memory is not restricted to a few localized stores, but is intrinsic to information processing that unfolds throughout the brain on multiple timescales. "The present contains nothing more than the past, and what is found in the effect was already in the cause." (Henri L. Bergson) PMID:25980649

  8. Process Analyzer

    NASA Technical Reports Server (NTRS)

    1994-01-01

The ChemScan UV-6100 is a spectrometry system originally developed by Biotronics Technologies, Inc. under a Small Business Innovation Research (SBIR) contract. It is marketed to the water and wastewater treatment industries, replacing "grab sampling" with on-line data collection. It analyzes the light absorbance characteristics of a water sample, simultaneously detects hundreds of individual wavelengths absorbed by chemical substances in a process solution, and quantifies the information. Spectral data are then processed by the ChemScan analyzer and compared with calibration files in the system's memory in order to calculate concentrations of chemical substances that cause UV light absorbance in specific patterns. Monitored substances can be analyzed for quality and quantity. Applications include detection of a variety of substances, and the information provided enables an operator to control a process more efficiently.
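The quantification step described above rests on the Beer-Lambert law: absorbance at each wavelength is a linear combination of species concentrations. A minimal sketch (hypothetical absorptivity and absorbance values, two wavelengths, two species) recovers the concentrations by solving the resulting linear system:

```python
def concentrations(e11, e12, e21, e22, a1, a2):
    """Solve the 2x2 Beer-Lambert system E @ c = A by Cramer's rule,
    where E holds per-wavelength absorptivities and A the absorbances."""
    det = e11 * e22 - e12 * e21
    c1 = (a1 * e22 - e12 * a2) / det
    c2 = (e11 * a2 - e21 * a1) / det
    return c1, c2

# absorbances produced by concentrations (2.0, 0.5) with these absorptivities
c1, c2 = concentrations(0.9, 0.1, 0.2, 0.7, 1.85, 0.75)
```

Real instruments use hundreds of wavelengths and solve the overdetermined system by least squares, but the principle is the same.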

  9. Rapid process development of chromatographic process using direct analysis in real time mass spectrometry as a process analytical technology tool.

    PubMed

    Yan, Binjun; Chen, Teng; Xu, Zhilin; Qu, Haibin

    2014-06-01

The concept of quality by design (QbD) is widely applied in the process development of pharmaceuticals. However, the additional cost and time have caused some resistance to QbD implementation. To show a possible solution, this work proposed a rapid process development method, which used direct analysis in real time mass spectrometry (DART-MS) as a process analytical technology (PAT) tool for studying the chromatographic process of Ginkgo biloba L. as an example. The breakthrough curves were rapidly determined by DART-MS at-line. A high correlation coefficient of 0.9520 was found between the concentrations of ginkgolide A determined by DART-MS and HPLC. Based on the PAT tool, the impacts of process parameters on the adsorption capacity were discovered rapidly, showing that the adsorption capacity decreased as the flow rate increased. This work has shown the feasibility and advantages of integrating PAT into QbD implementation for rapid process development. Copyright © 2014 Elsevier B.V. All rights reserved.
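The 0.9520 figure is a Pearson correlation coefficient between the two assays. For reference, the coefficient can be computed as covariance normalized by both standard deviations (the paired readings below are invented, for illustration only):

```python
def pearson_r(xs, ys):
    """Pearson correlation: covariance over the product of standard deviations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# hypothetical paired DART-MS and HPLC concentration readings
r = pearson_r([1.0, 2.1, 2.9, 4.2], [1.1, 2.0, 3.1, 4.0])
```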

  10. Recollection is a continuous process: implications for dual-process theories of recognition memory.

    PubMed

    Mickes, Laura; Wais, Peter E; Wixted, John T

    2009-04-01

    Dual-process theory, which holds that recognition decisions can be based on recollection or familiarity, has long seemed incompatible with signal detection theory, which holds that recognition decisions are based on a singular, continuous memory-strength variable. Formal dual-process models typically regard familiarity as a continuous process (i.e., familiarity comes in degrees), but they construe recollection as a categorical process (i.e., recollection either occurs or does not occur). A continuous process is characterized by a graded relationship between confidence and accuracy, whereas a categorical process is characterized by a binary relationship such that high confidence is associated with high accuracy but all lower degrees of confidence are associated with chance accuracy. Using a source-memory procedure, we found that the relationship between confidence and source-recollection accuracy was graded. Because recollection, like familiarity, is a continuous process, dual-process theory is more compatible with signal detection theory than previously thought.
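The graded-versus-binary distinction above can be made concrete by tabulating accuracy per confidence level: a continuous process predicts a monotone rise, a categorical one a single step at high confidence. A minimal sketch over hypothetical (confidence, correct) trials:

```python
from collections import defaultdict

def accuracy_by_confidence(trials):
    """trials: (confidence_level, correct) pairs -> accuracy per level."""
    hits, counts = defaultdict(int), defaultdict(int)
    for conf, correct in trials:
        counts[conf] += 1
        hits[conf] += int(correct)
    return {c: hits[c] / counts[c] for c in sorted(counts)}

# invented trials showing a graded pattern: accuracy rises with confidence
acc = accuracy_by_confidence([(1, False), (1, True), (2, True), (2, True)])
```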

  11. Rethinking Process through Design

    ERIC Educational Resources Information Center

    Newcomb, Matthew; Leshowitz, Allison

    2017-01-01

    We take a look at work on writing processes by examining design processes. Design processes offer a greater emphasis on empathy with users, feedback and critique during idea generation, and varied uses of materials. After considering work already done on design and composition, we explore a variety of design processes and develop our own…

  12. Making process improvement 'stick'.

    PubMed

    Studer, Quint

    2014-06-01

    To sustain gains from a process improvement initiative, healthcare organizations should: Explain to staff why a process improvement initiative is needed. Encourage leaders within the organization to champion the process improvement, and tie their evaluations to its outcomes. Ensure that both leaders and employees have the skills to help sustain the sought-after process improvements.

  13. Computers for symbolic processing

    NASA Technical Reports Server (NTRS)

    Wah, Benjamin W.; Lowrie, Matthew B.; Li, Guo-Jie

    1989-01-01

    A detailed survey on the motivations, design, applications, current status, and limitations of computers designed for symbolic processing is provided. Symbolic processing computations are performed at the word, relation, or meaning levels, and the knowledge used in symbolic applications may be fuzzy, uncertain, indeterminate, and ill represented. Various techniques for knowledge representation and processing are discussed from both the designers' and users' points of view. The design and choice of a suitable language for symbolic processing and the mapping of applications into a software architecture are then considered. The process of refining the application requirements into hardware and software architectures is treated, and state-of-the-art sequential and parallel computers designed for symbolic processing are discussed.

  14. Honing process optimization algorithms

    NASA Astrophysics Data System (ADS)

    Kadyrov, Ramil R.; Charikov, Pavel N.; Pryanichnikova, Valeria V.

    2018-03-01

This article considers the relevance of honing processes for creating high-quality mechanical engineering products. The features of the honing process are described, and key concepts are emphasized: the optimization problem for honing operations, the optimal structure of honing working cycles, stepped and stepless honing cycles, and the simulation of processing and its purpose. It is noted that the reliability of the mathematical model determines the quality of the honing process control. An algorithm for continuous control of the honing process is proposed. The process model reliably describes the machining of a workpiece over a sufficiently wide region and can be used to operate the CNC machine CC743.

  15. HYNOL PROCESS ENGINEERING: PROCESS CONFIGURATION, SITE PLAN, AND EQUIPMENT DESIGN

    EPA Science Inventory

    The report describes the design of the hydropyrolysis reactor system of the Hynol process. (NOTE: A bench scale methanol production facility is being constructed to demonstrate the technical feasibility of producing methanol from biomass using the Hynol process. The plant is bein...

  16. Silicon production process evaluations

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Chemical engineering analyses involving the preliminary process design of a plant (1,000 metric tons/year capacity) to produce silicon via the technology under consideration were accomplished. Major activities in the chemical engineering analyses included base case conditions, reaction chemistry, process flowsheet, material balance, energy balance, property data, equipment design, major equipment list, production labor and forward for economic analysis. The process design package provided detailed data for raw materials, utilities, major process equipment and production labor requirements necessary for polysilicon production in each process.

  17. Modular Toolkit for Data Processing (MDP): A Python Data Processing Framework.

    PubMed

    Zito, Tiziano; Wilbert, Niko; Wiskott, Laurenz; Berkes, Pietro

    2008-01-01

Modular toolkit for Data Processing (MDP) is a data processing framework written in Python. From the user's perspective, MDP is a collection of supervised and unsupervised learning algorithms and other data processing units that can be combined into data processing sequences and more complex feed-forward network architectures. Computations are performed efficiently in terms of speed and memory requirements. From the scientific developer's perspective, MDP is a modular framework, which can easily be expanded. The implementation of new algorithms is easy and intuitive. Newly implemented units are then automatically integrated with the rest of the library. MDP has been written in the context of theoretical research in neuroscience, but it has been designed to be helpful in any context where trainable data processing algorithms are used. Its simplicity on the user's side, the variety of readily available algorithms, and the reusability of the implemented units also make it a useful educational tool.
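MDP's central idea, chaining trainable processing units into a flow, can be sketched in a few lines. Note this is a toy illustration of the concept only, not MDP's actual API: the class and method names below are invented.

```python
class Node:
    """Stand-in for a trainable processing unit."""
    def train(self, data):
        pass  # units without a training phase inherit this no-op
    def execute(self, data):
        return data

class Center(Node):
    """Subtracts the mean learned during training."""
    def train(self, data):
        self.mean = sum(data) / len(data)
    def execute(self, data):
        return [x - self.mean for x in data]

class Scale(Node):
    """Multiplies every value by a fixed factor."""
    def __init__(self, factor):
        self.factor = factor
    def execute(self, data):
        return [x * self.factor for x in data]

class Flow:
    """Chains nodes: each node trains on, then transforms, the data."""
    def __init__(self, nodes):
        self.nodes = nodes
    def train(self, data):
        for node in self.nodes:
            node.train(data)
            data = node.execute(data)
    def __call__(self, data):
        for node in self.nodes:
            data = node.execute(data)
        return data

flow = Flow([Center(), Scale(2.0)])
flow.train([1.0, 2.0, 3.0])
out = flow([1.0, 2.0, 3.0])
```

Because every unit shares the same train/execute interface, new algorithms slot into a flow without changes to the rest of the pipeline, which is the extensibility property the abstract emphasizes.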

  18. Examining Candidate Information Search Processes: The Impact of Processing Goals and Sophistication.

    ERIC Educational Resources Information Center

    Huang, Li-Ning

    2000-01-01

    Investigates how 4 different information-processing goals, varying on the dimensions of effortful versus effortless and impression-driven versus non-impression-driven processing, and individual difference in political sophistication affect the depth at which undergraduate students process candidate information and their decision-making strategies.…

  19. On Intelligent Design and Planning Method of Process Route Based on Gun Breech Machining Process

    NASA Astrophysics Data System (ADS)

    Hongzhi, Zhao; Jian, Zhang

    2018-03-01

This paper presents an approach to the intelligent design and planning of process routes, based on the gun breech machining process, addressing several problems: the complexity of gun breech machining, the tedium of route design, and the long lead time of traditional, hard-to-manage process routing. Based on the gun breech machining process, an intelligent process-route design and planning system is developed using DEST and VC++. The system includes two functional modules: intelligent process-route design and process-route planning. Through analysis of the gun breech machining process, the intelligent design module encodes breech process knowledge to build the knowledge base and inference engine, from which a gun breech process route is output automatically. Building on the intelligent design module, the final process route is created, edited and managed in the process-route planning module.
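The knowledge-base-plus-inference-engine pattern described above can be illustrated with a minimal forward-chaining loop. The rules below are invented placeholders, not the paper's actual breech process knowledge:

```python
def forward_chain(facts, rules):
    """Fire any rule whose premises are all known facts, adding its
    conclusion, until no rule produces a new fact."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# hypothetical process knowledge: a feature implies operations, in order
rules = [
    (("deep_bore_feature",), "op_drill_then_bore"),
    (("op_drill_then_bore",), "op_hone_finish"),
]
derived = forward_chain({"deep_bore_feature"}, rules)
```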

  20. Process Mining Methodology for Health Process Tracking Using Real-Time Indoor Location Systems.

    PubMed

    Fernandez-Llatas, Carlos; Lizondo, Aroa; Monton, Eduardo; Benedi, Jose-Miguel; Traver, Vicente

    2015-11-30

The definition of efficient and accurate health processes in hospitals is crucial for ensuring an adequate quality of service. Knowing and improving the behavior of the surgical processes in a hospital can improve the number of patients that can be operated on using the same resources. However, the measurement of this process is usually made in an obtrusive way, forcing nurses to gather information and time data, affecting the proper process and generating inaccurate data due to human errors during the stressful journey of health staff in the operating theater. The use of indoor location systems can capture time information about the process in an unobtrusive way, freeing nurses and allowing them to engage in purely welfare work. However, it is necessary to present these data in an understandable way for health professionals, who cannot deal with large amounts of historical localization log data. The use of process mining techniques can deal with this problem, offering an easily understandable view of the process. In this paper, we present a tool and a process mining-based methodology that, using indoor location systems, enables health staff not only to represent the process, but to know precise information about the deployment of the process in an unobtrusive and transparent way. We have successfully tested this tool in a real surgical area with 3613 patients during February, March and April of 2015.
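Process discovery from such location logs typically starts from the directly-follows relation: how often one activity immediately precedes another across cases. A minimal sketch over a hypothetical surgical event log (the activities and case IDs are invented):

```python
from collections import Counter

def directly_follows(event_log):
    """event_log: dict case_id -> ordered list of activities.
    Counts how often activity a is directly followed by activity b,
    the basic relation most process-discovery algorithms start from."""
    dfg = Counter()
    for trace in event_log.values():
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dfg

log = {
    "patient1": ["admission", "anesthesia", "surgery", "recovery"],
    "patient2": ["admission", "surgery", "recovery"],
}
dfg = directly_follows(log)
```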

  1. Process Mining Methodology for Health Process Tracking Using Real-Time Indoor Location Systems

    PubMed Central

    Fernandez-Llatas, Carlos; Lizondo, Aroa; Monton, Eduardo; Benedi, Jose-Miguel; Traver, Vicente

    2015-01-01

The definition of efficient and accurate health processes in hospitals is crucial for ensuring an adequate quality of service. Knowing and improving the behavior of the surgical processes in a hospital can improve the number of patients that can be operated on using the same resources. However, the measurement of this process is usually made in an obtrusive way, forcing nurses to gather information and time data, affecting the proper process and generating inaccurate data due to human errors during the stressful journey of health staff in the operating theater. The use of indoor location systems can capture time information about the process in an unobtrusive way, freeing nurses and allowing them to engage in purely welfare work. However, it is necessary to present these data in an understandable way for health professionals, who cannot deal with large amounts of historical localization log data. The use of process mining techniques can deal with this problem, offering an easily understandable view of the process. In this paper, we present a tool and a process mining-based methodology that, using indoor location systems, enables health staff not only to represent the process, but to know precise information about the deployment of the process in an unobtrusive and transparent way. We have successfully tested this tool in a real surgical area with 3613 patients during February, March and April of 2015. PMID:26633395

  2. Closed-Loop Process Control for Electron Beam Freeform Fabrication and Deposition Processes

    NASA Technical Reports Server (NTRS)

    Taminger, Karen M. (Inventor); Hofmeister, William H. (Inventor); Martin, Richard E. (Inventor); Hafley, Robert A. (Inventor)

    2013-01-01

A closed-loop control method for an electron beam freeform fabrication (EBF3) process includes detecting a feature of interest during the process using a sensor(s), continuously evaluating the feature of interest to determine, in real time, a change occurring therein, and automatically modifying control parameters to control the EBF3 process. An apparatus provides a closed-loop control method of the process, and includes an electron gun for generating an electron beam, a wire feeder for feeding a wire toward a substrate, wherein the wire is melted and progressively deposited in layers onto the substrate, a sensor(s), and a host machine. The sensor(s) measure the feature of interest during the process, and the host machine continuously evaluates the feature of interest to determine, in real time, a change occurring therein. The host machine automatically modifies control parameters to the EBF3 apparatus to control the EBF3 process in a closed-loop manner.
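Closed-loop control of a deposition process reduces, in its simplest form, to measuring a feature, comparing it with a setpoint, and adjusting an input. A proportional-only sketch with an invented linear melt-pool model (the patented controller is considerably more elaborate):

```python
def control_step(measured, setpoint, power, gain=2.0):
    """Proportional update: change beam power in proportion to the error
    between the measured feature and its setpoint."""
    return power + gain * (setpoint - measured)

# toy plant: pool width responds linearly to beam power (width = 0.1 * power)
power, width = 20.0, 2.0
for _ in range(50):
    power = control_step(width, setpoint=3.0, power=power)
    width = 0.1 * power
```

With this gain the loop settles on the power that holds the pool at the 3.0 setpoint; too large a gain would make the same loop oscillate, which is why real controllers are tuned against a process model.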

  3. 43 CFR 2884.17 - How will BLM process my Processing Category 6 application?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 43 Public Lands: Interior 2 2012-10-01 2012-10-01 false How will BLM process my Processing...-WAY UNDER THE MINERAL LEASING ACT Applying for MLA Grants or TUPs § 2884.17 How will BLM process my... written agreement that describes how BLM will process your application. The final agreement consists of a...

  4. 43 CFR 2884.17 - How will BLM process my Processing Category 6 application?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 43 Public Lands: Interior 2 2013-10-01 2013-10-01 false How will BLM process my Processing...-WAY UNDER THE MINERAL LEASING ACT Applying for MLA Grants or TUPs § 2884.17 How will BLM process my... written agreement that describes how BLM will process your application. The final agreement consists of a...

  5. Laser dimpling process parameters selection and optimization using surrogate-driven process capability space

    NASA Astrophysics Data System (ADS)

    Ozkat, Erkan Caner; Franciosa, Pasquale; Ceglarek, Dariusz

    2017-08-01

Remote laser welding technology offers opportunities for high production throughput at a competitive cost. However, the remote laser welding process of zinc-coated sheet metal parts in lap joint configuration poses a challenge due to the difference between the melting temperature of the steel (∼1500 °C) and the vapourizing temperature of the zinc (∼907 °C). In fact, the zinc layer at the faying surface is vapourized and the vapour might be trapped within the melting pool, leading to weld defects. Various solutions have been proposed to overcome this problem over the years. Among them, laser dimpling has been adopted by manufacturers because of its flexibility and effectiveness, along with its cost advantages. In essence, the dimple works as a spacer between the two sheets in the lap joint and allows the zinc vapour to escape during the welding process, thereby preventing weld defects. However, there is a lack of comprehensive characterization of the dimpling process for effective implementation in a real manufacturing system, taking into consideration inherent changes in the variability of process parameters. This paper introduces a methodology to develop (i) a surrogate model for dimpling process characterization considering a multiple-input (i.e. key control characteristics) and multiple-output (i.e. key performance indicators) system by conducting physical experimentation and using multivariate adaptive regression splines; (ii) a process capability space (Cp-Space) based on the developed surrogate model that allows the estimation of a desired process fallout rate in the case of violation of process requirements in the presence of stochastic variation; and (iii) selection and optimization of the process parameters based on the process capability space. The proposed methodology provides a unique capability to: (i) simulate the effect of process variation as generated by the manufacturing process; (ii) model quality requirements with multiple and coupled quality requirements; and (iii

  6. Cox process representation and inference for stochastic reaction-diffusion processes

    NASA Astrophysics Data System (ADS)

    Schnoerr, David; Grima, Ramon; Sanguinetti, Guido

    2016-05-01

    Complex behaviour in many systems arises from the stochastic interactions of spatially distributed particles or agents. Stochastic reaction-diffusion processes are widely used to model such behaviour in disciplines ranging from biology to the social sciences, yet they are notoriously difficult to simulate and calibrate to observational data. Here we use ideas from statistical physics and machine learning to provide a solution to the inverse problem of learning a stochastic reaction-diffusion process from data. Our solution relies on a non-trivial connection between stochastic reaction-diffusion processes and spatio-temporal Cox processes, a well-studied class of models from computational statistics. This connection leads to an efficient and flexible algorithm for parameter inference and model selection. Our approach shows excellent accuracy on numeric and real data examples from systems biology and epidemiology. Our work provides both insights into spatio-temporal stochastic systems, and a practical solution to a long-standing problem in computational modelling.
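Conditional on a realization of its random intensity, a Cox process is an inhomogeneous Poisson process, which can be simulated by Lewis-Shedler thinning. A sketch with a hypothetical, linearly increasing intensity (illustrative only, not the paper's inference algorithm):

```python
import random

def thinning(rate, rate_max, t_end, seed=1):
    """Lewis-Shedler thinning: propose events from a homogeneous Poisson
    process at rate_max, accept each with probability rate(t)/rate_max."""
    random.seed(seed)
    events, t = [], 0.0
    while True:
        t += random.expovariate(rate_max)
        if t > t_end:
            return events
        if random.random() < rate(t) / rate_max:
            events.append(t)

# hypothetical intensity rising linearly from 0 to 5 over [0, 10]
events = thinning(lambda t: 0.5 * t, rate_max=5.0, t_end=10.0)
```

The method requires only that `rate_max` bounds the intensity on the interval; accepted event times then cluster where the intensity is high.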

  7. Performance of biofuel processes utilising separate lignin and carbohydrate processing.

    PubMed

    Melin, Kristian; Kohl, Thomas; Koskinen, Jukka; Hurme, Markku

    2015-09-01

Novel biofuel pathways with increased product yields are evaluated against conventional lignocellulosic biofuel production processes: methanol or methane production via gasification and ethanol production via steam-explosion pre-treatment. The novel processes studied are ethanol production combined with methanol production by gasification; hydrocarbon fuel production with additional hydrogen produced from lignin residue gasification; and methanol or methane synthesis using synthesis gas from lignin residue gasification, with additional hydrogen obtained by aqueous phase reforming in synthesis gas production. The material and energy balances of the processes were calculated with Aspen flowsheet models and add-on Excel calculations applicable at the conceptual design stage to evaluate the pre-feasibility of the alternatives. The processes were compared using the following criteria: energy efficiency from biomass to products, primary energy efficiency, GHG reduction potential and economy (expressed as net present value, NPV). Several novel biorefinery concepts gave higher energy yields, GHG reduction potential and NPV. Copyright © 2015 Elsevier Ltd. All rights reserved.
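The NPV criterion used for the economic comparison discounts each year's cash flow back to the present. A minimal sketch with invented cash flows (not figures from the study):

```python
def npv(rate, cashflows):
    """Net present value of yearly cashflows, with cashflows[0] at time zero."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# toy biorefinery: 100 invested now, 30 returned at the end of each of 5 years
value = npv(0.08, [-100.0] + [30.0] * 5)
```

A positive NPV at the chosen discount rate is what makes a concept economically attractive in such comparisons.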

  8. Neurophysiological evidence for transfer appropriate processing of memory: processing versus feature similarity.

    PubMed

Schendan, Haline E; Kutas, Marta

    2007-08-01

Transfer appropriate processing (TAP) accounts propose that memory is a function of the degree to which the same neural processes transfer appropriately from the study experience to the memory test. However, in prior research, study and test stimuli were often physically similar. In two experiments, event-related brain potentials (ERPs) were recorded to fragmented objects during an indirect memory test to isolate transfer of a specific perceptual process from overlap of physical features between experiences. An occipitotemporoparietal P2(00) at 200 msec showed implicit memory effects only when similar perceptual grouping processes of good continuation were repeatedly engaged, despite physical feature differences, as TAP accounts hypothesize. This result provides direct neurophysiological evidence for the critical role of process transfer across experiences for memory.

  9. Manufacturing Process Selection of Composite Bicycle’s Crank Arm using Analytical Hierarchy Process (AHP)

    NASA Astrophysics Data System (ADS)

    Luqman, M.; Rosli, M. U.; Khor, C. Y.; Zambree, Shayfull; Jahidi, H.

    2018-03-01

The crank arm is one of the important parts of a bicycle and is an expensive product due to the high cost of material and of the production process. This research investigates potential manufacturing processes for fabricating a composite bicycle crank arm, and describes an approach based on the analytical hierarchy process (AHP) that assists decision makers or manufacturing engineers in determining the most suitable process to employ at the early stage of product development, in order to reduce the production cost. Four types of processes were considered, namely resin transfer molding (RTM), compression molding (CM), vacuum bag molding and filament winding (FW). The analysis ranks these four processes for their suitability in the manufacturing of the bicycle crank arm based on five main selection factors and 10 sub-factors. The selection was performed following the AHP steps, and a consistency test was applied to ensure that the judgements remained consistent during the comparisons. The results indicated that compression molding was the most appropriate manufacturing process because it has the highest value (33.6%) among the manufacturing processes considered.
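AHP derives priority weights from a pairwise-comparison matrix; the row geometric mean is a standard approximation to the principal eigenvector. A sketch with an invented, perfectly consistent 3x3 comparison (not the paper's actual judgement matrix, which covers five factors and ten sub-factors):

```python
from math import prod

def ahp_weights(matrix):
    """Row geometric means of a pairwise-comparison matrix, normalized
    to sum to 1, approximate the AHP priority weights."""
    n = len(matrix)
    gm = [prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# consistent judgements: criterion 1 is 2x criterion 2 and 4x criterion 3
w = ahp_weights([[1, 2, 4], [0.5, 1, 2], [0.25, 0.5, 1]])
```

For a perfectly consistent matrix the geometric-mean weights coincide with the eigenvector weights; the consistency test mentioned in the abstract checks how far real judgements deviate from this ideal.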

  10. Optimization of a novel enzyme treatment process for early-stage processing of sheepskins.

    PubMed

    Lim, Y F; Bronlund, J E; Allsop, T F; Shilton, A N; Edmonds, R L

    2010-01-01

    An enzyme treatment process for early-stage processing of sheepskins has previously been reported by the Leather and Shoe Research Association of New Zealand (LASRA) as an alternative to current industry operations. The newly developed process had marked benefits over conventional processing in terms of lowered energy usage (73%), processing time (47%) and water use (49%), but had been developed only as a proof of principle. The objective of this work was to develop the process further, to a stage ready for adoption by industry. Mass balancing was used to investigate potential modifications to the process, based on the understanding developed from a detailed analysis of preliminary design trials. Results showed that a configuration utilising a two-stage counter-current system for the washing stages, with segregation and recycling of the enzyme float prior to dilution in the neutralization stage, was a significant improvement. Benefits over conventional processing include a 50% reduction of residual TDS at the washing stages and 70% savings on water use overall. Benefits over the un-optimized LASRA process are a 30% reduction of solids in product after the enzyme treatment and neutralization stages, additional water savings of 21%, and 10% savings in enzyme usage.
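    The advantage of counter-current washing can be seen in a toy steady-state mass balance. The wash ratio, two-stage layout, and ideal-mixing assumptions below are mine for illustration; they are not LASRA's figures.

```python
# Toy steady-state solute balance for a two-stage counter-current wash:
# skins move stage 1 -> stage 2 carrying entrained liquor, fresh water
# enters stage 2 and flows 2 -> 1. Perfectly mixed stages assumed.
R  = 2.0          # wash ratio: water flow / liquor volume entrained by skins
c0 = 1.0          # solute concentration entering with the skins

# Stage balances (c1, c2 = mixed concentrations in stages 1 and 2):
#   stage 1: c1 = (c0 + R*c2) / (1 + R)
#   stage 2: c2 = c1 / (1 + R)
c1 = c2 = 0.0
for _ in range(100):                    # fixed-point iteration to steady state
    c1 = (c0 + R * c2) / (1 + R)
    c2 = c1 / (1 + R)

counter_current = c2 / c0               # residual solute fraction on the skins
single_stage = 1 / (1 + 2 * R)          # same total water used in one mixed stage
print(f"residual: counter-current {counter_current:.3f} vs single stage {single_stage:.3f}")
```

    For this idealized model the counter-current residual is 1/(1 + R + R²), always below the single-stage residual for the same total water, which is why the counter-current configuration saves water for a given washing target.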

  11. Business Process Management

    NASA Astrophysics Data System (ADS)

    Hantry, Francois; Papazoglou, Mike; van den Heuvel, Willem-Jan; Haque, Rafique; Whelan, Eoin; Carroll, Noel; Karastoyanova, Dimka; Leymann, Frank; Nikolaou, Christos; Lammersdorf, Winfried; Hacid, Mohand-Said

    Business process management is one of the core drivers of business innovation; it builds on strategic technology and is capable of creating and successfully executing end-to-end business processes. The trend will be to move from relatively stable, organization-specific applications to more dynamic, high-value ones, in which business process interactions and trends are examined closely to understand an application's requirements more accurately. Such collaborative, complex end-to-end service interactions give rise to the concept of Service Networks (SNs).

  12. Issues Management Process Course # 38401

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Binion, Ula Marie

    The purpose of this training is to advise Issues Management Coordinators (IMCs) on the revised Contractor Assurance System (CAS) Issues Management (IM) process. Terminal Objectives: Understand the Laboratory’s IM process; Understand your role in the Laboratory’s IM process. Learning Objectives: Describe the IM process within the context of the CAS; Describe the importance of implementing an institutional IM process at LANL; Describe the process flow for the Laboratory’s IM process; Apply the definition of an issue; Use available resources to determine initial screening risk levels for issues; Describe the required major process steps for each risk level; Describe the personnel responsibilities for IM process implementation; Access available resources to support IM process implementation.

  13. gProcess and ESIP Platforms for Satellite Imagery Processing over the Grid

    NASA Astrophysics Data System (ADS)

    Bacu, Victor; Gorgan, Dorian; Rodila, Denisa; Pop, Florin; Neagu, Gabriel; Petcu, Dana

    2010-05-01

    The Environment oriented Satellite Data Processing Platform (ESIP) is developed through the SEE-GRID-SCI (SEE-GRID eInfrastructure for regional eScience) project, co-funded by the European Commission through FP7 [1]. The gProcess platform [2] is a set of tools and services supporting the development and execution over the Grid of workflow-based processing, particularly satellite imagery processing. ESIP [3], [4] is built on top of the gProcess platform by adding a set of satellite image processing software modules and meteorological algorithms. Satellite images can reveal and supply important information on earth surface parameters, climate data, pollution levels and weather conditions that can be used in different research areas. Generally, the processing algorithms for satellite images can be decomposed into a set of modules that form a graph representation of the processing workflow. Two types of workflows can be defined in the gProcess platform: the abstract workflow (PDG - Process Description Graph), in which the user defines the algorithm conceptually, and the instantiated workflow (iPDG - instantiated PDG), which is the mapping of the PDG pattern onto particular satellite image and meteorological data [5]. The gProcess platform allows the definition of complex workflows by combining data resources, operators, services and sub-graphs. The gProcess platform is developed for the gLite middleware that is available in the EGEE and SEE-GRID infrastructures [6]. gProcess exposes its functionality through web services [7]. The Editor Web Service retrieves information on the available resources that are used to develop complex workflows (available operators, sub-graphs, services, supported resources, etc.). The Manager Web Service deals with resource management (uploading new resources such as workflows, operators, services, data, etc.) and also retrieves information on workflows. The Executor Web Service manages the execution of the instantiated workflows.
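    The PDG/iPDG distinction above is essentially an abstract dependency graph versus that graph bound to concrete inputs. A minimal sketch of the idea (the names, operators, and structure here are my own illustration, not the gProcess API):

```python
# Abstract workflow (PDG analogue): each operator maps to the operators
# it depends on; instantiation (iPDG analogue) binds concrete inputs.
from graphlib import TopologicalSorter

pdg = {
    "calibrate": [],
    "ndvi":      ["calibrate"],
    "classify":  ["ndvi"],
}

def instantiate(pdg, bindings):
    """Bind the abstract graph to concrete data resources."""
    return {"graph": pdg, "inputs": bindings}

def execute(ipdg, ops):
    """Run operators in dependency order over the bound input."""
    data = ipdg["inputs"]["image"]
    for node in TopologicalSorter(ipdg["graph"]).static_order():
        data = ops[node](data)
    return data

ipdg = instantiate(pdg, {"image": [0.2, 0.5, 0.9]})
ops = {
    "calibrate": lambda xs: [x * 1.1 for x in xs],      # calibration stub
    "ndvi":      lambda xs: [round(x, 2) for x in xs],  # index-computation stub
    "classify":  lambda xs: ["veg" if x > 0.5 else "bare" for x in xs],
}
print(execute(ipdg, ops))   # ['bare', 'veg', 'veg']
```

    Separating the abstract graph from its instantiation is what lets the same workflow pattern be mapped onto many satellite images, which is the reuse the platform is built around.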

  14. A Process Research Framework: The International Process Research Consortium

    DTIC Science & Technology

    2006-12-01

    P-30 How should a process for collaborative development be formulated? Development at different companies requires some process for the actual collaboration. How should it be handled? P-31 How do we handle change? Requirements change during development. …source projects employ a single-site development model in which there is no large community of testers but rather a single-site small group…

  15. Transformation from manufacturing process taxonomy to repair process taxonomy: a phenetic approach

    NASA Astrophysics Data System (ADS)

    Raza, Umair; Ahmad, Wasim; Khan, Atif

    2018-02-01

    Taxonomy is vital for knowledge sharing, a need highlighted by through-life engineering services/systems. This paper addresses the issue through the development of a repair process taxonomy. A framework for the repair process taxonomy was developed and then implemented, and the importance of such a taxonomy is highlighted.

  16. Empirical evaluation of the Process Overview Measure for assessing situation awareness in process plants.

    PubMed

    Lau, Nathan; Jamieson, Greg A; Skraaning, Gyrd

    2016-03-01

    The Process Overview Measure is a query-based measure developed to assess operator situation awareness (SA) in monitoring process plants. A companion paper describes how the measure was developed according to process plant properties and operator cognitive work. The Process Overview Measure demonstrated practicality, sensitivity, validity and reliability in two full-scope simulator experiments investigating dramatically different operational concepts. Practicality was assessed based on qualitative feedback from participants and researchers. The measure demonstrated sensitivity and validity by revealing significant effects of experimental manipulations that corroborated other empirical results. It also demonstrated adequate inter-rater reliability and practicality for measuring SA in full-scope simulator settings, based on data collected from process experts. Thus, full-scope simulator studies can employ the Process Overview Measure to reveal the impact of new control room technology and operational concepts on monitoring process plants. Practitioner Summary: The Process Overview Measure is a query-based measure that demonstrated practicality, sensitivity, validity and reliability for assessing operator situation awareness (SA) in monitoring process plants in representative settings.

  17. Badge Office Process Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haurykiewicz, John Paul; Dinehart, Timothy Grant; Parker, Robert Young

    2016-05-12

    The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered primarily through Badge Office Subject Matter Experts (SMEs) and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to the factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.

  18. The auditory basis of language impairments: temporal processing versus processing efficiency hypotheses.

    PubMed

    Hartley, Douglas E H; Hill, Penny R; Moore, David R

    2003-12-01

    Claims have been made that language-impaired children have deficits processing rapidly presented or brief sensory information. These claims, known as the 'temporal processing hypothesis', are supported by demonstrations that language-impaired children have excess backward masking (BM). One explanation for these results is that BM is developmentally delayed in these children. However, little was known about how BM normally develops. Recently, we assessed BM in normally developing 6- and 8-year-old children and adults. Results showed that BM thresholds continue to improve over a comparatively protracted period (>10 years old). We also analysed reported deficits in BM in language-impaired and younger children, in terms of a model of temporal resolution. This analysis suggests that poor processing efficiency, rather than deficits in temporal resolution, can account for these results. This 'processing efficiency hypothesis' was recently tested in our laboratory. This experiment measured BM as a function of delays between the tone and the noise in children and adults. Results supported the processing efficiency hypothesis, and suggested that reduced processing efficiency alone could account for differences between adults and children. These findings provide a new perspective on the mechanisms underlying communication disorders, and imply that remediation strategies should be directed towards improving processing efficiency, not temporal resolution.

  19. Anodizing Process

    NASA Technical Reports Server (NTRS)

    1983-01-01

    This anodizing process traces its origin to the 1960s, when Reynolds Metals Company, under contract with Goddard Space Flight Center, developed a multipurpose anodizing electrolyte (MAE) process to produce a hard protective finish for spacecraft aluminum. MAE produces a high-density, abrasion-resistant film prior to the coloring step, in which the pores of the film are impregnated with a metallic salt. Tru-Color product applications include building fronts, railings, curtain walls, doors and windows.

  20. Fuel gas conditioning process

    DOEpatents

    Lokhandwala, Kaaeid A.

    2000-01-01

    A process for conditioning natural gas containing C3+ hydrocarbons and/or acid gas, so that it can be used as combustion fuel to run gas-powered equipment, including compressors, in the gas field or the gas processing plant. Compared with prior art processes, the invention creates smaller quantities of low-pressure gas per unit volume of fuel gas produced. Optionally, the process can also produce an NGL product.

  1. Chemical Processing Manual

    NASA Technical Reports Server (NTRS)

    Beyerle, F. J.

    1972-01-01

    Chemical processes presented in this document include cleaning, pickling, surface finishes, chemical milling, plating, dry film lubricants, and polishing. All types of chemical processes applicable to aluminum, for example, are to be found in the aluminum alloy section. There is a separate section for each category of metallic alloy, plus a section for non-metals such as plastics. The refractories, super-alloys, and titanium are prime candidates for the space shuttle; therefore, the chemical processes applicable to these alloys are contained in individual sections of this manual.

  2. Right Hemisphere Metaphor Processing? Characterizing the Lateralization of Semantic Processes

    ERIC Educational Resources Information Center

    Schmidt, Gwen L.; DeBuse, Casey J.; Seger, Carol A.

    2007-01-01

    Previous laterality studies have implicated the right hemisphere in the processing of metaphors; however, it is not clear whether this result is due to metaphoricity per se or to another aspect of semantic processing. Three divided visual field experiments varied metaphorical and literal sentence familiarity. We found a right hemisphere advantage for…

  3. Feasibility of using continuous chromatography in downstream processing: Comparison of costs and product quality for a hybrid process vs. a conventional batch process.

    PubMed

    Ötes, Ozan; Flato, Hendrik; Winderl, Johannes; Hubbuch, Jürgen; Capito, Florian

    2017-10-10

    The protein A capture step is the main cost driver in downstream processing, with high attrition costs especially when protein A resin is not used to the end of its lifetime. Here we describe a feasibility study transferring a batch downstream process to a hybrid process, aimed at replacing batch protein A capture chromatography with a continuous capture step while leaving the polishing steps unchanged, to minimize the process adaptations required compared to a batch process. 35 g of antibody were purified using the hybrid approach, resulting in product quality and step yield comparable to the batch process. Productivity for the protein A step could be increased by up to 420%, reducing buffer amounts by 30-40% and showing robustness for at least 48 h of continuous run time. Additionally, to enable its potential application in a clinical trial manufacturing environment, the cost of goods for the protein A step was compared between the hybrid process and the batch process, showing a 300% cost reduction, depending on processed volumes and batch cycles.

  4. Gas-separation process

    DOEpatents

    Toy, Lora G.; Pinnau, Ingo; Baker, Richard W.

    1994-01-01

    A process for separating condensable organic components from gas streams. The process makes use of a membrane made from a polymer material that is glassy and that has an unusually high free volume within the polymer material.

  5. Natural Language Processing.

    ERIC Educational Resources Information Center

    Chowdhury, Gobinda G.

    2003-01-01

    Discusses issues related to natural language processing, including theoretical developments; natural language understanding; tools and techniques; natural language text processing systems; abstracting; information extraction; information retrieval; interfaces; software; Internet, Web, and digital library applications; machine translation for…

  6. Explicit and Implicit Processes Constitute the Fast and Slow Processes of Sensorimotor Learning.

    PubMed

    McDougle, Samuel D; Bond, Krista M; Taylor, Jordan A

    2015-07-01

    A popular model of human sensorimotor learning suggests that a fast process and a slow process work in parallel to produce the canonical learning curve (Smith et al., 2006). Recent evidence supports the subdivision of sensorimotor learning into explicit and implicit processes that simultaneously subserve task performance (Taylor et al., 2014). We set out to test whether these two accounts of learning processes are homologous. Using a recently developed method to assay explicit and implicit learning directly in a sensorimotor task, along with a computational modeling analysis, we show that the fast process closely resembles explicit learning and the slow process approximates implicit learning. In addition, we provide evidence for a subdivision of the slow/implicit process into distinct manifestations of motor memory. We conclude that the two-state model of motor learning is a close approximation of sensorimotor learning, but it is unable to describe adequately the various implicit learning operations that forge the learning curve. Our results suggest that a wider net be cast in the search for the putative psychological mechanisms and neural substrates underlying the multiplicity of processes involved in motor learning.
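    The two-state model referenced in the abstract (Smith et al., 2006) is a pair of linear error-driven states whose sum is the net adaptation: a fast state with low retention and high learning rate, and a slow state with high retention and low learning rate. A minimal simulation sketch, with parameter values that are illustrative rather than fitted to any data set:

```python
# Two-state model of sensorimotor adaptation:
#   x_f(n+1) = A_f * x_f(n) + B_f * e(n)   (fast: learns fast, forgets fast)
#   x_s(n+1) = A_s * x_s(n) + B_s * e(n)   (slow: learns slowly, retains well)
# with error e(n) = perturbation - (x_f(n) + x_s(n)).
A_f, B_f = 0.59, 0.21   # illustrative fast-process parameters
A_s, B_s = 0.99, 0.02   # illustrative slow-process parameters

def simulate(n_trials=200, perturbation=1.0):
    xf = xs = 0.0
    net = []
    for _ in range(n_trials):
        e = perturbation - (xf + xs)   # the same error drives both states
        xf = A_f * xf + B_f * e
        xs = A_s * xs + B_s * e
        net.append(xf + xs)
    return net, xf, xs

net, xf, xs = simulate()
print(f"final adaptation {net[-1]:.2f}; slow state carries {xs:.2f} of it")
```

    Early in training the fast state dominates the curve, then decays as the slow state takes over, which is the behavioral signature the paper maps onto explicit and implicit learning respectively.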

  7. Explicit and Implicit Processes Constitute the Fast and Slow Processes of Sensorimotor Learning

    PubMed Central

    Bond, Krista M.; Taylor, Jordan A.

    2015-01-01

    A popular model of human sensorimotor learning suggests that a fast process and a slow process work in parallel to produce the canonical learning curve (Smith et al., 2006). Recent evidence supports the subdivision of sensorimotor learning into explicit and implicit processes that simultaneously subserve task performance (Taylor et al., 2014). We set out to test whether these two accounts of learning processes are homologous. Using a recently developed method to assay explicit and implicit learning directly in a sensorimotor task, along with a computational modeling analysis, we show that the fast process closely resembles explicit learning and the slow process approximates implicit learning. In addition, we provide evidence for a subdivision of the slow/implicit process into distinct manifestations of motor memory. We conclude that the two-state model of motor learning is a close approximation of sensorimotor learning, but it is unable to describe adequately the various implicit learning operations that forge the learning curve. Our results suggest that a wider net be cast in the search for the putative psychological mechanisms and neural substrates underlying the multiplicity of processes involved in motor learning. PMID:26134640

  8. Cascading activation from lexical processing to letter-level processing in written word production.

    PubMed

    Buchwald, Adam; Falconer, Carolyn

    2014-01-01

    Descriptions of language production have identified processes involved in producing language and the presence and type of interaction among those processes. In the case of spoken language production, consensus has emerged that there is interaction among lexical selection processes and phoneme-level processing. This issue has received less attention in written language production. In this paper, we present a novel analysis of the writing-to-dictation performance of an individual with acquired dysgraphia, revealing cascading activation from lexical processing to letter-level processing. The individual produced frequent lexical-semantic errors (e.g., chipmunk → SQUIRREL) as well as letter errors (e.g., inhibit → INBHITI) and had a profile consistent with impairment affecting both lexical processing and letter-level processing. The presence of cascading activation is suggested by lower letter accuracy on words that are more weakly activated during lexical selection than on those that are more strongly activated. We operationalize weakly activated lexemes as those lexemes that are produced as lexical-semantic errors (e.g., lethal in deadly → LETAHL) compared to strongly activated lexemes where the intended target word (e.g., lethal) is the lexeme selected for production.

  9. Spitzer Telemetry Processing System

    NASA Technical Reports Server (NTRS)

    Stanboli, Alice; Martinez, Elmain M.; McAuley, James M.

    2013-01-01

    The Spitzer Telemetry Processing System (SirtfTlmProc) was designed to address objectives of JPL's Multi-mission Image Processing Lab (MIPL) in processing spacecraft telemetry and distributing the resulting data to the science community. To minimize costs and maximize operability, the software design focused on automated error recovery, performance, and information management. The system processes telemetry from the Spitzer spacecraft and delivers Level 0 products to the Spitzer Science Center. SirtfTlmProc is a unique system with automated error notification and recovery, and a real-time continuous service that can go quiescent after periods of inactivity. The software can process 2 GB of telemetry and deliver Level 0 science products to the end user in four hours. It provides analysis tools so the operator can manage the system and troubleshoot problems. It automates telemetry processing in order to reduce staffing costs.

  10. An improved plating process

    NASA Technical Reports Server (NTRS)

    Askew, John C.

    1994-01-01

    An alternative to the immersion process for the electrodeposition of chromium from aqueous solutions on the inside diameter (ID) of long tubes is described. The Vessel Plating Process eliminates the need for deep processing tanks, large volumes of solutions, and associated safety and environmental concerns. Vessel Plating allows the process to be monitored and controlled by computer, thus increasing reliability, flexibility and quality. Elimination of the trivalent chromium accumulation normally associated with ID plating is intrinsic to the Vessel Plating Process. The construction and operation of a prototype Vessel Plating Facility are described, with emphasis on materials of construction, engineered and operational safety, and a unique system for rinse water recovery.

  11. Analyzing Discourse Processing Using a Simple Natural Language Processing Tool

    ERIC Educational Resources Information Center

    Crossley, Scott A.; Allen, Laura K.; Kyle, Kristopher; McNamara, Danielle S.

    2014-01-01

    Natural language processing (NLP) provides a powerful approach for discourse processing researchers. However, there remains a notable degree of hesitation by some researchers to consider using NLP, at least on their own. The purpose of this article is to introduce and make available a "simple" NLP (SiNLP) tool. The overarching goal of…

  12. Precision Heating Process

    NASA Technical Reports Server (NTRS)

    1992-01-01

    A heat sealing process was developed by SEBRA based on technology that originated in work with NASA's Jet Propulsion Laboratory. The project involved connecting and transferring blood and fluids between sterile plastic containers while maintaining a closed system. SEBRA markets the PIRF Process to manufacturers of medical catheters. It is a precisely controlled method of heating thermoplastic materials in a mold to form or weld catheters and other products. The process offers advantages in fast, precise welding or shape forming of catheters as well as applications in a variety of other industries.

  13. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    NASA Astrophysics Data System (ADS)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.
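    The XML nets mentioned above extend classical Petri nets, whose core semantics is a marking of places updated by transition firings. A toy place/transition net can illustrate that underlying firing rule (this sketch deliberately omits the XML-document tokens that distinguish XML nets from classical nets; the order-processing example is invented):

```python
# Classical Petri net firing rule: a transition is enabled when every
# input place holds enough tokens; firing consumes and produces tokens.
def enabled(marking, pre):
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    assert enabled(marking, pre), "transition not enabled"
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# Illustrative business-process transition: processing an order consumes
# one pending order and one free worker, produces a shipped order, and
# returns the worker to the pool.
pre  = {"orders": 1, "workers": 1}
post = {"shipped": 1, "workers": 1}
marking = fire({"orders": 2, "workers": 1}, pre, post)
print(marking)   # {'orders': 1, 'workers': 1, 'shipped': 1}
```

    Modeling a business process this way makes resource contention explicit: with one worker token, two orders can only be processed one after the other.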

  14. CMOS/SOS processing

    NASA Technical Reports Server (NTRS)

    Ramondetta, P.

    1980-01-01

    Report describes processes used in making complementary - metal - oxide - semiconductor/silicon-on-sapphire (CMOS/SOS) integrated circuits. Report lists processing steps ranging from initial preparation of sapphire wafers to final mapping of "good" and "bad" circuits on a wafer.

  15. Drug Development Process

    MedlinePlus


  16. Muscle atrophy and metal-on-metal hip implants: a serial MRI study of 74 hips.

    PubMed

    Berber, Reshid; Khoo, Michael; Cook, Erica; Guppy, Andrew; Hua, Jia; Miles, Jonathan; Carrington, Richard; Skinner, John; Hart, Alister

    2015-06-01

    Muscle atrophy is seen in patients with metal-on-metal (MOM) hip implants, probably because of inflammatory destruction of the musculo-tendon junction. However, like pseudotumors, it is unclear when atrophy occurs and whether it progresses with time. Our objective was to determine whether muscle atrophy associated with MOM hip implants progresses with time. We retrospectively reviewed 74 hips in 56 patients (32 of them women) using serial MRI. Median age was 59 (23-83) years. The median time post-implantation was 83 (35-142) months, and the median interval between scans was 11 months. Hip muscles were scored using the Pfirrmann system. The mean scores for muscle atrophy were compared between the first and second MRI scans. Blood cobalt and chromium concentrations were determined. The median blood cobalt was 6.84 (0.24-90) ppb and median chromium level was 4.42 (0.20-45) ppb. The median Oxford hip score was 34 (5-48). The change in the gluteus minimus mean atrophy score between first and second MRI was 0.12 (p = 0.002). Mean change in the gluteus medius posterior portion (unaffected by surgical approach) was 0.08 (p = 0.01) and mean change in the inferior portion was 0.10 (p = 0.05). Mean pseudotumor grade increased by 0.18 (p = 0.02). Worsening muscle atrophy and worsening pseudotumor grade occur over a 1-year period in a substantial proportion of patients with MOM hip implants. Serial MRI helps to identify those patients who are at risk of developing worsening soft-tissue pathology. These patients should be considered for revision surgery before irreversible muscle destruction occurs.

  17. A Large Animal Model that Recapitulates the Spectrum of Human Intervertebral Disc Degeneration

    PubMed Central

    Gullbrand, Sarah E.; Malhotra, Neil R.; Schaer, Thomas P.; Zawacki, Zosia; Martin, John T.; Bendigo, Justin R.; Milby, Andrew H.; Dodge, George R.; Vresilovic, Edward J.; Elliott, Dawn M.; Mauck, Robert L.; Smith, Lachlan J.

    2016-01-01

    Objective The objective of this study was to establish a large animal model that recapitulates the spectrum of intervertebral disc degeneration that occurs in humans and which is suitable for pre-clinical evaluation of a wide range of experimental therapeutics. Design Degeneration was induced in the lumbar intervertebral discs of large frame goats by either intradiscal injection of chondroitinase ABC (ChABC) over a range of dosages (0.1U, 1U or 5U) or subtotal nucleotomy. Radiographs were used to assess disc height changes over 12 weeks. Degenerative changes to the discs and endplates were assessed via magnetic resonance imaging (MRI), semi-quantitative histological grading, micro-computed tomography (µCT), and measurement of disc biomechanical properties. Results Degenerative changes were observed for all interventions, ranging from mild (0.1U ChABC) to moderate (1U ChABC and nucleotomy) to severe (5U ChABC). All groups showed progressive reductions in disc height over 12 weeks. Histological scores were significantly increased in the 1U and 5U ChABC groups. Reductions in T2 and T1ρ, and increased Pfirrmann grade were observed on MRI. Resorption and remodeling of the cortical bony endplate adjacent to ChABC-injected discs also occurred. Spine segment range of motion was greater and compressive modulus was lower in 1U ChABC and nucleotomy discs compared to intact discs. Conclusions A large animal model of disc degeneration was established that recapitulates the spectrum of structural, compositional and biomechanical features of human disc degeneration. This model may serve as a robust platform for evaluating the efficacy of therapeutics targeted towards varying degrees of disc degeneration. PMID:27568573

  18. Do the disc degeneration and osteophyte contribute to the curve rigidity of degenerative scoliosis?

    PubMed

    Zhu, Feng; Bao, Hongda; Yan, Peng; Liu, Shunan; Bao, Mike; Zhu, Zezhang; Liu, Zhen; Qiu, Yong

    2017-03-29

    The factors associated with lateral curve flexibility in degenerative scoliosis have not been well documented. Disc degeneration can result in significant changes in stiffness and range of motion on lateral bending films. Osteophytes are commonly observed in the degenerative spine, but the relationship between osteophyte formation and curve flexibility remains controversial. The aim of the current study was to clarify whether disc degeneration and osteophyte formation are both associated with the curve flexibility of degenerative scoliosis. A total of 85 patients were retrospectively analyzed. The inclusion criteria were as follows: age greater than 45 years, a diagnosis of degenerative scoliosis, and a coronal Cobb angle greater than 20°. Curve flexibility was calculated based on the Cobb angle, and range of motion (ROM) was based on disc angle evaluation. A regional disc degeneration score (RDS) was obtained according to the Pfirrmann classification, and an osteophyte formation score (OFS) was based on the Nathan classification. Spearman correlation was performed to analyze the relationship between curve flexibility and RDS as well as OFS. Moderate correlation was found between RDS and curve flexibility, with a Spearman coefficient of -0.487 (P = 0.009). Similarly, moderate correlation was observed between curve flexibility and OFS, with a Spearman coefficient of -0.429 (P = 0.012). A stronger correlation was found between apical ROM and OFS (Spearman coefficient -0.627, P < 0.001) than between curve flexibility and OFS. Both disc degeneration and osteophyte formation correlated with curve rigidity. Pre-operative evaluation of both features may aid surgical decision-making in degenerative scoliosis patients.
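    The Spearman coefficient used throughout this study is a rank correlation: it replaces each measurement by its rank and correlates the ranks. A hand-rolled sketch (the data points below are invented to mimic the reported negative relationship between degeneration and flexibility; they are not the patients' values):

```python
# Spearman's rho via the classic rank-difference formula,
# rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)); this sketch assumes no ties.
def spearman_rho(xs, ys):
    def ranks(vs):
        order = sorted(range(len(vs)), key=lambda i: vs[i])
        r = [0] * len(vs)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

rds         = [6, 8, 9, 11, 12, 14, 15, 17]     # illustrative degeneration scores
flexibility = [62, 55, 58, 40, 43, 35, 30, 22]  # illustrative % curve correction

print(f"rho = {spearman_rho(rds, flexibility):.3f}")
```

    A negative rho, as reported in the abstract, means that higher degeneration or osteophyte scores go with lower curve flexibility; because only ranks matter, the statistic is robust to the skewed, ordinal-scale scores typical of Pfirrmann-type grading.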

  19. Process based analysis of manually controlled drilling processes for bone

    NASA Astrophysics Data System (ADS)

    Teicher, Uwe; Achour, Anas Ben; Nestler, Andreas; Brosius, Alexander; Lauer, Günter

    2018-05-01

    The machining operation of drilling is part of the standard repertoire for medical applications. This machining cycle, usually a multi-stage process, generates the geometric element for the subsequent integration of implants, which are screwed into the bone in later steps. In addition to the form, shape and position of the generated drill hole, it is also necessary to use a technology that ensures an operation with minimal damage. A surface damaged by excessive mechanical and thermal energy input shows a deterioration in the healing capacity of implants and presents a structure prone to inflammatory reactions. The resulting loads are influenced by the material properties of the bone, the technology used and the tool properties. An important aspect of the process analysis is the fact that machining of bone is in most cases a manual process that depends mainly on the skills of the operator. This includes, among other things, the machining time for the production of a drill hole, since manual drilling is a force-controlled process. Experimental work was carried out on the bone of a porcine mandible in order to investigate the interrelation of the applied load during drilling. It can be shown that the load application can be subdivided according to the working feed direction. The entire drilling process thus consists of several time domains, which can be divided into the geometry-generating feed motion and a retraction movement of the tool. It has been shown that the removal of the tool from the drill hole has a significant influence on the mechanical load input. This fact is demonstrated in detail by a new evaluation methodology. The causes of this characteristic can be identified, as well as possible ways of reducing the load input.

  20. Gas-separation process

    DOEpatents

    Toy, L.G.; Pinnau, I.; Baker, R.W.

    1994-01-25

    A process is described for separating condensable organic components from gas streams. The process makes use of a membrane made from a polymer material that is glassy and that has an unusually high free volume within the polymer material. 6 figures.

  1. News: Process intensification

    EPA Science Inventory

Conservation of materials and energy is a major objective of the philosophy of sustainability. Where production processes can be intensified to assist these objectives, significant advances have been made in conservation as well as cost. Process intensification (PI) h...

  2. What Is Group Process?: Integrating Process Work into Psychoeducational Groups

    ERIC Educational Resources Information Center

    Mills, Bethany; McBride, Dawn Lorraine

    2016-01-01

    Process work has long been a tenet of successful counseling outcomes. However, there is little literature available that focuses on how to best integrate process work into group settings--particularly psychoeducational groups that are content heavy and most often utilized in a school setting. In this article, the authors provide an overview of the…

  3. Processes for metal extraction

    NASA Technical Reports Server (NTRS)

    Bowersox, David F.

    1992-01-01

This report describes the processing of plutonium at Los Alamos National Laboratory (LANL), an operation illustrating concepts that may be applicable to the processing of lunar materials. The toxic nature of plutonium requires a highly closed processing system, a constraint also expected in processing lunar surface materials.

  4. EARSEC SAR processing system

    NASA Astrophysics Data System (ADS)

    Protheroe, Mark; Sloggett, David R.; Sieber, Alois J.

    1994-12-01

Traditionally, the production of high quality Synthetic Aperture Radar imagery has been an area where a potential user would have to expend large amounts of money in either the bespoke development of a processing chain dedicated to his requirements or in the purchase of a dedicated hardware platform adapted using accelerator boards and enhanced memory management. Whichever option the user adopted, there were limitations based on the desire for a realistic throughput in data load and time. The user had a choice, made early in the purchase, for either a system that adopted innovative algorithmic manipulation to limit the processing time, or the purchase of expensive hardware. The former limits the quality of the product, while the latter excludes the user from any visibility into the processing chain. Clearly there was a need for a SAR processing architecture that gave the user a choice in the methodology to be adopted for a particular processing sequence, allowing him to decide on either a quick (lower quality) product or a detailed, slower (high quality) product, without having to change the algorithmic base of his processor or the hardware platform. The European Commission, through the Advanced Techniques unit of the Joint Research Centre (JRC) Institute for Remote Sensing at Ispra in Italy, realizing the limitations on current processing abilities, initiated its own program to build airborne SAR and Electro-Optical (EO) sensor systems. This program is called the European Airborne Remote Sensing Capabilities (EARSEC) program. This paper describes the processing system developed for the airborne SAR sensor system. The paper considers the requirements for the system and the design of the EARSEC Airborne SAR Processing System. It highlights the development of an open SAR processing architecture where users have full access to intermediate products that arise from each of the major processing stages. It also describes the main processing stages in the overall

  5. Reasoning with case histories of process knowledge for efficient process development

    NASA Technical Reports Server (NTRS)

    Bharwani, Seraj S.; Walls, Joe T.; Jackson, Michael E.

    1988-01-01

    The significance of compiling case histories of empirical process knowledge and the role of such histories in improving the efficiency of manufacturing process development is discussed in this paper. Methods of representing important investigations as cases and using the information from such cases to eliminate redundancy of empirical investigations in analogous process development situations are also discussed. A system is proposed that uses such methods to capture the problem-solving framework of the application domain. A conceptual design of the system is presented and discussed.

  6. Chemical processing of glasses

    NASA Astrophysics Data System (ADS)

    Laine, Richard M.

    1990-11-01

The development of chemical processing methods for the fabrication of glass and ceramic shapes for photonic applications is frequently Edisonian in nature. In part, this is because the numerous variables that must be optimized to obtain a given material with a specific shape and particular properties cannot be readily defined from fundamental principles. In part, the problems arise because the basic chemistry of common chemical processing systems has not been fully delineated. The purpose of this paper is to provide an overview of the basic chemical problems associated with chemical processing. The emphasis will be on sol-gel processing, a major subset of chemical processing. Two alternate approaches to chemical processing of glasses are also briefly discussed. One approach concerns the use of bimetallic alkoxide oligomers and polymers as potential precursors to multimetallic glasses. The second describes the utility of metal carboxylate precursors to multimetallic glasses.

  7. 5 CFR 1653.13 - Processing legal processes.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... complete, a legal process must contain all pages and attachments; it must also provide (or be accompanied... no further action will be taken with respect to the document. (f) As soon as practicable after...

  8. 5 CFR 1653.13 - Processing legal processes.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... complete, a legal process must contain all pages and attachments; it must also provide (or be accompanied... no further action will be taken with respect to the document. (f) As soon as practicable after...

  9. Baldovin-Stella stochastic volatility process and Wiener process mixtures

    NASA Astrophysics Data System (ADS)

    Peirano, P. P.; Challet, D.

    2012-08-01

Starting from inhomogeneous time scaling and linear decorrelation between successive price returns, Baldovin and Stella recently proposed a powerful and consistent way to build a model describing the time evolution of a financial index. We first make the model fully explicit by using Student distributions instead of power-law-truncated Lévy distributions, show that its analytic tractability extends to the larger class of symmetric generalized hyperbolic distributions, and provide a full computation of their multivariate characteristic functions; more generally, we show that the stochastic processes arising in this framework are representable as mixtures of Wiener processes. The basic Baldovin and Stella model, while mimicking well volatility relaxation phenomena such as the Omori law, fails to reproduce other stylized facts such as the leverage effect or some time-reversal asymmetries. We discuss how to modify the dynamics of this process in order to reproduce real data more accurately.
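The mixture-of-Wiener-processes representation can be illustrated with a short simulation: each path is a Brownian motion whose variance is drawn once from a mixing law, which makes the marginal increments heavy-tailed. A minimal sketch; the mixing law and parameter names below are illustrative choices, not the specific construction used by Baldovin and Stella:

```python
import numpy as np

def mixture_of_wiener_paths(n_paths, n_steps, nu=3.0, seed=0):
    """Simulate paths of a Wiener-process mixture.

    Each path is a Brownian motion whose variance is drawn once from a
    scaled inverse-chi-squared mixing law, so the marginal increments
    are Student-t distributed with `nu` degrees of freedom.
    """
    rng = np.random.default_rng(seed)
    sigma2 = nu / rng.chisquare(nu, size=(n_paths, 1))  # one variance draw per path
    increments = rng.standard_normal((n_paths, n_steps)) * np.sqrt(sigma2)
    return np.cumsum(increments, axis=1)  # each row is one path
```

With this inverse-chi-squared mixing law the increments are Student-t distributed, one of the distribution classes mentioned in the abstract; other mixing laws yield other members of the generalized hyperbolic family.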

  10. Serial Learning Process: Test of Chaining, Position, and Dual-Process Hypotheses

    ERIC Educational Resources Information Center

    Giurintano, S. L.

    1973-01-01

    The chaining, position, and dual-process hypotheses of serial learning (SL) as well as serial recall, reordering, and relearning of paired-associate learning were examined to establish learning patterns. Results provide evidence for dual-process hypothesis. (DS)

  11. Elemental sulfur recovery process

    DOEpatents

    Flytzani-Stephanopoulos, M.; Zhicheng Hu.

    1993-09-07

An improved catalytic reduction process for the direct recovery of elemental sulfur from various SO₂-containing industrial gas streams. The catalytic process provides combined high activity and selectivity for the reduction of SO₂ to elemental sulfur product with carbon monoxide or other reducing gases. The reaction of sulfur dioxide and reducing gas takes place over certain catalyst formulations based on cerium oxide. The process is a single-stage, catalytic sulfur recovery process in conjunction with regenerators, such as those used in dry, regenerative flue gas desulfurization or other processes, involving direct reduction of the SO₂ in the regenerator off-gas stream to elemental sulfur in the presence of a catalyst. 4 figures.

  12. Elemental sulfur recovery process

    DOEpatents

    Flytzani-Stephanopoulos, Maria; Hu, Zhicheng

    1993-01-01

An improved catalytic reduction process for the direct recovery of elemental sulfur from various SO₂-containing industrial gas streams. The catalytic process provides combined high activity and selectivity for the reduction of SO₂ to elemental sulfur product with carbon monoxide or other reducing gases. The reaction of sulfur dioxide and reducing gas takes place over certain catalyst formulations based on cerium oxide. The process is a single-stage, catalytic sulfur recovery process in conjunction with regenerators, such as those used in dry, regenerative flue gas desulfurization or other processes, involving direct reduction of the SO₂ in the regenerator off-gas stream to elemental sulfur in the presence of a catalyst.

  13. Processed and ultra-processed foods are associated with lower-quality nutrient profiles in children from Colombia.

    PubMed

    Cornwell, Brittany; Villamor, Eduardo; Mora-Plazas, Mercedes; Marin, Constanza; Monteiro, Carlos A; Baylin, Ana

    2018-01-01

    To determine if processed and ultra-processed foods consumed by children in Colombia are associated with lower-quality nutrition profiles than less processed foods. We obtained information on sociodemographic and anthropometric variables and dietary information through dietary records and 24 h recalls from a convenience sample of the Bogotá School Children Cohort. Foods were classified into three categories: (i) unprocessed and minimally processed foods, (ii) processed culinary ingredients and (iii) processed and ultra-processed foods. We also examined the combination of unprocessed foods and processed culinary ingredients. Representative sample of children from low- to middle-income families in Bogotá, Colombia. Children aged 5-12 years in 2011 Bogotá School Children Cohort. We found that processed and ultra-processed foods are of lower dietary quality in general. Nutrients that were lower in processed and ultra-processed foods following adjustment for total energy intake included: n-3 PUFA, vitamins A, B12, C and E, Ca and Zn. Nutrients that were higher in energy-adjusted processed and ultra-processed foods compared with unprocessed foods included: Na, sugar and trans-fatty acids, although we also found that some healthy nutrients, including folate and Fe, were higher in processed and ultra-processed foods compared with unprocessed and minimally processed foods. Processed and ultra-processed foods generally have unhealthy nutrition profiles. Our findings suggest the categorization of foods based on processing characteristics is promising for understanding the influence of food processing on children's dietary quality. More studies accounting for the type and degree of food processing are needed.

  14. Monitoring Process Effectiveness

    EPA Science Inventory

    Treatment of municipal sludges to produce biosolids which meet federal and/or state requirements for land application requires process monitoring. The goal of process monitoring is to produce biosolids of consistent and reliable quality. In its simplest form, for Class B treatme...

  15. Food processing and allergenicity.

    PubMed

    Verhoeckx, Kitty C M; Vissers, Yvonne M; Baumert, Joseph L; Faludi, Roland; Feys, Marcel; Flanagan, Simon; Herouet-Guicheney, Corinne; Holzhauser, Thomas; Shimojo, Ryo; van der Bolt, Nieke; Wichers, Harry; Kimber, Ian

    2015-06-01

Food processing can have many beneficial effects. However, processing may also alter the allergenic properties of food proteins. A wide variety of processing methods is available and their use depends largely on the food to be processed. In this review the impact of processing (heat and non-heat treatment) on the allergenic potential of proteins, and on the antigenic (IgG-binding) and allergenic (IgE-binding) properties of proteins has been considered. A variety of allergenic foods (peanuts, tree nuts, cows' milk, hens' eggs, soy, wheat and mustard) have been reviewed. The overall conclusion drawn is that processing does not completely abolish the allergenic potential of allergens. Currently, only fermentation and hydrolysis may have potential to reduce allergenicity to such an extent that symptoms will not be elicited, while other methods might be promising but need more data. Literature on the effect of processing on allergenic potential and the ability to induce sensitisation is scarce. This is an important issue since processing may impact on the ability of proteins to cause the acquisition of allergic sensitisation, and the subject should be a focus of future research. Also, there remains a need to develop robust and integrated methods for the risk assessment of food allergenicity. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  16. Space processing economics

    NASA Technical Reports Server (NTRS)

    Bredt, J. H.

    1974-01-01

    Two types of space processing operations may be considered economically justified; they are manufacturing operations that make profits and experiment operations that provide needed applied research results at lower costs than those of alternative methods. Some examples from the Skylab experiments suggest that applied research should become cost effective soon after the space shuttle and Spacelab become operational. In space manufacturing, the total cost of space operations required to process materials must be repaid by the value added to the materials by the processing. Accurate estimates of profitability are not yet possible because shuttle operational costs are not firmly established and the markets for future products are difficult to estimate. However, approximate calculations show that semiconductor products and biological preparations may be processed on a scale consistent with market requirements and at costs that are at least compatible with profitability using the Shuttle/Spacelab system.

  17. FLUORINATION PROCESS

    DOEpatents

    McMillan, T.S.

    1957-10-29

    A process for the fluorination of uranium metal is described. It is known that uranium will react with liquid chlorine trifluoride but the reaction proceeds at a slow rate. However, a mixture of a halogen trifluoride together with hydrogen fluoride reacts with uranium at a significantly faster rate than does a halogen trifluoride alone. Bromine trifluoride is suitable for use in the process, but chlorine trifluoride is preferred. Particularly suitable is a mixture of ClF/sub 3/ and HF having a mole ratio (moles

  18. A DMAIC approach for process capability improvement an engine crankshaft manufacturing process

    NASA Astrophysics Data System (ADS)

    Sharma, G. V. S. S.; Rao, P. Srinivasa

    2014-05-01

The define-measure-analyze-improve-control (DMAIC) approach is a five-strata, scientific approach for reducing deviations and improving the capability levels of manufacturing processes. The present work elaborates on the DMAIC approach applied to reducing the process variations of the stub-end-hole boring operation in the manufacture of crankshafts. This statistical process control study starts with selection of the critical-to-quality (CTQ) characteristic in the define stratum. The next stratum constitutes the collection of dimensional measurement data for the identified CTQ characteristic. This is followed by the analysis and improvement strata, where various quality control tools such as the Ishikawa diagram, physical mechanism analysis, failure modes and effects analysis, and analysis of variance are applied. Finally, process monitoring charts are deployed at the workplace for regular monitoring and control of the concerned CTQ characteristic. By adopting the DMAIC approach, the standard deviation was reduced from 0.003 to 0.002, the process potential capability index (Cp) improved from 1.29 to 2.02, and the process performance capability index (Cpk) improved from 0.32 to 1.45.
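The capability indices quoted above follow from standard formulas: Cp compares the specification width to the process spread, while Cpk penalizes an off-center mean. A minimal sketch in Python; the specification limits and sample data here are hypothetical, not taken from the crankshaft study:

```python
import statistics

def process_capability(samples, lsl, usl):
    """Return (Cp, Cpk) for a sample against lower/upper specification limits."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)  # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)                # potential capability: spec width vs spread
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)   # actual capability: penalizes an off-center mean
    return cp, cpk
```

For a perfectly centered process Cp equals Cpk; a gap such as the reported Cp = 2.02 versus Cpk = 1.45 indicates the process mean sits off-center within the tolerance band.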

  19. Process control systems: integrated for future process technologies

    NASA Astrophysics Data System (ADS)

    Botros, Youssry; Hajj, Hazem M.

    2003-06-01

Process Control Systems (PCS) are becoming more crucial to the success of integrated circuit makers due to their direct impact on product quality, cost, and fab output. The primary objective of PCS is to minimize variability by detecting and correcting non-optimal performance. Current PCS implementations are disparate: each PCS application is designed, deployed and supported separately, and each targets a specific area of control such as equipment performance, wafer manufacturing, or process health monitoring. With Intel entering the nanometer technology era, tighter process specifications are required for higher yields and lower cost. This requires areas of control to be tightly coupled and integrated to achieve optimal performance, which can be achieved via consistent design and deployment of an integrated PCS. PCS integration will result in several benefits such as leveraging commonalities, avoiding redundancy, and facilitating sharing between implementations. This paper addresses PCS implementations and focuses on the benefits and requirements of the integrated PCS. The Intel integrated PCS architecture is then presented and its components briefly discussed. Finally, industry direction and efforts to standardize PCS interfaces that enable PCS integration are presented.

  20. Generic Health Management: A System Engineering Process Handbook Overview and Process

    NASA Technical Reports Server (NTRS)

    Wilson, Moses Lee; Spruill, Jim; Hong, Yin Paw

    1995-01-01

Health Management, a System Engineering Process, is one of the processes, techniques, and technologies used to define, design, analyze, build, verify, and operate a system from the viewpoint of preventing, or minimizing, the effects of failure or degradation. It supports all ground and flight elements during manufacturing, refurbishment, integration, and operation through the combined use of hardware, software, and personnel. This document integrates Health Management Processes (six phases) into five phases in such a manner that health management is never a stand-alone task/effort which separately defines independent work functions.

  1. Metallurgical processing: A compilation

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The items in this compilation, all relating to metallurgical processing, are presented in two sections. The first section includes processes which are general in scope and applicable to a variety of metals or alloys. The second describes the processes that concern specific metals and their alloys.

  2. Comprehension Processes in Reading.

    ERIC Educational Resources Information Center

    Balota, D. A., Ed.; And Others

    Focusing on the process of reading comprehension, this book contains chapters on some central topics relevant to understanding the processes associated with comprehending text. The articles and their authors are as follows: (1) "Comprehension Processes: Introduction" (K. Rayner); (2) "The Role of Meaning in Word Recognition"…

  3. Change Processes in Organization.

    ERIC Educational Resources Information Center

    1998

    This document contains four papers from a symposium on change processes in organizations. "Mid-stream Corrections: Decisions Leaders Make during Organizational Change Processes" (David W. Frantz) analyzes three organizational leaders to determine whether and how they take corrective actions or adapt their decision-making processes when…

  4. Global-local processing relates to spatial and verbal processing: implications for sex differences in cognition.

    PubMed

    Pletzer, Belinda; Scheuringer, Andrea; Scherndl, Thomas

    2017-09-05

Sex differences have been reported for a variety of cognitive tasks and related to the use of different cognitive processing styles in men and women. It was recently argued that these processing styles share some characteristics across tasks, i.e. male approaches are oriented towards holistic stimulus aspects and female approaches are oriented towards stimulus details. In that respect, sex-dependent cognitive processing styles share similarities with attentional global-local processing. A direct relationship between cognitive processing and global-local processing has, however, not been previously established. In the present study, 49 men and 44 women completed a Navon paradigm and a Kimchi-Palmer task, as well as a navigation task and a verbal fluency task, with the goal of relating the global advantage (GA) effect, as a measure of global processing, to holistic processing styles in both tasks. Indeed, participants with larger GA effects displayed more holistic processing during spatial navigation and phonemic fluency. However, the relationship to cognitive processing styles was modulated by the specific condition of the Navon paradigm, as well as the sex of the participants. Thus, different types of global-local processing play different roles in cognitive processing in men and women.

  5. Real-time radar signal processing using GPGPU (general-purpose graphic processing unit)

    NASA Astrophysics Data System (ADS)

    Kong, Fanxing; Zhang, Yan Rockee; Cai, Jingxiao; Palmer, Robert D.

    2016-05-01

This study introduces a practical approach to developing a real-time signal processing chain for general phased-array radar on NVIDIA GPUs (Graphical Processing Units) using CUDA (Compute Unified Device Architecture) libraries such as cuBLAS and cuFFT, which are adopted from open-source libraries and optimized for NVIDIA GPUs. The processed results are rigorously verified against those from the CPUs. Performance, benchmarked as computation time for various input data cube sizes, is compared across GPUs and CPUs. Through this analysis, it is demonstrated that GPGPU (general-purpose GPU) real-time processing of array radar data is possible with relatively low-cost commercial GPUs.
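As a concrete illustration of the kind of stage such a chain contains, here is a pulse-compression (matched-filter) step written against NumPy's FFT interface; on a GPU the same calls can map onto cuFFT through libraries such as CuPy, whose `cupy.fft` mirrors `numpy.fft`. The chirp and array sizes are illustrative, not taken from the paper:

```python
import numpy as np

def pulse_compress(echoes, ref_pulse):
    """Matched-filter each row of `echoes` against `ref_pulse` in the frequency domain."""
    n = echoes.shape[-1] + ref_pulse.size - 1      # length for linear (non-circular) correlation
    spectra = np.fft.fft(echoes, n=n, axis=-1)
    matched = np.conj(np.fft.fft(ref_pulse, n=n))  # matched filter = conjugate reference spectrum
    return np.fft.ifft(spectra * matched, axis=-1)
```

Running the filter over a zero-padded echo places the correlation peak at the pulse's delay, which is how range is read off; batching many echoes as rows of one array is what makes the workload map well onto GPU FFT libraries.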

  6. [Definition and stabilization of processes II. Clinical Processes in a Urology Department].

    PubMed

    Pascual, Carlos; Luján, Marcos; Mora, José Ramón; Diz, Manuel Ramón; Martín, Carlos; López, María Carmen

    2015-01-01

New models in clinical management seek a clinical practice based on quality, efficacy and efficiency, avoiding variability and improvisation. In this paper we develop one of the most frequent clinical processes in our speciality, the process based on DRG 311 (transurethral procedures without complications). We describe its components: stabilization form, clinical trajectory, cost calculation and, finally, the process flowchart.

  7. Chemical processing of lunar materials

    NASA Technical Reports Server (NTRS)

    Criswell, D. R.; Waldron, R. D.

    1979-01-01

    The paper highlights recent work on the general problem of processing lunar materials. The discussion covers lunar source materials, refined products, motivations for using lunar materials, and general considerations for a lunar or space processing plant. Attention is given to chemical processing through various techniques, including electrolysis of molten silicates, carbothermic/silicothermic reduction, carbo-chlorination process, NaOH basic-leach process, and HF acid-leach process. Several options for chemical processing of lunar materials are well within the state of the art of applied chemistry and chemical engineering to begin development based on the extensive knowledge of lunar materials.

  8. Process evaluation of discharge planning implementation in healthcare using normalization process theory.

    PubMed

    Nordmark, Sofi; Zingmark, Karin; Lindberg, Inger

    2016-04-27

Discharge planning is a care process that aims to secure the transfer of care for the patient at the transition from home to hospital and back home. Information exchange and collaboration between care providers are essential, but deficits are common. A wide range of initiatives to improve the discharge planning process (DPP) have been developed and implemented over the past three decades. However, there are still high rates of reported medical errors and adverse events related to failures in discharge planning. Theoretical frameworks such as Normalization Process Theory (NPT) can support evaluations of complex interventions and processes in healthcare. The aim of this study was to explore the embedding and integration of the DPP from the perspective of registered nurses, district nurses and homecare organizers. The study design was explorative, using the NPT as a framework to explore the embedding and integration of the DPP. Data consisted of written documentation from: workshops with staff, registered adverse events and system failures, a web-based survey and individual interviews with staff. Using the NPT as a framework to explore the embedding and integration of discharge planning after 10 years in use showed that the staff had reached a consensus of opinion on what the process was (coherence) and how they evaluated the process (reflexive monitoring). However, they had not reached a consensus of opinion on who performed the process (cognitive participation) and how it was performed (collective action). This could be interpreted as meaning the process had not become normalized in daily practice. The result shows the necessity of observing the implementation of old practices to better understand the needs of new ones before developing and implementing new practices or supportive tools within healthcare, so as to reach the aim of development and accomplish sustainable implementation. The NPT offers a generalizable framework for analysis, which can explain and shape the

  9. Illuminating e-beam processing

    USDA-ARS?s Scientific Manuscript database

    This month's Processing column will explore electronic beam (e-beam) processing. E-beam processing uses a low energy form of irradiation and has emerged as a highly promising treatment for both food safety and quarantine purposes. It is also used to extend food shelf life. This column will review...

  10. Lyophilization process design space.

    PubMed

    Patel, Sajal Manubhai; Pikal, Michael J

    2013-11-01

    The application of key elements of quality by design (QbD), such as risk assessment, process analytical technology, and design space, is discussed widely as it relates to freeze-drying process design and development. However, this commentary focuses on constructing the Design and Control Space, particularly for the primary drying step of the freeze-drying process. Also, practical applications and considerations of claiming a process Design Space under the QbD paradigm have been discussed. © 2013 Wiley Periodicals, Inc. and the American Pharmacists Association.

  11. Advanced Hydrogen Liquefaction Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz, Joseph; Kromer, Brian; Neu, Ben

    2011-09-28

The project identified and quantified ways to reduce the cost of hydrogen liquefaction and the cost of hydrogen distribution. The goal was to reduce the power consumption by 20% and then to reduce the capital cost. Optimizing the process, improving process equipment, and improving ortho-para conversion significantly reduced the power consumption of liquefaction, but by less than 20%. Because the efficiency improvement was less than the target, the program was stopped before the capital cost was addressed. These efficiency improvements could provide a benefit to the public by improving the design of future hydrogen liquefiers. The project increased the understanding of hydrogen liquefaction by modeling different processes and thoroughly examining ortho-para separation and conversion. The process modeling provided a benefit to the public because the project incorporated para hydrogen into the process modeling software, so liquefaction processes can be modeled more accurately than using only normal hydrogen. Adding catalyst to the first heat exchanger, a simple method to reduce liquefaction power, was identified, analyzed, and quantified. The demonstrated performance of ortho-para separation is sufficient for at least one identified process concept to show reduced power cost when compared to hydrogen liquefaction processes using conventional ortho-para conversion. The impact of improved ortho-para conversion can be significant because ortho-para conversion uses about 20-25% of the total liquefaction power, but performance improvement is necessary to realize a substantial benefit. Most of the energy used in liquefaction is for gas compression. Improvements in hydrogen compression will have a significant impact on overall liquefier efficiency. Improvements to turbines, heat exchangers, and other process equipment will have less impact.

  12. EDITORIAL: Industrial Process Tomography

    NASA Astrophysics Data System (ADS)

    Anton Johansen, Geir; Wang, Mi

    2008-09-01

    There has been tremendous development within measurement science and technology over the past couple of decades. New sensor technologies and compact versatile signal recovery electronics are continuously expanding the limits of what can be measured and the accuracy with which this can be done. Miniaturization of sensors and the use of nanotechnology push these limits further. Also, thanks to powerful and cost-effective computer systems, sophisticated measurement and reconstruction algorithms previously only accessible in advanced laboratories are now available for in situ online measurement systems. The process industries increasingly require more process-related information, motivated by key issues such as improved process control, process utilization and process yields, ultimately driven by cost-effectiveness, quality assurance, environmental and safety demands. Industrial process tomography methods have taken advantage of the general progress in measurement science, and aim at providing more information, both quantitatively and qualitatively, on multiphase systems and their dynamics. The typical approach for such systems has been to carry out one local or bulk measurement and assume that this is representative of the whole system. In some cases, this is sufficient. However, there are many complex systems where the component distribution varies continuously and often unpredictably in space and time. The foundation of industrial tomography is to conduct several measurements around the periphery of a multiphase process, and use these measurements to unravel the cross-sectional distribution of the process components in time and space. This information is used in the design and optimization of industrial processes and process equipment, and also to improve the accuracy of multiphase system measurements in general. In this issue we are proud to present a selection of the 145 papers presented at the 5th World Congress on Industrial Process Tomography in Bergen

  13. Using Unified Modelling Language (UML) as a process-modelling technique for clinical-research process improvement.

    PubMed

    Kumarapeli, P; De Lusignan, S; Ellis, T; Jones, B

    2007-03-01

    The Primary Care Data Quality programme (PCDQ) is a quality-improvement programme which processes routinely collected general practice computer data. Patient data collected from a wide range of different brands of clinical computer systems are aggregated, processed, and fed back to practices in an educational context to improve the quality of care. Process modelling is a well-established approach used to gain understanding, carry out systematic appraisal, and identify areas of improvement of a business process. Unified modelling language (UML) is a general-purpose modelling technique used for this purpose. We used UML to appraise the PCDQ process to see if the efficiency and predictability of the process could be improved. Activity analysis and thinking-aloud sessions were used to collect data to generate UML diagrams. The UML model highlighted the sequential nature of the current process as a barrier for efficiency gains. It also identified the uneven distribution of process controls, lack of symmetric communication channels, critical dependencies among processing stages, and failure to implement all the lessons learned in the piloting phase. It also suggested that improved structured reporting at each stage - especially from the pilot phase - parallel processing of data, and correctly positioned process controls should improve the efficiency and predictability of research projects. Process modelling provided a rational basis for the critical appraisal of a clinical data processing system; its potential may be underutilized within health care.

  14. Processed and ultra-processed food products: consumption trends in Canada from 1938 to 2011.

    PubMed

    Moubarac, Jean-Claude; Batal, Malek; Martins, Ana Paula Bortoletto; Claro, Rafael; Levy, Renata Bertazzi; Cannon, Geoffrey; Monteiro, Carlos

    2014-01-01

    A classification of foods based on the nature, extent, and purpose of industrial food processing was used to assess changes in household food expenditures and dietary energy availability between 1938 and 2011 in Canada. Food acquisitions from six household food budget surveys (1938/1939, 1953, 1969, 1984, 2001, and 2011) were classified into unprocessed or minimally processed foods, processed culinary ingredients, and ready-to-consume processed or ultra-processed products. Contributions of each group to household food expenditures and to dietary energy availability (kcal per capita) were calculated. During the period studied, household expenditures and dietary energy availability fell for both unprocessed or minimally processed foods and culinary ingredients, and rose for ready-to-consume products. The caloric share of unprocessed or minimally processed foods fell from 34.3% to 25.6%, and that of culinary ingredients from 37% to 12.7%. The share of ready-to-consume products rose from 28.7% to 61.7%, and the increase was especially noteworthy for those that were ultra-processed. The most important factor driving changes in Canadian dietary patterns between 1938 and 2011 is the replacement of unprocessed or minimally processed foods and culinary ingredients used in the preparation of dishes and meals; these have been displaced by ready-to-consume ultra-processed products. Nutrition research and practice should incorporate information about food processing into dietary assessments.

  15. The Constitutional Amendment Process

    ERIC Educational Resources Information Center

    Chism, Kahlil

    2005-01-01

    This article discusses the constitutional amendment process. Although the process is not described in great detail, Article V of the United States Constitution allows for and provides instruction on amending the Constitution. While the amendment process currently consists of six steps, the Constitution is nevertheless quite difficult to change.…

  16. Survey of Event Processing

    DTIC Science & Technology

    2007-12-01

    The report surveys event processing. It opens with a brief history of event processing; the Applications section defines several application domains and use cases for event processing technology. On terminology, “subscription” and “subscription language” will be used where some will often use “(continuous) query” or “query language.”

  17. Methane/nitrogen separation process

    DOEpatents

    Baker, Richard W.; Lokhandwala, Kaaeid A.; Pinnau, Ingo; Segelke, Scott

    1997-01-01

    A membrane separation process for treating a gas stream containing methane and nitrogen, for example, natural gas. The separation process works by preferentially permeating methane and rejecting nitrogen. We have found that the process is able to meet natural gas pipeline specifications for nitrogen, with acceptably small methane loss, so long as the membrane can exhibit a methane/nitrogen selectivity of about 4, 5 or more. This selectivity can be achieved with some rubbery and super-glassy membranes at low temperatures. The process can also be used for separating ethylene from nitrogen.
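As a rough illustration of how a methane/nitrogen selectivity of about 4 translates into permeate composition, the following sketch uses an idealized flux-ratio model for a single membrane stage; it is not the patented process, and the feed composition, negligible permeate pressure, and low stage-cut are all assumptions made for illustration.

```python
# Hypothetical single-stage estimate of permeate enrichment for a
# methane-selective membrane, assuming negligible permeate pressure and
# a low stage-cut so the residue stays near the feed composition.

def permeate_fraction_methane(x_ch4, selectivity):
    """Mole fraction of methane in the permeate for feed fraction x_ch4,
    using the ideal component-flux ratio J_CH4/J_N2 = alpha * x/(1 - x)."""
    flux_ratio = selectivity * x_ch4 / (1.0 - x_ch4)
    return flux_ratio / (1.0 + flux_ratio)

# A feed with 10% nitrogen (90% methane) through a membrane with the
# CH4/N2 selectivity of 4 cited in the abstract:
y = permeate_fraction_methane(0.90, 4.0)
print(f"permeate methane fraction: {y:.3f}")
```

Under these assumptions the permeate nitrogen content drops to roughly 3%, which is why a selectivity of about 4 or more is enough to approach pipeline nitrogen specifications.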

  18. Gasoline from coal in the state of Illinois: feasibility study. Volume I. Design. [KBW gasification process, ICI low-pressure methanol process and Mobil M-gasoline process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1980-01-01

    Volume 1 describes the proposed plant: the KBW gasification process, ICI low-pressure methanol process, and Mobil M-gasoline process, along with ancillary processes such as the oxygen plant, shift process, RECTISOL purification process, sulfur recovery equipment, and pollution control equipment. Numerous engineering diagrams are included. (LTN)

  19. Computer integrated manufacturing/processing in the HPI. [Hydrocarbon Processing Industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoshimura, J.S.

    1993-05-01

    Hydrocarbon Processing and Systemhouse Inc. developed a comprehensive survey on the status of computer integrated manufacturing/processing (CIM/CIP) targeted specifically to the unique requirements of the hydrocarbon processing industry. These types of surveys and other benchmarking techniques can be invaluable in assisting companies to maximize business benefits from technology investments. The survey was organized into 5 major areas: CIM/CIP planning, management perspective, functional applications, integration, and technology infrastructure and trends. The CIM/CIP planning area dealt with the use and type of planning methods to plan, justify and implement information technology projects. The management perspective section addressed management priorities, expenditure levels and implementation barriers. The functional application area covered virtually all functional areas of the organization and focused on the specific solutions and benefits in each of the functional areas. The integration section addressed the needs and integration status of the organization's functional areas. Finally, the technology infrastructure and trends section dealt with specific technologies in use as well as trends over the next three years. In February 1993, summary areas from preliminary results were presented at the 2nd International Conference on Productivity and Quality in the Hydrocarbon Processing Industry.

  20. How yogurt is processed

    USDA-ARS?s Scientific Manuscript database

    This month’s Processing column on the theme of “How Is It Processed?” focuses on yogurt. Yogurt is known for its health-promoting properties. This column will provide a brief overview of the history of yogurt and the current market. It will also unveil both traditional and modern yogurt processing t...

  1. Process Development of Porcelain Ceramic Material with Binder Jetting Process for Dental Applications

    NASA Astrophysics Data System (ADS)

    Miyanaji, Hadi; Zhang, Shanshan; Lassell, Austin; Zandinejad, Amirali; Yang, Li

    2016-03-01

    Custom ceramic structures possess significant potential in many applications such as dentistry and aerospace where extreme environments are present. Specifically, highly customized geometries with adequate performance are needed for various dental prostheses applications. This paper demonstrates the development of process and post-process parameters for a dental porcelain ceramic material using binder jetting additive manufacturing (AM). Various process parameters such as binder amount, drying power level, drying time and powder spread speed were studied experimentally for their effect on geometrical and mechanical characteristics of green parts. In addition, the effects of sintering and printing parameters on the qualities of the densified ceramic structures were also investigated experimentally. The results provide insights into the process-property relationships for the binder jetting AM process, and some of the challenges of the process that need to be further characterized for the successful adoption of the binder jetting technology in high quality ceramic fabrications are discussed.

  2. NTP comparison process

    NASA Technical Reports Server (NTRS)

    Corban, Robert

    1993-01-01

    The systems engineering process for the concept definition phase of the program involves requirements definition, system definition, and consistent concept definition. The requirements definition process involves obtaining a complete understanding of the system requirements based on customer needs, mission scenarios, and nuclear thermal propulsion (NTP) operating characteristics. A system functional analysis is performed to provide a comprehensive traceability and verification of top-level requirements down to detailed system specifications and provides significant insight into the measures of system effectiveness to be utilized in system evaluation. The second key element in the process is the definition of system concepts to meet the requirements. This part of the process involves engine system and reactor contractor teams to develop alternative NTP system concepts that can be evaluated against specific attributes, as well as a reference configuration against which to compare system benefits and merits. Quality function deployment (QFD), as an excellent tool within Total Quality Management (TQM) techniques, can provide the required structure and provide a link to the voice of the customer in establishing critical system qualities and their relationships. The third element of the process is the consistent performance comparison. The comparison process involves validating developed concept data and quantifying system merits through analysis, computer modeling, simulation, and rapid prototyping of the proposed high risk NTP subsystems. The maximum amount possible of quantitative data will be developed and/or validated to be utilized in the QFD evaluation matrix. If, upon evaluation, a new concept or its associated subsystems are determined to have substantial merit, those features will be incorporated into the reference configuration for subsequent system definition and comparison efforts.
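The QFD-style evaluation matrix described above can be illustrated with a toy weighted scoring table; the attributes, weights, and scores below are invented for the sketch and are not figures from the program.

```python
# Illustrative QFD-style scoring of alternative concepts against a
# reference configuration. Attribute names, weights, and 1-5 scores
# are hypothetical.
weights = {"thrust_to_weight": 0.40, "specific_impulse": 0.35, "reliability": 0.25}

concepts = {
    "reference": {"thrust_to_weight": 3, "specific_impulse": 3, "reliability": 3},
    "concept_A": {"thrust_to_weight": 4, "specific_impulse": 2, "reliability": 3},
}

def weighted_score(scores):
    """Sum of attribute scores weighted by customer-derived importances."""
    return sum(weights[attr] * s for attr, s in scores.items())

for name, scores in concepts.items():
    print(f"{name}: {weighted_score(scores):.2f}")
```

A concept scoring above the reference on the weighted total would be a candidate for incorporation into the reference configuration, which is the decision the abstract describes.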

  3. RATES OF REACTION AND PROCESS DESIGN DATA FOR THE HYDROCARB PROCESS

    EPA Science Inventory

    The report provides experimental and process design data in support of studies for developing the coprocessing of fossil fuels with biomass by the Hydrocarb process. The experimental work includes the hydropyrolysis of biomass and the thermal decomposition of methane in a 2.44 m ...

  4. Developing the JPL Engineering Processes

    NASA Technical Reports Server (NTRS)

    Linick, Dave; Briggs, Clark

    2004-01-01

    This paper briefly recounts the recent history of process reengineering at the NASA Jet Propulsion Laboratory, with a focus on the engineering processes. The JPL process structure is described and the process development activities of the past several years outlined. The main focus of the paper is on the current process structure, the emphasis on the flight project life cycle, the governance approach that led to Flight Project Practices, and the remaining effort to capture process knowledge at the detail level of the work group.

  5. Economics of polysilicon processes

    NASA Technical Reports Server (NTRS)

    Yaws, C. L.; Li, K. Y.; Chou, S. M.

    1986-01-01

    Techniques are being developed to provide lower cost polysilicon material for solar cells. Existing technology which normally provides semiconductor industry polysilicon material is undergoing changes and also being used to provide polysilicon material for solar cells. Economics of new and existing technologies are presented for producing polysilicon. The economics are primarily based on the preliminary process design of a plant producing 1,000 metric tons/year of silicon. The polysilicon processes include: the Siemens process (hydrogen reduction of trichlorosilane); the Union Carbide process (silane decomposition); and the Hemlock Semiconductor process (hydrogen reduction of dichlorosilane). The economics include cost estimates of capital investment and product cost to produce polysilicon via each technology. Sensitivity analysis results are also presented to disclose the effect of major parameters such as utilities, labor, raw materials and capital investment.
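A one-at-a-time sensitivity analysis of the kind mentioned above might be sketched as follows; the cost components and base values are hypothetical placeholders, not figures from the study.

```python
# Illustrative one-at-a-time sensitivity of polysilicon product cost to
# its major components. All $/kg contributions below are invented.
base = {
    "raw_materials": 12.0,
    "utilities": 8.0,
    "labor": 5.0,
    "capital_charge": 15.0,
}

def product_cost(components):
    """Total product cost as the sum of per-kg cost contributions."""
    return sum(components.values())

# Perturb each component by +10% and report the resulting cost change.
for name in base:
    perturbed = dict(base)
    perturbed[name] *= 1.10
    delta = product_cost(perturbed) - product_cost(base)
    print(f"+10% {name}: cost +{delta:.2f} $/kg")
```

Ranking the deltas shows which parameter dominates the product cost, which is the purpose of the sensitivity results the abstract reports.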

  6. Spacelab Data Processing Facility

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The Spacelab Data Processing Facility (SLDPF) processes, monitors, and accounts for the payload data from Spacelab and other Shuttle missions and forwards relevant data to various user facilities worldwide. The SLDPF is divided into the Spacelab Input Processing System (SIPS) and the Spacelab Output Processing System (SOPS). The SIPS division demultiplexes, synchronizes, time-tags, quality checks, accounts for the data, and formats the data onto tapes. The SOPS division further edits, blocks, formats, and records the data on tape for shipment to users. User experiments must conform to the Spacelab's onboard High Rate Multiplexer (HRM) format for maximum processability. Audio, analog, instrumentation, high density, experiment data, input/output data, quality control and accounting, and experimental channel tapes, along with a variety of Spacelab ancillary tapes, are provided to the user by the SLDPF.

  7. Styrene process condensate treatment with a combination process of UF and NF for reuse.

    PubMed

    Wang, Aijun; Liu, Guangmin; Huang, Jin; Wang, Lijuan; Li, Guangbin; Su, Xudong; Qi, Hong

    2013-01-15

    Aiming at reusing the styrene process condensate (SPC) to save water resources and heat energy, a combination treatment process of UF/NF was applied in this work to remove inorganic ions, suspended particles, and small amounts of organic contaminants. To achieve the targets of CODMn ≤ 5.00 mg L(-1), oil ≤ 2.00 mg L(-1), conductivity ≤ 10.00 μS cm(-1), and pH of 6.0-8.0, the NF membrane process was adopted. It was necessary to employ a pretreatment process to reduce NF membrane fouling; hence a UF membrane was proposed as an efficient pretreatment unit to remove inorganic particles, such as iron oxide catalyst, and meet the influent demands of NF. The effluent of UF, which contained less than 0.02 mg L(-1) of total iron, passed through a security filter and was then pumped into the NF process unit. High removal efficiencies of CODMn, oil and conductivity were achieved by the NF process. An ABS grafting copolymerization experiment showed that the effluent of the combination process met the criteria of the ABS production process, and the process could also alleviate environmental pollution. It was shown that this combination process concept was feasible and successful in treating the SPC. Copyright © 2012 Elsevier B.V. All rights reserved.

  8. Cantilever epitaxial process

    DOEpatents

    Ashby, Carol I.; Follstaedt, David M.; Mitchell, Christine C.; Han, Jung

    2003-07-29

    A process of growing a material on a substrate, particularly growing a Group II-VI or Group III-V material, by a vapor-phase growth technique where the growth process eliminates the need for utilization of a mask or removal of the substrate from the reactor at any time during the processing. A nucleation layer is first grown upon which a middle layer is grown to provide surfaces for subsequent lateral cantilever growth. The lateral growth rate is controlled by altering the reactor temperature, pressure, reactant concentrations or reactant flow rates. Semiconductor materials, such as GaN, can be produced with dislocation densities less than 10^7/cm^2.

  9. A case study: application of statistical process control tool for determining process capability and sigma level.

    PubMed

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory authorities such as Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005 provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessments, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to an area where the application of quality improvement and quality risk assessment principles for achievement of six sigma-capable processes is possible. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical
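The capability indices behind such a sigma-level assessment can be sketched as follows; the specification limits and sample data below are invented for illustration and are not from the study.

```python
# A minimal sketch of process capability indices (Cp, Cpk) from sample
# data. Specification limits and the tablet-weight samples are invented.
import statistics

def cp_cpk(data, lsl, usl):
    """Cp compares spec width to process spread; Cpk also penalizes
    off-center processes. Sigma level is commonly taken as ~3 * Cpk."""
    mu = statistics.mean(data)
    sigma = statistics.stdev(data)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

weights = [99.8, 100.1, 100.0, 99.9, 100.2, 100.0, 99.7, 100.1]  # mg (invented)
cp, cpk = cp_cpk(weights, lsl=98.0, usl=102.0)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```

A Cpk of 2 corresponds to a six sigma-capable process under this convention; the gap between Cp and Cpk measures how far the process mean sits from the center of the specification.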

  10. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC).

    PubMed

    Gérard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, François; Aletti, Pierre

    2009-04-01

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (+/- 4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the
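An EWMA control chart of the kind mentioned above might be sketched as follows, using the standard EWMA limit formula; the deviation data and the charting constants (lambda, L, sigma) are invented for illustration.

```python
# A hedged sketch of an EWMA control chart for dose-deviation data.
# The percentage deviations and chart parameters below are invented.
import math

def ewma_chart(values, target=0.0, lam=0.2, L=3.0, sigma=1.0):
    """Yield (ewma, lower_limit, upper_limit) per observation; a point
    outside its limits signals a drift in the process mean."""
    z = target
    out = []
    for i, x in enumerate(values, start=1):
        z = lam * x + (1 - lam) * z
        width = L * sigma * math.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
        out.append((z, target - width, target + width))
    return out

deviations = [0.5, -0.3, 0.8, 1.1, 1.4, 1.9, 2.2]  # hypothetical % deviations
for z, lo, hi in ewma_chart(deviations):
    flag = "DRIFT" if not (lo <= z <= hi) else "ok"
    print(f"EWMA={z:+.2f}  limits=({lo:+.2f}, {hi:+.2f})  {flag}")
```

With these invented inputs the smoothed statistic crosses the upper limit only at the final point, illustrating how an EWMA chart can flag a slow drift before any single measurement exceeds the clinical tolerance.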

  11. CIMOSA process classification for business process mapping in non-manufacturing firms: A case study

    NASA Astrophysics Data System (ADS)

    Latiffianti, Effi; Siswanto, Nurhadi; Wiratno, Stefanus Eko; Saputra, Yudha Andrian

    2017-11-01

    A business process mapping is one important means to enable an enterprise to effectively manage the value chain. One widely used approach to classifying business processes for mapping purposes is Computer Integrated Manufacturing System Open Architecture (CIMOSA). CIMOSA was initially designed for Computer Integrated Manufacturing (CIM) system based enterprises. This paper aims to analyze the use of CIMOSA process classification for business process mapping in firms that do not fall within the area of CIM. Three firms of different business areas that have used CIMOSA process classification were observed: an airline firm, a marketing and trading firm for oil and gas products, and an industrial estate management firm. The results of the research show that CIMOSA can be used in non-manufacturing firms with some adjustment. The adjustment includes addition, reduction, or modification of some processes suggested by CIMOSA process classification, as evidenced by the case studies.

  12. Experimental research of solid waste drying in the process of thermal processing

    NASA Astrophysics Data System (ADS)

    Bukhmirov, V. V.; Kolibaba, O. B.; Gabitov, R. N.

    2015-10-01

    The convective drying process of a municipal solid waste layer as a polydispersed multicomponent porous structure is studied. On the basis of the experimental data, criterial equations for calculating heat transfer and mass transfer processes in the layer are obtained as functions of the humidity of the material, the speed of the drying agent, and the layer height. These solutions are used in the thermal design of reactors for the thermal processing of multicomponent organic waste.

  13. Conceptual models of information processing

    NASA Technical Reports Server (NTRS)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  14. Alsep data processing: How we processed Apollo Lunar Seismic Data

    NASA Technical Reports Server (NTRS)

    Latham, G. V.; Nakamura, Y.; Dorman, H. J.

    1979-01-01

    The Apollo lunar seismic station network gathered data continuously at a rate of 3 × 10^8 bits per day for nearly eight years until its termination in September 1977. The data were processed and analyzed using a PDP-15 minicomputer. On the average, 1500 long-period seismic events were detected yearly. Automatic event detection and identification schemes proved unsuccessful because of occasional high noise levels and, above all, the risk of overlooking unusual natural events. The processing procedures finally settled on consist of first plotting all the data on a compressed time scale, visually picking events from the plots, transferring event data to separate sets of tapes, and performing detailed analyses using the latter. Many problems remain, especially for automatically processing extraterrestrial seismic signals.

  15. The "Process" of Process Use: Methods for Longitudinal Assessment in a Multisite Evaluation

    ERIC Educational Resources Information Center

    Shaw, Jessica; Campbell, Rebecca

    2014-01-01

    Process use refers to the ways in which stakeholders and/or evaluands change as a function of participating in evaluation activities. Although the concept of process use has been well discussed in the literature, exploration of methodological strategies for the measurement and assessment of process use has been limited. Typically, empirical…

  16. Emotional language processing: how mood affects integration processes during discourse comprehension.

    PubMed

    Egidi, Giovanna; Nusbaum, Howard C

    2012-09-01

    This research tests whether mood affects semantic processing during discourse comprehension by facilitating integration of information congruent with moods' valence. Participants in happy, sad, or neutral moods listened to stories with positive or negative endings during EEG recording. N400 peak amplitudes showed mood congruence for happy and sad participants: endings incongruent with participants' moods demonstrated larger peaks. Happy and neutral moods exhibited larger peaks for negative endings, thus showing a similarity between negativity bias (neutral mood) and mood congruence (happy mood). Mood congruence resulted in differential processing of negative information: happy mood showed larger amplitudes for negative endings than neutral mood, and sad mood showed smaller amplitudes. N400 peaks were also sensitive to whether ending valence was communicated directly or as a result of inference. This effect was moderately modulated by mood. In conclusion, the notion of context for discourse processing should include comprehenders' affective states preceding language processing. Copyright © 2011 Elsevier Inc. All rights reserved.

  17. Semisolid Metal Processing Consortium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Apelian,Diran

    Mathematical modeling and simulation of semisolid filling processes remains a critical issue in understanding and optimizing the process. Semisolid slurries are non-Newtonian materials that exhibit complex rheological behavior; the way these slurries flow in cavities is therefore very different from the way liquid fills cavities in classical casting. Filling in semisolid processing is often counterintuitive.

  18. How tofu is processed

    USDA-ARS?s Scientific Manuscript database

    This month’s Processing column will continue the theme of “How Is It Processed?” The column will focus on tofu, which is sometimes called “the cheese of Asia.” It is a nutritious, protein-rich bean curd made by coagulating soy milk. There are many different types of tofu, and they are processed in a...

  19. Kennedy Space Center Payload Processing

    NASA Technical Reports Server (NTRS)

    Lawson, Ronnie; Engler, Tom; Colloredo, Scott; Zide, Alan

    2011-01-01

    This slide presentation reviews the payload processing functions at Kennedy Space Center. It details some of the payloads processed at KSC, the typical processing tasks, the facilities available for processing payloads, and the capabilities and customer services that are available.

  20. Parallel Processing with Digital Signal Processing Hardware and Software

    NASA Technical Reports Server (NTRS)

    Swenson, Cory V.

    1995-01-01

    The assembling and testing of a parallel processing system is described which will allow a user to move a Digital Signal Processing (DSP) application from the design stage to the execution/analysis stage through the use of several software tools and hardware devices. The system will be used to demonstrate the feasibility of the Algorithm To Architecture Mapping Model (ATAMM) dataflow paradigm for static multiprocessor solutions of DSP applications. The individual components comprising the system are described followed by the installation procedure, research topics, and initial program development.

  1. Microsystem process networks

    DOEpatents

    Wegeng, Robert S [Richland, WA; TeGrotenhuis, Ward E [Kennewick, WA; Whyatt, Greg A [West Richland, WA

    2006-10-24

    Various aspects and applications of microsystem process networks are described. The design of many types of microsystems can be improved by ortho-cascading mass, heat, or other unit process operations. Microsystems having exergetically efficient microchannel heat exchangers are also described. Detailed descriptions of numerous design features in microcomponent systems are also provided.

  2. METAL PLATING PROCESS

    DOEpatents

    Walker, D.E.; Noland, R.A.

    1958-08-12

    A process is described for obtaining a closely bonded coating of steel or iron on uranium. The process consists of providing, between the steel and uranium, a layer of silver, and then pressure rolling the assembly at about 600 deg C until a reduction of from 10 to 50% has been obtained.

  3. Microsystem process networks

    DOEpatents

    Wegeng, Robert S [Richland, WA; TeGrotenhuis, Ward E [Kennewick, WA; Whyatt, Greg A [West Richland, WA

    2010-01-26

    Various aspects and applications of microsystem process networks are described. The design of many types of microsystems can be improved by ortho-cascading mass, heat, or other unit process operations. Microsystems having energetically efficient microchannel heat exchangers are also described. Detailed descriptions of numerous design features in microcomponent systems are also provided.

  4. Microsystem process networks

    DOEpatents

    Wegeng, Robert S.; TeGrotenhuis, Ward E.; Whyatt, Greg A.

    2007-09-18

    Various aspects and applications of microsystem process networks are described. The design of many types of microsystems can be improved by ortho-cascading mass, heat, or other unit process operations. Microsystems having energetically efficient microchannel heat exchangers are also described. Detailed descriptions of numerous design features in microcomponent systems are also provided.

  5. Industrial process surveillance system

    DOEpatents

    Gross, Kenneth C.; Wegerich, Stephan W.; Singer, Ralph M.; Mott, Jack E.

    1998-01-01

    A system and method for monitoring an industrial process and/or industrial data source. The system includes generating time varying data from industrial data sources, processing the data to obtain time correlation of the data, determining the range of data, determining learned states of normal operation and using these states to generate expected values, comparing the expected values to current actual values to identify a current state of the process closest to a learned, normal state; generating a set of modeled data, and processing the modeled data to identify a data pattern and generating an alarm upon detecting a deviation from normalcy.
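A minimal sketch of the monitoring idea the abstract describes (not the patented method itself): learn a set of normal operating states, generate an expected value from the closest learned state, and raise an alarm when the residual exceeds a threshold. The state variables, data, and threshold below are all invented.

```python
# Toy surveillance loop: compare each observation against the nearest
# learned "normal" state and alarm on large residuals. All data invented.
import math

def closest_state(states, observation):
    """Return the learned state nearest (Euclidean) to the observation."""
    return min(states, key=lambda s: math.dist(s, observation))

def check(states, observation, threshold):
    """Return (expected_state, residual, alarm) for one observation."""
    expected = closest_state(states, observation)
    residual = math.dist(expected, observation)
    return expected, residual, residual > threshold

# Learned normal operating states as (temperature, pressure) pairs.
normal = [(300.0, 1.0), (320.0, 1.2), (340.0, 1.4)]

_, r, alarm = check(normal, (321.0, 1.21), threshold=5.0)
print(r, alarm)   # near a learned state: small residual, no alarm
_, r, alarm = check(normal, (400.0, 2.0), threshold=5.0)
print(r, alarm)   # far from every learned state: alarm
```

The patented system adds time correlation and pattern analysis on modeled data; this sketch shows only the expected-value-versus-actual comparison at its core.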

  6. Industrial Process Surveillance System

    DOEpatents

    Gross, Kenneth C.; Wegerich, Stephan W; Singer, Ralph M.; Mott, Jack E.

    2001-01-30

    A system and method for monitoring an industrial process and/or industrial data source. The system includes generating time varying data from industrial data sources, processing the data to obtain time correlation of the data, determining the range of data, determining learned states of normal operation and using these states to generate expected values, comparing the expected values to current actual values to identify a current state of the process closest to a learned, normal state; generating a set of modeled data, and processing the modeled data to identify a data pattern and generating an alarm upon detecting a deviation from normalcy.

  7. Methane/nitrogen separation process

    DOEpatents

    Baker, R.W.; Lokhandwala, K.A.; Pinnau, I.; Segelke, S.

    1997-09-23

    A membrane separation process is described for treating a gas stream containing methane and nitrogen, for example, natural gas. The separation process works by preferentially permeating methane and rejecting nitrogen. The authors have found that the process is able to meet natural gas pipeline specifications for nitrogen, with acceptably small methane loss, so long as the membrane exhibits a methane/nitrogen selectivity of about 4 to 5 or more. This selectivity can be achieved with some rubbery and super-glassy membranes at low temperatures. The process can also be used for separating ethylene from nitrogen. 11 figs.

  8. Business Process Modeling: Perceived Benefits

    NASA Astrophysics Data System (ADS)

    Indulska, Marta; Green, Peter; Recker, Jan; Rosemann, Michael

    The process-centered design of organizations and information systems is globally seen as an appropriate response to the increased economic pressure on organizations. At the methodological core of process-centered management is process modeling. However, business process modeling in large initiatives can be a time-consuming and costly exercise, making it potentially difficult to convince executive management of its benefits. To date, and despite substantial interest and research in the area of process modeling, the understanding of the actual benefits of process modeling in academia and practice is limited. To address this gap, this paper explores the perception of benefits derived from process modeling initiatives, as reported through a global Delphi study. The study incorporates the views of three groups of stakeholders - academics, practitioners and vendors. Our findings lead to the first identification and ranking of 19 unique benefits associated with process modeling. The study in particular found that process modeling benefits vary significantly between practitioners and academics. We argue that the variations may point to a disconnect between research projects and practical demands.

  9. Applied digital signal processing systems for vortex flowmeter with digital signal processing.

    PubMed

    Xu, Ke-Jun; Zhu, Zhi-Hai; Zhou, Yang; Wang, Xiao-Fen; Liu, San-Shan; Huang, Yun-Zhi; Chen, Zhi-Yuan

    2009-02-01

    Spectral analysis is combined with a digital filter to process the vortex sensor signal, reducing the effect of low-frequency disturbance from pipe vibrations and increasing the turndown ratio. Using a digital signal processing chip, two kinds of digital signal processing systems are developed to implement these algorithms: one is an integrative system, and the other is a separated system. A limiting amplifier is designed into the input analog conditioning circuit to accommodate large amplitude variations of the sensor signal. Some technical measures are taken to improve the accuracy of the output pulse, speed up the response time of the meter, and reduce the fluctuation of the output signal. The experimental results demonstrate the validity of the digital signal processing systems.
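    The core idea of combining spectral analysis with filtering to reject low-frequency pipe vibration can be illustrated with a toy example; the sampling rate, tone frequencies, and cutoff below are invented for illustration and are not the paper's values:

```python
import numpy as np

def vortex_frequency(signal, fs, f_min):
    """Estimate the vortex shedding frequency by spectral analysis,
    ignoring content below f_min (the pipe-vibration band)."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = freqs >= f_min  # crude high-pass: discard the vibration band
    return freqs[mask][np.argmax(spectrum[mask])]

# Synthetic sensor signal: an 80 Hz vortex tone buried under a much
# stronger 5 Hz vibration disturbance
fs, n = 1000.0, 4096
t = np.arange(n) / fs
x = 0.5 * np.sin(2 * np.pi * 80 * t) + 2.0 * np.sin(2 * np.pi * 5 * t)
print(vortex_frequency(x, fs, f_min=20.0))  # close to 80 Hz
```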

  10. Modeling biochemical transformation processes and information processing with Narrator.

    PubMed

    Mandel, Johannes J; Fuss, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-03-27

    Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of support for an integrative representation of transport, transformation as well as biological information processing. Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Narrator is a flexible and intuitive systems biology tool.

  11. Modeling biochemical transformation processes and information processing with Narrator

    PubMed Central

    Mandel, Johannes J; Fuß, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-01-01

    Background Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of supporting an integrative representation of transport, transformation as well as biological information processing. Results Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. 
Conclusion Narrator is a flexible and intuitive systems biology tool.

  12. Silicon production process evaluations

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Chemical engineering analysis of the HSC process (Hemlock Semiconductor Corporation) for producing silicon from dichlorosilane in a 1,000 MT/yr plant was continued. Progress and status for the chemical engineering analysis of the HSC process are reported for the primary process design engineering activities: base case conditions (85%), reaction chemistry (85%), process flow diagram (60%), material balance (60%), energy balance (30%), property data (30%), equipment design (20%) and major equipment list (10%). Engineering design of the initial distillation column (D-01, stripper column) in the process was initiated. The function of the distillation column is to remove volatile gases (such as hydrogen and nitrogen) which are dissolved in liquid chlorosilanes. Initial specifications and results for the distillation column design are reported including the variation of tray requirements (equilibrium stages) with reflux ratio for the distillation.

  13. Ordinal Process Dissociation and the Measurement of Automatic and Controlled Processes

    ERIC Educational Resources Information Center

    Hirshman, Elliot

    2004-01-01

    The process-dissociation equations (L. Jacoby, 1991) have been applied to results from inclusion and exclusion tasks to derive quantitative estimates of the influence of controlled and automatic processes on memory. This research has provoked controversies (e.g., T. Curran & D. Hintzman, 1995) regarding the validity of specific assumptions…

  14. Component processes underlying future thinking.

    PubMed

    D'Argembeau, Arnaud; Ortoleva, Claudia; Jumentier, Sabrina; Van der Linden, Martial

    2010-09-01

    This study sought to investigate the component processes underlying the ability to imagine future events, using an individual-differences approach. Participants completed several tasks assessing different aspects of future thinking (i.e., fluency, specificity, amount of episodic details, phenomenology) and were also assessed with tasks and questionnaires measuring various component processes that have been hypothesized to support future thinking (i.e., executive processes, visual-spatial processing, relational memory processing, self-consciousness, and time perspective). The main results showed that executive processes were correlated with various measures of future thinking, whereas visual-spatial processing abilities and time perspective were specifically related to the number of sensory descriptions reported when specific future events were imagined. Furthermore, individual differences in self-consciousness predicted the subjective feeling of experiencing the imagined future events. These results suggest that future thinking involves a collection of processes that are related to different facets of future-event representation.

  15. Hynol Process Engineering: Process Configuration, Site Plan, and Equipment Design

    DTIC Science & Technology

    1996-02-01

    A bench-scale methanol production facility is being constructed to demonstrate the technical feasibility of producing methanol from biomass using the Hynol process. The plant is being designed to convert 50 lb./hr of biomass to methanol. The biomass consists of wood, and natural gas is used as a co-feedstock. Compared with other methanol production processes, direct emissions of carbon dioxide can be substantially reduced by using the Hynol process.

  16. Business process modeling in healthcare.

    PubMed

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making, which is focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling using the Business Process Modelling Notation (BPMN) standard is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  17. Adaptive Memory: Evaluating Alternative Forms of Fitness-Relevant Processing in the Survival Processing Paradigm

    PubMed Central

    Sandry, Joshua; Trafimow, David; Marks, Michael J.; Rice, Stephen

    2013-01-01

    Memory may have evolved to preserve information processed in terms of its fitness-relevance. Based on the assumption that the human mind comprises different fitness-relevant adaptive mechanisms contributing to survival and reproductive success, we compared alternative fitness-relevant processing scenarios with survival processing. Participants rated words for relevancy to fitness-relevant and control conditions followed by a delay and surprise recall test (Experiment 1a). Participants recalled more words processed for their relevance to a survival situation. We replicated these findings in an online study (Experiment 2) and a study using revised fitness-relevant scenarios (Experiment 3). Across all experiments, we did not find a mnemonic benefit for alternative fitness-relevant processing scenarios, questioning assumptions associated with an evolutionary account of remembering. Based on these results, fitness-relevance seems to be too wide-ranging of a construct to account for the memory findings associated with survival processing. We propose that memory may be hierarchically sensitive to fitness-relevant processing instructions. We encourage future researchers to investigate the underlying mechanisms responsible for survival processing effects and work toward developing a taxonomy of adaptive memory. PMID:23585858

  18. Integrated decontamination process for metals

    DOEpatents

    Snyder, Thomas S.; Whitlow, Graham A.

    1991-01-01

    An integrated process for decontamination of metals, particularly metals that are used in the nuclear energy industry contaminated with radioactive material. The process combines the processes of electrorefining and melt refining to purify metals that can be decontaminated using either electrorefining or melt refining processes.

  19. Word Processing Competencies.

    ERIC Educational Resources Information Center

    Gatlin, Rebecca; And Others

    Research indicates that people tend to use only five percent of the capabilities available in word processing software. The major objective of this study was to determine to what extent word processing was used by businesses, what competencies were required by those businesses, and how those competencies were being learned in Mid-South states. A…

  20. Handbook of Petroleum Processing

    NASA Astrophysics Data System (ADS)

    Jones, David S. J.; Pujado, Peter P.

    This handbook describes and discusses the features that make up the petroleum refining industry. It begins with a description of the crude oils and their nature, and continues with the saleable products from the refining processes, with a review of the environmental impact. There is a complete overview of the processes that make up the refinery with a brief history of those processes.

  1. A KPI-based process monitoring and fault detection framework for large-scale processes.

    PubMed

    Zhang, Kai; Shardt, Yuri A W; Chen, Zhiwen; Yang, Xu; Ding, Steven X; Peng, Kaixiang

    2017-05-01

    Large-scale processes, consisting of multiple interconnected subprocesses, are commonly encountered in industrial systems, whose performance needs to be determined. A common approach to this problem is to use a key performance indicator (KPI)-based approach. However, the different KPI-based approaches have not been developed within a coherent and consistent framework. Thus, this paper proposes a framework for KPI-based process monitoring and fault detection (PM-FD) for large-scale industrial processes, which considers the static and dynamic relationships between process and KPI variables. For the static case, a least squares-based approach is developed that provides an explicit link with least-squares regression, which gives better performance than partial least squares. For the dynamic case, using the kernel representation of each subprocess, an instrumental variable is used to reduce the dynamic case to the static case. This framework is applied to the TE benchmark process and the hot strip mill rolling process. The results show that the proposed method can detect faults better than previous methods. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
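    A minimal sketch of the static, least-squares case described above: fit a KPI model under normal operation, then flag a fault when the KPI deviates from its prediction. The data, coefficients, and threshold rule are invented for illustration and are not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Training data: process variables X and KPI y under normal operation
X = rng.normal(size=(500, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.01 * rng.normal(size=500)

# Static KPI model by ordinary least squares (the paper's static case)
theta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Detection threshold from the training residuals (mean + 6 sigma here)
resid = y - X @ theta
limit = np.abs(resid).mean() + 6 * resid.std()

def kpi_fault(x_new, y_new):
    """Flag a fault when the KPI deviates from its LS prediction."""
    return abs(y_new - x_new @ theta) > limit

x = np.array([0.5, 0.5, 0.5])
print(kpi_fault(x, x @ np.array([1.0, -2.0, 0.5])))        # normal -> False
print(kpi_fault(x, x @ np.array([1.0, -2.0, 0.5]) + 1.0))  # offset KPI -> True
```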

  2. Advanced process control framework initiative

    NASA Astrophysics Data System (ADS)

    Hill, Tom; Nettles, Steve

    1997-01-01

    The semiconductor industry, one of the world's most fiercely competitive industries, is driven by increasingly complex process technologies and global competition to improve cycle time, quality, and process flexibility. Due to the complexity of these problems, current process control techniques are generally nonautomated, time-consuming, reactive, nonadaptive, and focused on individual fabrication tools and processes. As the semiconductor industry moves into higher density processes, radical new approaches are required. To address the need for advanced factory-level process control in this environment, Honeywell, Advanced Micro Devices (AMD), and SEMATECH formed the Advanced Process Control Framework Initiative (APCFI) joint research project. The project defines and demonstrates an Advanced Process Control (APC) approach based on SEMATECH's Computer Integrated Manufacturing (CIM) Framework. Its scope includes the coordination of Manufacturing Execution Systems, process control tools, and wafer fabrication equipment to provide necessary process control capabilities. Moreover, it takes advantage of the CIM Framework to integrate and coordinate applications from other suppliers that provide services necessary for the overall system to function. This presentation discusses the key concept of model-based process control that differentiates the APC Framework. This major improvement over current methods enables new systematic process control by linking the knowledge of key process settings to desired product characteristics that reside in models created with commercial model development tools. The unique framework-based approach facilitates integration of commercial tools and reuse of their data by tying them together in an object-based structure. The presentation also explores the perspective of each organization's involvement in the APCFI project. Each has complementary goals and expertise to contribute: Honeywell represents the supplier viewpoint, AMD represents the user viewpoint.

  3. Lubricant Coating Process

    NASA Technical Reports Server (NTRS)

    1989-01-01

    "Peen Plating," a NASA developed process for applying molybdenum disulfide, is the key element of Techniblast Co.'s SURFGUARD process for applying high strength solid lubricants. The process requires two machines -- one for cleaning and one for coating. The cleaning step allows the coating to be bonded directly to the substrate to provide a better "anchor." The coating machine applies a half a micron thick coating. Then, a blast gun, using various pressures to vary peening intensities for different applications, fires high velocity "media" -- peening hammers -- ranging from plastic pellets to steel shot. Techniblast was assisted by Rural Enterprises, Inc. Coating service can be performed at either Techniblast's or a customer's facility.

  4. Liquefaction processes and systems and liquefaction process intermediate compositions

    DOEpatents

    Schmidt, Andrew J.; Hart, Todd R.; Billing, Justin M.; Maupin, Gary D.; Hallen, Richard T.; Anderson, Daniel B.

    2014-07-12

    Liquefaction processes are provided that can include: providing a biomass slurry solution having a temperature of at least 300 °C at a pressure of at least 2000 psig; cooling the solution to a temperature of less than 150 °C; and depressurizing the solution to release carbon dioxide from the solution and form at least part of a bio-oil foam. Liquefaction processes are also provided that can include: filtering the biomass slurry to remove particulates; and cooling and depressurizing the filtered solution to form the bio-oil foam. Liquefaction systems are provided that can include: a heated biomass slurry reaction zone maintained above 300 °C and at least 2000 psig and in continuous fluid communication with a flash cooling/depressurization zone maintained below 150 °C and between about 125 psig and about atmospheric pressure. Liquefaction systems are also provided that can include a foam/liquid separation system. Liquefaction process intermediate compositions are provided that can include a bio-oil foam phase separated from an aqueous biomass solids solution.

  5. [Preliminary processing, processing and usage of Dendrobii Caulis in history].

    PubMed

    Yang, Wen-yu; Tang, Sheng; Shi, Dong-jun; Chen, Xiang-gui; Li, Ming-yuan; Tang, Xian-fu; Yuan, Chang-jiang

    2015-07-01

    On account of the dense cuticles of the fresh stem and the light, hard and pliable texture of the dried stem, Dendrobii Caulis is difficult to dry or pulverize. So, it was very important to the ancient doctors that Dendrobii Caulis be properly treated and applied to keep or evoke its medicinal effects. The current textual research results about the preliminary processing, processing and usage methods of Dendrobii Caulis showed that: (1) In history the clinical use of fresh or processed Dendrobii Caulis in teas and tinctures was very common. (2) Its roots and rhizomes would be removed before use. (3) Some ancillary approaches were applied to shorten drying times, such as rinsing with boiling mulberry-ash soup, washing or soaking with liquor, mixing with rice pulp and then basking, etc. (4) According to the ancients' knowledge, sufficient pulverization, by means of slicing, rasping, hitting or pestling techniques, was necessary for Dendrobii Caulis to take effect. (5) The heat processing methods for Dendrobii Caulis included stir-baking, stir-frying, steaming, decocting and stewing techniques, usually with liquor as an auxiliary material. Among these, steaming after pretreating with liquor was most commonly used, and this scheme was colorfully drawn in Bu Yi Lei Gong Pao Zhi Bian Lan (Ming Dynasty, 1591 CE); moreover, decocting in advance or long-time simmering so as to prepare paste products was recommended in the Qing Dynasty. (6) Some different processing programs, involving stir-baking with grit, air-tight baking on an ondol (kang), and fumigating with sulfur, which appeared in modern times and gave the drug an attractive outward appearance, went against the ancients' original intentions of ensuring drug efficacy.

  6. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (±4% of deviation between the calculated and measured doses) by calculating a control process capability (Cpc) index. The Cpc index was found to be greater than 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they…
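    An EWMA control chart of the kind the authors selected can be sketched as follows; the smoothing constant, control-limit multiplier, and dose-deviation data are illustrative, not the study's values:

```python
import numpy as np

def ewma_chart(values, target, sigma, lam=0.2, L=3.0):
    """EWMA control chart: returns the EWMA statistic and out-of-control
    flags, using the standard time-varying control limits."""
    z, stats, flags = target, [], []
    for i, x in enumerate(values, start=1):
        z = lam * x + (1 - lam) * z
        half = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
        stats.append(z)
        flags.append(abs(z - target) > half)
    return np.array(stats), np.array(flags)

# Dose deviations (%) around a 0% target with sigma = 1%; a slow drift
# starts halfway through, and the chart should flag it before it
# reaches the +-4% clinical tolerance.
data = np.concatenate([np.zeros(10), np.linspace(0.5, 3.0, 10)])
_, flags = ewma_chart(data, target=0.0, sigma=1.0)
print(int(np.argmax(flags)))  # index of the first out-of-control point
```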

  7. Tracing the decision-making process of physicians with a Decision Process Matrix.

    PubMed

    Hausmann, Daniel; Zulian, Cristina; Battegay, Edouard; Zimmerli, Lukas

    2016-10-18

    Decision-making processes in a medical setting are complex, dynamic and under time pressure, often with serious consequences for a patient's condition. The principal aim of the present study was to trace and map the individual diagnostic process of real medical cases using a Decision Process Matrix (DPM). The naturalistic decision-making processes of 11 residents and a total of 55 medical cases were recorded in an emergency department, and a DPM was drawn up according to a semi-structured technique following four steps: 1) observing and recording relevant information throughout the entire diagnostic process, 2) assessing options in terms of suspected diagnoses, 3) drawing up an initial version of the DPM, and 4) verifying the DPM, while adding the confidence ratings. The DPM comprised an average of 3.2 suspected diagnoses and 7.9 information units (cues). The following three-phase pattern could be observed: option generation, option verification, and final diagnosis determination. Residents strove for the highest possible level of confidence before making the final diagnoses (in two-thirds of the medical cases with a rating of practically certain) or excluding suspected diagnoses (with practically impossible in half of the cases). The following challenges have to be addressed in the future: real-time capturing of emerging suspected diagnoses in the memory of the physician, definition of meaningful information units, and a more contemporary measurement of confidence. The DPM is a useful tool for tracing real and individual diagnostic processes. The methodological approach with the DPM allows further investigations into the underlying cognitive diagnostic processes on a theoretical level and improvement of individual clinical reasoning skills in practice.

  8. Processing of plastics

    PubMed Central

    Spaak, Albert

    1975-01-01

    An overview is given of the processing of plastic materials from the handling of polymers in the pellet and powder form to manufacturing of a plastic fabricated product. Various types of equipment used and melt processing ranges of various polymer formulations to make the myriad of plastic products that are commercially available are discussed. PMID:1175556

  9. Manufacturability improvements in EUV resist processing toward NXE:3300 processing

    NASA Astrophysics Data System (ADS)

    Kuwahara, Yuhei; Matsunaga, Koichi; Shimoaoki, Takeshi; Kawakami, Shinichiro; Nafus, Kathleen; Foubert, Philippe; Goethals, Anne-Marie; Shimura, Satoru

    2014-03-01

    As the design rules of semiconductor processes get finer, extreme ultraviolet lithography (EUVL) technology is aggressively studied as a process for 22 nm half pitch and beyond. At present, the studies for EUV focus on manufacturability. It requires fine resolution, uniform, smooth patterns and low defectivity, not only after lithography but also after the etch process. In the first half of 2013, a CLEAN TRACK(TM) LITHIUS Pro(TM) Z-EUV was installed at imec for POR development in preparation for the ASML NXE:3300. This next-generation coating/developing system is equipped with state-of-the-art defect reduction technology. This tool with advanced functions can achieve low defect levels. This paper reports on the progress towards manufacturing defectivity levels and the latest optimizations towards the NXE:3300 POR for both lines/spaces and contact holes at imec.

  10. [Sociophysiology: basic processes of empathy].

    PubMed

    Haker, Helene; Schimansky, Jenny; Rössler, Wulf

    2010-01-01

    The aim of this review is to describe the sociophysiological and social-cognitive processes that underlie the complex phenomenon of human empathy. Automatic reflexive processes such as physiological contagion and action mirroring are mediated by the mirror neuron system. They are a basis for further processing of social signals and a physiological link between two individuals. This link comprises the simultaneous activation of shared motor representations. Shared representations lead implicitly, via individual associations in the limbic and vegetative systems, to a shared affective state. These processes are called sociophysiology. Further controlled-reflective, self-referential processing of those social signals leads to explicit, conscious representations of others' minds. Those higher-order processes are called social cognition. The interaction of physiological and cognitive social processes gives rise to the phenomenon of human empathy.

  11. Energy saving processes for nitrogen removal in organic wastewater from food processing industries in Thailand.

    PubMed

    Johansen, N H; Suksawad, N; Balslev, P

    2004-01-01

    Nitrogen removal from organic wastewater is becoming a demand in developed communities. The use of nitrite as an intermediate in the treatment of wastewater has been largely ignored, but is actually a relevant energy-saving process compared to conventional nitrification/denitrification using nitrate as the intermediate. Full-scale results and pilot-scale results using this process are presented. The process needs some additional process considerations and process control to be utilized. Especially under tropical conditions the nitritation process will run easily, and it must be expected that many activated sludge (AS) treatment plants in the food industry already produce NO2-N. This uncontrolled nitrogen conversion can be the main cause of sludge bulking problems. It is expected that sludge bulking problems can in many cases be solved just by changing the process control in order to run a more consistent nitritation. Theoretically this process will decrease the oxygen consumption for oxidation by 25%, and the use of carbon source for the reduction will be decreased by 40% compared to the conventional process.
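    The stated 25% oxygen and 40% carbon savings follow from the stoichiometry of the nitrite shortcut; a quick check, assuming the standard half-reactions (these figures are a textbook derivation, not taken from the paper's own calculations):

```python
# Oxygen side: ammonium oxidized only to nitrite skips the nitrite ->
# nitrate step, which costs an extra 0.5 mol O2 per mol N.
o2_to_nitrite = 1.5   # mol O2 per mol NH4+ oxidized to NO2-
o2_to_nitrate = 2.0   # mol O2 per mol NH4+ oxidized to NO3-
oxygen_saving = 1 - o2_to_nitrite / o2_to_nitrate

# Carbon side: denitrifying from nitrite transfers 3 electrons per N
# instead of 5, so less organic carbon (electron donor) is needed.
e_from_nitrite = 3    # electrons to reduce NO2- to N2 (per N)
e_from_nitrate = 5    # electrons to reduce NO3- to N2 (per N)
carbon_saving = 1 - e_from_nitrite / e_from_nitrate

print(oxygen_saving, carbon_saving)  # 0.25 and 0.4, matching the abstract
```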

  12. Layer-based buffer aware rate adaptation design for SHVC video streaming

    NASA Astrophysics Data System (ADS)

    Gudumasu, Srinivas; Hamza, Ahmed; Asbun, Eduardo; He, Yong; Ye, Yan

    2016-09-01

    This paper proposes a layer-based, buffer-aware rate adaptation design which is able to avoid abrupt video quality fluctuation, reduce re-buffering latency and improve bandwidth utilization when compared to a conventional simulcast-based adaptive streaming system. The proposed adaptation design schedules DASH segment requests based on the estimated bandwidth, dependencies among video layers and layer buffer fullness. Scalable HEVC video coding is the latest state-of-the-art video coding technique that can alleviate various issues caused by simulcast-based adaptive video streaming. With scalable coded video streams, the video is encoded once into a number of layers representing different qualities and/or resolutions: a base layer (BL) and one or more enhancement layers (EL), each incrementally enhancing the quality of the lower layers. Such a layer-based coding structure allows fine-granularity rate adaptation for video streaming applications. Two video streaming use cases are presented in this paper. The first use case is to stream HD SHVC video over a wireless network where available bandwidth varies, and a performance comparison between the proposed layer-based streaming approach and the conventional simulcast streaming approach is provided. The second use case is to stream 4K/UHD SHVC video over a hybrid access network that consists of a 5G millimeter wave high-speed wireless link and a conventional wired or WiFi network. The simulation results verify that the proposed layer-based rate adaptation approach is able to utilize the bandwidth more efficiently. As a result, a more consistent viewing experience with higher quality video content and minimal video quality fluctuations can be presented to the user.
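    The scheduling idea (request as many layers as the estimated bandwidth and buffer allow, falling back to the base layer when the buffer runs low) can be sketched as follows; the bitrates, buffer threshold, and safety margin are invented for illustration, not the paper's parameters:

```python
def select_layers(layer_rates, est_bw, buffer_s, low_s=5.0, safety=0.8):
    """Pick how many SHVC layers (BL + ELs) to request for the next segment.

    layer_rates: cumulative bitrate (Mbps) when decoding layers 0..i
    est_bw: estimated bandwidth (Mbps); buffer_s: buffer fullness (s).
    When the buffer is low, fall back to the base layer to avoid
    re-buffering; otherwise take the largest set of layers whose
    cumulative rate fits within a safety margin of the bandwidth.
    """
    if buffer_s < low_s:
        return 1  # base layer only
    n = 1
    for i, rate in enumerate(layer_rates, start=1):
        if rate <= safety * est_bw:
            n = i
    return n

rates = [1.0, 3.0, 8.0]  # BL, BL+EL1, BL+EL1+EL2 cumulative Mbps
print(select_layers(rates, est_bw=12.0, buffer_s=20.0))  # all 3 layers
print(select_layers(rates, est_bw=5.0, buffer_s=20.0))   # BL + EL1
print(select_layers(rates, est_bw=12.0, buffer_s=2.0))   # BL only (low buffer)
```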

  13. Living olefin polymerization processes

    DOEpatents

    Schrock, Richard R.; Baumann, Robert

    1999-01-01

    Processes for the living polymerization of olefin monomers with terminal carbon-carbon double bonds are disclosed. The processes employ initiators that include a metal atom and a ligand having two group 15 atoms and a group 16 atom or three group 15 atoms. The ligand is bonded to the metal atom through two anionic or covalent bonds and a dative bond. The initiators are particularly stable under reaction conditions in the absence of olefin monomer. The processes provide polymers having low polydispersities, especially block copolymers having low polydispersities. It is an additional advantage of these processes that, during block copolymer synthesis, a relatively small amount of homopolymer is formed.

  14. Living olefin polymerization processes

    DOEpatents

    Schrock, R.R.; Baumann, R.

    1999-03-30

    Processes for the living polymerization of olefin monomers with terminal carbon-carbon double bonds are disclosed. The processes employ initiators that include a metal atom and a ligand having two group 15 atoms and a group 16 atom or three group 15 atoms. The ligand is bonded to the metal atom through two anionic or covalent bonds and a dative bond. The initiators are particularly stable under reaction conditions in the absence of olefin monomer. The processes provide polymers having low polydispersities, especially block copolymers having low polydispersities. It is an additional advantage of these processes that, during block copolymer synthesis, a relatively small amount of homopolymer is formed.

  15. Living olefin polymerization processes

    DOEpatents

    Schrock, Richard R.; Baumann, Robert

    2003-08-26

    Processes for the living polymerization of olefin monomers with terminal carbon-carbon double bonds are disclosed. The processes employ initiators that include a metal atom and a ligand having two group 15 atoms and a group 16 atom or three group 15 atoms. The ligand is bonded to the metal atom through two anionic or covalent bonds and a dative bond. The initiators are particularly stable under reaction conditions in the absence of olefin monomer. The processes provide polymers having low polydispersities, especially block copolymers having low polydispersities. It is an additional advantage of these processes that, during block copolymer synthesis, a relatively small amount of homopolymer is formed.

  16. Living olefin polymerization processes

    DOEpatents

    Schrock, Richard R.; Bauman, Robert

    2006-11-14

    Processes for the living polymerization of olefin monomers with terminal carbon-carbon double bonds are disclosed. The processes employ initiators that include a metal atom and a ligand having two group 15 atoms and a group 16 atom or three group 15 atoms. The ligand is bonded to the metal atom through two anionic or covalent bonds and a dative bond. The initiators are particularly stable under reaction conditions in the absence of olefin monomer. The processes provide polymers having low polydispersities, especially block copolymers having low polydispersities. It is an additional advantage of these processes that, during block copolymer synthesis, a relatively small amount of homopolymer is formed.

  17. Industrial process surveillance system

    DOEpatents

    Gross, K.C.; Wegerich, S.W.; Singer, R.M.; Mott, J.E.

    1998-06-09

    A system and method are disclosed for monitoring an industrial process and/or industrial data source. The system includes generating time-varying data from industrial data sources, processing the data to obtain time correlation of the data, determining the range of the data, determining learned states of normal operation and using these states to generate expected values, comparing the expected values to current actual values to identify the current state of the process closest to a learned normal state, generating a set of modeled data, and processing the modeled data to identify a data pattern, generating an alarm upon detecting a deviation from normalcy. 96 figs.
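    A toy illustration of the surveillance idea (find the closest learned normal state, then alarm on deviation); this is an illustrative sketch, not the patented method:

```python
# Sketch: compare current readings to the nearest learned "normal"
# operating state and raise an alarm when the deviation is too large.
import math

def closest_state(learned_states, current):
    """Return the learned state nearest to the current reading vector."""
    return min(learned_states, key=lambda s: math.dist(s, current))

def check(learned_states, current, threshold):
    expected = closest_state(learned_states, current)
    deviation = math.dist(expected, current)
    return ("ALARM" if deviation > threshold else "OK", expected)

states = [(10.0, 1.0), (20.0, 2.0)]      # learned normal operating points
print(check(states, (10.3, 1.1), 0.5))   # small deviation -> OK
print(check(states, (15.0, 5.0), 0.5))   # far from any normal state -> ALARM
```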

  18. Neuroscientific Model of Motivational Process

    PubMed Central

    Kim, Sung-il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three subprocesses: generating motivation, maintaining motivation, and regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous subprocesses, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area), in which basic stimulus-action associations are formed, and is classified as an automatic motivation to which relatively little attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to the regulation of motivation. These three subprocesses interact with each other by sending reward prediction error signals through the dopaminergic pathway from the striatum to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment. PMID:23459598

  19. Neuroscientific model of motivational process.

    PubMed

    Kim, Sung-Il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three subprocesses: generating motivation, maintaining motivation, and regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous subprocesses, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area), in which basic stimulus-action associations are formed, and is classified as an automatic motivation to which relatively little attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to the regulation of motivation. These three subprocesses interact with each other by sending reward prediction error signals through the dopaminergic pathway from the striatum to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.

  20. Design of production process main shaft process with lean manufacturing to improve productivity

    NASA Astrophysics Data System (ADS)

    Siregar, I.; Nasution, A. A.; Andayani, U.; Anizar; Syahputri, K.

    2018-02-01

    The object of this research is a manufacturing company that produces oil palm machinery parts. In the production process there are delays in the completion of main shaft orders. Delays in the completion of orders indicate low productivity of the company in terms of resource utilization. This study aimed to obtain a draft improvement of the production process that can improve productivity by identifying and eliminating activities that do not add value (non-value-added activity). One approach that can be used to reduce and eliminate non-value-added activity is Lean Manufacturing. This study focuses on the identification of non-value-added activity with value stream mapping analysis tools, while the elimination of non-value-added activity is done with the 5 whys tool and the implementation of a pull demand system. The research shows that non-value-added activity in the production process of the main shaft amounts to 9,509.51 minutes of a total lead time of 10,804.59 minutes. This means the efficiency level (Process Cycle Efficiency) in the production process of the main shaft is still very low, at 11.89%. Estimates for the improved design show the total lead time decreasing to 4,355.08 minutes and the process cycle efficiency rising to 29.73%, indicating that the process approaches the lean production concept.
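    The reported efficiencies can be approximately reproduced from the quoted times, taking value-added time as total lead time minus non-value-added time:

```python
# Process Cycle Efficiency (PCE) arithmetic behind the abstract's figures.

def pce(total_lead_time, non_value_added):
    value_added = total_lead_time - non_value_added
    return value_added / total_lead_time

value_added = 10804.59 - 9509.51          # ~1295 minutes of value-added work
before = pce(10804.59, 9509.51)           # ~12%, close to the reported 11.89%
after = pce(4355.08, 4355.08 - value_added)  # ~29.7%, the reported 29.73%
print(f"before: {before:.2%}, after: {after:.2%}")
```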

  1. Real-time monitoring of clinical processes using complex event processing and transition systems.

    PubMed

    Meinecke, Sebastian

    2014-01-01

    Dependencies between tasks in clinical processes are often complex and error-prone. Our aim is to describe a new approach for the automatic derivation of clinical events identified via the behaviour of IT systems using Complex Event Processing. Furthermore we map these events on transition systems to monitor crucial clinical processes in real-time for preventing and detecting erroneous situations.
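    A minimal transition-system monitor in the spirit the abstract describes, with invented clinical states and events (not the paper's model):

```python
# Allowed transitions of a toy clinical process: (state, event) -> next state.
TRANSITIONS = {
    ("ordered",     "specimen_received"): "received",
    ("received",    "analysis_started"):  "in_analysis",
    ("in_analysis", "report_signed"):     "reported",
}

def monitor(events, state="ordered"):
    """Replay IT-system events; flag any event the process model forbids."""
    for ev in events:
        nxt = TRANSITIONS.get((state, ev))
        if nxt is None:
            return f"ERROR: '{ev}' not allowed in state '{state}'"
        state = nxt
    return state

print(monitor(["specimen_received", "analysis_started", "report_signed"]))
print(monitor(["analysis_started"]))   # skipped step -> error
```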

  2. Composing Models of Geographic Physical Processes

    NASA Astrophysics Data System (ADS)

    Hofer, Barbara; Frank, Andrew U.

    Processes are central to geographic information science; yet geographic information systems (GIS) lack capabilities to represent process-related information. A prerequisite to including processes in GIS software is a general method to describe geographic processes independently of application disciplines. This paper presents such a method, namely a process description language. The vocabulary of the process description language is derived formally from mathematical models. Physical processes in geography can be described in two equivalent languages: partial differential equations or partial difference equations, where the latter can be shown graphically and used as a method for application specialists to enter their process models. The vocabulary of the process description language comprises components for describing the general behavior of prototypical geographic physical processes. These process components can be composed into basic models of geographic physical processes, which is shown by means of an example.
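    As a concrete instance of the partial-difference-equation view, a one-dimensional diffusion-type geographic process (e.g. heat or moisture spreading) can be written as an explicit finite-difference update; this is an illustrative sketch, not the paper's description language:

```python
def diffusion_step(u, alpha=0.25):
    """One time step of du/dt = D * d2u/dx2 on a 1-D grid.

    alpha = D*dt/dx^2 must be <= 0.5 for the explicit scheme to be stable.
    """
    return [u[i] if i in (0, len(u) - 1)            # fixed boundaries
            else u[i] + alpha * (u[i-1] - 2*u[i] + u[i+1])
            for i in range(len(u))]

u = [0.0, 0.0, 1.0, 0.0, 0.0]   # initial concentration spike
for _ in range(3):
    u = diffusion_step(u)
print(u)   # the spike spreads symmetrically outward
```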

  3. Eigenforms, Discrete Processes and Quantum Processes

    NASA Astrophysics Data System (ADS)

    Kauffman, Louis H.

    2012-05-01

    This essay is a discussion of the concept of eigenform, due to Heinz von Foerster, and its relationship with discrete physics and quantum mechanics. We interpret the square root of minus one as a simple oscillatory process - a clock, and as an eigenform. By taking a generalization of this identification of i as a clock and eigenform, we show how quantum mechanics emerges from discrete physics.
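    The identification of i with a clock can be written out as a short fixed-point derivation (a sketch of the idea, not a quotation from the essay):

```latex
% i as the fixed point (eigenform) of the operation x -> -1/x:
\[
x = -\frac{1}{x} \;\Longrightarrow\; x^{2} = -1 \;\Longrightarrow\; x = i
\]
% Iterating t -> -1/t from any nonzero start value t produces the
% period-two oscillation t, -1/t, t, -1/t, ... (a "clock"), and i is
% the value left unchanged by the operation, i.e. its eigenform.
```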

  4. 40 CFR 63.138 - Process wastewater provisions-performance standards for treatment processes managing Group 1...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .../or Table 9 compounds are similar and often identical. (3) Biological treatment processes. Biological treatment processes in compliance with this section may be either open or closed biological treatment processes as defined in § 63.111. An open biological treatment process in compliance with this section need...

  5. Reforming process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitsche, R.T.; Pope, G.N.

    A process for reforming a naphtha feedstock is disclosed. The reforming process is effected at reforming conditions in contact with a catalyst comprising a platinum group metal component and a Group IV-A metal component composited with an alumina support, wherein said support is prepared by admixing an alpha-alumina monohydrate with an aqueous ammoniacal solution having a pH of at least about 7.5 to form a stable suspension. A salt of a strong acid, e.g., aluminum nitrate, is commingled with the suspension to form an extrudable paste or dough. On extrusion, the extrudate is dried and calcined to form said alumina support.

  6. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes.

    PubMed

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Oscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-07-15

    Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. No previous experience with the use of this notation for process modelling in Pathology, whether in Spain or elsewhere, is known. We present our experience in the elaboration of the conceptual models of Pathology processes, as part of a global programmed surgical patient process, using BPMN. With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Dep. of Technologies and Information Systems of the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The collaborative work was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, process description by hospital experts, and process modelling. The modelling of the processes of Anatomic Pathology using BPMN is presented. The subprocesses presented are those corresponding to the surgical pathology examination of samples coming from the operating theatre, including the planning and realization of frozen studies. The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model, in which management and improvements are more easily implemented by health professionals.

  7. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes

    PubMed Central

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Óscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-01-01

    Background Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. No previous experience with the use of this notation for process modelling in Pathology, whether in Spain or elsewhere, is known. We present our experience in the elaboration of the conceptual models of Pathology processes, as part of a global programmed surgical patient process, using BPMN. Methods With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Dep. of Technologies and Information Systems of the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The collaborative work was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, process description by hospital experts, and process modelling. Results The modelling of the processes of Anatomic Pathology using BPMN is presented. The subprocesses presented are those corresponding to the surgical pathology examination of samples coming from the operating theatre, including the planning and realization of frozen studies. Conclusion The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model, in which management and improvements are more easily implemented by health professionals. PMID:18673511

  8. Food-Processing Wastes.

    PubMed

    Frenkel, Val S; Cummings, Gregg A; Maillacheruvu, K Y; Tang, Walter Z

    2015-10-01

    Literature published in 2014 and early 2015 related to food processing wastes treatment for industrial applications is reviewed. This review is a subsection of the Treatment Systems section of the annual Water Environment Federation literature review and covers the following food processing industries and applications: general, meat and poultry, fruits and vegetables, dairy and beverage, and miscellaneous treatment of food wastes.

  9. Food-Processing Wastes.

    PubMed

    Frenkel, Val S; Cummings, Gregg A; Maillacheruvu, K Y; Tang, Walter Z

    2017-10-01

    Literature published in 2016 and early 2017 related to food processing wastes treatment for industrial applications is reviewed. This review is a subsection of the Treatment Systems section of the annual Water Environment Federation literature review and covers the following food processing industries and applications: general, meat and poultry, fruits and vegetables, dairy and beverage, and miscellaneous treatment of food wastes.

  10. Food-Processing Wastes.

    PubMed

    Frenkel, Val S; Cummings, Gregg A; Maillacheruvu, K Y; Tang, Walter Z

    2016-10-01

    Literature published in 2015 and early 2016 related to food processing wastes treatment for industrial applications is reviewed. This review is a subsection of the Treatment Systems section of the annual Water Environment Federation literature review and covers the following food processing industries and applications: general, meat and poultry, fruits and vegetables, dairy and beverage, and miscellaneous treatment of food wastes.

  11. Powder treatment process

    DOEpatents

    Weyand, J.D.

    1988-02-09

    Disclosed are: (1) a process comprising spray drying a powder-containing slurry, the slurry containing a powder constituent susceptible of oxidizing under the temperature conditions of the spray drying, while reducing the tendency for oxidation of the constituent by including as a liquid constituent of the slurry an organic liquid; (2) a process comprising spray drying a powder-containing slurry, the powder having been pretreated to reduce content of a powder constituent susceptible of oxidizing under the temperature conditions of the spray drying, the pretreating comprising heating the powder to react the constituent; and (3) a process comprising reacting ceramic powder, grinding the reacted powder, slurrying the ground powder, spray drying the slurried powder, and blending the dried powder with metal powder. 2 figs.

  12. Powder treatment process

    DOEpatents

    Weyand, John D.

    1988-01-01

    (1) A process comprising spray drying a powder-containing slurry, the slurry containing a powder constituent susceptible of oxidizing under the temperature conditions of the spray drying, while reducing the tendency for oxidation of the constituent by including as a liquid constituent of the slurry an organic liquid; (2) a process comprising spray drying a powder-containing slurry, the powder having been pretreated to reduce content of a powder constituent susceptible of oxidizing under the temperature conditions of the spray drying, the pretreating comprising heating the powder to react the constituent; and (3) a process comprising reacting ceramic powder, grinding the reacted powder, slurrying the ground powder, spray drying the slurried powder, and blending the dried powder with metal powder.

  13. Research in Stochastic Processes

    DTIC Science & Technology

    1988-08-31

    stationary sequence, Stochastic Proc. Appl. 29, 1988, 155-169; T. Hsing, J. Husler and M.R. Leadbetter, On the exceedance point process for a stationary... Nandagopalan, On exceedance point processes for "regular" sample functions, Proc. Volume, Oberwolfach Conf. on Extreme Value Theory, J. Husler and R. Reiss... exceedance point processes for stationary sequences under mild oscillation restrictions, Apr. 88. Oberwolfach Conf. on Extreme Value Theory. Ed. J. Husler

  14. Heat Transfer Processes for the Thermal Energy Balance of Organisms. Physical Processes in Terrestrial and Aquatic Ecosystems, Transport Processes.

    ERIC Educational Resources Information Center

    Stevenson, R. D.

    This module is part of a series designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. This module describes heat transfer processes involved in the exchange of heat…

  15. Process Intensification for Cellulosic Biorefineries.

    PubMed

    Sadula, Sunitha; Athaley, Abhay; Zheng, Weiqing; Ierapetritou, Marianthi; Saha, Basudeb

    2017-06-22

    Utilization of renewable carbon sources, especially non-food biomass, is critical to addressing climate change and future energy challenges. Current chemical and enzymatic processes for producing cellulosic sugars are multistep, and energy- and water-intensive. Techno-economic analysis (TEA) suggests that upstream lignocellulose processing is a major hurdle to the economic viability of cellulosic biorefineries. Process intensification, which integrates processes and uses less water and energy, has the potential to overcome the aforementioned challenges. Here, we demonstrate a one-pot depolymerization and saccharification process for woody biomass, energy crops, and agricultural residues that produces soluble sugars with high yields. Lignin is separated as a solid for selective upgrading. Further integration of our upstream process with a reactive extraction step enables energy-efficient separation of the sugars in the form of furans. TEA reveals that the process efficiency and integration enable, for the first time, economic production of feed streams that could profoundly improve process economics for downstream cellulosic bioproducts. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. DEFINITIVE SOX CONTROL PROCESS EVALUATIONS: LIMESTONE, DOUBLE ALKALI, AND CITRATE FGD PROCESSES

    EPA Science Inventory

    The report gives results of a detailed comparative technical and economic evaluation of limestone slurry, generic double alkali, and citrate flue gas desulfurization (FGD) processes, assuming proven technology and using representative power plant, process design, and economic pre...

  17. Supporting Cross-Organizational Process Control

    NASA Astrophysics Data System (ADS)

    Angelov, Samuil; Vonk, Jochem; Vidyasankar, Krishnamurthy; Grefen, Paul

    E-contracts express the rights and obligations of parties through a formal, digital representation of the contract provisions. In process intensive relationships, e-contracts contain business processes that a party promises to perform for the counter party, optionally allowing monitoring of the execution of the promised processes. In this paper, we describe an approach in which the counter party is allowed to control the process execution. This approach will lead to more flexible and efficient business relations which are essential in the context of modern, highly dynamic and complex collaborations among companies. We present a specification of the process controls available to the consumer and their support in the private process specification of the provider.

  18. Central waste processing system

    NASA Technical Reports Server (NTRS)

    Kester, F. L.

    1973-01-01

    A new concept for processing spacecraft type wastes has been evaluated. The feasibility of reacting various waste materials with steam at temperatures of 538 - 760 C in both a continuous and batch reactor with residence times from 3 to 60 seconds has been established. Essentially complete gasification is achieved. Product gases are primarily hydrogen, carbon dioxide, methane, and carbon monoxide. Water soluble synthetic wastes are readily processed in a continuous tubular reactor at concentrations up to 20 weight percent. The batch reactor is able to process wet and dry wastes at steam to waste weight ratios from 2 to 20. Feces, urine, and synthetic wastes have been successfully processed in the batch reactor.

  19. Certification-Based Process Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Russell L.

    2013-01-01

    Space mission architects are often challenged with knowing which investment in technology infusion will have the highest return. Certification-based analysis (CBA) gives architects and technologists a means to communicate the risks and advantages of infusing technologies at various points in a process. Various alternatives can be compared, and requirements based on supporting streamlining or automation can be derived and levied on candidate technologies. CBA is a technique for analyzing a process and identifying potential areas of improvement. The process and analysis products are used to communicate between technologists and architects. Process means any of the standard representations of a production flow; in this case, any individual steps leading to products, which feed into other steps, until the final product is produced at the end. This sort of process is common for space mission operations, where a set of goals is eventually reduced to a fully vetted command sequence to be sent to the spacecraft. Fully vetting a product is synonymous with certification. For some types of products, this is referred to as verification and validation, and for others it is referred to as checking. Fundamentally, certification is the step in the process where one ensures that a product works as intended and contains no flaws.

  20. Laminar soot processes

    NASA Technical Reports Server (NTRS)

    Sunderland, P. B.; Lin, K.-C.; Faeth, G. M.

    1995-01-01

    Soot processes within hydrocarbon fueled flames are important because they affect the durability and performance of propulsion systems, the hazards of unwanted fires, the pollutant and particulate emissions from combustion processes, and the potential for developing computational combustion. Motivated by these observations, the present investigation is studying soot processes in laminar diffusion and premixed flames in order to better understand the soot and thermal radiation emissions of luminous flames. Laminar flames are being studied due to their experimental and computational tractability, noting the relevance of such results to practical turbulent flames through the laminar flamelet concept. Weakly-buoyant and nonbuoyant laminar diffusion flames are being considered because buoyancy affects soot processes in flames while most practical flames involve negligible effects of buoyancy. Thus, low-pressure weakly-buoyant flames are being observed during ground-based experiments while near atmospheric pressure nonbuoyant flames will be observed during space flight experiments at microgravity. Finally, premixed laminar flames also are being considered in order to observe some aspects of soot formation for simpler flame conditions than diffusion flames. The main emphasis of current work has been on measurements of soot nucleation and growth in laminar diffusion and premixed flames.

  1. Development of wide band digital receiver for atmospheric radars using COTS board based SDR

    NASA Astrophysics Data System (ADS)

    Yasodha, Polisetti; Jayaraman, Achuthan; Thriveni, A.

    2016-07-01

    Digital receiver extracts the received echo signal information, and is a potential subsystem for atmospheric radar, also referred to as wind profiling radar (WPR), which provides the vertical profiles of the 3-dimensional wind vector in the atmosphere. This paper presents the development of a digital receiver using a COTS-board-based software defined radio technique, which can be used for atmospheric radars. The developmental work is being carried out at the National Atmospheric Research Laboratory (NARL), Gadanki. The digital receiver consists of a commercially available software defined radio (SDR) board called the universal software radio peripheral B210 (USRP B210) and a personal computer. The USRP B210 operates over a wide frequency range from 70 MHz to 6 GHz and hence can be used for a variety of radars, such as Doppler weather radars operating in the S/C bands, in addition to wind profiling radars operating in the VHF, UHF and L bands. Due to the flexibility and re-configurability of SDR, where the component functionalities are implemented in software, it is easy to modify the software to receive the echoes and process them as required for the type of radar intended. Hence, the USRP B210 board along with the computer forms a versatile digital receiver from 70 MHz to 6 GHz. It has an inbuilt direct conversion transceiver with two transmit and two receive channels, which can be operated in a fully coherent 2x2 MIMO fashion, and thus it can be used as a two-channel receiver. Multiple USRP B210 boards can be synchronized using the pulse per second (PPS) input provided on the board, to configure a multi-channel digital receiver system. The RF gain of the transceiver can be varied from 0 to 70 dB. The board is controlled from the computer via a USB 3.0 interface through the USRP hardware driver (UHD), which is an open-source cross-platform driver. Reference (10 MHz) clock signal from the radar master oscillator
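    The kind of pulse-to-pulse Doppler processing such a receiver performs on its I/Q samples can be sketched with a toy pulse-pair estimator; all parameters and numbers here are invented examples, not NARL's configuration:

```python
# Toy pulse-pair Doppler estimate of the kind a wind-profiler digital
# receiver performs on the I/Q sample stream from the SDR front end.
import cmath
import math

PRF = 1000.0          # pulse repetition frequency, Hz (assumed)
WAVELENGTH = 0.5607   # m, e.g. a ~535 MHz UHF profiler (assumed)

def doppler_velocity(iq):
    """Pulse-pair estimator: mean phase step between successive pulses."""
    acf = sum(b * a.conjugate() for a, b in zip(iq, iq[1:]))
    phase = cmath.phase(acf)                 # radians advanced per pulse
    doppler_hz = phase * PRF / (2 * math.pi)
    return doppler_hz * WAVELENGTH / 2       # radial velocity, m/s

# Synthesize echoes from a scatterer moving at 5 m/s and recover the velocity.
v_true = 5.0
fd = 2 * v_true / WAVELENGTH                 # Doppler shift, Hz
iq = [cmath.exp(2j * math.pi * fd * n / PRF) for n in range(64)]
print(round(doppler_velocity(iq), 3))        # recovers ~5.0 m/s
```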

  2. Silicon production process evaluations

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The chemical engineering analysis of the preliminary process design of a process for producing solar cell grade silicon from dichlorosilane is presented. A plant to produce 1,000 MT/yr of silicon is analyzed. Progress and status for the plant design are reported for the primary activities of base case conditions (60 percent), reaction chemistry (50 percent), process flow diagram (35 percent), energy balance (10 percent), property data (10 percent) and equipment design (5 percent).

  3. NASA Hazard Analysis Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This viewgraph presentation reviews the NASA hazard analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts.

  4. Hydrogen recovery process

    DOEpatents

    Baker, Richard W.; Lokhandwala, Kaaeid A.; He, Zhenjie; Pinnau, Ingo

    2000-01-01

    A treatment process for a hydrogen-containing off-gas stream from a refinery, petrochemical plant or the like. The process includes three separation steps: condensation, membrane separation and hydrocarbon fraction separation. The membrane separation step is characterized in that it is carried out under conditions at which the membrane exhibits a selectivity in favor of methane over hydrogen of at least about 2.5.
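
    As a rough illustration of what a methane-over-hydrogen selectivity of 2.5 implies, the sketch below computes the permeate composition for a binary feed under the simplifying assumption of negligible permeate-side pressure; the feed composition is invented for illustration and is not from the patent.

```python
# Simplified sketch, not the patent's model: binary CH4/H2 feed, permeate-side
# partial pressures assumed negligible; feed composition is hypothetical.
alpha = 2.5          # CH4-over-H2 selectivity (the patent's stated minimum)
x_ch4 = 0.30         # assumed feed methane mole fraction
x_h2 = 1.0 - x_ch4

# Permeate mole fraction follows from the ratio of permeation fluxes.
y_ch4 = alpha * x_ch4 / (alpha * x_ch4 + x_h2)
print(f"permeate CH4 mole fraction: {y_ch4:.3f}")
```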

  5. Coal liquefaction process

    DOEpatents

    Skinner, Ronald W.; Tao, John C.; Znaimer, Samuel

    1985-01-01

    This invention relates to an improved process for the production of liquid carbonaceous fuels and solvents from carbonaceous solid fuels, especially coal. The claimed improved process includes the hydrocracking of the light SRC mixed with a suitable hydrocracker solvent. The recycle of the resulting hydrocracked product, after separation and distillation, is used to produce a solvent for the hydrocracking of the light solvent refined coal.

  6. Quality-by-Design (QbD): An integrated process analytical technology (PAT) approach for a dynamic pharmaceutical co-precipitation process characterization and process design space development.

    PubMed

    Wu, Huiquan; White, Maury; Khan, Mansoor A

    2011-02-28

    The aim of this work was to develop an integrated process analytical technology (PAT) approach for dynamic pharmaceutical co-precipitation process characterization and design space development. A dynamic co-precipitation process, induced by gradually introducing water to the ternary system of naproxen-Eudragit L100-alcohol, was monitored in real time in situ via Lasentec FBRM and PVM. A 3D map of count-time-chord length revealed three distinguishable process stages: incubation, transition, and steady state. The effects of high-risk process variables (slurry temperature, stirring rate, and water addition rate) on both derived co-precipitation process rates and final chord length distribution were evaluated systematically using a 3^3 full factorial design. Critical process variables were identified via ANOVA for both transition and steady state. General linear models (GLM) were then used for parameter estimation for each critical variable. Clear trends in the effects of each critical variable during transition and steady state were found by GLM and were interpreted using fundamental process principles and Nyvlt's transfer model. Neural network models were able to link process variables with response variables at transition and steady state with R^2 of 0.88-0.98. PVM images evidenced nucleation and crystal growth. Contour plots illustrated the design space via the critical process variables' ranges. This work demonstrated the utility of the integrated PAT approach for QbD development. Published by Elsevier B.V.
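
    The 3^3 full factorial design mentioned above enumerates every combination of three levels of the three high-risk variables; a minimal sketch (with placeholder level values, not those used in the study):

```python
from itertools import product

# Sketch of a 3^3 full factorial design; level values are placeholders,
# not those used in the study.
factors = {
    "slurry_temp_C": [5, 15, 25],
    "stir_rate_rpm": [100, 200, 300],
    "water_rate_mL_min": [1, 2, 4],
}
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(f"{len(design)} runs")  # 3 * 3 * 3 = 27 runs
```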

  7. Silicon-gate CMOS/SOS processing

    NASA Technical Reports Server (NTRS)

    Ramondetta, P.

    1979-01-01

    Major silicon-gate CMOS/SOS processes are described. Sapphire substrate preparation is also discussed, as well as the following process variations: (1) the double epi process; and (2) ion implantation.

  8. Global processing takes time: A meta-analysis on local-global visual processing in ASD.

    PubMed

    Van der Hallen, Ruth; Evers, Kris; Brewaeys, Katrien; Van den Noortgate, Wim; Wagemans, Johan

    2015-05-01

    What does an individual with autism spectrum disorder (ASD) perceive first: the forest or the trees? In spite of 30 years of research and influential theories like the weak central coherence (WCC) theory and the enhanced perceptual functioning (EPF) account, the interplay of local and global visual processing in ASD remains only partly understood. Research findings vary in indicating a local processing bias or a global processing deficit, and often contradict each other. We applied a formal meta-analytic approach and combined 56 articles that tested about 1,000 ASD participants and used a wide range of stimuli and tasks to investigate local and global visual processing in ASD. Overall, results show neither enhanced local visual processing nor a deficit in global visual processing. Detailed analysis reveals a difference in the temporal pattern of the local-global balance, that is, slow global processing in individuals with ASD. Whereas task-dependent interaction effects are obtained, gender, age, and IQ of either participant group seem to have no direct influence on performance. Based on this overview of the literature, suggestions are made for future research. (c) 2015 APA, all rights reserved.
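
    A formal meta-analysis pools study-level effect sizes by inverse-variance weighting; the sketch below shows the fixed-effect version of that calculation on made-up effect sizes and variances, not data from the 56 articles.

```python
import math

# Fixed-effect (inverse-variance) pooling sketch; the effect sizes and
# variances below are invented, not taken from the 56 articles.
effects = [0.20, -0.05, 0.10, 0.00]    # hypothetical per-study effect sizes
variances = [0.04, 0.02, 0.05, 0.03]   # hypothetical sampling variances

weights = [1.0 / v for v in variances]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))
print(f"pooled effect: {pooled:.3f} (95% CI +/- {1.96 * se:.3f})")
```

    A random-effects model (as typically used when studies are heterogeneous) would additionally add a between-study variance component to each study's variance before weighting.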

  9. [Practice report: the process-based indicator dashboard. Visualising quality assurance results in standardised processes].

    PubMed

    Petzold, Thomas; Hertzschuch, Diana; Elchlep, Frank; Eberlein-Gonska, Maria

    2014-01-01

    Process management (PM) is a valuable method for the systematic analysis and structural optimisation of the quality and safety of clinical treatment. PM requires high motivation and willingness to implement change from both employees and management. Quality indicators must be defined in order to systematically measure the quality of the specified processes. One way to present comparable quality results is to use the quality indicators of the external quality assurance in accordance with Sect. 137 SGB V, a method which the Federal Joint Committee (GBA) and the institutions commissioned by the GBA have employed and consistently enhanced for more than ten years. Information on the quality of inpatient treatment is available for 30 defined subjects throughout Germany. Combining specified processes with quality indicators is beneficial for informing employees. A process-based indicator dashboard provides essential information about the treatment process, which can be used for process analysis. With continuous monitoring of these indicator results, deviations can be identified and errors remedied quickly. If due consideration is given to these indicators, they can also be used for benchmarking to identify potential process improvements. Copyright © 2014. Published by Elsevier GmbH.

  10. From Process to Product: Your Risk Process at Work

    NASA Technical Reports Server (NTRS)

    Kundrot, Craig E.; Fogarty, Jenifer; Charles, John; Buquo, Lynn; Sibonga, Jean; Alexander, David; Horn, Wayne G.; Edwards, J. Michelle

    2010-01-01

    The Space Life Sciences Directorate (SLSD) and Human Research Program (HRP) at the NASA/Johnson Space Center work together to address and manage the human health and performance risks associated with human space flight. This includes all human system requirements before, during, and after space flight, providing for research, and managing the risk of adverse long-term health outcomes for the crew. We previously described the framework and processes developed for identifying and managing these human system risks. The focus of this panel is to demonstrate how the implementation of the framework and associated processes has provided guidance in the management and communication of human system risks. The risks of early onset osteoporosis, CO2 exposure, and intracranial hypertension in particular have all benefitted from the processes developed for human system risk management. Moreover, we are continuing to develop capabilities, particularly in the area of information architecture, which will also be described. We are working to create a system whereby all risks and associated actions can be tracked and related to one another electronically. Such a system will enhance the management and communication capabilities for the human system risks, thereby increasing the benefit to researchers and flight surgeons.

  11. Social Network Supported Process Recommender System

    PubMed Central

    Ye, Yanming; Yin, Jianwei; Xu, Yueshen

    2014-01-01

    Process recommendation technologies have gained more and more attention in the field of intelligent business process modeling as an aid to process modeling. However, most existing technologies use only process structure analysis and do not take the social features of processes into account, although process modeling is complex and comprehensive in most situations. This paper studies the feasibility of applying social network research technologies to process recommendation and builds a social network system of processes based on feature similarities. Three process matching degree measurements are then presented, and the system implementation is discussed. Finally, experimental evaluations and future work are presented. PMID:24672309

  12. Social network supported process recommender system.

    PubMed

    Ye, Yanming; Yin, Jianwei; Xu, Yueshen

    2014-01-01

    Process recommendation technologies have gained more and more attention in the field of intelligent business process modeling as an aid to process modeling. However, most existing technologies use only process structure analysis and do not take the social features of processes into account, although process modeling is complex and comprehensive in most situations. This paper studies the feasibility of applying social network research technologies to process recommendation and builds a social network system of processes based on feature similarities. Three process matching degree measurements are then presented, and the system implementation is discussed. Finally, experimental evaluations and future work are presented.
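
    The paper's three matching degree measurements are not reproduced here, but the idea of scoring similarity between processes can be sketched with a simple stand-in, a Jaccard index over process activity sets:

```python
# Stand-in similarity measure (a Jaccard index over activity sets); the
# paper's own three matching degree measures are not reproduced here.
def jaccard(p1: set, p2: set) -> float:
    union = p1 | p2
    return len(p1 & p2) / len(union) if union else 1.0

# Two hypothetical processes described by their activity sets.
proc_a = {"receive_order", "check_stock", "ship", "invoice"}
proc_b = {"receive_order", "check_credit", "ship", "invoice"}
print(jaccard(proc_a, proc_b))  # 3 shared of 5 distinct activities -> 0.6
```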

  13. Influence of Process Parameters on the Process Efficiency in Laser Metal Deposition Welding

    NASA Astrophysics Data System (ADS)

    Güpner, Michael; Patschger, Andreas; Bliedtner, Jens

    Conventionally manufactured tools are often constructed entirely of a high-alloyed, expensive tool steel. An alternative way to manufacture tools is to combine a cost-efficient mild steel with a functional coating in the interaction zone of the tool. Thermal processing methods, like laser metal deposition, are always characterized by thermal distortion, and the resistance against thermal distortion decreases as the material thickness is reduced. As a consequence, special process management is necessary for the laser-based coating of thin parts or tools. The experimental approach in the present paper is to keep the energy and the mass per unit length constant by varying the laser power, the feed rate and the powder mass flow. The typical seam parameters are measured in order to characterize the cladding process, define process limits and evaluate the process efficiency. Ways to optimize dilution, angular distortion and clad height are presented.
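
    Keeping energy and mass per unit length constant while varying power, feed rate and powder mass flow follows directly from the definitions E_l = P/v and m_l = m_dot/v; a quick numeric check with hypothetical parameter values:

```python
# Energy and mass per unit length from the definitions E_l = P/v and
# m_l = m_dot/v; parameter values below are hypothetical.
def per_unit_length(power_w, feed_mm_s, powder_g_min):
    e_l = power_w / feed_mm_s                  # line energy, J/mm
    m_l = (powder_g_min / 60.0) / feed_mm_s    # line mass, g/mm
    return e_l, m_l

# Doubling power, feed rate and powder flow together leaves both invariant.
low = per_unit_length(1000.0, 10.0, 12.0)
high = per_unit_length(2000.0, 20.0, 24.0)
print(low, high)
```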

  14. Second Language Processing: When Are First and Second Languages Processed Similarly?

    ERIC Educational Resources Information Center

    Sabourin, Laura; Stowe, Laurie A.

    2008-01-01

    In this article we investigate the effects of first language (L1) on second language (L2) neural processing for two grammatical constructions (verbal domain dependency and grammatical gender), focusing on the event-related potential P600 effect, which has been found in both L1 and L2 processing. Native Dutch speakers showed a P600 effect for both…

  15. Optimum processing of mammographic film.

    PubMed

    Sprawls, P; Kitts, E L

    1996-03-01

    Underprocessing of mammographic film can result in reduced contrast and visibility of breast structures and an unnecessary increase in radiation dose to the patient. Underprocessing can be caused by physical factors (low developer temperature, inadequate development time, insufficient developer agitation) or chemical factors (developer not optimized for film type; overdiluted, underreplenished, contaminated, or frequently changed developer). Conventional quality control programs are designed to produce consistent processing but do not address the issue of optimum processing. Optimum processing is defined as the level of processing that produces the film performance characteristics (contrast and sensitivity) specified by the film manufacturer. Optimum processing of mammographic film can be achieved by following a two-step protocol. The first step is to set up the processing conditions according to recommendations from the film and developer chemistry manufacturers. The second step is to verify the processing results by comparing them with sensitometric data provided by the film manufacturer.

  16. Materials processing in space

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The feasibility and possible advantages of processing materials in a nongravitational field are considered. Areas of investigation include biomedical applications, the processing of inorganic materials, and flight programs and funding.

  17. Isothermal separation processes

    NASA Technical Reports Server (NTRS)

    England, C.

    1982-01-01

    The isothermal processes of membrane separation, supercritical extraction and chromatography were examined using availability analysis. The general approach was to derive equations that identified where energy is consumed in these processes and how they compare with conventional separation methods. These separation methods are characterized by pure work inputs, chiefly in the form of a pressure drop which supplies the required energy. Equations were derived for the energy requirement in terms of regular solution theory. This approach is believed to accurately predict the work of separation in terms of the heat of solution and the entropy of mixing. It can form the basis of a convenient calculation method for optimizing membrane and solvent properties for particular applications. Calculations were made on the energy requirements for a membrane process separating air into its components.
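
    The entropy-of-mixing term sets the minimum reversible, isothermal work to separate an ideal mixture, w_min = -RT * sum(x_i ln x_i) per mole of feed; treating air as a binary N2/O2 mixture at 298 K gives roughly 1.3 kJ/mol:

```python
import math

# Minimum reversible isothermal work to separate an ideal mixture:
# w_min = -R*T*sum(x_i * ln x_i) per mole of feed. Air as 79% N2 / 21% O2.
R = 8.314   # J/(mol K)
T = 298.0   # K
x = [0.79, 0.21]

w_min = -R * T * sum(xi * math.log(xi) for xi in x)
print(f"minimum separation work for air: {w_min:.0f} J/mol")
```

    Real membrane processes consume more than this reversible bound; the gap is the exergy destruction that the availability analysis above accounts for.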

  18. Biomimetics: process, tools and practice.

    PubMed

    Fayemi, P E; Wanieck, K; Zollfrank, C; Maranzana, N; Aoussat, A

    2017-01-23

    Biomimetics applies principles and strategies abstracted from biological systems to engineering and technological design. With a huge potential for innovation, biomimetics could evolve into a key process in businesses. Yet challenges remain within the process of biomimetics, especially from the perspective of potential users. We work to clarify the understanding of the process of biomimetics. Therefore, we briefly summarize the terminology of biomimetics and bioinspiration. The implementation of biomimetics requires a well-defined process. Therefore, we present a model of the problem-driven process of biomimetics that can be used for problem-solving activity. The process of biomimetics can be facilitated by existing tools and creative methods. We mapped a set of tools to the biomimetic process model and set up assessment sheets to evaluate the theoretical and practical value of these tools. We analyzed the tools in interdisciplinary research workshops and present the characteristics of the tools. We also present a first attempt at a utility tree which, once finalized, could be used to guide users through the process by choosing tools appropriate to their own expertise. The aim of this paper is to foster dialogue and facilitate closer collaboration within the field of biomimetics.

  19. Membrane thickening aerobic digestion processes.

    PubMed

    Woo, Bryen

    2014-01-01

    Sludge management accounts for approximately 60% of the total wastewater treatment plant expenditure and laws for sludge disposal are becoming increasingly stringent, therefore much consideration is required when designing a solids handling process. A membrane thickening aerobic digestion process integrates a controlled aerobic digestion process with pre-thickening waste activated sludge using membrane technology. This process typically features an anoxic tank, an aerated membrane thickener operating in loop with a first-stage digester followed by second-stage digestion. Membrane thickening aerobic digestion processes can handle sludge from any liquid treatment process and is best for facilities obligated to meet low total phosphorus and nitrogen discharge limits. Membrane thickening aerobic digestion processes offer many advantages including: producing a reusable quality permeate with minimal levels of total phosphorus and nitrogen that can be recycled to the head works of a plant, protecting the performance of a biological nutrient removal liquid treatment process without requiring chemical addition, providing reliable thickening up to 4% solids concentration without the use of polymers or attention to decanting, increasing sludge storage capacities in existing tanks, minimizing the footprint of new tanks, reducing disposal costs, and providing Class B stabilization.

  20. Space-time-modulated stochastic processes

    NASA Astrophysics Data System (ADS)

    Giona, Massimiliano

    2017-10-01

    Starting from the physical problem associated with the Lorentzian transformation of a Poisson-Kac process in inertial frames, the concept of space-time-modulated stochastic processes is introduced for processes possessing finite propagation velocity. This class of stochastic processes provides a two-way coupling between the stochastic perturbation acting on a physical observable and the evolution of the physical observable itself, which in turn influences the statistical properties of the stochastic perturbation during its evolution. The definition of space-time-modulated processes requires the introduction of two functions: a nonlinear amplitude modulation, controlling the intensity of the stochastic perturbation, and a time-horizon function, which modulates its statistical properties, providing irreducible feedback between the stochastic perturbation and the physical observable influenced by it. The latter property is the peculiar fingerprint of this class of models that makes them suitable for extension to generic curved-space times. Considering Poisson-Kac processes as prototypical examples of stochastic processes possessing finite propagation velocity, the balance equations for the probability density functions associated with their space-time modulations are derived. Several examples highlighting the peculiarities of space-time-modulated processes are thoroughly analyzed.
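
    The prototypical finite-propagation-velocity process referred to above, the Poisson-Kac (telegraph) process, moves at constant speed with a direction that flips at Poisson-distributed times; below is a plain, unmodulated simulation sketch (before any space-time modulation is applied), with illustrative parameter values.

```python
import random

# Poisson-Kac (telegraph) process: position moves at speed +/- b, and the
# direction flips at Poisson rate a. Plain, unmodulated sketch; parameters
# are illustrative.
def simulate_telegraph(b=1.0, a=2.0, dt=1e-3, n_steps=5000, seed=1):
    rng = random.Random(seed)
    x, direction = 0.0, 1
    for _ in range(n_steps):
        if rng.random() < a * dt:   # direction-switching (Poisson) event
            direction = -direction
        x += direction * b * dt
    return x

x_final = simulate_telegraph()
# Finite propagation velocity: |x(t)| can never exceed b * t.
assert abs(x_final) <= 1.0 * 5000 * 1e-3
print(x_final)
```

    A space-time-modulated version would make the speed b and switching rate a functionals of the observable itself, producing the two-way coupling described above.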

  1. Ultrasonic Processing of Materials

    NASA Astrophysics Data System (ADS)

    Han, Qingyou

    2015-08-01

    Irradiation of high-energy ultrasonic vibration in metals and alloys generates oscillating strain and stress fields in solids, and introduces nonlinear effects such as cavitation, acoustic streaming, and radiation pressure in molten materials. These nonlinear effects can be utilized to assist conventional material processing processes. This article describes recent research at Oak Ridge National Labs and Purdue University on using high-intensity ultrasonic vibrations for degassing molten aluminum, processing particulate-reinforced metal matrix composites, refining metals and alloys during solidification process and welding, and producing bulk nanostructures in solid metals and alloys. Research results suggest that high-intensity ultrasonic vibration is capable of degassing and dispersing small particles in molten alloys, reducing grain size during alloy solidification, and inducing nanostructures in solid metals.

  2. Biodiesel production process from microalgae oil by waste heat recovery and process integration.

    PubMed

    Song, Chunfeng; Chen, Guanyi; Ji, Na; Liu, Qingling; Kansha, Yasuki; Tsutsumi, Atsushi

    2015-10-01

    In this work, the optimization of microalgae oil (MO) based biodiesel production process is carried out by waste heat recovery and process integration. The exergy analysis of each heat exchanger presented an efficient heat coupling between hot and cold streams, thus minimizing the total exergy destruction. Simulation results showed that the unit production cost of optimized process is 0.592$/L biodiesel, and approximately 0.172$/L biodiesel can be avoided by heat integration. Although the capital cost of the optimized biodiesel production process increased 32.5% and 23.5% compared to the reference cases, the operational cost can be reduced by approximately 22.5% and 41.6%. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Thermodynamics of Irreversible Processes. Physical Processes in Terrestrial and Aquatic Ecosystems, Transport Processes.

    ERIC Educational Resources Information Center

    Levin, Michael; Gallucci, V. F.

    These materials were designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. This module describes the application of irreversible thermodynamics to biology. It begins with…

  4. Leading processes of patient care and treatment in hierarchical healthcare organizations in Sweden--process managers' experiences.

    PubMed

    Nilsson, Kerstin; Sandoff, Mette

    2015-01-01

    The purpose of this study is to gain a better understanding of the roles and functions of process managers by describing Swedish process managers' experiences of leading processes involving patient care and treatment when working in a hierarchical health-care organization. The study is based on an explorative design. The data were gathered from interviews with 12 process managers at three Swedish hospitals and underwent qualitative and interpretative analysis with a modified editing style. The process managers' experiences of leading processes in a hierarchical health-care organization are described under three themes: having or not having a mandate, exposure to conflict situations, and leading process development. The results indicate a need for clarity regarding the process manager's responsibility and work content, which needs to be communicated to all managers and staff involved in the patient care and treatment process, irrespective of department. There also needs to be an emphasis on realistic expectations and a clear orientation toward the goals that are an intrinsic part of the process manager's task. Generalizations from the results of qualitative interview studies are limited, but a deeper understanding of the phenomenon was reached, which, in turn, can be transferred to similar settings. This study contributes qualitative descriptions of leading care and treatment processes in a functional, hierarchical health-care organization from process managers' experiences, a subject that has not been investigated before.

  5. Microencapsulation Processes

    NASA Astrophysics Data System (ADS)

    Whateley, T. L.; Poncelet, D.

    2005-06-01

    Microencapsulation by solvent evaporation is a novel technique to enable the controlled delivery of active materials.The controlled release of drugs, for example, is a key challenge in the pharmaceutical industries. Although proposed several decades ago, it remains largely an empirical laboratory process.The Topical Team has considered its critical points and the work required to produce a more effective technology - better control of the process for industrial production, understanding of the interfacial dynamics, determination of the solvent evaporation profile, and establishment of the relation between polymer/microcapsule structures.The Team has also defined how microgravity experiments could help in better understanding microencapsulation by solvent evaporation, and it has proposed a strategy for a collaborative project on the topic.

  6. SEL's Software Process-Improvement Program

    NASA Technical Reports Server (NTRS)

    Basili, Victor; Zelkowitz, Marvin; McGarry, Frank; Page, Jerry; Waligora, Sharon; Pajerski, Rose

    1995-01-01

    The goals and operations of the Software Engineering Laboratory (SEL) are reviewed. For nearly 20 years the SEL has worked to understand, assess, and improve software and the development process within the production environment of the Flight Dynamics Division (FDD) of NASA's Goddard Space Flight Center. The SEL was established in 1976 with the goals of reducing: (1) the defect rate of delivered software, (2) the cost of software to support flight projects, and (3) the average time to produce mission-support software. After studying over 125 FDD projects, the results have guided the standards, management practices, technologies, and training within the division. The results of the studies have been a 75 percent reduction in defects, a 50 percent reduction in cost, and a 25 percent reduction in development time. Over time the goals of the SEL have been clarified. The goals are now stated as: (1) understand baseline processes and product characteristics; (2) assess improvements that have been incorporated into the development projects; (3) package and infuse improvements into the standard SEL process. The SEL improvement goal is to demonstrate continual improvement of the software process by carrying out analysis, measurement, and feedback to projects within the FDD environment. The SEL supports understanding of the process by studying several process characteristics, including effort distribution and error-detection rates. The SEL assesses and refines the processes. Once the assessment and refinement of a process is complete, the SEL packages the process by capturing it in standards, tools, and training.

  7. 49 CFR Appendix A to Part 580 - Secure Printing Processes and Other Secure Processes

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 7 2014-10-01 2014-10-01 false Secure Printing Processes and Other Secure Processes A Appendix A to Part 580 Transportation Other Regulations Relating to Transportation (Continued... DISCLOSURE REQUIREMENTS Pt. 580, App. A Appendix A to Part 580—Secure Printing Processes and Other Secure...

  8. 49 CFR Appendix A to Part 580 - Secure Printing Processes and Other Secure Processes

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 7 2012-10-01 2012-10-01 false Secure Printing Processes and Other Secure Processes A Appendix A to Part 580 Transportation Other Regulations Relating to Transportation (Continued... DISCLOSURE REQUIREMENTS Pt. 580, App. A Appendix A to Part 580—Secure Printing Processes and Other Secure...

  9. 49 CFR Appendix A to Part 580 - Secure Printing Processes and Other Secure Processes

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 7 2011-10-01 2011-10-01 false Secure Printing Processes and Other Secure Processes A Appendix A to Part 580 Transportation Other Regulations Relating to Transportation (Continued... DISCLOSURE REQUIREMENTS Pt. 580, App. A Appendix A to Part 580—Secure Printing Processes and Other Secure...

  10. 49 CFR Appendix A to Part 580 - Secure Printing Processes and Other Secure Processes

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 7 2013-10-01 2013-10-01 false Secure Printing Processes and Other Secure Processes A Appendix A to Part 580 Transportation Other Regulations Relating to Transportation (Continued... DISCLOSURE REQUIREMENTS Pt. 580, App. A Appendix A to Part 580—Secure Printing Processes and Other Secure...

  11. Teaching Process Design through Integrated Process Synthesis

    ERIC Educational Resources Information Center

    Metzger, Matthew J.; Glasser, Benjamin J.; Patel, Bilal; Hildebrandt, Diane; Glasser, David

    2012-01-01

    The design course is an integral part of chemical engineering education. A novel approach to the design course was recently introduced at the University of the Witwatersrand, Johannesburg, South Africa. The course aimed to introduce students to systematic tools and techniques for setting and evaluating performance targets for processes, as well as…

  12. Risk-based process safety assessment and control measures design for offshore process facilities.

    PubMed

    Khan, Faisal I; Sadiq, Rehan; Husain, Tahir

    2002-09-02

    Process operation is the most hazardous activity after transportation and drilling operations on an offshore oil and gas (OOG) platform. Past experience of onshore and offshore oil and gas activities has revealed that a small mishap in process operation can escalate into a catastrophe. This is of special concern on an OOG platform due to the limited space and compact geometry of the process area, poor ventilation, and difficult escape routes. On an OOG platform, each extra control measure implemented not only occupies space and increases congestion but also adds extra load to the platform. Eventualities in OOG platform process operation can be avoided by incorporating the appropriate control measures at the early design stage. In this paper, the authors describe a methodology for risk-based process safety decision making for OOG activities. The methodology is applied to various offshore process units, that is, the compressor, separators, flash drum and driers of an OOG platform. Based on the risk potential, appropriate safety measures are designed for each unit. This paper also illustrates that implementation of the designed safety measures reduces high fatal accident rate (FAR) values to an acceptable level.

  13. Active pharmaceutical ingredient (API) production involving continuous processes--a process system engineering (PSE)-assisted design framework.

    PubMed

    Cervera-Padrell, Albert E; Skovby, Tommy; Kiil, Søren; Gani, Rafiqul; Gernaey, Krist V

    2012-10-01

    A systematic framework is proposed for the design of continuous pharmaceutical manufacturing processes. Specifically, the design framework focuses on organic chemistry based, active pharmaceutical ingredient (API) synthetic processes, but could potentially be extended to biocatalytic and fermentation-based products. The method exploits the synergic combination of continuous flow technologies (e.g., microfluidic techniques) and process systems engineering (PSE) methods and tools for faster process design and increased process understanding throughout the whole drug product and process development cycle. The design framework structures the many different and challenging design problems (e.g., solvent selection, reactor design, and design of separation and purification operations), driving the user from the initial drug discovery steps--where process knowledge is very limited--toward the detailed design and analysis. Examples from the literature of PSE methods and tools applied to pharmaceutical process design and novel pharmaceutical production technologies are provided along the text, assisting in the accumulation and interpretation of process knowledge. Different criteria are suggested for the selection of batch and continuous processes so that the whole design results in low capital and operational costs as well as low environmental footprint. The design framework has been applied to the retrofit of an existing batch-wise process used by H. Lundbeck A/S to produce an API: zuclopenthixol. Some of its batch operations were successfully converted into continuous mode, obtaining higher yields that allowed a significant simplification of the whole process. The material and environmental footprint of the process--evaluated through the process mass intensity index, that is, kg of material used per kg of product--was reduced to half of its initial value, with potential for further reduction. The case-study includes reaction steps typically used by the pharmaceutical
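
    The process mass intensity (PMI) index quoted above is simply the total mass of material used per kilogram of product; a sketch with invented masses (not Lundbeck's figures) showing the reported halving:

```python
# Process mass intensity (PMI): total mass of material used per kg of product.
# Masses below are invented for illustration, not Lundbeck's figures.
def pmi(mass_inputs_kg, mass_product_kg):
    return sum(mass_inputs_kg) / mass_product_kg

batch = pmi([120.0, 45.0, 35.0], 10.0)       # hypothetical batch process
continuous = pmi([60.0, 25.0, 15.0], 10.0)   # hypothetical continuous retrofit
print(f"PMI batch: {batch}, continuous: {continuous}")  # halved, as reported
```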

  14. A Controlled Agitation Process for Improving Quality of Canned Green Beans during Agitation Thermal Processing.

    PubMed

    Singh, Anika; Pratap Singh, Anubhav; Ramaswamy, Hosahalli S

    2016-06-01

    This work introduces the concept of a controlled agitation thermal process to reduce quality damage in liquid-particulate products during agitation thermal processing. Reciprocating agitation thermal processing (RA-TP) was used as the agitation thermal process. In order to reduce the impact of agitation, a new concept of "stopping agitation after sufficient development of cold-spot temperature" was proposed. Green beans were processed in No. 2 (307×409) cans filled with liquids of various consistencies (0% to 2% CMC) at various frequencies (1 to 3 Hz) of RA-TP using a full-factorial design, and heat penetration results were collected. The corresponding operator's process time to impart a 10-min process lethality (Fo) and agitation time (AT) were calculated from the heat penetration results. Accordingly, products were processed again by stopping agitation as per 3 agitation regimes, namely: full-time agitation, equilibration-time agitation, and partial-time agitation. Processed products were photographed and tested for visual quality, color, texture, breakage of green beans, turbidity, and percentage of insoluble solids in the can liquid. Results showed that stopping agitation after sufficient development of cold-spot temperatures is an effective way of reducing the product damage caused by agitation (for example, breakage of beans and their leaching into the liquid). Agitation until a one-log temperature difference gave the best color, texture, and visual product quality for low-viscosity liquid-particulate mixtures, and extended agitation until equilibration time was best for high-viscosity products. Thus, it was shown that a controlled agitation thermal process is more effective in obtaining high product quality than a regular agitation thermal process. © 2016 Institute of Food Technologists®
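    The 10-min process lethality (Fo) target mentioned above is a standard thermal-processing quantity: the integral of the lethal rate 10**((T - 121.1)/z), with z = 10 °C, over the cold-spot temperature history. A minimal sketch of that calculation (a generic textbook formula, not the authors' code):

```python
def process_lethality_f0(times_min, temps_c, t_ref=121.1, z=10.0):
    """Fo: integral of the lethal rate 10**((T - t_ref)/z) over the
    cold-spot time-temperature history (trapezoidal rule), in minutes."""
    rates = [10 ** ((t - t_ref) / z) for t in temps_c]
    f0 = 0.0
    for i in range(1, len(times_min)):
        f0 += 0.5 * (rates[i] + rates[i - 1]) * (times_min[i] - times_min[i - 1])
    return f0

# 10 min held exactly at the reference temperature gives Fo = 10 min;
# 10 min held one z-value (10 degC) lower contributes only 1 min.
full = process_lethality_f0([0, 10], [121.1, 121.1])
weak = process_lethality_f0([0, 10], [111.1, 111.1])
```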

  15. 21 CFR 660.51 - Processing.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...) Processing method. (1) The processing method shall be one that has been shown to yield consistently a... be colored green. (3) Only that material which has been fully processed, thoroughly mixed in a single...

  16. Chemical Sensing in Process Analysis.

    ERIC Educational Resources Information Center

    Hirschfeld, T.; And Others

    1984-01-01

    Discusses: (1) rationale for chemical sensors in process analysis; (2) existing types of process chemical sensors; (3) sensor limitations, considering lessons of chemometrics; (4) trends in process control sensors; and (5) future prospects. (JN)

  17. Spacelab Data Processing Facility

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The capabilities of the Spacelab Data Processing Facility (SPDPF) are highlighted. The capturing, quality monitoring, processing, accounting, and forwarding of vital Spacelab data to various user facilities around the world are described.

  18. Telerobotic electronic materials processing experiment

    NASA Technical Reports Server (NTRS)

    Ollendorf, Stanford

    1991-01-01

    The Office of Commercial Programs (OCP), working in conjunction with NASA engineers at the Goddard Space Flight Center, is supporting research efforts in robot technology and microelectronics materials processing that will provide many spinoffs for science and industry. The Telerobotic Materials Processing Experiment (TRMPX) is a Shuttle-launched materials processing test payload using a Get Away Special can. The objectives of the project are to define, develop, and demonstrate an automated materials processing capability under realistic flight conditions. TRMPX will provide the capability to test the production processes that are dependent on microgravity. The processes proposed for testing include the annealing of amorphous silicon to increase grain size for more efficient solar cells, thin film deposition to demonstrate the potential of fabricating solar cells in orbit, and the annealing of radiation damaged solar cells.

  19. Image Processing Software

    NASA Technical Reports Server (NTRS)

    1992-01-01

    To convert raw data into environmental products, the National Weather Service and other organizations use the Global 9000 image processing system marketed by Global Imaging, Inc. The company's GAE software package is an enhanced version of the TAE, developed by Goddard Space Flight Center to support remote sensing and image processing applications. The system can be operated in three modes and is combined with HP Apollo workstation hardware.

  20. Hydropyrolysis process

    DOEpatents

    Ullman, Alan Z.; Silverman, Jacob; Friedman, Joseph

    1986-01-01

    An improved process for producing a methane-enriched gas wherein a hydrogen-deficient carbonaceous material is treated with a hydrogen-containing pyrolysis gas at an elevated temperature and pressure to produce a product gas mixture including methane, carbon monoxide and hydrogen. The improvement comprises passing the product gas mixture sequentially through a water-gas shift reaction zone and a gas separation zone to provide separate gas streams of methane and of a recycle gas comprising hydrogen, carbon monoxide and methane for recycle to the process. A controlled amount of steam also is provided which when combined with the recycle gas provides a pyrolysis gas for treatment of additional hydrogen-deficient carbonaceous material. The amount of steam used and the conditions within the water-gas shift reaction zone and gas separation zone are controlled to obtain a steady-state composition of pyrolysis gas which will comprise hydrogen as the principal constituent and a minor amount of carbon monoxide, steam and methane so that no external source of hydrogen is needed to supply the hydrogen requirements of the process. In accordance with a particularly preferred embodiment, conditions are controlled such that there also is produced a significant quantity of benzene as a valuable coproduct.
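    The water-gas shift step described above (CO + H2O -> CO2 + H2) trims carbon monoxide and regenerates hydrogen for recycle. A stoichiometric mole-balance sketch (illustrative feed and conversion, not the patent's operating conditions):

```python
def water_gas_shift(moles, co_conversion):
    """Apply CO + H2O -> CO2 + H2 at a given fractional CO conversion;
    `moles` maps species names to molar amounts."""
    x = co_conversion * moles["CO"]
    if x > moles["H2O"]:
        raise ValueError("not enough steam for the requested conversion")
    out = dict(moles)
    out["CO"] -= x
    out["H2O"] -= x
    out["CO2"] = out.get("CO2", 0.0) + x
    out["H2"] = out.get("H2", 0.0) + x
    return out

# Illustrative recycle-loop feed: shifting 75% of the CO trims the
# monoxide and boosts the hydrogen available for recycle.
feed = {"CO": 4.0, "H2O": 6.0, "H2": 10.0, "CH4": 3.0}
shifted = water_gas_shift(feed, 0.75)  # CO 1.0, H2O 3.0, CO2 3.0, H2 13.0
```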

  1. The Integration of Word Processing with Data Processing in an Educational Environment. Final Report.

    ERIC Educational Resources Information Center

    Patterson, Lorna; Schlender, Jim

    A project examined the Office of the Future and determined trends regarding an integration of word processing and data processing. It then sought to translate those trends into an educational package to develop the potential information specialist. A survey instrument completed by 33 office managers and word processing and data processing…

  2. 49 CFR Appendix A to Part 580 - Secure Printing Processes and Other Secure Processes

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 7 2010-10-01 2010-10-01 false Secure Printing Processes and Other Secure... DISCLOSURE REQUIREMENTS Pt. 580, App. A Appendix A to Part 580—Secure Printing Processes and Other Secure... printing—a printing process utilized in the production of bank-notes and other security documents whereby...

  3. Spoken Language Processing Model: Bridging Auditory and Language Processing to Guide Assessment and Intervention

    ERIC Educational Resources Information Center

    Medwetsky, Larry

    2011-01-01

    Purpose: This article outlines the author's conceptualization of the key mechanisms that are engaged in the processing of spoken language, referred to as the spoken language processing model. The act of processing what is heard is very complex and involves the successful intertwining of auditory, cognitive, and language mechanisms. Spoken language…

  4. Clinical process cost analysis.

    PubMed

    Marrin, C A; Johnson, L C; Beggs, V L; Batalden, P B

    1997-09-01

    New systems of reimbursement are exerting enormous pressure on clinicians and hospitals to reduce costs. Using cheaper supplies or reducing the length of stay may be a satisfactory short-term solution, but the best strategy for long-term success is radical reduction of costs by reengineering the processes of care. However, few clinicians or institutions know the actual costs of medical care; nor do they understand, in detail, the activities involved in the delivery of care. Finally, there is no accepted method for linking the two. Clinical process cost analysis begins with the construction of a detailed flow diagram incorporating each activity in the process of care. The cost of each activity is then calculated, and the two are linked. This technique was applied to Diagnosis Related Group 75 to analyze the real costs of the operative treatment of lung cancer at one institution. Total costs varied between $6,400 and $7,700. The major driver of costs was personnel time, which accounted for 55% of the total. Forty percent of the total cost was incurred in the operating room. The cost of care decreased progressively during hospitalization. Clinical process cost analysis provides detailed information about the costs and processes of care. The insights thus obtained may be used to reduce costs by reengineering the process.
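    The costing approach described, summing the cost of each activity in the care-flow diagram, amounts to activity-based costing. A minimal sketch with hypothetical activities and rates (not the paper's DRG 75 data):

```python
# Hypothetical activity list for one episode of care:
# (activity, minutes, personnel $/min, supplies $)
activities = [
    ("pre-op assessment",  60, 2.0,   60.0),
    ("operating room",    180, 9.0, 1600.0),
    ("recovery room",     120, 3.0,  150.0),
    ("ward care",         480, 1.0,  300.0),
]

def episode_cost(acts):
    """Total cost: each activity's personnel time x rate, plus supplies."""
    return sum(mins * rate + supplies for _, mins, rate, supplies in acts)

def personnel_share(acts):
    """Fraction of total cost driven by personnel time."""
    personnel = sum(mins * rate for _, mins, rate, _ in acts)
    return personnel / episode_cost(acts)

total = episode_cost(activities)       # 4690.0 for these made-up numbers
share = personnel_share(activities)    # ~0.55, echoing the 55% finding
```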

  5. Simulative design and process optimization of the two-stage stretch-blow molding process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hopmann, Ch.; Rasche, S.; Windeck, C.

    2015-05-22

    The total production costs of PET bottles are significantly affected by the costs of raw material. Approximately 70 % of the total costs are spent for the raw material. Therefore, stretch-blow molding industry intends to reduce the total production costs by an optimized material efficiency. However, there is often a trade-off between an optimized material efficiency and required product properties. Due to a multitude of complex boundary conditions, the design process of new stretch-blow molded products is still a challenging task and is often based on empirical knowledge. Application of current CAE-tools supports the design process by reducing development time and costs. This paper describes an approach to determine optimized preform geometry and corresponding process parameters iteratively. The wall thickness distribution and the local stretch ratios of the blown bottle are calculated in a three-dimensional process simulation. Thereby, the wall thickness distribution is correlated with an objective function and preform geometry as well as process parameters are varied by an optimization algorithm. Taking into account the correlation between material usage, process history and resulting product properties, integrative coupled simulation steps, e.g. structural analyses or barrier simulations, are performed. The approach is applied on a 0.5 liter PET bottle of Krones AG, Neutraubling, Germany. The investigations point out that the design process can be supported by applying this simulative optimization approach. In an optimization study the total bottle weight is reduced from 18.5 g to 15.5 g. The validation of the computed results is in progress.

  6. Simulative design and process optimization of the two-stage stretch-blow molding process

    NASA Astrophysics Data System (ADS)

    Hopmann, Ch.; Rasche, S.; Windeck, C.

    2015-05-01

    The total production costs of PET bottles are significantly affected by the costs of raw material. Approximately 70 % of the total costs are spent for the raw material. Therefore, stretch-blow molding industry intends to reduce the total production costs by an optimized material efficiency. However, there is often a trade-off between an optimized material efficiency and required product properties. Due to a multitude of complex boundary conditions, the design process of new stretch-blow molded products is still a challenging task and is often based on empirical knowledge. Application of current CAE-tools supports the design process by reducing development time and costs. This paper describes an approach to determine optimized preform geometry and corresponding process parameters iteratively. The wall thickness distribution and the local stretch ratios of the blown bottle are calculated in a three-dimensional process simulation. Thereby, the wall thickness distribution is correlated with an objective function and preform geometry as well as process parameters are varied by an optimization algorithm. Taking into account the correlation between material usage, process history and resulting product properties, integrative coupled simulation steps, e.g. structural analyses or barrier simulations, are performed. The approach is applied on a 0.5 liter PET bottle of Krones AG, Neutraubling, Germany. The investigations point out that the design process can be supported by applying this simulative optimization approach. In an optimization study the total bottle weight is reduced from 18.5 g to 15.5 g. The validation of the computed results is in progress.

  7. Conceptual Framework for the Mapping of Management Process with Information Technology in a Business Process

    PubMed Central

    Chellappa, Swarnalatha; Nagarajan, Asha

    2015-01-01

    This study on a component framework reveals the importance of management process and technology mapping in a business environment. We define ERP as a software tool that has to provide a business solution but not necessarily an integration of all the departments. Any business process can be classified as a management process, an operational process, or a supportive process. We have gone through the entire management process and were able to identify the influencing components to be mapped to a technology for a business solution. Governance, strategic management, and decision making are thoroughly discussed, and the need to map these components with the ERP is clearly explained. We also suggest that implementing this framework might reduce ERP failures and, in particular, rectify ERP misfit. PMID:25861688

  8. Conceptual framework for the mapping of management process with information technology in a business process.

    PubMed

    Rajarathinam, Vetrickarthick; Chellappa, Swarnalatha; Nagarajan, Asha

    2015-01-01

    This study on a component framework reveals the importance of management process and technology mapping in a business environment. We define ERP as a software tool that has to provide a business solution but not necessarily an integration of all the departments. Any business process can be classified as a management process, an operational process, or a supportive process. We have gone through the entire management process and were able to identify the influencing components to be mapped to a technology for a business solution. Governance, strategic management, and decision making are thoroughly discussed, and the need to map these components with the ERP is clearly explained. We also suggest that implementing this framework might reduce ERP failures and, in particular, rectify ERP misfit.

  9. PROCESS WATER BUILDING, TRA605. CONTEXTUAL VIEW, CAMERA FACING SOUTHEAST. PROCESS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PROCESS WATER BUILDING, TRA-605. CONTEXTUAL VIEW, CAMERA FACING SOUTHEAST. PROCESS WATER BUILDING AND ETR STACK ARE IN LEFT HALF OF VIEW. TRA-666 IS NEAR CENTER, ABUTTED BY SECURITY BUILDING; TRA-626, AT RIGHT EDGE OF VIEW BEHIND BUS. INL NEGATIVE NO. HD46-34-1. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  10. GPU applications for data processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vladymyrov, Mykhailo, E-mail: mykhailo.vladymyrov@cern.ch; Aleksandrov, Andrey; INFN sezione di Napoli, I-80125 Napoli

    2015-12-31

    Modern experiments that use nuclear photoemulsion require fast and efficient data acquisition from the emulsion. The new approaches in developing scanning systems require real-time processing of large amounts of data. Methods that use Graphical Processing Unit (GPU) computing power for emulsion data processing are presented here. It is shown how GPU-accelerated emulsion processing helped us raise the scanning speed by a factor of nine.

  11. Model-based software process improvement

    NASA Technical Reports Server (NTRS)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  12. Carbon wastewater treatment process

    NASA Technical Reports Server (NTRS)

    Humphrey, M. F.; Simmons, G. M.; Dowler, W. L.

    1974-01-01

    A new powdered-carbon treatment process is being developed to eliminate the present problems associated with the disposal of biologically active sewage waste solids and with water reuse. This counter-current flow process produces an activated carbon, obtained from the pyrolysis of the sewage solids, and utilizes this material to remove the adulterating materials from the water. Additional advantages of the process are the elimination of odors, the removal of heavy metals, and the potential for energy conservation.

  13. Expert system and process optimization techniques for real-time monitoring and control of plasma processes

    NASA Astrophysics Data System (ADS)

    Cheng, Jie; Qian, Zhaogang; Irani, Keki B.; Etemad, Hossein; Elta, Michael E.

    1991-03-01

    To meet the ever-increasing demand of the rapidly growing semiconductor manufacturing industry, it is critical to have a comprehensive methodology integrating techniques for process optimization, real-time monitoring, and adaptive process control. To this end we have accomplished an integrated knowledge-based approach combining the latest expert system technology, machine learning methods, and traditional statistical process control (SPC) techniques. This knowledge-based approach is advantageous in that it makes it possible for the task of process optimization and adaptive control to be performed consistently and predictably. Furthermore, this approach can be used to construct a high-level and qualitative description of processes and thus make the process behavior easy to monitor, predict, and control. Two software packages, RIST (Rule Induction and Statistical Testing) and KARSM (Knowledge Acquisition from Response Surface Methodology), have been developed and incorporated with two commercially available packages, G2 (a real-time expert system) and ULTRAMAX (a tool for sequential process optimization).

  14. Ultra-processed Food Intake and Obesity: What Really Matters for Health-Processing or Nutrient Content?

    PubMed

    Poti, Jennifer M; Braga, Bianca; Qin, Bo

    2017-12-01

    The aim of this narrative review was to summarize and critique recent evidence evaluating the association between ultra-processed food intake and obesity. Four of five studies found that higher purchases or consumption of ultra-processed food was associated with overweight/obesity. Additional studies reported relationships between ultra-processed food intake and higher fasting glucose, metabolic syndrome, increases in total and LDL cholesterol, and risk of hypertension. It remains unclear whether associations can be attributed to processing itself or the nutrient content of ultra-processed foods. Only three of nine studies used a prospective design, and the potential for residual confounding was high. Recent research provides fairly consistent support for the association of ultra-processed food intake with obesity and related cardiometabolic outcomes. There is a clear need for further studies, particularly those using longitudinal designs and with sufficient control for confounding, to potentially confirm these findings in different populations and to determine whether ultra-processed food consumption is associated with obesity independent of nutrient content.

  15. Cognitive Risk Factors for Specific Learning Disorder: Processing Speed, Temporal Processing, and Working Memory.

    PubMed

    Moll, Kristina; Göbel, Silke M; Gooch, Debbie; Landerl, Karin; Snowling, Margaret J

    2016-01-01

    High comorbidity rates between reading disorder (RD) and mathematics disorder (MD) indicate that, although the cognitive core deficits underlying these disorders are distinct, additional domain-general risk factors might be shared between the disorders. Three domain-general cognitive abilities were investigated in children with RD and MD: processing speed, temporal processing, and working memory. Since attention problems frequently co-occur with learning disorders, the study examined whether these three factors, which are known to be associated with attention problems, account for the comorbidity between these disorders. The sample comprised 99 primary school children in four groups: children with RD, children with MD, children with both disorders (RD+MD), and typically developing children (TD controls). Measures of processing speed, temporal processing, and memory were analyzed in a series of ANCOVAs including attention ratings as covariate. All three risk factors were associated with poor attention. After controlling for attention, associations with RD and MD differed: Although deficits in verbal memory were associated with both RD and MD, reduced processing speed was related to RD, but not MD; and the association with RD was restricted to processing speed for familiar nameable symbols. In contrast, impairments in temporal processing and visuospatial memory were associated with MD, but not RD. © Hammill Institute on Disabilities 2014.

  16. Efficient Process Migration for Parallel Processing on Non-Dedicated Networks of Workstations

    NASA Technical Reports Server (NTRS)

    Chanchio, Kasidit; Sun, Xian-He

    1996-01-01

    This paper presents the design and preliminary implementation of MpPVM, a software system that supports process migration for PVM application programs in a non-dedicated heterogeneous computing environment. New concepts of migration points as well as migration point analysis and necessary data analysis are introduced. In MpPVM, process migrations occur only at previously inserted migration points. Migration point analysis determines appropriate locations to insert migration points, whereas necessary data analysis provides a minimum set of variables to be transferred at each migration point. A new methodology to perform reliable point-to-point data communications in a migration environment is also discussed. Finally, a preliminary implementation of MpPVM and its experimental results are presented, showing the correctness and promising performance of our process migration mechanism in a scalable non-dedicated heterogeneous computing environment. While MpPVM is developed on top of PVM, the process migration methodology introduced in this study is general and can be applied to any distributed software environment.
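    The migration-point idea, checkpointing only at inserted points and transferring only the minimum variable set, can be sketched as follows (a hypothetical Python analogue; MpPVM itself is built on PVM):

```python
import pickle

MIGRATE_REQUESTED = False  # would be set asynchronously by a scheduler

def checkpoint_if_requested(point_id, necessary_data):
    """At an inserted migration point, serialize only the minimum variable
    set (from necessary-data analysis); return bytes, or None to continue."""
    if MIGRATE_REQUESTED:
        return pickle.dumps({"point": point_id, "state": necessary_data})
    return None

def worker(n):
    total = 0
    for i in range(n):
        total += i * i
        # Migration point inserted after each iteration: only `i` and
        # `total` are needed to resume, not the process's whole state.
        blob = checkpoint_if_requested("loop-1", {"i": i, "total": total})
        if blob is not None:
            return blob  # the process would restart elsewhere from blob
    return total
```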

  17. An Application of X-Ray Fluorescence as Process Analytical Technology (PAT) to Monitor Particle Coating Processes.

    PubMed

    Nakano, Yoshio; Katakuse, Yoshimitsu; Azechi, Yasutaka

    2018-06-01

    An attempt was made to apply X-Ray Fluorescence (XRF) analysis as a Process Analytical Technology (PAT) to evaluate a small-particle coating process. The XRF analysis was used to monitor the coating level in a small-particle coating process in an at-line manner. A small-particle coating process usually consists of multiple coating steps. This study was conducted with simple coated particles prepared by a first coating of a model compound (DL-methionine) and a second coating of talc on spherical microcrystalline cellulose cores. Particles with these two coating layers are sufficient to represent the small-particle coating process. From the results, it was found that XRF signals from the first coating (layering) and the second coating (mask coating) reflected the extent of coating through different mechanisms. Furthermore, coating of particles of different sizes was also investigated to evaluate the size effect on these coating processes. From these results, it was concluded that XRF can be used as a PAT for monitoring particle coating processes and can become a powerful tool in pharmaceutical manufacturing.

  18. Fuzzy image processing in sun sensor

    NASA Technical Reports Server (NTRS)

    Mobasser, S.; Liebe, C. C.; Howard, A.

    2003-01-01

    This paper describes how fuzzy image processing is implemented in the instrument. A comparison of the fuzzy image processing and a more conventional image processing algorithm is provided and shows that the fuzzy image processing yields better accuracy than conventional image processing.

  19. [Process optimisation in hospitals: from process to business organisation].

    PubMed

    Eberlein-Gonska, Maria

    2010-01-01

    Apart from a multidimensional quality definition and the understanding of quality as a company-wide challenge, a third essential element of quality management is prevention. Thus, company quality policy has to be prevention-oriented and requires both customer and process orientation as important prerequisites. Process orientation especially focuses on the critical analysis of workflows as a condition for identifying early intervention options which, in turn, may influence the result. Developing a business organisation requires the definition of criteria for space planning, room assignment and room integration in consideration of both medical and economic aspects and the architectural concept. Specific experiences are demonstrated as a case study using the example of a new building in the midst of the Carl Gustav Carus University Hospital in Dresden, the Diagnostic Centre for Internal Medicine and Neurology. The hospital management commissioned the development of a sustainable as well as feasible business organisation for all the different departments. The idea was to create a medical centre where maximum use was made of all planned spaces and resources on the basis of target processes which had to be defined and agreed upon with all the persons concerned. In the next step, all the required personnel, space, and operational resources were assigned. The success of management in all industries, including the health care sector, crucially depends on the translation of ideas into practice, among them the critical factor of sustainability. In this context, the support by the management as a role model, a formal frame for the respective project group, and the definition of controlling via defined indicators have special importance. The example of the Diagnostic Centre for Internal Medicine and Neurology demonstrates that the result of changed processes may trigger a cultural change in which competition is replaced, step by step, by cooperation.

  20. Graphical Language for Data Processing

    NASA Technical Reports Server (NTRS)

    Alphonso, Keith

    2011-01-01

    A graphical language for processing data allows processing elements to be connected with virtual wires that represent data flows between processing modules. The processing of complex data, such as lidar data, requires many different algorithms to be applied. The purpose of this innovation is to automate the processing of complex data, such as LIDAR, without the need for complex scripting and programming languages. The system consists of a set of user-interface components that allow the user to drag and drop various algorithmic and processing components onto a process graph. By working graphically, the user can completely visualize the process flow and create complex diagrams. This innovation supports the nesting of graphs, such that a graph can be included in another graph as a single step for processing. In addition to the user interface components, the system includes a set of .NET classes that represent the graph internally. These classes provide the internal system representation of the graphical user interface. The system includes a graph execution component that reads the internal representation of the graph (as described above) and executes that graph. The execution of the graph follows the interpreted model of execution in that each node is traversed and executed from the original internal representation. In addition, there are components that allow external code elements, such as algorithms, to be easily integrated into the system, thus making the system infinitely expandable.
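    The interpreted execution model described, traversing nodes connected by virtual wires, can be sketched as a tiny dataflow evaluator (illustrative only, not the system's .NET implementation):

```python
def run_graph(nodes, wires, sources):
    """Interpreted dataflow execution: `nodes` maps a name to a callable,
    `wires` maps a name to its upstream node names, `sources` holds
    constant inputs. Each node is evaluated once, in dependency order."""
    results = dict(sources)

    def evaluate(name):
        if name not in results:
            inputs = [evaluate(up) for up in wires.get(name, [])]
            results[name] = nodes[name](*inputs)
        return results[name]

    for name in nodes:
        evaluate(name)
    return results

# Illustrative graph: src feeds two modules whose outputs are summed.
nodes = {"double": lambda x: 2 * x, "inc": lambda x: x + 1,
         "add": lambda a, b: a + b}
wires = {"double": ["src"], "inc": ["src"], "add": ["double", "inc"]}
out = run_graph(nodes, wires, {"src": 5})  # out["add"] == 16
```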

  1. Clients' emotional processing in psychotherapy: a comparison between cognitive-behavioral and process-experiential therapies.

    PubMed

    Watson, Jeanne C; Bedard, Danielle L

    2006-02-01

    The authors compared clients' emotional processing in good and bad outcome cases in cognitive behavioral therapy (CBT) and process-experiential therapy (PET) and investigated whether clients' emotional processing increases over the course of therapy. Twenty minutes from each of 3 sessions from 40 clients were rated on the Experiencing Scale. A 2x2x3 analysis of variance showed a significant difference between outcome and therapy groups, with clients in the good outcome and PET groups showing significantly higher levels of emotional processing than those in the poor outcome and CBT groups, respectively. Clients' level of emotional processing significantly increased from the beginning to the midpoint of therapy. The results indicate that CBT clients are more distant and disengaged from their emotional experience than clients in PET. Copyright (c) 2006 APA, all rights reserved.

  2. The Internet Process Addiction Test: Screening for Addictions to Processes Facilitated by the Internet.

    PubMed

    Northrup, Jason C; Lapierre, Coady; Kirk, Jeffrey; Rae, Cosette

    2015-07-28

    The Internet Process Addiction Test (IPAT) was created to screen for potential addictive behaviors that could be facilitated by the internet. The IPAT was created with the mindset that the term "Internet addiction" is structurally problematic, as the Internet is simply the medium that one uses to access various addictive processes. The role of the internet in facilitating addictions, however, cannot be minimized. A new screening tool that effectively directed researchers and clinicians to the specific processes facilitated by the internet would therefore be useful. This study shows that the Internet Process Addiction Test (IPAT) demonstrates good validity and reliability. Four addictive processes were effectively screened for with the IPAT: Online video game playing, online social networking, online sexual activity, and web surfing. Implications for further research and limitations of the study are discussed.

  3. Infrared processing of foods

    USDA-ARS?s Scientific Manuscript database

    Infrared (IR) processing of foods has been gaining popularity over conventional processing in several unit operations, including drying, peeling, baking, roasting, blanching, pasteurization, sterilization, disinfection, disinfestation, cooking, and popping . It has shown advantages over conventional...

  4. Methods in Astronomical Image Processing

    NASA Astrophysics Data System (ADS)

    Jörsäter, S.

    Contents: A Brief Introductory Note; History of Astronomical Imaging; Astronomical Image Data; Images in Various Formats; Digitized Image Data; Digital Image Data; Philosophy of Astronomical Image Processing; Properties of Digital Astronomical Images; Human Image Processing; Astronomical vs. Computer Science Image Processing; Basic Tools of Astronomical Image Processing; Display Applications; Calibration of Intensity Scales; Calibration of Length Scales; Image Re-shaping; Feature Enhancement; Noise Suppression; Noise and Error Analysis; Image Processing Packages: Design of AIPS and MIDAS; AIPS; MIDAS; Reduction of CCD Data; Bias Subtraction; Clipping; Preflash Subtraction; Dark Subtraction; Flat Fielding; Sky Subtraction; Extinction Correction; Deconvolution Methods; Rebinning/Combining; Summary and Prospects for the Future
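    The CCD reduction steps listed (bias subtraction, dark subtraction, flat fielding, sky subtraction) follow a standard per-pixel recipe. A minimal sketch using plain Python lists as toy image frames:

```python
def reduce_ccd(raw, bias, dark, flat):
    """Per-pixel CCD calibration: subtract the bias and dark frames,
    then divide by the flat field normalized to its mean."""
    pixels = [v for row in flat for v in row]
    flat_mean = sum(pixels) / len(pixels)
    return [[(r - b - d) / (f / flat_mean)
             for r, b, d, f in zip(rr, br, dr, fr)]
            for rr, br, dr, fr in zip(raw, bias, dark, flat)]

# Toy 2x2 frames; with a uniform flat the normalization is a no-op.
raw  = [[1100.0, 1100.0], [1100.0, 1100.0]]
bias = [[100.0, 100.0], [100.0, 100.0]]
dark = [[50.0, 50.0], [50.0, 50.0]]
flat = [[2.0, 2.0], [2.0, 2.0]]
calibrated = reduce_ccd(raw, bias, dark, flat)  # every pixel 950.0
```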

  5. Basic abnormalities in visual processing affect face processing at an early age in autism spectrum disorder.

    PubMed

    Vlamings, Petra Hendrika Johanna Maria; Jonkman, Lisa Marthe; van Daalen, Emma; van der Gaag, Rutger Jan; Kemner, Chantal

    2010-12-15

    A detailed visual processing style has been noted in autism spectrum disorder (ASD); this contributes to problems in face processing and has been directly related to abnormal processing of spatial frequencies (SFs). Little is known about the early development of face processing in ASD and the relation with abnormal SF processing. We investigated whether young ASD children show abnormalities in low spatial frequency (LSF, global) and high spatial frequency (HSF, detailed) processing and explored whether these are crucially involved in the early development of face processing. Three- to 4-year-old children with ASD (n = 22) were compared with developmentally delayed children without ASD (n = 17). Spatial frequency processing was studied by recording visual evoked potentials from visual brain areas while children passively viewed gratings (HSF/LSF). In addition, children watched face stimuli with different expressions, filtered to include only HSF or LSF. Enhanced activity in visual brain areas was found in response to HSF versus LSF information in children with ASD, in contrast to control subjects. Furthermore, facial-expression processing was also primarily driven by detail in ASD. Enhanced visual processing of detailed (HSF) information is present early in ASD and occurs for neutral (gratings), as well as for socially relevant stimuli (facial expressions). These data indicate that there is a general abnormality in visual SF processing in early ASD and are in agreement with suggestions that a fast LSF subcortical face processing route might be affected in ASD. This could suggest that abnormal visual processing is causative in the development of social problems in ASD. Copyright © 2010 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  6. Bank Record Processing

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Barnett Banks of Florida, Inc. operates 150 banking offices in 80 Florida cities. Banking offices have computerized systems for processing deposits or withdrawals in checking/savings accounts, and for handling commercial and installment loan transactions. In developing a network engineering design for the terminals used in record processing, an affiliate, Barnett Computing Company, used COSMIC's STATCOM program. This program provided a reliable network design tool and avoided the cost of developing new software.

  7. Qualitative Process Theory.

    DTIC Science & Technology

    1982-02-01

    Kenneth D. Forbus. Artificial Intelligence Laboratory, 545 Technology Square, Cambridge. Contract N00014-80-C-0505. ...about processes, their effects, and their limits. Qualitative process theory defines a simple notion of... Reasoning about process also motivates a new...

  8. Coal liquefaction process

    DOEpatents

    Karr, Jr., Clarence

    1977-04-19

    An improved coal liquefaction process is provided which enables conversion of a coal-oil slurry to a synthetic crude refinable to produce larger yields of gasoline and diesel oil. The process is characterized by a two-step operation applied to the slurry prior to catalytic desulfurization and hydrogenation in which the slurry undergoes partial hydrogenation to crack and hydrogenate asphaltenes and the partially hydrogenated slurry is filtered to remove minerals prior to subsequent catalytic hydrogenation.

  9. Dawn Spacecraft Processing

    NASA Image and Video Library

    2007-04-10

    The Dawn spacecraft is seen here in clean room C of Astrotech's Payload Processing Facility. In the clean room, the spacecraft will undergo further processing. Dawn's mission is to explore two of the asteroid belt's most intriguing and dissimilar occupants: asteroid Vesta and the dwarf planet Ceres. The Dawn mission is managed by JPL, a division of the California Institute of Technology in Pasadena, for NASA's Science Mission Directorate in Washington, D.C.

  10. Image Processing Software

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The Ames digital image velocimetry technology has been incorporated in a commercially available image processing software package that allows motion measurement of images on a PC alone. The software, manufactured by Werner Frei Associates, is IMAGELAB FFT. IMAGELAB FFT is a general purpose image processing system with a variety of other applications, among them image enhancement of fingerprints and use by banks and law enforcement agencies for analysis of videos run during robberies.

  11. A comparison of transfer-appropriate processing and multi-process frameworks for prospective memory performance.

    PubMed

    McBride, Dawn M; Abney, Drew H

    2012-01-01

    We examined multi-process (MP) and transfer-appropriate processing descriptions of prospective memory (PM). Three conditions were compared that varied the overlap in processing type (perceptual/conceptual) between the ongoing and PM tasks such that two conditions involved a match of perceptual processing and one condition involved a mismatch in processing (conceptual ongoing task/perceptual PM task). One of the matched processing conditions also created a focal PM task, whereas the other two conditions were considered non-focal (Einstein & McDaniel, 2005). PM task accuracy and ongoing task completion speed in baseline and PM task conditions were measured. Accuracy results indicated a higher PM task completion rate for the focal condition than the non-focal conditions, a finding that is consistent with predictions made by the MP view. However, reaction time (RT) analyses indicated that PM task cost did not differ across conditions when practice effects are considered. Thus, the PM accuracy results are consistent with a MP description of PM, but RT results did not support the MP view predictions regarding PM cost.

  12. A Taguchi approach on optimal process control parameters for HDPE pipe extrusion process

    NASA Astrophysics Data System (ADS)

    Sharma, G. V. S. S.; Rao, R. Umamaheswara; Rao, P. Srinivasa

    2017-06-01

    High-density polyethylene (HDPE) pipes find versatile applicability for transportation of water, sewage and slurry from one place to another. Hence, these pipes undergo tremendous pressure from the fluid carried. The present work entails the optimization of the withstanding pressure of HDPE pipes using the Taguchi technique. The traditional heuristic methodology stresses a trial-and-error approach and relies heavily upon the accumulated experience of the process engineers for determining the optimal process control parameters. This results in the setting of less-than-optimal values. Hence, there arose a necessity to determine optimal process control parameters for the pipe extrusion process, which can ensure robust pipe quality and process reliability. In the proposed optimization strategy, design of experiments (DoE) is conducted wherein different control parameter combinations are analyzed by considering multiple setting levels of each control parameter. The concept of the signal-to-noise ratio (S/N ratio) is applied, and ultimately optimum values of the process control parameters are obtained: a pushing-zone temperature of 166 °C, a dimmer speed of 8 rpm, and a die-head temperature of 192 °C. A confirmation experimental run was also conducted to verify the analysis; the results proved to be in synchronization with the main experimental findings, and the withstanding pressure showed a significant improvement from 0.60 to 1.004 MPa.
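    The larger-the-better S/N ratio used in such Taguchi analyses has a standard closed form; the sketch below is illustrative only, using hypothetical replicate pressures rather than the paper's data:

    ```python
    import math

    def sn_larger_the_better(values):
        """Taguchi larger-the-better signal-to-noise ratio:
        S/N = -10 * log10( (1/n) * sum(1 / y_i^2) )."""
        n = len(values)
        return -10.0 * math.log10(sum(1.0 / (y * y) for y in values) / n)

    # Hypothetical withstanding-pressure replicates (MPa) for one DoE trial;
    # larger pressure is better, so the larger-the-better form applies.
    replicates = [0.90, 1.00, 1.10]
    print(sn_larger_the_better(replicates))
    ```

    The trial combination with the highest S/N ratio across replicates is the one Taguchi analysis would select as most robust.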

  13. Process development

    NASA Technical Reports Server (NTRS)

    Bickler, D. B.

    1985-01-01

    An overview is given of seven process development activities which were presented at this session. Pulsed excimer laser processing of photovoltaic cells was presented. A different pulsed excimer laser annealing approach, using a 50 W laser, was described. Diffusion barrier research focused on lowering the chemical reactivity of amorphous thin films on silicon. In another effort, adherent and conductive films were successfully achieved. Other efforts were aimed at achieving a simultaneous front and back junction. Microwave-enhanced plasma deposition experiments were performed. An updated version of the Solar Array Manufacturing Industry Costing Standards (SAMICS) was presented, along with a life-cycle cost analysis of high-efficiency cells. The last presentation was on the evaluation of the ethyl vinyl acetate encapsulating system.

  14. A Measurable Model of the Creative Process in the Context of a Learning Process

    ERIC Educational Resources Information Center

    Ma, Min; Van Oystaeyen, Fred

    2016-01-01

    The authors' aim was to arrive at a measurable model of the creative process by putting creativity in the context of a learning process. The authors aimed to provide a rather detailed description of how creative thinking fits in a general description of the learning process without trying to go into an analysis of a biological description of the…

  15. Assessment of Advanced Coal Gasification Processes

    NASA Technical Reports Server (NTRS)

    McCarthy, John; Ferrall, Joseph; Charng, Thomas; Houseman, John

    1981-01-01

    This report represents a technical assessment of the following advanced coal gasification processes: AVCO High Throughput Gasification (HTG) Process; Bell Single-Stage High Mass Flux (HMF) Process; Cities Service/Rockwell (CS/R) Hydrogasification Process; Exxon Catalytic Coal Gasification (CCG) Process. Each process is evaluated for its potential to produce SNG from a bituminous coal. In addition to identifying the new technology these processes represent, key similarities/differences, strengths/weaknesses, and potential improvements to each process are identified. The AVCO HTG and the Bell HMF gasifiers share similarities with respect to short residence time (SRT), high throughput rate, slagging, and syngas as the initial raw product gas. The CS/R Hydrogasifier is also SRT but is non-slagging and produces a raw gas high in methane content. The Exxon CCG gasifier is a long-residence-time, catalytic, fluid-bed reactor producing all of the raw product methane in the gasifier. The report makes the following assessments: 1) while each process has significant potential as a coal gasifier, the CS/R and Exxon processes are better suited for SNG production; 2) the Exxon process is the closest to a commercial level for near-term SNG production; and 3) the SRT processes require significant development, including scale-up and turndown demonstration, char processing and/or utilization demonstration, and reactor control and safety features development.

  16. Advances in downstream processing of biologics - Spectroscopy: An emerging process analytical technology.

    PubMed

    Rüdt, Matthias; Briskot, Till; Hubbuch, Jürgen

    2017-03-24

    Process analytical technologies (PAT) for the manufacturing of biologics have drawn increased interest in the last decade. Besides being encouraged by the Food and Drug Administration's (FDA's) PAT initiative, PAT promises to improve process understanding, reduce overall production costs and help to implement continuous manufacturing. This article focuses on spectroscopic tools for PAT in downstream processing (DSP). Recent advances and future perspectives will be reviewed. In order to exploit the full potential of gathered data, chemometric tools are widely used for the evaluation of complex spectroscopic information. Thus, an introduction into the field will be given. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  17. PROCESSING ALTERNATIVES FOR DESTRUCTION OF TETRAPHENYLBORATE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lambert, D.; Peters, T.; Fink, S.

    Two processes were chosen in the 1980s at the Savannah River Site (SRS) to decontaminate the soluble High Level Waste (HLW). The In-Tank Precipitation (ITP) process (1,2) was developed at SRS for the removal of radioactive cesium and actinides from the soluble HLW. Sodium tetraphenylborate was added to the waste to precipitate cesium, and monosodium titanate (MST) was added to adsorb actinides, primarily uranium and plutonium. Two products of this process were a low-activity waste stream and a concentrated organic stream containing cesium tetraphenylborate and actinides adsorbed on MST. A copper-catalyzed acid hydrolysis process was built to process (3,4) the Tank 48H cesium tetraphenylborate waste in SRS's Defense Waste Processing Facility (DWPF). Operation of the DWPF would have resulted in the production of benzene for incineration in SRS's Consolidated Incineration Facility. This process was abandoned together with the ITP process in 1998 due to high benzene in ITP caused by decomposition of excess sodium tetraphenylborate. Processing in ITP resulted in the production of approximately 1.0 million liters of HLW. SRS has chosen a solvent extraction process combined with adsorption of the actinides to decontaminate the soluble HLW stream (5). However, the waste in Tank 48H is incompatible with existing waste processing facilities. As a result, a processing facility is needed to disposition the HLW in Tank 48H. This paper describes the search by SRS task teams for processing options for the disposition of the waste in Tank 48H. In addition, attempts to develop a caustic hydrolysis process for in-tank destruction of tetraphenylborate are presented. Lastly, the development of both caustic and acidic copper-catalyzed peroxide oxidation processes is discussed.

  18. Product/Process (P/P) Models For The Defense Waste Processing Facility (DWPF): Model Ranges And Validation Ranges For Future Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jantzen, C.; Edwards, T.

    Radioactive high level waste (HLW) at the Savannah River Site (SRS) has successfully been vitrified into borosilicate glass in the Defense Waste Processing Facility (DWPF) since 1996. Vitrification requires stringent product/process (P/P) constraints since the glass cannot be reworked once it is poured into ten-foot-tall by two-foot-diameter canisters. A unique “feed forward” statistical process control (SPC) was developed for this control rather than statistical quality control (SQC). In SPC, the feed composition to the DWPF melter is controlled prior to vitrification. In SQC, the glass product would be sampled after it is vitrified. Individual glass property-composition models form the basis for the “feed forward” SPC. The models transform constraints on the melt and glass properties into constraints on the feed composition going to the melter in order to guarantee, at the 95% confidence level, that the feed will be processable and that the durability of the resulting waste form will be acceptable to a geologic repository.

  19. HMI conventions for process control graphics.

    PubMed

    Pikaar, Ruud N

    2012-01-01

    Process operators supervise and control complex processes. To enable the operator to do an adequate job, instrumentation and process control engineers need to address several related topics, such as console design, information design, navigation, and alarm management. In process control upgrade projects, usually a 1:1 conversion of existing graphics is proposed. This paper suggests another approach, efficiently leading to a reduced number of new, powerful process graphics supported by a permanent process overview display. In addition, a road map for structuring content (process information) and conventions for the presentation of objects, symbols, and so on have been developed. The impact of the human factors engineering approach on process control upgrade projects is illustrated by several cases.

  20. Defining the paramedic process.

    PubMed

    Carter, Holly; Thompson, James

    2015-01-01

    The use of a 'process of care' is well established in several health professions, most evidently within the field of nursing. Now ingrained within methods of care delivery, it offers a logical approach to problem solving and ensures an appropriate delivery of interventions that are specifically suited to the individual patient. Paramedicine is a rapidly advancing profession despite a wide acknowledgement of limited research provisions. This frequently results in the borrowing of evidence from other disciplines. While this has often been useful, there are many concerns relating to the acceptable limit of evidence transcription between professions. To date, there is no formally recognised 'process of care'-defining activity within the pre-hospital arena. With much current focus on the professional classification of paramedic work, it is considered timely to formally define a formula that underpins other professional roles such as nursing. It is hypothesised that defined processes of care, particularly the nursing process, may have features that would readily translate to pre-hospital practice. The literature analysed was obtained through systematic searches of a range of databases, including Ovid MEDLINE and the Cumulative Index to Nursing and Allied Health Literature (CINAHL). The results demonstrated that the defined process of care provides nursing with more than just a structure for practice, but also has implications for education, clinical governance and professional standing. The current nursing process does not directly articulate to the complex and often unstructured role of the paramedic; however, it has many principles that offer value to the paramedic in their practice. Expanding the nursing process model to include the stages of Dispatch Considerations, Scene Assessment, First Impressions, Patient History, Physical Examination, Clinical Decision-Making, Interventions, Re-evaluation, Transport Decisions, Handover and Reflection would provide an appropriate model for pre-hospital practice.

  1. [A new strategy for Chinese medicine processing technologies: coupled with individuation processed and cybernetics].

    PubMed

    Zhang, Ding-kun; Yang, Ming; Han, Xue; Lin, Jun-zhi; Wang, Jia-bo; Xiao, Xiao-he

    2015-08-01

    The stable and controllable quality of decoction pieces is an important factor in ensuring the efficacy of clinical medicine. Considering the dilemma that the existing standardized processing mode cannot effectively eliminate the variability in the quality of raw ingredients or ensure stability between different batches, we first propose a new strategy for Chinese medicine processing technologies that couples individualized processing with cybernetics. To explain this thinking, a case study on different grades of aconite is provided. We hope this strategy can better serve clinical medicine and promote the inheritance and innovation of Chinese medicine processing skills and theories.

  2. Information Processing Concepts: A Cure for "Technofright." Information Processing in the Electronic Office. Part 1: Concepts.

    ERIC Educational Resources Information Center

    Popyk, Marilyn K.

    1986-01-01

    Discusses the new automated office and its six major technologies (data processing, word processing, graphics, image, voice, and networking), the information processing cycle (input, processing, output, distribution/communication, and storage and retrieval), ergonomics, and ways to expand office education classes (versus class instruction). (CT)

  3. 75 FR 3893 - Legal Processes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-25

    ... DEPARTMENT OF COMMERCE Patent and Trademark Office Legal Processes ACTION: Proposed collection... legal processes may be found under 37 CFR Part 104, which outlines procedures for service of process, demands for employee testimony and production of documents in legal proceedings, reports of unauthorized...

  4. MRO DKF Post-Processing Tool

    NASA Technical Reports Server (NTRS)

    Ayap, Shanti; Fisher, Forest; Gladden, Roy; Khanampompan, Teerapat

    2008-01-01

    This software tool saves time and reduces risk by automating two labor-intensive and error-prone post-processing steps required for every DKF [DSN (Deep Space Network) Keyword File] that MRO (Mars Reconnaissance Orbiter) produces, and is being extended to post-process the corresponding TSOE (Text Sequence Of Events) as well. The need for this post-processing step stems from limitations in the seq-gen modeling resulting in incorrect DKF generation that is then cleaned up in post-processing.

  5. Developments in Signature Process Control

    NASA Astrophysics Data System (ADS)

    Keller, L. B.; Dominski, Marty

    1993-01-01

    Developments in the adaptive process control technique known as Signature Process Control for Advanced Composites (SPCC) are described. This computer control method for autoclave processing of composites was used to develop an optimum cure cycle for AFR 700B polyimide and for an experimental poly-isoimide. An improved process cycle was developed for Avimid N polyimide. The potential for extending the SPCC technique to prepreg quality control, press molding, pultrusion, and RTM is briefly discussed.

  6. Space processing of chalcogenide glass

    NASA Technical Reports Server (NTRS)

    Larsen, D. C.; Ali, M. I.

    1977-01-01

    The manner in which the weightless, containerless nature of in-space processing can be successfully utilized to improve the quality of infrared-transmitting chalcogenide glasses was determined. The technique of space processing chalcogenide glass was developed, and the process and equipment necessary to do so were defined. Earthbound processing experiments were performed with As2S3 and Ge28Sb12Se60 glasses. Incorporated into these experiments was the use of an acoustic levitation device.

  7. Range Process Simulation Tool

    NASA Technical Reports Server (NTRS)

    Phillips, Dave; Haas, William; Barth, Tim; Benjamin, Perakath; Graul, Michael; Bagatourova, Olga

    2005-01-01

    Range Process Simulation Tool (RPST) is a computer program that assists managers in rapidly predicting and quantitatively assessing the operational effects of proposed technological additions to, and/or upgrades of, complex facilities and engineering systems such as the Eastern Test Range. Originally designed for application to space transportation systems, RPST is also suitable for assessing effects of proposed changes in industrial facilities and large organizations. RPST follows a model-based approach that includes finite-capacity schedule analysis and discrete-event process simulation. A component-based, scalable, open architecture makes RPST easily and rapidly tailorable for diverse applications. Specific RPST functions include: (1) definition of analysis objectives and performance metrics; (2) selection of process templates from a process-template library; (3) configuration of process models for detailed simulation and schedule analysis; (4) design of operations-analysis experiments; (5) schedule- and simulation-based process analysis; and (6) optimization of performance by use of genetic algorithms and simulated annealing. The main benefits afforded by RPST are provision of information that can be used to reduce costs of operation and maintenance, and the capability for affordable, accurate, and reliable prediction and exploration of the consequences of many alternative proposed decisions.
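    RPST itself is not publicly distributed; as a rough, assumption-laden sketch of the discrete-event process simulation idea it builds on, the following toy model (all job timings and resource counts hypothetical) pushes jobs through a capacity-limited resource and reports the makespan:

    ```python
    import heapq

    def simulate(jobs, servers, duration):
        """Minimal discrete-event simulation: jobs arriving at the given
        times are served FIFO by `servers` identical resources, each
        service taking `duration` time units. Returns the completion
        time of the last job (the makespan)."""
        free_at = [0.0] * servers          # next time each server is available
        heapq.heapify(free_at)
        finish = 0.0
        for arrival in sorted(jobs):
            start = max(arrival, heapq.heappop(free_at))  # wait for a free server
            done = start + duration
            heapq.heappush(free_at, done)
            finish = max(finish, done)
        return finish

    # Three jobs arriving at t = 0, 1, 2; two servers; 5-time-unit service each.
    print(simulate([0.0, 1.0, 2.0], servers=2, duration=5.0))  # → 10.0
    ```

    A finite-capacity schedule analysis of the kind RPST performs amounts to running such a model over candidate resource configurations and comparing the resulting performance metrics.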

  8. Improvement of hospital processes through business process management in Qaem Teaching Hospital: A work in progress.

    PubMed

    Yarmohammadian, Mohammad H; Ebrahimipour, Hossein; Doosty, Farzaneh

    2014-01-01

    In a world of continuously changing business environments, organizations have no option but to deal with such a high level of transformation in order to adjust to the consequent demands. Therefore, many companies need to continually improve and review their processes to maintain their competitive advantages in an uncertain environment. Meeting these challenges requires implementing the most efficient possible business processes, geared to the needs of the industry and market segments that the organization serves globally. In the last 10 years, total quality management, business process reengineering, and business process management (BPM) have been some of the management tools applied by organizations to increase business competitiveness. This paper is an original article that presents the implementation of the BPM approach in the healthcare domain, which allows an organization to improve and review its critical business processes. This project was performed in Qaem Teaching Hospital in Mashhad city, Iran and consists of four distinct steps: (1) identify business processes, (2) document the process, (3) analyze and measure the process, and (4) improve the process. Implementing BPM in Qaem Teaching Hospital changed the nature of management by allowing the organization to avoid the complexity of disparate, siloed systems. BPM instead enabled the organization to focus on business processes at a higher level.

  9. Dynamic analysis of process reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shadle, L.J.; Lawson, L.O.; Noel, S.D.

    1995-06-01

    The approach and methodology of conducting a dynamic analysis is presented in this poster session in order to describe how this type of analysis can be used to evaluate the operation and control of process reactors. Dynamic analysis of the PyGas™ gasification process is used to illustrate the utility of this approach. PyGas™ is the gasifier being developed for the Gasification Product Improvement Facility (GPIF) by Jacobs-Siffine Engineering and Riley Stoker. In the first step of the analysis, process models are used to calculate the steady-state conditions and associated sensitivities for the process. For the PyGas™ gasifier, the process models are non-linear mechanistic models of the jetting fluidized-bed pyrolyzer and the fixed-bed gasifier. These process sensitivities are key input, in the form of gain parameters or transfer functions, to the dynamic engineering models.

  10. Reward Processing, Neuroeconomics, and Psychopathology.

    PubMed

    Zald, David H; Treadway, Michael T

    2017-05-08

    Abnormal reward processing is a prominent transdiagnostic feature of psychopathology. The present review provides a framework for considering the different aspects of reward processing and their assessment, and highlights recent insights from the field of neuroeconomics that may aid in understanding these processes. Although altered reward processing in psychopathology has often been treated as a general hypo- or hyperresponsivity to reward, increasing data indicate that a comprehensive understanding of reward dysfunction requires characterization within more specific reward-processing domains, including subjective valuation, discounting, hedonics, reward anticipation and facilitation, and reinforcement learning. As such, more nuanced models of the nature of these abnormalities are needed. We describe several processing abnormalities capable of producing the types of selective alterations in reward-related behavior observed in different forms of psychopathology, including (mal)adaptive scaling and anchoring, dysfunctional weighting of reward and cost variables, competition between valuation systems, and reward prediction error signaling.

  11. The Internet Process Addiction Test: Screening for Addictions to Processes Facilitated by the Internet

    PubMed Central

    Northrup, Jason C.; Lapierre, Coady; Kirk, Jeffrey; Rae, Cosette

    2015-01-01

    The Internet Process Addiction Test (IPAT) was created to screen for potential addictive behaviors that could be facilitated by the internet. The IPAT was created with the mindset that the term “Internet addiction” is structurally problematic, as the Internet is simply the medium that one uses to access various addictive processes. The role of the internet in facilitating addictions, however, cannot be minimized. A new screening tool that effectively directed researchers and clinicians to the specific processes facilitated by the internet would therefore be useful. This study shows that the Internet Process Addiction Test (IPAT) demonstrates good validity and reliability. Four addictive processes were effectively screened for with the IPAT: Online video game playing, online social networking, online sexual activity, and web surfing. Implications for further research and limitations of the study are discussed. PMID:26226007

  12. Processed foods: contributions to nutrition.

    PubMed

    Weaver, Connie M; Dwyer, Johanna; Fulgoni, Victor L; King, Janet C; Leveille, Gilbert A; MacDonald, Ruth S; Ordovas, Jose; Schnakenberg, David

    2014-06-01

    Both fresh and processed foods make up vital parts of the food supply. Processed food contributes to both food security (ensuring that sufficient food is available) and nutrition security (ensuring that food quality meets human nutrient needs). This ASN scientific statement focuses on one aspect of processed foods: their nutritional impacts. Specifically, this scientific statement 1) provides an introduction to how processed foods contribute to the health of populations, 2) analyzes the contribution of processed foods to "nutrients to encourage" and "constituents to limit" in the American diet as recommended by the Dietary Guidelines for Americans, 3) identifies the responsibilities of various stakeholders in improving the American diet, and 4) reviews emerging technologies and the research needed for a better understanding of the role of processed foods in a healthy diet. Analyses of the NHANES 2003-2008 show that processed foods provide both nutrients to encourage and constituents to limit as specified in the 2010 Dietary Guidelines for Americans. Of the nutrients to encourage, processed foods contributed 55% of dietary fiber, 48% of calcium, 43% of potassium, 34% of vitamin D, 64% of iron, 65% of folate, and 46% of vitamin B-12. Of the constituents to limit, processed foods contributed 57% of energy, 52% of saturated fat, 75% of added sugars, and 57% of sodium. Diets are more likely to meet food guidance recommendations if nutrient-dense foods, either processed or not, are selected. Nutrition and food science professionals, the food industry, and other stakeholders can help to improve the diets of Americans by providing a nutritious food supply that is safe, enjoyable, affordable, and sustainable by communicating effectively and accurately with each other and by working together to improve the overall knowledge of consumers. © 2014 American Society for Nutrition.

  13. A signal processing method for the friction-based endpoint detection system of a CMP process

    NASA Astrophysics Data System (ADS)

    Chi, Xu; Dongming, Guo; Zhuji, Jin; Renke, Kang

    2010-12-01

    A signal processing method for the friction-based endpoint detection system of a chemical mechanical polishing (CMP) process is presented. The signal processing method uses wavelet threshold denoising to reduce the noise contained in the measured original signal, extracts the Kalman filter innovation from the denoised signal as the feature signal, and judges the CMP endpoint based on the behavior of the Kalman filter innovation sequence during the CMP process. Applying this signal processing method, endpoint detection experiments for the Cu CMP process were carried out. The results show that the method can judge the endpoint of the Cu CMP process.
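    The Kalman filter innovation named in this abstract is simply the difference between each new measurement and the filter's prediction. The sketch below (a scalar constant-level filter on a synthetic friction trace, with hypothetical noise parameters, and omitting the paper's wavelet denoising step) illustrates how a step change at the process endpoint appears as a spike in the innovation sequence:

    ```python
    def kalman_innovations(signal, q=1e-4, r=0.05):
        """Scalar constant-level Kalman filter; returns the innovation
        (measurement minus prediction) at each step. q = process noise,
        r = measurement noise (both hypothetical tuning values)."""
        x, p = signal[0], 1.0
        innovations = []
        for z in signal[1:]:
            p = p + q                      # predict (level assumed constant)
            innov = z - x                  # innovation: new info in measurement
            k = p / (p + r)                # Kalman gain
            x = x + k * innov              # update state estimate
            p = (1 - k) * p
            innovations.append(innov)
        return innovations

    # A synthetic friction trace: flat at 1.0, stepping to 0.6 at the endpoint.
    trace = [1.0] * 20 + [0.6] * 10
    inn = kalman_innovations(trace)
    endpoint = max(range(len(inn)), key=lambda i: abs(inn[i]))
    print(endpoint + 1)  # index of the first post-step sample in the trace
    ```

    In the real system the innovation spike would be compared against a threshold tuned on the denoised friction signal rather than picked as a global maximum.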

  14. Zucchini-dependent piRNA processing is triggered by recruitment to the cytoplasmic processing machinery

    PubMed Central

    Rogers, Alicia K.; Situ, Kathy; Perkins, Edward M.; Toth, Katalin Fejes

    2017-01-01

    The piRNA pathway represses transposable elements in the gonads and thereby plays a vital role in protecting the integrity of germline genomes of animals. Mature piRNAs are processed from longer transcripts, piRNA precursors (pre-piRNAs). In Drosophila, processing of pre-piRNAs is initiated by piRNA-guided Slicer cleavage or the endonuclease Zucchini (Zuc). As Zuc does not have any sequence or structure preferences in vitro, it is not known how piRNA precursors are selected and channeled into the Zuc-dependent processing pathway. We show that a heterologous RNA that lacks complementary piRNAs is processed into piRNAs upon recruitment of several piRNA pathway factors. This processing requires Zuc and the helicase Armitage (Armi). Aubergine (Aub), Argonaute 3 (Ago3), and components of the nuclear RDC complex, which are required for normal piRNA biogenesis in germ cells, are dispensable. Our approach allows discrimination of proteins involved in the transcription and export of piRNA precursors from components required for the cytoplasmic processing steps. piRNA processing correlates with localization of the substrate RNA to nuage, a distinct membraneless cytoplasmic compartment, which surrounds the nucleus of germ cells, suggesting that sequestration of RNA to this subcellular compartment is both necessary and sufficient for selecting piRNA biogenesis substrates. PMID:29021243

  15. Catalytic biomass pyrolysis process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dayton, David C.; Gupta, Raghubir P.; Turk, Brian S.

    Described herein are processes for converting a biomass starting material (such as lignocellulosic materials) into a stable liquid intermediate with low oxygen content that can be refined to make liquid hydrocarbon fuels. More specifically, the process can be a catalytic biomass pyrolysis process wherein an oxygen-removing catalyst is employed in the reactor while the biomass is subjected to pyrolysis conditions. The stream exiting the pyrolysis reactor comprises bio-oil having a low oxygen content, and such stream may be subjected to further steps, such as separation and/or condensation, to isolate the bio-oil.

  16. GREENSCOPE: Sustainable Process Modeling

    EPA Science Inventory

    EPA researchers are responding to environmental problems by incorporating sustainability into process design and evaluation. EPA researchers are also developing a tool that allows users to assess modifications to existing and new chemical processes to determine whether changes in...

  17. The Diazo Copying Process.

    ERIC Educational Resources Information Center

    Osterby, Bruce

    1989-01-01

    Described is an activity which demonstrates an organic-based reprographic method that is used extensively for the duplication of microfilm and engineering drawings. Discussed are the chemistry of the process and how to demonstrate the process for students. (CW)

  18. Detecting determinism from point processes.

    PubMed

    Andrzejak, Ralph G; Mormann, Florian; Kreuz, Thomas

    2014-12-01

    The detection of a nonrandom structure from experimental data can be crucial for the classification, understanding, and interpretation of the generating process. We here introduce a rank-based nonlinear predictability score to detect determinism from point process data. Thanks to its modular nature, this approach can be adapted to whatever signature in the data one considers indicative of deterministic structure. After validating our approach using point process signals from deterministic and stochastic model dynamics, we show an application to neuronal spike trains recorded in the brain of an epilepsy patient. While we illustrate our approach in the context of temporal point processes, it can be readily applied to spatial point processes as well.
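
    A minimal version of such a rank-based predictability score might look as follows. This is a sketch of the general idea applied to interspike intervals, not the authors' exact estimator; the embedding dimension, Theiler window, and nearest-neighbor cross-prediction rule are assumptions. Scores well below 0.5 suggest deterministic structure; scores near 0.5 are consistent with a random sequence.

```python
import numpy as np

def predictability_score(isi, dim=2, theiler=1):
    """Rank-based nonlinear predictability score for an interspike-interval
    sequence. For each delay vector, the interval following its nearest
    neighbor serves as a cross-prediction; the score is the mean normalized
    rank of that prediction's error among all candidate predictions."""
    x = np.asarray(isi, float)
    # delay embedding: row j = (x[j], ..., x[j+dim-1]); its target is x[j+dim]
    emb = np.column_stack([x[i:len(x) - dim + i] for i in range(dim)])
    targets = x[dim:]
    ranks = []
    for i in range(len(emb)):
        d = np.linalg.norm(emb - emb[i], axis=1)
        d[max(0, i - theiler):i + theiler + 1] = np.inf  # Theiler exclusion
        nn = int(np.argmin(d))                           # nearest neighbor
        errs = np.abs(targets - targets[i])              # candidate errors
        errs[i] = np.inf                                 # self is excluded
        ranks.append(np.sum(errs < errs[nn]) / (len(errs) - 1))
    return float(np.mean(ranks))
```

    Applied to a deterministic map (e.g. logistic-map iterates) the score drops far below 0.5, while an i.i.d. random sequence stays near 0.5, illustrating the modular "signature of determinism" idea described in the abstract.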

  19. Dawn Spacecraft Processing

    NASA Image and Video Library

    2007-04-10

    In Astrotech's Payload Processing Facility, technicians help secure the Dawn spacecraft onto a moveable stand. Dawn will be moved into clean room C for unbagging and further processing. Dawn's mission is to explore two of the asteroid belt's most intriguing and dissimilar occupants: asteroid Vesta and the dwarf planet Ceres. The Dawn mission is managed by JPL, a division of the California Institute of Technology in Pasadena, for NASA's Science Mission Directorate in Washington, D.C.

  20. Dawn Spacecraft Processing

    NASA Image and Video Library

    2007-04-10

    In Astrotech's Payload Processing Facility, an overhead crane lifts the Dawn spacecraft from its transporter. Dawn will be moved into clean room C for unbagging and further processing. Dawn's mission is to explore two of the asteroid belt's most intriguing and dissimilar occupants: asteroid Vesta and the dwarf planet Ceres. The Dawn mission is managed by JPL, a division of the California Institute of Technology in Pasadena, for NASA's Science Mission Directorate in Washington, D.C.

  1. Materials and Process Design for High-Temperature Carburizing: Integrating Processing and Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D. Apelian

    2007-07-23

    The objective of the project is to develop an integrated process for fast, high-temperature carburizing. The new process results in an order-of-magnitude reduction in cycle time compared to conventional carburizing, and represents significant energy savings in addition to a corresponding reduction in scrap through distortion-free carburizing of steels.

  2. Teaching the NIATx Model of Process Improvement as an Evidence-Based Process

    ERIC Educational Resources Information Center

    Evans, Alyson C.; Rieckmann, Traci; Fitzgerald, Maureen M.; Gustafson, David H.

    2007-01-01

    Process Improvement (PI) is an approach for helping organizations to identify and resolve inefficient and ineffective processes through problem solving and pilot testing change. Use of PI in improving client access, retention and outcomes in addiction treatment is on the rise through the teaching of the Network for the Improvement of Addiction…

  3. Test processing system (SEE)

    NASA Technical Reports Server (NTRS)

    Gaulene, P.

    1986-01-01

    The SEE data processing system, developed in 1985, manages and processes test results. General information is provided on the SEE system: objectives, characteristics, basic principles, general organization, and operation. Full documentation is accessible by computer using the HELP SEE command.

  4. Hyperspectral image processing methods

    USDA-ARS?s Scientific Manuscript database

    Hyperspectral image processing refers to the use of computer algorithms to extract, store and manipulate both spatial and spectral information contained in hyperspectral images across the visible and near-infrared portion of the electromagnetic spectrum. A typical hyperspectral image processing work...

  5. Suppliers solve processing problems

    USDA-ARS?s Scientific Manuscript database

    This year's IFT food expo showcased numerous companies and organizations offering solutions to food processing needs and challenges. From small-scale unit operations to commercial-scale equipment lines, exhibitors highlighted both traditional and novel food processing operations for food product dev...

  6. Information processing, computation, and cognition.

    PubMed

    Piccinini, Gualtiero; Scarantino, Andrea

    2011-01-01

    Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both - although others disagree vehemently. Yet different cognitive scientists use 'computation' and 'information processing' to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism, connectionism, and computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates' empirical aspects.

  7. Associative architecture for image processing

    NASA Astrophysics Data System (ADS)

    Adar, Rutie; Akerib, Avidan

    1997-09-01

    This article presents a new generation in parallel processing architecture for real-time image processing. The approach is implemented in a real-time image processor chip, called the Xium™-2, based on combining a fully associative array, which provides the parallel engine, with a serial RISC core on the same die. The architecture is fully programmable and can implement a wide range of color image processing, computer vision, and media processing functions in real time. The associative part of the chip is based on the patent-pending methodology of Associative Computing Ltd. (ACL), which condenses 2048 associative processors, each of 128 'intelligent' bits. Each bit can be a processing bit or a memory bit. Even at only 33 MHz, and fabricated in a 0.6-micron process technology, the chip has a computational power of 3 billion ALU operations per second and 66 billion string search operations per second. The fully programmable nature of the Xium™-2 chip enables developers to use ACL tools to write their own proprietary algorithms combined with existing image processing and analysis functions from ACL's extended set of libraries.
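
    The core primitive of such an associative array, comparing every stored word against a masked search key in parallel, can be emulated in a few lines of vectorized code. This is a conceptual sketch of content-addressable search, not ACL's implementation; word width and bit encoding are assumptions.

```python
import numpy as np

def associative_match(memory, key, mask):
    """Content-addressable search: every word in `memory` (one row per
    associative processor) is compared against `key` in parallel, but only
    at the bit positions selected by `mask`. Returns a boolean tag vector
    marking the responding words, as an associative array would."""
    return np.all((memory & mask) == (key & mask), axis=1)
```

    For example, searching a 3-word memory for all words whose first two bits equal `1 0` tags the matching rows in a single parallel comparison, which is the operation the chip performs across all 2048 processors at once.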

  8. Multibeam sonar backscatter data processing

    NASA Astrophysics Data System (ADS)

    Schimel, Alexandre C. G.; Beaudoin, Jonathan; Parnum, Iain M.; Le Bas, Tim; Schmidt, Val; Keith, Gordon; Ierodiaconou, Daniel

    2018-06-01

    Multibeam sonar systems now routinely record seafloor backscatter data, which are processed into backscatter mosaics and angular responses, both of which can assist in identifying seafloor types and morphology. Those data products are obtained from the multibeam sonar raw data files through a sequence of data processing stages that follows a basic plan, but the implementation of which varies greatly between sonar systems and software. In this article, we provide a comprehensive review of this backscatter data processing chain, with a focus on the variability in the possible implementation of each processing stage. Our objective for undertaking this task is twofold: (1) to provide an overview of backscatter data processing for the consideration of the general user and (2) to provide suggestions to multibeam sonar manufacturers, software providers and the operators of these systems and software for eventually reducing the lack of control, uncertainty and variability associated with current data processing implementations and the resulting backscatter data products. One such suggestion is the adoption of a nomenclature for increasingly refined levels of processing, akin to the nomenclature adopted for satellite remote-sensing data deliverables.

  9. Consumers' conceptualization of ultra-processed foods.

    PubMed

    Ares, Gastón; Vidal, Leticia; Allegue, Gimena; Giménez, Ana; Bandeira, Elisa; Moratorio, Ximena; Molina, Verónika; Curutchet, María Rosa

    2016-10-01

    Consumption of ultra-processed foods has been associated with low diet quality, obesity and other non-communicable diseases. This situation makes it necessary to develop educational campaigns to discourage consumers from substituting meals based on unprocessed or minimally processed foods by ultra-processed foods. In this context, the aim of the present work was to investigate how consumers conceptualize the term ultra-processed foods and to evaluate if the foods they perceive as ultra-processed are in concordance with the products included in the NOVA classification system. An online study was carried out with 2381 participants. They were asked to explain what they understood by ultra-processed foods and to list foods that can be considered ultra-processed. Responses were analysed using inductive coding. The great majority of the participants was able to provide an explanation of what ultra-processed foods are, which was similar to the definition described in the literature. Most of the participants described ultra-processed foods as highly processed products that usually contain additives and other artificial ingredients, stressing that they have low nutritional quality and are unhealthful. The most relevant products for consumers' conceptualization of the term were in agreement with the NOVA classification system and included processed meats, soft drinks, snacks, burgers, powdered and packaged soups and noodles. However, some of the participants perceived processed foods, culinary ingredients and even some minimally processed foods as ultra-processed. This suggests that in order to accurately convey their message, educational campaigns aimed at discouraging consumers from consuming ultra-processed foods should include a clear definition of the term and describe some of their specific characteristics, such as the type of ingredients included in their formulation and their nutritional composition. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. 7 CFR 932.14 - Process.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Process. 932.14 Section 932.14 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements and... Handling Definitions § 932.14 Process. Process means to change olives in any way from their natural...

  11. Parallel processing spacecraft communication system

    NASA Technical Reports Server (NTRS)

    Bolotin, Gary S. (Inventor); Donaldson, James A. (Inventor); Luong, Huy H. (Inventor); Wood, Steven H. (Inventor)

    1998-01-01

    An uplink controlling assembly speeds data processing using a special parallel codeblock technique. A correct start sequence initiates processing of a frame. Two possible start sequences can be used, and the one which is used determines whether data polarity is inverted or non-inverted. Processing continues until uncorrectable errors are found. The frame ends by intentionally sending a block with an uncorrectable error. Each of the codeblocks in the frame has a channel ID, and each channel ID can be processed separately in parallel. This obviates the problem of waiting for error correction processing. If the channel number is zero, however, it indicates that the frame of data represents a critical command only; that data is handled in a special way, independent of the software. Otherwise, the processed data is further handled using special double-buffering techniques to avoid problems from overrun. When overrun does occur, the system takes action to lose only the oldest data.
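
    The double-buffering idea, including the "lose only the oldest data" overrun policy, can be sketched as follows. This is a simplified single-lock illustration, not the patented design; the buffer capacity and list-based storage are assumptions.

```python
import threading

class DoubleBuffer:
    """Minimal double buffer: the producer always appends to the 'back'
    buffer while the consumer drains the 'front' one; swap_and_consume()
    exchanges the two atomically so neither side stalls mid-write. On
    overrun, only the oldest unconsumed item is dropped."""

    def __init__(self, capacity=1024):
        self._front, self._back = [], []
        self._lock = threading.Lock()
        self.capacity = capacity

    def produce(self, item):
        with self._lock:
            if len(self._back) >= self.capacity:
                self._back.pop(0)          # overrun: lose only the oldest data
            self._back.append(item)

    def swap_and_consume(self):
        """Swap buffers and return everything produced since the last swap."""
        with self._lock:
            self._front, self._back = self._back, self._front
            drained = list(self._front)
            self._front.clear()            # buffer is reused, not reallocated
        return drained
```

    Reusing the two buffers rather than allocating fresh ones each cycle is the point of the technique: the producer never waits on the consumer's processing, only on the brief swap.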

  12. Reward Processing, Neuroeconomics, and Psychopathology

    PubMed Central

    Zald, David H.; Treadway, Michael

    2018-01-01

    Abnormal reward processing is a prominent transdiagnostic feature of psychopathology. The present review provides a framework for considering the different aspects of reward processing and their assessment, and highlights recent insights from the field of neuroeconomics that may aid in understanding these processes. Although altered reward processing in psychopathology has often been treated as a general hypo- or hyper-responsivity to reward, increasing data indicate that a comprehensive understanding of reward dysfunction requires characterization within more specific reward processing domains, including subjective valuation, discounting, hedonics, reward anticipation and facilitation, and reinforcement learning. As such, more nuanced models of the nature of these abnormalities are needed. We describe several processing abnormalities capable of producing the types of selective alterations in reward-related behavior observed in different forms of psychopathology, including (mal)adaptive scaling and anchoring, dysfunctional weighting of reward and cost variables, competition between valuation systems, and positive prediction error signaling. PMID:28301764

  13. Enriching the Web Processing Service

    NASA Astrophysics Data System (ADS)

    Wosniok, Christoph; Bensmann, Felix; Wössner, Roman; Kohlus, Jörn; Roosmann, Rainer; Heidmann, Carsten; Lehfeldt, Rainer

    2014-05-01

    The OGC Web Processing Service (WPS) provides a standard for implementing geospatial processes in service-oriented networks. In its current version 1.0.0 it allocates the operations GetCapabilities, DescribeProcess and Execute, which can be used to offer custom processes based on single or multiple sub-processes. A large range of ready to use fine granular, fundamental geospatial processes have been developed by the GIS-community in the past. However, modern use cases or whole workflow processes demand specifications of lifecycle management and service orchestration. Orchestrating smaller sub-processes is a task towards interoperability; a comprehensive documentation by using appropriate metadata is also required. Though different approaches were tested in the past, developing complex WPS applications still requires programming skills, knowledge about software libraries in use and a lot of effort for integration. Our toolset RichWPS aims at providing a better overall experience by setting up two major components. The RichWPS ModelBuilder enables the graphics-aided design of workflow processes based on existing local and distributed processes and geospatial services. Once tested by the RichWPS Server, a composition can be deployed for production use on the RichWPS Server. The ModelBuilder obtains necessary processes and services from a directory service, the RichWPS semantic proxy. It manages the lifecycle and is able to visualize results and debugging-information. One aim will be to generate reproducible results; the workflow should be documented by metadata that can be integrated in Spatial Data Infrastructures. The RichWPS Server provides a set of interfaces to the ModelBuilder for, among others, testing composed workflow sequences, estimating their performance and to publish them as common processes. Therefore the server is oriented towards the upcoming WPS 2.0 standard and its ability to transactionally deploy and undeploy processes making use of a WPS
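
    For reference, the WPS 1.0.0 operations named above are commonly invoked as key-value-pair (KVP) HTTP GET requests, which are easy to construct. The endpoint and process identifier below are placeholders for illustration, not RichWPS specifics.

```python
from urllib.parse import urlencode, urlparse, parse_qs

def wps_kvp_url(endpoint, request, **params):
    """Build a WPS 1.0.0 KVP request URL for GetCapabilities,
    DescribeProcess, or Execute. Extra keyword arguments become
    additional KVP parameters (e.g. identifier=... for DescribeProcess)."""
    kvp = {"service": "WPS", "version": "1.0.0", "request": request}
    kvp.update(params)
    return endpoint + "?" + urlencode(kvp)

# Hypothetical endpoint and process identifier, for illustration only:
url = wps_kvp_url("https://example.org/wps", "DescribeProcess",
                  identifier="myns:CoastalWorkflow")
```

    Sending such a URL to any compliant WPS returns the XML capabilities or process description; a deployed workflow composition would appear in GetCapabilities like any other process.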

  14. Decommissioning the Fuel Process Building, a Shift in Paradigm for Terminating Safeguards on Process Holdup

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ivan R. Thomas

    INMM Abstract, 51st Annual Meeting. The Fuel Process Building at the Idaho Nuclear Technology and Engineering Center (INTEC) is being decommissioned after nearly four decades of recovering high enriched uranium from various government-owned spent nuclear fuels. The separations process began with fuel dissolution in one of multiple head-ends, followed by three cycles of uranium solvent extraction, and ending with denitration of uranyl nitrate product. The entire process was very complex, and the associated equipment formed an extensive maze of vessels, pumps, piping, and instrumentation within several layers of operating corridors and process cells. Despite formal flushing and cleanout procedures, an accurate accounting for the residual uranium held up in process equipment over extended years of operation presented a daunting safeguards challenge. Upon cessation of domestic reprocessing, the holdup remained inaccessible and was exempt from measurement during ensuing physical inventories. In decommissioning the Fuel Process Building, the Idaho Cleanup Project, which operates the INTEC, deviated from the established requirements that all nuclear material holdup be measured and credited to the accountability books and that all nuclear materials, except attractiveness level E residual holdup, be transferred to another facility. Instead, the decommissioning involved grouting the process equipment in place, rather than measuring and removing the contained holdup for subsequent transfer. The grouting made the potentially attractiveness level C and D holdup even more inaccessible, thereby effectually converting the holdup to attractiveness level E and allowing for termination of safeguards controls. Prior to grouting the facility, the residual holdup was estimated by limited sampling and destructive analysis of solutions in process lines and by acceptable

  15. Model medication management process in Australian nursing homes using business process modeling.

    PubMed

    Qian, Siyu; Yu, Ping

    2013-01-01

    One of the reasons for end-user avoidance or rejection of health information systems is poor alignment of the system with the healthcare workflow, likely caused by system designers' lack of thorough understanding of the healthcare process. Therefore, understanding the healthcare workflow is the essential first step in the design of optimal technologies that will enable care staff to complete the intended tasks faster and better. The frequent use of multiple or "high-risk" medicines by older people in nursing homes has the potential to increase the medication error rate. To facilitate the design of information systems with the most potential to improve patient safety, this study aims to understand the medication management process in nursing homes using a business process modeling method. The paper presents the study design and preliminary findings from interviewing two registered nurses, who were team leaders in two nursing homes. Although there were subtle differences in medication management between the two homes, the major medication management activities were similar. Further field observation will be conducted. Based on the data collected from observations, an as-is process model for medication management will be developed.

  16. Shallow Processing and Underspecification

    ERIC Educational Resources Information Center

    Sanford, Anthony J.; Graesser, Arthur C.

    2006-01-01

    Discourse comprehension theories frequently assume that discourse comprehension involves a complete analysis of lexical, syntactic, semantic, and discourse levels of processing. However, discourse psychologists have documented some conditions when a partial processing and underspecification of the resulting representations occurs. The articles in…

  17. Innovation processes in technologies for the processing of refractory mineral raw materials

    NASA Astrophysics Data System (ADS)

    Chanturiya, V. A.

    2008-12-01

    Analysis of the grade of mineral resources of Russia and other countries shows that end products that are competitive in terms of both technological and environmental criteria in the world market can only be obtained by the development and implementation of progressive technologies based on the up-to-date achievements of fundamental sciences. The essence of modern innovation processes in technologies developed in Russia for the complex and comprehensive processing of refractory raw materials with a complex composition is ascertained. These processes include (i) radiometric methods of concentration of valuable components, (ii) high-energy methods of disintegration of highly dispersed mineral components, and (iii) electrochemical methods of water conditioning to obtain target products for solving specific technological problems.

  18. Effects of process parameters in plastic, metal, and ceramic injection molding processes

    NASA Astrophysics Data System (ADS)

    Lee, Shi W.; Ahn, Seokyoung; Whang, Chul Jin; Park, Seong Jin; Atre, Sundar V.; Kim, Jookwon; German, Randall M.

    2011-09-01

    Plastic injection molding has been widely used in the past and is a dominant forming approach today. As customer demands call for materials with better engineering properties than are feasible with polymers, powder injection molding with metal and ceramic powders has received considerable attention in recent decades. To better understand the differences among plastic injection molding, metal injection molding, and ceramic injection molding, the effects of the core process parameters on process performance have been studied using the state-of-the-art computer-aided engineering (CAE) design tool, PIMSolver®. The design of experiments was conducted using the Taguchi method to obtain the relative contributions of the various process parameters to successful operation.
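
    A Taguchi design of experiments of this kind can be illustrated with the smallest orthogonal array, L4, which screens three two-level factors in four runs. This is a generic sketch, not the study's actual design; the responses would come from PIMSolver® simulations or molding trials.

```python
import numpy as np

# Taguchi L4 (2^3) orthogonal array: three two-level factors in four runs.
# Columns might represent, e.g., melt temperature, injection speed, and
# packing pressure at low (0) and high (1) levels (hypothetical factors).
L4 = np.array([[0, 0, 0],
               [0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]])

def main_effects(responses):
    """Difference between the mean response at the high and low level of
    each factor; the magnitudes rank the factors' relative contributions,
    as in a Taguchi main-effects analysis."""
    responses = np.asarray(responses, float)
    effects = {}
    for j in range(L4.shape[1]):
        lo = responses[L4[:, j] == 0].mean()
        hi = responses[L4[:, j] == 1].mean()
        effects[f"factor_{j}"] = hi - lo
    return effects
```

    Because each column of the array is balanced against the others, four runs suffice to separate the three main effects, which is why orthogonal arrays drastically cut the number of experiments compared with a full factorial.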

  19. Processing industrial wastes with the liquid-phase reduction romelt process

    NASA Astrophysics Data System (ADS)

    Romenets, V.; Valavin, V.; Pokhvisnev, Yu.; Vandariev, S.

    1999-08-01

    The Romelt technology for liquid-phase reduction has been developed for processing metallurgical wastes containing nonferrous metal components. Thermodynamic calculations were made to investigate the behavior of silver, copper, zinc, manganese, vanadium, chromium, and silicon when reduced from the slag melt into the metallic solution containing iron. The process can be applied to all types of iron-bearing wastes, including electric arc furnace dust. The distribution of elements between the phases can be controlled by adjusting the slag bath temperature. Experiments at a pilot Romelt plant proved the possibility of recovering the metallurgical wastes and obtaining iron.

  20. Information Processing in Cognition Process and New Artificial Intelligent Systems

    NASA Astrophysics Data System (ADS)

    Zheng, Nanning; Xue, Jianru

    In this chapter, we discuss, in depth, visual information processing and a new artificial intelligent (AI) system that is based upon cognitive mechanisms. The relationship between a general model of intelligent systems and cognitive mechanisms is described, and in particular we explore visual information processing with selective attention. We also discuss a methodology for studying the new AI system and propose some important basic research issues that have emerged in the intersecting fields of cognitive science and information science. To this end, a new scheme for associative memory and a new architecture for an AI system with attractors of chaos are addressed.