FMEA and RAM Analysis for the Multi-Canister Overpack (MCO) Handling Machine
DOE Office of Scientific and Technical Information (OSTI.GOV)
SWENSON, C.E.
2000-06-01
The Failure Modes and Effects Analysis (FMEA) and the Reliability, Availability, and Maintainability (RAM) Analysis performed for the Multi-Canister Overpack Handling Machine (MHM) have shown that the current design provides for a safe system, but that the reliability of the system (primarily due to the complexity of the interlocks and permissive controls) is relatively low. No specific failure modes were identified with significant consequences to the public or with significant expected impact to nearby workers. The overall reliability calculation for the MHM shows a 98.1 percent probability of operating for eight hours without failure, and an availability of 90 percent. The majority of the reliability issues are found in the interlocks and controls. The availability of appropriate spare parts and maintenance personnel, coupled with well-written operating procedures, will play a more important role in successful mission completion for the MHM than for other, less complicated systems.
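The reported figures imply a failure rate under common reliability assumptions. A minimal sketch, assuming an exponential time-to-failure model and the steady-state availability formula (neither is stated in the record):

```python
import math

# Relate the reported 8-hour mission reliability (98.1%) to an equivalent
# constant failure rate, assuming an exponential time-to-failure model
# (an assumption; the record does not state a model).
mission_hours = 8.0
reliability = 0.981

failure_rate = -math.log(reliability) / mission_hours   # failures per hour
mtbf = 1.0 / failure_rate                               # mean time between failures

# Steady-state availability = MTBF / (MTBF + MTTR); solving the reported
# 90% availability for the implied mean time to repair (MTTR).
availability = 0.90
mttr = mtbf * (1.0 - availability) / availability

print(f"failure rate ~ {failure_rate:.2e}/h, MTBF ~ {mtbf:.0f} h, implied MTTR ~ {mttr:.0f} h")
```

Under these assumptions the 98.1%/90% figures imply an MTBF of roughly 417 hours and an MTTR of roughly 46 hours.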
HOLLENBECK, R.G.
The Spent Nuclear Fuel (SNF) Canister Storage Building (CSB) is the interim storage facility for the K-Basin SNF at the U.S. Department of Energy (DOE) Hanford Site. The SNF is packaged in multi-canister overpacks (MCOs). The MCOs are placed inside transport casks, then delivered to the service station inside the CSB. At the service station, the MCO handling machine (MHM) moves the MCO from the cask to a storage tube or one of two sample/weld stations. There are 220 standard storage tubes and six overpack storage tubes in a below-grade reinforced concrete vault. Each storage tube can hold two MCOs.
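The storage figures above support a quick capacity check. A minimal sketch; whether the six overpack tubes also hold two MCOs each is not stated in the record, so both readings are computed:

```python
# Capacity arithmetic from the record above. Assumption flagged: the text
# says "each storage tube can hold two MCOs"; whether that includes the
# six overpack tubes is unstated, so both readings are shown.
standard_tubes = 220
overpack_tubes = 6
mcos_per_tube = 2

standard_capacity = standard_tubes * mcos_per_tube                 # 440 MCOs
max_capacity = (standard_tubes + overpack_tubes) * mcos_per_tube   # 452 MCOs
print(standard_capacity, max_capacity)
```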
Uchiyama, Keiji; Miyata, Hironori; Yano, Masashi; Yamaguchi, Yoshitaka; Imamura, Morikazu; Muramatsu, Naomi; Das, Nandita Rani; Chida, Junji; Hara, Hideyuki; Sakaguchi, Suehiro
2014-01-01
Prion infection induces conformational conversion of the normal prion protein, PrPC, into the pathogenic isoform, PrPSc, in prion diseases. It has been shown that PrP-knockout (Prnp0/0) mice transgenically reconstituted with a mouse-hamster chimeric PrP lacking N-terminal residues 23-88, or Tg(MHM2Δ23-88)/Prnp0/0 mice, neither developed the disease nor accumulated MHM2ScΔ23-88 in their brains after inoculation with RML prions. In contrast, RML-inoculated Tg(MHM2Δ23-88)/Prnp0/+ mice developed the disease with abundant accumulation of MHM2ScΔ23-88 in their brains. These results indicate that MHM2Δ23-88 itself might either lose or greatly reduce the capacity to convert into MHM2ScΔ23-88, and that the co-expressed wild-type PrPC can stimulate the conversion of MHM2Δ23-88 to MHM2ScΔ23-88 in trans. In the present study, we confirmed that Tg(MHM2Δ23-88)/Prnp0/0 mice remained resistant to RML prions for up to 730 days after inoculation. However, we found that Tg(MHM2Δ23-88)/Prnp0/0 mice were susceptible to 22L prions, developing the disease with prolonged incubation times and accumulating MHM2ScΔ23-88 in their brains. We also found accelerated conversion of MHM2Δ23-88 into MHM2ScΔ23-88 in the brains of RML- and 22L-inoculated Tg(MHM2Δ23-88)/Prnp0/+ mice. However, wild-type PrPSc accumulated less in the brains of these inoculated Tg(MHM2Δ23-88)/Prnp0/+ mice than in those of RML- and 22L-inoculated Prnp0/+ mice. These results show that MHM2Δ23-88 itself can convert into MHM2ScΔ23-88 without the help of trans-acting PrPC and that, irrespective of the prion strain inoculated, the co-expressed wild-type PrPC stimulates the conversion of MHM2Δ23-88 into MHM2ScΔ23-88 while, conversely, the co-expressed MHM2Δ23-88 disturbs the conversion of wild-type PrPC into PrPSc.
Reeves, Aaron A.; Johnson, Marney C.; Vasquez, Margarita M.; Maheshwari, Akhil
2013-01-01
Objective: This study compared cytokines (in particular transforming growth factor [TGF]-β2) and lactoferrin in maternal human milk (MHM), human-derived milk fortifier (HDMF), and donor human milk (DHM). Materials and Methods: MHM was randomly collected from breastfeeding mothers who had no infectious illness at the time of milk expression. HDMF and DHM were products derived from human milk processed by Holder pasteurization. MHM samples were collected at different times (early/late) and gestations (preterm/term). Lactoferrin was analyzed by western blotting, and cytokines were quantified using commercial enzyme-linked immunosorbent assays. Significance was determined using analysis of variance. Results: In the 164 samples analyzed, TGF-β2 concentrations in HDMF and preterm MHM (at all collection times) were fivefold higher than in DHM (p<0.05). Early preterm MHM had levels of interleukin (IL)-10 and IL-18 that were 11-fold higher than in DHM (p<0.05). IL-6 in DHM was 0.3% of the content found in MHM. IL-18 was fourfold higher in early MHM versus late MHM regardless of gestational age (p<0.05). The lactoferrin concentration in DHM was 6% of that found in MHM. Conclusions: Pasteurization decreases the concentrations of most cytokines and of lactoferrin in DHM. TGF-β2, a protective intestinal cytokine, has comparable concentrations in HDMF and MHM despite pasteurization. PMID:23869537
Hennegan, Julie; Wu, Maryalice; Scott, Linda; Montgomery, Paul
2016-01-01
Objectives The primary objective was to describe Ugandan schoolgirls’ menstrual hygiene management (MHM) practices and estimate the prevalence of inadequate MHM. Second, to assess the relative contribution of aspects of MHM to health, education and psychosocial outcomes. Design Secondary analysis of survey data collected as part of the final follow-up from a controlled trial of reusable sanitary pad and puberty education provision was used to provide a cross-sectional description of girls’ MHM practices and assess relationships with outcomes. Setting Rural primary schools in the Kamuli district, Uganda. Participants Participants were 205 menstruating schoolgirls (10–19 years) from the eight study sites. Primary and secondary outcome measures The prevalence of adequate MHM, consistent with the concept definition, was estimated using dimensions of absorbent used, frequency of absorbent change, washing and drying procedures and privacy. Self-reported health, education (school attendance and engagement) and psychosocial (shame, insecurity, embarrassment) outcomes hypothesised to result from poor MHM were assessed as primary outcomes. Outcomes were measured through English surveys loaded on iPads and administered verbally in the local language. Results 90.5% (95% CI 85.6% to 93.9%) of girls failed to meet available criteria for adequate MHM, with no significant difference between those using reusable sanitary pads (88.9%, 95% CI 79.0% to 94.4%) and those using existing methods, predominantly cloth (91.5%, 95% CI 85.1% to 95.3%; χ2 (1)=0.12, p=0.729). Aspects of MHM predicted some consequences including shame, not standing in class to answer questions and concerns about odour. Conclusions This study was the first to assess the prevalence of MHM consistent with the concept definition. Results suggest that when all aspects of menstrual hygiene are considered together, the prevalence is much higher than has previously been reported based on absorbents alone. 
The work demonstrates an urgent need for improved assessment and reporting of MHM, and for primary research testing the links between menstrual management and health, education and psychosocial consequences. PMID:28039290
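The prevalence estimate above can be approximately reproduced with a standard interval for a proportion. A minimal sketch using a Wilson score interval (the paper's exact interval method is not stated, so this is illustrative only):

```python
import math

def wilson_ci(p_hat, n, z=1.96):
    """95% Wilson score interval for a proportion."""
    denom = 1 + z**2 / n
    centre = (p_hat + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# 90.5% of 205 girls failed to meet the adequate-MHM criteria.
lo, hi = wilson_ci(0.905, 205)
print(f"{lo:.1%} to {hi:.1%}")  # close to the reported 85.6% to 93.9%
```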
Schmitt, Margaret L; Clatworthy, David; Ratnayake, Ruwan; Klaesener-Metzner, Nicole; Roesch, Elizabeth; Wheeler, Erin; Sommer, Marni
2017-01-01
There is a significant gap in empirical evidence on the menstrual hygiene management (MHM) challenges faced by adolescent girls and women in emergency contexts, and on appropriate humanitarian response approaches to meet their needs in diverse emergency contexts. To begin filling this gap in the evidence, we conducted a study in two diverse contexts (Myanmar and Lebanon), exploring the MHM barriers facing girls and women, and the various relevant sectoral responses being conducted (e.g. water, sanitation and hygiene (WASH), Protection, Health, Education and Camp Management). Two qualitative assessments were conducted: one in camps for internally displaced populations in Myanmar, and one with refugees living in informal settlements and host communities in Lebanon. Key informant interviews were conducted with emergency response staff at both sites, and focus group discussions and participatory mapping activities were conducted with adolescent girls and women. Key findings included insufficient access to safe and private facilities for MHM, coupled with displacement-induced shifts in menstrual practices by girls and women. Among staff, there was a narrow interpretation of what an MHM response includes, with a focus on supplies; significant interest in understanding what an improved MHM response would include, alongside acknowledgement of the limited existing MHM guidance across various sectors; and insufficient consultation with beneficiaries, related to discomfort asking about menstruation, and limited coordination between sectors. There is a significant need for improved guidance across all relevant sectors for improving the MHM response in emergency contexts, along with increased evidence on effective approaches for integrating MHM into existing responses.
A systematic review of the health and social effects of menstrual hygiene management.
Sumpter, Colin; Torondel, Belen
2013-01-01
Differing approaches to menstrual hygiene management (MHM) have been associated with a wide range of health and psycho-social outcomes in lower income settings. This paper systematically collates, summarizes and critically appraises the available evidence. Following the PRISMA guidelines, a structured search strategy was used to identify articles investigating the effects of MHM on health and psycho-social outcomes. The search was conducted in May 2012 and had no date limit. Data were extracted and the quality of methodology was independently assessed by two researchers. Where no measure of effect was provided, but sufficient data were available to calculate one, this was undertaken. Meta-analysis was conducted where sufficient data were available. 14 articles were identified which looked at health outcomes, primarily reproductive tract infections (RTI). 11 articles were identified investigating associations between MHM, social restrictions and school attendance. MHM was found to be associated with RTI in 7 papers. Methodologies, however, varied greatly and overall quality was low. Meta-analysis of a subset of studies found no association between confirmed bacterial vaginosis and MHM (OR: 1.07, 95% CI: 0.52-2.24). No other substantial associations with health outcomes were found. Although there was good evidence that educational interventions can improve MHM practices and reduce social restrictions, there was no quantitative evidence that improvements in management methods reduce school absenteeism. The management of menstruation presents significant challenges for women in lower income settings; the effect of poor MHM, however, remains unclear. It is plausible that MHM can affect the reproductive tract, but the specific infections, the strength of effect, and the route of transmission remain unclear.
There is a gap in the evidence for high quality randomised intervention studies which combine hardware and software interventions, in particular for better understanding the nuanced effect improving MHM may have on girls' attendance at school.
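The pooled estimate quoted above (OR 1.07, 95% CI 0.52-2.24) is the kind of result produced by fixed-effect inverse-variance pooling of log odds ratios. A minimal sketch of that machinery; the three study-level ORs and CIs below are hypothetical placeholders, not data from the review:

```python
import math

# Hypothetical study-level results: (odds ratio, ci_low, ci_high).
studies = [
    (0.8, 0.3, 2.1),
    (1.5, 0.6, 3.7),
    (1.0, 0.4, 2.5),
]

num = den = 0.0
for or_, lo, hi in studies:
    log_or = math.log(or_)
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE recovered from the CI
    w = 1.0 / se**2                                  # inverse-variance weight
    num += w * log_or
    den += w

log_pooled = num / den
half = 1.96 / math.sqrt(den)
pooled = math.exp(log_pooled)
print(f"pooled OR {pooled:.2f} "
      f"(95% CI {math.exp(log_pooled - half):.2f}-{math.exp(log_pooled + half):.2f})")
```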
Wilson, Zakiya S; Stanley, George G; Vicic, David A
2010-06-21
The M-H-M bonding in the dinuclear complexes Ni2(μ-H)(μ-P2)2X2 (P2 = R2PCH2PR2; R = iPr, Cy; X = Cl, Br) has been investigated. These dinickel A-frames were studied via density functional theory (DFT) calculations to analyze the factors that influence linear and bent M-H-M bonding. The DFT calculations indicate that the bent geometry is favored electronically, with ligand steric effects driving the formation of the linear M-H-M structures.
Phillips-Howard, Penelope A.; Caruso, Bethany; Torondel, Belen; Zulaika, Garazi; Sahin, Murat; Sommer, Marni
2016-01-01
Background A lack of adequate guidance on menstrual management; water, disposal, and private changing facilities; and sanitary hygiene materials in low- and middle-income countries leaves schoolgirls with limited options for healthy personal hygiene during monthly menses. While a plethora of observational studies have described how menstrual hygiene management (MHM) barriers in school impact girls' dignity, well-being, and engagement in school activities, studies have yet to confirm whether inadequate information and facilities for MHM significantly affect quantifiable school and health outcomes influencing girls' life chances. Evidence on these hard outcomes will take time to accrue; however, a current lack of standardized methods, tools, and research funding is hampering progress and must be addressed. Objectives Compile research priorities for MHM and the types of research methods that can be used. Results In this article, we highlight the current knowledge gaps in school-aged girls' MHM research and identify opportunities for addressing the dearth of hard evidence limiting the ability of governments, donors, and other agencies to appropriately target resources. We outline a series of research priorities and methodologies, drawn from an expert panel, to address global priorities for MHM in schools for the next 10 years. Conclusions Needed are a strong evidence base for different settings, standardized definitions regarding MHM outcomes, improved study designs and methodologies, and the creation of an MHM research consortium to focus attention on this neglected global issue. PMID:27938648
Reduction of Kinematic Short Baseline Multipath Effects Based on Multipath Hemispherical Map
Cai, Miaomiao; Chen, Wen; Dong, Danan; Song, Le; Wang, Minghua; Wang, Zhiren; Zhou, Feng; Zheng, Zhengqi; Yu, Chao
2016-01-01
Multipath hemispherical map (MHM) is a multipath mitigation approach that takes advantage of the spatial repeatability of the multipath effect under an unchanged environment. This approach is suitable not only for static environments but also for some kinematic platforms, such as a moving ship or airplane, where the dominant multipath effects come from the platform itself and the multipath effects from the surrounding environment are considered minor or negligible. Previous studies have verified the feasibility of the MHM approach in static environments. In this study, we expanded the MHM approach to a kinematic shipborne environment. Both static and kinematic tests were carried out to demonstrate the feasibility of the MHM approach. The results indicate that, after MHM multipath mitigation, the root mean square (RMS) values of the baseline length deviations are reduced by 10.47% and 10.57%, and the RMS values of the residuals are reduced by 39.89% and 21.91%, for the static and kinematic tests, respectively. Power spectrum analysis has shown that the MHM approach is more effective in mitigating multipath in low-frequency bands; the high-frequency multipath effects still exist and are indistinguishable from observation noise. Taking the observation noise into account, the residual reductions increase to 41.68% and 24.51% in the static and kinematic tests, respectively. To further improve the performance of MHM for kinematic platforms, we also analyzed the influence of spatial coverage and resolution on residual reduction. PMID:27754322
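The spatial-repeatability idea behind MHM can be sketched compactly: average residuals into azimuth/elevation cells from a model period, then subtract the cell means from later residuals observed in the same geometry. The 5-degree grid and the toy residual model below are assumptions, not the paper's settings:

```python
import math
import random
from collections import defaultdict

BIN = 5  # degrees per grid cell (assumed resolution)

def cell(az, el):
    return (int(az) // BIN, int(el) // BIN)

def build_mhm(obs):
    """Average residuals into az/el cells from a 'model' period."""
    acc = defaultdict(lambda: [0.0, 0])
    for az, el, r in obs:
        a = acc[cell(az, el)]
        a[0] += r
        a[1] += 1
    return {k: total / n for k, (total, n) in acc.items()}

def correct(obs, mhm):
    """Subtract the cell mean from each residual (spatial repeatability)."""
    return [(az, el, r - mhm.get(cell(az, el), 0.0)) for az, el, r in obs]

def rms(obs):
    return math.sqrt(sum(r * r for _, _, r in obs) / len(obs))

# Toy geometry-dependent multipath plus white noise.
random.seed(1)
def sky_pass(noise=0.001):
    return [(az, el, 0.01 * math.sin(math.radians(az)) + random.gauss(0, noise))
            for az in range(0, 360, 2) for el in (10, 30, 50)]

mhm = build_mhm(sky_pass())   # "model" data
later = sky_pass()            # the same geometry revisited later
print(rms(later), rms(correct(later, mhm)))
```

Because the toy multipath depends only on geometry, the binned correction removes most of it, mirroring the RMS reductions reported above.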
Identification of Two Prion Protein Regions That Modify Scrapie Incubation Time
Supattapone, Surachai; Muramoto, Tamaki; Legname, Giuseppe; Mehlhorn, Ingrid; Cohen, Fred E.; DeArmond, Stephen J.; Prusiner, Stanley B.; Scott, Michael R.
2001-01-01
A series of prion transmission experiments was performed in transgenic (Tg) mice expressing either wild-type, chimeric, or truncated prion protein (PrP) molecules. Following inoculation with Rocky Mountain Laboratory (RML) murine prions, scrapie incubation times for Tg(MoPrP)4053, Tg(MHM2)294/Prnp0/0, and Tg(MoPrP,Δ23–88)9949/Prnp0/0 mice were ∼50, 120, and 160 days, respectively. Similar scrapie incubation times were obtained after inoculation of these lines of Tg mice with either MHM2(RML) or MoPrP(Δ23–88)(RML) prions, excluding the possibility that sequence-dependent transmission barriers could account for the observed differences. Tg(MHM2)294/Prnp0/0 mice displayed prolonged scrapie incubation times with four different strains of murine prions. These data provide evidence that the N terminus of MoPrP and the chimeric region of MHM2 PrP (residues 108 through 111) both influence the inherent efficiency of prion propagation. PMID:11152514
Mitigation of multipath effect in GNSS short baseline positioning by the multipath hemispherical map
NASA Astrophysics Data System (ADS)
Dong, D.; Wang, M.; Chen, W.; Zeng, Z.; Song, L.; Zhang, Q.; Cai, M.; Cheng, Y.; Lv, J.
2016-03-01
Multipath is one major error source in high-accuracy GNSS positioning. Various hardware and software approaches have been developed to mitigate the multipath effect. Among them, the MHM (multipath hemispherical map) and sidereal filtering (SF)/advanced SF (ASF) approaches utilize the spatiotemporal repeatability of the multipath effect under a static environment, hence they can be implemented to generate multipath correction models for real-time GNSS data processing. We focus on the spatiotemporal repeatability-based MHM and SF/ASF approaches and compare their performances for multipath reduction. Comparisons indicate that both the MHM and ASF approaches perform well, with residual variance reductions of about 50% for the short span (the next 5 days), maintained at roughly the 45% level for the longer span (the next 6-25 days). The ASF model is more suitable for high-frequency multipath reduction, such as high-rate GNSS applications. The MHM model is easier to implement for real-time multipath mitigation when the overall multipath regime is medium to low frequency.
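The sidereal filtering (SF) approach compared above can be sketched as shifting the previous day's residuals by the orbit repeat period and subtracting. The nominal GPS repeat period used here (~86155 s, about 245 s short of a solar day) is an approximation; per-satellite values differ by a few seconds, and the residual series are synthetic:

```python
import math
import random

REPEAT_S = 86155          # assumed nominal GPS orbit repeat period, seconds
LAG = 86400 - REPEAT_S    # shift in samples at 1 Hz sampling (245 here)

random.seed(0)
def residuals(offset_s, n, noise=0.001):
    # Synthetic repeatable multipath: a slow oscillation in "orbit time".
    return [0.005 * math.sin(2 * math.pi * (t + offset_s) / 7200)
            + random.gauss(0, noise) for t in range(n)]

N = 3600
day1 = residuals(0, N + LAG)   # yesterday, kept LAG samples longer
day2 = residuals(LAG, N)       # today: the same geometry occurs LAG s earlier

# SF correction: subtract yesterday's residual at the repeat-shifted epoch.
filtered = [day2[t] - day1[t + LAG] for t in range(N)]
rms = lambda xs: math.sqrt(sum(x * x for x in xs) / len(xs))
print(rms(day2), rms(filtered))
```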
Jing, Miao; Heße, Falk; Kumar, Rohini; Wang, Wenqing; Fischer, Thomas; Walther, Marc; Zink, Matthias; Zech, Alraune; Samaniego, Luis; Kolditz, Olaf; Attinger, Sabine
2018-06-01
Most large-scale hydrologic models fall short in reproducing groundwater head dynamics and simulating transport process due to their oversimplified representation of groundwater flow. In this study, we aim to extend the applicability of the mesoscale Hydrologic Model (mHM v5.7) to subsurface hydrology by coupling it with the porous media simulator OpenGeoSys (OGS). The two models are one-way coupled through model interfaces GIS2FEM and RIV2FEM, by which the grid-based fluxes of groundwater recharge and the river-groundwater exchange generated by mHM are converted to fixed-flux boundary conditions of the groundwater model OGS. Specifically, the grid-based vertical reservoirs in mHM are completely preserved for the estimation of land-surface fluxes, while OGS acts as a plug-in to the original mHM modeling framework for groundwater flow and transport modeling. The applicability of the coupled model (mHM-OGS v1.0) is evaluated by a case study in the central European mesoscale river basin - Nägelstedt. Different time steps, i.e., daily in mHM and monthly in OGS, are used to account for fast surface flow and slow groundwater flow. Model calibration is conducted following a two-step procedure using discharge for mHM and long-term mean of groundwater head measurements for OGS. Based on the model summary statistics, namely the Nash-Sutcliffe model efficiency (NSE), the mean absolute error (MAE), and the interquartile range error (QRE), the coupled model is able to satisfactorily represent the dynamics of discharge and groundwater heads at several locations across the study basin. Our exemplary calculations show that the one-way coupled model can take advantage of the spatially explicit modeling capabilities of surface and groundwater hydrologic models and provide an adequate representation of the spatiotemporal behaviors of groundwater storage and heads, thus making it a valuable tool for addressing water resources and management problems.
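Two of the summary statistics named above, the Nash-Sutcliffe efficiency (NSE) and the mean absolute error (MAE), can be computed directly. A minimal sketch with illustrative numbers, not data from the study:

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means no better
    than predicting the observed mean."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def mae(obs, sim):
    """Mean absolute error."""
    return sum(abs(o - s) for o, s in zip(obs, sim)) / len(obs)

obs = [3.2, 4.1, 5.0, 4.4, 3.9]   # e.g. discharge in m3/s (illustrative)
sim = [3.0, 4.3, 4.8, 4.6, 3.7]
print(nse(obs, sim), mae(obs, sim))
```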
Das, Padma; Baker, Kelly K.; Dutta, Ambarish; Swain, Tapoja; Sahoo, Sunita; Das, Bhabani Sankar; Panda, Bijay; Nayak, Arati; Bara, Mary; Bilung, Bibiana; Mishra, Pravas Ranjan; Panigrahi, Pinaki; Cairncross, Sandy; Torondel, Belen
2015-01-01
Menstrual hygiene management (MHM) practices vary worldwide and depend on the individual's socioeconomic status, personal preferences, local traditions and beliefs, and access to water and sanitation resources. MHM practices can be particularly unhygienic and inconvenient for girls and women in poorer settings. Little is known about whether unhygienic MHM practices increase a woman's exposure to urogenital infections, such as bacterial vaginosis (BV) and urinary tract infection (UTI). This study aimed to determine the association of MHM practices with urogenital infections, controlling for environmental drivers. A hospital-based case-control study was conducted on 486 women in Odisha, India. Cases and controls were recruited using a syndromic approach. Vaginal swabs were collected from all the participants and tested for BV status using Amsel's criteria. Urine samples were cultured to assess UTI status. Socioeconomic status, clinical symptoms and reproductive history, and MHM and water and sanitation practices were obtained by standardised questionnaire. A total of 486 women were recruited to the study: 228 symptomatic cases and 258 asymptomatic controls. Women who used reusable absorbent pads were more likely to have symptoms of urogenital infection (AdjOR=2.3, 95% CI 1.5-3.4) or to be diagnosed with at least one urogenital infection (BV or UTI) (AdjOR=2.8, 95% CI 1.7-4.5) than women using disposable pads. Increased wealth and space for personal hygiene in the household were protective for BV (AdjOR=0.5, 95% CI 0.3-0.9 and AdjOR=0.6, 95% CI 0.3-0.9, respectively). Lower education of the participants was the only factor associated with UTI after adjusting for all the confounders (AdjOR=3.1, 95% CI 1.2-7.9). Interventions that ensure women have access to private facilities with water for MHM and that educate women about safer, low-cost MHM materials could reduce urogenital disease among women.
Further studies of the effects of specific practices for managing hygienically reusable pads and studies to explore other pathogenic reproductive tract infections are needed. PMID:26125184
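The odds-ratio arithmetic underlying estimates like those above can be sketched from a 2x2 table. Note that the paper's values are adjusted via regression; the counts below are hypothetical (chosen only to match the 228-case/258-control totals) and yield an unadjusted OR:

```python
import math

# Hypothetical 2x2 table: exposure = reusable absorbent pads.
a, b = 120, 108   # cases: reusable users, disposable users
c, d = 90, 168    # controls: reusable users, disposable users

or_ = (a * d) / (b * c)                       # cross-product odds ratio
se = math.sqrt(1/a + 1/b + 1/c + 1/d)         # SE of log(OR), Woolf method
lo, hi = (math.exp(math.log(or_) + s * 1.96 * se) for s in (-1, 1))
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```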
Das, Padma; Baker, Kelly K; Dutta, Ambarish; Swain, Tapoja; Sahoo, Sunita; Das, Bhabani Sankar; Panda, Bijay; Nayak, Arati; Bara, Mary; Bilung, Bibiana; Mishra, Pravas Ranjan; Panigrahi, Pinaki; Cairncross, Sandy; Torondel, Belen
2015-01-01
Menstrual hygiene management (MHM) practices vary worldwide and depend on the individual's socioeconomic status, personal preferences, local traditions and beliefs, and access to water and sanitation resources. MHM practices can be particularly unhygienic and inconvenient for girls and women in poorer settings. Little is known about whether unhygienic MHM practices increase a woman's exposure to urogenital infections, such as bacterial vaginosis (BV) and urinary tract infection (UTI). This study aimed to determine the association of MHM practices with urogenital infections, controlling for environmental drivers. A hospital-based case-control study was conducted on 486 women in Odisha, India. Cases and controls were recruited using a syndromic approach. Vaginal swabs were collected from all the participants and tested for BV status using Amsel's criteria. Urine samples were cultured to assess UTI status. Socioeconomic status, clinical symptoms and reproductive history, and MHM and water and sanitation practices were obtained by standardised questionnaire. A total of 486 women were recruited to the study, 228 symptomatic cases and 258 asymptomatic controls. Women who used reusable absorbent pads were more likely to have symptoms of urogenital infection (AdjOR = 2.3, 95% CI 1.5-3.4) or to be diagnosed with at least one urogenital infection (BV or UTI) (AdjOR = 2.8, 95% CI 1.7-4.5) than women using disposable pads. Increased wealth and space for personal hygiene in the household were protective for BV (AdjOR = 0.5, 95% CI 0.3-0.9 and AdjOR = 0.6, 95% CI 0.3-0.9, respectively). Lower education of the participants was the only factor associated with UTI after adjusting for all the confounders (AdjOR = 3.1, 95% CI 1.2-7.9). Interventions that ensure women have access to private facilities with water for MHM and that educate women about safer, low-cost MHM materials could reduce urogenital disease among women.
Further studies of the effects of specific practices for hygienically managing reusable pads, and studies exploring other pathogenic reproductive tract infections, are needed.
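The adjusted odds ratios above come from the study's multivariable models; as a rough illustration of the underlying quantity only, here is a minimal Python sketch of a crude (unadjusted) odds ratio with a Woolf 95% confidence interval, using hypothetical counts rather than the study's data:

```python
import math

def odds_ratio_2x2(exp_cases, exp_controls, unexp_cases, unexp_controls):
    """Crude odds ratio and Woolf (log-scale) 95% CI from a 2x2 table."""
    or_ = (exp_cases * unexp_controls) / (exp_controls * unexp_cases)
    se_log_or = math.sqrt(1 / exp_cases + 1 / exp_controls +
                          1 / unexp_cases + 1 / unexp_controls)
    lower = math.exp(math.log(or_) - 1.96 * se_log_or)
    upper = math.exp(math.log(or_) + 1.96 * se_log_or)
    return or_, lower, upper

# Hypothetical counts: 60 exposed cases, 40 exposed controls,
# 30 unexposed cases, 60 unexposed controls.
or_, lo, hi = odds_ratio_2x2(60, 40, 30, 60)
```

Adjusted ORs such as those reported here additionally condition on confounders (wealth, education, sanitation access) via logistic regression, which a 2x2 table cannot capture.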
Association between personal health record enrollment and patient loyalty.
Turley, Marianne; Garrido, Terhilda; Lowenthal, Alex; Zhou, Yi Yvonne
2012-07-01
To examine the association between patient loyalty, as measured by member retention in the health plan, and access to My Health Manager (MHM), Kaiser Permanente's PHR, which is linked to its electronic health record, KP HealthConnect. We conducted a retrospective cohort observational quality improvement project from the third quarter of 2005 to the fourth quarter of 2008 for approximately 394,000 Kaiser Permanente Northwest members. To control for self-selection bias, we used propensity scores to perform exact 1-to-1 matching without replacement between MHM users and nonusers. We estimated retention rates of the matched data and assessed the association between MHM use and retention versus voluntary termination. We also estimated odds ratios of significant variables impacting member retention. The probability of remaining a member or being involuntarily terminated versus voluntary termination was 96.7% for users (95% confidence interval [CI], 96.6%-96.7%) and 92.2% for nonusers (95% CI, 92.1%-92.4%; P < .001). In the logistic model, MHM use was a significant predictor; only tenure and illness burden were stronger predictors. Users were 2.578 (95% CI, 2.487-2.671) times more likely to choose to remain members than were nonusers. The impact was more substantial among newer members. MHM use was significantly associated with voluntary membership retention. An indicator of patient loyalty, retention is critical to healthcare organizations.
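The exact 1-to-1 propensity-score matching without replacement described above can be sketched as follows. This is an illustrative simplification, not the study's implementation: the rounding precision used to define "exact" score equality and the data layout are assumptions.

```python
from collections import defaultdict

def exact_match_1to1(users, nonusers, decimals=2):
    """Exact 1-to-1 matching without replacement on propensity scores
    rounded to `decimals` places. Inputs are lists of (member_id, score);
    each user is paired with at most one nonuser sharing the same rounded
    score, and matched nonusers are removed from the pool."""
    pool = defaultdict(list)
    for member_id, score in nonusers:
        pool[round(score, decimals)].append(member_id)
    pairs = []
    for member_id, score in users:
        bucket = pool[round(score, decimals)]
        if bucket:  # users with no score-equal nonuser left are dropped
            pairs.append((member_id, bucket.pop()))
    return pairs
```

Retention rates are then compared only within the matched pairs, which is what removes the self-selection bias between PHR users and nonusers.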
77 FR 14508 - Notice of Intent To Grant Exclusive Patent License; MHM Technologies, LLC
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-12
... DEPARTMENT OF DEFENSE Department of the Navy Notice of Intent To Grant Exclusive Patent License... the Navy hereby gives notice of its intent to grant to MHM Technologies, LLC a revocable, nonassignable, exclusive license to practice in the United States, the Government-owned invention described in U...
Boehn, Susanne N.E.; Spahn, Sonja; Neudecker, Sabine; Keppler, Andrea; Bihoreau, Marie-Thérèse; Kränzlin, Bettina; Pandey, Priyanka; Hoffmann, Sigrid C.; Li, Li; Torres, Vicente E.; Gröne, Hermann-Josef; Gretz, Norbert
2013-01-01
Background Autosomal dominant polycystic kidney disease (ADPKD) is one of the most common human inherited diseases. Modifier genes seem to modulate the disease progression and might therefore be promising drug targets. Although a number of modifier loci have been already identified, no modifier gene has been proven to be a real modifier yet. Methods Gene expression profiling of two substrains of the Han:SPRD rat, namely PKD/Mhm and PKD/US, both harboring the same mutation, was conducted in 36-day-old animals. Catechol-O-methyltransferase (Comt) was identified as a potential modifier gene. A 3-month treatment with tolcapone, a selective inhibitor of Comt, was carried out in PKD/Mhm and PKD/US (cy/+) animals. Results Comt is localized within a known modifier locus of PKD (MOP2). The enzyme encoding gene was found upregulated in the more severely affected PKD/Mhm substrain and was hence presumed to be a putative modifier gene of PKD. The treatment with tolcapone markedly attenuated the loss of renal function, inhibited renal enlargement, shifted the size distribution of renal cysts and retarded cell proliferation, apoptosis, inflammation and fibrosis development in affected (cy/+) male and female PKD/Mhm and PKD/US rats. Conclusions Comt has been confirmed to be the first reported modifier gene for PKD and tolcapone offers a promising drug for treating PKD. PMID:23543593
Itoh, Yuichiro; Replogle, Kirstin; Kim, Yong-Hwan; Wade, Juli; Clayton, David F.; Arnold, Arthur P.
2010-01-01
We compared global patterns of gene expression between two bird species, the chicken and zebra finch, with regard to sex bias of autosomal versus Z chromosome genes, dosage compensation, and evolution of sex bias. Both species appear to lack a Z chromosome–wide mechanism of dosage compensation, because both have a similar pattern of significantly higher expression of Z genes in males relative to females. Unlike the chicken Z chromosome, which has female-specific expression of the noncoding RNA MHM (male hypermethylated) and acetylation of histone 4 lysine 16 (H4K16) near MHM, the zebra finch Z chromosome appears to lack the MHM sequence and acetylation of H4K16. The zebra finch also does not show the reduced male-to-female (M:F) ratio of gene expression near MHM similar to that found in the chicken. Although the M:F ratios of Z chromosome gene expression are similar across tissues and ages within each species, they differ between the two species. Z genes showing the greatest species difference in M:F ratio were concentrated near the MHM region of the chicken Z chromosome. This study shows that the zebra finch differs from the chicken because it lacks a specialized region of greater dosage compensation along the Z chromosome, and shows other differences in sex bias. These patterns suggest that different avian taxa may have evolved specific compensatory mechanisms. PMID:20357053
Weisman, Hannah L; Kia-Keating, Maryam; Lippincott, Ann; Taylor, Zachary; Zheng, Jimmy
2016-10-01
Researchers have emphasized the importance of integrating mental health education with academic curriculum. The focus of the current studies was Mental Health Matters (MHM), a mental health curriculum that is integrated with English language arts. It is taught by trained community member volunteers and aims to increase knowledge and decrease stigma toward individuals with mental health disorders. In Study 1, 142 sixth graders participated in MHM and completed pre- and postprogram measures of mental health knowledge, stigma, and program acceptability. Teachers also completed ratings of acceptability. Study 2 (N = 120 seventh graders) compared participants who had participated in MHM the previous year with those who had not using the same measures. Sixth grade students and teachers rated the program as highly acceptable. Participants significantly increased their knowledge and decreased their levels of stigma. Seventh graders who had participated in MHM had significantly more mental health knowledge than peers who had not, but there were no differences in stigma. The model appears to be acceptable to students and teachers. Future research is needed to assess the long-term effectiveness of integrating mental health education with other academic curriculum such as language arts or science. © 2016, American School Health Association.
Anglewicz, Philip; VanLandingham, Mark; Manda-Taylor, Lucinda; Kohler, Hans-Peter
2017-01-01
Purpose The Migration and Health in Malawi (MHM) study focuses on a key challenge in migration research: although it has long been established that migration and health are closely linked, identifying the effect of migration on various health outcomes is complicated by methodological challenges. The MHM study uses a longitudinal panel premigration and postmigration study design (with a non-migrant comparison group) to measure and/or control for important characteristics that affect both migration and health outcomes. Participants Data are available for two waves. The MHM interviewed 398 of 715 migrants in 2007 (55.7%) and 722 of 1013 in 2013 (71.3%), as well as 604 of 751 (80.4%) for a non-migrant reference group in 2013. The total interviewed sample size for the MHM in both waves is 1809. These data include extensive information on lifetime migration, socioeconomic and demographic characteristics, sexual behaviours, marriage, household/family structure, social networks and social capital, HIV/AIDS biomarkers and other dimensions of health. Findings to date Our results for the relationship between migration and health differ by health measure and analytic approach. Migrants in Malawi have a significantly higher HIV prevalence than non-migrants, which is primarily due to the selection of HIV-positive individuals into migration. We find evidence for health selection; physically healthier men and women are more likely to move, partly because migration selects younger individuals. However, we do not find differences in physical or mental health between migrants and non-migrants after moving. Future plans We are preparing a third round of data collection for these (and any new) migrants, which will take place in 2018. This cohort will be used to examine the effect of migration on various health measures and behaviours, including general mental and physical health, smoking and alcohol use, access to and use of health services and use of antiretroviral therapy.
PMID:28515195
NASA Astrophysics Data System (ADS)
Nijzink, R. C.; Samaniego, L.; Mai, J.; Kumar, R.; Thober, S.; Zink, M.; Schäfer, D.; Savenije, H. H. G.; Hrachowitz, M.
2015-12-01
Heterogeneity of landscape features like terrain, soil, and vegetation properties affects the partitioning of water and energy. However, it remains unclear to what extent an explicit representation of this heterogeneity at the sub-grid scale of distributed hydrological models can improve the hydrological consistency and the robustness of such models. In this study, hydrological process complexity arising from sub-grid topography heterogeneity was incorporated in the distributed mesoscale Hydrologic Model (mHM). Seven study catchments across Europe were used to test whether (1) the incorporation of additional sub-grid variability on the basis of landscape-derived response units improves model internal dynamics, (2) the application of semi-quantitative, expert-knowledge-based model constraints reduces model uncertainty, and (3) the combined use of sub-grid response units and model constraints improves the spatial transferability of the model. Unconstrained and constrained versions of both the original mHM and mHMtopo, which allows for topography-based sub-grid heterogeneity, were calibrated for each catchment individually following a multi-objective calibration strategy. In addition, four of the study catchments were simultaneously calibrated and their feasible parameter sets were transferred to the remaining three receiver catchments. In a post-calibration evaluation procedure the probabilities of model and transferability improvement, when accounting for sub-grid variability and/or applying expert-knowledge-based model constraints, were assessed on the basis of a set of hydrological signatures. In terms of the Euclidian distance to the optimal model, used as an overall measure of model performance with respect to the individual signatures, the model improvement achieved by introducing sub-grid heterogeneity to mHM in mHMtopo was on average 13 %.
The addition of semi-quantitative constraints to mHM and mHMtopo resulted in improvements of 13 and 19 % respectively, compared to the base case of the unconstrained mHM. The most significant improvements in signature representation were achieved for low flow statistics. The application of prior semi-quantitative constraints further improved the partitioning between runoff and evaporative fluxes. In addition, it was shown that suitable semi-quantitative prior constraints in combination with the transfer-function-based regularization approach of mHM can be beneficial for spatial model transferability, as the Euclidian distances for the signatures improved on average by 2 %. The effect of semi-quantitative prior constraints combined with topography-guided sub-grid heterogeneity on transferability showed a more variable picture of improvements and deteriorations, but most improvements were observed for low flow statistics.
NASA Astrophysics Data System (ADS)
Nijzink, Remko C.; Samaniego, Luis; Mai, Juliane; Kumar, Rohini; Thober, Stephan; Zink, Matthias; Schäfer, David; Savenije, Hubert H. G.; Hrachowitz, Markus
2016-03-01
Heterogeneity of landscape features like terrain, soil, and vegetation properties affects the partitioning of water and energy. However, it remains unclear to what extent an explicit representation of this heterogeneity at the sub-grid scale of distributed hydrological models can improve the hydrological consistency and the robustness of such models. In this study, hydrological process complexity arising from sub-grid topography heterogeneity was incorporated into the distributed mesoscale Hydrologic Model (mHM). Seven study catchments across Europe were used to test whether (1) the incorporation of additional sub-grid variability on the basis of landscape-derived response units improves model internal dynamics, (2) the application of semi-quantitative, expert-knowledge-based model constraints reduces model uncertainty, and whether (3) the combined use of sub-grid response units and model constraints improves the spatial transferability of the model. Unconstrained and constrained versions of both the original mHM and mHMtopo, which allows for topography-based sub-grid heterogeneity, were calibrated for each catchment individually following a multi-objective calibration strategy. In addition, four of the study catchments were simultaneously calibrated and their feasible parameter sets were transferred to the remaining three receiver catchments. In a post-calibration evaluation procedure the probabilities of model and transferability improvement, when accounting for sub-grid variability and/or applying expert-knowledge-based model constraints, were assessed on the basis of a set of hydrological signatures. In terms of the Euclidian distance to the optimal model, used as an overall measure of model performance with respect to the individual signatures, the model improvement achieved by introducing sub-grid heterogeneity to mHM in mHMtopo was on average 13 %. 
The addition of semi-quantitative constraints to mHM and mHMtopo resulted in improvements of 13 and 19 %, respectively, compared to the base case of the unconstrained mHM. Most significant improvements in signature representations were, in particular, achieved for low flow statistics. The application of prior semi-quantitative constraints further improved the partitioning between runoff and evaporative fluxes. In addition, it was shown that suitable semi-quantitative prior constraints in combination with the transfer-function-based regularization approach of mHM can be beneficial for spatial model transferability as the Euclidian distances for the signatures improved on average by 2 %. The effect of semi-quantitative prior constraints combined with topography-guided sub-grid heterogeneity on transferability showed a more variable picture of improvements and deteriorations, but most improvements were observed for low flow statistics.
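The "Euclidian distance to the optimal model" used above as the overall performance measure can be illustrated with a short sketch. The paper's exact normalization is not reproduced here; this assumes each hydrological signature error has been scaled to [0, 1], so the optimal model sits at distance zero:

```python
import math

def distance_to_optimum(signature_errors):
    """Euclidean distance from the optimal model, where each hydrological
    signature error is assumed normalized to [0, 1] (0 = perfect match).
    Smaller distances mean better overall signature performance."""
    return math.sqrt(sum(e ** 2 for e in signature_errors.values()))

# Illustrative (hypothetical) normalized errors for two signatures.
errors = {"runoff_ratio": 0.3, "baseflow_index": 0.4}
d = distance_to_optimum(errors)
```

Aggregating several signatures into one distance is what lets the study report a single average improvement figure (e.g., the 13 % for mHMtopo) across otherwise incommensurable metrics.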
Verification of High Resolution Soil Moisture and Latent Heat in Germany
NASA Astrophysics Data System (ADS)
Samaniego, L. E.; Warrach-Sagi, K.; Zink, M.; Wulfmeyer, V.
2012-12-01
Improving our understanding of soil-land-surface-atmosphere feedbacks is fundamental to making reliable predictions of water and energy fluxes in land systems influenced by anthropogenic activities. Estimating, for instance, the likely consequences of changing climatic regimes on water availability and crop yield requires high-resolution soil moisture. Modeling it at large scales, however, is difficult and uncertain because of the interplay between state variables and fluxes and the significant parameter uncertainty of the predicting models. At larger scales, the sub-grid variability of the variables involved and the nonlinearity of the processes complicate the modeling exercise even further, because parametrization schemes might be scale dependent. Two contrasting modeling paradigms (WRF/Noah-MP and mHM) were employed to quantify the effects of model and data complexity on soil moisture and latent heat over Germany. WRF/Noah-MP was forced with ERA-Interim on the boundaries of the rotated CORDEX grid (www.meteo.unican.es/wiki/cordexwrf) with a spatial resolution of 0.11° covering Europe during the period from 1989 to 2009. Land cover and soil texture were represented in WRF/Noah-MP with 1×1 km MODIS images and a single-horizon, coarse-resolution European-wide soil map with 16 soil texture classes, respectively. To ease comparison, the process-based hydrological model mHM was forced with daily precipitation and temperature fields generated by WRF during the same period. The spatial resolution of mHM was fixed at 4×4 km. The multiscale parameter regionalization technique (MPR, Samaniego et al. 2010) was embedded in mHM to estimate effective model parameters using hyper-resolution input data (100×100 km) obtained from Corine land cover and detailed soil texture fields for various horizons comprising 72 soil texture classes for Germany, among other physiographical variables.
mHM global parameters, in contrast with those of Noah-MP, were obtained by closing the water balance over major river basins in Germany. Simulated soil moisture and latent heat flux were also evaluated at several eddy covariance sites in Germany. Comparison of monthly soil moisture and latent heat fields obtained with both models over Germany exhibited significant differences, which are mainly attributed to the sub-grid variability of key model parameters such as porosity and aerodynamic resistance. Comparison of soil moisture fields obtained with WRF/Noah-MP and mHM forced with gridded meteorological observations (German Meteorological Service) showed that the differences between both models are mainly due to a combination of precipitation bias and different soil texture resolution. However, EOF analyses indicate that CORDEX results start recovering structures due to soil and vegetation properties. This experiment clearly highlighted the importance of hyper-resolution input data to address these challenges. High-resolution mHM simulations also indicate that the parametric uncertainty of land surface models is significant and should not be neglected if a model is to be employed at regional scales, e.g. for drought monitoring.
Mughini-Gras, L; van Pelt, W; van der Voort, M; Heck, M; Friesema, I; Franz, E
2018-02-01
Shiga toxin-producing Escherichia coli (STEC) is a zoonotic pathogen of public health concern whose sources and transmission routes are difficult to trace. Using a combined source attribution and case-control analysis, we determined the relative contributions of four putative livestock sources (cattle, small ruminants, pigs, poultry) to human STEC infections and their associated dietary, animal contact, temporal and socio-econo-demographic risk factors in the Netherlands in 2010/2011-2014. Dutch source data were supplemented with those from other European countries with similar STEC epidemiology. Human STEC infections were attributed to sources using both the modified Dutch model (mDM) and the modified Hald model (mHM) supplied with the same O-serotyping data. Cattle accounted for 48.6% (mDM) and 53.1% (mHM) of the 1,183 human cases attributed, followed by small ruminants (mDM: 23.5%; mHM: 25.4%), pigs (mDM: 12.5%; mHM: 5.7%) and poultry (mDM: 2.7%; mHM: 3.1%), whereas the sources of the remaining 12.8% of cases could not be attributed. Of the top five O-serotypes infecting humans, O157, O26, O91 and O103 were mainly attributed to cattle (61%-75%) and O146 to small ruminants (71%-77%). Significant risk factors for human STEC infection as a whole were the consumption of beef, raw/undercooked meat or cured meat/cold cuts. For cattle-attributed STEC infections, specific risk factors were consuming raw meat spreads and beef. Consuming raw/undercooked or minced meat were risk factors for STEC infections attributed to small ruminants. For STEC infections attributed to pigs, only consuming raw/undercooked meat was significant. Consuming minced meat, raw/undercooked meat or cured meat/cold cuts were associated with poultry-attributed STEC infections. Consuming raw vegetables was protective for all STEC infections. 
We concluded that domestic ruminants account for approximately three-quarters of reported human STEC infections, whereas pigs and poultry play a minor role and that risk factors for human STEC infection vary according to the attributed source. © 2017 Blackwell Verlag GmbH.
Milk-substitute diet composition and abomasal secretion in the calf.
Williams, V J; Roy, J H; Gillies, C M
1976-11-01
1. The effect of different protein sources in milk-substitute diets on abomasal acidity and proteolytic activity was studied in Friesian calves, aged 20-58 d (Expt 1). The diets contained 'mildly' preheated, spray-dried skim-milk powder (MHM), severely preheated, spray-dried skim-milk powder (SHM), fish-protein concentrate (FPC) or solvent-extracted soya-bean flour (SF) as the main protein source. 2. Gastric juice was collected from abomasal pouches before feeding and at 15 min intervals for 8 h after the morning feed. Samples of digesta were obtained from the abomasum at 1 h intervals during the same period. 3. Digesta pH was lower and titratable acidity higher 0-3 h after giving the diet containing MHM than when any of the other three diets was given. 4. Acid secretion from the pouches for the different diets was in the order: FPC > MHM > SHM ≥ SF. 5. Protease secretion from the pouches, assayed at pH 2.1, was in the order: MHM > SHM = FPC > SF. 6. The effect of dry matter (DM) intake and concentration on abomasal acidity was also studied in calves given diets which contained MHM (Expt 2). This diet was reconstituted at either 100 or 149 g DM/kg liquid diet and fed at either 32.5 or 49.0 g DM/kg live weight^0.75 per d. Samples of abomasal digesta were collected as in Expt 1. 7. A high intake of DM at a low DM concentration resulted in low acidity of the digesta in the first 3 h after feeding, which suggested a dilution effect. Comparison of two diets of different DM concentration, which were fed in the same volume of liquid, indicated that the greater the DM intake, the greater was the amount of acid secreted. 8. It is concluded that the protein sources varied in their ability to stimulate abomasal acid and protease secretion, and it is suggested that this may relate to calf performance.
2016-01-01
Purpose/Objective(s) We sought to identify swallowing muscle dose-response thresholds associated with chronic radiation-associated dysphagia (RAD) after IMRT for oropharyngeal cancer. Materials/Methods T1-4 N0-3 M0 oropharyngeal cancer patients who received definitive IMRT and systemic therapy were examined. Chronic RAD was coded as any of the following ≥ 12 months post-IMRT: videofluoroscopy/endoscopy detected aspiration or stricture, gastrostomy tube and/or aspiration pneumonia. DICOM-RT plan data were autosegmented using a custom region-of-interest (ROI) library and included inferior, middle and superior constrictors (IPC, MPC, and SPC), medial and lateral pterygoids (MPM, LPM), anterior and posterior digastrics (ADM, PDM), intrinsic tongue muscles (ITM), mylo/geniohyoid complex (MHM), genioglossus (GGM), masseter (MM), buccinator (BM), palatoglossus (PGM), and cricopharyngeus (CPM), with ROI dose-volume histograms (DVHs) calculated. Recursive partitioning analysis (RPA) was used to identify dose-volume effects associated with chronic-RAD, for use in a multivariate (MV) model. Results Of 300 patients, 34 (11%) had chronic-RAD. RPA showed DVH-derived MHM V69 (i.e. the volume receiving ≥69 Gy), GGM V35, ADM V60, MPC V49, and SPC V70 were associated with chronic-RAD. A model including age in addition to MHM V69 as continuous variables was optimal among tested MV models (AUC 0.835). Conclusion In addition to SPCs, dose to MHM should be monitored and constrained, especially in older patients (>62 years), when feasible. PMID:26897515
2016-02-01
We sought to identify swallowing muscle dose-response thresholds associated with chronic radiation-associated dysphagia (RAD) after IMRT for oropharyngeal cancer. T1-4 N0-3 M0 oropharyngeal cancer patients who received definitive IMRT and systemic therapy were examined. Chronic RAD was coded as any of the following ⩾12 months post-IMRT: videofluoroscopy/endoscopy detected aspiration or stricture, gastrostomy tube and/or aspiration pneumonia. DICOM-RT plan data were autosegmented using a custom region-of-interest (ROI) library and included inferior, middle and superior constrictors (IPC, MPC, and SPC), medial and lateral pterygoids (MPM, LPM), anterior and posterior digastrics (ADM, PDM), intrinsic tongue muscles (ITM), mylo/geniohyoid complex (MHM), genioglossus (GGM), masseter (MM), buccinator (BM), palatoglossus (PGM), and cricopharyngeus (CPM), with ROI dose-volume histograms (DVHs) calculated. Recursive partitioning analysis (RPA) was used to identify dose-volume effects associated with chronic-RAD, for use in a multivariate (MV) model. Of 300 patients, 34 (11%) had chronic-RAD. RPA showed DVH-derived MHM V69 (i.e. the volume receiving ⩾69 Gy), GGM V35, ADM V60, MPC V49, and SPC V70 were associated with chronic-RAD. A model including age in addition to MHM V69 as continuous variables was optimal among tested MV models (AUC 0.835). In addition to SPCs, dose to MHM should be monitored and constrained, especially in older patients (>62 years), when feasible. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
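The DVH-derived Vx metrics used above (e.g., MHM V69, the volume receiving at least 69 Gy) have a simple definition. A minimal sketch, assuming per-voxel dose samples for a structure rather than the full clinical DICOM-RT pipeline:

```python
def v_x_percent(voxel_doses_gy, threshold_gy):
    """Relative Vx: percentage of the structure's voxels receiving a dose
    of at least threshold_gy (a point read off the cumulative DVH)."""
    hot = sum(1 for d in voxel_doses_gy if d >= threshold_gy)
    return 100.0 * hot / len(voxel_doses_gy)

# Hypothetical mylo/geniohyoid voxel doses in Gy; here V69 is 25%.
mhm_doses = [60.0, 66.5, 70.2, 68.9]
v69 = v_x_percent(mhm_doses, 69.0)
```

In practice the Vx would be computed from the treatment planning system's DVH over the autosegmented ROI, and the RPA then searches such Vx values for the threshold best separating patients with and without chronic RAD.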
Multiscale verification of water fluxes and states over Pan European river basins
NASA Astrophysics Data System (ADS)
Samaniego, Luis; Rakovec, Oldrich; Schaefer, David; Kumar, Rohini; Cuntz, Matthias; Mai, Juliane; Craven, John
2014-05-01
Developing the ability to predict the movement of water at regional scales with a spatial resolution of 1 to 5 km is one of the grand challenges in land surface modelling. Coping with this grand challenge implies that land surface models (LSMs) should be able to make reliable predictions across locations and/or scales other than those used for parameter estimation. Validating LSMs only against an integral basin response such as streamflow is a necessary but not a sufficient condition to guarantee the appropriate partitioning of incoming precipitation and radiation into different water budget components. Extensive in-situ observations of state variables (e.g., soil moisture), on the contrary, are not feasible at regional scales. Remote sensing has been considered the solution to this dilemma because remotely sensed products constitute a cost-effective source of information and provide valuable insight into the spatio-temporal patterns of state variables. Their main disadvantage is their large uncertainty. The mesoscale hydrologic model (mHM 5.0, http://www.ufz.de/index.php?en=31389) is used in this study to estimate uncalibrated water fluxes and states and then to investigate the effects of conditioning this model with freely available multiple-scale data sets. The main characteristic of mHM is the treatment of the sub-grid variability of input variables and model parameters, which clearly distinguishes this model from existing precipitation-runoff models or land surface models. It uses a Multiscale Parameter Regionalization (MPR) to account for the sub-grid variability and to avoid systematic re-calibration. Another key characteristic of mHM is that it can simultaneously estimate fluxes in nested scales and/or in multiple basins, keeping its global parameters (i.e., regionalization coefficients) unaltered across scales and basins.
These key characteristics of the model allow it to assimilate disparate sources of information such as satellite data, streamflow gauging stations, and eddy covariance data at their native resolutions. To address these objectives, mHM was set up over more than 280 Pan-European river basins. The model was forced with the gridded E-OBS data set (25×25 km2) obtained from the European Climate Assessment & Dataset project. The required morphological data were derived from the FAO soil map (1:5,000,000), the SRTM DEM (500 m) and three CORINE land cover scenes (500 m). MODIS LAI (NASA) was used to estimate a dynamic LAI model for every land cover class. mHM simulations were obtained at 25 km spatial resolution for the period 1950-2012. The multi-scale verification of simulated water fluxes was carried out using observational data sets such as: latent heat flux obtained from more than 150 eddy flux stations (FLUXNET), streamflow at more than 250 gauging stations (GRDC), and the remotely sensed Earth's gravity field anomalies retrieved by the Gravity Recovery and Climate Experiment (GRACE) release 05 (Landerer and Swenson, 2012, WRR). The latter are used as a proxy for the total water storage anomalies in mHM. In Germany, over 1000 groundwater stations with weekly stage records were used to evaluate and/or condition groundwater level anomalies. mHM water storage anomalies simulated over Europe from 2003 to 2012 at a monthly time step were compared with those of GRACE. Results lead to the conclusion that mHM water fluxes are robust, since less than 25% of river basins exhibit Nash-Sutcliffe efficiencies (NSE) of 0.5 or less. Likewise, the soil moisture and groundwater anomalies, especially in severe drought years such as 2003, exhibit a large spatial correlation with those obtained from remotely sensed products. Comparison against observed latent heat indicates that the dynamics and magnitude of the simulated values were well captured by the model at most locations.
In general, deficient model performance (NSE
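The Nash-Sutcliffe efficiency (NSE) used above as the streamflow performance criterion has a standard definition; a minimal sketch:

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 minus the sum of squared model errors
    over the sum of squared deviations of the observations from their
    mean. NSE = 1 is a perfect fit; NSE <= 0 means the model is no better
    a predictor than the mean of the observations."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_obs = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / ss_obs
```

This is why the abstract's robustness claim is phrased as "less than 25% of river basins exhibit NSE of 0.5 or less": 0.5 is a commonly used acceptability threshold for daily or monthly streamflow simulations.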
20. MACHINE SHOP, LOOKING SOUTH. SHOP IS EQUIPPED WITH A ...
20. MACHINE SHOP, LOOKING SOUTH. SHOP IS EQUIPPED WITH A 25-TON SHAW CRANE TO HANDLE PARTS FROM RAIL CARS INTO THE SHOP. MACHINE SHOP HANDLES ALL NECESSARY REPAIR WORK ON THE DOCK MACHINERY. - Pennsylvania Railway Ore Dock, Lake Erie at Whiskey Island, approximately 1.5 miles west of Public Square, Cleveland, Cuyahoga County, OH
Parallel database search and prime factorization with magnonic holographic memory devices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khitun, Alexander
In this work, we describe the capabilities of Magnonic Holographic Memory (MHM) for parallel database search and prime factorization. MHM is a type of holographic device which utilizes spin waves for data transfer and processing. Its operation is based on the correlation between the phases and the amplitudes of the input spin waves and the output inductive voltage. The input of MHM is provided by a phased array of spin wave generating elements, allowing the production of phase patterns of arbitrary form. The latter makes it possible to code logic states into the phases of propagating waves and exploit wave superposition for parallel data processing. We present the results of numerical modeling illustrating parallel database search and prime factorization. The results of numerical simulations on the database search are in agreement with the available experimental data. The use of classical wave interference may result in a significant speedup over conventional digital logic circuits in special-task data processing (e.g., √n in database search). Potentially, magnonic holographic devices can be implemented as complementary logic units to digital processors. Physical limitations and technological constraints of the spin wave approach are also discussed.
Parallel database search and prime factorization with magnonic holographic memory devices
NASA Astrophysics Data System (ADS)
Khitun, Alexander
2015-12-01
In this work, we describe the capabilities of Magnonic Holographic Memory (MHM) for parallel database search and prime factorization. MHM is a type of holographic device which utilizes spin waves for data transfer and processing. Its operation is based on the correlation between the phases and the amplitudes of the input spin waves and the output inductive voltage. The input of MHM is provided by a phased array of spin wave generating elements, allowing the production of phase patterns of arbitrary form. The latter makes it possible to code logic states into the phases of propagating waves and exploit wave superposition for parallel data processing. We present the results of numerical modeling illustrating parallel database search and prime factorization. The results of numerical simulations on the database search are in agreement with the available experimental data. The use of classical wave interference may result in a significant speedup over conventional digital logic circuits in special-task data processing (e.g., √n in database search). Potentially, magnonic holographic devices can be implemented as complementary logic units to digital processors. Physical limitations and technological constraints of the spin wave approach are also discussed.
Towards reliable ET estimates in the semi-arid Júcar region in Spain.
NASA Astrophysics Data System (ADS)
Brenner, Johannes; Zink, Matthias; Schrön, Martin; Thober, Stephan; Rakovec, Oldrich; Cuntz, Matthias; Merz, Ralf; Samaniego, Luis
2017-04-01
Current research indicates the potential for improving evapotranspiration (ET) estimates in state-of-the-art hydrologic models such as the mesoscale Hydrological Model (mHM, www.ufz.de/mhm). Most models exhibit deficiencies in estimating the ET flux in semi-arid regions. Possible reasons for poor performance include the low resolution of the forcings; the estimation of PET, which in most cases is based on temperature only; the joint estimation of transpiration and evaporation through the Feddes equation; and poor process parameterizations, among others. In this study, we design sequential hypothesis-based experiments to uncover the main reasons for these deficiencies in the Júcar basin in Spain. We plan the following experiments: 1) Use the high-resolution meteorological forcing (P and T) provided by local authorities to estimate its effects on ET and streamflow. 2) Use local ET measurements at seven eddy-covariance stations to estimate evaporation-related parameters. 3) Test the influence of the PET formulations (Hargreaves-Samani, Priestley-Taylor, Penman-Monteith). 4) Estimate evaporation and transpiration separately based on equations proposed by Bohn and Vivoni (2016). 5) Incorporate local soil moisture measurements to re-estimate ET- and soil-moisture-related parameters. We set up mHM for seven eddy-covariance sites at the local scale (100 m × 100 m). This resolution was chosen because it is representative of the footprint of the latent heat estimation at the eddy-covariance station. In the second experiment, for example, a parameter set is to be found as a compromise solution between ET measured at local stations and the streamflow observations at eight sub-basins of the Júcar river. Preliminary results indicate that higher model performance regarding streamflow can be achieved using local high-resolution meteorology. ET performance is, however, still deficient. 
On the contrary, using ET site calibrations alone increases performance in ET but yields poor performance in streamflow. The results suggest the need for multi-variable, simultaneous calibration schemes to reliably estimate ET and streamflow in the Júcar basin. Penman-Monteith appears to be the best-performing PET formulation. Experiments 4 and 5 should reveal the benefits of separating evaporation from bare soil and transpiration in semi-arid regions using mHM. Further research in this direction is foreseen by incorporating neutron counts from Cosmic-Ray Neutron Sensing technology in the calibration/validation procedure of mHM.
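For reference, the temperature-only PET formulation mentioned above can be sketched as follows; this is the standard Hargreaves-Samani (1985) equation, with illustrative input values that are not taken from the study.

```python
import math

def pet_hargreaves_samani(tmin_c, tmax_c, ra_mm_per_day):
    """Hargreaves-Samani (1985) reference evapotranspiration [mm/day].
    ra_mm_per_day: extraterrestrial radiation expressed as equivalent
    evaporation; temperatures in degrees Celsius."""
    tmean = 0.5 * (tmin_c + tmax_c)
    return 0.0023 * ra_mm_per_day * (tmean + 17.8) * math.sqrt(tmax_c - tmin_c)

# Illustrative values for a warm day in a semi-arid basin (not study data)
pet = pet_hargreaves_samani(tmin_c=15.0, tmax_c=32.0, ra_mm_per_day=16.5)
```

Because only temperature and (tabulated) extraterrestrial radiation enter, the formula cannot respond to humidity or wind, which is one candidate explanation for the deficiencies the study investigates.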
'We do not know': a qualitative study exploring boys' perceptions of menstruation in India.
Mason, Linda; Sivakami, Muthusamy; Thakur, Harshad; Kakade, Narendra; Beauman, Ashley; Alexander, Kelly T; van Eijke, Anna Maria; Laserson, Kayla F; Thakkar, Mamita B; Phillips-Howard, Penelope A
2017-12-08
In low- and middle-income countries and other areas of poverty, menstrual hygiene management (MHM) can be problematic for women and girls. Issues include a lack of knowledge about menstruation and MHM, stigma around menstruation, access to affordable and absorbent materials, privacy to change, adequate washing, cleaning and drying facilities, as well as appropriate and accessible disposal facilities. In order to effect change and tackle these issues, particularly in patriarchal societies, males may need to become advocates for MHM alongside women. However, little is known about their knowledge of and attitudes towards menstruation, which may need addressing before they can act as advocates for change. The present study was undertaken to explore knowledge and attitudes about menstruation among adolescent boys across India, in order to gauge their potential to support their 'sisters'. The study was undertaken across three states in India, chosen a priori to represent the country's cultural and socio-economic diversity. Qualitative data were gathered using focus group discussions with 85 boys aged 13-17 years from 8 schools. Data were analysed using thematic analysis. The results were organised into three main themes, reflecting the key research questions: boys' knowledge of menstruation, sources of knowledge, and attitudes towards menstruation and menstruating girls. Knowledge comprised three aspects: biological function, which was generally poorly understood; cultural rites, which were recognized by all; and girls' behaviour and demeanour, which were noted to be withdrawn. Some boys learnt about puberty and menstruation as part of the curriculum but had concerns that this was not in-depth, or was missed out altogether. Most gathered knowledge from informal sources, from overhearing conversations or observing cultural rituals. Few boys openly displayed a negative attitude, although a minority voiced the idea that menstruation is a 'disease'. 
Boys were mostly sympathetic to their menstruating sisters and wanted to support them. These findings provide some optimism that males can become advocates in moving the MHM agenda forward. The reasons for this are twofold: boys were keen for knowledge about menstruation, seeking information out despite societal norms expecting them to remain ignorant; and they were largely sympathetic to their menstruating sisters and fellow classmates, and understanding of the issues surrounding the need for good MHM.
Calibration of a distributed hydrologic model using observed spatial patterns from MODIS data
NASA Astrophysics Data System (ADS)
Demirel, Mehmet C.; González, Gorka M.; Mai, Juliane; Stisen, Simon
2016-04-01
Distributed hydrologic models are typically calibrated against streamflow observations at the outlet of the basin. Along with these observations from gauging stations, satellite-based estimates offer independent evaluation data such as remotely sensed actual evapotranspiration (aET) and land surface temperature. The primary objective of this study is to compare model calibrations against traditional downstream discharge measurements with calibrations against simulated spatial patterns and combinations of both types of observations. While discharge-based model calibration typically improves the temporal dynamics of the model, it yields minimal improvement of the simulated spatial patterns. In contrast, objective functions specifically targeting the spatial pattern performance could potentially increase the spatial model performance. However, most modeling studies, including the model formulations and parameterization, are not designed to actually change the simulated spatial pattern during calibration. This study investigates the potential benefits of incorporating spatial patterns from MODIS data to calibrate the mesoscale hydrologic model (mHM). This model is selected as it allows for a change in the spatial distribution of key soil parameters through the optimization of pedo-transfer function parameters and includes options for using fully distributed daily Leaf Area Index (LAI) values directly as input. In addition, the simulated aET can be estimated at a spatial resolution suitable for comparison to the spatial patterns observed with MODIS data. To increase our control on spatial calibration, we introduced three additional parameters to the model. These new parameters are part of an empirical equation to calculate the crop coefficient (Kc) from daily LAI maps, which is used to update potential evapotranspiration (PET) as model input. 
This is done instead of correcting/updating PET with just a uniform (or aspect-driven) factor as used in the mHM model (version 5.3). We selected the 20 most important parameters out of 53 mHM parameters based on a comprehensive sensitivity analysis (Cuntz et al., 2015). We calibrated 1 km-daily mHM for the Skjern basin in Denmark using the Shuffled Complex Evolution (SCE) algorithm and inputs at different spatial scales, i.e. meteorological data at 10 km and morphological data at 250 m. We used correlation coefficients between observed monthly (summer months only) MODIS data, calculated from cloud-free days over the calibration period from 2001 to 2008, and simulated aET from mHM over the same period. Similarly, other metrics, e.g. mapcurves and fraction skill score, are also included in our objective function to assess the co-location of the grid cells. The preliminary results show that multi-objective calibration of mHM against observed streamflow and spatial patterns together does not significantly reduce the spatial errors in aET, while it improves the streamflow simulations. This is a strong signal for further investigation of the multi-parameter regionalization affecting spatial aET patterns and of the weighting of the spatial metrics in the objective function relative to the streamflow metrics.
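A minimal sketch of the two ingredients described above, assuming a saturating-exponential form for the Kc(LAI) relation (the paper's exact three-parameter equation is not given here, so this curve and its parameter names are hypothetical) and a simple Pearson-correlation spatial objective:

```python
import numpy as np

def crop_coefficient(lai, kc_min=0.4, kc_max=1.2, c=0.6):
    """Hypothetical 3-parameter Kc(LAI) curve: saturating growth from
    kc_min (bare soil) toward kc_max (full canopy). PET would then be
    updated as Kc * reference PET on each grid cell."""
    return kc_min + (kc_max - kc_min) * (1.0 - np.exp(-c * np.asarray(lai)))

def spatial_pattern_objective(sim_aet, obs_aet):
    """Pearson correlation between simulated and observed aET maps,
    ignoring no-data cells (NaN in either map)."""
    sim, obs = np.ravel(sim_aet), np.ravel(obs_aet)
    valid = ~(np.isnan(sim) | np.isnan(obs))
    return np.corrcoef(sim[valid], obs[valid])[0, 1]
```

A multi-objective calibration would combine such a spatial metric with a streamflow metric (e.g. Nash-Sutcliffe efficiency) in one aggregated or Pareto-based objective.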
Control rod system useable for fuel handling in a gas-cooled nuclear reactor
Spurrier, Francis R.
1976-11-30
A control rod and its associated drive are used to elevate a complete stack of fuel blocks to a position above the core of a gas-cooled nuclear reactor. A fuel-handling machine grasps the control rod and the drive is unlatched from the rod. The stack and rod are transferred out of the reactor, or to a new location in the reactor, by the fuel-handling machine.
VanLeeuwen, Crystal; Torondel, Belen
2018-01-01
Management of menstruation in contexts of humanitarian emergencies can be challenging. A lack of empirical research about effective interventions which improve menstrual hygiene management (MHM) among female populations in humanitarian emergencies, and a lack of clarity about which sectors within a humanitarian response should deliver MHM interventions, can both be attributed to the lack of clear guidance on the design and delivery of culturally appropriate MHM interventions in settings of humanitarian emergencies. The objective of this review was to collate, summarize, and appraise existing peer-reviewed and gray literature that describes the current scenario of MHM in emergency contexts, in order to describe the breadth and depth of current policies, guidelines, empirical research, and humanitarian aid activities addressing populations' menstrual needs. A structured search strategy was conducted for peer-reviewed and gray literature to identify studies, published reports, guidelines, and policy papers related to menstrual response in emergency humanitarian contexts. Of the 51 articles included in the review, 16 were peer-reviewed papers and 35 were gray literature. Most of the literature agreed that hardware interventions should focus on the supply of adequate material (not only absorbent material but also other supportive material) and adequate sanitation facilities, with access to water and private space for washing, changing, drying, and disposing of menstrual materials. Software interventions should focus on education in the usage of materials to manage menstruation hygienically and education about the female body's biological processes. There was clear agreement that the needs of the target population should be assessed before designing any intervention. 
Although there is insight about which factors should be included in an effective menstrual hygiene intervention, there is insufficient empirical evidence to establish which interventions are most effective in humanitarian emergencies and which sectors should be responsible for their coordination and implementation. Increased monitoring and evaluation studies of interventions should be completed and publicly shared, in order to feed evidence-based guidelines in the humanitarian sector.
Tarride, J E; Harrington, K; Balfour, R; Simpson, P; Foord, L; Anderson, L; Lakey, W
2011-01-01
To evaluate the My Health Matters! (MHM) program, a multifaceted workplace intervention relying on education and awareness, early detection and disease management, with a focus on risk factors for metabolic syndrome. The MHM program was offered to 2,000 public servants working in more than 30 worksites in British Columbia, Canada. The MHM program included a health risk assessment (HRA) combined with an opportunity to attend an on-site screening, face-to-face call-back visits and related on-site educational programs. Clinical and economic outcomes were collected over time in this one-year prospective study, coupled with administrative and survey data. Forty-three per cent of employees (N=857) completed the online HRA and 23 per cent (N=447) attended the initial clinical visit with the nurse. Risk factors for metabolic syndrome were identified in more than half of those attending the clinical visit. The number of risk factors significantly decreased by 15 per cent over six months (N=141). The cost per employee completing the HRA was $205, while the cost per employee attending the initial clinical visit was $394. Eighty-two per cent of employees would recommend the program to other employers. This study shows that workplace interventions are feasible, sustainable and valued by employees. As such, it provides a new framework for implementing and evaluating workplace interventions focussing on metabolic disorders.
Ammunition Loading and Firing Test Pretest Physical Conditioning of Female Soldier Participants
1978-10-01
…appear to be a significant improvement considering that Cooper's values are based upon women running in shorts and tennis shoes as opposed to the Ss who… [lifting-machine instructions, partially recovered:] 1. Stand facing the machine between the handles. 2. Grasp the lift handles. 3. Squat down, bending at knees and hips, and "pin" elbows to your side.
NASA Astrophysics Data System (ADS)
Kalra, Anisha; Vura, Sandeep; Rathkanthiwar, Shashwat; Muralidharan, Rangarajan; Raghavan, Srinivasan; Nath, Digbijoy N.
2018-06-01
We demonstrate epitaxial β-Ga2O3/GaN-based vertical metal-heterojunction-metal (MHM) broadband UV-A/UV-C photodetectors with high responsivity (3.7 A/W) at 256 and 365 nm, UV-to-visible rejection >10³, and a photo-to-dark current ratio of ∼100. A small (large) conduction (valence) band offset at the heterojunction of pulsed laser deposition (PLD)-grown β-Ga2O3 on metal organic chemical vapor deposition (MOCVD)-grown GaN-on-silicon with epitaxial registry, as confirmed by X-ray diffraction (XRD) azimuthal scanning, is exploited to realize detectors with an asymmetric photoresponse and is explained with one-dimensional (1D) band diagram simulations. The demonstrated novel vertical MHM detectors on silicon are fully scalable and promising for enabling focal plane arrays for broadband ultraviolet sensing.
A Look at Technologies Vis-a-vis Information Handling Techniques.
ERIC Educational Resources Information Center
Swanson, Rowena W.
The paper examines several ideas for information handling implemented with new technologies that suggest directions for future development. These are grouped under the topic headings: Handling Large Data Banks, Providing Personalized Information Packages, Providing Information Specialist Services, and Expanding Man-Machine Interaction. Guides in…
The Multiscale Robin Coupled Method for flows in porous media
NASA Astrophysics Data System (ADS)
Guiraldello, Rafael T.; Ausas, Roberto F.; Sousa, Fabricio S.; Pereira, Felipe; Buscaglia, Gustavo C.
2018-02-01
A multiscale mixed method aiming at the accurate approximation of velocity and pressure fields in heterogeneous porous media is proposed. The procedure is based on a new domain decomposition method in which the local problems are subject to Robin boundary conditions. The domain decomposition procedure is defined in terms of two independent spaces on the skeleton of the decomposition, corresponding to interface pressures and fluxes, that can be chosen with great flexibility to accommodate local features of the underlying permeability fields. The well-posedness of the new domain decomposition procedure is established, and its connection with the method of Douglas et al. (1993) [12] is identified, also allowing us to reinterpret the known procedure as an optimized Schwarz (or Two-Lagrange-Multiplier) method. The multiscale property of the new domain decomposition method is indicated, and its relation with the Multiscale Mortar Mixed Finite Element Method (MMMFEM) and the Multiscale Hybrid-Mixed (MHM) Finite Element Method is discussed. Numerical simulations are presented to illustrate several features of the new method. Initially we illustrate the possibility of switching from MMMFEM to MHM by suitably varying the Robin condition parameter in the new multiscale method. Then we turn our attention to realistic flows in high-contrast, channelized porous formations. We show that for a range of values of the Robin condition parameter our method provides better approximations for pressure and velocity than those computed with either the MMMFEM or the MHM. This is an indication that our method has the potential to produce more accurate velocity fields in the presence of rough, realistic permeability fields of petroleum reservoirs.
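The switching behaviour described above can be sketched with a generic Robin interface condition for Darcy flow; the notation and sign convention below are ours, not necessarily the paper's:

```latex
% Robin condition imposed on each subdomain \Omega_i along the
% skeleton \Gamma, with interface pressure P and interface flux U:
-\beta_i\, \mathbf{u}_i \cdot \mathbf{n}_i + p_i
    \;=\; -\beta_i\, U + P
    \qquad \text{on } \partial\Omega_i \cap \Gamma .
% \beta_i \to 0: pressure continuity dominates (mortar/MMMFEM-like coupling);
% \beta_i \to \infty: flux continuity dominates (MHM-like coupling).
```

The single parameter β thus interpolates between the two established multiscale couplings the abstract compares against.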
Simulation Platform: a cloud-based online simulation environment.
Yamazaki, Tadashi; Ikeno, Hidetoshi; Okumura, Yoshihiro; Satoh, Shunji; Kamiyama, Yoshimi; Hirata, Yutaka; Inagaki, Keiichiro; Ishihara, Akito; Kannon, Takayuki; Usui, Shiro
2011-09-01
For multi-scale and multi-modal neural modeling, it is necessary to handle multiple neural models described at different levels seamlessly. Database technology will become more important for these studies, specifically for downloading and handling neural models seamlessly and effortlessly. To date, conventional neuroinformatics databases have been designed solely to archive model files, but databases should also give users a chance to validate models before downloading them. In this paper, we report our ongoing project to develop a cloud-based web service for online simulation called "Simulation Platform". Simulation Platform is a cloud of virtual machines running GNU/Linux. On a virtual machine, various software packages are pre-installed, including developer tools such as compilers and libraries, popular neural simulators such as GENESIS, NEURON and NEST, and scientific software such as Gnuplot, R and Octave. When a user posts a request, a virtual machine is assigned to the user, and the simulation starts on that machine. The user remotely accesses the machine through a web browser and carries out the simulation, without the need to install any software but a web browser on the user's own computer. Therefore, Simulation Platform is expected to eliminate impediments to handling multiple neural models that require multiple software packages. Copyright © 2011 Elsevier Ltd. All rights reserved.
Reprint of: Simulation Platform: a cloud-based online simulation environment.
Yamazaki, Tadashi; Ikeno, Hidetoshi; Okumura, Yoshihiro; Satoh, Shunji; Kamiyama, Yoshimi; Hirata, Yutaka; Inagaki, Keiichiro; Ishihara, Akito; Kannon, Takayuki; Usui, Shiro
2011-11-01
For multi-scale and multi-modal neural modeling, it is necessary to handle multiple neural models described at different levels seamlessly. Database technology will become more important for these studies, specifically for downloading and handling neural models seamlessly and effortlessly. To date, conventional neuroinformatics databases have been designed solely to archive model files, but databases should also give users a chance to validate models before downloading them. In this paper, we report our ongoing project to develop a cloud-based web service for online simulation called "Simulation Platform". Simulation Platform is a cloud of virtual machines running GNU/Linux. On a virtual machine, various software packages are pre-installed, including developer tools such as compilers and libraries, popular neural simulators such as GENESIS, NEURON and NEST, and scientific software such as Gnuplot, R and Octave. When a user posts a request, a virtual machine is assigned to the user, and the simulation starts on that machine. The user remotely accesses the machine through a web browser and carries out the simulation, without the need to install any software but a web browser on the user's own computer. Therefore, Simulation Platform is expected to eliminate impediments to handling multiple neural models that require multiple software packages. Copyright © 2011 Elsevier Ltd. All rights reserved.
Menstrual hygiene management among adolescent girls in India: a systematic review and meta-analysis
van Eijk, Anna Maria; Sivakami, M; Thakkar, Mamita Bora; Bauman, Ashley; Laserson, Kayla F; Coates, Susanne; Phillips-Howard, Penelope A
2016-01-01
Objectives To assess the status of menstrual hygiene management (MHM) among adolescent girls in India to determine unmet needs. Design Systematic review and meta-analysis. We searched PubMed, The Global Health Database, Google Scholar and references for studies published from 2000 to September 2015 on girls’ MHM. Setting India. Participants Adolescent girls. Outcome measures Information on menarche awareness, type of absorbent used, disposal, hygiene, restrictions and school absenteeism was extracted from eligible materials; a quality score was applied. Meta-analysis was used to estimate pooled prevalence (PP), and meta-regression to examine the effect of setting, region and time. Results Data from 138 studies involving 193 subpopulations and 97 070 girls were extracted. In 88 studies, half of the girls reported being informed prior to menarche (PP 48%, 95% CI 43% to 53%, I² 98.6%). Commercial pad use was more common among urban (PP 67%, 57% to 76%, I² 99.3%, n=38) than rural girls (PP 32%, 25% to 38%, I² 98.6%, n=56, p<0.0001), with use increasing over time (p<0.0001). Inappropriate disposal was common (PP 23%, 16% to 31%, I² 99.0%, n=34). Menstruating girls experienced many restrictions, especially for religious activities (PP 0.77, 0.71 to 0.83, I² 99.1%, n=67). A quarter (PP 24%, 19% to 30%, I² 98.5%, n=64) reported missing school during periods. A lower prevalence of absenteeism was associated with higher commercial pad use in univariate (p=0.023) but not in multivariate analysis when adjusted for region (p=0.232, n=53). Approximately a third of girls changed their absorbents in school facilities (PP 37%, 29% to 46%, I² 97.8%, n=17). Half of the girls’ homes had a toilet (PP 51%, 36% to 67%, I² 99.4%, n=21). The quality of studies imposed limitations on analyses and the interpretation of results (mean score 3 on a scale of 0–7). Conclusions Strengthening of MHM programmes in India is needed. 
Education on awareness, access to hygienic absorbents and disposal of MHM items need to be addressed. Trial registration number CRD42015019197. PMID:26936906
Menstrual hygiene management among adolescent girls in India: a systematic review and meta-analysis.
van Eijk, Anna Maria; Sivakami, M; Thakkar, Mamita Bora; Bauman, Ashley; Laserson, Kayla F; Coates, Susanne; Phillips-Howard, Penelope A
2016-03-02
To assess the status of menstrual hygiene management (MHM) among adolescent girls in India to determine unmet needs. Systematic review and meta-analysis. We searched PubMed, The Global Health Database, Google Scholar and references for studies published from 2000 to September 2015 on girls' MHM. India. Adolescent girls. Information on menarche awareness, type of absorbent used, disposal, hygiene, restrictions and school absenteeism was extracted from eligible materials; a quality score was applied. Meta-analysis was used to estimate pooled prevalence (PP), and meta-regression to examine the effect of setting, region and time. Data from 138 studies involving 193 subpopulations and 97,070 girls were extracted. In 88 studies, half of the girls reported being informed prior to menarche (PP 48%, 95% CI 43% to 53%, I² 98.6%). Commercial pad use was more common among urban (PP 67%, 57% to 76%, I² 99.3%, n=38) than rural girls (PP 32%, 25% to 38%, I² 98.6%, n=56, p<0.0001), with use increasing over time (p<0.0001). Inappropriate disposal was common (PP 23%, 16% to 31%, I² 99.0%, n=34). Menstruating girls experienced many restrictions, especially for religious activities (PP 0.77, 0.71 to 0.83, I² 99.1%, n=67). A quarter (PP 24%, 19% to 30%, I² 98.5%, n=64) reported missing school during periods. A lower prevalence of absenteeism was associated with higher commercial pad use in univariate (p=0.023) but not in multivariate analysis when adjusted for region (p=0.232, n=53). Approximately a third of girls changed their absorbents in school facilities (PP 37%, 29% to 46%, I² 97.8%, n=17). Half of the girls' homes had a toilet (PP 51%, 36% to 67%, I² 99.4%, n=21). The quality of studies imposed limitations on analyses and the interpretation of results (mean score 3 on a scale of 0-7). Strengthening of MHM programmes in India is needed. Education on awareness, access to hygienic absorbents and disposal of MHM items need to be addressed. CRD42015019197. 
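The pooled prevalence (PP) and I² heterogeneity statistics reported above can be sketched with a simplified random-effects (DerSimonian-Laird) computation on the logit scale; this is a generic illustration of the method, not the authors' analysis code, and the example counts are invented.

```python
import math

def pooled_prevalence(events, totals):
    """Simplified DerSimonian-Laird random-effects pooling of proportions
    on the logit scale. Returns (pooled proportion, I^2 in [0, 1])."""
    logits, variances = [], []
    for e, n in zip(events, totals):
        if e in (0, n):                  # continuity correction at 0% / 100%
            e, n = e + 0.5, n + 1.0
        p = e / n
        logits.append(math.log(p / (1.0 - p)))
        variances.append(1.0 / (n * p * (1.0 - p)))
    w = [1.0 / v for v in variances]
    fixed = sum(wi * li for wi, li in zip(w, logits)) / sum(w)
    q = sum(wi * (li - fixed) ** 2 for wi, li in zip(w, logits))
    df = len(logits) - 1
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c) if c > 0 else 0.0   # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * li for wi, li in zip(w_re, logits)) / sum(w_re)
    i2 = max(0.0, (q - df) / q) if q > 0 else 0.0     # Higgins I^2
    return 1.0 / (1.0 + math.exp(-pooled)), i2

# Invented example: three studies reporting 30/100, 60/100, 45/100
pp, i2 = pooled_prevalence([30, 60, 45], [100, 100, 100])
```

The very high I² values in the abstract (≈98-99%) indicate that nearly all observed variation between studies reflects true heterogeneity rather than sampling error.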
4. VIEW LOOKING NORTHWEST OF FUEL HANDLING BUILDING (CENTER), REACTOR SERVICE BUILDING (RIGHT), MACHINE SHOP (LEFT) - Shippingport Atomic Power Station, On Ohio River, 25 miles Northwest of Pittsburgh, Shippingport, Beaver County, PA
Technical preventive measures in Japan.
Yonekawa, Y
1994-05-01
Technical preventive measures against vibration syndrome in the field of industrial health are reviewed in the present paper. The first preventive measure is to reduce vibration transmission from the tools to the operators. This measure employs vibration isolators between the handles and vibration sources of machine tools. Handles of tools using Neidhalt dampers, shear-type rubber mounts and springs have reduced frequency-weighted acceleration levels (Lh,w) by 2 dB to 10 dB (Lh,w (dB) = 20 log a/a0; a: frequency-weighted acceleration (rms), a0 = 10⁻⁵ m/s²) in the Z direction, while no reduction was found in the X and Y directions. The second measure is to reduce vibration at the source: new chain saws have been developed with twin-cylinder instead of single-cylinder engines, which cancels unbalanced movements inside the internal combustion engine. Such chain saws reduced Lh,w values by more than 10 dB in both front and rear handles, except in the Z direction of the front handle. A new type of impact wrench has been devised with an oil pulse device to avoid direct metal contact inside the power source. This new impact wrench lowered Lh,w values by more than 10 dB in all three directions. The third measure is to use a remote control system or to substitute another machine generating less vibration. Vibration reduction at the handle lever of the remote control chain saw was more than 20 dB. A more effective means is to substitute other machines for conventional tools: a hydraulic wheel jumbo instead of a leg-type rock drill; a hydraulic breaker instead of a hand-held breaker. However, these heavy machines produce whole-body vibration, which might give rise to other problems such as back pain.
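The quoted level definition can be computed directly; the example below (with illustrative acceleration values, not measurements from the paper) also checks that a 10 dB reduction in Lh,w corresponds to cutting the rms acceleration by a factor of about 3.16.

```python
import math

def frequency_weighted_level_db(a_rms, a_ref=1e-5):
    """Lh,w (dB) = 20 log10(a / a0), with a the frequency-weighted rms
    acceleration in m/s^2 and the reference a0 = 10^-5 m/s^2,
    as defined in the text."""
    return 20.0 * math.log10(a_rms / a_ref)

# Illustrative: a 10 dB drop means the rms acceleration falls by
# a factor of 10**(10/20), roughly 3.16
before = frequency_weighted_level_db(3.0)
after = frequency_weighted_level_db(3.0 * 10 ** (-10 / 20))
```

Because the scale is logarithmic, the reported >10 dB improvements represent more than a threefold reduction in transmitted vibration amplitude.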
Satellite antenna management system and method
NASA Technical Reports Server (NTRS)
Leath, Timothy T (Inventor); Azzolini, John D (Inventor)
1999-01-01
The antenna management system and method allow a satellite to communicate with a ground station either directly or through an intermediary second satellite, thus permitting communication even when the satellite is not within range of the ground station. The system and method employ five major software components: the control and initialization module, the command and telemetry handler module, the contact schedule processor module, the contact state machine module, and the telemetry state machine module. The control and initialization module initializes the system and operates the main control cycle, in which the other modules are called. The command and telemetry handler module handles communication to and from the ground station. The contact schedule processor module handles the contact entry schedules to allow scheduling of contacts with the second satellite. The contact and telemetry state machine modules handle the various states of the satellite in beginning, maintaining and ending contact with the second satellite and in beginning, maintaining and ending communication with the satellite.
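The contact state machine module described above can be sketched as a transition table; the states and events below are hypothetical, since the patent abstract does not enumerate them.

```python
from enum import Enum, auto

class ContactState(Enum):
    IDLE = auto()         # no contact scheduled or active
    ACQUIRING = auto()    # contact window opened, establishing link
    IN_CONTACT = auto()   # relay link with the second satellite is up
    TERMINATING = auto()  # contact window closing, tearing down link

# Hypothetical transition table (state, event) -> next state
TRANSITIONS = {
    (ContactState.IDLE, "schedule_start"): ContactState.ACQUIRING,
    (ContactState.ACQUIRING, "link_up"): ContactState.IN_CONTACT,
    (ContactState.IN_CONTACT, "schedule_end"): ContactState.TERMINATING,
    (ContactState.TERMINATING, "link_down"): ContactState.IDLE,
}

def step(state, event):
    """Advance the machine; unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)

s = ContactState.IDLE
for ev in ["schedule_start", "link_up", "schedule_end", "link_down"]:
    s = step(s, ev)
```

Driving the machine from a contact schedule, as the contact schedule processor module does, amounts to feeding it timed events like these on each pass of the main control cycle.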
Automated solar module assembly line
NASA Technical Reports Server (NTRS)
Bycer, M.
1980-01-01
The solar module assembly machine which Kulicke and Soffa delivered under this contract is a cell tabbing and stringing machine capable of handling a variety of cells and assembling strings up to 4 feet long, which can then be placed into a module array up to 2 feet by 4 feet in a series or parallel arrangement, and in a straight or interdigitated array format. The machine cycle is 5 seconds per solar cell. This machine is primarily adapted to 3-inch-diameter round cells with two tabs between cells. Pulsed heat is used as the bonding technique for solar cell interconnects. The solar module assembly machine unloads solar cells from a cassette, automatically orients them, applies flux and solders interconnect ribbons onto the cells. It then inverts the tabbed cells, connects them into cell strings, and delivers them into a module array format using a track-mounted vacuum lance, from which they are taken to test and cleaning benches prior to final encapsulation into finished solar modules. Throughout the machine the solar cell is handled very carefully, and any contact with the collector side of the cell is avoided or minimized.
NASA Astrophysics Data System (ADS)
Demirel, M. C.; Mai, J.; Stisen, S.; Mendiguren González, G.; Koch, J.; Samaniego, L. E.
2016-12-01
Distributed hydrologic models are traditionally calibrated and evaluated against observations of streamflow. Spatially distributed remote sensing observations offer a great opportunity to enhance spatial model calibration schemes. To exploit them, it is important to identify the model parameters that can change spatial patterns before calibrating the hydrologic model against satellite data. Our study rests on two main pillars: first, we use spatial sensitivity analysis to identify the key parameters controlling the spatial distribution of actual evapotranspiration (AET). Second, we investigate the potential benefits of incorporating spatial patterns from MODIS data to calibrate the mesoscale Hydrologic Model (mHM). This distributed model is selected because it allows the spatial distribution of key soil parameters to change through the calibration of pedo-transfer function (PTF) parameters and includes options for using fully distributed daily Leaf Area Index (LAI) directly as input. In addition, the simulated AET can be estimated at a spatial resolution suitable for comparison with the spatial patterns observed in MODIS data. We introduce a new dynamic scaling function employing remotely sensed vegetation to downscale coarse reference evapotranspiration. In total, 17 of the 47 mHM parameters are identified using both sequential screening and Latin hypercube one-at-a-time sampling methods. The spatial patterns are found to be sensitive to the vegetation parameters, whereas streamflow dynamics are sensitive to the PTF parameters. The results of multi-objective model calibration show that calibrating mHM against observed streamflow improves only the streamflow simulations and does not reduce the spatial errors in AET. We will further examine the results of model calibration using only spatial objective functions measuring the association between observed and simulated AET maps, and another case including spatial and streamflow metrics together.
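The screening step described above, perturbing one parameter at a time and measuring the response, can be sketched in a few lines. The toy model and perturbation size below are assumptions for illustration, standing in for mHM and its 47 parameters:

```python
import numpy as np

def model(p):
    # toy "AET" response: strongly driven by p[0], weakly by p[1]
    return 3.0 * p[0] + 0.1 * p[1] ** 2

# one-at-a-time (OAT) sensitivity: perturb each parameter in turn
base = np.array([1.0, 1.0])
sensitivities = []
for i in range(len(base)):
    p = base.copy()
    p[i] += 0.1                                   # small OAT perturbation
    sensitivities.append(abs(model(p) - model(base)) / 0.1)

print(sensitivities)  # parameter 0 dominates the response
```

A screening like this ranks parameters by influence before the expensive calibration; Latin hypercube sampling varies the base point so the ranking is not tied to a single location in parameter space.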
Solli, Hans Magnus; Barbosa da Silva, António; Egeland, Jens
2015-01-01
To investigate whether adding descriptions of the health factors "ability," "environment," and "intentions/goals" to the officially sanctioned biomedical disability model (BDM) would improve assessments of work ability for social security purposes. The study was based on a theoretical design consisting of textual analysis and interpretation. Two further work ability models were defined: the mixed health model (MHM), which describes health factors without assessing a person's abilities in context, and the ability-based health model (AHM), which assesses abilities in a concrete context of environment and intention. Eighty-six social security certificates, written by psychiatrists and psychology specialists in a Norwegian hospital-based mental health clinic, were analysed in relation to the three work ability/disability models. In certificates based on the BDM, a general pattern of "gradual work training" was found. The MHM added health factors, but without linking them together in a concrete way. With the AHM, work ability was assessed in terms of a concrete, unified evaluation of the claimant's abilities, environments, and intentions/goals. Applying the AHM in work ability assessments, in comparison with the BDM and the MHM, is useful because it foregrounds claimants' abilities, as a unity, in a context of concrete goals and work-related opportunities. Implications for Rehabilitation: A concept of health should include ability, environment, and intentions/goals as components. When all three of these components are described in concrete terms in a work ability assessment, an integrated picture of the individual's abilities in the context of his/her particular intentions/goals and work opportunities comes to the fore. This kind of assessment makes it possible to meet the individual's needs for individual follow-up in a work environment.
Speed-Selector Guard For Machine Tool
NASA Technical Reports Server (NTRS)
Shakhshir, Roda J.; Valentine, Richard L.
1992-01-01
Simple guardplate prevents accidental reversal of direction of rotation or sudden change of speed of lathe, milling machine, or other machine tool. Custom-made for specific machine and control settings. Allows control lever to be placed at only one setting. Operator uses handle to slide guard to engage or disengage control lever. Protects personnel from injury and equipment from damage occurring if speed- or direction-control lever inadvertently placed in wrong position.
NASA Technical Reports Server (NTRS)
Blackburn, C. L.; Dovi, A. R.; Kurtze, W. L.; Storaasli, O. O.
1981-01-01
A computer software system for the processing and integration of engineering data and programs, called IPAD (Integrated Programs for Aerospace-Vehicle Design), is described. The ability of the system to relieve the engineer of the mundane task of input data preparation is demonstrated by the application of a prototype system to the design, analysis, and/or machining of three simple structures. Future work to further enhance the system's automated data handling and its ability to handle larger and more varied design problems is also presented.
... should—
• handle soiled items carefully without agitating them,
• wear rubber or disposable gloves while handling soiled items and wash your hands after, and
• wash the items with detergent at the maximum available cycle length, then machine dry them.
Visit CDC’s Norovirus Web site at ...
Container Prevents Oxidation Of Metal Powder
NASA Technical Reports Server (NTRS)
Woodford, William H.; Power, Christopher A.; Mckechnie, Timothy N.; Burns, David H.
1992-01-01
Sealed high-vacuum container holds metal powder required free of contamination by oxygen from point of manufacture to point of use at vacuum-plasma-spraying machine. Container protects powder from air during filling, storage, and loading of spraying machine. Eliminates unnecessary handling and transfer of powder from one container to another. Stainless-steel container sits on powder feeder of vacuum-plasma-spraying machine.
Manual actuator. [for spacecraft exercising machines
NASA Technical Reports Server (NTRS)
Gause, R. L.; Glenn, C. G. (Inventor)
1974-01-01
An actuator for an exercising machine employable by a crewman aboard a manned spacecraft is presented. The actuator is characterized by a force delivery arm projecting from a rotary input shaft of an exercising machine and having a force input handle extended orthogonally from its distal end. The handle includes a hand-grip configured to be received within the palm of the crewman's hand, a grid pivotally supported for angular displacement between a first position, wherein the grid is disposed in overlying juxtaposition with the hand-grip, and a second position, angularly displaced from the first, affording access to the hand-grip, and a latching mechanism fixed to the sole of a shoe worn by the crewman for latching the shoe to the grid when the grid is in the first position.
Reading Machines for Blind People.
ERIC Educational Resources Information Center
Fender, Derek H.
1983-01-01
Ten stages of developing reading machines for blind people are analyzed: handling of text material; optics; electro-optics; pattern recognition; character recognition; storage; speech synthesizers; browsing and place finding; computer indexing; and other sources of input. Cost considerations of the final product are emphasized. (CL)
Concept Design of the Payload Handling Manipulator System. [space shuttle orbiters
NASA Technical Reports Server (NTRS)
1975-01-01
The design, requirements, and interface definition of a remote manipulator system developed to handle orbiter payloads are presented. End effector design, control system concepts, and man-machine engineering are considered along with crew station requirements and closed circuit television system performance requirements.
Energy-efficient container handling using hybrid model predictive control
NASA Astrophysics Data System (ADS)
Xin, Jianbin; Negenborn, Rudy R.; Lodewijks, Gabriel
2015-11-01
The performance of container terminals needs to be improved to adapt the growth of containers while maintaining sustainability. This paper provides a methodology for determining the trajectory of three key interacting machines for carrying out the so-called bay handling task, involving transporting containers between a vessel and the stacking area in an automated container terminal. The behaviours of the interacting machines are modelled as a collection of interconnected hybrid systems. Hybrid model predictive control (MPC) is proposed to achieve optimal performance, balancing the handling capacity and energy consumption. The underlying control problem is hereby formulated as a mixed-integer linear programming problem. Simulation studies illustrate that a higher penalty on energy consumption indeed leads to improved sustainability using less energy. Moreover, simulations illustrate how the proposed energy-efficient hybrid MPC controller performs under different types of uncertainties.
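The trade-off the hybrid MPC formulation encodes, handling capacity versus energy consumption, can be illustrated with a toy mixed-integer linear program. The machines, their costs, and the penalty weight below are invented for illustration, and `scipy.optimize.milp` stands in for whatever industrial solver the paper used:

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Hypothetical: split 20 container moves between a fast, energy-hungry
# crane (x0) and a slower, efficient AGV (x1). The weight lam penalizes
# energy, echoing the paper's penalty on energy consumption.
lam = 0.5                                  # energy penalty weight (assumed)
time_per_move = np.array([2.0, 5.0])       # minutes per container (assumed)
energy_per_move = np.array([8.0, 3.0])     # kWh per container (assumed)

c = time_per_move + lam * energy_per_move  # combined linear cost vector
demand = LinearConstraint([[1, 1]], lb=20, ub=20)  # all 20 containers moved

res = milp(c=c, constraints=[demand],
           integrality=np.ones(2),         # both counts are integers
           bounds=Bounds(0, 20))
print(res.x)                               # optimal split of moves
```

Raising `lam` shifts moves toward the energy-efficient machine, which is exactly the "higher penalty on energy consumption leads to less energy used" behaviour the simulations report.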
Intelligible machine learning with malibu.
Langlois, Robert E; Lu, Hui
2008-01-01
malibu is an open-source machine learning workbench developed in C/C++ for high-performance real-world applications, namely bioinformatics and medical informatics. It leverages third-party machine learning implementations for more robust, bug-free software. The workbench handles several well-studied supervised machine learning problems, including classification, regression, importance-weighted classification, and multiple-instance learning. The malibu interface was designed to create reproducible experiments, ideally run in a remote and/or command-line environment. The software can be found at: http://proteomics.bioengr.uic.edu/malibu/index.html.
Operating System For Numerically Controlled Milling Machine
NASA Technical Reports Server (NTRS)
Ray, R. B.
1992-01-01
OPMILL program is operating system for Kearney and Trecker milling machine providing fast easy way to program manufacture of machine parts with IBM-compatible personal computer. Gives machinist "equation plotter" feature, which plots equations that define movements and converts equations to milling-machine-controlling program moving cutter along defined path. System includes tool-manager software handling up to 25 tools and automatically adjusts to account for each tool. Developed on IBM PS/2 computer running DOS 3.3 with 1 MB of random-access memory.
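The "equation plotter" idea, turning an equation into cutter moves, can be sketched as sampling a parametric curve and emitting straight-line moves. G01 is the standard linear-interpolation word in G-code, but the feed rate, the example curve, and the output layout here are assumptions, not OPMILL's actual output format:

```python
import math

def equation_to_gcode(x_of_t, y_of_t, t0, t1, steps, feed=200):
    """Sample a parametric curve and emit G01 linear moves approximating it."""
    lines = [f"G01 X{x_of_t(t0):.3f} Y{y_of_t(t0):.3f} F{feed}"]
    for i in range(1, steps + 1):
        t = t0 + (t1 - t0) * i / steps
        lines.append(f"G01 X{x_of_t(t):.3f} Y{y_of_t(t):.3f}")
    return lines

# quarter circle of radius 10, approximated by 4 chords
gcode = equation_to_gcode(lambda t: 10 * math.cos(t),
                          lambda t: 10 * math.sin(t),
                          0.0, math.pi / 2, steps=4)
print("\n".join(gcode))
```

Increasing `steps` tightens the chordal approximation of the curve, which is the basic accuracy/program-length trade-off any such plotter-to-toolpath converter makes.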
Checkpoint repair for high-performance out-of-order execution machines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hwu, W.M.W.; Patt, Y.N.
Out-of-order execution and branch prediction are two mechanisms that can be used profitably in the design of supercomputers to increase performance. Proper exception handling and branch-prediction-miss handling in an out-of-order execution machine require some kind of repair mechanism which can restore the machine to a known previous state. In this paper the authors present a class of repair mechanisms using the concept of checkpointing. The authors derive several properties of checkpoint repair mechanisms. In addition, they provide algorithms for performing checkpoint repair that incur little overhead in time and modest cost in hardware, and which require no more complexity or time for use with write-back cache memory systems than they do with write-through cache memory systems, contrary to statements made by previous researchers.
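The checkpoint-repair idea can be sketched with a toy register file: save a copy of architectural state at each predicted branch, discard it when the prediction turns out correct, and restore it on a misprediction or exception. This is a conceptual Python sketch, not the hardware algorithms the paper derives:

```python
import copy

class CheckpointMachine:
    """Toy register file with checkpoint/repair: restore a known
    previous state after a branch misprediction or exception."""
    def __init__(self):
        self.regs = {"r1": 0, "r2": 0}
        self._checkpoints = []            # stack of saved states

    def checkpoint(self):                 # taken at each predicted branch
        self._checkpoints.append(copy.deepcopy(self.regs))

    def write(self, reg, val):            # (possibly speculative) write
        self.regs[reg] = val

    def commit(self):                     # prediction correct: drop checkpoint
        self._checkpoints.pop()

    def repair(self):                     # mispredicted: roll state back
        self.regs = self._checkpoints.pop()

m = CheckpointMachine()
m.write("r1", 5)
m.checkpoint()            # predict branch taken
m.write("r1", 99)         # speculative work past the branch
m.repair()                # misprediction detected
print(m.regs["r1"])       # → 5
```

Real designs avoid full copies by checkpointing only rename maps and relying on the reorder buffer, but the restore-to-known-state contract is the same.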
ERIC Educational Resources Information Center
Weisman, Hannah L.; Kia-Keating, Maryam; Lippincott, Ann; Taylor, Zachary; Zheng, Jimmy
2016-01-01
Background: Researchers have emphasized the importance of integrating mental health education with academic curriculum. The focus of the current studies was "Mental Health Matters" (MHM), a mental health curriculum that is integrated with English language arts. It is taught by trained community member volunteers and aims to increase…
Provision of Information to the Research Staff.
ERIC Educational Resources Information Center
Williams, Martha E.
The Information Sciences section at Illinois Institute of Technology Research Institute (IITRI) is now operating a Computer Search Center (CSC) for handling numerous machine-readable data bases. The computer programs are generalized in the sense that they will handle any incoming data base. This is accomplished by means of a preprocessor system…
Fatigue Life Variability in Large Aluminum Forgings with Residual Stress
2011-07-01
... been conducted. A detailed finite element analysis of the forge/quench/coldwork/machine process was performed in order to predict the bulk residual stresses in a fictitious aluminum bulkhead. ... continues to develop the capability for computational simulation of the forge, quench, cold work, and machining processes. In order to handle the ...
Murano, Hirotatsu; Suzuki, Katsuhiro; Kayada, Saori; Saito, Mitsuhiko; Yuge, Naoya; Arishiro, Takuji; Watanabe, Akira; Isoi, Toshiyuki
2018-02-15
Humic substances (HS) in soil and sediments, and surface water influence the behavior of organic xenobiotics in the environment. However, our knowledge of the effects of specific HS fractions, i.e., humic acids (HAs), fulvic acids (FAs), and humin (HM), on the sorption of organic xenobiotics is limited. The neonicotinoid insecticide acetamiprid is thought to contribute to the collapse of honeybee colonies. To understand the role that soil organic matter plays in the fate of acetamiprid, interactions between acetamiprid and the above HS fractions were examined. Batch experiments were conducted using various combinations of a field soil sample and the above 3 HS fractions prepared from the same soil, and differences in isotherm values for acetamiprid sorption were investigated based on the structural differences among the HS fractions. The sorption of acetamiprid to soil minerals associated with HM (MHM) (Freundlich isotherm constant, Kf: 6.100) was reduced when HAs or FAs were added (Kf: 4.179 and 4.756, respectively). This can be attributed to hydrophobic interactions between HM and HAs or FAs in which their dissociated carboxyl and phenolic groups become oriented to face the soil solution. The amount of acetamiprid that was adsorbed to (MHM+HA) or (MHM+FA) increased when aluminum ions were added (Kf: 6.933 and 10.48, respectively), or iron ions were added (Kf: 7.303 and 11.29, respectively). Since acetamiprid has no affinity for inorganic components in soil, the formation of HS-metal complexes by cation bridging may have oriented the hydrophobic moieties in the HAs or FAs to face the soil solution and may also have resulted in the formation of dense structures, resulting in an increase in the amount of acetamiprid that becomes adsorbed to these structures. These results highlight the importance of interactions among soil components in the pedospheric diffusion of acetamiprid. Copyright © 2017 Elsevier B.V. All rights reserved.
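The reported Kf values parameterize the Freundlich isotherm q = Kf · C^(1/n), relating sorbed amount q to equilibrium concentration C. A minimal check of what they imply at unit concentration, where q equals Kf regardless of the exponent n (which the abstract does not report, so it is left as an assumed argument):

```python
def freundlich(C, Kf, n=1.0):
    """Freundlich isotherm: sorbed amount q at equilibrium concentration C.
    The exponent n is an assumption; the abstract reports only Kf."""
    return Kf * C ** (1.0 / n)

C = 1.0  # at unit concentration q == Kf for any n
print(freundlich(C, Kf=6.100))   # MHM alone
print(freundlich(C, Kf=10.48))   # MHM+FA with aluminum ions added
```

Comparing the two calls reproduces the abstract's qualitative finding: cation bridging with Al roughly doubles sorption to the FA-coated mineral fraction relative to bare MHM.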
mRM - multiscale Routing Model for Land Surface and Hydrologic Models
NASA Astrophysics Data System (ADS)
Cuntz, M.; Thober, S.; Mai, J.; Samaniego, L. E.; Gochis, D. J.; Kumar, R.
2015-12-01
Routing streamflow through a river network is a basic step within any distributed hydrologic model. It integrates the generated runoff and allows comparison with observed discharge at the outlet of a catchment. The Muskingum routing is a textbook river routing scheme that has been implemented in Earth System Models (e.g., WRF-HYDRO), stand-alone routing schemes (e.g., RAPID), and hydrologic models (e.g., the mesoscale Hydrologic Model). Most implementations suffer from a high computational demand because the spatial routing resolution is fixed to that of the elevation model irrespective of the hydrologic modeling resolution: the model parameters are scale-dependent and cannot be used at other resolutions without re-estimation. Here, we present the multiscale Routing Model (mRM), which allows for a flexible choice of the routing resolution. mRM exploits the Multiscale Parameter Regionalization (MPR) included in the open-source mesoscale Hydrologic Model (mHM, www.ufz.de/mhm), which relates model parameters to physiographic properties and allows the estimation of scale-independent model parameters. mRM is currently coupled to mHM and is presented here as stand-alone Free and Open Source Software (FOSS). The mRM source code is highly modular and provides a subroutine for internal re-use in any land surface scheme. In this work, mRM is coupled to the state-of-the-art land surface model Noah-MP. Simulation results using mRM are compared with those available in WRF-HYDRO for the Red River during the period 1990-2000. mRM makes it possible to coarsen the routing resolution from 100 m to more than 10 km without deteriorating model performance. It thereby speeds up model calculation, reducing the contribution of routing to total runtime from over 80% to less than 5% in the case of WRF-HYDRO. mRM thus makes discharge data available to land surface modeling at little extra computational cost.
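The textbook Muskingum scheme referenced above routes outflow as a weighted combination of current inflow, previous inflow, and previous outflow. A minimal sketch of one routing step (the K, X, and dt values are illustrative, not mRM parameters):

```python
def muskingum_step(I1, I2, O1, K, X, dt):
    """One Muskingum routing step: outflow O2 from inflows I1 (previous)
    and I2 (current) and previous outflow O1, with travel-time parameter K
    and storage weighting X. Standard textbook coefficients."""
    denom = K - K * X + 0.5 * dt
    C0 = (0.5 * dt - K * X) / denom
    C1 = (0.5 * dt + K * X) / denom
    C2 = (K - K * X - 0.5 * dt) / denom
    # C0 + C1 + C2 == 1, so a constant inflow is passed through unchanged
    return C0 * I2 + C1 * I1 + C2 * O1

# steady state: constant inflow of 100 yields outflow 100
print(muskingum_step(I1=100.0, I2=100.0, O1=100.0, K=12.0, X=0.2, dt=6.0))
```

Because K and X are scale-dependent, running this at a different grid resolution normally requires re-estimating them, which is exactly the problem MPR's scale-independent parameterization addresses.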
Spatially Distributed Characterization of Catchment Dynamics Using Travel-Time Distributions
NASA Astrophysics Data System (ADS)
Heße, F.; Zink, M.; Attinger, S.
2015-12-01
The description of storage and transport of both water and dissolved contaminants in catchments is very difficult due to the high heterogeneity of the subsurface properties that govern their fate. This heterogeneity, combined with generally limited knowledge about the subsurface, results in high degrees of uncertainty. As a result, stochastic methods are increasingly applied, in which the relevant processes are modeled as being random. Within these methods, quantities like the catchment travel or residence time of a water parcel are described using probability density functions (PDFs). The derivation of these PDFs is typically done using the water fluxes and states of the catchment. A successful application of such frameworks is therefore contingent on a good quantification of these fluxes and states across the different spatial scales. The objective of this study is to use travel times for the characterization of a ca. 1000 square kilometer humid catchment in Central Germany. To determine the states and fluxes, we apply the mesoscale Hydrologic Model (mHM), a spatially distributed hydrological model, to the catchment. Using detailed data on precipitation, land cover, morphology, and soil type as inputs, mHM determines fluxes like recharge and evapotranspiration and states like soil moisture as outputs. Using these data, we apply the above theoretical framework to our catchment. By virtue of the aforementioned properties of mHM, we are able to describe the storage and release of water with a high spatial resolution. This allows for a comprehensive description of the flow and transport dynamics taking place in the catchment. The spatial distribution of such dynamics is then compared with land cover and soil moisture maps as well as driving forces like precipitation and temperature to determine the most predictive factors.
In addition, we investigate how non-local data like the age distribution of discharge flows are impacted by, and therefore allow us to infer, local properties of the catchment.
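For orientation, the simplest such travel-time PDF is the exponential distribution of a well-mixed store, p(t) = (1/T) exp(-t/T), whose mean travel time T is storage divided by steady outflow. The numbers below are illustrative, not values from the study catchment:

```python
# Well-mixed store at steady state: mean travel time T = S / Q.
S = 500.0    # storage in the store (mm), assumed
Q = 1.25     # steady outflow flux (mm/day), assumed
T = S / Q    # mean travel time (days)
print(T)     # → 400.0
```

Real catchments deviate from the well-mixed assumption, which is why the framework derives the PDF from modeled, spatially distributed fluxes and states rather than assuming an exponential shape.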
NASA Astrophysics Data System (ADS)
Noh, Seong Jin; Rakovec, Oldrich; Kumar, Rohini; Samaniego, Luis
2016-04-01
There have been tremendous improvements in distributed hydrologic modeling (DHM) that have made process-based simulation with a high spatiotemporal resolution applicable on large spatial scales. Despite increasing information on the heterogeneous properties of a catchment, DHM is still subject to uncertainties inherent in model structure, parameters, and input forcing. Sequential data assimilation (DA) may facilitate improved streamflow prediction via DHM by using real-time observations to correct internal model states. In conventional DA methods such as state updating, parametric uncertainty is, however, often ignored, mainly due to practical limitations of methodology to specify modeling uncertainty with limited ensemble members. If the parametric uncertainty associated with routing and runoff components is not incorporated properly, the predictive uncertainty of DHM may be insufficient to capture the dynamics of observations, which may deteriorate predictability. Recently, a multi-scale parameter regionalization (MPR) method was proposed to make hydrologic predictions at different scales using the same set of model parameters without losing much of the model performance. The MPR method incorporated within the mesoscale hydrologic model (mHM, http://www.ufz.de/mhm) can effectively represent and control the uncertainty of high-dimensional parameters in a distributed model using global parameters. In this study, we present a global multi-parametric ensemble approach to incorporate the parametric uncertainty of DHM in DA to improve streamflow predictions. To effectively represent and control the uncertainty of high-dimensional parameters with a limited number of ensemble members, the MPR method is incorporated with DA. Lagged particle filtering is utilized to account for the response times and non-Gaussian characteristics of internal hydrologic processes.
Hindcasting experiments are implemented to evaluate the impacts of the proposed DA method on streamflow predictions in multiple European river basins having different climate and catchment characteristics. Because augmentation of parameters is not required within an assimilation window, the approach remains stable with limited ensemble members and is viable for practical use.
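The core of any particle-filter DA step, reweighting an ensemble by its closeness to an observation and then resampling, can be sketched generically. This is a plain bootstrap filter on scalar states with an assumed Gaussian observation error, not the lagged mHM/MPR implementation described above:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 500
particles = rng.normal(5.0, 2.0, n)   # prior ensemble of streamflow states
obs, obs_err = 7.0, 0.5               # observation and its std (assumed)

# likelihood weights under Gaussian observation error
w = np.exp(-0.5 * ((particles - obs) / obs_err) ** 2)
w /= w.sum()

# resample toward high-weight particles
idx = rng.choice(n, size=n, p=w)
posterior = particles[idx]
print(round(posterior.mean(), 2))     # pulled from 5.0 toward the observation
```

Non-Gaussian state distributions are handled naturally here, since the weights use the full likelihood rather than the first two moments, which is one reason particle filters suit hydrologic states better than Kalman-type updates.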
29 CFR 1919.2 - Definition of terms.
Code of Federal Regulations, 2010 CFR
2010-07-01
... horizontal plane by guys (vangs). The term includes shear legs. (2) Crane means a mechanical device, intended... integral part of the machine. A crane may be a fixed or mobile machine. (3) Bulk cargo spout means a spout... ton of 2,000 pounds when applied to shore-based material handling devices or to shore-type cranes...
Visible Machine Learning for Biomedicine.
Yu, Michael K; Ma, Jianzhu; Fisher, Jasmin; Kreisberg, Jason F; Raphael, Benjamin J; Ideker, Trey
2018-06-14
A major ambition of artificial intelligence lies in translating patient data to successful therapies. Machine learning models face particular challenges in biomedicine, however, including handling of extreme data heterogeneity and lack of mechanistic insight into predictions. Here, we argue for "visible" approaches that guide model structure with experimental biology. Copyright © 2018. Published by Elsevier Inc.
Cellular Manufacturing System with Dynamic Lot Size Material Handling
NASA Astrophysics Data System (ADS)
Khannan, M. S. A.; Maruf, A.; Wangsaputra, R.; Sutrisno, S.; Wibawa, T.
2016-02-01
Material handling plays an important role in Cellular Manufacturing System (CMS) design. In several CMS design studies, material handling was assumed to occur per piece or with a constant lot size. In real industrial practice, the lot size may change over the rolling period to cope with demand changes. This study develops a CMS model with dynamic-lot-size material handling. Integer Linear Programming is used to solve the problem. The objective function of this model minimizes the total expected cost, consisting of machinery depreciation cost, operating costs, inter-cell material handling cost, intra-cell material handling cost, machine relocation costs, setup costs, and production planning cost. The model determines the optimum cell formation and the optimum lot size. Numerical examples are elaborated in the paper to illustrate the characteristics of the model.
Rendos, Nicole K; Heredia Vargas, Héctor M; Alipio, Taislaine C; Regis, Rebeca C; Romero, Matthew A; Signorile, Joseph F
2016-07-01
Rendos, NK, Heredia Vargas, HM, Alipio, TC, Regis, RC, Romero, MA, and Signorile, JF. Differences in muscle activity during cable resistance training are influenced by variations in handle types. J Strength Cond Res 30(7): 2001-2009, 2016. There has been a recent resurgence in the use of cable machines for resistance training, allowing movements that more effectively simulate daily activities and sports-specific movements. By necessity, these devices require a machine/human interface through some type of handle. Considerable data from material handling, industrial engineering, and exercise training studies indicate that handle qualities, especially size and shape, can significantly influence force production and muscular activity, particularly of the forearm muscles, which affect the critical link in activities that require object manipulation. The purpose of this study was to examine the influence of three different handle conditions: standard handle (StandH), ball handle with the cable between the index and middle fingers (BallIM), and ball handle with the cable between the middle and ring fingers (BallMR), on activity levels (rmsEMG) of the triceps brachii lateral and long heads (TriHLat, TriHLong), brachioradialis (BR), flexor carpi radialis (FCR), extensor carpi ulnaris, and extensor digitorum (ED) during eight repetitions of standing triceps pushdown performed from 90° to 0° elbow flexion at 1.5 s per contractile stage. Handle order was randomized. No significant differences were seen for triceps or BR rmsEMG across handle conditions; however, relative patterns of activation did vary for the forearm muscles by handle condition, with more coordinated activation levels for the FCR and ED during the ball handle conditions. In addition, the rmsEMG for the ED was significantly higher during the BallIM than any other condition and during the BallMR than the StandH.
These results indicate that the use of ball handles with the cable passing between different fingers can vary the utilization patterns of selected forearm muscles and may therefore be advantageous for coaches, personal trainers, therapists, or bodybuilders for targeted training or rehabilitation of these muscles.
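The rmsEMG measure used throughout the study is the root-mean-square amplitude of the EMG signal over a window; a minimal sketch on a synthetic signal:

```python
import numpy as np

def rms(signal):
    """Root-mean-square amplitude of a signal window, the basis of rmsEMG."""
    signal = np.asarray(signal, dtype=float)
    return np.sqrt(np.mean(signal ** 2))

# a symmetric +/-1 signal has RMS amplitude 1.0
print(rms([1.0, -1.0, 1.0, -1.0]))  # → 1.0
```

RMS is preferred over the raw mean for EMG because the signal oscillates around zero; squaring before averaging captures amplitude regardless of sign.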
Upgrading the fuel-handling machine of the Novovoronezh nuclear power plant unit no. 5
NASA Astrophysics Data System (ADS)
Terekhov, D. V.; Dunaev, V. I.
2014-02-01
The calculation of safety parameters was carried out in the process of upgrading the fuel-handling machine (FHM) of the Novovoronezh nuclear power plant (NPP) unit no. 5 based on the results of quantitative safety analysis of nuclear fuel transfer operations using a dynamic logical-and-probabilistic model of the processing procedure. Specific engineering and design concepts that made it possible to reduce the probability of damaging the fuel assemblies (FAs) when performing various technological operations by an order of magnitude and introduce more flexible algorithms into the modernized FHM control system were developed. The results of pilot operation during two refueling campaigns prove that the total reactor shutdown time is lowered.
29 CFR 1926.1000 - Rollover protective structures (ROPS) for material handling equipment.
Code of Federal Regulations, 2013 CFR
2013-07-01
... two times the weight of the prime mover applied at the point of impact. (i) The design objective shall..., if any; (3) Machine make, model, or series number that the structure is designed to fit. (f) Machines... performance criteria detailed in §§ 1926.1001 and 1926.1002, as applicable or shall be designed, fabricated...
29 CFR 1926.1000 - Rollover protective structures (ROPS) for material handling equipment.
Code of Federal Regulations, 2010 CFR
2010-07-01
... two times the weight of the prime mover applied at the point of impact. (i) The design objective shall..., if any; (3) Machine make, model, or series number that the structure is designed to fit. (f) Machines... performance criteria detailed in §§ 1926.1001 and 1926.1002, as applicable or shall be designed, fabricated...
29 CFR 1926.1000 - Rollover protective structures (ROPS) for material handling equipment.
Code of Federal Regulations, 2014 CFR
2014-07-01
... two times the weight of the prime mover applied at the point of impact. (i) The design objective shall..., if any; (3) Machine make, model, or series number that the structure is designed to fit. (f) Machines... performance criteria detailed in §§ 1926.1001 and 1926.1002, as applicable or shall be designed, fabricated...
29 CFR 1926.1000 - Rollover protective structures (ROPS) for material handling equipment.
Code of Federal Regulations, 2011 CFR
2011-07-01
... two times the weight of the prime mover applied at the point of impact. (i) The design objective shall..., if any; (3) Machine make, model, or series number that the structure is designed to fit. (f) Machines... performance criteria detailed in §§ 1926.1001 and 1926.1002, as applicable or shall be designed, fabricated...
29 CFR 1926.1000 - Rollover protective structures (ROPS) for material handling equipment.
Code of Federal Regulations, 2012 CFR
2012-07-01
... two times the weight of the prime mover applied at the point of impact. (i) The design objective shall..., if any; (3) Machine make, model, or series number that the structure is designed to fit. (f) Machines... performance criteria detailed in §§ 1926.1001 and 1926.1002, as applicable or shall be designed, fabricated...
20 CFR 404.1568 - Skill requirements.
Code of Federal Regulations, 2010 CFR
2010-04-01
... are handling, feeding and offbearing (that is, placing or removing materials from machines which are automatic or operated by others), or machine tending, and a person can usually learn to do the job in 30... judgment to do simple duties that can be learned on the job in a short period of time. The job may or may...
20 CFR 404.1568 - Skill requirements.
Code of Federal Regulations, 2011 CFR
2011-04-01
... are handling, feeding and offbearing (that is, placing or removing materials from machines which are automatic or operated by others), or machine tending, and a person can usually learn to do the job in 30... judgment to do simple duties that can be learned on the job in a short period of time. The job may or may...
A Multiple Sensor Machine Vision System for Automatic Hardwood Feature Detection
D. Earl Kline; Richard W. Conners; Daniel L. Schmoldt; Philip A. Araman; Robert L. Brisbin
1993-01-01
A multiple sensor machine vision prototype is being developed to scan full-size hardwood lumber at industrial speeds for automatically detecting features such as knots, holes, wane, stain, splits, checks, and color. The prototype integrates a multiple sensor imaging system, a materials handling system, a computer system, and application software. The prototype provides...
20 CFR 404.1568 - Skill requirements.
Code of Federal Regulations, 2012 CFR
2012-04-01
... are handling, feeding and offbearing (that is, placing or removing materials from machines which are automatic or operated by others), or machine tending, and a person can usually learn to do the job in 30... judgment to do simple duties that can be learned on the job in a short period of time. The job may or may...
20 CFR 404.1568 - Skill requirements.
Code of Federal Regulations, 2014 CFR
2014-04-01
... are handling, feeding and offbearing (that is, placing or removing materials from machines which are automatic or operated by others), or machine tending, and a person can usually learn to do the job in 30... judgment to do simple duties that can be learned on the job in a short period of time. The job may or may...
20 CFR 404.1568 - Skill requirements.
Code of Federal Regulations, 2013 CFR
2013-04-01
... are handling, feeding and offbearing (that is, placing or removing materials from machines which are automatic or operated by others), or machine tending, and a person can usually learn to do the job in 30... judgment to do simple duties that can be learned on the job in a short period of time. The job may or may...
NASA Technical Reports Server (NTRS)
Byman, J. E.
1985-01-01
A brief history of aircraft production techniques is given. A flexible machining cell is then described: a computer-controlled system capable of performing 4-axis machining, part cleaning, dimensional inspection, and materials handling functions in an unmanned environment. The cell was designed to: allow processing of similar and dissimilar parts in random order without disrupting production; allow serial (one-shipset-at-a-time) manufacturing; reduce work-in-process inventory; maximize machine utilization through remote set-up; and maximize throughput while minimizing labor.
Toward a mathematical formalism of performance, task difficulty, and activation
NASA Technical Reports Server (NTRS)
Samaras, George M.
1988-01-01
The rudiments of a mathematical formalism for handling operational, physiological, and psychological concepts are developed for use by the man-machine system design engineer. The formalism provides a framework for developing a structured, systematic approach to the interface design problem, using existing mathematical tools, and simplifying the problem of telling a machine how to measure and use performance.
Automated Solar Module Assembly Line
NASA Technical Reports Server (NTRS)
Bycer, M.
1979-01-01
The gathering of information that led to the design approach of the machine is discussed, along with a summary of the findings in the areas of study and a description of each station of the machine. The machine is a cell stringing and string applique machine, flexible in design and capable of handling a variety of cells and assembling strings of cells which can then be placed in a matrix up to 4 ft x 2 ft in series or parallel arrangement. The target machine cycle is 5 seconds per cell. The machine is primarily adapted to 100 mm round cells with one or two tabs between cells. It places finished strings of up to twelve cells in a matrix of up to six such strings arranged in series or in parallel.
Analysis of German Patent Literature
2012-08-01
the entities that are pictured in the figures, as they are likely to be important parts of the patent. Chunking is not a big source of errors - most...document groups, where the documents need not be exact translations. 21 Bibliography [1] Sabine Brants, Stefanie Dipper, Silvia Hansen, Wolfgang Lezius...mit] A big fish [übersetzt] ITJ Interjektion interjection mhm, ach, tja KOUI unterordnende Konjunktion mit zu und Infinitiv subordinating conjunction
Lin, Hui; Jing, Jia; Xu, Liangfeng; Mao, Xiaoli
2017-12-01
To evaluate the influence of energy spectra, mesh sizes, and high-Z elements on dose and PVDR in Microbeam Radiation Therapy (MRT), based on a 1-D analogy mouse head model (1-D MHM) and a 3-D voxel mouse head phantom (3-D VMHP), by Monte Carlo simulation. A Microbeam-Array-Source-Model was implemented into EGSnrc/DOSXYZnrc. The microbeam size is assumed to be 25 μm, 50 μm or 75 μm in thickness and a fixed 1 mm in height, with 200 μm center-to-center (c-t-c) spacing. The influence of the energy spectra of ID17@ESRF and BMIT@CLS was investigated, and the mesh size was optimized. PVDR in the 1-D MHM and 3-D VMHP was compared with a homogeneous water phantom. The arc influence of the 3-D VMHP filled with water (3-D VMHWP) was compared with a rectangular phantom. The PVDR of the lower-energy BMIT@CLS spectrum is 2.4 times that of ID17@ESRF, owing to its lower valley dose. The optimized mesh is 5 μm for 25 μm microbeams, and 10 μm for 50 μm and 75 μm microbeams with 200 μm c-t-c spacing. A 500 μm skull layer could produce a PVDR difference of up to 62.5% for the 1-D MHM; however, this influence is limited (<5%) for more distant homogeneous media (e.g., 600 μm). The peak dose uniformity of the 3-D VMHP at the same depth could vary by up to 8% for a 1.85 mm × 1 mm irradiation field, whereas that of the 3-D VMHWP is <1%. The high-Z element enhances the dose uniformity in the target. The surface arc could affect the superficial PVDR (from 44% to 21% at 0.2 mm depth), whereas this influence is limited (<1%) at greater depths. An accurate MRT dose calculation algorithm should include the influence of 3-D heterogeneous media. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
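The PVDR reported in this abstract is, at its core, a ratio of the mean dose at in-beam (peak) positions to the mean dose at between-beam (valley) positions across the microbeam profile. A minimal sketch of that calculation, assuming illustrative index bookkeeping (not the paper's actual code):

```python
def pvdr(dose_profile, peak_idx, valley_idx):
    """Peak-to-valley dose ratio: mean dose over the peak (in-beam)
    positions divided by mean dose over the valley (between-beam)
    positions. The index lists are illustrative bookkeeping only."""
    peak = sum(dose_profile[i] for i in peak_idx) / len(peak_idx)
    valley = sum(dose_profile[i] for i in valley_idx) / len(valley_idx)
    return peak / valley
```

For a toy profile of alternating peak and valley doses [10, 1, 10, 1], the ratio is 10.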
Voltage THD Improvement for an Outer Rotor Permanent Magnet Synchronous Machine
NASA Astrophysics Data System (ADS)
de la Cruz, Javier; Ramirez, Juan M.; Leyva, Luis
2013-08-01
This article deals with the design of an outer rotor Permanent Magnet Synchronous Machine (PMSM) driven by wind turbines. The Voltage Total Harmonic Distortion (VTHD) is especially addressed through the handling of design parameters, i.e., the geometry of the stator, the polar arc percentage, the air gap, the skew angle in rotor poles, the pole length, and the core steel class. Seventy-six cases are simulated, and the results provide information useful for designing machines of this kind. The study is conducted on a 5 kW PMSM.
SIGPROC: Pulsar Signal Processing Programs
NASA Astrophysics Data System (ADS)
Lorimer, D. R.
2011-07-01
SIGPROC is a package designed to standardize the initial analysis of the many types of fast-sampled pulsar data. Currently recognized machines are the Wide Band Arecibo Pulsar Processor (WAPP), the Penn State Pulsar Machine (PSPM), the Arecibo Observatory Fourier Transform Machine (AOFTM), the Berkeley Pulsar Processors (BPP), the Parkes/Jodrell 1-bit filterbanks (SCAMP) and the filterbank at the Ooty radio telescope (OOTY). The SIGPROC tools should help users look at their data quickly, without the need to write (yet) another routine to read data or worry about big/little endian compatibility (byte swapping is handled automatically).
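The automatic big/little-endian handling mentioned above is typically done by testing a known header value in both byte orders before reading the payload. A hedged sketch of that idea, assuming a hypothetical 32-bit magic number (SIGPROC's real header layout differs):

```python
import struct

# Hypothetical magic number for illustration only; not SIGPROC's format.
MAGIC = 0x64657631

def read_samples(path, count):
    """Read `count` 32-bit integer samples, auto-detecting byte order
    by checking the header magic in both orders."""
    with open(path, "rb") as f:
        header = f.read(4)
        # If the magic matches in little-endian, read natively;
        # otherwise assume big-endian and byte-swap on read.
        order = "<" if struct.unpack("<I", header)[0] == MAGIC else ">"
        payload = f.read(4 * count)
        return list(struct.unpack(f"{order}{count}i", payload))
```

This frees the user from caring which machine wrote the file, which is exactly the convenience the abstract describes.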
Production planning, production systems for flexible automation
NASA Astrophysics Data System (ADS)
Spur, G.; Mertins, K.
1982-09-01
Trends in flexible manufacturing system (FMS) applications are reviewed. Machining systems contain machines which complement each other and can replace each other. Computer-controlled storage systems are widespread, with central storage capacity ranging from 20 pallet spaces to 200 magazine spaces. The handling function is fulfilled by pallet chargers in over 75% of FMSs. The degree of automation in data systems varies considerably. No trends are noted for transport systems.
Agile Machining and Inspection Non-Nuclear Report (NNR) Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lazarus, Lloyd
This report is a high-level summary of the eight major projects funded by the Agile Machining and Inspection Non-Nuclear Readiness (NNR) project (FY06.0422.3.04.R1). The largest project of the group is the Rapid Response project, whose six major sub-categories are summarized. This project focused on the operations of the machining departments that will comprise Special Applications Machining (SAM) in the Kansas City Responsive Infrastructure Manufacturing & Sourcing (KCRIMS) project. The project was aimed at upgrading older machine tools, developing new inspection tools, eliminating Classified Removable Electronic Media (CREM) in the handling of classified Numerical Control (NC) programs by installing the CRONOS network, and developing methods to automatically load Coordinate-Measuring Machine (CMM) inspection data into bomb books and product score cards. Finally, the project personnel leaned operations of some of the machine tool cells, and now have the model to continue this activity.
Research on intelligent machine self-perception method based on LSTM
NASA Astrophysics Data System (ADS)
Wang, Qiang; Cheng, Tao
2018-05-01
In this paper, we exploit the advantages of LSTM in feature extraction and in processing high-dimensional, complex nonlinear data, and apply it to the autonomous perception of intelligent machines. Compared with a traditional multi-layer neural network, this model has memory and can handle time-series information of any length. Since the multi-physical-domain signals of processing machines have a temporal ordering, and there is a contextual relationship between successive states, using this deep learning method to realize the self-perception of intelligent processing machines offers strong versatility and adaptability. The experimental results show that the proposed method can markedly improve sensing accuracy under various working conditions of the intelligent machine, and also show that the algorithm can well support an intelligent processing machine in realizing self-perception.
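The memory property the abstract relies on comes from the LSTM's gated cell update, which decides what to keep, forget, and emit at each time step. A minimal single-step LSTM cell in NumPy, in the generic textbook formulation (not the paper's specific model or weights):

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step (generic textbook form).
    W: (4H, D) input weights, U: (4H, H) recurrent weights, b: (4H,) bias.
    Returns the new hidden state h and cell state c."""
    H = h.shape[0]
    z = W @ x + U @ h + b                 # pre-activations for all four gates
    sig = lambda v: 1.0 / (1.0 + np.exp(-v))
    i = sig(z[:H])                        # input gate
    f = sig(z[H:2 * H])                   # forget gate
    o = sig(z[2 * H:3 * H])               # output gate
    g = np.tanh(z[3 * H:])                # candidate cell update
    c_new = f * c + i * g                 # gated memory update
    h_new = o * np.tanh(c_new)            # new hidden state
    return h_new, c_new
```

Iterating this step over a signal of any length is what lets the model carry context between successive machine states.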
Flight simulator for hypersonic vehicle and a study of NASP handling qualities
NASA Technical Reports Server (NTRS)
Ntuen, Celestine A.; Park, Eui H.; Deeb, Joseph M.; Kim, Jung H.
1992-01-01
The research goal of the Human-Machine Systems Engineering Group was to study the existing handling quality studies in aircraft with sonic to supersonic speeds and power in order to understand information requirements needed for a hypersonic vehicle flight simulator. This goal falls within the NASA task statements: (1) develop flight simulator for hypersonic vehicle; (2) study NASP handling qualities; and (3) study effects of flexibility on handling qualities and on control system performance. Following the above statement of work, the group has developed three research strategies. These are: (1) to study existing handling quality studies and the associated aircraft and develop flight simulation data characterization; (2) to develop a profile for flight simulation data acquisition based on objective statement no. 1 above; and (3) to develop a simulator and an embedded expert system platform which can be used in handling quality experiments for hypersonic aircraft/flight simulation training.
Calibration of a distributed hydrologic model for six European catchments using remote sensing data
NASA Astrophysics Data System (ADS)
Stisen, S.; Demirel, M. C.; Mendiguren González, G.; Kumar, R.; Rakovec, O.; Samaniego, L. E.
2017-12-01
While observed streamflow has been the single reference for most conventional hydrologic model calibration exercises, the availability of spatially distributed remote sensing observations provides new possibilities for multi-variable calibration assessing both spatial and temporal variability of different hydrologic processes. In this study, we first identify the key transfer parameters of the mesoscale Hydrologic Model (mHM) controlling both the discharge and the spatial distribution of actual evapotranspiration (AET) across six central European catchments (Elbe, Main, Meuse, Moselle, Neckar and Vienne). These catchments are selected for their limited topographical and climatic variability, which enables evaluation of the effect of spatial parameterization on the simulated evapotranspiration patterns. We develop a European-scale remote-sensing-based actual evapotranspiration dataset at a 1 km grid scale, driven primarily by land surface temperature observations from MODIS using the TSEB approach. Using the observed AET maps, we analyze the potential benefits of incorporating spatial patterns from MODIS data to calibrate the mHM model. This model allows calibrating one basin at a time or all basins together using its unique structure and multi-parameter regionalization approach. Results will indicate any tradeoffs between spatial pattern and discharge simulation during model calibration and through validation against independent internal discharge locations. Moreover, the added value for internal water balances will be analyzed.
NASA Astrophysics Data System (ADS)
Boger, R. A.; Low, R.; Paull, S.; Anyamba, A.; Soebiyanto, R. P.
2017-12-01
Temperature and precipitation are important drivers of mosquito population dynamics, and a growing set of models have been proposed to characterize these relationships. Validation of these models, and development of broader theories across mosquito species and regions could nonetheless be improved by comparing observations from a global dataset of mosquito larvae with satellite-based measurements of meteorological variables. Citizen science data can be particularly useful for two such aspects of research into the meteorological drivers of mosquito populations: i) Broad-scale validation of mosquito distribution models and ii) Generation of quantitative hypotheses regarding changes to mosquito abundance and phenology across scales. The recently released GLOBE Observer Mosquito Habitat Mapper (GO-MHM) app engages citizen scientists in identifying vector taxa, mapping breeding sites and decommissioning non-natural habitats, and provides a potentially useful new tool for validating mosquito ubiquity projections based on the analysis of remotely sensed environmental data. Our early work with GO-MHM data focuses on two objectives: validating citizen science reports of Aedes aegypti distribution through comparison with accepted scientific data sources, and exploring the relationship between extreme temperature and precipitation events and subsequent observations of mosquito larvae. Ultimately the goal is to develop testable hypotheses regarding the shape and character of this relationship between mosquito species and regions.
Topics in programmable automation. [for materials handling, inspection, and assembly
NASA Technical Reports Server (NTRS)
Rosen, C. A.
1975-01-01
Topics explored in the development of integrated programmable automation systems include: numerically controlled and computer controlled machining; machine intelligence and the emulation of human-like capabilities; large scale semiconductor integration technology applications; and sensor technology for asynchronous local computation without burdening the executive minicomputer which controls the whole system. The role and development of training aids, and the potential application of these aids to augmented teleoperator systems are discussed.
Flexible Manufacturing System Handbook. Volume IV. Appendices
1983-02-01
and Acceptance Test(s)" on page 26 of this Proposal Request. 1.1.10 Options 1. Centralized Automatic Chip/Coolant Recovery System a. Scope The...viable, from manually moving the pallet/fixture/part combinations from machine to machine to fully automatic, unmanned material handling systems, such...English. Where dimensions are shown in metric units, the English system (inch) equivalent will also be shown. Hydraulic, pneumatic, and electrical
On Why It Is Impossible to Prove that the BDX930 Dispatcher Implements a Time-sharing System
NASA Technical Reports Server (NTRS)
Boyer, R. S.; Moore, J. S.
1983-01-01
The Software Implemented Fault Tolerance (SIFT) system is written in PASCAL except for about a page of machine code. The SIFT system implements a small time-sharing system in which PASCAL programs for separate application tasks are executed according to a schedule with real-time constraints. The PASCAL language has no provision for handling the notion of an interrupt, such as the BDX930 clock interrupt. The PASCAL language also lacks the notion of running a PASCAL subroutine for a given amount of time, suspending it, saving away the suspension, and later activating the suspension. Machine code was used to overcome these inadequacies of PASCAL. Code which handles clock interrupts and suspends processes is called a dispatcher. The time-sharing/virtual-machine idea is completely destroyed by the reconfiguration task. After termination of the reconfiguration task, the tasks run by the dispatcher have no relation to those run before reconfiguration. It is impossible to view the dispatcher as a time-sharing system implementing virtual BDX930s running concurrently when one process can wipe out the others.
Action languages: Dimensions, effects
NASA Technical Reports Server (NTRS)
Hayes, Daniel G.; Streeter, Gordon
1989-01-01
Dimensions of action languages are discussed for communication between humans and machines, and the message handling capabilities of object oriented programming systems are examined. Design of action languages is seen to be very contextual. Economical and effective design will depend on features of situations, the tasks intended to be accomplished, and the nature of the devices themselves. Current object oriented systems turn out to have fairly simple and straightforward message handling facilities, which in themselves do little to buffer action or even in some cases to handle competing messages. Even so, it is possible to program a certain amount of discretion about how they react to messages. Such thoughtfulness and perhaps relative autonomy of program modules seems prerequisite to future systems to handle complex interactions in changing situations.
Design of robotic cells based on relative handling modules with use of SolidWorks system
NASA Astrophysics Data System (ADS)
Gaponenko, E. V.; Anciferov, S. I.
2018-05-01
The article presents a diagrammed engineering solution for a robotic cell with six degrees of freedom for machining complex parts, consisting of a base with a tool installation module and a part machining module made as parallel-structure mechanisms. The output links of the part machining module and the tool installation module can each move along the X-Y-Z coordinate axes. A 3D model of the complex is designed in the SolidWorks system; it will be used further for carrying out engineering calculations and mathematical analysis and for obtaining all required documentation.
A Data Envelopment Analysis Model for Selecting Material Handling System Designs
NASA Astrophysics Data System (ADS)
Liu, Fuh-Hwa Franklin; Kuo, Wan-Ting
The material handling system under design is an unmanned job shop with an automated guided vehicle that transports loads among the processing machines. The engineering task is to select the design alternatives that are combinations of four design factors: the ratio of production time to transportation time, the mean job arrival rate to the system, the input/output buffer capacities at each processing machine, and the vehicle control strategies. Each of the design alternatives is simulated to collect the upper and lower bounds of five performance indices. We develop a Data Envelopment Analysis (DEA) model to assess the 180 designs with imprecise data on the five indices. A three-way factorial experiment analysis of the assessment results indicates that the buffer capacity, and the interaction of job arrival rate and buffer capacity, affect performance significantly.
Nurse Staffing at Methodist Heathcare Ministries: Factors Influencing Recruiting and Retention
2007-04-01
a provider's personality strongly influences their satisfaction with the working environment. The study recognized some critical factors for working ... with us today?" (Smith, 2005, p. 56). After the organization has addressed its culture, it can then begin transforming work ...most important to the employees working at MHM were identified. These factors can be used, in conjunction with the literature, to tailor applicable
1975-06-30
assigned small integers called Job File Numbers or JFNs with which future references are made. Since the name of the device on which a file...what could reasonably be called the "Datacomputer proper", and are the primary output of the Datacomputer project. They are conceptually and func...sections, each of which is broken into 512-word blocks called pages. When the Request Handler
The JPL Library Information Retrieval System
ERIC Educational Resources Information Center
Walsh, Josephine
1975-01-01
The development, capabilities, and products of the computer-based retrieval system of the Jet Propulsion Laboratory Library are described. The system handles books and documents, produces a book catalog, and provides a machine search capability. (Author)
Ljungquist, Bengt; Petersson, Per; Johansson, Anders J; Schouenborg, Jens; Garwicz, Martin
2018-04-01
Recent neuroscientific and technical developments of brain machine interfaces have put increasing demands on neuroinformatic databases and data handling software, especially when managing data in real time from large numbers of neurons. Extrapolating these developments we here set out to construct a scalable software architecture that would enable near-future massive parallel recording, organization and analysis of neurophysiological data on a standard computer. To this end we combined, for the first time in the present context, bit-encoding of spike data with a specific communication format for real time transfer and storage of neuronal data, synchronized by a common time base across all unit sources. We demonstrate that our architecture can simultaneously handle data from more than one million neurons and provide, in real time (< 25 ms), feedback based on analysis of previously recorded data. In addition to managing recordings from very large numbers of neurons in real time, it also has the capacity to handle the extensive periods of recording time necessary in certain scientific and clinical applications. Furthermore, the bit-encoding proposed has the additional advantage of allowing an extremely fast analysis of spatiotemporal spike patterns in a large number of neurons. Thus, we conclude that this architecture is well suited to support current and near-future Brain Machine Interface requirements.
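The bit-encoding of spike data described above can be illustrated by packing one time bin's spike events into an integer bitmask, one bit per neuron. This is a deliberately simplified sketch of the idea only; the paper's actual communication format and time-base synchronization are richer:

```python
def encode_spikes(fired, n_neurons):
    """Pack the neuron indices that fired in one time bin into an
    integer bitmask: bit k is set iff neuron k fired."""
    mask = 0
    for k in fired:
        mask |= 1 << k
    return mask

def decode_spikes(mask, n_neurons):
    """Recover the sorted list of neuron indices from a bitmask."""
    return [k for k in range(n_neurons) if (mask >> k) & 1]
```

One appeal of this representation, echoed in the abstract, is that spatiotemporal pattern queries reduce to very fast bitwise operations (AND, OR, popcount) over the masks.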
Climatic and landscape controls on travel time distributions across Europe
NASA Astrophysics Data System (ADS)
Kumar, Rohini; Rao, Suresh; Hesse, Falk; Borchardt, Dietrich; Fleckenstein, Jan; Jawitz, James; Musolff, Andreas; Rakovec, Oldrich; Samaniego, Luis; Yang, Soohyun; Zink, Matthias; Attinger, Sabine
2017-04-01
Travel time distributions (TTDs) are fundamental descriptors of the storage, mixing and release of water and solutes in a river basin. Identifying the relative importance (and controls) of climate and landscape attributes on TTDs is fundamental to improving our understanding of the underlying mechanisms controlling the spatial heterogeneity of TTDs and their moments (e.g., mean TT). Studies aimed at elucidating such controls have focused either on theoretical developments to gain (physical) insights using mostly synthetic datasets or on empirical relationships using limited datasets from experimental sites. A study painting a general picture of emerging controls at a continental scale is still lacking. In this study, we make use of spatially resolved hydrologic fluxes and states generated through an observationally driven, mesoscale Hydrologic Model (mHM; www.ufz.de/mhm) to comprehensively characterize the dominant controls of climate and landscape attributes on TTDs in the vadose zone across the entire European region. mHM uses a novel Multiscale Parameter Regionalization (MPR; Samaniego et al., 2010 and Kumar et al., 2013) scheme that encapsulates fine-scale landscape attributes (e.g., topography, soil, and vegetation characteristics) to account for sub-grid variability in model parameterization. The model was established at 25 km spatial resolution to simulate daily gridded fluxes and states over Europe for the period 1955-2015. We utilized recent developments in TTD theory (e.g., Botter et al., 2010, Harman et al., 2011) to characterize the stationary and non-stationary behavior of water particles transported through the vadose zone at every grid cell. Our results suggest a complex set of interactions between climate and landscape properties controlling the spatial heterogeneity of the mean travel time (TT).
The spatial variability in the mean TT across the Pan-EU generally follows the climatic gradient, with lower values in humid regions and higher values in semi-arid or drier regions. The results signify that landscape attributes such as plant-available soil-water-storage capacity, when expressed as a dimensionless number that also includes climate attributes such as average rain depth and aridity index, form a potentially useful predictor for explaining the spatial heterogeneity of mean TTs. Finally, the study also highlights the time-varying behavior of TTDs and discusses the seasonal variation in mean TTs across Europe.
Budhathoki, Shyam Sundar; Bhattachan, Meika; Castro-Sánchez, Enrique; Sagtani, Reshu Agrawal; Rayamajhi, Rajan Bikram; Rai, Pramila; Sharma, Gaurav
2018-02-02
Menstrual hygiene management (MHM) is an essential aspect of hygiene for women and adolescent girls between menarche and menopause. Despite being an important issue concerning women and girls in the menstruating age group, MHM is often overlooked in post-disaster responses, and there is limited evidence on menstrual hygiene management in humanitarian settings. This study aims to describe the experiences and perceptions of women and adolescent girls regarding menstrual hygiene management in post-earthquake Nepal. A mixed-methods study was carried out among earthquake-affected women and adolescent girls in three villages of Sindhupalchowk district of Nepal. Data were collected using a semi-structured questionnaire that captured respondents' experiences and perceptions of menstrual hygiene management in the aftermath of the Nepal earthquake. Quantitative data were triangulated with in-depth interviews regarding respondents' personal experiences of menstrual hygiene management. Menstrual hygiene was rated as the sixth highest overall need and was perceived as an immediate need by 18.8% of the respondents. Some 42.8% of the women and girls menstruated within the first week after the earthquake. Reusable sanitary cloths were used by about 66.7% of the respondents before the earthquake and remained a popular method (76.1%) afterwards. None of the respondents reported receiving menstrual absorbents as relief materials in the first month following the earthquake. Disposable pads were preferred by 77.8% of respondents, as they were perceived to be clean and convenient to use. Most respondents (73.5%) felt that reusable sanitary pads were a sustainable choice. Women who were aged 15-34 years (OR = 3.14; CI = 1.07-9.20), did not go to school (OR = 9.68; CI = 2.16-43.33), were married (OR = 2.99; CI = 1.22-7.31), or had previously used reusable sanitary cloth (OR = 5.82; CI = 2.33-14.55) were more likely to use reusable sanitary cloth.
In the immediate aftermath of the earthquake, women and girls depended entirely on locally available resources as absorbents during menstruation, and immediate relief activities by humanitarian agencies lacked an MHM component. Understanding previous practice and using local resources, such as the reusable sanitary cloth, is a way to address menstrual hygiene needs in post-disaster situations in Nepal.
NASA Technical Reports Server (NTRS)
Ray, R. B.
1994-01-01
OPMILL is a computer operating system for a Kearney and Trecker milling machine that provides a fast and easy way to program machine part manufacture with an IBM compatible PC. The program gives the machinist an "equation plotter" feature which plots any set of equations that define axis moves (up to three axes simultaneously) and converts those equations to a machine milling program that will move a cutter along a defined path. Other supported functions include: drill with peck, bolt circle, tap, mill arc, quarter circle, circle, circle 2 pass, frame, frame 2 pass, rotary frame, pocket, loop and repeat, and copy blocks. The system includes a tool manager that can handle up to 25 tools and automatically adjusts tool length for each tool. It will display all tool information and stop the milling machine at the appropriate time. Information for the program is entered via a series of menus and compiled to the Kearney and Trecker format. The program can then be loaded into the milling machine, the tool path graphically displayed, and tool change information or the program in Kearney and Trecker format viewed. The program has a complete file handling utility that allows the user to load the program into memory from the hard disk, save the program to the disk with comments, view directories, merge a program on the disk with one in memory, save a portion of a program in memory, and change directories. OPMILL was developed on an IBM PS/2 running DOS 3.3 with 1 MB of RAM. OPMILL was written for an IBM PC or compatible 8088 or 80286 machine connected via an RS-232 port to a Kearney and Trecker Data Mill 700/C Control milling machine. It requires a "D:" drive (fixed-disk or virtual), a browse or text display utility, and an EGA or better display. Users wishing to modify and recompile the source code will also need Turbo BASIC, Turbo C, and Crescent Software's QuickPak for Turbo BASIC. IBM PC and IBM PS/2 are registered trademarks of International Business Machines. 
Turbo BASIC and Turbo C are trademarks of Borland International.
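The "equation plotter" feature described above converts user-supplied axis equations into a sequence of discrete machine moves by sampling a parameter range. A hypothetical sketch of that core step (the function name and interface are illustrative; OPMILL's actual output is a Kearney and Trecker format program):

```python
def equation_toolpath(fx, fy, t0, t1, steps):
    """Sample parametric axis equations into discrete linear moves.
    fx, fy: callables giving the X and Y axis positions for parameter t.
    Positions are rounded to 4 decimals as a stand-in for machine
    resolution (an assumption, not OPMILL's documented behavior)."""
    points = []
    for k in range(steps + 1):
        t = t0 + (t1 - t0) * k / steps
        points.append((round(fx(t), 4), round(fy(t), 4)))
    return points
```

For example, sampling the line x = t, y = 2t over [0, 1] in two steps yields the cutter waypoints (0, 0), (0.5, 1.0), (1.0, 2.0).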
Dai, Wujiao; Shi, Qiang; Cai, Changsheng
2017-01-01
The carrier phase multipath effect is one of the most significant error sources in the precise positioning of BeiDou Navigation Satellite System (BDS). We analyzed the characteristics of BDS multipath, and found the multipath errors of geostationary earth orbit (GEO) satellite signals are systematic, whereas those of inclined geosynchronous orbit (IGSO) or medium earth orbit (MEO) satellites are both systematic and random. The modified multipath mitigation methods, including sidereal filtering algorithm and multipath hemispherical map (MHM) model, were used to improve BDS dynamic deformation monitoring. The results indicate that the sidereal filtering methods can reduce the root mean square (RMS) of positioning errors in the east, north and vertical coordinate directions by 15%, 37%, 25% and 18%, 51%, 27% in the coordinate and observation domains, respectively. By contrast, the MHM method can reduce the RMS by 22%, 52% and 27% on average. In addition, the BDS multipath errors in static baseline solutions are a few centimeters in multipath-rich environments, which is different from that of Global Positioning System (GPS) multipath. Therefore, we add a parameter representing the GEO multipath error in observation equation to the adjustment model to improve the precision of BDS static baseline solutions. And the results show that the modified model can achieve an average precision improvement of 82%, 54% and 68% in the east, north and up coordinate directions, respectively. PMID:28387744
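A multipath hemispherical map of the kind used in this study averages carrier-phase residuals in azimuth/elevation grid cells and subtracts the cell mean from new observations of the same satellite geometry. A simplified sketch, with illustrative bin sizes and data layout (not the authors' implementation):

```python
from collections import defaultdict

def build_mhm(observations, az_bin=1.0, el_bin=1.0):
    """Build a multipath hemispherical map: mean carrier-phase residual
    per azimuth/elevation grid cell. `observations` is an iterable of
    (azimuth_deg, elevation_deg, residual) tuples."""
    sums = defaultdict(lambda: [0.0, 0])
    for az, el, res in observations:
        key = (int(az // az_bin), int(el // el_bin))
        sums[key][0] += res
        sums[key][1] += 1
    return {k: s / n for k, (s, n) in sums.items()}

def correct(mhm, az, el, res, az_bin=1.0, el_bin=1.0):
    """Subtract the map value for this cell from a new residual;
    cells with no history are left uncorrected."""
    key = (int(az // az_bin), int(el // el_bin))
    return res - mhm.get(key, 0.0)
```

This exploits the geometry-repeating character of multipath: GEO satellites in particular revisit nearly the same azimuth/elevation cells, so the stored cell means capture the systematic error.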
The dynamic forces and moments required in handling tree-length logs.
John A. Sturos
1971-01-01
Realistic dynamic loading requirements for tree- or log-harvesting machines were determined. The study showed that dynamic forces and moments four times as great as those required statically can occur in the field.
Identification of the mechanical properties of bicycle tyres for modelling of bicycle dynamics
NASA Astrophysics Data System (ADS)
Doria, Alberto; Tognazzo, Mauro; Cusimano, Gianmaria; Bulsink, Vera; Cooke, Adrian; Koopman, Bart
2013-03-01
Advanced simulation of the stability and handling properties of bicycles requires detailed road-tyre contact models. In order to develop these models, in this study, four bicycle tyres are tested by means of a rotating disc machine with the aim of measuring the components of tyre forces and torques that influence the safety and handling of bicycles. The effect of inflation pressure and tyre load is analysed. The measured properties of bicycle tyres are compared with those of motorcycle tyres.
Vision-Based People Detection System for Heavy Machine Applications
Fremont, Vincent; Bui, Manh Tuan; Boukerroui, Djamal; Letort, Pierrick
2016-01-01
This paper presents a vision-based people detection system for improving safety in heavy machines. We propose a perception system composed of a monocular fisheye camera and a LiDAR. Fisheye cameras have the advantage of a wide field-of-view, but the strong distortions that they create must be handled at the detection stage. Since people detection in fisheye images has not been well studied, we focus on investigating and quantifying the impact that strong radial distortions have on the appearance of people, and we propose approaches for handling this specificity, adapted from state-of-the-art people detection approaches. These adaptive approaches nevertheless have the drawback of high computational cost and complexity. Consequently, we also present a framework for harnessing the LiDAR modality in order to enhance the detection algorithm for different camera positions. A sequential LiDAR-based fusion architecture is used, which addresses directly the problem of reducing false detections and computational cost in an exclusively vision-based system. A heavy machine dataset was built, and different experiments were carried out to evaluate the performance of the system. The results are promising, in terms of both processing speed and performance. PMID:26805838
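The sequential LiDAR-based gating that suppresses false camera detections can be sketched as follows; the coordinates, threshold, and detection tuples are invented for illustration and do not reflect the paper's actual fusion architecture in detail.

```python
def lidar_gate(detections, lidar_points, max_dist=1.0):
    """Keep only camera detections corroborated by at least one LiDAR
    return falling inside the detection's ground-plane gate, so the
    costly vision pipeline is trusted only where geometry agrees."""
    kept = []
    for (x, y, score) in detections:
        if any(abs(px - x) < max_dist and abs(py - y) < max_dist
               for (px, py) in lidar_points):
            kept.append((x, y, score))
    return kept

# Hypothetical detections (ground-plane x, y in metres, confidence score).
cam = [(2.0, 3.0, 0.9), (8.0, 1.0, 0.4), (5.0, 5.0, 0.7)]
lidar = [(2.1, 3.2), (5.3, 4.8)]
print(lidar_gate(cam, lidar))  # the (8.0, 1.0) false alarm is dropped
```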
NASA Astrophysics Data System (ADS)
Aalaei, Amin; Davoudpour, Hamid
2012-11-01
This article presents a new mathematical model that integrates dynamic cellular manufacturing into a supply chain system, with extensive coverage of important manufacturing features: multiple plant locations, multi-market allocation, multi-period planning horizons with demand and part-mix variation, and machine capacity. The main constraints are satisfaction of market demand in each period, machine availability, machine time capacity, worker assignment, available worker time, production volume for each plant, and the amounts allocated to each market. The proposed model aims to minimize holding and outsourcing costs, inter-cell material handling cost, external transportation cost, machine procurement, maintenance, and overhead costs, setup cost, reconfiguration costs of machine installation and removal, and worker hiring, firing, and salary costs. To demonstrate the potential benefits of such a design, an example is presented using the proposed model.
Predicting the dissolution kinetics of silicate glasses using machine learning
NASA Astrophysics Data System (ADS)
Anoop Krishnan, N. M.; Mangalathu, Sujith; Smedskjaer, Morten M.; Tandia, Adama; Burton, Henry; Bauchy, Mathieu
2018-05-01
Predicting the dissolution rates of silicate glasses in aqueous conditions is a complex task as the underlying mechanism(s) remain poorly understood and the dissolution kinetics can depend on a large number of intrinsic and extrinsic factors. Here, we assess the potential of data-driven models based on machine learning to predict the dissolution rates of various aluminosilicate glasses exposed to a wide range of solution pH values, from acidic to caustic conditions. Four classes of machine learning methods are investigated, namely, linear regression, support vector machine regression, random forest, and artificial neural network. We observe that, although linear methods all fail to describe the dissolution kinetics, the artificial neural network approach offers excellent predictions, thanks to its inherent ability to handle non-linear data. Overall, we suggest that a more extensive use of machine learning approaches could significantly accelerate the design of novel glasses with tailored properties.
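The contrast between linear and non-linear learners can be illustrated with a toy stand-in: a straight-line fit versus a nearest-neighbour predictor on a synthetic V-shaped pH-rate curve. The study itself used support vector machines, random forests, and neural networks; the data and the V shape here are invented purely to show why a linear model fails on non-monotonic kinetics.

```python
# Synthetic V-shaped curve (log dissolution rate vs pH), a shape
# no straight line can capture; values are illustrative only.
train = [(ph, abs(ph - 7.0) - 9.0) for ph in range(1, 14)]

def fit_linear(data):
    """Ordinary least-squares line through (x, y) pairs."""
    n = len(data)
    sx = sum(x for x, _ in data); sy = sum(y for _, y in data)
    sxx = sum(x * x for x, _ in data); sxy = sum(x * y for x, y in data)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return lambda x: a + b * x

def nearest(data):
    """1-nearest-neighbour regressor: predict the y of the closest x."""
    return lambda x: min(data, key=lambda p: abs(p[0] - x))[1]

lin, knn = fit_linear(train), nearest(train)
test_pts = [(2.5, abs(2.5 - 7.0) - 9.0), (10.5, abs(10.5 - 7.0) - 9.0)]
err = lambda f: sum(abs(f(x) - y) for x, y in test_pts)
print(err(lin) > err(knn))  # the non-linear learner wins
```

Because the training data are symmetric about pH 7, the fitted line is nearly flat and misses both arms of the V, while the nearest-neighbour stand-in tracks them.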
Trust metrics in information fusion
NASA Astrophysics Data System (ADS)
Blasch, Erik
2014-05-01
Trust is an important concept for machine intelligence and is not consistent across many applications. In this paper, we seek to understand trust from a variety of factors: humans, sensors, communications, intelligence processing algorithms and human-machine displays of information. In modeling the various aspects of trust, we provide an example from machine intelligence that supports the various attributes of measuring trust such as sensor accuracy, communication timeliness, machine processing confidence, and display throughput to convey the various attributes that support user acceptance of machine intelligence results. The example used is fusing video and text whereby an analyst needs trust information in the identified imagery track. We use the proportional conflict redistribution rule as an information fusion technique that handles conflicting data from trusted and mistrusted sources. The discussion of the many forms of trust explored in the paper seeks to provide a systems-level design perspective for information fusion trust quantification.
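A minimal sketch of the proportional conflict redistribution idea follows, using the PCR5 variant restricted to two sources and singleton hypotheses; the mass values are illustrative and not taken from the paper.

```python
def pcr5(m1, m2):
    """PCR5 combination of two basic belief assignments over exclusive
    singleton hypotheses: the conflicting mass m1[X]*m2[Y] (X != Y) is
    redistributed back to X and Y in proportion to the masses that
    produced the conflict."""
    out = {h: m1[h] * m2[h] for h in m1}
    for x in m1:
        for y in m2:
            if x != y:
                c = m1[x] * m2[y]              # partial conflict
                if m1[x] + m2[y] > 0:
                    out[x] += m1[x] * c / (m1[x] + m2[y])
                    out[y] += m2[y] * c / (m1[x] + m2[y])
    return out

# A video source and a conflicting text source (illustrative masses).
video = {"target": 0.8, "decoy": 0.2}
text  = {"target": 0.3, "decoy": 0.7}
fused = pcr5(video, text)
print(fused, round(sum(fused.values()), 6))
```

Unlike Dempster's rule, no conflicting mass is discarded by normalization: the fused masses still sum to one, which is what makes the rule robust when a mistrusted source disagrees sharply with a trusted one.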
Grinding Parts For Automatic Welding
NASA Technical Reports Server (NTRS)
Burley, Richard K.; Hoult, William S.
1989-01-01
Rollers guide grinding tool along prospective welding path. Skatelike fixture holds rotary grinder or file for machining large-diameter rings or ring segments in preparation for welding. Operator grasps handles to push rolling fixture along part. Rollers maintain precise dimensional relationship so grinding wheel cuts precise depth. Fixture-mounted grinder machines surface to quality sufficient for automatic welding; manual welding with attendant variations and distortion not necessary. Developed to enable automatic welding of parts, manual welding of which resulted in weld bead permeated with microscopic fissures.
Summary of vulnerability related technologies based on machine learning
NASA Astrophysics Data System (ADS)
Zhao, Lei; Chen, Zhihao; Jia, Qiong
2018-04-01
As the scale of information systems increases by orders of magnitude, the complexity of system software grows correspondingly. Vulnerability interactions from the design, development, and deployment stages through to implementation greatly increase the risk of the entire information system being attacked successfully. Considering the limitations and lags of the existing mainstream security vulnerability detection techniques, this paper summarizes the development and current status of vulnerability-related technologies based on machine learning methods, which are suited to dealing with massive, irregular data and to handling security vulnerabilities.
2007-11-01
TNO report TNO-DV 2007 A452 (in Dutch). Authors include ir. P.L.H. Cleophas, drs. M.H.M. Delmee, drs. S. Hoesmans, and drs. P.G.M. van Scheepstal; the remainder of the extracted text is unrecoverable.
Yamin, Samuel C; Bejan, Anca; Parker, David L; Xi, Min; Brosseau, Lisa M
2016-08-01
Metal fabrication workers are at high risk for machine-related injury. Apart from amputations, data on factors contributing to this problem are generally absent. Narrative text analysis was performed on workers' compensation claims in order to identify machine-related injuries and determine work tasks involved. Data were further evaluated on the basis of cost per claim, nature of injury, and part of body. From an initial set of 4,268 claims, 1,053 were classified as machine-related. Frequently identified tasks included machine operation (31%), workpiece handling (20%), setup/adjustment (15%), and removing chips (12%). Lacerations to finger(s), hand, or thumb comprised 38% of machine-related injuries; foreign body in the eye accounted for 20%. Amputations were relatively rare but had highest costs per claim (mean $21,059; median $11,998). Despite limitations, workers' compensation data were useful in characterizing machine-related injuries. Improving the quality of data collected by insurers would enhance occupational injury surveillance and prevention efforts. Am. J. Ind. Med. 59:656-664, 2016. © 2016 Wiley Periodicals, Inc. © 2016 Wiley Periodicals, Inc.
Redundant Asynchronous Microprocessor System
NASA Technical Reports Server (NTRS)
Meyer, G.; Johnston, J. O.; Dunn, W. R.
1985-01-01
Fault-tolerant computer structure called RAMPS (for redundant asynchronous microprocessor system) has simplicity of static redundancy but offers intermittent-fault handling ability of complex, dynamically redundant systems. New structure useful wherever several microprocessors are employed for control - in aircraft, industrial processes, robotics, and automatic machining, for example.
NASA Astrophysics Data System (ADS)
Cuntz, Matthias; Mai, Juliane; Zink, Matthias; Thober, Stephan; Kumar, Rohini; Schäfer, David; Schrön, Martin; Craven, John; Rakovec, Oldrich; Spieler, Diana; Prykhodko, Vladyslav; Dalmasso, Giovanni; Musuuza, Jude; Langenberg, Ben; Attinger, Sabine; Samaniego, Luis
2015-08-01
Environmental models tend to require increasing computational time and resources as physical process descriptions are improved or new descriptions are incorporated. Many-query applications such as sensitivity analysis or model calibration usually require a large number of model evaluations leading to high computational demand. This often limits the feasibility of rigorous analyses. Here we present a fully automated sequential screening method that selects only informative parameters for a given model output. The method requires a number of model evaluations that is approximately 10 times the number of model parameters. It was tested using the mesoscale hydrologic model mHM in three hydrologically unique European river catchments. It identified around 20 informative parameters out of 52, with different informative parameters in each catchment. The screening method was evaluated with subsequent analyses using all 52 as well as only the informative parameters. Subsequent Sobol's global sensitivity analysis led to almost identical results yet required 40% fewer model evaluations after screening. mHM was calibrated with all and with only informative parameters in the three catchments. Model performances for daily discharge were equally high in both cases with Nash-Sutcliffe efficiencies above 0.82. Calibration using only the informative parameters needed just one third of the number of model evaluations. The universality of the sequential screening method was demonstrated using several general test functions from the literature. We therefore recommend the use of the computationally inexpensive sequential screening method prior to rigorous analyses on complex environmental models.
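The core of a single screening sweep can be sketched as follows, assuming a toy model function; the actual method is sequential, uses more elaborate sampling, and needs roughly ten model evaluations per parameter rather than the one-at-a-time perturbation shown here.

```python
def screen(model, ref, delta=0.1, threshold=0.05):
    """One screening sweep: perturb each parameter in turn from a
    reference point and keep those whose effect on the output exceeds
    a threshold. Costs len(ref) + 1 model evaluations."""
    base = model(ref)
    informative = []
    for i in range(len(ref)):
        p = list(ref)
        p[i] += delta
        if abs(model(p) - base) > threshold:
            informative.append(i)
    return informative

# Toy stand-in for a hydrologic model: only parameters 0 and 2 matter.
model = lambda p: 2.0 * p[0] + 0.001 * p[1] + p[2] ** 2
print(screen(model, [1.0, 1.0, 1.0, 1.0]))  # -> [0, 2]
```

Subsequent calibration or sensitivity analysis would then run only over the indices returned, which is where the reported factor-of-three saving in model evaluations comes from.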
NASA Astrophysics Data System (ADS)
Mai, Juliane; Cuntz, Matthias; Zink, Matthias; Thober, Stephan; Kumar, Rohini; Schäfer, David; Schrön, Martin; Craven, John; Rakovec, Oldrich; Spieler, Diana; Prykhodko, Vladyslav; Dalmasso, Giovanni; Musuuza, Jude; Langenberg, Ben; Attinger, Sabine; Samaniego, Luis
2016-04-01
Environmental models tend to require increasing computational time and resources as physical process descriptions are improved or new descriptions are incorporated. Many-query applications such as sensitivity analysis or model calibration usually require a large number of model evaluations leading to high computational demand. This often limits the feasibility of rigorous analyses. Here we present a fully automated sequential screening method that selects only informative parameters for a given model output. The method requires a number of model evaluations that is approximately 10 times the number of model parameters. It was tested using the mesoscale hydrologic model mHM in three hydrologically unique European river catchments. It identified around 20 informative parameters out of 52, with different informative parameters in each catchment. The screening method was evaluated with subsequent analyses using all 52 as well as only the informative parameters. Subsequent Sobol's global sensitivity analysis led to almost identical results yet required 40% fewer model evaluations after screening. mHM was calibrated with all and with only informative parameters in the three catchments. Model performances for daily discharge were equally high in both cases with Nash-Sutcliffe efficiencies above 0.82. Calibration using only the informative parameters needed just one third of the number of model evaluations. The universality of the sequential screening method was demonstrated using several general test functions from the literature. We therefore recommend the use of the computationally inexpensive sequential screening method prior to rigorous analyses on complex environmental models.
Interaction with Machine Improvisation
NASA Astrophysics Data System (ADS)
Assayag, Gerard; Bloch, George; Cont, Arshia; Dubnov, Shlomo
We describe two multi-agent architectures for an improvisation-oriented musician-machine interaction system that learns in real time from human performers. The improvisation kernel is based on sequence modeling and statistical learning. We present two frameworks of interaction with this kernel. In the first, the stylistic interaction is guided by a human operator in front of an interactive computer environment. In the second framework, the stylistic interaction is delegated to machine intelligence and therefore, knowledge propagation and decision are taken care of by the computer alone. The first framework involves a hybrid architecture using two popular composition/performance environments, Max and OpenMusic, that are put to work and communicate together, each one handling the process at a different time/memory scale. The second framework shares the same representational schemes with the first but uses an Active Learning architecture based on collaborative, competitive and memory-based learning to handle stylistic interactions. Both systems are capable of processing real-time audio/video as well as MIDI. After discussing the general cognitive background of improvisation practices, the statistical modelling tools and the concurrent agent architecture are presented. Then, an Active Learning scheme is described and considered in terms of using different improvisation regimes for improvisation planning. Finally, we provide more details about the different system implementations and describe several performances with the system.
Determination of densified biomass mass properties using 3D laser scanning and image analysis
USDA-ARS?s Scientific Manuscript database
Biomass densification is viewed as the indispensable feedstock preprocessing operation for efficient transport, storage, material flow through machines, and handling activities. Accurate mass properties of densified biomass such as surface area, volume, and envelope density form fundamental data for...
Manipulating Tabu List to Handle Machine Breakdowns in Job Shop Scheduling Problems
NASA Astrophysics Data System (ADS)
Nababan, Erna Budhiarti; Sitompul, Opim Salim
2011-06-01
Machine breakdowns in a production schedule may occur on a random basis, making the well-known hard combinatorial problem of Job Shop Scheduling Problems (JSSP) even more complex. One of the most popular techniques used to solve such combinatorial problems is Tabu Search. In this technique, moves that are not allowed to be revisited are retained in a tabu list in order to avoid returning to solutions that have been obtained previously. In this paper, we propose an algorithm that employs a second tabu list to keep broken machines, in addition to the tabu list that keeps the moves. The period for which broken machines are kept on the list is categorized using a fuzzy membership function. Our technique is tested on the benchmark JSSP data available in the OR-Library. From the experiment, we found that our algorithm is promising in helping a decision maker face the event of machine breakdowns.
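A sketch of the two-list bookkeeping follows. This is not the authors' algorithm: the severity-to-tenure table stands in for their fuzzy membership categorization, and all machine and job names are hypothetical.

```python
from collections import deque

class DualTabu:
    """Tabu-search bookkeeping with two lists: one for recent moves and
    a second for broken machines. A broken machine's tenure would come
    from a fuzzy severity category; here severity maps directly to a
    tenure in iterations."""
    TENURE = {"light": 2, "moderate": 5, "severe": 10}

    def __init__(self, move_tenure=7):
        self.moves = deque(maxlen=move_tenure)   # classic tabu list
        self.broken = {}                          # machine -> iterations left

    def breakdown(self, machine, severity):
        self.broken[machine] = self.TENURE[severity]

    def allowed(self, move, machine):
        return move not in self.moves and machine not in self.broken

    def step(self, move):
        """Record a move and age the broken-machine list by one iteration."""
        self.moves.append(move)
        self.broken = {m: t - 1 for m, t in self.broken.items() if t > 1}

tabu = DualTabu()
tabu.breakdown("M2", "light")
print(tabu.allowed(("J1", "M2"), "M2"))  # False: M2 is down
tabu.step(("J1", "M1")); tabu.step(("J3", "M3"))
print(tabu.allowed(("J1", "M2"), "M2"))  # True: M2 recovered after 2 steps
```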
NASA Astrophysics Data System (ADS)
Nawir, Mukrimah; Amir, Amiza; Lynn, Ong Bi; Yaakob, Naimah; Badlishah Ahmad, R.
2018-05-01
The rapid growth of networked technologies exposes them to various network attacks, owing to the large-scale data they must handle and the frequency with which they exchange data over the Internet. Moreover, network anomaly detection using machine learning is hampered by the scarcity of publicly available labelled network datasets, which has led many researchers to keep using the most common dataset (KDDCup99) even though it is no longer well suited for evaluating machine learning (ML) classification algorithms. Several issues regarding the available labelled network datasets are discussed in this paper. The aim of this paper is to build a network anomaly detection system using machine learning algorithms that is efficient, effective, and fast. The findings show that the AODE algorithm performs well in terms of accuracy and processing time for binary classification on the UNSW-NB15 dataset.
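AODE is an averaged ensemble of one-dependence Bayesian classifiers; its simpler relative, naive Bayes, conveys the flavour of such binary flow classification. The feature names, flow records, and add-one smoothing below are invented for illustration and are not from the paper or the UNSW-NB15 dataset.

```python
from collections import defaultdict
import math

def train_nb(rows):
    """Count-based naive Bayes over categorical features; AODE averages
    many one-dependence variants of this same counting idea."""
    prior, cond = defaultdict(int), defaultdict(int)
    for *feats, label in rows:
        prior[label] += 1
        for i, f in enumerate(feats):
            cond[(label, i, f)] += 1
    return prior, cond

def predict(model, feats):
    prior, cond = model
    total = sum(prior.values())
    def score(label):
        s = math.log(prior[label] / total)
        for i, f in enumerate(feats):
            # add-one smoothing (crude, for the sketch)
            s += math.log((cond[(label, i, f)] + 1) / (prior[label] + 2))
        return s
    return max(prior, key=score)

# Hypothetical flow records: (protocol, flag, label).
flows = [("tcp", "syn", "attack"), ("tcp", "syn", "attack"),
         ("udp", "ack", "normal"), ("tcp", "ack", "normal")]
model = train_nb(flows)
print(predict(model, ("tcp", "syn")))  # -> attack
```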
NASA Astrophysics Data System (ADS)
Sharif, Safian; Sadiq, Ibrahim Ogu; Suhaimi, Mohd Azlan; Rahim, Shayfull Zamree Abd
2017-09-01
Pollution-related activities, in addition to the handling cost of conventional cutting fluids, have generated much concern in the metal cutting industry over time. The desire for a green machining environment, one that preserves the environment through reduction or elimination of machining-related pollution, reduces oil consumption, and protects machine operators without compromising machining efficiency, has led to a search for alternatives to conventional cutting fluids. Among the alternatives of dry machining, cryogenic cooling, high-pressure cooling, and near-dry or minimum quantity lubrication (MQL), MQL has shown remarkable performance in terms of cost, machining output, and the safety of the environment and machine operators. However, under aggressive or very high speed machining, MQL faces certain restrictions because the lubrication medium cannot perform efficiently at elevated temperature. To compensate for these shortcomings, nanoparticles of high thermal conductivity are introduced into the cutting fluids used in the MQL process. They have shown enhanced machining performance and a significant reduction of loads on the environment. The present work evaluates the application and performance of nanofluids in metal cutting through the MQL lubrication technique, highlighting their impacts and prospects as a lubrication strategy for sustainable green manufacturing. Enhanced performance of vegetable-oil-based nanofluids over mineral-oil-based nanofluids has been reported and is thus highlighted.
The JPL Library information retrieval system
NASA Technical Reports Server (NTRS)
Walsh, J.
1975-01-01
The development, capabilities, and products of the computer-based retrieval system of the Jet Propulsion Laboratory Library are described. The system handles books and documents, produces a book catalog, and provides a machine search capability. Programs and documentation are available to the public through NASA's computer software dissemination program.
Intracellular Protein Delivery for Treating Breast Cancer
2014-08-01
protein derived from chicken anemia virus (Backendorf et al., 2008). When transgenically expressed, apoptin can induce p53-independent apoptosis in a... [remainder of extracted text garbled] ...Jochemsen, A.G., Vandereb, A.J., and Noteborn, M.H.M. (1995). Apoptin, a Protein Derived from Chicken Anemia Virus, Induces P53-Independent Apoptosis in Human Osteosarcoma Cells. Cancer Res 55, 486-489.
Mars Science Laboratory CHIMRA/IC/DRT Flight Software for Sample Acquisition and Processing
NASA Technical Reports Server (NTRS)
Kim, Won S.; Leger, Chris; Carsten, Joseph; Helmick, Daniel; Kuhn, Stephen; Redick, Richard; Trujillo, Diana
2013-01-01
The design methodologies of using sequence diagrams, multi-process functional flow diagrams, and hierarchical state machines were successfully applied in designing three MSL (Mars Science Laboratory) flight software modules responsible for handling actuator motions of the CHIMRA (Collection and Handling for In Situ Martian Rock Analysis), IC (Inlet Covers), and DRT (Dust Removal Tool) mechanisms. The methodologies were essential to specify complex interactions with other modules, support concurrent foreground and background motions, and handle various fault protections. Studying task scenarios with multi-process functional flow diagrams yielded great insight to overall design perspectives. Since the three modules require three different levels of background motion support, the methodologies presented in this paper provide an excellent comparison. All three modules are fully operational in flight.
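A hierarchical state machine of the kind named above can be sketched minimally: events not handled by a substate bubble up to its superstate, so fault protection can live high in the hierarchy while motion states stay simple. This is not the MSL flight code; the states and events are hypothetical.

```python
class State:
    """Minimal hierarchical state machine node. handle() walks up the
    parent chain until some state in the hierarchy owns the event."""
    def __init__(self, name, parent=None, handlers=None):
        self.name, self.parent = name, parent
        self.handlers = handlers or {}

    def handle(self, event):
        s = self
        while s is not None:
            if event in s.handlers:
                return s.handlers[event]
            s = s.parent          # delegate to the superstate
        return None               # unhandled event

# Hypothetical mechanism-control hierarchy.
operating = State("operating", handlers={"fault": "safe_mode"})
moving = State("moving", parent=operating, handlers={"done": "idle"})
print(moving.handle("done"))   # -> idle      (handled locally)
print(moving.handle("fault"))  # -> safe_mode (inherited from parent)
```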
33 CFR 142.27 - Eye and face protection.
Code of Federal Regulations, 2010 CFR
2010-07-01
33 CFR 142.27 (Navigation and Navigable Waters): Eye and face protection. (a) Personnel engaged in or observing welding, grinding, machining, chipping, handling hazardous materials, or acetylene burning or cutting shall wear the eye and face protector...
33 CFR 142.27 - Eye and face protection.
Code of Federal Regulations, 2011 CFR
2011-07-01
33 CFR 142.27 (Navigation and Navigable Waters): Eye and face protection. (a) Personnel engaged in or observing welding, grinding, machining, chipping, handling hazardous materials, or acetylene burning or cutting shall wear the eye and face protector...
33 CFR 142.27 - Eye and face protection.
Code of Federal Regulations, 2013 CFR
2013-07-01
33 CFR 142.27 (Navigation and Navigable Waters): Eye and face protection. (a) Personnel engaged in or observing welding, grinding, machining, chipping, handling hazardous materials, or acetylene burning or cutting shall wear the eye and face protector...
33 CFR 142.27 - Eye and face protection.
Code of Federal Regulations, 2014 CFR
2014-07-01
33 CFR 142.27 (Navigation and Navigable Waters): Eye and face protection. (a) Personnel engaged in or observing welding, grinding, machining, chipping, handling hazardous materials, or acetylene burning or cutting shall wear the eye and face protector...
33 CFR 142.27 - Eye and face protection.
Code of Federal Regulations, 2012 CFR
2012-07-01
33 CFR 142.27 (Navigation and Navigable Waters): Eye and face protection. (a) Personnel engaged in or observing welding, grinding, machining, chipping, handling hazardous materials, or acetylene burning or cutting shall wear the eye and face protector...
A natural language interface to databases
NASA Technical Reports Server (NTRS)
Ford, D. R.
1988-01-01
The development of a Natural Language Interface which is semantic-based and uses Conceptual Dependency representation is presented. The system was developed using Lisp and currently runs on a Symbolics Lisp machine. A key point is that the parser handles morphological analysis, which expands its capabilities of understanding more words.
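Morphological analysis in a parser of this kind can be sketched as suffix stripping against a lexicon, so the system understands inflected surface forms without listing every one. This toy stemmer and its word list are illustrative only, not the analyzer of the system described.

```python
# Suffix rules tried in order: (suffix to strip, replacement).
SUFFIXES = [("ies", "y"), ("ing", ""), ("ed", ""), ("es", ""), ("s", ""), ("d", "")]
LEXICON = {"retrieve", "query", "database", "search", "parse"}

def morph(word):
    """Strip inflectional suffixes until a known stem is found."""
    if word in LEXICON:
        return word
    for suffix, repl in SUFFIXES:
        if word.endswith(suffix):
            stem = word[: -len(suffix)] + repl
            if stem in LEXICON:
                return stem
    return None   # unknown word

print(morph("queries"))     # -> query
print(morph("searching"))   # -> search
print(morph("parsed"))      # -> parse
```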
NASA Astrophysics Data System (ADS)
Anderson, R. B.; Finch, N.; Clegg, S.; Graff, T.; Morris, R. V.; Laura, J.
2017-06-01
We present a Python-based library and graphical interface for the analysis of point spectra. The tool is being developed with a focus on methods used for ChemCam data, but is flexible enough to handle spectra from other instruments.
Exception handling for sensor fusion
NASA Astrophysics Data System (ADS)
Chavez, G. T.; Murphy, Robin R.
1993-08-01
This paper presents a control scheme for handling sensing failures (sensor malfunctions, significant degradations in performance due to changes in the environment, and errant expectations) in sensor fusion for autonomous mobile robots. The advantages of the exception handling mechanism are that it emphasizes a fast response to sensing failures, is able to use only a partial causal model of sensing failure, and leads to a graceful degradation of sensing if the sensing failure cannot be compensated for. The exception handling mechanism consists of two modules: error classification and error recovery. The error classification module in the exception handler attempts to classify the type and source(s) of the error using a modified generate-and-test procedure. If the source of the error is isolated, the error recovery module examines its cache of recovery schemes, which either repair or replace the current sensing configuration. If the failure is due to an error in expectation or cannot be identified, the planner is alerted. Experiments using actual sensor data collected by the CSM Mobile Robotics/Machine Perception Laboratory's Denning mobile robot demonstrate the operation of the exception handling mechanism.
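The two-module structure described above, error classification by generate-and-test followed by a cache of recovery schemes, with the planner alerted when nothing matches, can be sketched as follows. The failure sources, test results, and recovery actions are hypothetical, not from the Denning robot experiments.

```python
class ExceptionHandler:
    """Sensing-failure handler: classify the failure by testing candidate
    sources, then look up a recovery scheme; fall back to alerting the
    planner when classification or recovery fails."""
    def __init__(self, recovery_cache):
        self.recovery_cache = recovery_cache   # source -> recovery action

    def classify(self, tests):
        # generate-and-test: return the first candidate whose test confirms
        for source, confirmed in tests.items():
            if confirmed:
                return source
        return None

    def handle(self, tests):
        source = self.classify(tests)
        if source is None:
            return "alert_planner"
        return self.recovery_cache.get(source, "alert_planner")

handler = ExceptionHandler({"sensor_malfunction": "replace_sensor",
                            "environment_change": "retune_sensor"})
print(handler.handle({"sensor_malfunction": False,
                      "environment_change": True}))   # -> retune_sensor
print(handler.handle({"sensor_malfunction": False,
                      "environment_change": False}))  # -> alert_planner
```

The second call shows the graceful-degradation path: when no source can be isolated, control passes back to the planner instead of guessing a repair.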
Identification of failed fuel element
Fryer, Richard M.; Matlock, Robert G.
1976-06-22
A passive fission product gas trap is provided in the upper portion of each fuel subassembly in a nuclear reactor. The gas trap consists of an inverted funnel of less diameter than the subassembly having a valve at the apex thereof. An actuating rod extends upwardly from the valve through the subassembly to a point where it can be contacted by the fuel handling mechanism for the reactor. Interrogation of the subassembly for the presence of fission products is accomplished by lowering the fuel handling machine onto the subassembly to press down on the actuating rod and open the valve.
Tethering sockets and wrenches
NASA Technical Reports Server (NTRS)
Johnson, E. P.
1990-01-01
The tethering of sockets and wrenches was accomplished to improve the safety of working over motor segments. To accomplish the tethering of the sockets to the ratchets, a special design was implemented in which a groove was machined into each socket. Each socket was then fitted with a snap ring that can spin around the machined groove. The snap ring is tethered to the handle of the ratchet. All open end wrenches are also tethered to the ratchet or to the operator, depending upon the type. Tests were run to ensure that the modified tools meet torque requirements. The design was subsequently approved by Space Safety.
Analysis of tasks for dynamic man/machine load balancing in advanced helicopters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jorgensen, C.C.
1987-10-01
This report considers task allocation requirements imposed by advanced helicopter designs incorporating mixes of human pilots and intelligent machines. Specifically, it develops an analogy between load balancing using distributed non-homogeneous multiprocessors and human team functions. A taxonomy is presented which can be used to identify task combinations likely to cause overload for dynamic scheduling and process allocation mechanisms. Designer criteria are given for function decomposition, separation of control from data, and communication handling for dynamic tasks. Possible effects of n-p complete scheduling problems are noted and a class of combinatorial optimization methods are examined.
NIST Automated Manufacturing Research Facility (AMRF): March 1987
NASA Technical Reports Server (NTRS)
Herbert, Judith E. (Editor); Kane, Richard (Editor)
1987-01-01
The completion of and advances to the NIST Automated Manufacturing Research Facility (AMRF) are described in this video. The six workstations, (1) horizontal machining, (2) vertical machining, (3) turning machinery, (4) cleaning and deburring, (5) materials handling, and (6) inspection, are shown, and uses for each workstation are cited. Visiting researchers and scientists within NIST describe the advantages of each workstation, explain what the facility is used for and future applications for the technological advancements from the AMRF, including examples of how AMRF technology is being transferred to the U.S. Navy and industry, and discuss future technological goals for the facility.
Classifying Structures in the ISM with Machine Learning Techniques
NASA Astrophysics Data System (ADS)
Beaumont, Christopher; Goodman, A. A.; Williams, J. P.
2011-01-01
The processes which govern molecular cloud evolution and star formation often sculpt structures in the ISM: filaments, pillars, shells, outflows, etc. Because of their morphological complexity, these objects are often identified manually. Manual classification has several disadvantages; the process is subjective, not easily reproducible, and does not scale well to handle increasingly large datasets. We have explored to what extent machine learning algorithms can be trained to autonomously identify specific morphological features in molecular cloud datasets. We show that the Support Vector Machine algorithm can successfully locate filaments and outflows blended with other emission structures. When the objects of interest are morphologically distinct from the surrounding emission, this autonomous classification achieves >90% accuracy. We have developed a set of IDL-based tools to apply this technique to other datasets.
Method and system for providing work machine multi-functional user interface
Hoff, Brian D [Peoria, IL; Akasam, Sivaprasad [Peoria, IL; Baker, Thomas M [Peoria, IL
2007-07-10
A method is performed to provide a multi-functional user interface on a work machine for displaying suggested corrective action. The process includes receiving status information associated with the work machine and analyzing the status information to determine an abnormal condition. The process also includes displaying a warning message on the display device indicating the abnormal condition and determining one or more corrective actions to handle the abnormal condition. Further, the process includes determining an appropriate corrective action among the one or more corrective actions and displaying a recommendation message on the display device reflecting the appropriate corrective action. The process may also include displaying a list including the remaining one or more corrective actions on the display device to provide alternative actions to an operator.
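The claimed process, receive status, detect an abnormal condition, then display a warning, a recommended action, and the alternatives, can be sketched as a small rule table. The limit values, condition names, and corrective actions below are hypothetical, not from the patent.

```python
def analyze(status, limits):
    """Flag the first reading outside its limit as an abnormal condition."""
    for name, value in status.items():
        lo, hi = limits[name]
        if not lo <= value <= hi:
            return name
    return None

def advise(condition, actions):
    """Build the warning, the preferred corrective action, and alternatives."""
    if condition is None:
        return None
    options = actions[condition]
    return {"warning": f"abnormal: {condition}",
            "recommended": options[0],
            "alternatives": options[1:]}

# Hypothetical limits and action table for a work machine.
limits = {"coolant_temp": (0, 95), "oil_pressure": (30, 80)}
actions = {"coolant_temp": ["reduce load", "stop and inspect radiator"],
           "oil_pressure": ["stop engine", "check oil level"]}
msg = advise(analyze({"coolant_temp": 104, "oil_pressure": 55}, limits), actions)
print(msg["recommended"])  # -> reduce load
```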
Neo-Symbiosis: The Next Stage in the Evolution of Human Information Interaction.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Griffith, Douglas; Greitzer, Frank L.
In his 1960 paper "Man-Computer Symbiosis," Licklider predicted that human brains and computing machines would be coupled in a tight partnership that would think as no human brain has ever thought and process data in ways not approached by the information-handling machines of his day. Today we are on the threshold of resurrecting that vision of symbiosis. While Licklider's original vision suggested a co-equal relationship, here we discuss an updated vision, neo-symbiosis, in which the human holds a superordinate position in an intelligent human-computer collaborative environment. This paper was originally published as a journal article and is being published as a chapter in an upcoming book series, Advances in Novel Approaches in Cognitive Informatics and Natural Intelligence.
NASA Astrophysics Data System (ADS)
Noh, S. J.; Rakovec, O.; Kumar, R.; Samaniego, L. E.
2015-12-01
Accurate and reliable streamflow prediction is essential to mitigate the social and economic damage caused by water-related disasters such as floods and droughts. Sequential data assimilation (DA) may facilitate improved streamflow prediction by using real-time observations to correct internal model states. In conventional DA methods such as state updating, parametric uncertainty is often ignored, mainly due to practical limitations of methodology to specify modeling uncertainty with limited ensemble members. However, if parametric uncertainty related to routing and runoff components is not incorporated properly, the predictive uncertainty of the model ensemble may be insufficient to capture the dynamics of the observations, which may deteriorate predictability. Recently, a multi-scale parameter regionalization (MPR) method was proposed to make hydrologic predictions at different scales using the same set of model parameters without losing much of the model performance. The MPR method incorporated within the mesoscale hydrologic model (mHM, http://www.ufz.de/mhm) can effectively represent and control the uncertainty of high-dimensional parameters in a distributed model using global parameters. In this study, we evaluate the impacts of streamflow data assimilation over European river basins. In particular, a multi-parametric ensemble approach is tested to consider the effects of parametric uncertainty in DA. Because augmentation of parameters is not required within an assimilation window, the approach can be more stable with limited ensemble members and has potential for operational use. To account for the response times and non-Gaussian characteristics of internal hydrologic processes, lagged particle filtering is utilized. The presentation will focus on the gains and limitations of streamflow data assimilation and the multi-parametric ensemble method over large-scale basins.
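As an illustration of the sequential DA idea (not of mHM or the lagged filter itself), here is a minimal bootstrap particle filter for a scalar storage state with a persistence model; all numbers are invented:

```python
import math
import random

def assimilate(particles, obs, obs_std, proc_std, rng):
    """One bootstrap particle filter step: propagate, weight, resample."""
    # Propagate each particle through a persistence model plus process noise.
    prop = [p + rng.gauss(0.0, proc_std) for p in particles]
    # Weight by the Gaussian likelihood of the streamflow observation.
    w = [math.exp(-0.5 * ((obs - p) / obs_std) ** 2) for p in prop]
    total = sum(w)
    w = [x / total for x in w]
    # Stratified resampling keeps the ensemble size fixed.
    n, cdf, acc = len(prop), [], 0.0
    for x in w:
        acc += x
        cdf.append(acc)
    out, j = [], 0
    for i in range(n):
        u = (i + rng.random()) / n
        while j < n - 1 and cdf[j] < u:
            j += 1
        out.append(prop[j])
    return out

rng = random.Random(0)
ensemble = [rng.gauss(0.0, 1.0) for _ in range(500)]  # prior ensemble
for _ in range(5):  # repeatedly assimilate the same observation
    ensemble = assimilate(ensemble, obs=2.0, obs_std=0.5, proc_std=0.1, rng=rng)
```

After a few assimilation cycles the ensemble mean is pulled from the prior (0) toward the observation (2), which is the state-correction effect the abstract relies on.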
Denadai, Angelo M L; De Sousa, Frederico B; Passos, Joel J; Guatimosim, Fernando C; Barbosa, Kirla D; Burgos, Ana E; de Oliveira, Fernando Castro; da Silva, Jeann C; Neves, Bernardo R A; Mohallem, Nelcy D S; Sinisterra, Rubén D
2012-01-01
Organic-inorganic magnetic hybrid materials (MHMs) combine a nonmagnetic and a magnetic component by means of electrostatic interactions or covalent bonds, and notable features can be achieved. Herein, we describe an application of a self-assembled material based on ferrite associated with β-cyclodextrin (Fe-Ni/Zn/βCD) at the nanoscale level. This MHM and pure ferrite (Fe-Ni/Zn) were used as an adsorbent system for Cr³⁺ and Cr₂O₇²⁻ ions in aqueous solutions. Prior to the adsorption studies, both ferrites were characterized in order to determine the particle size distribution, morphology and available binding sites on the surface of the materials. Microscopy analysis demonstrated that both ferrites present two different size domains, at the micro- and nanoscale level, with the latter being able to self-assemble into larger particles. Fe-Ni/Zn/βCD presented smaller particles and a more homogeneous particle size distribution. Higher porosity for this MHM compared to Fe-Ni/Zn was observed by Brunauer-Emmett-Teller isotherms and positron-annihilation-lifetime spectroscopy. Based on the pKa values, potentiometric titrations demonstrated the presence of βCD in the inorganic matrix, indicating that the lamellar structures verified by transmission electron microscopy can be associated with βCD assembled structures. Colloidal stability was inferred as a function of time at different pH values, indicating the sedimentation rate as a function of pH. Zeta potential measurements identified an amphoteric behavior for the Fe-Ni/Zn/βCD, suggesting its better capability to remove ions (cations and anions) from aqueous solutions compared to that of Fe-Ni/Zn.
Boosey, Robyn; Prestwich, Georgina; Deave, Toity
2014-01-01
An increasing number of studies have found that girls in low-income settings miss or struggle at school during menstruation if they are unable to manage their menstrual hygiene effectively. This study explores the menstrual hygiene practices and knowledge of girls at rural government primary schools in the Rukungiri district in Uganda and assesses the extent to which poor menstrual hygiene management (MHM) affects their education. A self-administered questionnaire was completed by schoolgirls in six government-run primary schools in the Rukungiri district. Focus groups were held with girls from each school and semi-structured interviews were conducted with headteachers and female teachers from the participating schools. A toilet assessment was also conducted in each school. One hundred and forty schoolgirls completed the questionnaire. The girls reported a lack of access to adequate resources, facilities and accurate information to manage their menstrual hygiene effectively at school. They reported that, as a result, they often struggle at school or miss school during menstruation. Eighty-six girls (61.7%) reported missing school each month for menstrual-related reasons (mean 1.64, range 0-10, SD 1.84). It is common for girls who attend government-run primary schools in the Rukungiri district to miss school or struggle in lessons during menstruation because they do not have access to the resources, facilities, or information they need for effective MHM. This is likely to have detrimental effects on their education and future prospects. A large-scale study is needed to explore the extent of this issue.
Design and Simulation Plant Layout Using Systematic Layout Planning
NASA Astrophysics Data System (ADS)
Suhardini, D.; Septiani, W.; Fauziah, S.
2017-12-01
This research aims to design the factory layout of PT. Gunaprima Budiwijaya in order to increase production capacity. The problem faced by this company is an inappropriate layout that causes cross traffic on the production floor. The re-layout procedure consists of three steps: analysing the existing layout, designing the plant layout based on SLP, and evaluating and selecting an alternative layout using ProModel version 6 simulation software. Systematic layout planning is used to design the new layout from scratch rather than from the initial layout. The SLP produces four layout alternatives, and each alternative is evaluated against two criteria, namely material handling cost, using the Material Handling Evaluation Sheet (MHES), and processing time, by simulation. The results show that production capacity increases by 37.5% with the addition of a machine and an operator, while material handling cost is reduced by the improved layout. The systematic layout planning method reduces material handling cost by 10.98% from the initial layout, amounting to Rp1,229,813.34.
Rao, Carol Y; Pachucki, Constance; Cali, Salvatore; Santhiraj, Mangai; Krankoski, Kathi L K; Noble-Wang, Judith A; Leehey, David; Popli, Subhash; Brandt, Mary E; Lindsley, Mark D; Fridkin, Scott K; Arduino, Matthew J
2009-09-01
We investigated a cluster of cases of bloodstream infection (BSI) due to the mold Phialemonium at a hemodialysis center in Illinois and conducted a cohort study to identify risk factors. Environmental assessment and cohort study. A hemodialysis center in a tertiary care hospital. A case patient was defined as a person who underwent dialysis at the center and had a blood sample that tested positive for Phialemonium curvatum on culture. We reviewed microbiology and medical records and tested water, surface, and dialysate samples by culture. Molds isolated from environmental and clinical specimens were identified by their morphological features and confirmed by sequencing DNA. We identified 2 case patients with BSI due to P. curvatum. Both became febrile and hypotensive while undergoing dialysis on the same machine at the same treatment station, although on different days. Dialysis machines were equipped with waste handling option ports that are used to discard dialyzer priming fluid. We isolated P. curvatum from the product water (ie, water used for dialysis purposes) at 2 of 19 treatment stations, one of which was the implicated station. The source of P. curvatum was likely the water distribution system. To our knowledge, this is the first report of patients acquiring a mold BSI from contaminated product water. The route of exposure in these cases of BSI due to P. curvatum may be related to the malfunction and improper maintenance of the waste handling option ports. Waste handling option ports have been previously implicated as the source of bacterial BSI due to the backflow of waste fluid into a patient's blood line. No additional cases of infection were noted after remediation of the water distribution system and after discontinuing use of waste handling option ports at the facility.
Huang, Yukun; Chen, Rong; Wei, Jingbo; Pei, Xilong; Cao, Jing; Prakash Jayaraman, Prem; Ranjan, Rajiv
2014-01-01
JNI in the Android platform is often observed to have low efficiency and high coding complexity. Although many researchers have investigated the JNI mechanism, few of them solve the efficiency and the complexity problems of JNI in the Android platform simultaneously. In this paper, a hybrid polylingual object (HPO) model is proposed to allow a CAR object to be accessed as a Java object, and vice versa, in the Dalvik virtual machine. It is an acceptable substitute for JNI to reuse CAR-compliant components in Android applications in a seamless and efficient way. A metadata injection mechanism is designed to support the automatic mapping and reflection between CAR objects and Java objects. A prototype virtual machine, called HPO-Dalvik, is implemented by extending the Dalvik virtual machine to support the HPO model. Lifespan management, garbage collection, and data type transformation of HPO objects are also handled in the HPO-Dalvik virtual machine automatically. The experimental results show that the HPO model outperforms standard JNI, with lower overhead on the native side and better execution performance, while requiring no JNI bridging code.
Repurposing mainstream CNC machine tools for laser-based additive manufacturing
NASA Astrophysics Data System (ADS)
Jones, Jason B.
2016-04-01
The advent of laser technology has been a key enabler for industrial 3D printing, known as Additive Manufacturing (AM). Despite its commercial success and unique technical capabilities, laser-based AM systems are not yet able to produce parts with the same accuracy and surface finish as CNC machining. To enable the geometry and material freedoms afforded by AM, yet achieve the precision and productivity of CNC machining, hybrid combinations of these two processes have started to gain traction. To achieve the benefits of combined processing, laser technology has been integrated into mainstream CNC machines - effectively repurposing them as hybrid manufacturing platforms. This paper reviews how this engineering challenge has prompted beam delivery innovations to allow automated changeover between laser processing and machining, using standard CNC tool changers. Handling laser-processing heads using the tool changer also enables automated changeover between different types of laser processing heads, further expanding the breadth of laser processing flexibility in a hybrid CNC. This paper highlights the development, challenges and future impact of hybrid CNCs on laser processing.
InfoQUEST: An Online Catalog for Small Libraries.
ERIC Educational Resources Information Center
Campbell, Bonnie
1984-01-01
InfoQUEST is a microcomputer-based online public access catalog, designed for the small library handling file sizes up to 25,000 records. Based on the IBM-PC, or compatible machines, the system will accept downloading, in batch mode, of records from the library's file on the UTLAS Catalog Support System. (Author/EJS)
ERIC Educational Resources Information Center
Kantor-Horning, Susan
2009-01-01
It's called GoLibrary in the United States and Bokomaten in its native Sweden. Patrons know it as Library-a-Go-Go in Contra Costa County, California, but whatever its name, the automated lending service this materials handling machine provides has proved a tremendous aid in addressing underserved segments of this sprawling community. It's not hard…
Failure Analysis of Sapphire Refractive Secondary Concentrators
NASA Technical Reports Server (NTRS)
Salem, Jonathan A.; Quinn, George D.
2009-01-01
Failure analysis was performed on two sapphire, refractive secondary concentrators (RSC) that failed during elevated temperature testing. Both concentrators failed from machining/handling damage on the lens face. The first concentrator, which failed during testing to 1300 C, exhibited a large r-plane twin extending from the lens through much of the cone. The second concentrator, which was an attempt to reduce temperature gradients and failed during testing to 649 C, exhibited a few small twins on the lens face. The twins were not located at the origin, but represent another mode of failure that needs to be considered in the design of sapphire components. In order to estimate the fracture stress from fractographic evidence, branching constants were measured on sapphire strength specimens. The fractographic analysis indicated radial tensile stresses of 44 to 65 MPa on the lens faces near the origins. Finite element analysis indicated similar stresses for the first RSC, but lower stresses for the second RSC. Better machining and handling might have prevented the fractures, however, temperature gradients and resultant thermal stresses need to be reduced to prevent twinning.
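The fractographic estimate mentioned in this abstract rests on the standard branching relation σ_f · √R_b = A_b, where R_b is the distance from the origin to the branching boundary and A_b is a material branching constant. The constant and radius below are purely illustrative and are not the measured sapphire values:

```python
import math

def fracture_stress(branching_constant_mpa_sqrt_m, branching_radius_m):
    """Estimate fracture stress (MPa) from a measured branching radius (m)."""
    return branching_constant_mpa_sqrt_m / math.sqrt(branching_radius_m)

# Illustrative values only: A_b = 2.0 MPa*sqrt(m), R_b = 2 mm.
sigma = fracture_stress(2.0, 0.002)
```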
Prediction of hourly PM2.5 using a space-time support vector regression model
NASA Astrophysics Data System (ADS)
Yang, Wentao; Deng, Min; Xu, Feng; Wang, Hang
2018-05-01
Real-time air quality prediction has been an active field of research in atmospheric environmental science. The existing methods of machine learning are widely used to predict pollutant concentrations because of their enhanced ability to handle complex non-linear relationships. However, because pollutant concentration data, as typical geospatial data, also exhibit spatial heterogeneity and spatial dependence, they may violate the assumptions of independent and identically distributed random variables in most of the machine learning methods. As a result, a space-time support vector regression model is proposed to predict hourly PM2.5 concentrations. First, to address spatial heterogeneity, spatial clustering is executed to divide the study area into several homogeneous or quasi-homogeneous subareas. To handle spatial dependence, a Gauss vector weight function is then developed to determine spatial autocorrelation variables as part of the input features. Finally, a local support vector regression model with spatial autocorrelation variables is established for each subarea. Experimental data on PM2.5 concentrations in Beijing are used to verify whether the results of the proposed model are superior to those of other methods.
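The structure of the proposed model (spatial clustering into subareas, then one local regressor per subarea) can be sketched as below. For brevity, ordinary least squares stands in for support vector regression, the stations live on a one-dimensional coordinate, and the data are synthetic:

```python
def kmeans_1d(coords, k=2, iters=20):
    """Tiny 1-D k-means to split stations into quasi-homogeneous subareas."""
    centers = [min(coords), max(coords)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for c in coords:
            j = min(range(k), key=lambda i: abs(c - centers[i]))
            groups[j].append(c)
        centers = [sum(g) / len(g) if g else centers[i] for i, g in enumerate(groups)]
    return centers

def fit_ols(xs, ys):
    """Closed-form simple linear regression (stand-in for a local SVR)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Synthetic stations: two subareas with different local PM2.5 relationships.
data = [(c, 2 * c + 1) for c in [0, 1, 2, 3, 4]] + [(c, -c + 3) for c in [10, 11, 12, 13, 14]]
centers = kmeans_1d([c for c, _ in data])
models = {}
for j, center in enumerate(centers):
    pts = [(c, y) for c, y in data if min(centers, key=lambda m: abs(c - m)) == center]
    models[j] = fit_ols([c for c, _ in pts], [y for _, y in pts])

def predict(coord):
    j = min(range(len(centers)), key=lambda i: abs(coord - centers[i]))
    a, b = models[j]
    return a * coord + b
```

Each subarea gets its own model, so the two different local relationships are recovered exactly rather than being averaged into one global fit, which is the point of handling spatial heterogeneity by clustering.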
Distributed state machine supervision for long-baseline gravitational-wave detectors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rollins, Jameson Graef, E-mail: jameson.rollins@ligo.org
The Laser Interferometer Gravitational-wave Observatory (LIGO) consists of two identical yet independent, widely separated, long-baseline gravitational-wave detectors. Each Advanced LIGO detector consists of complex optical-mechanical systems isolated from the ground by multiple layers of active seismic isolation, all controlled by hundreds of fast, digital, feedback control systems. This article describes a novel state machine-based automation platform developed to handle the automation and supervisory control challenges of these detectors. The platform, called Guardian, consists of distributed, independent, state machine automaton nodes organized hierarchically for full detector control. User code is written in standard Python and the platform is designed to facilitate the fast-paced development process associated with commissioning the complicated Advanced LIGO instruments. While developed specifically for the Advanced LIGO detectors, Guardian is a generic state machine automation platform that is useful for experimental control at all levels, from simple table-top setups to large-scale multi-million dollar facilities.
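The automaton-node idea can be illustrated with a toy state machine in Python; the state names and transition logic here are invented and do not reflect Guardian's actual API:

```python
class AutomatonNode:
    """A minimal state machine node: each state handler returns the next state."""

    def __init__(self, states, initial):
        self.states = states      # maps state name -> handler returning next state
        self.state = initial
        self.history = [initial]

    def step(self):
        nxt = self.states[self.state]()
        if nxt != self.state:
            self.state = nxt
            self.history.append(nxt)

    def run_until(self, goal, max_steps=100):
        for _ in range(max_steps):
            if self.state == goal:
                return True
            self.step()
        return self.state == goal

# Invented example: drive a subsystem from DOWN to LOCKED.
node = AutomatonNode(
    states={
        "DOWN":    lambda: "LOCKING",
        "LOCKING": lambda: "LOCKED",
        "LOCKED":  lambda: "LOCKED",
    },
    initial="DOWN",
)
reached = node.run_until("LOCKED")
```

In a hierarchical arrangement, a supervisor node's handlers would request goal states of nodes like this one, which is the organizing principle the abstract describes.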
ODC-Free Solvent Implementation for Phenolics Cleaning
NASA Technical Reports Server (NTRS)
Wurth, Laura; Biegert, Lydia; Lamont, DT; McCool, Alex (Technical Monitor)
2001-01-01
During phenolic liner manufacture, resin-impregnated (pre-preg) bias tape of silica, glass, or carbon cloth is tape-wrapped, cured, machined, and then wiped with 1,1,1-trichloroethane (TCA) to remove contaminants that may have been introduced during machining and handling. Following the TCA wipe, the machined surface is given a resin wet-coat, over-wrapped with more pre-preg, and cured. A TCA replacement solvent for these wiping operations must effectively remove both surface contaminants and sub-surface oils and greases while not compromising the integrity of this interface. Selection of a TCA replacement solvent for phenolic over-wrap interface cleaning began with sub-scale compatibility tests with cured phenolics. Additional compatibility tests included assessment of solvent retention in machined phenolic surfaces. Results from these tests showed that, while the candidate solvent did not degrade the cured phenolics, it was retained in higher concentrations than TCA in phenolic surfaces. This effect was most pronounced with glass and silica cloth phenolics with steep ply angles relative to the wiped surfaces.
Christensen, H; Pedersen, M B; Sjøgaard, G
1995-04-01
Musculoskeletal disorders constitute a major problem in the wood and furniture industry, and identification of risk factors is needed urgently. Therefore, exposures to different work tasks and variation in the job were recorded based on an observation survey in combination with an interview among 281 employees working in woodworking and painting departments. A questionnaire survey confirmed high frequencies of symptoms from the musculoskeletal system: the one-year prevalence of symptoms from the low back was 42% and from the neck/shoulder 40%. The exposure was evaluated based on: (1) classification of work tasks, (2) work cycle time, (3) manual materials handling, (4) working postures, and (5) variation in the job. Among the employees, 47% performed feeding or clearing of machines, 35% performed woodworking or painting of materials, and 18% performed various other operations. Among the employees, 20% had no variation in their job while 44% had little variation. Manual materials handling of 375 different burdens was observed, which most often occurred during feeding or clearing of machines. The weight of the burdens lifted was 0.5-87.0 kg, and 2% had a weight of more than 50 kg. Among the lifting conditions, 30% were evaluated as implying a risk of injury. An additional risk factor was the high total tonnage lifted per day, which was estimated to range from 132 kg to 58,800 kg. Working postures implied a risk of injury due to prolonged forward and lateral flexion of the neck, which was seen most frequently during woodworking or painting of materials. These data substantiate that feeding or clearing of machines mainly implies a risk of injury to the low back, whereas woodworking or painting of materials mainly implies a risk of injury to the neck and shoulder area. Optimal strategies for job redesign may be worked out by using these data in order to prevent occupational musculoskeletal disorders.
Sivakumar, B; Bhalaji, N; Sivakumar, D
2014-01-01
In mobile ad hoc networks, connectivity is always a concern. Because of the dynamic behavior of mobile nodes, efficiency can be achieved only under the assumption of a good network infrastructure. The presence of critical links results in deterioration, which should be detected in advance in order to retain the prevailing communication setup. This paper presents a short survey of the specialized algorithms and protocols in the recent literature related to energy-efficient load balancing for critical link detection. It also suggests a machine learning based hybrid power-aware approach for handling critical nodes via load balancing.
Replacement Condition Detection of Railway Point Machines Using an Electric Current Sensor.
Sa, Jaewon; Choi, Younchang; Chung, Yongwha; Kim, Hee-Young; Park, Daihee; Yoon, Sukhan
2017-01-29
Detecting replacement conditions of railway point machines is important to simultaneously satisfy the budget-limit and train-safety requirements. In this study, we consider classification of the subtle differences in the aging effect-using electric current shape analysis-for the purpose of replacement condition detection of railway point machines. After analyzing the shapes of after-replacement data and then labeling the shapes of each before-replacement data, we can derive the criteria that can handle the subtle differences between "does-not-need-to-be-replaced" and "needs-to-be-replaced" shapes. On the basis of the experimental results with in-field replacement data, we confirmed that the proposed method could detect the replacement conditions with acceptable accuracy, as well as provide visual interpretability of the criteria used for the time-series classification.
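The shape-analysis idea (derive criteria from after-replacement reference curves, then flag curves that deviate too far from them) can be sketched as a simple template-matching classifier. The current waveforms and threshold rule below are synthetic, not the paper's data or criteria:

```python
import math

def rms_deviation(curve, template):
    return math.sqrt(sum((c - t) ** 2 for c, t in zip(curve, template)) / len(curve))

def build_criterion(after_replacement_curves, slack=2.0):
    """Average the after-replacement curves into a template; set a threshold."""
    n = len(after_replacement_curves[0])
    template = [sum(c[i] for c in after_replacement_curves) / len(after_replacement_curves)
                for i in range(n)]
    worst = max(rms_deviation(c, template) for c in after_replacement_curves)
    return template, slack * worst

def needs_replacement(curve, template, threshold):
    return rms_deviation(curve, template) > threshold

# Synthetic electric current curves (one value per sample during a switch throw).
new_machine_curves = [[5, 4, 3, 3, 2, 1], [5.2, 4.1, 3.0, 2.9, 2.1, 1.0]]
template, threshold = build_criterion(new_machine_curves)
aged_curve = [7, 6, 5, 5, 4, 3]   # higher, longer-sustained current draw
```

Because the criterion is just a template plus a distance threshold, it retains the visual interpretability the abstract emphasizes: one can plot a questionable curve against the template and see where it deviates.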
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wolfe, Justin E.; Qiu, S. Roger; Stolz, Christopher J.
2011-03-20
Femtosecond laser machining is used to create mitigation pits to stabilize nanosecond laser-induced damage in multilayer dielectric mirror coatings on BK7 substrates. In this paper, we characterize features and the artifacts associated with mitigation pits and further investigate the impact of pulse energy and pulse duration on pit quality and damage resistance. Our results show that these mitigation features can double the fluence-handling capability of large-aperture optical multilayer mirror coatings and further demonstrate that femtosecond laser macromachining is a promising means for fabricating mitigation geometry in multilayer coatings to increase mirror performance under high-power laser irradiation.
Taking Interpersonal Communication out of the Classroom into the World of Computer Technology.
ERIC Educational Resources Information Center
Gantt, Vernon W.
The emergence of the information society introduces the academic community to the most significant revolution since the invention of the printing press. The growing use of computers can lead to a depreciation of self-worth. Since the machine can handle complex logical applications with considerably more speed and accuracy than most people, many…
Managing Quality, Identity and Adversaries in Public Discourse with Machine Learning
ERIC Educational Resources Information Center
Brennan, Michael
2012-01-01
Automation can mitigate issues when scaling and managing quality and identity in public discourse on the web. Discourse needs to be curated and filtered. Anonymous speech has to be supported while handling adversaries. Reliance on human curators or analysts does not scale and content can be missed. These scaling and management issues include the…
Rechargeable Magnesium Power Cells
NASA Technical Reports Server (NTRS)
Koch, Victor R.; Nanjundiah, Chenniah; Orsini, Michael
1995-01-01
Rechargeable power cells based on magnesium anodes developed as safer alternatives to high-energy-density cells like those based on lithium and sodium anodes. At cost of some reduction in energy density, magnesium-based cells safer because less susceptible to catastrophic meltdown followed by flames and venting of toxic fumes. Other advantages include ease of handling, machining, and disposal, and relatively low cost.
Cutting Tool For Shaving Weld Beads
NASA Technical Reports Server (NTRS)
Hoffman, David S.; Mcferrin, David C.; Daniel, Ronald L., Jr.; Coby, John B., Jr.; Dawson, Sidney G.
1995-01-01
Cutting tool proposed for use in shaving weld beads flush with adjacent surfaces of weldments. Modified version of commercial pneumatically driven rotary cutting tool, cutting wheel of which turns at speeds sufficient for machining nickel alloys, titanium, and stainless steels. Equipped with forward-mounted handle and rear-mounted skid plate to maximize control and reduce dependence on skill of technician.
Handling of Varied Data Bases in an Information Center Environment.
ERIC Educational Resources Information Center
Williams, Martha E.
Information centers exist to provide information from machine-readable data bases to users in industry, universities and other organizations. The computer Search Center of the IIT Research Institute was designed with a number of variables and uncertainties before it. In this paper, the author discusses how the Center was designed to enable it to…
Nätt, Daniel; Agnvall, Beatrix; Jensen, Per
2014-01-01
While behavioral sex differences have repeatedly been reported across taxa, the underlying epigenetic mechanisms in the brain remain largely unexplored. Birds have previously been shown to have only limited dosage compensation, leading to a strong sex bias in Z-chromosome gene expression. In chickens, a male hyper-methylated region (MHM) on the Z-chromosome has been associated with a local type of dosage compensation, but a more detailed characterization of the avian methylome is limiting our interpretations. Here we report an analysis of genome-wide sex differences in promoter DNA-methylation and gene expression in the brain of three-week-old chickens, and associated sex differences in the behavior of Red Junglefowl (the ancestor of domestic chickens). Combining DNA-methylation tiling arrays with gene expression microarrays, we show that a specific locus of the MHM region, together with the promoter for the zinc finger RNA binding protein (ZFR) gene on chromosome 1, is strongly associated with sex dimorphism in gene expression. Apart from this, we found few differences in promoter DNA-methylation, even though hundreds of genes were robustly differentially expressed across distantly related breeds. Several of the differentially expressed genes are known to affect behavior, and, as suggested by their functional annotation, we found that female Red Junglefowl are more explorative and fearful in a range of tests performed throughout their lives. This paper identifies new sites and, with increased resolution, confirms known sites where DNA-methylation seems to affect sexually dimorphic gene expression, but the general lack of this association is noticeable and strengthens the view that birds do not have dosage compensation. PMID:24782041
Huang, Yukun; Chen, Rong; Wei, Jingbo; Pei, Xilong; Cao, Jing; Prakash Jayaraman, Prem; Ranjan, Rajiv
2014-01-01
JNI in the Android platform is often observed with low efficiency and high coding complexity. Although many researchers have investigated the JNI mechanism, few of them solve the efficiency and the complexity problems of JNI in the Android platform simultaneously. In this paper, a hybrid polylingual object (HPO) model is proposed to allow a CAR object to be accessed as a Java object, and vice versa, in the Dalvik virtual machine. It is an acceptable substitute for JNI to reuse CAR-compliant components in Android applications in a seamless and efficient way. A metadata injection mechanism is designed to support the automatic mapping and reflection between CAR objects and Java objects. A prototype virtual machine, called HPO-Dalvik, is implemented by extending the Dalvik virtual machine to support the HPO model. Lifespan management, garbage collection, and data type transformation of HPO objects are also handled in the HPO-Dalvik virtual machine automatically. The experimental results show that the HPO model outperforms standard JNI, with lower overhead on the native side and better execution performance, while requiring no JNI bridging code. PMID:25110745
Proposed algorithm to improve job shop production scheduling using ant colony optimization method
NASA Astrophysics Data System (ADS)
Pakpahan, Eka KA; Kristina, Sonna; Setiawan, Ari
2017-12-01
This paper deals with the determination of a job shop production schedule in an automated environment. In this environment, machines and the material handling system are integrated and controlled by a computer center, where schedules are created and then used to dictate the movement of parts and the operations at each machine. This setting is usually designed to support an unmanned production process for a specified time interval. We consider parts with various operation requirements. Each operation requires specific cutting tools. These parts are to be scheduled on machines of identical capability, meaning that each machine is equipped with a similar set of cutting tools and is therefore capable of processing any operation. The availability of a particular machine to process a particular operation is determined by the remaining lifetime of its cutting tools. We propose an algorithm based on the ant colony optimization method, implemented in MATLAB, to generate a production schedule that minimizes the total processing time of the parts (makespan). We tested the algorithm on data provided by a real industry, and the process shows a very short computation time. This contributes substantially to the flexibility and timeliness targeted in an automated environment.
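The approach described in this abstract can be illustrated with a deliberately small sketch: ants assign jobs to identical machines under pheromone guidance, and the best makespan found so far reinforces the trail. This is not the authors' MATLAB implementation; the job durations, parameters, and the omission of cutting-tool lifetimes are simplifying assumptions.

```python
import random

def aco_schedule(durations, n_machines=2, n_ants=20, n_iter=50, rho=0.1, seed=0):
    """Toy ant colony optimization: assign jobs to identical machines so as
    to minimize the makespan. tau[j][m] is the pheromone biasing job j
    toward machine m; the heuristic term favors lightly loaded machines."""
    rng = random.Random(seed)
    n = len(durations)
    tau = [[1.0] * n_machines for _ in range(n)]
    best_assign, best_makespan = None, float("inf")
    for _ in range(n_iter):
        for _ant in range(n_ants):
            loads = [0.0] * n_machines
            assign = []
            for j in range(n):
                # choose a machine with probability ~ pheromone * heuristic
                weights = [tau[j][m] / (1.0 + loads[m]) for m in range(n_machines)]
                m = rng.choices(range(n_machines), weights=weights)[0]
                assign.append(m)
                loads[m] += durations[j]
            makespan = max(loads)
            if makespan < best_makespan:
                best_makespan, best_assign = makespan, assign
        # evaporate all trails, then reinforce the best assignment so far
        for j in range(n):
            for m in range(n_machines):
                tau[j][m] *= 1.0 - rho
            tau[j][best_assign[j]] += 1.0 / best_makespan
    return best_assign, best_makespan
```

For durations [4, 3, 2, 2, 1] on two machines the sketch finds a balanced split with makespan 6, the optimum for that toy instance.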
Prototype Vector Machine for Large Scale Semi-Supervised Learning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Kai; Kwok, James T.; Parvin, Bahram
2009-04-29
Practical data mining rarely falls exactly into the supervised learning scenario. Rather, the growing amount of unlabeled data poses a big challenge to large-scale semi-supervised learning (SSL). We note that the computational intensiveness of graph-based SSL arises largely from the manifold or graph regularization, which in turn leads to large models that are difficult to handle. To alleviate this, we propose the prototype vector machine (PVM), a highly scalable, graph-based algorithm for large-scale SSL. Our key innovation is the use of "prototype vectors" for efficient approximation of both the graph-based regularizer and the model representation. The choice of prototypes is grounded upon two important criteria: they not only perform effective low-rank approximation of the kernel matrix, but also span a model suffering the minimum information loss compared with the complete model. We demonstrate encouraging performance and appealing scaling properties of the PVM on a number of machine learning benchmark data sets.
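The low-rank approximation of the kernel matrix mentioned above is, in spirit, a Nyström-style factorization K ≈ C W⁺ Cᵀ built from a small prototype set. The sketch below is an illustrative reconstruction of that general technique, not the PVM code itself; the RBF kernel and its width are assumptions.

```python
import numpy as np

def nystrom_approx(X, prototypes, gamma=0.5):
    """Approximate the full n x n kernel matrix from an n x m cross-kernel C
    and an m x m prototype kernel W: K ≈ C W^+ C^T."""
    def rbf(A, B):
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * sq)
    C = rbf(X, prototypes)           # kernel between all points and prototypes
    W = rbf(prototypes, prototypes)  # kernel among the prototypes only
    return C @ np.linalg.pinv(W) @ C.T
```

When the prototypes are the data points themselves, the pseudoinverse identity K K⁺ K = K makes the approximation exact; with m ≪ n prototypes the storage drops from O(n²) kernel entries to O(nm), which is what makes the approach scalable.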
Cheminformatics in Drug Discovery, an Industrial Perspective.
Chen, Hongming; Kogej, Thierry; Engkvist, Ola
2018-05-18
Cheminformatics has established itself as a core discipline within large-scale drug discovery operations. It would be impossible to handle the amount of data generated today in a small molecule drug discovery project without persons skilled in cheminformatics. In addition, due to the increased emphasis on "Big Data", machine learning and artificial intelligence, not only in society in general but also in drug discovery, it is expected that the cheminformatics field will be even more important in the future. Traditional areas like virtual screening, library design and high-throughput screening analysis are highlighted in this review. Applying machine learning in drug discovery is an area that has become very important. Applications of machine learning in early drug discovery have been extended from predicting ADME properties and target activity to tasks like de novo molecular design and prediction of chemical reactions. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
New fuzzy support vector machine for the class imbalance problem in medical datasets classification.
Gu, Xiaoqing; Ni, Tongguang; Wang, Hongyuan
2014-01-01
In medical datasets classification, the support vector machine (SVM) is considered to be one of the most successful methods. However, most real-world medical datasets contain some outliers/noise, and the data often have class imbalance problems. In this paper, a fuzzy support vector machine (FSVM) for the class imbalance problem (called FSVM-CIP) is presented, which can be seen as a modified class of FSVM obtained by extending manifold regularization and assigning two misclassification costs to the two classes. The proposed FSVM-CIP can be used to handle the class imbalance problem in the presence of outliers/noise, and enhance the locality maximum margin. Five real-world medical datasets, breast, heart, hepatitis, BUPA liver, and pima diabetes, from the UCI medical database are employed to illustrate the method presented in this paper. Experimental results on these datasets show that FSVM-CIP is more effective than, or comparable to, existing methods.
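The two-misclassification-cost idea at the heart of FSVM-CIP can be sketched with a plain linear SVM trained by subgradient descent on a class-weighted hinge loss. This sketch omits the fuzzy memberships and manifold regularization of the actual method; the learning rate, costs, and toy data are illustrative assumptions.

```python
import numpy as np

def weighted_hinge_svm(X, y, cost_pos=1.0, cost_neg=1.0, lam=0.01, lr=0.05, epochs=500):
    """Linear SVM trained on a class-weighted hinge loss: examples of the
    positive class (y=+1) are penalized by cost_pos, negatives by cost_neg."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    costs = np.where(y > 0, cost_pos, cost_neg)
    for _ in range(epochs):
        margins = y * (X @ w + b)
        mask = margins < 1.0               # margin violators contribute subgradient
        cy = costs[mask] * y[mask]
        grad_w = lam * w - (X[mask] * cy[:, None]).sum(axis=0) / n
        grad_b = -cy.sum() / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

def svm_predict(X, w, b):
    return np.sign(X @ w + b)
```

Raising cost_pos above cost_neg pushes the separating hyperplane away from the minority positive class, trading majority-class accuracy for minority recall, which is the basic mechanism for handling class imbalance.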
Handling knowledge via Concept Maps: a space weather use case
NASA Astrophysics Data System (ADS)
Messerotti, Mauro; Fox, Peter
Concept Maps (Cmaps) are powerful means for knowledge coding in graphical form. As flexible software tools exist to manipulate the knowledge embedded in Cmaps in machine-readable form, such complex entities are suitable candidates not only for the representation of ontologies and semantics in Virtual Observatory (VO) architectures, but also for knowledge handling and knowledge discovery. In this work, we present a use case relevant to space weather applications and we elaborate on its possible implementation and advanced use in Semantic Virtual Observatories dedicated to Sun-Earth Connections. This analysis was carried out in the framework of the Electronic Geophysical Year (eGY) and represents an achievement fostered by the eGY Virtual Observatories Working Group.
NASA Technical Reports Server (NTRS)
Gangal, M. D.
1985-01-01
Version of jaw miner operates without mechanical cutting and crushing. Forward-pointing jets of water dislodge and break up coal. Rearward-pointing jets further break up coal and force particles into slurry chamber. Oscillating-jet mechanism itself stays within "jaw" structure and is protected from wear and tear associated with coal handling. All-jet machine generates even less dust than auger, therefore poses less of an explosion or health hazard.
Processing woody biomass with a modified horizontal grinder
Dana Mitchell; John Klepac
2008-01-01
This study documents the production rate and cost of producing woody biomass chips for use in a power plant. The power plant has specific raw material handling requirements. Output from a 3-knife chipper, a tub grinder, and a horizontal grinder was considered. None of the samples from these machines met the specifications needed. A horizontal grinder was modified to...
ERIC Educational Resources Information Center
Weller, Herman G.; Hartson, H. Rex
1992-01-01
Describes human-computer interface needs for empowering environments in computer usage in which the machine handles the routine mechanics of problem solving while the user concentrates on its higher order meanings. A closed-loop model of interaction is described, interface as illusion is discussed, and metaphors for human-computer interaction are…
Automated Handling of Garments for Pressing
1991-09-30
[Front-matter fragments from the report's contents listings: "Parallel Algorithms for 2D Kalman Filtering" (D. J. Potter and M. P. Cline); "Hash Table and Sorted Array: A Case Study of Kalman Filtering on the Connection Machine" (M. A. Palis and D. K. Krecker); "Parallel Sorting of Large Arrays on the MasPar"; and a chapter on algorithms for seam sensing, covering Karel algorithms and image filtering.]
Environmental Assessment: Anti-Terrorism/Force Protection McConnell Air Force Base, Kansas
2003-09-01
[Excerpts: hazardous materials would be handled, stored, transported, disposed, or recycled in accordance with these regulations; the potential for hazardous waste generation from gate… Construction equipment: Loader (rubber tire), Concrete Truck, Concrete Finisher, Crane, Asphalt Spreader, Roller, Flat Bed Truck (18 wheel), Scraper, Trenching Machine… plastics, and lumber. These materials would be placed in the appropriate construction materials landfill or recycled when possible.]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pawlus, Witold, E-mail: witold.p.pawlus@ieee.org; Ebbesen, Morten K.; Hansen, Michael R.
Design of offshore drilling equipment is a task that involves not only analysis of strict machine specifications and safety requirements but also consideration of changeable weather conditions and a harsh environment. These challenges call for a multidisciplinary approach and make the design process complex. Various modeling software products are currently available to aid design engineers in their effort to test and redesign equipment before it is manufactured. However, given the number of available modeling tools and methods, the choice of the proper modeling methodology becomes not obvious and, in some cases, troublesome. Therefore, we present a comparative analysis of two popular approaches used in modeling and simulation of mechanical systems: multibody and analytical modeling. A gripper arm of the offshore vertical pipe handling machine is selected as a case study for which both models are created. In contrast to some other works, the current paper shows verification of both systems by benchmarking their simulation results against each other. Criteria such as modeling effort and results accuracy are evaluated to assess which modeling strategy is the most suitable given its eventual application.
NASA Technical Reports Server (NTRS)
Crane, D. F.
1984-01-01
When human operators are performing precision tracking tasks, their dynamic response can often be modeled by quasilinear describing functions. That fact permits analysis of the effects of delay in certain man-machine control systems using linear control system analysis techniques. The analysis indicates that a reduction in system stability is the immediate effect of additional control system delay, and that system characteristics moderate or exaggerate the importance of the delay. A selection of data (simulator and flight test) consistent with the analysis is reviewed. Flight simulator visual-display delay compensation, designed to restore pilot-aircraft system stability, was evaluated in several studies which are reviewed here. The studies range from single-axis tracking-task experiments (with sufficient subjects and trials to establish the statistical significance of the results) to a brief evaluation of compensation of a computer generated imagery (CGI) visual display system in a full six-degree-of-freedom simulation. The compensation was effective; improvements in pilot performance, workload, or aircraft handling qualities ratings (HQR) were observed. Results from the recent aircraft handling qualities research literature, which support the compensation design approach, are also reviewed.
A Spatiotemporal Prediction Framework for Air Pollution Based on Deep RNN
NASA Astrophysics Data System (ADS)
Fan, J.; Li, Q.; Hou, J.; Feng, X.; Karimian, H.; Lin, S.
2017-10-01
Time series data in practical applications always contain missing values due to sensor malfunction, network failure, outliers, etc. In order to handle missing values in time series, as well as the lack of consideration of temporal properties in machine learning models, we propose a spatiotemporal prediction framework based on missing value processing algorithms and a deep recurrent neural network (DRNN). By using a missing tag and missing interval to represent time series patterns, we implement three different missing value fixing algorithms, which are further incorporated into a deep neural network that consists of LSTM (Long Short-term Memory) layers and fully connected layers. Real-world air quality and meteorological datasets (Jingjinji area, China) are used for model training and testing. Deep feed forward neural networks (DFNN) and gradient boosting decision trees (GBDT) are trained as baseline models against the proposed DRNN. The performance of the three missing value fixing algorithms, as well as of the different machine learning models, is evaluated and analysed. Experiments show that the proposed DRNN framework outperforms both DFNN and GBDT, thereby validating the capacity of the proposed framework. Our results also provide useful insights for better understanding of the different strategies that handle missing values.
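The missing-tag and missing-interval representation described above can be made concrete with one simple fixing strategy, last observation carried forward. The triple layout and the zero default before the first observation are illustrative assumptions, not the paper's exact encoding.

```python
def missing_features(series):
    """Turn a series with gaps (None) into (filled_value, missing_tag,
    interval_since_last_observation) triples. Gaps are filled by carrying
    the last observation forward; before any observation, 0.0 is assumed."""
    out, last, gap = [], 0.0, 0
    for x in series:
        if x is None:
            gap += 1
            out.append((last, 1, gap))   # tag=1 marks an imputed value
        else:
            last, gap = x, 0
            out.append((x, 0, 0))        # tag=0 marks a real observation
    return out
```

Feeding the tag and interval alongside the filled value lets a downstream recurrent model learn how much to trust each imputed point, rather than treating imputations as real sensor readings.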
Remote control for anode-cathode adjustment
Roose, Lars D.
1991-01-01
An apparatus for remotely adjusting the anode-cathode gap in a pulse power machine has an electric motor located within a hollow cathode inside the vacuum chamber of the pulse power machine. Input information for controlling the motor for adjusting the anode-cathode gap is fed into the apparatus using optical waveguides. The motor, controlled by the input information, drives a worm gear that moves a cathode tip. When the motor drives in one rotational direction, the cathode is moved toward the anode and the size of the anode-cathode gap is diminished. When the motor drives in the other direction, the cathode is moved away from the anode and the size of the anode-cathode gap is increased. The motor is powered by batteries housed in the hollow cathode. The batteries may be rechargeable, and they may be recharged by a photovoltaic cell in combination with an optical waveguide that receives recharging energy from outside the hollow cathode. Alternatively, the anode-cathode gap can be remotely adjusted by a manually-turned handle connected to mechanical linkage which is connected to a jack assembly. The jack assembly converts rotational motion of the handle and mechanical linkage to linear motion of the cathode moving toward or away from the anode.
Semi-supervised and unsupervised extreme learning machines.
Huang, Gao; Song, Shiji; Gupta, Jatinder N D; Wu, Cheng
2014-12-01
Extreme learning machines (ELMs) have proven to be efficient and effective learning mechanisms for pattern classification and regression. However, ELMs are primarily applied to supervised learning problems. Only a few existing research papers have used ELMs to explore unlabeled data. In this paper, we extend ELMs for both semi-supervised and unsupervised tasks based on the manifold regularization, thus greatly expanding the applicability of ELMs. The key advantages of the proposed algorithms are as follows: 1) both the semi-supervised ELM (SS-ELM) and the unsupervised ELM (US-ELM) exhibit learning capability and computational efficiency of ELMs; 2) both algorithms naturally handle multiclass classification or multicluster clustering; and 3) both algorithms are inductive and can handle unseen data at test time directly. Moreover, it is shown in this paper that all the supervised, semi-supervised, and unsupervised ELMs can actually be put into a unified framework. This provides new perspectives for understanding the mechanism of random feature mapping, which is the key concept in ELM theory. Empirical study on a wide range of data sets demonstrates that the proposed algorithms are competitive with the state-of-the-art semi-supervised or unsupervised learning algorithms in terms of accuracy and efficiency.
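The random-hidden-layer construction that gives ELMs their speed is compact enough to sketch: hidden weights are drawn at random and only the output weights are solved for, in closed form, by least squares. This is a minimal supervised version; the SS-ELM and US-ELM of the paper add manifold regularization terms on top of it, and the toy data below are assumptions.

```python
import numpy as np

def elm_train(X, Y, n_hidden=50, seed=0):
    """Basic ELM: random input weights and biases, tanh hidden layer,
    output weights beta solved by (min-norm) least squares via pinv."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)          # random feature map
    beta = np.linalg.pinv(H) @ Y    # closed-form output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```

Because training reduces to one pseudoinverse, there is no iterative tuning of the hidden layer at all, which is the source of the efficiency the abstract refers to.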
Study of monopropellants for electrothermal thrusters: Analytical task summary report
NASA Technical Reports Server (NTRS)
Kuenzly, J. D.; Grabbi, R.
1973-01-01
The feasibility of operating small-thrust-level electrothermal thrusters with monopropellants other than MIL-grade hydrazine is determined. The work scope includes an analytical study, design and fabrication of demonstration thrusters, and an evaluation test program in which monopropellants with freezing points lower than MIL-grade hydrazine are evaluated and characterized to determine their applicability to electrothermal thrusters for spacecraft attitude control. Results of propellant chemistry studies and performance analyses indicated that the most promising candidate monopropellants to be investigated are monomethylhydrazine, Aerozine-50, a 77% hydrazine-23% hydrazine azide blend, and a TRW-formulated mixed hydrazine monopropellant (MHM) consisting of 35% hydrazine-50% monomethylhydrazine-15% ammonia.
A distributed version of the NASA Engine Performance Program
NASA Technical Reports Server (NTRS)
Cours, Jeffrey T.; Curlett, Brian P.
1993-01-01
Distributed NEPP, a version of the NASA Engine Performance Program, uses the original NEPP code but executes it in a distributed computing environment. Multiple workstations connected by a network increase the program's speed and, more importantly, the complexity of the cases it can handle in a reasonable time. Distributed NEPP uses the public-domain software package Parallel Virtual Machine (PVM), allowing it to execute on clusters of machines containing many different architectures. It includes the capability to link with other computers, allowing them to process NEPP jobs in parallel. This paper discusses the design issues and granularity considerations that entered into programming Distributed NEPP and presents the results of timing runs.
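The role PVM plays for Distributed NEPP, farming independent cases out to workers and collecting their results, can be illustrated in miniature with Python's multiprocessing. The engine "case" below is a hypothetical placeholder computation, not NEPP.

```python
from multiprocessing import Pool

def run_case(params):
    """Stand-in for one independent engine-performance case
    (a hypothetical workload, not the NEPP cycle analysis)."""
    mach, alt = params
    return mach * 1000 + alt  # placeholder computation

def run_distributed(cases, workers=4):
    """Master/worker pattern: farm independent cases out to a pool of
    worker processes and gather results in input order."""
    with Pool(workers) as pool:
        return pool.map(run_case, cases)
```

pool.map preserves input order, mirroring how a master process reassembles results from NEPP jobs that finished in parallel; PVM provides the same pattern across networked machines of different architectures rather than local processes.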
NASA Technical Reports Server (NTRS)
Muszynska, Agnes; Bently, Donald E.
1991-01-01
Perturbation techniques used for identification of rotating system dynamic characteristics are described. A comparison between two periodic frequency-swept perturbation methods applied in identification of fluid forces of rotating machines is presented. The description of the fluid force model identified by inputting circular periodic frequency-swept force is given. This model is based on the existence and strength of the circumferential flow, most often generated by the shaft rotation. The application of the fluid force model in rotor dynamic analysis is presented. It is shown that the rotor stability is an entire rotating system property. Some areas for further research are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Page, Brandi R., E-mail: bpage@wakehealth.edu; Hudson, Alana D.; Brown, Derek W.
The international growth of cancer and lack of available treatment is on course to become a global crisis. With >60% of cancer patients needing radiation therapy at some point during their treatment course, the lack of available facilities and treatment programs worldwide is extremely problematic. The number of deaths from treatable cancers is projected to increase to 11.5 million in 2030 because the international population is aging and growing. In this review, we present how best to answer the need for radiation therapy facilities from a technical standpoint. Specifically, we examine whether cobalt teletherapy machines or megavoltage linear accelerator machines are best equipped to handle the multitudes in need of radiation therapy treatment in the developing world.
Step-and-Repeat Nanoimprint-, Photo- and Laser Lithography from One Customised CNC Machine.
Greer, Andrew Im; Della-Rosa, Benoit; Khokhar, Ali Z; Gadegaard, Nikolaj
2016-12-01
The conversion of a computer numerical control machine into a nanoimprint step-and-repeat tool with additional laser- and photolithography capacity is documented here. All three processes, each demonstrated on a variety of photoresists, are performed successfully and analysed so as to enable the reader to relate their known lithography process(es) to the findings. Using the converted tool, 1 cm² of nanopattern may be exposed in 6 s, over 3300 times faster than the electron beam equivalent. Nanoimprint tools are commercially available, but these can cost around 1000 times more than this customised computer numerical control (CNC) machine. The converted equipment facilitates rapid production and large area micro- and nanoscale research on small grants, ultimately enabling faster and more diverse growth in this field of science. In comparison to commercial tools, this converted CNC also boasts capacity to handle larger substrates, temperature control and active force control, up to ten times more curing dose and compactness. Actual devices are fabricated using the machine including an expanded nanotopographic array and microfluidic PDMS Y-channel mixers.
Implementing Journaling in a Linux Shared Disk File System
NASA Technical Reports Server (NTRS)
Preslan, Kenneth W.; Barry, Andrew; Brassow, Jonathan; Cattelan, Russell; Manthei, Adam; Nygaard, Erling; VanOort, Seth; Teigland, David; Tilstra, Mike; O'Keefe, Matthew;
2000-01-01
In computer systems today, speed and responsiveness are often determined by network and storage subsystem performance. Faster, more scalable networking interfaces like Fibre Channel and Gigabit Ethernet provide the scaffolding from which higher performance computer systems implementations may be constructed, but new thinking is required about how machines interact with network-enabled storage devices. In this paper we describe how we implemented journaling in the Global File System (GFS), a shared-disk, cluster file system for Linux. Our previous three papers on GFS at the Mass Storage Symposium discussed our first three GFS implementations, their performance, and the lessons learned. Our fourth paper describes, appropriately enough, the evolution of GFS version 3 to version 4, which supports journaling and recovery from client failures. In addition, GFS scalability tests extending to 8 machines accessing 8 4-disk enclosures were conducted: these tests showed good scaling. We describe the GFS cluster infrastructure, which is necessary for proper recovery from machine and disk failures in a collection of machines sharing disks using GFS. Finally, we discuss the suitability of Linux for handling the big data requirements of supercomputing centers.
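The core journaling idea, durably logging an intended update before applying it so that recovery can replay incomplete operations, can be sketched at the level of plain files. This is a toy illustration of write-ahead logging in general, not GFS's metadata journal; the JSON record format is an assumption.

```python
import json
import os

def journaled_append(journal_path, data_path, record):
    """Write-ahead logging in miniature: durably log the intended update
    (flush + fsync) before applying it to the data file, so a crash
    between the two steps leaves a recoverable trail."""
    with open(journal_path, "a") as j:
        j.write(json.dumps(record) + "\n")
        j.flush()
        os.fsync(j.fileno())
    with open(data_path, "a") as d:
        d.write(json.dumps(record) + "\n")

def recover(journal_path, data_path):
    """Replay: re-apply any journaled records that never reached the data
    file (e.g. because of a crash after journaling but before applying)."""
    with open(journal_path) as j:
        logged = [json.loads(line) for line in j]
    applied = []
    if os.path.exists(data_path):
        with open(data_path) as d:
            applied = [json.loads(line) for line in d]
    with open(data_path, "a") as d:
        for rec in logged[len(applied):]:
            d.write(json.dumps(rec) + "\n")
    return logged
```

A real file system journals metadata blocks and must also handle torn writes and multi-client ordering, but the invariant is the same: nothing reaches the data structure before it is safely in the log.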
Using microwave Doppler radar in automated manufacturing applications
NASA Astrophysics Data System (ADS)
Smith, Gregory C.
Since the beginning of the Industrial Revolution, manufacturers worldwide have used automation to improve productivity, gain market share, and meet growing or changing consumer demand for manufactured products. To stimulate further industrial productivity, manufacturers need more advanced automation technologies: "smart" part handling systems, automated assembly machines, CNC machine tools, and industrial robots that use new sensor technologies, advanced control systems, and intelligent decision-making algorithms to "see," "hear," "feel," and "think" at the levels needed to handle complex manufacturing tasks without human intervention. The investigator's dissertation offers three methods that could help make "smart" CNC machine tools and industrial robots possible: (1) A method for detecting acoustic emission using a microwave Doppler radar detector, (2) A method for detecting tool wear on a CNC lathe using a Doppler radar detector, and (3) An online non-contact method for detecting industrial robot position errors using a microwave Doppler radar motion detector. The dissertation studies indicate that microwave Doppler radar could be quite useful in automated manufacturing applications. In particular, the methods developed may help solve two difficult problems that hinder further progress in automating manufacturing processes: (1) Automating metal-cutting operations on CNC machine tools by providing a reliable non-contact method for detecting tool wear, and (2) Fully automating robotic manufacturing tasks by providing a reliable low-cost non-contact method for detecting on-line position errors. In addition, the studies offer a general non-contact method for detecting acoustic emission that may be useful in many other manufacturing and non-manufacturing areas, as well (e.g., monitoring and nondestructively testing structures, materials, manufacturing processes, and devices). 
By advancing the state of the art in manufacturing automation, the studies may help stimulate future growth in industrial productivity, which also promises to fuel economic growth and promote economic stability. The study also benefits the Department of Industrial Technology at Iowa State University and the field of Industrial Technology by contributing to the ongoing "smart" machine research program within the Department of Industrial Technology and by stimulating research into new sensor technologies within the University and within the field of Industrial Technology.
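The quantity a continuous-wave microwave Doppler detector measures is the frequency shift of the return, f_d = 2·v·f₀/c for a target closing at radial speed v. A minimal sketch follows; the 10.525 GHz carrier used in the example is a common module frequency assumed here, not taken from the dissertation.

```python
def doppler_shift(velocity_mps, carrier_hz, c=3.0e8):
    """Doppler shift (Hz) of a CW radar return from a target moving
    radially at velocity_mps toward the detector: f_d = 2 * v * f0 / c."""
    return 2.0 * velocity_mps * carrier_hz / c
```

At 10.525 GHz, a surface moving at 1 m/s produces a shift of roughly 70 Hz, well within the audio band, which is part of what makes such detectors practical for sensing vibration and motion in machining environments.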
NASA Astrophysics Data System (ADS)
Sinha, Sumit; Rode, Michael; Kumar, Rohini; Yang, Xiaoqiang; Samaniego, Luis; Borchardt, Dietrich
2016-04-01
Precise measurement of where, when and how much denitrification occurs remains a vexing and intractable research problem at all spatial and temporal scales when based on observations alone. As a result, models have become essential and vital tools for furthering our current understanding of the processes that control denitrification at the catchment scale. Implementation of the Water Framework Directive (WFD) and continued efforts in improving water treatment facilities have alleviated the problems associated with point sources of pollution. However, the problem of eutrophication still persists and is primarily associated with diffuse sources of pollution originating from agricultural areas. In this study, nitrate transport and reaction (NTR) routines are developed inside the distributed mesoscale Hydrological Model (mHM, www.ufz.de/mhm), a fully distributed hydrological model with a novel parameter regionalization scheme (Samaniego et al. 2010; Kumar et al. 2013) that has been applied to the whole of Europe (Rakovec et al. 2016) and to numerous catchments worldwide. The NTR model is applied to a mesoscale river basin, the Selke (463 km²), located in central Germany. The NTR model takes into account critical processes such as transformation in the vadose zone, atmospheric deposition, plant uptake and in-stream denitrification, and also simulates manure and fertilizer application. Both the streamflow routines and the NTR model are run at a daily time step. The split-sample approach was used for model calibration (1994-1999) and validation (2000-2004). Flow dynamics at three gauging stations located inside this catchment are successfully captured by the model, with a consistently high Nash-Sutcliffe Efficiency (NSE) of at least 0.8. For the nitrate estimates, the NSE values are greater than 0.7 for both the calibration and validation periods.
Finally, the NTR model is used to identify the critical source areas (CSAs) that contribute significantly to nutrient pollution under different local hydrological and topographical conditions. A comprehensive sensitivity analysis and further regionalization of key parameters of the NTR model are also investigated. References: Kumar, R., L. Samaniego, and S. Attinger (2013), Implications of distributed hydrologic model parameterization on water fluxes at multiple scales and locations, Water Resour. Res., 49, 360-379, doi:10.1029/2012WR012195. Samaniego, L., R. Kumar, and S. Attinger (2010), Multiscale parameter regionalization of a grid-based hydrologic model at the mesoscale, Water Resour. Res., 46, W05523, doi:10.1029/2008WR007327. Rakovec, O., Kumar, R., Mai, J., Cuntz, M., Thober, S., Zink, M., Attinger, S., Schäfer, D., Schrön, M., Samaniego, L. (2016), Multiscale and multivariate evaluation of water fluxes and states over European river basins, J. Hydrometeorol., 17, 287-307, doi:10.1175/JHM-D-15-0054.1.
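The model skill metric reported for this study, the Nash-Sutcliffe Efficiency, has a short closed form, NSE = 1 − Σ(obs−sim)² / Σ(obs−mean(obs))². A minimal sketch:

```python
def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe Efficiency of simulated vs observed values:
    1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den
```

NSE equals 1 for a perfect fit and 0 when the simulation is no better than predicting the observed mean, so the values of 0.8 for flow and 0.7 for nitrate reported above indicate substantial explanatory skill.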
Early experiences in developing and managing the neuroscience gateway.
Sivagnanam, Subhashini; Majumdar, Amit; Yoshimoto, Kenneth; Astakhov, Vadim; Bandrowski, Anita; Martone, MaryAnn; Carnevale, Nicholas T
2015-02-01
The last few decades have seen the emergence of computational neuroscience as a mature field in which researchers are interested in modeling complex and large neuronal systems and require access to high performance computing machines and the associated cyberinfrastructure to manage computational workflows and data. The neuronal simulation tools used in this research field are also implemented for parallel computers and are suitable for high performance computing machines. But using these tools on complex high performance computing machines remains a challenge because of issues with acquiring computer time on machines located at national supercomputer centers, dealing with the complex user interfaces of these machines, and handling data management and retrieval. The Neuroscience Gateway is being developed to alleviate and/or hide these barriers to entry for computational neuroscientists. It hides or eliminates, from the point of view of the users, all the administrative and technical barriers and makes parallel neuronal simulation tools easily available and accessible on complex high performance computing machines. It handles the running of jobs and data management and retrieval. This paper shares the early experiences in bringing up this gateway and describes the software architecture it is based on, how it is implemented, and how users can use it for computational neuroscience research using high performance computing at the back end. We also look at the parallel scaling of some publicly available neuronal models and analyze recent usage data of the neuroscience gateway.
Modeling Medical Ethics through Intelligent Agents
NASA Astrophysics Data System (ADS)
Machado, José; Miranda, Miguel; Abelha, António; Neves, José; Neves, João
The amount of research using health information has increased dramatically over the past few years. Indeed, a significant number of healthcare institutions have extensive Electronic Health Records (EHR), collected over several years for clinical and teaching purposes, but are uncertain as to the proper circumstances in which to use them to improve the delivery of care to those in need. Research Ethics Boards in Portugal and elsewhere in the world are grappling with these issues, but lack clear guidance regarding their role in the creation of and access to EHRs. However, we believe Medical Ethics can be handled effectively if we approach the problem in a structured and more rational way. Indeed, we observed that physicians were not aware of the relevance of the subject in their pre-clinical years, but their interest increased once they were exposed to patients. On the other hand, once EHRs are stored in machines, we also had to find a way to ensure that the behavior of machines toward human users, and perhaps other machines as well, is ethically acceptable. Therefore, in this article we discuss the importance of machine ethics and the need for machines that represent ethical principles explicitly. It is also shown how a machine may abstract an ethical principle from a logical representation of ethical judgments and use that principle to guide its own behavior.
Prediction of mortality after radical cystectomy for bladder cancer by machine learning techniques.
Wang, Guanjin; Lam, Kin-Man; Deng, Zhaohong; Choi, Kup-Sze
2015-08-01
Bladder cancer is a common genitourinary malignancy. For muscle-invasive bladder cancer, surgical removal of the bladder, i.e. radical cystectomy, is in general the definitive treatment, which, unfortunately, carries significant morbidity and mortality. Accurate prediction of the mortality of radical cystectomy is therefore needed. Statistical methods have conventionally been used for this purpose, despite the complex interactions of high-dimensional medical data. Machine learning has emerged as a promising technique for handling high-dimensional data, with increasing application in clinical decision support, e.g. cancer prediction and prognosis. Its ability to reveal hidden nonlinear interactions and interpretable rules between dependent and independent variables is favorable for constructing models with good generalization performance. In this paper, seven machine learning methods are used to predict the 5-year mortality of radical cystectomy: back-propagation neural network (BPN), radial basis function network (RBFN), extreme learning machine (ELM), regularized ELM (RELM), support vector machine (SVM), naive Bayes (NB) classifier and k-nearest neighbour (KNN), on a clinicopathological dataset of 117 patients from the urology unit of a hospital in Hong Kong. The experimental results indicate that RELM achieved the highest average prediction accuracy of 0.8 at a fast learning speed. The findings demonstrate the potential of applying machine learning techniques to support clinical decision making. Copyright © 2015 Elsevier Ltd. All rights reserved.
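One of the seven methods named in this abstract, the naive Bayes classifier, is simple enough to sketch from scratch. The features, labels, and query below are invented for illustration and are not the paper's clinicopathological data:

```python
import math
from collections import defaultdict

def fit_gnb(X, y):
    """Gaussian naive Bayes: per-class feature means/variances plus class priors."""
    groups = defaultdict(list)
    for row, label in zip(X, y):
        groups[label].append(row)
    model = {}
    for label, rows in groups.items():
        cols = list(zip(*rows))
        means = [sum(c) / len(c) for c in cols]
        varis = [sum((v - m) ** 2 for v in c) / len(c) + 1e-9  # smoothed variance
                 for c, m in zip(cols, means)]
        model[label] = (means, varis, len(rows) / len(X))
    return model

def predict_gnb(model, x):
    """Pick the class with the highest log-posterior under the Gaussian model."""
    def log_post(means, varis, prior):
        ll = math.log(prior)
        for v, m, s2 in zip(x, means, varis):
            ll += -0.5 * math.log(2 * math.pi * s2) - (v - m) ** 2 / (2 * s2)
        return ll
    return max(model, key=lambda label: log_post(*model[label]))

# Hypothetical features (age, tumour stage) and 5-year outcomes (0/1).
X = [[55, 1], [60, 2], [58, 1], [75, 3], [80, 4], [78, 3]]
y = [0, 0, 0, 1, 1, 1]
model = fit_gnb(X, y)
print(predict_gnb(model, [77, 3]))  # 1
```

A real study would add cross-validation and compare all seven models on held-out data, as the paper does.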
Numerical Control/Computer Aided Manufacturing (NC/CAM), A Descom Study
1979-07-01
CAM machines operate directly from computers, but most get instructions in the form of punched tape. The applications of NC/CAM are virtually... Although most NC/CAM equipment is metal working, its applications include electronics manufacturing, glass making, food processing, materiel handling... drafting, woodworking, plastics and inspection, just to name a few. Numerical control, like most technologies, is an advancing and evolutionary process
Static Aeroelasticity in Combat Aircraft.
1986-01-01
stiffness scaled beam machined along a predicted elastic axis, and load isolation cuts forward and aft of the beam, has proved to be most successful... aircraft components. Many papers deal with the activities in the field of structural optimization. Using fiber composites, a new design technique... Supersonic Design, Composite Structures, Fly-by-Wire, Thin Profiles, Aeroelastic Tailoring, Unstable Aircraft, Variable Camber, Light Weight, Pilot Handling
ERIC Educational Resources Information Center
Klerfelt, Anna
2004-01-01
Children today live in different cultural settings. The pre-school culture is one of them and the media culture outside the pre-school another. These cultures are in different ways characterised by opposite and often even conflicting traditions. This article shows how educators and children handle this dilemma by using interaction as a tool to…
The development of an inert simulant for HNS/teflon explosive
NASA Technical Reports Server (NTRS)
Elban, W. L.
1972-01-01
The report describes the development and evaluation of an inert simulant for the thermally stable, heat-resistant plastic-bonded explosive HNS/Teflon. The simulant is made by dry blending vinylidene fluoride, melamine and Teflon; its pressed density and thermal properties correspond closely to those of the explosive. In addition, the machinability and handling characteristics of the simulant are similar to those of the explosive.
Securing the Northern Maritime Border Through Maritime Domain Awareness
2010-09-01
is handled through 1) aircraft overflights (Coast Guard and Coast Guard Auxiliary aircraft, U.S. Air Force, Customs and Border Patrol, Canadian... caliber machine gun, or like automatic weapons. Bruce Levy, Director, U.S. Transboundary Division, conveyed to Nancy Mason, Director, Office of... 1 NORAD, the North American Aerospace Defense Command, is a binational military command focused on the air defense of North America and located
Uncertainty and Risk Management in Cyber Situational Awareness
NASA Astrophysics Data System (ADS)
Li, Jason; Ou, Xinming; Rajagopalan, Raj
Handling cyber threats unavoidably requires dealing with both uncertain and imprecise information. What we can observe as potential malicious activity can seldom give us 100% confidence on the important questions we care about, e.g. which machines are compromised and what damage has been incurred. In security planning, we need information on how likely a vulnerability is to lead to a successful compromise, to better balance security against functionality, performance, and ease of use. This information is at best qualitative and is often vague and imprecise. In cyber situational awareness, we have to rely on such imperfect information to detect real attacks and to prevent an attack from happening through appropriate risk management. This chapter surveys existing technologies for handling uncertainty and risk management in cyber situational awareness.
Computational dynamics of soft machines
NASA Astrophysics Data System (ADS)
Hu, Haiyan; Tian, Qiang; Liu, Cheng
2017-06-01
Soft machine refers to a kind of mechanical system made of soft materials to complete sophisticated missions, such as handling a fragile object and crawling along a narrow tunnel corner, under low cost control and actuation. Hence, soft machines have raised great challenges to computational dynamics. In this review article, recent studies of the authors on the dynamic modeling, numerical simulation, and experimental validation of soft machines are summarized in the framework of multibody system dynamics. The dynamic modeling approaches are presented first for the geometric nonlinearities of coupled overall motions and large deformations of a soft component, the physical nonlinearities of a soft component made of hyperelastic or elastoplastic materials, and the frictional contacts/impacts of soft components, respectively. Then the computation approach is outlined for the dynamic simulation of soft machines governed by a set of differential-algebraic equations of very high dimensions, with an emphasis on the efficient computations of the nonlinear elastic force vector of finite elements. The validations of the proposed approaches are given via three case studies, including the locomotion of a soft quadrupedal robot, the spinning deployment of a solar sail of a spacecraft, and the deployment of a mesh reflector of a satellite antenna, as well as the corresponding experimental studies. Finally, some remarks are made for future studies.
Nguyen, Huu-Tho; Md Dawal, Siti Zawiah; Nukman, Yusoff; Aoyama, Hideki; Case, Keith
2015-01-01
Globalization of business and competitiveness in manufacturing has forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision based on imprecise and vague information, and plays a major role in improving productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach for decision-making in machine tool selection. This paper focuses on the integration of a consistent fuzzy AHP (Analytic Hierarchy Process) and a fuzzy COmplex PRoportional ASsessment (COPRAS) for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic reference relation is integrated into the AHP to handle the imprecise and vague information and to simplify data collection for the pair-wise comparison matrix of the AHP, which determines the weights of the attributes. The output of the fuzzy AHP is imported into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. The application of the proposed model is illustrated with a numerical example based on data collected by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and the fuzzy COPRAS as a precise tool and provide effective multi-attribute decision-making for evaluating machine tools in an uncertain environment.
Minicomputer front end. [Modcomp II/CP as buffer between CDC 6600 and PDP-9 at graphics stations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hudson, J.A.
1976-01-01
Sandia Labs developed an Interactive Graphics System (SIGS) that was established on a CDC 6600 using a communication scheme based on the Control Data Corporation product IGS. As implemented at Sandia, the graphics station consists primarily of a PDP-9 with a Vector General display. A system is being developed which uses a minicomputer (Modcomp II/CP) as the buffer machine for the graphics stations. The original SIGS required a dedicated peripheral processor (PP) on the CDC 6600 to handle the communication with the stations; however, with the Modcomp handling the actual communication protocol, the PP is only assigned as needed to handle data transfer within the CDC 6600 portion of SIGS. The new system will thus support additional graphics stations with less impact on the CDC 6600. This paper discusses the design philosophy of the system, and the hardware and software used to implement it. 1 figure.
Automated packing systems: review of industrial implementations
NASA Astrophysics Data System (ADS)
Whelan, Paul F.; Batchelor, Bruce G.
1993-08-01
A rich theoretical background to the problems that occur in the automation of material handling can be found in the operations research, production engineering, systems engineering and automation (more specifically, machine vision) literature. This work has contributed towards the design of intelligent handling systems. This paper reviews the application of these automated material handling and packing techniques to industrial problems. The discussion also highlights the systems integration issues involved in these applications. An outline of one such industrial application, the automated placement of shape templates on to leather hides, is also given. The purpose of this system is to arrange shape templates on a leather hide in an efficient manner, so as to minimize leather waste, before the pieces are automatically cut from the hide. These pieces are used in the furniture and car manufacturing industries for the upholstery of high-quality leather chairs and car seats. Currently this type of operation is semi-automated. The paper outlines the problems involved in the full automation of such a procedure.
Improving Memory Error Handling Using Linux
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carlton, Michael Andrew; Blanchard, Sean P.; Debardeleben, Nathan A.
As supercomputers continue to get faster and more powerful in the future, they will also have more nodes. If nothing is done, the amount of memory in supercomputer clusters will soon grow so large that memory failures will be unmanageable if handled by manually replacing memory DIMMs. "Improving Memory Error Handling Using Linux" is a process-oriented method to solve this problem by using the Linux kernel to disable (offline) faulty memory pages containing bad addresses, preventing them from being used again by a process. Offlining memory pages simplifies error handling and reduces both the hardware and manpower costs required to run Los Alamos National Laboratory (LANL) clusters. This process will be necessary for the future of supercomputing to allow the development of exascale computers. It will not be feasible without memory error handling to manually replace the number of DIMMs that will fail daily on a machine consisting of 32-128 petabytes of memory. Testing reveals that the process of offlining memory pages works and is relatively simple to use. As more and more testing is conducted, the entire process will be automated within the high-performance computing (HPC) monitoring software, Zenoss, at LANL.
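The offlining mechanism described here is exposed through the Linux memory-hotplug sysfs interface, where each memory block has a writable `state` file. A minimal sketch, assuming the standard `/sys/devices/system/memory` layout; block numbering varies by machine, and the actual write requires root on a real system:

```python
import os

SYSFS_MEMORY = "/sys/devices/system/memory"

def state_path(block):
    """Return the sysfs 'state' file for a given memory block number."""
    return os.path.join(SYSFS_MEMORY, f"memory{block}", "state")

def offline_block(block):
    """Ask the kernel to offline one memory block (needs root; may fail
    with EBUSY if the block's pages cannot be migrated away)."""
    with open(state_path(block), "w") as f:
        f.write("offline")

# Only the path construction is exercised here; offline_block() is not
# called because it would modify the running system.
print(state_path(42))  # /sys/devices/system/memory/memory42/state
```

A monitoring daemon (such as the Zenoss integration mentioned above) would call `offline_block` for blocks whose DIMMs report correctable-error storms.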
Using container orchestration to improve service management at the RAL Tier-1
NASA Astrophysics Data System (ADS)
Lahiff, Andrew; Collier, Ian
2017-10-01
In recent years container orchestration has been emerging as a means of gaining many potential benefits compared to a traditional static infrastructure, such as increased utilisation through multi-tenancy, improved availability due to self-healing, and the ability to handle changing loads due to elasticity and auto-scaling. To this end we have been investigating migrating services at the RAL Tier-1 to an Apache Mesos cluster. In this model the concept of individual machines is abstracted away and services are run in containers on a cluster of machines, managed by schedulers, enabling a high degree of automation. Here we describe Mesos, the infrastructure deployed at RAL, and describe in detail the explicit example of running a batch farm on Mesos.
Automatic EEG artifact removal: a weighted support vector machine approach with error correction.
Shao, Shi-Yun; Shen, Kai-Quan; Ong, Chong Jin; Wilder-Smith, Einar P V; Li, Xiao-Ping
2009-02-01
An automatic electroencephalogram (EEG) artifact removal method is presented in this paper. Compared to past methods, it has two unique features: 1) a weighted version of support vector machine formulation that handles the inherent unbalanced nature of component classification and 2) the ability to accommodate structural information typically found in component classification. The advantages of the proposed method are demonstrated on real-life EEG recordings with comparisons made to several benchmark methods. Results show that the proposed method is preferable to the other methods in the context of artifact removal by achieving a better tradeoff between removing artifacts and preserving inherent brain activities. Qualitative evaluation of the reconstructed EEG epochs also demonstrates that after artifact removal inherent brain activities are largely preserved.
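The first feature, a class-weighted SVM, can be sketched from scratch: the hinge loss of each training sample is scaled by a per-class weight so that errors on the rare class cost more. The toy data and weights below are illustrative, and the paper's second feature (structural information) is omitted:

```python
def train_weighted_svm(X, y, class_weight, lr=0.01, C=1.0, epochs=500):
    """Linear SVM trained by batch subgradient descent on the weighted hinge loss."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        gw = list(w)            # gradient of the 0.5*||w||^2 regulariser
        gb = 0.0
        for xi, yi in zip(X, y):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if margin < 1:      # hinge-loss violator, weighted by its class
                cw = C * class_weight[yi]
                gw = [g - cw * yi * xj for g, xj in zip(gw, xi)]
                gb -= cw * yi
        w = [wj - lr * g for wj, g in zip(w, gw)]
        b -= lr * gb
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Imbalanced toy data: few artifact components (+1), many neural ones (-1).
X = [[2, 2], [3, 3], [2, 3],
     [-2, -2], [-3, -1], [-1, -3], [-2, -1], [-3, -3], [-1, -2], [-2, -3]]
y = [1, 1, 1, -1, -1, -1, -1, -1, -1, -1]
w, b = train_weighted_svm(X, y, class_weight={1: 7 / 3, -1: 1.0})
print([predict(w, b, xi) for xi in X[:3]])  # [1, 1, 1]
```

Weighting the minority class by the inverse class ratio (here 7/3) is one common choice; the paper tunes this trade-off to balance artifact removal against preserving brain activity.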
NASA Astrophysics Data System (ADS)
Lemanzyk, Thomas; Anding, Katharina; Linss, Gerhard; Rodriguez Hernández, Jorge; Theska, René
2015-02-01
The following paper deals with the classification of seeds and seed components of the South American Incanut plant and the modification of a machine to handle this task. First, the state of the art is illustrated. The research was carried out in Germany, with substantial parts in Peru and Ecuador. Theoretical considerations for an automatic analysis of the Incanut seeds are specified. The optimization of the analysis software and of the separation unit of the mechanical hardware is carried out using the recognition results. In a final step, the practical analysis of the Incanut seeds is applied on a trial basis and rated on the basis of statistical values.
Frutos, M; Méndez, M; Tohmé, F; Broz, D
2013-01-01
Many of the problems that arise in production systems can be handled with multiobjective techniques. One of those problems is that of scheduling operations subject to constraints on the availability of machines and buffer capacity. In this paper we analyze different multiobjective evolutionary algorithms (MOEAs) for this kind of problem. We consider an experimental framework in which we schedule production operations for four real-world job-shop contexts using three algorithms: NSGA-II, SPEA2, and IBEA. Using two performance indexes, hypervolume and R2, we found that SPEA2 and IBEA are the most efficient for the tasks at hand. On the other hand, IBEA seems to be the better choice of tool since it yields more solutions in the approximate Pareto frontier.
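Of the two performance indexes, hypervolume is easy to sketch in two dimensions: it is the area dominated by the approximate Pareto front, bounded by a reference point (minimization is assumed, and the front below is made up):

```python
def hypervolume_2d(points, ref):
    """Area dominated by a set of points (minimisation), bounded by ref."""
    pts = sorted(p for p in points if p[0] < ref[0] and p[1] < ref[1])
    front = []
    for p in pts:                      # keep the non-dominated staircase
        if not front or p[1] < front[-1][1]:
            front.append(p)
    area, prev_f2 = 0.0, ref[1]
    for f1, f2 in front:               # sum the disjoint rectangles
        area += (ref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return area

# Hypothetical bi-objective front (e.g. makespan vs. total tardiness).
pareto = [(1, 3), (2, 2), (3, 1)]
print(hypervolume_2d(pareto, (4, 4)))  # 6.0
```

A larger hypervolume means the algorithm's front both converges closer to the true Pareto frontier and spreads more widely along it, which is why the index is used to compare NSGA-II, SPEA2, and IBEA.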
Growth and characterization of III-V epitaxial films
NASA Astrophysics Data System (ADS)
Tripathi, A.; Adamski, J.
1991-11-01
Investigations were conducted on the growth of epitaxial layers of selected III-V materials, which are potentially useful for photonics and microwave devices, using an organometallic chemical vapor deposition technique. RL/ERX's MOCVD machine was leak checked for safety. The whole gas handling plumbing system was leak checked, and the problems were reported to the manufacturer, CVD Equipment Corporation of Deer Park, NY. CVD Equipment Corporation is making an effort to correct these problems and also to supply the part according to our redesign specifications. One of the main emphases during this contract period was understanding the operating procedure and writing an operating manual for this MOCVD machine. To study the dynamic fluid flow in the vertical reactor of this MOCVD machine, an experimental apparatus was designed, tested, and put together. This study gave very important information on the turbulent gas flow patterns in this vertical reactor. The turbulent flow affects the epitaxial growth adversely. This study will also help in redesigning the vertical reactor so that the turbulent gas flow can be eliminated.
The precision measurement and assembly for miniature parts based on double machine vision systems
NASA Astrophysics Data System (ADS)
Wang, X. D.; Zhang, L. F.; Xin, M. Z.; Qu, Y. Q.; Luo, Y.; Ma, T. M.; Chen, L.
2015-02-01
In the assembly of miniature parts, structural features on the bottom or side of the parts often need to be aligned and positioned. General assembly equipment integrating a single vertical, downward-looking machine vision system cannot satisfy this requirement. Precision automatic assembly equipment integrating two machine vision systems was therefore developed. In this system, a horizontal vision system is employed to measure the position of feature structures in the parts' side view, which cannot be seen by the vertical one. The position measured by the horizontal camera is converted to the vertical vision system using calibration information. With careful calibration, the parts' alignment and positioning during the assembly process can be guaranteed. The developed assembly equipment is easy to implement, modular, and cost-effective. The handling of the miniature parts and the assembly procedure are briefly introduced. The calibration procedure is given, and the assembly error is analyzed for compensation.
Transient analysis of an IVHM grapple impact test
NASA Technical Reports Server (NTRS)
Hill, R. G.
1972-01-01
A lumped mass model was used to represent the impact condition between a fuel duct and an in-vessel fuel handling machine (IVHM). The nonlinear effects of a Belleville spring and the free fall impact of the fuel duct on the IVHM were included. The purpose of the tests was to determine the loads on the fuel duct due to the impact. A comparison between experimental and theoretical results is presented.
Bayesian Logic Programs for Plan Recognition and Machine Reading
2012-12-01
models is that they can handle both uncertainty and structured/relational data. As a result, they are widely used in domains like social network analysis, biological data analysis, and natural language processing. ...the Story Understanding data set. (b) The logical representation of the observations. (c) The set of ground rules obtained from logical abduction
Resource Management in Constrained Dynamic Situations
NASA Astrophysics Data System (ADS)
Seok, Jinwoo
Resource management is considered in this dissertation for systems with limited resources, possibly combined with other system constraints, in unpredictably dynamic environments. Resources may represent fuel, power, capabilities, energy, and so on. Resource management is important for many practical systems; usually, resources are limited, and their use must be optimized. Furthermore, systems are often constrained, and constraints must be satisfied for safe operation. Simplistic resource management can result in poor use of resources and failure of the system. Furthermore, many real-world situations involve dynamic environments. Many traditional problems are formulated based on the assumptions of given probabilities or perfect knowledge of future events. However, in many cases, the future is completely unknown, and information on or probabilities about future events are not available. In other words, we operate in unpredictably dynamic situations. Thus, a method is needed to handle dynamic situations without knowledge of the future, but few formal methods have been developed to address them. Thus, the goal is to design resource management methods for constrained systems, with limited resources, in unpredictably dynamic environments. To this end, resource management is organized hierarchically into two levels: 1) planning, and 2) control. In the planning level, the set of tasks to be performed is scheduled based on limited resources to maximize resource usage in unpredictably dynamic environments. In the control level, the system controller is designed to follow the schedule by considering all the system constraints for safe and efficient operation. Consequently, this dissertation is mainly divided into two parts: 1) planning level design, based on finite state machines, and 2) control level methods, based on model predictive control. 
We define a recomposable restricted finite state machine to handle limited resource situations and unpredictably dynamic environments at the planning level. To obtain a policy, dynamic programming is applied, and to obtain a solution, limited breadth-first search is applied to the recomposable restricted finite state machine. A multi-function phased array radar resource management problem and an unmanned aerial vehicle patrolling problem are treated using recomposable restricted finite state machines. Then, we use model predictive control for the control level, because it allows constraint handling and setpoint tracking for the schedule. An aircraft power system management problem is treated that aims to develop an integrated control system for an aircraft gas turbine engine and electrical power system using rate-based model predictive control. Our results indicate that at the planning level, limited breadth-first search for recomposable restricted finite state machines generates good scheduling solutions in limited resource situations and unpredictably dynamic environments. The importance of cooperation at the planning level is also verified. At the control level, a rate-based model predictive controller allows good schedule tracking and safe operation. The importance of considering the system constraints and the interactions between the subsystems is indicated. For the best resource management in constrained dynamic situations, the planning level and the control level need to be considered together.
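The planning-level search can be illustrated on a toy finite state machine: breadth-first search, bounded by a depth limit, returns the shortest action sequence reaching a goal state. The transition table below is hypothetical and far simpler than the recomposable restricted machines used in the dissertation:

```python
from collections import deque

# Hypothetical transition table: state -> {action: next_state}.
FSM = {
    "idle":       {"search": "searching"},
    "searching":  {"detect": "confirming", "abort": "idle"},
    "confirming": {"confirm": "tracking", "reject": "searching"},
    "tracking":   {"drop": "idle"},
}

def plan(fsm, start, goal, max_depth=10):
    """Limited breadth-first search: shortest action sequence, bounded depth."""
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        state, actions = queue.popleft()
        if state == goal:
            return actions
        if len(actions) >= max_depth:   # the "limited" part of the search
            continue
        for action, nxt in fsm.get(state, {}).items():
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, actions + [action]))
    return None  # goal unreachable within the depth limit

print(plan(FSM, "idle", "tracking"))  # ['search', 'detect', 'confirm']
```

In the dissertation's setting, states would also carry resource levels (fuel, power), and the machine would be recomposed as tasks arrive or disappear in the unpredictably dynamic environment.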
Creating Situational Awareness in Spacecraft Operations with the Machine Learning Approach
NASA Astrophysics Data System (ADS)
Li, Z.
2016-09-01
This paper presents a machine learning approach to the situational awareness capability in spacecraft operations. There are two types of time-dependent data patterns for spacecraft datasets: the absolute time pattern (ATP) and the relative time pattern (RTP). The machine learning captures the data patterns of the satellite datasets through data training during normal operations, represented by a time-dependent trend. The data monitoring compares the values of the incoming data with the predictions of the machine learning algorithm, which can detect any meaningful change to a dataset above the noise level. If the difference between the value of incoming telemetry and the machine learning prediction is larger than the threshold defined by the standard deviation of the dataset, it could indicate a potential anomaly that needs special attention. The application of the machine learning approach to the Advanced Himawari Imager (AHI) on the Japanese Himawari spacecraft series is presented; the AHI has the same configuration as the Advanced Baseline Imager (ABI) on the Geostationary Operational Environmental Satellite (GOES)-R series. The time-dependent trends generated by the data training algorithm are in excellent agreement with the datasets. The standard deviation in the time-dependent trend provides a metric for measuring data quality, which is particularly useful in evaluating detector quality for both AHI and ABI, with multiple detectors in each channel. The machine learning approach creates the situational awareness capability and enables engineers to handle a data volume that would have been impossible with the existing approach, leading to significantly more dynamic, proactive, and autonomous spacecraft operations.
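The monitoring rule described above reduces to a simple threshold test: flag an incoming sample when it deviates from the learned trend by more than k standard deviations. A minimal sketch with made-up telemetry, using the mean of the training window as a stand-in for the time-dependent trend:

```python
from statistics import mean, stdev

def flag_anomalies(history, incoming, k=3.0):
    """Flag incoming samples that deviate from the trend learned on
    normal-operations history by more than k standard deviations."""
    trend = mean(history)     # stand-in for the time-dependent trend prediction
    sigma = stdev(history)    # noise level of the dataset
    return [abs(x - trend) > k * sigma for x in incoming]

# Hypothetical detector-temperature telemetry during normal operations.
history = [20.0, 20.5, 19.5, 20.2, 19.8, 20.1, 19.9, 20.3]
print(flag_anomalies(history, [20.4, 27.0, 19.7]))  # [False, True, False]
```

A real implementation would replace the constant mean with the ATP/RTP trend prediction for each timestamp, keeping the same deviation test.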
Color line scan camera technology and machine vision: requirements to consider
NASA Astrophysics Data System (ADS)
Paernaenen, Pekka H. T.
1997-08-01
Color machine vision has shown a dynamic uptrend in use within the past few years, as the introduction of new cameras and scanner technologies underscores. In the future, the movement from monochrome imaging to color will hasten, as machine vision system users demand more knowledge about their product stream. As color has come to machine vision, certain requirements have arisen for the equipment used to digitize color images. Color machine vision needs not only good color separation but also a high dynamic range and a good linear response from the camera used. The importance of these features becomes even greater when the image is converted to another color space, since some information is always lost when converting integer data to another form. Traditionally, color image processing has been a much slower technique than gray-level image processing because of the three times greater data volume per image; the same applies to the three times greater memory required. Advancements in computers, memory and processing units have made it possible to handle even large color images cost-efficiently today. In some cases the analysis of color images can in fact be easier and faster than that of a similar gray-level image because of the greater information content per pixel. Color machine vision sets new requirements for lighting, too. High-intensity, white light is required in order to acquire good images for further image processing or analysis. New developments in lighting technology will eventually bring solutions for color imaging.
An evaluation of consensus techniques for diagnostic interpretation
NASA Astrophysics Data System (ADS)
Sauter, Jake N.; LaBarre, Victoria M.; Furst, Jacob D.; Raicu, Daniela S.
2018-02-01
Learning diagnostic labels from image content has been the standard in computer-aided diagnosis. Most computer-aided diagnosis systems use low-level image features extracted directly from image content to train and test machine learning classifiers for diagnostic label prediction. When the ground truth for the diagnostic labels is not available, a reference truth is generated from the experts' diagnostic interpretations of the image/region of interest. More specifically, when the label is uncertain, e.g. when multiple experts label an image and their interpretations differ, techniques to handle the label variability are necessary. In this paper, we compare three consensus techniques that are typically used to encode the variability in the experts' labeling of medical data, mean, median and mode, and their effects on simple classifiers that can handle deterministic labels (decision trees) and probabilistic vectors of labels (belief decision trees). Given that the NIH/NCI Lung Image Database Consortium (LIDC) data provides interpretations of lung nodules by up to four radiologists, we leverage the LIDC data to evaluate and compare these consensus approaches when creating computer-aided diagnosis systems for lung nodules. First, low-level image features of nodules are extracted and paired with their radiologists' semantic ratings (1 = most likely benign, ..., 5 = most likely malignant); second, machine learning multi-class classifiers that handle deterministic labels (decision trees) and probabilistic vectors of labels (belief decision trees) are built to predict the lung nodules' semantic ratings. We show that the mean-based consensus generates the most robust classifier overall when compared to the median- and mode-based consensus. Lastly, the results of this study show that, when building CAD systems with uncertain diagnostic interpretations, it is important to evaluate different strategies for encoding and predicting the diagnostic label.
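The three consensus techniques compared in the paper reduce each nodule's set of ratings to a single label. A minimal sketch with hypothetical ratings from four radiologists:

```python
from statistics import mean, median, mode

# Hypothetical semantic ratings (1..5) of one nodule by four radiologists.
ratings = [2, 3, 3, 5]

consensus = {
    "mean": round(mean(ratings)),  # rounded back to the nearest rating
    "median": median(ratings),
    "mode": mode(ratings),
}
print(consensus)  # {'mean': 3, 'median': 3.0, 'mode': 3}
```

Here all three techniques agree; the interesting cases are skewed rating sets (e.g. [1, 1, 1, 5]), where the mean, median, and mode diverge and lead the downstream classifiers to different reference labels.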
Cario, Clinton L; Witte, John S
2018-03-15
As whole-genome tumor sequence and biological annotation datasets grow in size, number and content, there is an increasing basic science and clinical need for efficient and accurate data management and analysis software. With the emergence of increasingly sophisticated data stores, execution environments and machine learning algorithms, there is also a need for the integration of functionality across frameworks. We present orchid, a python based software package for the management, annotation and machine learning of cancer mutations. Building on technologies of parallel workflow execution, in-memory database storage and machine learning analytics, orchid efficiently handles millions of mutations and hundreds of features in an easy-to-use manner. We describe the implementation of orchid and demonstrate its ability to distinguish tissue of origin in 12 tumor types based on 339 features using a random forest classifier. Orchid and our annotated tumor mutation database are freely available at https://github.com/wittelab/orchid. Software is implemented in python 2.7, and makes use of MySQL or MemSQL databases. Groovy 2.4.5 is optionally required for parallel workflow execution. JWitte@ucsf.edu. Supplementary data are available at Bioinformatics online.
Machine Learning and Computer Vision System for Phenotype Data Acquisition and Analysis in Plants.
Navarro, Pedro J; Pérez, Fernando; Weiss, Julia; Egea-Cortines, Marcos
2016-05-05
Phenomics is a technology-driven approach with a promising future for obtaining unbiased data on biological systems. Image acquisition is relatively simple. However, data handling and analysis have not developed at the same pace as the sampling capacities. We present a system based on machine learning (ML) algorithms and computer vision intended to solve automatic phenotype data analysis in plant material. We developed a growth chamber able to accommodate species of various sizes. Night image acquisition requires near-infrared lighting. For the ML process, we tested three different algorithms: k-nearest neighbour (kNN), Naive Bayes Classifier (NBC), and Support Vector Machine (SVM). Each ML algorithm was executed with different kernel functions and trained with raw data and two types of data normalisation. Different metrics were computed to determine the optimal configuration of the machine learning algorithms. We obtained a performance of 99.31% with kNN for RGB images and 99.34% with SVM for NIR. Our results show that ML techniques can speed up phenomic data analysis. Furthermore, both RGB and NIR images can be segmented successfully but may require different ML algorithms for segmentation.
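A minimal pure-Python sketch of one of the tested classifiers (kNN) together with a min-max normalisation step of the kind the study compares; the toy feature vectors and class names are invented for illustration and do not reflect the authors' implementation:

```python
import math
from collections import Counter

def minmax_normalise(X):
    """Per-feature min-max scaling, one common normalisation choice."""
    lo = [min(col) for col in zip(*X)]
    hi = [max(col) for col in zip(*X)]
    return [[(v - l) / (h - l) if h > l else 0.0
             for v, l, h in zip(row, lo, hi)] for row in X]

def knn_predict(train_X, train_y, x, k=3):
    """Plain k-nearest-neighbour majority vote on Euclidean distance."""
    nearest = sorted(range(len(train_X)),
                     key=lambda i: math.dist(train_X[i], x))[:k]
    votes = Counter(train_y[i] for i in nearest)
    return votes.most_common(1)[0][0]
```

In a segmentation setting, each pixel's feature vector (e.g. RGB or NIR channel values) would be classified this way into plant versus background.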
NASA Astrophysics Data System (ADS)
Gavrishchaka, V. V.; Ganguli, S. B.
2001-12-01
Reliable forecasting of rare events in a complex dynamical system is a challenging problem that is important for many practical applications. Due to the nature of rare events, the data set available for construction of a statistical and/or machine learning model is often very limited and incomplete. Therefore many widely used approaches, including such robust algorithms as neural networks, can easily become inadequate for rare event prediction. Moreover, in many practical cases models with high-dimensional inputs are required. This limits applications of the existing rare event modeling techniques (e.g., extreme value theory) that focus on univariate cases and are not easily extended to multivariate ones. The support vector machine (SVM) is a machine learning system that can provide optimal generalization from very limited and incomplete training data sets and can efficiently handle high-dimensional data. These features may make SVM suitable for modeling rare events in some applications. We have applied an SVM-based system to the problems of large-amplitude substorm prediction and extreme event forecasting in stock and currency exchange markets. Encouraging preliminary results will be presented and other possible applications of the system will be discussed.
A Comparison Study of Machine Learning Based Algorithms for Fatigue Crack Growth Calculation.
Wang, Hongxun; Zhang, Weifang; Sun, Fuqiang; Zhang, Wei
2017-05-18
The relationships between the fatigue crack growth rate (da/dN) and the stress intensity factor range (ΔK) are not always linear, even in the Paris region. The stress ratio effects on fatigue crack growth rate are diverse in different materials. However, most existing fatigue crack growth models cannot handle these nonlinearities appropriately. The machine learning method provides a flexible approach to the modeling of fatigue crack growth because of its excellent nonlinear approximation and multivariable learning ability. In this paper, a fatigue crack growth calculation method is proposed based on three different machine learning algorithms (MLAs): extreme learning machine (ELM), radial basis function network (RBFN) and genetic-algorithm-optimized back propagation network (GABP). The MLA-based method is validated using testing data for different materials. The three MLAs are compared with each other as well as with the classical two-parameter model (the K* approach). The results show that the predictions of the MLAs are superior to those of the K* approach in accuracy and effectiveness, and that the ELM-based algorithm shows the best overall agreement with the experimental data of the three MLAs, owing to its global optimization and extrapolation ability.
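As background for the nonlinearity the abstract mentions: the classical Paris law (a simpler baseline than the two-parameter K* model the paper actually compares against) assumes da/dN = C·(ΔK)^m, i.e. a straight line on log-log axes. A sketch of that model and a least-squares fit of its constants, with invented test data:

```python
import math

def paris_rate(delta_K, C, m):
    """Paris law: da/dN = C * (ΔK)^m, linear on log-log axes."""
    return C * delta_K ** m

def fit_paris(delta_Ks, dadNs):
    """Least-squares fit of log(da/dN) = log C + m * log(ΔK)."""
    xs = [math.log(k) for k in delta_Ks]
    ys = [math.log(r) for r in dadNs]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    m = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    C = math.exp(ybar - m * xbar)
    return C, m
```

Data that deviates from this log-log line (e.g. stress-ratio effects) is what motivates the more flexible ELM/RBFN/GABP models in the paper.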
High productivity mould robotic milling in Al-5083
NASA Astrophysics Data System (ADS)
Urresti, Iker; Arrazola, Pedro Jose; Ørskov, Klaus Bonde; Pelegay, Jose Angel
2018-05-01
Until very recently, industrial serial robots were largely limited to welding, handling, or spray-painting operations. However, some industries have already recognized their important capabilities in terms of flexibility, working space, adaptability and cost, and robots are now being seriously considered for certain metal machining tasks. Robot-based machining is thus presented as a cost-saving and flexible manufacturing alternative to conventional CNC machines, especially for roughing or even pre-roughing of large parts. Nevertheless, there are still some drawbacks, usually referred to as low rigidity, accuracy and repeatability, so process productivity is usually sacrificed, yielding low Material Removal Rates (MRR) that are not competitive. In this paper, different techniques to obtain increased productivity are presented through an appropriate selection of cutting strategies and parameters, which is essential for it. Rough milling tests in Al-5083 are presented in which High Feed Milling (HFM) is implemented as a productive cutting strategy and the experimental modal analysis known as tap-testing is used for the suitable choice of cutting conditions. Competitive productivity rates are achieved, while process stability is checked through cutting-force measurements in order to prove the effectiveness of the experimental modal analysis for robotic machining.
Social Intelligence in a Human-Machine Collaboration System
NASA Astrophysics Data System (ADS)
Nakajima, Hiroshi; Morishima, Yasunori; Yamada, Ryota; Brave, Scott; Maldonado, Heidy; Nass, Clifford; Kawaji, Shigeyasu
In today's information society, it is often argued that a new way of human-machine interaction is necessary. In this paper, an agent with social response capabilities has been developed to achieve this goal. There are two kinds of information exchanged between two entities: objective and functional information (e.g., facts, requests, states of matters, etc.) and subjective information (e.g., feelings, sense of relationship, etc.). Traditional interactive systems have been designed to handle the former kind of information. In contrast, this study presents social agents that handle the latter type of information. The current study focuses on the sociality of the agent from the viewpoint of Media Equation theory. This article discusses the definition, importance, and benefits of social intelligence as agent technology and argues that social intelligence has the potential to enhance the user's perception of the system, which in turn can lead to improvements in the system's performance. In order to implement social intelligence in the agent, a mind model has been developed to render affective expressions and the personality of the agent. The mind model has been implemented in a human-machine collaborative learning system. One differentiating feature of the collaborative learning system is an agent that performs as a co-learner with which the user interacts during the learning session. The mind model controls the social behaviors of the agent, thus making it possible for the user to have more social interactions with the agent. The experiment with the system suggested that a greater degree of learning was achieved when the students worked with the co-learner agent, and that the co-learner agent with the mind model that expressed emotions resulted in a more positive attitude toward the system.
NASA Astrophysics Data System (ADS)
Das, Ronnie; Burfeind, Chris W.; Lim, Saniel D.; Patle, Shubham; Seibel, Eric J.
2018-02-01
3D pathology is intrinsically dependent on 3D microscopy, or the whole-tissue imaging of patient tissue biopsies (TBs). Consequently, unsectioned needle specimens must be processed whole: a procedure which cannot necessarily be accomplished through manual methods, or by retasking automated pathology machines. Thus "millifluidic" devices (for millimeter-scale biopsies) are an ideal solution for tissue handling/preparation. TBs are large, messy solid-liquid mixtures; they vary in material, geometry and structure based on the organ biopsied, the clinician's skill and the needle type used. As a result, traditional microfluidic devices are insufficient for such mm-sized samples, and their associated fabrication techniques are impractical and costly with respect to time/efficiency. Our research group has devised a simple, rapid fabrication process for millifluidic devices using jointed skeletal molds composed of machined, reusable metal rods, segmented rods and stranded wire as structural cores; these cores are surrounded by a Teflon outer housing. We can therefore rapidly produce curving, circular-cross-section (CCCS) millifluidic channels that cannot normally be achieved by microfabrication, micro-/CNC-machining, or 3D printing. The approach has several advantages. CLINICAL: round channels interface with coring needles. PROCESSING: CCCS channels permit multi-layer device designs for additional (processing, monitoring, testing) stages. REUSABILITY: for a given biopsy/needle diameter, interchangeable molding components may be produced once and then reused for other designs. RAPID: structural cores can be quickly removed due to Teflon®'s ultra-low friction; housing may be released with ethanol; PDMS volumes cure faster since metal skeleton molds conduct additional heat from within the curing elastomer.
Wireless brain-machine interface using EEG and EOG: brain wave classification and robot control
NASA Astrophysics Data System (ADS)
Oh, Sechang; Kumar, Prashanth S.; Kwon, Hyeokjun; Varadan, Vijay K.
2012-04-01
A brain-machine interface (BMI) links a user's brain activity directly to an external device, enabling a person to control devices using only thought. Hence, it has gained significant interest in the design of assistive devices and systems for people with disabilities. In addition, BMI has also been proposed to replace humans with robots in the performance of dangerous tasks such as explosives handling/defusing, hazardous materials handling, and firefighting. There are mainly two types of BMI, based on the measurement method of brain activity: invasive and non-invasive. Invasive BMI can provide pristine signals, but it is expensive and surgery may lead to undesirable side effects. Recent advances in non-invasive BMI have opened the possibility of generating robust control signals from noisy brain activity signals like EEG and EOG. A practical implementation of a non-invasive BMI such as robot control requires: acquisition of brain signals with a robust wearable unit, noise filtering and signal processing, identification and extraction of relevant brain wave features and, finally, an algorithm to determine control signals based on the wave features. In this work, we developed a wireless brain-machine interface on a small platform and established a BMI that can be used to control the movement of a robot using the extracted features of the EEG and EOG signals. The system records and classifies EEG into alpha, beta, delta, and theta waves. The classified brain waves are then used to define the level of attention, and the acceleration, deceleration or stopping of the robot is controlled based on the attention level of the wearer. In addition, left and right eyeball movements control the direction of the robot.
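The band-classification step (sorting EEG into delta/theta/alpha/beta) can be illustrated with a naive DFT band-power computation; this O(n²) sketch is for clarity only, a real system would use an FFT with windowing, and the band edges used here are the conventional ones, not necessarily those of this paper:

```python
import math

# Conventional EEG band edges in Hz (an assumption, not the paper's exact choice).
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(signal, fs):
    """Naive DFT power spectrum summed into the standard EEG bands."""
    n = len(signal)
    powers = {b: 0.0 for b in BANDS}
    for k in range(1, n // 2):            # skip DC, keep positive frequencies
        f = k * fs / n
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(signal))
        im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(signal))
        p = (re * re + im * im) / n
        for band, (lo, hi) in BANDS.items():
            if lo <= f < hi:
                powers[band] += p
    return powers

def dominant_band(signal, fs):
    """Band with the most power; a crude stand-in for the attention classifier."""
    p = band_powers(signal, fs)
    return max(p, key=p.get)
```

A relatively strong alpha band is classically associated with relaxed wakefulness, which is the kind of feature an attention-level controller would threshold on.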
Multidimensional Simulation Applied to Water Resources Management
NASA Astrophysics Data System (ADS)
Camara, A. S.; Ferreira, F. C.; Loucks, D. P.; Seixas, M. J.
1990-09-01
A framework for an integrated decision aiding simulation (IDEAS) methodology using numerical, linguistic, and pictorial entities and operations is introduced. IDEAS relies upon traditional numerical formulations, logical rules to handle linguistic entities with linguistic values, and a set of pictorial operations. Pictorial entities are defined by their shape, size, color, and position. Pictorial operators include reproduction (copy of a pictorial entity), mutation (expansion, rotation, translation, change in color), fertile encounters (intersection, reunion), and sterile encounters (absorption). Interaction between numerical, linguistic, and pictorial entities is handled through logical rules or a simplified vector calculus operation. This approach is shown to be applicable to various environmental and water resources management analyses using a model to assess the impacts of an oil spill. Future developments, including IDEAS implementation on parallel processing machines, are also discussed.
Distributed and parallel approach for handle and perform huge datasets
NASA Astrophysics Data System (ADS)
Konopko, Joanna
2015-12-01
Big Data refers to dynamic, large and disparate volumes of data coming from many different sources (tools, machines, sensors, mobile devices) uncorrelated with each other. It requires new, innovative and scalable technology to collect, host and analytically process the vast amount of data. A proper architecture for a system that processes huge data sets is needed. In this paper, distributed and parallel system architectures are compared using the example of the MapReduce (MR) Hadoop platform and a parallel database platform (DBMS). This paper also analyzes the problem of extracting and handling valuable information from petabytes of data. Both paradigms, MapReduce and parallel DBMS, are described and compared. A hybrid architecture approach is also proposed that could be used to solve the analyzed problem of storing and processing Big Data.
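The MapReduce paradigm compared in the paper can be shown in miniature with the classic word-count example; this single-process simulation of the map, shuffle and reduce phases is purely didactic (Hadoop distributes each phase across nodes):

```python
from collections import defaultdict

def map_phase(doc):
    """Map: emit (key, 1) for every word in a document."""
    return [(w, 1) for w in doc.split()]

def shuffle(pairs):
    """Shuffle: group all values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for k, v in pairs:
        groups[k].append(v)
    return groups

def reduce_phase(groups):
    """Reduce: aggregate each key's values."""
    return {k: sum(vs) for k, vs in groups.items()}

docs = ["big data big", "data platform"]
pairs = [p for d in docs for p in map_phase(d)]
counts = reduce_phase(shuffle(pairs))
```

A parallel DBMS would express the same computation declaratively (GROUP BY with COUNT), which is one axis of the MR-versus-DBMS comparison.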
Yao, Chang-liang; Yang, Wen-zhi; Wu, Wan-Yyng; Da, Juan; Hou, Jin-jun; Zhang, Jing-xian; Zhang, Yan-hai; Jin, Yan; Yang, Min; Jiang, Bao-hong; Liu, Xuan; Guo, De-an
2015-07-10
Current China Pharmacopoeia (ChP) standards employ diversified and case-dependent assay methods to evaluate the quality of different Chinese patent medicines (CPMs) that contain Panax notoginseng as the monarch drug. These conventional, HPLC-based approaches, utilizing a complex sample preparation procedure, can easily result in low analytical efficiency and possible component loss. Here, a "monomethod-heterotrait matrix" (MHM) strategy is proposed, that is, developing a universal multi-heart-cutting two-dimensional liquid chromatography (MHC-2D-LC) approach that facilitates the simultaneous quantitation of five P. notoginseng saponins (noto-R1, Re, Rg1, Rb1, and Rd) in eight different CPMs. The MHC-2D-LC system was constructed on a dual-gradient liquid chromatography instrument equipped with a Poroshell SB C18 column and a Zorbax SB-Aq column for the respective ¹D and ²D separations. Method validation was performed in terms of specificity, linearity (r² and F-test), intra-/inter-day precision (0.4-7.9%), stability (1.2-3.9%), and recovery (90.2-108.7%), and the LODs and LOQs (loaded masses) of the five analytes varied between 4.0-11.0 ng and 6.0-33.0 ng, respectively. The validated MHC-2D-LC approach was subsequently applied to quantify the five saponins in thirty batches of different CPMs. The method demonstrated superiority over the current ChP assay methods with respect to specificity (avoiding co-elution), resolution (Rs > 1.5), sample preparation (easy-to-implement ultrasonic extraction without repeated re-extraction), and transfer rate (minimum component loss). This is the first application of an MHC-2D-LC method for the quantitative assessment of the constituents of CPMs. The MHM approach represents a new, strategically significant methodology for the quality control of CPMs that involve complex chemical matrices. Copyright © 2015 Elsevier B.V. All rights reserved.
Yamaguchi, Akemi; Matsuda, Kazuyuki; Sueki, Akane; Taira, Chiaki; Uehara, Masayuki; Saito, Yasunori; Honda, Takayuki
2015-08-25
Reverse transcription (RT)-nested polymerase chain reaction (PCR) is a time-consuming procedure because it involves several handling steps, each associated with a risk of cross-contamination. Therefore, a rapid and sensitive one-step RT-nested PCR was developed that could be performed in a single tube using a droplet-PCR machine. The K562 BCR-ABL mRNA-positive cell line as well as bone marrow aspirates from 5 patients with chronic myelogenous leukemia (CML) and 5 controls without CML were used to evaluate the one-step RT-nested PCR on the droplet-PCR machine. One-step RT-nested PCR performed in a single tube using the droplet-PCR machine enabled the detection of BCR-ABL mRNA within 40 min, which was 10³-fold superior to conventional RT-nested PCR using three steps in separate tubes. The sensitivity of the one-step RT-nested PCR was 0.001%, with sample reactivity comparable to that of the conventional assay. The one-step RT-nested PCR developed using the droplet-PCR machine enabled all reactions to be performed in a single tube accurately, rapidly and with high sensitivity, and may be applicable to a wide spectrum of genetic tests in clinical laboratories. Copyright © 2015 Elsevier B.V. All rights reserved.
Nguyen, Huu-Tho; Md Dawal, Siti Zawiah; Nukman, Yusoff; Aoyama, Hideki; Case, Keith
2015-01-01
Globalization of business and competitiveness in manufacturing has forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision using imprecise and vague information, and plays a major role in improving productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach to decision-making in machine tool selection. This paper focuses on the integration of a consistent fuzzy AHP (Analytic Hierarchy Process) and a fuzzy COmplex PRoportional ASsessment (COPRAS) for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic preference relation is integrated into the AHP to handle the imprecise and vague information and to simplify the data collection for the pair-wise comparison matrix of the AHP, which determines the weights of the attributes. The output of the fuzzy AHP is imported into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. The application of the proposed model is illustrated by a numerical example based on data collected by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and the fuzzy COPRAS as a precise tool providing effective multi-attribute decision-making for evaluating machine tools in an uncertain environment. PMID:26368541
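The COPRAS ranking stage can be sketched in its crisp (non-fuzzy) form, since the fuzzy extension mainly changes how the decision-matrix entries and weights are obtained; the matrix, weights and benefit/cost split below are invented for illustration:

```python
def copras_rank(matrix, weights, benefit):
    """Crisp COPRAS sketch. Rows of `matrix` are alternatives, columns are
    criteria; benefit[j] is True for benefit criteria, False for cost ones.
    Returns the utility degree of each alternative (best alternative = 1.0)."""
    ncol = len(weights)
    col_sums = [sum(row[j] for row in matrix) for j in range(ncol)]
    # Sum-normalise each column, then apply the criterion weight.
    norm = [[row[j] / col_sums[j] * weights[j] for j in range(ncol)]
            for row in matrix]
    s_plus = [sum(v for v, b in zip(row, benefit) if b) for row in norm]
    s_minus = [sum(v for v, b in zip(row, benefit) if not b) for row in norm]
    inv_sum = sum(1.0 / s for s in s_minus)
    # Relative significance: reward benefits, penalise costs.
    q = [sp + sum(s_minus) / (sm * inv_sum) for sp, sm in zip(s_plus, s_minus)]
    qmax = max(q)
    return [qi / qmax for qi in q]
```

For example, two machine tools rated on capacity (benefit) and price (cost) as [[100, 5], [80, 3]] with weights [0.6, 0.4]: the cheaper tool attains utility 1.0 and wins despite its lower capacity.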
Ergonomic evaluation of conventional and improved methods of aonla pricking with women workers.
Rai, Arpana; Gandhi, Sudesh; Sharma, D K
2012-01-01
Conventional and improved methods of aonla pricking were evaluated ergonomically in a 20-minute experiment with women workers. The working heart rate, energy expenditure rate, total cardiac cost of work and physiological cost of work with conventional tools varied from 93-102 beats.min-1, 6-7.5 kJ.min-1, 285-470 beats and 14-23 beats.min-1, while with the machine they varied from 96-105 beats.min-1, 6.5-8 kJ.min-1, 336-540 beats and 16-27 beats.min-1, respectively. The OWAS score for the conventional method was 2, indicating corrective measures in the near future, while with the machine it was 1, indicating no corrective measures. Results of the Nordic Musculoskeletal Questionnaire revealed that subjects complained of pain in the back, neck, right shoulder and right hand due to unnatural body posture and repetitive movement with the hand tool. Moreover, pricking was carried out in improper lighting conditions (200-300 lux), resulting in finger injuries from the sharp edges of the hand tool, whereas with the machine no such problems were observed. Output with the machine was three times that of hand pricking in a given time. The machine was found useful in terms of saving time, increased productivity, and enhanced safety and comfort, as it involved an improved posture and was easy to handle and operate, thus increasing the efficiency of the worker and leading to a better quality of life.
NASA Technical Reports Server (NTRS)
1972-01-01
The overall program background, the various system concepts considered, and the rationale for the selected design are described. The concepts for each subsystem are also described and compared. Details are given for the requirements, boom configuration and dynamics, actuators, man/machine interface and control, visual system, control system, environmental control and life support, data processing, and materials.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morgan, Michael J.
The Hydrogen Fracture Toughness Tester (HFTT) is a mechanical testing machine designed for conducting fracture mechanics tests on materials in high-pressure hydrogen gas. The tester is needed for evaluating the effects of hydrogen on the cracking properties of tritium reservoir materials. It consists of an Instron Model 8862 Electromechanical Test Frame; an Autoclave Engineering Pressure Vessel, an Electric Potential Drop Crack Length Measurement System, associated computer control and data acquisition systems, and a high-pressure hydrogen gas manifold and handling system.
Comprehensive helicopter analysis: A state of the art review
NASA Technical Reports Server (NTRS)
Johnson, W.
1978-01-01
An assessment of the status of helicopter theory and analysis is presented. The technology level embodied in available design tools (computer programs) is examined, considering the problem areas of performance, loads and vibration, handling qualities and simulation, and aeroelastic stability. The effectiveness of the present analyses is discussed. The characteristics of the technology in the analyses are reviewed, including the aerodynamics technology, induced velocity and wake geometry, dynamics technology, and machine limitations.
Computer interface for mechanical arm
NASA Technical Reports Server (NTRS)
Derocher, W. L.; Zermuehlen, R. O.
1978-01-01
Man/machine interface commands computer-controlled mechanical arm. Remotely-controlled arm has six degrees of freedom and is controlled through "supervisory-control" mode, in which all motions of arm follow set of preprogrammed sequences. For simplicity, few prescribed commands are required to accomplish entire operation. Applications include operating computer-controlled arm to handle radioactive or explosive materials or commanding arm to perform functions in hostile environments. Modified version using displays may be applied in medicine.
Elevated-temperature Deformation Mechanisms in Ta2C: An Experimental Study
2013-01-01
As a result, tantalum carbides have found uses in a variety of wear-resistant applications including machine tooling, coatings for injection molding...HIP billet. In addition, the near surface of the billet was mechanically ground to remove any possible inter-diffusion reaction zone between the...mounted in a conductive epoxy for handling. TEM foils were prepared by ultrasonically drilling 3 mm discs from the cross-sections using a Fischione
1991-09-05
"Learning from Learning: Principles for Supporting Drivers", J A Groeger, MRC Applied Psychology Unit, UK; "Argos: A Driver Behaviour Analysis System...Technology (CEST), UK; MISCELLANEOUS: "Modular Sensor System for Guiding Handling Machines", J Geit and J Heinrich, TZN Forschungs, FRG; "Flexible...PUBLIC TRANSPORT MANAGEMENT RESEARCH; PART I: "Implementation Strategies", systems engineering; PART III: Validation through Pilot
Engineering Design Handbook. Dielectric Embedding of Electrical or Electronic Components
1979-04-06
its excellent electrical properties are maintained at elevated temperatures. Even when the insulation is exposed to a direct flame, it burns to a...machine by one operator; these molds are generally equipped with insulated handles to prevent personal injury from burns. In electronic embedment...Excellent for large volume runs; tooling is minimal. Presence of a shell or housing assures no exposed components, as can occur in casting. Some
1990-12-01
data rate to the electronics would be much lower on the average and the data much "richer" in information. Intelligent use of...system bottleneck, a high data rate should be provided by I/O systems. 2. machines with intelligent storage management specially designed for logic...management information processing, surveillance sensors, intelligence data collection and handling, solid state sciences, electromagnetics, and propagation, and electronic reliability/maintainability and compatibility.
Instruments for preparation of heterogeneous catalysts by an impregnation method
NASA Astrophysics Data System (ADS)
Yamada, Yusuke; Akita, Tomoki; Ueda, Atsushi; Shioyama, Hiroshi; Kobayashi, Tetsuhiko
2005-06-01
Instruments for the preparation of heterogeneous catalysts in powder form have been developed. The instruments consist of a powder-dispensing robot and an automated liquid-handling machine equipped with an ultrasonic and a vortex mixer. The combination of these two instruments achieves catalyst preparation by incipient wetness and ion exchange methods. The catalyst library prepared with these instruments was tested for dimethyl ether steam reforming and characterized by transmission electron microscopy observations.
Machine assisted histogram classification
NASA Astrophysics Data System (ADS)
Benyó, B.; Gaspar, C.; Somogyi, P.
2010-04-01
LHCb is one of the four major experiments under completion at the Large Hadron Collider (LHC). Monitoring the quality of the acquired data is important, because it allows the verification of the detector performance. Anomalies, such as missing values or unexpected distributions, can be indicators of a malfunctioning detector, resulting in poor data quality. Spotting faulty or ageing components can be done either visually, using instruments such as the LHCb Histogram Presenter, or with the help of automated tools. In order to assist detector experts in handling the vast monitoring information resulting from the sheer size of the detector, we propose a graph-based clustering tool combined with a machine learning algorithm and demonstrate its use by processing histograms representing 2D hitmap events. We prove the concept by detecting ion feedback events in the LHCb experiment's RICH subdetector.
Semantic Technologies and Bio-Ontologies.
Gutierrez, Fernando
2017-01-01
As the information available through data repositories constantly grows, the need for automated mechanisms for linking, querying, and sharing data has become a relevant factor both in research and industry. This situation is most evident in research fields such as the life sciences, where new experiments by different research groups constantly generate new information about a wide variety of related study objects. However, current methods for representing information and knowledge are not suited for machine processing. The Semantic Technologies are a set of standards and protocols intended to provide methods for representing and handling data that encourage reusability of information and are machine-readable. In this chapter, we provide a brief introduction to the Semantic Technologies and how these protocols and standards have been incorporated into the life sciences to facilitate dissemination of and access to information.
NASA Technical Reports Server (NTRS)
Sanz, J.; Pischel, K.; Hubler, D.
1992-01-01
An application for parallel computation on a combined cluster of powerful workstations and supercomputers was developed. Parallel Virtual Machine (PVM) is used as the message-passing layer in a macro-tasking parallelization of the Aerodynamic Inverse Design and Analysis for a Full Engine computer code. The heterogeneous nature of the cluster is handled entirely by the controlling host machine. Communication is established via Ethernet with the TCP/IP protocol over an open network. A reasonable overhead is imposed for internode communication, rendering an efficient utilization of the engaged processors. Perhaps one of the most interesting features of the system is its versatility, which permits use of whichever available computational resources are experiencing less use at a given point in time.
Frutos, M.; Méndez, M.; Tohmé, F.; Broz, D.
2013-01-01
Many of the problems that arise in production systems can be handled with multiobjective techniques. One of those problems is scheduling operations subject to constraints on the availability of machines and buffer capacity. In this paper we analyze different multiobjective evolutionary algorithms (MOEAs) for this kind of problem. We consider an experimental framework in which we schedule production operations for four real-world job-shop contexts using three algorithms: NSGA-II, SPEA2, and IBEA. Using two performance indexes, hypervolume and R2, we found that SPEA2 and IBEA are the most efficient for the tasks at hand. IBEA, moreover, seems to be the better choice of tool, since it yields more solutions on the approximate Pareto frontier. PMID:24489502
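The hypervolume indicator used to compare the MOEAs has a simple closed form in two objectives; this sketch covers the bi-objective minimisation case with an invented front and reference point (the paper's scheduling problems and indicator implementation are more involved):

```python
def hypervolume_2d(front, ref):
    """2-D hypervolume of a minimisation front relative to reference point
    `ref`: the area dominated by the front and bounded by `ref`."""
    pts = sorted(front)                 # ascending in the first objective
    hv, prev_y = 0.0, ref[1]
    for x, y in pts:
        if y < prev_y:                  # skip dominated points
            hv += (ref[0] - x) * (prev_y - y)
            prev_y = y
    return hv
```

A larger hypervolume means the approximation set covers more of the objective space, which is how fronts produced by NSGA-II, SPEA2 and IBEA can be compared on equal terms.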
Administrative Services Staff with New Teleticketing Machine
1968-02-21
Peggy Heintz, left, receives an airline ticket from supervisor Judy Kuebeler in the Administrative Services Building at the National Aeronautics and Space Administration (NASA) Lewis Research Center. The center had recently purchased a teleticket machine that automatically printed airline tickets as directed by the airline’s computer system. The Administrative Services Branch had 55 staff members performing a variety of roles. They served as telephone operators and set up communications with other centers. They operated the motor pool, handled all travel arrangements, prepared forms and work instructions, and planned offices. The staff was also responsible for records management and storage. It was reported that the staff processed 65 bags of incoming mail per day, which was said to be on par with a city of 15,000 to 20,000 people.
Translation of Japanese Noun Compounds at Super-Function Based MT System
NASA Astrophysics Data System (ADS)
Zhao, Xin; Ren, Fuji; Kuroiwa, Shingo
Noun compounds are frequently encountered construction in nature language processing (NLP), consisting of a sequence of two or more nouns which functions syntactically as one noun. The translation of noun compounds has become a major issue in Machine Translation (MT) due to their frequency of occurrence and high productivity. In our previous studies on Super-Function Based Machine Translation (SFBMT), we have found that noun compounds are very frequently used and difficult to be translated correctly, the overgeneration of noun compounds can be dangerous as it may introduce ambiguity in the translation. In this paper, we discuss the challenges in handling Japanese noun compounds in an SFBMT system, we present a shallow method for translating noun compounds by using a word level translation dictionary and target language monolingual corpus.
Chaudhary, Dhanjee Kumar; Bhattacherjee, Ashis; Patra, Aditya Kumar; Chau, Nearkasen
2015-12-01
This study aimed to assess the whole-body vibration (WBV) exposure among large blast hole drill machine operators with regard to the International Organization for Standardization (ISO) recommended threshold values and its association with machine- and rock-related factors and workers' individual characteristics. The study population included 28 drill machine operators who had worked in four opencast iron ore mines in eastern India. The study protocol comprised the following: measurements of WBV exposure [frequency-weighted root mean square (RMS) acceleration (m/s²)], machine-related data (manufacturer of machine, age of machine, seat height, thickness, and rest height) collected from mine management offices, measurements of rock hardness, uniaxial compressive strength, and density, and workers' characteristics via face-to-face interviews. More than 90% of the operators were exposed to a WBV level above the ISO upper limit and only 3.6% between the lower and upper limits, mainly in the vertical axis. Bivariate correlations revealed that potential predictors of total WBV exposure were: machine manufacturer (r = 0.453, p = 0.015), age of drill (r = 0.533, p = 0.003), and hardness of rock (r = 0.561, p = 0.002). The stepwise multiple regression model revealed that the potential predictors are age of operator (regression coefficient β = -0.052, standard error SE = 0.023), manufacturer (β = 1.093, SE = 0.227), rock hardness (β = 0.045, SE = 0.018), uniaxial compressive strength (β = 0.027, SE = 0.009), and density (β = -1.135, SE = 0.235). Prevention should include using machines appropriate to the rock hardness, uniaxial compressive strength, and density, and seat improvement using ergonomic approaches such as including a suspension system.
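The bivariate correlations reported above (e.g. r = 0.453 for machine manufacturer) are Pearson product-moment coefficients. A self-contained sketch of that computation, on made-up sample vectors rather than the study's data:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient between two samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))  # co-deviation
    sx = sqrt(sum((a - mx) ** 2 for a in x))              # spread of x
    sy = sqrt(sum((b - my) ** 2 for b in y))              # spread of y
    return cov / (sx * sy)
```

Values near +1 or -1 indicate a strong linear association, which is how drill age and rock hardness were flagged as candidate predictors before the stepwise regression.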
Ranjith, G; Parvathy, R; Vikas, V; Chandrasekharan, Kesavadas; Nair, Suresh
2015-04-01
With the advent of new imaging modalities, radiologists are faced with handling increasing volumes of data for diagnosis and treatment planning. The use of automated and intelligent systems is becoming essential in such a scenario. Machine learning, a branch of artificial intelligence, is increasingly being used in medical image analysis applications such as image segmentation, registration and computer-aided diagnosis and detection. Histopathological analysis is currently the gold standard for classification of brain tumors. The use of machine learning algorithms along with extraction of relevant features from magnetic resonance imaging (MRI) holds promise of replacing conventional invasive methods of tumor classification. The aim of the study is to classify gliomas into benign and malignant types using MRI data. Retrospective data from 28 patients who were diagnosed with glioma were used for the analysis. WHO Grade II (low-grade astrocytoma) was classified as benign while Grade III (anaplastic astrocytoma) and Grade IV (glioblastoma multiforme) were classified as malignant. Features were extracted from MR spectroscopy. The classification was done using four machine learning algorithms: multilayer perceptrons, support vector machine, random forest and locally weighted learning. Three of the four machine learning algorithms gave an area under ROC curve in excess of 0.80. Random forest gave the best performance in terms of AUC (0.911) while sensitivity was best for locally weighted learning (86.1%). The performance of different machine learning algorithms in the classification of gliomas is promising. An even better performance may be expected by integrating features extracted from other MR sequences. © The Author(s) 2015 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
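The workflow described (train a classifier on extracted features, report area under the ROC curve) can be sketched with scikit-learn. The synthetic features below are a stand-in for the MR-spectroscopy features, which are not public, so the printed AUC is illustrative only.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic two-class data standing in for benign/malignant MRS features.
X, y = make_classification(n_samples=200, n_features=10, n_informative=5,
                           random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

# Random forest gave the best AUC in the study; evaluate the same way here.
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
auc = roc_auc_score(yte, clf.predict_proba(Xte)[:, 1])
print(f"AUC = {auc:.3f}")
```

Swapping in the other learners mentioned (SVM, multilayer perceptron) only changes the `clf` line, which makes this kind of comparison straightforward.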
Compensation for time delay in flight simulator visual-display systems
NASA Technical Reports Server (NTRS)
Crane, D. F.
1983-01-01
A piloted aircraft can be viewed as a closed-loop, man-machine control system. When a simulator pilot is performing a precision maneuver, a delay in the visual display of aircraft response to pilot-control input decreases the stability of the pilot-aircraft system. The less stable system is more difficult to control precisely. Pilot dynamic response and performance change as the pilot attempts to compensate for the decrease in system stability, and these changes bias the simulation results by influencing the pilot's rating of the handling qualities of the simulated aircraft. Delay compensation, designed to restore pilot-aircraft system stability, was evaluated in several studies which are reported here. The studies range from single-axis, tracking-task experiments (with sufficient subjects and trials to establish statistical significance of the results) to a brief evaluation of compensation of a computer-generated-imagery (CGI) visual display system in a full six-degree-of-freedom simulation. The compensation was effective - improvements in pilot performance and workload or aircraft handling-qualities rating (HQR) were observed. Results from recent aircraft handling-qualities research literature which support the compensation design approach are also reviewed.
Digital Low Level RF Systems for Fermilab Main Ring and Tevatron
NASA Astrophysics Data System (ADS)
Chase, B.; Barnes, B.; Meisner, K.
1997-05-01
At Fermilab, a new Low Level RF system has been successfully installed and is operating in the Main Ring. Installation is proceeding for a Tevatron system. This upgrade replaces aging CAMAC/NIM components for an increase in accuracy, reliability, and flexibility. These VXI systems are based on a custom three-channel direct digital synthesizer (DDS) module. Each synthesizer channel is capable of independent or ganged operation for both frequency and phase modulation. New frequency and phase values are computed at a 100 kHz rate on the module's Analog Devices ADSP21062 (SHARC) digital signal processor. The DSP concurrently handles feedforward, feedback, and beam manipulations. Higher-level state machines and the control system interface are handled at the crate level using the VxWorks operating system. This paper discusses the hardware, software, and operational aspects of these LLRF systems.
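The core of any DDS channel is a phase accumulator: an integer register advances by a tuning word every sample, and the accumulated phase indexes a sine table. The sketch below illustrates only that principle; the accumulator width and rates are illustrative, not the actual design of the Fermilab module.

```python
import math

def dds_samples(freq_hz, sample_rate_hz, n, acc_bits=32):
    """Direct digital synthesis sketch: an integer phase accumulator advances
    by a tuning word each sample; phase maps to a sine value (a lookup table
    in hardware, computed directly here)."""
    acc_max = 1 << acc_bits
    # Tuning word sets the output frequency: freq = word * fs / 2**acc_bits.
    tuning_word = round(freq_hz * acc_max / sample_rate_hz)
    acc, out = 0, []
    for _ in range(n):
        out.append(math.sin(2 * math.pi * acc / acc_max))
        acc = (acc + tuning_word) % acc_max  # wraps like a hardware register
    return out

# 1 kHz tone at an 8 kHz sample rate: exactly 8 samples per cycle.
samples = dds_samples(1000.0, 8000.0, 8)
```

Frequency modulation amounts to updating `tuning_word` on the fly, and phase modulation to offsetting `acc`, which is why a DSP recomputing these words at 100 kHz can steer the RF.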
NASA Technical Reports Server (NTRS)
Millwater, Harry; Riha, David
1996-01-01
The NESSUS and NASTRAN computer codes were successfully integrated. The enhanced NESSUS code will use NASTRAN for the structural analysis and NESSUS for the probabilistic analysis. Any quantity in the NASTRAN bulk data input can be a random variable. Any NASTRAN result that is written to the OUTPUT2 file can be returned to NESSUS as the finite element result. The interfacing between NESSUS and NASTRAN is handled automatically by NESSUS. NESSUS and NASTRAN can be run on different machines using the remote host option.
Optimization-based controller design for rotorcraft
NASA Technical Reports Server (NTRS)
Tsing, N.-K.; Fan, M. K. H.; Barlow, J.; Tits, A. L.; Tischler, M. B.
1993-01-01
An optimization-based methodology for linear control system design is outlined by considering the design of a controller for a UH-60 rotorcraft in hover. A wide range of design specifications is taken into account: internal stability, decoupling between longitudinal and lateral motions, handling qualities, and rejection of wind gusts. These specifications are investigated while taking into account physical limitations in the swashplate displacements and rates of displacement. The methodology crucially relies on user-machine interaction for tradeoff exploration.
The 14th Annual Conference on Manual Control. [digital simulation of human operator dynamics]
NASA Technical Reports Server (NTRS)
1978-01-01
Human operator dynamics during actual manual control, or while monitoring automatic control systems, are examined for air-to-air tracking, automobile driving, the operation of undersea vehicles, and remote handling. Optimal control models and the use of mathematical theory in representing human behavior in complex man-machine system tasks are discussed, with emphasis on eye/head tracking and scanning; perception and attention allocation; decision making; and motion simulation and effects.
Ebrahimi, Ahmad; Kia, Reza; Komijan, Alireza Rashidi
2016-01-01
In this article, a novel integrated mixed-integer nonlinear programming model is presented for designing a cellular manufacturing system (CMS) considering machine layout and part scheduling problems simultaneously as interrelated decisions. The integrated CMS model is formulated to incorporate several design features including part due date, material handling time, operation sequence, processing time, an intra-cell layout of unequal-area facilities, and part scheduling. The objective function is to minimize makespan, tardiness penalties, and material handling costs of inter-cell and intra-cell movements. Two numerical examples are solved with the Lingo software to illustrate the results obtained by the incorporated features. In order to assess the effects and importance of integrating machine layout and part scheduling in designing a CMS, two approaches, sequential and concurrent, are investigated, and the improvement resulting from the concurrent approach is revealed. Also, due to the NP-hardness of the integrated model, an efficient genetic algorithm (GA) is designed. As a consequence, computational results of this study indicate that the best solutions found by the GA are better than the solutions found by branch-and-bound (B&B) in much less time for both sequential and concurrent approaches. Moreover, the comparisons between the objective function values (OFVs) obtained by the sequential and concurrent approaches demonstrate that the OFV improvement is on average around 17% by GA and 14% by B&B.
Annual report, FY 1979 Spent fuel and fuel pool component integrity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, A.B. Jr.; Bailey, W.J.; Schreiber, R.E.
International meetings under the BEFAST program and under INFCE Working Group No. 6 during 1978 and 1979 continue to indicate that no cases of fuel cladding degradation have developed on pool-stored fuel from water reactors. A section from a spent fuel rack stand, exposed for 1.5 y in the Yankee Rowe (PWR) pool, had 0.001- to 0.003-in.-deep (25- to 75-µm) intergranular corrosion in weld heat-affected zones but no evidence of stress corrosion cracking. A section of a 304 stainless steel spent fuel storage rack exposed 6.67 y in the Point Beach reactor (PWR) spent fuel pool showed no significant corrosion. A section of 304 stainless steel 8-in.-dia pipe from the Three Mile Island No. 1 (PWR) spent fuel pool heat exchanger plumbing developed a through-wall crack. The crack was intergranular, initiating from the inside surface in a weld heat-affected zone. The zone where the crack occurred was severely sensitized during field welding. The Kraftwerk Union (Erlangen, GFR) disassembled a stainless-steel fuel-handling machine that operated for 12 y in a PWR (boric acid) spent fuel pool. There was no evidence of deterioration, and the fuel-handling machine was reassembled for further use. A spent fuel pool at a Swedish PWR was decontaminated. The procedure is outlined in this report.
Low latency messages on distributed memory multiprocessors
NASA Technical Reports Server (NTRS)
Rosing, Matthew; Saltz, Joel
1993-01-01
Many of the issues in developing an efficient interface for communication on distributed memory machines are described and a portable interface is proposed. Although the hardware component of message latency is less than one microsecond on many distributed memory machines, the software latency associated with sending and receiving typed messages is on the order of 50 microseconds. The reason for this imbalance is that the software interface does not match the hardware. By changing the interface to match the hardware more closely, applications with fine-grained communication can be put on these machines. Based on several tests that were run on the iPSC/860, an interface that will better match current distributed memory machines is proposed. The model used in the proposed interface consists of a computation processor and a communication processor on each node. Communication between these processors and other nodes in the system is done through a buffered network. Information that is transmitted is either data or procedures to be executed on the remote processor. The dual-processor system is better suited to efficiently handling asynchronous communications than a single-processor system. The ability to send either data or procedures is very flexible for minimizing message latency, based on the type of communication being performed. The tests performed and the proposed interface are described.
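The node model described (a computation processor handing messages to a communication processor through a buffer) can be mimicked with threads and queues. This is only a conceptual stand-in for the proposed interface: Python threads play the two processors and a queue plays the buffered network.

```python
import threading
import queue

outbox = queue.Queue()   # buffer between computation and communication "processors"
network = queue.Queue()  # stands in for the buffered interconnect

def communication_processor():
    """Drains the outbox and pushes messages onto the network asynchronously."""
    while True:
        msg = outbox.get()
        if msg is None:      # shutdown sentinel
            break
        network.put(("node-0", msg))

comm = threading.Thread(target=communication_processor)
comm.start()

# The computation "processor" keeps working; sends complete in the background,
# which is what hides the software component of message latency.
for i in range(3):
    outbox.put({"tag": i, "data": i * i})
outbox.put(None)
comm.join()

received = [network.get() for _ in range(3)]
```

In the paper's model the transmitted payload may also be a procedure to execute remotely; here that would simply mean putting a callable in the message instead of data.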
Occupational Accidents with Agricultural Machinery in Austria.
Kogler, Robert; Quendler, Elisabeth; Boxberger, Josef
2016-01-01
The number of recognized accidents with fatalities during agricultural and forestry work, despite better technology and coordinated prevention and training, is still very high in Austria. The accident scenarios in which people are injured vary widely on farms. The common causes of accidents in agriculture and forestry are loss of control of a machine, means of transport or handling equipment, hand-held tool, or object or animal, followed by slipping, stumbling and falling, and the breakage, bursting, splitting, slipping, fall, or collapse of a material agent. In the literature, a number of studies of general (machine- and animal-related accidents) and specific (machine-related accidents) agricultural and forestry accident situations can be found that refer to different databases. Using data from the Austrian Workers' Compensation Board (AUVA) on occupational accidents with different agricultural machinery over the period 2008-2010 in Austria, the main characteristics of the accident, the victim, and the employer, as well as variables on causes and circumstances, were statistically analyzed by frequency and contexts of parameters, employing the chi-square test and odds ratios. The aim of the study was to determine the information content and quality of the European Statistics on Accidents at Work (ESAW) variables in order to evaluate safety gaps and risks as well as accidental man-machine interaction.
Chapter 9: The FTU Machine - Design Construction and Assembly
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pizzuto, A.; Annino, C.; Baldarelli, M.
2004-05-15
The main design features and guidelines for the construction of the 8-T cryogenically cooled Frascati Tokamak Upgrade (FTU) are presented. The main features include the very compact toroidal magnets based on the concept of the 'Bitter' type of coil with wedge-shaped turns, utilized for the first time for the Alcator A and C magnets, and the original configuration of the vacuum vessel (VV) structure, which is fully welded in order to achieve the required high strength and electric resistivity. The present toroidal limiter has been installed following several years of operation, and this installation has required the development of specific remote-handling tools. The toroidal limiter consists of 12 independent sectors made of stainless steel carriers and molybdenum alloy (TZM) tiles. The main fabrication processes developed for the toroidal and poloidal coils as well as for the VV are described. It is to be noted that the assembly procedure has required very accurate machining of all the structures, requiring several trials and steps. The machine has shown no problem in operating routinely at its maximum design values (8 T, 1.6 MA).
Triangular Quantum Loop Topography for Machine Learning
NASA Astrophysics Data System (ADS)
Zhang, Yi; Kim, Eun-Ah
Despite rapidly growing interest in harnessing machine learning in the study of quantum many-body systems, there has been little success in training neural networks to identify topological phases. The key challenge is in efficiently extracting essential information from the many-body Hamiltonian or wave function and turning the information into an image that can be fed into a neural network. When targeting topological phases, this task becomes particularly challenging as topological phases are defined in terms of non-local properties. Here we introduce triangular quantum loop (TQL) topography: a procedure for constructing a multi-dimensional image from the 'sample' Hamiltonian or wave function using two-point functions that form triangles. Feeding the TQL topography to a fully-connected neural network with a single hidden layer, we demonstrate that the architecture can be effectively trained to distinguish the Chern insulator and the fractional Chern insulator from trivial insulators with high fidelity. Given the versatility of the TQL topography procedure, which can handle different lattice geometries, disorder, interaction, and even degeneracy, our work paves the route towards powerful applications of machine learning in the study of topological quantum matter.
Clifford support vector machines for classification, regression, and recurrence.
Bayro-Corrochano, Eduardo Jose; Arana-Daniel, Nancy
2010-11-01
This paper introduces the Clifford support vector machines (CSVM) as a generalization of the real and complex-valued support vector machines using the Clifford geometric algebra. In this framework, we handle the design of kernels involving the Clifford or geometric product. In this approach, one redefines the optimization variables as multivectors. This allows us to have a multivector as output. Therefore, we can represent multiple classes according to the dimension of the geometric algebra in which we work. We show that one can apply CSVM for classification and regression and also to build a recurrent CSVM. The CSVM is an attractive approach for the multiple input multiple output processing of high-dimensional geometric entities. We carried out comparisons between CSVM and the current approaches to solve multiclass classification and regression. We also study the performance of the recurrent CSVM with experiments involving time series. The authors believe that this paper can be of great use for researchers and practitioners interested in multiclass hypercomplex computing, particularly for applications in complex and quaternion signal and image processing, satellite control, neurocomputation, pattern recognition, computer vision, augmented virtual reality, robotics, and humanoids.
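The central idea (an output with several components, each component carrying one bit of class information, so that a single machine covers many classes) has a simple scalar analogue that does not require Clifford algebra: train one binary SVM per output component and read the components together as a class code. The clustered toy data and the two-bit encoding below are invented for illustration.

```python
import numpy as np
from sklearn.svm import SVC

# Four well-separated Gaussian clusters, classes 0..3.
rng = np.random.default_rng(0)
centers = np.array([[0, 0], [3, 0], [0, 3], [3, 3]], dtype=float)
X = rng.normal(size=(200, 2)) + np.repeat(centers, 50, axis=0)
y = np.repeat(np.arange(4), 50)

# Encode 4 classes in 2 bits; one binary SVM per bit, analogous to one
# multivector component per output dimension in the CSVM formulation.
svms = [SVC(kernel="rbf").fit(X, (y >> b) & 1) for b in range(2)]

def predict(x):
    """Reassemble the class label from the per-component binary outputs."""
    x = np.asarray(x, dtype=float).reshape(1, -1)
    return sum(int(s.predict(x)[0]) << b for b, s in enumerate(svms))
```

The CSVM goes further by letting the geometric product enter the kernel itself, but the payoff sketched here is the same: k output components represent up to 2**k classes with a shared machine.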
NASA Astrophysics Data System (ADS)
Aaronson, Judith N.; Nablo, Sam V.
1985-05-01
Self-shielded electron accelerators have been successfully used in industry for more than ten years. One of the important advantages of these machines is their compactness, which allows easy adaptation to conventional coating and product finishing machinery. It is equally important that these machines qualify for use under "unrestricted" conditions as specified by OSHA. The shielding and product handling configurations which make this unrestricted designation possible for operating voltages under 300 kV are discussed. Thin-film dosimetry techniques used for the determination of the machine performance parameters are discussed, along with the rotary scanner techniques employed for the dose rate studies which are important in the application of these processors. Paper and wood coatings, which are important industrial applications involving electron-initiated polymerization, are reviewed. The sterilization and disinfestation applications are also discussed. The increasing concern of these industries for the more efficient use of energy and for compliance with more stringent pollution regulations, coupled with the novel processes this energy source makes possible, assures a bright future for this developing technology.
A genetic algorithm for a bi-objective mathematical model for dynamic virtual cell formation problem
NASA Astrophysics Data System (ADS)
Moradgholi, Mostafa; Paydar, Mohammad Mahdi; Mahdavi, Iraj; Jouzdani, Javid
2016-09-01
Nowadays, with the increasing pressure of the competitive business environment and demand for diverse products, manufacturers are forced to seek solutions that reduce production costs and raise product quality. The cellular manufacturing system (CMS), as a means to this end, has been a point of attraction to both researchers and practitioners. Limitations of the cell formation problem (CFP), one of the important topics in CMS, have led to the introduction of the virtual CMS (VCMS). This research addresses a bi-objective dynamic virtual cell formation problem (DVCFP) with the objective of finding the optimal formation of cells, considering the material handling costs, fixed machine installation costs, and variable production costs of machines and workforce. Furthermore, we consider different skills on different machines in workforce assignment in a multi-period planning horizon. The bi-objective model is transformed into a single-objective fuzzy goal programming model and, to show its performance, numerical examples are solved using the LINGO software. In addition, a genetic algorithm (GA) is customized to tackle large-scale instances of the problem to show the performance of the solution method.
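The genetic algorithm used for the large-scale instances is, at its core, the standard loop of selection, crossover, and mutation. A deliberately tiny real-coded GA on a one-dimensional test function (not the paper's DVCFP encoding) shows the skeleton:

```python
import random

def genetic_minimise(f, bounds, pop=30, gens=60, seed=1):
    """Tiny real-coded GA: tournament selection, blend crossover, Gaussian
    mutation; returns the best individual seen over all generations."""
    rng = random.Random(seed)
    lo, hi = bounds
    P = [rng.uniform(lo, hi) for _ in range(pop)]
    best = min(P, key=f)
    for _ in range(gens):
        nxt = []
        for _ in range(pop):
            a = min(rng.sample(P, 3), key=f)       # tournament selection
            b = min(rng.sample(P, 3), key=f)
            child = a + rng.random() * (b - a)     # blend crossover
            if rng.random() < 0.2:                 # occasional mutation
                child += rng.gauss(0, 0.05 * (hi - lo))
            nxt.append(min(max(child, lo), hi))    # clip to bounds
        P = nxt
        best = min(P + [best], key=f)
    return best

best = genetic_minimise(lambda x: (x - 2.0) ** 2, (-10.0, 10.0))
```

For the DVCFP, the individual would instead encode cell assignments and workforce allocations per period, and `f` would be the fuzzy-goal objective, but the loop structure is unchanged.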
Data handling and representation of freeform surfaces
NASA Astrophysics Data System (ADS)
Steinkopf, Ralf; Dick, Lars; Kopf, Tino; Gebhardt, Andreas; Risse, Stefan; Eberhardt, Ramona
2011-10-01
Freeform surfaces enable innovative optics. They are not limited by axial symmetry and hence are almost free in design. They are used to reduce the installation space and enhance the performance of optical elements. State-of-the-art optical design tools use powerful algorithms to simulate freeform surfaces, and new mathematical approaches are under development /1/. In consequence, new optical designs /2/ are pushing the development of manufacturing processes, and novel types of datasets have to proceed through the process chain /3/. The complexity of these data is the major challenge for data handling. Because of the asymmetrical and three-dimensional surfaces of freeforms, large data volumes have to be created, trimmed, extended, and fitted. All these processes must be performed without losing the accuracy of the original design data. Additionally, manifold types of geometries result in different kinds of mathematical representations of freeform surfaces, and furthermore the CAD/CAM tools in use deal with a set of spatial transport formats. These are all reasons why manufacture-oriented approaches to freeform data handling are not yet sufficiently developed. This paper suggests a classification of freeform surfaces based on the manufacturing methods offered by diamond machining. The different manufacturing technologies, ranging from servo-turning to shaping, require a differentiated approach to the data handling process. The usage of analytical descriptions in the form of splines and polynomials as well as the application of discrete descriptions like point clouds is shown in relation to the previously made classification. Advantages and disadvantages of freeform representations are discussed. Aspects of data handling between different process steps are pointed out and suitable exchange formats for freeform data are proposed.
The described approach offers the possibility for efficient data handling from optical design to systems in novel optics.
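One recurring conversion in such a process chain is moving between the discrete and analytical descriptions mentioned above: fitting a polynomial surface to a point cloud by least squares. A sketch with a synthetic sag function (the basis and data are illustrative, not from the paper):

```python
import numpy as np

# Fit z = c0 + c1*x + c2*y + c3*x^2 + c4*x*y + c5*y^2 to a point cloud:
# a minimal discrete-to-analytic conversion for freeform data.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 400)
y = rng.uniform(-1, 1, 400)
z = 0.5 * x**2 - 0.3 * x * y + 0.1 * y**2      # synthetic freeform sag

# Design matrix of basis terms, solved in the least-squares sense.
A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
rms = np.sqrt(np.mean((A @ coeffs - z) ** 2))  # fit residual
```

Monitoring the residual `rms` is how one checks that the analytical representation has not lost the accuracy of the original design data; real workflows would use richer bases (higher-order polynomials or splines).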
WASTE PACKAGE REMEDIATION SYSTEM DESCRIPTION DOCUMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
N.D. Sudan
2000-06-22
The Waste Package Remediation System remediates waste packages (WPs) and disposal containers (DCs) in one of two ways: preparation of rejected DC closure welds for repair or opening of the DC/WP. DCs are brought to the Waste Package Remediation System for preparation of rejected closure welds if testing of the closure weld by the Disposal Container Handling System indicates an unacceptable, but repairable, welding flaw. DC preparation of rejected closure welds will require removal of the weld in such a way that the Disposal Container Handling System may resume and complete the closure welding process. DCs/WPs are brought to the Waste Package Remediation System for opening if the Disposal Container Handling System testing of the DC closure weld indicates an unrepairable welding flaw, or if a WP is recovered from the subsurface repository because suspected damage to the WP or failure of the WP has occurred. DC/WP opening will require cutting of the DC/WP such that a temporary seal may be installed and the waste inside the DC/WP removed by another system. The system operates in a Waste Package Remediation System hot cell located in the Waste Handling Building that has direct access to the Disposal Container Handling System. One DC/WP at a time can be handled in the hot cell. The DC/WP arrives on a transfer cart, is positioned within the cell for system operations, and exits the cell without being removed from the cart. The system includes a wide variety of remotely operated components including a manipulator with hoist and/or jib crane, viewing systems, machine tools for opening WPs, and equipment used to perform pressure and gas composition sampling. Remotely operated equipment is designed to facilitate DC/WP decontamination and hot cell equipment maintenance, and interchangeable components are provided where appropriate. The Waste Package Remediation System interfaces with the Disposal Container Handling System for the receipt and transport of WPs and DCs.
The Waste Handling Building System houses the system and provides the facility, safety, and auxiliary systems required to support operations. The system receives power from the Waste Handling Building Electrical System. The system also interfaces with the various DC systems.
NASA Astrophysics Data System (ADS)
Dachyar, M.; Risky, S. A.
2014-06-01
Telecommunications companies have to improve their business performance even as their customer base grows every year. In Indonesia, telecommunications companies have provided their best services and improved their operational systems by designing a framework for the operational systems of the Internet of Things (IoT), also known as Machine to Machine (M2M). This study was conducted with expert opinion, which was further processed by the Analytic Hierarchy Process (AHP) to obtain the important factors for the organization's operational systems, and by Interpretive Structural Modeling (ISM) to determine which organizational factors carry the greatest driving power. The study found that SLA & KPI problem handling has the greatest weight, and that the current M2M dashboard and current M2M connectivity have the power to affect other factors and serve an important function for an effectively run M2M operations system.
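The AHP step amounts to extracting priority weights from a pairwise comparison matrix, conventionally via its principal eigenvector. A sketch using power iteration; the pairwise judgements below are hypothetical, not the study's elicited matrix.

```python
import numpy as np

def ahp_weights(M, iters=100):
    """AHP priority vector: principal eigenvector of the pairwise comparison
    matrix, found by power iteration and normalised to sum to 1."""
    w = np.ones(len(M)) / len(M)
    for _ in range(iters):
        w = M @ w
        w /= w.sum()
    return w

# Hypothetical pairwise judgements for three operational factors
# (entry [i, j] says how much more important factor i is than factor j).
M = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])
w = ahp_weights(M)
```

The resulting vector ranks the factors; in a full AHP study one would also check the consistency ratio of `M` before trusting the weights.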
Hybrid Optimization Parallel Search PACKage
DOE Office of Scientific and Technical Information (OSTI.GOV)
2009-11-10
HOPSPACK is open source software for solving optimization problems without derivatives. Application problems may have a fully nonlinear objective function, bound constraints, and linear and nonlinear constraints. Problem variables may be continuous, integer-valued, or a mixture of both. The software provides a framework that supports any derivative-free type of solver algorithm. Through the framework, solvers request parallel function evaluation, which may use MPI (multiple machines) or multithreading (multiple processors/cores on one machine). The framework provides a Cache and Pending Cache of saved evaluations that reduces execution time and facilitates restarts. Solvers can dynamically create other algorithms to solve subproblems, a useful technique for handling multiple start points and integer-valued variables. HOPSPACK ships with the Generating Set Search (GSS) algorithm, developed at Sandia as part of the APPSPACK open source software project.
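The Generating Set Search idea that ships with HOPSPACK can be shown in a deliberately simplified serial form: poll the objective along plus and minus each coordinate direction, step on improvement, and halve the step size when no direction helps. This is a sketch of the principle only, not HOPSPACK's parallel, constrained implementation.

```python
def gss_minimise(f, x0, step=1.0, tol=1e-6):
    """Serial Generating Set Search sketch: derivative-free coordinate
    polling with step-halving on failed iterations."""
    x, fx = list(x0), f(x0)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for s in (+step, -step):       # poll both coordinate directions
                trial = list(x)
                trial[i] += s
                ft = f(trial)
                if ft < fx:                # accept any improving poll point
                    x, fx, improved = trial, ft, True
        if not improved:
            step /= 2.0                    # contract the pattern
    return x, fx

xmin, fmin = gss_minimise(lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2,
                          [0.0, 0.0])
```

In the real framework the poll points are farmed out as parallel evaluation requests, and the evaluation cache keeps repeated poll points from being recomputed.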
Praxis language reference manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, J.H.
1981-01-01
This document is a language reference manual for the programming language Praxis. The document contains the specifications that must be met by any compiler for the language. The Praxis language was designed for systems programming in real-time process applications. Goals for the language and its implementations are: (1) highly efficient code generated by the compiler; (2) program portability; (3) completeness, that is, all programming requirements can be met by the language without needing an assembler; and (4) separate compilation to aid in design and management of large systems. The language does not provide any facilities for input/output, stack and queue handling, string operations, parallel processing, or coroutine processing. These features can be implemented as routines in the language, using machine-dependent code to take advantage of facilities in the control environment on different machines.
Classification of Variable Objects in Massive Sky Monitoring Surveys
NASA Astrophysics Data System (ADS)
Woźniak, Przemek; Wyrzykowski, Łukasz; Belokurov, Vasily
2012-03-01
The era of great sky surveys is upon us. Over the past decade we have seen rapid progress toward a continuous photometric record of the optical sky. Numerous sky surveys are discovering and monitoring variable objects by the hundreds of thousands. Advances in detector, computing, and networking technology are driving applications of all shapes and sizes, ranging from small all-sky monitors, through networks of robotic telescopes of modest size, to big-glass facilities equipped with giga-pixel CCD mosaics. The Large Synoptic Survey Telescope will be the first peta-scale astronomical survey [18]. It will expand the volume of the parameter space available to us by three orders of magnitude and explore the mutable heavens down to an unprecedented level of sensitivity. The proliferation of large, multidimensional astronomical data sets is stimulating work on new methods and tools to handle the identification and classification challenge [3]. Given exponentially growing data rates, automated classification of variability types is quickly becoming a necessity. Taking humans out of the loop not only eliminates the subjective nature of visual classification, but is also an enabling factor for time-critical applications. Full automation is especially important for studies of explosive phenomena such as γ-ray bursts that require rapid follow-up observations before the event is over. While there is a general consensus that machine learning will provide a viable solution, the available algorithmic toolbox remains underutilized in astronomy by comparison with other fields such as genomics or market research. Part of the problem is the nature of astronomical data sets, which tend to be dominated by a variety of irregularities. Not all algorithms can gracefully handle uneven time sampling, missing features, or sparsely populated high-dimensional spaces.
More sophisticated algorithms and better tools in standard software packages are required to facilitate the adoption of machine learning in astronomy. The goal of this chapter is to show a number of successful applications of state-of-the-art machine learning methodology to time-resolved astronomical data, illustrate what is possible today, and help identify areas for further research and development. After a brief comparison of the utility of various machine learning classifiers, the discussion focuses on support vector machines (SVM), neural nets, and self-organizing maps. Traditionally, astronomers used ad hoc scan statistics to detect and classify transient variability. These methods will remain important as feature extractors whose output feeds generic machine learning algorithms. Experience shows that the performance of machine learning tools on astronomical data depends critically on the definition and quality of the input features, and that a considerable amount of preprocessing is required before standard algorithms can be applied. However, with continued investment of effort by a growing number of astro-informatics-savvy computer scientists and astronomers, the much-needed expertise and infrastructure are growing faster than ever.
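The chapter's point that classifier performance hinges on the input features can be illustrated with a small sketch. The feature set and the nearest-centroid stand-in classifier below are our own simplifications for illustration, not the chapter's SVM, neural net, or SOM pipelines; the statistics chosen tolerate uneven time sampling and missing observations:

```python
import numpy as np

def light_curve_features(t, mag):
    """A few sampling-tolerant summary statistics of a light curve.

    t: observation times (may be unevenly spaced); mag: magnitudes (may contain NaN).
    """
    mag = mag[~np.isnan(mag)]                      # drop missing observations
    amp = np.max(mag) - np.min(mag)                # peak-to-peak amplitude
    mad = np.median(np.abs(mag - np.median(mag)))  # robust spread
    skew = np.mean((mag - mag.mean())**3) / (mag.std()**3 + 1e-12)
    beyond1 = np.mean(np.abs(mag - mag.mean()) > mag.std())  # fraction beyond 1 sigma
    return np.array([amp, mad, skew, beyond1])

def nearest_centroid_predict(X_train, y_train, x):
    """Stand-in classifier: assign x to the class with the closest feature centroid."""
    labels = np.unique(y_train)
    cents = np.array([X_train[y_train == c].mean(axis=0) for c in labels])
    return labels[np.argmin(np.linalg.norm(cents - x, axis=1))]
```

In a real survey pipeline these features would feed an SVM or neural net rather than a centroid rule, but the division of labor is the same: irregular-sampling-aware feature extraction first, generic classifier second.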
Bisele, Maria; Bencsik, Martin; Lewis, Martin G C; Barnett, Cleveland T
2017-01-01
Assessment methods in human locomotion often involve the description of normalised graphical profiles and/or the extraction of discrete variables. Whilst useful, these approaches may not represent the full complexity of gait data. Multivariate statistical methods, such as Principal Component Analysis (PCA) and Discriminant Function Analysis (DFA), have been adopted since they have the potential to overcome these data handling issues. The aim of the current study was to develop and optimise a specific machine learning algorithm for processing human locomotion data. Twenty participants ran at a self-selected speed across a 15m runway in barefoot and shod conditions. Ground reaction forces (BW) and kinematics were measured at 1000 Hz and 100 Hz, respectively from which joint angles (°), joint moments (N.m.kg-1) and joint powers (W.kg-1) for the hip, knee and ankle joints were calculated in all three anatomical planes. Using PCA and DFA, power spectra of the kinematic and kinetic variables were used as a training database for the development of a machine learning algorithm. All possible combinations of 10 out of 20 participants were explored to find the iteration of individuals that would optimise the machine learning algorithm. The results showed that the algorithm was able to successfully predict whether a participant ran shod or barefoot in 93.5% of cases. To the authors' knowledge, this is the first study to optimise the development of a machine learning algorithm.
Modelling and Characterisation of Detection Models in WAMI for Handling Negative Information
2014-02-01
behaviour of the multi-stage detectors used in LoFT. This model is then used in a Probability Hypothesis Density (PHD) filter. Unlike most multitarget... Therefore, we decided to use machine learning techniques which could model — and predict — the behaviour of the detectors in LoFT. Because we are using... on feature detectors [8], motion models [13] and descriptor and template adaptation [9]. 2.3.2 State Model The state space of LoFT is defined in 2D
Some effects of stress on users of a voice recognition system: A preliminary inquiry
NASA Astrophysics Data System (ADS)
French, B. A.
1983-03-01
Recent work with Automatic Speech Recognition has focused on applications and productivity considerations in the man-machine interface. This thesis is an attempt to see if placing users of such equipment under time-induced stress has an effect on their percent correct recognition rates. Subjects were given a message-handling task of fixed length and allowed progressively shorter times to attempt to complete it. Questionnaire responses indicate stress levels increased with decreased time-allowance; recognition rates decreased as time was reduced.
Tele-assistance for semi-autonomous robots
NASA Technical Reports Server (NTRS)
Rogers, Erika; Murphy, Robin R.
1994-01-01
This paper describes a new approach to semi-autonomous mobile robots. In this approach the robot has sufficient computerized intelligence to function autonomously under a certain set of conditions, while the local system is a cooperative decision-making unit that combines human and machine intelligence. Communication is then allowed to take place in a common mode and in a common language. A number of exception-handling scenarios, constructed from experiments with actual sensor data collected from two mobile robots, are presented.
Considerations on automation of coating machines
NASA Astrophysics Data System (ADS)
Tilsch, Markus K.; O'Donnell, Michael S.
2015-04-01
Most deposition chambers sold into the optical coating market today are outfitted with an automated control system. We surveyed several of the larger equipment providers, and nine of them responded with information about their hardware architecture, data logging, level of automation, error handling, user interface, and interfacing options. In this paper, we present a summary of the results of the survey and describe commonalities and differences together with some considerations of tradeoffs, such as between capability for high customization and simplicity of operation.
Artillery ammunition marking tests
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weil, B.S.; Lewis, J.C.
1995-04-01
This report describes the testing results of two approaches being considered for marking artillery ammunition with machine-readable data symbols. The first approach used ink-jet printing directly onto projectiles, and the second approach employed thermal-transfer printing onto self-adhesive labels that are subsequently applied automatically to projectiles. The objectives of this evaluation for each marking technology were to (1) determine typical system performance characteristics using the best commercially available equipment and (2) identify any special requirements necessary for handling ammunition when these technologies are employed.
Emotional Intelligence: Advocating for the Softer Side of Leadership
2013-03-01
handles social rejection and physical pain.30 In one study, patients in fMRI machines were told they were playing a game with two other players — a... operated more freely.”43 Yet these results do not indicate the cognitive system can be allowed to take a backseat. In another study, fMRI showed that... The roots of empathy can be found at an early age, which implies empathy is hardwired into the primitive limbic system. One study observed a toddler
40. BUILDING NO. 454, ORDNANCE FACILITY (BAG CHANGE FILLING PLANT), DETAIL SOUTHEAST SIDE OF EXTERIOR ELECTRICAL EQUIPMENT ROOM, SHOWING DOOR TO SEWING ROOM NO. 3, VENTILATOR FAN (OVER DOOR), STEAM LINE (PIPE), SEWING MACHINE MOTOR IN OVERHEAD, ALARM BELL, EXPLOSION-PROOF SWITCH BOXES, GROUNDS ON DOORS, PULL ALARM HANDLE (EXTREME RIGHT; PULLEY CABLE CONDUCTED IN CONDUIT TO SWITCH INSIDE BUILDING. PULLEYS INSIDE ALL ELBOW JOINTS.) - Picatinny Arsenal, 400 Area, Gun Bag Loading District, State Route 15 near I-80, Dover, Morris County, NJ
2015-10-02
ratio or physical layout than the training sample, or new vs old bananas. For our system, this is similar to the multimodal case mentioned above; however... different modes. Foods with multiple “types” such as green, yellow, and brown bananas are seamlessly handled as well. Secondly, with hundreds or thousands... Recognition and Classification of Food Grains, Fruits and Flowers Using Machine Vision. INTERNATIONAL JOURNAL OF FOOD ENGINEERING, 5(4), 2009. [155] T. E
Autonomous Mechanical Assembly on the Space Shuttle: An Overview
NASA Technical Reports Server (NTRS)
Raibert, M. H.
1979-01-01
The space shuttle will be equipped with a pair of 50 ft. manipulators used to handle payloads and to perform mechanical assembly operations. Although current plans call for these manipulators to be operated by a human teleoperator, the possibility of using results from robotics and machine intelligence to automate this shuttle assembly system was investigated. The major components of an autonomous mechanical assembly system are examined, along with the technology base upon which they depend. The state of the art in advanced automation is also assessed.
ADST Software Design Document for the BDS-D VIDS-equipped M1
1993-09-10
system responds to perceived threats in the following ways: a. by displaying visual icons on the Commander’s Controls Display Panel (CCDP). b. by... also referred to as the Soldier Machine Interface (SMI) and the Commander’s Controls Display Panel (CCDP). 3.2.1. VIDS-GT CSC The VIDS-GT CSC handles... countermeasure will be activated first in Individual_CM_Simul. 4.1.3.4.3. Individual_CM_Simul CSU Individual_CM_Simul controls the activation and deactivation of
Close-Out Report for FY2002 - FY2005, DARPA Agreement
2010-06-29
controls, programming and software design. Specialized technologies and state-of-the-art and -market equipment available to private industry on a shared... Rest and Following Rest Designed to satisfy machinists’ needs, the Easy Turn represents high quality and value with trouble-free use. This model is... fitted with a 3 1/2 inch hole through the spindle and a 12 inch chuck. It can handle parts up to 44 inches in length. • Cincinnati U5 6-axis CNC Machining
Wear Test Results of Candidate Materials for the OK-542 Towed Array Handling Machine Level Winder
1994-12-29
Stainless Steel, Inconel 625, Nickel-Aluminum-Bronze, and Titanium. The specialty materials: Inconel 625, Monel, Stainless and Stellite, were clad-welded... metals on a base of 1040 Carbon Steel. Finally, an economic carbide coating was deposited on a 316 Stainless Steel and Inconel 625 sample. Within a... damage in the shortest period of time. The Inconel 625 bar stock that was tested performed the best. It sustained the least amount of damage for one
Xie, X S; Zhang, M; Zheng, Y D; Du, X Y; Qi, C
2016-06-20
To investigate the influence of two measuring-instrument adapter positions on the measurement of hand-transmitted vibration in grinding machines, using the intraclass correlation coefficient (ICC) as the reliability assessment index, and to provide a basis for studies on the measurement standard for hand-transmitted vibration. With reference to the measurement standards for hand-transmitted vibration ISO 5349 Mechanical vibration-Measurement and evaluation of human exposure to hand-transmitted vibration-Part 1: General requirements and Mechanical vibration-Measurement and evaluation of human exposure to hand-transmitted vibration-Part 2: Practical guidance for measurement at the workplace, the domestic AWA5936 hand-transmitted vibration measuring instrument and the SVAN-106 hand-transmitted vibration measuring instrument from Poland were used to measure hand-transmitted vibration in 3 grinding-machine workers in a foundry for 5 days continuously from September to October, 2014, and Y-axis data were recorded and compared. In worker A, the "T"-shaped adapter had a significantly higher mean Y-axis effective acceleration value than the "O"-shaped adapter [4.34 m/s(2) (95%CI 4.05-4.63) vs 2.32 m/s(2) (95%CI 2.27-2.38), t=13.781, P<0.01]. In workers B and C, the AWA5936 "U"-shaped adapter (placed at the position of the handle of the grinding machine) had lower degrees of data variation of 12.55% and 15.77%, respectively, suggesting good data stability. The measurement results showed significant differences across the different adapter positions (P<0.01) and between all adapters except the "O"-shaped and line-shaped adapters (all P<0.01), while the measurement results showed no significant differences between the "O"-shaped and line-shaped adapters (P>0.01).
The comparison of the measurement results of the AWA5936 vibration measuring instrument with a "U"-shaped adapter and the SVAN-106 vibration measuring instrument with an "S"-shaped adapter showed an ICC of >0.80 (ICC=0.82), while the comparison of the measurement results of the AWA5936 vibration measuring instrument with an "O"-shaped adapter and the SVAN-106 vibration measuring instrument showed an ICC of <0.40. The SVAN-106 vibration measuring instrument with an "S"-shaped adapter placed at the palm and the AWA5936 vibration measuring instrument with a "U"-shaped adapter placed at the handle of the grinding machine can give comparable measurement results with good reliability.
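The agreement statistic used above, the intraclass correlation coefficient, can be computed from a two-way ANOVA decomposition. The sketch below implements ICC(2,1) (two-way random effects, absolute agreement, single measurement) in NumPy; the example data in the test are invented, not the study's measurements:

```python
import numpy as np

def icc_2_1(data):
    """ICC(2,1): rows = subjects (e.g. worker-days), columns = raters (instruments)."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    ss_rows = k * ((data.mean(axis=1) - grand)**2).sum()   # between-subject
    ss_cols = n * ((data.mean(axis=0) - grand)**2).sum()   # between-rater
    ss_tot = ((data - grand)**2).sum()
    ss_err = ss_tot - ss_rows - ss_cols                    # residual
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

An ICC above 0.80, as reported for the "U"/"S" adapter pairing, indicates near-interchangeable instruments; below 0.40, as for the "O"-shaped pairing, agreement is poor.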
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, P.Y.; Hao, E.; Patt, Y.
Conditional branches incur a severe performance penalty in wide-issue, deeply pipelined processors. Speculative execution and predicated execution are two mechanisms that have been proposed for reducing this penalty. Speculative execution can completely eliminate the penalty associated with a particular branch, but requires accurate branch prediction to be effective. Predicated execution does not require accurate branch prediction to eliminate the branch penalty, but is not applicable to all branches and can increase the latencies within the program. This paper examines the performance benefit of using both mechanisms to reduce the branch execution penalty. Predicated execution is used to handle the hard-to-predict branches and speculative execution is used to handle the remaining branches. The hard-to-predict branches within the program are determined by profiling. We show that this approach can significantly reduce the branch execution penalty suffered by wide-issue processors.
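The profiling step above, deciding which branches a dynamic predictor handles poorly, can be illustrated with a small simulation. This is our own illustration, not the paper's infrastructure: it replays a branch's taken/not-taken trace through a 2-bit saturating-counter predictor and flags branches whose misprediction rate makes them candidates for predication rather than speculation (the 20% threshold is arbitrary):

```python
def mispredict_rate(history):
    """Misprediction rate of a 2-bit saturating counter on a taken(1)/not-taken(0) trace."""
    state, misses = 2, 0              # states 0-3; state >= 2 predicts taken
    for taken in history:
        if (state >= 2) != bool(taken):
            misses += 1
        state = min(state + 1, 3) if taken else max(state - 1, 0)
    return misses / len(history)

def classify_branches(traces, threshold=0.2):
    """Split branches into easy (speculate) vs hard-to-predict (predicate)."""
    return {b: ('predicate' if mispredict_rate(h) > threshold else 'speculate')
            for b, h in traces.items()}
```

A heavily biased loop-exit branch is nearly always predicted correctly, while a data-dependent alternating branch defeats the counter entirely, which is exactly the kind of branch the paper routes to predicated execution.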
SEM fractography studies of porous vitreous carbon: a candidate biomaterial.
Tarr, R R
1979-09-01
A new porous vitreous carbon material under development for use in orthopedic applications was investigated. Specimens were machined to appropriate sizes and fractured in one of the following modes: compression, cantilevered bending, or axial torsion. Scanning electron microscopy (SEM) was used to examine surface and internal features. Characteristics of a brittle, glassy material were noted. Findings included internal voids which appeared as craters, patches of whiskerlike fibrils, and edge impurities. Numerous microcracks caused by mechanical shaping and handling were the most remarkable structural defects. Pore channels which would allow bony ingrowth ranged in size from 50 to 500 micrometers, with the majority between 200 and 300 micrometers. This study of porous vitreous carbon points to the need for stricter quality control in manufacturing, alternative methods for shaping and handling, and careful consideration in design and usage of a brittle material with marginal limits of safety for biomedical applications.
Determinants of wood dust exposure in the Danish furniture industry.
Mikkelsen, Anders B; Schlunssen, Vivi; Sigsgaard, Torben; Schaumburg, Inger
2002-11-01
This paper investigates the relation between wood dust exposure in the furniture industry and occupational hygiene variables. During the winter 1997-98 54 factories were visited and 2362 personal, passive inhalable dust samples were obtained; the geometric mean was 0.95 mg/m(3) and the geometric standard deviation was 2.08. In a first measuring round 1685 dust concentrations were obtained. For some of the workers repeated measurements were carried out 1 (351) and 2 weeks (326) after the first measurement. Hygiene variables like job, exhaust ventilation, cleaning procedures, etc., were documented. A multivariate analysis based on mixed effects models was used with hygiene variables being fixed effects and worker, machine, department and factory being random effects. A modified stepwise strategy of model making was adopted taking into account the hierarchically structured variables and making possible the exclusion of non-influential random as well as fixed effects. For woodworking, the following determinants of exposure increase the dust concentration: manual and automatic sanding and use of compressed air with fully automatic and semi-automatic machines and for cleaning of work pieces. Decreased dust exposure resulted from the use of compressed air with manual machines, working at fully automatic or semi-automatic machines, functioning exhaust ventilation, work on the night shift, daily cleaning of rooms, cleaning of work pieces with a brush, vacuum cleaning of machines, supplementary fresh air intake and safety representative elected within the last 2 yr. For handling and assembling, increased exposure results from work at automatic machines and presence of wood dust on the workpieces. Work on the evening shift, supplementary fresh air intake, work in a chair factory and special cleaning staff produced decreased exposure to wood dust. The implications of the results for the prevention of wood dust exposure are discussed.
NASA Astrophysics Data System (ADS)
Eidietis, N. W.; Choi, W.; Hahn, S. H.; Humphreys, D. A.; Sammuli, B. S.; Walker, M. L.
2018-05-01
A finite-state off-normal and fault response (ONFR) system is presented that provides the supervisory logic for comprehensive disruption avoidance and machine protection in tokamaks. Robust event handling is critical for ITER and future large tokamaks, where plasma parameters will necessarily approach stability limits and many systems will operate near their engineering limits. Events can be classified as off-normal plasma events, e.g. neoclassical tearing modes or vertical displacement events, or faults, e.g. coil power supply failures. The ONFR system presented provides four critical features of a robust event handling system: sequential responses to cascading events, event recovery, simultaneous handling of multiple events and actuator prioritization. The finite-state logic is implemented in Matlab®/Stateflow® to allow rapid development and testing in an easily understood graphical format before automated export to the real-time plasma control system code. Experimental demonstrations of the ONFR algorithm on the DIII-D and KSTAR tokamaks are presented. In the most complex demonstration, the ONFR algorithm asynchronously applies a ‘catch and subdue’ electron cyclotron current drive (ECCD) injection scheme to suppress a virulent 2/1 neoclassical tearing mode, subsequently shuts down ECCD for machine protection when the plasma becomes over-dense, and enables rotating 3D field entrainment of the ensuing locked mode to allow a safe rampdown, all in the same discharge without user intervention. When multiple ONFR states are active simultaneously and requesting the same actuator (e.g. neutral beam injection or gyrotrons), actuator prioritization is accomplished by sorting the pre-assigned priority values of each active ONFR state and giving complete control of the actuator to the state with the highest priority.
This early experience makes evident that additional research is required to develop an improved actuator sharing protocol, as well as a methodology to minimize the number and topological complexity of states as the finite-state ONFR system is scaled to a large, highly constrained device like ITER.
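The actuator-prioritization rule described above, granting each contested actuator to the highest-priority active state requesting it, is simple to express in code. The sketch below is a schematic reconstruction; the state names and priority values are invented for illustration, not DIII-D or KSTAR configuration:

```python
def arbitrate(active_states, requests):
    """Give each actuator to the highest-priority active state requesting it.

    active_states: {state_name: priority} for currently active ONFR states
                   (higher number = higher priority; values are illustrative).
    requests:      {state_name: [actuators it wants]}.
    Returns {actuator: winning_state}.
    """
    grants = {}
    for state, actuators in requests.items():
        if state not in active_states:
            continue                      # inactive states get nothing
        for act in actuators:
            holder = grants.get(act)
            if holder is None or active_states[state] > active_states[holder]:
                grants[act] = state       # complete control to the top priority
    return grants
```

Because the winner takes complete control of the actuator, a lower-priority state simply waits until the higher-priority state deactivates, which is the behavior the abstract describes and the motivation for the improved sharing protocol it calls for.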
Hicken, Alexandra; White, Andrew J P; Crimmin, Mark R
2017-11-20
A series of heterobimetallic complexes containing three-center, two-electron Au-H-Cu bonds have been prepared from addition of a parent gold hydride to a bent d10 copper(I) fragment. These highly unusual heterobimetallic complexes represent a missing link in the widely investigated series of neutral and cationic coinage metal hydride complexes containing Cu-H-Cu and M-H-M+ moieties (M=Cu, Ag). The well-defined heterobimetallic hydride complexes act as precatalysts for the conversion of CO2 into HCO2Bpin with HBpin as the reductant. The selectivity of the heterobimetallic complexes for the catalytic production of a formate equivalent surpasses that of the parent monomeric Group 11 complexes. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Recovery and regeneration of spent MHD seed material by the formate process
Sheth, Atul C.; Holt, Jeffrey K.; Rasnake, Darryll G.; Solomon, Robert L.; Wilson, Gregory L.; Herrigel, Howard R.
1991-01-01
The specification discloses a spent seed recovery and regeneration process for an MHD power plant employing an alkali metal salt seed material such as a potassium salt, wherein the spent potassium seed in the form of potassium sulfate is collected from the flue gas and reacted with calcium hydroxide and carbon monoxide in an aqueous solution to cause the formation of calcium sulfate and potassium formate. The pH of the solution is adjusted to suppress formation of formic acid and to promote precipitation of any dissolved calcium salts. The solution containing potassium formate is then employed to provide the potassium salt in the form of potassium formate or, optionally, by heating the potassium formate under oxidizing conditions to convert the potassium formate to potassium carbonate.
Giacometti, F; Bonilauri, P; Piva, S; Scavia, G; Amatiste, S; Bianchi, D M; Losio, M N; Bilei, S; Cascone, G; Comin, D; Daminelli, P; Decastelli, L; Merialdi, G; Mioni, R; Peli, A; Petruzzelli, A; Tonucci, F; Liuzzo, G; Serraino, A
2017-11-01
A quantitative risk assessment (RA) was developed to estimate haemolytic-uremic syndrome (HUS) cases in the paediatric population associated with the consumption of raw milk sold in vending machines in Italy. The historical national evolution of the raw milk consumption phenomenon since 2008, when consumer interest started to grow, and after 7 years of marketing adjustment, is outlined. Exposure assessment was based on the official Shiga toxin-producing Escherichia coli O157:H7 (STEC) microbiological records of raw milk samples from vending machines monitored by the regional Veterinary Authorities from 2008 to 2014, microbial growth during storage, consumption frequency of raw milk, serving size, consumption preference and age of consumers. The differential risk considered milk handled under regulation conditions (4°C throughout all phases) and the worst time-temperature field handling conditions detected. In the case of boiling milk before consumption, we assumed that the risk of HUS is fixed at zero. The model estimates clearly show that the public health significance of HUS cases due to raw milk STEC contamination depends on the current variability surrounding the risk profile of the food, and that consumer behaviour has more impact than the milk storage scenario. The HUS cases predicted by our model are roughly in line with the effective STEC O157-associated HUS cases notified in Italy only when the proportion of consumers not boiling milk before consumption is assumed to be 1%. Raw milk consumption remains a source of E. coli O157:H7 for humans, but its overall relevance is likely to have subsided, and significant caution should be exercised in temporal, geographical and consumer behaviour analysis. Health education programmes and regulatory actions are required to educate people, primarily children, on other STEC sources. © 2016 Blackwell Verlag GmbH.
Process service quality evaluation based on Dempster-Shafer theory and support vector machine.
Pei, Feng-Que; Li, Dong-Bo; Tong, Yi-Fei; He, Fei
2017-01-01
Human involvement influences traditional service quality evaluations, leading to low accuracy, poor reliability and weak predictive power. This paper proposes a method, called SVMs-DS, that employs support vector machines (SVMs) and Dempster-Shafer evidence theory to evaluate the service quality of a production process, handling a high number of input features with a small sample data set. Features that can affect production quality are extracted by a large number of sensors. Preprocessing steps such as feature simplification and normalization are reduced. Based on three individual SVM models, the basic probability assignments (BPAs) are constructed, which support the evaluation in both a qualitative and a quantitative way. The process service quality evaluation results are validated by the Dempster rules; the decision threshold to resolve conflicting results is generated from the three SVM models. A case study is presented to demonstrate the effectiveness of the SVMs-DS method.
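Dempster's rule of combination, which the method above uses to fuse the three SVMs' basic probability assignments, can be sketched over a tiny frame of discernment. The code below is generic textbook Dempster-Shafer, not the paper's SVMs-DS implementation, and the masses in the test are illustrative:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions whose focal elements are frozensets."""
    combined, conflict = {}, 0.0
    for (a, pa), (b, pb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + pa * pb
        else:
            conflict += pa * pb            # mass falling on the empty intersection
    if conflict >= 1.0:
        raise ValueError("total conflict: sources fully disagree")
    return {s: p / (1.0 - conflict) for s, p in combined.items()}
```

In an SVMs-DS-style setup, each classifier's calibrated outputs would populate a mass function like `m1`/`m2`, and repeated combination yields the fused belief on which the quality decision is thresholded.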
A Relevance Vector Machine-Based Approach with Application to Oil Sand Pump Prognostics
Hu, Jinfei; Tse, Peter W.
2013-01-01
Oil sand pumps are widely used in the mining industry for the delivery of mixtures of abrasive solids and liquids. Because they operate under highly adverse conditions, these pumps usually experience significant wear. Consequently, equipment owners are quite often forced to invest substantially in system maintenance to avoid unscheduled downtime. In this study, an approach combining relevance vector machines (RVMs) with a sum of two exponential functions was developed to predict the remaining useful life (RUL) of field pump impellers. To handle field vibration data, a novel feature extracting process was proposed to arrive at a feature varying with the development of damage in the pump impellers. A case study involving two field datasets demonstrated the effectiveness of the developed method. Compared with standalone exponential fitting, the proposed RVM-based model was much better able to predict the remaining useful life of pump impellers. PMID:24051527
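The prognostic step above extrapolates a degradation feature forward to a failure threshold. The paper fits a sum of two exponentials through RVM-selected points; the simplified sketch below fits a single exponential by log-linear least squares and solves for the threshold-crossing time, purely to illustrate the extrapolation idea (threshold and data are synthetic):

```python
import numpy as np

def fit_exponential(t, h):
    """Fit h(t) ~ a*exp(b*t) by least squares on log(h)."""
    b, log_a = np.polyfit(t, np.log(h), 1)
    return np.exp(log_a), b

def remaining_useful_life(t, h, threshold):
    """Time from the last observation until the fitted curve crosses threshold."""
    t = np.asarray(t, dtype=float)
    h = np.asarray(h, dtype=float)
    a, b = fit_exponential(t, h)
    t_fail = (np.log(threshold) - np.log(a)) / b   # solve a*exp(b*t) = threshold
    return t_fail - t[-1]
```

The paper's two-exponential model and RVM sparsification serve the same role with more flexibility: a smooth monotone trend fitted to a noisy wear feature, extrapolated to the level at which the impeller is deemed failed.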
Privacy preserving RBF kernel support vector machine.
Li, Haoran; Xiong, Li; Ohno-Machado, Lucila; Jiang, Xiaoqian
2014-01-01
Data sharing is challenging but important for healthcare research. Methods for privacy-preserving data dissemination based on the rigorous differential privacy standard have been developed but they did not consider the characteristics of biomedical data and make full use of the available information. This often results in too much noise in the final outputs. We hypothesized that this situation can be alleviated by leveraging a small portion of open-consented data to improve utility without sacrificing privacy. We developed a hybrid privacy-preserving differentially private support vector machine (SVM) model that uses public data and private data together. Our model leverages the RBF kernel and can handle nonlinearly separable cases. Experiments showed that this approach outperforms two baselines: (1) SVMs that only use public data, and (2) differentially private SVMs that are built from private data. Our method demonstrated very close performance metrics compared to nonprivate SVMs trained on the private data.
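A minimal way to see the privacy mechanics sketched above is output perturbation: train on the private data, then add noise calibrated to the model's sensitivity before releasing it. The sketch below substitutes ridge regression for the paper's RBF-kernel SVM, and the 2/(n·λ·ε) noise scale is a schematic ERM-style calibration for illustration, not the paper's hybrid mechanism:

```python
import numpy as np

def private_linear_model(X, y, lam=1.0, epsilon=1.0, rng=None):
    """Ridge solution released with Laplace noise (schematic output perturbation).

    Assumes rows of X are scaled to unit norm. The sensitivity constant
    2/(n*lam) mirrors the usual regularized-ERM output-perturbation analysis
    and is illustrative rather than a tight bound for this exact estimator.
    """
    rng = rng if rng is not None else np.random.default_rng()
    n, d = X.shape
    w = np.linalg.solve(X.T @ X + lam * n * np.eye(d), X.T @ y)  # ridge fit
    scale = 2.0 / (n * lam * epsilon)      # Laplace scale per coordinate
    return w + rng.laplace(0.0, scale, size=d)
```

The hybrid idea in the paper goes further: the open-consented public portion anchors the model so that less noise needs to be spent on the private portion for the same privacy budget.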
Active machine learning-driven experimentation to determine compound effects on protein patterns.
Naik, Armaghan W; Kangas, Joshua D; Sullivan, Devin P; Murphy, Robert F
2016-02-03
High throughput screening determines the effects of many conditions on a given biological target. Currently, to estimate the effects of those conditions on other targets requires either strong modeling assumptions (e.g. similarities among targets) or separate screens. Ideally, data-driven experimentation could be used to learn accurate models for many conditions and targets without doing all possible experiments. We have previously described an active machine learning algorithm that can iteratively choose small sets of experiments to learn models of multiple effects. We now show that, with no prior knowledge and with liquid handling robotics and automated microscopy under its control, this learner accurately learned the effects of 48 chemical compounds on the subcellular localization of 48 proteins while performing only 29% of all possible experiments. The results represent the first practical demonstration of the utility of active learning-driven biological experimentation in which the set of possible phenotypes is unknown in advance.
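The active learner described chooses which experiments to run next. A generic pool-based sketch, not the authors' algorithm (which also models effect structure across targets), scores untried experiments by predictive uncertainty (entropy here, an assumed criterion) and returns the most informative batch:

```python
import math

def active_learning_round(pool, predict_proba, batch_size=3):
    """One pool-based active-learning round: rank untested experiments by
    the entropy of the current model's predicted outcome distribution and
    return the most informative batch to run next."""
    def entropy(p):
        return -sum(q * math.log(q) for q in p if q > 0)
    ranked = sorted(pool, key=lambda e: entropy(predict_proba(e)), reverse=True)
    return ranked[:batch_size]
```

In the paper's setting, `pool` would be untested (compound, protein) pairs and the chosen batch would be dispatched to the liquid handling robots and microscope.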
Cervical cancer survival prediction using hybrid of SMOTE, CART and smooth support vector machine
NASA Astrophysics Data System (ADS)
Purnami, S. W.; Khasanah, P. M.; Sumartini, S. H.; Chosuvivatwong, V.; Sriplung, H.
2016-04-01
According to the WHO, a patient dies of cervical cancer every two minutes. The high mortality rate is due to a lack of awareness among women of the importance of early detection. Several factors are thought to influence the survival of cervical cancer patients, including age, anemia status, stage, type of treatment, complications and secondary disease. This study aims to classify/predict cervical cancer survival based on those factors. Several classification methods were used: classification and regression trees (CART), the smooth support vector machine (SSVM), and the three-order spline SSVM (TSSVM). Since the cervical cancer data are imbalanced, the synthetic minority oversampling technique (SMOTE) is used to handle the imbalanced dataset. The performance of these methods is evaluated using accuracy, sensitivity and specificity. The results of this study show that balancing the data with SMOTE as a preprocessing step improves classification performance. The SMOTE-SSVM method provided better results than SMOTE-TSSVM and SMOTE-CART.
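SMOTE, the balancing step used here, synthesizes new minority-class samples by interpolating between existing ones and their nearest minority-class neighbours. A minimal pure-Python sketch (in practice one would typically use `imblearn.over_sampling.SMOTE`; `k` and the seed are illustrative):

```python
import random

def smote(minority, n_synthetic, k=3, seed=42):
    """Minimal SMOTE sketch: create synthetic minority-class samples by
    interpolating between a random minority sample and one of its k
    nearest minority-class neighbours."""
    rng = random.Random(seed)
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    synthetic = []
    for _ in range(n_synthetic):
        x = rng.choice(minority)
        # k nearest minority neighbours of x (excluding x itself)
        neighbours = sorted((s for s in minority if s is not x),
                            key=lambda s: dist2(x, s))[:k]
        nb = rng.choice(neighbours)
        t = rng.random()  # interpolation factor in [0, 1)
        synthetic.append([a + t * (b - a) for a, b in zip(x, nb)])
    return synthetic
```

Because each synthetic point lies on a segment between two real minority points, the oversampled class occupies the same region of feature space rather than merely repeating existing rows.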
A support vector machine based control application to the experimental three-tank system.
Iplikci, Serdar
2010-07-01
This paper presents a support vector machine (SVM) approach to generalized predictive control (GPC) of multiple-input multiple-output (MIMO) nonlinear systems. The higher generalization potential of SVM algorithms, together with their avoidance of local minima, motivated us to employ them for modeling MIMO systems. Based on the SVM model, detailed and compact formulations for calculating predictions and gradient information, which are used in the computation of the optimal control action, are given in the paper. The proposed MIMO SVM-based GPC method has been verified on an experimental three-tank liquid level control system. Experimental results have shown that the proposed method can handle the control task successfully for different reference trajectories. Moreover, a detailed discussion of data gathering, model selection and the effects of the control parameters is given in this paper. 2010 ISA. Published by Elsevier Ltd. All rights reserved.
A relevance vector machine-based approach with application to oil sand pump prognostics.
Hu, Jinfei; Tse, Peter W
2013-09-18
Oil sand pumps are widely used in the mining industry for the delivery of mixtures of abrasive solids and liquids. Because they operate under highly adverse conditions, these pumps usually experience significant wear. Consequently, equipment owners are quite often forced to invest substantially in system maintenance to avoid unscheduled downtime. In this study, an approach combining relevance vector machines (RVMs) with a sum of two exponential functions was developed to predict the remaining useful life (RUL) of field pump impellers. To handle field vibration data, a novel feature extracting process was proposed to arrive at a feature varying with the development of damage in the pump impellers. A case study involving two field datasets demonstrated the effectiveness of the developed method. Compared with standalone exponential fitting, the proposed RVM-based model was much better able to predict the remaining useful life of pump impellers.
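The degradation model named in this abstract is a sum of two exponentials. Once its parameters are fitted (by the RVM-assisted regression described, or e.g. `scipy.optimize.curve_fit`), the remaining useful life is the time left until the fitted curve crosses a failure threshold. A sketch of that last step, with illustrative parameter names and a bisection search:

```python
import math

def health_indicator(t, a, b, c, d):
    """Sum-of-two-exponentials degradation model: h(t) = a*e^(b*t) + c*e^(d*t)."""
    return a * math.exp(b * t) + c * math.exp(d * t)

def remaining_useful_life(params, t_now, threshold, t_max=1e4):
    """Time until the fitted (growing) indicator crosses the failure
    threshold, located by bisection; None if not reached before t_max."""
    if health_indicator(t_max, *params) < threshold:
        return None  # threshold not reached within the horizon
    lo, hi = t_now, t_max
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if health_indicator(mid, *params) < threshold:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi) - t_now
```

For example, with params (0, 0, 1, 0.01) the model reduces to h(t) = e^(0.01t), so a threshold of e is crossed at t = 100.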
Wire connector classification with machine vision and a novel hybrid SVM
NASA Astrophysics Data System (ADS)
Chauhan, Vedang; Joshi, Keyur D.; Surgenor, Brian W.
2018-04-01
A machine vision-based system has been developed and tested that uses a novel hybrid Support Vector Machine (SVM) in a part inspection application with clear plastic wire connectors. The application required the system to differentiate between 4 different known styles of connectors plus one unknown style, for a total of 5 classes. The requirement to handle an unknown class is what necessitated the hybrid approach. The system was trained with the 4 known classes and tested with 5 classes (the 4 known plus the 1 unknown). The hybrid classification approach used two layers of SVMs: one layer was semi-supervised and the other layer was supervised. The semi-supervised SVM was a special case of unsupervised machine learning that classified test images as one of the 4 known classes (to accept) or as the unknown class (to reject). The supervised SVM classified test images as one of the 4 known classes and consequently would give false positives (FPs). Two methods were tested. The difference between the methods was that the order of the layers was switched. The method with the semi-supervised layer first gave an accuracy of 80% with 20% FPs. The method with the supervised layer first gave an accuracy of 98% with 0% FPs. Further work is being conducted to see if the hybrid approach works with other applications that have an unknown class requirement.
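The two-layer idea, first deciding known versus unknown and then classifying among the known styles, can be sketched with a simple distance-based reject rule standing in for the semi-supervised SVM layer. This is illustrative only; the paper uses SVMs for both layers:

```python
def train_centroids(X, y):
    """Per-class mean vectors (a toy stand-in for the supervised layer)."""
    sums, counts = {}, {}
    for x, label in zip(X, y):
        acc = sums.setdefault(label, [0.0] * len(x))
        for i, v in enumerate(x):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {c: [v / counts[c] for v in acc] for c, acc in sums.items()}

def classify_with_reject(x, centroids, reject_dist):
    """Layer 1: reject as 'unknown' if x is far from every known class.
    Layer 2: otherwise assign the nearest-centroid class."""
    def dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
    best = min(centroids, key=lambda c: dist(x, centroids[c]))
    return best if dist(x, centroids[best]) <= reject_dist else "unknown"
```

The ordering question the paper studies corresponds to whether the reject test runs before or after the known-class decision; here the reject test is folded into a single pass for brevity.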
Elkhoudary, Mahmoud M; Naguib, Ibrahim A; Abdel Salam, Randa A; Hadad, Ghada M
2017-05-01
Four accurate, sensitive and reliable stability-indicating chemometric methods were developed for the quantitative determination of Agomelatine (AGM), whether in pure form or in pharmaceutical formulations. Two supervised learning machine methods, linear artificial neural networks preceded by principal component analysis (PC-linANN) and linear support vector regression (linSVR), were compared with two principal-component-based methods, principal component regression (PCR) and partial least squares (PLS), for the spectrofluorimetric determination of AGM and its degradants. The results showed the benefits of using linear learning machine methods and the inherent merits of their algorithms in handling overlapped noisy spectral data, especially during the challenging determination of the AGM alkaline and acidic degradants (DG1 and DG2). The relative mean squared errors of prediction (RMSEP) of the proposed models for the determination of AGM were 1.68, 1.72, 0.68 and 0.22 for PCR, PLS, SVR and PC-linANN, respectively. The results showed the superiority of supervised learning machine methods over principal-component-based methods. Moreover, the results suggested that linANN is the method of choice for the determination of components present in low amounts with similar overlapped spectra and a narrow linearity range. Comparison between the proposed chemometric models and a reported HPLC method revealed the comparable performance and quantification power of the proposed models.
Modern laser technologies used for cutting textile materials
NASA Astrophysics Data System (ADS)
Isarie, Claudiu; Dragan, Anca; Isarie, Laura; Nastase, Dan
2006-02-01
With modern laser technologies we can cut multiple layers at once, yielding high production levels and short setup times between cutting runs. One example is the cutting of Nylon 66, a material used to manufacture automobile airbags. With a laser, up to seven layers of Nylon 66 can be cut in one pass, which means high production rates on a single machine. An airbag is a precisely crafted piece of critical safety equipment that must be built to very high levels of precision in a mass production environment. The synthetic material used for airbags can, of course, also be cut by a conventional fixed-blade system, but for high production rates and long-term low maintenance, laser cutting is most suitable. Most systems are equipped with two material handling systems, so that one half of the table can be cutting while the finished product is removed from the other half and new stock material is laid out. The laser system is reliable and adaptable to any flatbed-cutting task. Computer-controlled industrial cutting and plotting machines are the latest offerings from a well-established and experienced industrial engineering company dedicated to reducing cutting costs and boosting productivity in today's competitive industrial machine tool market. In this way, just one machine can carry out a multitude of production tasks. The authors have studied the cutting parameters for different textile materials in order to reach the maximum output of the process.
Overview Of Dry-Etch Techniques
NASA Astrophysics Data System (ADS)
Salzer, John M.
1986-08-01
With pattern dimensions shrinking, dry methods of etching providing controllable degrees of anisotropy become a necessity. A number of different configurations of equipment - inline, hex, planar, barrel - have been offered, and within each type, there are numerous significant variations. Further, each specific type of machine must be perfected over a complex, interactive parameter space to achieve suitable removal of various materials. Among the most critical system parameters are the choice of cathode or anode to hold the wafers, the chamber pressure, the plasma excitation frequency, and the electrode and magnetron structures. Recent trends include the use of vacuum load locks, multiple chambers, multiple electrodes, downstream etching or stripping, and multistep processes. A major percentage of etches in production handle three materials: polysilicon, oxide and aluminum. Recent process developments have targeted refractory metals, their silicides, and, with increasing emphasis, silicon trenching. Indeed, with new VLSI structures, silicon trenching has become the process of greatest interest. For stripping, dry processes provide advantages other than anisotropy. Here, too, new configurations and methods have been introduced recently. While wet processes are less than desirable from a number of viewpoints (handling, safety, disposal, venting, classes of clean room, automatability), dry methods are still being perfected as a direct, universal replacement. The paper will give an overview of these machine structures and process solutions, together with examples of interest. These findings and the trends discussed are based on a semiannual survey of manufacturers and users of the various types of equipment.
Effects of Krankcycle Training on Performance and Body Composition in Wheelchair Users.
Čichoň, Rostislav; Maszczyk, Adam; Stastny, Petr; Uhlíř, Petr; Petr, Miroslav; Doubrava, Ondřej; Mostowik, Aleksandra; Gołaś, Artur; Cieszczyk, Paweł; Żmijewski, Piotr
2015-11-22
Innovation in training equipment is important for increasing training effectiveness, performance and changes in body composition, especially in wheelchair users with paraplegia. The main objective of a workout session is to induce an adaptation stimulus, which requires overload of the involved muscles by voluntary effort, yet this overload may be strongly influenced by the extent of the spinal cord lesion. The Krankcycle construction is designed to allow exercise from any wheelchair, with adjustable height and width of the crank handles, and even the grip handle may be altered. The aim of this study was to determine the differences in body composition, performance and the rate of perceived exertion (RPE) in paraplegics with different levels of paralysis after a 12-week unilateral training programme on Krankcycle equipment (a crank machine). The study sample included four men and one woman with different spinal lesion levels. The 12-week programme was successfully completed by four participants, while one subject was injured during the intervention. Three participants were paraplegic and one was quadriplegic with innervation of the biceps humeri, triceps humeri and deltoideus. The 30-minute Krankcycle programme was followed by four other exercises, which were performed one after another rather than in a circuit training manner, as the latter would result in much longer rest periods between exercises because paraplegics have to be secured with straps during exercise on hydraulic machines. The RPE after the workout decreased following the twelve-week adaptation period.
Behavioral Modeling for Mental Health using Machine Learning Algorithms.
Srividya, M; Mohanavalli, S; Bhalaji, N
2018-04-03
Mental health is an indicator of the emotional, psychological and social well-being of an individual. It determines how an individual thinks, feels and handles situations. Positive mental health helps one work productively and realize one's full potential. Mental health is important at every stage of life, from childhood and adolescence through adulthood. Many factors contribute to mental health problems that lead to mental illness, such as stress, social anxiety, depression, obsessive-compulsive disorder, drug addiction, and personality disorders. It is becoming increasingly important to determine the onset of mental illness in order to maintain a proper life balance. Machine learning algorithms and Artificial Intelligence (AI) can be fully harnessed for predicting the onset of mental illness. Such applications, when implemented in real time, will benefit society by serving as a monitoring tool for individuals with deviant behavior. This research work proposes to apply various machine learning algorithms such as support vector machines, decision trees, the naïve Bayes classifier, the K-nearest neighbor classifier and logistic regression to identify the state of mental health in a target group. The responses obtained from the target group for the designed questionnaire were first subjected to unsupervised learning techniques. The labels obtained as a result of clustering were validated by computing the Mean Opinion Score. These cluster labels were then used to build classifiers to predict the mental health of an individual. Populations from various groups, such as high school students, college students and working professionals, were considered as target groups. The research presents an analysis of applying the aforementioned machine learning algorithms to the target groups and also suggests directions for future work.
Radiological tele-immersion for next generation networks.
Ai, Z; Dech, F; Rasmussen, M; Silverstein, J C
2000-01-01
Since the acquisition of high-resolution three-dimensional patient images has become widespread, medical volumetric datasets (CT or MR) larger than 100 MB and encompassing more than 250 slices are common. It is important to make this patient-specific data quickly available and usable to many specialists at different geographical sites. Web-based systems have been developed to provide volume or surface rendering of medical data over networks with low fidelity, but these cannot adequately handle stereoscopic visualization or huge datasets. State-of-the-art virtual reality techniques and high-speed networks have made it possible to create an environment in which geographically distributed clinicians can immersively share these massive datasets in real time. An object-oriented method for instantaneously importing medical volumetric data into Tele-Immersive environments has been developed at the Virtual Reality in Medicine Laboratory (VRMedLab) at the University of Illinois at Chicago (UIC). This networked-VR setup is based on LIMBO, an application framework or template that provides the basic capabilities of Tele-Immersion. We have developed a modular, general-purpose Tele-Immersion program that automatically combines 3D medical data with the methods for handling the data. For this purpose a DICOM loader for IRIS Performer has been developed. The loader was designed for SGI machines as a shared object that is executed at LIMBO's runtime. The loader loads not only the selected DICOM dataset but also methods for rendering, handling, and interacting with the data, bringing networked, real-time, stereoscopic interaction with radiological data to reality. Collaborative, interactive methods currently implemented in the loader include cutting planes and windowing. The Tele-Immersive environment has been tested on the UIC campus over an ATM network.
We tested the environment with three nodes: one ImmersaDesk at the VRMedLab, one CAVE at the Electronic Visualization Laboratory (EVL) on the east campus, and a CT scanner in the UIC Hospital. CT data was pulled directly from the scanner to the Tele-Immersion server in our laboratory, and the data was then synchronously distributed by our Onyx2 Rack server to all the VR setups. Rather than permitting medical volume visualization at a single VR device, the Tele-Immersive environment, by combining teleconferencing, tele-presence, and virtual reality, will enable geographically distributed clinicians to intuitively interact with the same medical volumetric models, point, gesture, converse, and see each other. This environment will bring together clinicians at different geographic locations to participate in Tele-Immersive consultation and collaboration.
NASA Astrophysics Data System (ADS)
Kumar, Ravi; Singh, Surya Prakash
2017-11-01
The dynamic cellular facility layout problem (DCFLP) is a well-known NP-hard problem. It has been estimated that the efficient design of a DCFLP reduces the manufacturing cost of products by maintaining the minimum material flow among all machines in all cells, as material flow contributes around 10-30% of the total product cost. However, being NP-hard, the DCFLP is very difficult to solve optimally in reasonable time. Therefore, this article proposes a novel similarity-score-based two-phase heuristic approach to solve the DCFLP optimally, considering multiple products manufactured over multiple time periods in the layout. In the first phase of the proposed heuristic, machine-cell clusters are created based on similarity scores between machines. These are provided as input to the second phase, which minimizes inter/intracell material handling costs and rearrangement costs over the entire planning period. The solution methodology of the proposed approach is demonstrated. To show the efficiency of the two-phase heuristic approach, 21 instances are generated and solved using the optimization software package LINGO. The results show that the proposed approach can optimally solve the DCFLP in reasonable time.
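Phase one of the heuristic clusters machines by similarity score. A widely used score in cell formation is the Jaccard coefficient over the sets of parts each machine processes; the greedy clustering rule and threshold below are illustrative, not the paper's exact procedure:

```python
def jaccard_similarity(parts_i, parts_j):
    """Similarity score between two machines: the Jaccard coefficient of
    the sets of parts they process (a common choice in cell formation)."""
    union = len(parts_i | parts_j)
    return len(parts_i & parts_j) / union if union else 0.0

def greedy_cells(machine_parts, threshold=0.5):
    """Phase-1 sketch: place a machine in an existing cell if its
    similarity to some machine already there meets the threshold,
    otherwise open a new cell."""
    cells = []
    for m, parts in machine_parts.items():
        for cell in cells:
            if any(jaccard_similarity(parts, machine_parts[o]) >= threshold
                   for o in cell):
                cell.append(m)
                break
        else:
            cells.append([m])
    return cells
```

Machines that share most of their part routings end up in the same cell, which is what lets phase two keep intercell material flow low.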
Osis, Sean T; Hettinga, Blayne A; Ferber, Reed
2016-05-01
An ongoing challenge in the application of gait analysis to clinical settings is the standardized detection of temporal events, with unobtrusive and cost-effective equipment, for a wide range of gait types. The purpose of the current study was to investigate a targeted machine learning approach for the prediction of timing for foot strike (or initial contact) and toe-off, using only kinematics, for walking, forefoot running, and heel-toe running. Data were categorized by gait type and split into a training set (∼30%) and a validation set (∼70%). A principal component analysis was performed, and separate linear models were trained and validated for foot strike and toe-off, using ground reaction force data as a gold standard for event timing. Results indicate the model predicted both foot strike and toe-off timing to within 20 ms of the gold standard for more than 95% of cases across walking and running gaits. The machine learning approach provides robust timing predictions for clinical use, and may offer a flexible methodology to handle new events and gait types. Copyright © 2016 Elsevier B.V. All rights reserved.
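The pipeline described, PCA on kinematic features followed by a linear model regressed against gold-standard event times, can be sketched with NumPy on synthetic stand-in data. The feature dimensions, weights, and noise level below are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 6))         # stand-in kinematic features
X[:, :3] *= 10.0                      # make the first 3 features high-variance
w_true = np.array([2.0, -1.0, 0.5, 0.0, 0.0, 0.0])
t_event = X @ w_true + 0.01 * rng.normal(size=100)  # gold-standard timings

# PCA via SVD of the centred feature matrix; keep 3 components
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:3].T

# linear model from PC scores (plus intercept) to event time
A = np.column_stack([scores, np.ones(len(scores))])
coef, *_ = np.linalg.lstsq(A, t_event, rcond=None)
pred = A @ coef
r2 = 1.0 - np.sum((t_event - pred) ** 2) / np.sum((t_event - t_event.mean()) ** 2)
```

Because the simulated timing signal lies in the high-variance directions, the three retained components capture it almost entirely, mirroring why PCA-then-regression works when event timing correlates with the dominant modes of kinematic variation.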
NASA Astrophysics Data System (ADS)
Santosa, B.; Siswanto, N.; Fiqihesa
2018-04-01
This paper proposes a discrete Particle Swarm Optimization (PSO) algorithm to solve the limited-wait hybrid flowshop scheduling problem with multiple objectives. Flow shop scheduling represents the condition in which several machines are arranged in series and each job must be processed on each machine in the same sequence. The objective functions are minimizing completion time (makespan), total tardiness time, and total machine idle time. Flow shop scheduling models continually evolve to represent the real production system more accurately. Since flow shop scheduling is an NP-hard problem, metaheuristics are the most suitable solution methods. One such metaheuristic is Particle Swarm Optimization (PSO), an algorithm based on the behavior of a swarm. Originally, PSO was intended to solve continuous optimization problems. Since flow shop scheduling is a discrete optimization problem, we need to modify PSO to fit the problem. The modification is done using a probability transition matrix mechanism. To handle the multiple objectives, we use a Pareto-optimal variant (MPSO). The results of MPSO are better than those of PSO because the MPSO solution set has a higher probability of containing the optimal solution; in addition, the MPSO solution set is closer to the optimal solution.
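With three objectives (makespan, total tardiness, total idle time), a Pareto-based MPSO keeps a set of non-dominated solutions rather than a single best. The dominance test at the heart of that is simple; this is a generic sketch, not the paper's full algorithm:

```python
def dominates(a, b):
    """a Pareto-dominates b if a is no worse in every objective and
    strictly better in at least one (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep the non-dominated (makespan, tardiness, idle-time) vectors."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]
```

In MPSO, each particle's personal best and the swarm's archive would be maintained with exactly this filter, so trade-offs between the three objectives survive instead of being collapsed into one weighted score.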
Pirooznia, Mehdi; Deng, Youping
2006-12-12
Graphical user interface (GUI) software promotes novelty by allowing users to extend its functionality. SVM Classifier is a cross-platform graphical application that handles very large datasets well. The purpose of this study is to create a GUI application that allows SVM users to perform SVM training, classification and prediction. The GUI provides user-friendly access to state-of-the-art SVM methods embodied in the LIBSVM implementation of the Support Vector Machine. We implemented the Java interface using standard Swing libraries. We used sample data from a breast cancer study to test classification accuracy, and achieved 100% accuracy in classification among the BRCA1-BRCA2 samples with the RBF kernel of the SVM. We have developed a Java GUI application that allows SVM users to perform SVM training, classification and prediction, and have demonstrated that support vector machines can accurately classify genes into functional categories based upon expression data from DNA microarray hybridization experiments. Among the different kernel functions that we examined, the SVM that uses a radial basis kernel function provides the best performance. The SVM Classifier is available at http://mfgn.usm.edu/ebl/svm/.
NASA Astrophysics Data System (ADS)
Pan, Leyun; Cheng, Caixia; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia
2017-05-01
A variety of compartment models are used for the quantitative analysis of dynamic positron emission tomography (PET) data. Traditionally, these models use an iterative fitting (IF) method to find the least squares between the measured and calculated values over time, which may encounter some problems such as the overfitting of model parameters and a lack of reproducibility, especially when handling noisy data or error data. In this paper, a machine learning (ML) based kinetic modeling method is introduced, which can fully utilize a historical reference database to build a moderate kinetic model directly dealing with noisy data but not trying to smooth the noise in the image. Also, due to the database, the presented method is capable of automatically adjusting the models using a multi-thread grid parameter searching technique. Furthermore, a candidate competition concept is proposed to combine the advantages of the ML and IF modeling methods, which could find a balance between fitting to historical data and to the unseen target curve. The machine learning based method provides a robust and reproducible solution that is user-independent for VOI-based and pixel-wise quantitative analysis of dynamic PET data.
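The compartment models being fitted map a plasma input curve to a tissue time-activity curve; for a one-tissue compartment model this is a convolution with an exponential kernel. A discrete sketch (K1, k2 and the time grid are illustrative; the paper's ML method fits such models against a reference database rather than by direct least squares):

```python
import math

def one_tissue_tac(cp, k1, k2, dt):
    """Tissue time-activity curve for a one-tissue compartment model:
    Ct(t) = K1 * integral_0^t Cp(s) * exp(-k2*(t - s)) ds,
    evaluated by a simple discrete convolution on a uniform time grid."""
    ct = []
    for i in range(len(cp)):
        acc = 0.0
        for j in range(i + 1):
            acc += cp[j] * math.exp(-k2 * (i - j) * dt) * dt
        ct.append(k1 * acc)
    return ct
```

For a constant plasma input the curve rises toward the steady-state value K1/k2, which is the kind of forward model the iterative-fitting and ML-based methods both have to evaluate repeatedly.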
NASA Technical Reports Server (NTRS)
Malin, J. T.; Carnes, J. G. (Principal Investigator)
1981-01-01
The U.S. corn and soybeans exploratory experiment is described, which consisted of evaluations of two technology components of a production forecasting system: classification procedures (crop labeling and proportion estimation at the level of a sampling unit) and sampling and aggregation procedures. The results from the labeling evaluations indicate that the corn and soybeans labeling procedure works very well in the U.S. corn belt with full-season (after tasseling) LANDSAT data. The procedure should be readily adaptable to the corn and soybeans labeling required for subsequent exploratory experiments or pilot tests. The machine classification procedures evaluated in this experiment were not effective in improving the proportion estimates. The corn proportions produced by the machine procedures had a large bias when the bias correction was not performed; this bias was caused by the manner in which the machine procedures handled spectrally impure pixels. The simulation test indicated that the weighted aggregation procedure performed quite well. Although further work can be done to improve both the simulation tests and the aggregation procedure, the results of this test show that the procedure should serve as a useful baseline in future exploratory experiments and pilot tests.
NASA Astrophysics Data System (ADS)
Setiawan, A.; Wangsaputra, R.; Martawirya, Y. Y.; Halim, A. H.
2016-02-01
This paper deals with Flexible Manufacturing System (FMS) production rescheduling due to the unavailability of cutting tools, caused either by cutting tool failure or by reaching the tool-life limit. The FMS consists of parallel identical machines integrated with an automatic material handling system, and it runs fully automatically. Each machine has the same cutting tool configuration, consisting of different geometrical cutting tool types in each tool magazine. A job usually takes two stages, each with sequential operations allocated to machines considering the cutting tool life. In a real situation, a cutting tool can fail before its tool life is reached. The objective of this paper is to develop a dynamic scheduling algorithm for when a cutting tool breaks during unmanned operation and rescheduling is needed. The algorithm consists of four steps: the first generates the initial schedule, the second determines the cutting tool failure time, the third determines the system status at the failure time, and the fourth reschedules the unfinished jobs. The approaches used to solve the problem are complete-reactive scheduling and robust-proactive scheduling. The new schedules differ from the initial schedule in the starting and completion times of each operation.
PIMS sequencing extension: a laboratory information management system for DNA sequencing facilities.
Troshin, Peter V; Postis, Vincent Lg; Ashworth, Denise; Baldwin, Stephen A; McPherson, Michael J; Barton, Geoffrey J
2011-03-07
Facilities that provide a service for DNA sequencing typically support large numbers of users and experiment types. The cost of services is often reduced by the use of liquid handling robots but the efficiency of such facilities is hampered because the software for such robots does not usually integrate well with the systems that run the sequencing machines. Accordingly, there is a need for software systems capable of integrating different robotic systems and managing sample information for DNA sequencing services. In this paper, we describe an extension to the Protein Information Management System (PIMS) that is designed for DNA sequencing facilities. The new version of PIMS has a user-friendly web interface and integrates all aspects of the sequencing process, including sample submission, handling and tracking, together with capture and management of the data. The PIMS sequencing extension has been in production since July 2009 at the University of Leeds DNA Sequencing Facility. It has completely replaced manual data handling and simplified the tasks of data management and user communication. Samples from 45 groups have been processed with an average throughput of 10000 samples per month. The current version of the PIMS sequencing extension works with Applied Biosystems 3130XL 96-well plate sequencer and MWG 4204 or Aviso Theonyx liquid handling robots, but is readily adaptable for use with other combinations of robots. PIMS has been extended to provide a user-friendly and integrated data management solution for DNA sequencing facilities that is accessed through a normal web browser and allows simultaneous access by multiple users as well as facility managers. The system integrates sequencing and liquid handling robots, manages the data flow, and provides remote access to the sequencing results. The software is freely available, for academic users, from http://www.pims-lims.org/.
Basics of robotics and manipulators in endoscopic surgery.
Rininsland, H H
1993-06-01
The experience with sophisticated remote handling systems for nuclear operations in inaccessible rooms can to a large extent be transferred to the development of robotics and telemanipulators for endoscopic surgery. A telemanipulator system is described consisting of manipulator, endeffector and tools, 3-D video-endoscope, sensors, intelligent control system, modeling and graphic simulation and man-machine interfaces as the main components or subsystems. Such a telemanipulator seems to be medically worthwhile and technically feasible, but needs a lot of effort from different scientific disciplines to become a safe and reliable instrument for future endoscopic surgery.
Storage and retrieval of mass spectral information
NASA Technical Reports Server (NTRS)
Hohn, M. E.; Humberston, M. J.; Eglinton, G.
1977-01-01
Computer handling of mass spectra serves two main purposes: the interpretation of the occasional, problematic mass spectrum, and the identification of the large number of spectra generated in the gas-chromatographic-mass spectrometric (GC-MS) analysis of complex natural and synthetic mixtures. Methods available fall into the three categories of library search, artificial intelligence, and learning machine. Optional procedures for coding, abbreviating and filtering a library of spectra minimize time and storage requirements. Newer techniques make increasing use of probability and information theory in accessing files of mass spectral information.
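The library-search category described above can be illustrated with a minimal sketch: reference spectra are ranked by a cosine (dot-product) match score against an unknown spectrum. The data layout ({m/z: intensity} dicts), names, and toy spectra are illustrative assumptions, not from the source.

```python
import math

def cosine_similarity(spec_a, spec_b):
    """Dot-product match score between two spectra given as {m/z: intensity} dicts."""
    shared = set(spec_a) & set(spec_b)
    dot = sum(spec_a[mz] * spec_b[mz] for mz in shared)
    norm_a = math.sqrt(sum(v * v for v in spec_a.values()))
    norm_b = math.sqrt(sum(v * v for v in spec_b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def library_search(unknown, library, top_n=3):
    """Rank library entries by similarity to the unknown spectrum."""
    scores = [(name, cosine_similarity(unknown, spec)) for name, spec in library.items()]
    return sorted(scores, key=lambda s: s[1], reverse=True)[:top_n]
```

Coding or filtering the library (e.g. keeping only the strongest peaks per spectrum) would plug in before the search, as the abstract's remark on abbreviating and filtering suggests.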
Dose rate evaluation of workers on the operation floor in Fukushima-Daiichi Unit 3
NASA Astrophysics Data System (ADS)
Matsushita, Kaoru; Kurosawa, Masahiko; Shirai, Keisuke; Matsuoka, Ippei; Mukaida, Naoki
2017-09-01
At Fukushima Daiichi Nuclear Power Plant Unit 3, installation of a fuel handling machine is planned to support the removal of spent fuel. The dose rates at the workplace were calculated based on the source distribution measured using a collimator in order to confirm that the dose rates on the operation floor were within a manageable range. It was confirmed that the accuracy of the source distribution was C/M = 1.0-2.4. These dose rates were then used to plan the work on the operation floor.
Identification of granite varieties from colour spectrum data.
Araújo, María; Martínez, Javier; Ordóñez, Celestino; Vilán, José Antonio
2010-01-01
The granite processing sector of the northwest of Spain handles many varieties of granite with specific technical and aesthetic properties that command different prices in the natural stone market. Hence, correct granite identification and classification from the outset of processing to the end-product stage optimizes the management and control of stocks of granite slabs and tiles and facilitates the operation of traceability systems. We describe a methodology for automatically identifying granite varieties by processing spectral information captured by a spectrophotometer at various stages of processing using functional machine learning techniques.
Karski, Tomasz
2012-01-01
Observations from 1985-1995 and continuing to 2012 indicate that the development of so-called idiopathic scoliosis is connected with gait and with habitual, permanent "standing at ease" on the right leg. The scoliosis is a result of asymmetry of function: changed loading during gait and asymmetrical "at ease" standing, more prevalent on the right leg. Every type of scoliosis is connected with a corresponding "model of hips movements" (MHM) (Karski et al., 2006 [1]). This new classification clarifies the therapeutic approach to each type of scoliosis and provides the possibility to introduce causative prophylaxis.
NASA Technical Reports Server (NTRS)
Robinson, Peter; Shirley, Mark; Fletcher, Daryl; Alena, Rick; Duncavage, Dan; Lee, Charles
2003-01-01
All of the International Space Station (ISS) systems which require computer control depend upon the hardware and software of the Command and Data Handling (C&DH) system, currently a network of over 30 386-class computers called Multiplexer/Demultiplexers (MDMs) [18]. The Caution and Warning System (C&W) [7], a set of software tasks that runs on the MDMs, is responsible for detecting, classifying, and reporting errors in all ISS subsystems, including the C&DH. Fault Detection, Isolation and Recovery (FDIR) of these errors is typically handled with a combination of automatic and human effort. We are developing an Advanced Diagnostic System (ADS) to augment the C&W system with decision support tools that aid in root cause analysis and resolve differing human and machine C&DH state estimates. These tools, which draw from sources in model-based reasoning [16,29], will improve the speed and accuracy of flight controllers by reducing the uncertainty in C&DH state estimation, allowing for a more complete assessment of risk. We have run tests with ISS telemetry and focus on those C&W events which relate to the C&DH system itself. This paper describes our initial results and subsequent plans.
ATLAS Metadata Infrastructure Evolution for Run 2 and Beyond
NASA Astrophysics Data System (ADS)
van Gemmeren, P.; Cranshaw, J.; Malon, D.; Vaniachine, A.
2015-12-01
ATLAS developed and employed for Run 1 of the Large Hadron Collider a sophisticated infrastructure for metadata handling in event processing jobs. This infrastructure profits from a rich feature set provided by the ATLAS execution control framework, including standardized interfaces and invocation mechanisms for tools and services, segregation of transient data stores with concomitant object lifetime management, and mechanisms for handling occurrences asynchronous to the control framework's state machine transitions. This metadata infrastructure is evolving and being extended for Run 2 to allow its use and reuse in downstream physics analyses, analyses that may or may not utilize the ATLAS control framework. At the same time, multiprocessing versions of the control framework and the requirements of future multithreaded frameworks are leading to redesign of components that use an incident-handling approach to asynchrony. The increased use of scatter-gather architectures, both local and distributed, requires further enhancement of metadata infrastructure in order to ensure semantic coherence and robust bookkeeping. This paper describes the evolution of ATLAS metadata infrastructure for Run 2 and beyond, including the transition to dual-use tools—tools that can operate inside or outside the ATLAS control framework—and the implications thereof. It further examines how the design of this infrastructure is changing to accommodate the requirements of future frameworks and emerging event processing architectures.
Manipulation and handling processes off-line programming and optimization with use of K-Roset
NASA Astrophysics Data System (ADS)
Gołda, G.; Kampa, A.
2017-08-01
Contemporary trends in the development of efficient, flexible manufacturing systems require the practical implementation of modern "lean production" concepts that maximize customer value by minimizing waste in manufacturing and logistics processes. Every FMS is built on the basis of automated and robotized production cells. Besides flexible CNC machine tools and other equipment, industrial robots are primary elements of the system. In this study, the authors look for wastes of time and cost in real robot tasks during manipulation processes. To optimize handling and manipulation processes performed by robots, the application of modern off-line programming methods and computer simulation is the best solution, and the only way to minimize unnecessary movements and other instructions. The modelling of a robotized production cell and the off-line programming of Kawasaki robots in AS-Language are described. The simulation of the robotized workstation is realized with the virtual reality software K-Roset. The authors show the process of improving and optimizing industrial robot programs so as to minimize the number of useless manipulator movements and unnecessary instructions. This is done in order to shorten production cycle times, which also reduces the costs of handling, manipulation and the technological process.
NASA Astrophysics Data System (ADS)
Wolf, Nils; Hof, Angela
2012-10-01
Urban sprawl driven by shifts in tourism development produces new suburban landscapes of water consumption on Mediterranean coasts. Golf courses, ornamental 'Atlantic' gardens and swimming pools are the most striking artefacts of this transformation, threatening local water supply systems and exacerbating water scarcity. In the face of climate change, urban landscape irrigation is becoming increasingly important from a resource management point of view. This paper adopts urban remote sensing in a targeted mapping approach, using machine learning techniques and high-resolution satellite imagery (WorldView-2) to generate GIS-ready information for urban water consumption studies. Swimming pools, vegetation and - as a subgroup of vegetation - turf grass are extracted as important determinants of water consumption. For image analysis, the complex nature of urban environments suggests spatial-spectral classification, i.e. the complementary use of the spectral signature and spatial descriptors. Multiscale image segmentation provides the means to extract the spatial descriptors - namely object feature layers - which can be concatenated at pixel level to the spectral signature. This study assesses the value of object features using different machine learning techniques and amounts of labeled information for learning. The results indicate the benefit of the spatial-spectral approach when combined with appropriate classifiers such as tree-based ensembles or support vector machines, which can handle high dimensionality. Finally, a Random Forest classifier was chosen to deliver the classified input data for the estimation of evaporative water loss and net landscape irrigation requirements.
Food equipment manufacturer takes a slice out of its scrap rate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernard, D.; Hannahs, J.; Carter, M.
1996-09-01
The PMI Food Equipment Group began manufacturing circular slicer knives for its commercial Hobart line of slicers in the early 1930s. The company manufactures the only cast knife in the food industry. The cast knives offer superior edge retention and overall corrosion resistance. The slicer knives are cast in PMI's foundry. The casting process sometimes produces shrinkage voids or gas bubbles in the knife blank. Surface discontinuities often do not appear until rough cutting or final machining, i.e., after several hours of value-added manufacturing. Knife blanks with these discontinuities were scrapped and sent back to the foundry for remelting. To scrap the knives at that point meant the cost of casting plus the value-added machining added up to a considerable amount. Weld repair allows the recovery of casting and machining expenses equal to a significant percentage of the total manufacturing cost of slicer knives. Repair costs include welding, grinding, shipping, surface finishing and material handling. Other good applications for this GMAW-P process include repair of jet engine components, rotating process industry equipment, and hardfacing of cutting tools and dies. In addition, dissimilar metals and any material that is heat treated to develop its properties, such as precision investment castings, are excellent applications. The low resultant distortion, elimination of postweld heat treatment and non-line-of-sight welding capability solve thin-wall, limited-access and precision-machined component repair challenges.
Evaluating open-source cloud computing solutions for geosciences
NASA Astrophysics Data System (ADS)
Huang, Qunying; Yang, Chaowei; Liu, Kai; Xia, Jizhe; Xu, Chen; Li, Jing; Gui, Zhipeng; Sun, Min; Li, Zhenglong
2013-09-01
Many organizations are starting to adopt cloud computing to better utilize computing resources by taking advantage of its scalability, cost reduction, and easy-to-access characteristics. Many private or community cloud computing platforms are being built using open-source cloud solutions. However, little has been done to systematically compare and evaluate the features and performance of open-source solutions in supporting the geosciences. This paper provides a comprehensive study of three open-source cloud solutions: OpenNebula, Eucalyptus, and CloudStack. We compared a variety of features, capabilities, technologies and performances, including: (1) general features and supported services for cloud resource creation and management, (2) advanced capabilities for networking and security, and (3) the performance of the cloud solutions in provisioning and operating cloud resources, as well as the performance of virtual machines initiated and managed by the cloud solutions in supporting selected geoscience applications. Our study found that: (1) there are no significant performance differences in the central processing unit (CPU), memory and I/O of virtual machines created and managed by the different solutions; (2) OpenNebula has the fastest internal network, while both Eucalyptus and CloudStack have better virtual machine isolation and security strategies; (3) CloudStack has the fastest operations in handling virtual machines, images, snapshots, volumes and networking, followed by OpenNebula; and (4) the selected cloud computing solutions are capable of supporting concurrent intensive web applications, computing-intensive applications, and small-scale model simulations without intensive data communication.
Application of numerical grid generation for improved CFD analysis of multiphase screw machines
NASA Astrophysics Data System (ADS)
Rane, S.; Kovačević, A.
2017-08-01
Algebraic grid generation is widely used for discretization of the working domain of twin screw machines. It is fast and gives good control over the placement of grid nodes. However, the desired grid qualities, which should be able to handle multiphase flows such as oil injection, may at times be difficult to achieve. In order to obtain fast solutions for multiphase screw machines, it is important to further improve the quality and robustness of the computational grid. In this paper, a deforming grid of a twin screw machine is generated using algebraic transfinite interpolation to produce an initial mesh, upon which an elliptic partial differential equation (PDE) of the Poisson form is solved numerically to produce a smooth final computational mesh. The quality of the numerical cells and their distribution obtained by the differential method is greatly improved. In addition, a similar procedure is introduced to fully smoothen the transition of the partitioning rack curve between the rotors, thus improving the continuous movement of grid nodes and, in turn, the robustness and speed of the Computational Fluid Dynamics (CFD) solver. An analysis of an oil-injected twin screw compressor is presented to compare the improvements in grid quality factors in regions of importance such as the interlobe space, the radial tip and the core of the rotor. The proposed method, which combines algebraic and differential grid generation, offers significant improvement in grid quality and in the robustness of the numerical solution.
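The elliptic smoothing step described above can be sketched in miniature: starting from an algebraically generated grid, interior node coordinates are relaxed by Jacobi iterations of the Laplace equations (the homogeneous special case of the Poisson system the paper solves, without its source-term control of node clustering). The grid size and iteration count are illustrative assumptions.

```python
def smooth_grid(x, y, iterations=100):
    """Jacobi relaxation of the Laplace grid equations: each interior node
    moves to the average of its four neighbours; boundary nodes stay fixed."""
    ni, nj = len(x), len(x[0])
    for _ in range(iterations):
        new_x = [row[:] for row in x]
        new_y = [row[:] for row in y]
        for i in range(1, ni - 1):
            for j in range(1, nj - 1):
                new_x[i][j] = 0.25 * (x[i+1][j] + x[i-1][j] + x[i][j+1] + x[i][j-1])
                new_y[i][j] = 0.25 * (y[i+1][j] + y[i-1][j] + y[i][j+1] + y[i][j-1])
        x, y = new_x, new_y
    return x, y
```

A production solver would add the Poisson source terms to pull nodes toward regions such as the interlobe gap; this sketch only shows the smoothing mechanism.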
Efficiently modeling neural networks on massively parallel computers
NASA Technical Reports Server (NTRS)
Farber, Robert M.
1993-01-01
Neural networks are a very useful tool for analyzing and modeling complex real-world systems. Applying neural network simulations to real-world problems generally involves large amounts of data and massive amounts of computation. To efficiently handle the computational requirements of large problems, we have implemented at Los Alamos a highly efficient neural network compiler for serial computers, vector computers, vector parallel computers, and fine-grain SIMD computers such as the CM-2 Connection Machine. This paper describes the mapping used by the compiler to implement feed-forward backpropagation neural networks for a SIMD (Single Instruction Multiple Data) parallel computer. Thinking Machines Corporation has benchmarked our code at 1.3 billion interconnects per second (approximately 3 gigaflops) on a 64,000-processor CM-2 Connection Machine (Singer 1990). This mapping is applicable to other SIMD computers and can be implemented on MIMD computers such as the CM-5 Connection Machine. Our mapping has virtually no communications overhead, with the exception of the communications required for a global summation across the processors (which has a sub-linear runtime growth on the order of O(log(number of processors))). We can efficiently model very large neural networks which have many neurons and interconnects, and our mapping can extend to arbitrarily large networks (within memory limitations) by merging the memory space of separate processors with fast adjacent-processor interprocessor communications. This paper considers the simulation of feed-forward neural networks only, although the method is extendable to recurrent networks.
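The global summation with sub-linear O(log p) growth mentioned above can be sketched as a pairwise tree reduction over p per-processor partial sums. This is a generic serial illustration of the combining pattern, not the authors' CM-2 code.

```python
def tree_sum(partial_sums):
    """Binary-tree reduction: p partial sums are combined pairwise, halving the
    count each step, so the total is reached in ceil(log2(p)) combining steps."""
    vals = list(partial_sums)
    steps = 0
    while len(vals) > 1:
        paired = [vals[i] + vals[i + 1] for i in range(0, len(vals) - 1, 2)]
        if len(vals) % 2:            # odd element carries over to the next level
            paired.append(vals[-1])
        vals = paired
        steps += 1
    return vals[0], steps
```

On a SIMD machine each level of the tree is one parallel communication step, which is where the logarithmic runtime comes from.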
Lei, Tailong; Sun, Huiyong; Kang, Yu; Zhu, Feng; Liu, Hui; Zhou, Wenfang; Wang, Zhe; Li, Dan; Li, Youyong; Hou, Tingjun
2017-11-06
Xenobiotic chemicals and their metabolites are mainly excreted out of our bodies by the urinary tract through the urine. Chemical-induced urinary tract toxicity is one of the main reasons that cause failure during drug development, and it is a common adverse event for medications, natural supplements, and environmental chemicals. Despite its importance, there are only a few in silico models for assessing urinary tract toxicity for a large number of compounds with diverse chemical structures. Here, we developed a series of qualitative and quantitative structure-activity relationship (QSAR) models for predicting urinary tract toxicity. In our study, the recursive feature elimination method incorporated with random forests (RFE-RF) was used for dimension reduction, and then eight machine learning approaches were used for QSAR modeling, i.e., relevance vector machine (RVM), support vector machine (SVM), regularized random forest (RRF), C5.0 trees, eXtreme gradient boosting (XGBoost), AdaBoost.M1, SVM boosting (SVMBoost), and RVM boosting (RVMBoost). For building classification models, the synthetic minority oversampling technique was used to handle the imbalanced data set problem. Among all the machine learning approaches, SVMBoost based on the RBF kernel achieves both the best quantitative (q_ext^2 = 0.845) and qualitative predictions for the test set (MCC of 0.787, AUC of 0.893, sensitivity of 89.6%, specificity of 94.1%, and global accuracy of 90.8%). The application domains were then analyzed, and all of the tested chemicals fall within the application domain coverage. We also examined the structure features of the chemicals with large prediction errors. In brief, both the regression and classification models developed by the SVMBoost approach have reliable prediction capability for assessing chemical-induced urinary tract toxicity.
Automatic anatomy recognition in post-tonsillectomy MR images of obese children with OSAS
NASA Astrophysics Data System (ADS)
Tong, Yubing; Udupa, Jayaram K.; Odhner, Dewey; Sin, Sanghun; Arens, Raanan
2015-03-01
Automatic Anatomy Recognition (AAR) is a recently developed approach for automatic body-wide organ segmentation. We previously tested that methodology on image cases with some pathology where the organs were not distorted significantly. In this paper, we present an advancement of AAR to handle organs which may have been modified or resected by surgical intervention. We focus on MRI of the neck in pediatric Obstructive Sleep Apnea Syndrome (OSAS). The proposed method consists of an AAR step followed by support vector machine (SVM) techniques to detect the presence or absence of organs. The AAR step employs a hierarchical organization of the organs for model building. For each organ, a fuzzy model over a population is built. The model of the body region is then described in terms of the fuzzy models and a host of other descriptors, which include parent-to-offspring relationships estimated over the population. Organs are recognized following the organ hierarchy by using an optimal threshold-based search. The SVM step subsequently checks for evidence of the presence of organs. Experimental results show that AAR techniques can be combined with machine learning strategies within the AAR recognition framework for good performance in recognizing missing organs, in our case missing tonsils in post-tonsillectomy images as well as in simulated tonsillectomy images. The previous recognition performance is maintained, achieving an organ localization accuracy within 1 voxel when the organ is actually not removed. To our knowledge, no methods have been reported to date for handling significantly deformed or missing organs, especially in neck MRI.
Handling imbalance data in churn prediction using combined SMOTE and RUS with bagging method
NASA Astrophysics Data System (ADS)
Pura Hartati, Eka; Adiwijaya; Arif Bijaksana, Moch
2018-03-01
Customer churn has become a significant problem, and also a challenge, for telecommunication companies such as PT. Telkom Indonesia. The company must evaluate the scale of the churn problem so that management can adopt appropriate strategies to minimize churn and retain customers. The churn customer data categorized as churn Atas Permintaan Sendiri (APS; churn at the customer's own request) in this company are imbalanced, and handling such data is one of the challenging tasks in machine learning. This study investigates how to handle class imbalance in churn prediction using the combined Synthetic Minority Over-Sampling Technique (SMOTE) and Random Under-Sampling (RUS) with the Bagging method, for better churn prediction performance. The dataset used is broadband Internet data collected from Telkom Regional 6 Kalimantan. The research first applies data preprocessing to balance the imbalanced dataset and to select features with the SMOTE and RUS sampling techniques, and then builds the churn prediction model using Bagging and C4.5.
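The SMOTE half of the combined sampling scheme can be sketched in pure Python: each synthetic minority sample is interpolated between a minority point and one of its k nearest minority neighbours (RUS is then simply `random.sample` over the majority class). The function names, parameters, and toy data are illustrative assumptions, not the study's implementation.

```python
import random

def smote(minority, n_new, k=3, seed=0):
    """Minimal SMOTE sketch: generate n_new synthetic samples, each a random
    interpolation between a minority point and one of its k nearest neighbours."""
    rng = random.Random(seed)

    def sq_dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

    synthetic = []
    for _ in range(n_new):
        base = rng.choice(minority)
        neighbours = sorted((p for p in minority if p is not base),
                            key=lambda p: sq_dist(base, p))[:k]
        neigh = rng.choice(neighbours)
        gap = rng.random()   # interpolation factor in [0, 1)
        synthetic.append(tuple(b + gap * (n - b) for b, n in zip(base, neigh)))
    return synthetic
```

Because each synthetic point lies on a segment between two minority points, the oversampled class stays inside the convex hull of the original minority samples.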
Sakr, Sherif; Elshawi, Radwa; Ahmed, Amjad M; Qureshi, Waqas T; Brawner, Clinton A; Keteyian, Steven J; Blaha, Michael J; Al-Mallah, Mouaz H
2017-12-19
Prior studies have demonstrated that cardiorespiratory fitness (CRF) is a strong marker of cardiovascular health. Machine learning (ML) can enhance the prediction of outcomes through classification techniques that classify the data into predetermined categories. The aim of this study is to present an evaluation and comparison of how machine learning techniques can be applied to medical records of cardiorespiratory fitness, and how the various techniques differ in their capability to predict medical outcomes (e.g. mortality). We use data on 34,212 patients free of known coronary artery disease or heart failure who underwent clinician-referred exercise treadmill stress testing at Henry Ford Health Systems between 1991 and 2009 and had a complete 10-year follow-up. Seven machine learning classification techniques were evaluated: Decision Tree (DT), Support Vector Machine (SVM), Artificial Neural Networks (ANN), Naïve Bayesian Classifier (BC), Bayesian Network (BN), K-Nearest Neighbor (KNN) and Random Forest (RF). In order to handle the imbalanced dataset used, the Synthetic Minority Over-Sampling Technique (SMOTE) was applied. Two sets of experiments were conducted, with and without the SMOTE sampling technique. On average over different evaluation metrics, the SVM classifier showed the lowest performance, while other models such as BN, BC and DT performed better. The RF classifier showed the best performance (AUC = 0.97) among all models trained using SMOTE sampling. The results show that various ML techniques can vary significantly in performance across the different evaluation metrics. Nor is it necessarily the case that a more complex ML model achieves higher prediction accuracy. The prediction performance of all models trained with SMOTE is much better than that of models trained without SMOTE. The study shows the potential of machine learning methods for predicting all-cause mortality using cardiorespiratory fitness data.
A hybrid least squares support vector machines and GMDH approach for river flow forecasting
NASA Astrophysics Data System (ADS)
Samsudin, R.; Saad, P.; Shabri, A.
2010-06-01
This paper proposes a novel hybrid forecasting model, known as GLSSVM, which combines the group method of data handling (GMDH) and the least squares support vector machine (LSSVM). The GMDH is used to determine the useful input variables for the LSSVM model, which performs the time series forecasting. In this study, the application of GLSSVM to monthly river flow forecasting for the Selangor and Bernam Rivers is investigated. The results of the proposed GLSSVM approach are compared with conventional artificial neural network (ANN) models, the Autoregressive Integrated Moving Average (ARIMA) model, and the GMDH and LSSVM models, using long-term observations of monthly river flow discharge. The standard statistical measures, root mean square error (RMSE) and coefficient of correlation (R), are employed to evaluate the performance of the various models developed. Experimental results indicate that the hybrid model is a powerful tool for modeling discharge time series and can be applied successfully in complex hydrological modeling.
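The two evaluation measures named above, RMSE and the coefficient of correlation R, can be computed directly from observed and predicted discharge series; this is a generic sketch, not the authors' code.

```python
import math

def rmse(obs, pred):
    """Root mean square error between observed and predicted series."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def correlation(obs, pred):
    """Pearson coefficient of correlation R."""
    n = len(obs)
    mo, mp = sum(obs) / n, sum(pred) / n
    cov = sum((o - mo) * (p - mp) for o, p in zip(obs, pred))
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    sp = math.sqrt(sum((p - mp) ** 2 for p in pred))
    return cov / (so * sp)
```

Note that a constant offset in the predictions inflates RMSE while leaving R at 1.0, which is why the paper reports both measures.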
Neo-Symbiosis: The Next Stage in the Evolution of Human Information Interaction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Griffith, Douglas; Greitzer, Frank L.
We re-address the vision of human-computer symbiosis expressed by J. C. R. Licklider nearly a half-century ago, when he wrote: “The hope is that in not too many years, human brains and computing machines will be coupled together very tightly, and that the resulting partnership will think as no human brain has ever thought and process data in a way not approached by the information-handling machines we know today.” (Licklider, 1960). Unfortunately, little progress was made toward this vision over the four decades following Licklider’s challenge, despite significant advancements in the fields of human factors and computer science, and Licklider’s vision was largely forgotten. However, recent advances in information science and technology, psychology, and neuroscience have rekindled the potential of making Licklider’s vision a reality. This paper provides a historical context for and updates the vision, and it argues that such a vision is needed as a unifying framework for advancing IS&T.
Boosting specificity of MEG artifact removal by weighted support vector machine.
Duan, Fang; Phothisonothai, Montri; Kikuchi, Mitsuru; Yoshimura, Yuko; Minabe, Yoshio; Watanabe, Kastumi; Aihara, Kazuyuki
2013-01-01
An automatic artifact removal method for magnetoencephalogram (MEG) data is presented in this paper. The proposed method is based on independent component analysis (ICA) and support vector machine (SVM). Unlike previous studies, we consider two factors which influence performance. First, the imbalance factor of the independent components (ICs) of MEG is handled by a weighted SVM. Second, instead of simply setting a fixed weight for each class, a re-weighting scheme is used for the preservation of useful MEG ICs. Experimental results on a manually marked MEG dataset showed that the proposed method could correctly distinguish the artifacts from the MEG ICs, while 99.72% ± 0.67 of MEG ICs were preserved. The classification accuracy was 97.91% ± 1.39. In addition, the method was found not to be sensitive to individual differences: cross-validation (leave-one-subject-out) results showed an averaged accuracy of 97.41% ± 2.14.
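The class weighting used to counter the imbalance factor can be illustrated with a common inverse-frequency scheme for setting per-class SVM weights. The paper itself goes further with a re-weighting scheme rather than fixed weights, so this sketch, including the label names, is only a simplified assumption.

```python
from collections import Counter

def inverse_frequency_weights(labels):
    """Per-class weights inversely proportional to class frequency
    (n_samples / (n_classes * class_count)), as commonly passed to a
    weighted SVM so the minority class is not swamped by the majority."""
    counts = Counter(labels)
    n, k = len(labels), len(counts)
    return {cls: n / (k * c) for cls, c in counts.items()}
```

With 10 artifact ICs and 90 brain ICs, the artifact class receives a weight ten times larger than the brain class, compensating for its scarcity in the training set.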
DOE Office of Scientific and Technical Information (OSTI.GOV)
Due to the increase in the use of Coordinate Measuring Machines (CMMs) to measure fine details and complex geometries in manufacturing, many programs have been made to compile and analyze the data. These programs typically require extensive setup to determine the expected results in order not only to track the pass/fail of a dimension, but also to use statistical process control (SPC). These extra steps and setup times have been addressed through the CMM Data Analysis Tool, which requires only the output of the CMM to provide pass/fail analysis on all parts run to the same inspection program, as well as graphs which help visualize where the part measures within the allowed tolerances. This provides feedback not only to the customer for approval of a part during development, but also to machining process engineers, to identify when any dimension is drifting toward an out-of-tolerance condition during production. The program can handle hundreds of parts with complex dimensions and will provide an analysis within minutes.
Apparatus to collect, classify, concentrate, and characterize gas-borne particles
Rader, Daniel J.; Torczynski, John R.; Wally, Karl; Brockmann, John E.
2002-01-01
An aerosol lab-on-a-chip (ALOC) integrates one or more of a variety of aerosol collection, classification, concentration (enrichment), and characterization processes onto a single substrate or layered stack of such substrates. By taking advantage of modern micro-machining capabilities, an entire suite of discrete laboratory aerosol handling and characterization techniques can be combined in a single portable device that can provide a wealth of data on the aerosol being sampled. The ALOC offers parallel characterization techniques, and the close proximity of the various characterization modules helps ensure that the same aerosol is available to all devices (dramatically reducing sampling and transport errors). Micro-machine fabrication of the ALOC significantly reduces unit costs relative to existing technology and enables the fabrication of small, portable ALOC devices, as well as the potential for rugged designs that allow operation in harsh environments. Miniaturization also offers the potential of working with smaller particle sizes and lower pressure drops (leading to reduced power consumption).
A performance comparison of the IBM RS/6000 and the Astronautics ZS-1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, W.M.; Abraham, S.G.; Davidson, E.S.
1991-01-01
Concurrent uniprocessor architectures, of which vector and superscalar are two examples, are designed to capitalize on fine-grain parallelism. The authors have developed a performance evaluation method for comparing and improving these architectures, and in this article they present the methodology and a detailed case study of two machines. The runtime of many programs is dominated by time spent in loop constructs - for example, Fortran Do-loops. Loops generally comprise two logical processes: the access process generates addresses for memory operations, while the execute process operates on floating-point data. Memory access patterns typically can be generated independently of the data in the execute process. This independence allows the access process to slip ahead, thereby hiding memory latency. The IBM 360/91 was designed in 1967 to achieve slip dynamically, at runtime. One CPU unit executes integer operations while another handles floating-point operations. Other machines, including the VAX 9000 and the IBM RS/6000, use a similar approach.
A group communication approach for mobile computing mobile channel: An ISIS tool for mobile services
NASA Astrophysics Data System (ADS)
Cho, Kenjiro; Birman, Kenneth P.
1994-05-01
This paper examines group communication as an infrastructure to support mobility of users, and presents a simple scheme to support user mobility by means of switching a control point between replicated servers. We describe the design and implementation of a set of tools, called Mobile Channel, for use with the ISIS system. Mobile Channel is based on a combination of the two replication schemes: the primary-backup approach and the state machine approach. Mobile Channel implements a reliable one-to-many FIFO channel, in which a mobile client sees a single reliable server; servers, acting as a state machine, see multicast messages from clients. Migrations of mobile clients are handled as an intentional primary switch, and hand-offs or server failures are completely masked to mobile clients. To achieve high performance, servers are replicated at a sliding-window level. Our scheme provides a simple abstraction of migration, eliminates complicated hand-off protocols, provides fault-tolerance and is implemented within the existing group communication mechanism.
Fletcher, Timothy L; Popelier, Paul L A
2016-06-14
A machine learning method called kriging is applied to the set of all 20 naturally occurring amino acids. Kriging models are built that predict electrostatic multipole moments for all topological atoms in any amino acid based on molecular geometry only. These models then predict molecular electrostatic interaction energies. On the basis of 200 unseen test geometries for each amino acid, no amino acid shows a mean prediction error above 5.3 kJ mol(-1), while the lowest error observed is 2.8 kJ mol(-1). The mean error across the entire set is only 4.2 kJ mol(-1) (or 1 kcal mol(-1)). Charged systems are created by protonating or deprotonating selected amino acids, and these show no significant deviation in prediction error over their neutral counterparts. Similarly, the proposed methodology can also handle amino acids with aromatic side chains, without the need for modification. Thus, we present a generic method capable of accurately capturing multipolar polarizable electrostatics in amino acids.
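The heart of the kriging models described above is Gaussian-process-style interpolation over molecular geometry features. The following numpy-only sketch shows the idea with a toy 1-D target; the RBF kernel, length scale, and toy data are illustrative assumptions, not the authors' actual per-atom multipole-moment setup:

```python
import numpy as np

def kriging_predict(X_train, y_train, X_test, length_scale=0.5, jitter=1e-10):
    """Simple kriging / GP mean prediction with an RBF kernel.

    Illustrative only: the paper's models predict multipole moments per
    topological atom with more elaborate kernels and inputs.
    """
    def rbf(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / length_scale**2)

    K = rbf(X_train, X_train) + jitter * np.eye(len(X_train))
    alpha = np.linalg.solve(K, y_train)      # weights for the training targets
    return rbf(X_test, X_train) @ alpha      # predictive mean at the test points

# Toy usage: learn a smooth "energy" curve from geometry-like coordinates
X = np.linspace(-1.0, 1.0, 10).reshape(-1, 1)   # stand-in for an internal coordinate
y = np.sin(3.0 * X).ravel()                     # stand-in for a moment/energy target
pred = kriging_predict(X, y, X[:3])
```

With negligible jitter the predictor interpolates the training data exactly, which is why kriging is attractive for reproducing reference electrostatics at known geometries.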
Scheduling algorithms for automatic control systems for technological processes
NASA Astrophysics Data System (ADS)
Chernigovskiy, A. S.; Tsarev, R. Yu; Kapulin, D. V.
2017-01-01
Wide use of automatic process control systems, and of high-performance systems containing a number of computers (processors), creates opportunities for high-quality, fast production that increases the competitiveness of an enterprise. Exact and fast calculations, control computation, and the processing of big data arrays all require a high level of productivity together with minimal time for data handling and delivery of results. To reach the best time, it is necessary not only to use computing resources optimally, but also to design and develop the software so that the time gain is maximal. For this purpose, task (job or operation) scheduling techniques for multi-machine/multiprocessor systems are applied. This paper considers some of the basic task scheduling methods for multi-machine process control systems, brings to light their advantages and disadvantages, and offers some considerations on their use when developing software for automatic process control systems.
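As one concrete instance of the multiprocessor scheduling methods such a survey covers, here is a sketch of the classic Longest-Processing-Time-first (LPT) greedy heuristic for identical machines; the paper does not single out this particular algorithm, so treat it purely as an illustration:

```python
import heapq

def lpt_schedule(task_times, n_machines):
    """LPT list scheduling: assign each task (longest first) to the
    currently least-loaded machine, a classic heuristic for reducing
    makespan on identical machines/processors."""
    loads = [(0.0, m) for m in range(n_machines)]   # (current load, machine id)
    heapq.heapify(loads)
    assignment = {m: [] for m in range(n_machines)}
    for t in sorted(task_times, reverse=True):      # longest tasks first
        load, m = heapq.heappop(loads)              # least-loaded machine
        assignment[m].append(t)
        heapq.heappush(loads, (load + t, m))
    makespan = max(load for load, _ in loads)
    return assignment, makespan

assignment, makespan = lpt_schedule([7, 5, 4, 3, 3, 2], 2)   # -> makespan 12
```

For this instance LPT happens to reach the optimal makespan of 12; in general it is only guaranteed to be within 4/3 of optimal.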
DOE Office of Scientific and Technical Information (OSTI.GOV)
McGoldrick, P.R.
1981-01-01
The Mirror Fusion Test Facility (MFTF) is a complex facility requiring a highly computerized Supervisory Control and Diagnostics System (SCDS) to monitor and provide control over ten subsystems, three of which require true process control. SCDS will provide physicists with a method of studying machine and plasma behavior by acquiring and processing up to four megabytes of plasma diagnostic information every five minutes. A high degree of availability and throughput is provided by a distributed computer system (nine 32-bit minicomputers on shared memory). Data, distributed across SCDS, is managed by a high-bandwidth Distributed Database Management System. The MFTF operators' control room consoles use color television monitors with touch-sensitive screens; this is a totally new approach. The method of handling deviations from normal machine operation, and how the operator should be notified and assisted in the resolution of problems, has been studied and a system designed.
The formation method of the feature space for the identification of fatigued bills
NASA Astrophysics Data System (ADS)
Kang, Dongshik; Oshiro, Ayumu; Ozawa, Kenji; Mitsui, Ikugo
2014-10-01
Fatigued bills cause trouble, such as paper jams, in bill handling machines. In the discrimination of fatigued bills using an acoustic signal, the variation of the observed bill sound is considered to be one of the causes of misclassification. Therefore, a technique is in demand to make the classification of fatigued bills more efficient. In this paper, we propose an algorithm that extracts feature quantities of bill sound from the acoustic signal using frequency differences, and we carry out discrimination experiments on fatigued bills with a Support Vector Machine (SVM). The frequency-difference feature quantity can represent how the frequency components of an acoustic signal vary with the degree of bill fatigue. The generalization performance of SVM does not depend on the size of the feature space, even in a high-dimensional feature space such as that of bill acoustic signals. Furthermore, SVM can induce an optimal classifier that considers combinations of features by virtue of polynomial kernel functions.
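A frequency-difference feature of the general kind described can be sketched as band-averaged spectral magnitudes followed by first differences between adjacent bands; the band count and differencing scheme here are assumptions, not the paper's exact recipe, and the resulting vector would then be fed to a polynomial-kernel SVM:

```python
import numpy as np

def freq_difference_features(signal, n_bands=8):
    """Band-average the magnitude spectrum, then take first differences
    between adjacent bands. A hypothetical stand-in for the paper's
    frequency-difference feature quantity."""
    spectrum = np.abs(np.fft.rfft(signal))          # magnitude spectrum
    bands = np.array_split(spectrum, n_bands)       # coarse frequency bands
    band_energy = np.array([b.mean() for b in bands])
    return np.diff(band_energy)                     # n_bands - 1 features

t = np.linspace(0.0, 1.0, 2048, endpoint=False)
feat = freq_difference_features(np.sin(2 * np.pi * 100 * t))
```

Differencing adjacent bands emphasizes shifts in spectral shape, which is the cue the abstract attributes to bill fatigue, while discarding overall loudness.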
Ultrasonic seam welding on thin silicon solar cells
NASA Technical Reports Server (NTRS)
Stofel, E. J.
1982-01-01
The ultrathin silicon solar cell has progressed to where it is a serious candidate for future lightweight or radiation-tolerant spacecraft. The ultrasonic method of producing welds was found to be satisfactory. These ultrathin cells could be handled without breakage in a semiautomated welding machine, a prototype of a machine capable of production rates sufficiently large to support spacecraft array assembly needs. For comparative purposes, this project also welded a variety of cells with thicknesses up to 0.23 mm as well as the 0.07 mm ultrathin cells. There was no electrical degradation in any cells. The mechanical pull strength of welds on the thick cells was excellent when using a large welding force. The mechanical strength of welds on thin cells was less, since only a small welding force could be used without cracking these cells. Even so, the strength of welds on thin cells appears adequate for array application. The ability of such welds to survive multiyear, near-Earth-orbit thermal cycles needs to be demonstrated.
Active machine learning-driven experimentation to determine compound effects on protein patterns
Naik, Armaghan W; Kangas, Joshua D; Sullivan, Devin P; Murphy, Robert F
2016-01-01
High throughput screening determines the effects of many conditions on a given biological target. Currently, to estimate the effects of those conditions on other targets requires either strong modeling assumptions (e.g. similarities among targets) or separate screens. Ideally, data-driven experimentation could be used to learn accurate models for many conditions and targets without doing all possible experiments. We have previously described an active machine learning algorithm that can iteratively choose small sets of experiments to learn models of multiple effects. We now show that, with no prior knowledge and with liquid handling robotics and automated microscopy under its control, this learner accurately learned the effects of 48 chemical compounds on the subcellular localization of 48 proteins while performing only 29% of all possible experiments. The results represent the first practical demonstration of the utility of active learning-driven biological experimentation in which the set of possible phenotypes is unknown in advance. DOI: http://dx.doi.org/10.7554/eLife.10047.001 PMID:26840049
Purely Structural Protein Scoring Functions Using Support Vector Machine and Ensemble Learning.
Mirzaei, Shokoufeh; Sidi, Tomer; Keasar, Chen; Crivelli, Silvia
2016-08-24
The function of a protein is determined by its structure, which creates a need for efficient methods of protein structure determination to advance scientific and medical research. Because current experimental structure determination methods carry a high price tag, computational predictions are highly desirable. Given a protein sequence, computational methods produce numerous 3D structures known as decoys. However, selection of the best-quality decoys is challenging because end users can handle only a few of them. Therefore, scoring functions are central to decoy selection. They combine measurable features into a single-number indicator of decoy quality. Unfortunately, current scoring functions do not consistently select the best decoys. Machine learning techniques offer great potential to improve decoy scoring. This paper presents two machine-learning-based scoring functions that predict the quality of protein structures, i.e., the similarity between the predicted structure and the experimental one, without knowing the latter. We use different metrics to compare these scoring functions against three state-of-the-art scores. This is a first attempt at comparing different scoring functions using the same non-redundant dataset for training and testing and the same features. The results show that adding informative features may be more significant than the method used.
NASA Astrophysics Data System (ADS)
Ali, Salah M.; Hui, K. H.; Hee, L. M.; Salman Leong, M.; Al-Obaidi, M. A.; Ali, Y. H.; Abdelrhman, Ahmed M.
2018-03-01
Acoustic emission (AE) analysis has become a vital tool for initiating maintenance tasks in many industries. However, the analysis process and its interpretation have been found to be highly dependent on experts. Therefore, an automated monitoring method is required to reduce the cost and time consumed in the interpretation of AE signals. This paper investigates the application of two of the most common machine learning approaches, namely the artificial neural network (ANN) and the support vector machine (SVM), to automate the diagnosis of valve faults in a reciprocating compressor based on AE signal parameters. Since accuracy is an essential factor in any automated diagnostic system, this paper also provides a comparative study of the predictive performance of ANN and SVM. AE parameter data were acquired from a single-stage reciprocating air compressor with different operational and valve conditions. ANN and SVM diagnosis models were subsequently devised by combining AE parameters of different conditions. Results demonstrate that the ANN and SVM models give the same results in terms of prediction accuracy. However, the SVM model is recommended for automated diagnosis of valve condition due to its ability to handle a high number of input features with small training data sets.
NASA Technical Reports Server (NTRS)
Jackson, L. Neal; Crenshaw, John, Sr.; Hambright, R. N.; Nedungadi, A.; Mcfayden, G. M.; Tsuchida, M. S.
1989-01-01
A significant emphasis upon automation within the Space Biology Initiative hardware appears justified in order to conserve crew labor and crew training effort. Two generic forms of automation were identified: automation of data and information handling and decision making, and the automation of material handling, transfer, and processing. The use of automatic data acquisition, expert systems, robots, and machine vision will increase the volume of experiments and quality of results. The automation described may also influence efforts to miniaturize and modularize the large array of SBI hardware identified to date. The cost and benefit model developed appears to be a useful guideline for SBI equipment specifiers and designers. Additional refinements would enhance the validity of the model. Two NASA automation pilot programs, 'The Principal Investigator in a Box' and 'Rack Mounted Robots' were investigated and found to be quite appropriate for adaptation to the SBI program. There are other in-house NASA efforts that provide technology that may be appropriate for the SBI program. Important data is believed to exist in advanced medical labs throughout the U.S., Japan, and Europe. The information and data processing in medical analysis equipment is highly automated and future trends reveal continued progress in this area. However, automation of material handling and processing has progressed in a limited manner because the medical labs are not affected by the power and space constraints that Space Station medical equipment is faced with. Therefore, NASA's major emphasis in automation will require a lead effort in the automation of material handling to achieve optimal crew utilization.
NASA Astrophysics Data System (ADS)
Brown, M. G. L.; He, T.; Liang, S.
2016-12-01
Satellite-derived estimates of incident photosynthetically active radiation (PAR) can be used to monitor global change, are required by most terrestrial ecosystem models, and can be used to estimate primary production according to the theory of light use efficiency. Compared with parametric approaches, non-parametric techniques that include an artificial neural network (ANN), support vector machine regression (SVM), an artificial bee colony (ABC), and a look-up table (LUT) do not require many ancillary data as inputs for the estimation of PAR from satellite data. In this study, a selection of machine learning methods to estimate PAR from MODIS top of atmosphere (TOA) radiances are compared to a LUT approach to determine which techniques might best handle the nonlinear relationship between TOA radiance and incident PAR. Evaluation of these methods (ANN, SVM, and LUT) is performed with ground measurements at seven SURFRAD sites. Due to the design of the ANN, it can handle the nonlinear relationship between TOA radiance and PAR better than linearly interpolating between the values in the LUT; however, training the ANN has to be carried out on an angular-bin basis, which results in a LUT of ANNs. The SVM model may be better for incorporating multiple viewing angles than the ANN; however, both techniques require a large amount of training data, which may introduce a regional bias based on where the most training and validation data are available. Based on the literature, the ABC is a promising alternative to an ANN, SVM regression and a LUT, but further development for this application is required before concrete conclusions can be drawn. For now, the LUT method outperforms the machine-learning techniques, but future work should be directed at developing and testing the ABC method. 
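The look-up-table baseline against which the machine-learning methods are compared amounts to interpolation between precomputed radiance-to-PAR entries. A minimal 1-D sketch follows; the real LUT is multi-dimensional (viewing geometry, atmospheric state), and the table values below are purely hypothetical:

```python
import numpy as np

def par_from_lut(toa_radiance, lut_radiance, lut_par):
    """Linearly interpolate incident PAR from a 1-D look-up table keyed
    on TOA radiance. Illustrates the LUT idea only; operational LUTs
    have additional dimensions."""
    return np.interp(toa_radiance, lut_radiance, lut_par)

# Hypothetical table: PAR falling as TOA radiance (cloud brightness) rises
lut_rad = np.array([50.0, 100.0, 150.0, 200.0])
lut_par = np.array([450.0, 350.0, 200.0, 80.0])   # W m^-2, illustrative values
est = par_from_lut(125.0, lut_rad, lut_par)
```

The linear interpolation between table nodes is exactly the step the abstract notes an ANN can improve on, since the true radiance-to-PAR relationship is nonlinear between entries.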
A simple, robust method to estimate direct and diffuse incident PAR, with minimal inputs and a priori knowledge, would be very useful for monitoring global change of primary production, particularly of pastures and rangeland, which have implications for livestock and food security. Future work will delve deeper into the utility of satellite-derived PAR estimation for monitoring primary production in pasture and rangelands.
Bioinformatics in proteomics: application, terminology, and pitfalls.
Wiemer, Jan C; Prokudin, Alexander
2004-01-01
Bioinformatics applies data mining, i.e., modern computer-based statistics, to biomedical data. It leverages on machine learning approaches, such as artificial neural networks, decision trees and clustering algorithms, and is ideally suited for handling huge data amounts. In this article, we review the analysis of mass spectrometry data in proteomics, starting with common pre-processing steps and using single decision trees and decision tree ensembles for classification. Special emphasis is put on the pitfall of overfitting, i.e., of generating too complex single decision trees. Finally, we discuss the pros and cons of the two different decision tree usages.
TFTR diagnostic control and data acquisition system
NASA Astrophysics Data System (ADS)
Sauthoff, N. R.; Daniels, R. E.
1985-05-01
General computerized control and data-handling support for TFTR diagnostics is presented within the context of the Central Instrumentation, Control and Data Acquisition (CICADA) System. Procedures, hardware, the interactive man-machine interface, event-driven task scheduling, system-wide arming and data acquisition, and a hierarchical data base of raw data and results are described. Similarities in data structures involved in control, monitoring, and data acquisition afford a simplification of the system functions, based on "groups" of devices. Emphases and optimizations appropriate for fusion diagnostic system designs are provided. An off-line data reduction computer system is under development.
The integrated analysis capability (IAC Level 2.0)
NASA Technical Reports Server (NTRS)
Frisch, Harold P.; Vos, Robert G.
1988-01-01
The critical data management issues involved in the development of the integral analysis capability (IAC), Level 2, to support the design analysis and performance evaluation of large space structures, are examined. In particular, attention is given to the advantages and disadvantages of the formalized data base; merging of the matrix and relational data concepts; data types, query operators, and data handling; sequential versus direct-access files; local versus global data access; programming languages and host machines; and data flow techniques. The discussion also covers system architecture, recent system level enhancements, executive/user interface capabilities, and technology applications.
Computer aided indexing at NASA
NASA Technical Reports Server (NTRS)
Buchan, Ronald L.
1987-01-01
The application of computer technology to the construction of the NASA Thesaurus and in NASA Lexical Dictionary development is discussed in a brief overview. Consideration is given to the printed and online versions of the Thesaurus, retrospective indexing, the NASA RECON frequency command, demand indexing, lists of terms by category, and the STAR and IAA annual subject indexes. The evolution of computer methods in the Lexical Dictionary program is traced, from DOD and DOE subject switching to LCSH machine-aided indexing and current techniques for handling natural language (e.g., the elimination of verbs to facilitate breakdown of sentences into words and phrases).
Two-way cable television project
NASA Astrophysics Data System (ADS)
Wilkens, H.; Guenther, P.; Kiel, F.; Kraus, F.; Mahnkopf, P.; Schnee, R.
1982-02-01
The market demand for a multiuser computer system with interactive services was studied. Mean system work load at peak use hours was estimated and the complexity of dialog with a central computer was determined. Man machine communication by broadband cable television transmission, using digital techniques, was assumed. The end to end system is described. It is user friendly, able to handle 10,000 subscribers, and provides color television display. The central computer system architecture with remote audiovisual terminals is depicted and software is explained. Signal transmission requirements are dealt with. International availability of the test system, including sample programs, is indicated.
Overhearers Use Addressee Backchannels in Dialog Comprehension.
Tolins, Jackson; Fox Tree, Jean E
2016-08-01
Observing others in conversation is a common format for comprehending language, yet little work has been done to understand dialog comprehension. We tested whether overhearers use addressee backchannels as predictive cues for how to integrate information across speaker turns during comprehension of spontaneously produced collaborative narration. In Experiment 1, words that followed specific backchannels (e.g., really, oh) were recognized more slowly than words that followed either generic backchannels (e.g., uh huh, mhm) or pauses. In Experiment 2, we found that when the turn after the backchannel was a continuation of the narrative, specific backchannels prompted the fastest verification of prior information. When the turn after was an elaboration, they prompted the slowest, indicating that overhearers took specific backchannels as cues to integrate preceding talk with subsequent talk. These findings demonstrate that overhearers capitalize on the predictive relationship between backchannels and the development of speakers' talk, coordinating information across conversational roles. Copyright © 2015 Cognitive Science Society, Inc.
Bavelas, J B; Coates, L; Johnson, T
2000-12-01
A collaborative theory of narrative story-telling was tested in two experiments that examined what listeners do and their effect on the narrator. In 63 unacquainted dyads (81 women and 45 men), a narrator told his or her own close-call story. The listeners made 2 different kinds of listener responses: Generic responses included nodding and vocalizations such as "mhm." Specific responses, such as wincing or exclaiming, were tightly connected to (and served to illustrate) what the narrator was saying at the moment. In experimental conditions that distracted listeners from the narrative content, listeners made fewer responses, especially specific ones, and the narrators also told their stories significantly less well, particularly at what should have been the dramatic ending. Thus, listeners were co-narrators both through their own specific responses, which helped illustrate the story, and in their apparent effect on the narrator's performance. The results demonstrate the importance of moment-by-moment collaboration in face-to-face dialogue.
Integrating human and machine intelligence in galaxy morphology classification tasks
NASA Astrophysics Data System (ADS)
Beck, Melanie R.; Scarlata, Claudia; Fortson, Lucy F.; Lintott, Chris J.; Simmons, B. D.; Galloway, Melanie A.; Willett, Kyle W.; Dickinson, Hugh; Masters, Karen L.; Marshall, Philip J.; Wright, Darryl
2018-06-01
Quantifying galaxy morphology is a challenging yet scientifically rewarding task. As the scale of data continues to increase with upcoming surveys, traditional classification methods will struggle to handle the load. We present a solution through an integration of visual and automated classifications, preserving the best features of both human and machine. We demonstrate the effectiveness of such a system through a re-analysis of visual galaxy morphology classifications collected during the Galaxy Zoo 2 (GZ2) project. We reprocess the top-level question of the GZ2 decision tree with a Bayesian classification aggregation algorithm dubbed SWAP, originally developed for the Space Warps gravitational lens project. Through a simple binary classification scheme, we increase the classification rate nearly 5-fold classifying 226 124 galaxies in 92 d of GZ2 project time while reproducing labels derived from GZ2 classification data with 95.7 per cent accuracy. We next combine this with a Random Forest machine learning algorithm that learns on a suite of non-parametric morphology indicators widely used for automated morphologies. We develop a decision engine that delegates tasks between human and machine and demonstrate that the combined system provides at least a factor of 8 increase in the classification rate, classifying 210 803 galaxies in just 32 d of GZ2 project time with 93.1 per cent accuracy. As the Random Forest algorithm requires a minimal amount of computational cost, this result has important implications for galaxy morphology identification tasks in the era of Euclid and other large-scale surveys.
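The SWAP aggregation step described above is, at its core, a running Bayesian update of each galaxy's label probability using per-volunteer skill estimates. The sketch below shows one such update; the variable names and the 0.8 skill values are illustrative, not taken from the SWAP source:

```python
def swap_update(prior, said_smooth, p_true_smooth, p_true_not):
    """One SWAP-style Bayesian update of P(galaxy is 'smooth') given a
    volunteer's vote and that volunteer's estimated skill:
    p_true_smooth = P(votes smooth | truly smooth),
    p_true_not    = P(votes not-smooth | truly not-smooth)."""
    if said_smooth:
        num = prior * p_true_smooth
        den = prior * p_true_smooth + (1 - prior) * (1 - p_true_not)
    else:
        num = prior * (1 - p_true_smooth)
        den = prior * (1 - p_true_smooth) + (1 - prior) * p_true_not
    return num / den

p = 0.5                                   # uninformative starting prior
for vote in (True, True, False):          # three volunteers, each with skill 0.8
    p = swap_update(p, vote, 0.8, 0.8)
```

Once the posterior crosses a retirement threshold in either direction, SWAP-style systems stop showing the galaxy to volunteers, which is the source of the classification-rate speed-up the abstract reports.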
White, Donald J; Schneiderman, Eva; Colón, Ellen; St John, Samuel
2015-01-01
This paper describes the development and standardization of a profilometry-based method for assessment of dentifrice abrasivity called Radioactive Dentin Abrasivity - Profilometry Equivalent (RDA-PE). Human dentin substrates are mounted in acrylic blocks of precise standardized dimensions, permitting mounting and brushing in V8 brushing machines. Dentin blocks are masked to create an area of "contact brushing." Brushing is carried out in V8 brushing machines, and dentifrices are tested as slurries. An abrasive standard is prepared by diluting the ISO 11609 abrasivity reference calcium pyrophosphate abrasive into carboxymethyl cellulose/glycerin, just as in the RDA method. Following brushing, masked areas are removed and profilometric analysis is carried out on treated specimens. Assessments of average abrasion depth (contact or optical profilometry) are made. Inclusion of the standard calcium pyrophosphate abrasive permits a direct RDA-equivalent assessment of abrasion, which is characterized with profilometry as Depth(test) / Depth(control) × 100. Within the test, the maximum abrasivity standard of 250 can be created in situ simply by including a treatment group of the standard abrasive with 2.5× the number of brushing strokes. RDA-PE is enabled in large part by the availability of easy-to-use and well-standardized modern profilometers, but its use in V8 brushing machines is enabled by the unique specific conditions described herein. RDA-PE permits the evaluation of dentifrice abrasivity to dentin without the requirement of irradiated teeth and the infrastructure for handling them. In direct comparisons, the RDA-PE method provides dentifrice abrasivity assessments comparable to the industry gold-standard RDA technique.
Machine learning for the New York City power grid.
Rudin, Cynthia; Waltz, David; Anderson, Roger N; Boulanger, Albert; Salleb-Aouissi, Ansaf; Chow, Maggie; Dutta, Haimonti; Gross, Philip N; Huang, Bert; Ierome, Steve; Isaac, Delfina F; Kressner, Arthur; Passonneau, Rebecca J; Radeva, Axinia; Wu, Leon
2012-02-01
Power companies can benefit from the use of knowledge discovery methods and statistical machine learning for preventive maintenance. We introduce a general process for transforming historical electrical grid data into models that aim to predict the risk of failures for components and systems. These models can be used directly by power companies to assist with prioritization of maintenance and repair work. Specialized versions of this process are used to produce 1) feeder failure rankings, 2) cable, joint, terminator, and transformer rankings, 3) feeder Mean Time Between Failure (MTBF) estimates, and 4) manhole events vulnerability rankings. The process in its most general form can handle diverse, noisy sources that are historical (static), semi-real-time, or real-time; incorporates state-of-the-art machine learning algorithms for prioritization (supervised ranking or MTBF); and includes an evaluation of results via cross-validation and blind test. Above and beyond the ranked lists and MTBF estimates are business management interfaces that allow the prediction capability to be integrated directly into corporate planning and decision support; such interfaces rely on several important properties of our general modeling approach: that machine learning features are meaningful to domain experts, that the processing of data is transparent, and that prediction results are accurate enough to support sound decision making. We discuss the challenges in working with historical electrical grid data that were not designed for predictive purposes. The “rawness” of these data contrasts with the accuracy of the statistical models that can be obtained from the process; these models are sufficiently accurate to assist in maintaining New York City’s electrical grid.
The remapping of space in motor learning and human-machine interfaces
Mussa-Ivaldi, F.A.; Danziger, Z.
2009-01-01
Studies of motor adaptation to patterns of deterministic forces have revealed the ability of the motor control system to form and use predictive representations of the environment. One of the most fundamental elements of our environment is space itself. This article focuses on the notion of Euclidean space as it applies to common sensory motor experiences. Starting from the assumption that we interact with the world through a system of neural signals, we observe that these signals are not inherently endowed with metric properties of the ordinary Euclidean space. The ability of the nervous system to represent these properties depends on adaptive mechanisms that reconstruct the Euclidean metric from signals that are not Euclidean. Gaining access to these mechanisms will reveal the process by which the nervous system handles novel sophisticated coordinate transformation tasks, thus highlighting possible avenues to create functional human-machine interfaces that can make that task much easier. A set of experiments is presented that demonstrate the ability of the sensory-motor system to reorganize coordination in novel geometrical environments. In these environments multiple degrees of freedom of body motions are used to control the coordinates of a point in a two-dimensional Euclidean space. We discuss how practice leads to the acquisition of the metric properties of the controlled space. Methods of machine learning based on the reduction of reaching errors are tested as a means to facilitate learning by adaptively changing the map from body motions to the controlled device. We discuss the relevance of the results to the development of adaptive human-machine interfaces and optimal control. PMID:19665553
Machine learning algorithms for mode-of-action classification in toxicity assessment.
Zhang, Yile; Wong, Yau Shu; Deng, Jian; Anton, Cristina; Gabos, Stephan; Zhang, Weiping; Huang, Dorothy Yu; Jin, Can
2016-01-01
Real Time Cell Analysis (RTCA) technology is used to monitor cellular changes continuously over the entire exposure period. Combined with different testing concentrations, the profiles have potential for probing the mode of action (MOA) of the tested substances. In this paper, we present machine learning approaches for MOA assessment. Computational tools based on the artificial neural network (ANN) and the support vector machine (SVM) are developed to analyze the time-concentration response curves (TCRCs) of human cell lines responding to tested chemicals. The techniques are capable of learning from given TCRCs with known MOA information and then making MOA classifications for unknown toxicants. A novel data processing step based on the wavelet transform is introduced to extract important features from the original TCRC data. From the dose response curves, a time interval leading to a higher classification success rate can be selected as input to enhance the performance of the machine learning algorithm. This is particularly helpful when handling cases with limited and imbalanced data. The validation of the proposed method is demonstrated by the supervised learning algorithm applied to the exposure data of the HepG2 cell line to 63 chemicals with 11 concentrations in each test case. Classification success rates in the range of 85 to 95% are obtained using SVM for MOA classification with two to four clusters. The wavelet transform is capable of capturing important features of TCRCs for MOA classification. The proposed SVM scheme incorporated with the wavelet transform has great potential for large-scale MOA classification and high-throughput chemical screening.
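The wavelet-based feature extraction step can be sketched with the Haar transform, which compresses a response curve into a few coarse coefficients before classification; the choice of the Haar basis and the decomposition depth are assumptions, since the abstract does not specify the wavelet used:

```python
import numpy as np

def haar_step(x):
    """One level of the Haar wavelet transform: pairwise averages
    (approximation) and pairwise differences (detail). Assumes an
    even-length input."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def wavelet_features(tcrc, levels=3):
    """Keep only the coarse approximation after a few Haar levels,
    compressing a time-concentration response curve (TCRC) into a
    short feature vector for an SVM or ANN."""
    approx = np.asarray(tcrc, dtype=float)
    for _ in range(levels):
        approx, _ = haar_step(approx)
    return approx

feat = wavelet_features(np.arange(16.0))   # 16 samples -> 2 coarse features
```

Each level halves the length, so a 16-point curve reduces to 2 coefficients after three levels; discarding the detail coefficients trades temporal resolution for robustness to noise in the raw curves.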
Alghamdi, Manal; Al-Mallah, Mouaz; Keteyian, Steven; Brawner, Clinton; Ehrman, Jonathan; Sakr, Sherif
2017-01-01
Machine learning is becoming a popular and important approach in the field of medical research. In this study, we investigate the relative performance of various machine learning methods such as Decision Tree, Naïve Bayes, Logistic Regression, Logistic Model Tree and Random Forests for predicting incident diabetes using medical records of cardiorespiratory fitness. In addition, we apply different techniques to uncover potential predictors of diabetes. This FIT project study used data from 32,555 patients free of any known coronary artery disease or heart failure who underwent clinician-referred exercise treadmill stress testing at Henry Ford Health Systems between 1991 and 2009 and had a complete 5-year follow-up. At the completion of the fifth year, 5,099 of those patients had developed diabetes. The dataset contained 62 attributes classified into four categories: demographic characteristics, disease history, medication use history, and stress test vital signs. We developed an ensemble-based predictive model using 13 attributes that were selected based on their clinical importance, Multiple Linear Regression, and Information Gain Ranking methods. The negative effect of class imbalance on the constructed model was handled by the Synthetic Minority Oversampling Technique (SMOTE). The overall performance of the predictive model classifier was improved by an ensemble machine learning approach using the Vote method with three decision-tree classifiers (Naïve Bayes Tree, Random Forest, and Logistic Model Tree), achieving high prediction accuracy (AUC = 0.92). The study shows the potential of ensembling and SMOTE approaches for predicting incident diabetes using cardiorespiratory fitness data.
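The SMOTE idea referenced above can be sketched in a few lines of pure Python (a hypothetical illustration, not the study's pipeline): synthesize new minority-class samples by interpolating between a minority sample and one of its nearest minority neighbours. All names and data here are invented.

```python
# Hypothetical SMOTE sketch: oversample the minority class by interpolation.
import random

def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def smote(minority, n_synthetic, k=2, seed=0):
    """Generate n_synthetic points between minority samples and their
    k nearest minority neighbours."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_synthetic):
        base = rng.choice(minority)
        neighbours = sorted((p for p in minority if p is not base),
                            key=lambda p: euclidean(base, p))[:k]
        nb = rng.choice(neighbours)
        gap = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(tuple(b + gap * (n - b) for b, n in zip(base, nb)))
    return synthetic

# Toy minority class (e.g. patients who developed diabetes) in two features.
minority = [(1.0, 2.0), (1.2, 1.8), (0.9, 2.2)]
new_points = smote(minority, n_synthetic=5)
print(len(new_points))  # 5 synthetic minority samples
```

Because each synthetic point lies on a segment between two real minority samples, it stays inside the minority region rather than duplicating existing records, which is what lets the classifier see a balanced training set.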
A finite state machine read-out chip for integrated surface acoustic wave sensors
NASA Astrophysics Data System (ADS)
Rakshit, Sambarta; Iliadis, Agis A.
2015-01-01
A finite state machine based integrated sensor circuit suitable for the read-out module of a monolithically integrated SAW sensor on Si is reported. The primary sensor closed loop consists of a voltage controlled oscillator (VCO), a peak detecting comparator, a finite state machine (FSM), and a monolithically integrated SAW sensor device. The output of the system oscillates within a narrow voltage range that correlates with the SAW pass-band response. The period of oscillation is of the order of the SAW phase delay. We use timing information from the FSM to convert the SAW phase delay to an on-chip 10-bit digital output operating on the principle of time-to-digital conversion (TDC). The control inputs of this digital conversion block are generated by a second finite state machine operating under a divided system clock. The average output varies with changes in the SAW center frequency, thus tracking mass sensing events in real time. Based on a measured VCO gain of 16 MHz/V, our system will convert a 10 kHz SAW frequency shift to a corresponding mean voltage shift of 0.7 mV. A corresponding shift in phase delay is converted to a one- or two-bit shift in the TDC output code. The system can handle alternate SAW center frequencies and group delays simply by adjusting the VCO control and TDC delay control inputs. Because of the frequency-to-voltage and phase-to-digital conversion, this topology does not require external frequency counter setups and is uniquely suitable for full monolithic integration of autonomous sensor systems and tags.
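The quoted frequency-to-voltage conversion is simple arithmetic and can be checked directly; only the 16 MHz/V gain and the 10 kHz shift come from the text, the variable names are illustrative. The arithmetic gives about 0.6 mV, in line with the ~0.7 mV figure quoted above.

```python
# Back-of-envelope check of the VCO conversion described in the abstract.
vco_gain_hz_per_v = 16e6   # measured VCO gain: 16 MHz/V
freq_shift_hz = 10e3       # 10 kHz SAW centre-frequency shift

# A frequency shift maps to a control-voltage shift via the VCO gain.
voltage_shift_v = freq_shift_hz / vco_gain_hz_per_v
print(f"{voltage_shift_v * 1e3:.3f} mV")  # prints "0.625 mV"
```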
PIMS sequencing extension: a laboratory information management system for DNA sequencing facilities
2011-01-01
Background Facilities that provide a service for DNA sequencing typically support large numbers of users and experiment types. The cost of services is often reduced by the use of liquid handling robots but the efficiency of such facilities is hampered because the software for such robots does not usually integrate well with the systems that run the sequencing machines. Accordingly, there is a need for software systems capable of integrating different robotic systems and managing sample information for DNA sequencing services. In this paper, we describe an extension to the Protein Information Management System (PIMS) that is designed for DNA sequencing facilities. The new version of PIMS has a user-friendly web interface and integrates all aspects of the sequencing process, including sample submission, handling and tracking, together with capture and management of the data. Results The PIMS sequencing extension has been in production since July 2009 at the University of Leeds DNA Sequencing Facility. It has completely replaced manual data handling and simplified the tasks of data management and user communication. Samples from 45 groups have been processed with an average throughput of 10000 samples per month. The current version of the PIMS sequencing extension works with Applied Biosystems 3130XL 96-well plate sequencer and MWG 4204 or Aviso Theonyx liquid handling robots, but is readily adaptable for use with other combinations of robots. Conclusions PIMS has been extended to provide a user-friendly and integrated data management solution for DNA sequencing facilities that is accessed through a normal web browser and allows simultaneous access by multiple users as well as facility managers. The system integrates sequencing and liquid handling robots, manages the data flow, and provides remote access to the sequencing results. The software is freely available, for academic users, from http://www.pims-lims.org/. PMID:21385349
Soil Moisture as an Estimator for Crop Yield in Germany
NASA Astrophysics Data System (ADS)
Peichl, Michael; Meyer, Volker; Samaniego, Luis; Thober, Stephan
2015-04-01
Annual crop yield depends on various factors such as soil properties, management decisions, and meteorological conditions. Unfavorable weather conditions, e.g. droughts, have the potential to drastically diminish crop yield in rain-fed agriculture. For example, the drought in 2003 caused direct losses of 1.5 billion EUR in Germany alone. Predicting crop yields makes it possible to mitigate the negative effects of weather extremes, which are expected to occur more often in the future due to climate change. A standard approach in economics is to predict the impact of climate change on agriculture as a function of temperature and precipitation. This approach has been developed further using concepts like growing degree days. Other econometric models use nonlinear functions of heat or vapor pressure deficit. However, none of these approaches uses soil moisture to predict crop yield. We hypothesize that soil moisture is a better indicator of stress on plant growth than estimates based on precipitation and temperature. This is because the latter variables do not explicitly account for the available water content in the root zone, which is the primary source of water supply for plant growth. In this study, a reduced-form panel approach is applied to estimate a multivariate econometric production function for the years 1999 to 2010. Annual crop yield data for various crops at the administrative district level serve as dependent variables. The explanatory variable of major interest is the Soil Moisture Index (SMI), which quantifies anomalies in root zone soil moisture. The SMI is computed by the mesoscale Hydrological Model (mHM, www.ufz.de/mhm). The index represents the monthly soil water quantile at a 4 km2 grid resolution covering all of Germany. A reduced-form model is suitable because the SMI is the result of a stochastic weather process and can therefore be considered exogenous. For ease of interpretation, a linear functional form is preferred.
Meteorological, phenological, geological, agronomic, and socio-economic variables are also considered to extend the model in order to reveal the proper causal relation. First results show that both dry and wet extremes of the SMI have a negative impact on crop yield for winter wheat. This indicates that soil moisture has at least a limiting effect on crop production.
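A toy illustration (not the authors' econometric model) of why a quadratic term in the SMI can capture the finding that both dry and wet extremes depress yield: fit yield = b0 + b1*SMI + b2*SMI^2 by least squares and observe a negative quadratic coefficient. All numbers below are invented.

```python
# Toy quadratic yield-response fit via the normal equations (pure Python).

def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [v] for row, v in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            M[r] = [a - f * c for a, c in zip(M[r], M[i])]
    x = [0.0] * 3
    for i in range(2, -1, -1):
        x[i] = (M[i][3] - sum(M[i][j] * x[j] for j in range(i + 1, 3))) / M[i][i]
    return x

def fit_quadratic(smi, yield_):
    """Least-squares fit of yield = b0 + b1*smi + b2*smi^2."""
    S = [sum(s ** k for s in smi) for k in range(5)]
    A = [[S[0], S[1], S[2]], [S[1], S[2], S[3]], [S[2], S[3], S[4]]]
    b = [sum(y * s ** k for s, y in zip(smi, yield_)) for k in range(3)]
    return solve3(A, b)

smi = [0.1, 0.3, 0.5, 0.7, 0.9]   # monthly soil-water quantiles (toy)
yld = [4.0, 6.5, 7.5, 6.4, 3.9]   # toy winter-wheat yields (t/ha)
b0, b1, b2 = fit_quadratic(smi, yld)
print(b2 < 0)  # True: concave response, so both extremes hurt yield
```

The negative b2 makes the fitted response an inverted parabola: yield peaks at an intermediate soil moisture and falls off toward both the dry and wet tails, mirroring the result reported above.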
Method-independent, Computationally Frugal Convergence Testing for Sensitivity Analysis Techniques
NASA Astrophysics Data System (ADS)
Mai, Juliane; Tolson, Bryan
2017-04-01
The increasing complexity and runtime of environmental models lead to the current situation that the calibration of all model parameters or the estimation of all of their uncertainties is often computationally infeasible. Hence, techniques to determine the sensitivity of model parameters are used to identify the most important parameters or model processes. All subsequent model calibrations or uncertainty estimation procedures then focus only on these subsets of parameters and are hence less computationally demanding. While the examination of the convergence of calibration and uncertainty methods is state-of-the-art, the convergence of the sensitivity methods is usually not checked. If anything, bootstrapping of the sensitivity results is used to determine the reliability of the estimated indexes. Bootstrapping, however, can itself become computationally expensive in the case of large model outputs and a high number of bootstraps. We therefore present a Model Variable Augmentation (MVA) approach to check the convergence of sensitivity indexes without performing any additional model run. This technique is method- and model-independent. It can be applied either during the sensitivity analysis (SA) or afterwards. The latter case enables the checking of already processed sensitivity indexes. To demonstrate the method independency of the convergence testing method, we applied it to three widely used, global SA methods: the screening method known as the Morris method or Elementary Effects (Morris 1991, Campolongo et al., 2000), the variance-based Sobol' method (Sobol' 1993, Saltelli et al. 2010) and a derivative-based method known as the Parameter Importance index (Goehler et al. 2013). The new convergence testing method is first scrutinized using 12 analytical benchmark functions (Cuntz & Mai et al. 2015) where the true indexes of the aforementioned three methods are known.
This proof of principle shows that the method reliably determines the uncertainty of the SA results when different budgets are used for the SA. Subsequently, we focus on model independence by testing the frugal method on the hydrologic model mHM (www.ufz.de/mhm) with about 50 model parameters. The results show that the new frugal method is able to test the convergence, and therefore the reliability, of SA results in an efficient way. The appealing feature of this new technique is that it requires no further model evaluations, and it therefore enables the checking of already processed (and published) sensitivity results. This is one step towards reliable and transferable published sensitivity results.
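For readers unfamiliar with the Elementary Effects screening method cited above, here is a minimal, pure-Python sketch (a radial one-at-a-time variant, not the authors' code): perturb each parameter from random base points and average the absolute output changes. The toy model and all settings are illustrative.

```python
# Minimal Elementary Effects (Morris-style) screening sketch.
import random

def toy_model(x):
    # Toy model: x[0] matters a lot, x[1] a little, x[2] not at all.
    return 10 * x[0] + 0.1 * x[1] ** 2 + 0 * x[2]

def elementary_effects(model, dim, n_points=20, delta=0.25, seed=1):
    """Return the mean absolute elementary effect (mu*) per parameter."""
    rng = random.Random(seed)
    mu_star = [0.0] * dim
    for _ in range(n_points):
        x = [rng.random() * (1 - delta) for _ in range(dim)]  # keep x+delta <= 1
        base = model(x)
        for i in range(dim):
            xp = x[:]
            xp[i] += delta
            mu_star[i] += abs(model(xp) - base) / delta
    return [m / n_points for m in mu_star]

mu = elementary_effects(toy_model, dim=3)
print(mu)  # first parameter dominates, third is exactly zero
```

The convergence question the abstract addresses is precisely whether such mu* estimates have stabilized for a given sampling budget, without spending more model runs to find out.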
Drought and heatwaves in Europe: historical reconstruction and future projections
NASA Astrophysics Data System (ADS)
Samaniego, Luis; Thober, Stephan; Kumar, Rohini; Rakovec, Olda; Wood, Eric; Sheffield, Justin; Pan, Ming; Wanders, Niko; Prudhomme, Christel
2017-04-01
Heat waves and droughts are creeping hydro-meteorological events that may bring societies and natural systems to their limits by inducing large famines, increasing health risks to the population, creating drinking and irrigation water shortfalls, inducing natural fires and degradation of soil and water quality, and in many cases causing large socio-economic losses. Europe, in particular, has endured large scale drought-heat-wave events during the recent past (e.g., the 2003 European drought), which have induced enormous socio-economic losses as well as casualties. Recent studies showed that the prediction of droughts and heatwaves is subject to large-scale forcing and parametric uncertainties that lead to considerable uncertainties in the projections of extreme characteristics such as drought magnitude/duration and area under drought, among others. Future projections are also heavily influenced by the RCP scenario uncertainty as well as the coarser spatial resolution of the models. The EDgE project funded by the Copernicus programme (C3S) provides a unique opportunity to investigate the evolution of droughts and heatwaves from 1950 until 2099 over the Pan-EU domain at a scale of 5x5 km2. In this project, high-resolution multi-model hydrologic simulations with mHM (www.ufz.de/mhm), Noah-MP, VIC and PCR-GLOBWB have been completed for the historical period 1955-2015. Climate projections have been carried out with five CMIP-5 GCMs: GFDL-ESM2M, HadGEM2-ES, IPSL-CM5A-LR, MIROC-ESM-CHEM, NorESM1-M from 2006 to 2099 under RCP2.6 and RCP8.5. Using these unprecedented multi-model simulations, daily soil moisture index and temperature anomalies from 1955 until 2099 will be estimated. Using the procedure proposed by Samaniego et al. (2013), the probabilities of exceeding the benchmark events in the reference period 1980-2010 will be estimated for each RCP scenario.
References:
http://climate.copernicus.eu/edge-end-end-demonstrator-improved-decision-making-water-sector-europe
Samaniego, L., R. Kumar, and M. Zink, 2013: Implications of parameter uncertainty on soil moisture drought analysis in Germany. J. Hydrometeor., 14, 47-68, doi:10.1175/JHM-D-12-075.1.
Samaniego, L., et al., 2016: Propagation of forcing and model uncertainties on to hydrological drought characteristics in a multi-model century-long experiment in large river basins. Climatic Change, 1-15.
Velazquez-Pupo, Roxana; Sierra-Romero, Alberto; Torres-Roman, Deni; Shkvarko, Yuriy V.; Romero-Delgado, Misael
2018-01-01
This paper presents a high-performance vision-based system with a single static camera for traffic surveillance, covering moving vehicle detection with occlusion handling, tracking, counting, and One Class Support Vector Machine (OC-SVM) classification. In this approach, moving objects are first segmented from the background using an adaptive Gaussian Mixture Model (GMM). After that, several geometric features are extracted, such as vehicle area, height, width, centroid, and bounding box. When occlusion is present, an algorithm is applied to reduce it. Tracking is performed with an adaptive Kalman filter. Finally, the selected geometric features (estimated area, height, and width) are used by different classifiers to sort vehicles into three classes: small, midsize, and large. Extensive experimental results on eight real traffic videos with more than 4000 ground-truth vehicles show that the improved system can run in real time under an occlusion index of 0.312 and classify vehicles with a global detection rate (recall), precision, and F-measure of up to 98.190%, and an F-measure of up to 99.051% for midsize vehicles. PMID:29382078
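The reported detection metrics are standard functions of true positive (TP), false positive (FP), and false negative (FN) counts; a short sketch makes the relationship explicit. The counts below are invented for illustration, not taken from the paper.

```python
# Recall ("global detection rate"), precision, and F-measure from TP/FP/FN.
def detection_metrics(tp, fp, fn):
    recall = tp / (tp + fn)       # fraction of ground-truth vehicles found
    precision = tp / (tp + fp)    # fraction of detections that were real
    f_measure = 2 * precision * recall / (precision + recall)
    return recall, precision, f_measure

# e.g. 3930 of 4000 ground-truth vehicles found, with 50 spurious detections
r, p, f = detection_metrics(tp=3930, fp=50, fn=70)
print(f"recall={r:.3%} precision={p:.3%} F={f:.3%}")
```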
Satellite operations support expert system
NASA Technical Reports Server (NTRS)
1985-01-01
The Satellite Operations Support Expert System is an effort to identify aspects of satellite ground support activity which could profitably be automated with artificial intelligence (AI) and to develop a feasibility demonstration for the automation of one such area. The hydrazine propulsion subsystems (HPS) of the International Sun Earth Explorer (ISEE) and the International Ultraviolet Explorer (IUE) were used as applications domains. A demonstration fault handling system was built. The system was written in Franz Lisp and is currently hosted on a VAX 11/750-11/780 family machine. The system allows the user to select which HPS (either from ISEE or IUE) is used. Then the user chooses the fault desired for the run. The demonstration system generates telemetry corresponding to the particular fault. The completely separate fault handling module then uses this telemetry to determine what and where the fault is and how to work around it. Graphics are used to depict the structure of the HPS, and the telemetry values displayed on the screen are continually updated. The capabilities of this system and its development cycle are described.
Using video-oriented instructions to speed up sequence comparison.
Wozniak, A
1997-04-01
This document presents an implementation of the well-known Smith-Waterman algorithm for comparison of protein and nucleic acid sequences, using specialized video instructions. These instructions, SIMD-like in their design, make parallelization of the algorithm possible at the instruction level. Benchmarks on an UltraSPARC running at 167 MHz show a speed-up factor of two compared to the same algorithm implemented with integer instructions on the same machine. Performance reaches over 18 million matrix cells per second on a single processor, giving, to our knowledge, the fastest implementation of the Smith-Waterman algorithm on a workstation. The accelerated procedure was introduced in LASSAP--a LArge Scale Sequence compArison Package developed at INRIA--which handles parallelism at a higher level. On a SUN Enterprise 6000 server with 12 processors, a speed of nearly 200 million matrix cells per second has been obtained. A sequence of length 300 amino acids is scanned against SWISSPROT R33 (18,531,385 residues) in 29 s. This procedure is not restricted to databank scanning. It applies to all cases handled by LASSAP (intra- and inter-bank comparisons, Z-score computation, etc.).
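To make concrete the dynamic programme that the video instructions accelerate, here is a compact, unoptimized Smith-Waterman local-alignment scorer in plain Python. The scoring values are illustrative defaults, not LASSAP's parameters.

```python
# Smith-Waterman local alignment: best local score over all cell pairs.
def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]  # DP matrix, zero borders
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
            # Local alignment: scores are clamped at zero, so alignments
            # can start anywhere in either sequence.
            H[i][j] = max(0, diag, H[i-1][j] + gap, H[i][j-1] + gap)
            best = max(best, H[i][j])
    return best  # score of the best local alignment

print(smith_waterman("GATTACA", "GATTACA"))  # prints 14 (7 matches x 2)
```

Every cell of H depends only on its three upper-left neighbours, which is why cells along an anti-diagonal can be computed simultaneously: that independence is what the SIMD-like video instructions exploit.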
Durham, Erin-Elizabeth A; Yu, Xiaxia; Harrison, Robert W
2014-12-01
Effective machine learning handles large datasets efficiently. One key feature of handling large data is the use of databases such as MySQL. The freeware fuzzy decision tree induction tool, FDT, is a scalable supervised-classification software tool implementing fuzzy decision trees. It is based on an optimized fuzzy ID3 (FID3) algorithm. FDT 2.0 improves upon FDT 1.0 by bridging the gap between data science and data engineering: it combines a robust decisioning tool with data retention for future decisions, so that the tool does not need to be recalibrated from scratch every time a new decision is required. In this paper we briefly review the analytical capabilities of the freeware FDT tool and its major features and functionalities; examples of large biological datasets from HIV, microRNAs and sRNAs are included. This work shows how to integrate fuzzy decision algorithms with modern database technology. In addition, we show that integrating the fuzzy decision tree induction tool with database storage allows for optimal user satisfaction in today's Data Analytics world.
GELATIO: a general framework for modular digital analysis of high-purity Ge detector signals
NASA Astrophysics Data System (ADS)
Agostini, M.; Pandola, L.; Zavarise, P.; Volynets, O.
2011-08-01
GELATIO is a new software framework for advanced data analysis and digital signal processing developed for the GERDA neutrinoless double beta decay experiment. The framework is tailored to handle the full analysis flow of signals recorded by high purity Ge detectors and photo-multipliers from the veto counters. It is designed to support a multi-channel modular and flexible analysis, widely customizable by the user either via human-readable initialization files or via a graphical interface. The framework organizes the data into a multi-level structure, from the raw data up to the condensed analysis parameters, and includes tools and utilities to handle the data stream between the different levels. GELATIO is implemented in C++. It relies upon ROOT and its extension TAM, which provides compatibility with PROOF, enabling the software to run in parallel on clusters of computers or many-core machines. It was tested on different platforms and benchmarked in several GERDA-related applications. A stable version is presently available for the GERDA Collaboration and it is used to provide the reference analysis of the experiment data.
Big Data breaking barriers - first steps on a long trail
NASA Astrophysics Data System (ADS)
Schade, S.
2015-04-01
Most data sets and streams have a geospatial component. Some people even claim that about 80% of all data is related to location. In the era of Big Data this number might even be underestimated, as data sets become interrelated and initially non-spatial data becomes indirectly geo-referenced. The optimal treatment of Big Data thus requires advanced methods and technologies for handling the geospatial aspects of data storage, processing, pattern recognition, prediction, visualisation and exploration. On the one hand, our work draws on the earth and environmental sciences for existing interoperability standards, and for the foundational data structures, algorithms and software that are required to meet these geospatial information handling tasks. On the other hand, we are concerned with the arising need to combine human analysis capacities (intelligence augmentation) with machine power (artificial intelligence). This paper provides an overview of the emerging landscape and outlines our (Digital Earth) vision for addressing the upcoming issues. We particularly call for the projection and re-use of existing environmental, earth observation and remote sensing expertise in other sectors, i.e. to break the barriers of all of these silos by investigating integrated applications.
Mechanical characterization of Al-2024 reinforced with fly ash and E-glass by stir casting method
NASA Astrophysics Data System (ADS)
Ramesh, B. T.; Swamy, R. P.; Vinayak, Koppad
2018-04-01
The properties of MMCs encourage their use in automotive and various other applications because of their high stiffness and strength, low density, high electrical and thermal conductivity, corrosion resistance, and improved wear resistance. Metal matrix composites are a vital family of materials aimed at achieving an improved combination of properties. This paper deals with the fabrication of a hybrid composite by heating Al 2024 in a furnace at a temperature of around 400 °C. E-glass fiber and fly ash are added to the molten metal in varying weight fractions and stirred vigorously; the resulting mixture is then poured into the mould to obtain the hybrid composite casting. Aluminium alloy (2024) is the matrix metal used in the present investigation. Fly ash and e-glass are used as the reinforcement materials to produce the composite by stir casting. Fly ash is selected because it is an inexpensive, low-density reinforcement available in great quantities as solid waste from thermal power plants. Test specimens are prepared to ASTM standard sizes by machining for tensile, compression, hardness, and wear testing. The tensile, compression, and wear tests follow ASTM standards E8, E9, and G99, respectively, using a universal testing machine and a pin-on-disc machine. The fabricated MMC shows enhanced mechanical strength.
NASA Astrophysics Data System (ADS)
Delvecchio, S.; Antoni, J.
2012-02-01
This paper addresses the use of a cyclostationary blind source separation algorithm (namely RRCR) to extract angle-deterministic signals from rotating machines in the presence of stationary speed fluctuations. This means that only phase fluctuations while the machine is running in steady-state conditions are considered; run-up or run-down speed variations are not taken into account. The machine is also assumed to run in idle conditions, so non-stationary phenomena due to the load are not considered. It is theoretically assessed that under such operating conditions the deterministic (periodic) signal in the angle domain becomes cyclostationary at first and second orders in the time domain. This fact justifies the use of the RRCR algorithm, which is able to extract the angle-deterministic signal directly from the time domain without performing any kind of interpolation. This is particularly valuable when angular resampling fails because of uncontrolled speed fluctuations. The capability of the proposed approach is verified by means of simulated and actual vibration signals captured on a pneumatic screwdriver handle. In this particular case not only can the angle-deterministic part be extracted, but the main sources of excitation (i.e. motor shaft imbalance, epicycloidal gear meshing and air pressure forces) affecting the user's hand during operation can also be separated.
State of the art in nuclear telerobotics: focus on the man/machine connection
NASA Astrophysics Data System (ADS)
Greaves, Amna E.
1995-12-01
The interface between the human controller and remotely operated device is a crux of telerobotic investigation today. This human-to-machine connection is the means by which we communicate our commands to the device, as well as the medium for decision-critical feedback to the operator. The amount of information transferred through the user interface is growing. This can be seen as a direct result of our need to support added complexities, as well as a rapidly expanding domain of applications. A user interface, or UI, is therefore subject to increasing demands to present information in a meaningful manner to the user. Virtual reality, and multi degree-of-freedom input devices lend us the ability to augment the man/machine interface, and handle burgeoning amounts of data in a more intuitive and anthropomorphically correct manner. Along with the aid of 3-D input and output devices, there are several visual tools that can be employed as part of a graphical UI that enhance and accelerate our comprehension of the data being presented. Thus an advanced UI that features these improvements would reduce the amount of fatigue on the teleoperator, increase his level of safety, facilitate learning, augment his control, and potentially reduce task time. This paper investigates the cutting edge concepts and enhancements that lead to the next generation of telerobotic interface systems.
Lee, Unseok; Chang, Sungyul; Putra, Gian Anantrio; Kim, Hyoungseok; Kim, Dong Hwan
2018-01-01
A high-throughput plant phenotyping system automatically observes and grows many plant samples. Many plant sample images are acquired by the system to determine the characteristics of the plants (populations). Stable image acquisition and processing is very important to accurately determine the characteristics. However, hardware for acquiring plant images rapidly and stably, while minimizing plant stress, is lacking. Moreover, most software cannot adequately handle large-scale plant imaging. To address these problems, we developed a new, automated, high-throughput plant phenotyping system using simple and robust hardware, and an automated plant-imaging-analysis pipeline consisting of machine-learning-based plant segmentation. Our hardware acquires images reliably and quickly and minimizes plant stress. Furthermore, the images are processed automatically. In particular, large-scale plant-image datasets can be segmented precisely using a classifier developed using a superpixel-based machine-learning algorithm (Random Forest), and variations in plant parameters (such as area) over time can be assessed using the segmented images. We performed comparative evaluations to identify an appropriate learning algorithm for our proposed system, and tested three robust learning algorithms. We developed not only an automatic analysis pipeline but also a convenient means of plant-growth analysis that provides a learning data interface and visualization of plant growth trends. Thus, our system allows end-users such as plant biologists to analyze plant growth via large-scale plant image data easily.
Reliable optical card-edge (ROC) connector for avionics applications
NASA Astrophysics Data System (ADS)
Darden, Bruce V.; Pimpinella, Richard J.; Seals, John D.
1994-10-01
The Reliable Optical Card-Edge (ROC) Connector is a blind-mate backplane unit designed to meet military stress requirements for avionics applications. Its modular design represents the first significant advance in connector optics since the biconic butt-coupled connector was introduced twenty years ago. This multimode connector utilizes beam optics, micro-machined silicon, and a floating, low mass subassembly design to maintain low coupling loss under high levels of shock and vibration. The ROC connector also incorporates retracting doors to protect the unmated termini from environmental contamination and abusive handling. Design features and test results for the ROC connector are presented in this paper.
Introduction of Virtualization Technology to Multi-Process Model Checking
NASA Technical Reports Server (NTRS)
Leungwattanakit, Watcharin; Artho, Cyrille; Hagiya, Masami; Tanabe, Yoshinori; Yamamoto, Mitsuharu
2009-01-01
Model checkers find failures in software by exploring every possible execution schedule. Java PathFinder (JPF), a Java model checker, has been extended recently to cover networked applications by caching data transferred in a communication channel. A target process is executed by JPF, whereas its peer process runs on a regular virtual machine outside. However, non-deterministic target programs may produce different output data in each schedule, causing the cache to restart the peer process to handle the different set of data. Virtualization tools could help us restore previous states of peers, eliminating peer restart. This paper proposes the application of virtualization technology to networked model checking, concentrating on JPF.
Spline-Screw Payload-Fastening System
NASA Technical Reports Server (NTRS)
Vranish, John M.
1994-01-01
Payload handed off securely between robot and vehicle or structure. Spline-screw payload-fastening system includes mating female and male connector mechanisms. Clockwise (or counter-clockwise) rotation of splined male driver on robotic end effector causes connection between robot and payload to tighten (or loosen) and simultaneously causes connection between payload and structure to loosen (or tighten). Includes mechanisms like those described in "Tool-Changing Mechanism for Robot" (GSC-13435) and "Self-Aligning Mechanical and Electrical Coupling" (GSC-13430). Designed for use in outer space, also useful on Earth in applications that require secure handling and secure mounting of equipment modules during storage, transport, and/or operation. Particularly useful in machine or robotic applications.
NASA Technical Reports Server (NTRS)
Piccolo, R.
1979-01-01
The design, development, efficiency, manufacturability, production costs, life cycle cost, and safety of sodium-sulfur, nickel-zinc, and lead-acid batteries for electric hybrid vehicles are discussed. Models are given for simulating the vehicle handling quality, and for finding the value of: (1) the various magnetic quantities in the different sections in which the magnetic circuit of the DC electric machine is divided; (2) flux distribution in the air gap and the magnetization curve under load conditions; and (3) the mechanical power curves versus motor speed at different values of armature current.
Arithmetic Data Cube as a Data Intensive Benchmark
NASA Technical Reports Server (NTRS)
Frumkin, Michael A.; Shabano, Leonid
2003-01-01
Data movement across computational grids and across the memory hierarchy of individual grid machines is known to be a limiting factor for applications involving large data sets. In this paper we introduce the Data Cube Operator on an Arithmetic Data Set, which we call the Arithmetic Data Cube (ADC). We propose to use the ADC to benchmark grid capabilities to handle large distributed data sets. The ADC stresses all levels of grid memory by producing the 2^d views of an Arithmetic Data Set of d-tuples described by a small number of parameters. We control the data intensity of the ADC by controlling the sizes of the views through the choice of the tuple parameters.
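The "2^d views" of a data cube are simply the group-by aggregations over every subset of the d tuple attributes; a short sketch makes this concrete. The attribute names, rows, and the count aggregate are illustrative, not the ADC's actual parameterization.

```python
# Enumerate all 2^d group-by views of a small tuple data set.
from itertools import combinations

def cube_views(rows, attrs):
    """Return {grouping: {group-key: count}} for all 2^d group-by views."""
    views = {}
    for r in range(len(attrs) + 1):
        for group in combinations(attrs, r):
            agg = {}
            for row in rows:
                key = tuple(row[a] for a in group)
                agg[key] = agg.get(key, 0) + 1
            views[group] = agg
    return views

rows = [{"x": 1, "y": 0, "z": 1}, {"x": 1, "y": 1, "z": 1}, {"x": 0, "y": 0, "z": 1}]
views = cube_views(rows, ("x", "y", "z"))
print(len(views))  # 2^3 = 8 views, from the empty grouping up to (x, y, z)
```

Because the number of views doubles with each added attribute while their sizes depend on the attribute value ranges, tuning the tuple parameters tunes the data intensity, which is the benchmarking knob the abstract describes.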
Calculating Trajectories And Orbits
NASA Technical Reports Server (NTRS)
Alderson, Daniel J.; Brady, Franklyn H.; Breckheimer, Peter J.; Campbell, James K.; Christensen, Carl S.; Collier, James B.; Ekelund, John E.; Ellis, Jordan; Goltz, Gene L.; Hintz, Gerarld R.;
1989-01-01
Double-Precision Trajectory Analysis Program, DPTRAJ, and Orbit Determination Program, ODP, developed and improved over years to provide highly reliable and accurate navigation capability for deep-space missions like Voyager. Each collection of programs working together to provide desired computational results. DPTRAJ, ODP, and supporting utility programs capable of handling massive amounts of data and performing various numerical calculations required for solving navigation problems associated with planetary fly-by and lander missions. Used extensively in support of NASA's Voyager project. DPTRAJ-ODP available in two machine versions. UNIVAC version, NPO-15586, written in FORTRAN V, SFTRAN, and ASSEMBLER. VAX/VMS version, NPO-17201, written in FORTRAN V, SFTRAN, PL/1 and ASSEMBLER.
Optimization-based manufacturing scheduling with multiple resources and setup requirements
NASA Astrophysics Data System (ADS)
Chen, Dong; Luh, Peter B.; Thakur, Lakshman S.; Moreno, Jack, Jr.
1998-10-01
The increasing demand for on-time delivery and low price forces manufacturers to seek effective schedules that improve the coordination of multiple resources and reduce internal product costs associated with labor, setup, and inventory. This study describes the design and implementation of a scheduling system for J. M. Product Inc., whose manufacturing is characterized by the need to simultaneously consider machines and operators, where an operator may attend several operations at the same time, and by the presence of machines requiring significant setup times. Scheduling problems with these characteristics are typical for many manufacturers, very difficult to handle, and have not been adequately addressed in the literature. In this study, both machines and operators are modeled as resources with finite capacities to obtain efficient coordination between them, and an operator's time can be shared by several operations at the same time to make full use of the operator. Setups are explicitly modeled following our previous work, with additional penalties on excessive setups to reduce setup costs and avoid possible scrap. An integer formulation with a separable structure is developed to maximize on-time delivery of products while keeping inventory and the number of setups low. Within the Lagrangian relaxation framework, the problem is decomposed into individual subproblems that are effectively solved by dynamic programming with the additional penalties embedded in state transitions. A heuristic is then developed to obtain a feasible schedule, following our previous work, with a new mechanism to satisfy operator capacity constraints. The method has been implemented in the object-oriented programming language C++ with a user-friendly interface, and numerical testing shows that it generates high-quality schedules in a timely fashion.
Through the simultaneous consideration of machines and operators, the two resources are well coordinated to facilitate the smooth flow of parts through the system. The explicit modeling of setups and the associated penalties lets parts with the same setup requirements be clustered together, avoiding excessive setups.
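The Lagrangian relaxation idea above can be sketched in a toy single-resource form: relax the one-job-per-slot machine capacity with time-slot multipliers, let each job solve its own subproblem against those "prices", and update the multipliers by subgradient ascent. This is only an illustration of the decomposition; the job data, step sizes, and the enumeration standing in for the paper's dynamic programming are all invented.

```python
# Toy Lagrangian relaxation for single-machine scheduling (illustrative;
# job data and step sizes are invented, not from the paper).  Machine
# capacity (one job per slot) is relaxed with multipliers lam[t]; each
# job subproblem picks a start time by simple enumeration, standing in
# for the dynamic programming used in the paper.

def solve_subproblem(p, due, lam, horizon):
    """Pick the start minimizing tardiness + sum of slot prices used."""
    best = None
    for s in range(horizon - p + 1):
        cost = max(0, s + p - due) + sum(lam[s:s + p])
        if best is None or cost < best[0]:
            best = (cost, s)
    return best[1]

def lagrangian_schedule(jobs, horizon, iters=200, step=0.05):
    lam = [0.0] * horizon                     # one multiplier per time slot
    for _ in range(iters):
        load = [0] * horizon
        starts = []
        for p, due in jobs:
            s = solve_subproblem(p, due, lam, horizon)
            starts.append(s)
            for t in range(s, s + p):
                load[t] += 1
        # Subgradient update: raise the price of overloaded slots.
        lam = [max(0.0, l + step * (u - 1)) for l, u in zip(lam, load)]
    return starts, load

jobs = [(2, 3), (2, 4), (1, 5)]               # (processing time, due date)
starts, load = lagrangian_schedule(jobs, horizon=8)
```

As in the paper, the relaxed solution need not be feasible; a separate heuristic would repair any slots where the load still exceeds capacity.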
NASA Astrophysics Data System (ADS)
Neff, John A.
1989-12-01
Experiments originating from Gestalt psychology have shown that representing information in symbolic form provides a more effective means of understanding. Computer scientists have been struggling for the last two decades to determine how best to create, manipulate, and store collections of symbolic structures. In the past, much of this struggle led to software innovations because that was the path of least resistance. For example, the development of heuristics for organizing the search through knowledge bases was much less expensive than building massively parallel machines that could search in parallel. That is now beginning to change with the emergence of parallel architectures that are showing the potential for handling symbolic structures. This paper will review the relationships between symbolic computing and parallel computing architectures, and will identify opportunities for optics to significantly impact the performance of such computing machines. Although neural networks are an exciting subset of massively parallel computing structures, this paper will not touch on that area since it is receiving a great deal of attention in the literature; that is, the concepts presented herein do not consider the distributed representation of knowledge.
Airline Passenger Profiling Based on Fuzzy Deep Machine Learning.
Zheng, Yu-Jun; Sheng, Wei-Guo; Sun, Xing-Ming; Chen, Sheng-Yong
2017-12-01
Passenger profiling plays a vital part in commercial aviation security, but classical methods become very inefficient in handling the rapidly increasing amounts of electronic records. This paper proposes a deep learning approach to passenger profiling. The center of our approach is a Pythagorean fuzzy deep Boltzmann machine (PFDBM), whose parameters are expressed by Pythagorean fuzzy numbers so that each neuron can learn how a feature affects the production of the correct output from both the positive and negative sides. We propose a hybrid algorithm combining a gradient-based method and an evolutionary algorithm for training the PFDBM. Based on the novel learning model, we develop a deep neural network (DNN) for classifying normal passengers and potential attackers, and further develop an integrated DNN for identifying group attackers whose individual features are insufficient to reveal the abnormality. Experiments on data sets from Air China show that our approach provides much higher learning ability and classification accuracy than existing profilers. It is expected that the fuzzy deep learning approach can be adapted for a variety of complex pattern analysis tasks.
New concept for in-line OLED manufacturing
NASA Astrophysics Data System (ADS)
Hoffmann, U.; Landgraf, H.; Campo, M.; Keller, S.; Koening, M.
2011-03-01
A new concept of a vertical in-line deposition machine for large-area white OLED production has been developed. The concept targets manufacturing on large substrates (>= Gen 4, 750 x 920 mm2) using linear deposition sources, achieving a total material utilization of >= 50% and tact times down to 80 seconds. The continuously improved linear evaporation sources for the organic materials achieve a thickness uniformity on Gen 4 substrates of better than +/- 3% and stable deposition rates from less than 0.1 nm m/min up to more than 100 nm m/min. For lithium fluoride, as well as for other high-evaporation-temperature materials such as magnesium or silver, a linear source with a uniformity of better than +/- 3% has been developed. For aluminum we integrated a vertically oriented point source using wire feed to achieve high (> 150 nm m/min) and stable deposition rates. The machine concept includes a new vertical vacuum handling and alignment system for Gen 4 shadow masks. A complete mask alignment cycle can be completed in less than one minute, achieving an alignment accuracy of a few tens of μm.
Supervised Learning Applied to Air Traffic Trajectory Classification
NASA Technical Reports Server (NTRS)
Bosson, Christabelle S.; Nikoleris, Tasos
2018-01-01
Given the recent increase of interest in introducing new vehicle types and missions into the National Airspace System, a transition towards a more autonomous air traffic control system is required in order to enable and handle increased density and complexity. This paper presents an exploratory study of the needed autonomous capabilities based on supervised learning techniques in the context of aircraft trajectories. In particular, it focuses on the application of machine learning algorithms and neural network models to a runway-recognition trajectory-classification study. It investigates the applicability and effectiveness of various classifiers using datasets containing trajectory records for a month of air traffic. A feature importance and sensitivity analysis are conducted to challenge the chosen time-based datasets and the ten selected features. The study demonstrates that classification accuracy levels of 90% and above can be reached in less than 40 seconds of training for most machine learning classifiers when one track data point per trajectory, described by the ten selected features at a particular time step, is used as input. It also shows that neural network models can achieve similar accuracy levels but at higher training-time costs.
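The workflow described — label trajectories by runway, featurize one track point each, train a classifier — can be sketched with a minimal nearest-centroid classifier. This is a stand-in for the paper's actual classifiers, and the toy track-point features (altitude, ground speed, heading) and runway labels are invented for illustration.

```python
# Minimal nearest-centroid classifier over fixed-length feature vectors,
# a stand-in for the paper's machine learning classifiers.  The toy
# "track point" features and runway labels below are invented.

def fit_centroids(X, y):
    """Mean feature vector per class label."""
    sums, counts = {}, {}
    for x, label in zip(X, y):
        acc = sums.setdefault(label, [0.0] * len(x))
        for i, v in enumerate(x):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc]
            for label, acc in sums.items()}

def predict(centroids, x):
    """Label of the nearest centroid (squared Euclidean distance)."""
    return min(centroids,
               key=lambda lb: sum((a - b) ** 2
                                  for a, b in zip(centroids[lb], x)))

# Toy track points: (altitude_kft, ground_speed_kt, heading_deg).
X = [(2.0, 140, 90), (2.2, 150, 95), (8.0, 250, 270), (7.5, 240, 265)]
y = ['runway_09', 'runway_09', 'runway_27', 'runway_27']
centroids = fit_centroids(X, y)
label = predict(centroids, (2.1, 145, 92))   # near the runway_09 group
```

In practice features on very different scales (feet vs. knots vs. degrees) would be standardized first, which is part of what the paper's feature sensitivity analysis probes.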
A Review of Extra-Terrestrial Mining Robot Concepts
NASA Technical Reports Server (NTRS)
Mueller, Robert P.; Van Susante, Paul J.
2011-01-01
Outer space contains a vast amount of resources that offer virtually unlimited wealth to the humans that can access and use them for commercial purposes. One of the key technologies for harvesting these resources is robotic mining of regolith, minerals, ices and metals. The harsh environment and vast distances create challenges that are handled best by robotic machines working in collaboration with human explorers. Humans will benefit from the resources that will be mined by robots. They will visit outposts and mining camps as required for exploration, commerce and scientific research, but a continuous presence is most likely to be provided by robotic mining machines that are remotely controlled by humans. There have been a variety of extra-terrestrial robotic mining concepts proposed over the last 100 years and this paper will attempt to summarize and review concepts in the public domain (government, industry and academia) to serve as an informational resource for future mining robot developers and operators. The challenges associated with these concepts will be discussed and feasibility will be assessed. Future needs associated with commercial efforts will also be investigated.
The current status and portability of our sequence handling software.
Staden, R
1986-01-01
I describe the current status of our sequence analysis software. The package contains a comprehensive suite of programs for managing large shotgun sequencing projects, a program containing 61 functions for analysing single sequences and a program for comparing pairs of sequences for similarity. The programs that have been described before have been improved by the addition of new functions and by being made very much easier to use. The major interactive programs have 125 pages of online help available from within them. Several new programs are described, including screen editing of aligned gel readings for shotgun sequencing projects; a method to highlight errors in aligned gel readings; and new methods for searching for putative signals in sequences. We use the programs on a VAX computer but the whole package has been rewritten to make it easy to transport to other machines. I believe the programs will now run on any machine with a FORTRAN77 compiler and sufficient memory. We are currently putting the programs onto an IBM PC XT/AT and another micro running under UNIX. PMID:3511446
Analysis of Big Data in Gait Biomechanics: Current Trends and Future Directions.
Phinyomark, Angkoon; Petri, Giovanni; Ibáñez-Marcelo, Esther; Osis, Sean T; Ferber, Reed
2018-01-01
The increasing amount of data in biomechanics research has greatly increased the importance of developing advanced multivariate analysis and machine learning techniques, which are better able to handle "big data". Consequently, advances in data science methods will expand the knowledge for testing new hypotheses about biomechanical risk factors associated with walking and running gait-related musculoskeletal injury. This paper begins with a brief introduction to an automated three-dimensional (3D) biomechanical gait data collection system: 3D GAIT, followed by how the studies in the field of gait biomechanics fit the quantities in the 5 V's definition of big data: volume, velocity, variety, veracity, and value. Next, we provide a review of recent research and development in multivariate and machine learning methods-based gait analysis that can be applied to big data analytics. These modern biomechanical gait analysis methods include several main modules such as initial input features, dimensionality reduction (feature selection and extraction), and learning algorithms (classification and clustering). Finally, a promising big data exploration tool called "topological data analysis" and directions for future research are outlined and discussed.
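Among the dimensionality-reduction modules the review lists, the simplest is univariate filter-style feature selection. A minimal sketch, using a Fisher-like separation score on invented toy "gait features" (not data or methods from the review), is:

```python
# Minimal univariate feature-selection sketch: rank features by a
# Fisher-like separation score |mean0 - mean1| / (std0 + std1), one of
# many filter methods grouped under "feature selection" in reviews like
# this one.  The toy gait feature values below are invented.

def mean(v):
    return sum(v) / len(v)

def std(v):
    m = mean(v)
    return (sum((x - m) ** 2 for x in v) / len(v)) ** 0.5

def fisher_scores(X0, X1):
    """Per-feature separation score between two groups of samples."""
    scores = []
    for j in range(len(X0[0])):
        a = [x[j] for x in X0]
        b = [x[j] for x in X1]
        scores.append(abs(mean(a) - mean(b)) / (std(a) + std(b) + 1e-12))
    return scores

# Feature 0 separates the two groups; feature 1 overlaps almost entirely.
healthy = [(1.0, 5.0), (1.2, 4.8), (0.9, 5.1)]
injured = [(3.0, 5.0), (3.1, 4.9), (2.8, 5.2)]
scores = fisher_scores(healthy, injured)
```

Keeping only the top-scoring features before feeding a learning algorithm is one concrete way "big" 3D gait datasets are made tractable.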
A Review of Extra-Terrestrial Mining Concepts
NASA Technical Reports Server (NTRS)
Mueller, R. P.; van Susante, P. J.
2012-01-01
Outer space contains a vast amount of resources that offer virtually unlimited wealth to the humans that can access and use them for commercial purposes. One of the key technologies for harvesting these resources is robotic mining of regolith, minerals, ices and metals. The harsh environment and vast distances create challenges that are handled best by robotic machines working in collaboration with human explorers. Humans will benefit from the resources that will be mined by robots. They will visit outposts and mining camps as required for exploration, commerce and scientific research, but a continuous presence is most likely to be provided by robotic mining machines that are remotely controlled by humans. There have been a variety of extra-terrestrial robotic mining concepts proposed over the last 40 years and this paper will attempt to summarize and review concepts in the public domain (government, industry and academia) to serve as an informational resource for future mining robot developers and operators. The challenges associated with these concepts will be discussed and feasibility will be assessed. Future needs associated with commercial efforts will also be investigated.
Amorphous and Nanocomposite Materials for Energy-Efficient Electric Motors
NASA Astrophysics Data System (ADS)
Silveyra, Josefina M.; Xu, Patricia; Keylin, Vladimir; DeGeorge, Vincent; Leary, Alex; McHenry, Michael E.
2016-01-01
We explore amorphous soft-magnetic alloys as candidates for electric motor applications. The Co-rich system combines the benefits of low hysteretic and eddy-current losses while exhibiting negligible magnetostriction and robust mechanical properties. The amorphous precursors can be devitrified to form nanocomposite magnets. The superior characteristics of these materials offer the advantages of ease of handling in the manufacturing processing and low iron losses during motor operation. Co-rich amorphous ribbons were laser-cut to build a stator for a small demonstrator permanent-magnet machine. The motor was tested up to ~30,000 rpm. Finite-element analyses proved that the iron losses of the Co-rich amorphous stator were ~80% smaller than for a Si steel stator in the same motor, at 18,000 rpm (equivalent to an electric frequency of 2.1 kHz). These low-loss soft magnets have great potential for application in highly efficient high-speed electric machines, leading to size reduction as well as reduction or replacement of rare earths in permanent-magnet motors. More studies evaluating further processing techniques for amorphous and nanocomposite materials are needed.
NASA Astrophysics Data System (ADS)
Moran, Niklas; Nieland, Simon; Tintrup gen. Suntrup, Gregor; Kleinschmit, Birgit
2017-02-01
Manual field surveys for nature conservation management are expensive and time-consuming and could be supplemented and streamlined by using Remote Sensing (RS). RS is critical to meet requirements of existing laws such as the EU Habitats Directive (HabDir) and more importantly to meet future challenges. The full potential of RS has yet to be harnessed as different nomenclatures and procedures hinder interoperability, comparison and provenance. Therefore, automated tools are needed to use RS data to produce comparable, empirical data outputs that lend themselves to data discovery and provenance. These issues are addressed by a novel, semi-automatic ontology-based classification method that uses machine learning algorithms and Web Ontology Language (OWL) ontologies that yields traceable, interoperable and observation-based classification outputs. The method was tested on European Union Nature Information System (EUNIS) grasslands in Rheinland-Palatinate, Germany. The developed methodology is a first step in developing observation-based ontologies in the field of nature conservation. The tests show promising results for the determination of the grassland indicators wetness and alkalinity with an overall accuracy of 85% for alkalinity and 76% for wetness.
Shi, Yingzhong; Chung, Fu-Lai; Wang, Shitong
2015-09-01
Recently, a time-adaptive support vector machine (TA-SVM) was proposed for handling nonstationary datasets. While attractive performance has been reported and the new classifier is distinctive in simultaneously solving several SVM subclassifiers locally and globally by using an elegant SVM formulation in an alternative kernel space, the coupling of subclassifiers requires the computation of a matrix inversion, which imposes a high computational burden in large nonstationary dataset applications. To overcome this shortcoming, an improved TA-SVM (ITA-SVM) is proposed using a common vector shared by all the SVM subclassifiers involved. ITA-SVM not only keeps an SVM formulation, but also avoids the computation of matrix inversion. Thus, we can realize its fast version, the improved time-adaptive core vector machine (ITA-CVM), for large nonstationary datasets by using the CVM technique. ITA-CVM has the merit of asymptotic linear time complexity for large nonstationary datasets and inherits the advantages of TA-SVM. The effectiveness of the proposed ITA-SVM and ITA-CVM classifiers is also experimentally confirmed.
Real time PI-backstepping induction machine drive with efficiency optimization.
Farhani, Fethi; Ben Regaya, Chiheb; Zaafouri, Abderrahmen; Chaari, Abdelkader
2017-09-01
This paper describes a robust and efficient speed control scheme for a three-phase induction machine (IM) subjected to load disturbances. First, a Multiple-Input Multiple-Output (MIMO) PI-Backstepping controller is proposed for robust and highly accurate tracking of the mechanical speed and rotor flux. Asymptotic stability of the control scheme is proven by Lyapunov stability theory. Second, an active online optimization algorithm is used to optimize the efficiency of the drive system. The efficiency-improvement approach consists of adjusting the rotor flux with respect to the load torque in order to minimize total losses in the IM. A dSPACE DS1104 R&D board is used to implement the proposed solution. Experimental results obtained on a 3 kW squirrel-cage IM show that the reference speed as well as the rotor flux are rapidly achieved, with a fast transient response and without overshoot. Load disturbances and IM parameter variations are well handled. The improvement in drive-system efficiency reaches up to 180% at light load. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
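The speed-tracking part of such a drive can be illustrated with a much-simplified discrete PI loop on a first-order mechanical model (J dw/dt = T - B w - T_load). This does not reproduce the paper's MIMO PI-Backstepping design or the flux loop; all gains and machine constants below are invented.

```python
# Much-simplified sketch: a plain discrete PI speed loop on a first-order
# mechanical model.  The paper's MIMO PI-Backstepping controller and the
# rotor-flux loop are NOT reproduced here; gains/constants are invented.
J, B = 0.02, 0.005          # inertia (kg m^2), viscous friction (N m s)
KP, KI = 0.8, 12.0          # PI gains (hand-tuned for this toy model)
DT = 1e-3                   # time step (s)

def run(w_ref, t_load, steps=4000):
    """Simulate the closed loop and return the final speed (rad/s)."""
    w, integ = 0.0, 0.0
    for _ in range(steps):
        err = w_ref - w
        integ += err * DT
        torque = KP * err + KI * integ           # PI control law
        w += DT * (torque - B * w - t_load) / J  # forward-Euler plant
    return w

w_final = run(w_ref=100.0, t_load=0.5)  # tracks 100 rad/s under load
```

The integral term is what rejects the constant load torque at steady state; backstepping (as in the paper) additionally provides a Lyapunov-based stability guarantee for the coupled speed/flux dynamics, which a plain PI loop does not.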
Online Least Squares One-Class Support Vector Machines-Based Abnormal Visual Event Detection
Wang, Tian; Chen, Jie; Zhou, Yi; Snoussi, Hichem
2013-01-01
The abnormal event detection problem is an important subject in real-time video surveillance. In this paper, we propose a novel online one-class classification algorithm, online least squares one-class support vector machine (online LS-OC-SVM), combined with its sparsified version (sparse online LS-OC-SVM). LS-OC-SVM extracts a hyperplane as an optimal description of training objects in a regularized least squares sense. The online LS-OC-SVM learns a training set with a limited number of samples to provide a basic normal model, then updates the model through remaining data. In the sparse online scheme, the model complexity is controlled by the coherence criterion. The online LS-OC-SVM is adopted to handle the abnormal event detection problem. Each frame of the video is characterized by the covariance matrix descriptor encoding the moving information, then is classified into a normal or an abnormal frame. Experiments are conducted, on a two-dimensional synthetic distribution dataset and a benchmark video surveillance dataset, to demonstrate the promising results of the proposed online LS-OC-SVM method. PMID:24351629
Online least squares one-class support vector machines-based abnormal visual event detection.
Wang, Tian; Chen, Jie; Zhou, Yi; Snoussi, Hichem
2013-12-12
The abnormal event detection problem is an important subject in real-time video surveillance. In this paper, we propose a novel online one-class classification algorithm, online least squares one-class support vector machine (online LS-OC-SVM), combined with its sparsified version (sparse online LS-OC-SVM). LS-OC-SVM extracts a hyperplane as an optimal description of training objects in a regularized least squares sense. The online LS-OC-SVM learns a training set with a limited number of samples to provide a basic normal model, then updates the model through remaining data. In the sparse online scheme, the model complexity is controlled by the coherence criterion. The online LS-OC-SVM is adopted to handle the abnormal event detection problem. Each frame of the video is characterized by the covariance matrix descriptor encoding the moving information, then is classified into a normal or an abnormal frame. Experiments are conducted, on a two-dimensional synthetic distribution dataset and a benchmark video surveillance dataset, to demonstrate the promising results of the proposed online LS-OC-SVM method.
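The online train-then-update workflow described in the abstract can be illustrated with a crude stand-in: keep a running mean of "normal" frame features and flag frames far from it. This is NOT the least-squares one-class SVM of the paper (no hyperplane, no kernel, no sparsification); the data and threshold are invented.

```python
# Crude online one-class detector: maintain a running mean of "normal"
# feature vectors and flag frames whose distance exceeds a threshold.
# This only illustrates the online workflow; it is NOT the LS-OC-SVM
# formulation of the paper, and all data below is invented.

class OnlineNoveltyDetector:
    def __init__(self, threshold):
        self.threshold = threshold
        self.mean = None
        self.n = 0

    def update(self, x):
        """Fold one normal sample into the running mean."""
        if self.mean is None:
            self.mean = list(x)
        else:
            for i, v in enumerate(x):
                self.mean[i] += (v - self.mean[i]) / (self.n + 1)
        self.n += 1

    def is_abnormal(self, x):
        dist = sum((a - b) ** 2 for a, b in zip(x, self.mean)) ** 0.5
        return dist > self.threshold

det = OnlineNoveltyDetector(threshold=1.0)
for frame in [(0.1, 0.2), (0.0, 0.1), (0.2, 0.0)]:  # normal motion features
    det.update(frame)
```

In the paper each frame's feature vector is a covariance-matrix descriptor of motion, and the model update is the sparse online LS-OC-SVM rather than a running mean.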
Human machine interface to manually drive rhombic like vehicles in remote handling operations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lopes, Pedro; Vale, Alberto; Ventura, Rodrigo
2015-07-01
In the thermonuclear experimental reactor ITER, a vehicle named CTS is designed to transport a container with activated components inside the buildings. In nominal operations, the CTS is autonomously guided under supervision. However, in some unexpected situations, such as in rescue and recovery operations, the autonomous mode must be overridden and the CTS must be remotely guided by an operator. The CTS is a rhombic-like vehicle, with two drivable and steerable wheels along its longitudinal axis, providing omni-directional capabilities. The rhombic kinematics correspond to four control variables, which are difficult to manage in manual mode operation. This paper proposes a Human Machine Interface (HMI) to remotely guide the vehicle in manual mode. The proposed solution is implemented using a HMI with an encoder connected to a micro-controller and an analog 2-axis joystick. Experimental results were obtained comparing the proposed solution with other controller devices in different scenarios and using a software platform that simulates the kinematics and dynamics of the vehicle. (authors)
Underestimating extreme events in power-law behavior due to machine-dependent cutoffs
NASA Astrophysics Data System (ADS)
Radicchi, Filippo
2014-11-01
Power-law distributions are typical macroscopic features occurring in almost all complex systems observable in nature. As a result, researchers in quantitative analyses must often generate random synthetic variates obeying power-law distributions. The task is usually performed through standard methods that map uniform random variates into the desired probability space. Whereas all these algorithms are theoretically solid, in this paper we show that they are subject to severe machine-dependent limitations. As a result, two dramatic consequences arise: (i) the sampling in the tail of the distribution is not random but deterministic; (ii) the moments of the sample distribution, which are theoretically expected to diverge as functions of the sample sizes, converge instead to finite values. We provide quantitative indications for the range of distribution parameters that can be safely handled by standard libraries used in computational analyses. Whereas our findings indicate possible reinterpretations of numerical results obtained through flawed sampling methodologies, they also pave the way for the search for a concrete solution to this central issue shared by all quantitative sciences dealing with complexity.
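The machine-dependent cutoff is easy to see in the standard inverse-transform sampler x = x_min (1 - u)^(-1/(alpha-1)): because a 64-bit double u can never get closer to 1 than the last representable value below it, the largest sample the method can ever return is finite and deterministic. A sketch (illustrating the paper's point, not reproducing its analysis):

```python
# Inverse-transform sampling for a power law p(x) ~ x^(-alpha), x >= xmin,
# and the machine-dependent tail cutoff discussed in the paper.
import math
import random

def powerlaw_sample(alpha, xmin, u):
    """Map a uniform variate u in [0, 1) to a power-law variate."""
    return xmin * (1.0 - u) ** (-1.0 / (alpha - 1.0))

alpha, xmin = 2.5, 1.0
u_max = math.nextafter(1.0, 0.0)             # largest double below 1.0
x_cap = powerlaw_sample(alpha, xmin, u_max)  # hard ceiling of the sampler

samples = [powerlaw_sample(alpha, xmin, random.random())
           for _ in range(10000)]
# Every sample respects the cutoff x_cap, although the true distribution
# has unbounded support -- the tail beyond x_cap is simply unreachable.
```

This is why the sample moments converge to finite values even when the theoretical moments diverge: the integral is effectively truncated at x_cap.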
[Development of quality assurance/quality control web system in radiotherapy].
Okamoto, Hiroyuki; Mochizuki, Toshihiko; Yokoyama, Kazutoshi; Wakita, Akihisa; Nakamura, Satoshi; Ueki, Heihachi; Shiozawa, Keiko; Sasaki, Koji; Fuse, Masashi; Abe, Yoshihisa; Itami, Jun
2013-12-01
Our purpose is to develop a QA/QC (quality assurance/quality control) web system using HTML (HyperText Markup Language) and the server-side scripting language PHP (Hypertext Preprocessor), which can be useful as a tool for sharing information about QA/QC in radiotherapy. The system proposed in this study can easily be built in one's own institute, because HTML can be easily handled. There are two desired functions in a QA/QC web system: (i) to review the results of QA/QC for a radiotherapy machine, together with the manuals and reports necessary for routinely performing radiotherapy; by disclosing the results, transparency can be maintained; (ii) to present the institute's QA/QC protocol using pictures and movies for simplicity's sake, which can also serve as an educational tool for junior radiation technologists and medical physicists. By using this system, not only administrators but all staff involved in radiotherapy can obtain information about the condition and accuracy of treatment machines through the QA/QC web system.
FDR Soil Moisture Sensor for Environmental Testing and Evaluation
NASA Astrophysics Data System (ADS)
Linmao, Ye; longqin, Xue; guangzhou, Zhang; haibo, Chen; likuai, Shi; zhigang, Wu; gouhe, Yu; yanbin, Wang; sujun, Niu; Jin, Ye; Qi, Jin
To test the effect of environmental stresses on the adaptability of a capacitance-type (FDR) soil moisture sensor, a number of stresses were applied, including vibration and mechanical shock as well as temperature and humidity, using a CH-I constant-humidity chamber with variable temperature. A vibration platform was used to examine the durability and structural integrity of the sensor after vibrations simulating the processes of using, transporting, and handling the sensor. An impact test platform was used to test the durability and structural integrity of the sensor after repeated mechanical shocks. The CH-I constant-humidity chamber with high and low temperature was used to test the adaptability of the sensor to environments with high temperature, low temperature, and constant humidity. The effective range of the sensor's field lines was also tested. The tests show that the capacitance-type soil moisture sensor withstands heat, high humidity, and low temperature, passed the impact and vibration tests, and is an instrument with very strong environmental adaptability; the effective radius of the sensor's electric field is about 7 cm.
The REX-ISOLDE charge breeder as an operational machine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wenander, F.; Delahaye, P.; Scrivens, R.
2006-03-15
The charge breeding system of radioactive beam experiment at ISOLDE (REX-ISOLDE), consisting of a large Penning trap in combination with an electron-beam ion source (EBIS), is now a mature concept after having delivered radioactive beams for postacceleration to a number of experiments for three years. The system, preparing ions prior to injection into a compact linear accelerator, has shown to be versatile in terms of the ion species and energies that can be delivered. During the experimental periods 2004 and 2005 a significant part of the ISOLDE beam time was dedicated to REX-ISOLDE experiments. Ion masses in the range between A=7 and 153 have been handled with record efficiencies. High-intensity as well as very-short-lived isotope beams were proven to be feasible. Continuous injection into the EBIS has also been successfully tested. Two means of suppressing unwanted beam contaminations were tested and are now in use. In addition, the experience gained from the trap-EBIS concept from a machine operational point of view will be discussed and the limitations described.
Blando, James D; Schill, Donald P; De La Cruz, Mary Pauline; Zhang, Lin; Zhang, Junfeng
2010-09-01
Many states are considering, and some states have actively pursued, banning the use of perchloroethylene (PERC) in dry cleaning establishments. Proposed legislation has led many dry cleaners to consider the use of products that contain greater than 90% n-propyl bromide (n-PB; also called 1-bromopropane or 1-BP). Very little is known about toxicity of and exposure to n-PB. Some n-PB-containing products are marketed as nonhazardous and "green" or "organic." This has resulted in some users perceiving the solvent as nontoxic and has resulted in at least one significant poisoning incident in New Jersey. In addition, many dry cleaning operators may not realize that the machine components and settings must be changed when converting from PERC to n-PB-containing products. Not performing these modifications may result in overheating and significant leaks in the dry cleaning equipment. A preliminary investigation was conducted of the potential exposures to n-PB and isopropyl bromide (iso-PB; also called 2-bromopropane or 2-BP) among dry cleaners in New Jersey who have converted their machines from PERC to these new solvent products. Personal breathing-zone and area samples were collected using National Institute for Occupational Safety and Health Sampling and Analytical Method 1025, with a slight modification to the gas chromatography conditions to facilitate better separation of n-PB from iso-PB. During the preliminary investigation, exposures to n-PB exceeding the American Conference of Governmental Industrial Hygienists (ACGIH) threshold limit value (TLV) were measured among some workers in two of three shops. The highest exposure measured for a dry cleaning machine operator was 54 parts per million (ppm) as an 8-hr time-weighted average, which is more than 5 times the ACGIH TLV of 10 ppm.
The preliminary investigation also found that the work tasks most likely to result in the highest short-term exposures included the introduction of solvent to the machine, maintenance of the machine, unloading and handling of recently cleaned clothes, and interrupting the wash cycle of the machine. In addition, the assessment suggested that leaks may have contributed to exposure, and that these leaks may have resulted from normal machine wear over time, ineffective maintenance, and the incompatibility of n-PB with gasket materials.
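The 8-hr time-weighted average cited above is computed with the standard formula TWA = sum(C_i * t_i) / 8, where C_i is the concentration measured over t_i hours. A sketch with invented task-level concentrations (only the 10 ppm ACGIH TLV figure comes from the abstract):

```python
# Standard 8-hour time-weighted average: TWA = sum(C_i * t_i) / 8, where
# C_i is the concentration (ppm) measured over t_i hours.  The task
# concentrations and durations below are invented for illustration; only
# the 10 ppm ACGIH TLV figure comes from the abstract.
TLV_N_PB_PPM = 10.0

def twa_8hr(samples):
    """samples: list of (concentration_ppm, duration_hr) pairs."""
    return sum(c * t for c, t in samples) / 8.0

shift = [(120.0, 0.5),   # loading solvent into the machine
         (30.0, 2.0),    # unloading recently cleaned clothes
         (15.0, 5.5)]    # counter work away from the machine
exposure = twa_8hr(shift)          # (60 + 60 + 82.5) / 8 = 25.3 ppm
over_limit = exposure > TLV_N_PB_PPM
```

Note how a short, high-concentration task (solvent loading) dominates the average, which is why the task-level findings above matter for controlling the 8-hr TWA.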
Elicitation of neurological knowledge with argument-based machine learning.
Groznik, Vida; Guid, Matej; Sadikov, Aleksander; Možina, Martin; Georgiev, Dejan; Kragelj, Veronika; Ribarič, Samo; Pirtošek, Zvezdan; Bratko, Ivan
2013-02-01
The paper describes the use of expert's knowledge in practice and the efficiency of a recently developed technique called argument-based machine learning (ABML) in the knowledge elicitation process. We are developing a neurological decision support system to help the neurologists differentiate between three types of tremors: Parkinsonian, essential, and mixed tremor (comorbidity). The system is intended to act as a second opinion for the neurologists, and most importantly to help them reduce the number of patients in the "gray area" that require a very costly further examination (DaTSCAN). We strive to elicit comprehensible and medically meaningful knowledge in such a way that it does not come at the cost of diagnostic accuracy. To alleviate the difficult problem of knowledge elicitation from data and domain experts, we used ABML. ABML guides the expert to explain critical special cases which cannot be handled automatically by machine learning. This very efficiently reduces the expert's workload, and combines expert's knowledge with learning data. 122 patients were enrolled into the study. The classification accuracy of the final model was 91%. Equally important, the initial and the final models were also evaluated for their comprehensibility by the neurologists. All 13 rules of the final model were deemed as appropriate to be able to support its decisions with good explanations. The paper demonstrates ABML's advantage in combining machine learning and expert knowledge. The accuracy of the system is very high with respect to the current state-of-the-art in clinical practice, and the system's knowledge base is assessed to be very consistent from a medical point of view. This opens up the possibility to use the system also as a teaching tool. Copyright © 2012 Elsevier B.V. All rights reserved.
Turbulence modeling for Francis turbine water passages simulation
NASA Astrophysics Data System (ADS)
Maruzewski, P.; Hayashi, H.; Munch, C.; Yamaishi, K.; Hashii, T.; Mombelli, H. P.; Sugow, Y.; Avellan, F.
2010-08-01
Applications of Computational Fluid Dynamics (CFD) to hydraulic machines require the ability to handle turbulent flows and to account for the effects of turbulence on the mean flow. Nowadays, Direct Numerical Simulation (DNS) is still not a viable candidate for hydraulic machine simulations because of its prohibitive computational cost. Large Eddy Simulation (LES), although in the same cost category as DNS, could be an alternative, whereby only the small-scale turbulent fluctuations are modeled and the larger-scale fluctuations are computed directly. Nevertheless, Reynolds-Averaged Navier-Stokes (RANS) models have become the widespread standard basis for numerous hydraulic machine design procedures. However, for many applications involving wall-bounded flows and attached boundary layers, various hybrid combinations of LES and RANS are being considered, such as Detached Eddy Simulation (DES), whereby the RANS approximation is kept in the regions where the boundary layers are attached to the solid walls. Furthermore, the accuracy of CFD simulations is highly dependent on grid quality, in terms of grid uniformity in complex configurations. Moreover, any successful structured or unstructured CFD code has to offer a wide range of models, from classic RANS to complex hybrid models. The aim of this study is to compare the behavior of turbulent simulations on structured and unstructured grid topologies with two different CFD codes applied to the same Francis turbine. Hence, the study is intended to outline the discrepancies encountered in predicting the wake of the turbine blades using either the standard k-epsilon model or the SST shear stress transport model in a steady CFD simulation. Finally, comparisons are made with experimental data from reduced-scale model measurements at the EPFL Laboratory for Hydraulic Machines.
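The closure shared by the RANS models discussed above can be illustrated by the eddy-viscosity relation of the standard k-epsilon model, nu_t = C_mu k^2 / epsilon. A minimal sketch with illustrative values, not taken from the Francis turbine study:

```python
# Standard k-epsilon eddy-viscosity relation: nu_t = C_mu * k^2 / epsilon.
# C_mu = 0.09 is the usual model constant; the input values below are
# illustrative only.

C_MU = 0.09

def eddy_viscosity_k_epsilon(k: float, epsilon: float) -> float:
    """Turbulent (eddy) viscosity [m^2/s] from turbulent kinetic energy
    k [m^2/s^2] and dissipation rate epsilon [m^2/s^3]."""
    return C_MU * k * k / epsilon

# Example: k = 0.5 m^2/s^2, epsilon = 10 m^2/s^3
nu_t = eddy_viscosity_k_epsilon(0.5, 10.0)
print(nu_t)  # 0.09 * 0.25 / 10 = 0.00225
```

The SST model blends a comparable k-omega closure near walls with k-epsilon in the free stream, which is one reason the two models can predict blade wakes differently.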
2013-01-01
Background Machine learning techniques are becoming useful as an alternative approach to conventional medical diagnosis or prognosis, as they are good at handling noisy and incomplete data, and significant results can be attained despite a small sample size. Traditionally, clinicians make prognostic decisions based on clinicopathologic markers. However, it is not easy for even the most skilful clinician to produce an accurate prognosis using these markers alone. Thus, there is a need to use genomic markers to improve the accuracy of prognosis. The main aim of this research is to apply a hybrid of feature selection and machine learning methods to oral cancer prognosis based on the correlation of clinicopathologic and genomic markers. Results In the first stage of this research, five feature selection methods were proposed and evaluated on the oral cancer prognosis dataset. In the second stage, the models with the features selected by each feature selection method were tested on the proposed classifiers. Four types of classifiers were chosen: ANFIS, artificial neural network, support vector machine, and logistic regression. K-fold cross-validation was implemented for all classifiers because of the small sample size. The hybrid model of ReliefF-GA-ANFIS with the 3-input features of drink, invasion, and p63 achieved the best accuracy (accuracy = 93.81%; AUC = 0.90) for oral cancer prognosis. Conclusions The results revealed that the prognosis is superior with the presence of both clinicopathologic and genomic markers. The selected features can be investigated further to validate their potential as a significant prognostic signature in oral cancer studies. PMID:23725313
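The k-fold cross-validation step used for the small sample can be sketched in a few lines. The nearest-centroid classifier, the toy data, and k = 5 are illustrative assumptions; the study itself uses ReliefF/GA feature selection with ANFIS, ANN, SVM, and logistic regression:

```python
import numpy as np

# Minimal k-fold cross-validation sketch for a small-sample setting.
# Classifier and data are illustrative stand-ins, not the paper's models.

def kfold_indices(n, k, seed=0):
    """Shuffle sample indices and split them into k roughly equal folds."""
    rng = np.random.default_rng(seed)
    return np.array_split(rng.permutation(n), k)

def nearest_centroid_accuracy(X, y, k=5):
    """Cross-validated accuracy of a nearest-centroid classifier."""
    folds = kfold_indices(len(y), k)
    correct = 0
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        # Centroid of each class, computed on the training folds only
        centroids = {c: X[train][y[train] == c].mean(axis=0)
                     for c in np.unique(y[train])}
        for t in test:
            pred = min(centroids, key=lambda c: np.linalg.norm(X[t] - centroids[c]))
            correct += (pred == y[t])
    return correct / len(y)

# Toy example: two well-separated classes
X = np.vstack([np.zeros((20, 3)), np.ones((20, 3))])
y = np.array([0] * 20 + [1] * 20)
print(nearest_centroid_accuracy(X, y))  # 1.0 on this separable toy data
```

The point of the folds is that every sample is scored exactly once by a model that never saw it during training, which matters most when only ~122 samples are available.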
Sample handling for mass spectrometric proteomic investigations of human sera.
West-Nielsen, Mikkel; Høgdall, Estrid V; Marchiori, Elena; Høgdall, Claus K; Schou, Christian; Heegaard, Niels H H
2005-08-15
Proteomic investigations of sera are potentially of value for diagnosis, prognosis, choice of therapy, and disease activity assessment by virtue of discovering new biomarkers and biomarker patterns. Much debate focuses on the biological relevance and the need for identification of such biomarkers while less effort has been invested in devising standard procedures for sample preparation and storage in relation to model building based on complex sets of mass spectrometric (MS) data. Thus, development of standardized methods for collection and storage of patient samples together with standards for transportation and handling of samples are needed. This requires knowledge about how sample processing affects MS-based proteome analyses and thereby how nonbiological biased classification errors are avoided. In this study, we characterize the effects of sample handling, including clotting conditions, storage temperature, storage time, and freeze/thaw cycles, on MS-based proteomics of human serum by using principal components analysis, support vector machine learning, and clustering methods based on genetic algorithms as class modeling and prediction methods. Using spiking to artificially create differentiable sample groups, this integrated approach yields data that--even when working with sample groups that differ more than may be expected in biological studies--clearly demonstrate the need for comparable sampling conditions for samples used for modeling and for the samples that are going into the test set group. Also, the study emphasizes the difference between class prediction and class comparison studies as well as the advantages and disadvantages of different modeling methods.
Metabolic network prediction through pairwise rational kernels.
Roche-Lima, Abiel; Domaratzki, Michael; Fristensky, Brian
2014-09-26
Metabolic networks are represented by the set of metabolic pathways. Metabolic pathways are series of biochemical reactions in which the product (output) of one reaction serves as the substrate (input) to another. Many pathways remain incompletely characterized. One of the major challenges of computational biology is to obtain better models of metabolic pathways. Existing models depend on gene annotation, which propagates error accumulation when pathways are predicted from incorrectly annotated genes. Pairwise classification methods are supervised learning methods used to classify new pairs of entities. Some of these methods, e.g., pairwise Support Vector Machines (SVMs), use pairwise kernels. Pairwise kernels describe similarity measures between two pairs of entities. Using pairwise kernels to handle sequence data requires long processing times and large storage. Rational kernels are kernels based on weighted finite-state transducers that represent similarity measures between sequences or automata. They have been used effectively in problems that handle large amounts of sequence information, such as protein essentiality, natural language processing, and machine translation. We create a new family of pairwise kernels using weighted finite-state transducers, called Pairwise Rational Kernels (PRKs), to predict metabolic pathways from a variety of biological data. PRKs take advantage of the simpler representations and faster algorithms of transducers. Because raw sequence data can be used, the predictor model avoids the errors introduced by incorrect gene annotations. We then developed several experiments with PRKs and pairwise SVMs to validate our methods using the metabolic network of Saccharomyces cerevisiae. As a result, when PRKs are used, our method executes faster in comparison with other pairwise kernels.
Also, when we use PRKs combined with other simple kernels that include evolutionary information, the accuracy values improve while maintaining lower construction and execution times. The power of kernels is that almost any sort of data can be represented with them; therefore, completely disparate types of data can be combined to add power to kernel-based machine learning methods. When we compared our proposal using PRKs with other similar kernels, the execution times decreased with no compromise in accuracy. We also showed that by combining PRKs with other kernels that include evolutionary information, the accuracy can be improved further. As our proposal can use any type of sequence data, genes do not need to be properly annotated, avoiding error accumulation from incorrect prior annotations.
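For context, a conventional pairwise kernel of the kind pairwise SVMs use can be written as K((a,b),(c,d)) = k(a,c)k(b,d) + k(a,d)k(b,c) for a base kernel k, symmetric in the order of entities within each pair. The linear base kernel below is an illustrative stand-in; the paper's Pairwise Rational Kernels instead build k from weighted finite-state transducers, which is not reproduced here:

```python
# Classical tensor-product pairwise kernel:
#   K((a, b), (c, d)) = k(a, c) * k(b, d) + k(a, d) * k(b, c)
# The base kernel k below is a plain linear kernel on feature vectors,
# an illustrative assumption standing in for a rational kernel.

def linear_kernel(x, y):
    return sum(xi * yi for xi, yi in zip(x, y))

def pairwise_kernel(pair1, pair2, k=linear_kernel):
    (a, b), (c, d) = pair1, pair2
    return k(a, c) * k(b, d) + k(a, d) * k(b, c)

# Symmetry check: swapping the entities inside a pair leaves K unchanged.
p, q = ([1.0, 0.0], [0.0, 1.0]), ([1.0, 1.0], [2.0, 0.0])
print(pairwise_kernel(p, q))                                    # 2.0
print(pairwise_kernel(p, q) == pairwise_kernel((p[1], p[0]), q))  # True
```

The symmetrized sum is what makes the kernel treat (a, b) and (b, a) as the same pair, a property needed when pairs of entities (e.g. two enzymes in a pathway) have no natural order.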
Giacometti, Federica; Bonilauri, Paolo; Amatiste, Simonetta; Arrigoni, Norma; Bianchi, Manila; Losio, Marina Nadia; Bilei, Stefano; Cascone, Giuseppe; Comin, Damiano; Daminelli, Paolo; Decastelli, Lucia; Merialdi, Giuseppe; Mioni, Renzo; Peli, Angelo; Petruzzelli, Annalisa; Tonucci, Franco; Piva, Silvia; Serraino, Andrea
2015-09-01
A quantitative risk assessment (RA) model was developed to describe the risk of campylobacteriosis linked to consumption of raw milk sold in vending machines in Italy. Exposure assessment was based on the official microbiological records of raw milk samples from vending machines monitored by the regional Veterinary Authorities from 2008 to 2011, microbial growth during storage, destruction experiments, consumption frequency of raw milk, serving size, consumption preference and age of consumers. The differential risk considered milk handled under regulation conditions (4°C throughout all phases) and the worst time-temperature field handling conditions detected. Two separate RA models were developed, one for the consumption of boiled milk and the other for the consumption of raw milk, and two different dose-response (D-R) relationships were considered. The RA model predicted no human campylobacteriosis cases per year either in the best (4°C) storage conditions or in the case of thermal abuse in case of boiling raw milk, whereas in case of raw milk consumption the annual estimated campylobacteriosis cases depend on the dose-response relationships used in the model (D-R I or D-R II), the milk time-temperature storage conditions, consumer behaviour and age of consumers, namely young (with two cut-off values of ≤5 or ≤6 years old for the sensitive population) versus adult consumers. The annual estimated cases for young consumers using D-R II for the sensitive population (≤5 years old) ranged between 1013.7/100,000 population and 8110.3/100,000 population and for adult consumers using D-R I between 79.4/100,000 population and 333.1/100,000 population. Quantification of the risks associated with raw milk consumption is necessary from a public health perspective and the proposed RA model represents a useful and flexible tool to perform future RAs based on local consumer habits to support decision-making on safety policies. 
Further educational programmes for current and potential raw milk consumers are required to encourage boiling the milk in order to reduce the associated risk of illness. Copyright © 2015 Elsevier B.V. All rights reserved.
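The exposure/dose-response chain described above can be sketched with a small Monte Carlo loop. The exponential dose-response form P(ill) = 1 - exp(-r * dose) and all parameter values (r, concentration, serving size, variability model) are illustrative assumptions, not the D-R I / D-R II models fitted in the study:

```python
import math
import random

# Illustrative Monte Carlo sketch of a quantitative risk assessment chain:
# contamination -> dose per serving -> per-serving illness probability ->
# annual illness probability -> cases per 100,000 population.

def annual_cases_per_100k(conc_cfu_per_ml, serving_ml, servings_per_year,
                          r=1e-5, n_sims=10000, seed=1):
    rng = random.Random(seed)
    ill = 0
    for _ in range(n_sims):
        # Crude variability in contamination, truncated at zero (assumption)
        c = max(0.0, rng.gauss(conc_cfu_per_ml, conc_cfu_per_ml / 2))
        dose = c * serving_ml
        p_serving = 1 - math.exp(-r * dose)        # exponential dose-response
        p_year = 1 - (1 - p_serving) ** servings_per_year
        ill += rng.random() < p_year
    return 100000 * ill / n_sims

# Boiled milk (residual concentration ~0) yields zero predicted cases,
# matching the abstract's qualitative result for the boiling scenario:
print(annual_cases_per_100k(0.0, 200, 100))  # 0.0
```

The structure mirrors the abstract's finding: the boiling branch collapses the dose to zero regardless of storage conditions, while the raw-consumption branch is sensitive to the dose-response model and time-temperature history.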
Activated, coal-based carbon foam
Rogers, Darren Kenneth; Plucinski, Janusz Wladyslaw
2004-12-21
An ablation resistant, monolithic, activated, carbon foam produced by the activation of a coal-based carbon foam through the action of carbon dioxide, ozone or some similar oxidative agent that pits and/or partially oxidizes the carbon foam skeleton, thereby significantly increasing its overall surface area and concurrently increasing its filtering ability. Such activated carbon foams are suitable for application in virtually all areas where particulate or gel form activated carbon materials have been used. Such an activated carbon foam can be fabricated, i.e. sawed, machined and otherwise shaped to fit virtually any required filtering location by simple insertion and without the need for handling the "dirty" and friable particulate activated carbon foam materials of the prior art.
Mathematical model of bone drilling for virtual surgery system
NASA Astrophysics Data System (ADS)
Alaytsev, Innokentiy K.; Danilova, Tatyana V.; Manturov, Alexey O.; Mareev, Gleb O.; Mareev, Oleg V.
2018-04-01
Bone drilling is an essential part of surgeries in ENT and dentistry. Proper training of drilling-machine handling skills is impossible without proper modelling of the drilling process. The use of high-precision methods such as FEM is limited by the 1000 Hz update rate required for haptic feedback. The study presents a mathematical model of the drilling process that accounts for the material properties, the geometry, and the rotation rate of a burr to compute the removed material volume. The simplicity of the model allows it to be integrated into the high-frequency haptic thread. The precision of the model is sufficient for a virtual surgery system targeted at training basic surgical skills.
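A haptic-rate material-removal update of the general kind described above can be sketched as follows. The linear form, the constant k_c, and the hardness values are illustrative assumptions, not the paper's actual model:

```python
# Illustrative sketch of a per-step material-removal law for burr drilling:
# removed volume grows with applied force and burr rotation rate and shrinks
# with material hardness. Constants and the linear form are assumptions.

DT = 0.001  # 1000 Hz haptic update rate, as required for force feedback

def removed_volume_step(force_n, omega_rad_s, hardness, k_c=1e-12):
    """Volume removed [m^3] during one haptic time step."""
    return k_c * force_n * omega_rad_s * DT / hardness

def drill(force_n, omega_rad_s, hardness, steps):
    """Total volume removed over a number of haptic steps."""
    return sum(removed_volume_step(force_n, omega_rad_s, hardness)
               for _ in range(steps))

# Softer bone (lower hardness) is removed faster at equal force and speed:
soft = drill(2.0, 3000.0, hardness=0.5, steps=1000)
hard = drill(2.0, 3000.0, hardness=2.0, steps=1000)
print(soft > hard)  # True
```

Because each step is a handful of arithmetic operations, such a law fits comfortably inside the 1 ms budget of the haptic thread, which is exactly why FEM-grade models are avoided there.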
NASA Technical Reports Server (NTRS)
Parrish, R. V.; Mckissick, B. T.; Steinmetz, G. G.
1979-01-01
A recent modification of the methodology of profile analysis, which allows testing for differences between two functions as a whole with a single test rather than point by point with multiple tests, is discussed. The modification is applied to the examination of motion/no-motion conditions as shown by the lateral deviation curve as a function of engine-cut speed of a piloted 737-100 simulator. The results of this application are presented along with those of more conventional statistical test procedures on the same simulator data.
Tablet—next generation sequence assembly visualization
Milne, Iain; Bayer, Micha; Cardle, Linda; Shaw, Paul; Stephen, Gordon; Wright, Frank; Marshall, David
2010-01-01
Summary: Tablet is a lightweight, high-performance graphical viewer for next-generation sequence assemblies and alignments. Supporting a range of input assembly formats, Tablet provides high-quality visualizations showing data in packed or stacked views, allowing instant access and navigation to any region of interest, and whole contig overviews and data summaries. Tablet is both multi-core aware and memory efficient, allowing it to handle assemblies containing millions of reads, even on a 32-bit desktop machine. Availability: Tablet is freely available for Microsoft Windows, Apple Mac OS X, Linux and Solaris. Fully bundled installers can be downloaded from http://bioinf.scri.ac.uk/tablet in 32- and 64-bit versions. Contact: tablet@scri.ac.uk PMID:19965881
Robust analysis of trends in noisy tokamak confinement data using geodesic least squares regression
DOE Office of Scientific and Technical Information (OSTI.GOV)
Verdoolaege, G., E-mail: geert.verdoolaege@ugent.be; Laboratory for Plasma Physics, Royal Military Academy, B-1000 Brussels; Shabbir, A.
Regression analysis is a very common activity in fusion science for unveiling trends and parametric dependencies, but it can be a difficult matter. We have recently developed the method of geodesic least squares (GLS) regression, which is able to handle errors in all variables, is robust against data outliers and uncertainty in the regression model, and can be used with arbitrary distribution models and regression functions. Here we report first results of applying GLS to the estimation of the multi-machine scaling law for the energy confinement time in tokamaks, demonstrating improved consistency of the GLS results compared to standard least squares.
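The standard baseline that GLS is compared against can be sketched directly: a power-law scaling law becomes linear in log space, so tau = C * I^a * B^b is fitted as log tau = log C + a log I + b log B by ordinary least squares. The synthetic data below are illustrative, not tokamak measurements, and this is the conventional method, not GLS itself:

```python
import numpy as np

# Log-linear least-squares fit of a power-law scaling law, the standard
# approach for multi-machine confinement scalings. Data are synthetic.

rng = np.random.default_rng(0)
n = 200
I = rng.uniform(0.5, 5.0, n)    # plasma-current-like predictor (illustrative)
B = rng.uniform(1.0, 8.0, n)    # magnetic-field-like predictor (illustrative)
tau = 0.05 * I**0.9 * B**0.2 * np.exp(rng.normal(0, 0.01, n))  # noisy power law

# Design matrix: intercept, log I, log B
A = np.column_stack([np.ones(n), np.log(I), np.log(B)])
coef, *_ = np.linalg.lstsq(A, np.log(tau), rcond=None)
logC, a, b = coef
print(round(a, 1), round(b, 1))  # recovers the exponents 0.9 and 0.2
```

Ordinary least squares of this kind assumes errors only in the response and is sensitive to outliers, which is precisely the limitation GLS addresses by handling errors in all variables.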
Automation effects in a stereotypical multiloop manual control system. [for aircraft
NASA Technical Reports Server (NTRS)
Hess, R. A.; Mcnally, B. D.
1984-01-01
The increasing reliance of state-of-the-art, high-performance aircraft on high-authority stability and command augmentation systems, in order to obtain satisfactory performance and handling qualities, has made it critical to achieve a better understanding of human capabilities, limitations, and preferences during interactions with complex dynamic systems that involve task allocation between man and machine. An analytical and experimental study has been undertaken to investigate human interaction with a simple, multiloop dynamic system in which human activity was systematically varied by changing the levels of automation. Task definition has led to a control loop structure which parallels that of any multiloop manual control system, and may therefore be considered a stereotype.
Some physical properties of ginkgo nuts and kernels
NASA Astrophysics Data System (ADS)
Ch'ng, P. E.; Abdullah, M. H. R. O.; Mathai, E. J.; Yunus, N. A.
2013-12-01
Some data on the physical properties of ginkgo nuts at a moisture content of 45.53% (±2.07) (wet basis) and of their kernels at 60.13% (±2.00) (wet basis) are presented in this paper. They consist of estimates of the mean length, width, thickness, geometric mean diameter, sphericity, aspect ratio, unit mass, surface area, volume, true density, bulk density, and porosity. The coefficient of static friction for nuts and kernels was determined using plywood, glass, rubber, and galvanized steel sheet. These data are essential in food engineering, especially for the design and development of machines and equipment for processing and handling agricultural products.
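The derived properties listed above follow standard post-harvest engineering formulas: geometric mean diameter Dg = (LWT)^(1/3), sphericity Dg/L, aspect ratio W/L, and porosity (1 - bulk density / true density) x 100. The example dimensions and densities below are illustrative, not the measured ginkgo values:

```python
# Standard geometric and density formulas used in post-harvest engineering.
# All example inputs are illustrative.

def geometric_mean_diameter(L, W, T):
    """Dg = (L * W * T)^(1/3), all axes in the same unit (e.g. mm)."""
    return (L * W * T) ** (1 / 3)

def sphericity(L, W, T):
    """Dimensionless ratio Dg / L; 1.0 means a perfect sphere."""
    return geometric_mean_diameter(L, W, T) / L

def aspect_ratio(L, W):
    """Width over length, dimensionless."""
    return W / L

def porosity(bulk_density, true_density):
    """Void fraction of the bulk, in percent."""
    return (1 - bulk_density / true_density) * 100

# A nut 20 mm long, 15 mm wide, 10 mm thick:
print(round(geometric_mean_diameter(20.0, 15.0, 10.0), 2))  # 14.42 mm
print(round(sphericity(20.0, 15.0, 10.0), 3))               # 0.721
print(porosity(550.0, 1100.0))                              # 50.0 (percent)
```

These quantities feed directly into hopper, sieve, and conveyor design, which is why such property tables are reported for new crops.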
NASA Astrophysics Data System (ADS)
Hirayama, Hideo; Kondo, Kenjiro; Suzuki, Seishiro; Hamamoto, Shimpei; Iwanaga, Kohei
2017-09-01
Pulse height distributions were measured using a LaBr3 detector set in a 1 cm lead collimator to investigate the main radiation sources at the operation floor of Fukushima Daiichi Nuclear Power Station Unit 4. It was confirmed that the main radiation source above the reactor well was Co-60 from the activated steam dryer in the DS pool (Dryer-Separator pool), and that at the standby area it was Cs-134 and Cs-137 from contaminated buildings and debris on the lower floor. The full-energy-peak count rate of Co-60 was reduced to about 1/3 by a 12 mm lead sheet placed on the floor of the fuel handling machine.
Running functional sport vest and short for e-textile applications
NASA Astrophysics Data System (ADS)
Baskan, H.; Acikgoz, H.; Atakan, R.; Eryuruk, H.; Akalın, N.; Kose, H.; Li, Y.; Kursun Bahadir, S.; Kalaoglu, F.
2017-10-01
Functional properties of sports garments have become as crucial as comfort properties, since they improve wearer performance. For this reason, a sport vest and shorts with high elastic recovery and a fall-detection sensor were designed and produced using a flat-bed knitting machine. Comfort properties of the shorts and vest were evaluated with several test instruments: tensile strength of the elastomeric yarn, air permeability, moisture management, drape, and objective handle (FAST tests) of the garments were measured. The samples showed good comfort properties as functional sports garments. It was also verified that the fall-detection sensor works efficiently with a mobile phone application.
NASA Technical Reports Server (NTRS)
Smith, T. B., III; Lala, J. H.
1984-01-01
The FTMP architecture is a high-reliability computer concept modeled after a homogeneous multiprocessor architecture. Elements of the FTMP operate in tight synchronism with one another, and hardware fault detection and fault masking are provided transparently to the software. Operating system design and user software design are thus greatly simplified. Performance of the FTMP is also comparable to that of a simplex equivalent, owing to the efficiency of the fault-handling hardware. The FTMP project constructed an engineering module of the FTMP, programmed the machine, and extensively tested the architecture through fault injection and other stress testing. This testing confirmed the soundness of the FTMP concepts.
NASA Technical Reports Server (NTRS)
1980-01-01
The U.S./Canada wheat/barley exploratory experiment is discussed with emphasis on labeling, machine processing using P1A, and the crop calendar. Classification and the simulated aggregation test used in the U.S. corn/soybean exploratory experiment are also considered. Topics covered regarding the foreign commodity production forecasting project include the acquisition, handling, and processing of both U.S. and foreign agricultural data, as well as meteorological data. The accuracy assessment methodology, multicrop sampling and aggregation technology development, frame development, the yield project interface, and classification for area estimation are also examined.
An interactive editor for definition of touch-sensitive zones for a graphic display
NASA Technical Reports Server (NTRS)
Monroe, Burt L., III; Jones, Denise R.
1987-01-01
In the continuing effort to develop more efficient man-machine communications methods, touch displays have shown potential as straightforward input systems. The development of software necessary to handle such systems, however, can become tedious. In order to reduce the need for redundant programming, a touch editor has been developed which allows a programmer to interactively define touch-sensitive areas for a graphic display. The information produced during the editing process is written to a data file, which can be accessed easily when needed by an application program. This paper outlines the structure, logic, and use of the editor, as well as the hardware with which it is presently compatible.
INFIBRA: machine vision inspection of acrylic fiber production
NASA Astrophysics Data System (ADS)
Davies, Roger; Correia, Bento A. B.; Contreiras, Jose; Carvalho, Fernando D.
1998-10-01
This paper describes the implementation of INFIBRA, a machine vision system for the inspection of acrylic fiber production lines. The system was developed by INETI under a contract from Fisipe, Fibras Sinteticas de Portugal, S.A. At Fisipe there are ten production lines in continuous operation, each approximately 40 m in length. A team of operators used to perform periodic manual visual inspection of each line in conditions of high ambient temperature and humidity. It is not surprising that failures in the manual inspection process occurred with some frequency, with consequences that ranged from reduced fiber quality to production stoppages. The INFIBRA system architecture is a specialization of a generic, modular machine vision architecture based on a network of Personal Computers (PCs), each equipped with a low cost frame grabber. Each production line has a dedicated PC that performs automatic inspection, using specially designed metrology algorithms, via four video cameras located at key positions on the line. The cameras are mounted inside custom-built, hermetically sealed water-cooled housings to protect them from the unfriendly environment. The ten PCs, one for each production line, communicate with a central PC via a standard Ethernet connection. The operator controls all aspects of the inspection process, from configuration through to handling alarms, via a simple graphical interface on the central PC. At any time the operator can also view on the central PC's screen the live image from any one of the 40 cameras employed by the system.
NASA Astrophysics Data System (ADS)
Cauchi, Marija; Assmann, R. W.; Bertarelli, A.; Carra, F.; Lari, L.; Rossi, A.; Mollicone, P.; Sammut, N.
2015-02-01
The correct functioning of a collimation system is crucial to safely and successfully operate high-energy particle accelerators, such as the Large Hadron Collider (LHC). However, the requirements to handle high-intensity beams can be demanding, and accident scenarios must be well studied in order to assess if the collimator design is robust against possible error scenarios. One of the catastrophic, though not very probable, accident scenarios identified within the LHC is an asynchronous beam dump. In this case, one (or more) of the 15 precharged kicker circuits fires out of time with the abort gap, spraying beam pulses onto LHC machine elements before the machine protection system can fire the remaining kicker circuits and bring the beam to the dump. If a proton bunch directly hits a collimator during such an event, severe beam-induced damage such as magnet quenches and other equipment damage might result, with consequent downtime for the machine. This study investigates a number of newly defined jaw error cases, which include angular misalignment errors of the collimator jaw. A numerical finite element method approach is presented in order to precisely evaluate the thermomechanical response of tertiary collimators to beam impact. We identify the most critical and interesting cases, and show that a tilt of the jaw can actually mitigate the effect of an asynchronous dump on the collimators. Relevant collimator damage limits are taken into account, with the aim to identify optimal operational conditions for the LHC.
Choi, Sangjun; Kang, Dongmug; Park, Donguk; Lee, Hyunhee; Choi, Bongkyoo
2017-03-01
The goal of this study is to develop a general population job-exposure matrix (GPJEM) on asbestos to estimate occupational asbestos exposure levels in the Republic of Korea. Three Korean domestic quantitative exposure datasets collected from 1984 to 2008 were used to build the GPJEM. Exposure groups in the collected data were reclassified based on the current Korean Standard Industrial Classification (9th edition) and the Korean Standard Classification of Occupations code (6th edition), which is in accordance with international standards. All exposure levels were expressed as weighted arithmetic mean (WAM) and minimum and maximum concentrations. Based on the established GPJEM, the 112 exposure groups could be reclassified into 86 industries and 74 occupations. In the 1980s, the highest exposure levels were estimated in "knitting and weaving machine operators", with a WAM concentration of 7.48 fibers/mL (f/mL); in the 1990s, in "plastic products production machine operators", with 5.12 f/mL; and in the 2000s, in "detergents production machine operators" handling talc containing asbestos, with 2.45 f/mL. Of the 112 exposure groups, 44 had WAM concentrations higher than the Korean occupational exposure limit of 0.1 f/mL. The newly constructed GPJEM, generated from actual domestic quantitative exposure data, could be useful in evaluating historical exposure levels to asbestos and could contribute to improved prediction of asbestos-related diseases among Koreans.
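The weighted arithmetic mean used to summarize each exposure group pools survey means weighted by their sample counts: WAM = sum(n_i * x_i) / sum(n_i). A minimal sketch with illustrative numbers, not the study's actual data:

```python
# Weighted arithmetic mean (WAM) across pooled measurement sets.
# Sample counts and concentrations below are illustrative.

def weighted_arithmetic_mean(values, weights):
    """WAM = sum(v_i * w_i) / sum(w_i); raises on empty or mismatched input."""
    if len(values) != len(weights) or not values:
        raise ValueError("values and weights must be non-empty and equal length")
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# Three surveys of one exposure group: mean concentration (f/mL) and
# number of samples per survey
means = [7.0, 8.0, 6.0]
n_samples = [10, 30, 10]
print(weighted_arithmetic_mean(means, n_samples))  # (70 + 240 + 60) / 50 = 7.4
```

Weighting by sample count keeps a large survey from being diluted by small ones, which matters when historical datasets of very different sizes are merged into one matrix cell.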
Status of the Future Circular Collider Study
NASA Astrophysics Data System (ADS)
Benedikt, Michael
2016-03-01
Following the 2013 update of the European Strategy for Particle Physics, the international Future Circular Collider (FCC) Study has been launched by CERN as host institute, to design an energy frontier hadron collider (FCC-hh) in a new 80-100 km tunnel with a centre-of-mass energy of about 100 TeV, an order of magnitude beyond the LHC's, as a long-term goal. The FCC study also includes the design of a 90-350 GeV high-luminosity lepton collider (FCC-ee) installed in the same tunnel, serving as Higgs, top and Z factory, as a potential intermediate step, as well as an electron-proton collider option (FCC-he). The physics cases for such machines will be assessed and concepts for experiments will be developed in time for the next update of the European Strategy for Particle Physics by the end of 2018. The presentation will summarize the status of machine designs and parameters and discuss the essential technical components to be developed in the frame of the FCC study. Key elements are superconducting accelerator-dipole magnets with a field of 16 T for the hadron collider and high-power, high-efficiency RF systems for the lepton collider. In addition the unprecedented beam power presents special challenges for the hadron collider for all aspects of beam handling and machine protection. First conclusions of geological investigations and implementation studies will be presented. The status of the FCC collaboration and the further planning for the study will be outlined.
NASA Astrophysics Data System (ADS)
Kumar, R.; Samaniego, L. E.; Livneh, B.
2013-12-01
Knowledge of soil hydraulic properties such as porosity and saturated hydraulic conductivity is required to accurately model the dynamics of near-surface hydrological processes (e.g. evapotranspiration and root-zone soil moisture dynamics) and provide reliable estimates of regional water and energy budgets. Soil hydraulic properties are commonly derived from pedo-transfer functions using soil textural information recorded during surveys, such as the fractions of sand and clay, bulk density, and organic matter content. Typically large scale land-surface models are parameterized using a relatively coarse soil map with little or no information on parametric sub-grid variability. In this study we analyze the impact of sub-grid soil variability on simulated hydrological fluxes over the Mississippi River Basin (≈3,240,000 km2) at multiple spatio-temporal resolutions. A set of numerical experiments were conducted with the distributed mesoscale hydrologic model (mHM) using two soil datasets: (a) the Digital General Soil Map of the United States or STATSGO2 (1:250 000) and (b) the recently collated Harmonized World Soil Database based on the FAO-UNESCO Soil Map of the World (1:5 000 000). mHM was parameterized with the multi-scale regionalization technique that derives distributed soil hydraulic properties via pedo-transfer functions and regional coefficients. Within the experimental framework, the 3-hourly model simulations were conducted at four spatial resolutions ranging from 0.125° to 1°, using meteorological datasets from the NLDAS-2 project for the time period 1980-2012. Preliminary results indicate that the model was able to capture observed streamflow behavior reasonably well with both soil datasets, in the major sub-basins (i.e. the Missouri, the Upper Mississippi, the Ohio, the Red, and the Arkansas). However, the spatio-temporal patterns of simulated water fluxes and states (e.g. 
soil moisture, evapotranspiration) from the two simulations showed marked differences, particularly at shorter time scales (hours to days) in regions with coarse-textured sandy soils. Furthermore, the partitioning of total runoff into near-surface interflow and baseflow components was also significantly different between the two simulations. Simulations with the coarser soil map produced comparatively higher baseflows. At longer time scales (months to seasons), where climatic factors play a major role, the integrated fluxes and states from both sets of model simulations match fairly closely, despite the apparent discrepancy in the partitioning of total runoff.
Automatic assembly of micro-optical components
NASA Astrophysics Data System (ADS)
Gengenbach, Ulrich K.
1996-12-01
Automatic assembly becomes an important issue as hybrid microsystems enter industrial fabrication. Moving from laboratory-scale production with manual assembly and bonding processes to automatic assembly requires a thorough re-evaluation of the design, the characteristics of the individual components, and the processes involved. Parts supply for automatic operation and sensitive, intelligent grippers adapted to the size, surface, and material properties of the microcomponents gain importance when the superior sensory and handling skills of a human are to be replaced by a machine. This holds in particular for the automatic assembly of micro-optical components. The paper illustrates these issues with the automatic assembly of a micro-optical duplexer consisting of a micro-optical bench fabricated by the LIGA technique, two spherical lenses, a wavelength filter, and an optical fiber. The spherical lenses, wavelength filter, and optical fiber are supplied by third-party vendors, which raises the question of parts supply for automatic assembly. The bonding processes for these components include press fit and adhesive bonding. The prototype assembly system with all relevant components, e.g. handling system, parts supply, grippers, and control, is described. Results of the first automatic assembly tests are presented.
Biotechnology for Solar System Exploration
NASA Astrophysics Data System (ADS)
Steele, A.; Maule, J.; Toporski, J.; Parro-Garcia, V.; Briones, C.; Schweitzer, M.; McKay, D.
With the advent of a new era of astrobiology missions in the exploration of the solar system and the search for evidence of life elsewhere, we present a new approach to this goal: the integration of biotechnology. We have reviewed the current list of biotechnology techniques that are amenable to miniaturization, automation, and integration into a combined flight platform. Amongst the techniques reviewed are: the use of antibodies; fluorescent detection strategies; protein and DNA chip technology; surface plasmon resonance and its relation to other techniques; micro-electro-mechanical systems (MEMS, where applicable to biological systems); nanotechnology (e.g., molecular motors); lab-on-a-chip technology (including PCR); mass spectrometry (e.g., MALDI-TOF); fluid handling and extraction technologies; chemical force microscopy (CFM); and Raman spectroscopy. We have begun to integrate this knowledge into a single flight instrument approach for the sole purpose of combining several mutually confirming tests for life, organic and/or microbial contamination, as well as prebiotic and abiotic organic chemicals. We will present several innovative designs for new instrumentation, including pro-engineering design drawings of a protein chip reader for space flight and fluid handling strategies. We will also review the use of suitable extraction methodologies for use on different solar system bodies.
Multicategory Composite Least Squares Classifiers
Park, Seo Young; Liu, Yufeng; Liu, Dacheng; Scholl, Paul
2010-01-01
Classification is a very useful statistical tool for information extraction. In particular, multicategory classification is commonly seen in various applications. Although binary classification problems are heavily studied, extensions to the multicategory case have received much less attention. In view of the increased complexity and volume of modern statistical problems, it is desirable to have multicategory classifiers that are able to handle problems with high dimensions and a large number of classes. Moreover, it is necessary for multicategory classifiers to have sound theoretical properties. In the literature, there exist several different versions of simultaneous multicategory Support Vector Machines (SVMs). However, the computation of the SVM can be difficult for large-scale problems, especially for problems with a large number of classes. Furthermore, the SVM cannot produce class probability estimates directly. In this article, we propose a novel, efficient multicategory composite least squares classifier (CLS classifier), which utilizes a new composite squared loss function. The proposed CLS classifier has several important merits: efficient computation for problems with a large number of classes, asymptotic consistency, the ability to handle high-dimensional data, and simple conditional class probability estimation. Our simulated and real examples demonstrate the competitive performance of the proposed approach. PMID:21218128
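The abstract's core idea, replacing the SVM's loss with a squared loss so that multicategory training reduces to solving linear systems, can be illustrated with a minimal one-vs-rest least-squares classifier. This is a hedged sketch of the general approach, not the paper's composite-loss CLS formulation:

```python
import numpy as np

def fit_ls_ovr(X, y, n_classes, ridge=1e-6):
    """One-vs-rest least squares: one linear scorer per class,
    each obtained by solving a small ridge-regularized linear system."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])        # append bias column
    A = Xb.T @ Xb + ridge * np.eye(Xb.shape[1])
    W = np.zeros((Xb.shape[1], n_classes))
    for k in range(n_classes):
        t = np.where(y == k, 1.0, -1.0)                  # +1/-1 targets for class k
        W[:, k] = np.linalg.solve(A, Xb.T @ t)
    return W

def predict_ls_ovr(X, W):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return np.argmax(Xb @ W, axis=1)                     # highest score wins

# toy 3-class problem: three well-separated Gaussian blobs
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(c, 0.3, size=(50, 2)) for c in ([0, 0], [4, 0], [0, 4])])
y = np.repeat([0, 1, 2], 50)
W = fit_ls_ovr(X, y, 3)
acc = np.mean(predict_ls_ovr(X, W) == y)
print(acc)  # near 1.0 on this separable toy data
```

Unlike an SVM, each class's weights come from a closed-form linear solve, which is why squared-loss formulations scale gracefully with the number of classes.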
NASA Astrophysics Data System (ADS)
Hassan, A. H.; Fluke, C. J.; Barnes, D. G.
2012-09-01
Upcoming and future astronomy research facilities will systematically generate terabyte-sized data sets, moving astronomy into the petascale data era. While such facilities will provide astronomers with unprecedented levels of accuracy and coverage, the increases in dataset size and dimensionality will pose serious computational challenges for many current astronomy data analysis and visualization tools. With such data sizes, even simple data analysis tasks (e.g., calculating a histogram or computing the data minimum/maximum) may not be achievable without access to a supercomputing facility. To effectively handle such dataset sizes, which exceed today's single-machine memory and processing limits, we present a framework that exploits the distributed power of GPUs and many-core CPUs, with the goal of providing data analysis and visualization tasks as a service for astronomers. By mixing shared and distributed memory architectures, our framework effectively utilizes the underlying hardware infrastructure, handling both batched and real-time data analysis and visualization tasks. Offering such functionality as a service in a “software as a service” manner will reduce the total cost of ownership, provide an easy-to-use tool to the wider astronomical community, and enable a more optimized utilization of the underlying hardware infrastructure.
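The point that even histograms and min/max become hard at petascale is about single-pass, distributed reductions: each chunk is reduced locally and only tiny partial results are merged. A minimal single-machine sketch of that pattern, with chunks standing in for data spread across GPUs or nodes:

```python
import numpy as np

def chunked_stats(chunks, bins=16, lo=0.0, hi=1.0):
    """Single pass over data chunks: running min/max plus a fixed-bin
    histogram. Each chunk is reduced independently, so chunks could live
    on different machines; only the small partial results are combined."""
    gmin, gmax = np.inf, -np.inf
    hist = np.zeros(bins, dtype=np.int64)
    edges = np.linspace(lo, hi, bins + 1)
    for chunk in chunks:
        gmin = min(gmin, float(chunk.min()))
        gmax = max(gmax, float(chunk.max()))
        h, _ = np.histogram(chunk, bins=edges)
        hist += h                       # merging partials is just addition
    return gmin, gmax, hist

rng = np.random.default_rng(1)
data = rng.random(100_000)
# simulate a dataset too large for one machine by streaming it in chunks
chunks = np.array_split(data, 10)
gmin, gmax, hist = chunked_stats(chunks)
print(hist.sum())  # 100000: every value landed in exactly one bin
```

Because the per-chunk reductions commute, the same code maps directly onto a distributed framework where the merge step runs on a head node.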
Automating quantum experiment control
NASA Astrophysics Data System (ADS)
Stevens, Kelly E.; Amini, Jason M.; Doret, S. Charles; Mohler, Greg; Volin, Curtis; Harter, Alexa W.
2017-03-01
The field of quantum information processing is rapidly advancing. As the control of quantum systems approaches the level needed for useful computation, the physical hardware underlying the quantum systems is becoming increasingly complex. It is already becoming impractical to manually code control for the larger hardware implementations. In this chapter, we will employ an approach to the problem of system control that parallels compiler design for a classical computer. We will start with a candidate quantum computing technology, the surface electrode ion trap, and build a system instruction language which can be generated from a simple machine-independent programming language via compilation. We incorporate compile time generation of ion routing that separates the algorithm description from the physical geometry of the hardware. Extending this approach to automatic routing at run time allows for automated initialization of qubit number and placement and additionally allows for automated recovery after catastrophic events such as qubit loss. To show that these systems can handle real hardware, we present a simple demonstration system that routes two ions around a multi-zone ion trap and handles ion loss and ion placement. While we will mainly use examples from transport-based ion trap quantum computing, many of the issues and solutions are applicable to other architectures.
NASA Technical Reports Server (NTRS)
Russo, Vincent; Johnston, Gary; Campbell, Roy
1988-01-01
The programming of the interrupt handling mechanisms, process switching primitives, scheduling mechanism, and synchronization primitives of an operating system for a multiprocessor requires both efficient code, in order to support the needs of high-performance or real-time applications, and careful organization, to facilitate maintenance. Although many advantages have been claimed for object-oriented class-hierarchical languages and their corresponding design methodologies, the application of these techniques to the design of the primitives within an operating system has not been widely demonstrated. To investigate the role of class-hierarchical design in systems programming, the authors have constructed the Choices multiprocessor operating system architecture using the C++ programming language. During the implementation, it was found that many operating system design concerns can be represented advantageously using a class-hierarchical approach, including: the separation of mechanism and policy; the organization of an operating system into layers, each of which represents an abstract machine; and the notions of process and exception management. In this paper, we discuss an implementation of the low-level primitives of this system and outline the strategy by which we developed our solution.
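The separation of mechanism and policy mentioned above is easy to sketch with a class hierarchy. The toy scheduler below (in Python rather than Choices' C++, with invented class names) keeps the queue-management mechanism in one class and delegates the selection rule to pluggable policy subclasses:

```python
from abc import ABC, abstractmethod

class SchedulingPolicy(ABC):
    """Policy: decides *which* ready process runs next."""
    @abstractmethod
    def pick(self, ready):  # ready: list of (pid, priority) tuples
        ...

class FIFOPolicy(SchedulingPolicy):
    def pick(self, ready):
        return ready[0]                          # oldest entry first

class PriorityPolicy(SchedulingPolicy):
    def pick(self, ready):
        return max(ready, key=lambda p: p[1])    # highest priority first

class Scheduler:
    """Mechanism: maintains the ready queue and performs the dispatch.
    It never embeds a selection rule itself; that lives in the policy."""
    def __init__(self, policy: SchedulingPolicy):
        self.policy = policy
        self.ready = []

    def add(self, pid, priority=0):
        self.ready.append((pid, priority))

    def dispatch(self):
        chosen = self.policy.pick(self.ready)
        self.ready.remove(chosen)
        return chosen[0]

s = Scheduler(PriorityPolicy())
for pid, prio in [("idle", 0), ("editor", 2), ("daemon", 1)]:
    s.add(pid, prio)
print(s.dispatch())  # "editor": highest priority runs first
```

Swapping `PriorityPolicy` for `FIFOPolicy` changes scheduling behavior without touching the dispatch mechanism, which is the maintainability argument the abstract makes for class-hierarchical OS design.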
Robust Machine Learning-Based Correction on Automatic Segmentation of the Cerebellum and Brainstem.
Wang, Jun Yi; Ngo, Michael M; Hessl, David; Hagerman, Randi J; Rivera, Susan M
2016-01-01
Automated segmentation is a useful method for studying large brain structures such as the cerebellum and brainstem. However, automated segmentation may lead to inaccuracies and/or undesirable boundaries. The goal of the present study was to investigate whether SegAdapter, a machine learning-based method, is useful for automatically correcting large segmentation errors and disagreements in anatomical definition. We further assessed the robustness of the method with respect to training set size, differences in head coil usage, and the amount of brain atrophy. High-resolution T1-weighted images were acquired from 30 healthy controls scanned with either an 8-channel or 32-channel head coil. Ten patients, who suffered from brain atrophy because of fragile X-associated tremor/ataxia syndrome, were scanned using the 32-channel head coil. The initial segmentations of the cerebellum and brainstem were generated automatically using FreeSurfer. Subsequently, FreeSurfer's segmentations were both manually corrected to serve as the gold standard and automatically corrected by SegAdapter. Using only 5 scans in the training set, spatial overlap with manual segmentation in Dice coefficient improved significantly from 0.956 (for FreeSurfer segmentation) to 0.978 (for SegAdapter-corrected segmentation) for the cerebellum and from 0.821 to 0.954 for the brainstem. Reducing the training set size to 2 scans decreased the Dice coefficient by only ≤0.002 for the cerebellum and ≤0.005 for the brainstem compared to a training set size of 5 scans in corrective learning. The method was also robust in handling differences between the training set and the test set in head coil usage and the amount of brain atrophy, which reduced spatial overlap by only <0.01.
These results suggest that the combination of automated segmentation and corrective learning provides a valuable method for accurate and efficient segmentation of the cerebellum and brainstem, particularly in large-scale neuroimaging studies, and potentially for segmenting other neural regions as well. PMID:27213683
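The Dice coefficient used throughout these results measures spatial overlap between two binary segmentations: twice the intersection over the sum of the two volumes. A minimal sketch:

```python
import numpy as np

def dice(a, b):
    """Dice overlap between two binary masks: 2|A∩B| / (|A| + |B|).
    Returns 1.0 for two empty masks by convention."""
    a = a.astype(bool)
    b = b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# toy 2D "segmentations": the automatic mask misses one row of the manual one
manual = np.zeros((10, 10), dtype=bool); manual[2:8, 2:8] = True  # 36 voxels
auto   = np.zeros((10, 10), dtype=bool); auto[3:8, 2:8]   = True  # 30 voxels
print(round(dice(manual, auto), 3))  # 0.909
```

On real data the masks would be 3D label volumes, but the formula is identical; values such as the 0.956 to 0.978 improvement above are this quantity computed against the manually corrected gold standard.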
Prins, Noeline W.; Sanchez, Justin C.; Prasad, Abhishek
2014-01-01
Brain-Machine Interfaces (BMIs) can be used to restore function in people living with paralysis. Current BMIs require extensive calibration, which increases set-up times, and external inputs for decoder training that may be difficult to produce in paralyzed individuals. Both of these factors have presented challenges in transitioning the technology from research environments to activities of daily living (ADL). For BMIs to be used seamlessly in ADL, these issues should be handled with minimal external input, thus reducing the need for a technician/caregiver to calibrate the system. Reinforcement Learning (RL) based BMIs are a good tool when there is no external training signal and can provide an adaptive modality to train BMI decoders. However, RL-based BMIs are sensitive to the feedback provided to adapt the BMI. In actor-critic BMIs, this feedback is provided by the critic, and the overall system performance is limited by the critic's accuracy. In this work, we developed an adaptive BMI that could handle inaccuracies in the critic feedback in an effort to produce more accurate RL-based BMIs. We developed a confidence measure, which indicated how appropriate the feedback is for updating the decoding parameters of the actor. The results show that with the new update formulation, the critic accuracy is no longer a limiting factor for the overall performance. We tested and validated the system on three different data sets: synthetic data generated by an Izhikevich neural spiking model, synthetic data with a Gaussian noise distribution, and data collected from a non-human primate engaged in a reaching task. All results indicated that the system with the critic confidence built in always outperformed the system without the critic confidence. Results of this study suggest the potential application of the technique in developing an autonomous BMI that does not need an external signal for training or extensive calibration. PMID:24904257
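The confidence-weighting idea can be sketched in a toy actor-critic loop: the critic's (possibly erroneous) binary feedback is scaled by a confidence term before the decoder update, so unreliable feedback moves the parameters less. This is an illustrative simplification with invented dynamics, not the paper's decoder:

```python
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([1.0, -0.5, 0.25, 0.1])   # unknown "correct" decoder
w = np.zeros(4)                             # actor's decoder parameters

def confident_update(w, x, action, feedback, confidence, lr=0.05):
    # scale the (possibly wrong) critic feedback by its confidence
    # before letting it move the actor/decoder parameters
    return w + lr * confidence * feedback * action * x

for _ in range(3000):
    x = rng.normal(size=4)                           # neural feature vector
    action = 1.0 if w @ x >= 0 else -1.0             # actor's binary decision
    correct = 1.0 if (true_w @ x) * action > 0 else -1.0
    p_flip = rng.uniform(0.0, 0.4)                   # critic's error rate this trial
    feedback = correct if rng.random() > p_flip else -correct
    w = confident_update(w, x, action, feedback, confidence=1.0 - p_flip)

# alignment of the learned decoder with the true mapping
cos = w @ true_w / (np.linalg.norm(w) * np.linalg.norm(true_w))
print(round(cos, 2))
```

Even with the critic wrong up to 40% of the time, the confidence-weighted updates drift toward the true decoder on average, which mirrors the abstract's claim that critic accuracy stops being the limiting factor.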
Application of new type of distributed multimedia databases to networked electronic museum
NASA Astrophysics Data System (ADS)
Kuroda, Kazuhide; Komatsu, Naohisa; Komiya, Kazumi; Ikeda, Hiroaki
1999-01-01
Recently, various kinds of multimedia application systems have actively been developed based on advanced high-speed communication networks, computer processing technologies, and digital content-handling technologies. Against this background, this paper proposes a new distributed multimedia database system which can effectively perform a new function of cooperative retrieval among distributed databases. The proposed system introduces a new concept of a 'retrieval manager', which functions as an intelligent controller so that the user can recognize a set of distributed databases as one logical database. The logical database dynamically generates and executes a preferred combination of retrieval parameters on the basis of both directory data and the system environment. Moreover, a concept of 'domain' is defined in the system as a managing unit of retrieval. Retrieval can be performed effectively through cooperative processing among multiple domains. Communication languages and protocols are also defined in the system; these are used in every action for communications in the system. A language interpreter in each machine translates the communication language into the internal language used in that machine. Using the language interpreter, internal modules such as the DBMS and user interface modules can be selected freely. A concept of a 'content-set' is also introduced. A content-set is defined as a package of contents that are related to each other. The system handles a content-set as one object. The user terminal can effectively control the display of retrieved contents, referring to data indicating the relations of the contents within the content-set. In order to verify the function of the proposed system, a networked electronic museum was experimentally built.
The results of this experiment indicate that the proposed system can effectively retrieve the objective contents under the control of a number of distributed domains. The results also indicate that the system can work effectively even as it grows large.
Preprocessing Structured Clinical Data for Predictive Modeling and Decision Support
Oliveira, Mónica Duarte; Janela, Filipe; Martins, Henrique M. G.
2016-01-01
Background EHR systems have high potential to improve healthcare delivery and management. Although structured EHR data generates information in machine-readable formats, their use for decision support still poses technical challenges for researchers due to the need to preprocess and convert data into a matrix format. During our research, we observed that the clinical informatics literature does not provide guidance for researchers on how to build this matrix while avoiding potential pitfalls. Objectives This article aims to provide researchers a roadmap of the main technical challenges of preprocessing structured EHR data and possible strategies to overcome them. Methods Along the standard data processing stages (extracting database entries, defining features, processing data, assessing feature values, and integrating data elements: the EDPAI framework), we identified the main challenges faced by researchers and reflect on how to address them, based on lessons learned from our research experience and on best practices from the related literature. We highlight the main potential sources of error, present strategies to approach those challenges, and discuss the implications of these strategies. Results Following the EDPAI framework, researchers face five key challenges: (1) gathering and integrating data, (2) identifying and handling different feature types, (3) combining features to handle redundancy and granularity, (4) addressing data missingness, and (5) handling multiple feature values. Strategies to address these challenges include: cross-checking identifiers for robust data retrieval and integration; applying clinical knowledge in identifying feature types, in addressing redundancy and granularity, and in accommodating multiple feature values; and investigating missingness patterns adequately. Conclusions This article contributes to the literature by providing a roadmap to inform structured EHR data preprocessing.
It may advise researchers on potential pitfalls and the implications of methodological decisions in handling structured data, so as to avoid biases and help realize the benefits of the secondary use of EHR data. PMID:27924347
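Several of the listed challenges (pivoting entries into a patient-by-feature matrix, aggregating multiple feature values, and making missingness explicit) can be sketched with toy long-format records. The field names and the aggregation choice here are illustrative assumptions, not the article's schema:

```python
import math

# toy "structured EHR" rows: one dict per lab result (long format)
rows = [
    {"pid": 1, "feature": "glucose", "value": 5.4},
    {"pid": 1, "feature": "sodium",  "value": 140},
    {"pid": 2, "feature": "glucose", "value": 6.1},
    {"pid": 3, "feature": "sodium",  "value": 138},
]

def to_matrix(rows, features, agg=max):
    """Pivot long-format records into one row per patient, aggregating
    multiple values per feature (challenge 5) and flagging missingness
    explicitly with NaN (challenge 4) instead of silently imputing."""
    by_pid = {}
    for r in rows:
        by_pid.setdefault(r["pid"], {}).setdefault(r["feature"], []).append(r["value"])
    matrix = {}
    for pid, feats in by_pid.items():
        matrix[pid] = [agg(feats[f]) if f in feats else math.nan for f in features]
    return matrix

m = to_matrix(rows, ["glucose", "sodium"])
print(m[2])  # [6.1, nan] -- sodium missing for patient 2
```

Keeping missing values as explicit NaN, rather than zero-filling, lets the later modeling stage investigate missingness patterns instead of baking a bias into the matrix.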
Development of Magnetorheological Resistive Exercise Device for Rowing Machine
Grigas, Vytautas; Šulginas, Anatolijus; Žiliukas, Pranas
2016-01-01
Training equipment used by professional sportsmen has a great impact on their sport performance. Most universal exercisers may help only to improve the general physical condition due to the specific kinematics and peculiar resistance generated by their loading units. Training of effective techniques and learning of psychomotor skills are possible only when exercisers conform to the movements and resistance typical for particular sports kinematically and dynamically. Methodology of developing a magnetorheological resistive exercise device for generating the desired law of passive resistance force and its application in a lever-type rowing machine are described in the paper. The structural parameters of a controllable hydraulic cylinder type device were found by means of the computational fluid dynamics simulation performed by ANSYS CFX software. Parameters describing the magnetorheological fluid as non-Newtonian were determined by combining numerical and experimental research of the resistance force generated by the original magnetorheological damper. A structural scheme of the device control system was developed and the variation of the strength of magnetic field that affects the magnetorheological fluid circulating in the device was determined, ensuring a variation of the resistance force on the oar handle adequate for the resistance that occurs during a real boat rowing stroke. PMID:27293479
repRNA: a web server for generating various feature vectors of RNA sequences.
Liu, Bin; Liu, Fule; Fang, Longyun; Wang, Xiaolong; Chou, Kuo-Chen
2016-02-01
With the rapid growth of RNA sequences generated in the postgenomic age, it is highly desirable to develop a flexible method that can generate various kinds of vectors to represent these sequences by focusing on their different features. This is because nearly all existing machine-learning methods, such as the SVM (support vector machine) and KNN (k-nearest neighbor), can only handle vectors, not sequences. To meet the increasing demands and speed up genome analyses, we have developed a new web server called "representations of RNA sequences" (repRNA). Compared with existing methods, repRNA is much more comprehensive, flexible, and powerful, as reflected by the following facts: (1) it can generate 11 different modes of feature vectors for users to choose from according to their investigation purposes; (2) it allows users to select features from 22 built-in physicochemical properties and even those defined by the users themselves; (3) the resultant feature vectors and the secondary structures of the corresponding RNA sequences can be visualized. The repRNA web server is freely accessible to the public at http://bioinformatics.hitsz.edu.cn/repRNA/ .
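A k-mer frequency vector is one of the simplest feature modes of the kind repRNA generates: it maps a variable-length sequence to a fixed-length vector that SVM or KNN can consume. A minimal sketch, not repRNA's actual implementation:

```python
from itertools import product

def kmer_vector(seq, k=2, alphabet="ACGU"):
    """Fixed-length k-mer frequency vector: one of the simplest ways to
    turn a variable-length RNA sequence into something SVM/KNN can handle."""
    kmers = ["".join(p) for p in product(alphabet, repeat=k)]
    counts = {km: 0 for km in kmers}
    for i in range(len(seq) - k + 1):
        km = seq[i:i + k]
        if km in counts:          # skip windows with non-ACGU characters
            counts[km] += 1
    total = max(1, len(seq) - k + 1)
    return [counts[km] / total for km in kmers]   # normalized frequencies

v = kmer_vector("AUGGCUA", k=2)
print(len(v))  # 16 dinucleotide frequencies
```

Every sequence, whatever its length, maps to the same 4^k-dimensional space, which is exactly the property the vector-only learning methods mentioned in the abstract require.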
Simulation based optimized beam velocity in additive manufacturing
NASA Astrophysics Data System (ADS)
Vignat, Frédéric; Béraud, Nicolas; Villeneuve, François
2017-08-01
Manufacturing good parts with additive technologies relies on melt pool dimensions and temperature, which are controlled by manufacturing strategies often decided on the machine side. Strategies are built on the beam path and a variable energy input. Beam paths are often a mix of contour and hatching strategies filling the contours at each slice. The energy input depends on beam intensity and speed and is determined from simple thermal models to control melt pool dimensions and temperature and to ensure porosity-free material. These models take into account variations in the thermal environment, such as overhanging surfaces or back-and-forth hatching paths. However, not all situations are correctly handled, and precision is limited. This paper proposes a new method to determine the energy input from a full built-chamber 3D thermal simulation. Using the results of the simulation, the energy is modified to keep the melt pool temperature in a predetermined range. The paper first presents an experimental method to determine the optimal temperature range. In a second part, the method to optimize the beam speed from the simulation results is presented. Finally, the optimized beam path is tested in the EBM machine and the built parts are compared with parts built with an ordinary beam path.
A computer program for the calculation of laminar and turbulent boundary layer flows
NASA Technical Reports Server (NTRS)
Dwyer, H. A.; Doss, E. D.; Goldman, A. L.
1972-01-01
The results are presented of a study to produce a computer program to calculate laminar and turbulent boundary layer flows. The program is capable of calculating the following types of flow: (1) incompressible or compressible, (2) two dimensional or axisymmetric, and (3) flows with significant transverse curvature. Also, the program can handle a large variety of boundary conditions, such as blowing or suction, arbitrary temperature distributions and arbitrary wall heat fluxes. The program has been specialized to the calculation of equilibrium air flows and all of the thermodynamic and transport properties used are for air. For the turbulent transport properties, the eddy viscosity approach has been used. Although the eddy viscosity models are semi-empirical, the model employed in the program has corrections for pressure gradients, suction and blowing and compressibility. The basic method of approach is to put the equations of motion into a finite difference form and then solve them by use of a digital computer. The program is written in FORTRAN 4 and requires small amounts of computer time on most scientific machines. For example, most laminar flows can be calculated in less than one minute of machine time, while turbulent flows usually require three or four minutes.
Mathematical model for dynamic cell formation in fast fashion apparel manufacturing stage
NASA Astrophysics Data System (ADS)
Perera, Gayathri; Ratnayake, Vijitha
2018-05-01
This paper presents a mathematical programming model for dynamic cell formation to minimize changeover-related costs (i.e., machine relocation costs and machine setup costs) and inter-cell material handling costs, to cope with the volatile production environments in the apparel manufacturing industry. The model is formulated from the findings of a comprehensive literature review. The developed model is validated based on data collected from three different factories in the apparel industry manufacturing fast fashion products. A program code is developed using the Lingo 16.0 software package to generate optimal cells for the developed model and to determine the possible cost-saving percentage when the existing layouts used in the three factories are replaced by the generated optimal cells. The optimal cells generated by the developed mathematical model result in significant cost savings when compared with the existing product layouts used in the production/assembly departments of the selected factories. The developed model can be considered effective in minimizing the considered cost terms in the dynamic production environment of fast fashion apparel manufacturing. The findings of this paper can be used for further research on minimizing changeover-related costs in the fast fashion apparel production stage.
Deceleration system for kinematic linkages of positioning
NASA Astrophysics Data System (ADS)
Stan, G.
2017-08-01
Flexible automation is used more and more in various production processes, so that both the machining itself on CNC machine tools and the workpiece handling are performed through programming of the needed working cycle. In order to obtain successful precise positioning, each motion degree of freedom needs a certain deceleration before stopping at a programmed point. The increase of the motion speed of moving elements within a manipulator's structure depends directly on the quality of the deceleration before the programmed stop. Proportional valves as well as servo-valves that can perform hydraulic deceleration are well known, but they have several disadvantages, such as high price, severe oil filtering requirements, and low reliability under industrial conditions. This work presents a new deceleration system that allows adjustment of the deceleration slope according to actual conditions: inertial mass, speed, etc. The new hydraulic decelerator can be integrated into a position loop or used for positioning large elements that only perform fixed cycles. The results obtained on the positioning accuracy of a linear axis using the new hydraulic decelerator are also presented. The price of the new deceleration system is much lower compared to that of proportional valves or servo-valves.
Large-scale machine learning and evaluation platform for real-time traffic surveillance
NASA Astrophysics Data System (ADS)
Eichel, Justin A.; Mishra, Akshaya; Miller, Nicholas; Jankovic, Nicholas; Thomas, Mohan A.; Abbott, Tyler; Swanson, Douglas; Keller, Joel
2016-09-01
In traffic engineering, vehicle detectors are trained on limited datasets, resulting in poor accuracy when deployed in real-world surveillance applications. Annotating large-scale, high-quality datasets is challenging. Typically, these datasets have limited diversity; they do not reflect the real-world operating environment. There is a need for a large-scale, cloud-based positive and negative mining process and a large-scale learning and evaluation system for the application of automatic traffic measurements and classification. The proposed positive and negative mining process addresses the quality of crowd-sourced ground truth data through machine learning review and human feedback mechanisms. The proposed learning and evaluation system uses a distributed cloud computing framework to handle the data-scaling issues associated with large numbers of samples and a high-dimensional feature space. The system is trained using AdaBoost on 1,000,000 Haar-like features extracted from 70,000 annotated video frames. The trained real-time vehicle detector achieves an accuracy of at least 95% half of the time, and about 78% 19/20 of the time, when tested on ~7,500,000 video frames. At the end of 2016, the dataset is expected to have over 1 billion annotated video frames.
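Haar-like features of the kind used here are cheap enough for real-time detection because an integral image turns any rectangle sum into an O(1) lookup. A minimal sketch of a two-rectangle (vertical-edge) feature:

```python
import numpy as np

def integral_image(img):
    """Summed-area table: ii[r, c] = sum of img[:r+1, :c+1]."""
    return img.cumsum(0).cumsum(1)

def rect_sum(ii, r0, c0, r1, c1):
    """Sum of img[r0:r1, c0:c1] read from the integral image
    with at most four lookups (exclusive end indices)."""
    total = ii[r1 - 1, c1 - 1]
    if r0 > 0:
        total -= ii[r0 - 1, c1 - 1]
    if c0 > 0:
        total -= ii[r1 - 1, c0 - 1]
    if r0 > 0 and c0 > 0:
        total += ii[r0 - 1, c0 - 1]
    return total

def two_rect_haar(ii, r0, c0, h, w):
    """Left-minus-right two-rectangle Haar-like feature."""
    left = rect_sum(ii, r0, c0, r0 + h, c0 + w // 2)
    right = rect_sum(ii, r0, c0 + w // 2, r0 + h, c0 + w)
    return left - right

img = np.zeros((6, 6))
img[:, :3] = 1.0                      # bright left half, dark right half
ii = integral_image(img)
print(two_rect_haar(ii, 0, 0, 6, 6))  # 18.0: strong vertical-edge response
```

AdaBoost then builds the detector by greedily selecting, from the million candidate features, the weak classifiers (thresholded feature responses) that best separate vehicle from non-vehicle windows.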
Leveraging human oversight and intervention in large-scale parallel processing of open-source data
NASA Astrophysics Data System (ADS)
Casini, Enrico; Suri, Niranjan; Bradshaw, Jeffrey M.
2015-05-01
The popularity of cloud computing, along with the increased availability of cheap storage, has led to the necessity of elaborating and transforming large volumes of open-source data, all in parallel. One way to handle such extensive volumes of information properly is to take advantage of distributed computing frameworks like MapReduce. Unfortunately, an entirely automated approach that excludes human intervention is often unpredictable and error prone. Highly accurate data processing and decision-making can be achieved by supporting an automatic process through human collaboration, in a variety of environments such as warfare, cyber security, and threat monitoring. Although this mutual participation seems easily exploitable, human-machine collaboration in the field of data analysis presents several challenges. First, due to the asynchronous nature of human intervention, it is necessary to verify that, once a correction is made, all the necessary reprocessing is done down the chain. Second, it is often necessary to minimize the amount of reprocessing in order to optimize the usage of resources due to limited availability. In order to address these strict requirements, this paper introduces improvements to an innovative approach for human-machine collaboration in the processing of large amounts of open-source data in parallel.
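The reprocessing concern can be made concrete with a toy MapReduce word count: when a human analyst corrects one input document, only that document's map output is recomputed before the cheap reduce step is re-run. The documents and counts below are invented for illustration:

```python
from collections import Counter
from functools import reduce

def map_phase(doc):
    """Map: one document to its local word counts."""
    return Counter(doc.split())

def reduce_phase(counters):
    """Reduce: merge all partial counts into global totals."""
    return reduce(lambda a, b: a + b, counters, Counter())

docs = {"d1": "attack reported north", "d2": "attack confirmed", "d3": "no activity"}
partials = {k: map_phase(v) for k, v in docs.items()}
totals = reduce_phase(partials.values())
assert totals["attack"] == 2

# A human analyst corrects one source document; only that partial result
# is recomputed before the cheap reduce is re-run -- the point of
# minimizing reprocessing after asynchronous human intervention.
docs["d2"] = "attack retracted"
partials["d2"] = map_phase(docs["d2"])
totals = reduce_phase(partials.values())
print(totals["confirmed"])  # 0 after the correction
```

Caching the per-document partials is what keeps the cost of a late human correction proportional to the corrected input rather than to the whole dataset.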
Igne, Benoît; Drennen, James K; Anderson, Carl A
2014-01-01
Changes in raw materials and process wear and tear can have significant effects on the prediction error of near-infrared calibration models. When the variability that is present during routine manufacturing is not included in the calibration, test, and validation sets, the long-term performance and robustness of the model will be limited. Nonlinearity is a major source of interference. In near-infrared spectroscopy, nonlinearity can arise from light path-length differences that can come from differences in particle size or density. The usefulness of support vector machine (SVM) regression to handle nonlinearity and improve the robustness of calibration models, in scenarios where the calibration set did not include all the variability present in the test set, was evaluated. Compared to partial least squares (PLS) regression, SVM regression was less affected by physical (particle size) and chemical (moisture) differences. The linearity of the SVM predicted values was also improved. Nevertheless, although visualization and interpretation tools have been developed to enhance the usability of SVM-based methods, work remains to be done to provide chemometricians in the pharmaceutical industry with a regression method that can supplement PLS-based methods.
Pham-The, Hai; Casañola-Martin, Gerardo; Garrigues, Teresa; Bermejo, Marival; González-Álvarez, Isabel; Nguyen-Hai, Nam; Cabrera-Pérez, Miguel Ángel; Le-Thi-Thu, Huong
2016-02-01
In many absorption, distribution, metabolism, and excretion (ADME) modeling problems, imbalanced data can negatively affect the classification performance of machine learning algorithms. Solutions for handling imbalanced datasets have been proposed, but their application to ADME modeling tasks is underexplored. In this paper, various strategies including cost-sensitive learning and resampling methods were studied to tackle the moderate imbalance problem of a large Caco-2 cell permeability database. Simple physicochemical molecular descriptors were utilized for data modeling. Support vector machine classifiers were constructed and compared using multiple comparison tests. Results showed that the models developed on the basis of resampling strategies displayed better performance than the cost-sensitive classification models, especially in the case of oversampling, where misclassification rates for the minority class were 0.11 and 0.14 for the training and test sets, respectively. A consensus model with enhanced applicability domain was subsequently constructed and showed improved performance. This model was used to predict a set of randomly selected high-permeability reference drugs according to the biopharmaceutics classification system. Overall, this study provides a comparison of numerous rebalancing strategies and displays the effectiveness of oversampling methods to deal with imbalanced permeability data problems.
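The oversampling strategy favoured in the abstract above can be illustrated with random minority-class oversampling before fitting an SVM. The toy "permeability" data and the two hypothetical descriptors below are invented for the sketch; scikit-learn is assumed.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.utils import resample

rng = np.random.default_rng(2)
# Imbalanced toy data: 180 majority vs 20 minority compounds, described by
# two hypothetical physicochemical descriptors.
X_maj = rng.normal(0.0, 1.0, (180, 2))
X_min = rng.normal(2.5, 1.0, (20, 2))
X = np.vstack([X_maj, X_min])
y = np.array([0] * 180 + [1] * 20)

# Random oversampling: resample the minority class with replacement
# until it matches the majority class size.
X_min_up, y_min_up = resample(X_min, y[y == 1], replace=True,
                              n_samples=180, random_state=0)
X_bal = np.vstack([X_maj, X_min_up])
y_bal = np.concatenate([y[y == 0], y_min_up])

clf = SVC(kernel="rbf").fit(X_bal, y_bal)
# Misclassification rate on the original minority samples:
err_min = 1.0 - clf.score(X_min, np.ones(20))
print("minority misclassification rate:", err_min)
```

Cost-sensitive learning, the alternative the study compared against, would instead keep the data as-is and pass class weights to the classifier.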
Old scissors to industrial automation: the impact of technologic evolution on worker's health.
Teodoroski, Rita de Cassia Clark; Koppe, Vanessa Mazzocchi; Merino, Eugênio Andrés Díaz
2012-01-01
In garment production, the professional performs different jobs, among which cutting stands out; scissors have long been the instrument most used for this activity. Over the years, technology has gained ground in the textile industry. However, although industrial automation can meet labor-market demands, without appropriate orientation the worker remains exposed to the risks inherent in the job. Ergonomics is a science that seeks to promote comfort and well-being together with efficacy; its goals are well defined and clearly guide actions aimed at transforming working conditions. This study aimed to analyze the activity of cutting fabric with a machine by a seamstress and its implications for body posture. The methodology used was observation and application of the RULA protocol; the result obtained was level 3 with a score of 5, confirming that "investigation and changes are required soon". We conclude that using the machine for fabric cutting should be encouraged, but in conjunction with guidance on improving posture while handling it. The aim is to prevent dysfunctions of the musculoskeletal system that prevent employees from performing their work tasks efficiently and productively.
State Event Models for the Formal Analysis of Human-Machine Interactions
NASA Technical Reports Server (NTRS)
Combefis, Sebastien; Giannakopoulou, Dimitra; Pecheur, Charles
2014-01-01
The work described in this paper was motivated by our experience with applying a framework for formal analysis of human-machine interactions (HMI) to a realistic model of an autopilot. The framework is built around a formally defined conformance relation called "full-control" between an actual system and the mental model according to which the system is operated. Systems are well-designed if they can be described by relatively simple, full-control, mental models for their human operators. For this reason, our framework supports automated generation of minimal full-control mental models for HMI systems, where both the system and the mental models are described as labelled transition systems (LTS). The autopilot that we analysed has been developed in the NASA Ames HMI prototyping tool ADEPT. In this paper, we describe how we extended the models that our HMI analysis framework handles to allow adequate representation of ADEPT models. We then provide a property-preserving reduction from these extended models to LTSs, to enable application of our LTS-based formal analysis algorithms. Finally, we briefly discuss the analyses we were able to perform on the autopilot model with our extended framework.
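The LTS setting described above can be made concrete with a small sketch. The toy "autopilot" states below are hypothetical, and the check covers only one direction of the full-control relation (every command the system enables must also be enabled in the mental model); the actual relation, and the minimal-model generation, are richer than this.

```python
# A labelled transition system (LTS) as a dict: state -> {action: next_state}.
# (Deterministic for simplicity; the paper's models may be nondeterministic.)

system = {
    "OFF": {"power_on": "ARMED"},
    "ARMED": {"engage": "ON", "power_off": "OFF"},
    "ON": {"disengage": "ARMED"},
}

# A candidate operator mental model that abstracts ARMED and ON into one state.
mental = {
    "off": {"power_on": "active"},
    "active": {"engage": "active", "disengage": "active", "power_off": "off"},
}

def model_covers_system(system, mental, s0, m0):
    """Product-state search: every action the system enables must also be
    enabled in the operator's mental model at the corresponding state."""
    seen, stack = set(), [(s0, m0)]
    while stack:
        s, m = stack.pop()
        if (s, m) in seen:
            continue
        seen.add((s, m))
        for action, s_next in system.get(s, {}).items():
            if action not in mental.get(m, {}):
                return False, (s, m, action)   # operator would be surprised here
            stack.append((s_next, mental[m][action]))
    return True, None

ok, witness = model_covers_system(system, mental, "OFF", "off")
print(ok, witness)
```

A failing `witness` triple would pinpoint the system state, mental state, and action at which the operator's model breaks down — the kind of diagnostic such frameworks automate at scale.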
FPGA based data processing in the ALICE High Level Trigger in LHC Run 2
NASA Astrophysics Data System (ADS)
Engel, Heiko; Alt, Torsten; Kebschull, Udo
2017-10-01
The ALICE High Level Trigger (HLT) is a computing cluster dedicated to the online compression, reconstruction and calibration of experimental data. The HLT receives detector data via serial optical links into FPGA-based readout boards that process the data already at the per-link level inside the FPGA and provide it to the host machines, which are connected by a data transport framework. FPGA-based data pre-processing is enabled for the biggest detector of ALICE, the Time Projection Chamber (TPC), with a hardware cluster finding algorithm. This algorithm was ported to the Common Read-Out Receiver Card (C-RORC) as used in the HLT for Run 2. It was improved to handle double the input bandwidth and adjusted to the upgraded TPC Readout Control Unit (RCU2). A flexible firmware implementation in the HLT handles both the old and the new TPC data format and link rates transparently. Extended protocol and data error detection, error handling and the enhanced RCU2 data ordering scheme provide an improved physics performance of the cluster finder. The performance of the cluster finder was verified against large sets of reference data both in terms of throughput and algorithmic correctness. Comparisons with a software reference implementation confirm significant savings on CPU processing power using the hardware implementation. The C-RORC hardware with the cluster finder for RCU1 data has been in use in the HLT since the start of Run 2. The extended hardware cluster finder implementation for the RCU2 with doubled throughput has been active since the upgrade of the TPC readout electronics in early 2016.
An analysis of roof bolter fatalities and injuries in U.S. mining
Sammarco, J.J.; Podlesny, A.; Rubinstein, E.N.; Demich, B.
2017-01-01
Roof bolting typically follows the extraction of a commodity to help keep the roof from collapsing. During 2004 to 2013, roof bolter operators had the highest number of machinery-related injuries, accounting for 64.7 percent, at underground coal mines. This paper analyzes U.S. roof bolter fatal and nonfatal lost-time injury data at underground work locations for all commodities from 2004 through 2013 and determines risk indices for six roof bolting tasks. For fatal and nonfatal incidents combined, the roof bolting tasks in order of the highest to lowest risk index were bolting, handling of materials, setting the temporary roof support (TRS), drilling, tramming, and traversing. For fatalities, the roof bolting tasks in order of the highest to lowest risk index were handling of materials, setting the TRS, bolting, drilling, traversing, and tramming. Age was found to be a significant factor. Severity of injury, indicated by days lost, was found to increase with increasing age as well as with increasing experience, largely due to the confounding of age and experience. The operation of the roof bolting machine used in underground mining should be a research priority given the high frequency and severity of incidents. The results also suggest that temporal factors may exist, so additional research is warranted to better understand these factors and potentially develop interventions. This research provides a data-driven foundation from which future research can be conducted for safety interventions to reduce the frequency and severity of incidents involving the roof bolter activities of bolting, handling of materials, and setting the TRS. PMID:28845099
Object as a model of intelligent robot in the virtual workspace
NASA Astrophysics Data System (ADS)
Foit, K.; Gwiazda, A.; Banas, W.; Sekala, A.; Hryniewicz, P.
2015-11-01
The contemporary industry requires that every element of a production line fit into the global schema, which is connected with the global structure of the business. There is a need to find practical and effective ways of designing and managing the production process. The term "effective" should be understood to mean that there exists a method which allows building a system of nodes and relations in order to describe the role of a particular machine in the production process. Among all the machines involved in the manufacturing process, industrial robots are the most complex ones. This complexity is reflected in the realization of elaborate tasks, involving handling, transporting or orienting objects in a workspace, and even performing simple machining processes, such as deburring, grinding, painting, applying adhesives and sealants, etc. The robot also performs some activities connected with automatic tool changing and operating the equipment mounted on its wrist. Because it has a programmable control system, the robot also performs additional activities connected with sensors, vision systems, operating the storages of manipulated objects, tools or grippers, measuring stands, etc. For this reason the description of the robot as a part of a production system should take into account the specific nature of this machine: the robot is a substitute for a worker who performs his tasks in a particular environment. In this case, the model should be able to characterize the essence of this "employment" sufficiently. One possible approach to this problem is to treat the robot as an object, in the sense often used in computer science. This allows one both to describe operations performed on the object and to describe operations performed by the object. This paper focuses mainly on the definition of the object as the model of the robot. This model is confronted with the other possible descriptions.
The results can be further used during designing of the complete manufacturing system, which takes into account all the involved machines and has the form of an object-oriented model.
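The robot-as-object idea sketched in the abstract — an object exposing both operations performed by the robot and operations performed on it — can be illustrated with a small class. All names, tools and tasks below are hypothetical placeholders, not taken from the paper's model.

```python
class Robot:
    """Sketch of the robot-as-object idea: methods model operations the
    robot performs (change_tool, handle) and operations performed on it
    by the production system (assign_task), with internal state."""

    def __init__(self, name, tools):
        self.name = name
        self.tools = list(tools)       # tool magazine
        self.current_tool = None
        self.log = []                  # history of operations, for the model

    def assign_task(self, task):
        # Operation performed *on* the object by the production system.
        self.log.append(("assigned", task))

    def change_tool(self, tool):
        # Operation performed *by* the object: automatic tool change.
        if tool not in self.tools:
            raise ValueError(f"{tool!r} not in magazine")
        self.current_tool = tool
        self.log.append(("change_tool", tool))

    def handle(self, workpiece):
        # Handling requires the right end effector to be mounted.
        if self.current_tool != "gripper":
            raise RuntimeError("gripper required for handling")
        self.log.append(("handle", workpiece))

robot = Robot("R1", tools=["gripper", "deburring_spindle"])
robot.assign_task("palletize")
robot.change_tool("gripper")
robot.handle("casting_42")
print(robot.log)
```

In an object-oriented model of the whole manufacturing system, instances like this would become the nodes, and the method calls the relations, of the node-relation schema the paper describes.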
Reactive Scheduling in Multipurpose Batch Plants
NASA Astrophysics Data System (ADS)
Narayani, A.; Shaik, Munawar A.
2010-10-01
Scheduling is an important operation in process industries for improving resource utilization resulting in direct economic benefits. It has a two-fold objective of fulfilling customer orders within the specified time as well as maximizing the plant profit. Unexpected disturbances such as machine breakdown, arrival of rush orders and cancellation of orders affect the schedule of the plant. Reactive scheduling is generation of a new schedule which has minimum deviation from the original schedule in spite of the occurrence of unexpected events in the plant operation. Recently, Shaik & Floudas (2009) proposed a novel unified model for short-term scheduling of multipurpose batch plants using unit-specific event-based continuous time representation. In this paper, we extend the model of Shaik & Floudas (2009) to handle reactive scheduling.
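The reactive-scheduling idea above — regenerate a schedule with minimum deviation after a disturbance — can be sketched far more simply than the unit-specific event-based MILP of Shaik & Floudas (2009). The greedy toy below, with entirely hypothetical task data, keeps unaffected tasks fixed and re-inserts tasks displaced by a unit breakdown at the earliest feasible times.

```python
# Toy reactive rescheduling (not the Shaik & Floudas formulation): when a
# unit breaks down, keep unaffected tasks fixed and greedily re-insert the
# displaced tasks on the remaining units. Interrupted tasks restart.

original = [  # (task, unit, start, duration)
    ("T1", "U1", 0, 3), ("T2", "U2", 0, 2),
    ("T3", "U1", 3, 2), ("T4", "U2", 2, 3),
]

def reschedule(schedule, broken_unit, t_break, units):
    fixed, displaced = [], []
    for task, unit, start, dur in schedule:
        if unit == broken_unit and start + dur > t_break:
            displaced.append((task, dur))      # hit by the breakdown
        else:
            fixed.append((task, unit, start, dur))
    # Earliest availability of each surviving unit.
    avail = {u: max([s + d for _, uu, s, d in fixed if uu == u] + [t_break])
             for u in units if u != broken_unit}
    new = list(fixed)
    for task, dur in displaced:                # earliest-finish insertion
        u = min(avail, key=avail.get)
        new.append((task, u, avail[u], dur))
        avail[u] += dur
    return new

print(reschedule(original, "U1", 2, ["U1", "U2"]))
```

The MILP approach instead minimizes a formal deviation measure over all feasible schedules; the greedy sketch only conveys the fixed-versus-displaced decomposition.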
Deuterium retention and surface modification of tungsten macrobrush samples exposed in FTU Tokamak
NASA Astrophysics Data System (ADS)
Maddaluno, G.; Giacomi, G.; Rufoloni, A.; Verdini, L.
2007-06-01
The effect of discrete structures such as macrobrush or castellated surfaces on power handling and deuterium retention of plasma facing components is to be assessed since such geometrical configurations are needed for increasing the lifetime of the armour to heat-sink joint. Four small macrobrush W and W + 1%La2O3 samples have been exposed in the Frascati Tokamak Upgrade (FTU) scrape-off layer up to the last closed flux surface by means of the Sample Introduction System. FTU is an all metal machine with no carbon source inside vacuum vessel; it exhibits ITER relevant energy and particle fluxes on the plasma facing components. Here, results on morphological surface changes (SEM), chemical composition (EDX) and deuterium retention (TDS) are reported.
Towards future high performance computing: What will change? How can we be efficient?
NASA Astrophysics Data System (ADS)
Düben, Peter
2017-04-01
How can we make the most out of "exascale" supercomputers that will be available soon and enable us to perform on the order of 1,000,000,000,000,000,000 (10^18) floating-point operations within a single second? How do we need to design applications to use these machines efficiently? What are the limits? We will discuss opportunities and limits of the use of future high performance computers from the perspective of Earth System Modelling. We will provide an overview of future challenges and outline how numerical applications will need to change to run efficiently on supercomputers in the future. We will also discuss how different disciplines can support each other, and talk about data handling and the numerical precision of data.
Set processing in a network environment [data bases and magnetic disks and tapes]
NASA Technical Reports Server (NTRS)
Hardgrave, W. T.
1975-01-01
A combination of a local network, a mass storage system, and an autonomous set processor serving as a data/storage management machine is described. Its characteristics include: content-accessible data bases usable from all connected devices; efficient storage/access of large data bases; simple and direct programming with data manipulation and storage management handled by the set processor; simple data base design and entry from source representation to set processor representation with no predefinition necessary; capability available for user sort/order specification; significant reduction in tape/disk pack storage and mounts; flexible environment that allows upgrading hardware/software configuration without causing major interruptions in service; minimal traffic on data communications network; and improved central memory usage on large processors.
High-capacity high-speed recording
NASA Astrophysics Data System (ADS)
Jamberdino, A. A.
1981-06-01
Continuing advances in wideband communications and information handling are leading to extremely large volume digital data systems for which conventional data storage techniques are becoming inadequate. The paper presents an assessment of alternative recording technologies for the extremely wideband, high capacity storage and retrieval systems currently under development. Attention is given to longitudinal and rotary head high density magnetic recording, laser holography in human readable/machine readable devices and a wideband recorder, digital optical disks, and spot recording in microfiche formats. The electro-optical technologies considered are noted to be capable of providing data bandwidths up to 1000 megabits/sec and total data storage capacities in the 10^11 to 10^12 bit range, an order of magnitude improvement over conventional technologies.
Wang, Shuihua; Yang, Ming; Du, Sidan; Yang, Jiquan; Liu, Bin; Gorriz, Juan M.; Ramírez, Javier; Yuan, Ti-Fei; Zhang, Yudong
2016-01-01
Highlights: We develop a computer-aided diagnosis system for unilateral hearing loss detection in structural magnetic resonance imaging. Wavelet entropy is introduced to extract global features from brain images. A directed acyclic graph is employed to endow the support vector machine with the ability to handle multi-class problems. The developed computer-aided diagnosis system achieves an overall accuracy of 95.1% for the three-class problem of differentiating left-sided and right-sided hearing loss from healthy controls. Aim: Sensorineural hearing loss (SNHL) is correlated with many neurodegenerative diseases. More and more computer-vision-based methods are being used to detect it automatically. Materials: We have in total 49 subjects, scanned by 3.0T MRI (Siemens Medical Solutions, Erlangen, Germany). The subjects comprise 14 patients with right-sided hearing loss (RHL), 15 patients with left-sided hearing loss (LHL), and 20 healthy controls (HC). Method: We treat this as a three-class classification problem: RHL, LHL, and HC. Wavelet entropy (WE) was extracted from the magnetic resonance images of each subject and then submitted to a directed acyclic graph support vector machine (DAG-SVM). Results: Ten repetitions of 10-fold cross validation show that 3-level decomposition yields an overall accuracy of 95.10% for this three-class classification problem, higher than feedforward neural network, decision tree, and naive Bayesian classifiers. Conclusions: This computer-aided diagnosis system is promising. We hope this study will attract more computer vision methods for detecting hearing loss. PMID:27807415
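The DAG-SVM idea named above — pairwise binary SVMs arranged in a directed acyclic graph, eliminating one candidate class per node — can be sketched on synthetic three-class data. The toy clusters below stand in for RHL/LHL/HC feature vectors and are not the study's wavelet-entropy features; scikit-learn is assumed.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(3)
# Three well-separated synthetic classes standing in for RHL / LHL / HC.
centers = [(0, 0), (4, 0), (2, 4)]
X = np.vstack([rng.normal(c, 0.5, (40, 2)) for c in centers])
y = np.repeat([0, 1, 2], 40)

# One binary SVM per pair of classes.
pair_clfs = {}
for i in range(3):
    for j in range(i + 1, 3):
        mask = (y == i) | (y == j)
        pair_clfs[(i, j)] = SVC(kernel="linear").fit(X[mask], y[mask])

def dag_predict(x):
    """DAG traversal: start with all classes and eliminate one per node
    until a single class remains."""
    alive = [0, 1, 2]
    while len(alive) > 1:
        i, j = alive[0], alive[-1]
        winner = pair_clfs[(i, j)].predict([x])[0]
        alive.remove(j if winner == i else i)
    return alive[0]

preds = np.array([dag_predict(x) for x in X])
print("training accuracy:", (preds == y).mean())
```

With k classes the DAG needs k(k-1)/2 pairwise classifiers but only k-1 evaluations per prediction, which is the appeal over plain one-vs-one voting.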
Heat-machine control by quantum-state preparation: from quantum engines to refrigerators.
Gelbwaser-Klimovsky, D; Kurizki, G
2014-08-01
We explore the dependence of the performance bounds of heat engines and refrigerators on the initial quantum state and the subsequent evolution of their piston, modeled by a quantized harmonic oscillator. Our goal is to provide a fully quantized treatment of self-contained (autonomous) heat machines, as opposed to their prevailing semiclassical description that consists of a quantum system alternately coupled to a hot or a cold heat bath and parametrically driven by a classical time-dependent piston or field. Here, by contrast, there is no external time-dependent driving. Instead, the evolution is caused by the stationary simultaneous interaction of two heat baths (having distinct spectra and temperatures) with a single two-level system that is in turn coupled to the quantum piston. The fully quantized treatment we put forward allows us to investigate work extraction and refrigeration by the tools of quantum-optical amplifier and dissipation theory, particularly, by the analysis of amplified or dissipated phase-plane quasiprobability distributions. Our main insight is that quantum states may be thermodynamic resources and can provide a powerful handle, or control, on the efficiency of the heat machine. In particular, a piston initialized in a coherent state can cause the engine to produce work at an efficiency above the Carnot bound in the linear amplification regime. In the refrigeration regime, the coefficient of performance can transgress the Carnot bound if the piston is initialized in a Fock state. The piston may be realized by a vibrational mode, as in nanomechanical setups, or an electromagnetic field mode, as in cavity-based scenarios.
Fuzzy support vector machine for microarray imbalanced data classification
NASA Astrophysics Data System (ADS)
Ladayya, Faroh; Purnami, Santi Wulan; Irhamah
2017-11-01
DNA microarrays yield data containing gene expression with small sample sizes and a high number of features. Furthermore, class imbalance is a common problem in microarray data. It occurs when a dataset is dominated by a class that has significantly more instances than the other, minority classes. Therefore, a classification method is needed that solves both the high-dimensionality and the imbalance problem. The Support Vector Machine (SVM) is a classification method capable of handling large or small samples, nonlinearity, high dimensionality, overlearning and local-minimum issues. SVM has been widely applied to DNA microarray data classification, and it has been shown that SVM provides the best performance among machine learning methods. However, imbalanced data remain a problem because SVM treats all samples as equally important, so the results are biased against the minority class. To overcome the imbalance, Fuzzy SVM (FSVM) is proposed. This method applies a fuzzy membership to each input point and reformulates the SVM such that different input points provide different contributions to the classifier. Minority-class samples receive large fuzzy memberships, so FSVM can pay more attention to the samples with larger fuzzy membership. Given that DNA microarray data are high dimensional with a very large number of features, feature selection is first performed using the Fast Correlation-Based Filter (FCBF). In this study, SVM and FSVM, each with and without FCBF, are analyzed and their classification performance compared. Based on the overall results, FSVM on the selected features has the best classification performance.
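The fuzzy-membership idea above can be approximated in scikit-learn by passing per-sample weights to a standard SVM; note this is a simplification — FSVM proper reformulates the optimization problem, whereas `sample_weight` only rescales each sample's penalty. The membership function (distance to the class centroid) and the imbalanced toy data below are assumptions for the sketch.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(4)
# Imbalanced two-class toy data (e.g. 160 majority vs 20 minority samples
# with 5 hypothetical expression features).
X = np.vstack([rng.normal(0, 1, (160, 5)), rng.normal(1.5, 1, (20, 5))])
y = np.array([0] * 160 + [1] * 20)

# Fuzzy-style memberships: larger base weight for the minority class,
# scaled down with distance to the class centroid (outliers count less).
weights = np.ones(len(y))
for cls, w_cls in [(0, 1.0), (1, 8.0)]:
    idx = np.where(y == cls)[0]
    d = np.linalg.norm(X[idx] - X[idx].mean(axis=0), axis=1)
    weights[idx] = w_cls * (1.0 - d / (d.max() + 1e-9))

plain = SVC(kernel="rbf").fit(X, y)
fuzzy = SVC(kernel="rbf").fit(X, y, sample_weight=weights)

recall = lambda clf: clf.predict(X[y == 1]).mean()  # minority-class recall
print("minority recall, plain SVM:", recall(plain))
print("minority recall, weighted SVM:", recall(fuzzy))
```

The weighting pushes the decision boundary toward the majority class, which is exactly the bias correction the abstract attributes to FSVM.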
NASA Technical Reports Server (NTRS)
1978-01-01
The photos show a few of the food products packaged in Alure, a metallized plastic material developed and manufactured by St. Regis Paper Company's Flexible Packaging Division, Dallas, Texas. The material incorporates a metallized film originally developed for space applications. Among the suppliers of the film to St. Regis is King-Seeley Thermos Company, Winchester, Massachusetts. Initially used by NASA as a signal-bouncing reflective coating for the Echo 1 communications satellite, the film was developed by a company later absorbed by King-Seeley. The metallized film was also used as insulating material for components of a number of other spacecraft. St. Regis developed Alure to meet a multiple packaging material need: good eye appeal, product protection for long periods and the ability to be used successfully on a wide variety of food packaging equipment. When the cost of aluminum foil skyrocketed, packagers sought substitute metallized materials, but experiments with a number of them uncovered problems; some were too expensive, some did not adequately protect the product, some were difficult for the machinery to handle. Alure offers a solution. St. Regis created Alure by sandwiching the metallized film between layers of plastics. The resulting laminated metallized material has the superior eye appeal of foil but is less expensive and more easily machined. Alure effectively blocks out light, moisture and oxygen and therefore gives the packaged food long shelf life. A major packaging firm conducted its own tests of the material and confirmed the advantages of machinability and shelf life, adding that it runs faster on machines than materials used in the past and it decreases product waste; the net effect is increased productivity.
Klonoff, David C
2017-07-01
The Internet of Things (IoT) is generating an immense volume of data. With cloud computing, medical sensor and actuator data can be stored and analyzed remotely by distributed servers. The results can then be delivered via the Internet. IoT devices include such wireless diabetes devices as blood glucose monitors, continuous glucose monitors, insulin pens, insulin pumps, and closed-loop systems. The cloud model for data storage and analysis is increasingly unable to process the data avalanche, and processing is being pushed out to the edge of the network, closer to where the data-generating devices are. Fog computing and edge computing are two architectures for data handling that can offload data from the cloud, process it near the patient, and transmit information machine-to-machine or machine-to-human in milliseconds or seconds. Sensor data can be processed near the sensing and actuating devices with fog computing (with local nodes) and with edge computing (within the sensing devices). Compared to cloud computing, fog computing and edge computing offer five advantages: (1) greater data transmission speed, (2) less dependence on limited bandwidths, (3) greater privacy and security, (4) greater control over data generated in foreign countries, where laws may limit use or permit unwanted governmental access, and (5) lower costs, because more sensor-derived data are used locally and less data are transmitted remotely. Connected diabetes devices almost all use fog computing or edge computing, because diabetes patients require a very rapid response to sensor input and cannot tolerate delays for cloud computing.
Karthick, P A; Ghosh, Diptasree Maitra; Ramakrishnan, S
2018-02-01
Surface electromyography (sEMG) based muscle fatigue research is widely preferred in sports science and occupational/rehabilitation studies due to its noninvasiveness. However, these signals are complex, multicomponent and highly nonstationary, with large inter-subject variations, particularly during dynamic contractions. Hence, time-frequency based machine learning methodologies can improve the design of automated systems for these signals. In this work, analyses based on high-resolution time-frequency methods, namely, the Stockwell transform (S-transform), B-distribution (BD) and extended modified B-distribution (EMBD), are proposed to differentiate dynamic muscle nonfatigue and fatigue conditions. The nonfatigue and fatigue segments of sEMG signals recorded from the biceps brachii of 52 healthy volunteers are preprocessed and subjected to the S-transform, BD and EMBD. Twelve features are extracted from each method and prominent features are selected using a genetic algorithm (GA) and binary particle swarm optimization (BPSO). Five machine learning algorithms, namely, naïve Bayes, support vector machine (SVM) with polynomial and radial basis kernels, random forest and rotation forest, are used for the classification. The results show that all the proposed time-frequency distributions (TFDs) are able to show the nonstationary variations of sEMG signals. Most of the features exhibit statistically significant differences between the muscle fatigue and nonfatigue conditions. The maximum number of features (66%) is reduced by GA and BPSO for the EMBD and BD-TFD, respectively. The combination of EMBD and a polynomial-kernel SVM is found to be most accurate (91% accuracy) in classifying the conditions with the features selected using GA. The proposed methods are found to be capable of handling the nonstationary and multicomponent variations of sEMG signals recorded in dynamic fatiguing contractions. In particular, the combination of EMBD and a polynomial-kernel SVM could be used to detect dynamic muscle fatigue conditions.
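The GA-based feature selection mentioned above can be sketched with a minimal bit-string genetic algorithm over synthetic features. The Fisher-style fitness criterion, population sizes and mutation rate below are assumptions for illustration; the study's actual fitness was classifier-driven and its features came from time-frequency distributions.

```python
import numpy as np

rng = np.random.default_rng(5)
# 12 features per segment, as in the paper, but synthetic: only features
# 0-3 actually separate the "fatigue" from the "nonfatigue" class.
n, n_feat = 200, 12
y = np.repeat([0, 1], n // 2)
X = rng.standard_normal((n, n_feat))
X[y == 1, :4] += 1.5

def fitness(mask):
    """Fisher-style class-separation score summed over selected features,
    minus a small penalty per feature to favour compact subsets."""
    if not mask.any():
        return -1.0
    mu0, mu1 = X[y == 0][:, mask].mean(0), X[y == 1][:, mask].mean(0)
    var = X[:, mask].var(0) + 1e-9
    return ((mu0 - mu1) ** 2 / var).sum() - 0.1 * mask.sum()

pop = rng.random((30, n_feat)) < 0.5        # 30 random bit-string chromosomes
for _ in range(40):                         # 40 generations
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]            # selection
    children = []
    for _ in range(len(pop)):
        a, b = parents[rng.integers(10, size=2)]
        cut = rng.integers(1, n_feat)                  # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(n_feat) < 0.05               # bit-flip mutation
        children.append(child ^ flip)
    pop = np.array(children)

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected features:", np.flatnonzero(best))
```

BPSO, the other selector the study used, explores the same bit-string space but updates candidate masks with particle velocities instead of crossover and mutation.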
Tremonte, Patrizio; Tipaldi, Luca; Succi, Mariantonietta; Pannella, Gianfranco; Falasca, Luisa; Capilongo, Valeria; Coppola, Raffaele; Sorrentino, Elena
2014-01-01
In Italy, the sale of raw milk from vending machines has been allowed since 2004. Boiling treatment before its use is mandatory for the consumer, because the raw milk could be an important source of foodborne pathogens. This study fits into this context with the aim to evaluate the microbiological quality of 30 raw milk samples periodically collected (March 2013 to July 2013) from 3 vending machines located in Molise, a region of southern Italy. Milk samples were stored for 72 h at 4 °C and then subjected to different treatments, such as boiling and microwaving, to simulate domestic handling. The results show that all the raw milk samples examined immediately after their collection were affected by high microbial loads, with values very close to or even greater than those acceptable by Italian law. The microbial populations increased during refrigeration, reaching after 72 h values of about 8.0 log cfu/mL for Pseudomonas spp., 6.5 log cfu/mL for yeasts, and up to 4.0 log cfu/mL for Enterobacteriaceae. Boiling treatment, applied after 72 h to refrigerated milk samples, caused complete decontamination, but negatively affected the nutritional quality of the milk, as demonstrated by a drastic reduction of whey proteins. The microwave treatment at 900 W for 75 s produced microbiological decontamination similar to that of boiling, preserving the content in whey proteins of milk. The microbiological characteristics of raw milk observed in this study fully justify the obligation to boil the raw milk from vending machines before consumption. However, this study also showed that domestic boiling causes a drastic reduction in the nutritional value of milk. Microwave treatment could represent a good alternative to boiling, on the condition that the process variables are standardized for safe domestic application.
NASA Astrophysics Data System (ADS)
de Garidel-Thoron, T.; Marchant, R.; Soto, E.; Gally, Y.; Beaufort, L.; Bolton, C. T.; Bouslama, M.; Licari, L.; Mazur, J. C.; Brutti, J. M.; Norsa, F.
2017-12-01
Foraminifera tests are the main proxy carriers for paleoceanographic reconstructions. Both geochemical and taxonomical studies require large numbers of tests to achieve statistical relevance. To date, the extraction of foraminifera from the sediment coarse fraction is still done by hand and is thus time-consuming. Moreover, the recognition of ecologically relevant morphotypes requires taxonomical skills not easily taught. The automatic recognition and extraction of foraminifera would largely help paleoceanographers to overcome these issues. Recent advances in automatic image classification using machine learning open the way to automatic extraction of foraminifera. Here we detail progress on the design of an automatic picking machine as part of the FIRST project. The machine handles 30 pre-sieved samples (100-1000 µm), separating them into individual particles (including foraminifera) and imaging each in pseudo-3D. The particles are classified and specimens of interest are sorted either for Individual Foraminifera Analyses (44 per slide) or for classical multiple analyses (8 morphological classes per slide, up to 1000 individuals per hole). The classification is based on machine learning using Convolutional Neural Networks (CNNs), similar to the approach used in the coccolithophorid imaging system SYRACO. To prove its feasibility, we built two training image datasets of modern planktonic foraminifera containing approximately 2000 and 5000 images, corresponding to 15 and 25 morphological classes, respectively. Using a CNN with a residual topology (ResNet) we achieve over 95% correct classification for each dataset. We tested the network on 160,000 images from 45 depths of a sediment core from the Pacific Ocean, for which we have human counts. The current algorithm is able to reproduce the downcore variability in both Globigerinoides ruber and the fragmentation index (r2 = 0.58 and 0.88, respectively).
The FIRST prototype yields some promising results for high-resolution paleoceanographic studies and evolutionary studies.
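The fragmentation index tracked downcore can be computed directly from raw particle counts. The abstract does not give its formula, so the sketch below uses one common convention (fragments over fragments plus whole tests); the function name and example counts are illustrative, not from the paper.

```python
def fragmentation_index(fragments, whole_tests):
    """One common definition of the foraminiferal fragmentation index:
    the fraction of counted particles that are test fragments."""
    total = fragments + whole_tests
    if total == 0:
        return 0.0
    return fragments / total

# Hypothetical downcore sample: 120 fragments, 380 whole tests
fi = fragmentation_index(120, 380)  # 0.24
```

Automated counts from the classifier could feed such a function per depth level, replacing manual tallies.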
Hudson, Mary Anne
2005-01-01
On June 17, 2005, Texas Governor Rick Perry (R) signed into law Senate Bill 1525, making Texas the first state in the nation to require hospitals and nursing homes to implement safe patient handling and movement programs. Governor Perry is to be commended for this heroic first stand for safe patient handling in America. The landmark legislation will take effect January 1, 2006, requiring the establishment of policy to identify, assess, and develop methods of controlling the risk of injury to patients and nurses associated with lifting, transferring, repositioning, and movement of patients; evaluation of alternative methods from manual lifting to reduce the risk of injury from patient lifting, including equipment and patient care environment; restricting, to the extent feasible with existing equipment, manual handling of all or most of a patient's weight to emergency, life-threatening, or exceptional circumstances; and provision for refusal to perform patient handling tasks believed to involve unacceptable risks of injury to a patient or nurse. Manually lifting patients has been called deplorable, inefficient, dangerous to nurses, and painful and brutal to patients; manual lifting can cause needless suffering and injury to patients, with dangers including pain, bruising, skin tears, abrasions, tube dislodgement, dislocations, fractures, and being dropped by nursing staff during attempts to manually lift. Use of safe, secure, mechanical lift equipment and gentle friction-reducing devices for patient maneuvering tasks could eliminate such needless brutality. Research has proven that manual patient lifting is extremely hazardous to health-care workers, creating substantial risk of low-back injury, whether with one or two patient handlers.
Studies on the use of mechanical patient lift equipment, by either nursing staff or lift teams, have proven repeatedly that most nursing staff back injury is preventable, leading to substantial savings to employers on medical and compensation costs. Because the health-care industry has relied on people to do the work of machines, nursing work remains the most dangerous occupation for disabling back injury. Back injury from patient lifting may be the single largest contributor to the nursing shortage, with perhaps 12% of nurses leaving or being terminated because of back injury. The US health-care industry has not kept pace with other industries, which provide mechanical lift equipment for lifting loads equivalent to the weight of patients, or with other countries, such as Australia and England, which are more advanced in their use of modern technology for patient lifting and with no-lifting practices in compliance with government regulations and nursing policies banning manual lifting. With Texas being the first state to succeed in passing legislation for safe patient handling, other states are working toward legislative protection against injury with manual patient lifting. California re-introduced safe patient handling legislation on February 17, 2005, with CA SB 363, Hospitals: Lift Teams, following the September 22, 2004, veto of CA AB 2532 by Governor Arnold Schwarzenegger, who said he believes existing statutory protection and workplace safety standards are sufficient to protect health care workers from injury. Massachusetts HB 2662, Relating to Safe Patient Handling in Certain Health Facilities, was introduced December 1, 2004. Ohio HB 67, signed March 21, 2005 by Governor Bob Taft (R), creates a program for interest-free loans to nursing homes for implementation of a no-manual-lift program. 
New York companion bills AB 7641 and SB 4029 were introduced in April, 2005, calling for creation of a 2-year study to establish safe patient handling programs and collect data on nursing staff and patient injury with manual patient handling versus lift equipment, to determine best practices for improving health and safety of health-care workers and patients during patient handling. Washington State is planning re-introduction of safe patient handling legislation, after WA HB 1672, Relating to reducing injuries among patients and health care workers, was stalled in committee in February, 2005. Language from these state initiatives may be used as models to assist other states with drafting safe patient handling legislation. Rapid enactment of a federal mandate for Safe Patient Handling No Manual Lift is essential and anticipated.
The NOAO NEWFIRM Data Handling System
NASA Astrophysics Data System (ADS)
Zárate, N.; Fitzpatrick, M.
2008-08-01
The NOAO Extremely Wide-Field IR Mosaic (NEWFIRM) is a new 1-2.4 micron IR camera that is now being commissioned for the 4m Mayall telescope at Kitt Peak. The focal plane consists of a 2x2 mosaic of 2048x2048 arrays offering a field-of-view of 27.6' on a side. The use of dual MONSOON array controllers permits very fast readout, and a scripting interface allows for highly efficient observing modes. We describe the Data Handling System (DHS) for the NEWFIRM camera, which is designed to meet the performance requirements of the instrument as well as the observing environment in which it operates. It is responsible for receiving the data stream from the detector and instrument software, rectifying the image geometry, presenting a real-time display of the image to the user, final assembly of a science-grade image with complete headers, as well as triggering automated pipeline and archival functions. The DHS uses an event-based messaging system to control multiple processes on a distributed network of machines. The asynchronous nature of this processing means the DHS operates independently from the camera readout, and the design of the system is inherently scalable to larger focal planes that use a greater number of array controllers. Current status and future plans for the DHS are also discussed.
Material Processing with High Power CO2-Lasers
NASA Astrophysics Data System (ADS)
Bakowsky, Lothar
1986-10-01
After a period of research and development, laser technique is now regarded as an important instrument for flexible, economical and fully automatic manufacturing. In particular, cutting of flat metal sheets with high power CO2-lasers and CNC-controlled two- or three-axis handling systems is a widespread application. Three-dimensional laser cutting, laser welding and laser heat treatment are just at the beginning of industrial use in production lines. The main advantages of laser technology are high accuracy, high processing velocity, low thermal distortion, and no tool abrasion. The market for laser material processing systems had a volume of $300 million in 1985, with growth rates between 20% and 30%. The topic of this lecture is high power CO2-lasers. Besides these systems, two others are used as machining tools: Nd:YAG and excimer lasers. All applications of high power CO2-lasers to industrial material processing show that high processing velocity and quality are only guaranteed in the case of a stable intensity profile on the workpiece. This is only achieved by laser systems without any power and mode fluctuations and by handling systems of high accuracy. Two applications in the automotive industry are described below as examples for laser cutting and laser welding of special cylindrical motor parts.
Nakano, Takashi; Otsuka, Makoto; Yoshimoto, Junichiro; Doya, Kenji
2015-01-01
A theoretical framework of reinforcement learning plays an important role in understanding action selection in animals. Spiking neural networks provide a theoretically grounded means to test computational hypotheses on neurally plausible algorithms of reinforcement learning through numerical simulation. However, most of these models cannot handle observations which are noisy, or occurred in the past, even though these are inevitable and constraining features of learning in real environments. This class of problem is formally known as partially observable reinforcement learning (PORL) problems. It provides a generalization of reinforcement learning to partially observable domains. In addition, observations in the real world tend to be rich and high-dimensional. In this work, we use a spiking neural network model to approximate the free energy of a restricted Boltzmann machine and apply it to the solution of PORL problems with high-dimensional observations. Our spiking network model solves maze tasks with perceptually ambiguous high-dimensional observations without knowledge of the true environment. An extended model with working memory also solves history-dependent tasks. The way spiking neural networks handle PORL problems may provide a glimpse into the underlying laws of neural information processing which can only be discovered through such a top-down approach.
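The abstract describes approximating the free energy of a restricted Boltzmann machine with a spiking network. As background, a minimal pure-Python sketch of the standard binary-RBM free energy (the quantity being approximated, not the authors' spiking implementation) is:

```python
import math

def rbm_free_energy(v, b, c, W):
    """Free energy of a binary restricted Boltzmann machine:
    F(v) = -b.v - sum_j log(1 + exp(c_j + W_j . v)),
    with visible vector v, visible biases b, hidden biases c,
    and W given as a list of rows, one per hidden unit."""
    visible_term = sum(bi * vi for bi, vi in zip(b, v))
    hidden_term = 0.0
    for cj, Wj in zip(c, W):
        pre = cj + sum(w * vi for w, vi in zip(Wj, v))
        # softplus of the hidden pre-activation, log(1 + exp(pre))
        hidden_term += math.log1p(math.exp(pre))
    return -visible_term - hidden_term
```

In free-energy-based reinforcement learning, -F(v) over state-action encodings plays the role of an action value, which is why a network able to estimate F can support action selection.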
Switchable adhesion for wafer-handling based on dielectric elastomer stack transducers
NASA Astrophysics Data System (ADS)
Grotepaß, T.; Butz, J.; Förster-Zügel, F.; Schlaak, H. F.
2016-04-01
Vacuum grippers are often used for the handling of wafers and small devices. In order to evacuate the gripper, a gas flow is created that can harm the micro structures on the wafer. A promising alternative to vacuum grippers could be adhesive grippers with switchable adhesion. There have been some publications of gecko-inspired adhesive devices. Most of these former works consist of a structured surface which adheres to the object manipulated and an actuator for switching the adhesion. Until now different actuator principles have been investigated, like smart memory alloys and pneumatics. In this work for the first time dielectric elastomer stack transducers (DEST) are combined with a structured surface. DESTs are a promising new transducer technology with many applications in different industry sectors like medical devices, human-machine-interaction and soft robotics. Stacked dielectric elastomer transducers show thickness contraction originating from the electromechanical pressure of two compliant electrodes compressing an elastomeric dielectric when a voltage is applied. Since DESTs and the adhesive surfaces previously described are made of elastomers, it is self-evident to combine both systems in one device. The DESTs are fabricated by a spin coating process. If the flat surface of the spinning carrier is substituted for example by a perforated one, the structured elastomer surface and the DEST can be fabricated in one process. By electrical actuation the DEST contracts and laterally expands which causes the gecko-like cilia to adhere on the object to manipulate. This work describes the assembly and the experimental results of such a device using switchable adhesion. It is intended to be used for the handling of glass wafers.
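The thickness contraction described above is driven by the electromechanical (Maxwell) pressure of the two compliant electrodes. A hedged numeric sketch of that pressure follows; the voltage, layer thickness, and relative permittivity are hypothetical illustration values, not taken from the paper.

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def maxwell_pressure(voltage, thickness, eps_r):
    """Electrostatic pressure compressing one dielectric layer of a
    dielectric elastomer transducer: p = eps0 * eps_r * (U/d)^2 [Pa]."""
    field = voltage / thickness  # electric field across the layer, V/m
    return EPS0 * eps_r * field ** 2

# Hypothetical values: 1 kV across a 40 um silicone layer, eps_r = 2.8
p = maxwell_pressure(1000.0, 40e-6, 2.8)  # on the order of 15 kPa
```

The quadratic dependence on U/d is why stacked designs with many thin layers reach useful contraction at moderate voltages.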
A Navier-Stokes Chimera Code on the Connection Machine CM-5: Design and Performance
NASA Technical Reports Server (NTRS)
Jespersen, Dennis C.; Levit, Creon; Kwak, Dochan (Technical Monitor)
1994-01-01
We have implemented a three-dimensional compressible Navier-Stokes code on the Connection Machine CM-5. The code is set up for implicit time-stepping on single or multiple structured grids. For multiple grids and geometrically complex problems, we follow the 'chimera' approach, where flow data on one zone is interpolated onto another in the region of overlap. We will describe our design philosophy and give some timing results for the current code. A parallel machine like the CM-5 is well-suited for finite-difference methods on structured grids. The regular pattern of connections of a structured mesh maps well onto the architecture of the machine. So the first design choice, finite differences on a structured mesh, is natural. We use centered differences in space, with added artificial dissipation terms. When numerically solving the Navier-Stokes equations, there are liable to be some mesh cells near a solid body that are small in at least one direction. This mesh cell geometry can impose a very severe CFL (Courant-Friedrichs-Lewy) condition on the time step for explicit time-stepping methods. Thus, though explicit time-stepping is well-suited to the architecture of the machine, we have adopted implicit time-stepping. We have further taken the approximate factorization approach. This creates the need to solve large banded linear systems and creates the first possible barrier to an efficient algorithm. To overcome this first possible barrier we have considered two options. The first is just to solve the banded linear systems with data spread over the whole machine, using whatever fast method is available. This option is adequate for solving scalar tridiagonal systems, but for scalar pentadiagonal or block tridiagonal systems it is somewhat slower than desired. The second option is to 'transpose' the flow and geometry variables as part of the time-stepping process: Start with x-lines of data in-processor. 
Form explicit terms in x, then transpose so y-lines of data are in-processor. Form explicit terms in y, then transpose so z-lines are in processor. Form explicit terms in z, then solve linear systems in the z-direction. Transpose to the y-direction, then solve linear systems in the y-direction. Finally transpose to the x direction and solve linear systems in the x-direction. This strategy avoids inter-processor communication when differencing and solving linear systems, but requires a large amount of communication when doing the transposes. The transpose method is more efficient than the non-transpose strategy when dealing with scalar pentadiagonal or block tridiagonal systems. For handling geometrically complex problems the chimera strategy was adopted. For multiple zone cases we compute on each zone sequentially (using the whole parallel machine), then send the chimera interpolation data to a distributed data structure (array) laid out over the whole machine. This information transfer implies an irregular communication pattern, and is the second possible barrier to an efficient algorithm. We have implemented these ideas on the CM-5 using CMF (Connection Machine Fortran), a data parallel language which combines elements of Fortran 90 and certain extensions, and which bears a strong similarity to High Performance Fortran. We make use of the Connection Machine Scientific Software Library (CMSSL) for the linear solver and array transpose operations.
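The scalar tridiagonal solves at the heart of the approximate-factorization scheme can be illustrated with the classical Thomas algorithm. The serial Python sketch below shows the method itself; it is not the CMSSL parallel solver the paper uses, and the array layout is simplified.

```python
def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system by the Thomas algorithm.
    a: sub-diagonal (a[0] unused), b: main diagonal,
    c: super-diagonal (c[-1] unused), d: right-hand side."""
    n = len(b)
    cp = [0.0] * n  # modified super-diagonal
    dp = [0.0] * n  # modified right-hand side
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    # forward elimination
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    # back substitution
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# [[2,1,0],[1,2,1],[0,1,2]] x = [3,4,3]  ->  x = [1,1,1]
x = thomas_solve([0.0, 1.0, 1.0], [2.0, 2.0, 2.0], [1.0, 1.0, 0.0], [3.0, 4.0, 3.0])
```

The algorithm is inherently sequential along a line, which is exactly why the transpose strategy above keeps each line's data within a single processor before solving.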