Sample records for PUREX-based process

  1. Project C-018H, 242-A Evaporator/PUREX Plant Process Condensate Treatment Facility, functional design criteria. Revision 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sullivan, N.

    1995-05-02

    This document provides the Functional Design Criteria (FDC) for Project C-018H, the 242-A Evaporator and Plutonium-Uranium Extraction (PUREX) Plant Condensate Treatment Facility (also referred to as the 200 Area Effluent Treatment Facility [ETF]). The project will provide the facilities to treat and dispose of the 242-A Evaporator process condensate (PC), the PUREX Plant process condensate (PDD), and the PUREX Plant ammonia scrubber distillate (ASD).

  2. Chemical interaction matrix between reagents in a Purex based process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brahman, R.K.; Hennessy, W.P.; Paviet-Hartmann, P.

    2008-07-01

    The United States Department of Energy (DOE) is the entity responsible for the disposal of the United States' excess weapons-grade plutonium. DOE selected a PUREX-based process to convert plutonium to low-enriched mixed oxide fuel for use in commercial nuclear power plants. To initiate this process in the United States, a Mixed Oxide (MOX) Fuel Fabrication Facility (MFFF) is under construction and will be operated by Shaw AREVA MOX Services at the Savannah River Site. This facility will be licensed and regulated by the U.S. Nuclear Regulatory Commission (NRC). A PUREX process, similar to the one used at La Hague, France, will purify plutonium feedstock through solvent extraction. MFFF employs two major process operations to manufacture MOX fuel assemblies: (1) the Aqueous Polishing (AP) process, which removes gallium and other impurities from plutonium feedstock, and (2) the MOX fuel fabrication process (MP), which processes the oxides into pellets and manufactures the MOX fuel assemblies. The AP process consists of three major steps, dissolution, purification, and conversion, and is the center of the primary chemical processing. A study of process hazard controls has been initiated that will provide knowledge of, and protection against, the chemical risks associated with mixing of reagents over the lifetime of the process. This paper presents a comprehensive chemical interaction matrix evaluation for the reagents used in the PUREX-based process. The chemical interaction matrix supplements the process conditions by providing a checklist of potential inadvertent chemical reactions that may take place. It also identifies the chemical compatibility or incompatibility of the reagents if mixed through failure of operations or equipment within the process itself, or mixed inadvertently by a technician in the laboratories. (authors)

  3. 10 CFR Appendix I to Part 110 - Illustrative List of Reprocessing Plant Components Under NRC Export Licensing Authority

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... transuranic elements. Different technical processes can accomplish this separation. However, over the years Purex has become the most commonly used and accepted process. Purex involves the dissolution of... facilities have process functions similar to each other, including: irradiated fuel element chopping, fuel...

  4. Method of separating and recovering uranium and related cations from spent Purex-type systems

    DOEpatents

    Mailen, J.C.; Tallent, O.K.

    1987-02-25

    A process is described for separating uranium and related cations from a spent Purex-type solvent extraction system containing degradation complexes of tributyl phosphate, wherein the system is subjected to an ion-exchange process prior to a sodium carbonate scrubbing step. A further embodiment comprises recovery of the separated uranium and related cations. 5 figs.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hohimer, J.P.

    The use of laser-based analytical methods in nuclear-fuel processing plants is considered. The species and locations for accountability, process control, and effluent control measurements in the Coprocessing, Thorex, and reference Purex fuel processing operations are identified and the conventional analytical methods used for these measurements are summarized. The laser analytical methods based upon Raman, absorption, fluorescence, and nonlinear spectroscopy are reviewed and evaluated for their use in fuel processing plants. After a comparison of the capabilities of the laser-based and conventional analytical methods, the promising areas of application of the laser-based methods in fuel processing plants are identified.

  6. Separation of uranium from technetium in recovery of spent nuclear fuel

    NASA Astrophysics Data System (ADS)

    Friedman, H. A.

    1984-06-01

    A method for decontaminating uranium product from the Purex process is described. Hydrazine is added to the product uranyl nitrate stream from the Purex process, which contains hexavalent uranium (UO2(2+)) and heptavalent technetium (TcO4(-)). Technetium in the product stream is reduced and then complexed by the addition of oxalic acid (H2C2O4), and the Tc-oxalate complex is readily separated from the uranium by solvent extraction with 30 vol % tributyl phosphate in n-dodecane.

  7. Separation of uranium from technetium in recovery of spent nuclear fuel

    DOEpatents

    Friedman, H.A.

    1984-06-13

    A method for decontaminating uranium product from the Purex process comprises addition of hydrazine to the product uranyl nitrate stream from the Purex process, which contains hexavalent uranium (UO2(2+)) and heptavalent technetium (TcO4(-)). Technetium in the product stream is reduced and then complexed by the addition of oxalic acid (H2C2O4), and the Tc-oxalate complex is readily separated from the uranium by solvent extraction with 30 vol % tributyl phosphate in n-dodecane.

  8. Separation of uranium from technetium in recovery of spent nuclear fuel

    DOEpatents

    Friedman, Horace A.

    1985-01-01

    A method for decontaminating uranium product from the Purex process comprises addition of hydrazine to the product uranyl nitrate stream from the Purex process, which contains hexavalent uranium (UO2(2+)) and heptavalent technetium (TcO4(-)). Technetium in the product stream is reduced and then complexed by the addition of oxalic acid (H2C2O4), and the Tc-oxalate complex is readily separated from the uranium by solvent extraction with 30 vol. % tributyl phosphate in n-dodecane.

  9. Separation of uranium from technetium in recovery of spent nuclear fuel

    DOEpatents

    Pruett, D.J.; McTaggart, D.R.

    1983-08-31

    Uranium and technetium in the product stream of the Purex process for recovery of uranium in spent nuclear fuel are separated by (1) contacting the aqueous Purex product stream with hydrazine to reduce Tc(VII) therein to a reduced species, and (2) contacting said aqueous stream with an organic phase containing tributyl phosphate and an organic diluent to extract uranium from said aqueous stream into said organic phase.

  10. Separation of uranium from technetium in recovery of spent nuclear fuel

    DOEpatents

    Pruett, David J.; McTaggart, Donald R.

    1984-01-01

    Uranium and technetium in the product stream of the Purex process for recovery of uranium in spent nuclear fuel are separated by (1) contacting the aqueous Purex product stream with hydrazine to reduce Tc(VII) therein to a reduced species, and (2) contacting said aqueous stream with an organic phase containing tributyl phosphate and an organic diluent to extract uranium from said aqueous stream into said organic phase.
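    The single-contact TBP extraction these patents rely on can be summarized by a one-stage distribution-ratio mass balance. A minimal sketch in Python; the distribution ratios and phase volumes below are hypothetical placeholders, not values stated in the patents:

```python
# Illustrative single-stage solvent-extraction mass balance.
# D, v_org, and v_aq are hypothetical placeholders, not values
# taken from the patents.

def fraction_extracted(D, v_org, v_aq):
    """Fraction of a solute moved to the organic phase in one
    equilibrium contact, where D = [solute]_org / [solute]_aq."""
    return D * v_org / (D * v_org + v_aq)

# U(VI) extracts strongly into 30 vol % TBP (large D); reduced,
# complexed Tc stays in the aqueous phase (small D).
f_u = fraction_extracted(D=20.0, v_org=1.0, v_aq=1.0)
f_tc = fraction_extracted(D=0.01, v_org=1.0, v_aq=1.0)
print(f"U extracted: {f_u:.1%}, Tc extracted: {f_tc:.1%}")
```

    With equal phase volumes, a large D sends nearly all the uranium to the organic phase while the reduced technetium is left behind, which is the separation principle the reduction/complexation step enables.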

  11. The application of N,N-dimethyl-3-oxa-glutaramic acid (DOGA) in the PUREX process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jianchen, Wang; Jing, Chen

    2007-07-01

    A new salt-free complexant, DOGA, for separating trace Pu(IV) and Np(IV) from U(VI) nitric acid solution was studied. DOGA complexes Pu(IV) and Np(IV) strongly, but its complexing ability toward U(VI) is weak, so it can be used in the PUREX process to separate Pu(IV) and Np(IV) from U(VI) nitric acid solution. On the one hand, U(VI) in nitric acid solution containing trace Pu(IV) and Np(IV) was extracted by 30% TBP-kerosene (v/v) in the presence of DOGA, while Pu(IV) and Np(IV) were kept in the aqueous phase. On the other hand, Pu(IV) and Np(IV) loaded in 30% TBP-kerosene were effectively stripped by DOGA into the aqueous phase, while U(VI) remained in the 30% TBP-kerosene. DOGA is a promising complexant for separating Pu(IV) and Np(IV) from U(VI) solution in the U-cycle of the PUREX process. (authors)

  12. Method for extracting lanthanides and actinides from acid solutions by modification of Purex solvent

    DOEpatents

    Horwitz, E.P.; Kalina, D.G.

    1984-05-21

    A process has been developed for the extraction of multivalent lanthanide and actinide values from acidic waste solutions, and for the separation of these values from fission product and other values, which utilizes a new series of neutral bi-functional extractants, the alkyl(phenyl)-N,N-dialkylcarbamoylmethylphosphine oxides, in combination with a phase modifier to form an extraction solution. The addition of the extractant to the Purex process extractant, tri-n-butylphosphate in normal paraffin hydrocarbon diluent, will permit the extraction of multivalent lanthanide and actinide values from 0.1 to 12.0 molar acid solutions.

  13. Fundamental Chemical Kinetic And Thermodynamic Data For Purex Process Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, R.J.; Fox, O.D.; Sarsfield, M.J.

    2007-07-01

    To support either the continued operation of current reprocessing plants or the development of future fuel processing using hydrometallurgical processes, such as Advanced Purex or UREX type flowsheets, accurate simulation of Purex solvent extraction is required. In recent years we have developed advanced process modeling capabilities that utilize modern software platforms such as Aspen Custom Modeler and can be run in steady-state and dynamic simulations. However, such advanced models of the Purex process require a wide range of fundamental data, including all relevant basic chemical kinetic and thermodynamic data for the major species present in the process. This paper summarizes some of the recent process chemistry studies that underpin our simulation, design, and testing of Purex solvent extraction flowsheets. Whilst much kinetic data for actinide redox reactions in nitric acid exists in the literature, data on reactions in the diluted TBP solvent phase are much rarer. This inhibits accurate modeling of the Purex process, particularly when species show significant extractability into the solvent phase or when cycling between solvent and aqueous phases occurs, for example in the reductive stripping of Pu(IV) by ferrous sulfamate in the Magnox reprocessing plant. To support current oxide reprocessing, we have investigated a range of solvent phase reactions: U(IV) + HNO3; U(IV) + HNO2; U(IV) + HNO3 (Pu catalysis); U(IV) + HNO3 (Tc catalysis); U(IV) + Np(VI); U(IV) + Np(V); Np(IV) + HNO3; and Np(V) + Np(V). Rate equations have been determined for all these reactions, and kinetic rate constants and activation energies are now available. Specific features of these reactions in the TBP phase include the roles of water and hydrolyzed intermediates in the reaction mechanisms.
    In reactions involving Np(V), cation-cation complex formation, which is much more favourable in TBP than in HNO3, also occurs and complicates the redox chemistry. Whilst some features of the redox chemistry in TBP appear similar to the corresponding reactions in aqueous HNO3, there are notable differences in the rates, the forms of the rate equations, and the mechanisms. Secondly, to underpin the development of advanced single-cycle flowsheets using the complexant acetohydroxamic acid (AHA), we have also characterised its redox chemistry and solvent extraction behaviour with both Np and Pu ions in some detail. We find that simple hydroxamic acids are remarkably rapid reducing agents for Np(VI). They also reduce Pu(VI) and cause a much slower reduction of Pu(IV) through a complex mechanism involving acid hydrolysis of the ligand. AHA is a strongly hydrophilic and selective complexant for the tetravalent actinide ions, as evidenced by stability constant and solvent extraction data for An(IV), M(III), and U(VI) ions. This has allowed the successful design of U/Pu+Np separation flowsheets suitable for advanced fuel cycles. (authors)
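    Kinetic rate constants and activation energies of the kind this abstract describes enter process models through the Arrhenius equation. A minimal sketch of that temperature-scaling step; the numerical values of k_ref and Ea are illustrative, not data from the paper:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def k_at(k_ref, Ea, T_ref, T):
    """Arrhenius scaling of a rate constant measured at T_ref to a
    new temperature T: k(T) = k_ref * exp(-(Ea/R) * (1/T - 1/T_ref))."""
    return k_ref * math.exp(-(Ea / R) * (1.0 / T - 1.0 / T_ref))

# Hypothetical numbers: a rate constant of 1e-3 (arbitrary units)
# at 25 C with a 60 kJ/mol activation energy, scaled to 50 C.
k_50C = k_at(k_ref=1.0e-3, Ea=60.0e3, T_ref=298.15, T=323.15)
print(f"k at 50 C: {k_50C:.2e}")  # ~6.5e-03
```

    A 60 kJ/mol barrier roughly sextuples the rate over a 25 C increase, which is why activation energies are as important to flowsheet simulation as the rate constants themselves.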

  14. Literature Review: Crud Formation at the Liquid/Liquid Interface of TBP-Based Solvent-Extraction Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delegard, Calvin H.; Casella, Amanda J.

    2016-09-30

    This report summarizes the literature reviewed on crud formation at the liquid:liquid interface of solvent extraction processes. The review is focused both on classic PUREX extraction for industrial reprocessing, especially as practiced at the Hanford Site, and for those steps specific to plutonium purification that were used at the Plutonium Reclamation Facility (PRF) within the Plutonium Finishing Plant (PFP) at the Hanford Site.

  15. Spectroscopic methods of process monitoring for safeguards of used nuclear fuel separations

    NASA Astrophysics Data System (ADS)

    Warburton, Jamie Lee

    To support the demonstration of a more proliferation-resistant nuclear fuel processing plant, techniques and instrumentation must be developed to allow the real-time, online determination of special nuclear material concentrations in-process. An ideal materials accountability technique for proliferation resistance should provide nondestructive, real-time, online information on metal and ligand concentrations in separations streams without perturbing the process. UV-Visible spectroscopy can be adapted for precisely this purpose in solvent extraction-based separations. The primary goal of this project is to understand fundamental URanium EXtraction (UREX) and Plutonium-URanium EXtraction (PUREX) reprocessing chemistry and the corresponding UV-Visible spectroscopy for application in process monitoring for safeguards. By evaluating the impact of process conditions, such as acid concentration, metal concentration, and flow rate, on the sensitivity of the UV-Visible detection system, the process-monitoring concept is developed from an advanced application of fundamental spectroscopy. Systematic benchtop-scale studies investigated systems relevant to UREX- or PUREX-type reprocessing, encompassing 0.01-1.26 M U and 0.01-8 M HNO3. A laboratory-scale TRansUranic Extraction (TRUEX) demonstration was performed and used both to analyze potential online monitoring opportunities in the TRUEX process and to provide the foundation for building and demonstrating a laboratory-scale UREX demonstration. The secondary goal of the project is to simulate a diversion scenario in UREX and successfully detect changes in metal concentration and solution chemistry in a counter-current contactor system with a UV-Visible spectroscopic process monitor. UREX uses the same basic solvent extraction flowsheet as PUREX, but has a lower acid concentration throughout and adds acetohydroxamic acid (AHA) as a complexant/reductant to the feed solution to prevent the extraction of Pu.
    By examining UV-Visible spectra gathered in real time, the objective is to detect the conversion from the UREX process, which does not separate Pu, to the PUREX process, which yields a purified Pu product. The change in process chemistry can be detected in the feed solution, aqueous product, or raffinate stream by identifying the acid concentration, metal distribution, and the presence or absence of AHA. A fiber-optic dip probe for UV-Visible spectroscopy was integrated into a bank of three counter-current centrifugal contactors to demonstrate the online process monitoring concept. Nd, Fe, and Zr were added to the uranyl nitrate system to explore spectroscopic interferences and identify additional species as candidates for online monitoring. This milestone demonstrates the potential of the technique, which lies in the ability to simultaneously and directly monitor the chemical process conditions in a reprocessing plant, providing inspectors with another tool to detect nuclear material diversion attempts. Lastly, dry processing of used nuclear fuel is often used as a head-end step before solvent extraction-based separations such as UREX or TRUEX. A non-aqueous process, dry processing of used fuel generally includes chopping of used fuel rods followed by repeated oxidation-reduction cycles and physical separation of the used fuel from the cladding. Thus, dry processing techniques are investigated and opportunities for online monitoring are proposed for continuation of this work in future studies.
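    In the simplest single-band case, the UV-Visible concentration measurement underlying this process-monitoring work reduces to the Beer-Lambert law, c = A / (eps * l). A minimal sketch; the molar absorptivity below is a hypothetical placeholder, not a calibrated value for uranyl nitrate:

```python
# Beer-Lambert concentration estimate from a measured absorbance.
# eps below is a hypothetical placeholder, not a calibrated
# molar absorptivity for any real uranyl band.

def concentration(absorbance, eps, path_cm=1.0):
    """Concentration (mol/L) from absorbance A via c = A / (eps * l)."""
    return absorbance / (eps * path_cm)

c_u = concentration(absorbance=0.40, eps=8.0)  # -> 0.05 mol/L
print(f"estimated concentration: {c_u:.3f} M")
```

    In practice, overlapping bands from Nd, Fe, Zr, and acid-dependent band shifts mean a multivariate calibration replaces this single-band inversion, but the underlying relation is the same.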

  16. Method for extracting lanthanides and actinides from acid solutions by modification of purex solvent

    DOEpatents

    Horwitz, E. Philip; Kalina, Dale G.

    1986-01-01

    A process for the recovery of actinide and lanthanide values from aqueous solutions with an extraction solution containing an organic extractant having the formula ##STR1## where phi is phenyl, R1 is a straight or branched alkyl or alkoxyalkyl containing from 6 to 12 carbon atoms, and R2 is an alkyl containing from 3 to 6 carbon atoms, and phase modifiers in a water-immiscible hydrocarbon diluent. The addition of the extractant to the Purex process extractant, tri-n-butylphosphate in normal paraffin hydrocarbon diluent, will permit the extraction of multivalent lanthanide and actinide values from 0.1 to 12.0 molar acid solutions.

  17. Chemical Processing Department monthly report, October 1962

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1962-11-21

    This report from the Chemical Processing Department at HAPO for October 1962 discusses the following: production operation; Purex and Redox operation; finished products operation; maintenance; financial operations; facilities engineering; research; employee relations; and weapons manufacturing operation.

  18. Method for extracting lanthanides and actinides from acid solutions by modification of Purex solvent

    DOEpatents

    Horwitz, E.P.; Kalina, D.G.

    1986-03-04

    A process is described for the recovery of actinide and lanthanide values from aqueous solutions with an extraction solution containing an organic extractant having the formula shown in a diagram, where phi is phenyl, R1 is a straight or branched alkyl or alkoxyalkyl containing from 6 to 12 carbon atoms, and R2 is an alkyl containing from 3 to 6 carbon atoms, and phase modifiers in a water-immiscible hydrocarbon diluent. The addition of the extractant to the Purex process extractant, tri-n-butylphosphate in normal paraffin hydrocarbon diluent, will permit the extraction of multivalent lanthanide and actinide values from 0.1 to 12.0 molar acid solutions. 6 figs.

  19. Nuclear and chemical safety analysis: Purex Plant 1970 thorium campaign

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boldt, A.L.; Oberg, G.C.

    The purpose of this document is to discuss the flowsheet and the related processing equipment with respect to nuclear and chemical safety. The analyses presented are based on equipment utilization and revised piping as outlined in the design criteria. Processing of thorium and uranium-233 in the Purex Plant can be accomplished within currently accepted levels of risk with respect to chemical and nuclear safety if minor instrumentation changes are made. Uranium-233 processing is limited to a rate of about 670 grams per hour by equipment capacities and criticality safety considerations. The major criticality prevention problems result from the potential accumulation of uranium-233 in a solvent phase in E-H4 (ICU concentrator), TK-J1 (IUC receiver), and TK-J21 (2AF pump tank). The same potential problems exist in TK-J5 (3AF pump tank) and TK-N1 (3BU receiver), but the probabilities of reaching a critical condition are not as great. To prevent the excessive accumulation of uranium-233 in any of these vessels by an extraction mechanism, it is necessary to maintain the uranium-233 and salting agent concentrations below the point at which a critical concentration of uranium-233 could be reached in a solvent phase.

  20. Monitoring iodine-129 in air and milk samples collected near the Hanford Site: an investigation of historical iodine monitoring data.

    PubMed

    Fritz, Brad G; Patton, Gregory W

    2006-01-01

    While other research has reported on the concentrations of (129)I in the environment surrounding active nuclear fuel reprocessing facilities, there is a shortage of information regarding how the concentrations change once facilities close. At the Hanford Site, the Plutonium-Uranium Extraction (PUREX) chemical separation plant was operating between 1983 and 1990, during which time (129)I concentrations in air and milk were measured. After the cessation of chemical processing, plant emissions decreased 2.5 orders of magnitude over an 8-year period. An evaluation of (129)I and (127)I concentration data in air and milk spanning the PUREX operation and post-closure period was conducted to compare the changes in environmental levels. Measured concentrations over the monitoring period were below the levels that could result in a potential annual human dose greater than 1 mSv. There was a measurable difference in the measured air concentrations of (129)I at different distances from the source, indicating a distinct Hanford fingerprint. Correlations between stack emissions of (129)I and concentrations in air and milk indicate that atmospheric emissions were the major source of (129)I measured in environmental samples. The measured concentrations during PUREX operations were similar to observations made around a fuel reprocessing plant in Germany. After the PUREX Plant stopped operating, (129)I concentration measurements made upwind of Hanford were similar to the results from Seville, Spain.
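    The reported decline of 2.5 orders of magnitude over 8 years can be converted to an effective year-over-year reduction factor with simple arithmetic, assuming for illustration a steady exponential decline (the paper does not claim the decline was uniform):

```python
import math

# 2.5 orders of magnitude over 8 years, assumed to follow a steady
# exponential (first-order) decline.
decades, years = 2.5, 8.0
annual_factor = 10.0 ** (-decades / years)  # fraction remaining per year
half_life = years * math.log(2.0) / (decades * math.log(10.0))
print(f"annual factor: {annual_factor:.2f}")      # ~0.49
print(f"effective half-life: {half_life:.2f} y")  # ~0.96
```

    Under that assumption, emissions roughly halved each year after the plant stopped operating.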

  1. Radiation Chemistry of Acetohydroxamic Acid in the UREX Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karraker, D.G.

    2002-07-31

    The UREX process is being developed to process irradiated power reactor fuel elements by dissolution in nitric acid and solvent extraction by a variation of the PUREX process [1]. Rather than recovering both U and Pu, as in Purex, only U will be recovered by solvent extraction, hence the name "UREX." A complexing agent, acetohydroxamic acid (AHA), will be added to the scrub stream to prevent the extraction of Pu(IV) and Np(VI). AHA (CH3C(=O)NHOH) is decomposed to gaseous products in waste evaporation, so no solid waste is generated by its addition. AHA is hydrolyzed in acid solution to acetic acid and hydroxylamine at a rate dependent on the acid concentration [2-4]. The fuel to be processed is ca. 40 years cooled, 30,000-50,000 MWD/MT material; although only a few fission products remain, the Pu isotopes and 241Am generate a radiation field estimated to be 2.6E+02 R during processing (see the Appendix for the calculation). This study was conducted to determine the effect of this level of radiation on the stability of AHA during processing.

  2. Chemical Processing Department monthly report, September 1956

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1956-10-18

    The September 1956 monthly report for the Chemical Processing Department of the Hanford Atomic Products Operation includes information regarding research and engineering efforts with respect to the Purex and Redox process technology. Also discussed are the production operation, finished products operation, power and general maintenance, financial operation, engineering and research operations, and employee operations. (MB)

  3. Chemical Processing Department monthly report, November 1957

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1957-12-23

    The November 1957 monthly report for the Chemical Processing Department of the Hanford Atomic Products Operation includes information regarding research and engineering efforts with respect to the Purex and Redox process technology. Also discussed are the production operation, finished product operation, power and general maintenance, financial operation, engineering and research operations, and employee operation. (MB)

  4. PUREX/UO3 Facilities deactivation lessons learned history

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, M.S.

    1996-09-19

    Disconnecting the criticality alarm permanently in June 1996 signified that the hazards in the PUREX (plutonium-uranium extraction) plant had been so removed and reduced that criticality was no longer a credible event. Turning off the PUREX criticality alarm also marked a salient point in a historic deactivation project, 1 year before its anticipated conclusion. The PUREX/UO3 Deactivation Project began in October 1993 as a 5-year, $222.5-million project. As a result of innovations implemented during 1994 and 1995, the project schedule was shortened by over a year, with concomitant savings. In 1994, the innovations included arranging to send contaminated nitric acid from the PUREX Plant to British Nuclear Fuels, Limited (BNFL) for reuse and sending metal solutions containing plutonium and uranium from PUREX to the Hanford Site tank farms. These two steps saved the project $36.9 million. In 1995, reductions in overhead rate, work scope, and budget, along with curtailed capital equipment expenditures, reduced the cost another $25.6 million. These savings were achieved by using activity-based cost estimating and applying technical schedule enhancements. In 1996, a series of changes brought about under the general concept of "reengineering" reduced the cost approximately another $15 million and moved the completion date to May 1997. With the total savings projected at about $75 million, or 33.7 percent of the originally projected cost, understanding how the changes came about, what decisions were made, and why they were made becomes important. At the same time, sweeping changes in the culture of the Hanford Site were taking place. These changes included shifting employee relations and work structures, introducing new philosophies and methods in maintaining safety and complying with regulations, using electronic technology to manage information, and adopting new methods and bases for evaluating progress.
    Because these changes helped generate cost savings and were accompanied by, and were an integral part of, sweeping "culture changes," the story of the lessons learned during the PUREX Deactivation Project is worth recounting. Foremost among the lessons is recognizing the benefits of "right to left" project planning: a deactivation project must start by identifying its end points, then make every task, budget, and organizational decision based on reaching those end points. Along with this key lesson is the knowledge that project planning and scheduling should be tied directly to costing, and that project status should be checked often (more often than needed to meet mandated reporting requirements) to reflect real-time work. People working on a successful project should never be guessing about its schedule or living with a paper schedule that does not represent the actual state of work. Other salient lessons learned in the PUREX/UO3 Deactivation Project support these guiding principles. They include recognizing the value of independent review, teamwork, and reengineering concepts; the need for and value of cooperation between the DOE, its contractors, regulators, and stakeholders; and the essential nature of early and ongoing communication. Managing a successful project also requires being willing to take a fresh look at safety requirements and to apply them in a streamlined and sensible manner to deactivating facilities; drawing on the enormous value of resident knowledge acquired by people over years and sometimes decades of working in old plants; and recognizing the value of bringing in outside expertise for certain specialized tasks. This approach makes possible discovering the savings that can come when many creative options are pursued persistently, and the wisdom of leaving some decisions to the future. The essential job of a deactivation project is to place a facility in a safe, stable, low-maintenance mode for an interim period.
    Specific end points are identified to recognize and document this state. Keeping the limited objectives of the project in mind can guide decisions that reduce risks with minimal manipulation of physical materials and minimal waste generation, streamline regulations and safety requirements where possible, and separate the facility from ongoing entanglements with operating systems. Thus, the "parked car" state is achieved quickly and directly. The PUREX Deactivation Lessons Learned History was first issued in January 1995. Since then, several key changes have occurred in the project, making it advisable to revise and update the document. This document is organized with the significant lessons learned captured at the end of each section and then recounted in Section 11.0, "Lessons Consolidated." It is hoped and believed that the lessons learned on the PUREX Deactivation Project will have value to other facilities both inside and outside the DOE complex.

  5. PUREX/UO3 deactivation project management plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washenfelder, D.J.

    1993-12-01

    From 1955 through 1990, the Plutonium-Uranium Extraction Plant (PUREX) provided the United States Department of Energy Hanford Site with nuclear fuel reprocessing capability. It operated in sequence with the Uranium Trioxide (UO3) Plant, which converted the PUREX liquid uranium nitrate product to solid UO3 powder. Final UO3 Plant operation ended in 1993. In December 1992, planning was initiated for the deactivation of the PUREX and UO3 Plants. The objective of deactivation planning was to identify the activities needed to establish a passively safe, environmentally secure configuration at both plants, and to ensure that the configuration could be retained during the post-deactivation period. The PUREX/UO3 Deactivation Project management plan represents completion of the planning efforts. It presents the deactivation approach to be used for the two plants and the supporting technical, cost, and schedule baselines. Deactivation activities concentrate on removal, reduction, and stabilization of the radioactive and chemical materials remaining at the plants, and on the shutdown of the utilities and effluents. When deactivation is completed, the two plants will be left unoccupied and locked, pending eventual decontamination and decommissioning. Deactivation is expected to cost $233.8 million, require 5 years to complete, and yield $36 million in annual surveillance and maintenance cost savings.

  6. Overview of reductants utilized in nuclear fuel reprocessing/recycling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paviet-Hartmann, P.; Riddle, C.; Campbell, K.

    2013-07-01

    The most widely used reductant to partition plutonium from uranium in the Purex process was ferrous sulfamate; other alternatives proposed include hydrazine-stabilized ferrous nitrate or uranous nitrate, platinum-catalyzed hydrogen and hydrazine, and hydroxylamine salts. New candidates to replace hydrazine or hydroxylamine nitrate (HAN) are being pursued worldwide. They may improve the performance of the industrial Purex process in operations such as de-extraction of plutonium, and may reduce the amount of hydrazine required, which will limit the formation of hydrazoic acid. Looking at future recycling technologies using hydroxamic ligands, neither acetohydroxamic acid (AHA) nor formohydroxamic acid (FHA) seems promising, because they hydrolyze to give hydroxylamine and the parent carboxylic acid. Hydroxyethylhydrazine, HOC2H4N2H3 (HEH), is a promising non-salt-forming reductant of Np and Pu ions; because it is selective to neptunium and plutonium ions at room temperature and at relatively low acidity, it could serve as a replacement for HAN or AHA in the development of a novel used nuclear fuel recycling process.

  7. U.S. program assessing nuclear waste disposal in space - A status report

    NASA Technical Reports Server (NTRS)

    Rice, E. E.; Priest, C. C.; Friedlander, A. L.

    1980-01-01

    Various concepts for the space disposal of nuclear waste are discussed, with attention given to the destinations now being considered (high Earth orbit, lunar orbit, lunar surface, solar orbit, solar system escape, and the Sun). Waste mixes are considered in the context of the Purex (plutonium and uranium extraction) process, and the potential forms for nuclear waste disposal (ORNL cermet, borosilicate glass, metal matrix, and hot-pressed supercalcine) are described. Preliminary estimates of the energy required and the cost surcharge needed to support the space disposal of nuclear waste are presented (8 metric tons/year, requiring three Shuttle launches). When Purex is employed, the generated electrical energy needed to support the Shuttle launches is shown to be less than 1%, and the projected surcharge to electrical users is shown to be slightly more than two mills/kW-hour.

  8. Special nuclear materials cutoff exercise: Issues and lessons learned. Volume 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Libby, R.A.; Segal, J.E.; Stanbro, W.D.

    1995-08-01

    This document comprises appendices D-J of the Special Nuclear Materials Cutoff Exercise: Issues and Lessons Learned. Included are discussions of the US-IAEA Treaty, safeguards regulations for nuclear materials, issue sheets for the PUREX process, and the LANL follow-up activity for reprocessing nuclear materials.

  9. The behaviour of tributyl phosphate in an organic diluent

    NASA Astrophysics Data System (ADS)

    Leay, Laura; Tucker, Kate; Del Regno, Annalaura; Schroeder, Sven L. M.; Sharrad, Clint A.; Masters, Andrew J.

    2014-09-01

    Tributyl phosphate (TBP) is used as a complexing agent in the Plutonium Uranium Extraction (PUREX) liquid-liquid phase extraction process for recovering uranium and plutonium from spent nuclear reactor fuel. Here, we address the molecular structure and microstructure of the organic phases involved in the extraction process, using molecular dynamics to show that when TBP is mixed with a paraffinic diluent, the TBP self-assembles into a bi-continuous phase. The underlying self-association of TBP is driven by intermolecular interaction between its polar groups, resulting in butyl moieties radiating out into the organic solvent. Simulation predicts a TBP diffusion constant that is anomalously low compared to what might normally be expected for its size; experimental nuclear magnetic resonance (NMR) studies also indicate an extremely low diffusion constant, consistent with a molecular aggregation model. Simulation of TBP at an oil/water interface shows the formation of a bilayer system at low TBP concentrations. At higher concentrations, a bulk bi-continuous structure is observed linking to this surface bilayer. We suggest that this structure may be intimately connected with the surprisingly rapid kinetics of the interfacial mass transport of uranium and plutonium from the aqueous to the organic phase in the PUREX process.
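
    The "anomalously low" diffusion constant can be illustrated with a back-of-the-envelope Stokes-Einstein estimate for a monomeric TBP molecule; the viscosity and hydrodynamic radius below are assumed round numbers, not values from the paper.

```python
# Stokes-Einstein estimate of the diffusion constant expected for a
# monomeric TBP molecule; viscosity and hydrodynamic radius are assumed
# round numbers, not values from the paper.

import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 298.15           # temperature, K
eta = 1.4e-3         # assumed viscosity of a dodecane-like diluent, Pa*s
r = 4.0e-10          # assumed hydrodynamic radius of monomeric TBP, m

D = k_B * T / (6.0 * math.pi * eta * r)   # Stokes-Einstein relation
print(f"expected monomer diffusion constant: {D:.2e} m^2/s")
# A measured value well below this estimate is consistent with
# aggregation into larger, slower-moving structures.
```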

  10. Next-generation purex flowsheets with acetohydroxamic acid as complexant for FBR and thermal-fuel reprocessing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, Shekhar; Koganti, S.B.

    2008-07-01

    Acetohydroxamic acid (AHA) is a novel complexant for the recycle of nuclear-fuel materials. It can be used in ordinary centrifugal extractors, eliminating the need for electro-redox equipment or complex maintenance requirements in a remotely maintained hot cell. In this work, the effect of AHA on Pu(IV) distribution ratios in a 30% TBP system was quantified, modeled, and integrated into the SIMPSEX code. Two sets of batch experiments involving macro Pu concentrations (conducted at IGCAR) and one high-Pu flowsheet (from the literature) were simulated for AHA-based U-Pu separation. Based on the simulation and validation results, AHA-based next-generation reprocessing flowsheets are proposed for co-processing-based FBR and thermal-fuel reprocessing, as well as for the evaporator-less macro-level Pu concentration process required for MOX fuel fabrication. Utilization of AHA results in significant simplification of plant design and simpler technology implementation, with significant cost savings. (authors)

  11. Hanford facility dangerous waste permit application, PUREX storage tunnels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haas, C. R.

    1997-09-08

    The Hanford Facility Dangerous Waste Permit Application is considered to be a single application organized into a General Information Portion (document number DOE/RL-91-28) and a Unit-Specific Portion. The scope of the Unit-Specific Portion is limited to Part B permit application documentation submitted for individual, "operating" treatment, storage, and/or disposal units, such as the PUREX Storage Tunnels (this document, DOE/RL-90-24).

  12. 242-A Evaporator/plutonium uranium extraction (PUREX) effluent treatment facility (ETF) nonradioactive air emission test report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hill, J.S., Westinghouse Hanford

    1996-05-10

    This report shows the methods used to test the stack gas outlet concentration and emission rate of Volatile Organic Compounds as Total Non-Methane Hydrocarbons, in parts per million by volume, grams per dry standard cubic meter, and grams per minute, from the PUREX ETF stream number G6 on the Hanford Site. Test results are shown in Appendix B.1.
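
    As a hedged illustration of how the three reporting units in such a test relate, the sketch below converts a hypothetical ppmv reading to g/dscm and then to g/min; the molar mass, concentration, and stack flow are invented, and the report itself may use a different (e.g. carbon-equivalent) basis.

```python
# Unit-conversion sketch relating the three reported quantities:
# ppmv -> g/dscm -> g/min. Molar mass, concentration, and stack flow
# are hypothetical; the report may use a different reference basis.

MOLAR_VOLUME_STD = 0.0224   # m^3/mol, assumed dry standard conditions (0 C, 1 atm)

def ppmv_to_g_per_dscm(ppmv, molar_mass_g_mol):
    """Volume fraction (ppmv) to mass per dry standard cubic meter."""
    return ppmv * 1e-6 / MOLAR_VOLUME_STD * molar_mass_g_mol

def emission_rate_g_per_min(conc_g_dscm, flow_dscm_min):
    """Mass emission rate from concentration and dry standard flow."""
    return conc_g_dscm * flow_dscm_min

conc = ppmv_to_g_per_dscm(5.0, 16.04)        # 5 ppmv as methane (hypothetical)
rate = emission_rate_g_per_min(conc, 100.0)  # 100 dscm/min stack flow (hypothetical)
print(f"{conc:.5f} g/dscm, {rate:.3f} g/min")
```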

  13. Managing Zirconium Chemistry and Phase Compatibility in Combined Process Separations for Minor Actinide Partitioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wall, Nathalie; Nash, Ken; Martin, Leigh

    In response to the NEUP Program Supporting Fuel Cycle R&D Separations and Waste Forms call DE-FOA-0000799, this report describes the results of an R&D project focused on streamlining separation processes for advanced fuel cycles. An example of such a process relevant to the U.S. DOE FCR&D program would be one combining the functions of the TRUEX process for partitioning of lanthanides and minor actinides from PUREX (UREX) raffinates with that of the TALSPEAK process for separating transplutonium actinides from fission-product lanthanides. A fully developed PUREX(UREX)/TRUEX/TALSPEAK suite would generate actinides as product(s) for reuse (or transmutation) and fission products as waste. As standalone, consecutive unit operations, TRUEX and TALSPEAK employ different extractant solutions (a solvating extractant, CMPO (octyl(phenyl)-N,N-diisobutylcarbamoylmethylphosphine oxide), vs. a cation-exchanging extractant, HDEHP (di-2(ethyl)hexylphosphoric acid)) and distinct aqueous phases (2-4 M HNO{sub 3} vs. concentrated pH 3.5 carboxylic acid buffers containing actinide-selective chelating agents). The separate processes may also operate with different phase-transfer kinetic constraints. Experience teaches (and it has been demonstrated at the lab scale) that, with proper control, multiple-process separation systems can operate successfully. However, it is also recognized that considerable economies of scale could be achieved if multiple operations could be merged into a single process based on a combined extractant solvent. The task of accountability of nuclear materials through the process(es) also becomes more robust with fewer steps, provided that the processes can be accurately modeled. Work is underway in the U.S. and Europe on developing several new options for combined processes (TRUSPEAK, ALSEP, SANEX, GANEX, and EXAm are examples).
    There are unique challenges associated with the operation of such processes, some relating to organic-phase chemistry, others arising from the variable composition of the aqueous medium. This project targets in particular two problematic issues in designing combined process systems: managing the chemistry of challenging aqueous species (like Zr{sup 4+}) and optimizing the composition and properties of combined extractant organic phases.

  14. TREATMENT TANK CORROSION STUDIES FOR THE ENHANCED CHEMICAL CLEANING PROCESS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiersma, B.

    2011-08-24

    Radioactive waste is stored in high-level waste tanks at the Savannah River Site (SRS). Savannah River Remediation (SRR) is aggressively seeking to close the non-compliant Type I and II waste tanks. The removal of sludge (i.e., metal oxide) heels from the tank is the final stage in the waste removal process. The Enhanced Chemical Cleaning (ECC) process is being developed and investigated by SRR as an option for sludge heel removal from SRS High-Level Waste (HLW) tanks. Corrosion rate data for carbon steel exposed to the ECC treatment tank environment were obtained to evaluate the degree of corrosion that occurs. These tests were also designed to determine the effect of environmental variables such as temperature, agitation, and sludge slurry type on the corrosion behavior of carbon steel. Coupon tests were performed to estimate the corrosion rate during the ECC process, as well as to determine any susceptibility to localized corrosion. Electrochemical studies were performed to develop a better understanding of the corrosion mechanism. The tests were performed in 1 wt.% and 2.5 wt.% oxalic acid with HM and PUREX sludge simulants. The following results and conclusions were drawn from this testing: (1) In 1 wt.% oxalic acid with a sludge simulant, carbon steel corroded at a rate of less than 25 mpy within the temperature and agitation levels of the test. No susceptibility to localized corrosion was observed. (2) In 2.5 wt.% oxalic acid with a sludge simulant, the carbon steel corrosion rates ranged between 15 and 88 mpy. The most severe corrosion was observed at 75 C in the HM/2.5 wt.% oxalic acid simulant. Pitting and general corrosion increased with the agitation level at this condition. No pitting and lower general corrosion rates were observed with the PUREX/2.5 wt.% oxalic acid simulant.
The electrochemical and coupon tests both indicated that carbon steel is more susceptible to localized corrosion in the HM/oxalic acid environment than in the PUREX/oxalic acid environment. (3) The corrosion rates for PUREX/8 wt.% oxalic acid were greater than or equal to those observed for the PUREX/2.5 wt.% oxalic acid. No localized corrosion was observed in the tests with the 8 wt.% oxalic acid. Testing with the HM/8 wt.% oxalic acid simulant was not performed; thus, a comparison with the results for 2.5 wt.% oxalic acid, where the corrosion rate was 88 mpy and localized corrosion was observed at 75 C, cannot be made. (4) The corrosion rates in 1 and 2.5 wt.% oxalic acid solutions were temperature dependent: (a) At 50 C, the corrosion rates ranged between 90 and 140 mpy over the 30 day test period. The corrosion rates were higher under stagnant conditions. (b) At 75 C, the initial corrosion rates were as high as 300 mpy during the first day of exposure. The corrosion rates increased with agitation. However, once the passive ferrous oxalate film formed, the corrosion rate decreased dramatically to less than 20 mpy over the 30 day test period. This rate was independent of agitation. (5) Electrochemical testing indicated that for oxalic acid/sludge simulant mixtures the cathodic reaction has transport-controlled reaction kinetics. The literature suggests that the dissolution of the sludge produces a di-oxalatoferrate ion that is reduced at the cathodic sites. The cathodic reaction does not appear to involve hydrogen evolution. On the other hand, electrochemical tests demonstrated that the cathodic reaction for corrosion of carbon steel in pure oxalic acid involves hydrogen evolution. (6) Agitation of the oxalic acid/sludge simulant mixtures typically resulted in higher corrosion rates for both acid concentrations. The transport of the ferrous ion away from the metal surface results in a less protective ferrous oxalate film.
(7) A mercury-containing species, along with aluminum, silicon, and iron oxides, was observed on the interior of the pits formed in the HM/2.5 wt.% oxalic acid simulant at 75 C. The pitting rates in the agitated and non-agitated solutions were 2 mils/day and 1 mil/day, respectively. A mechanism by which the mercury interacts with the aluminum and silicon oxides in this simulant to accelerate corrosion was proposed.
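
    Coupon corrosion rates in mpy, as quoted above, are conventionally computed from mass loss with the ASTM G1 relation; a minimal sketch with hypothetical coupon numbers (not values from the study):

```python
# ASTM G1 mass-loss corrosion rate in mils per year (mpy); the coupon
# mass loss, area, exposure time, and density below are hypothetical.

def corrosion_rate_mpy(mass_loss_g, area_cm2, hours, density_g_cm3):
    K = 3.45e6  # ASTM G1 constant giving mpy for these units
    return K * mass_loss_g / (area_cm2 * hours * density_g_cm3)

# Hypothetical carbon steel coupon: 0.25 g lost over a 30 day exposure.
rate = corrosion_rate_mpy(0.25, 20.0, 30 * 24, 7.86)
print(f"{rate:.1f} mpy")
```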

  15. Brief overview of the long-lived radionuclide separation processes developed in france in connection with the spin program

    NASA Astrophysics Data System (ADS)

    Madic, Charles; Bourges, Jacques; Dozol, Jean-François

    1995-09-01

    To reduce the long-term potential hazards associated with the management of nuclear wastes generated by nuclear fuel reprocessing, one alternative is the transmutation of long-lived radionuclides into short-lived radionuclides by nuclear means (P & T strategy). In this context, under the law passed by the French Parliament on 30 December 1991, the CEA launched the SPIN program for the design of long-lived radionuclide separation and nuclear incineration processes. The research in progress to define separation processes focuses mainly on the minor actinides (neptunium, americium, and curium) and on some fission products, such as cesium and technetium. To separate these long-lived radionuclides, two strategies were developed. The first involves research on new operating conditions to improve the PUREX fuel reprocessing technology; this approach concerns the elements neptunium and technetium (iodine and zirconium can also be considered). The second involves the design of new processes: DIAMEX, for the co-extraction of minor actinides from the high-level liquid waste leaving the PUREX process; An(III)/Ln(III) separation using tripyridyltriazine derivatives or picolinamide extracting agents; SESAME, for the selective separation of americium after its oxidation to Am(IV) or Am(VI) in the presence of a heteropolytungstate ligand; and Cs extraction using a new class of extracting agents, calixarenes, which exhibit exceptional Cs separation properties, especially in the presence of sodium ion. This lecture focuses on the latest achievements in these research areas.

  16. Trip report. Eurochemic company assistance: Hanford Atomic Products Operation spent fuel processing technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shank, E.M.

    1959-06-23

    Information obtained from HAPO during a visit by M.K. Twichell, UCNC, and E.M. Shank, ORNL, is given. Included are the tentative procedures for obtaining and transmitting information to the Eurochemic company. Discussions cover pulsed columns, corrosion, pulse generators, centrifuges, valves, in-line instrumentation, evaporators, resin column design, off-gas processing, solvent recovery, liquid-waste handling, process control, equipment decontamination, criticality, radiation protection, diluent and solvent stability, backmixing in a pulsed column, and use of 40% TBP in the Purex flowsheet.

  17. Studies in support of an SNM cutoff agreement: The PUREX exercise

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stanbro, W.D.; Libby, R.; Segal, J.

    1995-07-01

    On September 23, 1993, President Clinton, in a speech before the United Nations General Assembly, called for an international agreement banning the production of plutonium and highly enriched uranium for nuclear explosive purposes. A major element of any verification regime for such an agreement would probably involve inspections of reprocessing plants in Nuclear Nonproliferation Treaty weapons states. Many of these are large facilities built in the 1950s with no thought that they would be subject to international inspection. To learn about some of the problems that might be involved in the inspection of such large, old facilities, the Department of Energy, Office of Arms Control and Nonproliferation, sponsored a mock inspection exercise at the PUREX Plant on the Hanford Site. This exercise examined a series of alternatives for inspections of PUREX as a model for this type of facility at other locations. A series of conclusions was developed that can be used to guide the development of verification regimes for a cutoff agreement at reprocessing facilities.

  18. Method for photochemical reduction of uranyl nitrate by tri-N-butyl phosphate and application of this method to nuclear fuel reprocessing

    DOEpatents

    De Poorter, Gerald L.; Rofer-De Poorter, Cheryl K.

    1978-01-01

    Uranyl ion in solution in tri-n-butyl phosphate is readily photochemically reduced to U(IV). The product U(IV) may effectively be used in the Purex process for treating spent nuclear fuels to reduce Pu(IV) to Pu(III). The Pu(III) is readily separated from uranium in solution in the tri-n-butyl phosphate by an aqueous strip.

  19. Separations in the STATS report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choppin, G.R.

    1996-12-31

    The Separations Technology and Transmutation Systems (STATS) Committee formed a Subcommittee on Separations. This subcommittee was charged with evaluating the separations proposed for the several reactor and accelerator transmutation systems. It was also asked to review the processing options for the safe management of high-level waste generated by the defense programs, in particular the special problems involved in dealing with the waste at the U.S. Department of Energy (DOE) facility in Hanford, Washington. Based on the evaluations from the Subcommittee on Separations, the STATS Committee concluded that for the reactor transmutation programs, aqueous separations involving a combination of PUREX and TRUEX solvent extraction processes could be used. However, additional research and development (R&D) would be required before full plant-scale use of the TRUEX technology could be employed. An alternate separations technology for the reactor transmutation program involves pyroprocessing. This process would require a significant amount of R&D before its full-scale application can be evaluated.

  20. Complexation of lanthanides and actinides by acetohydroxamic acid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, R.J.; Sinkov, S.I.; Choppin, G.R.

    2008-07-01

    Acetohydroxamic acid (AHA) has been proposed as a suitable reagent for the complexant-based, as opposed to reductive, stripping of plutonium and neptunium ions from the tributyl phosphate solvent phase in advanced PUREX or UREX processes designed for future nuclear-fuel reprocessing. Stripping is achieved by the formation of strong hydrophilic complexes with the tetravalent actinides in nitric acid solutions. To underpin such applications, knowledge of the complexation constants of AHA with all relevant actinide (5f) and lanthanide (4f) ions is therefore important. This paper reports the determination of stability constants of AHA with the heavier lanthanide ions (Dy-Yb) and also with U(IV) and Th(IV) ions. Comparisons with our previously published AHA stability-constant data for 4f and 5f ions are made. (authors)
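
    To see why a stability constant translates into stripping power, a minimal speciation sketch: for a 1:1 ML complex, the complexed fraction is beta1[L]/(1 + beta1[L]). The beta1 value below is an assumed order of magnitude for illustration, not a result from the paper.

```python
# Fraction of a metal ion bound as a 1:1 AHA complex, from an assumed
# stability constant beta1 (order-of-magnitude illustration only).

def fraction_complexed(beta1, free_ligand_M):
    """Mole fraction of metal present as ML, for a 1:1 complex model."""
    return beta1 * free_ligand_M / (1.0 + beta1 * free_ligand_M)

beta1 = 1e4   # assumed 1:1 stability constant, 1/M
for L in (1e-4, 1e-3, 1e-2):
    print(f"[AHA] = {L:.0e} M -> {fraction_complexed(beta1, L):.1%} complexed")
```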

  1. Laser-enhanced chemical reactions and the liquid state. II. Possible applications to nuclear fuel reprocessing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DePoorter, G.L.; Rofer-DePoorter, C.K.

    1976-01-01

    Laser photochemistry is surveyed as a possible improvement upon the Purex process for reprocessing spent nuclear fuel. Most of the components of spent nuclear fuel are photochemically active, and lasers can be used to selectively excite individual chemical species. The great variety of chemical species present and the degree of separation that must be achieved present difficulties in reprocessing. Lasers may be able to improve the necessary separations by photochemical reaction or effects on rates and equilibria of reactions. (auth)

  2. Flowsheet Analysis of U-Pu Co-Crystallization Process as a New Reprocessing System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shunji Homma; Jun-ichi Ishii; Jiro Koga

    2006-07-01

    A new fuel reprocessing system based on a U-Pu co-crystallization process is proposed and examined by flowsheet analysis. This reprocessing system is based on the fact that hexavalent plutonium in nitric acid solution co-crystallizes with uranyl nitrate, whereas it does not crystallize when uranyl nitrate is absent from the solution. The system consists of five steps: dissolution of spent fuel, plutonium oxidation, U-Pu co-crystallization as a co-decontamination step, re-dissolution of the crystals, and U re-crystallization as a U-Pu separation step. The system requires recycling of the mother liquor from the U-Pu co-crystallization step, and the appropriate recycle ratio is determined by flowsheet analysis such that satisfactory decontamination is achieved. A further flowsheet study using four different compositions of LWR spent fuels demonstrates that a constant ratio of plutonium to uranium in the mother liquor from the re-crystallization step is achieved for every composition by controlling the temperature. It is also demonstrated, by comparison with the Purex process, that the size of a plant based on the proposed system is significantly reduced. (authors)

  3. Private Sector Initiative Between the U.S. and Japan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1998-09-30

    OAK-A258 Private Sector Initiative Between the U.S. and Japan. This report for calendar years 1993 through September 1998 describes efforts performed under the Private Sector Initiatives contract. The report also describes those efforts that have continued with private funding after being initiated under this contract. The development of a pyrochemical process, called TRUMP-S, for partitioning actinides from PUREX waste, is described in this report. This effort is funded by the Central Research Institute of Electric Power Industry (CRIEPI), KHI, the United States Department of Energy, and Boeing.

  4. Monitoring Iodine-129 in Air and Milk Samples Collected Near the Hanford Site: An Investigation of Historical Iodine Monitoring Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fritz, Brad G.; Patton, Gregory W.

    2006-01-01

    While other research has reported on the concentrations of 129I in the environment surrounding active nuclear fuel reprocessing facilities, there is a shortage of information regarding how the concentrations change once facilities close. At the Hanford Site, the Plutonium-Uranium Extraction (PUREX) chemical separation plant was operational between 1983 and 1990, during which time 129I concentrations in air and milk were measured. After the cessation of operations in 1990, plant emissions decreased 2.5 orders of magnitude over an 8 year period, and monitoring of environmental levels continued. An evaluation of air and milk 129I concentration data spanning the PUREX operation and post-closure periods was conducted to compare the changes in measured environmental levels of 129I. Measured concentrations over the monitoring period were below levels that could result in a potential human dose greater than 10 µSv. There was a significant and measurable difference in the measured air concentrations of 129I at different distances from the source, indicating a distinct Hanford fingerprint. Correlations between stack emissions of 129I and concentrations in air and milk indicate that atmospheric emissions were responsible for the 129I concentrations measured in environmental samples. The measured concentrations during PUREX operation were similar to observations made around a fuel reprocessing plant in Germany.
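
    A decline of 2.5 orders of magnitude over an 8 year period implies, under an assumed simple first-order model, the effective decay constant sketched below (a consistency check on the quoted figures, not a model used by the report):

```python
# Effective first-order decline implied by a 2.5 order-of-magnitude drop
# in emissions over 8 years (assumed exponential model).

import math

decline_factor = 10 ** 2.5                 # total reduction factor
years = 8.0
lam = math.log(decline_factor) / years     # effective decay constant, 1/yr
half_time = math.log(2.0) / lam            # effective halving time, yr
print(f"lambda = {lam:.3f} /yr, halving time = {half_time:.2f} yr")
```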

  5. Aspects of remote maintenance in an FRG reprocessing plant from the manufacturer's viewpoint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeitzchel, G.; Tennie, M.; Saal, G.

    In April 1986 a consortium led by Kraftwerk Union AG was commissioned by the German society for nuclear fuel reprocessing (DWK) to build the first West German commercial reprocessing plant for spent fuel assemblies. The main result of the planning efforts regarding remote maintenance operations inside the main process building was the introduction of FEMO technology (FEMO is an acronym based on the German for remote handling modular technique). According to this technology, the two cells in which the actual reprocessing (which is based on the PUREX technique) takes place are provided with frames to accommodate the process components (tanks, pumps, agitators, etc.), each frame together with the components which it supports forming one module. The two cells are inaccessible and windowless. For handling operations each cell is equipped with an overhead crane and a crane-like manipulator carrier system (MTS) with power manipulator. Viewing of the operations from outside the cells is made possible by television (TV) cameras installed at the crane, the MTS, and the manipulator. This paper addresses some examples of problems that still need to be solved in connection with FEMO handling. In particular, the need for close cooperation between the equipment operator, the component designer, the process engineer, the planning engineer, and the licensing authorities will be demonstrated.

  6. A method for phenomenological and chemical kinetics study of autocatalytic reactive dissolution by optical microscopy. The case of uranium dioxide dissolution in nitric acid media

    NASA Astrophysics Data System (ADS)

    Marc, Philippe; Magnaldo, Alastair; Godard, Jérémy; Schaer, Éric

    2018-03-01

    Dissolution is a milestone of the head-end of hydrometallurgical processes, as the solubilization rates of the chemical elements determine the process performance and hold-up. This study aims at a better understanding of the chemical and physico-chemical phenomena of uranium dioxide dissolution reactions in nitric acid media in the Purex process, which separates the reusable materials and the final wastes of spent nuclear fuels. It has been documented that the attack of sintering-manufactured uranium dioxide solids occurs through preferential attack sites, which leads to the development of cracks in the solids. Optical microscopy observations show that in some cases the development of these cracks leads to cleavage of the solid. It is shown here that the dissolution of the detached fragments is much slower than the process of complete cleavage of the solid, and occurs with no disturbing phenomena, such as gas bubbling. This fact has motivated the measurement of dissolution kinetics using optical microscopy and image processing. By further discriminating between external resistance and chemical reaction, the "true" chemical kinetics of the reaction have been measured, and the highly autocatalytic nature of the reaction confirmed. Based on these results, the constants of the chemical reaction kinetic laws have also been evaluated.
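
    The autocatalytic behaviour described above can be sketched with a toy rate law in which the dissolved reaction product accelerates further dissolution until the solid is consumed; all constants below are invented for illustration and are not the kinetic constants evaluated in the paper.

```python
# Toy autocatalytic dissolution model: the dissolved product Z speeds up
# further dissolution, and the rate shuts off as the solid is consumed.
# All rate constants are invented for illustration.

def simulate(k1=1e-3, k2=0.5, acid=4.0, z_max=1.0, dt=0.01, steps=2000):
    """Forward-Euler integration of dZ/dt = (k1 + k2*Z)*acid*(1 - Z/z_max)."""
    z = 0.0
    traj = []
    for _ in range(steps):
        rate = (k1 + k2 * z) * acid * (1.0 - z / z_max)
        z += rate * dt
        traj.append(z)
    return traj

traj = simulate()
# Slow induction period, then autocatalytic acceleration, then saturation:
print(f"Z(t=1) = {traj[99]:.4f}, Z(t=20) = {traj[-1]:.4f}")
```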

  7. CHEMICAL DIFFERENCES BETWEEN SLUDGE SOLIDS AT THE F AND H AREA TANK FARMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reboul, S.

    2012-08-29

    The primary source of waste solids received into the F Area Tank Farm (FTF) was from PUREX processing performed to recover uranium and plutonium from irradiated depleted uranium targets. In contrast, two primary sources of waste solids were received into the H Area Tank Farm (HTF): a) waste from PUREX processing; and b) waste from H-modified (HM) processing performed to recover uranium and neptunium from burned enriched uranium fuel. Due to the differences between the irradiated depleted uranium targets and the burned enriched uranium fuel, the average compositions of the F and H Area wastes are markedly different from one another. Both F and H Area wastes contain significant amounts of iron and aluminum compounds. However, because the iron content of PUREX waste is higher than that of HM waste, and the aluminum content of PUREX waste is lower than that of HM waste, the iron to aluminum ratios of typical FTF waste solids are appreciably higher than those of typical HTF waste solids. Other constituents present at significantly higher concentrations in the typical FTF waste solids include uranium, nickel, ruthenium, zinc, silver, cobalt and copper. In contrast, constituents present at significantly higher concentrations in the typical HTF waste solids include mercury, thorium, oxalate, and radionuclides U-233, U-234, U-235, U-236, Pu-238, Pu-242, Cm-244, and Cm-245. Because of the higher concentrations of Pu-238 in HTF, the long-term concentrations of Th-230 and Ra-226 (from Pu-238 decay) will also be higher in HTF. The uranium and plutonium distributions of the average FTF waste were found to be consistent with depleted uranium and weapons grade plutonium, respectively (U-235 comprised 0.3 wt% of the FTF uranium, and Pu-240 comprised 6 wt% of the FTF plutonium). In contrast, at HTF, U-235 comprised 5 wt% of the uranium, and Pu-240 comprised 17 wt% of the plutonium, consistent with enriched uranium and high burn-up plutonium.
X-ray diffraction analyses of various FTF and HTF samples indicated that the primary crystalline compounds of iron in sludge solids are Fe{sub 2}O{sub 3}, Fe{sub 3}O{sub 4}, and FeO(OH), and the primary crystalline compounds of aluminum are Al(OH){sub 3} and AlO(OH). Also identified were carbonate compounds of calcium, magnesium, and sodium; a nitrated sodium aluminosilicate; and various uranium compounds. Consistent with expectations, oxalate compounds were identified in solids associated with oxalic acid cleaning operations. The most likely oxidation states and chemical forms of technetium are assessed in the context of solubility, since technetium-99 is a key risk driver from an environmental fate and transport perspective. The primary oxidation state of technetium in SRS sludge solids is expected to be Tc(IV). In salt waste, the primary oxidation state is expected to be Tc(VII). The primary form of technetium in sludge is expected to be a hydrated technetium dioxide, TcO{sub 2} {center_dot} xH{sub 2}O, which is relatively insoluble and likely co-precipitated with iron. In salt waste solutions, the primary form of technetium is expected to be the very soluble pertechnetate anion, TcO{sub 4}{sup -}. The relative differences between the F and H Tank Farm waste provide a basis for anticipating differences that will occur as constituents of FTF and HTF waste residue enter the environment over the long-term future. If a constituent is significantly more dominant in one of the Tank Farms, its long-term environmental contribution will likely be commensurately higher, assuming the environmental transport conditions of the two Tank Farms share some commonality. It is in this vein that the information cited in this document is provided - for use during the generation, assessment, and validation of Performance Assessment modeling results.

  8. Extraction of cesium, strontium and the platinum group metals from acidic high activity nuclear waste using a Purex process compatible organic extractant. Final report, December 15, 1980-August 15, 1983

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, M.W. Jr.; Van Brunt, V.

    1984-09-14

    Purex process compatible organic systems which selectively and reversibly extract cesium, strontium, and palladium from synthetic mixed fission product solutions containing 3M HNO{sub 3} have been developed. This advance makes the development of continuous solvent extraction processes for their recovery more likely. The most favorable cesium and strontium complexing solutions have been tested for radiation stability to 10{sup 7} rad using a 0.4 x 10{sup 7} rad/h {sup 60}Co source. The distribution coefficients dropped somewhat but remained above unity. For cesium the complexing organic solution is 5 vol % (0.1M) NNS, 27 vol % TBP, and 68 vol % kerosene containing 0.05M bis-4,4'(5')(1-hydroxy-2-ethylhexyl)benzo-18-crown-6 (Crown XVII). The NNS is a sulfonic acid cation exchanger. With an aqueous phase containing 0.006M Cs{sup +1} in contact with an equal volume of extractant, the D org/aq = 1.6 at a temperature of 25 to 35 C. For strontium the complexing organic solution is 5 vol % (0.1M) NNS, 27 vol % TBP, and 68 vol % kerosene containing 0.02M bis-4,4'(5')(1-hydroxyheptyl)cyclohexo-18-crown-6 (Crown XVI). With an aqueous phase containing 0.003M Sr{sup +2} in contact with an equal volume of extractant, the D org/aq = 1.98 at a temperature of 25 to 35 C. For palladium the complexing organic solution consisted of a TBP/kerosene ratio of 0.667 containing 0.3M Alamine 336, a tertiary amine anion exchanger. With an aqueous phase containing 0.0045M Pd{sup +} in contact with an equal volume of extractant, the D org/aq = 1.95 at a temperature of 25 to 35 C.
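
    The reported distribution coefficients map directly to extraction fractions for equal-volume contacts (E = D/(1 + D) per contact); a minimal sketch using the reported cesium value D = 1.6:

```python
# Fraction extracted for equal-volume contacts given a distribution
# coefficient D = [org]/[aq]; illustrated with the reported cesium value.

def fraction_extracted(D, n_contacts=1, phase_ratio=1.0):
    """Fraction moved to the organic phase after n fresh-solvent contacts."""
    left_aqueous = 1.0 / (1.0 + D * phase_ratio)   # per-contact raffinate fraction
    return 1.0 - left_aqueous ** n_contacts

print(f"D = 1.6, 1 contact : {fraction_extracted(1.6):.1%}")
print(f"D = 1.6, 3 contacts: {fraction_extracted(1.6, 3):.1%}")
```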

  9. Demand driven salt clean-up in a molten salt fast reactor - Defining a priority list.

    PubMed

    Merk, B; Litskevich, D; Gregg, R; Mount, A R

    2018-01-01

    The PUREX technology, based on aqueous processes, is currently the leading reprocessing technology in nuclear energy systems. It seems to be the most developed and established process for light water reactor fuel and for solid fuel in general. However, demand driven development of the nuclear system opens the way to liquid fuelled reactors, and to disruptive technology development through the application of an integrated fuel cycle with a direct link to reactor operation. The possibilities of this new concept for innovative reprocessing technology development are analysed, the boundary conditions are discussed, and the economic as well as the neutron physical optimization parameters of the process are elucidated. Reactor physical knowledge of the influence of different elements on the neutron economy of the reactor is required. Using an innovative study approach, an element priority list for the salt clean-up is developed, which indicates that separation of neodymium and caesium is desirable, as they contribute almost 50% to the loss of criticality. Additionally separating zirconium and samarium from the fuel salt would remove nearly 80% of the loss of criticality due to fission products. The theoretical study is followed by a qualitative discussion of the different, demand driven optimization strategies which could satisfy the conflicting interests of sustainable reactor operation, efficient chemical processing for the salt clean-up, and the related economic as well as chemical engineering consequences. A new, innovative approach of balancing the throughput through salt processing based on a low number of separation process steps is developed. Next steps for the development of an economically viable salt clean-up process are identified.
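
The priority-list construction described above can be sketched as a greedy accumulation over per-element criticality-loss contributions. The fractions below are illustrative placeholders chosen only to reproduce the two aggregate figures quoted in the abstract (~50% for Nd+Cs, ~80% with Zr and Sm added); real values come from reactor-physics calculations:

```python
# Illustrative fractions of the fission-product criticality loss (placeholders,
# consistent with the ~50% and ~80% aggregates quoted in the abstract).
contributions = {"Nd": 0.28, "Cs": 0.22, "Zr": 0.17, "Sm": 0.13}

def priority_list(contrib: dict, target: float) -> list:
    """Smallest set of elements, taken in descending contribution order,
    whose cumulative contribution reaches `target`."""
    chosen, total = [], 0.0
    for elem, frac in sorted(contrib.items(), key=lambda kv: -kv[1]):
        if total >= target:
            break
        chosen.append(elem)
        total += frac
    return chosen

print(priority_list(contributions, 0.50))  # -> ['Nd', 'Cs']
print(priority_list(contributions, 0.80))  # -> ['Nd', 'Cs', 'Zr', 'Sm']
```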

  10. Demand driven salt clean-up in a molten salt fast reactor – Defining a priority list

    PubMed Central

    Litskevich, D.; Gregg, R.; Mount, A. R.

    2018-01-01

    The PUREX technology, based on aqueous processes, is currently the leading reprocessing technology in nuclear energy systems. It seems to be the most developed and established process for light water reactor fuel and for solid fuel in general. However, demand driven development of the nuclear system opens the way to liquid fuelled reactors, and to disruptive technology development through the application of an integrated fuel cycle with a direct link to reactor operation. The possibilities of this new concept for innovative reprocessing technology development are analysed, the boundary conditions are discussed, and the economic as well as the neutron physical optimization parameters of the process are elucidated. Reactor physical knowledge of the influence of different elements on the neutron economy of the reactor is required. Using an innovative study approach, an element priority list for the salt clean-up is developed, which indicates that separation of neodymium and caesium is desirable, as they contribute almost 50% to the loss of criticality. Additionally separating zirconium and samarium from the fuel salt would remove nearly 80% of the loss of criticality due to fission products. The theoretical study is followed by a qualitative discussion of the different, demand driven optimization strategies which could satisfy the conflicting interests of sustainable reactor operation, efficient chemical processing for the salt clean-up, and the related economic as well as chemical engineering consequences. A new, innovative approach of balancing the throughput through salt processing based on a low number of separation process steps is developed. Next steps for the development of an economically viable salt clean-up process are identified. PMID:29494604

  11. The used nuclear fuel problem - can reprocessing and consolidated storage be complementary?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, C.; Thomas, I.

    2013-07-01

    This paper describes our CISF (Consolidated Interim Storage Facilities) and Reprocessing Facility concepts and shows how they can be combined with a geologic repository to provide a comprehensive system for dealing with spent fuels in the USA. The performance of the CISF was logistically analyzed under six operational scenarios. A 3-stage plan has been developed to establish the CISF. Stage 1: the construction at the CISF site of only a rail receipt interface and a storage pad large enough for the number of casks that will be received. The construction of the CISF Canister Handling Facility, the Storage Cask Fabrication Facility, the Cask Maintenance Facility, and supporting infrastructure is performed during Stage 2. The construction and placement into operation of a water-filled pool repackaging facility is completed for Stage 3. By using this staged approach, the capital cost of the CISF is spread over a number of years. It also allows more time for a final decision on the geologic repository to be made. A recycling facility will also be built; this facility will use the NUEX recycling process, which is based on the aqueous PUREX solvent extraction process using a solvent of tri-N-butyl phosphate in a kerosene diluent. It is capable of processing spent fuel at a rate of 5 MT per day, at burn-ups up to 50 GWD per ton, with a minimum of 5 years out-of-reactor cooling.

  12. Dissolution of Simulated and Radioactive Savannah River Site High-Level Waste Sludges with Oxalic Acid & Citric Acid Solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    STALLINGS, MARY

    This report presents findings from tests investigating the dissolution of simulated and radioactive Savannah River Site sludges with 4 per cent oxalic acid and with mixtures of oxalic and citric acid previously recommended by a Russian team from the Khlopin Radium Institute and the Mining and Chemical Combine (MCC). Testing also included characterization of the simulated and radioactive waste sludges. Testing results showed the following: Dissolution of simulated HM and PUREX sludges with oxalic and citric acid mixtures at SRTC confirmed general trends reported previously by Russian testing. Unlike the previous Russian testing, six sequential contacts with a mixture of oxalic and citric acids at a 2:1 ratio (v/w) of acid to sludge did not produce complete dissolution of simulated HM and PUREX sludges. We observed that increased sludge dissolution occurred at a higher acid to sludge ratio, 50:1 (v/w), compared to the recommended ratio of 2:1 (v/w). We observed much lower dissolution of aluminum in a simulated HM sludge by sodium hydroxide leaching. We attribute the low aluminum dissolution in caustic to the high fraction of boehmite present in the simulated sludge. Dissolution of HLW sludges with 4 per cent oxalic acid and oxalic/citric acid followed general trends observed with simulated sludges. The limited testing suggests that a mixture of oxalic and citric acids is more efficient for dissolving HM and PUREX sludges, and provides a more homogeneous dissolution of HM sludge, than oxalic acid alone. Dissolution of HLW sludges in oxalic and oxalic/citric acid mixtures produced residual sludge solids that measured at higher neutron poison to equivalent 235U weight ratios than those in the untreated sludge solids. This finding suggests that the residual solids do not present an increased nuclear criticality safety risk. Generally, the neutron poison to equivalent 235U weight ratios of the acid solutions containing dissolved sludge components are lower than those in the untreated sludge solids. We recommend that these results be evaluated further to determine if these solutions contain sufficient neutron poisons. We observed low general corrosion rates in tests in which carbon steel coupons were contacted with solutions of oxalic acid, citric acid, and mixtures of oxalic and citric acids. Wall thinning can be minimized by maintaining short contact times with these acid solutions. We recommend additional testing with oxalic and oxalic/citric acid mixtures to measure dissolution performance of sludges that have not been previously dried. This testing should include tests to clearly ascertain the effects of total acid strength and metal complexation on dissolution performance. Further work should also evaluate the downstream impacts of citric acid on the SRS High-Level Waste System (e.g., radiochemical separations in the Salt Waste Processing Facility and addition of organic carbon in the Saltstone and Defense Waste Processing facilities).
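
The criticality-safety comparison described above reduces to a ratio check. A minimal sketch with placeholder masses (no measured values are taken from the report; the function name is my own):

```python
# Placeholder illustration of the neutron poison to equivalent 235U weight
# ratio comparison: residual solids are acceptable when their ratio is at
# least that of the untreated sludge.

def poison_to_u235_ratio(poison_mass_g: float, u235_equiv_mass_g: float) -> float:
    """Neutron poison to equivalent 235U weight ratio for a solids sample."""
    return poison_mass_g / u235_equiv_mass_g

untreated = poison_to_u235_ratio(120.0, 1.0)  # placeholder masses, grams
residual = poison_to_u235_ratio(180.0, 1.0)   # placeholder masses, grams

# A higher ratio in the residual solids implies no increased criticality risk:
print(residual >= untreated)  # -> True
```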

  13. State waste discharge permit application: 200 Area Treated Effluent Disposal Facility (Project W-049H)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-08-01

    As part of the original Hanford Federal Facility Agreement and Consent Order negotiations, US DOE, US EPA, and the Washington State Department of Ecology agreed that liquid effluent discharges to the ground at the Hanford Site are subject to permitting under the State Waste Discharge Permit Program (SWDP). This document constitutes the SWDP Application for the 200 Area TEDF stream, which includes the following streams discharged into the area: Plutonium Finishing Plant waste water; 222-S Laboratory Complex waste water; T Plant waste water; 284-W Power Plant waste water; PUREX chemical sewer; B Plant chemical sewer, process condensate, and steam condensate; and 242-A-81 Water Services waste water.

  14. Process control plan for 242-A Evaporator Campaign 95-1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Le, E.Q.; Guthrie, M.D.

    1995-05-18

    The wastes from tanks 106-AP, 107-AP, and 106-AW have been selected as candidate feed wastes for Evaporator Campaign 95-1. The wastes in tanks 106-AP and 107-AP are primarily from B Plant strontium processing and PUREX neutralized cladding removal, respectively. The waste in tank 106-AW originated primarily from the partially concentrated product of 242-A Evaporator Campaign 94-2. Approximately 8.67 million liters of waste from these tanks will be transferred to tank 102-AW during the campaign. Tank 102-AW is the dedicated waste feed tank for the evaporator and currently contains 647,000 liters of processable waste. The purpose of the 242-A Evaporator Campaign 95-1 Process Control Plan (hereafter referred to as the PCP) is to certify that the wastes in tanks 106-AP, 107-AP, 102-AW, and 106-AW are acceptable for processing through the evaporator and to provide a general description of the process strategies and activities which will take place during Campaign 95-1. The PCP also summarizes and presents a comprehensive characterization of the wastes in these tanks.

  15. Organic chemical aging mechanisms: An annotated bibliography. Waste Tank Safety Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samuels, W.D.; Camaioni, D.M.; Nelson, D.A.

    1993-09-01

    An annotated bibliography has been compiled of the potential chemical and radiological aging mechanisms of the organic constituents (non-ferrocyanide) that would likely be found in the USTs at Hanford. The majority of the work that has been conducted on the aging of organic chemicals used for extraction and processing of nuclear materials has been in conjunction with acid or PUREX type processes. At Hanford the waste being stored in the USTs has been stabilized with caustic. The aging factors considered in this work were radiolysis, hydrolysis, and nitrite/nitrate oxidation. The purpose of this work was two-fold: to determine whether or not research had been or is currently being conducted on the species associated with the Hanford UST waste, either as mixtures or as individual chemicals or chemical functionalities, and to determine what areas of chemical aging need to be addressed by further research.

  16. TPE/REE separation with the use of zirconium salt of HDBP

    NASA Astrophysics Data System (ADS)

    Glekov, R. G.; Shmidt, O. V.; Palenik, Yu. V.; Goletsky, N. D.; Sukhareva, S. Yu.; Fedorov, Yu. S.; Zilberman, B. Ya.

    2003-01-01

    Partitioning of long-lived radionuclides (minor actinides, fission products) is considered in the form of the TBP-compatible ZEALEX process for extraction separation of transplutonium elements (TPE) and rare-earth elements (REE), as well as Y, Mo, Fe, and residual amounts of Np, Pu, and U. A zirconium salt of dibutyl phosphoric acid (ZS-HDBP) dissolved in 30% TBP is used as the solvent. The process was tested in multistage centrifugal contactors. Lanthanides, Y, and TPE, as well as Mo and Fe, were extracted from high-level Purex raffinate, with Am and the ceric subgroup of REE being separated from the polyvalent elements by stripping with HNO3. TPE/REE partitioning was achieved in the second cycle of the ZEALEX process using DTPA in formic acid media. The integral decontamination factor of Am from La and Ce after both cycles is >200, from Pr and Nd 20-30, and from Sm and Eu 3.6; the REE strips in both cycles contained <0.1% of the initial amount of TPE.
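
The integral (two-cycle) decontamination factors quoted above multiply across sequential cycles. A short sketch with hypothetical per-cycle values (the abstract reports only the combined figures, e.g. >200 for La/Ce):

```python
# Decontamination factors of sequential separation cycles multiply:
# DF_total = DF_cycle1 * DF_cycle2 * ...  Per-cycle values below are
# illustrative assumptions, not data from the paper.

def integral_df(*cycle_dfs: float) -> float:
    """Overall decontamination factor across sequential cycles."""
    total = 1.0
    for df in cycle_dfs:
        total *= df
    return total

# Two hypothetical cycles whose product is consistent with the >200 figure:
print(integral_df(20.0, 12.0))  # -> 240.0
```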

  17. Recovery of fissile materials from nuclear wastes

    DOEpatents

    Forsberg, Charles W.

    1999-01-01

    A process for recovering fissile materials, such as uranium and plutonium, and rare earth elements from complex waste feed material, and converting the remaining wastes into a waste glass suitable for storage or disposal. The waste feed is mixed with a dissolution glass formed of lead oxide and boron oxide, resulting in oxidation, dehalogenation, and dissolution of metal oxides. Carbon is added to remove lead oxide, and a boron oxide fusion melt is produced. The fusion melt is essentially devoid of organic materials and halogens, and is easily and rapidly dissolved in nitric acid. After dissolution, uranium, plutonium, and rare earth elements are separated from the acid and recovered by processes such as PUREX or ion exchange. The remaining acid waste stream is vitrified to produce a waste glass suitable for storage or disposal. Potential waste feed materials include plutonium scrap and residue, miscellaneous spent nuclear fuel, and uranium fissile wastes. The initial feed materials may contain mixtures of metals, ceramics, amorphous solids, halides, organic material, and other carbon-containing material.

  18. Experience gained with the Synroc demonstration plant at ANSTO and its relevance to plutonium immobilization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jostsons, A.; Ridal, A.; Mercer, D.J.

    1996-05-01

    The Synroc Demonstration Plant (SDP) was designed and constructed at Lucas Heights to demonstrate the feasibility of Synroc production on a commercial scale (10 kg/hr) with simulated Purex liquid HLW. Since commissioning of the SDP in 1987, over 6000 kg of Synroc has been fabricated with a range of feeds and waste loadings. The SDP utilises uniaxial hot-pressing to consolidate Synroc. Pressureless sintering and hot-isostatic pressing have also been studied at smaller scales. The results of this extensive process development have been incorporated in a conceptual design for a radioactive plant to condition HLW from a reprocessing plant with a capacity to treat 800 tpa of spent LWR fuel. Synroc containing TRU, including Pu, and fission products has been fabricated and characterised in a glove-box facility and hot cells, respectively. The extensive experience in processing of Synroc over the past 15 years is summarised and its relevance to immobilization of surplus plutonium is discussed.

  19. Method of uranium reclamation from aqueous systems by reactive ion exchange. [US DOE patent application; anion exchange resin of copolymerized divinyl-benzene and styrene having quaternary ammonium groups and bicarbonate ligands]

    DOEpatents

    Maya, L.

    1981-11-05

    A reactive ion exchange method for separation and recovery of values of uranium, neptunium, plutonium, or americium from substantially neutral aqueous systems of said metals comprises contacting said system with an effective amount of a basic anion exchange resin of copolymerized divinyl-benzene and styrene having quaternary ammonium groups and bicarbonate ligands to achieve nearly 100% sorption of said actinyl ion onto said resin and an aqueous system practically free of said actinyl ions. The method is operational over an extensive range of concentrations from about 10/sup -6/ M to 1.0 M actinyl ion and a pH range of about 4 to 7. The method has particular application to treatment of waste streams from Purex-type nuclear fuel reprocessing facilities and hydrometallurgical processes involving U, Np, Pu, or Am.

  20. Organic Separation Test Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Russell, Renee L.; Rinehart, Donald E.; Peterson, Reid A.

    2014-09-22

    Separable organics have been defined as “those organic compounds of very limited solubility in the bulk waste and that can form a separate liquid phase or layer” (Smalley and Nguyen 2013), and result from three main solvent extraction processes: the U Plant Uranium Recovery Process, the B Plant Waste Fractionation Process, and the Plutonium Uranium Extraction (PUREX) Process. The primary organic solvents associated with tank solids are TBP, D2EHPA, and NPH. There is concern that, while this organic material is bound to the sludge particles as it is stored in the tanks, waste feed delivery activities, specifically transfer pump and mixer pump operations, could cause the organics to form a separated layer in the tank farms feed tank. Therefore, Washington River Protection Solutions (WRPS) is experimentally evaluating the potential of organic solvents separating from the tank solids (sludge) during waste feed delivery activities, specifically the waste mixing and transfer processes. Given the Hanford Tank Waste Treatment and Immobilization Plant (WTP) waste acceptance criterion per the Waste Feed Acceptance Criteria document (24590-WTP-RPT-MGT-11-014) that there is to be “no visible layer” of separable organics in the waste feed, a visible layer would render the batch unacceptable for transfer to WTP. This study is of particular importance to WRPS because of these WTP requirements.

  1. Back-end of the fuel cycle - Indian scenario

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wattal, P.K.

    Nuclear power has a key role in meeting the energy demands of India. This can be sustained by ensuring robust technology for the back end of the fuel cycle. Considering the modest indigenous resources of U and a huge Th reserve, India has adopted a three-stage Nuclear Power Programme (NPP) based on a 'closed fuel cycle' approach. This option of 'Recovery and Recycle' serves the twin objectives of ensuring an adequate supply of nuclear fuel and reducing the long-term radio-toxicity of the wastes. Reprocessing of the spent fuel by the Purex process is currently employed. High Level Liquid Waste (HLW) generated during reprocessing is vitrified and undergoes interim storage. Back-end technologies are constantly modified to address waste volume minimization and radio-toxicity reduction. Long-term management of HLW in the Indian context would involve partitioning of long-lived minor actinides and recovery of valuable fission products, specifically cesium. Recovery of minor actinides from HLW and their recycle is highly desirable for the sustained growth of India's NPP. In this context, a programme for developing and deploying partitioning technologies on an industrial scale is pursued. The partitioned elements could be transmuted in Fast Reactors (FRs) or Accelerator Driven Systems (ADS) as an integral part of a sustainable Indian NPP. (authors)

  2. Industrial scale-plant for HLW partitioning in Russia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dzekun, E.G.; Glagolenko, Y.V.; Drojko, E.G.

    1996-12-31

    The radiochemical plant of PA «Mayak» at Ozersk, which came on line in December 1948, originally for weapons plutonium production and later reoriented to the reprocessing of spent fuel, still keeps the HLW of the military program in storage. Application of the vitrification method since 1986 has not essentially reduced HLW volumes. As of September 1, 1995, the vitrification installations had processed 9590 m{sup 3} of HLW and incorporated 235 MCi of radionuclides into glass; however, only 1100 m{sup 3} and 20.5 MCi of this was waste of the military program. The reason is that the technology and equipment for vitrification were developed for the current waste of the Purex process, which is characterized by a low content of impurities corrosive to the materials of the vitrification installation. With reference to the HLW that is accumulating at PA «Mayak» in the course of weapons plutonium production, the program of scientific research work includes the following main directions: development of the technology and equipment of installations for immobilising HLW with a high content of impurities into a solid form in an induction melter; application of a high-temperature adsorption method for sorption of radionuclides from HLW on silica gel; and application of a partitioning method for radionuclides from HLW, based on extraction of cesium and strontium into cobalt dicarbollide or crown ethers, as well as on recovery of cesium radionuclides by sorption on inorganic sorbents. In this paper the results of work on the creation of the first industrial-scale plant for partitioning HLW by the extraction and sorption methods are reported.

  3. SPECTROSCOPIC ONLINE MONITORING FOR PROCESS CONTROL AND SAFEGUARDING OF RADIOCHEMICAL STREAMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bryan, Samuel A.; Levitskaia, Tatiana G.

    2013-09-29

    There is a renewed interest worldwide in promoting the use of nuclear power and closing the nuclear fuel cycle. The long-term successful use of nuclear power is critically dependent upon adequate and safe processing and disposition of the used nuclear fuel. Liquid-liquid extraction is a separation technique commonly employed for the processing of dissolved used nuclear fuel. The instrumentation used to monitor these processes must be robust, require little or no maintenance, and be able to withstand harsh environments such as high radiation fields and aggressive chemical matrices. This paper summarizes application of absorption and vibrational spectroscopic techniques, supplemented by physicochemical measurements, for radiochemical process monitoring. In this context, our team experimentally assessed the potential of Raman and spectrophotometric techniques for online real-time monitoring of U(VI)/nitrate ion/nitric acid and of Pu(IV)/Np(V)/Nd(III), respectively, in solutions relevant to spent fuel reprocessing. These techniques demonstrate robust performance in repetitive batch measurements of each analyte over a wide concentration range using simulant and commercial dissolved spent fuel solutions. Spectroscopic measurements served as training sets for multivariate data analysis to obtain partial least squares predictive models, which were validated using on-line centrifugal contactor extraction tests. Satisfactory prediction of the analyte concentrations in these preliminary experiments warrants further development of the spectroscopy-based methods for radiochemical process control and safeguarding. Additionally, the ability to identify material intentionally diverted from a liquid-liquid extraction contactor system was successfully tested using on-line process monitoring as a means to detect the amount of material diverted. A chemical diversion and its detection from a liquid-liquid extraction scheme were demonstrated using a centrifugal contactor system operating with the simulant PUREX extraction system of Nd(NO3)3/nitric acid aqueous phase and TBP/n-dodecane organic phase. During a continuous extraction experiment, a portion of the feed from a counter-current extraction system was diverted while the spectroscopic on-line process monitoring system simultaneously measured the feed, raffinate, and organic product streams. The amount observed to be diverted by on-line spectroscopic process monitoring was in excellent agreement with values based on the known mass of sample directly taken (diverted) from the system feed solution.
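
The calibration step described above (spectra used as training sets for models that predict concentration) can be sketched with synthetic data. This toy uses ordinary least squares as a simpler stand-in for the partial least squares models in the paper; every number here is synthetic, not instrument data:

```python
# Toy multivariate spectral calibration: fit a linear model mapping synthetic
# "spectra" to known concentrations, then predict an unknown sample.
import numpy as np

rng = np.random.default_rng(0)

n_samples, n_channels = 40, 10
pure_spectrum = rng.random(n_channels)       # response profile of one analyte
conc = rng.random(n_samples)                 # training-set concentrations
spectra = np.outer(conc, pure_spectrum)      # Beer-Lambert-like linearity
spectra += 0.001 * rng.standard_normal(spectra.shape)  # small measurement noise

# Least-squares calibration: coefficients b such that spectra @ b ~ conc.
b, *_ = np.linalg.lstsq(spectra, conc, rcond=None)

# Predict the concentration of a new "unknown" sample from its spectrum:
unknown_conc = 0.5
prediction = np.outer([unknown_conc], pure_spectrum) @ b
print(float(prediction[0]))  # close to 0.5
```

Real PLS calibration additionally compresses the spectra onto a few latent variables, which matters when channels are collinear and noisy; the linear-prediction idea is the same.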

  4. PROCESSING OF NEUTRON-IRRADIATED URANIUM

    DOEpatents

    Hopkins, H.H. Jr.

    1960-09-01

    An improved "Purex" process for separating uranium, plutonium, and fission products from nitric acid solutions of neutron-irradiated uranium is offered. Uranium is first extracted into tributyl phosphate (TBP) away from plutonium and fission products after adjustment of the acidity from 0.3 to 0.5 M and heating from 60 to 70 deg C. Coextracted plutonium, ruthenium, and fission products are fractionally removed from the TBP by three scrubbing steps with a 0.5 M nitric acid solution of ferrous sulfamate (FSA), from 3.5 to 5 M nitric acid, and water, respectively, and the purified uranium is finally recovered from the TBP by precipitation with an aqueous solution of oxalic acid. The plutonium in the 0.3 to 0.5 M acid solution is oxidized to the tetravalent state with sodium nitrite and extracted into TBP containing a small amount of dibutyl phosphate (DBP). Plutonium is then back-extracted from the TBP-DBP mixture with a nitric acid solution of FSA, reoxidized with sodium nitrite in the aqueous strip solution obtained, and once more extracted with TBP alone. Finally the plutonium is stripped from the TBP with dilute acid, and a portion of the strip solution thus obtained is recycled into the TBP-DBP for further purification.
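
One way to see why the patent specifies three successive scrub steps: if each scrub removes a fraction f of a co-extracted contaminant from the solvent, the residual falls geometrically with the number of scrubs. The removal fractions below are illustrative assumptions, not values from the patent:

```python
# Residual contaminant left in the TBP phase after sequential scrub contacts,
# assuming each contact independently removes a fraction f of what remains.

def residual_after_scrubs(removal_fractions) -> float:
    """Fraction of contaminant remaining in the solvent after each scrub."""
    remaining = 1.0
    for f in removal_fractions:
        remaining *= (1.0 - f)
    return remaining

# Three hypothetical 90%-effective scrubs leave ~0.1% of the contaminant:
print(round(residual_after_scrubs([0.9, 0.9, 0.9]), 6))  # -> 0.001
```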

  5. Alternative Chemical Cleaning Methods for High Level Waste Tanks: Simulant Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rudisill, T.; King, W.; Hay, M.

    Solubility testing with simulated High Level Waste tank heel solids has been conducted in order to evaluate two alternative chemical cleaning technologies for the dissolution of sludge residuals remaining in the tanks after the exhaustion of mechanical cleaning and sludge washing efforts. Tests were conducted with non-radioactive pure phase metal reagents, binary mixtures of reagents, and a Savannah River Site PUREX heel simulant to determine the effectiveness of an optimized, dilute oxalic/nitric acid cleaning reagent and pure, dilute nitric acid toward dissolving the bulk non-radioactive waste components. A focus of this testing was on minimization of oxalic acid additions during tank cleaning. For comparison purposes, separate samples were also contacted with pure, concentrated oxalic acid, which is the current baseline chemical cleaning reagent. In a separate study, solubility tests were conducted with radioactive tank heel simulants using acidic and caustic permanganate-based methods focused on the “targeted” dissolution of actinide species known to be drivers for Savannah River Site tank closure Performance Assessments. Permanganate-based cleaning methods were evaluated prior to and after oxalic acid contact.

  6. Hexavalent Americium recovery using Copper(III) periodate

    DOE PAGES

    McCann, Kevin; Brigham, Derek M.; Morrison, Samuel; ...

    2016-10-31

    Separation of americium from the lanthanides is considered one of the most difficult separation steps in closing the nuclear fuel cycle. One approach to this separation could involve oxidizing americium to the hexavalent state to form a linear dioxo cation while the lanthanides remain as trivalent ions. This work considers aqueous soluble Cu 3+ periodate as an oxidant under molar nitric acid conditions to separate hexavalent Am with diamyl amylphosphonate (DAAP) in n-dodecane. Initial studies assessed the kinetics of Cu 3+ periodate autoreduction in acidic media to aid in development of the solvent extraction system. Following characterization of the Cu 3+ periodate oxidant, solvent extraction studies optimized the recovery of Am from varied nitric acid media and in the presence of other fission product, or fission product surrogate, species. Short aqueous/organic contact times encouraged successful recovery of Am (distribution values as high as 2) from nitric acid media in the absence of redox active fission products. In the presence of a post-plutonium uranium redox extraction (post-PUREX) simulant aqueous feed, precipitation of tetravalent species (Ce, Ru, Zr) occurred and the distribution values of 241Am were suppressed, suggesting some oxidizing capacity of the Cu 3+ periodate is significantly consumed by other redox active metals in the simulant. Furthermore, the manuscript demonstrates Cu 3+ periodate as a potentially viable oxidant for Am oxidation and recovery and notes the consumption of oxidizing capacity observed in the presence of the post-PUREX simulant feed will need to be addressed for any approach seeking to oxidize Am for separations relevant to the nuclear fuel cycle.

  7. Hexavalent Americium Recovery Using Copper(III) Periodate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCann, Kevin; Brigham, Derek M.; Morrison, Samuel

    2016-11-21

    Separation of americium from the lanthanides is considered one of the most difficult separation steps in closing the nuclear fuel cycle. One approach to this separation could involve oxidizing americium to the hexavalent state to form a linear dioxo cation while the lanthanides remain as trivalent ions. This work considers aqueous soluble Cu3+ periodate as an oxidant under molar nitric acid conditions to separate hexavalent Am with diamyl amylphosphonate (DAAP) in n-dodecane. Initial studies assessed the kinetics of Cu3+ periodate auto-reduction in acidic media to aid in development of the solvent extraction system. Following characterization of the Cu3+ periodate oxidant, solvent extraction studies optimized the recovery of Am from varied nitric acid media and in the presence of other fission product, or fission product surrogate, species. Short aqueous/organic contact times encouraged successful recovery of Am (distribution values as high as 2) from nitric acid media in the absence of redox active fission products. In the presence of a post-PUREX simulant aqueous feed, precipitation of tetravalent species (Ce, Ru, Zr) occurred and the distribution values of 241Am were suppressed, suggesting some oxidizing capacity of the Cu3+ periodate is significantly consumed by other redox active metals in the simulant. The manuscript demonstrates Cu3+ periodate as a potentially viable oxidant for Am oxidation and recovery and notes the consumption of oxidizing capacity observed in the presence of the post-PUREX simulant feed will need to be addressed for any approach seeking to oxidize Am for separations relevant to the nuclear fuel cycle.

  8. Safeguard monitoring of direct electrolytic reduction

    NASA Astrophysics Data System (ADS)

    Jurovitzki, Abraham L.

    Nuclear power is regaining global prominence as a sustainable energy source as the world faces the consequences of depending on limited, fossil-based, CO2-emitting fuels. A key component to achieving this sustainability is to implement a closed nuclear fuel cycle. Without achieving this goal, a relatively small fraction of the energy value in nuclear fuel is actually utilized. This involves recycling of spent nuclear fuel (SNF): separating fissile actinides from waste products and using them to fabricate fresh fuel. Pyroprocessing is a viable option being developed for this purpose, with a host of benefits compared to other recycling options such as PUREX. Notably, pyroprocessing is ill suited to separating pure plutonium from spent fuel and thus has non-proliferation benefits. Pyroprocessing involves high temperature electrochemical and chemical processing of SNF in a molten salt electrolyte. During this batch process, several intermediate and final streams are produced that contain radioactive material. While pyroprocessing is ineffective at separating pure plutonium, there are various process misuse scenarios that could result in diversion of impure plutonium into one or more of these streams. This is a proliferation risk that should be addressed with innovative safeguards technology. One approach to meeting this challenge is to develop real-time monitoring techniques that can be implemented in the hot cells and coupled with the various unit operations involved in pyroprocessing. Current state-of-the-art monitoring techniques involve external chemical assaying, which requires sample removal from these unit operations. These methods do not meet the International Atomic Energy Agency's (IAEA) timeliness requirements. In this work, a number of monitoring techniques were assessed for their viability as online monitoring tools.
A hypothetical diversion scenario for the direct electrolytic reduction process was experimentally verified (using Nd2O3 as a surrogate for PuO2). Electrochemical analysis was demonstrated to be effective at detecting even very dilute concentrations of actinides as evidence for a diversion attempt.

  9. ADVANCED OXIDATION: OXALATE DECOMPOSITION TESTING WITH OZONE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ketusky, E.; Subramanian, K.

    At the Savannah River Site (SRS), oxalic acid is currently considered the preferred agent for chemically cleaning the large underground liquid radioactive waste tanks. It is applied only in the final stages of emptying a tank, when generally less than 5,000 kg of waste solids remain and slurrying-based removal methods are no longer effective. The use of oxalic acid is preferred because of its combined dissolution and chelating properties, and because corrosion of the carbon steel tank walls can be controlled. Although oxalic acid is the preferred agent, there are significant potential downstream impacts, including: (1) degraded evaporator operation; (2) oxalate precipitates taking away critically needed operating volume; and (3) eventual creation of significant volumes of additional feed to salt processing. As an alternative to dealing with the downstream impacts, oxalate decomposition using variations of an ozone-based Advanced Oxidation Process (AOP) was investigated. In general, AOPs use ozone or peroxide and a catalyst to create hydroxyl radicals. Hydroxyl radicals have among the highest oxidation potentials and are commonly used to decompose organics. Although oxalate is considered among the most difficult organics to decompose, the ability of hydroxyl radicals to decompose oxalate is well demonstrated. In addition, because AOPs are considered 'green', their use enables any net chemical additions to the waste to be minimized. In order to test the ability to decompose the oxalate and determine the decomposition rates, a test rig was designed in which 10 vol% ozone would be educted into a spent oxalic acid decomposition loop, with the loop maintained at 70 °C and recirculated at 40 L/min. 
    Each of the spent oxalic acid streams was created from three oxalic acid strikes of an F-Area simulant (i.e., PUREX: high Fe/Al concentration) and an H-Area simulant (i.e., H-Area modified PUREX: high Al/Fe concentration) after nearing dissolution equilibrium, and then decomposed to ≤ 100 parts per million (ppm) oxalate. Since AOP technology largely originated with using ultraviolet (UV) light as a primary catalyst, decomposition of the spent oxalic acid while well exposed to a medium-pressure mercury vapor light was considered the benchmark. However, with multivalent metals already contained in the feed and maintenance of the UV light a concern, testing was conducted to evaluate the impact of removing the UV light. In current AOP terminology, the test without the UV light would likely be considered an ozone-based, dark, ferrioxalate-type decomposition process. Specifically, the testing investigated: (1) the importance of the UV light to the decomposition rates when decomposing 1 wt% spent oxalic acid; (2) the impact of increasing the oxalic acid strength from 1 to 2.5 wt% on the decomposition rates; and (3) for F-Area testing, the advantage of increasing the spent oxalic acid flowrate from 40 L/min (liters/minute) to 50 L/min during decomposition of the 2.5 wt% spent oxalic acid. The results showed that removal of the UV light (from 1 wt% testing) slowed the decomposition rates in both the F- and H-Area testing. Specifically, for F-Area Strike 1, the time increased from about 6 hours to 8 hours. In H-Area, the impact was not as significant, with the time required for Strike 1 to be decomposed to less than 100 ppm increasing slightly, from 5.4 to 6.4 hours. For the spent 2.5 wt% oxalic acid decomposition tests (all) without the UV light, the F-Area decompositions required approximately 10 to 13 hours, while the corresponding H-Area decomposition times ranged from 10 to 21 hours. 
    For the 2.5 wt% F-Area sludge, the increased availability of iron likely caused the increased decomposition rates compared to the 1 wt% oxalic acid tests. In addition, for the F-Area testing, increasing the recirculation flow rate from 40 L/min to 50 L/min increased the decomposition rate, suggesting better use of the ozone.
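    If the decomposition is treated as pseudo-first-order in oxalate, an apparent rate constant can be back-calculated from the times reported above. This is a rough illustrative sketch, not the report's analysis; the assumed initial oxalate level of roughly 10,000 ppm for the 1 wt% spent acid is an approximation introduced here:

    ```python
    import math

    def apparent_rate_constant(c0_ppm: float, c_ppm: float, hours: float) -> float:
        """Apparent pseudo-first-order rate constant k (1/h) from C(t) = C0*exp(-k*t)."""
        return math.log(c0_ppm / c_ppm) / hours

    if __name__ == "__main__":
        c0 = 10_000.0   # assumed initial oxalate, ppm (~1 wt% spent acid)
        target = 100.0  # ppm endpoint used in the test program
        for label, hours in [("F-Area Strike 1, with UV", 6.0),
                             ("F-Area Strike 1, no UV", 8.0),
                             ("H-Area Strike 1, with UV", 5.4)]:
            k = apparent_rate_constant(c0, target, hours)
            print(f"{label}: k ≈ {k:.2f} 1/h")
    ```

    Under this assumption, the 6-to-8-hour slowdown on UV removal corresponds to roughly a 25% drop in the apparent rate constant.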

  10. Molecular dynamics simulation for the test of calibrated OPLS-AA force field for binary liquid mixture of tri-iso-amyl phosphate and n-dodecane.

    PubMed

    Das, Arya; Ali, Sk Musharaf

    2018-02-21

    Tri-isoamyl phosphate (TiAP) has been proposed as an alternative to tri-butyl phosphate (TBP) in the Plutonium Uranium Extraction (PUREX) process. Recently, we successfully calibrated and tested all-atom optimized potentials for liquid simulations using Mulliken partial charges for pure TiAP, TBP, and dodecane by performing molecular dynamics (MD) simulation. It is of immense importance to extend this potential to the various molecular properties of TiAP and TiAP/n-dodecane binary mixtures using MD simulation. Earlier efforts were devoted to finding a suitable force field that can explain both structural and dynamical properties by empirical parameterization. The present MD study therefore reports the structural, dynamical, and thermodynamical properties of TiAP-dodecane mixtures over the entire 0-1 mole fraction range, employing our calibrated Mulliken-embedded optimized potentials for liquid simulation (OPLS) force field. The calculated electric dipole moment of TiAP was almost unaffected by the TiAP concentration in the dodecane diluent. The calculated liquid densities of the TiAP-dodecane mixture are in good agreement with the experimental data. The mixture densities at different temperatures were also studied and found to decrease with temperature, as expected. The plot of diffusivities for TiAP and dodecane against mole fraction in the binary mixture intersects at a composition in the range of 25%-30% TiAP in dodecane, which is very close to the TBP/n-dodecane composition used in the PUREX process. The excess volume of mixing was found to be positive over the entire range of mole fraction, and the excess enthalpy of mixing was endothermic for both the TBP/n-dodecane and TiAP/n-dodecane mixtures, as reported experimentally. The spatial pair correlation functions are evaluated between TiAP-TiAP and TiAP-dodecane molecules. 
    Further, shear viscosity has been computed by performing non-equilibrium molecular dynamics employing the periodic perturbation method. The calculated shear viscosity of the binary mixture is in excellent agreement with the experimental values. The newly calibrated OPLS force field embedding Mulliken charges is shown to be equally reliable in predicting the structural and dynamical properties of the mixture without incorporating any arbitrary scaling of the force field or Lennard-Jones parameters. Further, the present MD simulation results demonstrate that the Stokes-Einstein relation breaks down at the molecular level. The present methodology might be adopted to evaluate the liquid-state properties of an aqueous-organic biphasic system, which is of great significance in interfacial science and technology.
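    The Stokes-Einstein breakdown noted above is usually checked by comparing a simulated diffusivity against the hydrodynamic prediction D = k_B·T/(6·π·η·r). A minimal sketch follows; the viscosity, temperature, radius, and MD diffusivity values are hypothetical placeholders, not numbers from the paper:

    ```python
    import math

    K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

    def stokes_einstein_D(T: float, eta: float, r: float) -> float:
        """Hydrodynamic (stick-boundary) diffusion coefficient, m^2/s.
        T in K, eta in Pa·s, r = effective hydrodynamic radius in m."""
        return K_B * T / (6.0 * math.pi * eta * r)

    if __name__ == "__main__":
        T = 298.15      # K (assumed)
        eta = 3.0e-3    # Pa·s, hypothetical mixture viscosity
        r = 4.0e-10     # m, hypothetical TiAP hydrodynamic radius
        d_hydro = stokes_einstein_D(T, eta, r)
        d_md = 2.0e-10  # m^2/s, hypothetical MD-derived diffusivity
        print(f"Stokes-Einstein D = {d_hydro:.2e} m^2/s")
        print(f"ratio D_MD / D_SE = {d_md / d_hydro:.2f}  (deviation from 1 indicates breakdown)")
    ```

    A ratio of MD-derived to hydrodynamic diffusivity that drifts with composition is the usual molecular-level signature of the breakdown the abstract reports.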

  11. Molecular dynamics simulation for the test of calibrated OPLS-AA force field for binary liquid mixture of tri-iso-amyl phosphate and n-dodecane

    NASA Astrophysics Data System (ADS)

    Das, Arya; Ali, Sk. Musharaf

    2018-02-01

    Tri-isoamyl phosphate (TiAP) has been proposed as an alternative to tri-butyl phosphate (TBP) in the Plutonium Uranium Extraction (PUREX) process. Recently, we successfully calibrated and tested all-atom optimized potentials for liquid simulations using Mulliken partial charges for pure TiAP, TBP, and dodecane by performing molecular dynamics (MD) simulation. It is of immense importance to extend this potential to the various molecular properties of TiAP and TiAP/n-dodecane binary mixtures using MD simulation. Earlier efforts were devoted to finding a suitable force field that can explain both structural and dynamical properties by empirical parameterization. The present MD study therefore reports the structural, dynamical, and thermodynamical properties of TiAP-dodecane mixtures over the entire 0-1 mole fraction range, employing our calibrated Mulliken-embedded optimized potentials for liquid simulation (OPLS) force field. The calculated electric dipole moment of TiAP was almost unaffected by the TiAP concentration in the dodecane diluent. The calculated liquid densities of the TiAP-dodecane mixture are in good agreement with the experimental data. The mixture densities at different temperatures were also studied and found to decrease with temperature, as expected. The plot of diffusivities for TiAP and dodecane against mole fraction in the binary mixture intersects at a composition in the range of 25%-30% TiAP in dodecane, which is very close to the TBP/n-dodecane composition used in the PUREX process. The excess volume of mixing was found to be positive over the entire range of mole fraction, and the excess enthalpy of mixing was endothermic for both the TBP/n-dodecane and TiAP/n-dodecane mixtures, as reported experimentally. The spatial pair correlation functions are evaluated between TiAP-TiAP and TiAP-dodecane molecules. 
    Further, shear viscosity has been computed by performing non-equilibrium molecular dynamics employing the periodic perturbation method. The calculated shear viscosity of the binary mixture is in excellent agreement with the experimental values. The newly calibrated OPLS force field embedding Mulliken charges is shown to be equally reliable in predicting the structural and dynamical properties of the mixture without incorporating any arbitrary scaling of the force field or Lennard-Jones parameters. Further, the present MD simulation results demonstrate that the Stokes-Einstein relation breaks down at the molecular level. The present methodology might be adopted to evaluate the liquid-state properties of an aqueous-organic biphasic system, which is of great significance in interfacial science and technology.

  12. Overview of reductants utilized in nuclear fuel reprocessing/recycling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patricia Paviet-Hartmann; Catherine Riddle; Keri Campbell

    2013-10-01

    Most of the aqueous processes developed or under consideration worldwide for the recycling of used nuclear fuel (UNF) utilize the oxidation-reduction properties of actinides to separate them from other radionuclides. Generally, after acid dissolution of the UNF (essentially in nitric acid solution), actinides are separated from the raffinate by liquid-liquid extraction using specific solvents, paired along the process with a particular reductant that allows the separation to occur. For example, the industrial PUREX process utilizes hydroxylamine as a plutonium reductant. Hydroxylamine has numerous advantages: not only does it have the proper attributes to reduce Pu(IV) to Pu(III), but it is also a non-metallic chemical that is readily decomposed to innocuous products by heating. However, it has been observed that the presence of high nitric acid concentrations or impurities (such as metal ions) in hydroxylamine solutions increases the likelihood of initiating an autocatalytic reaction. Recently there has been interest in the application of simple hydrophilic hydroxamic ligands such as acetohydroxamic acid (AHA) for the stripping of tetravalent actinides in the UREX process flowsheet. This approach is based on the high coordinating ability of hydroxamic acids with tetravalent actinides (Np and Pu) compared with hexavalent uranium. Thus, the use of AHA offers a route for controlling neptunium and plutonium in the UREX process by complexant-based stripping of Np(IV) and Pu(IV) from the TBP solvent phase, while U(VI) ions are not affected by AHA and remain solvated in the TBP phase. In the European GANEX process, AHA is also used to form hydrophilic complexes with actinides and strip them from the organic phase into nitric acid. However, AHA does not decompose completely when treated with nitric acid, which hampers nitric acid recycling. 
    In lieu of using AHA in the UREX+ process, formohydroxamic acid (FHA), although not commercially available, holds promise as a replacement for AHA. FHA undergoes hydrolysis to formic acid, which is volatile, thus allowing the recycling of nitric acid. Unfortunately, FHA powder was not stable in the experiments we ran in our laboratory. In addition, AHA and FHA also decompose to hydroxylamine, which may undergo an autocatalytic reaction. Other reductants are available and could be extremely useful for actinide separations. This review presents the plutonium reductants currently used in used nuclear fuel reprocessing and introduces innovative and novel reductants that could serve as reducing agents in future research on UNF separation.

  13. Micro-Raman Technology to Interrogate Two-Phase Extraction on a Microfluidic Device.

    PubMed

    Nelson, Gilbert L; Asmussen, Susan E; Lines, Amanda M; Casella, Amanda J; Bottenus, Danny R; Clark, Sue B; Bryan, Samuel A

    2018-05-21

    Microfluidic devices provide ideal environments to study solvent extraction. When droplets form and generate plug flow down the microfluidic channel, the device acts as a microreactor in which the kinetics of chemical reactions and interfacial transfer can be examined. Here, we present a methodology that combines chemometric analysis with online micro-Raman spectroscopy to monitor biphasic extractions within a microfluidic device. Among the many benefits of microreactors is the ability to maintain small sample volumes, which is especially important when studying solvent extraction in harsh environments, such as in separations related to the nuclear fuel cycle. In solvent extraction, the efficiency of the process depends on complex formation and rates of transfer in biphasic systems; it is therefore important to understand the kinetic parameters of an extraction system to maintain high efficiency and effectiveness. This monitoring provided concentration measurements in both organic and aqueous plugs as they were pumped through the microfluidic channel. The biphasic system studied comprised HNO3 as the aqueous phase and 30% (v/v) tributyl phosphate in n-dodecane as the organic phase, simulating the plutonium uranium reduction extraction (PUREX) process. Using pre-equilibrated solutions (post-extraction), the validity of the technique and methodology is illustrated. Following this validation, solutions that were not equilibrated were examined and the kinetics of interfacial mass transfer within the biphasic system were established. Kinetic extraction results were compared to kinetics already determined at the macro scale to demonstrate the efficacy of the technique.
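    Interfacial transfer of the kind measured plug-by-plug above is often modeled as first-order relaxation toward equilibrium, C(t) = C_eq + (C_0 − C_eq)·e^(−kt). A hedged sketch of recovering k from concentration-versus-residence-time data (the rate constant and concentrations are synthetic, not data from the paper):

    ```python
    import math

    def fit_rate_constant(times, concs, c_eq):
        """Least-squares slope of ln(C - C_eq) vs t gives -k for a first-order
        approach to equilibrium. times in s; concs, c_eq in consistent units."""
        ys = [math.log(c - c_eq) for c in concs]
        n = len(times)
        mx, my = sum(times) / n, sum(ys) / n
        slope = (sum((x - mx) * (y - my) for x, y in zip(times, ys))
                 / sum((x - mx) ** 2 for x in times))
        return -slope

    if __name__ == "__main__":
        # Synthetic aqueous-phase concentrations along the channel (hypothetical).
        k_true, c0, c_eq = 0.30, 4.0, 1.5   # 1/s, mol/L, mol/L
        times = [0.0, 1.0, 2.0, 4.0, 8.0]
        concs = [c_eq + (c0 - c_eq) * math.exp(-k_true * t) for t in times]
        print(f"recovered k = {fit_rate_constant(times, concs, c_eq):.3f} 1/s")
    ```

    In the microfluidic setting, residence time plays the role of t, so each plug position along the channel contributes one (t, C) point to the fit.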

  14. Engineering study of 50 miscellaneous inactive underground radioactive waste tanks located at the Hanford Site, Washington

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeman-Pollard, J.R.

    1994-03-02

    This engineering study addresses 50 inactive underground radioactive waste tanks. The tanks were formerly used for the following functions associated with plutonium and uranium separations and waste management activities in the 200 East and 200 West Areas of the Hanford Site: settling solids prior to disposal of supernatant in cribs and a reverse well; neutralizing acidic process wastes prior to crib disposal; receipt and processing of single-shell tank (SST) waste for uranium recovery operations; catch tanks to collect water that intruded into diversion boxes and transfer pipeline encasements and any leakage that occurred during waste transfer operations; and waste handling and process experimentation. Most of these tanks have not been in use for many years. Several projects have been planned and implemented from the 1970s through 1985 to remove waste and interim isolate or interim stabilize many of the tanks. Some tanks have been filled with grout within the past several years. Responsibility for final closure and/or remediation of these tanks is currently assigned to several programs, including Tank Waste Remediation Systems (TWRS), Environmental Restoration and Remedial Action (ERRA), and Decommissioning and Resource Conservation and Recovery Act (RCRA) Closure (D&RCP). Some are under facility landlord responsibility for maintenance and surveillance (i.e., Plutonium Uranium Extraction [PUREX]). However, most of the tanks are not currently included in any active monitoring or surveillance program.

  15. Tc-99 Decontamination From Heat Treated Gaseous Diffusion Membrane -Phase I

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oji, L.; Wilmarth, B.; Restivo, M.

    2017-03-13

    Uranium gaseous diffusion cascades represent a significant environmental challenge to dismantle, containerize, and dispose of as low-level radioactive waste. Baseline technologies rely on manual manipulations involving direct access to technetium-contaminated piping and materials. There is potential to utilize novel thermal decontamination technologies to remove the technetium and allow on-site disposal of the very large uranium converters. Technetium entered these gaseous diffusion cascades as a hexafluoride complex in the same fashion as uranium. Technetium, as the isotope Tc-99, is an impurity that follows uranium in the first cycle of the Plutonium and Uranium Extraction (PUREX) process. The technetium speciation, or exact form, in the gas diffusion cascades is not well defined. Several forms of Tc-99 compounds, mostly fluorinated technetium compounds with varying degrees of volatility, have been speculated by the scientific community to be present in these cascades. Therefore, it may be possible to use thermal desorption, which is independent of the technetium oxidation state, to perform in situ removal of the technetium as a volatile species and trap the radionuclide on sorbent traps that could be disposed of as low-level waste.

  16. FURTHER ASSESSMENTS OF THE ATTRACTIVENESS OF MATERIALS IN ADVANCED NUCLEAR FUEL CYCLES FROM A SAFEGUARDS PERSPECTIVE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bathke, C. G.; Jarvinen, G. D.; Wallace, R. K.

    2008-10-01

    This paper summarizes the results of an extension to an earlier study that examined the attractiveness of materials mixtures containing special nuclear materials (SNM) associated with the PUREX, UREX+, and COEX reprocessing schemes. This study focuses on the materials associated with the UREX, COEX, THOREX, and PYROX reprocessing schemes. This study also examines what is required to render plutonium “unattractive.” Furthermore, combining the results of this study with those from the earlier study permits a comparison of the uranium- and thorium-based fuel cycles on the basis of the attractiveness of the SNM associated with each fuel cycle. Both studies were performed at the request of the United States Department of Energy (DOE) and are based on the calculation of “attractiveness levels” couched in terms chosen for consistency with those normally used for nuclear materials in DOE nuclear facilities. The methodology and key findings are presented. Additionally, how these attractiveness levels relate to proliferation resistance (e.g., by increasing impediments to the diversion, theft, or undeclared production of SNM for the purpose of acquiring a nuclear weapon), and how they could be used to help inform policy makers, is discussed.

  17. The Attractiveness of Materials in Advanced Nuclear Fuel Cycles for Various Proliferation and Theft Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bathke, C. G.; Wallace, R. K.; Ireland, J. R.

    2010-09-01

    This paper is an extension to earlier studies [1,2] that examined the attractiveness of materials mixtures containing special nuclear materials (SNM) and alternate nuclear materials (ANM) associated with the PUREX, UREX, COEX, THOREX, and PYROX reprocessing schemes. This study extends the figure of merit (FOM) for evaluating attractiveness to cover a broad range of proliferant state and sub-national group capabilities. The primary conclusion of this study is that all fissile material needs to be rigorously safeguarded to detect diversion by a state and provided the highest levels of physical protection to prevent theft by sub-national groups; no “silver bullet” has been found that will permit the relaxation of current international safeguards or national physical security protection levels. This series of studies has been performed at the request of the United States Department of Energy (DOE) and is based on the calculation of "attractiveness levels" that are expressed in terms consistent with, but normally reserved for, nuclear materials in DOE nuclear facilities [3]. The expanded methodology and updated findings are presented. Additionally, how these attractiveness levels relate to proliferation resistance and physical security is discussed.

  18. The attractiveness of materials in advanced nuclear fuel cycles for various proliferation and theft scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bathke, Charles G; Wallace, Richard K; Ireland, John R

    2009-01-01

    This paper is an extension to earlier studies that examined the attractiveness of materials mixtures containing special nuclear materials (SNM) and alternate nuclear materials (ANM) associated with the PUREX, UREX, COEX, THOREX, and PYROX reprocessing schemes. This study extends the figure of merit (FOM) for evaluating attractiveness to cover a broad range of proliferant state and sub-national group capabilities. The primary conclusion of this study is that all fissile material needs to be rigorously safeguarded to detect diversion by a state and provided the highest levels of physical protection to prevent theft by sub-national groups; no 'silver bullet' has been found that will permit the relaxation of current international safeguards or national physical security protection levels. This series of studies has been performed at the request of the United States Department of Energy (DOE) and is based on the calculation of 'attractiveness levels' that are expressed in terms consistent with, but normally reserved for, nuclear materials in DOE nuclear facilities. The expanded methodology and updated findings are presented. Additionally, how these attractiveness levels relate to proliferation resistance and physical security is discussed.

  19. Annual Report, Fall 2016: Alternative Chemical Cleaning of Radioactive High Level Waste Tanks - Corrosion Test Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wyrwas, R. B.

    The testing presented in this report supports the Alternative Chemical Cleaning program's investigation into strategies and technologies for chemically cleaning radioactive high-level waste tanks prior to tank closure. The data and conclusions presented here come from examination of the corrosion rates of A285 carbon steel and 304L stainless steel exposed to two proposed chemical cleaning solutions: acidic permanganate (0.18 M nitric acid and 0.05 M sodium permanganate) and caustic permanganate (10 M sodium hydroxide and 0.05 M sodium permanganate). These solutions have been proposed for the retrieval of actinides in the sludge in the waste tanks and were tested with both HM and PUREX sludge simulants at a 20:1 ratio.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strickland, Christopher E.; Lawter, Amanda R.; Qafoku, Nikolla

    Isotopes of iodine were generated during plutonium production from nine production reactors at the U.S. Department of Energy Hanford Site. The long-lived 129I generated at the Hanford Site during reactor operations was 1) stored in single-shell and double-shell tanks, 2) discharged to liquid disposal sites (e.g., cribs and trenches), 3) released to the atmosphere during fuel reprocessing operations, or 4) captured by off-gas absorbent devices (silver reactors) at chemical separations plants (PUREX, B-Plant, T-Plant, and REDOX). Releases of 129I to the subsurface have resulted in several large, though dilute, plumes in the groundwater, including the plume in the 200-UP-1 operable unit. There is also 129I remaining in the vadose zone beneath disposal or leak locations. Because 129I is an uncommon contaminant, relevant remediation experience and scientific literature are limited.

  1. Radiation Stability of Benzyl Tributyl Ammonium Chloride towards Technetium-99 Extraction - 13016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paviet-Hartmann, Patricia; Horkley, Jared; Campbell, Keri

    2013-07-01

    A closed nuclear fuel cycle combining new separation technologies with Generation III and Generation IV reactors is a promising way to achieve a sustainable energy supply. It is important to keep in mind that future recycling processes for used nuclear fuel (UNF) must minimize wastes, improve partitioning processes, and integrate waste considerations into the processes themselves. New separation processes are being developed worldwide to complement the current industrial PUREX process, which selectively separates U(VI) and Pu(IV) from the raffinate. As an example, the UREX process has been developed in the United States to co-extract hexavalent uranium (U) and heptavalent technetium (Tc) with tri-n-butyl phosphate (TBP). Tc-99 is recognized as one of the most abundant, long-lived radiotoxic isotopes in UNF (half-life t1/2 = 2.13 × 10^5 years) and, as such, is targeted in UNF separation strategies for isolation and encapsulation in solid waste forms for final disposal in a nuclear waste repository. Immobilization of Tc-99 in a durable solid waste form is a challenge, and its fate in new advanced technology processes is important. It is essential to be able to quantify and locate 1) its occurrence in any newly developed flowsheets, 2) its chemical form in the individual phases of a process, 3) its potential quantitative transfer into any waste streams, and consequently 4) its quantitative separation for either potential transmutation to Ru-100 or isolation and encapsulation in solid waste forms for ultimate disposal. In addition, as a result of U(VI)-Tc(VII) co-extraction in a UREX-based process, Tc(VII) could be found in low-level waste (LLW) streams. There is a need for new extraction systems that would selectively extract Tc-99 from LLW streams and concentrate it for feed into high-level waste (HLW), for either Tc-99 immobilization in metallic waste forms (Tc-Zr alloys) and/or in borosilicate-based waste glass. 
    Studies have been launched to investigate the suitability of new macro-compounds such as crown ethers, aza-crown ethers, quaternary ammonium salts, and resorcinarenes for the selective extraction of Tc-99 from nitric acid solutions. The selectivity of the ligand is important in evaluating potential separation processes, and the radiation stability of the molecule is essential for minimization of waste and radiolysis products. In this paper, we report the extraction of TcO4− by benzyl tributyl ammonium chloride (BTBA). Experimental efforts focused on determining the best extraction conditions by varying the ligand's matrix conditions and concentration, as well as the organic phase composition (i.e., diluent variation). Furthermore, the ligand has been investigated for radiation stability. The γ-irradiation was performed on the neat organic phases containing the ligand at different absorbed doses up to a maximum of 200 kGy using an external Co-60 source. Post-irradiation solvent extraction measurements will be discussed. (authors)

  2. The Effects of Radiation Chemistry on Solvent Extraction: 1. Conditions in Acidic Solution and a Review of TBP Radiolysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bruce J. Mincher; Giuseppe Modolo; Stephen P. Mezyk

    2009-01-01

    Solvent extraction is the most commonly used process-scale separation technique for nuclear applications, and it benefits from more than 60 years of research and development and proven experience at the industrial scale. Advanced solvent extraction processes for the separation of actinides and fission products from dissolved nuclear fuel are now being investigated worldwide by numerous groups (USA, Europe, Russia, Japan, etc.) in order to decrease the radiotoxic inventories of nuclear waste. While none of the advanced processes have yet been implemented at the industrial scale, their development studies have sometimes reached demonstration tests at the laboratory scale. Most of the partitioning strategies rely on the following four separations: 1. Partitioning of uranium and/or plutonium from spent fuel dissolution liquors. 2. Separation of the heat-generating fission products such as strontium and cesium. 3. Coextraction of the trivalent actinides and lanthanides. 4. Separation of the trivalent actinides from the trivalent lanthanides. Tributylphosphate (TBP) in the first separation is the basis of the PUREX, UREX, and COEX processes, developed in Europe and the US, whereas monoamides as alternatives to TBP are being developed in Japan and India. For the second separation, many processes have been developed worldwide, including the use of crown-ether extractants, as in the FPEX process developed in the USA, and the CCD-PEG process jointly developed in the USA and Russia for the partitioning of cesium and strontium. In the third separation, carbamoylmethylphosphine oxides (CMPOs), malonamides, and diglycolamides are used in the TRUEX, DIAMEX, and ARTIST processes, developed respectively in the US, Europe, and Japan. Trialkyl phosphine oxide (TRPO), developed in China, and UNEX (a mixture of several extractants), jointly developed in Russia and the USA, allow all actinides to be co-extracted from acidic radioactive liquid waste. 
    For the final separation, soft-donor-atom-containing ligands such as the bis-triazinylbipyridines (BTBPs) or dithiophosphinic acids have been developed in Europe and China to selectively extract the trivalent actinides. In the TALSPEAK process developed in the USA, however, the separation is based on the relatively high affinity of aminopolycarboxylic acid complexants such as DTPA for trivalent actinides over lanthanides. In the DIDPA, SETFICS, and GANEX processes, developed in Japan and France, the group separation is accomplished in a reverse-TALSPEAK manner. A typical scenario is shown in Figure 1 for the UREX1a (Uranium Extraction version 1a) process. The initial step is TBP extraction for the separation of recyclable uranium. The second step partitions the short-lived, highly radioactive cesium and strontium to minimize heat loading in the high-level waste repository. The third step is a group separation of the trivalent actinides and lanthanides, with the last step being partitioning of the trivalent lanthanides from the actinides.

  3. Chemical Disposition of Plutonium in Hanford Site Tank Wastes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delegard, Calvin H.; Jones, Susan A.

    2015-05-07

    This report examines the chemical disposition of plutonium (Pu) in Hanford Site tank wastes, by itself and in its observed and potential interactions with the neutron absorbers aluminum (Al), cadmium (Cd), chromium (Cr), iron (Fe), manganese (Mn), nickel (Ni), and sodium (Na). Consideration also is given to the interactions of plutonium with uranium (U). The disposition of uranium itself as an element with fissile isotopes is not considered, except tangentially with respect to its interaction as an absorber for plutonium. The report begins with a brief review of Hanford Site plutonium processes, examining the various means used to recover plutonium from irradiated fuel and from scrap, and also examines the intermediate processing of plutonium to prepare useful chemical forms. The paper provides an overview of Hanford tank defined-waste-type compositions and some calculations of the ratios of plutonium to absorber elements in these waste types and in individual waste analyses. These assessments are based on Hanford tank waste inventory data derived from separately published, expert assessments of tank disposal records, process flowsheets, and chemical/radiochemical analyses. This work also investigates the distribution and expected speciation of plutonium in tank waste solution and solid phases. For the solid phases, both pure plutonium compounds and plutonium interactions with absorber elements are considered. These assessments of plutonium chemistry are based largely on analyses of idealized or simulated tank waste or strongly alkaline systems. The very limited information available on plutonium behavior, disposition, and speciation in genuine tank waste also is discussed. The assessments show that plutonium coprecipitates strongly with chromium, iron, manganese, and uranium absorbers. Plutonium's chemical interactions with aluminum, nickel, and sodium are minimal to non-existent.
Credit for neutronic interaction of plutonium with these absorbers occurs only if they are physically proximal in solution or the plutonium present in the solid phase is intimately mixed with compounds or solutions of these absorbers. No information on the potential chemical interaction of plutonium with cadmium was found in the technical literature. Definitive evidence of sorption or adsorption of plutonium onto various solid phases from strongly alkaline media is less clear-cut, perhaps owing to fewer studies and to some well-attributed tests run under conditions exceeding the very low solubility of plutonium. The several studies that are well-founded show that only about half of the plutonium is adsorbed from waste solutions onto sludge solid phases. The organic complexants found in many Hanford tank waste solutions seem to decrease plutonium uptake onto solids. A number of studies show plutonium sorbs effectively onto sodium titanate. Finally, this report presents findings describing the behavior of plutonium vis-à-vis other elements during sludge dissolution in nitric acid based on Hanford tank waste experience gained by lab-scale tests, chemical and radiochemical sample characterization, and full-scale processing in preparation for strontium-90 recovery from PUREX sludges.

  4. UNIT OPERATIONS SECTION MONTHLY PROGRESS REPORT, OCTOBER 1961

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whatley, M.E.; Haas, P.A.; Horton, R.W.

    1962-04-01

    Additional runs were made in the 6-in.-dia. separation column. The kinetics of the methane-copper oxide reaction were investigated in deep bed tests. The work on the development of the shear included a satisfactory method of gagging, a preliminary test of an outer gag faced with rubber, and a metallic inner gag contoured to the shape of a sheared assembly. The mechanical dejacketing of the SRE Core I fuel, NaK-bonded, stainless steel-clad uranium slugs, was successfully completed. The effective thermal conductivity of a packed bed of 0.023-in. steel shot was approximately 0.33 Btu/hr-ft-deg F at 200 deg F. Flow capacity for the compound extraction scrub column equipped with sieve plates (0.125-in.-dia.) was determined. Average waste calcination rates for Purex were higher by a factor of 1.5 to 2.0 than rates for TBP-25. (auth)

  5. Tc-99 Decontamination from heat-treated gaseous diffusion membrane - Phase I, Part B

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oji, L.; Restivo, M.; Duignan, M.

    2017-11-01

    Uranium gaseous diffusion cascades represent a significant environmental challenge to dismantle, containerize, and dispose of as low-level radioactive waste. Baseline technologies rely on manual manipulations involving direct access to technetium-contaminated piping and materials. There is a potential to utilize novel decontamination technologies to remove the technetium and allow for on-site disposal of the very large uranium converters. Technetium entered these gaseous diffusion cascades as a hexafluoride complex in the same fashion as uranium. Technetium, as the isotope Tc-99, is an impurity that follows uranium in the first cycle of the Plutonium and Uranium Extraction (PUREX) process. The technetium speciation, or exact form, in the gaseous diffusion cascades is not well defined. Several Tc-99 compounds, mostly fluorinated technetium species with varying degrees of volatility, are speculated by the scientific community to be present in these cascades. Therefore, there may be a possibility of using thermal or leaching desorption, which is independent of the technetium oxidation states, to perform an in situ removal of the technetium as a volatile species and trap the radionuclide on sorbent traps, which could be disposed of as low-level waste. Based on the positive results of the first part of this work, the use of steam as a thermal decontamination agent was further explored with a second piece of used barrier material from a different location. This new series of tests included exposing more of the material surface to the flow of high-temperature steam through a change in the reactor design, subjecting it to alternating periods of steam and vacuum, as well as determining whether a lower temperature steam, i.e., 121°C (250°F), would also be effective.
Along with these methods, one other simpler method, involving the leaching of the Tc-99 contaminated barrier material with a 1.0 M aqueous solution of ammonium carbonate, with and without sonication, was evaluated.

  6. Improving the Estimates of Waste from the Recycling of Used Nuclear Fuel - 13410

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, Chris; Willis, William; Carter, Robert

    2013-07-01

    Estimates are presented of wastes arising from the reprocessing of 50 GWd/tonne, 5-year- and 50-year-cooled used nuclear fuel (UNF) from Light Water Reactors (LWRs), using the 'NUEX' solvent extraction process. NUEX is a fourth-generation aqueous-based reprocessing system, comprising shearing and dissolution in nitric acid of the UNF; separation of uranium and mixed uranium-plutonium using solvent extraction in a development of the PUREX process using tri-n-butyl phosphate in a kerosene diluent; purification of the plutonium and uranium-plutonium products; and conversion of them to uranium trioxide and mixed uranium-plutonium dioxides, respectively. These products are suitable for use as new LWR uranium oxide and mixed oxide fuel, respectively. Each unit process is described and the wastes that it produces are identified and quantified. Quantification of the process wastes was achieved by use of a detailed process model developed using the Aspen Custom Modeler suite of software and based on both first-principles equilibrium and rate data, plus practical experience and data from the industrial-scale Thermal Oxide Reprocessing Plant (THORP) at the Sellafield nuclear site in the United Kingdom. By feeding this model with the known concentrations of all species in the incoming UNF, the species and their concentrations in all product and waste streams were produced as the output. By using these data, along with a defined set of assumptions, including regulatory requirements, it was possible to calculate the waste forms, their radioactivities, volumes, and quantities. Quantification of secondary wastes, such as plant maintenance, housekeeping, and clean-up wastes, was achieved by reviewing actual operating experience from THORP during its hot operation from 1994 to the present time.
This work was carried out under a contract from the United States Department of Energy (DOE) and, to enable DOE to make valid comparisons with other similar work, a number of assumptions were agreed. These include an assumed reprocessing capacity of 800 tonnes per year; the requirement to remove as waste forms the volatile fission products carbon-14, iodine-129, krypton-85, tritium, and ruthenium-106; the restriction of discharge of any water from the facility unless it meets US Environmental Protection Agency drinking water standards; no intentional blending of wastes to lower their classification; and the requirement for the recovered uranium to be sufficiently free from fission products and neutron-absorbing species to allow it to be re-enriched and recycled as nuclear fuel. The results from this work showed that over 99.9% of the radioactivity in the UNF can be concentrated via reprocessing into a fission-product-containing vitrified product, bottles of compressed krypton, and a cement grout containing the tritium, which together have a volume of only about one eighth that of the original UNF. The other waste forms have larger volumes than the original UNF but contain only the remaining 0.1% of the radioactivity. (authors)
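The headline figures above imply a simple activity balance. A minimal sketch, normalizing the original UNF volume to 1 and using only the quoted >99.9% activity and ~1/8 volume values:

```python
# Activity/volume bookkeeping using the figures quoted in the abstract.
unf_volume = 1.0                # normalized volume of the original UNF
concentrated_activity = 0.999   # fraction of activity in the concentrated forms
concentrated_volume = unf_volume / 8.0

# Specific activity of the vitrified/compressed forms relative to the UNF:
concentration_factor = concentrated_activity / concentrated_volume  # ~8x

# The remaining waste forms hold only ~0.1% of the activity in a volume
# larger than the UNF, so their relative specific activity is below 0.001.
residual_activity = 1.0 - concentrated_activity
```

The point of the arithmetic is that reprocessing raises the specific activity of the concentrated waste forms by roughly a factor of eight while leaving the bulkier secondary wastes at a specific activity orders of magnitude below that of the UNF.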

  7. Conceptual Model of Iodine Behavior in the Subsurface at the Hanford Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Truex, Michael J.; Lee, Brady D.; Johnson, Christian D.

    Isotopes of iodine were generated during plutonium production within the nine production reactors at the U.S. Department of Energy Hanford Site. The short half-life 131I that was released from the fuel into the atmosphere during the dissolution process in the Hanford Site 200 Area is no longer present at concentrations of concern in the environment. The long half-life 129I generated at the Hanford Site during reactor operations was (1) stored in single-shell and double-shell tanks, (2) discharged to liquid disposal sites (e.g., cribs and trenches), (3) released to the atmosphere during fuel reprocessing operations, or (4) captured by off-gas absorbent devices (silver reactors) at chemical separations plants (PUREX, B-Plant, T-Plant, and REDOX). Releases of 129I to the subsurface have resulted in several large, though dilute, plumes in the groundwater. There is also 129I remaining in the vadose zone beneath disposal or leak locations. The fate and transport of 129I in the environment and potential remediation technologies are currently being studied as part of environmental remediation activities at the Hanford Site. A conceptual model describing the nature and extent of subsurface contamination, factors that control plume behavior, and factors relevant to potential remediation processes is needed to support environmental remedy decisions. Because 129I is an uncommon contaminant, relevant remediation experience and scientific literature are limited. In addition, its behavior in the subsurface differs from that of other more common and important contaminants (e.g., U, Cr, and Tc) in terms of sorption (adsorption and precipitation) and aqueous-phase species transformation via redox reactions. Thus, the conceptual model also needs to both describe known contaminant and biogeochemical process information and identify aspects about which additional information is needed to effectively support remedy decisions.

  8. Annual report, spring 2015. Alternative chemical cleaning methods for high level waste tanks-corrosion test results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wyrwas, R. B.

    The testing presented in this report supports the investigation of the Alternative Chemical Cleaning program, which aids in developing strategies and technologies to chemically clean radioactive High Level Waste tanks prior to tank closure. The data and conclusions presented here derive from examination of the corrosion rates of A285 carbon steel and 304L stainless steel when exposed to a chemical cleaning solution composed of 0.18 M nitric acid and 0.5 wt. % oxalic acid. This solution has been proposed as a dissolution solution that would be used to remove the remaining hard-heel portion of the sludge in the waste tanks. This solution was combined with the HM and PUREX simulated sludge at dilution ratios that represent the bulk oxalic cleaning process (20:1 ratio, acid solution to simulant) and the cumulative volume associated with multiple acid strikes (50:1 ratio). The testing was conducted over 28 days at 50°C and deployed two methods to investigate the corrosion conditions; a passive weight-loss coupon and an active electrochemical probe were used to collect data on the corrosion rate and material performance. In addition to investigating the chemical cleaning solutions, electrochemical corrosion testing was performed on acidic and basic solutions containing sodium permanganate at room temperature to explore the corrosion impacts if these solutions were to be implemented to retrieve remaining actinides that are currently in the sludge of the tank.

  9. An Assessment of the Attractiveness of Material Associated with a MOX Fuel Cycle from a Safeguards Perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bathke, Charles G; Wallace, Richard K; Ireland, John R

    2009-01-01

    This paper is an extension to earlier studies that examined the attractiveness of material mixtures containing special nuclear materials (SNM) and alternate nuclear materials (ANM) associated with the PUREX, UREX, coextraction, THOREX, and PYROX reprocessing schemes. This study extends the figure of merit (FOM) for evaluating attractiveness to cover a broad range of proliferant State and sub-national group capabilities. This study also considers those materials that will be recycled and burned, possibly multiple times, in LWRs [e.g., plutonium in the form of mixed oxide (MOX) fuel]. The primary conclusion of this study is that all fissile material needs to be rigorously safeguarded to detect diversion by a State and provided the highest levels of physical protection to prevent theft by sub-national groups; no 'silver bullet' has been found that will permit the relaxation of current international safeguards or national physical security protection levels. This series of studies has been performed at the request of the United States Department of Energy (DOE) and is based on the calculation of 'attractiveness levels' that are expressed in terms consistent with, but normally reserved for, nuclear materials in DOE nuclear facilities. The expanded methodology and updated findings are presented. Additionally, how these attractiveness levels relate to proliferation resistance and physical security is discussed.

  10. AN ASSESSMENT OF THE ATTRACTIVENESS OF MATERIAL ASSOCIATED WITH A MOX FUEL CYCLE FROM A SAFEGUARDS PERSPECTIVE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bathke, C. G.; Ebbinghaus, B. B.; Sleaford, Brad W.

    2009-07-09

    This paper is an extension to earlier studies [1,2] that examined the attractiveness of material mixtures containing special nuclear materials (SNM) and alternate nuclear materials (ANM) associated with the PUREX, UREX, coextraction, THOREX, and PYROX reprocessing schemes. This study extends the figure of merit (FOM) for evaluating attractiveness to cover a broad range of proliferant State and sub-national group capabilities. This study also considers those materials that will be recycled and burned, possibly multiple times, in LWRs [e.g., plutonium in the form of mixed oxide (MOX) fuel]. The primary conclusion of this study is that all fissile material needs to be rigorously safeguarded to detect diversion by a State and provided the highest levels of physical protection to prevent theft by sub-national groups; no “silver bullet” has been found that will permit the relaxation of current international safeguards or national physical security protection levels. This series of studies has been performed at the request of the United States Department of Energy (DOE) and is based on the calculation of "attractiveness levels" that are expressed in terms consistent with, but normally reserved for, nuclear materials in DOE nuclear facilities [3]. The expanded methodology and updated findings are presented. Additionally, how these attractiveness levels relate to proliferation resistance and physical security is discussed.

  11. Experimental study on thermal hazard of tributyl phosphate-nitric acid mixtures using micro calorimeter technique.

    PubMed

    Sun, Qi; Jiang, Lin; Gong, Liang; Sun, Jin-Hua

    2016-08-15

    During PUREX spent nuclear fuel reprocessing, a mixture of tributyl phosphate (TBP) and a hydrocarbon solvent is employed as the organic phase to extract uranium, in consideration of radiation safety and resource recycling, while nitric acid is utilized to dissolve the sheared spent fuel pieces. However, once TBP contacts nitric acid or nitrates above 130°C, a heavy "red oil" layer can form, accompanied by thermal runaway reactions, which have caused several nuclear safety accidents. Considering the volatility of nitric acid and the weakness of the exotherms to be detected, the C80 micro calorimeter technique was used in this study to investigate the thermal decomposition of TBP mixed with nitric acid. Results show that the concentration of nitric acid greatly influences the thermal hazard of the system through direct reactions. Even with a low heating rate, if the concentration of nitric acid increases due to evaporation of water or improper operations, thermal runaway in the closed system could start at a low temperature.

  12. Americium-241 in surface soil associated with the Hanford site and vicinity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Price, K.R.; Gilbert, R.O.; Gano, K.A.

    1981-05-01

    Various kinds of surface soil samples were collected and analyzed for americium-241 (241Am) to examine the feasibility of improving soil sample data for the Hanford Surface Environmental Surveillance Program. Results do not indicate that a major improvement would occur if procedures were changed from the current practices. Conclusions from this study are somewhat tempered by the very low levels of 241Am (< 0.10 pCi/g dry weight) detected in surface soil samples and by the fact that statistical significance depended on the type of statistical tests used. In general, the average concentration of 241Am in soil crust (0 to 1.0 cm deep) was greater than in the corresponding subsurface layer (1.0 to 2.5 cm deep), and the average concentration of 241Am in some onsite samples collected near the PUREX facility was greater than in comparable samples collected 60 km upwind at an offsite location.

  13. POTENTIAL IMPACT OF BLENDING RESIDUAL SOLIDS FROM TANKS 18/19 MOUNDS WITH TANK 7 OPERATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eibling, R.; Hansen, E.; Pickenheim, B.

    2007-03-29

    High level waste tanks 18F and 19F have residual mounds of waste which may require removal before the tanks can be closed. Conventional slurry pump technology, previously used for waste removal and tank cleaning, has been incapable of removing these mounds from tanks 18F and 19F. A mechanical cleaning method has been identified that is potentially capable of removing and transferring the mound material to tank 7F for incorporation in a sludge batch for eventual disposal in high level waste glass by the Defense Waste Processing Facility. The Savannah River National Laboratory has been requested to evaluate whether the material transferred from tanks 18F/19F by the mechanical cleaning technology can later be suspended in Tank 7F by conventional slurry pumps after mixing with high level waste sludge. The proposed mechanical cleaning process for removing the waste mounds from tanks 18 and 19 may utilize a high-pressure water jet-eductor that creates a vacuum to mobilize solids. The high-pressure jet is also used to transport the suspended solids. The jet-eductor system will be mounted on a mechanical crawler for movement around the bottom of tanks 18 and 19. Based on physical chemical property testing of the jet-eductor system processed IE-95 zeolite and size-reduced IE-95 zeolite, the following conclusions were made: (1) The jet-eductor system processed zeolite has a mean and median particle size (volume basis) of 115.4 and 43.3 microns in water. Preferential settling of these large particles is likely. (2) The jet-eductor system processed zeolite rapidly generates settled solid yield stresses in excess of 11,000 Pascals in caustic supernates and will not be easily retrieved from Tank 7 with the existing slurry pump technology. (3) Settled size-reduced IE-95 zeolite (less than 38 microns) in caustic supernate does not generate yield stresses in excess of 600 Pascals in less than 30 days.
(4) Preferential settling of size-reduced zeolite is a function of the amount of sludge and the level of dilution for the mixture. (5) Blending the size-reduced zeolite into larger quantities of sludge can reduce the amount of preferential settling. (6) Periodic dilution or resuspension due to sludge washing or other mixing requirements will increase the chances of preferential settling of the zeolite solids. (7) Mixtures of Purex sludge and size-reduced zeolite did not produce yield stresses greater than 200 Pascals for settling times less than thirty days. Most of the sludge-zeolite blends did not exceed 50 Pascals. These mixtures should be removable by current pump technology if sufficient velocities can be obtained. (8) The settling rate of the sludge-zeolite mixtures is a function of the ionic strength (or supernate density) and the zeolite-sludge mixing ratio. (9) Simulant tests indicate that leaching of Si may be an issue for the processed Tank 19 mound material. (10) Floating zeolite fines observed in water for the jet-eductor system and size-reduced zeolite were not observed when the size-reduced zeolite was blended with caustic solutions, indicating that the caustic solutions cause the fines to agglomerate. Based on the test programs described in this report, successfully removing Tank 18/19 mound material from Tank 7 with the current slurry pump technology requires the reduction of the particle size of the Tank 18/19 mound material.
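The preferential-settling conclusions above follow the expected size dependence of settling velocity. As a rough illustration (not from the report), Stokes' law gives the low-Reynolds-number terminal velocity; the zeolite particle density of ~2000 kg/m3 assumed below is a hypothetical value chosen only to show the scaling:

```python
def stokes_velocity(d, rho_p, rho_f=1000.0, mu=1.0e-3, g=9.81):
    """Terminal settling velocity (m/s) of a small sphere via Stokes' law.
    d: particle diameter (m); rho_p/rho_f: particle/fluid density (kg/m3);
    mu: fluid viscosity (Pa*s). Valid only at low particle Reynolds number."""
    return g * d**2 * (rho_p - rho_f) / (18.0 * mu)

RHO_ZEOLITE = 2000.0  # kg/m3, assumed for illustration (not from the report)

v_coarse = stokes_velocity(115.4e-6, RHO_ZEOLITE)  # mean jet-eductor-processed size
v_fine = stokes_velocity(38.0e-6, RHO_ZEOLITE)     # size-reduced zeolite cutoff

# Settling velocity scales with d**2, so the coarse particles settle roughly
# (115.4/38)**2, about 9 times, faster than the size-reduced material,
# consistent with the preferential settling noted in conclusions (1) and (4).
ratio = v_coarse / v_fine
```

The quadratic size dependence is why reducing the mound material below 38 microns is the key requirement for keeping it suspended with existing slurry pumps.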

  14. Nuclear Fuel Reprocessing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harold F. McFarlane; Terry Todd

    2013-11-01

    Reprocessing is essential to closing the nuclear fuel cycle. Natural uranium contains only 0.7 percent 235U, the fissile (see glossary for technical terms) isotope that produces most of the fission energy in a nuclear power plant. Prior to being used in commercial nuclear fuel, uranium is typically enriched to 3–5% in 235U. If the enrichment process discards depleted uranium at 0.2 percent 235U, it takes more than seven tonnes of uranium feed to produce one tonne of 4%-enriched uranium. Nuclear fuel discharged at the end of its economic lifetime contains less than one percent 235U, but still more than the natural ore. Less than one percent of the uranium that enters the fuel cycle is actually used in a single pass through the reactor. The other naturally occurring isotope, 238U, directly contributes in a minor way to power generation. However, its main role is to transmute into plutonium by neutron capture and subsequent radioactive decay of unstable uranium and neptunium isotopes. 239Pu and 241Pu are fissile isotopes that produce more than 40% of the fission energy in commercially deployed reactors. It is recovery of the plutonium (and to a lesser extent the uranium) for use in recycled nuclear fuel that has been the primary focus of commercial reprocessing. Uranium targets irradiated in special purpose reactors are also reprocessed to obtain the fission product 99Mo, the parent isotope of technetium, which is widely used in medical procedures. Among the fission products, recovery of such expensive metals as platinum and rhodium is technically achievable, but not economically viable in current market and regulatory conditions. During the past 60 years, many different techniques for reprocessing used nuclear fuel have been proposed and tested in the laboratory. However, commercial reprocessing has been implemented along a single line of aqueous solvent extraction technology called the plutonium uranium reduction extraction (PUREX) process.
Similarly, hundreds of types of reactor fuels have been irradiated for different purposes, but the vast majority of commercial fuel is uranium oxide clad in zirconium alloy tubing. As a result, commercial reprocessing plants have relatively narrow technical requirements for the used nuclear fuel that is accepted for processing.
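The seven-tonnes-of-feed figure in the passage above follows from a 235U mass balance across the enrichment plant. A minimal check, using the standard natural assay of 0.711 wt% (which the text rounds to 0.7%):

```python
def feed_per_product(x_product, x_feed, x_tails):
    """Tonnes of feed per tonne of product from the U-235 mass balance:
    F * x_feed = P * x_product + (F - P) * x_tails
    =>  F / P = (x_product - x_tails) / (x_feed - x_tails)"""
    return (x_product - x_tails) / (x_feed - x_tails)

# 4% product, 0.711% natural feed, 0.2% tails (assays in wt% U-235)
factor = feed_per_product(4.0, 0.711, 0.2)  # ~7.4 tonnes feed per tonne of product
```

With these assays the factor comes out just above 7.4, consistent with "more than seven tonnes of uranium feed to produce one tonne of 4%-enriched uranium".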

  15. Reducing Actinide Production Using Inert Matrix Fuels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deinert, Mark

    2017-08-23

    The environmental and geopolitical problems that surround nuclear power stem largely from the long-lived transuranic isotopes of Am, Cm, Np, and Pu that are contained in spent nuclear fuel. New methods for transmuting these elements into more benign forms are needed. Current research efforts focus largely on the development of fast burner reactors, because it has been shown that they could dramatically reduce the accumulation of transuranics. However, despite five decades of effort, fast reactors have yet to achieve industrial viability. A critical limitation to this, and other such strategies, is that they require a type of spent fuel reprocessing that can efficiently separate all of the transuranics from the fission products with which they are mixed. Unfortunately, the technology for doing this on an industrial scale is still in development. In this project, we explore a strategy for transmutation that can be deployed using existing, current generation reactors and reprocessing systems. We show that use of an inert matrix fuel to recycle transuranics in a conventional pressurized water reactor could reduce overall production of these materials by an amount that is similar to what is achievable using proposed fast reactor cycles. Furthermore, we show that these transuranic reductions can be achieved even if the fission products are carried into the inert matrix fuel along with the transuranics, bypassing the critical separations hurdle described above. The implications of these findings are significant, because they imply that inert matrix fuel could be made directly from the material streams produced by the commercially available PUREX process. Zirconium dioxide would be an ideal choice of inert matrix in this context because it is known to form a stable solid solution with both fission products and transuranics.

  16. DESTRUCTION OF TETRAPHENYLBORATE IN TANK 48H USING WET AIR OXIDATION BATCH BENCH SCALE AUTOCLAVE TESTING WITH ACTUAL RADIOACTIVE TANK 48H WASTE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adu-Wusu, K.; Burket, P.

    2009-03-31

    Wet Air Oxidation (WAO) is one of the two technologies being considered for the destruction of tetraphenylborate (TPB) in Tank 48H. Batch bench-scale autoclave testing with radioactive (actual) Tank 48H waste is among the tests required in the WAO Technology Maturation Plan. The goal of the autoclave testing is to validate that the simulant being used for extensive WAO vendor testing adequately represents the Tank 48H waste. The test objective was to demonstrate comparable test results when running simulated waste and real waste under similar test conditions. Specifically: (1) Confirm the TPB destruction efficiency and rate (same reaction times) obtained from comparable simulant tests, (2) Determine the destruction efficiency of other organics including biphenyl, (3) Identify and quantify the reaction byproducts, and (4) Determine off-gas composition. Batch bench-scale stirred autoclave tests were conducted with simulated and actual Tank 48H wastes at SRNL. Experimental conditions were chosen based on continuous-flow pilot-scale simulant testing performed at Siemens Water Technologies Corporation (SWT) in Rothschild, Wisconsin. The following items were demonstrated as a result of this testing. (1) Tetraphenylborate was destroyed to below detection limits during the 1-hour reaction time at 280 °C. Destruction efficiency of TPB was > 99.997%. (2) Other organics (TPB-associated compounds), except biphenyl, were destroyed to below their respective detection limits. Biphenyl was partially destroyed in the process, mainly due to its propensity to reside in the vapor phase during the WAO reaction. Biphenyl is expected to be removed in the gas phase during the actual process, which is a continuous-flow system. (3) Reaction byproducts, remnants of MST, and the PUREX sludge were characterized in this work. Radioactive species, such as Pu, Sr-90, and Cs-137, were quantified in the filtrate and slurry samples.
Notably, Cs-137, boron, and potassium were shown to be soluble as a result of the WAO reaction. (4) Off-gas composition was measured in the resulting gas phase from the reaction. Benzene and hydrogen were formed during the reaction, but they were reasonably low in the off-gas at 0.096 and 0.0063 vol %, respectively. Considering the consistency in replicating similar test results with simulated waste and Tank 48H waste under similar test conditions, the results confirm the validity of the simulant for other WAO test conditions.

  17. THE ATTRACTIVENESS OF MATERIALS IN ADVANCED NUCLEAR FUEL CYCLES FOR VARIOUS PROLIFERATION AND THEFT SCENARIOS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bathke, C. G.; Ebbinghaus, Bartley B.; Collins, Brian A.

    2012-08-29

    We must anticipate that the day is approaching when details of nuclear weapons design and fabrication will become common knowledge. On that day we must be particularly certain that all special nuclear materials (SNM) are adequately accounted for and protected and that we have a clear understanding of the utility of nuclear materials to potential adversaries. To this end, this paper examines the attractiveness of material mixtures containing SNM and alternate nuclear materials associated with the plutonium-uranium reduction extraction (Purex), uranium extraction (UREX), coextraction (COEX), thorium extraction (THOREX), and PYROX (an electrochemical refining method) reprocessing schemes. This paper provides a set of figures of merit for evaluating material attractiveness that covers a broad range of proliferant state and subnational group capabilities. The primary conclusion of this paper is that all fissile material must be rigorously safeguarded to detect diversion by a state and must be provided the highest levels of physical protection to prevent theft by subnational groups; no 'silver bullet' fuel cycle has been found that will permit the relaxation of current international safeguards or national physical security protection levels. The work reported herein has been performed at the request of the U.S. Department of Energy (DOE) and is based on the calculation of 'attractiveness levels' that are expressed in terms consistent with, but normally reserved for, the nuclear materials in DOE nuclear facilities. The methodology and findings are presented. Additionally, how these attractiveness levels relate to proliferation resistance and physical security is discussed.

  18. Facing the challenge of predicting the standard formation enthalpies of n-butyl-phosphate species with ab initio methods

    NASA Astrophysics Data System (ADS)

    Saab, Mohamad; Réal, Florent; Šulka, Martin; Cantrel, Laurent; Virot, François; Vallet, Valérie

    2017-06-01

    Tributyl-phosphate (TBP), a ligand used in the PUREX liquid-liquid separation process for spent nuclear fuel, can form an explosive mixture in contact with nitric acid that might lead to a violent explosive thermal runaway. In the context of the safety of a nuclear reprocessing plant facility, it is crucial to predict the stability of TBP at elevated temperatures. So far, only the enthalpies of formation of TBP are available in the literature, with rather large uncertainties, while those of its degradation products, dibutyl- (HDBP) and monobutyl-phosphate (H2MBP), are unknown. To this end, we have used state-of-the-art quantum chemical methods to compute the formation enthalpies and entropies of TBP and its degradation products HDBP and H2MBP in the gas and liquid phases. Comparison of levels of quantum chemical theory revealed significant correlation effects on their electronic structures, requiring not only a high-level treatment of electronic correlation, namely local coupled cluster with single and double excitation operators and a perturbative treatment of triple excitations, but also extrapolation to the complete basis set limit to produce reliable and accurate thermodynamic data. Solvation enthalpies were computed with the conductor-like screening model for real solvents (COSMO-RS), for which we observe errors not exceeding 22 kJ mol-1. We thus propose, with a final uncertainty of about 20 kJ mol-1, standard enthalpies of formation for TBP, HDBP, and H2MBP of -1281.7 ± 24.4, -1229.4 ± 19.6, and -1176.7 ± 14.8 kJ mol-1, respectively, in the gas phase. In the liquid phase, the predicted values are -1367.3 ± 24.4, -1348.7 ± 19.6, and -1323.8 ± 14.8 kJ mol-1, to which we may add about -22 kJ mol-1 of error from the COSMO-RS solvent model. From these data, the complete hydrolysis of TBP is predicted to be exothermic but slightly endergonic.
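    The gas- and liquid-phase formation enthalpies quoted in this record imply a solvation enthalpy (liquid minus gas) for each species. A minimal sketch of that arithmetic, using only the values stated in the abstract (the dictionary layout is illustrative, not from the paper):

    ```python
    # Solvation enthalpy from the quoted standard formation enthalpies (kJ/mol):
    # dH_solv = dHf(liquid) - dHf(gas). Values are taken from the record above.
    dHf_gas = {"TBP": -1281.7, "HDBP": -1229.4, "H2MBP": -1176.7}
    dHf_liq = {"TBP": -1367.3, "HDBP": -1348.7, "H2MBP": -1323.8}

    dH_solv = {s: round(dHf_liq[s] - dHf_gas[s], 1) for s in dHf_gas}
    print(dH_solv)  # TBP: -85.6, HDBP: -119.3, H2MBP: -147.1
    ```

    The increasingly negative values from TBP to H2MBP are consistent with the stronger solvation expected as butyl groups are replaced by acidic protons.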

  19. Purex Plant comparison with 40 CFR 61, subpart H, and other referenced guidelines for the Product Removal (PR) (296-A-1) stack

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lohrasbi, J.

    Dose calculations for atmospheric radionuclide releases from the Hanford Site for calendar year (CY) 1992 were performed by Pacific Northwest Laboratory (PNL) using the approved US Environmental Protection Agency (EPA) CAP-88 computer model. Emissions from discharge points in the Hanford Site 100, 200, 300, 400, and 600 areas were calculated based on results of analyses of continuous and periodic sampling conducted at the discharge points. These calculated emissions were provided for inclusion in the CAP-88 model by area and by individual facility for those facilities having the potential to contribute more than 10 percent of the Hanford Site total or to result in an impact of greater than 0.1 mrem per year to the maximally exposed individual (MEI). Also included in the assessment of offsite dose modeling are the measured radioactive emissions from all Hanford Site stacks that have routine monitoring performed. Record sampling systems have been installed on all stacks and vents that use exhaust fans to discharge air that potentially may carry airborne radioactivity. Estimation of activity from ingrowth of long-lived radioactive progeny is not included in the CAP-88 model; therefore, the Hanford Site GENII code (Napier et al. 1988) was used to supplement the CAP-88 dose calculations. When the dose to the MEI located in the Ringold area was calculated, the effective dose equivalent (EDE) from combined Hanford Site radioactive airborne emissions was shown to be 3.7E-03 mrem. This value was reported in the annual air emission report prepared for the Hanford Site (RL 1993).

  20. Process for converting sodium nitrate-containing, caustic liquid radioactive wastes to solid insoluble products

    DOEpatents

    Barney, Gary S.; Brownell, Lloyd E.

    1977-01-01

    A method for converting sodium nitrate-containing, caustic, radioactive wastes to a solid, relatively insoluble, thermally stable form is provided and comprises the steps of reacting powdered aluminum silicate clay, e.g., kaolin, bentonite, dickite, halloysite, pyrophyllite, etc., with the sodium nitrate-containing radioactive wastes, which have a caustic concentration of about 3 to 7 M, at a temperature of 30°C to 100°C to entrap the dissolved radioactive salts in the aluminosilicate matrix. In one embodiment the sodium nitrate-containing, caustic, radioactive liquid waste, such as neutralized Purex-type waste, or the salts or oxides produced by evaporation or calcination of these liquid wastes (e.g., anhydrous salt cake), is converted at a temperature within the range of 30°C to 100°C to the solid mineral form cancrinite, which has the approximate chemical formula 2(NaAlSiO4)·xSalt·yH2O with x = 0.52 and y = 0.68 when the entrapped salt is NaNO3. In another embodiment the sodium nitrate-containing, caustic, radioactive liquid is reacted with the powdered aluminum silicate clay at a temperature within the range of 30°C to 100°C, and the resulting reaction product is air dried, either as loose powder or molded shapes (e.g., bricks), and then fired at a temperature of at least 600°C to form the solid mineral form nepheline, which has the approximate chemical formula NaAlSiO4. The leach rate of the entrapped radioactive salts with distilled water is reduced essentially to that of the aluminosilicate lattice, which is very low, e.g., in the range of 10^-2 to 10^-4 g/cm2-day for cancrinite and 10^-3 to 10^-5 g/cm2-day for nepheline.
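    The cancrinite stoichiometry quoted in this record (2(NaAlSiO4)·xSalt·yH2O with x = 0.52 and y = 0.68 for NaNO3) fixes the mass fraction of entrapped salt. A rough sketch of that calculation; the molar masses are approximate textbook values, not figures from the patent:

    ```python
    # Approximate molar masses, g/mol (illustrative values)
    M_NaAlSiO4 = 22.99 + 26.98 + 28.09 + 4 * 16.00   # nepheline/cancrinite framework unit
    M_NaNO3 = 22.99 + 14.01 + 3 * 16.00              # entrapped salt
    M_H2O = 18.02

    x, y = 0.52, 0.68  # stoichiometric coefficients from the patent formula
    m_total = 2 * M_NaAlSiO4 + x * M_NaNO3 + y * M_H2O
    salt_fraction = x * M_NaNO3 / m_total
    print(f"entrapped NaNO3 ≈ {salt_fraction:.1%} of cancrinite mass")  # ≈ 13.0%
    ```

    That is, roughly one eighth of the product mass is the immobilized salt, with the rest being the aluminosilicate cage and water.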

  1. Testing prediction capabilities of an 131I terrestrial transport model by using measurements collected at the Hanford nuclear facility.

    PubMed

    Apostoaei, A Iulian

    2005-05-01

    A model describing transport of 131I in the environment was developed by SENES Oak Ridge, Inc., for assessment of radiation doses and excess lifetime risk from 131I atmospheric releases from the Oak Ridge Reservation in Oak Ridge, Tennessee, and from the Idaho National Engineering and Environmental Laboratory in southeast Idaho. This paper describes the results of an exercise designed to test the reliability of this model and to identify the main sources of uncertainty in doses and risks estimated by it. The testing of the model was based on materials published by the International Atomic Energy Agency BIOMASS program, specifically environmental data collected after the release into the atmosphere of 63 curies of 131I during 2-5 September 1963, following an accident at the Hanford PUREX Chemical Separations Plant in Hanford, Washington. Measurements of activity in air, vegetation, and milk were collected in nine counties around Hanford during the first couple of months after the accident. The activity of 131I in the thyroid glands of two children was measured 47 d after the accident. The model developed by SENES Oak Ridge, Inc., was used to estimate concentrations of 131I in environmental media, thyroid doses for the general population, and the activity of 131I in the thyroid glands of the two children. Predicted concentrations of 131I in pasture grass and milk and thyroid doses were compared with similar estimates produced by other modelers. The SENES model was also used to estimate excess lifetime risk of thyroid cancer due to the September 1963 releases of 131I from Hanford. The SENES model was first calibrated and then applied to all locations of interest around Hanford without fitting the model parameters to a given location. Predictions showed that the SENES model reproduces satisfactorily the time-dependent and the time-integrated measured concentrations in vegetation and milk, and provides reliable estimates of 131I activity in the thyroids of children. The SENES model generated 131I concentrations closer to the observed concentrations than the predictions produced with other models. The inter-model comparison showed that the variation of thyroid doses among all participating models (SENES model included) was a factor of 3 for the general population, but a factor of 10 for the two studied children. Unlike other models, the SENES model allows a complete analysis of uncertainties in every predicted quantity, including estimated thyroid doses and risk of thyroid cancer. The uncertainties in the risk-per-unit-dose and the dose-per-unit-intake coefficients are major contributors to the uncertainty in the estimated lifetime risk and thyroid dose, respectively. The largest contributors to the uncertainty in the estimated concentration in milk are the feed-to-milk transfer factor (F(m)), the dry deposition velocity (V(d)), and the mass interception factor (r/Y)dry for the elemental form of iodine (I2). Exposure to the 1963 PUREX/Hanford accident produced low doses and risks for people living at the studied locations. The upper 97.5th percentile of the excess lifetime risk of thyroid cancer for the most extreme situations is about 10(-4). Measurements in pasture grass and milk at all locations around Hanford indicate a very low transfer of 131I from pasture to cow's milk (e.g., a feed-to-milk transfer coefficient, F(m), for commercial cows of about 0.0022 d L(-1)). These values are towards the low end of F(m) values measured elsewhere, and they are low compared to the F(m) values used in other dose reconstruction studies, including the Hanford Environmental Dose Reconstruction.
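    The feed-to-milk transfer coefficient quoted in this record (F(m) ≈ 0.0022 d/L) links a cow's daily 131I intake to the equilibrium milk concentration. A minimal sketch of that relation; the pasture concentration and feed intake rate below are hypothetical illustrations, not values from the study:

    ```python
    F_m = 0.0022        # feed-to-milk transfer coefficient from the record, d/L
    grass_conc = 500.0  # hypothetical 131I on pasture, Bq/kg dry matter
    feed_rate = 16.0    # hypothetical dry-matter intake by the cow, kg/d

    intake = grass_conc * feed_rate  # daily 131I intake, Bq/d
    C_milk = F_m * intake            # equilibrium milk concentration, Bq/L
    print(f"milk concentration ≈ {C_milk:.1f} Bq/L")
    ```

    Because C_milk scales linearly with F(m), a Hanford-derived F(m) several-fold below values used in other dose reconstructions lowers the milk pathway dose by the same factor.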

  2. Vadose Zone Transport Field Study: Summary Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ward, Andy L.; Conrad, Mark E.; Daily, William D.

    2006-07-31

    From FY 2000 through FY 2003, a series of vadose zone transport field experiments were conducted as part of the U.S. Department of Energy’s Groundwater/Vadose Zone Integration Project Science and Technology Project, now known as the Remediation and Closure Science Project, managed by the Pacific Northwest National Laboratory (PNNL). The series of experiments included two major field campaigns, one at the 299-E24-11 injection test site near PUREX and a second at a clastic dike site off Army Loop Road. The goals of these experiments were to improve our understanding of vadose zone transport processes; to develop data sets to validate and calibrate vadose zone flow and transport models; and to identify advanced monitoring techniques useful for evaluating flow-and-transport mechanisms and delineating contaminant plumes in the vadose zone at the Hanford Site. This report summarizes the key findings from the field studies and demonstrates how data collected from these studies are being used to improve conceptual models and develop numerical models of flow and transport in Hanford’s vadose zone. Results of these tests have led to a better understanding of the vadose zone. Fine-scale geologic heterogeneities, including grain fabric and lamination, were observed to have a strong effect on the large-scale behavior of contaminant plumes, primarily through increased lateral spreading resulting from anisotropy. Conceptual models have been updated to include lateral spreading, and numerical models of unsaturated flow and transport have been revised accordingly. A new robust model based on the concept of a connectivity tensor was developed to describe saturation-dependent anisotropy in strongly heterogeneous soils and has been incorporated into PNNL’s Subsurface Transport Over Multiple Phases (STOMP) simulator. Applications to field-scale transport problems have led to a better understanding of plume behavior at a number of sites where lateral spreading may have dominated waste migration (e.g., BC Cribs and Trenches). The improved models have also been coupled with inverse models and newly developed parameter scaling techniques to allow estimation of field-scale and effective transport parameters for the vadose zone. The development and utility of pedotransfer functions for describing fine-scale hydrogeochemical heterogeneity and for incorporating this heterogeneity into reactive transport models were explored. An approach based on grain-size statistics appears feasible and has been used to describe heterogeneity in hydraulic properties and sorption properties, such as the cation exchange capacity and the specific surface area of Hanford sediments. This work has also led to the development of inverse modeling capabilities for time-dependent, subsurface, reactive transport with transient flow fields using an automated optimization algorithm. In addition, a number of geophysical techniques were investigated for their potential to provide detailed information on subtle changes in lithology and bedding surfaces, plume delineation, and leak detection. High-resolution resistivity is now being used for detecting saline plumes at several waste sites at Hanford, including tank farms. Results from the field studies and associated analysis have appeared in more than 46 publications generated over the past 4 years. These publications include test plans and status reports, in addition to numerous technical notes and peer-reviewed papers.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shinichi Aose; Takafumi Kitajima; Kouji Ogasawara

    The CPF (Chemical Processing Facility) was constructed at the Nuclear Fuel Cycle Engineering Laboratories of JAEA (Japan Atomic Energy Agency) in 1980 as a basic research facility where spent fuel pins from fast reactors (FR) and high-level liquid waste can be handled. The renovation consists of remodeling of the CA-3 cell and laboratory A, and installation of glove boxes, hoods, and analytical equipment in laboratory C and the analytical laboratory. Maintenance equipment in the CA-5 cell that had been out of order was also repaired. The CA-3 cell is the main cell, in which key equipment such as a dissolver, a clarifier, and extractors is installed for carrying out hot tests using irradiated FR fuel. Because the CPF originally specialized in research on the Purex process, it was desired to pursue research and development of various new reprocessing processes. Formerly, equipment was arranged in a wide space and connected, not only to each other but also to the utility supply system, mainly by fixed stainless steel pipes. This caused a shortage of operating space and limited flexibility for basic experimental studies. Old equipment in the CA-3 cell, including vessels and pipes, was removed after successful decontamination, and new equipment was installed in conformance with the new design. To ease installation and rearrangement of the experimental equipment, the equipment is basically connected by flexible pipes. Because the dissolver can be easily replaced, various dissolution experiments can be conducted. Insoluble residue generated by dissolution of spent fuel is clarified by a centrifugal clarifier; this small apparatus is effective for saving space. Mini mixer-settlers or centrifugal contactors are placed in the prescribed limited space in front of the back wall. Fresh reagents such as solvent, scrubbing, and stripping solutions are continuously fed from laboratory A to the extractor by the reagent supply system with a semi-automatic observation system. The in-cell crane in CA-5 was renovated to increase driving efficiency; before this renovation, a full-scale mockup test and a 3D simulation test were executed. After the renovation, hot tests in the CPF resumed in JFY 2002. New equipment such as a dissolver, extractor, and electrolytic device was installed in CA-3 in conformance with the new design, laid out to ensure function and space. Glove boxes in the analytical laboratory were renewed to provide flexibility for conducting basic experiments (e.g., U crystallization). Glove boxes and hoods were newly installed in laboratory A for basic research and analysis, especially on minor actinide (MA) chemistry. One laboratory (laboratory C) was established for research on dry reprocessing. The renovation of the CPF has been executed to contribute to the development of the advanced fast reactor fuel cycle system, which will present many technical subjects and experimental themes to be addressed in the second generation of the CPF.

  4. Prototype Demonstration of Gamma- Blind Tensioned Metastable Fluid Neutron/Multiplicity/Alpha Detector – Real Time Methods for Advanced Fuel Cycle Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDeavitt, Sean M.

    This report summarizes a multi-year effort to develop prototype detection equipment using the Tensioned Metastable Fluid Detector (TMFD) technology developed by Taleyarkhan [1]. The context of this development effort was to create new methods for evaluating and developing advanced methods for safeguarding nuclear materials, along with instrumentation for various stages of the fuel cycle, especially in material balance areas (MBAs) and during reprocessing of used nuclear fuel. One of the challenges related to the implementation of any type of MBA and/or reprocessing technology (e.g., PUREX or UREX) is the real-time quantification and control of the transuranic (TRU) isotopes as they move through the process. Monitoring of higher actinides from their neutron emission (including multiplicity) and alpha signatures during transit in MBAs and in aqueous separations is a critical research area. By providing on-line real-time materials accountability, diversion of the materials becomes much more difficult. The Tensioned Metastable Fluid Detector (TMFD) is a transformational technology that is uniquely capable of both alpha and neutron spectroscopy while being “blind” to the intense gamma field that typically accompanies used fuel, and it simultaneously provides multiplicity information as well [1-3]. The TMFD technology was proven (lab-scale) as part of a 2008 NERI-C program [1-7]. The bulk of this report describes the advancements and demonstrations made in TMFD technology. One final point to present before turning to the TMFD demonstrations is the context for discussing real-time monitoring of SNM. It is useful to review the spectrum of isotopes generated within nuclear fuel during reactor operations. Used nuclear fuel (UNF) from a light water reactor (LWR) contains fission products as well as TRU elements formed through neutron absorption/decay chains. The majority of the fission products are gamma and beta emitters, and they represent the more significant hazards from a radiation protection standpoint. However, alpha- and neutron-emitting uranium and TRU elements represent the more significant safeguards and security concerns. Table 1.1 presents a representative PWR inventory of the uranium and actinide isotopes present in a used fuel assembly. The uranium and actinide isotopes (chiefly the Pu, Am, and Cm elements) are all emitters of alpha particles, and some of them release significant quantities of neutrons through spontaneous fission.

  5. Study of Compton suppression for use in spent nuclear fuel assay

    NASA Astrophysics Data System (ADS)

    Bender, Sarah

    The focus of this study has been to assess Compton suppressed gamma-ray detection systems for the multivariate analysis of spent nuclear fuel. This objective has been achieved through direct measurement of samples of irradiated fuel elements in two geometrical configurations with Compton suppression systems. To quantify the number of additionally resolvable photopeaks, direct Compton suppressed spectroscopic measurements of spent nuclear fuel were performed in two configurations: as intact fuel elements and as dissolved feed solutions. These measurements directly assessed and quantified the differences in the measured gamma-ray spectra resulting from the application of Compton suppression. Several irradiated fuel elements of varying cooling time from the Penn State Breazeale Reactor spent fuel inventory were measured using three Compton suppression systems that utilized different primary detectors: HPGe, LaBr3, and NaI(Tl). The application of Compton suppression using a LaBr3 primary detector to the measurement of the current core fuel element, which presented the highest count rate, allowed four additional spectral features to be resolved. In comparison, the HPGe-CSS was able to resolve eight additional photopeaks as compared to the standalone HPGe measurement. Measurements with the NaI(Tl) primary detector were unable to resolve any additional peaks, due to its relatively low resolution. Samples of Approved Test Material (ATM) commercial fuel elements were obtained from Pacific Northwest National Laboratory. The samples had been processed using the beginning stages of the PUREX method and represented the unseparated feed solution from a reprocessing facility. Compton suppressed measurements of the ATM fuel samples were recorded inside the guard detector annulus, to simulate the siphoning of small quantities from the main process stream for long dwell measurement periods. Photopeak losses were observed in the measurements of the dissolved ATM fuel samples because the spectra were recorded with the source in very close proximity to the detector and surrounded by the guard annulus, so the detection probability is very high. Though this configuration is optimal for a Compton suppression system for the measurement of low count rate samples, measurement of high count rate samples in the enclosed arrangement leads to sum peaks in both the suppressed and unsuppressed spectra and losses to photopeak counts in the suppressed spectra. No additional photopeaks were detected using Compton suppression with this geometry. A detector model was constructed that can accurately simulate a Compton suppressed spectral measurement of radiation from spent nuclear fuel using HPGe or LaBr3 detectors. This is the first detector model capable of such an accomplishment. The model uses the Geant4 toolkit coupled with the RadSrc application, and it accepts spent fuel composition data in list form. The model has been validated using dissolved ATM fuel samples in the standard, enclosed geometry of the PSU HPGe-CSS. The model showed generally good agreement with both the unsuppressed and suppressed measured fuel sample spectra; however, the simulation is more appropriate for the generation of gamma-ray spectra in the beam source configuration. Photopeak losses due to cascade decay emissions in the Compton suppressed spectra were not appropriately managed by the simulation. Compton suppression would be a beneficial addition to NDA process monitoring systems if oriented such that the gamma-ray photons are collimated to impinge on the primary detector face as a beam. The analysis has shown that peak losses through accidental coincidences are minimal and that the reduction in the Compton continuum allows additional peaks to be resolved. (Abstract shortened by UMI.)

  6. ARRAYS OF BOTTLES OF PLUTONIUM NITRATE SOLUTION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Margaret A. Marshall

    2012-09-01

    In October and November of 1981, thirteen approaches-to-critical were performed on a remote split table machine (RSTM) in the Critical Mass Laboratory of Pacific Northwest Laboratory (PNL) in Richland, Washington, using planar arrays of polyethylene bottles filled with plutonium (Pu) nitrate solution. Arrays of up to sixteen bottles were used to measure the critical number of bottles and critical array spacing with a tight-fitting Plexiglas® reflector on all sides of the arrays except the top. Some experiments used Plexiglas shells fitted around each bottle to determine the effect of moderation on criticality. Each bottle contained approximately 2.4 L of Pu(NO3)4 solution with a Pu content of 105 g Pu/L and a free acid molarity H+ of 5.1. The plutonium was of low 240Pu (2.9 wt.%) content. These experiments were sponsored by Rockwell Hanford Operations because of the lack of experimental data on the criticality of arrays of bottles of Pu solution such as might be found in storage and handling at the Purex Facility at Hanford. The results of these experiments were used “to provide benchmark data to validate calculational codes used in criticality safety assessments of [the] plant configurations” (Ref. 1). Data for this evaluation were collected from the published report (Ref. 1), the approach-to-critical logbook, the experimenter’s logbook, and communication with the primary experimenter, B. Michael Durst. Of the 13 experiments performed, 10 were evaluated. One experiment was not evaluated because it had been thrown out by the experimenter, another because it was a repeat of an earlier experiment, and a third because it reported the critical number of bottles as being greater than 25. Seven of the evaluated experiments were determined to be acceptable benchmark experiments. A similar experiment using uranyl nitrate was benchmarked as U233-SOL-THERM-014.

  7. Air pathway effects of nuclear materials production at the Hanford Site, 1983 to 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patton, G.W.; Cooper, A.T.

    1993-10-01

    This report describes the air pathway effects of Hanford Site operations from 1983 to 1992 on the local environment by summarizing the air concentrations of selected radionuclides at both onsite and offsite locations, comparing trends in environmental concentrations to changing facility emissions, and briefly describing trends in the radiological dose to the hypothetical maximally exposed member of the public. The years 1983 to 1992 represent the last Hanford Site plutonium production campaign, and this report deals mainly with the air pathway effects from the 200 Areas, in which the major contributors to radiological emissions were located. An additional purpose for this report was to review the environmental data over a long period of time to provide insight not available in an annual report format. The sampling and analytical systems used by the Surface Environmental Surveillance Project (SESP) to collect air samples during the period of this report were sufficiently sensitive to observe locally elevated concentrations of selected radionuclides near onsite sources of emission, as well as elevated levels, compared to distant locations, of some radionuclides at the downwind perimeter. The US DOE Derived Concentration Guides (DCGs) for airborne radionuclides were not exceeded for any air sample collected during 1983 to 1992, with annual average concentrations of all radionuclides at the downwind perimeter being considerably below the DCG values. Air emissions at the Hanford Site during the period of this report were dominated by releases from the PUREX Plant, with {sup 85}Kr being the major release on a curie basis and {sup 129}I being the major release on a radiological dose basis. The estimated potential radiological dose from Hanford Site point source emissions to the hypothetical maximally exposed individual (MEI) ranged from 0.02 to 0.22 mrem/yr (effective dose equivalent), which is well below the DOE radiation limit to the public of 100 mrem/yr.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Law, A.G.; Serkowski, J.A.; Schatz, A.L.

    The Separations Area ground-water monitoring network consisted of 137 wells. Samples from wells in the monitoring network were collected on a monthly, quarterly, or semiannual schedule, depending on the history of the liquid waste disposal site. Samples were analyzed selectively for total alpha, total beta, tritium, /sup 90/Sr, /sup 137/Cs, /sup 60/Co, /sup 106/Ru, total uranium, and nitrate. Average concentrations of contaminants in most wells were essentially the same in 1986 as in 1985. The DCG for tritium was exceeded at two PUREX cribs. The ACL specified for /sup 90/Sr was exceeded in three wells near the 216-A-25 Pond. Disposal of effluents to the pond decreased as the main pond was reduced in width to a ditch leading to the overflow pond. The ACL guidelines for uranium were exceeded although concentrations were below the DCG; the source of this uranium is probably the inactive 216-B-12 crib. Uranium concentrations above the ACL but below the DCG were also observed at the 216-U-14 ditch, and the source is under evaluation. The inactive 216-B-5 reverse well exceeded the DCG for /sup 90/Sr and the ACL for /sup 137/Cs and uranium. Inactive facilities exceeding Rockwell guidelines were the 216-S-1/2 cribs, the 216-U-1/2 cribs, the 216-U-10 pond, and the 216-U-6 crib. The 216-S-1/2 cribs have historically had high /sup 137/Cs concentrations because of localized contamination but are below the DCG. Uranium concentrations, which are above the DCG, have stabilized at the 216-U-1/2 cribs after the remedial pumping and uranium removal conducted in 1985. Possible additional action is currently being evaluated. Disposal of the effluent from the ion exchange column to the 216-S-25 crib resulted in ground-water concentrations that exceeded Rockwell guidelines but were below the DCG. Ground water near the 216-U-10 pond remains elevated but below the DCG due to past disposal to the pond, which was deactivated in 1984. 23 refs., 25 figs., 26 tabs.

  9. Status of the French Research on Partitioning and Transmutation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warin, Dominique

    2007-07-01

    The global energy context pleads in favor of a sustainable development of nuclear energy, since the demand for energy will likely increase, whereas resources will tend to get scarcer and the prospect of global warming will drive down the consumption of fossil fuel sources. How we deal with radioactive waste is crucial in this context. The production of nuclear energy in France has been associated, since its inception, with the optimization of radioactive waste management, including the partitioning and recycling of recoverable energetic materials. The public's concern regarding long-term waste management led the French Government to prepare and pass the December 1991 Law, requesting, in particular, fifteen years of study of solutions for further minimizing the quantity and hazardousness of final waste via partitioning and transmutation. At the end of these fifteen years of research, it is considered that partitioning techniques, which have been validated on real solutions, are now available. Indeed, the aqueous process for separation of minor actinides from the PUREX raffinate has been brought to a point where there is reasonable assurance that industrial deployment can be successful. A key experiment was the successful kilogram-scale trials in the CEA-Marcoule Atalante facility in 2005, and this result, together with the results obtained in the frame of the successive European projects, constitutes a considerable step forward. For transmutation, CEA has conducted programs proving the feasibility of the elimination of minor actinides and fission products: fabrication of specific targets and fuels for transmutation tests in the HFR and Phenix reactors, and neutronics and technology studies for critical reactors and ADS developments. Scenario studies have also made it possible to assess the feasibility, at the level of cycle and fuel facilities, and the efficiency of transmutation in terms of the quantitative reduction of the final waste inventory, depending on the reactor fleet (PWR-FR-ADS). Important results are now available concerning the possibility of significantly reducing the quantity and radiotoxicity of long-lived waste in association with a sustainable development of nuclear energy. As France has confirmed its long-term approach to nuclear energy, the most effective implementation of P and T of minor actinides relies on the fast neutron GEN IV systems, which are designed to recycle and manage their own actinides. The prospect of deploying a first series of such systems around 2040 supports the idea that progress is being made: the long-term waste would only be made up of fission products, with very low amounts of minor actinides. In this sense, the new waste management law passed by the French Parliament on June 28, 2006, demands that P and T research continue in strong connection with GEN IV systems and ADS development, allowing assessment of the industrial prospects of such systems in 2012 and the start of operation of a transmutation demonstration facility in 2020. (author)

  10. System and method for deriving a process-based specification

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael Gerard (Inventor); Rouff, Christopher A. (Inventor); Rash, James Larry (Inventor)

    2009-01-01

    A system and method for deriving a process-based specification for a system is disclosed. The process-based specification is mathematically inferred from a trace-based specification. The trace-based specification is derived from a non-empty set of traces or natural language scenarios. The process-based specification is mathematically equivalent to the trace-based specification. Code is generated, if applicable, from the process-based specification. A process, or phases of a process, using the features disclosed can be reversed and repeated to allow for an interactive development and modification of legacy systems. The process is applicable to any class of system, including, but not limited to, biological and physical systems, electrical and electro-mechanical systems in addition to software, hardware and hybrid hardware-software systems.
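As a rough illustration of the trace-to-process idea (not the patented inference method), the following sketch merges a non-empty set of event traces into a prefix-tree process model and enumerates the behaviours the model admits; the traces and event names are invented:

```python
# Illustrative sketch (not the patented algorithm): merge a set of
# traces into a prefix-tree process model, then enumerate the maximal
# event sequences the model admits.

def build_prefix_tree(traces):
    """Merge traces into a nested dict of process states."""
    root = {}
    for trace in traces:
        node = root
        for event in trace:
            node = node.setdefault(event, {})
    return root

def accepted_sequences(node, prefix=()):
    """Enumerate the maximal event sequences permitted by the model."""
    if not node:
        return [prefix]
    seqs = []
    for event, child in node.items():
        seqs.extend(accepted_sequences(child, prefix + (event,)))
    return seqs

traces = [("open", "read", "close"), ("open", "write", "close")]
model = build_prefix_tree(traces)
print(sorted(accepted_sequences(model)))
```

The merged model is, by construction, equivalent to the trace set it was built from, mirroring the equivalence claim in the abstract.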

  11. Differentiating location- and distance-based processes in memory for time: an ERP study.

    PubMed

    Curran, Tim; Friedman, William J

    2003-09-01

    Memory for the time of events may benefit from reconstructive, location-based, and distance-based processes, but these processes are difficult to dissociate with behavioral methods. Neuropsychological research has emphasized the contribution of prefrontal brain mechanisms to memory for time but has not clearly differentiated location- from distance-based processing. The present experiment recorded event-related brain potentials (ERPs) while subjects completed two different temporal memory tests, designed to emphasize either location- or distance-based processing. The subjects' reports of location-based versus distance-based strategies and the reaction time pattern validated our experimental manipulation. Late (800-1,800 msec) frontal ERP effects were related to location-based processing. The results provide support for a two-process theory of memory for time and suggest that frontal memory mechanisms are specifically related to reconstructive, location-based processing.

  12. On Intelligent Design and Planning Method of Process Route Based on Gun Breech Machining Process

    NASA Astrophysics Data System (ADS)

    Hongzhi, Zhao; Jian, Zhang

    2018-03-01

The paper presents an approach to the intelligent design and planning of process routes based on the gun breech machining process, addressing several problems such as the complexity of gun breech machining, the tedium of route design, and the long lead time of the traditional, hard-to-manage process route. Based on the gun breech machining process, an intelligent process-route design and planning system was developed using DEST and VC++. The system comprises two functional modules: intelligent process-route design and process-route planning. The intelligent design module, through analysis of the gun breech machining process, summarizes breech process knowledge to build the knowledge base and inference engine, from which the gun breech process route is output automatically. On the basis of the intelligent design module, the final process route is made, edited and managed in the process-route planning module.
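The knowledge-base-plus-inference-engine structure described above can be suggested with a minimal rule-based sketch; the part features, rules and operation names below are invented for illustration and are not the DEST/VC++ system itself:

```python
# Hypothetical rule-based route planner: each rule maps a part feature
# to the machining operations it requires, and the route is assembled
# by firing all matching rules. Features and operations are invented.

RULES = [
    (lambda f: f["hole"], ["drill", "ream"]),
    (lambda f: f["thread"], ["tap"]),
    (lambda f: f["finish"] == "fine", ["grind"]),
]

def plan_route(features):
    route = ["rough_mill"]           # every part starts with roughing
    for condition, ops in RULES:
        if condition(features):      # inference: fire matching rules
            route.extend(ops)
    route.append("inspect")          # every route ends with inspection
    return route

print(plan_route({"hole": True, "thread": True, "finish": "fine"}))
```

A real CAPP knowledge base would also encode operation ordering constraints and machine-tool selection; this sketch only shows the rule-firing skeleton.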

  13. Situation awareness acquired from monitoring process plants - the Process Overview concept and measure.

    PubMed

    Lau, Nathan; Jamieson, Greg A; Skraaning, Gyrd

    2016-07-01

    We introduce Process Overview, a situation awareness characterisation of the knowledge derived from monitoring process plants. Process Overview is based on observational studies of process control work in the literature. The characterisation is applied to develop a query-based measure called the Process Overview Measure. The goal of the measure is to improve coupling between situation and awareness according to process plant properties and operator cognitive work. A companion article presents the empirical evaluation of the Process Overview Measure in a realistic process control setting. The Process Overview Measure demonstrated sensitivity and validity by revealing significant effects of experimental manipulations that corroborated with other empirical results. The measure also demonstrated adequate inter-rater reliability and practicality for measuring SA based on data collected by process experts. Practitioner Summary: The Process Overview Measure is a query-based measure for assessing operator situation awareness from monitoring process plants in representative settings.

  14. A new window of opportunity to reject process-based biotechnology regulation

    PubMed Central

    Marchant, Gary E; Stevens, Yvonne A

    2015-01-01

    ABSTRACT. The question of whether biotechnology regulation should be based on the process or the product has long been debated, with different jurisdictions adopting different approaches. The European Union has adopted a process-based approach, Canada has adopted a product-based approach, and the United States has implemented a hybrid system. With the recent proliferation of new methods of genetic modification, such as gene editing, process-based regulatory systems, which are premised on a binary system of transgenic and conventional approaches, will become increasingly obsolete and unsustainable. To avoid unreasonable, unfair and arbitrary results, nations that have adopted process-based approaches will need to migrate to a product-based approach that considers the novelty and risks of the individual trait, rather than the process by which that trait was produced. This commentary suggests some approaches for the design of such a product-based approach. PMID:26930116

  15. A new window of opportunity to reject process-based biotechnology regulation.

    PubMed

    Marchant, Gary E; Stevens, Yvonne A

    2015-01-01

    The question of whether biotechnology regulation should be based on the process or the product has long been debated, with different jurisdictions adopting different approaches. The European Union has adopted a process-based approach, Canada has adopted a product-based approach, and the United States has implemented a hybrid system. With the recent proliferation of new methods of genetic modification, such as gene editing, process-based regulatory systems, which are premised on a binary system of transgenic and conventional approaches, will become increasingly obsolete and unsustainable. To avoid unreasonable, unfair and arbitrary results, nations that have adopted process-based approaches will need to migrate to a product-based approach that considers the novelty and risks of the individual trait, rather than the process by which that trait was produced. This commentary suggests some approaches for the design of such a product-based approach.

  16. Implicit Schemata and Categories in Memory-Based Language Processing

    ERIC Educational Resources Information Center

    van den Bosch, Antal; Daelemans, Walter

    2013-01-01

    Memory-based language processing (MBLP) is an approach to language processing based on exemplar storage during learning and analogical reasoning during processing. From a cognitive perspective, the approach is attractive as a model for human language processing because it does not make any assumptions about the way abstractions are shaped, nor any…

  17. Models of Quantitative Estimations: Rule-Based and Exemplar-Based Processes Compared

    ERIC Educational Resources Information Center

    von Helversen, Bettina; Rieskamp, Jorg

    2009-01-01

    The cognitive processes underlying quantitative estimations vary. Past research has identified task-contingent changes between rule-based and exemplar-based processes (P. Juslin, L. Karlsson, & H. Olsson, 2008). B. von Helversen and J. Rieskamp (2008), however, proposed a simple rule-based model--the mapping model--that outperformed the…

  18. Unified Modeling Language (UML) for hospital-based cancer registration processes.

    PubMed

    Shiki, Naomi; Ohno, Yuko; Fujii, Ayumi; Murata, Taizo; Matsumura, Yasushi

    2008-01-01

Hospital-based cancer registration involves complex processing steps that span multiple departments. In addition, management techniques and registration procedures differ between medical facilities. Establishing processes for hospital-based cancer registry requires clarifying the specific functions and labor needed. In recent years, the business modeling technique, in which management evaluation is done by clearly spelling out processes and functions, has been applied to business process analysis. However, there are few analytical reports describing the applications of these concepts to medical-related work. In this study, we initially sought to model hospital-based cancer registration processes using the Unified Modeling Language (UML), to clarify functions. The object of this study was the cancer registry of Osaka University Hospital. We organized the hospital-based cancer registration processes based on interview and observational surveys, and produced an As-Is model using activity, use-case, and class diagrams. After each UML model was drafted, it was fed back to practitioners to check its validity, and then improved. We were able to define the workflow for each department using activity diagrams. In addition, by using use-case diagrams we were able to classify each department within the hospital as a system, and thereby specify the core processes and staff that were responsible for each department. The class diagrams were effective in systematically organizing the information to be used for hospital-based cancer registries. Using UML modeling, hospital-based cancer registration processes were broadly classified into three separate processes, namely, registration tasks, quality control, and filing data. An additional 14 functions were also extracted. Many tasks take place within the hospital-based cancer registry office, but the process of providing information spans multiple departments.
Moreover, additional tasks were required compared with a standardized system, because the hospital-based cancer registration system was built on the pre-existing computer system of Osaka University Hospital. Difficulty in utilizing useful information for cancer registration was shown to increase the task workload. By using UML, we were able to clarify functions and extract the typical processes for a hospital-based cancer registry. Modeling can provide a basis of process analysis for establishing efficient hospital-based cancer registration processes in each institute.

  19. A neuroanatomical model of space-based and object-centered processing in spatial neglect.

    PubMed

    Pedrazzini, Elena; Schnider, Armin; Ptak, Radek

    2017-11-01

Visual attention can be deployed in space-based or object-centered reference frames. Right-hemisphere damage may lead to distinct deficits of space- or object-based processing, and such dissociations are thought to underlie the heterogeneous nature of spatial neglect. Previous studies have suggested that object-centered processing deficits (such as in copying, reading or line bisection) result from damage to retro-rolandic regions while impaired spatial exploration reflects damage to more anterior regions. However, this evidence is based on small samples and heterogeneous tasks. Here, we tested a theoretical model of neglect that takes into account space- and object-based processing and relates them to neuroanatomical predictors. One hundred and one right-hemisphere-damaged patients were examined with classic neuropsychological tests and structural brain imaging. Relations between neglect measures and damage to the temporal-parietal junction, intraparietal cortex, insula and middle frontal gyrus were examined with two structural equation models by assuming that object-centered processing (involved in line bisection and single-word reading) and space-based processing (involved in cancelation tasks) either represented a unique latent variable or two distinct variables. Of these two models the latter had better explanatory power. Damage to the intraparietal sulcus was a significant predictor of object-centered, but not space-based processing, while damage to the temporal-parietal junction predicted space-based, but not object-centered processing. Space-based processing and object-centered processing were strongly intercorrelated, indicating that they rely on similar, albeit partly dissociated processes. These findings indicate that object-centered and space-based deficits in neglect are partly independent and result from superior parietal and inferior parietal damage, respectively.

  20. Musical rhythm and reading development: does beat processing matter?

    PubMed

    Ozernov-Palchik, Ola; Patel, Aniruddh D

    2018-05-20

There is mounting evidence for links between musical rhythm processing and reading-related cognitive skills, such as phonological awareness. This may be because music and speech are rhythmic: both involve processing complex sound sequences with systematic patterns of timing, accent, and grouping. Yet, there is a salient difference between musical and speech rhythm: musical rhythm is often beat-based (based on an underlying grid of equal time intervals), while speech rhythm is not. Thus, the role of beat-based processing in the reading-rhythm relationship is not clear. Is there a distinct relation between beat-based processing mechanisms and reading-related language skills, or is the rhythm-reading link entirely due to shared mechanisms for processing non-beat-based aspects of temporal structure? We discuss recent evidence for a distinct link between beat-based processing and early reading abilities in young children, and suggest experimental designs that would allow one to further methodically investigate this relationship. We propose that beat-based processing taps into a listener's ability to use rich contextual regularities to form predictions, a skill important for reading development. © 2018 New York Academy of Sciences.

  1. Conceptual information processing: A robust approach to KBS-DBMS integration

    NASA Technical Reports Server (NTRS)

    Lazzara, Allen V.; Tepfenhart, William; White, Richard C.; Liuzzi, Raymond

    1987-01-01

    Integrating the respective functionality and architectural features of knowledge base and data base management systems is a topic of considerable interest. Several aspects of this topic and associated issues are addressed. The significance of integration and the problems associated with accomplishing that integration are discussed. The shortcomings of current approaches to integration and the need to fuse the capabilities of both knowledge base and data base management systems motivates the investigation of information processing paradigms. One such paradigm is concept based processing, i.e., processing based on concepts and conceptual relations. An approach to robust knowledge and data base system integration is discussed by addressing progress made in the development of an experimental model for conceptual information processing.

  2. The research on construction and application of machining process knowledge base

    NASA Astrophysics Data System (ADS)

    Zhao, Tan; Qiao, Lihong; Qie, Yifan; Guo, Kai

    2018-03-01

In order to apply knowledge to machining process design, and from the perspective of knowledge use in computer-aided process planning (CAPP), a hierarchical knowledge-classification structure is established according to the characteristics of the mechanical engineering field. Machining process knowledge is expressed in structured form by means of production rules and object-oriented methods. Three kinds of knowledge base models are constructed according to this representation of machining process knowledge. The paper gives the definition and classification of machining process knowledge, the knowledge models, and the application flow of knowledge-based process design, and carries out the main steps of the machine-tool design decision as an application of the knowledge base.

  3. Affective-cognitive meta-bases versus structural bases of attitudes predict processing interest versus efficiency.

    PubMed

    See, Ya Hui Michelle; Petty, Richard E; Fabrigar, Leandre R

    2013-08-01

    We proposed that (a) processing interest for affective over cognitive information is captured by meta-bases (i.e., the extent to which people subjectively perceive themselves to rely on affect or cognition in their attitudes) and (b) processing efficiency for affective over cognitive information is captured by structural bases (i.e., the extent to which attitudes are more evaluatively congruent with affect or cognition). Because processing speed can disentangle interest from efficiency by being manifest as longer or shorter reading times, we hypothesized and found that more affective meta-bases predicted longer affective than cognitive reading time when processing efficiency was held constant (Study 1). In contrast, more affective structural bases predicted shorter affective than cognitive reading time when participants were constrained in their ability to allocate resources deliberatively (Study 2). When deliberation was neither encouraged nor constrained, effects for meta-bases and structural bases emerged (Study 3). Implications for affective-cognitive processing and other attitudes-relevant constructs are discussed.

  4. Process-based Cost Estimation for Ramjet/Scramjet Engines

    NASA Technical Reports Server (NTRS)

    Singh, Brijendra; Torres, Felix; Nesman, Miles; Reynolds, John

    2003-01-01

Process-based cost estimation plays a key role in effecting cultural change that integrates distributed science, technology and engineering teams to rapidly create innovative and affordable products. Working together, NASA Glenn Research Center and Boeing Canoga Park have developed a methodology of process-based cost estimation bridging the methodologies of high-level parametric models and detailed bottom-up estimation. The NASA GRC/Boeing CP process-based cost model provides a probabilistic structure of layered cost drivers. High-level inputs characterize mission requirements, system performance, and relevant economic factors. Design alternatives are extracted from a standard, product-specific work breakdown structure to pre-load lower-level cost driver inputs and generate the cost-risk analysis. As the product design progresses and matures, the lower-level, more detailed cost drivers can be re-accessed and the projected variation of input values narrowed, thereby generating a progressively more accurate estimate of cost-risk. Incorporated into the process-based cost model are techniques for decision analysis, specifically the analytic hierarchy process (AHP) and functional utility analysis. Design alternatives may then be evaluated not just on cost-risk, but also on user-defined performance and schedule criteria. This implementation of full trade-study support contributes significantly to the realization of the integrated development environment. The process-based cost estimation model generates development and manufacturing cost estimates. The development team plans to expand the manufacturing process base from approximately 80 manufacturing processes to over 250 processes. Operation and support cost modeling is also envisioned. Process-based estimation considers the materials, resources, and processes in establishing cost-risk and, rather than depending on weight as an input, actually estimates weight along with cost and schedule.
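One decision-analysis ingredient named above, the analytic hierarchy process, can be sketched with the standard geometric-mean approximation of the priority vector; the criteria and pairwise judgments below are invented, not drawn from the NASA GRC/Boeing CP model:

```python
from math import prod

# Minimal AHP priority calculation using the geometric-mean
# approximation; the criteria (cost, performance, schedule) and the
# pairwise judgments are invented for illustration.

def ahp_priorities(matrix):
    """Approximate the AHP priority vector of a pairwise comparison matrix."""
    n = len(matrix)
    geo = [prod(row) ** (1.0 / n) for row in matrix]  # row geometric means
    total = sum(geo)
    return [g / total for g in geo]

# cost vs performance vs schedule: cost judged 3x performance, 5x schedule
M = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_priorities(M)
print([round(w, 3) for w in weights])
```

The resulting weights sum to one and rank the criteria consistently with the pairwise judgments; design alternatives can then be scored as weighted sums over the criteria.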

  5. [Near infrared spectroscopy based process trajectory technology and its application in monitoring and controlling of traditional Chinese medicine manufacturing process].

    PubMed

    Li, Wen-Long; Qu, Hai-Bin

    2016-10-01

In this paper, the principle of NIRS (near infrared spectroscopy)-based process trajectory technology is introduced. The main steps of the technique are: (1) in-line collection of process spectra for the different technics; (2) unfolding of the 3-D process spectra; (3) determination of the process trajectories and their normal limits; (4) monitoring of new batches with the established MSPC (multivariate statistical process control) models. Applications of the technology to chemical and biological medicines are reviewed briefly. Drawing on a comprehensive account of our feasibility research on monitoring traditional Chinese medicine technical processes using NIRS-based multivariate process trajectories, several practical problems that urgently need solutions are identified, and the application prospects of NIRS-based process trajectory technology are discussed. Copyright © by the Chinese Pharmaceutical Association.
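A toy version of steps (1)-(4) might look as follows, with per-time-point mean ± 3σ limits standing in for a full MSPC model (which would normally monitor PCA scores rather than raw means); all spectra are invented:

```python
# Sketch of trajectory monitoring: unfold batch x time x wavelength
# spectra, build a mean trajectory with +/- 3 sigma limits per time
# point from normal batches, then flag where a new batch deviates.

from statistics import mean, pstdev

def unfold(batches):
    """Collapse each spectrum to its mean intensity -> batch x time."""
    return [[mean(spectrum) for spectrum in batch] for batch in batches]

def limits(trajectories, k=3.0):
    """Per-time-point (lower, upper) control limits, mean +/- k sigma."""
    by_time = list(zip(*trajectories))
    return [(mean(v) - k * pstdev(v), mean(v) + k * pstdev(v))
            for v in by_time]

def out_of_control(new_batch, lims):
    """Indices of time points where the new batch leaves the limits."""
    traj = [mean(s) for s in new_batch]
    return [t for t, (lo, hi) in enumerate(lims)
            if not lo <= traj[t] <= hi]

# three normal batches, two time points, two "wavelengths" each
normal = [[[1.0, 1.2], [2.0, 2.1]], [[1.1, 1.3], [2.1, 2.2]],
          [[0.9, 1.1], [1.9, 2.0]]]
lims = limits(unfold(normal))
print(out_of_control([[1.0, 1.2], [5.0, 5.0]], lims))
```

In practice the unfolding keeps the full wavelength dimension and the limits come from a PCA/Hotelling T² model; the sketch only shows the batch-wise trajectory-and-limits logic.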

  6. A signal processing method for the friction-based endpoint detection system of a CMP process

    NASA Astrophysics Data System (ADS)

    Chi, Xu; Dongming, Guo; Zhuji, Jin; Renke, Kang

    2010-12-01

A signal processing method for the friction-based endpoint detection system of a chemical mechanical polishing (CMP) process is presented. The signal processing method uses the wavelet threshold denoising method to reduce the noise contained in the measured original signal, extracts the Kalman filter innovation from the denoised signal as the feature signal, and judges the CMP endpoint from the behaviour of the Kalman filter innovation sequence during the CMP process. Applying this signal processing method, endpoint detection experiments for the Cu CMP process were carried out. The results show that the method can judge the endpoint of the Cu CMP process.
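A toy rendering of this pipeline, with a moving average standing in for the wavelet threshold denoising step and a constant-level Kalman filter supplying the innovation sequence; the friction trace and thresholds are invented:

```python
# Toy endpoint detector: smooth the friction signal (stand-in for
# wavelet threshold denoising), run a constant-level Kalman filter,
# and declare the endpoint when the innovation (measurement minus
# prediction) first jumps beyond a threshold.

def smooth(signal, w=3):
    """Moving-average stand-in for wavelet threshold denoising."""
    half = w // 2
    return [sum(signal[max(0, i - half):i + half + 1]) /
            len(signal[max(0, i - half):i + half + 1])
            for i in range(len(signal))]

def endpoint(signal, q=1e-4, r=0.05, jump=0.3):
    """Return the index where |innovation| first exceeds `jump`."""
    x, p = signal[0], 1.0
    for i, z in enumerate(signal[1:], start=1):
        p += q                      # predict (constant-level model)
        innov = z - x               # Kalman filter innovation
        k = p / (p + r)             # Kalman gain
        x += k * innov              # update state estimate
        p *= (1 - k)                # update estimate variance
        if abs(innov) > jump:
            return i
    return None

friction = [1.0] * 20 + [0.4] * 10   # friction drops as the film clears
print(endpoint(smooth(friction)))
```

The innovation stays near zero while the filter tracks a steady friction level and spikes when the level shifts, which is what makes it a usable endpoint feature.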

  7. Social value and individual choice: The value of a choice-based decision-making process in a collectively funded health system.

    PubMed

    Espinoza, Manuel Antonio; Manca, Andrea; Claxton, Karl; Sculpher, Mark

    2018-02-01

    Evidence about cost-effectiveness is increasingly being used to inform decisions about the funding of new technologies that are usually implemented as guidelines from centralized decision-making bodies. However, there is also an increasing recognition for the role of patients in determining their preferred treatment option. This paper presents a method to estimate the value of implementing a choice-based decision process using the cost-effectiveness analysis toolbox. This value is estimated for 3 alternative scenarios. First, it compares centralized decisions, based on population average cost-effectiveness, against a decision process based on patient choice. Second, it compares centralized decision based on patients' subgroups versus an individual choice-based decision process. Third, it compares a centralized process based on average cost-effectiveness against a choice-based process where patients choose according to a different measure of outcome to that used by the centralized decision maker. The methods are applied to a case study for the management of acute coronary syndrome. It is concluded that implementing a choice-based process of treatment allocation may be an option in collectively funded health systems. However, its value will depend on the specific health problem and the social values considered relevant to the health system. Copyright © 2017 John Wiley & Sons, Ltd.

  8. Pre- and Post-Processing Tools to Create and Characterize Particle-Based Composite Model Structures

    DTIC Science & Technology

    2017-11-01

ARL-TR-8213 ● NOV 2017. US Army Research Laboratory. Pre- and Post-Processing Tools to Create and Characterize Particle-Based Composite Model Structures.

  9. Introduction to Radar Signal and Data Processing: The Opportunity

    DTIC Science & Technology

    2006-09-01

Key words: radar signal processing, data processing, adaptivity, space-time adaptive processing, knowledge-based systems, CFAR. SUMMARY: This paper introduces the lecture series dedicated to knowledge-based radar signal and data processing. A knowledge-based expert system (KBS) is in the realm of

  10. Patch-based models and algorithms for image processing: a review of the basic principles and methods, and their application in computed tomography.

    PubMed

    Karimi, Davood; Ward, Rabab K

    2016-10-01

Image models are central to all image processing tasks. The great advancements in digital image processing would not have been possible without powerful models which, themselves, have evolved over time. In the past decade, "patch-based" models have emerged as one of the most effective models for natural images, and patch-based methods have outperformed competing methods in many image processing tasks. These developments have come at a time when greater availability of powerful computational resources and growing concerns over the health risks of ionizing radiation encourage research on image processing algorithms for computed tomography (CT). The goal of this paper is to explain the principles of patch-based methods and to review some of their recent applications in CT. We first review the central concepts in patch-based image processing and explain some of the state-of-the-art algorithms, with a focus on aspects that are more relevant to CT. Then, we review some of the recent applications of patch-based methods in CT. Patch-based methods have already transformed the field of image processing, leading to state-of-the-art results in many applications. More recently, several studies have proposed patch-based algorithms for various image processing tasks in CT, from denoising and restoration to iterative reconstruction. Although these studies have reported good results, the true potential of patch-based methods for CT has not yet been fully appreciated. Patch-based methods can play a central role in image reconstruction and processing for CT. They have the potential to lead to substantial improvements in the current state of the art.
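The core patch-based idea, re-estimating each sample from samples whose surrounding patches look similar, can be sketched in a non-local-means style; this is a 1-D, unoptimized illustration of the principle, not one of the CT algorithms reviewed:

```python
# Non-local-means-style sketch: each sample is replaced by a weighted
# average of all samples, with weights given by the similarity of the
# patches surrounding them. 1-D and O(n^2) for clarity, not speed.

from math import exp

def patches(signal, radius):
    """Extract the edge-replicated patch around every sample."""
    pad = [signal[0]] * radius + list(signal) + [signal[-1]] * radius
    return [pad[i:i + 2 * radius + 1] for i in range(len(signal))]

def nlm(signal, radius=1, h=0.5):
    """Denoise by averaging samples with similar surrounding patches."""
    ps = patches(signal, radius)
    out = []
    for pi in ps:
        dists = [sum((a - b) ** 2 for a, b in zip(pi, pj)) for pj in ps]
        weights = [exp(-d / (h * h)) for d in dists]
        total = sum(weights)
        out.append(sum(w * s for w, s in zip(weights, signal)) / total)
    return out

# two flat regions with small noise: similar patches get averaged
# together, dissimilar ones barely mix across the step edge
noisy = [0.0, 0.1, -0.1, 0.05, 5.0, 5.1, 4.9, 5.05]
print([round(v, 2) for v in nlm(noisy)])
```

Because the weights decay with patch distance, the two flat regions are smoothed internally while the step between them is preserved, which is the property that makes patch-based priors attractive for CT denoising and reconstruction.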

  11. 76 FR 70878 - Revitalizing Base Closure Communities and Addressing Impacts of Realignment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-16

    ... base closure process to conform to the amendment to the Defense Base Closure and Realignment Act of... departments to expedite the EDC process. Closed military bases represent a potential engine of economic... purposes of establishing EDC terms and conditions. It also eliminates the need to establish a process by...

  12. Expert system for web based collaborative CAE

    NASA Astrophysics Data System (ADS)

    Hou, Liang; Lin, Zusheng

    2006-11-01

An expert system for web-based collaborative CAE was developed based on knowledge engineering, a relational database and commercial FEA (finite element analysis) software. The architecture of the system was illustrated. In this system, the experts' experience, theories, typical examples and other related knowledge, which are used in the pre-processing stage of FEA, were categorized into analysis-process knowledge and object knowledge. Then, the integrated knowledge model based on object-oriented and rule-based methods was described. The integrated reasoning process based on CBR (case-based reasoning) and rule-based reasoning was presented. Finally, the analysis process of this expert system in a web-based CAE application was illustrated, and an analysis example of a machine tool column was presented to demonstrate the validity of the system.
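The case-based reasoning step mentioned above can be suggested with a minimal retrieval sketch; the stored cases, feature names and recommendations are invented, not taken from the actual system:

```python
# Hypothetical CBR retrieval: find the stored FEA pre-processing case
# whose features best match a new analysis request and reuse its
# recommendation. Cases and features are invented for illustration.

CASES = [
    ({"part": "column", "load": "static"},  "mesh=hex, solver=linear"),
    ({"part": "column", "load": "dynamic"}, "mesh=hex, solver=modal"),
    ({"part": "shaft",  "load": "static"},  "mesh=tet, solver=linear"),
]

def similarity(a, b):
    """Fraction of features on which two cases agree."""
    keys = set(a) | set(b)
    return sum(a.get(k) == b.get(k) for k in keys) / len(keys)

def retrieve(query):
    """Return the recommendation of the most similar stored case."""
    return max(CASES, key=lambda case: similarity(query, case[0]))[1]

print(retrieve({"part": "column", "load": "static"}))
```

In the full system this retrieval would be combined with rule-based reasoning to adapt the retrieved case to the new analysis, per the integrated reasoning process described above.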

  13. Hydrothermal Processing of Base Camp Solid Wastes To Allow Onsite Recycling

    DTIC Science & Technology

    2008-09-01

ERDC/CERL TR-08-13, September 2008. Hydrothermal Processing of Base Camp Solid Wastes To Allow Onsite Recycling. Gary L. Gerdes, Deborah... Approved for public release; distribution is unlimited. The report describes a technology to process domestic solid waste using a unique hydrothermal system. The process was successfully demonstrated at Forts Benning and

  14. Collaboration processes and perceived effectiveness of integrated care projects in primary care: a longitudinal mixed-methods study.

    PubMed

    Valentijn, Pim P; Ruwaard, Dirk; Vrijhoef, Hubertus J M; de Bont, Antoinette; Arends, Rosa Y; Bruijnzeels, Marc A

    2015-10-09

Collaborative partnerships are considered an essential strategy for integrating local disjointed health and social services. Currently, little evidence is available on how integrated care arrangements between professionals and organisations are achieved through the evolution of collaboration processes over time. The first aim was to develop a typology of integrated care projects (ICPs) based on the final degree of integration as perceived by multiple stakeholders. The second aim was to study how types of integration differ in changes of collaboration processes over time and final perceived effectiveness. A longitudinal mixed-methods study design based on two data sources (surveys and interviews) was used to identify the perceived degree of integration and patterns in collaboration among 42 ICPs in primary care in The Netherlands. We used cluster analysis to identify distinct subgroups of ICPs based on the final perceived degree of integration from a professional, organisational and system perspective. With the use of ANOVAs, the subgroups were contrasted based on: 1) changes in collaboration processes over time (shared ambition, interests and mutual gains, relationship dynamics, organisational dynamics and process management) and 2) final perceived effectiveness (i.e. rated success) at the professional, organisational and system levels. The ICPs were classified into three subgroups: 'United Integration Perspectives (UIP)', 'Disunited Integration Perspectives (DIP)' and 'Professional-oriented Integration Perspectives (PIP)'. ICPs within the UIP subgroup showed the strongest increase in trust-based (mutual gains and relationship dynamics) as well as control-based (organisational dynamics and process management) collaboration processes and had the highest overall effectiveness rates. On the other hand, ICPs in the DIP subgroup declined on collaboration processes and had the lowest overall effectiveness rates.
ICPs within the PIP subgroup increased in control-based collaboration processes (organisational dynamics and process management) and had the highest effectiveness rates at the professional level. The differences across the three subgroups in terms of the development of collaboration processes and the final perceived effectiveness provide evidence that united stakeholders' perspectives are achieved through a constructive collaboration process over time. Disunited perspectives at the professional, organisation and system levels can be aligned by both trust-based and control-based collaboration processes.

  15. Empirical evaluation of the Process Overview Measure for assessing situation awareness in process plants.

    PubMed

    Lau, Nathan; Jamieson, Greg A; Skraaning, Gyrd

    2016-03-01

The Process Overview Measure is a query-based measure developed to assess operator situation awareness (SA) from monitoring process plants. A companion paper describes how the measure has been developed according to process plant properties and operator cognitive work. The Process Overview Measure demonstrated practicality, sensitivity, validity and reliability in two full-scope simulator experiments investigating dramatically different operational concepts. Practicality was assessed based on qualitative feedback of participants and researchers. The Process Overview Measure demonstrated sensitivity and validity by revealing significant effects of experimental manipulations that corroborated with other empirical results. The measure also demonstrated adequate inter-rater reliability and practicality for measuring SA in full-scope simulator settings based on data collected from process experts. Thus, full-scope simulator studies can employ the Process Overview Measure to reveal the impact of new control room technology and operational concepts on monitoring process plants. Practitioner Summary: The Process Overview Measure is a query-based measure that demonstrated practicality, sensitivity, validity and reliability for assessing operator situation awareness (SA) from monitoring process plants in representative settings.

  16. Understanding community-based processes for research ethics review: a national study.

    PubMed

    Shore, Nancy; Brazauskas, Ruta; Drew, Elaine; Wong, Kristine A; Moy, Lisa; Baden, Andrea Corage; Cyr, Kirsten; Ulevicus, Jocelyn; Seifer, Sarena D

    2011-12-01

    Institutional review boards (IRBs), designed to protect individual study participants, do not routinely assess community consent, risks, and benefits. Community groups are establishing ethics review processes to determine whether and how research is conducted in their communities. To strengthen the ethics review of community-engaged research, we sought to identify and describe these processes. In 2008 we conducted an online survey of US-based community groups and community-institutional partnerships involved in human-participants research. We identified 109 respondents who met participation criteria and had ethics review processes in place. The respondents' processes mainly functioned through community-institutional partnerships, community-based organizations, community health centers, and tribal organizations. These processes had been created primarily to ensure that the involved communities were engaged in and directly benefited from research and were protected from research harms. The primary process benefits included giving communities a voice in determining which studies were conducted and ensuring that studies were relevant and feasible, and that they built community capacity. The primary process challenges were the time and resources needed to support the process. Community-based processes for ethics review consider community-level ethical issues that institution-based IRBs often do not.

  17. Action video games and improved attentional control: Disentangling selection- and response-based processes.

    PubMed

    Chisholm, Joseph D; Kingstone, Alan

    2015-10-01

    Research has demonstrated that experience with action video games is associated with improvements in a host of cognitive tasks. Evidence from paradigms that assess aspects of attention has suggested that action video game players (AVGPs) possess greater control over the allocation of attentional resources than do non-video-game players (NVGPs). Using a compound search task that teased apart selection- and response-based processes (Duncan, 1985), we required participants to perform an oculomotor capture task in which they made saccades to a uniquely colored target (selection-based process) and then produced a manual directional response based on information within the target (response-based process). We replicated the finding that AVGPs are less susceptible to attentional distraction and, critically, revealed that AVGPs outperform NVGPs on both selection-based and response-based processes. These results not only are consistent with the improved-attentional-control account of AVGP benefits, but they suggest that the benefit of action video game playing extends across the full breadth of attention-mediated stimulus-response processes that impact human performance.

  18. Comparative evaluation of urban storm water quality models

    NASA Astrophysics Data System (ADS)

    Vaze, J.; Chiew, Francis H. S.

    2003-10-01

    The estimation of urban storm water pollutant loads is required for the development of mitigation and management strategies to minimize impacts to receiving environments. Event pollutant loads are typically estimated using either regression equations or "process-based" water quality models. The relative merit of using regression models compared to process-based models is not clear. A modeling study is carried out here to evaluate the comparative ability of the regression equations and process-based water quality models to estimate event diffuse pollutant loads from impervious surfaces. The results indicate that, once calibrated, both the regression equations and the process-based model can estimate event pollutant loads satisfactorily. In fact, the loads estimated using the regression equation as a function of rainfall intensity and runoff rate are better than the loads estimated using the process-based model. Therefore, if only estimates of event loads are required, regression models should be used because they are simpler and require less data compared to process-based models.
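
    The regression alternative mentioned above can be sketched as a simple power-law event-load model. The functional form and coefficients below are illustrative assumptions for a minimal example, not the calibrated equations from the study.

```python
def event_load(rain_intensity, runoff_rate, a=0.8, b=0.6, c=1.2):
    """Illustrative power-law regression for an event pollutant load (kg):
    L = c * I**a * Q**b, with I = rainfall intensity (mm/h) and
    Q = runoff rate (L/s). The coefficients a, b, c would be calibrated
    against observed event loads; the values here are placeholders."""
    return c * rain_intensity ** a * runoff_rate ** b

# Larger storms produce larger estimated loads under this form.
small_storm = event_load(1.0, 1.0)
large_storm = event_load(5.0, 20.0)
```

In practice the coefficients would be fitted by log-linear least squares against monitored event data, which is where the calibration step discussed above comes in.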

  19. Anammox-based technologies for nitrogen removal: Advances in process start-up and remaining issues.

    PubMed

    Ali, Muhammad; Okabe, Satoshi

    2015-12-01

    Nitrogen removal from wastewater via the anaerobic ammonium oxidation (anammox)-based process has been recognized as an efficient, cost-effective and low-energy alternative to the conventional nitrification and denitrification processes. To date, more than one hundred full-scale anammox plants have been installed and operated for treatment of NH4(+)-rich wastewater streams around the world, and the number is increasing rapidly. Since the discovery of the anammox process, extensive research has been done to develop various anammox-based technologies. However, there are still some challenges in the practical application of anammox-based treatment processes at full scale, e.g., long start-up periods, limited application to mainstream municipal wastewater and poor effluent water quality. This paper summarizes the current status of the application of the anammox process and research on technological developments aimed at solving these remaining problems. In addition, an integrated system of an anammox-based process and a microbial fuel cell is proposed for sustainable and energy-positive wastewater treatment. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Observer-based perturbation extremum seeking control with input constraints for direct-contact membrane distillation process

    NASA Astrophysics Data System (ADS)

    Eleiwi, Fadi; Laleg-Kirati, Taous Meriem

    2018-06-01

    An observer-based perturbation extremum seeking control is proposed for a direct-contact membrane distillation (DCMD) process. The process is described with a dynamic model that is based on a 2D advection-diffusion equation model which has pump flow rates as process inputs. The objective of the controller is to optimise the trade-off between the permeate mass flux and the energy consumption by the pumps inside the process. Cases of single and multiple control inputs are considered through the use of only the feed pump flow rate or both the feed and the permeate pump flow rates. A nonlinear Lyapunov-based observer is designed to provide an estimation for the temperature distribution all over the designated domain of the DCMD process. Moreover, control inputs are constrained with an anti-windup technique to be within feasible and physical ranges. Performance of the proposed structure is analysed, and simulations based on real DCMD process parameters for each control input are provided.
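
    The perturbation extremum seeking idea can be sketched in a few lines: a sinusoidal dither probes the objective, and demodulating the measured output against the dither yields a gradient estimate that drives the input toward an optimum. This is a generic discrete-time sketch with an assumed stand-in objective and illustrative gains; it is not the paper's DCMD model, observer, or constraint handling.

```python
import math

def extremum_seeking(J, u0=0.0, a=0.2, omega=1.0, k=1.0, dt=0.05, steps=20000):
    """Discrete-time perturbation extremum seeking (illustrative sketch).

    A sinusoidal dither a*sin(omega*t) probes the objective J; a slow
    low-pass filter removes the mean output, and the remainder is
    demodulated against the dither to estimate the local gradient,
    which is integrated to climb toward the optimum."""
    u_hat = u0
    y_avg = J(u0)                       # low-pass (mean) estimate of the output
    for i in range(steps):
        t = i * dt
        dither = a * math.sin(omega * t)
        y = J(u_hat + dither)
        y_avg += 0.002 * (y - y_avg)    # slow filter: keeps only the trend
        u_hat += k * dt * (y - y_avg) * dither  # demodulate and integrate
    return u_hat

# Concave stand-in objective with its maximum at u = 3 (think of a
# flux-versus-energy trade-off as a function of a pump flow rate).
best = extremum_seeking(lambda u: 10.0 - (u - 3.0) ** 2)
```

The dither amplitude, filter constant, and integration gain trade convergence speed against steady-state oscillation, which is the usual tuning question for this class of controllers.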

  1. Selective aqueous extraction of organics coupled with trapping by membrane separation

    DOEpatents

    van Eikeren, Paul; Brose, Daniel J.; Ray, Roderick J.

    1991-01-01

    An improvement to processes for the selective extraction of organic solutes from organic solvents by water-based extractants is disclosed, the improvement comprising coupling various membrane separation processes with the organic extraction process, the membrane separation process being utilized to continuously recycle the water-based extractant and at the same time selectively remove or concentrate organic solute from the water-based extractant.

  2. Application of agent-based system for bioprocess description and process improvement.

    PubMed

    Gao, Ying; Kipling, Katie; Glassey, Jarka; Willis, Mark; Montague, Gary; Zhou, Yuhong; Titchener-Hooker, Nigel J

    2010-01-01

    Modeling plays an important role in bioprocess development for design and scale-up. Predictive models can also be used in biopharmaceutical manufacturing to assist decision-making either to maintain process consistency or to identify optimal operating conditions. To predict the whole bioprocess performance, the strong interactions present in a processing sequence must be adequately modeled. Traditionally, bioprocess modeling considers process units separately, which makes it difficult to capture the interactions between units. In this work, a systematic framework is developed to analyze the bioprocesses based on a whole process understanding and considering the interactions between process operations. An agent-based approach is adopted to provide a flexible infrastructure for the necessary integration of process models. This enables the prediction of overall process behavior, which can then be applied during process development or once manufacturing has commenced, in both cases leading to the capacity for fast evaluation of process improvement options. The multi-agent system comprises a process knowledge base, process models, and a group of functional agents. In this system, agent components co-operate with each other in performing their tasks. These include the description of the whole process behavior, evaluating process operating conditions, monitoring of the operating processes, predicting critical process performance, and providing guidance to decision-making when coping with process deviations. During process development, the system can be used to evaluate the design space for process operation. During manufacture, the system can be applied to identify abnormal process operation events and then to provide suggestions as to how best to cope with the deviations. In all cases, the function of the system is to ensure an efficient manufacturing process. 
The implementation of the agent-based approach is illustrated via selected application scenarios, which demonstrate how such a framework may enable the better integration of process operations by providing a plant-wide process description to facilitate process improvement. Copyright 2009 American Institute of Chemical Engineers
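
    A minimal sketch of the multi-agent idea, assuming invented agent names, events, and thresholds (none are from the paper): functional agents register with a coordinator, which routes process events to them so that monitoring and decision-support outputs can be combined.

```python
class Agent:
    """Minimal functional agent: applies its handler to a process event."""
    def __init__(self, name, handler):
        self.name, self.handler = name, handler

class ProcessCoordinator:
    """Illustrative hub that routes process events to functional agents,
    mimicking the cooperation described above (monitoring, prediction,
    decision support). All names and thresholds are assumptions."""
    def __init__(self):
        self.agents = []
    def register(self, agent):
        self.agents.append(agent)
    def broadcast(self, event):
        return {a.name: a.handler(event) for a in self.agents}

coord = ProcessCoordinator()
# A monitoring agent flags yield deviations; an advisor agent suggests action.
coord.register(Agent("monitor", lambda e: "deviation" if e["yield"] < 0.8 else "ok"))
coord.register(Agent("advisor", lambda e: "re-run capture step" if e["yield"] < 0.8 else "continue"))
report = coord.broadcast({"unit": "chromatography", "yield": 0.72})
```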

  3. Net-centric ACT-R-Based Cognitive Architecture with DEVS Unified Process

    DTIC Science & Technology

    2011-04-01

    effort has been spent in analyzing various forms of requirement specifications, viz., state-based, Natural Language based, UML-based, Rule-based, BPMN ... requirement specifications in one of the chosen formats such as BPMN, DoDAF, Natural Language Processing (NLP) based, UML-based, DSL or simply

  4. A KPI-based process monitoring and fault detection framework for large-scale processes.

    PubMed

    Zhang, Kai; Shardt, Yuri A W; Chen, Zhiwen; Yang, Xu; Ding, Steven X; Peng, Kaixiang

    2017-05-01

    Large-scale processes, consisting of multiple interconnected subprocesses, are commonly encountered in industrial systems, whose performance needs to be determined. A common approach to this problem is to use a key performance indicator (KPI)-based approach. However, the different KPI-based approaches are not developed with a coherent and consistent framework. Thus, this paper proposes a framework for KPI-based process monitoring and fault detection (PM-FD) for large-scale industrial processes, which considers the static and dynamic relationships between process and KPI variables. For the static case, a least squares-based approach is developed that provides an explicit link with least-squares regression, which gives better performance than partial least squares. For the dynamic case, using the kernel representation of each subprocess, an instrument variable is used to reduce the dynamic case to the static case. This framework is applied to the TE benchmark process and the hot strip mill rolling process. The results show that the proposed method can detect faults better than previous methods. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
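
    The static KPI case can be illustrated with a one-variable least-squares sketch: fit a regression from a process variable to a KPI, then flag a fault when the prediction residual exceeds a threshold. The data, single-regressor form, and threshold are illustrative assumptions, far simpler than the paper's multivariate framework.

```python
def fit_ls(xs, ys):
    """Ordinary least squares y ~ b0 + b1*x (one process variable, one KPI).
    A one-dimensional stand-in for the static KPI model."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b1 = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
          / sum((x - mx) ** 2 for x in xs))
    return my - b1 * mx, b1

def detect_fault(model, x, y, threshold=0.5):
    """Flag a fault when the KPI prediction residual exceeds a threshold
    (in practice the threshold would be set from training residuals)."""
    b0, b1 = model
    return abs(y - (b0 + b1 * x)) > threshold

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.1, 7.9]   # KPI roughly 2*x with small noise
model = fit_ls(xs, ys)
```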

  5. Automatic and controlled components of judgment and decision making.

    PubMed

    Ferreira, Mario B; Garcia-Marques, Leonel; Sherman, Steven J; Sherman, Jeffrey W

    2006-11-01

    The categorization of inductive reasoning into largely automatic processes (heuristic reasoning) and controlled analytical processes (rule-based reasoning) put forward by dual-process approaches to judgment under uncertainty (e.g., K. E. Stanovich & R. F. West, 2000) has been primarily a matter of assumption, with a scarcity of direct empirical findings supporting it. The present authors use the process dissociation procedure (L. L. Jacoby, 1991) to provide convergent evidence validating a dual-process perspective on judgment under uncertainty based on the independent contributions of heuristic and rule-based reasoning. Process dissociations based on experimental manipulation of variables were derived from the most relevant theoretical properties typically used to contrast the two forms of reasoning. These include processing goals (Experiment 1), cognitive resources (Experiment 2), priming (Experiment 3), and formal training (Experiment 4); the results consistently support the authors' perspective. They conclude that judgment under uncertainty is neither an automatic nor a controlled process but that it reflects both processes, with each making independent contributions.

  6. A midas plugin to enable construction of reproducible web-based image processing pipelines

    PubMed Central

    Grauer, Michael; Reynolds, Patrick; Hoogstoel, Marion; Budin, Francois; Styner, Martin A.; Oguz, Ipek

    2013-01-01

    Image processing is an important quantitative technique for neuroscience researchers, but difficult for those who lack experience in the field. In this paper we present a web-based platform that allows an expert to create a brain image processing pipeline, enabling execution of that pipeline even by those biomedical researchers with limited image processing knowledge. These tools are implemented as a plugin for Midas, an open-source toolkit for creating web based scientific data storage and processing platforms. Using this plugin, an image processing expert can construct a pipeline, create a web-based User Interface, manage jobs, and visualize intermediate results. Pipelines are executed on a grid computing platform using BatchMake and HTCondor. This represents a new capability for biomedical researchers and offers an innovative platform for scientific collaboration. Current tools work well, but can be inaccessible for those lacking image processing expertise. Using this plugin, researchers in collaboration with image processing experts can create workflows with reasonable default settings and streamlined user interfaces, and data can be processed easily from a lab environment without the need for a powerful desktop computer. This platform allows simplified troubleshooting, centralized maintenance, and easy data sharing with collaborators. These capabilities enable reproducible science by sharing datasets and processing pipelines between collaborators. In this paper, we present a description of this innovative Midas plugin, along with results obtained from building and executing several ITK based image processing workflows for diffusion weighted MRI (DW MRI) of rodent brain images, as well as recommendations for building automated image processing pipelines. Although the particular image processing pipelines developed were focused on rodent brain MRI, the presented plugin can be used to support any executable or script-based pipeline. PMID:24416016


  8. Process-based costing.

    PubMed

    Lee, Robert H; Bott, Marjorie J; Forbes, Sarah; Redford, Linda; Swagerty, Daniel L; Taunton, Roma Lee

    2003-01-01

    Understanding how quality improvement affects costs is important. Unfortunately, low-cost, reliable ways of measuring direct costs are scarce. This article builds on the principles of process improvement to develop a costing strategy that meets both criteria. Process-based costing has 4 steps: developing a flowchart, estimating resource use, valuing resources, and calculating direct costs. To illustrate the technique, this article uses it to cost the care planning process in 3 long-term care facilities. We conclude that process-based costing is easy to implement; generates reliable, valid data; and allows nursing managers to assess the costs of new or modified processes.
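
    The four steps can be sketched directly: the flowchart becomes a list of activities, each with estimated resource use and unit values, and the direct cost is the valued sum. All figures below are illustrative, not from the study.

```python
def direct_cost(activities):
    """Process-based costing sketch: each flowchart activity carries
    estimated resource use (minutes of staff time, supplies) and unit
    values; the direct cost is the valued sum over all activities."""
    total = 0.0
    for act in activities:
        total += act["minutes"] / 60.0 * act["hourly_rate"] + act.get("supplies", 0.0)
    return total

# Hypothetical care-planning flowchart with made-up time and rate estimates.
care_planning = [
    {"step": "assessment", "minutes": 45, "hourly_rate": 30.0},
    {"step": "care conference", "minutes": 30, "hourly_rate": 30.0, "supplies": 2.5},
    {"step": "documentation", "minutes": 15, "hourly_rate": 30.0},
]
cost = direct_cost(care_planning)
```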

  9. Process evaluation of the Enabling Mothers to Prevent Pediatric Obesity Through Web-Based Learning and Reciprocal Determinism (EMPOWER) randomized control trial.

    PubMed

    Knowlden, Adam P; Sharma, Manoj

    2014-09-01

    Family-and-home-based interventions are an important vehicle for preventing childhood obesity. Systematic process evaluations have not been routinely conducted in assessment of these interventions. The purpose of this study was to plan and conduct a process evaluation of the Enabling Mothers to Prevent Pediatric Obesity Through Web-Based Learning and Reciprocal Determinism (EMPOWER) randomized control trial. The trial was composed of two web-based, mother-centered interventions for prevention of obesity in children between 4 and 6 years of age. Process evaluation used the components of program fidelity, dose delivered, dose received, context, reach, and recruitment. Categorical process evaluation data (program fidelity, dose delivered, dose received, and context) were assessed using Program Implementation Index (PII) values. Continuous process evaluation variables (dose satisfaction and recruitment) were assessed using ANOVA tests to evaluate mean differences between groups (experimental and control) and sessions (sessions 1 through 5). Process evaluation results found that both groups (experimental and control) were equivalent, and interventions were administered as planned. Analysis of web-based intervention process objectives requires tailoring of process evaluation models for online delivery. Dissemination of process evaluation results can advance best practices for implementing effective online health promotion programs. © 2014 Society for Public Health Education.

  10. Supervisee Art-Based Disclosure in "El Duende" Process Painting

    ERIC Educational Resources Information Center

    Robb, Megan; Miller, Abbe

    2017-01-01

    Although art-based supervision often leads to supervisee disclosure, little is known about the experience, process, or contributions of such disclosure. We investigated the phenomenon of supervisee disclosure during "El Duende" Process Painting art-based group supervision using a qualitative study. The Johari Window was used as a grounding…

  11. Multi-model comparison on the effects of climate change on tree species in the eastern U.S.: results from an enhanced niche model and process-based ecosystem and landscape models

    Treesearch

    Louis R. Iverson; Frank R. Thompson; Stephen Matthews; Matthew Peters; Anantha Prasad; William D. Dijak; Jacob Fraser; Wen J. Wang; Brice Hanberry; Hong He; Maria Janowiak; Patricia Butler; Leslie Brandt; Chris Swanston

    2016-01-01

    Context. Species distribution models (SDM) establish statistical relationships between the current distribution of species and key attributes whereas process-based models simulate ecosystem and tree species dynamics based on representations of physical and biological processes. TreeAtlas, which uses DISTRIB SDM, and Linkages and LANDIS PRO, process...

  12. Numerical Study on Wake Flow Field Characteristic of the Base-Bleed Unit under Fast Depressurization Process

    NASA Astrophysics Data System (ADS)

    Xue, Xiaochun; Yu, Yonggang

    2017-04-01

    Numerical analyses have been performed to study the influence of fast depressurization on the wake flow field of the base-bleed unit (BBU) with secondary combustion when the base-bleed projectile is propelled out of the muzzle. Two-dimensional axisymmetric Navier-Stokes equations for a multi-component chemically reactive system are solved with a Fortran program to calculate the coupling of the internal and wake flow fields, accounting for the combustion of the base-bleed propellant and the secondary combustion effect. Based on comparison with experiments, the unsteady variation mechanism and secondary combustion characteristics of the wake flow field under fast depressurization are obtained numerically. The results show that in the fast depressurization process, the base pressure of the BBU varies most strongly in the first 0.9 ms, then the variation decreases gradually, and after 1.5 ms the pressure remains basically stable. The pressure and temperature of the base-bleed combustion chamber decrease and then recover. Moreover, after the pressure and temperature reach their lowest point, external gases flow back into the base-bleed combustion chamber. Also, as the initial pressure decreases, the unsteady process becomes shorter and the temperature gradient in the base-bleed combustion chamber declines under the fast depressurization process, which benefits the combustion of the base-bleed propellant.

  13. Process Analytical Technology (PAT): batch-to-batch reproducibility of fermentation processes by robust process operational design and control.

    PubMed

    Gnoth, S; Jenzsch, M; Simutis, R; Lübbert, A

    2007-10-31

    The Process Analytical Technology (PAT) initiative of the FDA is a reaction to the increasing discrepancy between current possibilities in process supervision and control of pharmaceutical production processes and their current application in industrial manufacturing. With rigid approval practices based on standard operational procedures, adaptation of production reactors to the state of the art was more or less inhibited for many years. Now PAT paves the way for continuous process and product improvement through improved process supervision based on knowledge-based data analysis, "Quality-by-Design" concepts, and, finally, feedback control. Examples of up-to-date implementations of this concept are presented. They are taken from one key group of processes in recombinant pharmaceutical protein manufacturing: the cultivation of genetically modified Escherichia coli bacteria.

  14. Learning-based controller for biotechnology processing, and method of using

    DOEpatents

    Johnson, John A.; Stoner, Daphne L.; Larsen, Eric D.; Miller, Karen S.; Tolle, Charles R.

    2004-09-14

    The present invention relates to process control where some of the controllable parameters are difficult or impossible to characterize. It relates particularly, but not exclusively, to process control of such systems in biotechnology, including biotechnological minerals processing. In the inventive method, the invention manipulates a minerals bioprocess to find local extrema (maxima or minima) for selected output variables/process goals by using a learning-based controller for bioprocess oxidation of minerals during hydrometallurgical processing. The learning-based controller operates with or without human supervision and works to find process optima without previously defined optima, owing to the uncharacterized nature of the process being manipulated.
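
    A crude, hypothetical stand-in for such a learning-based controller is model-free hill climbing: perturb the control setting, keep moves that improve the measured output, and refine the step size when no direction helps. This sketch assumes a made-up yield curve and is not the patented algorithm.

```python
def hill_climb(measure, u0=0.0, step=0.5, shrink=0.5, iters=50):
    """Model-free hill climbing toward a local maximum of an unknown,
    measurable output. Perturb the setting in both directions, keep an
    improving move, and shrink the step when neither direction helps."""
    u, best = u0, measure(u0)
    for _ in range(iters):
        for du in (step, -step):
            y = measure(u + du)
            if y > best:
                u, best = u + du, y
                break
        else:
            step *= shrink  # no improvement either way: refine the search
    return u, best

# Assumed (invented) oxidation-yield curve with a local maximum at u = 2.
u_opt, y_opt = hill_climb(lambda u: -(u - 2.0) ** 2 + 5.0)
```

Like the patented controller, this requires no prior model of the process, only the ability to measure the output after each adjustment.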

  15. Lubricant base oil and wax processing. [Glossary included

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sequeira, A. Jr.

    1994-01-01

    This book provides state-of-the-art information on all processes currently used to manufacture lubricant base oils and waxes. It furnishes helpful lists of conversion factors, construction cost data, and process licensors, as well as a glossary of essential petroleum processing terms.

  16. Environmental Assessment: General Plan-Based Environmental Impact Analysis Process, Laughlin Air Force Base

    DTIC Science & Technology

    2007-05-01

    BASED ENVIRONMENTAL IMPACT ANALYSIS PROCESS, LAUGHLIN AIR FORCE BASE, TEXAS. AGENCY: 47th Flying Training Wing (FTW), Laughlin Air Force Base (AFB), Texas ... [base map residue: environmental restoration site labels AOC01, PS018, WP002, DP008, WP006; Laughlin Air Force Base Environmental Restoration]

  17. Design and Implementation of Hydrologic Process Knowledge-base Ontology: A case study for the Infiltration Process

    NASA Astrophysics Data System (ADS)

    Elag, M.; Goodall, J. L.

    2013-12-01

    Hydrologic modeling often requires the re-use and integration of models from different disciplines to simulate complex environmental systems. Component-based modeling introduces a flexible approach for integrating physical-based processes across disciplinary boundaries. Several hydrologic-related modeling communities have adopted the component-based approach for simulating complex physical systems by integrating model components across disciplinary boundaries in a workflow. However, it is not always straightforward to create these interdisciplinary models due to the lack of sufficient knowledge about a hydrologic process. This shortcoming is a result of using informal methods for organizing and sharing information about a hydrologic process. A knowledge-based ontology provides such standards and is considered the ideal approach for overcoming this challenge. The aims of this research are to present the methodology used in analyzing the basic hydrologic domain in order to identify hydrologic processes, the ontology itself, and how the proposed ontology is integrated with the Water Resources Component (WRC) ontology. The proposed ontology standardizes the definitions of a hydrologic process, the relationships between hydrologic processes, and their associated scientific equations. The objective of the proposed Hydrologic Process (HP) Ontology is to advance the idea of creating a unified knowledge framework for components' metadata by introducing a domain-level ontology for hydrologic processes. The HP ontology is a step toward an explicit and robust domain knowledge framework that can be evolved through the contribution of domain users. Analysis of the hydrologic domain is accomplished using the Formal Concept Approach (FCA), in which the infiltration process, an important hydrologic process, is examined. Two infiltration methods, the Green-Ampt and Philip's methods, were used to demonstrate the implementation of information in the HP ontology. 
Furthermore, a SPARQL service is provided for semantic-based querying of the ontology.
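
    The kind of semantic query such a service answers can be illustrated with a tiny in-memory triple store and pattern matching (SPARQL-like in spirit only). The class and property names below are invented for illustration and are not taken from the actual HP ontology; the Green-Ampt equation shown is the standard form.

```python
# Invented triples loosely modeled on the infiltration example above.
triples = [
    ("Infiltration", "isA", "HydrologicProcess"),
    ("Infiltration", "hasMethod", "GreenAmpt"),
    ("Infiltration", "hasMethod", "Philip"),
    ("GreenAmpt", "hasEquation", "f = K * (1 + psi * delta_theta / F)"),
]

def query(pattern):
    """Match a (subject, predicate, object) pattern; None is a wildcard,
    analogous to a variable in a SPARQL basic graph pattern."""
    s, p, o = pattern
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# "Which methods implement the infiltration process?"
methods = [o for _, _, o in query(("Infiltration", "hasMethod", None))]
```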

  18. Rethinking a Negative Event: The Affective Impact of Ruminative versus Imagery-Based Processing of Aversive Autobiographical Memories.

    PubMed

    Slofstra, Christien; Eisma, Maarten C; Holmes, Emily A; Bockting, Claudi L H; Nauta, Maaike H

    2017-01-01

    Ruminative (abstract verbal) processing during recall of aversive autobiographical memories may serve to dampen their short-term affective impact. Experimental studies indeed demonstrate that verbal processing of non-autobiographical material and positive autobiographical memories evokes weaker affective responses than imagery-based processing. In the current study, we hypothesized that abstract verbal or concrete verbal processing of an aversive autobiographical memory would result in weaker affective responses than imagery-based processing. The affective impact of abstract verbal versus concrete verbal versus imagery-based processing during recall of an aversive autobiographical memory was investigated in a non-clinical sample (n = 99) using both an observational and an experimental design. Observationally, it was examined whether the spontaneous use of processing modes (both state and trait measures) was associated with the impact of aversive autobiographical memory recall on negative and positive affect. Experimentally, the causal relation between processing modes and affective impact was investigated by manipulating the processing mode during retrieval of the same aversive autobiographical memory. The main findings were that higher levels of trait (but not state) measures of both ruminative and imagery-based processing and depressive symptomatology were positively correlated with higher levels of negative affective impact in the observational part of the study. In the experimental part, no main effect of processing modes on the affective impact of autobiographical memories was found. However, a significant moderating effect of depressive symptomatology was found. Only for individuals with low levels of depressive symptomatology did concrete verbal (but not abstract verbal) processing of the aversive autobiographical memory result in weaker affective responses compared to imagery-based processing.
These results cast doubt on the hypothesis that ruminative processing of aversive autobiographical memories serves to avoid the negative emotions evoked by such memories. Furthermore, findings suggest that depressive symptomatology is associated with the spontaneous use and the affective impact of processing modes during recall of aversive autobiographical memories. Clinical studies are needed that examine the role of processing modes during aversive autobiographical memory recall in depression, including the potential effectiveness of targeting processing modes in therapy.

  19. A novel vortex tube-based N2-expander liquefaction process for enhancing the energy efficiency of natural gas liquefaction

    NASA Astrophysics Data System (ADS)

    Qyyum, Muhammad Abdul; Wei, Feng; Hussain, Arif; Ali, Wahid; Sehee, Oh; Lee, Moonyong

    2017-11-01

    This research work presents a simple, safe, environment-friendly, and energy-efficient novel vortex tube-based natural gas liquefaction process. A vortex tube was introduced into the popular N2-expander liquefaction process to enhance the liquefaction efficiency. The process structure and conditions were modified and optimized to take advantage of the vortex tube in the natural gas liquefaction cycle. Two commercial simulators, ANSYS® and Aspen HYSYS®, were used to investigate the application of the vortex tube in the refrigeration cycle of the LNG process. A computational fluid dynamics (CFD) model was used to simulate the vortex tube with nitrogen (N2) as the working fluid. Subsequently, the results of the CFD model were embedded in Aspen HYSYS® to validate the proposed LNG liquefaction process. The proposed natural gas liquefaction process was optimized using the knowledge-based optimization (KBO) approach, with overall energy consumption as the objective function. The performance of the proposed liquefaction process was compared with the conventional N2-expander liquefaction process. The vortex tube-based LNG process improved energy efficiency by 20% compared with the conventional N2-expander process, mainly owing to the isentropic expansion in the vortex tube. It turned out that the high energy efficiency of the vortex tube-based process depends strongly on the refrigerant cold fraction and operating conditions, as well as on the refrigerant cycle configuration.

  20. An object-oriented description method of EPMM process

    NASA Astrophysics Data System (ADS)

    Jiang, Zuo; Yang, Fan

    2017-06-01

    To apply mature object-oriented tools and languages to software process modelling, and to bring software process models closer to industrial standards, it is necessary to study object-oriented modelling of software processes. Based on the formal process definitions in EPMM, and considering that Petri nets are primarily a formal modelling tool, this paper combines Petri net modelling with object-oriented modelling ideas and provides an implementation method for converting Petri-net-based EPMM models into object-oriented description models.

  1. Automated process control for plasma etching

    NASA Astrophysics Data System (ADS)

    McGeown, Margaret; Arshak, Khalil I.; Murphy, Eamonn

    1992-06-01

    This paper discusses the development and implementation of a rule-based system which assists in providing automated process control for plasma etching. The heart of the system is the establishment of a correspondence between a particular data pattern -- sensor or data signals -- and one or more modes of failure, i.e., a data-driven monitoring approach. The objective of this rule-based system, PLETCHSY, is to create a program combining statistical process control (SPC) and fault diagnosis to help control a manufacturing process which varies over time. This is achieved by building a process control system (PCS) that monitors the performance of the process by obtaining and analyzing data for the appropriate process variables. Process sensor/status signals are input into an SPC module. If trends are present, the SPC module outputs the last seven control points, a pattern which is represented by either regression or scoring. The pattern is passed to the rule-based module, which, when it recognizes a pattern, starts the diagnostic process using it. If the process is considered to be going out of control, advice is provided about actions which should be taken to bring the process back into control.
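
The monitoring loop described above can be sketched as follows. The trend check, rule conditions, and advice strings are invented for illustration; they are not the actual PLETCHSY rules.

```python
from statistics import mean

def spc_trend(points, window=7):
    """Return the last `window` control points if they form a monotone trend."""
    recent = points[-window:]
    rising = all(a < b for a, b in zip(recent, recent[1:]))
    falling = all(a > b for a, b in zip(recent, recent[1:]))
    return recent if (rising or falling) else None

def diagnose(pattern, center, sigma):
    """Toy rule base mapping a trend pattern to corrective advice."""
    drift = mean(pattern) - center
    if abs(drift) > 2 * sigma:
        return "out of control: adjust RF power / check gas flow"
    return "drifting: schedule chamber clean"

# SPC module watches a process variable; on a trend, the pattern is
# handed to the rule-based module for diagnosis.
history = [10.0, 10.1, 10.3, 10.4, 10.6, 10.9, 11.3, 11.8]
pattern = spc_trend(history)
if pattern is not None:
    print(diagnose(pattern, center=10.0, sigma=0.2))
```

The split mirrors the paper's architecture: statistical detection first, symbolic diagnosis second.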

  2. The Effects of a Web-Based Nursing Process Documentation Program on Stress and Anxiety of Nursing Students in South Korea.

    PubMed

    Lee, Eunjoo; Noh, Hyun Kyung

    2016-01-01

    To examine the effects of a web-based nursing process documentation system on the stress and anxiety of nursing students during their clinical practice. A quasi-experimental design was employed. The experimental group (n = 110) used a web-based nursing process documentation program for their case reports as part of assignments for a clinical practicum, whereas the control group (n = 106) used traditional paper-based case reports. Stress and anxiety levels were measured with a numeric rating scale before, 2 weeks after, and 4 weeks after using the web-based nursing process documentation program during a clinical practicum. The data were analyzed using descriptive statistics, t tests, chi-square tests, and repeated-measures analyses of variance. Nursing students who used the web-based nursing process documentation program showed significantly lower levels of stress and anxiety than the control group. A web-based nursing process documentation program could be used to reduce the stress and anxiety of nursing students during clinical practicum, which ultimately would benefit nursing students by increasing satisfaction with and the effectiveness of clinical practicum. © 2015 NANDA International, Inc.

  3. Process-based principles for restoring river ecosystems

    Treesearch

    Timothy J. Beechie; David A. Sear; Julian D. Olden; George R. Pess; John M. Buffington; Hamish Moir; Philip Roni; Michael M. Pollock

    2010-01-01

    Process-based restoration aims to reestablish normative rates and magnitudes of physical, chemical, and biological processes that sustain river and floodplain ecosystems. Ecosystem conditions at any site are governed by hierarchical regional, watershed, and reach-scale processes controlling hydrologic and sediment regimes; floodplain and aquatic habitat...

  4. Rule-based reasoning is fast and belief-based reasoning can be slow: Challenging current explanations of belief-bias and base-rate neglect.

    PubMed

    Newman, Ian R; Gibb, Maia; Thompson, Valerie A

    2017-07-01

    It is commonly assumed that belief-based reasoning is fast and automatic, whereas rule-based reasoning is slower and more effortful. Dual-Process theories of reasoning rely on this speed-asymmetry explanation to account for a number of reasoning phenomena, such as base-rate neglect and belief-bias. The goal of the current study was to test this hypothesis about the relative speed of belief-based and rule-based processes. Participants solved base-rate problems (Experiment 1) and conditional inferences (Experiment 2) under a challenging deadline; they then gave a second response in free time. We found that fast responses were informed by rules of probability and logical validity, and that slow responses incorporated belief-based information. Implications for Dual-Process theories and future research options for dissociating Type I and Type II processes are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  5. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform

    DTIC Science & Technology

    2018-01-01

    ARL-TR-8270, January 2018, US Army Research Laboratory. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform, by Kwok F Tom, Sensors and Electron... Reporting period: 1 October 2016–30 September 2017.

  6. The Acquisition of Integrated Science Process Skills in a Web-Based Learning Environment

    ERIC Educational Resources Information Center

    Saat, Rohaida Mohd

    2004-01-01

    Web-based learning is becoming prevalent in science learning. Some use specially designed programs, while others use materials available on the Internet. This qualitative case study examined the process of acquisition of integrated science process skills, particularly the skill of controlling variables, in a web-based learning environment among…

  7. Process-based modeling of temperature and water profiles in the seedling recruitment zone: Part II. Seedling emergence timing

    USDA-ARS?s Scientific Manuscript database

    Predictions of seedling emergence timing for spring wheat are facilitated by process-based modeling of the microsite environment in the shallow seedling recruitment zone. Hourly temperature and water profiles within the recruitment zone for 60 days after planting were simulated from the process-base...

  8. Process Writing and Communicative-Task-Based Instruction: Many Common Features, but More Common Limitations?

    ERIC Educational Resources Information Center

    Bruton, Anthony

    2005-01-01

    Process writing and communicative-task-based instruction both assume productive tasks that prompt self-expression to motivate students and as the principal engine for developing L2 proficiency in the language classroom. Besides this, process writing and communicative-task-based instruction have much else in common, despite some obvious…

  9. Process Correlation Analysis Model for Process Improvement Identification

    PubMed Central

    Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the basis. CMMI defines a set of process areas involved in software development and what is to be carried out in each process area in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort and expertise required. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data. PMID:24977170

  10. Process correlation analysis model for process improvement identification.

    PubMed

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the basis. CMMI defines a set of process areas involved in software development and what is to be carried out in each process area in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort and expertise required. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.
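
The core computation behind such a correlation analysis can be sketched as a Pearson correlation between assessment scores of process elements across assessed projects. The practice names and scores below are hypothetical; the paper's model, built on CMMI and empirical improvement data, is considerably richer than this toy.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / sqrt(vx * vy)

# Hypothetical per-project capability scores for three CMMI practices.
scores = {
    "REQM.SP1.1": [2, 3, 3, 4, 4, 5],
    "PP.SP2.2":   [2, 2, 3, 4, 5, 5],
    "CM.SP1.2":   [5, 4, 4, 3, 2, 2],
}

def correlated_pairs(scores, threshold=0.8):
    """Return practice pairs whose |r| meets the threshold."""
    names = sorted(scores)
    pairs = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            r = pearson(scores[a], scores[b])
            if abs(r) >= threshold:
                pairs.append((a, b, round(r, 2)))
    return pairs

print(correlated_pairs(scores))
```

Pairs flagged this way are candidates for being improved together rather than in isolation.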

  11. Enzyme-based solutions for textile processing and dye contaminant biodegradation-a review.

    PubMed

    Chatha, Shahzad Ali Shahid; Asgher, Muhammad; Iqbal, Hafiz M N

    2017-06-01

    The textile industry, a recognized cornerstone and stakeholder in the world's economy, is facing serious environmental challenges. In practice, various chemical-based processes, from initial sizing to final washing, raise serious environmental concerns. Some of these chemicals are corrosive to equipment and cause serious damage to it. Therefore, in the twenty-first century, the chemical and allied industries seek a paradigm shift from traditional chemical-based concepts to greener, sustainable, and environmentally friendlier catalytic alternatives, at both the laboratory and industrial scales. Bio-based catalysis offers numerous benefits in the context of the biotechnological industry and environmental applications. In recent years, bio-based processing has received particular interest among scientists for inter- and multi-disciplinary investigations in the natural and engineering sciences, for application in the biotechnology sector at large and in the textile industry in particular. Different enzymatic processes have been developed, or are in development, as chemical substitutes for various textile wet processes. In this context, the present review article summarizes current developments and highlights those areas where environment-friendly enzymatic textile processing might play an increasingly important role in the textile industry. The first part of the review gives special focus to a comparative discussion of chemical-based "classical/conventional" treatments and modern enzyme-based treatment processes. Some relevant information is also reported to identify the major research gaps to be worked out in the future.

  12. Developing cloud-based Business Process Management (BPM): a survey

    NASA Astrophysics Data System (ADS)

    Mercia; Gunawan, W.; Fajar, A. N.; Alianto, H.; Inayatulloh

    2018-03-01

    In today’s highly competitive business environment, modern enterprises face difficulties in cutting unnecessary costs, eliminating waste, and delivering substantial benefits to the organization. Companies are increasingly turning to a more flexible IT environment to help them realize this goal. For this reason, this article examines cloud-based Business Process Management (BPM), which enables a focus on modeling, monitoring, and process management. Cloud-based BPM consists of business processes, business information, and IT resources, which help build real-time intelligence systems based on business management and cloud technology. Cloud computing is a paradigm that involves procuring dynamically scalable resources over the internet as an IT service. A cloud-based BPM service can address common problems faced by traditional BPM, especially in promoting flexible, event-driven business processes that exploit opportunities in the marketplace.

  13. Testing the Digital Thread in Support of Model-Based Manufacturing and Inspection

    PubMed Central

    Hedberg, Thomas; Lubell, Joshua; Fischer, Lyle; Maggiano, Larry; Feeney, Allison Barnard

    2016-01-01

    A number of manufacturing companies have reported anecdotal evidence describing the benefits of Model-Based Enterprise (MBE). Based on this evidence, major players in industry have embraced a vision to deploy MBE. In our view, the best chance of realizing this vision is the creation of a single “digital thread.” Under MBE, there exists a Model-Based Definition (MBD), created by the Engineering function, that downstream functions reuse to complete Model-Based Manufacturing and Model-Based Inspection activities. The ensemble of data that enables the combination of model-based definition, manufacturing, and inspection defines this digital thread. Such a digital thread would enable real-time design and analysis, collaborative process-flow development, automated artifact creation, and full-process traceability in a seamless real-time collaborative development among project participants. This paper documents the strengths and weaknesses in the current, industry strategies for implementing MBE. It also identifies gaps in the transition and/or exchange of data between various manufacturing processes. Lastly, this paper presents measured results from a study of model-based processes compared to drawing-based processes and provides evidence to support the anecdotal evidence and vision made by industry. PMID:27325911

  14. On the fractal characterization of Paretian Poisson processes

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo I.; Sokolov, Igor M.

    2012-06-01

    Paretian Poisson processes are Poisson processes which are defined on the positive half-line, have maximal points, and are quantified by power-law intensities. Paretian Poisson processes are elemental in statistical physics, and are the bedrock of a host of power-law statistics ranging from Pareto's law to anomalous diffusion. In this paper we establish evenness-based fractal characterizations of Paretian Poisson processes. Considering an array of socioeconomic evenness-based measures of statistical heterogeneity, we show that: amongst the realm of Poisson processes which are defined on the positive half-line, and have maximal points, Paretian Poisson processes are the unique class of 'fractal processes' exhibiting scale-invariance. The results established in this paper are diametric to previous results asserting that the scale-invariance of Poisson processes-with respect to physical randomness-based measures of statistical heterogeneity-is characterized by exponential Poissonian intensities.
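
A Paretian Poisson process with power-law intensity lambda(x) = c * x**(-(1+alpha)) on the positive half-line has a simple simulation: the expected number of points above a level l is Lambda(l) = (c/alpha) * l**(-alpha), and conditionally those points are i.i.d. Pareto(alpha) variables scaled by l. The sketch below uses illustrative parameter values and a textbook Poisson sampler; it is not from the paper.

```python
import math
import random

def mean_count_above(c, alpha, level):
    """Expected number of process points exceeding `level`."""
    return (c / alpha) * level ** (-alpha)

def sample_points_above(c, alpha, level, rng):
    """Draw the process points above `level` (largest first)."""
    lam = mean_count_above(c, alpha, level)
    # Knuth's Poisson sampler (adequate for modest means).
    k, p, limit = 0, 1.0, math.exp(-lam)
    while p > limit:
        k += 1
        p *= rng.random()
    n = k - 1
    # Each point above `level` is Pareto(alpha) scaled by `level`:
    # P(X > x) = (x / level) ** (-alpha) for x >= level.
    points = [level * (1.0 - rng.random()) ** (-1.0 / alpha) for _ in range(n)]
    return sorted(points, reverse=True)

rng = random.Random(7)
pts = sample_points_above(c=2.0, alpha=1.5, level=1.0, rng=rng)
print(len(pts), all(x >= 1.0 for x in pts))
```

Note the scale-invariance discussed in the abstract: rescaling the level by s multiplies the mean count by s**(-alpha), leaving the shape of the point configuration statistically unchanged.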

  15. A new intuitionistic fuzzy rule-based decision-making system for an operating system process scheduler.

    PubMed

    Butt, Muhammad Arif; Akram, Muhammad

    2016-01-01

    We present a new intuitionistic fuzzy rule-based decision-making system, based on intuitionistic fuzzy sets, for the process scheduler of a batch operating system. Our proposed intuitionistic fuzzy scheduling algorithm takes as input the nice value and burst time of all available processes in the ready queue, intuitionistically fuzzifies the input values, triggers the appropriate rules of our intuitionistic fuzzy inference engine, and finally calculates the dynamic priority (dp) of every process in the ready queue. Once the dp of every process is calculated, the ready queue is sorted in decreasing order of dp. The process with the maximum dp value is sent to the central processing unit for execution. Finally, we show the complete working of our algorithm on two different data sets and give comparisons with some standard non-preemptive process schedulers.
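
The scheduling loop described above can be sketched as follows. The membership functions, rule weights, and dp formula are invented for illustration; the paper's intuitionistic fuzzy inference engine is more elaborate (each input also carries a non-membership degree, not just the single hesitancy constant used here).

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def dynamic_priority(nice, burst):
    """Toy rule base: favour low nice values and short bursts."""
    high_prio = tri(nice, -25, -20, 20)   # membership of "nice value is high-priority"
    short = tri(burst, -1, 0, 50)         # membership of "burst time is short"
    hesitancy = 0.1                       # stand-in for intuitionistic hesitancy
    return (1 - hesitancy) * (0.6 * high_prio + 0.4 * short)

ready_queue = [("A", 0, 30), ("B", -10, 10), ("C", 10, 5)]  # (name, nice, burst)
ready_queue.sort(key=lambda p: dynamic_priority(p[1], p[2]), reverse=True)
print([name for name, _, _ in ready_queue])  # -> ['B', 'C', 'A']
```

After sorting, the head of the queue (maximum dp) is dispatched to the CPU, as in the abstract.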

  16. Reading Remediation Based on Sequential and Simultaneous Processing.

    ERIC Educational Resources Information Center

    Gunnison, Judy; And Others

    1982-01-01

    The theory postulating a dichotomy between sequential and simultaneous processing is reviewed, and its implications for remediating reading problems are discussed. Research is cited on sequential-simultaneous processing for early and advanced reading. A list of remedial strategies based on the processing dichotomy addresses decoding and lexical…

  17. The poetics of mourning and faith-based intervention in maladaptive grieving processes in Ethiopia.

    PubMed

    Hussein, Jeylan Wolyie

    2018-08-01

    The paper is an inquiry into the poetics of mourning and faith-based intervention in maladaptive grieving processes in Ethiopia. The paper discusses the ways that loss is signified and analyzes the meanings of ethnocultural and psychospiritual practices employed to deal with maladaptive grief processes and their psychological and emotional after-effects. Hermeneutics provided the methodological framework and informed the analysis. The thesis of the paper is that the poetics of mourning and faith-based social interventions are interactionally based meaning making processes. The paper indicates the limitations of the study and their implications for further inquiry.

  18. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    NASA Astrophysics Data System (ADS)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.
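
The Petri-net semantics underlying such process models can be sketched in a few lines: a transition is enabled when every input place holds enough tokens, and firing moves tokens from input to output places. XML nets additionally attach XML-structured data to tokens; this sketch keeps plain token counts only, and the example transition is hypothetical.

```python
def enabled(marking, transition):
    """A transition is enabled if every input place has enough tokens."""
    pre, _ = transition
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, transition):
    """Fire a transition: consume input tokens, produce output tokens."""
    pre, post = transition
    if not enabled(marking, transition):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# "review document": consumes a draft and a free reviewer, yields a report
# and returns the reviewer.
t_review = ({"draft": 1, "reviewer": 1}, {"report": 1, "reviewer": 1})
m0 = {"draft": 2, "reviewer": 1}
m1 = fire(m0, t_review)
print(m1)  # -> {'draft': 1, 'reviewer': 1, 'report': 1}
```

A business-process model is then a set of such transitions, and the reachable markings describe the possible process states.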

  19. Process-based organization design and hospital efficiency.

    PubMed

    Vera, Antonio; Kuntz, Ludwig

    2007-01-01

    The central idea of process-based organization design is that organizing a firm around core business processes leads to cost reductions and quality improvements. We investigated theoretically and empirically whether the implementation of a process-based organization design is advisable in hospitals. The data came from a database compiled by the Statistical Office of the German federal state of Rheinland-Pfalz and from a written questionnaire, which was sent to the chief executive officers (CEOs) of all 92 hospitals in this federal state. We used data envelopment analysis (DEA) to measure hospital efficiency, and factor analysis and regression analysis to test our hypothesis. Our principal finding is that a high degree of process-based organization has a moderate but significant positive effect on the efficiency of hospitals. The main implication is that hospitals should implement a process-based organization to improve their efficiency. However, to actually achieve positive effects on efficiency, it is of paramount importance to observe some implementation rules, in particular to mobilize physician participation and to create an adequate organizational culture.

  20. Process-Based Governance in Public Administrations Using Activity-Based Costing

    NASA Astrophysics Data System (ADS)

    Becker, Jörg; Bergener, Philipp; Räckers, Michael

    Decision- and policy-makers in public administrations currently lack relevant information for effective governance. In Germany, the introduction of New Public Management and double-entry accounting gives public administrations the opportunity to use cost-centered accounting mechanisms to establish new governance mechanisms. Process modelling can be a useful instrument here, helping decision- and policy-makers in public administrations to structure their activities and capture relevant information. In combination with approaches like Activity-Based Costing, higher management levels can be supported with a sound data basis for fruitful and reasonable governance. The aim of this article is therefore to combine the public-sector-specific process modelling method PICTURE with the concept of activity-based costing to support public administrations in process-based governance.

  1. DEVS Unified Process for Web-Centric Development and Testing of System of Systems

    DTIC Science & Technology

    2008-05-20

    gathering from the user. Further, methodologies have been developed to generate DEVS models from BPMN/BPEL-based and message-based requirement specifications... 3. BPMN/BPEL-based system specifications: Business Process Modeling Notation (BPMN) [bpm] or Business Process Execution Language (BPEL) provide a...information is stored in .wsdl and .bpel files for BPEL but in a proprietary format for BPMN. 4. DoDAF-based requirement specifications: Department of

  2. Problem Based Learning: Cognitive and Metacognitive Processes during Problem Analysis.

    ERIC Educational Resources Information Center

    De Grave, W. S.; And Others

    1996-01-01

    To investigate whether problem-based learning leads to conceptual change, the cognitive and metacognitive processes of a group of medical students were studied during the problem analysis phase, and their verbal communication and thinking processes were analyzed. Stimulated recall of the thinking process during the discussion detected a conceptual…

  3. Facial Movements Facilitate Part-Based, Not Holistic, Processing in Children, Adolescents, and Adults

    ERIC Educational Resources Information Center

    Xiao, Naiqi G.; Quinn, Paul C.; Ge, Liezhong; Lee, Kang

    2017-01-01

    Although most of the faces we encounter daily are moving ones, much of what we know about face processing and its development is based on studies using static faces that emphasize holistic processing as the hallmark of mature face processing. Here the authors examined the effects of facial movements on face processing developmentally in children…

  4. A Model of Process-Based Automation: Cost and Quality Implications in the Medication Management Process

    ERIC Educational Resources Information Center

    Spaulding, Trent Joseph

    2011-01-01

    The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…

  5. Process-based models are required to manage ecological systems in a changing world

    Treesearch

    K. Cuddington; M.-J. Fortin; L.R. Gerber; A. Hastings; A. Liebhold; M. OConnor; C. Ray

    2013-01-01

    Several modeling approaches can be used to guide management decisions. However, some approaches are better fitted than others to address the problem of prediction under global change. Process-based models, which are based on a theoretical understanding of relevant ecological processes, provide a useful framework to incorporate specific responses to altered...

  6. Knowledge information management toolkit and method

    DOEpatents

    Hempstead, Antoinette R.; Brown, Kenneth L.

    2006-08-15

    A system is provided for managing user entry and/or modification of knowledge information into a knowledge base file having an integrator support component and a data source access support component. The system includes processing circuitry, memory, a user interface, and a knowledge base toolkit. The memory communicates with the processing circuitry and is configured to store at least one knowledge base. The user interface communicates with the processing circuitry and is configured for user entry and/or modification of knowledge pieces within a knowledge base. The knowledge base toolkit is configured for converting knowledge in at least one knowledge base from a first knowledge base form into a second knowledge base form. A method is also provided.

  7. Common Workflow Service: Standards Based Solution for Managing Operational Processes

    NASA Astrophysics Data System (ADS)

    Tinio, A. W.; Hollins, G. A.

    2017-06-01

    The Common Workflow Service is a collaborative and standards-based solution for managing mission operations processes using techniques from the Business Process Management (BPM) discipline. This presentation describes the CWS and its benefits.

  8. Development of Spreadsheet-Based Integrated Transaction Processing Systems and Financial Reporting Systems

    NASA Astrophysics Data System (ADS)

    Ariana, I. M.; Bagiada, I. M.

    2018-01-01

    The development of spreadsheet-based integrated transaction processing systems and financial reporting systems is intended to optimize the capabilities of spreadsheets in accounting data processing. The purposes of this study are: 1) to describe the spreadsheet-based integrated transaction processing systems and financial reporting systems; and 2) to test their technical and operational feasibility. This study is of the research-and-development type. The main steps of the study are: 1) needs analysis (needs assessment); 2) developing the spreadsheet-based integrated transaction processing systems and financial reporting systems; and 3) testing the feasibility of the systems. Technical feasibility covers the ability of the hardware and operating systems to run the accounting application, and its simplicity and ease of use. Operational feasibility covers the ability of users to use the accounting application, the ability of the application to produce information, and the controls within the application. The instrument used to assess the technical and operational feasibility of the systems is an expert perception questionnaire. The instrument uses a 4-point Likert scale, from 1 (strongly disagree) to 4 (strongly agree). Data were analyzed using percentage analysis, comparing the number of answers within an item to the ideal number of answers for that item. The spreadsheet-based integrated transaction processing systems and financial reporting systems integrate sales, purchase, and cash transaction processing to produce financial reports (statement of profit or loss and other comprehensive income, statement of changes in equity, statement of financial position, and statement of cash flows) and other reports. The systems are feasible in terms of both technical aspects (87.50%) and operational aspects (84.17%).
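
The percentage analysis described above compares the obtained score per item with the ideal score per item. A minimal sketch with invented questionnaire data follows; the 87.50% and 84.17% figures in the abstract come from the study's own expert responses, not from these numbers.

```python
def feasibility_pct(scores, max_score=4):
    """Percentage of the ideal total achieved by Likert-scale responses."""
    return 100.0 * sum(scores) / (max_score * len(scores))

technical = [4, 3, 4, 3, 4, 3]  # hypothetical expert ratings on a 1-4 scale
print(round(feasibility_pct(technical), 2))  # -> 87.5
```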

  9. Disentangling inhibition-based and retrieval-based aftereffects of distractors: Cognitive versus motor processes.

    PubMed

    Singh, Tarini; Laub, Ruth; Burgard, Jan Pablo; Frings, Christian

    2018-05-01

    Selective attention refers to the ability to selectively act upon relevant information at the expense of irrelevant information. Yet, in many experimental tasks, what happens to the representation of the irrelevant information is still debated. Typically, 2 approaches to distractor processing have been suggested, namely distractor inhibition and distractor-based retrieval. However, it is also typical that both processes are hard to disentangle. For instance, in the negative priming literature (for a review, see Frings, Schneider, & Fox, 2015) this has been a continuous debate since the early 1980s. In the present study, we attempted to prove that both processes exist, but that they reflect distractor processing at different levels of representation. Distractor inhibition impacts stimulus representation, whereas distractor-based retrieval impacts mainly motor processes. We investigated both processes in a distractor-priming task, which enables an independent measurement of both processes. For our argument that both processes impact different levels of distractor representation, we estimated the exponential parameter (τ) and Gaussian components (μ, σ) of the exponential-Gaussian (ex-Gaussian) reaction-time (RT) distribution, which have previously been used to independently test the effects of cognitive and motor processes (e.g., Moutsopoulou & Waszak, 2012). The distractor-based retrieval effect was evident for the Gaussian component, which is typically discussed as reflecting motor processes, but not for the exponential parameter, whereas the inhibition component was evident for the exponential parameter, which is typically discussed as reflecting cognitive processes, but not for the Gaussian parameter. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
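
The ex-Gaussian RT distribution referred to above is the sum of a Gaussian component (μ, σ) and an exponential component (τ), with the Gaussian part conventionally read as motor/residual processes and the exponential tail as slower cognitive processes. A sampling sketch with illustrative parameter values (not the study's estimates):

```python
import random
from statistics import mean

def ex_gaussian_rt(mu, sigma, tau, rng):
    """One reaction time (ms) drawn from an ex-Gaussian distribution."""
    return rng.gauss(mu, sigma) + rng.expovariate(1.0 / tau)

rng = random.Random(1)
rts = [ex_gaussian_rt(mu=400, sigma=40, tau=120, rng=rng) for _ in range(20000)]
print(round(mean(rts)))  # close to mu + tau = 520
```

Because the two components add, an effect that shifts μ but not τ (or vice versa) can be attributed to one level of processing, which is the logic of the dissociation reported above.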

  10. A Sensitivity Analysis Method to Study the Behavior of Complex Process-based Models

    NASA Astrophysics Data System (ADS)

    Brugnach, M.; Neilson, R.; Bolte, J.

    2001-12-01

    The use of process-based models as a tool for scientific inquiry is becoming increasingly relevant in ecosystem studies. Process-based models are artificial constructs that simulate the system by mechanistically mimicking the functioning of its component processes. Structurally, a process-based model can be characterized, in terms of its processes and the relationships established among them. Each process comprises a set of functional relationships among several model components (e.g., state variables, parameters and input data). While not encoded explicitly, the dynamics of the model emerge from this set of components and interactions organized in terms of processes. It is the task of the modeler to guarantee that the dynamics generated are appropriate and semantically equivalent to the phenomena being modeled. Despite the availability of techniques to characterize and understand model behavior, they do not suffice to completely and easily understand how a complex process-based model operates. For example, sensitivity analysis studies model behavior by determining the rate of change in model output as parameters or input data are varied. One of the problems with this approach is that it considers the model as a "black box", and it focuses on explaining model behavior by analyzing the relationship input-output. Since, these models have a high degree of non-linearity, understanding how the input affects an output can be an extremely difficult task. Operationally, the application of this technique may constitute a challenging task because complex process-based models are generally characterized by a large parameter space. In order to overcome some of these difficulties, we propose a method of sensitivity analysis to be applicable to complex process-based models. This method focuses sensitivity analysis at the process level, and it aims to determine how sensitive the model output is to variations in the processes. 
Once the processes that exert the greatest influence on the output are identified, the causes of that variability can be found. Advantages of this approach are that it reduces the dimensionality of the search space, facilitates interpretation of the results, and provides information for exploring uncertainty at the process level and how that uncertainty might affect model output. We present an example using the vegetation model BIOME-BGC.
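The process-level sensitivity idea described in this abstract can be illustrated with a toy sketch. The model below (biomass driven by a photosynthesis process and a respiration process), its rate constants, and the 10% perturbation size are all illustrative assumptions, not taken from the paper; the point is that an entire process is scaled, rather than one parameter at a time.

```python
def run_model(photo_scale=1.0, resp_scale=1.0, days=100):
    """Toy process-based model: biomass grows via a photosynthesis process
    and shrinks via a respiration process (rates purely illustrative)."""
    biomass = 10.0
    for _ in range(days):
        gain = photo_scale * 0.05 * biomass   # photosynthesis process
        loss = resp_scale * 0.02 * biomass    # respiration process
        biomass += gain - loss
    return biomass

def process_sensitivity(process_arg, delta=0.1):
    """Relative output change when an entire process is scaled by (1 + delta),
    i.e. sensitivity measured at the process level, not per parameter."""
    base = run_model()
    perturbed = run_model(**{process_arg: 1.0 + delta})
    return (perturbed - base) / base

s_photo = process_sensitivity("photo_scale")  # positive: gain process
s_resp = process_sensitivity("resp_scale")    # negative: loss process
```

Ranking `s_photo` against `s_resp` identifies which process dominates output variability, which is the reduction in search-space dimensionality the abstract describes.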

  11. Values-based recruitment in health care.

    PubMed

    Miller, Sam Louise

    2015-01-27

    Values-based recruitment is a process being introduced to student selection for nursing courses and appointment to registered nurse posts. This article discusses the process of values-based recruitment and demonstrates why it is important in health care today. It examines the implications of values-based recruitment for candidates applying to nursing courses and to newly qualified nurses applying for their first posts in England. To ensure the best chance of success, candidates should understand the principles and process of values-based recruitment and how to prepare for this type of interview.

  12. Natural Resource Information System. Volume 1: Overall description

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A prototype computer-based Natural Resource Information System was designed which could store, process, and display data of maximum usefulness to land management decision making. The system includes graphic input and display, the use of remote sensing as a data source, and it is useful at multiple management levels. A survey established current decision making processes and functions, information requirements, and data collection and processing procedures. The applications of remote sensing data and processing requirements were established. Processing software was constructed and a data base established using high-altitude imagery and map coverage of selected areas of SE Arizona. Finally a demonstration of system processing functions was conducted utilizing material from the data base.

  13. Associative recognition: a case of recall-to-reject processing.

    PubMed

    Rotello, C M; Heit, E

    2000-09-01

    Two-process accounts of recognition memory assume that memory judgments are based on both a rapidly available familiarity-based process and a slower, more accurate, recall-based mechanism. Past experiments on the time course of item recognition have not supported the recall-to-reject account of the second process, in which the retrieval of an old item is used to reject a similar foil (Rotello & Heit, 1999). In three new experiments, using analyses similar to those of Rotello and Heit, we found robust evidence for recall-to-reject processing in associative recognition, for word pairs, and for list-discrimination judgments. Put together, these results have implications for two-process accounts of recognition.

  14. Application of advanced structure to multi-tone mask for FPD process

    NASA Astrophysics Data System (ADS)

    Song, Jin-Han; Jeong, Jin-Woong; Kim, Kyu-Sik; Jeong, Woo-Gun; Yun, Sang-Pil; Lee, Dong-Heok; Choi, Sang-Soo

    2017-07-01

In step with improvements in FPD technology, masks for particular purposes, such as the phase shift mask (PSM) and the multi-tone mask (MTM), have also been developed. Above all, an MTM with more than two transmittance tones has a substantial advantage: it reduces the number of masks required in the FPD fabrication process compared with a normal two-tone mask.[1,2] A chromium (Cr)-based MTM (typically the top type) is widely employed because its all-Cr structure, consisting of a Cr absorber layer and a Cr half-tone layer, simplifies the etch process. However, the top type of Cr-based MTM demands two Cr sputtering processes, one after each layer's etching and writing process. For this reason, a material different from Cr is required to reduce mask fabrication time and cost. In this study, we evaluate an MTM with a structure combining Cr with molybdenum silicide (MoSi) to resolve the issues mentioned above. MoSi, already proven in integrated circuit (IC) processes, is a suitable material for MTM evaluation. This structure can realize multiple transmittance levels in common with the Cr-based MTM. Moreover, it reduces the number of sputtering processes. We investigate an optimized structure in consideration of productivity along with performance, such as the critical dimension (CD) variation and transmittance range of each structure. The transmittance is targeted at the h-line wavelength (405 nm) in the evaluation. The performances of all Cr-/MoSi-based MTMs are considered in comparison with the Cr-based MTM.

  15. Prefrontal and medial temporal contributions to episodic memory-based reasoning.

    PubMed

    Suzuki, Chisato; Tsukiura, Takashi; Mochizuki-Kawai, Hiroko; Shigemune, Yayoi; Iijima, Toshio

    2009-03-01

    Episodic memory retrieval and reasoning are fundamental psychological components of our daily lives. Although previous studies have investigated the brain regions associated with these processes separately, the neural mechanisms of reasoning based on episodic memory retrieval are largely unknown. Here, we investigated the neural correlates underlying episodic memory-based reasoning using functional magnetic resonance imaging (fMRI). During fMRI scanning, subjects performed three tasks: reasoning, episodic memory retrieval, and episodic memory-based reasoning. We identified dissociable activations related to reasoning, episodic memory retrieval, and linking processes between the two. Regions related to reasoning were identified in the left ventral prefrontal cortices (PFC), and those related to episodic memory retrieval were found in the right medial temporal lobe (MTL) regions. In addition, activations predominant in the linking process between the two were found in the left dorsal and right ventral PFC. These findings suggest that episodic memory-based reasoning is composed of at least three processes, i.e., reasoning, episodic memory retrieval, and linking processes between the two, and that activation of both the PFC and MTL is crucial in episodic memory-based reasoning. These findings are the first to demonstrate that PFC and MTL regions contribute differentially to each process in episodic memory-based reasoning.

  16. Cognitive load privileges memory-based over data-driven processing, not group-level over person-level processing.

    PubMed

    Skorich, Daniel P; Mavor, Kenneth I

    2013-09-01

    In the current paper, we argue that categorization and individuation, as traditionally discussed and as experimentally operationalized, are defined in terms of two confounded underlying dimensions: a person/group dimension and a memory-based/data-driven dimension. In a series of three experiments, we unconfound these dimensions and impose a cognitive load. Across the three experiments, two with laboratory-created targets and one with participants' friends as the target, we demonstrate that cognitive load privileges memory-based over data-driven processing, not group- over person-level processing. We discuss the results in terms of their implications for conceptualizations of the categorization/individuation distinction, for the equivalence of person and group processes, for the ultimate 'purpose' and meaningfulness of group-based perception and, fundamentally, for the process of categorization, broadly defined. © 2012 The British Psychological Society.

  17. A Psychometric Study of Reading Processes in L2 Acquisition: Deploying Deep Processing to Push Learners' Discourse Towards Syntactic Processing-Based Constructions

    ERIC Educational Resources Information Center

    Manuel, Carlos J.

    2009-01-01

    This study assesses reading processes and/or strategies needed to deploy deep processing that could push learners towards syntactic-based constructions in L2 classrooms. Research has found L2 acquisition to present varying degrees of success and/or fossilization (Bley-Vroman 1989, Birdsong 1992 and Sharwood Smith 1994). For example, learners have…

  18. Challenges and Opportunities: One Stop Processing of Automatic Large-Scale Base Map Production Using Airborne LIDAR Data Within GIS Environment. Case Study: Makassar City, Indonesia

    NASA Astrophysics Data System (ADS)

    Widyaningrum, E.; Gorte, B. G. H.

    2017-05-01

LiDAR data acquisition is recognized as one of the fastest solutions for providing base data for large-scale topographic base maps worldwide. Automatic LiDAR processing is believed to be one possible scheme for accelerating large-scale topographic base map provision by the Geospatial Information Agency in Indonesia. As a progressively advancing technology, Geographic Information Systems (GIS) open possibilities for automatic geospatial data processing and analysis. Considering further needs for spatial data sharing and integration, one-stop processing of LiDAR data in a GIS environment is considered a powerful and efficient approach to base map provision. The quality of the automated topographic base map is assessed and analysed in terms of its completeness, correctness, and quality, based on the confusion matrix.
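Completeness, correctness, and quality are standard per-object accuracy measures for automatically extracted map features, computed from the true-positive, false-positive, and false-negative counts of a confusion matrix. A minimal sketch (the building-footprint counts below are hypothetical, not from the study):

```python
def extraction_metrics(tp, fp, fn):
    """Standard accuracy measures for extracted map objects:
    completeness: fraction of reference objects found, TP / (TP + FN)
    correctness:  fraction of extracted objects that are real, TP / (TP + FP)
    quality:      combined measure, TP / (TP + FP + FN)
    """
    completeness = tp / (tp + fn)
    correctness = tp / (tp + fp)
    quality = tp / (tp + fp + fn)
    return completeness, correctness, quality

# Hypothetical counts for building footprints extracted from LiDAR:
comp, corr, qual = extraction_metrics(tp=180, fp=20, fn=30)
```

Quality is never larger than either completeness or correctness, since its denominator includes both error types.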

  19. Towards a Web-Based Handbook of Generic, Process-Oriented Learning Designs

    ERIC Educational Resources Information Center

    Marjanovic, Olivera

    2005-01-01

    Process-oriented learning designs are innovative learning activities that include a set of inter-related learning tasks and are generic (could be used across disciplines). An example includes a problem-solving process widely used in problem-based learning today. Most of the existing process-oriented learning designs are not documented, let alone…

  20. Visemic Processing in Audiovisual Discrimination of Natural Speech: A Simultaneous fMRI-EEG Study

    ERIC Educational Resources Information Center

    Dubois, Cyril; Otzenberger, Helene; Gounot, Daniel; Sock, Rudolph; Metz-Lutz, Marie-Noelle

    2012-01-01

    In a noisy environment, visual perception of articulatory movements improves natural speech intelligibility. Parallel to phonemic processing based on auditory signal, visemic processing constitutes a counterpart based on "visemes", the distinctive visual units of speech. Aiming at investigating the neural substrates of visemic processing in a…

  1. Process-Based Remediation of Decoding in Gifted LD Students: Three Case Studies.

    ERIC Educational Resources Information Center

    Crawford, Shawn; Snart, Fern

    1994-01-01

    Three gifted males (ages 10-13) with deficits in successive coding participated in a process-based remedial program which combined global training on tasks requiring successive processing and tasks applying successive processing to decoding in reading, and which utilized verbal mediation. Differences in student improvement were related to entry…

  2. Object-based neglect in number processing

    PubMed Central

    2013-01-01

Recent evidence suggests that neglect patients have particular problems representing relatively smaller numbers, which correspond to the left part of the mental number line. However, while this indicates space-based neglect for representational number space, little is known about whether and, if so, how object-based neglect influences number processing. To evaluate influences of object-based neglect in numerical cognition, a group of neglect patients and two control groups had to compare two-digit numbers to an internally represented standard. Conceptualizing two-digit numbers as objects of which the left part (i.e., the tens digit) should be specifically neglected, we were able to evaluate object-based neglect for number magnitude processing. Object-based neglect was indicated by a larger unit-decade compatibility effect, actually reflecting impaired processing of the leftward tens digits. Additionally, faster processing of within- as compared to between-decade items provided further evidence of particular difficulties in integrating tens and units into the place-value structure of the Arabic number system. In summary, the present study indicates that, in addition to the spatial representation of number magnitude, the processing of place-value information in multi-digit numbers also seems specifically impaired in neglect patients. PMID:23343126

  3. Chemomics-based marker compounds mining and mimetic processing for exploring chemical mechanisms in traditional processing of herbal medicines, a continuous study on Rehmanniae Radix.

    PubMed

    Zhou, Li; Xu, Jin-Di; Zhou, Shan-Shan; Shen, Hong; Mao, Qian; Kong, Ming; Zou, Ye-Ting; Xu, Ya-Yun; Xu, Jun; Li, Song-Lin

    2017-12-29

    Exploring processing chemistry, in particular the chemical transformation mechanisms involved, is a key step to elucidate the scientific basis in traditional processing of herbal medicines. Previously, taking Rehmanniae Radix (RR) as a case study, the holistic chemome (secondary metabolome and glycome) difference between raw and processed RR was revealed by integrating hyphenated chromatographic techniques-based targeted glycomics and untargeted metabolomics. Nevertheless, the complex chemical transformation mechanisms underpinning the holistic chemome variation in RR processing remain to be extensively clarified. As a continuous study, here a novel strategy by combining chemomics-based marker compounds mining and mimetic processing is proposed for further exploring the chemical mechanisms involved in herbal processing. First, the differential marker compounds between raw and processed herbs were rapidly discovered by untargeted chemomics-based mining approach through multivariate statistical analysis of the chemome data obtained by integrated metabolomics and glycomics analysis. Second, the marker compounds were mimetically processed under the simulated physicochemical conditions as in the herb processing, and the final reaction products were chemically characterized by targeted chemomics-based mining approach. Third, the main chemical transformation mechanisms involved were clarified by linking up the original marker compounds and their mimetic processing products. Using this strategy, a set of differential marker compounds including saccharides, glycosides and furfurals in raw and processed RR was rapidly found, and the major chemical mechanisms involved in RR processing were elucidated as stepwise transformations of saccharides (polysaccharides, oligosaccharides and monosaccharides) and glycosides (iridoid glycosides and phenethylalcohol glycosides) into furfurals (glycosylated/non-glycosylated hydroxymethylfurfurals) by deglycosylation and/or dehydration. 
The research deliverables indicated that the proposed strategy could advance the understanding of RR processing chemistry, and therefore may be considered a promising approach for delving into the scientific basis in traditional processing of herbal medicines. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Business Process-Based Resource Importance Determination

    NASA Astrophysics Data System (ADS)

    Fenz, Stefan; Ekelhart, Andreas; Neubauer, Thomas

    Information security risk management (ISRM) heavily depends on realistic impact values representing the resources’ importance in the overall organizational context. Although a variety of ISRM approaches have been proposed, well-founded methods that provide an answer to the following question are still missing: How can business processes be used to determine resources’ importance in the overall organizational context? We answer this question by measuring the actual importance level of resources based on business processes. Therefore, this paper presents our novel business process-based resource importance determination method which provides ISRM with an efficient and powerful tool for deriving realistic resource importance figures solely from existing business processes. The conducted evaluation has shown that the calculation results of the developed method comply to the results gained in traditional workshop-based assessments.
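The core idea of deriving resource importance from business processes can be sketched as follows. The aggregation rule (a resource inherits the importance of the most critical process that depends on it) and the example processes are illustrative assumptions; the paper's actual calculation method may differ.

```python
def resource_importance(processes):
    """Derive resource importance figures from business processes.

    processes: {process_name: (importance_0_to_1, [resources it uses])}
    Aggregation rule (an assumption for this sketch): a resource is as
    important as the most critical business process depending on it.
    """
    scores = {}
    for importance, resources in processes.values():
        for resource in resources:
            scores[resource] = max(scores.get(resource, 0.0), importance)
    return scores

# Hypothetical organizational processes and their resources:
procs = {
    "order_fulfilment": (0.9, ["ERP server", "database"]),
    "internal_reporting": (0.3, ["database", "file share"]),
}
scores = resource_importance(procs)
```

Here the shared database inherits the 0.9 importance of order fulfilment rather than the 0.3 of reporting, which is the kind of realistic impact value ISRM needs.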

  5. Elastic facial movement influences part-based but not holistic processing

    PubMed Central

    Xiao, Naiqi G.; Quinn, Paul C.; Ge, Liezhong; Lee, Kang

    2013-01-01

    Face processing has been studied for decades. However, most of the empirical investigations have been conducted using static face images as stimuli. Little is known about whether static face processing findings can be generalized to real world contexts, in which faces are constantly moving. The present study investigates the nature of face processing (holistic vs. part-based) in elastic moving faces. Specifically, we focus on whether elastic moving faces, as compared to static ones, can facilitate holistic or part-based face processing. Using the composite paradigm, participants were asked to remember either an elastic moving face (i.e., a face that blinks and chews) or a static face, and then tested with a static composite face. The composite effect was (1) significantly smaller in the dynamic condition than in the static condition, (2) consistently found with different face encoding times (Experiments 1–3), and (3) present for the recognition of both upper and lower face parts (Experiment 4). These results suggest that elastic facial motion facilitates part-based processing, rather than holistic processing. Thus, while previous work with static faces has emphasized an important role for holistic processing, the current work highlights an important role for featural processing with moving faces. PMID:23398253

  6. Low-Cost Aqueous Coal Desulfurization

    NASA Technical Reports Server (NTRS)

    Kalvinskas, J. J.; Vasilakos, N.; Corcoran, W. H.; Grohmann, K.; Rohatgi, N. K.

    1982-01-01

    Water-based process for desulfurizing coal not only eliminates need for costly organic solvent but removes sulfur more effectively than an earlier solvent-based process. New process could provide low-cost commercial method for converting high-sulfur coal into environmentally acceptable fuel.

  7. Quality management of manufacturing process based on manufacturing execution system

    NASA Astrophysics Data System (ADS)

    Zhang, Jian; Jiang, Yang; Jiang, Weizhuo

    2017-04-01

Quality control elements in the manufacturing process are elaborated, and the approach to quality management of the manufacturing process based on a manufacturing execution system (MES) is discussed. Finally, the functions of an MES for a microcircuit production line are introduced.

  8. Simple Process-Based Simulators for Generating Spatial Patterns of Habitat Loss and Fragmentation: A Review and Introduction to the G-RaFFe Model

    PubMed Central

    Pe'er, Guy; Zurita, Gustavo A.; Schober, Lucia; Bellocq, Maria I.; Strer, Maximilian; Müller, Michael; Pütz, Sandro

    2013-01-01

    Landscape simulators are widely applied in landscape ecology for generating landscape patterns. These models can be divided into two categories: pattern-based models that generate spatial patterns irrespective of the processes that shape them, and process-based models that attempt to generate patterns based on the processes that shape them. The latter often tend toward complexity in an attempt to obtain high predictive precision, but are rarely used for generic or theoretical purposes. Here we show that a simple process-based simulator can generate a variety of spatial patterns including realistic ones, typifying landscapes fragmented by anthropogenic activities. The model “G-RaFFe” generates roads and fields to reproduce the processes in which forests are converted into arable lands. For a selected level of habitat cover, three factors dominate its outcomes: the number of roads (accessibility), maximum field size (accounting for land ownership patterns), and maximum field disconnection (which enables field to be detached from roads). We compared the performance of G-RaFFe to three other models: Simmap (neutral model), Qrule (fractal-based) and Dinamica EGO (with 4 model versions differing in complexity). A PCA-based analysis indicated G-RaFFe and Dinamica version 4 (most complex) to perform best in matching realistic spatial patterns, but an alternative analysis which considers model variability identified G-RaFFe and Qrule as performing best. We also found model performance to be affected by habitat cover and the actual land-uses, the latter reflecting on land ownership patterns. We suggest that simple process-based generators such as G-RaFFe can be used to generate spatial patterns as templates for theoretical analyses, as well as for gaining better understanding of the relation between spatial processes and patterns. 
We suggest caution in applying neutral or fractal-based approaches, since spatial patterns that typify anthropogenic landscapes are often non-fractal in nature. PMID:23724108
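The roads-and-fields mechanism the abstract describes can be sketched in a few lines. This is an illustrative toy in the spirit of G-RaFFe, not the published model: roads are simplified to straight rows, field growth is a random flood fill from road-adjacent cells, and all sizes and fractions are made-up defaults.

```python
import random

def generate_landscape(size=30, habitat_cover=0.6, n_roads=2, seed=1):
    """G-RaFFe-style sketch: fields ('A') grow outward from straight
    roads ('R') into forest ('F') until a target fraction of the grid
    has been converted (all parameters illustrative)."""
    rng = random.Random(seed)
    grid = [["F"] * size for _ in range(size)]
    road_rows = rng.sample(range(size), n_roads)      # accessibility
    for r in road_rows:
        grid[r] = ["R"] * size
    target = int(size * size * (1 - habitat_cover))   # cells to convert
    # Seed the frontier with all road-adjacent cells.
    frontier = [(r + dr, c) for r in road_rows for dr in (-1, 1)
                for c in range(size) if 0 <= r + dr < size]
    made = 0
    while frontier and made < target:
        r, c = frontier.pop(rng.randrange(len(frontier)))
        if grid[r][c] != "F":
            continue
        grid[r][c] = "A"                              # forest -> field
        made += 1
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < size and 0 <= nc < size and grid[nr][nc] == "F":
                frontier.append((nr, nc))
    return grid

grid = generate_landscape()
cells = [cell for row in grid for cell in row]
```

Because fields can only grow from cells reachable from roads, the remaining forest forms the road-fragmented patch patterns typical of anthropogenic landscapes.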

  9. Simple process-based simulators for generating spatial patterns of habitat loss and fragmentation: a review and introduction to the G-RaFFe model.

    PubMed

    Pe'er, Guy; Zurita, Gustavo A; Schober, Lucia; Bellocq, Maria I; Strer, Maximilian; Müller, Michael; Pütz, Sandro

    2013-01-01

    Landscape simulators are widely applied in landscape ecology for generating landscape patterns. These models can be divided into two categories: pattern-based models that generate spatial patterns irrespective of the processes that shape them, and process-based models that attempt to generate patterns based on the processes that shape them. The latter often tend toward complexity in an attempt to obtain high predictive precision, but are rarely used for generic or theoretical purposes. Here we show that a simple process-based simulator can generate a variety of spatial patterns including realistic ones, typifying landscapes fragmented by anthropogenic activities. The model "G-RaFFe" generates roads and fields to reproduce the processes in which forests are converted into arable lands. For a selected level of habitat cover, three factors dominate its outcomes: the number of roads (accessibility), maximum field size (accounting for land ownership patterns), and maximum field disconnection (which enables field to be detached from roads). We compared the performance of G-RaFFe to three other models: Simmap (neutral model), Qrule (fractal-based) and Dinamica EGO (with 4 model versions differing in complexity). A PCA-based analysis indicated G-RaFFe and Dinamica version 4 (most complex) to perform best in matching realistic spatial patterns, but an alternative analysis which considers model variability identified G-RaFFe and Qrule as performing best. We also found model performance to be affected by habitat cover and the actual land-uses, the latter reflecting on land ownership patterns. We suggest that simple process-based generators such as G-RaFFe can be used to generate spatial patterns as templates for theoretical analyses, as well as for gaining better understanding of the relation between spatial processes and patterns. 
We suggest caution in applying neutral or fractal-based approaches, since spatial patterns that typify anthropogenic landscapes are often non-fractal in nature.

  10. Off-target model based OPC

    NASA Astrophysics Data System (ADS)

    Lu, Mark; Liang, Curtis; King, Dion; Melvin, Lawrence S., III

    2005-11-01

Model-based optical proximity correction (OPC) has become an indispensable tool for achieving wafer-pattern-to-design fidelity at current manufacturing process nodes. Most model-based OPC is performed considering the nominal process condition, with limited consideration of through-process manufacturing robustness. This study examines the use of off-target process models - models that represent non-nominal process states such as would occur with a dose or focus variation - to understand and manipulate the final pattern correction into a more process-robust configuration. The study first examines and validates the procedure for generating an off-target model, then examines the quality of the off-target model. Once the off-target model is proven, it is used to demonstrate methods of generating process-robust corrections. The concepts are demonstrated using a 0.13 μm logic gate process. Preliminary indications show success in both off-target model production and process-robust corrections. With these off-target models as tools, mask production cycle times can be reduced.

  11. A Methodology and Software Environment for Testing Process Model’s Sequential Predictions with Protocols

    DTIC Science & Technology

    1992-12-21

(in preparation). Foundations of artificial intelligence. Cambridge, MA: MIT Press. O'Reilly, R. C. (1991). X3DNet: An X-Based Neural Network... 2.2.3 Trace-based protocol analysis; 2.2.4 Summary of important data features; 2.3 Tools related to process model testing; 2.3.1 Tools for building... algorithm; 3. Requirements for testing process models using trace-based protocol analysis; 3.1 Definition of trace-based protocol analysis (TBPA)

  12. Article and process for producing an article

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lacy, Benjamin Paul; Jacala, Ariel Caesar Prepena; Kottilingam, Srikanth Chandrudu

    An article and a process of producing an article are provided. The article includes a base material, a cooling feature arrangement positioned on the base material, the cooling feature arrangement including an additive-structured material, and a cover material. The cooling feature arrangement is between the base material and the cover material. The process of producing the article includes manufacturing a cooling feature arrangement by an additive manufacturing technique, and then positioning the cooling feature arrangement between a base material and a cover material.

  13. Visual and spatial working memory are not that dissociated after all: a time-based resource-sharing account.

    PubMed

    Vergauwe, Evie; Barrouillet, Pierre; Camos, Valérie

    2009-07-01

    Examinations of interference between visual and spatial materials in working memory have suggested domain- and process-based fractionations of visuo-spatial working memory. The present study examined the role of central time-based resource sharing in visuo-spatial working memory and assessed its role in obtained interference patterns. Visual and spatial storage were combined with both visual and spatial on-line processing components in computer-paced working memory span tasks (Experiment 1) and in a selective interference paradigm (Experiment 2). The cognitive load of the processing components was manipulated to investigate its impact on concurrent maintenance for both within-domain and between-domain combinations of processing and storage components. In contrast to both domain- and process-based fractionations of visuo-spatial working memory, the results revealed that recall performance was determined by the cognitive load induced by the processing of items, rather than by the domain to which those items pertained. These findings are interpreted as evidence for a time-based resource-sharing mechanism in visuo-spatial working memory.

  14. [Definition and stabilization of processes II. Clinical Processes in a Urology Department].

    PubMed

    Pascual, Carlos; Luján, Marcos; Mora, José Ramón; Diz, Manuel Ramón; Martín, Carlos; López, María Carmen

    2015-01-01

    New models in clinical management seek a clinical practice based on quality, efficacy and efficiency, avoiding variability and improvisation. In this paper we have developed one of the most frequent clinical processes in our speciality, the process based on DRG 311 or transurethral procedures without complications. Along it we will describe its components: Stabilization form, clinical trajectory, cost calculation, and finally the process flowchart.

  15. [Construction of NIRS-based process analytical system for production of salvianolic acid for injection and relative discussion].

    PubMed

    Zhang, Lei; Yue, Hong-Shui; Ju, Ai-Chun; Ye, Zheng-Liang

    2016-10-01

Currently, near-infrared spectroscopy (NIRS) is considered an efficient tool for implementing process analytical technology (PAT) in the manufacture of traditional Chinese medicine (TCM) products. In this article, the NIRS-based process analytical system for the production of salvianolic acid for injection was introduced. The design of the process analytical system was described in detail, including the selection of monitored processes and testing modes, and potential risks that should be avoided. Moreover, the development of related technologies was also presented, covering the establishment of monitoring methods for the elution steps of the polyamide resin and macroporous resin chromatography processes, as well as a rapid analysis method for finished products. Based on the authors' experience of research and work, several issues in the application of NIRS to process monitoring and control in TCM production were then raised, and some potential solutions were also discussed. The issues include building the technical team for the process analytical system, designing the process analytical system in the manufacture of TCM products, standardizing the NIRS-based analytical methods, and improving the management of the process analytical system. Finally, prospects for the application of NIRS in the TCM industry were put forward. Copyright© by the Chinese Pharmaceutical Association.

  16. Economics of recombinant antibody production processes at various scales: Industry-standard compared to continuous precipitation.

    PubMed

    Hammerschmidt, Nikolaus; Tscheliessnig, Anne; Sommer, Ralf; Helk, Bernhard; Jungbauer, Alois

    2014-06-01

    Standard industry processes for recombinant antibody production employ protein A affinity chromatography in combination with other chromatography steps and ultra-/diafiltration. This study compares a generic antibody production process with a recently developed purification process based on a series of selective precipitation steps. The new process makes two of the usual three chromatographic steps obsolete and can be performed in a continuous fashion. Cost of Goods (CoGs) analyses were done for: (i) a generic chromatography-based antibody standard purification; (ii) the continuous precipitation-based purification process coupled to a continuous perfusion production system; and (iii) a hybrid process, coupling the continuous purification process to an upstream batch process. The results of this economic analysis show that the precipitation-based process offers cost reductions at all stages of the life cycle of a therapeutic antibody, (i.e. clinical phase I, II and III, as well as full commercial production). The savings in clinical phase production are largely attributed to the fact that expensive chromatographic resins are omitted. These economic analyses will help to determine the strategies that are best suited for small-scale production in parallel fashion, which is of importance for antibody production in non-privileged countries and for personalized medicine. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
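A cost-of-goods comparison of the kind the abstract reports reduces to per-batch costs divided by recovered product. The sketch below uses a deliberately crude model with made-up figures (all euro amounts, yields, and cost categories are hypothetical, not taken from the study); its only purpose is to show how omitting the resin cost can lower CoGs even at a slightly lower yield.

```python
def cogs_per_gram(resin, consumables, labour, product_g):
    """Cost of goods per gram of purified antibody: per-batch costs
    divided by grams of product recovered (toy model, made-up numbers)."""
    return (resin + consumables + labour) / product_g

# Hypothetical per-batch figures for a 1 kg batch:
chrom = cogs_per_gram(resin=50000, consumables=20000, labour=30000,
                      product_g=700)   # protein A route, 70% yield assumed
precip = cogs_per_gram(resin=0, consumables=35000, labour=30000,
                       product_g=650)  # precipitation route, 65% yield assumed
```

With these assumed inputs the precipitation route comes out cheaper per gram despite the lower yield, mirroring the study's qualitative conclusion that savings are largely attributable to omitting chromatographic resins.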

  17. A model of the hierarchy of behaviour, cognition, and consciousness.

    PubMed

    Toates, Frederick

    2006-03-01

    Processes comparable in important respects to those underlying human conscious and non-conscious processing can be identified in a range of species and it is argued that these reflect evolutionary precursors of the human processes. A distinction is drawn between two types of processing: (1) stimulus-based and (2) higher-order. For 'higher-order,' in humans the operations of processing are themselves associated with conscious awareness. Conscious awareness sets the context for stimulus-based processing and its end-point is accessible to conscious awareness. However, the mechanics of the translation between stimulus and response proceeds without conscious control. The paper argues that higher-order processing is an evolutionary addition to stimulus-based processing. The model's value is shown for gaining insight into a range of phenomena and their link with consciousness. These include brain damage, learning, memory, development, vision, emotion, motor control, reasoning, the voluntary versus involuntary debate, and mental disorder.

  18. An assembly process model based on object-oriented hierarchical time Petri Nets

    NASA Astrophysics Data System (ADS)

    Wang, Jiapeng; Liu, Shaoli; Liu, Jianhua; Du, Zenghui

    2017-04-01

    In order to improve the versatility, accuracy and integrity of the assembly process model of complex products, an assembly process model based on object-oriented hierarchical time Petri Nets is presented. A complete assembly process information model, including assembly resources, assembly inspection, time, structure and flexible parts, is established; this model describes the static and dynamic data involved in the assembly process. Through analysis of three-dimensional assembly process information, the assembly information is divided hierarchically from the whole, through the local, down to the details, and subnet models at the different levels of the object-oriented Petri Nets are established. The communication problem between Petri subnets is solved by using a message database, which effectively reduces the complexity of system modeling. Finally, the modeling process is presented, and a five-layer Petri Nets model is established based on the hoisting process of the engine compartment of a wheeled armored vehicle.
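    The mechanics of such a model can be hinted at with a minimal timed Petri net: places hold tokens, transitions consume and produce them, and each firing advances a clock. The net below is a toy two-step hoisting subnet with invented names and durations, not the authors' five-layer model.

```python
# Minimal timed Petri net sketch (illustrative only; names and durations invented).
class Transition:
    def __init__(self, name, inputs, outputs, duration):
        self.name, self.inputs, self.outputs, self.duration = name, inputs, outputs, duration

def enabled(marking, t):
    # A transition may fire when every input place holds at least one token.
    return all(marking.get(p, 0) >= 1 for p in t.inputs)

def fire(marking, t, clock):
    # Consume input tokens, produce output tokens, advance the clock.
    for p in t.inputs:
        marking[p] -= 1
    for p in t.outputs:
        marking[p] = marking.get(p, 0) + 1
    return clock + t.duration

# Toy hoisting subnet: part staged -> hoisted -> accepted after inspection.
marking = {"staged": 1}
net = [
    Transition("hoist", ["staged"], ["hoisted"], duration=5.0),
    Transition("inspect", ["hoisted"], ["accepted"], duration=2.0),
]
clock = 0.0
for t in net:
    if enabled(marking, t):
        clock = fire(marking, t, clock)
print(marking, clock)  # {'staged': 0, 'hoisted': 0, 'accepted': 1} 7.0
```

    A hierarchical model in the paper's sense would treat each such subnet as one node of a coarser net, with the message database mediating between levels.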

  19. Improving Video Game Development: Facilitating Heterogeneous Team Collaboration through Flexible Software Processes

    NASA Astrophysics Data System (ADS)

    Musil, Juergen; Schweda, Angelika; Winkler, Dietmar; Biffl, Stefan

    Based on our observations of Austrian video game software development (VGSD) practices, we identified a lack of systematic process/method support and inefficient collaboration between the various disciplines involved, i.e., engineers and artists. VGSD spans heterogeneous disciplines, e.g., creative arts, game/content design, and software. Improving team collaboration and process support therefore remains an ongoing challenge in enabling a comprehensive view of game development projects. Lessons learned from software engineering practice can help game developers improve game development processes within a heterogeneous environment. Based on a state-of-the-practice survey in the Austrian games industry, this paper (a) presents first results with a focus on process/method support and (b) suggests a candidate flexible process approach based on Scrum to improve VGSD and team collaboration. Results (a) showed a trend toward highly flexible software processes involving various disciplines and (b) identified the suggested flexible process approach as feasible and useful for project application.

  20. On the Risk Management and Auditing of SOA Based Business Processes

    NASA Astrophysics Data System (ADS)

    Orriens, Bart; Heuvel, Willem-Jan V./D.; Papazoglou, Mike

    SOA-enabled business processes stretch across many cooperating and coordinated systems, possibly crossing organizational boundaries, and technologies like XML and Web services are used to make system-to-system interactions commonplace. Business processes form the foundation of all organizations and, as such, are impacted by industry regulations. This requires organizations to review their business processes and ensure that they meet the compliance standards set forth in legislation. In this paper we sketch a SOA-based service risk management and auditing methodology, including a compliance enforcement and verification system, that assures verifiable business process compliance. This is done on the basis of a knowledge-based system that allows internal control systems to be integrated into business processes in conformance with pre-defined compliance rules, monitors both normal process behavior and that of the control systems during process execution, and logs these behaviors to facilitate retrospective auditing.

  1. Complex Event Processing for Content-Based Text, Image, and Video Retrieval

    DTIC Science & Technology

    2016-06-01

    ARL-TR-7705 ● JUNE 2016 ● US Army Research Laboratory ● Complex Event Processing for Content-Based Text, Image, and Video Retrieval

  2. Process for strengthening silicon based ceramics

    DOEpatents

    Kim, Hyoun-Ee; Moorhead, A. J.

    1993-01-01

    A process for strengthening silicon-based ceramic monolithic materials, and composite materials that contain silicon-based ceramic reinforcing phases, in which the ceramic is exposed to a wet hydrogen atmosphere at about 1400 °C. The process results in a dense, tightly adherent silicon-containing oxide layer that heals, blunts, or otherwise negates the detrimental effect of strength-limiting flaws on the surface of the ceramic body.

  3. Process for strengthening silicon based ceramics

    DOEpatents

    Kim, Hyoun-Ee; Moorhead, A. J.

    1993-04-06

    A process for strengthening silicon-based ceramic monolithic materials, and composite materials that contain silicon-based ceramic reinforcing phases, in which the ceramic is exposed to a wet hydrogen atmosphere at about 1400 °C. The process results in a dense, tightly adherent silicon-containing oxide layer that heals, blunts, or otherwise negates the detrimental effect of strength-limiting flaws on the surface of the ceramic body.

  4. Missing in Action: Writing Process-Based Instructional Practices and Measures of Higher-Order Literacy Achievement in Predominantly Urban Elementary Schools

    ERIC Educational Resources Information Center

    Briddell, Andrew

    2013-01-01

    This study of 1,974 fifth grade students investigated potential relationships between writing process-based instruction practices and higher-order thinking measured by a standardized literacy assessment. Writing process is defined as a highly complex, socio-cognitive process that includes: planning, text production, review, metacognition, writing…

  5. The Problem-Based Learning Process: Reflections of Pre-Service Elementary School Teachers

    ERIC Educational Resources Information Center

    Baysal, Zeliha Nurdan

    2017-01-01

    This study aims to identify the benefits acquired by third-year pre-service elementary school teachers participating in a problem-based learning process in social studies education, the issues they encountered in that process and those they are likely to encounter, and their feelings about the process. Semi-structured interviews were used as one…

  6. A cost-effective line-based light-balancing technique using adaptive processing.

    PubMed

    Hsia, Shih-Chang; Chen, Ming-Huei; Chen, Yu-Min

    2006-09-01

    Camera imaging systems are widely used; however, the displayed image can exhibit an unequal light distribution. This paper presents novel light-balancing techniques to compensate for uneven illumination based on adaptive signal processing. For text images, we first estimate the background level and then process each pixel with a nonuniform gain. This algorithm balances the light distribution while keeping high contrast in the image. For graph images, adaptive section control using a piecewise nonlinear gain is proposed to equalize the histogram. Simulations show that the light-balancing performance is better than that of other methods. Moreover, we employ line-based processing to efficiently reduce the memory requirement and computational cost, making the method applicable in real-time systems.
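    The text-image branch (estimate the background, then apply a nonuniform per-pixel gain) can be sketched in a few lines. This is an illustrative simplification with an invented window size and target level, not the paper's line-based implementation.

```python
# Rough sketch of per-pixel light balancing via a local background estimate
# and a nonuniform gain; window and target values are invented.
def balance_row(row, window=3, target=200):
    """Scale each pixel by target/background, where background = local mean."""
    out = []
    n = len(row)
    for i, px in enumerate(row):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        background = sum(row[lo:hi]) / (hi - lo)   # local background estimate
        gain = target / background if background > 0 else 1.0
        out.append(min(255, round(px * gain)))     # clamp to 8-bit range
    return out

row = [180, 170, 160, 80, 75, 70]  # a scan line darkening toward one side
print(balance_row(row))
```

    Pixels in the dark region receive a larger gain than those in the bright region, evening out the illumination while local contrast (pixel-to-background ratio) is preserved.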

  7. Additive Manufacturing of Single-Crystal Superalloy CMSX-4 Through Scanning Laser Epitaxy: Computational Modeling, Experimental Process Development, and Process Parameter Optimization

    NASA Astrophysics Data System (ADS)

    Basak, Amrita; Acharya, Ranadip; Das, Suman

    2016-08-01

    This paper focuses on additive manufacturing (AM) of the single-crystal (SX) nickel-based superalloy CMSX-4 through scanning laser epitaxy (SLE). SLE, a powder bed fusion-based AM process, was explored for the purpose of producing crack-free, dense deposits of CMSX-4 on top of investment-cast substrates of similar chemistry. Optical microscopy and scanning electron microscopy (SEM) investigations revealed the presence of dendritic microstructures that consisted of fine γ' precipitates within the γ matrix in the deposit region. Computational fluid dynamics (CFD)-based process modeling, statistical design of experiments (DoE), and microstructural characterization techniques were combined to produce metallurgically bonded single-crystal deposits of more than 500 μm height in a single pass along the entire length of the substrate. A customized quantitative metallography-based image analysis technique was employed for automatic extraction of various deposit quality metrics from the digital cross-sectional micrographs. The processing parameters were varied, and optimal processing windows were identified to obtain good-quality deposits. The results reported here represent one of the few successes in producing single-crystal epitaxial deposits through a powder bed fusion-based metal AM process and thus demonstrate the potential of SLE to repair and manufacture single-crystal hot-section components of gas turbine systems from nickel-based superalloy powders.

  8. Quality data collection and management technology of aerospace complex product assembly process

    NASA Astrophysics Data System (ADS)

    Weng, Gang; Liu, Jianhua; He, Yongxi; Zhuang, Cunbo

    2017-04-01

    Aiming at the problems of difficult management and poor traceability of discrete assembly process quality data, a data collection and management method is proposed that takes the assembly process and BOM as its core. The method comprises a data collection approach based on workflow technology, a data model based on the BOM, and quality traceability of the assembly process. Finally, an assembly process quality data management system is developed, realizing effective control and management of quality information for the complex product assembly process.

  9. Rationality versus reality: the challenges of evidence-based decision making for health policy makers

    PubMed Central

    2010-01-01

    Background Current healthcare systems have extended the evidence-based medicine (EBM) approach to health policy and delivery decisions, such as access-to-care, healthcare funding and health program continuance, through attempts to integrate valid and reliable evidence into the decision making process. These policy decisions have major impacts on society and have high personal and financial costs associated with those decisions. Decision models such as these function under a shared assumption of rational choice and utility maximization in the decision-making process. Discussion We contend that health policy decision makers are generally unable to attain the basic goals of evidence-based decision making (EBDM) and evidence-based policy making (EBPM) because humans make decisions with their naturally limited, faulty, and biased decision-making processes. A cognitive information processing framework is presented to support this argument, and subtle cognitive processing mechanisms are introduced to support the focal thesis: health policy makers' decisions are influenced by the subjective manner in which they individually process decision-relevant information rather than by the objective merits of the evidence alone. As such, subsequent health policy decisions do not necessarily achieve the goals of evidence-based policy making, such as maximizing health outcomes for society based on valid and reliable research evidence. Summary In this era of increasing adoption of evidence-based healthcare models, the rational-choice, utility-maximizing assumptions in EBDM and EBPM must be critically evaluated to ensure effective and high-quality health policy decisions. The cognitive information processing framework presented here will aid health policy decision makers by identifying how their decisions might be subtly influenced by non-rational factors. In this paper, we identify some of the biases and potential intervention points and provide some initial suggestions about how the EBDM/EBPM process can be improved. PMID:20504357

  10. Rationality versus reality: the challenges of evidence-based decision making for health policy makers.

    PubMed

    McCaughey, Deirdre; Bruning, Nealia S

    2010-05-26

    Current healthcare systems have extended the evidence-based medicine (EBM) approach to health policy and delivery decisions, such as access-to-care, healthcare funding and health program continuance, through attempts to integrate valid and reliable evidence into the decision making process. These policy decisions have major impacts on society and have high personal and financial costs associated with those decisions. Decision models such as these function under a shared assumption of rational choice and utility maximization in the decision-making process. We contend that health policy decision makers are generally unable to attain the basic goals of evidence-based decision making (EBDM) and evidence-based policy making (EBPM) because humans make decisions with their naturally limited, faulty, and biased decision-making processes. A cognitive information processing framework is presented to support this argument, and subtle cognitive processing mechanisms are introduced to support the focal thesis: health policy makers' decisions are influenced by the subjective manner in which they individually process decision-relevant information rather than by the objective merits of the evidence alone. As such, subsequent health policy decisions do not necessarily achieve the goals of evidence-based policy making, such as maximizing health outcomes for society based on valid and reliable research evidence. In this era of increasing adoption of evidence-based healthcare models, the rational-choice, utility-maximizing assumptions in EBDM and EBPM must be critically evaluated to ensure effective and high-quality health policy decisions. The cognitive information processing framework presented here will aid health policy decision makers by identifying how their decisions might be subtly influenced by non-rational factors. In this paper, we identify some of the biases and potential intervention points and provide some initial suggestions about how the EBDM/EBPM process can be improved.

  11. Randomized evaluation of a web based interview process for urology resident selection.

    PubMed

    Shah, Satyan K; Arora, Sanjeev; Skipper, Betty; Kalishman, Summers; Timm, T Craig; Smith, Anthony Y

    2012-04-01

    We determined whether a web based interview process for resident selection could effectively replace the traditional on-site interview. For the 2010 to 2011 match cycle, applicants to the University of New Mexico urology residency program were randomized to participate in a web based interview process via Skype or a traditional on-site interview process. Both methods included interviews with the faculty, a tour of the facilities and the opportunity to ask current residents any questions. To maintain fairness, the applicants were then reinterviewed via the opposite process several weeks later. We assessed comparative effectiveness, cost, convenience and satisfaction using anonymous surveys largely scored on a 5-point Likert scale. Of 39 total participants (33 applicants and 6 faculty), 95% completed the surveys. The web based interview was less costly to applicants (mean $171 vs $364, p=0.05) and required less time away from school (10% missing 1 or more days vs 30%, p=0.04) compared to the traditional on-site interview. However, applicants perceived the web based interview process as less effective than the traditional on-site interview, with a mean 6-item summative effectiveness score of 21.3 vs 25.6 (p=0.003). Applicants and faculty favored continuing the web based interview process in the future as an adjunct to on-site interviews. Residency interviews can be successfully conducted via the Internet. The web based interview process reduced costs and improved convenience. The findings of this study support the use of videoconferencing as an adjunct to traditional interview methods rather than as a replacement. Copyright © 2012 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  12. A Java-based fMRI processing pipeline evaluation system for assessment of univariate general linear model and multivariate canonical variate analysis-based pipelines.

    PubMed

    Zhang, Jing; Liang, Lichen; Anderson, Jon R; Gatewood, Lael; Rottenberg, David A; Strother, Stephen C

    2008-01-01

    As functional magnetic resonance imaging (fMRI) becomes widely used, the demands for evaluation of fMRI processing pipelines and validation of fMRI analysis results are increasing rapidly. The current NPAIRS package, an IDL-based fMRI processing pipeline evaluation framework, lacks system interoperability and the ability to evaluate general linear model (GLM)-based pipelines using prediction metrics. Thus, it cannot fully evaluate fMRI analytical software modules such as FSL.FEAT and NPAIRS.GLM. To overcome these limitations, a Java-based fMRI processing pipeline evaluation system was developed. It integrated YALE (a machine learning environment) into Fiswidgets (an fMRI software environment) to obtain system interoperability and applied an algorithm to measure GLM prediction accuracy. The results demonstrated that the system can evaluate fMRI processing pipelines with univariate GLM and multivariate canonical variates analysis (CVA)-based models on real fMRI data, based on prediction accuracy (classification accuracy) and statistical parametric image (SPI) reproducibility. In addition, a preliminary study was performed in which four fMRI processing pipelines with GLM and CVA modules, such as FSL.FEAT and NPAIRS.CVA, were evaluated with the system. The results indicated that (1) the system can compare different fMRI processing pipelines with heterogeneous models (NPAIRS.GLM, NPAIRS.CVA and FSL.FEAT) and rank their performance by automatic performance scoring, and (2) the ranking of pipeline performance is highly dependent on the preprocessing operations. These results suggest that the system will be of value for the comparison, validation, standardization and optimization of functional neuroimaging software packages and fMRI processing pipelines.

  13. Process improvement for the safe delivery of multidisciplinary-executed treatments-A case in Y-90 microspheres therapy.

    PubMed

    Cai, Bin; Altman, Michael B; Garcia-Ramirez, Jose; LaBrash, Jason; Goddu, S Murty; Mutic, Sasa; Parikh, Parag J; Olsen, Jeffrey R; Saad, Nael; Zoberi, Jacqueline E

    To develop a safe and robust workflow for yttrium-90 (Y-90) radioembolization procedures in a multidisciplinary team environment. A generalized Define-Measure-Analyze-Improve-Control (DMAIC)-based approach to process improvement was applied to a Y-90 radioembolization workflow. In the first DMAIC cycle, events with the Y-90 workflow were defined and analyzed. To improve the workflow, a web-based interactive electronic white board (EWB) system was adopted as the central communication platform and information processing hub. The EWB-based Y-90 workflow then underwent a second DMAIC cycle. Out of 245 treatments, three misses that went undetected until treatment initiation were recorded over a period of 21 months, and root-cause-analysis was performed to determine causes of each incident and opportunities for improvement. The EWB-based Y-90 process was further improved via new rules to define reliable sources of information as inputs into the planning process, as well as new check points to ensure this information was communicated correctly throughout the process flow. After implementation of the revised EWB-based Y-90 workflow, after two DMAIC-like cycles, there were zero misses out of 153 patient treatments in 1 year. The DMAIC-based approach adopted here allowed the iterative development of a robust workflow to achieve an adaptable, event-minimizing planning process despite a complex setting which requires the participation of multiple teams for Y-90 microspheres therapy. Implementation of such a workflow using the EWB or similar platform with a DMAIC-based process improvement approach could be expanded to other treatment procedures, especially those requiring multidisciplinary management. Copyright © 2016 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  14. Production of orthophosphate suspension fertilizers from wet-process acid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, T.M.; Burnell, J.R.

    1984-01-01

    For many years, the Tennessee Valley Authority (TVA) has worked toward development of suspension fertilizers. TVA has two plants for production of base suspension fertilizers from wet-process orthophosphoric acid. One is a demonstration-scale plant where a 13-38-0 grade base suspension is produced by a three-stage ammoniation process. The other is a new batch-type pilot plant which is capable of producing high-grade base suspensions of various ratios and grades from wet-process acid. In this batch plant, suspensions and solutions can also be produced from solid intermediates.

  15. Acceleration of integral imaging based incoherent Fourier hologram capture using graphic processing unit.

    PubMed

    Jeong, Kyeong-Min; Kim, Hee-Seung; Hong, Sung-In; Lee, Sung-Keun; Jo, Na-Young; Kim, Yong-Soo; Lim, Hong-Gi; Park, Jae-Hyeung

    2012-10-08

    Speed enhancement of integral imaging based incoherent Fourier hologram capture using a graphic processing unit is reported. The integral imaging based method enables exact hologram capture of real three-dimensional objects under regular incoherent illumination. In our implementation, we apply a parallel computation scheme using the graphic processing unit, accelerating the processing speed. Using the enhanced speed of hologram capture, we also implement a pseudo real-time hologram capture and optical reconstruction system. The overall operation speed is measured to be 1 frame per second.

  16. A continuous dual-process model of remember/know judgments.

    PubMed

    Wixted, John T; Mickes, Laura

    2010-10-01

    The dual-process theory of recognition memory holds that recognition decisions can be based on recollection or familiarity, and the remember/know procedure is widely used to investigate those 2 processes. Dual-process theory in general and the remember/know procedure in particular have been challenged by an alternative strength-based interpretation grounded in signal-detection theory, which holds that remember judgments simply reflect stronger memories than do know judgments. Although supported by a considerable body of research, the signal-detection account is difficult to reconcile with G. Mandler's (1980) classic "butcher-on-the-bus" phenomenon (i.e., strong, familiarity-based recognition). In this article, a new signal-detection model is proposed that does not deny either the validity of dual-process theory or the possibility that remember/know judgments can, when used in the right way, help to distinguish memories that are largely recollection based from those that are largely familiarity based. It does, however, agree with all prior signal-detection-based critiques of the remember/know procedure, which hold that, as it is ordinarily used, the procedure mainly distinguishes strong memories from weak memories (not recollection from familiarity).

  17. Effect of iron salt type and dosing mode on Fenton-based pretreatment of rice straw for enzymatic hydrolysis.

    PubMed

    Gan, Yu-Yan; Zhou, Si-Li; Dai, Xiao; Wu, Han; Xiong, Zi-Yao; Qin, Yuan-Hang; Ma, Jiayu; Yang, Li; Wu, Zai-Kun; Wang, Tie-Lin; Wang, Wei-Guo; Wang, Cun-Wen

    2018-06-15

    Fenton-based processes with four different iron salts in two different dosing modes were used to pretreat rice straw (RS) samples to increase their enzymatic digestibility. The composition analysis shows that the RS sample pretreated by the dosing mode of iron salt added into H2O2 has a much lower hemicellulose content than that pretreated by the dosing mode of H2O2 added into iron salt, and the RS sample pretreated by the chloride salt-based Fenton process has a much lower lignin content and a slightly lower hemicellulose content than that pretreated by the sulphate salt-based Fenton process. The higher concentration of reducing sugar observed for the RS sample with lower lignin and hemicellulose contents confirms that the Fenton-based process can enhance the enzymatic hydrolysis of RS by removing hemicellulose and lignin and increasing its accessibility to cellulase. FeCl3·6H2O added into H2O2 is the most efficient Fenton-based process for RS pretreatment. Copyright © 2018 Elsevier Ltd. All rights reserved.

  18. Service-based analysis of biological pathways

    PubMed Central

    Zheng, George; Bouguettaya, Athman

    2009-01-01

    Background Computer-based pathway discovery is concerned with two important objectives: pathway identification and analysis. Conventional mining and modeling approaches aimed at pathway discovery are often effective at achieving either objective, but not both. Such limitations can be effectively tackled by leveraging a Web service-based modeling and mining approach. Results Inspired by molecular recognition and drug discovery processes, we developed a Web service mining tool, named PathExplorer, to discover potentially interesting biological pathways linking service models of biological processes. The tool uses an innovative approach to identify useful pathways based on graph-based hints and service-based simulation verifying the user's hypotheses. Conclusion Web service modeling of biological processes allows easy access to, and invocation of, these processes on the Web. The Web service mining techniques described in this paper enable the discovery of biological pathways linking these process service models. Algorithms presented in this paper for automatically highlighting interesting subgraphs within an identified pathway network enable the user to formulate hypotheses, which can be tested using the simulation algorithm that is also described in this paper. PMID:19796403

  19. Conceptual design of distillation-based hybrid separation processes.

    PubMed

    Skiborowski, Mirko; Harwardt, Andreas; Marquardt, Wolfgang

    2013-01-01

    Hybrid separation processes combine different separation principles and constitute a promising design option for the separation of complex mixtures. Particularly, the integration of distillation with other unit operations can significantly improve the separation of close-boiling or azeotropic mixtures. Although the design of single-unit operations is well understood and supported by computational methods, the optimal design of flowsheets of hybrid separation processes is still a challenging task. The large number of operational and design degrees of freedom requires a systematic and optimization-based design approach. To this end, a structured approach, the so-called process synthesis framework, is proposed. This article reviews available computational methods for the conceptual design of distillation-based hybrid processes for the separation of liquid mixtures. Open problems are identified that must be addressed to finally establish a structured process synthesis framework for such processes.

  20. Capturing molecular multimode relaxation processes in excitable gases based on decomposition of acoustic relaxation spectra

    NASA Astrophysics Data System (ADS)

    Zhu, Ming; Liu, Tingting; Wang, Shu; Zhang, Kesheng

    2017-08-01

    Existing two-frequency reconstructive methods can only capture primary (single) molecular relaxation processes in excitable gases. In this paper, we present a reconstructive method based on a novel decomposition of frequency-dependent acoustic relaxation spectra to capture the entire molecular multimode relaxation process. This decomposition of acoustic relaxation spectra is derived from the frequency-dependent effective specific heat, showing that a multi-relaxation process is the sum of its interior single-relaxation processes. Based on this decomposition, we can reconstruct the entire multi-relaxation process by capturing the relaxation times and relaxation strengths of N interior single-relaxation processes, using measurements of acoustic absorption and sound speed at 2N frequencies. Experimental data for the gas mixtures CO2-N2 and CO2-O2 validate our decomposition and reconstruction approach.
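    The decomposition can be illustrated directly: a multi-relaxation absorption spectrum is the sum of single-relaxation (Debye-type) contributions, each with its own strength and relaxation frequency. The values below are invented placeholders, not measured CO2-N2 or CO2-O2 data.

```python
# Illustrative sum-of-single-relaxations model; strengths and relaxation
# frequencies are made-up numbers for demonstration.
def single_relaxation(f, strength, f_relax):
    """Absorption contribution of one relaxation process at frequency f;
    peaks at f == f_relax with value strength/2."""
    x = f / f_relax
    return strength * x / (1.0 + x * x)

def multi_relaxation(f, processes):
    """Multi-relaxation spectrum as the sum of interior single processes."""
    return sum(single_relaxation(f, s, fr) for s, fr in processes)

processes = [(0.02, 5e3), (0.005, 4e4)]  # (strength, relaxation frequency in Hz)
for f in (1e3, 1e4, 1e5):
    print(f, multi_relaxation(f, processes))
```

    With N such processes there are 2N unknowns (one strength and one relaxation frequency each), which is why measurements at 2N frequencies suffice for the reconstruction described above.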

  1. Affordable Design: A Methodology to Implement Process-Based Manufacturing Cost into the Traditional Performance-Focused Multidisciplinary Design Optimization

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Samareh, J. A.

    2000-01-01

    The primary objective of this paper is to demonstrate the use of process-based manufacturing and assembly cost models in a traditional performance-focused multidisciplinary design and optimization process. The use of automated cost-performance analysis is an enabling technology that could bring realistic process-based manufacturing and assembly cost into multidisciplinary design and optimization. In this paper, we present a new methodology for incorporating process costing into a standard multidisciplinary design optimization process. Material, manufacturing-process, and assembly-process costs could then be used as the objective function for the optimization method. A case study involving forty-six different configurations of a simple wing is presented, indicating that a design based on performance criteria alone may not necessarily be the most affordable as far as manufacturing and assembly cost is concerned.
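    The idea of using cost as the optimization objective can be shown with a toy selection over candidate configurations: score each on a performance metric and a process-based cost, then take the cheapest design that still meets the performance requirement. Names and numbers are invented for illustration.

```python
# Toy "affordable design" selection; configurations, scores and costs are invented.
configs = [
    # (name, performance score, manufacturing + assembly cost in k$)
    ("built-up ribs", 0.97, 410.0),
    ("machined skin", 0.99, 520.0),
    ("bonded panels", 0.95, 340.0),
]

def affordable_optimum(configs, min_performance):
    """Cheapest configuration that satisfies the performance constraint."""
    feasible = [c for c in configs if c[1] >= min_performance]
    return min(feasible, key=lambda c: c[2])  # cost is the objective

best = affordable_optimum(configs, min_performance=0.95)
print(best[0])  # bonded panels
```

    Note that a purely performance-driven choice here would pick the most expensive option, which is the paper's point: the best-performing design is not necessarily the most affordable.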

  2. Image processing system design for microcantilever-based optical readout infrared arrays

    NASA Astrophysics Data System (ADS)

    Tong, Qiang; Dong, Liquan; Zhao, Yuejin; Gong, Cheng; Liu, Xiaohua; Yu, Xiaomei; Yang, Lei; Liu, Weiyu

    2012-12-01

    Compared with traditional infrared imaging technology, the new type of optical-readout uncooled infrared imaging technology based on MEMS has many advantages, such as low cost, small size, and simple fabrication. In addition, theory predicts high thermal detection sensitivity for this technology, so it has very broad application prospects in the field of high-performance infrared detection. This paper focuses on an image capturing and processing system for this new optical-readout uncooled infrared imaging technology based on MEMS. The system consists of software and hardware. We build the core image processing hardware platform on TI's high-performance TMS320DM642 DSP chip, and design the image capturing board around Micron's MT9P031, a high-frame-rate, low-power-consumption CMOS sensor. Finally, we use Intel's LXT971A network transceiver to design the network output board. The software system is built on the real-time operating system DSP/BIOS. We develop the video capture driver based on TI's class mini-driver model and the network output program based on the NDK kit for image capturing, processing, and transmission. Experiments show that the system offers high capture resolution and fast processing speed, with network transmission speeds of up to 100 Mbps.

  3. Digital Signal Processing Based Biotelemetry Receivers

    NASA Technical Reports Server (NTRS)

    Singh, Avtar; Hines, John; Somps, Chris

    1997-01-01

    This is an attempt to develop a biotelemetry receiver using digital signal processing technology and techniques. The receiver developed in this work is based on recovering signals that have been encoded using either the Pulse Position Modulation (PPM) or the Pulse Code Modulation (PCM) technique. A prototype has been developed using state-of-the-art digital signal processing technology. A Printed Circuit Board (PCB) based on the technique and technology described here is being developed. This board is intended to be used in the UCSF Fetal Monitoring system developed at NASA. The board is capable of handling a variety of PPM- and PCM-encoded signals such as ECG, temperature, and pressure. A signal processing program has also been developed to analyze the received ECG signal to determine heart rate. This system provides a basis for using digital signal processing in biotelemetry receivers and other similar applications.
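    For the PPM case, demodulation reduces to measuring inter-pulse intervals and mapping each interval back to a sample value. The sketch below assumes invented timing constants (a 500 µs base interval and 100 µs slots); it is illustrative, not the receiver's actual DSP code.

```python
# Illustrative pulse-position demodulation; timing constants are invented.
def decode_ppm(pulse_times, slot_us=100.0, base_us=500.0):
    """Map each inter-pulse interval to a sample value in slot units."""
    samples = []
    for t0, t1 in zip(pulse_times, pulse_times[1:]):
        interval = t1 - t0                          # information lives in the spacing
        samples.append(round((interval - base_us) / slot_us))
    return samples

pulses = [0.0, 800.0, 1500.0, 2400.0]  # detected pulse times in microseconds
print(decode_ppm(pulses))  # [3, 2, 4]
```

    A real receiver would first detect pulse edges in the sampled RF baseband; here the pulse times are simply given.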

  4. Individual Differences in Base Rate Neglect: A Fuzzy Processing Preference Index

    PubMed Central

    Wolfe, Christopher R.; Fisher, Christopher R.

    2013-01-01

    Little is known about individual differences in integrating numeric base-rates and qualitative text in making probability judgments. Fuzzy-Trace Theory predicts a preference for fuzzy processing. We conducted six studies to develop the FPPI, a reliable and valid instrument assessing individual differences in this fuzzy processing preference. It consists of 19 probability estimation items plus 4 "M-Scale" items that distinguish simple pattern matching from "base rate respect." Cronbach's Alpha was consistently above 0.90. Validity is suggested by significant correlations between FPPI scores and three other measures: "Rule Based" Process Dissociation Procedure scores; the number of conjunction fallacies in joint probability estimation; and logic index scores on syllogistic reasoning. Replicating norms collected in a university study with a web-based study produced negligible differences in FPPI scores, indicating robustness. The predicted relationships between individual differences in base rate respect and both conjunction fallacies and syllogistic reasoning were partially replicated in two web-based studies. PMID:23935255

  5. A KPI framework for process-based benchmarking of hospital information systems.

    PubMed

    Jahn, Franziska; Winter, Alfred

    2011-01-01

    Benchmarking is a major topic for monitoring, directing and elucidating the performance of hospital information systems (HIS). Current approaches neglect the outcome of the processes that are supported by the HIS and their contribution to the hospital's strategic goals. We suggest benchmarking HIS based on clinical documentation processes and their outcome. A framework consisting of a general process model and outcome criteria for clinical documentation processes is introduced.

  6. Predictors of Processing-Based Task Performance in Bilingual and Monolingual Children

    PubMed Central

    Buac, Milijana; Gross, Megan; Kaushanskaya, Margarita

    2016-01-01

    In the present study we examined performance of bilingual Spanish-English-speaking and monolingual English-speaking school-age children on a range of processing-based measures within the framework of Baddeley’s working memory model. The processing-based measures included measures of short-term memory, measures of working memory, and a novel word-learning task. Results revealed that monolinguals outperformed bilinguals on the short-term memory tasks but not the working memory and novel word-learning tasks. Further, children’s vocabulary skills and socioeconomic status (SES) were more predictive of processing-based task performance in the bilingual group than the monolingual group. Together, these findings indicate that processing-based tasks that engage verbal working memory rather than short-term memory may be better-suited for diagnostic purposes with bilingual children. However, even verbal working memory measures are sensitive to bilingual children’s language-specific knowledge and demographic characteristics, and therefore may have limited clinical utility. PMID:27179914

  7. Manufacturing Process Selection of Composite Bicycle’s Crank Arm using Analytical Hierarchy Process (AHP)

    NASA Astrophysics Data System (ADS)

    Luqman, M.; Rosli, M. U.; Khor, C. Y.; Zambree, Shayfull; Jahidi, H.

    2018-03-01

    The crank arm is one of the important parts of a bicycle and is an expensive product due to the high cost of material and the production process. This research aims to investigate potential manufacturing processes for fabricating a composite bicycle crank arm and to describe an approach based on the analytical hierarchy process (AHP) that assists decision makers or manufacturing engineers in determining the most suitable process at the early stage of product development, in order to reduce production cost. Four types of processes were considered, namely resin transfer molding (RTM), compression molding (CM), vacuum bag molding, and filament winding (FW). The analysis ranks these four processes for suitability in manufacturing the bicycle crank arm based on five main selection factors and 10 sub-factors. The most suitable manufacturing process was determined by following the AHP steps, and a consistency test was performed to make sure the judgements were consistent during the comparisons. The results indicated that compression molding was the most appropriate manufacturing process because it had the highest priority value (33.6%) among the considered processes.
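    The core AHP mechanics used in studies like this one, a priority vector from a pairwise comparison matrix plus a consistency test, can be sketched briefly. The 4x4 matrix below is a hypothetical set of pairwise judgements, not the paper's data; the CR < 0.10 acceptance rule is Saaty's standard threshold.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for four candidate processes
# (e.g. RTM, CM, vacuum bag molding, FW) under a single criterion.
A = np.array([
    [1.0, 3.0, 5.0, 2.0],
    [1/3, 1.0, 3.0, 0.5],
    [1/5, 1/3, 1.0, 0.25],
    [0.5, 2.0, 4.0, 1.0],
])

# Priority vector: principal eigenvector, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency test: CR = CI / RI, acceptable when CR < 0.10.
n = A.shape[0]
lam_max = eigvals.real[k]
CI = (lam_max - n) / (n - 1)
RI = 0.90                      # Saaty's random index for n = 4
CR = CI / RI
```

Here the first alternative receives the highest priority, and the judgements pass the consistency check.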

  8. Software-Based Real-Time Acquisition and Processing of PET Detector Raw Data.

    PubMed

    Goldschmidt, Benjamin; Schug, David; Lerche, Christoph W; Salomon, André; Gebhardt, Pierre; Weissler, Bjoern; Wehner, Jakob; Dueppenbecker, Peter M; Kiessling, Fabian; Schulz, Volkmar

    2016-02-01

    In modern positron emission tomography (PET) readout architectures, the position and energy estimation of scintillation events (singles) and the detection of coincident events (coincidences) are typically carried out on highly integrated, programmable printed circuit boards. The implementation of advanced singles and coincidence processing (SCP) algorithms for these architectures is often limited by the strict constraints of hardware-based data processing. In this paper, we present a software-based data acquisition and processing architecture (DAPA) that offers a high degree of flexibility for advanced SCP algorithms through relaxed real-time constraints and an easily extendible data processing framework. The DAPA is designed to acquire detector raw data from independent (but synchronized) detector modules and process the data for singles and coincidences in real-time using a center-of-gravity (COG)-based, a least-squares (LS)-based, or a maximum-likelihood (ML)-based crystal position and energy estimation approach (CPEEA). To test the DAPA, we adapted it to a preclinical PET detector that outputs detector raw data from 60 independent digital silicon photomultiplier (dSiPM)-based detector stacks and evaluated it with a [18F]-fluorodeoxyglucose-filled hot-rod phantom. The DAPA is highly reliable with less than 0.1% of all detector raw data lost or corrupted. For high validation thresholds (37.1 ± 12.8 photons per pixel) of the dSiPM detector tiles, the DAPA is real time capable up to 55 MBq for the COG-based CPEEA, up to 31 MBq for the LS-based CPEEA, and up to 28 MBq for the ML-based CPEEA. Compared to the COG-based CPEEA, the rods in the image reconstruction of the hot-rod phantom are only slightly better separable and less blurred for the LS- and ML-based CPEEA. While the coincidence time resolution (∼500 ps) and energy resolution (∼12.3%) are comparable for all three CPEEA, the system sensitivity is up to 2.5× higher for the LS- and ML-based CPEEA.
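    Of the three position/energy estimators compared in this record, the center-of-gravity (COG) approach is the simplest and can be sketched in a few lines. The 8x8 grid and photon counts below are illustrative stand-ins, not real dSiPM tile data.

```python
import numpy as np

# Photon counts per pixel for one scintillation event (invented values):
# a bright center pixel with a small symmetric halo.
counts = np.zeros((8, 8))
counts[3:6, 4:7] = [[1, 4, 1],
                    [4, 20, 4],
                    [1, 4, 1]]

ys, xs = np.mgrid[0:8, 0:8]          # row (y) and column (x) indices
total = counts.sum()

# COG position: photon-count-weighted mean pixel coordinate.
x_cog = (xs * counts).sum() / total
y_cog = (ys * counts).sum() / total
energy = total                        # energy taken proportional to total count
```

For this symmetric event the estimate lands on the central pixel (x = 5, y = 4); LS and ML estimators replace this weighted mean with a fit against per-crystal light-response models.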

  9. Calcium hydroxide as a processing base in alkali-aided pH-shift protein recovery process.

    PubMed

    Paker, Ilgin; Jaczynski, Jacek; Matak, Kristen E

    2017-02-01

    Protein may be recovered by using pH shifts to solubilize and precipitate protein. Typically, sodium hydroxide is used as the processing base; however, this has been shown to significantly increase sodium in the final recovered protein. Protein was extracted from black bullhead catfish (Ameiurus melas) using a pH-shift method. Protein was solubilized using either sodium hydroxide (NaOH) or calcium hydroxide (Ca(OH) 2 ) and precipitated at pH 5.5 using hydrochloric acid (HCl). Protein solubility was greater when Ca(OH) 2 was used compared to NaOH during this process. Using Ca(OH) 2 as the processing base yielded the greatest lipid recovery (P < 0.05) at 77 g 100 g -1 , whereas the greatest (P < 0.05) protein recovery yield was recorded as 53 g 100 g -1 protein using NaOH. Protein solubilized with Ca(OH) 2 had more (P < 0.05) calcium in the protein fraction, whereas using NaOH increased (P < 0.05) sodium content. Results of our study showed that protein solubility was increased and the recovered protein had significantly more calcium when Ca(OH) 2 was used as the processing base. Results showed both NaOH and Ca(OH) 2 to be an effective processing base for pH-shift protein recovery processes. © 2016 Society of Chemical Industry. © 2016 Society of Chemical Industry.

  10. Bridging process-based and empirical approaches to modeling tree growth

    Treesearch

    Harry T. Valentine; Annikki Makela; Annikki Makela

    2005-01-01

    The gulf between process-based and empirical approaches to modeling tree growth may be bridged, in part, by the use of a common model. To this end, we have formulated a process-based model of tree growth that can be fitted and applied in an empirical mode. The growth model is grounded in pipe model theory and an optimal control model of crown development. Together, the...

  11. Comparison of pre-processing methods for multiplex bead-based immunoassays.

    PubMed

    Rausch, Tanja K; Schillert, Arne; Ziegler, Andreas; Lüking, Angelika; Zucht, Hans-Dieter; Schulz-Knappe, Peter

    2016-08-11

    High throughput protein expression studies can be performed using bead-based protein immunoassays, such as the Luminex® xMAP® technology. Technical variability is inherent to these experiments and may lead to systematic bias and reduced power. To reduce technical variability, data pre-processing is performed. However, no recommendations exist for the pre-processing of Luminex® xMAP® data. We compared 37 different data pre-processing combinations of transformation and normalization methods in 42 samples on 384 analytes obtained from a multiplex immunoassay based on the Luminex® xMAP® technology. We evaluated the performance of each pre-processing approach with 6 different performance criteria, three of which were plots; all plots were evaluated by 15 independent and blinded readers. Four different combinations of transformation and normalization methods performed well as pre-processing procedures for this bead-based protein immunoassay. The following combinations of transformation and normalization were suitable for pre-processing Luminex® xMAP® data in this study: weighted Box-Cox followed by quantile or robust spline normalization (rsn), asinh transformation followed by loess normalization, and Box-Cox followed by rsn.
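    One of the well-performing combinations named in this record, a Box-Cox transformation followed by quantile normalization, can be sketched as follows. The lambda is a fixed illustrative value (the study fits a weighted Box-Cox per analyte), and the data are synthetic MFI-like values, not Luminex output.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.lognormal(mean=5.0, sigma=1.0, size=(42, 10))  # samples x analytes

# Box-Cox power transformation with a fixed, illustrative lambda.
lam = 0.25
transformed = (data**lam - 1.0) / lam

# Quantile normalization: force every analyte (column) to share the same
# distribution, namely the mean of the column-wise sorted values.
order = np.argsort(transformed, axis=0)
ranks = np.argsort(order, axis=0)            # rank of each value in its column
target = np.sort(transformed, axis=0).mean(axis=1)
normalized = target[ranks]
```

After this step every column has an identical empirical distribution, which removes sample-to-sample scale differences at the cost of assuming most analytes do not change.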

  12. Collection of process data after cardiac surgery: initial implementation with a Java-based intranet applet.

    PubMed

    Ratcliffe, M B; Khan, J H; Magee, K M; McElhinney, D B; Hubner, C

    2000-06-01

    Using a Java-based intranet program (applet), we collected postoperative process data after coronary artery bypass grafting. A Java-based applet was developed and deployed on a hospital intranet. Briefly, the nurse entered patient process data using a point and click interface. The applet generated a nursing note, and process data were saved in a Microsoft Access database. In 10 patients, this method was validated by comparison with a retrospective chart review. In 45 consecutive patients, weekly control charts were generated from the data. When aberrations from the pathway occurred, feedback was initiated to restore the goals of the critical pathway. The intranet process data collection method was verified by a manual chart review with 98% sensitivity. The control charts for time to extubation, intensive care unit stay, and hospital stay showed a deviation from critical pathway goals after the first 20 patients. Feedback modulation was associated with a return to critical pathway goals. Java-based applets are inexpensive and can collect accurate postoperative process data, identify critical pathway deviations, and allow timely feedback of process data.
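    The weekly control charts in this record follow the standard Shewhart individuals-chart construction (center line ± 3 sigma, with sigma estimated from the mean moving range). The hours-to-extubation values below are invented for illustration, not the study's data.

```python
import numpy as np

# Invented hours-to-extubation for ten consecutive patients; patient 8
# deviates from the critical pathway.
hours = np.array([6.1, 5.8, 7.0, 6.4, 5.9, 6.6, 6.2, 11.5, 6.0, 6.3])

center = hours.mean()
mr = np.abs(np.diff(hours)).mean()   # mean moving range of successive points
sigma = mr / 1.128                   # d2 constant for a moving range of 2
ucl = center + 3 * sigma             # upper control limit
lcl = center - 3 * sigma             # lower control limit

out_of_control = np.flatnonzero((hours > ucl) | (hours < lcl))
```

Points outside the limits (here, the eighth patient) are exactly the pathway deviations that would trigger the feedback loop described in the record.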

  13. 42 CFR § 484.330 - Process for determining and applying the value-based payment adjustment under the Home Health...

    Code of Federal Regulations, 2010 CFR

    2016-10-01

    ...-based payment adjustment under the Home Health Value-Based Purchasing (HHVBP) Model. § 484.330 Section... (HHVBP) Model Components for Competing Home Health Agencies Within State Boundaries § 484.330 Process for determining and applying the value-based payment adjustment under the Home Health Value-Based Purchasing...

  14. 42 CFR § 484.330 - Process for determining and applying the value-based payment adjustment under the Home Health...

    Code of Federal Regulations, 2010 CFR

    2017-10-01

    ...-based payment adjustment under the Home Health Value-Based Purchasing (HHVBP) Model. § 484.330 Section... (HHVBP) Model Components for Competing Home Health Agencies Within State Boundaries § 484.330 Process for determining and applying the value-based payment adjustment under the Home Health Value-Based Purchasing...

  15. Development of an Integrated Multi-Contaminant Removal Process Applied to Warm Syngas Cleanup for Coal-Based Advanced Gasification Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyer, Howard

    2010-11-30

    This project met the objective to further the development of an integrated multi-contaminant removal process in which H2S, NH3, HCl and heavy metals including Hg, As, Se and Cd present in the coal-derived syngas can be removed to specified levels in a single/integrated process step. The process supports the mission and goals of the Department of Energy's Gasification Technologies Program, namely to enhance the performance of gasification systems, thus enabling U.S. industry to improve the competitiveness of gasification-based processes. The gasification program will reduce equipment costs, improve process environmental performance, and increase process reliability and flexibility. Two sulfur conversion concepts were tested in the laboratory under this project, i.e., the solvent-based, high-pressure University of California Sulfur Recovery Process High Pressure (UCSRP-HP) and the catalytic-based, direct oxidation (DO) section of the CrystaSulf-DO process. Each process required a polishing unit to meet the ultra-clean sulfur content goals of <50 ppbv (parts per billion by volume) as may be necessary for fuel cells or chemical production applications. UCSRP-HP was also tested for the removal of trace, non-sulfur contaminants, including ammonia, hydrogen chloride, and heavy metals. A bench-scale unit was commissioned and limited testing was performed with simulated syngas. Aspen-Plus®-based computer simulation models were prepared and the economics of the UCSRP-HP and CrystaSulf-DO processes were evaluated for a nominal 500 MWe, coal-based, IGCC power plant with carbon capture. This report covers the progress on the UCSRP-HP technology development and the CrystaSulf-DO technology.

  16. A framework supporting the development of a Grid portal for analysis based on ROI.

    PubMed

    Ichikawa, K; Date, S; Kaishima, T; Shimojo, S

    2005-01-01

    In our research on brain function analysis, users require two different simultaneous types of processing: interactive processing of a specific part of the data and high-performance batch processing of an entire dataset. The difference between these two types of processing is whether or not the analysis is restricted to data in the region of interest (ROI). In this study, we propose a Grid portal with a mechanism to freely assign computing resources on a Grid environment according to these two types of processing requirements. We constructed a Grid portal which integrates interactive processing and batch processing through the following two mechanisms. First, a job steering mechanism controls job execution based on user-tagged priority among organizations with heterogeneous computing resources; interactive jobs are processed in preference to batch jobs by this mechanism. Second, a priority-based result delivery mechanism administrates a ranking of data significance. The portal ensures the turn-around time of interactive processing through the priority-based job control mechanism, providing users with quality of service (QoS) for interactive processing, and users can access the analysis results of interactive jobs in preference to those of batch jobs. The Grid portal has also achieved high-performance computation for MEG analysis with batch processing on the Grid environment. The priority-based job control mechanism makes it possible to freely assign computing resources according to the users' requirements, and the achieved high-performance computation contributes greatly to the overall progress of brain science, allowing users to flexibly bring large computational power to bear on what they want to analyze.
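    The job steering idea in this record, interactive (ROI) jobs always served before batch jobs, maps onto a plain priority queue. This is a toy illustration with hypothetical job names, not the portal's actual scheduler.

```python
import heapq
from itertools import count

INTERACTIVE, BATCH = 0, 1   # lower value = higher priority
_seq = count()              # tie-breaker preserving FIFO within a class
queue = []

def submit(priority, name):
    # Push (priority, arrival order, name); heapq pops the smallest tuple,
    # so all interactive jobs come out before any batch job.
    heapq.heappush(queue, (priority, next(_seq), name))

submit(BATCH, "meg_full_dataset")
submit(INTERACTIVE, "roi_segment_3")
submit(BATCH, "meg_full_dataset_2")
submit(INTERACTIVE, "roi_segment_7")

order = [heapq.heappop(queue)[2] for _ in range(len(queue))]
```

A real steering mechanism would add preemption and per-organization quotas, but the dequeue order here already shows the QoS property: interactive work never waits behind batch work.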

  17. Standardized Review and Approval Process for High-Cost Medication Use Promotes Value-Based Care in a Large Academic Medical System.

    PubMed

    Durvasula, Raghu; Kelly, Janet; Schleyer, Anneliese; Anawalt, Bradley D; Somani, Shabir; Dellit, Timothy H

    2018-04-01

    As healthcare costs rise and reimbursements decrease, healthcare organization leadership and clinical providers must collaborate to provide high-value healthcare. Medications are a key driver of the increasing cost of healthcare, largely as a result of the proliferation of expensive specialty drugs, including biologic agents. Such medications contribute significantly to the inpatient diagnosis-related group payment system, often with minimal or unproved benefit over less-expensive therapies. To describe a systematic review process to reduce non-evidence-based inpatient use of high-cost medications across a large multihospital academic health system. We created a Pharmacy & Therapeutics subcommittee consisting of clinicians, pharmacists, and an ethics representative. This committee developed a standardized process for a timely review (<48 hours) and approval of high-cost medications based on their clinical effectiveness, safety, and appropriateness. The engagement of clinical experts in the development of the consensus-based guidelines for the use of specific medications facilitated the clinicians' acceptance of the review process. Over a 2-year period, a total of 85 patient-specific requests underwent formal review. All reviews were conducted within 48 hours. This review process has reduced the non-evidence-based use of specialty medications and has resulted in a pharmacy savings of $491,000 in fiscal year 2016, with almost 80% of the savings occurring in the last 2 quarters, because our process has matured. The creation of a collaborative review process to ensure consistent, evidence-based utilization of high-cost medications provides value-based care, while minimizing unnecessary practice variation and reducing the cost of inpatient care.

  18. Validation of a DNA IQ-based extraction method for TECAN robotic liquid handling workstations for processing casework.

    PubMed

    Frégeau, Chantal J; Lett, C Marc; Fourney, Ron M

    2010-10-01

    A semi-automated DNA extraction process for casework samples based on the Promega DNA IQ™ system was optimized and validated on TECAN Genesis 150/8 and Freedom EVO robotic liquid handling stations configured with fixed tips and a TECAN TE-Shake™ unit. The use of an orbital shaker during the extraction process promoted efficiency with respect to DNA capture, magnetic bead/DNA complex washes and DNA elution. Validation studies determined the reliability and limitations of this shaker-based process. Reproducibility with regards to DNA yields for the tested robotic workstations proved to be excellent and not significantly different than that offered by the manual phenol/chloroform extraction. DNA extraction of animal:human blood mixtures contaminated with soil demonstrated that a human profile was detectable even in the presence of abundant animal blood. For exhibits containing small amounts of biological material, concordance studies confirmed that DNA yields for this shaker-based extraction process are equivalent or greater to those observed with phenol/chloroform extraction as well as our original validated automated magnetic bead percolation-based extraction process. Our data further supports the increasing use of robotics for the processing of casework samples. Crown Copyright © 2009. Published by Elsevier Ireland Ltd. All rights reserved.

  19. Real-Time, Sensor-Based Computing in the Laboratory.

    ERIC Educational Resources Information Center

    Badmus, O. O.; And Others

    1996-01-01

    Demonstrates the importance of Real-Time, Sensor-Based (RTSB) computing and how it can be easily and effectively integrated into university student laboratories. Describes the experimental processes, the process instrumentation and process-computer interface, the computer and communications systems, and typical software. Provides much technical…

  20. Understanding Atmospheric Anomalies Associated with Seasonal Pluvial-Drought Processes Using Southwest China as an Example

    NASA Astrophysics Data System (ADS)

    Liu, Z.; LU, G.; He, H.; Wu, Z.; He, J.

    2017-12-01

    Seasonal pluvial-drought transition processes are unique natural phenomena. To explore possible mechanisms, we took Southwest China (SWC) as the study region and comprehensively investigated the temporal evolution of large-scale and regional atmospheric variables with the simple method of Standardized Anomalies (SA). Some key results include: (1) The net vertical integral of water vapour flux (VIWVF) across the four boundaries may be a feasible indicator of pluvial-drought transition processes over SWC, because its SA-based index is almost consistent with process development. (2) The vertical SA-based patterns of regional horizontal divergence (D) and vertical motion (ω) also coincide well with the pluvial-drought transition processes, and the SA-based index of regional D shows relatively high correlation with the identified processes over SWC. (3) With respect to large-scale anomalies of circulation patterns, a well-organized Eurasian Pattern is one important feature during the pluvial-drought transition over SWC. (4) To explore the possibility of simulating drought development using previous pluvial anomalies, large-scale and regional atmospheric SA-based indices were used; as a whole, introducing SA-based indices of regional dynamic and water-vapor variables considerably improves simulations of drought development relative to using large-scale anomalies alone. (5) Finally, pluvial-drought transition processes and associated regional atmospheric anomalies over nine Chinese drought study regions were investigated. With respect to regional D, vertically single or double "upper-positive-lower-negative" and "upper-negative-lower-positive" patterns are the most common vertical SA-based patterns during the pluvial and drought parts of the transition processes, respectively.
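    The standardized anomaly (SA) indices used throughout this record remove the seasonal climatology and scale by its variability. A minimal sketch on a synthetic monthly series (standing in for, e.g., net VIWVF over SWC):

```python
import numpy as np

rng = np.random.default_rng(1)
years, months = 30, 12

# Synthetic monthly variable: a seasonal cycle plus noise, shaped (years, months).
x = 10 * np.sin(2 * np.pi * np.arange(months) / 12) + rng.normal(0, 2, (years, months))

# Monthly climatology over the record.
clim_mean = x.mean(axis=0)
clim_std = x.std(axis=0, ddof=1)

# SA index: anomaly from the monthly mean, in units of monthly std dev.
sa = (x - clim_mean) / clim_std
```

By construction each calendar month of the SA index has zero mean and unit variance, so pluvial (positive) and drought (negative) excursions in different seasons become directly comparable.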

  1. Optimization of insect cell based protein production processes - online monitoring, expression systems, scale up.

    PubMed

    Druzinec, Damir; Salzig, Denise; Brix, Alexander; Kraume, Matthias; Vilcinskas, Andreas; Kollewe, Christian; Czermak, Peter

    2013-01-01

    Due to the increasing use of insect cell based expression systems in research and industrial recombinant protein production, the development of efficient and reproducible production processes remains a challenging task. In this context, the application of online monitoring techniques is intended to ensure high and reproducible product quality already during the early phases of process development. In the following chapter, the most common transient and stable insect cell based expression systems are briefly introduced. Novel applications of insect cell based expression systems for the production of insect derived antimicrobial peptides/proteins (AMPs) are discussed using the example of G. mellonella derived gloverin. Suitable in situ sensor techniques for insect cell culture monitoring in disposable and common bioreactor systems are outlined with respect to optical and capacitive sensor concepts. Since scale up of production processes is one of the most critical steps in process development, a concluding overview is given of scale up aspects for industrial insect cell culture processes.

  2. Development of High Throughput Process for Constructing 454 Titanium and Illumina Libraries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deshpande, Shweta; Hack, Christopher; Tang, Eric

    2010-05-28

    We have developed two processes with the Biomek FX robot to construct 454 Titanium and Illumina libraries in order to meet increasing library demands. All modifications to the library construction steps were made to enable the adaptation of the entire processes to the 96-well plate format. The key modifications include shearing of DNA with the Covaris E210 and enzymatic reaction cleanup and fragment size selection with SPRI beads and magnetic plate holders. The construction of 96 Titanium libraries takes about 8 hours from sheared DNA to ssDNA recovery. The processing of 96 Illumina libraries takes less time than the Titanium library process. Although both processes still require manual transfer of plates from the robot to other workstations such as thermocyclers, these robotic processes represent about a 12- to 24-fold increase in library capacity compared to the manual processes. To enable the sequencing of many libraries in parallel, we have also developed sets of molecular barcodes for both library types. The requirements for the 454 library barcodes include 10 bases, 40-60% GC, no consecutive identical bases, and no less than 3 bases difference between barcodes. We have used 96 of the resulting 270 barcodes to construct libraries and pooled them to test the ability to accurately assign reads to the right samples. When allowing 1 base error in the 10-base barcodes, we could assign 99.6% of the total reads, and 100% of them were uniquely assigned. As for the Illumina barcodes, the requirements include 4 bases, balanced GC, and at least 2 bases difference between barcodes. We have begun to assess the ability to assign reads after pooling different numbers of libraries. We will discuss the progress and the challenges of these scale-up processes.
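    The 454 barcode constraints stated in this record (10 bases, 40-60% GC, no consecutive identical bases, pairwise Hamming distance ≥ 3) are easy to encode; a greedy random generator is one way to satisfy them. This is our illustration of the constraints, not the authors' actual design procedure.

```python
import random

BASES = "ACGT"

def ok_single(bc):
    """Per-barcode constraints: 40-60% GC, no two identical bases in a row."""
    gc = sum(b in "GC" for b in bc) / len(bc)
    no_runs = all(a != b for a, b in zip(bc, bc[1:]))
    return 0.4 <= gc <= 0.6 and no_runs

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def greedy_barcodes(n, length=10, min_dist=3, seed=0):
    """Greedily collect n random barcodes that satisfy all constraints."""
    rng = random.Random(seed)
    chosen = []
    while len(chosen) < n:
        bc = "".join(rng.choice(BASES) for _ in range(length))
        if ok_single(bc) and all(hamming(bc, c) >= min_dist for c in chosen):
            chosen.append(bc)
    return chosen
```

A minimum distance of 3 is what makes single-base-error assignment unambiguous: a read one error away from its true barcode is still at least two errors from every other barcode.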

  3. Enforcement of entailment constraints in distributed service-based business processes.

    PubMed

    Hummer, Waldemar; Gaubatz, Patrick; Strembeck, Mark; Zdun, Uwe; Dustdar, Schahram

    2013-11-01

    A distributed business process is executed in a distributed computing environment. The service-oriented architecture (SOA) paradigm is a popular option for the integration of software services and execution of distributed business processes. Entailment constraints, such as mutual exclusion and binding constraints, are important means to control process execution. Mutually exclusive tasks result from the division of powerful rights and responsibilities to prevent fraud and abuse. In contrast, binding constraints define that a subject who performed one task must also perform the corresponding bound task(s). We aim to provide a model-driven approach for the specification and enforcement of task-based entailment constraints in distributed service-based business processes. Based on a generic metamodel, we define a domain-specific language (DSL) that maps the different modeling-level artifacts to the implementation level. The DSL integrates elements from role-based access control (RBAC) with the tasks that are performed in a business process. Process definitions are annotated using the DSL, and our software platform uses automated model transformations to produce executable WS-BPEL specifications which enforce the entailment constraints. We evaluate the impact of constraint enforcement on runtime performance for five selected service-based processes from the existing literature. Our evaluation demonstrates that the approach correctly enforces task-based entailment constraints at runtime. The performance experiments illustrate that runtime enforcement operates with an overhead that scales well up to the order of several tens of thousands of logged invocations. Using our DSL annotations, the user-defined process definition remains declarative and clean of security enforcement code. Our approach decouples the concerns of (non-technical) domain experts from the technical details of entailment constraint enforcement. The developed framework integrates seamlessly with WS-BPEL and the Web services technology stack. Our prototype implementation shows the feasibility of the approach, and the evaluation points to future work and further performance optimizations.
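    The two constraint kinds defined in this record can be checked against an execution log in a few lines. This toy checker (with invented task names) only illustrates the semantics; it is not the paper's DSL or its WS-BPEL enforcement machinery.

```python
# Mutual exclusion: no single subject may perform both tasks of a pair.
MUTEX = {("approve_order", "issue_payment")}
# Binding: whoever performs the first task must also perform the second.
BINDING = {("prepare_report", "sign_report")}

def violations(log):
    """log: list of (subject, task) pairs in execution order."""
    by_task = {}
    for subject, task in log:
        by_task.setdefault(task, set()).add(subject)
    found = []
    for t1, t2 in MUTEX:
        # Violation if any subject appears in both task's performer sets.
        if by_task.get(t1, set()) & by_task.get(t2, set()):
            found.append(("mutex", t1, t2))
    for t1, t2 in BINDING:
        # Violation if both tasks ran but with different performer sets.
        if t1 in by_task and t2 in by_task and by_task[t1] != by_task[t2]:
            found.append(("binding", t1, t2))
    return found
```

A runtime enforcement layer would intercept each task invocation and reject it *before* it creates a violation; checking a completed log, as here, is the batch-audit view of the same constraints.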

  4. Computational modeling of residual stress formation during the electron beam melting process for Inconel 718

    DOE PAGES

    Prabhakar, P.; Sames, William J.; Dehoff, Ryan R.; ...

    2015-03-28

    A computational modeling approach to simulate residual stress formation during the electron beam melting (EBM) process, one of the additive manufacturing (AM) technologies, for Inconel 718 is presented in this paper. The EBM process has demonstrated a high potential to fabricate components with complex geometries, but the resulting components are influenced by the thermal cycles observed during the manufacturing process. When processing nickel based superalloys, very high temperatures (approx. 1000 °C) are observed in the powder bed, base plate, and build. These high temperatures, when combined with substrate adherence, can result in warping of the base plate and affect the final component by causing defects. It is important to have an understanding of the thermo-mechanical response of the entire system, that is, its mechanical behavior under the thermal loading occurring during the EBM process, prior to manufacturing a component. Therefore, computational models that predict the response of the system during the EBM process will aid in eliminating undesired process conditions, a priori, in order to fabricate the optimum component. Such a comprehensive computational modeling approach is demonstrated to analyze warping of the base plate, stress and plastic strain accumulation within the material, and thermal cycles in the system during different stages of the EBM process.

  5. Nonlinear process in the mode transition in typical strut-based and cavity-strut based scramjet combustors

    NASA Astrophysics Data System (ADS)

    Yan, Li; Liao, Lei; Huang, Wei; Li, Lang-quan

    2018-04-01

    The analysis of nonlinear characteristics and control of the mode transition process is a crucial issue for enhancing the stability and reliability of the dual-mode scramjet engine. In the current study, the mode transition processes in both a strut-based combustor and a cavity-strut based combustor are numerically studied, and the influence of the cavity on the transition process is analyzed in detail. The simulations are conducted by means of the Reynolds-averaged Navier-Stokes (RANS) equations coupled with the renormalization group (RNG) k-ε turbulence model and a single-step chemical reaction mechanism, and this numerical approach is shown to be valid by comparing the predicted results with the available experimental shadowgraphs in the open literature. During the mode transition process, an obvious nonlinear property is observed, namely uneven variations of pressure along the combustor, and the hysteresis phenomenon is more obvious upstream in the flow field. For the cavity-strut configuration, the whole flow field is more inclined to remain in the supersonic state during the transition process and does not readily convert to the ramjet mode; the scram-to-ram transition process is therefore more stable, and the hysteresis effect in the ram-to-scram transition process is reduced.

  6. Anxiety Sensitivity and Cognitive-Based Smoking Processes: Testing the Mediating Role of Emotion Dysregulation among Treatment-Seeking Daily Smokers

    PubMed Central

    Johnson, Kirsten A.; Farris, Samantha G.; Schmidt, Norman B.; Zvolensky, Michael J.

    2012-01-01

    Objective The current study investigated whether emotion dysregulation (ED; difficulties in the self-regulation of affective states) mediated relations between anxiety sensitivity (AS; fear of anxiety and related sensations) and cognitive-based smoking processes. Method Participants (n = 197; 57.5% male; mean age = 38.0) were daily smokers recruited as part of a randomized controlled trial for smoking cessation. Results AS was uniquely associated with all smoking processes. Moreover, ED significantly mediated relations between AS and the smoking processes. Conclusions Findings suggest that ED is an important construct to consider in relations between AS and cognitive-based smoking processes among adult treatment-seeking smokers. PMID:22540436

  7. [GSH fermentation process modeling using entropy-criterion based RBF neural network model].

    PubMed

    Tan, Zuoping; Wang, Shitong; Deng, Zhaohong; Du, Guocheng

    2008-05-01

    The prediction accuracy and generalization of GSH fermentation process modeling are often degraded by noise in the corresponding experimental data. To avoid this problem, we present a novel RBF neural network modeling approach based on an entropy criterion. Unlike traditional MSE-criterion-based parameter learning, it considers the whole distribution structure of the training data set during parameter learning, and thus effectively avoids weak generalization and over-learning. The proposed approach is then applied to GSH fermentation process modeling. Our results demonstrate that the proposed method has better prediction accuracy, generalization, and robustness, and thus offers potential merit for application to GSH fermentation process modeling.

  8. An Extension of SIC Predictions to the Wiener Coactive Model

    PubMed Central

    Houpt, Joseph W.; Townsend, James T.

    2011-01-01

    The survivor interaction contrast (SIC) is a powerful measure for distinguishing among candidate models of human information processing. One class of models to which SIC analysis can apply is the coactive, or channel summation, class of models of human information processing. In general, parametric forms of coactive models assume that responses are made based on the first passage time across a fixed threshold of a sum of stochastic processes. Previous work has shown that the SIC for a coactive model based on the sum of Poisson processes has a distinctive down-up-down form, with an early negative region that is smaller than the later positive region. In this note, we demonstrate that a coactive process based on the sum of two Wiener processes has the same SIC form. PMID:21822333
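
    The coactive stopping rule described above can be illustrated with a small Monte Carlo sketch (illustrative only, not the authors' code; the drift values, threshold, and step size below are assumed): sum two Wiener channels, record the first passage time across a fixed threshold under the four salience conditions, and form the SIC from the empirical survivor functions.

```python
import random

def first_passage(mu1, mu2, a=1.0, dt=0.002, sigma=1.0, max_t=20.0):
    """First time the sum of two Wiener channels (drifts mu1, mu2) crosses a."""
    x, t = 0.0, 0.0
    while x < a and t < max_t:
        # the sum of two independent Wiener increments is itself Gaussian
        x += (mu1 + mu2) * dt + sigma * (2 * dt) ** 0.5 * random.gauss(0, 1)
        t += dt
    return t

def survivor(times, t):
    """Empirical survivor function S(t) = P(RT > t)."""
    return sum(1 for rt in times if rt > t) / len(times)

random.seed(1)
lo, hi = 0.5, 1.5                      # low / high salience as drift rates
conds = {'LL': (lo, lo), 'LH': (lo, hi), 'HL': (hi, lo), 'HH': (hi, hi)}
samples = {c: [first_passage(m1, m2) for _ in range(500)]
           for c, (m1, m2) in conds.items()}

def sic(t):
    """Survivor interaction contrast at time t."""
    return (survivor(samples['LL'], t) - survivor(samples['LH'], t)
            - survivor(samples['HL'], t) + survivor(samples['HH'], t))

# per the abstract, a coactive SIC should dip negative early and then
# show a larger positive region
print([round(sic(t), 3) for t in (0.2, 0.4, 0.8, 1.2, 1.6)])
```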

  9. An Extension of SIC Predictions to the Wiener Coactive Model.

    PubMed

    Houpt, Joseph W; Townsend, James T

    2011-06-01

    The survivor interaction contrast (SIC) is a powerful measure for distinguishing among candidate models of human information processing. One class of models to which SIC analysis can apply is the coactive, or channel summation, class of models of human information processing. In general, parametric forms of coactive models assume that responses are made based on the first passage time across a fixed threshold of a sum of stochastic processes. Previous work has shown that the SIC for a coactive model based on the sum of Poisson processes has a distinctive down-up-down form, with an early negative region that is smaller than the later positive region. In this note, we demonstrate that a coactive process based on the sum of two Wiener processes has the same SIC form.

  10. A Mixed Kijima Model Using the Weibull-Based Generalized Renewal Processes

    PubMed Central

    2015-01-01

    Generalized Renewal Processes are useful for approaching the rejuvenation of dynamical systems resulting from planned or unplanned interventions. We present new perspectives for Generalized Renewal Processes in general and for the Weibull-based Generalized Renewal Processes in particular. Departing from the existing literature, we present a mixed Generalized Renewal Processes approach involving the Kijima Type I and II models, allowing one to infer the impact of distinct interventions on the performance of the system under study. The first and second theoretical moments of this model are introduced, as well as its maximum likelihood estimation and random sampling approaches. In order to illustrate the usefulness of the proposed Weibull-based Generalized Renewal Processes model, some real data sets involving improving, stable, and deteriorating systems are used. PMID:26197222
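
    The Kijima Type I and II virtual-age updates referred to above, and sampling from a Weibull-based GRP, can be sketched as follows (a minimal illustration with assumed parameter values, not the authors' estimation code). The sampler inverts the conditional Weibull survivor function given the current virtual age.

```python
import math, random

def kijima_virtual_age(inter_times, q, model="I"):
    """Virtual ages after each repair.
    Type I:  v_i = v_{i-1} + q * x_i    (repair undoes only the latest damage)
    Type II: v_i = q * (v_{i-1} + x_i)  (repair rejuvenates the whole history)
    """
    v, ages = 0.0, []
    for x in inter_times:
        v = v + q * x if model == "I" else q * (v + x)
        ages.append(v)
    return ages

def sample_weibull_grp(beta, eta, q, n, model="I", seed=0):
    """Draw n successive inter-failure times from a Weibull-based GRP by
    inverting S(x | v) = exp[(v/eta)^beta - ((v+x)/eta)^beta]."""
    rng = random.Random(seed)
    v, times = 0.0, []
    for _ in range(n):
        u = rng.random()
        x = eta * ((v / eta) ** beta - math.log(u)) ** (1 / beta) - v
        times.append(x)
        v = v + q * x if model == "I" else q * (v + x)
    return times

# deteriorating system (beta > 1) with imperfect repair (q = 0.4)
times = sample_weibull_grp(beta=2.0, eta=100.0, q=0.4, n=8)
print([round(x, 1) for x in times])
```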

  11. A Metadata Based Approach for Analyzing UAV Datasets for Photogrammetric Applications

    NASA Astrophysics Data System (ADS)

    Dhanda, A.; Remondino, F.; Santana Quintero, M.

    2018-05-01

    This paper proposes a methodology for pre-processing and analysing Unmanned Aerial Vehicle (UAV) datasets before photogrammetric processing. In cases where images are gathered without a detailed flight plan and at regular acquisition intervals, the datasets can be quite large and time consuming to process. This paper proposes a method to calculate the image overlap and filter out images to reduce large block sizes and speed up photogrammetric processing. The Python-based algorithm that implements this methodology leverages the metadata in each image to determine the end and side overlap of grid-based UAV flights. Utilizing user input, the algorithm filters out images that are unneeded for photogrammetric processing. The result is an algorithm that can speed up photogrammetric processing and provide valuable information to the user about the flight path.
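
    The kind of metadata-driven overlap computation the abstract describes can be sketched roughly as follows (an illustrative nadir pinhole-camera approximation, not the paper's algorithm; the camera constants and GPS coordinates are made up).

```python
import math

def ground_footprint(alt_m, focal_mm, sensor_mm):
    """Along-track ground coverage of a nadir image (pinhole model)."""
    return alt_m * sensor_mm / focal_mm

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = p2 - p1, math.radians(lon2 - lon1)
    h = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(h))

def end_overlap(img_a, img_b, focal_mm=8.8, sensor_h_mm=13.2):
    """Fractional end overlap between two successive images in a flight line."""
    d = haversine_m(img_a["lat"], img_a["lon"], img_b["lat"], img_b["lon"])
    fp = ground_footprint(img_a["alt"], focal_mm, sensor_h_mm)
    return max(0.0, 1.0 - d / fp)

# two EXIF-style records roughly 12 m apart at 60 m altitude (hypothetical)
a = {"lat": 45.00000, "lon": 7.00000, "alt": 60.0}
b = {"lat": 45.00011, "lon": 7.00000, "alt": 60.0}
print(round(end_overlap(a, b), 2))
```

    A filtering pass would then drop every image whose overlap with the last kept image stays above a user-supplied threshold.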

  12. [Construction of research system for processing mechanism of traditional Chinese medicine based on chemical composition transformation combined with intestinal absorption barrier].

    PubMed

    Sun, E; Xu, Feng-Juan; Zhang, Zhen-Hai; Wei, Ying-Jie; Tan, Xiao-Bin; Cheng, Xu-Dong; Jia, Xiao-Bin

    2014-02-01

    Based on years of practice studying the Epimedium processing mechanism, and integrating multidisciplinary theory and technology, this paper initially constructs a research system for the processing mechanism of traditional Chinese medicine based on chemical composition transformation combined with the intestinal absorption barrier, forming an innovative research mode of "chemical composition changes-biological transformation-metabolism in vitro and in vivo-intestinal absorption-pharmacokinetics combined with pharmacodynamics-pharmacodynamic mechanism". Combined with specific examples of the processing mechanisms of Epimedium and other Chinese herbal medicines, this paper also discusses the academic thoughts, research methods and key technologies of this research system, which will be conducive to systematically revealing the modern scientific connotation of traditional Chinese medicine processing and enriching the theory of Chinese herbal medicine processing.

  13. How Many Batches Are Needed for Process Validation under the New FDA Guidance?

    PubMed

    Yang, Harry

    2013-01-01

    The newly updated FDA Guidance for Industry on Process Validation: General Principles and Practices ushers in a life cycle approach to process validation. While the guidance no longer considers the use of traditional three-batch validation appropriate, it does not prescribe the number of validation batches for a prospective validation protocol, nor does it provide specific methods to determine it. This potentially could leave manufacturers in a quandary. In this paper, I develop a Bayesian method to address the issue. By combining process knowledge gained from Stage 1 Process Design (PD) with expected outcomes of Stage 2 Process Performance Qualification (PPQ), the number of validation batches for PPQ is determined to provide a high level of assurance that the process will consistently produce future batches meeting quality standards. Several examples based on simulated data are presented to illustrate the use of the Bayesian method in helping manufacturers make risk-based decisions for Stage 2 PPQ, and they highlight the advantages of the method over traditional Frequentist approaches. The discussions in the paper lend support for a life cycle and risk-based approach to process validation recommended in the new FDA guidance.
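
    One shape such a Bayesian determination can take, as a rough sketch (a beta-binomial toy model with assumed numbers, not Yang's actual formulation): encode Stage 1 knowledge as a Beta prior on the batch-success probability, then find the smallest number of consecutively passing PPQ batches that pushes the posterior assurance above a target.

```python
import random

def posterior_assurance(a, b, n_pass, p0=0.90, draws=50000, seed=0):
    """Monte Carlo estimate of P(batch success probability >= p0) given a
    Beta(a, b) prior from Stage 1 and n_pass consecutive passing PPQ batches."""
    rng = random.Random(seed)
    hits = sum(rng.betavariate(a + n_pass, b) >= p0 for _ in range(draws))
    return hits / draws

# Stage 1 (PD) knowledge: 47 of 50 development batches met specifications,
# encoded here (loosely) as a Beta(47, 3) prior -- an assumed example
a, b, target = 47, 3, 0.90

n = 0
while posterior_assurance(a, b, n) < target:
    n += 1
print("PPQ batches needed for", target, "assurance:", n)
```

    A weaker prior (less Stage 1 knowledge) drives the required number of PPQ batches up, which is the risk-based trade-off the paper argues for.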

  14. Research on the processing technology of elongated holes based on rotary ultrasonic drilling

    NASA Astrophysics Data System (ADS)

    Tong, Yi; Chen, Jianhua; Sun, Lipeng; Yu, Xin; Wang, Xin

    2014-08-01

    Optical glass is hard, brittle, and difficult to process. Based on the method of rotary ultrasonic drilling, a single-factor study of drilling elongated holes in optical glass was carried out. The processing equipment was a DAMA ultrasonic machine, and the machining tools were electroplated diamond tools. Through detection and analysis of the machining quality and surface roughness, the process parameters of rotary ultrasonic drilling (spindle speed, amplitude, feed rate) were investigated, and the influence of the processing parameters on surface roughness was obtained, which provides a reference and basis for actual processing.

  15. Fuzzy control of burnout of multilayer ceramic actuators

    NASA Astrophysics Data System (ADS)

    Ling, Alice V.; Voss, David; Christodoulou, Leo

    1996-08-01

    To improve the yield and repeatability of the burnout process of multilayer ceramic actuators (MCAs), an intelligent processing of materials (IPM-based) control system has been developed for the manufacture of MCAs. IPM involves the active (ultimately adaptive) control of a material process using empirical or analytical models and in situ sensing of critical process states (part features and process parameters) to modify the processing conditions in real time to achieve predefined product goals. Thus, the three enabling technologies for the IPM burnout control system are process modeling, in situ sensing and intelligent control. This paper presents the design of an IPM-based control strategy for the burnout process of MCAs.

  16. A dual-process perspective on fluency-based aesthetics: the pleasure-interest model of aesthetic liking.

    PubMed

    Graf, Laura K M; Landwehr, Jan R

    2015-11-01

    In this article, we develop an account of how aesthetic preferences can be formed as a result of two hierarchical, fluency-based processes. Our model suggests that processing performed immediately upon encountering an aesthetic object is stimulus driven, and aesthetic preferences that accrue from this processing reflect aesthetic evaluations of pleasure or displeasure. When sufficient processing motivation is provided by a perceiver's need for cognitive enrichment and/or the stimulus' processing affordance, elaborate perceiver-driven processing can emerge, which gives rise to fluency-based aesthetic evaluations of interest, boredom, or confusion. Because the positive outcomes in our model are pleasure and interest, we call it the Pleasure-Interest Model of Aesthetic Liking (PIA Model). Theoretically, this model integrates a dual-process perspective and ideas from lay epistemology into processing fluency theory, and it provides a parsimonious framework to embed and unite a wealth of aesthetic phenomena, including contradictory preference patterns for easy versus difficult-to-process aesthetic stimuli. © 2015 by the Society for Personality and Social Psychology, Inc.

  17. The peculiarities of process-based approach realization in transport sector company management

    NASA Astrophysics Data System (ADS)

    Khripko, Elena; Sidorov, Gennadiy

    2017-10-01

    In the present article we study the phenomenon of multiple meanings in understanding the process-based management method in the construction of transport infrastructure facilities. The idea of multiple meanings refers to distortions which appear during reception of the management process paradigm in the organizational environment of the transport sector. The cause of distortion in process management is organizational resistance. Distortions of management processes are discovered at the level of diffusion among spheres of responsibility, and in collisions among the functional, project and process forms of interaction between the owner of a process and its participants. The level of distortion is affected by the attitude towards the result of work: in management practice, the process understanding of the result is replaced by the functional one. This transfiguration is a consequence of regressive defensive mechanisms of the organizational environment. Drawing on the experience of establishing process management in a company constructing transport infrastructure facilities, the issues of diagnosing various forms of organizational resistance and ways of reducing their destructive influence on management processes are reviewed.

  18. An Overview of Ni Base Additive Fabrication Technologies for Aerospace Applications (Preprint)

    DTIC Science & Technology

    2011-03-01

    fusion welding processes that have the ability to add filler material can be used as additive manufacturing processes. The majority of the work in the...Laser Additive Manufacturing (LAM) The LAM process uses a conventional laser welding heat source (CO2 or solid state laser) combined with a...wrought properties. The LAM process typically has a lower deposition rate (0.5-10 lbs/hr) compared to EB, PTA or TIG based processes, although as

  19. Platform for Post-Processing Waveform-Based NDE

    NASA Technical Reports Server (NTRS)

    Roth, Don J.

    2010-01-01

    Signal- and image-processing methods are commonly needed to extract information from the waves, improve resolution of, and highlight defects in an image. Since some similarity exists for all waveform-based nondestructive evaluation (NDE) methods, it would seem that a common software platform containing multiple signal- and image-processing techniques to process the waveforms and images makes sense where multiple techniques, scientists, engineers, and organizations are involved. NDE Wave & Image Processor Version 2.0 software provides a single, integrated signal- and image-processing and analysis environment for total NDE data processing and analysis. It brings some of the most useful algorithms developed for NDE over the past 20 years into a commercial-grade product. The software can import signal/spectroscopic data, image data, and image series data. This software offers the user hundreds of basic and advanced signal- and image-processing capabilities including esoteric 1D and 2D wavelet-based de-noising, de-trending, and filtering. Batch processing is included for signal- and image-processing capability so that an optimized sequence of processing operations can be applied to entire folders of signals, spectra, and images. Additionally, an extensive interactive model-based curve-fitting facility has been included to allow fitting of spectroscopy data such as from Raman spectroscopy. An extensive joint-time frequency module is included for analysis of non-stationary or transient data such as that from acoustic emission, vibration, or earthquake data.

  20. Bitumen and heavy oil upgrading in Canada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chrones, J.; Germain, R.R.

    1989-01-01

    A review is presented of the heavy oil upgrading industry in Canada. Up to now it has been based on the processing of bitumen extracted from oil sands mining operations at two sites, to produce a residue-free, low sulphur, synthetic crude. Carbon rejection has been the prime process technology, with delayed coking being used by Suncor and FLUID COKING at Syncrude. Alternative processes for recovering greater amounts of synthetic crude are examined. These include a variety of hydrogen addition processes and combinations which produce pipelineable materials requiring further processing in downstream refineries with expanded capabilities. The Newgrade Energy Inc. upgrader, now under construction in Regina, will use fixed-bed, catalytic, atmospheric-residue, hydrogen processing. Two additional projects, also based on hydrogenation, will use ebullated bed catalyst systems; the expansion of Syncrude, now underway, is using the LC Fining Process whereas the announced Husky Bi-Provincial upgrader is based on H-Oil.

  1. Bitumen and heavy oil upgrading in Canada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chrones, J.

    1988-06-01

    A review is presented of the heavy oil upgrading industry in Canada. Up to now it has been based on the processing of bitumen extracted from oil sands mining operations at two sites, to produce a residue-free, low sulfur, synthetic crude. Carbon rejection has been the prime process technology, with delayed coking being used by Suncor and FLUID COKING at Syncrude. Alternative processes for recovering greater amounts of synthetic crude are examined. These include a variety of hydrogen addition processes and combinations which produce pipelineable materials requiring further processing in downstream refineries with expanded capabilities. The Newgrade Energy Inc. upgrader, now under construction in Regina, will use fixed-bed, catalytic, atmospheric-residue, hydrogen processing. Two additional projects, also based on hydrogenation, will use ebullated bed catalyst systems: the expansion of Syncrude, now underway, is using the LC Fining Process whereas the announced Husky Bi-Provincial upgrader is based on H-Oil.

  2. One Step at a Time: SBM as an Incremental Process.

    ERIC Educational Resources Information Center

    Conrad, Mark

    1995-01-01

    Discusses incremental SBM budgeting and answers questions regarding resource equity, bookkeeping requirements, accountability, decision-making processes, and purchasing. Approaching site-based management as an incremental process recognizes that every school system engages in some level of site-based decisions. Implementation can be gradual and…

  3. Choice and Desegregation.

    ERIC Educational Resources Information Center

    Bennett, David A.

    This document comprises a report on the architectural elements of choice in the desegregation process, a review of the choice process based on Minnesota's experience, and a statement of implications for state policymakers. The following organizational principles of the choice process are discussed: (1) enrollment based on a "first come, first…

  4. 76 FR 22648 - Resolution Plans and Credit Exposure Reports Required

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-22

    ..., including associated services, functions and support that, in the view of the Covered Company or as jointly...-based Covered Company's overall contingency planning process, and information regarding the.... operations be linked to the contingency planning process of the foreign-based Covered Company? Process 1. Are...

  5. Radiology information system: a workflow-based approach.

    PubMed

    Zhang, Jinyan; Lu, Xudong; Nie, Hongchao; Huang, Zhengxing; van der Aalst, W M P

    2009-09-01

    Introducing workflow management technology in healthcare seems to be promising in dealing with the problem that current healthcare information systems cannot provide sufficient support for process management, although several challenges still exist. The purpose of this paper is to study the method of developing a workflow-based information system in a radiology department as a use case. First, a workflow model of the typical radiology process was established. Second, based on the model, the system could be designed and implemented as a group of loosely coupled components. Each component corresponded to one task in the process and could be assembled by the workflow management system. The legacy systems could be taken as special components, which also corresponded to tasks and were integrated by converting non-workflow-aware interfaces to standard ones. Finally, a workflow dashboard was designed and implemented to provide an integral view of radiology processes. The workflow-based Radiology Information System was deployed in the radiology department of Zhejiang Chinese Medicine Hospital in China. The results showed that it could be adjusted flexibly in response to the needs of changing processes, and enhanced process management in the department. It can also provide a more workflow-aware integration method, compared with other methods such as IHE-based ones. The workflow-based approach is a new method of developing radiology information systems with more flexibility, more process management functionality, and more workflow-aware integration. The work of this paper is an initial endeavor toward introducing workflow management technology in healthcare.

  6. Assessing the structure of non-routine decision processes in Airline Operations Control.

    PubMed

    Richters, Floor; Schraagen, Jan Maarten; Heerkens, Hans

    2016-03-01

    Unfamiliar severe disruptions challenge Airline Operations Control professionals most, as their expertise is stretched to its limits. This study has elicited the structure of Airline Operations Control professionals' decision process during unfamiliar disruptions by mapping three macrocognitive activities on the decision ladder: sensemaking, option evaluation and action planning. The relationship between this structure and decision quality was measured. A simulated task was staged, based on which think-aloud protocols were obtained. Results show that the general decision process structure resembles the structure of experts working under routine conditions, in terms of the general structure of the macrocognitive activities, and the rule-based approach used to identify options and actions. Surprisingly, high quality of decision outcomes was found to relate to the use of rule-based strategies. This implies that successful professionals are capable of dealing with unfamiliar problems by reframing them into familiar ones, rather than to engage in knowledge-based processing. Practitioner Summary: We examined the macrocognitive structure of Airline Operations Control professionals' decision process during a simulated unfamiliar disruption in relation to decision quality. Results suggest that successful professionals are capable of dealing with unfamiliar problems by reframing them into familiar ones, rather than to engage in knowledge-based processing.

  7. Breaking Lander-Waterman’s Coverage Bound

    PubMed Central

    Nashta-ali, Damoun; Motahari, Seyed Abolfazl; Hosseinkhalaj, Babak

    2016-01-01

    Lander-Waterman’s coverage bound establishes the total number of reads required to cover the whole genome of size G bases. In fact, their bound is a direct consequence of the well-known solution to the coupon collector’s problem which proves that for such genome, the total number of bases to be sequenced should be O(G ln G). Although the result leads to a tight bound, it is based on a tacit assumption that the set of reads are first collected through a sequencing process and then are processed through a computation process, i.e., there are two different machines: one for sequencing and one for processing. In this paper, we present a significant improvement compared to Lander-Waterman’s result and prove that by combining the sequencing and computing processes, one can re-sequence the whole genome with as low as O(G) sequenced bases in total. Our approach also dramatically reduces the required computational power for the combined process. Simulation results are performed on real genomes with different sequencing error rates. The results support our theory predicting the log G improvement on coverage bound and corresponding reduction in the total number of bases required to be sequenced. PMID:27806058
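
    The O(G ln G) figure quoted above comes from the coupon collector's problem; collapsing reads to single-base draws for illustration (a deliberate simplification, not the paper's model), the classical expectation can be checked numerically against the G ln G asymptotic.

```python
import math

def coupon_collector_expected(G):
    """Expected draws to collect all G coupons: G * H_G (H_G = harmonic number)."""
    return G * sum(1.0 / k for k in range(1, G + 1))

# classical bound vs. the G ln G asymptotic (Euler-Mascheroni gamma ~ 0.5772)
for G in (10**3, 10**4, 10**5):
    exact = coupon_collector_expected(G)
    asymptotic = G * (math.log(G) + 0.5772)
    print(G, round(exact), round(asymptotic))
```

    The paper's contribution is that interleaving computation with sequencing removes the ln G factor, bringing the total down to O(G) sequenced bases.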

  8. Optimization evaluation of cutting technology based on mechanical parts

    NASA Astrophysics Data System (ADS)

    Wang, Yu

    2018-04-01

    The relationship between the mechanical manufacturing process and carbon emissions is studied on the basis of the machining process flow. A carbon emission calculation formula suitable for the mechanical manufacturing process is derived. Based on this, a green evaluation method for the cold machining process of mechanical parts is proposed. The proposed evaluation method is verified, and its data analyzed, through an example application. The results show a strong relationship between mechanical manufacturing process data and carbon emissions.
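
    The abstract does not give the derived formula; as a hedged illustration only, the energy-related term of such a calculation might look like the following (stage names, power ratings, times, and the grid emission factor are all assumed, and real formulations also add terms for material, coolant, and tooling).

```python
# Hypothetical stage data: (power_kW, time_h) per machining stage
stages = {"rough_turning": (5.5, 0.20),
          "finish_turning": (3.2, 0.15),
          "drilling": (2.4, 0.10)}

GRID_FACTOR = 0.58   # kgCO2 per kWh (assumed grid emission factor)

def process_emissions(stages, factor=GRID_FACTOR):
    """Energy-related CO2 of a cutting process: sum of P_i * t_i * EF."""
    return sum(p * t for p, t in stages.values()) * factor

print(round(process_emissions(stages), 3))
```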

  9. Reprocessing system with nuclide separation based on chromatography in hydrochloric acid solution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Suzuki, Tatsuya; Tachibana, Yu; Koyama, Shi-ichi

    2013-07-01

    We have proposed a reprocessing system with nuclide separation processes based on the chromatographic technique in a hydrochloric acid solution system. Our proposed system consists of the dissolution process, the reprocessing process, the minor actinide separation process, and nuclide separation processes. In the reprocessing and separation processes, a pyridine resin is used as the main separation medium. It was confirmed that dissolution in the hydrochloric acid solution is easily achieved by plasma voloxidation and by the addition of hydrogen peroxide to the hydrochloric acid solution.

  10. Application of process mining to assess the data quality of routinely collected time-based performance data sourced from electronic health records by validating process conformance.

    PubMed

    Perimal-Lewis, Lua; Teubner, David; Hakendorf, Paul; Horwood, Chris

    2016-12-01

    Effective and accurate use of routinely collected health data to produce Key Performance Indicator reporting is dependent on the underlying data quality. In this research, Process Mining methodology and tools were leveraged to assess the data quality of time-based Emergency Department data sourced from electronic health records. This research was done working closely with the domain experts to validate the process models. The hospital patient journey model was used to assess flow abnormalities which resulted from incorrect timestamp data used in time-based performance metrics. The research demonstrated process mining as a feasible methodology to assess data quality of time-based hospital performance metrics. The insight gained from this research enabled appropriate corrective actions to be put in place to address the data quality issues. © The Author(s) 2015.
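
    The kind of timestamp conformance check the study performs can be sketched as follows (a simplified, assumed patient-journey ordering, not the authors' process model): validate each ED record against the expected event order and flag the flow abnormalities that would corrupt time-based KPIs.

```python
from datetime import datetime

# Expected order of the ED patient journey (simplified, assumed)
ORDER = ["arrival", "triage", "seen_by_doctor", "departure"]

def conformance_violations(record):
    """Return the pairs of adjacent events whose timestamps contradict ORDER."""
    bad = []
    for earlier, later in zip(ORDER, ORDER[1:]):
        if record[earlier] > record[later]:
            bad.append((earlier, later))
    return bad

log = [
    {"arrival": datetime(2015, 3, 1, 9, 0), "triage": datetime(2015, 3, 1, 9, 10),
     "seen_by_doctor": datetime(2015, 3, 1, 9, 40), "departure": datetime(2015, 3, 1, 11, 0)},
    # data-quality problem: triage recorded after the doctor assessment
    {"arrival": datetime(2015, 3, 1, 9, 5), "triage": datetime(2015, 3, 1, 10, 0),
     "seen_by_doctor": datetime(2015, 3, 1, 9, 50), "departure": datetime(2015, 3, 1, 10, 30)},
]

flagged = [(i, conformance_violations(r)) for i, r in enumerate(log)
           if conformance_violations(r)]
print(flagged)
```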

  11. Acute stress affects prospective memory functions via associative memory processes.

    PubMed

    Szőllősi, Ágnes; Pajkossy, Péter; Demeter, Gyula; Kéri, Szabolcs; Racsmány, Mihály

    2018-01-01

    Recent findings suggest that acute stress can improve the execution of delayed intentions (prospective memory, PM). However, it is unclear whether this improvement can be explained by altered executive control processes or by altered associative memory functioning. To investigate this issue, we used physical-psychosocial stressors to induce acute stress in laboratory settings. Then participants completed event- and time-based PM tasks requiring the different contribution of control processes and a control task (letter fluency) frequently used to measure executive functions. According to our results, acute stress had no impact on ongoing task performance, time-based PM, and verbal fluency, whereas it enhanced event-based PM as measured by response speed for the prospective cues. Our findings indicate that, here, acute stress did not affect executive control processes. We suggest that stress affected event-based PM via associative memory processes. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Measurement-based reliability/performability models

    NASA Technical Reports Server (NTRS)

    Hsueh, Mei-Chen

    1987-01-01

    Measurement-based models derived from real error data collected on a multiprocessor system are described, along with model development from the raw error data to the estimation of cumulative reward. A workload/reliability model is developed based on low-level error and resource usage data collected on an IBM 3081 system during its normal operation, in order to evaluate the resource usage/error/recovery process in a large mainframe system. Thus, both normal and erroneous behavior of the system are modeled. The results provide an understanding of the different types of errors and recovery processes. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A sensitivity analysis is performed to investigate the significance of using a semi-Markov process, as opposed to a Markov process, to model the measured system.

  13. Validation of column-based chromatography processes for the purification of proteins. Technical report No. 14.

    PubMed

    2008-01-01

    PDA Technical Report No. 14 has been written to provide current best practices, such as application of risk-based decision making, based in sound science to provide a foundation for the validation of column-based chromatography processes and to expand upon information provided in Technical Report No. 42, Process Validation of Protein Manufacturing. The intent of this technical report is to provide an integrated validation life-cycle approach that begins with the use of process development data for the definition of operational parameters as a basis for validation, confirmation, and/or minor adjustment to these parameters at manufacturing scale during production of conformance batches and maintenance of the validated state throughout the product's life cycle.

  14. Diagnostic and prognostic histopathology system using morphometric indices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parvin, Bahram; Chang, Hang; Han, Ju

    Determining at least one of a prognosis or a therapy for a patient based on a stained tissue section of the patient. An image of a stained tissue section of a patient is processed by a processing device. A set of features values for a set of cell-based features is extracted from the processed image, and the processed image is associated with a particular cluster of a plurality of clusters based on the set of feature values, where the plurality of clusters is defined with respect to a feature space corresponding to the set of features.

  15. The effect of individually-induced processes on image-based overlay and diffraction-based overlay

    NASA Astrophysics Data System (ADS)

    Oh, SeungHwa; Lee, Jeongjin; Lee, Seungyoon; Hwang, Chan; Choi, Gilheyun; Kang, Ho-Kyu; Jung, EunSeung

    2014-04-01

    In this paper, set of wafers with separated processes was prepared and overlay measurement result was compared in two methods; IBO and DBO. Based on the experimental result, theoretical approach of relationship between overlay mark deformation and overlay variation is presented. Moreover, overlay reading simulation was used in verification and prediction of overlay variation due to deformation of overlay mark caused by induced processes. Through this study, understanding of individual process effects on overlay measurement error is given. Additionally, guideline of selecting proper overlay measurement scheme for specific layer is presented.

  16. Conflict monitoring in dual process theories of thinking.

    PubMed

    De Neys, Wim; Glumicic, Tamara

    2008-03-01

    Popular dual process theories have characterized human thinking as an interplay between an intuitive-heuristic and demanding-analytic reasoning process. Although monitoring the output of the two systems for conflict is crucial to avoid decision making errors, there are widely different views on the efficiency of the process. Kahneman [Kahneman, D. (2002). Maps of bounded rationality: A perspective on intuitive judgement and choice. Nobel Prize Lecture. Retrieved January 11, 2006, from: http://nobelprize.org/nobel_prizes/economics/laureates/2002/kahnemann-lecture.pdf] and Evans [Evans, J. St. B. T. (1984). Heuristic and analytic processing in reasoning. British Journal of Psychology, 75, 451-468], for example, claim that the monitoring of the heuristic system is typically quite lax, whereas others such as Sloman [Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3-22] and Epstein [Epstein, S. (1994). Integration of the cognitive and psychodynamic unconscious. American Psychologist, 49, 709-724] claim it is flawless and people typically experience a struggle between what they "know" and "feel" in case of a conflict. The present study contrasted these views. Participants solved classic base rate neglect problems while thinking aloud. In these problems a stereotypical description cues a response that conflicts with the response based on the analytic base rate information. Verbal protocols showed no direct evidence for an explicitly experienced conflict. As Kahneman and Evans predicted, participants hardly ever mentioned the base rates and seemed to base their judgment exclusively on heuristic reasoning. However, more implicit measures of conflict detection, such as participants' retrieval of the base rate information in an unannounced recall test, decision making latencies, and the tendency to review the base rates, indicated that the base rates had been thoroughly processed.
On control problems where base rates and description did not conflict this was not the case. Results suggest that whereas the popular characterization of conflict detection as an actively experienced struggle can be questioned there is nevertheless evidence for Sloman's and Epstein's basic claim about the flawless operation of the monitoring. Whenever the base rates and description disagree people will detect this conflict and consequently redirect attention towards a deeper processing of the base rates. Implications for the dual process framework and the rationality debate are discussed.

  17. A Cultured Learning Environment: Implementing a Problem- and Service-Based Microbiology Capstone Course to Assess Process- and Skill-Based Learning Objectives

    ERIC Educational Resources Information Center

    Watson, Rachel M.; Willford, John D.; Pfeifer, Mariel A.

    2018-01-01

    In this study, a problem-based capstone course was designed to assess the University of Wyoming Microbiology Program's skill-based and process-based student learning objectives. Students partnered with a local farm, a community garden, and a free downtown clinic in order to conceptualize, propose, perform, and present studies addressing problems…

  18. Lagging behind Writing Pedagogical Developments: The Impact of Implementing Process-Based Approach on Learners' Writing in a Vietnamese Secondary Education Context

    ERIC Educational Resources Information Center

    Ngo, Chau M.; Trinh, Lap Q.

    2011-01-01

    The field of English language education has seen developments in writing pedagogy, moving from product-based to process-based and then to genre-based approaches. In Vietnam, teaching secondary school students how to write in English is still lagging behind these growing developments. Product-based approach is commonly seen in English writing…

  19. Reference Model for Project Support Environments Version 1.0

    DTIC Science & Technology

    1993-02-28

    relationship with the framework's Process Support services and with the Lifecycle Process Engineering services. Examples: ORCA (Object-based Requirements Capture and Analysis). ... Design services. Examples: ORCA (Object-based Requirements Capture and Analysis); RETRAC (REquirements TRACeability). 4.3 Life-Cycle Process ... "traditional" computer tools. Operations: Examples of audio and video processing operations include: create, modify, and delete sound and video data

  20. Modeling of yield and environmental impact categories in tea processing units based on artificial neural networks.

    PubMed

    Khanali, Majid; Mobli, Hossein; Hosseinzadeh-Bandbafha, Homa

    2017-12-01

    In this study, an artificial neural network (ANN) model was developed for predicting the yield and life cycle environmental impacts based on energy inputs required in processing of black tea, green tea, and oolong tea in Guilan province of Iran. A life cycle assessment (LCA) approach was used to investigate the environmental impact categories of processed tea based on the cradle-to-gate approach, i.e., from production of input materials using raw materials to the gate of tea processing units, i.e., packaged tea. Thus, all the tea processing operations such as withering, rolling, fermentation, drying, and packaging were considered in the analysis. The initial data were obtained from tea processing units, while the required data about the background system were extracted from the EcoInvent 2.2 database. LCA results indicated that diesel fuel and corrugated paper box used in drying and packaging operations, respectively, were the main hotspots. The black tea processing unit caused the highest pollution among the three processing units. Three feed-forward back-propagation ANN models based on the Levenberg-Marquardt training algorithm, with two hidden layers accompanied by sigmoid activation functions and a linear transfer function in the output layer, were applied for the three types of processed tea. The neural networks were developed based on energy equivalents of eight different input parameters (energy equivalents of fresh tea leaves, human labor, diesel fuel, electricity, adhesive, carton, corrugated paper box, and transportation) and 11 output parameters (yield, global warming, abiotic depletion, acidification, eutrophication, ozone layer depletion, human toxicity, freshwater aquatic ecotoxicity, marine aquatic ecotoxicity, terrestrial ecotoxicity, and photochemical oxidation). The results showed that the developed ANN models, with R² values in the range of 0.878 to 0.990, had excellent performance in predicting all the output variables based on inputs. 
Energy consumption for processing of green tea, oolong tea, and black tea were calculated as 58,182, 60,947, and 66,301 MJ per ton of dry tea, respectively.
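The network topology the abstract describes — eight energy-equivalent inputs, two sigmoid hidden layers, and a linear output layer of 11 variables — can be sketched as a forward pass. The hidden-layer widths and random weights below are placeholders (the abstract does not give layer sizes), and Levenberg-Marquardt training is not shown:

```python
import numpy as np

def forward(x, params):
    """Feed-forward pass: two sigmoid hidden layers, linear output layer."""
    W1, b1, W2, b2, W3, b3 = params
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    h1 = sig(x @ W1 + b1)
    h2 = sig(h1 @ W2 + b2)
    return h2 @ W3 + b3          # linear transfer in the output layer

rng = np.random.default_rng(0)
# 8 inputs → 16 → 12 hidden units (assumed widths) → 11 outputs
shapes = [(8, 16), (16,), (16, 12), (12,), (12, 11), (11,)]
params = [rng.normal(scale=0.1, size=s) for s in shapes]
y = forward(rng.normal(size=8), params)  # one prediction of the 11 outputs
```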

  1. Biosensor system-on-a-chip including CMOS-based signal processing circuits and 64 carbon nanotube-based sensors for the detection of a neurotransmitter.

    PubMed

    Lee, Byung Yang; Seo, Sung Min; Lee, Dong Joon; Lee, Minbaek; Lee, Joohyung; Cheon, Jun-Ho; Cho, Eunju; Lee, Hyunjoong; Chung, In-Young; Park, Young June; Kim, Suhwan; Hong, Seunghun

    2010-04-07

    We developed a carbon nanotube (CNT)-based biosensor system-on-a-chip (SoC) for the detection of a neurotransmitter. Here, 64 CNT-based sensors were integrated with silicon-based signal processing circuits in a single chip, which was made possible by combining several technological breakthroughs such as efficient signal processing, uniform CNT networks, and biocompatible functionalization of CNT-based sensors. The chip was utilized to detect glutamate, a neurotransmitter, where ammonia, a byproduct of the enzymatic reaction of glutamate and glutamate oxidase on CNT-based sensors, modulated the conductance signals to the CNT-based sensors. This is a major technological advancement in the integration of CNT-based sensors with microelectronics, and this chip can be readily integrated with larger scale lab-on-a-chip (LoC) systems for various applications such as LoC systems for neural networks.

  2. Scalable graphene production: perspectives and challenges of plasma applications

    NASA Astrophysics Data System (ADS)

    Levchenko, Igor; Ostrikov, Kostya (Ken); Zheng, Jie; Li, Xingguo; Keidar, Michael; B. K. Teo, Kenneth

    2016-05-01

    Graphene, a newly discovered and extensively investigated material, has many unique and extraordinary properties which promise major technological advances in fields ranging from electronics to mechanical engineering and food production. Unfortunately, complex techniques and high production costs hinder commonplace applications. Scaling of existing graphene production techniques to the industrial level without compromising its properties is a current challenge. This article focuses on the perspectives and challenges of scalability, equipment, and technological perspectives of the plasma-based techniques which offer many unique possibilities for the synthesis of graphene and graphene-containing products. The plasma-based processes are amenable for scaling and could also be useful to enhance the controllability of the conventional chemical vapour deposition method and some other techniques, and to ensure a good quality of the produced graphene. We examine the unique features of the plasma-enhanced graphene production approaches, including the techniques based on inductively-coupled and arc discharges, in the context of their potential scaling to mass production following the generic scaling approaches applicable to the existing processes and systems. This work analyses a large amount of the recent literature on graphene production by various techniques and summarizes the results in a tabular form to provide a simple and convenient comparison of several available techniques. Our analysis reveals a significant potential of scalability for plasma-based technologies, based on the scaling-related process characteristics. Among other processes, a greater yield of 1 g × h⁻¹ m⁻² was reached for the arc discharge technology, whereas the other plasma-based techniques show process yields comparable to the neutral-gas based methods. 
Selected plasma-based techniques show lower energy consumption than in thermal CVD processes, and the ability to produce graphene flakes of various sizes reaching hundreds of square millimetres, and the thickness varying from a monolayer to 10-20 layers. Additional factors such as electrical voltage and current, not available in thermal CVD processes could potentially lead to better scalability, flexibility and control of the plasma-based processes. Advantages and disadvantages of various systems are also considered.

  3. Case-based medical informatics

    PubMed Central

    Pantazi, Stefan V; Arocha, José F; Moehr, Jochen R

    2004-01-01

    Background The "applied" nature distinguishes applied sciences from theoretical sciences. To emphasize this distinction, we begin with a general, meta-level overview of the scientific endeavor. We introduce the knowledge spectrum and four interconnected modalities of knowledge. In addition to the traditional differentiation between implicit and explicit knowledge we outline the concepts of general and individual knowledge. We connect general knowledge with the "frame problem," a fundamental issue of artificial intelligence, and individual knowledge with another important paradigm of artificial intelligence, case-based reasoning, a method of individual knowledge processing that aims at solving new problems based on the solutions to similar past problems. We outline the fundamental differences between Medical Informatics and theoretical sciences and propose that Medical Informatics research should advance individual knowledge processing (case-based reasoning) and that natural language processing research is an important step towards this goal that may have ethical implications for patient-centered health medicine. Discussion We focus on fundamental aspects of decision-making, which connect human expertise with individual knowledge processing. We continue with a knowledge spectrum perspective on biomedical knowledge and conclude that case-based reasoning is the paradigm that can advance towards personalized healthcare and that can enable the education of patients and providers. We center the discussion on formal methods of knowledge representation around the frame problem. We propose a context-dependent view on the notion of "meaning" and advocate the need for case-based reasoning research and natural language processing. 
In the context of memory based knowledge processing, pattern recognition, comparison and analogy-making, we conclude that while humans seem to naturally support the case-based reasoning paradigm (memory of past experiences of problem-solving and powerful case matching mechanisms), technical solutions are challenging. Finally, we discuss the major challenges for a technical solution: case record comprehensiveness, organization of information on similarity principles, development of pattern recognition and solving ethical issues. Summary Medical Informatics is an applied science that should be committed to advancing patient-centered medicine through individual knowledge processing. Case-based reasoning is the technical solution that enables a continuous individual knowledge processing and could be applied providing that challenges and ethical issues arising are addressed appropriately. PMID:15533257
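The case-matching mechanism the authors identify as central to case-based reasoning can be sketched as a retrieve step over a small case base. The case schema and the Jaccard similarity measure here are illustrative choices, not the paper's:

```python
def retrieve(case_base, query_features, k=1):
    """Return the k most similar past cases by Jaccard overlap of
    feature sets — the 'case matching' step of case-based reasoning."""
    def sim(case):
        a, b = set(case["features"]), set(query_features)
        return len(a & b) / len(a | b) if a | b else 0.0
    return sorted(case_base, key=sim, reverse=True)[:k]

cases = [  # hypothetical past problem-solving episodes
    {"id": 1, "features": {"fever", "cough"}, "solution": "protocol A"},
    {"id": 2, "features": {"rash", "fever"}, "solution": "protocol B"},
]
retrieve(cases, {"cough", "fever"})[0]["id"]  # → 1
```

Real systems replace the similarity function with domain-specific matching, which is where the challenges the article lists (case comprehensiveness, similarity-based organization) enter.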

  4. Scalable graphene production: perspectives and challenges of plasma applications.

    PubMed

    Levchenko, Igor; Ostrikov, Kostya Ken; Zheng, Jie; Li, Xingguo; Keidar, Michael; B K Teo, Kenneth

    2016-05-19

    Graphene, a newly discovered and extensively investigated material, has many unique and extraordinary properties which promise major technological advances in fields ranging from electronics to mechanical engineering and food production. Unfortunately, complex techniques and high production costs hinder commonplace applications. Scaling of existing graphene production techniques to the industrial level without compromising its properties is a current challenge. This article focuses on the perspectives and challenges of scalability, equipment, and technological perspectives of the plasma-based techniques which offer many unique possibilities for the synthesis of graphene and graphene-containing products. The plasma-based processes are amenable for scaling and could also be useful to enhance the controllability of the conventional chemical vapour deposition method and some other techniques, and to ensure a good quality of the produced graphene. We examine the unique features of the plasma-enhanced graphene production approaches, including the techniques based on inductively-coupled and arc discharges, in the context of their potential scaling to mass production following the generic scaling approaches applicable to the existing processes and systems. This work analyses a large amount of the recent literature on graphene production by various techniques and summarizes the results in a tabular form to provide a simple and convenient comparison of several available techniques. Our analysis reveals a significant potential of scalability for plasma-based technologies, based on the scaling-related process characteristics. Among other processes, a greater yield of 1 g × h(-1) m(-2) was reached for the arc discharge technology, whereas the other plasma-based techniques show process yields comparable to the neutral-gas based methods. 
Selected plasma-based techniques show lower energy consumption than in thermal CVD processes, and the ability to produce graphene flakes of various sizes reaching hundreds of square millimetres, and the thickness varying from a monolayer to 10-20 layers. Additional factors such as electrical voltage and current, not available in thermal CVD processes could potentially lead to better scalability, flexibility and control of the plasma-based processes. Advantages and disadvantages of various systems are also considered.

  5. Technical and Energy Performance of an Advanced, Aqueous Ammonia-Based CO2 Capture Technology for a 500 MW Coal-Fired Power Station.

    PubMed

    Li, Kangkang; Yu, Hai; Feron, Paul; Tade, Moses; Wardhaugh, Leigh

    2015-08-18

    Using a rate-based model, we assessed the technical feasibility and energy performance of an advanced aqueous-ammonia-based postcombustion capture process integrated with a coal-fired power station. The capture process consists of three identical process trains in parallel, each containing a CO2 capture unit, an NH3 recycling unit, a water separation unit, and a CO2 compressor. A sensitivity study of important parameters, such as NH3 concentration, lean CO2 loading, and stripper pressure, was performed to minimize the energy consumption involved in the CO2 capture process. Process modifications of the rich-split process and the interheating process were investigated to further reduce the solvent regeneration energy. The integrated capture system was then evaluated in terms of the mass balance and the energy consumption of each unit. The results show that our advanced ammonia process is technically feasible and energy-competitive, with a low net power-plant efficiency penalty of 7.7%.

  6. Aligning observed and modelled behaviour based on workflow decomposition

    NASA Astrophysics Data System (ADS)

    Wang, Lu; Du, YuYue; Liu, Wei

    2017-09-01

    When business processes are mostly supported by information systems, the availability of event logs generated from these systems, as well as the requirement of appropriate process models, are increasing. Business processes can be discovered, monitored and enhanced by extracting process-related information. However, some events cannot be correctly identified because of the explosion in the volume of event logs. Therefore, a new process mining technique is proposed based on a workflow decomposition method in this paper. Petri nets (PNs) are used to describe business processes, and then conformance checking of event logs and process models is investigated. A decomposition approach is proposed to divide large process models and event logs into several separate parts that can be analysed independently, while an alignment approach based on a state equation method in PN theory enhances the performance of conformance checking. Both approaches are implemented in the process mining framework ProM. The correctness and effectiveness of the proposed methods are illustrated through experiments.
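The state equation from PN theory underlying the alignment approach is the marking equation M' = M0 + N·σ, where N is the places × transitions incidence matrix and σ counts transition firings; a candidate replay that violates it can be rejected cheaply, which is what makes it a fast necessary-condition check during conformance alignment. A minimal sketch with a hypothetical two-place net:

```python
def marking_after(m0, incidence, firing_counts):
    """Marking equation M' = M0 + N·σ for a Petri net, with the
    incidence matrix indexed as incidence[place][transition]."""
    return [m0[p] + sum(incidence[p][t] * firing_counts[t]
                        for t in range(len(firing_counts)))
            for p in range(len(m0))]

# Two places, two transitions: t0 moves the token p0 → p1, t1 moves it back.
N = [[-1,  1],
     [ 1, -1]]
marking_after([1, 0], N, [1, 0])  # → [0, 1]
```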

  7. Process-based tolerance assessment of connecting rod machining process

    NASA Astrophysics Data System (ADS)

    Sharma, G. V. S. S.; Rao, P. Srinivasa; Surendra Babu, B.

    2016-06-01

    Process tolerancing based on the process capability studies is the optimistic and pragmatic approach of determining the manufacturing process tolerances. On adopting the define-measure-analyze-improve-control approach, the process potential capability index (Cp) and the process performance capability index (Cpk) values of identified process characteristics of connecting rod machining process are achieved to be greater than the industry benchmark of 1.33, i.e., four sigma level. The tolerance chain diagram methodology is applied to the connecting rod in order to verify the manufacturing process tolerances at various operations of the connecting rod manufacturing process. This paper bridges the gap between the existing dimensional tolerances obtained via tolerance charting and process capability studies of the connecting rod component. Finally, the process tolerancing comparison has been done by adopting a tolerance capability expert software.
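The two indices at the heart of the study are Cp = (USL − LSL) / 6σ and Cpk = min(USL − μ, μ − LSL) / 3σ, with 1.33 corresponding to the four-sigma benchmark cited above. A minimal sketch; the sample data and specification limits are hypothetical:

```python
import statistics

def capability(samples, lsl, usl):
    """Process potential (Cp) and performance (Cpk) capability indices
    from measured samples and the lower/upper specification limits."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)            # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)               # spread vs. tolerance band
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # penalizes off-center mean
    return cp, cpk

# A perfectly centred process has Cp == Cpk.
cp, cpk = capability([9.9, 10.1, 9.8, 10.2, 10.0], lsl=9.0, usl=11.0)
```

Cpk is always ≤ Cp; the gap between them measures how far the process mean has drifted from the center of the tolerance band.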

  8. A Process-Philosophical Understanding of Organizational Learning as "Wayfinding": Process, Practices and Sensitivity to Environmental Affordances

    ERIC Educational Resources Information Center

    Chia, Robert

    2017-01-01

    Purpose: This paper aims to articulate a practice-based, non-cognitivist approach to organizational learning. Design/methodology/approach: This paper explores the potential contribution of a process-based "practice turn" in social theory for understanding organizational learning. Findings: In complex, turbulent environments, robust…

  9. Trace-Based Microanalytic Measurement of Self-Regulated Learning Processes

    ERIC Educational Resources Information Center

    Siadaty, Melody; Gaševic, Dragan; Hatala, Marek

    2016-01-01

    To keep pace with today's rapidly growing knowledge-driven society, productive self-regulation of one's learning processes are essential. We introduce and discuss a trace-based measurement protocol to measure the effects of scaffolding interventions on self-regulated learning (SRL) processes. It guides tracing of learners' actions in a learning…

  10. Process Evaluation in Corrections-Based Substance Abuse Treatment.

    ERIC Educational Resources Information Center

    Wolk, James L.; Hartmann, David J.

    1996-01-01

    Argues that process evaluation is needed to validate prison-based substance abuse treatment effectiveness. Five groups--inmates, treatment staff, prison staff, prison administration, and the parole board--should be a part of this process evaluation. Discusses these five groups relative to three stages of development of substance abuse treatment in…

  11. A COMPOSITE HOLLOW FIBER MEMBRANE-BASED PERVAPORATION PROCESS FOR SEPARATION OF VOCS FROM AQUEOUS SURFACTANT SOLUTIONS. (R825511C027)

    EPA Science Inventory

    The separation and recovery of VOCs from surfactant-containing aqueous solutions by a composite hollow fiber membrane-based pervaporation process has been studied. The process employed hydrophobic microporous polypropylene hollow fibers having a thin plasma polymerized silicon...

  12. Towards an Intelligent Planning Knowledge Base Development Environment

    NASA Technical Reports Server (NTRS)

    Chien, S.

    1994-01-01

    This abstract describes work in developing knowledge base editing and debugging tools for the Multimission VICAR Planner (MVP) system. MVP uses artificial intelligence planning techniques to automatically construct executable complex image processing procedures (using models of the smaller constituent image processing subprograms) in response to image processing requests made to the JPL Multimission Image Processing Laboratory.

  13. Using Data-Based Inquiry and Decision Making To Improve Instruction.

    ERIC Educational Resources Information Center

    Feldman, Jay; Tung, Rosann

    2001-01-01

    Discusses a study of six schools using data-based inquiry and decision-making process to improve instruction. Findings identified two conditions to support successful implementation of the process: administrative support, especially in providing teachers learning time, and teacher leadership to encourage and support colleagues to own the process.…

  14. An adaptive management process for forest soil conservation.

    Treesearch

    Michael P. Curran; Douglas G. Maynard; Ronald L. Heninger; Thomas A. Terry; Steven W. Howes; Douglas M. Stone; Thomas Niemann; Richard E. Miller; Robert F. Powers

    2005-01-01

    Soil disturbance guidelines should be based on comparable disturbance categories adapted to specific local soil conditions, validated by monitoring and research. Guidelines, standards, and practices should be continually improved based on an adaptive management process, which is presented in this paper. Core components of this process include: reliable monitoring...

  15. Improved Warm-Working Process For An Iron-Base Alloy

    NASA Technical Reports Server (NTRS)

    Cone, Fred P.; Cryns, Brendan J.; Miller, John A.; Zanoni, Robert

    1992-01-01

    Warm-working process produces predominantly unrecrystallized grain structure in forgings of iron-base alloy A286 (PWA 1052 composition). Yield strength and ultimate strength increased, and elongation and reduction of area at break decreased. Improved process used on forgings up to 10 in. thick and weighing up to 900 lb.

  16. Advanced process control framework initiative

    NASA Astrophysics Data System (ADS)

    Hill, Tom; Nettles, Steve

    1997-01-01

    The semiconductor industry, one of the world's most fiercely competitive industries, is driven by increasingly complex process technologies and global competition to improve cycle time, quality, and process flexibility. Due to the complexity of these problems, current process control techniques are generally nonautomated, time-consuming, reactive, nonadaptive, and focused on individual fabrication tools and processes. As the semiconductor industry moves into higher density processes, radical new approaches are required. To address the need for advanced factory-level process control in this environment, Honeywell, Advanced Micro Devices (AMD), and SEMATECH formed the Advanced Process Control Framework Initiative (APCFI) joint research project. The project defines and demonstrates an Advanced Process Control (APC) approach based on SEMATECH's Computer Integrated Manufacturing (CIM) Framework. Its scope includes the coordination of Manufacturing Execution Systems, process control tools, and wafer fabrication equipment to provide necessary process control capabilities. Moreover, it takes advantage of the CIM Framework to integrate and coordinate applications from other suppliers that provide services necessary for the overall system to function. This presentation discusses the key concept of model-based process control that differentiates the APC Framework. This major improvement over current methods enables new systematic process control by linking the knowledge of key process settings to desired product characteristics that reside in models created with commercial model development tools. The unique framework-based approach facilitates integration of commercial tools and reuse of their data by tying them together in an object-based structure. The presentation also explores the perspective of each organization's involvement in the APCFI project. 
Each has complementary goals and expertise to contribute; Honeywell represents the supplier viewpoint, AMD represents the user with 'real customer requirements', and SEMATECH provides a consensus-building organization that widely disseminates technology to suppliers and users in the semiconductor industry that face similar equipment and factory control systems challenges.

  17. Applying Constructivist and Objectivist Learning Theories in the Design of a Web-based Course: Implications for Practice.

    ERIC Educational Resources Information Center

    Moallem, Mahnaz

    2001-01-01

    Provides an overview of the process of designing and developing a Web-based course using instructional design principles and models, including constructivist and objectivist theories. Explains the process of implementing an instructional design model in designing a Web-based undergraduate course and evaluates the model based on course evaluations.…

  18. Dependent Measure and Time Constraints Modulate the Competition between Conflicting Feature-Based and Rule-Based Generalization Processes

    ERIC Educational Resources Information Center

    Cobos, Pedro L.; Gutiérrez-Cobo, María J.; Morís, Joaquín; Luque, David

    2017-01-01

    In our study, we tested the hypothesis that feature-based and rule-based generalization involve different types of processes that may affect each other producing different results depending on time constraints and on how generalization is measured. For this purpose, participants in our experiments learned cue-outcome relationships that followed…

  19. Testing Strategies for Model-Based Development

    NASA Technical Reports Server (NTRS)

    Heimdahl, Mats P. E.; Whalen, Mike; Rajan, Ajitha; Miller, Steven P.

    2006-01-01

    This report presents an approach for testing artifacts generated in a model-based development process. This approach divides the traditional testing process into two parts: requirements-based testing (validation testing) which determines whether the model implements the high-level requirements and model-based testing (conformance testing) which determines whether the code generated from a model is behaviorally equivalent to the model. The goals of the two processes differ significantly and this report explores suitable testing metrics and automation strategies for each. To support requirements-based testing, we define novel objective requirements coverage metrics similar to existing specification and code coverage metrics. For model-based testing, we briefly describe automation strategies and examine the fault-finding capability of different structural coverage metrics using tests automatically generated from the model.
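Both halves of the report's testing process reduce measurement to the same bookkeeping: what fraction of coverage obligations (requirements-derived for validation testing, structural for conformance testing) a test suite exercises. A minimal sketch; the obligation names are hypothetical:

```python
def coverage(obligations_per_test, all_obligations):
    """Fraction of coverage obligations hit by at least one test in the
    suite; each test contributes the set of obligations it exercises."""
    hit = set().union(*obligations_per_test) if obligations_per_test else set()
    return len(hit & set(all_obligations)) / len(all_obligations)

# Two tests covering three of four requirements-derived obligations:
coverage([{"R1", "R2"}, {"R2", "R3"}], {"R1", "R2", "R3", "R4"})  # → 0.75
```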

  20. Stochastic simulation by image quilting of process-based geological models

    NASA Astrophysics Data System (ADS)

    Hoffimann, Júlio; Scheidt, Céline; Barfod, Adrian; Caers, Jef

    2017-09-01

    Process-based modeling offers a way to represent realistic geological heterogeneity in subsurface models. The main limitation lies in conditioning such models to data. Multiple-point geostatistics can use these process-based models as training images and address the data conditioning problem. In this work, we further develop image quilting as a method for 3D stochastic simulation capable of mimicking the realism of process-based geological models with minimal modeling effort (i.e. parameter tuning) and at the same time condition them to a variety of data. In particular, we develop a new probabilistic data aggregation method for image quilting that bypasses traditional ad-hoc weighting of auxiliary variables. In addition, we propose a novel criterion for template design in image quilting that generalizes the entropy plot for continuous training images. The criterion is based on the new concept of voxel reuse-a stochastic and quilting-aware function of the training image. We compare our proposed method with other established simulation methods on a set of process-based training images of varying complexity, including a real-case example of stochastic simulation of the buried-valley groundwater system in Denmark.
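The core step image quilting inherits from its texture-synthesis origins is selecting the candidate patch from the training image whose overlap strip best matches the already-simulated region. A minimal sum-of-squared-differences sketch; patch sizes and data are arbitrary, and the paper's probabilistic data aggregation and voxel-reuse criterion are not shown:

```python
import numpy as np

def best_patch(candidates, overlap_target):
    """Index of the candidate patch whose left overlap strip best matches
    the already-quilted region (minimum sum of squared differences)."""
    w = overlap_target.shape[-1]
    ssd = [float(np.sum((p[..., :w] - overlap_target) ** 2))
           for p in candidates]
    return int(np.argmin(ssd))

rng = np.random.default_rng(0)
candidates = [rng.random((8, 8)) for _ in range(20)]  # patches from a training image
target = candidates[7][:, :2]      # overlap strip copied from patch 7
best_patch(candidates, target)     # → 7 (exact match has SSD 0)
```

Full quilting repeats this selection across a grid (sampling among low-SSD patches rather than always taking the minimum, to stay stochastic) and cuts a minimum-error seam through each overlap.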

  1. 42 CFR 425.112 - Required processes and patient-centeredness criteria.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    .... (a) General. (1) An ACO must— (i) Promote evidence-based medicine and beneficiary engagement... to accomplish the following: (1) Promote evidence-based medicine. These processes must cover...) Communication of clinical knowledge/evidence-based medicine to beneficiaries in a way that is understandable to...

  2. 42 CFR 425.112 - Required processes and patient-centeredness criteria.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    .... (a) General. (1) An ACO must— (i) Promote evidence-based medicine and beneficiary engagement... to accomplish the following: (1) Promote evidence-based medicine. These processes must cover...) Communication of clinical knowledge/evidence-based medicine to beneficiaries in a way that is understandable to...

  3. 42 CFR 425.112 - Required processes and patient-centeredness criteria.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    .... (a) General. (1) An ACO must— (i) Promote evidence-based medicine and beneficiary engagement... to accomplish the following: (1) Promote evidence-based medicine. These processes must cover...) Communication of clinical knowledge/evidence-based medicine to beneficiaries in a way that is understandable to...

  4. Shelf-stable egg-based products processed by high pressure thermal sterilization

    USDA-ARS?s Scientific Manuscript database

    Producing a thermally sterilized egg-based product with increased shelf life without losing the sensory and nutritional properties of the freshly prepared product is challenging. Until recently, all commercial shelf-stable egg-based products were sterilized using conventional thermal processing; how...

  5. Additive Manufacturing of IN100 Superalloy Through Scanning Laser Epitaxy for Turbine Engine Hot-Section Component Repair: Process Development, Modeling, Microstructural Characterization, and Process Control

    NASA Astrophysics Data System (ADS)

    Acharya, Ranadip; Das, Suman

    2015-09-01

    This article describes additive manufacturing (AM) of IN100, a high gamma-prime nickel-based superalloy, through scanning laser epitaxy (SLE), aimed at the creation of thick deposits onto like-chemistry substrates for enabling repair of turbine engine hot-section components. SLE is a metal powder bed-based laser AM technology developed for nickel-base superalloys with equiaxed, directionally solidified, and single-crystal microstructural morphologies. Here, we combine process modeling, statistical design-of-experiments (DoE), and microstructural characterization to demonstrate fully metallurgically bonded, crack-free and dense deposits exceeding 1000 μm of SLE-processed IN100 powder onto IN100 cast substrates produced in a single pass. A combined thermal-fluid flow-solidification model of the SLE process complements DoE-based process development. A customized quantitative metallography technique analyzes digital cross-sectional micrographs and extracts various microstructural parameters, enabling process model validation and process parameter optimization. Microindentation measurements show an increase in the hardness by 10 pct in the deposit region compared to the cast substrate due to microstructural refinement. The results illustrate one of the very few successes reported for the crack-free deposition of IN100, a notoriously "non-weldable" hot-section alloy, thus establishing the potential of SLE as an AM method suitable for hot-section component repair and for future new-make components in high gamma-prime containing crack-prone nickel-based superalloys.

  6. A CSP-Based Agent Modeling Framework for the Cougaar Agent-Based Architecture

    NASA Technical Reports Server (NTRS)

    Gracanin, Denis; Singh, H. Lally; Eltoweissy, Mohamed; Hinchey, Michael G.; Bohner, Shawn A.

    2005-01-01

    Cognitive Agent Architecture (Cougaar) is a Java-based architecture for large-scale distributed agent-based applications. A Cougaar agent is an autonomous software entity with behaviors that represent a real-world entity (e.g., a business process). A Cougaar-based Model Driven Architecture approach, currently under development, uses a description of the system's functionality (requirements) to automatically implement the system in Cougaar. The Communicating Sequential Processes (CSP) formalism is used for the formal validation of the generated system. Two main agent components, a blackboard and a plugin, are modeled as CSP processes. A set of channels represents communications between the blackboard and individual plugins. The blackboard is represented as a CSP process that communicates with every agent in the collection. The developed CSP-based Cougaar modeling framework provides a starting point for a more complete formal verification of the automatically generated Cougaar code. Currently, it is used to verify the behavior of an individual agent in terms of CSP properties and to analyze the corresponding Cougaar society.
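    The blackboard-and-plugins communication pattern described above can be approximated with ordinary threads and queues standing in for CSP processes and channels. This is a loose sketch with invented names (plugA, plugB, plugC), not the CSP formalism or Cougaar's API: each plugin publishes a fact to the blackboard over a shared channel, then blocks on its own channel for the blackboard's broadcast.

```python
import queue
import threading

inbox = queue.Queue()                                  # plugins -> blackboard
outboxes = {n: queue.Queue() for n in ("plugA", "plugB", "plugC")}
results = {}

def plugin(name):
    inbox.put((name, f"fact-from-{name}"))             # publish a fact
    results[name] = outboxes[name].get()               # await broadcast

def blackboard(n_plugins):
    store = []
    for _ in range(n_plugins):                         # collect one fact each
        _, fact = inbox.get()
        store.append(fact)
    for ch in outboxes.values():                       # broadcast final store
        ch.put(sorted(store))

threads = [threading.Thread(target=plugin, args=(n,)) for n in outboxes]
threads.append(threading.Thread(target=blackboard, args=(len(outboxes),)))
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results["plugA"])
```

    Queues give the same blocking rendezvous behavior that makes the channel-based CSP model checkable; a real verification would express these processes in a CSP tool rather than run them.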

  7. Model-Driven Useware Engineering

    NASA Astrophysics Data System (ADS)

    Meixner, Gerrit; Seissler, Marc; Breiner, Kai

    User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.

  8. The development of additive manufacturing technique for nickel-base alloys: A review

    NASA Astrophysics Data System (ADS)

    Zadi-Maad, Ahmad; Basuki, Arif

    2018-04-01

    Nickel-base alloys are attractive due to their excellent mechanical properties and high resistance to creep deformation, corrosion, and oxidation. However, controlling performance when casting or forging this material is a hard task. In recent years, the additive manufacturing (AM) process has been implemented to replace the conventional directional solidification process for the production of nickel-base alloys. Due to its potentially lower cost and more flexible manufacturing process, AM is considered a substitute for the existing techniques. This paper provides a comprehensive review of previous work related to AM techniques for Ni-base alloys while highlighting current challenges and methods for solving them. The properties of conventionally manufactured Ni-base alloys are also compared with those of the AM-fabricated alloys. The mechanical properties obtained from tension, hardness, and fatigue tests are included, along with discussions of the effect of post-treatment processes. Recommendations for further work are also provided.

  9. Church-Based Recruitment to Reach Korean Immigrants: An Integrative Review.

    PubMed

    Park, Chorong; Jang, Myoungock; Nam, Soohyun; Grey, Margaret; Whittemore, Robin

    2017-04-01

    Although the Korean church has been frequently used to recruit Korean immigrants in research, little is known about the specific strategies and process. The purpose of this integrative review was to describe recruitment strategies in studies of Korean immigrants and to identify the process of Korean church-based recruitment. Thirty-three studies met inclusion criteria. Four stages of church-based recruitment were identified: initiation, endorsement, advertisement, and implementation. This review identified aspects of the church-based recruitment process in Korean immigrants, which are different from the Black and Hispanic literature, due to their hierarchical culture and language barriers. Getting permission from pastors and announcing the study by pastors at Sunday services were identified as the key components of the process. Using the church newsletter to advertise the study was the most effective strategy for the advertisement stage. Despite several limitations, church-based recruitment is a very feasible and effective way to recruit Korean immigrants.

  10. An Ontology-Based Conceptual Model For Accumulating And Reusing Knowledge In A DMAIC Process

    NASA Astrophysics Data System (ADS)

    Nguyen, ThanhDat; Kifor, Claudiu Vasile

    2015-09-01

    DMAIC (Define, Measure, Analyze, Improve, and Control) is an important process used to enhance the quality of processes based on knowledge. However, it is difficult to access DMAIC knowledge. Conventional approaches face a problem in structuring and reusing DMAIC knowledge, mainly because DMAIC knowledge is not represented and organized systematically. In this article, we overcome this problem with a conceptual model that combines the DMAIC process, knowledge management, and ontology engineering. The main idea of our model is to utilize ontologies to represent the knowledge generated by each DMAIC phase. We build five different knowledge bases for storing all knowledge of the DMAIC phases, with the support of necessary tools and appropriate techniques from the information technology area. Consequently, these knowledge bases make knowledge available to experts, managers, and web users during or after DMAIC execution in order to share and reuse existing knowledge.

  11. Implementation of a Web-Based Collaborative Process Planning System

    NASA Astrophysics Data System (ADS)

    Wang, Huifen; Liu, Tingting; Qiao, Li; Huang, Shuangxi

    Under a networked manufacturing environment, all phases of product manufacturing, involving design, process planning, machining, and assembly, may be accomplished collaboratively by different enterprises; even different manufacturing stages of the same part may be finished collaboratively by different enterprises. Based on the self-developed networked manufacturing platform eCWS (e-Cooperative Work System), a multi-agent-based system framework for collaborative process planning is proposed. In accordance with the requirements of collaborative process planning, shared resources provided by cooperating enterprises in the course of collaboration are classified into seven classes. A reconfigurable and extendable resource object model is then built. Decision-making strategy is also studied in this paper. Finally, a collaborative process planning system, e-CAPP, is developed and applied. It provides strong support for distributed designers to collaboratively plan and optimize product processes through the network.

  12. Understanding Process in Group-Based Intervention Delivery: Social Network Analysis and Intra-entity Variability Methods as Windows into the "Black Box".

    PubMed

    Molloy Elreda, Lauren; Coatsworth, J Douglas; Gest, Scott D; Ram, Nilam; Bamberger, Katharine

    2016-11-01

    Although the majority of evidence-based programs are designed for group delivery, group process and its role in participant outcomes have received little empirical attention. Data were collected from 20 groups of participants (94 early adolescents, 120 parents) enrolled in an efficacy trial of a mindfulness-based adaptation of the Strengthening Families Program (MSFP). Following each weekly session, participants reported on their relations to group members. Social network analysis and methods sensitive to intraindividual variability were integrated to examine weekly covariation between group process and participant progress, and to predict post-intervention outcomes from levels and changes in group process. Results demonstrate hypothesized links between network indices of group process and intervention outcomes and highlight the value of this unique analytic approach to studying intervention group process.

  13. Indirect three-dimensional printing of synthetic polymer scaffold based on thermal molding process.

    PubMed

    Park, Jeong Hun; Jung, Jin Woo; Kang, Hyun-Wook; Cho, Dong-Woo

    2014-06-01

    One of the major issues in tissue engineering has been the development of three-dimensional (3D) scaffolds, which serve as a structural template for cell growth and extracellular matrix formation. In scaffold-based tissue engineering, 3D printing (3DP) technology has been successfully applied for the fabrication of complex 3D scaffolds by using both direct and indirect techniques. In principle, direct 3DP techniques rely on the straightforward utilization of the final scaffold materials during the actual scaffold fabrication process. In contrast, indirect 3DP techniques use a negative mold based on a scaffold design, to which the desired biomaterial is cast and then sacrificed to obtain the final scaffold. Such indirect 3DP techniques generally impose a solvent-based process for scaffold fabrication, resulting in a considerable increase in the fabrication time and poor mechanical properties. In addition, the internal architecture of the resulting scaffold is affected by the properties of the biomaterial solution. In this study, we propose an advanced indirect 3DP technique using projection-based micro-stereolithography and an injection molding system (IMS) in order to address these challenges. The scaffold was fabricated by a thermal molding process using IMS to overcome the limitation of the solvent-based molding process in indirect 3DP techniques. The results indicate that the thermal molding process using an IMS has achieved a substantial reduction in scaffold fabrication time and has also provided the scaffold with higher mechanical modulus and strength. In addition, cell adhesion and proliferation studies have indicated no significant difference in cell activity between the scaffolds prepared by solvent-based and thermal molding processes.

  14. Standardized Review and Approval Process for High-Cost Medication Use Promotes Value-Based Care in a Large Academic Medical System

    PubMed Central

    Durvasula, Raghu; Kelly, Janet; Schleyer, Anneliese; Anawalt, Bradley D.; Somani, Shabir; Dellit, Timothy H.

    2018-01-01

    Background As healthcare costs rise and reimbursements decrease, healthcare organization leadership and clinical providers must collaborate to provide high-value healthcare. Medications are a key driver of the increasing cost of healthcare, largely as a result of the proliferation of expensive specialty drugs, including biologic agents. Such medications contribute significantly to the inpatient diagnosis-related group payment system, often with minimal or unproved benefit over less-expensive therapies. Objective To describe a systematic review process to reduce non–evidence-based inpatient use of high-cost medications across a large multihospital academic health system. Methods We created a Pharmacy & Therapeutics subcommittee consisting of clinicians, pharmacists, and an ethics representative. This committee developed a standardized process for a timely review (<48 hours) and approval of high-cost medications based on their clinical effectiveness, safety, and appropriateness. The engagement of clinical experts in the development of the consensus-based guidelines for the use of specific medications facilitated the clinicians' acceptance of the review process. Results Over a 2-year period, a total of 85 patient-specific requests underwent formal review. All reviews were conducted within 48 hours. This review process has reduced the non–evidence-based use of specialty medications and has resulted in a pharmacy savings of $491,000 in fiscal year 2016, with almost 80% of the savings occurring in the last 2 quarters, because our process has matured. Conclusion The creation of a collaborative review process to ensure consistent, evidence-based utilization of high-cost medications provides value-based care, while minimizing unnecessary practice variation and reducing the cost of inpatient care.

  15. The remote sensing image segmentation mean shift algorithm parallel processing based on MapReduce

    NASA Astrophysics Data System (ADS)

    Chen, Xi; Zhou, Liqing

    2015-12-01

    With the development of satellite remote sensing technology and the growth of remote sensing image data, traditional remote sensing image segmentation technology cannot meet massive remote sensing image processing and storage requirements. This article applies cloud computing and parallel computing technology to the remote sensing image segmentation process, building a cheap and efficient computer cluster system that uses parallel processing to implement the MeanShift algorithm for remote sensing image segmentation based on the MapReduce model. This approach not only ensures the quality of remote sensing image segmentation but also improves segmentation speed and better meets real-time requirements. The MapReduce-based parallel MeanShift segmentation algorithm is therefore of practical significance and value.
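    The division of labor the article describes, an embarrassingly parallel shift step followed by a merge of converged modes, mirrors the map and reduce phases. The following 1D toy (illustrative bandwidth and synthetic data, no Hadoop) shifts every point toward its kernel-weighted neighborhood mean in the "map" step and merges converged points into modes in the "reduce" step.

```python
import numpy as np

def map_shift(points, data, bandwidth=1.0):
    """'Map' phase: each point is shifted independently toward the
    kernel-weighted mean of its neighborhood (parallelizable per point)."""
    shifted = []
    for p in points:
        w = np.exp(-((data - p) ** 2) / (2 * bandwidth ** 2))
        shifted.append(np.sum(w * data) / np.sum(w))
    return np.array(shifted)

def reduce_modes(points, tol=0.5):
    """'Reduce' phase: merge converged points that landed on the same mode."""
    modes = []
    for p in np.sort(points):
        if not modes or abs(p - modes[-1]) > tol:
            modes.append(p)
    return modes

# synthetic intensities drawn from two well-separated clusters
data = np.concatenate([np.random.default_rng(1).normal(0, 0.3, 100),
                       np.random.default_rng(2).normal(5, 0.3, 100)])
pts = data.copy()
for _ in range(20):                 # iterate the map step to convergence
    pts = map_shift(pts, data)
print(len(reduce_modes(pts)))       # two clusters -> two modes
```

    In the MapReduce deployment, the per-point shift would run in mapper tasks over image tiles and the mode merging in reducers; the toy above only shows why the algorithm decomposes that way.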

  16. The Scenario-Based Engineering Process (SEP): a user-centered approach for the development of health care systems.

    PubMed

    Harbison, K; Kelly, J; Burnell, L; Silva, J

    1995-01-01

    The Scenario-based Engineering Process (SEP) is a user-focused methodology for large and complex system design. This process supports new application development from requirements analysis with domain models to component selection, design and modification, implementation, integration, and archival placement. It is built upon object-oriented methodologies, domain modeling strategies, and scenario-based techniques to provide an analysis process for mapping application requirements to available components. We are using SEP in the health care applications that we are developing. The process has already achieved success in the manufacturing and military domains and is being adopted by many organizations. SEP should prove viable in any domain containing scenarios that can be decomposed into tasks.

  17. Technical Potential Assessment for the Renewable Energy Zone (REZ) Process: A GIS-Based Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Nathan; Roberts, Billy J

    Geographic Information Systems (GIS)-based energy resource and technical potential assessments identify areas capable of supporting high levels of renewable energy (RE) development as part of a Renewable Energy Zone (REZ) Transmission Planning process. This document expands on the REZ Process to aid practitioners in conducting GIS-based RE resource and technical potential assessments. The REZ process is an approach to plan, approve, and build transmission infrastructure that connects REZs - geographic areas that have high-quality RE resources, suitable topography and land-use designations, and demonstrated developer interest - to the power system. The REZ process helps to increase the share of solar photovoltaic (PV), wind, and other resources while also maintaining reliability and economics.
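    A technical potential assessment of this kind is, at its core, raster masking: exclude unsuitable cells, then multiply the remaining area by an assumed power density. The sketch below uses synthetic layers and illustrative thresholds (a 5.0 kWh/m2/day resource cutoff, a 5% slope limit, 30 MW/km2), none of which come from the REZ document.

```python
import numpy as np

rng = np.random.default_rng(42)
CELL_KM2 = 1.0        # each grid cell covers 1 km^2 (assumed)
MW_PER_KM2 = 30.0     # assumed installable PV power density

# synthetic stand-ins for real GIS layers
ghi = rng.uniform(3.0, 6.5, (100, 100))      # solar resource, kWh/m^2/day
slope = rng.uniform(0.0, 20.0, (100, 100))   # terrain slope, percent
protected = rng.random((100, 100)) < 0.15    # land-use exclusion mask

# a cell is suitable if the resource is strong, terrain is flat,
# and no exclusion applies
suitable = (ghi >= 5.0) & (slope <= 5.0) & ~protected
potential_mw = suitable.sum() * CELL_KM2 * MW_PER_KM2
print(int(suitable.sum()), "suitable cells,", potential_mw, "MW")
```

    Real assessments stack many more exclusion layers (transmission distance, population, protected-area categories) and use technology-specific power densities, but the arithmetic is the same masked sum.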

  18. Mechanical Properties of Aluminum-Based Dissimilar Alloy Joints by Power Beams, Arc and FSW Processes

    NASA Astrophysics Data System (ADS)

    Okubo, Michinori; Kon, Tomokuni; Abe, Nobuyuki

    Dissimilar smart joints are useful. In this research, the weld quality of dissimilar aluminum alloy joints of 3 mm thickness produced by various welding processes and process parameters has been investigated through hardness and tensile tests and observation of imperfections and microstructure. The base metals used in this study are A1050-H24, A2017-T3, A5083-O, A6061-T6, and A7075-T651. The welding processes used are YAG laser beam, electron beam, metal inert gas arc, tungsten inert gas arc, and friction stir welding. The properties of the weld zones are affected by the welding processes, welding parameters, and combination of base metals. The properties of high-strength aluminum alloy joints are improved by friction stir welding.

  19. An expert systems application to space base data processing

    NASA Technical Reports Server (NTRS)

    Babb, Stephen M.

    1988-01-01

    The advent of space vehicles with their increased data requirements is reflected in the complexity of future telemetry systems. Space-based operations, with their immense operating costs, will shift the burden of data processing and routine analysis from the space station to the Orbital Transfer Vehicle (OTV). A research and development project is described which addresses the real-time onboard data processing tasks associated with a space-based vehicle, specifically focusing on an implementation of an expert system.

  20. Hardware based redundant multi-threading inside a GPU for improved reliability

    DOEpatents

    Sridharan, Vilas; Gurumurthi, Sudhanva

    2015-05-05

    A system and method for verifying computation output using computer hardware are provided. Instances of computation are generated and processed on hardware-based processors. As instances of computation are processed, each instance of computation receives a load accessible to other instances of computation. Instances of output are generated by processing the instances of computation. The instances of output are verified against each other in a hardware based processor to ensure accuracy of the output.
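    The verification scheme in the abstract, several instances of the same computation processed and checked against each other, can be illustrated in software. The patent describes a hardware mechanism inside a GPU; this thread-based sketch with an invented workload only mimics the control flow.

```python
import threading

def computation(load):
    # the 'load' is accessible to all instances, as in the abstract
    return sum(x * x for x in load)

def run_redundant(load, n_instances=3):
    """Run the same computation in N independent instances and verify
    the instances of output against each other."""
    outputs = [None] * n_instances

    def worker(i):
        outputs[i] = computation(load)

    threads = [threading.Thread(target=worker, args=(i,))
               for i in range(n_instances)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    # verification step: all redundant outputs must agree
    if len(set(outputs)) != 1:
        raise RuntimeError("redundant outputs disagree: possible soft error")
    return outputs[0]

print(run_redundant(range(10)))   # sum of squares 0..9 = 285
```

    In the hardware setting the comparison happens in a processor rather than in Python, and a mismatch signals a transient fault rather than raising an exception, but the agree-or-flag logic is the same.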

  1. Acausal measurement-based quantum computing

    NASA Astrophysics Data System (ADS)

    Morimae, Tomoyuki

    2014-07-01

    In measurement-based quantum computing, there is a natural "causal cone" among qubits of the resource state, since the measurement angle on a qubit has to depend on previous measurement results in order to correct the effect of by-product operators. If we respect the no-signaling principle, by-product operators cannot be avoided. Here we study the possibility of acausal measurement-based quantum computing by using the process matrix framework [Oreshkov, Costa, and Brukner, Nat. Commun. 3, 1092 (2012), 10.1038/ncomms2076]. We construct a resource process matrix for acausal measurement-based quantum computing restricting local operations to projective measurements. The resource process matrix is an analog of the resource state of the standard causal measurement-based quantum computing. We find that if we restrict local operations to projective measurements the resource process matrix is (up to a normalization factor and trivial ancilla qubits) equivalent to the decorated graph state created from the graph state of the corresponding causal measurement-based quantum computing. We also show that it is possible to consider a causal game whose causal inequality is violated by acausal measurement-based quantum computing.

  2. Development and Validation of the Social Information Processing Application: A Web-Based Measure of Social Information Processing Patterns in Elementary School-Age Boys

    ERIC Educational Resources Information Center

    Kupersmidt, Janis B.; Stelter, Rebecca; Dodge, Kenneth A.

    2011-01-01

    The purpose of this study was to evaluate the psychometric properties of an audio computer-assisted self-interviewing Web-based software application called the Social Information Processing Application (SIP-AP) that was designed to assess social information processing skills in boys in 3rd through 5th grades. This study included a racially and…

  3. The Process of Adaptation of a Community-Level, Evidence-Based Intervention for HIV-Positive African American Men Who Have Sex with Men in Two Cities

    ERIC Educational Resources Information Center

    Robinson, Beatrice E.; Galbraith, Jennifer S.; Lund, Sharon M.; Hamilton, Autumn R.; Shankle, Michael D.

    2012-01-01

    We describe the process of adapting a community-level, evidence-based behavioral intervention (EBI), Community PROMISE, for HIV-positive African American men who have sex with men (AAMSM). The Centers for Disease Control and Prevention (CDC) Map of the Adaptation Process (MAP) guided the adaptation process for this new target population by two…

  4. Dehydration processes using membranes with hydrophobic coating

    DOEpatents

    Huang, Yu; Baker, Richard W; Aldajani, Tiem; Ly, Jennifer

    2013-07-30

    Processes for removing water from organic compounds, especially polar compounds such as alcohols. The processes include a membrane-based dehydration step, using a membrane that has a dioxole-based polymer selective layer or the like and a hydrophilic selective layer, and can operate even when the stream to be treated has a high water content, such as 10 wt % or more. The processes are particularly useful for dehydrating ethanol.

  5. System of Objectified Judgement Analysis (SOJA) as a tool in rational and transparent drug-decision making.

    PubMed

    Janknegt, Robert; Scott, Mike; Mairs, Jill; Timoney, Mark; McElnay, James; Brenninkmeijer, Rob

    2007-10-01

    Drug selection should be a rational process that embraces the principles of evidence-based medicine. However, many factors may affect the choice of agent. It is against this background that the System of Objectified Judgement Analysis (SOJA) process for rational drug selection was developed. This article describes how the information on which the SOJA process is based was researched and processed.

  6. Goal selection versus process control in a brain-computer interface based on sensorimotor rhythms.

    PubMed

    Royer, Audrey S; He, Bin

    2009-02-01

    In a brain-computer interface (BCI) utilizing a process control strategy, the signal from the cortex is used to control the fine motor details normally handled by other parts of the brain. In a BCI utilizing a goal selection strategy, the signal from the cortex is used to determine the overall end goal of the user, and the BCI controls the fine motor details. A BCI based on goal selection may be an easier and more natural system than one based on process control. Although goal selection in theory may surpass process control, the two have never been directly compared, as we are reporting here. Eight young healthy human subjects participated in the present study, three trained and five naïve in BCI usage. Scalp-recorded electroencephalograms (EEG) were used to control a computer cursor during five different paradigms. The paradigms were similar in their underlying signal processing and used the same control signal. However, three were based on goal selection, and two on process control. For both the trained and naïve populations, goal selection had more hits per run, was faster, more accurate (for seven out of eight subjects) and had a higher information transfer rate than process control. Goal selection outperformed process control in every measure studied in the present investigation.
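    The information transfer rate reported in such BCI studies is conventionally the Wolpaw ITR; the abstract does not give the formula, so the standard definition is assumed here: bits per selection times selections per minute.

```python
import math

def wolpaw_itr(n_targets, accuracy, selections_per_min):
    """Wolpaw information transfer rate in bits/min, the standard BCI
    metric (assumed here, since the abstract does not state its formula)."""
    n, p = n_targets, accuracy
    if p >= 1.0:
        bits = math.log2(n)                       # perfect accuracy case
    else:
        bits = (math.log2(n) + p * math.log2(p)
                + (1 - p) * math.log2((1 - p) / (n - 1)))
    return bits * selections_per_min

# e.g. a 4-target goal-selection paradigm at 90% accuracy, 10 selections/min
print(round(wolpaw_itr(4, 0.9, 10), 2))
```

    The metric rewards exactly what the study measured: goal selection wins on ITR when it is simultaneously faster (more selections per minute) and more accurate.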

  7. Contingency theoretic methodology for agent-based web-oriented manufacturing systems

    NASA Astrophysics Data System (ADS)

    Durrett, John R.; Burnell, Lisa J.; Priest, John W.

    2000-12-01

    The development of distributed, agent-based, web-oriented, N-tier Information Systems (IS) must be supported by a design methodology capable of responding to the convergence of shifts in business process design, organizational structure, computing, and telecommunications infrastructures. We introduce a contingency theoretic model for the use of open, ubiquitous software infrastructure in the design of flexible organizational IS. Our basic premise is that developers should change the way they view the software design process: from the solution of a single problem to the dynamic creation of teams of software components. We postulate that developing effective, efficient, flexible, component-based distributed software requires reconceptualizing the current development model. The basic concepts of distributed software design are merged with the environment-causes-structure relationship from contingency theory; the task-uncertainty of organizational information-processing relationships from information processing theory; and the concept of inter-process dependencies from coordination theory. Software processes are treated as employees, groups of processes as software teams, and distributed systems as software organizations. Design techniques already used in the design of flexible business processes and well researched in the domain of the organizational sciences are presented. Guidelines that can be utilized in the creation of component-based distributed software will be discussed.

  8. Consumer psychology: categorization, inferences, affect, and persuasion.

    PubMed

    Loken, Barbara

    2006-01-01

    This chapter reviews research on consumer psychology with emphasis on the topics of categorization, inferences, affect, and persuasion. The chapter reviews theory-based empirical research during the period 1994-2004. Research on categorization includes empirical research on brand categories, goals as organizing frameworks and motivational bases for judgments, and self-based processing. Research on inferences includes numerous types of inferences that are cognitively and/or experientially based. Research on affect includes the effects of mood on processing and cognitive and noncognitive bases for attitudes and intentions. Research on persuasion focuses heavily on the moderating role of elaboration and dual-process models, and includes research on attitude strength responses, advertising responses, and negative versus positive evaluative dimensions.

  9. REPLACEMENT OF HAZARDOUS MATERIAL IN WIDE WEB FLEXOGRAPHIC PRINTING PROCESS

    EPA Science Inventory

    This study examined, on a technical and economic basis, the effect of substituting water-based inks in a flexographic printing process. To reduce volatile organic compound (VOC) emissions by switching from the use of solvent-based inks to water-based inks, several equipment modific...

  11. [Preface for special issue on bio-based materials (2016)].

    PubMed

    Weng, Yunxuan

    2016-06-25

    Bio-based materials are new materials or chemicals made from renewable biomass raw materials such as grain, legumes, straw, bamboo, and wood powder. This class of materials includes bio-based polymers, bio-based fibers, glycotechnology products, bio-based rubber, plastics produced by biomass thermoplastic processing, and basic bio-based chemicals, for instance bio-alcohols, organic acids, alkanes, and alkenes, obtained by bio-synthesis, bio-processing, and bio-refinery. Owing to their environmental friendliness and resource conservation, bio-based materials are becoming a new dominant industry, taking the lead in worldwide scientific and technological innovation and economic development. An overview of bio-based materials development is reported in this special issue, and the industrial status and research progress of the following aspects are introduced: bio-based fiber, polyhydroxyalkanoates, biodegradable mulching film, bio-based polyamide, protein-based biomedical materials, bio-based polyurethane, and the modification and processing of poly(lactic acid).

  12. Induced Polarization Influences the Fundamental Forces in DNA Base Flipping

    PubMed Central

    2015-01-01

    Base flipping in DNA is an important process involved in genomic repair and epigenetic control of gene expression. The driving forces for these processes are not fully understood, especially in the context of the underlying dynamics of the DNA and solvent effects. We studied double-stranded DNA oligomers that have been previously characterized by imino proton exchange NMR using both additive and polarizable force fields. Our results highlight the importance of induced polarization on the base flipping process, yielding near-quantitative agreement with experimental measurements of the equilibrium between the base-paired and flipped states. Further, these simulations allow us to quantify for the first time the energetic implications of polarization on the flipping pathway. Free energy barriers to base flipping are reduced by changes in dipole moments of both the flipped bases that favor solvation of the bases in the open state and water molecules adjacent to the flipping base. PMID:24976900

  13. Proportional reasoning as a heuristic-based process: time constraint and dual task considerations.

    PubMed

    Gillard, Ellen; Van Dooren, Wim; Schaeken, Walter; Verschaffel, Lieven

    2009-01-01

    The present study interprets the overuse of proportional solution methods from a dual process framework. Dual process theories claim that analytic operations involve time-consuming executive processing, whereas heuristic operations are fast and automatic. In two experiments to test whether proportional reasoning is heuristic-based, the participants solved "proportional" problems, for which proportional solution methods provide correct answers, and "nonproportional" problems known to elicit incorrect answers based on the assumption of proportionality. In Experiment 1, the available solution time was restricted. In Experiment 2, the executive resources were burdened with a secondary task. Both manipulations induced an increase in proportional answers and a decrease in correct answers to nonproportional problems. These results support the hypothesis that the choice for proportional methods is heuristic-based.

  14. Soft sensor modeling based on variable partition ensemble method for nonlinear batch processes

    NASA Astrophysics Data System (ADS)

    Wang, Li; Chen, Xiangguang; Yang, Kai; Jin, Huaiping

    2017-01-01

    Batch processes are typically characterized by nonlinearity and system uncertainty, so a conventional single model may be ill-suited. A local-learning soft sensor based on a variable partition ensemble method is developed for quality prediction in nonlinear, non-Gaussian batch processes. A set of input variable subsets is obtained by bootstrapping and the PMI criterion, and multiple local Gaussian process regression (GPR) models are developed, one for each local input variable subset. When new test data arrive, the posterior probability of each best-performing local model is estimated by Bayesian inference and used to combine the local GPR models into a final prediction. The proposed soft sensor is demonstrated on an industrial fed-batch chlortetracycline fermentation process.
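    The Bayesian combination step described above can be illustrated with a minimal sketch. Here the local GPR models are replaced by simple stand-in predictors, and each model's posterior weight is derived from a Gaussian likelihood of its recent residual; all names and numbers are hypothetical, assumed for illustration only.

```python
import math

def combine_local_models(x, models, sigma=1.0, priors=None):
    """Combine local model predictions via Bayesian posterior weights.

    Each model is a (predict, recent_error) pair: `predict` maps an input
    to a prediction, and `recent_error` is that model's residual on a
    recent validation sample (a stand-in for the likelihood evidence a
    soft sensor would accumulate online).
    """
    n = len(models)
    priors = priors or [1.0 / n] * n
    # Gaussian likelihood of each model given its recent residual
    likelihoods = [math.exp(-(err ** 2) / (2 * sigma ** 2))
                   for _, err in models]
    evidence = sum(p * l for p, l in zip(priors, likelihoods))
    posteriors = [p * l / evidence for p, l in zip(priors, likelihoods)]
    # Posterior-weighted combination of the local predictions
    return sum(w * predict(x) for w, (predict, _) in zip(posteriors, models))

# Two hypothetical local models with different recent residuals
models = [(lambda x: 2.0 * x, 0.1), (lambda x: 2.5 * x, 1.0)]
y = combine_local_models(3.0, models)
```

    The model with the smaller recent residual receives the larger posterior weight, so the combined prediction is pulled toward its output.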

  15. Risk-based Strategy to Determine Testing Requirement for the Removal of Residual Process Reagents as Process-related Impurities in Bioprocesses.

    PubMed

    Qiu, Jinshu; Li, Kim; Miller, Karen; Raghani, Anil

    2015-01-01

    The purpose of this article is to recommend a risk-based strategy for determining clearance testing requirements for the process reagents used in manufacturing biopharmaceutical products. The strategy takes account of four risk factors. Firstly, the process reagents are classified into two categories according to their safety profile and history of use: generally recognized as safe (GRAS) and potential safety concern (PSC) reagents. Clearance testing of GRAS reagents can be eliminated because of their history of safe use and the capability of the process to remove them. An estimated safety margin (Se) value, the ratio of the exposure limit to the estimated maximum reagent amount, is then used to evaluate the necessity of testing the PSC reagents at an early development stage. The Se value is calculated from two risk factors: the starting PSC reagent amount per maximum product dose (Me) and the exposure limit (Le). A worst-case scenario is assumed to estimate the Me value: the PSC reagent of interest is co-purified with the product and no clearance occurs throughout the entire purification process. No clearance testing is required for a PSC reagent if its Se value is ≥1; otherwise clearance testing is needed. Finally, the point at which the process reagent is introduced into the process is also considered in determining the necessity of clearance testing. How to use the measured safety margin as a criterion for determining PSC reagent testing at the process characterization, process validation, and commercial production stages is also described. A large number of process reagents are used in biopharmaceutical manufacturing to control process performance, and clearance testing for all of them would be an enormous analytical task. In this article, a risk-based strategy is described that eliminates unnecessary clearance testing for the majority of the process reagents using four risk factors.
The risk factors included in the strategy are (i) safety profile of the reagents, (ii) the starting amount of the process reagents used in the manufacturing process, (iii) the maximum dose of the product, and (iv) the point of introduction of the process reagents in the process. The implementation of the risk-based strategy can eliminate clearance testing for approximately 90% of the process reagents used in the manufacturing processes. This science-based strategy allows us to ensure patient safety and meet regulatory agency expectations throughout the product development life cycle. © PDA, Inc. 2015.
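    The worst-case screening rule described in the abstract reduces to computing Se = Le/Me and waiving clearance testing when Se ≥ 1. A minimal sketch of that decision rule, with hypothetical reagent numbers:

```python
def needs_clearance_testing(exposure_limit_le, starting_amount_me):
    """Worst-case safety margin check for a PSC reagent.

    Se = Le / Me, assuming zero clearance (the reagent is co-purified
    with the product through the entire purification process).
    Clearance testing is waived if Se >= 1.
    Returns (testing_required, Se).
    """
    se = exposure_limit_le / starting_amount_me
    return se < 1.0, se

# Hypothetical reagent: exposure limit 50 ug/dose,
# worst-case starting amount 10 ug per maximum product dose
testing_required, se = needs_clearance_testing(50.0, 10.0)
# Se = 5.0 >= 1, so clearance testing is not required for this reagent
```

    Because the Me estimate assumes no clearance at all, any reagent that passes this screen would pass under the real, nonzero process clearance as well.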

  16. Chang'E-3 data pre-processing system based on scientific workflow

    NASA Astrophysics Data System (ADS)

    Tan, Xu; Liu, Jianjun; Wang, Yuanyuan; Yan, Wei; Zhang, Xiaoxia; Li, Chunlai

    2016-04-01

    The Chang'E-3 (CE3) mission has obtained a huge amount of lunar scientific data. Data pre-processing is an important segment of the CE3 ground research and application system. With a dramatic increase in the demand for data research and application, a Chang'E-3 data pre-processing system (CEDPS) based on scientific workflow is proposed, aimed at making scientists more flexible and productive by automating data-driven processing. The system should allow the planning, conduct, and control of the data processing procedure, including the ability to describe a data processing task by (1) defining input and output data, (2) defining data relationships, (3) defining the sequence of tasks, (4) defining the communication between tasks, (5) defining mathematical formulas, and (6) defining the relationship between tasks and data, as well as the automatic processing of tasks. Accordingly, how a task is described is the key to the system's flexibility. We design a workflow designer, a visual environment for capturing processes as workflows, built on a three-level model: (1) data relationships are established through a product tree; (2) the process model is constructed as a directed acyclic graph (DAG), in which a set of workflow constructs, including Sequence, Loop, Merge, and Fork, are composable with one another; and (3) to reduce the modeling complexity of mathematical formulas in the DAG, semantic modeling based on MathML is adopted. On top of that, we present how the CE3 data are processed with CEDPS.
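    The DAG process model at the heart of such a workflow designer can be illustrated with a standard topological (Kahn-style) execution of task nodes. This is a generic sketch; the task names and the three-step chain below are hypothetical, not the actual CEDPS tasks:

```python
from collections import deque

def run_workflow(tasks, edges):
    """Execute a DAG workflow in topological (Kahn) order.

    `tasks` maps a task name to a callable taking the dict of results
    produced so far; `edges` lists (upstream, downstream) dependencies,
    mirroring a directed-acyclic-graph process model.
    """
    indegree = {t: 0 for t in tasks}
    downstream = {t: [] for t in tasks}
    for u, v in edges:
        indegree[v] += 1
        downstream[u].append(v)
    ready = deque(t for t, d in indegree.items() if d == 0)
    results = {}
    while ready:
        t = ready.popleft()
        results[t] = tasks[t](results)  # run task with upstream results
        for v in downstream[t]:
            indegree[v] -= 1
            if indegree[v] == 0:
                ready.append(v)
    if len(results) != len(tasks):
        raise ValueError("cycle detected: not a DAG")
    return results

# Hypothetical three-step pre-processing chain: ingest -> calibrate -> archive
tasks = {
    "ingest": lambda r: [1, 2, 3],
    "calibrate": lambda r: [x * 2 for x in r["ingest"]],
    "archive": lambda r: sum(r["calibrate"]),
}
results = run_workflow(tasks, [("ingest", "calibrate"), ("calibrate", "archive")])
```

    Sequence arises from chained edges; Fork and Merge correspond to a node with several downstream or several upstream edges, respectively.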

  17. Testing Theories of Recognition Memory by Predicting Performance Across Paradigms

    ERIC Educational Resources Information Center

    Smith, David G.; Duncan, Matthew J. J.

    2004-01-01

    Signal-detection theory (SDT) accounts of recognition judgments depend on the assumption that recognition decisions result from a single familiarity-based process. However, fits of a hybrid SDT model, called dual-process theory (DPT), have provided evidence for the existence of a second, recollection-based process. In 2 experiments, the authors…

  18. Water-based binary polyol process for the controllable synthesis of silver nanoparticles inhibiting human and foodborne pathogenic bacteria

    USDA-ARS?s Scientific Manuscript database

    The polyol process is a widely used strategy for producing nanoparticles from various reducible metallic precursors; however, it requires a bulk polyol liquid reaction with additional protective agents at high temperatures. Here, we report a water-based binary polyol process using low concentrations ...

  19. Social Workers' Orientation toward the Evidence-Based Practice Process: A Dutch Survey

    ERIC Educational Resources Information Center

    van der Zwet, Renske J. M.; Kolmer, Deirdre M. Beneken genaamd; Schalk, René

    2016-01-01

    Objectives: This study assesses social workers' orientation toward the evidence-based practice (EBP) process and explores which specific variables (e.g. age) are associated. Methods: Data were collected from 341 Dutch social workers through an online survey which included a Dutch translation of the EBP Process Assessment Scale (EBPPAS), along with…

  20. How Do Teachers Learn Together? A Study of School-Based Teacher Learning in China from the Perspective of Organisational Learning

    ERIC Educational Resources Information Center

    Zhang, Xiaolei; Wong, Jocelyn L. N.

    2018-01-01

    Studies of professional development have examined the influence of school-based approaches on in-service teacher learning and change but have seldom investigated teachers' job-embedded learning processes. This paper explores the dynamic processes of teacher learning in school-based settings. A qualitative comparative case study based on the…

  1. Problem Based Learning and the scientific process

    NASA Astrophysics Data System (ADS)

    Schuchardt, Daniel Shaner

    This research project was developed to inspire students to use problem based learning and the scientific process constructively to learn middle school science content. The student population in this study consisted of male and female seventh grade students. Students were presented with authentic problems connected to physical and chemical properties of matter. The intent of the study was to have students use the scientific process of examining existing knowledge, generating learning issues or questions about the problems, and then developing a course of action to research and design experiments that model resolutions to the authentic problems. It was expected that students would improve their ability to engage actively with others in a problem solving process to achieve a deeper understanding of Michigan's 7th Grade Level Content Expectations, the Next Generation Science Standards, and a scientific process. Problem based learning was statistically effective in students' learning of the scientific process, with statistically significant improvement from pretest to posttest scores. The teaching method of problem based learning was effective for seventh grade science students at Dowagiac Middle School.

  2. Ethanol precipitation for purification of recombinant antibodies.

    PubMed

    Tscheliessnig, Anne; Satzer, Peter; Hammerschmidt, Nikolaus; Schulz, Henk; Helk, Bernhard; Jungbauer, Alois

    2014-10-20

    Currently, the gold standard for the purification of recombinant humanized antibodies (rhAbs) from CHO cell culture is protein A chromatography. However, due to increasing rhAb titers, alternative methods have come into focus. A new strategy for purification of recombinant human antibodies from CHO cell culture supernatant based on cold ethanol precipitation (CEP) and CaCl2 precipitation has been developed. The method derives from cold ethanol precipitation, the process used to purify antibodies and other components from blood plasma. We demonstrate the applicability of the developed process for four different antibodies, obtaining yield and purity similar to a protein A chromatography based process. The process can be further improved by using anion-exchange chromatography in flowthrough mode (e.g., a monolith) as the last step, so that residual host cell protein is reduced to a minimum. Beyond the ethanol based process, our data also suggest that ethanol could be replaced with methanol or isopropanol. The process is suited for continuous operation. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.

  3. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  4. Striatal and Hippocampal Entropy and Recognition Signals in Category Learning: Simultaneous Processes Revealed by Model-Based fMRI

    PubMed Central

    Davis, Tyler; Love, Bradley C.; Preston, Alison R.

    2012-01-01

    Category learning is a complex phenomenon that engages multiple cognitive processes, many of which occur simultaneously and unfold dynamically over time. For example, as people encounter objects in the world, they simultaneously engage processes to determine their fit with current knowledge structures, gather new information about the objects, and adjust their representations to support behavior in future encounters. Many techniques that are available to understand the neural basis of category learning assume that the multiple processes that subserve it can be neatly separated between different trials of an experiment. Model-based functional magnetic resonance imaging offers a promising tool to separate multiple, simultaneously occurring processes and bring the analysis of neuroimaging data more in line with category learning’s dynamic and multifaceted nature. We use model-based imaging to explore the neural basis of recognition and entropy signals in the medial temporal lobe and striatum that are engaged while participants learn to categorize novel stimuli. Consistent with theories suggesting a role for the anterior hippocampus and ventral striatum in motivated learning in response to uncertainty, we find that activation in both regions correlates with a model-based measure of entropy. Simultaneously, separate subregions of the hippocampus and striatum exhibit activation correlated with a model-based recognition strength measure. Our results suggest that model-based analyses are exceptionally useful for extracting information about cognitive processes from neuroimaging data. Models provide a basis for identifying the multiple neural processes that contribute to behavior, and neuroimaging data can provide a powerful test bed for constraining and testing model predictions. PMID:22746951

  5. Design and Simulation of Material-Integrated Distributed Sensor Processing with a Code-Based Agent Platform and Mobile Multi-Agent Systems

    PubMed Central

    Bosse, Stefan

    2015-01-01

    Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks into simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code that stores both the control and the data state of an agent, which is a novel approach. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating between nodes in the network. The program code is a self-contained unit (a container) that embeds the agent data, the initialization instructions, and the ATG behavior implementation. The microchip agent processing platform used for executing the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to small agent program code, low system complexity, and high system performance. The agent processing is token-queue-based, similar to Petri nets. The agent platform can also be implemented in software, offering compatibility at the operational and code levels and supporting agent processing in strongly heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level using agent-based simulation techniques. PMID:25690550
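    The activity-transition graph behavior model can be sketched as a tiny interpreter: each activity runs an action on the agent's data state, and a transition function picks the next activity from that state. This is an illustrative stand-in (function and field names are hypothetical), not the paper's code-morphing stack-machine implementation:

```python
def run_agent(atg, start, data, max_steps=100):
    """Run one agent whose behavior is an activity-transition graph (ATG).

    `atg` maps an activity name to an (action, transition) pair:
    `action` mutates the agent's data state, and `transition` selects
    the next activity from that state (None terminates the agent).
    """
    activity = start
    for _ in range(max_steps):
        if activity is None:
            return data
        action, transition = atg[activity]
        action(data)
        activity = transition(data)
    raise RuntimeError("agent did not terminate within max_steps")

# Hypothetical sensing agent: sample until three readings, then aggregate
atg = {
    "sample": (lambda d: d["readings"].append(d["next"]()),
               lambda d: "sample" if len(d["readings"]) < 3 else "aggregate"),
    "aggregate": (lambda d: d.update(mean=sum(d["readings"]) / len(d["readings"])),
                  lambda d: None),
}
data = {"readings": [], "next": iter([4, 5, 6]).__next__}
result = run_agent(atg, "sample", data)
```

    Because the whole behavior is data (a dict of activities), an agent could in principle rewrite its own graph at runtime, which is the intuition behind the code morphing described in the abstract.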

  6. Design and simulation of material-integrated distributed sensor processing with a code-based agent platform and mobile multi-agent systems.

    PubMed

    Bosse, Stefan

    2015-02-16

    Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks into simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code that stores both the control and the data state of an agent, which is a novel approach. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating between nodes in the network. The program code is a self-contained unit (a container) that embeds the agent data, the initialization instructions, and the ATG behavior implementation. The microchip agent processing platform used for executing the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to small agent program code, low system complexity, and high system performance. The agent processing is token-queue-based, similar to Petri nets. The agent platform can also be implemented in software, offering compatibility at the operational and code levels and supporting agent processing in strongly heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level using agent-based simulation techniques.

  7. Technology for the product and process data base

    NASA Technical Reports Server (NTRS)

    Barnes, R. D.

    1984-01-01

    The computerized product and process data base is increasingly recognized to be the cornerstone component of an overall system aimed at the integrated automation of the industrial processes of a given company or enterprise. The technology needed to support these more effective computer integrated design and manufacturing methods, especially the concept of 3-D computer-sensible product definitions rather than engineering drawings, is not fully available and rationalized. Progress is being made, however, in bridging this technology gap with concentration on the modeling of sophisticated information and data structures, high-performance interactive user interfaces and comprehensive tools for managing the resulting computerized product definition and process data base.

  8. Design of virtual simulation experiment based on key events

    NASA Astrophysics Data System (ADS)

    Zhong, Zheng; Zhou, Dongbo; Song, Lingxiu

    2018-06-01

    Considering the complex content of virtual simulation experiments and their lack of guidance, the key event technology from VR narrative theory was introduced to enhance the fidelity and vividness of the experimental process. Based on VR narrative technology, an event transition structure was designed to meet the needs of the experimental operation process, and an interactive event processing model was used to generate key events in the interactive scene. The experiment "margin value of bees foraging," based on biological morphology, was taken as an example, and its many objects, behaviors, and other contents were reorganized. The results show that this method can enhance the user's experience and ensure that the experimental process is complete and effective.

  9. Light Weight Biomorphous Cellular Ceramics from Cellulose Templates

    NASA Technical Reports Server (NTRS)

    Singh, Mrityunjay; Yee, Bo-Moon; Gray, Hugh R. (Technical Monitor)

    2003-01-01

    Biomorphous ceramics are a new class of materials that can be fabricated from cellulose templates derived from natural biopolymers. These biopolymers are abundantly available in nature and are produced by photosynthesis. Wood cellulose derived carbon templates have three-dimensional interconnectivity. A wide variety of non-oxide and oxide based ceramics have been fabricated by template conversion using infiltration and reaction-based processes. The cellular anatomy of the cellulose templates plays a key role in determining the processing parameters (pyrolysis, infiltration conditions, etc.) and the resulting ceramic materials. The processing approach, microstructure, and mechanical properties of the biomorphous cellular ceramics (silicon carbide and oxide based) are discussed.

  10. Process simulation of modified dry grind ethanol plant with recycle of pretreated and enzymatically hydrolyzed distillers' grains.

    PubMed

    Kim, Youngmi; Mosier, Nathan; Ladisch, Michael R

    2008-08-01

    Distillers' grains (DG), a co-product of a dry grind ethanol process, is an excellent source of supplemental proteins in livestock feed. Studies have shown that, due to its high polymeric sugar contents and ease of hydrolysis, the distillers' grains have potential as an additional source of fermentable sugars for ethanol fermentation. The benefit of processing the distillers' grains to extract fermentable sugars lies in an increased ethanol yield without significant modification in the current dry grind technology. Three different potential configurations of process alternatives in which pretreated and hydrolyzed distillers' grains are recycled for an enhanced overall ethanol yield are proposed and discussed in this paper based on the liquid hot water (LHW) pretreatment of distillers' grains. Possible limitations of each proposed process are also discussed. This paper presents a compositional analysis of distillers' grains, as well as a simulation of the modified dry grind processes with recycle of distillers' grains. Simulated material balances for the modified dry grind processes are established based on the base case assumptions. These balances are compared to the conventional dry grind process in terms of ethanol yield, compositions of its co-products, and accumulation of fermentation inhibitors. Results show that 14% higher ethanol yield is achievable by processing and hydrolyzing the distillers' grains for additional fermentable sugars, as compared to the conventional dry grind process. Accumulation of fermentation by-products and inhibitory components in the proposed process is predicted to be 2-5 times higher than in the conventional dry grind process. The impact of fermentation inhibitors is reviewed and discussed. The final eDDGS (enhanced dried distillers' grains) from the modified processes has 30-40% greater protein content per mass than DDGS, and its potential as a value-added process is also analyzed. 
While the case studies used to illustrate the process simulation are based on LHW pretreated DG, the process simulation itself provides a framework for evaluation of the impact of other pretreatments.

  11. Process Evaluation for a Prison-based Substance Abuse Program.

    ERIC Educational Resources Information Center

    Staton, Michele; Leukefeld, Carl; Logan, T. K.; Purvis, Rick

    2000-01-01

    Presents findings from a process evaluation conducted in a prison-based substance abuse program in Kentucky. Discusses key components in the program, including a detailed program description, modifications in planned treatment strategies, program documentation, and perspectives of staff and clients. Findings suggest that prison-based programs have…

  12. DESTRUCTION OF PAHS AND PCBS IN WATER USING SULFATE RADICAL-BASED CATALYTIC ADVANCED OXIDATION PROCESSES

    EPA Science Inventory

    A new class of advanced oxidation processes (AOPs) based on sulfate radicals is being tested for the degradation of polycyclic aromatic hydrocarbons (PAHs) and polychlorinated biphenyls (PCBs) in aqueous solution. These AOPs are based on the generation of sulfate radicals through...

  13. SULFATE RADICAL-BASED FERROUS-PEROXYMONOSULFATE OXIDATIVE SYSTEM FOR PCBs DEGRADATION IN AQUEOUS AND SEDIMENT SYSTEMS

    EPA Science Inventory

    Polychlorinated biphenyls (PCBs) in the environment pose long-term risk to public health because of their persistent and toxic nature. This study investigates the degradation of PCBs using sulfate radical-based advanced oxidation processes (SR-AOPs). These processes are based o...

  14. An Overview of Computer-Based Natural Language Processing.

    ERIC Educational Resources Information Center

    Gevarter, William B.

    Computer-based Natural Language Processing (NLP) is the key to enabling humans and their computer-based creations to interact with machines using natural languages (English, Japanese, German, etc.) rather than formal computer languages. NLP is a major research area in the fields of artificial intelligence and computational linguistics. Commercial…

  15. Performance-Based Assessment: An Alternative Assessment Process for Young Gifted Children.

    ERIC Educational Resources Information Center

    Hafenstein, Norma Lu; Tucker, Brooke

    Performance-based assessment provides an alternative identification method for young gifted children. A performance-based identification process was developed and implemented to select three-, four-, and five-year-old children for inclusion in a school for gifted children. Literature regarding child development, characteristics of young gifted…

  16. Community Leadership through Community-Based Programming: The Role of the Community College.

    ERIC Educational Resources Information Center

    Boone, Edgar J.; And Others

    Organized around 15 tasks involved in the community-based programming (CBP) process, this book provides practical, field-tested guidance on successfully implementing CBP in community colleges. Following prefatory materials, the following chapters are provided: (1) "An Introduction to the Community-Based Programming Process" (Edgar J.…

  17. Dynamic Approaches to Language Processing

    ERIC Educational Resources Information Center

    Srinivasan, Narayanan

    2007-01-01

    Symbolic rule-based approaches have been a preferred way to study language and cognition. Dissatisfaction with rule-based approaches in the 1980s led to alternative approaches to the study of language, the most notable being the dynamic approaches to language processing. Dynamic approaches provide a significant alternative by not being rule-based and…

  18. Maximizing the Impact of Program Evaluation: A Discrepancy-Based Process for Educational Program Evaluation.

    ERIC Educational Resources Information Center

    Cantor, Jeffrey A.

    This paper describes a formative/summative process for educational program evaluation, which is appropriate for higher education programs and is based on M. Provus' Discrepancy Evaluation Model and the principles of instructional design. The Discrepancy Based Methodology for Educational Program Evaluation facilitates systematic and detailed…

  19. Reasoning in explanation-based decision making.

    PubMed

    Pennington, N; Hastie, R

    1993-01-01

    A general theory of explanation-based decision making is outlined and the multiple roles of inference processes in the theory are indicated. A typology of formal and informal inference forms, originally proposed by Collins (1978a, 1978b), is introduced as an appropriate framework to represent inferences that occur in the overarching explanation-based process. Results from the analysis of verbal reports of decision processes are presented to demonstrate the centrality and systematic character of reasoning in a representative legal decision-making task.

  20. Usalpharma: A Cloud-Based Architecture to Support Quality Assurance Training Processes in Health Area Using Virtual Worlds

    PubMed Central

    García-Peñalvo, Francisco J.; Pérez-Blanco, Jonás Samuel; Martín-Suárez, Ana

    2014-01-01

    This paper discusses how cloud-based architectures can extend and enhance the functionality of training environments based on virtual worlds and how, from this cloud perspective, we can support the analysis of training processes in the health area, specifically training processes in quality assurance for pharmaceutical laboratories. We present a tool for data retrieval and analysis that supports knowledge discovery from the events occurring inside the virtual worlds. PMID:24778593

  1. Memory and learning behaviors mimicked in nanogranular SiO2-based proton conductor gated oxide-based synaptic transistors

    NASA Astrophysics Data System (ADS)

    Wan, Chang Jin; Zhu, Li Qiang; Zhou, Ju Mei; Shi, Yi; Wan, Qing

    2013-10-01

    In neuroscience, signal processing, memory and learning function are established in the brain by modifying ionic fluxes in neurons and synapses. Emulation of memory and learning behaviors of biological systems by nanoscale ionic/electronic devices is highly desirable for building neuromorphic systems or even artificial neural networks. Here, novel artificial synapses based on junctionless oxide-based protonic/electronic hybrid transistors gated by nanogranular phosphorus-doped SiO2-based proton-conducting films are fabricated on glass substrates by a room-temperature process. Short-term memory (STM) and long-term memory (LTM) are mimicked by tuning the pulse gate voltage amplitude. The LTM process in such an artificial synapse is due to the proton-related interfacial electrochemical reaction. Our results are highly desirable for building future neuromorphic systems or even artificial networks via electronic elements. Electronic supplementary information (ESI) available. See DOI: 10.1039/c3nr02987e

  2. The Effect of Ultrasonic Additive Manufacturing on Integrated Printed Electronic Conductors

    NASA Astrophysics Data System (ADS)

    Bournias-Varotsis, Alkaios; Wang, Shanda; Hutt, David; Engstrøm, Daniel S.

    2018-07-01

    Ultrasonic additive manufacturing (UAM) is a low temperature manufacturing method capable of embedding printed electronics in metal components. The effect of UAM processing on the resistivity of conductive tracks printed with five different conductive pastes based on silver, copper or carbon flakes/particles in either a thermoplastic or thermoset filler binder are investigated. For all but the carbon-based paste, the resistivity changed linearly with the UAM energy input. After UAM processing, a resistivity increase of more than 150 times was recorded for the copper based thermoset paste. The silver based pastes showed a resistivity increase of between 1.1 and 50 times from their initial values. The carbon-based paste showed no change in resistivity after UAM processing. Focussed ion beam microstructure analysis of the printed conductive tracks before and after UAM processing showed that the silver particles and flakes in at least one of the pastes partly dislodged from their thermoset filler creating voids, thereby increasing the resistivity, whereas the silver flakes in a thermoplastic filler did not dislodge due to material flow of the polymer binder. The lowest resistivity (8 × 10-5 Ω cm) after UAM processing was achieved for a thermoplastic paste with silver flakes at low UAM processing energy.

  3. The Effect of Ultrasonic Additive Manufacturing on Integrated Printed Electronic Conductors

    NASA Astrophysics Data System (ADS)

    Bournias-Varotsis, Alkaios; Wang, Shanda; Hutt, David; Engstrøm, Daniel S.

    2018-03-01

    Ultrasonic additive manufacturing (UAM) is a low temperature manufacturing method capable of embedding printed electronics in metal components. The effect of UAM processing on the resistivity of conductive tracks printed with five different conductive pastes based on silver, copper or carbon flakes/particles in either a thermoplastic or thermoset filler binder is investigated. For all but the carbon-based paste, the resistivity changed linearly with the UAM energy input. After UAM processing, a resistivity increase of more than 150 times was recorded for the copper-based thermoset paste. The silver-based pastes showed a resistivity increase of between 1.1 and 50 times from their initial values. The carbon-based paste showed no change in resistivity after UAM processing. Focussed ion beam microstructure analysis of the printed conductive tracks before and after UAM processing showed that the silver particles and flakes in at least one of the pastes partly dislodged from their thermoset filler, creating voids and thereby increasing the resistivity, whereas the silver flakes in a thermoplastic filler did not dislodge due to material flow of the polymer binder. The lowest resistivity (8 × 10⁻⁵ Ω cm) after UAM processing was achieved for a thermoplastic paste with silver flakes at low UAM processing energy.

  4. Reconfigurable environmentally adaptive computing

    NASA Technical Reports Server (NTRS)

    Coxe, Robin L. (Inventor); Galica, Gary E. (Inventor)

    2008-01-01

    Described are methods and apparatus, including computer program products, for reconfigurable environmentally adaptive computing technology. An environmental signal representative of an external environmental condition is received. A processing configuration is automatically selected, based on the environmental signal, from a plurality of processing configurations. A reconfigurable processing element is reconfigured to operate according to the selected processing configuration. In some examples, the environmental condition is detected and the environmental signal is generated based on the detected condition.
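The selection logic the abstract describes (environmental signal in, processing configuration out) can be sketched as follows. The configuration names, sensor readings, and thresholds here are illustrative assumptions, not details from the patent:

```python
# Sketch of environmentally adaptive configuration selection.
# Configuration names and thresholds are invented for illustration.

from dataclasses import dataclass

@dataclass
class Config:
    name: str
    clock_mhz: int

# Candidate processing configurations (hypothetical).
CONFIGS = {
    "low_power": Config("low_power", 100),
    "nominal":   Config("nominal", 400),
    "burst":     Config("burst", 800),
}

def select_config(temperature_c: float, radiation_flux: float) -> Config:
    """Pick a processing configuration from environmental readings
    (illustrative rules standing in for the patented selection step)."""
    if radiation_flux > 1e3 or temperature_c > 85.0:
        return CONFIGS["low_power"]   # de-rate under harsh conditions
    if temperature_c < 40.0:
        return CONFIGS["burst"]       # thermal headroom available
    return CONFIGS["nominal"]

print(select_config(25.0, 10.0).name)
print(select_config(90.0, 10.0).name)
```

The reconfigurable element would then be reprogrammed to match the returned configuration; the lookup itself is the "automatic selection" step.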

  5. Low-SWaP coincidence processing for Geiger-mode LIDAR video

    NASA Astrophysics Data System (ADS)

    Schultz, Steven E.; Cervino, Noel P.; Kurtz, Zachary D.; Brown, Myron Z.

    2015-05-01

    Photon-counting Geiger-mode lidar detector arrays provide a promising approach for producing three-dimensional (3D) video at full motion video (FMV) data rates, resolution, and image size from long ranges. However, the coincidence processing required to filter raw photon counts is computationally expensive, generally requiring significant size, weight, and power (SWaP), as well as time. In this paper, we describe a laboratory test-bed developed to assess the feasibility of low-SWaP, real-time processing for 3D FMV based on Geiger-mode lidar. First, we examine a design based on field programmable gate arrays (FPGA) and demonstrate proof-of-concept results. Then we examine a design based on a first-of-its-kind embedded graphical processing unit (GPU) and compare its performance with the FPGA. Results indicate the feasibility of real-time Geiger-mode lidar processing for 3D FMV and also suggest utility for real-time onboard processing for mapping lidar systems.
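A minimal sketch of the coincidence-processing step: detections from many pulses are histogrammed by range, and only bins with enough coincident counts survive as surface returns. The bin width, threshold, and simulated photon stream are assumptions for illustration, not parameters from the paper:

```python
# Coincidence processing sketch: correlated signal photons pile up in one
# range bin, while uncorrelated noise photons spread thinly everywhere.

import numpy as np

rng = np.random.default_rng(0)

N_PULSES = 200
BIN_M = 0.5            # range-bin width (assumed)
MAX_RANGE_M = 100.0
THRESHOLD = 30         # minimum coincident counts to accept a bin (assumed)

# Simulated data: a true surface at 42.3 m detected on ~40% of pulses,
# buried in uniformly distributed noise photons.
signal = 42.3 + rng.normal(0.0, 0.1, size=int(0.4 * N_PULSES))
noise = rng.uniform(0.0, MAX_RANGE_M, size=5 * N_PULSES)
detections = np.concatenate([signal, noise])

bins = np.arange(0.0, MAX_RANGE_M + BIN_M, BIN_M)
counts, edges = np.histogram(detections, bins=bins)

# Keep only bins whose counts exceed the coincidence threshold.
surface_bins = edges[:-1][counts >= THRESHOLD]
print("surface detected near:", surface_bins)
```

Even this toy version shows why the step is expensive at FMV rates: every pixel of every frame needs its own histogram and threshold test, which is what motivates the FPGA and embedded-GPU designs.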

  6. Enhancing Users' Participation in Business Process Modeling through Ontology-Based Training

    NASA Astrophysics Data System (ADS)

    Macris, A.; Malamateniou, F.; Vassilacopoulos, G.

    Successful business process design requires active participation of users who are familiar with organizational activities and business process modelling concepts. Hence, there is a need to provide users with reusable, flexible, agile and adaptable training material in order to enable them to instil their knowledge and expertise in business process design and automation activities. Knowledge reusability is of paramount importance in designing training material on process modelling since it enables users to participate actively in process design/redesign activities stimulated by the changing business environment. This paper presents a prototype approach for the design and use of training material that provides significant advantages to both the designer (knowledge-content reusability and semantic web enabling) and the user (semantic search, knowledge navigation and knowledge dissemination). The approach is based on externalizing domain knowledge in the form of ontology-based knowledge networks (i.e. training scenarios serving specific training needs) so that it is made reusable.

  7. Integrated controls design optimization

    DOEpatents

    Lou, Xinsheng; Neuschaefer, Carl H.

    2015-09-01

    A control system (207) for optimizing a chemical looping process of a power plant includes an optimizer (420), an income algorithm (230), a cost algorithm (225), and chemical looping process models. The process models are used to predict the process outputs from process input variables. Some of the process input and output variables are related to the income of the plant, while others are related to the cost of plant operations. The income algorithm (230) provides an income input to the optimizer (420) based on a plurality of input parameters (215) of the power plant. The cost algorithm (225) provides a cost input to the optimizer (420) based on a plurality of output parameters (220) of the power plant. The optimizer (420) determines an optimized operating parameter solution based on at least one of the income input and the cost input, and supplies the optimized operating parameter solution to the power plant.
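The optimizer structure, an income input and a cost input combined into one operating-parameter solution, can be sketched as below. The model functions are invented stand-ins for the plant's process models, and grid search stands in for whatever optimizer the patent actually uses:

```python
# Sketch of the income/cost optimization structure: the optimizer picks
# the operating parameter that maximizes net income. Both model functions
# are hypothetical placeholders for the plant's process models.

import numpy as np

def income(x):
    """Hypothetical income model: revenue rises with load, then saturates."""
    return 100.0 * (1.0 - np.exp(-x / 50.0))

def cost(x):
    """Hypothetical cost model: fuel/operations cost grows quadratically."""
    return 0.005 * x**2 + 0.2 * x

def optimize(lo=0.0, hi=100.0, n=10001):
    """Grid search for the operating parameter maximizing income - cost."""
    xs = np.linspace(lo, hi, n)
    net = income(xs) - cost(xs)
    return xs[np.argmax(net)]

x_opt = optimize()
print(f"optimized operating parameter: {x_opt:.2f}")
```

The point of the sketch is the interface, not the numbers: the optimizer only ever sees the two scalar inputs, so the process models can be swapped without changing the optimization step.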

  8. Technology development for lunar base water recycling

    NASA Technical Reports Server (NTRS)

    Schultz, John R.; Sauer, Richard L.

    1992-01-01

    This paper will review previous and ongoing work in aerospace water recycling and identify research activities required to support development of a lunar base. The development of a water recycle system for use in the life support systems envisioned for a lunar base will require considerable research work. A review of previous work on aerospace water recycle systems indicates that more efficient physical and chemical processes are needed to reduce expendable and power requirements. Development work on biological processes that can be applied to microgravity and lunar environments also needs to be initiated. Biological processes are inherently more efficient than physical and chemical processes and may be used to minimize resupply and waste disposal requirements. Processes for recovering and recycling nutrients such as nitrogen, phosphorus, and sulfur also need to be developed to support plant growth units. The development of efficient water quality monitors to be used for process control and environmental monitoring also needs to be initiated.

  9. Development of Replacements for Phoscoating Used in Forging, Extrusion and Metal Forming Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kerry Barnett

    2003-03-01

    Many forging, extrusion, heading and other metal forming processes use graphite-based lubricants, phosphate coatings, and other potentially hazardous or harmful substances to improve the tribology of the metal forming process. The application of phosphate-based coatings has long been studied to determine if other synthetic "clean" lubricants could provide the same degree of protection afforded by phoscoatings and its formulations. So far, none meets the cost and performance objectives provided by phoscoatings as a general aid to the metal forming industry. Inasmuch as phoscoatings and graphite have replaced lead-based lubricants, the metal forming industry has had previous experience with a legislated requirement to change processes. However, without a proactive approach to phoscoating replacement, many metal forming processes could find themselves without a cost effective tribology material necessary for the metal forming process.

  10. Signal Processing Studies of a Simulated Laser Doppler Velocimetry-Based Acoustic Sensor

    DTIC Science & Technology

    1990-10-17

    investigated using spectral correlation methods. Results indicate that it may be possible to extend demonstrated LDV-based acoustic sensor sensitivities using higher order processing techniques. (Author)

  11. Characterizing Process-Based River and Floodplain Restoration Projects on Federal Lands in Oregon, and Assessing Catalysts and Barriers to Implementation

    NASA Astrophysics Data System (ADS)

    Bianco, S.; Jones, J. A.; Gosnell, H.

    2017-12-01

    Process-based restoration, a new approach to river and floodplain management, is being implemented on federal lands across Oregon. These management efforts are aimed at promoting key physical processes in order to improve river ecological function, create diverse habitat, and increase biological productivity for ESA-listed bull trout and spring Chinook salmon. Although the practice is being disseminated across the Pacific Northwest, it remains unclear what is driving aquatic and riparian ecosystem restoration towards this process-based approach and away from form-based methods such as Rosgen's Natural Channel Design. The technical aspects of process-based restoration have been described in the literature (e.g., Beechie et al. 2010), but little is known about the practice from a social science perspective, and few case studies exist to assess the impact of these efforts. We combine semi-structured qualitative interviews with management experts and photogrammetric analysis to better understand how complex social processes and changing ideas about aquatic ecosystems are manifesting on the ground in federal land management. This study characterizes process-based river and floodplain restoration projects on federal lands in Oregon, and identifies catalysts and barriers to its implementation. The Deer Creek Floodplain Enhancement project serves as a case study for photogrammetric analysis. To characterize long-term changes at Deer Creek, geomorphic features were mapped and classified using orthoimage mosaics developed from a time series of historic aerial photographs dating back to 1954. 3D Digital Elevation Models (3D-DEMs) were created of portions of the modified sections of Deer Creek and its floodplain immediately before and after restoration using drone-captured aerial photography and a photogrammetric technique called Structure from Motion. These 3D-DEMs have enabled extraction of first-order geomorphic variables to compare pre- and post-project conditions.
This study improves understanding of the historic range of conditions at Deer Creek, and assesses how process-based restoration activities drive short-term changes in geomorphic features, which can in turn influence complex riverine processes such as energy dissipation and sediment deposition.
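The pre/post DEM comparison described above amounts to differencing elevation grids (a "DEM of Difference"). A minimal sketch, with tiny synthetic elevation arrays standing in for the Deer Creek surfaces:

```python
# DEM differencing sketch: subtract the pre-project surface from the
# post-project surface, then sum positive cells (fill/deposition) and
# negative cells (cut/erosion). Elevations and grid size are synthetic.

import numpy as np

CELL_AREA_M2 = 1.0  # assumed 1 m grid resolution

pre = np.array([[10.0, 10.2],
                [10.1, 10.3]])   # pre-project elevations (m)
post = np.array([[10.3, 10.2],
                 [9.9, 10.3]])   # post-project elevations (m)

dod = post - pre                                  # DEM of Difference
deposition = dod[dod > 0].sum() * CELL_AREA_M2    # fill volume (m^3)
erosion = -dod[dod < 0].sum() * CELL_AREA_M2      # cut volume (m^3)

print(f"deposition: {deposition:.2f} m^3, erosion: {erosion:.2f} m^3")
```

Real workflows add an uncertainty threshold so that cells whose change is within DEM error are excluded before the volumes are summed.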

  12. Experimental Methods for Investigation of Shape Memory Based Elastocaloric Cooling Processes and Model Validation

    PubMed Central

    Schmidt, Marvin; Ullrich, Johannes; Wieczorek, André; Frenzel, Jan; Eggeler, Gunther; Schütze, Andreas; Seelecke, Stefan

    2016-01-01

    Shape Memory Alloys (SMA) using elastocaloric cooling processes have the potential to be an environmentally friendly alternative to the conventional vapor compression based cooling process. Nickel-Titanium (Ni-Ti) based alloy systems in particular show large elastocaloric effects. Furthermore, they exhibit large latent heats, which is a necessary material property for the development of an efficient solid-state based cooling process. A scientific test rig has been designed to investigate these processes and the elastocaloric effects in SMAs. The realized test rig enables independent control of an SMA's mechanical loading and unloading cycles, as well as conductive heat transfer between SMA cooling elements and a heat source/sink. The test rig is equipped with a comprehensive monitoring system capable of synchronized measurements of mechanical and thermal parameters. In addition to determining the process-dependent mechanical work, the system also enables measurement of thermal caloric aspects of the elastocaloric cooling effect through use of a high-performance infrared camera. This combination is of particular interest because it allows illustration of localization and rate effects, both important for efficient heat transfer from the medium to be cooled. The work presented describes an experimental method to identify elastocaloric material properties in different materials and sample geometries. Furthermore, the test rig is used to investigate different cooling process variations. The introduced analysis methods enable a differentiated consideration of material, process and related boundary condition influences on the process efficiency. The comparison of the experimental data with the simulation results (of a thermomechanically coupled finite element model) allows for better understanding of the underlying physics of the elastocaloric effect.
In addition, the experimental results, as well as the findings based on the simulation results, are used to improve the material properties. PMID:27168093

  13. Experimental Methods for Investigation of Shape Memory Based Elastocaloric Cooling Processes and Model Validation.

    PubMed

    Schmidt, Marvin; Ullrich, Johannes; Wieczorek, André; Frenzel, Jan; Eggeler, Gunther; Schütze, Andreas; Seelecke, Stefan

    2016-05-02

    Shape Memory Alloys (SMA) using elastocaloric cooling processes have the potential to be an environmentally friendly alternative to the conventional vapor compression based cooling process. Nickel-Titanium (Ni-Ti) based alloy systems in particular show large elastocaloric effects. Furthermore, they exhibit large latent heats, which is a necessary material property for the development of an efficient solid-state based cooling process. A scientific test rig has been designed to investigate these processes and the elastocaloric effects in SMAs. The realized test rig enables independent control of an SMA's mechanical loading and unloading cycles, as well as conductive heat transfer between SMA cooling elements and a heat source/sink. The test rig is equipped with a comprehensive monitoring system capable of synchronized measurements of mechanical and thermal parameters. In addition to determining the process-dependent mechanical work, the system also enables measurement of thermal caloric aspects of the elastocaloric cooling effect through use of a high-performance infrared camera. This combination is of particular interest because it allows illustration of localization and rate effects, both important for efficient heat transfer from the medium to be cooled. The work presented describes an experimental method to identify elastocaloric material properties in different materials and sample geometries. Furthermore, the test rig is used to investigate different cooling process variations. The introduced analysis methods enable a differentiated consideration of material, process and related boundary condition influences on the process efficiency. The comparison of the experimental data with the simulation results (of a thermomechanically coupled finite element model) allows for better understanding of the underlying physics of the elastocaloric effect.
In addition, the experimental results, as well as the findings based on the simulation results, are used to improve the material properties.

  14. Multi-layered reasoning by means of conceptual fuzzy sets

    NASA Technical Reports Server (NTRS)

    Takagi, Tomohiro; Imura, Atsushi; Ushida, Hirohide; Yamaguchi, Toru

    1993-01-01

    The real world consists of a very large number of instances of events and continuous numeric values. On the other hand, people represent and process their knowledge in terms of abstracted concepts derived from generalization of these instances and numeric values. Logic-based paradigms for knowledge representation use symbolic processing both for concept representation and inference. Their underlying assumption is that a concept can be defined precisely. However, as this assumption hardly holds for natural concepts, it follows that symbolic processing cannot deal with such concepts. Thus, symbolic processing faces fundamental problems from the practical standpoint of real-world applications. In contrast, fuzzy set theory can be viewed as a stronger and more practical notation than formal, logic-based theories because it supports both symbolic processing and numeric processing, connecting the logic-based world and the real world. In this paper, we propose multi-layered reasoning by using conceptual fuzzy sets (CFS). The general characteristics of CFS are discussed along with upper layer supervision and context dependent processing.

  15. Effect of high-pressure processing and milk on the anthocyanin composition and antioxidant capacity of strawberry-based beverages.

    PubMed

    Tadapaneni, Ravi Kiran; Banaszewski, Katarzyna; Patazca, Eduardo; Edirisinghe, Indika; Cappozzo, Jack; Jackson, Lauren; Burton-Freeman, Britt

    2012-06-13

    The present study investigated processing strategies and matrix effects on the antioxidant capacity (AC) and polyphenols (PP) content of fruit-based beverages: (1) strawberry powder (Str) + dairy, D-Str; (2) Str + water, ND-Str; (3) dairy + no Str, D-NStr. Beverages were subjected to high-temperature-short-time (HTST) and high-pressure processing (HPP). AC and PP were measured before and after processing and after a 5 week shelf-life study. Unprocessed D-Str had significantly lower AC compared to unprocessed ND-Str. Significant reductions in AC were apparent in HTST- compared to HPP-processed beverages (up to 600 MPa). PP content was significantly reduced in D-Str compared to ND-Str and in response to HPP and HTST in all beverages. After storage (5 weeks), AC and PP were reduced in all beverages compared to unprocessed and week 0 processed beverages. These findings indicate potentially negative effects of milk and processing on AC and PP of fruit-based beverages.

  16. Process modeling of an advanced NH₃ abatement and recycling technology in the ammonia-based CO₂ capture process.

    PubMed

    Li, Kangkang; Yu, Hai; Tade, Moses; Feron, Paul; Yu, Jingwen; Wang, Shujuan

    2014-06-17

    An advanced NH₃ abatement and recycling process that makes great use of the waste heat in flue gas was proposed to solve the problems of ammonia slip, NH₃ makeup, and flue gas cooling in the ammonia-based CO₂ capture process. The rigorous rate-based model, RateFrac in Aspen Plus, was thermodynamically and kinetically validated by experimental data from the open literature and CSIRO pilot trials at Munmorah Power Station, Australia, respectively. After a thorough sensitivity analysis and process improvement, the NH₃ recycling efficiency reached as high as 99.87%, and the NH₃ exhaust concentration was only 15.4 ppmv. Most importantly, the energy consumption of the NH₃ abatement and recycling system was only 59.34 kJ/kg CO₂ of electricity. The evaluation of mass balance and temperature stability shows that this NH₃ recovery process is technically effective and feasible. This process is therefore a promising candidate for industrial application.

  17. Development of a Novel Gas Pressurized Stripping Process-Based Technology for CO₂ Capture from Post-Combustion Flue Gases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Shiaoguo

    A novel Gas Pressurized Stripping (GPS) post-combustion carbon capture (PCC) process has been developed by Carbon Capture Scientific, LLC, CONSOL Energy Inc., Nexant Inc., and Western Kentucky University in this bench-scale project. The GPS-based process presents a unique approach that uses a gas pressurized technology for CO₂ stripping at an elevated pressure to overcome the energy use and other disadvantages associated with the benchmark monoethanolamine (MEA) process. The project was aimed at performing laboratory- and bench-scale experiments to prove its technical feasibility and generate process engineering and scale-up data, and conducting a techno-economic analysis (TEA) to demonstrate its energy use and cost competitiveness over the MEA process. To meet project goals and objectives, a combination of experimental work, process simulation, and technical and economic analysis studies were applied. The project conducted individual unit lab-scale tests for major process components, including a first absorption column, a GPS column, a second absorption column, and a flasher. Computer simulations were carried out to study the GPS column behavior under different operating conditions, to optimize the column design and operation, and to optimize the GPS process for an existing and a new power plant. The vapor-liquid equilibrium data under high loading and high temperature for the selected amines were also measured. The thermal and oxidative stability of the selected solvents was also tested experimentally and presented. A bench-scale column-based unit capable of achieving at least 90% CO₂ capture from a nominal 500 SLPM coal-derived flue gas slipstream was designed and built. This integrated, continuous, skid-mounted GPS system was tested using real flue gas from a coal-fired boiler at the National Carbon Capture Center (NCCC).
The technical challenges of the GPS technology in stability, corrosion, and foaming of selected solvents, and environmental, health and safety risks have been addressed through experimental tests, consultation with vendors and engineering analysis. Multiple rounds of TEA were performed to improve the GPS-based PCC process design and operation, and to compare the energy use and cost performance of a nominal 550-MWe supercritical pulverized coal (PC) plant among the DOE/NETL report Case 11 (the PC plant without CO₂ capture), the DOE/NETL report Case 12 (the PC plant with benchmark MEA-based PCC), and the PC plant using GPS-based PCC. The results reveal that the net power produced in the PC plant with GPS-based PCC is 647 MWe, greater than that of Case 12 (550 MWe). The 20-year LCOE for the PC plant with GPS-based PCC is 97.4 mills/kWh, or 152% of that of Case 11, which is also 23% less than that of Case 12. These results demonstrate that the GPS-based PCC process is energy-efficient and cost-effective compared with the benchmark MEA process.

  18. Beyond Depression: Towards a Process-Based Approach to Research, Diagnosis, and Treatment.

    PubMed

    Forgeard, Marie J C; Haigh, Emily A P; Beck, Aaron T; Davidson, Richard J; Henn, Fritz A; Maier, Steven F; Mayberg, Helen S; Seligman, Martin E P

    2011-12-01

    Despite decades of research on the etiology and treatment of depression, a significant proportion of the population is affected by the disorder, fails to respond to treatment and is plagued by relapse. Six prominent scientists, Aaron Beck, Richard Davidson, Fritz Henn, Steven Maier, Helen Mayberg, and Martin Seligman, gathered to discuss the current state of scientific knowledge on depression, and in particular on the basic neurobiological and psychopathological processes at play in the disorder. These general themes were addressed: 1) the relevance of learned helplessness as a basic process involved in the development of depression; 2) the limitations of our current taxonomy of psychological disorders; 3) the need to work towards a psychobiological process-based taxonomy; and 4) the clinical implications of implementing such a process-based taxonomy.

  19. Enzyme-based processing of soybean carbohydrate: Recent developments and future prospects.

    PubMed

    Al Loman, Abdullah; Ju, Lu-Kwang

    2017-11-01

    Soybean is well known for its high-value oil and protein. Carbohydrate is, however, an underutilized major component, representing 26-30% (w/w) of the dried bean. The complex soybean carbohydrate is not easily hydrolyzable and can cause indigestibility when included in food and feed. Enzymes can be used to hydrolyze the carbohydrate for improving soybean processing and value of soybean products. Here the enzyme-based processing developed for the following purposes is reviewed: hydrolysis of different carbohydrate-rich by/products from soybean processing, improvement of soybean oil extraction, and increase of nutritional value of soybean-based food and animal feed. Once hydrolyzed into fermentable sugars, soybean carbohydrate can find more value-added applications and further improve the overall economics of soybean processing. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Development of solution-processed nanowire composites for opto-electronics

    DOE PAGES

    Ginley, David S.; Aggarwal, Shruti; Singh, Rajiv; ...

    2016-12-20

    Here, silver nanowire-based contacts represent one of the major new directions in transparent contacts for opto-electronic devices, with the added advantage that they can have Indium-Tin-Oxide-like properties at substantially reduced processing temperatures and without the use of vacuum-based processing. However, nanowires alone often do not adhere well to the substrate or other film interfaces, even after a relatively high-temperature anneal, and unencapsulated nanowires show environmental degradation at high temperature and humidity. Here we report on the development of ZnO/Ag-nanowire composites that have sheet resistance below 10 Ω/sq and >90% transmittance from a solution-based process with process temperatures below 200 °C. These films have significant application potential in photovoltaics and displays.

  1. Beyond Depression: Towards a Process-Based Approach to Research, Diagnosis, and Treatment

    PubMed Central

    Forgeard, Marie J. C.; Haigh, Emily A. P.; Beck, Aaron T.; Davidson, Richard J.; Henn, Fritz A.; Maier, Steven F.; Mayberg, Helen S.; Seligman, Martin E. P.

    2012-01-01

    Despite decades of research on the etiology and treatment of depression, a significant proportion of the population is affected by the disorder, fails to respond to treatment and is plagued by relapse. Six prominent scientists, Aaron Beck, Richard Davidson, Fritz Henn, Steven Maier, Helen Mayberg, and Martin Seligman, gathered to discuss the current state of scientific knowledge on depression, and in particular on the basic neurobiological and psychopathological processes at play in the disorder. These general themes were addressed: 1) the relevance of learned helplessness as a basic process involved in the development of depression; 2) the limitations of our current taxonomy of psychological disorders; 3) the need to work towards a psychobiological process-based taxonomy; and 4) the clinical implications of implementing such a process-based taxonomy. PMID:22509072

  2. CropEx Web-Based Agricultural Monitoring and Decision Support

    NASA Technical Reports Server (NTRS)

    Harvey. Craig; Lawhead, Joel

    2011-01-01

    CropEx is a Web-based agricultural Decision Support System (DSS) that monitors changes in crop health over time. It is designed to be used by a wide range of both public and private organizations, including individual producers and regional government offices with a vested interest in tracking vegetation health. A database and data management system automatically retrieves and ingests data for the area of interest; a second database stores results of the processing and supports the DSS. The processing engine will allow server-side analysis of imagery with support for image sub-setting and a set of core raster operations for image classification, creation of vegetation indices, and change detection. The system includes the Web-based (CropEx) interface, data ingestion system, server-side processing engine, and a database processing engine. It contains a Web-based interface that has multi-tiered security profiles for multiple users. The interface provides the ability to identify areas of interest to specific users, user profiles, and methods of processing and data types for selected or created areas of interest. A compilation of programs is used to ingest available data into the system, classify that data, profile that data for quality, and make data available to the processing engine immediately upon the data's availability to the system (near real time). The processing engine consists of methods and algorithms used to process the data in a real-time fashion without copying, storing, or moving the raw data. The engine makes results available to the database processing engine for storage and further manipulation. The database processing engine ingests data from the image processing engine, distills those results into numerical indices, and stores each index for an area of interest.
This process happens each time new data is ingested and processed for the area of interest, and upon subsequent database entries, the database processing engine qualifies each value for each area of interest and conducts a logical processing of results indicating when and where thresholds are exceeded. Reports are provided at regular, operator-determined intervals that include variances from thresholds and links to view raw data for verification, if necessary. The technology and method of development allow the code base to easily be modified for varied use in the real-time and near-real-time processing environments. In addition, the final product will be demonstrated as a means for rapid draft assessment of imagery.
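The threshold logic of the database processing engine (qualify each new index value per area of interest, flag exceedances, report variances) might look like the following sketch. The index name, threshold, and alert fields are assumptions, not details of the actual CropEx schema:

```python
# Sketch of per-area threshold qualification: each newly ingested
# vegetation-index value is stored and checked against an operator-set
# bound; crossings produce an alert carrying the variance.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AreaOfInterest:
    name: str
    ndvi_min: float                    # operator-determined threshold (assumed)
    history: list = field(default_factory=list)

    def ingest(self, ndvi: float) -> Optional[dict]:
        """Store the new value; return an alert dict if the threshold is crossed."""
        self.history.append(ndvi)
        if ndvi < self.ndvi_min:
            return {"area": self.name,
                    "ndvi": ndvi,
                    "variance": ndvi - self.ndvi_min}
        return None

field_a = AreaOfInterest("field_a", ndvi_min=0.4)
for value in (0.62, 0.55, 0.31):       # simulated index time series
    alert = field_a.ingest(value)
    if alert:
        print(alert)
```

In the described system this check runs on every ingestion cycle, and the accumulated alerts feed the interval reports with their variances from thresholds.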

  3. Embodied simulation in exposure-based therapies for posttraumatic stress disorder—a possible integration of cognitive behavioral theories, neuroscience, and psychoanalysis

    PubMed Central

    Peri, Tuvia; Gofman, Mordechai; Tal, Shahar; Tuval-Mashiach, Rivka

    2015-01-01

    Exposure to the trauma memory is the common denominator of most evidence-based interventions for posttraumatic stress disorder (PTSD). Although exposure-based therapies aim to change associative learning networks and negative cognitions related to the trauma memory, emotional interactions between patient and therapist have not been thoroughly considered in past evaluations of exposure-based therapy. This work focuses on recent discoveries of the mirror-neuron system and the theory of embodied simulation (ES). These conceptualizations may add a new perspective to our understanding of change processes in exposure-based treatments for PTSD patients. It is proposed that during exposure to trauma memories, emotional responses of the patient are transferred to the therapist through ES and then mirrored back to the patient in a modulated way. This process helps to alleviate the patient's sense of loneliness and enhances his or her ability to exert control over painful, trauma-related emotional responses. ES processes may enhance the integration of clinical insights originating in psychoanalytic theories—such as holding, containment, projective identification, and emotional attunement—with cognitive behavioral theories of learning processes in the alleviation of painful emotional responses aroused by trauma memories. These processes are demonstrated through a clinical vignette from an exposure-based therapy with a trauma survivor. Possible clinical implications for the importance of face-to-face relationships during exposure-based therapy are discussed. PMID:26593097

  4. Embodied simulation in exposure-based therapies for posttraumatic stress disorder-a possible integration of cognitive behavioral theories, neuroscience, and psychoanalysis.

    PubMed

    Peri, Tuvia; Gofman, Mordechai; Tal, Shahar; Tuval-Mashiach, Rivka

    2015-01-01

    Exposure to the trauma memory is the common denominator of most evidence-based interventions for posttraumatic stress disorder (PTSD). Although exposure-based therapies aim to change associative learning networks and negative cognitions related to the trauma memory, emotional interactions between patient and therapist have not been thoroughly considered in past evaluations of exposure-based therapy. This work focuses on recent discoveries of the mirror-neuron system and the theory of embodied simulation (ES). These conceptualizations may add a new perspective to our understanding of change processes in exposure-based treatments for PTSD patients. It is proposed that during exposure to trauma memories, emotional responses of the patient are transferred to the therapist through ES and then mirrored back to the patient in a modulated way. This process helps to alleviate the patient's sense of loneliness and enhances his or her ability to exert control over painful, trauma-related emotional responses. ES processes may enhance the integration of clinical insights originating in psychoanalytic theories-such as holding, containment, projective identification, and emotional attunement-with cognitive behavioral theories of learning processes in the alleviation of painful emotional responses aroused by trauma memories. These processes are demonstrated through a clinical vignette from an exposure-based therapy with a trauma survivor. Possible clinical implications for the importance of face-to-face relationships during exposure-based therapy are discussed.

  5. Itô and Stratonovich integrals on compound renewal processes: the normal/Poisson case

    NASA Astrophysics Data System (ADS)

    Germano, Guido; Politi, Mauro; Scalas, Enrico; Schilling, René L.

    2010-06-01

    Continuous-time random walks, or compound renewal processes, are pure-jump stochastic processes with several applications in insurance, finance, economics and physics. Based on heuristic considerations, a definition is given for stochastic integrals driven by continuous-time random walks, which includes the Itô and Stratonovich cases. It is then shown how the definition can be used to compute these two stochastic integrals by means of Monte Carlo simulations. Our example is based on the normal compound Poisson process, which in the diffusive limit converges to the Wiener process.
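
    The two stochastic integrals described in this abstract can be approximated along a simulated path. Below is a minimal sketch (not the authors' code; the rate, horizon, and jump distribution are illustrative) for the normal compound Poisson case, integrating X dX with left-endpoint (Itô) and midpoint (Stratonovich) sums:

```python
import numpy as np

rng = np.random.default_rng(0)

def compound_poisson_path(lam, T, rng):
    """Jump times on [0, T] (Poisson, rate lam) with i.i.d. standard normal
    jump sizes: a normal compound Poisson process starting at zero."""
    n = rng.poisson(lam * T)
    times = np.sort(rng.uniform(0.0, T, size=n))
    jumps = rng.normal(0.0, 1.0, size=n)
    return times, jumps

def ito_and_stratonovich(jumps):
    """Ito and Stratonovich integrals of X dX along one pure-jump path.

    For a pure-jump process the integrals reduce to sums over jumps:
      Ito:          sum of X(t-) * dX            (left endpoint)
      Stratonovich: sum of (X(t-) + X(t))/2 * dX (midpoint)
    """
    X = np.concatenate(([0.0], np.cumsum(jumps)))
    dX = np.diff(X)
    ito = np.sum(X[:-1] * dX)
    strat = np.sum(0.5 * (X[:-1] + X[1:]) * dX)
    return ito, strat, X[-1]

times, jumps = compound_poisson_path(lam=5.0, T=10.0, rng=rng)
ito, strat, XT = ito_and_stratonovich(jumps)
# Midpoint sums telescope: the Stratonovich integral equals X(T)^2 / 2,
# while the Ito integral is smaller by half the sum of squared jumps
# (the quadratic variation), exactly as in the diffusive (Wiener) limit.
print(ito, strat, XT**2 / 2)
```

    For this integrand the identity above gives a built-in sanity check on any Monte Carlo implementation.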

  6. Process Based on SysML for New Launchers System and Software Developments

    NASA Astrophysics Data System (ADS)

    Hiron, Emmanuel; Miramont, Philippe

    2010-08-01

    The purpose of this paper is to present the Astrium-ST engineering process based on SysML. This process is currently being set up in the frame of joint CNES/Astrium-ST R&T studies related to the Ariane 5 electrical system and flight software modelling. The tool used to set up this process is Rhapsody release 7.3 from IBM Software [1]. This process focuses on the system engineering phase dedicated to software, with the objective of generating both system documents (sequential system design and flight control) and software specifications.

  7. Social Models: Blueprints or Processes?

    ERIC Educational Resources Information Center

    Little, Graham R.

    1981-01-01

    Discusses the nature and implications of two different models for societal planning: (1) the problem-solving process approach based on Karl Popper; and (2) the goal-setting "blueprint" approach based on Karl Marx. (DC)

  8. Cultural adaptation process for international dissemination of the strengthening families program.

    PubMed

    Kumpfer, Karol L; Pinyuchon, Methinin; Teixeira de Melo, Ana; Whiteside, Henry O

    2008-06-01

    The Strengthening Families Program (SFP) is an evidence-based family skills training intervention developed and found efficacious for substance abuse prevention by U.S. researchers in the 1980s. In the 1990s, a cultural adaptation process was developed to transport SFP for effectiveness trials with diverse populations (African, Hispanic, Asian, Pacific Islander, and Native American). Since 2003, SFP has been culturally adapted for use in 17 countries. This article reviews the SFP theory and research and a recommended cultural adaptation process. Challenges in international dissemination of evidence-based programs (EBPs) are discussed based on the results of U.N. and U.S. governmental initiatives to transport EBP family interventions to developing countries. The technology transfer and quality assurance system are described, including the language translation and cultural adaptation process for materials development, staff training, and on-site and online Web-based supervision, technical assistance, and evaluation services to assure quality implementation and process-evaluation feedback for improvements.

  9. Smartphone-based noise mapping: Integrating sound level meter app data into the strategic noise mapping process.

    PubMed

    Murphy, Enda; King, Eoin A

    2016-08-15

    The strategic noise mapping process of the EU has now been ongoing for more than ten years. However, despite the fact that a significant volume of research has been conducted on the process and related issues there has been little change or innovation in how relevant authorities and policymakers are conducting the process since its inception. This paper reports on research undertaken to assess the possibility for smartphone-based noise mapping data to be integrated into the traditional strategic noise mapping process. We compare maps generated using the traditional approach with those generated using smartphone-based measurement data. The advantage of the latter approach is that it has the potential to remove the need for exhaustive input data into the source calculation model for noise prediction. In addition, the study also tests the accuracy of smartphone-based measurements against simultaneous measurements taken using traditional sound level meters in the field. Copyright © 2016 Elsevier B.V. All rights reserved.
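
    When pooling smartphone samples with sound level meter readings for a noise map, decibel levels must be combined by energetic averaging rather than arithmetic averaging. A minimal sketch of that standard computation (the function name and sample values are illustrative, not from the paper):

```python
import math

def leq(db_levels):
    """Equivalent continuous sound level: energetic (power) average of
    decibel readings, i.e. 10*log10 of the mean of 10^(L/10)."""
    powers = [10.0 ** (level / 10.0) for level in db_levels]
    return 10.0 * math.log10(sum(powers) / len(powers))

# Two samples at 60 dB and one at 70 dB average to about 66 dB,
# not to the arithmetic mean of 63.3 dB.
print(round(leq([60.0, 60.0, 70.0]), 1))  # -> 66.0
```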

  10. Chronic Motivational State Interacts with Task Reward Structure in Dynamic Decision-Making

    PubMed Central

    Cooper, Jessica A.; Worthy, Darrell A.; Maddox, W. Todd

    2015-01-01

    Research distinguishes between a habitual, model-free system motivated toward immediately rewarding actions, and a goal-directed, model-based system motivated toward actions that improve future state. We examined the balance of processing in these two systems during state-based decision-making. We tested a regulatory fit hypothesis (Maddox & Markman, 2010) that predicts that global trait motivation affects the balance of habitual- vs. goal-directed processing but only through its interaction with the task framing as gain-maximization or loss-minimization. We found support for the hypothesis that a match between an individual’s chronic motivational state and the task framing enhances goal-directed processing, and thus state-based decision-making. Specifically, chronic promotion-focused individuals under gain-maximization and chronic prevention-focused individuals under loss-minimization both showed enhanced state-based decision-making. Computational modeling indicates that individuals in a match between global chronic motivational state and local task reward structure engaged more goal-directed processing, whereas those in a mismatch engaged more habitual processing. PMID:26520256

  11. An evaluation and implementation of rule-based Home Energy Management System using the Rete algorithm.

    PubMed

    Kawakami, Tomoya; Fujita, Naotaka; Yoshihisa, Tomoki; Tsukamoto, Masahiko

    2014-01-01

    In recent years, sensors have become popular, and Home Energy Management Systems (HEMS) play an important role in saving energy without a decrease in QoL (Quality of Life). Many rule-based HEMSs have been proposed, and almost all of them assume "IF-THEN" rules. The Rete algorithm is a typical pattern-matching algorithm for IF-THEN rules. We have previously proposed a rule-based HEMS using the Rete algorithm. In the proposed system, rules for managing energy are processed by smart taps in the network, and the loads for processing rules and collecting data are distributed among the smart taps. In addition, the number of processes and the amount of collected data are reduced by processing rules based on the Rete algorithm. In this paper, we evaluated the proposed system by simulation. In the simulation environment, each rule is processed by the smart tap that relates to the action part of that rule. In addition, we implemented the proposed system as a HEMS using smart taps.
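
    The idea the Rete algorithm exploits for such IF-THEN rules, testing each condition once per fact update and caching the result so rules that share conditions reuse the work, can be sketched as follows. This is a deliberately simplified illustration (the devices, facts, and rules are invented, and a full Rete network additionally caches join results incrementally):

```python
# Simplified sketch of Rete-style matching for IF-THEN rules: each distinct
# condition is evaluated once per fact update and cached (an "alpha memory"),
# so rules sharing conditions reuse the result instead of re-testing it.

conditions = {
    "room_occupied": lambda f: f["occupancy"] > 0,
    "room_hot": lambda f: f["temp_c"] > 28,
    "daylight": lambda f: f["lux"] > 300,
}

rules = [
    (("room_occupied", "room_hot"), "turn_on_aircon"),
    (("room_occupied",), "turn_on_light"),
    (("room_occupied", "daylight"), "turn_off_light"),
]

def match(facts):
    # Alpha pass: every condition tested exactly once for this fact set.
    alpha = {name: test(facts) for name, test in conditions.items()}
    # Beta pass: a rule fires when all of its cached conditions hold.
    return [action for conds, action in rules if all(alpha[c] for c in conds)]

print(match({"occupancy": 2, "temp_c": 30, "lux": 120}))
# -> ['turn_on_aircon', 'turn_on_light']
```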

  12. MIRADS-2 Implementation Manual

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The Marshall Information Retrieval and Display System (MIRADS), a data base management system designed to provide the user with a set of generalized file capabilities, is presented. The system provides a wide variety of ways to process the contents of the data base and includes capabilities to search, sort, compute, update, and display the data. The process of creating, defining, and loading a data base is generally called the loading process. The steps in the loading process, which include (1) structuring, (2) creating, (3) defining, and (4) implementing the data base for use by MIRADS, are defined. The execution of several computer programs is required to successfully complete all steps of the loading process. As the first step in MIRADS implementation, the MIRADS Library must be established as a cataloged mass storage file; the procedure for establishing the library is given. The system is currently operational for the UNIVAC 1108 computer system utilizing the Executive Operating System. All procedures relate to the use of MIRADS on the U-1108 computer.

  13. Model-Based PAT for Quality Management in Pharmaceuticals Freeze-Drying: State of the Art

    PubMed Central

    Fissore, Davide

    2017-01-01

    Model-based process analytical technologies can be used for the in-line control and optimization of a pharmaceuticals freeze-drying process, as well as for the off-line design of the process, i.e., the identification of the optimal operating conditions. This paper aims at presenting the state of the art in this field, focusing, particularly, on three groups of systems, namely, those based on the temperature measurement (i.e., the soft sensor), on the chamber pressure measurement (i.e., the systems based on the test of pressure rise and of pressure decrease), and on the sublimation flux estimate (i.e., the tunable diode laser absorption spectroscopy and the valveless monitoring system). The application of these systems for in-line process optimization (e.g., using a model predictive control algorithm) and to get a true quality by design (e.g., through the off-line calculation of the design space of the process) is presented and discussed. PMID:28224123

  14. Neuro-estimator based GMC control of a batch reactive distillation.

    PubMed

    Prakash, K J Jithin; Patle, Dipesh S; Jana, Amiya K

    2011-07-01

    In this paper, an artificial neural network (ANN)-based nonlinear control algorithm is proposed for a simulated batch reactive distillation (RD) column. In the homogeneously catalyzed reactive process, an esterification reaction takes place for the production of ethyl acetate. The fundamental model has been derived incorporating the reaction term in the model structure of the nonreactive distillation process. The process operation is simulated at the startup phase under total reflux conditions. The open-loop process dynamics is also addressed running the batch process at the production phase under partial reflux conditions. In this study, a neuro-estimator based generic model controller (GMC), which consists of an ANN-based state predictor and the GMC law, has been synthesized. Finally, this proposed control law has been tested on the representative batch reactive distillation comparing with a gain-scheduled proportional integral (GSPI) controller and with its ideal performance (ideal GMC). Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
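
    The GMC law itself can be illustrated on a toy model. The following is a hedged sketch assuming a first-order process with known parameters and a directly measured state (the paper instead estimates unmeasured states with an ANN); the parameters and gains are arbitrary illustrative choices, not values from the column model:

```python
# Generic model control (GMC) on a toy first-order process dy/dt = -a*y + b*u.
# GMC demands the closed-loop rate dy/dt = K1*e + K2*integral(e), then inverts
# the process model to compute the control input u that delivers that rate.
a, b = 1.0, 2.0          # assumed process model parameters
K1, K2 = 4.0, 4.0        # GMC tuning gains (proportional, integral)
ysp = 1.0                # set point
dt, steps = 0.01, 2000   # Euler step and horizon (t = 20)

y, integral = 0.0, 0.0
for _ in range(steps):
    e = ysp - y
    integral += e * dt
    # dy/dt = -a*y + b*u  =>  u = (target_rate + a*y) / b
    u = (K1 * e + K2 * integral + a * y) / b
    y += (-a * y + b * u) * dt   # simulate the (here, perfectly known) process

print(round(y, 3))  # converges to the set point
```

    With K1 = K2 = 4 the closed loop is critically damped, so the output settles at the set point with no overshoot; in the paper this ideal behavior degrades when the state must be estimated.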

  15. Building a Knowledge to Action Program in Stroke Rehabilitation.

    PubMed

    Janzen, Shannon; McIntyre, Amanda; Richardson, Marina; Britt, Eileen; Teasell, Robert

    2016-09-01

    The knowledge to action (KTA) process proposed by Graham et al. (2006) is a framework to facilitate the development and application of research evidence into clinical practice. The KTA process consists of the knowledge creation cycle and the action cycle. The Evidence Based Review of Stroke Rehabilitation is a foundational part of the knowledge creation cycle and has helped guide the development of best practice recommendations in stroke. The Rehabilitation Knowledge to Action Project is an audit-feedback process for the clinical implementation of best practice guidelines, which follows the action cycle. The objective of this review was to (1) contextualize the Evidence Based Review of Stroke Rehabilitation and Rehabilitation Knowledge to Action Project within the KTA model and (2) show how this process led to improved evidence-based practice in stroke rehabilitation. Through this process, a single centre was able to change clinical practice and promote a culture that supports the use of evidence-based practices in stroke rehabilitation.

  16. Unity and disunity in evolutionary sciences: process-based analogies open common research avenues for biology and linguistics.

    PubMed

    List, Johann-Mattis; Pathmanathan, Jananan Sylvestre; Lopez, Philippe; Bapteste, Eric

    2016-08-20

    For a long time biologists and linguists have been noticing surprising similarities between the evolution of life forms and languages. Most of the proposed analogies have been rejected. Some, however, have persisted, and some even turned out to be fruitful, inspiring the transfer of methods and models between biology and linguistics up to today. Most proposed analogies were based on a comparison of the research objects rather than the processes that shaped their evolution. Focusing on process-based analogies, however, has the advantage of minimizing the risk of overstating similarities, while at the same time reflecting the common strategy to use processes to explain the evolution of complexity in both fields. We compared important evolutionary processes in biology and linguistics and identified processes specific to only one of the two disciplines as well as processes which seem to be analogous, potentially reflecting core evolutionary processes. These new process-based analogies support novel methodological transfer, expanding the application range of biological methods to the field of historical linguistics. We illustrate this by showing (i) how methods dealing with incomplete lineage sorting offer an introgression-free framework to analyze highly mosaic word distributions across languages; (ii) how sequence similarity networks can be used to identify composite and borrowed words across different languages; (iii) how research on partial homology can inspire new methods and models in both fields; and (iv) how constructive neutral evolution provides an original framework for analyzing convergent evolution in languages resulting from common descent (Sapir's drift). Apart from new analogies between evolutionary processes, we also identified processes which are specific to either biology or linguistics. This shows that general evolution cannot be studied from within one discipline alone. In order to get a full picture of evolution, biologists and linguists need to complement their studies, trying to identify cross-disciplinary and discipline-specific evolutionary processes. The fact that we found many process-based analogies favoring transfer from biology to linguistics further shows that certain biological methods and models have a broader scope than previously recognized. This opens fruitful paths for collaboration between the two disciplines. This article was reviewed by W. Ford Doolittle and Eugene V. Koonin.

  17. Removal of inhibitors from pre-hydrolysis liquor of kraft-based dissolving pulp production process using adsorption and flocculation processes.

    PubMed

    Liu, Xin; Fatehi, Pedram; Ni, Yonghao

    2012-07-01

    A process for removing inhibitors from pre-hydrolysis liquor (PHL) of a kraft-based dissolving pulp production process by adsorption and flocculation, and the characteristics of this process, were studied. In this process, industrially produced PHL was treated with unmodified and oxidized activated carbon as an adsorbent and polydiallyldimethylammonium chloride (PDADMAC) as a flocculant. The overall removal of lignin and furfural in the developed process was 83.3% and 100%, respectively, while that of hemicelluloses was 32.7%. These results confirmed that the developed process can remove inhibitors from PHL prior to producing value-added products, e.g. ethanol and xylitol via fermentation. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. ICT as a Catalyst in Problem-Based Learning Processes? A Comparison of Online and Campus-Based PBL in Swedish Fire-Fighter Training

    ERIC Educational Resources Information Center

    Holmgren, Robert

    2013-01-01

    This article focuses on the impact on learning processes when digital technologies are integrated into PBL (problem-based learning) oriented distance training. Based on socio-cultural perspectives on learning and a comparative distance-campus as well as a time-perspective, instructor and student roles, and learning activities were explored.…

  19. Is Time-Based Prospective Remembering Mediated by Self-Initiated Rehearsals? Role of Incidental Cues, Ongoing Activity, Age, and Motivation

    ERIC Educational Resources Information Center

    Kvavilashvili, Lia; Fisher, Laura

    2007-01-01

    The present research examined self-reported rehearsal processes in naturalistic time-based prospective memory tasks (Study 1 and 2) and compared them with the processes in event-based tasks (Study 3). Participants had to remember to phone the experimenter either at a prearranged time (a time-based task) or after receiving a certain text message…

  20. Multi-Core Processors: An Enabling Technology for Embedded Distributed Model-Based Control (Postprint)

    DTIC Science & Technology

    2008-07-01

    generation of process partitioning, a thread pipelining becomes possible. In this paper we briefly summarize the requirements and trends for FADEC based... FADEC environment, presenting a hypothetical realization of an example application. Finally we discuss the application of Time-Triggered...based control applications of the future. Subject terms: gas turbine, FADEC, multi-core processing technology, distributed model-based control.

  1. From point process observations to collective neural dynamics: Nonlinear Hawkes process GLMs, low-dimensional dynamics and coarse graining

    PubMed Central

    Truccolo, Wilson

    2017-01-01

    This review presents a perspective on capturing collective dynamics in recorded neuronal ensembles based on multivariate point process models, inference of low-dimensional dynamics and coarse graining of spatiotemporal measurements. A general probabilistic framework for continuous time point processes is reviewed, with an emphasis on multivariate nonlinear Hawkes processes with exogenous inputs. A point process generalized linear model (PP-GLM) framework for the estimation of discrete time multivariate nonlinear Hawkes processes is described. The approach is illustrated with the modeling of collective dynamics in neocortical neuronal ensembles recorded in human and non-human primates, and prediction of single-neuron spiking. A complementary approach to capture collective dynamics based on low-dimensional dynamics (“order parameters”) inferred via latent state-space models with point process observations is presented. The approach is illustrated by inferring and decoding low-dimensional dynamics in primate motor cortex during naturalistic reach and grasp movements. Finally, we briefly review hypothesis tests based on conditional inference and spatiotemporal coarse graining for assessing collective dynamics in recorded neuronal ensembles. PMID:28336305

  2. From point process observations to collective neural dynamics: Nonlinear Hawkes process GLMs, low-dimensional dynamics and coarse graining.

    PubMed

    Truccolo, Wilson

    2016-11-01

    This review presents a perspective on capturing collective dynamics in recorded neuronal ensembles based on multivariate point process models, inference of low-dimensional dynamics and coarse graining of spatiotemporal measurements. A general probabilistic framework for continuous time point processes is reviewed, with an emphasis on multivariate nonlinear Hawkes processes with exogenous inputs. A point process generalized linear model (PP-GLM) framework for the estimation of discrete time multivariate nonlinear Hawkes processes is described. The approach is illustrated with the modeling of collective dynamics in neocortical neuronal ensembles recorded in human and non-human primates, and prediction of single-neuron spiking. A complementary approach to capture collective dynamics based on low-dimensional dynamics ("order parameters") inferred via latent state-space models with point process observations is presented. The approach is illustrated by inferring and decoding low-dimensional dynamics in primate motor cortex during naturalistic reach and grasp movements. Finally, we briefly review hypothesis tests based on conditional inference and spatiotemporal coarse graining for assessing collective dynamics in recorded neuronal ensembles. Published by Elsevier Ltd.
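
    The discrete-time nonlinear Hawkes model underlying such a PP-GLM can be sketched with a short simulation. This is an illustrative toy (the weights, kernel decay, and sizes are invented, not fitted to neural recordings):

```python
import numpy as np

# Toy discrete-time nonlinear Hawkes simulation in the spirit of a PP-GLM:
# each neuron's per-bin spike probability is a sigmoid of a baseline plus
# spike history filtered through exponentially decaying kernels.
rng = np.random.default_rng(1)
n_neurons, n_bins = 3, 5000
baseline = np.full(n_neurons, -3.0)   # log-odds baseline (low firing rate)
W = np.array([[0.0,  0.8, 0.0],       # W[i, j]: influence of neuron j on i
              [0.0,  0.0, 0.8],
              [-0.5, 0.0, 0.0]])      # one inhibitory coupling
decay = 0.8                           # exponential kernel decay per bin

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

spikes = np.zeros((n_bins, n_neurons))
trace = np.zeros(n_neurons)           # filtered spike history per neuron
for t in range(n_bins):
    p = sigmoid(baseline + W @ trace) # conditional spike probability
    spikes[t] = rng.random(n_neurons) < p
    trace = decay * trace + spikes[t] # update the exponential kernels

print(spikes.mean(axis=0))            # empirical per-neuron firing rates
```

    Fitting such a model is then a logistic-regression problem: the filtered histories are the GLM covariates and the binned spikes are the responses.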

  3. Automated Signal Processing Applied to Volatile-Based Inspection of Greenhouse Crops

    PubMed Central

    Jansen, Roel; Hofstee, Jan Willem; Bouwmeester, Harro; van Henten, Eldert

    2010-01-01

    Gas chromatograph–mass spectrometers (GC-MS) have been used and shown utility for volatile-based inspection of greenhouse crops. However, a widely recognized difficulty associated with GC-MS application is the large and complex data generated by this instrument. As a consequence, experienced analysts are often required to process this data in order to determine the concentrations of the volatile organic compounds (VOCs) of interest. Manual processing is time-consuming, labour intensive and may be subject to errors due to fatigue. The objective of this study was to assess whether or not GC-MS data can also be automatically processed in order to determine the concentrations of crop health associated VOCs in a greenhouse. An experimental dataset that consisted of twelve data files was processed both manually and automatically to address this question. Manual processing was based on simple peak integration while the automatic processing relied on the algorithms implemented in the MetAlign™ software package. The results of automatic processing of the experimental dataset resulted in concentrations similar to that after manual processing. These results demonstrate that GC-MS data can be automatically processed in order to accurately determine the concentrations of crop health associated VOCs in a greenhouse. When processing GC-MS data automatically, noise reduction, alignment, baseline correction and normalisation are required. PMID:22163594

  4. [Process orientation as a tool of strategic approaches to corporate governance and integrated management systems].

    PubMed

    Sens, Brigitte

    2010-01-01

    The concept of general process orientation as an instrument of organisation development is the core principle of quality management philosophy, i.e. the learning organisation. Accordingly, prestigious quality awards and certification systems focus on process configuration and continual improvement. In German health care organisations, particularly in hospitals, this general process orientation has not been widely implemented yet - despite enormous change dynamics and the requirements of both quality and economic efficiency of health care processes. But based on a consistent process architecture that considers key processes as well as management and support processes, the strategy of excellent health service provision including quality, safety and transparency can be realised in daily operative work. The core elements of quality (e.g., evidence-based medicine), patient safety and risk management, environmental management, health and safety at work can be embedded in daily health care processes as an integrated management system (the "all in one system" principle). Sustainable advantages and benefits for patients, staff, and the organisation will result: stable, high-quality, efficient, and indicator-based health care processes. Hospitals with their broad variety of complex health care procedures should now exploit the full potential of total process orientation. Copyright © 2010. Published by Elsevier GmbH.

  5. Holistic processing of static and moving faces.

    PubMed

    Zhao, Mintao; Bülthoff, Isabelle

    2017-07-01

    Humans' face ability develops and matures with extensive experience in perceiving, recognizing, and interacting with faces that move most of the time. However, how facial movements affect one core aspect of face ability, holistic face processing, remains unclear. Here we investigated the influence of rigid facial motion on holistic and part-based face processing by manipulating the presence of facial motion during study and at test in a composite face task. The results showed that rigidly moving faces were processed as holistically as static faces (Experiment 1). Holistic processing of moving faces persisted whether facial motion was presented during study, at test, or both (Experiment 2). Moreover, when faces were inverted to eliminate the contributions of both an upright face template and observers' expertise with upright faces, rigid facial motion facilitated holistic face processing (Experiment 3). Thus, holistic processing represents a general principle of face perception that applies to both static and dynamic faces, rather than being limited to static faces. These results support an emerging view that both perceiver-based and face-based factors contribute to holistic face processing, and they offer new insights on what underlies holistic face processing, how the types of information supporting holistic face processing interact with each other, and why facial motion may affect face recognition and holistic face processing differently. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  6. Grief-Processing-Based Psychological Intervention for Children Orphaned by AIDS in Central China: A Pilot Study

    ERIC Educational Resources Information Center

    Lin, Xiuyun; Fang, Xiaoyi; Chi, Peilian; Li, Xiaoming; Chen, Wenrui; Heath, Melissa Allen

    2014-01-01

    A group of 124 children orphaned by AIDS (COA), who resided in two orphanages funded by the Chinese government, participated in a study investigating the efficacy of a grief-processing-based psychological group intervention. This psychological intervention program was designed to specifically help COA process their grief and reduce their…

  7. Coaching Process Based on Transformative Learning Theory for Changing the Instructional Mindset of Elementary School Teachers

    ERIC Educational Resources Information Center

    Kawinkamolroj, Milintra; Triwaranyu, Charinee; Thongthew, Sumlee

    2015-01-01

    This research aimed to develop coaching process based on transformative learning theory for changing the mindset about instruction of elementary school teachers. Tools used in this process include mindset tests and questionnaires designed to assess the instructional mindset of teachers and to allow the teachers to reflect on how they perceive…

  8. A Scheme for Understanding Group Processes in Problem-Based Learning

    ERIC Educational Resources Information Center

    Hammar Chiriac, Eva

    2008-01-01

    The purpose of this study was to identify, describe and interpret group processes occurring in tutorials in problem-based learning. Another aim was to investigate if a combination of Steiner's (Steiner, I. D. (1972). "Group process and productivity". New York: Academic Press.) theory of group work and Bion's (Bion, W. R. (1961). "Experiences in…

  9. Auditory Processing Interventions and Developmental Dyslexia: A Comparison of Phonemic and Rhythmic Approaches

    ERIC Educational Resources Information Center

    Thomson, Jennifer M.; Leong, Victoria; Goswami, Usha

    2013-01-01

    The purpose of this study was to compare the efficacy of two auditory processing interventions for developmental dyslexia, one based on rhythm and one based on phonetic training. Thirty-three children with dyslexia participated and were assigned to one of three groups (a) a novel rhythmic processing intervention designed to highlight auditory…

  10. The Effectiveness of Adopting E-Readers to Facilitate EFL Students' Process-Based Academic Writing

    ERIC Educational Resources Information Center

    Hung, Hui-Chun; Young, Shelley Shwu-Ching

    2015-01-01

    English as Foreign Language (EFL) students face additional difficulties for academic writing largely due to their level of language competency. An appropriate structural process of writing can help students develop their academic writing skills. This study explored the use of the e-readers to facilitate EFL students' process-based academic…

  11. Polymer based tunneling sensor

    NASA Technical Reports Server (NTRS)

    Wang, Jing (Inventor); Zhao, Yongjun (Inventor); Cui, Tianhong (Inventor)

    2006-01-01

    A process is provided for fabricating a polymer-based circuit by the following steps. A mold of a design is formed through a lithography process. The design is transferred to a polymer substrate through a hot embossing process. A metal layer is then deposited over at least part of said design and at least one electrical lead is connected to said metal layer.

  12. Local Anesthetic Microencapsulation.

    DTIC Science & Technology

    1983-11-04

    …following I.M. injection of microencapsulated lidocaine and etidocaine than following solution injections. Local toxicity of these microcapsule injections… [Table-of-contents residue; recoverable table titles: Table 12, Processing Summary of Lidocaine (Base) Microencapsulation; Table 13, Lidocaine (Base) Microcapsule Size Distribution; Table 14, Processing Summary of Etidocaine-HCl Microencapsulation; Table 15, Etidocaine-HCl Microcapsule Size Distribution; Table 16, Process Summary…]

  13. Seven-Step Problem-Based Learning in an Interaction Design Course

    ERIC Educational Resources Information Center

    Schultz, Nette; Christensen, Hans Peter

    2004-01-01

    The objective in this paper is the implementation of the highly structured seven-step problem-based learning (PBL) procedure as part of the learning process in a human-computer interaction (HCI) design course at the Technical University of Denmark, taking into account the common learning processes in PBL and the interaction design process. These…

  14. Free Computer-Based Assistive Technology to Support Students with High-Incidence Disabilities in the Writing Process

    ERIC Educational Resources Information Center

    Bouck, Emily C.; Meyer, Nancy K.; Satsangi, Rajiv; Savage, Melissa N.; Hunley, Megan

    2015-01-01

    Written expression is a neglected but critical component of education; yet, the writing process--from prewriting, to writing, and postwriting--is often an area of struggle for students with disabilities. One strategy to assist students with disabilities struggling with the writing process is the use of computer-based technology. This article…

  15. Recovery Processes of Organic Acids from Fermentation Broths in the Biomass-Based Industry.

    PubMed

    Li, Qian-Zhu; Jiang, Xing-Lin; Feng, Xin-Jun; Wang, Ji-Ming; Sun, Chao; Zhang, Hai-Bo; Xian, Mo; Liu, Hui-Zhou

    2016-01-01

    The new movement towards green chemistry and renewable feedstocks makes microbial production of chemicals more competitive. Among the numerous chemicals, organic acids are more attractive targets for process development efforts in the renewable-based biorefinery industry. However, most of the production costs in microbial processes are higher than those in chemical processes, among which over 60% are generated by separation processes. Therefore, research into separation and purification processes is important for a promising biorefinery industry. This review highlights the progress of recovery processes in the separation and purification of organic acids, including their advantages and disadvantages, current situation, and future prospects in terms of recovery yields and industrial application.

  16. EEG feature selection method based on decision tree.

    PubMed

    Duan, Lijuan; Ge, Hui; Ma, Wei; Miao, Jun

    2015-01-01

    This paper aims to solve the automated feature selection problem in brain-computer interfaces (BCI). In order to automate the feature selection process, we propose a novel EEG feature selection method based on a decision tree (DT). During electroencephalogram (EEG) signal processing, a feature extraction method based on principal component analysis (PCA) was used, and the selection process based on the decision tree was performed by searching the feature space and automatically selecting optimal features. Considering that EEG signals are a series of non-linear signals, a generalized linear classifier named support vector machine (SVM) was chosen. In order to test the validity of the proposed method, we applied the EEG feature selection method based on the decision tree to BCI Competition II dataset Ia, and the experiment showed encouraging results.
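    The PCA-extraction, tree-based selection, and SVM-classification pipeline described above can be sketched as follows. This is a hedged illustration using scikit-learn on synthetic data in place of the BCI Competition II Ia recordings; the component and feature counts are arbitrary choices, not the paper's settings.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    # Synthetic "EEG" data: 200 trials x 64 features, two classes.
    X = rng.normal(size=(200, 64))
    y = (X[:, 0] + X[:, 3] > 0).astype(int)  # two informative dimensions

    # 1) Feature extraction with PCA.
    X_pca = PCA(n_components=20, random_state=0).fit_transform(X)

    # 2) Feature selection: keep the PCA components the decision tree
    #    ranks as most informative.
    tree = DecisionTreeClassifier(random_state=0).fit(X_pca, y)
    selected = np.argsort(tree.feature_importances_)[::-1][:5]

    # 3) Classification with an SVM on the selected features.
    X_tr, X_te, y_tr, y_te = train_test_split(
        X_pca[:, selected], y, test_size=0.3, random_state=0)
    clf = SVC(kernel="rbf").fit(X_tr, y_tr)
    accuracy = clf.score(X_te, y_te)
    ```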

  17. Space-based optical image encryption.

    PubMed

    Chen, Wen; Chen, Xudong

    2010-12-20

    In this paper, we propose a new method based on a three-dimensional (3D) space-based strategy for optical image encryption. The two-dimensional (2D) processing of a plaintext in conventional optical encryption methods is extended to a 3D space-based processing. Each pixel of the plaintext is considered as one particle in the proposed space-based optical image encryption, and the diffraction of all particles forms an object wave in phase-shifting digital holography. The effectiveness and advantages of the proposed method are demonstrated by numerical results. The proposed method can provide a new optical encryption strategy instead of the conventional 2D processing, and may open up a new research perspective for optical image encryption.

  18. DOE-DARPA High-Performance Corrosion-Resistant Materials (HPCRM), Annual HPCRM Team Meeting & Technical Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farmer, J; Brown, B; Bayles, B

    The overall goal is to develop high-performance corrosion-resistant iron-based amorphous-metal coatings for prolonged trouble-free use in very aggressive environments: seawater & hot geothermal brines. The specific technical objectives are: (1) Synthesize Fe-based amorphous-metal coating with corrosion resistance comparable/superior to Ni-based Alloy C-22; (2) Establish processing parameter windows for applying and controlling coating attributes (porosity, density, bonding); (3) Assess possible cost savings through substitution of Fe-based material for more expensive Ni-based Alloy C-22; (4) Demonstrate practical fabrication processes; (5) Produce quality materials and data with complete traceability for nuclear applications; and (6) Develop, validate and calibrate computational models to enable life prediction and process design.

  19. Teaching WP and DP with CP/M-Based Microcomputers.

    ERIC Educational Resources Information Center

    Bartholome, Lloyd W.

    1982-01-01

    The use of CP/M (Control Program Monitor)-based microcomputers in teaching word processing and data processing is explored. The system's advantages, variations, dictionary software, and future are all discussed. (CT)

  20. An Ontology-Based Approach to Incorporate User-Generated Geo-Content Into Sdi

    NASA Astrophysics Data System (ADS)

    Deng, D.-P.; Lemmens, R.

    2011-08-01

    The Web is changing the way people share and communicate information because of the emergence of various Web technologies, which enable people to contribute information on the Web. User-Generated Geo-Content (UGGC) is a potential resource of geographic information. Due to the different production methods, UGGC often cannot fit into a formal geographic information model; there is a semantic gap between UGGC and formal geographic information. To integrate UGGC into geographic information, this study conducts an ontology-based process to bridge this semantic gap. This ontology-based process includes five steps: Collection, Extraction, Formalization, Mapping, and Deployment. In addition, this study applies the process to Twitter messages relevant to the Japan Earthquake disaster. By using this process, we extract disaster relief information from Twitter messages and develop a knowledge base for GeoSPARQL queries on disaster relief information.

  1. Biomimetic design processes in architecture: morphogenetic and evolutionary computational design.

    PubMed

    Menges, Achim

    2012-03-01

    Design computation has profound impact on architectural design methods. This paper explains how computational design enables the development of biomimetic design processes specific to architecture, and how they need to be significantly different from established biomimetic processes in engineering disciplines. The paper first explains the fundamental difference between computer-aided and computational design in architecture, as the understanding of this distinction is of critical importance for the research presented. Thereafter, the conceptual relation and possible transfer of principles from natural morphogenesis to design computation are introduced and the related developments of generative, feature-based, constraint-based, process-based and feedback-based computational design methods are presented. This morphogenetic design research is then related to exploratory evolutionary computation, followed by the presentation of two case studies focusing on the exemplary development of spatial envelope morphologies and urban block morphologies.

  2. Recent developments in membrane-based separations in biotechnology processes: review.

    PubMed

    Rathore, A S; Shirke, A

    2011-01-01

    Membrane-based separations are the most ubiquitous unit operations in biotech processes. There are several key reasons for this. First, they can be used with a large variety of applications including clarification, concentration, buffer exchange, purification, and sterilization. Second, they are available in a variety of formats, such as depth filtration, ultrafiltration, diafiltration, nanofiltration, reverse osmosis, and microfiltration. Third, they are simple to operate and are generally robust toward normal variations in feed material and operating parameters. Fourth, membrane-based separations typically require lower capital cost when compared to other processing options. As a result of these advantages, a typical biotech process has anywhere from 10 to 20 membrane-based separation steps. In this article we review the major developments that have occurred on this topic with a focus on developments in the last 5 years.

  3. Modelling of the mercury loss in fluorescent lamps under the influence of metal oxide coatings

    NASA Astrophysics Data System (ADS)

    Santos Abreu, A.; Mayer, J.; Lenk, D.; Horn, S.; Konrad, A.; Tidecks, R.

    2016-11-01

    The mercury transport and loss mechanisms in the metal oxide coatings of mercury low pressure discharge fluorescent lamps have been investigated. An existing model based on a ballistic process is discussed in the context of experimental mercury loss data. Two different approaches to the modeling of the mercury loss have been developed. The first one is based on mercury transition rates between the plasma, the coating, and the glass without specifying the underlying physical processes. The second one is based on a transport process driven by diffusion and a binding process of mercury reacting to mercury oxide inside the layers. Moreover, we extended the diffusion based model to handle multi-component coatings. All approaches are applied to describe mercury loss experiments under the influence of an Al2O3 coating.
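    The second modelling approach above (diffusion plus first-order binding to mercury oxide) can be sketched as a one-dimensional explicit finite-difference simulation. This is a minimal illustration, not the authors' model: the diffusivity D, binding rate k, layer thickness L, and boundary condition are illustrative values, not fitted parameters.

    ```python
    import numpy as np

    D, k, L = 1e-12, 1e-4, 1e-6  # diffusivity (m^2/s), binding rate (1/s), thickness (m)
    n, dt, steps = 50, 1e-4, 20000
    dx = L / (n - 1)
    assert D * dt / dx**2 < 0.5  # explicit-scheme stability condition

    c = np.zeros(n)              # free mercury concentration in the layer (normalized)
    bound = np.zeros(n)          # mercury immobilized as mercury oxide
    c[0] = 1.0                   # plasma-side boundary: constant supply

    for _ in range(steps):
        lap = np.zeros(n)
        lap[1:-1] = (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
        # Diffusion into the layer minus first-order binding loss.
        c[1:-1] += dt * (D * lap[1:-1] - k * c[1:-1])
        bound += dt * k * c
        c[0] = 1.0               # constant concentration at the discharge side
        c[-1] = c[-2]            # zero flux at the glass interface

    total_loss = bound.sum() * dx  # mercury lost into the coating per unit area
    ```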

  4. Increasing the efficiency of designing hemming processes by using an element-based metamodel approach

    NASA Astrophysics Data System (ADS)

    Kaiser, C.; Roll, K.; Volk, W.

    2017-09-01

    In the automotive industry, the manufacturing of automotive outer panels requires hemming processes in which two sheet metal parts are joined together by bending the flange of the outer part over the inner part. Because of decreasing development times and the steadily growing number of vehicle derivatives, an efficient digital product and process validation is necessary. Commonly used simulations, which are based on the finite element method, demand significant modelling effort, which results in disadvantages especially in the early product development phase. To increase the efficiency of designing hemming processes this paper presents a hemming-specific metamodel approach. The approach includes a part analysis in which the outline of the automotive outer panels is initially split into individual segments. By doing a parametrization of each of the segments and assigning basic geometric shapes, the outline of the part is approximated. Based on this, the hemming parameters such as flange length, roll-in, wrinkling and plastic strains are calculated for each of the geometric basic shapes by performing a metamodel-based segmental product validation. The metamodel is based on an element-similar formulation that includes a reference dataset of various geometric basic shapes. A random automotive outer panel can now be analysed and optimized based on the hemming-specific database. By implementing this approach into a planning system, an efficient optimization of designing hemming processes will be enabled. Furthermore, valuable time and cost benefits can be realized in a vehicle’s development process.

  5. Microstructure characterization of the stir zone of submerged friction stir processed aluminum alloy 2219

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feng, Xiuli, E-mail: feng.97@osu.edu; State Key Laboratory of Advanced Welding and Joining, Harbin Institute of Technology, Harbin 150001; Liu, Huijie, E-mail: liuhj@hit.edu.cn

    Aluminum alloy 2219-T6 was friction stir processed using a novel submerged processing technique to facilitate cooling. Processing was conducted at a constant tool traverse speed of 200 mm/min and spindle rotation speeds in the range from 600 to 800 rpm. The microstructural characteristics of the base metal and processed zone, including grain structure and precipitation behavior, were studied using optical microscopy (OM), scanning electron microscopy (SEM) and transmission electron microscopy (TEM). Microhardness maps were constructed on polished cross sections of as-processed samples. The effect of tool rotation speed on the microstructure and hardness of the stir zone was investigated. The average grain size of the stir zone was much smaller than that of the base metal, but the hardness was also lower due to the formation of equilibrium θ precipitates from the base metal θ′ precipitates. Stir zone hardness was found to decrease with increasing rotation speed (heat input). The effect of processing conditions on strength (hardness) was rationalized based on the competition between grain refinement strengthening and softening due to precipitate overaging. - Highlights: • SZ grain size (∼ 1 μm) is reduced by over one order of magnitude relative to the BM. • Hardness in the SZ is lower than that of the precipitation strengthened BM. • Metastable θ′ in the base metal transforms to equilibrium θ in the stir zone. • Softening in the SZ results from a decrease of precipitation strengthening.

  6. Vibration and acoustic frequency spectra for industrial process modeling using selective fusion multi-condition samples and multi-source features

    NASA Astrophysics Data System (ADS)

    Tang, Jian; Qiao, Junfei; Wu, ZhiWei; Chai, Tianyou; Zhang, Jian; Yu, Wen

    2018-01-01

    Frequency spectral data of mechanical vibration and acoustic signals relate to difficult-to-measure production quality and quantity parameters of complex industrial processes. A selective ensemble (SEN) algorithm can be used to build a soft sensor model of these process parameters by fusing valued information selectively from different perspectives. However, a combination of several optimized ensemble sub-models with SEN cannot guarantee the best prediction model. In this study, we use several techniques to construct mechanical vibration and acoustic frequency spectra of a data-driven industrial process parameter model based on selective fusion multi-condition samples and multi-source features. Multi-layer SEN (MLSEN) strategy is used to simulate the domain expert cognitive process. Genetic algorithm and kernel partial least squares are used to construct the inside-layer SEN sub-model based on each mechanical vibration and acoustic frequency spectral feature subset. Branch-and-bound and adaptive weighted fusion algorithms are integrated to select and combine outputs of the inside-layer SEN sub-models. Then, the outside-layer SEN is constructed. Thus, "sub-sampling training examples"-based and "manipulating input features"-based ensemble construction methods are integrated, thereby realizing the selective information fusion process based on multi-condition history samples and multi-source input features. This novel approach is applied to a laboratory-scale ball mill grinding process. A comparison with other methods indicates that the proposed MLSEN approach effectively models mechanical vibration and acoustic signals.
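    The selective-ensemble (SEN) idea in the abstract above can be illustrated with a toy sketch: sub-models are trained on different feature subsets ("manipulating input features"), and the subset of sub-models whose averaged prediction minimizes validation error is selected. Exhaustive search stands in for the paper's branch-and-bound step, and ridge regression for its kernel partial least squares sub-models; the data are synthetic, not the ball-mill signals.

    ```python
    import itertools
    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    X = rng.normal(size=(300, 12))
    y = X[:, 0] - 2 * X[:, 5] + 0.1 * rng.normal(size=300)
    X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.5, random_state=1)

    # Each sub-model sees only one block of input features.
    blocks = [slice(0, 4), slice(4, 8), slice(8, 12)]
    models = [Ridge().fit(X_tr[:, b], y_tr) for b in blocks]
    preds = [m.predict(X_va[:, b]) for m, b in zip(models, blocks)]

    # Selective fusion: search all sub-model subsets for the lowest
    # validation error of the averaged prediction.
    best_err, best_subset = np.inf, None
    for r in range(1, len(models) + 1):
        for subset in itertools.combinations(range(len(models)), r):
            p = np.mean([preds[i] for i in subset], axis=0)
            err = np.mean((p - y_va) ** 2)
            if err < best_err:
                best_err, best_subset = err, subset
    ```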

  7. Hadoop-Based Distributed System for Online Prediction of Air Pollution Based on Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Ghaemi, Z.; Farnaghi, M.; Alimohammadi, A.

    2015-12-01

    The critical impact of air pollution on human health and the environment on the one hand, and the complexity of pollutant concentration behavior on the other, lead scientists to look for advanced techniques for monitoring and predicting urban air quality. Additionally, recent developments in data measurement techniques have led to the collection of various types of data about air quality. Such data is extremely voluminous and, to be useful, it must be processed at high velocity. Due to the complexity of big data analysis, especially for dynamic applications, online forecasting of pollutant concentration trends within a reasonable processing time is still an open problem. The purpose of this paper is to present an online forecasting approach based on Support Vector Machine (SVM) to predict the air quality one day in advance. In order to overcome the computational requirements for large-scale data analysis, distributed computing based on the Hadoop platform has been employed to leverage the processing power of multiple processing units. The MapReduce programming model is adopted for massive parallel processing in this study. Based on the online algorithm and Hadoop framework, an online forecasting system is designed to predict the air pollution of Tehran for the next 24 hours. The results have been assessed on the basis of processing time and efficiency. Quite accurate predictions of air pollutant indicator levels within an acceptable processing time prove that the presented approach is very suitable for tackling large-scale air pollution prediction problems.
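    The SVM forecasting step can be sketched on a single machine (the Hadoop/MapReduce distribution layer is omitted here): a support vector regressor is trained on a sliding window of past readings to predict the next day's value. The series, window length, and hyperparameters are illustrative, not the paper's Tehran data or settings.

    ```python
    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(2)
    t = np.arange(400)
    # Synthetic daily pollutant series: seasonal cycle plus noise.
    series = 50 + 10 * np.sin(2 * np.pi * t / 30) + rng.normal(0, 1, 400)

    window = 7  # use the past week to predict one day ahead
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]

    split = 350  # train on the past, evaluate on the most recent days
    model = SVR(kernel="rbf", C=10.0).fit(X[:split], y[:split])
    pred = model.predict(X[split:])
    mae = np.mean(np.abs(pred - y[split:]))
    ```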

  8. On the Application of Different Event-Based Sampling Strategies to the Control of a Simple Industrial Process

    PubMed Central

    Sánchez, José; Guarnes, Miguel Ángel; Dormido, Sebastián

    2009-01-01

    This paper is an experimental study of the utilization of different event-based strategies for the automatic control of a simple but very representative industrial process: the level control of a tank. In an event-based control approach it is the triggering of a specific event, and not the time, that instructs the sensor to send the current state of the process to the controller, and the controller to compute a new control action and send it to the actuator. In the document, five control strategies based on different event-based sampling techniques are described, compared, and contrasted with a classical time-based control approach and a hybrid one. The common denominator in the time, the hybrid, and the event-based control approaches is the controller: a proportional-integral algorithm with adaptations depending on the selected control approach. To compare and contrast each one of the hybrid and the pure event-based control algorithms with the time-based counterpart, the two tasks that a control strategy must achieve (set-point following and disturbance rejection) are independently analyzed. The experimental study provides new proof concerning the ability of event-based control strategies to minimize the data exchange among the control agents (sensors, controllers, actuators) when an error-free control of the process is not a hard requirement. PMID:22399975
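    One of the event-based sampling techniques the study compares, send-on-delta, can be simulated for the tank-level example: the sensor transmits only when the level has moved by more than a threshold since the last transmission. In this hedged sketch the PI controller still runs at a base rate but sees only the event-driven measurements; the tank model and all constants are illustrative.

    ```python
    dt, T = 0.1, 200.0
    kp, ki = 2.0, 0.2        # PI gains
    delta = 0.02             # send-on-delta threshold
    setpoint = 1.0           # desired tank level

    level, integral = 0.0, 0.0
    last_sent = level        # last measurement transmitted by the sensor
    events = 0
    samples = int(T / dt)

    for _ in range(samples):
        # Sensor side: transmit only on a significant level change.
        if abs(level - last_sent) > delta:
            last_sent = level
            events += 1
        # Controller side: PI law acting on the last transmitted value.
        error = setpoint - last_sent
        integral += error * dt
        u = kp * error + ki * integral
        # Tank dynamics: inflow u, outflow proportional to level.
        level += dt * (u - 0.5 * level)
    ```

    The transmission count `events` stays far below the number of controller samples, which is the data-exchange saving the paper reports for event-based strategies.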

  9. Development of a Cr-Based Hard Composite Processed by Spark Plasma Sintering

    NASA Astrophysics Data System (ADS)

    García-Junceda, A.; Sáez, I.; Deng, X. X.; Torralba, J. M.

    2018-04-01

    This investigation analyzes the feasibility of processing a composite material comprising WC particles randomly dispersed in a matrix in which Cr is the main metallic binder. Thus, a new composite material is processed using a commercial, economical, and easily available Cr-based alloy, assuming that there is a certain Cr solubility in the WC particles acting as reinforcement. The processing route followed includes mechanical milling of the powders and consolidation by spark plasma sintering.

  10. Development, implementation and evaluation of an evidence-based program for introduction of new health technologies and clinical practices in a local healthcare setting.

    PubMed

    Harris, Claire; Garrubba, Marie; Allen, Kelly; King, Richard; Kelly, Cate; Thiagarajan, Malar; Castleman, Beverley; Ramsey, Wayne; Farjou, Dina

    2015-12-28

    This paper reports the process of establishing a transparent, accountable, evidence-based program for introduction of new technologies and clinical practices (TCPs) in a large Australian healthcare network. Many countries have robust evidence-based processes for assessment of new TCPs at national level. However many decisions are made by local health services where the resources and expertise to undertake health technology assessment (HTA) are limited and a lack of structure, process and transparency has been reported. An evidence-based model for process change was used to establish the program. Evidence from research and local data, experience of health service staff and consumer perspectives were incorporated at each of four steps: identifying the need for change, developing a proposal, implementation and evaluation. Checklists assessing characteristics of success, factors for sustainability and barriers and enablers were applied and implementation strategies were based on these findings. Quantitative and qualitative methods were used for process and outcome evaluation. An action research approach underpinned ongoing refinement to systems, processes and resources. A Best Practice Guide developed from the literature and stakeholder consultation identified seven program components: Governance, Decision-Making, Application Process, Monitoring and Reporting, Resources, Administration, and Evaluation and Quality Improvement. The aims of transparency and accountability were achieved. The processes are explicit, decisions published, outcomes recorded and activities reported. The aim of ascertaining rigorous evidence-based information for decision-making was not achieved in all cases. Applicants proposing new TCPs provided the evidence from research literature and local data however the information was often incorrect or inadequate, overestimating benefits and underestimating costs. 
Due to these limitations the initial application process was replaced by an Expression of Interest from applicants followed by a rigorous HTA by independent in-house experts. The program is generalisable to most health care organisations. With one exception, the components would be achievable with minimal additional resources; the lack of skills and resources required for HTA will limit effective application in many settings. A toolkit containing details of the processes and sample materials is provided to facilitate replication or local adaptation by those wishing to establish a similar program.

  11. Data quality and processing for decision making: divergence between corporate strategy and manufacturing processes

    NASA Astrophysics Data System (ADS)

    McNeil, Ronald D.; Miele, Renato; Shaul, Dennis

    2000-10-01

    Information technology is driving improvements in manufacturing systems. The results are higher productivity and quality. However, corporate strategy is driven by a number of factors and includes data and pressure from multiple stakeholders, including employees, managers, executives, stockholders, boards, suppliers and customers. It is also driven by information about competitors and emerging technology. Much information is based on processing of data and the resulting biases of the processors. Thus, stakeholders can base inputs on faulty perceptions that are not reality-based. Prior to processing, the data used may be inaccurate. Sources of data and information may include demographic reports, statistical analyses, intelligence reports (e.g., marketing data), technology and primary data collection. The reliability and validity of data, as well as the management of sources and information, are critical elements in strategy formulation. The paper explores data collection, processing and analyses from secondary and primary sources, information generation and report presentation for strategy formulation, and contrasts this with the data and information utilized to drive internal processes such as manufacturing. The hypothesis is that internal processes, such as manufacturing, are subordinate to corporate strategies. The impact of possible divergence in quality of decisions at the corporate level on IT-driven, quality-manufacturing processes based on measurable outcomes is significant. Recommendations for IT improvements at the corporate strategy level are given.

  12. Using artificial neural networks to model aluminium based sheet forming processes and tools details

    NASA Astrophysics Data System (ADS)

    Mekras, N.

    2017-09-01

    In this paper, a methodology and a software system will be presented concerning the use of Artificial Neural Networks (ANNs) for modeling aluminium-based sheet forming processes. ANN models are created by training the networks on experimental, trial and historical data records of process inputs and outputs. ANN models are useful in cases where processes’ mathematical models are not accurate enough, are not well defined or are missing, e.g. in cases of complex product shapes, new material alloys, new process requirements, micro-scale products, etc. Usually, after the design and modeling of the forming tools (die, punch, etc.) and before mass production, a set of trials takes place at the shop floor for finalizing process and tool details concerning e.g. tools’ minimum radii, die/punch clearance, press speed, process temperature, etc., in relation to the material type, the sheet thickness and the quality achieved from the trials. Using data from the shop-floor trials and forming theory data, ANN models can be trained and created, and can be used to estimate final process and tool details, hence supporting efficient set-up of processes and tools before mass production starts. The proposed ANN methodology and the respective software system are implemented within the EU H2020 project LoCoMaTech for the aluminium-based sheet forming process HFQ (solution Heat treatment, cold die Forming and Quenching).
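    The training idea above can be sketched as follows: an MLP is fit to (process input → outcome) records, as one would use shop-floor trial data, and then queried for a new parameter combination. The input parameters, ranges, and the springback-like response below are synthetic stand-ins chosen for illustration, not HFQ trial records.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(3)
    # Inputs: sheet thickness (mm), die radius (mm), forming temperature (C).
    X = rng.uniform([1.0, 2.0, 350.0], [3.0, 10.0, 500.0], size=(500, 3))
    # Output: an assumed response surface with measurement noise.
    y = 0.5 * X[:, 0] - 0.1 * X[:, 1] + 0.002 * X[:, 2] + rng.normal(0, 0.02, 500)

    scaler = StandardScaler().fit(X)
    net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000,
                       random_state=0).fit(scaler.transform(X), y)

    # Query the trained surrogate for a candidate tool/process setting.
    query = np.array([[2.0, 5.0, 420.0]])
    estimate = net.predict(scaler.transform(query))[0]
    ```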

  13. Improved workflow modelling using role activity diagram-based modelling with application to a radiology service case study.

    PubMed

    Shukla, Nagesh; Keast, John E; Ceglarek, Darek

    2014-10-01

    The modelling of complex workflows is an important problem-solving technique within healthcare settings. However, currently most of the workflow models use a simplified flow chart of patient flow obtained using on-site observations, group-based debates and brainstorming sessions, together with historic patient data. This paper presents a systematic and semi-automatic methodology for knowledge acquisition with detailed process representation using sequential interviews of people in the key roles involved in the service delivery process. The proposed methodology allows the modelling of roles, interactions, actions, and decisions involved in the service delivery process. This approach is based on protocol generation and analysis techniques such as: (i) initial protocol generation based on qualitative interviews of radiology staff, (ii) extraction of key features of the service delivery process, (iii) discovering the relationships among the key features extracted, and, (iv) a graphical representation of the final structured model of the service delivery process. The methodology is demonstrated through a case study of a magnetic resonance (MR) scanning service-delivery process in the radiology department of a large hospital. A set of guidelines is also presented in this paper to visually analyze the resulting process model for identifying process vulnerabilities. A comparative analysis of different workflow models is also conducted. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  14. An Investigation of the Engagement of Elementary Students in the NCTM Process Standards after One Year of Standards-Based Instruction

    ERIC Educational Resources Information Center

    Fillingim, Jennifer Gale

    2010-01-01

    Contemporary mathematics education reform has placed increased emphasis on K-12 mathematics curriculum. Reform-based curricula, often referred to as "Standards-based" due to philosophical alignment with the NCTM Process Standards, have generated controversy among families, educators, and researchers. The mathematics education research…

  15. Dynamic neural network-based methods for compensation of nonlinear effects in multimode communication lines

    NASA Astrophysics Data System (ADS)

    Sidelnikov, O. S.; Redyuk, A. A.; Sygletos, S.

    2017-12-01

    We consider neural network-based schemes of digital signal processing. It is shown that the use of a dynamic neural network-based scheme of signal processing ensures an increase in the optical signal transmission quality in comparison with that provided by other methods for nonlinear distortion compensation.

  16. Culturally Based Intervention Development: The Case of Latino Families Dealing with Schizophrenia

    ERIC Educational Resources Information Center

    Barrio, Concepcion; Yamada, Ann-Marie

    2010-01-01

    Objectives: This article describes the process of developing a culturally based family intervention for Spanish-speaking Latino families with a relative diagnosed with schizophrenia. Method: Our iterative intervention development process was guided by a cultural exchange framework and based on findings from an ethnographic study. We piloted this…

  17. Assessing green-processing technologies for wet milling freshly hulled and germinated brown rice, leading to naturally fortified plant-based beverages

    USDA-ARS?s Scientific Manuscript database

    Rice milk beverages can provide well-balanced nutrition. With healthier nutrition on consumers’ minds, worldwide consumption and production of plant-based milk beverages are increasing. Much past research and invention was based on enzymatic conversion processes for starch that were uncomplicated be...

  18. Teacher Perceptions Regarding Portfolio-Based Components of Teacher Evaluations

    ERIC Educational Resources Information Center

    Nagel, Charles I.

    2012-01-01

    This study reports the results of teachers' and principals' perceptions of the package evaluation process, a process that uses a combination of a traditional evaluation with a portfolio-based assessment tool. In addition, this study contributes to the educational knowledge base by exploring the participants' views on the impact of…

  19. Guiding Students through the Jungle of Research-Based Literature

    ERIC Educational Resources Information Center

    Williams, Sherie

    2005-01-01

    Undergraduate students of today often lack the ability to effectively process research-based literature. In order to offer education students the most up-to-date methods, research-based literature must be considered. Hence a dilemma is born as to whether professors should discontinue requiring the processing of this type of information or teach…

  20. Evidence-Based Assessment of Attention-Deficit/Hyperactivity Disorder: Using Multiple Sources of Information

    ERIC Educational Resources Information Center

    Frazier, Thomas W.; Youngstrom, Eric A.

    2006-01-01

    In this article, the authors illustrate a step-by-step process of acquiring and integrating information according to the recommendations of evidence-based practices. A case example models the process, leading to specific recommendations regarding instruments and strategies for evidence-based assessment (EBA) of attention-deficit/hyperactivity…

  1. From the Analysis of Work-Processes to Designing Competence-Based Occupational Standards and Vocational Curricula

    ERIC Educational Resources Information Center

    Tutlys, Vidmantas; Spöttl, Georg

    2017-01-01

    Purpose: This paper aims to explore methodological and institutional challenges on application of the work-process analysis approach in the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on the review of scientific literature and the analysis of…

  2. Process evaluation of two home-based bimanual training programs in children with unilateral cerebral palsy (the COAD-study): protocol for a mixed methods study.

    PubMed

    Beckers, Laura; van der Burg, Jan; Janssen-Potten, Yvonne; Rameckers, Eugène; Aarts, Pauline; Smeets, Rob

    2018-04-24

    As part of the COAD-study two home-based bimanual training programs for young children with unilateral Cerebral Palsy (uCP) have been developed, both consisting of a preparation phase and a home-based training phase. Parents are coached to use either an explicit or implicit motor learning approach while teaching bimanual activities to their child. A process evaluation of these complex interventions is crucial in order to draw accurate conclusions and provide recommendations for implementation in clinical practice and further research. The aim of the process evaluation is to systematically assess fidelity of the home-based training programs, to examine the mechanisms that contribute to their effects on child-related and parent-related outcomes, and to explore the influence of contextual factors. A mixed methods embedded design is used that emerges from a pragmatism paradigm. The qualitative strand involves a generic qualitative approach. The process evaluation components fidelity (quality), dose delivered (completeness), dose received (exposure and satisfaction), recruitment and context will be investigated. Data collection includes registration of attendance of therapists and remedial educationalists to a course regarding the home-based training programs; a questionnaire to evaluate this course by the instructor; a report form concerning the preparation phase to be completed by the therapist; registration and video analyses of the home-based training; interviews with parents and questionnaires to be filled out by the therapist and remedial educationalist regarding the process of training; and focus groups with therapists and remedial educationalists as well as registration of drop-out rates and reasons, to evaluate the overall home-based training programs. Inductive thematic analysis will be used to analyse qualitative data. Qualitative and quantitative findings are merged through meta-inference. 
So far, effects of home-based training programs in paediatric rehabilitation have been studied without an extensive process evaluation. The findings of this process evaluation will have implications for clinical practice and further research regarding development and application of home-based bimanual training programs, executed by parents and aimed at improving activity performance and participation of children with uCP.

  3. A neuro-inspired spike-based PID motor controller for multi-motor robots with low cost FPGAs.

    PubMed

    Jimenez-Fernandez, Angel; Jimenez-Moreno, Gabriel; Linares-Barranco, Alejandro; Dominguez-Morales, Manuel J; Paz-Vicente, Rafael; Civit-Balcells, Anton

    2012-01-01

In this paper we present a neuro-inspired spike-based closed-loop controller written in VHDL and implemented on FPGAs. The controller is focused on controlling DC motor speed, using only spikes for information representation, processing, and DC motor driving; it could be applied to other motors with proper driver adaptation. The controller architecture represents one of the latest layers in a Spiking Neural Network (SNN), implementing a bridge between robotic actuators and spike-based processing layers and sensors. The control system fuses actuation and sensor information as spike streams, processing these spikes in hard real time through specialized spike-based circuits that form a massively parallel information processing system. The spike-based closed-loop controller has been implemented on an AER platform designed in our labs, the AER-Robot, which allows direct control of DC motors. Experimental results demonstrate the viability of spike-based controllers, and hardware synthesis shows low resource requirements, allowing many instances of the controller to run in parallel for real-time robot control.
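    The closed-loop structure behind the controller can be related to an ordinary discrete PID loop. The sketch below, a rough floating-point stand-in for the spike-based hardware described above, simulates a PID speed loop on a first-order DC-motor model; all gains, the plant time constant, and the setpoint are hypothetical, and the paper's controller performs this loop entirely with spike streams in an FPGA.

    ```python
    # Hypothetical discrete PID speed loop on a first-order DC-motor model.
    # The paper implements the same closed loop with spike streams in VHDL;
    # this floating-point version only illustrates the control structure.

    def simulate_pid(setpoint=100.0, kp=0.8, ki=5.0, kd=0.01,
                     dt=0.001, tau=0.05, gain=1.0, steps=3000):
        speed = 0.0            # motor speed (arbitrary units)
        integral = 0.0
        prev_err = setpoint
        for _ in range(steps):
            err = setpoint - speed
            integral += err * dt
            deriv = (err - prev_err) / dt
            prev_err = err
            u = kp * err + ki * integral + kd * deriv   # control signal
            # first-order plant: tau * dspeed/dt = gain*u - speed
            speed += dt * (gain * u - speed) / tau
        return speed

    final = simulate_pid()
    print(round(final, 1))     # should settle near the setpoint
    ```

    With integral action the steady-state error vanishes, which is why the loop settles at the setpoint rather than at the proportional-only equilibrium.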

  4. A Neuro-Inspired Spike-Based PID Motor Controller for Multi-Motor Robots with Low Cost FPGAs

    PubMed Central

    Jimenez-Fernandez, Angel; Jimenez-Moreno, Gabriel; Linares-Barranco, Alejandro; Dominguez-Morales, Manuel J.; Paz-Vicente, Rafael; Civit-Balcells, Anton

    2012-01-01

In this paper we present a neuro-inspired spike-based closed-loop controller written in VHDL and implemented on FPGAs. The controller is focused on controlling DC motor speed, using only spikes for information representation, processing, and DC motor driving; it could be applied to other motors with proper driver adaptation. The controller architecture represents one of the latest layers in a Spiking Neural Network (SNN), implementing a bridge between robotic actuators and spike-based processing layers and sensors. The control system fuses actuation and sensor information as spike streams, processing these spikes in hard real time through specialized spike-based circuits that form a massively parallel information processing system. The spike-based closed-loop controller has been implemented on an AER platform designed in our labs, the AER-Robot, which allows direct control of DC motors. Experimental results demonstrate the viability of spike-based controllers, and hardware synthesis shows low resource requirements, allowing many instances of the controller to run in parallel for real-time robot control. PMID:22666004

  5. Solution NMR Spectroscopy in Target-Based Drug Discovery.

    PubMed

    Li, Yan; Kang, Congbao

    2017-08-23

Solution NMR spectroscopy is a powerful tool to study protein structures and dynamics under physiological conditions. This technique is particularly useful in target-based drug discovery projects as it provides protein-ligand binding information in solution. Accumulated studies have shown that NMR plays increasingly important roles in multiple steps of the drug discovery process. In a fragment-based drug discovery process, ligand-observed and protein-observed NMR spectroscopy can be applied to screen fragments with low binding affinities, and the screened fragments can be further optimized into drug-like molecules. In combination with other biophysical techniques, NMR can guide structure-based drug discovery. In this review, we describe the possible roles of NMR spectroscopy in drug discovery and illustrate the challenges encountered in the drug discovery process. We include several examples demonstrating the roles of NMR in target-based drug discovery, such as hit identification, ranking ligand binding affinities, and mapping the ligand binding site. We also speculate on the possible roles of NMR in target engagement based on recent progress in in-cell NMR spectroscopy.

  6. Parameter prediction based on Improved Process neural network and ARMA error compensation in Evaporation Process

    NASA Astrophysics Data System (ADS)

    Qian, Xiaoshan

    2018-01-01

Traditional models of evaporation-process parameters suffer from larger prediction errors because the parameters are continuous and cumulative in character. On this basis, an adaptive particle swarm process neural network forecasting method is proposed, and an autoregressive moving average (ARMA) error-correction procedure is established to compensate the prediction results of the neural network and improve prediction accuracy. Production data from an alumina plant evaporation process were used for validation; compared with the traditional model, the prediction accuracy of the new model is greatly improved, and it can be used to predict the dynamics of sodium aluminate solution components in the evaporation process.
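    The compensation idea can be illustrated with a simplified stand-in: model the base predictor's residuals as an AR(1) process (the autoregressive core of an ARMA correction) and add the forecast residual back to the base prediction. The data, the base model, and all coefficients below are synthetic.

    ```python
    # Sketch of ARMA-style error compensation: fit an AR(1) model to the
    # base model's residuals and add the one-step residual forecast back
    # to the base prediction. Everything here is synthetic illustration.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 500
    truth = np.sin(np.linspace(0, 20, n))

    # Base model deliberately carries AR(1)-correlated error.
    err = np.zeros(n)
    for t in range(1, n):
        err[t] = 0.8 * err[t - 1] + 0.1 * rng.standard_normal()
    base_pred = truth + err

    resid = truth - base_pred
    # Least-squares AR(1) coefficient of the residual series.
    phi = np.sum(resid[1:] * resid[:-1]) / np.sum(resid[:-1] ** 2)
    corrected = base_pred[1:] + phi * resid[:-1]   # compensated prediction

    rmse_base = np.sqrt(np.mean((truth[1:] - base_pred[1:]) ** 2))
    rmse_corr = np.sqrt(np.mean((truth[1:] - corrected) ** 2))
    print(rmse_corr < rmse_base)   # compensation should shrink the error
    ```

    The corrected forecast error reduces to roughly the AR(1) innovation, which is why the compensated RMSE drops below the base RMSE.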

  7. Variance reduction for Fokker–Planck based particle Monte Carlo schemes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorji, M. Hossein, E-mail: gorjih@ifd.mavt.ethz.ch; Andric, Nemanja; Jenny, Patrick

Recently, Fokker–Planck based particle Monte Carlo schemes have been proposed and evaluated for simulations of rarefied gas flows [1–3]. In this paper, variance reduction for particle Monte Carlo simulations based on the Fokker–Planck model is considered. First, deviational schemes are derived and reviewed, and it is shown that these methods are not appropriate for practical Fokker–Planck based rarefied gas flow simulations: the deviational schemes considered in this study lead either to instabilities, in the case of two-weight methods, or to large statistical errors if the direct sampling method is applied. Motivated by this conclusion, we developed a novel scheme based on correlated stochastic processes. The main idea is to synthesize an additional stochastic process with a known solution, which is solved simultaneously with the main one. By correlating the two processes, the statistical errors can be reduced dramatically, especially for low Mach numbers. To assess the methods, homogeneous relaxation, planar Couette and lid-driven cavity flows were considered. For these test cases, it could be demonstrated that variance reduction based on parallel processes is very robust and effective.
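    The "auxiliary process with a known solution" idea is, in its simplest form, a control variate. The toy example below, which is not a Fokker–Planck solver, estimates an expectation while subtracting a correlated quantity whose mean is known exactly; all quantities are synthetic.

    ```python
    # Minimal control-variate illustration of variance reduction via a
    # correlated auxiliary quantity with a known solution. Target: E[exp(X)]
    # for X ~ N(0,1); auxiliary: X itself, with known mean 0.
    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.standard_normal(100_000)

    target = np.exp(x)      # samples of the quantity of interest
    aux = x                 # correlated auxiliary, E[aux] = 0 known exactly

    # Optimal control-variate coefficient: cov(target, aux) / var(aux)
    c = np.cov(target, aux)[0, 1] / np.var(aux)
    plain = target.mean()
    controlled = (target - c * (aux - 0.0)).mean()

    # Per-sample variance before and after correlating with the auxiliary:
    print(np.var(target), np.var(target - c * aux))
    ```

    Because the optimal coefficient removes the component of the target that is linearly explained by the auxiliary, the controlled estimator's variance is strictly smaller whenever the two are correlated.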

  8. Space-based infrared scanning sensor LOS determination and calibration using star observation

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Xu, Zhan; An, Wei; Deng, Xin-Pu; Yang, Jun-Gang

    2015-10-01

This paper provides a novel methodology for removing sensor bias from a space-based infrared (IR) system (SBIRS) through the use of stars detected in the background field of the sensor. A space-based IR system uses the line of sight (LOS) of a target for target location; LOS determination and calibration is therefore a key precondition for accurate location and tracking of targets, and calibrating the LOS of a scanning sensor is one of the main difficulties. Subsequent changes of the sensor bias are not taken into account in the conventional LOS determination and calibration process. Based on an analysis of the imaging process of the scanning sensor, a theoretical model for estimating the bias angles from star observations is proposed: a process model of the bias angles and an observation model of the stars are established, an extended Kalman filter (EKF) estimates the bias angles, and the sensor LOS is then calibrated. Time-domain simulation results indicate that the proposed method determines and calibrates the sensor LOS with high precision and smooth performance, meeting the timeliness and precision requirements of the target tracking process in a space-based IR tracking system.
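    The core of the bias-angle estimation can be reduced to a familiar pattern: a constant (or slowly varying) bias observed through noisy star residuals, tracked by a Kalman filter (the linear heart of the EKF used above). The scalar sketch below is a hypothetical one-dimensional analogue; the bias value and noise level are invented.

    ```python
    # Toy scalar Kalman filter recovering a constant sensor bias from noisy
    # star residuals (measured direction minus catalog direction). A stand-in
    # for the paper's EKF over full 3-D bias angles; numbers are illustrative.
    import random

    random.seed(42)
    true_bias = 0.35       # hypothetical LOS bias angle, radians
    meas_sigma = 0.05      # star-measurement noise, radians

    est, var = 0.0, 1.0    # prior mean and variance on the bias
    for _ in range(200):
        z = true_bias + random.gauss(0.0, meas_sigma)   # star residual
        # Constant-bias process model: the predict step leaves (est, var) as-is.
        k = var / (var + meas_sigma ** 2)               # Kalman gain
        est += k * (z - est)                            # measurement update
        var *= (1.0 - k)

    print(est, var)
    ```

    After a few hundred star observations the posterior variance collapses, so the estimate tracks the true bias closely; the EKF in the paper does the same with a nonlinear scanning-geometry observation model.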

  9. Development and evaluation of spatial point process models for epidermal nerve fibers.

    PubMed

    Olsbo, Viktor; Myllymäki, Mari; Waller, Lance A; Särkkä, Aila

    2013-06-01

    We propose two spatial point process models for the spatial structure of epidermal nerve fibers (ENFs) across human skin. The models derive from two point processes, Φb and Φe, describing the locations of the base and end points of the fibers. Each point of Φe (the end point process) is connected to a unique point in Φb (the base point process). In the first model, both Φe and Φb are Poisson processes, yielding a null model of uniform coverage of the skin by end points and general baseline results and reference values for moments of key physiologic indicators. The second model provides a mechanistic model to generate end points for each base, and we model the branching structure more directly by defining Φe as a cluster process conditioned on the realization of Φb as its parent points. In both cases, we derive distributional properties for observable quantities of direct interest to neurologists such as the number of fibers per base, and the direction and range of fibers on the skin. We contrast both models by fitting them to data from skin blister biopsy images of ENFs and provide inference regarding physiological properties of ENFs. Copyright © 2013 Elsevier Inc. All rights reserved.
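    The second (cluster) model can be sketched by simulation: draw base points from a homogeneous Poisson process, then give each base a Poisson number of end points displaced around it, so that Φe is a cluster process with Φb as parent points. The intensities and displacement scale below are made-up illustration values, not fitted ENF parameters.

    ```python
    # Simulate a Poisson base-point process Phi_b on the unit square and a
    # clustered end-point process Phi_e: each base spawns a Poisson number
    # of end points, Gaussian-displaced around it. Parameters are invented.
    import numpy as np

    rng = np.random.default_rng(7)
    area, lam_base, mean_fibers, spread = 1.0, 50.0, 3.0, 0.02

    n_base = rng.poisson(lam_base * area)
    bases = rng.uniform(0.0, 1.0, size=(n_base, 2))    # realization of Phi_b

    ends = []
    for b in bases:
        for _ in range(rng.poisson(mean_fibers)):      # fibers per base point
            ends.append(b + rng.normal(0.0, spread, size=2))
    ends = np.array(ends)                              # realization of Phi_e

    print(n_base, len(ends))
    ```

    Summaries of direct interest, such as the distribution of fibers per base, fall out immediately from repeated realizations of this generative scheme.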

  10. Gaussian process-based surrogate modeling framework for process planning in laser powder-bed fusion additive manufacturing of 316L stainless steel

    DOE PAGES

    Tapia, Gustavo; Khairallah, Saad A.; Matthews, Manyalibo J.; ...

    2017-09-22

Here, Laser Powder-Bed Fusion (L-PBF) metal-based additive manufacturing (AM) is complex and not fully understood; successful processing for one material might not apply to a different material. This paper describes a workflow that aims at creating a material data sheet standard describing regimes where the process can be expected to be robust. The procedure consists of building a Gaussian process-based surrogate model of the L-PBF process that predicts melt pool depth in single-track experiments for a given combination of laser power, scan speed, and laser beam size. The predictions are then mapped onto a power versus scan speed diagram delimiting the conduction-controlled from the keyhole-melting-controlled regimes. This statistical framework is shown to be robust even for cases where the experimental training data might be suboptimal in quality, provided appropriate physics-based filters are applied. Additionally, it is demonstrated that a high-fidelity simulation model of L-PBF can equally well be used for building a surrogate model, which is beneficial since simulations are becoming more efficient and are more practical for studying the response of different materials than re-tooling an AM machine for a new material powder.
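    A bare-bones version of such a surrogate is plain Gaussian-process regression with an RBF kernel over process parameters. The sketch below fits a handful of invented (power, scan speed) to melt-pool-depth points; the training values are hypothetical, not data from the paper.

    ```python
    # Minimal GP-regression surrogate: RBF kernel over standardized
    # (laser power, scan speed) inputs, predicting melt pool depth.
    # Training data and hyperparameters are invented for illustration.
    import numpy as np

    def rbf(a, b, length=1.0):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / length ** 2)

    # Hypothetical standardized inputs and melt pool depths (microns).
    X = np.array([[-1.0, -1.0], [-1.0, 1.0], [0.0, 0.0], [1.0, -1.0], [1.0, 1.0]])
    y = np.array([60.0, 30.0, 55.0, 110.0, 70.0])

    noise = 1e-6                                   # near-noiseless fit
    K = rbf(X, X) + noise * np.eye(len(X))
    alpha = np.linalg.solve(K, y)                  # K^{-1} y

    def predict(x_new):
        return rbf(np.atleast_2d(x_new), X) @ alpha

    print(predict([0.0, 0.0])[0])   # should reproduce the training point (~55)
    ```

    In the paper's workflow the same kind of predictor, trained on single-track experiments, is evaluated over a power-speed grid to draw the conduction/keyhole regime boundary.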

  11. Gaussian process-based surrogate modeling framework for process planning in laser powder-bed fusion additive manufacturing of 316L stainless steel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tapia, Gustavo; Khairallah, Saad A.; Matthews, Manyalibo J.

Here, Laser Powder-Bed Fusion (L-PBF) metal-based additive manufacturing (AM) is complex and not fully understood; successful processing for one material might not apply to a different material. This paper describes a workflow that aims at creating a material data sheet standard describing regimes where the process can be expected to be robust. The procedure consists of building a Gaussian process-based surrogate model of the L-PBF process that predicts melt pool depth in single-track experiments for a given combination of laser power, scan speed, and laser beam size. The predictions are then mapped onto a power versus scan speed diagram delimiting the conduction-controlled from the keyhole-melting-controlled regimes. This statistical framework is shown to be robust even for cases where the experimental training data might be suboptimal in quality, provided appropriate physics-based filters are applied. Additionally, it is demonstrated that a high-fidelity simulation model of L-PBF can equally well be used for building a surrogate model, which is beneficial since simulations are becoming more efficient and are more practical for studying the response of different materials than re-tooling an AM machine for a new material powder.

  12. Pen-based computers: Computers without keys

    NASA Technical Reports Server (NTRS)

    Conklin, Cheryl L.

    1994-01-01

The National Space Transportation System (NSTS) comprises many diverse and highly complex systems incorporating the latest technologies. Data collection associated with ground processing of the various Space Shuttle system elements is extremely challenging due to the many separate processing locations where data is generated. This presents a significant problem when the timely collection, transfer, collation, and storage of data is required. This paper describes how new technology, referred to as pen-based computers, is being used to transform the data collection process at Kennedy Space Center (KSC). Pen-based computers have streamlined procedures, increased data accuracy, and now provide more complete information than previous methods. The end result is the elimination of Shuttle processing delays associated with data deficiencies.

  13. Influence of branding on preference-based decision making.

    PubMed

    Philiastides, Marios G; Ratcliff, Roger

    2013-07-01

    Branding has become one of the most important determinants of consumer choices. Intriguingly, the psychological mechanisms of how branding influences decision making remain elusive. In the research reported here, we used a preference-based decision-making task and computational modeling to identify which internal components of processing are affected by branding. We found that a process of noisy temporal integration of subjective value information can model preference-based choices reliably and that branding biases are explained by changes in the rate of the integration process itself. This result suggests that branding information and subjective preference are integrated into a single source of evidence in the decision-making process, thereby altering choice behavior.
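    The "noisy temporal integration" account corresponds to a drift-diffusion model: evidence accumulates with drift plus noise until a decision bound is reached, and the reported branding effect amounts to a change in the drift rate. The simulation below is a generic sketch of that mechanism; the drift values, bound, and noise level are illustrative, not fitted parameters from the study.

    ```python
    # Drift-diffusion sketch: noisy evidence accumulation to a bound, with
    # a branding bias modeled as an increased drift rate toward the branded
    # option. All parameter values are hypothetical.
    import random

    def ddm_choice(drift, bound=1.0, noise=1.0, dt=0.01, seed=None):
        rng = random.Random(seed)
        x = 0.0
        while abs(x) < bound:
            x += drift * dt + rng.gauss(0.0, noise) * dt ** 0.5
        return 1 if x > 0 else 0     # 1 = chose the branded option

    def choice_rate(drift, trials=2000):
        return sum(ddm_choice(drift, seed=i) for i in range(trials)) / trials

    base, branded = choice_rate(0.2), choice_rate(0.5)  # brand raises drift
    print(base, branded)
    ```

    Raising only the drift rate shifts choice probabilities toward the branded option without changing the integration machinery itself, which is the model comparison result the abstract describes.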

  14. Through-process modelling of texture and anisotropy in AA5182

    NASA Astrophysics Data System (ADS)

    Crumbach, M.; Neumann, L.; Goerdeler, M.; Aretz, H.; Gottstein, G.; Kopp, R.

    2006-07-01

    A through-process texture and anisotropy prediction for AA5182 sheet production from hot rolling through cold rolling and annealing is reported. Thermo-mechanical process data predicted by the finite element method (FEM) package T-Pack based on the software LARSTRAN were fed into a combination of physics based microstructure models for deformation texture (GIA), work hardening (3IVM), nucleation texture (ReNuc), and recrystallization texture (StaRT). The final simulated sheet texture was fed into a FEM simulation of cup drawing employing a new concept of interactively updated texture based yield locus predictions. The modelling results of texture development and anisotropy were compared to experimental data. The applicability to other alloys and processes is discussed.

  15. Analysis of InP-based single photon avalanche diodes based on a single recess-etching process

    NASA Astrophysics Data System (ADS)

    Lee, Kiwon

    2018-04-01

Effects of different etching techniques have been investigated by analyzing the electrical and optical characteristics of two types of single-diffused single photon avalanche diodes (SPADs). The two types of fabricated SPADs have no diffusion-depth variation, since a single diffusion process was used for both at the same time. The dry-etched SPADs show a higher temperature dependence of the breakdown voltage, a larger dark count rate (DCR), and a lower photon detection efficiency (PDE) than the wet-etched SPADs, due to plasma-induced damage from the dry-etching process. The results show that dry-etching damage can significantly affect the performance of SPADs based on a single recess-etching process.

  16. Research on manufacturing service behavior modeling based on block chain theory

    NASA Astrophysics Data System (ADS)

    Zhao, Gang; Zhang, Guangli; Liu, Ming; Yu, Shuqin; Liu, Yali; Zhang, Xu

    2018-04-01

According to the attribute characteristics of the processing craft, manufacturing service behavior is divided into service, basic, process, and resource attributes, and an attribute information model of the manufacturing service is established. The manufacturing service behavior information is divided into public and private domains. Block chain technology is then introduced, and an information model of the manufacturing service based on block chain principles is established, which solves the problem of sharing and protecting processing-behavior information and ensures that data cannot be tampered with. Based on key-pair verification, a selective publishing mechanism for manufacturing information is established, achieving traceability of product data and guaranteeing processing quality.
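    The tamper-evidence property the article relies on can be shown with a toy hash chain: each record stores the hash of its predecessor, so altering any earlier process record invalidates every later link. This is only the chaining idea, not a full block chain with consensus or key-pair-based selective publishing.

    ```python
    # Toy hash chain over manufacturing process records: each block commits
    # to its predecessor's hash, so any tampering breaks verification.
    import hashlib
    import json

    def add_block(chain, record):
        prev = chain[-1]["hash"] if chain else "0" * 64
        body = json.dumps({"record": record, "prev": prev}, sort_keys=True)
        chain.append({"record": record, "prev": prev,
                      "hash": hashlib.sha256(body.encode()).hexdigest()})

    def verify(chain):
        for i, blk in enumerate(chain):
            prev = chain[i - 1]["hash"] if i else "0" * 64
            body = json.dumps({"record": blk["record"], "prev": prev},
                              sort_keys=True)
            ok = hashlib.sha256(body.encode()).hexdigest() == blk["hash"]
            if blk["prev"] != prev or not ok:
                return False
        return True

    chain = []
    for step in ["milling", "heat-treat", "inspection"]:   # example records
        add_block(chain, step)
    print(verify(chain))             # True: chain is intact
    chain[1]["record"] = "forged"    # tamper with one process record
    print(verify(chain))             # False: tampering is detected
    ```

    Traceability follows from the same structure: walking the chain backwards from any block reproduces the full, verified processing history of the product.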

  17. Manufacturing process and material selection in concurrent collaborative design of MEMS devices

    NASA Astrophysics Data System (ADS)

    Zha, Xuan F.; Du, H.

    2003-09-01

In this paper we present a knowledge-intensive approach and system for selecting suitable manufacturing processes and materials for microelectromechanical systems (MEMS) devices in a concurrent collaborative design environment. Fundamental issues in MEMS manufacturing process and material selection, such as the concurrent design framework, the manufacturing process and material hierarchies, and the selection strategy, are first addressed. Then, a fuzzy decision support scheme is proposed for the multi-criteria decision-making problem of estimating, ranking and selecting possible manufacturing processes, materials and their combinations. A Web-based prototype advisory system for MEMS manufacturing process and material selection, WebMEMS-MASS, is developed on the client-knowledge server architecture and framework to help the designer find good processes and materials for MEMS devices. The system, one of the important parts of an advanced simulation and modeling tool for MEMS design, is a concept-level process and material selection tool that can be used as a standalone application or as a Java applet via the Web. The running sessions of the system are inter-linked with web pages of tutorials and reference pages to explain the facets, fabrication processes and material choices; calculations and reasoning in selection are performed using process capability and material property data from a remote Web-based database and an interactive knowledge base that can be maintained and updated via the Internet. Use of the developed system, including an operation scenario, user support, and integration with a MEMS collaborative design system, is presented. Finally, an illustrative example is provided.
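    The ranking step of such a fuzzy multi-criteria scheme can be sketched in a few lines: score each process/material candidate per criterion with a triangular fuzzy number, defuzzify by centroid, and aggregate with criterion weights. The candidates, criteria, weights, and scores below are entirely hypothetical, not values from WebMEMS-MASS.

    ```python
    # Hypothetical fuzzy multi-criteria ranking: triangular fuzzy scores per
    # criterion are defuzzified (centroid) and combined with weights.

    def defuzzify(tri):                  # centroid of a triangular fuzzy number
        a, b, c = tri
        return (a + b + c) / 3.0

    # Invented candidate process/material combinations and fuzzy scores (0-10).
    candidates = {
        "bulk-micromachined Si": {"cost": (6, 7, 8), "precision": (5, 6, 7)},
        "LIGA nickel":           {"cost": (2, 3, 4), "precision": (8, 9, 10)},
    }
    weights = {"cost": 0.4, "precision": 0.6}   # hypothetical criterion weights

    scores = {
        name: sum(weights[c] * defuzzify(f) for c, f in crits.items())
        for name, crits in candidates.items()
    }
    best = max(scores, key=scores.get)
    print(best, round(scores[best], 2))
    ```

    Real schemes of this kind typically add fuzzy weights and pairwise consistency checks, but the defuzzify-then-aggregate skeleton is the same.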

  18. Empowering occupational therapists to become evidence-based work rehabilitation practitioners.

    PubMed

    Vachon, Brigitte; Durand, Marie-José; LeBlanc, Jeannette

    2010-01-01

Occupational therapists (OTs) engage in continuing education to integrate the best available knowledge and skills into their practice. However, many barriers influence the degree to which they are currently able to integrate research evidence into their clinical decision-making process. The specific objectives were to explore the clinical decision-making processes OTs used, and to describe the empowerment process they developed to become evidence-based practitioners. Eight OTs who had attended a four-day workshop on evidence-based work rehabilitation were recruited to participate in a reflective practice group. A collaborative research methodology was used. The group was convened for 12 meetings held over a 15-month period. The data collected were analyzed using the grounded theory method. The results revealed the different decision-making modes used by OTs: defensive, repressed, cautious, autonomous intuitive and autonomous thoughtful. These modes influenced the utilization of evidence and determined the stances taken toward practice change. Reflective learning facilitated their utilization of an evidence-based practice model through a three-level empowerment process: deliberateness, client-centeredness and system-mindedness. During the course of this study, participants learned to become evidence-based practitioners. This process had an impact on how they viewed their clients, their practice and the work rehabilitation system.

  19. Results of research on development of an intellectual information system of bankruptcy risk assessment of the enterprise

    NASA Astrophysics Data System (ADS)

    Telipenko, E.; Chernysheva, T.; Zakharova, A.; Dumchev, A.

    2015-10-01

The article presents research results on the development of a knowledge base for an intellectual information system for enterprise bankruptcy risk assessment. The process of developing the knowledge base is analyzed; the main stages, some problems and their solutions are described. The article introduces a connectionist model for bankruptcy risk assessment based on the analysis of industrial enterprise financial accounting; the basis of this connectionist model is a three-layer perceptron trained with the back-propagation-of-error algorithm. The knowledge base of the intellectual information system consists of the processed information and the processing method, represented as the connectionist model. The article presents the structure of the intellectual information system, the knowledge base, and the information processing algorithm for neural network training. The paper gives mean values of 10 indexes for industrial enterprises, with which it is possible to carry out a financial analysis of industrial enterprises and correctly identify the current situation for well-timed managerial decisions. Results of neural network testing are given on data from both bankrupt and financially strong enterprises that were not included in the training and test sets.
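    A three-layer perceptron with back-propagation, the architecture class named above, fits in a short NumPy sketch. For illustration it is trained here on toy XOR data rather than the article's 10 financial indexes; the layer sizes, learning rate, and seed are arbitrary.

    ```python
    # Minimal three-layer perceptron trained by back-propagation of error.
    # Toy XOR data stands in for the article's financial indicators.
    import numpy as np

    rng = np.random.default_rng(3)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0.0], [1.0], [1.0], [0.0]])

    W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)    # input -> hidden
    W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)    # hidden -> output
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    lr = 1.0

    for _ in range(10000):
        h = sigmoid(X @ W1 + b1)                 # forward pass, hidden layer
        out = sigmoid(h @ W2 + b2)               # forward pass, output layer
        d_out = (out - y) * out * (1 - out)      # backprop: output delta
        d_h = (d_out @ W2.T) * h * (1 - h)       # backprop: hidden delta
        W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(0)
        W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(0)

    mse = float(np.mean((out - y) ** 2))
    print(round(mse, 4))
    ```

    In the article's setting the input layer would take the 10 standardized financial indexes and the output would encode bankruptcy risk, but the training loop is the same.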

  20. An Internationally Consented Standard for Nursing Process-Clinical Decision Support Systems in Electronic Health Records.

    PubMed

    Müller-Staub, Maria; de Graaf-Waar, Helen; Paans, Wolter

    2016-11-01

Nurses are accountable for applying the nursing process, which is key for patient care: it is a problem-solving process providing the structure for care plans and documentation. The state-of-the-art nursing process is based on classifications that contain standardized concepts and is therefore named the Advanced Nursing Process. It contains valid assessments, nursing diagnoses, interventions, and nursing-sensitive patient outcomes. Electronic decision support systems can assist nurses in applying the Advanced Nursing Process. However, nursing decision support systems are missing, and no "gold standard" is available. The study aim is to develop a valid Nursing Process-Clinical Decision Support System Standard to guide future developments of clinical decision support systems. In a multistep approach, a Nursing Process-Clinical Decision Support System Standard with 28 criteria was developed. After pilot testing (N = 29 nurses), the criteria were reduced to 25. The Nursing Process-Clinical Decision Support System Standard was then presented to eight internationally known experts, with whom qualitative interviews were performed according to Mayring. Fourteen categories demonstrate expert consensus on the Nursing Process-Clinical Decision Support System Standard and its content validity. All experts agreed the Advanced Nursing Process should be the centerpiece of the Nursing Process-Clinical Decision Support System and should suggest research-based, predefined nursing diagnoses and correct linkages between diagnoses, evidence-based interventions, and patient outcomes.

  1. Balancing Act: How to Capture Knowledge without Killing It.

    ERIC Educational Resources Information Center

    Brown, John Seely; Duguid, Paul

    2000-01-01

    Top-down processes for institutionalizing ideas can stifle creativity. Xerox researchers learned how to combine process-based and practice-based methods in order to disseminate best practices from a community of repair technicians. (JOW)

  2. Distinct and Overlapping Brain Areas Engaged during Value-Based, Mathematical, and Emotional Decision Processing

    PubMed Central

    Hsu, Chun-Wei; Goh, Joshua O. S.

    2016-01-01

When comparing the values of different choices, human beings can rely either on more cognitive processes, such as mathematical computation, or on more affective processes, such as emotion. However, the neural correlates of how these two types of processes operate during value-based decision-making remain unclear. In this study, we investigated the extent to which neural regions engaged during value-based decision-making overlap with those engaged during mathematical and emotional processing in a within-subject manner. In a functional magnetic resonance imaging experiment, participants viewed stimuli that always consisted of numbers and emotional faces that depicted two choices. Across tasks, participants decided between the two choices based on the expected value of the numbers, a mathematical result of the numbers, or the emotional face stimuli. We found that all three tasks commonly involved various cortical areas including frontal, parietal, motor, somatosensory, and visual regions. Critically, the mathematical task shared common areas with the value but not emotion task in bilateral striatum. Although the emotion task overlapped with the value task in parietal, motor, and sensory areas, the mathematical task also evoked responses in other areas within these same cortical structures. Minimal areas were uniquely engaged for the value task apart from the other two tasks. The emotion task elicited a more expansive area of neural activity whereas value and mathematical task responses were in more focal regions. Whole-brain spatial correlation analysis showed that valuative processing engaged functional brain responses more similarly to mathematical processing than emotional processing. While decisions on expected value entail both mathematical and emotional processing regions, mathematical processes have a more prominent contribution particularly in subcortical processes. PMID:27375466

  3. Distinct and Overlapping Brain Areas Engaged during Value-Based, Mathematical, and Emotional Decision Processing.

    PubMed

    Hsu, Chun-Wei; Goh, Joshua O S

    2016-01-01

When comparing the values of different choices, human beings can rely either on more cognitive processes, such as mathematical computation, or on more affective processes, such as emotion. However, the neural correlates of how these two types of processes operate during value-based decision-making remain unclear. In this study, we investigated the extent to which neural regions engaged during value-based decision-making overlap with those engaged during mathematical and emotional processing in a within-subject manner. In a functional magnetic resonance imaging experiment, participants viewed stimuli that always consisted of numbers and emotional faces that depicted two choices. Across tasks, participants decided between the two choices based on the expected value of the numbers, a mathematical result of the numbers, or the emotional face stimuli. We found that all three tasks commonly involved various cortical areas including frontal, parietal, motor, somatosensory, and visual regions. Critically, the mathematical task shared common areas with the value but not emotion task in bilateral striatum. Although the emotion task overlapped with the value task in parietal, motor, and sensory areas, the mathematical task also evoked responses in other areas within these same cortical structures. Minimal areas were uniquely engaged for the value task apart from the other two tasks. The emotion task elicited a more expansive area of neural activity whereas value and mathematical task responses were in more focal regions. Whole-brain spatial correlation analysis showed that valuative processing engaged functional brain responses more similarly to mathematical processing than emotional processing. While decisions on expected value entail both mathematical and emotional processing regions, mathematical processes have a more prominent contribution particularly in subcortical processes.

  4. [Development of whole process quality control and management system of traditional Chinese medicine decoction pieces based on traditional Chinese medicine quality tree].

    PubMed

    Yu, Wen-Kang; Dong, Ling; Pei, Wen-Xuan; Sun, Zhi-Rong; Dai, Jun-Dong; Wang, Yun

    2017-12-01

Whole-process quality control and management of traditional Chinese medicine (TCM) decoction pieces is a systems-engineering problem involving the growing-base environment, seeds and seedlings, harvesting, processing and other steps, so accurate identification of the factors in the TCM production process that may induce quality risk, together with reasonable quality control measures, is very important. At present, the concept of quality risk is mainly discussed in terms of management and regulations; there has been no comprehensive analysis of the possible risks in the quality control process of TCM decoction pieces, nor a summary of effective quality control schemes. A whole-process quality control and management system for TCM decoction pieces based on a TCM quality tree is proposed in this study. The system combines the process analysis method of the TCM quality tree with quality risk management and can help managers make real-time decisions while realizing whole-process quality control of TCM. By providing a personalized web interface, the system realizes user-oriented information feedback and makes it convenient for users to predict, evaluate and control the quality of TCM. In application, the system can identify quality factors such as the base environment, cultivation and pieces processing, extend and modify the existing scientific workflow according to an enterprise's own production conditions, and provide different enterprises with their own quality systems to achieve personalized service. As a new quality management model, this work can provide a reference for improving the quality of Chinese medicine production and quality standardization. Copyright© by the Chinese Pharmaceutical Association.

  5. Data near processing support for climate data analysis

    NASA Astrophysics Data System (ADS)

    Kindermann, Stephan; Ehbrecht, Carsten; Hempelmann, Nils

    2016-04-01

    Climate data repositories grow exponentially in size. Scalable near-data processing capabilities are required to meet future data analysis requirements and to replace current "download the data and process at home" workflows. On the one hand, these processing capabilities should be accessible via standardized interfaces (e.g. OGC WPS); on the other hand, a large variety of processing tools, toolboxes and deployment alternatives have to be supported and maintained at the data/processing center. We present a community approach: a modular and flexible system supporting the development, deployment and maintenance of OGC-WPS-based web processing services. This approach is organized in an open-source GitHub project (called "bird-house") supporting individual processing services ("birds", e.g. climate index calculations, model-data ensemble calculations), which rely on common infrastructural components (e.g. installation and deployment recipes, analysis-code dependency management). To support easy deployment at data centers as well as home institutes (e.g. for testing and development), the system manages the often very complex package dependency chains of climate data analysis packages and supports Docker-based packaging and installation. We present a concrete deployment scenario at the German Climate Computing Center (DKRZ). DKRZ on the one hand hosts a multi-petabyte climate archive integrated into, for example, the European ENES and worldwide ESGF data infrastructures, and on the other hand hosts an HPC center supporting (model) data production and data analysis. The deployment scenario also includes OpenStack-based data cloud services to support data import and distribution for bird-house-based WPS web processing services. Current challenges for inter-institutional deployments of web processing services supporting the European and international climate modeling communities as well as the climate impact community are highlighted. Aspects supporting future WPS-based cross-community usage scenarios, including data reuse and data provenance, are also discussed.
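    The "birds" mentioned above wrap computations such as climate index calculations behind a WPS interface. The following is a minimal, hypothetical sketch of that pattern in plain Python; the index definition (the ETCCDI "SU" summer-days count) is standard, but the `wps_execute` dispatcher and all names are illustrative stand-ins, not birdhouse's or PyWPS's actual API.

    ```python
    # Hypothetical sketch: a climate-index calculation of the kind a
    # "bird" service might expose via an OGC-WPS Execute request.

    def summer_days(daily_tmax_celsius, threshold=25.0):
        """Count days whose daily maximum temperature exceeds a threshold
        (the ETCCDI 'SU' summer-days index uses 25 degrees C)."""
        return sum(1 for t in daily_tmax_celsius if t > threshold)

    def wps_execute(identifier, inputs):
        """Minimal stand-in for a WPS Execute dispatcher: maps a process
        identifier to a local computation on the supplied inputs."""
        processes = {"summer_days": lambda p: summer_days(p["tasmax"])}
        return {"output": processes[identifier](inputs)}

    result = wps_execute("summer_days", {"tasmax": [18.0, 26.5, 30.1, 24.9, 27.0]})
    print(result)  # {'output': 3}
    ```

    In a real deployment the dispatcher would be the WPS server itself and `tasmax` would come from the archived NetCDF data rather than an in-memory list.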

  6. The Morningside Initiative: Collaborative Development of a Knowledge Repository to Accelerate Adoption of Clinical Decision Support

    DTIC Science & Technology

    2010-01-01

    Comparative Effectiveness Research, or other efforts to determine best practices and to develop guidelines based on meta-analysis and evidence-based medicine. An...authoritative reviews or other evidence-based medicine sources, but they have been made unambiguous and computable – a process which sounds...best practice recommendation created through an evidence-based medicine (EBM) development process. The lifecycle envisions four stages of refinement

  7. Fully automated rodent brain MR image processing pipeline on a Midas server: from acquired images to region-based statistics.

    PubMed

    Budin, Francois; Hoogstoel, Marion; Reynolds, Patrick; Grauer, Michael; O'Leary-Moore, Shonagh K; Oguz, Ipek

    2013-01-01

    Magnetic resonance imaging (MRI) of rodent brains enables study of the development and integrity of the brain under certain conditions (alcohol, drugs, etc.). However, these images are difficult to analyze for biomedical researchers with limited image processing experience. In this paper we present an image processing pipeline running on a Midas server, a web-based data storage system. It is composed of the following steps: rigid registration, skull-stripping, average computation, average parcellation, parcellation propagation to individual subjects, and computation of region-based statistics on each image. The pipeline is easy to configure and requires very little image processing knowledge. We present results obtained by processing a data set with this pipeline and demonstrate how it can be used to find differences between populations.
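    The final pipeline step, region-based statistics, amounts to grouping voxel intensities by their parcellation label and summarizing each group. A minimal sketch of that step, using flat Python lists as a stand-in for voxel arrays (the function name and statistics chosen are illustrative, not the pipeline's actual output format):

    ```python
    # Illustrative sketch of per-region statistics: group image values by
    # their parcellation label, then summarize each region.
    from collections import defaultdict
    from statistics import mean

    def region_stats(labels, values):
        """labels: parcellation label per voxel; values: intensity per voxel."""
        by_region = defaultdict(list)
        for lab, val in zip(labels, values):
            by_region[lab].append(val)
        return {lab: {"n": len(v), "mean": mean(v)} for lab, v in by_region.items()}

    labels = [1, 1, 2, 2, 2]
    values = [10.0, 12.0, 5.0, 7.0, 9.0]
    print(region_stats(labels, values))
    # {1: {'n': 2, 'mean': 11.0}, 2: {'n': 3, 'mean': 7.0}}
    ```

    A production pipeline would operate on 3-D label and intensity volumes and typically report additional statistics (volume, standard deviation) per region.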

  8. Optimized Laplacian image sharpening algorithm based on graphic processing unit

    NASA Astrophysics Data System (ADS)

    Ma, Tinghuai; Li, Lu; Ji, Sai; Wang, Xin; Tian, Yuan; Al-Dhelaan, Abdullah; Al-Rodhaan, Mznah

    2014-12-01

    In classical Laplacian image sharpening, all pixels are processed one by one, which entails a large amount of computation. Traditional Laplacian sharpening on a CPU is considerably time-consuming, especially for large images. In this paper, we propose a parallel implementation of Laplacian sharpening based on the Compute Unified Device Architecture (CUDA), a computing platform for Graphics Processing Units (GPUs), and analyze the impact of image size on performance as well as the relationship between data-transfer time and parallel-computing time. Further, exploiting the differing characteristics of GPU memory types, we develop an improved scheme that uses shared memory instead of global memory, further increasing efficiency. Experimental results show that the two novel algorithms outperform the traditional sequential method based on OpenCV in computing speed.
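    The per-pixel computation that the paper parallelizes can be shown in a few lines. Below is a pure-Python sketch of Laplacian sharpening on a grayscale image represented as a list of rows; the 4-neighbour kernel and the clamp to [0, 255] are standard choices, not taken from the paper. On a GPU, each iteration of the inner loop becomes one thread.

    ```python
    # Sequential reference implementation of 4-neighbour Laplacian
    # sharpening; border pixels are left unchanged for simplicity.

    def laplacian_sharpen(img):
        h, w = len(img), len(img[0])
        out = [row[:] for row in img]
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                # 4-neighbour Laplacian: 4*centre - N - S - E - W
                lap = (4 * img[y][x] - img[y - 1][x] - img[y + 1][x]
                       - img[y][x - 1] - img[y][x + 1])
                # sharpened pixel = original + Laplacian, clamped to 8-bit range
                out[y][x] = max(0, min(255, img[y][x] + lap))
        return out

    img = [[10, 10, 10],
           [10, 20, 10],
           [10, 10, 10]]
    print(laplacian_sharpen(img)[1][1])  # 60: the local peak is amplified
    ```

    The CUDA version maps this doubly nested loop onto a 2-D thread grid; the paper's improved scheme stages each thread block's neighbourhood into shared memory to avoid redundant global-memory reads.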

  9. Silicon materials task of the low cost solar array project. Phase 3: Effect of impurities and processing on silicon solar cells

    NASA Technical Reports Server (NTRS)

    Hopkins, R. H.; Davis, J. R.; Blais, P. D.; Rohatgi, A.; Campbell, R. B.; Rai-Choudhury, P.; Mollenkopf, H. C.; Mccormick, J. R.

    1979-01-01

    This is the 13th quarterly report of a study entitled "An Investigation of the Effects of Impurities and Processing on Silicon Solar Cells." The objective of the program is to define the effects of impurities, various thermochemical processes, and any impurity-process interactions on the performance of terrestrial silicon solar cells. The Phase 3 program effort falls into five areas: (1) cell processing studies; (2) completion of the data base and impurity-performance modeling for n-base cells; (3) extension of p-base studies to include contaminants likely to be introduced during silicon production, refining or crystal growth; (4) anisotropy effects; and (5) a preliminary study of the permanence of impurity effects in silicon solar cells. The quarterly activities in this report focus on tasks (1), (3) and (4).

  10. Process-oriented guided-inquiry learning: a natural fit for occupational therapy education.

    PubMed

    Jaffe, Lynn; Gibson, Robert; D'Amico, Mariana

    2015-04-01

    After a brief review of the major group cooperative learning strategies, this article presents the format and use of Process-Oriented Guided-Inquiry Learning (POGIL) as a recommended teaching strategy for occupational therapy classes. This recommendation is based upon evidence of effectiveness of this strategy for enhancing critical thinking, content retention, and teamwork. Strategies for learning the process and suggestions for its use are based upon literature evidence and the authors' experiences with this strategy over 4 years in a class on evidence-based practice.

  11. Wide-bandgap III-Nitride based Second Harmonic Generation

    DTIC Science & Technology

    2014-10-02

    fabrication process for a GaN LPS. Fig. 1: 3-step fabrication process of a GaN-based lateral polar structure. (a) Growth of a 20 nm AlN buffer layer...etching of the LT-AlN stripes. These results are shown in Fig. 2 (a) and (b). Fig. 2: AFM images of KOH (a) and RIE (b) patterned templates for lateral...was varied between 0.6 - 1.0. FIG. 3: Growth process of AlGaN-based lateral polar structures. (a) RIE patterning. (b) Growth of HT-AlN. (c

  12. Negotiation-based Order Lot-Sizing Approach for Two-tier Supply Chain

    NASA Astrophysics Data System (ADS)

    Chao, Yuan; Lin, Hao Wen; Chen, Xili; Murata, Tomohiro

    This paper focuses on a negotiation-based collaborative planning process for determining order lot-sizes over multi-period planning, confined to a two-tier supply chain scenario. The aim is to study how negotiation-based planning processes can be used to refine locally preferred ordering patterns, which consequently affect the overall performance of the supply chain in terms of costs and service level. Minimal information exchanges in the form of mathematical models are suggested to represent the local preferences and to support the negotiation processes.

  13. Model-Based Verification and Validation of the SMAP Uplink Processes

    NASA Technical Reports Server (NTRS)

    Khan, M. Omair; Dubos, Gregory F.; Tirona, Joseph; Standley, Shaun

    2013-01-01

    This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes allow, by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based V&V development efforts.

  14. Design Process for Online Websites Created for Teaching Turkish as a Foreign Language in Web Based Environments

    ERIC Educational Resources Information Center

    Türker, Fatih Mehmet

    2016-01-01

    In today's world, where online learning environments have increased their efficiency in education and training, the design of the websites prepared for education and training purposes has become an important process. This study is about the teaching process of the online learning environments created to teach Turkish in web based environments, and…

  15. Process Evaluation of a School-Based Weight Gain Prevention Program: The Dutch Obesity Intervention in Teenagers (DOiT)

    ERIC Educational Resources Information Center

    Singh, A. S.; Chinapaw, M. J. M.; Brug, J.; van Mechelen, W.

    2009-01-01

    Health promotion programs benefit from an accompanying process evaluation since it can provide more insight in the strengths and weaknesses of a program. A process evaluation was conducted to assess the reach, implementation, satisfaction and maintenance of a school-based program aimed at the prevention of excessive weight gain among Dutch…

  16. Determination of the smoke-plume heights and their dynamics with ground-based scanning LIDAR

    Treesearch

    V. Kovalev; A. Petkov; C. Wold; S. Urbanski; W. M. Hao

    2015-01-01

    Lidar-data processing techniques are analyzed, which allow determining smoke-plume heights and their dynamics and can be helpful for the improvement of smoke dispersion and air quality models. The data processing algorithms considered in the paper are based on the analysis of two alternative characteristics related to the smoke dispersion process: the regularized...

  17. Features of electrophoretic deposition process of nanostructured electrode materials for planar Li-ion batteries

    NASA Astrophysics Data System (ADS)

    Melkozyorova, N. A.; Zinkevich, K. G.; Lebedev, E. A.; Alekseyev, A. V.; Gromov, D. G.; Kitsyuk, E. P.; Ryazanov, R. M.; Sysa, A. V.

    2017-11-01

    The features of electrophoretic deposition process of composite LiCoO2-based cathode and Si-based anode materials were researched. The influence of the deposition process parameters on the structure and composition of the deposit was revealed. The possibility of a local deposition of composites on a planar lithium-ion battery structure was demonstrated.

  18. Prevention of falls, malnutrition and pressure ulcers among older persons - nursing staff's experiences of a structured preventive care process.

    PubMed

    Lannering, Christina; Ernsth Bravell, Marie; Johansson, Linda

    2017-05-01

    A structured and systematic care process for preventive work, aimed at reducing falls, pressure ulcers and malnutrition among older people, has been developed in Sweden. The process involves risk assessment, team-based interventions and evaluation of results. Since its development, this structured work process has become web-based, has been implemented in a national quality registry called 'Senior Alert' and is used countrywide. The aim of this study was to describe nursing staff's experience of preventive work using the structured preventive care process as outlined by Senior Alert. Eight focus group interviews were conducted during 2015, including staff from nursing homes and home-based nursing care in three municipalities. The interview material was subjected to qualitative content analysis. In this study, both positive and negative opinions were expressed about the process. The systematic and structured workflow seemed to only partly help care providers improve care quality by making better clinical assessments, performing team-based planned interventions and learning from results. Participants described a lack of reliability in the assessments and expressed varying opinions about the structure. Furthermore, organisational structures limited the preventive work. © 2016 John Wiley & Sons Ltd.

  19. Simplified process model discovery based on role-oriented genetic mining.

    PubMed

    Zhao, Weidong; Liu, Xi; Dai, Weihui

    2014-01-01

    Process mining is the automated acquisition of process models from event logs. Although many process mining techniques have been developed, most of them are based on control flow. Meanwhile, existing role-oriented process mining methods focus on the correctness and integrity of roles while ignoring the role complexity of the process model, which directly impacts the understandability and quality of the model. To address these problems, we propose a genetic programming approach to mine simplified process models. Using a new metric of process complexity in terms of roles as the fitness function, we can find simpler process models. The new role-complexity metric of process models is designed from role cohesion and coupling, and is applied to discover roles in process models. Moreover, the higher fitness derived from the role-complexity metric also provides a guideline for redesigning process models. Finally, we conduct a case study and experiments showing that the proposed method is more effective for streamlining the process compared with related studies.
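    To give a concrete feel for a role-oriented fitness term of the kind described above, here is a toy sketch in which role coupling is measured as the fraction of hand-overs between different roles along a trace, and lower coupling yields higher fitness. The formula and all names are illustrative assumptions, not the paper's actual metric (which also incorporates role cohesion).

    ```python
    # Toy role-coupling fitness: traces whose consecutive activities stay
    # within one role score higher (simpler role structure).

    def role_fitness(trace, role_of):
        """trace: ordered list of activities; role_of: activity -> role."""
        roles = [role_of[a] for a in trace]
        # a hand-over occurs whenever consecutive activities belong
        # to different roles
        handovers = sum(1 for r1, r2 in zip(roles, roles[1:]) if r1 != r2)
        coupling = handovers / max(1, len(roles) - 1)
        return 1.0 - coupling  # higher fitness = fewer role hand-overs

    role_of = {"register": "clerk", "check": "clerk", "approve": "manager"}
    print(role_fitness(["register", "check", "approve"], role_of))  # 0.5
    ```

    In a genetic-programming setting this score would be one component of the overall fitness, combined with conformance terms so that simplification never comes at the cost of replaying the log incorrectly.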

  20. Mode extraction on wind turbine blades via phase-based video motion estimation

    NASA Astrophysics Data System (ADS)

    Sarrafi, Aral; Poozesh, Peyman; Niezrecki, Christopher; Mao, Zhu

    2017-04-01

    In recent years, image processing techniques have been applied more often for structural dynamics identification, characterization, and structural health monitoring. Although image processing is a non-contact, full-field measurement method, it still has a long way to go to outperform conventional sensing instruments (i.e. accelerometers, strain gauges, laser vibrometers, etc.). However, the technologies associated with image processing are developing rapidly and gaining attention in a variety of engineering applications, including structural dynamics identification and modal analysis. Among the numerous motion estimation and image-processing methods, phase-based video motion estimation is considered one of the most efficient with respect to computational cost and noise robustness. In this paper, phase-based video motion estimation is adopted for structural dynamics characterization of a 2.3-meter-long Skystream wind turbine blade, and the modal parameters (natural frequencies, operating deflection shapes) are extracted. The phase-based video processing adopted in this paper provides reliable full-field 2-D motion information, which is beneficial for manufacturing certification and model updating at the design stage. The approach is demonstrated by processing data on a full-scale commercial structure (i.e. a wind turbine blade) with complex geometry and properties, and the results obtained correlate well with the modal parameters extracted from accelerometer measurements, especially for the first four bending modes, which are of significant importance in blade characterization.
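    The core idea behind phase-based motion estimation is that spatial displacement appears as a change in local phase. A toy 1-D illustration (not the paper's method, which uses complex steerable filters on video frames): a circular shift of a signal multiplies its Fourier coefficients by a linear phase factor, so the displacement between two "frames" can be read off from the phase difference at a single frequency.

    ```python
    # Toy illustration: recover a circular shift between two 1-D signals
    # from the phase difference of their k=1 Fourier component.
    import cmath
    import math

    def dft(x):
        n = len(x)
        return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
                for k in range(n)]

    def estimate_shift(frame0, frame1):
        """Estimate an integer circular shift: a shift of d samples changes
        the phase at frequency k=1 by -2*pi*d/n."""
        n = len(frame0)
        p0 = cmath.phase(dft(frame0)[1])
        p1 = cmath.phase(dft(frame1)[1])
        return round(((p0 - p1) * n / (2 * math.pi)) % n)

    frame0 = [math.sin(2 * math.pi * t / 16) for t in range(16)]
    frame1 = frame0[-3:] + frame0[:-3]  # circular shift right by 3 samples
    print(estimate_shift(frame0, frame1))  # 3
    ```

    Phase-based video methods apply the same principle locally and in 2-D, using the phase of oriented complex filter responses to track sub-pixel motion at every image location over time.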

Top