Sample records for haines process

  1. Haines - Skagway Submarine Cable Intertie Project, Haines to Skagway, Alaska Final Technical and Construction Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    See, Alan; Rinehart, Bennie N; Marin, Glen

    1998-11-01

    The Haines to Skagway submarine cable project is located in Taiya Inlet, at the north end of Lynn Canal, in Southeast Alaska. The cable is approximately 15 miles long, with three landings and splice vaults. The cable is 35 kV, 3-phase, and armored. The cable interconnects the Goat Lake Hydro Project near Skagway with the community of Haines. Both communities are now on 100% hydroelectric power. The Haines to Skagway submarine cable is the result of AP&T's goal of an alternative, economic, and environmentally friendly energy source for the communities served and of eliminating the use of diesel fuel as the primary source of energy. Diesel units will continue to be used as a backup system.

  2. Using a Simple Parcel Model to Investigate the Haines Index

    Treesearch

    Mary Ann Jenkins; Steven K. Krueger; Ruiyu Sun

    2003-01-01

    The Haines Index (Haines 1988) is a fire-weather index based on stability and moisture conditions of the lower atmosphere that rates the potential for large fire growth or extreme fire behavior. The Haines Index is calculated by adding a temperature (stability) term A to a moisture term B.
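    Since this record and several others below refer to the index's two-term (A + B) structure, the following is a minimal Python sketch of that calculation. The threshold tables are the ones commonly published for the three elevation variants, not taken from these records, and boundary conventions differ slightly among sources; function and variable names are illustrative only.

      # Minimal sketch of the Haines Index (A + B), under the assumptions noted above.
      def _category(value, low, high):
          """Map a value to category 1, 2, or 3 using two thresholds (degrees C)."""
          if value < low:
              return 1
          if value <= high:
              return 2
          return 3

      # (stability thresholds, moisture thresholds) per elevation variant, degrees C
      THRESHOLDS = {
          "low":  ((4, 7),   (6, 9)),   # A: T950-T850, B: 850 hPa dewpoint depression
          "mid":  ((6, 10),  (6, 12)),  # A: T850-T700, B: 850 hPa dewpoint depression
          "high": ((18, 21), (15, 20)), # A: T700-T500, B: 700 hPa dewpoint depression
      }

      def haines_index(temp_diff_c, dewpoint_depression_c, variant="mid"):
          """Return the Haines Index (2-6) from a lapse term and a moisture term."""
          (a_lo, a_hi), (b_lo, b_hi) = THRESHOLDS[variant]
          a = _category(temp_diff_c, a_lo, a_hi)            # stability term A
          b = _category(dewpoint_depression_c, b_lo, b_hi)  # moisture term B
          return a + b

      # Example: mid-elevation variant, 8 C temperature difference (850-700 hPa)
      # and 14 C dewpoint depression at 850 hPa -> A = 2, B = 3, index = 5.
      print(haines_index(8.0, 14.0, variant="mid"))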

  3. A North American regional reanalysis climatology of the Haines Index

    Treesearch

    Wei Lu; Joseph J. (Jay) Charney; Sharon Zhong; Xindi Bian; Shuhua Liu

    2011-01-01

    A warm-season (May through October) Haines Index climatology is derived using 32-km regional reanalysis temperature and humidity data from 1980 to 2007. We compute lapse rates, dewpoint depressions, Haines Index factors A and B, and values for each of the low-, mid- and high-elevation variants of the Haines Index. Statistical techniques are used to investigate the...

  4. Performance of the Haines Index During August 2000 for Montana

    Treesearch

    Brian E. Potter; Scott Goodrick

    2003-01-01

    The Haines Index, introduced by Haines (1988) as the Lower Atmosphere Severity Index, is designed to gauge how readily the lower mid-troposphere (500 to 4500 m AGL) will spur an otherwise fairly predictable fire to become erratic and unmanageable. Based on stability and moisture, the Haines Index (hereafter, HI) takes on integer values from 2 to 6, with 2 being very...

  5. The Perfect Career: Jennifer Hain-Teper--University of Illinois Library, Urbana-Champaign

    ERIC Educational Resources Information Center

    Library Journal, 2004

    2004-01-01

    This article details the work of Jennifer Hain-Teper, from the University of Illinois Library. For Jennifer Hain Teper, it all began with a simple question: "If you could do anything at all, regardless of pay, what would it be?" This was posed by a friend when Hain Teper was a volunteer at the wildflower research center in Austin, TX.…

  6. Magnetite deposits near Klukwan and Haines, southeastern Alaska

    USGS Publications Warehouse

    Robertson, Eugene C.

    1956-01-01

    Low-grade iron ore is found in magnetite-bearing pyroxenite bodies near Klukwan and Haines in Southeastern Alaska. An alluvial fan at Haines also contains magnetite-bearing rock of possible economic significance. The Haines-Klukwan area is underlain by rocks of Mesozoic age, including epidote diorite, quartz diorite, and alaskite of the Coast Range batholith, metabasalt (recrystallized lava flows and pyroclastic rocks), and, in the southern part, interbedded slate and limestone. Layering and foliation, where perceptible, generally strike northwest and dip steeply northeast. The iron deposits are found at or near the contact between the metabasalt and epidote diorite; they appear to represent highly altered lava flows that were metamorphosed during the emplacement of the batholith. Several billion tons of rock containing about 13 percent magnetic iron are included in the pyroxenite body at Klukwan. Sampling and dip-needle data suggest the presence there of two or three tabular zones in which the rock has an average magnetic iron content of 20 percent or more. Pyroxenite bodies outcropping in three areas near Haines apparently are lower in grade than the Klukwan deposit; lack of exposures prevented thorough sampling, but reconnaissance traverses with a dip needle failed to reveal important zones of high-grade iron ore. An alluvial fan adjoining the pyroxenite body at Klukwan contains several hundred million tons of broken rock having a magnetic iron content of about 10 percent.

  7. Climatological and statistical characteristics of the Haines Index for North America

    Treesearch

    Julie A. Winkler; Brian E. Potter; Dwight F. Wilhelm; Ryan P. Shadbolt; Krerk Piromsopa; Xindi Bian

    2007-01-01

    The Haines Index is an operational tool for evaluating the potential contribution of dry, unstable air to the development of large or erratic plume-dominated wildfires. The index has three variants related to surface elevation, and is calculated from temperature and humidity measurements at atmospheric pressure levels. To effectively use the Haines Index, fire...

  8. 33 CFR 165.1712 - Safety Zones; Annual Independence Day Firework Displays, Skagway, Haines, and Wrangell, AK.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Day Firework Displays, Skagway, Haines, and Wrangell, AK. 165.1712 Section 165.1712 Navigation and... Displays, Skagway, Haines, and Wrangell, AK. (a) Regulated areas. The following areas are permanent safety..., Haines, AK within a 300-yard radius around the fireworks launch area, centered at approximate position 59...

  9. 33 CFR 165.1712 - Safety Zones; Annual Independence Day Firework Displays, Skagway, Haines, and Wrangell, AK.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Day Firework Displays, Skagway, Haines, and Wrangell, AK. 165.1712 Section 165.1712 Navigation and... Displays, Skagway, Haines, and Wrangell, AK. (a) Regulated areas. The following areas are permanent safety..., Haines, AK within a 300-yard radius around the fireworks launch area, centered at approximate position 59...

  10. Timber supply and use in the Haines-Skagway area, Alaska.

    Treesearch

    Vernon J. LaBau; O. Keith Hutchison

    1976-01-01

    Discusses the results of a 1965 forest inventory of 449,300 acres in the Haines-Skagway area. Selected references are used to describe the economy of the area historically and currently. Interpretations and assessments of the timber resource in the continuing economy are made.

  11. Comments on the Yule Marble Haines block: potential replacement, Tomb of the Unknown Soldier, Arlington National Cemetery

    USGS Publications Warehouse

    Mossotti, Victor G.

    2014-01-01

    Marble for the Tomb of the Unknown Soldier at Arlington National Cemetery was cut from the Colorado Yule Marble Quarry in 1931. Although anecdotal reports suggest that cracks were noticed in the main section of the monument shortly after its installation at the Arlington National Cemetery in Arlington, Virginia, detailed documentation of the extent of cracking did not appear until 1963. Although debate continues as to whether the main section of the Tomb of the Unknowns monument should be repaired or replaced, Mr. John S. Haines of Glenwood Springs, Colorado, in anticipation of the permanent closing of the Yule Quarry, donated a 58-ton block of Yule Marble, the so-called Haines block, as a potential backup. The brief study reported here was conducted during mid-summer 2009 at the behest of the superintendent of Arlington National Cemetery. The field team entered the subterranean Yule Marble Quarry with the Chief Extraction Engineer in order to contrast the method used for extraction of the Haines block with the method that was probably used to extract the marble block that is now cracked. Based on surficial inspection and shallow coring of the Haines block, and on the nature of crack propagation in Yule Marble as judged by close inspection of a large collection of surrogate Yule Marble blocks, the team found the block to be structurally sound and cosmetically equivalent to the marble used for the current monument. If the Haines block were needed, it would be an appropriate replacement for the existing cracked section of the Tomb of the Unknown Soldier Monument.

  12. 33 CFR 207.170a - Eugene J. Burrell Navigation Lock in Haines Creek near Lisbon, Fla.; use, administration, and...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 3 2010-07-01 2010-07-01 false Eugene J. Burrell Navigation Lock in Haines Creek near Lisbon, Fla.; use, administration, and navigation. 207.170a Section 207.170a... REGULATIONS § 207.170a Eugene J. Burrell Navigation Lock in Haines Creek near Lisbon, Fla.; use...

  13. Accuracy of 24- and 48-Hour Forecasts of Haines' Index

    Treesearch

    Brian E. Potter; Jonathan E. Martin

    2001-01-01

    The University of Wisconsin-Madison produces Web-accessible, 24- and 48-hour forecasts of the Haines Index (a tool used to measure the atmospheric potential for large wildfire development) for most of North America using its nonhydrostatic modeling system. The authors examined the accuracy of these forecasts using data from 1999 and 2000. Measures used include root-...

  14. The interannual variability of the Haines Index over North America

    Treesearch

    Lejiang Yu; Shiyuan Zhong; Xindi Bian; Warren E. Heilman; Joseph J. Charney

    2013-01-01

    The Haines index (HI) is a fire-weather index that is widely used as an indicator of the potential for dry, low-static-stability air in the lower atmosphere to contribute to erratic fire behavior or large fire growth. This study examines the interannual variability of HI over North America and its relationship to indicators of large-scale circulation anomalies. The...

  15. Akuginow and Haines-Stiles Receive 2013 Robert C. Cowen Journalism Award: Citation

    NASA Astrophysics Data System (ADS)

    Alley, Richard

    2014-01-01

    From Cosmos to Mars and Pluto and back home, Geoffrey Haines-Stiles and Erna Akuginow have invested their careers reporting the best modern science in novel, compelling, and accessible ways through documentaries, live events, print, and new media. They are outstanding recipients of the AGU Robert C. Cowen Award for Sustained Achievement in Science Journalism.

  16. An examination of the sensitivity of numerically simulated wildfires to low-level atmospheric stability and moisture, and the consequences for the Haines Index

    Treesearch

    Mary Ann Jenkins

    2002-01-01

    The Haines Index, an operational fire-weather index introduced in 1988 and based on the observed stability and moisture content of the near-surface atmosphere, has been a useful indicator of the potential for high-risk fires in low wind conditions and flat terrain. The Haines Index is of limited use, however, as a predictor of actual fire behavior. To develop a fire-...

  17. Reconnaissance engineering geology of the Haines area, Alaska, with emphasis on evaluation of earthquake and other geologic hazards

    USGS Publications Warehouse

    Lemke, Richard Walter; Yehle, Lynn A.

    1972-01-01

    The Alaska earthquake of March 27, 1964, brought into sharp focus the need for engineering geologic studies in urban areas. Study of the Haines area constitutes an integral part of an overall program to evaluate earthquake and other geologic hazards in most of the larger Alaska coastal communities. The evaluations of geologic hazards that follow, although based only upon reconnaissance studies and, therefore, subject to revision, will provide broad guidelines useful in city and land-use planning. It is hoped that the knowledge gained will result in new facilities being built in the best possible geologic environments and being designed so as to minimize future loss of life and property damage. Haines, which is in the northern part of southeastern Alaska approximately 75 miles northwest of Juneau, had a population of about 700 people in 1970. It is built at the northern end of the Chilkat Peninsula and lies within the Coast Mountains of the Pacific Mountain system. The climate is predominantly marine and is characterized by mild winters and cool summers. The mapped area described in this report comprises about 17 square miles of land; deep fiords constitute most of the remaining mapped area that is evaluated in this study. The Haines area was covered by glacier ice at least once and probably several times during the Pleistocene Epoch. The presence of emergent marine deposits, several hundred feet above sea level, demonstrates that the land has been uplifted relative to sea level since the last major deglaciation of the region about 10,000 years ago. The rate of relative uplift of the land at Haines during the past 39 years is 2.26 cm per year. Most or all of this uplift appears to be due to rebound as a result of deglaciation. Both bedrock and surficial deposits are present in the area. Metamorphic and igneous rocks constitute the exposed bedrock. The metamorphic rocks consist of metabasalt of Mesozoic age and pyroxenite of probable early middle Cretaceous age. The

  18. Application of the Haines Index in the fire warning system

    NASA Astrophysics Data System (ADS)

    Kalin, Lovro; Marija, Mokoric; Tomislav, Kozaric

    2016-04-01

    Croatia, like all Mediterranean countries, is strongly affected by large wildfires, particularly in the coastal region. In the last two decades the number and intensity of fires have increased significantly, which is widely associated with climate change, e.g. global warming. More extreme fires are observed, and the fire-fighting season has extended into June and September. Meteorological support for fire protection and planning is therefore even more important. At the Meteorological and Hydrological Service of Croatia a comprehensive monitoring and warning system has been established. It includes standard components, such as short-term forecasts of the Fire Weather Index (FWI), as well as long-range forecasts. However, due to more frequent hot and dry seasons, the FWI often provides little additional information about extremely high fire danger, since it regularly takes the highest values for long periods. Additional tools have therefore been investigated. One widely used meteorological product is the Haines Index (HI). It provides information on potential fire growth, taking into account only the vertical instability of the atmosphere and not the state of the fuel. Several analyses and studies carried out at the Service confirmed the correlation of high HI values with large and extreme fires. The Haines Index forecast has been used at the Service for several years, employing the European Centre for Medium-Range Weather Forecasts (ECMWF) global prediction model as well as the limited-area Aladin model. Verification results show that these forecasts are reliable when compared to radiosonde measurements. These results supported the introduction of additional fire warnings, which are issued by the Service's Forecast Department.

  19. Preliminary research findings from a study of the sociocultural effects of tourism in Haines, Alaska.

    Treesearch

    Lee K. Cerveny

    2004-01-01

    This report examines the growth and development of the tourism industry in Haines, Alaska, and its effects on community life and land use. It also describes the development of cruise-based tourism and its relation to shifts in local social and economic structures and patterns of land use, especially local recreation use trends. A multisited ethnographic approach was...

  20. Climatic variability of a fire-weather index based on turbulent kinetic energy and the Haines Index

    Treesearch

    Warren E. Heilman; Xindi Bian

    2010-01-01

    Combining the Haines Index (HI) with near-surface turbulent kinetic energy (TKEs) through a product of the two values (HITKEs) has shown promise as an indicator of the atmospheric potential for extreme and erratic fire behavior in the U.S. Numerical simulations of fire-weather evolution during past wildland fire episodes in...

  1. [Heavy metal pollution characteristics and ecological risk analysis for soil around Haining electroplating industrial park].

    PubMed

    Li, Jiong-Hui; Weng, Shan; Fang, Jing; Huang, Jia-Lei; Lu, Fang-Hua; Lu, Yu-Hao; Zhang, Hong-Ming

    2014-04-01

    The pollution status and potential ecological risks of heavy metals in soils around the Haining electroplating industrial park were studied. The Hakanson index approach was used to assess the ecological hazards of heavy metals in the soils. Results showed that average concentrations of six heavy metals (Cu, Ni, Pb, Zn, Cd and Cr) in the soils were lower than the secondary criteria of the environmental quality standard for soils, indicating limited harmful effects on plants and the environment in general. Although the average soil concentrations were low, heavy metal concentrations at six sampling points located at roadsides still exceeded the criteria, an exceedance rate of 13%. Statistical analysis showed that concentrations of Cu and Cd in roadside soils were significantly higher than those in non-roadside soils, indicating that the excessive heavy metal accumulations in the soil were closely related to traffic. The average potential ecological hazard index of soils around the Haining electroplating industrial park was 46.6, suggesting slight ecological harm. However, the potential ecological hazard index of soils with excessive heavy metals was 220-278, suggesting medium ecological hazard. Cd was the most serious ecological hazard factor.
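    For context on the Hakanson approach cited in this record, here is a minimal Python sketch of the standard formulation, assumed from the general literature rather than quoted from the abstract: each metal's contamination factor is its measured concentration divided by a background value, each factor is weighted by a toxic-response factor, and the weighted factors are summed into the risk index RI (the 46.6 and 220-278 values above are on that RI scale).

      # Minimal sketch (standard Hakanson formulation assumed): Cf = C / C_background,
      # Er = Tr * Cf, RI = sum of Er over the metals considered.

      # Toxic-response factors (Tr) as commonly tabulated for these metals.
      TOXIC_RESPONSE = {"Cu": 5, "Ni": 5, "Pb": 5, "Zn": 1, "Cd": 30, "Cr": 2}

      def potential_ecological_risk(measured, background):
          """Return (per-metal Er, total RI) from concentration dicts (mg/kg)."""
          er = {m: TOXIC_RESPONSE[m] * measured[m] / background[m] for m in measured}
          return er, sum(er.values())

      # Hypothetical numbers for illustration only; the record's soil data are not
      # reproduced here.
      er, ri = potential_ecological_risk(
          measured={"Cu": 60.0, "Cd": 0.5},
          background={"Cu": 35.0, "Cd": 0.1},
      )
      print(er, ri)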

  2. Evidence of Cold Climate Slope Processes from the New Jersey Coastal Plain: Debris Flow Stratigraphy at Haines Corner, Camden County, New Jersey

    USGS Publications Warehouse

    Newell, Wayne L.

    2005-01-01

    Excavations through surficial deposits across the New Jersey Coastal Plain commonly reveal homogenized surficial sediments, deformed sedimentary structures, chaotically rearranged bed-forms, and wedge-shaped cracks filled with sand from the top-most layers of extant soil profiles. As a whole, these abundant, broadly distributed phenomena are best explained as artifacts of an era of frozen ground during the last Pleistocene glacial maximum. Vigorous freeze-thaw processes and abundant seasonal rainfall created a landscape of low relief covered by highly mobile surficial deposits. The surficial deposits grade into broad, flat-bottomed valleys now drained by small, tightly meandering, underfit streams. Modern fluvial, aeolian, and slope processes are ineffectual in either creating or modifying these landscapes. One particularly brief exposure of complex slope deposits was documented at Haines Corner, Camden County, during the field work (1986) for the Surficial Geologic Map of southern and central New Jersey. The exposure, now presented and interpreted here, provides previously unavailable details of a system of freeze-thaw driven processes that unfolded upon a frozen, impermeable substrate 80 miles south of the southern margin of the Wisconsinan glacial advance to Long Island, N.Y. At the time of these extreme processes, the presently sub-aerial New Jersey Coastal Plain was not proximal to moderating effects of the Atlantic Ocean, being about 100 miles inland and 300 feet above the lowered sea level. Current studies of analogous deposits across the mid-Atlantic Coastal Plain now benefit from dating techniques that were not available during the geologic mapping field work (1985-'92). During the mapping in New Jersey, hundreds of exposures failed to produce datable carbon remains within the stratigraphy of the surficial deposits. Recently reported TL dates from wind-blown sand filling frost wedges, exposed elsewhere in New Jersey, indicate that the widely

  3. Using Haines Index coupled with fire weather model predicted from high resolution LAM forecasts to assess extreme wildfire behaviour in Southern Europe.

    NASA Astrophysics Data System (ADS)

    Gaetani, Francesco; Baptiste Filippi, Jean; Simeoni, Albert; D'Andrea, Mirko

    2010-05-01

    The Haines Index (HI) was developed by the USDA Forest Service to measure the atmosphere's contribution to the growth potential of a wildfire. The Haines Index combines two atmospheric factors that are known to have an effect on wildfires: stability and dryness. As an operational tool, HI has proved its ability to predict plume-dominated, high-intensity wildfires. However, since HI does not take into account fuel continuity, composition, and moisture conditions, or the effects of wind and topography on fire behaviour, its use as a forecasting tool should be carefully considered. In this work we propose the use of HI, predicted from high-resolution Limited Area Model forecasts, coupled with a fire weather model (the RISICO system) fully operational in Italy since 2003. RISICO is based on dynamic models able to represent in space and in time the effects that the environment and vegetal physiology have on fuels and, in turn, on the potential behaviour of wildfires. The system automatically acquires from remote databases a thorough data set of both in situ and spatial input information. Meteorological observations, radar data, Limited Area Model weather forecasts, EO data, and fuel data are managed by a Unified Interface able to process a wide set of different data. Specific semi-physical models are used in the system to simulate the dynamics of the fuels (load and moisture content of dead and live fuel) and the potential fire behaviour (rate of spread and linear intensity). A preliminary validation of this approach will be provided with reference to Sardinia and Corsica, two major islands of the Mediterranean Sea frequently affected by extreme plume-dominated wildfires. A time series of about 3000 wildfires burnt in Sardinia and Corsica in 2007 and 2008 will be used to evaluate the capability of HI, coupled with the outputs of the fire weather model, to forecast the actual risk in time and space.

  4. 77 FR 74508 - Notice of Availability of the Draft Resource Management Plan Amendment, Draft Environmental...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-14

    ... Statement for the Ring of Fire Resource Management Plan--Haines Planning Area, Alaska AGENCY: Bureau of Land.../Draft Environmental Impact Statement (EIS) for the Ring of Fire RMP for the Haines Planning Area and by... Fire RMP-- Haines Planning Area Amendment by any of the following methods: Email: [email protected

  5. Keeping Haines Real - Or Really Changing Haines?

    Treesearch

    Brian E. Potter; Dan Borsum; Don Haines

    2002-01-01

    Most incident command teams can handle low- to moderate-intensity fires with few unanticipated problems. However, high-intensity situations, especially the plume-dominated fires that often develop when winds are low and erratic behavior is unexpected, can create dangerous situations even for well-trained, experienced fire crews (Rothermel 1991). Plume-dominated fires...

  6. 75 FR 16822 - Federal Property Suitable as Facilities To Assist the Homeless

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-02

    ... FURTHER INFORMATION CONTACT: Kathy Ezzell, Department of Housing and Urban Development, 451 Seventh Street... ALASKA Dalton-Cache Border Station Mile 42 Haines Highway Haines AK 99827 Landholding Agency: GSA...

  7. Testing of the high accuracy inertial navigation system in the Shuttle Avionics Integration Lab

    NASA Technical Reports Server (NTRS)

    Strachan, Russell L.; Evans, James M.

    1991-01-01

    The description, results, and interpretation of comparison testing between the High Accuracy Inertial Navigation System (HAINS) and the KT-70 Inertial Measurement Unit (IMU) are presented. The objective was to show that the HAINS can replace the KT-70 IMU in the space shuttle Orbiter, both singly and totally. This testing was performed in the Guidance, Navigation, and Control Test Station (GTS) of the Shuttle Avionics Integration Lab (SAIL). A variety of differences between the two instruments are explained. Four 5-day test sessions were conducted, varying the number and slot position of the HAINS and KT-70 IMUs. The various steps in the calibration and alignment procedure are explained, and the results and their interpretation are presented. The HAINS displayed a level of performance accuracy not previously seen with the KT-70 IMU. The most significant improvement in performance came in the Tuned Inertial/Extended Launch Hold tests, where the HAINS exceeded the 4 hr specification requirement. The results obtained from the SAIL tests were generally well beyond the requirements of the procurement specification.

  8. Geochemical reanalysis of historical U.S. Geological Survey sediment samples from the Haines area, Juneau and Skagway quadrangles, southeast Alaska

    USGS Publications Warehouse

    Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.

    2015-01-01

    The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 212 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from the Chilkat, Klehini, Tsirku, and Takhin river drainages, as well as smaller drainages flowing into Chilkat and Chilkoot Inlets near Haines, Skagway Quadrangle, Southeast Alaska. Additionally, some samples were chosen from the Juneau gold belt, Juneau Quadrangle, Southeast Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado through the final quality assurance/quality control (QA/QC) of the geochemical

  9. 2. Historic American Buildings Survey, August, 1971 STREETSCAPE SHOWING EAST ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. Historic American Buildings Survey, August, 1971 STREETSCAPE SHOWING EAST (FRONT) ELEVATIONS OF HAINES BROTHERS BRICK STORE (FAR RIGHT), ORTH BUILDING, BRUNNER BROTHERS STORE, AND NORTH (FRONT) ELEVATION OF CITY HALL (FAR LEFT). - Haines Brothers Brick Store, 110 South Oregon Street, Jacksonville, Jackson County, OR

  10. Lattice Boltzmann simulations of supercritical CO2-water drainage displacement in porous media: CO2 saturation and displacement mechanism.

    PubMed

    Yamabe, Hirotatsu; Tsuji, Takeshi; Liang, Yunfeng; Matsuoka, Toshifumi

    2015-01-06

    CO2 geosequestration in deep aquifers requires the displacement of water (wetting phase) from the porous media by supercritical CO2 (nonwetting phase). However, interfacial instabilities, such as viscous and capillary fingering, develop during drainage displacement. Moreover, the burstlike Haines jump often occurs under conditions of low capillary number. To study these interfacial instabilities, we performed lattice Boltzmann simulations of CO2-water drainage displacement in a 3D synthetic granular rock model at a fixed viscosity ratio and at various capillary numbers. The capillary numbers were varied by changing the injection pressure, which induces changes in flow velocity. It was observed that viscous fingering was dominant at high injection pressures, whereas a crossover of viscous and capillary fingering, accompanied by Haines jumps, was observed at low injection pressures. Haines jumps flowing forward caused a significant drop in CO2 saturation, whereas Haines jumps flowing backward caused an increase in CO2 saturation (per injection depth). We demonstrated that the pore-scale Haines jumps remarkably influenced the flow path and therefore the equilibrium CO2 saturation in the crossover domain, which is in turn related to the storage efficiency in field-scale geosequestration. The results can improve our understanding of how pore-scale displacement phenomena affect storage efficiency.
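    As a reading aid for this record, the capillary number it refers to is conventionally defined as the ratio of viscous to capillary forces; this definition is assumed from the general literature and is not restated in the abstract:

        \[ \mathrm{Ca} = \frac{\mu\,u}{\sigma} \]

    where \(\mu\) is the dynamic viscosity of the invading CO2 phase, \(u\) a characteristic flow velocity, and \(\sigma\) the CO2-water interfacial tension; Haines jumps are typically reported in the low-Ca, capillary-dominated regime.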

  11. Note on Conditional Compilation in Standard ML

    DTIC Science & Technology

    1993-06-01

    Note on Conditional Compilation in Standard ML. Nicholas Haines, Edoardo Biagioni, Robert Harper, Brian G. Milnes. June 1993. CMU-CS-93-11.

  12. Oral Traditions, Changing Rural Landscapes, and Science Education

    ERIC Educational Resources Information Center

    Stapleton, Sarah Riggs

    2017-01-01

    This forum response extends the argument made by Avery and Hains that oral traditions can be useful for including the cultures and contexts of rural areas within science instruction. To buttress the oral expressions presented in Avery and Hains, I compare oral expressions of a second rural area, 600 miles to the South, in Eastern North Carolina. I…

  13. Potential industrial sites in the Lynn Canal area, Alaska

    USGS Publications Warehouse

    Johnson, Arthur; Twenhofel, William Stephens

    1953-01-01

    Full development of a proposal to divert the headwaters of the Yukon River drainage from Canada into the Taiya River valley of Alaska would make available more than a half million kilowatts of electrical energy. Utilization of this block of power, for which there is at present no local market, will require an industrial and community development of appreciable magnitude. Suitable sites for industrial and community development near the proposed power source are limited because of the extremely rugged and mountainous terrain of the Lynn Canal area. This report considers potential industrial areas at Skagway, Taiya River, Ferebee River, Lutak Inlet, Haines and vicinity, Klukwan and vicinity, Haines to Klukwan along the Haines cutoff, Berners Bay, and Juneau and vicinity. The factors considered in their evaluation are topography, geology, climate, water supply, transportation facilities, and transmission-line routes from the source of power.

  14. Power Scaling of CW and Pulsed IR and Mid-IR OPSLs (Postprint)

    DTIC Science & Technology

    2013-01-01

    T. J. Rotter, G. Balakrishnan, and C. Hains (University of New Mexico); S. W. Koch, W. Stolz, and B. Kunert (University of Marburg). Paper author list: Yarborough, T. J. Rotter, G. Balakrishnan, C. Hains, S. W. Koch, W. Stolz, B. Kunert, R. Bedford; affiliations include Nonlinear Control Strategies Inc., 3542 N

  15. Novel Heterongineered Detectors for Multi-Color Infrared Sensing

    DTIC Science & Technology

    2012-01-30

    ..."barriers". Appl. Phys. Lett. 98, 121106 (2011). 9. A. Khoshakhlagh, F. Jaeckel, C. Hains, J. B. Rodriguez, L. R. Dawson, K. Malloy, and S. Krishna... AlAs etch-stop layer. The detailed processing sequence is included in the Methods. (Figure panel labels: n+-GaAs 200 nm, mesa, indium bump, FPA, SP-FPA.) ...FPA chip. The processing scheme of the plasmonic FPA chip consists of a dry etch to form the mesa, surface passivation, ohmic metal evaporation, under

  16. Solar Water Heater

    NASA Technical Reports Server (NTRS)

    1993-01-01

    As a Jet Propulsion Laboratory (JPL) scientist, Dr. Eldon Haines studied solar energy and solar water heating. He concluded he could build a superior solar water heating system using the geyser pumping principle. He resigned from JPL to develop his system and later formed Sage Advance Corporation to market the technology. Haines' Copper Cricket residential system has no moving parts, is immune to freeze damage, needs no roof-mounted tanks, and features low maintenance. It provides 50-90 percent of average hot water requirements. A larger system, the Copper Dragon, has been developed for commercial installations.

  17. Formulating physical processes in a full-range model of soil water retention

    NASA Astrophysics Data System (ADS)

    Nimmo, J. R.

    2016-12-01

    Currently used water retention models vary in how much their formulas correspond to controlling physical processes such as capillarity, adsorption, and air-trapping. In model development, realistic correspondence to physical processes has often been a lower priority than ease of use and compatibility with other models. For example, the wettest range is normally represented simplistically, as by a straight line of zero slope, or by default using the same formulation as for the middle range. The new model presented here recognizes dominant processes within three segments of the range from oven-dryness to saturation. The adsorption-dominated dry range is represented by a logarithmic relation used in earlier models. The middle range of capillary advance/retreat and Haines jumps is represented by a new adaptation of the lognormal distribution function. In the wet range, the expansion of trapped air in response to matric pressure change is important because (1) it displaces water, and (2) it triggers additional volume-adjusting processes such as the collapse of liquid bridges between air pockets. For this range, the model incorporates the Boyle's law inverse proportionality of trapped air volume and pressure, amplified by an empirical factor to account for the additional processes. With their basis in processes, the model's parameters have a strong physical interpretation, and in many cases can be assigned values from knowledge of fundamental relationships or individual measurements. An advantage of the physically plausible treatment of the wet range is that it avoids such problems as the blowing-up of derivatives on approach to saturation, enhancing the model's utility for important but challenging wet-range phenomena such as domain exchange between preferential flow paths and soil matrix. Further development might be able to accommodate hysteresis by a systematic adjustment of the relation between the wet and middle ranges.

  18. Safety of the lateral trauma position in cervical spine injuries: a cadaver model study.

    PubMed

    Hyldmo, P K; Horodyski, M B; Conrad, B P; Dubose, D N; Røislien, J; Prasarn, M; Rechtine, G R; Søreide, E

    2016-08-01

    Endotracheal intubation is not always an option for unconscious trauma patients. Prehospital personnel are then faced with the dilemma of maintaining an adequate airway without risking deleterious movement of a potentially unstable cervical spine. To address these two concerns various alternatives to the classical recovery position have been developed. This study aims to determine the amount of motion induced by the recovery position, two versions of the HAINES (High Arm IN Endangered Spine) position, and the novel lateral trauma position (LTP). We surgically created global cervical instability between the C5 and C6 vertebrae in five fresh cadavers. We measured the rotational and translational (linear) range of motion during the different maneuvers using an electromagnetic tracking device and compared the results using a general linear mixed model (GLMM) for regression. In the recovery position, the range of motion for lateral bending was 11.9°. While both HAINES positions caused a similar range of motion, the motion caused by the LTP was 2.6° less (P = 0.037). The linear axial range of motion in the recovery position was 13.0 mm. In comparison, the HAINES 1 and 2 positions showed significantly less motion (-5.8 and -4.6 mm, respectively), while the LTP did not (-4.0 mm, P = 0.067). Our results indicate that in unconscious trauma patients, the LTP or one of the two HAINES techniques is preferable to the standard recovery position in cases of an unstable cervical spine injury. © 2016 The Authors. Acta Anaesthesiologica Scandinavica published by John Wiley & Sons Ltd on behalf of Acta Anaesthesiologica Scandinavica Foundation.

  19. 5. Historic American Buildings Survey Nathaniel R. Ewan, Photographer September ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. Historic American Buildings Survey Nathaniel R. Ewan, Photographer September 8, 1937 INTERIOR - DETAIL OF NUMERALS (1776) ON ENCLOSED END OF ORIGINAL HOUSE - Haines-Budd House, Lumberton, Burlington County, NJ

  20. Haine v. Leves, 7 April 1987.

    PubMed

    1987-01-01

    The Court ruled that where pupils coming from within a local community were segregated into single-sex schools with differences in their curricula that resulted in a variation in educational qualification, such that the school qualification of boys for tertiary education and employment was greater than that of girls, the Equal Opportunity Tribunal could conclude that the girls were being discriminated against under Section 31A of the Anti-Discrimination Act 1977.

  1. Status and Impacts of Arctic Freshwater Export

    NASA Astrophysics Data System (ADS)

    Haine, T. W. N.

    2017-12-01

    Large freshwater anomalies clearly exist in the Arctic Ocean. For example, liquid freshwater has accumulated in the Beaufort Gyre in the decade of the 2000s compared to 1980-2000, with an extra ≈5000 km3—about 25%—being stored. The sources of freshwater to the Arctic from precipitation and runoff have increased between these periods (most of the evidence comes from models). Despite flux increases from 2001 to 2011, it is uncertain if the marine freshwater source through Bering Strait for the 2000s has changed, as observations in the 1980s and 1990s are incomplete. The marine freshwater fluxes draining the Arctic through Fram and Davis straits are also insignificantly different. In this way, the balance of sources and sinks of freshwater to the Arctic, Canadian Arctic Archipelago (CAA), and Baffin Bay shifted to about 1200±730 km3 yr-1 freshening the region, on average, during the 2000s. The observed accumulation of liquid freshwater is consistent with this increased supply and the loss of freshwater from sea ice. Evidence exists that such discharges can impact the Atlantic meridional overturning circulation, and hence Atlantic sector climate. Nevertheless, it appears that the observed AMOC variability since 2004, when high-quality measurements began, is not attributable to anthropogenic influence. This work is based on, and updated from, Haine et al. (2015), Carmack et al. (2016), and Haine (2016). Haine, T. W. N. Ocean science: Vagaries of Atlantic overturning. Nature Geoscience, 9, 479-480, 10.1038/ngeo2748, 2016. T. W. N. Haine et al., Arctic Freshwater Export: Status, Mechanisms, and Prospects, Global and Planetary Change, 125, 13-35, 10.1016/j.gloplacha.2014.11.013, 2015. E. Carmack et al., Fresh water and its role in the Arctic Marine System: sources, disposition, storage, export, and physical and biogeochemical consequences in the Arctic and global oceans. J. Geophys. Res. Biogeosciences, 10.1002/2015JG003140, 2016.

  2. String test

    MedlinePlus

    ... Management by Laboratory Methods . 23rd ed. Philadelphia, PA: Elsevier; 2017:chap 64. Bope ET, Kellerman RD. The ... ET, ed. Conn's Current Therapy 2016 . Philadelphia, PA: Elsevier; 2016:chap 3. Haines CF, Sears CL. Infectious ...

  3. Giardia infection

    MedlinePlus

    ... eds. Goldman-Cecil Medicine . 25th ed. Philadelphia, PA: Elsevier Saunders; 2015:chap 283. Haines CF, Sears CL. ... Gastrointestinal and Liver Disease . 10th ed. Philadelphia, PA: Elsevier Saunders; 2016:chap 110. Nash TE, Hill DR. ...

  4. Alcoholic liver disease

    MedlinePlus

    ... FF, ed. Ferri's Clinical Advisor 2018 . Philadelphia, PA: Elsevier; 2018:59-60. Carithers RL, McClain C. Alcoholic ... Gastrointestinal and Liver Disease . 10th ed. Philadelphia, PA: Elsevier Saunders; 2016:chap 86. Haines EJ, Oyama LC. ...

  5. 75 FR 69427 - PPL Holtwood, LLC; Notice of Application for Amendment of License and Soliciting Comments...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-12

    ... Susquehanna River, in Lancaster and York Counties, Pennsylvania. g. Filed Pursuant to: Federal Power Act, 16 U... a 33.8-acre parcel on which the Indian Steps Museum and Ulmer-Root-Haines Memorial Park and nature...

  6. Shigellosis

    MedlinePlus

    ... Nelson Textbook of Pediatrics . 20th ed. Philadelphia, PA: Elsevier; 2016:chap 340. DuPont HL. Approach to the ... eds. Goldman's Cecil Medicine . 25th ed. Philadelphia, PA: Elsevier Saunders; 2016:chap 283. Haines CF, Sears CL. ...

  7. CHARACTERIZING AND MITIGATING PATHOGENIC ORGANISMS RELATED TO CAFOS

    EPA Science Inventory

    CHARACTERIZING AND MITIGATING PATHOGENIC ORGANISMS RELATED TO CAFOs John Haines and Shane Rogers NRMRL Science Questions MYP Science Question: What BMP treatment systems and restoration technologies are most effective options for watershed management? For mixed land use wa...

  8. Recruitment Practices Change, but Issues Remain the Same

    ERIC Educational Resources Information Center

    Hugo, Esther

    2012-01-01

    What the author found most surprising about Richard Haines' survey on 1974 recruitment practices was that the major issues are still relevant. His main points about recruitment, college and counselor interaction, and the need for better information still resonate as the profession consistently calls for clarity and transparency in the college…

  9. Advancing Educational Diversity: Antifragility, Standardization, Democracy, and a Multitude of Education Options

    ERIC Educational Resources Information Center

    Fortunato, Michael W. P.

    2017-01-01

    This essay is a response to a paper by Avery and Hains that raises questions about the often unintended effects of knowledge standardization in an educational setting. While many K-12 schools are implementing common core standards, and many institutions of higher education are implementing their own standardized educational practices, the question…

  10. Micro-PIV measurements of multiphase flow of water and liquid CO2 in 2-D heterogeneous porous micromodels

    NASA Astrophysics Data System (ADS)

    Li, Yaofa; Kazemifar, Farzan; Blois, Gianluca; Christensen, Kenneth T.

    2017-07-01

    We present an experimental study of pore-scale flow dynamics of liquid CO2 and water in a two-dimensional heterogeneous porous micromodel, inspired by the structure of a reservoir rock, at reservoir-relevant conditions (80 bar, 21°C). The entire process of CO2 infiltration into a water-saturated micromodel was captured using fluorescence microscopy and the micro-PIV method, which together reveal complex fluid displacement patterns and abrupt changes in velocity. The CO2 front migrated through the resident water in an intermittent manner, forming dendritic structures, termed fingers, in directions along, normal to, and even opposing the bulk pressure gradient. Such characteristics indicate the dominance of capillary fingering through the micromodel. Velocity burst events, termed Haines jumps, were also captured in the heterogeneous micromodel, during which the local Reynolds number was estimated to be ~21 in the CO2 phase, exceeding the range of validity of Darcy's law. Furthermore, these drainage events were observed to be cooperative (i.e., across multiple pores simultaneously), with the zone of influence of such events extending beyond tens of pores, confirming, in a quantitative manner, that Haines jumps are nonlocal phenomena. After CO2 completely breaks through the porous section, shear-induced circulations caused by flowing CO2 were also observed, in agreement with previous studies using a homogeneous porous micromodel. To our knowledge, this study is the first quantitative measurement that incorporates both reservoir-relevant conditions and rock-inspired heterogeneity, and thus will be useful for pore-scale model development and validation.

  11. 75 FR 32096 - Standard Instrument Approach Procedures, and Takeoff Minimums and Obstacle Departure Procedures...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-07

    .... The large number of SIAPs, Takeoff Minimums and ODPs, in addition to their complex nature and the need... Rivers, MI, Three Rivers Muni Dr. Haines, NDB RWY 27, Amdt 7A, CANCELLED Brainerd, MN, Brainerd Lakes Rgnl, RNAV (GPS) RWY 5, Amdt 1 Brainerd, MN, Brainerd Lakes Rgnl, RNAV (GPS) RWY 12, Amdt 1 Brainerd...

  12. Update on Longleaf Pine Seed Supply Meeting

    Treesearch

    Mark J. Hainds

    2002-01-01

    This is an update of the activities following the September 1999 meeting concerning measures that were discussed to address the longleaf pine seed supply shortage. The people in attendance were Dr. Dean Gjerstad, Mark Haines, Robert Gandy, Larry Bishop, Dr. Ron Carey, Dr. Carey's graduate student Steve Oak, Dr. Jim Barnett, and Jill Barbour

  13. The potential impact of regional climate change on fire weather in the United States

    Treesearch

    Ying Tang; Shiyuan Zhong; Lifeng Luo; Xindi Bian; Warren E. Heilman; Julie. Winkler

    2015-01-01

    Climate change is expected to alter the frequency and severity of atmospheric conditions conducive for wildfires. In this study, we assess potential changes in fire weather conditions for the contiguous United States using the Haines Index (HI), a fire weather index that has been employed operationally to detect atmospheric conditions favorable for large and erratic...

  14. Strategies to Support Concentration

    ERIC Educational Resources Information Center

    Haines, Annette

    2017-01-01

    Annette Haines provides a comprehensive overview of concentration across the planes. She first lays the foundation for thinking about student engagement: It must be understood that concentration is found through the interest of the child, which is guided by the sensitive periods. When we understand the child's development in this way, we can offer…

  15. Spring Research Festival and NICBR Collaboration Winners Announced | Poster

    Cancer.gov

    By Carolynne Keenan, Contributing Writer, and Ashley DeVine, Staff Writer The winners of the 2014 Spring Research Festival (SRF), held May 7 and 8, were recognized on July 2, and included 20 NCI at Frederick researchers: Matthew Anderson, Victor Ayala, Matt Bess, Cristina Bergamaschi, Charlotte Choi, Rami Doueiri, Laura Guasch Pamies, Diana Haines, Saadia Iftikhar, Maria

  16. Workforce Education. Hotel and Motel Workers. A Section 353 Demonstration Project.

    ERIC Educational Resources Information Center

    Polk County Public Schools, Bartow, FL.

    This guide provides an overall view of a program designed to educate adult basic education (ABE) and English-as-a-Second-Language (ESL) students in job-related, language-oriented skills vital to their positions in the hotel/motel industry. The program was designed for the employees of Grenelefe Resort and Conference Center in Haines City, Florida,…

  17. 76 FR 70053 - Standard Instrument Approach Procedures, and Takeoff Minimums and Obstacle Departure Procedures...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-10

    ... 8260-15A. The large number of SIAPs, Takeoff Minimums and ODPs, in addition to their complex nature and... Three Rivers, MI, Three Rivers Muni Dr. Haines, Takeoff Minimums and Obstacle DP, Orig Brainerd, MN, Brainerd Lakes Rgnl, ILS OR LOC/DME RWY 34, Amdt 1 Park Rapids, MN, Park Rapids Muni-Konshok Field, NDB RWY...

  18. Thirty-One Years of Group Research in "Social Psychology Quarterly" (1975-2005)

    ERIC Educational Resources Information Center

    Harrod, Wendy J.; Welch, Bridget K.; Kushkowski, Jeff

    2009-01-01

    We examined trends in group research published in Social Psychology Quarterly (SPQ) from 1975 to 2005. We identified a total of 332 papers about groups published during the time period. Following Moreland, Hogg, and Hains (1994), we created an index of interest in groups by dividing the number of pages in papers about groups by the total number of…

  19. DLA Pre-Award Contracting System

    DTIC Science & Technology

    1993-05-01

    Gulley, DPSSO, Bldg 33, Standards; Cheryl Haines, DISC-RMO, Bldg 36, Lead Time; Jeff Hammer, DGSC-P, Bldg 32, DPACS Workload/Personnel/ALT; Judy Harroson, DLA-Z..., 33, DPACS Functionality; Lou Julg, DISC-RM, Bldg 36, Resource Data; Sandra King, DLA-ZSM, 3A675, Project Oversight; Scotie Knott, DGSC-P, Bldg 33, Post Award; Dave

  20. Why Do You Write? Creative Writing and the Reflective Teacher

    ERIC Educational Resources Information Center

    Hains-Wesson, Rachael

    2013-01-01

    In this article, the author asserts that whether we write creatively or academically (or both) it takes time to understand the reasons why we "want" to write, and the more we write, the more we fully begin to appreciate why we have to write in the first place. From an early age, nearly every day, Rachael Hains-Wesson actively participated in…

  1. Numerical Relativistic Quantum Optics

    DTIC Science & Technology

    2013-11-08

    Camilo, V. M. Kaspi, A. G. Lyne, R. N. Manchester, J. F. Bell, N. D'Amico, N. P. F. McKay, and F. Crawford. Discovery of two high magnetic field radio pulsars. The Astrophysical Journal, 541:367–373, Sep 2000. [15] M. Tatarakis, I. Watts, F. N. Beg, E. L. Clark, A. E. Dangor, A. Gopal, M. G. Haines, P. A.

  2. Performance of the Abbott RealTime MTB RIF/INH resistance assay when used to test Mycobacterium tuberculosis specimens from Bangladesh.

    PubMed

    Kostera, Joshua; Leckie, Gregor; Abravaya, Klara; Wang, Hong

    2018-01-01

    The Abbott RealTime MTB RIF/INH Resistance Assay (RT MTB RIF/INH) is an assay for the detection of rifampicin (RIF)- and/or isoniazid (INH)-resistant Mycobacterium tuberculosis (MTB). The assay can be used to test sputum, bronchial alveolar lavage, and N-Acetyl-L-Cysteine (NALC)/NaOH pellets prepared from these samples. The assay can be used in direct testing mode, or in reflex mode following an MTB-positive result produced by its companion assay, Abbott RT MTB. In this study, the direct testing mode was used to test paired sputum and NALC/NaOH pellets prepared from sputum collected from Bangladesh TB patients. One hundred and thirty-two paired samples were tested. The RT MTB RIF/INH inhibition rate was 0%. One hundred and twenty-two paired samples had results above the assay limit of detection and were analyzed by comparing with results from phenotypic drug sensitivity testing, GeneXpert MTB/RIF (Xpert), and MTBDRplus (Hain). RT MTB RIF/INH results were in good agreement with those of GeneXpert and Hain. The ability of this assay to detect RIF and INH resistance may contribute to the global control of multidrug-resistant tuberculosis.

  3. ARC-2010-AC00-0168-9-Edit

    NASA Image and Video Library

    2000-11-03

    The Honorable George P. Shultz during a visit and tour of Ames Research Center. Shown here from left to right are, in the background, Bill Berry, Ames Deputy Director, and Dr. Tom Edwards, Chief, Aviation Systems Division; front row, Dr. Sidney Drell, Stanford University, former U.S. Secretary of State George Shultz, and Dr. Richard Haines, Senior Research Scientist, FFC, at the Future Flight Central simulator facility.

  4. Inventory of File nam.t00z.smartconus06.tm00.grib2

    Science.gov Websites

    ... Temperature [K]
    002 surface DPT 6 hour fcst Dew Point Temperature [K]
    003 surface SPFH 6 hour fcst Specific ... Index [K]
    027 surface HINDEX 6 hour fcst Haines Index [Numeric]
    028 surface TMP 5 hour fcst Temperature [K]
    029 surface TMP 4 hour fcst Temperature [K]
    030 surface DPT 5 hour fcst Dew Point Temperature [K] ...

  5. Inventory of File nam.t00z.smartak06.tm00.grib2

    Science.gov Websites

    ... Temperature [K]
    002 surface DPT 6 hour fcst Dew Point Temperature [K]
    003 surface SPFH 6 hour fcst Specific ...
    ... Haines Index [Numeric]
    029 surface TMP 5 hour fcst Temperature [K]
    030 surface TMP 4 hour fcst Temperature [K]
    031 surface DPT 5 hour fcst Dew Point Temperature [K]
    032 surface DPT 4 hour fcst Dew Point ...

  6. Inventory of File nam.t00z.smartak03.tm00.grib2

    Science.gov Websites

    ... Temperature [K]
    002 surface DPT 3 hour fcst Dew Point Temperature [K]
    003 surface SPFH 3 hour fcst Specific ...
    ... fcst Haines Index [Numeric]
    026 surface TMP 2 hour fcst Temperature [K]
    027 surface TMP 1 hour fcst Temperature [K]
    028 surface DPT 2 hour fcst Dew Point Temperature [K]
    029 surface DPT 1 hour fcst Dew Point ...

  7. Inventory of File nam.t00z.smartconus12.tm00.grib2

    Science.gov Websites

    ... Temperature [K]
    002 surface DPT 12 hour fcst Dew Point Temperature [K]
    003 surface SPFH 12 hour fcst Specific ...
    ... hour fcst Haines Index [Numeric]
    030 surface TMP 11 hour fcst Temperature [K]
    031 surface TMP 10 hour fcst Temperature [K]
    032 surface DPT 11 hour fcst Dew Point Temperature [K]
    033 surface DPT 10 hour ...

  8. Appendix A: Vascular Plants of GLEES

    Treesearch

    J. D. Haines; C .M. Regan

    1994-01-01

    This appendix provides a list of 230 vascular plant taxa that were field identified and/or collected over the period 1986-1990. Field identification was done by C.L. Simmons in 1986-87 (see Chapter 2). Subsequent taxa were field identified, collected, and verified by J.D. Haines and C.M. Regan in 1988-90. Voucher specimens were verified by taxonomists at the Rocky...

  9. Pyroconvection Risk in Australia: Climatological Changes in Atmospheric Stability and Surface Fire Weather Conditions

    NASA Astrophysics Data System (ADS)

    Dowdy, Andrew J.; Pepler, Acacia

    2018-02-01

    Extreme wildfires with strong convective processes in their plumes have recently led to disastrous impacts on various regions of the world. The Continuous Haines index (CH) is used in Australia to represent vertical atmospheric stability and humidity measures relating to pyroconvective processes. CH climatology is examined here using reanalysis data from 1979 to 2016, revealing large spatial and seasonal variations throughout Australia. Various measures of severity are investigated, including regionally specific thresholds. CH is combined with near-surface fire weather conditions, as a type of compound event, and is examined in relation to environmental conditions associated with pyroconvection. Significant long-term changes in CH are found for some regions and seasons, with these changes corresponding to changes in near-surface conditions in some cases. In particular, an increased risk of pyroconvection is identified for southeast Australia during spring and summer, due to decreased vertical atmospheric stability and humidity combined with more severe near-surface conditions.

  10. Post-polio syndrome. Cases report and review of literature.

    PubMed

    Pastuszak, Żanna; Stępień, Adam; Tomczykiewicz, Kazimierz; Piusińska-Macoch, Renata; Galbarczyk, Dariusz; Rolewska, Agnieszka

    It is estimated that around 15 million people have survived polio infection worldwide since the early twentieth century. In the 1950s, effective vaccination was used for the first time, and since then the number of affected people has decreased. The last epidemic of Haine-Medine disease in Poland was in the 1950s; other rare cases of infection were observed until the 1970s. At least 15 years after poliovirus infection, slowly progressive limb muscle paresis with muscle atrophy, joint pain, and paresthesia have been observed in polio survivors. That constellation of symptoms is called post-polio syndrome (PPS). PPS frequency among people after paralytic and nonparalytic polio infection ranges from 30% to 80%. Fatigue that leads to deterioration of physical and mental activity is another important symptom, observed in 90% of patients with PPS. The etiology of the disease remains elusive; it is probably an effect of damage to motoneurons of the spinal anterior horns during acute polio infection, which leads to overloading and degeneration of the remaining ones. The most important risk factors for PPS are female sex and respiratory symptoms during acute polio infection. Electromyography is an important part of the PPS diagnostic process. Electrophysiological abnormalities are seen in clinically affected and unaffected muscles; the most frequent are fasciculations and fibrillations at rest and increases in motor unit area, duration, and amplitude. In this study we describe three cases of people who developed PPS years after Haine-Medine disease and the correlation between their EMG results and clinical status. We also analyzed electromyography results one month after the first PPS signs occurred as well as after a few years; presentation of the dynamic changes in EMG was the most important aim of the study. Copyright © 2017. Published by Elsevier Urban & Partner Sp. z o.o.

  11. The Tribology of Undulated Surfaces

    DTIC Science & Technology

    1989-05-30

    wear, in effect by decreasing the impact of plowing. Lubrication, hard coatings, fiber-reinforced composites are but a few examples. All these methods ha...In addition, the effects of pad width and cavity volume fraction of the undulated surface were also investigated. A plowing model proposed for...boundary lubricated sliding is in good agreement with experimental results. It is suggested, furthermore, that the undulated surfaces provide an effective

  12. Office of Strategic Services Training during World War II

    DTIC Science & Technology

    2010-06-01

    First Central Intelligence Agency (Berkeley: University of California Press, 1972); Thomas F. Troy, Donovan and the CIA: A History of the...William E. Colby Papers, Box 14, Folder 7, Seeley G. Mudd Library, Princeton University, Princeton, New Jer- sey. 50. Gerald K. Haines, “Virginia...subject heading] 29 January 1945, 2–3, CIA Records (RG 263), Thomas Troy Files, Box 6, Folder 46, National Archives, II. In August 2008, the

  13. A Microwave Method for Measuring Moisture Content, Density, and Grain Angle of Wood.

    DTIC Science & Technology

    1985-03-01

    Livermore, CA 94550. James, William L; Yen , You - Hsin ; King, Ray J. A microwave method for measuring moisture content,density, and grain angle of wood...Note S FPL-0250 March 1985 Density, and Grain 8 Angle of Wood William L. James, Physicist Forest Products Laboratory, Madison, WI You -Hain Yen ... Yen . You -1tsin. Microwave electromagnetic nondestructive testing of wood in real- time. Madison. WI: Department of Electronic and Computer

  14. 2002 Industry Analysis Research Paper: Global Environment, Global Industry, and Global Security: Managing the Crossroads

    DTIC Science & Technology

    2002-01-01

    from polluted wells or rivers and wastewater is discharged into ditches or untreated systems that in turn contaminate the drinking water for others who...being embedded into core business lines of many major corporations. The U.N. Conference on Environment and Development, held in Rio de Janeiro , Brazil in...climate change. Capt. Mohd Amdan Kurish, Royal Malaysian Navy Mr. John L. Gerlaugh, OSD Lt Col Thomas J. Hains, USAF. Mr. Kevin E . Holt, USMC Col

  15. Hydrodynamic Flow Control in Marine Mammals

    DTIC Science & Technology

    2008-05-06

    body- bound vorticity ( Wolfgang et al. 1999). The vorticity is smoothly propagated along the flexing body toward the tail. This vorticity is eventually...and Reichley 1985; Dolphin 1988; Pauly et al. 1998). Whales lunge toward their prey at 2.6 m/s (Jurasz and Jurasz 1979; Hain et al. 1982). The...unsteady RANS CFD code for ship hydrodynamics. IIHR Hydroscience and Engineering Report 531. Iowa City (IA): The University of Iowa. Pauly D, Trites

  16. Bioenergetic Defects and Oxidative Damage in Transgenic Mouse Models of Neurodegenerative Disorders

    DTIC Science & Technology

    2005-06-01

    Implications for Lewy body for- mation in Parkinson’s disease and dementia with Lewy bodies. 1. Folk, J. E. 1980 . Transglutaminases. Annu. Rev...Mazziotta, J. C., Pahl, J. J., St George- Hyslop , P., Neurodegen. 5:27-33. Haines, J. L., Gusella, J., Hoffman, J. M., Baxter, L. R., and 61. Matsuishi...and isoquinoline and Bright 1980 ). Systemic administration of 3-NP inhibits derivative neurotoxicity was associated with reduced activity SDH in the

  17. Defense Small Business Innovation Research Program (SBIR). Volume 4. Defense Agencies Abstracts of Phase 1 Awards from FY 1988 SBIR Solicitation

    DTIC Science & Technology

    1989-05-01

    FORMULATION -- FLOQUET-GALERKIN. DELPHI RESEARCH INC 701 HAINES AVE NW ALBUQUERQUE, NM 87102 CONTRACT NUMBER: PATRICK M DHOOGE TITLE: RESEARCH ON A...MEASURE THE DIELECTRIC RESPONSE OF A DIAMOND FILM AS IT GROWNS. DIESEL DYNE CORP 3044 MIDDLEBORO RD MORROW, OH 45152 CONTRACT NUMBER: RICHARD P JOHNSTON...TITLE: A STUDY OF AN ADVANCED VARIABLE CYCLE DIESEL ENGINE FOR USE IN A REMOTELY PILOTED VEHICLE TOPIC# 18 OFFICE: ASTO IDENT#: 22796 THIS STUDY IS

  18. Low Voltage Electron Beam Lithography

    DTIC Science & Technology

    1994-01-01

    September 1970 (Societe Franaise do Microscopic Elecuouique, Plaris, 1970) Vol. 2, p. 55. [31 H . C. Pfeiffer, "Basic limitations of probefonning systems...USA (editors: 0. Jobari and I. Corvin). [4) T. Groves, D. L Hunmond, H . Kuo, ’Elecmnm-beam broadening effct caused by discreteness of space charge...Electron Microscope Gun". Br. J. Appi. Phys.. February 1952, pp. 40-46. M. E. Haine, P. A. Einstein, and P. H . Brocherd. "Resistance Bias

  19. Meteorological Error Budget Using Open Source Data

    DTIC Science & Technology

    2016-09-01

    ARL-TR-7831 ● SEP 2016 US Army Research Laboratory Meteorological Error Budget Using Open- Source Data by J Cogan, J Smith, P...needed. Do not return it to the originator. ARL-TR-7831 ● SEP 2016 US Army Research Laboratory Meteorological Error Budget Using...Error Budget Using Open-Source Data 5a. CONTRACT NUMBER 5b. GRANT NUMBER 5c. PROGRAM ELEMENT NUMBER 6. AUTHOR(S) J Cogan, J Smith, P Haines

  20. Puget Sound Dredged Disposal Analysis (PSDDA). Final Environmental Impact Statement Unconfined Open-Water Disposal for Dredged Material, Phase 2. (North and South Puget Sound)

    DTIC Science & Technology

    1989-09-01

    flathead sole, rex sole, and rock sole all showed indications of blood worm infestations. One liver tumor was found in a rex sole during spring in the ZSF...concentrations Hainly in invertebrates; some trations (.01 ppb) in waters (from lOx to 42Ox reference) in fish livers ; rarely in fish of Puget Sound central...Eagle Harbor, and Sinclair fish livers , and birds in Inlet. Highest elevation industrialized ’-ban areas. along Ruston-Point Defiance Copper is a natural

  1. Bioenergetic Defects and Oxidative Damage in Transgenic Mouse Models of Neurodegenerative Disorders

    DTIC Science & Technology

    2004-05-01

    Grafton, S. T., Mazziotta, J. C., Pahl, J. J., St George- Hyslop , P., Neurodegen. 5:27-33. Haines, J. L., Gusella, J., Hoffman, J. M., Baxter, L. R., and 61...another TCA enzyme (Porter Previous studies showed that MPTP and isoquinoline and Bright 1980 ). Systemic administration of 3-NP inhibits derivative...Brouillet E., Ferrante R., Palfi S., Dolan R., Matthews R. T. Porter D. J. T. and Bright H. J. ( 1980 ) 3-Carbanionic substrate analogues and Beal M. F

  2. Spring Research Festival and NICBR Collaboration Winners Announced | Poster

    Cancer.gov

    By Carolynne Keenan, Contributing Writer, and Ashley DeVine, Staff Writer The winners of the 2014 Spring Research Festival (SRF), held May 7 and 8, were recognized on July 2, and included 20 NCI at Frederick researchers: Matthew Anderson, Victor Ayala, Matt Bess, Cristina Bergamaschi, Charlotte Choi, Rami Doueiri, Laura Guasch Pamies, Diana Haines, Saadia Iftikhar, Maria Kaltcheva, Wojciech Kasprzak, Balamurugan Kuppusamy, James Lautenberger, George Lountos, Megan Mounts, Uma Mudunuri, Martha Sklavos, Gloriana Shelton, Alex Sorum, and Shea Wright.

  3. Immunotherapeutic Strategies in Breast Cancer:Preclinical and Clinical Trials

    DTIC Science & Technology

    2012-09-01

    in enhanced CTL responses with anti-tumor activity. Journal of Immunology. 2000;165:539-47. 24. Haining WN, Davies J, Kanzler H, Drury L, Brenn T...including pancreatic cancer that express MUCl accounted for about 72% of new cases and for 66% of the deaths [ 1]. These observations have prompted...December 12, 2002. The costs of publication of this article were defrayed in part by the payment of page charges. This article must therefore be hereby

  4. Analytical and clinical performance characteristics of the Abbott RealTime MTB RIF/INH Resistance, an assay for the detection of rifampicin and isoniazid resistant Mycobacterium tuberculosis in pulmonary specimens.

    PubMed

    Kostera, Joshua; Leckie, Gregor; Tang, Ning; Lampinen, John; Szostak, Magdalena; Abravaya, Klara; Wang, Hong

    2016-12-01

    Clinical management of drug-resistant tuberculosis patients continues to present significant challenges to global health. To tackle these challenges, the Abbott RealTime MTB RIF/INH Resistance assay was developed to accelerate the diagnosis of rifampicin- and/or isoniazid-resistant tuberculosis to within a day. This article summarizes the performance of the Abbott RealTime MTB RIF/INH Resistance assay, including reliability, analytical sensitivity, and clinical sensitivity/specificity as compared to Cepheid GeneXpert MTB/RIF version 1.0 and Hain MTBDRplus version 2.0. The limit of detection (LOD) of the Abbott RealTime MTB RIF/INH Resistance assay was determined to be 32 colony forming units/milliliter (cfu/mL) using the Mycobacterium tuberculosis (MTB) strain H37Rv cell line. For rifampicin resistance detection, the Abbott RealTime MTB RIF/INH Resistance assay demonstrated statistically equivalent clinical sensitivity and specificity as compared to Cepheid GeneXpert MTB/RIF. For isoniazid resistance detection, the assay demonstrated statistically equivalent clinical sensitivity and specificity as compared to Hain MTBDRplus. The performance data presented herein demonstrate that the Abbott RealTime MTB RIF/INH Resistance assay is a sensitive, robust, and reliable test for real-time simultaneous detection of resistance to the first-line anti-tuberculosis antibiotics rifampicin and isoniazid in patient specimens. Copyright © 2016 The Author. Published by Elsevier Ltd. All rights reserved.
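
    For orientation only, the sketch below shows how clinical sensitivity and specificity (with simple normal-approximation confidence intervals) can be computed from concordance counts against a comparator or reference standard. The counts are placeholders, not data from this study.

```python
import math

def sensitivity_specificity(tp, fn, tn, fp, z=1.96):
    """Sensitivity and specificity with normal-approximation confidence intervals.
    tp/fn/tn/fp are counts against a reference result (placeholder values below)."""
    def prop_ci(k, n):
        p = k / n
        half = z * math.sqrt(p * (1 - p) / n)
        return p, max(0.0, p - half), min(1.0, p + half)
    sens = prop_ci(tp, tp + fn)   # resistant specimens correctly called resistant
    spec = prop_ci(tn, tn + fp)   # susceptible specimens correctly called susceptible
    return sens, spec

sens, spec = sensitivity_specificity(tp=46, fn=2, tn=180, fp=3)
print("sensitivity %.1f%% (%.1f-%.1f%%)" % tuple(100 * x for x in sens))
print("specificity %.1f%% (%.1f-%.1f%%)" % tuple(100 * x for x in spec))
```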

  5. Evaluation of groundwater discharge into small lakes based on the temporal distribution of radon-222

    USGS Publications Warehouse

    Dimova, N.T.; Burnett, W.C.

    2011-01-01

    In order to evaluate groundwater discharge into small lakes we constructed a model that is based on the budget of 222Rn (radon, t1/2 = 3.8 d) as a tracer. The main assumptions in our model are that the lake's waters are well-mixed horizontally and vertically; the only significant 222Rn source is via groundwater discharge; and the only losses are due to decay and atmospheric evasion. In order to evaluate the groundwater-derived 222Rn flux, we monitored the 222Rn concentration in lake water over periods long enough (usually 1-3 d) to observe changes likely caused by variations in atmospheric exchange (primarily a function of wind speed and temperature). We then attempt to reproduce the observed record by accounting for decay and atmospheric losses and by estimating the total 222Rn input flux using an iterative approach. Our methodology was tested in two lakes in central Florida: one of which is thought to have significant groundwater inputs (Lake Haines) and another that is known not to have any groundwater inflows but requires daily groundwater augmentation from a deep aquifer (Round Lake). Model results were consistent with independent seepage meter data at both Lake Haines (positive seepage of about 1.6 × 10^4 m3 d-1 in Mar 2008) and at Round Lake (no net groundwater seepage). © 2011, by the American Society of Limnology and Oceanography, Inc.
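
    A minimal sketch of the well-mixed 222Rn balance described above is given below. It is not the authors' code: the explicit one-hour time step, the constant groundwater flux, and all parameter values are illustrative assumptions.

```python
import numpy as np

def simulate_radon(c0, f_gw, k_gas, depth, hours, lam=np.log(2) / (3.8 * 24)):
    """Well-mixed lake 222Rn budget (sketch).

    c0    : initial 222Rn concentration in lake water (Bq m-3)
    f_gw  : groundwater-derived 222Rn input flux (Bq m-2 h-1), assumed constant
    k_gas : gas transfer velocity for atmospheric evasion (m h-1); in practice
            a function of wind speed and temperature
    depth : mean lake depth (m); lam is the 222Rn decay constant (h-1)
    """
    c = np.empty(hours + 1)
    c[0] = c0
    for t in range(hours):
        decay = lam * c[t]                 # radioactive decay loss
        evasion = k_gas * c[t] / depth     # atmospheric evasion (air 222Rn ~ 0)
        source = f_gw / depth              # groundwater-derived input
        c[t + 1] = c[t] + (source - decay - evasion)  # explicit 1-h step
    return c

# Iterative use: adjust f_gw until the modelled record reproduces the observed
# 222Rn time series; dividing f_gw by the 222Rn concentration of the discharging
# groundwater then gives the water seepage rate.
print(simulate_radon(c0=50.0, f_gw=8.0, k_gas=0.05, depth=3.0, hours=48)[::12])
```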

  6. Sliding Charge Density Waves and Related Problems.

    DTIC Science & Technology

    1987-03-31

    mecan -field ihcorn, with r-I oI. Lcntcr,, M ith inlinite-ranve interactions, the distinction Niet the exponential divergence follow4ed hv saturation...or the incommensu- liv simuiating Eq. (2) in systems up to 377 particles in rate chain. si/c ve were -ible to restrict tinite-size eflects to the... up another qualitative difference between 0 nc twko ,cis of theoretical results: While the incommensu- 0 20 40 60 80 100 120 140 _.:e :hain. like the

  7. Experiences with a high-blockage model tested in the NASA Ames 12-foot pressure wind tunnel

    NASA Technical Reports Server (NTRS)

    Coder, D. W.

    1984-01-01

    Representation of the flow around full-scale ships was sought in the subsonic wind tunnels in order to attain Reynolds numbers as high as possible. As part of the quest to attain the largest possible Reynolds number, large models with high blockage are used which result in significant wall interference effects. Some experiences with such a high blockage model tested in the NASA Ames 12-foot pressure wind tunnel are summarized. The main results of the experiment relating to wind tunnel wall interference effects are also presented.

  8. Examining Atmospheric and Ecological Drivers of Wildfires, Modeling Wildfire Occurrence in the Southwest United States, and Using Atmospheric Sounding Observations to Verify National Weather Service Spot Forecasts

    NASA Astrophysics Data System (ADS)

    Nauslar, Nicholas J.

    This dissertation comprises three different papers that all pertain to wildland fire applications. The first paper performs a verification analysis on mixing height, transport winds, and Haines Index from National Weather Service spot forecasts across the United States. The final two papers, which are closely related, examine atmospheric and ecological drivers of wildfire for the Southwest Area (SWA) (Arizona, New Mexico, west Texas, and Oklahoma panhandle) to better equip operational fire meteorologists and managers to make informed decisions on wildfire potential in this region. The verification analysis here utilizes NWS spot forecasts of mixing height, transport winds and Haines Index from 2009-2013 issued for a location within 50 km of an upper-air sounding location and valid for the day of the fire event. Mixing height was calculated from the 0000 UTC sounding via the Stull, Holzworth, and Richardson methods. Transport wind speeds were determined by averaging the wind speed through the boundary layer as determined by the three mixing height methods from the 0000 UTC sounding. Haines Index was calculated at low, mid, and high elevation based on the elevation of the sounding and spot forecast locations. Mixing height forecasts exhibited large mean absolute errors and were biased toward over-forecasting. Forecasts of transport wind speeds and Haines Index outperformed mixing height forecasts with smaller errors relative to their respective means. The rainfall and lightning associated with the North American Monsoon (NAM) can vary greatly intra- and inter-annually and have a large impact on wildfire activity across the SWA by igniting or suppressing wildfires. NAM onset thresholds and subsequent dates are determined for the SWA and each Predictive Service Area (PSA), which are sub-regions used by operational fire meteorologists to predict wildfire potential within the SWA, April through September from 1995-2013. Various wildfire activity thresholds using the number
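
    Because the Haines Index appears in several records in this collection, a small sketch of its calculation is included here. The breakpoints follow the commonly tabulated low-, mid- and high-elevation variants; they should be checked against Haines (1988) before any operational use.

```python
def haines_index(delta_t, dewpoint_depression, variant="mid"):
    """Haines Index = stability term A + moisture term B (each 1-3, total 2-6).

    delta_t             : temperature difference (deg C) across the variant's
                          stability layer (950-850, 850-700 or 700-500 hPa)
    dewpoint_depression : T - Td (deg C) at the variant's moisture level
                          (850 hPa for low/mid, 700 hPa for high)
    """
    breaks = {  # (A breakpoints, B breakpoints); value 1 below the first,
        "low":  ((4, 8),  (6, 10)),   # 2 between, 3 at or above the second
        "mid":  ((6, 11), (6, 13)),
        "high": ((18, 22), (15, 21)),
    }
    (a1, a2), (b1, b2) = breaks[variant]
    a = 1 if delta_t < a1 else (2 if delta_t < a2 else 3)
    b = 1 if dewpoint_depression < b1 else (2 if dewpoint_depression < b2 else 3)
    return a + b

# Mid-elevation example: T850 = 20 C, T700 = 8 C, Td850 = 4 C -> index 6 (high potential)
print(haines_index(delta_t=20.0 - 8.0, dewpoint_depression=20.0 - 4.0, variant="mid"))
```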

  9. Rational Ruijsenaars Schneider hierarchy and bispectral difference operators

    NASA Astrophysics Data System (ADS)

    Iliev, Plamen

    2007-05-01

    We show that a monic polynomial in a discrete variable n, with coefficients depending on time variables t1,t2,…, is a τ-function for the discrete Kadomtsev-Petviashvili hierarchy if and only if the motion of its zeros is governed by a hierarchy of Ruijsenaars-Schneider systems. These τ-functions were considered in [L. Haine, P. Iliev, Commutative rings of difference operators and an adelic flag manifold, Int. Math. Res. Not. 2000 (6) (2000) 281-323], where it was proved that they parametrize rank one solutions to a difference-differential version of the bispectral problem.

  10. Applications of Artificial Intelligence in Voice Recognition Systems in Micro-Computers.

    DTIC Science & Technology

    1982-03-01

    DELTAO THEN 1290 1050 IF ANS$(I) = "HAIN MENU THEN 320 1060 IF ANS$(I) - " ABORTO THEN 3150 1070 IF ANS$(I) - 󈧄 BACK’ THEN 3590 1080 NEXT I 1090... ABORTO THEN 3150 1660 NEXT I 1670 SOTO 3350 3 REM’ ERROR PACK 1680 STOP 1690 REM SHIPS MENU 1700 REM------------ 1710 HOME : VTAB 5 :HTAB 15 :PRINT...IF ANS*(I) - PROFILESO THEN 3100 2470 IF IS$(I) - "MIN MENU" THEN 320 24Sf IF NB$(I) - "G0 BACK" THEN 3590 2490 IF ANS$(I) - " ABORTO THEN 3150 2500

  11. Screening a library of household substances for inhibitors of phosphatases: An introduction to high-throughput screening.

    PubMed

    Taylor, Ann T S

    2005-01-01

    Library screening methods are commonly used in industry and research. This article describes an experiment that screens a library of household substances for properties that would make a good "drug," including enzyme inhibition, neutral pH, and nondenaturing to proteins, using wheat germ acid phosphatase as the target protein. An adaptation of the experiment appropriate for lower level biochemistry or outreach is also described. This work was supported by Wabash College through the Haines Fund for the Study of Biochemistry and the National Science Foundation through Grant DUE 0126242. Copyright © 2005 International Union of Biochemistry and Molecular Biology, Inc.

  12. A review of seismoelectric data processing techniques

    NASA Astrophysics Data System (ADS)

    Warden, S. D.; Garambois, S.; Jouniaux, L.; Sailhac, P.

    2011-12-01

    Seismoelectric tomography is expected to combine the sensitivity of electromagnetic methods to hydrological properties such as water content and permeability with the high resolution of conventional seismic surveys. This innovative exploration technique seems very promising as it could characterize the fluids contained in reservoir rocks and detect thin layers invisible to other methods. However, it still needs to be improved before it can be successfully applied to real case problems. One of the main issues that need to be addressed is the development of wave separation techniques that enable recovery of the signal of interest. Seismic waves passing through a fluid-saturated porous layered medium convert into at least two types of electromagnetic waves: the coseismic field (type I), accompanying seismic body and surface waves, and the independently propagating interface response (type II). The latter occurs when compressional waves encounter a contrast between electrical, chemical or mechanical properties in the subsurface, thus acting as a secondary source that can be generally approximated by a sum of electrical dipoles oscillating at the first Fresnel zone. Although properties of the medium in the vicinity of the receivers can be extracted from the coseismic waves, only the interface response provides subsurface information at depth, which makes it critical to separate both types of energy. This is a delicate problem, as the interface response may be several orders of magnitude weaker than the coseismic field. However, as reviewed by Haines et al. (2007), several properties of the interface response can be used to identify it: its dipolar amplitude pattern, its opposite polarity on opposite sides of the shot point and the electromagnetic velocity at which it travels, several orders of magnitude greater than seismic velocities. This latter attribute can be exploited to implement filtering techniques in the frequency-wavenumber (f-k) and radon (tau-p) domains, which we

  13. Combining turbulent kinetic energy and Haines Index predictions for fire-weather assessments

    Treesearch

    Warren E. Heilman; Xindi Bian

    2007-01-01

    The 24- to 72-hour fire-weather predictions for different regions of the United States are now readily available from the regional Fire Consortia for Advanced Modeling of Meteorology and Smoke (FCAMMS) that were established as part of the U.S. National Fire Plan. These predictions are based on daily real-time MM5 model simulations of atmospheric conditions and fire-...

  14. Pore-scale modeling of phase change in porous media

    NASA Astrophysics Data System (ADS)

    Juanes, Ruben; Cueto-Felgueroso, Luis; Fu, Xiaojing

    2017-11-01

    One of the main open challenges in pore-scale modeling is the direct simulation of flows involving multicomponent mixtures with complex phase behavior. Reservoir fluid mixtures are often described through cubic equations of state, which makes diffuse interface, or phase field theories, particularly appealing as a modeling framework. What is still unclear is whether equation-of-state-driven diffuse-interface models can adequately describe processes where surface tension and wetting phenomena play an important role. Here we present a diffuse interface model of single-component, two-phase flow (a van der Waals fluid) in a porous medium under different wetting conditions. We propose a simplified Darcy-Korteweg model that is appropriate to describe flow in a Hele-Shaw cell or a micromodel, with a gap-averaged velocity. We study the ability of the diffuse-interface model to capture capillary pressure and the dynamics of vaporization/condensation fronts, and show that the model reproduces pressure fluctuations that emerge from abrupt interface displacements (Haines jumps) and from the break-up of wetting films.
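
    For readers unfamiliar with the equation-of-state-driven approach mentioned above, the sketch below evaluates the van der Waals pressure as a function of molar volume and temperature; the coefficients are standard textbook constants for water and purely illustrative, not values used in the paper.

```python
def van_der_waals_pressure(v_molar, T, a=0.5536, b=3.049e-5, R=8.314):
    """Pressure (Pa) of a van der Waals fluid from molar volume (m3/mol) and
    temperature (K). Defaults a, b are commonly quoted constants for water."""
    return R * T / (v_molar - b) - a / v_molar**2

# Sampling an isotherm below water's critical temperature (~647 K): the pressure
# is non-monotonic in v (the van der Waals loop), which is what lets a single
# equation of state represent coexisting liquid and vapour phases.
for v in (5e-5, 1e-4, 1.5e-4, 1e-3):
    print(f"v = {v:.1e} m3/mol  ->  P = {van_der_waals_pressure(v, 550.0):.3e} Pa")
```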

  15. Enhanced Cr bioleaching efficiency from tannery sludge with coinoculation of Acidithiobacillus thiooxidans TS6 and Brettanomyces B65 in an air-lift reactor.

    PubMed

    Fang, Di; Zhou, Li-Xiang

    2007-09-01

    The bioleaching process has been demonstrated to be an effective technology for removing Cr from tannery sludge, but a large quantity of dissolved organic matter (DOM) present in tannery sludge often exhibits a marked toxicity to chemolithoautotrophic bioleaching bacteria such as Acidithiobacillus thiooxidans. The purpose of the present study was therefore to enhance Cr bioleaching efficiencies through introducing a sludge DOM-degrading heterotrophic microorganism into the sulfur-based sludge bioleaching system. An acid-tolerant DOM-degrading yeast strain, Brettanomyces B65, was successfully isolated from a local Haining tannery sludge, and it could metabolize sludge DOM as a source of energy and carbon for growth. A combined bioleaching experiment (coupling Brettanomyces B65 and A. thiooxidans TS6) performed in an air-lift reactor indicated that the rates of sludge pH reduction and ORP increase were greatly improved, resulting in enhanced Cr solubilization. Compared with the 5 days required for maximum solubilization of Cr for the control (single bioleaching process without inoculation of Brettanomyces B65), the bioleaching period was significantly shortened to 3 days for the combined bioleaching system. Moreover, little nitrogen and phosphorus were lost and the content of Cr was below the permitted levels for land application after 3 days of bioleaching treatment.

  16. PCR identification of bacteria in blood culture does not fit the daily workflow of a routine microbiology laboratory.

    PubMed

    Karumaa, Santra; Kärpänoja, Pauliina; Sarkkinen, Hannu

    2012-03-01

    We have evaluated the GenoType blood culture assay (Hain Lifescience, Nehren, Germany) for the identification of bacteria in 233 positive blood cultures and assessed its suitability in the workflow of a routine microbiology laboratory. In 68/233 (29.2%) samples, the culture result could not be confirmed by the GenoType assay due to a lack of primers in the test, multiple organisms in the sample, or inconsistency with respect to the identification by culture. Although the GenoType blood culture assay gives satisfactory results for bacteria for which primers are available, there are difficulties in applying the test in the routine microbiology laboratory.

  17. Dusting off another shelf: further comments on classic gynecologic pathology books of yesteryear.

    PubMed

    Young, Robert H

    2005-01-01

    Selected outstanding books from the older literature on gynecologic pathology are reviewed with emphasis on drawing attention to the abundant helpful information and often outstanding illustrations that are worthy of review by present-day pathologists. This represents a follow up to a previous similar essay that appeared in Volume 19:67-84, 2000. The first three books cover general gynecological pathology: Gynecological Pathology by Carl Abel; Gynecological and Obstetrical Pathology by Robert T. Frank; and Haines and Taylor's Gynaecological Pathology by Magnus Haines and Claud W. Taylor. Each of them emphasizes the time-honored problem of mimicry of malignancy by diverse benign lesions or even aspects of normal histology. Awareness of the clinical background and cooperation between the clinician and pathologist are emphasized. The other three books considered are all devoted largely or exclusively to the ovary: Ovarian Tumors by Hans Selye, Ovarian Neoplasms, Morphology, and Classification by Karel Motlik, and Special Tumors of Ovary and Testis and Related Extragonadal Lesions by Gunnar Teilum. The book of Selye has a truly remarkable encyclopedic coverage of the older literature, the references being so comprehensive that they are presented in a separate volume. A number of aspects of the histopathology of ovarian tumors that have been emphasized in recent years are noted in Selye's book. Dr. Motlik's book presents a very high quality consideration of the differential diagnosis of ovarian tumors. Teilum's book contains a masterful account of the histopathology of germ cell tumors emphasizing a neoplasm with which his name will always be associated, the yolk sac tumor (endodermal sinus tumor). Numerous beautiful and refreshingly large illustrations are provided, and Dr. Teilum's interest in comparative pathology is evident in the pages, his linking of the famous Schiller-Duval bodies with the endodermal sinuses of the rat placenta, being the most notable example.

  18. Human performance measurement: Validation procedures applicable to advanced manned telescience systems

    NASA Technical Reports Server (NTRS)

    Haines, Richard F.

    1990-01-01

    As telescience systems become more and more complex, autonomous, and opaque to their operators it becomes increasingly difficult to determine whether the total system is performing as it should. Some of the complex and interrelated human performance measurement issues are addressed as they relate to total system validation. The assumption is made that human interaction with the automated system will be required well into the Space Station Freedom era. Candidate human performance measurement-validation techniques are discussed for selected ground-to-space-to-ground and space-to-space situations. Most of these measures may be used in conjunction with an information throughput model presented elsewhere (Haines, 1990). Teleoperations, teleanalysis, teleplanning, teledesign, and teledocumentation are considered, as are selected illustrative examples of space related telescience activities.

  19. Adiabatic burst evaporation from bicontinuous nanoporous membranes

    PubMed Central

    Ichilmann, Sachar; Rücker, Kerstin; Haase, Markus; Enke, Dirk

    2015-01-01

    Evaporation of volatile liquids from nanoporous media with bicontinuous morphology and pore diameters of a few tens of nanometres is a ubiquitous process. For example, such drying processes occur during syntheses of nanoporous materials by sol–gel chemistry or by spinodal decomposition in the presence of solvents as well as during solution impregnation of nanoporous hosts with functional guests. It is commonly assumed that drying is endothermic and driven by non-equilibrium partial pressures of the evaporating species in the gas phase. We show that nearly half of the liquid evaporates in an adiabatic mode involving burst-like liquid-to-gas conversions. During single adiabatic burst evaporation events liquid volumes of up to 10^7 μm3 are converted to gas. The adiabatic liquid-to-gas conversions occur if air invasion fronts become unstable because of the build-up of high capillary pressures. Adiabatic evaporation bursts propagate avalanche-like through the nanopore systems until the air invasion fronts have reached new stable configurations. Adiabatic cavitation bursts thus compete with Haines jumps involving air invasion front relaxation by local liquid flow without enhanced mass transport out of the nanoporous medium and prevail if the mean pore diameter is in the range of a few tens of nanometres. The results reported here may help optimize membrane preparation via solvent-based approaches, solution-loading of nanopore systems with guest materials as well as routine use of nanoporous membranes with bicontinuous morphology and may contribute to better understanding of adsorption/desorption processes in nanoporous media. PMID:25926406

  20. The Ohio State 1991 geopotential and sea surface topography harmonic coefficient models

    NASA Technical Reports Server (NTRS)

    Rapp, Richard H.; Wang, Yan Ming; Pavlis, Nikolaos K.

    1991-01-01

    The computation is described of a geopotential model to deg 360, a sea surface topography model to deg 10/15, and adjusted Geosat orbits for the first year of the exact repeat mission (ERM). This study started from the GEM-T2 potential coefficient model and its error covariance matrix and Geosat orbits (for 22 ERMs) computed by Haines et al. using the GEM-T2 model. The first step followed the general procedures which use a radial orbit error theory originally developed by English. The Geosat data was processed to find corrections to the a priori geopotential model, corrections to a radial orbit error model for 76 Geosat arcs, and coefficients of a harmonic representation of the sea surface topography. The second stage of the analysis took place by doing a combination of the GEM-T2 coefficients with 30 deg gravity data derived from surface gravity data and anomalies obtained from altimeter data. The analysis has shown how a high degree spherical harmonic model can be determined combining the best aspects of two different analysis techniques. The error analysis was described that has led to the accuracy estimates for all the coefficients to deg 360. Significant work is needed to improve the modeling effort.

  1. Legionnaires disease presenting as acute kidney injury in the absence of pneumonia.

    PubMed

    Yogarajah, Meera; Sivasambu, Bhradeev

    2015-02-17

    Legionnaires disease is a pneumonic illness with multisystem involvement. In 1987, Haines et al. reported what was then the only published case of isolated renal disease of legionellosis without concurrent respiratory disease. A 62-year-old man presented with generalised weakness, malaise and watery diarrhoea, and was found to have acute kidney injury on admission. He was initially managed with intravenous hydration for acute gastroenteritis complicated by dehydration and acute kidney injury. Despite adequate hydration, his renal function worsened day by day. Later in the course of his illness he developed pneumonic illness and was diagnosed with Legionnaires disease after a positive urine antigen test. We are reporting the second case of Legionnaires disease presenting as an isolated acute kidney injury in the absence of respiratory symptoms on presentation. 2015 BMJ Publishing Group Ltd.

  2. 78 FR 19431 - Safety Zones; Annual Independence Day Fireworks Displays, Skagway, Haines, and Wrangell, AK

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-01

    ... combination of a large number of spectators, congested waterways, darkness punctuated by bright flashes of... be of short duration, approximately three hours. Furthermore, vessels may be authorized to transit...

  3. 78 FR 38200 - Safety Zones; Annual Independence Day Fireworks Displays, Skagway, Haines, and Wrangell, AK

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-26

    ..., darkness punctuated by bright flashes of light, and burning debris has the potential to result in serious... enforcement of these safety zones will be of short duration, approximately three hours. Furthermore, vessels...

  4. Tyrosinase Inhibitory Activities of Carissa opaca Stapf ex Haines Roots Extracts and Their Phytochemical Analysis

    PubMed Central

    Malik, Wajeeha; Ahmed, Dildar; Izhar, Sania

    2017-01-01

    Objective: Carissa opaca is a medicinal plant with rich folkloric applications. The present research was conducted to explore the tyrosinase inhibitory potential of aqueous decoction (AD) and methanolic extract (ME) of roots of C. opaca and its fractions in various solvents and their phytochemical analysis. Materials and Methods: AD of the dried powdered roots of C. opaca was prepared by boiling in water. ME was prepared by cold maceration. Its fractions were obtained in solvents of increasing polarity, i.e., hexane, chloroform, ethyl acetate, n-butanol, and water. The biomass left after extraction with methanol was boiled in water to obtain the biomass aqueous decoction (BAD). Tyrosinase inhibitory activities of the samples were studied according to a reported method. Chemical compounds in the samples were identified by gas chromatography-mass spectrometry (GC-MS). Results: The AD, BAD, and ME and its fractions displayed remarkable tyrosinase inhibitory activity. The IC50 of AD was 23.33 μg/mL as compared to 15.80 μg/mL of the standard arbutin and that of BAD was 21.24 μg/mL. The IC50 of ME was 34.76 μg/mL while that of hexane, chloroform, ethyl acetate, n-butanolic, and aqueous fractions was 21.0, 44.73, 43.40, 27.66, and 25.06 μg/mL, respectively. The hexane fraction was thus the most potent, followed by the aqueous fraction. By phytochemical analysis, campesterol, stigmasterol, gamma-sitosterol, alpha-amyrin, 9,19-cyclolanostan-3-ol, 24-methylene-,(3 β)-, lupeol, lup-20(29)-en-3-one, lup-20(29)-en-3-ol, acetate,(3 β)-, 2(1H) naphthalenone, 3,5,6,7,8,8a-hexahydro-4,8a-dimethyl-6-(1-methylethenyl)-, and 2,3,3-trimethyl-2-(3-methylbuta-1,3-dienyl)-6-methylenecyclohexanone were identified in the extracts by GC-MS. Other compounds included fatty acids and their esters. Some of these compounds are reported here from this plant for the first time. Conclusions: The root extracts exhibited considerable tyrosinase inhibitory activities, alluding to a possible application of the plant in cosmetics as a whitening agent, subject to further pharmacological studies. SUMMARY The present study aimed to explore the tyrosinase inhibitory potential of aqueous decoction and methanolic extract of roots of Carissa opaca and its fractions in various solvents and their phytochemical constituents. GC-MS analysis was conducted to identify the phytochemicals. The extracts and fractions of C. opaca roots showed remarkable anti-tyrosinase activities, alluding to their possible application to treat disorders related to overproduction of melanin. Abbreviations used: AD: Aqueous decoction; ME: Methanolic extract; BAD: Biomass aqueous decoction; GC-MS: Gas chromatography-mass spectrometry. PMID:29142412

  5. Modelling the Effects of Temperature and Cloud Cover Change on Mountain Permafrost Distribution, Northwest Canada

    NASA Astrophysics Data System (ADS)

    Bonnaventure, P. P.; Lewkowicz, A. G.

    2008-12-01

    Spatial models of permafrost probability for three study areas in northwest Canada between 59°N and 61°N were perturbed to investigate climate change impacts. The models are empirical-statistical in nature, based on basal temperature of snow (BTS) measurements in winter, and summer ground-truthing of the presence or absence of frozen ground. Predictions of BTS values are made using independent variables of elevation and potential incoming solar radiation (PISR), both derived from a 30 m DEM. These are then transformed into the probability of the presence or absence of permafrost through logistic regression. Under present climate conditions, permafrost percentages in the study areas are 44% for Haines Summit, British Columbia, 38% for Wolf Creek, Yukon, and 69% for part of the Ruby Range, Yukon (Bonnaventure and Lewkowicz, 2008; Lewkowicz and Bonaventure, 2008). Scenarios of air temperature change from -2K (approximating Neoglacial conditions) to +5K (possible within the next century according to the IPCC) were examined for the three sites. Manipulations were carried out by lowering or raising the terrain within the DEM assuming a mean environmental lapse rate of 6.5K/km. Under a -2K scenario, permafrost extent increased by 22-43% in the three study areas. Under a +5K warming, permafrost essentially disappeared in Haines Summit and Wolf Creek, while in the Ruby Range less than 12% of the area remained perennially frozen. It should be emphasized that these model predictions are for equilibrium conditions which might not be attained for several decades or longer in areas of cold permafrost. Cloud cover changes of -10% to +10% were examined through adjusting the partitioning of direct beam and diffuse radiation in the PISR input field. Changes to permafrost extent were small, ranging from -2% to -4% for greater cloudiness with changes of the opposite magnitude for less cloud. The results show that air temperature change has a much greater potential to affect mountain
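
    A highly simplified sketch of the modelling chain described above (BTS predicted from elevation and potential incoming solar radiation, transformed to permafrost probability by logistic regression, with warming applied as an equivalent elevation shift at the mean lapse rate) is given below. All coefficients are invented for illustration and are not the fitted values from the study areas.

```python
import numpy as np

# Illustrative coefficients only -- the real models were fitted per study area.
BTS_COEF = {"intercept": 2.0, "per_m_elev": -0.005, "per_unit_pisr": 0.003}
LOGIT_COEF = {"intercept": -4.0, "per_deg_bts": -1.2}   # P(permafrost | BTS)
LAPSE_RATE = 6.5 / 1000.0   # K per m, mean environmental lapse rate

def permafrost_probability(elev_m, pisr, delta_t=0.0):
    """Equilibrium permafrost probability for DEM cells under a temperature
    perturbation delta_t (K), applied by shifting the terrain by the
    equivalent elevation change (delta_t / lapse rate)."""
    elev_eff = elev_m - delta_t / LAPSE_RATE          # +5 K ~ lowering terrain ~770 m
    bts = (BTS_COEF["intercept"]
           + BTS_COEF["per_m_elev"] * elev_eff
           + BTS_COEF["per_unit_pisr"] * pisr)        # predicted basal temp. of snow
    z = LOGIT_COEF["intercept"] + LOGIT_COEF["per_deg_bts"] * bts
    return 1.0 / (1.0 + np.exp(-z))                   # logistic regression

elev = np.array([900.0, 1400.0, 1900.0])   # hypothetical DEM cells (m a.s.l.)
pisr = np.array([800.0, 600.0, 400.0])     # hypothetical PISR values
for dt in (0.0, 5.0):
    print(f"+{dt:.0f} K:", np.round(permafrost_probability(elev, pisr, dt), 2))
```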

  6. Effects of Near-Surface Atmospheric Stability and Moisture on Wildfire Behavior and Consequences for Haines Index

    Treesearch

    Ruiyu Sun; Mary Ann Jenkins

    2003-01-01

    Since the 1950s, extensive research has been conducted to investigate the relationship between near-surface atmospheric conditions and large wildfire growth and occurrence. Observational studies have demonstrated that near-surface dryness (e.g., Fahnestock 1965) and atmospheric instability (e.g., Brotak and Reifsnyder 1977) are correlated with large wildfire growth and...

  7. Comparison of the socio-demographic and clinical features of pulmonary TB patients infected with sub-lineages within the W-Beijing and non-Beijing Mycobacterium tuberculosis.

    PubMed

    Hu, Yi; Mathema, Barun; Zhao, Qi; Zheng, Xubin; Li, Dange; Jiang, Weili; Wang, Weibing; Xu, Biao

    2016-03-01

    Highly lethal outbreaks of multidrug-resistant (MDR) and extensively drug-resistant (XDR) tuberculosis are increasing. The Beijing family of Mycobacterium tuberculosis and its members are regarded as a successful clone of M. tuberculosis that is associated with drug resistance in China. Understanding the genetic characteristics and molecular mechanisms of drug-resistant tuberculosis within the Beijing family may help to clarify its origin and evolutionary history and the driving forces behind its emergence and current dissemination. A total of 1222 Mycobacterium tuberculosis isolates were recovered from patients in six counties of two provinces in eastern China between 2010 and 2012. Strain lineage and its major subgroups were studied using spoligotyping and MIRU-VNTR, respectively. The 1st-line drug susceptibility was analyzed by the proportional method and 2nd-line drug susceptibility was determined by the Hain MTBDRsl test. The genetic characterization of drug resistance was analyzed by sequencing the previously reported genes and loci associated with drug resistance, together with multiple genotyping methods including MIRU-VNTR, spoligotyping, and LSP genotyping. Of the 1222 Mtb isolates, 298 (24.4%) were resistant to 1st-line drugs and 73 (5.9%) were simultaneously resistant to INH and RIF, i.e., MDR-TB. By the Hain test, 23.8% of 1st-line drug-resistant TB and 12.0% of drug-susceptible TB contained mutations associated with 2nd-line drug resistance. Spoligotyping of the 1222 Mtb isolates revealed that 967 (79.1%) of the isolates belonged to the W-Beijing family. Within the W-Beijing family, 78.8% of MDR-TB was observed in isolates with simultaneous deletion of RD105 and RD207, with sub-lineage 181 accounting for 75% of MDR-TB. Analysis of 24 MIRU-VNTR loci revealed that 88.2% (15/17) of MDR and extensively drug-resistant (XDR) clustered isolates were sub-lineage 181. Sub-lineage 181 might have the capacity to spread throughout the general community in rural China. This is

  8. Scientists as Producers, Presenters, Videographers, Distributors and 'Stars': The Revolution In Science Filmmaking, from COSMOS to iPhones on Kilimanjaro

    NASA Astrophysics Data System (ADS)

    Haines-Stiles, G.; Akuginow, E.; Morris, K.

    2013-12-01

    In 1980, Carl Sagan's COSMOS received ratings of some 16 million and won three Emmys and a Peabody award. Sagan was hailed as a 'Showman of Science' by Time magazine, confirming his status as a science superstar. Haines-Stiles, 1st author for this presentation, was a Senior Producer and series director on what was for several decades PBS's highest-rated science series. Some researchers still consider primetime series on national networks as THE way to engage and inform audiences. But a revolution in both the making and consuming of science film and television has transformed the media landscape from high profile series such as COSMOS to more of a 'horizontal' ecosystem in which different formats for diverse audiences via multiple distribution networks are the norm. From the early 1990's the Internet has played an increasingly prominent role in this revolution. In 1993, Haines-Stiles and Akuginow added interactivity to traditional one-way TV broadcasts with 'Dale's Dive Diary,' in what was arguably the world's first science blog, detailing online the joys and rigors of working in Antarctica. Increasingly, the evolution of media allowed for the documentation of the process of doing science along with "eureka" discoveries and press conference results. In POLAR-PALOOZA (PPZA) this new perspective was further extended by taking Arctic and Antarctic researchers on the road to science museums in some 25 communities across the USA for spoken-word performances supported by High Definition video profiles of scientists at work at remote locations. In one instance, a researcher was given a crash course in videography and loaned a low-cost prosumer camcorder to take with her to the heart of East Antarctica. Excellent video was captured, and made part of large screen presentations in IMAX-scale theaters. In addition to the Summative Evaluation (required by project sponsors, NSF and NASA) which focused on audience responses, a recent research paper by communications scholar, Kim

  9. Studies of aerothermal loads generated in regions of shock/shock interaction in hypersonic flow

    NASA Technical Reports Server (NTRS)

    Holden, Michael S.; Moselle, John R.; Lee, Jinho

    1991-01-01

    Experimental studies were conducted to examine the aerothermal characteristics of shock/shock/boundary layer interaction regions generated by single and multiple incident shocks. The presented experimental studies were conducted over a Mach number range from 6 to 19 for a range of Reynolds numbers to obtain both laminar and turbulent interaction regions. Detailed heat transfer and pressure measurements were made for a range of interaction types and incident shock strengths over a transverse cylinder, with emphasis on the type III and type IV interaction regions. The measurements were compared with the simple Edney, Keyes, and Hains models for a range of interaction configurations and freestream conditions. The complex flowfields and aerothermal loads generated by multiple-shock impingement, while not generating as large peak loads, provide important test cases for code prediction. The detailed heat transfer and pressure measurements proved a good basis for evaluating the accuracy of simple prediction methods and detailed numerical solutions for laminar and transitional regions of shock/shock interactions.

  10. Advancing educational diversity: antifragility, standardization, democracy, and a multitude of education options

    NASA Astrophysics Data System (ADS)

    Fortunato, Michael W. P.

    2017-03-01

    This essay is a response to a paper by Avery and Hains that raises questions about the often unintended effects of knowledge standardization in an educational setting. While many K-12 schools are implementing common core standards, and many institutions of higher education are implementing their own standardized educational practices, the question is raised about what is lost in this effort to ensure regularity and consistency in educational outcomes. One such casualty may be local knowledge, which in a rural context includes ancestral knowledge about land, society, and cultural meaning. This essay explores whether or not efforts to standardize crowd out such knowledge, and decrease the diversity of knowledge within our society's complex ecosystem—thus making the ecosystem weaker. Using antifragility as a useful idea for examining system complexity, the essay considers the impact of standardization on innovation, democracy, and the valuation of some forms of knowledge (and its bearers) above others.

  11. Towards seasonal Arctic shipping route predictions

    NASA Astrophysics Data System (ADS)

    Haines, K.; Melia, N.; Hawkins, E.; Day, J. J.

    2017-12-01

    should become a key component of the future Arctic observing system. Melia, N., K. Haines, and E. Hawkins (2016), Sea ice decline and 21st century trans-Arctic shipping routes, Geophys. Res. Lett., doi:10.1002/ 2016GL069315. Melia, N., K. Haines, E. Hawkins and J.J. Day, 2017, Towards seasonal Arctic shipping route predictions. Env. Res. Lett., doi:10.1088/1748-9326/aa7a60

  12. Liquid redistribution behind a drainage front in porous media imaged by neutron radiography

    NASA Astrophysics Data System (ADS)

    Hoogland, Frouke; Lehmann, Peter; Moebius, Franziska; Vontobel, Peter; Or, Dani

    2013-04-01

    Drainage from porous media is a highly dynamic process involving the motion of a displacement front with rapid pore-scale interfacial jumps and phase entrapment, but also a host of more gradual liquid redistribution processes in the unsaturated region behind the front. Depending on the velocity of the drainage process, the liquid properties, and the permeability of the porous medium, redistribution lingers long after the main drainage process is stopped, until gravity and capillary forces regain equilibrium. The rapid and often highly inertial Haines jumps at the drainage front challenge the validity of the Buckingham-Darcy law and thus the representation of the process on the foundation of the Richards equation. To quantify front displacement and liquid reconfiguration and to test the validity of the Richards equation with respect to fast drainage dynamics, we carried out drainage experiments by withdrawing water from the bottom of initially saturated sand-filled Hele-Shaw cells at constant water flux (2.6 or 13.1 mm/minute). Water content distribution and the evolution of the drainage front were measured with neutron radiography at spatial and temporal resolutions of 0.1 mm and 3 seconds, respectively. Water pressure was measured above and below the front using pressure transducers and a tensiometer. After the pump was stopped (at a front depth around 100 mm), capillary pressure values in the unsaturated region (above the front) gradually converged to a new equilibrium. The pressure signal in the saturated region below the front reflected viscous losses during flow that were relaxed when the pump stopped. During pressure relaxation, water was redistributed primarily downward in the unsaturated region. Pressure signals and the dynamics of water content profiles for the fast process (13.6 mm/minute) could not be reproduced with the Richards equation based on hydraulic functions determined in preceding laboratory experiments. To explore if the deviations stem from inappropriate hydraulic functions we
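
    For reference, the Buckingham-Darcy flux law and the Richards equation referred to above can be written in standard notation (textbook form, not reproduced from the paper):

```latex
% q      : volumetric water flux (positive upward)
% K(h)   : unsaturated hydraulic conductivity
% h      : matric pressure head, z : elevation, theta : volumetric water content
\[
  q = -K(h)\,\frac{\partial (h + z)}{\partial z}
  \qquad \text{(Buckingham--Darcy)}
\]
\[
  \frac{\partial \theta}{\partial t}
  = \frac{\partial}{\partial z}\!\left[ K(h)\left( \frac{\partial h}{\partial z} + 1 \right) \right]
  \qquad \text{(Richards equation, 1-D vertical)}
\]
```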

  13. Markov Processes in Image Processing

    NASA Astrophysics Data System (ADS)

    Petrov, E. P.; Kharina, N. L.

    2018-05-01

    Digital images are used as an information carrier in many sciences and technologies. There is a trend toward increasing the number of bits per image pixel in order to obtain more information. In this paper, methods of compression and contour detection based on a two-dimensional Markov chain are proposed. Increasing the number of bits per pixel allows fine object details to be resolved more precisely, but it significantly complicates image processing. The proposed methods do not fall short of well-known analogues in efficiency, and surpass them in processing speed. An image is separated into binary images (bit planes), and each is processed in parallel, so that speed is not lost as the number of bits per pixel increases. A further advantage of the methods is their low consumption of energy resources: only logical procedures are used, with no arithmetic operations. The methods can be useful for processing images of any class and purpose in processing systems with limited time and energy resources.
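
    The bit-plane separation step described above can be illustrated in a few lines. The sketch below is not the authors' algorithm; it simply splits an 8-bit image into binary planes that can be processed independently with purely logical operations and reassembled losslessly.

```python
import numpy as np

def to_bit_planes(img):
    """Split an 8-bit grayscale image into 8 binary images (bit planes)."""
    img = np.asarray(img, dtype=np.uint8)
    return [(img >> k) & 1 for k in range(8)]   # plane 0 = least significant bit

def from_bit_planes(planes):
    """Reassemble the original image from its bit planes (lossless)."""
    img = np.zeros(planes[0].shape, dtype=np.uint8)
    for k, plane in enumerate(planes):
        img |= plane.astype(np.uint8) << k
    return img

rng = np.random.default_rng(1)
image = rng.integers(0, 256, size=(4, 4), dtype=np.uint8)
planes = to_bit_planes(image)                          # 8 binary images
assert np.array_equal(from_bit_planes(planes), image)  # exact reconstruction
print(len(planes), "binary planes; reconstruction is exact")
```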

  14. Tourism and its effects on southeast Alaska communities and resources: case studies from Haines, Craig, and Hoonah, Alaska.

    Treesearch

    Lee K. Cerveny

    2005-01-01

    Tourism has become integral to southeast Alaska’s regional economy and has resulted in changes to the social and cultural fabric of community life as well as to natural resources used by Alaskans. This study incorporates an ethnographic approach to trace tourism development in three rural southeast Alaska communities featuring different levels and types of tourism. In...

  15. Process and Post-Process: A Discursive History.

    ERIC Educational Resources Information Center

    Matsuda, Paul Kei

    2003-01-01

    Examines the history of process and post-process in composition studies, focusing on ways in which terms, such as "current-traditional rhetoric,""process," and "post-process" have contributed to the discursive construction of reality. Argues that use of the term post-process in the context of second language writing needs to be guided by a…

  16. Perceptual processing affects conceptual processing.

    PubMed

    Van Dantzig, Saskia; Pecher, Diane; Zeelenberg, René; Barsalou, Lawrence W

    2008-04-05

    According to the Perceptual Symbols Theory of cognition (Barsalou, 1999), modality-specific simulations underlie the representation of concepts. A strong prediction of this view is that perceptual processing affects conceptual processing. In this study, participants performed a perceptual detection task and a conceptual property-verification task in alternation. Responses on the property-verification task were slower for those trials that were preceded by a perceptual trial in a different modality than for those that were preceded by a perceptual trial in the same modality. This finding of a modality-switch effect across perceptual processing and conceptual processing supports the hypothesis that perceptual and conceptual representations are partially based on the same systems. 2008 Cognitive Science Society, Inc.

  17. Process Correlation Analysis Model for Process Improvement Identification

    PubMed Central

    Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in those process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort involved and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data. PMID:24977170

  18. Process correlation analysis model for process improvement identification.

    PubMed

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in those process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort involved and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  19. 5 CFR 1653.13 - Processing legal processes.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 3 2014-01-01 2014-01-01 false Processing legal processes. 1653.13 Section 1653.13 Administrative Personnel FEDERAL RETIREMENT THRIFT INVESTMENT BOARD COURT ORDERS AND LEGAL PROCESSES AFFECTING THRIFT SAVINGS PLAN ACCOUNTS Legal Process for the Enforcement of a Participant's Legal...

  20. 5 CFR 1653.13 - Processing legal processes.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 3 2013-01-01 2013-01-01 false Processing legal processes. 1653.13 Section 1653.13 Administrative Personnel FEDERAL RETIREMENT THRIFT INVESTMENT BOARD COURT ORDERS AND LEGAL PROCESSES AFFECTING THRIFT SAVINGS PLAN ACCOUNTS Legal Process for the Enforcement of a Participant's Legal...

  1. Management of processes of electrochemical dimensional processing

    NASA Astrophysics Data System (ADS)

    Akhmetov, I. D.; Zakirova, A. R.; Sadykov, Z. B.

    2017-09-01

    In many industries, large numbers of high-precision parts are produced from difficult-to-machine, scarce materials. Such parts can be formed only by non-contact processing, or with a minimum of mechanical force, which is achievable, for example, by electrochemical machining. At the present stage of development of metalworking processes, the management of electrochemical machining and its automation are important issues. This article provides some indicators and factors of the electrochemical machining process.

  2. Biscrolling nanotube sheets and functional guests into yarns

    NASA Astrophysics Data System (ADS)

    Baughman, Ray

    2011-03-01

    Multifunctional applications of textiles have been limited by the inability to spin important materials into yarns. Generically applicable methods are demonstrated for producing weavable yarns comprising up to 95 wt % of otherwise unspinnable particulate or nanofiber powders that remain highly functional. Scrolled 50 nm thick carbon nanotube sheets confine these powders in the galleries of irregular scroll sacks, whose observed complex structures are related to twist-dependent extension of Archimedean spirals, Fermat spirals, or spiral pairs into scrolls. The strength and electronic connectivity of a small weight fraction of scrolled carbon nanotube sheet enables yarn weaving, sewing, knotting, braiding, and charge collection. This technology is used to make yarns of superconductors, Li-ion battery materials, graphene ribbons, catalytic nanofibers for fuel cells, and TiO2 for photocatalysis. Work done in collaboration with Shaoli Fang, Xavier Lepro-Chavez, Chihye Lewis, Raquel Ovalle-Robles, Javier Carratero-Gonzalez, Elisabet Castillo-Martinez, Mikhail Kozlov, Jiyoung Oh, Neema Rawat, Carter Haines, Mohammed Haque, Vaishnavi Aare, Stephanie Stoughton, Anvar Zakhidov, and Ray Baughman, The University of Texas at Dallas / Alan G. MacDiarmid NanoTech Institute.

  3. Perceptual Processing Affects Conceptual Processing

    ERIC Educational Resources Information Center

    van Dantzig, Saskia; Pecher, Diane; Zeelenberg, Rene; Barsalou, Lawrence W.

    2008-01-01

    According to the Perceptual Symbols Theory of cognition (Barsalou, 1999), modality-specific simulations underlie the representation of concepts. A strong prediction of this view is that perceptual processing affects conceptual processing. In this study, participants performed a perceptual detection task and a conceptual property-verification task…

  4. SAR processing using SHARC signal processing systems

    NASA Astrophysics Data System (ADS)

    Huxtable, Barton D.; Jackson, Christopher R.; Skaron, Steve A.

    1998-09-01

    Synthetic aperture radar (SAR) is uniquely suited to help solve the Search and Rescue problem since it can be utilized either day or night and through both dense fog and thick cloud cover. Other papers in this session, and in this session in 1997, describe the various SAR image processing algorithms that are being developed and evaluated within the Search and Rescue Program. All of these approaches to using SAR data require substantial amounts of digital signal processing: for the SAR image formation, and possibly for the subsequent image processing. In recognition of the demanding processing that will be required for an operational Search and Rescue Data Processing System (SARDPS), NASA/Goddard Space Flight Center and NASA/Stennis Space Center are conducting a technology demonstration utilizing SHARC multi-chip modules from Boeing to perform SAR image formation processing.

  5. Improving the process of process modelling by the use of domain process patterns

    NASA Astrophysics Data System (ADS)

    Koschmider, Agnes; Reijers, Hajo A.

    2015-01-01

    The use of business process models has become prevalent in a wide area of enterprise applications. But while their popularity is expanding, concerns are growing with respect to their proper creation and maintenance. An obvious way to boost the efficiency of creating high-quality business process models would be to reuse relevant parts of existing models. At this point, however, limited support exists to guide process modellers towards the usage of appropriate model content. In this paper, a set of content-oriented patterns is presented, which is extracted from a large set of process models from the order management and manufacturing production domains. The patterns are derived using a newly proposed set of algorithms, which are being discussed in this paper. The authors demonstrate how such Domain Process Patterns, in combination with information on their historic usage, can support process modellers in generating new models. To support the wider dissemination and development of Domain Process Patterns within and beyond the studied domains, an accompanying website has been set up.

  6. High-throughput process development: I. Process chromatography.

    PubMed

    Rathore, Anurag S; Bhambure, Rahul

    2014-01-01

    Chromatographic separation serves as "a workhorse" for downstream process development and plays a key role in removal of product-related, host cell-related, and process-related impurities. Complex and poorly characterized raw materials and feed material, low feed concentration, product instability, and poor mechanistic understanding of the processes are some of the critical challenges that are faced during development of a chromatographic step. Traditional process development is performed as trial-and-error-based evaluation and often leads to a suboptimal process. A high-throughput process development (HTPD) platform involves an integration of miniaturization, automation, and parallelization and provides a systematic approach for time- and resource-efficient chromatography process development. Creation of such platforms requires integration of mechanistic knowledge of the process with various statistical tools for data analysis. The relevance of such a platform is high in view of the constraints with respect to time and resources that the biopharma industry faces today. This protocol describes the steps involved in performing HTPD of a process chromatography step. It describes the operation of a commercially available device (PreDictor™ plates from GE Healthcare). This device is available in 96-well format with 2 or 6 μL well size. We also discuss the challenges that one faces when performing such experiments as well as possible solutions to alleviate them. Besides describing the operation of the device, the protocol also presents an approach for statistical analysis of the data that is gathered from such a platform. A case study involving use of the protocol for examining ion-exchange chromatography of granulocyte colony-stimulating factor (GCSF), a therapeutic product, is briefly discussed. This is intended to demonstrate the usefulness of this protocol in generating data that is representative of the data obtained at the traditional lab scale. The agreement in the

  7. Process-based tolerance assessment of connecting rod machining process

    NASA Astrophysics Data System (ADS)

    Sharma, G. V. S. S.; Rao, P. Srinivasa; Surendra Babu, B.

    2016-06-01

    Process tolerancing based on the process capability studies is the optimistic and pragmatic approach of determining the manufacturing process tolerances. On adopting the define-measure-analyze-improve-control approach, the process potential capability index (Cp) and the process performance capability index (Cpk) values of identified process characteristics of connecting rod machining process are achieved to be greater than the industry benchmark of 1.33, i.e., four sigma level. The tolerance chain diagram methodology is applied to the connecting rod in order to verify the manufacturing process tolerances at various operations of the connecting rod manufacturing process. This paper bridges the gap between the existing dimensional tolerances obtained via tolerance charting and process capability studies of the connecting rod component. Finally, the process tolerancing comparison has been done by adopting a tolerance capability expert software.
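
    As a rough illustration of the capability indices mentioned in the record above, the following sketch estimates Cp and Cpk for a single measured characteristic. It is a generic Python example with hypothetical specification limits and simulated data standing in for the connecting-rod measurements; it is not the authors' tolerance capability software.

        import numpy as np

        def capability_indices(samples, lsl, usl):
            """Estimate Cp and Cpk for a normally distributed characteristic."""
            mu = np.mean(samples)
            sigma = np.std(samples, ddof=1)                  # sample standard deviation
            cp = (usl - lsl) / (6.0 * sigma)                 # potential capability
            cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)    # performance capability
            return cp, cpk

        # Hypothetical data: a bore diameter with specification 24.00 +/- 0.05 mm.
        rng = np.random.default_rng(0)
        diameters = rng.normal(loc=24.01, scale=0.01, size=50)
        cp, cpk = capability_indices(diameters, lsl=23.95, usl=24.05)
        print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")

    Values of both indices above 1.33 correspond to the four-sigma benchmark quoted in the record; Cpk falls below Cp whenever the process mean drifts away from the centre of the specification.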

  8. Value-driven process management: using value to improve processes.

    PubMed

    Melnyk, S A; Christensen, R T

    2000-08-01

    Every firm can be viewed as consisting of various processes. These processes affect everything that the firm does from accepting orders and designing products to scheduling production. In many firms, the management of processes often reflects considerations of efficiency (cost) rather than effectiveness (value). In this article, we introduce a well-structured process for managing processes that begins not with the process, but rather with the customer and the product and the concept of value. This process progresses through a number of steps which include issues such as defining value, generating the appropriate metrics, identifying the critical processes, mapping and assessing the performance of these processes, and identifying long- and short-term areas for action. What makes the approach presented in this article so powerful is that it explicitly links the customer to the process and that the process is evaluated in terms of its ability to effectively serve the customers.

  9. 5 CFR 1653.13 - Processing legal processes.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... legal processes from the TSP is governed solely by the Federal Employees' Retirement System Act, 5 U.S.C... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Processing legal processes. 1653.13 Section 1653.13 Administrative Personnel FEDERAL RETIREMENT THRIFT INVESTMENT BOARD COURT ORDERS AND LEGAL...

  10. Coal liquefaction process with enhanced process solvent

    DOEpatents

    Givens, Edwin N.; Kang, Dohee

    1984-01-01

    In an improved coal liquefaction process, including a critical solvent deashing stage, high value product recovery is improved and enhanced process-derived solvent is provided by recycling second separator underflow in the critical solvent deashing stage to the coal slurry mix, for inclusion in the process solvent pool.

  11. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

    Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty but ignore the model uncertainty for process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating the model averaging methods into the framework of variance-based global sensitivity analysis, given that the model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is also simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.

  12. Defense Waste Processing Facility Process Enhancements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bricker, Jonathan

    2010-11-01

    Jonathan Bricker provides an overview of process enhancements currently underway at the Defense Waste Processing Facility (DWPF) at SRS. These enhancements include melter bubblers, reduced water use, and an alternate reductant.

  13. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE PAGES

    Dai, Heng; Ye, Ming; Walker, Anthony P.; ...

    2017-03-28

    A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
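
    The process sensitivity index described above can be approximated with a plain nested Monte Carlo loop. The sketch below is only a conceptual illustration under made-up assumptions (two hypothetical recharge models, two hypothetical conductivity models, equal model weights, and a toy response function); it does not reproduce the authors' groundwater reactive transport study.

        import numpy as np

        rng = np.random.default_rng(1)
        n_outer, n_inner = 2000, 200

        # Hypothetical competing process models, each selected with equal probability.
        recharge_models = [lambda p, a: a * p, lambda p, a: a * np.sqrt(p)]
        geology_models = [lambda k: k, lambda k: 0.5 + 0.5 * k]

        def response(recharge, conductivity):
            # Toy stand-in for the groundwater reactive-transport model output.
            return recharge / (1.0 + conductivity)

        cond_means = np.empty(n_outer)
        outputs = []
        for i in range(n_outer):
            # Outer loop: fix one realization of the recharge process (model choice + parameters).
            m_r = rng.integers(0, 2)
            p, a = rng.uniform(1.0, 2.0), rng.uniform(0.2, 0.4)
            recharge = recharge_models[m_r](p, a)
            # Inner loop: average over the geology process (model choice + parameters).
            m_g = rng.integers(0, 2, n_inner)
            k = rng.uniform(0.5, 1.5, n_inner)
            conductivity = np.where(m_g == 0, geology_models[0](k), geology_models[1](k))
            y = response(recharge, conductivity)
            cond_means[i] = y.mean()
            outputs.append(y)

        total_variance = np.concatenate(outputs).var()
        ps_recharge = cond_means.var() / total_variance   # share of output variance due to recharge
        print(f"process sensitivity index (recharge) ~ {ps_recharge:.2f}")

    The variance of the conditional means in the outer loop captures variability due to both the recharge model choice and its parameters, which is the defining feature of the index: model uncertainty and parametric uncertainty enter the same variance decomposition.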

  14. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

    A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.

  15. Analysis of Hospital Processes with Process Mining Techniques.

    PubMed

    Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises

    2015-01-01

    Process mining allows for discovery, monitoring, and improving processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed functions in the system to be redefined and a proper flow of information to be proposed. The study exposed the need to incorporate process mining techniques in hospital systems to analyze the execution of processes. Moreover, we illustrate its application for making clinical and administrative decisions for the management of hospital activities.
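
    As a minimal illustration of the kind of event-log analysis that process mining starts from, the sketch below derives a directly-follows relation from a small, hypothetical outpatient event log. Real process mining tools of the kind referenced in the record implement far richer discovery and conformance-checking algorithms on top of this relation.

        from collections import Counter, defaultdict

        # Hypothetical event log: (case_id, activity) pairs, already ordered in time per case.
        event_log = [
            ("case1", "registration"), ("case1", "triage"), ("case1", "consultation"), ("case1", "discharge"),
            ("case2", "registration"), ("case2", "consultation"), ("case2", "lab test"),
            ("case2", "consultation"), ("case2", "discharge"),
        ]

        # Group events by case, preserving order.
        traces = defaultdict(list)
        for case_id, activity in event_log:
            traces[case_id].append(activity)

        # Count directly-follows pairs (a, b): activity b occurred immediately after a in some case.
        directly_follows = Counter()
        for activities in traces.values():
            for a, b in zip(activities, activities[1:]):
                directly_follows[(a, b)] += 1

        for (a, b), count in directly_follows.most_common():
            print(f"{a} -> {b}: {count}")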

  16. Hyperspectral processing in graphical processing units

    NASA Astrophysics Data System (ADS)

    Winter, Michael E.; Winter, Edwin M.

    2011-06-01

    With the advent of the commercial 3D video card in the mid 1990s, we have seen an order of magnitude performance increase with each generation of new video cards. While these cards were designed primarily for visualization and video games, it became apparent after a short while that they could be used for scientific purposes. These Graphical Processing Units (GPUs) are rapidly being incorporated into data processing tasks usually reserved for general purpose computers. It has been found that many image processing problems scale well to modern GPU systems. We have implemented four popular hyperspectral processing algorithms (N-FINDR, linear unmixing, Principal Components, and the RX anomaly detection algorithm). These algorithms show an across the board speedup of at least a factor of 10, with some special cases showing extreme speedups of a hundred times or more.
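
    Of the four algorithms named in the record, the RX anomaly detector is the easiest to sketch: each pixel spectrum is scored by its Mahalanobis distance from the global background statistics. The following is a plain NumPy reference version on synthetic data, not the GPU implementation the record describes; a GPU-array port would parallelize the same arithmetic over pixels.

        import numpy as np

        def rx_detector(cube):
            """Global RX anomaly scores for a hyperspectral cube of shape (rows, cols, bands)."""
            rows, cols, bands = cube.shape
            pixels = cube.reshape(-1, bands).astype(float)
            mean = pixels.mean(axis=0)
            cov_inv = np.linalg.inv(np.cov(pixels, rowvar=False))
            centered = pixels - mean
            # Squared Mahalanobis distance of every pixel spectrum from the background mean.
            scores = np.einsum("ij,jk,ik->i", centered, cov_inv, centered)
            return scores.reshape(rows, cols)

        # Synthetic cube with one artificially anomalous pixel.
        rng = np.random.default_rng(0)
        cube = rng.normal(size=(64, 64, 10))
        cube[32, 32] += 5.0
        scores = rx_detector(cube)
        print("most anomalous pixel:", np.unravel_index(scores.argmax(), scores.shape))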

  17. Voyager image processing at the Image Processing Laboratory

    NASA Astrophysics Data System (ADS)

    Jepsen, P. L.; Mosher, J. A.; Yagi, G. M.; Avis, C. C.; Lorre, J. J.; Garneau, G. W.

    1980-09-01

    This paper discusses new digital processing techniques as applied to the Voyager Imaging Subsystem and devised to explore atmospheric dynamics, spectral variations, and the morphology of Jupiter, Saturn and their satellites. Radiometric and geometric decalibration processes, the modulation transfer function, and processes to determine and remove photometric properties of the atmosphere and surface of Jupiter and its satellites are examined. It is exhibited that selected images can be processed into 'approach at constant longitude' time lapse movies which are useful in observing atmospheric changes of Jupiter. Photographs are included to illustrate various image processing techniques.

  18. Voyager image processing at the Image Processing Laboratory

    NASA Technical Reports Server (NTRS)

    Jepsen, P. L.; Mosher, J. A.; Yagi, G. M.; Avis, C. C.; Lorre, J. J.; Garneau, G. W.

    1980-01-01

    This paper discusses new digital processing techniques as applied to the Voyager Imaging Subsystem and devised to explore atmospheric dynamics, spectral variations, and the morphology of Jupiter, Saturn and their satellites. Radiometric and geometric decalibration processes, the modulation transfer function, and processes to determine and remove photometric properties of the atmosphere and surface of Jupiter and its satellites are examined. It is exhibited that selected images can be processed into 'approach at constant longitude' time lapse movies which are useful in observing atmospheric changes of Jupiter. Photographs are included to illustrate various image processing techniques.

  19. Assessment of hospital processes using a process mining technique: Outpatient process analysis at a tertiary hospital.

    PubMed

    Yoo, Sooyoung; Cho, Minsu; Kim, Eunhye; Kim, Seok; Sim, Yerim; Yoo, Donghyun; Hwang, Hee; Song, Minseok

    2016-04-01

    Many hospitals are increasing their efforts to improve processes because processes play an important role in enhancing work efficiency and reducing costs. However, to date, a quantitative tool has not been available to examine the before and after effects of processes and environmental changes, other than the use of indirect indicators, such as mortality rate and readmission rate. This study used process mining technology to analyze process changes based on changes in the hospital environment, such as the construction of a new building, and to measure the effects of environmental changes in terms of consultation wait time, time spent per task, and outpatient care processes. Using process mining technology, electronic health record (EHR) log data of outpatient care before and after constructing a new building were analyzed, and the effectiveness of the technology in terms of the process was evaluated. Using the process mining technique, we found that the total time spent in outpatient care did not increase significantly compared to that before the construction of a new building, considering that the number of outpatients increased, and the consultation wait time decreased. These results suggest that the operation of the outpatient clinic was effective after changes were implemented in the hospital environment. We further identified improvements in processes using the process mining technique, thereby demonstrating the usefulness of this technique for analyzing complex hospital processes at a low cost. This study confirmed the effectiveness of process mining technology at an actual hospital site. In future studies, the use of process mining technology will be expanded by applying this approach to a larger variety of process change situations. Copyright © 2016. Published by Elsevier Ireland Ltd.

  20. Monitoring autocorrelated process: A geometric Brownian motion process approach

    NASA Astrophysics Data System (ADS)

    Li, Lee Siaw; Djauhari, Maman A.

    2013-09-01

    Autocorrelated process control is common in today's modern industrial process control practice. The current practice of autocorrelated process control is to eliminate the autocorrelation by using an appropriate model such as Box-Jenkins models or other models and then to conduct process control operation based on the residuals. In this paper we show that many time series are governed by a geometric Brownian motion (GBM) process. Therefore, in this case, by using the properties of a GBM process, we only need an appropriate transformation and model the transformed data to come up with the conditions needed in traditional process control. An industrial example of a cocoa powder production process in a Malaysian company will be presented and discussed to illustrate the advantages of the GBM approach.
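
    The practical appeal of the GBM view in this record is that the logarithm of a geometric Brownian motion has independent, normally distributed increments, so a conventional individuals chart can be applied to the log-returns instead of the raw autocorrelated series. The sketch below illustrates that transformation on simulated data; the drift, volatility, and chart constants are generic and not taken from the cocoa powder example.

        import numpy as np

        rng = np.random.default_rng(0)

        # Simulate a geometric Brownian motion path standing in for an autocorrelated quality series.
        n, dt, mu, sigma, x0 = 500, 1.0, 0.001, 0.02, 100.0
        increments = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * rng.standard_normal(n)
        x = x0 * np.exp(np.cumsum(increments))

        # Transformation: log-returns of a GBM are i.i.d. normal, so classical charting applies.
        log_returns = np.diff(np.log(x))
        center = log_returns.mean()
        mean_moving_range = np.abs(np.diff(log_returns)).mean()
        sigma_hat = mean_moving_range / 1.128        # d2 constant for moving ranges of size 2
        ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

        signals = np.where((log_returns > ucl) | (log_returns < lcl))[0]
        print(f"control limits: [{lcl:.4f}, {ucl:.4f}], out-of-control points: {signals}")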

  1. Integrated Process Modeling-A Process Validation Life Cycle Companion.

    PubMed

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-17

    During the regulatory requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we want to present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data which underlines the lifecycle concept. These applications will be shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows practitioners to anticipate out-of-specification (OOS) events, identify critical process parameters, and take risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
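
    A minimal sketch of the integrated-process-model idea is shown below: two hypothetical unit operations are chained, variation in their process parameters is propagated through both by Monte Carlo simulation, and the fraction of runs violating a made-up specification approximates the probability of an out-of-specification event. The response surfaces, parameter distributions, and specification limit are all invented for illustration and are not the IPM described in the record.

        import numpy as np

        rng = np.random.default_rng(42)
        n_runs = 100_000

        # Hypothetical unit operation 1: fermentation titre as a function of two process parameters.
        temperature = rng.normal(37.0, 0.5, n_runs)     # degrees C
        ph = rng.normal(7.0, 0.1, n_runs)
        titre = 5.0 - 0.3 * (temperature - 37.0) ** 2 - 2.0 * (ph - 7.0) ** 2 \
                + rng.normal(0.0, 0.1, n_runs)

        # Hypothetical unit operation 2: purity depends on the incoming titre and a step parameter,
        # so variation from the first operation propagates into the second.
        load_ratio = rng.normal(0.8, 0.05, n_runs)
        purity = 95.0 + 1.5 * (titre - 5.0) - 20.0 * (load_ratio - 0.8) \
                 + rng.normal(0.0, 0.5, n_runs)

        # Probability of an out-of-specification (OOS) event for a made-up purity specification.
        lower_spec = 93.0
        print(f"estimated OOS probability: {np.mean(purity < lower_spec):.4%}")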

  2. Meat Processing.

    ERIC Educational Resources Information Center

    Legacy, Jim; And Others

    This publication provides an introduction to meat processing for adult students in vocational and technical education programs. Organized in four chapters, the booklet provides a brief overview of the meat processing industry and the techniques of meat processing and butchering. The first chapter introduces the meat processing industry and…

  3. Process for improving metal production in steelmaking processes

    DOEpatents

    Pal, Uday B.; Gazula, Gopala K. M.; Hasham, Ali

    1996-01-01

    A process and apparatus for improving metal production in ironmaking and steelmaking processes is disclosed. The use of an inert metallic conductor in the slag containing crucible and the addition of a transition metal oxide to the slag are the disclosed process improvements.

  4. Process for improving metal production in steelmaking processes

    DOEpatents

    Pal, U.B.; Gazula, G.K.M.; Hasham, A.

    1996-06-18

    A process and apparatus for improving metal production in ironmaking and steelmaking processes is disclosed. The use of an inert metallic conductor in the slag containing crucible and the addition of a transition metal oxide to the slag are the disclosed process improvements. 6 figs.

  5. Process-Response Modeling and the Scientific Process.

    ERIC Educational Resources Information Center

    Fichter, Lynn S.

    1988-01-01

    Discusses the process-response model (PRM) in its theoretical and practical forms. Describes how geologists attempt to reconstruct the process from the response (the geologic phenomenon) being studied. (TW)

  6. Cognitive Processes in Discourse Comprehension: Passive Processes, Reader-Initiated Processes, and Evolving Mental Representations

    ERIC Educational Resources Information Center

    van den Broek, Paul; Helder, Anne

    2017-01-01

    As readers move through a text, they engage in various types of processes that, if all goes well, result in a mental representation that captures their interpretation of the text. With each new text segment the reader engages in passive and, at times, reader-initiated processes. These processes are strongly influenced by the readers'…

  7. In-process and post-process measurements of drill wear for control of the drilling process

    NASA Astrophysics Data System (ADS)

    Liu, Tien-I.; Liu, George; Gao, Zhiyu

    2011-12-01

    Optical inspection was used in this research for the post-process measurements of drill wear. A precision toolmakers' microscope was used. An indirect index, cutting force, is used for in-process drill wear measurements. Using in-process measurements to estimate the drill wear for control purposes can decrease the operation cost and enhance the product quality and safety. The challenge is to correlate the in-process cutting force measurements with the post-process optical inspection of drill wear. To find the most important feature, the energy principle was used in this research. It is necessary to select only the cutting force feature which shows the highest sensitivity to drill wear. The best feature selected is the peak of torque in the drilling process. Neuro-fuzzy systems were used for correlation purposes. The Adaptive-Network-Based Fuzzy Inference System (ANFIS) can construct fuzzy rules with membership functions to generate an input-output pair. A 1x6 ANFIS architecture with product of sigmoid membership functions can in-process measure the drill wear with an error as low as 0.15%. This is extremely important for control of the drilling process. Furthermore, the measurement of drill wear was performed under different drilling conditions. This shows that ANFIS has the capability of generalization.

  8. Reflow process stabilization by chemical characteristics and process conditions

    NASA Astrophysics Data System (ADS)

    Kim, Myoung-Soo; Park, Jeong-Hyun; Kim, Hak-Joon; Kim, Il-Hyung; Jeon, Jae-Ha; Gil, Myung-Goon; Kim, Bong-Ho

    2002-07-01

    With device design rules shrinking below 130 nm, the patterning of smaller contact holes with sufficient process margin is required for mass production. Therefore, a shrinking technology using a thermal reflow process has been applied for smaller contact hole formation. In this paper, we have investigated the effects of chemical characteristics such as molecular weight, blocking ratio of the resin, cross-linker amount, and solvent type and composition on the reflow behaviour of the resist, and found the optimized chemical composition for a reflow-capable process. Several process conditions, such as resist coating thickness and a multi-step thermal reflow method, have also been evaluated to stabilize the pattern profile and improve CD uniformity after the reflow process. The experimental results confirmed that the cross-linker in the resist has a critical effect on reflow properties such as reflow temperature and reflow rate and controls the pattern profile during reflow processing; it also gave stable CD uniformity and improved resist properties for top loss, film shrinkage, and etch selectivity. Applying a lower resist coating thickness produced a symmetric pattern profile, even at the wafer edge, with a wider process margin, and introducing a two-step baking method for the reflow process also yielded uniform CD values. It is believed that applying a resist containing cross-linker together with optimized process conditions for smaller contact hole patterning is necessary for mass production with design rules below 130 nm.

  9. Process-in-Network: A Comprehensive Network Processing Approach

    PubMed Central

    Urzaiz, Gabriel; Villa, David; Villanueva, Felix; Lopez, Juan Carlos

    2012-01-01

    A solid and versatile communications platform is very important in modern Ambient Intelligence (AmI) applications, which usually require the transmission of large amounts of multimedia information over a highly heterogeneous network. This article focuses on the concept of Process-in-Network (PIN), which is defined as the possibility that the network processes information as it is being transmitted, and introduces a more comprehensive approach than current network processing technologies. PIN can take advantage of waiting times in queues of routers, idle processing capacity in intermediate nodes, and the information that passes through the network. PMID:22969390

  10. Statistical Process Control for KSC Processing

    NASA Technical Reports Server (NTRS)

    Ford, Roger G.; Delgado, Hector; Tilley, Randy

    1996-01-01

    The 1996 Summer Faculty Fellowship Program and Kennedy Space Center (KSC) served as the basis for a research effort into statistical process control for KSC processing. The effort entailed several tasks and goals. The first was to develop a customized statistical process control (SPC) course for the Safety and Mission Assurance Trends Analysis Group. The actual teaching of this course took place over several weeks. In addition, an Internet version of the same course, complete with animation and video excerpts from the course as taught at KSC, was developed. The application of SPC to shuttle processing took up the rest of the summer research project. This effort entailed the evaluation of SPC use at KSC, both present and potential, due to the change in roles for NASA and the Single Flight Operations Contractor (SFOC). Individual consulting on SPC use was accomplished, as well as an evaluation of SPC software for future KSC use. Finally, the author's orientation to NASA changes, terminology, data formats, and new NASA task definitions will allow for future consultation when the needs arise.
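
    As a small companion to the SPC course work described in this record, the sketch below computes classical Shewhart X-bar and R chart limits from rational subgroups. The subgroup size, data, and chart constants (A2, D3, D4 for subgroups of five) are standard textbook choices, and the measurements are simulated rather than drawn from any KSC data.

        import numpy as np

        # Control-chart constants for subgroups of size 5.
        A2, D3, D4 = 0.577, 0.0, 2.114

        rng = np.random.default_rng(7)
        subgroups = rng.normal(loc=10.0, scale=0.2, size=(25, 5))   # 25 subgroups of 5 measurements

        xbar = subgroups.mean(axis=1)
        ranges = subgroups.max(axis=1) - subgroups.min(axis=1)
        xbar_bar, r_bar = xbar.mean(), ranges.mean()

        # X-bar chart and R chart limits.
        xbar_limits = (xbar_bar - A2 * r_bar, xbar_bar + A2 * r_bar)
        r_limits = (D3 * r_bar, D4 * r_bar)

        print(f"X-bar chart: centre {xbar_bar:.3f}, limits {xbar_limits}")
        print(f"R chart:     centre {r_bar:.3f}, limits {r_limits}")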

  11. GOCE User Toolbox and Tutorial

    NASA Astrophysics Data System (ADS)

    Knudsen, P.; Benveniste, J.

    2011-07-01

    The GOCE User Toolbox GUT is a compilation of tools for the utilisation and analysis of GOCE Level 2 products. GUT supports applications in Geodesy, Oceanography and Solid Earth Physics. The GUT Tutorial provides information and guidance on how to use the toolbox for a variety of applications. GUT consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux Workstations, and Mac. The toolbox is supported by The GUT Algorithm Description and User Guide and The GUT Install Guide. A set of a-priori data and models are made available as well. GUT has been developed in a collaboration within the GUT Core Group. The GUT Core Group: S. Dinardo, D. Serpe, B.M. Lucas, R. Floberghagen, A. Horvath (ESA), O. Andersen, M. Herceg (DTU), M.-H. Rio, S. Mulet, G. Larnicol (CLS), J. Johannessen, L.Bertino (NERSC), H. Snaith, P. Challenor (NOC), K. Haines, D. Bretherton (NCEO), C. Hughes (POL), R.J. Bingham (NU), G. Balmino, S. Niemeijer, I. Price, L. Cornejo (S&T), M. Diament, I Panet (IPGP), C.C. Tscherning (KU), D. Stammer, F. Siegismund (UH), T. Gruber (TUM),

  12. New Insights on co-seismic landslide clustering

    NASA Astrophysics Data System (ADS)

    Meunier, Patrick; Marc, Odin; Hovius, Niels

    2015-04-01

    Earthquake-triggered landslides tend to cluster along topographic crests while rainfall-induced landslides should occur downslope preferentially, where pore pressure induced by groundwater flows is the highest [1]. Past studies on landslide clustering are all based on the analysis of complete datasets or subdatasets of landslides associated with a given event (seismic or climatic) as a whole. In this work, we document the spatial and temporal variations of the landslide position (on hillslopes) within the epicentral area of the 1994 Northridge, the 1999 Chichi, the 2004 Niigata, the 2008 Iwate and the 2008 Wenchuan earthquakes. We show that crest clustering is not systematic, is non-uniform in space, and exhibits patterns that vary considerably from one case to another. These patterns are not easy to interpret as they don't seem to be controlled by a single governing parameter but result from a complex interaction between local (hillslope length and gradient, lithology) and seismic (distance to source, slope aspect, radiation pattern, coseismic uplift) parameters. [1] Meunier, P., Hovius, N., & Haines, J. A. (2008). Topographic site effects and the location of earthquake induced landslides. Earth and Planetary Science Letters, 275(3), 221-232

  13. Into the complexity of coseismic landslide clustering

    NASA Astrophysics Data System (ADS)

    Meunier, Patrick; Marc, Odin; Uchida, Taro; Hovius, Niels

    2014-05-01

    Earthquake-triggered landslides tend to cluster along topographic crests while rainfall-induced landslides are more uniformly distributed on hillslopes [1]. In theory, rainfall-induced landslides should even occur downslope preferentially, where pore pressure induced by groundwater flows is the highest. Past studies on landslide clustering are all based on the analysis of complete datasets or subdatasets of landslides associated with a given event (seismic or climatic) as a whole. In this work, we document the spatial variation of the landslide position (on hillslopes) within the epicentral area for the cases of the 1999 Chichi, the 2004 Niigata and the 2008 Iwate earthquakes. We show that landslide clustering is not uniform in space and exhibits patterns that vary considerably from one case to another. These patterns are not easy to interpret as they don't seem to be controlled by a single governing parameter but result from a complex interaction between local (hillslope length and gradient, lithology) and seismic (distance to source, slope aspect, radiation pattern, coseismic uplift) parameters. [1] Meunier, P., Hovius, N., & Haines, J. A. (2008). Topographic site effects and the location of earthquake induced landslides. Earth and Planetary Science Letters, 275(3), 221-232.

  14. Temperature Data Assimilation with Salinity Corrections: Validation for the NSIPP Ocean Data Assimilation System in the Tropical Pacific Ocean, 1993-1998

    NASA Technical Reports Server (NTRS)

    Troccoli, Alberto; Rienecker, Michele M.; Keppenne, Christian L.; Johnson, Gregory C.

    2003-01-01

    The NASA Seasonal-to-Interannual Prediction Project (NSIPP) has developed an ocean data assimilation system to initialize the quasi-isopycnal ocean model used in our experimental coupled-model forecast system. Initial tests of the system have focused on the assimilation of temperature profiles in an optimal interpolation framework. It is now recognized that correction of temperature only often introduces spurious water masses. The resulting density distribution can be statically unstable and also have a detrimental impact on the velocity distribution. Several simple schemes have been developed to try to correct these deficiencies. Here the salinity field is corrected by using a scheme which assumes that the temperature-salinity relationship of the model background is preserved during the assimilation. The scheme was first introduced for a z-level model by Troccoli and Haines (1999). A large set of subsurface observations of salinity and temperature is used to cross-validate two data assimilation experiments run for the 6-year period 1993-1998. In these two experiments only subsurface temperature observations are used, but in one case the salinity field is also updated whenever temperature observations are available.

  15. Detection of mycobacteria in aquarium fish in Slovenia by culture and molecular methods.

    PubMed

    Pate, M; Jencic, V; Zolnir-Dovc, M; Ocepek, M

    2005-04-06

    Thirty-five aquarium fish were investigated for the presence of mycobacteria by culture and molecular methods. The following species were examined: goldfish Carassius auratus auratus, guppy Poecilia reticulata, 4 three-spot gourami Trichogaster trichopterus, dwarf gourami Colisa lalia, Siamese fighting fish Betta splendens, freshwater angelfish Pterophyllum scalare, African cichlid fish Cichlidae spp., cichlid fish Microgeophagus altispinosus, cichlid fish Pseudotropheus lombardoi, blue streak hap Labidochromis caeruleus, sterlet Acipenser ruthenus, southern platyfish Xiphophorus maculatus, and catfish Corydoras spp. Isolates of mycobacteria were obtained in 29 cases (82.9%). Two specimens were positive using Ziehl-Neelsen (ZN) staining, but the cultivation failed. Four specimens were both ZN- and culture-negative. On the basis of GenoType Mycobacterium assay (Hain Life-science) and restriction enzyme analysis of the amplified products (PCR-RFLP), 23 isolates (79.3%) were identified: 7 as Mycobacterium fortuitum, 6 as M. gordonae, 6 as M. marinum, 3 as M. chelonae, and 1 as M. peregrinum. Five isolates remained unidentified (Mycobacterium spp.). One case probably represented a mixed infection (M. marinum/M. fortuitum). Since M. marinum infections are also detected in humans, the significance of mycobacteria in aquarium fish should not be overlooked.

  16. Heat flux viscosity in collisional magnetized plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, C., E-mail: cliu@pppl.gov; Fox, W.; Bhattacharjee, A.

    2015-05-15

    Momentum transport in collisional magnetized plasmas due to gradients in the heat flux, a “heat flux viscosity,” is demonstrated. Even though no net particle flux is associated with a heat flux, in a plasma there can still be momentum transport owing to the velocity dependence of the Coulomb collision frequency, analogous to the thermal force. This heat-flux viscosity may play an important role in numerous plasma environments, in particular, in strongly driven high-energy-density plasma, where strong heat flux can dominate over ordinary plasma flows. The heat flux viscosity can influence the dynamics of the magnetic field in plasmas through the generalized Ohm's law and may therefore play an important role as a dissipation mechanism allowing magnetic field line reconnection. The heat flux viscosity is calculated directly using the finite-difference method of Epperlein and Haines [Phys. Fluids 29, 1029 (1986)], which is shown to be more accurate than Braginskii's method [S. I. Braginskii, Rev. Plasma Phys. 1, 205 (1965)], and confirmed with one-dimensional collisional particle-in-cell simulations. The resulting transport coefficients are tabulated for ease of application.

  17. Logistics Process Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2008-03-31

    LPAT is the integrated system resulting from the ANL-developed Enhanced Logistics Intra Theater Support Tool (ELIST), sponsored by SDDC-TEA, and the Fort Future Virtual Installation Tool, sponsored by CERL. The Fort Future Simulation Engine was an application written in the ANL Repast Simphony framework and used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program.

  18. Processing module operating methods, processing modules, and communications systems

    DOEpatents

    McCown, Steven Harvey; Derr, Kurt W.; Moore, Troy

    2014-09-09

    A processing module operating method includes using a processing module physically connected to a wireless communications device, requesting that the wireless communications device retrieve encrypted code from a web site and receiving the encrypted code from the wireless communications device. The wireless communications device is unable to decrypt the encrypted code. The method further includes using the processing module, decrypting the encrypted code, executing the decrypted code, and preventing the wireless communications device from accessing the decrypted code. Another processing module operating method includes using a processing module physically connected to a host device, executing an application within the processing module, allowing the application to exchange user interaction data communicated using a user interface of the host device with the host device, and allowing the application to use the host device as a communications device for exchanging information with a remote device distinct from the host device.

  19. The Research Process on Converter Steelmaking Process by Using Limestone

    NASA Astrophysics Data System (ADS)

    Tang, Biao; Li, Xing-yi; Cheng, Han-chi; Wang, Jing; Zhang, Yun-long

    2017-08-01

    Compared with the traditional converter steelmaking process, the steelmaking process with limestone uses limestone to partly replace lime. Many researchers have studied the new process, and there is much related research on material balance calculation, the behaviour of limestone in the slag, limestone powder injection in the converter, and the application of limestone in iron and steel enterprises. The results show that the surplus heat of the converter can meet the needs of limestone calcination, and that the new process can reduce energy losses across the whole steelmaking process, reduce carbon dioxide emissions, and improve the quality of the gas.

  20. Supercritical crystallization: The RESS-process and the GAS-process

    NASA Astrophysics Data System (ADS)

    Berends, Edwin M.

    1994-09-01

    This doctoral thesis describes the development of two novel crystallization processes utilizing supercritical fluids either as a solvent, the RESS-process, or as an anti-solvent, the GAS-process. In the RESS-process, precipitation of the solute is performed by expansion of the solution over a nozzle to produce ultra-fine, monodisperse particles without any solvent inclusions. In the GAS-process, a high-pressure gas is dissolved into the liquid-phase solvent, where it causes a volumetric expansion of this liquid solvent and lowers the equilibrium solubility. Particle size, particle size distribution and other particle characteristics such as their shape, internal structure and the residual amount of solvent in the particles are expected to be influenced by the liquid phase expansion profile.

  1. Membrane processes

    NASA Astrophysics Data System (ADS)

    Staszak, Katarzyna

    2017-11-01

    Membrane processes have played an important role in industrial separation. These technologies can be found in all industrial areas such as food, beverages, metallurgy, pulp and paper, textile, pharmaceutical, automotive, biotechnology and chemical industry, as well as in water treatment for domestic and industrial application. Although these processes have been known since the twentieth century, there are still many studies that focus on testing new membrane materials and determining conditions for optimal selectivity, i.e. the optimum transmembrane pressure (TMP) or permeate flux to minimize fouling. Moreover, researchers have proposed calculation methods to predict the properties of membrane processes. In this article, laboratory-scale experiments on membrane separation techniques, as well as their validation by calculation methods, are presented. Because the membrane is the "heart" of the process, experimental and computational methods for its characterization are also described.
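
    One of the simple calculation methods alluded to in this record is the resistance-in-series (Darcy) model of permeate flux, in which the flux equals the transmembrane pressure divided by the permeate viscosity times the sum of the membrane and fouling-layer resistances. The sketch below evaluates that relation for hypothetical ultrafiltration values; the numbers are illustrative only.

        def permeate_flux(tmp_pa, viscosity_pa_s, membrane_resistance, cake_resistance):
            """Resistance-in-series model: J = TMP / (mu * (Rm + Rc)), in m3/(m2*s)."""
            return tmp_pa / (viscosity_pa_s * (membrane_resistance + cake_resistance))

        # Hypothetical values: 1 bar TMP, water-like permeate at 20 C, typical UF resistances.
        tmp = 1.0e5            # transmembrane pressure, Pa
        mu = 1.0e-3            # permeate viscosity, Pa*s
        r_membrane = 2.0e12    # membrane resistance, 1/m
        r_cake = 1.0e12        # fouling (cake) layer resistance, 1/m

        flux = permeate_flux(tmp, mu, r_membrane, r_cake)
        print(f"permeate flux = {flux:.2e} m/s = {flux * 3.6e6:.0f} L/(m2*h)")

    In this toy model, raising TMP increases the flux linearly only while the cake resistance stays constant; in practice fouling grows with flux, which is one reason an optimum TMP of the kind mentioned in the record is sought.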

  2. Process Analytical Technology (PAT): batch-to-batch reproducibility of fermentation processes by robust process operational design and control.

    PubMed

    Gnoth, S; Jenzsch, M; Simutis, R; Lübbert, A

    2007-10-31

    The Process Analytical Technology (PAT) initiative of the FDA is a reaction to the increasing discrepancy between current possibilities in process supervision and control of pharmaceutical production processes and their current application in industrial manufacturing processes. With rigid approval practices based on standard operational procedures, adaptations of production reactors towards the state of the art were more or less inhibited for many years. Now PAT paves the way for continuous process and product improvements through improved process supervision based on knowledge-based data analysis, "Quality-by-Design" concepts, and, finally, through feedback control. Examples of up-to-date implementations of this concept are presented. They are taken from one key group of processes in recombinant pharmaceutical protein manufacturing, the cultivations of genetically modified Escherichia coli bacteria.

  3. Gas processing handbook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1982-04-01

    Brief details are given of processes including: BGC-Lurgi slagging gasification, COGAS, Exxon catalytic coal gasification, FW-Stoic 2-stage, GI two stage, HYGAS, Koppers-Totzek, Lurgi pressure gasification, Saarberg-Otto, Shell, Texaco, U-Gas, W-D.IGI, Wellman-Galusha, Westinghouse, and Winkler coal gasification processes; the Rectisol process; the Catacarb and the Benfield processes for removing CO2, H2S and COS from gases produced by the partial oxidation of coal; the Selectamine DD, Selexol solvent, and Sulfinol gas cleaning processes; the sulphur-tolerant shift (SSK) process; and the Super-meth process for the production of high-Btu gas from synthesis gas.

  4. Development of functionally-oriented technological processes of electroerosive processing

    NASA Astrophysics Data System (ADS)

    Syanov, S. Yu

    2018-03-01

    The stages of developing functionally oriented technological processes of electroerosive processing are described, from identifying the surfaces of parts and their service functions to determining the parameters of the electroerosion process that will provide not only the quality parameters of the surface layer but also the required operational properties.

  5. Emotional Processing, Interaction Process, and Outcome in Clarification-Oriented Psychotherapy for Personality Disorders: A Process-Outcome Analysis.

    PubMed

    Kramer, Ueli; Pascual-Leone, Antonio; Rohde, Kristina B; Sachse, Rainer

    2016-06-01

    It is important to understand the change processes involved in psychotherapies for patients with personality disorders (PDs). One patient process that promises to be useful in relation to the outcome of psychotherapy is emotional processing. In the present process-outcome analysis, we examine this question by using a sequential model of emotional processing and by additionally taking into account a therapist's appropriate responsiveness to a patient's presentation in clarification-oriented psychotherapy (COP), a humanistic-experiential form of therapy. The present study involved 39 patients with a range of PDs undergoing COP. Session 25 was assessed as part of the working phase of each therapy by external raters in terms of emotional processing using the Classification of Affective-Meaning States (CAMS) and in terms of the overall quality of therapist-patient interaction using the Process-Content-Relationship Scale (BIBS). Treatment outcome was assessed pre- and post-therapy using the Global Severity Index (GSI) of the SCL-90-R and the BDI. Results indicate that the good outcome cases showed more self-compassion, more rejecting anger, and a higher quality of therapist-patient interaction compared to poorer outcome cases. For good outcome cases, emotional processing predicted 18% of symptom change at the end of treatment, which was not found for poor outcome cases. These results are discussed within the framework of an integrative understanding of emotional processing as an underlying mechanism of change in COP, and perhaps in other effective therapy approaches for PDs.

  6. Process Materialization Using Templates and Rules to Design Flexible Process Models

    NASA Astrophysics Data System (ADS)

    Kumar, Akhil; Yao, Wen

    The main idea in this paper is to show how flexible processes can be designed by combining generic process templates and business rules. We instantiate a process by applying rules to specific case data, and running a materialization algorithm. The customized process instance is then executed in an existing workflow engine. We present an architecture and also give an algorithm for process materialization. The rules are written in a logic-based language like Prolog. Our focus is on capturing deeper process knowledge and achieving a holistic approach to robust process design that encompasses control flow, resources and data, as well as makes it easier to accommodate changes to business policy.
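
    The materialization step described above can be illustrated with a very small rule-driven sketch: a generic process template lists ordered steps, some guarded by rules over the case data, and materialization keeps only the steps whose rules fire. The template, rules, and case data below are invented for illustration, and the original work expresses its rules in a logic-based language such as Prolog rather than in Python.

        # Generic process template: ordered steps; optional steps carry an activation rule.
        template = [
            ("receive_order", None),
            ("credit_check", lambda case: case["amount"] > 1000),         # only for large orders
            ("manual_approval", lambda case: case["customer"] == "new"),  # only for new customers
            ("ship_goods", None),
            ("send_invoice", None),
        ]

        def materialize(template, case):
            """Apply the rules to the case data and return the customized process instance."""
            return [step for step, rule in template if rule is None or rule(case)]

        case = {"amount": 2500, "customer": "existing"}
        print(materialize(template, case))
        # ['receive_order', 'credit_check', 'ship_goods', 'send_invoice']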

  7. Depth-of-processing effects on priming in stem completion: tests of the voluntary-contamination, conceptual-processing, and lexical-processing hypotheses.

    PubMed

    Richardson-Klavehn, A; Gardiner, J M

    1998-05-01

    Depth-of-processing effects on incidental perceptual memory tests could reflect (a) contamination by voluntary retrieval, (b) sensitivity of involuntary retrieval to prior conceptual processing, or (c) a deficit in lexical processing during graphemic study tasks that affects involuntary retrieval. The authors devised an extension of incidental test methodology--making conjunctive predictions about response times as well as response proportions--to discriminate among these alternatives. They used graphemic, phonemic, and semantic study tasks, and a word-stem completion test with incidental, intentional, and inclusion instructions. Semantic study processing was superior to phonemic study processing in the intentional and inclusion tests, but semantic and phonemic study processing produced equal priming in the incidental test, showing that priming was uncontaminated by voluntary retrieval--a conclusion reinforced by the response-time data--and that priming was insensitive to prior conceptual processing. The incidental test nevertheless showed a priming deficit following graphemic study processing, supporting the lexical-processing hypothesis. Adding a lexical decision to the 3 study tasks eliminated the priming deficit following graphemic study processing, but did not influence priming following phonemic and semantic processing. The results provide the first clear evidence that depth-of-processing effects on perceptual priming can reflect lexical processes, rather than voluntary contamination or conceptual processes.

  8. Assessment of Process Capability: the case of Soft Drinks Processing Unit

    NASA Astrophysics Data System (ADS)

    Sri Yogi, Kottala

    2018-03-01

    Process capability studies have a significant impact in investigating process variation, which is important in achieving product quality characteristics. Capability indices measure the inherent variability of a process and thus help to improve process performance radically. The main objective of this paper is to understand the capability of the process within the specifications of a soft drinks processing unit, one of the premier brands marketed in India. A few selected critical parameters in soft drinks processing (concentration of gas volume, concentration of brix, and torque of the crock) have been considered for this study. Some relevant statistical parameters were assessed from a process capability indices perspective: short-term capability and long-term capability. For the assessment, real-time data from a soft drinks bottling company located in the state of Chhattisgarh, India, were used. The results suggest reasons for variation in the process, which were validated using ANOVA; the Taguchi cost function was also applied and the associated waste estimated in monetary terms, which the organization can use to improve process parameters. This research work has substantially benefitted the organization in understanding the variation of selected critical parameters in pursuit of zero rejection.

  9. Integrated stationary Ornstein-Uhlenbeck process, and double integral processes

    NASA Astrophysics Data System (ADS)

    Abundo, Mario; Pirozzi, Enrica

    2018-03-01

    We find a representation of the integral of the stationary Ornstein-Uhlenbeck (ISOU) process in terms of Brownian motion B_t; moreover, we show that, under certain conditions on the functions f and g, the double integral process (DIP) D(t) = ∫_β^t g(s) (∫_α^s f(u) dB_u) ds can be thought of as the integral of a suitable Gauss-Markov process. Some theoretical and application details are given; among them, we provide a simulation formula based on that representation by which sample paths, probability densities and first-passage times of the ISOU process are obtained. The first-passage times of the DIP are also studied.
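
    For readers who want to experiment with these processes numerically, the sketch below simulates a stationary Ornstein-Uhlenbeck path with an Euler-Maruyama scheme and accumulates its time integral, i.e. an ISOU sample path. The drift, diffusion, and step-size values are arbitrary, and the Brownian-motion representation and first-passage-time results of the record are not reproduced here.

        import numpy as np

        rng = np.random.default_rng(0)

        # Ornstein-Uhlenbeck dynamics (arbitrary parameters): dX = -theta * X dt + sigma dB.
        theta, sigma, dt, n_steps = 1.0, 0.5, 0.01, 5000

        x = np.empty(n_steps)
        x[0] = rng.normal(0.0, sigma / np.sqrt(2.0 * theta))   # start in the stationary distribution
        for i in range(1, n_steps):
            x[i] = x[i - 1] - theta * x[i - 1] * dt + sigma * np.sqrt(dt) * rng.standard_normal()

        # Integrated stationary OU process Y(t) = integral of X(s) ds, via the trapezoid rule.
        y = np.concatenate(([0.0], np.cumsum(0.5 * (x[1:] + x[:-1]) * dt)))

        print(f"sample variance of X: {x.var():.3f} (theory: {sigma**2 / (2 * theta):.3f})")
        print(f"Y at the final time:  {y[-1]:.3f}")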

  10. Neurological Evidence Linguistic Processes Precede Perceptual Simulation in Conceptual Processing

    PubMed Central

    Louwerse, Max; Hutchinson, Sterling

    2012-01-01

    There is increasing evidence from response time experiments that language statistics and perceptual simulations both play a role in conceptual processing. In an EEG experiment we compared neural activity in cortical regions commonly associated with linguistic processing and visual perceptual processing to determine to what extent symbolic and embodied accounts of cognition applied. Participants were asked to determine the semantic relationship of word pairs (e.g., sky – ground) or to determine their iconic relationship (i.e., if the presentation of the pair matched their expected physical relationship). A linguistic bias was found toward the semantic judgment task and a perceptual bias was found toward the iconicity judgment task. More importantly, conceptual processing involved activation in brain regions associated with both linguistic and perceptual processes. When comparing the relative activation of linguistic cortical regions with perceptual cortical regions, the effect sizes for linguistic cortical regions were larger than those for the perceptual cortical regions early in a trial with the reverse being true later in a trial. These results map upon findings from other experimental literature and provide further evidence that processing of concept words relies both on language statistics and on perceptual simulations, whereby linguistic processes precede perceptual simulation processes. PMID:23133427

  11. Neurological evidence linguistic processes precede perceptual simulation in conceptual processing.

    PubMed

    Louwerse, Max; Hutchinson, Sterling

    2012-01-01

    There is increasing evidence from response time experiments that language statistics and perceptual simulations both play a role in conceptual processing. In an EEG experiment we compared neural activity in cortical regions commonly associated with linguistic processing and visual perceptual processing to determine to what extent symbolic and embodied accounts of cognition applied. Participants were asked to determine the semantic relationship of word pairs (e.g., sky - ground) or to determine their iconic relationship (i.e., if the presentation of the pair matched their expected physical relationship). A linguistic bias was found toward the semantic judgment task and a perceptual bias was found toward the iconicity judgment task. More importantly, conceptual processing involved activation in brain regions associated with both linguistic and perceptual processes. When comparing the relative activation of linguistic cortical regions with perceptual cortical regions, the effect sizes for linguistic cortical regions were larger than those for the perceptual cortical regions early in a trial with the reverse being true later in a trial. These results map upon findings from other experimental literature and provide further evidence that processing of concept words relies both on language statistics and on perceptual simulations, whereby linguistic processes precede perceptual simulation processes.

  12. Marshaling and Acquiring Resources for the Process Improvement Process

    DTIC Science & Technology

    1993-06-01

    stakeholders. (Geber, 1990) D. IDENTIFYING SUPPLIERS Suppliers are just as crucial to setting requirements for processes as are customers. Although...output (Geber, 1990, p. 32). Before gathering resources for process improvement, the functional manager must ensure that the relationship of internal...him patent information and clerical people process his applications. (Geber, 1990, pp. 29-34) To get the full benefit of a white-collar worker as a

  13. Extensible packet processing architecture

    DOEpatents

    Robertson, Perry J.; Hamlet, Jason R.; Pierson, Lyndon G.; Olsberg, Ronald R.; Chun, Guy D.

    2013-08-20

    A technique for distributed packet processing includes sequentially passing packets associated with packet flows between a plurality of processing engines along a flow through data bus linking the plurality of processing engines in series. At least one packet within a given packet flow is marked by a given processing engine to signify by the given processing engine to the other processing engines that the given processing engine has claimed the given packet flow for processing. A processing function is applied to each of the packet flows within the processing engines and the processed packets are output on a time-shared, arbitered data bus coupled to the plurality of processing engines.
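
    As a rough illustration of the flow-claiming idea described in this abstract, the sketch below models processing engines linked in series that mark the first packet of any flow they claim so that downstream engines skip it. This is a hypothetical Python sketch; the names (Packet, Engine, claimed_by) and the trivial processing function are assumptions, not details from the patent.

      # Hypothetical sketch of flow claiming on a serial processing bus.
      # Class and field names are illustrative, not from the patent.
      from dataclasses import dataclass
      from typing import Optional

      @dataclass
      class Packet:
          flow_id: int
          payload: bytes
          claimed_by: Optional[int] = None   # id of the engine that claimed this flow

      class Engine:
          def __init__(self, engine_id, wants_flow):
              self.engine_id = engine_id
              self.wants_flow = wants_flow    # predicate deciding whether to claim a flow
              self.claimed = set()

          def handle(self, pkt):
              # Skip flows already claimed by another engine upstream on the bus.
              if pkt.claimed_by is not None and pkt.claimed_by != self.engine_id:
                  return pkt
              if pkt.flow_id in self.claimed or self.wants_flow(pkt.flow_id):
                  self.claimed.add(pkt.flow_id)
                  pkt.claimed_by = self.engine_id       # mark the packet to signal the claim
                  pkt.payload = pkt.payload.upper()     # stand-in for the real processing function
              return pkt

      # Packets pass sequentially through engines linked in series.
      engines = [Engine(0, lambda f: f % 2 == 0), Engine(1, lambda f: f % 2 == 1)]
      for pkt in [Packet(i % 3, b"data") for i in range(6)]:
          for eng in engines:
              pkt = eng.handle(pkt)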

  14. Shuttle Processing

    NASA Technical Reports Server (NTRS)

    Guodace, Kimberly A.

    2010-01-01

    This slide presentation details the shuttle processing flow, which starts with wheel stop and ends with launch. After landing, the orbiter is rolled into the Orbiter Processing Facility (OPF), where processing is performed; it is then rolled over to the Vehicle Assembly Building (VAB), where it is mated with the propellant tanks and payloads are installed. A different flow is detailed if the weather at Kennedy Space Center requires a landing at Dryden.

  15. Adaptive-optics optical coherence tomography processing using a graphics processing unit.

    PubMed

    Shafer, Brandon A; Kriske, Jeffery E; Kocaoglu, Omer P; Turner, Timothy L; Liu, Zhuolin; Lee, John Jaehwan; Miller, Donald T

    2014-01-01

    Graphics processing units are increasingly being used for scientific computing because of their powerful parallel processing abilities and moderate price compared to supercomputers and computing grids. In this paper we have used a general-purpose graphics processing unit to process adaptive-optics optical coherence tomography (AOOCT) images in real time. Increasing the processing speed of AOOCT is an essential step in moving this super-high-resolution technology closer to clinical viability.

  16. Determination of discharge during pulsating flow

    USGS Publications Warehouse

    Thompson, T.H.

    1968-01-01

    Pulsating flow in an open channel is a manifestation of unstable-flow conditions in which a series of translatory waves of perceptible magnitude develops and moves rapidly downstream. Pulsating flow is a matter of concern in the design and operation of steep-gradient channels. If it should occur at high stages in a channel designed for stable flow, the capacity of the channel may be inadequate at a discharge that is much smaller than that for which the channel was designed. If the overriding translatory wave carries an appreciable part of the total flow, conventional stream-gaging procedures cannot be used to determine the discharge; neither the conventional instrumentation nor conventional methodology is adequate. A method of determining the discharge during pulsating flow was tested in the Santa Anita Wash flood control channel in Arcadia, Calif., April 16, 1965. Observations of the dimensions and velocities of translatory waves were made during a period of controlled reservoir releases of about 100, 200, and 300 cfs (cubic feet per second). The method of computing discharge was based on (1) computation of the discharge in the overriding waves and (2) computation of the discharge in the shallow-depth, or overrun, part of the flow. Satisfactory results were obtained by this method. However, the procedure used (separating the flow into two components and then treating the shallow-depth component as though it were steady) has no theoretical basis. It is simply an expedient for use until laboratory investigation can provide a satisfactory analytical solution to the problem of computing discharge during pulsating flow. Sixteen months prior to the test in Santa Anita Wash, a robot camera had been designed and programmed to obtain the data needed to compute discharge by the method described above. The photographic equipment had been installed in Haines Creek flood control channel in Los Angeles, Calif., but it had not been completely tested because of the infrequency of
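
    The two-component computation described in this abstract can be illustrated with a small worked sketch: the discharge carried by the wave train plus the discharge of the shallow overrun flow treated as steady. The numbers and the rectangular-channel simplification below are assumed for illustration and are not data from the Santa Anita Wash test.

      # Illustrative two-component discharge estimate for pulsating flow.
      # All numbers are assumptions, not measurements from the 1965 test.

      wave_volume = 120.0      # ft^3, average volume carried by one translatory wave
      wave_period = 8.0        # s, average time between successive waves
      q_waves = wave_volume / wave_period          # cfs carried by the wave train

      width = 10.0             # ft, channel width (rectangular-channel simplification)
      overrun_depth = 0.4      # ft, depth of the shallow flow between waves
      overrun_velocity = 12.0  # ft/s, mean velocity of the overrun flow
      q_overrun = width * overrun_depth * overrun_velocity   # cfs, treated as steady

      q_total = q_waves + q_overrun
      print(f"estimated discharge: {q_total:.0f} cfs")   # about 63 cfs with these assumed values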

  17. Processing approaches to cognition: the impetus from the levels-of-processing framework.

    PubMed

    Roediger, Henry L; Gallo, David A; Geraci, Lisa

    2002-01-01

    Processing approaches to cognition have a long history, from act psychology to the present, but perhaps their greatest boost was given by the success and dominance of the levels-of-processing framework. We review the history of processing approaches, and explore the influence of the levels-of-processing approach, the procedural approach advocated by Paul Kolers, and the transfer-appropriate processing framework. Processing approaches emphasise the procedures of mind and the idea that memory storage can be usefully conceptualised as residing in the same neural units that originally processed information at the time of encoding. Processing approaches emphasise the unity and interrelatedness of cognitive processes and maintain that they can be dissected into separate faculties only by neglecting the richness of mental life. We end by pointing to future directions for processing approaches.

  18. Nonthermal processing technologies as food safety intervention processes

    USDA-ARS?s Scientific Manuscript database

    Foods should provide sensorial satisfaction and nutrition to people. Yet, foodborne pathogens cause significant illness and loss of life to humankind every year. A processing intervention step may be necessary prior to consumption to ensure the safety of foods. Nonthermal processing technologi...

  19. Batch Statistical Process Monitoring Approach to a Cocrystallization Process.

    PubMed

    Sarraguça, Mafalda C; Ribeiro, Paulo R S; Dos Santos, Adenilson O; Lopes, João A

    2015-12-01

    Cocrystals are defined as crystalline structures composed of two or more compounds that are solid at room temperature held together by noncovalent bonds. Their main advantages are the increase of solubility, bioavailability, permeability, stability, and at the same time retaining active pharmaceutical ingredient bioactivity. The cocrystallization between furosemide and nicotinamide by solvent evaporation was monitored on-line using near-infrared spectroscopy (NIRS) as a process analytical technology tool. The near-infrared spectra were analyzed using principal component analysis. Batch statistical process monitoring was used to create control charts to perceive the process trajectory and define control limits. Normal and non-normal operating condition batches were performed and monitored with NIRS. The use of NIRS associated with batch statistical process models allowed the detection of abnormal variations in critical process parameters, like the amount of solvent or amount of initial components present in the cocrystallization. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
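
    A minimal sketch of the monitoring approach described above, assuming the principal component scores are summarized with a Hotelling T-squared statistic and an empirical control limit derived from normal operating condition (NOC) batches. The data shapes, the percentile limit, and the simulated spectra are illustrative assumptions, not the authors' NIRS model.

      # Hedged sketch: PCA-based control chart (Hotelling T^2) for batch spectra.
      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(0)
      noc = rng.normal(size=(30, 200))          # 30 normal batches x 200 spectral variables (simulated)
      pca = PCA(n_components=3).fit(noc)

      def hotelling_t2(x):
          scores = pca.transform(x.reshape(1, -1))
          return float(np.sum(scores**2 / pca.explained_variance_))

      # Empirical control limit from the NOC batches (here the 99th percentile).
      limit = np.percentile([hotelling_t2(b) for b in noc], 99)

      new_batch = rng.normal(loc=0.5, size=200)  # example new batch to check against the limit
      t2_new = hotelling_t2(new_batch)
      print(f"T2 = {t2_new:.1f}, limit = {limit:.1f}",
            "-> out of control" if t2_new > limit else "-> in control")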

  20. Process of discharging charge-build up in slag steelmaking processes

    DOEpatents

    Pal, Uday B.; Gazula, Gopala K. M.; Hasham, Ali

    1994-01-01

    A process and apparatus for improving metal production in ironmaking and steelmaking processes is disclosed. The use of an inert metallic conductor in the slag-containing crucible and the addition of a transition metal oxide to the slag are the disclosed process improvements.

  1. The process audit.

    PubMed

    Hammer, Michael

    2007-04-01

    Few executives question the idea that by redesigning business processes--work that runs from end to end across an enterprise--they can achieve extraordinary improvements in cost, quality, speed, profitability, and other key areas. Yet in spite of their intentions and investments, many executives flounder, unsure about what exactly needs to be changed, by how much, and when. As a result, many organizations make little progress--if any at all--in their attempts to transform business processes. Michael Hammer has spent the past five years working with a group of leading companies to develop the Process and Enterprise Maturity Model (PEMM), a new framework that helps executives comprehend, formulate, and assess process-based transformation efforts. He has identified two distinct groups of characteristics that are needed for business processes to perform exceptionally well over a long period of time. Process enablers, which affect individual processes, determine how well a process is able to function. They are mutually interdependent--if any are missing, the others will be ineffective. However, enablers are not enough to develop high-performance processes; they only provide the potential to deliver high performance. A company must also possess or establish organizational capabilities that allow the business to offer a supportive environment. Together, the enablers and the capabilities provide an effective way for companies to plan and evaluate process-based transformations. PEMM is different from other frameworks, such as Capability Maturity Model Integration (CMMI), because it applies to all industries and all processes. The author describes how several companies--including Michelin, CSAA, Tetra Pak, Shell, Clorox, and Schneider National--have successfully used PEMM in various ways and at different stages to evaluate the progress of their process-based transformation efforts.

  2. Internal process: what is abstraction and distortion process?

    NASA Astrophysics Data System (ADS)

    Fiantika, F. R.; Budayasa, I. K.; Lukito, A.

    2018-03-01

    Geometry is one of the branches of mathematics that plays a major role in the development of science and technology. Thus, knowing geometry concepts is necessary for students from their early basic level of thinking. A preliminary study showed that elementary students have difficulty perceiving the parallelogram shape in a 2-dimensional drawing of a cube as representing a square face. This difficulty prevents the students from solving geometrical problems correctly. This problem is related to the internal thinking process in geometry. We explored students’ internal thinking processes in geometry, particularly in distinguishing the square and parallelogram shapes. How the students process their internal thinking through distortion and abstraction is the main aim of this study. Analysis of a geometrical test and in-depth interviews was used in this study to obtain the data. The result of this study is that there are two types each of distortion and abstraction that the students used in their internal thinking processes.

  3. Transforming Collaborative Process Models into Interface Process Models by Applying an MDA Approach

    NASA Astrophysics Data System (ADS)

    Lazarte, Ivanna M.; Chiotti, Omar; Villarreal, Pablo D.

    Collaborative business models among enterprises require defining collaborative business processes. Enterprises implement B2B collaborations to execute these processes. In B2B collaborations the integration and interoperability of processes and systems of the enterprises are required to support the execution of collaborative processes. From a collaborative process model, which describes the global view of the enterprise interactions, each enterprise must define the interface process that represents the role it performs in the collaborative process in order to implement the process in a Business Process Management System. Hence, in this work we propose a method for the automatic generation of the interface process model of each enterprise from a collaborative process model. This method is based on a Model-Driven Architecture to transform collaborative process models into interface process models. By applying this method, interface processes are guaranteed to be interoperable and defined according to a collaborative process.

  4. Process Monitoring Evaluation and Implementation for the Wood Abrasive Machining Process

    PubMed Central

    Saloni, Daniel E.; Lemaster, Richard L.; Jackson, Steven D.

    2010-01-01

    Wood processing industries have continuously developed and improved technologies and processes to transform wood to obtain better final product quality and thus increase profits. Abrasive machining is one of the most important of these processes and therefore merits special attention and study. The objective of this work was to evaluate and demonstrate a process monitoring system for use in the abrasive machining of wood and wood based products. The system developed increases the life of the belt by detecting (using process monitoring sensors) and removing (by cleaning) the abrasive loading during the machining process. This study focused on abrasive belt machining processes and included substantial background work, which provided a solid base for understanding the behavior of the abrasive, and the different ways that the abrasive machining process can be monitored. In addition, the background research showed that abrasive belts can effectively be cleaned by the appropriate cleaning technique. The process monitoring system developed included acoustic emission sensors which tended to be sensitive to belt wear, as well as platen vibration, but not loading, and optical sensors which were sensitive to abrasive loading. PMID:22163477

  5. 21 CFR 1271.220 - Processing and process controls.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Processing and process controls. 1271.220 Section 1271.220 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) REGULATIONS UNDER CERTAIN OTHER ACTS ADMINISTERED BY THE FOOD AND DRUG ADMINISTRATION HUMAN CELLS...

  6. Pre- and Post-Processing Tools to Streamline the CFD Process

    NASA Technical Reports Server (NTRS)

    Dorney, Suzanne Miller

    2002-01-01

    This viewgraph presentation provides information on software development tools to facilitate the use of CFD (Computational Fluid Dynamics) codes. The specific CFD codes FDNS and CORSAIR are profiled, and uses for software development tools with these codes during pre-processing, interim-processing, and post-processing are explained.

  7. Processing Depth, Elaboration of Encoding, Memory Stores, and Expended Processing Capacity.

    ERIC Educational Resources Information Center

    Eysenck, Michael W.; Eysenck, M. Christine

    1979-01-01

    The effects of several factors on expended processing capacity were measured. Expended processing capacity was greater when information was retrieved from secondary memory than from primary memory, when processing was of a deep, semantic nature than when it was shallow and physical, and when processing was more elaborate. (Author/GDC)

  8. Wave Gradiometry for the Central U.S

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Holt, W. E.

    2013-12-01

    Wave gradiometry is a new technique utilizing the shape of seismic wave fields captured by USArray transportable stations to determine fundamental wave propagation characteristics. The horizontal and vertical wave displacements, spatial gradients, and time derivatives of displacement are linearly linked by two coefficients which can be used to infer wave slowness, back azimuth, radiation pattern, and geometrical spreading. The reducing velocity method from Langston [2007] is applied to pre-process our data. Spatial gradients of the shifted displacement fields are estimated using bi-cubic splines [Beavan and Haines, 2001]. Using singular value decomposition, the spatial gradients are then inverted to iteratively solve for the wave parameters mentioned above. Numerical experiments with synthetic data sets provided by Princeton University's Near Real Time Global Seismicity Portal are conducted to test the algorithm stability and evaluate errors. Our results based on real records in the central U.S. show that the average Rayleigh wave phase velocity ranges from 3.8 to 4.2 km/s for periods from 60-125 s, and 3.6 to 4.0 km/s for periods from 25-60 s, which is consistent with reference Earth models. Geometrical spreading and radiation pattern show similar features between different frequency bands. Azimuth variations are partially correlated with phase velocity changes. Finally, we calculated waveform amplitude and spatial gradient uncertainties to determine formal errors in the estimated wave parameters. Further effort will be put into calculating shear wave velocity structure with respect to depth in the studied area. The wave gradiometry method is now being employed across the USArray using real observations, and results obtained to date are for stations in the eastern portion of the U.S. Rayleigh wave phase velocities were derived from the August 20, 2011 Vanuatu earthquake for periods from 100-125 s.
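
    The core gradiometry relation summarized above (spatial gradients linearly linked to the displacement and its time derivative by two coefficients) can be sketched on a synthetic plane wave, for which the slowness follows directly from the fitted coefficients. The sketch below is illustrative only; the values and variable names are assumptions, not the authors' processing code.

      # Illustrative least-squares recovery of gradiometry coefficients from a synthetic plane wave.
      import numpy as np

      dt, f, p = 0.05, 0.2, 0.25      # sample interval (s), frequency (Hz), slowness (s/km), assumed
      t = np.arange(0.0, 50.0, dt)
      u  = np.sin(2*np.pi*f*t)                    # displacement u(x=0, t) of a plane wave u = g(t - p*x)
      ux = -p * 2*np.pi*f*np.cos(2*np.pi*f*t)     # analytic spatial gradient at x=0: u_x = -p * u_t
      ut = np.gradient(u, dt)                     # numerical time derivative

      # Gradiometry relation: u_x = A*u + B*u_t; solve for A, B by least squares (SVD-based solver).
      G = np.column_stack([u, ut])
      (A, B), *_ = np.linalg.lstsq(G, ux, rcond=None)
      print("recovered slowness (s/km):", -B)     # for this plane wave B is about -p, so -B is about 0.25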

  9. Turbine blade processing

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Space processing of directionally solidified eutectic-alloy type turbine blades is envisioned as a simple remelt operation in which precast blades are remelted in a preformed mold. Process systems based on induction melting, continuous resistance furnaces, and batch resistance furnaces were evaluated. The batch resistance furnace type process using a multiblade mold is considered to offer the best possibility for turbine blade processing.

  10. Euglena Transcript Processing.

    PubMed

    McWatters, David C; Russell, Anthony G

    2017-01-01

    RNA transcript processing is an important stage in the gene expression pathway of all organisms and is subject to various mechanisms of control that influence the final levels of gene products. RNA processing involves events such as nuclease-mediated cleavage, removal of intervening sequences referred to as introns and modifications to RNA structure (nucleoside modification and editing). In Euglena, RNA transcript processing was initially examined in chloroplasts because of historical interest in the secondary endosymbiotic origin of this organelle in this organism. More recent efforts to examine mitochondrial genome structure and RNA maturation have been stimulated by the discovery of unusual processing pathways in other Euglenozoans such as kinetoplastids and diplonemids. Eukaryotes containing large genomes are now known to typically contain large collections of introns and regulatory RNAs involved in RNA processing events, and Euglena gracilis in particular has a relatively large genome for a protist. Studies examining the structure of nuclear genes and the mechanisms involved in nuclear RNA processing have revealed that indeed Euglena contains large numbers of introns in the limited set of genes so far examined and also possesses large numbers of specific classes of regulatory and processing RNAs, such as small nucleolar RNAs (snoRNAs). Most interestingly, these studies have also revealed that Euglena possesses novel processing pathways generating highly fragmented cytosolic ribosomal RNAs and subunits and non-conventional intron classes removed by unknown splicing mechanisms. This unexpected diversity in RNA processing pathways emphasizes the importance of identifying the components involved in these processing mechanisms and their evolutionary emergence in Euglena species.

  11. Post-processing of metal matrix composites by friction stir processing

    NASA Astrophysics Data System (ADS)

    Sharma, Vipin; Singla, Yogesh; Gupta, Yashpal; Raghuwanshi, Jitendra

    2018-05-01

    In metal matrix composites, non-uniform distribution of reinforcement particles has an adverse effect on the mechanical properties. It is of great interest to explore post-processing techniques that can eliminate particle distribution heterogeneity. Friction stir processing is a relatively new technique used for post-processing of metal matrix composites to improve homogeneity in particle distribution. In friction stir processing, the synergistic effect of stirring, extrusion, and forging results in refinement of grains, reduction of reinforcement particle size, uniformity in particle distribution, reduction in microstructural heterogeneity, and elimination of defects.

  12. The roles of a process development group in biopharmaceutical process startup.

    PubMed

    Goochee, Charles F

    2002-01-01

    The transfer of processes for biotherapeutic products into final manufacturing facilities was frequently problematic during the 1980s and early 1990s, resulting in costly delays to licensure (Pisano, 1997). While plant startups for this class of products can become chaotic affairs, this is not an inherent or intrinsic feature. Major classes of process startup problems have been identified and mechanisms have been developed to reduce their likelihood of occurrence. These classes of process startup problems and resolution mechanisms are the major topic of this article. With proper planning and sufficient staffing, the probability of a smooth process startup for a biopharmaceutical product can be very high - i.e., successful process performance will often be achieved within the first two full-scale process lots in the plant. The primary focus of this article is the role of the Process Development Group in helping to assure this high probability of success.

  13. Process-based costing.

    PubMed

    Lee, Robert H; Bott, Marjorie J; Forbes, Sarah; Redford, Linda; Swagerty, Daniel L; Taunton, Roma Lee

    2003-01-01

    Understanding how quality improvement affects costs is important. Unfortunately, low-cost, reliable ways of measuring direct costs are scarce. This article builds on the principles of process improvement to develop a costing strategy that meets both criteria. Process-based costing has 4 steps: developing a flowchart, estimating resource use, valuing resources, and calculating direct costs. To illustrate the technique, this article uses it to cost the care planning process in 3 long-term care facilities. We conclude that process-based costing is easy to implement; generates reliable, valid data; and allows nursing managers to assess the costs of new or modified processes.
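
    A minimal sketch of the four-step strategy described above (flowchart the process, estimate resource use, value the resources, calculate direct costs). The activities, times, and wage rates below are hypothetical stand-ins, not data from the three long-term care facilities.

      # Hypothetical process-based costing of a care-planning process.
      process_steps = [
          # (activity from the flowchart, minutes per resident, staff role)
          ("assessment interview",      45, "RN"),
          ("write care plan",           30, "RN"),
          ("interdisciplinary review",  20, "team"),
          ("family conference",         25, "social worker"),
      ]
      hourly_rate = {"RN": 38.0, "team": 55.0, "social worker": 30.0}  # $/hour, assumed

      direct_cost = sum(minutes / 60 * hourly_rate[role]
                        for _, minutes, role in process_steps)
      print(f"direct cost per resident care plan: ${direct_cost:.2f}")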

  14. Biomass process handbook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1983-01-01

    Descriptions are given of 42 processes which use biomass to produce chemical products. Marketing and economic background, process description, flow sheets, costs, major equipment, and availability of technology are given for each of the 42 processes. Some of the chemicals discussed are: ethanol, ethylene, acetaldehyde, butanol, butadiene, acetone, citric acid, gluconates, itaconic acid, lactic acid, xanthan gum, sorbitol, starch polymers, fatty acids, fatty alcohols, glycerol, soap, azelaic acid, pelargonic acid, nylon-11, jojoba oil, furfural, furfuryl alcohol, tetrahydrofuran, cellulose polymers, products from pulping wastes, and methane. Processes include acid hydrolysis, enzymatic hydrolysis, fermentation, distillation, Purox process, and anaerobic digestion.

  15. Laser Processing of Multilayered Thermal Spray Coatings: Optimal Processing Parameters

    NASA Astrophysics Data System (ADS)

    Tewolde, Mahder; Zhang, Tao; Lee, Hwasoo; Sampath, Sanjay; Hwang, David; Longtin, Jon

    2017-12-01

    Laser processing offers an innovative approach for the fabrication and transformation of a wide range of materials. As a rapid, non-contact, and precision material removal technology, lasers are natural tools to process thermal spray coatings. Recently, a thermoelectric generator (TEG) was fabricated using thermal spray and laser processing. The TEG device represents a multilayer, multimaterial functional thermal spray structure, with laser processing serving an essential role in its fabrication. Several unique challenges are presented when processing such multilayer coatings, and the focus of this work is on the selection of laser processing parameters for optimal feature quality and device performance. A parametric study is carried out using three short-pulse lasers, where laser power, repetition rate and processing speed are varied to determine the laser parameters that result in high-quality features. The resulting laser patterns are characterized using optical and scanning electron microscopy, energy-dispersive x-ray spectroscopy, and electrical isolation tests between patterned regions. The underlying laser interaction and material removal mechanisms that affect the feature quality are discussed. Feature quality was found to improve both by using a multiscanning approach and an optional assist gas of air or nitrogen. Electrically isolated regions were also patterned in a cylindrical test specimen.

  16. Application of statistical process control and process capability analysis procedures in orbiter processing activities at the Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.

    1994-01-01

    Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.
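
    The two techniques named in this abstract can be sketched with standard textbook formulas: 3-sigma control limits for individual measurements and the Cp/Cpk capability indices. The measurement values and specification limits below are assumptions for illustration, not KSC processing data.

      # Illustrative SPC control limits and process capability indices.
      import numpy as np

      x = np.array([10.1, 9.9, 10.2, 10.0, 9.8, 10.1, 10.3, 9.9, 10.0, 10.2])  # assumed measurements
      mean, sigma = x.mean(), x.std(ddof=1)

      ucl, lcl = mean + 3*sigma, mean - 3*sigma          # individuals control limits
      usl, lsl = 10.6, 9.4                               # assumed specification limits
      cp  = (usl - lsl) / (6*sigma)                      # potential capability
      cpk = min(usl - mean, mean - lsl) / (3*sigma)      # capability accounting for centering

      print(f"UCL={ucl:.2f}, LCL={lcl:.2f}, Cp={cp:.2f}, Cpk={cpk:.2f}")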

  17. Effect of processing parameters on FDM process

    NASA Astrophysics Data System (ADS)

    Chari, V. Srinivasa; Venkatesh, P. R.; Krupashankar, Dinesh, Veena

    2018-04-01

    This paper focuses on the process parameters of fused deposition modeling (FDM). Infill, resolution, and temperature are the process variables considered for the experimental studies. Compression strength, hardness, and microstructure are the outcome parameters; the experimental study is based on Taguchi's L9 orthogonal array. The Taguchi array is used to build the 9 different models and to evaluate the effect of the chosen parameters on the output. The material used for this experimental study is Polylactic Acid (PLA).
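
    A minimal sketch of the experimental design described above, assuming the standard L9(3^3) orthogonal array for the three factors and a larger-is-better signal-to-noise ratio for compressive strength. The factor levels shown are illustrative assumptions, not the authors' settings.

      # Illustrative Taguchi L9 layout and S/N ratio; factor levels are assumptions.
      import numpy as np

      # Standard L9(3^3) orthogonal array (factor level indices 0..2 for each run).
      L9 = [(0,0,0),(0,1,1),(0,2,2),
            (1,0,1),(1,1,2),(1,2,0),
            (2,0,2),(2,1,0),(2,2,1)]

      infill      = [20, 50, 80]        # %, assumed
      resolution  = [0.1, 0.2, 0.3]     # mm layer height, assumed
      temperature = [190, 200, 210]     # degC nozzle temperature, assumed

      def sn_larger_is_better(y):
          """Larger-is-better S/N ratio: -10*log10(mean(1/y^2))."""
          y = np.asarray(y, dtype=float)
          return -10*np.log10(np.mean(1.0/y**2))

      for run, (i, r, t) in enumerate(L9, start=1):
          print(run, infill[i], resolution[r], temperature[t])
      # After testing, compute sn_larger_is_better(strengths_for_each_run) and compare
      # mean S/N per factor level to rank the influence of each parameter.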

  18. Westinghouse modular grinding process - improvement for follow on processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fehrmann, Henning

    2013-07-01

    In nuclear power plants (NPP) ion exchange (IX) resins are used in several systems for water treatment. The resins can be in bead or powdered form. For waste treatment of spent IX resins, two methods are basically used: Direct immobilization (e.g. with cement, bitumen, polymer or High Integrity Container (HIC)); Thermal treatment (e.g. drying, oxidation or pyrolysis). Bead resins have some properties (e.g. particle size and density) that can have negative impacts on following waste treatment processes. Negative impacts could be: Flotation of bead resins in the cementation process; Sedimentation in pipelines during transportation; Poor compaction properties for Hot Resin Supercompaction (HRSC). Reducing the particle size of the bead resins can have beneficial effects enhancing further treatment processes and overcoming the previously mentioned effects. Westinghouse Electric Company has developed a modular grinding process to crush/grind the bead resins. This modular process is designed for flexible use and enables a selective adjustment of particle size to tailor the grinding system to the customer needs. The system can be equipped with a crusher integrated in the process tank and, if necessary, a colloid mill. The crusher reduces the bead resin particle size and converts the bead resins to a pumpable suspension with lower sedimentation properties. With the colloid mill the resins can be ground to a powder. Compared to existing grinding systems this equipment is designed to minimize radiation exposure of the worker during operation and maintenance. Using the crushed and/or ground bead resins has several beneficial effects like facilitating cementation process and recipe development, enhancing oxidation of resins, and improving the Hot Resin Supercompaction volume reduction performance. (authors)

  19. Materials processing in space, 1980 science planning document. [crystal growth, containerless processing, solidification, bioprocessing, and ultrahigh vacuum processes

    NASA Technical Reports Server (NTRS)

    Naumann, R. J.

    1980-01-01

    The scientific aspects of the Materials Processing in Space program are described with emphasis on the major categories of interest: (1) crystal growth; (2) solidification of metals, alloys, and composites; (3) fluids and chemical processes; (4) containerless processing, glasses, and refractories; (5) ultrahigh vacuum processes; and (6) bioprocessing. An index is provided for each of these areas. The possible contributions that materials science experiments in space can make to the various disciplines are summarized, and the necessity for performing experiments in space is justified. What has been learned from previous experiments relating to space processing, current investigations, and remaining issues that require resolution are discussed. Recommendations for the future direction of the program are included.

  20. Process simulation during the design process makes the difference: process simulations applied to a traditional design.

    PubMed

    Traversari, Roberto; Goedhart, Rien; Schraagen, Jan Maarten

    2013-01-01

    The objective is evaluation of a traditionally designed operating room using simulation of various surgical workflows. A literature search showed that there is no evidence for an optimal operating room layout regarding the position and size of an ultraclean ventilation (UCV) canopy with a separate preparation room for laying out instruments and in which patients are induced in the operating room itself. Neither was literature found reporting on process simulation being used for this application. Many technical guidelines and designs have mainly evolved over time, and there is no evidence on whether the proposed measures are also effective for the optimization of the layout for workflows. The study was conducted by applying observational techniques to simulated typical surgical procedures. Process simulations which included complete surgical teams and equipment required for the intervention were carried out for four typical interventions. Four observers used a form to record conflicts with the clean area boundaries and the height of the supply bridge. Preferences for particular layouts were discussed with the surgical team after each simulated procedure. We established that a clean area measuring 3 × 3 m and a supply bridge height of 2.05 m was satisfactory for most situations, provided a movable operation table is used. The only cases in which conflicts with the supply bridge were observed were during the use of a surgical robot (Da Vinci) and a surgical microscope. During multiple trauma interventions, bottlenecks regarding the dimensions of the clean area will probably arise. The process simulation of four typical interventions has led to significantly different operating room layouts than were arrived at through the traditional design process. Keywords: evidence-based design, human factors, work environment, operating room, traditional design, process simulation, surgical workflows. Preferred Citation: Traversari, R., Goedhart, R., & Schraagen, J. M. (2013). Process

  1. The standards process: X3 information processing systems

    NASA Technical Reports Server (NTRS)

    Emard, Jean-Paul

    1993-01-01

    The topics are presented in viewgraph form and include the following: International Organization for Standards (ISO); International Electrotechnical Committee (IEC); ISO/IEC Joint Technical Committee 1 (JTC-1); U.S. interface to JTC-1; ANSI; national organizations; U.S. standards development processes; national and international standards developing organizations; regional organizations; and X3 information processing systems.

  2. Kidney transplantation process in Brazil represented in business process modeling notation.

    PubMed

    Peres Penteado, A; Molina Cohrs, F; Diniz Hummel, A; Erbs, J; Maciel, R F; Feijó Ortolani, C L; de Aguiar Roza, B; Torres Pisa, I

    2015-05-01

    Kidney transplantation is considered to be the best treatment for people with chronic kidney failure, because it improves the patients' quality of life and increases their length of survival compared with patients undergoing dialysis. The kidney transplantation process in Brazil is defined through laws, decrees, ordinances, and resolutions, but there is no visual representation of this process. The aim of this study was to analyze official documents to construct a representation of the kidney transplantation process in Brazil with the use of business process modeling notation (BPMN). The methodology for this study was based on an exploratory observational study, document analysis, and construction of process diagrams with the use of BPMN. Two rounds of validations by specialists were conducted. The result includes a representation of the kidney transplantation process in Brazil with the use of BPMN. We analyzed 2 digital documents that resulted in 2 processes with a total of 45 activities and events, 6 organizations involved, and 6 different stages of the process. The constructed representation makes it easier to understand the rules for the business of kidney transplantation and can be used by the health care professionals involved in the various activities within this process. Construction of a representation with language appropriate for the Brazilian lay public is underway. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Situation awareness acquired from monitoring process plants - the Process Overview concept and measure.

    PubMed

    Lau, Nathan; Jamieson, Greg A; Skraaning, Gyrd

    2016-07-01

    We introduce Process Overview, a situation awareness characterisation of the knowledge derived from monitoring process plants. Process Overview is based on observational studies of process control work in the literature. The characterisation is applied to develop a query-based measure called the Process Overview Measure. The goal of the measure is to improve coupling between situation and awareness according to process plant properties and operator cognitive work. A companion article presents the empirical evaluation of the Process Overview Measure in a realistic process control setting. The Process Overview Measure demonstrated sensitivity and validity by revealing significant effects of experimental manipulations that corroborated with other empirical results. The measure also demonstrated adequate inter-rater reliability and practicality for measuring SA based on data collected by process experts. Practitioner Summary: The Process Overview Measure is a query-based measure for assessing operator situation awareness from monitoring process plants in representative settings.

  4. Evidence of automatic processing in sequence learning using process-dissociation

    PubMed Central

    Mong, Heather M.; McCabe, David P.; Clegg, Benjamin A.

    2012-01-01

    This paper proposes a way to apply process dissociation to sequence learning, as an addition and extension to the approach used by Destrebecqz and Cleeremans (2001). Participants were trained on two sequences separated from each other by a short break. Following training, participants self-reported their knowledge of the sequences. A recognition test was then performed which required discrimination of two trained sequences, either under the instructions to call any sequence encountered in the experiment “old” (the inclusion condition), or only sequence fragments from one half of the experiment “old” (the exclusion condition). The recognition test elicited automatic and controlled process estimates using the process dissociation procedure, and suggested both processes were involved. Examining the underlying processes supporting performance may provide more information on the fundamental aspects of the implicit and explicit constructs than has been attainable through awareness testing. PMID:22679465
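
    For reference, the standard process-dissociation estimates (the Jacoby formulation commonly used with inclusion/exclusion tests; the exact computation in this study may differ in detail) relate observed "old" responses to controlled (C) and automatic (A) influences:

      P(\text{old} \mid \text{inclusion}) = C + A(1 - C)
      P(\text{old} \mid \text{exclusion}) = A(1 - C)
      \Rightarrow\quad C = P_{\text{inclusion}} - P_{\text{exclusion}}, \qquad A = \frac{P_{\text{exclusion}}}{1 - C}

    For example, inclusion and exclusion rates of .70 and .30 would give C = .40 and A = .50 under this formulation.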

  5. Information Technology Process Improvement Decision-Making: An Exploratory Study from the Perspective of Process Owners and Process Managers

    ERIC Educational Resources Information Center

    Lamp, Sandra A.

    2012-01-01

    There is information available in the literature that discusses information technology (IT) governance and investment decision making from an executive-level perception, yet there is little information available that offers the perspective of process owners and process managers pertaining to their role in IT process improvement and investment…

  6. Economic-Oriented Stochastic Optimization in Advanced Process Control of Chemical Processes

    PubMed Central

    Dobos, László; Király, András; Abonyi, János

    2012-01-01

    Finding the optimal operating region of chemical processes is an inevitable step toward improving economic performance. Usually the optimal operating region is situated close to process constraints related to product quality or process safety requirements. Higher profit can be realized only by assuring a relatively low frequency of violation of these constraints. A multilevel stochastic optimization framework is proposed to determine the optimal setpoint values of control loops with respect to predetermined risk levels, uncertainties, and costs of violation of process constraints. The proposed framework is realized as direct search-type optimization of Monte-Carlo simulation of the controlled process. The concept is illustrated throughout by a well-known benchmark problem related to the control of a linear dynamical system and the model predictive control of a more complex nonlinear polymerization process. PMID:23213298
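
    A toy sketch of the framework's core idea: a direct search over candidate setpoints in which each candidate is evaluated by Monte-Carlo simulation of the controlled variable and penalized by the estimated probability of constraint violation. The process model, noise level, and economic figures below are illustrative assumptions, not the benchmark problems from the paper.

      # Toy Monte-Carlo setpoint optimization under a quality/safety constraint.
      import numpy as np

      rng = np.random.default_rng(1)
      constraint = 100.0           # assumed upper limit on the controlled quality variable
      profit_per_unit = 2.0        # profit rises as the setpoint approaches the limit (assumed)
      violation_cost = 500.0       # assumed cost incurred per unit probability of violation

      def expected_profit(setpoint, n=20000):
          # Controlled variable fluctuates around the setpoint (control error + disturbance).
          realized = setpoint + rng.normal(scale=2.5, size=n)
          p_violate = np.mean(realized > constraint)
          return profit_per_unit*setpoint - violation_cost*p_violate, p_violate

      candidates = np.arange(90.0, 100.0, 0.5)
      best = max(candidates, key=lambda s: expected_profit(s)[0])
      print("best setpoint:", best, "estimated P(violation):", expected_profit(best)[1])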

  7. Pre-processing and post-processing in group-cluster mergers

    NASA Astrophysics Data System (ADS)

    Vijayaraghavan, R.; Ricker, P. M.

    2013-11-01

    Galaxies in clusters are more likely to be of early type and to have lower star formation rates than galaxies in the field. Recent observations and simulations suggest that cluster galaxies may be `pre-processed' by group or filament environments and that galaxies that fall into a cluster as part of a larger group can stay coherent within the cluster for up to one orbital period (`post-processing'). We investigate these ideas by means of a cosmological N-body simulation and idealized N-body plus hydrodynamics simulations of a group-cluster merger. We find that group environments can contribute significantly to galaxy pre-processing by means of enhanced galaxy-galaxy merger rates, removal of galaxies' hot halo gas by ram pressure stripping and tidal truncation of their galaxies. Tidal distortion of the group during infall does not contribute to pre-processing. Post-processing is also shown to be effective: galaxy-galaxy collisions are enhanced during a group's pericentric passage within a cluster, the merger shock enhances the ram pressure on group and cluster galaxies and an increase in local density during the merger leads to greater galactic tidal truncation.

  8. 43 CFR 2804.19 - How will BLM process my Processing Category 6 application?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 43 Public Lands: Interior 2 2012-10-01 2012-10-01 false How will BLM process my Processing... process my Processing Category 6 application? (a) For Processing Category 6 applications, you and BLM must enter into a written agreement that describes how BLM will process your application. The final agreement...

  9. 43 CFR 2804.19 - How will BLM process my Processing Category 6 application?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 43 Public Lands: Interior 2 2013-10-01 2013-10-01 false How will BLM process my Processing... process my Processing Category 6 application? (a) For Processing Category 6 applications, you and BLM must enter into a written agreement that describes how BLM will process your application. The final agreement...

  10. Electrotechnologies to process foods

    USDA-ARS?s Scientific Manuscript database

    Electrical energy is being used to process foods. In conventional food processing plants, electricity drives mechanical devices and controls the degree of process. In recent years, several processing technologies are being developed to process foods directly with electricity. Electrotechnologies use...

  11. Optical signal processing

    NASA Technical Reports Server (NTRS)

    Casasent, D.

    1978-01-01

    The article discusses several optical configurations used for signal processing. Electronic-to-optical transducers are outlined, noting fixed window transducers and moving window acousto-optic transducers. Folded spectrum techniques are considered, with reference to wideband RF signal analysis, fetal electroencephalogram analysis, engine vibration analysis, signal buried in noise, and spatial filtering. Various methods for radar signal processing are described, such as phased-array antennas, the optical processing of phased-array data, pulsed Doppler and FM radar systems, a multichannel one-dimensional optical correlator, correlations with long coded waveforms, and Doppler signal processing. Means for noncoherent optical signal processing are noted, including an optical correlator for speech recognition and a noncoherent optical correlator.

  12. PROCESS IMPROVEMENT STUDIES ON THE BATTELLE HYDROTHERMAL COAL PROCESS

    EPA Science Inventory

    The report gives results of a study to improve the economic viability of the Battelle Hydrothermal (HT) Coal Process by reducing the costs associated with liquid/solid separation and leachant regeneration. Laboratory experiments were conducted to evaluate process improvements for...

  13. Process yield improvements with process control terminal for varian serial ion implanters

    NASA Astrophysics Data System (ADS)

    Higashi, Harry; Soni, Ameeta; Martinez, Larry; Week, Ken

    Implant processes in a modern wafer production fab are extremely complex. There can be several types of misprocessing, i.e. wrong dose or species, double implants and missed implants. Process Control Terminals (PCT) for Varian 350Ds installed at Intel fabs were found to substantially reduce the number of misprocessing steps. This paper describes those misprocessing steps and their subsequent reduction with use of PCTs. Reliable and simple process control with serial process ion implanters has been in increasing demand. A well designed process control terminal greatly increases device yield by monitoring all pertinent implanter functions and enabling process engineering personnel to set up process recipes for simple and accurate system operation. By programming user-selectable interlocks, implant errors are reduced and those that occur are logged for further analysis and prevention. A process control terminal should also be compatible with office personal computers for greater flexibility in system use and data analysis. The impact from the capability of a process control terminal is increased productivity, ergo higher device yield.
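
    A toy sketch of the recipe-interlock idea described above: each requested implant is checked against the stored engineering recipe, and mismatches are blocked and logged for later analysis. The field names and values are hypothetical, not taken from the Varian 350D process control terminal.

      # Hypothetical recipe-interlock check for a serial ion implanter.
      recipes = {
          "NWELL-1": {"species": "P31", "energy_keV": 150, "dose_cm2": 5e12},
      }

      def check_implant(recipe_id, species, energy_keV, dose_cm2, log):
          r = recipes.get(recipe_id)
          if r is None:
              log.append(f"unknown recipe {recipe_id}")
              return False
          for name, requested in (("species", species),
                                  ("energy_keV", energy_keV),
                                  ("dose_cm2", dose_cm2)):
              if r[name] != requested:
                  log.append(f"{recipe_id}: {name} mismatch ({requested} != {r[name]})")
                  return False
          return True

      errors = []
      ok = check_implant("NWELL-1", "B11", 150, 5e12, errors)   # wrong species -> blocked and logged
      print(ok, errors)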

  14. Accelerated design of bioconversion processes using automated microscale processing techniques.

    PubMed

    Lye, Gary J; Ayazi-Shamlou, Parviz; Baganz, Frank; Dalby, Paul A; Woodley, John M

    2003-01-01

    Microscale processing techniques are rapidly emerging as a means to increase the speed of bioprocess design and reduce material requirements. Automation of these techniques can reduce labour intensity and enable a wider range of process variables to be examined. This article examines recent research on various individual microscale unit operations including microbial fermentation, bioconversion and product recovery techniques. It also explores the potential of automated whole process sequences operated in microwell formats. The power of the whole process approach is illustrated by reference to a particular bioconversion, namely the Baeyer-Villiger oxidation of bicyclo[3.2.0]hept-2-en-6-one for the production of optically pure lactones.

  15. An Integrated Model of Emotion Processes and Cognition in Social Information Processing.

    ERIC Educational Resources Information Center

    Lemerise, Elizabeth A.; Arsenio, William F.

    2000-01-01

    Interprets literature on contributions of social cognitive and emotion processes to children's social competence in the context of an integrated model of emotion processes and cognition in social information processing. Provides neurophysiological and functional evidence for the centrality of emotion processes in personal-social decision making.…

  16. Process for making unsaturated hydrocarbons using microchannel process technology

    DOEpatents

    Tonkovich, Anna Lee [Dublin, OH]; Yuschak, Thomas [Lewis Center, OH]; LaPlante, Timothy J [Columbus, OH]; Rankin, Scott [Columbus, OH]; Perry, Steven T [Galloway, OH]; Fitzgerald, Sean Patrick [Columbus, OH]; Simmons, Wayne W [Dublin, OH]; Mazanec, Terry; Daymo, Eric

    2011-04-12

    The disclosed invention relates to a process for converting a feed composition comprising one or more hydrocarbons to a product comprising one or more unsaturated hydrocarbons, the process comprising: flowing the feed composition and steam in contact with each other in a microchannel reactor at a temperature in the range from about 200.degree. C. to about 1200.degree. C. to convert the feed composition to the product, the process being characterized by the absence of catalyst for converting the one or more hydrocarbons to one or more unsaturated hydrocarbons. Hydrogen and/or oxygen may be combined with the feed composition and steam.

  17. INTERIOR PHOTO OF MAIN PROCESSING BUILDING PROCESS MAKEUP AREA (CPP601) ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR PHOTO OF MAIN PROCESSING BUILDING PROCESS MAKEUP AREA (CPP-601) LOOKING SOUTHWEST. PHOTO TAKEN FROM NORTHEAST CORNER. INL PHOTO NUMBER HD-50-4-2. Mike Crane, Photographer, 6/2005 - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID

  18. INTERIOR PHOTO OF MAIN PROCESSING BUILDING PROCESS MAKEUP AREA (CPP601) ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR PHOTO OF MAIN PROCESSING BUILDING PROCESS MAKEUP AREA (CPP-601) LOOKING NORTH. PHOTO TAKEN FROM SOUTHWEST CORNER. INL PHOTO NUMBER HD-50-1-3. Mike Crane, Photographer, 6/2005 - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID

  19. Development of a Process Signature for Manufacturing Processes with Thermal Loads

    NASA Astrophysics Data System (ADS)

    Frerichs, Friedhelm; Meyer, Heiner; Strunk, Rebecca; Kolkwitz, Benjamin; Epp, Jeremy

    2018-06-01

    The newly proposed concept of Process Signatures enables the comparison of seemingly different manufacturing processes via a process-independent approach based on the analysis of the loading condition and the resulting material modification. This contribution compares the recently published results, based on numerically obtained data for the development of Process Signatures for sole surface and volume heatings without phase transformations, with the experimental data. The numerical approach applies the moving heat source theory in combination with energetic quantities. The external thermal loadings of both processes were characterized by the resulting temperature development, which correlates with a change in the residual stress state. The numerical investigations show that surface and volume heatings are interchangeable for certain parameter regimes regarding the changes in the residual stress state. Mainly, temperature gradients and thermal diffusion are responsible for the considered modifications. The applied surface- and volume-heating models are used in shallow cut grinding and induction heating, respectively. The comparison of numerical and experimental data reveals similarities, but also some systematic deviations of the residual stresses at the surface. The evaluation and final discussion support the assertion of very fast stress relaxation processes within the subsurface region. A consequence would be that the stress relaxation processes, which are not yet included in the numerical models, must be included in the Process Signatures for sole thermal impacts.

  20. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Fletcher, James C. (Inventor); Pratt, J. Richard (Inventor); St.clair, Terry L. (Inventor); Stoakley, Diane M. (Inventor); Burks, Harold D. (Inventor)

    1992-01-01

    A process for preparing polyimides having enhanced melt flow properties is described. The process consists of heating a mixture of a high molecular weight poly-(amic acid) or polyimide with a low molecular weight amic acid or imide additive in the range of 0.05 to 15 percent by weight of additive. The polyimide powders so obtained show improved processability, as evidenced by lower melt viscosity by capillary rheometry. Likewise, films prepared from mixtures of polymers with additives show improved processability with earlier onset of stretching by TMA.

  1. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Pratt, J. Richard (Inventor); St.clair, Terry L. (Inventor); Stoakley, Diane M. (Inventor); Burks, Harold D. (Inventor)

    1993-01-01

    A process for preparing polyimides having enhanced melt flow properties is described. The process consists of heating a mixture of a high molecular weight poly-(amic acid) or polyimide with a low molecular weight amic acid or imide additive in the range of 0.05 to 15 percent by weight of the additive. The polyimide powders so obtained show improved processability, as evidenced by lower melt viscosity by capillary rheometry. Likewise, films prepared from mixtures of polymers with additives show improved processability with earlier onset of stretching by TMA.

  2. INTERIOR PHOTO OF MAIN PROCESSING BUILDING (CPP601) PROCESS MAKEUP AREA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR PHOTO OF MAIN PROCESSING BUILDING (CPP-601) PROCESS MAKEUP AREA LOOKING SOUTH. PHOTO TAKEN FROM CENTER OF WEST WALL. INL PHOTO NUMBER HD-50-1-4. Mike Crane, Photographer, 6/2005 - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID

  3. INTERIOR PHOTO OF MAIN PROCESSING BUILDING PROCESS MAKEUP AREA (CPP601) ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR PHOTO OF MAIN PROCESSING BUILDING PROCESS MAKEUP AREA (CPP-601) LOOKING NORTHWEST. PHOTO TAKEN FROM MIDDLE OF CORRIDOR. INL PHOTO NUMBER HD-50-2-3. Mike Crane, Photographer, 6/2005 - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID

  4. INTERIOR PHOTO OF MAIN PROCESSING BUILDING PROCESS MAKEUP AREA (CPP601) ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR PHOTO OF MAIN PROCESSING BUILDING PROCESS MAKEUP AREA (CPP-601) LOOKING SOUTH. PHOTO TAKEN FROM MIDDLE OF CORRIDOR. INL PHOTO NUMBER HD-50-3-2. Mike Crane, Photographer, 6/2005 - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID

  5. Process for separating nitrogen from methane using microchannel process technology

    DOEpatents

    Tonkovich, Anna Lee [Marysville, OH]; Qiu, Dongming [Dublin, OH]; Dritz, Terence Andrew [Worthington, OH]; Neagle, Paul [Westerville, OH]; Litt, Robert Dwayne [Westerville, OH]; Arora, Ravi [Dublin, OH]; Lamont, Michael Jay [Hilliard, OH]; Pagnotto, Kristina M [Cincinnati, OH]

    2007-07-31

    The disclosed invention relates to a process for separating methane or nitrogen from a fluid mixture comprising methane and nitrogen, the process comprising: (A) flowing the fluid mixture into a microchannel separator, the microchannel separator comprising a plurality of process microchannels containing a sorption medium, the fluid mixture being maintained in the microchannel separator until at least part of the methane or nitrogen is sorbed by the sorption medium, and removing non-sorbed parts of the fluid mixture from the microchannel separator; and (B) desorbing the methane or nitrogen from the sorption medium and removing the desorbed methane or nitrogen from the microchannel separator. The process is suitable for upgrading methane from coal mines, landfills, and other sub-quality sources.

  6. FLOOR PLAN OF MAIN PROCESSING BUILDING (CPP601) BASEMENT SHOWING PROCESS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    FLOOR PLAN OF MAIN PROCESSING BUILDING (CPP-601) BASEMENT SHOWING PROCESS CORRIDOR AND EIGHTEEN CELLS. TO LEFT IS LABORATORY BUILDING (CPP-602). INL DRAWING NUMBER 200-0601-00-706-051981. ALTERNATE ID NUMBER CPP-E-1981. - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID

  7. Fermentation process using specific oxygen uptake rates as a process control

    DOEpatents

    Van Hoek, Pim; Aristidou, Aristos; Rush, Brian J.

    2016-08-30

    Specific oxygen uptake (OUR) is used as a process control parameter in fermentation processes. OUR is determined during at least the production phase of a fermentation process, and process parameters are adjusted to maintain the OUR within desired ranges. The invention is particularly applicable when the fermentation is conducted using a microorganism having a natural PDC pathway that has been disrupted so that it no longer functions. Microorganisms of this sort often produce poorly under strictly anaerobic conditions. Microaeration controlled by monitoring OUR allows the performance of the microorganism to be optimized.
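
    A minimal sketch of the control scheme in this claim, assuming the specific oxygen uptake rate is computed from a simple inlet/outlet gas balance and the biomass concentration, and that aeration is nudged to keep it inside a target band. Variable names, units, and the band itself are illustrative assumptions, not values from the patent.

      # Hypothetical specific-OUR controller for a microaerobic fermentation.
      def specific_our(o2_in_frac, o2_out_frac, gas_flow_mol_h, broth_volume_l, biomass_g_l):
          """mmol O2 per gram dry cell weight per hour, from a simple gas balance (assumed model)."""
          o2_uptake_mol_h = gas_flow_mol_h * (o2_in_frac - o2_out_frac)
          return 1000.0 * o2_uptake_mol_h / (broth_volume_l * biomass_g_l)

      TARGET = (1.0, 3.0)   # desired specific OUR band, mmol/(g.h), assumed

      def adjust_aeration(our, airflow_setpoint):
          low, high = TARGET
          if our < low:
              return airflow_setpoint * 1.05   # slightly more microaeration
          if our > high:
              return airflow_setpoint * 0.95   # back off to stay within the desired range
          return airflow_setpoint

      our = specific_our(0.209, 0.205, 2.0, 1.5, 20.0)
      print(round(our, 2), adjust_aeration(our, airflow_setpoint=0.5))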

  8. Fermentation process using specific oxygen uptake rates as a process control

    DOEpatents

    Van Hoek, Pim [Minnetonka, MN]; Aristidou, Aristos [Maple Grove, MN]; Rush, Brian [Minneapolis, MN]

    2011-05-10

    Specific oxygen uptake (OUR) is used as a process control parameter in fermentation processes. OUR is determined during at least the production phase of a fermentation process, and process parameters are adjusted to maintain the OUR within desired ranges. The invention is particularly applicable when the fermentation is conducted using a microorganism having a natural PDC pathway that has been disrupted so that it no longer functions. Microorganisms of this sort often produce poorly under strictly anaerobic conditions. Microaeration controlled by monitoring OUR allows the performance of the microorganism to be optimized.

  9. Fermentation process using specific oxygen uptake rates as a process control

    DOEpatents

    Van Hoek, Pim [Minnetonka, MN]; Aristidou, Aristos [Maple Grove, MN]; Rush, Brian [Minneapolis, MN]

    2007-06-19

    Specific oxygen uptake (OUR) is used as a process control parameter in fermentation processes. OUR is determined during at least the production phase of a fermentation process, and process parameters are adjusted to maintain the OUR within desired ranges. The invention is particularly applicable when the fermentation is conducted using a microorganism having a natural PDC pathway that has been disrupted so that it no longer functions. Microorganisms of this sort often produce poorly under strictly anaerobic conditions. Microaeration controlled by monitoring OUR allows the performance of the microorganism to be optimized.

  10. Fermentation process using specific oxygen uptake rates as a process control

    DOEpatents

    Van Hoek, Pim; Aristidou, Aristos; Rush, Brian

    2014-09-09

    Specific oxygen uptake rate (OUR) is used as a process control parameter in fermentation processes. OUR is determined during at least the production phase of a fermentation process, and process parameters are adjusted to maintain the OUR within desired ranges. The invention is particularly applicable when the fermentation is conducted using a microorganism having a natural PDC pathway that has been disrupted so that it no longer functions. Microorganisms of this sort often produce poorly under strictly anaerobic conditions. Microaeration controlled by monitoring OUR allows the performance of the microorganism to be optimized.
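
    As a rough illustration of the control idea described in this patent family, the sketch below nudges an aeration setpoint to keep a measured specific oxygen uptake rate inside a target band. The function name, band limits, and adjustment step are hypothetical and are not taken from the patents.

      # Minimal sketch (assumed names and numbers): hold the specific OUR inside a
      # target band by nudging the microaeration setpoint each control interval.
      OUR_LOW, OUR_HIGH = 2.0, 4.0   # hypothetical target band, mmol O2/(g DCW h)
      STEP = 0.05                    # hypothetical fractional change in air flow

      def adjust_aeration(measured_our: float, air_flow: float) -> float:
          """Return a new air-flow setpoint given the latest specific OUR value."""
          if measured_our < OUR_LOW:     # under-oxygenated relative to the band
              return air_flow * (1.0 + STEP)
          if measured_our > OUR_HIGH:    # more aeration than the microaerobic target
              return air_flow * (1.0 - STEP)
          return air_flow                # inside the band: leave the setpoint alone

      # usage: air_flow = adjust_aeration(our_from_offgas_analysis, air_flow)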

  11. Peat Processing

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Humics, Inc. already had patented their process for separating wet peat into components and processing it when they consulted NERAC regarding possible applications. The NERAC search revealed numerous uses for humic acid extracted from peat. The product improves seed germination, stimulates root development, and improves crop yields. There are also potential applications in sewage disposal and horticultural peat, etc.

  12. Picturing Quantum Processes

    NASA Astrophysics Data System (ADS)

    Coecke, Bob; Kissinger, Aleks

    2017-03-01

    Preface; 1. Introduction; 2. Guide to reading this textbook; 3. Processes as diagrams; 4. String diagrams; 5. Hilbert space from diagrams; 6. Quantum processes; 7. Quantum measurement; 8. Picturing classical-quantum processes; 9. Picturing phases and complementarity; 10. Quantum theory: the full picture; 11. Quantum foundations; 12. Quantum computation; 13. Quantum resources; 14. Quantomatic; Appendix A. Some notations; References; Index.

  13. Visual Processing in Rapid-Chase Systems: Image Processing, Attention, and Awareness

    PubMed Central

    Schmidt, Thomas; Haberkamp, Anke; Veltkamp, G. Marina; Weber, Andreas; Seydell-Greenwald, Anna; Schmidt, Filipp

    2011-01-01

    Visual stimuli can be classified so rapidly that their analysis may be based on a single sweep of feedforward processing through the visuomotor system. Behavioral criteria for feedforward processing can be evaluated in response priming tasks where speeded pointing or keypress responses are performed toward target stimuli which are preceded by prime stimuli. We apply this method to several classes of complex stimuli. (1) When participants classify natural images into animals or non-animals, the time course of their pointing responses indicates that prime and target signals remain strictly sequential throughout all processing stages, meeting stringent behavioral criteria for feedforward processing (rapid-chase criteria). (2) Such priming effects are boosted by selective visual attention for positions, shapes, and colors, in a way consistent with bottom-up enhancement of visuomotor processing, even when primes cannot be consciously identified. (3) Speeded processing of phobic images is observed in participants specifically fearful of spiders or snakes, suggesting enhancement of feedforward processing by long-term perceptual learning. (4) When the perceived brightness of primes in complex displays is altered by means of illumination or transparency illusions, priming effects in speeded keypress responses can systematically contradict subjective brightness judgments, such that one prime appears brighter than the other but activates motor responses as if it was darker. We propose that response priming captures the output of the first feedforward pass of visual signals through the visuomotor system, and that this output lacks some characteristic features of more elaborate, recurrent processing. This way, visuomotor measures may become dissociated from several aspects of conscious vision. We argue that “fast” visuomotor measures predominantly driven by feedforward processing should supplement “slow” psychophysical measures predominantly based on visual awareness

  14. Neural competition as a developmental process: Early hemispheric specialization for word processing delays specialization for face processing

    PubMed Central

    Li, Su; Lee, Kang; Zhao, Jing; Yang, Zhi; He, Sheng; Weng, Xuchu

    2013-01-01

    Little is known about the impact of learning to read on early neural development for word processing and its collateral effects on neural development in non-word domains. Here, we examined the effect of early exposure to reading on neural responses to both word and face processing in preschool children with the use of the Event Related Potential (ERP) methodology. We specifically linked children’s reading experience (indexed by their sight vocabulary) to two major neural markers: the amplitude differences between the left and right N170 on the bilateral posterior scalp sites and the hemispheric spectrum power differences in the γ band on the same scalp sites. The results showed that the left-lateralization of both the word N170 and the spectrum power in the γ band were significantly positively related to vocabulary. In contrast, vocabulary and the word left-lateralization both had a strong negative direct effect on the face right-lateralization. Also, vocabulary negatively correlated with the right-lateralized face spectrum power in the γ band even after the effects of age and the word spectrum power were partialled out. The present study provides direct evidence regarding the role of reading experience in the neural specialization of word and face processing above and beyond the effect of maturation. The present findings taken together suggest that the neural development of visual word processing competes with that of face processing before the process of neural specialization has been consolidated. PMID:23462239

  15. A level set method for determining critical curvatures for drainage and imbibition.

    PubMed

    Prodanović, Masa; Bryant, Steven L

    2006-12-15

    An accurate description of the mechanics of pore level displacement of immiscible fluids could significantly improve the predictions from pore network models of capillary pressure-saturation curves, interfacial areas and relative permeability in real porous media. If we assume quasi-static displacement, at constant pressure and surface tension, pore scale interfaces are modeled as constant mean curvature surfaces, which are not easy to calculate. Moreover, the extremely irregular geometry of natural porous media makes it difficult to evaluate surface curvature values and corresponding geometric configurations of two fluids. Finally, accounting for the topological changes of the interface, such as splitting or merging, is nontrivial. We apply the level set method for tracking and propagating interfaces in order to robustly handle topological changes and to obtain geometrically correct interfaces. We describe a simple but robust model for determining critical curvatures for throat drainage and pore imbibition. The model is set up for quasi-static displacements but it nevertheless captures both reversible and irreversible behavior (Haines jump, pore body imbibition). The pore scale grain boundary conditions are extracted from model porous media and from imaged geometries in real rocks. The method gives quantitative agreement with measurements and with other theories and computational approaches.
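
    Purely for orientation, the standard level set evolution equation and the quasi-static curvature condition that the abstract alludes to can be written as follows; this is the generic formulation and the Young-Laplace relation, not necessarily the exact speed function used by the authors.

      \[
      \frac{\partial \phi}{\partial t} + F\,\lvert \nabla \phi \rvert = 0,
      \qquad \text{interface} = \{\mathbf{x} : \phi(\mathbf{x},t) = 0\},
      \qquad \kappa_c = \frac{P_c}{\sigma},
      \]

    where \(F\) is the normal speed of the interface, \(P_c\) the capillary pressure, \(\sigma\) the interfacial tension, and \(\kappa_c\) the constant mean curvature of a quasi-static meniscus. Critical curvatures for throat drainage and pore imbibition then correspond to the largest (or smallest) \(\kappa_c\) at which a stable interface still exists in the given pore-space geometry.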

  16. Eastern Denali Fault surface trace map, eastern Alaska and Yukon, Canada

    USGS Publications Warehouse

    Bender, Adrian M.; Haeussler, Peter J.

    2017-05-04

    We map the 385-kilometer (km) long surface trace of the right-lateral, strike-slip Denali Fault between the Totschunda-Denali Fault intersection in Alaska, United States and the village of Haines Junction, Yukon, Canada. In Alaska, digital elevation models based on light detection and ranging and interferometric synthetic aperture radar data enabled our fault mapping at scales of 1:2,000 and 1:10,000, respectively. Lacking such resources in Yukon, we developed new structure-from-motion digital photogrammetry products from legacy aerial photos to map the fault surface trace at a scale of 1:10,000 east of the international border. The section of the fault that we map, referred to as the Eastern Denali Fault, did not rupture during the 2002 Denali Fault earthquake (moment magnitude 7.9). Seismologic, geodetic, and geomorphic evidence, along with a paleoseismic record of past ground-rupturing earthquakes, demonstrate Holocene and contemporary activity on the fault, however. This map of the Eastern Denali Fault surface trace complements other data sets by providing an openly accessible digital interpretation of the location, length, and continuity of the fault’s surface trace based on the accompanying digital topography dataset. Additionally, the digitized fault trace may provide geometric constraints useful for modeling earthquake scenarios and related seismic hazard.

  17. Compact and low-cost humanoid hand powered by nylon artificial muscles.

    PubMed

    Wu, Lianjun; Jung de Andrade, Monica; Saharan, Lokesh Kumar; Rome, Richard Steven; Baughman, Ray H; Tadesse, Yonas

    2017-02-03

    This paper focuses on design, fabrication and characterization of a biomimetic, compact, low-cost and lightweight 3D printed humanoid hand (TCP Hand) that is actuated by twisted and coiled polymeric (TCP) artificial muscles. The TCP muscles were recently introduced and provided unprecedented strain, mechanical work, and lifecycle (Haines et al 2014 Science 343 868-72). The five-fingered humanoid hand is under-actuated and has 16 degrees of freedom (DOF) in total (15 for fingers and 1 at the palm). In the under-actuated hand designs, a single actuator provides coupled motions at the phalanges of each finger. Two different designs are presented along with the essential elements consisting of actuators, springs, tendons and guide systems. Experiments were conducted to investigate the performance of the TCP muscles in response to the power input (power magnitude, type of wave form such as pulsed or square wave, and pulse duration) and the resulting actuation stroke and force generation. A kinematic model of the flexor tendons was developed to simulate the flexion motion and compare with experimental results. For fast finger movements, short high-power pulses were employed. Finally, we demonstrated the grasping of various objects using the humanoid TCP hand showing an array of functions similar to a natural hand.

  18. The Effect of an Energy Audit Service Learning Project on Student Perceptions of STEM Related Disciplines, Personal Behaviors/Actions towards the Environment, and Stewardship Skills

    NASA Astrophysics Data System (ADS)

    Gullo, Michael

    The purpose of this study was to investigate whether or not service learning could be considered an alternative teaching method in an environmental science classroom. In particular, the results of this research show whether an energy audit service learning project influenced student environmental awareness (knowledge of environmental issues, problems, and solutions), student personal actions/behaviors towards the environment, student perceptions and attitudes of science related careers, and community partnerships. Haines (2010) defines service learning as “a teaching and learning strategy that integrates meaningful community service with instruction and reflection to enrich the learning experience, teach civic responsibility, and strengthen communities” (p. 16). Moreover, service learning opportunities can encourage students to step out of their comfort zone and learn from hands-on experiences and apply knowledge obtained from lectures and classroom activities to real life situations. To add to the growing body of literature, the results of this study concluded that an energy audit service learning project did not have a measureable effect on student perceptions and attitudes of science related careers as compared to a more traditional teaching approach. However, the data from this study did indicate that an energy audit service learning project increased students personal actions/behaviors towards the environment more than a direct teaching approach.

  19. Process evaluation distributed system

    NASA Technical Reports Server (NTRS)

    Moffatt, Christopher L. (Inventor)

    2006-01-01

    The distributed system includes a database server, an administration module, a process evaluation module, and a data display module. The administration module is in communication with the database server for providing observation criteria information to the database server. The process evaluation module is in communication with the database server for obtaining the observation criteria information from the database server and collecting process data based on the observation criteria information. The process evaluation module utilizes a personal digital assistant (PDA). The data display module is in communication with the database server and includes a website for viewing collected process data in a desired metrics form; it also provides desired editing and modification of the collected process data. The connectivity established by the database server to the administration module, the process evaluation module, and the data display module minimizes the requirement for manual input of the collected process data.

  20. Word Processing and the Writing Process: Enhancement or Distraction?

    ERIC Educational Resources Information Center

    Dalton, David W.; Watson, James F.

    This study examined the effects of a year-long word processing program on learners' holistic writing skills. Based on results of a writing pretest, 80 seventh grade students were designated as relatively high or low in prior writing achievement and assigned to one of two groups: a word processing treatment and a conventional writing process…

  1. Polycrystalline semiconductor processing

    DOEpatents

    Glaeser, Andreas M.; Haggerty, John S.; Danforth, Stephen C.

    1983-01-01

    A process for forming large-grain polycrystalline films from amorphous films for use as photovoltaic devices. The process operates on the amorphous film and uses the driving force inherent to the transition from the amorphous state to the crystalline state as the force which drives the grain growth process. The resultant polycrystalline film is characterized by a grain size that is greater than the thickness of the film. A thin amorphous film is deposited on a substrate. The formation of a plurality of crystalline embryos is induced in the amorphous film at predetermined spaced apart locations and nucleation is inhibited elsewhere in the film. The crystalline embryos are caused to grow in the amorphous film, without further nucleation occurring in the film, until the growth of the embryos is halted by impingement on adjacently growing embryos. The process is applicable to both batch and continuous processing techniques. In either type of process, the thin amorphous film is sequentially doped with p and n type dopants. Doping is effected either before or after the formation and growth of the crystalline embryos in the amorphous film, or during a continuously proceeding crystallization step.

  2. Polycrystalline semiconductor processing

    DOEpatents

    Glaeser, A.M.; Haggerty, J.S.; Danforth, S.C.

    1983-04-05

    A process is described for forming large-grain polycrystalline films from amorphous films for use as photovoltaic devices. The process operates on the amorphous film and uses the driving force inherent to the transition from the amorphous state to the crystalline state as the force which drives the grain growth process. The resultant polycrystalline film is characterized by a grain size that is greater than the thickness of the film. A thin amorphous film is deposited on a substrate. The formation of a plurality of crystalline embryos is induced in the amorphous film at predetermined spaced apart locations and nucleation is inhibited elsewhere in the film. The crystalline embryos are caused to grow in the amorphous film, without further nucleation occurring in the film, until the growth of the embryos is halted by impingement on adjacently growing embryos. The process is applicable to both batch and continuous processing techniques. In either type of process, the thin amorphous film is sequentially doped with p and n type dopants. Doping is effected either before or after the formation and growth of the crystalline embryos in the amorphous film, or during a continuously proceeding crystallization step. 10 figs.

  3. Helium process cycle

    DOEpatents

    Ganni, Venkatarao

    2008-08-12

    A unique process cycle and apparatus design separates the consumer (cryogenic) load return flow from most of the recycle return flow of a refrigerator and/or liquefier process cycle. The refrigerator and/or liquefier process recycle return flow is recompressed by a multi-stage compressor set and the consumer load return flow is recompressed by an independent consumer load compressor set that maintains a desirable constant suction pressure using a consumer load bypass control valve and the consumer load return pressure control valve that controls the consumer load compressor's suction pressure. The discharge pressure of this consumer load compressor is thereby allowed to float at the intermediate pressure in between the first and second stage recycle compressor sets. Utilizing the unique gas management valve regulation, the unique process cycle and apparatus design in which the consumer load return flow is separate from the recycle return flow, the pressure ratios of each recycle compressor stage and all main pressures associated with the recycle return flow are allowed to vary naturally, thus providing a naturally regulated and balanced floating pressure process cycle that maintains optimal efficiency at design and off-design process cycle capacity and conditions automatically.

  4. Helium process cycle

    DOEpatents

    Ganni, Venkatarao

    2007-10-09

    A unique process cycle and apparatus design separates the consumer (cryogenic) load return flow from most of the recycle return flow of a refrigerator and/or liquefier process cycle. The refrigerator and/or liquefier process recycle return flow is recompressed by a multi-stage compressor set and the consumer load return flow is recompressed by an independent consumer load compressor set that maintains a desirable constant suction pressure using a consumer load bypass control valve and the consumer load return pressure control valve that controls the consumer load compressor's suction pressure. The discharge pressure of this consumer load compressor is thereby allowed to float at the intermediate pressure in between the first and second stage recycle compressor sets. Utilizing the unique gas management valve regulation, the unique process cycle and apparatus design in which the consumer load return flow is separate from the recycle return flow, the pressure ratios of each recycle compressor stage and all main pressures associated with the recycle return flow are allowed to vary naturally, thus providing a naturally regulated and balanced floating pressure process cycle that maintains optimal efficiency at design and off-design process cycle capacity and conditions automatically.

  5. Improving the Document Development Process: Integrating Relational Data and Statistical Process Control.

    ERIC Educational Resources Information Center

    Miller, John

    1994-01-01

    Presents an approach to document numbering, document titling, and process measurement which, when used with fundamental techniques of statistical process control, reveals meaningful process-element variation as well as nominal productivity models. (SR)
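
    Since the abstract leans on fundamental statistical process control, the following generic sketch (not the author's specific metrics) computes Shewhart-style limits for a stream of per-document measurements and flags special-cause variation.

      # Generic Shewhart individuals chart: flag a document whose process-element
      # measurement falls outside mean +/- 3 sigma of a baseline sample.
      from statistics import mean, pstdev

      def control_limits(baseline):
          m = mean(baseline)
          s = pstdev(baseline)            # sigma of the baseline sample
          return m - 3 * s, m, m + 3 * s

      def out_of_control(baseline, new_value):
          lcl, _, ucl = control_limits(baseline)
          return new_value < lcl or new_value > ucl

      baseline = [12.1, 11.8, 12.4, 12.0, 11.9, 12.2]  # e.g. hours per revision
      print(out_of_control(baseline, 14.9))            # True: special-cause variation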

  6. Thermochemical water decomposition processes

    NASA Technical Reports Server (NTRS)

    Chao, R. E.

    1974-01-01

    Thermochemical processes which lead to the production of hydrogen and oxygen from water without the consumption of any other material have a number of advantages when compared to other processes such as water electrolysis. It is possible to operate a sequence of chemical steps with net work requirements equal to zero at temperatures well below the temperature required for water dissociation in a single step. Various types of procedures are discussed, giving attention to halide processes, reverse Deacon processes, iron oxide and carbon oxide processes, and metal and alkali metal processes. Economic questions are also considered.
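
    One classical example of the kind of cycle surveyed here is the two-step iron oxide cycle, shown below purely to illustrate how the steps sum to water splitting with no net consumption of the intermediate; it is not necessarily one of the specific cycles evaluated in the paper.

      \[
      \mathrm{Fe_3O_4 \longrightarrow 3\,FeO + \tfrac{1}{2}\,O_2} \quad\text{(high-temperature reduction)}
      \]
      \[
      \mathrm{3\,FeO + H_2O \longrightarrow Fe_3O_4 + H_2} \quad\text{(lower-temperature hydrolysis)}
      \]
      \[
      \text{net:}\qquad \mathrm{H_2O \longrightarrow H_2 + \tfrac{1}{2}\,O_2}
      \]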

  7. Business Development Process

    DTIC Science & Technology

    2001-10-31

    DISTRIBUTION STATEMENT A: Approved for Public Release; Distribution Unlimited. Attorney Docket No. 83042, BUSINESS DEVELOPMENT PROCESS. Statement of Government Interest: The invention described herein may be manufactured and used by or for the ... (1) Field of the Invention: This invention generally relates to a business development process for assessing new business ideas

  8. [Definition and stabilization of processes I. Management processes and support in a Urology Department].

    PubMed

    Pascual, Carlos; Luján, Marcos; Mora, José Ramón; Chiva, Vicente; Gamarra, Manuela

    2015-01-01

    The implementation of total quality management models in clinical departments can best be adapted to the ISO 9004:2009 model. An essential part of implementing these models is the establishment of processes and their stabilization. There are four types of processes: key, management, support, and operative (clinical). Management processes have four parts: a process stabilization form, a process procedures form, a medical activities cost estimation form, and a process flow chart. In this paper we detail the creation of an essential process in a surgical department, the management of the surgery waiting list.

  9. 7 Processes that Enable NASA Software Engineering Technologies: Value-Added Process Engineering

    NASA Technical Reports Server (NTRS)

    Housch, Helen; Godfrey, Sally

    2011-01-01

    The presentation reviews Agency process requirements and the purpose, benefits, and experiences of seven software engineering processes. The processes include: product integration, configuration management, verification, software assurance, measurement and analysis, requirements management, and planning and monitoring.

  10. Implementation of a process analytical technology system in a freeze-drying process using Raman spectroscopy for in-line process monitoring.

    PubMed

    De Beer, T R M; Allesø, M; Goethals, F; Coppens, A; Heyden, Y Vander; De Diego, H Lopez; Rantanen, J; Verpoort, F; Vervaet, C; Remon, J P; Baeyens, W R G

    2007-11-01

    The aim of the present study was to propose a strategy for the implementation of a Process Analytical Technology system in freeze-drying processes. Mannitol solutions, some of them supplied with NaCl, were used as models to freeze-dry. Noninvasive and in-line Raman measurements were continuously performed during lyophilization of the solutions to monitor in real time the mannitol solid state, the end points of the different process steps (freezing, primary drying, secondary drying), and physical phenomena occurring during the process. At-line near-infrared (NIR) and X-ray powder diffractometry (XRPD) measurements were done to confirm the Raman conclusions and to find out additional information. The collected spectra during the processes were analyzed using principal component analysis and multivariate curve resolution. A two-level full factorial design was used to study the significant influence of process (freezing rate) and formulation variables (concentration of mannitol, concentration of NaCl, volume of freeze-dried sample) upon freeze-drying. Raman spectroscopy was able to monitor (i) the mannitol solid state (amorphous, alpha, beta, delta, and hemihydrate), (ii) several process step end points (end of mannitol crystallization during freezing, primary drying), and (iii) physical phenomena occurring during freeze-drying (onset of ice nucleation, onset of mannitol crystallization during the freezing step, onset of ice sublimation). NIR proved to be a more sensitive tool to monitor sublimation than Raman spectroscopy, while XRPD helped to unravel the mannitol hemihydrate in the samples. The experimental design results showed that several process and formulation variables significantly influence different aspects of lyophilization and that both are interrelated. Raman spectroscopy (in-line) and NIR spectroscopy and XRPD (at-line) not only allowed the real-time monitoring of mannitol freeze-drying processes but also helped (in combination with experimental design) us

  11. Real-Time Plasma Process Condition Sensing and Abnormal Process Detection

    PubMed Central

    Yang, Ryan; Chen, Rongshun

    2010-01-01

    The plasma process is often used in the fabrication of semiconductor wafers. However, due to the lack of real-time etching control, this may result in unacceptable process performance and thus lead to significant waste and lower wafer yield. In order to maximize the product wafer yield, timely and accurate detection of process faults or abnormalities in a plasma reactor is needed. Optical emission spectroscopy (OES) is one of the most frequently used metrologies for in-situ process monitoring. Even though OES has the advantage of non-invasiveness, it provides a huge amount of information. As a result, the data analysis of OES becomes a big challenge. To accomplish real-time detection, this work employed the sigma matching technique, which operates on the time series of the OES full-spectrum intensity. First, the response model of a healthy plasma spectrum was developed. Then, we defined a matching rate as an indicator for comparing the difference between the tested wafer's response and the healthy sigma model. The experimental results showed that the proposed method can detect process faults in real time, even in plasma etching tools. PMID:22219683
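
    As a rough sketch of the sigma-matching idea (reconstructed from the abstract, not the paper's actual definition), the fraction of time-and-wavelength samples that fall inside a healthy mean plus or minus k-sigma envelope can serve as the matching rate, and a wafer whose rate drops below a threshold is flagged as abnormal.

      import numpy as np

      def matching_rate(test, healthy_mean, healthy_sigma, k=3.0):
          """Fraction of (time, wavelength) samples inside the healthy +/- k*sigma band.
          Arrays are shaped (n_time_steps, n_wavelengths). Illustrative only."""
          inside = np.abs(test - healthy_mean) <= k * healthy_sigma
          return float(inside.mean())

      def is_abnormal(test, healthy_mean, healthy_sigma, threshold=0.95):
          return matching_rate(test, healthy_mean, healthy_sigma) < threshold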

  12. Feller processes: the next generation in modeling. Brownian motion, Lévy processes and beyond.

    PubMed

    Böttcher, Björn

    2010-12-03

    We present a simple construction method for Feller processes and a framework for the generation of sample paths of Feller processes. The construction is based on state space dependent mixing of Lévy processes. Brownian Motion is one of the most frequently used continuous time Markov processes in applications. In recent years also Lévy processes, of which Brownian Motion is a special case, have become increasingly popular. Lévy processes are spatially homogeneous, but empirical data often suggest the use of spatially inhomogeneous processes. Thus it seems necessary to go to the next level of generalization: Feller processes. These include Lévy processes and in particular Brownian motion as special cases but allow spatial inhomogeneities. Many properties of Feller processes are known, but proving the very existence is, in general, very technical. Moreover, an applicable framework for the generation of sample paths of a Feller process was missing. We explain, with practitioners in mind, how to overcome both of these obstacles. In particular, our simulation technique allows Monte Carlo methods to be applied to Feller processes.

  13. Feller Processes: The Next Generation in Modeling. Brownian Motion, Lévy Processes and Beyond

    PubMed Central

    Böttcher, Björn

    2010-01-01

    We present a simple construction method for Feller processes and a framework for the generation of sample paths of Feller processes. The construction is based on state space dependent mixing of Lévy processes. Brownian Motion is one of the most frequently used continuous time Markov processes in applications. In recent years also Lévy processes, of which Brownian Motion is a special case, have become increasingly popular. Lévy processes are spatially homogeneous, but empirical data often suggest the use of spatially inhomogeneous processes. Thus it seems necessary to go to the next level of generalization: Feller processes. These include Lévy processes and in particular Brownian motion as special cases but allow spatial inhomogeneities. Many properties of Feller processes are known, but proving the very existence is, in general, very technical. Moreover, an applicable framework for the generation of sample paths of a Feller process was missing. We explain, with practitioners in mind, how to overcome both of these obstacles. In particular, our simulation technique allows Monte Carlo methods to be applied to Feller processes. PMID:21151931
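
    A minimal way to see state-space-dependent behaviour in a sample path (a toy Euler-Maruyama scheme, not the authors' construction or framework) is a diffusion whose drift and volatility vary with the current state; the spatially homogeneous Brownian case is recovered when the coefficients are constant.

      import numpy as np

      def simulate_state_dependent_diffusion(x0=0.0, T=1.0, n=1000, seed=0):
          """Euler-Maruyama path of dX_t = b(X_t) dt + s(X_t) dW_t, a simple
          process with state-dependent coefficients (illustrative toy example)."""
          rng = np.random.default_rng(seed)
          dt = T / n
          b = lambda x: -x                       # state-dependent drift
          s = lambda x: 1.0 + 0.5 * np.tanh(x)   # state-dependent volatility
          path = np.empty(n + 1)
          path[0] = x0
          for i in range(n):
              dw = rng.normal(0.0, np.sqrt(dt))
              path[i + 1] = path[i] + b(path[i]) * dt + s(path[i]) * dw
          return path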

  14. Neural competition as a developmental process: early hemispheric specialization for word processing delays specialization for face processing.

    PubMed

    Li, Su; Lee, Kang; Zhao, Jing; Yang, Zhi; He, Sheng; Weng, Xuchu

    2013-04-01

    Little is known about the impact of learning to read on early neural development for word processing and its collateral effects on neural development in non-word domains. Here, we examined the effect of early exposure to reading on neural responses to both word and face processing in preschool children with the use of the Event Related Potential (ERP) methodology. We specifically linked children's reading experience (indexed by their sight vocabulary) to two major neural markers: the amplitude differences between the left and right N170 on the bilateral posterior scalp sites and the hemispheric spectrum power differences in the γ band on the same scalp sites. The results showed that the left-lateralization of both the word N170 and the spectrum power in the γ band were significantly positively related to vocabulary. In contrast, vocabulary and the word left-lateralization both had a strong negative direct effect on the face right-lateralization. Also, vocabulary negatively correlated with the right-lateralized face spectrum power in the γ band even after the effects of age and the word spectrum power were partialled out. The present study provides direct evidence regarding the role of reading experience in the neural specialization of word and face processing above and beyond the effect of maturation. The present findings taken together suggest that the neural development of visual word processing competes with that of face processing before the process of neural specialization has been consolidated. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. Process margin enhancement for 0.25-μm metal etch process

    NASA Astrophysics Data System (ADS)

    Lee, Chung Y.; Ma, Wei Wen; Lim, Eng H.; Cheng, Alex T.; Joy, Raymond; Ross, Matthew F.; Wong, Selmer S.; Marlowe, Trey

    2000-06-01

    This study evaluates electron beam stabilization of UV6, a positive tone Deep-UV (DUV) resist from Shipley, for a 0.25 micrometer metal etch application. Results are compared between untreated resist and resist treated with different levels of electron beam stabilization. The electron beam processing was carried out in an ElectronCureTM flood electron beam exposure system from Honeywell International Inc., Electron Vision. The ElectronCureTM system utilizes a flood electron beam source which is larger in diameter than the substrate being processed, and is capable of variable energy so that the electron range is matched to the resist film thickness. Changes in the UV6 resist material as a result of the electron beam stabilization are monitored via spectroscopic ellipsometry for film thickness and index of refraction changes and FTIR for analysis of chemical changes. Thermal flow stability is evaluated by applying hot plate bakes of 150 degrees Celsius and 200 degrees Celsius, to patterned resist wafers with no treatment and with an electron beam dose level of 2000 (mu) C/cm2. A significant improvement in the thermal flow stability of the patterned UV6 resist features is achieved with the electron beam stabilization process. Etch process performance of the UV6 resist was evaluated by performing a metal pattern transfer process on wafers with untreated resist and comparing these with etch results on wafers with different levels of electron beam stabilization. The etch processing was carried out in an Applied Materials reactor with an etch chemistry including BCl3 and Cl2. All wafers were etched under the same conditions and the resist was treated after etch to prevent further erosion after etch but before SEM analysis. Post metal etch SEM cross-sections show the enhancement in etch resistance provided by the electron beam stabilization process. Enhanced process margin is achieved as a result of the improved etch resistance, and is observed in reduced resist side

  16. Power processing

    NASA Technical Reports Server (NTRS)

    Schwarz, F. C.

    1971-01-01

    Processing of electric power has been presented as a discipline that draws on almost every field of electrical engineering, including system and control theory, communications theory, electronic network design, and power component technology. The cost of power processing equipment, which often equals that of expensive, sophisticated, and unconventional sources of electrical energy, such as solar batteries, is a significant consideration in the choice of electric power systems.

  17. Dynamic control of remelting processes

    DOEpatents

    Bertram, Lee A.; Williamson, Rodney L.; Melgaard, David K.; Beaman, Joseph J.; Evans, David G.

    2000-01-01

    An apparatus and method of controlling a remelting process by providing measured process variable values to a process controller; estimating process variable values using a process model of a remelting process; and outputting estimated process variable values from the process controller. Feedback and feedforward control devices receive the estimated process variable values and adjust inputs to the remelting process. Electrode weight, electrode mass, electrode gap, process current, process voltage, electrode position, electrode temperature, electrode thermal boundary layer thickness, electrode velocity, electrode acceleration, slag temperature, melting efficiency, cooling water temperature, cooling water flow rate, crucible temperature profile, slag skin temperature, and/or drip short events are employed, as are parameters representing physical constraints of electroslag remelting or vacuum arc remelting, as applicable.

  18. In-Process Thermal Imaging of the Electron Beam Freeform Fabrication Process

    NASA Technical Reports Server (NTRS)

    Taminger, Karen M.; Domack, Christopher S.; Zalameda, Joseph N.; Taminger, Brian L.; Hafley, Robert A.; Burke, Eric R.

    2016-01-01

    Researchers at NASA Langley Research Center have been developing the Electron Beam Freeform Fabrication (EBF3) metal additive manufacturing process for the past 15 years. In this process, an electron beam is used as a heat source to create a small molten pool on a substrate into which wire is fed. The electron beam and wire feed assembly are translated with respect to the substrate to follow a predetermined tool path. This process is repeated in a layer-wise fashion to fabricate metal structural components. In-process imaging has been integrated into the EBF3 system using a near-infrared (NIR) camera. The images are processed to provide thermal and spatial measurements that have been incorporated into a closed-loop control system to maintain consistent thermal conditions throughout the build. Other information in the thermal images is being used to assess quality in real time by detecting flaws in prior layers of the deposit. NIR camera incorporation into the system has improved the consistency of the deposited material and provides the potential for real-time flaw detection which, ultimately, could lead to the manufacture of better, more reliable components using this additive manufacturing process.

  19. Hierarchical process memory: memory as an integral component of information processing

    PubMed Central

    Hasson, Uri; Chen, Janice; Honey, Christopher J.

    2015-01-01

    Models of working memory commonly focus on how information is encoded into and retrieved from storage at specific moments. However, in the majority of real-life processes, past information is used continuously to process incoming information across multiple timescales. Considering single unit, electrocorticography, and functional imaging data, we argue that (i) virtually all cortical circuits can accumulate information over time, and (ii) the timescales of accumulation vary hierarchically, from early sensory areas with short processing timescales (tens to hundreds of milliseconds) to higher-order areas with long processing timescales (many seconds to minutes). In this hierarchical systems perspective, memory is not restricted to a few localized stores, but is intrinsic to information processing that unfolds throughout the brain on multiple timescales. “The present contains nothing more than the past, and what is found in the effect was already in the cause.”Henri L Bergson PMID:25980649

  20. Process Analyzer

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The ChemScan UV-6100 is a spectrometry system originally developed by Biotronics Technologies, Inc. under a Small Business Innovation Research (SBIR) contract. It is marketed to the water and wastewater treatment industries, replacing "grab sampling" with on-line data collection. It analyzes the light absorbance characteristics of a water sample, simultaneously detects hundreds of individual wavelengths absorbed by chemical substances in a process solution, and quantifies the information. Spectral data is then processed by ChemScan analyzer and compared with calibration files in the system's memory in order to calculate concentrations of chemical substances that cause UV light absorbance in specific patterns. Monitored substances can be analyzed for quality and quantity. Applications include detection of a variety of substances, and the information provided enables an operator to control a process more efficiently.

  1. Rapid process development of chromatographic process using direct analysis in real time mass spectrometry as a process analytical technology tool.

    PubMed

    Yan, Binjun; Chen, Teng; Xu, Zhilin; Qu, Haibin

    2014-06-01

    The concept of quality by design (QbD) is widely applied in the process development of pharmaceuticals. However, the additional cost and time have caused some resistance to QbD implementation. To show a possible solution, this work proposed a rapid process development method, which used direct analysis in real time mass spectrometry (DART-MS) as a process analytical technology (PAT) tool for studying the chromatographic process of Ginkgo biloba L., as an example. The breakthrough curves were rapidly determined at-line by DART-MS. A high correlation coefficient of 0.9520 was found between the concentrations of ginkgolide A determined by DART-MS and HPLC. Based on the PAT tool, the impacts of process parameters on the adsorption capacity were discovered rapidly, showing that the adsorption capacity decreased as the flow rate increased. This work has shown the feasibility and advantages of integrating PAT into QbD implementation for rapid process development. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. Recollection is a continuous process: implications for dual-process theories of recognition memory.

    PubMed

    Mickes, Laura; Wais, Peter E; Wixted, John T

    2009-04-01

    Dual-process theory, which holds that recognition decisions can be based on recollection or familiarity, has long seemed incompatible with signal detection theory, which holds that recognition decisions are based on a singular, continuous memory-strength variable. Formal dual-process models typically regard familiarity as a continuous process (i.e., familiarity comes in degrees), but they construe recollection as a categorical process (i.e., recollection either occurs or does not occur). A continuous process is characterized by a graded relationship between confidence and accuracy, whereas a categorical process is characterized by a binary relationship such that high confidence is associated with high accuracy but all lower degrees of confidence are associated with chance accuracy. Using a source-memory procedure, we found that the relationship between confidence and source-recollection accuracy was graded. Because recollection, like familiarity, is a continuous process, dual-process theory is more compatible with signal detection theory than previously thought.

  3. Rethinking Process through Design

    ERIC Educational Resources Information Center

    Newcomb, Matthew; Leshowitz, Allison

    2017-01-01

    We take a look at work on writing processes by examining design processes. Design processes offer a greater emphasis on empathy with users, feedback and critique during idea generation, and varied uses of materials. After considering work already done on design and composition, we explore a variety of design processes and develop our own…

  4. Making process improvement 'stick'.

    PubMed

    Studer, Quint

    2014-06-01

    To sustain gains from a process improvement initiative, healthcare organizations should: explain to staff why a process improvement initiative is needed; encourage leaders within the organization to champion the process improvement, and tie their evaluations to its outcomes; and ensure that both leaders and employees have the skills to help sustain the sought-after process improvements.

  5. Computers for symbolic processing

    NASA Technical Reports Server (NTRS)

    Wah, Benjamin W.; Lowrie, Matthew B.; Li, Guo-Jie

    1989-01-01

    A detailed survey on the motivations, design, applications, current status, and limitations of computers designed for symbolic processing is provided. Symbolic processing computations are performed at the word, relation, or meaning levels, and the knowledge used in symbolic applications may be fuzzy, uncertain, indeterminate, and ill represented. Various techniques for knowledge representation and processing are discussed from both the designers' and users' points of view. The design and choice of a suitable language for symbolic processing and the mapping of applications into a software architecture are then considered. The process of refining the application requirements into hardware and software architectures is treated, and state-of-the-art sequential and parallel computers designed for symbolic processing are discussed.

  6. Honing process optimization algorithms

    NASA Astrophysics Data System (ADS)

    Kadyrov, Ramil R.; Charikov, Pavel N.; Pryanichnikova, Valeria V.

    2018-03-01

    This article considers the relevance of honing processes for creating high-quality mechanical engineering products. The features of the honing process are described, and important concepts are emphasized, such as the formulation of the optimization task for honing operations, the optimal structure of honing working cycles, stepped and stepless honing cycles, and the simulation of processing and its purpose. It is noted that the reliability of the mathematical model determines the quality parameters of the honing process control. An algorithm for continuous control of the honing process is proposed. The process model reliably describes the machining of a workpiece over a sufficiently wide region and can be used to operate the CC743 CNC machine.

  7. HYNOL PROCESS ENGINEERING: PROCESS CONFIGURATION, SITE PLAN, AND EQUIPMENT DESIGN

    EPA Science Inventory

    The report describes the design of the hydropyrolysis reactor system of the Hynol process. (NOTE: A bench scale methanol production facility is being constructed to demonstrate the technical feasibility of producing methanol from biomass using the Hynol process. The plant is bein...

  8. Silicon production process evaluations

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Chemical engineering analyses involving the preliminary process design of a plant (1,000 metric tons/year capacity) to produce silicon via the technology under consideration were accomplished. Major activities in the chemical engineering analyses included base case conditions, reaction chemistry, process flowsheet, material balance, energy balance, property data, equipment design, major equipment list, and production labor, which were then forwarded for economic analysis. The process design package provided detailed data for raw materials, utilities, major process equipment and production labor requirements necessary for polysilicon production in each process.

  9. Modular Toolkit for Data Processing (MDP): A Python Data Processing Framework.

    PubMed

    Zito, Tiziano; Wilbert, Niko; Wiskott, Laurenz; Berkes, Pietro

    2008-01-01

    Modular toolkit for Data Processing (MDP) is a data processing framework written in Python. From the user's perspective, MDP is a collection of supervised and unsupervised learning algorithms and other data processing units that can be combined into data processing sequences and more complex feed-forward network architectures. Computations are performed efficiently in terms of speed and memory requirements. From the scientific developer's perspective, MDP is a modular framework, which can easily be expanded. The implementation of new algorithms is easy and intuitive. The new implemented units are then automatically integrated with the rest of the library. MDP has been written in the context of theoretical research in neuroscience, but it has been designed to be helpful in any context where trainable data processing algorithms are used. Its simplicity on the user's side, the variety of readily available algorithms, and the reusability of the implemented units make it also a useful educational tool.
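
    For readers unfamiliar with the library, a typical usage pattern looks roughly like the sketch below; the node names and call signatures follow the MDP tutorial but should be checked against the documentation of the installed version.

      import numpy as np
      import mdp   # Modular toolkit for Data Processing

      x = np.random.random((1000, 20))   # 1000 observations, 20 variables

      # Chain two processing units into a feed-forward flow: PCA followed by a
      # quadratic expansion (node names as in the MDP tutorial; verify locally).
      flow = mdp.Flow([mdp.nodes.PCANode(output_dim=5),
                       mdp.nodes.PolynomialExpansionNode(2)])
      flow.train(x)    # trainable nodes are trained in sequence
      y = flow(x)      # execute the whole flow on new data
      print(y.shape)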

  10. Examining Candidate Information Search Processes: The Impact of Processing Goals and Sophistication.

    ERIC Educational Resources Information Center

    Huang, Li-Ning

    2000-01-01

    Investigates how 4 different information-processing goals, varying on the dimensions of effortful versus effortless and impression-driven versus non-impression-driven processing, and individual difference in political sophistication affect the depth at which undergraduate students process candidate information and their decision-making strategies.…

  11. On Intelligent Design and Planning Method of Process Route Based on Gun Breech Machining Process

    NASA Astrophysics Data System (ADS)

    Hongzhi, Zhao; Jian, Zhang

    2018-03-01

    The paper presents an approach to the intelligent design and planning of process routes based on the gun breech machining process, addressing several problems such as the complexity of gun breech machining, the tedium of route design, and the long cycle times of traditional, hard-to-manage process routes. Based on the gun breech machining process, an intelligent process route design and planning system is developed using DEST and VC++. The system includes two functional modules: intelligent process route design and process route planning. The intelligent design module, through analysis of the gun breech machining process, summarizes breech process knowledge to complete the design of the knowledge base and the inference engine, and the gun breech process route is then output intelligently. On the basis of the intelligent route design module, the final process route is made, edited and managed in the process route planning module.

  12. Process Mining Methodology for Health Process Tracking Using Real-Time Indoor Location Systems.

    PubMed

    Fernandez-Llatas, Carlos; Lizondo, Aroa; Monton, Eduardo; Benedi, Jose-Miguel; Traver, Vicente

    2015-11-30

    The definition of efficient and accurate health processes in hospitals is crucial for ensuring an adequate quality of service. Knowing and improving the behavior of the surgical processes in a hospital can improve the number of patients that can be operated on using the same resources. However, the measurement of this process is usually made in an obtrusive way, forcing nurses to get information and time data, affecting the proper process and generating inaccurate data due to human errors during the stressful journey of health staff in the operating theater. The use of indoor location systems can take time information about the process in an unobtrusive way, freeing nurses, allowing them to engage in purely welfare work. However, it is necessary to present these data in an understandable way for health professionals, who cannot deal with large amounts of historical localization log data. The use of process mining techniques can deal with this problem, offering an easily understandable view of the process. In this paper, we present a tool and a process mining-based methodology that, using indoor location systems, enables health staff not only to represent the process, but to know precise information about the deployment of the process in an unobtrusive and transparent way. We have successfully tested this tool in a real surgical area with 3613 patients during February, March and April of 2015.

  13. Process Mining Methodology for Health Process Tracking Using Real-Time Indoor Location Systems

    PubMed Central

    Fernandez-Llatas, Carlos; Lizondo, Aroa; Monton, Eduardo; Benedi, Jose-Miguel; Traver, Vicente

    2015-01-01

    The definition of efficient and accurate health processes in hospitals is crucial for ensuring an adequate quality of service. Knowing and improving the behavior of the surgical processes in a hospital can improve the number of patients that can be operated on using the same resources. However, the measurement of this process is usually made in an obtrusive way, forcing nurses to get information and time data, affecting the proper process and generating inaccurate data due to human errors during the stressful journey of health staff in the operating theater. The use of indoor location systems can take time information about the process in an unobtrusive way, freeing nurses, allowing them to engage in purely welfare work. However, it is necessary to present these data in an understandable way for health professionals, who cannot deal with large amounts of historical localization log data. The use of process mining techniques can deal with this problem, offering an easily understandable view of the process. In this paper, we present a tool and a process mining-based methodology that, using indoor location systems, enables health staff not only to represent the process, but to know precise information about the deployment of the process in an unobtrusive and transparent way. We have successfully tested this tool in a real surgical area with 3613 patients during February, March and April of 2015. PMID:26633395
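
    As a minimal illustration of what process mining does with location-derived event logs (a generic directly-follows count, not the tool or algorithm used in the paper), consider the following sketch with a hypothetical log.

      from collections import Counter

      # Hypothetical event log derived from indoor location data:
      # (case_id, activity) pairs, time-ordered within each surgical case.
      events = [
          ("case1", "admission"), ("case1", "operating_room"), ("case1", "recovery"),
          ("case2", "admission"), ("case2", "operating_room"),
          ("case2", "recovery"), ("case2", "discharge"),
      ]

      def directly_follows(log):
          """Count how often activity a is directly followed by activity b per case."""
          traces = {}
          for case, activity in log:
              traces.setdefault(case, []).append(activity)
          pairs = Counter()
          for trace in traces.values():
              pairs.update(zip(trace, trace[1:]))
          return pairs

      print(directly_follows(events))
      # e.g. Counter({('admission', 'operating_room'): 2, ('operating_room', 'recovery'): 2, ...})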

  14. Closed-Loop Process Control for Electron Beam Freeform Fabrication and Deposition Processes

    NASA Technical Reports Server (NTRS)

    Taminger, Karen M. (Inventor); Hofmeister, William H. (Inventor); Martin, Richard E. (Inventor); Hafley, Robert A. (Inventor)

    2013-01-01

    A closed-loop control method for an electron beam freeform fabrication (EBF3) process includes detecting a feature of interest during the process using a sensor(s), continuously evaluating the feature of interest to determine, in real time, a change occurring therein, and automatically modifying control parameters to control the EBF3 process. An apparatus provides a closed-loop control method for the process, and includes an electron gun for generating an electron beam, a wire feeder for feeding a wire toward a substrate, wherein the wire is melted and progressively deposited in layers onto the substrate, a sensor(s), and a host machine. The sensor(s) measure the feature of interest during the process, and the host machine continuously evaluates the feature of interest to determine, in real time, a change occurring therein. The host machine automatically modifies control parameters to the EBF3 apparatus to control the EBF3 process in a closed-loop manner.
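
    To make the claim concrete, here is a toy closed-loop step in the spirit of the patent: measure a feature of interest from a sensor frame, compare it with a target, and adjust a control parameter. The chosen feature (melt-pool area), gain, and parameter names are assumptions, not the patent's.

      import numpy as np

      TARGET_POOL_AREA = 250.0   # hypothetical target, pixels
      GAIN = 0.002               # hypothetical proportional gain

      def melt_pool_area(frame, threshold=0.7):
          """Feature of interest: count of bright pixels in a normalized sensor frame."""
          return float((frame > threshold).sum())

      def control_step(frame, beam_power):
          """One closed-loop update: proportional correction of the beam power."""
          error = TARGET_POOL_AREA - melt_pool_area(frame)
          return beam_power + GAIN * error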

  15. 43 CFR 2884.17 - How will BLM process my Processing Category 6 application?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 43 Public Lands: Interior 2 2012-10-01 2012-10-01 false How will BLM process my Processing...-WAY UNDER THE MINERAL LEASING ACT Applying for MLA Grants or TUPs § 2884.17 How will BLM process my... written agreement that describes how BLM will process your application. The final agreement consists of a...

  16. 43 CFR 2884.17 - How will BLM process my Processing Category 6 application?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 43 Public Lands: Interior 2 2013-10-01 2013-10-01 false How will BLM process my Processing...-WAY UNDER THE MINERAL LEASING ACT Applying for MLA Grants or TUPs § 2884.17 How will BLM process my... written agreement that describes how BLM will process your application. The final agreement consists of a...

  17. Laser dimpling process parameters selection and optimization using surrogate-driven process capability space

    NASA Astrophysics Data System (ADS)

    Ozkat, Erkan Caner; Franciosa, Pasquale; Ceglarek, Dariusz

    2017-08-01

    Remote laser welding technology offers opportunities for high production throughput at a competitive cost. However, the remote laser welding process of zinc-coated sheet metal parts in lap joint configuration poses a challenge due to the difference between the melting temperature of the steel (∼1500 °C) and the vapourizing temperature of the zinc (∼907 °C). In fact, the zinc layer at the faying surface is vapourized and the vapour might be trapped within the melting pool leading to weld defects. Various solutions have been proposed to overcome this problem over the years. Among them, laser dimpling has been adopted by manufacturers because of its flexibility and effectiveness along with its cost advantages. In essence, the dimple works as a spacer between the two sheets in lap joint and allows the zinc vapour escape during welding process, thereby preventing weld defects. However, there is a lack of comprehensive characterization of dimpling process for effective implementation in real manufacturing system taking into consideration inherent changes in variability of process parameters. This paper introduces a methodology to develop (i) surrogate model for dimpling process characterization considering multiple-inputs (i.e. key control characteristics) and multiple-outputs (i.e. key performance indicators) system by conducting physical experimentation and using multivariate adaptive regression splines; (ii) process capability space (Cp-Space) based on the developed surrogate model that allows the estimation of a desired process fallout rate in the case of violation of process requirements in the presence of stochastic variation; and, (iii) selection and optimization of the process parameters based on the process capability space. The proposed methodology provides a unique capability to: (i) simulate the effect of process variation as generated by manufacturing process; (ii) model quality requirements with multiple and coupled quality requirements; and (iii
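
    The process fallout rate under stochastic parameter variation can be pictured with a small Monte Carlo estimate over a surrogate model; the quadratic surrogate, input distributions, and height requirement below are placeholders, not the paper's fitted multivariate adaptive regression splines model.

      import numpy as np

      rng = np.random.default_rng(1)

      def surrogate_dimple_height(power, speed):
          """Placeholder surrogate (stand-in for a fitted model), height in mm."""
          return 0.05 + 0.02 * power - 0.01 * speed - 0.004 * power * speed

      def fallout_rate(power_mean, speed_mean, power_sd=0.2, speed_sd=0.2,
                       lo=0.045, hi=0.09, n=100_000):
          """Fraction of simulated parts whose dimple height violates [lo, hi]."""
          p = rng.normal(power_mean, power_sd, n)   # stochastic input variation
          s = rng.normal(speed_mean, speed_sd, n)
          h = surrogate_dimple_height(p, s)
          return float(((h < lo) | (h > hi)).mean())

      print(fallout_rate(power_mean=1.0, speed_mean=1.0))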

  18. Cox process representation and inference for stochastic reaction-diffusion processes

    NASA Astrophysics Data System (ADS)

    Schnoerr, David; Grima, Ramon; Sanguinetti, Guido

    2016-05-01

    Complex behaviour in many systems arises from the stochastic interactions of spatially distributed particles or agents. Stochastic reaction-diffusion processes are widely used to model such behaviour in disciplines ranging from biology to the social sciences, yet they are notoriously difficult to simulate and calibrate to observational data. Here we use ideas from statistical physics and machine learning to provide a solution to the inverse problem of learning a stochastic reaction-diffusion process from data. Our solution relies on a non-trivial connection between stochastic reaction-diffusion processes and spatio-temporal Cox processes, a well-studied class of models from computational statistics. This connection leads to an efficient and flexible algorithm for parameter inference and model selection. Our approach shows excellent accuracy on numeric and real data examples from systems biology and epidemiology. Our work provides both insights into spatio-temporal stochastic systems, and a practical solution to a long-standing problem in computational modelling.
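
    For readers unfamiliar with the term, a Cox process is a Poisson process with a random intensity: conditional on the intensity field, counts in disjoint regions are independent Poisson variables,

      \[
      N(A) \mid \Lambda \;\sim\; \mathrm{Poisson}\!\left(\int_{A} \Lambda(\mathbf{x},t)\,d\mathbf{x}\,dt\right),
      \]

    with \(\Lambda\) itself drawn from a stochastic model (for example, a transformed Gaussian process). This is only the general definition; the paper's contribution is the specific link between such processes and stochastic reaction-diffusion dynamics.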

  19. Performance of biofuel processes utilising separate lignin and carbohydrate processing.

    PubMed

    Melin, Kristian; Kohl, Thomas; Koskinen, Jukka; Hurme, Markku

    2015-09-01

    Novel biofuel pathways with increased product yields are evaluated against conventional lignocellulosic biofuel production processes: methanol or methane production via gasification and ethanol production via steam-explosion pre-treatment. The novel processes studied are ethanol production combined with methanol production by gasification, hydrocarbon fuel production with additional hydrogen produced from lignin residue gasification, and methanol or methane synthesis using synthesis gas from lignin residue gasification and additional hydrogen obtained by aqueous phase reforming in synthesis gas production. The material and energy balances of the processes were calculated with Aspen flowsheet models and add-on Excel calculations applicable at the conceptual design stage to evaluate the pre-feasibility of the alternatives. The processes were compared using the following criteria: energy efficiency from biomass to products, primary energy efficiency, GHG reduction potential and economy (expressed as net present value, NPV). Several novel biorefinery concepts gave higher energy yields, GHG reduction potential and NPV. Copyright © 2015 Elsevier Ltd. All rights reserved.
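
    The record reports comparisons by net present value but not the underlying cash flows; the short sketch below only shows the NPV arithmetic on invented numbers (the discount rate, investment and yearly revenues are illustrative assumptions, not figures from the study).

      def npv(rate, cash_flows):
          # Discount a list of yearly cash flows (year 0 first) to present value.
          return sum(cf / (1.0 + rate) ** year for year, cf in enumerate(cash_flows))

      discount_rate = 0.10
      conventional = [-120.0] + [18.0] * 20    # M EUR: investment followed by 20 years of net revenue
      novel = [-150.0] + [25.0] * 20

      for name, flows in (("conventional", conventional), ("novel", novel)):
          print(f"{name:12s} NPV = {npv(discount_rate, flows):7.1f} M EUR")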

  20. Neurophysiological evidence for transfer appropriate processing of memory: processing versus feature similarity.

    PubMed

    Schendan, Haline E; Kutas, Marta

    2007-08-01

    Transfer appropriate processing (TAP) accounts propose that memory is a function of the degree to which the same neural processes transfer appropriately from the study experience to the memory test. However, in prior research, study and test stimuli were often similar physically. In two experiments, event-related brain potentials (ERPs) were recorded to fragmented objects during an indirect memory test to isolate transfer of a specific perceptual process from overlap of physical features between experiences. An occipitotemporoparietal P2(00) at 200 msec showed implicit memory effects only when similar perceptual grouping processes of good continuation were repeatedly engaged, despite physical feature differences, as TAP accounts hypothesize. This result provides direct neurophysiological evidence for the critical role of process transfer across experiences for memory.

  1. Manufacturing Process Selection of Composite Bicycle’s Crank Arm using Analytical Hierarchy Process (AHP)

    NASA Astrophysics Data System (ADS)

    Luqman, M.; Rosli, M. U.; Khor, C. Y.; Zambree, Shayfull; Jahidi, H.

    2018-03-01

    Crank arm is one of the important parts in a bicycle and is an expensive product due to the high cost of material and production process. This research aims to investigate the potential types of manufacturing process for fabricating a composite bicycle crank arm, and to describe an approach based on the analytical hierarchy process (AHP) that assists decision makers or manufacturing engineers in determining the most suitable process to be employed in manufacturing the composite bicycle crank arm at the early stage of the product development process, in order to reduce the production cost. Four types of processes were considered, namely resin transfer molding (RTM), compression molding (CM), vacuum bag molding and filament winding (FW). The analysis ranks these four processes for their suitability in the manufacturing of the bicycle crank arm based on five main selection factors and 10 sub-factors. The most suitable manufacturing process was determined by following the AHP steps, and a consistency test was performed to make sure the judgements made during the comparisons were consistent. The results indicated that compression molding was the most appropriate manufacturing process because it has the highest value (33.6%) among the other manufacturing processes.
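
    To make the AHP mechanics concrete, the sketch below derives priority weights from a pairwise-comparison matrix and computes Saaty's consistency ratio; the example matrix is invented for illustration and is not the judgement data used in the study.

      import numpy as np

      # Hypothetical pairwise comparisons of four options on one criterion (Saaty 1-9 scale).
      A = np.array([
          [1.0, 3.0, 5.0, 1.0],
          [1/3, 1.0, 3.0, 1/3],
          [1/5, 1/3, 1.0, 1/5],
          [1.0, 3.0, 5.0, 1.0],
      ])

      eigvals, eigvecs = np.linalg.eig(A)
      k = int(np.argmax(eigvals.real))
      weights = np.abs(eigvecs[:, k].real)
      weights /= weights.sum()                 # priority vector from the principal eigenvector

      n = A.shape[0]
      ci = (eigvals.real[k] - n) / (n - 1)     # consistency index
      cr = ci / 0.90                           # consistency ratio (random index 0.90 for n = 4); < 0.10 is acceptable
      print("weights:", np.round(weights, 3), " CR =", round(cr, 3))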

  2. Optimization of a novel enzyme treatment process for early-stage processing of sheepskins.

    PubMed

    Lim, Y F; Bronlund, J E; Allsop, T F; Shilton, A N; Edmonds, R L

    2010-01-01

    An enzyme treatment process for early-stage processing of sheepskins has been previously reported by the Leather and Shoe Research Association of New Zealand (LASRA) as an alternative to current industry operations. The newly developed process had marked benefits over conventional processing in terms of lowered energy usage (73%), processing time (47%) and water use (49%), but had been developed only as a "proof of principle". The objective of this work was to develop the process further to a stage ready for adoption by industry. Mass balancing was used to investigate potential modifications for the process based on the understanding developed from a detailed analysis of preliminary design trials. Results showed that a configuration utilising a 2-stage counter-current system for the washing stages, together with segregation and recycling of enzyme float prior to dilution in the neutralization stage, was a significant improvement. Benefits over conventional processing include a reduction of residual TDS by 50% at the washing stages and 70% savings on water use overall. Benefits over the un-optimized LASRA process are a reduction of solids in product after the enzyme treatment and neutralization stages by 30%, additional water savings of 21%, as well as 10% savings in enzyme usage.

  3. Business Process Management

    NASA Astrophysics Data System (ADS)

    Hantry, Francois; Papazoglou, Mike; van den Heuvel, Willem-Jan; Haque, Rafique; Whelan, Eoin; Carroll, Noel; Karastoyanova, Dimka; Leymann, Frank; Nikolaou, Christos; Lamersdorf, Winfried; Hacid, Mohand-Said

    Business process management is one of the core drivers of business innovation and is based on strategic technology capable of creating and successfully executing end-to-end business processes. The trend will be to move from relatively stable, organization-specific applications to more dynamic, high-value ones, where business process interactions and trends are examined closely to understand an application's requirements more accurately. Such collaborative, complex end-to-end service interactions give rise to the concept of Service Networks (SNs).

  4. Issues Management Process Course # 38401

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Binion, Ula Marie

    The purpose of this training is to advise Issues Management Coordinators (IMCs) on the revised Contractor Assurance System (CAS) Issues Management (IM) process. Terminal Objectives: Understand the Laboratory's IM process; Understand your role in the Laboratory's IM process. Learning Objectives: Describe the IM process within the context of the CAS; Describe the importance of implementing an institutional IM process at LANL; Describe the process flow for the Laboratory's IM process; Apply the definition of an issue; Use available resources to determine initial screening risk levels for issues; Describe the required major process steps for each risk level; Describe the personnel responsibilities for IM process implementation; Access available resources to support IM process implementation.

  5. gProcess and ESIP Platforms for Satellite Imagery Processing over the Grid

    NASA Astrophysics Data System (ADS)

    Bacu, Victor; Gorgan, Dorian; Rodila, Denisa; Pop, Florin; Neagu, Gabriel; Petcu, Dana

    2010-05-01

    The Environment oriented Satellite Data Processing Platform (ESIP) is developed through the SEE-GRID-SCI (SEE-GRID eInfrastructure for regional eScience) project, co-funded by the European Commission through FP7 [1]. The gProcess Platform [2] is a set of tools and services supporting the development and the execution over the Grid of workflow-based processing, and particularly satellite imagery processing. The ESIP [3], [4] is built on top of the gProcess platform by adding a set of satellite image processing software modules and meteorological algorithms. The satellite images can reveal and supply important information on earth surface parameters, climate data, pollution levels and weather conditions that can be used in different research areas. Generally, the processing algorithms for satellite images can be decomposed into a set of modules that forms a graph representation of the processing workflow. Two types of workflows can be defined in the gProcess platform: the abstract workflow (PDG - Process Description Graph), in which the user defines the algorithm conceptually, and the instantiated workflow (iPDG - instantiated PDG), which is the mapping of the PDG pattern onto particular satellite image and meteorological data [5]. The gProcess platform allows the definition of complex workflows by combining data resources, operators, services and sub-graphs. The gProcess platform is developed for the gLite middleware that is available in the EGEE and SEE-GRID infrastructures [6]. gProcess exposes the specific functionality through web services [7]. The Editor Web Service retrieves information on available resources that are used to develop complex workflows (available operators, sub-graphs, services, supported resources, etc.). The Manager Web Service deals with resource management (uploading new resources such as workflows, operators, services, data, etc.) and in addition retrieves information on workflows. The Executor Web Service manages the execution of the instantiated workflows
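
    The workflow notions above (abstract PDG versus instantiated iPDG) can be pictured with a few lines of code: an abstract graph of named operators is bound to concrete input data and executed in dependency order. The operators and data below are invented placeholders; the actual platform runs such nodes as Grid jobs through its web services.

      from typing import Callable, Dict, List

      class Node:
          def __init__(self, name: str, func: Callable, inputs: List[str]):
              self.name, self.func, self.inputs = name, func, inputs

      # Abstract workflow (PDG-like): calibrate -> derive an index -> threshold mask.
      pdg = [
          Node("calibrate", lambda raw: [v * 0.01 for v in raw], inputs=["raw_band"]),
          Node("index", lambda cal: [2 * v - 1 for v in cal], inputs=["calibrate"]),
          Node("mask", lambda idx: [v > 0.2 for v in idx], inputs=["index"]),
      ]

      def run(workflow: List[Node], bindings: Dict[str, object]) -> Dict[str, object]:
          # Binding the abstract graph to concrete data gives the iPDG, executed here in order.
          results = dict(bindings)
          for node in workflow:                          # nodes listed in dependency order
              results[node.name] = node.func(*[results[name] for name in node.inputs])
          return results

      print(run(pdg, {"raw_band": [55, 80, 120]})["mask"])   # [False, True, True]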

  6. A Process Research Framework: The International Process Research Consortium

    DTIC Science & Technology

    2006-12-01

    P-30: How should a process for collaborative development be formulated? Development at different companies ... requires some process for the actual collaboration. How should it be handled? P-31: How do we handle change? Requirements change during development ... source projects employ a single-site development model in which there is no large community of testers but rather a single-site small group

  7. Transformation from manufacturing process taxonomy to repair process taxonomy: a phenetic approach

    NASA Astrophysics Data System (ADS)

    Raza, Umair; Ahmad, Wasim; Khan, Atif

    2018-02-01

    Taxonomy is vital for knowledge sharing, a need highlighted by through-life engineering services and systems. This paper addresses that need through the development of a repair process taxonomy. A framework for the repair process taxonomy was developed and then implemented, and the importance of such a taxonomy is highlighted.

  8. Empirical evaluation of the Process Overview Measure for assessing situation awareness in process plants.

    PubMed

    Lau, Nathan; Jamieson, Greg A; Skraaning, Gyrd

    2016-03-01

    The Process Overview Measure is a query-based measure developed to assess operator situation awareness (SA) from monitoring process plants. A companion paper describes how the measure has been developed according to process plant properties and operator cognitive work. The Process Overview Measure demonstrated practicality, sensitivity, validity and reliability in two full-scope simulator experiments investigating dramatically different operational concepts. Practicality was assessed based on qualitative feedback of participants and researchers. The Process Overview Measure demonstrated sensitivity and validity by revealing significant effects of experimental manipulations that corroborated with other empirical results. The measure also demonstrated adequate inter-rater reliability and practicality for measuring SA in full-scope simulator settings based on data collected on process experts. Thus, full-scope simulator studies can employ the Process Overview Measure to reveal the impact of new control room technology and operational concepts on monitoring process plants. Practitioner Summary: The Process Overview Measure is a query-based measure that demonstrated practicality, sensitivity, validity and reliability for assessing operator situation awareness (SA) from monitoring process plants in representative settings.

  9. Badge Office Process Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haurykiewicz, John Paul; Dinehart, Timothy Grant; Parker, Robert Young

    2016-05-12

    The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.
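
    The report itself is not reproduced here, so the sketch below only illustrates the kind of queueing simulation such an analysis typically rests on: Poisson arrivals, exponential service times and a configurable number of badge stations, with mean wait time as the output. The arrival and service rates are made-up illustration values, not measured Badge Office data.

      import heapq
      import numpy as np

      def mean_wait(arrival_rate, service_rate, n_stations, n_customers=50_000, seed=0):
          rng = np.random.default_rng(seed)
          arrivals = np.cumsum(rng.exponential(1.0 / arrival_rate, n_customers))
          service = rng.exponential(1.0 / service_rate, n_customers)
          free_at = [0.0] * n_stations          # min-heap of times at which each station frees up
          heapq.heapify(free_at)
          waits = np.empty(n_customers)
          for i, (t, s) in enumerate(zip(arrivals, service)):
              start = max(t, heapq.heappop(free_at))    # wait until arrival and a free station
              waits[i] = start - t
              heapq.heappush(free_at, start + s)
          return waits.mean()

      for stations in (2, 3):
          w = mean_wait(arrival_rate=20.0, service_rate=12.0, n_stations=stations)  # rates per hour
          print(f"{stations} stations -> mean wait {60 * w:.1f} minutes")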

  10. The auditory basis of language impairments: temporal processing versus processing efficiency hypotheses.

    PubMed

    Hartley, Douglas E H; Hill, Penny R; Moore, David R

    2003-12-01

    Claims have been made that language-impaired children have deficits in processing rapidly presented or brief sensory information. These claims, known as the 'temporal processing hypothesis', are supported by demonstrations that language-impaired children have excess backward masking (BM). One explanation for these results is that BM is developmentally delayed in these children. However, little was known about how BM normally develops. Recently, we assessed BM in normally developing 6- and 8-year-old children and adults. Results showed that BM thresholds continue to improve over a comparatively protracted period (>10 years old). We also analysed reported deficits in BM in language-impaired and younger children, in terms of a model of temporal resolution. This analysis suggests that poor processing efficiency, rather than deficits in temporal resolution, can account for these results. This 'processing efficiency hypothesis' was recently tested in our laboratory. This experiment measured BM as a function of delays between the tone and the noise in children and adults. Results supported the processing efficiency hypothesis, and suggested that reduced processing efficiency alone could account for differences between adults and children. These findings provide a new perspective on the mechanisms underlying communication disorders, and imply that remediation strategies should be directed towards improving processing efficiency, not temporal resolution.

  11. Anodizing Process

    NASA Technical Reports Server (NTRS)

    1983-01-01

    This anodizing process traces its origin to the 1960's when Reynolds Metals Company, under contract with Goddard Space Flight Center, developed a multipurpose anodizing electrolyte (MAE) process to produce a hard protective finish for spacecraft aluminum. MAE produces a high-density, abrasion-resistant film prior to the coloring step, in which the pores of the film are impregnated with a metallic form of salt. Tru-Color product applications include building fronts, railing, curtain walls, doors and windows.

  12. Fuel gas conditioning process

    DOEpatents

    Lokhandwala, Kaaeid A.

    2000-01-01

    A process for conditioning natural gas containing C.sub.3+ hydrocarbons and/or acid gas, so that it can be used as combustion fuel to run gas-powered equipment, including compressors, in the gas field or the gas processing plant. Compared with prior art processes, the invention creates lesser quantities of low-pressure gas per unit volume of fuel gas produced. Optionally, the process can also produce an NGL product.

  13. Chemical Processing Manual

    NASA Technical Reports Server (NTRS)

    Beyerle, F. J.

    1972-01-01

    Chemical processes presented in this document include cleaning, pickling, surface finishes, chemical milling, plating, dry film lubricants, and polishing. All types of chemical processes applicable to aluminum, for example, are to be found in the aluminum alloy section. There is a separate section for each category of metallic alloy plus a section for non-metals, such as plastics. The refractories, super-alloys and titanium, are prime candidates for the space shuttle, therefore, the chemical processes applicable to these alloys are contained in individual sections of this manual.

  14. Right Hemisphere Metaphor Processing? Characterizing the Lateralization of Semantic Processes

    ERIC Educational Resources Information Center

    Schmidt, Gwen L.; DeBuse, Casey J.; Seger, Carol A.

    2007-01-01

    Previous laterality studies have implicated the right hemisphere in the processing of metaphors; however, it is not clear whether this result is due to metaphoricity per se or to another aspect of semantic processing. Three divided visual field experiments varied metaphorical and literal sentence familiarity. We found a right hemisphere advantage for…

  15. Feasibility of using continuous chromatography in downstream processing: Comparison of costs and product quality for a hybrid process vs. a conventional batch process.

    PubMed

    Ötes, Ozan; Flato, Hendrik; Winderl, Johannes; Hubbuch, Jürgen; Capito, Florian

    2017-10-10

    The protein A capture step is the main cost-driver in downstream processing, with high attrition costs, especially when protein A resin is not used to the end of its lifetime. Here we describe a feasibility study, transferring a batch downstream process to a hybrid process, aimed at replacing batch protein A capture chromatography with a continuous capture step while leaving the polishing steps unchanged, to minimize the process adaptations required compared to a batch process. 35 g of antibody were purified using the hybrid approach, resulting in comparable product quality and step yield compared to the batch process. Productivity for the protein A step could be increased by up to 420%, reducing buffer amounts by 30-40% and showing robustness for at least 48 h of continuous run time. Additionally, to enable its potential application in a clinical trial manufacturing environment, the cost of goods was compared for the protein A step between the hybrid process and the batch process, showing a 300% cost reduction, depending on processed volumes and batch cycles. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Gas-separation process

    DOEpatents

    Toy, Lora G.; Pinnau, Ingo; Baker, Richard W.

    1994-01-01

    A process for separating condensable organic components from gas streams. The process makes use of a membrane made from a polymer material that is glassy and that has an unusually high free volume within the polymer material.

  17. Natural Language Processing.

    ERIC Educational Resources Information Center

    Chowdhury, Gobinda G.

    2003-01-01

    Discusses issues related to natural language processing, including theoretical developments; natural language understanding; tools and techniques; natural language text processing systems; abstracting; information extraction; information retrieval; interfaces; software; Internet, Web, and digital library applications; machine translation for…

  18. Explicit and Implicit Processes Constitute the Fast and Slow Processes of Sensorimotor Learning.

    PubMed

    McDougle, Samuel D; Bond, Krista M; Taylor, Jordan A

    2015-07-01

    A popular model of human sensorimotor learning suggests that a fast process and a slow process work in parallel to produce the canonical learning curve (Smith et al., 2006). Recent evidence supports the subdivision of sensorimotor learning into explicit and implicit processes that simultaneously subserve task performance (Taylor et al., 2014). We set out to test whether these two accounts of learning processes are homologous. Using a recently developed method to assay explicit and implicit learning directly in a sensorimotor task, along with a computational modeling analysis, we show that the fast process closely resembles explicit learning and the slow process approximates implicit learning. In addition, we provide evidence for a subdivision of the slow/implicit process into distinct manifestations of motor memory. We conclude that the two-state model of motor learning is a close approximation of sensorimotor learning, but it is unable to describe adequately the various implicit learning operations that forge the learning curve. Our results suggest that a wider net be cast in the search for the putative psychological mechanisms and neural substrates underlying the multiplicity of processes involved in motor learning. Copyright © 2015 the authors 0270-6474/15/359568-12$15.00/0.

  19. Explicit and Implicit Processes Constitute the Fast and Slow Processes of Sensorimotor Learning

    PubMed Central

    McDougle, Samuel D.; Bond, Krista M.; Taylor, Jordan A.

    2015-01-01

    A popular model of human sensorimotor learning suggests that a fast process and a slow process work in parallel to produce the canonical learning curve (Smith et al., 2006). Recent evidence supports the subdivision of sensorimotor learning into explicit and implicit processes that simultaneously subserve task performance (Taylor et al., 2014). We set out to test whether these two accounts of learning processes are homologous. Using a recently developed method to assay explicit and implicit learning directly in a sensorimotor task, along with a computational modeling analysis, we show that the fast process closely resembles explicit learning and the slow process approximates implicit learning. In addition, we provide evidence for a subdivision of the slow/implicit process into distinct manifestations of motor memory. We conclude that the two-state model of motor learning is a close approximation of sensorimotor learning, but it is unable to describe adequately the various implicit learning operations that forge the learning curve. Our results suggest that a wider net be cast in the search for the putative psychological mechanisms and neural substrates underlying the multiplicity of processes involved in motor learning. PMID:26134640

  20. Cascading activation from lexical processing to letter-level processing in written word production.

    PubMed

    Buchwald, Adam; Falconer, Carolyn

    2014-01-01

    Descriptions of language production have identified processes involved in producing language and the presence and type of interaction among those processes. In the case of spoken language production, consensus has emerged that there is interaction among lexical selection processes and phoneme-level processing. This issue has received less attention in written language production. In this paper, we present a novel analysis of the writing-to-dictation performance of an individual with acquired dysgraphia revealing cascading activation from lexical processing to letter-level processing. The individual produced frequent lexical-semantic errors (e.g., chipmunk → SQUIRREL) as well as letter errors (e.g., inhibit → INBHITI) and had a profile consistent with impairment affecting both lexical processing and letter-level processing. The presence of cascading activation is suggested by lower letter accuracy on words that are more weakly activated during lexical selection than on those that are more strongly activated. We operationalize weakly activated lexemes as those lexemes that are produced as lexical-semantic errors (e.g., lethal in deadly → LETAHL) compared to strongly activated lexemes where the intended target word (e.g., lethal) is the lexeme selected for production.

  1. Spitzer Telemetry Processing System

    NASA Technical Reports Server (NTRS)

    Stanboli, Alice; Martinez, Elmain M.; McAuley, James M.

    2013-01-01

    The Spitzer Telemetry Processing System (SirtfTlmProc) was designed to address objectives of JPL's Multi-mission Image Processing Lab (MIPL) in processing spacecraft telemetry and distributing the resulting data to the science community. To minimize costs and maximize operability, the software design focused on automated error recovery, performance, and information management. The system processes telemetry from the Spitzer spacecraft and delivers Level 0 products to the Spitzer Science Center. SirtfTlmProc is a unique system with automated error notification and recovery, with a real-time continuous service that can go quiescent after periods of inactivity. The software can process 2 GB of telemetry and deliver Level 0 science products to the end user in four hours. It provides analysis tools so the operator can manage the system and troubleshoot problems. It automates telemetry processing in order to reduce staffing costs.

  2. An improved plating process

    NASA Technical Reports Server (NTRS)

    Askew, John C.

    1994-01-01

    An alternative to the immersion process for the electrodeposition of chromium from aqueous solutions on the inside diameter (ID) of long tubes is described. The Vessel Plating Process eliminates the need for deep processing tanks, large volumes of solutions, and associated safety and environmental concerns. Vessel Plating allows the process to be monitored and controlled by computer thus increasing reliability, flexibility and quality. Elimination of the trivalent chromium accumulation normally associated with ID plating is intrinsic to the Vessel Plating Process. The construction and operation of a prototype Vessel Plating Facility with emphasis on materials of construction, engineered and operational safety and a unique system for rinse water recovery are described.

  3. Analyzing Discourse Processing Using a Simple Natural Language Processing Tool

    ERIC Educational Resources Information Center

    Crossley, Scott A.; Allen, Laura K.; Kyle, Kristopher; McNamara, Danielle S.

    2014-01-01

    Natural language processing (NLP) provides a powerful approach for discourse processing researchers. However, there remains a notable degree of hesitation by some researchers to consider using NLP, at least on their own. The purpose of this article is to introduce and make available a "simple" NLP (SiNLP) tool. The overarching goal of…

  4. Haines Board: A Review of Army School System for Officer Education and Training. USAIS Position. Volume 1 and Factual Data. Volume 2

    DTIC Science & Technology

    1965-07-01

    currently not suitable for general publication, but which, because of their implication in the future, merit the attention of senior Infantry...OC selection and evaluation, suggestions on departmental organization, evaluation of officer students, course prerequisites, attrition

  5. Precision Heating Process

    NASA Technical Reports Server (NTRS)

    1992-01-01

    A heat sealing process was developed by SEBRA based on technology that originated in work with NASA's Jet Propulsion Laboratory. The project involved connecting and transferring blood and fluids between sterile plastic containers while maintaining a closed system. SEBRA markets the PIRF Process to manufacturers of medical catheters. It is a precisely controlled method of heating thermoplastic materials in a mold to form or weld catheters and other products. The process offers advantages in fast, precise welding or shape forming of catheters as well as applications in a variety of other industries.

  6. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    NASA Astrophysics Data System (ADS)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.
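
    A Petri net is, at its core, places, transitions and a token marking; the miniature sketch below shows the firing rule on an invented two-step "check stock, then ship" process. It is far simpler than the XML nets described above, but it conveys the execution semantics that such process models build on.

      class PetriNet:
          def __init__(self, marking, transitions):
              # marking: tokens per place; transitions: name -> (input places, output places)
              self.marking = dict(marking)
              self.transitions = transitions

          def enabled(self, name):
              inputs, _ = self.transitions[name]
              return all(self.marking.get(p, 0) > 0 for p in inputs)

          def fire(self, name):
              if not self.enabled(name):
                  raise ValueError(f"transition {name!r} is not enabled")
              inputs, outputs = self.transitions[name]
              for p in inputs:
                  self.marking[p] -= 1
              for p in outputs:
                  self.marking[p] = self.marking.get(p, 0) + 1

      net = PetriNet(
          marking={"order_received": 1},
          transitions={
              "check_stock": (["order_received"], ["stock_checked"]),
              "ship_order": (["stock_checked"], ["order_shipped"]),
          },
      )
      net.fire("check_stock")
      net.fire("ship_order")
      print(net.marking)   # {'order_received': 0, 'stock_checked': 0, 'order_shipped': 1}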

  7. CMOS/SOS processing

    NASA Technical Reports Server (NTRS)

    Ramondetta, P.

    1980-01-01

    Report describes processes used in making complementary - metal - oxide - semiconductor/silicon-on-sapphire (CMOS/SOS) integrated circuits. Report lists processing steps ranging from initial preparation of sapphire wafers to final mapping of "good" and "bad" circuits on a wafer.

  8. Drug Development Process

    MedlinePlus


  9. Process based analysis of manually controlled drilling processes for bone

    NASA Astrophysics Data System (ADS)

    Teicher, Uwe; Achour, Anas Ben; Nestler, Andreas; Brosius, Alexander; Lauer, Günter

    2018-05-01

    Drilling is part of the standard machining repertoire for medical applications. This machining cycle, which is usually a multi-stage process, generates the geometric element for the subsequent integration of implants, which are screwed into the bone in later processes. In addition to the form, shape and position of the generated drill hole, it is also necessary to use a technology that ensures an operation with minimal damage. A surface damaged by excessive mechanical and thermal energy input shows a deterioration in the healing capacity of implants and represents a structure prone to complications from inflammatory reactions. The resulting loads are influenced by the material properties of the bone, the technology used and the tool properties. An important aspect of the process analysis is the fact that machining of bone is in most cases a manual process that depends mainly on the skills of the operator. This includes, among other things, the machining time for the production of a drill hole, since manual drilling is a force-controlled process. Experimental work was carried out on the bone of a porcine mandible in order to investigate the interrelation of the applied load during drilling. It can be shown that the load application can be subdivided according to the working feed direction. The entire drilling process thus consists of several time domains, which can be divided into the geometry-generating feed motion and a retraction movement of the tool. It has been shown that the removal of the tool from the drill hole has a significant influence on the mechanical load input. This fact is proven in detail by a new evaluation methodology. The causes of this characteristic can also be identified, as well as possible ways of reducing the load input.

  10. Gas-separation process

    DOEpatents

    Toy, L.G.; Pinnau, I.; Baker, R.W.

    1994-01-25

    A process is described for separating condensable organic components from gas streams. The process makes use of a membrane made from a polymer material that is glassy and that has an unusually high free volume within the polymer material. 6 figures.

  11. News: Process intensification

    EPA Science Inventory

    Conservation of materials and energy is a major objective of the philosophy of sustainability. Where production processes can be intensified to assist these objectives, significant advances have been developed to assist conservation as well as cost. Process intensification (PI) h...

  12. What Is Group Process?: Integrating Process Work into Psychoeducational Groups

    ERIC Educational Resources Information Center

    Mills, Bethany; McBride, Dawn Lorraine

    2016-01-01

    Process work has long been a tenet of successful counseling outcomes. However, there is little literature available that focuses on how to best integrate process work into group settings--particularly psychoeducational groups that are content heavy and most often utilized in a school setting. In this article, the authors provide an overview of the…

  13. Processes for metal extraction

    NASA Technical Reports Server (NTRS)

    Bowersox, David F.

    1992-01-01

    This report describes the processing of plutonium at Los Alamos National Laboratory (LANL), and operation illustrating concepts that may be applicable to the processing of lunar materials. The toxic nature of plutonium requires a highly closed system for processing lunar surface materials.

  14. EARSEC SAR processing system

    NASA Astrophysics Data System (ADS)

    Protheroe, Mark; Sloggett, David R.; Sieber, Alois J.

    1994-12-01

    Traditionally, the production of high quality Synthetic Aperture Radar imagery has been an area where a potential user would have to expend large amounts of money either on the bespoke development of a processing chain dedicated to his requirements or on the purchase of a dedicated hardware platform adapted using accelerator boards and enhanced memory management. Whichever option the user adopted, there were limitations based on the desire for a realistic throughput in data load and time. The user had a choice, made early in the purchase, between a system that adopted innovative algorithmic manipulation to limit the processing time, and the purchase of expensive hardware. The former limits the quality of the product, while the latter excludes the user from any visibility into the processing chain. Clearly there was a need for a SAR processing architecture that gave the user a choice of the methodology to be adopted for a particular processing sequence, allowing him to decide on either a quick (lower quality) product or a detailed but slower (high quality) product, without having to change the algorithmic base of his processor or the hardware platform. The European Commission, through the Advanced Techniques unit of the Joint Research Centre (JRC) Institute for Remote Sensing at Ispra in Italy, realizing the limitations of current processing abilities, initiated its own program to build airborne SAR and Electro-Optical (EO) sensor systems. This program is called the European Airborne Remote Sensing Capabilities (EARSEC) program. This paper describes the processing system developed for the airborne SAR sensor system. The paper considers the requirements for the system and the design of the EARSEC Airborne SAR Processing System. It highlights the development of an open SAR processing architecture where users have full access to intermediate products that arise from each of the major processing stages. It also describes the main processing stages in the overall

  15. Reasoning with case histories of process knowledge for efficient process development

    NASA Technical Reports Server (NTRS)

    Bharwani, Seraj S.; Walls, Joe T.; Jackson, Michael E.

    1988-01-01

    The significance of compiling case histories of empirical process knowledge, and the role of such histories in improving the efficiency of manufacturing process development, are discussed in this paper. Methods of representing important investigations as cases and using the information from such cases to eliminate redundant empirical investigations in analogous process development situations are also discussed. A system is proposed that uses such methods to capture the problem-solving framework of the application domain. A conceptual design of the system is presented and discussed.
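
    One simple way to reuse such case histories is nearest-neighbour retrieval over process parameters; the sketch below illustrates that idea with an invented case base and feature set, which is not the representation proposed in the paper.

      import numpy as np

      # Hypothetical cases: (temperature C, pressure bar, cure time min) -> recorded outcome.
      case_base = {
          "case_A": ((180.0, 5.0, 30.0), "acceptable porosity"),
          "case_B": ((200.0, 7.0, 20.0), "excess porosity"),
          "case_C": ((170.0, 4.0, 45.0), "good surface finish"),
      }

      def retrieve(query, cases):
          params = np.array([p for p, _ in cases.values()])
          scale = params.max(axis=0) - params.min(axis=0)       # normalise each parameter range
          dists = np.linalg.norm((params - np.asarray(query)) / scale, axis=1)
          return list(cases)[int(np.argmin(dists))]

      best = retrieve((185.0, 5.5, 28.0), case_base)
      print("most similar prior investigation:", best, "->", case_base[best][1])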

  16. Chemical processing of glasses

    NASA Astrophysics Data System (ADS)

    Laine, Richard M.

    1990-11-01

    The development of chemical processing methods for the fabrication of glass and ceramic shapes for photonic applications is frequently Edisonian in nature. In part, this is because the numerous variables that must be optimized to obtain a given material with a specific shape and particular properties cannot be readily defined based on fundamental principles. In part, the problems arise because the basic chemistry of common chemical processing systems has not been fully delineated. The purpose of this paper is to provide an overview of the basic chemical problems associated with chemical processing. The emphasis will be on sol-gel processing, a major subset of chemical processing. Two alternate approaches to chemical processing of glasses are also briefly discussed. One approach concerns the use of bimetallic alkoxide oligomers and polymers as potential precursors to multimetallic glasses. The second approach describes the utility of metal carboxylate precursors to multimetallic glasses.

  17. 5 CFR 1653.13 - Processing legal processes.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... complete, a legal process must contain all pages and attachments; it must also provide (or be accompanied... no further action will be taken with respect to the document. (f) As soon as practicable after...

  18. 5 CFR 1653.13 - Processing legal processes.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... complete, a legal process must contain all pages and attachments; it must also provide (or be accompanied... no further action will be taken with respect to the document. (f) As soon as practicable after...

  19. Baldovin-Stella stochastic volatility process and Wiener process mixtures

    NASA Astrophysics Data System (ADS)

    Peirano, P. P.; Challet, D.

    2012-08-01

    Starting from inhomogeneous time scaling and linear decorrelation between successive price returns, Baldovin and Stella recently proposed a powerful and consistent way to build a model describing the time evolution of a financial index. We first make it fully explicit by using Student distributions instead of power law-truncated Lévy distributions and show that the analytic tractability of the model extends to the larger class of symmetric generalized hyperbolic distributions and provide a full computation of their multivariate characteristic functions; more generally, we show that the stochastic processes arising in this framework are representable as mixtures of Wiener processes. The basic Baldovin and Stella model, while mimicking well volatility relaxation phenomena such as the Omori law, fails to reproduce other stylized facts such as the leverage effect or some time reversal asymmetries. We discuss how to modify the dynamics of this process in order to reproduce real data more accurately.
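
    The "mixture of Wiener processes" view can be illustrated with a scale mixture of Gaussians: drawing a random variance for each step and then a conditionally Gaussian return yields Student-t distributed returns. The degrees of freedom and sample size below are arbitrary demo values, not a calibration of the Baldovin-Stella model.

      import numpy as np

      rng = np.random.default_rng(2)
      nu, n_steps = 6.0, 100_000

      # Inverse-gamma mixing of the variance, then conditionally Gaussian returns.
      mixing = nu / rng.chisquare(nu, size=n_steps)
      returns = rng.normal(size=n_steps) * np.sqrt(mixing)     # unconditionally Student-t with nu d.o.f.

      kurt = np.mean(returns**4) / np.mean(returns**2) ** 2 - 3.0
      print(f"sample excess kurtosis: {kurt:.2f} (a Student-t with nu=6 has excess kurtosis 3)")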

  20. Serial Learning Process: Test of Chaining, Position, and Dual-Process Hypotheses

    ERIC Educational Resources Information Center

    Giurintano, S. L.

    1973-01-01

    The chaining, position, and dual-process hypotheses of serial learning (SL) as well as serial recall, reordering, and relearning of paired-associate learning were examined to establish learning patterns. Results provide evidence for dual-process hypothesis. (DS)

  1. Elemental sulfur recovery process

    DOEpatents

    Flytzani-Stephanopoulos, M.; Zhicheng Hu.

    1993-09-07

    An improved catalytic reduction process for the direct recovery of elemental sulfur from various SO2-containing industrial gas streams. The catalytic process provides combined high activity and selectivity for the reduction of SO2 to elemental sulfur product with carbon monoxide or other reducing gases. The reaction of sulfur dioxide and reducing gas takes place over certain catalyst formulations based on cerium oxide. The process is a single-stage, catalytic sulfur recovery process in conjunction with regenerators, such as those used in dry, regenerative flue gas desulfurization or other processes, involving direct reduction of the SO2 in the regenerator off gas stream to elemental sulfur in the presence of a catalyst. 4 figures.

  2. Elemental sulfur recovery process

    DOEpatents

    Flytzani-Stephanopoulos, Maria; Hu, Zhicheng

    1993-01-01

    An improved catalytic reduction process for the direct recovery of elemental sulfur from various SO2-containing industrial gas streams. The catalytic process provides combined high activity and selectivity for the reduction of SO2 to elemental sulfur product with carbon monoxide or other reducing gases. The reaction of sulfur dioxide and reducing gas takes place over certain catalyst formulations based on cerium oxide. The process is a single-stage, catalytic sulfur recovery process in conjunction with regenerators, such as those used in dry, regenerative flue gas desulfurization or other processes, involving direct reduction of the SO2 in the regenerator off gas stream to elemental sulfur in the presence of a catalyst.

  3. Processed and ultra-processed foods are associated with lower-quality nutrient profiles in children from Colombia.

    PubMed

    Cornwell, Brittany; Villamor, Eduardo; Mora-Plazas, Mercedes; Marin, Constanza; Monteiro, Carlos A; Baylin, Ana

    2018-01-01

    To determine if processed and ultra-processed foods consumed by children in Colombia are associated with lower-quality nutrition profiles than less processed foods. We obtained information on sociodemographic and anthropometric variables and dietary information through dietary records and 24 h recalls from a convenience sample of the Bogotá School Children Cohort. Foods were classified into three categories: (i) unprocessed and minimally processed foods, (ii) processed culinary ingredients and (iii) processed and ultra-processed foods. We also examined the combination of unprocessed foods and processed culinary ingredients. Representative sample of children from low- to middle-income families in Bogotá, Colombia. Children aged 5-12 years in 2011 Bogotá School Children Cohort. We found that processed and ultra-processed foods are of lower dietary quality in general. Nutrients that were lower in processed and ultra-processed foods following adjustment for total energy intake included: n-3 PUFA, vitamins A, B12, C and E, Ca and Zn. Nutrients that were higher in energy-adjusted processed and ultra-processed foods compared with unprocessed foods included: Na, sugar and trans-fatty acids, although we also found that some healthy nutrients, including folate and Fe, were higher in processed and ultra-processed foods compared with unprocessed and minimally processed foods. Processed and ultra-processed foods generally have unhealthy nutrition profiles. Our findings suggest the categorization of foods based on processing characteristics is promising for understanding the influence of food processing on children's dietary quality. More studies accounting for the type and degree of food processing are needed.

  4. Monitoring Process Effectiveness

    EPA Science Inventory

    Treatment of municipal sludges to produce biosolids which meet federal and/or state requirements for land application requires process monitoring. The goal of process monitoring is to produce biosolids of consistent and reliable quality. In its simplest form, for Class B treatme...

  5. Food processing and allergenicity.

    PubMed

    Verhoeckx, Kitty C M; Vissers, Yvonne M; Baumert, Joseph L; Faludi, Roland; Feys, Marcel; Flanagan, Simon; Herouet-Guicheney, Corinne; Holzhauser, Thomas; Shimojo, Ryo; van der Bolt, Nieke; Wichers, Harry; Kimber, Ian

    2015-06-01

    Food processing can have many beneficial effects. However, processing may also alter the allergenic properties of food proteins. A wide variety of processing methods is available and their use depends largely on the food to be processed. In this review the impact of processing (heat and non-heat treatment) on the allergenic potential of proteins, and on the antigenic (IgG-binding) and allergenic (IgE-binding) properties of proteins has been considered. A variety of allergenic foods (peanuts, tree nuts, cows' milk, hens' eggs, soy, wheat and mustard) have been reviewed. The overall conclusion drawn is that processing does not completely abolish the allergenic potential of allergens. Currently, only fermentation and hydrolysis may have potential to reduce allergenicity to such an extent that symptoms will not be elicited, while other methods might be promising but need more data. Literature on the effect of processing on allergenic potential and the ability to induce sensitisation is scarce. This is an important issue since processing may impact on the ability of proteins to cause the acquisition of allergic sensitisation, and the subject should be a focus of future research. Also, there remains a need to develop robust and integrated methods for the risk assessment of food allergenicity. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  6. Space processing economics

    NASA Technical Reports Server (NTRS)

    Bredt, J. H.

    1974-01-01

    Two types of space processing operations may be considered economically justified; they are manufacturing operations that make profits and experiment operations that provide needed applied research results at lower costs than those of alternative methods. Some examples from the Skylab experiments suggest that applied research should become cost effective soon after the space shuttle and Spacelab become operational. In space manufacturing, the total cost of space operations required to process materials must be repaid by the value added to the materials by the processing. Accurate estimates of profitability are not yet possible because shuttle operational costs are not firmly established and the markets for future products are difficult to estimate. However, approximate calculations show that semiconductor products and biological preparations may be processed on a scale consistent with market requirements and at costs that are at least compatible with profitability using the Shuttle/Spacelab system.

  7. FLUORINATION PROCESS

    DOEpatents

    McMillan, T.S.

    1957-10-29

    A process for the fluorination of uranium metal is described. It is known that uranium will react with liquid chlorine trifluoride but the reaction proceeds at a slow rate. However, a mixture of a halogen trifluoride together with hydrogen fluoride reacts with uranium at a significantly faster rate than does a halogen trifluoride alone. Bromine trifluoride is suitable for use in the process, but chlorine trifluoride is preferred. Particularly suitable is a mixture of ClF3 and HF having a mole ratio (moles

  8. A DMAIC approach for process capability improvement an engine crankshaft manufacturing process

    NASA Astrophysics Data System (ADS)

    Sharma, G. V. S. S.; Rao, P. Srinivasa

    2014-05-01

    The define-measure-analyze-improve-control (DMAIC) approach comprises five strata: define, measure, analyze, improve and control. It is a systematic approach for reducing deviations and improving the capability levels of manufacturing processes. The present work elaborates on the DMAIC approach applied to reducing process variation in the stub-end-hole boring operation in crankshaft manufacture. This statistical process control study starts with selection of the critical-to-quality (CTQ) characteristic in the define stratum. The next stratum constitutes the collection of dimensional measurement data for the identified CTQ characteristic. This is followed by the analysis and improvement strata, where quality control tools such as the Ishikawa diagram, physical mechanism analysis, failure modes and effects analysis and analysis of variance are applied. Finally, process monitoring charts are deployed at the workplace for regular monitoring and control of the concerned CTQ characteristic. By adopting the DMAIC approach, the standard deviation was reduced from 0.003 to 0.002. The process potential capability index (Cp) improved from 1.29 to 2.02 and the process performance capability index (Cpk) improved from 0.32 to 1.45.
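
    The capability indices quoted above follow directly from the sample mean and standard deviation against the specification limits; the sketch below shows that arithmetic on simulated measurements, with invented specification limits rather than the study's actual drawing tolerances.

      import numpy as np

      rng = np.random.default_rng(3)
      lsl, usl = 41.975, 42.025          # hypothetical specification limits, mm
      diameters = rng.normal(loc=42.004, scale=0.004, size=200)   # simulated bore measurements

      mu, sigma = diameters.mean(), diameters.std(ddof=1)
      cp = (usl - lsl) / (6 * sigma)                 # potential capability
      cpk = min(usl - mu, mu - lsl) / (3 * sigma)    # performance capability (accounts for centring)
      print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")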

  9. Process control systems: integrated for future process technologies

    NASA Astrophysics Data System (ADS)

    Botros, Youssry; Hajj, Hazem M.

    2003-06-01

    Process Control Systems (PCS) are becoming more crucial to the success of Integrated Circuit makers due to their direct impact on product quality, cost, and Fab output. The primary objective of PCS is to minimize variability by detecting and correcting non optimal performance. Current PCS implementations are considered disparate, where each PCS application is designed, deployed and supported separately. Each implementation targets a specific area of control such as equipment performance, wafer manufacturing, and process health monitoring. With Intel entering the nanometer technology era, tighter process specifications are required for higher yields and lower cost. This requires areas of control to be tightly coupled and integrated to achieve the optimal performance. This requirement can be achieved via consistent design and deployment of the integrated PCS. PCS integration will result in several benefits such as leveraging commonalities, avoiding redundancy, and facilitating sharing between implementations. This paper will address PCS implementations and focus on benefits and requirements of the integrated PCS. Intel integrated PCS Architecture will be then presented and its components will be briefly discussed. Finally, industry direction and efforts to standardize PCS interfaces that enable PCS integration will be presented.

  10. Generic Health Management: A System Engineering Process Handbook Overview and Process

    NASA Technical Reports Server (NTRS)

    Wilson, Moses Lee; Spruill, Jim; Hong, Yin Paw

    1995-01-01

    Health Management, a System Engineering Process, is one of those processes-techniques-and-technologies used to define, design, analyze, build, verify, and operate a system from the viewpoint of preventing, or minimizing, the effects of failure or degradation. It supports all ground and flight elements during manufacturing, refurbishment, integration, and operation through combined use of hardware, software, and personnel. This document will integrate Health Management Processes (six phases) into five phases in such a manner that it is never a stand alone task/effort which separately defines independent work functions.

  11. Metallurgical processing: A compilation

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The items in this compilation, all relating to metallurgical processing, are presented in two sections. The first section includes processes which are general in scope and applicable to a variety of metals or alloys. The second describes the processes that concern specific metals and their alloys.

  12. Comprehension Processes in Reading.

    ERIC Educational Resources Information Center

    Balota, D. A., Ed.; And Others

    Focusing on the process of reading comprehension, this book contains chapters on some central topics relevant to understanding the processes associated with comprehending text. The articles and their authors are as follows: (1) "Comprehension Processes: Introduction" (K. Rayner); (2) "The Role of Meaning in Word Recognition"…

  13. Change Processes in Organization.

    ERIC Educational Resources Information Center

    1998

    This document contains four papers from a symposium on change processes in organizations. "Mid-stream Corrections: Decisions Leaders Make during Organizational Change Processes" (David W. Frantz) analyzes three organizational leaders to determine whether and how they take corrective actions or adapt their decision-making processes when…

  14. Global-local processing relates to spatial and verbal processing: implications for sex differences in cognition.

    PubMed

    Pletzer, Belinda; Scheuringer, Andrea; Scherndl, Thomas

    2017-09-05

    Sex differences have been reported for a variety of cognitive tasks and related to the use of different cognitive processing styles in men and women. It was recently argued that these processing styles share some characteristics across tasks, i.e. male approaches are oriented towards holistic stimulus aspects and female approaches are oriented towards stimulus details. In that respect, sex-dependent cognitive processing styles share similarities with attentional global-local processing. A direct relationship between cognitive processing and global-local processing has, however, not been previously established. In the present study, 49 men and 44 women completed a Navon paradigm and a Kimchi Palmer task as well as a navigation task and a verbal fluency task, with the goal of relating the global advantage (GA) effect, as a measure of global processing, to holistic processing styles in both tasks. Indeed, participants with larger GA effects displayed more holistic processing during spatial navigation and phonemic fluency. However, the relationship to cognitive processing styles was modulated by the specific condition of the Navon paradigm, as well as the sex of participants. Thus, different types of global-local processing play different roles for cognitive processing in men and women.

  15. Real-time radar signal processing using GPGPU (general-purpose graphic processing unit)

    NASA Astrophysics Data System (ADS)

    Kong, Fanxing; Zhang, Yan Rockee; Cai, Jingxiao; Palmer, Robert D.

    2016-05-01

    This study introduces a practical approach to developing a real-time signal processing chain for general phased array radar on NVIDIA GPUs (Graphical Processing Units) using CUDA (Compute Unified Device Architecture) libraries such as cuBLAS and cuFFT, which are adopted from open source libraries and optimized for NVIDIA GPUs. The processed results are rigorously verified against those from the CPUs. Performance, benchmarked as computation time for various input data cube sizes, is compared across GPUs and CPUs. Through the analysis, it is demonstrated that GPGPU (general-purpose GPU) real-time processing of array radar data is possible with relatively low-cost commercial GPUs.
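
    As a CPU reference for one typical stage of such a chain, the NumPy sketch below performs FFT-based pulse compression (matched filtering) on a simulated received signal; on the GPU the same FFTs would be handed to cuFFT. The chirp parameters and the point-target scene are illustrative assumptions, not the radar configuration used in the study.

      import numpy as np

      fs, pulse_len, bandwidth = 10e6, 20e-6, 5e6            # sample rate, pulse length, chirp bandwidth
      t = np.arange(int(fs * pulse_len)) / fs
      chirp = np.exp(1j * np.pi * (bandwidth / pulse_len) * t**2)   # linear-FM reference pulse

      # Received signal: two point targets at different delays plus complex noise.
      rng = np.random.default_rng(4)
      rx = np.zeros(4096, dtype=complex)
      for delay, amp in ((500, 1.0), (1500, 0.3)):
          rx[delay:delay + chirp.size] += amp * chirp
      rx += 0.05 * (rng.standard_normal(rx.size) + 1j * rng.standard_normal(rx.size))

      # Pulse compression: frequency-domain correlation with the reference chirp.
      n = rx.size + chirp.size - 1
      compressed = np.fft.ifft(np.fft.fft(rx, n) * np.conj(np.fft.fft(chirp, n)))
      print("strongest return at sample", int(np.argmax(np.abs(compressed))))   # ~500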

  16. [Definition and stabilization of processes II. Clinical Processes in a Urology Department].

    PubMed

    Pascual, Carlos; Luján, Marcos; Mora, José Ramón; Diz, Manuel Ramón; Martín, Carlos; López, María Carmen

    2015-01-01

    New models in clinical management seek a clinical practice based on quality, efficacy and efficiency, avoiding variability and improvisation. In this paper we develop one of the most frequent clinical processes in our speciality, the process based on DRG 311, transurethral procedures without complications. Along the way we describe its components: the stabilization form, the clinical trajectory, cost calculation and, finally, the process flowchart.

  17. Chemical processing of lunar materials

    NASA Technical Reports Server (NTRS)

    Criswell, D. R.; Waldron, R. D.

    1979-01-01

    The paper highlights recent work on the general problem of processing lunar materials. The discussion covers lunar source materials, refined products, motivations for using lunar materials, and general considerations for a lunar or space processing plant. Attention is given to chemical processing through various techniques, including electrolysis of molten silicates, carbothermic/silicothermic reduction, carbo-chlorination process, NaOH basic-leach process, and HF acid-leach process. Several options for chemical processing of lunar materials are well within the state of the art of applied chemistry and chemical engineering to begin development based on the extensive knowledge of lunar materials.

  18. Process evaluation of discharge planning implementation in healthcare using normalization process theory.

    PubMed

    Nordmark, Sofi; Zingmark, Karin; Lindberg, Inger

    2016-04-27

    Discharge planning is a care process that aims to secure the transfer of care for the patient at the transition from home to hospital and back home. Information exchange and collaboration between care providers are essential, but deficits are common. A wide range of initiatives to improve the discharge planning process have been developed and implemented over the past three decades. However, there are still high rates of reported medical errors and adverse events related to failures in discharge planning. Using theoretical frameworks such as Normalization Process Theory (NPT) can support evaluations of complex interventions and processes in healthcare. The aim of this study was to explore the embedding and integration of the discharge planning process (DPP) from the perspective of registered nurses, district nurses and homecare organizers. The study design was explorative, using the NPT as a framework to explore the embedding and integration of the DPP. Data consisted of written documentation from workshops with staff, registered adverse events and system failures, a web-based survey, and individual interviews with staff. Using the NPT as a framework to explore the embedding and integration of discharge planning after 10 years in use showed that the staff had reached a consensus of opinion on what the process was (coherence) and how they evaluated the process (reflexive monitoring). However, they had not reached a consensus of opinion on who performed the process (cognitive participation) and how it was performed (collective action). This could be interpreted as indicating that the process had not become normalized in daily practice. The results show the necessity of observing the implementation of old practices, to better understand the needs of new ones, before developing and implementing new practices or supportive tools within healthcare, in order to reach the aim of development and to accomplish sustainable implementation. The NPT offers a generalizable framework for analysis, which can explain and shape the

  19. Illuminating e-beam processing

    USDA-ARS?s Scientific Manuscript database

    This month's Processing column will explore electronic beam (e-beam) processing. E-beam processing uses a low energy form of irradiation and has emerged as a highly promising treatment for both food safety and quarantine purposes. It is also used to extend food shelf life. This column will review...

  20. Lyophilization process design space.

    PubMed

    Patel, Sajal Manubhai; Pikal, Michael J

    2013-11-01

    The application of key elements of quality by design (QbD), such as risk assessment, process analytical technology, and design space, is discussed widely as it relates to freeze-drying process design and development. However, this commentary focuses on constructing the Design and Control Space, particularly for the primary drying step of the freeze-drying process. Also, practical applications and considerations of claiming a process Design Space under the QbD paradigm have been discussed. © 2013 Wiley Periodicals, Inc. and the American Pharmacists Association.

  1. Advanced Hydrogen Liquefaction Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz, Joseph; Kromer, Brian; Neu, Ben

    2011-09-28

    The project identified and quantified ways to reduce the cost of hydrogen liquefaction, and reduce the cost of hydrogen distribution. The goal was to reduce the power consumption by 20% and then to reduce the capital cost. Optimizing the process, improving process equipment, and improving ortho-para conversion significantly reduced the power consumption of liquefaction, but by less than 20%. Because the efficiency improvement was less than the target, the program was stopped before the capital cost was addressed. These efficiency improvements could provide a benefit to the public to improve the design of future hydrogen liquefiers. The project increased the understanding of hydrogen liquefaction by modeling different processes and thoroughly examining ortho-para separation and conversion. The process modeling provided a benefit to the public because the project incorporated para hydrogen into the process modeling software, so liquefaction processes can be modeled more accurately than using only normal hydrogen. Adding catalyst to the first heat exchanger, a simple method to reduce liquefaction power, was identified, analyzed, and quantified. The demonstrated performance of ortho-para separation is sufficient for at least one identified process concept to show reduced power cost when compared to hydrogen liquefaction processes using conventional ortho-para conversion. The impact of improved ortho-para conversion can be significant because ortho-para conversion uses about 20-25% of the total liquefaction power, but performance improvement is necessary to realize a substantial benefit. Most of the energy used in liquefaction is for gas compression. Improvements in hydrogen compression will have a significant impact on overall liquefier efficiency. Improvements to turbines, heat exchangers, and other process equipment will have less impact.

  2. EDITORIAL: Industrial Process Tomography

    NASA Astrophysics Data System (ADS)

    Anton Johansen, Geir; Wang, Mi

    2008-09-01

    There has been tremendous development within measurement science and technology over the past couple of decades. New sensor technologies and compact versatile signal recovery electronics are continuously expanding the limits of what can be measured and the accuracy with which this can be done. Miniaturization of sensors and the use of nanotechnology push these limits further. Also, thanks to powerful and cost-effective computer systems, sophisticated measurement and reconstruction algorithms previously only accessible in advanced laboratories are now available for in situ online measurement systems. The process industries increasingly require more process-related information, motivated by key issues such as improved process control, process utilization and process yields, ultimately driven by cost-effectiveness, quality assurance, environmental and safety demands. Industrial process tomography methods have taken advantage of the general progress in measurement science, and aim at providing more information, both quantitatively and qualitatively, on multiphase systems and their dynamics. The typical approach for such systems has been to carry out one local or bulk measurement and assume that this is representative of the whole system. In some cases, this is sufficient. However, there are many complex systems where the component distribution varies continuously and often unpredictably in space and time. The foundation of industrial tomography is to conduct several measurements around the periphery of a multiphase process, and use these measurements to unravel the cross-sectional distribution of the process components in time and space. This information is used in the design and optimization of industrial processes and process equipment, and also to improve the accuracy of multiphase system measurements in general. In this issue we are proud to present a selection of the 145 papers presented at the 5th World Congress on Industrial Process Tomography in Bergen
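
    To make the reconstruction idea concrete, the sketch below sets up a toy linear tomography problem: a 2x2 cross-section is recovered from four simulated line-integral measurements taken around its periphery. The grid size, ray layout, and measurement values are invented for illustration and are not drawn from any system discussed at the congress.

    ```python
    import numpy as np

    # Toy tomography example: recover a 2x2 cross-section from 4 line integrals.
    # Unknowns x = [c00, c01, c10, c11] (e.g., local gas fraction in each cell).
    # Each measurement sums the cells crossed by one "ray":
    #   two horizontal rays (rows) and two vertical rays (columns).
    A = np.array([
        [1, 1, 0, 0],   # ray across the top row
        [0, 0, 1, 1],   # ray across the bottom row
        [1, 0, 1, 0],   # ray down the left column
        [0, 1, 0, 1],   # ray down the right column
    ], dtype=float)

    true_image = np.array([0.2, 0.8, 0.4, 0.1])       # hypothetical distribution
    b = A @ true_image + np.random.normal(0, 0.01, 4)  # noisy boundary measurements

    # Least-squares reconstruction; with only four rays the system is rank-deficient,
    # so lstsq returns the minimum-norm solution (real process tomography adds more
    # projections and regularization).
    x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(x_hat.reshape(2, 2))
    ```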

  3. Using Unified Modelling Language (UML) as a process-modelling technique for clinical-research process improvement.

    PubMed

    Kumarapeli, P; De Lusignan, S; Ellis, T; Jones, B

    2007-03-01

    The Primary Care Data Quality programme (PCDQ) is a quality-improvement programme which processes routinely collected general practice computer data. Patient data collected from a wide range of different brands of clinical computer systems are aggregated, processed, and fed back to practices in an educational context to improve the quality of care. Process modelling is a well-established approach used to gain understanding, support systematic appraisal, and identify areas for improvement of a business process. Unified modelling language (UML) is a general-purpose modelling technique used for this purpose. We used UML to appraise the PCDQ process to see if the efficiency and predictability of the process could be improved. Activity analysis and thinking-aloud sessions were used to collect data to generate UML diagrams. The UML model highlighted the sequential nature of the current process as a barrier to efficiency gains. It also identified the uneven distribution of process controls, the lack of symmetric communication channels, critical dependencies among processing stages, and failure to implement all the lessons learned in the piloting phase. It also suggested that improved structured reporting at each stage - especially from the pilot phase - together with parallel processing of data and correctly positioned process controls should improve the efficiency and predictability of research projects. Process modelling provided a rational basis for the critical appraisal of a clinical data processing system; its potential may be underutilized within health care.

  4. Processed and ultra-processed food products: consumption trends in Canada from 1938 to 2011.

    PubMed

    Moubarac, Jean-Claude; Batal, Malek; Martins, Ana Paula Bortoletto; Claro, Rafael; Levy, Renata Bertazzi; Cannon, Geoffrey; Monteiro, Carlos

    2014-01-01

    A classification of foods based on the nature, extent, and purpose of industrial food processing was used to assess changes in household food expenditures and dietary energy availability between 1938 and 2011 in Canada. Food acquisitions from six household food budget surveys (1938/1939, 1953, 1969, 1984, 2001, and 2011) were classified into unprocessed or minimally processed foods, processed culinary ingredients, and ready-to-consume processed or ultra-processed products. Contributions of each group to household food expenditures and to dietary energy availability (kcal per capita) were calculated. During the period studied, household expenditures and dietary energy availability fell for both unprocessed or minimally processed foods and culinary ingredients, and rose for ready-to-consume products. The caloric share of unprocessed or minimally processed foods fell from 34.3% to 25.6%, and that of culinary ingredients fell from 37% to 12.7%. The share of ready-to-consume products rose from 28.7% to 61.7%, and the increase was especially noteworthy for those that were ultra-processed. The most important factor driving changes in Canadian dietary patterns between 1938 and 2011 is the replacement of unprocessed or minimally processed foods and culinary ingredients used in the preparation of dishes and meals; these have been displaced by ready-to-consume ultra-processed products. Nutrition research and practice should incorporate information about food processing into dietary assessments.

  5. The Constitutional Amendment Process

    ERIC Educational Resources Information Center

    Chism, Kahlil

    2005-01-01

    This article discusses the constitutional amendment process. Although the process is not described in great detail, Article V of the United States Constitution allows for and provides instruction on amending the Constitution. While the amendment process currently consists of six steps, the Constitution is nevertheless quite difficult to change.…

  6. Survey of Event Processing

    DTIC Science & Technology

    2007-12-01

    This report surveys event processing. An opening section gives a brief history of event processing, and the Applications section defines several application domains and use cases for event processing technology. The terms "subscription" and "subscription language" are used where others often use "(continuous) query" or "query language."

  7. Methane/nitrogen separation process

    DOEpatents

    Baker, Richard W.; Lokhandwala, Kaaeid A.; Pinnau, Ingo; Segelke, Scott

    1997-01-01

    A membrane separation process for treating a gas stream containing methane and nitrogen, for example, natural gas. The separation process works by preferentially permeating methane and rejecting nitrogen. We have found that the process is able to meet natural gas pipeline specifications for nitrogen, with acceptably small methane loss, so long as the membrane can exhibit a methane/nitrogen selectivity of about 4, 5 or more. This selectivity can be achieved with some rubbery and super-glassy membranes at low temperatures. The process can also be used for separating ethylene from nitrogen.
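
    As a rough illustration of why a methane/nitrogen selectivity of about 4-6 matters, the sketch below evaluates the standard binary-permeation relation in the limiting case where permeate pressure is negligible; the feed composition is a hypothetical example, not a figure from the patent.

    ```python
    def permeate_methane_fraction(x_feed: float, alpha: float) -> float:
        """Permeate methane mole fraction for a binary CH4/N2 mixture, in the
        limiting case where permeate back-pressure is negligible:
            y / (1 - y) = alpha * x / (1 - x)
        """
        r = alpha * x_feed / (1.0 - x_feed)
        return r / (1.0 + r)

    # Hypothetical feed: 85% methane, 15% nitrogen (illustrative, not from the patent).
    for alpha in (2.0, 4.0, 6.0):
        y = permeate_methane_fraction(0.85, alpha)
        print(f"selectivity {alpha:>3}: permeate CH4 = {y:.3f}, N2 = {1 - y:.3f}")
    ```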

  8. Gasoline from coal in the state of Illinois: feasibility study. Volume I. Design. [KBW gasification process, ICI low-pressure methanol process and Mobil M-gasoline process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1980-01-01

    Volume 1 describes the proposed plant: the KBW gasification process, the ICI low-pressure methanol process, and the Mobil M-gasoline process, together with ancillary processes such as the oxygen plant, the shift process, the RECTISOL purification process, sulfur recovery equipment, and pollution control equipment. Numerous engineering diagrams are included. (LTN)

  9. Computer integrated manufacturing/processing in the HPI. [Hydrocarbon Processing Industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoshimura, J.S.

    1993-05-01

    Hydrocarbon Processing and Systemhouse Inc. developed a comprehensive survey on the status of computer integrated manufacturing/processing (CIM/CIP) targeted specifically to the unique requirements of the hydrocarbon processing industry. These types of surveys and other benchmarking techniques can be invaluable in assisting companies to maximize business benefits from technology investments. The survey was organized into 5 major areas: CIM/CIP planning, management perspective, functional applications, integration, and technology infrastructure and trends. The CIM/CIP planning area dealt with the use and type of planning methods employed to plan, justify, and implement information technology projects. The management perspective section addressed management priorities, expenditure levels, and implementation barriers. The functional application area covered virtually all functional areas of the organization and focused on the specific solutions and benefits in each of the functional areas. The integration section addressed the needs and integration status of the organization's functional areas. Finally, the technology infrastructure and trends section dealt with specific technologies in use as well as trends over the next three years. In February 1993, summary areas from preliminary results were presented at the 2nd International Conference on Productivity and Quality in the Hydrocarbon Processing Industry.

  10. How yogurt is processed

    USDA-ARS?s Scientific Manuscript database

    This month’s Processing column on the theme of “How Is It Processed?” focuses on yogurt. Yogurt is known for its health-promoting properties. This column will provide a brief overview of the history of yogurt and the current market. It will also unveil both traditional and modern yogurt processing t...

  11. Process Development of Porcelain Ceramic Material with Binder Jetting Process for Dental Applications

    NASA Astrophysics Data System (ADS)

    Miyanaji, Hadi; Zhang, Shanshan; Lassell, Austin; Zandinejad, Amirali; Yang, Li

    2016-03-01

    Custom ceramic structures possess significant potential in many applications, such as dentistry and aerospace, where extreme environments are present. Specifically, highly customized geometries with adequate performance are needed for various dental prosthesis applications. This paper demonstrates the development of process and post-process parameters for a dental porcelain ceramic material using binder jetting additive manufacturing (AM). Various process parameters such as binder amount, drying power level, drying time and powder spread speed were studied experimentally for their effect on the geometrical and mechanical characteristics of green parts. In addition, the effects of sintering and printing parameters on the quality of the densified ceramic structures were also investigated experimentally. The results provide insights into the process-property relationships for the binder jetting AM process, and some of the challenges of the process that need to be further characterized for the successful adoption of binder jetting technology in high-quality ceramic fabrication are discussed.

  12. NTP comparison process

    NASA Technical Reports Server (NTRS)

    Corban, Robert

    1993-01-01

    The systems engineering process for the concept definition phase of the program involves requirements definition, system definition, and consistent concept definition. The requirements definition process involves obtaining a complete understanding of the system requirements based on customer needs, mission scenarios, and nuclear thermal propulsion (NTP) operating characteristics. A system functional analysis is performed to provide comprehensive traceability and verification of top-level requirements down to detailed system specifications, and it provides significant insight into the measures of system effectiveness to be utilized in system evaluation. The second key element in the process is the definition of system concepts to meet the requirements. This part of the process involves engine system and reactor contractor teams developing alternative NTP system concepts that can be evaluated against specific attributes, as well as a reference configuration against which to compare system benefits and merits. Quality function deployment (QFD), as an excellent tool within Total Quality Management (TQM) techniques, can provide the required structure and a link to the voice of the customer in establishing critical system qualities and their relationships. The third element of the process is the consistent performance comparison. The comparison process involves validating developed concept data and quantifying system merits through analysis, computer modeling, simulation, and rapid prototyping of the proposed high-risk NTP subsystems. The maximum amount of quantitative data possible will be developed and/or validated for use in the QFD evaluation matrix. If, upon evaluation, a new concept or its associated subsystems are determined to have substantial merit, those features will be incorporated into the reference configuration for subsequent system definition and comparison efforts.

  13. RATES OF REACTION AND PROCESS DESIGN DATA FOR THE HYDROCARB PROCESS

    EPA Science Inventory

    The report provides experimental and process design data in support of studies for developing the coprocessing of fossil fuels with biomass by the Hydrocarb process. The experimental work includes the hydropyrolysis of biomass and the thermal decomposition of methane in a 2.44 m ...

  14. Developing the JPL Engineering Processes

    NASA Technical Reports Server (NTRS)

    Linick, Dave; Briggs, Clark

    2004-01-01

    This paper briefly recounts the recent history of process reengineering at the NASA Jet Propulsion Laboratory, with a focus on the engineering processes. The JPL process structure is described and the process development activities of the past several years are outlined. The main focus of the paper is on the current process structure, the emphasis on the flight project life cycle, the governance approach that led to the Flight Project Practices, and the remaining effort to capture process knowledge at the detail level of the work group.

  15. Economics of polysilicon processes

    NASA Technical Reports Server (NTRS)

    Yaws, C. L.; Li, K. Y.; Chou, S. M.

    1986-01-01

    Techniques are being developed to provide lower-cost polysilicon material for solar cells. Existing technology, which normally provides semiconductor-industry polysilicon material, is undergoing changes and is also being used to provide polysilicon material for solar cells. Economics of new and existing technologies for producing polysilicon are presented. The economics are primarily based on the preliminary process design of a plant producing 1,000 metric tons/year of silicon. The polysilicon processes include: the Siemens process (hydrogen reduction of trichlorosilane); the Union Carbide process (silane decomposition); and the Hemlock Semiconductor process (hydrogen reduction of dichlorosilane). The economics include cost estimates of capital investment and product cost to produce polysilicon via each technology. Sensitivity analysis results are also presented to show the effect of major parameters such as utilities, labor, raw materials and capital investment.

  16. Spacelab Data Processing Facility

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The Spacelab Data Processing Facility (SLDPF) processes, monitors, and accounts for the payload data from Spacelab and other Shuttle missions and forwards relevant data to various user facilities worldwide. The SLDPF is divided into the Spacelab Input Processing System (SIPS) and the Spacelab Output Processing System (SOPS). The SIPS division demultiplexes, synchronizes, time-tags, quality checks, accounts for, and formats the data onto tapes. The SOPS division further edits, blocks, formats, and records the data on tape for shipment to users. User experiments must conform to the Spacelab's onboard High Rate Multiplexer (HRM) format for maximum processability. Audio, analog, instrumentation, high-density, experiment data, input/output data, quality control and accounting, and experimental channel tapes, along with a variety of Spacelab ancillary tapes, are provided to the user by the SLDPF.

  17. Styrene process condensate treatment with a combination process of UF and NF for reuse.

    PubMed

    Wang, Aijun; Liu, Guangmin; Huang, Jin; Wang, Lijuan; Li, Guangbin; Su, Xudong; Qi, Hong

    2013-01-15

    Aiming at reusing the styrene process condensate (SPC) to save water resources and heat energy, a combined UF/NF treatment process was applied to remove inorganic iron, suspended particles, and small amounts of organic contaminants. To achieve the targets of CODMn ≤ 5.00 mg L(-1), oil ≤ 2.00 mg L(-1), conductivity ≤ 10.00 μS cm(-1), and pH of 6.0-8.0, an NF membrane process was adopted. It was necessary to employ a pretreatment process to reduce NF membrane fouling. Hence a UF membrane was proposed as an efficient pretreatment unit to remove inorganic particles, such as iron oxide catalyst, to meet the influent demands of the NF unit. The UF effluent, which contained less than 0.02 mg L(-1) of total iron, passed through a security filter and was then pumped into the NF process unit. High removal efficiencies of CODMn, oil and conductivity were achieved using the NF process. An ABS grafting copolymerization experiment showed that the effluent of the combined process met the criteria of the ABS production process, while the process also alleviated environmental pollution. It was shown that this combined process concept is feasible and successful in treating SPC. Copyright © 2012 Elsevier B.V. All rights reserved.

  18. Cantilever epitaxial process

    DOEpatents

    Ashby, Carol I.; Follstaedt, David M.; Mitchell, Christine C.; Han, Jung

    2003-07-29

    A process of growing a material on a substrate, particularly a Group II-VI or Group III-V material, by a vapor-phase growth technique, where the growth process eliminates the need for a mask or for removal of the substrate from the reactor at any time during processing. A nucleation layer is first grown, upon which a middle layer is grown to provide surfaces for subsequent lateral cantilever growth. The lateral growth rate is controlled by altering the reactor temperature, pressure, reactant concentrations or reactant flow rates. Semiconductor materials, such as GaN, can be produced with dislocation densities less than 10^7/cm^2.

  19. A case study: application of statistical process control tool for determining process capability and sigma level.

    PubMed

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of variation in a process. Various regulatory documents, such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), the Health Sciences Authority, Singapore, Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005, provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessment, normal probability distributions, control charts, and capability charts are employed for the selection of critical quality attributes, determination of the normal probability distribution, statistical stability, and capability of the production process, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, the critical quality attributes, sigma process capability, and stability of the process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to an area where the application of quality improvement and quality risk assessment principles makes six sigma-capable processes achievable. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical
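
    A minimal sketch of the capability calculation described above is given below; the tablet-weight data, specification limits, and the convention of reporting the sigma level as roughly three times Cpk are illustrative assumptions, not values from the study.

    ```python
    import numpy as np

    def capability_indices(data, lsl, usl):
        """Short-term process capability from a sample that is assumed
        to be normally distributed and statistically stable."""
        mu, sigma = np.mean(data), np.std(data, ddof=1)
        cp = (usl - lsl) / (6 * sigma)
        cpk = min(usl - mu, mu - lsl) / (3 * sigma)
        return cp, cpk

    # Hypothetical tablet weights (mg) and specification limits.
    rng = np.random.default_rng(0)
    weights = rng.normal(250.0, 2.0, size=100)
    cp, cpk = capability_indices(weights, lsl=240.0, usl=260.0)

    # One common convention reports the sigma level as about 3*Cpk
    # (sometimes a 1.5-sigma shift is added to account for long-term drift).
    print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}, sigma level ~ {3 * cpk:.1f}")
    ```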

  20. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC).

    PubMed

    Gérard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, François; Aletti, Pierre

    2009-04-01

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber, which characterize the dose delivery process, have been retrospectively analyzed using a method borrowed from industry: statistical process control (SPC). The latter consisted of four principal, well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements relative to the clinical tolerances used in the cancer center (+/- 4% deviation between the calculated and measured doses) by calculating a control process capability (Cpc) index. The Cpc index was found to be greater than 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools, control charts and performance indices, to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the
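
    The EWMA chart is one of the three control charts named above. The sketch below computes the EWMA statistic and its time-varying control limits for a series of hypothetical dose deviations; the smoothing constant, limit width, and data are typical textbook choices rather than the study's parameters.

    ```python
    import numpy as np

    def ewma_chart(x, mu0, sigma, lam=0.2, L=3.0):
        """EWMA statistic and control limits for individual observations:
        z_i = lam*x_i + (1-lam)*z_{i-1},
        limits = mu0 +/- L*sigma*sqrt(lam/(2-lam) * (1 - (1-lam)**(2i)))."""
        z = np.empty(len(x))
        prev = mu0
        for i, xi in enumerate(x):
            prev = lam * xi + (1 - lam) * prev
            z[i] = prev
        i = np.arange(1, len(x) + 1)
        width = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
        return z, mu0 - width, mu0 + width

    # Hypothetical dose deviations (%) between measurement and calculation,
    # with a small drift introduced halfway through the series.
    rng = np.random.default_rng(1)
    dev = np.concatenate([rng.normal(0.0, 1.0, 20), rng.normal(1.5, 1.0, 20)])
    z, lcl, ucl = ewma_chart(dev, mu0=0.0, sigma=1.0)
    out_of_control = np.flatnonzero((z < lcl) | (z > ucl))
    print("first out-of-control point:", out_of_control[0] if out_of_control.size else None)
    ```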

  1. CIMOSA process classification for business process mapping in non-manufacturing firms: A case study

    NASA Astrophysics Data System (ADS)

    Latiffianti, Effi; Siswanto, Nurhadi; Wiratno, Stefanus Eko; Saputra, Yudha Andrian

    2017-11-01

    A business process map is an important means of enabling an enterprise to manage the value chain effectively. One widely used approach to classifying business processes for mapping purposes is the Computer Integrated Manufacturing System Open Architecture (CIMOSA). CIMOSA was initially designed for enterprises based on Computer Integrated Manufacturing (CIM) systems. This paper aims to analyze the use of the CIMOSA process classification for business process mapping in firms that do not fall within the area of CIM. Three firms from different business areas that have used the CIMOSA process classification were observed: an airline, a marketing and trading firm for oil and gas products, and an industrial estate management firm. The research shows that CIMOSA can be used in non-manufacturing firms with some adjustment. The adjustment includes the addition, reduction, or modification of some processes suggested by the CIMOSA process classification, as evidenced by the case studies.

  2. Experimental research of solid waste drying in the process of thermal processing

    NASA Astrophysics Data System (ADS)

    Bukhmirov, V. V.; Kolibaba, O. B.; Gabitov, R. N.

    2015-10-01

    The convective drying of a municipal solid waste layer, treated as a polydisperse multicomponent porous structure, is studied. On the basis of the experimental data, criterial equations for calculating the heat and mass transfer processes in the layer are obtained as functions of the moisture content of the material, the velocity of the drying agent, and the layer height. These solutions are used in the thermal design of reactors for the thermal processing of multicomponent organic waste.

  3. Conceptual models of information processing

    NASA Technical Reports Server (NTRS)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  4. Alsep data processing: How we processed Apollo Lunar Seismic Data

    NASA Technical Reports Server (NTRS)

    Latham, G. V.; Nakamura, Y.; Dorman, H. J.

    1979-01-01

    The Apollo lunar seismic station network gathered data continuously at a rate of 3 x 10^8 bits per day for nearly eight years, until termination in September 1977. The data were processed and analyzed using a PDP-15 minicomputer. On the average, 1500 long-period seismic events were detected yearly. Automatic event detection and identification schemes proved unsuccessful because of occasional high noise levels and, above all, the risk of overlooking unusual natural events. The processing procedures finally settled on consist of first plotting all the data on a compressed time scale, visually picking events from the plots, transferring event data to separate sets of tapes, and performing detailed analyses using the latter. Many problems remain, especially for automatically processing extraterrestrial seismic signals.

  5. The "Process" of Process Use: Methods for Longitudinal Assessment in a Multisite Evaluation

    ERIC Educational Resources Information Center

    Shaw, Jessica; Campbell, Rebecca

    2014-01-01

    Process use refers to the ways in which stakeholders and/or evaluands change as a function of participating in evaluation activities. Although the concept of process use has been well discussed in the literature, exploration of methodological strategies for the measurement and assessment of process use has been limited. Typically, empirical…

  6. Emotional language processing: how mood affects integration processes during discourse comprehension.

    PubMed

    Egidi, Giovanna; Nusbaum, Howard C

    2012-09-01

    This research tests whether mood affects semantic processing during discourse comprehension by facilitating integration of information congruent with moods' valence. Participants in happy, sad, or neutral moods listened to stories with positive or negative endings during EEG recording. N400 peak amplitudes showed mood congruence for happy and sad participants: endings incongruent with participants' moods demonstrated larger peaks. Happy and neutral moods exhibited larger peaks for negative endings, thus showing a similarity between negativity bias (neutral mood) and mood congruence (happy mood). Mood congruence resulted in differential processing of negative information: happy mood showed larger amplitudes for negative endings than neutral mood, and sad mood showed smaller amplitudes. N400 peaks were also sensitive to whether ending valence was communicated directly or as a result of inference. This effect was moderately modulated by mood. In conclusion, the notion of context for discourse processing should include comprehenders' affective states preceding language processing. Copyright © 2011 Elsevier Inc. All rights reserved.

  7. Semisolid Metal Processing Consortium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Apelian,Diran

    Mathematical modeling and simulation of semisolid filling processes remain a critical issue in understanding and optimizing the process. Semisolid slurries are non-Newtonian materials that exhibit complex rheological behavior. Therefore, the way these slurries flow into cavities is very different from the way liquid metal fills cavities in classical casting. In fact, cavity filling in semisolid processing is often counterintuitive.

  8. How tofu is processed

    USDA-ARS?s Scientific Manuscript database

    This month’s Processing column will continue the theme of “How Is It Processed?” The column will focus on tofu, which is sometimes called “the cheese of Asia.” It is a nutritious, protein-rich bean curd made by coagulating soy milk. There are many different types of tofu, and they are processed in a...

  9. Kennedy Space Center Payload Processing

    NASA Technical Reports Server (NTRS)

    Lawson, Ronnie; Engler, Tom; Colloredo, Scott; Zide, Alan

    2011-01-01

    This slide presentation reviews the payload processing functions at Kennedy Space Center. It details some of the payloads processed at KSC, the typical processing tasks, the facilities available for processing payloads, and the capabilities and customer services that are available.

  10. Parallel Processing with Digital Signal Processing Hardware and Software

    NASA Technical Reports Server (NTRS)

    Swenson, Cory V.

    1995-01-01

    The assembling and testing of a parallel processing system is described which will allow a user to move a Digital Signal Processing (DSP) application from the design stage to the execution/analysis stage through the use of several software tools and hardware devices. The system will be used to demonstrate the feasibility of the Algorithm To Architecture Mapping Model (ATAMM) dataflow paradigm for static multiprocessor solutions of DSP applications. The individual components comprising the system are described followed by the installation procedure, research topics, and initial program development.

  11. Microsystem process networks

    DOEpatents

    Wegeng, Robert S [Richland, WA; TeGrotenhuis, Ward E [Kennewick, WA; Whyatt, Greg A [West Richland, WA

    2006-10-24

    Various aspects and applications of microsystem process networks are described. The design of many types of microsystems can be improved by ortho-cascading mass, heat, or other unit process operations. Microsystems having exergetically efficient microchannel heat exchangers are also described. Detailed descriptions of numerous design features in microcomponent systems are also provided.

  12. METAL PLATING PROCESS

    DOEpatents

    Walker, D.E.; Noland, R.A.

    1958-08-12

    A process is described for obtaining a closely bonded coating of steel or iron on uranium. The process consists of providing, between the steel and uranium, a layer of silver, and then pressure rolling the assembly at about 600 deg C until a reduction of from 10 to 50% has been obtained.

  13. Microsystem process networks

    DOEpatents

    Wegeng, Robert S [Richland, WA; TeGrotenhuis, Ward E [Kennewick, WA; Whyatt, Greg A [West Richland, WA

    2010-01-26

    Various aspects and applications of microsystem process networks are described. The design of many types of microsystems can be improved by ortho-cascading mass, heat, or other unit process operations. Microsystems having energetically efficient microchannel heat exchangers are also described. Detailed descriptions of numerous design features in microcomponent systems are also provided.

  14. Microsystem process networks

    DOEpatents

    Wegeng, Robert S.; TeGrotenhuis, Ward E.; Whyatt, Greg A.

    2007-09-18

    Various aspects and applications of microsystem process networks are described. The design of many types of Microsystems can be improved by ortho-cascading mass, heat, or other unit process operations. Microsystems having energetically efficient microchannel heat exchangers are also described. Detailed descriptions of numerous design features in microcomponent systems are also provided.

  15. Industrial process surveillance system

    DOEpatents

    Gross, Kenneth C.; Wegerich, Stephan W.; Singer, Ralph M.; Mott, Jack E.

    1998-01-01

    A system and method for monitoring an industrial process and/or industrial data source. The system includes generating time varying data from industrial data sources, processing the data to obtain time correlation of the data, determining the range of data, determining learned states of normal operation and using these states to generate expected values, comparing the expected values to current actual values to identify a current state of the process closest to a learned, normal state; generating a set of modeled data, and processing the modeled data to identify a data pattern and generating an alarm upon detecting a deviation from normalcy.
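
    A minimal sketch in the spirit of this description is shown below: learned states of normal operation supply an expected value for the current observation, and an alarm is raised when the residual exceeds a threshold. The state vectors, variables, and thresholds are hypothetical, and the nearest-neighbor lookup is a simplification for illustration, not the patented estimation method.

    ```python
    import numpy as np

    def expected_from_learned_states(learned, current):
        """Return the learned 'normal' state closest to the current observation."""
        distances = np.linalg.norm(learned - current, axis=1)
        return learned[np.argmin(distances)]

    def check_alarm(learned, current, threshold):
        """Compare current values to the expected values and flag deviations."""
        expected = expected_from_learned_states(learned, current)
        residual = np.abs(current - expected)
        return bool(np.any(residual > threshold)), expected, residual

    # Hypothetical learned states: rows are [temperature, pressure, flow].
    learned_states = np.array([
        [300.0, 1.00, 12.0],
        [320.0, 1.10, 14.0],
        [340.0, 1.25, 15.5],
    ])
    current = np.array([321.0, 1.45, 14.2])   # pressure deviates from all learned states
    alarm, expected, residual = check_alarm(
        learned_states, current, threshold=np.array([5.0, 0.1, 1.0]))
    print(alarm, expected, residual)
    ```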

  16. Industrial Process Surveillance System

    DOEpatents

    Gross, Kenneth C.; Wegerich, Stephan W; Singer, Ralph M.; Mott, Jack E.

    2001-01-30

    A system and method for monitoring an industrial process and/or industrial data source. The system includes generating time varying data from industrial data sources, processing the data to obtain time correlation of the data, determining the range of data, determining learned states of normal operation and using these states to generate expected values, comparing the expected values to current actual values to identify a current state of the process closest to a learned, normal state; generating a set of modeled data, and processing the modeled data to identify a data pattern and generating an alarm upon detecting a deviation from normalcy.

  17. Methane/nitrogen separation process

    DOEpatents

    Baker, R.W.; Lokhandwala, K.A.; Pinnau, I.; Segelke, S.

    1997-09-23

    A membrane separation process is described for treating a gas stream containing methane and nitrogen, for example, natural gas. The separation process works by preferentially permeating methane and rejecting nitrogen. The authors have found that the process is able to meet natural gas pipeline specifications for nitrogen, with acceptably small methane loss, so long as the membrane can exhibit a methane/nitrogen selectivity of about 4, 5 or more. This selectivity can be achieved with some rubbery and super-glassy membranes at low temperatures. The process can also be used for separating ethylene from nitrogen. 11 figs.

  18. Business Process Modeling: Perceived Benefits

    NASA Astrophysics Data System (ADS)

    Indulska, Marta; Green, Peter; Recker, Jan; Rosemann, Michael

    The process-centered design of organizations and information systems is globally seen as an appropriate response to the increased economic pressure on organizations. At the methodological core of process-centered management is process modeling. However, business process modeling in large initiatives can be a time-consuming and costly exercise, making it potentially difficult to convince executive management of its benefits. To date, and despite substantial interest and research in the area of process modeling, the understanding of the actual benefits of process modeling in academia and practice is limited. To address this gap, this paper explores the perception of benefits derived from process modeling initiatives, as reported through a global Delphi study. The study incorporates the views of three groups of stakeholders - academics, practitioners and vendors. Our findings lead to the first identification and ranking of 19 unique benefits associated with process modeling. The study in particular found that process modeling benefits vary significantly between practitioners and academics. We argue that the variations may point to a disconnect between research projects and practical demands.

  19. Applied digital signal processing systems for vortex flowmeter with digital signal processing.

    PubMed

    Xu, Ke-Jun; Zhu, Zhi-Hai; Zhou, Yang; Wang, Xiao-Fen; Liu, San-Shan; Huang, Yun-Zhi; Chen, Zhi-Yuan

    2009-02-01

    Spectral analysis is combined with digital filtering to process the vortex sensor signal in order to reduce the effect of low-frequency disturbances from pipe vibrations and to increase the turndown ratio. Using a digital signal processing chip, two kinds of digital signal processing systems were developed to implement these algorithms: an integrated system and a separate system. A limiting amplifier is designed into the input analog conditioning circuit to accommodate large amplitude variations of the sensor signal. Several technical measures are taken to improve the accuracy of the output pulse, speed up the response time of the meter, and reduce the fluctuation of the output signal. The experimental results demonstrate the validity of the digital signal processing systems.
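
    The sketch below illustrates the combination of digital filtering and spectral analysis described above: a band-pass filter suppresses the low-frequency vibration component, and the FFT peak gives the shedding-frequency estimate. The synthetic signal, sampling rate, and filter band are assumptions for demonstration, not the meter's actual design values.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 2000.0                          # sampling rate, Hz (assumed)
    t = np.arange(0, 1.0, 1 / fs)
    # Synthetic sensor signal: 180 Hz vortex shedding + 15 Hz pipe vibration + noise.
    x = (np.sin(2 * np.pi * 180 * t)
         + 2.0 * np.sin(2 * np.pi * 15 * t)
         + 0.3 * np.random.randn(t.size))

    # Band-pass filter to suppress the low-frequency vibration disturbance.
    b, a = butter(4, [50, 600], btype="bandpass", fs=fs)
    y = filtfilt(b, a, x)

    # Spectral analysis: take the FFT peak as the shedding-frequency estimate.
    spectrum = np.abs(np.fft.rfft(y * np.hanning(y.size)))
    freqs = np.fft.rfftfreq(y.size, 1 / fs)
    print(f"estimated shedding frequency: {freqs[np.argmax(spectrum)]:.1f} Hz")
    ```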

  20. Modeling biochemical transformation processes and information processing with Narrator.

    PubMed

    Mandel, Johannes J; Fuss, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-03-27

    Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of supporting an integrative representation of transport, transformation as well as biological information processing. Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Narrator is a flexible and intuitive systems biology tool. It is

  1. Modeling biochemical transformation processes and information processing with Narrator

    PubMed Central

    Mandel, Johannes J; Fuß, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-01-01

    Background Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of supporting an integrative representation of transport, transformation as well as biological information processing. Results Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Conclusion Narrator is a flexible and intuitive systems

  2. Silicon production process evaluations

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Chemical engineering analysis of the HSC process (Hemlock Semiconductor Corporation) for producing silicon from dichlorosilane in a 1,000 MT/yr plant was continued. Progress and status for the chemical engineering analysis of the HSC process are reported for the primary process design engineering activities: base case conditions (85%), reaction chemistry (85%), process flow diagram (60%), material balance (60%), energy balance (30%), property data (30%), equipment design (20%) and major equipment list (10%). Engineering design of the initial distillation column (D-01, stripper column) in the process was initiated. The function of the distillation column is to remove volatile gases (such as hydrogen and nitrogen) which are dissolved in liquid chlorosilanes. Initial specifications and results for the distillation column design are reported including the variation of tray requirements (equilibrium stages) with reflux ratio for the distillation.
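
    The trade-off between tray requirements and reflux ratio mentioned above can be sketched with the Molokanov fit to the Gilliland correlation; the minimum reflux ratio and minimum stage count used here are hypothetical placeholders, not values from the HSC column design.

    ```python
    import math

    def gilliland_stages(R, Rmin, Nmin):
        """Estimate theoretical stages N from reflux ratio R using the
        Molokanov fit to the Gilliland correlation:
            X = (R - Rmin)/(R + 1),  Y = (N - Nmin)/(N + 1)
            Y = 1 - exp[ (1 + 54.4X)/(11 + 117.2X) * (X - 1)/sqrt(X) ]"""
        X = (R - Rmin) / (R + 1.0)
        Y = 1.0 - math.exp((1.0 + 54.4 * X) / (11.0 + 117.2 * X)
                           * (X - 1.0) / math.sqrt(X))
        return (Y + Nmin) / (1.0 - Y)

    # Hypothetical column: Rmin = 1.2, Nmin = 8 equilibrium stages.
    for R in (1.5, 2.0, 3.0, 5.0):
        print(f"R = {R}: N ~ {gilliland_stages(R, 1.2, 8):.1f} stages")
    ```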

  3. Ordinal Process Dissociation and the Measurement of Automatic and Controlled Processes

    ERIC Educational Resources Information Center

    Hirshman, Elliot

    2004-01-01

    The process-dissociation equations (L. Jacoby, 1991) have been applied to results from inclusion and exclusion tasks to derive quantitative estimates of the influence of controlled and automatic processes on memory. This research has provoked controversies (e.g., T. Curran & D. Hintzman, 1995) regarding the validity of specific assumptions…

  4. Component processes underlying future thinking.

    PubMed

    D'Argembeau, Arnaud; Ortoleva, Claudia; Jumentier, Sabrina; Van der Linden, Martial

    2010-09-01

    This study sought to investigate the component processes underlying the ability to imagine future events, using an individual-differences approach. Participants completed several tasks assessing different aspects of future thinking (i.e., fluency, specificity, amount of episodic details, phenomenology) and were also assessed with tasks and questionnaires measuring various component processes that have been hypothesized to support future thinking (i.e., executive processes, visual-spatial processing, relational memory processing, self-consciousness, and time perspective). The main results showed that executive processes were correlated with various measures of future thinking, whereas visual-spatial processing abilities and time perspective were specifically related to the number of sensory descriptions reported when specific future events were imagined. Furthermore, individual differences in self-consciousness predicted the subjective feeling of experiencing the imagined future events. These results suggest that future thinking involves a collection of processes that are related to different facets of future-event representation.

  5. Hynol Process Engineering: Process Configuration, Site Plan, and Equipment Design

    DTIC Science & Technology

    1996-02-01

    A bench-scale methanol production facility is being constructed to demonstrate the technical feasibility of producing methanol from biomass using the Hynol process. The plant is being designed to convert 50 lb/hr of biomass to methanol. The biomass consists of wood, and natural gas is used as a co-feedstock. Compared with other methanol production processes, direct emissions of carbon dioxide can be substantially reduced by using the Hynol process.

  6. Business process modeling in healthcare.

    PubMed

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making, which is focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling, using the Business Process Modelling Notation (BPMN) standard, is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  7. Adaptive Memory: Evaluating Alternative Forms of Fitness-Relevant Processing in the Survival Processing Paradigm

    PubMed Central

    Sandry, Joshua; Trafimow, David; Marks, Michael J.; Rice, Stephen

    2013-01-01

    Memory may have evolved to preserve information processed in terms of its fitness-relevance. Based on the assumption that the human mind comprises different fitness-relevant adaptive mechanisms contributing to survival and reproductive success, we compared alternative fitness-relevant processing scenarios with survival processing. Participants rated words for relevancy to fitness-relevant and control conditions followed by a delay and surprise recall test (Experiment 1a). Participants recalled more words processed for their relevance to a survival situation. We replicated these findings in an online study (Experiment 2) and a study using revised fitness-relevant scenarios (Experiment 3). Across all experiments, we did not find a mnemonic benefit for alternative fitness-relevant processing scenarios, questioning assumptions associated with an evolutionary account of remembering. Based on these results, fitness-relevance seems to be too wide-ranging of a construct to account for the memory findings associated with survival processing. We propose that memory may be hierarchically sensitive to fitness-relevant processing instructions. We encourage future researchers to investigate the underlying mechanisms responsible for survival processing effects and work toward developing a taxonomy of adaptive memory. PMID:23585858

  8. Integrated decontamination process for metals

    DOEpatents

    Snyder, Thomas S.; Whitlow, Graham A.

    1991-01-01

    An integrated process for the decontamination of metals, particularly metals used in the nuclear energy industry that are contaminated with radioactive material. The process combines electrorefining and melt refining to purify metals that can be decontaminated using either technique.

  9. Word Processing Competencies.

    ERIC Educational Resources Information Center

    Gatlin, Rebecca; And Others

    Research indicates that people tend to use only five percent of the capabilities available in word processing software. The major objective of this study was to determine to what extent word processing was used by businesses, what competencies were required by those businesses, and how those competencies were being learned in Mid-South states. A…

  10. Handbook of Petroleum Processing

    NASA Astrophysics Data System (ADS)

    Jones, David S. J.; Pujado, Peter P.

    This handbook describes and discusses the features that make up the petroleum refining industry. It begins with a description of the crude oils and their nature, and continues with the saleable products from the refining processes, with a review of the environmental impact. There is a complete overview of the processes that make up the refinery with a brief history of those processes.

  11. A KPI-based process monitoring and fault detection framework for large-scale processes.

    PubMed

    Zhang, Kai; Shardt, Yuri A W; Chen, Zhiwen; Yang, Xu; Ding, Steven X; Peng, Kaixiang

    2017-05-01

    Large-scale processes, consisting of multiple interconnected subprocesses, are commonly encountered in industrial systems, whose performance needs to be determined. A common approach to this problem is to use a key performance indicator (KPI)-based approach. However, the different KPI-based approaches are not developed with a coherent and consistent framework. Thus, this paper proposes a framework for KPI-based process monitoring and fault detection (PM-FD) for large-scale industrial processes, which considers the static and dynamic relationships between process and KPI variables. For the static case, a least squares-based approach is developed that provides an explicit link with least-squares regression, which gives better performance than partial least squares. For the dynamic case, using the kernel representation of each subprocess, an instrument variable is used to reduce the dynamic case to the static case. This framework is applied to the TE benchmark process and the hot strip mill rolling process. The results show that the proposed method can detect faults better than previous methods. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
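
    A minimal sketch of the static case is given below: the KPI is regressed on the process variables by ordinary least squares over normal-operation data, and a fault is flagged when the squared prediction residual exceeds a threshold estimated from that data. The simulated data, the 99th-percentile threshold, and the residual test are illustrative simplifications, not the paper's exact test statistic.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Normal-operation training data: process variables X and a KPI y (all simulated).
    n, p = 500, 4
    X_train = rng.normal(size=(n, p))
    true_coef = np.array([0.8, -0.5, 0.3, 0.0])
    y_train = X_train @ true_coef + 0.05 * rng.normal(size=n)

    # Static case: ordinary least-squares regression of the KPI on process variables.
    Xb = np.column_stack([np.ones(n), X_train])
    theta, *_ = np.linalg.lstsq(Xb, y_train, rcond=None)

    # Threshold on squared residuals from normal data (99th percentile, illustrative).
    resid = y_train - Xb @ theta
    threshold = np.quantile(resid**2, 0.99)

    def kpi_fault(x_new, y_new):
        """Flag a KPI-relevant fault when the prediction residual is abnormally large."""
        y_hat = theta[0] + x_new @ theta[1:]
        return (y_new - y_hat) ** 2 > threshold

    x_faulty = rng.normal(size=p)
    print(kpi_fault(x_faulty, x_faulty @ true_coef + 1.0))   # large KPI deviation -> True
    ```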

  12. Advanced process control framework initiative

    NASA Astrophysics Data System (ADS)

    Hill, Tom; Nettles, Steve

    1997-01-01

    The semiconductor industry, one of the world's most fiercely competitive industries, is driven by increasingly complex process technologies and global competition to improve cycle time, quality, and process flexibility. Due to the complexity of these problems, current process control techniques are generally nonautomated, time-consuming, reactive, nonadaptive, and focused on individual fabrication tools and processes. As the semiconductor industry moves into higher density processes, radical new approaches are required. To address the need for advanced factory-level process control in this environment, Honeywell, Advanced Micro Devices (AMD), and SEMATECH formed the Advanced Process Control Framework Initiative (APCFI) joint research project. The project defines and demonstrates an Advanced Process Control (APC) approach based on SEMATECH's Computer Integrated Manufacturing (CIM) Framework. Its scope includes the coordination of Manufacturing Execution Systems, process control tools, and wafer fabrication equipment to provide the necessary process control capabilities. Moreover, it takes advantage of the CIM Framework to integrate and coordinate applications from other suppliers that provide services necessary for the overall system to function. This presentation discusses the key concept of model-based process control that differentiates the APC Framework. This major improvement over current methods enables new systematic process control by linking knowledge of key process settings to desired product characteristics that reside in models created with commercial model development tools. The unique framework-based approach facilitates integration of commercial tools and reuse of their data by tying them together in an object-based structure. The presentation also explores the perspective of each organization's involvement in the APCFI project. Each has complementary goals and expertise to contribute; Honeywell represents the supplier viewpoint, AMD represents the user

  13. Lubricant Coating Process

    NASA Technical Reports Server (NTRS)

    1989-01-01

    "Peen Plating," a NASA developed process for applying molybdenum disulfide, is the key element of Techniblast Co.'s SURFGUARD process for applying high strength solid lubricants. The process requires two machines -- one for cleaning and one for coating. The cleaning step allows the coating to be bonded directly to the substrate to provide a better "anchor." The coating machine applies a half a micron thick coating. Then, a blast gun, using various pressures to vary peening intensities for different applications, fires high velocity "media" -- peening hammers -- ranging from plastic pellets to steel shot. Techniblast was assisted by Rural Enterprises, Inc. Coating service can be performed at either Techniblast's or a customer's facility.

  14. Liquefaction processes and systems and liquefaction process intermediate compositions

    DOEpatents

    Schmidt, Andrew J.; Hart, Todd R.; Billing, Justin M.; Maupin, Gary D.; Hallen, Richard T.; Anderson, Daniel B.

    2014-07-12

    Liquefaction processes are provided that can include: providing a biomass slurry solution having a temperature of at least 300.degree. C. at a pressure of at least 2000 psig; cooling the solution to a temperature of less than 150.degree. C.; and depressurizing the solution to release carbon dioxide from the solution and form at least part of a bio-oil foam. Liquefaction processes are also provided that can include: filtering the biomass slurry to remove particulates; and cooling and depressurizing the filtered solution to form the bio-oil foam. Liquefaction systems are provided that can include: a heated biomass slurry reaction zone maintained above 300.degree. C. and at least 2000 psig and in continuous fluid communication with a flash cooling/depressurization zone maintained below 150.degree. C. and between about 125 psig and about atmospheric pressure. Liquefaction systems are also provided that can include a foam/liquid separation system. Liquefaction process intermediate compositions are provided that can include a bio-oil foam phase separated from an aqueous biomass solids solution.

  15. [Preliminary processing, processing and usage of Dendrobii Caulis in history].

    PubMed

    Yang, Wen-yu; Tang, Sheng; Shi, Dong-jun; Chen, Xiang-gui; Li, Ming-yuan; Tang, Xian-fu; Yuan, Chang-jiang

    2015-07-01

    On account of the dense cuticles of the fresh stem and the light, hard and pliable texture of the dried stem, Dendrobii Caulis is difficult to dry or pulverize. So, it is very important to the ancient doctors that Dendrobii Caulis should be properly treated and applied to keep or evoke its medicinal effects. The current textual research results about the preliminary processing, processing and usage methods of Dendrobii Caulis showed that: (1) In history the clinical use of fresh or processed Dendrobii Caulis as teas and tinctures were very common. (2) Its roots and rhizomes would be removed before using. (3) Some ancillary approaches were applied to shorten drying times, such as rinsing with boiling mulberry-ash soup, washing or soaking with liquor, mixing with rice pulp and then basking, etc. (4) According to the ancients knowledge, the sufficient pulverization, by means of slicing, rasping, hitting or pestling techniques, was necessary for Dendrobii Caulis to take its effects. (5) The heat processing methods for Dendrobii Caulis included stir-baking, stir-frying, steaming, decocting and stewing techniques, usually with liquor as an auxiliary material. Among above mentioned, steaming by pretreating with liquor was most commonly used, and this scheme was colorfully drawn in Bu Yi Lei Gong Pao Zhi Bian Lan (Ming Dynasty, 1591 CE) ; moreover, decocting in advance or long-time simmering so as to prepare paste products were recommended in the Qing Dynasty. (6) Some different processing programs involving stir-baking with grit, air-tightly baking with ondol (Kangs), fumigating with sulfur, which appeared in modern times and brought attractive outward appearance of the drug, went against ancients original intentions of ensuring drug efficacy.

  16. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center ({+-}4% of deviation between the calculated and measured doses) by calculatingmore » a control process capability (C{sub pc}) index. The C{sub pc} index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they

  17. Tracing the decision-making process of physicians with a Decision Process Matrix.

    PubMed

    Hausmann, Daniel; Zulian, Cristina; Battegay, Edouard; Zimmerli, Lukas

    2016-10-18

    Decision-making processes in a medical setting are complex, dynamic and under time pressure, often with serious consequences for a patient's condition. The principal aim of the present study was to trace and map the individual diagnostic process of real medical cases using a Decision Process Matrix [DPM]). The naturalistic decision-making process of 11 residents and a total of 55 medical cases were recorded in an emergency department, and a DPM was drawn up according to a semi-structured technique following four steps: 1) observing and recording relevant information throughout the entire diagnostic process, 2) assessing options in terms of suspected diagnoses, 3) drawing up an initial version of the DPM, and 4) verifying the DPM, while adding the confidence ratings. The DPM comprised an average of 3.2 suspected diagnoses and 7.9 information units (cues). The following three-phase pattern could be observed: option generation, option verification, and final diagnosis determination. Residents strove for the highest possible level of confidence before making the final diagnoses (in two-thirds of the medical cases with a rating of practically certain) or excluding suspected diagnoses (with practically impossible in half of the cases). The following challenges have to be addressed in the future: real-time capturing of emerging suspected diagnoses in the memory of the physician, definition of meaningful information units, and a more contemporary measurement of confidence. DPM is a useful tool for tracing real and individual diagnostic processes. The methodological approach with DPM allows further investigations into the underlying cognitive diagnostic processes on a theoretical level and improvement of individual clinical reasoning skills in practice.

  18. Processing of plastics

    PubMed Central

    Spaak, Albert

    1975-01-01

    An overview is given of the processing of plastic materials from the handling of polymers in the pellet and powder form to manufacturing of a plastic fabricated product. Various types of equipment used and melt processing ranges of various polymer formulations to make the myriad of plastic products that are commercially available are discussed. PMID:1175556

  19. Manufacturability improvements in EUV resist processing toward NXE:3300 processing

    NASA Astrophysics Data System (ADS)

    Kuwahara, Yuhei; Matsunaga, Koichi; Shimoaoki, Takeshi; Kawakami, Shinichiro; Nafus, Kathleen; Foubert, Philippe; Goethals, Anne-Marie; Shimura, Satoru

    2014-03-01

    As the design rule of semiconductor process gets finer, extreme ultraviolet lithography (EUVL) technology is aggressively studied as a process for 22nm half pitch and beyond. At present, the studies for EUV focus on manufacturability. It requires fine resolution, uniform, smooth patterns and low defectivity, not only after lithography but also after the etch process. In the first half of 2013, a CLEAN TRACKTM LITHIUS ProTMZ-EUV was installed at imec for POR development in preparation of the ASML NXE:3300. This next generation coating/developing system is equipped with state of the art defect reduction technology. This tool with advanced functions can achieve low defect levels. This paper reports on the progress towards manufacturing defectivity levels and latest optimizations towards the NXE:3300 POR for both lines/spaces and contact holes at imec.

  20. [Sociophysiology: basic processes of empathy].

    PubMed

    Haker, Helene; Schimansky, Jenny; Rössler, Wulf

    2010-01-01

    The aim of this review is to describe sociophysiological and social cognitive processes that underlie the complex phenomenon of human empathy. Automatic reflexive processes such as physiological contagion and action mirroring are mediated by the mirror neuron system. They are a basis for further processing of social signals and a physiological link between two individuals. This link comprises simultaneous activation of shared motor representations. Shared representations lead implicitly via individual associations in the limbic and vegetative system to a shared affective state. These processes are called sociophysiology. Further controlled- reflective, self-referential processing of those social signals leads to explicit, conscious representations of others' minds. Those higher-order processes are called social cognition. The interaction of physiological and cognitive social processes lets arise the phenomenon of human empathy.

  1. Energy saving processes for nitrogen removal in organic wastewater from food processing industries in Thailand.

    PubMed

    Johansen, N H; Suksawad, N; Balslev, P

    2004-01-01

    Nitrogen removal from organic wastewater is becoming a demand in developed communities. The use of nitrite as intermediate in the treatment of wastewater has been largely ignored, but is actually a relevant energy saving process compared to conventional nitrification/denitrification using nitrate as intermediate. Full-scale results and pilot-scale results using this process are presented. The process needs some additional process considerations and process control to be utilized. Especially under tropical conditions the nitritation process will round easily, and it must be expected that many AS treatment plants in the food industry already produce NO2-N. This uncontrolled nitrogen conversion can be the main cause for sludge bulking problems. It is expected that sludge bulking problems in many cases can be solved just by changing the process control in order to run a more consequent nitritation. Theoretically this process will decrease the oxygen consumption for oxidation by 25% and the use of carbon source for the reduction will be decreased by 40% compared to the conventional process.

  2. Living olefin polymerization processes

    DOEpatents

    Schrock, Richard R.; Baumann, Robert

    1999-01-01

    Processes for the living polymerization of olefin monomers with terminal carbon-carbon double bonds are disclosed. The processes employ initiators that include a metal atom and a ligand having two group 15 atoms and a group 16 atom or three group 15 atoms. The ligand is bonded to the metal atom through two anionic or covalent bonds and a dative bond. The initiators are particularly stable under reaction conditions in the absence of olefin monomer. The processes provide polymers having low polydispersities, especially block copolymers having low polydispersities. It is an additional advantage of these processes that, during block copolymer synthesis, a relatively small amount of homopolymer is formed.

  3. Living olefin polymerization processes

    DOEpatents

    Schrock, R.R.; Baumann, R.

    1999-03-30

    Processes for the living polymerization of olefin monomers with terminal carbon-carbon double bonds are disclosed. The processes employ initiators that include a metal atom and a ligand having two group 15 atoms and a group 16 atom or three group 15 atoms. The ligand is bonded to the metal atom through two anionic or covalent bonds and a dative bond. The initiators are particularly stable under reaction conditions in the absence of olefin monomer. The processes provide polymers having low polydispersities, especially block copolymers having low polydispersities. It is an additional advantage of these processes that, during block copolymer synthesis, a relatively small amount of homopolymer is formed.

  4. Living olefin polymerization processes

    DOEpatents

    Schrock, Richard R.; Baumann, Robert

    2003-08-26

    Processes for the living polymerization of olefin monomers with terminal carbon-carbon double bonds are disclosed. The processes employ initiators that include a metal atom and a ligand having two group 15 atoms and a group 16 atom or three group 15 atoms. The ligand is bonded to the metal atom through two anionic or covalent bonds and a dative bond. The initiators are particularly stable under reaction conditions in the absence of olefin monomer. The processes provide polymers having low polydispersities, especially block copolymers having low polydispersities. It is an additional advantage of these processes that, during block copolymer synthesis, a relatively small amount of homopolymer is formed.

  5. Living olefin polymerization processes

    DOEpatents

    Schrock, Richard R.; Bauman, Robert

    2006-11-14

    Processes for the living polymerization of olefin monomers with terminal carbon-carbon double bonds are disclosed. The processes employ initiators that include a metal atom and a ligand having two group 15 atoms and a group 16 atom or three group 15 atoms. The ligand is bonded to the metal atom through two anionic or covalent bonds and a dative bond. The initiators are particularly stable under reaction conditions in the absence of olefin monomer. The processes provide polymers having low polydispersities, especially block copolymers having low polydispersities. It is an additional advantage of these processes that, during block copolymer synthesis, a relatively small amount of homopolymer is formed.

  6. Industrial process surveillance system

    DOEpatents

    Gross, K.C.; Wegerich, S.W.; Singer, R.M.; Mott, J.E.

    1998-06-09

    A system and method are disclosed for monitoring an industrial process and/or industrial data source. The system includes generating time varying data from industrial data sources, processing the data to obtain time correlation of the data, determining the range of data, determining learned states of normal operation and using these states to generate expected values, comparing the expected values to current actual values to identify a current state of the process closest to a learned, normal state; generating a set of modeled data, and processing the modeled data to identify a data pattern and generating an alarm upon detecting a deviation from normalcy. 96 figs.

  7. Neuroscientific Model of Motivational Process

    PubMed Central

    Kim, Sung-il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment. PMID:23459598

  8. Neuroscientific model of motivational process.

    PubMed

    Kim, Sung-Il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.

  9. Design of production process main shaft process with lean manufacturing to improve productivity

    NASA Astrophysics Data System (ADS)

    Siregar, I.; Nasution, A. A.; Andayani, U.; Anizar; Syahputri, K.

    2018-02-01

    This object research is one of manufacturing companies that produce oil palm machinery parts. In the production process there is delay in the completion of the Main shaft order. Delays in the completion of the order indicate the low productivity of the company in terms of resource utilization. This study aimed to obtain a draft improvement of production processes that can improve productivity by identifying and eliminating activities that do not add value (non-value added activity). One approach that can be used to reduce and eliminate non-value added activity is Lean Manufacturing. This study focuses on the identification of non-value added activity with value stream mapping analysis tools, while the elimination of non-value added activity is done with tools 5 whys and implementation of pull demand system. Based on the research known that non-value added activity on the production process of the main shaft is 9,509.51 minutes of total lead time 10,804.59 minutes. This shows the level of efficiency (Process Cycle Efficiency) in the production process of the main shaft is still very low by 11.89%. Estimation results of improvement showed a decrease in total lead time became 4,355.08 minutes and greater process cycle efficiency that is equal to 29.73%, which indicates that the process was nearing the concept of lean production.

  10. Real-time monitoring of clinical processes using complex event processing and transition systems.

    PubMed

    Meinecke, Sebastian

    2014-01-01

    Dependencies between tasks in clinical processes are often complex and error-prone. Our aim is to describe a new approach for the automatic derivation of clinical events identified via the behaviour of IT systems using Complex Event Processing. Furthermore we map these events on transition systems to monitor crucial clinical processes in real-time for preventing and detecting erroneous situations.

  11. Composing Models of Geographic Physical Processes

    NASA Astrophysics Data System (ADS)

    Hofer, Barbara; Frank, Andrew U.

    Processes are central for geographic information science; yet geographic information systems (GIS) lack capabilities to represent process related information. A prerequisite to including processes in GIS software is a general method to describe geographic processes independently of application disciplines. This paper presents such a method, namely a process description language. The vocabulary of the process description language is derived formally from mathematical models. Physical processes in geography can be described in two equivalent languages: partial differential equations or partial difference equations, where the latter can be shown graphically and used as a method for application specialists to enter their process models. The vocabulary of the process description language comprises components for describing the general behavior of prototypical geographic physical processes. These process components can be composed by basic models of geographic physical processes, which is shown by means of an example.

  12. Eigenforms, Discrete Processes and Quantum Processes

    NASA Astrophysics Data System (ADS)

    Kauffman, Louis H.

    2012-05-01

    This essay is a discussion of the concept of eigenform, due to Heinz von Foerster, and its relationship with discrete physics and quantum mechanics. We interpret the square root of minus one as a simple oscillatory process - a clock, and as an eigenform. By taking a generalization of this identification of i as a clock and eigenform, we show how quantum mechanics emerges from discrete physics.

  13. 40 CFR 63.138 - Process wastewater provisions-performance standards for treatment processes managing Group 1...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .../or Table 9 compounds are similar and often identical. (3) Biological treatment processes. Biological treatment processes in compliance with this section may be either open or closed biological treatment processes as defined in § 63.111. An open biological treatment process in compliance with this section need...

  14. Reforming process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitsche, R.T.; Pope, G.N.

    A process for reforming a naphtha feedstock is disclosed. The reforming process is effected at reforming conditions in contact with a catalyst comprising a platinum group metal component and a group iv-a metal component composited with an alumina support wherein said support is prepared by admixing an alpha alumina monohydrate with an aqueous ammoniacal solution having a ph of at least about 7.5 to form a stable suspension. A salt of a strong acid, e.g., aluminum nitrate, is commingled with the suspension to form an extrudable paste or dough. On extrusion, the extrudate is dried and calcined to form saidmore » alumina support.« less

  15. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes.

    PubMed

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Oscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-07-15

    Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. Previous experiences of the use of this notation in the processes modelling within the Pathology in Spain or another country are not known. We present our experience in the elaboration of the conceptual models of Pathology processes, as part of a global programmed surgical patient process, using BPMN. With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Dep. of Technologies and Information Systems from the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The work in collaboration was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, process describing by hospital experts, and process modelling. The modelling of the processes of Anatomic Pathology is presented using BPMN. The presented subprocesses are those corresponding to the surgical pathology examination of the samples coming from operating theatre, including the planning and realization of frozen studies. The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model, where management and improvements are more easily implemented by health professionals.

  16. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes

    PubMed Central

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Óscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-01-01

    Background Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. Previous experiences of the use of this notation in the processes modelling within the Pathology in Spain or another country are not known. We present our experience in the elaboration of the conceptual models of Pathology processes, as part of a global programmed surgical patient process, using BPMN. Methods With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Dep. of Technologies and Information Systems from the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The work in collaboration was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, process describing by hospital experts, and process modelling. Results The modelling of the processes of Anatomic Pathology is presented using BPMN. The presented subprocesses are those corresponding to the surgical pathology examination of the samples coming from operating theatre, including the planning and realization of frozen studies. Conclusion The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model, where management and improvements are more easily implemented by health professionals. PMID:18673511

  17. Food-Processing Wastes.

    PubMed

    Frenkel, Val S; Cummings, Gregg A; Maillacheruvu, K Y; Tang, Walter Z

    2015-10-01

    Literature published in 2014 and early 2015 related to food processing wastes treatment for industrial applications are reviewed. This review is a subsection of the Treatment Systems section of the annual Water Environment Federation literature review and covers the following food processing industries and applications: general, meat and poultry, fruits and vegetables, dairy and beverage, and miscellaneous treatment of food wastes.

  18. Food-Processing Wastes.

    PubMed

    Frenkel, Val S; Cummings, Gregg A; Maillacheruvu, K Y; Tang, Walter Z

    2017-10-01

    Literature published in 2016 and early 2017 related to food processing wastes treatment for industrial applications are reviewed. This review is a subsection of the Treatment Systems section of the annual Water Environment Federation literature review and covers the following food processing industries and applications: general, meat and poultry, fruits and vegetables, dairy and beverage, and miscellaneous treatment of food wastes.

  19. Food-Processing Wastes.

    PubMed

    Frenkel, Val S; Cummings, Gregg A; Maillacheruvu, K Y; Tang, Walter Z

    2016-10-01

    Literature published in 2015 and early 2016 related to food processing wastes treatment for industrial applications are reviewed. This review is a subsection of the Treatment Systems section of the annual Water Environment Federation literature review and covers the following food processing industries and applications: general, meat and poultry, fruits and vegetables, dairy and beverage, and miscellaneous treatment of food wastes.

  20. Powder treatment process

    DOEpatents

    Weyand, J.D.

    1988-02-09

    Disclosed are: (1) a process comprising spray drying a powder-containing slurry, the slurry containing a powder constituent susceptible of oxidizing under the temperature conditions of the spray drying, while reducing the tendency for oxidation of the constituent by including as a liquid constituent of the slurry an organic liquid; (2) a process comprising spray drying a powder-containing slurry, the powder having been pretreated to reduce content of a powder constituent susceptible of oxidizing under the temperature conditions of the spray drying, the pretreating comprising heating the powder to react the constituent; and (3) a process comprising reacting ceramic powder, grinding the reacted powder, slurrying the ground powder, spray drying the slurried powder, and blending the dried powder with metal powder. 2 figs.

  1. Powder treatment process

    DOEpatents

    Weyand, John D.

    1988-01-01

    (1) A process comprising spray drying a powder-containing slurry, the slurry containing a powder constituent susceptible of oxidizing under the temperature conditions of the spray drying, while reducing the tendency for oxidation of the constituent by including as a liquid constituent of the slurry an organic liquid; (2) a process comprising spray drying a powder-containing slurry, the powder having been pretreated to reduce content of a powder constituent susceptible of oxidizing under the temperature conditions of the spray drying, the pretreating comprising heating the powder to react the constituent; and (3) a process comprising reacting ceramic powder, grinding the reacted powder, slurrying the ground powder, spray drying the slurried powder, and blending the dried powder with metal powder.

  2. Research in Stochastic Processes

    DTIC Science & Technology

    1988-08-31

    stationary sequence, Stochastic Proc. Appl. 29, 1988, 155-169 T. Hsing, J. Husler and M.R. Leadbetter, On the exceedance point process for a stationary...Nandagopalan, On exceedance point processes for "regular" sample functions, Proc. Volume, Oberxolfach Conf. on Extreme Value Theory, J. Husler and R. Reiss...exceedance point processes for stationary sequences under mild oscillation restrictions, Apr. 88. Obermotfach Conf. on Extremal Value Theory. Ed. J. HUsler

  3. Heat Transfer Processes for the Thermal Energy Balance of Organisms. Physical Processes in Terrestrial and Aquatic Ecosystems, Transport Processes.

    ERIC Educational Resources Information Center

    Stevenson, R. D.

    This module is part of a series designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. This module describes heat transfer processes involved in the exchange of heat…

  4. Process Intensification for Cellulosic Biorefineries.

    PubMed

    Sadula, Sunitha; Athaley, Abhay; Zheng, Weiqing; Ierapetritou, Marianthi; Saha, Basudeb

    2017-06-22

    Utilization of renewable carbon source, especially non-food biomass is critical to address the climate change and future energy challenge. Current chemical and enzymatic processes for producing cellulosic sugars are multistep, and energy- and water-intensive. Techno-economic analysis (TEA) suggests that upstream lignocellulose processing is a major hurdle to the economic viability of the cellulosic biorefineries. Process intensification, which integrates processes and uses less water and energy, has the potential to overcome the aforementioned challenges. Here, we demonstrate a one-pot depolymerization and saccharification process of woody biomass, energy crops, and agricultural residues to produce soluble sugars with high yields. Lignin is separated as a solid for selective upgrading. Further integration of our upstream process with a reactive extraction step makes energy-efficient separation of sugars in the form of furans. TEA reveals that the process efficiency and integration enable, for the first time, economic production of feed streams that could profoundly improve process economics for downstream cellulosic bioproducts. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. DEFINITIVE SOX CONTROL PROCESS EVALUATIONS: LIMESTONE, DOUBLE ALKALI, AND CITRATE FGD PROCESSES

    EPA Science Inventory

    The report gives results of a detailed comparative technical and economic evaluation of limestone slurry, generic double alkali, and citrate flue gas desulfurization (FGD) processes, assuming proven technology and using representative power plant, process design, and economic pre...

  6. Supporting Cross-Organizational Process Control

    NASA Astrophysics Data System (ADS)

    Angelov, Samuil; Vonk, Jochem; Vidyasankar, Krishnamurthy; Grefen, Paul

    E-contracts express the rights and obligations of parties through a formal, digital representation of the contract provisions. In process intensive relationships, e-contracts contain business processes that a party promises to perform for the counter party, optionally allowing monitoring of the execution of the promised processes. In this paper, we describe an approach in which the counter party is allowed to control the process execution. This approach will lead to more flexible and efficient business relations which are essential in the context of modern, highly dynamic and complex collaborations among companies. We present a specification of the process controls available to the consumer and their support in the private process specification of the provider.

  7. Central waste processing system

    NASA Technical Reports Server (NTRS)

    Kester, F. L.

    1973-01-01

    A new concept for processing spacecraft type wastes has been evaluated. The feasibility of reacting various waste materials with steam at temperatures of 538 - 760 C in both a continuous and batch reactor with residence times from 3 to 60 seconds has been established. Essentially complete gasification is achieved. Product gases are primarily hydrogen, carbon dioxide, methane, and carbon monoxide. Water soluble synthetic wastes are readily processed in a continuous tubular reactor at concentrations up to 20 weight percent. The batch reactor is able to process wet and dry wastes at steam to waste weight ratios from 2 to 20. Feces, urine, and synthetic wastes have been successfully processed in the batch reactor.

  8. Certification-Based Process Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Russell L.

    2013-01-01

    Space mission architects are often challenged with knowing which investment in technology infusion will have the highest return. Certification-based analysis (CBA) gives architects and technologists a means to communicate the risks and advantages of infusing technologies at various points in a process. Various alternatives can be compared, and requirements based on supporting streamlining or automation can be derived and levied on candidate technologies. CBA is a technique for analyzing a process and identifying potential areas of improvement. The process and analysis products are used to communicate between technologists and architects. Process means any of the standard representations of a production flow; in this case, any individual steps leading to products, which feed into other steps, until the final product is produced at the end. This sort of process is common for space mission operations, where a set of goals is reduced eventually to a fully vetted command sequence to be sent to the spacecraft. Fully vetting a product is synonymous with certification. For some types of products, this is referred to as verification and validation, and for others it is referred to as checking. Fundamentally, certification is the step in the process where one insures that a product works as intended, and contains no flaws.

  9. Laminar soot processes

    NASA Technical Reports Server (NTRS)

    Sunderland, P. B.; Lin, K.-C.; Faeth, G. M.

    1995-01-01

    Soot processes within hydrocarbon fueled flames are important because they affect the durability and performance of propulsion systems, the hazards of unwanted fires, the pollutant and particulate emissions from combustion processes, and the potential for developing computational combustion. Motivated by these observations, the present investigation is studying soot processes in laminar diffusion and premixed flames in order to better understand the soot and thermal radiation emissions of luminous flames. Laminar flames are being studied due to their experimental and computational tractability, noting the relevance of such results to practical turbulent flames through the laminar flamelet concept. Weakly-buoyant and nonbuoyant laminar diffusion flames are being considered because buoyancy affects soot processes in flames while most practical flames involve negligible effects of buoyancy. Thus, low-pressure weakly-buoyant flames are being observed during ground-based experiments while near atmospheric pressure nonbuoyant flames will be observed during space flight experiments at microgravity. Finally, premixed laminar flames also are being considered in order to observe some aspects of soot formation for simpler flame conditions than diffusion flames. The main emphasis of current work has been on measurements of soot nucleation and growth in laminar diffusion and premixed flames.

  10. Silicon production process evaluations

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The chemical engineering analysis of the preliminary process design of a process for producing solar cell grade silicon from dichlorosilane is presented. A plant to produce 1,000 MT/yr of silicon is analyzed. Progress and status for the plant design are reported for the primary activities of base case conditions (60 percent), reaction chemistry (50 percent), process flow diagram (35 percent), energy balance (10 percent), property data (10 percent) and equipment design (5 percent).

  11. NASA Hazard Analysis Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts

  12. Hydrogen recovery process

    DOEpatents

    Baker, Richard W.; Lokhandwala, Kaaeid A.; He, Zhenjie; Pinnau, Ingo

    2000-01-01

    A treatment process for a hydrogen-containing off-gas stream from a refinery, petrochemical plant or the like. The process includes three separation steps: condensation, membrane separation and hydrocarbon fraction separation. The membrane separation step is characterized in that it is carried out under conditions at which the membrane exhibits a selectivity in favor of methane over hydrogen of at least about 2.5.

  13. Coal liquefaction process

    DOEpatents

    Skinner, Ronald W.; Tao, John C.; Znaimer, Samuel

    1985-01-01

    This invention relates to an improved process for the production of liquid carbonaceous fuels and solvents from carbonaceous solid fuels, especially coal. The claimed improved process includes the hydrocracking of the light SRC mixed with a suitable hydrocracker solvent. The recycle of the resulting hydrocracked product, after separation and distillation, is used to produce a solvent for the hydrocracking of the light solvent refined coal.

  14. Quality-by-Design (QbD): An integrated process analytical technology (PAT) approach for a dynamic pharmaceutical co-precipitation process characterization and process design space development.

    PubMed

    Wu, Huiquan; White, Maury; Khan, Mansoor A

    2011-02-28

    The aim of this work was to develop an integrated process analytical technology (PAT) approach for a dynamic pharmaceutical co-precipitation process characterization and design space development. A dynamic co-precipitation process by gradually introducing water to the ternary system of naproxen-Eudragit L100-alcohol was monitored at real-time in situ via Lasentec FBRM and PVM. 3D map of count-time-chord length revealed three distinguishable process stages: incubation, transition, and steady-state. The effects of high risk process variables (slurry temperature, stirring rate, and water addition rate) on both derived co-precipitation process rates and final chord-length-distribution were evaluated systematically using a 3(3) full factorial design. Critical process variables were identified via ANOVA for both transition and steady state. General linear models (GLM) were then used for parameter estimation for each critical variable. Clear trends about effects of each critical variable during transition and steady state were found by GLM and were interpreted using fundamental process principles and Nyvlt's transfer model. Neural network models were able to link process variables with response variables at transition and steady state with R(2) of 0.88-0.98. PVM images evidenced nucleation and crystal growth. Contour plots illustrated design space via critical process variables' ranges. It demonstrated the utility of integrated PAT approach for QbD development. Published by Elsevier B.V.

  15. Silicon-gate CMOS/SOS processing

    NASA Technical Reports Server (NTRS)

    Ramondetta, P.

    1979-01-01

    Major silicon-gate CMOS/SOS processes are described. Sapphire substrate preparation is also discussed, as well as the following process variations: (1) the double epi process; and (2) ion implantation.

  16. Global processing takes time: A meta-analysis on local-global visual processing in ASD.

    PubMed

    Van der Hallen, Ruth; Evers, Kris; Brewaeys, Katrien; Van den Noortgate, Wim; Wagemans, Johan

    2015-05-01

    What does an individual with autism spectrum disorder (ASD) perceive first: the forest or the trees? In spite of 30 years of research and influential theories like the weak central coherence (WCC) theory and the enhanced perceptual functioning (EPF) account, the interplay of local and global visual processing in ASD remains only partly understood. Research findings vary in indicating a local processing bias or a global processing deficit, and often contradict each other. We have applied a formal meta-analytic approach and combined 56 articles that tested about 1,000 ASD participants and used a wide range of stimuli and tasks to investigate local and global visual processing in ASD. Overall, results show no enhanced local visual processing nor a deficit in global visual processing. Detailed analysis reveals a difference in the temporal pattern of the local-global balance, that is, slow global processing in individuals with ASD. Whereas task-dependent interaction effects are obtained, gender, age, and IQ of either participant groups seem to have no direct influence on performance. Based on the overview of the literature, suggestions are made for future research. (c) 2015 APA, all rights reserved).

  17. [Practice report: the process-based indicator dashboard. Visualising quality assurance results in standardised processes].

    PubMed

    Petzold, Thomas; Hertzschuch, Diana; Elchlep, Frank; Eberlein-Gonska, Maria

    2014-01-01

    Process management (PM) is a valuable method for the systematic analysis and structural optimisation of the quality and safety of clinical treatment. PM requires a high motivation and willingness to implement changes of both employees and management. Definition of quality indicators is required to systematically measure the quality of the specified processes. One way to represent comparable quality results is the use of quality indicators of the external quality assurance in accordance with Sect. 137 SGB V—a method which the Federal Joint Committee (GBA) and the institutions commissioned by the GBA have employed and consistently enhanced for more than ten years. Information on the quality of inpatient treatment is available for 30 defined subjects throughout Germany. The combination of specified processes with quality indicators is beneficial for the information of employees. A process-based indicator dashboard provides essential information about the treatment process. These can be used for process analysis. In a continuous consideration of these indicator results values can be determined and errors will be remedied quickly. If due consideration is given to these indicators, they can be used for benchmarking to identify potential process improvements. Copyright © 2014. Published by Elsevier GmbH.

  18. From Process to Product: Your Risk Process at Work

    NASA Technical Reports Server (NTRS)

    Kundrot, Craig E.; Fogarty, Jenifer; Charles, John; Buquo, Lynn; Sibonga, Jean; Alexander, David; Horn, Wayne G.; Edwards, J. Michelle

    2010-01-01

    The Space Life Sciences Directorate (SLSD) and Human Research Program (HRP) at the NASA/Johnson Space Center work together to address and manage the human health and performance risks associated with human space flight. This includes all human system requirements before, during, and after space flight, providing for research, and managing the risk of adverse long-term health outcomes for the crew. We previously described the framework and processes developed for identifying and managing these human system risks. The focus of this panel is to demonstrate how the implementation of the framework and associated processes has provided guidance in the management and communication of human system risks. The risks of early onset osteoporosis, CO2 exposure, and intracranial hypertension in particular have all benefitted from the processes developed for human system risk management. Moreover, we are continuing to develop capabilities, particularly in the area of information architecture, which will also be described. We are working to create a system whereby all risks and associated actions can be tracked and related to one another electronically. Such a system will enhance the management and communication capabilities for the human system risks, thereby increasing the benefit to researchers and flight surgeons.

  19. Social Network Supported Process Recommender System

    PubMed Central

    Ye, Yanming; Yin, Jianwei; Xu, Yueshen

    2014-01-01

    Process recommendation technologies have gained more and more attention in the field of intelligent business process modeling to assist the process modeling. However, most of the existing technologies only use the process structure analysis and do not take the social features of processes into account, while the process modeling is complex and comprehensive in most situations. This paper studies the feasibility of social network research technologies on process recommendation and builds a social network system of processes based on the features similarities. Then, three process matching degree measurements are presented and the system implementation is discussed subsequently. Finally, experimental evaluations and future works are introduced. PMID:24672309

  20. Social network supported process recommender system.

    PubMed

    Ye, Yanming; Yin, Jianwei; Xu, Yueshen

    2014-01-01

    Process recommendation technologies have gained more and more attention in the field of intelligent business process modeling to assist the process modeling. However, most of the existing technologies only use the process structure analysis and do not take the social features of processes into account, while the process modeling is complex and comprehensive in most situations. This paper studies the feasibility of social network research technologies on process recommendation and builds a social network system of processes based on the features similarities. Then, three process matching degree measurements are presented and the system implementation is discussed subsequently. Finally, experimental evaluations and future works are introduced.

  1. Influence of Process Parameters on the Process Efficiency in Laser Metal Deposition Welding

    NASA Astrophysics Data System (ADS)

    Güpner, Michael; Patschger, Andreas; Bliedtner, Jens

    Conventionally manufactured tools are often completely constructed of a high-alloyed, expensive tool steel. An alternative way to manufacture tools is the combination of a cost-efficient, mild steel and a functional coating in the interaction zone of the tool. Thermal processing methods, like laser metal deposition, are always characterized by thermal distortion. The resistance against the thermal distortion decreases with the reduction of the material thickness. As a consequence, there is a necessity of a special process management for the laser based coating of thin parts or tools. The experimental approach in the present paper is to keep the energy and the mass per unit length constant by varying the laser power, the feed rate and the powder mass flow. The typical seam parameters are measured in order to characterize the cladding process, define process limits and evaluate the process efficiency. Ways to optimize dilution, angular distortion and clad height are presented.

  2. Second Language Processing: When Are First and Second Languages Processed Similarly?

    ERIC Educational Resources Information Center

    Sabourin, Laura; Stowe, Laurie A.

    2008-01-01

    In this article we investigate the effects of first language (L1) on second language (L2) neural processing for two grammatical constructions (verbal domain dependency and grammatical gender), focusing on the event-related potential P600 effect, which has been found in both L1 and L2 processing. Native Dutch speakers showed a P600 effect for both…

  3. Optimum processing of mammographic film.

    PubMed

    Sprawls, P; Kitts, E L

    1996-03-01

    Underprocessing of mammographic film can result in reduced contrast and visibility of breast structures and an unnecessary increase in radiation dose to the patient. Underprocessing can be caused by physical factors (low developer temperature, inadequate development time, insufficient developer agitation) or chemical factors (developer not optimized for film type; overdiluted, underreplenished, contaminated, or frequently changed developer). Conventional quality control programs are designed to produce consistent processing but do not address the issue of optimum processing. Optimum processing is defined as the level of processing that produces the film performance characteristics (contrast and sensitivity) specified by the film manufacturer. Optimum processing of mammographic film can be achieved by following a two-step protocol. The first step is to set up the processing conditions according to recommendations from the film and developer chemistry manufacturers. The second step is to verify the processing results by comparing them with sensitometric data provided by the film manufacturer.

  4. Materials processing in space

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The feasibility and possible advantages of processing materials in a nongravitational field are considered. Areas of investigation include biomedical applications, the processing of inorganic materials, and flight programs and funding.

  5. Isothermal separation processes

    NASA Technical Reports Server (NTRS)

    England, C.

    1982-01-01

    The isothermal processes of membrane separation, supercritical extraction and chromatography were examined using availability analysis. The general approach was to derive equations that identified where energy is consumed in these processes and how they compare with conventional separation methods. These separation methods are characterized by pure work inputs, chiefly in the form of a pressure drop which supplies the required energy. Equations were derived for the energy requirement in terms of regular solution theory. This approach is believed to accurately predict the work of separation in terms of the heat of solution and the entropy of mixing. It can form the basis of a convenient calculation method for optimizing membrane and solvent properties for particular applications. Calculations were made on the energy requirements for a membrane process separating air into its components.

  6. Biomimetics: process, tools and practice.

    PubMed

    Fayemi, P E; Wanieck, K; Zollfrank, C; Maranzana, N; Aoussat, A

    2017-01-23

    Biomimetics applies principles and strategies abstracted from biological systems to engineering and technological design. With a huge potential for innovation, biomimetics could evolve into a key process in businesses. Yet challenges remain within the process of biomimetics, especially from the perspective of potential users. We work to clarify the understanding of the process of biomimetics. Therefore, we briefly summarize the terminology of biomimetics and bioinspiration. The implementation of biomimetics requires a stated process. Therefore, we present a model of the problem-driven process of biomimetics that can be used for problem-solving activity. The process of biomimetics can be facilitated by existing tools and creative methods. We mapped a set of tools to the biomimetic process model and set up assessment sheets to evaluate the theoretical and practical value of these tools. We analyzed the tools in interdisciplinary research workshops and present the characteristics of the tools. We also present the attempt of a utility tree which, once finalized, could be used to guide users through the process by choosing appropriate tools respective to their own expertize. The aim of this paper is to foster the dialogue and facilitate a closer collaboration within the field of biomimetics.

  7. Membrane thickening aerobic digestion processes.

    PubMed

    Woo, Bryen

    2014-01-01

    Sludge management accounts for approximately 60% of the total wastewater treatment plant expenditure, and laws for sludge disposal are becoming increasingly stringent; therefore, much consideration is required when designing a solids handling process. A membrane thickening aerobic digestion process integrates a controlled aerobic digestion process with pre-thickening of waste activated sludge using membrane technology. This process typically features an anoxic tank and an aerated membrane thickener operating in loop with a first-stage digester, followed by second-stage digestion. Membrane thickening aerobic digestion processes can handle sludge from any liquid treatment process and are best for facilities obligated to meet low total phosphorus and nitrogen discharge limits. Membrane thickening aerobic digestion processes offer many advantages, including: producing a reusable-quality permeate with minimal levels of total phosphorus and nitrogen that can be recycled to the headworks of a plant, protecting the performance of a biological nutrient removal liquid treatment process without requiring chemical addition, providing reliable thickening up to 4% solids concentration without the use of polymers or attention to decanting, increasing sludge storage capacities in existing tanks, minimizing the footprint of new tanks, reducing disposal costs, and providing Class B stabilization.

  8. Space-time-modulated stochastic processes

    NASA Astrophysics Data System (ADS)

    Giona, Massimiliano

    2017-10-01

    Starting from the physical problem associated with the Lorentzian transformation of a Poisson-Kac process in inertial frames, the concept of space-time-modulated stochastic processes is introduced for processes possessing finite propagation velocity. This class of stochastic processes provides a two-way coupling between the stochastic perturbation acting on a physical observable and the evolution of the physical observable itself, which in turn influences the statistical properties of the stochastic perturbation during its evolution. The definition of space-time-modulated processes requires the introduction of two functions: a nonlinear amplitude modulation, controlling the intensity of the stochastic perturbation, and a time-horizon function, which modulates its statistical properties, providing irreducible feedback between the stochastic perturbation and the physical observable influenced by it. The latter property is the peculiar fingerprint of this class of models that makes them suitable for extension to generic curved space-times. Considering Poisson-Kac processes as prototypical examples of stochastic processes possessing finite propagation velocity, the balance equations for the probability density functions associated with their space-time modulations are derived. Several examples highlighting the peculiarities of space-time-modulated processes are thoroughly analyzed.
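
    The construction is easiest to picture on the simplest finite-velocity process the record mentions. The sketch below simulates a one-dimensional Poisson-Kac (telegraph) process in which both the propagation speed and the switching rate of the dichotomous noise depend on the current state; the specific modulation functions are arbitrary illustrations, not the amplitude-modulation and time-horizon functions defined in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_poisson_kac(T=10.0, dt=1e-3,
                             b=lambda x: 1.0 + 0.5 * np.tanh(x),
                             lam=lambda x: 1.0 + x**2):
        """Euler simulation of a modulated Poisson-Kac (telegraph) process:
        dx/dt = b(x) * sigma(t), where sigma = +/-1 flips at Poisson rate lam(x).
        Here b plays the role of the amplitude modulation and lam of the
        state-dependent switching statistics; both choices are illustrative."""
        n = int(T / dt)
        x = np.zeros(n + 1)
        sigma = 1.0
        for k in range(n):
            # flip the dichotomous noise with probability lam(x)*dt
            if rng.random() < lam(x[k]) * dt:
                sigma = -sigma
            x[k + 1] = x[k] + b(x[k]) * sigma * dt
        return x

    traj = simulate_poisson_kac()
    print(traj[-1])
    ```

    Replacing b and lam with the paper's amplitude-modulation and time-horizon functions would reproduce the two-way coupling described above.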

  9. Ultrasonic Processing of Materials

    NASA Astrophysics Data System (ADS)

    Han, Qingyou

    2015-08-01

    Irradiation of high-energy ultrasonic vibration in metals and alloys generates oscillating strain and stress fields in solids, and introduces nonlinear effects such as cavitation, acoustic streaming, and radiation pressure in molten materials. These nonlinear effects can be utilized to assist conventional materials processing operations. This article describes recent research at Oak Ridge National Laboratory and Purdue University on using high-intensity ultrasonic vibrations for degassing molten aluminum, processing particulate-reinforced metal matrix composites, refining metals and alloys during solidification and welding, and producing bulk nanostructures in solid metals and alloys. Research results suggest that high-intensity ultrasonic vibration is capable of degassing and dispersing small particles in molten alloys, reducing grain size during alloy solidification, and inducing nanostructures in solid metals.

  10. Biodiesel production process from microalgae oil by waste heat recovery and process integration.

    PubMed

    Song, Chunfeng; Chen, Guanyi; Ji, Na; Liu, Qingling; Kansha, Yasuki; Tsutsumi, Atsushi

    2015-10-01

    In this work, the optimization of microalgae oil (MO) based biodiesel production process is carried out by waste heat recovery and process integration. The exergy analysis of each heat exchanger presented an efficient heat coupling between hot and cold streams, thus minimizing the total exergy destruction. Simulation results showed that the unit production cost of optimized process is 0.592$/L biodiesel, and approximately 0.172$/L biodiesel can be avoided by heat integration. Although the capital cost of the optimized biodiesel production process increased 32.5% and 23.5% compared to the reference cases, the operational cost can be reduced by approximately 22.5% and 41.6%. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Thermodynamics of Irreversible Processes. Physical Processes in Terrestrial and Aquatic Ecosystems, Transport Processes.

    ERIC Educational Resources Information Center

    Levin, Michael; Gallucci, V. F.

    These materials were designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. This module describes the application of irreversible thermodynamics to biology. It begins with…

  12. Leading processes of patient care and treatment in hierarchical healthcare organizations in Sweden--process managers' experiences.

    PubMed

    Nilsson, Kerstin; Sandoff, Mette

    2015-01-01

    The purpose of this study is to gain better understanding of the roles and functions of process managers by describing Swedish process managers' experiences of leading processes involving patient care and treatment when working in a hierarchical health-care organization. This study is based on an explorative design. The data were gathered from interviews with 12 process managers at three Swedish hospitals. These data underwent qualitative and interpretative analysis with a modified editing style. The process managers' experiences of leading processes in a hierarchical health-care organization are described under three themes: having or not having a mandate, exposure to conflict situations and leading process development. The results indicate a need for clarity regarding process manager's responsibility and work content, which need to be communicated to all managers and staff involved in the patient care and treatment process, irrespective of department. There also needs to be an emphasis on realistic expectations and orientation of the goals that are an intrinsic part of the task of being a process manager. Generalizations from the results of the qualitative interview studies are limited, but a deeper understanding of the phenomenon was reached, which, in turn, can be transferred to similar settings. This study contributes qualitative descriptions of leading care and treatment processes in a functional, hierarchical health-care organization from process managers' experiences, a subject that has not been investigated earlier.

  13. Microencapsulation Processes

    NASA Astrophysics Data System (ADS)

    Whateley, T. L.; Poncelet, D.

    2005-06-01

    Microencapsulation by solvent evaporation is a novel technique to enable the controlled delivery of active materials. The controlled release of drugs, for example, is a key challenge in the pharmaceutical industries. Although proposed several decades ago, it remains largely an empirical laboratory process. The Topical Team has considered its critical points and the work required to produce a more effective technology - better control of the process for industrial production, understanding of the interfacial dynamics, determination of the solvent evaporation profile, and establishment of the relation between polymer/microcapsule structures. The Team has also defined how microgravity experiments could help in better understanding microencapsulation by solvent evaporation, and it has proposed a strategy for a collaborative project on the topic.

  14. SEL's Software Process-Improvement Program

    NASA Technical Reports Server (NTRS)

    Basili, Victor; Zelkowitz, Marvin; McGarry, Frank; Page, Jerry; Waligora, Sharon; Pajerski, Rose

    1995-01-01

    The goals and operations of the Software Engineering Laboratory (SEL) are reviewed. For nearly 20 years the SEL has worked to understand, assess, and improve software and the development process within the production environment of the Flight Dynamics Division (FDD) of NASA's Goddard Space Flight Center. The SEL was established in 1976 with the goals of reducing: (1) the defect rate of delivered software, (2) the cost of software to support flight projects, and (3) the average time to produce mission-support software. After studying over 125 FDD projects, the results have guided the standards, management practices, technologies, and training within the division. The results of the studies have been a 75 percent reduction in defects, a 50 percent reduction in cost, and a 25 percent reduction in development time. Over time the goals of the SEL have been clarified. The goals are now stated as: (1) Understand baseline processes and product characteristics, (2) Assess improvements that have been incorporated into the development projects, (3) Package and infuse improvements into the standard SEL process. The SEL improvement goal is to demonstrate continual improvement of the software process by carrying out analysis, measurement, and feedback to projects within the FDD environment. The SEL supports understanding of the process by studying several process characteristics, including effort distribution and error detection rates. The SEL then assesses and refines the processes. Once the assessment and refinement of a process is completed, the SEL packages the process by capturing it in standards, tools, and training.

  15. 49 CFR Appendix A to Part 580 - Secure Printing Processes and Other Secure Processes

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 7 2014-10-01 2014-10-01 false Secure Printing Processes and Other Secure Processes A Appendix A to Part 580 Transportation Other Regulations Relating to Transportation (Continued... DISCLOSURE REQUIREMENTS Pt. 580, App. A Appendix A to Part 580—Secure Printing Processes and Other Secure...

  16. 49 CFR Appendix A to Part 580 - Secure Printing Processes and Other Secure Processes

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 7 2012-10-01 2012-10-01 false Secure Printing Processes and Other Secure Processes A Appendix A to Part 580 Transportation Other Regulations Relating to Transportation (Continued... DISCLOSURE REQUIREMENTS Pt. 580, App. A Appendix A to Part 580—Secure Printing Processes and Other Secure...

  17. 49 CFR Appendix A to Part 580 - Secure Printing Processes and Other Secure Processes

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 7 2011-10-01 2011-10-01 false Secure Printing Processes and Other Secure Processes A Appendix A to Part 580 Transportation Other Regulations Relating to Transportation (Continued... DISCLOSURE REQUIREMENTS Pt. 580, App. A Appendix A to Part 580—Secure Printing Processes and Other Secure...

  18. 49 CFR Appendix A to Part 580 - Secure Printing Processes and Other Secure Processes

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 7 2013-10-01 2013-10-01 false Secure Printing Processes and Other Secure Processes A Appendix A to Part 580 Transportation Other Regulations Relating to Transportation (Continued... DISCLOSURE REQUIREMENTS Pt. 580, App. A Appendix A to Part 580—Secure Printing Processes and Other Secure...

  19. Teaching Process Design through Integrated Process Synthesis

    ERIC Educational Resources Information Center

    Metzger, Matthew J.; Glasser, Benjamin J.; Patel, Bilal; Hildebrandt, Diane; Glasser, David

    2012-01-01

    The design course is an integral part of chemical engineering education. A novel approach to the design course was recently introduced at the University of the Witwatersrand, Johannesburg, South Africa. The course aimed to introduce students to systematic tools and techniques for setting and evaluating performance targets for processes, as well as…

  20. Risk-based process safety assessment and control measures design for offshore process facilities.

    PubMed

    Khan, Faisal I; Sadiq, Rehan; Husain, Tahir

    2002-09-02

    Process operation is the most hazardous activity next to transportation and drilling operations on an offshore oil and gas (OOG) platform. Past experience of onshore and offshore oil and gas activities has revealed that a small mishap in the process operation might escalate to a catastrophe. This is of special concern on an OOG platform due to the limited space and compact geometry of the process area, limited ventilation, and difficult escape routes. On an OOG platform, each extra control measure that is implemented not only occupies space on the platform and increases congestion but also adds extra load to the platform. Eventualities in OOG platform process operation can be avoided by incorporating the appropriate control measures at the early design stage. In this paper, the authors describe a methodology for risk-based process safety decision making for OOG activities. The methodology is applied to various offshore process units, that is, the compressor, separators, flash drum and driers of an OOG platform. Based on the risk potential, appropriate safety measures are designed for each unit. This paper also illustrates that implementation of the designed safety measures reduces the high fatal accident rate (FAR) values to an acceptable level.

  1. Active pharmaceutical ingredient (API) production involving continuous processes--a process system engineering (PSE)-assisted design framework.

    PubMed

    Cervera-Padrell, Albert E; Skovby, Tommy; Kiil, Søren; Gani, Rafiqul; Gernaey, Krist V

    2012-10-01

    A systematic framework is proposed for the design of continuous pharmaceutical manufacturing processes. Specifically, the design framework focuses on organic chemistry based, active pharmaceutical ingredient (API) synthetic processes, but could potentially be extended to biocatalytic and fermentation-based products. The method exploits the synergic combination of continuous flow technologies (e.g., microfluidic techniques) and process systems engineering (PSE) methods and tools for faster process design and increased process understanding throughout the whole drug product and process development cycle. The design framework structures the many different and challenging design problems (e.g., solvent selection, reactor design, and design of separation and purification operations), driving the user from the initial drug discovery steps--where process knowledge is very limited--toward the detailed design and analysis. Examples from the literature of PSE methods and tools applied to pharmaceutical process design and novel pharmaceutical production technologies are provided along the text, assisting in the accumulation and interpretation of process knowledge. Different criteria are suggested for the selection of batch and continuous processes so that the whole design results in low capital and operational costs as well as low environmental footprint. The design framework has been applied to the retrofit of an existing batch-wise process used by H. Lundbeck A/S to produce an API: zuclopenthixol. Some of its batch operations were successfully converted into continuous mode, obtaining higher yields that allowed a significant simplification of the whole process. The material and environmental footprint of the process--evaluated through the process mass intensity index, that is, kg of material used per kg of product--was reduced to half of its initial value, with potential for further reduction. The case-study includes reaction steps typically used by the pharmaceutical

  2. A Controlled Agitation Process for Improving Quality of Canned Green Beans during Agitation Thermal Processing.

    PubMed

    Singh, Anika; Pratap Singh, Anubhav; Ramaswamy, Hosahalli S

    2016-06-01

    This work introduces the concept of a controlled agitation thermal process to reduce quality damage in liquid-particulate products during agitation thermal processing. Reciprocating agitation thermal processing (RA-TP) was used as the agitation thermal process. In order to reduce the impact of agitation, a new concept of "stopping agitations after sufficient development of cold-spot temperature" was proposed. Green beans were processed in No. 2 (307×409) cans filled with liquids of various consistencies (0% to 2% CMC) at various frequencies (1 to 3 Hz) of RA-TP using a full-factorial design, and heat penetration results were collected. The corresponding operator's process time to impart a 10-min process lethality (Fo) and the agitation time (AT) were calculated using the heat penetration results. Accordingly, products were processed again by stopping agitations according to three agitation regimes, namely full-time agitation, equilibration-time agitation, and partial-time agitation. Processed products were photographed and tested for visual quality, color, texture, breakage of green beans, turbidity, and percentage of insoluble solids in the can liquid. Results showed that stopping agitations after sufficient development of cold-spot temperatures is an effective way of reducing product damage caused by agitation (for example, breakage of beans and leaching into the liquid). Agitation until a one-log temperature difference gave the best color, texture, and visual product quality for low-viscosity liquid-particulate mixtures, and extended agitation until equilibration time was best for high-viscosity products. Thus, it was shown that a controlled agitation thermal process is more effective in obtaining high product quality than a regular agitation thermal process. © 2016 Institute of Food Technologists®
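
    The 10-min process lethality (Fo) referred to above is, in the standard formulation, the time integral of the lethal rate 10^((T - 121.1 °C)/z) evaluated at the cold spot, with z typically taken as 10 °C. A minimal numerical version is sketched below; the temperature history used is invented purely to show the calculation.

    ```python
    import numpy as np

    def process_lethality_F0(times_min, temps_C, T_ref=121.1, z=10.0):
        """Trapezoidal integration of F0 = integral of 10**((T - T_ref)/z) dt
        over the cold-spot temperature history (times in minutes)."""
        t = np.asarray(times_min, dtype=float)
        rate = 10.0 ** ((np.asarray(temps_C, dtype=float) - T_ref) / z)
        return float(np.sum(0.5 * (rate[1:] + rate[:-1]) * np.diff(t)))

    # invented cold-spot history: linear come-up followed by a hold near 121.5 C
    t = np.linspace(0.0, 40.0, 401)
    T = np.minimum(60.0 + 2.0 * t, 121.5)
    print(round(process_lethality_F0(t, T), 1))   # roughly 12 min for this history
    ```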

  3. 21 CFR 660.51 - Processing.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...) Processing method. (1) The processing method shall be one that has been shown to yield consistently a... be colored green. (3) Only that material which has been fully processed, thoroughly mixed in a single...

  4. Chemical Sensing in Process Analysis.

    ERIC Educational Resources Information Center

    Hirschfeld, T.; And Others

    1984-01-01

    Discusses: (1) rationale for chemical sensors in process analysis; (2) existing types of process chemical sensors; (3) sensor limitations, considering lessons of chemometrics; (4) trends in process control sensors; and (5) future prospects. (JN)

  5. Spacelab Data Processing Facility

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The capabilities of the Spacelab Data Processing Facility (SPDPF) are highlighted. The capturing, quality monitoring, processing, accounting, and forwarding of vital Spacelab data to various user facilities around the world are described.

  6. Telerobotic electronic materials processing experiment

    NASA Technical Reports Server (NTRS)

    Ollendorf, Stanford

    1991-01-01

    The Office of Commercial Programs (OCP), working in conjunction with NASA engineers at the Goddard Space Flight Center, is supporting research efforts in robot technology and microelectronics materials processing that will provide many spinoffs for science and industry. The Telerobotic Materials Processing Experiment (TRMPX) is a Shuttle-launched materials processing test payload using a Get Away Special can. The objectives of the project are to define, develop, and demonstrate an automated materials processing capability under realistic flight conditions. TRMPX will provide the capability to test the production processes that are dependent on microgravity. The processes proposed for testing include the annealing of amorphous silicon to increase grain size for more efficient solar cells, thin film deposition to demonstrate the potential of fabricating solar cells in orbit, and the annealing of radiation damaged solar cells.

  7. Image Processing Software

    NASA Technical Reports Server (NTRS)

    1992-01-01

    To convert raw data into environmental products, the National Weather Service and other organizations use the Global 9000 image processing system marketed by Global Imaging, Inc. The company's GAE software package is an enhanced version of the TAE, developed by Goddard Space Flight Center to support remote sensing and image processing applications. The system can be operated in three modes and is combined with HP Apollo workstation hardware.

  8. Hydropyrolysis process

    DOEpatents

    Ullman, Alan Z.; Silverman, Jacob; Friedman, Joseph

    1986-01-01

    An improved process for producing a methane-enriched gas wherein a hydrogen-deficient carbonaceous material is treated with a hydrogen-containing pyrolysis gas at an elevated temperature and pressure to produce a product gas mixture including methane, carbon monoxide and hydrogen. The improvement comprises passing the product gas mixture sequentially through a water-gas shift reaction zone and a gas separation zone to provide separate gas streams of methane and of a recycle gas comprising hydrogen, carbon monoxide and methane for recycle to the process. A controlled amount of steam also is provided which when combined with the recycle gas provides a pyrolysis gas for treatment of additional hydrogen-deficient carbonaceous material. The amount of steam used and the conditions within the water-gas shift reaction zone and gas separation zone are controlled to obtain a steady-state composition of pyrolysis gas which will comprise hydrogen as the principal constituent and a minor amount of carbon monoxide, steam and methane so that no external source of hydrogen is needed to supply the hydrogen requirements of the process. In accordance with a particularly preferred embodiment, conditions are controlled such that there also is produced a significant quantity of benzene as a valuable coproduct.
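
    The water-gas shift step referred to above is the standard equilibrium

    \[
      \mathrm{CO} + \mathrm{H_2O} \rightleftharpoons \mathrm{CO_2} + \mathrm{H_2},
      \qquad \Delta H^{\circ} \approx -41\ \mathrm{kJ\,mol^{-1}},
    \]

    which is why metering the steam fed to this zone lets the recycle loop convert carbon monoxide into the hydrogen needed for pyrolysis, keeping the process self-sufficient in hydrogen as described above.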

  9. Three year evaluation of Xpert MTB/RIF in a low prevalence tuberculosis setting: A Scottish perspective.

    PubMed

    Parcell, Benjamin J; Jarchow-MacDonald, Anna A; Seagar, Amie-Louise; Laurenson, Ian F; Prescott, Gordon J; Lockhart, Michael

    2017-05-01

    Xpert MTB/RIF (Cepheid) is a rapid molecular assay shown to be sensitive and specific for pulmonary tuberculosis (TB) diagnosis in highly endemic countries. We evaluated its diagnostic performance in a low TB prevalence setting, examined rifampicin resistance detection, and assessed its quantitative capability to predict graded auramine microscopy results and time to positivity (TTP) of culture. Xpert MTB/RIF was used to test respiratory samples over a 3-year period. Samples underwent graded auramine microscopy, solid/liquid culture, in-house IS6110 real-time PCR, and GenoType MTBDRplus (HAIN Lifescience) to determine rifampicin and/or isoniazid resistance. A total of 2103 Xpert MTB/RIF tests were performed. Compared to culture, sensitivity was 95.8%, specificity 99.5%, positive predictive value (PPV) 82.1%, and negative predictive value (NPV) 99.9%. A positive correlation was found between auramine microscopy grade and Xpert MTB/RIF assay load. We found a clear reduction in the median TTP as Xpert MTB/RIF assay load increased. Rifampicin resistance was detected. Xpert MTB/RIF was rapid and accurate in diagnosing pulmonary TB in a low prevalence area. Rapid results will influence infection prevention and control and treatment measures. The excellent NPV obtained suggests further work should be carried out to assess its role in replacing microscopy. Copyright © 2017 The British Infection Association. Published by Elsevier Ltd. All rights reserved.
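
    The performance figures quoted above are the standard 2 x 2 confusion-matrix quantities computed against the culture reference. The sketch below shows the definitions; the counts in the example are purely hypothetical and are not the study's data.

    ```python
    def diagnostic_metrics(tp, fp, fn, tn):
        """Sensitivity, specificity, PPV and NPV from a 2x2 table
        (index test vs. reference standard)."""
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn),
        }

    # hypothetical counts for illustration only
    print(diagnostic_metrics(tp=80, fp=15, fn=4, tn=1900))
    ```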

  10. The use of computed tomography scans and the Bender Gestalt Test in the assessment of competency to stand trial and criminal responsibility in the field of mental health and law.

    PubMed

    Mosotho, Nathaniel Lehlohonolo; Timile, Ino; Joubert, Gina

    Computed tomography and the Bender Gestalt Test are some of the tests used routinely for the assessment of alleged offenders referred under Sections 77 and 78 of the Criminal Procedure Act 51 of 1977. An exploratory retrospective study was conducted at the Free State Psychiatric Complex. The aim of this study was to identify the extent to which the Bender Gestalt Test results and the computed tomography scans are associated with outcomes in the assessment of competency to stand trial and criminal responsibility in individuals referred to the Free State Psychiatric Complex (FSPC) observation unit. This was a cross-sectional study, and the entire population of patients admitted in 2013 was included in the study. The clinical and demographic data were obtained from patient files. The majority of participants were black, male, single, and unemployed. The most common diagnosis was schizophrenia. The current study showed no statistically significant association between the Bender Gestalt Test Hain's scores and the outcome of criminal responsibility and competency to stand trial. Similarly, the study also showed no statistically significant association between the presence of a brain lesion and the outcome of criminal responsibility and competency to stand trial. It was also concluded that as CT scans are expensive, patients should be referred for that service only when there is a clear clinical indication to do so. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. The Integration of Word Processing with Data Processing in an Educational Environment. Final Report.

    ERIC Educational Resources Information Center

    Patterson, Lorna; Schlender, Jim

    A project examined the Office of the Future and determined trends regarding an integration of word processing and data processing. It then sought to translate those trends into an educational package to develop the potential information specialist. A survey instrument completed by 33 office managers and word processing and data processing…

  12. 49 CFR Appendix A to Part 580 - Secure Printing Processes and Other Secure Processes

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 7 2010-10-01 2010-10-01 false Secure Printing Processes and Other Secure... DISCLOSURE REQUIREMENTS Pt. 580, App. A Appendix A to Part 580—Secure Printing Processes and Other Secure... printing—a printing process utilized in the production of bank-notes and other security documents whereby...

  13. Spoken Language Processing Model: Bridging Auditory and Language Processing to Guide Assessment and Intervention

    ERIC Educational Resources Information Center

    Medwetsky, Larry

    2011-01-01

    Purpose: This article outlines the author's conceptualization of the key mechanisms that are engaged in the processing of spoken language, referred to as the spoken language processing model. The act of processing what is heard is very complex and involves the successful intertwining of auditory, cognitive, and language mechanisms. Spoken language…

  14. Clinical process cost analysis.

    PubMed

    Marrin, C A; Johnson, L C; Beggs, V L; Batalden, P B

    1997-09-01

    New systems of reimbursement are exerting enormous pressure on clinicians and hospitals to reduce costs. Using cheaper supplies or reducing the length of stay may be a satisfactory short-term solution, but the best strategy for long-term success is radical reduction of costs by reengineering the processes of care. However, few clinicians or institutions know the actual costs of medical care; nor do they understand, in detail, the activities involved in the delivery of care. Finally, there is no accepted method for linking the two. Clinical process cost analysis begins with the construction of a detailed flow diagram incorporating each activity in the process of care. The cost of each activity is then calculated, and the two are linked. This technique was applied to Diagnosis Related Group 75 to analyze the real costs of the operative treatment of lung cancer at one institution. Total costs varied between $6,400 and $7,700. The major driver of costs was personnel time, which accounted for 55% of the total. Forty percent of the total cost was incurred in the operating room. The cost of care decreased progressively during hospitalization. Clinical process cost analysis provides detailed information about the costs and processes of care. The insights thus obtained may be used to reduce costs by reengineering the process.
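
    The costing step lends itself to a very small illustration: once the flow diagram is in hand, each activity carries a time, a personnel rate, and direct supply costs, and the process cost is the sum over activities. All step names and figures below are invented, not values from the study.

    ```python
    # Invented activity list for one care process (times in minutes, costs in dollars)
    activities = [
        {"step": "pre-op clinic visit", "minutes": 60,   "rate_per_min": 2.0,  "supplies": 40.0},
        {"step": "operating room",      "minutes": 180,  "rate_per_min": 15.0, "supplies": 900.0},
        {"step": "post-op ward day",    "minutes": 1440, "rate_per_min": 1.2,  "supplies": 150.0},
    ]

    personnel = sum(a["minutes"] * a["rate_per_min"] for a in activities)
    total = personnel + sum(a["supplies"] for a in activities)
    print(f"total ${total:,.0f}, personnel share {personnel / total:.0%}")
    ```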

  15. Simulative design and process optimization of the two-stage stretch-blow molding process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hopmann, Ch.; Rasche, S.; Windeck, C.

    2015-05-22

    The total production costs of PET bottles are significantly affected by the costs of raw material. Approximately 70 % of the total costs are spent for the raw material. Therefore, stretch-blow molding industry intends to reduce the total production costs by an optimized material efficiency. However, there is often a trade-off between an optimized material efficiency and required product properties. Due to a multitude of complex boundary conditions, the design process of new stretch-blow molded products is still a challenging task and is often based on empirical knowledge. Application of current CAE-tools supports the design process by reducing development time and costs. This paper describes an approach to determine optimized preform geometry and corresponding process parameters iteratively. The wall thickness distribution and the local stretch ratios of the blown bottle are calculated in a three-dimensional process simulation. Thereby, the wall thickness distribution is correlated with an objective function and preform geometry as well as process parameters are varied by an optimization algorithm. Taking into account the correlation between material usage, process history and resulting product properties, integrative coupled simulation steps, e.g. structural analyses or barrier simulations, are performed. The approach is applied on a 0.5 liter PET bottle of Krones AG, Neutraubling, Germany. The investigations point out that the design process can be supported by applying this simulative optimization approach. In an optimization study the total bottle weight is reduced from 18.5 g to 15.5 g. The validation of the computed results is in progress.

  16. Simulative design and process optimization of the two-stage stretch-blow molding process

    NASA Astrophysics Data System (ADS)

    Hopmann, Ch.; Rasche, S.; Windeck, C.

    2015-05-01

    The total production costs of PET bottles are significantly affected by the costs of raw material. Approximately 70 % of the total costs are spent for the raw material. Therefore, stretch-blow molding industry intends to reduce the total production costs by an optimized material efficiency. However, there is often a trade-off between an optimized material efficiency and required product properties. Due to a multitude of complex boundary conditions, the design process of new stretch-blow molded products is still a challenging task and is often based on empirical knowledge. Application of current CAE-tools supports the design process by reducing development time and costs. This paper describes an approach to determine optimized preform geometry and corresponding process parameters iteratively. The wall thickness distribution and the local stretch ratios of the blown bottle are calculated in a three-dimensional process simulation. Thereby, the wall thickness distribution is correlated with an objective function and preform geometry as well as process parameters are varied by an optimization algorithm. Taking into account the correlation between material usage, process history and resulting product properties, integrative coupled simulation steps, e.g. structural analyses or barrier simulations, are performed. The approach is applied on a 0.5 liter PET bottle of Krones AG, Neutraubling, Germany. The investigations point out that the design process can be supported by applying this simulative optimization approach. In an optimization study the total bottle weight is reduced from 18.5 g to 15.5 g. The validation of the computed results is in progress.
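
    The optimization loop described in the two preceding records can be caricatured in a few lines: a black-box model returns a wall-thickness profile for a given set of preform and process parameters, and the optimizer trades total material against a minimum-thickness requirement. The model below is a stand-in invented for illustration; it is not the three-dimensional process simulation used by the authors, and the parameter names and target value are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    TARGET = 0.25  # hypothetical minimum wall thickness in mm

    def wall_thickness(params, n_zones=10):
        """Stand-in for the 3D process simulation: returns a wall-thickness
        profile over n_zones bottle zones for params = [preform_thickness_mm,
        preheat_scale]. The stretch-ratio profile is invented."""
        t0, heat = params
        stretch = 1.0 + 0.8 * np.sin(np.linspace(0.0, np.pi, n_zones))
        return t0 / (stretch * heat)

    def objective(params):
        t = wall_thickness(params)
        weight_proxy = t.sum()                                   # less material is better
        penalty = np.sum(np.maximum(TARGET - t, 0.0) ** 2) * 1e4  # keep walls above target
        return weight_proxy + penalty

    res = minimize(objective, x0=[3.0, 1.0],
                   bounds=[(1.5, 4.5), (0.8, 1.3)], method="L-BFGS-B")
    print(res.x, res.fun)
    ```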

  17. Conceptual Framework for the Mapping of Management Process with Information Technology in a Business Process

    PubMed Central

    Chellappa, Swarnalatha; Nagarajan, Asha

    2015-01-01

    This study on a component framework reveals the importance of management process and technology mapping in a business environment. We define ERP as a software tool that has to provide a business solution but not necessarily an integration of all the departments. Any business process can be classified as a management process, an operational process, or a supportive process. We went through the entire management process and were able to identify the influencing components to be mapped with a technology for a business solution. Governance, strategic management, and decision making are thoroughly discussed, and the need to map these components with the ERP is clearly explained. We also suggest that implementation of this framework might reduce ERP failures and, in particular, rectify the ERP misfit problem. PMID:25861688

  18. Conceptual framework for the mapping of management process with information technology in a business process.

    PubMed

    Rajarathinam, Vetrickarthick; Chellappa, Swarnalatha; Nagarajan, Asha

    2015-01-01

    This study on a component framework reveals the importance of management process and technology mapping in a business environment. We define ERP as a software tool that has to provide a business solution but not necessarily an integration of all the departments. Any business process can be classified as a management process, an operational process, or a supportive process. We went through the entire management process and were able to identify the influencing components to be mapped with a technology for a business solution. Governance, strategic management, and decision making are thoroughly discussed, and the need to map these components with the ERP is clearly explained. We also suggest that implementation of this framework might reduce ERP failures and, in particular, rectify the ERP misfit problem.

  19. PROCESS WATER BUILDING, TRA605. CONTEXTUAL VIEW, CAMERA FACING SOUTHEAST. PROCESS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PROCESS WATER BUILDING, TRA-605. CONTEXTUAL VIEW, CAMERA FACING SOUTHEAST. PROCESS WATER BUILDING AND ETR STACK ARE IN LEFT HALF OF VIEW. TRA-666 IS NEAR CENTER, ABUTTED BY SECURITY BUILDING; TRA-626, AT RIGHT EDGE OF VIEW BEHIND BUS. INL NEGATIVE NO. HD46-34-1. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  20. GPU applications for data processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vladymyrov, Mykhailo, E-mail: mykhailo.vladymyrov@cern.ch; Aleksandrov, Andrey; INFN sezione di Napoli, I-80125 Napoli

    2015-12-31

    Modern experiments that use nuclear photoemulsion require that data acquisition from the emulsion be fast and efficient. New approaches in developing scanning systems require real-time processing of large amounts of data. Methods that use Graphics Processing Unit (GPU) computing power for emulsion data processing are presented here. It is shown how GPU-accelerated emulsion processing helped us raise the scanning speed by a factor of nine.

  1. Model-based software process improvement

    NASA Technical Reports Server (NTRS)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  2. Carbon wastewater treatment process

    NASA Technical Reports Server (NTRS)

    Humphrey, M. F.; Simmons, G. M.; Dowler, W. L.

    1974-01-01

    A new powdered-carbon treatment process is being developed to eliminate the present problems associated with the disposal of biologically active sewage waste solids and with water reuse. This counter-current flow process produces an activated carbon, obtained from the pyrolysis of the sewage solids, and utilizes this material to remove the adulterating materials from the water. Additional advantages of the process are the elimination of odors, the removal of heavy metals, and the potential for energy conservation.

  3. Expert system and process optimization techniques for real-time monitoring and control of plasma processes

    NASA Astrophysics Data System (ADS)

    Cheng, Jie; Qian, Zhaogang; Irani, Keki B.; Etemad, Hossein; Elta, Michael E.

    1991-03-01

    To meet the ever-increasing demands of the rapidly growing semiconductor manufacturing industry, it is critical to have a comprehensive methodology integrating techniques for process optimization, real-time monitoring, and adaptive process control. To this end, we have developed an integrated knowledge-based approach combining the latest expert system technology, machine learning methods, and traditional statistical process control (SPC) techniques. This knowledge-based approach is advantageous in that it makes it possible for the tasks of process optimization and adaptive control to be performed consistently and predictably. Furthermore, this approach can be used to construct high-level, qualitative descriptions of processes and thus make process behavior easy to monitor, predict, and control. Two software packages, RIST (Rule Induction and Statistical Testing) and KARSM (Knowledge Acquisition from Response Surface Methodology), have been developed and integrated with two commercially available packages, G2 (a real-time expert system) and ULTRAMAX (a tool for sequential process optimization).
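
    As one concrete piece of the SPC side of such a system, a Shewhart-style check against 3-sigma limits derived from an in-control baseline run can flag when a monitored plasma-process variable drifts. The variable name and numbers below are invented for illustration and are not from the paper.

    ```python
    import numpy as np

    def shewhart_limits(baseline):
        """3-sigma control limits estimated from an in-control baseline run."""
        mu = float(np.mean(baseline))
        sigma = float(np.std(baseline, ddof=1))
        return mu - 3.0 * sigma, mu + 3.0 * sigma

    def out_of_control(value, limits):
        lo, hi = limits
        return value < lo or value > hi

    # invented baseline: etch-rate readings (nm/min) from a stable process
    baseline = np.random.default_rng(1).normal(500.0, 5.0, size=50)
    print(out_of_control(523.0, shewhart_limits(baseline)))   # likely True
    ```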

  4. Ultra-processed Food Intake and Obesity: What Really Matters for Health-Processing or Nutrient Content?

    PubMed

    Poti, Jennifer M; Braga, Bianca; Qin, Bo

    2017-12-01

    The aim of this narrative review was to summarize and critique recent evidence evaluating the association between ultra-processed food intake and obesity. Four of five studies found that higher purchases or consumption of ultra-processed food was associated with overweight/obesity. Additional studies reported relationships between ultra-processed food intake and higher fasting glucose, metabolic syndrome, increases in total and LDL cholesterol, and risk of hypertension. It remains unclear whether associations can be attributed to processing itself or the nutrient content of ultra-processed foods. Only three of nine studies used a prospective design, and the potential for residual confounding was high. Recent research provides fairly consistent support for the association of ultra-processed food intake with obesity and related cardiometabolic outcomes. There is a clear need for further studies, particularly those using longitudinal designs and with sufficient control for confounding, to potentially confirm these findings in different populations and to determine whether ultra-processed food consumption is associated with obesity independent of nutrient content.

  5. Cognitive Risk Factors for Specific Learning Disorder: Processing Speed, Temporal Processing, and Working Memory.

    PubMed

    Moll, Kristina; Göbel, Silke M; Gooch, Debbie; Landerl, Karin; Snowling, Margaret J

    2016-01-01

    High comorbidity rates between reading disorder (RD) and mathematics disorder (MD) indicate that, although the cognitive core deficits underlying these disorders are distinct, additional domain-general risk factors might be shared between the disorders. Three domain-general cognitive abilities were investigated in children with RD and MD: processing speed, temporal processing, and working memory. Since attention problems frequently co-occur with learning disorders, the study examined whether these three factors, which are known to be associated with attention problems, account for the comorbidity between these disorders. The sample comprised 99 primary school children in four groups: children with RD, children with MD, children with both disorders (RD+MD), and typically developing children (TD controls). Measures of processing speed, temporal processing, and memory were analyzed in a series of ANCOVAs including attention ratings as covariate. All three risk factors were associated with poor attention. After controlling for attention, associations with RD and MD differed: Although deficits in verbal memory were associated with both RD and MD, reduced processing speed was related to RD, but not MD; and the association with RD was restricted to processing speed for familiar nameable symbols. In contrast, impairments in temporal processing and visuospatial memory were associated with MD, but not RD. © Hammill Institute on Disabilities 2014.

  6. Efficient Process Migration for Parallel Processing on Non-Dedicated Networks of Workstations

    NASA Technical Reports Server (NTRS)

    Chanchio, Kasidit; Sun, Xian-He

    1996-01-01

    This paper presents the design and preliminary implementation of MpPVM, a software system that supports process migration for PVM application programs in a non-dedicated heterogeneous computing environment. New concepts of migration point as well as migration point analysis and necessary data analysis are introduced. In MpPVM, process migrations occur only at previously inserted migration points. Migration point analysis determines appropriate locations to insert migration points, whereas necessary data analysis provides a minimum set of variables to be transferred at each migration point. A new methodology to perform reliable point-to-point data communications in a migration environment is also discussed. Finally, a preliminary implementation of MpPVM and its experimental results are presented, showing the correctness and promising performance of our process migration mechanism in a scalable non-dedicated heterogeneous computing environment. While MpPVM is developed on top of PVM, the process migration methodology introduced in this study is general and can be applied to any distributed software environment.

  7. An Application of X-Ray Fluorescence as Process Analytical Technology (PAT) to Monitor Particle Coating Processes.

    PubMed

    Nakano, Yoshio; Katakuse, Yoshimitsu; Azechi, Yasutaka

    2018-06-01

    An attempt was made to apply X-ray fluorescence (XRF) analysis to the evaluation of a small-particle coating process as a process analytical technology (PAT). XRF analysis was used to monitor the coating level in a small-particle coating process in an at-line manner. A small-particle coating process usually consists of multiple coating steps. This study was conducted on simple coated particles prepared by a first coating of a model compound (DL-methionine) and a second coating of talc on spherical microcrystalline cellulose cores. Particles with this two-layer coating are sufficient to demonstrate the small-particle coating process. From the results, it was found that the XRF signals from the first coating (layering) and the second coating (mask coating) could track the extent of coating through different mechanisms. Furthermore, the coating of particles of different sizes was also investigated to evaluate the size effect in these coating processes. From these results, it was concluded that XRF can be used as a PAT tool for monitoring particle coating processes and can become a powerful tool in pharmaceutical manufacturing.

  8. Fuzzy image processing in sun sensor

    NASA Technical Reports Server (NTRS)

    Mobasser, S.; Liebe, C. C.; Howard, A.

    2003-01-01

    This paper will describe how the fuzzy image processing is implemented in the instrument. A comparison of the fuzzy image processing with a more conventional image processing algorithm is provided and shows that the fuzzy image processing yields better accuracy than conventional image processing.

  9. [Process optimisation in hospitals: from process to business organisation].

    PubMed

    Eberlein-Gonska, Maria

    2010-01-01

    Apart from a multidimensional quality definition and the understanding of quality as a company-wide challenge, a third essential element of quality management is prevention. Thus, company quality policy has to be prevention-oriented and requires both customer and process orientation as important prerequisites. Process orientation especially focuses on the critical analysis of work flows as a condition for identifying early intervention options which, in turn, may influence the result. Developing a business organisation requires the definition of criteria for space planning, room assignment and room integration in consideration of both medical and economic aspects and the architectural concept. Specific experiences will be demonstrated as a case study using the example of a new building in the midst of the Carl Gustav Carus University Hospital in Dresden, the Diagnostic Centre for Internal Medicine and Neurology. The hospital management commissioned the development of a sustainable and feasible business organisation for all the different departments. The idea was to create a medical centre where maximum use was made of all planned spaces and resources on the basis of target processes which had to be defined and agreed upon with all the persons concerned. In the next step, all the required personnel, space and operational resources were assigned. The success of management in all industries, including the health care sector, crucially depends on the translation of ideas into practice, with sustainability as a critical factor. In this context, support by the management as a role model, a formal framework for the respective project group and the definition of controlling via defined indicators are of special importance. The example of the Diagnostic Centre for Internal Medicine and Neurology demonstrates that changed processes may trigger a cultural change in which competition is gradually replaced by cooperation. Copyright © 2010. Published by

  10. Graphical Language for Data Processing

    NASA Technical Reports Server (NTRS)

    Alphonso, Keith

    2011-01-01

    A graphical language for processing data allows processing elements to be connected with virtual wires that represent data flows between processing modules. The processing of complex data, such as lidar data, requires many different algorithms to be applied. The purpose of this innovation is to automate the processing of complex data, such as LIDAR, without the need for complex scripting and programming languages. The system consists of a set of user-interface components that allow the user to drag and drop various algorithmic and processing components onto a process graph. By working graphically, the user can completely visualize the process flow and create complex diagrams. This innovation supports the nesting of graphs, such that a graph can be included in another graph as a single step for processing. In addition to the user interface components, the system includes a set of .NET classes that represent the graph internally. These classes provide the internal system representation of the graphical user interface. The system includes a graph execution component that reads the internal representation of the graph (as described above) and executes that graph. The execution of the graph follows the interpreted model of execution in that each node is traversed and executed from the original internal representation. In addition, there are components that allow external code elements, such as algorithms, to be easily integrated into the system, thus making the system infinitely expandable.
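
    The execution model described above, with processing nodes wired together and interpreted in dependency order, can be sketched compactly. The function and variable names below are hypothetical; the record's actual implementation is a set of .NET classes behind a graphical editor, not this small Python illustration.

    ```python
    # Minimal sketch of a dataflow graph executor: nodes are functions, "wires"
    # name which upstream outputs feed each node, and the graph is interpreted
    # in topological (dependency) order.
    from graphlib import TopologicalSorter

    def run_graph(nodes, wires, inputs):
        """nodes: {name: callable}; wires: {name: [upstream names]};
        inputs: externally supplied values keyed by node name."""
        results = dict(inputs)
        for name in TopologicalSorter(wires).static_order():
            if name in results:            # seed value, nothing to compute
                continue
            args = [results[up] for up in wires.get(name, [])]
            results[name] = nodes[name](*args)
        return results

    nodes = {"filter": lambda pts: [p for p in pts if p > 0],
             "stats": lambda pts: sum(pts) / len(pts)}
    wires = {"raw": [], "filter": ["raw"], "stats": ["filter"]}
    print(run_graph(nodes, wires, {"raw": [-1.0, 2.0, 4.0]})["stats"])
    ```

    Nesting, as described in the record, amounts to wrapping one such graph as a single callable node inside another.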

  11. Clients' emotional processing in psychotherapy: a comparison between cognitive-behavioral and process-experiential therapies.

    PubMed

    Watson, Jeanne C; Bedard, Danielle L

    2006-02-01

    The authors compared clients' emotional processing in good and bad outcome cases in cognitive behavioral therapy (CBT) and process-experiential therapy (PET) and investigated whether clients' emotional processing increases over the course of therapy. Twenty minutes from each of 3 sessions from 40 clients were rated on the Experiencing Scale. A 2x2x3 analysis of variance showed a significant difference between outcome and therapy groups, with clients in the good outcome and PET groups showing significantly higher levels of emotional processing than those in the poor outcome and CBT groups, respectively. Clients' level of emotional processing significantly increased from the beginning to the midpoint of therapy. The results indicate that CBT clients are more distant and disengaged from their emotional experience than clients in PET. Copyright (c) 2006 APA, all rights reserved.

  12. The Internet Process Addiction Test: Screening for Addictions to Processes Facilitated by the Internet.

    PubMed

    Northrup, Jason C; Lapierre, Coady; Kirk, Jeffrey; Rae, Cosette

    2015-07-28

    The Internet Process Addiction Test (IPAT) was created to screen for potential addictive behaviors that could be facilitated by the internet. The IPAT was created with the mindset that the term "Internet addiction" is structurally problematic, as the Internet is simply the medium that one uses to access various addictive processes. The role of the internet in facilitating addictions, however, cannot be minimized. A new screening tool that effectively directed researchers and clinicians to the specific processes facilitated by the internet would therefore be useful. This study shows that the Internet Process Addiction Test (IPAT) demonstrates good validity and reliability. Four addictive processes were effectively screened for with the IPAT: Online video game playing, online social networking, online sexual activity, and web surfing. Implications for further research and limitations of the study are discussed.

  13. Infrared processing of foods

    USDA-ARS?s Scientific Manuscript database

    Infrared (IR) processing of foods has been gaining popularity over conventional processing in several unit operations, including drying, peeling, baking, roasting, blanching, pasteurization, sterilization, disinfection, disinfestation, cooking, and popping. It has shown advantages over conventional...

  14. Methods in Astronomical Image Processing

    NASA Astrophysics Data System (ADS)

    Jörsäter, S.

    Contents: A Brief Introductory Note; History of Astronomical Imaging; Astronomical Image Data; Images in Various Formats; Digitized Image Data; Digital Image Data; Philosophy of Astronomical Image Processing; Properties of Digital Astronomical Images; Human Image Processing; Astronomical vs. Computer Science Image Processing; Basic Tools of Astronomical Image Processing; Display Applications; Calibration of Intensity Scales; Calibration of Length Scales; Image Re-shaping; Feature Enhancement; Noise Suppression; Noise and Error Analysis; Image Processing Packages: Design of AIPS and MIDAS; AIPS; MIDAS; Reduction of CCD Data; Bias Subtraction; Clipping; Preflash Subtraction; Dark Subtraction; Flat Fielding; Sky Subtraction; Extinction Correction; Deconvolution Methods; Rebinning/Combining; Summary and Prospects for the Future
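
    The CCD reduction steps listed in the contents follow a standard recipe that is easy to sketch with NumPy. The synthetic frames below exist only to show the call pattern; preflash subtraction, sky subtraction and extinction correction are omitted for brevity.

    ```python
    import numpy as np

    def reduce_ccd(raw, bias, dark, flat, exptime_s, dark_exptime_s):
        """Basic CCD calibration in the order listed above: bias subtraction,
        dark subtraction (scaled to exposure time), and flat fielding with a
        median-normalised flat."""
        debiased = raw - bias
        dark_rate = (dark - bias) / dark_exptime_s
        dark_sub = debiased - dark_rate * exptime_s
        flat_norm = (flat - bias) / np.median(flat - bias)
        return dark_sub / flat_norm

    # synthetic frames just to demonstrate the call pattern
    rng = np.random.default_rng(42)
    raw = rng.normal(1200.0, 10.0, (64, 64)); bias = np.full((64, 64), 300.0)
    dark = np.full((64, 64), 330.0); flat = rng.normal(20000.0, 100.0, (64, 64))
    print(reduce_ccd(raw, bias, dark, flat, 300.0, 600.0).mean().round(1))
    ```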

  15. Basic abnormalities in visual processing affect face processing at an early age in autism spectrum disorder.

    PubMed

    Vlamings, Petra Hendrika Johanna Maria; Jonkman, Lisa Marthe; van Daalen, Emma; van der Gaag, Rutger Jan; Kemner, Chantal

    2010-12-15

    A detailed visual processing style has been noted in autism spectrum disorder (ASD); this contributes to problems in face processing and has been directly related to abnormal processing of spatial frequencies (SFs). Little is known about the early development of face processing in ASD and the relation with abnormal SF processing. We investigated whether young ASD children show abnormalities in low spatial frequency (LSF, global) and high spatial frequency (HSF, detailed) processing and explored whether these are crucially involved in the early development of face processing. Three- to 4-year-old children with ASD (n = 22) were compared with developmentally delayed children without ASD (n = 17). Spatial frequency processing was studied by recording visual evoked potentials from visual brain areas while children passively viewed gratings (HSF/LSF). In addition, children watched face stimuli with different expressions, filtered to include only HSF or LSF. Enhanced activity in visual brain areas was found in response to HSF versus LSF information in children with ASD, in contrast to control subjects. Furthermore, facial-expression processing was also primarily driven by detail in ASD. Enhanced visual processing of detailed (HSF) information is present early in ASD and occurs for neutral (gratings), as well as for socially relevant stimuli (facial expressions). These data indicate that there is a general abnormality in visual SF processing in early ASD and are in agreement with suggestions that a fast LSF subcortical face processing route might be affected in ASD. This could suggest that abnormal visual processing is causative in the development of social problems in ASD. Copyright © 2010 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  16. Bank Record Processing

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Barnett Banks of Florida, Inc. operates 150 banking offices in 80 Florida cities. Banking offices have computerized systems for processing deposits or withdrawals in checking/savings accounts, and for handling commercial and installment loan transactions. In developing a network engineering design for the terminals used in record processing, an affiliate, Barnett Computing Company, used COSMIC's STATCOM program. This program provided a reliable network design tool and avoided the cost of developing new software.

  17. Qualitative Process Theory.

    DTIC Science & Technology

    1982-02-01

    Kenneth D. Forbus, Artificial Intelligence Laboratory, 545 Technology Square, Cambridge (contract N00014-80-C-0505). [OCR-damaged report documentation page; only fragments of the abstract are recoverable: reasoning about processes, their effects, and their limits; Qualitative Process theory defines a simple notion of ...]

  18. Coal liquefaction process

    DOEpatents

    Karr, Jr., Clarence

    1977-04-19

    An improved coal liquefaction process is provided which enables conversion of a coal-oil slurry to a synthetic crude refinable to produce larger yields of gasoline and diesel oil. The process is characterized by a two-step operation applied to the slurry prior to catalytic desulfurization and hydrogenation in which the slurry undergoes partial hydrogenation to crack and hydrogenate asphaltenes and the partially hydrogenated slurry is filtered to remove minerals prior to subsequent catalytic hydrogenation.

  19. Dawn Spacecraft Processing

    NASA Image and Video Library

    2007-04-10

    The Dawn spacecraft is seen here in clean room C of Astrotech's Payload Processing Facility. In the clean room, the spacecraft will undergo further processing. Dawn's mission is to explore two of the asteroid belt's most intriguing and dissimilar occupants: asteroid Vesta and the dwarf planet Ceres. The Dawn mission is managed by JPL, a division of the California Institute of Technology in Pasadena, for NASA's Science Mission Directorate in Washington, D.C.

  20. Image Processing Software

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The Ames digital image velocimetry technology has been incorporated in a commercially available image processing software package that allows motion measurement of images on a PC alone. The software, manufactured by Werner Frei Associates, is IMAGELAB FFT. IMAGELAB FFT is a general purpose image processing system with a variety of other applications, among them image enhancement of fingerprints and use by banks and law enforcement agencies for analysis of videos run during robberies.