Sample records for mcdowell-wellman process

  1. DEMONSTRATION OF WELLMAN-LORD/ALLIED CHEMICAL FGD TECHNOLOGY: DEMONSTRATION TEST SECOND YEAR RESULTS

    EPA Science Inventory

    The report gives results of an evaluation of the performance (over a 2-year period) of a full-scale flue gas desulfurization (FGD) unit to demonstrate the Wellman-Lord/Allied Chemical process. The process is regenerable, employing sodium sulfite wet scrubbing, thermal regeneratio...

  2. 27 CFR 9.36 - McDowell Valley.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ....” (b) Approved maps. The appropriate map for determining the boundaries of the McDowell Valley... and the ridge line (highest elevation line) between the McDowell Creek Valley and the Dooley Creek Valley. (3) Then southeasterly along the ridge line (highest elevation line) to the intersection of the...

  3. 27 CFR 9.36 - McDowell Valley.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ....” (b) Approved maps. The appropriate map for determining the boundaries of the McDowell Valley... and the ridge line (highest elevation line) between the McDowell Creek Valley and the Dooley Creek Valley. (3) Then southeasterly along the ridge line (highest elevation line) to the intersection of the...

  4. 40 CFR 81.334 - North Carolina.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Lee County X Lenoir County X Lincoln County X McDowell County X Macon County X Madison County X Martin... Lee County X Lenoir County X Lincoln County X McDowell County X Macon County X Madison County X Martin... County Lenoir County Lincoln County Macon County Madison County Martin County McDowell County Mecklenburg...

  5. Mountains to Climb: A Union-Led Partnership to Revitalize Education in McDowell County, West Virginia

    ERIC Educational Resources Information Center

    Dubin, Jennifer

    2016-01-01

    For more than four years, the AFT has guided the efforts of a public-private partnership to bring much-needed resources and services to McDowell County, West Virginia, a geographically isolated area in the heart of Appalachia. Known as Reconnecting McDowell, the initiative has also encouraged a renewed emphasis on improving education by focusing…

  6. A Silent Witness for Peace: The Case of Schoolteacher Mary Stone McDowell and America at War

    ERIC Educational Resources Information Center

    Howlett, Patricia; Howlett, Charles F.

    2008-01-01

    A 1964 television series, "Profiles in Courage," based on the late President John F. Kennedy's Pulitzer Prize-winning book, featured the life of Mary Stone McDowell, a quiet, yet strong, teacher. Within peace circles, McDowell was a well-known figure. Yet what captured the interest of the show's producers was the stand she took during World War I.…

  7. Citizen-Scientist Digitization of a Complex Geologic Map of the McDowell Mountains (Scottsdale, Arizona).

    NASA Astrophysics Data System (ADS)

    Gruber, D.; Skotnicki, S.; Gootee, B.

    2016-12-01

    The work of citizen scientists has become very important to researchers doing field work and internet-based projects but has not been widely utilized in digital mapping. The McDowell Mountains - located in Scottsdale, Arizona, at the edge of the basin-and-range province and protected as part of the McDowell Sonoran Preserve - are geologically complex. Until recently, no comprehensive geologic survey of the entire range had been done. Over the last 9 years geologist Steven Skotnicki spent 2000 hours mapping the complex geology of the range. His work, born of personal interest and partially supported by the McDowell Sonoran Conservancy, resulted in highly detailed hand-drawn survey maps. Dr. Skotnicki's work provides important new information and raises interesting research questions about the geology of this range. Citizen scientists of the McDowell Sonoran Conservancy Field Institute digitized Dr. Skotnicki's maps. A team of 10 volunteers, trained in ArcMap digitization techniques and led by volunteer project leader Daniel Gruber, performed the digitization work. Technical oversight of mapping using ArcMap, including provision of USGS-based mapping toolbars, was provided by Arizona Geological Survey (AZGS) research geologist Brian Gootee. The map digitization process identified and helped resolve a number of mapping questions. The citizen-scientist team spent 900 hours on training, digitization, quality checking, and project coordination with support and review by Skotnicki and Gootee. The resulting digital map has approximately 3000 polygons, 3000 points, and 86 map units with complete metadata and unit descriptions. The finished map is available online through AZGS and can be accessed in the field on mobile devices. User location is shown on the map and metadata can be viewed with a tap. The citizen scientist map digitization team has made this important geologic information available to the public and accessible to other researchers quickly and efficiently.

  8. Arsenic Removal from Drinking Water by Adsorptive Media U.S. EPA Demonstration Project at Wellman, TX, Final Performance Evaluation Report

    EPA Science Inventory

    This report documents the activities performed and the results obtained from the arsenic removal treatment technology demonstration project in the City of Wellman, TX. The main objective of the project was to evaluate the effectiveness of AdEdge Technologies’ AD-33 media in remo...

  9. Arsenic Removal from Drinking Water by Adsorptive Media U.S. EPA Demonstration Project at Wellman, TX, Six-Month Evaluation Report

    EPA Science Inventory

    This report documents the activities performed and the results obtained from the first six months of the arsenic removal treatment technology demonstration project in the City of Wellman, TX. The main objective of the project was to evaluate the effectiveness of AdEdge Technolog...

  10. 40 CFR 81.334 - North Carolina.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Lee County X Lenoir County X Lincoln County X McDowell County X Macon County X Madison County X Martin... Lee County X Lenoir County X Lincoln County X McDowell County X Macon County X Madison County X Martin... Hertford County Hoke County Hyde County Iredell County Jackson County Johnston County Jones County Lee...

  11. 40 CFR 81.334 - North Carolina.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Lee County X Lenoir County X Lincoln County X McDowell County X Macon County X Madison County X Martin... Lee County X Lenoir County X Lincoln County X McDowell County X Macon County X Madison County X Martin... Hertford County Hoke County Hyde County Iredell County Jackson County Johnston County Jones County Lee...

  12. Hidden Process Models

    DTIC Science & Technology

    2009-12-18

    cannot be detected with univariate techniques, but require multivariate analysis instead (Kamitani and Tong [2005]). Two other time series analysis ...learning for time series analysis. The historical record of DBNs can be traced back to Dean and Kanazawa [1988] and Dean and Wellman [1991], with... Keywords: Hidden Process Models, probabilistic time series modeling, functional Magnetic Resonance Imaging

  13. Needs assessment of school and community physical activity opportunities in rural West Virginia: the McDowell CHOICES planning effort.

    PubMed

    Kristjansson, Alfgeir L; Elliott, Eloise; Bulger, Sean; Jones, Emily; Taliaferro, Andrea R; Neal, William

    2015-04-03

    The McDowell CHOICES (Coordinated Health Opportunities Involving Communities, Environments, and Schools) Project is a county-wide endeavor aimed at increasing opportunities for physical activity (PA) in McDowell County, West Virginia (WV). A comprehensive needs assessment laid the foundation of the project. During the 6-month needs assessment, multiple sources of data were collected in two Town Hall Meetings (n = 80); a student online PA interest survey (n = 465); a PA and nutrition survey among 5th graders (10-11 years) and 8th graders (13-14 years) with questions adapted from the CDC's Youth Risk Behavior Surveillance Survey (n = 442, response rate = 82.2%); six semi-structured school and community focus groups (n = 44); school site visits (n = 11); and BMI screening (n = 550, response rate = 69.7%). One third of children in McDowell County meet the national PA minimum of 60 minutes daily. At least 40% of 5th and 8th graders engage in electronic screen activity for 3 hours or more every day. The prevalence of obesity in 5th graders is higher in McDowell County than in the rest of WV (~55% vs. 47%, respectively). SWOT analyses of focus group data suggest an overall interest in PA but also highlight a need for an increase in structured PA opportunities. Focus group data also suggested that a central communication platform (e.g., internet-based) would be beneficial for advertising and boosting participation in both current and future programs. Schools were commonly mentioned as potential facilities for public PA participation throughout the county, with regard to both access and convenience. School site visits suggest that schools need more equipment and resources for before-, during-, and after-school programs. An overwhelming majority of participants in the McDowell CHOICES needs assessment were interested in participating in more PA programs throughout the county and in improving opportunities for the provision of such programs. Public schools were widely recognized as the hub of their communities, provide the best venue for PA promotion for both students and adult citizens, and can potentially serve as a platform for change in rural communities such as McDowell County.

  14. 75 FR 34485 - Notice of Affirmative Decisions on Petitions for Modification Granted in Whole or in Part

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-17

    ...-08993, located in Boone County, West Virginia. Regulation Affected: 30 CFR 75.1101-1(b) (Deluge-type.... No. 46-08769, located in McDowell County, West Virginia. Regulation Affected: 30 CFR 75.1101-1(b.... No. 46-08884, located in McDowell County, West Virginia. Regulation Affected: 30 CFR 75.1101-1(b...

  15. Humans, Animals and the World We Inhabit--On and beyond the Symposium "Second Nature, 'Bildung' and McDowell: David Bakhurst's 'The Formation of Reason'"

    ERIC Educational Resources Information Center

    Misawa, Koichiro

    2017-01-01

    David Bakhurst's 2011 book "The Formation of Reason" explores the philosophy of John McDowell in general and the Aristotelian notion of second nature more specifically, topics to which philosophers of education have not yet given adequate attention. The book's widespread appeal led to the symposium "Second Nature, Bildung and…

  16. Interannual variations in needle and sapwood traits of Pinus edulis branches under an experimental drought

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guerin, Marceau; Martin-Benito, Dario; von Arx, Georg

    In recent years, widespread forest mortality in response to drought has been documented worldwide (Allen, Breshears & McDowell 2015). An example of widespread and rapid increase in drought-induced mortality, or die-off, was observed for Pinus edulis Engelm. across the Southwestern USA in response to several years of reduced rainfall and high vapor pressure deficits (VPD) (Breshears et al. 2009; Allen et al. 2010; Williams et al. 2013). Although stomatal closure under drought has been hypothesized to increase mortality through carbon starvation (McDowell et al. 2008; Breshears et al. 2009), more evidence exists for mortality being caused by hydraulic failure (Plaut et al. 2012; McDowell et al. 2013; Sevanto et al. 2014; Garcia-Forner et al. 2016). Regardless of the mechanism of drought-induced decline, maintaining a positive supply of water to the foliage is critical for tree functioning and survival.

  17. Interannual variations in needle and sapwood traits of Pinus edulis branches under an experimental drought

    DOE PAGES

    Guerin, Marceau; Martin-Benito, Dario; von Arx, Georg; ...

    2018-01-05

    In recent years, widespread forest mortality in response to drought has been documented worldwide (Allen, Breshears & McDowell 2015). An example of widespread and rapid increase in drought-induced mortality, or die-off, was observed for Pinus edulis Engelm. across the Southwestern USA in response to several years of reduced rainfall and high vapor pressure deficits (VPD) (Breshears et al. 2009; Allen et al. 2010; Williams et al. 2013). Although stomatal closure under drought has been hypothesized to increase mortality through carbon starvation (McDowell et al. 2008; Breshears et al. 2009), more evidence exists for mortality being caused by hydraulic failure (Plaut et al. 2012; McDowell et al. 2013; Sevanto et al. 2014; Garcia-Forner et al. 2016). Regardless of the mechanism of drought-induced decline, maintaining a positive supply of water to the foliage is critical for tree functioning and survival.

  18. First-Year Students' Impressions of Pair Programming in CS1

    ERIC Educational Resources Information Center

    Simon, Beth; Hanks, Brian

    2008-01-01

    Pair programming, as part of the Agile Development process, has noted benefits in professional software development scenarios. These successes have led to a rise in the use of pair programming in educational settings, particularly in Computer Science 1 (CS1). Specifically, McDowell et al. [2006] have shown that students using pair programming in CS1 do…

  19. Gas processing handbook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1982-04-01

    Brief details are given of processes including: BGC-Lurgi slagging gasification, COGAS, Exxon catalytic coal gasification, FW-Stoic 2-stage, GI two stage, HYGAS, Koppers-Totzek, Lurgi pressure gasification, Saarberg-Otto, Shell, Texaco, U-Gas, W-D.IGI, Wellman-Galusha, Westinghouse, and Winkler coal gasification processes; the Rectisol process; the Catacarb and the Benfield processes for removing CO2, H2S, and COS from gases produced by the partial oxidation of coal; the Selectamine DD, Selexol solvent, and Sulfinol gas cleaning processes; the sulphur-tolerant shift (SSK) process; and the Super-meth process for the production of high-Btu gas from synthesis gas.

  20. RESTRICTIVE CARDIOMYOPATHY AND SECONDARY CONGESTIVE HEART FAILURE IN A MCDOWELL'S CARPET PYTHON (MORELIA SPILOTA MCDOWELLI).

    PubMed

    Schilliger, Lionel; Chetboul, Valérie; Damoiseaux, Cécile; Nicolier, Alexandra

    2016-12-01

    Echocardiography is an established and noninvasive diagnostic tool used in herpetologic cardiology. Various cardiac lesions have been previously described in reptiles with the exception of restrictive cardiomyopathy. In this case report, restrictive cardiomyopathy and congestive heart failure associated with left atrial and sinus venosus dilation were diagnosed in a 2-yr-old captive lethargic McDowell's carpet python (Morelia spilota mcdowelli), based on echocardiographic, Doppler, and histopathologic examinations. This cardiomyopathy was also associated with thrombosis within the sinus venosus.

  1. The Cost of Thinking about False Beliefs: Evidence from Adults' Performance on a Non-Inferential Theory of Mind Task

    ERIC Educational Resources Information Center

    Apperly, Ian A.; Back, Elisa; Samson, Dana; France, Lisa

    2008-01-01

    Much of what we know about other people's beliefs comes non-inferentially from what people tell us. Developmental research suggests that 3-year-olds have difficulty processing such information: they suffer interference from their own knowledge of reality when told about someone's false belief (e.g., [Wellman, H. M., & Bartsch, K. (1988). Young…

  2. Southwestern Pine Forests Likely to Disappear

    ScienceCinema

    McDowell, Nathan

    2018-01-16

    A new study, led by Los Alamos National Laboratory's Nathan McDowell, suggests that a major forest type, the pine-juniper woodlands of the Southwestern U.S., could be wiped out by the end of this century due to climate change, and that conifers throughout much of the Northern Hemisphere may be on a similar trajectory. New results, reported in the journal Nature Climate Change, suggest that global models may underestimate predictions of forest death. McDowell and his large international team strove to provide the missing pieces in our understanding of tree death at three levels: plant, regional, and global. The team rigorously developed and evaluated multiple process-based and empirical models against experimental results, and then compared these models to results from global vegetation models to examine independent simulations. They discovered that the global models simulated mortality throughout the Northern Hemisphere of similar magnitude to, but much broader spatial scale than, what the evaluated ecosystem models predicted for the Southwest.

  3. Southwestern Pine Forests Likely to Disappear

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDowell, Nathan

    A new study, led by Los Alamos National Laboratory's Nathan McDowell, suggests that a major forest type, the pine-juniper woodlands of the Southwestern U.S., could be wiped out by the end of this century due to climate change, and that conifers throughout much of the Northern Hemisphere may be on a similar trajectory. New results, reported in the journal Nature Climate Change, suggest that global models may underestimate predictions of forest death. McDowell and his large international team strove to provide the missing pieces in our understanding of tree death at three levels: plant, regional, and global. The team rigorously developed and evaluated multiple process-based and empirical models against experimental results, and then compared these models to results from global vegetation models to examine independent simulations. They discovered that the global models simulated mortality throughout the Northern Hemisphere of similar magnitude to, but much broader spatial scale than, what the evaluated ecosystem models predicted for the Southwest.

  4. The critical amplifying role of increasing atmospheric moisture demand on tree mortality and associated regional die-off

    DOE PAGES

    Breshears, David D.; Adams, Henry D.; Eamus, Derek; ...

    2013-08-02

    Drought-induced tree mortality, including large-scale die-off events and increases in background rates of mortality, is a global phenomenon (Allen et al., 2010) that can directly impact numerous earth system properties and ecosystem goods and services (Adams et al., 2010; Breshears et al., 2011; Anderegg et al., 2013). Tree mortality is of particular concern because of the likelihood that it will increase in frequency and extent with climate change (McDowell et al., 2008, 2011; Adams et al., 2009; McDowell, 2011; Williams et al., 2013). Recent plant science advances related to drought have focused not only on understanding the physiological mechanisms that affect plant growth and associated carbon metabolism, but also on the more challenging issue of predicting plant mortality thresholds (McDowell et al., 2013). Although some advances related to mechanisms of mortality have been made and have increased emphasis on interrelationships between carbon metabolism and plant hydraulics (McDowell et al., 2011), notably few studies have specifically evaluated effects of increasing atmospheric demand for moisture (i.e., vapour pressure deficit; VPD) on rates of tree death. In this opinion article we highlight the importance of considering the key risks of future large-scale tree die-off and other mortality events arising from increased VPD. Here we focus on mortality of trees, but our point about the importance of VPD is also relevant to other vascular plants.

  5. Test Review: E. Schopler, M. E. Van Bourgondien, G. J. Wellman, & S. R. Love "Childhood Autism Rating Scale" (2nd Ed.). Los Angeles, CA--Western Psychological Services, 2010

    ERIC Educational Resources Information Center

    Vaughan, Chelsea A.

    2011-01-01

    The author reviews Childhood Autism Rating Scale-Second Edition (CARS2), a useful tool for supporting the diagnostic process and for forming intervention recommendations once a diagnosis has been made. CARS2 consists of three rating forms designed to identify symptoms associated with autism spectrum disorders (ASD). Published in 2010, the CARS2…

  6. Global warming accelerates drought-induced forest death

    ScienceCinema

    McDowell, Nathan; Pockman, William

    2018-05-16

    Many southwestern forests in the United States will disappear or be heavily altered by 2050, according to a series of joint Los Alamos National Laboratory-University of New Mexico studies. Nathan McDowell, a Los Alamos plant physiologist, and William Pockman, a UNM biology professor, explain that their research, and more from scientists around the world, is forecasting that by 2100 most conifer forests should be heavily disturbed, if not gone, as air temperatures rise in combination with drought. "Everybody knows trees die when there's a drought, if there's bark beetles or fire, yet nobody in the world can predict it with much accuracy," McDowell said. "What's really changed is that the temperature is going up." Thus the researchers are imposing artificial drought conditions on segments of wild forest in the Southwest and pushing forests to their limit to discover the exact processes of mortality and survival. The study is centered on drought experiments in woodlands at both Los Alamos and the Sevilleta National Wildlife Refuge in central New Mexico. Both sites are testing hypotheses about how forests die on mature, wild trees, rather than seedlings in a greenhouse, through the ecosystem-scale removal of 50 percent of yearly precipitation through large water-diversion trough systems.

  7. Citizen-Scientist Led Quartz Vein Investigation in the McDowell Sonoran Preserve, Scottsdale, Arizona, Resulting in Significant Geologic Discoveries and a Peer-Reviewed Report Coauthored and with Maps by Citizen-Scientists.

    NASA Astrophysics Data System (ADS)

    Gruber, D.; Gootee, B.

    2016-12-01

    Citizen-scientists of the McDowell Sonoran Conservancy Field Institute originated and led this project to study milky quartz deposits. Milky quartz veins of all sizes are visible throughout the McDowell Sonoran Preserve (Scottsdale, Arizona) and are commonly found in Arizona Proterozoic rocks. No research on milky quartz has been done locally and little is known about its formation and emplacement history. Working with Brian Gootee, research geologist with the Arizona Geological Survey (AZGS), a citizen science team identified candidate study sites with large quartz veins and then conducted aerial balloon photography followed by geologic mapping, basic data collection, photo-documentation, and sampling from two sites. Samples were analyzed with a UV lamp, Geiger counter, and x-ray fluorescence spectrometer. Petroscopic analysis and interpretation of the samples were done by Gootee. Daniel Gruber, the citizen-science project leader, and Gootee summarized methodology, sample analyses, and interpretation in a report including detailed geologic maps. Analysis of samples from one site provided evidence of several events of Proterozoic quartz formation. The other site hosted pegmatite, cumulates, graphic granite and orbicular granite in association with milky quartz, all discovered by citizen scientists. The milky quartz and surrounding pegmatites in granite at this site trace the progression of late-stage crystallization at the margin of a fractionated granite batholith, providing an exemplary opportunity for further research into batholith geochemistry and evolution. The project required 1000 hours of citizen-science time for training, field work, data organization and entry, mapping, and writing. The report by Gootee and Gruber was reviewed and published by AZGS as an Open File Report in its online document repository. 
The citizen scientist team leveraged the time of professional geologists to expand knowledge of an important geologic feature of the McDowell Mountains.

  8. A field- and laboratory-based quantitative analysis of alluvium: Relating analytical results to TIMS data

    NASA Technical Reports Server (NTRS)

    Wenrich, Melissa L.; Hamilton, Victoria E.; Christensen, Philip R.

    1995-01-01

    Thermal Infrared Multispectral Scanner (TIMS) data were acquired over the McDowell Mountains northeast of Scottsdale, Arizona during August 1994. The raw data were processed to emphasize lithologic differences using a decorrelation stretch and assigning bands 5, 3, and 1 to red, green, and blue, respectively. Processed data of alluvium flanking the mountains exhibit moderate color variation. The objective of this study was to determine, using a quantitative approach, what environmental variable(s), in the absence of bedrock, is/are responsible for influencing the spectral properties of the desert alluvial surface.
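    The decorrelation stretch mentioned in this record can be sketched in a few lines. The following is a generic PCA-based illustration on synthetic data, not the actual TIMS processing pipeline; the function name and the synthetic "bands" are our own assumptions:

```python
import numpy as np

def decorrelation_stretch(bands):
    """Decorrelation stretch: rotate bands to principal axes, equalize
    their variances, and rotate back, exaggerating color differences
    between spectrally similar surfaces."""
    # bands: (n_pixels, n_bands) float array
    mean = bands.mean(axis=0)
    centered = bands - mean
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    # cov^(-1/2): whitens the data along its principal axes
    whiten = eigvecs @ np.diag(1.0 / np.sqrt(eigvals)) @ eigvecs.T
    # Rescale so the output keeps a comparable overall spread
    target_sigma = np.sqrt(eigvals.mean())
    return centered @ whiten * target_sigma + mean

# Example: three synthetic, highly correlated "bands"
rng = np.random.default_rng(0)
base = rng.normal(size=(1000, 1))
bands = base + 0.1 * rng.normal(size=(1000, 3))
stretched = decorrelation_stretch(bands)
corr = np.corrcoef(stretched, rowvar=False)  # off-diagonals driven toward zero
```

    After the stretch, assigning three decorrelated bands to red, green, and blue (as the study does with bands 5, 3, and 1) spreads subtle lithologic differences across a much wider range of hues.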

  9. Behavioral variability in an evolutionary theory of behavior dynamics.

    PubMed

    Popa, Andrei; McDowell, J J

    2016-03-01

    McDowell's evolutionary theory of behavior dynamics (McDowell, 2004) instantiates populations of behaviors (abstractly represented by integers) that evolve under the selection pressure of the environment in the form of positive reinforcement. Each generation gives rise to the next via low-level Darwinian processes of selection, recombination, and mutation. The emergent patterns can be analyzed and compared to those produced by biological organisms. The purpose of this project was to explore the effects of high mutation rates on behavioral variability in environments that arranged different reinforcer rates and magnitudes. Behavioral variability increased with the rate of mutation. High reinforcer rates and magnitudes reduced these effects; low reinforcer rates and magnitudes augmented them. These results are in agreement with live-organism research on behavioral variability. Various combinations of mutation rates, reinforcer rates, and reinforcer magnitudes produced similar high-level outcomes (equifinality). These findings suggest that the independent variables that describe an experimental condition interact; that is, they do not influence behavior independently. These conclusions have implications for the interpretation of high levels of variability, mathematical undermatching, and the matching theory. The last part of the discussion centers on a potential biological counterpart for the rate of mutation, namely spontaneous fluctuations in the brain's default mode network. © 2016 Society for the Experimental Analysis of Behavior.
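    The generational cycle this abstract describes (selection, recombination, and mutation acting on an integer-coded repertoire under reinforcement) can be illustrated with a deliberately simplified sketch. The fitness rule, midpoint recombination, and random-walk mutation below are placeholder assumptions for illustration, not the parameterization of McDowell's actual model:

```python
import random

def next_generation(population, reinforced_zone, mutation_rate, rng):
    """One toy cycle of selection by consequences: behaviors are
    integers; emitting one inside reinforced_zone is 'reinforced',
    and parents nearer the emitted behavior are favored."""
    lo, hi = reinforced_zone
    emitted = rng.choice(population)
    if lo <= emitted <= hi:
        # Reinforcement: fitter behaviors are closer to the emitted one
        ranked = sorted(population, key=lambda b: abs(b - emitted))
    else:
        # No reinforcement: selection is effectively random
        ranked = rng.sample(population, len(population))
    parents = ranked[: len(population) // 2]
    offspring = []
    while len(offspring) < len(population):
        a, b = rng.sample(parents, 2)
        child = (a + b) // 2                 # recombination (midpoint)
        if rng.random() < mutation_rate:
            child += rng.randint(-50, 50)    # mutation (random step)
        offspring.append(child)
    return offspring

rng = random.Random(42)
pop = [rng.randint(0, 1000) for _ in range(100)]
for _ in range(200):
    pop = next_generation(pop, (450, 550), 0.1, rng)
```

    Raising `mutation_rate` in this sketch keeps the repertoire more dispersed between reinforcements, which is the qualitative effect on behavioral variability that the study examines.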

  10. Global warming accelerates drought-induced forest death

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDowell, Nathan; Pockman, William

    2013-07-09

    Many southwestern forests in the United States will disappear or be heavily altered by 2050, according to a series of joint Los Alamos National Laboratory-University of New Mexico studies. Nathan McDowell, a Los Alamos plant physiologist, and William Pockman, a UNM biology professor, explain that their research, and more from scientists around the world, is forecasting that by 2100 most conifer forests should be heavily disturbed, if not gone, as air temperatures rise in combination with drought. "Everybody knows trees die when there's a drought, if there's bark beetles or fire, yet nobody in the world can predict it with much accuracy," McDowell said. "What's really changed is that the temperature is going up." Thus the researchers are imposing artificial drought conditions on segments of wild forest in the Southwest and pushing forests to their limit to discover the exact processes of mortality and survival. The study is centered on drought experiments in woodlands at both Los Alamos and the Sevilleta National Wildlife Refuge in central New Mexico. Both sites are testing hypotheses about how forests die on mature, wild trees, rather than seedlings in a greenhouse, through the ecosystem-scale removal of 50 percent of yearly precipitation through large water-diversion trough systems.

  11. A computational model of selection by consequences: log survivor plots.

    PubMed

    Kulubekova, Saule; McDowell, J J

    2008-06-01

    [McDowell, J.J., 2004. A computational model of selection by consequences. J. Exp. Anal. Behav. 81, 297-317] instantiated the principle of selection by consequences in a virtual organism with an evolving repertoire of possible behaviors undergoing selection, reproduction, and mutation over many generations. The process is based on the computational approach, which is non-deterministic and rules-based. The model proposes a causal account for operant behavior. McDowell found that the virtual organism consistently showed a hyperbolic relationship between response and reinforcement rates according to the quantitative law of effect. To continue validation of the computational model, the present study examined its behavior on the molecular level by comparing the virtual organism's IRT distributions in the form of log survivor plots to findings from live organisms. Log survivor plots did not show the "broken-stick" feature indicative of distinct bouts and pauses in responding, although the bend in slope of the plots became more defined at low reinforcement rates. The shape of the virtual organism's log survivor plots was more consistent with the data on reinforced responding in pigeons. These results suggest that log survivor plot patterns of the virtual organism were generally consistent with the findings from live organisms, providing further support for the computational model of selection by consequences as a viable account of operant behavior.
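    The log survivor plots discussed above are simple to compute from a list of inter-response times (IRTs); a minimal sketch (the function name is ours):

```python
import math

def log_survivor(irts):
    """For each observed IRT t, return (t, log10 of the proportion of
    IRTs strictly longer than t). A 'broken-stick' shape (steep early
    slope, shallow late slope) indicates responding in bouts separated
    by pauses."""
    irts = sorted(irts)
    n = len(irts)
    points = []
    for i, t in enumerate(irts):
        surviving = n - (i + 1)  # count of IRTs longer than t
        if surviving > 0:
            points.append((t, math.log10(surviving / n)))
    return points

pts = log_survivor([1.0, 2.0, 3.0, 4.0])
```

    A single-process (roughly exponential) IRT distribution yields an approximately straight line on this plot, whereas a mixture of short within-bout and long between-bout IRTs produces the bend the abstract refers to.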

  12. Mountain Island Lake, North Carolina; analysis of ambient conditions and simulation of hydrodynamics, constituent transport, and water-quality characteristics, 1996–97

    USGS Publications Warehouse

    Bales, Jerad D.; Sarver, Kathleen M.; Giorgino, Mary J.

    2001-01-01

    Mountain Island Lake is an impoundment of the Catawba River in North Carolina and supplies drinking water to more than 600,000 people in Charlotte, Gastonia, Mount Holly, and several other communities. The U.S. Geological Survey, in cooperation with the Charlotte-Mecklenburg Utilities, conducted an investigation of the reservoir to characterize hydrologic and water-quality conditions and to develop and apply a simulation model to predict the response of the reservoir to changes in constituent loadings or the flow regime. During 1996–97, flows into Mountain Island Lake were dominated by releases from Cowans Ford Dam on Lake Norman, with more than 85 percent of the total inflow to the reservoir coming from Lake Norman. Riverbend Steam Station discharges accounted for about 12 percent of the inflows to the reservoir, and inflows from tributary streams contributed less than 1.5 percent of the total inflows. Releases through Mountain Island Dam accounted for about 81 percent of outflows from the reservoir, while Riverbend Steam Station withdrawals, which were equal to discharge from the facility, constituted about 13 percent of the reservoir withdrawals. About 5.5 percent of the withdrawals from the reservoir were for water supply. Strong thermal stratification was seldom observed in Mountain Island Lake during April 1996-September 1997. As a result, dissolved-oxygen concentrations were only infrequently less than 4 milligrams per liter, and seldom less than 5 milligrams per liter throughout the entire reservoir, including the coves. The Riverbend Steam Station thermal discharge had a pronounced effect on surface-water temperatures near the outfall. McDowell Creek, which drains to McDowell Creek cove, receives treated wastewater from a large municipal facility and has exhibited signs of poor water-quality conditions in the past.
During April 1996-September 1997, concentrations of nitrate, ammonia, total phosphorus, and chlorophyll a were higher in McDowell Creek cove than elsewhere throughout the reservoir. Nevertheless, the highest chlorophyll a concentration measured during the study was 13 micrograms per liter—well below the North Carolina ambient water-quality standard of 40 micrograms per liter. In the mainstem of the reservoir, near-bottom ammonia concentrations occasionally were greater than near-surface concentrations. However, the relatively large top-to-bottom differences in ammonia and phosphorus that have been observed in other Catawba River reservoirs were not present in Mountain Island Lake.External loadings of suspended solids, nitrogen, phosphorus, and biochemical oxygen demand were determined for May 1996-April 1997. Flows through Cowans Ford Dam contributed more than 80 percent of the biochemical oxygen demand and nitrogen load to the reservoir, with McDowell Creek contributing about 15 percent of the biochemical oxygen demand load. In contrast, McDowell Creek contributed about half of the phosphorus load to the reservoir, while inflows through Cowans Ford Dam contributed about one-fourth of the phosphorus load, and the McDowell Creek wastewater-treatment plant contributed about 15 percent of the total phosphorus load. The remainder of the phosphorus loadings came from Gar Creek and the discharge from the Riverbend ash settling pond.Mountain Island Lake is a relatively small (11.3-square-kilometer surface area) impoundment. An area of 181 square kilometers drains directly to the reservoir, but much of this area is undergoing development. In addition, the reservoir receives treated effluent from a municipal wastewater-treatment facility.The two-dimensional, laterally averaged model CE-QUAL-W2 was applied to Mountain Island Lake. The model was configured to simulate water level, water temperature, and 12 water-quality constituents. 
The model included the mainstem, four coves, three point-source discharges, and three withdrawals.Simulated water levels generally were within 10 centimeters of measured values, indicating a good calibration of the water balance for the reservoir. The root-mean-square difference between measured and simulated water temperatures was about 1 to 1.5 degrees Celsius, and vertical distributions of water temperature were accurately simulated in both the mainstem and coves.Seasonal and spatial patterns of nitrate, ammonia, orthophosphorus, and chlorophyll a were reasonably reproduced by the water-quality model. Because of the absence of the denitrification process in the model formulation, nitrate concentrations typically were overpredicted. Simulated and measured ammonia concentrations seldom differed by more than 0.01 milligram per liter, and simulations of seasonal fluctuations in chlorophyll a were representative of measured conditions. The root mean square of the difference between measured and simulated dissolved-oxygen concentrations was about 1 milligram per liter.The calibrated water-quality model was applied to evaluate (1) the movement of a conservative, neutrally buoyant material, or tracer, through the reservoir for several sets of conditions; (2) the effects of the Riverbend thermal discharge on water temperature in the reservoir; (3) the effects of changes in water-supply withdrawal rates on water-quality conditions; and (4) changes in reservoir water quality in response to changes in point- and nonpoint-source loadings. In general, dissolved material entering Mountain Island Lake from both Cowans Ford Dam and McDowell Creek during the summer moves along the bottom of the lake toward Mountain Island Dam, with little mixing of dissolved material into the surface layers. Simulations suggest that dissolved material can move upstream in the reservoir when flows from Cowans Ford Dam are near zero. 
Dissolved material can remain in Mountain Island Lake for a period far in excess of the theoretical retention time of 12 days.Simulations indicated that the Riverbend thermal discharge increases water temperature in the surface layers of the downstream part of the reservoir by as much as 5 degrees Celsius. However, the discharge has little effect on near-bottom water temperature.Based on model simulations, a proposed doubling of the water-supply withdrawals from Mountain Island Lake has no readily apparent effect on water quality in the reservoir. The increased withdrawal rate may have some localized effects on circulation in the reservoir, but a more detailed model of the intake zone would be required to identify those effects.The effects of a 20-percent increase in water-chemistry loadings through Cowans Ford Dam and from McDowell Creek were simulated separately. Increased loadings from Cowans Ford Dam had about the same effect on water-quality conditions near Mountain Island Dam as did increased loadings from McDowell Creek. Maintaining good water quality in Mountain Island Lake depends on maintaining good water quality in Lake Norman as well as in the inflows from the McDowell Creek watershed.
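The calibration statistic quoted in the abstract (root-mean-square difference between measured and simulated series) is straightforward to compute. A minimal sketch with invented water-temperature pairs, not data or code from the study:

```python
import math

def rmse(measured, simulated):
    """Root-mean-square difference between paired measured and simulated values."""
    if len(measured) != len(simulated):
        raise ValueError("series must be the same length")
    sq = [(m - s) ** 2 for m, s in zip(measured, simulated)]
    return math.sqrt(sum(sq) / len(sq))

# Hypothetical water-temperature pairs, degrees Celsius
measured  = [12.1, 14.3, 18.0, 22.5, 25.1]
simulated = [11.5, 15.0, 17.2, 23.4, 26.0]
print(round(rmse(measured, simulated), 2))  # → 0.79
```

A value near 1 degree Celsius, as reported for the CE-QUAL-W2 temperature calibration, would indicate a comparably close fit.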

  13. Four-State Cost Study. Revised

    ERIC Educational Resources Information Center

    Conger, Sharmila Basu; Bell, Alli; Stanley, Jeff

    2010-01-01

    As part of Lumina Foundation's state productivity initiative in higher education, the State Higher Education Executive Officers (SHEEO) hosted a discussion of state level higher education cost studies in May 2008. After subsequent conversations with Jane Wellman, Executive Director of the Delta Cost Project, and SHEEO representatives from four…

  14. 22. ORE STORAGE BRIDGE AT PLANT'S LOWER DOCK, LOOKING SOUTH. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

22. ORE STORAGE BRIDGE AT PLANT'S LOWER DOCK, LOOKING SOUTH. THE WELLMAN ORE BRIDGES (THERE ARE TWO AT THIS DOCK) WERE ERECTED IN 1908 BY THE CORRIGAN, McKINNEY COMPANY. - Corrigan, McKinney Steel Company, 3100 East Forty-fifth Street, Cleveland, Cuyahoga County, OH

  15. Young Alumni Giving: An Exploration of Institutional Strategies

    ERIC Educational Resources Information Center

    Bent, Lauren G.

    2012-01-01

    An economy struggling to rise out of recession presents a difficult time for institutions of higher education as state and federal support for higher education is not keeping pace with costs (Newman, Couturier, & Scurry, 2004; Wellman, Desrochers, & Lenihan, 2009). It is essential for colleges and universities to raise funds from external…

  16. Hydrologic processes and nutrient dynamics in a pristine mountain catchment

    USGS Publications Warehouse

    F. Richard Hauer,; Fagre, Daniel B.; Stanford, Jack A.

    2002-01-01

Nutrient dynamics in watersheds have been used as an ecosystem-level indicator of overall ecosystem function or response to disturbance (e.g., Bormann et al. 1974, Webster et al. 1992). Nutrients have been examined to determine responses to logging practices or other changes in watershed land use. Nutrient dynamics have also been related to changing physical and biological characteristics (Mulholland 1992, Chestnut & McDowell 2000). Herein, the concentrations and dynamics of nitrogen, phosphorus, and particulate organic carbon were examined in a large pristine watershed because they are affected by changes in discharge directly from the catchment and after passage through a large oligotrophic lake.

  17. Theory of Mind and Executive Function in Chinese Preschool Children

    ERIC Educational Resources Information Center

    Duh, Shinchieh; Paik, Jae H.; Miller, Patricia H.; Gluck, Stephanie C.; Li, Hui; Himelfarb, Igor

    2016-01-01

    Cross-cultural research on children's theory of mind (ToM) understanding has raised questions about its developmental sequence and relationship with executive function (EF). The current study examined how ToM develops (using the tasks from Wellman & Liu, 2004) in relation to 2 EF skills (conflict inhibition, working memory) in 997 Chinese…

  18. Interpersonal Consulting Skills among Instructional Technology Consultants at an Institution of Higher Education in the Midwest--A Multiple Case Study

    ERIC Educational Resources Information Center

    van Leusen, Peter

    2013-01-01

    As new developments in digital technologies rapidly influence our society, higher education organizations are under increasing pressure to utilize new instructional methods and technologies to educate students (Educause, 2005; Phipps & Wellman, 2001; U.S. Department of Education, 2010). The task to integrate these tools into teaching and…

  19. EXTERNAL OVERVIEW OF WHEELROOM (TURBINE ROOM) WITH PENSTOCK FOR #1 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

EXTERNAL OVERVIEW OF WHEELROOM (TURBINE ROOM) WITH PENSTOCK FOR #1 AND #2 GENERATORS AND #2 EXCITER, VIEWED WEST TO EAST. TURBINES ARE HORIZONTAL TWIN FRANCIS TURBINES, MANUFACTURED BY WELLMAN-SEAVER MORGAN CO. IN 1911. PHOTO BY JET LOWE, HAER, 1995. - Elwha River Hydroelectric System, Elwha Hydroelectric Dam & Plant, Port Angeles, Clallam County, WA

  20. 13. VIEW OF AREAWAY 104 LOOKING SOUTHEAST, TOWARD ANIMAL QUARTERS. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. VIEW OF AREAWAY 104 LOOKING SOUTHEAST, TOWARD ANIMAL QUARTERS. WOOD STUD WALL WITH WIRE MESH INFILL HAS BEEN ADDED. HORIZONTAL WOOD PANELING ON WALLS IS PAINTED. FLOOR IS CONCRETE SLAB. - Presidio of San Francisco, Cavalry Stables, Cowles Street, between Lincoln Boulevard & McDowell Street, San Francisco, San Francisco County, CA

  1. 2010 Neuroscience Director’s Strategic Initiative

    DTIC Science & Technology

    2011-02-01

Understanding how Soldiers’ cognitive abilities meet the increasing demands of dynamic...In order to acquire, monitor, and assess Soldier sensory, perceptual, emotional, cognitive, and physical performance within realistic operational...brain state classification algorithm from EEG data acquired from participants performing tasks with varied cognitive demands. Third, Kaleb McDowell

  2. The Eagle’s Talons. The American Experience at War

    DTIC Science & Technology

    1988-12-01

carrying American passengers (for example, the Lusitania), became the most volatile issue between the United States and Germany. It...99, 103-4, 108, 118-22 Lusitania: 144, 374 Lys River: 163 McClernand, John Alexander: 103 McDowell, Irvin: 105, 114-15 MacArthur, Douglas: xv, 16

  3. Metric Selection for Ecosystem Restoration

    DTIC Science & Technology

    2013-06-01

    focus on wetlands, submerged aquatic vegetation, oyster reefs, riparian forest, and wet prairie (Miner 2005). The objective of these Corps...of coastal habitats, Volume Two: Tools for monitoring coastal habitats. NOAA Coastal Ocean Program Decision Analysis Series No. 23. Silver Spring, MD...NOAA National Centers for Coastal Ocean Science. Thom, R. M., and K. F. Wellman. 1996. Planning aquatic ecosystem restoration monitoring programs

  4. The Relationship between Children's Gaze Reporting and Theory of Mind

    ERIC Educational Resources Information Center

    D'Entremont, Barbara; Seamans, Elizabeth; Boudreau, Elyse

    2012-01-01

    Seventy-nine 3- and 4-year-old children were tested on gaze-reporting ability and Wellman and Liu's (2004) continuous measure of theory of mind (ToM). Children were better able to report where someone was looking when eye and head direction were provided as a cue compared with when only eye direction cues were provided. With the exception of…

  5. The beryllium "double standard" standard.

    PubMed

    Egilman, David S; Bagley, Sarah; Biklen, Molly; Golub, Alison Stern; Bohme, Susanna Rankin

    2003-01-01

    Brush Wellman, the world's leading producer and supplier of beryllium products, has systematically hidden cases of beryllium disease that occurred below the threshold limit value (TLV) and lied about the efficacy of the TLV in published papers, lectures, reports to government agencies, and instructional materials prepared for customers and workers. Hypocritically, Brush Wellman instituted a zero exposure standard for corporate executives while workers and customers were told the 2 microgram standard was "safe." Brush intentionally used its workers as "canaries for the plant," and referred to them as such. Internal documents and corporate depositions indicate that these actions were intentional and that the motive was money. Despite knowledge of the inadequacy of the TLV, Brush has successfully used it as a defense against lawsuits brought by injured workers and as a sales device to provide reassurance to customers. Brush's policy has reaped an untold number of victims and resulted in mass distribution of beryllium in consumer products. Such corporate malfeasance is perpetuated by the current market system, which is controlled by an organized oligopoly that creates an incentive for the neglect of worker health and safety in favor of externalizing costs to victimized workers, their families, and society at large.

  6. 23. VIEW TOWARD EAST CORNER OF ROOM 205. FORMER SKYLIGHT ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    23. VIEW TOWARD EAST CORNER OF ROOM 205. FORMER SKYLIGHT IN SLOPED GYPSUM BOARD CEILING HAS BEEN ROOFED OVER. WALLS ARE A COMBINATION OF GYPSUM BOARD AND WOOD PLANK. WOOD POST SUPPORTS BEAM AT NORTHEAST WALL. - Presidio of San Francisco, Cavalry Stables, Cowles Street, between Lincoln Boulevard & McDowell Street, San Francisco, San Francisco County, CA

  7. Feasibility Study of Comprehensive School Physical Activity Programs in Appalachian Communities: The McDowell CHOICES Project

    ERIC Educational Resources Information Center

    Jones, Emily M.; Taliaferro, Andrea R.; Elliott, Eloise M.; Bulger, Sean M.; Kristjansson, Alfgeir L.; Neal, William; Allar, Ishonté

    2014-01-01

    Increasing rates of childhood obesity has prompted calls for comprehensive approaches to school-based physical activity (PA). The purpose of this study was to evaluate the feasibility of comprehensive school physical activity program (CSPAP) development and related contextual issues within a rural Appalachian county using a Systems Approach. A…

  8. Chronology of rock falls and slides in a desert mountain range: Case study from the Sonoran Desert in south-central Arizona

    NASA Astrophysics Data System (ADS)

    Dorn, Ronald I.

    2014-10-01

In order to respond to the general paucity of information on the chronology of ubiquitous small rock falls and slides that litter the slopes of desert mountain ranges, a case study in the Sonoran Desert reveals new insight into the desert geomorphology of mountain slopes. Rock falls and rock slides in the McDowell Mountains that abut metropolitan Phoenix, USA, fall in three chronometric groupings dated by conventional radiocarbon and rock varnish microlamination methods. First, the oldest events are > 74 ka and take the form of stable colluvial boulder fields - positive relief features that are tens of meters long and a few meters wide. Second, randomly sampled slides and falls of various sizes and positions wasted during wetter periods of the terminal Pleistocene and Holocene. Third, an anomalous clustering of slides and falls occurred during the late Medieval Warm Period (Medieval Climatic Anomaly), when an extreme storm was a possible but unlikely trigger. One speculative hypothesis for the cluster of Medieval Warm Period events is that a small to moderate sized earthquake shook heavily shattered bedrock - close to failure - just enough to cause a spate of rock falls and slides. A second speculative hypothesis is that this dry period enhanced physical weathering processes such as dirt cracking. However, the reasons for the recent clustering of rock falls remain enigmatic. While the temporal distribution of slides and falls suggests a minimal hazard potential for homes and roads on the margins of the McDowell Mountains, this finding may not necessarily match other desert ranges in metropolitan Phoenix or mountains with different rock types and structures that abut other arid urban centers.

  9. Culture and the Sequence of Steps in Theory of Mind Development

    ERIC Educational Resources Information Center

    Shahaeian, Ameneh; Peterson, Candida C.; Slaughter, Virginia; Wellman, Henry M.

    2011-01-01

    To examine cultural contrasts in the ordered sequence of conceptual developments leading to theory of mind (ToM), we compared 135 3- to 6-year-olds (77 Australians; 58 Iranians) on an established 5-step ToM scale (Wellman & Liu, 2004). There was a cross-cultural difference in the sequencing of ToM steps but not in overall rates of ToM mastery.…

  10. Systems Fragility: The Sociology of Chaos

    DTIC Science & Technology

    2015-03-01

Science (New York, N.Y.) 302, no. 5652 (2004): 1912, doi:10.1126/science.1090847. 70 Barry Wellman and Scot Wortley, “Different Strokes from...Michael Dombeck, Jack Williams, and Christopher Wood, “Wildfire Policy and Public Lands: Integrating Scientific Understanding with Social Concerns...bitstream/10092/2809/ 1/12593870_ResOrgs_IFED_dec04_EDSM.pdf Dombeck, Michael, Jack Williams, and Christopher Wood. “Wildfire Policy and Public

  11. Micro-Cultural Influences on Theory of Mind Development: A Comparative Study of Middle-Class and "Pemulung" Children in Jakarta, Indonesia

    ERIC Educational Resources Information Center

    Kuntoro, Ike Anggraika; Saraswati, Liliek; Peterson, Candida; Slaughter, Virginia

    2013-01-01

    We investigated cultural influences on young children's acquisition of social-cognitive concepts. A theory of mind (ToM) scale (Wellman & Liu, 2004) was given to 129 children (71 boys, 58 girls) ranging in age from 3 years 0 months to 7 years 10 months. The children were from three distinct cultural groups: (a) trash pickers…

  12. Do Humans Have Two Systems to Track Beliefs and Belief-Like States?

    ERIC Educational Resources Information Center

    Apperly, Ian A.; Butterfill, Stephen A.

    2009-01-01

    The lack of consensus on how to characterize humans' capacity for belief reasoning has been brought into sharp focus by recent research. Children fail critical tests of belief reasoning before 3 to 4 years of age (H. Wellman, D. Cross, & J. Watson, 2001; H. Wimmer & J. Perner, 1983), yet infants apparently pass false-belief tasks at 13 or 15…

  13. Research to Develop Biomedical Applications of Free Electron Laser Technology

    DTIC Science & Technology

    2011-03-31

Wellman Center for Photomedicine/Massachusetts General Hospital, 55 Fruit St, Boston, MA, 02114-2621...AFRL-OSR...affect soldiers. To achieve this broad goal, we have undertaken projects focused on novel treatments of infectious diseases and physical trauma...death. Burns destroy the cutaneous barrier, rendering the affected tissue non-perfused, and bacteria find the burn wound a highly nutritional environment

  14. On the Performance of the Underwater Acoustic Sensor Networks

    DTIC Science & Technology

    2015-05-01

    ABOVE ADDRESS. The University of the District of Columbia Computer Science and Informati Briana Lowe Wellman Washington, DC 20008 -1122 ABSTRACT On the...built. I would like to acknowledge the Electrical and Computer Engineering Department which has been helpful throughout my master degree, especially Dr...transmitted message, and, therefore, take advantage of the BER variation, which depends on the underwater acoustic channel environment. Several computer

  15. Visual Analysis of Social Networks in a Counter-Insurgency Context

    DTIC Science & Technology

    2011-06-01

Batagelj and Mrvar 2003] specifically focus on the analysis and visualisation of extremely large networks. Moreover, on top of these data about the...and behavioral components of a complex conflict ecosystem, SpringSim: 23. Batagelj, V. & Mrvar, A. (2003), Pajek - analysis and visualisation of...information regarding network patterns and structures, no spatial information is usually encoded. This is despite the fact that already Wellman [1996

  16. Materials Properties Research at MSFC

    NASA Technical Reports Server (NTRS)

    Presson, Joan B.; Burdine, Robert (Technical Monitor)

    2002-01-01

MSFC is currently planning, organizing, and directing test coupon fabrication and subsequent CTE testing for two mirror materials of specific interest to the AMSD and NGST programs, Beryllium 0-30H (Be 0-30H) and Ultra Low Expansion glass (ULE). The ULE test coupons are being fabricated at MSFC from AMSD core residuals provided by Kodak. The Be 0-30H test coupons are being fabricated at Brush Wellman using residuals from the SBMD. Both sets of test coupons will be sent to a test vendor selected through the NASA competitive proposal process, with the test results being provided by written report to MSFC by the end of the fiscal year. The test results will become model input data for the AMSD analysts, both MSFC and contractor, providing an enhancement to the historical CTE data currently available.

  17. 8. VIEW LOOKING NORTHWEST DOWN CENTRAL AXIS OF ROOM 110. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. VIEW LOOKING NORTHWEST DOWN CENTRAL AXIS OF ROOM 110. NOTE CHANGE IN CEILING TREATMENT: WOOD PLANKS IN CENTER, ALL OTHER AREAS ARE GYPSUM BOARD. FLOOR IN CENTRAL AREA IS CONCRETE. POSTS AND BEAMS ARE ALL WOOD CONSTRUCTION. - Presidio of San Francisco, Cavalry Stables, Cowles Street, between Lincoln Boulevard & McDowell Street, San Francisco, San Francisco County, CA

  18. A Discussion Guide for UnCommon Knowledge: The "Voices of Girls" Documentary. [Videotape].

    ERIC Educational Resources Information Center

    AEL, Inc., Charleston, WV.

    Rural and Urban Images: Voices of Girls in Science, Mathematics, and Technology was a 3-year project that began in fall 1995 with a group of sixth-grade girls and followed the same girls through eighth grade. The project took place in two West Virginia counties, but this videotaped documentary features its implementation in rural McDowell County…

  19. Environmental Compliance Assessment Protocol - Centers for Disease Control and Prevention (ECAP-CDC), West Virginia Supplement

    DTIC Science & Technology

    1994-11-01

memmingeri Virginia heartleaf Asarum shuttleworthii Large-flowered heartleaf Asclepias viridis Green milkweed Asplenium septentrionale Forked spleenwort...quadrangulata Blue ash Galactia volubilis Milk pea Gaylussacia brachycera Box huckleberry Gaylussacia dumosa Dwarf huckleberry Gentiana alba Yellow gentian...Name Common Name Habenaria viridis var. bracteata Long-bracted green orchid Helianthemum canadense Canada frostweed Helianthus dowellianus McDowell

  20. 78 FR 37788 - In the Matter of: Juan Ricardo Puente-Paez, Inmate Number #05086-379, FCI McDowell, Federal...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-24

    ... exported from the United States to Mexico four military spec Interceptor body armor vests, which were... owned, possessed or controlled by the Denied Person, or service any item, of whatever origin, that is... subject to the Regulations are the foreign-produced direct product of U.S.-origin technology. V. This...

  1. Images of the Negro in American Literature. Patterns of Literary Criticism, No. 5.

    ERIC Educational Resources Information Center

Gross, Seymour L., Ed.; Hardy, John Edward, Ed.

The 15 studies in this collection investigate the various images of the Negro in American literature--images which range from stereotype to archetype. In the first six studies, critics discuss the literary tradition of the Negro in colonial literature (Milton Cantor), in the Southern novel prior to 1850 (Tremaine McDowell), in literature of the…

  2. When Police Intervene: Race, Gender, and Discipline of Black Male Students at an Urban High School

    ERIC Educational Resources Information Center

    Hines-Datiri, Dorothy

    2015-01-01

    Courtney and Dennis, two African American male students at McDowell High, were arrested at school for throwing water balloons during senior prank week. The principal assigned two police officers to the magnet school to oversee the implementation of a new discipline protocol. However, several members of the school staff were ill-informed about the…

  3. Public, Private, Past, and Present: An Exploration of the Language and Musical Structures of Kotiria/Wanano Women's "Kaya Basa" "Sad Songs"

    ERIC Educational Resources Information Center

    Hosemann, Aimee J.

    2017-01-01

    This dissertation explores the way Kotiria/Wanano (E. Tukanoan, Kotiria hereafter) women of the Brazilian "Alto Rio Negro" (ARN) contrive (McDowell 1990) "kaya basa" "sad songs" using linguistic and musical resources to construct songs that express loneliness and other private emotions, while also creating alliances…

  4. Native American casino gambling in Arizona: A case study of the Fort McDowell reservation.

    PubMed

    Anders, G C

    1996-09-01

Since Congress passed the Indian Gaming Regulatory Act (IGRA) in 1988, there has been an explosion in the number of gambling casinos located on Native American reservations. It is estimated that in 1994 the total net revenue from 81 Native American casinos exceeded $2.3 billion. As the number of Native American casinos grows along with the volume of gambling activity, opposition increases from states, from the established gambling industry hurt by lost revenues, and from groups with moral objections to gambling. This article reports on an effort to measure the economic impact of the Fort McDowell casino located near Phoenix, Arizona. The article discusses the casino's history and current operations. Next, it explains the use of an input-output model to compute the casino's income and employment effects on the economy of Maricopa County. It is estimated that the casino is responsible for 2,483 new jobs and an increase of approximately $80.35 million in regional output. Additional information is necessary to more accurately assess both the benefits and costs of the casino. Unfortunately, subsequent efforts to collect additional data have been unsuccessful. The conclusion discusses why and raises questions regarding Native American gaming.
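The input-output approach mentioned in the abstract estimates total regional effects by applying the Leontief inverse to a change in final demand: total output x solves (I - A)x = d, where A holds technical coefficients. A minimal two-sector sketch with an invented coefficient matrix, not the study's actual Maricopa County model:

```python
# Leontief input-output impact: solve (I - A) x = d for total output x,
# where A is the technical-coefficients matrix and d the new final demand.

def leontief_impact(A, d):
    """Solve (I - A) x = d for a 2x2 coefficient matrix by direct inversion."""
    a, b = 1 - A[0][0], -A[0][1]
    c, e = -A[1][0], 1 - A[1][1]
    det = a * e - b * c
    x0 = (e * d[0] - b * d[1]) / det
    x1 = (a * d[1] - c * d[0]) / det
    return [x0, x1]

# Hypothetical coefficients: each sector uses inputs from the other.
A = [[0.20, 0.10],
     [0.30, 0.25]]
d = [80.0, 0.0]   # $80M of new final demand in sector 0 (e.g., the casino)
total = leontief_impact(A, d)
print(total)
```

Because each sector buys inputs from the other, total output exceeds the direct $80 million injection; the ratio of total to direct effect is the regional multiplier.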

  5. Competence and Performance in Belief-Desire Reasoning across Two Cultures: The Truth, the Whole Truth and Nothing but the Truth about False Belief?

    ERIC Educational Resources Information Center

    Yazdi, Amir Amin; German, Tim P.; Defeyter, Margaret Anne; Siegal, Michael

    2006-01-01

There is a change in false belief task performance across the 3-5 year age range, as confirmed in a recent meta-analysis [Wellman, H. M., Cross, D., & Watson, J. (2001). Meta-analysis of theory-of-mind development: The truth about false belief. "Child Development," 72, 655-684]. This meta-analysis identified several performance factors influencing…

  6. Transforming Graph Data for Statistical Relational Learning

    DTIC Science & Technology

    2012-10-01

Jordan, 2003), PLSA (Hofmann, 1999), ? Classification via RMN (Taskar et al., 2003) or SVM (Hasan, Chaoji, Salem, & Zaki, 2006) ? Hierarchical...Rossi, McDowell, Aha, & Neville...dimensionality reduction methods such as Principal Component Analysis (PCA), Principal Factor Analysis (PFA), and...clustering algorithm. Journal of the Royal Statistical Society. Series C, Applied statistics, 28, 100–108. Hasan, M. A., Chaoji, V., Salem, S., & Zaki, M

  7. 14. INTERIOR OF ROOM 101 LOOKING NORTHWEST. SIX OVER SIX ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    14. INTERIOR OF ROOM 101 LOOKING NORTHWEST. SIX OVER SIX LITE WOOD FRAME WINDOWS ARE CENTERED ON NORTHEAST AND NORTHWEST WALLS. ADDED TREATMENT TO WALLS IS A GYPSUM BOARD FINISH WITH WOOD TRIM, BOTH PAINTED. ADDED FLOOR TREATMENT IS VINYL COMPOSITION TILE. - Presidio of San Francisco, Cavalry Stables, Cowles Street, between Lincoln Boulevard & McDowell Street, San Francisco, San Francisco County, CA

  8. Modeling Cerebral Vascular Injury

    DTIC Science & Technology

    2016-01-01

vessels to inform the material response of the surrounding brain tissue...traumatic brain injury, vasculature, injury biomechanics...Margulies SS. A fiber-reinforced composite model of the viscoelastic behavior of the brainstem in shear. Journal of Biomechanics. 1999;32:865–870...RH, McDowell K, Vettel J. High rate computational brain injury biomechanics. ARL Ballistic Technology Workshop; 2010 May 24–26; Herndon, VA. Kraft

  9. 50th Annual Technical Meeting of the Society of Engineering Science (SES)

    DTIC Science & Technology

    2014-08-15

McDowell (Georgia Tech), Min Zhou () Virtual Characterization of composites with Lamination Defects for wind turbine spar cap MUKUNDAN SRINIVASAN...Zhang (IHPC Singapore) Damage Mechanisms in Irradiated Metallic Glasses Richard Baumer (MIT), Michael Demkowicz (MIT) Slip Avalanches in Amorphous...Michigan, 48090) Atomistic Simulations of c+a Pyramidal Slip in Magnesium Single Crystal under Compression Xiaozhi Tang (MIT & BJTU), Yafang Guo

  10. 10. INTERIOR VIEW OF ROOM 110, LOOKING NORTHWEST FROM THE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. INTERIOR VIEW OF ROOM 110, LOOKING NORTHWEST FROM THE EAST CORNER. SPACE IS DIVIDED BY EXPOSED POSTS AND BEAMS DESIGNATING ANIMALS QUARTERS AND AISLES. FLOORS ARE WOOD PLANKS IN ANIMAL AREAS CHANGING TO CONCRETE IN CENTER AISLES. CEILING IS GYPSUM BOARD. - Presidio of San Francisco, Cavalry Stables, Cowles Street, between Lincoln Boulevard & McDowell Street, San Francisco, San Francisco County, CA

  11. The Characterization of the Phlebotomus papatasi Transcriptome

    DTIC Science & Technology

    2013-04-01

Computational identification of novel chitinase-like proteins in the Drosophila melanogaster genome. Bioinformatics. 2004; 20, no. 2:161–169. [PubMed: 14734306...discovery in organisms where sequencing the whole genome is not possible (Lindlof 2003), or in addition to genome information for more accurate gene...biology of these important vectors, and generate essential data for annotation of the newly sequenced phlebotomine sand fly genomes (McDowell et al

  12. A Geometrically Nonlinear Phase Field Theory of Brittle Fracture

    DTIC Science & Technology

    2014-10-01

of crack propagation. Philos Mag 91:75–95 Sun X, Khaleel M (2004) Modeling of glass fracture damage using continuum damage mechanics -static spherical...elastic fracture mechanics). Engineering finite element (FE) simulations often invoke continuum damage mechanics theories, wherein the tangent...stiffness of a material element degrades as “damage” accumulates. Conventional continuum damage mechanics theories (Clayton and McDowell 2003, 2004; Sun and

  13. Landfill to Learning Facility

    NASA Astrophysics Data System (ADS)

    Venner, Laura

    2008-05-01

Engaging “K-to-Gray” audiences (children, families, and older adults) in scientific exploration and discovery is the main goal of the NJMC Center for Environmental and Scientific Education and the William D. McDowell Observatory located in Lyndhurst, NJ. Perched atop a closed and reclaimed municipal solid waste landfill, our new LEED-certified building (certification pending) and the William D. McDowell Observatory will bring hands-on scientific experiences to the 25,000 students and 3,000 adults that visit our site from the NY/NJ region each year. Our programs adhere to the New Jersey Core Curriculum Content Standards and are modified for accessibility for the underserved communities that visit us, specifically those individuals that have mobility, sensory, and/or cognitive ability differences. The programs are conducted in a classroom setting and are designed to nourish each individual's inquisitive nature and provide an opportunity to function as a scientist by making observations, performing experiments, and recording data. We have an $850,000, three-year NSF grant that targets adults with disabilities and older adults with age-related limitations in vision, hearing, cognition, and/or mobility. From dip netting in the marsh to astronomical investigation of the cosmos, the MEC/CESE remains committed to reaching the largest audience possible and leaving them with a truly exceptional scientific experience that serves to educate and inspire.

  14. The Mind behind the Message: Advancing Theory-of-Mind Scales for Typically Developing Children, and Those with Deafness, Autism, or Asperger Syndrome

    ERIC Educational Resources Information Center

    Peterson, Candida C.; Wellman, Henry M.; Slaughter, Virginia

    2012-01-01

    Children aged 3-12 years (n = 184) with typical development, deafness, autism, or Asperger syndrome took a series of theory-of-mind (ToM) tasks to confirm and extend previous developmental scaling evidence. A new sarcasm task, in the format of H. M. Wellman and D. Liu's (2004) 5-step ToM Scale, added a statistically reliable 6th step to the scale…

  15. Hardware Photos: Image Showing JWST Engineering Demonstration Mirror, Mounted Ready for Machining at AXYS and Image Showing HIP Can Containing Light Mirrors 1 and 2 Ready for Mirror Fabrication

    NASA Technical Reports Server (NTRS)

    O'Keefe, Sean

    2004-01-01

    The images in this viewgraph presentation have the following captions: 1) EDU mirror after being sawed in half; 2) EDU Delivered to Axsys; 3) Be EDU Blank Received and Machining Started; 4) Loaded HIP can for flight PM segments 1 and 2; 5) Flight Blanks 1 and 2 Loaded into HIP Can at Brush-Wellman; 6) EDU in Machining at Axsys.

  16. On the Mutual Information of Multi-hop Acoustic Sensors Network in Underwater Wireless Communication

    DTIC Science & Technology

    2014-05-01

    The University of the District of Columbia Computer Science and Informati Briana Lowe Wellman Washington...financial support throughout my Master’s study and research. Also, I would like to acknowledge the Faculty of the Electrical and Computer Engineering...received bits are in error, and then compute the bit-error-rate as the number of bit errors divided by the total number of bits in the transmitted signal
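The snippet above describes the usual bit-error-rate computation: count the received bits that differ from the transmitted bits, then divide by the total number of bits. A minimal sketch of that arithmetic (the bit sequences are illustrative, not data from the report):

```python
# Bit-error-rate (BER) = number of bit errors / total transmitted bits.
# Illustrative only; real BER estimates come from long simulated transmissions.
def bit_error_rate(sent, received):
    """Compare two equal-length bit sequences and return the error fraction."""
    assert len(sent) == len(received)
    errors = sum(1 for s, r in zip(sent, received) if s != r)
    return errors / len(sent)

sent     = [1, 0, 1, 1, 0, 0, 1, 0]
received = [1, 0, 0, 1, 0, 1, 1, 0]  # bits 3 and 6 flipped by channel noise
print(bit_error_rate(sent, received))  # 2 errors / 8 bits -> 0.25
```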

  17. 30. VIEW OF ROOM 212 LOOKING SOUTHEAST TOWARDS EXIT DOORS. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    30. VIEW OF ROOM 212 LOOKING SOUTHEAST TOWARDS EXIT DOORS. EXPOSED MASONRY WALL WITH BRICK DETAILING OVER ARCHED DOORWAY IS UNPAINTED. ORIGINAL USE OF ROOM 212 WAS AS A HAYLOFT; EXTERIOR DOOR WAS USED FOR LOADING HAY. TRUSSWORK AND BEAMS ARE EXPOSED AND UNPAINTED. WALLS HAVE PAINTED WOOD PANELING, CEILING MATERIAL IS EXPOSED WOOD ROOF SHEATHING. - Presidio of San Francisco, Cavalry Stables, Cowles Street, between Lincoln Boulevard & McDowell Street, San Francisco, San Francisco County, CA

  18. Workshop on Prognosis of Aircraft and Space Devices, Components and Systems

    DTIC Science & Technology

    2008-02-19

    of Detection," Harry Millwater, University of Texas at San Antonio 4:00 pm "A New Approach for Investigating Crystal Stresses that Drive the...Probability of Detection, Harry Millwater, University of Texas at San Antonio This research examines the simulation of recurring automated...david.mcdowell@me.gatech.edu jennifer.michaels@ece.gatech.edu mpm4@cornell.edu Millwater, Harry R. Nagy, Peter B. Pratt, David M. Dept. Mechanical

  19. Collisions of O+ with He at low energies

    NASA Astrophysics Data System (ADS)

    Joseph, Dwayne C.; Saha, B. C.; Zhao, L. B.

    2009-05-01

    We have investigated the following charge transfer process O^+(^4S^0, ^2D^0, ^2P^0) + He -> O(^3P) + He^+ - δE using the full quantum [1] and semi-classical [2] molecular orbital close-coupling (MOCC) approximations. The quantum MOCC equations are solved numerically in the adiabatic representation [3]. Using the MRD-CI package [4], the ab initio configuration interaction calculation is carried out for the potential energies. Details of our findings will be reported at the conference. [1] B. H. Bransden and M. R. C. McDowell, ``Charge Exchange and the Theory of Ion-Atom Collisions'', Clarendon Press, Oxford, 1992. [2] M. Kimura and N. F. Lane, At. Mol. Opt. Phys. 26, 79 (1990). [3] J. P. Braga and J. C. Belchoir, J. Comput. Chem. 17, 1559 (1996). [4] R. J. Buenker, ``Current Aspects of Quantum Chemistry 1981'', Vol. 21, edited by R. Carbo (Elsevier, Amsterdam), p. 17.

  20. MSFC Test Results for Selected Mirrors: Brush-Wellman/Goodrich 0.5 meter Joined-Beryllium Mirror; IABG 0.5 meter C/SiC Mirror; Xinetics 0.5 meter SiC Mirror; and Kodak 0.23 meter SiO2 Mirror

    NASA Technical Reports Server (NTRS)

    Hadaway, James; Blackwell, Lisa; Matthews, Gary; Eng, Ron; Stahl, Phil; Hraba, John; Thornton, Gary

    2002-01-01

    The results of cryo tests performed at the XRCF on the above mirrors will be presented. Each mirror was tested from room-temperature to around 30 K. The first three were tested together on a 3-mirror stand in the large chamber using the PhaseCam interferometer, while the Kodak mirror was tested in the small chamber using the EPI interferometer.

  1. A Center of Excellence in Rotary Wing Aircraft Technology. Phase 2. Program Maturation Phase, 15 January 1988 - 14 January 1993

    DTIC Science & Technology

    1993-03-14

    COST SHARING REQUIRED AND UNIVERSITY COST SHARING INCREASED...FLIGHT SIMULATION TASK ADDED (Figure 1)...Vibrations and Structural Dynamics .................. 28 Task 4. Damage Resistance in Rotorcraft Structures ........................ 31 D. Flight Mechanics and...Twenty-Second Symposium (Volume I), ASTM STP 1131, H. A. Ernst, A. Saxena, and D. L. McDowell, Eds., American Society for Testing and Materials

  2. 28. ROOM 211, VIEW TO THE SOUTHEAST. THE LONG NARROW ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    28. ROOM 211, VIEW TO THE SOUTHEAST. THE LONG NARROW SPACE HAS EXPOSED TRUSSWORK IN UNPAINTED WOOD AS DID ALL UPSTAIRS ROOMS IN THEIR ORIGINAL CONDITION. CLERESTORY WINDOWS ARE INTERSPERSED WITH VENTS ALONG BOTH LONG SIDES OF THE ROOM. WALLS HAVE WIDE WOOD PANELING THAT IS PAINTED, FLOORS ARE WOOD. DOORWAY IN SOUTHWEST WALL LEADS TO UNFINISHED ATTIC SPACE. - Presidio of San Francisco, Cavalry Stables, Cowles Street, between Lincoln Boulevard & McDowell Street, San Francisco, San Francisco County, CA

  3. 4. OVERALL VIEW OF THE SOUTHEAST FACADE. THE BRICK MASONRY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. OVERALL VIEW OF THE SOUTHEAST FACADE. THE BRICK MASONRY WALLS ARE LAID IN COMMON BOND WITH A BRICK DETAIL SURROUNDING THE FLAT ARCHED WOODEN DOORS. THE SYMMETRICAL PLACEMENT OF DOORS HAS BEEN VISUALLY AFFECTED BY THE ADDITION OF A WOOD FIRE STAIR. A BEAM USED TO LOAD HAY INTO THE UPPER LOFT AREA PROTRUDES THROUGH THE MASONRY WALL JUST BELOW THE ROOF LINE. - Presidio of San Francisco, Cavalry Stables, Cowles Street, between Lincoln Boulevard & McDowell Street, San Francisco, San Francisco County, CA

  4. 6. DETAIL OF FLAT-ARCHED WOODEN DOORS USED FOR ANIMAL ENTRY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. DETAIL OF FLAT-ARCHED WOODEN DOORS USED FOR ANIMAL ENTRY AT CENTER OF SOUTHWEST FACADE. DOORS ARE COMPRISED OF THREE WOOD PANELS WITH A KICKPLATE ON SIDES. SMALL, SIX LITE WINDOWS COVERED BY GRILLS AND DETAILED WITH A THICK MASONRY LEDGE ARE LOCATED UNDER THE EAVE ON EITHER SIDE OF DOOR. A DRAINPIPE WITH A CONCRETE SPLASH BLOCK IS A LATER ADDITION. - Presidio of San Francisco, Cavalry Stables, Cowles Street, between Lincoln Boulevard & McDowell Street, San Francisco, San Francisco County, CA

  5. Comparison of Long-Term Storage in Chemical Fixatives on Morphology and Anatomy of Super-Dwarf Wheat

    NASA Technical Reports Server (NTRS)

    Bubenheim, David L.; Campbell, W. F.; Salisbury, F. B.; Hole, P. S.; Gillespie, L.; Levbinskikh, M.; Kliss, Mark H. (Technical Monitor)

    1996-01-01

    Wheat plants (Triticum aestivum L. cv Super-Dwarf) are grown in the microgravity of space and harvested for morphological and anatomical comparison with those exposed to gravity on Earth. Such plants are subjected to relatively long periods of storage in chemical fixatives. Examination, evaluation, and verification that the integrity of the vascular system is maintained for extended periods of storage in fixatives are required. McDowell and Trump's [4% Formaldehyde -- 1% Glutaraldehyde (4F:1G)] or Variant I [(Russian Fixative): Formalin: Acetic Acid: Alcohol] fixatives, adjusted to pH 7.2, were placed in Aclam(TM), FilmORap(TM), or FilmOSun(TM) plastic bags on April 4, 1994. Wheat seedlings were harvested on days 9, 28, and 68 and preserved in these fixatives. Subsamples of leaves and/or seeds were taken from these stocks after various times in storage, dehydrated, and embedded in Spurr's, LR White's or Unicryl resin. Semithin (1 µm) and thin (50-70 nm) sections were examined by light and transmission electron microscopy. In a few sections, we have observed a slight plasmolysis of the cytosol in leaf tissue fixed with the Variant I, but overall there seem to be no major artifacts in the anatomical structure. The plasmalemma and other organelles appeared normal in the McDowell and Trump fixative. Use of differential chromophores suggests that LR White or Unicryl resins may give greater flexibility for enzyme localizations at both the light and electron microscopical levels.

  6. Research frontiers in drought-induced tree mortality: Crossing scales and disciplines

    DOE PAGES

    Hartmann, Henrik; Adams, Henry D.; Anderegg, William R. L.; ...

    2015-01-12

    Sudden and widespread forest die-back and die-off (e.g., Huang & Anderegg, 2012) and increased mortality rates (e.g., Peng et al., 2011) in many forest ecosystems across the globe have been linked to drought and elevated temperatures (Allen et al., 2010, Fig. 1). Furthermore, these observations have caused a focus on the physiological mechanisms of drought-induced tree mortality (e.g. McDowell et al., 2008) and many studies, both observational and manipulative, have been carried out to explain tree death during drought from a physiological perspective.

  7. Expert System for Test Program Set Fault Candidate Selection

    DTIC Science & Technology

    1989-09-01

    Mark Herbst Mr. Paul Janusz Mr. Wayne Lee Ms. Patricia Lyon Ms. Sharyn McDowell Mr. Richard Payne Ms. Elizabeth Parliman Mr. Albert Stanbury Ms. Allison...SMCAR-ESP-L AMSMC-QA(R) AMSMC-QAK-B(R), R. Fer Rock Island, IL 61299-6000 28 Commander U.S. Army Materiel Command ATTN: AMCQA-E, Mr. Chris Neubert AMCPD...ATTN: AMSEL-PA-MT-S, Mr. Paul Kogut Mr. Andy Mills AMSEL-PA AMSEL-PA-DL AMSEL-RD-SE-CRM-CM AMSEL-RD-SE-AST Ft. Monmouth, NJ 07703-5023 35 Director U.S

  8. Commercial low-Btu coal-gasification plant. Feasibility study: General Refractories Company, Florence, Kentucky. Volume I. Project summary. [Wellman-Galusha

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1981-11-01

    In response to a 1980 Department of Energy solicitation, the General Refractories Company submitted a Proposal for a feasibility study of a low Btu gasification facility for its Florence, KY plant. The proposed facility would substitute low Btu gas from a fixed bed gasifier for natural gas now used in the manufacture of insulation board. The Proposal from General Refractories was prompted by a concern over the rising costs of natural gas, and the anticipation of a severe increase in fuel costs resulting from deregulation. The proposed feasibility study is defined. The intent is to provide General Refractories with the basis upon which to determine the feasibility of incorporating such a facility in Florence. To perform the work, a Grant for which was awarded by the DOE, General Refractories selected Dravo Engineers and Contractors based upon their qualifications in the field of coal conversion, and the fact that Dravo has acquired the rights to the Wellman-Galusha technology. The LBG prices for the five-gasifier case are encouraging. Given the various natural gas forecasts available, there seems to be a reasonable possibility that the five-gasifier LBG prices will break even with natural gas prices somewhere between 1984 and 1989. General Refractories recognizes that there are many uncertainties in developing these natural gas forecasts, and if the present natural gas decontrol plan is not fully implemented some financial risks occur in undertaking the proposed gasification facility. Because of this, General Refractories has decided to wait for more substantiating evidence that natural gas prices will rise as is now being predicted.

  9. Astronomy on a Landfill

    NASA Astrophysics Data System (ADS)

    Venner, Laura

    2008-09-01

    Engaging "K-to-Gray” audiences (children, families, and older adults) in astronomical activities is one of the main goals of the NJMC Center for Environmental and Scientific Education and the William D. McDowell Observatory located in Lyndhurst, NJ. Perched atop a closed and reclaimed municipal solid waste landfill, our new LEED-certified building (certification pending) and William D. McDowell Observatory will assist in bringing the goals of IYA 2009 to the approximately 25,000 students and 15,000 adults that visit our site from the NY/NJ region each year. Diversifying our traditional environmental science offerings, we have incorporated astronomy into our repertoire with "The Sun Through Time” module, which includes storytelling, cultural astronomy, telescope anatomy, and other activities that are based on the electromagnetic spectrum and our current knowledge of the sun. These lessons have also been modified to bring astronomy to underserved communities, specifically those individuals that have dexterity or cognitive ability differences. The program is conducted in a classroom setting and is designed to meet New Jersey Core Curriculum Content Standards. With the installation of our new 20” telescope, students and amateur astronomers will be given the opportunity to perform rudimentary research. In addition, a program is in development that will allow individuals to measure local sky brightness and understand the effects of light pollution on astronomical viewing. Teaching astronomy in an urban setting presents many challenges. All individuals, regardless of ability level or location, should be given the opportunity to be exposed to the wonders of the universe and the MEC/CESE has been successful in providing those opportunities.

  10. Astronomy on a Landfill

    NASA Astrophysics Data System (ADS)

    Venner, Laura

    2008-05-01

    Engaging "K-to-Gray” audiences (children, families, and older adults) in astronomical activities is one of the main goals of the NJMC Center for Environmental and Scientific Education and the William D. McDowell Observatory located in Lyndhurst, NJ. Perched atop a closed and reclaimed municipal solid waste landfill, our new LEED-certified building (certification pending) and William D. McDowell Observatory will assist in bringing the goals of IYA 2009 to the approximately 25,000 students and 3,000 adults that visit our site from the NY/NJ region each year. Diversifying our traditional environmental science offerings, we have incorporated astronomy into our repertoire with "The Sun Through Time” module, which includes storytelling, cultural astronomy, telescope anatomy, and other activities that are based on the electromagnetic spectrum and our current knowledge of the sun. These lessons have also been modified to bring astronomy to underserved communities, specifically those individuals that have dexterity or cognitive ability differences. The program is conducted in a classroom setting and is designed to meet New Jersey Core Curriculum Content Standards. With the installation of our new 20” telescope, students and amateur astronomers will be given the opportunity to perform rudimentary research. In addition, a program is in development that will allow individuals to measure local sky brightness and understand the effects of light pollution on astronomical viewing. Teaching astronomy in an urban setting presents many challenges. All individuals, regardless of ability level or location, should be given the opportunity to be exposed to the wonders of the universe and the MEC/CESE has been successful in providing those opportunities.

  11. When representations conflict with reality: the preschooler's problem with false beliefs and "false" photographs.

    PubMed

    Zaitchik, D

    1990-04-01

    It has been argued that young preschoolers cannot correctly attribute a false belief to a deceived actor (Wimmer & Perner, 1983). Some researchers claim that the problem lies in the child's inadequate epistemology (Chandler & Boyes, 1982; Wellman, 1988); as such, it is specific to the child's theory of mind and no such problem should appear in reasoning about nonmental representations. This prediction is tested below in the "false photograph" task: here an actor takes a photograph of an object in location X; the object is then moved to location Y. Preschool subjects are asked: "In the picture, where is the object?" Results indicate that photographs are no easier to reason about than are beliefs. Manipulations to boost performance on the photograph task proved ineffective. Further, an explanation of the failure as a processing limitation having nothing to do with the representational nature of beliefs or photographs was ruled out. It is argued that young children's failure on the false belief task is not due to an inadequate epistemology (though they may have one) and is symptomatic of a larger problem with representations.

  12. Is this the right time to come out? Case study.

    PubMed

    Williamson, A D; Woods, J D; Conley, J M; O'Barr, W M; Losey, M R; Colbert, C; Wofford, J; McNamara, E

    1993-01-01

    In this fictional case study, Adam Lawson is a promising young associate at Kirkham McDowell Securities, a St. Louis underwriting and financial advisory firm. Recently, Adam helped to bring in an extremely lucrative deal, and soon he and a few other associates will be honored for their efforts at the firm's silver anniversary dinner. George Campbell, vice president in mergers and acquisitions, is caught unprepared when Adam tells him that, after serious reflection, he has decided to bring his partner, Robert Collins, to the banquet. George is one of Adam's biggest supporters at the firm, and he personally has no problem with Adam being gay. But it is one thing for Adam to come out of the closet at the office. It is quite another to do so at a public company-client event. After all, Kirkham McDowell's client roster includes some very conservative companies--one of the country's largest defense contractors, for example. George is concerned with how Adam's openness about his sexual orientation will play with their clients and, as a result, how senior management will react. Adam has not come to George for permission to bring Robert to the dinner. But clearly Adam wants some sort of response. George has never faced sexual diversity issues in the workplace before, and there is no company policy to guide him. Just how negative an effect could Robert have on Adam's career with the firm and the firm's relationship with its clients? Isn't it possible that even the firm's most conservative clients will simply decide that Adam's choice of guest is a personal matter--not a business one?(ABSTRACT TRUNCATED AT 250 WORDS)

  13. Affine generalization of the Komar complex of general relativity

    NASA Astrophysics Data System (ADS)

    Mielke, Eckehard W.

    2001-02-01

    On the basis of the ``on shell'' Noether identities of the metric-affine gauge approach of gravity, an affine superpotential is derived which comprises the energy- and angular-momentum content of exact solutions. In the special case of general relativity (GR) or its teleparallel equivalent, the Komar or Freud complex, respectively, are recovered. Applying this to the spontaneously broken anti-de Sitter gauge model of McDowell and Mansouri with an induced Euler term automatically yields the correct mass and spin of the Kerr-AdS solution of GR with a (induced) cosmological constant without the factor two discrepancy of the Komar formula.

  14. Correction of Lying Ears by Augmentation of the Conchoscaphal Angle.

    PubMed

    Kim, Sung-Eun; Yeo, Chi-Ho; Kim, Taegon; Kim, Yong-Ha; Lee, Jun Ho; Chung, Kyu-Jin

    2017-01-01

    Lying ears are defined as ears that protrude less from the head, and in frontal view, are characterized by lateral positioning of antihelical contour relative to the helical rim. These aesthetically displeasing ears require correction in accord with the goals of otoplasty stated by McDowell. The authors present a case of lying ears treated by correcting the conchomastoid angle using Z-plasty, resection of posterior auricular muscle, and correction of the conchoscaphal angle by releasing cartilage using 2 full-thickness incisions and grafting of a conchal cartilage spacer. By combining these techniques, the authors efficiently corrected lying ears and produced aesthetically pleasing results.

  15. Hydrogeology, groundwater flow, and groundwater quality of an abandoned underground coal-mine aquifer, Elkhorn Area, West Virginia

    USGS Publications Warehouse

    Kozar, Mark D.; McCoy, Kurt J.; Britton, James Q.; Blake, B.M.

    2017-01-01

    The Pocahontas No. 3 coal seam in southern West Virginia has been extensively mined by underground methods since the 1880s. An extensive network of abandoned mine entries in the Pocahontas No. 3 has since filled with good-quality water, which is pumped from wells or springs discharging from mine portals (adits), and used as a source of water for public supplies. This report presents results of a three-year investigation of the geology, hydrology, geochemistry, and groundwater flow processes within abandoned underground coal mines used as a source of water for public supply in the Elkhorn area, McDowell County, West Virginia. This study focused on large (> 500 gallons per minute) discharges from the abandoned mines used as public supplies near Elkhorn, West Virginia. Median recharge calculated from base-flow recession of streamflow at Johns Knob Branch and 12 other streamflow gaging stations in McDowell County was 9.1 inches per year. Using drainage area versus mean streamflow relationships from mined and unmined watersheds in McDowell County, the subsurface area along dip of the Pocahontas No. 3 coal-mine aquifer contributing flow to the Turkey Gap mine discharge was determined to be 7.62 square miles (mi2), almost 10 times larger than the 0.81 mi2 surface watershed. Results of this investigation indicate that groundwater flows down dip beneath surface drainage divides from areas up to six miles east in the adjacent Bluestone River watershed. A conceptual model was developed that consisted of a stacked sequence of perched aquifers, controlled by stress-relief and subsidence fractures, overlying a highly permeable abandoned underground coal-mine aquifer, capable of substantial interbasin transfer of water. Groundwater-flow directions are controlled by the dip of the Pocahontas No. 3 coal seam, the geometry of abandoned mine workings, and location of unmined barriers within that seam, rather than surface topography. 
Seven boreholes were drilled to intersect abandoned mine workings in the Pocahontas No. 3 coal seam and underlying strata in various structural settings of the Turkey Gap and adjacent down-dip mines. Geophysical logging and aquifer testing were conducted on the boreholes to locate the coal- mine aquifers, characterize fracture geometry, and define permeable zones within strata overlying and underlying the Pocahontas No. 3 coal-mine aquifer. Water levels were measured monthly in the wells and showed a relatively static phreatic zone within subsided strata a few feet above the top of or within the Pocahontas No. 3 coal-mine aquifer (PC3MA). A groundwater-flow model was developed to verify and refine the conceptual understanding of groundwater flow and to develop groundwater budgets for the study area. The model consisted of four layers to represent overburden strata, the Pocahontas No. 3 coal-mine aquifer, underlying fractured rock, and fractured rock below regional drainage. Simulation of flow in the flooded abandoned mine entries using highly conductive layers or zones within the model, was unable to realistically simulate interbasin transfer of water. Therefore it was necessary to represent the coal-mine aquifer as an internal boundary condition rather than a contrast in aquifer properties. By representing the coal-mine aquifer with a series of drain nodes and optimizing input parameters with parameter estimation software, model errors were reduced dramatically and discharges for Elkhorn Creek, Johns Knob Branch, and other tributaries were more accurately simulated. Flow in the Elkhorn Creek and Johns Knob Branch watersheds is dependent on interbasin transfer of water, primarily from up dip areas of abandoned mine workings in the Pocahontas No. 3 coal-mine aquifer within the Bluestone River watershed to the east. 
    For the 38th, 70th, and 87th percentile flow durations of streams in the region, mean measured groundwater discharge was estimated to be 1.30, 0.47, and 0.39 cubic feet per second per square mile (ft3/s/mi2), respectively.

  16. Toxicological and chemical characterization of the process stream materials and gas combustion products of an experimental low-btu coal gasifier.

    PubMed

    Benson, J M; Hanson, R L; Royer, R E; Clark, C R; Henderson, R F

    1984-04-01

    The process gas stream of an experimental pressurized McDowell-Wellman stirred-bed low-Btu coal gasifier, and combustion products of the clean gas were characterized as to their mutagenic properties and chemical composition. Samples of aerosol droplets condensed from the gas were obtained at selected positions along the process stream using a condenser train. Mutagenicity was assessed using the Ames Salmonella mammalian microsome mutagenicity assay (TA98, with and without rat liver S9). All materials required metabolic activation to be mutagenic. Droplets condensed from gas had a specific mutagenicity of 6.7 revertants/microgram (50,000 revertants/liter of raw gas). Methylnaphthalene, phenanthrene, chrysene, and nitrogen-containing compounds were positively identified in a highly mutagenic fraction of raw gas condensate. While gas cleanup by the humidifier-tar trap system and Venturi scrubber led to only a small reduction in specific mutagenicity of the cooled process stream material (4.1 revertants/microgram), a significant overall reduction in mutagenicity was achieved (to 2200 revertants/liter) due to a substantial reduction in the concentration of material in the gas. By the end of gas cleanup, gas condensates had no detectable mutagenic activity. Condensates of combustion product gas, which contained several polycyclic aromatic compounds, had a specific mutagenicity of 1.1 revertants/microgram (4.0 revertants/liter). Results indicate that the process stream material is potentially toxic and that care should be taken to limit exposure of workers to the condensed tars during gasifier maintenance and repair and to the aerosolized tars emitted in fugitive emissions. Health risks to the general population resulting from exposure to gas combustion products are expected to be minimal.
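The abstract relates two measures of mutagenicity: specific mutagenicity (revertants per microgram of condensate) and volumetric mutagenicity (revertants per liter of gas), which are linked by the mass of condensate carried per liter. A hedged back-calculation from the reported 6.7 revertants/µg and 50,000 revertants/L for raw gas (the concentration value is inferred here, not reported in the source):

```python
# Relationship implied by the abstract:
#   revertants per liter = specific mutagenicity (rev/ug) * tar concentration (ug/L)
# The concentration below is back-calculated for illustration only.
def volumetric_mutagenicity(specific_rev_per_ug, conc_ug_per_l):
    """Revertants per liter of gas from specific mutagenicity and tar loading."""
    return specific_rev_per_ug * conc_ug_per_l

raw_gas_conc = 50_000 / 6.7  # implied ug of condensate per liter of raw gas
print(round(raw_gas_conc))                                 # roughly 7463 ug/L
print(round(volumetric_mutagenicity(6.7, raw_gas_conc)))   # recovers ~50000 rev/L
```

The same arithmetic explains the abstract's cleanup result: a modest drop in specific mutagenicity (6.7 to 4.1 rev/µg) but a large drop in rev/L (50,000 to 2,200) implies the scrubbers removed most of the condensate mass, not most of its potency.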

  17. The cost of thinking about false beliefs: evidence from adults' performance on a non-inferential theory of mind task.

    PubMed

    Apperly, Ian A; Back, Elisa; Samson, Dana; France, Lisa

    2008-03-01

    Much of what we know about other people's beliefs comes non-inferentially from what people tell us. Developmental research suggests that 3-year-olds have difficulty processing such information: they suffer interference from their own knowledge of reality when told about someone's false belief (e.g., [Wellman, H. M., & Bartsch, K. (1988). Young children's reasoning about beliefs. Cognition, 30, 239-277.]). The current studies examined for the first time whether similar interference occurs in adult participants. In two experiments participants read sentences describing the real colour of an object and a man's false belief about the colour of the object, then judged the accuracy of a picture probe depicting either reality or the man's belief. Processing costs for picture probes depicting reality were consistently greater in this false belief condition than in a matched control condition in which the sentences described the real colour of one object and a man's unrelated belief about the colour of another object. A similar pattern was observed for picture probes depicting the man's belief in most cases. Processing costs were not sensitive to the time available for encoding the information presented in the sentences: costs were observed when participants read the sentences at their own pace (Experiment 1) or at a faster or a slower pace (Experiment 2). This suggests that adults' difficulty was not with encoding information about reality and a conflicting false belief, but with holding this information in mind and using it to inform a subsequent judgement.

  18. Do humans have two systems to track beliefs and belief-like states?

    PubMed

    Apperly, Ian A; Butterfill, Stephen A

    2009-10-01

    The lack of consensus on how to characterize humans' capacity for belief reasoning has been brought into sharp focus by recent research. Children fail critical tests of belief reasoning before 3 to 4 years of age (H. Wellman, D. Cross, & J. Watson, 2001; H. Wimmer & J. Perner, 1983), yet infants apparently pass false-belief tasks at 13 or 15 months (K. H. Onishi & R. Baillargeon, 2005; L. Surian, S. Caldi, & D. Sperber, 2007). Nonhuman animals also fail critical tests of belief reasoning but can show very complex social behavior (e.g., J. Call & M. Tomasello, 2005). Fluent social interaction in adult humans implies efficient processing of beliefs, yet direct tests suggest that belief reasoning is cognitively demanding, even for adults (e.g., I. A. Apperly, D. Samson, & G. W. Humphreys, 2009). The authors interpret these findings by drawing an analogy with the domain of number cognition, where similarly contrasting results have been observed. They propose that the success of infants and nonhuman animals on some belief reasoning tasks may be best explained by a cognitively efficient but inflexible capacity for tracking belief-like states. In humans, this capacity persists in parallel with a later-developing, more flexible but more cognitively demanding theory-of-mind ability.

  19. Reasoning about beliefs: a human specialization?

    PubMed

    Povinelli, D J; Giambrone, S

    2001-01-01

    A recent meta-analysis performed by Wellman, Cross, and Watson clears the air surrounding young children's performance on tests of false belief by showing that it is highly likely that there is some type of conceptual development between 3 and 5 years of age that supports improved task performance. The data concerning the evolutionary origin of these abilities, however, is considerably less clear. Nonetheless, there is some reason to suspect that theory of mind is unique to our species, and that its original function was to provide a more abstract level of describing ancient behavioral patterns (such as deception, reconciliation, and gaze following)-behaviors that humans share in common with many other species. Thus, the initial selective advantage of theory of mind may have been because it increased the flexibility of already-existing behaviors, not because it generated scores of radically new ones.

  20. How Universal Are Free Will Beliefs? Cultural Differences in Chinese and U.S. 4- and 6-Year-Olds.

    PubMed

    Wente, Adrienne O; Bridgers, Sophie; Zhao, Xin; Seiver, Elizabeth; Zhu, Liqi; Gopnik, Alison

    2016-05-01

    This study explores the development of free will beliefs across cultures. Sixty-seven Chinese 4- and 6-year-olds were asked questions to gauge whether they believed that people could freely choose to inhibit or act against their desires. Responses were compared to those given by the U.S. children in Kushnir, Gopnik, Chernyak, Seiver, and Wellman (). Results indicate that children from both cultures increased the amount of choice they ascribed with age. For inhibition questions, Chinese children ascribed less choice than the U.S. children. Qualitative explanations revealed that the U.S. children were also more likely to endorse notions of autonomous choice. These findings suggest both cultural differences and similarities in free will beliefs. © 2016 The Authors. Child Development © 2016 Society for Research in Child Development, Inc.

  1. Cost-Effectiveness of Nivolumab-Ipilimumab Combination Therapy Compared with Monotherapy for First-Line Treatment of Metastatic Melanoma in the United States.

    PubMed

    Oh, Anna; Tran, Dang M; McDowell, Leann C; Keyvani, Dor; Barcelon, Jay Andrew; Merino, Oscar; Wilson, Leslie

    2017-06-01

    The approval of new immunotherapies has dramatically changed the treatment landscape of metastatic melanoma. These survival gains come with trade-offs in side effects and costs, as well as important considerations for third-party payer systems, physicians, and patients. To develop a Markov model to determine the cost-effectiveness of nivolumab, ipilimumab, and nivolumab-ipilimumab combination as first-line therapy in metastatic melanoma, while accounting for differential effectiveness in programmed death-ligand 1 (PD-L1) positive and negative patients. A 3-state Markov model (PD-L1 positive stable disease, PD-L1 negative stable disease, and progression and/or death) was developed using a U.S. societal perspective with a lifetime time horizon of 14.5 years. Transition probabilities were calculated from progression-free (PF) survival data reported in the CheckMate-067 trial. Costs were expressed in 2015 U.S. dollars and were determined using national sources. Adverse event (AE) management was determined using immune-related AE (irAE) data from CheckMate-067, irAE management guides for nivolumab and ipilimumab, and treatment guidelines. Utilities were obtained from published literature, using melanoma-specific studies when available, and were weighted based on incidence and duration of irAEs. Base case, one-way sensitivity, and probabilistic sensitivity analyses were conducted. Nivolumab-ipilimumab combination therapy was not the cost-effective choice ($454,092 per PF quality-adjusted life-year [QALY]) compared with nivolumab monotherapy in a base case analysis at a willingness-to-pay threshold of $100,000 per PFQALY. Combination therapy and nivolumab monotherapy were cost-effective choices compared with ipilimumab monotherapy. PD-L1 positive status, utility of nivolumab and combination therapy, and medication costs contributed the most uncertainty to the model.
In a population of 100% PD-L1 negative patients, nivolumab was still the optimal treatment, but combination therapy had an improved incremental cost-effectiveness ratio (ICER) of $295,903 per PFQALY. Combination therapy became dominated by nivolumab when 68% of the sample was PD-L1 positive. In addition, the cost of ipilimumab would have to decrease to < $21,555 per dose for combination therapy to have an ICER < $100,000 per PFQALY and to < $19,151 (a 42% reduction) to be more cost-effective than nivolumab monotherapy. Nivolumab-ipilimumab combination therapy was not cost-effective compared with nivolumab monotherapy, which was the most cost-effective option. Professionals in managed care settings should consider the pharmacoeconomic implications of these new immunotherapies as they make value-based formulary decisions and as future cost-effectiveness studies are completed. No funding supported this study. Merino was a contractor with EMD Serono at the time of this study but does not have any conflicts of interest and did not receive any funding related to this study. All other authors have no financial disclosures and no conflicts of interest. All the authors contributed to the study concept and design. Tran, McDowell, and Barcelon took the lead in data collection, along with Oh, Keyvani, and Merino. All authors except Merino contributed to data interpretation. The manuscript was written by Oh, Tran, McDowell, and Wilson and revised by Oh, Tran, McDowell, Wilson, and Keyvani. This analysis was presented at the Academy of Managed Care Pharmacy Managed Care & Specialty Pharmacy Annual Meeting 2016, April 19-22, 2016, in San Francisco, California, and at the International Society for Pharmacoeconomics and Outcomes Research Annual International Meeting, May 21-25, 2016, in Washington, DC.
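    The comparisons in this record all reduce to the incremental cost-effectiveness ratio (ICER): extra dollars paid per extra quality-adjusted life-year, judged against a willingness-to-pay threshold, with a strategy "dominated" when it costs more while adding no QALYs. A minimal sketch with hypothetical costs and QALYs (not the study's model inputs):

```python
def icer(cost_new, qaly_new, cost_ref, qaly_ref):
    """Incremental cost-effectiveness ratio: extra dollars per extra QALY.

    Returns None when the new strategy is dominated (costs at least as
    much while yielding no additional QALYs): no trade-off exists."""
    d_cost = cost_new - cost_ref
    d_qaly = qaly_new - qaly_ref
    if d_qaly <= 0 and d_cost >= 0:
        return None  # dominated by the reference strategy
    return d_cost / d_qaly

# Hypothetical inputs: combination therapy vs. nivolumab monotherapy.
ratio = icer(cost_new=350_000, qaly_new=2.25, cost_ref=260_000, qaly_ref=2.00)
print(ratio)             # 360000.0 dollars per QALY
print(ratio <= 100_000)  # False -> not cost-effective at this threshold
```

The dominance branch mirrors the abstract's finding that once 68% of the sample is PD-L1 positive, combination therapy becomes dominated by nivolumab and no ICER is meaningful.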

  2. The mind behind the message: Advancing theory of mind scales for typically developing children, and those with deafness, autism, or Asperger Syndrome

    PubMed Central

    Peterson, Candida C.; Wellman, Henry M.; Slaughter, Virginia

    2013-01-01

    Children aged 3 to 12 years (n=184) with typical development, deafness, autism or Asperger Syndrome took a series of theory-of-mind (ToM) tasks to confirm and extend previous developmental scaling evidence. A new sarcasm task, in the format of Wellman and Liu’s (2004) 5-step ToM scale, added a statistically reliable sixth step to the scale for all diagnostic groups. A key previous finding, divergence in task sequencing for children with autism, was confirmed. Comparisons among diagnostic groups, controlling age and language ability, showed that typical developers mastered the six ToM steps ahead of each of the three disabled groups, with implications for ToM theories. The final (sarcasm) task challenged even nondisabled 9-year-olds, demonstrating the new scale’s sensitivity to post-preschool ToM growth. PMID:22304467

  3. Theory of mind and executive function in Chinese preschool children.

    PubMed

    Duh, Shinchieh; Paik, Jae H; Miller, Patricia H; Gluck, Stephanie C; Li, Hui; Himelfarb, Igor

    2016-04-01

    Cross-cultural research on children's theory of mind (ToM) understanding has raised questions about its developmental sequence and relationship with executive function (EF). The current study examined how ToM develops (using the tasks from Wellman & Liu, 2004) in relation to 2 EF skills (conflict inhibition, working memory) in 997 Chinese preschoolers (ages 3, 4, 5) in Chengdu, China. Compared with prior research with other Chinese and non-Chinese children, some general patterns in development were replicated in this sample. However, the children showed culture-specific reversals in the developmental sequence of ToM. For example, Chengdu children performed differently on the 2 false-belief tasks that were thought to be equivalent. Furthermore, conflict inhibition as well as working memory uniquely predicted ToM performance. We discuss the issues of ToM development as they relate to test items and cross-cultural and subcultural differences. (c) 2016 APA, all rights reserved.

  4. Culture and the sequence of steps in theory of mind development.

    PubMed

    Shahaeian, Ameneh; Peterson, Candida C; Slaughter, Virginia; Wellman, Henry M

    2011-09-01

    To examine cultural contrasts in the ordered sequence of conceptual developments leading to theory of mind (ToM), we compared 135 3- to 6-year-olds (77 Australians; 58 Iranians) on an established 5-step ToM scale (Wellman & Liu, 2004). There was a cross-cultural difference in the sequencing of ToM steps but not in overall rates of ToM mastery. In line with our predictions, the children from Iran conformed to a distinctive sequence previously observed only in children in China. In contrast to the case with children from Australia (and the United States), knowledge access was understood earlier than opinion diversity in children from Iran, consistent with this collectivist culture's emphasis on filial respect, dispute avoidance, and acquiring knowledge. Having a sibling was linked with faster overall ToM progress in Australia only and was not related to scale sequences in either culture.

  5. Comparison of fixation and processing methods for hairless guinea pig skin following sulfur mustard exposure. (Reannouncement with new availability information)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bryant, M.A.; Braue Jr, E.H.

    1992-12-31

    Ten anesthetized hairless guinea pigs Crl:IAF(HA)BR were exposed to 10 µl of neat sulfur mustard (HD) in a vapor cup on their skin for 7 min. At 24 h postexposure, the guinea pigs were euthanatized and skin sections taken for histologic evaluation. The skin was fixed using either 10% neutral buffered formalin (NBF), McDowell Trump fixative (4CF-1G), Zenker's formol-saline (Helly's fluid), or Zenker's fluid. Fixed skin sections were cut in half: one half was embedded in paraffin and the other half in plastic (glycol methacrylate). Paraffin-embedded tissue was stained with hematoxylin and eosin; plastic-embedded tissue was stained with Lee's methylene blue basic fuchsin. Skin was also frozen unfixed, sectioned by cryostat, and stained with pinacyanole. HD-exposed skin was evaluated histologically for the presence of epidermal and follicular necrosis, microblister formation, epidermitis, and intracellular edema to determine the optimal fixation and embedding method for lesion preservation. The percentage of histologic sections with lesions varied little between fixatives and was similar for both paraffin and plastic embedding material. Plastic-embedded sections were thinner, allowing better histologic evaluation, but were more difficult to stain. Plastic embedding material did not infiltrate tissue fixed in Zenker's fluid or Zenker's formol-saline. Frozen tissue sections were prepared in the least processing time and lesion preservation was comparable to fixed tissue. It was concluded that standard histologic processing using formalin fixation and paraffin embedding is adequate for routine histopathological evaluation of HD skin lesions in the hairless guinea pig. Keywords: Sulfur mustard, Vesicating agents, Pathology, Hairless guinea pig model, Fixation.

  6. Meddy trajectories in the Canary Basin measured during the SEMAPHORE experiment, 1993-1995

    NASA Astrophysics Data System (ADS)

    Richardson, Philip L.; Tychensky, Aude

    1998-10-01

    As part of the Structures des Echanges Mer-Atmosphere, Proprietes des Heterogeneites Oceaniques: Recherche Experimentale (SEMAPHORE) experiment, four Mediterranean water eddies (Meddies) were identified in the Canary Basin and tracked with freely drifting RAFOS floats. One large and energetic Meddy, discovered 1700 km west of Cape Saint Vincent, Portugal, set a distance and speed record as it translated another 1700 km southwestward at 3.9 cm/s during 1.5 years. This Meddy traveled 57% of the distance from Cape Saint Vincent toward the spot where McDowell and Rossby [1978] found a possible Meddy north of the Dominican Republic. Two Meddies were observed to interact with the Azores Current as they passed underneath or through it. Three Meddies collided with tall seamounts, which seemed to disrupt the normal swirl velocity, perhaps fatally in two cases. One Meddy appeared to bifurcate when it collided with seamounts.
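    As a rough consistency check on the figures quoted above (a sketch, not a calculation from the paper), a mean translation speed of 3.9 cm/s sustained for 1.5 years implies an along-track distance of the same order as the reported 1700 km of net displacement:

```python
# Distance implied by the Meddy's reported mean translation speed:
# 3.9 cm/s sustained for 1.5 years, vs. the reported 1700 km displacement.
SECONDS_PER_YEAR = 365.25 * 24 * 3600      # Julian year, in seconds
speed_m_per_s = 3.9e-2                     # 3.9 cm/s
duration_s = 1.5 * SECONDS_PER_YEAR
distance_km = speed_m_per_s * duration_s / 1000
print(round(distance_km))  # 1846
```

The ~1846 km along-track figure slightly exceeds the 1700 km net displacement, as expected when a meandering float track is compared with straight-line distance.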

  7. 78 FR 20883 - Tonto National Forest; Arizona; Salt River Allotments Vegetative Management EIS

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-08

    ...The Tonto National Forest hereby gives notice that it is extending the public comment period for the Salt River Allotments Vegetative Management Draft Environmental Impact Statement (Draft EIS), which was published in the Federal Register on February 22, 2013, (Volume 78, No. 36) originally for a 45-day comment period. Please see the Notice of Availability of the Draft EIS (78 FR 12310) for more detailed information related to the Salt River Allotments Vegetative Management Draft EIS. In response to requests for additional time, the Forest Service will extend the comment period from April 8, 2013, to May 8, 2013. Federal, State, tribal, and local governments and other interested parties are requested to comment on the Draft EIS. Comments will be accepted by email to [email protected] or by mail to Debbie Cress, Tonto National Forest, 2324 E. McDowell Rd., Phoenix, AZ 85006 (928) 595-2093 or faxed to (602) 225-5295.

  8. Evaluating bump control techniques through convergence monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campoli, A.A.

    1987-07-01

    A coal mine bump is the violent failure of a pillar or pillars due to overstress. Retreat coal mining concentrates stresses on the pillars directly outby gob areas, and the situation becomes critical when mining a coalbed encased in rigid associated strata. Bump control techniques employed by the Olga Mine, McDowell County, WV, were evaluated through convergence monitoring in a Bureau of Mines study. Olga uses a novel pillar splitting mining method to extract 55-ft by 70-ft chain pillars, under 1,100 to 1,550 ft of overburden. Three rows of pillars are mined simultaneously to soften the pillar line and reduce strain energy storage capacity. Localized stress reduction (destressing) techniques, auger drilling and shot firing, induced approximately 0.1 in. of roof-to-floor convergence in 'high'-stress pillars near the gob line. Auger drilling of a 'low'-stress pillar located between two barrier pillars produced no convergence effects.

  9. The implementation of an elementary STEM learning team and the effect on teacher self-efficacy: An action research study

    NASA Astrophysics Data System (ADS)

    Hernandez, Jennifer F.

    Science, technology, engineering, and math (STEM) education is part of a national movement to prepare students for the demands of a 21st century workforce. STEM uses an integrated, real-world problem solving approach to increase the levels of collaboration, communication, critical, and creative thinking in students. If expectations for students have increased to stay competitive in a global market, teachers must be equipped to meet the needs of the new 21st century learners in their classrooms. To that end, professional learning for educators is essential to ensure they are equipped with the tools necessary for success. While there are many approaches to teacher development, professional learning teams, based on the work of Garmston and Wellman, focus on teachers' instructional delivery, targeted student learning needs, planning, implementing new strategies, collaboration, and reflective dialogue. The purpose of the study is to improve instructional practice by providing quality STEM instruction to students and to increase teacher self-efficacy, a teacher's perception of his or her ability to instruct students in the STEM disciplines. Theoretical implications of a study on an elementary STEM learning team could affect the way schools deliver STEM professional learning opportunities to teachers and the way students are delivered a quality STEM education. Research has shown that Model I behavior would limit the change process of professional learning through a surface inspection of the issues; however, Model II behaviors would benefit the teachers, students, and organization because teachers would be collaborating on specific objectives to develop a knowledge base and skill set to meet students' needs. Extending professional development by engaging stakeholders in a collaborative process to build Model II behaviors will create an organizational structure that facilitates learning.

  10. Belief and sign, true and false: the uniqueness of false belief reasoning.

    PubMed

    Zhang, Ting; Zhang, Qin; Li, Yiyuan; Long, Changquan; Li, Hong

    2013-11-01

    A long-standing controversy concerns whether the development of theory of mind results from domain-specific or domain-general changes (Wellman in The handbook of childhood cognitive development. Blackwell Publication, New Jersey, 2011). This event-related potential study explored the neural time course of domain-general and domain-specific components in belief reasoning. Fourteen participants completed location transfer false belief (FB), true belief (TB), false sign (FS) and true sign (TS) tasks, in which two pictures told a story about a dog that ran from a green into a red box. In the TB and FB tasks, a boy saw or did not see the transfer of the dog, respectively. In the FS and TS tasks, an arrow that pointed to the green box either altered its direction to the red box or did not alter following the transfer of the dog. Participants then inferred where the boy thought the dog was, or where the arrow indicated the dog to be. FB and TB reasoning elicited lower N2 amplitudes than FS and TS reasoning, an effect associated with the domain-general components of detection and classification. The late slow wave (LSW) for FB was more positive at frontal, central, and parietal sites than for FS because of the domain-specific component involved in FB reasoning. However, the LSW was less positive for TB than for FB but did not differ from the TS condition, which implies that mental representation might not be involved in TB reasoning.

  11. Friends, friendlessness, and the social consequences of gaining a theory of mind.

    PubMed

    Fink, Elian; Begeer, Sander; Peterson, Candida C; Slaughter, Virginia; de Rosnay, Marc

    2015-03-01

    Fink, Begeer, Peterson, Slaughter, and de Rosnay (2014) conducted a prospective longitudinal study showing that theory-of-mind (ToM) development at school entry (mean age 5.61 years) significantly predicted friendlessness both concurrently and 2 years later. Friendlessness (defined as lacking any friendship that is mutually reciprocated) is conceptually and empirically distinct from group popularity and independently predicts adverse mental health outcomes throughout life. Here, we respond to the thoughtful commentaries by Wellman (Brit. J. Dev. Psychol, 2015; 33, 24-26), Mizokawa and Koyasu (Brit. J. Dev. Psychol, 2015; 33, 21-23), and Lerner and Lillard (Brit. J. Dev. Psychol, 2015; 33, 18-20) with a focus on three key issues, namely (a) the definition and measurement of friendship, (b) the measurement of advanced ToM development beyond the preschool years, and (c) the exciting future potential for ToM-based training and intervention studies to combat chronic friendlessness. © 2015 The British Psychological Society.

  12. Cryogenic Properties of Aluminum Beryllium and Beryllium Materials

    NASA Technical Reports Server (NTRS)

    Gamwell, Wayne R.; McGill, Preston B.

    2003-01-01

    Ultimate tensile strength, yield strength, and elongation were obtained for the aluminum-beryllium alloy, AlBeMet162 (38%Al-62%Be), at cryogenic (-195.5 C (-320 F) and -252.8 C (-423 F)) temperatures, and for an optical grade beryllium, O-30H (99%Be), at -252.8 C. AlBeMet162 material was purchased to the requirements of SAE-AMS7912, "Aluminum-Beryllium Alloy, Extrusions." O-30H material was purchased to the requirements of Brush Wellman Inc. specification O-30H Optical Grade Beryllium. The ultimate tensile and yield strengths for extruded AlBeMet162 material increased with decreasing temperature, and the percent elongation decreased with decreasing temperature. Design properties for the ultimate tensile strength, yield strength, and percent elongation for extruded AlBeMet162 were generated. It was not possible to distinguish a difference in the room and cryogenic ultimate strength for the hot isostatically pressed (HIP'ed) O-30H material. The O-30H elongation decreased with decreasing temperature.

  13. The mind behind the message: advancing theory-of-mind scales for typically developing children, and those with deafness, autism, or Asperger syndrome.

    PubMed

    Peterson, Candida C; Wellman, Henry M; Slaughter, Virginia

    2012-01-01

    Children aged 3-12 years (n = 184) with typical development, deafness, autism, or Asperger syndrome took a series of theory-of-mind (ToM) tasks to confirm and extend previous developmental scaling evidence. A new sarcasm task, in the format of H. M. Wellman and D. Liu's (2004) 5-step ToM Scale, added a statistically reliable 6th step to the scale for all diagnostic groups. A key previous finding, divergence in task sequencing for children with autism, was confirmed. Comparisons among diagnostic groups, controlling age and language ability, showed that typical developers mastered the 6 ToM steps ahead of each of the 3 disabled groups, with implications for ToM theories. The final (sarcasm) task challenged even nondisabled 9-year-olds, demonstrating the new scale's sensitivity to post-preschool ToM growth. © 2012 The Authors. Child Development © 2012 Society for Research in Child Development, Inc.

  14. Nanotechnology for photodynamic therapy: a perspective from the Laboratory of Dr. Michael R. Hamblin in the Wellman Center for Photomedicine at Massachusetts General Hospital and Harvard Medical School.

    PubMed

    Hamblin, Michael R; Chiang, Long Y; Lakshmanan, Shanmugamurthy; Huang, Ying-Ying; Garcia-Diaz, Maria; Karimi, Mahdi; de Souza Rastelli, Alessandra Nara; Chandran, Rakkiyappan

    2015-08-01

    The research interests of the Hamblin Laboratory are broadly centered on the use of different kinds of light to treat many different diseases. Photodynamic therapy (PDT) uses the combination of dyes with visible light to produce reactive oxygen species that kill bacteria and cancer cells and destroy unwanted tissue. Likewise, UV light is especially good at killing pathogens. By contrast, red or near-infrared light can have the opposite effect, acting to preserve tissue from dying, and can stimulate healing and regeneration. In all these applications, nanotechnology is having an ever-growing impact. In PDT, self-assembled nano-drug carriers (micelles, liposomes, etc.) play a great role in solubilizing the photosensitizers, metal nanoparticles can carry out plasmon resonance enhancement, and fullerenes can act as photosensitizers themselves. In the realm of healing, single-walled carbon nanotubes can be electrofocused to produce nano-electronic biomedical devices, and nanomaterials will play a great role in restorative dentistry.

  15. Toddlers Benefit from Labeling on an Executive Function Search Task

    PubMed Central

    Miller, Stephanie E.; Marcovitch, Stuart

    2010-01-01

    Although labeling improves executive function (EF) performance in children older than 3 (e.g., Kirkham, Cruess, & Diamond, 2003), the results from studies with younger children have been equivocal (e.g., Sophian & Wellman, 1983). In the present study, we assessed performance in a computerized multistep multilocation search task with older 2-year-old children. The correct search location was either: (a) neither marked by a familiar picture nor given a distinct label, (b) marked by a familiar picture but not given a distinct label, (c) marked by a familiar picture and labeled by the experimenter, or (d) marked by a familiar picture and labeled by the participant. The results revealed that accuracy improved across conditions such that children made the fewest errors when they generated the label for the hiding location. These findings support the hierarchical competing systems model (Marcovitch & Zelazo, 2006, 2009) that postulates that improved performance can be explained by more powerful representations that guide search behavior. PMID:21112597

  16. Homology of the jaw muscles in lizards and snakes-a solution from a comparative gnathostome approach.

    PubMed

    Johnston, Peter

    2014-03-01

    Homology or shared evolutionary origin of jaw adductor muscles in lizards and snakes has been difficult to establish, although snakes clearly arose within the lizard radiation. Lizards typically have temporal adductors layered lateral to medial, and in snakes the muscles are arranged in a rostral to caudal pattern. Recent work has suggested that the jaw adductor group in gnathostomes is arranged as a folded sheet; when this theory is applied to snakes, homology with lizard morphology can be seen. This conclusion revisits the work of S.B. McDowell, J Herpetol 1986; 20:353-407, who proposed that homology involves identity of m. levator anguli oris and the loss of m. adductor mandibulae externus profundus, at least in "advanced" (colubroid) snakes. Here I advance the folded sheet hypothesis across the whole snake tree using new and literature data, and provide a solution to this homology problem. Copyright © 2014 Wiley Periodicals, Inc.

  17. Multilinear stress-strain and failure calibrations for Ti-6Al-4V.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corona, Edmundo

    This memo concerns calibration of an elastic-plastic J_2 material model for Ti-6Al-4V (grade 5) alloy based on tensile uniaxial stress-strain data obtained in the laboratory. In addition, tension tests on notched specimens provided data to calibrate two ductile failure models: Johnson-Cook and Wellman's tearing parameter. The tests were conducted by Kim Haulenbeek and Dave Johnson (1528) in the Structural Mechanics Laboratory (SML) during late March and early April, 2017. The SML EWP number was 4162. The stock material was a TIMETAL® 6-4 Titanium billet with 9 in. by 9 in. square section and length of 137 in. The product description indicates that it was a forging delivered in annealed condition (2 hours @ 1300 °F, AC at the mill). The tensile mechanical properties reported in the material certification are given in Table 1, where σ_o represents the 0.2% strain offset yield stress, σ_u the ultimate stress, ε_f the elongation at failure, and R.A. the reduction in area.
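    The Johnson-Cook failure model named above expresses the plastic strain at failure as a function of stress triaxiality (and, in its full form, strain rate and temperature). A minimal sketch with illustrative constants D1-D3 (hypothetical values for demonstration, not the memo's calibrated parameters):

```python
import math

def jc_failure_strain(triaxiality, d1, d2, d3, strain_rate_ratio=1.0,
                      d4=0.0, homologous_temp=0.0, d5=0.0):
    """Johnson-Cook failure strain:
    eps_f = (D1 + D2*exp(D3*sigma*)) * (1 + D4*ln(rate*)) * (1 + D5*T*),
    where sigma* is the stress triaxiality (mean / von Mises stress)."""
    return ((d1 + d2 * math.exp(d3 * triaxiality))
            * (1.0 + d4 * math.log(strain_rate_ratio))
            * (1.0 + d5 * homologous_temp))

# Illustrative constants only (D3 < 0: ductility drops as triaxiality rises).
D1, D2, D3 = 0.05, 0.30, -1.5
eps_smooth  = jc_failure_strain(1/3, D1, D2, D3)  # smooth bar, sigma* = 1/3
eps_notched = jc_failure_strain(0.8, D1, D2, D3)  # notched specimen center
print(eps_smooth > eps_notched)  # True
```

Notched tensile specimens raise the triaxiality at the specimen center, which is why the memo pairs smooth-bar stress-strain data with notched-specimen tests when calibrating failure.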

  18. Theory of Mind and Reading Comprehension in Deaf and Hard-of-Hearing Signing Children

    PubMed Central

    Holmer, Emil; Heimann, Mikael; Rudner, Mary

    2016-01-01

    Theory of Mind (ToM) is related to reading comprehension in hearing children. In the present study, we investigated progression in ToM in Swedish deaf and hard-of-hearing (DHH) signing children who were learning to read, as well as the association of ToM with reading comprehension. Thirteen children at Swedish state primary schools for DHH children performed a Swedish Sign Language (SSL) version of the Wellman and Liu (2004) ToM scale, along with tests of reading comprehension, SSL comprehension, and working memory. Results indicated that ToM progression did not differ from that reported in previous studies, although ToM development was delayed despite age-appropriate sign language skills. Correlation analysis revealed that ToM was associated with reading comprehension and working memory, but not sign language comprehension. We propose that some factor not investigated in the present study, possibly represented by inference making constrained by working memory capacity, supports both ToM and reading comprehension and may thus explain the results observed in the present study. PMID:27375532

  19. Adding sound to theory of mind: Comparing children's development of mental-state understanding in the auditory and visual realms.

    PubMed

    Hasni, Anita A; Adamson, Lauren B; Williamson, Rebecca A; Robins, Diana L

    2017-12-01

    Theory of mind (ToM) gradually develops during the preschool years. Measures of ToM usually target visual experience, but auditory experiences also provide valuable social information. Given differences between the visual and auditory modalities (e.g., sights persist, sounds fade) and the important role environmental input plays in social-cognitive development, we asked whether modality might influence the progression of ToM development. The current study expands Wellman and Liu's ToM scale (2004) by testing 66 preschoolers using five standard visual ToM tasks and five newly crafted auditory ToM tasks. Age and gender effects were found, with 4- and 5-year-olds demonstrating greater ToM abilities than 3-year-olds and girls passing more tasks than boys; there was no significant effect of modality. Both visual and auditory tasks formed a scalable set. These results indicate that there is considerable consistency in when children are able to use visual and auditory inputs to reason about various aspects of others' mental states. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Factorial structure of the 'ToM Storybooks': A test evaluating multiple components of Theory of Mind.

    PubMed

    Bulgarelli, Daniela; Testa, Silvia; Molina, Paola

    2015-06-01

    This study examined the factorial structure of the Theory of Mind (ToM) Storybooks, a comprehensive 93-item instrument tapping the five components in Wellman's model of ToM (emotion recognition, understanding of desire and beliefs, ability to distinguish between physical and mental entities, and awareness of the link between perception and knowledge). A sample of 681 three- to eight-year-old Italian children was divided into three age groups to assess whether factorial structure varied across different age ranges. Partial credit model analysis was applied to the data, leading to the empirical identification of 23 composite variables aggregating the ToM Storybooks items. Confirmatory factor analysis was then conducted on the composite variables, providing support for the theoretical model. There were partial differences in the specific composite variables making up the dimensions for each of the three age groups. A single test evaluating distinct dimensions of ToM is a valuable resource for clinical practice which may be used to define differential profiles for specific populations. © 2014 The British Psychological Society.

  1. Reducing the language content in ToM tests: A developmental scale.

    PubMed

    Burnel, Morgane; Perrone-Bertolotti, Marcela; Reboul, Anne; Baciu, Monica; Durrleman, Stephanie

    2018-02-01

    The goal of the current study was to statistically evaluate the reliable scalability of a set of tasks designed to assess Theory of Mind (ToM) without language as a confounding variable. This tool might be useful to study ToM in populations where language is impaired or to study links between language and ToM. Low verbal versions of the ToM tasks proposed by Wellman and Liu (2004) for their scale were tested in 234 children (2.5 years to 11.9 years). Results showed that 5 of the tasks formed a scale according to both Guttman and Rasch models whereas all 6 tasks could form a scale according to the Rasch model only. The main difference from the original scale was that the Explicit False Belief task could be included whereas the Knowledge Access (KA) task could not. The authors argue that the more verbal version of the KA task administered in previous studies could have measured language understanding rather than ToM. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
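    A Guttman scale in this context means each child who passes a harder ToM task should also pass every easier one; the usual summary statistic is the coefficient of reproducibility, with CR >= 0.90 conventionally taken as scalable. A sketch with toy data (hypothetical responses, not the study's):

```python
def reproducibility(responses):
    """Guttman coefficient of reproducibility for 0/1 task data.

    `responses`: per-child tuples with tasks ordered easiest -> hardest.
    A child's total score predicts passing exactly the `score` easiest
    tasks; deviations from that cumulative pattern count as errors
    (Goodenough-Edwards counting). CR = 1 - errors / total responses."""
    errors = total = 0
    for child in responses:
        score = sum(child)
        predicted = [1] * score + [0] * (len(child) - score)
        errors += sum(o != p for o, p in zip(child, predicted))
        total += len(child)
    return 1 - errors / total

# Five children on a 5-step scale: four perfectly cumulative patterns,
# one with a reversal (passes task 3 but fails easier task 2).
data = [(1,1,1,1,1), (1,1,1,0,0), (1,0,0,0,0), (1,1,0,0,0), (1,0,1,0,0)]
print(reproducibility(data))  # 0.92 -> would meet the 0.90 convention
```

Rasch scaling, also used in the study, is a stronger probabilistic model fit rather than this deterministic error count, which is why the two criteria can admit different task sets.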

  2. Theory of Mind and Reading Comprehension in Deaf and Hard-of-Hearing Signing Children.

    PubMed

    Holmer, Emil; Heimann, Mikael; Rudner, Mary

    2016-01-01

    Theory of Mind (ToM) is related to reading comprehension in hearing children. In the present study, we investigated progression in ToM in Swedish deaf and hard-of-hearing (DHH) signing children who were learning to read, as well as the association of ToM with reading comprehension. Thirteen children at Swedish state primary schools for DHH children performed a Swedish Sign Language (SSL) version of the Wellman and Liu (2004) ToM scale, along with tests of reading comprehension, SSL comprehension, and working memory. Results indicated that ToM progression did not differ from that reported in previous studies, although ToM development was delayed despite age-appropriate sign language skills. Correlation analysis revealed that ToM was associated with reading comprehension and working memory, but not sign language comprehension. We propose that some factor not investigated in the present study, possibly represented by inference making constrained by working memory capacity, supports both ToM and reading comprehension and may thus explain the results observed in the present study.

  3. Reconceptualizing reflexivity and dissonance in professional and personal domains.

    PubMed

    Brooks, Ann

    2008-09-01

    Debates around 'reflexivity' and the construction of the gendered self within late modernity have occupied the attention of both 'reflexive modernization' theorists (Beck, Giddens and Lash 1994; Beck and Beck-Gernsheim 1996; Giddens 1991, 1992) and gender and feminist theorists. While theorists such as Beck and Giddens have been preoccupied with establishing the connection between reflexivity and the construction of the 'non-gendered' self, gender and feminist theorists have sought to amplify the debate by exploring the intersecting nexus of contemporary theorizing more fully within this context. This paper explores the theoretical underpinnings of these debates and their application to specific professional and personal domains. I consider three case studies to assess these issues as outlined in my own work (Brooks 2006) and in the work of Wajcman and Martin (2002) and McDowell (1997), which draw on empirical research and explore changes to gender identity within professional and personal domains. I conclude that there is little evidence in the research presented here of any systematic reconfiguring of gender identities leading to a detraditionalization of gender as suggested by the 'reflexive modernization' theorists.

  4. Body Awareness: Construct and Self-Report Measures

    PubMed Central

    Mehling, Wolf E.; Gopisetty, Viranjini; Daubenmier, Jennifer; Price, Cynthia J.; Hecht, Frederick M.; Stewart, Anita

    2009-01-01

    Objectives Heightened body awareness can be adaptive and maladaptive. Improving body awareness has been suggested as an approach for treating patients with conditions such as chronic pain, obesity and post-traumatic stress disorder. We assessed the psychometric quality of selected self-report measures and examined their items for underlying definitions of the construct. Data sources PubMed, PsychINFO, HaPI, Embase, Digital Dissertations Database. Review methods Abstracts were screened; potentially relevant instruments were obtained and systematically reviewed. Instruments were excluded if they exclusively measured anxiety, covered emotions without related physical sensations, used observer ratings only, or were unobtainable. We restricted our study to the proprioceptive and interoceptive channels of body awareness. The psychometric properties of each scale were rated using a structured evaluation according to the method of McDowell. Following a working definition of the multi-dimensional construct, an inter-disciplinary team systematically examined the items of existing body awareness instruments, identified the dimensions queried and used an iterative qualitative process to refine the dimensions of the construct. Results From 1,825 abstracts, 39 instruments were screened. 12 were included for psychometric evaluation. Only two were rated as high standard for reliability, four for validity. Four domains of body awareness with 11 sub-domains emerged. Neither a single nor a compilation of several instruments covered all dimensions. Key domains that might potentially differentiate adaptive and maladaptive aspects of body awareness were missing in the reviewed instruments. Conclusion Existing self-report instruments do not address important domains of the construct of body awareness, are unable to discern between adaptive and maladaptive aspects of body awareness, or exhibit other psychometric limitations. 
Restricting the construct to its proprio- and interoceptive channels, we explore the current understanding of the multi-dimensional construct and suggest next steps for further research. PMID:19440300

  5. The balance of give and take in caregiver-partner relationships: An examination of self-perceived burden, relationship equity, and quality of life from the perspective of care recipients following stroke.

    PubMed

    McPherson, Christine J; Wilson, Keith G; Chyurlia, Livia; Leclerc, Charles

    2010-05-01

    We examined the sense of being a burden to others or self-perceived burden (SPB) in people with stroke. A mail survey was completed by 57 former inpatients and their partner caregivers. The care recipient survey included measures of functional status, quality of life, marital satisfaction, equity in the relationship, and psychological distress, as well as SPB using the Self-Perceived Burden Scale (SPBS; Cousineau, McDowell, Hotz, & Hébert, 2003). The caregiver survey included similar measures in addition to a caregiver burden measure. SPB was found to be a prevalent and distressing concern. SPBS scores correlated with measures of functional status and mood; however, the correlations were highest for measures of family roles and work/productivity. Using equity theory as a basis to examine the SPB construct, care recipients who perceived themselves as overbenefiting from the relationship had significantly higher SPB scores than those whose relationship was viewed as equitable or underbenefiting. For some, receiving care from a partner after stroke is associated with SPB. This sense of burden is related to changes in help-seeking behavior, quality of life, and distress.

  6. X-Ray Diffraction Studies of 145 MeV proton-irradiated AlBeMet 162

    DOE PAGES

    Elbakhshwan, Mohamed; McDonald, Kirk T.; Ghose, Sanjit; ...

    2016-08-03

    AlBeMet 162 (Materion Co., formerly Brush Wellman) has been irradiated with 145 MeV protons up to 1.2 × 10^20 cm^-2 fluence, with irradiation temperatures in the range of 100-220°C. Macroscopic postirradiation evaluation on the evolution of mechanical and thermal properties was integrated with a comprehensive X-ray diffraction study using high-energy monochromatic and polychromatic X-ray beams, which offered a microscopic view of the irradiation damage effects on AlBeMet. The study confirmed the stability of the metal-matrix composite, its resistance to proton damage, and the continuing separation of the two distinct phases, fcc aluminum and hcp beryllium, following irradiation. Furthermore, based on the absence of inter-planar distance change during proton irradiation, it was confirmed that the stacking faults and clusters on the Al (111) planes are stable, and thus can migrate from the cascade region and be absorbed at various sinks. XRD analysis of the unirradiated AlBeMet 162 showed clear change in the texture of the fcc phase with orientation especially in the Al (111) reflection which exhibits a "non-perfect" six-fold symmetry, implying a lack of isotropy in the composite.

  7. The validity and scalability of the Theory of Mind Scale with toddlers and preschoolers.

    PubMed

    Hiller, Rachel M; Weber, Nathan; Young, Robyn L

    2014-12-01

    Despite the importance of theory of mind (ToM) for typical development, there remain 2 key issues affecting our ability to draw robust conclusions. One is the continued focus on false belief as the sole measure of ToM. The second is the lack of empirically validated measures of ToM as a broad construct. Our key aim was to examine the validity and reliability of the 5-item ToM scale (Peterson, Wellman, & Liu, 2005). In particular, we extended previous research on this scale by assessing its scalability and validity for use with children from 2 years of age. Sixty-eight typically developing children (aged 24 to 61 months) were assessed on the scale's 5 tasks, along with a sixth Sally-Anne false-belief task. Our data replicated the scalability of the 5 tasks for a Rasch, but not a Guttman, scale. Guttman analysis showed that a 4-item scale may be more suitable for this age range. Further, the tasks showed good internal consistency and validity for use with children as young as 2 years of age. Overall, the measure provides a valid and reliable tool for the assessment of ToM, and in particular, the longitudinal assessment of this ability as a construct.

  8. Age and gender dependent development of Theory of Mind in 6- to 8-years old children

    PubMed Central

    Calero, Cecilia I.; Salles, Alejo; Semelman, Mariano; Sigman, Mariano

    2013-01-01

    The ability to attribute different mental states to distinct individuals, or Theory of Mind (ToM), is widely believed to be developed mostly during preschool years. How different factors such as gender, number of siblings, or coarse personality traits affect this development is not entirely agreed upon. Here, we introduce a computerized version of the scaled ToM suite of tasks introduced by Wellman and Liu (2004), which allows us to meaningfully test ToM development on children 6 to 8-years old. We find that kids this age are still not entirely proficient in all ToM tasks, and continue to show a progression of performance with age. By testing this new age range, too, we are able to observe a significant advantage of girls over boys in ToM performance. Other factors such as number of siblings, birth order, and coarse personality traits show no significant relation with the ToM task results. Finally, we introduce a novel way to quantify the scaling property of the suite involving a sequence of set inclusions on one hand and a comparison between specially tailored sets of logistic models on the other. These measures confirm the validity of the scale in the 6- to 8-years old range. PMID:23785326

  9. Age and gender dependent development of Theory of Mind in 6- to 8-years old children.

    PubMed

    Calero, Cecilia I; Salles, Alejo; Semelman, Mariano; Sigman, Mariano

    2013-01-01

    The ability to attribute different mental states to distinct individuals, or Theory of Mind (ToM), is widely believed to be developed mostly during preschool years. How different factors such as gender, number of siblings, or coarse personality traits affect this development is not entirely agreed upon. Here, we introduce a computerized version of the scaled ToM suite of tasks introduced by Wellman and Liu (2004), which allows us to meaningfully test ToM development on children 6 to 8-years old. We find that kids this age are still not entirely proficient in all ToM tasks, and continue to show a progression of performance with age. By testing this new age range, too, we are able to observe a significant advantage of girls over boys in ToM performance. Other factors such as number of siblings, birth order, and coarse personality traits show no significant relation with the ToM task results. Finally, we introduce a novel way to quantify the scaling property of the suite involving a sequence of set inclusions on one hand and a comparison between specially tailored sets of logistic models on the other. These measures confirm the validity of the scale in the 6- to 8-years old range.

  10. Biomedical optics centers: forty years of multidisciplinary clinical translation for improving human health

    NASA Astrophysics Data System (ADS)

    Tromberg, Bruce J.; Anderson, R. Rox; Birngruber, Reginald; Brinkmann, Ralf; Berns, Michael W.; Parrish, John A.; Apiou-Sbirlea, Gabriela

    2016-12-01

    Despite widespread government and public interest, there are significant barriers to translating basic science discoveries into clinical practice. Biophotonics and biomedical optics technologies can be used to overcome many of these hurdles, due, in part, to offering new portable, bedside, and accessible devices. The current JBO special issue highlights promising activities and examples of translational biophotonics from leading laboratories around the world. We identify common essential features of successful clinical translation by examining the origins and activities of three major international academic affiliated centers with beginnings traceable to the mid-late 1970s: The Wellman Center for Photomedicine (Mass General Hospital, USA), the Beckman Laser Institute and Medical Clinic (University of California, Irvine, USA), and the Medical Laser Center Lübeck at the University of Lübeck, Germany. Major factors driving the success of these programs include visionary founders and leadership, multidisciplinary research and training activities in light-based therapies and diagnostics, diverse funding portfolios, and a thriving entrepreneurial culture that tolerates risk. We provide a brief review of how these three programs emerged and highlight critical phases and lessons learned. Based on these observations, we identify pathways for encouraging the growth and formation of similar programs in order to more rapidly and effectively expand the impact of biophotonics and biomedical optics on human health.

  11. Biomedical optics centers: forty years of multidisciplinary clinical translation for improving human health.

    PubMed

    Tromberg, Bruce J; Anderson, R Rox; Birngruber, Reginald; Brinkmann, Ralf; Berns, Michael W; Parrish, John A; Apiou-Sbirlea, Gabriela

    2016-12-01

    Despite widespread government and public interest, there are significant barriers to translating basic science discoveries into clinical practice. Biophotonics and biomedical optics technologies can be used to overcome many of these hurdles, due, in part, to offering new portable, bedside, and accessible devices. The current JBO special issue highlights promising activities and examples of translational biophotonics from leading laboratories around the world. We identify common essential features of successful clinical translation by examining the origins and activities of three major international academic affiliated centers with beginnings traceable to the mid-late 1970s: The Wellman Center for Photomedicine (Mass General Hospital, USA), the Beckman Laser Institute and Medical Clinic (University of California, Irvine, USA), and the Medical Laser Center Lübeck at the University of Lübeck, Germany. Major factors driving the success of these programs include visionary founders and leadership, multidisciplinary research and training activities in light-based therapies and diagnostics, diverse funding portfolios, and a thriving entrepreneurial culture that tolerates risk. We provide a brief review of how these three programs emerged and highlight critical phases and lessons learned. Based on these observations, we identify pathways for encouraging the growth and formation of similar programs in order to more rapidly and effectively expand the impact of biophotonics and biomedical optics on human health.

  12. Confirmation of linear system theory prediction: Rate of change of Herrnstein's kappa as a function of response-force requirement.

    PubMed

    McDowell, J J; Wood, H M

    1985-01-01

    Four human subjects worked on all combinations of five variable-interval schedules and five reinforcer magnitudes (¢/reinforcer) in each of two phases of the experiment. In one phase the force requirement on the operandum was low (1 or 11 N) and in the other it was high (25 or 146 N). Estimates of Herrnstein's kappa were obtained at each reinforcer magnitude. The results were: (1) response rate was more sensitive to changes in reinforcement rate at the high than at the low force requirement, (2) kappa increased from the beginning to the end of the magnitude range for all subjects at both force requirements, (3) the reciprocal of kappa was a linear function of the reciprocal of reinforcer magnitude for seven of the eight data sets, and (4) the rate of change of kappa was greater at the high than at the low force requirement by an order of magnitude or more. The second and third findings confirm predictions made by linear system theory, and replicate the results of an earlier experiment (McDowell & Wood, 1984). The fourth finding confirms a further prediction of the theory and supports the theory's interpretation of conflicting data on the constancy of Herrnstein's kappa.
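The third finding states that 1/kappa is a straight line in 1/M, where M is the reinforcer magnitude. A minimal sketch with hypothetical magnitudes and coefficients, not the experiment's values, showing how the relation is recovered by a least-squares fit in reciprocal coordinates:

```python
# The reported relation: 1/kappa = a + b * (1/M), with M the reinforcer
# magnitude. Magnitudes and coefficients are hypothetical, for illustration.
import numpy as np

magnitudes = np.array([0.5, 1.0, 2.0, 4.0, 8.0])   # cents per reinforcer
a, b = 0.004, 0.010                                 # hypothetical coefficients
kappa = 1.0 / (a + b / magnitudes)                  # responses per unit time

# Recover a (intercept) and b (slope) by a straight-line fit in
# reciprocal coordinates, as in the analysis described above.
slope, intercept = np.polyfit(1.0 / magnitudes, 1.0 / kappa, 1)
print(round(intercept, 3), round(slope, 3))  # recovers ~0.004 and ~0.01
```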

  13. Confirmation of linear system theory prediction: Rate of change of Herrnstein's κ as a function of response-force requirement

    PubMed Central

    McDowell, J. J; Wood, Helena M.

    1985-01-01

    Four human subjects worked on all combinations of five variable-interval schedules and five reinforcer magnitudes (¢/reinforcer) in each of two phases of the experiment. In one phase the force requirement on the operandum was low (1 or 11 N) and in the other it was high (25 or 146 N). Estimates of Herrnstein's κ were obtained at each reinforcer magnitude. The results were: (1) response rate was more sensitive to changes in reinforcement rate at the high than at the low force requirement, (2) κ increased from the beginning to the end of the magnitude range for all subjects at both force requirements, (3) the reciprocal of κ was a linear function of the reciprocal of reinforcer magnitude for seven of the eight data sets, and (4) the rate of change of κ was greater at the high than at the low force requirement by an order of magnitude or more. The second and third findings confirm predictions made by linear system theory, and replicate the results of an earlier experiment (McDowell & Wood, 1984). The fourth finding confirms a further prediction of the theory and supports the theory's interpretation of conflicting data on the constancy of Herrnstein's κ. PMID:16812408

  14. Ecosystem thresholds, tipping points, and critical transitions

    USGS Publications Warehouse

    Munson, Seth M.; Reed, Sasha C.; Peñuelas, Josep; McDowell, Nathan G.; Sala, Osvaldo E.

    2018-01-01

    Abrupt shifts in ecosystems are cause for concern and will likely intensify under global change (Scheffer et al., 2001). The terms 'thresholds', 'tipping points', and 'critical transitions' have been used interchangeably to refer to sudden changes in the integrity or state of an ecosystem caused by environmental drivers (Holling, 1973; May, 1977). Threshold-based concepts have significantly aided our capacity to predict the controls over ecosystem structure and functioning (Schwinning et al., 2004; Peters et al., 2007) and have become a framework to guide the management of natural resources (Glick et al., 2010; Allen et al., 2011). However, our understanding of how biotic and abiotic drivers interact to regulate ecosystem responses and of ways to forecast the impending responses remains limited. Terrestrial ecosystems, in particular, are already responding to global change in ways that are both transformational and difficult to predict due to strong heterogeneity across temporal and spatial scales (Peñuelas & Filella, 2001; McDowell et al., 2011; Munson, 2013; Reed et al., 2016). Comparing approaches for measuring ecosystem performance in response to changing environmental conditions and for detecting stress and threshold responses can improve traditional tests of resilience and provide early warning signs of ecosystem transitions. Similarly, comparing responses across ecosystems can offer insight into the mechanisms that underlie variation in threshold responses.

  15. A dislocation density-based continuum model of the anisotropic shock response of single crystal α-cyclotrimethylene trinitramine

    NASA Astrophysics Data System (ADS)

    Luscher, D. J.; Addessio, F. L.; Cawkwell, M. J.; Ramos, K. J.

    2017-01-01

    We have developed a model for the finite deformation thermomechanical response of α-cyclotrimethylene trinitramine (RDX). Our model accounts for nonlinear thermoelastic lattice deformation through a free energy-based equation of state developed by Cawkwell et al. (2016) in combination with temperature and pressure dependent elastic constants, as well as dislocation-mediated plastic slip on a set of slip systems motivated by experimental observation. The kinetics of crystal plasticity are modeled using the Orowan equation relating slip rate to dislocation density and the dislocation velocity developed by Austin and McDowell (2011), which naturally accounts for transition from thermally activated to dislocation drag limited regimes. Evolution of dislocation density is specified in terms of local ordinary differential equations reflecting dislocation-dislocation interactions. This paper presents details of the theory and parameterization of the model, followed by discussion of simulations of flyer plate impact experiments. Impact conditions explored within this combined simulation and experimental effort span shock pressures ranging from 1 to 3 GPa for four crystallographic orientations and multiple specimen thicknesses. Simulation results generated using this model are shown to be in strong agreement with velocimetry measurements from the corresponding plate impact experiments. Finally, simulation results are used to motivate conclusions about the nature of dislocation-mediated plasticity in RDX.
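The Orowan equation mentioned above relates the plastic slip rate to the mobile dislocation density and the mean dislocation velocity. A minimal sketch with illustrative values, not parameters of the published RDX model:

```python
# Orowan relation for the plastic shear strain rate:
#   gamma_dot = rho_m * b * v_bar
# rho_m: mobile dislocation density (m^-2), b: Burgers vector magnitude (m),
# v_bar: mean dislocation velocity (m/s). Values below are illustrative
# only, not parameters of the published RDX model.

def orowan_shear_rate(rho_m, b, v_bar):
    """Plastic shear strain rate (s^-1) from the Orowan equation."""
    return rho_m * b * v_bar

rate = orowan_shear_rate(rho_m=1.0e12, b=2.5e-10, v_bar=100.0)
print(f"{rate:.3g} s^-1")  # 2.5e+04 s^-1
```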

  16. Investigation of ITER candidate beryllium grades irradiated at high temperature

    NASA Astrophysics Data System (ADS)

    Kupriyanov, I. B.; Gorokhov, V. A.; Melder, R. R.; Ostrovsky, Z. E.; Gervash, A. A.

    1998-10-01

    Beryllium is one of the main candidate materials both for the neutron multiplier in a solid breeding blanket and for the plasma facing components. That is why the investigation of beryllium behaviour under loading typical of a fusion reactor, in particular under neutron irradiation, is of great importance. This paper presents some results of an investigation of five beryllium grades (DshG-200, TR-30, TshG-56, TRR, TE-30, TIP-30) fabricated by VNIINM, Russia, and one (S-65) fabricated by Brush Wellman, USA. The average grain size of the investigated beryllium grades varied from 8 to 40 μm, beryllium oxide content was 0.7-3.2 wt.%, and initial tensile strength was 250-680 MPa. All the samples were irradiated in the active zone of the SM-3 reactor at 650-700°C up to a fast neutron fluence of (5.5-6.2) × 10^21 cm^-2 (2.7-3.0 dpa, helium content up to 1150 appm), E > 0.1 MeV. Irradiation swelling of the materials was found to be in the range of 0.3-1.7%. Beryllium grades TR-30 and TRR, having the smallest grain size and highest beryllium oxide content, demonstrated minimal swelling, which did not exceed 0.3% at 700°C and a fluence of 5.5 × 10^21 cm^-2. Mechanical properties and microstructure parameters measured before and after irradiation are also presented.

  17. The Rhynie hot-spring system: implications for the Devonian timescale, development of Devonian biota, gold mineralization, evolution of the atmosphere and Earth outgassing

    NASA Astrophysics Data System (ADS)

    Mark, D.; Rice, C.; Stuart, F.; Trewin, N.

    2011-12-01

    The Rhynie cherts are hot spring sinters that contain world-renowned plant and animal remains and anomalously high quantities of heavy metals, including gold. The biota in several beds is preserved undeformed, with plants in life positions, thus establishing that they and the indurating hydrothermal fluids were coeval. Despite the international importance of the Rhynie cherts, their age has been poorly constrained for three reasons: (1) lack of a precise radio-isotopic age, (2) low resolution of spore biostratigraphic schemes for Devonian terrestrial deposits, with only one to a few zones per stage, and (3) poor resolution of the early Devonian timescale. Wellman (2004) assigned a Pragian-?earliest Emsian age to the Rhynie cherts on the basis of the spore assemblage. A 40Ar/39Ar dating study targeting Rhynie chert yielded an age of 395 ± 12 Ma (1σ) (Rice et al., 1995). This contribution discusses a new high-precision 40Ar/39Ar age (407.1 ± 2.2 Ma, 2σ) for the Devonian hot-spring system at Rhynie (Mark et al., 2011) and demonstrates that a proposed U-Pb age (411.5 ± 1.1 Ma, 2σ) for the Rhynie cherts (Parry et al., 2011) is inconsistent with both field evidence and our interpretation of the U-Pb data. The 40Ar/39Ar age provides a robust marker for the polygonalis-emsiensis Spore Assemblage Biozone within the Pragian-?earliest Emsian. It also constrains the age of a wealth of flora and fauna preserved in life positions as well as dating gold mineralization. Furthermore, we have now determined the Ar isotope composition of pristine samples of the Rhynie chert using an ARGUS multi-collector mass spectrometer and a low-blank laser extraction technique. 40Ar/36Ar ratios are systematically lower than the modern air value (Lee et al., 2006), and are not accompanied by non-atmospheric 38Ar/36Ar ratios. We conclude that the Rhynie chert captured and has preserved Devonian atmosphere-derived Ar. The data indicate that the 40Ar/36Ar of the Devonian atmosphere was at least 3% lower than the modern air value (Lee et al., 2006). Thus the Earth's atmosphere has accumulated at least 5 ± 0.2 × 10^16 moles of 40Ar in the last c. 407 Ma, at an average rate of 1.24 ± 0.06 × 10^8 mol 40Ar/year. This overlaps the 40Ar accumulation rate determined from ice cores for the last 800,000 years (Bender et al., 2008) and implies that there has been no resolvable temporal change in outgassing rate since the mid-Palaeozoic. The new chronological and Ar isotope data provide a unique tie point and dictate outgassing of the Earth's interior early in Earth history. [1] Bender, M. et al. (2008) Proceedings of the National Academy of Sciences, 105, 8232-8237. [2] Wellman, C.H. (2004) Proceedings of the Royal Society of London. Biological Sciences, 271, 985-992. [3] Lee, J.Y. et al. (2006) Geochimica et Cosmochimica Acta, 70, 4507-4512. [4] Mark, D.F. et al. (2011) Geochimica et Cosmochimica Acta, 75, 555-569. [5] Parry, S.F. et al. (2011) Journal of the Geological Society, London, 168, 863-872. [6] Rice, C.M. et al. (1995) Journal of the Geological Society, London, 152, 229-250.
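The outgassing arithmetic quoted above can be checked directly; a minimal sketch:

```python
# Check of the quoted outgassing rate: 5 x 10^16 mol of 40Ar accumulated
# over ~407 Myr.
accumulated_mol = 5.0e16   # total 40Ar accumulated (mol)
elapsed_yr = 407.0e6       # elapsed time since the Rhynie chert formed (yr)
rate = accumulated_mol / elapsed_yr
print(f"{rate:.3g} mol 40Ar/yr")  # ~1.23e8, consistent with the quoted
                                  # 1.24 +/- 0.06 x 10^8 mol 40Ar/year
```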

  18. Comparison of trailside degradation across a gradient of trail use in the Sonoran Desert.

    PubMed

    Rowe, Helen Ivy; Tluczek, Melanie; Broatch, Jennifer; Gruber, Dan; Jones, Steve; Langenfeld, Debbie; McNamara, Peggy; Weinstein, Leona

    2018-02-01

    As recreational visitation to the Sonoran Desert increases, the concern of scientists, managers and advocates who manage its natural resources deepens. Although many studies have been conducted on trampling of undisturbed vegetation and the effects of trails on adjacent plant and soil communities, little such research has been conducted in the arid southwest. We sampled nine 450-m trail segments with different visitation levels in Scottsdale's McDowell Sonoran Preserve over three years to understand the effects of visitation on soil erosion, trailside soil crusts and plant communities. Soil crust was reduced by 27-34% near medium and high use trails (an estimated peak rate of 13-70 visitors per hour) compared with control plots, but there was less than 1% reduction near low use trails (peak rate of two to four visitors per hour). We did not detect soil erosion in the center 80% of the trampled area of any of the trails. The number of perennial plant species dropped by less than one plant species on average, but perennial plant cover decreased by 7.5% in trailside plots compared with control plots 6 m off-trail. At the current levels of visitation, the primary management focus should be keeping people on the originally constructed trail tread surface to reduce impact to adjacent soil crusts.

  19. Regional Stratigraphy and Petroleum Systems of the Illinois Basin, U.S.A.

    USGS Publications Warehouse

    Swezey, Christopher S.

    2009-01-01

    The publication combines data on Paleozoic and Mesozoic stratigraphy and petroleum geology of the Illinois basin, U.S.A., in order to facilitate visualizing the stratigraphy on a regional scale and visualizing stratigraphic relations within the basin. Data are presented in eight schematic chronostratigraphic sections arranged approximately from north to south, with time denoted in equal increments along the sections, in addition to the areal extent of this structural basin. The stratigraphic data are modified from Hass (1956), Conant and Swanson (1961), Wilman and others (1975), American Association of Petroleum Geologists (1984, 1986), Olive and McDowell (1986), Shaver and others (1986), Thompson (1986), Mancini and others (1996), and Harrison and Litwin (1997). The time scale is taken from Gradstein and others (2004). Additional stratigraphic nomenclature is from Harland and others (1990), Babcock and others (2007), and Bergstrom and others (2008). Stratigraphic sequences as defined by Sloss (1963, 1988) and Wheeler (1963) also are included, as well as the locations of major petroleum source rocks and major petroleum plays. The stratigraphic units shown are colored according to predominant lithology, in order to emphasize general lithologic patterns and to provide a broad overview of the Illinois basin. For the purpose of comparison, three columns on the right show schematic depictions of stratigraphy and interpreted events in the Illinois basin and in the adjacent Michigan and Appalachian basins.

  20. A dislocation density-based continuum model of the anisotropic shock response of single crystal α-cyclotrimethylene trinitramine

    DOE PAGES

    Luscher, Darby Jon; Addessio, Francis L.; Cawkwell, Marc Jon; ...

    2017-01-01

    Here, we have developed a model for the finite deformation thermomechanical response of α-cyclotrimethylene trinitramine (RDX). Our model accounts for nonlinear thermoelastic lattice deformation through a free energy-based equation of state developed by Cawkwell et al. (2016) in combination with temperature and pressure dependent elastic constants, as well as dislocation-mediated plastic slip on a set of slip systems motivated by experimental observation. The kinetics of crystal plasticity are modeled using the Orowan equation relating slip rate to dislocation density and the dislocation velocity developed by Austin and McDowell (2011), which naturally accounts for transition from thermally activated to dislocation drag limited regimes. Evolution of dislocation density is specified in terms of local ordinary differential equations reflecting dislocation–dislocation interactions. This paper presents details of the theory and parameterization of the model, followed by discussion of simulations of flyer plate impact experiments. Impact conditions explored within this combined simulation and experimental effort span shock pressures ranging from 1 to 3 GPa for four crystallographic orientations and multiple specimen thicknesses. Simulation results generated using this model are shown to be in strong agreement with velocimetry measurements from the corresponding plate impact experiments. Finally, simulation results are used to motivate conclusions about the nature of dislocation-mediated plasticity in RDX.

  1. Measurements and Modelling of Sputtering Rates with Low Energy Ions

    NASA Astrophysics Data System (ADS)

    Ruzic, David N.; Smith, Preston C.; Turkot, Robert B., Jr.

    1996-10-01

    The angular-resolved sputtering yield of Be by D+, and Al by Ar+ was predicted and then measured. A 50 to 1000 eV ion beam from a Colutron was focused onto commercial grade and magnetron target grade samples. The S-65 C grade beryllium samples were supplied by Brush Wellman and the Al samples from TOSOH SMD. In our vacuum chamber the samples can be exposed to a dc D or Ar plasma to remove oxide, load the surface and more nearly simulate steady state operating conditions in the plasma device. The angular distribution of the sputtered atoms was measured by collection on a single crystal graphite witness plate. The areal density of Be or Al (and BeO or Al2O3, after exposure to air) was then measured using a Scanning Auger Spectrometer. Total yield was also measured by deposition onto a quartz crystal oscillator simultaneously to deposition onto the witness plate. A three dimensional version of vectorized fractal TRIM (VFTRIM3D), a Monte-Carlo computer code which includes surface roughness characterized by fractal geometry, was used to predict the angular distribution of the sputtered particles and a global sputtering coefficient. Over a million trajectories were simulated for each incident angle to determine the azimuthal and polar angle distributions of the sputtered atoms. The experimental results match closely with the simulations for total yield, while the measured angular distributions depart somewhat from the predicted cosine curve.
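A Monte-Carlo code of this kind compares measured angular distributions against a cosine emission law. A minimal sketch of sampling a cosine-distributed polar angle by inversion (illustrative only, not VFTRIM3D itself):

```python
# Sampling a cosine emission law for sputtered atoms: the flux per solid
# angle is proportional to cos(theta), so the polar angle has density
# p(theta) = 2*sin(theta)*cos(theta) on [0, pi/2], sampled by inversion.
# Illustrative only; not the VFTRIM3D code itself.
import math
import random

def sample_cosine_polar_angle(rng):
    """Polar angle (radians) drawn from a cosine emission distribution."""
    return math.asin(math.sqrt(rng.random()))

rng = random.Random(0)
angles = [sample_cosine_polar_angle(rng) for _ in range(100_000)]
mean_deg = math.degrees(sum(angles) / len(angles))
print(round(mean_deg, 1))  # close to the analytic mean of 45 degrees
```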

  2. A Bayesian Framework for False Belief Reasoning in Children: A Rational Integration of Theory-Theory and Simulation Theory

    PubMed Central

    Asakura, Nobuhiko; Inui, Toshio

    2016-01-01

    Two apparently contrasting theories have been proposed to account for the development of children's theory of mind (ToM): theory-theory and simulation theory. We present a Bayesian framework that rationally integrates both theories for false belief reasoning. This framework exploits two internal models for predicting the belief states of others: one of self and one of others. These internal models are responsible for simulation-based and theory-based reasoning, respectively. The framework further takes into account empirical studies of a developmental ToM scale (e.g., Wellman and Liu, 2004): developmental progressions of various mental state understandings leading up to false belief understanding. By representing the internal models and their interactions as a causal Bayesian network, we formalize the model of children's false belief reasoning as probabilistic computations on the Bayesian network. This model probabilistically weighs and combines the two internal models and predicts children's false belief ability as a multiplicative effect of their early-developed abilities to understand the mental concepts of diverse beliefs and knowledge access. Specifically, the model predicts that children's proportion of correct responses on a false belief task can be closely approximated as the product of their proportions correct on the diverse belief and knowledge access tasks. To validate this prediction, we illustrate that our model provides good fits to a variety of ToM scale data for preschool children. We discuss the implications and extensions of our model for a deeper understanding of developmental progressions of children's ToM abilities. PMID:28082941

  3. A Bayesian Framework for False Belief Reasoning in Children: A Rational Integration of Theory-Theory and Simulation Theory.

    PubMed

    Asakura, Nobuhiko; Inui, Toshio

    2016-01-01

    Two apparently contrasting theories have been proposed to account for the development of children's theory of mind (ToM): theory-theory and simulation theory. We present a Bayesian framework that rationally integrates both theories for false belief reasoning. This framework exploits two internal models for predicting the belief states of others: one of self and one of others. These internal models are responsible for simulation-based and theory-based reasoning, respectively. The framework further takes into account empirical studies of a developmental ToM scale (e.g., Wellman and Liu, 2004): developmental progressions of various mental state understandings leading up to false belief understanding. By representing the internal models and their interactions as a causal Bayesian network, we formalize the model of children's false belief reasoning as probabilistic computations on the Bayesian network. This model probabilistically weighs and combines the two internal models and predicts children's false belief ability as a multiplicative effect of their early-developed abilities to understand the mental concepts of diverse beliefs and knowledge access. Specifically, the model predicts that children's proportion of correct responses on a false belief task can be closely approximated as the product of their proportions correct on the diverse belief and knowledge access tasks. To validate this prediction, we illustrate that our model provides good fits to a variety of ToM scale data for preschool children. We discuss the implications and extensions of our model for a deeper understanding of developmental progressions of children's ToM abilities.
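
    The model's central quantitative prediction is simple enough to state in code: the proportion correct on the false belief task is approximated by the product of the proportions correct on the diverse beliefs and knowledge access tasks. The input proportions below are hypothetical illustrations, not data from the paper.

```python
# Predicted proportion correct on the false belief (FB) task as the product
# of proportions correct on the diverse beliefs (DB) and knowledge access
# (KA) tasks. Inputs are made-up illustrative values.

def predicted_false_belief(p_db, p_ka):
    return p_db * p_ka

p = predicted_false_belief(0.85, 0.60)
print(round(p, 2))  # 0.51
```

    The multiplicative form reflects the causal Bayesian network structure, in which false belief success requires both earlier-developing abilities to be in place.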

  4. The Effects of Theory of Mind Training on the False Belief Understanding of Deaf and Hard-of-Hearing Students in Prekindergarten and Kindergarten.

    PubMed

    Tucci, Stacey L; Easterbrooks, Susan R; Lederberg, Amy R

    2016-07-01

    Data from a growing number of research studies indicate that children with hearing loss are delayed in Theory of Mind (ToM) development when compared to their typically developing, hearing peers. While other researchers have studied the developmental trajectories of ToM in school-age students who are deaf, a limited number have addressed the need for interventions for this population. The present study extends the current research on ToM interventions to the Prekindergarten and Kindergarten levels. This study used a single-case multiple baseline design to examine the effects of a ToM intervention on participants' false belief understanding as well as outcomes on a near generalization measure and a far generalization measure. A ToM thought bubble intervention (i.e., a visual representation of what people are thinking) developed by Wellman and Peterson (2013 Deafness, thought bubbles, and theory-of-mind development. Developmental Psychology, 49, 2357-2367) was modified in key areas. Results from the Single-Case Design portion of the study indicate a functional, or causal, relation between the ToM intervention and the participants' acquisition of the targeted skills in each stage although progress was not uniform. Results from the pre-post assessments indicate that the children did make progress up the scale. These results inform the field in regard to the efficacy and feasibility of a ToM intervention for young deaf children. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  5. Urban remote sensing applications: TIMS observations of the City of Scottsdale

    NASA Technical Reports Server (NTRS)

    Christensen, Philip R.; Melendrez, David E.; Anderson, Donald L.; Hamilton, Victoria E.; Wenrich, Melissa L.; Howard, Douglas

    1995-01-01

    A research program has been initiated between Arizona State University and the City of Scottsdale, Arizona to study the potential applications of TIMS (Thermal Infrared Multispectral Scanner) data for urban scene classification, desert environmental assessment, and change detection. This program is part of a long-term effort to integrate remote sensing observations into state and local planning activities to improve decision making and future planning. Specific test sites include a section of the downtown Scottsdale region that has been mapped in very high detail as part of a pilot program to develop an extensive GIS database. This area thus provides an excellent time history of the evolution of the city infrastructure, such as the timing and composition of street repavement. A second study area, the McDowell region, is the subject of intensive study by state and local agencies assessing potential sites for urban development as well as preservation. These activities are of particular relevance as the Phoenix metropolitan area undergoes major expansion into the surrounding desert areas. The objectives of this study in urban areas are aimed at determining potential applications of TIMS data for classifying and assessing land use and surface temperatures. Land use centers on surface impermeability studies for storm runoff assessment and pollution control. These studies focus on determining the areal abundance of urban vegetation and undeveloped soil. Highly experimental applications include assessment and monitoring of pavement condition. Temperature studies focus on determining swimming pool area and temperature for use in monitoring evaporation and urban water consumption.

  6. Classification and authentication of unknown water samples using machine learning algorithms.

    PubMed

    Kundu, Palash K; Panchariya, P C; Kundu, Madhusree

    2011-07-01

    This paper proposes machine learning approaches for the classification and authentication of real-life water samples. The techniques use experimental measurements from a pulse voltammetry method based on an electronic tongue (E-tongue) instrumentation system with silver and platinum electrodes. E-tongues include arrays of solid-state ion sensors, transducers (even of different types), data collectors, and data analysis tools, all oriented to the classification of liquid samples and the authentication of unknown liquid samples. The time series signal and the corresponding raw data represent the measurement from a multi-sensor system. The E-tongue system, implemented in a laboratory environment for six different ISI (Bureau of Indian Standards) certified water samples (Aquafina, Bisleri, Kingfisher, Oasis, Dolphin, and McDowell), was the data source for developing two types of machine learning algorithms: classification and regression. A water data set consisting of six sample classes with 4402 features was considered. A PCA (principal component analysis) based classification and authentication tool was developed in this study as the machine learning component of the E-tongue system. A proposed partial least squares (PLS) based classifier, dedicated to authenticating a specific category of water sample, evolved as an integral part of the E-tongue instrumentation system. The developed PCA- and PLS-based E-tongue system achieved encouraging overall authentication accuracy, with excellent performance for the aforesaid categories of water samples. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
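
    As a stand-in illustration of the classification stage (the paper itself uses PCA- and PLS-based methods on 4402-feature voltammetry vectors), a minimal nearest-centroid classifier over toy two-feature vectors looks like this; the labels and feature values are invented for the sketch.

```python
# Minimal nearest-centroid classifier over labeled feature vectors.
# Labels and features below are toy values, not the paper's water data.

def train_centroids(labeled):
    """labeled: dict mapping label -> list of equal-length feature vectors."""
    cents = {}
    for label, vecs in labeled.items():
        n = len(vecs)
        cents[label] = [sum(col) / n for col in zip(*vecs)]
    return cents

def classify(cents, vec):
    def dist2(a, b):  # squared Euclidean distance
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(cents, key=lambda lbl: dist2(cents[lbl], vec))

cents = train_centroids({
    "brand_a": [[1.0, 0.1], [1.2, 0.0]],
    "brand_b": [[0.1, 1.1], [0.0, 0.9]],
})
print(classify(cents, [1.1, 0.05]))  # brand_a
```

    A real E-tongue pipeline would first reduce the raw time-series features (e.g., by PCA) before any such distance-based decision rule is applied.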

  7. Extending unified-theory-of-reinforcement neural networks to steady-state operant behavior.

    PubMed

    Calvin, Olivia L; McDowell, J J

    2016-06-01

    The unified theory of reinforcement has been used to develop models of behavior over the last 20 years (Donahoe et al., 1993). Previous research has focused on the theory's concordance with the respondent behavior of humans and animals. In this experiment, neural networks were developed from the theory to extend the unified theory of reinforcement to operant behavior on single-alternative variable-interval schedules. This area of operant research was selected because previously developed neural networks could be applied to it without significant alteration. Previous research with humans and animals indicates that the pattern of their steady-state behavior is hyperbolic when plotted against the obtained rate of reinforcement (Herrnstein, 1970). A genetic algorithm was used in the first part of the experiment to determine parameter values for the neural networks, because values that were used in previous research did not result in a hyperbolic pattern of behavior. After finding these parameters, hyperbolic and other similar functions were fitted to the behavior produced by the neural networks. The form of the neural network's behavior was best described by an exponentiated hyperbola (McDowell, 1986; McLean and White, 1983; Wearden, 1981), which was derived from the generalized matching law (Baum, 1974). In post-hoc analyses the addition of a baseline rate of behavior significantly improved the fit of the exponentiated hyperbola and removed systematic residuals. The form of this function was consistent with human and animal behavior, but the estimated parameter values were not. Copyright © 2016 Elsevier B.V. All rights reserved.
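
    The exponentiated hyperbola that best described the networks' behavior has a compact closed form; the parameter values in the example are illustrative, not the fitted values from this study.

```python
# Exponentiated hyperbola: steady-state response rate B as a function of
# obtained reinforcement rate r, with asymptote k, half-saturation re, and
# exponent a. With a = 1 it reduces to Herrnstein's hyperbola B = k*r/(r+re).
# Parameter values below are illustrative only.

def exponentiated_hyperbola(r, k, re, a):
    return k * r**a / (r**a + re**a)

# a = 1 recovers the plain hyperbola:
print(exponentiated_hyperbola(60.0, 100.0, 60.0, 1.0))  # 50.0
```

    The post-hoc addition of a baseline rate reported in the abstract would simply add a constant term to this function.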

  8. Competence and performance in belief-desire reasoning across two cultures: the truth, the whole truth and nothing but the truth about false belief?

    PubMed

    Yazdi, Amir Amin; German, Tim P; Defeyter, Margaret Anne; Siegal, Michael

    2006-06-01

    There is a change in false belief task performance across the 3-5 year age range, as confirmed in a recent meta-analysis [Wellman, H. M., Cross, D., & Watson, J. (2001). Meta-analysis of theory-of-mind development: The truth about false belief. Child Development, 72, 655-684]. This meta-analysis identified several performance factors influencing success, including manipulations that highlight the salience of the initial belief content (such as asking where Sally will look first for the marble). However, because a proportion of variance in performance remained unexplained even when identified performance factors were controlled for, the authors concluded from the standpoint of a 'theory-theory' account that children's improvement is the result of conceptual change. Further, the meta-analysis showed that manipulations such as 'look first' improve performance only in children who are in the older part of the 3-5 year range, and thus plausibly operating with a 'transitional' theory of mind--just on the point of realizing conceptual change. Here, we present three studies systematically investigating the 'look first' manipulation which showed that: (i) the advantage for the look first question can be demonstrated in children across different cultures, (ii) look first has an effect that is additive to the improvement with age; there is no interaction such that older children gain more benefit than younger children, (iii) performance in younger children can be, but is not always, elevated to levels that are statistically above chance. These results challenge the theory-theory account and are discussed in terms of models of belief-desire reasoning in which both conceptual competence and performance factors play central roles.

  9. Not just a sum of its parts: How tasks of the theory of mind scale relate to executive function across time.

    PubMed

    Doenyas, Ceymi; Yavuz, H Melis; Selcuk, Bilge

    2018-02-01

    There is a well-established relationship between theory of mind (ToM) and executive function (EF) during the preschool years. However, less is known about the concurrent and longitudinal relations between EF and specific tasks tapping different aspects of ToM. The current study investigated the ToM-EF relationship across 1 year in 3- to 5-year-old Turkish children using the ToM battery of Wellman and Liu (2004), which measures understanding of diverse desires (DD), diverse beliefs (DB), knowledge access (KA), contents false belief (CFB), explicit false belief (EFB), and hidden emotion (HE). This battery has not yet been used in its entirety to test the predictive relations between ToM and EF. We used peg-tapping and day-night tasks to measure EF. Our sample comprised 150 Turkish preschool children (69 girls) aged 36-60 months at Time 1 (T1) and 49-73 months at Time 2 (T2). Using the ToM composite with all six tasks, when child's age, receptive language, and T1 ability level (EF or ToM) were controlled, T1 EF significantly predicted T2 ToM, whereas T1 ToM did not predict T2 EF. Among DD, DB, KA, false belief understanding (FBU: the composite score of CFB and EFB), and HE, only KA and FBU were significantly associated with EF at T1 and T2. Further regression analyses showed that KA did not have a predictive relationship with EF. Instead, FBU drove the predictive EF-ToM relationship across time. Thus, in Turkish children, earlier EF predicts later ToM, but especially the FBU component, in this well-validated battery. Copyright © 2017. Published by Elsevier Inc.

  10. Mechanical properties and microstructure of copper alloys and copper alloy-stainless steel laminates for fusion reactor high heat flux applications

    NASA Astrophysics Data System (ADS)

    Leedy, Kevin Daniel

    A select group of copper alloys and bonded copper alloy-stainless steel panels are under consideration for heat sink applications in first wall and divertor structures of a planned thermonuclear fusion reactor. Because these materials must retain high strengths and withstand high heat fluxes, their material properties and microstructures must be well understood. Candidate copper alloys include precipitate strengthened CuNiBe and CuCrZr and dispersion strengthened Cu-Al2O3 (CuAl25). In this study, uniaxial mechanical fatigue tests were conducted on bulk copper alloy materials at temperatures up to 500°C in air and vacuum environments. Based on standardized mechanical properties measurement techniques, a series of tests were also implemented to characterize copper alloy-316L stainless steel joints produced by hot isostatic pressing or by explosive bonding. The correlation between mechanical properties and the microstructure of fatigued copper alloys and the interface of copper alloy-stainless steel laminates was examined. Commercial grades of these alloys were used to maintain a degree of standardization in the materials testing. The commercial alloys used were OMG Americas Glidcop CuAl25 and CuAl15; Brush Wellman Hycon 3HP and Trefimetaux CuNiBe; and Kabelmetal Elbrodur and Trefimetaux CuCrZr. CuAl25 and CuNiBe alloys possessed the best combination of fatigue resistance and microstructural stability. The CuAl25 alloy showed only minimal microstructural changes following fatigue while the CuNiBe alloy consistently exhibited the highest fatigue strength. Transmission electron microscopy observations revealed that small matrix grain sizes and high densities of submicron strengthening phases promoted homogeneous slip deformation in the copper alloys. Thus, highly organized fatigue dislocation structure formation, as commonly found in oxygen-free high conductivity Cu, was inhibited.
A solid plate of CuAl25 alloy hot isostatically pressed to a 316L stainless steel plate showed the best overall mechanical properties of the studied bi-metallic bonded panels. Bond properties were nominally inferior to constituent bulk material properties and fracture toughness values, in particular, were quite low for all bonded laminates. Delamination near the copper alloy-stainless steel interface was the dominant failure mode in the bi-metallic panels. The joining processes caused microstructural alterations in the bond interfacial regions including: microporosity, new precipitate formation, existing precipitate morphology changes and interdiffusion of constituent elements.

  11. Arousal Intensity is a Distinct Pathophysiological Trait in Obstructive Sleep Apnea

    PubMed Central

    Amatoury, Jason; Azarbarzin, Ali; Younes, Magdy; Jordan, Amy S.; Wellman, Andrew; Eckert, Danny J.

    2016-01-01

    Study Objectives: Arousals from sleep vary in duration and intensity. Accordingly, the physiological consequences of different types of arousals may also vary. Factors that influence arousal intensity are only partly understood. This study aimed to determine if arousal intensity is mediated by the strength of the preceding respiratory stimulus, and investigate other factors mediating arousal intensity and its role on post-arousal ventilatory and pharyngeal muscle responses. Methods: Data were acquired in 71 adults (17 controls, 54 obstructive sleep apnea patients) instrumented with polysomnography equipment plus genioglossus and tensor palatini electromyography (EMG), a nasal mask and pneumotachograph, and an epiglottic pressure sensor. Transient reductions in CPAP were delivered during sleep to induce respiratory-related arousals. Arousal intensity was measured using a validated 10-point scale. Results: Average arousal intensity was not related to the magnitude of the preceding respiratory stimuli but was positively associated with arousal duration, time to arousal, rate of change in epiglottic pressure and negatively with BMI (R2 > 0.10, P ≤ 0.006). High (> 5) intensity arousals caused greater ventilatory responses than low (≤ 5) intensity arousals (10.9 [6.8–14.5] vs. 7.8 [4.7–12.9] L/min; P = 0.036) and greater increases in tensor palatini EMG (10 [3–17] vs. 6 [2–11]%max; P = 0.031), with less pronounced increases in genioglossus EMG. Conclusions: Average arousal intensity is independent of the preceding respiratory stimulus. This is consistent with arousal intensity being a distinct trait. Respiratory and pharyngeal muscle responses increase with arousal intensity. Thus, patients with higher arousal intensities may be more prone to respiratory control instability. These findings are important for sleep apnea pathogenesis. Citation: Amatoury J, Azarbarzin A, Younes M, Jordan AS, Wellman A, Eckert DJ. 
Arousal intensity is a distinct pathophysiological trait in obstructive sleep apnea. SLEEP 2016;39(12):2091–2100. PMID:27784404

  12. Quantification of Accelerometer Derived Impacts Associated With Competitive Games in National Collegiate Athletic Association Division I College Football Players.

    PubMed

    Wellman, Aaron D; Coad, Sam C; Goulet, Grant C; McLellan, Christopher P

    2017-02-01

    Wellman, AD, Coad, SC, Goulet, GC, and McLellan, CP. Quantification of accelerometer derived impacts associated with competitive games in National Collegiate Athletic Association division I college football players. J Strength Cond Res 31(2): 330-338, 2017-The aims of the present study were to (a) examine positional impact profiles of National Collegiate Athletic Association (NCAA) division I college football players using global positioning system (GPS) and integrated accelerometry (IA) technology and (b) determine if positional differences in impact profiles during competition exist within offensive and defensive teams. Thirty-three NCAA division I Football Bowl Subdivision players were monitored using GPS and IA (GPSports) during 12 regular season games throughout the 2014 season. Individual player data sets (n = 294) were divided into offensive and defensive teams, and positional subgroups. The intensity, number, and distribution of impact forces experienced by players during competition were recorded. Positional differences were found for the distribution of impacts within offensive and defensive teams. Wide receivers sustained more very light and light to moderate (5-6.5 G force) impacts than other position groups, whereas the running backs were involved in more severe (>10 G force) impacts than all offensive position groups, with the exception of the quarterbacks (p ≤ 0.05). The defensive back and linebacker groups were subject to more very light (5.0-6.0 G force) impacts, and the defensive tackle group sustained more heavy and very heavy (7.1-10 G force) impacts than other defensive positions (p ≤ 0.05). Data from the present study provide novel quantification of positional impact profiles related to the physical demands of college football games and highlight the need for position-specific monitoring and training in the preparation for the impact loads experienced during NCAA division I football competition.
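
    The zone counts reported above amount to binning each accelerometer impact magnitude into a named intensity zone. The zone edges below are approximated from the ranges quoted in the abstract (including the gap between 6.5 and 7.1 G as quoted) and are not the exact GPSports zone definitions.

```python
# Count accelerometer impacts per intensity zone. Zone edges (in G) are
# approximations of the ranges quoted in the abstract, not the exact
# GPSports definitions; impacts outside every zone are simply not counted.

ZONES = [("very light", 5.0, 6.0),
         ("light-moderate", 6.0, 6.5),
         ("heavy", 7.1, 10.0),
         ("severe", 10.0, float("inf"))]

def count_impacts(gs):
    counts = {name: 0 for name, _, _ in ZONES}
    for g in gs:
        for name, lo, hi in ZONES:
            if lo <= g < hi:
                counts[name] += 1
                break
    return counts

print(count_impacts([5.2, 6.3, 8.0, 12.5, 4.0]))
```

    Per-position profiles like those in the study are then just these counts aggregated per player and position group across games.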

  13. Upper Airway Collapsibility (Pcrit) and Pharyngeal Dilator Muscle Activity are Sleep Stage Dependent

    PubMed Central

    Carberry, Jayne C.; Jordan, Amy S.; White, David P.; Wellman, Andrew; Eckert, Danny J.

    2016-01-01

    Study Objectives: An anatomically narrow/highly collapsible upper airway is the main cause of obstructive sleep apnea (OSA). Upper airway muscle activity contributes to airway patency and, like apnea severity, can be sleep stage dependent. Conversely, existing data derived from a small number of participants suggest that upper airway collapsibility, measured by the passive pharyngeal critical closing pressure (Pcrit) technique, is not sleep stage dependent. This study aimed to determine the effect of sleep stage on Pcrit and upper airway muscle activity in a larger cohort than previously tested. Methods: Pcrit and/or muscle data were obtained from 72 adults aged 20–64 y with and without OSA. Pcrit was determined via transient reductions in continuous positive airway pressure (CPAP) during N2, slow wave sleep (SWS) and rapid eye movement (REM) sleep. Genioglossus and tensor palatini muscle activities were measured: (1) awake with and without CPAP, (2) during stable sleep on CPAP, and (3) in response to the CPAP reductions used to quantify Pcrit. Results: Pcrit was 4.9 ± 1.4 cmH2O higher (more collapsible) during REM versus SWS (P = 0.012), 2.3 ± 0.6 cmH2O higher during REM versus N2 (P < 0.001), and 1.6 ± 0.7 cmH2O higher in N2 versus SWS (P = 0.048). Muscle activity decreased from wakefulness to sleep and from SWS to N2 to REM sleep for genioglossus but not for tensor palatini. Pharyngeal muscle activity increased by ∼50% by breath 5 following CPAP reductions. Conclusions: Upper airway collapsibility measured via the Pcrit technique and genioglossus muscle activity vary with sleep stage. These findings should be taken into account when performing and interpreting “passive” Pcrit measurements. Citation: Carberry JC, Jordan AS, White DP, Wellman A, Eckert DJ. Upper airway collapsibility (Pcrit) and pharyngeal dilator muscle activity are sleep stage dependent. SLEEP 2016;39(3):511–521. PMID:26612386
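
    Passive Pcrit is conventionally estimated by regressing peak inspiratory flow from flow-limited breaths on mask pressure and extrapolating to zero flow; the sketch below implements that standard linear extrapolation (an assumption about the general technique, not this paper's exact fitting code), with invented breath data.

```python
# Estimate Pcrit by ordinary least squares: fit peak inspiratory flow
# (L/s) vs mask pressure (cmH2O) across flow-limited breaths, then solve
# for the pressure at which the fitted flow reaches zero. Breath data
# below are invented for illustration.

def fit_pcrit(pressures, peak_flows):
    n = len(pressures)
    mx = sum(pressures) / n
    my = sum(peak_flows) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(pressures, peak_flows)) / \
            sum((x - mx) ** 2 for x in pressures)
    intercept = my - slope * mx
    return -intercept / slope  # pressure where fitted flow = 0

pcrit = fit_pcrit([2, 4, 6, 8], [0.2, 0.3, 0.4, 0.5])
print(round(pcrit, 1))  # -2.0
```

    A more negative Pcrit indicates a less collapsible airway; the study's sleep-stage comparisons are differences in this extrapolated pressure.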

  14. Basolateral Amygdala and the Regulation of Fear-Conditioned Changes in Sleep: Role of Corticotropin-Releasing Factor

    PubMed Central

    Wellman, Laurie L.; Yang, Linghui; Ambrozewicz, Marta A.; Machida, Mayumi; Sanford, Larry D.

    2013-01-01

    Study Objective: To determine whether corticotropin-releasing factor (CRF) in the basolateral amygdala (BLA) modulated sleep and fear-conditioned alterations in sleep. Design: After 2 days of habituation to recording procedures, baseline sleep recordings were obtained. The animals were then habituated to the handling procedure necessary for microinjections over 2 consecutive days. In experiment 1, rats received microinjections of 0.5 μL antalarmin (1.61 or 4.82 mM), a CRF receptor 1 antagonist, or distilled water once a week for 3 wk. In experiment 2, rats received a microinjection of either antalarmin or vehicle prior to inescapable shock training (ST; 20 shocks; 0.8 mA, 0.5 sec; 1 min interstimulus interval). The animals were placed back in the context 7 days later for 30 min without shock (CR; context re-exposure). Sleep was recorded for 8 h after each manipulation. Setting: NA. Subjects: Outbred Wistar rats. Interventions: The rats were surgically implanted with electrodes for recording the electroencephalogram and electromyogram for determining arousal state and with bilateral guide cannulae directed at BLA. Measurements and Results: Antalarmin microinjected into BLA did not significantly alter sleep under undisturbed conditions. However, antalarmin microinjected bilaterally into BLA prior to ST blocked reductions in rapid eye movement sleep that ST normally produces. Further, the single microinjection prior to ST blocked the reduction in rapid eye movement typically seen after subsequent CR. Behavioral freezing, an indicator of fear memory, was not altered. Conclusions: CRF in BLA is involved in regulating stress-induced alterations in sleep and it plays a role in modulating how stressful memories influence sleep. Citation: Wellman LL; Yang L; Ambrozewicz MA; Machida M; Sanford LD. Basolateral amygdala and the regulation of fear-conditioned changes in sleep: role of corticotropin-releasing factor. SLEEP 2013;36(4):471-480. PMID:23564994

  15. When Worlds Collide: Chandra Observes Titanic Merger

    NASA Astrophysics Data System (ADS)

    2002-04-01

    NASA's Chandra X-ray Observatory has provided the best X-ray image yet of two Milky Way-like galaxies in the midst of a head-on collision. Since all galaxies - including our own - may have undergone mergers, this provides insight into how the universe came to look as it does today. Astronomers believe the mega-merger in the galaxy known as Arp 220 triggered the formation of huge numbers of new stars, sent shock waves rumbling through intergalactic space, and could possibly lead to the formation of a supermassive black hole in the center of the new conglomerate galaxy. The Chandra data also suggest that the merger of these two galaxies began only 10 million years ago, a short time in astronomical terms. "The Chandra observations show that things really get messed up when two galaxies run into each other at full speed," said David Clements of the Imperial College, London, one of the team members involved in the study. "The event affects everything from the formation of massive black holes to the dispersal of heavy elements into the universe." Arp 220 is considered to be a prototype for understanding what conditions were like in the early universe, when massive galaxies and supermassive black holes were presumably formed by numerous galaxy collisions. At a relatively nearby distance of about 250 million light years, Arp 220 is the closest example of an "ultra-luminous" galaxy, one that gives off a trillion times as much radiation as our Sun. The Chandra image shows a bright central region at the waist of a glowing, hour-glass-shaped cloud of multimillion-degree gas. Rushing out of the galaxy at hundreds of thousands of miles per hour, the super-heated gas forms a "superwind," thought to be due to explosive activity generated by the formation of hundreds of millions of new stars. Farther out, spanning a distance of 75,000 light years, are giant lobes of hot gas that could be galactic remnants flung into intergalactic space by the early impact of the collision.
Whether the lobes will continue to expand into space or fall back into Arp 220 is unknown. The center of Arp 220 is of particular interest. Chandra observations allowed astronomers to pinpoint an X-ray source at the exact location of the nucleus of one of the pre-merger galaxies. Another fainter X-ray source nearby may coincide with the nucleus of the other galaxy remnant. The X-ray power output of these point-like sources is greater than expected for stellar black holes accreting from companion stars. The authors suggest that these sources could be due to supermassive black holes at the centers of the merging galaxies. These two remnant sources are relatively weak, and provide strong evidence to support the theory that the extraordinary luminosity of Arp 220 - about a hundred times that of our Milky Way galaxy - is due to the rapid rate of star formation and not to an active, supermassive black hole in the center. However, in a few hundred million years, this balance of power may change. The two massive black holes could merge to produce a central supermassive black hole. This new arrangement could cause much more gas to fall into the central black hole, creating a power source equal to or greater than that due to star formation. "The unusual concentration of X-ray sources in the very center of Arp 220 suggests that we could be observing the early stages of the creation of a supermassive black hole and the eventual rise to power of an active galactic nucleus," said Jonathan McDowell of the Harvard-Smithsonian Center for Astrophysics, Cambridge, MA, another member of the team studying Arp 220. Clements and McDowell were joined on this research by an international group of researchers from the United States, United Kingdom and Spain. Chandra observed Arp 220 on June 24, 2000, for approximately 56,000 seconds using the Advanced CCD Imaging Spectrometer (ACIS) instrument. 
ACIS was developed for NASA by Pennsylvania State University, University Park, PA, and the Massachusetts Institute of Technology, Cambridge, MA. NASA's Marshall Space Flight Center in Huntsville, Ala., manages the Chandra program, and TRW, Inc., Redondo Beach, Calif., is the prime contractor. The Smithsonian's Chandra X-ray Center controls science and flight operations from Cambridge, Mass.

  16. Mild Airflow Limitation during N2 Sleep Increases K-complex Frequency and Slows Electroencephalographic Activity

    PubMed Central

    Nguyen, Chinh D.; Wellman, Andrew; Jordan, Amy S.; Eckert, Danny J.

    2016-01-01

    Study Objectives: To determine the effects of mild airflow limitation on K-complex frequency and morphology and electroencephalogram (EEG) spectral power. Methods: Transient reductions in continuous positive airway pressure (CPAP) during stable N2 sleep were performed to induce mild airflow limitation in 20 patients with obstructive sleep apnea (OSA) and 10 healthy controls aged 44 ± 13 y. EEG at C3 and airflow were measured in 1-min windows to quantify K-complex properties and EEG spectral power immediately before and during transient reductions in CPAP. The frequency and morphology (amplitude and latency of P200, N550 and N900 components) of K-complexes and EEG spectral power were compared between conditions. Results: During mild airflow limitation (18% reduction in peak inspiratory airflow from baseline, 0.38 ± 0.11 versus 0.31 ± 0.1 L/sec) insufficient to cause American Academy of Sleep Medicine-defined cortical arousal, K-complex frequency (9.5 ± 4.5 versus 13.7 ± 6.4 per min, P < 0.01), N550 amplitude (25 ± 3 versus 27 ± 3 μV, P < 0.01) and EEG spectral power (delta: 147 ± 48 versus 230 ± 99 μV2, P < 0.01 and theta bands: 31 ± 14 versus 34 ± 13 μV2, P < 0.01) significantly increased whereas beta band power decreased (14 ± 5 versus 11 ± 4 μV2, P < 0.01) compared to the preceding non flow-limited period on CPAP. K-complex frequency, morphology, and timing did not differ between patients and controls. Conclusion: Mild airflow limitation increases K-complex frequency, N550 amplitude, and spectral power of delta and theta bands. In addition to providing mechanistic insight into the role of mild airflow limitation on K-complex characteristics and EEG activity, these findings may have important implications for respiratory conditions in which airflow limitation during sleep is common (e.g., snoring and OSA). Citation: Nguyen CD, Wellman A, Jordan AS, Eckert DJ. 
Mild airflow limitation during N2 sleep increases k-complex frequency and slows electroencephalographic activity. SLEEP 2016;39(3):541–550. PMID:26612389
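The abstract above compares EEG spectral power in delta, theta, and beta bands between 1-min windows. As a hedged, numpy-only sketch of that kind of band-power comparison (not the study's actual pipeline): the sampling rate, band edges, and synthetic delta-dominated signal below are all hypothetical.

```python
import numpy as np

def band_power(eeg, fs, band):
    """Integrated power in a frequency band, estimated from a simple
    one-sided periodogram of a single EEG window (relative units)."""
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg)) ** 2 / (fs * len(eeg))
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[mask].sum() * (freqs[1] - freqs[0])  # integrate over the band

# Hypothetical 1-min C3 window sampled at 128 Hz: a 2 Hz (delta) oscillation
# of ~30 uV amplitude plus broadband noise.
fs = 128
t = np.arange(60 * fs) / fs
rng = np.random.default_rng(0)
eeg = 30 * np.sin(2 * np.pi * 2.0 * t) + 5 * rng.standard_normal(t.size)

# Assumed band edges (Hz); conventions vary between labs.
bands = {"delta": (0.5, 4.5), "theta": (4.5, 8.0), "beta": (16.0, 32.0)}
powers = {name: band_power(eeg, fs, b) for name, b in bands.items()}
```

Comparing `powers` between the pre-reduction and flow-limited windows, as the study does, would then show which bands increase or decrease.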

  17. Quality resource networks for young women in science: The role of Internet-facilitated ties

    NASA Astrophysics Data System (ADS)

    Gillette, Shana Cecile

    In communications, a new approach to the study of online interaction has been suggested by social network analysts. Garton, Haythornthwaite, and Wellman (1997) have outlined the importance of using network analysis to study how media are interconnected with other social aspects of a media user's world. As applied here, this approach to communication, when combined with recent network studies from the fields of education and rural development, provides a method for looking at the role of Internet-facilitated ties in the development of resource networks in the learning communities of young women from seven rural schools across the state of Washington. Twenty-six young women (ages 14-16) from diverse cultural and ethnic backgrounds (approximately half of the participants are Hispanic or Native American, the other half are White) participated in the research. Participants were selected because they shared a common educational orientation through Rural Girls in Science, an NSF-funded program at the Northwest Center for Research on Women at the University of Washington. As part of the school-based component of the Rural Girls in Science program, all 26 participants designed and conducted year-long, community-based research projects in science. Each school in the program was provided an Internet workstation for communication and research. Through the Internet, students could conceivably maintain distant ties with mentors and research scientists whom they met at summer camp as well as seek additional information resources. Toward the conclusion of the long-term research projects, each student participant was interviewed using a participatory form of network analysis that included a combined qualitative and quantitative approach. Given the small number of participants and schools in the sample, the results from the analysis cannot be generalized to a larger population. However, the study of the structure and composition of networks among individuals and school groups provided insight into how media are implicated in the development of resource networks, in particular for a subset of students who have been underrepresented in science--young ethnic minority women.

  18. Effects of Stressor Predictability and Controllability on Sleep, Temperature, and Fear Behavior in Mice

    PubMed Central

    Yang, Linghui; Wellman, Laurie L.; Ambrozewicz, Marta A.; Sanford, Larry D.

    2011-01-01

    Study Objectives: Predictability and controllability are important factors in the persisting effects of stress. We trained mice with signaled, escapable shock (SES) and with signaled, inescapable shock (SIS) to determine whether shock predictability can be a significant factor in the effects of stress on sleep. Design: Male BALB/cJ mice were implanted with transmitters for recording EEG, activity, and temperature via telemetry. After recovery from surgery, baseline sleep recordings were obtained for 2 days. The mice were then randomly assigned to SES (n = 9) and yoked SIS (n = 9) conditions. The mice were presented cues (90 dB, 2 kHz tones) that started 5.0 sec prior to and co-terminated with footshocks (0.5 mA; 5.0 sec maximum duration). SES mice always received shock but could terminate it by moving to the non-occupied chamber in a shuttlebox. SIS mice received identical tones and shocks, but could not alter shock duration. Twenty cue-shock pairings (1.0-min interstimulus intervals) were presented on 2 days (ST1 and ST2). Seven days after ST2, SES and SIS mice, in their home cages, were presented with cues identical to those presented during ST1 and ST2. Setting: NA. Patients or Participants: NA. Interventions: NA. Measurements and Results: On each training and test day, EEG, activity and temperature were recorded for 20 hours. Freezing was scored in response to the cue alone. Compared to SIS mice, SES mice showed significantly increased REM after ST1 and ST2. Compared to SES mice, SIS mice showed significantly increased NREM after ST1 and ST2. Both groups showed reduced REM in response to cue presentation alone. Both groups showed similar stress-induced increases in temperature and freezing in response to the cue alone. Conclusions: These findings indicate that predictability (modeled by signaled shock) can play a significant role in the effects of stress on sleep. Citation: Yang L; Wellman LL; Ambrozewicz MA; Sanford LD. 
Effects of stressor predictability and controllability on sleep, temperature, and fear behavior in mice. SLEEP 2011;34(6):759-771. PMID:21629364

  19. Trazodone Increases the Respiratory Arousal Threshold in Patients with Obstructive Sleep Apnea and a Low Arousal Threshold

    PubMed Central

    Eckert, Danny J.; Malhotra, Atul; Wellman, Andrew; White, David P.

    2014-01-01

    Study Objectives: The effect of common sedatives on upper airway physiology and breathing during sleep in obstructive sleep apnea (OSA) has been minimally studied. Conceptually, certain sedatives may worsen OSA in some patients. However, sleep and breathing could improve with certain sedatives in patients with OSA with a low respiratory arousal threshold. This study aimed to test the hypothesis that trazodone increases the respiratory arousal threshold in patients with OSA and a low arousal threshold. Secondary aims were to examine the effects of trazodone on upper airway dilator muscle activity, upper airway collapsibility, and breathing during sleep. Design: Patients were studied on 4 separate nights according to a within-subjects cross-over design. Setting: Sleep physiology laboratory. Patients: Seven patients with OSA and a low respiratory arousal threshold. Interventions: In-laboratory polysomnograms were obtained at baseline and after 100 mg of trazodone was administered, followed by detailed overnight physiology experiments under the same conditions. During physiology studies, continuous positive airway pressure was transiently lowered to measure arousal threshold (negative epiglottic pressure prior to arousal), dilator muscle activity (genioglossus and tensor palatini), and upper airway collapsibility (Pcrit). Measurements and Results: Trazodone increased the respiratory arousal threshold by 32 ± 6% (-11.5 ± 1.4 versus -15.3 ± 2.2 cmH2O, P < 0.01) but did not alter the apnea-hypopnea index (39 ± 12 versus 39 ± 11 events/h sleep, P = 0.94). Dilator muscle activity and Pcrit also did not systematically change with trazodone. Conclusions: Trazodone increases the respiratory arousal threshold in patients with obstructive sleep apnea and a low arousal threshold without major impairment in dilator muscle activity or upper airway collapsibility. 
However, the magnitude of change in arousal threshold was insufficient to overcome the compromised upper airway anatomy in these patients. Citation: Eckert DJ; Malhotra A; Wellman A; White DP. Trazodone increases the respiratory arousal threshold in patients with obstructive sleep apnea and a low arousal threshold. SLEEP 2014;37(4):811-819. PMID:24899767

  20. Comparison of water soil erosion on Spanish Mediterranean abandoned land and agricultural fields under vine, almond, olives and citrus

    NASA Astrophysics Data System (ADS)

    Rodrigo-Comino, Jesús; Martínez-Hernández, Carlos; Iserloh, Thomas; Cerdà, Artemi

    2017-04-01

    The abandonment of agricultural land is considered a global dynamic whose on- and off-site consequences for the soil are mostly ignored (Vanmaercke et al., 2011); it enhances land degradation processes by increasing water soil erosion (Cammeraat et al., 2010; Keesstra et al., 2012) and by decreasing biodiversity (Brevik et al., 2015; Smith et al., 2015). However, there is a lack of information at the pedon scale about which environmental elements activate or prevent water soil erosion after abandonment. Small portable rainfall simulators are considered a useful tool for measuring interrelated soil erosion processes such as splash, initial rainfall-runoff processes, infiltration, sediment yield, water turbidity or nutrient suspensions (Cerdà, 1999; Iserloh et al., 2013; Rodrigo Comino et al., 2016). A total of 105 experiments were conducted with a small portable rainfall simulator (rainfall intensity of 40 mm h-1 over 30 minutes) on four different land uses and their respective abandoned counterparts: (i) citrus and (ii) olives (Valencia), (iii) almonds (Murcia) and (iv) vines (Málaga). We studied the main environmental factors that may determine water soil erosion during the experiments: slope, vegetation cover, rock fragment cover, soil properties (texture) and hydrological responses (time to runoff and infiltration generation). REFERENCES Brevik, E.C., Cerdà, A., Mataix-Solera, J., Pereg, L., Quinton, J.N., Six, J., Van Oost, K., 2015. The interdisciplinary nature of SOIL. SOIL 1, 117-129. doi:10.5194/soil-1-117-2015 Cammeraat, E.L.H., Cerdà, A., Imeson, A.C., 2010. Ecohydrological adaptation of soils following land abandonment in a semi-arid environment. Ecohydrology 3, 421-430. doi:10.1002/eco.161 Cerdà, A., 1999. Simuladores de lluvia y su aplicación a la Geomorfología: Estado de la cuestión. Cuad. Investig. Geográfica 45-84.
Iserloh, T., Ries, J.B., Arnáez, J., Boix-Fayos, C., Butzen, V., Cerdà, A., Echeverría, M.T., Fernández-Gálvez, J., Fister, W., Geißler, C., Gómez, J.A., Gómez-Macpherson, H., Kuhn, N.J., Lázaro, R., León, F.J., Martínez-Mena, M., Martínez-Murillo, J.F., Marzen, M., Mingorance, M.D., Ortigosa, L., Peters, P., Regüés, D., Ruiz-Sinoga, J.D., Scholten, T., Seeger, M., Solé-Benet, A., Wengel, R., Wirtz, S., 2013. European small portable rainfall simulators: A comparison of rainfall characteristics. Catena 110, 100-112. doi:10.1016/j.catena.2013.05.013 Keesstra, S., Geissen, V., Mosse, K., Piiranen, S., Scudiero, E., Leistra, M., van Schaik, L., 2012. Soil as a filter for groundwater quality. Curr. Opin. Environ. Sustain., Terrestrial systems 4, 507-516. doi:10.1016/j.cosust.2012.10.007 Rodrigo Comino, J., Iserloh, T., Morvan, X., Malam Issa, O., Naisse, C., Keesstra, S.D., Cerdà, A., Prosdocimi, M., Arnáez, J., Lasanta, T., Ramos, M.C., Marqués, M.J., Ruiz Colmenero, M., Bienes, R., Ruiz Sinoga, J.D., Seeger, M., Ries, J.B., 2016. Soil Erosion Processes in European Vineyards: A Qualitative Comparison of Rainfall Simulation Measurements in Germany, Spain and France. Hydrology 3, 6. doi:10.3390/hydrology3010006 Smith, P., Cotrufo, M.F., Rumpel, C., Paustian, K., Kuikman, P.J., Elliott, J.A., McDowell, R., Griffiths, R.I., Asakawa, S., Bustamante, M., House, J.I., Sobocká, J., Harper, R., Pan, G., West, P.C., Gerber, J.S., Clark, J.M., Adhya, T., Scholes, R.J., Scholes, M.C., 2015. Biogeochemical cycles and biodiversity as key drivers of ecosystem services provided by soils. SOIL 1, 665-685. doi:10.5194/soil-1-665-2015 Vanmaercke, M., Poesen, J., Maetens, W., de Vente, J., Verstraeten, G., 2011. Sediment yield as a desertification risk indicator. Sci. Total Environ. 409, 1715-1725. doi:10.1016/j.scitotenv.2011.01.034
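The quantities typically derived from one small-plot rainfall-simulation run of the kind described above (40 mm h-1 for 30 minutes) can be sketched as below. The plot area, measured runoff volume, and sediment concentration are hypothetical illustrative values, not data from this study.

```python
# Hedged sketch: runoff coefficient and soil loss from one simulated storm.
# All measured values below are hypothetical.

PLOT_AREA_M2 = 0.28          # assumed small circular plot area (m^2)
INTENSITY_MM_H = 40.0        # rainfall intensity from the abstract
DURATION_H = 0.5             # 30 minutes

rain_mm = INTENSITY_MM_H * DURATION_H     # 20 mm of applied rain
rain_l = rain_mm * PLOT_AREA_M2           # 1 mm over 1 m^2 equals 1 L

runoff_l = 2.1               # hypothetical measured runoff volume (L)
sediment_g_per_l = 4.3       # hypothetical sediment concentration (g/L)

runoff_coeff = runoff_l / rain_l                         # fraction of rain leaving as runoff
soil_loss_g_m2 = runoff_l * sediment_g_per_l / PLOT_AREA_M2  # soil loss per unit area
```

Comparing `runoff_coeff` and `soil_loss_g_m2` between cultivated and abandoned plots is the kind of contrast the experiments above are designed to reveal.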

  1. Relations Between Coastal Catchment Attributes and Submarine Groundwater Discharge at Different Scales

    NASA Astrophysics Data System (ADS)

    Moosdorf, N.; Langlotz, S. T.

    2016-02-01

    Submarine groundwater discharge (SGD) has been recognized as a relevant field of coastal research in recent years. Its implications at the local scale have been documented by an increasing number of studies researching individual locations with SGD. The local studies also often emphasize its large variability. At the other end of the spectrum, global-scale studies try to estimate SGD-related fluxes of, e.g., carbon (Cole et al., 2007) and nitrogen (Beusen et al., 2013). These studies naturally use a coarse resolution, too coarse to represent the aforementioned local variability of SGD (Moosdorf et al., 2015). A way to transfer information on the local variability of SGD to large-scale flux estimates is needed. Here we discuss the upscaling of local studies based on the definition and typology of coastal catchments. Coastal catchments are those stretches of coast that do not drain into major rivers but directly into the sea. Their attributes, e.g. climate, topography, land cover, or lithology, can be used to extrapolate from the local scale to larger scales. We present the first results of a typology, compare coastal catchment attributes to SGD estimates from field studies and discuss upscaling as well as the associated uncertainties. This study aims at bridging the gap between the scales and enabling an improved representation of local-scale variability at continental to global scales. With this, it can contribute to a recent initiative to model large-scale SGD fluxes (NExT SGD). References: Beusen, A.H.W., Slomp, C.P., Bouwman, A.F., 2013. Global land-ocean linkage: direct inputs of nitrogen to coastal waters via submarine groundwater discharge. Environmental Research Letters, 8(3): 6. Cole, J.J., Prairie, Y.T., Caraco, N.F., McDowell, W.H., Tranvik, L.J., Striegl, R.G., Duarte, C.M., Kortelainen, P., Downing, J.A., Middelburg, J.J., Melack, J., 2007. Plumbing the global carbon cycle: Integrating inland waters into the terrestrial carbon budget. Ecosystems, 10(1): 171-184.
Moosdorf, N., Stieglitz, T., Waska, H., Durr, H.H., Hartmann, J., 2015. Submarine groundwater discharge from tropical islands: a review. Grundwasser, 20(1): 53-67.

  2. Use of subdermal contraceptive implants in a community-based family planning program. Experience after two years.

    PubMed

    Mittelmark, M B; Hansen, W B; Shiferaw, B; Bradham, D D

    1995-10-01

    In North Carolina, the Rutherford County Family Planning Council obtained funds from a special grant for levonorgestrel implants for women not eligible for medical assistance benefits. The Council approved the following approaches to promoting responsible sexual behavior and preventing unwanted pregnancy: creation of an interagency council to monitor the program, education in the schools on responsible sexual behavior, establishment of an information-sharing network for social service agencies, and expanded, low-cost or free family planning services. During 1992-1993, clinicians at the county health department and in private practices inserted implants in 287 women aged 13-37 living mainly in Rutherford County but also in McDowell and Polk counties. A survey was also conducted in the public high school to obtain self-assessment and information about family planning from female adolescents. Age distribution of the acceptors of the contraceptive implants was 40% for 13-19 year olds (the initiative's target group), 34% for 21-25 year olds, and 32% for 18-20 year olds. The two-year insertion rate for women aged 10-19 was 17.3/1000 compared to 20.8/1000 for women aged 20-29. The implantation rate was greatest among 18-25 year olds and lowest among women aged 26 and older. The method of payment for implantation was medical assistance in 69% of cases and a philanthropic foundation for women not eligible for medical assistance in 29% of cases. 8% had the implants removed during the study period. The leading reason for removal was psychological distress (25%), followed by headaches (20.8%), desire to conceive (16.7%), bleeding (12.5%), and medical contraindication (12.5%). The interval between implantation and removal ranged from less than 3 months to more than 12 months. 2.3% of the female high school students used implants.
Among the 596 students who were sexually active, 4.2% used implants, 1.85% used a diaphragm, 27.5% used condoms, and 15% used oral contraceptives. The implant acceptors attended 65% of scheduled 3-month follow-up visits.

  3. Ecological and meteorological drought monitoring in East Asia

    NASA Astrophysics Data System (ADS)

    Kim, J. B.; Um, M. J.; Kim, Y.; Chae, Y.

    2016-12-01

    This study aims to assess how well an ecological drought index can capture drought status in East Asia. We estimated the drought severity index (DSI), which uses evapotranspiration, potential evapotranspiration and the normalized difference vegetation index (NDVI), as suggested by Mu et al. (2013), to define ecological drought. In addition, the meteorological drought index, the standardized precipitation and evapotranspiration index (SPEI), is estimated and compared to the DSI. Satellite data from the moderate resolution imaging spectroradiometer (MODIS) and the advanced very-high-resolution radiometer (AVHRR) are used to analyze the DSI, and monthly precipitation and temperature data from the Climatic Research Unit (CRU) are applied to estimate the SPEI for 2000-2013 in East Asia. We conducted statistical analyses to investigate the drought characteristics of the ecological and meteorological drought indices (i.e., the DSI and SPEI, respectively) and then compared those characteristics between the indices depending on drought status. We found that the DSI did not capture drought status well when the categories originally suggested by Mu et al. (2013) were applied to classify drought status in the study area. Consequently, modified categories for the DSI are suggested in this study and then applied to define drought status. The modified categories show a great improvement in capturing drought status in East Asia, even though results cannot be obtained around the Taklamakan Desert due to the lack of satellite data. These results illustrate that an ecological drought index such as the DSI can be applied to drought monitoring in East Asia and can provide detailed information on drought status, because the satellite data have relatively high spatial resolution compared to observation-based products such as the CRU data.
Reference Mu Q, Zhao M, Kimball JS, McDowell NG, Running SW (2013) A remotely sensed global terrestrial drought severity index. Bulletin of the American Meteorological Society 94(1): 83-98. Acknowledgement This study was supported by the Korea Meteorological Administration R&D Program under Grant KMIPA 2015-6180. Corresponding Author: yeonjoo.kim@yonsei.ac.kr
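A DSI-style calculation of the kind referenced above can be sketched as follows: standardize the ET/PET ratio and NDVI over the record, sum the z-scores, and re-standardize, so that negative values indicate drought (after Mu et al., 2013). The 14-year record below is synthetic and the function is an illustrative sketch, not the study's implementation.

```python
import numpy as np

def drought_severity_index(et, pet, ndvi):
    """Hedged sketch of a DSI-style index (after Mu et al., 2013):
    z-score the ET/PET ratio and NDVI, sum, and re-standardize so
    that negative values indicate drier-than-normal conditions."""
    ratio = et / pet
    z_ratio = (ratio - ratio.mean()) / ratio.std()
    z_ndvi = (ndvi - ndvi.mean()) / ndvi.std()
    z = z_ratio + z_ndvi
    return (z - z.mean()) / z.std()

# Hypothetical annual record for 2000-2013 (14 values per variable)
rng = np.random.default_rng(1)
et = rng.uniform(300, 600, 14)     # evapotranspiration, mm/yr
pet = rng.uniform(700, 1100, 14)   # potential evapotranspiration, mm/yr
ndvi = rng.uniform(0.2, 0.7, 14)

dsi = drought_severity_index(et, pet, ndvi)
```

Category thresholds on `dsi` (e.g., a cutoff for severe drought) could then be tuned regionally, which is the modification the study proposes for East Asia.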

  4. Carbon dynamics in the Elbe land-ocean transition zone

    NASA Astrophysics Data System (ADS)

    Amann, Thorben; Weiss, Andreas; Hartmann, Jens

    2010-05-01

    Recent model data reveal a discrepancy between the mobilisation of carbon from the terrestrial system into the fluvial system and the amount of carbon reaching the ocean. It is estimated that of 1.9 Pg C yr-1 total terrestrial input (Cole et al., 2007), 0.12-0.41 Pg C yr-1 are lost through CO2 evasion from inner and outer estuaries to the atmosphere (Chen & Borges, 2009) while 0.9 Pg C yr-1 are exported to the ocean (Cole et al., 2007). Therefore, estuaries can be considered significant CO2 sources. To better understand temporal and spatial patterns of critical biogeochemical transformations in the land-ocean transition zone (LOTZ), an extensive historical hydrochemical dataset of the Elbe river and inner estuary system was analysed. The LOTZ of the river Elbe can be divided into four zones with respect to changes in carbon species abundance: the non-tidal river zone, the tidal harbour zone, the maximum turbidity zone (MTZ) and the river mouth zone. The concentrations of suspended matter and POC decrease from the non-tidal river zone, reaching their minima in the harbour zone. The MTZ is characterised by maximum SPM and POC values, while both parameters decrease to a further minimum in the river mouth. Interestingly, the POC concentration nearly doubled in the period 1999-2007 compared to the period 1985-1998. A possible cause may be the decrease in the general pollution of the river, despite decreasing N and P loads in the past decades. This is supported by the observed reduction of DOC concentrations by 50% in the earlier period. In contrast, the proportions of DOC and POC values within the four zones did not change. The doubling of POC concentrations between the two periods is not reflected in increasing SPM concentrations, resulting in higher POC (wt-% SPM) values. A decrease of POC (wt-% SPM) from the non-tidal river zone to the river mouth indicates loss of organic carbon due to respiration processes. This is supported by an increase of nitrate and phosphate concentrations as well as dissolved inorganic carbon. The presented analysis is used to develop a new spatial framework for the quantification of carbon dynamics, especially addressing sinks and sources of carbon in the land-ocean transition zone of the river Elbe. References Chen, C.-T.A. and Borges, A.V. (2009), "Reconciling opposing views on carbon cycling in the coastal ocean: Continental shelves as sinks and near-shore ecosystems as sources of atmospheric CO2", Deep-Sea Research II (56), 578-590. Cole, J. and Prairie, Y. and Caraco, N. and McDowell, W. and Tranvik, L. and Striegl, R. and Duarte, C. and Kortelainen, P. and Downing, J. and Middelburg, J. and Melack, J. (2007), "Plumbing the Global Carbon Cycle: Integrating Inland Waters into the Terrestrial Carbon Budget", Ecosystems 10 (1), 172-185.

  5. Engineered Zircaloy Cladding Modifications for Improved Accident Tolerance of LWR Nuclear Fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heuser, Brent; Stubbins, James; Kozlowski, Tomasz

    The DOE NEUP sponsored IRP on accident tolerant fuel (ATF) entitled Engineered Zircaloy Cladding Modifications for Improved Accident Tolerance of LWR Nuclear Fuel involved three academic institutions, Idaho National Laboratory (INL), and ATI Materials (ATI). Detailed descriptions of the work at the University of Illinois (UIUC, prime), the University of Florida (UF), the University of Michigan (UMich), and INL are included in this document as separate sections. This summary provides a synopsis of the work performed across the IRP team. Two ATF solution pathways were initially proposed: coatings on monolithic Zr-based LWR cladding material and self-healing modifications of Zr-based alloys. The coating pathway was extensively investigated, both experimentally and computationally. Experimental activities related to ATF coatings were centered at UIUC, UF, and UMich and involved coating development and testing, and ion irradiation. Neutronic and thermal hydraulic aspects of ATF coatings were the focus of computational work at UIUC and UMich, while materials science aspects were the focus of computational work at UF and INL. ATI provided monolithic Zircaloy 2 and 4 material and a binary Zr-Y alloy material. The self-healing pathway was investigated with advanced computations only. Beryllium was identified as a valid self-healing additive early in this work. However, all attempts to fabricate a Zr-Be alloy failed. Several avenues of fabrication were explored. ATI ultimately declined our fabrication request over health concerns associated with Be (we note that Be was not part of the original work scope and the ATI SOW). Likewise, Ames Laboratory declined our fabrication request, citing known litigation dating to the 1980s and 1990s between the U.S. Federal government and U.S. National Laboratory employees over the use of Be. Materion (formerly Brush Wellman) also declined our fabrication request, citing the difficulty of working with highly reactive Zr and Be. International fabrication options were explored in Europe and Asia, but this proved to be impractical, if not impossible. Consequently, experimental investigation of the Zr-Be binary system was dropped and exploration of the binary Zr-Y system was initiated. The motivation behind the Zr-Y system is the known thermodynamic stability of yttria over zirconia.

  6. Individual Differences in Animal Stress Models: Considering Resilience, Vulnerability, and the Amygdala in Mediating the Effects of Stress and Conditioned Fear on Sleep

    PubMed Central

    Wellman, Laurie L.; Fitzpatrick, Mairen E.; Hallum, Olga Y.; Sutton, Amy M.; Williams, Brook L.; Sanford, Larry D.

    2016-01-01

    Study Objectives: To examine the REM sleep response to stress and fearful memories as a potential marker of stress resilience and vulnerability and to assess the role of the basolateral amygdala (BLA) in mediating the effects of fear memory on sleep. Methods: Outbred Wistar rats were surgically implanted with electrodes for recording EEG and EMG and with bilateral guide cannulae directed at the BLA. Data loggers were placed intraperitoneally to record core body temperature. After recovery from surgery, the rats received shock training (ST: 20 footshocks, 0.8 mA, 0.5-s duration, 60-s interstimulus interval) and afterwards received microinjections of the GABAA agonist muscimol (MUS; 1.0 μM) to inactivate BLA or microinjections of vehicle (VEH) alone. Subsequently, the rats were separated into 4 groups (VEH-vulnerable (VEH-Vul; n = 14), VEH-resilient (VEH-Res; n = 13), MUS-vulnerable (MUS-Vul; n = 8), and MUS-resilient (MUS-Res; n = 11)) based on whether or not REM was decreased, compared to baseline, during the first 4 h following ST. We then compared sleep, freezing, and the stress response (stress-induced hyperthermia, SIH) across groups to determine the effects of ST and fearful context re-exposure alone (CTX). Results: REM was significantly reduced on the ST day in both VEH-Vul and MUS-Vul rats; however, post-ST MUS blocked the reduction in REM on the CTX day in the MUS-Vul group. The VEH-Res and MUS-Res rats showed similar levels of REM on both ST and CTX days. The effects of post-ST inactivation of BLA on freezing and SIH were minimal. Conclusions: Outbred Wistar rats can show significant individual differences in the effects of stress on REM that are mediated by BLA. These differences in REM can be independent of behavioral fear and the peripheral stress response, and may be an important biomarker of stress resilience and vulnerability. Citation: Wellman LL, Fitzpatrick ME, Hallum OY, Sutton AM, Williams BL, Sanford LD. 
Individual differences in animal stress models: considering resilience, vulnerability, and the amygdala in mediating the effects of stress and conditioned fear on sleep. SLEEP 2016;39(6):1293–1303. PMID:27091518

  7. Using Wavelets to Evaluate Watershed Signals from Precipitation Extremes at a Localized Temporal Scale

    NASA Astrophysics Data System (ADS)

    Gentry, R. W.; Koirala, S. R.

    2008-12-01

    Resource managers in the future will be required to make decisions regarding complex systems under extreme uncertainty and to evaluate the sustainability of these natural systems. The variability and extremes of precipitation will be one of the major variables impacting natural systems and decision making. These future decisions will be evaluated based upon economic costs and benefits, and core mission valuation. This will be particularly important in evaluating the effects and impacts of climate change on natural system response. In this case study, we evaluate the signal organization and its nature within a watershed in east Tennessee. Temporal analyses were conducted on weekly time series data of water chemistry (nitrate, chloride, sulfate and calcium concentrations) collected from November 1995 to December 2005 at the West Fork of Walker Branch in Oak Ridge, Tennessee (Mulholland 1993, 2004). Hydrochemistry plays an important role in ecosystem services, particularly nitrate (Mulholland et al. 2008), and in general the signal responses can be complex. The time series in this study were modeled using a wavelet approach as a mechanism for evaluating short-term temporal effects. In general, time series signals of watershed hydrochemistry may provide clues as to broad environmental, ecological and economic impacts at the basin scale. References: Mulholland, P.J. (1993), Hydrometric and stream chemistry evidence of three storm flowpaths in Walker Branch Watershed, Journal of Hydrology, 151: 291-316. Mulholland, P.J. (2004). The importance of in-stream uptake for regulating stream concentrations and outputs of N and P from a forested watershed: evidence from long-term chemistry records for Walker Branch Watershed, Biogeochemistry, 70: 403-426. Mulholland, P.J., A.M. Helton, G.C. Poole, R.O. Hall Jr., S.K. Hamilton, B.J. Peterson, J.L. Tank, L.R. Ashkenas, L.W. Cooper, C.N. Dahm, W.K. Dodds, S.E.E. Findlay, S.V. Gregory, N.B. Grimm, S.L. Johnson, W.H. 
McDowell, J.L. Meyer, H.M. Valett, J.R. Webster, C.P. Arango, J.J. Beaulieu, M.J. Bernot, A.J. Burgin, C.L Crenshaw, L.T. Johnson, B.R. Niederlehner, J.M. O'Brien, J.D. Potter, R.W. Sheibley, D.J. Sobota, and S.M. Thomas (2008). Stream denitrification across biomes and its response to anthropogenic nitrate loading, Nature, 452(13): 202-206.
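The core idea of the wavelet approach described above, localizing short-term events within a long hydrochemical time series, can be sketched with one level of the Haar discrete wavelet transform. The weekly nitrate-like series below is synthetic, and this single-level numpy implementation is an illustrative sketch, not the study's analysis.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform for an
    even-length series: returns (approximation, detail) coefficients."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # smooth, long-term component
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # localized, short-term variation
    return a, d

# Hypothetical weekly concentration series: a seasonal cycle plus one
# storm-like spike at week 100.
t = np.arange(520)                          # ~10 years of weekly samples
series = 1.0 + 0.5 * np.sin(2 * np.pi * t / 52)
series[100] += 3.0                          # storm event
a, d = haar_dwt(series)

# The detail coefficients localize the spike in time:
event_week = 2 * int(np.argmax(np.abs(d)))
```

Unlike a Fourier spectrum, the detail coefficients retain timing information, which is why wavelets suit the storm-driven, non-stationary signals the abstract describes.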

  8. "A Future for Fisheries?" Setting of a Field-based Class for Evaluation of Aquaculture and Fisheries Sustainability

    NASA Astrophysics Data System (ADS)

    Macko, Stephen; O'Connell, Matthew

    2016-04-01

    For the first time in 2015, aquaculture yields approximately equaled global wild capture fisheries. Are either of these levels of production sustainable? This course explored the limitations of both sources of fishery landings and included legal limitations, environmental concerns and technological problems and adaptations. It made use of visits to aquaculture facilities, government laboratories like NOAA, as well as large fish distribution centers like J.J. McDowell's Seafood (Jessup, MD), and included presentations by experts on legalities including the Law of the Sea. In addition, short day-long trips to "ocean-related" locations were also used to supplement the experience and included speakers involved with aquaculture. Central Virginia is a fortunate location for such a class, with close access for travel to the Chesapeake Bay and numerous field stations, museums with ocean-based exhibits (the Smithsonian and National Zoo) that address both extant and extinct Earth history, as well as national/state aquaria in Baltimore and Virginia Beach. Furthermore, visits to seafood markets at local grocery stores, or larger city markets in Washington, Baltimore and Virginia Beach, enhance the exposure to productivity in the ocean and the viability of fisheries sustainability. Sustainability awareness is increasingly a subject in educational settings. Marine science classes are perfect settings for establishing sustainability awareness owing to declining populations of organisms and the perceived collapse of fisheries worldwide. Students in oceanography classes often request more direct exposure to actual ocean situations or field trips. This new approach to such a course supplement addresses these requests by utilizing local resources and short field trips for a limited number of students to locations in which ocean experiences are available, and are often supported through education and outreach components. The vision of the class was a mixture of classroom time, readings, papers and laboratories. The course could then address not only the particulars of the marine science, but also aspects of sustainability, with discussions on ethics, including keeping animals in captivity or overfishing of particular species and the special difficulties that arise from keeping or culturing ocean populations. In addition, the class was encouraged to post web-based journals of experiences in order to share opinions of observations in each of the settings, including the evaluation of the foods they were consuming during the class.

  9. Laser ablation U-Th-Sm/He dating of detrital apatite

    NASA Astrophysics Data System (ADS)

    Guest, B.; Pickering, J. E.; Matthews, W.; Hamilton, B.; Sykes, C.

    2016-12-01

    Detrital apatite U-Th-Sm/He thermochronology has the potential to be a powerful tool for conducting basin thermal history analyses, as well as complementing the well-established detrital zircon U-Pb approach in source-to-sink studies. A critical roadblock that prevents the routine application of detrital apatite U-Th-Sm/He thermochronology to solving geological problems is the costly and difficult whole-grain approach generally used to obtain apatite U-Th-Sm/He data. We present a new analytical method for laser ablation thermochronology on apatite. Samples are ablated using a Resonetics™ 193 nm excimer laser, and liberated 4He is measured using an ASI (Australian Scientific Instruments) Alphachron™ quadrupole mass spectrometer system; collectively, these are known as the Resochron™. The ablated sites are imaged using a Zygo Zescope™ optical profilometer, and ablated pit volumes are measured using PitVol, a custom MATLAB™ algorithm. The accuracy and precision of the method presented here were confirmed using well-characterized Durango apatite and Fish Canyon Tuff (FCT) apatite reference materials, with Durango apatite used as a primary reference and FCT apatite used as a secondary reference. The weighted average of our laser ablation Durango ages (30.5±0.35 Ma) compares well with ages obtained using conventional whole-grain degassing and dissolution U-Th-Sm/He methods (32.56±0.43 Ma) (Jonckheere et al., 1993; Farley, 2000; McDowell et al., 2005) for chips of the same Durango crystal. These Durango ages were used to produce a K-value to correct the secondary references and unknown samples. After correction, FCT apatite has a weighted average age of 28.37 ± 0.96 Ma, which agrees well with published ages. As a further test of this new method we have conducted a case study on a set of samples from the British Mountains of the Yukon Territory in NW Canada. 
Sandstone samples collected across the British Mountains were analyzed using conventional U-Th-Sm/He whole-grain methods and then reanalyzed using our new laser ablation approach. The laser ablation results are consistent with those obtained using conventional methods, confirming that apatite laser ablation U-Th-Sm/He thermochronology is a viable alternative for collecting large low-temperature thermochronology data sets from detrital samples.
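The calibration arithmetic described above (an inverse-variance weighted mean for the primary reference, then a multiplicative correction factor applied to secondary references and unknowns) can be sketched as follows. This is a minimal illustration only: the individual ages, their uncertainties, and the assumption that the bias is a simple ratio are hypothetical, not the authors' actual data reduction.

```python
import math

def weighted_mean(ages, errors):
    """Inverse-variance weighted mean and its 1-sigma uncertainty."""
    weights = [1.0 / e**2 for e in errors]
    mean = sum(w * a for w, a in zip(weights, ages)) / sum(weights)
    sigma = math.sqrt(1.0 / sum(weights))
    return mean, sigma

# Hypothetical single-aliquot Durango laser-ablation ages (Ma), for illustration only.
durango_ages = [30.1, 30.7, 30.4, 30.9, 30.3]
durango_errs = [0.6, 0.7, 0.5, 0.8, 0.6]
la_mean, la_sigma = weighted_mean(durango_ages, durango_errs)

# Reference age from conventional whole-grain methods (Ma), as quoted in the abstract.
reference_age = 32.56

# A ratio-style correction factor ("K-value"), assuming the bias is multiplicative;
# it is then applied to secondary references and unknowns.
K = reference_age / la_mean
corrected_fct_age = K * 26.4  # 26.4 Ma: a hypothetical raw FCT laser-ablation age
```

An inverse-variance weighted mean is the standard way to pool ages with unequal uncertainties; the ratio-style K-value then transfers the primary reference's accuracy to the secondary reference and the unknowns.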

  10. Maps showing coal resources in the Crumpler Quadrangle, Mercer, McDowell, and Wyoming counties, West Virginia

    USGS Publications Warehouse

    Stricker, Gary D.

    1980-01-01

    Coal Geology: The Crumpler quadrangle lies in the Appalachian Plateaus province, with the coal-bearing Pocahontas and New River Formations of Pennsylvanian age having a gentle dip toward the northwest. Coal bed maps were prepared (figures 1-7) and resources were estimated (table 1) for seven of the many coal beds in the Crumpler quadrangle (Stricker, 1980, lists the names of the various coal beds in the quadrangle), following methods established by the U.S. Bureau of Mines and U.S. Geological Survey (1976). All of these coal beds crop out at the surface in the quadrangle, have a maximum overburden thickness of less than 300 meters, and have been mined at the surface, underground, or both. Resource estimates were not calculated for other coal beds in the Pocahontas and New River Formations, either because of insufficient data or because the beds are too thin. Figure 8 is a generalized stratigraphic column of the coal-bearing sequence in the Crumpler quadrangle showing the thickness and relative positions of the various coal beds. The Crumpler quadrangle originally contained about 498 million metric tons of coal. Approximately 326 million metric tons have been mined, or lost in mining, leaving remaining resources of 172 million metric tons. Analyses of the mined coal beds in the Crumpler and adjacent quadrangles show the coal is medium- to low-volatile bituminous (most is low-volatile bituminous), containing 14-27 percent volatile matter (with an arithmetic mean of 18 percent), 2.1-22.4 percent ash (with an arithmetic mean of 7 percent), and 0.5-1.8 percent total sulfur (with an arithmetic mean of 0.8 percent). Heating values range from 6,380 to 8,610 kcal/kg on an as-received basis. Trace element and major and minor oxide compositions, of both whole coal and laboratory ash, for 59 samples within or near the quadrangle were obtained from USCHEM (Geochemical Data File of the National Coal Resources Data System) (Kozey and others, 1980). 
Neither elements of environmental concern such as arsenic, lead, mercury, and selenium nor potentially valuable elements such as germanium, uranium and thorium were found in significant amounts in the coal.
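The resource balance quoted above can be verified with simple arithmetic; this sketch restates only the abstract's figures, plus the standard kcal/kg-to-Btu/lb unit conversion (a factor of 1.8):

```python
# Resource balance reported for the Crumpler quadrangle (million metric tons).
original_resources = 498
mined_or_lost = 326
remaining = original_resources - mined_or_lost
assert remaining == 172  # matches the reported remaining resources

# Heating-value range, converted from kcal/kg to Btu/lb (1 kcal/kg = 1.8 Btu/lb).
kcal_per_kg = (6380, 8610)
btu_per_lb = tuple(round(v * 1.8) for v in kcal_per_kg)
```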

  11. Continuous Wave Stimulated Raman Spectroscopy Inside a Hollow Core Photonic Crystal Fiber

    NASA Astrophysics Data System (ADS)

    Domenech, Jose L.; Cueto, Maite

    2013-06-01

    Hollow-core photonic crystal fibers (HCPCFs) have opened new opportunities to study light-matter interaction. Dielectric or metallic capillaries are intrinsically lossy, making poor light guides. In contrast, HCPCFs can guide light quite efficiently, due to the band-gap effect produced by an array of smaller channels surrounding a central hollow core a few μm in diameter. The tight confinement of light inside the core, which can be filled with gases, as well as the long interaction length, enhances multiple nonlinear phenomena, making it possible to devise new ways to do low-signal-level spectroscopy, as is the case for high-resolution stimulated Raman spectroscopy (SRS). A. Owyoung demonstrated high-resolution continuous wave SRS in 1978. Shortly afterwards, seeking higher sensitivity, he developed the quasi-continuous SRS technique (a high-peak-power pump laser interacting with a low-power cw probe laser). That variant remains today the best compromise between resolution and sensitivity for gas-phase Raman spectroscopy. In this work, we show the possibility of fully cw stimulated Raman spectroscopy, using a gas cell built around a HCPCF to overcome the limitations posed by the weakness of the stimulated Raman effect when not using pulsed sources. The interaction length (1.2 m), longer than that of a multiple-pass refocusing cell, and the narrow diameter of the core (4.8 μm) can compensate for the much lower laser powers used in the cw set-up. The experimental complexity is considerably reduced and the instrumental resolution is at the tens-of-MHz level, limited, with our fiber, by transit-time effects. At present, we have demonstrated the feasibility of the experiment, a sensitivity enhancement of ~6000 over the single-focus regime, and a spectral resolution better than 0.005 cm⁻¹ in the unresolved Q-branch of the ν_1 component of the Fermi dyad of CO_2 at 1388 cm⁻¹. 
Other examples of rotationally resolved spectra will be shown: the Q branch of O_2 at 1555 cm⁻¹, and the 2ν_2 component of the Fermi dyad of CO_2 at 1285 cm⁻¹. References: P. St. Russell, Science 299, 358 (2003). A. Owyoung, C. W. Patterson, R. S. McDowell, Chem. Phys. Lett. 59, 156 (1978).
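The quoted sensitivity enhancement comes from trading a single tight focus for a long, tightly confined interaction length. A rough figure of merit under Gaussian-beam assumptions is the integrated intensity-length product, which for a single focus is about π/λ regardless of waist size, and for a guided mode is L/(πw²). The sketch below uses the fiber length and core diameter from the abstract, but the pump wavelength and mode-field radius are assumed values, so it yields only an order of magnitude comparable to the reported ~6000.

```python
import math

# Figures taken from the abstract:
L = 1.2                     # fiber interaction length (m)
core_diameter = 4.8e-6      # hollow-core diameter (m)

# Assumed values (not given in the abstract):
wavelength = 532e-9         # assumed pump wavelength (m)
w_f = 0.35 * core_diameter  # assumed mode-field radius (m)

# Integrated intensity-length products, up to common constants:
single_focus = math.pi / wavelength  # Gaussian focus: ~pi/lambda, waist-independent
fiber = L / (math.pi * w_f**2)       # uniform guided mode over length L
enhancement = fiber / single_focus   # order 10^3-10^4 with these assumptions
```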

  12. Modelling sub-daily evaporation from a small reservoir.

    NASA Astrophysics Data System (ADS)

    McGloin, Ryan; McGowan, Hamish; McJannet, David; Burn, Stewart

    2013-04-01

    Accurate quantification of evaporation from small water storages is essential for water management and is also required as input in some regional hydrological and meteorological models. Global estimates of the number of small storages or lakes (< 0.1 km²) are on the order of 300 million (Downing et al., 2006). However, direct evaporation measurements at small reservoirs using the eddy covariance or scintillometry techniques have been limited because of their expense and complexity. To correctly represent the effect that small water bodies have on the regional hydrometeorology, reliable estimates of sub-daily evaporation are necessary; however, evaporation modelling studies at small reservoirs have so far been limited to quantifying daily estimates. In order to ascertain suitable methods for accurately modelling hourly evaporation from a small reservoir, this study compares evaporation measured by the eddy covariance method at a small reservoir in southeast Queensland, Australia, to results from several modelling approaches using both over-water and land-based meteorological measurements. Accurate predictions of hourly evaporation were obtained by a simple theoretical mass transfer model requiring only over-water measurements of wind speed, humidity and water surface temperature. An evaporation model recently developed for use in small reservoir environments by Granger and Hedstrom (2011) appeared to overestimate the impact of stability on evaporation, while evaporation predictions made by the one-dimensional hydrodynamic model DYRESM (Dynamic Reservoir Simulation Model) (Imberger and Patterson, 1981) showed reasonable agreement with measured values. DYRESM did not show any substantial improvement in evaporation prediction when inflows and outflows were included, and only a slightly better correlation was shown when over-water meteorological measurements were used in place of land-based measurements. References: Downing, J. 
A., Y. T. Prairie, J. J. Cole, C. M. Duarte, L. J. Tranvik, R. G. Striegl, W. H. McDowell, P. Kortelainen, N. F. Caraco, J. M. Melack and J. J. Middelburg (2006), The global abundance and size distribution of lakes, ponds, and impoundments, Limnology and Oceanography, 51, 2388-2397. Granger, R. J. and N. Hedstrom (2011), Modelling hourly rates of evaporation from small lakes, Hydrology and Earth System Sciences, 15, doi:10.5194/hess-15-267-2011. Imberger, J. and J. C. Patterson (1981), Dynamic Reservoir Simulation Model - DYRESM: 5, in: Transport Models for Inland and Coastal Waters, H. B. Fischer (Ed.), Academic Press, New York, 310-361.
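The "simple theoretical mass transfer model" referred to above is, in general form, a Dalton-type estimate E ∝ u(e_s − e_a), driven by wind speed and the vapour-pressure deficit between the water surface and the air. The sketch below is a generic illustration under that assumption; the Magnus saturation-vapour-pressure formula and the transfer coefficient `c` are standard textbook choices, not necessarily those used in the study.

```python
import math

def saturation_vapour_pressure(t_celsius):
    """Magnus-form saturation vapour pressure over water (kPa)."""
    return 0.6108 * math.exp(17.27 * t_celsius / (t_celsius + 237.3))

def mass_transfer_evaporation(u, t_water, t_air, rh, c=0.05):
    """Dalton-type hourly evaporation estimate (mm/h).

    u       wind speed over water (m/s)
    t_water water surface temperature (deg C)
    t_air   air temperature (deg C)
    rh      relative humidity of the air (0-1)
    c       assumed empirical coefficient folding in units and air properties
    """
    e_s = saturation_vapour_pressure(t_water)      # at the water surface
    e_a = rh * saturation_vapour_pressure(t_air)   # ambient vapour pressure
    return max(0.0, c * u * (e_s - e_a))           # no condensation term

# A warm afternoon over the reservoir: modest wind, water warmer than the air.
hourly_e = mass_transfer_evaporation(u=3.0, t_water=25.0, t_air=22.0, rh=0.6)
```

With over-water measurements of all three inputs, as the abstract emphasizes, such a model can be evaluated at each sub-daily time step without any stability correction.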

  13. Constraints on ice volume changes of the WAIS and Ross Ice Shelf since the LGM based on cosmogenic exposure ages in the Darwin-Hatherton glacial system of the Transantarctic Mountains

    NASA Astrophysics Data System (ADS)

    Fink, David; Storey, Bryan; Hood, David; Joy, Kurt; Shulmeister, James

    2010-05-01

    Quantitative assessment of the spatial and temporal scale of ice volume change of the West Antarctic Ice Sheet (WAIS) and Ross Ice Shelf since the last glacial maximum (LGM), ~20 ka, is essential to accurately predict ice sheet response to current and future climate change. Although global sea level has risen by approximately 120 meters since the LGM, the contribution of the polar ice sheets is uncertain and the timing of any such contribution is controversial. Mackintosh et al. (2007) suggest that sectors of the EAIS similar to those studied at the Framnes Mountains, where the ice sheet slowly calves at coastal margins, made marginal contributions to global sea-level rise between 13 and 7 ka. In contrast, Stone et al. (2003) document continuing WAIS decay during the mid-late Holocene, raising the question of how the WAIS has responded since the LGM and into the Holocene. Terrestrial evidence is restricted to sparse coastal oases and ice-free mountains which archive the limits of former ice advances. Mountain ranges flanking the Darwin and Hatherton glaciers exhibit well-defined moraines, weathering signatures, boulder-rich plateaus and glacial tills, which preserve the evidence of advance and retreat of the ice sheet during previous glacial cycles. Previous studies suggest that at the LGM the WAIS in this location was at least 1,000 meters thicker than today. As part of the New Zealand Latitudinal Gradient Project along the Transantarctic Mountains, we collected samples for cosmogenic exposure dating at (a) the Lake Wellman area bordering the Hatherton Glacier, (b) Roadend Nunatak at the confluence of the Darwin and Hatherton glaciers, and (c) Diamond Hill, which is positioned at the intersection of the Ross Ice Shelf and the Darwin Glacier outlet. 
While the technique of exposure dating is very successful in mid-latitude alpine glacier systems, it is more challenging in polar ice-sheet regions due to the prevalence of cold-based ice over-riding events and the absence of outwash processes that remove glacially transported debris. Our glacial geomorphic survey from the ice sheet contact edge (~850 masl) to the mountain peak at 1600 masl, together with a suite of 10Be and 26Al exposure ages, documents a pre-LGM ice volume at least 800 meters thicker than current ice levels which was established at least 2 million years ago. However, a complex history of exposure and re-exposure of the ice-free regions in this area is seen, in accordance with advance and retreat of the ice sheets that feed into the Darwin-Hatherton system. A cluster of mid-altitude boulders, located below a prominent moraine feature mapped previously as demarcating the LGM ice advance limits, have exposure ages ranging from 30 to 40 ka. Exposure ages for boulders just above the ice contact range from 1 to 19 ka and allow an estimate of inheritance. Hence, we conclude that the LGM ice volume was not as large as previously estimated and in fact differed little from what is observed today. These results raise serious questions about the implications of a reduced WAIS at the LGM, its effect on the development of the Ross Ice Shelf, and how the Antarctic ice sheets respond to global warming. References: J. O. Stone et al., Science 299, 99 (2003). A. Mackintosh, D. White, D. Fink, D. Gore et al., Geology 35, 551-554 (2007).

  14. Chemical properties of Garnets from Garnet Ridge, Navajo volcanic field in the Colorado Plateau

    NASA Astrophysics Data System (ADS)

    Koga, I.; Ogasawara, Y.

    2012-12-01

    Significant numbers of garnet crystals are derived from kimberlitic diatremes at Garnet Ridge in northern Arizona. These garnets are chemically diverse and their origins remain controversial. The diatremes at Garnet Ridge were dated at 30 Ma (Smith et al., 2004). The coesite-bearing lawsonite eclogite reported by Usui et al. (2003) is important evidence for subduction of the Farallon Plate below the Colorado Plateau. This study characterized various kinds of garnets of several origins by petrographic observation and electron microprobe analyses (JXA-8900 WDS mode and JXA-733 EDS mode). On the basis of chemical compositions and other features, the garnets were classified into the following eight groups (A to H). Inclusions and exsolved phases were identified by laser Raman spectroscopy. (A) Garnet crystals (5-8 mm) with purple color are called "Navajo Ruby". A significant amount of Cr2O3 is a typical feature (up to ~5.9 wt. %). These garnets are rich in pyrope (66-78 mol. %) and contain olivine, Cpx, and exsolved lamellae of rutile. (B) Reddish brown garnets were Prp-rich (60-75 mol. %) and contained a minor amount of Cr2O3 (less than ~1 wt. %). The inclusions were rod-shaped rutile, Cpx, Opx, zircon and olivine, with exsolved lamellae of apatite. (C) Garnet megacrysts (8-12 cm) plotted near the center of the Prp-Alm-Grs triangle (Prp30-35 Alm28-33 Grs29-35). Exsolved apatite lamellae were confirmed. (D) Some reddish brown garnets plotted in the same area as Type C. (E) Garnets in eclogite have Alm-rich compositions (Prp6-22 Alm52-65 Grs16-42). They clearly showed prograde chemical zonation; MgO: 1.4 to 5.4 wt. %, CaO: 14.0 to 5.6 wt. %, both from core to rim. (F) Garnets in altered or metasomatized eclogite had a wide range of chemical composition (Prp7-38 Alm52-69 Grs4-31) with similar prograde zonation. The cores plotted near the rim compositions of Type-E garnets. 
(G) Garnets in unidentified (strongly altered) rock had Alm-rich compositions near the Alm-Prp join. Euhedral quartz and zircon were included in the garnet. (H) Garnets in skarn-like rock of metasomatic origin at crustal levels plotted on the Alm-Grs join and have no Prp component. Titanite, zoisite and fluid inclusions were identified in this garnet. Among the garnets described above, one of the typical garnets from Garnet Ridge is the Cr-bearing, Prp-rich garnet, "Navajo Ruby", of peridotite origin at great depth, and another is the garnet in eclogite, probably of subducted Farallon Plate origin. These two strongly contrasting rocks were mixed underneath the Colorado Plateau. The chemical characteristics and petrographic features of the garnets from Garnet Ridge provide important information on the complex petrochemical processes and related environments underneath the Colorado Plateau. Acknowledgements: The authors are grateful to Mrs. Pauline Deswudt, who sold us various kinds of garnet grains and their host rocks for the present study. References: Smith, D., Connelly, J. N., Manser, K., Moser, D. E., Housh, T. B., McDowell, F. W. and Mack, L. E. (2004) Geochemistry Geophysics Geosystems 5(4). Usui, T., Nakamura, E., Kobayashi, K., Maruyama, S. and Helmstaedt, H. (2003) Geology, 31.

  15. First-principles calibration of 40Ar/39Ar mineral standards and complete extraction of 40Ar* from sanidine

    NASA Astrophysics Data System (ADS)

    Morgan, L. E.; Kuiper, K.; Mark, D.; Postma, O.; Villa, I. M.; Wijbrans, J. R.

    2010-12-01

    40Ar/39Ar geochronology relies on comparing argon isotopic data for unknowns to those for knowns. Mineral standards used as neutron fluence monitors must be dated by the K-Ar method (or at least referenced to a mineral of known K-Ar age). The commonly used age of 28.02 ± 0.28 Ma for the Fish Canyon sanidine (FCs) (Renne et al., 1998) is based upon measurements of radiogenic 40Ar in GA1550 biotite (McDougall and Roksandic, 1974), but underlying full data were not published (these measurements were never intended for use as an international standard), so uncertainties are difficult to assess. Recent developments by Kuiper et al. (2008) and Renne et al. (2010) are limited by their reliance on the accuracy of other systems. Modern technology should allow for more precise and accurate calibration of primary K-Ar and 40Ar/39Ar standards. From the ideal gas law, the number of moles of 40Ar in a system can be calculated from measurements of pressure, volume, and temperature. Thus we have designed and are proceeding to build a pipette system to introduce well-determined amounts of 40Ar into noble gas extraction lines and mass spectrometers. This system relies on components with calibrations traceable to SI unit prototypes, including a diaphragm pressure gauge (MKS Instruments), thermocouples, and a “slug” of an accurately determined volume to be inserted into the reservoir for volume determinations of the reservoir and pipette. The system will be renewable, with a lifetime of ca. 1 month for gas in the reservoir, and portable, to permit interlaboratory calibrations. The quantitative extraction of 40Ar* from the mineral standard is of highest importance; for sanidine standards this is complicated by high melt viscosity during heating. Experiments adding basaltic “zero age glass” (ZAG) to decrease melt viscosity are underway. This has previously been explored by McDowell (1983) with a resistance furnace, but has not been quantitatively addressed with laser heating. 
The sensitivity of each participating mass spectrometer will be calibrated by the bracketing-standards approach, alternating measurements of pipette gas and mineral standards. This will convert relative abundances into absolute molar quantities and allow for quantification of interlaboratory systematic bias. Uncertainty propagation indicates that uncertainties in the molar quantity of 40Ar in mineral standards will be < 0.25% (2σ), a considerable improvement in one component of the uncertainties involved in 40Ar/39Ar geochronology. The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under grant agreement n° [215458].
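The first-principles step above rests on the ideal gas law, n = PV/(RT): with traceable measurements of pressure, volume, and temperature, the molar quantity of 40Ar in each pipette aliquot follows directly. A minimal sketch with purely illustrative numbers (the abstract gives no actual reservoir dimensions):

```python
R = 8.314462618  # molar gas constant, J/(mol K), CODATA value

def moles_from_pvt(pressure_pa, volume_m3, temperature_k):
    """Ideal-gas moles delivered by a pipette aliquot: n = PV / (RT)."""
    return pressure_pa * volume_m3 / (R * temperature_k)

# Illustrative values only: 100 Pa of 40Ar in a 0.1 cm^3 pipette at lab temperature.
n_ar40 = moles_from_pvt(pressure_pa=100.0, volume_m3=0.1e-6, temperature_k=293.15)
```

Because each input is traceable to SI prototypes (pressure gauge, volume "slug", thermocouples), the relative uncertainty of n is just the combination of the three measurement uncertainties, which is what allows the < 0.25% target.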

  16. Carbon limitation reveals allocation priority to defense compounds in peppermint

    NASA Astrophysics Data System (ADS)

    Forkelova, Lenka; Unsicker, Sybille; Forkel, Matthias; Huang, Jianbei; Trumbore, Susan; Hartmann, Henrik

    2016-04-01

    Studies of carbon partitioning during insect or pathogen infestation reveal high carbon investment into induced chemical defenses to deter the biotic agent (Baldwin, 1998). However, little is known about how carbon investment in chemical defenses changes under abiotic stress such as drought. Drought forces plants to close their stomata to prevent water loss through transpiration, decreasing the amount of assimilated carbon. Furthermore, drought hampers carbohydrate translocation due to declining plant hydration and reduced phloem functioning (McDowell, 2011; Hartmann et al., 2013; Sevanto, 2014). Hence, long-lasting drought can force plants into carbon starvation. The aim of our study was to disentangle carbon allocation priorities between growth, maintenance metabolism, storage and production of defense compounds under carbon-limiting conditions, using peppermint as our model plant. Drought is not the only way to manipulate plant carbon metabolism and photosynthetic yield: exposing plants to air with reduced [CO2] is a promising tool for simulating drought-induced carbon limitation without affecting phloem functioning and thus carbohydrate translocation (Hartmann et al., 2015). We exposed peppermint plants to drought (50% of the control irrigation) and to low [CO2] (a progressive decrease from 350 ppm to 20 ppm) to disentangle hydraulic failure from carbon starvation effects on carbon allocation. Drought was applied as a cross-treatment, yielding four treatments: watered and high [CO2] (W+CO2), drought and high [CO2] (D+CO2), watered and low [CO2] (W-CO2), and drought and low [CO2] (D-CO2). We analyzed the most abundant terpenoid defense compounds (α-pinene, sabinene, myrcene, limonene, menthone, menthol and pulegone) and used continuous 13CO2 labelling to trace the allocation patterns of newly and previously assimilated carbon in four carbon sinks (structural biomass, water-soluble sugars, starch and terpenoid defense compounds) in young expanding leaf tissue. 
This leaf tissue grew after the start of the treatments and after the onset of the 13CO2 labelling. Under the control treatment (W+CO2), the relative proportion of new carbon in the four carbon sinks was very similar, whereas under the three stress treatments (D+CO2, W-CO2, D-CO2) new carbon was preferentially invested into terpenoid defense compounds. This indicates that plants under abiotic stress also need to invest carbon into defense, protecting immature leaf tissue to secure long-term photosynthetic activity (Massad et al., 2014). Even though the concentration of water-soluble sugars dropped dramatically under both low [CO2] treatments, the concentration of terpenoid compounds changed correspondingly only under the combination of drought and low [CO2] (D-CO2), the harshest treatment. Drought alone (D+CO2) caused high investment of old carbon and, compared to the other treatments, an increase in the concentrations of water-soluble sugars as well as starch. This increase in carbohydrates could be explained by the use of water-soluble sugars as osmoprotectants (Dichio et al., 2009) and by the rapid decline of growth, the main carbon sink (Muller et al., 2011).

  17. Hydrologic and water-quality data from Mountain Island Lake, North Carolina, 1994-97

    USGS Publications Warehouse

    Sarver, K.M.; Steiner, B.C.

    1998-01-01

    Continuous-record water-level gages were established at three sites on Mountain Island Lake and one site downstream from Mountain Island Dam. The water level of Mountain Island Lake is controlled by Duke Power Company releases at Cowans Ford Dam (upstream) and Mountain Island Dam (downstream). Water levels on Mountain Island Lake measured just downstream from Cowans Ford Dam fluctuated 11.15 feet during the study. Water levels just upstream from the Mountain Island Lake forebay fluctuated 6.72 feet during the study. About 3 miles downstream from Mountain Island Dam, water levels fluctuated 5.31 feet. Sampling locations included 14 sites in Mountain Island Lake, plus one downstream river site. At three sites, automated instruments recorded water temperature, dissolved-oxygen concentration, and specific conductance at 15-minute intervals throughout the study. Water temperatures recorded continuously during the study ranged from 4.2 to 35.2 degrees Celsius, and dissolved-oxygen concentrations ranged from 2.1 to 11.8 milligrams per liter. Dissolved-oxygen concentrations generally were inversely related to water temperature, with lowest dissolved-oxygen concentrations typically recorded in the summer. Specific conductance values recorded continuously during the study ranged from 33 to 89 microsiemens per centimeter; however, mean monthly values were fairly consistent throughout the study at all sites (50 to 61 microsiemens per centimeter). In addition, vertical profiles of water temperature, dissolved-oxygen concentration, specific conductance, and pH were measured at all sampling locations during 24 site visits. Water-quality constituent concentrations were determined for seven reservoir sites and the downstream river site during 17 sampling trips. 
Water-quality samples were routinely analyzed for biochemical oxygen demand, fecal coliform bacteria, hardness, alkalinity, total and volatile suspended solids, nutrients, total organic carbon, chlorophyll, iron, calcium, and magnesium; the samples were analyzed less frequently for trace metals, volatile organic compounds, semivolatile organic compounds, and pesticides. Maximum dissolved nitrite plus nitrate concentrations determined during the study were 0.348 milligram per liter in the mainstem sites and 2.77 milligrams per liter in the coves. Maximum total phosphorus concentrations were 0.143 milligram per liter in the mainstem sites and 0.600 milligram per liter in the coves. Fecal coliform and chlorophyll a concentrations were less than or equal to 160 colonies per 100 milliliters and 13 micrograms per liter, respectively, in all samples. Trace metals detected in at least one sample included arsenic, chromium, copper, lead, nickel, zinc, and antimony. Concentrations of all trace metals (except zinc) were 5.0 micrograms per liter or less; the maximum zinc concentration was 80 micrograms per liter. One set of bottom-material samples was collected from Gar Creek and McDowell Creek and analyzed for nutrients, trace metals, organochlorine pesticides, and semivolatile organic compounds. The only organochlorine pesticide identified in either sample was p,p'-DDE at an estimated concentration of 0.8 microgram per kilogram. Twenty semivolatile organic compounds, mainly polyaromatic hydrocarbons and plasticizers, were identified.

  18. Seasonal and spatial contrasts of sedimentary organic carbon in floodplain lakes of the central Amazon basin.

    NASA Astrophysics Data System (ADS)

    Sobrinho, Rodrigo; Kim, Jung-Hyun; Abril, Gwenaël; Zell, Claudia; Moreira-Turcq, Patricia; Mortillaro, Jean-Michel; Meziane, Tarik; Damsté, Jaap; Bernardes, Marcelo

    2014-05-01

    Three-quarters of the area of flooded land in the world is temporary wetlands (Downing, 2009), which play a significant role in the global carbon cycle (Einsele et al., 2001; Cole et al., 2007; Battin et al., 2009; Abril et al., 2013). Previous studies of the Amazonian floodplain lakes (várzeas), one important compartment of wetlands, showed that the sedimentation of organic carbon (OC) in the floodplain lakes is strongly linked to the periodic floods and to the biogeography from upstream to downstream (Victoria et al., 1992; Martinelli et al., 2003). However, the main sources of sedimentary OC remain uncertain. Hence, the study of the sources of OC buried in floodplain lake sediments can enhance our understanding of the carbon balance of the Amazon ecosystems. In this study, we investigated the seasonal and spatial patterns of sedimentary organic matter in five floodplain lakes of the central Amazon basin (Cabaliana, Janauaca, Canaçari, Miratuba, and Curuai), which have different morphologies, hydrodynamics and vegetation coverage. Surface sediments were collected in four hydrological seasons: low water (LW), rising water (RW), high water (HW) and falling water (FW) in 2009 and 2010. We investigated commonly used bulk geochemical tracers such as the C:N ratio and the stable isotopic composition of organic carbon (δ13COC). These results were compared with lignin-phenol parameters as an indicator of vascular plant detritus (Hedges and Ertel, 1982) and with branched glycerol dialkyl glycerol tetraethers (brGDGTs) to trace soil OC from land to the aquatic settings (Hopmans et al., 2004). Our data showed that during the RW and FW seasons, the concentrations of lignin and brGDGTs were higher in comparison to other seasons. Our study also indicated that floodplain lake sediments primarily consisted of a mixture of C3 plant detritus and soil OC. However, a downstream increase in C4 plant-derived OC contribution was observed along the gradient of increasingly open waters, i.e. 
from upstream to downstream. We also identify the OC contribution from the seasonally flooded forests, i.e. temporary wetlands, as the most important source of sedimentary OC in floodplain lakes. Accordingly, we attribute temporal and spatial differences in sedimentary OC composition to the hydrological connectivity between the Amazon River and its floodplain lakes, and thus between the surrounding forests and the floodplain lakes. References: Abril, G., Martinez, J.-M., Artigas, L.F., Moreira-Turcq, P., Benedetti, M.P., Vidal, L., Meziane, T., Kim, J.-H., Bernardes, M.C., Savoye, N., Deborde, J., Albéric, P., Souza, M.F.L., Souza, E.L., Roland, F. Amazon River carbon dioxide outgassing fuelled by wetlands. Nature, accepted (2013). Battin, T.J., Luyssaert, S., Kaplan, L.A., Aufdenkampe, A.K., Richter, A., Tranvik, L.J. The boundless carbon cycle. Nature Geoscience 2, 598-600 (2009). Cole, J.J., Prairie, Y.T., Caraco, N.F., McDowell, W.H., Tranvik, L.J., Striegl, R.G., Duarte, C.M., Kortelainen, P., Downing, J.A., Middelburg, J.J., Melack, J. Plumbing the global carbon cycle: integrating inland waters into the terrestrial carbon budget. Ecosystems 10, 171-184 (2007). Downing, J.A. Global limnology: up-scaling aquatic services and processes to planet Earth. Verh. Internat. Verein. Limnol. 30, 1149-1166 (2009). Einsele, G., Yan, J., Hinderer, M. Atmospheric carbon burial in modern lake basins and its significance for the global carbon budget. Global and Planetary Change 30, 167-195 (2001). Hedges, J.I., Ertel, J.R. Characterization of lignin by gas capillary chromatography of cupric oxide oxidation products. Analytical Chemistry 54, 174-178 (1982). Hopmans, E.C., Weijers, J.W.H., Schefuß, E., Herfort, L., Damste, J.S.S., Schouten, S. A novel proxy for terrestrial organic matter in sediments based on branched and isoprenoid tetraether lipids. Earth and Planetary Science Letters 224, 107-116 (2004). 
Martinelli, L.A., Victoria, R.L., Camargo, P.B.d., Piccolo, M.d.C., Mertes, L., Richey, J.E., Devol, A.H., Forsberg, B.R. Inland variability of carbon-nitrogen concentrations and δ13C in Amazon floodplain (várzea) vegetation and sediment. Hydrol. Process. 17, 1419-1430 (2003). Victoria, R.L., Martinelli, L.A., Trivelin, P.C.O., Matsui, E., Forsberg, B.R., Richey, J.E., Devol, A.H. The use of stable isotopes in studies of nutrient cycling: carbon isotope composition of Amazon várzea sediments. Biotropica 24, 240-249 (1992).

  19. Changes in Nitrogen Tropical Deposition Driven by Biomass Burning and Industrialization

    NASA Astrophysics Data System (ADS)

    Lara, L. B.; Holland, E. A.; Artaxo, P.; Martinelli, L.

    2003-12-01

    Until a few years ago, N deposition studies and their consequences for ecosystems were focused on the Northern Hemisphere, where most modern N deposition occurs. Nowadays, the pattern of N deposition has changed over the globe, calling attention to other geographical areas, including tropical regions, which were the most important pre-industrially (Matson et al., 1999). Substantial increases of NOx and SO2 emissions have been observed in Asia and in some regions of the tropics due to rapid industrialization, urbanization, and deforestation (Ayers et al., 2000; Lara et al., 2001). Nevertheless, little information is available for developing regions of tropical and sub-tropical areas, where land-use changes are intense and followed by rapid urbanization associated with large industrial expansion. Such information is relevant, since recent estimates show that in the near future more than half of the N inputs related to energy consumption on Earth will take place in tropical and subtropical regions (Galloway et al., 1994). In addition, tropical terrestrial and aquatic systems appear to function differently from temperate systems, where N limitation is more severe than in the tropics (Matson et al., 1999). Conclusions based only on studies conducted in temperate regions may not be valid for tropical and sub-tropical regions. In the tropics, annual nitrogen wet deposition ranges from 2 to 10 kg N/ha/yr (Williams et al., 1997; Lara et al., 2001; IGAC, 2003), according to the land cover. Brazil is largely tropical. It is considered a developing country, where developed areas with large urban centers, a large number of industries, and high-technology agricultural systems coexist with developing areas with low-technology and frontier-type agricultural systems, and with remote regions such as the Amazon Basin. These anthropogenic activities are increasing N wet deposition from an annual rate of 3.0 kg N/ha/yr in remote areas to an annual rate of 5.6 kg N/ha/yr in disturbed regions. 
If this increase in N deposition in Brazil persists, it may affect the regional nitrogen cycle, with possible consequences for ecosystems and for the global nitrogen cycle.
References:
Ayers, G.P., Leong Chow Peng, Lim Sze Fook, Cheah Wai Kong, Gillett, R.W., Manins, P.C. (2000) Atmospheric concentrations and deposition of oxidized sulfur and nitrogen species at Petaling Jaya, Malaysia, 1993-1998. Tellus B, 52, 60-73.
Galloway, J.N., Levy, H., Kasibhatla, P.S. (1994) Year 2020: consequences of population growth and development on deposition of oxidized nitrogen. Ambio, 23, 120-123.
IGAC (2003) IGACtivities Newsletter no. 27, DEBITS special issue, January 2003.
Lara, L.B.L.S., Artaxo, P., Martinelli, L.A., Victoria, R.L., Camargo, P.B., Krusche, A., Ayers, G.P., Ferraz, E.S.B., Ballester, M.V. (2001) Chemical composition of rainwater and anthropogenic influences in the Piracicaba river basin, Southeast Brazil. Atmospheric Environment, 35, 4937-4945.
Matson, P.A., McDowell, W.H., Townsend, A.R., Vitousek, P.M. (1999) The globalization of N deposition: ecosystem consequences in tropical environments. Biogeochemistry, 46, 67-83.
Williams, M.R., Fisher, T.R., Melack, J.M. (1997) Chemical composition and deposition of rain in the central Amazon, Brazil. Atmospheric Environment, 31, 207-217.
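The magnitude of the reported shift is easy to quantify; a minimal sketch using only the rates quoted in the abstract:

```python
# Annual wet N deposition rates quoted in the abstract.
remote = 3.0      # kg N/ha/yr, remote areas (e.g., Amazon Basin)
disturbed = 5.6   # kg N/ha/yr, disturbed regions

increase = disturbed - remote          # absolute increase, kg N/ha/yr
relative = 100 * increase / remote     # percent increase over the remote baseline

print(f"{increase:.1f} kg N/ha/yr more, a {relative:.0f}% increase")
```

That is, the anthropogenically disturbed regions receive almost twice the remote-area deposition rate.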

  20. LITERATURE ABSTRACTS.

    PubMed

    1971-12-01

    1. General Principles:
    'Statistical Aspect of the Correlation Between Objective and Subjective Measurements of Meat Tenderness', by M. C. Gacula, Jr., J. B. Reaume, K. J. Morgan, and R. L. Luckett
    'Texture of Semi-Solid Foods: Sensory and Physical Correlates', by W. F. Henry, M. H. Katz, F. J. Pilgrim, and A. T. May
    2. Instrumentation and Methodology:
    'Measurement of Bread Staling', by W. Morandini and L. Wassermann
    'Physical Considerations of the Methods of Consistency Measurement of Butter', by E. Knoop
    'Electronic Recording Mixers for the Baking Test', by P. W. Voisey, V. M. Bendelow and H. Miller
    'Measurement of the Consistency of Reconstituted Instant Potato Flakes', by P. W. Voisey and P. R. Dean
    'The Ottawa Electronic Recording Farinograph', by P. W. Voisey, H. Miller and P. L. Byrne
    3. Objective Measurements: A. FOODS:
    'The Rheological Properties of Corn Horny Endosperm', by J. R. Hamerle, R. K. White, and N. N. Mohsenin
    'Evaluation of Mechanical Properties of Comminuted Sausages by Construction and Analysis of Rheological Model', by St. Tyszkiewicz
    'Studies on Creep Compliance of Butter', by M. Chwiej
    'Heat-Induced Milk Gels. II. Preparation of Gels and Measurement of Firmness', by M. Kalab, P. W. Voisey and D. B. Emmons
    'Rheology of Fresh, Aged and Gamma-Irradiated Egg White', by M. A. Tung, J. F. Richards, B. C. Morrison and E. L. Watson
    'Retardation of Bread Staling - Practical Experiences', by W. Morandini and L. Wassermann
    3. Objective Measurements: B. PHARMACEUTICALS:
    'Influence of HLB on Certain Physicochemical Parameters of an O/W Emulsion', by M. Schrenzel
    'The Rheological Evaluation of Semisolids', by L. H. Block and P. P. Lamy
    4. Factors Affecting Texture:
    'Effects of Physical and Mechanical Treatments on the Tenderness of the Beef Longissimus', by G. C. Smith, T. C. Arango and Z. L. Carpenter
    'Histological and Physical Changes in Carrots as Affected by Blanching, Cooking, Freezing, Freeze Drying and Compression', by A. R. Rahman, W. L. Henning and D. E. Westcott
    'Effects of Physiological Maturity of Beef and Marbling of Rib Steaks on Eating Quality', by H. L. Norris, D. L. Harrison, L. L. Anderson, B. Van Welck and H. J. Tuma
    'Effect of Ultimate pH Upon the Water-Holding Capacity and Tenderness of Mutton', by P. E. Bouton, P. V. Harris and W. R. Shorthose
    'The Dilution Coefficient of Butter Serum and the Consistency of Butter', by E. Pijanowski, M. Chwiej, H. Hernik and M. Kurtowicz
    'Moisture and pH Changes as Criteria of Freshness in Abalone and their Relationship to Texture of the Canned Product', by D. G. James and J. Olley
    'Effect of Sucrose on Crispness of Explosion-Puffed Apple Pieces Exposed to High Humidities', by E. O. Strolle, J. Cording, Jr., P. E. McDowell, and R. K. Eskew
    'Effect of Heat Treatment on Viscosity of Yolk', by P. K. Chang, W. D. Powrie and O. Fennema
    'Protein Quality and Quantity: A Rheological Assessment of the Relative Importance in Breadmaking', by T. Webb, P. W. Heaps, and J. B. M. Coppock
    'Bread Staling. I. Experimental Study', by E. M. A. Willhoft
    'Bread Staling. II. Theoretical Study', by E. M. A. Willhoft

  1. Rodinia, She Do Spin: A (prequel) tribute to the legendary 2001 AGU poster, "Pangaea, She No Spin"

    NASA Astrophysics Data System (ADS)

    Stegman, D.; Knight, K.

    2004-12-01

    On one side of the poster: actual science, real equations, a burning question many scientists are researching by employing the best resources governments can provide; on the other side of the poster: a shameless and ill-fated attempt to conceptualize the problem using rudimentary and commonly available crafts. Unbeknownst to passers-by, meticulous, covert record keepers will busily tabulate which of the two presentation styles draws more attention and for how long. Late night television the world over asks such simple questions: Will it float? (David Letterman); What was weak? (Australian Broadcasting Corporation's "Double the Fist" show); etc. Borrowing a page from popular culture we adopt a simple and interesting question: Does it spin? This question has been posed previously in rigorously mathematical terms in relation to true polar wander (TPW) (Ricard et al., 1993), in descriptive terms in relation to paleomagnetic data sets (Evans, 2003), and also in more interesting terms (McDowell, 2001). This poster draws on the success of the latter presentation, which described the 2001 experiment thusly: "I wondered what would happen if the configuration were put in high relief on a globe and spun on axis. Then I wondered if the present configuration of land masses would itself balance as a spinning top. So I got two Replogle globes, two boxes of colored modeling clay sticks, and two fat knitting needles, to fit through the capped holes at the poles of the globes. The clay sticks I cut up into 3 mm (1/8") slices, using a different color for each continent, and applied to the first globe, assuming the extreme exaggeration above the geoid, no matter how crude, would tell the story. Inserting one needle through the globe and securing it, I balanced the globe on the point of the needle and twirled it like a top. Result: Wobbly! Top end of needle gyrated unevenly, and here it was supposed to make a smooth precessional cone. Oh boy. 
For the second globe, I used a Scotese "free stuff" interpretation of Pangaea, which I had to augment considerably using USGS, DuToit, Irving and other references, fitting it on the globe and applying identical clay color slices to what I judged generally accepted land surfaces. Result: the thing would hardly stand up, let alone spin." We will indeed present new results relevant to the issue of whether Earth experienced an increased amount of rotational instability (i.e. TPW) during the Proterozoic as a result of the Rodinia supercontinent. We make the assumption that the viscosity structure of the mantle has not changed significantly since the Neoproterozoic, and that the configuration of tectonic plates into a supercontinent (testing multiple plausible reconstructions) and the internal density anomalies resulting from such tectonic patterns will drive changes in the time-varying moment of inertia tensor. So basically, we got a really big computer, a really fast mantle convection program written in F77, a few different plate reconstructions, generated a mantle density structure and then calculated the response for changes in the spin axis. We also create an "analog" model using crafts, including but not limited to, elbow macaroni, glitter, popsicle sticks, and felt. Without resorting to the conclusion that the Earth must have been a smaller size in the past, we will present our findings because Rodinia, she do spin.
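The "calculated the response for changes in the spin axis" step can be caricatured in a few lines. Treating internal density anomalies as point masses (a toy stand-in for the mantle density structure from the convection code; all positions and masses below are made up), the inertia tensor perturbation is I_ij = sum over anomalies of dm (|r|^2 delta_ij - x_i x_j), and a rotationally stable planet spins about the eigenvector of the largest principal moment:

```python
import numpy as np

def inertia_tensor(positions, masses):
    """Inertia tensor of point-mass density anomalies.

    positions: (N, 3) Cartesian coordinates (m)
    masses:    (N,) excess masses (kg; negative = mass deficit)
    """
    I = np.zeros((3, 3))
    for r, dm in zip(positions, masses):
        I += dm * (np.dot(r, r) * np.eye(3) - np.outer(r, r))
    return I

def preferred_spin_axis(I):
    """Unit eigenvector of the maximum principal moment: rotation about
    this axis minimizes rotational energy for fixed angular momentum."""
    eigvals, eigvecs = np.linalg.eigh(I)
    return eigvecs[:, np.argmax(eigvals)]

# Hypothetical example: two equal excess masses in the equatorial plane.
R = 6.37e6                                        # m, roughly Earth's radius
anomalies = np.array([[R, 0.0, 0.0], [0.0, R, 0.0]])
masses = np.array([1e18, 1e18])                   # kg, arbitrary magnitudes

axis = preferred_spin_axis(inertia_tensor(anomalies, masses))
print(axis)  # excess mass on the equator keeps the spin axis polar (along +/- z)
```

If the excess mass instead sat near a pole, the maximum principal axis would rotate into the equatorial plane: the planet tips until the anomaly lies on the equator, which is exactly the TPW scenario the poster tests for Rodinia.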

  2. MacMS: A Mass Spectrometer Simulator: Abstract of Issue 9906M

    NASA Astrophysics Data System (ADS)

    Bigger, Stephen W.; Craig, Robert A.

    1999-10-01

    MacMS is a program for Mac-OS compatible computers that simulates a magnetic sector mass spectrometer (1-4) designed to operate in the mass-to-charge (m/z) ratio range of 1-200 amu. MacMS has two operational modules. The first module (see Figure 1), called the "Path" module, enables the user to quantitatively examine the trajectory of an ion of given m/z ratio in the electric and magnetic fields of the simulated "instrument". By systematically measuring a series of trajectories of different ions under different electric and magnetic field conditions, the user can determine how the resolution of the "instrument" is affected by these experimentally variable parameters. The user can thus choose suitable instrumental conditions for scanning a given m/z ratio range with good separation between the peaks. The second module (see Figure 2), called the "Spectrometer" module, enables the user to record, under any chosen instrumental conditions, the mass spectrum of (i) the instrumental background, (ii) neon, (iii) methane, or (iv) the parent ion of carbon tetrachloride. Both voltage scanning and magnetic scanning are possible (5). A hard copy of any mass spectrum that has been recorded can also be obtained. MacMS can read ASCII data files containing mass spectral information for compounds other than those that are "built-in" to the simulator. The appropriate format for creating such data files is described in the program documentation. There are a number of instructional exercises that can be conducted using the mass spectral information contained within the simulator. These are included in the program documentation. For example, the intensities of the 20Ne+, 21Ne+, and 22Ne+ species can be determined from hard copies of mass spectra of neon that are obtained under different instrumental sensitivities. The relative abundances of the three isotopes of neon can thus be calculated and compared with the literature values (6). 
The simulator also includes selectable, fixed-value range and gain settings, which can be used to enhance the resolution and the sensitivity of the instrument, respectively.
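The sector physics MacMS teaches can be sketched as follows. An ion of charge ze accelerated through voltage V enters field B and follows a circular path of radius r = mv/(zeB); eliminating v via zeV = mv^2/2 gives m/z = eB^2r^2/(2V). The function names and the field and radius values below are illustrative, not MacMS's actual API:

```python
E = 1.602176634e-19      # elementary charge, C
AMU = 1.66053906660e-27  # atomic mass unit, kg

def transmitted_mz(B, r, V):
    """m/z (amu per elementary charge) passed by a sector of radius r (m)
    at magnetic field B (T) and accelerating voltage V (V)."""
    return E * B**2 * r**2 / (2 * V * AMU)

def voltage_for_mz(mz, B, r):
    """Voltage scanning: accelerating voltage that transmits a given m/z."""
    return E * B**2 * r**2 / (2 * mz * AMU)

def relative_abundances(intensities):
    """Percent abundances from raw peak intensities, as in the neon exercise."""
    total = sum(intensities.values())
    return {ion: 100 * i / total for ion, i in intensities.items()}

# Hypothetical peak heights read off a simulated neon spectrum:
neon = relative_abundances({"20Ne+": 905, "21Ne+": 3, "22Ne+": 92})
print(neon)
```

With, say, B = 0.3 T and r = 0.1 m, transmitting m/z = 20 requires roughly 2.2 kV; lowering the voltage walks the instrument up the mass scale, which is why voltage scanning works, while magnetic scanning holds V fixed and sweeps B.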

    Figure 1. The "Path" module of MacMS showing the control panel (upper section) and graphics display region (lower section). The graphics display region incorporates a "data collector", which includes a "Grab" button to collect data and an area where data are displayed.
    Figure 2. The "Spectrometer" module of MacMS showing the control panel (upper section) and a graphics display region (lower section). A mass spectrum is produced in the graphics display region upon scanning. A "data collector" similar to that of the "Path" module forms part of the graphics display region.
    Hardware and Software Requirements
    Literature Cited
    1. Kiser, R. N. Introduction to Mass Spectrometry and its Applications; Prentice-Hall: Englewood Cliffs, N. J., 1965; pp 1-3; pp 32-65.
    2. Johnstone, R. A. W.; Rose, M. E. Mass Spectrometry for Chemists and Biochemists, 2nd ed.; Cambridge University Press: Cambridge, 1996.
    3. Hill, H. C.; Loudon, A. G. Introduction to Mass Spectrometry; 2nd ed.; Heyden: London, 1972; p 5.
    4. Farmer, J. B. In Mass Spectrometry, McDowell, C. A., Ed.; McGraw-Hill: New York, 1963; pp 10-11.
    5. Message, G. M. Practical Aspects of Gas Chromatography-Mass Spectrometry, Wiley: New York, 1984; Chapter 3.
    6. CRC Handbook of Chemistry and Physics, 55th ed.; CRC: Cleveland, 1974.

  3. Who is the new sheriff in town regulating boreal forest growth?

    NASA Astrophysics Data System (ADS)

    Park Williams, A.; Xu, Chonggang; McDowell, Nate G.

    2011-12-01

    Climate change appears to be altering boreal forests. One recently observed symptom of these changes has been an apparent weakening of the positive relationship between high-latitude boreal tree growth and temperature at some sites (D'Arrigo et al 2008). This phenomenon is referred to as the 'divergence problem' or 'divergence effect' and is thought to reflect a non-linear relationship between temperature and tree growth, where recent warming has allowed other factors besides growing-season temperature to emerge as dominant regulators of annual growth rates. Figure 1 demonstrates this divergence phenomenon with records of tree-ring widths collected from 59 populations of white spruce in Alaska 1. Key tendencies among these populations include: (1) growth is most sensitive to temperature during relatively cold growing seasons (figure 1(a)), (2) populations at colder sites are more sensitive to temperature than those at warmer sites are (figure 1(a)), and (3) growth at warmer sites may respond negatively to increased temperature beyond some optimal growing-season temperature (figure 1(b)). Since temperature is rising rapidly at high latitudes, one interpretation of figures 1(a) and (b) is that warming has promoted increased growth at colder sites, but caused growth to plateau or slow at warmer sites. Corroborating this interpretation, satellite imagery and tree-ring data indicate increasing vegetation productivity near the forest-tundra boundary but declining productivity in warmer regions within forest interiors (e.g., Bunn and Goetz 2006, Beck and Goetz 2011, Beck et al 2011, Berner et al 2011). Will continued warming cause a northward migration of boreal forests, with mortality in the warmer, southern locations and expansion into the colder tundra? This question is difficult to answer because many factors besides temperature influence boreal forest dynamics. 
Widespread productivity declines within interior boreal forests appear to be related to warming-induced drought stress (Barber et al 2000). Notably, this response may be more complicated than simply a decline in soil moisture. Even when soil moisture is plentiful, warming can negatively impact plant growth and survival by causing increased respiratory consumption of stored carbohydrates (McDowell 2011) and decreased stomatal conductance due to hydraulic limitation (Flexas et al 2004). Some degree of acclimation may be occurring, as white spruce populations that experience moderate temperatures and precipitation have lower optimal growth temperatures than populations at warmer, drier sites do (figure 1(c)). Yet, populations at the warmest or driest sites show strong growth declines during warm periods, consistent with a decline in the viability of these populations in some regions (Goetz et al 2005, Beck and Goetz 2011, Beck et al 2011). Can interior boreal forests acclimate to the current era's rapid warming? Or will temperatures within interior boreal forests outpace or extend beyond the adaptive capabilities of boreal tree species? The answer remains a mystery, partly because important aspects of acclimation are still poorly understood, and partly because of other important processes such as wildfire and increases in CO2 concentration, nitrogen deposition, growing-season length, and tropospheric ozone concentration. Figure 1. Relationships between white spruce tree-ring widths and climate at 59 sites in Alaska. (a) Annual correlation between ring-width index and June-July average temperature during years when June-July temperature was colder (blue bars) and warmer (red bars) than average. Pairs of bars represent the coldest 20 sites (left), 19 sites with intermediate temperature (middle) and the warmest 20 sites (right). 
(b) Spline curves that represent the best-fit relationship between temperature (x-axis) and ring-width index variability (y-axis) at cold sites (blue line), intermediate sites (black line) and warm sites (orange line). (c) Same as (b) but for the wettest 20 sites (green line), 19 sites with intermediate annual precipitation (black line) and the driest 20 sites (brown line). Error bars in (a)-(c) are standard errors. Perhaps an even bigger mystery is what the future has in store at the cold ecotone where boreal forest gives way to arctic tundra. Just as for warmer sites, there tends to be a temperature threshold at cold and intermediate sites, above which further warming no longer positively influences growth rate (figures 1(a) and (b)). Rather than reverse sign once this threshold is surpassed, growth-temperature relationships at cold and intermediate sites tend to simply disappear or at least diminish. This is because metabolic rates are slow in the cold, but are optimal under moderately warmer conditions (Tjoelker et al 2009). As temperature increases into a range of variability that no longer limits metabolic rate, a host of other climatic and soil-related factors can limit or promote growth and seedling recruitment. At some cool treeline sites, rapidly rising temperatures may have already surpassed the level that supports optimal growth, as negative relationships have emerged between temperature and growth rate in most decades (McGuire et al 2010). In a recent contribution to this important body of research, Andreu-Hayles et al (2011) studied growth-temperature relations within a white spruce population growing at the northern treeline in Alaska. Consistent with observations elsewhere in boreal forests, Andreu-Hayles et al discovered that a positive and significant relationship between ring widths and June-July temperature during 1901-1950 disappeared during 1951-2000. 
Interestingly, ring widths and temperature both increased throughout the 20th century at this treeline site, in contrast to recent trends at many other sites in Alaska where warming is outpacing ring widths (e.g., D'Arrigo et al 2008). At the site studied by Andreu-Hayles et al, it seems recent warming has caused a release of white spruce growth from temperature limitation and there is now a new sheriff in town regulating annual growth rate. Who this new sheriff is, however, remains an open and important question. Another interesting result in the Andreu-Hayles et al study is that the relationship between temperature and density of tree-ring latewood (the dark band formed at the end of the growing season) was stable throughout the 20th century. This means that although temperature may no longer be the primary factor governing annual growth, it still has an important physiological impact at the end of the growing season. The stability of the latewood density-temperature relationship also offers a promising implication for dendroclimatic studies. While non-linear relationships between ring widths and temperature may make it difficult to use ring widths to infer information about historical temperature variability for some sites, Andreu-Hayles et al add to the evidence (e.g., Barber et al 2000, Davi et al 2003, D'Arrigo et al 2009) that latewood density may be particularly useful in reconstructing historical temperature at high latitudes. While the divergence problem and new contribution by Andreu-Hayles et al are interesting on their own, they are also important because they highlight the current limits to our understanding of the mechanisms driving boreal forest growth and survival. As Allen et al (2010) pointed out, understanding and predicting the consequences of climate changes on forests is emerging as a grand challenge for global change scientists. 
This is particularly true at high latitudes because boreal forests store ~32% of Earth's terrestrial forest carbon, more than twice that of temperate forests (Pan et al 2011). Will continued warming turn boreal forests into a sink or source of atmospheric CO2? And will boreal forest growth and distribution change enough to significantly impact the energy balance of high latitude landscapes and thereby influence large-scale atmospheric circulation? To answer these questions, it is critical to understand the factors influencing boreal forest growth under warmer conditions and how the relative contributions of these factors vary spatially. Our understanding of these factors can be improved through research campaigns that integrate field-measurements, remote sensing and ecological modeling (Goetz et al 2011). Field-studies that measure the physiological responses of trees to manipulations of environmental variables such as temperature, soil moisture, soil nutrients and insolation are critical for informing ecological models that predict forest responses to various scenarios of climate and environmental change. Remote sensing is critical in validating modeled projections of forest growth. At present, ecological models do poorly at characterizing observed trends in boreal-forest productivity in some regions (Beck et al 2011). It will be exciting in the coming years to see how field measurements, modeling and remote sensing can work together to resolve the mysteries of the divergence problem, how warming will influence the overall productivity and distribution of boreal forests, and how changes in boreal-forest characteristics may influence regional and global climates.
References
Allen C D et al 2010 A global overview of drought and heat-induced tree mortality reveals emerging climate change risks for forests Forest Ecol. Manag. 259 660-84
Andreu-Hayles L, D'Arrigo R, Anchukaitis K J, Beck P S A, Frank D and Goetz S 2011 Varying boreal forest response to Arctic environmental change at the Firth River, Alaska Environ. Res. Lett. 6 045503
Barber V A, Juday G P and Finney B P 2000 Reduced growth of Alaskan white spruce in the twentieth century from temperature-induced drought stress Nature 405 668-73
Beck P S A and Goetz S J 2011 Satellite observations of high northern latitude vegetation productivity changes between 1982 and 2008: ecological variability and regional differences Environ. Res. Lett. 6 045501
Beck P S A, Juday G P, Alix C, Barber V A, Winslow S E, Sousa E E, Heiser P, Herriges J D and Goetz S J 2011 Changes in forest productivity across Alaska consistent with biome shift Ecol. Lett. 14 373-9
Berner L T, Beck P S A, Bunn A G, Lloyd A H and Goetz S J 2011 High-latitude tree growth and satellite vegetation indices: correlations and trends in Russia and Canada (1982-2008) J. Geophys. Res. 116 G01015
Bunn A G and Goetz S J 2006 Trends in satellite-observed circumpolar photosynthetic activity from 1982 to 2003: the influence of seasonality, cover type, and vegetation density Earth Interact. 10 1-19
D'Arrigo R, Jacoby G, Buckley B, Sakulich J, Frank D, Wilson R, Curtis A and Anchukaitis K 2009 Tree growth and inferred temperature variability at the North American Arctic treeline Glob. Planet. Change 65 71-82
D'Arrigo R, Wilson R, Liepert B and Cherubini P 2008 On the 'divergence problem' in northern forests: a review of the tree-ring evidence and possible causes Glob. Planet. Change 60 289-305
Davi N K, Jacoby G C and Wiles G C 2003 Boreal temperature variability inferred from maximum latewood density and tree-ring width data, Wrangell Mountain region, Alaska Quatern. Res. 60 252-62
Flexas J, Bota J, Loreto F, Cornic G and Sharkey T 2004 Diffusive and metabolic limitations to photosynthesis under drought and salinity in C3 plants Plant Biol. 6 269-79
Goetz S J, Bunn A G, Fiske G J and Houghton R 2005 Satellite-observed photosynthetic trends across boreal North America associated with climate and fire disturbance Proc. Natl Acad. Sci. USA 102 13521-5
Goetz S J, Kimball J S, Mack M C and Kasischke E S 2011 Scoping completed for an experiment to assess vulnerability of Arctic and boreal ecosystems EOS Trans. Am. Geophys. Union 92 150-1
McDowell N G 2011 Mechanisms linking drought, hydraulics, carbon metabolism, and vegetation mortality Plant Physiol. 155 1051-9
McGuire A D, Ruess R W, Lloyd A, Yarie J, Clein J S and Juday G P 2010 Vulnerability of white spruce tree growth in interior Alaska in response to climate variability: dendrochronological, demographic, and experimental perspectives Canadian J. Forest Res. 40 1197-209
Pan Y et al 2011 A large and persistent carbon sink in the world's forests Science 333 988-93
Tjoelker M G, Oleksyn J, Lorenc-Plucinska G and Reich P B 2009 Acclimation of respiratory temperature responses in northern and southern populations of Pinus banksiana New Phytologist 181 218-29
1 Tree-ring data: ftp.ncdc.noaa.gov/pub/data/paleo/treering. Climate data: snap.uaf.edu/downloads/alaska-climate-datasets.
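The split-sample correlation behind figure 1(a) is straightforward to reproduce. The sketch below uses synthetic data in which growth tracks temperature only up to a saturation point; it assumes nothing about the real series, which live in the archives cited in the footnote:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic site: June-July mean temperature (deg C) and a ring-width
# index whose temperature response saturates above ~13 deg C.
temp = rng.normal(13.0, 1.5, 200)
rwi = 1.0 + 0.10 * np.minimum(temp, 13.0) + rng.normal(0.0, 0.04, 200)

# Correlate ring widths with temperature separately for colder- and
# warmer-than-average years, as in figure 1(a).
cold = temp < temp.mean()
r_cold = np.corrcoef(temp[cold], rwi[cold])[0, 1]
r_warm = np.corrcoef(temp[~cold], rwi[~cold])[0, 1]
print(f"cold-year r = {r_cold:.2f}, warm-year r = {r_warm:.2f}")
```

Growth correlates strongly with temperature in the colder half of the record and only weakly in the warmer half, which is the divergence signature the abstract describes: once warming pushes a site past its optimum, temperature stops being the limiting factor and the correlation collapses.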

  4. The use of chipped pruned branches to control the soil and water losses in citrus plantations in Eastern Spain

    NASA Astrophysics Data System (ADS)

    Cerdà, Artemi; Keesstra, Saskia; Jordán, Antonio; Pereira, Paulo; Prosdocimi, Massimo; Ritsema, Coen J.; Burguet, María

    2016-04-01

    Soil erosion is the main cause of soil degradation in agricultural land and is a world-wide problem (Cerdà et al., 2009; Novara et al., 2011; Biswas et al., 2015; Colazo and Buschiazzo, 2015; Ligonja and Shrestha, 2015). High erosion rates result in the loss of soil and also alter the hydrological, erosional, biological, and geochemical cycles (Keesstra et al., 2012; Berendse et al., 2015; Decock et al., 2015; Brevik et al., 2015; Smith et al., 2015). There is therefore a need to reduce soil losses to achieve soil sustainability. Although straw, geotextiles, vegetation cover and reduced tillage have been shown to be efficient strategies (Gimenez Morera et al., 2010; Cerdà et al., 2015; Lieskovský and Kenderessy, 2014; Taguas et al., 2015), easy strategies that farmers will adopt in their fields, and that both protect and recover their soils, are still needed. Pruned branches are usually burned in many orchards to remove them from the fields. However, if they are instead chipped and spread on the fields, they become a source of organic matter and may also reduce soil losses and improve the water retention capacity of the soils (Mukherjee et al., 2014; Yazdanpanah et al., 2016). Our hypothesis is that chipped branches reduce soil loss. To test this hypothesis we selected 3 study sites where chipped branches were applied, paired with bare-soil sites, to assess the changes the chipped branches introduce in the soils. The 3 sites lie in the Cànyoles river watershed (Montesa municipality), eastern Spain, with 10 plots at each site, and 10 rainfall simulation experiments were carried out at each site. Paired plots were selected in a nearby (less than 10 m distant) orchard from which the pruned branches had been removed. In total, 60 rainfall simulation experiments at 55 mm h-1 rainfall intensity for 1 hour were carried out on small 0.25 m2 plots to determine soil particle detachment. 
The results show that at all three sites soil erosion is reduced by one order of magnitude on average as a consequence of the cover of chipped pruned branches (78.45% average cover) compared to the bare (control) soils.
Acknowledgements: The research leading to these results received funding from the European Union Seventh Framework Programme (FP7/2007-2013) under grant agreement n° 603498 (RECARE project).
References:
Berendse, F., van Ruijven, J., Jongejans, E., Keesstra, S. 2015. Loss of plant species diversity reduces soil erosion resistance. Ecosystems, 18 (5), 881-888. DOI: 10.1007/s10021-015-9869-6
Biswas, H., Raizada, A., Mandal, D., Kumar, S., Srinivas, S., Mishra, P.K. 2015. Identification of areas vulnerable to soil erosion risk in India using GIS methods. Solid Earth, 6 (4), 1247-1257. DOI: 10.5194/se-6-1247-2015
Brevik, E.C., Cerdà, A., Mataix-Solera, J., Pereg, L., Quinton, J.N., Six, J., Van Oost, K. 2015. The interdisciplinary nature of SOIL. SOIL, 1, 117-129. DOI: 10.5194/soil-1-117-2015
Cerdà, A., Giménez-Morera, A., Bodí, M.B. 2009. Soil and water losses from new citrus orchards growing on sloped soils in the western Mediterranean basin. Earth Surface Processes and Landforms, 34, 1822-1830. DOI: 10.1002/esp.1889
Cerdà, A., González-Pelayo, O., Giménez-Morera, A., Jordán, A., Pereira, P., Novara, A., Brevik, E.C., Prosdocimi, M., Mahmoodabadi, M., Keesstra, S., García Orenes, F., Ritsema, C. 2015. The use of barley straw residues to avoid high erosion and runoff rates on persimmon plantations in Eastern Spain under low frequency - high magnitude simulated rainfall events. Soil Research (in press)
Colazo, J.C., Buschiazzo, D. 2015. The impact of agriculture on soil texture due to wind erosion. Land Degradation and Development, 26 (1), 62-70. DOI: 10.1002/ldr.2297
Decock, C., Lee, J., Necpalova, M., Pereira, E.I.P., Tendall, D.M., Six, J. 2015. Mitigating N2O emissions from soil: from patching leaks to transformative action. SOIL, 1, 687-694. DOI: 10.5194/soil-1-687-2015
Keesstra, S.D., Geissen, V., van Schaik, L., Mosse, K., Piiranen, S. 2012. Soil as a filter for groundwater quality. Current Opinion in Environmental Sustainability, 4, 507-516. DOI: 10.1016/j.cosust.2012.10.007
Lieskovský, J., Kenderessy, P. 2014. Modelling the effect of vegetation cover and different tillage practices on soil erosion: a case study in Vráble (Slovakia) using WATEM/SEDEM. Land Degradation and Development, 25 (3), 288-296. DOI: 10.1002/ldr.2162
Ligonja, P.J., Shrestha, R.P. 2015. Soil erosion assessment in Kondoa eroded area in Tanzania using universal soil loss equation, geographic information systems and socioeconomic approach. Land Degradation and Development, 26 (4), 367-379. DOI: 10.1002/ldr.2215
Mukherjee, A., Zimmerman, A.R., Hamdan, R., Cooper, W.T. 2014. Physicochemical changes in pyrogenic organic matter (biochar) after 15 months of field aging. Solid Earth, 5 (2), 693-704. DOI: 10.5194/se-5-693-2014
Novara, A., Gristina, L., Saladino, S.S., Santoro, A., Cerdà, A. 2011. Soil erosion assessment on tillage and alternative soil managements in a Sicilian vineyard. Soil and Tillage Research, 117, 140-147.
Smith, P., Cotrufo, M.F., Rumpel, C., Paustian, K., Kuikman, P.J., Elliott, J.A., McDowell, R., Griffiths, R.I., Asakawa, S., Bustamante, M., House, J.I., Sobocká, J., Harper, R., Pan, G., West, P.C., Gerber, J.S., Clark, J.M., Adhya, T., Scholes, R.J., Scholes, M.C. 2015. Biogeochemical cycles and biodiversity as key drivers of ecosystem services provided by soils. SOIL, 1, 665-685. DOI: 10.5194/soil-1-665-2015
Taguas, E.V., Arroyo, C., Lora, A., Guzmán, G., Vanderlinden, K., Gómez, J.A. 2015. Exploring the linkage between spontaneous grass cover biodiversity and soil degradation in two olive orchard microcatchments with contrasting environmental and management conditions. SOIL, 1, 651-664. DOI: 10.5194/soil-1-651-2015
Yazdanpanah, N., Mahmoodabadi, M., Cerdà, A. 2016. The impact of organic amendments on soil hydrology, structure and microbial respiration in semiarid lands. Geoderma, 266, 58-65. DOI: 10.1016/j.geoderma.2015.11.032
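The paired-plot comparison reduces to a ratio of means. The detachment values below are hypothetical stand-ins for the 60 simulations (55 mm h-1, 1 h, 0.25 m2 plots) described in the abstract, chosen only to illustrate the calculation:

```python
import numpy as np

# Hypothetical sediment detachment per plot (g per plot per hour):
# bare controls vs plots covered with chipped pruned branches.
bare = np.array([48.0, 52.1, 61.3, 44.7, 57.9, 50.2])
chipped = np.array([4.2, 5.8, 3.9, 6.1, 4.7, 5.3])

factor = bare.mean() / chipped.mean()
percent = 100 * (1 - chipped.mean() / bare.mean())
print(f"soil loss reduced ~{factor:.0f}-fold ({percent:.0f}% lower)")
```

A tenfold drop in the means is what "reduced by one order of magnitude" corresponds to; with real data one would also report the spread across the 10 plots per site rather than a single pooled mean.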

  5. The developmental origins of naïve psychology in infancy.

    PubMed

    Poulin-Dubois, Diane; Brooker, Ivy; Chow, Virginia

    2009-01-01

    Research interest in children's understanding of the mind goes back as far as Piaget's claim that children are cognitively egocentric (Flavell, 2000). Many years later, research on the understanding of the mind was revived in a paper that sought evidence for a theory of mind, not in children but in chimpanzees (Premack & Woodruff, 1978). The researchers claimed that chimpanzees' ability to predict what a human actor will do to achieve certain goals implies that the animal attributes mental states to the actor. This seminal paper generated a flurry of studies on theory of mind in nonhuman primates. A review of this research, based on several different experimental paradigms, concluded that chimpanzees understand others in terms of a perception-goal psychology (i.e., they can perceive what another's goal is but do not understand the mental states associated with the goal), as opposed to a full-fledged, human-like belief-desire psychology (Call & Tomasello, 2008). Around the same time, research on children's understanding of the mind was revived in a landmark paper by Wimmer and Perner (1983) and by other developmentalists (Bretherton, McNew, & Beegly-Smith, 1981). In line with the research on nonhuman primates, part of the progress made in recent years is a recognition that theory of mind knowledge is acquired in an extended series of developmental milestones and that this development is based on a rich set of socio-cognitive abilities that develop in infancy (Wellman, 2002). The evidence outlined in the sections of this chapter suggests that infants possess a nascent understanding of the mental states that older children use in explaining and predicting human behavior. Researchers have learned a great deal about the developmental origins of naive psychology in infancy. Nevertheless, the depth of infants' understanding of human behavior is still a controversial issue. For example, a popular paradigm in naive psychology is violation of expectancy. 
In false-belief tasks, infants look longer at a scene in which a protagonist searches for an object in a location she does not know than at a scene in which the protagonist searches for an object in a location where she has previously seen the object disappear. The fact that no active behavioral response is required makes many researchers doubt that an infant's looking pattern reflects a deep level of understanding. Looking patterns may simply reflect the infant's detection that something in the scene is novel (e.g., the protagonist looks at a location different from the one infants last saw her look at). Indeed, this interpretation may account for the conflicting results in recent studies (e.g., Poulin-Dubois et al., 2007; Onishi & Baillargeon, 2005; Surian et al., 2007). Poulin-Dubois et al. (2007) recently reported that the ability to distinguish between knowledge and ignorance (true belief) is absent at 14 months of age and still fragile at 18 months in a violation-of-expectancy task depicting videotaped human actors. In contrast, false-belief attribution to a computer-animated caterpillar has been reported in 13-month-old infants (Surian et al., 2007). Given that infants have had more experience with humans looking at objects than with a caterpillar's looking behavior, the current evidence for an implicit understanding of advanced mental states such as false belief should be interpreted with caution. As is the case for nonhuman primate research, infants' mind-reading success might be accounted for by a simple behavior-reading explanation. According to some researchers, primates' (and infants') successful performance in theory of mind tasks can be explained by a sophisticated form of behavior reading. Under this view, infants perform well in such tasks because they are adept at calculating the statistical likelihood that some aspects of people's observable features (e.g., gaze) will be linked to future actions (e.g., search at a location).
Distinguishing between a mentalistic and a rule-based account is very difficult (Povinelli & Vonk, 2004). One way to address this debate would be to design training studies that provide infants with first-person experience of mental states and to use more active behavioral measures. In terms of training, there is some evidence that infants' performance on goal and visual perception attribution tasks improves if they receive training of relevant skills (e.g., wearing a blindfold, reaching with a "sticky mitten"; Meltzoff & Brooks, 2007; Sommerville & Woodward, 2004). Furthermore, longitudinal research using more active measures revealed links between goal detection as measured with the violation-of-expectancy paradigm at 10 months of age and the ability to infer intended goals in an imitation task at 14 months (Olineck & Poulin-Dubois, 2007b). Developmental changes in the scope of infants' concept of an intentional agent will also require more attention from researchers. According to some, infants' attributions of intentional behavior are activated whenever infants recognize an object as a psychological agent, based on an evolutionarily designed system that is sensitive to certain cues such as self-propulsion, contingent reactivity, or equifinal variation of the action (Baron-Cohen, 1995; Gergely & Csibra, 2003; Johnson, 2000; Leslie, 1994). This line of research also bears on atypical development, as children with autism are deficient in theory of mind. One may hope that nonverbal theory of mind tasks that reliably predict later theory of mind skills will be adapted for use with this population and eventually used for the early detection of autism. In sum, the numerous studies reported here show that by the end of the second year of life, infants have developed ways to predict human actions. The review also makes clear that we do not yet fully understand how deep infants' insight into the mind really is. Nonetheless, there appears to be some consensus that infants, like chimpanzees, understand the goals, intentions, perception, and knowledge of others.
This provides the foundations for the full-fledged adult-like naive psychology that develops gradually in early childhood.

  6. Transforming Collaborative Process Models into Interface Process Models by Applying an MDA Approach

    NASA Astrophysics Data System (ADS)

    Lazarte, Ivanna M.; Chiotti, Omar; Villarreal, Pablo D.

    Collaborative business models among enterprises require defining collaborative business processes. Enterprises implement B2B collaborations to execute these processes. In B2B collaborations the integration and interoperability of processes and systems of the enterprises are required to support the execution of collaborative processes. From a collaborative process model, which describes the global view of the enterprise interactions, each enterprise must define the interface process that represents the role it performs in the collaborative process in order to implement the process in a Business Process Management System. Hence, in this work we propose a method for the automatic generation of the interface process model of each enterprise from a collaborative process model. This method is based on a Model-Driven Architecture to transform collaborative process models into interface process models. By applying this method, interface processes are guaranteed to be interoperable and defined according to a collaborative process.
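
    The projection from a global collaborative model to one enterprise's interface process can be sketched as a filter over the interaction sequence: keep only the interactions that enterprise takes part in, seen from its side. The names and the tuple representation below are illustrative assumptions, not the paper's MDA transformation, which operates on process modeling languages.

    ```python
    # Hypothetical sketch: derive an enterprise's interface process from a
    # global collaborative process model given as (sender, receiver, message)
    # interactions. Names and the tuple format are illustrative assumptions.

    def interface_process(collab_model, enterprise):
        """Keep only the interactions `enterprise` participates in, viewed
        from its side (send vs. receive)."""
        view = []
        for sender, receiver, message in collab_model:
            if sender == enterprise:
                view.append(("send", message, receiver))
            elif receiver == enterprise:
                view.append(("receive", message, sender))
        return view

    # Global view of the enterprise interactions:
    collab = [("A", "B", "order"), ("B", "A", "invoice"), ("B", "C", "shipment")]
    print(interface_process(collab, "A"))
    # → [('send', 'order', 'B'), ('receive', 'invoice', 'B')]
    ```

    Because each enterprise's view is derived mechanically from the same global model, the resulting interface processes are consistent with one another by construction, which is the interoperability guarantee the abstract refers to.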

  7. Extensible packet processing architecture

    DOEpatents

    Robertson, Perry J.; Hamlet, Jason R.; Pierson, Lyndon G.; Olsberg, Ronald R.; Chun, Guy D.

    2013-08-20

    A technique for distributed packet processing includes sequentially passing packets associated with packet flows between a plurality of processing engines along a flow-through data bus linking the plurality of processing engines in series. At least one packet within a given packet flow is marked by a given processing engine to signify to the other processing engines that the given processing engine has claimed the given packet flow for processing. A processing function is applied to each of the packet flows within the processing engines and the processed packets are output on a time-shared, arbitrated data bus coupled to the plurality of processing engines.
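
    The flow-claiming scheme can be sketched in a few lines: packets traverse the engines in series, and the first engine with spare capacity marks an unclaimed flow as its own. All names and the capacity rule here are illustrative assumptions, not the patented design.

    ```python
    # Sketch of flow claiming on a serial "flow-through" bus (hypothetical names).

    class Engine:
        def __init__(self, name, capacity):
            self.name = name
            self.capacity = capacity      # how many flows this engine will claim
            self.claimed = set()

        def offer(self, packet, claimed_flows):
            """Process the packet if we own (or can claim) its flow."""
            flow = packet["flow"]
            if flow in self.claimed:
                packet["processed_by"] = self.name
                return True
            if flow not in claimed_flows and len(self.claimed) < self.capacity:
                claimed_flows[flow] = self.name   # mark: flow now claimed by us
                self.claimed.add(flow)
                packet["processed_by"] = self.name
                return True
            return False                          # pass downstream unchanged

    def run_bus(engines, packets):
        claimed = {}                              # flow -> claiming engine
        for pkt in packets:                       # packets passed sequentially
            for eng in engines:                   # engines linked in series
                if eng.offer(pkt, claimed):
                    break
        return packets                            # time-shared output side
    ```

    The point of the marking step is that downstream engines never need a shared dispatcher: each one learns from the marked packet that the flow is taken and simply passes it along.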

  8. Processed and ultra-processed foods are associated with lower-quality nutrient profiles in children from Colombia.

    PubMed

    Cornwell, Brittany; Villamor, Eduardo; Mora-Plazas, Mercedes; Marin, Constanza; Monteiro, Carlos A; Baylin, Ana

    2018-01-01

    To determine if processed and ultra-processed foods consumed by children in Colombia are associated with lower-quality nutrition profiles than less processed foods. We obtained information on sociodemographic and anthropometric variables and dietary information through dietary records and 24 h recalls from a convenience sample of the Bogotá School Children Cohort. Foods were classified into three categories: (i) unprocessed and minimally processed foods, (ii) processed culinary ingredients and (iii) processed and ultra-processed foods. We also examined the combination of unprocessed foods and processed culinary ingredients. Representative sample of children from low- to middle-income families in Bogotá, Colombia. Children aged 5-12 years in 2011 Bogotá School Children Cohort. We found that processed and ultra-processed foods are of lower dietary quality in general. Nutrients that were lower in processed and ultra-processed foods following adjustment for total energy intake included: n-3 PUFA, vitamins A, B12, C and E, Ca and Zn. Nutrients that were higher in energy-adjusted processed and ultra-processed foods compared with unprocessed foods included: Na, sugar and trans-fatty acids, although we also found that some healthy nutrients, including folate and Fe, were higher in processed and ultra-processed foods compared with unprocessed and minimally processed foods. Processed and ultra-processed foods generally have unhealthy nutrition profiles. Our findings suggest the categorization of foods based on processing characteristics is promising for understanding the influence of food processing on children's dietary quality. More studies accounting for the type and degree of food processing are needed.

  9. Dynamic control of remelting processes

    DOEpatents

    Bertram, Lee A.; Williamson, Rodney L.; Melgaard, David K.; Beaman, Joseph J.; Evans, David G.

    2000-01-01

    An apparatus and method of controlling a remelting process by providing measured process variable values to a process controller; estimating process variable values using a process model of a remelting process; and outputting estimated process variable values from the process controller. Feedback and feedforward control devices receive the estimated process variable values and adjust inputs to the remelting process. Electrode weight, electrode mass, electrode gap, process current, process voltage, electrode position, electrode temperature, electrode thermal boundary layer thickness, electrode velocity, electrode acceleration, slag temperature, melting efficiency, cooling water temperature, cooling water flow rate, crucible temperature profile, slag skin temperature, and/or drip short events are employed, as are parameters representing physical constraints of electroslag remelting or vacuum arc remelting, as applicable.
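
    The structure of such a controller, an estimator driven by a process model plus feedback and feedforward terms acting on the estimated variables, can be sketched as below. The first-order plant, the blending estimator, and all gains are assumptions for illustration, not the patented models.

    ```python
    # Sketch of model-based remelting control: an estimator blends a process-model
    # prediction with a measurement, and feedforward + feedback act on the
    # estimate. The gains, blending weights, and first-order plant are assumptions.

    def simulate(steps, setpoint, k_ff=0.5, k_fb=0.8, alpha=0.9):
        estimate = 0.0   # estimated process variable (e.g. melt rate)
        actual = 0.0     # true process variable
        u = 0.0          # control input (e.g. process current)
        history = []
        for _ in range(steps):
            # process model predicts from the previous estimate and input
            predicted = alpha * estimate + (1 - alpha) * u
            # estimator: blend the model prediction with the measured value
            estimate = 0.5 * predicted + 0.5 * actual
            # feedforward on the setpoint, feedback on the estimated error
            u = k_ff * setpoint + k_fb * (setpoint - estimate)
            # plant: simple first-order response to the control input
            actual += 0.2 * (u - actual)
            history.append(actual)
        return history
    ```

    With these illustrative gains the loop settles with a steady-state offset, near 72% of the setpoint, as expected for purely proportional feedback; it only shows the loop structure, not tuned performance.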

  10. On Intelligent Design and Planning Method of Process Route Based on Gun Breech Machining Process

    NASA Astrophysics Data System (ADS)

    Hongzhi, Zhao; Jian, Zhang

    2018-03-01

    The paper states an approach of intelligent design and planning of process route based on gun breech machining process, against several problems, such as complex machining process of gun breech, tedious route design and long period of its traditional unmanageable process route. Based on gun breech machining process, intelligent design and planning system of process route are developed by virtue of DEST and VC++. The system includes two functional modules--process route intelligent design and its planning. The process route intelligent design module, through the analysis of gun breech machining process, summarizes breech process knowledge so as to complete the design of knowledge base and inference engine. And then gun breech process route intelligently output. On the basis of intelligent route design module, the final process route is made, edited and managed in the process route planning module.

  11. Gasoline from coal in the state of Illinois: feasibility study. Volume I. Design. [KBW gasification process, ICI low-pressure methanol process and Mobil M-gasoline process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1980-01-01

    Volume 1 describes the proposed plant: the KBW gasification process, the ICI low-pressure methanol process, and the Mobil M-gasoline process, together with ancillary processes such as the oxygen plant, shift process, RECTISOL purification process, sulfur recovery equipment, and pollution control equipment. Numerous engineering diagrams are included. (LTN)

  12. Performing a local reduction operation on a parallel computer

    DOEpatents

    Blocksome, Michael A; Faraj, Daniel A

    2013-06-04

    A parallel computer including compute nodes, each including two reduction processing cores, a network write processing core, and a network read processing core, each processing core assigned an input buffer. Copying, in interleaved chunks by the reduction processing cores, contents of the reduction processing cores' input buffers to an interleaved buffer in shared memory; copying, by one of the reduction processing cores, contents of the network write processing core's input buffer to shared memory; copying, by another of the reduction processing cores, contents of the network read processing core's input buffer to shared memory; and locally reducing in parallel by the reduction processing cores: the contents of the reduction processing core's input buffer; every other interleaved chunk of the interleaved buffer; the copied contents of the network write processing core's input buffer; and the copied contents of the network read processing core's input buffer.
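
    The interleaved-copy-and-reduce scheme can be sketched in plain Python; the chunk size, buffer layout, and the serial walk over the chunks are illustrative assumptions (in the patent, the two reduction cores work in parallel, each taking every other chunk).

    ```python
    # Sketch of the interleaved local reduction (chunk size, layout, and the
    # serial walk are illustrative; the patent's reduction cores run in parallel).

    def interleave(buf_a, buf_b, chunk=2):
        """Copy the two reduction cores' input buffers into one interleaved
        buffer in shared memory, alternating chunks from each."""
        out = []
        for i in range(0, len(buf_a), chunk):
            out.extend(buf_a[i:i + chunk])   # chunk from reduction core A
            out.extend(buf_b[i:i + chunk])   # chunk from reduction core B
        return out

    def local_reduce(core_a, core_b, net_write, net_read, chunk=2):
        shared = interleave(core_a, core_b, chunk)
        total = [0] * len(core_a)
        # walk the interleaved chunks and map each back to its element offset
        # (each reduction core would take every other chunk, in parallel)
        for c, start in enumerate(range(0, len(shared), chunk)):
            base = (c // 2) * chunk          # element offset of this chunk
            for j, v in enumerate(shared[start:start + chunk]):
                total[base + j] += v
        # fold in the copied network write/read cores' input buffers
        for buf in (net_write, net_read):
            for i, v in enumerate(buf):
                total[i] += v
        return total
    ```

    The interleaving matters only for parallelism: alternating chunks lets the two reduction cores split the shared buffer evenly without contending for the same cache lines; the arithmetic result is just the element-wise sum of the four input buffers.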

  13. Performing a local reduction operation on a parallel computer

    DOEpatents

    Blocksome, Michael A.; Faraj, Daniel A.

    2012-12-11

    A parallel computer including compute nodes, each including two reduction processing cores, a network write processing core, and a network read processing core, each processing core assigned an input buffer. Copying, in interleaved chunks by the reduction processing cores, contents of the reduction processing cores' input buffers to an interleaved buffer in shared memory; copying, by one of the reduction processing cores, contents of the network write processing core's input buffer to shared memory; copying, by another of the reduction processing cores, contents of the network read processing core's input buffer to shared memory; and locally reducing in parallel by the reduction processing cores: the contents of the reduction processing core's input buffer; every other interleaved chunk of the interleaved buffer; the copied contents of the network write processing core's input buffer; and the copied contents of the network read processing core's input buffer.

  14. Situation awareness acquired from monitoring process plants - the Process Overview concept and measure.

    PubMed

    Lau, Nathan; Jamieson, Greg A; Skraaning, Gyrd

    2016-07-01

    We introduce Process Overview, a situation awareness characterisation of the knowledge derived from monitoring process plants. Process Overview is based on observational studies of process control work in the literature. The characterisation is applied to develop a query-based measure called the Process Overview Measure. The goal of the measure is to improve the coupling between situation and awareness according to process plant properties and operator cognitive work. A companion article presents the empirical evaluation of the Process Overview Measure in a realistic process control setting. The Process Overview Measure demonstrated sensitivity and validity by revealing significant effects of experimental manipulations that corroborated other empirical results. The measure also demonstrated adequate inter-rater reliability and practicality for measuring SA based on data collected by process experts. Practitioner Summary: The Process Overview Measure is a query-based measure for assessing operator situation awareness from monitoring process plants in representative settings.

  15. 43 CFR 2804.19 - How will BLM process my Processing Category 6 application?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 43 Public Lands: Interior 2 2012-10-01 2012-10-01 false How will BLM process my Processing... process my Processing Category 6 application? (a) For Processing Category 6 applications, you and BLM must enter into a written agreement that describes how BLM will process your application. The final agreement...

  16. 43 CFR 2804.19 - How will BLM process my Processing Category 6 application?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 43 Public Lands: Interior 2 2013-10-01 2013-10-01 false How will BLM process my Processing... process my Processing Category 6 application? (a) For Processing Category 6 applications, you and BLM must enter into a written agreement that describes how BLM will process your application. The final agreement...

  17. Process Correlation Analysis Model for Process Improvement Identification

    PubMed Central

    Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses; based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used as the base throughout the process of software process improvement. CMMI defines a set of process areas involved in software development and what is to be carried out in each process area in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort involved and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data. PMID:24977170

  18. Process correlation analysis model for process improvement identification.

    PubMed

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses; based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used as the base throughout the process of software process improvement. CMMI defines a set of process areas involved in software development and what is to be carried out in each process area in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort involved and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  19. Cleanliness of Ti-bearing Al-killed ultra-low-carbon steel during different heating processes

    NASA Astrophysics Data System (ADS)

    Guo, Jian-long; Bao, Yan-ping; Wang, Min

    2017-12-01

    During the production of Ti-bearing Al-killed ultra-low-carbon (ULC) steel, two different heating processes were used when the converter tapping temperature or the molten steel temperature in the Ruhrstahl-Heraeus (RH) process was low: heating by Al addition during the RH decarburization process and final deoxidation at the end of the RH decarburization process (process-I), and increasing the oxygen content at the end of RH decarburization, heating and final deoxidation by one-time Al addition (process-II). Temperature increases of 10°C by different processes were studied; the results showed that the two heating processes could achieve the same heating effect. The T.[O] content in the slab and the refining process was better controlled by process-I than by process-II. Statistical analysis of inclusions showed that the numbers of inclusions in the slab obtained by process-I were substantially less than those in the slab obtained by process-II. For process-I, the Al2O3 inclusions produced by Al added to induce heating were substantially removed at the end of decarburization. The amounts of inclusions were substantially greater for process-II than for process-I at different refining stages because of the higher dissolved oxygen concentration in process-II. Industrial test results showed that process-I was more beneficial for improving the cleanliness of molten steel.

  20. Application of agent-based system for bioprocess description and process improvement.

    PubMed

    Gao, Ying; Kipling, Katie; Glassey, Jarka; Willis, Mark; Montague, Gary; Zhou, Yuhong; Titchener-Hooker, Nigel J

    2010-01-01

    Modeling plays an important role in bioprocess development for design and scale-up. Predictive models can also be used in biopharmaceutical manufacturing to assist decision-making either to maintain process consistency or to identify optimal operating conditions. To predict the whole bioprocess performance, the strong interactions present in a processing sequence must be adequately modeled. Traditionally, bioprocess modeling considers process units separately, which makes it difficult to capture the interactions between units. In this work, a systematic framework is developed to analyze the bioprocesses based on a whole process understanding and considering the interactions between process operations. An agent-based approach is adopted to provide a flexible infrastructure for the necessary integration of process models. This enables the prediction of overall process behavior, which can then be applied during process development or once manufacturing has commenced, in both cases leading to the capacity for fast evaluation of process improvement options. The multi-agent system comprises a process knowledge base, process models, and a group of functional agents. In this system, agent components co-operate with each other in performing their tasks. These include the description of the whole process behavior, evaluating process operating conditions, monitoring of the operating processes, predicting critical process performance, and providing guidance to decision-making when coping with process deviations. During process development, the system can be used to evaluate the design space for process operation. During manufacture, the system can be applied to identify abnormal process operation events and then to provide suggestions as to how best to cope with the deviations. In all cases, the function of the system is to ensure an efficient manufacturing process. 
The implementation of the agent-based approach is illustrated via selected application scenarios, which demonstrate how such a framework may enable the better integration of process operations by providing a plant-wide process description to facilitate process improvement. Copyright 2009 American Institute of Chemical Engineers

  1. Electricity from sunlight. [low cost silicon for solar cells

    NASA Technical Reports Server (NTRS)

    Yaws, C. L.; Miller, J. W.; Lutwack, R.; Hsu, G.

    1978-01-01

    The paper discusses a number of new unconventional processes proposed for the low-cost production of silicon for solar cells. Consideration is given to: (1) the Battelle process (Zn/SiCl4), (2) the Battelle process (SiI4), (3) the Silane process, (4) the Motorola process (SiF4/SiF2), (5) the Westinghouse process (Na/SiCl4), (6) the Dow Corning process (C/SiO2), (7) the AeroChem process (SiCl4/H atom), and (8) the Stanford process (Na/SiF4). Preliminary results indicate that the conventional process and the SiI4 process cannot meet the project goal of $10/kg by 1986. Preliminary cost evaluation results for the Zn/SiCl4 process are favorable.

  2. Composing Models of Geographic Physical Processes

    NASA Astrophysics Data System (ADS)

    Hofer, Barbara; Frank, Andrew U.

    Processes are central for geographic information science; yet geographic information systems (GIS) lack capabilities to represent process-related information. A prerequisite to including processes in GIS software is a general method to describe geographic processes independently of application disciplines. This paper presents such a method, namely a process description language. The vocabulary of the process description language is derived formally from mathematical models. Physical processes in geography can be described in two equivalent languages: partial differential equations or partial difference equations, where the latter can be shown graphically and used as a method for application specialists to enter their process models. The vocabulary of the process description language comprises components for describing the general behavior of prototypical geographic physical processes. These process components can be composed into basic models of geographic physical processes, as shown by means of an example.
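
    The PDE/difference-equation equivalence the language relies on can be illustrated with one-dimensional diffusion, du/dt = D * d2u/dx2, discretised explicitly; the variable names and boundary treatment below are illustrative.

    ```python
    # 1-D diffusion, du/dt = D * d2u/dx2, written as the partial difference
    # equation u[i] += d * (u[i-1] - 2*u[i] + u[i+1]) with d = D*dt/dx^2.

    def diffuse(u, d, steps):
        """Explicit update with fixed boundary cells; stable for d <= 0.5."""
        u = list(u)
        for _ in range(steps):
            u = [u[i] if i in (0, len(u) - 1)          # fixed boundaries
                 else u[i] + d * (u[i-1] - 2*u[i] + u[i+1])
                 for i in range(len(u))]
        return u

    print(diffuse([0, 0, 1, 0, 0], 0.25, 1))   # a peak spreads to its neighbours
    # → [0, 0.25, 0.5, 0.25, 0]
    ```

    The update rule is exactly the kind of local, graphically representable difference equation the abstract says application specialists could enter directly.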

  3. Process-based tolerance assessment of connecting rod machining process

    NASA Astrophysics Data System (ADS)

    Sharma, G. V. S. S.; Rao, P. Srinivasa; Surendra Babu, B.

    2016-06-01

    Process tolerancing based on process capability studies is the optimistic and pragmatic approach to determining manufacturing process tolerances. On adopting the define-measure-analyze-improve-control approach, the process potential capability index (Cp) and the process performance capability index (Cpk) values of the identified process characteristics of the connecting rod machining process are achieved to be greater than the industry benchmark of 1.33, i.e., the four sigma level. The tolerance chain diagram methodology is applied to the connecting rod in order to verify the manufacturing process tolerances at various operations of the connecting rod manufacturing process. This paper bridges the gap between the existing dimensional tolerances obtained via tolerance charting and the process capability studies of the connecting rod component. Finally, a process tolerancing comparison has been carried out using tolerance capability expert software.
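
    The two indices have standard definitions that a short function makes concrete: Cp = (USL - LSL)/(6*sigma) measures spread against the tolerance band, while Cpk = min(USL - mu, mu - LSL)/(3*sigma) also penalises an off-centre process. The sample data below is invented for illustration; the 1.33 benchmark cited above corresponds to a four-sigma process.

    ```python
    # Standard process capability indices; the sample values are invented.
    from statistics import mean, stdev

    def capability(samples, lsl, usl):
        mu, sigma = mean(samples), stdev(samples)
        cp = (usl - lsl) / (6 * sigma)               # potential capability
        cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # performance capability
        return cp, cpk

    print(capability([8, 10, 12], 4, 16))   # centred process: mean 10, stdev 2
    # → (1.0, 1.0)
    ```

    Tightening the upper limit to 13 drops Cpk to 0.5 while Cp only falls to 0.75, which is why Cpk, not Cp, is the index compared against the 1.33 benchmark.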

  4. Intranode data communications in a parallel computer

    DOEpatents

    Archer, Charles J; Blocksome, Michael A; Miller, Douglas R; Ratterman, Joseph D; Smith, Brian E

    2014-01-07

    Intranode data communications in a parallel computer that includes compute nodes configured to execute processes, where the data communications include: allocating, upon initialization of a first process of a compute node, a region of shared memory; establishing, by the first process, a predefined number of message buffers, each message buffer associated with a process to be initialized on the compute node; sending, to a second process on the same compute node, a data communications message without determining whether the second process has been initialized, including storing the data communications message in the message buffer of the second process; and upon initialization of the second process: retrieving, by the second process, a pointer to the second process's message buffer; and retrieving, by the second process from the second process's message buffer in dependence upon the pointer, the data communications message sent by the first process.
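
    The key idea, sending to a process that may not exist yet by writing into a buffer pre-allocated for it in shared memory, can be sketched as follows; the class and function names are illustrative, not the patent's.

    ```python
    # Sketch: per-process message buffers allocated before the processes exist.

    class SharedRegion:
        """Stands in for the compute node's shared-memory region."""
        def __init__(self, n_procs):
            # one message buffer per process that may be initialized later
            self.buffers = {rank: [] for rank in range(n_procs)}

    def send(region, dst_rank, message):
        # no check whether dst_rank has been initialized yet
        region.buffers[dst_rank].append(message)

    def init_process(region, rank):
        # on initialization, retrieve a pointer to our own buffer and drain
        # any messages that were sent before we existed
        inbox = region.buffers[rank]
        return list(inbox)
    ```

    Pre-establishing the buffers is what removes the need for an initialization handshake: the sender never blocks on, or even queries, the receiver's state.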

  5. Intranode data communications in a parallel computer

    DOEpatents

    Archer, Charles J; Blocksome, Michael A; Miller, Douglas R; Ratterman, Joseph D; Smith, Brian E

    2013-07-23

    Intranode data communications in a parallel computer that includes compute nodes configured to execute processes, where the data communications include: allocating, upon initialization of a first process of a compute node, a region of shared memory; establishing, by the first process, a predefined number of message buffers, each message buffer associated with a process to be initialized on the compute node; sending, to a second process on the same compute node, a data communications message without determining whether the second process has been initialized, including storing the data communications message in the message buffer of the second process; and upon initialization of the second process: retrieving, by the second process, a pointer to the second process's message buffer; and retrieving, by the second process from the second process's message buffer in dependence upon the pointer, the data communications message sent by the first process.

  6. Canadian Libraries and Mass Deacidification.

    ERIC Educational Resources Information Center

    Pacey, Antony

    1992-01-01

    Considers the advantages and disadvantages of six mass deacidification processes that libraries can use to salvage printed materials: the Wei T'o process, the Diethyl Zinc (DEZ) process, the FMC (Lithco) process, the Book Preservation Associates (BPA) process, the "Bookkeeper" process, and the "Lyophilization" process. The…

  7. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

    Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty but ignore the model uncertainty for process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating model averaging methods into the framework of variance-based global sensitivity analysis, given that the model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
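
    A toy version of such an index can be computed by Monte Carlo for an additive output y = a + b, where each process draw involves both a model choice and a random parameter. The two "recharge-like" models, the equal model weights, and the additive output are assumptions for illustration only, not the paper's groundwater application.

    ```python
    # Toy multimodel process sensitivity: the fraction of output variance
    # explained by process A when y = a + b and each process has alternative
    # models (model uncertainty) with random parameters (parametric uncertainty).
    import random

    def process_sensitivity_a(models_a, models_b, n=50000, seed=1):
        rng = random.Random(seed)
        def draw(models):
            model = rng.choice(models)    # model uncertainty (equal weights)
            return model(rng.random())    # parametric uncertainty
        a_s = [draw(models_a) for _ in range(n)]
        b_s = [draw(models_b) for _ in range(n)]
        ys = [a + b for a, b in zip(a_s, b_s)]
        def var(xs):
            m = sum(xs) / len(xs)
            return sum((x - m) ** 2 for x in xs) / len(xs)
        # additive y: E[y | process-A draw] = a + E[b], so Var(E[y|A]) = Var(a)
        return var(a_s) / var(ys)
    ```

    For two process-A models a = 2t and a = 2t + 1 against a single model b = t, with t uniform on [0, 1], the analytic index is Var(a)/Var(y) = (7/12)/(8/12) = 0.875; note that the between-model spread of the two A models contributes to the index, which is exactly what a parameters-only sensitivity analysis would miss.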

  8. Depth-of-processing effects on priming in stem completion: tests of the voluntary-contamination, conceptual-processing, and lexical-processing hypotheses.

    PubMed

    Richardson-Klavehn, A; Gardiner, J M

    1998-05-01

    Depth-of-processing effects on incidental perceptual memory tests could reflect (a) contamination by voluntary retrieval, (b) sensitivity of involuntary retrieval to prior conceptual processing, or (c) a deficit in lexical processing during graphemic study tasks that affects involuntary retrieval. The authors devised an extension of incidental test methodology--making conjunctive predictions about response times as well as response proportions--to discriminate among these alternatives. They used graphemic, phonemic, and semantic study tasks, and a word-stem completion test with incidental, intentional, and inclusion instructions. Semantic study processing was superior to phonemic study processing in the intentional and inclusion tests, but semantic and phonemic study processing produced equal priming in the incidental test, showing that priming was uncontaminated by voluntary retrieval--a conclusion reinforced by the response-time data--and that priming was insensitive to prior conceptual processing. The incidental test nevertheless showed a priming deficit following graphemic study processing, supporting the lexical-processing hypothesis. Adding a lexical decision to the 3 study tasks eliminated the priming deficit following graphemic study processing, but did not influence priming following phonemic and semantic processing. The results provide the first clear evidence that depth-of-processing effects on perceptual priming can reflect lexical processes, rather than voluntary contamination or conceptual processes.

  9. Improving operational anodising process performance using simulation approach

    NASA Astrophysics Data System (ADS)

    Liong, Choong-Yeun; Ghazali, Syarah Syahidah

    2015-10-01

    The use of aluminium is very widespread, especially in transportation, electrical and electronics, architectural, automotive and engineering applications sectors. Therefore, the anodizing process is an important process for aluminium in order to make the aluminium durable, attractive and weather resistant. This research is focused on the anodizing process operations in manufacturing and supplying of aluminium extrusion. The data required for the development of the model is collected from the observations and interviews conducted in the study. To study the current system, the processes involved in the anodizing process are modeled by using Arena 14.5 simulation software. Those processes consist of five main processes, namely the degreasing process, the etching process, the desmut process, the anodizing process, the sealing process and 16 other processes. The results obtained were analyzed to identify the problems or bottlenecks that occurred and to propose improvement methods that can be implemented on the original model. Based on the comparisons that have been done between the improvement methods, the productivity could be increased by reallocating the workers and reducing loading time.

  10. Value-driven process management: using value to improve processes.

    PubMed

    Melnyk, S A; Christensen, R T

    2000-08-01

    Every firm can be viewed as consisting of various processes. These processes affect everything the firm does, from accepting orders and designing products to scheduling production. In many firms, the management of processes often reflects considerations of efficiency (cost) rather than effectiveness (value). In this article, we introduce a well-structured process for managing processes that begins not with the process, but with the customer, the product, and the concept of value. This process progresses through a number of steps, which include defining value, generating the appropriate metrics, identifying the critical processes, mapping and assessing the performance of these processes, and identifying long- and short-term areas for action. What makes the approach presented in this article so powerful is that it explicitly links the customer to the process and that the process is evaluated in terms of its ability to serve customers effectively.

  11. Method for routing events from key strokes in a multi-processing computer system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rhodes, D.A.; Rustici, E.; Carter, K.H.

    1990-01-23

    The patent describes a method of routing user input in a computer system which concurrently runs a plurality of processes. It comprises: generating keycodes representative of keys typed by a user; distinguishing generated keycodes by looking up each keycode in a routing table which assigns each possible keycode to an individual assigned process of the plurality of processes, one of which is a supervisory process; then, sending each keycode to its assigned process until a keycode assigned to the supervisory process is received; sending keycodes received subsequent to the keycode assigned to the supervisory process to a buffer; next, providing additional keycodes to the supervisory process from the buffer until the supervisory process has completed operation; and sending keycodes stored in the buffer to the processes assigned therewith after the supervisory process has completed operation.
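
    The routing logic claimed above can be sketched as follows. This is a simplified illustration with hypothetical process names: it models delivery, buffering while the supervisory process is active, and the post-completion flush, but not the claimed step of feeding buffered keycodes to the supervisor itself while it runs.

```python
from collections import deque

def route_keycodes(keycodes, routing_table, supervisor="supervisor"):
    """Route each keycode to its assigned process via the routing table.
    Once a keycode assigned to the supervisory process arrives, later
    keycodes are buffered; after the supervisor completes, buffered
    keycodes are flushed to their own assigned processes.
    Returns a list of (process, keycode) delivery events."""
    deliveries = []
    buffer = deque()
    supervisor_active = False
    for code in keycodes:
        target = routing_table[code]
        if supervisor_active:
            buffer.append(code)           # hold input while supervisor runs
        elif target == supervisor:
            deliveries.append((supervisor, code))
            supervisor_active = True      # subsequent input is buffered
        else:
            deliveries.append((target, code))
    # supervisor completes: flush buffered keycodes to assigned processes
    while buffer:
        code = buffer.popleft()
        deliveries.append((routing_table[code], code))
    return deliveries
```
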

  12. Issues Management Process Course # 38401

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Binion, Ula Marie

    The purpose of this training is to advise Issues Management Coordinators (IMCs) on the revised Contractor Assurance System (CAS) Issues Management (IM) process. Terminal Objectives: Understand the Laboratory’s IM process; Understand your role in the Laboratory’s IM process. Learning Objectives: Describe the IM process within the context of the CAS; Describe the importance of implementing an institutional IM process at LANL; Describe the process flow for the Laboratory’s IM process; Apply the definition of an issue; Use available resources to determine initial screening risk levels for issues; Describe the required major process steps for each risk level; Describe the personnel responsibilities for IM process implementation; Access available resources to support IM process implementation.

  13. Social network supported process recommender system.

    PubMed

    Ye, Yanming; Yin, Jianwei; Xu, Yueshen

    2014-01-01

    Process recommendation technologies have gained increasing attention in the field of intelligent business process modeling as aids to the modeling task. However, most existing technologies rely only on process structure analysis and do not take the social features of processes into account, even though process modeling is complex and comprehensive in most situations. This paper studies the feasibility of applying social network research technologies to process recommendation and builds a social network of processes based on feature similarities. Three process matching degree measurements are then presented, and the system implementation is discussed. Finally, experimental evaluations and future work are presented.
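
    The paper's three matching degree measurements are not given in this snippet. As a hedged stand-in, Jaccard similarity over process feature sets illustrates the general idea of ranking candidate processes by similarity to a target process; all names and feature sets below are hypothetical.

```python
def matching_degree(features_a, features_b):
    """Jaccard similarity between two processes' feature sets:
    one plausible 'matching degree' measurement."""
    a, b = set(features_a), set(features_b)
    return len(a & b) / len(a | b) if (a or b) else 1.0

def recommend(target_features, candidates, k=2):
    """Rank candidate processes (name -> feature set) by matching
    degree to the target and return the top k names."""
    ranked = sorted(candidates,
                    key=lambda name: matching_degree(target_features,
                                                     candidates[name]),
                    reverse=True)
    return ranked[:k]
```
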

  14. [Definition and stabilization of processes I. Management processes and support in a Urology Department].

    PubMed

    Pascual, Carlos; Luján, Marcos; Mora, José Ramón; Chiva, Vicente; Gamarra, Manuela

    2015-01-01

    The implementation of total quality management models in clinical departments is best adapted to the ISO 9004:2009 model. An essential part of implementing these models is the establishment of processes and their stabilization. There are four types of processes: key, management, support, and operative (clinical). Management processes have four parts: a process stabilization form, a process procedures form, a medical activities cost estimation form, and a process flow chart. In this paper we detail the creation of an essential process in a surgical department: the management of the surgery waiting list.

  15. T-Check in Technologies for Interoperability: Business Process Management in a Web Services Context

    DTIC Science & Technology

    2008-09-01

    (…UML Sequence Diagram); Figure 3: BPMN Diagram of the Order Processing Business Process; Figure 4: T-Check Process for Technology Evaluation; Figure 5: Notional System Architecture; Figure 6: Flow Chart of the Order Processing Business Process; Figure 7: Order Processing Activities. From the report body: Figure 3 (created with Intalio BPMS Designer [Intalio 2008]) shows a BPMN view of the Order Processing business process that is used in the

  16. A case study: application of statistical process control tool for determining process capability and sigma level.

    PubMed

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory documents, such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), the Health Sciences Authority, Singapore, Guidance for Product Quality Review (2008), and International Organization for Standardization ISO 9000:2005, support the application of statistical process control for better process control and understanding. In this study, risk assessment, normal probability distributions, control charts, and capability charts are employed for the selection of critical quality attributes, determination of normal probability distribution, and assessment of the statistical stability and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, the forecasting of critical quality attributes, sigma process capability, and process stability were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification units produced. Finally, the study points to areas where applying quality improvement and quality risk assessment principles can lead to six-sigma-capable processes. Statistical process control is an advantageous tool for determining the quality of any production process, and it is new to the pharmaceutical tablet production process, where quality control parameters act as quality assessment parameters. Application of risk assessment allows the selection of critical quality attributes from among the quality control parameters. Sequential application of normality tests, control charts, and capability analyses provides a valid statistical process control study of the process. Interpretation of such a study provides information about stability, process variability, changing trends, and quantification of process capability against defective production. Comparative evaluation of critical quality attributes by Pareto charts identifies the least capable and most variable processes, which are candidates for improvement. Statistical process control thus proves to be an important tool for six-sigma-capable process development and continuous quality improvement.
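
    A minimal sketch of the capability calculation underlying such a study: Cp and Cpk computed from sample data and specification limits, with the short-term sigma level commonly approximated as 3·Cpk. The sample values and limits below are hypothetical.

```python
import statistics

def process_capability(samples, lsl, usl):
    """Cp (spread vs. tolerance) and Cpk (spread and centering vs.
    tolerance) from sample data and lower/upper specification limits."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)          # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

def sigma_level(cpk):
    """Common short-term approximation: sigma level = 3 * Cpk."""
    return 3 * cpk
```
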

  17. A practical approach for exploration and modeling of the design space of a bacterial vaccine cultivation process.

    PubMed

    Streefland, M; Van Herpen, P F G; Van de Waterbeemd, B; Van der Pol, L A; Beuvery, E C; Tramper, J; Martens, D E; Toft, M

    2009-10-15

    A licensed pharmaceutical process is required to be executed within its validated ranges throughout the lifetime of product manufacturing. Changes to the process, especially for biological products, usually require the manufacturer to demonstrate through new or additional clinical testing that the safety and efficacy of the product remain unchanged. Recent changes in pharmaceutical regulations allow broader ranges of process settings, the so-called process design space, to be submitted for regulatory approval; a manufacturer can then optimize the process within the submitted ranges after the product has entered the market, which allows flexible processes. In this article, the applicability of the process design space concept is investigated for the cultivation step of a vaccine against whooping cough. A design of experiments (DoE) is applied to investigate the ranges of critical process parameters that still result in a product that meets specifications. The on-line process data, including near-infrared spectroscopy, are used to build a descriptive model of the processes used in the experimental design. Finally, the data from all processes are integrated in a multivariate batch monitoring model that represents the investigated process design space. This article demonstrates how the general principles of PAT and the process design space can be applied to an undefined biological product such as a whole-cell vaccine. The model development approach described here allows on-line monitoring and control of cultivation batches in order to assure in real time that a process is running within the process design space.
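
    The DoE step can be illustrated with a two-level full factorial design over critical process parameters. The factor names and ranges below are hypothetical assumptions for illustration, not the paper's actual cultivation parameters.

```python
from itertools import product

def full_factorial(factors):
    """Enumerate every corner-point setting of a two-level design.
    factors maps parameter name -> (low, high); returns one dict per run."""
    names = list(factors)
    return [dict(zip(names, levels))
            for levels in product(*(factors[n] for n in names))]
```

    With k factors this yields 2**k runs, each of which would then be executed and monitored to map out which settings still meet product specifications.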

  18. Processing approaches to cognition: the impetus from the levels-of-processing framework.

    PubMed

    Roediger, Henry L; Gallo, David A; Geraci, Lisa

    2002-01-01

    Processing approaches to cognition have a long history, from act psychology to the present, but perhaps their greatest boost was given by the success and dominance of the levels-of-processing framework. We review the history of processing approaches, and explore the influence of the levels-of-processing approach, the procedural approach advocated by Paul Kolers, and the transfer-appropriate processing framework. Processing approaches emphasise the procedures of mind and the idea that memory storage can be usefully conceptualised as residing in the same neural units that originally processed information at the time of encoding. Processing approaches emphasise the unity and interrelatedness of cognitive processes and maintain that they can be dissected into separate faculties only by neglecting the richness of mental life. We end by pointing to future directions for processing approaches.

  19. Global Sensitivity Analysis for Process Identification under Model Uncertainty

    NASA Astrophysics Data System (ADS)

    Ye, M.; Dai, H.; Walker, A. P.; Shi, L.; Yang, J.

    2015-12-01

    Environmental systems consist of various physical, chemical, and biological processes, and environmental models are built to simulate these processes and their interactions. For model building, improvement, and validation, it is necessary to identify important processes so that limited resources can be used to characterize them better. While global sensitivity analysis has been widely used to identify important processes, process identification has been based on deterministic process conceptualization that uses a single model to represent each process. However, environmental systems are complex, and a single process can often be simulated by multiple alternative models. Ignoring model uncertainty in process identification may bias the results, in that processes identified as important may not be so in the real world. This study addresses the problem by developing a new method of global sensitivity analysis for process identification. The new method is based on the concepts of Sobol sensitivity analysis and model averaging. Similar to Sobol sensitivity analysis for identifying important parameters, the new method evaluates the change in variance when a process is fixed at each of its alternative conceptualizations. The variance accounts for both parametric and model uncertainty using model averaging. The method is demonstrated in a synthetic groundwater modeling study that considers a recharge process and a parameterization process, each with two alternative models. Important processes of groundwater flow and transport are evaluated using the new method. The method is mathematically general and can be applied to a wide range of environmental problems.
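
    A hedged sketch of the underlying idea: by the law of total variance, the share of output variance attributable to the choice of process model (as opposed to parameter variability within each model) can serve as a process sensitivity measure. The toy models below are illustrative assumptions, not the groundwater models of the study.

```python
import random
import statistics

def process_sensitivity(models, weights, n=2000, seed=1):
    """Monte Carlo estimate of the share of output variance explained by
    the model choice for one process.  models: callables mapping a uniform
    random draw (parametric uncertainty) to an output; weights: prior
    model probabilities.  Uses the law of total variance:
    V(Y) = E_M[V(Y|M)] + V_M(E[Y|M])."""
    rng = random.Random(seed)
    samples = [[m(rng.random()) for _ in range(n)] for m in models]
    means = [statistics.mean(s) for s in samples]
    variances = [statistics.pvariance(s) for s in samples]
    grand_mean = sum(w * m for w, m in zip(weights, means))
    within = sum(w * v for w, v in zip(weights, variances))       # parameters
    between = sum(w * (m - grand_mean) ** 2
                  for w, m in zip(weights, means))                # model choice
    return between / (within + between)
```

    When alternative conceptualizations give very different outputs, the index approaches 1; when they agree, it approaches 0, flagging a process whose conceptual uncertainty matters little.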

  20. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE PAGES

    Dai, Heng; Ye, Ming; Walker, Anthony P.; ...

    2017-03-28

    A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.

  3. A model for process representation and synthesis. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Thomas, R. H.

    1971-01-01

    The problem of representing groups of loosely connected processes is investigated, and a model for process representation useful for synthesizing complex patterns of process behavior is developed. There are three parts. The first part isolates the concepts that form the basis for the process representation model by focusing on questions such as: What is a process? What is an event? Should one process be able to restrict the capabilities of another? The second part develops a model for process representation that captures the concepts and intuitions developed in the first part. The model presented is able to describe both the internal structure of individual processes and the interface structure between interacting processes. Much of the model's descriptive power derives from its use of the notion of process state as a vehicle for relating the internal and external aspects of process behavior. The third part demonstrates by example that the model for process representation is a useful one for synthesizing process behavior patterns; in it, the model is used to define a variety of interesting process behavior patterns. The dissertation closes by suggesting how the model could be used as the semantic base for a powerful language extension facility.

  4. Process and Post-Process: A Discursive History.

    ERIC Educational Resources Information Center

    Matsuda, Paul Kei

    2003-01-01

    Examines the history of process and post-process in composition studies, focusing on ways in which terms such as "current-traditional rhetoric," "process," and "post-process" have contributed to the discursive construction of reality. Argues that use of the term post-process in the context of second language writing needs to be guided by a…

  6. Feller processes: the next generation in modeling. Brownian motion, Lévy processes and beyond.

    PubMed

    Böttcher, Björn

    2010-12-03

    We present a simple construction method for Feller processes and a framework for the generation of sample paths of Feller processes. The construction is based on state space dependent mixing of Lévy processes. Brownian motion is one of the most frequently used continuous time Markov processes in applications. In recent years, Lévy processes, of which Brownian motion is a special case, have also become increasingly popular. Lévy processes are spatially homogeneous, but empirical data often suggest the use of spatially inhomogeneous processes. Thus it seems necessary to go to the next level of generalization: Feller processes. These include Lévy processes, and in particular Brownian motion, as special cases but allow spatial inhomogeneities. Many properties of Feller processes are known, but proving their very existence is, in general, very technical. Moreover, an applicable framework for the generation of sample paths of a Feller process was missing. We explain, with practitioners in mind, how to overcome both of these obstacles. In particular, our simulation technique makes it possible to apply Monte Carlo methods to Feller processes.
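
    A minimal illustration of spatial inhomogeneity, not Böttcher's general construction: an Euler scheme for a process whose diffusion coefficient depends on the current state, i.e. a state-space dependent mixing of Brownian motions. Such a process is Feller but, unlike Brownian motion, not spatially homogeneous.

```python
import random

def simulate_feller_path(x0, steps, dt, sigma, seed=0):
    """Euler scheme for dX = sigma(X) dW, a state-dependent mixing of
    Brownian motions.  sigma is a callable giving the local diffusion
    coefficient; returns the sampled path as a list of states."""
    rng = random.Random(seed)
    x = x0
    path = [x]
    for _ in range(steps):
        x += sigma(x) * rng.gauss(0.0, dt ** 0.5)   # Brownian increment
        path.append(x)
    return path
```

    Repeating the simulation many times from different start points is the Monte Carlo use case the abstract alludes to.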

  8. AIRSAR Automated Web-based Data Processing and Distribution System

    NASA Technical Reports Server (NTRS)

    Chu, Anhua; vanZyl, Jakob; Kim, Yunjin; Lou, Yunling; Imel, David; Tung, Wayne; Chapman, Bruce; Durden, Stephen

    2005-01-01

    In this paper, we present an integrated, end-to-end synthetic aperture radar (SAR) processing system that accepts data processing requests, submits processing jobs, performs quality analysis, delivers and archives processed data. This fully automated SAR processing system utilizes database and internet/intranet web technologies to allow external users to browse and submit data processing requests and receive processed data. It is a cost-effective way to manage a robust SAR processing and archival system. The integration of these functions has reduced operator errors and increased processor throughput dramatically.

  9. Simplified process model discovery based on role-oriented genetic mining.

    PubMed

    Zhao, Weidong; Liu, Xi; Dai, Weihui

    2014-01-01

    Process mining is the automated acquisition of process models from event logs. Although many process mining techniques have been developed, most of them are based on control flow. Meanwhile, existing role-oriented process mining methods focus on the correctness and integrity of roles while ignoring the role complexity of the process model, which directly impacts the understandability and quality of the model. To address these problems, we propose a genetic programming approach to mine simplified process models. Using a new metric of process complexity in terms of roles as the fitness function, we can find simpler process models. The new role complexity metric is derived from role cohesion and coupling and is applied to discover roles in process models. Moreover, the higher fitness derived from the role complexity metric also provides a guideline for redesigning process models. Finally, we conduct a case study and experiments to show that the proposed method is more effective for streamlining processes than the approaches of related studies.
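
    The cohesion/coupling idea can be sketched with one plausible metric, which is an assumption for illustration and not the paper's exact formula: the fraction of activity handovers that cross role boundaries. A role assignment with fewer cross-role handovers is simpler, so a fitness function can reward low coupling.

```python
def role_complexity(roles, handovers):
    """roles: role name -> set of activities; handovers: (activity,
    activity) pairs from the event log.  Returns the share of handovers
    that cross role boundaries (coupling); the complement stays within
    one role (cohesion)."""
    owner = {a: r for r, acts in roles.items() for a in acts}
    cross = sum(1 for a, b in handovers if owner[a] != owner[b])
    return cross / len(handovers)

def fitness(roles, handovers):
    """Higher fitness for lower role coupling."""
    return 1.0 - role_complexity(roles, handovers)
```
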

  10. Electrotechnologies to process foods

    USDA-ARS?s Scientific Manuscript database

    Electrical energy is being used to process foods. In conventional food processing plants, electricity drives mechanical devices and controls the degree of process. In recent years, several processing technologies are being developed to process foods directly with electricity. Electrotechnologies use...

  11. Challenges associated with the implementation of the nursing process: A systematic review.

    PubMed

    Zamanzadeh, Vahid; Valizadeh, Leila; Tabrizi, Faranak Jabbarzadeh; Behshid, Mojghan; Lotfi, Mojghan

    2015-01-01

    The nursing process is a scientific approach to the provision of quality nursing care. In practice, however, its implementation faces numerous challenges. With knowledge of these challenges, the nursing process can be developed appropriately. Due to the lack of comprehensive information on this subject, the current study was carried out to assess the key challenges associated with the implementation of the nursing process. To retrieve and review related studies in this field, the databases Iran medix, SID, Magiran, PubMed, Google Scholar, and ProQuest were searched using the main keywords nursing process and nursing process systematic review. Articles were retrieved in three steps: searching by keywords, review of the results against inclusion criteria, and final retrieval and assessment of the available full texts. Systematic assessment of the articles showed different challenges in the implementation of the nursing process. An intangible understanding of the concept of the nursing process, different views of the process, lack of knowledge and awareness among nurses regarding its execution, support from managing systems, and problems related to recording the nursing process were the main challenges extracted from the literature. The intangible understanding of the concept of the nursing process was identified as the main challenge. To minimize this challenge, in addition to preparing facilitators for implementation of the nursing process, addressing the intangible understanding of the concept, reconciling different views of the process, and forming teams of experts in nursing education are recommended for internalizing the nursing process among nurses.

  13. Automated synthesis of image processing procedures using AI planning techniques

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Mortensen, Helen

    1994-01-01

    This paper describes the Multimission VICAR (Video Image Communication and Retrieval) Planner (MVP) (Chien 1994) system, which uses artificial intelligence planning techniques (Iwasaki & Friedland, 1985, Pemberthy & Weld, 1992, Stefik, 1981) to automatically construct executable complex image processing procedures (using models of the smaller constituent image processing subprograms) in response to image processing requests made to the JPL Multimission Image Processing Laboratory (MIPL). The MVP system allows the user to specify the image processing requirements in terms of the various types of correction required. Given this information, MVP derives unspecified required processing steps and determines appropriate image processing programs and parameters to achieve the specified image processing goals. This information is output as an executable image processing program which can then be executed to fill the processing request.

  14. Optimisation of shock absorber process parameters using failure mode and effect analysis and genetic algorithm

    NASA Astrophysics Data System (ADS)

    Mariajayaprakash, Arokiasamy; Senthilvelan, Thiyagarajan; Vivekananthan, Krishnapillai Ponnambal

    2013-07-01

    The various process parameters affecting the quality characteristics of the shock absorber were identified using an Ishikawa diagram and failure mode and effect analysis. The identified process parameters are welding parameters (squeeze, heat control, wheel speed, and air pressure), damper sealing parameters (load, hydraulic pressure, air pressure, and fixture height), washing parameters (total alkalinity, temperature, pH value of rinsing water, and timing), and painting parameters (flowability, coating thickness, pointage, and temperature). In this paper, the painting and washing parameters are optimized by the Taguchi method. Although defects are reasonably minimized by the Taguchi method, a genetic algorithm is then applied to the Taguchi-optimized parameters in order to approach zero defects during the processes.
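
    A toy real-coded genetic algorithm of the kind that could further refine Taguchi-optimized settings. The operators (truncation selection, blend crossover, Gaussian mutation) and the quadratic test objective in the usage example are illustrative assumptions, not the paper's configuration.

```python
import random

def genetic_minimise(objective, bounds, pop_size=30, generations=80, seed=7):
    """Tiny real-coded GA: keep the best half each generation (elitism via
    truncation selection), create children by averaging two elite parents
    (blend crossover) and adding clipped Gaussian noise (mutation)."""
    rng = random.Random(seed)
    clip = lambda v, lo, hi: max(lo, min(hi, v))
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=objective)
        elite = pop[: pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            p, q = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(p, q)]       # crossover
            child = [clip(v + rng.gauss(0.0, 0.1), lo, hi)    # mutation
                     for v, (lo, hi) in zip(child, bounds)]
            children.append(child)
        pop = elite + children
    return min(pop, key=objective)
```

    In practice the objective would be a defect-count or loss model fitted around the Taguchi optimum rather than an analytic function.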

  15. Methods, media and systems for managing a distributed application running in a plurality of digital processing devices

    DOEpatents

    Laadan, Oren; Nieh, Jason; Phung, Dan

    2012-10-02

    Methods, media and systems for managing a distributed application running in a plurality of digital processing devices are provided. In some embodiments, a method includes running one or more processes associated with the distributed application in virtualized operating system environments on a plurality of digital processing devices, suspending the one or more processes, and saving network state information relating to network connections among the one or more processes. The method further includes storing process information relating to the one or more processes, recreating the network connections using the saved network state information, and restarting the one or more processes using the stored process information.
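
    The suspend/save/restore cycle described above can be illustrated with a toy checkpoint/restart sketch. The class and the mock "network state" below are invented for illustration and do not reflect the patented virtualization mechanism:

```python
import pickle

# Toy process state: a counter plus a "network connection" descriptor.
class Worker:
    def __init__(self, peer):
        self.count = 0
        self.peer = peer          # stands in for saved network state
    def step(self):
        self.count += 1

def checkpoint(worker):
    """Suspend: serialise process state and network state to a byte blob."""
    return pickle.dumps({"count": worker.count, "peer": worker.peer})

def restart(blob):
    """Restore: recreate the worker and 'reconnect' using the saved state."""
    state = pickle.loads(blob)
    w = Worker(state["peer"])
    w.count = state["count"]
    return w

w = Worker(peer="10.0.0.2:9000")
for _ in range(3):
    w.step()
blob = checkpoint(w)      # suspend and save
w2 = restart(blob)        # recreate connections, restart process
```

    The real system performs this at the operating-system level for live processes and sockets; the sketch only shows the save-then-restore ordering.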

  16. A signal processing method for the friction-based endpoint detection system of a CMP process

    NASA Astrophysics Data System (ADS)

    Chi, Xu; Dongming, Guo; Zhuji, Jin; Renke, Kang

    2010-12-01

    A signal processing method for the friction-based endpoint detection system of a chemical mechanical polishing (CMP) process is presented. The method uses wavelet threshold denoising to reduce the noise in the measured original signal, extracts the Kalman filter innovation from the denoised signal as the feature signal, and judges the CMP endpoint from the behavior of the Kalman filter innovation sequence during the CMP process. Applying this signal processing method, endpoint detection experiments on the Cu CMP process were carried out. The results show that the method can identify the endpoint of the Cu CMP process.
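
    The Kalman-innovation idea can be sketched on synthetic data: a scalar filter tracks the friction signal, and the endpoint shows up as a jump in the innovation. The wavelet denoising step is omitted here, and the signal model, noise levels, and threshold are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated friction signal: a constant level with a step at sample 300,
# where the layer being polished clears (the process endpoint).
n, true_endpoint = 600, 300
signal = np.where(np.arange(n) < true_endpoint, 1.0, 1.4) + rng.normal(0.0, 0.03, n)

# Scalar Kalman filter tracking a locally constant signal level.
q, r = 1e-5, 0.03 ** 2         # process / measurement noise variances
x, p = signal[0], 1.0
innovations = []
for z in signal:
    p += q                      # predict
    k = p / (p + r)             # Kalman gain
    innovations.append(z - x)   # innovation: measurement minus prediction
    x += k * (z - x)            # update state estimate
    p *= (1.0 - k)

# Flag the endpoint where the innovation jumps far outside its noise band.
innov = np.abs(np.array(innovations))
detected = int(np.argmax(innov[10:] > 0.2)) + 10
```

    Before the step, the innovation stays at the measurement-noise scale; at the step it jumps by roughly the step height, which is what the method exploits.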

  17. Composite faces are not (necessarily) processed coactively: A test using systems factorial technology and logical-rule models.

    PubMed

    Cheng, Xue Jun; McCarthy, Callum J; Wang, Tony S L; Palmeri, Thomas J; Little, Daniel R

    2018-06-01

    Upright faces are thought to be processed more holistically than inverted faces. In the widely used composite face paradigm, holistic processing is inferred from interference in recognition performance from a to-be-ignored face half for upright and aligned faces compared with inverted or misaligned faces. We sought to characterize the nature of holistic processing in composite faces in computational terms. We use logical-rule models (Fifić, Little, & Nosofsky, 2010) and Systems Factorial Technology (Townsend & Nozawa, 1995) to examine whether composite faces are processed by pooling top and bottom face halves into a single processing channel (coactive processing), which is one common mechanistic definition of holistic processing. By specifically operationalizing holistic processing as the pooling of features into a single decision process in our task, we are able to distinguish it from other processing models that may underlie composite face processing. For instance, a failure of selective attention might result even when top and bottom components of composite faces are processed in serial or in parallel without processing the entire face coactively. Our results show that performance is best explained by a mixture of serial and parallel processing architectures across all 4 upright and inverted, aligned and misaligned face conditions. The results indicate multichannel, featural processing of composite faces in a manner inconsistent with the notion of coactivity. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  18. Fuzzy image processing in sun sensor

    NASA Technical Reports Server (NTRS)

    Mobasser, S.; Liebe, C. C.; Howard, A.

    2003-01-01

    This paper describes how fuzzy image processing is implemented in the instrument. A comparison of the fuzzy image processing algorithm with a more conventional image processing algorithm is provided and shows that fuzzy image processing yields better accuracy than conventional image processing.

  19. DESIGNING ENVIRONMENTAL, ECONOMIC AND ENERGY EFFICIENT CHEMICAL PROCESSES

    EPA Science Inventory

    The design and improvement of chemical processes can be very challenging. The earlier energy conservation, process economics and environmental aspects are incorporated into the process development, the easier and less expensive it is to alter the process design. Process emissio...

  20. Reversing the conventional leather processing sequence for cleaner leather production.

    PubMed

    Saravanabhavan, Subramani; Thanikaivelan, Palanisamy; Rao, Jonnalagadda Raghava; Nair, Balachandran Unni; Ramasami, Thirumalachari

    2006-02-01

    Conventional leather processing generally involves a combination of single and multistep processes that employs as well as expels various biological, inorganic, and organic materials. It involves nearly 14-15 steps and discharges a huge amount of pollutants. This is primarily because conventional leather processing employs a "do-undo" process logic. In this study, the conventional leather processing steps have been reversed to overcome the problems associated with the conventional method. The charges of the skin matrix and of the chemicals, and the pH profiles of the process, have been judiciously used for reversing the process steps. This reversed process eventually avoids several acidification and basification/neutralization steps used in conventional leather processing. The developed process has been validated through various analyses such as chromium content, shrinkage temperature, softness measurements, scanning electron microscopy, and physical testing of the leathers. Further, the performance of the leathers is shown to be on par with conventionally processed leathers through bulk property evaluation. The process achieves a significant reduction in COD and TS of 53 and 79%, respectively. Water consumption and discharge are reduced by 65 and 64%, respectively. Also, the process benefits from significant reductions in chemicals, time, power, and cost compared to the conventional process.

  1. Group processing in an undergraduate biology course for preservice teachers: Experiences and attitudes

    NASA Astrophysics Data System (ADS)

    Schellenberger, Lauren Brownback

    Group processing is a key principle of cooperative learning in which small groups discuss their strengths and weaknesses and set group goals or norms. However, group processing has not been well-studied at the post-secondary level or from a qualitative or mixed methods perspective. This mixed methods study uses a phenomenological framework to examine the experience of group processing for students in an undergraduate biology course for preservice teachers. The effect of group processing on students' attitudes toward future group work and group processing is also examined. Additionally, this research investigated preservice teachers' plans for incorporating group processing into future lessons. Students primarily experienced group processing as a time to reflect on past performance. Also, students experienced group processing as a time to increase communication among group members and become motivated for future group assignments. Three factors directly influenced students' experiences with group processing: (1) previous experience with group work, (2) instructor interaction, and (3) gender. Survey data indicated that group processing had a slight positive effect on students' attitudes toward future group work and group processing. Participants who were interviewed felt that group processing was an important part of group work and that it had increased their group's effectiveness as well as their ability to work effectively with other people. Participants held positive views on group work prior to engaging in group processing, and group processing did not alter their attitude toward group work. Preservice teachers who were interviewed planned to use group work and a modified group processing protocol in their future classrooms. They also felt that group processing had prepared them for their future professions by modeling effective collaboration and group skills.
Based on this research, a new model for group processing has been created which includes extensive instructor interaction and additional group processing sessions. This study offers a new perspective on the phenomenon of group processing and informs science educators and teacher educators on the effective implementation of this important component of small-group learning.

  2. Properties of the Bivariate Delayed Poisson Process

    DTIC Science & Technology

    1974-07-01

    and Lewis (1972) in their Berkeley Symposium paper and here their analysis of the bivariate Poisson processes (without Poisson noise) is carried... Poisson processes . They cannot, however, be independent Poisson processes because their events are associated in pairs by the displace- ment centres...process because its marginal processes for events of each type are themselves (univariate) Poisson processes . Cox and Lewis (1972) assumed a

  3. The Application of Six Sigma Methodologies to University Processes: The Use of Student Teams

    ERIC Educational Resources Information Center

    Pryor, Mildred Golden; Alexander, Christine; Taneja, Sonia; Tirumalasetty, Sowmya; Chadalavada, Deepthi

    2012-01-01

    The first student Six Sigma team (activated under a QEP Process Sub-team) evaluated the course and curriculum approval process. The goal was to streamline the process and thereby shorten process cycle time and reduce confusion about how the process works. Members of this team developed flowcharts on how the process is supposed to work (by…

  4. Impact of Radio Frequency Identification (RFID) on the Marine Corps’ Supply Process

    DTIC Science & Technology

    2006-09-01

    Hypothetical Improvement Using a Real-Time Order Processing System Vice a Batch Order Processing System ................56 3. As-Is: The Current... Processing System Vice a Batch Order Processing System ................58 V. RESULTS ................................................69 A. SIMULATION...Time: Hypothetical Improvement Using a Real-Time Order Processing System Vice a Batch Order Processing System ................71 3. As-Is: The

  5. Global-local processing relates to spatial and verbal processing: implications for sex differences in cognition.

    PubMed

    Pletzer, Belinda; Scheuringer, Andrea; Scherndl, Thomas

    2017-09-05

    Sex differences have been reported for a variety of cognitive tasks and related to the use of different cognitive processing styles in men and women. It was recently argued that these processing styles share some characteristics across tasks, i.e. male approaches are oriented towards holistic stimulus aspects and female approaches are oriented towards stimulus details. In that respect, sex-dependent cognitive processing styles share similarities with attentional global-local processing. A direct relationship between cognitive processing and global-local processing has, however, not been previously established. In the present study, 49 men and 44 women completed a Navon paradigm and a Kimchi Palmer task, as well as a navigation task and a verbal fluency task, with the goal of relating the global advantage (GA) effect, as a measure of global processing, to holistic processing styles in both tasks. Indeed, participants with larger GA effects displayed more holistic processing during spatial navigation and phonemic fluency. However, the relationship to cognitive processing styles was modulated by the specific condition of the Navon paradigm, as well as the sex of participants. Thus, different types of global-local processing play different roles for cognitive processing in men and women.

  6. 21 CFR 113.83 - Establishing scheduled processes.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... competent processing authorities. If incubation tests are necessary for process confirmation, they shall... instituting the process. The incubation tests for confirmation of the scheduled processes should include the.... Complete records covering all aspects of the establishment of the process and associated incubation tests...

  7. 21 CFR 113.83 - Establishing scheduled processes.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... competent processing authorities. If incubation tests are necessary for process confirmation, they shall... instituting the process. The incubation tests for confirmation of the scheduled processes should include the.... Complete records covering all aspects of the establishment of the process and associated incubation tests...

  8. 21 CFR 113.83 - Establishing scheduled processes.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... competent processing authorities. If incubation tests are necessary for process confirmation, they shall... instituting the process. The incubation tests for confirmation of the scheduled processes should include the.... Complete records covering all aspects of the establishment of the process and associated incubation tests...

  9. A mathematical study of a random process proposed as an atmospheric turbulence model

    NASA Technical Reports Server (NTRS)

    Sidwell, K.

    1977-01-01

    A random process is formed by the product of a local Gaussian process and a random amplitude process, and the sum of that product with an independent mean value process. The mathematical properties of the resulting process are developed, including the first and second order properties and the characteristic function of general order. An approximate method for the analysis of the response of linear dynamic systems to the process is developed. The transition properties of the process are also examined.
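
    The first- and second-order properties of such a product-plus-mean process are easy to check by simulation; the particular amplitude and mean distributions below are illustrative assumptions, not those of the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000

# X = G * A + M: a zero-mean local Gaussian process G, an independent random
# amplitude process A (Rayleigh here), and an independent mean-value process M.
g = rng.normal(0.0, 1.0, n)
a = rng.rayleigh(2.0, n)             # E[A^2] = 2 * scale^2 = 8
m = rng.normal(5.0, 0.5, n)
x = g * a + m

# By independence and E[G] = 0:
#   E[X] = E[M] = 5,   Var[X] = E[A^2] * Var[G] + Var[M] = 8 + 0.25 = 8.25
mean_x, var_x = x.mean(), x.var()
```

    The product term G * A gives the process its heavy-tailed, intermittent character, which is why such models are proposed for atmospheric turbulence.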

  10. Standard services for the capture, processing, and distribution of packetized telemetry data

    NASA Technical Reports Server (NTRS)

    Stallings, William H.

    1989-01-01

    Standard functional services for the capture, processing, and distribution of packetized data are discussed with particular reference to the future implementation of packet processing systems, such as those for the Space Station Freedom. The major functions are listed under the following major categories: input processing, packet processing, and output processing. A functional block diagram of a packet data processing facility is presented, showing the distribution of the various processing functions as well as the primary data flow through the facility.

  11. Assessment of hospital processes using a process mining technique: Outpatient process analysis at a tertiary hospital.

    PubMed

    Yoo, Sooyoung; Cho, Minsu; Kim, Eunhye; Kim, Seok; Sim, Yerim; Yoo, Donghyun; Hwang, Hee; Song, Minseok

    2016-04-01

    Many hospitals are increasing their efforts to improve processes because processes play an important role in enhancing work efficiency and reducing costs. However, to date, a quantitative tool has not been available to examine the before and after effects of processes and environmental changes, other than the use of indirect indicators, such as mortality rate and readmission rate. This study used process mining technology to analyze process changes based on changes in the hospital environment, such as the construction of a new building, and to measure the effects of environmental changes in terms of consultation wait time, time spent per task, and outpatient care processes. Using process mining technology, electronic health record (EHR) log data of outpatient care before and after constructing a new building were analyzed, and the effectiveness of the technology in terms of the process was evaluated. Using the process mining technique, we found that the total time spent in outpatient care did not increase significantly compared to that before the construction of a new building, considering that the number of outpatients increased, and the consultation wait time decreased. These results suggest that the operation of the outpatient clinic was effective after changes were implemented in the hospital environment. We further identified improvements in processes using the process mining technique, thereby demonstrating the usefulness of this technique for analyzing complex hospital processes at a low cost. This study confirmed the effectiveness of process mining technology at an actual hospital site. In future studies, the use of process mining technology will be expanded by applying this approach to a larger variety of process change situations. Copyright © 2016. Published by Elsevier Ireland Ltd.

  12. Study of process variables associated with manufacturing hermetically-sealed nickel-cadmium cells

    NASA Technical Reports Server (NTRS)

    Miller, L.

    1974-01-01

    A two year study of the major process variables associated with the manufacturing process for sealed, nickel-cadmium, aerospace cells is summarized. Effort was directed toward identifying the major process variables associated with a manufacturing process, experimentally assessing each variable's effect, and imposing the necessary changes (optimization) and controls for the critical process variables to improve results and uniformity. A critical process variable associated with the sintered nickel plaque manufacturing process was identified as the manual forming operation. Critical process variables identified with the positive electrode impregnation/polarization process were impregnation solution temperature, free acid content, vacuum impregnation, and sintered plaque strength. Positive and negative electrodes were identified as a major source of carbonate contamination in sealed cells.

  13. Monitoring autocorrelated process: A geometric Brownian motion process approach

    NASA Astrophysics Data System (ADS)

    Li, Lee Siaw; Djauhari, Maman A.

    2013-09-01

    Autocorrelated process control is common in today's modern industrial process control practice. The current practice of autocorrelated process control is to eliminate the autocorrelation by using an appropriate model, such as a Box-Jenkins model, and then to conduct process control based on the residuals. In this paper we show that many time series are governed by a geometric Brownian motion (GBM) process. In this case, by using the properties of a GBM process, we only need an appropriate transformation and a model of the transformed data to obtain the conditions needed for traditional process control. An industrial example of a cocoa powder production process in a Malaysian company is presented and discussed to illustrate the advantages of the GBM approach.
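
    A minimal sketch of the GBM approach: log-transform the series so that, under a GBM model, the increments are i.i.d. normal, then apply an ordinary Shewhart chart to them. The data here are simulated, not the cocoa powder series from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate a GBM path: S_{t+1} = S_t * exp(mu + sigma * Z_t).
mu, sigma, n = 0.001, 0.02, 500
log_ret = rng.normal(mu, sigma, n)
prices = 100.0 * np.exp(np.cumsum(log_ret))

# Under a GBM model the log-increments are i.i.d. normal, so after the log
# transformation a standard Shewhart chart applies to them directly.
r = np.diff(np.log(prices))
center, s = r.mean(), r.std(ddof=1)
ucl, lcl = center + 3 * s, center - 3 * s
out_of_control = np.flatnonzero((r > ucl) | (r < lcl))
```

    Charting the raw (autocorrelated) series directly would violate the independence assumption behind the control limits; the log transformation is what restores it here.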

  14. Meta-control of combustion performance with a data mining approach

    NASA Astrophysics Data System (ADS)

    Song, Zhe

    A large-scale combustion process is complex and poses challenges for optimizing its performance. Traditional approaches based on thermal dynamics have limitations in finding optimal operational regions due to the time-shift nature of the process. Recent advances in information technology enable people to collect large volumes of process data easily and continuously. The collected process data contains rich information about the process and, to some extent, represents a digital copy of the process over time. Although large volumes of data exist in industrial combustion processes, they are not fully utilized to the level where the process can be optimized. Data mining is an emerging science which finds patterns or models in large data sets. It has found many successful applications in business marketing, medical, and manufacturing domains. The focus of this dissertation is on applying data mining to industrial combustion processes, and ultimately optimizing combustion performance. However, the philosophy, methods, and frameworks discussed in this research can also be applied to other industrial processes. Optimizing an industrial combustion process has two major challenges. One is that the underlying process model changes over time, and obtaining an accurate process model is nontrivial. The other is that a process model with high fidelity is usually highly nonlinear, so solving the optimization problem needs efficient heuristics. This dissertation is set to solve these two major challenges. The major contribution of this 4-year research is a data-driven solution to optimize the combustion process, where a process model or knowledge is identified from the process data, and optimization is then executed by evolutionary algorithms to search for optimal operating regions.

  15. 5 CFR 1653.13 - Processing legal processes.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 3 2014-01-01 2014-01-01 false Processing legal processes. 1653.13 Section 1653.13 Administrative Personnel FEDERAL RETIREMENT THRIFT INVESTMENT BOARD COURT ORDERS AND LEGAL PROCESSES AFFECTING THRIFT SAVINGS PLAN ACCOUNTS Legal Process for the Enforcement of a Participant's Legal...

  16. 5 CFR 1653.13 - Processing legal processes.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 3 2013-01-01 2013-01-01 false Processing legal processes. 1653.13 Section 1653.13 Administrative Personnel FEDERAL RETIREMENT THRIFT INVESTMENT BOARD COURT ORDERS AND LEGAL PROCESSES AFFECTING THRIFT SAVINGS PLAN ACCOUNTS Legal Process for the Enforcement of a Participant's Legal...

  17. A Search Algorithm for Generating Alternative Process Plans in Flexible Manufacturing System

    NASA Astrophysics Data System (ADS)

    Tehrani, Hossein; Sugimura, Nobuhiro; Tanimizu, Yoshitaka; Iwamura, Koji

    The capabilities and complexity of manufacturing systems are increasing, driving the need for an integrated manufacturing environment. Availability of alternative process plans is a key factor for integration of design, process planning, and scheduling. This paper describes an algorithm for generation of alternative process plans by extending the existing framework of process plan networks. A class diagram is introduced for generating process plans and process plan networks from the viewpoint of integrated process planning and scheduling systems. An incomplete search algorithm is developed for generating and searching the process plan networks. The benefit of this algorithm is that the whole process plan network does not have to be generated before the search starts. The algorithm is applicable to large process plan networks, can search wide areas of the network based on user requirements, and can generate alternative process plans and select a suitable one based on the objective functions.
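
    The lazy, generate-as-you-search idea can be sketched as a best-first search over a toy process plan network; the states, operations, and costs below are invented for illustration:

```python
import heapq

# Toy process plan network, generated lazily: each state lists candidate
# (operation, cost, next_state) triples, so the full network never has to
# be materialised before the search starts.
SUCCESSORS = {
    "raw":     [("face_mill", 3, "faced"), ("rough_turn", 2, "turned")],
    "faced":   [("drill", 2, "drilled"), ("finish_mill", 4, "done")],
    "turned":  [("drill", 2, "drilled")],
    "drilled": [("finish_mill", 3, "done"), ("grind", 5, "done")],
    "done":    [],
}

def best_first(start="raw", goal="done"):
    """Expand cheapest partial plans first; return the best complete plan."""
    frontier = [(0, start, [])]
    seen = {}
    while frontier:
        cost, state, plan = heapq.heappop(frontier)
        if state == goal:
            return cost, plan
        if seen.get(state, float("inf")) <= cost:
            continue          # already reached this state more cheaply
        seen[state] = cost
        for op, c, nxt in SUCCESSORS[state]:
            heapq.heappush(frontier, (cost + c, nxt, plan + [op]))
    return None

best = best_first()
```

    Because successors are enumerated only when a state is popped, wide regions of the network are explored on demand, matching the paper's point that the whole network need not exist before searching.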

  18. PyMS: a Python toolkit for processing of gas chromatography-mass spectrometry (GC-MS) data. Application and comparative study of selected tools.

    PubMed

    O'Callaghan, Sean; De Souza, David P; Isaac, Andrew; Wang, Qiao; Hodkinson, Luke; Olshansky, Moshe; Erwin, Tim; Appelbe, Bill; Tull, Dedreia L; Roessner, Ute; Bacic, Antony; McConville, Malcolm J; Likić, Vladimir A

    2012-05-30

    Gas chromatography-mass spectrometry (GC-MS) is a technique frequently used in targeted and non-targeted measurements of metabolites. Most existing software tools for processing of raw instrument GC-MS data tightly integrate data processing methods with a graphical user interface, facilitating interactive data processing. While interactive processing remains critically important in GC-MS applications, high-throughput studies increasingly dictate the need for command line tools suitable for scripting of high-throughput, customized processing pipelines. PyMS comprises a library of functions for processing of instrument GC-MS data developed in Python. PyMS currently provides a complete set of GC-MS processing functions, including reading of standard data formats (ANDI-MS/NetCDF and JCAMP-DX), noise smoothing, baseline correction, peak detection, peak deconvolution, peak integration, and peak alignment by dynamic programming. A novel common ion single quantitation algorithm allows automated, accurate quantitation of GC-MS electron impact (EI) fragmentation spectra when a large number of experiments are being analyzed. PyMS implements parallel processing for by-row and by-column data processing tasks based on Message Passing Interface (MPI), allowing processing to scale across multiple CPUs in distributed computing environments. A set of specifically designed experiments was performed in-house and used to comparatively evaluate the performance of PyMS and three widely used software packages for GC-MS data processing (AMDIS, AnalyzerPro, and XCMS). PyMS is a novel software package for the processing of raw GC-MS data, particularly suitable for scripting of customized processing pipelines and for data processing in batch mode. PyMS provides limited graphical capabilities and can be used both for routine data processing and interactive/exploratory data analysis. In real-life GC-MS data processing scenarios, PyMS performs as well as or better than leading software packages.
We demonstrate data processing scenarios that are simple to implement in PyMS, yet difficult to achieve with many conventional GC-MS data processing packages. Automated sample processing and quantitation with PyMS can provide substantial time savings compared to more traditional interactive software systems that tightly integrate data processing with the graphical user interface.
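
    This is not the PyMS API, but a rough illustration of the kind of scriptable smoothing-plus-peak-detection pipeline described, run on a synthetic chromatogram using only NumPy:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic single-ion chromatogram: two Gaussian peaks plus baseline noise.
t = np.arange(0, 600)
tic = (100 * np.exp(-0.5 * ((t - 150) / 8.0) ** 2)
       + 60 * np.exp(-0.5 * ((t - 420) / 10.0) ** 2)
       + rng.normal(0.0, 1.0, t.size))

# Batch-style pipeline: moving-average noise smoothing, then local-maximum
# peak detection above a noise threshold.
kernel = np.ones(9) / 9.0
smooth = np.convolve(tic, kernel, mode="same")
interior = smooth[1:-1]
is_peak = (interior > smooth[:-2]) & (interior > smooth[2:]) & (interior > 10.0)
peaks = np.flatnonzero(is_peak) + 1
```

    A library like PyMS wraps steps of this kind (plus baseline correction, deconvolution, and alignment) behind functions that can be chained in a batch script with no GUI involved.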

  19. Processing mode during repetitive thinking in socially anxious individuals: evidence for a maladaptive experiential mode.

    PubMed

    Wong, Quincy J J; Moulds, Michelle L

    2012-12-01

    Evidence from the depression literature suggests that an analytical processing mode adopted during repetitive thinking leads to maladaptive outcomes relative to an experiential processing mode. To date, in socially anxious individuals, the impact of processing mode during repetitive thinking related to an actual social-evaluative situation has not been investigated. We thus tested whether an analytical processing mode would be maladaptive relative to an experiential processing mode during anticipatory processing and post-event rumination. High and low socially anxious participants were induced to engage in either an analytical or experiential processing mode during: (a) anticipatory processing before performing a speech (Experiment 1; N = 94), or (b) post-event rumination after performing a speech (Experiment 2; N = 74). Mood, cognition, and behavioural measures were employed to examine the effects of processing mode. For high socially anxious participants, the modes had a similar effect on self-reported anxiety during both anticipatory processing and post-event rumination. Unexpectedly, relative to the analytical mode, the experiential mode led to stronger high standard and conditional beliefs during anticipatory processing, and stronger unconditional beliefs during post-event rumination. These experiments are the first to investigate processing mode during anticipatory processing and post-event rumination. Hence, these results are novel and will need to be replicated. These findings suggest that an experiential processing mode is maladaptive relative to an analytical processing mode during repetitive thinking characteristic of socially anxious individuals. Copyright © 2012 Elsevier Ltd. All rights reserved.

  20. Using process elicitation and validation to understand and improve chemotherapy ordering and delivery.

    PubMed

    Mertens, Wilson C; Christov, Stefan C; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Cassells, Lucinda J; Marquard, Jenna L

    2012-11-01

    Chemotherapy ordering and administration, in which errors have potentially severe consequences, was quantitatively and qualitatively evaluated by employing process formalism (or formal process definition), a technique derived from software engineering, to elicit and rigorously describe the process, after which validation techniques were applied to confirm the accuracy of the described process. The chemotherapy ordering and administration process, including exceptional situations and individuals' recognition of and responses to those situations, was elicited through informal, unstructured interviews with members of an interdisciplinary team. The process description (or process definition), written in a notation developed for software quality assessment purposes, guided process validation (which consisted of direct observations and semistructured interviews to confirm the elicited details for the treatment plan portion of the process). The overall process definition yielded 467 steps; 207 steps (44%) were dedicated to handling 59 exceptional situations. Validation yielded 82 unique process events (35 new expected but not yet described steps, 16 new exceptional situations, and 31 new steps in response to exceptional situations). Process participants actively altered the process as ambiguities and conflicts were discovered by the elicitation and validation components of the study. Chemotherapy error rates declined significantly during and after the project, which was conducted from October 2007 through August 2008. Each elicitation method and the subsequent validation discussions contributed uniquely to understanding the chemotherapy treatment plan review process, supporting rapid adoption of changes, improved communication regarding the process, and ensuing error reduction.

  1. Modeling interdependencies between business and communication processes in hospitals.

    PubMed

    Brigl, Birgit; Wendt, Thomas; Winter, Alfred

    2003-01-01

    The optimization and redesign of business processes in hospitals is an important challenge for hospital information management, which has to design and implement a suitable HIS architecture. Nevertheless, no tools are available that specialize in modeling information-driven business processes and their consequences for the communication between information processing tools. Therefore, we present an approach which facilitates the representation and analysis of business processes and the resulting communication processes between application components, together with their interdependencies. This approach aims not only to visualize those processes, but also to evaluate whether there are weaknesses in the information processing infrastructure which hinder the smooth implementation of the business processes.

  2. Life cycle analysis within pharmaceutical process optimization and intensification: case study of active pharmaceutical ingredient production.

    PubMed

    Ott, Denise; Kralisch, Dana; Denčić, Ivana; Hessel, Volker; Laribi, Yosra; Perrichon, Philippe D; Berguerand, Charline; Kiwi-Minsker, Lioubov; Loeb, Patrick

    2014-12-01

    As the demand for new drugs is rising, the pharmaceutical industry faces the quest of shortening development time, and thus, reducing the time to market. Environmental aspects typically still play a minor role within the early phase of process development. Nevertheless, it is highly promising to rethink, redesign, and optimize process strategies as early as possible in active pharmaceutical ingredient (API) process development, rather than later at the stage of already established processes. The study presented herein deals with a holistic life-cycle-based process optimization and intensification of a pharmaceutical production process targeting a low-volume, high-value API. Striving for process intensification by transfer from batch to continuous processing, as well as an alternative catalytic system, different process options are evaluated with regard to their environmental impact to identify bottlenecks and improvement potentials for further process development activities. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. SOI-CMOS Process for Monolithic, Radiation-Tolerant, Science-Grade Imagers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, George; Lee, Adam

    In Phase I, Voxtel worked with Jazz and Sandia to document and simulate the processes necessary to implement a DH-BSI SOI CMOS imaging process. The development is based upon mature SOI CMOS processes at both fabs, with the addition of only a few custom processing steps for integration and electrical interconnection of the fully-depleted photodetectors. In Phase I, Voxtel also characterized the Sandia process, including the CMOS7 design rules, and we developed the outline of a process option that included a “BOX etch”, which will permit a “detector in handle” SOI CMOS process to be developed. The process flows were developed in cooperation with both Jazz and Sandia process engineers, along with detailed TCAD modeling and testing of the photodiode array architectures. In addition, Voxtel tested the radiation performance of Jazz’s CA18HJ process, using standard and circular-enclosed transistors.

  4. Face to face with emotion: holistic face processing is modulated by emotional state.

    PubMed

    Curby, Kim M; Johnson, Kareem J; Tyson, Alyssa

    2012-01-01

    Negative emotions are linked with a local, rather than global, visual processing style, which may preferentially facilitate feature-based, relative to holistic, processing mechanisms. Because faces are typically processed holistically, and because social contexts are prime elicitors of emotions, we examined whether negative emotions decrease holistic processing of faces. We induced positive, negative, or neutral emotions via film clips and measured holistic processing before and after the induction: participants made judgements about cued parts of chimeric faces, and holistic processing was indexed by the interference caused by task-irrelevant face parts. Emotional state significantly modulated face-processing style, with the negative emotion induction leading to decreased holistic processing. Furthermore, self-reported change in emotional state correlated with changes in holistic processing. These results contrast with general assumptions that holistic processing of faces is automatic and immune to outside influences, and they illustrate emotion's power to modulate socially relevant aspects of visual perception.

  5. 5 CFR 581.203 - Information minimally required to accompany legal process.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... accompany legal process. 581.203 Section 581.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Process § 581.203 Information minimally required to accompany legal process. (a) Sufficient identifying information must accompany the legal process in order to enable processing by the governmental entity named...

  6. 5 CFR 581.203 - Information minimally required to accompany legal process.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... accompany legal process. 581.203 Section 581.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Process § 581.203 Information minimally required to accompany legal process. (a) Sufficient identifying information must accompany the legal process in order to enable processing by the governmental entity named...

  7. 5 CFR 581.203 - Information minimally required to accompany legal process.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... accompany legal process. 581.203 Section 581.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Process § 581.203 Information minimally required to accompany legal process. (a) Sufficient identifying information must accompany the legal process in order to enable processing by the governmental entity named...

  8. 5 CFR 581.203 - Information minimally required to accompany legal process.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... accompany legal process. 581.203 Section 581.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Process § 581.203 Information minimally required to accompany legal process. (a) Sufficient identifying information must accompany the legal process in order to enable processing by the governmental entity named...

  9. 5 CFR 581.203 - Information minimally required to accompany legal process.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... accompany legal process. 581.203 Section 581.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Process § 581.203 Information minimally required to accompany legal process. (a) Sufficient identifying information must accompany the legal process in order to enable processing by the governmental entity named...

  10. 20 CFR 405.725 - Effect of expedited appeals process agreement.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... PROCESS FOR ADJUDICATING INITIAL DISABILITY CLAIMS Expedited Appeals Process for Constitutional Issues § 405.725 Effect of expedited appeals process agreement. After an expedited appeals process agreement is... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Effect of expedited appeals process agreement...

  11. Common and distinct networks for self-referential and social stimulus processing in the human brain.

    PubMed

    Herold, Dorrit; Spengler, Stephanie; Sajonz, Bastian; Usnich, Tatiana; Bermpohl, Felix

    2016-09-01

    Self-referential processing is a complex cognitive function, involving a set of implicit and explicit processes, which complicates investigation of its distinct neural signature. The present study explores the functional overlap and dissociability of self-referential and social stimulus processing. We combined an established paradigm for explicit self-referential processing with an implicit social stimulus processing paradigm in one fMRI experiment to determine the neural effects of self-relatedness and social processing within one study. Overlapping activations were found in the orbitofrontal cortex and in the intermediate part of the precuneus. Stimuli judged as self-referential specifically activated the posterior cingulate cortex, the ventral medial prefrontal cortex, extending into anterior cingulate cortex and orbitofrontal cortex, the dorsal medial prefrontal cortex, the ventral and dorsal lateral prefrontal cortex, the left inferior temporal gyrus, and occipital cortex. Social processing specifically involved the posterior precuneus and bilateral temporo-parietal junction. Taken together, our data show, first, common networks for both processes in the medial prefrontal and the medial parietal cortex, and, second, functional differentiations between the two: an anterior-posterior gradient for social versus self-referential processing within the medial parietal cortex, specific activations for self-referential processing in the medial and lateral prefrontal cortex, and specific activations for social processing in the temporo-parietal junction.

  12. Using Unified Modelling Language (UML) as a process-modelling technique for clinical-research process improvement.

    PubMed

    Kumarapeli, P; De Lusignan, S; Ellis, T; Jones, B

    2007-03-01

    The Primary Care Data Quality programme (PCDQ) is a quality-improvement programme which processes routinely collected general practice computer data. Patient data collected from a wide range of different brands of clinical computer systems are aggregated, processed, and fed back to practices in an educational context to improve the quality of care. Process modelling is a well-established approach used to gain understanding, enable systematic appraisal, and identify areas for improvement in a business process. Unified modelling language (UML) is a general-purpose modelling technique used for this purpose. We used UML to appraise the PCDQ process to see if the efficiency and predictability of the process could be improved. Activity analysis and thinking-aloud sessions were used to collect data to generate UML diagrams. The UML model highlighted the sequential nature of the current process as a barrier to efficiency gains. It also identified the uneven distribution of process controls, lack of symmetric communication channels, critical dependencies among processing stages, and failure to implement all the lessons learned in the piloting phase. It also suggested that improved structured reporting at each stage (especially from the pilot phase), parallel processing of data, and correctly positioned process controls should improve the efficiency and predictability of research projects. Process modelling provided a rational basis for the critical appraisal of a clinical data processing system; its potential may be underutilized within health care.

  13. Use of Analogies in the Study of Diffusion

    ERIC Educational Resources Information Center

    Letic, Milorad

    2014-01-01

    Emergent processes, such as diffusion, are considered more difficult to understand than direct processes. In physiology, most processes are presented as direct processes, so emergent processes, when encountered, are even more difficult to understand. It has been suggested that, when studying diffusion, misconceptions about random processes are the…

  14. Is Analytic Information Processing a Feature of Expertise in Medicine?

    ERIC Educational Resources Information Center

    McLaughlin, Kevin; Rikers, Remy M.; Schmidt, Henk G.

    2008-01-01

    Diagnosing begins by generating an initial diagnostic hypothesis by automatic information processing. Information processing may stop here if the hypothesis is accepted, or analytical processing may be used to refine the hypothesis. This description portrays analytic processing as an optional extra in information processing, leading us to…

  15. 5 CFR 582.305 - Honoring legal process.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 1 2014-01-01 2014-01-01 false Honoring legal process. 582.305 Section... GARNISHMENT OF FEDERAL EMPLOYEES' PAY Compliance With Legal Process § 582.305 Honoring legal process. (a) The agency shall comply with legal process, except where the process cannot be complied with because: (1) It...

  16. 5 CFR 582.305 - Honoring legal process.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 1 2011-01-01 2011-01-01 false Honoring legal process. 582.305 Section... GARNISHMENT OF FEDERAL EMPLOYEES' PAY Compliance With Legal Process § 582.305 Honoring legal process. (a) The agency shall comply with legal process, except where the process cannot be complied with because: (1) It...

  17. 5 CFR 581.305 - Honoring legal process.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 1 2014-01-01 2014-01-01 false Honoring legal process. 581.305 Section... GARNISHMENT ORDERS FOR CHILD SUPPORT AND/OR ALIMONY Compliance With Process § 581.305 Honoring legal process. (a) The governmental entity shall comply with legal process, except where the process cannot be...

  18. 5 CFR 581.305 - Honoring legal process.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Honoring legal process. 581.305 Section... GARNISHMENT ORDERS FOR CHILD SUPPORT AND/OR ALIMONY Compliance With Process § 581.305 Honoring legal process. (a) The governmental entity shall comply with legal process, except where the process cannot be...

  19. 5 CFR 582.305 - Honoring legal process.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Honoring legal process. 582.305 Section... GARNISHMENT OF FEDERAL EMPLOYEES' PAY Compliance With Legal Process § 582.305 Honoring legal process. (a) The agency shall comply with legal process, except where the process cannot be complied with because: (1) It...

  20. 5 CFR 581.305 - Honoring legal process.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 1 2011-01-01 2011-01-01 false Honoring legal process. 581.305 Section... GARNISHMENT ORDERS FOR CHILD SUPPORT AND/OR ALIMONY Compliance With Process § 581.305 Honoring legal process. (a) The governmental entity shall comply with legal process, except where the process cannot be...

  1. Articulating the Resources for Business Process Analysis and Design

    ERIC Educational Resources Information Center

    Jin, Yulong

    2012-01-01

    Effective process analysis and modeling are important phases of the business process management lifecycle. When many activities and multiple resources are involved, it is very difficult to build a correct business process specification. This dissertation provides a resource perspective of business processes. It aims at a better process analysis…

  2. An Integrated Model of Emotion Processes and Cognition in Social Information Processing.

    ERIC Educational Resources Information Center

    Lemerise, Elizabeth A.; Arsenio, William F.

    2000-01-01

    Interprets literature on contributions of social cognitive and emotion processes to children's social competence in the context of an integrated model of emotion processes and cognition in social information processing. Provides neurophysiological and functional evidence for the centrality of emotion processes in personal-social decision making.…

  3. Data Processing and First Products from the Hyperspectral Imager for the Coastal Ocean (HICO) on the International Space Station

    DTIC Science & Technology

    2010-04-01

    NRL Stennis Space Center (NRL-SSC) for further processing using the NRL SSC Automated Processing System (APS). APS was developed for processing...have not previously developed automated processing for hyperspectral ocean color data. The hyperspectral processing branch includes several

  4. DISCRETE COMPOUND POISSON PROCESSES AND TABLES OF THE GEOMETRIC POISSON DISTRIBUTION.

    DTIC Science & Technology

    A concise summary of the salient properties of discrete Poisson processes, with emphasis on comparing the geometric and logarithmic Poisson processes. The...the geometric Poisson process are given for 176 sets of parameter values. New discrete compound Poisson processes are also introduced. These...processes have properties that are particularly relevant when the summation of several different Poisson processes is to be analyzed. This study provides the
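
    The geometric Poisson (Pólya-Aeppli) distribution tabulated in the report has a closed-form pmf obtained by conditioning on the Poisson count: a sum of k geometric variables is negative binomial, so the pmf is a finite sum over k. A minimal sketch (the parameter names `lam` and `theta` are our own, not the report's notation):

    ```python
    from math import comb, exp, factorial

    def geometric_poisson_pmf(n, lam, theta):
        """P(S = n) for S = X_1 + ... + X_N, where N ~ Poisson(lam) and
        X_i are i.i.d. geometric on {1, 2, ...} with success prob theta.
        Conditioning on N = k, the sum of k geometrics is negative
        binomial with pmf C(n-1, k-1) theta^k (1-theta)^(n-k)."""
        if n == 0:
            return exp(-lam)  # S = 0 exactly when N = 0
        return exp(-lam) * sum(
            (lam ** k / factorial(k))             # Poisson weight / e^-lam
            * comb(n - 1, k - 1)                  # negative-binomial count
            * theta ** k * (1 - theta) ** (n - k)
            for k in range(1, n + 1)
        )

    # Sanity check: the pmf sums to 1 and has mean lam / theta.
    pmf = [geometric_poisson_pmf(n, 2.0, 0.5) for n in range(150)]
    print(sum(pmf))                                # ~1.0
    print(sum(n * p for n, p in enumerate(pmf)))   # ~4.0
    ```

    Summing several independent geometric Poisson processes, as the abstract notes, yields another compound Poisson process, which is what makes this family convenient for superposition analysis.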

  5. Management of processes of electrochemical dimensional processing

    NASA Astrophysics Data System (ADS)

    Akhmetov, I. D.; Zakirova, A. R.; Sadykov, Z. B.

    2017-09-01

    In many industries, high-precision parts are produced from difficult-to-machine, scarce materials. Such parts can often be formed only by non-contact processing, or with minimal mechanical force, for example by electrochemical machining. At the present stage of development of metalworking, the management and automation of electrochemical machining processes are important issues. This article presents some indicators and factors of the electrochemical machining process.

  6. The Hyperspectral Imager for the Coastal Ocean (HICO): Sensor and Data Processing Overview

    DTIC Science & Technology

    2010-01-20

    backscattering coefficients, and others. Several of these software modules will be developed within the Automated Processing System (APS), a data... Automated Processing System (APS) NRL developed APS, which processes satellite data into ocean color data products. APS is a collection of methods...used for ocean color processing which provide the tools for the automated processing of satellite imagery [1]. These tools are in the process of

  7. [Study on culture and philosophy of processing of traditional Chinese medicines].

    PubMed

    Yang, Ming; Zhang, Ding-Kun; Zhong, Ling-Yun; Wang, Fang

    2013-07-01

    According to cultural views and philosophical thoughts, this paper studies the cultural origin, thinking modes, core principles, and general regulations and methods of processing; it traces the culture and history of processing, including its generation and deduction, its accumulation and promotion of experience, and its core values; and it summarizes the basic principles of processing, which are guided by holistic, objective, dynamic, balanced, and appropriate thinking. The aim is to propagate the cultural characteristics and philosophical wisdom of traditional Chinese medicine processing, to promote the inheritance and development of processing, and to ensure the maximum therapeutic value of Chinese medicine in clinical use.

  8. Containerless automated processing of intermetallic compounds and composites

    NASA Technical Reports Server (NTRS)

    Johnson, D. R.; Joslin, S. M.; Reviere, R. D.; Oliver, B. F.; Noebe, R. D.

    1993-01-01

    An automated containerless processing system has been developed to directionally solidify high temperature materials, intermetallic compounds, and intermetallic/metallic composites. The system incorporates a wide range of ultra-high purity chemical processing conditions. The utilization of image processing for automated control negates the need for temperature measurements for process control. The list of recent systems that have been processed includes Cr, Mo, Mn, Nb, Ni, Ti, V, and Zr containing aluminides. Possible uses of the system, process control approaches, and properties and structures of recently processed intermetallics are reviewed.

  9. A continuous process for the development of Kodak Aerochrome Infrared Film 2443 as a negative

    NASA Astrophysics Data System (ADS)

    Klimes, D.; Ross, D. I.

    1993-02-01

    A process for the continuous dry-to-dry development of Kodak Aerochrome Infrared Film 2443 as a negative (CIR-neg) is described. The process is well suited for production processing of long film lengths. Chemicals from three commercial film processes are used with modifications. Sensitometric procedures are recommended for the monitoring of processing quality control. Sensitometric data and operational aerial exposures indicate that films developed in this process have approximately the same effective aerial film speed as films processed in the reversal process recommended by the manufacturer (Kodak EA-5). The CIR-neg process is useful when aerial photography is acquired for resources management applications which require print reproductions. Originals can be readily reproduced using conventional production equipment (electronic dodging) in black and white or color (color compensation).

  10. Antibiotics with anaerobic ammonium oxidation in urban wastewater treatment

    NASA Astrophysics Data System (ADS)

    Zhou, Ruipeng; Yang, Yuanming

    2017-05-01

    The biofilter process is an aerobic wastewater treatment process that combines filtration, adsorption, and biological oxidation, applying design ideas from rapid sand filters to biological treatment. Engineering examples show that the process is well suited to treating low-concentration sewage and industrial wastewater. The anaerobic ammonium oxidation (anammox) process, because of its high efficiency and low energy consumption, has broad application prospects in the biological denitrification of wastewater, and its application to practical wastewater treatment has become a research hotspot at home and abroad. This paper reviews the habitats and species diversity of anammox bacteria, the various forms of the anammox process, and the operating conditions of one-stage and two-stage configurations. It focuses on laboratory research and engineering applications of anammox technology to various types of wastewater, including sludge digestion pressure filtrate, landfill leachate, aquaculture wastewater, monosodium glutamate wastewater, sewage, fecal sewage, and high-salinity wastewater, covering their characteristics, research progress, and obstacles to application. Finally, we summarize potential problems of the anammox process in treating actual wastewater, and propose that future research focus on in-depth study of the water-quality factors that hinder anammox and their regulation, and, on this basis, vigorously develop combined process optimization.

  11. Understanding scaling through history-dependent processes with collapsing sample space.

    PubMed

    Corominas-Murtra, Bernat; Hanel, Rudolf; Thurner, Stefan

    2015-04-28

    History-dependent processes are ubiquitous in natural and social systems. Many such stochastic processes, especially those that are associated with complex systems, become more constrained as they unfold, meaning that their sample space, or their set of possible outcomes, reduces as they age. We demonstrate that these sample-space-reducing (SSR) processes necessarily lead to Zipf's law in the rank distributions of their outcomes. We show that by adding noise to SSR processes the corresponding rank distributions remain exact power laws, p(x) ~ x^(-λ), where the exponent directly corresponds to the mixing ratio of the SSR process and noise. This allows us to give a precise meaning to the scaling exponent in terms of the degree to which a given process reduces its sample space as it unfolds. Noisy SSR processes further allow us to explain a wide range of scaling exponents in frequency distributions ranging from α = 2 to ∞. We discuss several applications showing how SSR processes can be used to understand Zipf's law in word frequencies, and how they are related to diffusion processes in directed networks, or to aging processes such as fragmentation. SSR processes provide a new alternative to understanding the origin of scaling in complex systems without recourse to multiplicative, preferential, or self-organized critical processes.
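
    The core SSR mechanism is easy to simulate: a trajectory starts at the top of a sample space of size N and repeatedly jumps to a state drawn uniformly from below the current one, until it reaches state 1. A known property, and a quick empirical check of the Zipf behaviour described above, is that state i is visited with probability proportional to 1/i. A minimal sketch (our own illustration, not the authors' code):

    ```python
    import random

    def ssr_run(n):
        """One sample-space-reducing trajectory: from state n, keep
        jumping to a uniformly chosen lower state until state 1."""
        visits = []
        state = n
        while state > 1:
            state = random.randint(1, state - 1)
            visits.append(state)
        return visits

    def visit_frequencies(n, runs, seed=0):
        """Relative visit frequencies of states 1..n-1 over many runs;
        these approach p(i) proportional to 1/i, i.e. Zipf's law."""
        random.seed(seed)
        counts = [0] * n
        for _ in range(runs):
            for s in ssr_run(n):
                counts[s] += 1
        total = sum(counts)
        return [c / total for c in counts]
    ```

    With n = 100 and a few thousand runs, p(1)/p(2) approaches 2 and p(1)/p(10) approaches 10, the signature of a pure power law with exponent 1 in the rank distribution.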

  12. Effects of Processing Parameters on the Forming Quality of C-Shaped Thermosetting Composite Laminates in Hot Diaphragm Forming Process

    NASA Astrophysics Data System (ADS)

    Bian, X. X.; Gu, Y. Z.; Sun, J.; Li, M.; Liu, W. P.; Zhang, Z. G.

    2013-10-01

    In this study, the effects of processing temperature and vacuum applying rate on the forming quality of C-shaped carbon fiber reinforced epoxy resin matrix composite laminates during the hot diaphragm forming process were investigated. C-shaped prepreg preforms were produced using home-made hot diaphragm forming equipment. The thickness variations of the preforms and the manufacturing defects after the diaphragm forming process, including fiber wrinkling and voids, were evaluated to understand the forming mechanism. Furthermore, both the interlaminar slipping friction and the compaction behavior of the prepreg stacks were experimentally analyzed to show the importance of the processing parameters. In addition, autoclave processing was used to cure the C-shaped preforms to investigate the changes in the defects before and after the cure process. The results show that C-shaped prepreg preforms with good forming quality can be achieved by increasing the processing temperature and reducing the vacuum applying rate, which clearly promote the prepreg interlaminar slipping process. The processing temperature and forming rate in the hot diaphragm forming process strongly influence the prepreg interply frictional force, and the maximum interlaminar frictional force can be taken as a key parameter for processing parameter optimization. The autoclave process is effective in eliminating voids in the preforms and can alleviate fiber wrinkles to a certain extent.

  13. Assessment of Advanced Coal Gasification Processes

    NASA Technical Reports Server (NTRS)

    McCarthy, John; Ferrall, Joseph; Charng, Thomas; Houseman, John

    1981-01-01

    This report represents a technical assessment of the following advanced coal gasification processes: AVCO High Throughput Gasification (HTG) Process; Bell Single-Stage High Mass Flux (HMF) Process; Cities Service/Rockwell (CS/R) Hydrogasification Process; Exxon Catalytic Coal Gasification (CCG) Process. Each process is evaluated for its potential to produce SNG from a bituminous coal. In addition to identifying the new technology these processes represent, key similarities/differences, strengths/weaknesses, and potential improvements to each process are identified. The AVCO HTG and the Bell HMF gasifiers share similarities with respect to: short residence time (SRT), high throughput rate, slagging and syngas as the initial raw product gas. The CS/R Hydrogasifier is also SRT but is non-slagging and produces a raw gas high in methane content. The Exxon CCG gasifier is a long residence time, catalytic, fluidbed reactor producing all of the raw product methane in the gasifier. The report makes the following assessments: 1) while each process has significant potential as coal gasifiers, the CS/R and Exxon processes are better suited for SNG production; 2) the Exxon process is the closest to a commercial level for near-term SNG production; and 3) the SRT processes require significant development including scale-up and turndown demonstration, char processing and/or utilization demonstration, and reactor control and safety features development.

  14. Integrated Process Modeling-A Process Validation Life Cycle Companion.

    PubMed

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-17

    During the regulatory-requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we want to present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the lifecycle concept. These applications will be shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows anticipation of out-of-specification (OOS) events, identification of critical process parameters, and risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
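
    The MC-over-IPM idea can be illustrated with a toy chain of two stacked unit operations: process-parameter variation is sampled, propagated through per-unit response models, and the fraction of simulated batches whose final CQA falls out of specification estimates process capability. All response surfaces, parameter names, and distributions below are invented for illustration; they are not from the study:

    ```python
    import random

    def unit_op_a(titer, ph):
        # Hypothetical response surface: yield drops as pH drifts from 7.
        return titer * (1.0 - 0.05 * abs(ph - 7.0))

    def unit_op_b(intermediate, temp):
        # Hypothetical response surface: yield decays away from 25 C.
        return intermediate * (0.95 - 0.01 * abs(temp - 25.0))

    def oos_probability(n, spec_low, seed=1):
        """Monte Carlo over the integrated model: sample PP variation,
        propagate through both unit operations, count OOS batches."""
        random.seed(seed)
        oos = 0
        for _ in range(n):
            ph = random.gauss(7.0, 0.2)       # assumed PP distributions
            temp = random.gauss(25.0, 1.0)
            titer = random.gauss(10.0, 0.5)
            cqa = unit_op_b(unit_op_a(titer, ph), temp)
            if cqa < spec_low:
                oos += 1
        return oos / n
    ```

    The same loop structure extends to more unit operations and to response models fitted from development data, which is what lets the IPM be refined with qualification runs and maintained with routine data.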

  15. Effects of rigor status during high-pressure processing on the physical qualities of farm-raised abalone (Haliotis rufescens).

    PubMed

    Hughes, Brianna H; Greenberg, Neil J; Yang, Tom C; Skonberg, Denise I

    2015-01-01

    High-pressure processing (HPP) is used to increase meat safety and shelf-life, with conflicting quality effects depending on rigor status during HPP. In the seafood industry, HPP is used to shuck and pasteurize oysters, but its use on abalones has only been minimally evaluated and the effect of rigor status during HPP on abalone quality has not been reported. Farm-raised abalones (Haliotis rufescens) were divided into 12 HPP treatments and 1 unprocessed control treatment. Treatments were processed pre-rigor or post-rigor at 2 pressures (100 and 300 MPa) and 3 processing times (1, 3, and 5 min). The control was analyzed post-rigor. Uniform plugs were cut from adductor and foot meat for texture profile analysis, shear force, and color analysis. Subsamples were used for scanning electron microscopy of muscle ultrastructure. Texture profile analysis revealed that post-rigor processed abalone was significantly (P < 0.05) less firm and chewy than pre-rigor processed irrespective of muscle type, processing time, or pressure. L values increased with pressure to 68.9 at 300 MPa for pre-rigor processed foot, 73.8 for post-rigor processed foot, 90.9 for pre-rigor processed adductor, and 89.0 for post-rigor processed adductor. Scanning electron microscopy images showed fraying of collagen fibers in processed adductor, but did not show pressure-induced compaction of the foot myofibrils. Post-rigor processed abalone meat was more tender than pre-rigor processed meat, and post-rigor processed foot meat was lighter in color than pre-rigor processed foot meat, suggesting that waiting for rigor to resolve prior to processing abalones may improve consumer perceptions of quality and market value. © 2014 Institute of Food Technologists®

  16. PROCESSING ALTERNATIVES FOR DESTRUCTION OF TETRAPHENYLBORATE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lambert, D; Thomas Peters, T; Samuel Fink, S

    Two processes were chosen in the 1980's at the Savannah River Site (SRS) to decontaminate the soluble High Level Waste (HLW). The In Tank Precipitation (ITP) process (1,2) was developed at SRS for the removal of radioactive cesium and actinides from the soluble HLW. Sodium tetraphenylborate was added to the waste to precipitate cesium, and monosodium titanate (MST) was added to adsorb actinides, primarily uranium and plutonium. Two products of this process were a low activity waste stream and a concentrated organic stream containing cesium tetraphenylborate and actinides adsorbed on MST. A copper catalyzed acid hydrolysis process was built to process (3, 4) the Tank 48H cesium tetraphenylborate waste in the SRS's Defense Waste Processing Facility (DWPF). Operation of the DWPF would have resulted in the production of benzene for incineration in SRS's Consolidated Incineration Facility. This process was abandoned together with the ITP process in 1998 due to high benzene in ITP caused by decomposition of excess sodium tetraphenylborate. Processing in ITP resulted in the production of approximately 1.0 million liters of HLW. SRS has chosen a solvent extraction process combined with adsorption of the actinides to decontaminate the soluble HLW stream (5). However, the waste in Tank 48H is incompatible with existing waste processing facilities. As a result, a processing facility is needed to disposition the HLW in Tank 48H. This paper will describe the process of searching for processing options by SRS task teams for the disposition of the waste in Tank 48H. In addition, attempts to develop a caustic hydrolysis process for in-tank destruction of tetraphenylborate will be presented. Lastly, the development of both a caustic and an acidic copper catalyzed peroxide oxidation process will be discussed.

  17. Manufacturing Process Selection of Composite Bicycle’s Crank Arm using Analytical Hierarchy Process (AHP)

    NASA Astrophysics Data System (ADS)

    Luqman, M.; Rosli, M. U.; Khor, C. Y.; Zambree, Shayfull; Jahidi, H.

    2018-03-01

    The crank arm is one of the important parts in a bicycle and is an expensive product due to the high cost of material and of the production process. This research aims to investigate potential manufacturing processes for fabricating a composite bicycle crank arm and to describe an approach based on the analytical hierarchy process (AHP) that assists decision makers or manufacturing engineers in determining the most suitable process to employ at the early stage of product development, in order to reduce the production cost. Four types of processes were considered, namely resin transfer molding (RTM), compression molding (CM), vacuum bag molding, and filament winding (FW). The analysis ranks these four processes for their suitability in the manufacturing of the bicycle crank arm based on five main selection factors and 10 sub-factors. The selection was performed following the AHP steps, and a consistency test was carried out to ensure that the judgements made during the pairwise comparisons were consistent. The results indicated that compression molding was the most appropriate manufacturing process because it had the highest priority value (33.6%) among the manufacturing processes considered.
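
    The AHP machinery the paper relies on (pairwise comparison matrix, priority vector, consistency check) can be sketched in a few lines. The geometric-mean row method below is a standard approximation to Saaty's principal-eigenvector priorities; the example matrix in the test is illustrative and does not reproduce the paper's judgements:

    ```python
    from math import prod

    def ahp_priorities(M):
        """Priority weights from a pairwise comparison matrix M via the
        geometric-mean row method, plus Saaty's consistency ratio CR."""
        n = len(M)
        gm = [prod(row) ** (1.0 / n) for row in M]   # row geometric means
        total = sum(gm)
        w = [g / total for g in gm]                  # normalized priorities
        # Estimate lambda_max from (M w)_i / w_i, then CR = CI / RI.
        lam = sum(sum(M[i][j] * w[j] for j in range(n)) / w[i]
                  for i in range(n)) / n
        ci = (lam - n) / (n - 1)
        ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]          # Saaty's random index
        return w, ci / ri
    ```

    A consistency ratio below 0.1 is conventionally taken as acceptable; a perfectly consistent matrix (every entry equal to a ratio of underlying weights) gives CR near zero and returns those weights exactly.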

  18. A System-Oriented Approach for the Optimal Control of Process Chains under Stochastic Influences

    NASA Astrophysics Data System (ADS)

    Senn, Melanie; Schäfer, Julian; Pollak, Jürgen; Link, Norbert

    2011-09-01

    Process chains in manufacturing consist of multiple connected processes in terms of dynamic systems. The properties of a product passing through such a process chain are influenced by the transformation of each single process. There exist various methods for the control of individual processes, such as classical state controllers from cybernetics or function-mapping approaches realized by statistical learning. These controllers ensure that a desired state is obtained at process end despite variations in the input and disturbances. The interactions between the single processes are thereby neglected, but they play an important role in the optimization of the entire process chain. We divide the overall optimization into two phases: (1) the solution of the optimization problem by Dynamic Programming to find the optimal control variable values for each process for any encountered end state of its predecessor, and (2) the application of the optimal control variables at runtime for the detected initial process state. The optimization problem is solved by selecting adequate control variables for each process in the chain backwards, based on predefined quality requirements for the final product. For the demonstration of the proposed concept, we have chosen a process chain from sheet metal manufacturing with simplified transformation functions.
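The two-phase scheme above can be sketched on a toy chain. The discretized states, the transition function, and the quality cost below are placeholders, not the sheet-metal transformation functions from the paper; phase (1) sweeps the chain backwards with Dynamic Programming, phase (2) is a simple table lookup at runtime:

```python
# Phase (1): backward Dynamic Programming over a chain of processes with
# discretized states and control values. All functions are illustrative.

states = [0, 1, 2, 3]        # discretized product state after each process
controls = [0, 1]            # discretized control variable values

def transition(i, s, u):
    """Toy transformation of process i: control u moves the state up or
    down by one, clamped to the admissible range."""
    return min(max(s + (1 if u else -1), 0), 3)

def terminal_cost(s):
    """Predefined quality requirement on the final product: target state 2."""
    return abs(s - 2)

n_processes = 3
# cost_to_go[i][s]: best achievable cost from process i onward in state s
cost_to_go = [dict() for _ in range(n_processes + 1)]
policy = [dict() for _ in range(n_processes)]
for s in states:
    cost_to_go[n_processes][s] = terminal_cost(s)

for i in reversed(range(n_processes)):       # backward over the chain
    for s in states:                         # any end state of the predecessor
        best_u, best_c = min(
            ((u, cost_to_go[i + 1][transition(i, s, u)]) for u in controls),
            key=lambda uc: uc[1])
        policy[i][s] = best_u
        cost_to_go[i][s] = best_c

# Phase (2): at runtime, look up the optimal control for the detected state.
print(policy[0][0], cost_to_go[0][0])
```

Because the table is computed for every reachable state of every predecessor, the runtime phase needs no re-optimization when a process deviates stochastically from its nominal output.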

  19. Quantitative analysis of geomorphic processes using satellite image data at different scales

    NASA Technical Reports Server (NTRS)

    Williams, R. S., Jr.

    1985-01-01

    When aerial and satellite photographs and images are used in the quantitative analysis of geomorphic processes, either through direct observation of active processes or by analysis of landforms resulting from inferred active or dormant processes, a number of limitations in the use of such data must be considered. Active geomorphic processes work at different scales and rates. Therefore, the capability of imaging an active or dormant process depends primarily on the scale of the process and the spatial-resolution characteristic of the imaging system. Scale is an important factor in recording continuous and discontinuous active geomorphic processes, because what is not recorded will not be considered or even suspected in the analysis of orbital images. If the geomorphic process, or the landform change caused by the process, is less than 200 m in x-to-y dimension, then it will not be recorded. Although the scale factor is critical in the recording of discontinuous active geomorphic processes, the repeat interval of orbital-image acquisition of a planetary surface also is a consideration, in order to capture a recurring short-lived geomorphic process or to record changes caused by either a continuous or a discontinuous geomorphic process.

  20. Remote Sensing Image Quality Assessment Experiment with Post-Processing

    NASA Astrophysics Data System (ADS)

    Jiang, W.; Chen, S.; Wang, X.; Huang, Q.; Shi, H.; Man, Y.

    2018-04-01

    This paper briefly describes a post-processing influence assessment experiment comprising three steps: physical simulation, image processing, and image quality assessment. The physical simulation models a sampled imaging system in the laboratory; the imaging system parameters are measured, and the digital images serving as image-processing input are produced by this imaging system with those same parameters. The gathered optically sampled images with the tested imaging parameters are processed by three digital image processes: calibration pre-processing, lossy compression at different compression ratios, and image post-processing with different kernels. The image quality assessment method used is just-noticeable-difference (JND) subjective assessment based on ISO 20462; through subjective assessment of the gathered and processed images, the influence of different imaging parameters and of post-processing on image quality can be found. The six JND subjective assessment data sets can be validated against each other. The main conclusions are: image post-processing can improve image quality; post-processing can improve image quality even with lossy compression, although quality improves less at higher compression ratios; and with our post-processing method, image quality is better when the camera MTF lies within a small range.

  1. Microstructure and Texture of Al-2.5wt.%Mg Processed by Combining Accumulative Roll Bonding and Conventional Rolling

    NASA Astrophysics Data System (ADS)

    Gatti, J. R.; Bhattacharjee, P. P.

    2014-12-01

    Evolution of microstructure and texture during severe deformation and annealing was studied in Al-2.5%Mg alloy processed by two different routes, namely, monotonic Accumulative Roll Bonding (ARB) and a hybrid route combining ARB and conventional rolling (CR). For this purpose Al-2.5%Mg sheets were subjected to 5 cycles of monotonic ARB processing (equivalent strain (ɛeq) = 4.0), while in the hybrid route (ARB + CR) 3-cycle ARB-processed sheets were further deformed by conventional rolling to 75% reduction in thickness (ɛeq = 4.0). Although formation of an ultrafine structure was observed in both processing routes, the monotonic ARB-processed material showed a finer microstructure but weaker texture as compared to the ARB + CR-processed material. After complete recrystallization, the ARB + CR-processed material showed a weak cube texture ({001}<100>), but the cube component was almost negligible in the monotonic ARB-processed material. However, the ND-rotated cube components were stronger in the monotonic ARB-processed material. The observed differences in the microstructure and texture evolution during deformation and annealing could be explained by the characteristic differences of the two processing routes.

  2. Process Materialization Using Templates and Rules to Design Flexible Process Models

    NASA Astrophysics Data System (ADS)

    Kumar, Akhil; Yao, Wen

    The main idea in this paper is to show how flexible processes can be designed by combining generic process templates and business rules. We instantiate a process by applying rules to specific case data, and running a materialization algorithm. The customized process instance is then executed in an existing workflow engine. We present an architecture and also give an algorithm for process materialization. The rules are written in a logic-based language like Prolog. Our focus is on capturing deeper process knowledge and achieving a holistic approach to robust process design that encompasses control flow, resources and data, as well as makes it easier to accommodate changes to business policy.
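The template-plus-rules idea above can be sketched as follows. The step names, rule predicates, and case fields are hypothetical, and the paper's Prolog rules are stood in for by Python predicates; the materialization step simply applies each rule to the case data to prune the generic template into a customized instance:

```python
# Minimal sketch of process materialization: a generic process template
# plus business rules applied to case data yield a customized instance.
# All names below are illustrative, not taken from the paper.

template = ["receive_order", "credit_check", "approve", "ship", "invoice"]

# Each rule pairs a predicate on the case data with the steps it removes
# (playing the role of the paper's logic-based, Prolog-style rules).
rules = [
    (lambda case: case["amount"] < 100, {"credit_check", "approve"}),
    (lambda case: case["prepaid"], {"invoice"}),
]

def materialize(template, rules, case):
    """Apply every firing rule to the case data and prune the template."""
    dropped = set()
    for predicate, steps in rules:
        if predicate(case):
            dropped |= steps
    return [step for step in template if step not in dropped]

# Small prepaid order: checks and invoicing are materialized away.
instance = materialize(template, rules, {"amount": 50, "prepaid": True})
print(instance)
```

The resulting step list is what would then be handed to an existing workflow engine for execution, as the abstract describes.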

  3. HMI conventions for process control graphics.

    PubMed

    Pikaar, Ruud N

    2012-01-01

    Process operators supervise and control complex processes. To enable the operator to do an adequate job, instrumentation and process control engineers need to address several related topics, such as console design, information design, navigation, and alarm management. In process control upgrade projects, usually a 1:1 conversion of existing graphics is proposed. This paper suggests another approach, efficiently leading to a reduced number of new, powerful process graphics supported by permanent process overview displays. In addition, a road map for structuring content (process information) and conventions for the presentation of objects, symbols, and so on have been developed. The impact of the human factors engineering approach on process control upgrade projects is illustrated by several cases.

  4. A novel processed food classification system applied to Australian food composition databases.

    PubMed

    O'Halloran, S A; Lacy, K E; Grimes, C A; Woods, J; Campbell, K J; Nowson, C A

    2017-08-01

    The extent of food processing can affect the nutritional quality of foodstuffs. Categorising foods by the level of processing emphasises the differences in nutritional quality between foods within the same food group and is likely useful for determining dietary processed food consumption. The present study aimed to categorise foods within Australian food composition databases according to the level of food processing using a processed food classification system, as well as to assess the variation in the levels of processing within food groups. A processed food classification system was applied to food and beverage items contained within Australian Food and Nutrient (AUSNUT) 2007 (n = 3874) and AUSNUT 2011-13 (n = 5740). The proportions of Minimally Processed (MP), Processed Culinary Ingredients (PCI), Processed (P) and Ultra Processed (ULP) items by AUSNUT food group, and the overall proportions of the four processed food categories across AUSNUT 2007 and AUSNUT 2011-13, were calculated. Across the food composition databases, the overall proportions of foods classified as MP, PCI, P and ULP were 27%, 3%, 26% and 44% for AUSNUT 2007 and 38%, 2%, 24% and 36% for AUSNUT 2011-13. Although there was wide variation in the classifications of food processing within the food groups, approximately one-third of foodstuffs were classified as ULP food items across both the 2007 and 2011-13 AUSNUT databases. This Australian processed food classification system will allow researchers to easily quantify the contribution of processed foods within the Australian food supply to assist in assessing the nutritional quality of the dietary intake of population groups. © 2017 The British Dietetic Association Ltd.
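The category-proportion calculation described above reduces to counting items per processing level. A minimal sketch, with illustrative item labels rather than actual AUSNUT entries:

```python
from collections import Counter

# Hypothetical mini-database: each item carries one of the four processing
# levels (MP, PCI, P, ULP). Labels are illustrative, not AUSNUT data.
items = {
    "apple": "MP", "rolled oats": "MP", "olive oil": "PCI",
    "cheddar cheese": "P", "canned tuna": "P", "cola": "ULP",
    "instant noodles": "ULP", "chocolate bar": "ULP",
}

counts = Counter(items.values())
total = sum(counts.values())
# Percentage of the database falling into each processing category.
proportions = {level: 100 * n / total for level, n in counts.items()}
print(proportions)
```

The same tally, grouped additionally by AUSNUT food group, would reproduce the within-group variation the study reports.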

  5. Process and domain specificity in regions engaged for face processing: an fMRI study of perceptual differentiation.

    PubMed

    Collins, Heather R; Zhu, Xun; Bhatt, Ramesh S; Clark, Jonathan D; Joseph, Jane E

    2012-12-01

    The degree to which face-specific brain regions are specialized for different kinds of perceptual processing is debated. This study parametrically varied demands on featural, first-order configural, or second-order configural processing of faces and houses in a perceptual matching task to determine the extent to which the process of perceptual differentiation was selective for faces regardless of processing type (domain-specific account), specialized for specific types of perceptual processing regardless of category (process-specific account), engaged in category-optimized processing (i.e., configural face processing or featural house processing), or reflected generalized perceptual differentiation (i.e., differentiation that crosses category and processing type boundaries). ROIs were identified in a separate localizer run or with a similarity regressor in the face-matching runs. The predominant principle accounting for fMRI signal modulation in most regions was generalized perceptual differentiation. Nearly all regions showed perceptual differentiation for both faces and houses for more than one processing type, even if the region was identified as face-preferential in the localizer run. Consistent with process specificity, some regions showed perceptual differentiation for first-order processing of faces and houses (right fusiform face area and occipito-temporal cortex and right lateral occipital complex), but not for featural or second-order processing. Somewhat consistent with domain specificity, the right inferior frontal gyrus showed perceptual differentiation only for faces in the featural matching task. The present findings demonstrate that the majority of regions involved in perceptual differentiation of faces are also involved in differentiation of other visually homogenous categories.

  6. Process- and Domain-Specificity in Regions Engaged for Face Processing: An fMRI Study of Perceptual Differentiation

    PubMed Central

    Collins, Heather R.; Zhu, Xun; Bhatt, Ramesh S.; Clark, Jonathan D.; Joseph, Jane E.

    2015-01-01

    The degree to which face-specific brain regions are specialized for different kinds of perceptual processing is debated. The present study parametrically varied demands on featural, first-order configural or second-order configural processing of faces and houses in a perceptual matching task to determine the extent to which the process of perceptual differentiation was selective for faces regardless of processing type (domain-specific account), specialized for specific types of perceptual processing regardless of category (process-specific account), engaged in category-optimized processing (i.e., configural face processing or featural house processing) or reflected generalized perceptual differentiation (i.e. differentiation that crosses category and processing type boundaries). Regions of interest were identified in a separate localizer run or with a similarity regressor in the face-matching runs. The predominant principle accounting for fMRI signal modulation in most regions was generalized perceptual differentiation. Nearly all regions showed perceptual differentiation for both faces and houses for more than one processing type, even if the region was identified as face-preferential in the localizer run. Consistent with process-specificity, some regions showed perceptual differentiation for first-order processing of faces and houses (right fusiform face area and occipito-temporal cortex, and right lateral occipital complex), but not for featural or second-order processing. Somewhat consistent with domain-specificity, the right inferior frontal gyrus showed perceptual differentiation only for faces in the featural matching task. The present findings demonstrate that the majority of regions involved in perceptual differentiation of faces are also involved in differentiation of other visually homogenous categories. PMID:22849402

  7. Achieving Continuous Manufacturing for Final Dosage Formation: Challenges and How to Meet Them May 20-21 2014 Continuous Manufacturing Symposium.

    PubMed

    Byrn, Stephen; Futran, Maricio; Thomas, Hayden; Jayjock, Eric; Maron, Nicola; Meyer, Robert F; Myerson, Allan S; Thien, Michael P; Trout, Bernhardt L

    2015-03-01

    We describe the key issues and possibilities for continuous final dosage formation, otherwise known as downstream processing or drug product manufacturing. A distinction is made between heterogeneous processing and homogeneous processing, the latter of which is expected to add more value to continuous manufacturing. We also give the key motivations for moving to continuous manufacturing, some of the exciting new technologies, and the barriers to implementation of continuous manufacturing. Continuous processing of heterogeneous blends is the natural first step in converting existing batch processes to continuous. In heterogeneous processing, there are discrete particles that can segregate, versus in homogeneous processing, components are blended and homogenized such that they do not segregate. Heterogeneous processing can incorporate technologies that are closer to existing technologies, whereas homogeneous processing necessitates the development and incorporation of new technologies. Homogeneous processing has the greatest potential for reaping the full rewards of continuous manufacturing, but it takes long-term vision and a more significant change in process development than heterogeneous processing. Heterogeneous processing has the detriment that, as the technologies are adopted rather than developed, there is a strong tendency to incorporate correction steps, what we call below "The Rube Goldberg Problem." Thus, although heterogeneous processing will likely play a major role in the near-term transformation of heterogeneous to continuous processing, it is expected that homogeneous processing is the next step that will follow. Specific action items for industry leaders are also given. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.

  8. An Analysis of the Air Force Government Operated Civil Engineering Supply Store Logistic System: How Can It Be Improved?

    DTIC Science & Technology

    1990-09-01

    Logistics Systems; GOCESS Operation; Work Order Processing; Job Order Processing. ... work orders and job orders to the Material Control Section will be discussed separately. Work Order Processing: Figure 2 illustrates typical WO processing ... logistics function. The JO processing is similar. Job Order Processing: Figure 3 illustrates typical JO processing in a GOCESS operation. As with WOs, this ...

  9. Adaptive-optics optical coherence tomography processing using a graphics processing unit.

    PubMed

    Shafer, Brandon A; Kriske, Jeffery E; Kocaoglu, Omer P; Turner, Timothy L; Liu, Zhuolin; Lee, John Jaehwan; Miller, Donald T

    2014-01-01

    Graphics processing units are increasingly being used for scientific computing because of their powerful parallel-processing abilities and moderate price compared to supercomputers and computing grids. In this paper we have used a general-purpose graphics processing unit to process adaptive-optics optical coherence tomography (AOOCT) images in real time. Increasing the processing speed of AOOCT is an essential step in moving this super-high-resolution technology closer to clinical viability.

  10. Data processing system for the Sneg-2MP experiment

    NASA Technical Reports Server (NTRS)

    Gavrilova, Y. A.

    1980-01-01

    The data processing system for scientific experiments on stations of the "Prognoz" type provides for the processing sequence to be broken down into a number of consecutive stages: preliminary processing, primary processing, secondary processing. The tasks of each data processing stage are examined for an experiment designed to study gamma flashes of galactic origin and solar flares lasting from several minutes to seconds in the 20 kev to 1000 kev energy range.

  11. General RMP Guidance - Appendix D: OSHA Guidance on PSM

    EPA Pesticide Factsheets

    OSHA's Process Safety Management (PSM) Guidance on providing complete and accurate written information concerning process chemicals, process technology, and process equipment; including process hazard analysis and material safety data sheets.

  12. Elaboration Likelihood and the Counseling Process: The Role of Affect.

    ERIC Educational Resources Information Center

    Stoltenberg, Cal D.; And Others

    The role of affect in counseling has been examined from several orientations. The depth of processing model views the efficiency of information processing as a function of the extent to which the information is processed. The notion of cognitive processing capacity states that processing information at deeper levels engages more of one's limited…

  13. 5 CFR 582.202 - Service of legal process.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 1 2011-01-01 2011-01-01 false Service of legal process. 582.202 Section... GARNISHMENT OF FEDERAL EMPLOYEES' PAY Service of Legal Process § 582.202 Service of legal process. (a) A person using this part shall serve interrogatories and legal process on the agent to receive process as...

  14. 5 CFR 582.202 - Service of legal process.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 1 2014-01-01 2014-01-01 false Service of legal process. 582.202 Section... GARNISHMENT OF FEDERAL EMPLOYEES' PAY Service of Legal Process § 582.202 Service of legal process. (a) A person using this part shall serve interrogatories and legal process on the agent to receive process as...

  15. Information Processing Concepts: A Cure for "Technofright." Information Processing in the Electronic Office. Part 1: Concepts.

    ERIC Educational Resources Information Center

    Popyk, Marilyn K.

    1986-01-01

    Discusses the new automated office and its six major technologies (data processing, word processing, graphics, image, voice, and networking), the information processing cycle (input, processing, output, distribution/communication, and storage and retrieval), ergonomics, and ways to expand office education classes (versus class instruction). (CT)

  16. Facial Speech Gestures: The Relation between Visual Speech Processing, Phonological Awareness, and Developmental Dyslexia in 10-Year-Olds

    ERIC Educational Resources Information Center

    Schaadt, Gesa; Männel, Claudia; van der Meer, Elke; Pannekamp, Ann; Friederici, Angela D.

    2016-01-01

    Successful communication in everyday life crucially involves the processing of auditory and visual components of speech. Viewing our interlocutor and processing visual components of speech facilitates speech processing by triggering auditory processing. Auditory phoneme processing, analyzed by event-related brain potentials (ERP), has been shown…

  17. 40 CFR 65.62 - Process vent group determination.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., or Group 2B) for each process vent. Group 1 process vents require control, and Group 2A and 2B... 40 Protection of Environment 15 2010-07-01 2010-07-01 false Process vent group determination. 65... (CONTINUED) CONSOLIDATED FEDERAL AIR RULE Process Vents § 65.62 Process vent group determination. (a) Group...

  18. 40 CFR 63.138 - Process wastewater provisions-performance standards for treatment processes managing Group 1...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .../or Table 9 compounds are similar and often identical. (3) Biological treatment processes. Biological treatment processes in compliance with this section may be either open or closed biological treatment processes as defined in § 63.111. An open biological treatment process in compliance with this section need...

  19. 5 CFR 581.202 - Service of process.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Service of process. 581.202 Section 581... GARNISHMENT ORDERS FOR CHILD SUPPORT AND/OR ALIMONY Service of Process § 581.202 Service of process. (a) A... facilitate proper service of process on its designated agent(s). If legal process is not directed to any...

  20. 30 CFR 828.11 - In situ processing: Performance standards.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 3 2011-07-01 2011-07-01 false In situ processing: Performance standards. 828... STANDARDS-IN SITU PROCESSING § 828.11 In situ processing: Performance standards. (a) The person who conducts in situ processing activities shall comply with 30 CFR 817 and this section. (b) In situ processing...

  1. 30 CFR 828.11 - In situ processing: Performance standards.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 30 Mineral Resources 3 2012-07-01 2012-07-01 false In situ processing: Performance standards. 828... STANDARDS-IN SITU PROCESSING § 828.11 In situ processing: Performance standards. (a) The person who conducts in situ processing activities shall comply with 30 CFR 817 and this section. (b) In situ processing...

  2. 30 CFR 828.11 - In situ processing: Performance standards.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 3 2013-07-01 2013-07-01 false In situ processing: Performance standards. 828... STANDARDS-IN SITU PROCESSING § 828.11 In situ processing: Performance standards. (a) The person who conducts in situ processing activities shall comply with 30 CFR 817 and this section. (b) In situ processing...

  3. 30 CFR 828.11 - In situ processing: Performance standards.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 3 2010-07-01 2010-07-01 false In situ processing: Performance standards. 828... STANDARDS-IN SITU PROCESSING § 828.11 In situ processing: Performance standards. (a) The person who conducts in situ processing activities shall comply with 30 CFR 817 and this section. (b) In situ processing...

  4. 30 CFR 828.11 - In situ processing: Performance standards.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 3 2014-07-01 2014-07-01 false In situ processing: Performance standards. 828... STANDARDS-IN SITU PROCESSING § 828.11 In situ processing: Performance standards. (a) The person who conducts in situ processing activities shall comply with 30 CFR 817 and this section. (b) In situ processing...

  5. Processing Depth, Elaboration of Encoding, Memory Stores, and Expended Processing Capacity.

    ERIC Educational Resources Information Center

    Eysenck, Michael W.; Eysenck, M. Christine

    1979-01-01

    The effects of several factors on expended processing capacity were measured. Expended processing capacity was greater when information was retrieved from secondary memory than from primary memory, when processing was of a deep, semantic nature than when it was shallow and physical, and when processing was more elaborate. (Author/GDC)

  6. Speed isn’t everything: Complex processing speed measures mask individual differences and developmental changes in executive control

    PubMed Central

    Cepeda, Nicholas J.; Blackwell, Katharine A.; Munakata, Yuko

    2012-01-01

    The rate at which people process information appears to influence many aspects of cognition across the lifespan. However, many commonly accepted measures of “processing speed” may require goal maintenance, manipulation of information in working memory, and decision-making, blurring the distinction between processing speed and executive control and resulting in overestimation of processing-speed contributions to cognition. This concern may apply particularly to studies of developmental change, as even seemingly simple processing speed measures may require executive processes to keep children and older adults on task. We report two new studies and a re-analysis of a published study, testing predictions about how different processing speed measures influence conclusions about executive control across the life span. We find that the choice of processing speed measure affects the relationship observed between processing speed and executive control, in a manner that changes with age, and that choice of processing speed measure affects conclusions about development and the relationship among executive control measures. Implications for understanding processing speed, executive control, and their development are discussed. PMID:23432836

  7. The impact of working memory and the “process of process modelling” on model quality: Investigating experienced versus inexperienced modellers

    NASA Astrophysics Data System (ADS)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R.; Weber, Barbara

    2016-05-01

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  8. A new class of random processes with application to helicopter noise

    NASA Technical Reports Server (NTRS)

    Hardin, Jay C.; Miamee, A. G.

    1989-01-01

    The concept of dividing random processes into classes (e.g., stationary, locally stationary, periodically correlated, and harmonizable) has long been employed. A new class of random processes is introduced which includes many of these processes as well as other interesting processes which fall into none of the above classes. Such random processes are denoted as linearly correlated. This class is shown to include the familiar stationary and periodically correlated processes as well as many other, both harmonizable and non-harmonizable, nonstationary processes. When a process is linearly correlated for all t and harmonizable, its two-dimensional power spectral density S_x(ω_1, ω_2) is shown to take a particularly simple form, being non-zero only on lines such that ω_1 - ω_2 = ±r_k, where the r_k (not necessarily equally spaced) are roots of a characteristic function. The relationship of such processes to the class of stationary processes is examined. In addition, the application of such processes in the analysis of typical helicopter noise signals is described.
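The support condition stated in the abstract can be written out explicitly. This rendering is our reading of the abstract's plain-text notation (in particular, interpreting "omega 1 to omega 2" as the difference ω₁ − ω₂, the usual support-line condition for correlated-in-frequency processes):

```latex
S_x(\omega_1, \omega_2) \neq 0
\quad \text{only on the lines} \quad
\omega_1 - \omega_2 = \pm r_k , \qquad k = 0, 1, 2, \ldots
```

For a stationary process all spectral mass sits on the single diagonal line ω₁ = ω₂ (r₀ = 0), while a periodically correlated process adds equally spaced parallel lines; the linearly correlated class relaxes the equal spacing of the r_k.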

  9. The impact of working memory and the “process of process modelling” on model quality: Investigating experienced versus inexperienced modellers

    PubMed Central

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R.; Weber, Barbara

    2016-01-01

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling. PMID:27157858

  10. A new class of random processes with application to helicopter noise

    NASA Technical Reports Server (NTRS)

    Hardin, Jay C.; Miamee, A. G.

    1989-01-01

    The concept of dividing random processes into classes (e.g., stationary, locally stationary, periodically correlated, and harmonizable) has long been employed. A new class of random processes is introduced which includes many of these processes as well as other interesting processes which fall into none of the above classes. Such random processes are denoted as linearly correlated. This class is shown to include the familiar stationary and periodically correlated processes as well as many other, both harmonizable and non-harmonizable, nonstationary processes. When a process is linearly correlated for all t and harmonizable, its two-dimensional power spectral density S_x(ω_1, ω_2) is shown to take a particularly simple form, being non-zero only on lines such that ω_1 - ω_2 = ±r_k, where the r_k (not necessarily equally spaced) are roots of a characteristic function. The relationship of such processes to the class of stationary processes is examined. In addition, the application of such processes in the analysis of typical helicopter noise signals is described.

  11. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers.

    PubMed

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R; Weber, Barbara

    2016-05-09

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  12. Rapid Automatized Naming in Children with Dyslexia: Is Inhibitory Control Involved?

    PubMed

    Bexkens, Anika; van den Wildenberg, Wery P M; Tijms, Jurgen

    2015-08-01

    Rapid automatized naming (RAN) is widely seen as an important indicator of dyslexia. The nature of the cognitive processes involved in rapid naming is, however, still a topic of controversy. We hypothesized that in addition to the involvement of phonological processes and processing speed, RAN is a function of inhibition processes, in particular of interference control. A total of 86 children with dyslexia and 31 normal readers were recruited. Our results revealed that in addition to phonological processing and processing speed, interference control predicts rapid naming in dyslexia, but in contrast to these other two cognitive processes, inhibition is not significantly associated with their reading and spelling skills. After variance in reading and spelling associated with processing speed, interference control and phonological processing was partialled out, naming speed was no longer consistently associated with the reading and spelling skills of children with dyslexia. Finally, dyslexic children differed from normal readers on naming speed, literacy skills, phonological processing and processing speed, but not on inhibition processes. Both theoretical and clinical interpretations of these results are discussed. Copyright © 2014 John Wiley & Sons, Ltd.

  13. Feasibility of using continuous chromatography in downstream processing: Comparison of costs and product quality for a hybrid process vs. a conventional batch process.

    PubMed

    Ötes, Ozan; Flato, Hendrik; Winderl, Johannes; Hubbuch, Jürgen; Capito, Florian

    2017-10-10

    The protein A capture step is the main cost driver in downstream processing, with high attrition costs, especially when protein A resin is retired before the end of its lifetime. Here we describe a feasibility study transferring a batch downstream process to a hybrid process, aimed at replacing batch protein A capture chromatography with a continuous capture step while leaving the polishing steps unchanged, to minimize the required process adaptations compared to a batch process. 35 g of antibody were purified using the hybrid approach, resulting in comparable product quality and step yield compared to the batch process. Productivity for the protein A step could be increased by up to 420%, reducing buffer amounts by 30-40% and showing robustness for at least 48 h of continuous run time. Additionally, to enable its potential application in a clinical trial manufacturing environment, cost of goods was compared for the protein A step between the hybrid process and the batch process, showing a 300% cost reduction, depending on processed volumes and batch cycles. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Non-Conscious Perception of Emotions in Psychiatric Disorders: The Unsolved Puzzle of Psychopathology.

    PubMed

    Lee, Seung A; Kim, Chai-Youn; Lee, Seung-Hwan

    2016-03-01

    Psychophysiological and functional neuroimaging studies have frequently and consistently shown that emotional information can be processed outside of the conscious awareness. Non-conscious processing comprises automatic, uncontrolled, and fast processing that occurs without subjective awareness. However, how such non-conscious emotional processing occurs in patients with various psychiatric disorders requires further examination. In this article, we reviewed and discussed previous studies on the non-conscious emotional processing in patients diagnosed with anxiety disorder, schizophrenia, bipolar disorder, and depression, to further understand how non-conscious emotional processing varies across these psychiatric disorders. Although the symptom profile of each disorder does not often overlap with one another, these patients commonly show abnormal emotional processing based on the pathology of their mood and cognitive function. This indicates that the observed abnormalities of emotional processing in certain social interactions may derive from a biased mood or cognition process that precedes consciously controlled and voluntary processes. Since preconscious forms of emotional processing appear to have a major effect on behaviour and cognition in patients with these disorders, further investigation is required to understand these processes and their impact on patient pathology.

  15. Empirical evaluation of the Process Overview Measure for assessing situation awareness in process plants.

    PubMed

    Lau, Nathan; Jamieson, Greg A; Skraaning, Gyrd

    2016-03-01

    The Process Overview Measure is a query-based measure developed to assess operator situation awareness (SA) from monitoring process plants. A companion paper describes how the measure has been developed according to process plant properties and operator cognitive work. The Process Overview Measure demonstrated practicality, sensitivity, validity and reliability in two full-scope simulator experiments investigating dramatically different operational concepts. Practicality was assessed based on qualitative feedback from participants and researchers. The Process Overview Measure demonstrated sensitivity and validity by revealing significant effects of experimental manipulations that were corroborated by other empirical results. The measure also demonstrated adequate inter-rater reliability and practicality for measuring SA in full-scope simulator settings based on data collected on process experts. Thus, full-scope simulator studies can employ the Process Overview Measure to reveal the impact of new control room technology and operational concepts on monitoring process plants. Practitioner Summary: The Process Overview Measure is a query-based measure that demonstrated practicality, sensitivity, validity and reliability for assessing operator situation awareness (SA) from monitoring process plants in representative settings.

  16. A Framework for Business Process Change Requirements Analysis

    NASA Astrophysics Data System (ADS)

    Grover, Varun; Otim, Samuel

    The ability to quickly and continually adapt business processes to accommodate evolving requirements and opportunities is critical for success in competitive environments. Without appropriate linkage between redesign decisions and strategic inputs, identifying processes that need to be modified will be difficult. In this paper, we draw attention to the analysis of business process change requirements in support of process change initiatives. Business process redesign is a multifaceted phenomenon involving processes, organizational structure, management systems, human resource architecture, and many other aspects of organizational life. To be successful, the business process initiative should focus not only on identifying the processes to be redesigned, but also pay attention to various enablers of change. Above all, a framework is just a blueprint; management must lead change. We hope our modest contribution will draw attention to the broader framing of requirements for business process change.

  17. Process Analytical Technology (PAT): batch-to-batch reproducibility of fermentation processes by robust process operational design and control.

    PubMed

    Gnoth, S; Jenzsch, M; Simutis, R; Lübbert, A

    2007-10-31

    The Process Analytical Technology (PAT) initiative of the FDA is a reaction to the increasing discrepancy between the current possibilities in process supervision and control of pharmaceutical production processes and their actual application in industrial manufacturing. With rigid approval practices based on standard operational procedures, adaptation of production reactors toward the state of the art was more or less inhibited for many years. Now PAT paves the way for continuous process and product improvements through improved process supervision based on knowledge-based data analysis, "Quality-by-Design" concepts, and, finally, feedback control. Examples of up-to-date implementations of this concept are presented. They are taken from one key group of processes in recombinant pharmaceutical protein manufacturing: the cultivation of genetically modified Escherichia coli bacteria.

  18. When teams shift among processes: insights from simulation and optimization.

    PubMed

    Kennedy, Deanna M; McComb, Sara A

    2014-09-01

    This article introduces process shifts to study the temporal interplay among transition and action processes espoused in the recurring phase model proposed by Marks, Mathieu, and Zaccaro (2001). Process shifts are those points in time when teams complete a focal process and change to another process. By using team communication patterns to measure process shifts, this research explores (a) when teams shift among different transition processes and initiate action processes and (b) the potential of different interventions, such as communication directives, to manipulate process shift timing and order and, ultimately, team performance. Virtual experiments are employed to compare data from observed laboratory teams not receiving interventions, simulated teams receiving interventions, and optimal simulated teams generated using genetic algorithm procedures. Our results offer insights about the potential for different interventions to affect team performance. Moreover, certain interventions may promote discussions about key issues (e.g., tactical strategies) and facilitate shifting among transition processes in a manner that emulates optimal simulated teams' communication patterns. Thus, we contribute to theory regarding team processes in 2 important ways. First, we present process shifts as a way to explore the timing of when teams shift from transition to action processes. Second, we use virtual experimentation to identify those interventions with the greatest potential to affect performance by changing when teams shift among processes. Additionally, we employ computational methods including neural networks, simulation, and optimization, thereby demonstrating their applicability in conducting team research. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  19. Nitrous oxide and methane emissions from different treatment processes in full-scale municipal wastewater treatment plants.

    PubMed

    Rena, Y G; Wang, J H; Li, H F; Zhang, J; Qi, P Y; Hu, Z

    2013-01-01

    Nitrous oxide (N2O) and methane (CH4) are two important greenhouse gases (GHG) emitted from biological nutrient removal (BNR) processes in municipal wastewater treatment plants (WWTP). In this study, three typical biological wastewater treatment processes were studied in WWTP of northern China: a pre-anaerobic carrousel oxidation ditch (A+OD) process, a pre-anoxic anaerobic-anoxic-oxic (A-A/A/O) process and a reverse anaerobic-anoxic-oxic (r-A/A/O) process. The N2O and CH4 emissions from these three different processes were measured in every processing unit of each WWTP. Results showed that N2O and CH4 were mainly discharged during the nitrification/denitrification process and the anaerobic/anoxic treatment process, respectively, and the amounts of their formation and release were significantly influenced by the different BNR processes implemented in these WWTP. The N2O conversion ratio of the r-A/A/O process was the lowest among the three WWTP, being 10.9% and 18.6% lower than that of the A-A/A/O process and the A+OD process, respectively. Similarly, the CH4 conversion ratio of the r-A/A/O process was the lowest among the three WWTP, being 89.1% and 80.8% lower than that of the A-A/A/O process and the A+OD process, respectively. The factors influencing N2O and CH4 formation and emission in the three WWTP were investigated to explain the differences between these processes. The nitrite concentration and the oxidation-reduction potential (ORP) value were found to be the dominant factors affecting N2O and CH4 production, respectively. The flow-based emission factors of N2O and CH4 of the WWTP were derived for better quantification of GHG emissions and further technical assessment of mitigation options.
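    The two quantities the study reports are simple ratios. The sketch below illustrates how a flow-based emission factor and a conversion ratio are computed; all numbers are hypothetical placeholders, not values from the study:

```python
# Illustrative sketch (not from the study): a flow-based emission
# factor and a conversion ratio as used in WWTP GHG accounting.
# All numbers below are hypothetical placeholders.

def emission_factor(mass_emitted_g: float, flow_m3: float) -> float:
    """Flow-based emission factor: grams of gas emitted per m3 of wastewater treated."""
    return mass_emitted_g / flow_m3

def conversion_ratio(mass_emitted_g: float, influent_load_g: float) -> float:
    """Fraction of the influent load (e.g., total nitrogen) released as gas."""
    return mass_emitted_g / influent_load_g

# Hypothetical daily figures for one process train:
n2o_emitted = 1200.0   # g N2O-N emitted per day (placeholder)
tn_influent = 85000.0  # g influent total nitrogen per day (placeholder)
flow = 10000.0         # m3 treated per day (placeholder)

ef = emission_factor(n2o_emitted, flow)
ratio = conversion_ratio(n2o_emitted, tn_influent)
print(f"emission factor: {ef:.3f} g/m3, conversion ratio: {ratio:.2%}")
```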

  20. Effects of children's working memory capacity and processing speed on their sentence imitation performance.

    PubMed

    Poll, Gerard H; Miller, Carol A; Mainela-Arnold, Elina; Adams, Katharine Donnelly; Misra, Maya; Park, Ji Sook

    2013-01-01

    More limited working memory capacity and slower processing for language and cognitive tasks are characteristic of many children with language difficulties. Individual differences in processing speed have not consistently been found to predict language ability or severity of language impairment, and there are conflicting views on whether working memory and processing speed are integrated or separable abilities. This study evaluated four models of the relations between individual differences in children's processing speed and working memory capacity in sentence imitation. The models considered whether working memory and processing speed are integrated or separable, as well as the effect of the number of operations required per sentence. The role of working memory as a mediator of the effect of processing speed on sentence imitation was also evaluated. Forty-six children with varied language and reading abilities imitated sentences. Working memory was measured with the Competing Language Processing Task (CLPT), and processing speed was measured with a composite of truth-value judgment and rapid automatized naming tasks. Mixed-effects ordinal regression models evaluated the CLPT and processing speed as predictors of sentence imitation item scores. A single mediator model evaluated working memory as a mediator of the effect of processing speed on sentence imitation total scores. Working memory was a reliable predictor of sentence imitation accuracy, but processing speed predicted sentence imitation only as a component of a processing speed by number of operations interaction. Processing speed predicted working memory capacity, and there was evidence that working memory acted as a mediator of the effect of processing speed on sentence imitation accuracy. The findings support a refined view of working memory and processing speed as separable factors in children's sentence imitation performance. Processing speed does not independently explain sentence imitation accuracy for all sentence types, but contributes when the task requires more mental operations. Processing speed also has an indirect effect on sentence imitation by contributing to working memory capacity. © 2013 Royal College of Speech and Language Therapists.
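    The single-mediator logic described above (processing speed acting on sentence imitation through working memory) reduces to the product-of-coefficients idea. The sketch below uses synthetic data and hypothetical variable names, not the study's data or its mixed-effects models:

```python
# Minimal single-mediator sketch (synthetic data, NOT the study's data):
# speed -> working memory -> sentence imitation.
# The indirect (mediated) effect is the product a*b of the two
# simple-regression slopes (product-of-coefficients logic).

def slope(x, y):
    """OLS slope of y regressed on x (simple regression)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

# Synthetic scores for 6 hypothetical children:
speed = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]    # processing-speed composite
wm = [2.1, 3.9, 6.2, 7.8, 10.1, 12.0]     # working-memory (CLPT-like) score
imit = [5.0, 8.1, 11.9, 16.2, 19.8, 24.1] # sentence-imitation total

a = slope(speed, wm)  # path a: speed -> working memory
b = slope(wm, imit)   # path b: working memory -> imitation
indirect = a * b      # indirect effect of speed on imitation
print(f"a={a:.2f}, b={b:.2f}, indirect effect={indirect:.2f}")
```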

  1. Q-marker based strategy for CMC research of Chinese medicine: A case study of Panax Notoginseng saponins.

    PubMed

    Zhong, Yi; Zhu, Jieqiang; Yang, Zhenzhong; Shao, Qing; Fan, Xiaohui; Cheng, Yiyu

    2018-01-31

    To ensure pharmaceutical quality, chemistry, manufacturing and control (CMC) research is essential. However, due to the inherent complexity of Chinese medicine (CM), CMC study of CM remains a great challenge for academia, industry, and regulatory agencies. Recently, the quality-marker (Q-marker) concept was proposed for establishing quality standards and quality analysis approaches for Chinese medicine, which sheds light on Chinese medicine's CMC study. Here the manufacturing process of Panax Notoginseng Saponins (PNS) is taken as a case study, and the present work establishes a Q-marker based research strategy for CMC of Chinese medicine. The Q-markers of PNS are selected and established by integrating the chemical profile with pharmacological activities. Then, the key processes of PNS manufacturing are identified by material flow analysis. Furthermore, modeling algorithms are employed to explore the relationship between Q-markers and critical process parameters (CPPs) of the key processes. At last, the CPPs of the key processes are optimized in order to improve process efficiency. Among the 97 identified compounds, notoginsenoside R1, ginsenoside Rg1, Re, Rb1 and Rd are selected as the Q-markers of PNS. Our analysis of PNS manufacturing shows that the extraction process and the column chromatography process are the key processes. With the CPPs of each process as the inputs and the Q-markers' contents as the outputs, two process prediction models are built separately for the extraction process and the column chromatography process of Panax notoginseng, both of which possess good prediction ability. Based on the efficiency models of the extraction and column chromatography processes we constructed, the optimal CPPs of both processes are calculated. Our results show that the Q-marker based CMC research strategy can be applied to analyze the manufacturing processes of Chinese medicine to assure product quality and promote key process efficiency simultaneously. Copyright © 2018 Elsevier GmbH. All rights reserved.

  2. PyMS: a Python toolkit for processing of gas chromatography-mass spectrometry (GC-MS) data. Application and comparative study of selected tools

    PubMed Central

    2012-01-01

    Background Gas chromatography–mass spectrometry (GC-MS) is a technique frequently used in targeted and non-targeted measurements of metabolites. Most existing software tools for processing of raw instrument GC-MS data tightly integrate data processing methods with a graphical user interface, facilitating interactive data processing. While interactive processing remains critically important in GC-MS applications, high-throughput studies increasingly dictate the need for command line tools suitable for scripting of high-throughput, customized processing pipelines. Results PyMS comprises a library of functions for processing of instrument GC-MS data, developed in Python. PyMS currently provides a complete set of GC-MS processing functions, including reading of standard data formats (ANDI-MS/NetCDF and JCAMP-DX), noise smoothing, baseline correction, peak detection, peak deconvolution, peak integration, and peak alignment by dynamic programming. A novel common ion single quantitation algorithm allows automated, accurate quantitation of GC-MS electron impact (EI) fragmentation spectra when a large number of experiments are being analyzed. PyMS implements parallel processing for by-row and by-column data processing tasks based on the Message Passing Interface (MPI), allowing processing to scale on multiple CPUs in distributed computing environments. A set of specifically designed experiments was performed in-house and used to comparatively evaluate the performance of PyMS and three widely used software packages for GC-MS data processing (AMDIS, AnalyzerPro, and XCMS). Conclusions PyMS is a novel software package for the processing of raw GC-MS data, particularly suitable for scripting of customized processing pipelines and for data processing in batch mode. PyMS provides limited graphical capabilities and can be used both for routine data processing and for interactive/exploratory data analysis. In real-life GC-MS data processing scenarios, PyMS performs as well as or better than leading software packages. We demonstrate data processing scenarios that are simple to implement in PyMS, yet difficult to achieve with many conventional GC-MS data processing software packages. Automated sample processing and quantitation with PyMS can provide substantial time savings compared to more traditional interactive software systems that tightly integrate data processing with the graphical user interface. PMID:22647087
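    Two of the pipeline stages listed above, noise smoothing and peak detection, can be illustrated generically in plain Python. This is not the PyMS API (consult the PyMS documentation for its actual functions and data formats), just a minimal sketch of what such stages do to a chromatogram trace:

```python
# Generic illustration of noise smoothing and peak detection on a
# chromatogram-like signal. NOT the PyMS API; names are illustrative.

def moving_average(signal, window=3):
    """Centered moving-average smoothing (simple noise reduction)."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def find_peaks(signal, threshold=0.0):
    """Indices of strict local maxima above an intensity threshold."""
    return [i for i in range(1, len(signal) - 1)
            if signal[i] > signal[i - 1]
            and signal[i] > signal[i + 1]
            and signal[i] > threshold]

# Toy total-ion-current trace with two peaks and some jitter:
tic = [0.1, 0.2, 1.0, 5.0, 1.2, 0.3, 0.2, 2.0, 7.5, 2.1, 0.2]
smoothed = moving_average(tic, window=3)
peaks = find_peaks(smoothed, threshold=0.5)
print("peak indices:", peaks)  # apexes near indices 3 and 8
```

    Real GC-MS tools add baseline correction, deconvolution of co-eluting peaks, and alignment across runs on top of these two primitives.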

  3. The Research Process on Converter Steelmaking Process by Using Limestone

    NASA Astrophysics Data System (ADS)

    Tang, Biao; Li, Xing-yi; Cheng, Han-chi; Wang, Jing; Zhang, Yun-long

    2017-08-01

    Compared with the traditional converter steelmaking process, the steelmaking process with limestone uses limestone to partly replace lime. Many researchers have studied the new steelmaking process. There is much related research on material balance calculation, the behaviour of limestone in the slag, limestone powder injection in the converter, and the application of limestone in iron and steel enterprises. The results show that the surplus heat of the converter can meet the needs of limestone calcination, and that the new process can reduce energy loss in the whole steelmaking process, reduce carbon dioxide emissions, and improve the quality of the gas.
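    The heat-balance question above can be sanity-checked with a back-of-envelope calculation. The sketch below uses the standard-state decomposition enthalpy of CaCO3 (about 178 kJ/mol at 25 °C); real converter conditions and limestone purity differ, so treat the result as an order-of-magnitude estimate:

```python
# Back-of-envelope heat demand for in-converter limestone calcination.
# CaCO3 -> CaO + CO2, endothermic; standard-state values only.

DH_CALCINATION = 178.0  # kJ/mol, approximate standard enthalpy at 25 C
M_CACO3 = 100.09        # g/mol, molar mass of CaCO3

def calcination_heat_mj_per_tonne(purity=1.0):
    """Heat (MJ) needed to calcine one tonne of limestone of given CaCO3 purity."""
    mol_caco3 = purity * 1.0e6 / M_CACO3     # mol CaCO3 in one tonne
    return mol_caco3 * DH_CALCINATION / 1e3  # kJ -> MJ

demand = calcination_heat_mj_per_tonne(purity=0.95)
print(f"~{demand:.0f} MJ per tonne of 95%-pure limestone")
```

    Whether converter surplus heat covers this depends on the limestone addition rate per heat, which the cited studies quantify via full material and energy balances.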

  4. Working on the Boundaries: Philosophies and Practices of the Design Process

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Blair, J.; Townsend, J.; Verderaime, V.

    1996-01-01

    While the systems engineering process is a formal program management technique and is contractually binding, the design process is the informal practice of achieving the design project requirements throughout all design phases of the systems engineering process. The design process and organization are system and component dependent. Informal reviews include technical information meetings and concurrent engineering sessions, while formal technical discipline reviews are conducted through the systems engineering process. This paper discusses and references major philosophical principles in the design process, identifies its role in interacting systems and discipline analyses and integrations, and illustrates the process application in experienced aerostructural designs.

  5. Chemical processing of lunar materials

    NASA Technical Reports Server (NTRS)

    Criswell, D. R.; Waldron, R. D.

    1979-01-01

    The paper highlights recent work on the general problem of processing lunar materials. The discussion covers lunar source materials, refined products, motivations for using lunar materials, and general considerations for a lunar or space processing plant. Attention is given to chemical processing through various techniques, including electrolysis of molten silicates, carbothermic/silicothermic reduction, the carbo-chlorination process, the NaOH basic-leach process, and the HF acid-leach process. Several options for chemical processing of lunar materials are well within the state of the art of applied chemistry and chemical engineering, and development could begin based on the extensive knowledge of lunar materials.

  6. Coordination and organization of security software process for power information application environment

    NASA Astrophysics Data System (ADS)

    Wang, Qiang

    2017-09-01

    As an important part of software engineering, the software process decides the success or failure of a software product. The design and development features of the security software process are discussed, as are the necessity and present significance of using such a process. Coordinating with the functional software, the process for security software and its testing are discussed in depth. The process includes requirement analysis, design, coding, debugging and testing, submission and maintenance. For each phase, the paper proposes subprocesses to support software security. As an example, the paper applies the above process to the power information platform.

  7. Sensor-based atomic layer deposition for rapid process learning and enhanced manufacturability

    NASA Astrophysics Data System (ADS)

    Lei, Wei

    In the search for a sensor-based atomic layer deposition (ALD) process to accelerate process learning and enhance manufacturability, we have explored new reactor designs and applied in-situ process sensing to W and HfO2 ALD processes. A novel wafer-scale ALD reactor, which features fast gas switching, good process sensing compatibility and significant similarity to the real manufacturing environment, is constructed. The reactor has a unique movable reactor cap design that allows two possible operation modes: (1) steady-state flow with alternating gas species; or (2) fill-and-pump-out cycling of each gas, accelerating the pump-out by lifting the cap to employ the large chamber volume as ballast. Downstream quadrupole mass spectrometry (QMS) sampling is applied for in-situ process sensing of the tungsten ALD process. The QMS reveals essential surface reaction dynamics through real-time signals associated with byproduct generation as well as precursor introduction and depletion for each ALD half cycle, which are then used for process learning and optimization. More subtle interactions such as imperfect surface saturation and reactant dose interaction are also directly observed by QMS, indicating that the ALD process is more complicated than the suggested layer-by-layer growth. By integrating in real time the byproduct QMS signals over each exposure and plotting them against process cycle number, the deposition kinetics on the wafer is directly measured. For continuous ALD runs, the total integrated byproduct QMS signal in each ALD run is also linear in ALD film thickness, and therefore can be used for ALD film thickness metrology. The in-situ process sensing is also applied to an HfO2 ALD process carried out in a furnace-type ALD reactor. Precursor dose end-point control is applied to precisely control the precursor dose in each half cycle. Multiple process sensors, including a quartz crystal microbalance (QCM) and QMS, are used to provide real-time process information. The sensing results confirm the proposed surface reaction path and once again reveal the complexity of ALD processes. The impact of this work includes: (1) it explores new ALD reactor designs which enable the implementation of in-situ process sensors for rapid process learning and enhanced manufacturability; (2) it demonstrates for the first time that in-situ QMS can reveal detailed process dynamics and film growth kinetics in a wafer-scale ALD process, and thus can be used for ALD film thickness metrology; and (3) based on results from two different processes carried out in two different reactors, it is clear that ALD is a more complicated process than normally believed or advertised, but real-time observation of the operational chemistries in ALD by in-situ sensors provides critical insight into the process and the basis for more effective process control for ALD applications.
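    The thickness-metrology idea described above, integrating the byproduct QMS signal over each exposure and accumulating the per-cycle integrals, can be sketched numerically. The data below are synthetic; the function names are illustrative, not from any instrument software:

```python
# Sketch of QMS-based ALD thickness metrology: integrate the byproduct
# signal over each exposure, accumulate per cycle. Synthetic data.

def trapz(y, dt=1.0):
    """Trapezoidal integral of equally spaced samples."""
    return sum((y[i] + y[i + 1]) * dt / 2.0 for i in range(len(y) - 1))

def thickness_proxy(cycle_signals, dt=1.0):
    """Cumulative integrated byproduct signal, one value per ALD cycle."""
    total, out = 0.0, []
    for signal in cycle_signals:
        total += trapz(signal, dt)
        out.append(total)
    return out

# Three synthetic cycles with near-identical byproduct transients,
# as expected for saturated, cycle-by-cycle growth:
cycles = [
    [0.0, 2.0, 4.0, 1.0, 0.0],
    [0.0, 2.1, 3.9, 1.1, 0.0],
    [0.0, 1.9, 4.1, 0.9, 0.0],
]
print(thickness_proxy(cycles))  # roughly linear in cycle number
```

    A roughly constant per-cycle integral (linear cumulative curve) is the signature of saturated growth; deviations flag the dose-interaction effects the abstract mentions.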

  8. Implicit Processes, Self-Regulation, and Interventions for Behavior Change.

    PubMed

    St Quinton, Tom; Brunton, Julie A

    2017-01-01

    The ability to regulate and subsequently change behavior is influenced by both reflective and implicit processes. Traditional theories have focused on conscious processes by highlighting the beliefs and intentions that influence decision making. However, their success in changing behavior has been modest, with a gap between intention and behavior apparent. Dual-process models have recently been applied to health psychology, with numerous models incorporating implicit processes that influence behavior as well as the more common conscious processes. Such implicit processes are theorized to govern behavior non-consciously. The article provides a commentary on motivational and volitional processes and how interventions have combined them to attempt an increase in positive health behaviors. Following this, non-conscious processes are discussed in terms of their theoretical underpinning. The article will then highlight how these processes have been measured and will then discuss the different ways that the non-conscious and conscious may interact. The development of interventions manipulating both processes may well prove crucial in successfully altering behavior.

  9. All varieties of encoding variability are not created equal: Separating variable processing from variable tasks

    PubMed Central

    Huff, Mark J.; Bodner, Glen E.

    2014-01-01

    Whether encoding variability facilitates memory is shown to depend on whether item-specific and relational processing are both performed across study blocks, and whether study items are weakly versus strongly related. Variable-processing groups studied a word list once using an item-specific task and once using a relational task. Variable-task groups’ two different study tasks recruited the same type of processing each block. Repeated-task groups performed the same study task each block. Recall and recognition were greatest in the variable-processing group, but only with weakly related lists. A variable-processing benefit was also found when task-based processing and list-type processing were complementary (e.g., item-specific processing of a related list) rather than redundant (e.g., relational processing of a related list). That performing both item-specific and relational processing across trials, or within a trial, yields encoding-variability benefits may help reconcile decades of contradictory findings in this area. PMID:25018583

  10. Continuous welding of unidirectional fiber reinforced thermoplastic tape material

    NASA Astrophysics Data System (ADS)

    Schledjewski, Ralf

    2017-10-01

    Continuous welding techniques like thermoplastic tape placement with in situ consolidation offer several advantages over traditional manufacturing processes like autoclave consolidation, thermoforming, etc. However, several important processing issues still need to be solved before it becomes an economically viable process. Intensive process analysis and optimization have been carried out in the past through experimental investigation, model definition and simulation development. Today, process simulation is capable of predicting the resulting consolidation quality. The effects of material imperfections and process parameter variations are well known. But using this knowledge to control the process, based on online process monitoring and corresponding adaptation of the process parameters, is still challenging. Solving inverse problems and using methods for automated code generation that allow fast implementation of algorithms on target hardware are required. The paper explains the placement technique in general. Process-material-property relationships and typical material imperfections are described. Furthermore, online monitoring techniques and how to use them for a model-based process control system are presented.

  11. Economics of polysilicon process: A view from Japan

    NASA Technical Reports Server (NTRS)

    Shimizu, Y.

    1986-01-01

    The production process of solar grade silicon (SOG-Si) through trichlorosilane (TCS) was researched in a program sponsored by the New Energy Development Organization (NEDO). The NEDO process consists of the following two steps: TCS production from by-product silicon tetrachloride (STC) and SOG-Si formation from TCS using a fluidized bed reactor. Based on the data obtained during the research program, the manufacturing costs of the NEDO process and other polysilicon manufacturing processes were compared. The manufacturing cost was calculated on the basis of 1000 tons/year production. The cost estimate showed that the cost of producing silicon by all of the new processes is less than the cost of the conventional Siemens process. Using a new process, the cost of producing semiconductor grade silicon was found to be virtually the same for any of the TCS, dichlorosilane, and monosilane processes when by-products were recycled. The SOG-Si manufacturing process using the fluidized bed reactor, which needs further development, shows a greater probability of cost reduction than the filament processes.

  12. Autonomous Agents for Dynamic Process Planning in the Flexible Manufacturing System

    NASA Astrophysics Data System (ADS)

    Nik Nejad, Hossein Tehrani; Sugimura, Nobuhiro; Iwamura, Koji; Tanimizu, Yoshitaka

    Rapid changes in market demands and competitive pressures require manufacturers to maintain highly flexible manufacturing systems to cope with a complex manufacturing environment. This paper deals with the development of an agent-based architecture of dynamic systems for incremental process planning in manufacturing systems. In consideration of alternative manufacturing processes and machine tools, the process plans and the schedules of the manufacturing resources are generated incrementally and dynamically. A negotiation protocol is discussed in this paper to generate suitable process plans for the target products dynamically and in real time, based on the alternative manufacturing processes. The alternative manufacturing processes are represented by the process plan networks discussed in a previous paper, and suitable process plans are searched for and generated to cope with both dynamic changes of the product specifications and disturbances of the manufacturing resources. We combine the heuristic search algorithms of the process plan networks with the negotiation protocols in order to generate suitable process plans in the dynamic manufacturing environment.

  13. Implicit Processes, Self-Regulation, and Interventions for Behavior Change

    PubMed Central

    St Quinton, Tom; Brunton, Julie A.

    2017-01-01

    The ability to regulate and subsequently change behavior is influenced by both reflective and implicit processes. Traditional theories have focused on conscious processes by highlighting the beliefs and intentions that influence decision making. However, their success in changing behavior has been modest, with a gap between intention and behavior apparent. Dual-process models have recently been applied to health psychology, with numerous models incorporating implicit processes that influence behavior as well as the more common conscious processes. Such implicit processes are theorized to govern behavior non-consciously. The article provides a commentary on motivational and volitional processes and on how interventions have combined them in attempts to increase positive health behaviors. Following this, non-conscious processes are discussed in terms of their theoretical underpinning. The article then highlights how these processes have been measured and discusses the different ways that the non-conscious and conscious may interact. The development of interventions manipulating both processes may well prove crucial in successfully altering behavior. PMID:28337164

  14. Models of recognition: a review of arguments in favor of a dual-process account.

    PubMed

    Diana, Rachel A; Reder, Lynne M; Arndt, Jason; Park, Heekyeong

    2006-02-01

    The majority of computationally specified models of recognition memory have been based on a single-process interpretation, claiming that familiarity is the only influence on recognition. There is increasing evidence that recognition is, in fact, based on two processes: recollection and familiarity. This article reviews the current state of the evidence for dual-process models, including the usefulness of the remember/know paradigm, and interprets the relevant results in terms of the source of activation confusion (SAC) model of memory. We argue that the evidence from each of the areas we discuss, when combined, presents a strong case that inclusion of a recollection process is necessary. Given this conclusion, we also argue that the dual-process claim that the recollection process is always available is, in fact, more parsimonious than the single-process claim that the recollection process is used only in certain paradigms. The value of a well-specified process model such as the SAC model is discussed with regard to other types of dual-process models.

  15. Integrating Thermal Tools Into the Mechanical Design Process

    NASA Technical Reports Server (NTRS)

    Tsuyuki, Glenn T.; Siebes, Georg; Novak, Keith S.; Kinsella, Gary M.

    1999-01-01

    The intent of mechanical design is to deliver a hardware product that meets or exceeds customer expectations, while reducing cycle time and cost. To this end, an integrated mechanical design process enables the idea of parallel development (concurrent engineering). This represents a shift from the traditional mechanical design process. With such a concurrent process, there are significant issues that have to be identified and addressed before re-engineering the mechanical design process to facilitate concurrent engineering. These issues also assist in the integration and re-engineering of the thermal design sub-process since it resides within the entire mechanical design process. With these issues in mind, a thermal design sub-process can be re-defined in a manner that has a higher probability of acceptance, thus enabling an integrated mechanical design process. However, the actual implementation is not always problem-free. Experience in applying the thermal design sub-process to actual situations provides the evidence for improvement, but more importantly, for judging the viability and feasibility of the sub-process.

  16. [Monitoring method of extraction process for Schisandrae Chinensis Fructus based on near infrared spectroscopy and multivariate statistical process control].

    PubMed

    Xu, Min; Zhang, Lei; Yue, Hong-Shui; Pang, Hong-Wei; Ye, Zheng-Liang; Ding, Li

    2017-10-01

    To establish an on-line monitoring method for the extraction process of Schisandrae Chinensis Fructus, the formula medicinal material of Yiqi Fumai lyophilized injection, near infrared spectroscopy was combined with multivariate data analysis. A multivariate statistical process control (MSPC) model was established based on 5 normal production batches, and 2 test batches were monitored by PC scores, DModX and Hotelling T2 control charts. The results showed that the MSPC model had a good monitoring ability for the extraction process. Applying the MSPC model to the actual production process could effectively achieve on-line monitoring of the extraction process of Schisandrae Chinensis Fructus and could reflect changes of material properties in the production process in real time. The established process monitoring method could provide a reference for the application of process analysis technology in the process quality control of traditional Chinese medicine injections. Copyright© by the Chinese Pharmaceutical Association.
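The Hotelling T2 chart mentioned in this record can be illustrated with a minimal sketch: the statistic is the squared Mahalanobis distance of a new observation from the normal-operating data, and points exceeding a control limit raise an alarm. The data and names below are invented for illustration; the model in the paper is built on near-infrared spectra and PCA scores, not raw variables.

```python
import numpy as np

def hotelling_t2(train, new):
    """T^2 statistic of each row of `new` against reference data `train`.

    train: (n, p) array of normal-operating observations (e.g., PC scores).
    new:   (m, p) array of observations to monitor.
    """
    mean = train.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(train, rowvar=False))
    diff = new - mean
    # T^2_i = diff_i^T S^{-1} diff_i  (squared Mahalanobis distance)
    return np.einsum("ij,jk,ik->i", diff, cov_inv, diff)

rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(100, 3))   # hypothetical normal batches
t2_normal = hotelling_t2(normal, normal)

fault = np.array([[8.0, 8.0, 8.0]])            # an observation far off-spec
t2_fault = hotelling_t2(normal, fault)         # far exceeds the normal envelope
```

In practice the control limit is derived from an F-distribution on the training sample size, and DModX complements T2 by flagging observations that leave the model subspace entirely.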

  17. Method for automatically evaluating a transition from a batch manufacturing technique to a lean manufacturing technique

    DOEpatents

    Ivezic, Nenad; Potok, Thomas E.

    2003-09-30

    A method for automatically evaluating a manufacturing technique comprises the steps of: receiving from a user manufacturing process step parameters characterizing a manufacturing process; accepting from the user a selection for an analysis of a particular lean manufacturing technique; automatically compiling process step data for each process step in the manufacturing process; automatically calculating process metrics from a summation of the compiled process step data for each process step; and, presenting the automatically calculated process metrics to the user. A method for evaluating a transition from a batch manufacturing technique to a lean manufacturing technique can comprise the steps of: collecting manufacturing process step characterization parameters; selecting a lean manufacturing technique for analysis; communicating the selected lean manufacturing technique and the manufacturing process step characterization parameters to an automatic manufacturing technique evaluation engine having a mathematical model for generating manufacturing technique evaluation data; and, using the lean manufacturing technique evaluation data to determine whether to transition from an existing manufacturing technique to the selected lean manufacturing technique.
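The compile-and-summarize step the patent describes can be sketched as a plain summation over per-step data. The field names and metrics below (lead time, value-added ratio) are illustrative assumptions, not the patent's actual parameter set:

```python
from dataclasses import dataclass

@dataclass
class ProcessStep:
    # Hypothetical characterization parameters; the patent does not fix field names.
    name: str
    cycle_time_min: float   # time spent working on a unit at this step
    queue_time_min: float   # time the unit waits before this step
    value_added: bool       # does the step actually transform the product?

def compile_metrics(steps):
    """Sum per-step data into aggregate process metrics for the user."""
    total_cycle = sum(s.cycle_time_min for s in steps)
    total_lead = total_cycle + sum(s.queue_time_min for s in steps)
    value_added = sum(s.cycle_time_min for s in steps if s.value_added)
    return {
        "total_lead_time_min": total_lead,
        "value_added_ratio": value_added / total_lead,
    }

line = [
    ProcessStep("stamp", 2.0, 30.0, True),
    ProcessStep("inspect", 1.0, 10.0, False),
    ProcessStep("weld", 4.0, 45.0, True),
]
metrics = compile_metrics(line)
```

A lean-versus-batch comparison would evaluate such metrics under both techniques and recommend a transition when, say, the value-added ratio improves.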

  18. Process yield improvements with process control terminal for varian serial ion implanters

    NASA Astrophysics Data System (ADS)

    Higashi, Harry; Soni, Ameeta; Martinez, Larry; Week, Ken

    Implant processes in a modern wafer production fab are extremely complex. There can be several types of misprocessing, e.g., wrong dose or species, double implants and missed implants. Process Control Terminals (PCTs) for Varian 350Ds installed at Intel fabs were found to substantially reduce the number of misprocessing events. This paper describes those misprocessing events and their subsequent reduction with the use of PCTs. Reliable and simple process control for serial-process ion implanters has been in increasing demand. A well designed process control terminal greatly increases device yield by monitoring all pertinent implanter functions and enabling process engineering personnel to set up process recipes for simple and accurate system operation. By programming user-selectable interlocks, implant errors are reduced, and those that occur are logged for further analysis and prevention. A process control terminal should also be compatible with office personal computers for greater flexibility in system use and data analysis. The impact of a process control terminal's capability is increased productivity and hence higher device yield.

  19. An Aspect-Oriented Framework for Business Process Improvement

    NASA Astrophysics Data System (ADS)

    Pourshahid, Alireza; Mussbacher, Gunter; Amyot, Daniel; Weiss, Michael

    Recently, many organizations have invested in Business Process Management Systems (BPMSs) in order to automate and monitor their processes. Business Activity Monitoring is one of the essential modules of a BPMS, as it provides the core monitoring capabilities. Although the natural step after process monitoring is process improvement, most existing systems do not provide the means to help users with the improvement step. In this paper, we address this issue by proposing an aspect-oriented framework that allows the impact of changes to business processes to be explored with what-if scenarios based on the most appropriate process redesign patterns among several possibilities. As the four cornerstones of a BPMS are the process, goal, performance and validation views, these views need to be aligned automatically by any approach that intends to support automated improvement of business processes. Our framework therefore provides means to reflect process changes in the other views of the business process as well. A health care case study presented as a proof of concept suggests that this novel approach is feasible.

  20. Structure and Randomness of Continuous-Time, Discrete-Event Processes

    NASA Astrophysics Data System (ADS)

    Marzen, Sarah E.; Crutchfield, James P.

    2017-10-01

    Loosely speaking, the Shannon entropy rate is used to gauge a stochastic process' intrinsic randomness; the statistical complexity gives the cost of predicting the process. We calculate, for the first time, the entropy rate and statistical complexity of stochastic processes generated by finite unifilar hidden semi-Markov models—memoryful, state-dependent versions of renewal processes. Calculating these quantities requires introducing novel mathematical objects (ε-machines of hidden semi-Markov processes) and new information-theoretic methods to stochastic processes.
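For intuition about the quantities in this record, the discrete-time analogue is easy to compute: a stationary Markov chain with row-stochastic transition matrix P and stationary distribution pi has Shannon entropy rate h = -sum_i pi_i sum_j P_ij log2 P_ij. The sketch below computes this discrete analogue, not the continuous-time semi-Markov calculation of the paper:

```python
import numpy as np

def stationary(P):
    """Stationary distribution of a row-stochastic transition matrix P."""
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])  # eigenvector for eigenvalue 1
    return pi / pi.sum()

def entropy_rate(P):
    """h = -sum_i pi_i sum_j P_ij log2 P_ij, in bits per step."""
    pi = stationary(P)
    logs = np.zeros_like(P)
    mask = P > 0               # treat 0 * log 0 as 0
    logs[mask] = np.log2(P[mask])
    return float(-np.sum(pi[:, None] * P * logs))

# A fair coin, seen as a two-state chain, has entropy rate exactly 1 bit/step.
fair = np.array([[0.5, 0.5], [0.5, 0.5]])
h_fair = entropy_rate(fair)
```

The semi-Markov case replaces per-step transition entropy with dwell-time distributions, which is what makes the continuous-time calculation nontrivial.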

  1. Combined mesophilic anaerobic and thermophilic aerobic digestion process for high-strength food wastewater to increase removal efficiency and reduce sludge discharge.

    PubMed

    Jang, H M; Park, S K; Ha, J H; Park, J M

    2014-01-01

    In this study, a process that combines the mesophilic anaerobic digestion (MAD) process with thermophilic aerobic digestion (TAD) for high-strength food wastewater (FWW) treatment was developed to examine the removal of organic matter and methane production. All effluent discharged from the MAD process was separated into solid and liquid portions. The liquid part was discarded and the sludge part was passed to the TAD process for further degradation. Then, the digested sludge from the TAD process was recycled back to the MAD unit to achieve low sludge discharge from the combined process. The reactor combination was operated in two phases: during Phase I, 40 d of total hydraulic retention time (HRT) was applied; during Phase II, 20 d was applied. HRT of the TAD process was fixed at 5 d. For a comparison, a control process (single-stage MAD) was operated with the same HRTs of the combined process. Our results indicated that the combined process showed over 90% total solids, volatile solids and chemical oxygen demand removal efficiencies. In addition, the combined process showed a significantly higher methane production rate than that of the control process. Consequently, the experimental data demonstrated that the combined MAD-TAD process was successfully employed for high-strength FWW treatment with highly efficient organic matter reduction and methane production.

  2. Leading processes of patient care and treatment in hierarchical healthcare organizations in Sweden--process managers' experiences.

    PubMed

    Nilsson, Kerstin; Sandoff, Mette

    2015-01-01

    The purpose of this study is to gain a better understanding of the roles and functions of process managers by describing Swedish process managers' experiences of leading processes involving patient care and treatment when working in a hierarchical health-care organization. This study is based on an explorative design. The data were gathered from interviews with 12 process managers at three Swedish hospitals. These data underwent qualitative and interpretative analysis with a modified editing style. The process managers' experiences of leading processes in a hierarchical health-care organization are described under three themes: having or not having a mandate, exposure to conflict situations and leading process development. The results indicate a need for clarity regarding the process manager's responsibility and work content, which needs to be communicated to all managers and staff involved in the patient care and treatment process, irrespective of department. There also needs to be an emphasis on realistic expectations and orientation toward the goals that are an intrinsic part of the process manager's task. Generalizations from the results of qualitative interview studies are limited, but a deeper understanding of the phenomenon was reached, which, in turn, can be transferred to similar settings. This study contributes qualitative descriptions of leading care and treatment processes in a functional, hierarchical health-care organization from process managers' experiences, a subject that has not been investigated earlier.

  3. Information Technology Process Improvement Decision-Making: An Exploratory Study from the Perspective of Process Owners and Process Managers

    ERIC Educational Resources Information Center

    Lamp, Sandra A.

    2012-01-01

    There is information available in the literature that discusses information technology (IT) governance and investment decision making from an executive-level perception, yet there is little information available that offers the perspective of process owners and process managers pertaining to their role in IT process improvement and investment…

  4. 43 CFR 2884.17 - How will BLM process my Processing Category 6 application?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 43 Public Lands: Interior 2 2012-10-01 2012-10-01 false How will BLM process my Processing...-WAY UNDER THE MINERAL LEASING ACT Applying for MLA Grants or TUPs § 2884.17 How will BLM process my... written agreement that describes how BLM will process your application. The final agreement consists of a...

  5. 43 CFR 2884.17 - How will BLM process my Processing Category 6 application?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 43 Public Lands: Interior 2 2013-10-01 2013-10-01 false How will BLM process my Processing...-WAY UNDER THE MINERAL LEASING ACT Applying for MLA Grants or TUPs § 2884.17 How will BLM process my... written agreement that describes how BLM will process your application. The final agreement consists of a...

  6. 15 CFR 15.3 - Acceptance of service of process.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 1 2010-01-01 2010-01-01 false Acceptance of service of process. 15.3... Process § 15.3 Acceptance of service of process. (a) Except as otherwise provided in this subpart, any... employee by law is to be served personally with process. Service of process in this case is inadequate when...

  7. Weaknesses in Applying a Process Approach in Industry Enterprises

    NASA Astrophysics Data System (ADS)

    Kučerová, Marta; Mĺkva, Miroslava; Fidlerová, Helena

    2012-12-01

    The paper deals with the process approach as one of the main principles of quality management. Quality management systems based on the process approach currently represent one of the proven ways to manage an organization. The volume of sales, costs and profit levels are influenced by the quality of processes and efficient process flow. As the results of the research project showed, there are weaknesses in applying the process approach in industrial practice, and in many organizations in Slovakia it has often been only a formal change from functional management to process management. For efficient process management it is essential that companies pay attention to the way their processes are organized and seek their continuous improvement.

  8. Is Primary-Process Cognition a Feature of Hypnosis?

    PubMed

    Finn, Michael T; Goldman, Jared I; Lyon, Gyrid B; Nash, Michael R

    2017-01-01

    The division of cognition into primary and secondary processes is an important part of contemporary psychoanalytic metapsychology. Whereas primary processes are most characteristic of unconscious thought and loose associations, secondary processes generally govern conscious thought and logical reasoning. It has been theorized that an induction into hypnosis is accompanied by a predomination of primary-process cognition over secondary-process cognition. The authors hypothesized that highly hypnotizable individuals would demonstrate more primary-process cognition as measured by a recently developed cognitive-perceptual task. This hypothesis was not supported. In fact, low hypnotizable participants demonstrated higher levels of primary-process cognition. Exploratory analyses suggested a more specific effect: felt connectedness to the hypnotist seemed to promote secondary-process cognition among low hypnotizable participants.

  9. [Dual process in large number estimation under uncertainty].

    PubMed

    Matsumuro, Miki; Miwa, Kazuhisa; Terai, Hitoshi; Yamada, Kento

    2016-08-01

    According to dual process theory, there are two systems in the mind: an intuitive and automatic System 1 and a logical and effortful System 2. While many previous studies about number estimation have focused on simple heuristics and automatic processes, the deliberative System 2 process has not been sufficiently studied. This study focused on the System 2 process for large number estimation. First, we described an estimation process based on participants’ verbal reports. The task, corresponding to the problem-solving process, consisted of creating subgoals, retrieving values, and applying operations. Second, we investigated the influence of such deliberative process by System 2 on intuitive estimation by System 1, using anchoring effects. The results of the experiment showed that the System 2 process could mitigate anchoring effects.

  10. Object-processing neural efficiency differentiates object from spatial visualizers.

    PubMed

    Motes, Michael A; Malach, Rafael; Kozhevnikov, Maria

    2008-11-19

    The visual system processes object properties and spatial properties in distinct subsystems, and we hypothesized that this distinction might extend to individual differences in visual processing. We conducted a functional MRI study investigating the neural underpinnings of individual differences in object versus spatial visual processing. Nine participants of high object-processing ability ('object' visualizers) and eight participants of high spatial-processing ability ('spatial' visualizers) were scanned, while they performed an object-processing task. Object visualizers showed lower bilateral neural activity in lateral occipital complex and lower right-lateralized neural activity in dorsolateral prefrontal cortex. The data indicate that high object-processing ability is associated with more efficient use of visual-object resources, resulting in less neural activity in the object-processing pathway.

  11. Process simulation for advanced composites production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allendorf, M.D.; Ferko, S.M.; Griffiths, S.

    1997-04-01

    The objective of this project is to improve the efficiency and lower the cost of chemical vapor deposition (CVD) processes used to manufacture advanced ceramics by providing the physical and chemical understanding necessary to optimize and control these processes. Project deliverables include: numerical process models; databases of thermodynamic and kinetic information related to the deposition process; and process sensors and software algorithms that can be used for process control. Target manufacturing techniques include CVD fiber coating technologies (used to deposit interfacial coatings on continuous fiber ceramic preforms), chemical vapor infiltration, thin-film deposition processes used in the glass industry, and coating techniques used to deposit wear-, abrasion-, and corrosion-resistant coatings for use in the pulp and paper, metals processing, and aluminum industries.

  12. CDO budgeting

    NASA Astrophysics Data System (ADS)

    Nesladek, Pavel; Wiswesser, Andreas; Sass, Björn; Mauermann, Sebastian

    2008-04-01

    The critical dimension off-target (CDO) is a key parameter for mask house customers, directly affecting the performance of the mask. The CDO is the difference between the feature size target and the measured feature size. The change of CD during the process is compensated either within the process or by data correction; these compensation methods are commonly called process bias and data bias, respectively. A difference between data bias and process bias in manufacturing results in a systematic CDO error; however, this systematic error does not take into account the instability of the process bias. This instability results from minor variations - instabilities of manufacturing processes and changes in materials and/or logistics. Using several masks, the CDO of the manufacturing line can be estimated. For a systematic investigation of the unit process contributions to CDO and an analysis of the factors influencing them, a solid understanding of each unit process and a huge number of masks are necessary. Rough identification of contributing processes and splitting of the final CDO variation between processes can be done with approx. 50 masks with identical design, material and process. Such an amount of data allows us to identify the main contributors and estimate their effect by means of analysis of variance (ANOVA) combined with multivariate analysis. The analysis does not provide information about the root cause of the variation within a particular unit process, but it provides a good estimate of the impact of the process on the stability of the manufacturing line. Additionally, this analysis can be used to identify possible interactions between processes, which cannot be investigated if only single processes are considered. The goal of this work is to evaluate the limits of CDO budgeting models given by the precision and the number of measurements, as well as to partition the variation within the manufacturing process. The CDO variation splits according to the suggested model into contributions from particular processes or process groups. Last but not least, the power of this method to determine the absolute strength of each parameter will be demonstrated. Identification of the root cause of the variation within the unit process itself is not in the scope of this work.
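The ANOVA-based partitioning of CDO variation described in this record can be illustrated with a one-way decomposition: the total sum of squares splits exactly into a between-process and a within-process component. The grouping variable and measurement values below are invented for illustration:

```python
import numpy as np

def anova_partition(groups):
    """One-way ANOVA sums of squares: SS_total = SS_between + SS_within."""
    all_vals = np.concatenate(groups)
    grand = all_vals.mean()
    # variation of group means around the grand mean, weighted by group size
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    # residual variation of measurements around their own group mean
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return ss_between, ss_within

# Hypothetical CDO measurements (nm) for masks grouped by one unit process
# (say, three writer tools); the grouping and values are made up.
rng = np.random.default_rng(1)
groups = [rng.normal(mu, 0.5, size=17) for mu in (0.0, 0.3, -0.2)]
ss_b, ss_w = anova_partition(groups)
share_between = ss_b / (ss_b + ss_w)  # fraction of variation explained by the tool
```

Repeating such a decomposition per unit process, over masks with identical design and material, is one way to build the kind of CDO budget discussed above.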

  13. Application of high-throughput mini-bioreactor system for systematic scale-down modeling, process characterization, and control strategy development.

    PubMed

    Janakiraman, Vijay; Kwiatkowski, Chris; Kshirsagar, Rashmi; Ryll, Thomas; Huang, Yao-Ming

    2015-01-01

    High-throughput systems and processes have typically been targeted for process development and optimization in the bioprocessing industry. For process characterization, bench scale bioreactors have been the system of choice. Due to the need for performing different process conditions for multiple process parameters, the process characterization studies typically span several months and are considered time and resource intensive. In this study, we have shown the application of a high-throughput mini-bioreactor system viz. the Advanced Microscale Bioreactor (ambr15(TM) ), to perform process characterization in less than a month and develop an input control strategy. As a pre-requisite to process characterization, a scale-down model was first developed in the ambr system (15 mL) using statistical multivariate analysis techniques that showed comparability with both manufacturing scale (15,000 L) and bench scale (5 L). Volumetric sparge rates were matched between ambr and manufacturing scale, and the ambr process matched the pCO2 profiles as well as several other process and product quality parameters. The scale-down model was used to perform the process characterization DoE study and product quality results were generated. Upon comparison with DoE data from the bench scale bioreactors, similar effects of process parameters on process yield and product quality were identified between the two systems. We used the ambr data for setting action limits for the critical controlled parameters (CCPs), which were comparable to those from bench scale bioreactor data. In other words, the current work shows that the ambr15(TM) system is capable of replacing the bench scale bioreactor system for routine process development and process characterization. © 2015 American Institute of Chemical Engineers.

  14. Consumers' conceptualization of ultra-processed foods.

    PubMed

    Ares, Gastón; Vidal, Leticia; Allegue, Gimena; Giménez, Ana; Bandeira, Elisa; Moratorio, Ximena; Molina, Verónika; Curutchet, María Rosa

    2016-10-01

    Consumption of ultra-processed foods has been associated with low diet quality, obesity and other non-communicable diseases. This situation makes it necessary to develop educational campaigns to discourage consumers from substituting meals based on unprocessed or minimally processed foods by ultra-processed foods. In this context, the aim of the present work was to investigate how consumers conceptualize the term ultra-processed foods and to evaluate if the foods they perceive as ultra-processed are in concordance with the products included in the NOVA classification system. An online study was carried out with 2381 participants. They were asked to explain what they understood by ultra-processed foods and to list foods that can be considered ultra-processed. Responses were analysed using inductive coding. The great majority of the participants was able to provide an explanation of what ultra-processed foods are, which was similar to the definition described in the literature. Most of the participants described ultra-processed foods as highly processed products that usually contain additives and other artificial ingredients, stressing that they have low nutritional quality and are unhealthful. The most relevant products for consumers' conceptualization of the term were in agreement with the NOVA classification system and included processed meats, soft drinks, snacks, burgers, powdered and packaged soups and noodles. However, some of the participants perceived processed foods, culinary ingredients and even some minimally processed foods as ultra-processed. This suggests that in order to accurately convey their message, educational campaigns aimed at discouraging consumers from consuming ultra-processed foods should include a clear definition of the term and describe some of their specific characteristics, such as the type of ingredients included in their formulation and their nutritional composition. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Rapid communication: Global-local processing affects recognition of distractor emotional faces.

    PubMed

    Srinivasan, Narayanan; Gupta, Rashmi

    2011-03-01

    Recent studies have shown links between happy faces and global, distributed attention as well as sad faces to local, focused attention. Emotions have been shown to affect global-local processing. Given that studies on emotion-cognition interactions have not explored the effect of perceptual processing at different spatial scales on processing stimuli with emotional content, the present study investigated the link between perceptual focus and emotional processing. The study investigated the effects of global-local processing on the recognition of distractor faces with emotional expressions. Participants performed a digit discrimination task with digits at either the global level or the local level presented against a distractor face (happy or sad) as background. The results showed that global processing associated with broad scope of attention facilitates recognition of happy faces, and local processing associated with narrow scope of attention facilitates recognition of sad faces. The novel results of the study provide conclusive evidence for emotion-cognition interactions by demonstrating the effect of perceptual processing on emotional faces. The results along with earlier complementary results on the effect of emotion on global-local processing support a reciprocal relationship between emotional processing and global-local processing. Distractor processing with emotional information also has implications for theories of selective attention.

  16. Tomographical process monitoring of laser transmission welding with OCT

    NASA Astrophysics Data System (ADS)

    Ackermann, Philippe; Schmitt, Robert

    2017-06-01

    Process control of laser processes still encounters many obstacles. Although these processes are stable, narrow process parameter windows and process deviations have increased the requirements on the process itself and on monitoring devices. Laser transmission welding, as a contactless and locally limited joining technique, is well established in a variety of demanding production areas. For example, sensitive parts demand a particle-free joining technique that does not affect the inner components. Inline non-destructive optical measurement systems capable of providing non-invasive tomographical images of the transparent material, the weld seam and its surrounding areas with micron resolution would improve the overall process. The obtained measurement data enable qualitative feedback into the system to adapt parameters for a more robust process. Within this paper we present an inline monitoring device based on Fourier-domain optical coherence tomography developed within the European-funded research project "Manunet Weldable". This device, after adaptation to the laser transmission welding process, is optically and mechanically integrated into the existing laser system. The main target lies in inline process control aimed at extracting tomographical geometrical measurement data from the weld seam forming process. Usage of this technology makes offline destructive testing of produced parts obsolete.

  17. A quality-refinement process for medical imaging applications.

    PubMed

    Neuhaus, J; Maleike, D; Nolden, M; Kenngott, H-G; Meinzer, H-P; Wolf, I

    2009-01-01

    To introduce and evaluate a process for refinement of software quality that is suitable for research groups. In order to avoid constraining researchers too much, the quality improvement process has to be designed carefully. The scope of this paper is to present and evaluate a process to advance quality aspects of existing research prototypes in order to make them ready for initial clinical studies. The proposed process is tailored for research environments and is therefore more lightweight than traditional quality management processes: it focuses on quality criteria that are important at the given stage of the software life cycle and emphasizes tools that automate aspects of the process. To evaluate the additional effort that comes along with the process, it was applied exemplarily to eight prototypical software modules for medical image processing. The introduced process improved the quality of all prototypes so that they could be successfully used in clinical studies. The quality refinement required an average of 13 person-days of additional effort per project. Overall, 107 bugs were found and resolved by applying the process. Careful selection of quality criteria and the usage of automated process tools lead to a lightweight quality refinement process suitable for scientific research groups that can be applied to ensure a successful transfer of technical software prototypes into clinical research workflows.

  18. Negative Binomial Process Count and Mixture Modeling.

    PubMed

    Zhou, Mingyuan; Carin, Lawrence

    2015-02-01

    The seemingly disjoint problems of count and mixture modeling are united under the negative binomial (NB) process. A gamma process is employed to model the rate measure of a Poisson process, whose normalization provides a random probability measure for mixture modeling and whose marginalization leads to an NB process for count modeling. A draw from the NB process consists of a Poisson distributed finite number of distinct atoms, each of which is associated with a logarithmic distributed number of data samples. We reveal relationships between various count- and mixture-modeling distributions and construct a Poisson-logarithmic bivariate distribution that connects the NB and Chinese restaurant table distributions. Fundamental properties of the models are developed, and we derive efficient Bayesian inference. It is shown that with augmentation and normalization, the NB process and gamma-NB process can be reduced to the Dirichlet process and hierarchical Dirichlet process, respectively. These relationships highlight theoretical, structural, and computational advantages of the NB process. A variety of NB processes, including the beta-geometric, beta-NB, marked-beta-NB, marked-gamma-NB and zero-inflated-NB processes, with distinct sharing mechanisms, are also constructed. These models are applied to topic modeling, with connections made to existing algorithms under Poisson factor analysis. Example results show the importance of inferring both the NB dispersion and probability parameters.
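    The gamma-Poisson construction at the heart of this abstract can be illustrated with a small simulation. This is a hedged sketch of the standard mixture identity (a gamma-distributed Poisson rate marginally yields negative binomial counts), not the authors' inference code; parameter values and sample sizes are arbitrary.

    ```python
    import math
    import random

    def poisson_sample(lam, rng):
        """Draw a Poisson(lam) variate with Knuth's multiplication method."""
        threshold = math.exp(-lam)
        k, prod = 0, rng.random()
        while prod > threshold:
            k += 1
            prod *= rng.random()
        return k

    def nb_via_gamma_poisson(r, p, rng):
        """NB(r, p) count as a Poisson draw whose rate is gamma-distributed.

        rate ~ Gamma(shape=r, scale=p/(1-p));  count | rate ~ Poisson(rate).
        Marginally the count is negative binomial with mean r*p/(1-p) and
        variance r*p/(1-p)^2.
        """
        rate = rng.gammavariate(r, p / (1.0 - p))
        return poisson_sample(rate, rng)

    if __name__ == "__main__":
        rng = random.Random(0)
        r, p, n = 2.0, 0.5, 50_000
        draws = [nb_via_gamma_poisson(r, p, rng) for _ in range(n)]
        mean = sum(draws) / n
        var = sum((x - mean) ** 2 for x in draws) / n
        # Analytic NB moments here: mean = r*p/(1-p) = 2, variance = 4
        print(round(mean, 2), round(var, 2))
    ```

    The sample mean and variance should land near the analytic values 2 and 4, the overdispersion (variance > mean) being exactly what distinguishes the NB from the plain Poisson.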

  19. [Process management in the hospital pharmacy for the improvement of the patient safety].

    PubMed

    Govindarajan, R; Perelló-Juncá, A; Parès-Marimòn, R M; Serrais-Benavente, J; Ferrandez-Martí, D; Sala-Robinat, R; Camacho-Calvente, A; Campabanal-Prats, C; Solà-Anderiu, I; Sanchez-Caparrós, S; Gonzalez-Estrada, J; Martinez-Olalla, P; Colomer-Palomo, J; Perez-Mañosas, R; Rodríguez-Gallego, D

    2013-01-01

    To define a process management model for a hospital pharmacy in order to measure, analyse and make continuous improvements in patient safety and healthcare quality. In order to implement process management, Igualada Hospital was divided into different processes, one of which was the Hospital Pharmacy. A multidisciplinary management team was given responsibility for each process. For each sub-process one person was identified to be responsible, and a working group was formed under his/her leadership. With the help of each working group, a risk analysis using failure modes and effects analysis (FMEA) was performed, and the corresponding improvement actions were implemented. Sub-process indicators were also identified, and different process management mechanisms were introduced. The first risk analysis with FMEA produced more than thirty preventive actions to improve patient safety. Later, the weekly analysis of errors, as well as the monthly analysis of key process indicators, permitted us to monitor process results and, as each sub-process manager participated in these meetings, also to assume accountability and responsibility, thus consolidating the culture of excellence. The introduction of different process management mechanisms, with the participation of people responsible for each sub-process, introduces a participative management tool for the continuous improvement of patient safety and healthcare quality. Copyright © 2012 SECA. Published by Elsevier Espana. All rights reserved.
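    The FMEA step mentioned in this abstract is conventionally operationalized by ranking failure modes on a Risk Priority Number. The sketch below shows that standard calculation; the failure modes, 1-10 ratings and action threshold are hypothetical illustrations, not data from the paper.

    ```python
    # FMEA risk prioritization sketch. Each factor (severity, occurrence,
    # detection) is rated on a 1-10 scale; their product is the Risk Priority
    # Number (RPN). The entries below are invented pharmacy examples.

    def rpn(severity, occurrence, detection):
        """Risk Priority Number of one failure mode."""
        return severity * occurrence * detection

    failure_modes = [
        # (description, severity, occurrence, detection)
        ("wrong dose transcribed at prescription entry", 9, 4, 5),
        ("look-alike drugs stored adjacently", 7, 3, 6),
        ("label printed for wrong patient", 8, 2, 3),
    ]

    ACTION_THRESHOLD = 100  # modes at or above this RPN get a preventive action

    prioritized = sorted(
        ((rpn(s, o, d), desc) for desc, s, o, d in failure_modes), reverse=True
    )
    for score, desc in prioritized:
        flag = "ACT" if score >= ACTION_THRESHOLD else "monitor"
        print(f"{score:4d}  {flag:7s}  {desc}")
    ```

    Sorting by RPN is how a working group typically decides which of the thirty-plus preventive actions to implement first.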

  20. Distributed processing method for arbitrary view generation in camera sensor network

    NASA Astrophysics Data System (ADS)

    Tehrani, Mehrdad P.; Fujii, Toshiaki; Tanimoto, Masayuki

    2003-05-01

    A camera sensor network is a network in which each sensor node can capture video signals, process them, and communicate with other nodes. The processing task in this network is to generate an arbitrary view, which can be requested by a central node or a user. To avoid unnecessary communication between nodes in the camera sensor network and to speed up processing, we have distributed the processing tasks among the nodes. In this method, each sensor node processes part of the interpolation algorithm to generate the interpolated image with local communication between nodes. The processing task in the camera sensor network is ray-space interpolation, an object-independent method based on MSE minimization using adaptive filtering. Two methods were proposed for distributing the processing tasks while sharing image data locally: Fully Image Shared Decentralized Processing (FIS-DP) and Partially Image Shared Decentralized Processing (PIS-DP). Comparison of the proposed methods with the Centralized Processing (CP) method shows that FIS-DP has the highest processing speed, followed by PIS-DP, with CP the slowest. The communication rates of CP and PIS-DP are almost the same and better than that of FIS-DP. PIS-DP is therefore recommended because of its better overall performance than CP and FIS-DP.

  1. EEG alpha synchronization is related to top-down processing in convergent and divergent thinking

    PubMed Central

    Benedek, Mathias; Bergner, Sabine; Könen, Tanja; Fink, Andreas; Neubauer, Aljoscha C.

    2011-01-01

    Synchronization of EEG alpha activity has been referred to as being indicative of cortical idling, but according to more recent evidence it has also been associated with active internal processing and creative thinking. The main objective of this study was to investigate to what extent EEG alpha synchronization is related to internal processing demands and to specific cognitive processes involved in creative thinking. To this end, EEG was measured during a convergent and a divergent thinking task (i.e., a creativity-related task), each of which was performed once under low and once under high internal processing demands. High internal processing demands were established by masking the stimulus (after encoding) and thus preventing further bottom-up processing. Frontal alpha synchronization was observed during convergent and divergent thinking only under exclusive top-down control (high internal processing demands), but not when bottom-up processing was allowed (low internal processing demands). We conclude that frontal alpha synchronization is related to top-down control rather than to specific creativity-related cognitive processes. Frontal alpha synchronization, which has been observed in a variety of different creativity tasks, thus may not reflect a brain state specific to creative cognition but can probably be attributed to the high internal processing demands typically involved in creative thinking. PMID:21925520

  2. Kennedy Space Center Payload Processing

    NASA Technical Reports Server (NTRS)

    Lawson, Ronnie; Engler, Tom; Colloredo, Scott; Zide, Alan

    2011-01-01

    This slide presentation reviews the payload processing functions at Kennedy Space Center. It details some of the payloads processed at KSC, the typical processing tasks, the facilities available for processing payloads, and the capabilities and customer services that are available.

  3. Improving the Document Development Process: Integrating Relational Data and Statistical Process Control.

    ERIC Educational Resources Information Center

    Miller, John

    1994-01-01

    Presents an approach to document numbering, document titling, and process measurement which, when used with fundamental techniques of statistical process control, reveals meaningful process-element variation as well as nominal productivity models. (SR)

  4. USE OF INDICATOR ORGANISMS FOR DETERMINING PROCESS EFFECTIVENESS

    EPA Science Inventory

    Wastewaters, process effluents and treatment process residuals contain a variety of microorganisms. Many factors influence their densities as they move through collection systems and process equipment. Biological treatment systems rely on the catabolic processes of such microor...

  5. Food processing by high hydrostatic pressure.

    PubMed

    Yamamoto, Kazutaka

    2017-04-01

    High hydrostatic pressure (HHP) processing, as a nonthermal process, can be used to inactivate microbes while minimizing chemical reactions in food. In this regard, HHP levels of 100 MPa (986.9 atm / 1019.7 kgf/cm2) and above are applied to food. Conventional thermal processing damages food components related to color, flavor, and nutrition via enhanced chemical reactions; HHP processing minimizes this damage while inactivating microbes, enabling the production of high-quality, safe foods. The first commercial HHP-processed foods, fruit products such as jams, were launched in 1990, and other products have since been commercialized: retort rice products (enhanced water impregnation), cooked hams and sausages (shelf life extension), soy sauce with minimized salt (short-time fermentation owing to enhanced enzymatic reactions), and beverages (shelf life extension). The characteristics of HHP food processing are reviewed from the viewpoints of nonthermal processing, history, research and development, physical and biochemical changes, and processing equipment.
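    The parenthetical pressure equivalences quoted in this abstract follow directly from the defined conversion factors, as a short check confirms (conversion constants only; nothing here is from the paper beyond the 100 MPa figure):

    ```python
    # Unit-conversion check for the pressure figures quoted in the abstract:
    # 100 MPa expressed in standard atmospheres and in kgf/cm^2.
    PA_PER_ATM = 101_325.0         # definition of the standard atmosphere
    PA_PER_KGF_PER_CM2 = 98_066.5  # 1 kgf/cm^2 (technical atmosphere) in pascals

    pressure_pa = 100e6  # 100 MPa, the lower bound used in HHP food processing

    atm = pressure_pa / PA_PER_ATM
    kgf_cm2 = pressure_pa / PA_PER_KGF_PER_CM2
    print(f"{atm:.1f} atm, {kgf_cm2:.1f} kgf/cm^2")  # 986.9 atm, 1019.7 kgf/cm^2
    ```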

  6. [Near infrared spectroscopy based process trajectory technology and its application in monitoring and controlling of traditional Chinese medicine manufacturing process].

    PubMed

    Li, Wen-Long; Qu, Hai-Bin

    2016-10-01

    In this paper, the principle of NIRS (near infrared spectroscopy)-based process trajectory technology is introduced. The main steps of the technique include: ① in-line collection of the process spectra of different techniques; ② unfolding of the 3-D process spectra; ③ determination of the process trajectories and their normal limits; ④ monitoring of new batches with the established MSPC (multivariate statistical process control) models. Applications of the technology to chemical and biological medicines are reviewed briefly. Through a comprehensive introduction of our feasibility research on monitoring traditional Chinese medicine technical processes using NIRS-based multivariate process trajectories, several important practical problems in urgent need of solutions are proposed, and the application prospects of NIRS-based process trajectory technology are discussed. Copyright© by the Chinese Pharmaceutical Association.
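    Steps ③-④ of such an MSPC workflow are commonly implemented with a Hotelling T² control chart: normal limits are estimated from reference batches, and a new batch is flagged when its T² exceeds the limit. The sketch below is a generic illustration with simulated two-dimensional trajectory features (so the covariance inversion stays a hand-coded 2×2), not the authors' spectra or models.

    ```python
    import random

    # Hotelling T^2 monitoring sketch: build normal-operation statistics from
    # reference batches, then flag new batches whose T^2 exceeds a control
    # limit. All data here are simulated for illustration.

    def t_squared(x, mean, cov):
        """T^2 = (x - mean)' * inv(cov) * (x - mean), for 2-D features."""
        a, b, c, d = cov[0][0], cov[0][1], cov[1][0], cov[1][1]
        det = a * d - b * c
        inv = [[d / det, -b / det], [-c / det, a / det]]
        dx = [x[0] - mean[0], x[1] - mean[1]]
        return (dx[0] * (inv[0][0] * dx[0] + inv[0][1] * dx[1])
                + dx[1] * (inv[1][0] * dx[0] + inv[1][1] * dx[1]))

    rng = random.Random(7)
    # Reference (in-control) batches: two summary features per batch.
    ref = [(rng.gauss(10, 0.5), rng.gauss(5, 0.2)) for _ in range(200)]
    n = len(ref)
    mean = [sum(x for x, _ in ref) / n, sum(y for _, y in ref) / n]
    cov = [[0.0, 0.0], [0.0, 0.0]]
    for x, y in ref:
        dx, dy = x - mean[0], y - mean[1]
        cov[0][0] += dx * dx / (n - 1); cov[0][1] += dx * dy / (n - 1)
        cov[1][0] += dy * dx / (n - 1); cov[1][1] += dy * dy / (n - 1)

    LIMIT = 13.8  # approx. 99.9% chi-square limit for 2 degrees of freedom

    in_control = t_squared((10.2, 5.1), mean, cov)
    deviating = t_squared((13.0, 4.0), mean, cov)
    print(in_control < LIMIT, deviating > LIMIT)
    ```

    In practice the limit would be set from an F-distribution with the reference-batch count, and the features would come from the unfolded NIR spectra.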

  7. Recollection is a continuous process: implications for dual-process theories of recognition memory.

    PubMed

    Mickes, Laura; Wais, Peter E; Wixted, John T

    2009-04-01

    Dual-process theory, which holds that recognition decisions can be based on recollection or familiarity, has long seemed incompatible with signal detection theory, which holds that recognition decisions are based on a singular, continuous memory-strength variable. Formal dual-process models typically regard familiarity as a continuous process (i.e., familiarity comes in degrees), but they construe recollection as a categorical process (i.e., recollection either occurs or does not occur). A continuous process is characterized by a graded relationship between confidence and accuracy, whereas a categorical process is characterized by a binary relationship such that high confidence is associated with high accuracy but all lower degrees of confidence are associated with chance accuracy. Using a source-memory procedure, we found that the relationship between confidence and source-recollection accuracy was graded. Because recollection, like familiarity, is a continuous process, dual-process theory is more compatible with signal detection theory than previously thought.
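    The graded confidence-accuracy relationship that the abstract takes as the signature of a continuous process can be reproduced in a minimal signal detection simulation: confidence tracks the distance of a continuous evidence value from the criterion, so accuracy rises smoothly with confidence rather than collapsing to chance below the top bin. The d' value and bin cutoffs below are arbitrary illustrations, not the study's parameters.

    ```python
    import random

    # Signal detection sketch of a continuous process: source-A and source-B
    # trials produce Gaussian evidence centred at +1 and -1; the response
    # follows the sign of the evidence and confidence follows its magnitude.

    rng = random.Random(42)
    bins = {"low": [], "medium": [], "high": []}

    for _ in range(60_000):
        source_a = rng.random() < 0.5
        evidence = rng.gauss(1.0 if source_a else -1.0, 1.0)
        correct = (evidence > 0) == source_a
        strength = abs(evidence)
        label = "low" if strength < 0.7 else "medium" if strength < 1.4 else "high"
        bins[label].append(correct)

    accuracy = {k: sum(v) / len(v) for k, v in bins.items()}
    # A continuous process predicts graded accuracy (low < medium < high),
    # with even the lowest-confidence bin above chance; a categorical process
    # would predict chance accuracy everywhere below the top bin.
    print({k: round(a, 2) for k, a in accuracy.items()})
    ```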

  8. A qualitative assessment of a random process proposed as an atmospheric turbulence model

    NASA Technical Reports Server (NTRS)

    Sidwell, K.

    1977-01-01

    A random process is formed by the product of two Gaussian processes and the sum of that product with a third Gaussian process. The resulting total random process is interpreted as the sum of an amplitude modulated process and a slowly varying, random mean value. The properties of the process are examined, including an interpretation of the process in terms of the physical structure of atmospheric motions. The inclusion of the mean value variation gives an improved representation of the properties of atmospheric motions, since the resulting process can account for the differences in the statistical properties of atmospheric velocity components and their gradients. The application of the process to atmospheric turbulence problems, including the response of aircraft dynamic systems, is examined. The effects of the mean value variation upon aircraft loads are small in most cases, but can be important in the measurement and interpretation of atmospheric turbulence data.
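    The construction described here, X = G1·G2 + G3, is easy to simulate, and the simulation shows why it is useful for turbulence: the product term makes the process leptokurtic (heavy-tailed, intermittent) rather than Gaussian. Unit variances and independent draws are simplifying assumptions for illustration; the paper's component processes are correlated in time, with G3 slowly varying.

    ```python
    import random

    # Sketch of the described process: the product of two Gaussian processes
    # plus a third Gaussian acting as a random mean value.

    def sample_path(n, rng):
        out = []
        for _ in range(n):
            g1, g2, g3 = rng.gauss(0, 1), rng.gauss(0, 1), rng.gauss(0, 1)
            out.append(g1 * g2 + g3)  # amplitude-modulated term plus mean term
        return out

    rng = random.Random(3)
    xs = sample_path(100_000, rng)
    n = len(xs)
    m = sum(xs) / n
    var = sum((x - m) ** 2 for x in xs) / n
    kurt = sum((x - m) ** 4 for x in xs) / n / var ** 2
    # Analytically (unit-variance, independent components): variance = 2 and
    # kurtosis = 18/4 = 4.5, versus 3 for a Gaussian.
    print(round(var, 2), round(kurt, 2))
    ```

    The excess kurtosis is the "patchiness" property that makes this model a better fit to measured atmospheric velocity data than a plain Gaussian process.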

  9. Metals Recovery from Artificial Ore in Case of Printed Circuit Boards, Using Plasmatron Plasma Reactor

    PubMed Central

    Szałatkiewicz, Jakub

    2016-01-01

    This paper presents an investigation of metals production from artificial ore consisting of printed circuit board (PCB) waste, processed in a plasmatron plasma reactor. A test setup was designed and built that enabled research on plasma processing of PCB waste at a scale of more than 700 kg/day. The designed plasma process is presented and discussed; in the tests it consumed 2 kWh/kg of processed waste. The process products are characterized with elemental analyses of metals and slag. The average recovery of metals in the presented experiments is 76%. Metals recovered include Ag, Au, Pd, Cu, Sn, Pb, and others. The chosen process parameters are presented: energy consumption, throughput, process temperatures, and air consumption. The presented technology allows processing of variable and hard-to-process printed circuit board waste, which can constitute up to 100% of the input mass. PMID:28773804

  10. Characterisation and Processing of Some Iron Ores of India

    NASA Astrophysics Data System (ADS)

    Krishna, S. J. G.; Patil, M. R.; Rudrappa, C.; Kumar, S. P.; Ravi, B. P.

    2013-10-01

    Lack of process characterization data for ores based on granulometry, texture, mineralogy, physical and chemical properties, and on the merits and limitations of the process, market, and local conditions may mislead the mineral processing entrepreneur. Proper implementation of process characterization and geotechnical map data results in optimized, sustainable utilization of the resource by processing. A few case studies of process characterization of some Indian iron ores are dealt with. The tentative ascending order of process refractoriness of iron ores is: massive hematite/magnetite < marine black iron oxide sands < laminated soft friable siliceous ore fines < massive banded magnetite quartzite < laminated soft friable clayey aluminous ore fines < massive banded hematite quartzite/jasper < massive clayey hydrated iron oxide ore < massive manganese-bearing iron ores < Ti-V bearing magmatic magnetite ore < ferruginous cherty quartzite. Based on diagnostic process characterization, the ores have been classified and generic processes have been adopted for some Indian iron ores.

  11. Measuring health care process quality with software quality measures.

    PubMed

    Yildiz, Ozkan; Demirörs, Onur

    2012-01-01

    Existing quality models focus on specific diseases, clinics, or clinical areas. Although they contain structure, process, or output type measures, there is no model that measures the quality of health care processes comprehensively. In addition, because overall process quality is not measured, hospitals cannot compare the quality of their processes internally or externally. To address these problems, a new model was developed from software quality measures. We adapted the ISO/IEC 9126 software quality standard to health care processes, and then added JCIAS (Joint Commission International Accreditation Standards for Hospitals) measurable elements to the model scope to unify functional requirements. Measurement results for the assessment (diagnosing) process are provided in this paper. After the application, it was concluded that the model identifies weak and strong aspects of the processes, gives a more detailed picture of process quality, and provides quantifiable information that hospitals can use to compare their processes with those of multiple organizations.

  12. Thermal Stir Welding: A New Solid State Welding Process

    NASA Technical Reports Server (NTRS)

    Ding, R. Jeffrey

    2003-01-01

    Thermal stir welding is a new welding process developed at NASA's Marshall Space Flight Center in Huntsville, AL. Thermal stir welding is similar to friction stir welding in that it joins similar or dissimilar materials without melting the parent material. However, unlike friction stir welding, the heating, stirring and forging elements of the process are independent of each other and are separately controlled. Furthermore, the heating element of the process can be either a solid-state process (such as a thermal blanket, induction-type process, etc.) or a fusion process (YAG laser, plasma torch, etc.). The separation of the heating, stirring and forging elements allows more degrees of freedom for greater process control. This paper introduces the mechanics of the thermal stir welding process. In addition, weld mechanical property data are presented for selected alloys, along with metallurgical analysis.

  13. Thermal Stir Welding: A New Solid State Welding Process

    NASA Technical Reports Server (NTRS)

    Ding, R. Jeffrey; Munafo, Paul M. (Technical Monitor)

    2002-01-01

    Thermal stir welding is a new welding process developed at NASA's Marshall Space Flight Center in Huntsville, AL. Thermal stir welding is similar to friction stir welding in that it joins similar or dissimilar materials without melting the parent material. However, unlike friction stir welding, the heating, stirring and forging elements of the process are independent of each other and are separately controlled. Furthermore, the heating element of the process can be either a solid-state process (such as a thermal blanket, induction-type process, etc.) or a fusion process (YAG laser, plasma torch, etc.). The separation of the heating, stirring and forging elements allows more degrees of freedom for greater process control. This paper introduces the mechanics of the thermal stir welding process. In addition, weld mechanical property data are presented for selected alloys, along with metallurgical analysis.

  14. Metals Recovery from Artificial Ore in Case of Printed Circuit Boards, Using Plasmatron Plasma Reactor.

    PubMed

    Szałatkiewicz, Jakub

    2016-08-10

    This paper presents an investigation of metals production from artificial ore consisting of printed circuit board (PCB) waste, processed in a plasmatron plasma reactor. A test setup was designed and built that enabled research on plasma processing of PCB waste at a scale of more than 700 kg/day. The designed plasma process is presented and discussed; in the tests it consumed 2 kWh/kg of processed waste. The process products are characterized with elemental analyses of metals and slag. The average recovery of metals in the presented experiments is 76%. Metals recovered include Ag, Au, Pd, Cu, Sn, Pb, and others. The chosen process parameters are presented: energy consumption, throughput, process temperatures, and air consumption. The presented technology allows processing of variable and hard-to-process printed circuit board waste, which can constitute up to 100% of the input mass.

  15. The origins of levels-of-processing effects in a conceptual test: evidence for automatic influences of memory from the process-dissociation procedure.

    PubMed

    Bergerbest, Dafna; Goshen-Gottstein, Yonatan

    2002-12-01

    In three experiments, we explored automatic influences of memory in a conceptual memory task, as affected by a levels-of-processing (LoP) manipulation. We also explored the origins of the LoP effect by examining whether the effect emerged only when participants in the shallow condition truncated the perceptual processing (the lexical-processing hypothesis) or even when the entire word was encoded in this condition (the conceptual-processing hypothesis). Using the process-dissociation procedure and an implicit association-generation task, we found that the deep encoding condition yielded higher estimates of automatic influences than the shallow condition. In support of the conceptual processing hypothesis, the LoP effect was found even when the shallow task did not lead to truncated processing of the lexical units. We suggest that encoding for meaning is a prerequisite for automatic processing on conceptual tests of memory.

  16. Exploring business process modelling paradigms and design-time to run-time transitions

    NASA Astrophysics Data System (ADS)

    Caron, Filip; Vanthienen, Jan

    2016-09-01

    The business process management literature describes a multitude of approaches (e.g. imperative, declarative or event-driven) that each result in a different mix of process flexibility, compliance, effectiveness and efficiency. Although the use of a single approach over the process lifecycle is often assumed, transitions between approaches at different phases in the process lifecycle may also be considered. This article explores several business process strategies by analysing the approaches at different phases in the process lifecycle as well as the various transitions.

  17. System Engineering Concept Demonstration, Process Model. Volume 3

    DTIC Science & Technology

    1992-12-01

    Process or Process Model: The System Engineering process must be the enactment of the aforementioned definitions. Therefore, a process is an enactment of a... The Prototype Tradeoff Scenario demonstrates six levels of abstraction in the Process Model. The Process Model symbology is explained within the "Help" icon...

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eun, H.C.; Cho, Y.Z.; Choi, J.H.

    A regeneration process for LiCl-KCl eutectic waste salt generated by the pyrochemical processing of spent nuclear fuel has been studied. The regeneration process is composed of a chemical conversion process and a vacuum distillation process. Through it, renewable salt can be recovered from the waste salt with high efficiency, and rare earth nuclides in the waste salt can be separated as oxides or phosphates. The regeneration process can thus contribute greatly to reducing the waste volume and to creating durable final waste forms. (authors)

  19. An open system approach to process reengineering in a healthcare operational environment.

    PubMed

    Czuchry, A J; Yasin, M M; Norris, J

    2000-01-01

    The objective of this study is to examine the applicability of process reengineering in a healthcare operational environment. The intake process of a mental healthcare service delivery system is analyzed systematically to identify process-related problems. A methodology which utilizes an open system orientation coupled with process reengineering is utilized to overcome operational and patient related problems associated with the pre-reengineered intake process. The systematic redesign of the intake process resulted in performance improvements in terms of cost, quality, service and timing.

  20. Developing the JPL Engineering Processes

    NASA Technical Reports Server (NTRS)

    Linick, Dave; Briggs, Clark

    2004-01-01

    This paper briefly recounts the recent history of process reengineering at the NASA Jet Propulsion Laboratory, with a focus on the engineering processes. The JPL process structure is described and the process development activities of the past several years are outlined. The main focus of the paper is on the current process structure, the emphasis on the flight project life cycle, the governance approach that led to the Flight Project Practices, and the remaining effort to capture process knowledge at the detail level of the work group.

  1. Water-saving liquid-gas conditioning system

    DOEpatents

    Martin, Christopher; Zhuang, Ye

    2014-01-14

    A method for treating a process gas with a liquid comprises contacting a process gas with a hygroscopic working fluid in order to remove a constituent from the process gas. A system for treating a process gas with a liquid comprises a hygroscopic working fluid comprising a component adapted to absorb or react with a constituent of a process gas, and a liquid-gas contactor for contacting the working fluid and the process gas, wherein the constituent is removed from the process gas within the liquid-gas contactor.

  2. Model for Simulating a Spiral Software-Development Process

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). 
Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code), productivity (number of lines of code per hour), and number of defects per source line of code. The user provides the number of resources, the overall percent of effort that should be allocated to each process step, and the number of desired staff members for each step. The output of PATT includes the size of the product, a measure of effort, a measure of rework effort, the duration of the entire process, and the numbers of injected, detected, and corrected defects as well as a number of other interesting features. In the development of the present model, steps were added to the IEEE 12207 waterfall process, and this model and its implementing software were made to run repeatedly through the sequence of steps, each repetition representing an iteration in a spiral process. Because the IEEE 12207 model is founded on a waterfall paradigm, it enables direct comparison of spiral and waterfall processes. The model can be used throughout a software-development project to analyze the project as more information becomes available. For instance, data from early iterations can be used as inputs to the model, and the model can be used to estimate the time and cost of carrying the project to completion.
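    The spiral-versus-waterfall contrast this record describes can be caricatured in a few lines: run the same size of product through one pass or through several iterations, each with its own overhead, defect injection, detection, and rework. Every rate below (effort per line, injection and detection rates, iteration overhead, rework cost) is a hypothetical round number, not a calibrated PATT or IEEE 12207 input.

    ```python
    # Toy iteration model in the spirit of the simulation described above.

    def run_process(total_loc, iterations, detect_rate=0.6,
                    inject_per_loc=0.02, effort_per_loc=0.1, overhead=40.0):
        """Return (effort_hours, escaped_defects) after the given iterations."""
        loc_per_iter = total_loc / iterations
        effort, live_defects = 0.0, 0.0
        for _ in range(iterations):
            effort += overhead + loc_per_iter * effort_per_loc
            live_defects += loc_per_iter * inject_per_loc
            found = live_defects * detect_rate   # testing within the iteration
            effort += found * 0.5                # rework cost per fixed defect
            live_defects -= found
        return effort, live_defects

    waterfall = run_process(10_000, iterations=1)
    spiral = run_process(10_000, iterations=5)
    # Under these assumptions the spiral variant costs more effort (repeated
    # overhead, more rework found and fixed) but lets far fewer defects escape,
    # mirroring the abstract's "may cost more ... but may better satisfy" trade.
    print(round(waterfall[0], 1), round(waterfall[1], 1))
    print(round(spiral[0], 1), round(spiral[1], 1))
    ```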

  3. Magnitude processing of symbolic and non-symbolic proportions: an fMRI study.

    PubMed

    Mock, Julia; Huber, Stefan; Bloechle, Johannes; Dietrich, Julia F; Bahnmueller, Julia; Rennig, Johannes; Klein, Elise; Moeller, Korbinian

    2018-05-10

    Recent research indicates that processing proportion magnitude is associated with activation in the intraparietal sulcus: brain areas associated with the processing of numbers (i.e., absolute magnitude) were activated during the processing of symbolic fractions as well as non-symbolic proportions. Here, we systematically investigated the cognitive processing of symbolic (e.g., fractions and decimals) and non-symbolic proportions (e.g., dot patterns and pie charts) in a two-stage procedure. First, we investigated relative magnitude-related activations of proportion processing. Second, we evaluated whether symbolic and non-symbolic proportions share common neural substrates. We conducted an fMRI study using magnitude comparison tasks with symbolic and non-symbolic proportions, respectively. As an indicator of magnitude-related processing of proportions, the distance effect was evaluated. A conjunction analysis indicated joint activation of specific occipito-parietal areas, including the right intraparietal sulcus (IPS), during proportion magnitude processing. More specifically, the results indicate that the IPS, which is commonly associated with absolute magnitude processing, is involved in processing relative magnitude information as well, irrespective of symbolic or non-symbolic presentation format. However, we also found distinct activation patterns for the magnitude processing of the different presentation formats. Our findings suggest that processing of the separate presentation formats is associated not only with magnitude manipulations in the IPS, but also with increasing demands on executive functions and strategy use associated with frontal brain regions, as well as visual attention and encoding in occipital regions. Thus, the magnitude processing of proportions may not exclusively reflect processing of number magnitude information but rather also domain-general processes.

  4. [Alcohol-purification technology and its particle sedimentation process in manufactory of Fufang Kushen injection].

    PubMed

    Liu, Xiaoqian; Tong, Yan; Wang, Jinyu; Wang, Ruizhen; Zhang, Yanxia; Wang, Zhimin

    2011-11-01

    Fufang Kushen injection was selected as the model drug in order to optimize its alcohol-purification process, characterize its particle sedimentation behavior, and investigate the feasibility of applying process analytical technology (PAT) to traditional Chinese medicine (TCM) manufacturing. Total alkaloids (calculated as matrine, oxymatrine, sophoridine and oxysophoridine) and macrozamin were selected as quality evaluation markers for optimizing the alcohol purification of Fufang Kushen injection. Parameters of the particles formed during alcohol purification, such as their number, density and sedimentation velocity, were also determined to define the sedimentation time and to better understand the process. The optimized purification process adds alcohol to the concentrated extract solution (drug material) in two stages, each followed by a settling period: deposition in 60% alcohol for 36 hours, filtration, and then deposition in 80%-90% alcohol for 6 hours. The content of total alkaloids decreased only slightly during deposition. The average settling times of particles with diameters of 10 and 25 μm were 157.7 and 25.2 h in the first alcohol-purification step, and 84.2 and 13.5 h in the second, respectively. The optimized alcohol-purification process retains the marker components better than the initial process while saving time and cost. The manufacturing quality of a TCM injection can thus be controlled through the process itself, and a PAT scheme must be designed on the basis of a thorough understanding of the TCM production process.
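    The settling times reported above scale strongly with particle diameter. A minimal sketch of how such settling times could be estimated from Stokes' law; the particle density, fluid density and viscosity below are illustrative assumptions, not values from the study:

    ```python
    # Stokes' law terminal settling velocity: v = d^2 (rho_p - rho_f) g / (18 mu).
    # All material properties here are hypothetical placeholders.

    G = 9.81  # m/s^2, gravitational acceleration

    def stokes_velocity(d_m, rho_p, rho_f, mu):
        """Terminal settling velocity (m/s) for a particle of diameter d_m (m)."""
        return d_m ** 2 * (rho_p - rho_f) * G / (18.0 * mu)

    def settling_time_h(depth_m, d_um, rho_p=1400.0, rho_f=900.0, mu=2.0e-3):
        """Hours for a particle of diameter d_um (micrometres) to settle depth_m (m)."""
        v = stokes_velocity(d_um * 1e-6, rho_p, rho_f, mu)
        return depth_m / v / 3600.0

    # Settling time scales as 1/d^2, so 25 um particles settle (25/10)^2 = 6.25x
    # faster than 10 um particles, consistent with the large gap reported above.
    t10 = settling_time_h(1.0, 10.0)
    t25 = settling_time_h(1.0, 25.0)
    print(round(t10 / t25, 2))  # 6.25
    ```

    The 1/d² scaling, not the absolute numbers, is the transferable point here; the real vehicle is a multicomponent alcohol-water extract whose density and viscosity would have to be measured.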

  5. Application of volume-retarded osmosis and low-pressure membrane hybrid process for water reclamation.

    PubMed

    Im, Sung-Ju; Choi, Jungwon; Lee, Jung-Gil; Jeong, Sanghyun; Jang, Am

    2018-03-01

    A new concept of a volume-retarded osmosis and low-pressure membrane (VRO-LPM) hybrid process was developed and evaluated for the first time in this study. Commercially available forward osmosis (FO) and ultrafiltration (UF) membranes were employed in the VRO-LPM hybrid process to overcome the energy limitations of draw solution (DS) regeneration and permeate production in the FO process. To evaluate its feasibility as a water reclamation process and to optimize the operational conditions, cross-flow FO and dead-end UF processes were evaluated individually. For the FO process, a DS concentration of 0.15 g mL⁻¹ of polysulfonate styrene (PSS) was determined to be optimal, giving a high flux with a low reverse salt flux. A UF membrane with a molecular weight cut-off of 1 kDa was chosen for its high PSS rejection in the LPM process. As a single process, UF (LPM) exhibited a higher flux than FO, but this could be balanced by adjusting the effective membrane areas of the FO and UF membranes in the VRO-LPM system. The VRO-LPM hybrid process requires only a circulation pump for the FO process. This reduces the specific energy consumption of the VRO-LPM process for potable water production to a level similar to that of the single FO process. Therefore, the newly developed VRO-LPM hybrid process, with an appropriate DS selection, can be used as an energy-efficient water production method and can outperform conventional water reclamation processes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Quality control process improvement of flexible printed circuit board by FMEA

    NASA Astrophysics Data System (ADS)

    Krasaephol, Siwaporn; Chutima, Parames

    2018-02-01

    This research focuses on quality control process improvement for Flexible Printed Circuit Board (FPCB) production, centred on model 7-Flex, using the Failure Mode and Effect Analysis (FMEA) method to decrease the proportion of defective finished goods found at the final inspection process. Because defective units are caught only at final inspection, scrap may escape to customers. The problem stems from a quality control process that is not efficient enough to filter defective products in-process, because there is no In-Process Quality Control (IPQC) or sampling inspection in the process. Therefore, the quality control process has to be improved by setting inspection gates and IPQCs at critical processes in order to filter out defective products. The critical processes are identified by the FMEA method. IPQC is used for detecting defective products and reducing the chance that defective finished goods escape to customers. Reducing the proportion of defective finished goods also decreases scrap cost, because finished goods incur higher scrap cost than work in-process. Moreover, defective products found during the process can reveal abnormal process conditions, so engineers and operators can solve problems in a timely manner. The improved quality control was implemented on 7-Flex production lines from July 2017 to September 2017. The results show decreases in the average proportion of defective finished goods and in the average Customer Manufacturers Lot Reject Rate (%LRR of CMs) of 4.5% and 4.1%, respectively. Furthermore, the cost saving of this quality control process equals 100K Baht.
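    The FMEA method referenced above prioritizes failure modes by a Risk Priority Number, RPN = severity × occurrence × detection, each rated on a 1-10 scale. A minimal sketch with hypothetical FPCB failure modes and ratings (not the study's data):

    ```python
    # FMEA ranking: compute RPN = severity x occurrence x detection for each
    # failure mode and sort descending, so the highest-RPN process steps are
    # the candidates for an in-process quality control (IPQC) gate.
    # The failure modes and ratings below are illustrative only.

    def rpn(severity, occurrence, detection):
        for v in (severity, occurrence, detection):
            assert 1 <= v <= 10, "FMEA ratings use a 1-10 scale"
        return severity * occurrence * detection

    failure_modes = [
        # (name, severity, occurrence, detection)
        ("misaligned lamination", 8, 5, 6),
        ("copper over-etch",      7, 4, 3),
        ("coverlay bubble",       6, 6, 7),
    ]

    ranked = sorted(failure_modes, key=lambda m: rpn(*m[1:]), reverse=True)
    for name, s, o, d in ranked:
        print(f"{name}: RPN={rpn(s, o, d)}")
    ```

    Note that a high detection rating means the defect is *hard* to detect, which is why a moderate-severity but hard-to-detect mode can outrank a more severe one.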

  7. Formulating poultry processing sanitizers from alkaline salts of fatty acids

    USDA-ARS?s Scientific Manuscript database

    Though some poultry processing operations remove microorganisms from carcasses, other processing operations cause cross-contamination that spreads microorganisms between carcasses, processing water, and processing equipment. One method used by commercial poultry processors to reduce microbial contam...

  8. Fabrication Process for Cantilever Beam Micromechanical Switches

    DTIC Science & Technology

    1993-08-01

    [Extraction residue from the report's front matter; recoverable headings: Beam Design; Chemistry and Materials Used in Cantilever Beam Process; Photomask Levels and Composite. Recoverable abstract text: the beam fabrication process incorporates four different photomasking levels with 62 processing ...]

  9. Reports of planetary geology program, 1983

    NASA Technical Reports Server (NTRS)

    Holt, H. E. (Compiler)

    1984-01-01

    Several areas of the Planetary Geology Program were addressed including outer solar system satellites, asteroids, comets, Venus, cratering processes and landform development, volcanic processes, aeolian processes, fluvial processes, periglacial and permafrost processes, geomorphology, remote sensing, tectonics and stratigraphy, and mapping.

  10. Cognitive Processes in Discourse Comprehension: Passive Processes, Reader-Initiated Processes, and Evolving Mental Representations

    ERIC Educational Resources Information Center

    van den Broek, Paul; Helder, Anne

    2017-01-01

    As readers move through a text, they engage in various types of processes that, if all goes well, result in a mental representation that captures their interpretation of the text. With each new text segment the reader engages in passive and, at times, reader-initiated processes. These processes are strongly influenced by the readers'…

  11. The Use of Knowledge Based Decision Support Systems in Reengineering Selected Processes in the U. S. Marine Corps

    DTIC Science & Technology

    2001-09-01

    [Fragmentary abstract with duplicated extraction; recoverable content: as businesses adopt technology in hopes of achieving a measurable benefit in terms of process efficiency and effectiveness, business process reengineering (BPR) is becoming increasingly important. A table-of-contents fragment reads: "How Might the Military Benefit from Process Reengineering Efforts".]

  12. 30 CFR 206.181 - How do I establish processing costs for dual accounting purposes when I do not process the gas?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... accounting purposes when I do not process the gas? 206.181 Section 206.181 Mineral Resources MINERALS... Processing Allowances § 206.181 How do I establish processing costs for dual accounting purposes when I do not process the gas? Where accounting for comparison (dual accounting) is required for gas production...

  13. Conceptual models of information processing

    NASA Technical Reports Server (NTRS)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  14. Industrial application of semantic process mining

    NASA Astrophysics Data System (ADS)

    Espen Ingvaldsen, Jon; Atle Gulla, Jon

    2012-05-01

    Process mining relates to the extraction of non-trivial and useful information from information system event logs. It is a new research discipline that has evolved significantly since the early work on idealistic process logs. Over the last years, process mining prototypes have incorporated elements from semantics and data mining and targeted visualisation techniques that are more user-friendly to business experts and process owners. In this article, we present a framework for evaluating different aspects of enterprise process flows and address practical challenges of state-of-the-art industrial process mining. We also explore the inherent strengths of the technology for more efficient process optimisation.
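    A core process-mining primitive behind the tools described above is extracting a directly-follows graph from an event log of (case id, activity, timestamp) records. A minimal sketch with an illustrative log (not tied to any product named in the article):

    ```python
    # Build a directly-follows graph (DFG) from an event log: for each case,
    # order events by timestamp and count how often activity b directly
    # follows activity a across all cases.
    from collections import Counter

    # Illustrative event log: (case_id, activity, timestamp)
    log = [
        ("c1", "register", 1), ("c1", "review", 2), ("c1", "approve", 3),
        ("c2", "register", 1), ("c2", "review", 2), ("c2", "reject", 3),
        ("c3", "register", 1), ("c3", "review", 2), ("c3", "approve", 3),
    ]

    def directly_follows(events):
        by_case = {}
        for case, act, ts in sorted(events, key=lambda e: (e[0], e[2])):
            by_case.setdefault(case, []).append(act)
        dfg = Counter()
        for trace in by_case.values():
            for a, b in zip(trace, trace[1:]):
                dfg[(a, b)] += 1
        return dfg

    dfg = directly_follows(log)
    print(dfg[("register", "review")])  # 3: review directly follows register in all cases
    ```

    Real process-mining suites layer filtering, frequency thresholds, and semantic annotation on top of exactly this kind of counted graph.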

  15. Reliability and performance of a system-on-a-chip by predictive wear-out based activation of functional components

    DOEpatents

    Cher, Chen-Yong; Coteus, Paul W; Gara, Alan; Kursun, Eren; Paulsen, David P; Schuelke, Brian A; Sheets, II, John E; Tian, Shurong

    2013-10-01

    A processor-implemented method for determining aging of a processing unit in a processor, the method comprising: calculating an effective aging profile for the processing unit, wherein the effective aging profile quantifies the effects of aging on the processing unit; combining the effective aging profile with process variation data, actual workload data and operating conditions data for the processing unit; and determining aging through an aging sensor of the processing unit using the effective aging profile, the process variation data, the actual workload data, architectural characteristics and redundancy data, and the operating conditions data for the processing unit.

  16. Fuzzy control of burnout of multilayer ceramic actuators

    NASA Astrophysics Data System (ADS)

    Ling, Alice V.; Voss, David; Christodoulou, Leo

    1996-08-01

    To improve the yield and repeatability of the burnout process of multilayer ceramic actuators (MCAs), an intelligent processing of materials (IPM) based control system has been developed for the manufacture of MCAs. IPM involves the active (ultimately adaptive) control of a material process using empirical or analytical models and in situ sensing of critical process states (part features and process parameters) to modify the processing conditions in real time to achieve predefined product goals. Thus, the three enabling technologies for the IPM burnout control system are process modeling, in situ sensing and intelligent control. This paper presents the design of an IPM-based control strategy for the burnout process of MCAs.

  17. Direct access inter-process shared memory

    DOEpatents

    Brightwell, Ronald B; Pedretti, Kevin; Hudson, Trammell B

    2013-10-22

    A technique for directly sharing physical memory between processes executing on processor cores is described. The technique includes loading a plurality of processes into the physical memory for execution on a corresponding plurality of processor cores sharing the physical memory. An address space is mapped to each of the processes by populating a first entry in a top level virtual address table for each of the processes. The address space of each of the processes is cross-mapped into each of the processes by populating one or more subsequent entries of the top level virtual address table with the first entry in the top level virtual address table from other processes.
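    The patented technique operates at the page-table level, cross-mapping each process's address space into the others. As a user-space analogy only (not the patented kernel mechanism), the effect of two processes reading the same physical memory can be illustrated with Python's standard `multiprocessing.shared_memory` module:

    ```python
    # One side creates a named shared-memory segment and writes into it;
    # any other process can attach to the same segment by name and read the
    # bytes directly, with no copy through a pipe or socket.
    from multiprocessing import shared_memory

    producer = shared_memory.SharedMemory(create=True, size=16)
    try:
        producer.buf[:5] = b"hello"

        # A second process would attach by passing producer.name; we attach
        # within the same process here purely for brevity.
        consumer = shared_memory.SharedMemory(name=producer.name)
        data = bytes(consumer.buf[:5])
        print(data)  # b'hello'
        consumer.close()
    finally:
        producer.close()
        producer.unlink()  # release the segment once all users are done
    ```

    The patent's contribution is doing this without named segments at all, by populating top-level virtual address table entries so each core's process sees the others' memory in its own address space.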

  18. Biotechnology in Food Production and Processing

    NASA Astrophysics Data System (ADS)

    Knorr, Dietrich; Sinskey, Anthony J.

    1985-09-01

    The food processing industry is the oldest and largest industry using biotechnological processes. Further development of food products and processes based on biotechnology depends upon the improvement of existing processes, such as fermentation, immobilized biocatalyst technology, and production of additives and processing aids, as well as the development of new opportunities for food biotechnology. Improvements are needed in the characterization, safety, and quality control of food materials, in processing methods, in waste conversion and utilization processes, and in currently used food microorganism and tissue culture systems. Also needed are fundamental studies of the structure-function relationship of food materials and of the cell physiology and biochemistry of raw materials.

  19. What is a good public participation process? Five perspectives from the public.

    PubMed

    Webler, T; Tuler, S; Krueger, R

    2001-03-01

    It is now widely accepted that members of the public should be involved in environmental decision-making. This has inspired many to search for principles that characterize good public participation processes. In this paper we report on a study that identifies discourses about what defines a good process. Our case study was a forest planning process in northern New England and New York. We employed Q methodology to learn how participants characterize a good process differently, by selecting, defining, and privileging different principles. Five discourses, or perspectives, about good process emerged from our study. One perspective emphasizes that a good process acquires and maintains popular legitimacy. A second sees a good process as one that facilitates an ideological discussion. A third focuses on the fairness of the process. A fourth perspective conceptualizes participatory processes as a power struggle, in this instance a power play between local land-owning interests and outsiders. A fifth perspective highlights the need for leadership and compromise. Dramatic differences among these views suggest an important challenge for those responsible for designing and carrying out public participation processes. Conflicts may emerge about process designs because people disagree about what is good in specific contexts.

  20. Alternating event processes during lifetimes: population dynamics and statistical inference.

    PubMed

    Shinohara, Russell T; Sun, Yifei; Wang, Mei-Cheng

    2018-01-01

    In the literature studying recurrent event data, a large amount of work has focused on univariate recurrent event processes where the occurrence of each event is treated as a single point in time. There are many applications, however, in which univariate recurrent events are insufficient to characterize the features of the process because patients experience nontrivial durations associated with each event. This results in an alternating event process where the disease status of a patient alternates between exacerbations and remissions. In this paper, we consider the dynamics of a chronic disease and its associated exacerbation-remission process over two time scales: calendar time and time-since-onset. In particular, over calendar time, we explore population dynamics and the relationship between incidence, prevalence and duration for such alternating event processes. We provide nonparametric estimation techniques for characteristic quantities of the process. In some settings, exacerbation processes are observed from an onset time until death; to account for the relationship between the survival and alternating event processes, nonparametric approaches are developed for estimating the exacerbation process over the lifetime. By understanding the population dynamics and within-process structure, the paper provides a new and general way to study alternating event processes.
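    The incidence-prevalence-duration relationship mentioned above has a simple steady-state form for an alternating process: the point prevalence of the exacerbation state approximates the mean exacerbation duration divided by the mean cycle length. A small simulation sketch, assuming hypothetical exponentially distributed durations:

    ```python
    # Simulate a long two-state alternating process (exacerbation <-> remission)
    # and check that the long-run fraction of time spent in exacerbation
    # approaches E[exacerbation] / (E[exacerbation] + E[remission]).
    # The mean durations below are illustrative, not estimates from patient data.
    import random

    random.seed(0)
    mean_ex, mean_rem = 2.0, 8.0  # hypothetical mean durations (e.g., weeks)

    total_time = 0.0
    time_in_exacerbation = 0.0
    for _ in range(200_000):  # many alternating cycles
        ex = random.expovariate(1.0 / mean_ex)
        rem = random.expovariate(1.0 / mean_rem)
        time_in_exacerbation += ex
        total_time += ex + rem

    prevalence = time_in_exacerbation / total_time
    print(round(prevalence, 3))  # close to 2 / (2 + 8) = 0.2
    ```

    The paper's nonparametric estimators are aimed at exactly the settings where this steady-state shortcut fails, e.g. when durations depend on time-since-onset or observation stops at death.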

  1. Process mining in oncology using the MIMIC-III dataset

    NASA Astrophysics Data System (ADS)

    Prima Kurniati, Angelina; Hall, Geoff; Hogg, David; Johnson, Owen

    2018-03-01

    Process mining is a data analytics approach to discover and analyse process models based on the real activities captured in information systems. There is a growing body of literature on process mining in healthcare, including oncology, the study of cancer. In earlier work we found 37 peer-reviewed papers describing process mining research in oncology, with a regular complaint being the limited availability and accessibility of datasets with suitable information for process mining. Publicly available datasets are one option, and this paper describes the potential to use MIMIC-III for process mining in oncology. MIMIC-III is a large open-access dataset of de-identified patient records. There are 134 publications listed as using the MIMIC dataset, but none of them has used process mining. The MIMIC-III dataset has 16 event tables which are potentially useful for process mining, and this paper demonstrates the opportunities to use MIMIC-III for process mining in oncology. Our research applied the L* lifecycle method to provide a worked example showing how process mining can be used to analyse cancer pathways. The results and data quality limitations are discussed, along with opportunities for further work and reflection on the value of MIMIC-III for reproducible process mining research.

  2. Research on the technique of large-aperture off-axis parabolic surface processing using tri-station machine and its applicability.

    PubMed

    Zhang, Xin; Luo, Xiao; Hu, Haixiang; Zhang, Xuejun

    2015-09-01

    In order to process large-aperture aspherical mirrors, we designed and constructed a tri-station machine processing center whose three stations together provide vectored feed motion on up to 10 axes. Based on this processing center, an aspherical mirror-processing model is proposed in which each station implements traversal processing of large-aperture aspherical mirrors using only two axes while the stations remain switchable, thus lowering cost and enhancing processing efficiency. The applicability of the tri-station machine is also analyzed. In addition, a simple and efficient zero-calibration method for processing is proposed. To validate the processing model, we processed an off-axis parabolic SiC mirror with an aperture diameter of 1450 mm using our processing center. The experimental results indicate that, with a one-step iterative process, the peak-to-valley (PV) and root-mean-square (RMS) errors of the mirror converged from 3.441 and 0.5203 μm to 2.637 and 0.2962 μm, respectively; the RMS was reduced by 43%. The validity and high accuracy of the model are thereby demonstrated.
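    PV and RMS are the two figures of merit quoted above for the mirror surface. A short sketch of how both are computed from a sampled error map, together with the quoted RMS reduction; the sample error map is synthetic:

    ```python
    # Peak-to-valley (PV) is the spread between the highest peak and deepest
    # valley of the surface error; RMS is the root-mean-square error.
    import math

    def pv(errors):
        """Peak-to-valley of a sampled surface error map."""
        return max(errors) - min(errors)

    def rms(errors):
        """Root-mean-square of a sampled surface error map."""
        return math.sqrt(sum(e * e for e in errors) / len(errors))

    surface = [-1.0, 0.4, 0.0, 1.2, -0.6]  # synthetic error samples, um
    print(round(pv(surface), 2))           # 2.2

    # Percentage RMS reduction quoted in the abstract (0.5203 -> 0.2962 um):
    reduction = (0.5203 - 0.2962) / 0.5203
    print(f"{reduction:.0%}")              # 43%
    ```

    In practice the error map comes from interferometry over a 2-D grid, but the PV and RMS definitions are the same.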

  3. Patterning of Indium Tin Oxide Films

    NASA Technical Reports Server (NTRS)

    Immer, Christopher

    2008-01-01

    A relatively rapid, economical process has been devised for patterning a thin film of indium tin oxide (ITO) that has been deposited on a polyester film. ITO is a transparent, electrically conductive substance made from a mixture of indium oxide and tin oxide that is commonly used in touch panels, liquid-crystal and plasma display devices, gas sensors, and solar photovoltaic panels. In a typical application, the ITO film must be patterned to form electrodes, current collectors, and the like. Heretofore it has been common practice to pattern an ITO film by means of either a laser ablation process or a photolithography/etching process. The laser ablation process includes the use of expensive equipment to precisely position and focus a laser. The photolithography/etching process is time-consuming. The present process is a variant of the direct toner process, an inexpensive but often highly effective process for patterning conductors for printed circuits. Relative to a conventional photolithography/etching process, this process is simpler, takes less time, and is less expensive. This process involves equipment that costs less than $500 (at 2005 prices) and enables patterning of an ITO film in a process time of less than about a half hour.

  4. Assessment of Process Capability: the case of Soft Drinks Processing Unit

    NASA Astrophysics Data System (ADS)

    Sri Yogi, Kottala

    2018-03-01

    Process capability studies play a significant role in investigating process variation, which is important for achieving product quality characteristics. Capability indices measure the inherent variability of a process and thus help improve process performance. The main objective of this paper is to understand whether the process of a soft drinks processing unit, one of the premier brands marketed in India, produces within specification. A few selected critical parameters in soft drinks processing were considered for this study: gas volume concentration, brix concentration, and crock torque. Relevant statistical parameters were assessed from a process capability indices perspective: short-term capability and long-term capability. The assessment used real-time data from a soft drinks bottling company located in the state of Chhattisgarh, India. The analysis suggested reasons for variations in the process, which were validated using ANOVA; a Taguchi loss function was also fitted, and the predicted waste was assessed in monetary terms, which the organization can use for improving process parameters. This research work has substantially benefitted the organization in understanding the variation of the selected critical parameters toward achieving zero rejection.
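    Short-term capability is conventionally summarized by the indices Cp = (USL − LSL)/6σ and Cpk = min(USL − mean, mean − LSL)/3σ. A minimal sketch with hypothetical specification limits and measurements (not the plant's data):

    ```python
    # Process capability indices for one quality characteristic, e.g. gas
    # volume concentration. Spec limits and the sample are illustrative.
    import statistics

    usl, lsl = 4.2, 3.8  # hypothetical upper/lower specification limits
    sample = [3.98, 4.02, 4.01, 3.97, 4.03, 4.00, 3.99, 4.02, 3.98, 4.00]

    mean = statistics.mean(sample)
    sigma = statistics.stdev(sample)  # sample standard deviation

    cp = (usl - lsl) / (6 * sigma)                     # potential capability
    cpk = min(usl - mean, mean - lsl) / (3 * sigma)    # actual, accounts for centering
    print(round(cp, 2), round(cpk, 2))
    ```

    Cpk can never exceed Cp; the gap between them measures how far the process is off-center within its specification window.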

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dafler, J.R.; Sinnott, J.; Novil, M.

    The first phase of a study to identify candidate processes and products suitable for future exploitation using high-temperature solar energy is presented. This phase has been principally analytical, consisting of techno-economic studies, thermodynamic assessments of chemical reactions and processes, and the determination of market potentials for major chemical commodities that use significant amounts of fossil resources today. The objective was to identify energy-intensive processes that would be suitable for the production of chemicals and fuels using solar energy process heat. Of particular importance was the comparison of relative costs and energy requirements for the selected solar product versus costs for the product derived from conventional processing. The assessment methodology used a systems analytical approach to identify processes and products having the greatest potential for solar energy-thermal processing. This approach was used to establish the basis for work to be carried out in subsequent phases of development. It has been the intent of the program to divide the analysis and process identification into the following three distinct areas: (1) process selection, (2) process evaluation, and (3) ranking of processes. Four conventional processes were selected for assessment, namely, methanol synthesis, styrene monomer production, vinyl chloride monomer production, and terephthalic acid production.

  6. An Application of X-Ray Fluorescence as Process Analytical Technology (PAT) to Monitor Particle Coating Processes.

    PubMed

    Nakano, Yoshio; Katakuse, Yoshimitsu; Azechi, Yasutaka

    2018-06-01

    An attempt was made to apply X-Ray Fluorescence (XRF) analysis to evaluate a small-particle coating process as a Process Analytical Technology (PAT). The XRF analysis was used to monitor the coating level in a small-particle coating process in an at-line manner. A small-particle coating process usually consists of multiple coating steps. This study used simple coated particles prepared by a first coating of a model compound (DL-methionine) and a second coating of talc on spherical microcrystalline cellulose cores; particles with these two coating layers are sufficient to represent the small-particle coating process. The results showed that the XRF signals for the first coating (layering) and the second coating (mask coating) reflected the extent of coating through different mechanisms. Furthermore, coating of particles of different sizes was investigated to evaluate the size effect in these coating processes. From these results, it was concluded that XRF can be used as a PAT tool for monitoring particle coating processes and can become a powerful tool in pharmaceutical manufacturing.

  7. Single-Run Single-Mask Inductively-Coupled-Plasma Reactive-Ion-Etching Process for Fabricating Suspended High-Aspect-Ratio Microstructures

    NASA Astrophysics Data System (ADS)

    Yang, Yao-Joe; Kuo, Wen-Cheng; Fan, Kuang-Chao

    2006-01-01

    In this work, we present a single-run single-mask (SRM) process for fabricating suspended high-aspect-ratio structures on standard silicon wafers using an inductively coupled plasma-reactive ion etching (ICP-RIE) etcher. This process eliminates extra fabrication steps which are required for structure release after trench etching. Released microstructures with 120 μm thickness are obtained by this process. The corresponding maximum aspect ratio of the trench is 28. The SRM process is an extended version of the standard process proposed by BOSCH GmbH (BOSCH process). The first step of the SRM process is a standard BOSCH process for trench etching, then a polymer layer is deposited on trench sidewalls as a protective layer for the subsequent structure-releasing step. The structure is released by dry isotropic etching after the polymer layer on the trench floor is removed. All the steps can be integrated into a single-run ICP process. Also, only one mask is required. Therefore, the process complexity and fabrication cost can be effectively reduced. Discussions on each SRM step and considerations for avoiding undesired etching of the silicon structures during the release process are also presented.

  8. Auditory-musical processing in autism spectrum disorders: a review of behavioral and brain imaging studies.

    PubMed

    Ouimet, Tia; Foster, Nicholas E V; Tryfon, Ana; Hyde, Krista L

    2012-04-01

    Autism spectrum disorder (ASD) is a complex neurodevelopmental condition characterized by atypical social and communication skills, repetitive behaviors, and atypical visual and auditory perception. Studies in vision have reported enhanced detailed ("local") processing but diminished holistic ("global") processing of visual features in ASD. Individuals with ASD also show enhanced processing of simple visual stimuli but diminished processing of complex visual stimuli. Relative to the visual domain, auditory global-local distinctions, and the effects of stimulus complexity on auditory processing in ASD, are less clear. However, one remarkable finding is that many individuals with ASD have enhanced musical abilities, such as superior pitch processing. This review provides a critical evaluation of behavioral and brain imaging studies of auditory processing with respect to current theories in ASD. We have focused on auditory-musical processing in terms of global versus local processing and simple versus complex sound processing. This review contributes to a better understanding of auditory processing differences in ASD. A deeper comprehension of sensory perception in ASD is key to better defining ASD phenotypes and, in turn, may lead to better interventions. © 2012 New York Academy of Sciences.

  9. Discrete State Change Model of Manufacturing Quality to Aid Assembly Process Design

    NASA Astrophysics Data System (ADS)

    Koga, Tsuyoshi; Aoyama, Kazuhiro

    This paper proposes a representation model of the quality state change in an assembly process that can be used in a computer-aided process design system. In order to formalize the state change of manufacturing quality in the assembly process, the functions, operations, and quality changes in the assembly process are represented as a network model that can simulate discrete events. This paper also develops a design method for the assembly process. The design method calculates the space of quality state changes and outputs a better assembly process (better operations and better sequences) that can be used to obtain the intended quality state of the final product. A computational redesign algorithm for the assembly process that considers manufacturing quality is developed. The proposed method can be used to design an improved manufacturing process by simulating the quality state change. A prototype system for planning an assembly process is implemented and applied to the design of an auto-breaker assembly process. The result of the design example indicates that the proposed assembly process planning method outputs a better manufacturing scenario based on the simulation of the quality state change.

  10. Effect of simulated mechanical recycling processes on the structure and properties of poly(lactic acid).

    PubMed

    Beltrán, F R; Lorenzo, V; Acosta, J; de la Orden, M U; Martínez Urreaga, J

    2018-06-15

    The aim of this work is to study the effects of different simulated mechanical recycling processes on the structure and properties of PLA. A commercial grade of PLA was melt compounded and compression molded, then subjected to two different recycling processes. The first recycling process consisted of an accelerated ageing and a second melt processing step, while the other recycling process included an accelerated ageing, a demanding washing process and a second melt processing step. The intrinsic viscosity measurements indicate that both recycling processes produce degradation in PLA, which is more pronounced in the sample subjected to the washing process. DSC results suggest an increase in the mobility of the polymer chains in the recycled materials; however, the degree of crystallinity of PLA seems unchanged. The optical, mechanical and gas barrier properties of PLA do not seem to be largely affected by the degradation suffered during the different recycling processes. These results suggest that, despite the degradation of PLA, the impact of the different simulated mechanical recycling processes on the final properties is limited. Thus, the potential use of recycled PLA in packaging applications is not jeopardized. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Consumption of ultra-processed foods predicts diet quality in Canada.

    PubMed

    Moubarac, Jean-Claude; Batal, M; Louzada, M L; Martinez Steele, E; Monteiro, C A

    2017-01-01

    This study describes food consumption patterns in Canada according to the types of food processing using the Nova classification and investigates the association between consumption of ultra-processed foods and the nutrient profile of the diet. Dietary intakes of 33,694 individuals from the 2004 Canadian Community Health Survey aged 2 years and above were analyzed. Food and drinks were classified using Nova into unprocessed or minimally processed foods, processed culinary ingredients, processed foods and ultra-processed foods. Average consumption (total daily energy intake) and relative consumption (% of total energy intake) provided by each of the food groups were calculated. Consumption of ultra-processed foods according to sex, age, education, residential location and relative family revenue was assessed. Mean nutrient content of ultra-processed foods and non-ultra-processed foods were compared, and the average nutrient content of the overall diet across quintiles of dietary share of ultra-processed foods was measured. In 2004, 48% of calories consumed by Canadians came from ultra-processed foods. Consumption of such foods was high amongst all socioeconomic groups, and particularly in children and adolescents. As a group, ultra-processed foods were grossly nutritionally inferior to non-ultra-processed foods. After adjusting for covariates, a significant and positive relationship was found between the dietary share of ultra-processed foods and the content in carbohydrates, free sugars, total and saturated fats and energy density, while an inverse relationship was observed with the dietary content in protein, fiber, vitamins A, C, D, B6 and B12, niacin, thiamine, riboflavin, as well as zinc, iron, magnesium, calcium, phosphorus and potassium. Lowering the dietary share of ultra-processed foods and raising consumption of hand-made meals from unprocessed or minimally processed foods would substantially improve the diet quality of Canadian. 
Copyright © 2016 Elsevier Ltd. All rights reserved.
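    The relative-consumption measure used in the study (the share of total energy supplied by each Nova group) is straightforward to compute. A minimal sketch follows; the food items, Nova assignments, and kcal values are hypothetical, invented for illustration, and are not data from the study.

    ```python
    # Nova groups: 1 = unprocessed/minimally processed, 2 = processed culinary
    # ingredients, 3 = processed foods, 4 = ultra-processed foods.
    intake = [
        {"food": "apple", "nova": 1, "kcal": 95},
        {"food": "olive oil", "nova": 2, "kcal": 120},
        {"food": "cheese", "nova": 3, "kcal": 110},
        {"food": "soft drink", "nova": 4, "kcal": 140},
        {"food": "packaged snack", "nova": 4, "kcal": 250},
    ]

    def ultraprocessed_share(items):
        """Percentage of total energy supplied by Nova group 4 (ultra-processed)."""
        total = sum(i["kcal"] for i in items)
        upf = sum(i["kcal"] for i in items if i["nova"] == 4)
        return 100.0 * upf / total

    print(round(ultraprocessed_share(intake), 1))  # → 54.5
    ```

    The same per-person share, computed over a survey sample, is what gets binned into quintiles for the nutrient-profile comparison.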

  12. Development of Statistical Process Control Methodology for an Environmentally Compliant Surface Cleaning Process in a Bonding Laboratory

    NASA Technical Reports Server (NTRS)

    Hutchens, Dale E.; Doan, Patrick A.; Boothe, Richard E.

    1997-01-01

    Bonding labs at both MSFC and the northern Utah production plant prepare bond test specimens which simulate or witness the production of NASA's Reusable Solid Rocket Motor (RSRM). The current process for preparing the bonding surfaces employs 1,1,1-trichloroethane vapor degreasing, which simulates the current RSRM process. Government regulations (e.g., the 1990 Amendments to the Clean Air Act) have mandated a production phase-out of a number of ozone-depleting compounds (ODC), including 1,1,1-trichloroethane. In order to comply with these regulations, the RSRM Program is qualifying a spray-in-air (SIA) precision cleaning process using Brulin 1990, an aqueous blend of surfactants. Accordingly, surface preparation prior to bonding process simulation test specimens must reflect the new production cleaning process. The Bonding Lab Statistical Process Control (SPC) program monitors the progress of the lab and its capabilities, as well as certifies the bonding technicians, by periodically preparing D6AC steel tensile adhesion panels with EA-913NA epoxy adhesive using a standardized process. SPC methods are then used to ensure the process is statistically in control, thus producing reliable data for bonding studies, and to identify any problems which might develop. Since the specimen cleaning process is being changed, new SPC limits must be established. This report summarizes side-by-side testing of D6AC steel tensile adhesion witness panels and tapered double cantilevered beams (TDCBs) using both the current baseline vapor degreasing process and a lab-scale spray-in-air process. A 26-inch Proceco Typhoon washer cleaned both tensile adhesion witness panels and TDCBs in a process which simulates the new production process. The tests were performed six times during 1995; subsequent statistical analysis of the data established new upper control limits (UCL) and lower control limits (LCL). 
The data also demonstrated that the new process was equivalent to the vapor degreasing process.
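    As a rough illustration of how UCL and LCL are set from repeated witness-panel runs, here is a minimal Shewhart-style calculation. The strength values are hypothetical, and production SPC charts typically estimate sigma from moving ranges or rational subgroups rather than the plain sample standard deviation used here.

    ```python
    from statistics import mean, stdev

    # Hypothetical tensile adhesion strengths (psi) from six witness-panel runs.
    strengths = [812, 798, 805, 820, 801, 809]

    xbar = mean(strengths)
    s = stdev(strengths)  # sample standard deviation

    # Shewhart-style limits at +/- 3 standard deviations about the mean.
    ucl = xbar + 3 * s
    lcl = xbar - 3 * s
    in_control = all(lcl <= x <= ucl for x in strengths)
    print(round(ucl, 1), round(lcl, 1), in_control)
    ```

    Runs falling outside the limits would signal a process shift worth investigating before the data are used for bonding studies.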

  13. Evaluation of stabilization techniques for ion implant processing

    NASA Astrophysics Data System (ADS)

    Ross, Matthew F.; Wong, Selmer S.; Minter, Jason P.; Marlowe, Trey; Narcy, Mark E.; Livesay, William R.

    1999-06-01

    With the integration of high current ion implant processing into volume CMOS manufacturing, the need for photoresist stabilization to achieve a stable ion implant process is critical. This study compares electron beam stabilization, a non-thermal process, with more traditional thermal stabilization techniques such as hot plate baking and vacuum oven processing. The electron beam processing is carried out in a flood exposure system with no active heating of the wafer. These stabilization techniques are applied to typical ion implant processes that might be found in a CMOS production process flow. The stabilization processes are applied to a 1.1-micrometer-thick PFI-38A i-line photoresist film prior to ion implant processing. Post-stabilization CD variation is detailed with respect to wall slope and feature integrity. SEM photographs detail the effects of the stabilization technique on photoresist features. The thermal stability of the photoresist is shown for different levels of stabilization and post-stabilization thermal cycling. Thermal flow stability of the photoresist is detailed via SEM photographs. A significant improvement in thermal stability is achieved with the electron beam process, such that photoresist features are stable to temperatures in excess of 200 degrees C. Ion implant processing parameters are evaluated and compared for the different stabilization methods. Ion implant system end-station chamber pressure is detailed as a function of ion implant process and stabilization condition. The ion implant process conditions are detailed for varying factors such as ion current, energy, and total dose. A reduction in the ion implant system's end-station chamber pressure is achieved with the electron beam stabilization process over the other techniques considered. This reduction in end-station chamber pressure is shown to provide a reduction in total process time for a given ion implant dose. 
Improvements in the ion implant process are detailed across several combinations of current and energy.

  14. The prevalence of medial coronoid process disease is high in lame large breed dogs and quantitative radiographic assessments contribute to the diagnosis.

    PubMed

    Mostafa, Ayman; Nolte, Ingo; Wefstaedt, Patrick

    2018-06-05

    Medial coronoid process disease is a leading cause of thoracic limb lameness in dogs. Computed tomography and arthroscopy are superior to radiography for diagnosing medial coronoid process disease; however, radiography remains the most widely available diagnostic imaging modality in veterinary practice. Objectives of this retrospective observational study were to describe the prevalence of medial coronoid process disease in lame large breed dogs and to apply a novel method for quantifying the radiographic changes associated with the medial coronoid process and subtrochlear-ulnar region in Labrador and Golden Retrievers with confirmed medial coronoid process disease. Purebred Labrador and Golden Retrievers (n = 143, 206 elbows) without and with confirmed medial coronoid process disease were included. The prevalence of medial coronoid process disease in lame large breed dogs was calculated. Mediolateral and craniocaudal radiographs of elbows were analyzed to assess medial coronoid process length and morphology, and subtrochlear-ulnar width. Mean grayscale value was calculated for radial and subtrochlear-ulnar zones. The prevalence of medial coronoid process disease was 20.8%. Labrador and Golden Retrievers were the most affected purebred dogs (29.6%). Elbows with confirmed medial coronoid process disease had short (P < 0.0001) and deformed (∼95%) medial coronoid processes, with associated medial coronoid process osteophytosis (7.5%). Subtrochlear-ulnar sclerosis was evident in ∼96% of diseased elbows, with a significant increase (P < 0.0001) in subtrochlear-ulnar width and standardized grayscale value. Radial grayscale value did not differ between groups. Periarticular osteophytosis was identified in 51.4% of elbows with medial coronoid process disease. Medial coronoid process length and morphology, and subtrochlear-ulnar width and standardized grayscale value varied significantly in dogs with confirmed medial coronoid process disease compared to controls. 
Findings indicated that medial coronoid process disease has a high prevalence in lame large breed dogs and that quantitative radiographic assessments can contribute to the diagnosis. © 2018 American College of Veterinary Radiology.

  15. The role of rational and experiential processing in influencing the framing effect.

    PubMed

    Stark, Emily; Baldwin, Austin S; Hertel, Andrew W; Rothman, Alexander J

    2017-01-01

    Research on individual differences and the framing effect has focused primarily on how variability in rational processing influences choice. However, we propose that measuring only rational processing presents an incomplete picture of how participants are responding to framed options, as orthogonal individual differences in experiential processing might be relevant. In two studies, we utilize the Rational Experiential Inventory, which captures individual differences in rational and experiential processing, to investigate how both processing types influence decisions. Our results show that differences in experiential processing, but not rational processing, moderated the effect of frame on choice. We suggest that future research should more closely examine the influence of experiential processing on making decisions, to gain a broader understanding of the conditions that contribute to the framing effect.

  16. Study and Analysis of The Robot-Operated Material Processing Systems (ROMPS)

    NASA Technical Reports Server (NTRS)

    Nguyen, Charles C.

    1996-01-01

    This report presents the progress of a research grant funded by NASA for work performed during 1 Oct. 1994 - 30 Sep. 1995. The report deals with the development and investigation of potential uses of software for data processing for the Robot Operated Material Processing System (ROMPS). It reports on the progress of data processing of calibration samples processed by ROMPS in space and on earth. First, data were retrieved using the I/O software and manually processed using Microsoft Excel. The retrieval and processing workflow was then automated using a program written in C which is able to read the telemetry data and produce plots of the time responses of sample temperatures and other desired variables. LabVIEW was also employed to automatically retrieve and process the telemetry data.
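    The automated retrieval-and-plotting step can be illustrated with a small parser. The record layout below is hypothetical, not the actual ROMPS telemetry format; the sketch simply shows extracting each sample's temperature time response and reporting its peak.

    ```python
    import csv
    import io

    # Hypothetical telemetry stream: timestamped temperatures per sample.
    telemetry = io.StringIO(
        "time_s,sample_id,temp_c\n"
        "0,1,25.0\n"
        "10,1,410.5\n"
        "20,1,395.2\n"
        "0,2,25.0\n"
        "10,2,388.7\n"
    )

    # Track the peak temperature reached by each sample.
    peaks = {}
    for row in csv.DictReader(telemetry):
        sid, temp = row["sample_id"], float(row["temp_c"])
        peaks[sid] = max(temp, peaks.get(sid, float("-inf")))

    print(peaks)
    ```

    The same per-sample series, kept in full rather than reduced to a peak, is what a plotting tool such as LabVIEW would render as a temperature time response.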

  17. Application of statistical process control and process capability analysis procedures in orbiter processing activities at the Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.

    1994-01-01

    Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis procedures into existing shuttle processing procedures can enable identification of potential problem areas and of candidates for improvements that increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.
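    Process Capability Analysis reduces to comparing the spread of a measured characteristic against its specification limits, usually via the Cp and Cpk indices. A minimal sketch follows; the measurements and spec limits are hypothetical, not TPS data.

    ```python
    from statistics import mean, stdev

    def capability(samples, lsl, usl):
        """Cp and Cpk of a measured characteristic against lower/upper spec limits."""
        mu, sigma = mean(samples), stdev(samples)
        cp = (usl - lsl) / (6 * sigma)                  # potential capability
        cpk = min(usl - mu, mu - lsl) / (3 * sigma)     # capability allowing for centering
        return cp, cpk

    # Hypothetical bond-line thickness measurements (mm) and spec limits.
    data = [1.02, 0.98, 1.01, 0.99, 1.00, 1.03, 0.97, 1.00]
    cp, cpk = capability(data, lsl=0.90, usl=1.10)
    print(round(cp, 2), round(cpk, 2))
    ```

    A common rule of thumb treats Cpk below about 1.33 as a candidate for process improvement; a Cpk well below Cp signals a centering problem rather than excess spread.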

  18. Separate cortical networks involved in music perception: preliminary functional MRI evidence for modularity of music processing.

    PubMed

    Schmithorst, Vincent J

    2005-04-01

    Music perception is a quite complex cognitive task, involving the perception and integration of various elements including melody, harmony, pitch, rhythm, and timbre. A preliminary functional MRI investigation of music perception was performed, using a simplified passive listening task. Group independent component analysis (ICA) was used to separate out various components involved in music processing, as the hemodynamic responses are not known a priori. Various components consistent with auditory processing, expressive language, syntactic processing, and visual association were found. The results are discussed in light of various hypotheses regarding modularity of music processing and its overlap with language processing. The results suggest that, while some networks overlap with ones used for language processing, music processing may involve its own domain-specific processing subsystems.

  19. Industrial implementation of spatial variability control by real-time SPC

    NASA Astrophysics Data System (ADS)

    Roule, O.; Pasqualini, F.; Borde, M.

    2016-10-01

    Advanced technology nodes require more and more information to get the wafer process well set up. The critical dimension of components decreases following Moore's law; at the same time, the intra-wafer dispersion linked to the spatial non-uniformity of tool processes has not decreased in the same proportion. APC (Advanced Process Control) systems are being developed in the waferfab to automatically adjust and tune wafer processing based on extensive process context information, and can generate and monitor complex intra-wafer process profile corrections between different process steps. This leads us to bring spatial variability under control in real time through our SPC (Statistical Process Control) system. This paper outlines the architecture of an integrated process control system for 3D shape monitoring implemented in the waferfab.

  20. Modeling and analysis of power processing systems: Feasibility investigation and formulation of a methodology

    NASA Technical Reports Server (NTRS)

    Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.

    1974-01-01

    A review is given of future power processing systems planned for the next 20 years and of the state of the art in power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.

  1. Laser displacement sensor to monitor the layup process of composite laminate production

    NASA Astrophysics Data System (ADS)

    Miesen, Nick; Groves, Roger M.; Sinke, Jos; Benedictus, Rinze

    2013-04-01

    Several types of flaw can occur during the layup process of prepreg composite laminates. Quality control after the production process checks the end product by testing the specimens for flaws introduced during the layup or curing process; however, by then these flaws are already irreversibly embedded in the laminate. This paper demonstrates the use of a laser displacement sensor technique applied during the layup process of prepreg laminates for in-situ flaw detection, for typical flaws that can occur during the composite production process. An incorrect number of layers and fibre wrinkling are dominant flaws during the layup process. These and other dominant flaws have been modeled to determine the requirements for in-situ monitoring during the layup process of prepreg laminates.
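    As a toy illustration of in-situ flaw detection from displacement data (not the authors' method): flag scan positions where the measured stack height deviates from the nominal ply thickness times the ply count. The thicknesses, tolerance, and scan values are hypothetical.

    ```python
    NOMINAL_PLY_MM = 0.2   # assumed nominal cured-ply thickness
    TOL_MM = 0.05          # assumed height tolerance before flagging

    def check_layup(profile_mm, n_plies):
        """Return scan indices whose height deviates from the expected stack-up."""
        expected = n_plies * NOMINAL_PLY_MM
        return [i for i, h in enumerate(profile_mm) if abs(h - expected) > TOL_MM]

    # Hypothetical displacement-sensor scan after laying the 4th ply;
    # the bump at index 2 would indicate a wrinkle or extra material.
    scan = [0.80, 0.81, 0.93, 0.79]
    print(check_layup(scan, 4))  # → [2]
    ```

    A missing or extra layer would show up the same way, as a uniform offset of the whole profile rather than a local bump.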

  2. Levels of integration in cognitive control and sequence processing in the prefrontal cortex.

    PubMed

    Bahlmann, Jörg; Korb, Franziska M; Gratton, Caterina; Friederici, Angela D

    2012-01-01

    Cognitive control is necessary to flexibly act in changing environments. Sequence processing is needed in language comprehension to build the syntactic structure in sentences. Functional imaging studies suggest that sequence processing engages the left ventrolateral prefrontal cortex (PFC). In contrast, cognitive control processes additionally recruit bilateral rostral lateral PFC regions. The present study aimed to investigate these two types of processes in one experimental paradigm. Sequence processing was manipulated using two different sequencing rules varying in complexity. Cognitive control was varied with different cue-sets that determined the choice of a sequencing rule. Univariate analyses revealed distinct PFC regions for the two types of processing (i.e. sequence processing: left ventrolateral PFC and cognitive control processing: bilateral dorsolateral and rostral PFC). Moreover, in a common brain network (including left lateral PFC and intraparietal sulcus) no interaction between sequence and cognitive control processing was observed. In contrast, a multivariate pattern analysis revealed an interaction of sequence and cognitive control processing, such that voxels in left lateral PFC and parietal cortex showed different tuning functions for tasks involving different sequencing and cognitive control demands. These results suggest that the difference between the process of rule selection (i.e. cognitive control) and the process of rule-based sequencing (i.e. sequence processing) finds its neuronal underpinnings in distinct activation patterns in lateral PFC. Moreover, the combination of rule selection and rule sequencing can shape the response of neurons in lateral PFC and parietal cortex.

  3. Levels of Integration in Cognitive Control and Sequence Processing in the Prefrontal Cortex

    PubMed Central

    Bahlmann, Jörg; Korb, Franziska M.; Gratton, Caterina; Friederici, Angela D.

    2012-01-01

    Cognitive control is necessary to flexibly act in changing environments. Sequence processing is needed in language comprehension to build the syntactic structure in sentences. Functional imaging studies suggest that sequence processing engages the left ventrolateral prefrontal cortex (PFC). In contrast, cognitive control processes additionally recruit bilateral rostral lateral PFC regions. The present study aimed to investigate these two types of processes in one experimental paradigm. Sequence processing was manipulated using two different sequencing rules varying in complexity. Cognitive control was varied with different cue-sets that determined the choice of a sequencing rule. Univariate analyses revealed distinct PFC regions for the two types of processing (i.e. sequence processing: left ventrolateral PFC and cognitive control processing: bilateral dorsolateral and rostral PFC). Moreover, in a common brain network (including left lateral PFC and intraparietal sulcus) no interaction between sequence and cognitive control processing was observed. In contrast, a multivariate pattern analysis revealed an interaction of sequence and cognitive control processing, such that voxels in left lateral PFC and parietal cortex showed different tuning functions for tasks involving different sequencing and cognitive control demands. These results suggest that the difference between the process of rule selection (i.e. cognitive control) and the process of rule-based sequencing (i.e. sequence processing) finds its neuronal underpinnings in distinct activation patterns in lateral PFC. Moreover, the combination of rule selection and rule sequencing can shape the response of neurons in lateral PFC and parietal cortex. PMID:22952762

  4. Flow chemistry using milli- and microstructured reactors-from conventional to novel process windows.

    PubMed

    Illg, Tobias; Löb, Patrick; Hessel, Volker

    2010-06-01

    The term Novel Process Windows unites different methods of improving existing processes by applying unconventional and harsh process conditions, such as process routes at much-elevated pressure or temperature, or processing in a thermal runaway regime, to achieve a significant impact on process performance. This paper reviews parts of IMM's work, in particular the applicability of the above-mentioned Novel Process Windows to selected chemical reactions. First, general characteristics of microreactors are discussed, such as excellent mass and heat transfer and improved mixing quality. Different types of reactions are presented in which the use of microstructured devices led to increased process performance by applying Novel Process Windows. These examples were chosen to demonstrate how chemical reactions can benefit from the use of milli- and microstructured devices and how existing protocols can be changed toward process conditions hitherto not applicable in standard laboratory equipment. Milli- and microstructured reactors can also offer advantages in other areas, for example, high-throughput screening of catalysts and better control of size distribution in particle synthesis processes through improved mixing. The chemical industry is continuously improving: much research is being done to synthesize high-value chemicals, to optimize existing processes with respect to process safety and energy consumption, and to search for new routes to produce such chemicals. Leitmotifs of such undertakings are often sustainable development(1) and Green Chemistry(2).

  5. Fast but fleeting: adaptive motor learning processes associated with aging and cognitive decline.

    PubMed

    Trewartha, Kevin M; Garcia, Angeles; Wolpert, Daniel M; Flanagan, J Randall

    2014-10-01

    Motor learning has been shown to depend on multiple interacting learning processes. For example, learning to adapt when moving grasped objects with novel dynamics involves a fast process that adapts and decays quickly-and that has been linked to explicit memory-and a slower process that adapts and decays more gradually. Each process is characterized by a learning rate that controls how strongly motor memory is updated based on experienced errors and a retention factor determining the movement-to-movement decay in motor memory. Here we examined whether fast and slow motor learning processes involved in learning novel dynamics differ between younger and older adults. In addition, we investigated how age-related decline in explicit memory performance influences learning and retention parameters. Although the groups adapted equally well, they did so with markedly different underlying processes. Whereas the groups had similar fast processes, they had different slow processes. Specifically, the older adults exhibited decreased retention in their slow process compared with younger adults. Within the older group, who exhibited considerable variation in explicit memory performance, we found that poor explicit memory was associated with reduced retention in the fast process, as well as the slow process. These findings suggest that explicit memory resources are a determining factor in impairments in both the fast and slow processes for motor learning but that aging effects on the slow process are independent of explicit memory declines. Copyright © 2014 the authors 0270-6474/14/3413411-11$15.00/0.
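    The fast/slow decomposition described above can be written as a standard two-state adaptation model (e.g., Smith et al., 2006): on each trial, every state is scaled by its retention factor A and updated from the movement error with its learning rate B. The sketch below simulates adaptation to a constant perturbation; the parameter values are illustrative, not fitted values from the study.

    ```python
    # Two-state model: fast process (high B, low A) and slow process (low B, high A).
    def simulate(trials, a_fast=0.92, b_fast=0.20, a_slow=0.996, b_slow=0.02):
        x_fast = x_slow = 0.0
        out = []
        for perturbation in trials:
            x_net = x_fast + x_slow          # total adaptation expressed this trial
            error = perturbation - x_net     # movement error experienced
            x_fast = a_fast * x_fast + b_fast * error   # retention * state + learning * error
            x_slow = a_slow * x_slow + b_slow * error
            out.append(x_net)
        return out

    adaptation = simulate([1.0] * 200)   # constant unit perturbation
    print(round(adaptation[-1], 2))      # net adaptation after 200 trials
    ```

    Lowering `a_slow`, as the abstract reports for older adults' slow process, reduces both the asymptote and how much adaptation is retained across breaks.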

  6. Parallel Activation in Bilingual Phonological Processing

    ERIC Educational Resources Information Center

    Lee, Su-Yeon

    2011-01-01

    In bilingual language processing, the parallel activation hypothesis suggests that bilinguals activate their two languages simultaneously during language processing. Support for the parallel activation mainly comes from studies of lexical (word-form) processing, with relatively less attention to phonological (sound) processing. According to…

  7. OCLC-MARC Tape Processing: A Functional Analysis.

    ERIC Educational Resources Information Center

    Miller, Bruce Cummings

    1984-01-01

    Analyzes structure of, and data in, the OCLC-MARC record in the form delivered via OCLC's Tape Subscription Service, and outlines important processing functions involved: "unreadable tapes," duplicate records and deduping, match processing, choice processing, locations processing, "automatic" and "input" stamps,…

  8. 7 Processes that Enable NASA Software Engineering Technologies: Value-Added Process Engineering

    NASA Technical Reports Server (NTRS)

    Housch, Helen; Godfrey, Sally

    2011-01-01

    The presentation reviews Agency process requirements and the purpose, benefits, and experiences of seven software engineering processes. The processes include: product integration, configuration management, verification, software assurance, measurement and analysis, requirements management, and planning and monitoring.

  9. Risk-based Strategy to Determine Testing Requirement for the Removal of Residual Process Reagents as Process-related Impurities in Bioprocesses.

    PubMed

    Qiu, Jinshu; Li, Kim; Miller, Karen; Raghani, Anil

    2015-01-01

    The purpose of this article is to recommend a risk-based strategy for determining the clearance testing requirements of the process reagents used in manufacturing biopharmaceutical products. The strategy takes account of four risk factors. First, the process reagents are classified into two categories according to their safety profile and history of use: generally recognized as safe (GRAS) and potential safety concern (PSC) reagents. Clearance testing of GRAS reagents can be eliminated because of their historically safe use and the process capability to remove them. An estimated safety margin (Se) value, the ratio of the exposure limit to the estimated maximum reagent amount, is then used to evaluate the necessity of testing the PSC reagents at an early development stage. The Se value is calculated from two risk factors: the starting PSC reagent amount per maximum product dose (Me) and the exposure limit (Le). A worst-case scenario, as is common practice, is assumed to estimate the Me value: the PSC reagent of interest is co-purified with the product and no clearance occurs throughout the entire purification process. No clearance testing is required for a PSC reagent if its Se value is ≥1; otherwise clearance testing is needed. Finally, the point at which the process reagent is introduced into the process is also considered in determining the necessity of clearance testing. How the measured safety margin is used as a criterion for determining PSC reagent testing at the process characterization, process validation, and commercial production stages is also described. A large number of process reagents are used in biopharmaceutical manufacturing to control process performance. Clearance testing for all of the process reagents would be an enormous analytical task. In this article, a risk-based strategy is described to eliminate unnecessary clearance testing for the majority of the process reagents using four risk factors. 
The risk factors included in the strategy are (i) the safety profile of the reagents, (ii) the starting amount of the process reagents used in the manufacturing process, (iii) the maximum dose of the product, and (iv) the point of introduction of the process reagents into the process. Implementation of the risk-based strategy can eliminate clearance testing for approximately 90% of the process reagents used in the manufacturing processes. This science-based strategy allows us to ensure patient safety and meet regulatory agency expectations throughout the product development life cycle. © PDA, Inc. 2015.
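    The Se screening rule described above reduces to a one-line comparison of the exposure limit Le against the worst-case amount Me per maximum dose. The sketch below applies it to hypothetical reagents; the names, limits, and amounts are invented for illustration.

    ```python
    def needs_clearance_testing(le_exposure_limit, me_max_amount_per_dose):
        """Return (Se, flag): flag is True when Se < 1 and clearance testing is needed."""
        se = le_exposure_limit / me_max_amount_per_dose
        return se, se < 1.0

    reagents = {
        # name: (Le exposure limit, Me worst-case amount per maximum dose), in mg
        "reagent A": (5.0, 0.5),    # Se = 10  -> no testing required
        "reagent B": (0.2, 1.0),    # Se = 0.2 -> clearance testing needed
    }
    for name, (le, me) in reagents.items():
        se, test_needed = needs_clearance_testing(le, me)
        print(name, se, test_needed)
    ```

    Because Me assumes zero clearance through purification, an Se ≥ 1 verdict is conservative; reagents flagged with Se < 1 go on to actual clearance testing rather than being rejected outright.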

  10. Titania nanotube powders obtained by rapid breakdown anodization in perchloric acid electrolytes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ali, Saima, E-mail: saima.ali@aalto.fi; Hannula, Simo-Pekka

    Titania nanotube (TNT) powders are prepared by rapid breakdown anodization (RBA) in a 0.1 M perchloric acid (HClO{sub 4}) solution (Process 1), and in an ethylene glycol (EG) mixture with HClO{sub 4} and water (Process 2). A study of the as-prepared and calcined TNT powders obtained by both processes is implemented to evaluate and compare the morphology, crystal structure, specific surface area, and composition of the nanotubes. Longer TNTs are formed in Process 1, while comparatively larger pore diameters and wall thicknesses are obtained for the nanotubes prepared by Process 2. The TNTs obtained by Process 1 are converted to nanorods at 350 °C, while the nanotubes obtained by Process 2 preserve their tubular morphology up to 350 °C. In addition, the TNTs prepared in the aqueous electrolyte have a crystalline structure, whereas the TNTs obtained by Process 2 are amorphous. Samples calcined up to 450 °C have XRD peaks from the anatase phase, while the rutile phase appears at 550 °C for the TNTs prepared by both processes. The Raman spectra also show clear anatase peaks for all samples except the as-prepared sample obtained by Process 2, thus supporting the XRD findings. FTIR spectra reveal the presence of O-H groups in the structure of the TNTs obtained by both processes; however, the presence is less prominent for annealed samples. Additionally, TNTs obtained by Process 2 have a carbonaceous impurity in the structure, attributed to the electrolyte used in that process. While negligible weight loss is typical for TNTs prepared from aqueous electrolytes, a weight loss of 38.6% in the temperature range of 25–600 °C is found for TNTs prepared in the EG electrolyte (Process 2). A large specific surface area of 179.2 m{sup 2} g{sup −1} is obtained for TNTs prepared by Process 1, whereas Process 2 produces nanotubes with a lower specific surface area. The difference appears to correspond to the dimensions of the nanotubes obtained by the two processes. 
- Graphical abstract: Titania nanotube powders prepared by Process 1 and Process 2 have different crystal structure and specific surface area. - Highlights: • Titania nanotube (TNT) powder is prepared in low water organic electrolyte. • Characterization of TNT powders prepared from aqueous and organic electrolyte. • TNTs prepared by Process 1 are crystalline with higher specific surface area. • TNTs obtained by Process 2 have carbonaceous impurities in the structure.

  11. A processing approach to the working memory/long-term memory distinction: evidence from the levels-of-processing span task.

    PubMed

    Rose, Nathan S; Craik, Fergus I M

    2012-07-01

    Recent theories suggest that performance on working memory (WM) tasks involves retrieval from long-term memory (LTM). To examine whether WM and LTM tests have common principles, Craik and Tulving's (1975) levels-of-processing paradigm, which is known to affect LTM, was administered as a WM task: Participants made uppercase, rhyme, or category-membership judgments about words, and immediate recall of the words was required after every 3 or 8 processing judgments. In Experiment 1, immediate recall did not demonstrate a levels-of-processing effect, but a subsequent LTM test (delayed recognition) of the same words did show a benefit of deeper processing. Experiment 2 showed that surprise immediate recall of 8-item lists did demonstrate a levels-of-processing effect, however. A processing account of the conditions in which levels-of-processing effects are and are not found in WM tasks was advanced, suggesting that the extent to which levels-of-processing effects are similar between WM and LTM tests largely depends on the amount of disruption to active maintenance processes. 2012 APA, all rights reserved

  12. Emotional words can be embodied or disembodied: the role of superficial vs. deep types of processing

    PubMed Central

    Abbassi, Ensie; Blanchette, Isabelle; Ansaldo, Ana I.; Ghassemzadeh, Habib; Joanette, Yves

    2015-01-01

    Emotional words are processed rapidly and automatically in the left hemisphere (LH) and slowly, with the involvement of attention, in the right hemisphere (RH). This review aims to find the reason for this difference and suggests that emotional words can be processed superficially or deeply due to the involvement of the linguistic and imagery systems, respectively. During superficial processing, emotional words likely make connections only with semantically associated words in the LH. This part of the process is automatic and may be sufficient for the purpose of language processing. Deep processing, in contrast, seems to involve conceptual information and imagery of a word’s perceptual and emotional properties using autobiographical memory contents. Imagery and the involvement of autobiographical memory likely differentiate between emotional and neutral word processing and explain the salient role of the RH in emotional word processing. It is concluded that the level of emotional word processing in the RH should be deeper than in the LH and, thus, it is conceivable that the slow mode of processing adds certain qualities to the output. PMID:26217288

  13. Process Monitoring Evaluation and Implementation for the Wood Abrasive Machining Process

    PubMed Central

    Saloni, Daniel E.; Lemaster, Richard L.; Jackson, Steven D.

    2010-01-01

    Wood processing industries have continuously developed and improved technologies and processes to transform wood to obtain better final product quality and thus increase profits. Abrasive machining is one of the most important of these processes and therefore merits special attention and study. The objective of this work was to evaluate and demonstrate a process monitoring system for use in the abrasive machining of wood and wood based products. The system developed increases the life of the belt by detecting (using process monitoring sensors) and removing (by cleaning) the abrasive loading during the machining process. This study focused on abrasive belt machining processes and included substantial background work, which provided a solid base for understanding the behavior of the abrasive, and the different ways that the abrasive machining process can be monitored. In addition, the background research showed that abrasive belts can effectively be cleaned by the appropriate cleaning technique. The process monitoring system developed included acoustic emission sensors which tended to be sensitive to belt wear, as well as platen vibration, but not loading, and optical sensors which were sensitive to abrasive loading. PMID:22163477

  14. Adaptive memory: determining the proximate mechanisms responsible for the memorial advantages of survival processing.

    PubMed

    Burns, Daniel J; Burns, Sarah A; Hwang, Ana J

    2011-01-01

    J. S. Nairne, S. R. Thompson, and J. N. S. Pandeirada (2007) suggested that our memory systems may have evolved to help us remember fitness-relevant information and showed that retention of words rated for their relevance to survival is superior to that of words encoded under other deep processing conditions. The authors present 4 experiments that uncover the proximate mechanisms likely responsible. The authors obtained a recall advantage for survival processing compared with conditions that promoted only item-specific processing or only relational processing. This effect was eliminated when control conditions encouraged both item-specific and relational processing. Data from separate measures of item-specific and relational processing generally were consistent with the view that the memorial advantage for survival processing results from the encoding of both types of processing. Although the present study suggests the proximate mechanisms for the effect, the authors argue that survival processing may be fundamentally different from other memory phenomena for which item-specific and relational processing differences have been implicated. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  15. Implementation of quality by design toward processing of food products.

    PubMed

    Rathore, Anurag S; Kapoor, Gautam

    2017-05-28

    Quality by design (QbD) is a systematic approach that begins with predefined objectives and emphasizes product and process understanding and process control. It is an approach based on principles of sound science and quality risk management. As the food processing industry continues to embrace the idea of in-line, online, and/or at-line sensors and real-time characterization for process monitoring and control, the existing gaps in our ability to monitor the multiple parameters/variables associated with the manufacturing process will be alleviated over time. Investments made in tools and approaches that facilitate high-throughput analytical and process development, process analytical technology, design of experiments, risk analysis, knowledge management, and enhancement of process/product understanding would pave the way for operational and economic benefits later in the commercialization process and across other product pipelines. This article aims to achieve two major objectives: first, to review the progress that has been made in recent years on QbD implementation in the processing of food products; and second, to present a case study that illustrates the benefits of such QbD implementation.

  16. Energy saving processes for nitrogen removal in organic wastewater from food processing industries in Thailand.

    PubMed

    Johansen, N H; Suksawad, N; Balslev, P

    2004-01-01

    Nitrogen removal from organic wastewater is becoming a requirement in developed communities. The use of nitrite as an intermediate in the treatment of wastewater has been largely ignored, but it is actually an energy-saving alternative to conventional nitrification/denitrification using nitrate as the intermediate. Full-scale and pilot-scale results using this process are presented. The process needs some additional process considerations and process control to be utilized. Especially under tropical conditions, the nitritation process will occur readily, and it must be expected that many activated sludge (AS) treatment plants in the food industry already produce NO2-N. This uncontrolled nitrogen conversion can be the main cause of sludge bulking problems. It is expected that sludge bulking problems can in many cases be solved simply by changing the process control to maintain a more consistent nitritation. Theoretically, this process decreases the oxygen consumption for oxidation by 25%, and the use of a carbon source for the reduction step is decreased by 40% compared to the conventional process.
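The quoted savings can be reproduced from textbook nitrogen stoichiometry. The sketch below is illustrative only and uses standard literature values (not figures from this record) for the oxygen demand of nitrification and the COD demand of denitrification:

```python
# Stoichiometric sketch of the nitrite ("nitritation") shortcut.
# Assumed textbook values, per g of nitrogen converted:
O2_NH4_TO_NO2 = 3.43   # g O2 to oxidize NH4+-N to NO2--N
O2_NO2_TO_NO3 = 1.14   # g O2 to oxidize NO2--N to NO3--N
COD_FROM_NO3 = 2.86    # g COD to denitrify from nitrate
COD_FROM_NO2 = 1.71    # g COD to denitrify from nitrite

def oxygen_saving() -> float:
    """Fraction of aeration oxygen saved by stopping oxidation at nitrite."""
    full_nitrification = O2_NH4_TO_NO2 + O2_NO2_TO_NO3
    return O2_NO2_TO_NO3 / full_nitrification

def carbon_saving() -> float:
    """Fraction of external carbon (COD) saved by denitrifying from nitrite."""
    return (COD_FROM_NO3 - COD_FROM_NO2) / COD_FROM_NO3

print(f"oxygen saving: {oxygen_saving():.0%}")   # ~25%, as stated above
print(f"carbon saving: {carbon_saving():.0%}")   # ~40%, as stated above
```

The 25% and 40% figures in the abstract fall directly out of these ratios.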

  17. Application of Ozone MBBR Process in Refinery Wastewater Treatment

    NASA Astrophysics Data System (ADS)

    Lin, Wang

    2018-01-01

    Moving Bed Biofilm Reactor (MBBR) technology is a sewage treatment technology based on a fluidized bed. It can also be regarded as an efficient new reactor type intermediate between the activated sludge process and the biofilm process. This paper studies the application of the ozone + MBBR process in refinery wastewater treatment, with a focus on the design of a combined ozone + MBBR process. The ozone + MBBR process is used to treat the COD in the reverse osmosis concentrate discharged from a refinery wastewater treatment plant. The experimental results show that the average removal rate of COD is 46.0%~67.3% when treating reverse osmosis concentrate with the ozone + MBBR process, and the effluent can meet the relevant standard requirements. Compared with the traditional process, the ozone + MBBR process is more flexible. The investment for this process consists mainly of the ozone generator, blower, and similar equipment; these items are relatively inexpensive, and their cost can be offset by the savings relative to the investment in a traditional activated sludge process. At the same time, the ozone + MBBR process has obvious advantages in effluent quality, stability, and other respects.
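The reported removal rates follow the usual definition of fractional COD removal across a treatment stage. A minimal sketch, with hypothetical influent/effluent concentrations chosen only for illustration:

```python
def cod_removal_rate(cod_in_mg_l: float, cod_out_mg_l: float) -> float:
    """Fractional COD removal across a treatment stage."""
    return (cod_in_mg_l - cod_out_mg_l) / cod_in_mg_l

# Hypothetical concentrations (mg/L); the record reports average removal
# rates of 46.0%-67.3% for the ozone + MBBR process.
print(f"{cod_removal_rate(150.0, 60.0):.1%}")  # 60.0%
```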

  18. Models of recognition: A review of arguments in favor of a dual-process account

    PubMed Central

    DIANA, RACHEL A.; REDER, LYNNE M.; ARNDT, JASON; PARK, HEEKYEONG

    2008-01-01

    The majority of computationally specified models of recognition memory have been based on a single-process interpretation, claiming that familiarity is the only influence on recognition. There is increasing evidence that recognition is, in fact, based on two processes: recollection and familiarity. This article reviews the current state of the evidence for dual-process models, including the usefulness of the remember/know paradigm, and interprets the relevant results in terms of the source of activation confusion (SAC) model of memory. We argue that the evidence from each of the areas we discuss, when combined, presents a strong case that inclusion of a recollection process is necessary. Given this conclusion, we also argue that the dual-process claim that the recollection process is always available is, in fact, more parsimonious than the single-process claim that the recollection process is used only in certain paradigms. The value of a well-specified process model such as the SAC model is discussed with regard to other types of dual-process models. PMID:16724763

  19. Emotional words can be embodied or disembodied: the role of superficial vs. deep types of processing.

    PubMed

    Abbassi, Ensie; Blanchette, Isabelle; Ansaldo, Ana I; Ghassemzadeh, Habib; Joanette, Yves

    2015-01-01

    Emotional words are processed rapidly and automatically in the left hemisphere (LH) and slowly, with the involvement of attention, in the right hemisphere (RH). This review aims to find the reason for this difference and suggests that emotional words can be processed superficially or deeply due to the involvement of the linguistic and imagery systems, respectively. During superficial processing, emotional words likely make connections only with semantically associated words in the LH. This part of the process is automatic and may be sufficient for the purpose of language processing. Deep processing, in contrast, seems to involve conceptual information and imagery of a word's perceptual and emotional properties using autobiographical memory contents. Imagery and the involvement of autobiographical memory likely differentiate between emotional and neutral word processing and explain the salient role of the RH in emotional word processing. It is concluded that the level of emotional word processing in the RH should be deeper than in the LH and, thus, it is conceivable that the slow mode of processing adds certain qualities to the output.

  20. Techno-economic analysis of biocatalytic processes for production of alkene epoxides

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borole, Abhijeet P

    2007-01-01

    A techno-economic analysis of two different bioprocesses was conducted: one for the conversion of propylene to propylene oxide (PO), and the other for the conversion of styrene to styrene epoxide (SO). The first process was a lipase-mediated chemo-enzymatic reaction, whereas the second was a one-step enzymatic process using chloroperoxidase. The PO produced through the chemo-enzymatic process is a racemic product, whereas the latter process (based on chloroperoxidase) produces an enantiopure product. The former process thus falls under the category of a high-volume commodity chemical (PO), whereas the latter is a low-volume, high-value product (SO). A simulation of the process was conducted using the bioprocess engineering software SuperPro Designer v6.0 (Intelligen, Inc., Scotch Plains, NJ) to determine the economic feasibility of the process. The purpose of the exercise was to compare biocatalytic processes with existing chemical processes for the production of alkene epoxides. The results show that further improvements in biocatalyst stability are needed to make these bioprocesses competitive with chemical processes.

  1. The representation of conceptual knowledge: visual, auditory, and olfactory imagery compared with semantic processing.

    PubMed

    Palmiero, Massimiliano; Di Matteo, Rosalia; Belardinelli, Marta Olivetti

    2014-05-01

    Two experiments comparing imaginative processing in different modalities with semantic processing were carried out to investigate whether conceptual knowledge can be represented in different formats. Participants were asked to judge the similarity between visual images, auditory images, and olfactory images in the imaginative block, and to judge whether two items belonged to the same category in the semantic block. Items were verbally cued in both experiments. The degree of similarity between the imaginative and semantic items was varied across experiments. Experiment 1 showed that semantic processing was faster than visual and auditory imaginative processing, whereas no differentiation was possible between semantic processing and olfactory imaginative processing. Experiment 2 revealed that only visual imaginative processing could be differentiated from semantic processing in terms of accuracy. These results show that visual and auditory imaginative processing can be differentiated from semantic processing, although both visual and auditory images rely strongly on semantic representations. In contrast, no differentiation is possible within the olfactory domain. Results are discussed in the frame of the imagery debate.

  2. Working memory load eliminates the survival processing effect.

    PubMed

    Kroneisen, Meike; Rummel, Jan; Erdfelder, Edgar

    2014-01-01

    In a series of experiments, Nairne, Thompson, and Pandeirada (2007) demonstrated that words judged for their relevance to a survival scenario are remembered better than words judged for a scenario not relevant on a survival dimension. They explained this survival-processing effect by arguing that nature "tuned" our memory systems to process and remember fitness-relevant information. Kroneisen and Erdfelder (2011) proposed that it may not be survival processing per se that facilitates recall but the richness and distinctiveness with which information is encoded. To further test this account, we investigated how the survival processing effect is affected by cognitive load. If the survival processing effect is due to automatic processes or, alternatively, if survival processing is routinely prioritized in dual-task contexts, we would expect this effect to persist under cognitive load conditions. If the effect relies on cognitively demanding processes like richness and distinctiveness of encoding, however, the survival processing benefit should be hampered by increased cognitive load during encoding. Results were in line with the latter prediction, that is, the survival processing effect vanished under dual-task conditions.

  3. E-learning process maturity level: a conceptual framework

    NASA Astrophysics Data System (ADS)

    Rahmah, A.; Santoso, H. B.; Hasibuan, Z. A.

    2018-03-01

    ICT advancement is certain, and its impact influences many domains, including learning in both formal and informal situations. It leads to a new mindset: we should not only utilize the available ICT to support the learning process, but also improve it gradually, taking many factors into account. This phenomenon is called e-learning process evolution. Accordingly, this study explores the maturity level concept to provide a direction for gradual improvement and progression monitoring of the individual e-learning process. An extensive literature review, observation, and construct formation were conducted to develop a conceptual framework for e-learning process maturity level. The conceptual framework consists of the learner, the e-learning process, continuous improvement, the evolution of the e-learning process, technology, and learning objectives. The evolution of the e-learning process is depicted as the current versus expected conditions of e-learning process maturity. The study concludes that the conceptual framework may guide the evolution roadmap for the e-learning process, accelerate the evolution, and decrease the negative impact of ICT. The conceptual framework will be verified and tested in a future study.

  4. Heat input and accumulation for ultrashort pulse processing with high average power

    NASA Astrophysics Data System (ADS)

    Finger, Johannes; Bornschlegel, Benedikt; Reininghaus, Martin; Dohrn, Andreas; Nießen, Markus; Gillner, Arnold; Poprawe, Reinhart

    2018-05-01

    Materials processing using ultrashort pulsed laser radiation with pulse durations <10 ps is known to enable very precise processing with negligible thermal load. However, even with picosecond and femtosecond laser radiation, not all of the absorbed energy is converted into ablation products; a distinct fraction remains as residual heat in the processed workpiece. At low average power and power densities, this heat is usually not relevant to the processing results and dissipates into the workpiece. In contrast, when higher average powers and repetition rates are applied to increase throughput and upscale ultrashort pulse processing, this heat input becomes relevant and significantly affects the processing results. In this paper, we outline the relevance of heat input for ultrashort pulse processing, starting with the heat input of a single ultrashort laser pulse. Heat accumulation during ultrashort pulse processing with high repetition rate is discussed, as well as heat accumulation during materials processing using pulse bursts. In addition, the relevance of heat accumulation with multiple scanning passes and processing with multiple laser spots is shown.

  5. Defining and reconstructing clinical processes based on IHE and BPMN 2.0.

    PubMed

    Strasser, Melanie; Pfeifer, Franz; Helm, Emmanuel; Schuler, Andreas; Altmann, Josef

    2011-01-01

    This paper describes the current status and results of our process management system for defining and reconstructing clinical care processes, which makes it possible to compare, analyze, and evaluate clinical processes and to identify high-cost tasks or stays. The system is founded on IHE, which guarantees standardized interfaces and interoperability between clinical information systems. At the heart of the system is BPMN, a modeling notation and specification language that allows the definition and execution of clinical processes. The system provides functionality to define clinical core processes independent of the healthcare information system and to execute the processes in a workflow engine. Furthermore, the reconstruction of clinical processes is done by evaluating an IHE audit log database, which records patient movements within a health care facility. The main goal of the system is to assist hospital operators and clinical process managers in detecting discrepancies between defined and actual clinical processes, as well as in identifying the main causes of high medical costs. Beyond that, the system can potentially contribute to reconstructing and improving clinical processes and to enhancing cost control and patient care quality.

  6. Process qualification and testing of LENS deposited AY1E0125 D-bottle brackets.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Atwood, Clinton J.; Smugeresky, John E.; Jew, Michael

    2006-11-01

    The LENS Qualification team had the goal of performing a process qualification for the Laser Engineered Net Shaping™ (LENS®) process. Process qualification requires that a part be selected for process demonstration; the AY1E0125 D-Bottle Bracket from the W80-3 was selected for this work. The repeatability of the LENS process was baselined to determine process parameters. Six D-Bottle brackets were deposited using LENS, machined to final dimensions, and tested against conventionally processed brackets. The tests, taken from ES1E0003, included a mass analysis and structural dynamic testing, including free-free and assembly-level modal tests and Haversine shock tests. The LENS brackets performed with very similar characteristics to the conventionally processed brackets. Based on the test results, it was concluded that the performance of the brackets made them eligible for parallel path testing in subsystem-level tests. The testing results and process rigor qualified the LENS process as detailed in EER200638525A.

  7. Sustainability assessment of shielded metal arc welding (SMAW) process

    NASA Astrophysics Data System (ADS)

    Alkahla, Ibrahim; Pervaiz, Salman

    2017-09-01

    The shielded metal arc welding (SMAW) process is one of the most commonly employed material joining processes, utilized in various industrial sectors such as marine, shipbuilding, automotive, aerospace, construction, and petrochemicals. Increasing pressure on the manufacturing sector demands that welding processes be sustainable in nature. The SMAW process incorporates several types of input and output streams. The sustainability concerns associated with SMAW are linked to these streams: electrical energy requirements, input material consumption, slag formation, fume emission, and hazardous working conditions affecting human health and occupational safety. To enhance the environmental performance of SMAW welding, there is a need to characterize its sustainability within the broad framework of sustainability. Most of the available literature focuses on the technical and economic aspects of the welding process; the environmental and social aspects are rarely addressed. This study reviews the SMAW process with respect to the triple bottom line (economic, environmental, and social) approach to sustainability. Finally, the study concludes with recommendations for achieving an economical and sustainable SMAW welding process.

  8. Decontamination and disposal of PCB wastes.

    PubMed Central

    Johnston, L E

    1985-01-01

    Decontamination and disposal processes for PCB wastes are reviewed. Processes are classed as incineration, chemical reaction, or decontamination. Incineration technologies are not limited to rigorous high-temperature processes but include those with innovations in the use of oxidant, heat transfer, and residue recycle. Chemical processes include the sodium processes, radiant energy processes, and low-temperature oxidations. Typical processing rates and associated costs are provided where possible. PMID:3928363

  9. Logistics Control Facility: A Normative Model for Total Asset Visibility in the Air Force Logistics System

    DTIC Science & Technology

    1994-09-01

    Issue: Computers, information systems, and communication systems are being increasingly used in transportation, warehousing, order processing, materials...inventory levels, reduced order processing times, reduced order processing costs, and increased customer satisfaction. While purchasing and transportation...process, the speed with which orders are processed would increase significantly. Lowering the order processing time in turn lowers the lead time, which in

  10. Definition and documentation of engineering processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDonald, G.W.

    1997-11-01

    This tutorial is an extract of a two-day workshop developed under the auspices of the Quality Engineering Department at Sandia National Laboratories. The presentation starts with basic definitions and addresses why processes should be defined and documented. It covers three primary topics: (1) process considerations and rationale, (2) an approach to defining and documenting engineering processes, and (3) an IDEF0 model of the process for defining engineering processes.

  11. Method for enhanced atomization of liquids

    DOEpatents

    Thompson, Richard E.; White, Jerome R.

    1993-01-01

    In a process for atomizing a slurry or liquid process stream in which a slurry or liquid is passed through a nozzle to provide a primary atomized process stream, an improvement which comprises subjecting the liquid or slurry process stream to microwave energy as the liquid or slurry process stream exits the nozzle, wherein sufficient microwave heating is provided to flash vaporize the primary atomized process stream.

  12. Rethinking a Negative Event: The Affective Impact of Ruminative versus Imagery-Based Processing of Aversive Autobiographical Memories.

    PubMed

    Slofstra, Christien; Eisma, Maarten C; Holmes, Emily A; Bockting, Claudi L H; Nauta, Maaike H

    2017-01-01

    Ruminative (abstract verbal) processing during recall of aversive autobiographical memories may serve to dampen their short-term affective impact. Experimental studies indeed demonstrate that verbal processing of non-autobiographical material and positive autobiographical memories evokes weaker affective responses than imagery-based processing. In the current study, we hypothesized that abstract verbal or concrete verbal processing of an aversive autobiographical memory would result in weaker affective responses than imagery-based processing. The affective impact of abstract verbal versus concrete verbal versus imagery-based processing during recall of an aversive autobiographical memory was investigated in a non-clinical sample (n = 99) using both an observational and an experimental design. Observationally, it was examined whether spontaneous use of processing modes (both state and trait measures) was associated with the impact of aversive autobiographical memory recall on negative and positive affect. Experimentally, the causal relation between processing modes and affective impact was investigated by manipulating the processing mode during retrieval of the same aversive autobiographical memory. The main findings in the observational part of the study were that higher levels of trait (but not state) measures of both ruminative and imagery-based processing, as well as depressive symptomatology, were associated with greater negative affective impact. In the experimental part, no main effect of processing modes on the affective impact of autobiographical memories was found. However, a significant moderating effect of depressive symptomatology was found: only for individuals with low levels of depressive symptomatology did concrete verbal (but not abstract verbal) processing of the aversive autobiographical memory result in weaker affective responses compared to imagery-based processing.
These results cast doubt on the hypothesis that ruminative processing of aversive autobiographical memories serves to avoid the negative emotions evoked by such memories. Furthermore, findings suggest that depressive symptomatology is associated with the spontaneous use and the affective impact of processing modes during recall of aversive autobiographical memories. Clinical studies are needed that examine the role of processing modes during aversive autobiographical memory recall in depression, including the potential effectiveness of targeting processing modes in therapy.

  13. Active pharmaceutical ingredient (API) production involving continuous processes--a process system engineering (PSE)-assisted design framework.

    PubMed

    Cervera-Padrell, Albert E; Skovby, Tommy; Kiil, Søren; Gani, Rafiqul; Gernaey, Krist V

    2012-10-01

    A systematic framework is proposed for the design of continuous pharmaceutical manufacturing processes. Specifically, the design framework focuses on organic chemistry based, active pharmaceutical ingredient (API) synthetic processes, but could potentially be extended to biocatalytic and fermentation-based products. The method exploits the synergistic combination of continuous flow technologies (e.g., microfluidic techniques) and process systems engineering (PSE) methods and tools for faster process design and increased process understanding throughout the whole drug product and process development cycle. The design framework structures the many different and challenging design problems (e.g., solvent selection, reactor design, and design of separation and purification operations), driving the user from the initial drug discovery steps--where process knowledge is very limited--toward detailed design and analysis. Examples from the literature of PSE methods and tools applied to pharmaceutical process design and novel pharmaceutical production technologies are provided throughout the text, assisting in the accumulation and interpretation of process knowledge. Different criteria are suggested for the selection of batch and continuous processes so that the overall design results in low capital and operational costs as well as a low environmental footprint. The design framework has been applied to the retrofit of an existing batch-wise process used by H. Lundbeck A/S to produce an API: zuclopenthixol. Some of its batch operations were successfully converted into continuous mode, obtaining higher yields that allowed a significant simplification of the whole process. The material and environmental footprint of the process--evaluated through the process mass intensity index, that is, kg of material used per kg of product--was reduced to half of its initial value, with potential for further reduction.
The case study includes reaction steps typically used by the pharmaceutical industry, featuring different characteristic reaction times, as well as L-L separation and distillation-based solvent exchange steps, and thus constitutes a good example of how the design framework can be used to efficiently design novel or existing API manufacturing processes that take advantage of continuous processing. Copyright © 2012 Elsevier B.V. All rights reserved.
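The process mass intensity (PMI) metric used above has a simple definition: total mass of all materials fed to the process per unit mass of product. A minimal sketch with hypothetical batch and continuous figures (the record does not give absolute masses), illustrating the reported halving:

```python
def process_mass_intensity(total_input_kg: float, product_kg: float) -> float:
    """PMI = kg of all input materials (reagents, solvents, water, ...) per kg of product."""
    return total_input_kg / product_kg

# Hypothetical figures for illustration only:
pmi_batch = process_mass_intensity(total_input_kg=120.0, product_kg=1.0)
pmi_continuous = process_mass_intensity(total_input_kg=60.0, product_kg=1.0)
print(pmi_batch, pmi_continuous)  # 120.0 60.0 -> PMI reduced to half, as reported
```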

  14. On the facilitative effects of face motion on face recognition and its development

    PubMed Central

    Xiao, Naiqi G.; Perrotta, Steve; Quinn, Paul C.; Wang, Zhe; Sun, Yu-Hao P.; Lee, Kang

    2014-01-01

    For the past century, researchers have extensively studied human face processing and its development. These studies have advanced our understanding of not only face processing, but also visual processing in general. However, most of what we know about face processing was investigated using static face images as stimuli. Therefore, an important question arises: to what extent does our understanding of static face processing generalize to face processing in real-life contexts in which faces are mostly moving? The present article addresses this question by examining recent studies on moving face processing to uncover the influence of facial movements on face processing and its development. First, we describe evidence on the facilitative effects of facial movements on face recognition and two related theoretical hypotheses: the supplementary information hypothesis and the representation enhancement hypothesis. We then highlight several recent studies suggesting that facial movements optimize face processing by activating specific face processing strategies that accommodate to task requirements. Lastly, we review the influence of facial movements on the development of face processing in the first year of life. We focus on infants' sensitivity to facial movements and explore the facilitative effects of facial movements on infants' face recognition performance. We conclude by outlining several future directions to investigate moving face processing and emphasize the importance of including dynamic aspects of facial information to further understand face processing in real-life contexts. PMID:25009517

  15. Comparison of property between two Viking Seismic tapes

    NASA Astrophysics Data System (ADS)

    Yamamoto, Y.; Yamada, R.

    2016-12-01

    The restoration work on the seismometer data from the Viking Lander 2 is still continuing. Originally, the data were processed and archived separately at MIT and UTIG, and each dataset is accessible via the Internet today. Their file formats differ, but both remain readable thanks to continuous investigation. However, there is some inconsistency between the datasets, although most of the data are highly consistent. Understanding the differences requires knowledge of spacecraft data archiving and off-line processing, because the differences were introduced during off-line processing. Spacecraft data processing often requires merging and sorting of raw data: merge processing is normally performed to eliminate duplicated data, and sort processing is performed to fix the data order. UTIG does not appear to have performed these merge and sort steps, so the UTIG-processed data still contain duplicates. The MIT-processed data underwent merge and sort processing, but the raw data sometimes include wrong time tags, which cannot be fixed strictly after sorting. Also, the MIT-processed data have ample documentation describing the metadata, while the UTIG data have only a brief instruction. Therefore, the MIT and UTIG data are treated as complementary: a better data set can be established using both. In this presentation, we show the method to build a better data set of Viking Lander 2 seismic data.
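The merge and sort off-line processing described in the abstract can be sketched in a few lines. The record layout below is hypothetical (the actual Viking telemetry format is not specified here): the merge step removes exact duplicate records, and the sort step restores time order.

```python
def merge_and_sort(records):
    """Deduplicate telemetry records, then order them by time tag.

    records: iterable of (time_tag, payload) tuples (hypothetical layout).
    """
    seen = set()
    unique = []
    for rec in records:
        if rec not in seen:          # merge step: eliminate duplicated data
            seen.add(rec)
            unique.append(rec)
    return sorted(unique, key=lambda rec: rec[0])  # sort step: fix data order

raw = [(3, "c"), (1, "a"), (3, "c"), (2, "b")]    # duplicated and out of order
print(merge_and_sort(raw))  # [(1, 'a'), (2, 'b'), (3, 'c')]
```

Note that records carrying wrong time tags, as in the MIT data, cannot be repaired by sorting alone, which matches the caveat in the abstract.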

  16. Holistic processing, contact, and the other-race effect in face recognition.

    PubMed

    Zhao, Mintao; Hayward, William G; Bülthoff, Isabelle

    2014-12-01

    Face recognition, holistic processing, and processing of configural and featural facial information are known to be influenced by face race, with better performance for own- than other-race faces. However, whether these various other-race effects (OREs) arise from the same underlying mechanisms or from different processes remains unclear. The present study addressed this question by measuring OREs in a set of face recognition tasks and testing whether these OREs are correlated with each other. Participants performed different tasks probing (1) face recognition, (2) holistic processing, (3) processing of configural information, and (4) processing of featural information for both own- and other-race faces. Their contact with other-race people was also assessed with a questionnaire. The results show significant OREs in tasks testing face memory and processing of configural information, but not in tasks testing either holistic processing or processing of featural information. Importantly, there was no cross-task correlation between any of the measured OREs. Moreover, the level of other-race contact predicted only the OREs obtained in tasks testing face memory and processing of configural information. These results indicate that these various cross-race differences originate from different aspects of face processing, contrary to the view that the ORE in face recognition is due to cross-race differences in holistic processing. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
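    The cross-task correlation analysis described above can be sketched as follows: each participant's ORE in a task is the own-race minus other-race score, and OREs from two tasks are then correlated across participants. This is an illustrative reconstruction with invented scores, not the authors' code or data.

    ```python
    from statistics import mean, stdev

    def ore(own_scores, other_scores):
        """Per-participant other-race effect: own-race minus other-race score."""
        return [o - x for o, x in zip(own_scores, other_scores)]

    def pearson(xs, ys):
        """Pearson correlation between two equal-length samples."""
        mx, my = mean(xs), mean(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
        return cov / (stdev(xs) * stdev(ys))

    # Hypothetical accuracy scores for five participants on two tasks.
    memory_ore = ore([0.9, 0.8, 0.85, 0.7, 0.95], [0.7, 0.75, 0.6, 0.65, 0.8])
    holistic_ore = ore([0.5, 0.6, 0.55, 0.4, 0.65], [0.45, 0.5, 0.5, 0.42, 0.6])

    # A small |r| across participants would be consistent with the two OREs
    # arising from distinct mechanisms, as the study concluded.
    r = pearson(memory_ore, holistic_ore)
    ```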

  17. Process monitoring and visualization solutions for hot-melt extrusion: a review.

    PubMed

    Saerens, Lien; Vervaet, Chris; Remon, Jean Paul; De Beer, Thomas

    2014-02-01

    Hot-melt extrusion (HME) is applied as a continuous pharmaceutical manufacturing process for the production of a variety of dosage forms and formulations. To ensure the continuity of this process, the quality of the extrudates must be assessed continuously during manufacturing. The objective of this review is to provide an overview and evaluation of the available process analytical techniques which can be applied in hot-melt extrusion. Pharmaceutical extruders are equipped with traditional (univariate) process monitoring tools, observing barrel and die temperatures, throughput, screw speed, torque, drive amperage, melt pressure and melt temperature. The relevance of several spectroscopic process analytical techniques for monitoring and control of pharmaceutical HME has been explored recently. Nevertheless, many other sensors visualizing HME and measuring diverse critical product and process parameters with potential use in pharmaceutical extrusion are available, and were thoroughly studied in polymer extrusion. The implementation of process analytical tools in HME serves two purposes: (1) improving process understanding by monitoring and visualizing the material behaviour and (2) monitoring and analysing critical product and process parameters for process control, making it possible to maintain a desired process state and guarantee the quality of the end product. This review is the first to provide an evaluation of the process analytical tools applied for pharmaceutical HME monitoring and control, and discusses techniques that have been used in polymer extrusion having potential for monitoring and control of pharmaceutical HME. © 2013 Royal Pharmaceutical Society.

  18. Process for improving metal production in steelmaking processes

    DOEpatents

    Pal, Uday B.; Gazula, Gopala K. M.; Hasham, Ali

    1996-01-01

    A process and apparatus for improving metal production in ironmaking and steelmaking processes are disclosed. The use of an inert metallic conductor in the slag-containing crucible and the addition of a transition metal oxide to the slag are the disclosed process improvements.

  19. Materials processing in space: Early experiments

    NASA Technical Reports Server (NTRS)

    Naumann, R. J.; Herring, H. W.

    1980-01-01

    The characteristics of the space environment are reviewed. Potential applications of space processing are discussed, including metallurgical processing and the processing of semiconductor materials. The behavior of fluids in low gravity is described. The evolution of apparatus for materials processing in space is reviewed.

  20. Abhijit Dutta | NREL

    Science.gov Websites

    Techno-economic analysis; process model development for existing and conceptual processes; detailed heat integration; economic analysis of integrated processes; integration of process simulation learnings into control. "Conceptual Process Design and Techno-Economic Assessment of Ex Situ Catalytic Fast Pyrolysis of Biomass: A

  1. 7 CFR 52.806 - Color.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... PROCESSED FRUITS AND VEGETABLES, PROCESSED PRODUCTS THEREOF, AND CERTAIN OTHER PROCESSED FOOD PRODUCTS 1... cherries that vary markedly from this color due to oxidation, improper processing, or other causes, or that... to oxidation, improper processing, or other causes, or that are undercolored, does not exceed the...

  2. 7 CFR 52.806 - Color.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... PROCESSED FRUITS AND VEGETABLES, PROCESSED PRODUCTS THEREOF, AND CERTAIN OTHER PROCESSED FOOD PRODUCTS 1... cherries that vary markedly from this color due to oxidation, improper processing, or other causes, or that... to oxidation, improper processing, or other causes, or that are undercolored, does not exceed the...

  3. Meat Processing.

    ERIC Educational Resources Information Center

    Legacy, Jim; And Others

    This publication provides an introduction to meat processing for adult students in vocational and technical education programs. Organized in four chapters, the booklet provides a brief overview of the meat processing industry and the techniques of meat processing and butchering. The first chapter introduces the meat processing industry and…

  4. 40 CFR 60.2025 - What if my chemical recovery unit is not listed in § 60.2020(n)?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... materials that are recovered. (3) A description (including a process flow diagram) of the process in which... process. (4) A description (including a process flow diagram) of the chemical constituent recovery process...

  5. 40 CFR 60.2025 - What if my chemical recovery unit is not listed in § 60.2020(n)?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... materials that are recovered. (3) A description (including a process flow diagram) of the process in which... process. (4) A description (including a process flow diagram) of the chemical constituent recovery process...

  6. 40 CFR 60.2025 - What if my chemical recovery unit is not listed in § 60.2020(n)?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... materials that are recovered. (3) A description (including a process flow diagram) of the process in which... process. (4) A description (including a process flow diagram) of the chemical constituent recovery process...

  7. 40 CFR 60.2558 - What if a chemical recovery unit is not listed in § 60.2555(n)?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... materials that are recovered. (3) A description (including a process flow diagram) of the process in which... process. (4) A description (including a process flow diagram) of the chemical constituent recovery process...

  8. Integrated decontamination process for metals

    DOEpatents

    Snyder, Thomas S.; Whitlow, Graham A.

    1991-01-01

    An integrated process for the decontamination of metals, particularly metals used in the nuclear energy industry that are contaminated with radioactive material. The process combines electrorefining and melt refining to purify metals that can be decontaminated using either of those processes.

  9. Case Studies in Continuous Process Improvement

    NASA Technical Reports Server (NTRS)

    Mehta, A.

    1997-01-01

    This study focuses on improving the SMT assembly process in a low-volume, high-reliability environment with emphasis on fine pitch and BGA packages. Before a process improvement is carried out, it is important to evaluate where the process stands in terms of process capability.

  10. Mathematical Model of Nonstationary Separation Processes Proceeding in the Cascade of Gas Centrifuges in the Process of Separation of Multicomponent Isotope Mixtures

    NASA Astrophysics Data System (ADS)

    Orlov, A. A.; Ushakov, A. A.; Sovach, V. P.

    2017-03-01

    We have developed and implemented in software a mathematical model of the nonstationary separation processes proceeding in cascades of gas centrifuges during the separation of multicomponent isotope mixtures. With the use of this model, the parameters of the separation process for germanium isotopes have been calculated. It has been shown that the model adequately describes the nonstationary processes in the cascade and is suitable for calculating their parameters during the separation of multicomponent isotope mixtures.

  11. International Best Practices for Pre-Processing and Co-Processing Municipal Solid Waste and Sewage Sludge in the Cement Industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hasanbeigi, Ali; Lu, Hongyou; Williams, Christopher

    The purpose of this report is to describe international best practices for pre-processing and co-processing of MSW and sewage sludge in cement plants, for the benefit of countries that wish to develop co-processing capacity. The report is divided into three main sections. Section 2 describes the fundamentals of co-processing, Section 3 describes exemplary international regulatory and institutional frameworks for co-processing, and Section 4 describes international best practices related to the technological aspects of co-processing.

  12. Thermochemical water decomposition processes

    NASA Technical Reports Server (NTRS)

    Chao, R. E.

    1974-01-01

    Thermochemical processes which lead to the production of hydrogen and oxygen from water without the consumption of any other material have a number of advantages when compared to other processes such as water electrolysis. It is possible to operate a sequence of chemical steps with net work requirements equal to zero at temperatures well below the temperature required for water dissociation in a single step. Various types of procedures are discussed, giving attention to halide processes, reverse Deacon processes, iron oxide and carbon oxide processes, and metal and alkali metal processes. Economic questions are also considered.
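    The defining property of such a cycle is that all intermediate species cancel, leaving water splitting as the net reaction. The sketch below checks this for a textbook two-step iron oxide cycle (Fe3O4 → 3 FeO + ½ O2, then 3 FeO + H2O → Fe3O4 + H2); this example cycle is illustrative and not necessarily one of those analyzed in the paper.

    ```python
    from collections import Counter

    def net_reaction(steps):
        """Sum species over all steps: reactants count negative, products positive.
        Species that cancel (the recycled intermediates) drop out of the result."""
        net = Counter()
        for reactants, products in steps:
            for species, moles in reactants.items():
                net[species] -= moles
            for species, moles in products.items():
                net[species] += moles
        return {species: moles for species, moles in net.items() if moles}

    # Two-step iron oxide cycle: high-temperature reduction, then hydrolysis.
    cycle = [
        ({"Fe3O4": 1}, {"FeO": 3, "O2": 0.5}),
        ({"FeO": 3, "H2O": 1}, {"Fe3O4": 1, "H2": 1}),
    ]
    net = net_reaction(cycle)
    # Fe3O4 and FeO cancel; the net reaction is H2O -> H2 + 1/2 O2.
    ```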

  13. Voyager image processing at the Image Processing Laboratory

    NASA Astrophysics Data System (ADS)

    Jepsen, P. L.; Mosher, J. A.; Yagi, G. M.; Avis, C. C.; Lorre, J. J.; Garneau, G. W.

    1980-09-01

    This paper discusses new digital processing techniques as applied to the Voyager Imaging Subsystem and devised to explore atmospheric dynamics, spectral variations, and the morphology of Jupiter, Saturn and their satellites. Radiometric and geometric decalibration processes, the modulation transfer function, and processes to determine and remove photometric properties of the atmosphere and surface of Jupiter and its satellites are examined. It is shown that selected images can be processed into 'approach at constant longitude' time lapse movies which are useful in observing atmospheric changes of Jupiter. Photographs are included to illustrate various image processing techniques.

  14. Voyager image processing at the Image Processing Laboratory

    NASA Technical Reports Server (NTRS)

    Jepsen, P. L.; Mosher, J. A.; Yagi, G. M.; Avis, C. C.; Lorre, J. J.; Garneau, G. W.

    1980-01-01

    This paper discusses new digital processing techniques as applied to the Voyager Imaging Subsystem and devised to explore atmospheric dynamics, spectral variations, and the morphology of Jupiter, Saturn and their satellites. Radiometric and geometric decalibration processes, the modulation transfer function, and processes to determine and remove photometric properties of the atmosphere and surface of Jupiter and its satellites are examined. It is shown that selected images can be processed into 'approach at constant longitude' time lapse movies which are useful in observing atmospheric changes of Jupiter. Photographs are included to illustrate various image processing techniques.

  15. A novel process control method for a TT-300 E-Beam/X-Ray system

    NASA Astrophysics Data System (ADS)

    Mittendorfer, Josef; Gallnböck-Wagner, Bernhard

    2018-02-01

    This paper presents some aspects of the process control method for a TT-300 E-Beam/X-Ray system at Mediscan, Austria. The novelty of the approach is the seamless integration of routine monitoring dosimetry with process data. This makes it possible to calculate a parametric dose for each production unit and, consequently, to perform fine-grained and holistic process performance monitoring. Process performance is documented in process control charts for the analysis of individual runs as well as for historic trending of runs of specific process categories over a specified time range.
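    The control-chart idea can be sketched as follows using the common individuals-chart rule (center line ± 3 sigma from a baseline of in-control runs). This is a generic illustration of control charting, not the paper's method or Mediscan data; the dose values are invented.

    ```python
    from statistics import mean, stdev

    def control_limits(baseline):
        """Center line and +/- 3-sigma control limits from in-control baseline runs."""
        m, s = mean(baseline), stdev(baseline)
        return m - 3 * s, m, m + 3 * s

    def out_of_control(doses, limits):
        """Indices of runs whose parametric dose falls outside the control limits."""
        lo, _center, hi = limits
        return [i for i, d in enumerate(doses) if d < lo or d > hi]

    # Hypothetical per-run parametric doses in kGy; the last run is anomalous.
    baseline = [25.1, 24.8, 25.3, 24.9, 25.0, 25.2, 24.7]
    limits = control_limits(baseline)
    flagged = out_of_control(baseline + [31.0], limits)  # flags index 7 only
    ```

    Computing the limits from a baseline of known-good runs, rather than from all runs, keeps an anomalous run from inflating the estimated sigma and masking itself.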

  16. A minimally processed dietary pattern is associated with lower odds of metabolic syndrome among Lebanese adults.

    PubMed

    Nasreddine, Lara; Tamim, Hani; Itani, Leila; Nasrallah, Mona P; Isma'eel, Hussain; Nakhoul, Nancy F; Abou-Rizk, Joana; Naja, Farah

    2018-01-01

    To (i) estimate the consumption of minimally processed, processed and ultra-processed foods in a sample of Lebanese adults; (ii) explore patterns of intakes of these food groups; and (iii) investigate the association of the derived patterns with cardiometabolic risk. Cross-sectional survey. Data collection included dietary assessment using an FFQ and biochemical, anthropometric and blood pressure measurements. Food items were categorized into twenty-five groups based on the NOVA food classification. The contribution of each food group to total energy intake (TEI) was estimated. Patterns of intakes of these food groups were examined using exploratory factor analysis. Multivariate logistic regression analysis was used to evaluate the associations of derived patterns with cardiometabolic risk factors. Greater Beirut area, Lebanon. Adults ≥18 years (n 302) with no prior history of chronic diseases. Of TEI, 36·53 and 27·10 % were contributed by ultra-processed and minimally processed foods, respectively. Two dietary patterns were identified: the 'ultra-processed' and the 'minimally processed/processed'. The 'ultra-processed' consisted mainly of fast foods, snacks, meat, nuts, sweets and liquor, while the 'minimally processed/processed' consisted mostly of fruits, vegetables, legumes, breads, cheeses, sugar and fats. Participants in the highest quartile of the 'minimally processed/processed' pattern had significantly lower odds for metabolic syndrome (OR=0·18, 95 % CI 0·04, 0·77), hyperglycaemia (OR=0·25, 95 % CI 0·07, 0·98) and low HDL cholesterol (OR=0·17, 95 % CI 0·05, 0·60). The study findings may be used for the development of evidence-based interventions aimed at encouraging the consumption of minimally processed foods.

  17. Increasing patient safety and efficiency in transfusion therapy using formal process definitions.

    PubMed

    Henneman, Elizabeth A; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Andrzejewski, Chester; Merrigan, Karen; Cobleigh, Rachel; Frederick, Kimberly; Katz-Bassett, Ethan; Henneman, Philip L

    2007-01-01

    The administration of blood products is a common, resource-intensive, and potentially problem-prone area that may place patients at elevated risk in the clinical setting. Much of the emphasis in transfusion safety has been targeted toward quality control measures in laboratory settings where blood products are prepared for administration as well as in automation of certain laboratory processes. In contrast, the process of transfusing blood in the clinical setting (ie, at the point of care) has essentially remained unchanged over the past several decades. Many of the currently available methods for improving the quality and safety of blood transfusions in the clinical setting rely on informal process descriptions, such as flow charts and medical algorithms, to describe medical processes. These informal descriptions, although useful in presenting an overview of standard processes, can be ambiguous or incomplete. For example, they often describe only the standard process and leave out how to handle possible failures or exceptions. One alternative to these informal descriptions is to use formal process definitions, which can serve as the basis for a variety of analyses because these formal definitions offer precision in the representation of all possible ways that a process can be carried out in both standard and exceptional situations. Formal process definitions have not previously been used to describe and improve medical processes, and their use to prospectively identify potential errors and improve the transfusion process has not previously been reported. The purpose of this article is to introduce the concept of formally defining processes and to describe how formal definitions of blood transfusion processes can be used to detect and correct transfusion process errors in ways not currently possible using existing quality improvement methods.
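    The contrast drawn above between informal flow charts and formal process definitions can be sketched as follows: a formal definition declares an explicit exception path for every step, which makes it mechanically checkable that no failure is left unhandled. The step names and representation are invented for illustration and are not the formal-definition language used by the authors.

    ```python
    # A toy formal definition of a transfusion process: every step names both
    # its normal successor and its exception handler (hypothetical step names).
    TRANSFUSION_PROCESS = {
        "verify_order":     {"next": "check_patient_id", "on_failure": "abort"},
        "check_patient_id": {"next": "match_blood_unit", "on_failure": "re_identify"},
        "re_identify":      {"next": "match_blood_unit", "on_failure": "abort"},
        "match_blood_unit": {"next": "transfuse",        "on_failure": "return_unit"},
        "return_unit":      {"next": "abort",            "on_failure": "abort"},
        "transfuse":        {"next": "done",             "on_failure": "stop_and_notify"},
        "stop_and_notify":  {"next": "done",             "on_failure": "abort"},
    }

    def undefined_failure_paths(process):
        """Steps whose exception handler is not itself a defined step or terminal."""
        states = set(process) | {"done", "abort"}
        return [step for step, t in process.items() if t["on_failure"] not in states]

    # An analysis pass over the formal definition: an empty result means every
    # failure path leads somewhere defined, a guarantee a flow chart cannot give.
    missing = undefined_failure_paths(TRANSFUSION_PROCESS)
    ```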

  18. Chemical interaction matrix between reagents in a Purex based process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brahman, R.K.; Hennessy, W.P.; Paviet-Hartmann, P.

    2008-07-01

    The United States Department of Energy (DOE) is the entity responsible for the disposal of the United States' excess weapons-grade plutonium. DOE selected a PUREX-based process to convert plutonium to low-enriched mixed oxide fuel for use in commercial nuclear power plants. To initiate this process in the United States, a Mixed Oxide (MOX) Fuel Fabrication Facility (MFFF) is under construction and will be operated by Shaw AREVA MOX Services at the Savannah River Site. This facility will be licensed and regulated by the U.S. Nuclear Regulatory Commission (NRC). A PUREX process, similar to the one used at La Hague, France, will purify plutonium feedstock through solvent extraction. MFFF employs two major process operations to manufacture MOX fuel assemblies: (1) the Aqueous Polishing (AP) process, which removes gallium and other impurities from plutonium feedstock, and (2) the MOX fuel fabrication process (MP), which processes the oxides into pellets and manufactures the MOX fuel assemblies. The AP process consists of three major steps (dissolution, purification, and conversion) and is the center of the primary chemical processing. A study of process hazard controls has been initiated that will provide knowledge of, and protection against, the chemical risks associated with mixing of reagents over the lifetime of the process. This paper presents a comprehensive chemical interaction matrix evaluation for the reagents used in the PUREX-based process. The chemical interaction matrix supplements the process conditions by providing a checklist of potential inadvertent chemical reactions that may take place. It also identifies the chemical compatibility or incompatibility of the reagents if mixed through failure of operations or equipment within the process itself, or mixed inadvertently by a technician in the laboratories.

  19. Ultra-processed foods have the worst nutrient profile, yet they are the most available packaged products in a sample of New Zealand supermarkets.

    PubMed

    Luiten, Claire M; Steenhuis, Ingrid Hm; Eyles, Helen; Ni Mhurchu, Cliona; Waterlander, Wilma E

    2016-02-01

    To examine the availability of packaged food products in New Zealand supermarkets by level of industrial processing, nutrient profiling score (NPSC), price (energy, unit and serving costs) and brand variety. Secondary analysis of cross-sectional survey data on packaged supermarket food and non-alcoholic beverages. Products were classified according to level of industrial processing (minimally, culinary and ultra-processed) and their NPSC. Packaged foods available in four major supermarkets in Auckland, New Zealand. Packaged supermarket food products for the years 2011 and 2013. The majority (84% in 2011 and 83% in 2013) of packaged foods were classified as ultra-processed. A significant positive association was found between the level of industrial processing and NPSC, i.e., ultra-processed foods had a worse nutrient profile (NPSC=11.63) than culinary processed foods (NPSC=7.95), which in turn had a worse nutrient profile than minimally processed foods (NPSC=3.27), P<0.001. No clear associations were observed between the three price measures and level of processing. The study observed many variations of virtually the same product. The ten largest food manufacturers produced 35% of all packaged foods available. In New Zealand supermarkets, ultra-processed foods comprise the largest proportion of packaged foods and are less healthy than less processed foods. The lack of significant price difference between ultra- and less processed foods suggests ultra-processed foods might provide time-poor consumers with more value for money. These findings highlight the need to improve the supermarket food supply by reducing numbers of ultra-processed foods and by reformulating products to improve their nutritional profile.

  20. Trends in consumption of ultra-processed foods and obesity in Sweden between 1960 and 2010.

    PubMed

    Juul, Filippa; Hemmingsson, Erik

    2015-12-01

    To investigate how consumption of ultra-processed foods has changed in Sweden in relation to obesity. Nationwide ecological analysis of changes in processed foods along with corresponding changes in obesity. Trends in per capita food consumption during 1960-2010 were investigated using data from the Swedish Board of Agriculture. Food items were classified as group 1 (unprocessed/minimally processed), group 2 (processed culinary ingredients) or group 3 (3·1, processed food products; and 3·2, ultra-processed products). Obesity prevalence data were pooled from the peer-reviewed literature, Statistics Sweden and the WHO Global Health Observatory. Nationwide analysis in Sweden, 1960-2010. Swedish nationals aged 18 years and older. During the study period consumption of group 1 foods (minimal processing) decreased by 2 %, while consumption of group 2 foods (processed ingredients) decreased by 34 %. Consumption of group 3·1 foods (processed food products) increased by 116 % and group 3·2 foods (ultra-processed products) increased by 142 %. Among ultra-processed products, there were particularly large increases in soda (315 %; 22 v. 92 litres/capita per annum) and snack foods such as crisps and candies (367 %; 7 v. 34 kg/capita per annum). In parallel to these changes in ultra-processed products, rates of adult obesity increased from 5 % in 1980 to over 11 % in 2010. The consumption of ultra-processed products (i.e. foods with low nutritional value but high energy density) has increased dramatically in Sweden since 1960, which mirrors the increased prevalence of obesity. Future research should clarify the potential causal role of ultra-processed products in weight gain and obesity.
