IDEA Special Education Resolution Meetings. A Guide for Parents of Children & Youth (Ages 3-21)
ERIC Educational Resources Information Center
Center for Appropriate Dispute Resolution in Special Education (CADRE), 2014
2014-01-01
A resolution meeting is a dispute resolution process that takes place after a parent files a due process complaint. Resolution meetings offer parents and school districts the opportunity to resolve issues before a due process hearing happens. This publication describes Resolution Meetings generally for Part B of the Individuals with Disabilities…
ERIC Educational Resources Information Center
Hendley, Tom
1995-01-01
Discussion of digital document image processing focuses on issues and options associated with greyscale and color image processing. Topics include speed; size of original document; scanning resolution; markets for different categories of scanners, including photographic libraries, publishing, and office applications; hybrid systems; data…
Engineering the Business of Defense Acquisition: An Analysis of Program Office Processes
2015-04-30
ability to stay focused on the decision outcome rather than procrastinate and wait for a time-dependent resolution. Another critical aspect of...instituted an effective issue resolution process. Conflict, left unmanaged, tended to result in further procrastination and less effective outcomes
Defense AT and L Magazine. Vol. 46, no. 2, March-April 2017
2017-03-01
Regulation Supplement, system and reporting) are placed within the solicitation. Once under contract, PARCA has a new issue resolution (IR) process...PARCA has a new issue resolution (IR) process that has helped contractors interpret the expectations of the EIA-748 EVMS standard guidelines and EVM...interpretive guide which better clarified the requirements for an earned value management system. Through industry outreach and Web-based clearinghouse
Independent Orbiter Assessment (IOA): CIL issues resolution report, volume 1
NASA Technical Reports Server (NTRS)
Urbanowicz, Kenneth J.; Hinsdale, L. W.; Barnes, J. E.
1988-01-01
The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. This report contains IOA assessment worksheets showing resolution of outstanding IOA CIL issues that were summarized in the IOA FMEA/CIL Assessment Interim Report, dated 9 March 1988. Each assessment worksheet has been updated with CIL issue resolution and rationale. The NASA and Prime Contractor post 51-L FMEA/CIL documentation assessed is believed to be technically accurate and complete. No assessment issues remain that have safety implications. Volume 1 contains worksheets for the following subsystems: Landing and Deceleration Subsystem; Purge, Vent and Drain Subsystem; Active Thermal Control and Life Support Systems; Crew Equipment Subsystem; Instrumentation Subsystem; Data Processing Subsystem; Atmospheric Revitalization Pressure Control Subsystem; Hydraulics and Water Spray Boiler Subsystem; and Mechanical Actuation Subsystem.
A digital gigapixel large-format tile-scan camera.
Ben-Ezra, M
2011-01-01
Although the resolution of single-lens reflex (SLR) and medium-format digital cameras has increased in recent years, applications for cultural-heritage preservation and computational photography require even higher resolutions. Addressing this issue, a large-format camera's large image plane can achieve very high resolution without compromising pixel size and can thus provide high-quality, high-resolution images. This digital large-format tile-scan camera can acquire high-quality, high-resolution images of static scenes. It employs unique calibration techniques and a simple algorithm for focal-stack processing of very large images with significant magnification variations. The camera automatically collects overlapping focal stacks and processes them into a high-resolution, extended-depth-of-field image.
Determining Extension's Role in Controversial Issues: Content, Process, Neither, or Both?
ERIC Educational Resources Information Center
Goerlich, Dan; Walker, Martha A.
2015-01-01
Controversial issues offer Extension faculty opportunities to facilitate community dialogue and apply conflict resolution strategies to help communities achieve higher ground. Handled appropriately, the long-term benefits to the community, the Extension organization, and the faculty member of facilitating public issues outweigh the costs. This…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moiseev, A A; Gelikonov, G V; Terpelov, D A
2014-08-31
An analogy between spectral-domain optical coherence tomography (SD OCT) data and broadband digital holography data is considered. Based on this analogy, a method for processing SD OCT data, which makes it possible to construct images with a lateral resolution in the whole investigated volume equal to the resolution in the in-focus region, is developed. Several issues concerning practical application of the proposed method are discussed. (laser biophotonics)
Trainee underperformance: a guide to achieving resolution.
Rashid, Prem; Grills, Richard; Kuan, Melvyn; Klein, Deborah
2015-05-01
Underperformance and the disharmony it can cause are not commonly faced by trainees. However, when it occurs, a process to recognize and manage the issues compassionately must be put in place. A literature review was undertaken to outline processes and themes in addressing and resolving these types of issues. A PubMed search using 'surgical underperformance' and 'remedial teaching' was used as a broad template to find papers that illustrated key concepts. One thousand four hundred and fifteen papers were identified. In papers where the titles were in line with the stated topic, 294 abstracts were reviewed. Key papers were used to develop themes. Additional cross-referenced papers were also included where relevant. There can be a variety of reasons for trainee underperformance. The root cause is not always clear. Disharmony can result in a surgical unit during this time. The involved trainee as well as the members of the clinical unit may experience a variety of stressors. A systematic process of management can be used to evaluate the situation and bring some resolution to difficulties in working relationships. Early constructive intervention improves outcomes. There should be a process to systematically and compassionately resolve underlying issues. This paper outlines the disharmony that can result from trainee underperformance and offers guidance for resolution to those involved. © 2014 Royal Australasian College of Surgeons.
33 CFR 385.23 - Dispute resolution.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Processes § 385.23 Dispute resolution. (a) Disputes with the non-Federal sponsor concerning a Project... Cooperation Agreement. (b) Disputes with the non-Federal sponsor concerning design activities shall be... issues with the non-Federal sponsor and disputes with the State associated with the implementation of the...
33 CFR 385.23 - Dispute resolution.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Processes § 385.23 Dispute resolution. (a) Disputes with the non-Federal sponsor concerning a Project... Cooperation Agreement. (b) Disputes with the non-Federal sponsor concerning design activities shall be... issues with the non-Federal sponsor and disputes with the State associated with the implementation of the...
33 CFR 385.23 - Dispute resolution.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Processes § 385.23 Dispute resolution. (a) Disputes with the non-Federal sponsor concerning a Project... Cooperation Agreement. (b) Disputes with the non-Federal sponsor concerning design activities shall be... issues with the non-Federal sponsor and disputes with the State associated with the implementation of the...
33 CFR 385.23 - Dispute resolution.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Processes § 385.23 Dispute resolution. (a) Disputes with the non-Federal sponsor concerning a Project... Cooperation Agreement. (b) Disputes with the non-Federal sponsor concerning design activities shall be... issues with the non-Federal sponsor and disputes with the State associated with the implementation of the...
An Optimised System for Generating Multi-Resolution Dtms Using NASA Mro Datasets
NASA Astrophysics Data System (ADS)
Tao, Y.; Muller, J.-P.; Sidiropoulos, P.; Veitch-Michaelis, J.; Yershov, V.
2016-06-01
Within the EU FP-7 iMars project, a fully automated multi-resolution DTM processing chain, called Co-registration ASP-Gotcha Optimised (CASP-GO), has been developed, based on the open-source NASA Ames Stereo Pipeline (ASP). CASP-GO includes tiepoint-based multi-resolution image co-registration and an adaptive least-squares correlation-based sub-pixel refinement method called Gotcha. The implemented system guarantees global geo-referencing compliance with respect to HRSC (and thence to MOLA), and provides refined stereo-matching completeness and accuracy based on the ASP normalised cross-correlation. We summarise issues discovered while experimenting with the open-source ASP DTM processing chain and introduce our new working solutions. These issues include global co-registration accuracy, de-noising, dealing with matching failures, matching confidence estimation, outlier definition and rejection schemes, various DTM artefacts, uncertainty estimation, and quality-efficiency trade-offs.
Conflict Resolution, Diversity, and Social Justice.
ERIC Educational Resources Information Center
Townley, Annette
1994-01-01
Briefly explores the issue of conflict resolution (CR) in education; the introduction of CR into the public schools; and whether CR processes such as mediation meet the needs of nondominant groups. It also introduces several articles that discuss specific approaches to the development and implementation of CR programs in schools. (GLR)
High resolution modeling of a small urban catchment
NASA Astrophysics Data System (ADS)
Skouri-Plakali, Ilektra; Ichiba, Abdellah; Gires, Auguste; Tchiguirinskaia, Ioulia; Schertzer, Daniel
2016-04-01
Flooding is one of the most complex issues that urban environments have to deal with. In France, flooding remains the first natural risk, with 72% of decrees of state of natural disaster issued between October 1982 and mid-November 2014. Flooding results from meteorological extremes that are usually aggravated by the hydrological behavior of urban catchments and by human factors. The continuing urbanization process is indeed changing the whole urban water cycle by limiting infiltration and promoting runoff. Urban environments are very complex systems because of their extreme variability, the interference between human activities and natural processes, and the effect of the ongoing urbanization process, which changes the landscape and strongly influences their hydrologic behavior. Moreover, many recent works highlight the need to simulate all urban water processes at their specific temporal and spatial scales. However, accounting for the heterogeneity of urban catchments is still challenging for urban hydrology, even after the advances made in high-resolution data collection and computational resources. This issue relates more to the architecture of the urban models being used and how far these models can take into account the extreme variability of urban catchments. In this work, high spatio-temporal resolution modeling is performed for a small and well-equipped urban catchment. The aim is to identify urban modeling needs in terms of spatial and temporal resolution, especially for a very small urban area (a 3.7 ha urban catchment located in the city of Perreux-sur-Marne, southeast of Paris). The MultiHydro model was selected to carry out this work; it is a physically based and fully distributed model that couples four existing modules, each representing a portion of the water cycle in urban environments. MultiHydro was implemented at 10 m, 5 m and 2 m resolution.
Simulations were performed at different spatio-temporal resolutions and analyzed against real flow measurements. First results show improvements in model performance at high spatio-temporal resolution.
Landsat 8 Multispectral and Pansharpened Imagery Processing on the Study of Civil Engineering Issues
NASA Astrophysics Data System (ADS)
Lazaridou, M. A.; Karagianni, A. Ch.
2016-06-01
Scientific and professional interests of civil engineering mainly include structures, hydraulics, geotechnical engineering, environment, and transportation issues. Topics in this context may concern urban environment issues, urban planning, hydrological modelling, the study of hazards, and road construction. Land cover information contributes significantly to the study of the above subjects. It can be acquired effectively by visual interpretation of satellite imagery, either directly or after applying enhancement routines, and also by image classification. The Landsat Data Continuity Mission (LDCM - Landsat 8) is the latest satellite in the Landsat series, launched in February 2013. Landsat 8 medium-spatial-resolution multispectral imagery is of particular interest for extracting land cover because of its fine spectral resolution, its radiometric quantization of 12 bits, the capability of merging the 15-meter panchromatic band with the 30-meter multispectral imagery, and the free data policy. In this paper, Landsat 8 multispectral and panchromatic imagery covering the surroundings of a lake in north-western Greece is used. Land cover information is extracted using suitable digital image processing software. The rich spectral content of the multispectral image is combined with the high spatial resolution of the panchromatic image by applying image fusion (pansharpening), facilitating visual image interpretation to delineate land cover. Further processing concerns supervised image classification: classification of the pansharpened image preceded classification of the multispectral image. Corresponding comparative considerations are also presented.
Antecedent Frequency Effects on Anaphoric Pronoun Resolution: Evidence from Spanish
ERIC Educational Resources Information Center
Egusquiza, Nerea; Navarrete, Eduardo; Zawiszewski, Adam
2016-01-01
High-frequency words are usually understood and produced faster than low-frequency words. Although the effect of word frequency is a reliable phenomenon in many domains of language processing, it remains unclear whether and how frequency affects pronominal anaphoric resolution. We evaluated this issue by means of two self-paced reading…
Problems and Processes in Medical Encounters: The CASES method of dialogue analysis
Laws, M. Barton; Taubin, Tatiana; Bezreh, Tanya; Lee, Yoojin; Beach, Mary Catherine; Wilson, Ira B.
2013-01-01
Objective: To develop methods to reliably capture structural and dynamic temporal features of clinical interactions. Methods: Observational study of 50 audio-recorded routine outpatient visits to HIV specialty clinics, using innovative analytic methods. The Comprehensive Analysis of the Structure of Encounters System (CASES) uses transcripts coded for speech acts, then imposes larger-scale structural elements: threads – the problems or issues addressed; and processes within threads – basic tasks of clinical care labeled Presentation, Information, Resolution (decision making) and Engagement (interpersonal exchange). Threads are also coded for the nature of resolution. Results: 61% of utterances are in presentation processes. Provider verbal dominance is greatest in information and resolution processes, which also contain a high proportion of provider directives. About half of threads result in no action or decision. Information flows predominantly from patient to provider in presentation processes, and from provider to patient in information processes. Engagement is rare. Conclusions: In these data, resolution is provider centered; more time for patient participation in resolution, or interpersonal engagement, would have to come from presentation. Practice Implications: Awareness of the use of time in clinical encounters, and the interaction processes associated with various tasks, may help make clinical communication more efficient and effective. PMID:23391684
ERIC Educational Resources Information Center
Howell, Robert E.; And Others
A model and supportive materials are presented for design and implementation of a program for involving citizens in decision-making concerning significant environmental issues. Chapter topics include: why citizen involvement? (potential benefits of the process); theoretical basis for citizen involvement (three fundamental perspectives underlying…
Building a Narrative Based Requirements Engineering Mediation Model
NASA Astrophysics Data System (ADS)
Ma, Nan; Hall, Tracy; Barker, Trevor
This paper presents a narrative-based Requirements Engineering (RE) mediation model to help RE practitioners effectively identify, define, and resolve conflicts of interest, goals, and requirements. Within the SPI community, there is a common belief that social, human, and organizational issues significantly impact the effectiveness of software process improvement in general and the requirements engineering process in particular. Conflicts among different stakeholders are an important human and social issue that needs more research attention in the SPI and RE communities. Drawing on the conflict resolution literature and the IS literature, we argue that conflict resolution in RE is a mediated process, in which a requirements engineer can act as a mediator among different stakeholders. To address the socio-psychological aspects of conflict in RE and SPI, Winslade and Monk's (2000) narrative mediation model is introduced, justified, and translated into the context of RE.
Special Issue on Optochemical and Optogenetic Control of Cellular Processes.
Deiters, Alexander
2018-06-06
Diverse optochemical and optobiological approaches are being developed and applied to the light-regulation of cellular processes with exquisite spatial and temporal resolution in cells and multicellular model organisms. In this special issue, experts report some of the latest progress in the expanding field of the optical control of biological systems and present an overview of the state of the art of select approaches. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zank, G.D.
1989-05-01
The relationship between strategic offensive capabilities (reflected in the SIOP) and emerging strategic defensive capabilities (reflected by SDI) is not being adequately addressed. A summary of the existing nuclear war planning process is provided, and an analogous defensive process is postulated. Parallels and differences between the two processes are discussed. Potential areas for information exchange and cooperation are identified to enhance deterrence and improve war-fighting potential. Operational, technical and political issues requiring resolution are raised, and recommendations to resolve these issues are made.
Howe, Nina; Rinaldi, Christina M; Jennings, Melissa; Petrakos, Harriet
2002-01-01
Associations among constructive and destructive sibling conflict, pretend play, internal state language, and sibling relationship quality were investigated in 40 middle-class dyads with a kindergarten-age child (M age = 5.7 years). In 20 dyads the sibling was older (M age = 7.1 years) and in 20 dyads the sibling was younger (M age = 3.6 years). Dyads were videotaped playing with a farm set for 15 min; transcribed sessions were coded for (1) five types of conflict issues; (2) constructive, destructive, and passive resolution strategies; and (3) verbal and physical aggression. Measures of pretend play enactment, low- and high-level pretense negotiation strategies, and internal state language were also based on the transcripts. The Sibling Behavior and Feelings Questionnaire was used to assess both siblings' perceptions of sibling relationship quality. Findings revealed that conflict issues, aggression, and internal state language were associated with specific resolution strategies. Associations were evident between conflict issues and resolutions. Moreover, conflict issues and resolutions were associated with (1) relationship quality, (2) high-level pretense negotiation, and (3) internal state language employed in both play and conflict. Findings are discussed in light of recent theory on developmental processes operating within children's relationships.
King, Michael S
2008-12-01
Increasingly courts are using new approaches that promote a more comprehensive resolution of legal problems, minimise any negative effects that legal processes have on participant wellbeing and/or that use legal processes to promote participant wellbeing. Therapeutic jurisprudence, restorative justice, mediation and problem-solving courts are examples. This article suggests a model for the use of these processes in the coroner's court to minimise negative effects of coroner's court processes on the bereaved and to promote a more comprehensive resolution of matters at issue, including the determination of the cause of death and the public health and safety promotion role of the coroner.
7 CFR 1486.207 - What is the Technical Issues Resolution Fund?
Code of Federal Regulations, 2010 CFR
2010-01-01
... strategic areas of longer term interest. Funding decisions are determined primarily through a review process... CREDIT CORPORATION, DEPARTMENT OF AGRICULTURE LOANS, PURCHASES, AND OTHER OPERATIONS EMERGING MARKETS...
Adolescents and Music Lyrics: Implications of a Cognitive Perspective.
ERIC Educational Resources Information Center
Desmond, Roger Jon
1987-01-01
Addresses the relevance of cognitive research in memory processes and auditory information processing for the resolution of policy issues concerning the regulation of popular music. Offers several assumptions regarding music listening, and suggests appropriate research methods for resolving questions surrounding music listening and regulation.…
Loran-C performance assurance assessment program
NASA Technical Reports Server (NTRS)
Lilley, Robert W.; Brooks, N. Kent
1992-01-01
The Federal Aviation Administration (FAA) has accepted the Loran-C navigation system as a supplemental navigation aid for enroute use. Extension of Loran-C utilization to instrument approaches requires establishment of a process by which the current level of performance of the system is always known by the pilot. This system 'integrity' translates into confidence that, if the system is made available to the pilot, the guidance will be correct. Early in the consideration of Loran-C for instrument approaches, the Loran-C Planning Work-Group (LPW) was formed with membership from the FAA, the US Coast Guard, various state governments, aviation users, equipment manufacturers and technical experts. The group was hosted and co-chaired by the National Association of State Aviation Officials (NASAO). This forum was ideal for identification of system integrity issues and for finding the correct process for their resolution. Additionally, the Wild Goose Association (WGA), which is the international Loran-C technical and user forum, regularly brings together members of the FAA, Coast Guard, and the scientific community. Papers and discussions from WGA meetings have been helpful. Given here is a collection of the issues in which Ohio University became involved. Issue definition and resolution are included, along with recommendations in those areas where resolution is not yet complete.
ERIC Educational Resources Information Center
Loch, John R.
2003-01-01
Outlines problems in continuing higher education, suggesting that it lacks (1) a standard name; (2) a unified voice on national issues; (3) a standard set of roles and functions; (4) a standard title for the chief administrative officer; (5) an accreditation body and process; and (6) resolution of the centralization/decentralization issue. (SK)
Towards a real-time wide area motion imagery system
NASA Astrophysics Data System (ADS)
Young, R. I.; Foulkes, S. B.
2015-10-01
It is becoming increasingly important in both the defence and security domains to conduct persistent wide area surveillance (PWAS) of large populations of targets. Wide Area Motion Imagery (WAMI) is a key technique for achieving this wide area surveillance. The recent development of multi-million-pixel sensors has provided sensors with wide fields of view and sufficient resolution for objects of interest to be detected and tracked across these extended areas of interest. WAMI sensors simultaneously provide high spatial and temporal resolution, giving extreme pixel counts over large geographical areas. The high temporal resolution is required to enable effective tracking of targets. The provision of wide area coverage at high frame rates generates data-deluge issues; these are especially profound if the sensor is mounted on an airborne platform, with finite data-link bandwidth and processing power constrained by size, weight and power (SWAP) limitations. These issues manifest themselves either as bottlenecks in the transmission of the imagery off-board or as latency in the time taken to analyse the data due to limited computational processing power.
Donnelly, Lane F; Cherian, Shirley S; Chua, Kimberly B; Thankachan, Sam; Millecker, Laura A; Koroll, Alex G; Bisset, George S
2017-01-01
Because of the increasing complexities of providing imaging for pediatric health care services, a more reliable process to manage the daily delivery of care is necessary. We describe our Daily Readiness Huddle and the effects of the process on problem identification and improvement. Our Daily Readiness Huddle has four elements: metrics review, clinical volume review, daily readiness assessment, and problem accountability. It is attended by radiologists, directors, managers, front-line staff with concerns, representatives from support services (information technology [IT] and biomedical engineering [biomed]), and representatives who join the meeting in a virtual format from off-site locations. Data are visually displayed on erasable whiteboards. The daily readiness assessment uses cues to determine whether anyone has concerns or outlier data in regard to S-MESA (Safety, Methods, Equipment, Supplies or Associates). Through this assessment, problems are identified and categorized as quick hits (to be resolved in 24-48 h without project management) and complex issues. Complex issues are assigned an owner, a quality coach and a report-back date. Additionally, projects are defined as improvements that are often strategic, are anticipated to take more than 60 days, and do not necessarily arise from issues identified during the Daily Readiness Huddle. We tracked and calculated the mean, median and range of days to resolution and completion for complex issues and for projects during the first full year of implementing this process. During the first 12 months, 91 complex issues were identified and resolved, 11 projects were in progress and 33 completed, with 23 other projects active or in planning. Time to resolution of complex issues (in days) was mean 37.5, median 34.0, range 1-105. For projects, time to completion (in days) was mean 86.0, median 84.0, range 5-280.
The Daily Readiness Huddle process has given us a framework to rapidly identify issues, bring accountability to problem-solving, and foster improvement. It has also had a positive effect on team-building and coordination.
Resolution No. 598-87, Regulations for the Functioning of the Land Tenure Registry, 27 October 1987.
1988-01-01
This Resolution sets forth Regulations for the Land Tenancy Registry of Cuba. It provides that the Registry is part of a scheme for exercising control over legal land tenure and has the following objectives: maintaining control of national land; determining the legal situation of holders of land; recognizing the number of legal holders of land; furnishing information on acquisition, exploitation, and buildings; issuing certificates; and analyzing and processing records in appeals. Further provisions of the Resolution lay down details about these functions.
17 CFR 240.3a12-10 - Exemption of certain securities issued by the Resolution Funding Corporation.
Code of Federal Regulations, 2010 CFR
2010-04-01
... securities issued by the Resolution Funding Corporation. 240.3a12-10 Section 240.3a12-10 Commodity and... Exemptions § 240.3a12-10 Exemption of certain securities issued by the Resolution Funding Corporation. Securities that are issued by the Resolution Funding Corporation pursuant to section 21B(f) of the Federal...
Optimization Case Study: ISR Allocation in the Global Force Management Process
2016-09-01
Communications Intelligence (COMINT), and other intelligence collection capabilities. The complexity of FMV force allocation makes FMV the ideal...Joint Staff (2014). This chapter will step through the GFM allocation process and develop an understanding of the GFM process depicted in Figure 1...contentious. The contentious issue will go through a resolution process consisting of action officer and General Officer/Flag Officer (GOFO) level forums
Mask manufacturing of advanced technology designs using multi-beam lithography (Part 1)
NASA Astrophysics Data System (ADS)
Green, Michael; Ham, Young; Dillon, Brian; Kasprowicz, Bryan; Hur, Ik Boum; Park, Joong Hee; Choi, Yohan; McMurran, Jeff; Kamberian, Henry; Chalom, Daniel; Klikovits, Jan; Jurkovic, Michal; Hudek, Peter
2016-10-01
As optical lithography is extended into 10nm and below nodes, advanced designs are becoming a key challenge for mask manufacturers. Techniques including advanced Optical Proximity Correction (OPC) and Inverse Lithography Technology (ILT) result in structures that pose a range of issues across the mask manufacturing process. Among the new challenges are continued shrinking Sub-Resolution Assist Features (SRAFs), curvilinear SRAFs, and other complex mask geometries that are counter-intuitive relative to the desired wafer pattern. Considerable capability improvements over current mask making methods are necessary to meet the new requirements particularly regarding minimum feature resolution and pattern fidelity. Advanced processes using the IMS Multi-beam Mask Writer (MBMW) are feasible solutions to these coming challenges. In this paper, we study one such process, characterizing mask manufacturing capability of 10nm and below structures with particular focus on minimum resolution and pattern fidelity.
Lessons Learned During Solutions of Multidisciplinary Design Optimization Problems
NASA Technical Reports Server (NTRS)
Patnaik, Surya N.; Coroneos, Rula M.; Hopkins, Dale A.; Lavelle, Thomas M.
2000-01-01
Optimization research at NASA Glenn Research Center has addressed the design of structures, aircraft and airbreathing propulsion engines. During solution of the multidisciplinary problems several issues were encountered. This paper lists four issues and discusses the strategies adopted for their resolution: (1) The optimization process can lead to an inefficient local solution. This deficiency was encountered during design of an engine component. The limitation was overcome through an augmentation of animation into optimization. (2) Optimum solutions obtained were infeasible for aircraft and air-breathing propulsion engine problems. Alleviation of this deficiency required a cascading of multiple algorithms. (3) Profile optimization of a beam produced an irregular shape. Engineering intuition restored the regular shape for the beam. (4) The solution obtained for a cylindrical shell by a subproblem strategy converged to a design that can be difficult to manufacture. Resolution of this issue remains a challenge. The issues and resolutions are illustrated through six problems: (1) design of an engine component, (2) synthesis of a subsonic aircraft, (3) operation optimization of a supersonic engine, (4) design of a wave-rotor-topping device, (5) profile optimization of a cantilever beam, and (6) design of a cylindrical shell. The combined effort of designers and researchers can bring the optimization method from academia to industry.
Irestig, Magnus; Timpka, Toomas
2010-02-01
We set out to examine design conflict resolution tactics used in development of large information systems for health services and to outline the design consequences of these tactics. Discourse analysis methods were applied to data collected from meetings conducted during the development of a web-based system in a public health context. We found that low risk tactics were characterized by design issues being managed within the formal mandate and competences of the design group. In comparison, high risk tactics were associated with irresponsible compromises, i.e. decisions being passed on to others or to later phases of the design process. The consequence of this collective disregard of issues such as responsibility and legitimacy is that the system design will be impossible to implement in actual health service contexts. The results imply that downstream responsibility issues have to be continuously dealt with in system development in health services.
NASA Technical Reports Server (NTRS)
Watson, Andrew B.
1988-01-01
Two types of research issues are involved in image management systems with space station applications: image processing research and image perception research. The image processing issues are the traditional ones of digitizing, coding, compressing, storing, analyzing, and displaying, but with a new emphasis on the constraints imposed by the human perceiver. Two image coding algorithms have been developed that may increase the efficiency of image management systems (IMS). Image perception research involves a study of the theoretical and practical aspects of visual perception of electronically displayed images. Issues include how rapidly a user can search through a library of images, how to make this search more efficient, and how to present images in terms of resolution and split screens. Other issues include optimal interface to an IMS and how to code images in a way that is optimal for the human perceiver. A test-bed within which such issues can be addressed has been designed.
NASA Astrophysics Data System (ADS)
Ahmad, Sabrina; Jalil, Intan Ermahani A.; Ahmad, Sharifah Sakinah Syed
2016-08-01
It is seldom technical issues that impede the process of eliciting software requirements. The involvement of multiple stakeholders usually leads to conflicts, and therefore conflict detection and resolution effort is crucial. This paper forwards an improved conceptual model to assist the conflict detection and resolution effort, extending the abilities of the current model and improving overall performance. The significance of the new model is that it empowers the automation of conflict detection, and of determining conflict severity levels, through rule-based reasoning.
Processing Ocean Images to Detect Large Drift Nets
NASA Technical Reports Server (NTRS)
Veenstra, Tim
2009-01-01
A computer program processes the digitized outputs of a set of downward-looking video cameras aboard an aircraft flying over the ocean. The purpose served by this software is to facilitate the detection of large drift nets that have been lost, abandoned, or jettisoned. The development of this software and of the associated imaging hardware is part of a larger effort to develop means of detecting and removing large drift nets before they cause further environmental damage to the ocean and to shores on which they sometimes impinge. The software is capable of near-real-time processing of as many as three video feeds at a rate of 30 frames per second. After a user sets the parameters of an adjustable algorithm, the software analyzes each video stream, detects any anomaly, issues a command to point a high-resolution camera toward the location of the anomaly, and, once the camera has been so aimed, issues a command to trigger the camera shutter. The resulting high-resolution image is digitized, and the resulting data are automatically uploaded to the operator's computer for analysis.
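The detect-then-aim loop described in this record can be sketched in a few lines. This is a hedged illustration only, not the actual flight software: detection here is a simple brightest-outlier test against the frame mean, and the `threshold` parameter stands in for the adjustable algorithm settings mentioned in the abstract.

```python
# Hypothetical sketch of the anomaly-detection step: scan a video frame
# for a pixel that deviates strongly from the frame's mean intensity and
# return where a high-resolution camera should be pointed.

def find_anomaly(frame, threshold):
    """Return the (row, col) of the strongest outlier pixel, or None."""
    n_pixels = len(frame) * len(frame[0])
    mean = sum(sum(row) for row in frame) / n_pixels
    best, best_dev = None, threshold
    for r, row in enumerate(frame):
        for c, v in enumerate(row):
            if abs(v - mean) > best_dev:
                best, best_dev = (r, c), abs(v - mean)
    return best

# A dark-ocean frame with one bright net-like return at the center.
frame = [[10, 11, 10], [10, 95, 11], [11, 10, 10]]
target = find_anomaly(frame, threshold=30)  # -> (1, 1)
```

A real pipeline would run this per frame on each of the three feeds and forward `target` to the camera-pointing command.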
Imaging the cell surface and its organization down to the level of single molecules.
Klenerman, David; Shevchuk, Andrew; Novak, Pavel; Korchev, Yuri E; Davis, Simon J
2013-02-05
Determining the organization of key molecules on the surface of live cells in two dimensions, and how this changes during biological processes such as signalling, is a major challenge in cell biology and requires methods with nanoscale spatial resolution and high temporal resolution. Here, we review biophysical tools, based on scanning ion conductance microscopy and single-molecule fluorescence and the combination of both of these methods, which have recently been developed to address these issues. We then give examples of how these methods have been applied to provide new insights into cell membrane organization and function, and discuss some of the issues that will need to be addressed to further exploit these methods in the future.
14 CFR 17.39 - Default adjudicative process for contract disputes.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Resolution for Acquisition of a joint statement pursuant to § 17.27 which indicates that ADR will not be... notification by any party that the parties have not settled some or all of the dispute issues via ADR, and it...
14 CFR 17.39 - Default adjudicative process for contract disputes.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Resolution for Acquisition of a joint statement pursuant to § 17.27 which indicates that ADR will not be... notification by any party that the parties have not settled some or all of the dispute issues via ADR, and it...
Flight Crew Integration (FCI) ISS Crew Comments Database & Products Summary
NASA Technical Reports Server (NTRS)
Schuh, Susan
2016-01-01
This Crew Debrief Data provides support for design and development of vehicles, hardware, requirements, procedures, processes, issue resolution, lessons learned, consolidation and trending for current Programs; and much of the data is also used to support development of future Programs.
Code of Federal Regulations, 2010 CFR
2010-10-01
... quality of a CAP drug supplied by the approved CAP vendor to be unsatisfactory, then the physician may address the issue first through the approved CAP vendor's grievance process, and second through an... approved CAP vendor's service or the quality of a CAP drug supplied by the approved CAP vendor, then the...
Resolution of an Orbital Issue: A Designed Experiment
NASA Technical Reports Server (NTRS)
Huddleston, Lisa L.
2011-01-01
Design of Experiments (DOE) is a systematic approach to investigation of a system or process. A series of structured tests are designed in which planned changes are made to the input variables of a process or system. The effects of these changes on a pre-defined output are then assessed. DOE is a formal method of maximizing information gained while minimizing resources required.
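The planned-changes idea behind DOE can be made concrete with a minimal two-level full-factorial sketch. The factor names below are hypothetical examples, not from the study; the point is that every low/high combination is planned up front, and main effects fall out of simple averages.

```python
from itertools import product

# A two-level full-factorial design: every combination of coded low (-1)
# and high (+1) settings for each factor, giving 2**3 = 8 planned runs.
factors = {"temperature": (-1, +1), "pressure": (-1, +1), "flow": (-1, +1)}
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]

def main_effect(runs, responses, factor):
    """Average response at the high level minus average at the low level."""
    high = [y for run, y in zip(runs, responses) if run[factor] == +1]
    low = [y for run, y in zip(runs, responses) if run[factor] == -1]
    return sum(high) / len(high) - sum(low) / len(low)
```

Because the design is balanced, each factor's effect is estimated from all eight runs at once, which is the "maximizing information while minimizing resources" property the abstract describes.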
Liang, Phyllis; Gustafsson, Louise; Liddle, Jacki; Fleming, Jennifer
2017-07-01
Family members often assume the role of driver for individuals who are not driving post-acquired brain injury (ABI). Given that return to driving can be unpredictable and uncertain, the impact of driving disruption on family members may vary at different stages post-injury. This study aims to understand the needs and experiences of family members over time during driving disruption following an ABI. A qualitative prospective longitudinal research design was used with semi-structured interviews at recruitment to the study, then 3 and 6 months later. Fourteen family members completed 41 interviews. The longitudinal data revealed four phases of driving disruption: (1) Wait and see, (2) Holding onto a quick fix, (3) No way out, and (4) Resolution and adjustment. The phases described a process of building tension and a need for support and resolution over time. Holding onto a quick fix is a pivotal phase whereby supports, such as engagement in realistic goal setting, are essential to facilitate family members' resolution of driving disruption issues. Family members who see no way out might not actively seek help, and this points to a need for long-term and regular follow-ups. Future research can explore ways to support family members at these key times. Implications for rehabilitation: Health professionals need to facilitate the process of fostering hope in family members to set realistic expectations of return to driving and the duration of driving disruption. It is necessary to follow up with family members even years after ABI, as the issue of driving disruption could escalate to a crisis and family members might not actively seek help. Health professionals can consider both practical support for facilitating transport and emotional support when addressing the issue of driving disruption with family members.
NASA Astrophysics Data System (ADS)
Kim, S. K.; Lee, J.; Zhang, C.; Ames, S.; Williams, D. N.
2017-12-01
Deep learning techniques have been successfully applied to solve many problems in climate and geoscience using massive-scale observed and modeled data. For extreme climate event detection, several models based on deep neural networks have recently been proposed and attain superior performance that overshadows all previous handcrafted, expert-based methods. The issue arising, though, is that accurate localization of events requires high-quality climate data. In this work, we propose a framework capable of detecting and localizing extreme climate events in very coarse climate data. Our framework is based on two models using deep neural networks: (1) Convolutional Neural Networks (CNNs) to detect and localize extreme climate events, and (2) a pixel recursive super resolution model to reconstruct high-resolution climate data from low-resolution climate data. Based on our preliminary work, we have presented two CNNs in our framework for different purposes: detection and localization. Our results using CNNs for extreme climate event detection show that simple neural nets can capture the pattern of extreme climate events with high accuracy from very coarse reanalysis data. However, localization accuracy is relatively low due to the coarse resolution. To resolve this issue, the pixel recursive super resolution model reconstructs the resolution of the input to the localization CNNs. We present the best network using the pixel recursive super resolution model, which synthesizes details of tropical cyclones in ground truth data while enhancing their resolution. This approach not only dramatically reduces the human effort, but also suggests the possibility of reducing the computing cost required for the downscaling process used to increase the resolution of data.
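The two-stage structure of the framework above (coarse detection, then refinement on an upsampled field) can be sketched without the learned models. This is an illustrative stand-in under stated assumptions: detection is reduced to a threshold test and the pixel recursive super resolution model is replaced by nearest-neighbour upsampling, purely to show the data flow.

```python
# Hypothetical sketch of the detect-then-refine pipeline: flag candidate
# extreme-event cells on a coarse grid, then hand a higher-resolution
# version of the field to a (here, omitted) localization step.

def detect_coarse(grid, threshold):
    """Return (row, col) cells whose value exceeds the threshold."""
    return [(r, c) for r, row in enumerate(grid)
            for c, v in enumerate(row) if v > threshold]

def upsample(grid, factor):
    """Nearest-neighbour upsampling, standing in for the learned model."""
    return [[grid[r // factor][c // factor]
             for c in range(len(grid[0]) * factor)]
            for r in range(len(grid) * factor)]

coarse = [[0.1, 0.2], [0.3, 0.9]]        # toy coarse reanalysis field
candidates = detect_coarse(coarse, 0.5)  # coarse detection pass
fine = upsample(coarse, 4)               # 8x8 field for localization
```

In the paper's framework, both stages are learned: a CNN replaces `detect_coarse` and the pixel recursive model replaces `upsample`, but the hand-off between them is the same.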
NASA Astrophysics Data System (ADS)
Vela, Adan Ernesto
2011-12-01
From 2010 to 2030, the number of instrument flight rules aircraft operations handled by Federal Aviation Administration en route traffic centers is predicted to increase from approximately 39 million flights to 64 million flights. The projected growth in air transportation demand is likely to result in traffic levels that exceed the abilities of the unaided air traffic controller in managing, separating, and providing services to aircraft. Consequently, the Federal Aviation Administration, and other air navigation service providers around the world, are making several efforts to improve the capacity and throughput of existing airspaces. Ultimately, the stated goal of the Federal Aviation Administration is to triple the available capacity of the National Airspace System by 2025. In an effort to satisfy air traffic demand through the increase of airspace capacity, air navigation service providers are considering the inclusion of advisory conflict-detection and resolution systems. In a human-in-the-loop framework, advisory conflict-detection and resolution decision-support tools identify potential conflicts and propose resolution commands for the air traffic controller to verify and issue to aircraft. A number of researchers and air navigation service providers hypothesize that the inclusion of combined conflict-detection and resolution tools into air traffic control systems will reduce or transform controller workload and enable the required increases in airspace capacity. In an effort to understand the potential workload implications of introducing advisory conflict-detection and resolution tools, this thesis provides a detailed study of the conflict event process and the implementation of conflict-detection and resolution algorithms. Specifically, the research presented here examines a metric of controller taskload: how many resolution commands an air traffic controller issues under the guidance of a conflict-detection and resolution decision-support tool. 
The goal of the research is to understand how the formulation, capabilities, and implementation of conflict-detection and resolution tools affect the controller taskload (system demands) associated with the conflict-resolution process, and implicitly the controller workload (physical and psychological demands). Furthermore, this thesis seeks to establish best practices for the design of future conflict-detection and resolution systems. To generalize conclusions on the conflict-resolution taskload and best design practices of conflict-detection and resolution systems, this thesis focuses on abstracting and parameterizing the behaviors and capabilities of the advisory tools. Ideally, this abstraction of advisory decision-support tools serves as an alternative to exhaustively designing tools, implementing them in high-fidelity simulations, and analyzing their conflict-resolution taskload. Such an approach of simulating specific conflict-detection and resolution systems limits the type of conclusions that can be drawn concerning the design of more generic algorithms. In the process of understanding conflict-detection and resolution systems, evidence in the thesis reveals that the most effective approach to reducing conflict-resolution taskload is to improve conflict-detection systems. Furthermore, studies in this thesis indicate that there is significant flexibility in the design of conflict-resolution algorithms.
Integration of High-resolution Data for Temporal Bone Surgical Simulations
Wiet, Gregory J.; Stredney, Don; Powell, Kimerly; Hittle, Brad; Kerwin, Thomas
2016-01-01
Purpose To report on the state of the art in obtaining high-resolution 3D data of the microanatomy of the temporal bone and to process that data for integration into a surgical simulator. Specifically, we report on our experience in this area and discuss the issues involved to further the field. Data Sources Current temporal bone image acquisition and image processing established in the literature, as well as in-house methodological development. Review Methods We reviewed the current English literature for the techniques used in computer-based temporal bone simulation systems to obtain and process anatomical data for use within the simulation. Search terms included "temporal bone simulation, surgical simulation, temporal bone." Articles were chosen and reviewed that directly addressed data acquisition and processing/segmentation and enhancement, with emphasis given to computer-based systems. We present the results from this review in relationship to our approach. Conclusions High-resolution CT imaging (≤100μm voxel resolution), along with unique image processing and rendering algorithms and structure-specific enhancement, is needed for high-level training and assessment using temporal bone surgical simulators. Higher-resolution clinical scanning and automated processes that run in efficient time frames are needed before these systems can routinely support pre-surgical planning. Additionally, protocols such as that provided in this manuscript need to be disseminated to increase the number and variety of virtual temporal bones available for training and performance assessment. PMID:26762105
Thresholds of Extinction: Simulation Strategies in Environmental Values Education.
ERIC Educational Resources Information Center
Glew, Frank
1990-01-01
Describes a simulation exercise for campers and an accompanying curriculum unit--"Thresholds of Extinction"--that addresses the issues of endangered species. Uses this context to illustrate steps in the process of values development: awareness, gathering data, resolution (decision making), responsibility (acting on values), and…
McMahon, Brian T; Hurley, Jessica E; West, Steven L; Chan, Fong; Roessler, Richard; Rumrill, Phillip D
2008-06-01
This article describes findings from a causal comparative study of the Merit Resolution rate for allegations of Hiring discrimination that were filed with the U.S. Equal Employment Opportunity Commission (EEOC) under Title I of the Americans with Disabilities Act (ADA) between 1992 and 2005. An allegation is the Charging Party's perception of discrimination, but a Merit Resolution is one in which the EEOC has determined that a discriminatory event did indeed occur. A Non-Merit Resolution is an allegation that is closed due to a technicality or lacks sufficient evidence to conclude that discrimination occurred. Merit favors the Charging Party; Non-Merit favors the Employer. The Merit Resolution rate of 19,527 closed Hiring allegations is compared and contrasted to that of 259,680 allegations aggregated from six other prevalent forms of discrimination including Discharge and Constructive Discharge, Reasonable Accommodation, Disability Harassment and Intimidation, and Terms and Conditions of Employment. Tests of Proportion distributed as chi-square are used to form comparisons along a variety of subcategories of Merit and Non-Merit outcomes. The overall Merit Resolution rate for Hiring is 26% compared to Non-Hiring at 20.6%. Employers are less likely to settle claims of hiring discrimination without mediation, and less likely to accept the remedies recommended by the EEOC when hiring discrimination has been determined. Hiring is not an unusual discrimination issue in that the overwhelming majority of allegations are still closed in favor of the Employer. However, it is counterintuitive that hiring has a higher merit resolution rate than other prevalent issues. This finding contradicts the assumption that hiring is an "invisible process." Considering that the EEOC makes merit determinations at a competitive rate, it is clear that hiring is sufficiently transparent.
The Commercial Challenges Of Pacs
NASA Astrophysics Data System (ADS)
Vanden Brink, John A.
1984-08-01
The increasing use of digital imaging techniques creates a need for improved methods of digital processing, communication and archiving. However, the commercial opportunity is dependent on the resolution of a number of issues. These issues include proof that digital processes are more cost effective than present techniques, implementation of information system support in the imaging activity, implementation of industry standards, conversion of analog images to digital formats, definition of clinical needs, the implications of the purchase decision, and technology requirements. In spite of these obstacles, a market is emerging, served by new and existing companies, that may become a $500 million market (U.S.) by 1990 for equipment and supplies.
Demonstration of nanoimprinted hyperlens array for high-throughput sub-diffraction imaging
NASA Astrophysics Data System (ADS)
Byun, Minsueop; Lee, Dasol; Kim, Minkyung; Kim, Yangdoo; Kim, Kwan; Ok, Jong G.; Rho, Junsuk; Lee, Heon
2017-04-01
Overcoming the resolution limit of conventional optics is regarded as the most important issue in optical imaging science and technology. Although hyperlenses, super-resolution imaging devices based on highly anisotropic dispersion relations that allow the access of high-wavevector components, have recently achieved far-field sub-diffraction imaging in real time, the previously demonstrated devices have suffered from the extreme difficulties of both the fabrication process and the placement of non-artificial objects. This results in restrictions on the practical applications of hyperlens devices. While implementing large-scale hyperlens arrays in conventional microscopy is desirable to solve such issues, it has not been feasible to fabricate such large-scale hyperlens arrays with the previously used nanofabrication methods. Here, we suggest a scalable and reliable fabrication process for a large-scale hyperlens device based on direct pattern transfer techniques. We fabricate a 5 cm × 5 cm hyperlens array and experimentally demonstrate that it can resolve sub-diffraction features down to 160 nm under 410 nm wavelength visible light. The array-based hyperlens device will provide a simple solution for much more practical far-field and real-time super-resolution imaging, which can be widely used in optics, biology, medical science, nanotechnology and other closely related interdisciplinary fields.
Addressing spatial scales and new mechanisms in climate impact ecosystem modeling
NASA Astrophysics Data System (ADS)
Poulter, B.; Joetzjer, E.; Renwick, K.; Ogunkoya, G.; Emmett, K.
2015-12-01
Climate change impacts on vegetation distributions are typically addressed using either an empirical approach, such as a species distribution model (SDM), or with process-based methods, for example, dynamic global vegetation models (DGVMs). Each approach has its own benefits and disadvantages. For example, an SDM is constrained by data and few parameters, but does not include adaptation or acclimation processes or other ecosystem feedbacks that may act to mitigate or enhance climate effects. Alternatively, a DGVM includes many mechanisms relating plant growth and disturbance to climate, but simulations are costly to perform at high spatial resolution and there remains large uncertainty in a variety of fundamental physical processes. To address these issues, we present two DGVM-based case studies where (i) high-resolution (1 km) simulations are being performed for vegetation in the Greater Yellowstone Ecosystem using a biogeochemical forest gap model, LPJ-GUESS, and (ii) new mechanisms for simulating tropical tree mortality are being introduced. High-resolution DGVM simulations require not only computing and reorganizing code but also a consideration of scaling issues on vegetation dynamics and stochasticity, and also on disturbance and migration. New mechanisms for simulating forest mortality must consider hydraulic limitations and carbon reserves and their interactions, both in source-sink dynamics and in controlling water potentials. Improving DGVM approaches by addressing spatial-scale challenges and integrating new approaches for estimating forest mortality will provide new insights more relevant for land management, and will possibly reduce uncertainty by representing physical processes in a way more directly comparable to experimental and observational evidence.
USDA-ARS?s Scientific Manuscript database
In recent years, large-scale watershed modeling has been implemented broadly in the field of water resources planning and management. Complex hydrological, sediment, and nutrient processes can be simulated by sophisticated watershed simulation models for important issues such as water resources all...
36 CFR 251.93 - Resolution of issues.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 36 Parks, Forests, and Public Property 2 2010-07-01 2010-07-01 false Resolution of issues. 251.93... issues. (a) Authorized Forest Service officers shall, to the extent practicable and consistent with the... meetings is to discuss any issues or concerns related to the authorized use and to reach a common...
NASA Astrophysics Data System (ADS)
van Geer, Frans C.; Kronvang, Brian; Broers, Hans Peter
2016-09-01
Four sessions on "Monitoring Strategies: temporal trends in groundwater and surface water quality and quantity" at the EGU conferences in 2012, 2013, 2014, and 2015 and a special issue of HESS form the background for this overview of the current state of high-resolution monitoring of nutrients. The overview includes a summary of technologies applied in high-frequency monitoring of nutrients in the special issue. Moreover, we present a new assessment of the objectives behind high-frequency monitoring as classified into three main groups: (i) improved understanding of the underlying hydrological, chemical, and biological processes (PU); (ii) quantification of true nutrient concentrations and loads (Q); and (iii) operational management, including evaluation of the effects of mitigation measures (M). The contributions in the special issue focus on the implementation of high-frequency monitoring within the broader context of policy making and management of water in Europe for support of EU directives such as the Water Framework Directive, the Groundwater Directive, and the Nitrates Directive. The overview presented enabled us to highlight the typical objectives encountered in the application of high-frequency monitoring and to reflect on future developments and research needs in this growing field of expertise.
Single-nm resolution approach by applying DDRP and DDRM
NASA Astrophysics Data System (ADS)
Shibayama, Wataru; Shigaki, Shuhei; Takeda, Satoshi; Nakajima, Makoto; Sakamoto, Rikimaru
2017-03-01
EUV lithography has been desired as the leading technology for 1x or single-nm half-pitch patterning. However, the source power, masks and resist materials still have critical issues for mass production. Especially in resist materials, the RLS trade-off has been the key issue. To overcome this issue, we suggest the Dry Development Rinse Process (DDRP) and Dry Development Rinse Materials (DDRM) as a pattern-collapse mitigation approach. This DDRM performs not only as a pattern-collapse-free material for fine pitches, but also as an etching hard mask against the bottom layer (spin-on carbon: SOC). In this paper, we propose new approaches to achieve high resolution around hp 1X nm line/space and single-nm line patterning. In particular, a semi-isolated 8 nm line was successfully achieved with good LWR (2.5 nm) and an aspect ratio of around 3. This single-nm patterning technique also helped to enhance sensitivity by about 33%. On the other hand, pillar patterning through CH patterns by applying DDRP also showed high resolution below 20 nm pillar CD with good LCDU and high sensitivity. This new DDRP technology can be a promising approach not only for hp 1X nm-level patterning but also for single-nm patterning at N7/N5 and beyond.
NASA Astrophysics Data System (ADS)
Russell, Gale L.; Chernoff, Egan J.
2013-03-01
In mathematics education, there are (at least) two seemingly disparate and unethical issues that have been allowed to continue unresolved for decades: the math wars (traditional versus reform teaching and learning of mathematics) and the marginalisation of Indigenous students within K-12 mathematics. Willie Ermine, an Indigenous scholar, has proposed the use of ethical spaces to explore and analyse occurrences of unethical situations arising between the "intersection of Indigenous law and Canadian Legal systems" (Ermine, Indigenous Law Journal 6(1):193-203, 2007). This paper brings Ermine's notion of ethical spaces to the field of mathematics education research as the theoretical framework for analysing the aforementioned issues. The result of this analysis is a potential single theoretical resolution to both dilemmas that can also serve as a significant factor in the processes of decolonisation.
C. Aguirre-Bravo; Patrick J. Pellicane; Denver P. Burns; Sidney Draggan
2006-01-01
A rational approach to monitoring and assessment is prerequisite for sustainable management of ecosystem resources. This features innovative ways to advance the concept of monitoring ecosystem sustainability across spheres of environmental concern, natural and anthropogenic processes, and other hemispheric issues over a variety of spatial scales and resolution levels....
Colorado Charter Schools Capital Finance Study: Challenges and Opportunities for the Future.
ERIC Educational Resources Information Center
Caldwell, Russell B.; Arrington, Barry
This report discusses strategies that will help charter schools finance their facilities needs. It outlines the history of the Colorado Charter Schools Act, focusing on the contracting process, on dispute resolution and appeals, on renewal, on employee options, and on revenue allocation. The document also examines issues surrounding school…
Digital Photography and Its Impact on Instruction.
ERIC Educational Resources Information Center
Lantz, Chris
Today the chemical processing of film is being replaced by a virtual digital darkroom. Digital image storage makes new levels of consistency possible because its nature is less volatile and more mutable than traditional photography. The potential of digital imaging is great, but issues of disk storage, computer speed, camera sensor resolution,…
Where Does Conflict Management Fit in the System's Leadership Puzzle?
ERIC Educational Resources Information Center
Cook, Vickie S.; Johnston, Linda M.
2008-01-01
Superintendents are faced with conflicts every day. The conflicts arise around issues of personnel, community roles, funding, politics, and work/life balance. Good leadership involves an understanding of how to deal with conflict, whom to involve in the conflict resolution, how to set up structures and processes that ensure conflict doesn't…
Automated conflict resolution issues
NASA Technical Reports Server (NTRS)
Wike, Jeffrey S.
1991-01-01
A discussion is presented of how conflicts for Space Network resources should be resolved in the ATDRSS era. The following topics are presented: a description of how resource conflicts are currently resolved; a description of issues associated with automated conflict resolution; present conflict resolution strategies; and topics for further discussion.
Hyper-resolution monitoring of urban flooding with social media and crowdsourcing data
NASA Astrophysics Data System (ADS)
Wang, Ruo-Qian; Mao, Huina; Wang, Yuan; Rae, Chris; Shaw, Wesley
2018-02-01
Hyper-resolution datasets for urban flooding are rare. This problem prevents detailed flooding risk analysis, urban flooding control, and the validation of hyper-resolution numerical models. We employed social media and crowdsourcing data to address this issue. Natural Language Processing and Computer Vision techniques are applied to the data collected from Twitter and MyCoast (a crowdsourcing app). We found these big data based flood monitoring approaches can complement the existing means of flood data collection. The extracted information is validated against precipitation data and road closure reports to examine the data quality. The two data collection approaches are compared and the two data mining methods are discussed. A series of suggestions is given to improve the data collection strategy.
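The Natural Language Processing step in a social-media flood monitor like the one described above can be illustrated with a minimal keyword filter. This is a hedged sketch, not the authors' pipeline: the keyword list, message format, and coordinates are all illustrative assumptions, and a real system would use trained classifiers rather than exact-match terms.

```python
# Hypothetical keyword-based filter for flood-related messages, each
# paired with a (lat, lon) location as a crowdsourced report might be.
FLOOD_TERMS = {"flood", "flooded", "flooding", "inundated"}

def is_flood_report(text):
    """True if any flood keyword appears in the message."""
    tokens = {t.strip(".,!?").lower() for t in text.split()}
    return bool(tokens & FLOOD_TERMS)

reports = [
    ("Main St is completely flooded after the storm", (37.77, -122.42)),
    ("Great weather for a run today", (37.80, -122.27)),
]
# Keep only messages that look like flood reports, with their locations.
hits = [(text, loc) for text, loc in reports if is_flood_report(text)]
```

The surviving `(text, location)` pairs are what would then be validated against precipitation and road-closure records, as the abstract describes.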
The Specialization of Function: Cognitive and Neural Perspectives
Mahon, Bradford Z.; Cantlon, Jessica F.
2014-01-01
A unifying theme that cuts across all research areas and techniques in the cognitive and brain sciences is whether there is specialization of function at levels of processing that are ‘abstracted away’ from sensory inputs and motor outputs. Any theory that articulates claims about specialization of function in the mind/brain confronts the following types of interrelated questions, each of which carries with it certain theoretical commitments. What methods are appropriate for decomposing complex cognitive and neural processes into their constituent parts? How do cognitive processes map onto neural processes, and at what resolution are they related? What types of conclusions can be drawn about the structure of mind from dissociations observed at the neural level, and vice versa? The contributions that form this Special Issue of Cognitive Neuropsychology represent recent reflections on these and other issues from leading researchers in different areas of the cognitive and brain sciences. PMID:22185234
NASA Astrophysics Data System (ADS)
Blok, A. S.; Bukhenskii, A. F.; Krupitskii, É. I.; Morozov, S. V.; Pelevin, V. Yu; Sergeenko, T. N.; Yakovlev, V. I.
1995-10-01
An investigation is reported of acousto-optical and fibre-optic Fourier processors of electric signals, based on semiconductor lasers. A description is given of practical acousto-optical processors with an analysis band 120 MHz wide, a resolution of 200 kHz, and 7 cm × 8 cm × 18 cm dimensions. Fibre-optic Fourier processors are considered: they represent a new class of devices which are promising for the processing of gigahertz signals.
Landsat continuity: issues and opportunities for land cover monitoring
Michael A. Wulder; Joanne C. White; Samuel N. Goward; Jeffrey G. Masek; James R. Irons; Martin Herold; Warren B. Cohen; Thomas R. Loveland; Curtis E. Woodcock
2008-01-01
Initiated in 1972, the Landsat program has provided a continuous record of Earth observation for 35 years. The assemblage of Landsat spatial, spectral, and temporal resolutions, over a reasonably sized image extent, results in imagery that can be processed to represent land cover over large areas with an amount of spatial detail that is absolutely unique and...
Mask manufacturing of advanced technology designs using multi-beam lithography (part 2)
NASA Astrophysics Data System (ADS)
Green, Michael; Ham, Young; Dillon, Brian; Kasprowicz, Bryan; Hur, Ik Boum; Park, Joong Hee; Choi, Yohan; McMurran, Jeff; Kamberian, Henry; Chalom, Daniel; Klikovits, Jan; Jurkovic, Michal; Hudek, Peter
2016-09-01
As optical lithography is extended to the 10nm and below nodes, advanced designs are becoming a key challenge for mask manufacturers. Techniques including advanced optical proximity correction (OPC) and Inverse Lithography Technology (ILT) result in structures that pose a range of issues across the mask manufacturing process. Among the new challenges are continually shrinking sub-resolution assist features (SRAFs), curvilinear SRAFs, and other complex mask geometries that are counter-intuitive relative to the desired wafer pattern. Considerable capability improvements over current mask making methods are necessary to meet the new requirements, particularly regarding minimum feature resolution and pattern fidelity. Advanced processes using the IMS Multi-beam Mask Writer (MBMW) are feasible solutions to these coming challenges. In this paper, Part 2 of our study, we further characterize an MBMW process for 10nm and below logic node mask manufacturing, including advanced pattern analysis and write time demonstration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reza Akrami, Seyed Mohammad; Miyata, Kazuki; Asakawa, Hitoshi
High-speed atomic force microscopy has attracted much attention due to its unique capability of visualizing nanoscale dynamic processes at a solid/liquid interface. However, its usability and resolution have yet to be improved. As one solution to this issue, here we present a design of a high-speed Z-tip scanner with a screw holding mechanism. We perform a detailed comparison between designs with different actuator sizes and screw arrangements by finite element analysis. Based on the design giving the best performance, we have developed a Z-tip scanner and measured its performance. The measured frequency response of the scanner shows a flat response up to ∼10 kHz. This high frequency response allows us to achieve wideband tip-sample distance regulation. We demonstrate the applicability of the scanner to high-speed atomic-resolution imaging by visualizing an atomic-scale calcite crystal dissolution process in water at 2 s/frame.
NASA Astrophysics Data System (ADS)
Masaitis, A.
2013-12-01
Conflicts in the development of mining projects are now common between mining proponents, NGOs, and communities. These conflicts can sometimes be alleviated by early development of modes of communication and a formal discussion format that allows airing of concerns and potential resolution of problems. One method of formalizing this process is to establish a Good Neighbor Agreement (GNA), which deals specifically with challenges in the relationships between mining operations and local communities. It is a new practice, oriented toward the social needs and concerns of local communities that arise during the normal life of a mine, that can help achieve sustainable mining practices in both developing and developed countries. The GNA project currently being developed at the University of Nevada, Reno in cooperation with the Newmont Mining Corporation has the goal of creating an open company/community dialog, based on international standards, that will help identify and address sociological and environmental concerns associated with mining, as well as find methods for communication and conflict resolution. GNA standards should be based on a trust doctrine, open information access, and community involvement in the decision-making process. They should include the following components: emergency response and community communications; environmental issues, including air and water quality standards; reclamation and recultivation; socio-economic issues such as transportation, safety, training, and local hiring; and financial issues, particularly those related to mitigation offsets and community needs.
The GNA standards help identify and evaluate conflict criteria in mining/community relationships; determine the status of concerns; focus on the local political and government systems; separate acute from chronic concerns; determine the roles and responsibilities of stakeholders; analyze the feasibility of problem resolution; maintain community involvement and support through economic benefits and environmental safeguards; develop options for resolving concerns; and develop and manage short- and long-term plans. Difficulties in establishing GNA standards include identifying the full list of stakeholders, a lack of responsible environmental protection practices, dependence on the government and political system, and a lack of will to disclose full information to the public. The task is further complicated by the lack of insurance/bonding policies and by the lack of audit and monitoring that could determine the level of exposure of the local community and the environment to contaminants released at mine sites. Since many problems at mines can occur during closure and post-closure, GNAs should address those issues as well. The project determined a process for implementing the GNA as a conflict prevention/resolution tool, analyzed conflict and concern criteria associated with mining operations, determined the roles of stakeholders, worked out a process for stakeholder monitoring, and carried out a sociological survey of the stakeholders and the community. Frequent conflicts between mining companies and surrounding communities that lead to work disruptions or even mine closures show the necessity of a less confrontational approach to environmental and social justice. Establishing GNA standards for use in both developed and developing nations can decrease these conflicts.
Paiva, Anthony; Shou, Wilson Z
2016-08-01
The last several years have seen the rapid adoption of the high-resolution MS (HRMS) for bioanalytical support of high throughput in vitro ADME profiling. Many capable software tools have been developed and refined to process quantitative HRMS bioanalysis data for ADME samples with excellent performance. Additionally, new software applications specifically designed for quan/qual soft spot identification workflows using HRMS have greatly enhanced the quality and efficiency of the structure elucidation process for high throughput metabolite ID in early in vitro ADME profiling. Finally, novel approaches in data acquisition and compression, as well as tools for transferring, archiving and retrieving HRMS data, are being continuously refined to tackle the issue of large data file size typical for HRMS analyses.
Rapid 3D bioprinting from medical images: an application to bone scaffolding
NASA Astrophysics Data System (ADS)
Lee, Daniel Z.; Peng, Matthew W.; Shinde, Rohit; Khalid, Arbab; Hong, Abigail; Pennacchi, Sara; Dawit, Abel; Sipzner, Daniel; Udupa, Jayaram K.; Rajapakse, Chamith S.
2018-03-01
Bioprinting of tissue has applications throughout medicine. Recent advances in medical imaging allow the generation of 3-dimensional models that can then be 3D printed. However, the conventional method of converting medical images to 3D-printable G-Code instructions has several limitations, namely significant processing time for large, high-resolution images and the loss of microstructural surface information through surface triangulation and subsequent reslicing. We have overcome these issues by creating a JAVA program that skips the intermediate triangulation and reslicing steps and directly converts binary DICOM images into G-Code. In this study, we tested the two methods of G-Code generation on the application of synthetic bone graft scaffold generation. We imaged human cadaveric proximal femurs at an isotropic resolution of 0.03mm using a high resolution peripheral quantitative computed tomography (HR-pQCT) scanner. These images, in the Digital Imaging and Communications in Medicine (DICOM) format, were then processed through two methods. In each method, slices and regions of print were selected, filtered to generate a smoothed image, and thresholded. In the conventional method, these processed images are converted to the STereoLithography (STL) format and then resliced to generate G-Code. In the new, direct method, these processed images are run through our JAVA program and directly converted to G-Code. File size, processing time, and print time were measured for each. We found that the new method produced a significant reduction in G-Code file size as well as processing time (92.23% reduction). This allows for more rapid 3D printing from medical images.
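The direct slice-to-G-Code idea described in this abstract can be illustrated with a minimal sketch. This is not the authors' JAVA implementation; the pixel size, feed rate, and run-length strategy below are illustrative assumptions. Each binary slice is scanned row by row, and every contiguous run of printable pixels becomes one travel move plus one extrusion move, with no intermediate STL or reslicing step:

```python
import numpy as np

def slice_to_gcode(mask, pixel_mm=0.03, z_mm=0.0, feed=1200):
    """Convert one binary slice (2D boolean array) directly to G-Code:
    one G0 travel plus one G1 extrusion per contiguous run of printable
    pixels in each scanline. Parameters are hypothetical defaults."""
    lines = [f"G1 Z{z_mm:.3f} F{feed}"]
    for y in range(mask.shape[0]):
        row = mask[y]
        x = 0
        while x < row.size:
            if row[x]:
                start = x
                while x < row.size and row[x]:
                    x += 1
                # travel to the start of the run, then extrude across it
                lines.append(f"G0 X{start * pixel_mm:.3f} Y{y * pixel_mm:.3f}")
                lines.append(f"G1 X{x * pixel_mm:.3f} Y{y * pixel_mm:.3f} E1")
            else:
                x += 1
    return lines

mask = np.zeros((3, 5), dtype=bool)
mask[1, 1:4] = True            # a single 3-pixel printable run on row 1
gcode = slice_to_gcode(mask)
print("\n".join(gcode))
```

Because the conversion is a single linear pass over the pixels, its cost scales with image size rather than with surface complexity, which is consistent with the processing-time reduction reported above.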
Moral Sensitivity and Its Contribution to the Resolution of Socio-Scientific Issues
ERIC Educational Resources Information Center
Sadler, Troy
2004-01-01
This study explores models of how people perceive moral aspects of socio-scientific issues. Thirty college students participated in interviews during which they discussed their reactions to and resolutions of two genetic engineering issues. The interview data were analyzed qualitatively to produce an emergent taxonomy of moral concerns recognized…
Watson, Jeanne C
2018-05-01
An important objective in humanistic-experiential psychotherapies and particularly emotion-focused psychotherapy (EFT) is to map patterns of change. Effective mapping of the processes and pathways of change requires that in-session processes be linked to in-session resolutions, immediate post-session changes, intermediate outcome, final therapy outcome, and longer-term change. This is a challenging and long-term endeavour. Fine-grained descriptions of in-session processes that lead to resolution of specific interpersonal and intrapersonal issues linked with longer-term outcomes are the foundation of EFT, the process-experiential approach. In this paper, evidence in support of EFT as a treatment approach will be reviewed along with research on two mechanisms of change, viewed as central to EFT, clients' emotional processing and the therapeutic relationship conditions. The implications for psychotherapy research are discussed. Given the methodological constraints, there is a need for more innovative methodologies and strategies to investigate specific psychotherapy processes within and across different approaches to map patterns and mechanisms of change to enhance theory, research, practice, and training.
Will lawyering strangle democratic capitalism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Silberman, L.H.
1978-01-01
Excessive reliance on intervention through the legal process, an expression of governmental power, is seen as a threat to capitalism because the legal process is less responsive to public will. The increasing use of the courts to resolve social and economic issues is partly a result of the complex legislative process. Judges have become more receptive to public-interest issues and have broadened the definition of their jurisdiction. The transference of interests into rights in the public mind eventually leads to authoritarian resolutions and a loss of democracy. The new power of law has attracted talent away from business and into legal services to the detriment of economic growth and vitality. Lawyers benefit from the expansion of the legal process, although they fail to relate the economic ramifications of free access to the courts with the principles of capitalism and democracy. Lawyers are urged to help find a solution to the dilemma before the legal process becomes too unwieldy.
ERIC Educational Resources Information Center
Copenhaver, John
2007-01-01
Parents and school staff usually agree upon issues regarding evaluation, eligibility, program, and placement of students with disabilities. However, there are times when disagreement occurs. Disagreements and conflict are often inevitable, but they need not produce negative results. Mediation in special education is a process to assist parents and…
ERIC Educational Resources Information Center
Mueller, Tracy Gershwin; Piantoni, Shawn
2013-01-01
Conflict between parents of children with disabilities and school district members has been an ongoing issue for decades. Special education administrators are often designated to address conflict with the intent to find an amicable resolution. Otherwise, conflict can lead to due process hearings that move valuable time and money away from general…
Liangjun Hu; Qinfeng Guo
2013-01-01
How species diversity relates to productivity remains a major debate. To date, however, the underlying mechanisms that regulate the ecological processes involved are still poorly understood. Three major issues persist in early efforts at resolution. First, in the context that productivity drives species diversity, how the pathways operate is poorly-explained. Second,...
EUV lithography for 30nm half pitch and beyond: exploring resolution, sensitivity, and LWR tradeoffs
NASA Astrophysics Data System (ADS)
Putna, E. Steve; Younkin, Todd R.; Chandhok, Manish; Frasure, Kent
2009-03-01
The International Technology Roadmap for Semiconductors (ITRS) denotes Extreme Ultraviolet (EUV) lithography as a leading technology option for realizing the 32nm half-pitch node and beyond. Readiness of EUV materials is currently one high-risk area according to assessments made at the 2008 EUVL Symposium. The main development issue regarding EUV resist has been how to simultaneously achieve high sensitivity, high resolution, and low line width roughness (LWR). This paper describes the strategy and current status of EUV resist development at Intel Corporation. Data collected utilizing Intel's Micro-Exposure Tool (MET) is presented in order to examine the feasibility of establishing a resist process that simultaneously exhibits <=30nm half-pitch (HP) L/S resolution at <=10mJ/cm2 with <=4nm LWR.
EUV lithography for 22nm half pitch and beyond: exploring resolution, LWR, and sensitivity tradeoffs
NASA Astrophysics Data System (ADS)
Putna, E. Steve; Younkin, Todd R.; Caudillo, Roman; Chandhok, Manish
2010-04-01
The International Technology Roadmap for Semiconductors (ITRS) denotes Extreme Ultraviolet (EUV) lithography as a leading technology option for realizing the 22nm half pitch node and beyond. Readiness of EUV materials is currently one high risk area according to recent assessments made at the 2009 EUVL Symposium. The main development issue regarding EUV resist has been how to simultaneously achieve high sensitivity, high resolution, and low line width roughness (LWR). This paper describes the strategy and current status of EUV resist development at Intel Corporation. Data collected utilizing Intel's Micro-Exposure Tool (MET) is presented in order to examine the feasibility of establishing a resist process that simultaneously exhibits <=22nm half-pitch (HP) L/S resolution at <= 12.5mJ/cm2 with <= 4nm LWR.
Multi-scale analysis of a household level agent-based model of landcover change.
Evans, Tom P; Kelley, Hugh
2004-08-01
Scale issues have significant implications for the analysis of social and biophysical processes in complex systems. These same scale implications are likewise considerations for the design and application of models of landcover change. Scale issues have wide-ranging effects from the representativeness of data used to validate models to aggregation errors introduced in the model structure. This paper presents an analysis of how scale issues affect an agent-based model (ABM) of landcover change developed for a research area in the Midwest, USA. The research presented here explores how scale factors affect the design and application of agent-based landcover change models. The ABM is composed of a series of heterogeneous agents who make landuse decisions on a portfolio of cells in a raster-based programming environment. The model is calibrated using measures of fit derived from both spatial composition and spatial pattern metrics from multi-temporal landcover data interpreted from historical aerial photography. A model calibration process is used to find a best-fit set of parameter weights assigned to agents' preferences for different landuses (agriculture, pasture, timber production, and non-harvested forest). Previous research using this model has shown how a heterogeneous set of agents with differing preferences for a portfolio of landuses produces the best fit to landcover changes observed in the study area. The scale dependence of the model is explored by varying the resolution of the input data used to calibrate the model (observed landcover), ancillary datasets that affect land suitability (topography), and the resolution of the model landscape on which agents make decisions. To explore the impact of these scale relationships the model is run with input datasets constructed at the following spatial resolutions: 60, 90, 120, 150, 240, 300 and 480 m. The results show that the distribution of landuse-preference weights differs as a function of scale. 
In addition, with the gradient descent model fitting method used in this analysis the model was not able to converge to an acceptable fit at the 300 and 480 m spatial resolutions. This is a product of the ratio of the input cell resolution to the average parcel size in the landscape. This paper uses these findings to identify scale considerations in the design, development, validation and application of ABMs of landcover change.
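One simple way to construct the multi-resolution input rasters this study varies (60 m through 480 m) is block aggregation of a fine categorical landcover grid by majority vote. The sketch below is a generic illustration under that assumption, not the study's actual preprocessing pipeline:

```python
import numpy as np

def coarsen_majority(landcover, factor):
    """Aggregate a categorical landcover raster to a coarser grid by
    majority vote within each factor x factor block. Assumes the grid
    dimensions divide evenly by the factor."""
    h, w = landcover.shape
    out = np.empty((h // factor, w // factor), dtype=landcover.dtype)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            block = landcover[i*factor:(i+1)*factor, j*factor:(j+1)*factor]
            vals, counts = np.unique(block, return_counts=True)
            out[i, j] = vals[np.argmax(counts)]   # most frequent class wins
    return out

# toy 4x4 landcover map with classes 1 (agriculture), 2 (pasture), 3 (forest)
fine = np.array([[1, 1, 2, 2],
                 [1, 3, 2, 2],
                 [3, 3, 1, 1],
                 [3, 3, 1, 2]])
print(coarsen_majority(fine, 2))
```

Majority aggregation illustrates why coarse resolutions can fail to converge as reported above: minority classes, and eventually whole parcels smaller than a cell, disappear from the input data.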
A guide to the EEOC's final regulations on the Americans with Disabilities Act.
Shaller, E H; Rosen, D A
The Equal Employment Opportunity Commission (EEOC) recently issued its final regulations on the Americans with Disabilities Act (ADA). Although the regulations offer some guidance for employers on how to comply with the Act, they fail to provide specific answers to the many complicated compliance questions that will surely arise. Further, the regulations are almost totally silent on certain critical issues related to insurance, workers' compensation, and potential conflicts between ADA obligations and terms of collective bargaining agreements. The EEOC has essentially left the resolution of many important ADA questions to case-by-case determination and the litigation process.
NASA Astrophysics Data System (ADS)
Baldwin, Daniel; Tschudi, Mark; Pacifici, Fabio; Liu, Yinghui
2017-08-01
Two independent VIIRS-based Sea Ice Concentration (SIC) products are validated against SIC as estimated from Very High Spatial Resolution Imagery for several VIIRS overpasses. The 375 m resolution VIIRS SIC from the Interface Data Processing Segment (IDPS) SIC algorithm is compared against estimates made from 2 m DigitalGlobe (DG) WorldView-2 imagery and also against estimates created from 10 cm Digital Mapping System (DMS) camera imagery. The 750 m VIIRS SIC from the Enterprise SIC algorithm is compared against DG imagery. The IDPS vs. DG comparisons reveal that, due to algorithm issues, many of the IDPS SIC retrievals were falsely assigned ice-free values when the pixel was clearly over ice. These false values increased the validation bias and RMS statistics. The IDPS vs. DMS comparisons were largely over ice-covered regions and did not demonstrate the false retrieval issue. The validation results show that products from both the IDPS and Enterprise algorithms were within or very close to the 10% accuracy (bias) specifications in both the non-melting and melting conditions, but only products from the Enterprise algorithm met the 25% specifications for the uncertainty (RMS).
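The bias and RMS statistics checked against the 10% and 25% specifications above are straightforward to compute once the retrieved and reference SIC fields are co-located; a minimal sketch with made-up sample values (not the study's data) is:

```python
import numpy as np

def validation_stats(retrieved, reference):
    """Bias (mean difference) and RMS difference between a retrieved
    sea ice concentration field and a co-located higher-resolution
    reference estimate, both expressed in the same units (fractional SIC)."""
    d = retrieved - reference
    return d.mean(), np.sqrt((d ** 2).mean())

# hypothetical co-located SIC values (fractional, 0..1)
ret = np.array([0.9, 0.7, 0.5])
ref = np.array([0.8, 0.8, 0.4])
bias, rms = validation_stats(ret, ref)
print(bias, rms)
```

Note the asymmetry the abstract describes: false ice-free retrievals inflate both statistics, since each such pixel contributes a large negative difference to the bias and a large squared term to the RMS.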
Hurdles in low k1 mass production
NASA Astrophysics Data System (ADS)
Yim, Donggyu; Yang, Hyunjo; Park, Chanha; Hong, Jongkyun; Choi, Jaeseung
2005-05-01
As optical lithography pushes toward its theoretical resolution limit of k1 = 0.25, aggressive Resolution Enhancement Techniques (RETs) are required to ensure the necessary resolution, sufficient process window, and reasonable MEEF in critical layers. When chip makers adopt RETs in low-k1 devices, there are many crucial factors to take into account during development and mass production. These hurdles are not only difficult to overcome but also highly risky for a company that adopts a low-k1 mass-production strategy. Nevertheless, a low-k1 production strategy is very attractive to all chip makers because it improves production capacity and cost of ownership, so low-k1 technology has been investigated by many lithography engineers and a large body of material has been published, most of it at the R&D level. This study introduces the main low-k1 mass-production issues and suggests a definition of low k1 in mass production. Most of the issues described were investigated and experienced both in the R&D development stage and on the final mass-production line; low-k1 mass production is somewhat different from R&D development alone.
Fast, Accurate and Shift-Varying Line Projections for Iterative Reconstruction Using the GPU
Pratx, Guillem; Chinn, Garry; Olcott, Peter D.; Levin, Craig S.
2013-01-01
List-mode processing provides an efficient way to deal with sparse projections in iterative image reconstruction for emission tomography. An issue often reported is the tremendous amount of computation required by such algorithms: each recorded event requires several back- and forward line projections. We investigated the use of the programmable graphics processing unit (GPU) to accelerate the line-projection operations and implement fully-3D list-mode ordered-subsets expectation-maximization for positron emission tomography (PET). We designed a reconstruction approach that incorporates resolution kernels, which model the spatially-varying physical processes associated with photon emission, transport and detection. Our development is particularly suitable for applications where the projection data is sparse, such as high-resolution, dynamic, and time-of-flight PET reconstruction. The GPU approach runs more than 50 times faster than an equivalent CPU implementation while image quality and accuracy are virtually identical. This paper describes in detail how the GPU can be used to accelerate the line projection operations, even when the lines-of-response have arbitrary endpoint locations and shift-varying resolution kernels are used. A quantitative evaluation is included to validate the correctness of this new approach. PMID:19244015
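The list-mode MLEM update that the GPU work accelerates can be sketched on a toy problem. This CPU sketch is illustrative only: the paper's contribution is the GPU line-projection kernels with shift-varying resolution models, not the update rule itself, and the toy "system rows" stand in for real line-of-response profiles:

```python
import numpy as np

def listmode_mlem(event_rows, sens, lam, n_iter=10):
    """Simplified list-mode MLEM: event_rows[i] holds the forward-projection
    weights (line-of-response profile, optionally blurred by a resolution
    kernel) for recorded event i; sens is the per-voxel sensitivity image.
    Each iteration forward-projects every event and backprojects the ratio."""
    for _ in range(n_iter):
        back = np.zeros_like(lam)
        for a in event_rows:
            proj = float(a @ lam)          # forward line projection
            if proj > 0:
                back += a / proj           # backproject the data/model ratio
        lam = lam * back / sens            # multiplicative EM update
    return lam

# toy problem: 3 voxels, two recorded events along overlapping "lines"
rows = [np.array([1.0, 1.0, 0.0]), np.array([0.0, 1.0, 1.0])]
sens = np.array([1.0, 2.0, 1.0])
est = listmode_mlem(rows, sens, lam=np.ones(3))
print(est)
```

The per-event forward and back projections in the inner loop are exactly the operations that dominate run time and that the GPU parallelizes; a useful sanity check is that MLEM preserves total counts, i.e. the sensitivity-weighted sum of the estimate equals the number of events.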
Data processing and analysis for 2D imaging GEM detector system
NASA Astrophysics Data System (ADS)
Czarski, T.; Chernyshova, M.; Pozniak, K. T.; Kasprowicz, G.; Byszuk, A.; Juszczyk, B.; Kolasinski, P.; Linczuk, M.; Wojenski, A.; Zabolotny, W.; Zienkiewicz, P.
2014-11-01
The Triple Gas Electron Multiplier (T-GEM) is presented as a soft X-ray (SXR) energy- and position-sensitive detector for high-resolution X-ray diagnostics of magnetic confinement fusion plasmas [1]. A multi-channel measurement system and the essential data processing for X-ray energy and position recognition are considered. Several modes of data acquisition are introduced, depending on the division of processing between hardware and software components. Typical measurement issues are discussed with a view to enhancing data quality. Fundamental output characteristics are presented for one- and two-dimensional detector structures. Representative results for a reference X-ray source and tokamak plasma are demonstrated.
NASA Technical Reports Server (NTRS)
Naumann, R. J.
1980-01-01
The scientific aspects of the Materials Processing in Space program are described with emphasis on the major categories of interest: (1) crystal growth; (2) solidification of metals, alloys, and composites; (3) fluids and chemical processes; (4) containerless processing, glasses, and refractories; (5) ultrahigh vacuum processes; and (6) bioprocessing. An index is provided for each of these areas. The possible contributions that materials science experiments in space can make to the various disciplines are summarized, and the necessity for performing experiments in space is justified. What has been learned from previous experiments relating to space processing, current investigations, and remaining issues that require resolution are discussed. Recommendations for the future direction of the program are included.
Inferred Lunar Boulder Distributions at Decimeter Scales
NASA Technical Reports Server (NTRS)
Baloga, S. M.; Glaze, L. S.; Spudis, P. D.
2012-01-01
Block size distributions of impact deposits on the Moon are diagnostic of the impact process and environmental effects, such as target lithology and weathering. Block size distributions are also important factors in trafficability, habitability, and possibly the identification of indigenous resources. Lunar block sizes have been investigated for many years for many purposes [e.g., 1-3]. An unresolved issue is the extent to which lunar block size distributions can be extrapolated to scales smaller than limits of resolution of direct measurement. This would seem to be a straightforward statistical application, but it is complicated by two issues. First, the cumulative size frequency distribution of observable boulders rolls over due to resolution limitations at the small end. Second, statistical regression provides the best fit only around the centroid of the data [4]. Confidence and prediction limits splay away from the best fit at the endpoints resulting in inferences in the boulder density at the CPR scale that can differ by many orders of magnitude [4]. These issues were originally investigated by Cintala and McBride [2] using Surveyor data. The objective of this study was to determine whether the measured block size distributions from Lunar Reconnaissance Orbiter Camera - Narrow Angle Camera (LROC-NAC) images (m-scale resolution) can be used to infer the block size distribution at length scales comparable to Mini-RF Circular Polarization Ratio (CPR) scales, nominally taken as 10 cm. This would set the stage for assessing correlations of inferred block size distributions with CPR returns [6].
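The extrapolation question this study raises can be made concrete with the usual cumulative power-law form for boulder populations, N(>D) = k D^(-b), fit by log-log least squares. The sketch below uses synthetic data and is illustrative only; the point is that an extrapolated 10 cm density follows mechanically from the fitted parameters, while the prediction limits at that endpoint (not computed here) are what produce the orders-of-magnitude uncertainty described above:

```python
import numpy as np

def fit_cumulative_sfd(diam_m, counts):
    """Log-log least-squares fit of a cumulative size-frequency
    distribution N(>D) = k * D**(-b). Returns (k, b)."""
    slope, intercept = np.polyfit(np.log10(diam_m), np.log10(counts), 1)
    return 10.0 ** intercept, -slope

# synthetic boulder population obeying N(>D) = 100 * D**-2 exactly
D = np.array([1.0, 2.0, 4.0, 8.0])          # measurable diameters, m
N = 100.0 * D ** -2.0                       # cumulative counts
k, b = fit_cumulative_sfd(D, N)             # recovers k ~ 100, b ~ 2
pred_10cm = k * 0.1 ** -b                   # extrapolation to the 10 cm CPR scale
print(k, b, pred_10cm)
```

On noisy real data the fit is anchored near the centroid of the measured diameters, so small uncertainty in the slope b is amplified into a large multiplicative uncertainty at 0.1 m, two orders of magnitude below the smallest measured boulder here.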
Hyper-Resolution Groundwater Modeling using MODFLOW 6
NASA Astrophysics Data System (ADS)
Hughes, J. D.; Langevin, C.
2017-12-01
MODFLOW 6 is the latest version of the U.S. Geological Survey's modular hydrologic model. MODFLOW 6 was developed to synthesize many of the recent versions of MODFLOW into a single program, improve the way different process models are coupled, and provide an object-oriented framework for adding new types of models and packages. The object-oriented framework and underlying numerical solver make it possible to tightly couple any number of hyper-resolution models within coarser regional models. The hyper-resolution models can be used to evaluate local-scale groundwater issues that may be affected by regional-scale forcings. In MODFLOW 6, hyper-resolution meshes can be maintained as separate model datasets, similar to MODFLOW-LGR, which simplifies the development of embedded hyper-resolution models from a coarse regional model. For example, the South Atlantic Coastal Plain regional water availability model was converted from a MODFLOW-2000 model to a MODFLOW 6 model. The horizontal discretization of the original model is approximately 3,218 m x 3,218 m. Hyper-resolution models of the Aiken and Sumter County water budget areas in South Carolina, with a horizontal discretization of approximately 322 m x 322 m, were developed and tightly coupled to a modified version of the original coarse regional model that excluded these areas. Hydraulic property and aquifer geometry data from the coarse model were mapped to the hyper-resolution models. The discretization of the hyper-resolution models is fine enough to permit detailed analyses of the effect that changes in groundwater withdrawals in the production aquifers have on the water table and surface-water/groundwater interactions. The approach used in this analysis could be applied to other regional water availability models that have been developed by the U.S. Geological Survey to evaluate local-scale groundwater issues.
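The coarse-to-fine property mapping described for the Aiken and Sumter County models (roughly 3,218 m cells onto 322 m cells, about a factor of 10) can be sketched generically as replication of each coarse cell's value onto its embedded fine cells. This hypothetical helper is only an illustration of the mapping step; an actual MODFLOW 6 workflow would carry this out through the model input datasets:

```python
import numpy as np

def refine_property(coarse, factor):
    """Map a coarse-model property array (e.g. hydraulic conductivity)
    onto an embedded hyper-resolution grid by simple replication: each
    coarse cell becomes a factor x factor block of fine cells."""
    return np.repeat(np.repeat(coarse, factor, axis=0), factor, axis=1)

# toy 2x2 coarse property field refined by a factor of 2
coarse = np.array([[1.0, 2.0],
                   [3.0, 4.0]])
fine = refine_property(coarse, 2)
print(fine)
```

Replication preserves the coarse model's property zonation exactly; any additional local detail (for example, finer aquifer geometry near the pumping centers) would then be overlaid on the refined arrays.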
Independent Orbiter Assessment (IOA): CIL issues resolution report, volume 2
NASA Technical Reports Server (NTRS)
1988-01-01
The results of the Independent Orbiter Assessment (IOA) of the Failure Modes And Effects Analysis (FMEA) and Critical Items List (CIL) are presented. This report contains IOA assessment worksheets showing resolution of outstanding IOA CIL issues that were summarized in the IOA FMEA/CIL Assessment Interim Report, dated 9 March 1988. Each assessment worksheet has been updated with CIL issue resolution and rationale. Volume 2 contains the worksheets for the following subsystems: Nose Wheel Steering Subsystem; Remote Manipulator Subsystem; Atmospheric Revitalization Subsystem; Extravehicular Mobility Unit Subsystem; Power Reactant Supply and Distribution Subsystem; Main Propulsion Subsystem; and Orbital Maneuvering Subsystem.
ERIC Educational Resources Information Center
Kayes, Pauline E.
2006-01-01
In the last ten years, many colleges, universities, boards, and agencies have jumped on the diverse faculty/staff hiring bandwagon not only by issuing resolutions, policies, and mandates but also by inventing programs, initiatives, and strategies all intended to increase the number of faculty and staff of color in predominantly White institutions.…
Sahra integrated modeling approach to address water resources management in semi-arid river basins
DOE Office of Scientific and Technical Information (OSTI.GOV)
Springer, E. P.; Gupta, Hoshin V.; Brookshire, David S.
Water resources decisions in the 21st century that affect the allocation of water for economic and environmental purposes will rely on simulations from integrated models of river basins. These models will not only couple natural systems such as surface and ground waters, but will include economic components that can assist in model assessments of river basins and bring the social dimension to the decision process. The National Science Foundation Science and Technology Center for Sustainability of semi-Arid Hydrology and Riparian Areas (SAHRA) has been developing integrated models to assess impacts of climate variability and land use change on water resources in semi-arid river basins. The objectives of this paper are to describe the SAHRA integrated modeling approach and the linkage between social and natural sciences in these models. Water resources issues that arise from climate variability or land use change may require models of different resolutions to answer different questions. For example, a question related to streamflow may not need a high-resolution model, whereas a question concerning the source and nature of a pollutant will. SAHRA has taken a multiresolution approach to integrated model development because one cannot anticipate the questions in advance, and the computational and data resources may not always be available or needed for the issue to be addressed. The coarsest resolution model is based on dynamic simulation of subwatersheds or river reaches. This model resolution has the advantage of simplicity, and social factors are readily incorporated. Users can readily take this model (and they have) and examine the effects of various management strategies such as increased cost of water. The medium resolution model is grid based and uses variable grid cells of 1-12 km.
The surface hydrology is more physically based, using basic equations for energy and water balance terms, and modules are being incorporated that will simulate engineering components such as reservoirs or irrigation diversions and economic features such as variable demand. The fine resolution model is viewed as a tool to examine basin response using the best available process models. It operates on a grid cell size of 100 m or less, which is consistent with the scale at which our process knowledge has developed, and couples atmosphere, surface water, and groundwater modules using high performance computing. Unlike the coarse resolution model, the medium and fine resolution models are not at this time expected to be operated by users. One of the objectives of the SAHRA integrated modeling task is to present results in a manner that can be used by those making decisions. The application of these models within SAHRA is driven by a scenario analysis and a specific place: the Rio Grande from its headwaters in Colorado to the New Mexico-Texas border. This provides a focus for model development and an attempt to see how the results from the various models relate. The scenario selected by SAHRA is the impact of a 1950s-style drought, using 1990s population and land use, on Rio Grande water resources, including surface water and groundwater. The same climate variables will be used to drive all three models so that comparison will be based on how the three resolutions partition and route water through the river basin. Aspects of this scenario will be discussed and initial model simulations will be presented. The issue of linking economic modules into the modeling effort will be discussed, and the importance of feedback from the social and economic modules to the natural science modules will be reviewed.
Gridless, pattern-driven point cloud completion and extension
NASA Astrophysics Data System (ADS)
Gravey, Mathieu; Mariethoz, Gregoire
2016-04-01
While satellites offer Earth observation with wide coverage, other remote sensing techniques such as terrestrial LiDAR can acquire very high-resolution data over an area that is limited in extent and often discontinuous due to shadow effects. Here we propose a numerical approach to merge these two types of information, thereby reconstructing high-resolution data over a continuous large area. It is based on a pattern matching process that completes the areas where only low-resolution data are available, using bootstrapped high-resolution patterns. Currently, the most common approach to pattern matching is to interpolate the point data on a grid. While this approach is computationally efficient, it presents major drawbacks for point cloud processing because a significant part of the information is lost in the point-to-grid resampling, and a prohibitive amount of memory is needed to store large grids. To address these issues, we propose a gridless method that compares point cloud subsets without the need for a grid. On-the-fly interpolation involves a heavy computational load, which is met by using a highly optimized GPU implementation and a hierarchical pattern searching strategy. The method is illustrated using data from the Val d'Arolla, Swiss Alps, where high-resolution terrestrial LiDAR data are fused with lower-resolution Landsat and WorldView-3 acquisitions, such that the density of points is homogenized (data completion) and extended to a larger area (data extension).
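One way to compare point-cloud subsets without rasterising them onto a grid is a symmetric average nearest-neighbour (Chamfer-style) distance. The abstract does not specify the authors' actual matching criterion, so the sketch below is an illustrative stand-in; the function name and point values are invented:

```python
import numpy as np

def chamfer_distance(a, b):
    """Symmetric average nearest-neighbour distance between two 2-D point
    sets: for each point in a, find its closest point in b (and vice versa),
    then average the two mean distances. No grid is ever built."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)  # pairwise distances
    return 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())

patch = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
shifted = patch + 0.1   # nearly identical patch: small distance
far = patch + 5.0       # dissimilar patch: large distance
assert chamfer_distance(patch, shifted) < chamfer_distance(patch, far)
```

The brute-force pairwise matrix here is O(nm) in memory; a hierarchical search strategy like the one the abstract mentions would prune candidate patches before computing such distances exactly.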
Skin color as post-colonial hierarchy: a global strategy for conflict resolution.
Hall, Ronald E
2003-01-01
The post-colonial hierarchy is a critical dynamic of global coexistence. Power is associated with those sovereignties characterized by light-skinned populations. Those characterized by dark skin are denigrated and assumed less qualified to negotiate global issues as equals. Although political objectives are expected to stimulate conflict, skin color is directly correlated with the present world order. Moreover, most post-colonial sovereignties are heterogeneous in one way or another and yet do not engage in destructive conflict. From a global perspective, conflict resolution will require post-colonial sovereignties--particularly those of relative light skin--to forfeit their self-serving denigration of others. Strategies for conflict resolution should ignore skin color and incorporate measures designed to improve problem solving, moral reasoning, and the general etiquette skills of those engaged in any negotiation process.
Constraining Stochastic Parametrisation Schemes Using High-Resolution Model Simulations
NASA Astrophysics Data System (ADS)
Christensen, H. M.; Dawson, A.; Palmer, T.
2017-12-01
Stochastic parametrisations are used in weather and climate models as a physically motivated way to represent model error due to unresolved processes. Designing new stochastic schemes has been the target of much innovative research over the last decade. While a focus has been on developing physically motivated approaches, many successful stochastic parametrisation schemes are very simple, such as the European Centre for Medium-Range Weather Forecasts (ECMWF) multiplicative scheme `Stochastically Perturbed Parametrisation Tendencies' (SPPT). The SPPT scheme improves the skill of probabilistic weather and seasonal forecasts, and so is widely used. However, little work has focused on assessing the physical basis of the SPPT scheme. We address this matter by using high-resolution model simulations to explicitly measure the `error' in the parametrised tendency that SPPT seeks to represent. The high resolution simulations are first coarse-grained to the desired forecast model resolution before they are used to produce initial conditions and forcing data needed to drive the ECMWF Single Column Model (SCM). By comparing SCM forecast tendencies with the evolution of the high resolution model, we can measure the `error' in the forecast tendencies. In this way, we provide justification for the multiplicative nature of SPPT, and for the temporal and spatial scales of the stochastic perturbations. However, we also identify issues with the SPPT scheme. It is therefore hoped these measurements will improve both holistic and process based approaches to stochastic parametrisation. Figure caption: Instantaneous snapshot of the optimal SPPT stochastic perturbation, derived by comparing high-resolution simulations with a low resolution forecast model.
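The multiplicative ansatz that SPPT embodies, a perturbed tendency T' = (1 + r) T, can be sketched in a few lines. Here r is drawn as clipped white noise purely for illustration; the operational ECMWF scheme uses a spatially and temporally correlated random pattern, and the function name and numbers below are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def sppt_perturb(parametrised_tendency, sigma=0.3, rng=rng):
    """Multiplicatively perturb a parametrised tendency field following the
    SPPT ansatz T' = (1 + r) * T. The random field r is clipped to [-1, 1]
    so the perturbation never reverses the sign of the tendency."""
    r = np.clip(rng.normal(0.0, sigma, parametrised_tendency.shape), -1.0, 1.0)
    return (1.0 + r) * parametrised_tendency

tendency = np.full((4, 4), 2.0)   # idealised parametrised heating rate
perturbed = sppt_perturb(tendency)  # values lie in [0, 4]
```

The coarse-graining test described in the abstract amounts to asking whether the measured tendency "error" really scales with the tendency itself, which is exactly the structure this multiplicative form assumes.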
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Qinghua; He, Xu; Shi, Jinan
2017-07-24
Oxygen ion transport is the key issue in redox processes. Visualizing the process of oxygen ion migration with atomic resolution is highly desirable for designing novel devices such as oxidation catalysts, oxygen permeation membranes, and solid oxide fuel cells. We show the process of electrically induced oxygen migration and subsequent reconstructive structural transformation in a SrCoO 2.5-σ film by scanning transmission electron microscopy. We find that the extraction of oxygen from every second SrO layer occurs gradually under an electrical bias; beyond a critical voltage, the brownmillerite units collapse abruptly and evolve into a periodic nano-twinned phase with a high c/a ratio and distorted tetrahedra. These results show that oxygen vacancy rows are not only natural oxygen diffusion channels, but also preferred sites for the induced oxygen vacancies. These direct experimental results of oxygen migration may provide a common mechanism for the electrically induced structural evolution of oxides.
NASA Astrophysics Data System (ADS)
Ying, Changsheng; Zhao, Peng; Li, Ye
2018-01-01
The intensified charge-coupled device (ICCD) is widely used in the field of low-light-level (LLL) imaging. The LLL images captured by ICCDs suffer from low spatial resolution and contrast, and target details can hardly be recognized. Super-resolution (SR) reconstruction of LLL images captured by ICCDs is a challenging issue. The dispersion in the double-proximity-focused image intensifier is the main factor that reduces image resolution and contrast. We divide the integration time into subintervals that are short enough to obtain photon images, so that the overlapping and overstacking effects of dispersion can be eliminated. We propose an SR reconstruction algorithm based on iterative projection photon localization. In the iterative process, the photon image is sliced by projection planes, and photons are screened under regularity constraints. The accurate position of the incident photons in the reconstructed SR image is obtained by a weighted centroid calculation. The experimental results show that the spatial resolution and contrast of our SR image are significantly improved.
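The weighted-centroid step, which recovers a sub-pixel photon position from a short-exposure frame, can be illustrated for a single spot. The function name, threshold, and synthetic frame below are assumptions; the paper's full algorithm additionally slices the image with projection planes and iterates:

```python
import numpy as np

def weighted_centroid(frame, threshold):
    """Intensity-weighted centroid of the above-threshold pixels in a
    short-exposure frame (a minimal, single-spot version of the
    centroiding step described in the abstract)."""
    weights = np.where(frame > threshold, frame, 0.0)
    ys, xs = np.indices(frame.shape)
    total = weights.sum()
    return (ys * weights).sum() / total, (xs * weights).sum() / total

# Synthetic frame with one photon event centred on pixel (5, 5)
frame = np.zeros((16, 16))
frame[4:7, 4:7] = [[1, 2, 1], [2, 8, 2], [1, 2, 1]]
cy, cx = weighted_centroid(frame, 0.5)  # both come out at 5.0
```

Because the centroid is a real-valued position rather than a pixel index, accumulating many such localizations onto a finer output grid is what yields the resolution gain.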
Identity styles and conflict resolution styles: associations in mother-adolescent dyads.
Missotten, Lies Christine; Luyckx, Koen; Branje, Susan; Vanhalst, Janne; Goossens, Luc
2011-08-01
Adolescent identity and parent-adolescent conflict have each attracted considerable research interest. However, few studies have examined the important link between the two constructs. The present study examined the associations between adolescent identity processing styles and adolescent conflict resolution styles in the mother-adolescent dyad. Questionnaires about conflict frequency and resolution were completed by 796 adolescents (66% female, mostly Caucasian) and their mothers. Adolescents also completed a measure on identity styles. Each identity style was hypothesized to relate to a specific conflict resolution behavior. Hierarchical regression analyses showed that the information-oriented identity style was positively associated with positive problem solving and negatively with conflict engagement and withdrawal, the normative style was positively associated with compliance, and, finally, the diffuse-avoidant style was positively associated with withdrawal and conflict engagement and negatively with positive problem solving. Our results demonstrated that the way in which adolescents tackle identity-relevant issues is related to the way in which they deal with conflicts with their mothers. Implications and suggestions for future research are discussed.
Defense Acquisition Research Journal. Volume 21, Number 2, Issue 69
2014-04-01
that quickly meets their needs, not a slow and lumbering bureaucracy better suited to the last century. As important, our military men and women...resolution of urgent needs/ONS. Joint organizations and other military services, however, are not included in this table. As reflected in Table 2, multiple...urgent capability shortfall, the process endures. Materiel release is required for all nonexpendable materiel; high-density military expendables
Research@ARL. Imaging & Image Processing. Volume 3, Issue 1
2014-01-01
goal, the focal plane arrays (FPAs) the Army deploys must excel in all areas of performance including thermal sensitivity, image resolution, speed of...are available only in relatively small sizes. Further, the difference in thermal expansion coefficients between a CZT substrate and its silicon (Si...read-out integrated circuitry reduces the reliability of large format FPAs due to repeated thermal cycling. Some in the community believed this
Underhill, Kristen
2014-02-01
Under U.S. federal regulations, participants providing informed consent must receive information regarding whom to contact in case of a research-related injury or complaint. Although informed consent processes routinely direct participants to contact institutional review boards (IRBs) with questions or concerns, there has been little empirical study of the ways in which IRBs act to resolve participants' research-related complaints. This article explores available literature on participant complaints, considers the responsibilities of IRBs in dispute resolution, and outlines a research agenda. As a case study, this review considers disputes arising from HIV/AIDS research, focusing on novel issues arising from biomedical HIV prevention trials. PMID: 24572085
USGS advances in integrated, high-resolution sea-floor mapping: inner continental shelf to estuaries
Denny, J.F.; Schwab, W.C.; Twichell, D.C.; O'Brien, T.F.; Danforth, W.W.; Foster, D.S.; Bergeron, E.; Worley, C.W.; Irwin, B.J.; Butman, B.; Valentine, P.C.; Baldwin, W.E.; Morton, R.A.; Thieler, E.R.; Nichols, D.R.; Andrews, B.D.
2007-01-01
The U.S. Geological Survey (USGS) has been involved in geological mapping of the sea floor for the past thirty years. Early geophysical and acoustic mapping efforts using GLORIA (Geologic LOng Range Inclined ASDIC), a long-range sidescan-sonar system, provided broad-scale imagery of deep waters within the U.S. Exclusive Economic Zone (EEZ). In the early 1990s, research emphasis shifted from deep- to shallow-water environments to address pertinent coastal research and resource management issues. Use of shallow-water, high-resolution geophysical systems has enhanced our understanding of the processes shaping shallow marine environments. However, research within these shallow-water environments continues to present technological challenges.
Nuclear safety for the space exploration initiative
NASA Technical Reports Server (NTRS)
Dix, Terry E.
1991-01-01
The results of a study to identify potential hazards arising from nuclear reactor power systems for use on the lunar and Martian surfaces, related safety issues, and resolutions of such issues by system design changes, operating procedures, and other means are presented. All safety aspects of nuclear reactor power systems from prelaunch ground handling to eventual disposal were examined consistent with the level of detail for SP-100 reactor design at the 1988 System Design Review and for launch vehicle and space transport vehicle designs and mission descriptions as defined in the 90-day Space Exploration Initiative (SEI) study. Information from previous aerospace nuclear safety studies was used where appropriate. Safety requirements for the SP-100 space nuclear reactor system were compiled. Mission profiles were defined with emphasis on activities after low earth orbit insertion. Accident scenarios were then qualitatively defined for each mission phase. Safety issues were identified for all mission phases with the aid of simplified event trees. Safety issue resolution approaches of the SP-100 program were compiled. Resolution approaches for those safety issues not covered by the SP-100 program were identified. Additionally, the resolution approaches of the SP-100 program were examined in light of the moon and Mars missions.
Parametric effects of syntactic-semantic conflict in Broca's area during sentence processing.
Thothathiri, Malathi; Kim, Albert; Trueswell, John C; Thompson-Schill, Sharon L
2012-03-01
The hypothesized role of Broca's area in sentence processing ranges from domain-general executive function to domain-specific computation that is specific to certain syntactic structures. We examined this issue by manipulating syntactic structure and conflict between syntactic and semantic cues in a sentence processing task. Functional neuroimaging revealed that activation within several Broca's area regions of interest reflected the parametric variation in syntactic-semantic conflict. These results suggest that Broca's area supports sentence processing by mediating between multiple incompatible constraints on sentence interpretation, consistent with this area's well-known role in conflict resolution in other linguistic and non-linguistic tasks. Copyright © 2011 Elsevier Inc. All rights reserved.
Position Statements, Issue Briefs, Resolutions and Consensus Statements. Revised
ERIC Educational Resources Information Center
National Association of School Nurses (NJ1), 2012
2012-01-01
This article presents position statements, issue briefs, and resolutions and consensus statements of the National Association of School Nurses (NASN). The Position Statements include: (1) Allergy/Anaphylaxis Management in the School Setting; (2) Caseload Assignments; (3) Child Mortality in the School Setting; (4) Chronic Health Conditions, Managed…
Spatial heterogeneity of leaf area index across scales from simulation and remote sensing
NASA Astrophysics Data System (ADS)
Reichenau, Tim G.; Korres, Wolfgang; Montzka, Carsten; Schneider, Karl
2016-04-01
Leaf area index (LAI, single-sided leaf area per ground area) influences the mass and energy exchange of vegetated surfaces. LAI is therefore an input variable for many land surface schemes of coupled large-scale models, which do not simulate LAI themselves. Since these models typically run on rather coarse resolution grids, LAI is often inferred from coarse resolution remote sensing. However, especially in agriculturally used areas, a grid cell of these products often covers more than a single land use. In that case, the given LAI does not apply to any single land use. Therefore, the overall spatial heterogeneity in these datasets differs from that at resolutions high enough to distinguish areas with differing land use. Detailed process-based plant growth models simulate LAI for separate plant functional types or specific species. However, limited availability of observations reduces the spatial heterogeneity of model input data (soil, weather, land use). Since LAI is strongly heterogeneous in space and time, and since processes depend on LAI in a nonlinear way, a correct representation of LAI spatial heterogeneity is also desirable at coarse resolutions. The current study assesses this issue by comparing the spatial heterogeneity of LAI from remote sensing (RapidEye) and process-based simulations (DANUBIA simulation system) across scales. Spatial heterogeneity is assessed by analyzing LAI frequency distributions (spatial variability) and semivariograms (spatial structure). The test case is the arable land in the fertile loess plain of the Rur catchment near the Germany-Netherlands border.
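The semivariogram analysis of spatial structure mentioned above can be illustrated with a minimal 1-D empirical semivariogram, gamma(h) = 0.5 * mean[(z(x+h) - z(x))^2]. The transect values and function name are invented for illustration and do not reproduce the study's actual computation:

```python
import numpy as np

def semivariogram_1d(values, max_lag):
    """Empirical semivariogram of a 1-D transect:
    gamma(h) = 0.5 * mean((z(x+h) - z(x))**2) for each lag h."""
    return [0.5 * np.mean((values[h:] - values[:-h]) ** 2)
            for h in range(1, max_lag + 1)]

# Idealised LAI transect: alternating crops give strong variability at lag 1
# and none at lag 2 (every second field is the same crop)
lai = np.array([3.0, 1.0, 3.0, 1.0, 3.0, 1.0, 3.0, 1.0])
gamma = semivariogram_1d(lai, 2)  # gamma[0] == 2.0, gamma[1] == 0.0
```

A coarse-resolution product that averages over adjacent fields would flatten exactly this short-range structure, which is the heterogeneity loss the study quantifies.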
Creating A Nationwide Nonpartisan Initiative for Family Caregivers in Political Party Platforms.
Scribner, Ben; Lynn, Joanne; Walker, Victoria; Morgan, Les; Montgomery, Anne; Blair, Elizabeth; Baird, Davis; Goldschmidt, Barbara; Kirschenbaum, Naomi
2017-06-01
Policymakers have been slow to support family caregivers, and political agendas mostly fail to address the cost burdens, impact on employment and productivity, and other challenges in taking on long-term care tasks. This project set out to raise policymakers' awareness of family caregivers through proposals to Republican and Democratic party platforms during the 2016 political season. The Family Caregiver Platform Project (FCPP) reviewed the state party platform submission process for the Democratic and Republican parties in all 50 states and the District of Columbia. We built a website to make each process understandable by caregiver advocates. We designed model submissions to help volunteers tailor a proposal and recruited caregiver advocates participating in their state process. Finally, we mobilized a ground operation in many states and followed the progress of submissions in each state, as well as the formation of the national platforms. In 39 states, at least one party, Republican or Democratic, hosted a state party platform process. As of September 2016, FCPP volunteers had submitted proposals to 29 state parties in 22 states. Family caregiver language was added to eight state party platforms, one state party resolution, two bipartisan legislative resolutions, and one national party platform. The FCPP generated a nonpartisan grassroots effort to educate and motivate policymakers to address caregiving issues and solutions. Democratic party leaders provided more opportunities to connect with political leaders, with seven Democratic parties and one Republican party addressing family caregiver issues in their party platforms. © 2017, Copyright the Authors. Journal compilation © 2017, The American Geriatrics Society.
Energy Efficiency Collaboratives
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Michael; Bryson, Joe
2015-09-01
Collaboratives for energy efficiency have a long and successful history and are currently used, in some form, in more than half of the states. Historically, many state utility commissions have used some form of collaborative group process to resolve complex issues that emerge during a rate proceeding. Rather than debate the issues through the formality of a commission proceeding, disagreeing parties are sent to discuss issues in a less-formal setting and bring back resolutions to the commission. Energy efficiency collaboratives take this concept and apply it specifically to energy efficiency programs—often in anticipation of future issues as opposed to reacting to a present disagreement. Energy efficiency collaboratives can operate long term and can address the full suite of issues associated with designing, implementing, and improving energy efficiency programs. Collaboratives can be useful to gather stakeholder input on changing program budgets and program changes in response to performance or market shifts, as well as to provide continuity while regulators come and go, identify additional energy efficiency opportunities and innovations, assess the role of energy efficiency in new regulatory contexts, and draw on lessons learned and best practices from a diverse group. Details about specific collaboratives in the United States are in the appendix to this guide. Collectively, they demonstrate the value of collaborative stakeholder processes in producing successful energy efficiency programs.
Site characterization report for the basalt waste isolation project. Volume II
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1982-11-01
The reference location for a repository in basalt for the terminal storage of nuclear wastes on the Hanford Site and the candidate horizons within this reference repository location have been identified, and the preliminary characterization work in support of the site screening process has been completed. Fifteen technical questions regarding the qualification of the site were identified to be addressed during the detailed site characterization phase of the US Department of Energy-National Waste Terminal Storage Program site selection process. Resolution of these questions will be provided in the final site characterization progress report, currently planned to be issued in 1987, and in the safety analysis report to be submitted with the License Application. The additional information needed to resolve these questions and the plans for obtaining the information have been identified. This Site Characterization Report documents the results of the site screening process, the preliminary site characterization data, the technical issues that need to be addressed, and the plans for resolving these issues. Volume 2 contains chapters 6 through 12: geochemistry; surface hydrology; climatology, meteorology, and air quality; environmental, land-use, and socioeconomic characteristics; repository design; waste package; and performance assessment.
EUV lithography for 22nm half pitch and beyond: exploring resolution, LWR, and sensitivity tradeoffs
NASA Astrophysics Data System (ADS)
Putna, E. Steve; Younkin, Todd R.; Leeson, Michael; Caudillo, Roman; Bacuita, Terence; Shah, Uday; Chandhok, Manish
2011-04-01
The International Technology Roadmap for Semiconductors (ITRS) identifies Extreme Ultraviolet (EUV) lithography as a leading technology option for realizing the 22nm half-pitch node and beyond. According to recent assessments made at the 2010 EUVL Symposium, the readiness of EUV materials remains one of the top risk items for EUV adoption. The main development issue for EUV resists has been how to simultaneously achieve high resolution, high sensitivity, and low line width roughness (LWR). This paper describes our strategy, the current status of EUV materials, and the integrated post-development LWR reduction efforts made at Intel Corporation. Data collected utilizing Intel's Micro-Exposure Tool (MET) are presented in order to examine the feasibility of establishing a resist process that simultaneously exhibits ≤22 nm half-pitch (HP) L/S resolution at ≤11.3 mJ/cm2 with ≤3 nm LWR.
Multidimensional Processing and Visual Rendering of Complex 3D Biomedical Images
NASA Technical Reports Server (NTRS)
Sams, Clarence F.
2016-01-01
The proposed technology uses advanced image analysis techniques to maximize the resolution and utility of medical imaging methods being used during spaceflight. We utilize COTS technology for medical imaging, but our applications require higher resolution assessment of the medical images than is routinely applied with nominal system software. By leveraging advanced data reduction and multidimensional imaging techniques utilized in analysis of Planetary Sciences and Cell Biology imaging, it is possible to significantly increase the information extracted from the onboard biomedical imaging systems. Year 1 focused on application of these techniques to the ocular images collected on ground test subjects and ISS crewmembers. Focus was on the choroidal vasculature and the structure of the optic disc. Methods allowed for increased resolution and quantitation of structural changes enabling detailed assessment of progression over time. These techniques enhance the monitoring and evaluation of crew vision issues during space flight.
Clinical processes in behavioral couples therapy.
Fischer, Daniel J; Fink, Brandi C
2014-03-01
Behavioral couples therapy is a broad term for couples therapies that use behavioral techniques based on principles of operant conditioning, such as reinforcement. Behavioral shaping and rehearsal and acceptance are clinical processes found across contemporary behavioral couples therapies. These clinical processes are useful for assessment and case formulation, as well as teaching couples new methods of conflict resolution. Although these clinical processes assist therapists in achieving efficient and effective therapeutic change with distressed couples by rapidly stemming couples' corrosive affective exchanges, they also address the thoughts, emotions, and issues of trust and intimacy that are important aspects of the human experience in the context of a couple. Vignettes are provided to illustrate the clinical processes described. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
5 CFR 2424.31 - Resolution of disputed issues of material fact; hearings.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Resolution of disputed issues of material fact; hearings. 2424.31 Section 2424.31 Administrative Personnel FEDERAL LABOR RELATIONS AUTHORITY, GENERAL COUNSEL OF THE FEDERAL LABOR RELATIONS AUTHORITY AND FEDERAL SERVICE IMPASSES PANEL FEDERAL LABOR RELATIONS AUTHORITY AND GENERAL COUNSEL OF...
2009-06-16
resolution of disapproval is passed by Congress. Following the actual base closings and realignments, DOD develops an environmental remediation plan...examination of the BRAC process, For environmental remediation issues, see CRS Report RS21822, Military Base Closures: DOD’s 2005 Internal Selection...review and remediation . 2 Prior to the 1988 BRAC round, military installations were closed, or their missions were altered by order of the Secretary of
Patthoff, D E
1993-12-01
Ethics dialogue in this case is first used as a framework to initiate reflection on which forms of conflict resolution are appropriate in specific situations. This helps in planning and strategies, but does not guarantee what the outcome will actually be. Ethics dialogue, however, can also be used as a form of conflict resolution. For example, when the patient in the story wants to avoid revealing the names of her past dentists, an ethical framework could be presented that would respect her autonomy (an ethical term) and her right to privacy (a legal term), while still addressing your need to determine if the primary problem is of an ethical or dental nature, and if your role is to be that of a healing mediator or a healing dentist. This same form of conflict resolution could also be applied elsewhere in the story. For example, ethics dialogue would have been appropriate during the consultation between you and the endodontist, or between you and the patient, prior to the lawyer's formal request for the patient's records. It is difficult, however, for you to reduce conflict through an ethical dialogue once the lawyer requests information from you because, at that point, the adjudication process has already begun. The ethical reflection exercise will, however, help you negotiate through the adjudication process by providing a solid ethical reference point concerning conflict resolution. The February issue's ethics column will provide a framework for evaluating the forms of power available in conflict resolution in terms of justice.
48 CFR 33.214 - Alternative dispute resolution (ADR).
Code of Federal Regulations, 2011 CFR
2011-10-01
... resolution (ADR). 33.214 Section 33.214 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... dispute resolution (ADR). (a) The objective of using ADR procedures is to increase the opportunity for relatively inexpensive and expeditious resolution of issues in controversy. Essential elements of ADR include...
48 CFR 33.214 - Alternative dispute resolution (ADR).
Code of Federal Regulations, 2010 CFR
2010-10-01
... resolution (ADR). 33.214 Section 33.214 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... dispute resolution (ADR). (a) The objective of using ADR procedures is to increase the opportunity for relatively inexpensive and expeditious resolution of issues in controversy. Essential elements of ADR include...
Collaborative problem solving with a total quality model.
Volden, C M; Monnig, R
1993-01-01
A collaborative problem-solving system committed to the interests of those involved complies with the teachings of the total quality management movement in health care. Deming espoused that any quality system must become an integral part of routine activities. A process that is used consistently in dealing with problems, issues, or conflicts provides a mechanism for accomplishing total quality improvement. The collaborative problem-solving process described here results in quality decision-making. This model incorporates Ishikawa's cause-and-effect (fishbone) diagram, Moore's key causes of conflict, and the steps of the University of North Dakota Conflict Resolution Center's collaborative problem solving model.
JPEG XS call for proposals subjective evaluations
NASA Astrophysics Data System (ADS)
McNally, David; Bruylants, Tim; Willème, Alexandre; Ebrahimi, Touradj; Schelkens, Peter; Macq, Benoit
2017-09-01
In March 2016 the Joint Photographic Experts Group (JPEG), formally known as ISO/IEC SC29 WG1, issued a call for proposals soliciting compression technologies for a low-latency, lightweight and visually transparent video compression scheme. Within the JPEG family of standards, this scheme was denominated JPEG XS. The subjective evaluation of visually lossless compressed video sequences at high resolutions and bit depths poses particular challenges. This paper describes the adopted procedures, the subjective evaluation setup, the evaluation process and summarizes the obtained results which were achieved in the context of the JPEG XS standardization process.
Remote Sensing and Modeling of Landslides: Detection, Monitoring and Risk Evaluation
NASA Technical Reports Server (NTRS)
Kirschbaum, Dalia; Fukuoka, Hiroshi
2012-01-01
Landslides are one of the most pervasive hazards in the world, resulting in more fatalities and economic damage than is generally recognized. Occurring over an extensive range of lithologies, morphologies, hydrologies, and climates, mass movements can be triggered by intense or prolonged rainfall, seismicity, freeze/thaw processes, and anthropogenic activities, among other factors. The location, size, and timing of these processes are characteristically difficult to predict and assess because of their localized spatial scales, distribution, and complex interactions between rainfall infiltration, hydromechanical properties of the soil, and the underlying surface composition. However, the increased availability, accessibility, and resolution of remote sensing data offer a new opportunity to explore issues of landslide susceptibility, hazard, and risk over a variety of spatial scales. This special issue presents a series of papers that investigate the sources, behavior, and impacts of different mass movement types using a diverse set of data sources and evaluation methodologies.
Effective Tools for Conflict Resolution in Multicultural Teams in Industrial Enterprises
NASA Astrophysics Data System (ADS)
Videnová, Veronika; Beluský, Martin; Cagáňová, Dagmar; Čambál, Miloš
2012-12-01
The aim of this paper is to highlight the issue of resolving conflicts within multicultural teams in industrial enterprises. The authors build upon the concept of multiculturalism, which seeks possible ways to enable different cultures to coexist and means of communication between them. In the introduction, the authors explain the importance of increased attention and interest in the area of multiculturalism. Industrial enterprises nowadays are increasingly aware of this issue as they become more open to different cultures, are confronted with intensive international migration, and previously isolated societies become more pluralistic. As a result of these processes, individuals are more frequently in contact with members of different cultures.
Independent Orbiter Assessment (IOA) CIL issues resolution report, volume 3
NASA Technical Reports Server (NTRS)
1988-01-01
The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. This report contains IOA assessment worksheets showing resolution of outstanding IOA CIL issues that were summarized in the IOA FMEA/CIL Assessment Interim Report, dated 9 March 1988. Each assessment worksheet has been updated with CIL issue resolution and rationale. Volume 3 contains the worksheets for the Reaction Control Subsystem and the Communications and Tracking Subsystem.
Photoionization in the time and frequency domain
NASA Astrophysics Data System (ADS)
Isinger, M.; Squibb, R. J.; Busto, D.; Zhong, S.; Harth, A.; Kroon, D.; Nandi, S.; Arnold, C. L.; Miranda, M.; Dahlström, J. M.; Lindroth, E.; Feifel, R.; Gisselbrecht, M.; L'Huillier, A.
2017-11-01
Ultrafast processes in matter, such as electron emission after light absorption, can now be studied using ultrashort light pulses of attosecond duration (10^-18 seconds) in the extreme ultraviolet spectral range. The lack of spectral resolution due to the use of short light pulses has raised issues in the interpretation of the experimental results and in the comparison with theoretical calculations. We determine photoionization time delays in neon atoms over a 40-electron-volt energy range with an interferometric technique combining high temporal and spectral resolution. We spectrally disentangle direct ionization from ionization with shake-up, in which a second electron is left in an excited state, and obtain excellent agreement with theoretical calculations, thereby solving a puzzle raised by 7-year-old measurements.
Compressive Sensing Image Sensors-Hardware Implementation
Dadkhah, Mohammadreza; Deen, M. Jamal; Shirani, Shahram
2013-01-01
The compressive sensing (CS) paradigm uses simultaneous sensing and compression to provide an efficient image acquisition technique. The main advantages of the CS method include high resolution imaging using low resolution sensor arrays and faster image acquisition. Since the imaging philosophy in CS imagers is different from conventional imaging systems, new physical structures have been developed for cameras that use the CS technique. In this paper, a review of different hardware implementations of CS encoding in optical and electrical domains is presented. Considering the recent advances in CMOS (complementary metal–oxide–semiconductor) technologies and the feasibility of performing on-chip signal processing, important practical issues in the implementation of CS in CMOS sensors are emphasized. In addition, the CS coding for video capture is discussed. PMID:23584123
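As a rough illustration of the sensing model underlying CS imagers, the sketch below (plain NumPy, with invented sizes and a toy greedy recovery, not code from any of the reviewed sensors) takes m < n random linear measurements of a k-sparse signal and recovers it with orthogonal matching pursuit:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: n-dimensional signal with k nonzeros, m < n measurements.
n, m, k = 128, 60, 5
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

# Random Gaussian measurement matrix plays the role of the CS encoder
# (in a CS imager this mixing happens in the optical/electrical domain).
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
y = Phi @ x  # m compressive measurements instead of n samples

def omp(Phi, y, k):
    """Orthogonal matching pursuit: greedy sparse recovery."""
    residual, support = y.copy(), []
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        support.append(int(np.argmax(np.abs(Phi.T @ residual))))
        # Re-fit on the selected columns and update the residual.
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(Phi, y, k)
print(float(np.linalg.norm(x - x_hat)))
```

With enough measurements relative to the sparsity (here 60 for 5 nonzeros), the recovery error is numerically negligible, which is the effect that lets a low-resolution sensor array stand in for dense sampling.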
NASA Astrophysics Data System (ADS)
Gao, M.; Li, J.
2018-04-01
Geometric correction is an important preprocessing step in the application of GF4 PMS imagery. Geometric correction based on the manual selection of ground control points is time-consuming and laborious. The more common method, based on a reference image, is automatic image registration, which involves several steps and parameters. For the multi-spectral sensor GF4 PMS, it is necessary to identify the best combination of parameters and steps. This study mainly focuses on the following issues: the necessity of Rational Polynomial Coefficient (RPC) correction before automatic registration, the choice of base band for automatic registration, and the configuration of GF4 PMS spatial resolution.
Social Workers' Participation in the Resolution of Ethical Dilemmas in Hospice Care
ERIC Educational Resources Information Center
Csikai, Ellen L.
2004-01-01
Ethical dilemmas are inherent in every health care setting. A sample of hospice social workers with no direct access to a hospice ethics committee (N = 110) was surveyed regarding ethical issues in hospice care, how the issues were managed, and the extent to which social workers participated in resolution of ethical dilemmas. Common issues…
Finite element modeling of mass transport in high-Péclet cardiovascular flows
NASA Astrophysics Data System (ADS)
Hansen, Kirk; Arzani, Amirhossein; Shadden, Shawn
2016-11-01
Mass transport plays an important role in many cardiovascular processes, including thrombus formation and atherosclerosis. These mass transport problems are characterized by Péclet numbers of up to 10^8, leading to several numerical difficulties. The presence of thin near-wall concentration boundary layers requires very fine mesh resolution in these regions, while large concentration gradients within the flow cause numerical stabilization issues. In this work, we will discuss some guidelines for solving mass transport problems in cardiovascular flows using a stabilized Galerkin finite element method. First, we perform mesh convergence studies in a series of idealized and patient-specific geometries to determine the required near-wall mesh resolution for these types of problems, using both first- and second-order tetrahedral finite elements. Second, we investigate the use of several boundary condition types at outflow boundaries where backflow during some parts of the cardiac cycle can lead to convergence issues. Finally, we evaluate the effect of reducing the Péclet number by increasing mass diffusivity, as has been proposed by some researchers. This work was supported by the NSF GRFP and NSF CAREER Award #1354541.
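For orientation, the Péclet number quoted above is the ratio of advective to diffusive transport, Pe = UL/D. A minimal sketch with illustrative values (not taken from the paper) shows how protein- or platelet-scale diffusivities push cardiovascular mass transport into the 10^8 regime:

```python
def peclet(velocity_m_s, length_m, diffusivity_m2_s):
    """Mass-transport Peclet number Pe = U*L/D (advection vs. diffusion)."""
    return velocity_m_s * length_m / diffusivity_m2_s

# Illustrative (assumed, not from the paper): arterial flow at 0.1 m/s over
# a 1 cm segment, with a species diffusivity of ~1e-11 m^2/s.
Pe = peclet(0.1, 0.01, 1e-11)
print(f"{Pe:.0e}")  # → 1e+08
```

At Pe values this large, advection dominates everywhere except in a thin near-wall layer of thickness roughly L·Pe^(-1/3), which is exactly why the abstract's boundary-layer mesh refinement is needed.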
New reproductive technologies, ethics and legislation in Brazil: a delayed debate.
Guilhem, D
2001-06-01
This paper focuses on the debate about the utilization of new reproductive technologies in Brazil, and the paths taken in the Brazilian National Congress in an attempt to draw up legislation to regulate the clinical practice of human assisted reproduction. British documents, such as the Warnock Report and those of the Human Fertilisation and Embryology Authority (HFEA), are used for thorough reference. The analysis of the Law Projects in the National Congress, the Resolution by the Federal Medicine Council, Resolution 196/96 and documents by the Ministério Público (Public Prosecution Office) supplied the bases for the discussion. The principal question involved is the observation of different technical and moral orientations that influence the conduct of the issue in the legislative process. It is possible to observe that the main focus of the projects relates to the rights and interests of the children, to those possibly benefited by the technique, and to embryo reduction. Very little attention has been directed to the issues of sexual and reproductive rights and to the health of the women submitted to the new reproductive technologies.
NASA Astrophysics Data System (ADS)
Niri, Mohammad Emami; Lumley, David E.
2017-10-01
Integration of 3D and time-lapse 4D seismic data into reservoir modelling and history matching processes poses a significant challenge due to the frequent mismatch between the initial reservoir model, the true reservoir geology, and the pre-production (baseline) seismic data. A fundamental step of a reservoir characterisation and performance study is the preconditioning of the initial reservoir model to equally honour both the geological knowledge and seismic data. In this paper we analyse the issues that have a significant impact on the (mis)match of the initial reservoir model with well logs and inverted 3D seismic data. These issues include the constraining methods for reservoir lithofacies modelling, the sensitivity of the results to the presence of realistic resolution and noise in the seismic data, the geostatistical modelling parameters, and the uncertainties associated with quantitative incorporation of inverted seismic data in reservoir lithofacies modelling. We demonstrate that in a geostatistical lithofacies simulation process, seismic constraining methods based on seismic litho-probability curves and seismic litho-probability cubes yield the best match to the reference model, even when realistic resolution and noise is included in the dataset. In addition, our analyses show that quantitative incorporation of inverted 3D seismic data in static reservoir modelling carries a range of uncertainties and should be cautiously applied in order to minimise the risk of misinterpretation. These uncertainties are due to the limited vertical resolution of the seismic data compared to the scale of the geological heterogeneities, the fundamental instability of the inverse problem, and the non-unique elastic properties of different lithofacies types.
Conflict on interprofessional primary health care teams--can it be resolved?
Brown, Judith; Lewis, Laura; Ellis, Kathy; Stewart, Moira; Freeman, Thomas R; Kasperski, M Janet
2011-01-01
Increasingly, primary health care teams (PHCTs) depend on the contributions of multiple professionals. However, conflict is inevitable on teams. This article examines PHCTs members' experiences with conflict and responses to conflict. This phenomenological study was conducted using in-depth interviews with 121 participants from 16 PHCTs (10 urban and 6 rural) including a wide range of health care professionals. An iterative analysis process was used to examine the verbatim transcripts. The analysis revealed three main themes: sources of team conflict; barriers to conflict resolution; and strategies for conflict resolution. Sources of team conflict included: role boundary issues; scope of practice; and accountability. Barriers to conflict resolution were: lack of time and workload; people in less powerful positions; lack of recognition or motivation to address conflict; and avoiding confrontation for fear of causing emotional discomfort. Team strategies for conflict resolution included interventions by team leaders and the development of conflict management protocols. Individual strategies included: open and direct communication; a willingness to find solutions; showing respect; and humility. Conflict is inherent in teamwork. However, understanding the potential barriers to conflict resolution can assist PHCTs in developing strategies to resolve conflict in a timely fashion.
Border effect-based precise measurement of any frequency signal
NASA Astrophysics Data System (ADS)
Bai, Li-Na; Ye, Bo; Xuan, Mei-Na; Jin, Yu-Zhen; Zhou, Wei
2015-12-01
Limited detection resolution leads to fuzzy areas during measurement, and discriminating the border of a fuzzy area makes it possible to exploit the stability of the resolution. In this way, measurement precision is greatly improved; hence this phenomenon is named the border effect. The resolution fuzzy area and its application should be studied to realize high-resolution measurement. During the measurement of an arbitrary frequency signal, the fuzzy areas of phase-coincidence detection are always discrete and irregular. In this paper the difficulty in capturing the border information of discrete fuzzy areas is overcome and extra-high-resolution measurement is implemented. Measurement precision for any frequency signal can easily reach better than 1 × 10^-11/s over a wide range of frequencies, showing the great importance of the border effect. An in-depth study of this issue has great significance for frequency standard comparison, signal processing, telecommunication, and fundamental subjects. Project supported by the National Natural Science Foundation of China (Grant Nos. 10978017 and 61201288), the Natural Science Foundation of Research Plan Projects of Shaanxi Province, China (Grant No. 2014JM2-6128), and the Sino-Poland Science and Technology Cooperation Projects (Grant No. 36-33).
There are two classes of statistical issues: firm issues, amenable to problem statement and technical resolution, and soft issues, which have qualitative dimensions and ideological implications. Firm issues are easy: defining and stating the problem is much of the solution. The soft is...
NASA Astrophysics Data System (ADS)
Ichimura, Koji; Hikichi, Ryugo; Harada, Saburo; Kanno, Koichi; Kurihara, Masaaki; Hayashi, Naoya
2017-04-01
Nanoimprint lithography (NIL) is attracting much attention as one of the strongest candidates for the next-generation lithography for semiconductors. This technology needs no pattern data modification for exposure and a simpler exposure system, offers single-step patterning without any coat/develop track, and has the potential for cost-effective patterning compared with very complex optical lithography and/or EUV lithography. NIL working templates are made by replication of EB-written high-quality master templates. Fabrication of high-resolution master templates is one of the most important issues. Since NIL is a 1:1 pattern transfer process, master templates require 4 times higher resolution than photomasks. Another key is to maintain the quality of the master templates in the replication process. The NIL process is applied for template replication, and this imprint process determines most of the performance of the replicated templates. Expectations for NIL cover not only high-resolution lines and spaces but also the contact-hole layer application. Conventional ArF-i lithography has a certain limit in size and pitch for contact-hole fabrication. On the other hand, NIL has good pattern fidelity for contact-hole fabrication at smaller sizes and pitches compared with conventional optical lithography. Regarding the tone of the templates for contact holes, both tones are possible, the hole template and the pillar template, depending on the processes on the wafer side. We have succeeded in fabricating both types of templates at 2x nm in size. In this presentation, we discuss the fabrication of our replica template for the contact-hole layer application. Both tones of template fabrication are presented, as well as the performance of the replica templates. We also discuss the resolution improvement of the hole master templates by using various e-beam exposure technologies.
XFEL diffraction: Developing processing methods to optimize data quality
Sauter, Nicholas K.
2015-01-29
Serial crystallography, using either femtosecond X-ray pulses from free-electron laser sources or short synchrotron-radiation exposures, has the potential to reveal metalloprotein structural details while minimizing damage processes. However, deriving a self-consistent set of Bragg intensities from numerous still-crystal exposures remains a difficult problem, with optimal protocols likely to be quite different from those well established for rotation photography. Here several data processing issues unique to serial crystallography are examined. It is found that the limiting resolution differs for each shot, an effect that is likely to be due to both sample heterogeneity and pulse-to-pulse variation in experimental conditions. Shots with lower resolution limits produce lower-quality models for predicting Bragg spot positions during the integration step. Also, still shots by their nature record only partial measurements of the Bragg intensity. An approximate model that corrects to the full-spot equivalent (with the simplifying assumption that the X-rays are monochromatic) brings the distribution of intensities closer to that expected from an ideal crystal, and improves the sharpness of anomalous difference Fourier peaks indicating metal positions.
Improving the detection of cocoa bean fermentation-related changes using image fusion
NASA Astrophysics Data System (ADS)
Ochoa, Daniel; Criollo, Ronald; Liao, Wenzhi; Cevallos-Cevallos, Juan; Castro, Rodrigo; Bayona, Oswaldo
2017-05-01
Complex chemical processes occur during cocoa bean fermentation. To select well-fermented beans, experts take a sample of beans, cut them in half, and visually check their color. Farmers often mix high- and low-quality beans; chocolate properties are therefore difficult to control. In this paper, we explore how close-range hyperspectral (HS) data can be used to characterize the fermentation process of two types of cocoa beans (CCN51 and National). Our aim is to find spectral differences that allow bean classification. The main issue is to extract reliable spectral data, as openings resulting from the loss of water during fermentation can cover up to 40% of the bean surface. We exploit HS pan-sharpening techniques to increase the spatial resolution of HS images and filter out uneven surface regions, in particular the guided filter PCA approach, which has proved suitable for using high-resolution RGB data as the guide image. Our preliminary results show that this pre-processing step improves the separability of the classes corresponding to each fermentation stage compared to using the average spectrum of the bean surface.
20 CFR 627.481 - Audit resolution.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Audit resolution. 627.481 Section 627.481... PROGRAMS UNDER TITLES I, II, AND III OF THE ACT Administrative Standards § 627.481 Audit resolution. (a) Federal audit resolution. When the OIG issues an audit report to the Employment and Training...
Enhancing resolution in coherent x-ray diffraction imaging.
Noh, Do Young; Kim, Chan; Kim, Yoonhee; Song, Changyong
2016-12-14
Achieving a resolution near 1 nm is a critical issue in coherent x-ray diffraction imaging (CDI) for applications in materials science and biology. Despite the various advantages of CDI based on synchrotrons and newly developed x-ray free-electron lasers, its applications would be limited without improving resolution well below 10 nm. Here, we review the issues and efforts in improving CDI resolution, including various methods for resolution determination. Enhancing the diffraction signal at large diffraction angles, with the aid of interference between neighboring strong scatterers or templates, is reviewed and discussed in terms of increasing the signal-to-noise ratio. In addition, we discuss errors in image reconstruction algorithms, caused by the discreteness of the Fourier transformations involved, which degrade the spatial resolution, and suggest ways to correct them. We expect this review to be useful for applications of CDI in imaging weakly scattering soft matter using coherent x-ray sources, including x-ray free-electron lasers.
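The half-period resolution at stake here is fixed by the largest diffraction angle carrying usable signal, d = λ/(2 sin θ_max), which is why the review emphasizes boosting signal at high angles. A hedged sketch with an assumed flat-detector geometry (the wavelength, detector size, and distance are invented for illustration):

```python
import math

def cdi_resolution_nm(wavelength_nm, detector_half_width_m, distance_m):
    """Half-period resolution set by the largest recorded diffraction angle:
    d = lambda / (2 * sin(theta_max)), theta_max from detector edge geometry."""
    theta_max = math.atan(detector_half_width_m / distance_m)
    return wavelength_nm / (2.0 * math.sin(theta_max))

# Illustrative numbers (not from the review): 0.15 nm x-rays, detector
# half-width 0.1 m, sample-to-detector distance 1.5 m.
print(round(cdi_resolution_nm(0.15, 0.1, 1.5), 2))
```

Doubling the recorded angular range roughly halves d, provided the weak high-angle speckle rises above the noise floor, which is the signal-to-noise problem the review addresses.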
Negative-tone imaging with EUV exposure toward 13nm hp
NASA Astrophysics Data System (ADS)
Tsubaki, Hideaki; Nihashi, Wataru; Tsuchihashi, Toru; Yamamoto, Kei; Goto, Takahiro
2016-03-01
Negative-tone imaging (NTI) with EUV exposure has major advantages with respect to line-width roughness (LWR) and resolution, due in part to limited polymer swelling and favorable dissolution mechanics. In the NTI process, both the resist and the organic solvents play important roles in determining lithography performance. The present study describes novel chemically amplified resist materials based on NTI technology with EUV using specific organic solvents. Lithographic performances of the NTI process are described in this paper under exposures using an ASML NXE:3300 EUV scanner at imec. It is emphasized that 14 nm hp was nicely resolved at an exposure dose of 37 mJ/cm2 without any bridging or collapse, which is attributed to the low-swelling character of the NTI process. Although 13 nm hp resolution was potentially obtained, pattern collapse still restricts resolution when the coated resist film thickness is 40 nm. The dark-mask limitation, due mainly to mask defectivity issues, makes NTI with EUV a favorable approach for printing the block mask used to produce logic circuits. A good resolution of CD-X 21 nm/CD-Y 32 nm was obtained for the block mask pattern using NTI, with a usable process window and a dose of 49 mJ/cm2. Minimum resolution now reaches CD-X 17 nm/CD-Y 23 nm for the block. A 21 nm block mask resolution was not affected by exposure dose and was explored down to a low dose of 18 mJ/cm2 by reducing the quencher loading. In addition, there was a negligible increase in LCDU for isolated dot patterns when decreasing the exposure dose from 66 mJ/cm2 to 24 mJ/cm2. On the other hand, a trade-off relationship appeared between LCDU and dose for dense dot patterns, indicating a photon shot-noise restriction but a strong dependency on patterning features. A design to improve acid-generation efficiency is described, based on the acid-generation mechanism in traditional chemically amplified materials, which contain a photo-acid generator (PAG) and polymer.
A conventional EUV absorber composed of organic compounds is expected, based on calculation, to have 1.6 times higher EUV absorption than polyhydroxystyrene. However, the observed acid amount was comparable to, or significantly worse than, that of polyhydroxystyrene.
Biomimetic machine vision system.
Harman, William M; Barrett, Steven F; Wright, Cameron H G; Wilcox, Michael
2005-01-01
Real-time application of digital imaging for use in machine vision systems has proven to be prohibitive when used within control systems that employ low-power single processors without compromising the scope of vision or the resolution of captured images. Development of a real-time analog machine vision system is the focus of research taking place at the University of Wyoming. This new vision system is based upon the biological vision system of the common house fly. Development of a single sensor is accomplished, representing a single facet of the fly's eye. This new sensor is then incorporated into an array of sensors capable of detecting objects and tracking motion in 2-D space. This system "preprocesses" incoming image data, resulting in minimal data processing to determine the location of a target object. Due to the nature of the sensors in the array, hyperacuity is achieved, thereby eliminating resolution issues found in digital vision systems. In this paper, we will discuss the biological traits of the fly eye and the specific traits that led to the development of this machine vision system. We will also discuss the process of developing an analog-based sensor that mimics the characteristics of interest in the biological vision system. This paper will conclude with a discussion of how an array of these sensors can be applied toward solving real-world machine vision issues.
BOREAS TE-18, 30-m, Radiometrically Rectified Landsat TM Imagery
NASA Technical Reports Server (NTRS)
Hall, Forrest G. (Editor); Knapp, David
2000-01-01
The BOREAS TE-18 team used a radiometric rectification process to produce standardized DN values for a series of Landsat TM images of the BOREAS SSA and NSA in order to compare images that were collected under different atmospheric conditions. The images for each study area were referenced to an image that had very clear atmospheric qualities. The reference image for the SSA was collected on 02-Sep-1994, while the reference image for the NSA was collected on 21-Jun-1995. The 23 rectified images cover the period from 07-Jul-1985 to 18-Sep-1994 in the SSA and from 22-Jun-1984 to 09-Jun-1994 in the NSA. Each of the reference scenes had coincident atmospheric optical thickness measurements made by RSS-11. The radiometric rectification process is described in more detail by Hall et al. (1991). The original Landsat TM data were received from CCRS for use in the BOREAS project. The data are stored in binary image-format files. Due to the nature of the radiometric rectification process and copyright issues, these full-resolution images may not be publicly distributed. However, a spatially degraded 60-m resolution version of the images is available on the BOREAS CD-ROM series. See Sections 15 and 16 for information about how to possibly acquire the full-resolution data. Information about the full-resolution images is provided in an inventory listing on the CD-ROMs. The data files are available on a CD-ROM (see document number 20010000884), or from the Oak Ridge National Laboratory (ORNL) Distributed Activity Archive Center (DAAC).
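Radiometric rectification of the kind cited above reduces, in essence, to a per-band linear mapping of subject-scene digital numbers onto the reference scene, estimated from pseudo-invariant dark and bright targets visible in both images. A minimal sketch with made-up DN values (the actual procedure selects its control sets far more carefully than this):

```python
import numpy as np

# Hypothetical DN values of pseudo-invariant targets (dark water through
# bright bare soil) observed in both the reference and the subject scenes.
ref_dn     = np.array([12.0, 35.0, 80.0, 140.0, 190.0])
subject_dn = np.array([20.0, 48.0, 105.0, 180.0, 240.0])

# Fit DN_ref ≈ gain * DN_subject + offset, then apply to the whole band
# so scenes from different dates share a common radiometric scale.
gain, offset = np.polyfit(subject_dn, ref_dn, 1)
rectified = gain * subject_dn + offset
print(np.allclose(rectified, ref_dn, atol=2.0))  # → True
```

In practice the same gain and offset would be applied to every pixel of the subject band, which is what makes multi-date image comparison meaningful without full atmospheric correction.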
Negative-tone development of photoresists in environmentally friendly silicone fluids
NASA Astrophysics Data System (ADS)
Ouyang, Christine Y.; Lee, Jin-Kyun; Ober, Christopher K.
2012-03-01
The large amount of organic solvents and chemicals used in today's microelectronic fabrication process can lead to environmental, health and safety hazards. It is therefore necessary to design new materials and new processes to reduce the environmental impact of the lithographic process. In addition, as feature sizes decrease, other issues such as pattern collapse, which is related to the undesirably high surface tension of the developers and rinse liquids, can occur and limit the resolution. In order to solve these issues, silicone fluids are chosen as alternative developing solvents in this paper. Silicone fluids, also known as linear methyl siloxanes, are a class of mild, non-polar solvents that are non-toxic, not ozone-depleting, and contribute little to global warming. They are considered promising developers because of their environmental friendliness and their unique physical properties, such as low viscosity and low surface tension. Recently, there has been emerging interest in negative-tone development (NTD) due to its better ability to print contact holes and trenches. It is also found that the performance of negative-tone development is closely related to the developing solvents. Silicone fluids are thus promising developers for NTD because of their non-polar nature, and high-contrast negative-tone images are expected with chemically amplified photoresists due to the high chemical contrast of chemical amplification. We have previously shown some successful NTD with conventional photoresists such as ESCAP in silicone fluids. In this paper, another commercially available TOK resist was utilized to study the NTD process in silicone fluids. Because small and non-polar molecules are intrinsically soluble in silicone fluids, we have designed a molecular glass resist for silicone fluids. Due to the low surface tension of silicone fluids, we are able to achieve high aspect-ratio, high-resolution patterns without pattern collapse.
NASA Astrophysics Data System (ADS)
Pourteau, Marie-Line; Servin, Isabelle; Lepinay, Kévin; Essomba, Cyrille; Dal'Zotto, Bernard; Pradelles, Jonathan; Lattard, Ludovic; Brandt, Pieter; Wieland, Marco
2016-03-01
The emerging Massively Parallel Electron-Beam Direct Write (MP-EBDW) is an attractive high-resolution, high-throughput lithography technology. As previously shown, Chemically Amplified Resists (CARs) meet process/integration specifications in terms of dose-to-size, resolution, contrast, and energy latitude. However, they are still limited by their line width roughness. To overcome this issue, we tested an alternative advanced non-CAR and showed it brings a substantial gain in sensitivity compared to CAR. We also implemented and assessed in-line post-lithographic treatments for roughness mitigation. For outgassing-reduction purposes, a top-coat layer is added to the total process stack. A new-generation top-coat was tested and showed improved printing performance compared to the previous product, especially avoiding dark erosion: SEM cross-sections showed a straight pattern profile. A spin-coatable charge dissipation layer based on conductive polyaniline has also been tested for conductivity and lithographic performance, and compatibility experiments revealed that the underlying resist type has to be carefully chosen when using this product. Finally, the Process Of Reference (POR) trilayer stack defined for 5 kV multi-e-beam lithography was successfully etched with well-opened and straight patterns, and no lithography-etch bias.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lyons, E.A.
This dissertation is an evaluation of the use of negotiations in the rule-making context at the US Environmental Protection Agency (EPA). The goal is to assess the benefits and the limitations of negotiation as a policy process, and to make explicit the values which are expected from a negotiation process as well as the conditions which must be met in order for those values to be realized. Three distinct values are expected of negotiation processes: (1) negotiation is promoted as an efficient process that can save time and money in public decision making by avoiding protracted and expensive legal actions; (2) it is expected that a negotiation process which provides a mechanism for reaching accommodation among all competing perspectives can yield good policy outcomes; (3) face-to-face interactions among parties with competing interests should provide opportunities for building better relationships among individuals and also for building community. The usefulness of negotiation as a policy tool is limited by the fact that negotiation is only appropriate in a few select cases in which the issues are mature and the parties affected by the issues are prepared for negotiation.
Multiscale soil moisture estimates using static and roving cosmic-ray soil moisture sensors
NASA Astrophysics Data System (ADS)
McJannet, David; Hawdon, Aaron; Baker, Brett; Renzullo, Luigi; Searle, Ross
2017-12-01
Soil moisture plays a critical role in land surface processes and as such there has been a recent increase in the number and resolution of satellite soil moisture observations and the development of land surface process models with ever increasing resolution. Despite these developments, validation and calibration of these products has been limited because of a lack of observations on corresponding scales. A recently developed mobile soil moisture monitoring platform, known as the rover, offers opportunities to overcome this scale issue. This paper describes methods, results and testing of soil moisture estimates produced using rover surveys on a range of scales that are commensurate with model and satellite retrievals. Our investigation involved static cosmic-ray neutron sensors and rover surveys across both broad (36 × 36 km at 9 km resolution) and intensive (10 × 10 km at 1 km resolution) scales in a cropping district in the Mallee region of Victoria, Australia. We describe approaches for converting rover survey neutron counts to soil moisture and discuss the factors controlling soil moisture variability. We use independent gravimetric and modelled soil moisture estimates collected across both space and time to validate rover soil moisture products. Measurements revealed that temporal patterns in soil moisture were preserved through time, and regression modelling approaches were utilised to produce time series of property-scale soil moisture which may also have applications in calibration and validation studies or local farm management. Intensive-scale rover surveys produced reliable soil moisture estimates at 1 km resolution while broad-scale surveys produced soil moisture estimates at 9 km resolution. We conclude that the multiscale soil moisture products produced in this study are well suited to future analysis of satellite soil moisture retrievals and finer-scale soil moisture models.
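The count-to-moisture conversion step described above can be illustrated with the widely used Desilets et al. (2010) calibration function for cosmic-ray neutron sensors. This is a sketch only: the count rates, N0 calibration constant, and bulk density below are hypothetical values for illustration, and the study's own conversion procedure may differ.

```python
# Illustrative sketch of the standard Desilets et al. (2010) calibration
# commonly used to convert corrected cosmic-ray neutron count rates to
# volumetric soil moisture. All numeric inputs here are hypothetical.

def neutron_counts_to_soil_moisture(n, n0, bulk_density=1.4,
                                    a0=0.0808, a1=0.372, a2=0.115):
    """Return volumetric soil moisture (m^3/m^3) from a neutron count rate.

    n  : corrected neutron count rate (e.g. counts per hour)
    n0 : count rate over dry soil at the same site (calibration constant)
    """
    return (a0 / (n / n0 - a1) - a2) * bulk_density

# Wetter soil moderates more neutrons, so the count rate drops as moisture rises.
dry = neutron_counts_to_soil_moisture(2800, n0=3000)   # high counts -> dry
wet = neutron_counts_to_soil_moisture(2000, n0=3000)   # low counts  -> wet
```

The inverse relationship between counts and moisture is the key physical fact the rover exploits: a single calibrated N0 per survey domain turns a neutron count map directly into a soil moisture map.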
Pixel-based OPC optimization based on conjugate gradients.
Ma, Xu; Arce, Gonzalo R
2011-01-31
Optical proximity correction (OPC) methods are resolution enhancement techniques (RET) used extensively in the semiconductor industry to improve the resolution and pattern fidelity of optical lithography. In pixel-based OPC (PBOPC), the mask is divided into small pixels, each of which is modified during the optimization process. Two critical issues in PBOPC are the required computational complexity of the optimization process, and the manufacturability of the optimized mask. Most current OPC optimization methods apply the steepest descent (SD) algorithm to improve image fidelity augmented by regularization penalties to reduce the complexity of the mask. Although simple to implement, the SD algorithm converges slowly. The existing regularization penalties, however, fall short in meeting the mask rule check (MRC) requirements often used in semiconductor manufacturing. This paper focuses on developing OPC optimization algorithms based on the conjugate gradient (CG) method which exhibits much faster convergence than the SD algorithm. The imaging formation process is represented by the Fourier series expansion model which approximates the partially coherent system as a sum of coherent systems. In order to obtain more desirable manufacturability properties of the mask pattern, a MRC penalty is proposed to enlarge the linear size of the sub-resolution assistant features (SRAFs), as well as the distances between the SRAFs and the main body of the mask. Finally, a projection method is developed to further reduce the complexity of the optimized mask pattern.
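The paper's central convergence argument, that conjugate gradients (CG) beats steepest descent (SD) on the same quadratic-like cost, can be illustrated on a toy quadratic. This is a sketch of the two generic algorithms only; the actual PBOPC cost uses the partially coherent imaging model and an MRC penalty, neither of which is reproduced here.

```python
import numpy as np

# Toy comparison of steepest descent vs. conjugate gradients on a
# quadratic cost 0.5*x'Qx - c'x (a stand-in for the image-fidelity term;
# Q plays the role of the Hessian of the real PBOPC objective).

def steepest_descent(Q, c, x, iters):
    for _ in range(iters):
        r = c - Q @ x                     # negative gradient
        alpha = (r @ r) / (r @ Q @ r)     # exact line search
        x = x + alpha * r
    return x

def conjugate_gradient(Q, c, x, iters):
    r = c - Q @ x
    p = r.copy()
    for _ in range(iters):
        if r @ r < 1e-30:                 # already converged
            break
        Qp = Q @ p
        alpha = (r @ r) / (p @ Qp)
        x = x + alpha * p
        r_new = r - alpha * Qp
        beta = (r_new @ r_new) / (r @ r)  # Fletcher-Reeves update
        p = r_new + beta * p
        r = r_new
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
Q = A.T @ A + 0.1 * np.eye(20)            # symmetric positive definite
c = rng.standard_normal(20)
x0 = np.zeros(20)

# CG solves an n-dimensional SPD system in at most n iterations (up to
# round-off); SD with the same budget typically has not converged.
x_cg = conjugate_gradient(Q, c, x0, 20)
x_sd = steepest_descent(Q, c, x0, 20)
```

On ill-conditioned Hessians, which large pixelated masks produce, the gap between the two methods widens further, which is the practical motivation for the paper's choice of CG.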
The judicial role in life-sustaining medical treatment decisions.
Hafemeister, T L; Keilitz, I; Banks, S M
1991-01-01
Although there has been speculation regarding the pervasiveness and nature of judicial decisions regarding life-sustaining medical treatment (LSMT), no attempt has been made to empirically assess their prevalence or the issues they address. An exploratory study utilizing a mail survey of a nationwide random sample (N = 905) of state trial court judges was conducted to provide initial information regarding this decision-making process. Twenty-two percent of the responding judges had heard at least one LSMT case, and judicial review did not appear endemic to particular states. The number of judges hearing LSMT cases dropped from 1975 to 1981 but has increased since then. Three major issues predominate: patient competency, appointment of a surrogate decisionmaker, and resolution of the ultimate issue of forgoing LSMT. Relatively few cases either contested a prior directive's validity or involved imposing sanctions for instituting or forgoing LSMT. Although subject to different interpretations, the results suggest the courts are having a significant impact on certain aspects of the LSMT decision-making process. However, the infrequency with which any one judge is called upon to make an LSMT decision causes concern about the judiciary's ability to respond in a timely and appropriate manner. With their potential for a profound effect on the actions of health care providers, greater attention to this decision-making process is warranted.
Foreman, Brady Z; Straub, Kyle M
2017-09-01
Terrestrial paleoclimate records rely on proxies hosted in alluvial strata whose beds are deposited by unsteady and nonlinear geomorphic processes. It is broadly assumed that this renders the resultant time series of terrestrial paleoclimatic variability noisy and incomplete. We evaluate this assumption using a model of oscillating climate and the precise topographic evolution of an experimental alluvial system. We find that geomorphic stochasticity can create aliasing in the time series and spurious climate signals, but these issues are eliminated when the period of climate oscillation is longer than a key time scale of internal dynamics in the geomorphic system. This emergent autogenic geomorphic behavior imparts regularity to deposition and represents a natural discretization interval of the continuous climate signal. We propose that this time scale in nature could be in excess of 10^4 years but would still allow assessments of the rates of climate change at resolutions finer than the existing age model techniques in isolation.
Foreman, Brady Z.; Straub, Kyle M.
2017-01-01
Terrestrial paleoclimate records rely on proxies hosted in alluvial strata whose beds are deposited by unsteady and nonlinear geomorphic processes. It is broadly assumed that this renders the resultant time series of terrestrial paleoclimatic variability noisy and incomplete. We evaluate this assumption using a model of oscillating climate and the precise topographic evolution of an experimental alluvial system. We find that geomorphic stochasticity can create aliasing in the time series and spurious climate signals, but these issues are eliminated when the period of climate oscillation is longer than a key time scale of internal dynamics in the geomorphic system. This emergent autogenic geomorphic behavior imparts regularity to deposition and represents a natural discretization interval of the continuous climate signal. We propose that this time scale in nature could be in excess of 10^4 years but would still allow assessments of the rates of climate change at resolutions finer than the existing age model techniques in isolation. PMID:28924607
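The aliasing mechanism these abstracts invoke is the standard signal-processing one: when deposition samples a climate oscillation at intervals longer than half its period, the oscillation is mapped onto a spurious lower frequency. A toy numerical sketch (the periods and sampling steps are arbitrary illustrative numbers, not values from the experiments):

```python
import numpy as np

# Sampling a periodic 'climate' signal below the Nyquist rate creates a
# spurious long-period signal, the aliasing effect discussed above.

def dominant_period(signal, dt):
    """Return the period of the strongest nonzero-frequency component."""
    spec = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=dt)
    k = 1 + np.argmax(spec[1:])           # skip the zero-frequency bin
    return 1.0 / freqs[k]

true_period = 10.0                         # 'climate' oscillation period
t_dense = np.arange(0, 400, 1.0)           # frequent deposition events
t_sparse = np.arange(0, 400, 8.0)          # sparse deposition: dt > period/2

dense = np.sin(2 * np.pi * t_dense / true_period)
sparse = np.sin(2 * np.pi * t_sparse / true_period)

p_dense = dominant_period(dense, 1.0)      # recovers the 10-unit period
p_sparse = dominant_period(sparse, 8.0)    # aliased to a much longer period
```

The paper's result can be read in these terms: once the autogenic time scale (the effective sampling interval) is shorter than half the climate period, the recovered period is faithful; otherwise spurious signals appear.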
High resolution climate scenarios for snowmelt modelling in small alpine catchments
NASA Astrophysics Data System (ADS)
Schirmer, M.; Peleg, N.; Burlando, P.; Jonas, T.
2017-12-01
Snow in the Alps is affected by climate change with regard to duration, timing, and amount. This has implications for important societal issues such as drinking water supply and hydropower generation. In Switzerland, the latter has received a lot of attention following the political decision to phase out nuclear electricity production. An increasing number of authorization requests for small hydropower plants located in small alpine catchments has been observed in recent years. This situation generates ecological conflicts, while the expected climate change poses a threat to water availability, thus putting investments in such hydropower plants at risk. Reliable high-resolution climate scenarios are therefore required that account for small-scale processes, in order to achieve realistic predictions of snowmelt runoff and its variability in small alpine catchments. We therefore used a novel model chain coupling a stochastic two-dimensional weather generator (AWE-GEN-2d) with a state-of-the-art energy balance snow cover model (FSM). AWE-GEN-2d was applied to generate ensembles of climate variables at very fine temporal and spatial resolution, providing all climatic input variables required for the energy balance modelling. The land-surface model FSM was used to describe spatially variable snow cover accumulation and melt processes. FSM was refined to allow applications at very high spatial resolution by specifically accounting for small-scale processes, such as a subgrid parametrization of snow-covered area and an improved representation of forest-snow processes. For the present study, the model chain was tested under current climate conditions using extensive observational datasets of different spatial and temporal coverage. Small-scale spatial processes such as elevation gradients and aspect differences in the snow distribution were evaluated using airborne LiDAR data. Forty years of monitoring data for snow water equivalent, snowmelt, and snow-covered area for the whole of Switzerland were used to verify snow distribution patterns at coarser spatial and temporal scales. The ability of the model chain to reproduce current climate conditions in small alpine catchments makes this model combination an outstanding candidate for producing high-resolution climate scenarios of snowmelt in small alpine catchments.
NASA Astrophysics Data System (ADS)
Brasseur, P.; Verron, J. A.; Djath, B.; Duran, M.; Gaultier, L.; Gourdeau, L.; Melet, A.; Molines, J. M.; Ubelmann, C.
2014-12-01
The upcoming high-resolution SWOT altimetry satellite will provide an unprecedented description of the ocean dynamic topography for studying sub- and mesoscale processes in the ocean, but there is still much uncertainty about the signal that will be observed. Many scientific questions remain unresolved about the observability of very-high-resolution altimetry and about the dynamical role of ocean meso- and submesoscales. In addition, SWOT data will raise specific problems due to the size of the data flows. These issues will probably impact the data assimilation approaches used in future scientific or operational oceanography applications. In this work, we propose to use a high-resolution numerical model of the Western Pacific Solomon Sea as a regional laboratory to explore such observability and dynamical issues, as well as the new data assimilation challenges raised by SWOT. The Solomon Sea connects subtropical water masses to equatorial ones through the low-latitude western boundary currents and could potentially modulate the tropical Pacific climate. In the South Western Pacific, the Solomon Sea exhibits very intense eddy kinetic energy levels, while relatively little is known about mesoscale and submesoscale activity in this region. The complex bathymetry of the region, complicated by the presence of narrow straits and numerous islands, raises specific challenges. So far, a Solomon Sea model configuration has been set up at 1/36° resolution. Numerical simulations have been performed to explore the meso- and submesoscale dynamics. The numerical solutions, which have been validated against available in situ data, show the development of small-scale features: eddies, fronts, and filaments. Spectral analysis reveals a behavior that is consistent with SQG theory. There is clear evidence of an energy cascade from the small scales, including the submesoscales, although those submesoscales are only partially resolved by the model.
In parallel, investigations have been conducted using image assimilation approaches in order to explore the richness of high-resolution altimetry missions. These investigations illustrate the potential benefit of combining tracer fields (SST, SSS and spiciness) with high-resolution SWOT data to estimate the fine-scale circulation.
A Large Scale Code Resolution Service Network in the Internet of Things
Yu, Haining; Zhang, Hongli; Fang, Binxing; Yu, Xiangzhan
2012-01-01
In the Internet of Things a code resolution service provides a discovery mechanism for a requester to obtain the information resources associated with a particular product code immediately. In large scale application scenarios a code resolution service faces some serious issues involving heterogeneity, big data and data ownership. A code resolution service network is required to address these issues. Firstly, a list of requirements for the network architecture and code resolution services is proposed. Secondly, in order to eliminate code resolution conflicts and code resolution overloads, a code structure is presented to create a uniform namespace for code resolution records. Thirdly, we propose a loosely coupled distributed network consisting of heterogeneous, independent, collaborating code resolution services and a SkipNet based code resolution service named SkipNet-OCRS, which not only inherits DHT's advantages, but also supports administrative control and autonomy. For the external behaviors of SkipNet-OCRS, a novel external behavior mode named QRRA mode is proposed to enhance security and reduce requester complexity. For the internal behaviors of SkipNet-OCRS, an improved query algorithm is proposed to increase query efficiency. Our analysis shows that integrating SkipNet-OCRS into our resolution service network can meet our proposed requirements. Finally, simulation experiments verify the excellent performance of SkipNet-OCRS. PMID:23202207
A large scale code resolution service network in the Internet of Things.
Yu, Haining; Zhang, Hongli; Fang, Binxing; Yu, Xiangzhan
2012-11-07
In the Internet of Things a code resolution service provides a discovery mechanism for a requester to obtain the information resources associated with a particular product code immediately. In large scale application scenarios a code resolution service faces some serious issues involving heterogeneity, big data and data ownership. A code resolution service network is required to address these issues. Firstly, a list of requirements for the network architecture and code resolution services is proposed. Secondly, in order to eliminate code resolution conflicts and code resolution overloads, a code structure is presented to create a uniform namespace for code resolution records. Thirdly, we propose a loosely coupled distributed network consisting of heterogeneous, independent, collaborating code resolution services and a SkipNet based code resolution service named SkipNet-OCRS, which not only inherits DHT's advantages, but also supports administrative control and autonomy. For the external behaviors of SkipNet-OCRS, a novel external behavior mode named QRRA mode is proposed to enhance security and reduce requester complexity. For the internal behaviors of SkipNet-OCRS, an improved query algorithm is proposed to increase query efficiency. Our analysis shows that integrating SkipNet-OCRS into our resolution service network can meet our proposed requirements. Finally, simulation experiments verify the excellent performance of SkipNet-OCRS.
From Heavy-Ion Collisions to Quark Matter (2/3)
Lourenco, C.
2018-05-23
The art of experimental (high-energy heavy-ion) physics: (1) Many experimental issues are crucial to properly understand the measurements and derive a correct physics interpretation: acceptance and phase space windows; efficiencies (of track reconstruction, vertexing, track matching, trigger, etc.); resolutions (of mass, momenta, energies, etc.); backgrounds, feed-downs and "expected sources"; data selection; Monte Carlo adjustments, calibrations and smearing; luminosity and trigger conditions; evaluation of systematic uncertainties; and several others. (2) "New physics" often appears as excesses or suppressions with respect to "normal baselines", which must be very carefully established on the basis of "reference" physics processes and collision systems. If we misunderstand these issues, we can miss an important discovery... or we can "discover" non-existent "new physics."
NASA Technical Reports Server (NTRS)
Landano, M. R.; Easter, R. W.
1984-01-01
Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. It is found that these requirements lead to a number of issues and uncertainties that require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.
Conflict Resolution in Japanese Social Interactions.
ERIC Educational Resources Information Center
Killen, Melanie; Sueyoshi, Lina
1995-01-01
Studied Japanese preschool children's conflict resolution and how they and their mothers evaluate teachers' conflict resolution methods. Found that preschoolers' conflicts stemmed from wide range of issues, including concerns about justice, rights, and fairness. Preschoolers used negotiation more than retribution or appeals to teachers, which…
Braberg, Hannes; Moehle, Erica A.; Shales, Michael; Guthrie, Christine; Krogan, Nevan J.
2014-01-01
We have achieved a residue-level resolution of genetic interaction mapping – a technique that measures how the function of one gene is affected by the alteration of a second gene – by analyzing point mutations. Here, we describe how to interpret point mutant genetic interactions, and outline key applications for the approach, including interrogation of protein interaction interfaces and active sites, and examination of post-translational modifications. Genetic interaction analysis has proven effective for characterizing cellular processes; however, to date, systematic high-throughput genetic interaction screens have relied on gene deletions or knockdowns, which limits the resolution of gene function analysis and poses problems for multifunctional genes. Our point mutant approach addresses these issues, and further provides a tool for in vivo structure-function analysis that complements traditional biophysical methods. We also discuss the potential for genetic interaction mapping of point mutations in human cells and its application to personalized medicine. PMID:24842270
Earth mapping - aerial or satellite imagery comparative analysis
NASA Astrophysics Data System (ADS)
Fotev, Svetlin; Jordanov, Dimitar; Lukarski, Hristo
Nowadays, revising existing map products and creating new maps requires choosing a source of land cover imagery. The trade-off between the effectiveness and cost of aerial mapping systems and the efficiency and cost of very-high-resolution satellite imagery is a topical issue [1, 2, 3, 4]. The price of any remotely sensed image depends on the product (panchromatic or multispectral), resolution, processing level, scale, urgency of the task, and whether the needed image is available in the archive or has to be requested. The purpose of the present work is to make a comparative analysis of the two approaches to mapping the Earth with respect to two parameters, quality and cost, and to suggest an approach for selecting map information sources, i.e., airplane-based or spacecraft-based imaging systems with very high spatial resolution. Two cases are considered: an area approximately equal to one satellite scene, and an area approximately equal to the territory of Bulgaria.
Army Airspace Command and Control (A2C2): Action Plan for Issue Resolution
1993-09-01
2006-03-01
In the stand-off against Russia on the Transnistrian issue, NATO and the EU appear to be appropriate candidates. Either one of these two organizations, taken separately, possesses enough potential to create an asymmetry against Russia. Would these two institutions get more involved in the fate of this intra-state conflict? The possible courses of action of the actors involved will be analyzed.
A comparative verification of high resolution precipitation forecasts using model output statistics
NASA Astrophysics Data System (ADS)
van der Plas, Emiel; Schmeits, Maurice; Hooijman, Nicolien; Kok, Kees
2017-04-01
Verification of localized events such as precipitation has become even more challenging with the advent of high-resolution meso-scale numerical weather prediction (NWP). The realism of a forecast suggests that it should compare well against precipitation radar imagery with similar resolution, both spatially and temporally. Spatial verification methods solve some of the representativity issues that point verification gives rise to. In this study a verification strategy based on model output statistics is applied that aims to address both double penalty and resolution effects that are inherent to comparisons of NWP models with different resolutions. Using predictors based on spatial precipitation patterns around a set of stations, an extended logistic regression (ELR) equation is deduced, leading to a probability forecast distribution of precipitation for each NWP model, analysis and lead time. The ELR equations are derived for predictands based on areal calibrated radar precipitation and SYNOP observations. The aim is to extract maximum information from a series of precipitation forecasts, like a trained forecaster would. The method is applied to the non-hydrostatic model Harmonie (2.5 km resolution), Hirlam (11 km resolution) and the ECMWF model (16 km resolution), overall yielding similar Brier skill scores for the three post-processed models, but larger differences for individual lead times. In addition, the Fractions Skill Score is computed using the three deterministic forecasts, showing somewhat better skill for the Harmonie model. In other words, despite the realism of Harmonie precipitation forecasts, they only perform similarly or somewhat better than precipitation forecasts from the two lower-resolution models, at least in the Netherlands.
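The extended logistic regression (ELR) form referenced above (Wilks, 2009) includes the precipitation threshold itself as a predictor, so one fitted equation yields a full predictive distribution. A minimal sketch of that functional form; the coefficients below are invented for illustration, not fitted values from the study:

```python
import math

# Minimal extended logistic regression (ELR) sketch: the exceedance
# threshold q enters the regression alongside the NWP-based predictor,
# so a single equation gives probabilities for any threshold.
# Coefficients a, b, c are hypothetical illustrative values.

def elr_prob_at_most(precip_predictor, q, a=-1.0, b=0.8, c=1.2):
    """P(obs precip <= q) given a spatial-pattern predictor (e.g. area-mean NWP rain)."""
    z = a + c * math.sqrt(q) - b * math.sqrt(precip_predictor)
    return 1.0 / (1.0 + math.exp(-z))

# Probabilities are automatically monotone in the threshold q, one of
# the main attractions of ELR over fitting one equation per threshold.
p_low = elr_prob_at_most(4.0, 0.5)   # P(obs <= 0.5 mm)
p_high = elr_prob_at_most(4.0, 5.0)  # P(obs <= 5 mm)
```

The square-root link for both predictor and threshold is a common choice in the precipitation ELR literature; the study's actual predictors are spatial precipitation patterns around each station.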
Prefrontal Cortex, Emotion, and Approach/Withdrawal Motivation
Spielberg, Jeffrey M.; Stewart, Jennifer L.; Levin, Rebecca L.; Miller, Gregory A.; Heller, Wendy
2010-01-01
This article provides a selective review of the literature and current theories regarding the role of prefrontal cortex, along with some other critical brain regions, in emotion and motivation. Seemingly contradictory findings have often appeared in this literature. Research attempting to resolve these contradictions has been the basis of new areas of growth and has led to more sophisticated understandings of emotional and motivational processes as well as neural networks associated with these processes. Progress has, in part, depended on methodological advances that allow for increased resolution in brain imaging. A number of issues are currently in play, among them the role of prefrontal cortex in emotional or motivational processes. This debate fosters research that will likely lead to further refinement of conceptualizations of emotion, motivation, and the neural processes associated with them. PMID:20574551
Prefrontal Cortex, Emotion, and Approach/Withdrawal Motivation.
Spielberg, Jeffrey M; Stewart, Jennifer L; Levin, Rebecca L; Miller, Gregory A; Heller, Wendy
2008-01-01
This article provides a selective review of the literature and current theories regarding the role of prefrontal cortex, along with some other critical brain regions, in emotion and motivation. Seemingly contradictory findings have often appeared in this literature. Research attempting to resolve these contradictions has been the basis of new areas of growth and has led to more sophisticated understandings of emotional and motivational processes as well as neural networks associated with these processes. Progress has, in part, depended on methodological advances that allow for increased resolution in brain imaging. A number of issues are currently in play, among them the role of prefrontal cortex in emotional or motivational processes. This debate fosters research that will likely lead to further refinement of conceptualizations of emotion, motivation, and the neural processes associated with them.
When Can Clades Be Potentially Resolved with Morphology?
Bapst, David W.
2013-01-01
Morphology-based phylogenetic analyses are the only option for reconstructing relationships among extinct lineages, but often find support for conflicting hypotheses of relationships. The resulting lack of phylogenetic resolution is generally explained in terms of data quality and methodological issues, such as character selection. A previous suggestion is that sampling ancestral morphotaxa or sampling multiple taxa descended from a long-lived, unchanging lineage can also yield clades which have no opportunity to share synapomorphies. This lack of character information leads to a lack of 'intrinsic' resolution, an issue that cannot be solved with additional morphological data. It is unclear how often we should expect clades to be intrinsically resolvable in realistic circumstances, as intrinsic resolution must increase as taxonomic sampling decreases. Using branching simulations, I quantify intrinsic resolution across several models of morphological differentiation and taxonomic sampling. Intrinsically unresolvable clades are found to be relatively frequent in simulations of both extinct and living taxa under realistic sampling scenarios, implying that intrinsic resolution is an issue for morphology-based analyses of phylogeny. Simulations varying the rates of sampling and differentiation were tested for agreement with observed distributions of durations from well-sampled fossil records while also exhibiting high intrinsic resolution. This combination occurs in those datasets only when differentiation and sampling rates are both unrealistically high relative to branching and extinction rates. Thus, the poor phylogenetic resolution occasionally observed in morphological phylogenetics may result from a lack of intrinsic resolvability within groups. PMID:23638034
OpenMP parallelization of a gridded SWAT (SWATG)
NASA Astrophysics Data System (ADS)
Zhang, Ying; Hou, Jinliang; Cao, Yongpan; Gu, Juan; Huang, Chunlin
2017-12-01
Large-scale, long-term, high-spatial-resolution simulation is a common issue in environmental modeling. A Gridded Hydrologic Response Unit (HRU)-based Soil and Water Assessment Tool (SWATG) that integrates a grid modeling scheme with different spatial representations also faces this problem: long run times limit applications of very-high-resolution, large-scale watershed modeling. The OpenMP (Open Multi-Processing) parallel application interface was integrated with SWATG (the result is called SWATGP) to accelerate grid modeling at the HRU level. This parallel implementation takes better advantage of the computational power of a shared-memory computer system. We conducted two experiments at multiple temporal and spatial scales of hydrological modeling using SWATG and SWATGP on a high-end server. At 500 m resolution, SWATGP was found to be up to nine times faster than SWATG in modeling a roughly 2000 km2 watershed on a single CPU with a 15-thread configuration. The study results demonstrate that parallel models save considerable time relative to traditional sequential simulation runs. Parallel computation of environmental models is beneficial for model applications, especially at large spatial and temporal scales and at high resolutions. The proposed SWATGP model is thus a promising tool for large-scale, high-resolution water resources research and management, in addition to offering data fusion and model coupling ability.
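As a back-of-envelope check on the reported result (roughly 9x speedup with 15 threads), Amdahl's law relates an observed speedup to the fraction of runtime that was parallelized. Deriving that implied fraction is our own illustration; the paper does not report this number.

```python
# Amdahl's law: ideal speedup when only a fraction of the work runs in
# parallel, and its inversion to estimate the parallel fraction from an
# observed speedup. Used here purely as an illustrative sanity check.

def amdahl_speedup(parallel_fraction, threads):
    """Ideal speedup for a given parallel fraction and thread count."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / threads)

def implied_parallel_fraction(speedup, threads):
    """Invert Amdahl's law to find the parallel fraction."""
    return (1.0 - 1.0 / speedup) / (1.0 - 1.0 / threads)

# A 9x speedup on 15 threads implies ~95% of the runtime was parallelized,
# consistent with the HRU-level loop dominating SWATG's execution time.
frac = implied_parallel_fraction(9.0, 15)
```

This also shows why the speedup saturates: even with the serial fraction at only ~5%, infinitely many threads could never exceed about 21x.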
Real-time image processing for passive mmW imagery
NASA Astrophysics Data System (ADS)
Kozacik, Stephen; Paolini, Aaron; Bonnett, James; Harrity, Charles; Mackrides, Daniel; Dillon, Thomas E.; Martin, Richard D.; Schuetz, Christopher A.; Kelmelis, Eric; Prather, Dennis W.
2015-05-01
The transmission characteristics of millimeter waves (mmWs) make them suitable for many applications in defense and security, from airport preflight scanning to penetrating degraded visual environments such as brownout or heavy fog. While the cold sky provides sufficient illumination for these images to be taken passively in outdoor scenarios, this utility comes at a cost; the diffraction limit of the longer wavelengths involved leads to lower resolution imagery compared to the visible or IR regimes, and the low power levels inherent to passive imagery allow the data to be more easily degraded by noise. Recent techniques leveraging optical upconversion have shown significant promise, but are still subject to fundamental limits in resolution and signal-to-noise ratio. To address these issues we have applied techniques developed for visible and IR imagery to decrease noise and increase resolution in mmW imagery. We have developed these techniques into fieldable software, making use of GPU platforms for real-time operation of computationally complex image processing algorithms. We present data from a passive, 77 GHz, distributed aperture, video-rate imaging platform captured during field tests at full video rate. These videos demonstrate the increase in situational awareness that can be gained through applying computational techniques in real-time without needing changes in detection hardware.
Single Channel EEG Artifact Identification Using Two-Dimensional Multi-Resolution Analysis.
Taherisadr, Mojtaba; Dehzangi, Omid; Parsaei, Hossein
2017-12-13
As a diagnostic monitoring approach, electroencephalogram (EEG) signals can be decoded by signal processing methodologies for various health monitoring purposes. However, EEG recordings are contaminated by other interferences, particularly facial and ocular artifacts generated by the user. This is especially an issue during continuous EEG recording sessions; identifying such artifacts among the useful EEG components is therefore a key step in using EEG signals for either physiological monitoring and diagnosis or brain-computer interfaces. In this study, we aim to design a generic framework to process and characterize an EEG recording as a multi-component, non-stationary signal, with the aim of localizing and identifying its components (e.g., artifacts). The proposed method combines three complementary algorithms to enhance the efficiency of the system: time-frequency (TF) analysis and representation, two-dimensional multi-resolution analysis (2D MRA), and feature extraction and classification. A combination of spectro-temporal and geometric features is then extracted by combining key instantaneous TF space descriptors, which enables the system to characterize the non-stationarities in the EEG dynamics. We apply a curvelet transform (as an MRA method) to the 2D TF representation of EEG segments to decompose the given space into various levels of resolution. Such a decomposition efficiently improves the analysis of TF spaces with different characteristics (e.g., resolution). Our experimental results demonstrate that the combination of expansion to TF space, analysis using MRA, and extraction of a suitable feature set with a proper predictive model is effective in enhancing EEG artifact identification performance. We also compare the performance of the designed system with another common EEG signal processing technique, the 1D wavelet transform; the proposed method outperforms it.
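The 2D multi-resolution analysis at the core of the framework can be illustrated with a one-level 2D Haar decomposition, a much simpler MRA than the curvelet transform the study actually uses; `haar2d`/`ihaar2d` are illustrative names. The perfect-reconstruction property shown here is what makes the subbands a faithful multi-resolution representation of a TF image.

```python
import numpy as np

def haar2d(x):
    """One-level 2D Haar decomposition of an even-sized array.
    Returns (LL, LH, HL, HH) subbands, each at half resolution."""
    a = (x[0::2, :] + x[1::2, :]) / 2.0   # row averages
    d = (x[0::2, :] - x[1::2, :]) / 2.0   # row details
    LL = (a[:, 0::2] + a[:, 1::2]) / 2.0  # coarse approximation
    LH = (a[:, 0::2] - a[:, 1::2]) / 2.0  # horizontal detail
    HL = (d[:, 0::2] + d[:, 1::2]) / 2.0  # vertical detail
    HH = (d[:, 0::2] - d[:, 1::2]) / 2.0  # diagonal detail
    return LL, LH, HL, HH

def ihaar2d(LL, LH, HL, HH):
    """Inverse of haar2d: exact (perfect) reconstruction."""
    m = LL.shape[1] * 2
    a = np.empty((LL.shape[0], m))
    d = np.empty_like(a)
    a[:, 0::2] = LL + LH
    a[:, 1::2] = LL - LH
    d[:, 0::2] = HL + HH
    d[:, 1::2] = HL - HH
    x = np.empty((LL.shape[0] * 2, m))
    x[0::2, :] = a + d
    x[1::2, :] = a - d
    return x
```

Recursing on the LL band yields the coarser levels of the pyramid; a curvelet adds orientation selectivity on top of this scale decomposition.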
Massive stereo-based DTM production for Mars on cloud computers
NASA Astrophysics Data System (ADS)
Tao, Y.; Muller, J.-P.; Sidiropoulos, P.; Xiong, Si-Ting; Putri, A. R. D.; Walter, S. H. G.; Veitch-Michaelis, J.; Yershov, V.
2018-05-01
Digital Terrain Model (DTM) creation is essential to improving our understanding of the formation processes of the Martian surface. Although there have been previous demonstrations of open-source or commercial planetary 3D reconstruction software, planetary scientists are still struggling to create good quality DTMs that meet their science needs, especially when there is a requirement to produce a large number of high quality DTMs using "free" software. In this paper, we describe a new open source system to overcome many of these obstacles by demonstrating results in the context of issues found from experience with several planetary DTM pipelines. We introduce a new fully automated multi-resolution DTM processing chain for NASA Mars Reconnaissance Orbiter (MRO) Context Camera (CTX) and High Resolution Imaging Science Experiment (HiRISE) stereo processing, called the Co-registration Ames Stereo Pipeline (ASP) Gotcha Optimised (CASP-GO), based on the open source NASA ASP. CASP-GO employs tie-point based multi-resolution image co-registration, and Gotcha sub-pixel refinement and densification. The CASP-GO pipeline is used to produce planet-wide CTX and HiRISE DTMs that guarantee global geo-referencing compliance with respect to High Resolution Stereo Colour imaging (HRSC), and thence to the Mars Orbiter Laser Altimeter (MOLA), providing refined stereo matching completeness and accuracy. All software and good quality products introduced in this paper are being made open-source to the planetary science community through collaboration with NASA Ames, the United States Geological Survey (USGS) and the Jet Propulsion Laboratory (JPL) Advanced Multi-Mission Operations System (AMMOS) Planetary Data System (PDS) Pipeline Service (APPS-PDS4), as well as being browseable and visualisable through the iMars web based Geographic Information System (webGIS).
Multisensor data fusion across time and space
NASA Astrophysics Data System (ADS)
Villeneuve, Pierre V.; Beaven, Scott G.; Reed, Robert A.
2014-06-01
Field measurement campaigns typically deploy numerous sensors having different sampling characteristics in the spatial, temporal, and spectral domains. Data analysis and exploitation are made more difficult and time consuming when the sample data grids of the sensors do not align. This report summarizes our recent effort to demonstrate the feasibility of a processing chain capable of "fusing" image data from multiple independent and asynchronous sensors into a form amenable to analysis and exploitation using commercially available tools. Two important technical issues were addressed in this work: 1) image spatial registration onto a common pixel grid, and 2) image temporal interpolation onto a common time base. The first step leverages existing image matching and registration algorithms. The second step relies upon a new and innovative use of optical flow algorithms to perform accurate temporal upsampling of slower frame rate imagery. Optical flow field vectors were first derived from high-frame-rate, high-resolution imagery, and then used as a basis for temporal upsampling of the slower frame rate sensor's imagery. Optical flow field values are computed using a multi-scale image pyramid, thus allowing for more extreme object motion. This involves preprocessing imagery to varying resolution scales and initializing new vector flow estimates using those from the previous coarser-resolution image. Overall performance of this processing chain is demonstrated using sample data involving complex motion observed by multiple sensors mounted to the same base, including a high-speed visible camera and a coarser-resolution LWIR camera.
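The flow-based temporal upsampling step can be contrasted with naive blending in a toy sketch. Integer-pixel horizontal shifts stand in for a real optical-flow field, and all function names are illustrative; the sketch shows why warping along the flow moves an object to its intermediate position while simple blending double-exposes it.

```python
import numpy as np

def shift_frame(frame, dx):
    """Horizontal shift by an integer number of pixels (wrap-around)."""
    return np.roll(frame, dx, axis=1)

def interp_flow(f0, dx_total, t):
    """Motion-compensated interpolation: warp f0 along the flow by t*dx_total."""
    return shift_frame(f0, int(round(t * dx_total)))

def interp_blend(f0, f1, t):
    """Naive temporal blend for comparison: ghosts moving objects."""
    return (1.0 - t) * f0 + t * f1
```

A real implementation estimates a dense, subpixel flow field (typically with the coarse-to-fine pyramid the abstract describes) instead of assuming the displacement.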
Flexible high-resolution display systems for the next generation of radiology reading rooms
NASA Astrophysics Data System (ADS)
Caban, Jesus J.; Wood, Bradford J.; Park, Adrian
2007-03-01
A flexible, scalable, high-resolution display system is presented to support the next generation of radiology reading rooms or interventional radiology suites. The project aims to create an environment for radiologists that will simultaneously facilitate image interpretation, analysis, and understanding while lowering visual and cognitive stress. Displays currently in use present radiologists with technical challenges to exploring complex datasets that we seek to address. These include resolution and brightness, display and ambient lighting differences, and degrees of complexity, in addition to side-by-side comparison of time-variant and 2D/3D images. We address these issues through a scalable projector-based system that uses our custom-designed geometric and photometric calibration process to create a seamless, bright, high-resolution display environment that can reduce the visual fatigue commonly experienced by radiologists. The system we have designed uses an array of casually aligned projectors to cooperatively increase overall resolution and brightness. Images from a set of projectors at their narrowest zoom are combined at a shared projection surface, thus increasing the global "pixels per inch" (PPI) of the display environment. Two primary challenges - geometric calibration and photometric calibration - remained to be resolved before our high-resolution display system could be used in a radiology reading room or procedure suite. In this paper we present a method that accomplishes those calibrations and creates a flexible high-resolution display environment that appears seamless, sharp, and uniform across different devices.
High-performance computing in image registration
NASA Astrophysics Data System (ADS)
Zanin, Michele; Remondino, Fabio; Dalla Mura, Mauro
2012-10-01
Thanks to recent technological advances, a large variety of image data is at our disposal with variable geometric, radiometric and temporal resolution. In many applications the processing of such images requires high performance computing techniques in order to deliver timely responses, e.g., for rapid decisions or real-time actions. Thus, parallel or distributed computing methods, Digital Signal Processor (DSP) architectures, Graphical Processing Unit (GPU) programming and Field-Programmable Gate Array (FPGA) devices have become essential tools for the challenging task of processing large amounts of geo-data. The article focuses on the processing and registration of large datasets of terrestrial and aerial images for 3D reconstruction, diagnostic purposes and monitoring of the environment. For the image alignment procedure, sets of corresponding feature points need to be automatically extracted in order to subsequently compute the geometric transformation that aligns the data. Feature extraction and matching are among the most computationally demanding operations in the processing chain; thus, a great degree of automation and speed is mandatory. The details of the implemented operations (named LARES) exploiting parallel architectures and GPUs are presented. The innovative aspects of the implementation are (i) its effectiveness on a large variety of unorganized and complex datasets, (ii) its capability to work with high-resolution images and (iii) the speed of the computations. Examples and comparisons with standard CPU processing are also reported and commented on.
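One classical registration building block that parallelizes well on GPUs is phase correlation; a minimal NumPy sketch (not the LARES implementation) recovers a known integer translation from the normalized cross-power spectrum of two images.

```python
import numpy as np

def phase_correlation(ref, mov):
    """Estimate the integer translation taking `ref` to `mov` by locating
    the peak of the inverse FFT of the normalized cross-power spectrum."""
    F = np.fft.fft2(mov) * np.conj(np.fft.fft2(ref))
    F /= np.abs(F) + 1e-12          # keep phase only, avoid divide-by-zero
    corr = np.fft.ifft2(F).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # fold peaks past the midpoint back to negative shifts
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)
```

Feature-based pipelines such as the one in the article handle rotation, scale and perspective as well; phase correlation only resolves translation, but each FFT maps naturally onto GPU hardware.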
10 CFR 2.700 - Scope of subpart G.
Code of Federal Regulations, 2010 CFR
2010-01-01
... presiding officer by order finds that resolution of the contention necessitates resolution of: issues of... to the resolution of the contested matter, proceedings for initial applications for construction... conflict between the provisions of this subpart and those set forth in subpart C of this part, the...
NASA Astrophysics Data System (ADS)
Zhu, Feng; Macdonald, Niall; Skommer, Joanna; Wlodkowic, Donald
2015-06-01
Current microfabrication methods are often restricted to two-dimensional (2D) or two and a half dimensional (2.5D) structures. These fabrication issues can potentially be addressed by emerging additive manufacturing technologies. Despite the rapid growth of additive manufacturing technologies in tissue engineering, microfluidics has seen relatively little development with regard to adopting 3D printing for rapid fabrication of complex chip-based devices. This has been due to two major factors: lack of sufficient resolution in current rapid-prototyping methods (usually >100 μm) and the need for optical transparency of polymers to allow in vitro imaging of specimens. We postulate that adopting innovative fabrication processes can provide effective solutions for prototyping and manufacturing of chip-based devices with high aspect ratios (i.e., above 20:1). This work provides a comprehensive investigation of commercially available additive manufacturing technologies as an alternative for rapid prototyping of complex monolithic Lab-on-a-Chip devices for biological applications. We explored both multi-jet modelling (MJM) and several stereolithography (SLA) processes with five different 3D printing resins. Compared with other rapid prototyping technologies such as PDMS soft lithography and infrared laser micromachining, we demonstrated that selected SLA technologies had superior resolution and feature quality. We also, for the first time, optimised the post-processing protocols and examined polymer features under scanning electron microscopy (SEM). Finally, we demonstrate that selected SLA polymers have optical properties enabling high-resolution biological imaging. Caution should be exercised, however, as more work is needed to develop fully bio-compatible and non-toxic polymer chemistries.
Top Tips for Buying Telecommunication Services.
ERIC Educational Resources Information Center
Linder, Jeff
2001-01-01
Examines top regulatory issues and other unique issues resulting from this regulatory overlay when negotiating for corporate telecom services. Issues cover such topics as tariffs, rate negotiation, exclusivity provisions, revenue commitments, mid-term negotiations, service-level agreements, and dispute resolution. (GR)
Hu, Hao; Hong, Xingchen; Terstriep, Jeff; Liu, Yan; Finn, Michael P.; Rush, Johnathan; Wendel, Jeffrey; Wang, Shaowen
2016-01-01
Geospatial data, often embedded with geographic references, are important to many application and science domains, and represent a major type of big data. The increased volume and diversity of geospatial data have caused serious usability issues for researchers in various scientific domains, which call for innovative cyberGIS solutions. To address these issues, this paper describes a cyberGIS community data service framework to facilitate geospatial big data access, processing, and sharing based on a hybrid supercomputer architecture. Through the collaboration between the CyberGIS Center at the University of Illinois at Urbana-Champaign (UIUC) and the U.S. Geological Survey (USGS), a community data service for accessing, customizing, and sharing digital elevation model (DEM) and its derived datasets from the 10-meter national elevation dataset, namely TopoLens, is created to demonstrate the workflow integration of geospatial big data sources, computation, analysis needed for customizing the original dataset for end user needs, and a friendly online user environment. TopoLens provides online access to precomputed and on-demand computed high-resolution elevation data by exploiting the ROGER supercomputer. The usability of this prototype service has been acknowledged in community evaluation.
Viking High-Resolution Topography and Mars '01 Site Selection: Application to the White Rock Area
NASA Astrophysics Data System (ADS)
Tanaka, K. L.; Kirk, Randolph L.; Mackinnon, D. J.; Howington-Kraus, E.
1999-06-01
Definition of the local topography of the Mars '01 Lander site is crucial for assessment of lander safety and rover trafficability. According to Golombek et al.: (1) steep surface slopes may cause retro-rockets to be fired too early or too late for a safe landing; (2) the landing site slope needs to be < 1 deg to ensure lander stability; and (3) a nearly level site is better for power generation by both the lander and the rover and for rover trafficability. Presently available datasets are largely inadequate for determining surface slope at scales pertinent to landing-site issues. Ideally, a topographic model of the entire landing site at meter-scale resolution would permit the best assessment of the pertinent topographic issues. MOLA data, while providing highly accurate vertical measurements, are inadequate for addressing slopes along paths of less than several hundred meters, because of along-track data spacings of hundreds of meters and horizontal errors in positioning of 500 to 2000 m. The capability to produce stereotopography from MOC image pairs is not yet in hand, nor can we necessarily expect a suitable number of stereo image pairs to be acquired. However, for a limited number of sites, high-resolution Viking stereo imaging is available at tens of meters horizontal resolution, capable of covering landing-ellipse sized areas. Although we would not necessarily suggest that the chosen Mars '01 Lander site should be located where good Viking stereotopography is available, an assessment of typical surface slopes at these scales for a range of surface types may be quite valuable in landing-site selection. Thus this study has a two-fold application: (1) to support the proposal of White Rock as a candidate Mars '01 Lander site, and (2) to evaluate how Viking high-resolution stereotopography may be of value in the overall Mars '01 Lander site selection process.
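The dependence of measured slope on baseline length can be illustrated with a short sketch (all elevation values are illustrative, not Mars data). Sampling rough terrain only at MOLA-like along-track spacings hides the short-baseline slopes that matter for landing safety.

```python
import numpy as np

def slope_along_track(z, spacing):
    """Point-to-point slopes (degrees) for elevations `z` (metres)
    sampled every `spacing` metres along track."""
    return np.degrees(np.arctan2(np.diff(z), spacing))
```

For example, terrain alternating 0 m and 5 m every 100 m has ~2.9 deg point-to-point slopes, but resampled at a 200 m baseline it appears perfectly flat.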
Joint Bearing and Range Estimation of Multiple Objects from Time-Frequency Analysis.
Liu, Jeng-Cheng; Cheng, Yuang-Tung; Hung, Hsien-Sen
2018-01-19
Direction-of-arrival (DOA) and range estimation is an important issue in sonar signal processing. In this paper, a novel approach using the Hilbert-Huang transform (HHT) is proposed for joint bearing and range estimation of multiple targets based on a uniform linear array (ULA) of hydrophones. The ULA is built on micro-electro-mechanical systems (MEMS) technology, and thus has the attractive features of small size, high sensitivity and low cost, making it suitable for Autonomous Underwater Vehicle (AUV) operations. The proposed target localization method has the following advantages: only a single snapshot of data is needed, and real-time processing is feasible. The proposed algorithm transforms a very complicated nonlinear estimation problem into a simple, nearly linear one via time-frequency distribution (TFD) theory, and is verified with the HHT. A theoretical discussion of the resolution issue is also provided to facilitate the design of a MEMS sensor with high sensitivity. Simulation results verify the effectiveness of the proposed method.
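The TFD/HHT machinery rests on instantaneous frequency estimated from the analytic signal. A minimal FFT-based sketch is shown below; it is not the authors' algorithm, and it assumes an even-length input.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (discrete Hilbert transform);
    assumes len(x) is even."""
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = 1.0
    h[1:N // 2] = 2.0   # double positive frequencies
    h[N // 2] = 1.0     # keep Nyquist bin
    return np.fft.ifft(X * h)

def instantaneous_frequency(x, fs):
    """Instantaneous frequency (Hz) from the unwrapped analytic phase."""
    phase = np.unwrap(np.angle(analytic_signal(x)))
    return np.diff(phase) * fs / (2.0 * np.pi)
```

For a pure tone the estimate is constant at the tone frequency; for the multi-component signals of interest here, an empirical mode decomposition step (the other half of the HHT) would first separate the components.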
A prioritization of generic safety issues. Supplement 19, Revision insertion instructions
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1995-11-01
The report presents the safety priority ranking for generic safety issues related to nuclear power plants. The purpose of these rankings is to assist in the timely and efficient allocation of NRC resources for the resolution of those safety issues that have a significant potential for reducing risk. The safety priority rankings are HIGH, MEDIUM, LOW, and DROP, and have been assigned on the basis of risk significance estimates, the ratio of risk to costs and other impacts estimated to result if resolution of the safety issues were implemented, and the consideration of uncertainties and other quantitative or qualitative factors. To the extent practical, estimates are quantitative. This document provides revisions and amendments to the report.
Spatial resolution requirements for soft-copy reporting in digital radiography
NASA Astrophysics Data System (ADS)
Davies, Andrew G.; Cowen, Arnold R.; Fowler, Richard C.; Bury, Robert F.; Parkin, Geoff J. S.; Lintott, David J.; Martinez, Delia; Safudim, Asif
1996-04-01
The issue of the spatial resolution required to present diagnostic quality digital images, especially for softcopy reporting, has received much attention over recent years. The aim of this study was to compare diagnostic performance when reporting from hardcopy and from optimized softcopy image presentations. One hundred fifteen radiographs of the hand acquired on a photostimulable phosphor computed radiography (CR) system were chosen as the image material. The study group was taken from patients who demonstrated subtle erosions of the bone in the digits. The control group consisted of radiologically normal hands. The images were presented in three modes: the CR system's hardcopy output, and softcopy presentations at full and half spatial resolution. Four consultant radiologists participated as observers. Results were analyzed using the receiver operating characteristic (ROC) technique, and showed a statistically significant improvement in observer performance for both softcopy formats when compared to the hardcopy presentation. However, no significant difference in observer performance was found between the two softcopy presentations. We therefore conclude that, with appropriate attention to the processing and presentation of digital image data, softcopy reporting can, for most examinations, provide superior diagnostic performance, even for images viewed at modest (1k × 1k) resolutions.
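The ROC figure of merit used in such observer studies is equivalent to the Mann-Whitney U statistic: the area under the curve equals the probability that a randomly chosen abnormal case receives a higher confidence rating than a randomly chosen normal one. A minimal sketch computing AUC directly from ratings (the scores below are illustrative, not the study's data):

```python
import numpy as np

def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    P(random positive outscores random negative), ties counted as 0.5."""
    pos = np.asarray(scores_pos, float)[:, None]
    neg = np.asarray(scores_neg, float)[None, :]
    wins = (pos > neg).sum() + 0.5 * (pos == neg).sum()
    return wins / (pos.size * neg.size)
```

AUC = 1.0 means perfect separation of abnormal from normal cases; 0.5 means chance performance.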
Stage acoustics for musicians: A multidimensional approach using 3D ambisonic technology
NASA Astrophysics Data System (ADS)
Guthrie, Anne
In this research, a method was outlined and tested for the use of 3D Ambisonic technology to inform stage acoustics research and design. Stage acoustics for musicians as a field has yet to benefit from recent advancements in auralization and spatial acoustic analysis. This research attempts to address common issues in stage acoustics: subjective requirements for performers in relation to feelings of support, quality of sound, and ease of ensemble playing in relation to measurable, objective characteristics that can be used to design better stage enclosures. While these issues have been addressed in previous work, this research attempts to use technological advancements to improve the resolution and realism of the testing and analysis procedures. Advancements include measurement of spatial impulse responses using a spherical microphone array, higher-order ambisonic encoding and playback for real-time performer auralization, high-resolution spatial beamforming for analysis of onstage impulse responses, and multidimensional scaling procedures to determine subjective musician preferences. The methodology for implementing these technologies into stage acoustics research is outlined in this document and initial observations regarding implications for stage enclosure design are proposed. This research provides a robust method for measuring and analyzing performer experiences on multiple stages without the costly and time-intensive process of physically surveying orchestras on different stages, with increased repeatability while maintaining a high level of immersive realism and spatial resolution. Along with implications for physical design, this method provides possibilities for virtual teaching and rehearsal, parametric modeling and co-located performance.
The Session of a Special Character of UNEP's Governing Council: Principal Resolutions.
ERIC Educational Resources Information Center
Uniterra, 1982
1982-01-01
Presents principal resolutions adopted by the Session of a Special Character (SSC) of the Governing Council of the United Nations Environment Programme. Resolutions focus on achievements of Action Plan for Human Environment, new perceptions of environmental issues, environmental trends, and planning/implementation of environmental activities.…
Arnold, J E; Eisenband, J G; Brown-Schmidt, S; Trueswell, J C
2000-07-14
Eye movements of listeners were monitored to investigate how gender information and accessibility influence the initial processes of pronoun interpretation. Previous studies on this issue have produced mixed results, and several studies have concluded that gender cues are not automatically used during the early processes of pronoun interpretation (e.g. Garnham, A., Oakhill, J. & Cruttenden, H. (1992). The role of implicit causality and gender cue in the interpretation of pronouns. Language and Cognitive Processes, 7 (3/4), 231-255; Greene, S. B., McKoon, G. & Ratcliff, R. (1992). Pronoun resolution and discourse models. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18 (2), 266-283). In the two experiments presented here, participants viewed a picture with two familiar cartoon characters of either the same or different gender. They listened to a text describing the picture, in which a pronoun referred to either the first, more accessible, character, or the second. (For example, Donald is bringing some mail to [Mickey/Minnie] while a violent storm is beginning. He's carrying an umbrella…) The results of both experiments show rapid use of both gender and accessibility, at approximately 200 ms after the pronoun offset.
NASA Astrophysics Data System (ADS)
McAlpin, D. B.; Meyer, F. J.; Webley, P. W.
2017-12-01
Using thermal data from Advanced Very High Resolution Radiometer (AVHRR) sensors, we investigated algorithms to estimate the effusive volume of lava flows from the 2012-13 eruption of Tolbachik Volcano with high temporal resolution. AVHRR are polar orbiting, radiation detection instruments that provide reflectance and radiance data in six spectral bands with a ground resolution of 1.1 km. During the Tolbachik eruption of 2012-13, active AVHRR instruments were available aboard four polar orbiting platforms. Although the primary purpose of the instruments is climate and ocean studies, their multiple platforms provide global coverage at least twice daily, with data for all regions of the earth no older than six hours. This frequency makes the AVHRR instruments particularly suitable for the study of volcanic activity. While methods for deriving effusion rates from thermal observations have been previously published, a number of issues complicate their practical application. In particular, these include (1) unknown material parameters used in the estimation process; (2) the relatively coarse resolution of thermal sensors; (3) optimizing a model to describe the number of thermal regimes within each pixel; and (4) frequent saturation issues in thermal channels. We present ongoing investigations into effusion rate estimation from AVHRR data using the 2012-13 eruption of Tolbachik Volcano as a test event. For this eruption we studied approaches for coping with issues (1)-(4) to pave the way to a more operational implementation of published techniques. To address (1), we used Monte Carlo simulations to understand the sensitivity of effusion rate estimates to changes in material parameters. To study (2) and (3), we compared typical two-component (exposed lava on ambient background) and three-component (exposed lava, cooled crust, ambient background) models for their relative performance. To study issue (4), we compared AVHRR-derived effusion rates to reference data derived from multi-temporal digital elevation models. In our workflow, we correct for the scan angle of the sensor and the transmissivity of the atmosphere before including the corrected temperatures in heat equations to determine the effusion volume necessary to satisfy them.
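The Monte Carlo treatment of issue (1) can be sketched as follows. The heat-budget form of the effusion-rate equation used here is a commonly published one, but every parameter value and uncertainty range below is an illustrative assumption, not a measurement for Tolbachik.

```python
import numpy as np

def effusion_rate(q_total, density, c_p, delta_t, phi, c_l):
    """Thermally derived effusion rate (m^3/s) in the commonly used
    heat-budget form E = Q / (rho * (c_p*dT + phi*c_L))."""
    return q_total / (density * (c_p * delta_t + phi * c_l))

def mc_sensitivity(q_total, n=10000, seed=0):
    """Propagate assumed uncertainties in material parameters through the
    rate equation (all distributions below are illustrative assumptions)."""
    rng = np.random.default_rng(seed)
    density = rng.normal(2600.0, 100.0, n)   # lava density, kg/m^3
    c_p = rng.normal(1150.0, 50.0, n)        # specific heat, J/(kg K)
    delta_t = rng.normal(200.0, 25.0, n)     # cooling interval, K
    phi = rng.uniform(0.3, 0.5, n)           # crystallised mass fraction
    c_l = rng.normal(2.9e5, 0.2e5, n)        # latent heat, J/kg
    rates = effusion_rate(q_total, density, c_p, delta_t, phi, c_l)
    return rates.mean(), rates.std()
```

The spread of the output distribution directly quantifies how sensitive the derived rate is to the poorly known material parameters.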
Deep-towed high resolution seismic imaging II: Determination of P-wave velocity distribution
NASA Astrophysics Data System (ADS)
Marsset, B.; Ker, S.; Thomas, Y.; Colin, F.
2018-02-01
The acquisition of high resolution seismic data in deep waters requires the development of deep-towed seismic sources and receivers able to deal with the high hydrostatic pressure environment. The low frequency piezoelectric transducer of the SYSIF (SYstème Sismique Fond) deep-towed seismic device complies with the former requirement, taking advantage of the coupling of a mechanical resonance (Janus driver) and a fluid resonance (Helmholtz cavity) to produce a large frequency bandwidth acoustic signal (220-1050 Hz). The ability to perform deep-towed multichannel seismic imaging with SYSIF was demonstrated in 2014; however, the ability to determine the P-wave velocity distribution had not been achieved. P-wave velocity analysis relies on the ratio between the source-receiver offset range and the depth of the seismic reflectors, so towing the seismic source and receivers closer to the sea bed provides a better geometry for P-wave velocity determination. Technical issues related to the acoustic source directivity, however, arise for this approach in the particular framework of piezoelectric sources. A signal processing sequence is therefore added to the initial processing flow. Data acquisition took place during the GHASS (Gas Hydrates, fluid Activities and Sediment deformations in the western Black Sea) cruise in the Romanian waters of the Black Sea. The results of the imaging processing are presented for two seismic data sets acquired over gas hydrates and gas-bearing sediments. The improvement in the final seismic resolution demonstrates the validity of the velocity model.
3D image processing architecture for camera phones
NASA Astrophysics Data System (ADS)
Atanassov, Kalin; Ramachandra, Vikas; Goma, Sergio R.; Aleksic, Milivoje
2011-03-01
Putting high quality and easy-to-use 3D technology into the hands of regular consumers has become a recent challenge as interest in 3D technology has grown. Making 3D technology appealing to the average user requires that it be made fully automatic and foolproof. Designing a fully automatic 3D capture and display system requires: 1) identifying critical 3D technology issues like camera positioning, disparity control rationale, and screen geometry dependency, and 2) designing a methodology to automatically control them. Implementing 3D capture functionality on phone cameras necessitates designing algorithms to fit within the processing capabilities of the device. Various constraints like sensor position tolerances, sensor 3A tolerances, post-processing, 3D video resolution and frame rate should be carefully considered for their influence on the 3D experience. Issues with migrating functions such as zoom and pan from the 2D usage model (both during capture and display) to 3D need to be resolved to ensure the highest level of user experience. It is also very important that the 3D usage scenario (including interactions between the user and the capture/display device) is carefully considered. Finally, both the processing power of the device and the practicality of the scheme need to be taken into account while designing the calibration and processing methodology.
Human Systems Integration in Practice: Constellation Lessons Learned
NASA Technical Reports Server (NTRS)
Zumbado, Jennifer Rochlis
2012-01-01
NASA's Constellation program provided a unique testbed for Human Systems Integration (HSI) as a fundamental element of the Systems Engineering process. Constellation was the first major program to have HSI mandated by NASA's Human Rating document. Proper HSI is critical to the success of any project that relies on humans to function as operators, maintainers, or controllers of a system. HSI improves mission, system and human performance, significantly reduces lifecycle costs, lowers risk and minimizes re-design. Successful HSI begins with sufficient project schedule dedicated to the generation of human systems requirements, but it is by no means solely a requirements management process. It requires a top-down systems engineering process that recognizes, throughout the organization, human factors as a technical discipline equal to traditional engineering disciplines, with authority for the overall system; this partners with a bottom-up mechanism for human-centered design and technical issue resolution. The Constellation Human Systems Integration Group (HSIG) was part of the Systems Engineering and Integration (SE&I) organization within the program office, and existed alongside similar groups such as Flight Performance, Environments & Constraints, and Integrated Loads, Structures and Mechanisms. While the HSIG successfully managed, via influence leadership, a down-and-in Community of Practice to facilitate technical integration and issue resolution, it lacked parallel top-down authority to drive integrated design. This presentation will discuss how HSI was applied to Constellation, the lessons learned and best practices it revealed, and recommendations to future NASA program and project managers on how to accomplish this critical function.
Technical Basis of Scaling Relationships for the Pretreatment Engineering Platform
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuhn, William L.; Arm, Stuart T.; Huckaby, James L.
Pacific Northwest National Laboratory has been tasked by Bechtel National Inc. (BNI) on the River Protection Project-Waste Treatment Plant (RPP-WTP) project to perform research and development activities. The Pretreatment Engineering Platform (PEP) is being designed and constructed as part of a plan to respond to an issue raised by the WTP External Flowsheet Review Team (EFRT) entitled “Undemonstrated Leaching Processes” and numbered M12. The PEP replicates the WTP leaching process using prototypic equipment and control strategies. The approach for scaling PEP performance data to predict WTP performance is critical to the successful resolution of the EFRT issue. This report describes the recommended PEP scaling approach and PEP data interpretation, and provides recommendations on test conduct and data requirements.
Quality Assessment of Collection 6 MODIS Atmospheric Science Products
NASA Astrophysics Data System (ADS)
Manoharan, V. S.; Ridgway, B.; Platnick, S. E.; Devadiga, S.; Mauoka, E.
2015-12-01
Since the launch of the NASA Terra and Aqua satellites in December 1999 and May 2002, respectively, atmosphere and land data acquired by the MODIS (Moderate Resolution Imaging Spectroradiometer) sensor on-board these satellites have been reprocessed five times at the MODAPS (MODIS Adaptive Processing System) located at NASA GSFC. The global land and atmosphere products use science algorithms developed by the NASA MODIS science team investigators. MODAPS completed Collection 6 reprocessing of MODIS Atmosphere science data products in April 2015 and is currently generating the Collection 6 products using the latest version of the science algorithms. This reprocessing has generated one of the longest time series of consistent data records for understanding cloud, aerosol, and other constituents in the earth's atmosphere. It is important to carefully evaluate and assess the quality of this data and remove any artifacts to maintain a useful climate data record. Quality Assessment (QA) is an integral part of the processing chain at MODAPS. This presentation will describe the QA approaches and tools adopted by the MODIS Land/Atmosphere Operational Product Evaluation (LDOPE) team to assess the quality of MODIS operational Atmospheric products produced at MODAPS. Some of the tools include global high resolution images, time series analysis and statistical QA metrics. The new high resolution global browse images with pan and zoom have provided the ability to perform QA of products in real time through synoptic QA on the web. This global browse generation has been useful in identifying production error, data loss, and data quality issues from calibration error, geolocation error and algorithm performance. A time series analysis for various science datasets in the Level-3 monthly product was recently developed for assessing any long term drifts in the data arising from instrument errors or other artifacts. 
This presentation will describe and discuss some test cases from the recently processed C6 products. We will also describe the various tools and approaches developed to verify and assess the algorithm changes implemented by the science team to address known issues in the products and improve the quality of the products.
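The long-term drift check in the Level-3 time-series analysis described above can be sketched as a simple linear-trend test. The function name, threshold, and synthetic monthly series below are illustrative assumptions, not part of the LDOPE tooling:

```python
import numpy as np

def detect_drift(series, slope_threshold=2e-4):
    """Flag a long-term drift in a monthly Level-3 science dataset.

    Fits a linear trend to the series and reports whether the fitted
    slope per time step exceeds a user-chosen threshold.
    """
    t = np.arange(len(series))
    slope, _intercept = np.polyfit(t, series, 1)
    return abs(slope) > slope_threshold, slope

# Synthetic example: a stable record vs. one with a small injected drift,
# e.g. from a slow calibration degradation.
rng = np.random.default_rng(0)
stable = 0.3 + 0.005 * rng.standard_normal(120)   # 10 years of monthly means
drifting = stable + 5e-4 * np.arange(120)         # slow artificial drift

flag_stable, _ = detect_drift(stable)
flag_drift, _ = detect_drift(drifting)
```

In practice a QA system would also need to separate genuine climate trends from instrument artifacts, which this toy test cannot do on its own.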
Analysis of Radar and Optical Space Borne Data for Large Scale Topographical Mapping
NASA Astrophysics Data System (ADS)
Tampubolon, W.; Reinhardt, W.
2015-03-01
Normally, in order to provide high-resolution three-dimensional (3D) geospatial data, large-scale topographical mapping needs input from conventional airborne campaigns, which in Indonesia are bureaucratically complicated, especially during legal administration procedures such as security clearance from the military/defense ministry. This often causes additional time delays, on top of technical constraints such as weather and limited aircraft availability for airborne campaigns. Geospatial data quality is, of course, an important issue for many applications. The increasing demand for geospatial data nowadays consequently requires high-resolution datasets as well as a sufficient level of accuracy. Therefore an integration of different technologies is required in many cases to gain the expected result, especially in the context of disaster preparedness and emergency response. Another important issue in this context is the fast delivery of relevant data, which is expressed by the term "Rapid Mapping". In this paper we present first results of on-going research to integrate different data sources such as space-borne radar and optical platforms. Initially the orthorectification of Very High Resolution Satellite (VHRS) imagery, i.e. SPOT-6, has been done as a continuous process to the DEM generation using TerraSAR-X/TanDEM-X data. The role of Ground Control Points (GCPs) from GNSS surveys is mandatory in order to fulfil geometrical accuracy. In addition, this research aims at providing a suitable processing algorithm of space-borne data for large-scale topographical mapping as described in section 3.2. Recently, radar space-borne data have been used for medium-scale topographical mapping, e.g. for the 1:50.000 map scale in Indonesian territories. The goal of this on-going research is to increase the accuracy of remote sensing data by different activities, e.g. 
the integration of different data sources (optical and radar) or the usage of GCPs in both the optical and the radar satellite data processing. Finally these results will be used in the future as a reference for further geospatial data acquisitions to support topographical mapping at even larger scales, up to the 1:10.000 map scale.
Using complexity science and negotiation theory to resolve boundary-crossing water issues
NASA Astrophysics Data System (ADS)
Islam, Shafiqul; Susskind, Lawrence
2018-07-01
Many water governance and management issues are complex. The complexity of these issues is related to the crossing of multiple boundaries: political, social and jurisdictional, as well as physical, ecological and biogeochemical. Resolution of these issues usually requires interactions of many parties with conflicting values and interests operating across multiple boundaries and scales to make decisions. The interdependence and feedback among interacting variables, processes, actors and institutions are hard to model and difficult to forecast. Thus, decision-making related to complex water problems needs to be contingent and adaptive. This paper draws on a number of ideas from complexity science and negotiation theory that may make it easier to cope with the complexities and difficulties of managing boundary-crossing water disputes. It begins with the Water Diplomacy Framework that was developed and tested over the past several years. Then, it uses three key ideas from complexity science (interdependence and interconnectedness; uncertainty and feedback; emergence and adaptation) and three from negotiation theory (stakeholder identification and engagement; joint fact finding; and value creation through option generation) to show how application of these ideas can help enhance the effectiveness of water management.
Tracking: Conflicts and Resolutions. Controversial Issues in Education.
ERIC Educational Resources Information Center
Lockwood, Anne Turnbaugh
The educational tracking system raises highly controversial issues. This book offers both the viewpoints of researchers who have grappled with the issue of tracking and the personal experiences of school staff who have wrestled with the issue of whether or not to track instruction. It presents summaries of interviews that were conducted with three…
Infectious diseases and securitization: WHO's dilemma.
Jin, Jiyong; Karackattu, Joe Thomas
2011-06-01
The threat posed by infectious diseases has been increasingly framed as a security issue. The UN Security Council's Resolution 1308, which designated HIV/AIDS as a threat to international security, evidenced the securitization process. Using securitization theory as a theoretical tool, this article explores the securitization of infectious diseases in the World Health Organization (WHO). While WHO has tended to securitize infectious diseases since 2000, it has encountered a dilemma in the process because of the inherent asymmetry of interest between developed and developing countries. The act of securitization in WHO currently remains mostly a rhetorical device, since WHO's norms emblematic of securitization have not been backed by operational measures for verification or enforcement due to these asymmetric interests.
In Vivo EPR Resolution Enhancement Using Techniques Known from Quantum Computing Spin Technology.
Rahimi, Robabeh; Halpern, Howard J; Takui, Takeji
2017-01-01
A crucial issue with in vivo biological/medical EPR is its low signal-to-noise ratio, which results in low spectroscopic resolution. We propose quantum hyperpolarization techniques based on 'Heat Bath Algorithmic Cooling', allowing possible approaches for improving the resolution in magnetic resonance spectroscopy and imaging.
A comparison of earthquake backprojection imaging methods for dense local arrays
NASA Astrophysics Data System (ADS)
Beskardes, G. D.; Hole, J. A.; Wang, K.; Michaelides, M.; Wu, Q.; Chapman, M. C.; Davenport, K. K.; Brown, L. D.; Quiros, D. A.
2018-03-01
Backprojection imaging has recently become a practical method for local earthquake detection and location due to the deployment of densely sampled, continuously recorded, local seismograph arrays. While backprojection sometimes utilizes the full seismic waveform, the waveforms are often pre-processed and simplified to overcome imaging challenges. Real data issues include aliased station spacing, inadequate array aperture, inaccurate velocity model, low signal-to-noise ratio, large noise bursts and varying waveform polarity. We compare the performance of backprojection with four previously used data pre-processing methods: raw waveform, envelope, short-term averaging/long-term averaging and kurtosis. Our primary goal is to detect and locate events smaller than noise by stacking prior to detection to improve the signal-to-noise ratio. The objective is to identify an optimized strategy for automated imaging that is robust in the presence of real-data issues, has the lowest signal-to-noise thresholds for detection and for location, has the best spatial resolution of the source images, preserves magnitude, and considers computational cost. Imaging method performance is assessed using a real aftershock data set recorded by the dense AIDA array following the 2011 Virginia earthquake. Our comparisons show that raw-waveform backprojection provides the best spatial resolution, preserves magnitude and boosts signal to detect events smaller than noise, but is most sensitive to velocity error, polarity error and noise bursts. On the other hand, the other methods avoid polarity error and reduce sensitivity to velocity error, but sacrifice spatial resolution and cannot effectively reduce noise by stacking. Of these, only kurtosis is insensitive to large noise bursts while being as efficient as the raw-waveform method to lower the detection threshold; however, it does not preserve the magnitude information. 
For automatic detection and location of events in a large data set, we therefore recommend backprojecting kurtosis waveforms, followed by a second pass on the detected events using noise-filtered raw waveforms to achieve the best of all criteria.
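The recommended two-stage strategy (detect on kurtosis waveforms, then refine) relies on a sliding-window kurtosis characteristic function that is insensitive to waveform polarity, followed by delay-and-stack backprojection. The sketch below is a minimal illustration of that idea only; the window length, delay handling, and synthetic test signal are assumptions, not the AIDA processing chain:

```python
import numpy as np
from scipy.stats import kurtosis

def kurtosis_cf(trace, win=50):
    """Sliding-window kurtosis characteristic function: large at impulsive
    onsets, near zero for stationary Gaussian noise, polarity-insensitive."""
    cf = np.zeros(len(trace))
    for i in range(win, len(trace)):
        cf[i] = kurtosis(trace[i - win:i])
    return np.maximum(cf, 0.0)   # keep only impulsive-onset peaks

def backproject(cfs, delays):
    """Shift each station's CF by its travel-time delay (in samples) and stack.
    A coherent peak in the stack marks a detection at the trial source point."""
    n = min(len(cf) - d for cf, d in zip(cfs, delays))
    return sum(cf[d:d + n] for cf, d in zip(cfs, delays)) / len(cfs)

# Synthetic example: one small impulsive arrival buried in noise at 3 stations,
# with station-dependent travel-time delays.
rng = np.random.default_rng(1)
t0, delays = 300, [0, 17, 42]
traces = []
for d in delays:
    tr = rng.standard_normal(1000)
    tr[t0 + d] += 8.0            # impulsive arrival
    traces.append(tr)
stack = backproject([kurtosis_cf(tr) for tr in traces], delays)
```

Because the kurtosis CF discards amplitude and onset-shape detail, a production workflow would, as the abstract recommends, revisit detected events with noise-filtered raw waveforms to recover magnitude and sharpen the location.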
NASA Astrophysics Data System (ADS)
Fleury, Jules; Brunier, Guillaume; Michaud, Emma; Anthony, Edward; Dussouillez, Philippe; Morvan, Sylvain
2016-04-01
Mud banks are the loci of rich bio-geo-chemical processes occurring rapidly at infra-tide frequency. Their surface topography is commonly affected by many of these processes, including bioturbation, water drainage or desiccation. Quantifying surface morphology and changes on a mud bank at the micro-scale is a challenging task due to a number of issues. First, the water-saturated nature of the soil makes it difficult to measure High Resolution Topography (HRT) with classical methods. Second, setting up an instrumented experiment without disrupting the signal being studied is difficult to achieve at the micro-scale. Finally, the highly mobile nature of this environment, which produces strong spatio-temporal heterogeneity, is hard to capture. Terrestrial Laser Scanning (TLS) and SfM (Structure from Motion) photogrammetry are two techniques that enable mapping of micro-scale features, but the first technique is not suitable because of the poor quality of the backscattered laser signal on wet surfaces and the need to set up several measuring stations on a complex, unstable substrate. Thus, we set up an experiment to assess the feasibility and the accuracy of SfM in such a context. We took the opportunity of the installation of a pontoon dedicated to the study of bio-geochemical processes within benthic mesocosms installed on a mud bank inhabited by pioneer mangrove trees to develop an adapted photogrammetry protocol based on a full-frame remotely triggered camera sensor mounted on a pole. The incident light on the surface was also controlled with a light-diffusing device. We obtained sub-millimetric resolution 3D topography and visible imagery. Surveys were carried out every 2 hours at low tide to detect surface changes due to water content variation as well as bioturbation, mainly caused by crabs digging galleries and feeding on the sediment surface. 
Both the qualitative and quantitative results seem very promising and lead us to expect new insights into heterogeneous surface processes on a highly dynamic mud bank. Remaining issues are finding appropriate validation data at such a high level of resolution in order to assess accuracy, and developing an acquisition method at a frequency high enough to enable us to decipher bulk soil movement from local changing features.
Technical Directions In High Resolution Non-Impact Printers
NASA Astrophysics Data System (ADS)
Dunn, S. Thomas; Dunn, Patrice M.
1987-04-01
There are several factors to consider when addressing the issue of non-impact printer resolution. One will find differences between the imaging resolution and the final output resolution, and most assuredly differences exist between the advertised and actual resolution of many of these systems. Beyond that, some of the technical factors that affect the resolution of a system include scan line density, overlap, spot size, energy profile, and symmetry of imaging. Generally speaking, the user of graphic arts equipment is best advised to view output to determine the degree of acceptable quality.
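The gap between advertised (addressable) and actual (resolvable) resolution can be illustrated with a toy calculation. The model and numbers below are illustrative assumptions only, not a metric from the paper:

```python
def effective_dpi(scan_lines_per_inch, spot_diameter_in, overlap_fraction):
    """Toy model separating addressable from resolvable output resolution.

    Addressable resolution is set by the scan line density alone; what is
    actually resolvable is limited by the spot diameter and by how much
    adjacent spots may overlap before merging into a single mark.
    """
    addressable = scan_lines_per_inch
    min_spacing = spot_diameter_in * (1.0 - overlap_fraction)  # closest distinct spots
    resolvable = 1.0 / min_spacing
    return addressable, min(addressable, resolvable)

# A printer addressed at 300 lines/inch but writing a 1/100-inch spot with
# no overlap control resolves far less than its addressable density suggests.
adv, actual = effective_dpi(300, 0.01, 0.0)
```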
NASA Astrophysics Data System (ADS)
Bityurin, N. M.
2010-12-01
This paper considers nanostructuring of solid surfaces by nano-optical techniques, primarily by laser particle nanolithography. Threshold processes are examined that can be used for laser structuring of solid surfaces, with particular attention to laser swelling of materials. Fundamental spatial resolution issues in three-dimensional (3D) laser nanostructuring are analysed with application to laser nanopolymerisation and 3D optical information recording. The formation of nanostructures in the bulk of solids due to their structural instability under irradiation is exemplified by photoinduced formation of nanocomposites.
Effects of pore-scale physics on uranium geochemistry in Hanford sediments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, Qinhong; Ewing, Robert P.
Overall, this work examines a key scientific issue, mass transfer limitations at the pore scale, using both new instruments with high spatial resolution and new conceptual and modeling paradigms. The complementary laboratory and numerical approaches connect pore-scale physics to macroscopic measurements, providing a previously elusive scale integration. This exploratory research project produced five peer-reviewed journal publications and eleven scientific presentations. This work provides new scientific understanding, allowing the DOE to better incorporate coupled physical and chemical processes into decision making for environmental remediation and long-term stewardship.
The Regional Superfund Engineering Forum is a group of EPA professionals, representing EPA's Regional Superfund Offices, committed to the identification and resolution of engineering issues impacting the remediation of Superfund sites. The Forum is supported by and advises the ...
Imaging dynamic redox processes with genetically encoded probes.
Ezeriņa, Daria; Morgan, Bruce; Dick, Tobias P
2014-08-01
Redox signalling plays an important role in many aspects of physiology, including that of the cardiovascular system. Perturbed redox regulation has been associated with numerous pathological conditions; nevertheless, the causal relationships between redox changes and pathology often remain unclear. Redox signalling involves the production of specific redox species at specific times in specific locations. However, until recently, the study of these processes has been impeded by a lack of appropriate tools and methodologies that afford the necessary redox species specificity and spatiotemporal resolution. Recently developed genetically encoded fluorescent redox probes now allow dynamic real-time measurements, of defined redox species, with subcellular compartment resolution, in intact living cells. Here we discuss the available genetically encoded redox probes in terms of their sensitivity and specificity and highlight where uncertainties or controversies currently exist. Furthermore, we outline major goals for future probe development and describe how progress in imaging methodologies will improve our ability to employ genetically encoded redox probes in a wide range of situations. This article is part of a special issue entitled "Redox Signalling in the Cardiovascular System." Copyright © 2014 Elsevier Ltd. All rights reserved.
Embedded Implementation of VHR Satellite Image Segmentation
Li, Chao; Balla-Arabé, Souleymane; Ginhac, Dominique; Yang, Fan
2016-01-01
Processing and analysis of Very High Resolution (VHR) satellite images provide a mass of crucial information, which can be used for urban planning, security issues or environmental monitoring. However, they are computationally expensive and, thus, time consuming, while some of the applications, such as natural disaster monitoring and prevention, require high efficiency performance. Fortunately, parallel computing techniques and embedded systems have made great progress in recent years, and a series of massively parallel image processing devices, such as digital signal processors or Field Programmable Gate Arrays (FPGAs), have been made available to engineers at a very convenient price and demonstrate significant advantages in terms of running cost, embeddability, power consumption, flexibility, etc. In this work, we designed a texture region segmentation method for very high resolution satellite images by using the level set algorithm and the multi-kernel theory in a high-abstraction C environment and realized its register-transfer level implementation with the help of a newly proposed high-level synthesis-based design flow. The evaluation experiments demonstrate that the proposed design can produce high quality image segmentation with a significant running-cost advantage. PMID:27240370
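The paper's multi-kernel level-set design targets an FPGA; a minimal software analogue of a region-based level-set update (Chan-Vese flavour, with the curvature regularization omitted for brevity) might look like the sketch below. Everything here is an illustrative stand-in, not the authors' implementation:

```python
import numpy as np

def region_level_set(img, n_iter=200, dt=0.5):
    """Simplified two-phase region-based level-set segmentation.

    phi < 0 marks the segmented region. Each iteration recomputes the two
    region means and pushes every pixel toward the mean it matches better;
    the usual curvature (smoothness) term is omitted for brevity.
    """
    h, w = img.shape
    phi = np.ones((h, w))
    phi[h // 8 : h // 8 + h // 2, w // 8 : w // 8 + w // 2] = -1.0  # off-centre seed
    for _ in range(n_iter):
        inside = phi < 0
        c1 = img[inside].mean() if inside.any() else 0.0      # mean inside
        c2 = img[~inside].mean() if (~inside).any() else 0.0  # mean outside
        force = (img - c1) ** 2 - (img - c2) ** 2   # < 0 where pixel fits "inside"
        phi += dt * force / (np.abs(force).max() + 1e-12)     # normalized step
    return phi < 0

# Synthetic check: recover a bright square on a dark background.
img = np.zeros((40, 40))
img[10:30, 10:30] = 1.0
seg = region_level_set(img)
```

The hardware appeal of this family of methods is visible even here: the per-pixel update is local and data-parallel, which is what makes an RTL implementation attractive.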
Scale-Resolving simulations (SRS): How much resolution do we really need?
NASA Astrophysics Data System (ADS)
Pereira, Filipe M. S.; Girimaji, Sharath
2017-11-01
Scale-resolving simulations (SRS) are emerging as the computational approach of choice for many engineering flows with coherent structures. The SRS methods seek to resolve only the most important features of the coherent structures and model the remainder of the flow field with canonical closures. With reference to a typical Large-Eddy Simulation (LES), practical SRS methods aim to resolve a considerably narrower range of scales (reduced physical resolution) to achieve an adequate degree of accuracy at reasonable computational effort. While the objective of SRS is well-founded, the criteria for establishing the optimal degree of resolution required to achieve an acceptable level of accuracy are not clear. This study considers the canonical case of the flow around a circular cylinder to address the issue of `optimal' resolution. Two important criteria are developed. The first condition addresses the issue of adequate resolution of the flow field. The second guideline provides an assessment of whether the modeled field is canonical (stochastic) turbulence amenable to closure-based computations.
NASA Technical Reports Server (NTRS)
1990-01-01
This NASA Audit Follow-up Handbook is issued pursuant to the requirements of the Office of Management and Budget (OMB) Circular A-50, Audit Follow-up, dated September 29, 1982. It sets forth policy, uniform performance standards, and procedural guidance to NASA personnel for use when considering reports issued by the Office of Inspector General (OIG), other executive branch audit organizations, the Defense Contract Audit Agency (DCAA), and the General Accounting Office (GAO). It is intended to: specify principal roles; strengthen the procedures for management decisions (resolution) on audit findings and corrective action on audit report recommendations; emphasize the importance of monitoring agreed upon corrective actions to assure actual accomplishment; and foster the use of audit reports as effective tools of management. A flow chart depicting the NASA audit and management decision process is in Appendix A. This handbook is a controlled handbook issued in loose-leaf form and will be revised by page changes. Additional copies for internal use may be obtained through normal distribution channels.
Marine ice sheet model performance depends on basal sliding physics and sub-shelf melting
NASA Astrophysics Data System (ADS)
Gladstone, Rupert Michael; Warner, Roland Charles; Galton-Fenzi, Benjamin Keith; Gagliardini, Olivier; Zwinger, Thomas; Greve, Ralf
2017-01-01
Computer models are necessary for understanding and predicting marine ice sheet behaviour. However, there is uncertainty over implementation of physical processes at the ice base, both for grounded and floating glacial ice. Here we implement several sliding relations in a marine ice sheet flow-line model accounting for all stress components and demonstrate that model resolution requirements are strongly dependent on both the choice of basal sliding relation and the spatial distribution of ice shelf basal melting. Sliding relations that reduce the magnitude of the step change in basal drag from grounded ice to floating ice (where basal drag is set to zero) show reduced dependence on resolution compared to a commonly used relation, in which basal drag is purely a power law function of basal ice velocity. Sliding relations in which basal drag goes smoothly to zero as the grounding line is approached from inland (due to a physically motivated incorporation of effective pressure at the bed) provide further reduction in resolution dependence. A similar issue is found with the imposition of basal melt under the floating part of the ice shelf: melt parameterisations that reduce the abruptness of change in basal melting from grounded ice (where basal melt is set to zero) to floating ice provide improved convergence with resolution compared to parameterisations in which high melt occurs adjacent to the grounding line. Thus physical processes, such as sub-glacial outflow (which could cause high melt near the grounding line), impact on the capability to simulate marine ice sheets. If there exists an abrupt change across the grounding line in either basal drag or basal melting, then high resolution will be required to solve the problem. 
However, the plausible combination of a physical dependency of basal drag on effective pressure, and the possibility of low ice shelf basal melt rates next to the grounding line, may mean that some marine ice sheet systems can be reliably simulated at a coarser resolution than currently thought necessary.
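The contrast between a purely velocity-dependent sliding law and an effective-pressure-modified one can be sketched as follows. The coefficients and functional forms are generic textbook choices (Weertman and Budd-type relations), not the specific relations tested in the paper:

```python
import numpy as np

RHO_I, RHO_W, G = 917.0, 1028.0, 9.81   # ice/seawater density (kg/m^3), gravity (m/s^2)

def effective_pressure(H, b):
    """N = ice overburden minus ocean water pressure at the bed (bed elevation b < 0).
    N reaches zero at flotation, i.e. exactly at the grounding line."""
    return np.maximum(RHO_I * G * H - RHO_W * G * np.maximum(-b, 0.0), 0.0)

def weertman_drag(u, C=1e4, m=3.0):
    """Power-law sliding: basal drag depends only on sliding speed u, so it
    steps abruptly to zero where the ice goes afloat."""
    return C * np.abs(u) ** (1.0 / m)

def budd_drag(u, H, b, C=1e-2, m=3.0):
    """Effective-pressure-modified sliding: drag vanishes smoothly as the
    grounding line is approached from inland."""
    return C * effective_pressure(H, b) * np.abs(u) ** (1.0 / m)
```

At flotation thickness the Budd-type drag goes to zero while the Weertman drag stays finite, which is precisely the step change the abstract links to higher resolution requirements.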
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shrestha, Roshan; Houser, Paul R.; Anantharaj, Valentine G.
2011-04-01
Precipitation products are currently available from various sources at higher spatial and temporal resolution than at any time in the past. Each of the precipitation products has its strengths and weaknesses in availability, accuracy, resolution, retrieval techniques and quality control. By merging the precipitation data obtained from multiple sources, one can improve the information content by minimizing these issues. However, precipitation data merging poses challenges of scale mismatch and accurate error and bias assessment. In this paper we present Optimal Merging of Precipitation (OMP), a new method to merge precipitation data from multiple sources that are of different spatial and temporal resolutions and accuracies. This method is a combination of scale conversion and merging weight optimization, involving performance tracing based on Bayesian statistics and trend analysis, which yields merging weights for each precipitation data source. The weights are optimized at multiple scales to facilitate multiscale merging and better precipitation downscaling. Precipitation data used in the experiment include products from the 12-km resolution North American Land Data Assimilation (NLDAS) system, the 8-km resolution CMORPH and the 4-km resolution National Stage-IV QPE. The test cases demonstrate that the OMP method is capable of identifying a better data source and allocating a higher priority to it in the merging procedure, dynamically over the region and time period. This method is also effective in filtering out poor-quality data introduced into the merging process.
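The weight-optimization idea can be illustrated, in much-simplified form, by inverse-error-variance weighting of co-registered products. The function and numbers below are illustrative assumptions, not the OMP algorithm itself:

```python
import numpy as np

def merge_precip(estimates, error_vars):
    """Merge co-gridded precipitation fields with inverse-error-variance weights.

    estimates  : sequence of 2-D arrays already regridded to a common resolution
    error_vars : one error variance per product (e.g. tracked against a reference)
    Lower-error products receive higher weight, mirroring the OMP idea of
    dynamically allocating priority to the better data source.
    """
    w = 1.0 / np.asarray(error_vars, dtype=float)
    w /= w.sum()   # weights sum to one
    return sum(wi * np.asarray(est, dtype=float) for wi, est in zip(w, estimates))

# Example: a reliable product (error variance 1) and a noisier one (variance 4)
# get weights 0.8 and 0.2 respectively.
good = np.full((3, 3), 1.0)
noisy = np.full((3, 3), 3.0)
merged = merge_precip([good, noisy], [1.0, 4.0])
```

The real method additionally handles scale conversion and lets the weights evolve over region and time via Bayesian performance tracing, which this static sketch omits.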
14 CFR 16.202 - Powers of a hearing officer.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 16.202 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION PROCEDURAL...) Issue subpoenas authorized by law and issue notices of deposition requested by the parties; (d) Limit... unnecessary and duplicative proceedings in the interest of prompt and fair resolution of the matters at issue...
Current Issues in Canadian Education.
ERIC Educational Resources Information Center
Bergen, John J.
Based on interviews with 150 persons in departments of education and in national, provincial, and territorial education organizations in Canada's major capital cities, this paper discusses seven vital issues in Canadian education and briefly states seven others. The seven major issues needing resolution concern: (1) the appropriate balance between…
United States and Western Europe cooperation in planetary exploration
NASA Technical Reports Server (NTRS)
Levy, Eugene H.; Hunten, Donald M.; Masursky, Harold; Scarf, Frederick L.; Solomon, Sean C.; Wilkening, Laurel L.; Fechtig, Hugo; Balsiger, Hans; Blamont, Jacques; Fulchignoni, Marcello
1989-01-01
A framework was sought for U.S.-European cooperation in planetary exploration. Specific issues addressed include: types and levels of possible cooperative activities in the planetary sciences; specific or general scientific areas that seem most promising as the main focus of cooperative efforts; potential mission candidates for cooperative ventures; identification of special issues or problems for resolution by negotiation between the agencies, and possible suggestions for their resolutions; and identification of coordinated technological and instrumental developments for planetary missions.
Cost and Training Effectiveness Analysis (CTEA) Performance Guide.
1980-09-01
planning the CTEA and provides illustrative methods to be adapted to the particular data availability situations of the CTEA analyst. The report covers: assessment of available data; issues requiring resolution; strategies for conducting the CTEA; resolution of issues (ITV CTEA, DRIMS CTEA, trainability analysis); and a sample procedure with assessment of data.
An automated procedure for detection of IDP's dwellings using VHR satellite imagery
NASA Astrophysics Data System (ADS)
Jenerowicz, Malgorzata; Kemper, Thomas; Soille, Pierre
2011-11-01
This paper presents the results for the estimation of dwelling structures in Al Salam IDP Camp, Southern Darfur, based on Very High Resolution multispectral satellite images, obtained by implementation of Mathematical Morphology analysis. A series of image processing procedures, feature extraction methods and textural analyses have been applied in order to provide reliable information about dwelling structures. One of the issues in this context is related to the similarity of the spectral response of thatched dwellings' roofs and the surroundings in the IDP camps, where the exploitation of multispectral information is crucial. This study shows the advantage of an automatic extraction approach and highlights the importance of detailed spatial and spectral information analysis based on a multi-temporal dataset. The additional data fusion of the high-resolution panchromatic band with the lower-resolution multispectral bands of the WorldView-2 satellite has a positive influence on the results and thereby can be useful for humanitarian aid agencies, providing support for decisions and estimations of population, especially in situations when frequent revisits by a space imaging system are the only possibility of continued monitoring.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vereschagin, Konstantin A; Vereschagin, Alexey K; Smirnov, Valery V
2006-07-31
A high-resolution spectroscopic method is developed for recording Raman spectra of molecular transitions in transient objects during a laser pulse with a resolution of ≈0.1 cm⁻¹. The method is based on CARS spectroscopy using a Fabry-Perot interferometer for spectral analysis of the CARS signal and detecting a circular interferometric pattern on a two-dimensional multichannel photodetector. It is shown that the use of the Dual-Broad-Band-CARS configuration to drive the CARS process provides efficient averaging of the spectral-amplitude noise of the CARS signal generated by a laser pulse and, in combination with the angular integration of the two-dimensional interference pattern, considerably improves the quality of interferograms. The method was tested in diagnostics of a transient oxygen-hydrogen flame, where information on the shapes of spectral lines of the Q-branch of hydrogen molecules required for measuring temperature was simultaneously obtained and used. (Special issue devoted to the 90th anniversary of A.M. Prokhorov.)
A high-resolution synthetic bed elevation grid of the Antarctic continent
NASA Astrophysics Data System (ADS)
Graham, Felicity S.; Roberts, Jason L.; Galton-Fenzi, Ben K.; Young, Duncan; Blankenship, Donald; Siegert, Martin J.
2017-05-01
Digital elevation models of Antarctic bed topography are smoothed and interpolated onto low-resolution ( > 1 km) grids as current observed topography data are generally sparsely and unevenly sampled. This issue has potential implications for numerical simulations of ice-sheet dynamics, especially in regions prone to instability where detailed knowledge of the topography, including fine-scale roughness, is required. Here, we present a high-resolution (100 m) synthetic bed elevation terrain for Antarctica, encompassing the continent, continental shelf, and seas south of 60° S. Although not identically matching observations, the synthetic bed surface - denoted as HRES - preserves topographic roughness characteristics of airborne and ground-based ice-penetrating radar data measured by the ICECAP (Investigating the Cryospheric Evolution of the Central Antarctic Plate) consortium or used to create the Bedmap1 compilation. Broad-scale ( > 5 km resolution) features of the Antarctic landscape are incorporated using a low-pass filter of the Bedmap2 bed elevation data. HRES has applicability in high-resolution ice-sheet modelling studies, including investigations of the interaction between topography, ice-sheet dynamics, and hydrology, where processes are highly sensitive to bed elevations and fine-scale roughness. The data are available for download from the Australian Antarctic Data Centre (doi:10.4225/15/57464ADE22F50).
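The construction sketched in the abstract, broad-scale topography from a low-pass-filtered DEM plus superimposed fine-scale roughness, can be illustrated in miniature. The filter widths, block upsampling, and Gaussian roughness model below are illustrative stand-ins, not the HRES procedure:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def synthetic_bed(coarse_bed, roughness_std, upscale=10, seed=0):
    """Toy synthetic high-resolution bed: smooth and upsample a coarse DEM
    for the broad scales, then superimpose spatially correlated noise scaled
    to a target fine-scale roughness (standard deviation, in metres)."""
    rng = np.random.default_rng(seed)
    broad = np.kron(gaussian_filter(coarse_bed, sigma=1.0),
                    np.ones((upscale, upscale)))            # block upsampling
    rough = gaussian_filter(rng.standard_normal(broad.shape), sigma=2.0)
    rough *= roughness_std / rough.std()                    # impose target roughness
    return broad + rough

# A flat 5 x 5 coarse grid yields a 50 x 50 surface whose variability is
# entirely the imposed 50 m roughness.
bed = synthetic_bed(np.zeros((5, 5)), roughness_std=50.0)
```

A realistic product would instead condition the roughness statistics on radar observations region by region, which is the part of HRES this toy necessarily skips.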
NASA Astrophysics Data System (ADS)
Podimata, Marianthi V.; Yannopoulos, Panayotis C.
2015-04-01
Water managers, decision-makers, water practitioners and others involved in Integrated Water Resources Management often face the problem of reaching a joint agreement among stakeholders on the management of a common water body. Handling conflicts and disputes over water issues and finding an acceptable joint solution remain thorny problems in water negotiation processes, since a formula for wise, fair and sustainable management of a water resource must reconcile environmental, economic, technical and socio-political criteria and their uncertainties. Decision Support Systems and Adaptive Management are increasingly used in that direction. To assist decision makers in handling water disputes and conducting negotiations, a conceptual tool is required. The Graph Model for Conflict Resolution is a flexible Decision Support tool for negotiation support in water conflicts. It includes efficient algorithms for estimating the strategic moves of water stakeholders even when detail about their real motives and prospects is lacking, calculates the stability of their states, and supports what-if analyses. This paper presents a case study of water decision makers' evaluations concerning the management of the upcoming Peiros-Parapeiros Dam in the Achaia Region (Greece). Continuous consultations between institutions and representatives revealed that forming a joint agreement between stakeholders is not easy, owing to conflicts and contradictions regarding the jurisdiction and legal status of the dam operator and responsibility for the cost of the dam's operation. This paper analyzes the positions of the parties involved in the consultation process and examines possible conflict resolution states using GMCR II.
This methodology reduces, to a certain extent, the uncertainty concerning the possible moves and decisions of the involved parties regarding the operation and management of the dam, by developing and simulating potential strategic interactions and multilateral negotiations and by finding confidence-building cooperation schemes (cooperative arrangements) over water use and management.
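At the core of graph-model conflict analysis of the kind GMCR II automates is a stability check: a state is Nash-stable for a decision maker if none of that decision maker's unilateral moves reaches a state it prefers. A minimal sketch, using a hypothetical two-player conflict (the players, states, moves, and rankings below are invented for illustration, not the Peiros-Parapeiros case):

```python
# Hedged sketch of a Nash-stability check over a small graph model.
# States reachable by each decision maker's own unilateral moves, per state.
moves = {
    "operator": {"s1": ["s2"], "s2": ["s1"], "s3": ["s4"], "s4": ["s3"]},
    "region":   {"s1": ["s3"], "s2": ["s4"], "s3": ["s1"], "s4": ["s2"]},
}
# Preference rankings, most preferred first.
prefs = {
    "operator": ["s4", "s2", "s3", "s1"],
    "region":   ["s1", "s3", "s2", "s4"],
}

def nash_stable(dm, state):
    """True if no unilateral move takes dm to a state it strictly prefers."""
    rank = prefs[dm].index          # lower index = more preferred
    return all(rank(t) >= rank(state) for t in moves[dm][state])

# An equilibrium is a state that is Nash-stable for every decision maker.
equilibria = [s for s in ["s1", "s2", "s3", "s4"]
              if all(nash_stable(dm, s) for dm in moves)]
```

GMCR II layers richer stability notions (e.g. sequential and general metarationality) on top of the same state/move/preference structure.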
Globally Gridded Satellite observations for climate studies
Knapp, K.R.; Ansari, S.; Bain, C.L.; Bourassa, M.A.; Dickinson, M.J.; Funk, Chris; Helms, C.N.; Hennon, C.C.; Holmes, C.D.; Huffman, G.J.; Kossin, J.P.; Lee, H.-T.; Loew, A.; Magnusdottir, G.
2011-01-01
Geostationary satellites have provided routine, high temporal resolution Earth observations since the 1970s. Despite the long period of record, use of these data in climate studies has been limited for numerous reasons, among them that no central archive of geostationary data for all international satellites exists, full temporal and spatial resolution data are voluminous, and diverse calibration and navigation formats encumber the uniform processing needed for multisatellite climate studies. The International Satellite Cloud Climatology Project (ISCCP) set the stage for overcoming these issues by archiving a subset of the full-resolution geostationary data at ~10-km resolution at 3-hourly intervals since 1983. Recent efforts at NOAA's National Climatic Data Center to provide convenient access to these data include remapping the data to a standard map projection, recalibrating the data to optimize temporal homogeneity, extending the record of observations back to 1980, and reformatting the data for broad public distribution. The Gridded Satellite (GridSat) dataset includes observations from the visible, infrared window, and infrared water vapor channels. Data are stored in Network Common Data Format (netCDF) using standards that permit a wide variety of tools and libraries to process the data quickly and easily. A novel data layering approach, together with appropriate satellite and file metadata, allows users to access GridSat data at varying levels of complexity based on their needs. The result is a climate data record already in use by the meteorological community. Examples include reanalysis of tropical cyclones, studies of global precipitation, and detection and tracking of the intertropical convergence zone.
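The remapping step described above, taking irregularly located satellite pixels onto a standard regular grid, can be sketched as simple bin-averaging. The grid spacing and sample pixels below are illustrative; GridSat's actual remapping, recalibration, and netCDF layering are considerably more involved.

```python
# Hedged sketch: average irregularly located satellite pixels onto a
# regular lat/lon grid (a toy analogue of remapping to a standard projection).

def remap(pixels, lat0, lon0, nlat, nlon, step):
    """pixels: iterable of (lat, lon, value). Returns nlat x nlon grid of means."""
    total = [[0.0] * nlon for _ in range(nlat)]
    count = [[0] * nlon for _ in range(nlat)]
    for lat, lon, val in pixels:
        i = int((lat - lat0) / step)
        j = int((lon - lon0) / step)
        if 0 <= i < nlat and 0 <= j < nlon:
            total[i][j] += val
            count[i][j] += 1
    return [[total[i][j] / count[i][j] if count[i][j] else None
             for j in range(nlon)] for i in range(nlat)]

# Two pixels fall in the same 0.1-degree cell and are averaged; empty cells stay None.
grid = remap([(10.02, 20.03, 250.0), (10.04, 20.07, 260.0), (10.15, 20.0, 280.0)],
             lat0=10.0, lon0=20.0, nlat=2, nlon=2, step=0.1)
```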
Finite slice analysis (FINA) of sliced and velocity mapped images on a Cartesian grid
NASA Astrophysics Data System (ADS)
Thompson, J. O. F.; Amarasinghe, C.; Foley, C. D.; Rombes, N.; Gao, Z.; Vogels, S. N.; van de Meerakker, S. Y. T.; Suits, A. G.
2017-08-01
Although time-sliced imaging yields improved signal-to-noise and resolution compared with unsliced velocity-mapped ion images, for the finite slice widths encountered in real experiments there is a loss of resolution and of recovered intensities for the slow fragments. Recently, we reported a new approach that permits correction of these effects for an arbitrarily sliced distribution of a 3D charged-particle cloud. This finite slice analysis (FinA) method utilizes basis functions that model the out-of-plane contribution of a given velocity component to the image for sequential subtraction in a spherical polar coordinate system. However, the original approach suffers from a slow processing time due to the weighting procedure needed to accurately model the out-of-plane projection of an anisotropic angular distribution. To overcome this issue we present a variant of the method in which the FinA approach is performed in a cylindrical coordinate system (Cartesian in the image plane) rather than a spherical polar coordinate system. We show how this variant, dubbed C-FinA, is applied in much the same manner. We compare it to the polar FinA method and find that the processing time (of a 510 × 510 pixel image) improves in the most extreme case by a factor of 100. We also show that although the resulting velocity resolution is not quite as high as the polar version's, this new approach shows superior resolution for fine structure in the differential cross sections. We demonstrate the method on a range of experimental and synthetic data at different effective slice widths.
Globally Gridded Satellite (GridSat) Observations for Climate Studies
NASA Technical Reports Server (NTRS)
Knapp, Kenneth R.; Ansari, Steve; Bain, Caroline L.; Bourassa, Mark A.; Dickinson, Michael J.; Funk, Chris; Helms, Chip N.; Hennon, Christopher C.; Holmes, Christopher D.; Huffman, George J.;
2012-01-01
Geostationary satellites have provided routine, high temporal resolution Earth observations since the 1970s. Despite the long period of record, use of these data in climate studies has been limited for numerous reasons, among them: there is no central archive of geostationary data for all international satellites, full-resolution temporal and spatial data are voluminous, and diverse calibration and navigation formats encumber the uniform processing needed for multi-satellite climate studies. The International Satellite Cloud Climatology Project set the stage for overcoming these issues by archiving a subset of the full-resolution geostationary data at approximately 10-km resolution at 3-hourly intervals since 1983. Recent efforts at NOAA's National Climatic Data Center to provide convenient access to these data include remapping the data to a standard map projection, recalibrating the data to optimize temporal homogeneity, extending the record of observations back to 1980, and reformatting the data for broad public distribution. The Gridded Satellite (GridSat) dataset includes observations from the visible, infrared window, and infrared water vapor channels. Data are stored in the netCDF format using standards that permit a wide variety of tools and libraries to quickly and easily process the data. A novel data layering approach, together with appropriate satellite and file metadata, allows users to access GridSat data at varying levels of complexity based on their needs. The result is a climate data record already in use by the meteorological community. Examples include reanalysis of tropical cyclones, studies of global precipitation, and detection and tracking of the intertropical convergence zone.
Distinct conflict resolution deficits related to different facets of Schizophrenia.
Kerns, John G
2009-11-01
An important issue in understanding the nature of conflict processing is whether it is a unitary or a multidimensional construct. One way to examine this is to study whether people with impaired conflict processing exhibit a general pattern of deficits or impairments in distinct aspects of conflict processing. One group that might exhibit conflict deficits is people with schizophrenia. Schizophrenia is a heterogeneous disorder, and one way to break down this heterogeneity is to examine specific symptoms. Previous research has found that specific symptoms of schizophrenia are associated with specific deficits in conflict processing: disorganization is associated with increased response conflict, alogia with increased retrieval conflict, and anhedonia with increased emotional conflict. Moreover, there is evidence that different types of conflict processing are unassociated with each other. This evidence suggests that conflict processing is a multidimensional construct and that different aspects of schizophrenia are associated with impairments in processing different types of conflict.
The Morality of Socioscientific Issues: Construal and Resolution of Genetic Engineering Dilemmas
ERIC Educational Resources Information Center
Sadler, Troy D.; Zeidler, Dana L.
2004-01-01
The ability to negotiate and resolve socioscientific issues has been posited as integral components of scientific literacy. Although philosophers and science educators have argued that socioscientific issues inherently involve moral and ethical considerations, the ultimate arbiters of morality are individual decision-makers. This study explored…
Assistive Technology and Academic Libraries: Legal Issues and Problem Resolution
ERIC Educational Resources Information Center
Green, Ravonne A.
2009-01-01
Legal issues have increasingly come to the forefront in academic libraries in recent years. Most of these issues involve Section 504 of the Rehabilitation Act (1973) or the Americans with Disabilities Act (1990): complaints related to discriminatory practices with regard to accommodations or assistive technologies. This article provides a brief synopsis…
Hydrologic applications of weather radar
NASA Astrophysics Data System (ADS)
Seo, Dong-Jun; Habib, Emad; Andrieu, Hervé; Morin, Efrat
2015-12-01
By providing high-resolution quantitative precipitation information (QPI), weather radars have revolutionized hydrology in the last two decades. With the aid of GIS technology, radar-based quantitative precipitation estimates (QPE) have enabled routine high-resolution hydrologic modeling in many parts of the world. Given the ever-increasing need for higher-resolution hydrologic and water resources information for a wide range of applications, one may expect that the use of weather radar will only grow. Despite the tremendous progress, a number of significant scientific, technological and engineering challenges remain to realize its potential. New challenges are also emerging as new areas of applications are discovered, explored and pursued. The purpose of this special issue is to provide the readership with some of the latest advances, lessons learned, experiences gained, and science issues and challenges related to hydrologic applications of weather radar. The special issue features 20 contributions on various topics which reflect the increasing diversity as well as the areas of focus in radar hydrology today. The contributions may be grouped as follows:
Transmission Pricing Issues for Electricity Generation From Renewable Resources
1999-01-01
This article discusses how the resolution of transmission pricing issues which have arisen under the Federal Energy Regulatory Commission's (FERC) open access environment may affect the prospects for renewable-based electricity.
A Review on Potential Issues and Challenges in MR Imaging
Kanakaraj, Jagannathan
2013-01-01
Magnetic resonance imaging is a noninvasive technique that has been developed for its excellent depiction of soft-tissue contrast. Instruments capable of ultra-high field strengths, ≥7 Tesla, were recently engineered and have produced images with higher signal-to-noise ratios and higher resolution. This paper presents the various subsystems of MR imaging systems, such as the magnet and gradient subsystems, along with the issues that arise from the magnet itself. It also gives finer details of the RF coils and transceiver and their limitations. Moreover, the concept behind the data processing system and the challenges related to it are described. Finally, the various artifacts associated with MR imaging are pointed out, together with a brief overview of all the challenges related to MR imaging systems. PMID:24381523
Negotiating Decisions during Informed Consent for Pediatric Phase I Oncology Trials
Marshall, Patricia A.; Magtanong, Ruth V.; Leek, Angela C.; Hizlan, Sabahat; Yamokoski, Amy D.; Kodish, Eric D.
2012-01-01
During informed consent conferences (ICCs) for Phase I trials, oncologists must present complex information while addressing concerns. Research on communication that evolves during ICCs remains largely unexplored. We examined communication during ICCs for pediatric Phase I cancer trials using a stratified random sample from six pediatric cancer centers. A grounded theory approach identified key communication steps and factors influencing the negotiation of decisions for trial participation. Analysis suggests that during ICCs, families, patients, and clinicians exercise choice and control by negotiating micro-decisions in two broad domains: drug logic and logistics, and administration/scheduling. Micro-decisions unfold in a four-step communication process: (1) introduction of an issue; (2) response; (3) negotiation of the issue; and (4) resolution and decision. Negotiation over smaller micro-decisions is prominent in ICCs and merits further study. PMID:22565583
Negotiating decisions during informed consent for pediatric Phase I oncology trials.
Marshall, Patricia A; Magtanong, Ruth V; Leek, Angela C; Hizlan, Sabahat; Yamokoski, Amy D; Kodish, Eric D
2012-04-01
During informed consent conferences (ICCs) for Phase I trials, oncologists must present complex information while addressing concerns. Research on communication that evolves during ICCs remains largely unexplored. We examined communication during ICCs for pediatric Phase I cancer trials using a stratified random sample from six pediatric cancer centers. A grounded theory approach identified key communication steps and factors influencing the negotiation of decisions for trial participation. Analysis suggests that during ICCs, families, patients, and clinicians exercise choice and control by negotiating micro-decisions in two broad domains: drug logic and logistics, and administration/scheduling. Micro-decisions unfold in a four-step communication process: (1) introduction of an issue; (2) response; (3) negotiation of the issue; and (4) resolution and decision. Negotiation over smaller micro-decisions is prominent in ICCs and merits further study.
High Resolution Displays Using NCAP Liquid Crystals
NASA Astrophysics Data System (ADS)
Macknick, A. Brian; Jones, Phil; White, Larry
1989-07-01
Nematic curvilinear aligned phase (NCAP) liquid crystals have been found useful for high information content video displays. NCAP materials are liquid crystals which have been encapsulated in a polymer matrix and which have a light transmission which is variable with applied electric fields. Because NCAP materials do not require polarizers, their on-state transmission is substantially better than twisted nematic cells. All dimensional tolerances are locked in during the encapsulation process and hence there are no critical sealing or spacing issues. By controlling the polymer/liquid crystal morphology, switching speeds of NCAP materials have been significantly improved over twisted nematic systems. Recent work has combined active matrix addressing with NCAP materials. Active matrices, such as thin film transistors, have given displays of high resolution. The paper will discuss the advantages of NCAP materials specifically designed for operation at video rates on transistor arrays; applications for both backlit and projection displays will be discussed.
Photomask quality assessment solution for 90-nm technology node
NASA Astrophysics Data System (ADS)
Ohira, Katsumi; Chung, Dong Hoon P.; Nobuyuki, Yoshioka; Tateno, Motonari; Matsumura, Kenichi; Chen, Jiunn-Hung; Luk-Pat, Gerard T.; Fukui, Norio; Tanaka, Yoshio
2004-08-01
As 90 nm LSI devices are about to enter pre-production, the cost and turn-around time of photomasks for such devices will be key factors for success in device production. Such devices will be manufactured with state-of-the-art 193 nm photolithography systems. Photomasks for these devices are being produced with the most advanced equipment, materials and processing technologies, and yet quality assurance remains an issue for volume production. These issues include defect classification and disposition, hampered by the insufficient resolution of the defect inspection system at conventional review and classification steps and by aggressive RETs; uncertainty about the impact defects have on the printed feature; and inconsistencies of classical defect specifications as applied in the sub-wavelength era, which are becoming a serious problem. Simulation-based photomask qualification using the Virtual Stepper System is widely accepted today as a reliable tool for assessing mask defects at both the 180 nm and 130 nm technology nodes. This study examines the extendibility of the Virtual Stepper System to the 90 nm technology node. The proposed method of simulation-based mask qualification uses aerial-image defect simulation in combination with a next-generation DUV inspection system with a shorter wavelength (266 nm) and small pixel size, combined with a DUV high-resolution microscope for some defect cases. This paper presents experimental results that demonstrate applicability to the 90 nm technology node, using both contact and line/space patterns with various programmed defects on ArF attenuated PSM. It also addresses how to make the strategy production-worthy.
Evaluation and development of unmanned aircraft (UAV) for UDOT needs.
DOT National Transportation Integrated Search
2012-05-01
This research involved the use of high-resolution aerial photography obtained from Unmanned Aerial Vehicles (UAV) to aid UDOT in monitoring and documenting State Roadway structures and associated issues. Using geo-referenced UAV high resolution aeria...
Moed, Anat; Gershoff, Elizabeth T; Eisenberg, Nancy; Hofer, Claire; Losoya, Sandra; Spinrad, Tracy L; Liew, Jeffrey
2015-08-01
Although conflict is a normative part of parent-adolescent relationships, conflicts that are long or highly negative are likely to be detrimental to these relationships and to youths' development. In the present article, sequential analyses of data from 138 parent-adolescent dyads (adolescents' mean age was 13.44, SD = 1.16; 52 % girls, 79 % non-Hispanic White) were used to define conflicts as reciprocal exchanges of negative emotion observed while parents and adolescents were discussing "hot," conflictual issues. Dynamic components of these exchanges, including who started the conflicts, who ended them, and how long they lasted, were identified. Mediation analyses revealed that a high proportion of conflicts ended by adolescents was associated with longer conflicts, which in turn predicted perceptions of the "hot" issue as unresolved and adolescent behavior problems. The findings illustrate advantages of using sequential analysis to identify patterns of interactions and, with some certainty, obtain an estimate of the contingent relationship between a pattern of behavior and child and parental outcomes. These interaction patterns are discussed in terms of the roles that parents and children play when in conflict with each other, and the processes through which these roles affect conflict resolution and adolescents' behavior problems.
NASA Astrophysics Data System (ADS)
Benedetti, Laura Robin; Eggert, J. H.; Kilkenny, J. D.; Bradley, D. K.; Bell, P. M.; Palmer, N. E.; Rygg, J. R.; Boehly, T. R.; Collins, G. W.; Sorce, C.
2017-06-01
Since X-ray diffraction is the most definitive method for identifying crystalline phases of a material, it is an important technique for probing high-energy-density materials during laser-driven compression experiments. We are developing a design for collecting several x-ray diffraction datasets during a single laser-driven experiment, with a goal of achieving temporal resolution better than 1ns. The design combines x-ray streak cameras, for a continuous temporal record of diffraction, with fast x-ray imagers, to collect several diffraction patterns with sufficient solid angle range and resolution to identify crystalline texture. Preliminary experiments will be conducted at the Omega laser and then implemented at the National Ignition Facility. We will describe the status of the conceptual design, highlighting tradeoffs in the design process. We will also discuss the technical issues that must be addressed in order to develop a successful experimental platform. These include: Facility-specific geometric constraints such as unconverted laser light and target alignment; EMP issues when electronic diagnostics are close to the target; X-ray source requirements; and detector capabilities. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344, LLNL-ABS-725146.
NASA Astrophysics Data System (ADS)
Brook, Anna; Robins, Lotem; Olmedo Casal, Estrella
2017-04-01
Measuring sea surface salinity (SSS) is a principal component of understanding present-day climate processes and of better understanding future climate change. Different processes create different salt concentrations in different parts of the oceans, and these salinity differences play a role in determining vertical and horizontal water fluxes. Because the first three meters of the ocean surface contain more heat than the whole atmosphere, and because salinity influences the layering of water masses and the associated fluxes, salinity is an important factor in air-sea interaction. A long-standing problem in ocean prediction has been the lack of salinity samples. While sea surface temperature (SST) can be evaluated relatively easily from remote sensing in the near-infrared and visible wavelengths, measuring and locating a salinity spectral signature has been an obstacle. This lack of data has hampered models that describe ocean parameters both at depth and at the surface. One of the main goals of the Soil Moisture and Ocean Salinity (SMOS) program is to deliver global-scale data on SSS. The SMOS measurement principle is based on the differences between the electromagnetic properties (spectral signatures) of fresh and salty water: high salt concentrations are revealed by analyzing the energy emitted from the ocean surface with detectors sensitive to wavelengths around 21 cm (L-band, 1.4 GHz). A main problem at this wavelength is that it requires very large antennas. To solve this problem, a Y-shaped satellite was built with 69 antennas attached to each of its arms at equal spacing; each antenna is 165 mm in diameter and 19 mm in height.
These antennas transmit the information they receive to a central device. Using interferometry and a matching algorithm between the signals of each possible pair of antennas, a synthesis of all the antennas is performed, which overcomes the need for a single large antenna. The SMOS satellite delivers data with an accuracy of 0.1 psu and a spatial resolution of 200 km. The data are available in different forms, from unprocessed data to final products of brightness temperature that are temporally and spatially synchronized (L2 level). By downscaling these products, fields with a spatial resolution of 25 km and a daily temporal resolution are calculated. In other words, the data pass through a long processing chain before reaching scientific analysis: the path from raw data to L2 level involves radiometric and physical corrections, detection and exclusion of atmospheric disturbances, geographic anchoring, and more. To assess the usability of processed SMOS data in the Mediterranean, a simple comparison was made between SMOS and MEDRYS1V2, a reanalysis of the Mediterranean Sea over the years 1992-2013 that is based on the NEMO12 ocean model and forced by the ALDERA atmospheric model. In making this comparison, it is important to remember that the goal of the SMOS program is to deliver data on a global scale, while MEDRYS1V2 was created specifically for the Mediterranean. The comparison of the two data sets reveals two main issues. First, the SMOS product appears to rely on more linear interpolation to describe the space, while the reanalysis is based on primitive physical equations and data assimilation. Second, a large anomaly occurs, probably due to river discharge, which carries a different signature; the low resolution of SMOS may prevent correct detection of the plume without another local data source.
To conclude, the SMOS program, one of whose main goals is to create a reliable global data source for SSS, has an important role in understanding oceanic processes and climate-change patterns. While the global goal contributes to research and development, on a more local scale the Mediterranean, which is mainly studied at high spatial resolution, is not well represented by SMOS products. The main reason is the satellite's low spatial resolution, but owing to its unique technology, different methods could be applied to better represent smaller-scale research.
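The interferometric principle described above, synthesizing a large aperture from many small antennas, can be sketched by enumerating the baselines formed by antenna pairs: the achievable angular resolution scales roughly as wavelength divided by the longest baseline. The arm length and element count below are simplified assumptions, not the actual SMOS/MIRAS geometry.

```python
# Hedged sketch: baselines and angular resolution of a Y-shaped
# interferometric array. Geometry is illustrative, not SMOS's.
import math

WAVELENGTH_M = 0.21          # L-band, ~21 cm (1.4 GHz)

def y_array(n_per_arm, spacing_m):
    """Antenna (x, y) positions: a hub plus three arms 120 degrees apart."""
    positions = [(0.0, 0.0)]
    for arm in range(3):
        ang = math.radians(120 * arm)
        for k in range(1, n_per_arm + 1):
            positions.append((k * spacing_m * math.cos(ang),
                              k * spacing_m * math.sin(ang)))
    return positions

ants = y_array(n_per_arm=4, spacing_m=0.9)
# Every antenna pair contributes one baseline (one interferometric visibility).
pairs = [(a, b) for i, a in enumerate(ants) for b in ants[i + 1:]]
longest = max(math.dist(a, b) for a, b in pairs)
resolution_rad = WAVELENGTH_M / longest   # classic lambda / B_max estimate
```

The longest baseline here runs between the tips of two arms, which is why extending the arms, rather than enlarging any single element, sharpens the synthesized resolution.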
NASA Technical Reports Server (NTRS)
Yudkin, Howard
1988-01-01
The next generation of computer systems is studied by examining processes and methodologies. The present generation is adequate for small projects but not for large ones: it does not address the iterative nature of requirements, resolution, and implementation; it does not address the complexity of requirements stabilization; it does not explicitly address reuse opportunities; and it does not help with people shortages. There is therefore a need to define and automate improved software engineering processes. Some help may be gained from reuse and prototyping, which are two sides of the same coin: reuse library parts are used to generate good approximations to desired solutions, i.e., prototypes, and rapid prototype composition implies the use of preexisting parts, i.e., reusable parts.
The esa earth explorer land surface processes and interactions mission
NASA Astrophysics Data System (ADS)
Labandibar, Jean-Yves; Jubineau, Franck; Silvestrin, Pierluigi; Del Bello, Umberto
2017-11-01
The European Space Agency (ESA) is defining candidate missions for Earth Observation. In the class of the Earth Explorer missions, dedicated to research and pre-operational demonstration, the Land Surface Processes and Interactions Mission (LSPIM) will acquire the accurate quantitative measurements needed to improve our understanding of the nature and evolution of biosphere-atmosphere interactions and to contribute significantly to a solution of the scaling problems for energy, water and carbon fluxes at the Earth's surface. The mission is intended to provide detailed observations of the surface of the Earth and to collect data related to ecosystem processes and radiation balance. It is also intended to address a range of issues important for environmental monitoring, renewable resources assessment and climate models. The mission involves a dedicated maneuvering satellite which provides multi-directional observations for systematic measurement of Land Surface BRDF (BiDirectional Reflectance Distribution Function) of selected sites on Earth. The satellite carries an optical payload : PRISM (Processes Research by an Imaging Space Mission), a multispectral imager providing reasonably high spatial resolution images (50 m over 50 km swath) in the whole optical spectral domain (from 450 nm to 2.35 μm with a resolution close to 10 nm, and two thermal bands from 8.1 to 9.1 μm). This paper presents the results of the Phase A study awarded by ESA, led by ALCATEL Space Industries and concerning the design of LSPIM.
Current research issues related to post-wildfire runoff and erosion processes
Moody, John A.; Shakesby, Richard A.; Robichaud, Peter R.; Cannon, Susan H.; Martin, Deborah A.
2013-01-01
Research into post-wildfire effects began in the United States more than 70 years ago and only later extended to other parts of the world. Post-wildfire responses are typically transient, episodic, variable in space and time, dependent on thresholds, and involve multiple processes measured by different methods. These characteristics tend to hinder research progress, but the large empirical knowledge base amassed in different regions of the world suggests that it should now be possible to synthesize the data and make a substantial improvement in the understanding of post-wildfire runoff and erosion response. Thus, it is important to identify and prioritize the research issues related to post-wildfire runoff and erosion. Priority research issues are the need to: (1) organize and synthesize similarities and differences in post-wildfire responses between different fire-prone regions of the world in order to determine common patterns and generalities that can explain cause and effect relations; (2) identify and quantify functional relations between metrics of fire effects and soil hydraulic properties that will better represent the dynamic and transient conditions after a wildfire; (3) determine the interaction between burned landscapes and temporally and spatially variable meso-scale precipitation, which is often the primary driver of post-wildfire runoff and erosion responses; (4) determine functional relations between precipitation, basin morphology, runoff connectivity, contributing area, surface roughness, depression storage, and soil characteristics required to predict the timing, magnitudes, and duration of floods and debris flows from ungaged burned basins; and (5) develop standard measurement methods that will ensure the collection of uniform and comparable runoff and erosion data. Resolution of these issues will help to improve conceptual and computer models of post-wildfire runoff and erosion processes.
Healthy by Design: Using a Gender Focus to Influence Complete Streets Policy.
Keippel, April Ennis; Henderson, Melissa A; Golbeck, Amanda L; Gallup, TommiLee; Duin, Diane K; Hayes, Stephen; Alexander, Stephanie; Ciemins, Elizabeth L
2017-10-17
Public health leaders in Yellowstone County, Montana, formed an alliance to address community-wide issues. One such issue is Complete Streets, with its vision of safe streets for all. This case study focuses on development and adoption of a Complete Streets policy. It examines how a community coalition, Healthy By Design, infused a gender focus into the policymaking process. An incremental and nonlinear policymaking process was aided by a focus on gender and health equity. The focus on a large constituency helped to frame advocacy in terms of a broad population's needs, not just special interests. The city council unanimously adopted a Complete Streets resolution, informed by a gender lens. Healthy By Design further used gender information to successfully mobilize the community in response to threats of repeal of the policy, and then influenced the adoption of a revised policy. Policies developed with a focus on equity, including gender equity, may have broader impact on the community. Such policies may pave the way for future policies that seek to transform gender norms toward building a healthier community for all residents. Published by Elsevier Inc.
A 4.5 km resolution Arctic Ocean simulation with the global multi-resolution model FESOM 1.4
NASA Astrophysics Data System (ADS)
Wang, Qiang; Wekerle, Claudia; Danilov, Sergey; Wang, Xuezhu; Jung, Thomas
2018-04-01
In the framework of developing a global modeling system which can facilitate modeling studies on the Arctic Ocean and high- to midlatitude linkages, we evaluate the Arctic Ocean simulated by the multi-resolution Finite Element Sea ice-Ocean Model (FESOM). To explore the value of using high horizontal resolution for Arctic Ocean modeling, we use two global meshes differing in horizontal resolution only in the Arctic Ocean (24 km vs. 4.5 km). The high resolution significantly improves the model's representation of the Arctic Ocean. The most pronounced improvement is in the Arctic intermediate layer, in terms of both Atlantic Water (AW) mean state and variability. The deepening and thickening bias of the AW layer, a common issue in coarse-resolution simulations, is significantly alleviated at higher resolution. The topographic steering of the AW is stronger, and the seasonal and interannual temperature variability along the ocean bottom topography is enhanced, in the high-resolution simulation. The high resolution also improves the ocean surface circulation, mainly through a better representation of the narrow straits in the Canadian Arctic Archipelago (CAA). The representation of the CAA throughflow influences not only the release of water masses through the other gateways but also the circulation pathways inside the Arctic Ocean. However, the mean state and variability of Arctic freshwater content, and the variability of freshwater transport through the Arctic gateways, appear not to be very sensitive to the increase in resolution employed here. By highlighting the issues that are independent of model resolution, we argue that other efforts, including improved parameterizations, are still required.
Problems and Issues of Diversity in the United States.
ERIC Educational Resources Information Center
Naylor, Larry L., Ed.
A number of social problems created by cultural diversity in the United States seem to defy resolution. The essays in this collection address some of these issues and are designed to initiate serious discussions on some of the most serious questions. The chapters are: (1) "Introduction to American Cultural Diversity: Unresolved Questions, Issues,…
Issues and Agency: Postgraduate Student and Tutor Experiences with Written Feedback
ERIC Educational Resources Information Center
Sanchez, Hugo Santiago; Dunworth, Katie
2015-01-01
This paper examines the issues which postgraduate students and tutors experienced as they engaged in receiving, providing and requesting feedback, as well as the strategies which they adopted as they sought resolution of these issues. The study employed a case study approach, using data obtained from semi-structured and stimulated recall…
Related Critical Psychometric Issues and Their Resolutions during Development of PE Metrics
ERIC Educational Resources Information Center
Fox, Connie; Zhu, Weimo; Park, Youngsik; Fisette, Jennifer L.; Graber, Kim C.; Dyson, Ben; Avery, Marybell; Franck, Marian; Placek, Judith H.; Rink, Judy; Raynes, De
2011-01-01
In addition to validity and reliability evidence, other psychometric qualities of the PE Metrics assessments needed to be examined. This article describes how those critical psychometric issues were addressed during the PE Metrics assessment bank construction. Specifically, issues included (a) number of items or assessments needed, (b) training…
Ethics issues experienced in HBM within Portuguese health surveillance and research projects.
Reis, M Fátima; Segurado, Susana; Brantes, Ana; Simões, Helena Teresinha; Melim, J Maurício; Geraldes, V; Miguel, J Pereira
2008-06-05
In keeping with the fundamental practice of transparency in the discussion and resolution of ethics conflicts raised by research, a summary of ethics issues raised during Portuguese biomonitoring in health surveillance and research is presented and, where applicable, their resolution is described. Projects underway aim to promote the surveillance of public health related to the presence of solid waste incinerators or to study associations between human exposure to environmental factors and adverse health effects. The methodological approach involves biomonitoring of heavy metals, dioxins and/or other persistent organic pollutants in tissues including blood, human milk and both scalp and pubic hair in groups such as the general population, children, pregnant women or women attempting pregnancy. As such, the projects entail the recruitment of individuals representing different demographic and health conditions, the collection of body tissues and personal data, and the processing of the data and results. The issue of autonomy is raised during the recruitment of participants and during the collection of samples and data. This right is protected by the requirement for prior written, informed consent from the participant or, in the case of children, from their guardian. Recruitment has been successful, among eligible participants, in spite of incentives rarely being offered. The exception has been in obtaining guardians' consent for children's participation, particularly for blood sampling. In an attempt to improve the harm-benefit ratio, current research efforts include alternative, less invasive biomarkers. Surveys are currently being conducted under contract as independent biomonitoring actions and as such, must be explicitly disclosed as a potential conflict of interests. Communication of results to participants is in general only practised when a health issue is present and corrective action possible.
Concerning human milk a careful approach is taken, considering breast-feeding's proven benefits. No national legislation currently accounts for the surveillance component of biomonitoring as distinct from research. Ethics issues arising within the domain of research are resolved according to available regulations. For issues encountered during surveillance, the same principles are used as guidance, completed by the authors' best judgement and relevant ethics committees' findings.
In situ high-resolution thermal microscopy on integrated circuits.
Zhuo, Guan-Yu; Su, Hai-Ching; Wang, Hsien-Yi; Chan, Ming-Che
2017-09-04
The miniaturization of metal tracks in integrated circuits (ICs) can cause abnormal heat dissipation, resulting in electrostatic discharge, overvoltage breakdown, and other unwanted issues. Unfortunately, locating areas of abnormal heat dissipation is limited either by the spatial resolution or the image acquisition speed of current thermal analytical techniques. A rapid, non-contact approach to the thermal imaging of ICs with sub-μm resolution could help to alleviate this issue. In this work, based on the intensity of the temperature-dependent two-photon fluorescence (TPF) of Rhodamine 6G (R6G), we developed a fast, non-invasive thermal microscopy technique with sub-μm resolution. Its application to locating hotspots that may evolve into thermally induced defects in ICs was also demonstrated. To the best of our knowledge, this is the first study to present high-resolution 2D thermal microscopic images of ICs, showing the generation, propagation, and distribution of heat during IC operation. According to the demonstrated results, this scheme has considerable potential for future in situ hotspot analysis during the optimization stage of IC development.
The FRIGG project: From intermediate galactic scales to self-gravitating cores
NASA Astrophysics Data System (ADS)
Hennebelle, Patrick
2018-03-01
Context. Understanding the detailed structure of the interstellar gas is essential for our knowledge of the star formation process. Aims: The small-scale structure of the interstellar medium (ISM) is a direct consequence of the galactic scales, and making the link between the two is essential. Methods: We perform adaptive mesh simulations that aim to bridge the gap between the intermediate galactic scales and the self-gravitating prestellar cores. For this purpose we use stratified supernova-regulated ISM magneto-hydrodynamical simulations at the kpc scale to set up the initial conditions. We then zoom, performing a series of concentric uniform refinements and then refining on the Jeans length for the last levels. This allows us to reach a spatial resolution of a few 10⁻³ pc. The cores are identified using a clump finder and various criteria based on virial analysis. Their most relevant properties are computed and, owing to the large number of objects formed in the simulations, reliable statistics are obtained. Results: The cores' properties show encouraging agreement with observations. The mass spectrum presents a clear power law at high masses with an exponent close to ≃-1.3 and a peak at about 1-2 M⊙. The velocity dispersion and the angular momentum distributions are respectively a few times the local sound speed and a few 10⁻² pc km s⁻¹. We also find that the distribution of thermally supercritical cores presents a range of magnetic mass-to-flux over critical mass-to-flux ratios, typically between ≃0.3 and 3, indicating that they are significantly magnetized. Investigating the time and spatial dependence of these statistical properties, we conclude that they are not significantly affected by the zooming procedure and do not present very large fluctuations. The most severe issue appears to be the dependence of the core mass function (CMF) on the numerical resolution.
While the core definition process may introduce some biases, the peak tends to shift to smaller values as the resolution improves. Conclusions: Our simulations, which use self-consistently generated initial conditions at the kpc scale, produce a large number of prestellar cores from which reliable statistics can be inferred. Preliminary comparisons with observations show encouraging agreement. In particular, the inferred CMFs resemble those inferred from recent observations. We stress, however, a possible issue with the peak position shifting with numerical resolution.
Sacred bounds on rational resolution of violent political conflict
Ginges, Jeremy; Atran, Scott; Medin, Douglas; Shikaki, Khalil
2007-01-01
We report a series of experiments carried out with Palestinian and Israeli participants showing that violent opposition to compromise over issues considered sacred is (i) increased by offering material incentives to compromise but (ii) decreased when the adversary makes symbolic compromises over their own sacred values. These results demonstrate some of the unique properties of reasoning and decision-making over sacred values. We show that the use of material incentives to promote the peaceful resolution of political and cultural conflicts may backfire when adversaries treat contested issues as sacred values. PMID:17460042
Tank 241-C-112 vapor sampling and analysis tank characterization report. Revision 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huckaby, J.L.
1995-05-31
Tank 241-C-112 headspace gas and vapor samples were collected and analyzed to help determine the potential risks to tank farm workers due to fugitive emissions from the tank. The drivers and objectives of waste tank headspace sampling and analysis are discussed in "Program Plan for the Resolution of Tank Vapor Issues." Tank 241-C-112 was vapor sampled in accordance with "Data Quality Objectives for Generic In-Tank Health and Safety Issue Resolution."
On improved understanding of plasma-chemical processes in complex low-temperature plasmas
NASA Astrophysics Data System (ADS)
Röpcke, Jürgen; Loffhagen, Detlef; von Wahl, Eric; Nave, Andy S. C.; Hamann, Stephan; van Helden, Jean-Pierre H.; Lang, Norbert; Kersten, Holger
2018-05-01
Over the last years, chemical sensing using optical emission spectroscopy (OES) in the visible spectral range has been combined with methods of mid-infrared laser absorption spectroscopy (MIR-LAS) in the molecular fingerprint region from 3 to 20 μm, which contains strong rotational-vibrational absorption bands of a large variety of gaseous species. This optical approach has established powerful in situ diagnostic tools for studying plasma-chemical processes in complex low-temperature plasmas. The methods of MIR-LAS enable the detection of stable and transient molecular species in ground and excited states and the measurement of the concentrations and temperatures of reactive species in plasmas. Since kinetic processes are inherent to discharges ignited in molecular gases, high time resolution on sub-second timescales is frequently desired for fundamental studies as well as for process monitoring in applied research and industry. In addition to high sensitivity and good temporal resolution, the capacity for broad spectral coverage enabling multicomponent detection is further expanding the use of OES and MIR-LAS techniques. Based on selected examples, this paper reports on recent achievements in the understanding of complex low-temperature plasmas. Recently, a link with chemical modeling of the plasma has been provided, which is the ultimate objective for a better understanding of the chemical and reaction kinetic processes occurring in the plasma. Contribution to the Topical Issue "Fundamentals of Complex Plasmas", edited by Jürgen Meichsner, Michael Bonitz, Holger Fehske, Alexander Piel.
Hispanic Medical Organizations' Support for LGBT Health Issues.
Sánchez, John Paul; Sola, Orlando; Ramallo, Jorge; Sánchez, Nelson Felix; Dominguez, Kenneth; Romero-Leggott, Valerie
2014-09-01
Hispanics represent the fastest growing ethnic segment of the lesbian, gay, bisexual, and transgender (LGBT) community in the United States and are disproportionately burdened by LGBT-related health issues and limited political support from Hispanic medical organizations. Recently, the Latino Medical Student Association, the National Hispanic Medical Association, and the Hispanic Serving Health Professions Schools, representing over 60,000 Hispanic students and providers and 35 institutions, collaborated to support a resolution opposing discrimination based on sexual orientation or gender identity and recognizing the obstacles encountered by LGBTQ Hispanics. The resolution provides an important framework for organizational members and leaders to address LGBT health issues and serve to support a more positive sociopolitical climate for the Hispanic LGBT community nationally and internationally.
Corporate Civil Disobedience in the Consumer Interest.
ERIC Educational Resources Information Center
Dennis, Michael R.; And Others
1994-01-01
Through catalytic issue management, corporations proactively seek to affect resolutions of issues in which they have some interest. Corporations now catalyze legal changes by purposely disobeying existing law, facing the associated consequences, and lobbying for desired changes. (Author)
Burkardt, Nina; Ruell, Emily W.
2012-01-01
Water resources in parts of the Western United States are over-allocated, which intensifies the pressure to support water management decisions with strong scientific evidence. Because scientific studies sometimes provide uncertain or competing results or recommendations, science can become a source of disputes during decision-making processes. The Bureau of Reclamation (Reclamation) is an important water manager in the Western United States, and Reclamation decision processes are often contested by a variety of affected constituencies. We conducted a Web-based survey of Reclamation employees to determine (1) which types of disputes over science are occurring and how common they are, (2) which approaches have been used by Reclamation to try to resolve these different types of disputes, (3) how useful Reclamation employees find these approaches at resolving these types of disputes, (4) the final outcomes of these disputes and the decision-making processes that were hindered by the disputes over science, and (5) the potential usefulness of several different types of dispute resolution resources that Reclamation could provide for employees that become involved in disputes over science. The calculated minimum response rate for the survey was 59 percent. Twenty-five percent of respondents indicated that they had been involved in a dispute over science while working at Reclamation. Native species and species listed under the Endangered Species Act of 1973 were the most common issue types reported in these disputes over science. Survey respondents indicated that they used a variety of approaches to resolve disputes over science and rated most approaches as either neutral or somewhat helpful in these endeavors. 
Future research is needed to determine whether there are additional variables underlying these disputes that were not measured in this survey that may identify when dispute resolution methods are most effective, or whether resolving aspects of these disputes, such as differing interpretations of science, is very difficult or impossible regardless of the dispute resolution methods used.
How to squeeze high quantum efficiency and high time resolution out of a SPAD
NASA Technical Reports Server (NTRS)
Lacaita, A.; Zappa, F.; Cova, Sergio; Ripamonti, Giancarlo; Spinelli, A.
1993-01-01
We address the issue of whether Single-Photon Avalanche Diodes (SPADs) can be suitably designed to achieve a trade-off between quantum efficiency and time resolution performance. We briefly recall the physical mechanisms setting the time resolution of avalanche photodiodes operated in single-photon counting, and we give some criteria for the design of SPADs with a quantum efficiency better than 10 percent at 1064 nm together with a time resolution below 50 ps rms.
HRSC: High resolution stereo camera
Neukum, G.; Jaumann, R.; Basilevsky, A.T.; Dumke, A.; Van Gasselt, S.; Giese, B.; Hauber, E.; Head, J. W.; Heipke, C.; Hoekzema, N.; Hoffmann, H.; Greeley, R.; Gwinner, K.; Kirk, R.; Markiewicz, W.; McCord, T.B.; Michael, G.; Muller, Jan-Peter; Murray, J.B.; Oberst, J.; Pinet, P.; Pischel, R.; Roatsch, T.; Scholten, F.; Willner, K.
2009-01-01
The High Resolution Stereo Camera (HRSC) on Mars Express has delivered a wealth of image data, amounting to over 2.5 TB from the start of the mapping phase in January 2004 to September 2008. In that time, more than a third of Mars was covered at a resolution of 10-20 m/pixel in stereo and colour. After five years in orbit, HRSC is still in excellent shape, and it could continue to operate for many more years. HRSC has proven its ability to close the gap between the low-resolution Viking image data and the high-resolution Mars Orbiter Camera images, leading to a global picture of the geological evolution of Mars that is now much clearer than ever before. Derived highest-resolution terrain model data have closed major gaps and provided an unprecedented insight into the shape of the surface, which is paramount not only for surface analysis and geological interpretation, but also for combination with and analysis of data from other instruments, as well as in planning for future missions. This chapter presents the scientific output from data analysis and high-level data processing, complemented by a summary of how the experiment is conducted by the HRSC team members working in geoscience, atmospheric science, photogrammetry and spectrophotometry. Many of these contributions have been or will be published in peer-reviewed journals and special issues. They form a cross-section of the scientific output, either by summarising the new geoscientific picture of Mars provided by HRSC or by detailing some of the topics of data analysis concerning photogrammetry, cartography and spectral data analysis.
The interprofessional team as a small group.
Kane, R A
1975-01-01
Conflicts in interprofessional teamwork may be as much explained by group process considerations as by the interaction of professional roles and statuses. This paper examines the interprofessional team as a small group, using a synthesis of sources from social psychology, social group work, T-group literature, management theory, and health team research. Eight issues are considered in relation to the team as a small group, namely, (a) the individual in the group, (b) team size, (c) group norms, (d) democracy, (e) decision making and conflict resolution, (f) communication and structure, (g) leadership, and (h) group harmony and its relationship to group productivity.
High-Resolution X-Ray Telescopes
NASA Technical Reports Server (NTRS)
ODell, Stephen L.; Brissenden, Roger J.; Davis, William; Elsner, Ronald F.; Elvis, Martin; Freeman, Mark; Gaetz, Terry; Gorenstein, Paul; Gubarev, Mikhail V.
2010-01-01
Fundamental needs for future x-ray telescopes: (a) sharp images, requiring excellent angular resolution; (b) high throughput, requiring large aperture areas. Generation-X optics technical challenges: (a) high resolution demands precision mirrors and alignment; (b) large apertures demand many lightweight mirrors. Innovation needed for technical readiness: (a) four top-level error terms contribute to image size; (b) there are approaches to controlling those errors. Innovation is also needed for manufacturing readiness, and programmatic issues are comparably challenging.
Large-area Soil Moisture Surveys Using a Cosmic-ray Rover: Approaches and Results from Australia
NASA Astrophysics Data System (ADS)
Hawdon, A. A.; McJannet, D. L.; Renzullo, L. J.; Baker, B.; Searle, R.
2017-12-01
Recent improvements in satellite instrumentation have increased the resolution and frequency of soil moisture observations, and this in turn has supported the development of higher-resolution land surface process models. Calibration and validation of these products is restricted by the mismatch of scales between remotely sensed and contemporary ground-based observations. Although the cosmic-ray neutron soil moisture probe can provide estimates of soil moisture at a scale useful for calibration and validation purposes, it is spatially limited to a single, fixed location. This scaling issue has been addressed with the development of mobile soil moisture monitoring systems that utilize the cosmic-ray neutron method, typically referred to as "rovers". This manuscript describes a project designed to develop approaches for undertaking rover surveys to produce soil moisture estimates at scales comparable to satellite observations and land surface process models. A custom-designed, trailer-mounted rover was used to conduct repeat surveys at two scales in the Mallee region of Victoria, Australia. A broad-scale survey was conducted at 36 x 36 km, covering the area of a standard SMAP pixel, and an intensive-scale survey was conducted over a 10 x 10 km portion of the broad-scale survey, a scale equivalent to that used for national water balance modelling. We will describe the design of the rover, the methods used for converting neutron counts into soil moisture, and discuss factors controlling soil moisture variability. We found that the intensive-scale rover surveys produced reliable soil moisture estimates at 1 km resolution and the broad-scale surveys at 9 km resolution. We conclude that these products are well suited for future analysis of satellite soil moisture retrievals and finer-scale soil moisture models.
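The conversion from neutron counts to soil moisture mentioned above is commonly done with the calibration shape function of Desilets et al. (2010); the abstract does not state this project's exact procedure, so the sketch below is a generic illustration, and the dry-soil count rate `n0` used in the example is an invented placeholder (in practice it comes from a site calibration).

```python
def neutron_to_soil_moisture(n, n0, a0=0.0808, a1=0.372, a2=0.115):
    """Convert a corrected epithermal neutron count rate to gravimetric
    soil moisture (g water per g dry soil) using the standard shape
    function of Desilets et al. (2010).

    n  -- corrected neutron count rate
    n0 -- count rate over dry soil (site-specific calibration constant)
    """
    return a0 / (n / n0 - a1) - a2


# Example with a hypothetical calibration: lower counts mean wetter soil.
theta = neutron_to_soil_moisture(2100, n0=3000)
```

Because the function is hyperbolic in `n`, the sensitivity is highest under dry conditions, which is one reason repeat surveys are valuable for capturing wetting and drying cycles.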
Recent advances in time series InSAR
NASA Astrophysics Data System (ADS)
Hooper, Andrew; Bekaert, David; Spaans, Karsten
2010-05-01
Despite the multiple successes of InSAR at measuring surface displacement, in many instances the signal over much of an image either decorrelates too quickly to be useful or is swamped by atmospheric noise. Time series InSAR methods seek to address these issues by essentially increasing the signal-to-noise ratio (SNR) through the use of more data. These techniques are particularly useful for applications where the strain rates detected at the surface are low, such as postseismic/interseismic motion, magma/fluid movement, landslides and reservoir exploitation. Our previous developments in this field have included a persistent scatterer algorithm based on spatial correlation, a full-resolution small baseline approach based on the same strategy, and a procedure for combining the two [Hooper, GRL, 2008]. This combined method works well on small areas (up to one frame) at ERS or Envisat strip-map resolution. However, in applying it to larger areas, such as the Guerrero region of Mexico and western Anatolia in Turkey, or when processing data at higher resolution, e.g. from TerraSAR-X, computer resource problems can arise. We have therefore altered the processing strategy to involve smarter use of computer memory. Further improvement is achieved by resampling the selected pixels (whether persistent scatterers or distributed scatterers) to a coarser resolution - usually we do not require a resolution on the scale of individual resolution cells for geophysical applications. Aliasing is avoided by summing the phase of nearby selected pixels, weighted according to their estimated SNR. This is akin to smart multilooking, but note that better results can be achieved than by starting the analysis with low-resolution (multilooked) data. Another development concerns selecting pixels only in images where they appear reliable.
This allows for resolution cells that become correlated/decorrelated either in a temporary fashion, e.g., due to snow cover, or in a permanent way due to the appearance or removal of scatterers. The detection algorithm relies on the degree of spatial correlation for the pixel of interest in each image. We have also modified our 3-D phase-unwrapping algorithms to allow for the resulting differing combinations of coherent pixels in every interferogram. We demonstrate our improved techniques on volcanoes in Iceland and the 2006 slow-slip event in Guerrero, Mexico.
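The "smart multilooking" step described above, summing the phase of nearby selected pixels weighted by estimated SNR, can be sketched as a weighted sum of complex phasors. This is a minimal illustration of the idea, not the authors' implementation; the block-averaging layout and array shapes are assumptions.

```python
import numpy as np

def weighted_phase_downsample(phase, snr, factor):
    """Resample a wrapped-phase field to coarser resolution by summing
    SNR-weighted complex phasors within each (factor x factor) block
    and taking the angle of the sum. Summing phasors rather than raw
    phase values handles phase wrapping correctly.

    phase, snr -- 2-D arrays of equal shape (radians, dimensionless)
    factor     -- integer downsampling factor
    """
    rows, cols = phase.shape
    r, c = rows // factor, cols // factor
    # Weight each unit phasor by its estimated SNR
    z = snr * np.exp(1j * phase)
    # Group pixels into blocks and sum within each block
    z = z[:r * factor, :c * factor].reshape(r, factor, c, factor)
    return np.angle(z.sum(axis=(1, 3)))
```

Pixels with high estimated SNR dominate each block's phasor sum, so unreliable pixels contribute little, which is why this outperforms plain multilooking of the raw data.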
ERIC Educational Resources Information Center
Sadler, Troy D.; Zeidler, Dana L.
2005-01-01
This study focused on informal reasoning regarding socioscientific issues. It sought to explore how content knowledge influenced the negotiation and resolution of contentious and complex scenarios based on genetic engineering. Two hundred and sixty-nine students drawn from undergraduate natural science and nonnatural science courses completed a…
The nitrogen-vacancy colour centre in diamond
NASA Astrophysics Data System (ADS)
Doherty, Marcus W.; Manson, Neil B.; Delaney, Paul; Jelezko, Fedor; Wrachtrup, Jörg; Hollenberg, Lloyd C. L.
2013-07-01
The nitrogen-vacancy (NV) colour centre in diamond is an important physical system for emergent quantum technologies, including quantum metrology, information processing and communications, as well as for various nanotechnologies, such as biological and sub-diffraction limit imaging, and for tests of entanglement in quantum mechanics. Given this array of existing and potential applications and the almost 50 years of NV research, one would expect that the physics of the centre is well understood; however, the study of the NV centre has proved challenging, with many early assertions now believed false and many remaining issues yet to be resolved. This review represents the first time that the key empirical and ab initio results have been extracted from the extensive NV literature and assembled into one consistent picture of the current understanding of the centre. As a result, the key unresolved issues concerning the NV centre are identified and the possible avenues for their resolution are examined.
Camera sensor arrangement for crop/weed detection accuracy in agronomic images.
Romeo, Juan; Guerrero, José Miguel; Montalvo, Martín; Emmi, Luis; Guijarro, María; Gonzalez-de-Santos, Pablo; Pajares, Gonzalo
2013-04-02
In Precision Agriculture, images coming from camera-based sensors are commonly used for weed identification and crop line detection, either to apply specific treatments or for vehicle guidance purposes. Accuracy of identification and detection is an important issue to be addressed in image processing. There are two main types of parameters affecting the accuracy of the images, namely: (a) extrinsic, related to the sensor's positioning in the tractor; (b) intrinsic, related to the sensor specifications, such as CCD resolution, focal length or iris aperture, among others. Moreover, in agricultural applications, the uncontrolled illumination existing in outdoor environments is also an important factor affecting image accuracy. This paper is exclusively focused on two main issues, always with the goal of achieving the highest image accuracy in Precision Agriculture applications, making the following two main contributions: (a) camera sensor arrangement, to adjust extrinsic parameters; and (b) design of strategies for controlling the adverse illumination effects.
NursingQuest: supporting an analysis of nursing issues.
Bassendowski, Sandra L
2007-02-01
With the development and use of new strategies, practices, applications, and resources in technology, the teaching and learning context is shifting. Nurse educators are challenged to create instructional strategies that appeal to the newer generation of students and have the potential to enhance learning. Effective learning programs for these students require new digital communication skills, new pedagogies, and new practices. Nursing students should not be seeking the right answer as much as they should be seeking appropriate information and then developing approaches to issues or resolutions for problems. The focus of the teaching and learning context is shifting from the individual to the group, with the purpose of constructing new knowledge from available information. This article discusses the value of WebQuest activities as inquiry-oriented strategies and the process of adapting the WebQuest format for the development of a strategy called NursingQuest.
Conflict Resolution among Preschool Children: The Appeal of Negotiation in Hypothetical Disputes.
ERIC Educational Resources Information Center
Iskandar, Niveen; And Others
1995-01-01
Using hypothetical puppet interviews, 48 preschool children were interviewed about their preferences for teacher methods of conflict intervention. Puppet vignettes contrasted conflict issue, peer status, and resolution strategy (negotiation, power assertion, and disengagement). Results showed that preschoolers preferred negotiation strategies over…
Wave Phenomena Associated with Interplanetary Shocks
NASA Astrophysics Data System (ADS)
Golla, T.; MacDowall, R. J.
2016-12-01
Although laboratory and space-based experiments have been used for several decades to study collisionless shocks, several questions remain less than fully understood. These include: (1) what type of wave-particle energy dissipation is responsible for shock formation, (2) what types of in situ waves occur in the upstream, transition and downstream regions, and (3) which physical processes are responsible for the excitation of the fundamental and second-harmonic solar type II radio emissions. In this study, we will address these issues using (1) the in situ and radio wave data obtained by the WAVES experiments on the STEREO A and B and WIND spacecraft, especially the high time resolution data from the time domain samplers (TDS) of these WAVES experiments, and (2) Fourier, wavelet and higher-order spectral analysis techniques. Using the in situ wave data, especially the high time resolution data observed during local type II bursts, we will identify the nonlinear processes associated with these solar radio emissions. By comparing the radio intensities estimated from the known emission mechanisms for the observed peak Langmuir wave intensities with the observed peak radio intensities of type II bursts, we will identify the emission mechanisms.
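As a minimal illustration of the Fourier step in such an analysis, the fundamental and second harmonic of a waveform can be located from its spectrum. This is not the authors' pipeline; the window choice and the peak-picking rule are assumptions, and a synthetic two-tone signal stands in for TDS data.

```python
import numpy as np

def harmonic_peaks(signal, fs, n_peaks=2):
    """Return the frequencies (Hz) of the strongest spectral peaks in a
    time-domain waveform -- the kind of Fourier analysis used to pick
    out fundamental and second-harmonic emission components.

    signal -- 1-D array of samples; fs -- sampling rate in Hz.
    """
    # Hann window to reduce spectral leakage before the FFT
    spec = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    spec[0] = 0.0  # ignore the DC component
    order = np.argsort(spec)[::-1]  # bins sorted by descending magnitude
    peaks = []
    for i in order:
        # Skip bins adjacent to an already-found peak (window sidelobes)
        if all(abs(i - j) > 2 for j in peaks):
            peaks.append(i)
        if len(peaks) == n_peaks:
            break
    return sorted(freqs[i] for i in peaks)
```

For a harmonically related pair, the second returned frequency should be close to twice the first, which is the signature looked for in plasma-emission studies of type II bursts.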
Comparison of turbulence mitigation algorithms
NASA Astrophysics Data System (ADS)
Kozacik, Stephen T.; Paolini, Aaron; Sherman, Ariel; Bonnett, James; Kelmelis, Eric
2017-07-01
When capturing imagery over long distances, atmospheric turbulence often degrades the data, especially when observation paths are close to the ground or in hot environments. These issues manifest as time-varying scintillation and warping effects that decrease the effective resolution of the sensor and reduce actionable intelligence. In recent years, several image processing approaches to turbulence mitigation have shown promise. Each of these algorithms has different computational requirements, usability demands, and degrees of independence from camera sensors. They also produce different degrees of enhancement when applied to turbulent imagery. Additionally, some of these algorithms are applicable to real-time operational scenarios while others may only be suitable for postprocessing workflows. EM Photonics has been developing image-processing-based turbulence mitigation technology since 2005. We will compare techniques from the literature with our commercially available, real-time, GPU-accelerated turbulence mitigation software. These comparisons will be made using real (not synthetic), experimentally obtained data for a variety of conditions, including varying optical hardware, imaging range, subjects, and turbulence conditions. Comparison metrics will include image quality, video latency, computational complexity, and potential for real-time operation. Additionally, we will present a technique for quantitatively comparing turbulence mitigation algorithms using real images of radial resolution targets.
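One of the comparison metrics named in the abstract above is image quality, measured on real radial resolution targets. As a minimal illustrative sketch (not the metric used by EM Photonics or the cited literature), a simple no-reference sharpness score can rank a turbulence-degraded frame against its restored counterpart:

```python
import numpy as np

def sharpness(frame):
    """No-reference sharpness score: variance of the gradient magnitude.
    A frame with more restored high-frequency detail scores higher."""
    gy, gx = np.gradient(frame.astype(float))
    return float(np.var(np.hypot(gx, gy)))

def improvement(raw, restored):
    """Ratio > 1 suggests the mitigation algorithm sharpened the frame."""
    return sharpness(restored) / sharpness(raw)
```

A score like this is only a proxy; imaging radial resolution targets, as the paper does, measures the resolvable spatial frequency directly rather than inferring it from gradient statistics.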
Hutchinson, James L; Rajagopal, Shalini P; Sales, Kurt J; Jabbour, Henry N
2011-07-01
Inflammatory processes are central to reproductive events including ovulation, menstruation, implantation and labour, while inflammatory dysregulation is a feature of numerous reproductive pathologies. In recent years, there has been much research into the endogenous mechanisms by which inflammatory reactions are terminated and tissue homoeostasis is restored, a process termed resolution. The identification and characterisation of naturally occurring pro-resolution mediators, including lipoxins and annexin A1, has prompted a shift in the field of anti-inflammation whereby resolution is now viewed as an active process, triggered as part of a normal inflammatory response. This review will address the process of resolution, discuss available evidence for expression of pro-resolution factors in the reproductive tract, and explore possible roles for resolution in physiological reproductive processes and associated pathologies.
High-resolution urban flood modelling - a joint probability approach
NASA Astrophysics Data System (ADS)
Hartnett, Michael; Olbert, Agnieszka; Nash, Stephen
2017-04-01
The hydrodynamic modelling of rapid flood events due to extreme climatic events in an urban environment is both a complex and challenging task. The horizontal resolution necessary to resolve the complexity of urban flood dynamics is a critical issue; the presence of obstacles of varying shapes and length scales, gaps between buildings and the complex geometry of the city, such as slopes, affect flow paths and flood level magnitudes. These small-scale processes require a high-resolution grid to be modelled accurately (2 m or less; Olbert et al., 2015; Hunter et al., 2008; Brown et al., 2007) and, therefore, altimetry data of at least the same resolution. With the availability of high-resolution LiDAR data and computational capabilities, as well as state-of-the-art nested modelling approaches, these problems can now be overcome. Flooding and drying, domain definition, frictional resistance and boundary descriptions are all important issues to be addressed when modelling urban flooding. In recent years, the number of urban flood models has increased dramatically, giving good insight into various modelling problems and solutions (Mark et al., 2004; Mason et al., 2007; Fewtrell et al., 2008; Shubert et al., 2008). Despite extensive modelling work conducted for fluvial (e.g. Mignot et al., 2006; Hunter et al., 2008; Yu and Lane, 2006) and coastal mechanisms of flooding (e.g. Gallien et al., 2011; Yang et al., 2012), the amount of investigation into combined coastal-fluvial flooding is still very limited (e.g. Orton et al., 2012; Lian et al., 2013). This is surprising given the extent of flood consequences when both mechanisms occur simultaneously, which usually happens when they are driven by one process such as a storm.
The reason may be that the likelihood of a joint event is much smaller than that of either contributor occurring individually, because for fast-moving storms the rainfall-driven fluvial flood usually arrives later than the storm surge (Divoky et al., 2005). Nevertheless, such events do occur, and in Ireland alone there are several cases of serious damage due to flooding resulting from a combination of high sea water levels and river flows driven by the same meteorological conditions (e.g. Olbert et al., 2015). A November 2009 fluvial-coastal flooding of Cork City, which caused €100m in losses, was one such incident. This event was used by Olbert et al. (2015) to determine the processes controlling urban flooding and is further explored in this study to elaborate on coastal and fluvial flood mechanisms and their roles in controlling water levels. The objective of this research is to develop a methodology to assess the combined effect of multiple-source flooding on flood probability and severity in urban areas, and to establish a set of conditions that dictate urban flooding due to extreme climatic events. These conditions broadly combine physical flood drivers (such as coastal and fluvial processes), their mechanisms, and thresholds defining flood severity. The two main physical processes controlling urban flooding, high sea water levels (coastal flooding) and high river flows (fluvial flooding), and the threshold values at which a flood is likely to occur, are considered in this study. The contributions of coastal and fluvial drivers to flooding and their impacts are assessed in a two-step process. The first step involves frequency analysis and extreme value statistical modelling of storm surges, tides and river flows, and ultimately the application of the joint probability method to estimate joint exceedance return periods for combinations of surge, tide and river flow.
In the second step, a numerical model of Cork Harbour, MSN_Flood, comprising a cascade of four nested high-resolution models, is used to simulate flood inundation under numerous hypothetical coastal and fluvial flood scenarios. The risk of flooding is quantified based on a range of physical aspects such as the extent and depth of inundation (Apel et al., 2008). The methodology includes estimates of flood probabilities due to coastal- and fluvial-driven processes occurring individually or jointly, mechanisms of flooding, and their impacts on the urban environment. Various flood scenarios are examined in order to demonstrate that this methodology is necessary to quantify the important physical processes in coastal flood predictions. Cork City, located on the south coast of Ireland and subject to frequent coastal-fluvial flooding, is used as the case study.
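The joint probability step described in this abstract can be sketched in simplified form. The snippet below is a hypothetical illustration, not the authors' actual methodology: it estimates marginal and joint exceedance probabilities empirically from paired annual-maximum series, so that dependence between surge and river flow driven by a common storm is retained rather than assumed away.

```python
import numpy as np

def exceedance_prob(series, threshold):
    """Empirical probability that an annual maximum exceeds `threshold`."""
    return float(np.mean(np.asarray(series) > threshold))

def joint_return_period(surge, flow, surge_thr, flow_thr):
    """Return period (years) of surge AND flow both exceeding their
    thresholds in the same year, estimated directly from paired annual
    maxima so that storm-driven dependence is retained."""
    surge, flow = np.asarray(surge), np.asarray(flow)
    p_joint = np.mean((surge > surge_thr) & (flow > flow_thr))
    return float(np.inf) if p_joint == 0 else 1.0 / float(p_joint)
```

When the two drivers share a meteorological cause, the joint return period estimated from the paired record is shorter than the naive product of marginal probabilities would suggest, which is exactly why the combined coastal-fluvial case matters. A production analysis would fit extreme-value distributions (e.g. GEV) to each margin and a dependence model rather than rely on empirical counts.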
25 CFR 42.4 - What are alternative dispute resolution processes?
Code of Federal Regulations, 2010 CFR
2010-04-01
... What are alternative dispute resolution processes? Alternative dispute resolution (ADR) processes are... action. (a) ADR processes may: (1) Include peer adjudication, mediation, and conciliation; and (2... that these practices are readily identifiable. (b) For further information on ADR processes and how to...
25 CFR 42.4 - What are alternative dispute resolution processes?
Code of Federal Regulations, 2011 CFR
2011-04-01
... What are alternative dispute resolution processes? Alternative dispute resolution (ADR) processes are... action. (a) ADR processes may: (1) Include peer adjudication, mediation, and conciliation; and (2... that these practices are readily identifiable. (b) For further information on ADR processes and how to...
5 CFR 2610.307 - Further proceedings.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 2610.307 Administrative Personnel OFFICE OF GOVERNMENT ETHICS ORGANIZATION AND PROCEDURES... written submissions or, as to issues other than substantial justification (such as the applicant's... further proceedings shall be held only when necessary for full and fair resolution of the issues arising...
Implementation of Complex Signal Processing Algorithms for Position-Sensitive Microcalorimeters
NASA Technical Reports Server (NTRS)
Smith, Stephen J.
2008-01-01
We have recently reported on a theoretical digital signal-processing algorithm for improved energy and position resolution in position-sensitive, transition-edge sensor (PoST) X-ray detectors [Smith et al., Nucl. Instr. and Meth. A 556 (2006) 237]. PoSTs consist of one or more transition-edge sensors (TESs) on a large continuous or pixellated X-ray absorber and are under development as an alternative to arrays of single-pixel TESs. PoSTs provide a means to increase the field-of-view for the fewest number of read-out channels. In this contribution we extend the theoretical correlated energy position optimal filter (CEPOF) algorithm (originally developed for 2-TES continuous-absorber PoSTs) to investigate the practical implementation on multi-pixel single-TES PoSTs, or Hydras. We use numerically simulated data for a nine-absorber device, which includes realistic detector noise, to demonstrate an iterative scheme that enables convergence on the correct photon absorption position and energy without any a priori assumptions. The position sensitivity of the CEPOF implemented on simulated data agrees very well with the theoretically predicted resolution. We discuss practical issues such as the impact of the random arrival phase of the measured data on the performance of the CEPOF. The CEPOF algorithm demonstrates that full-width-at-half-maximum energy resolution of < 8 eV coupled with position sensitivity down to a few hundred eV should be achievable for a fully optimized device.
NASA Astrophysics Data System (ADS)
Ouma, Yashon O.
2016-01-01
Technologies for imaging the surface of the Earth through satellite-based Earth observation (EO) have evolved enormously over the past 50 years. The trends are likely to evolve further as the user community grows and its awareness of, and demand for, EO data increases. In this review paper, a development trend in EO imaging systems is presented with the objective of deriving the evolving patterns for the EO user community. From the review and analysis of medium-to-high resolution EO-based land-surface sensor missions, it is observed that there is a predictive pattern in the EO evolution trends, such that every 10-15 years more sophisticated EO imaging systems with application-specific capabilities emerge. Such new systems, as determined in this review, are likely to comprise agile and small payload-mass EO land-surface imaging satellites with the ability for high-velocity data transmission and huge volumes of spatial, spectral, temporal and radiometric resolution data. This availability of data will magnify the phenomenon of "Big Data" in Earth observation. Because of the "Big Data" issue, new computing and processing platforms such as telegeoprocessing and grid computing are expected to be incorporated into EO data processing and distribution networks. In general, it is observed that the demand for EO is growing exponentially as the applications and cost-benefits are being recognized in support of resource management.
Photonomics: automation approaches yield economic aikido for photonics device manufacture
NASA Astrophysics Data System (ADS)
Jordan, Scott
2002-09-01
In the glory days of photonics, with exponentiating demand for photonics devices came exponentiating competition, with new ventures commencing deliveries seemingly weekly. Suddenly the industry was faced with a commodity marketplace well before a commodity cost structure was in place. Economic issues like cost, scalability, and yield (call it all "Photonomics") now drive the industry. Automation and throughput optimization are obvious answers, but until now, suitable modular tools had not been introduced. Available solutions were barely compatible with typical transverse alignment tolerances and could not automate angular alignments of collimated devices and arrays. And settling physics served as the insoluble bottleneck to throughput and resolution advancement in packaging, characterization and fabrication processes. The industry has addressed these needs in several ways, ranging from special configurations of catalog motion devices to integrated microrobots based on a novel mini-hexapod configuration. This intriguing approach allows tip/tilt alignments to be automated about any point in space, such as a beam waist, a focal point, the cleaved face of a fiber, or the optical axis of a waveguide, ideal for MEMS packaging automation and array alignment. Meanwhile, patented new low-cost settling-enhancement technology has been applied in applications ranging from air-bearing long-travel stages to subnanometer-resolution piezo positioners to advance resolution and process cycle times in sensitive applications such as optical coupling characterization and fiber Bragg grating generation. Background, examples and metrics are discussed, providing an up-to-date industry overview of available solutions.
NASA Astrophysics Data System (ADS)
Tang, Yunwei; Atkinson, Peter M.; Zhang, Jingxiong
2015-03-01
A cross-scale data integration method was developed and tested based on the theory of geostatistics and multiple-point geostatistics (MPG). The goal was to downscale remotely sensed images while retaining spatial structure by integrating images at different spatial resolutions. During the process of downscaling, a rich spatial correlation model in the form of a training image was incorporated to facilitate reproduction of similar local patterns in the simulated images. Area-to-point cokriging (ATPCK) was used as a locally varying mean (LVM) (i.e., soft data) to deal with the change of support problem (COSP) for cross-scale integration, which MPG cannot achieve alone. Several pairs of spectral bands of remotely sensed images were tested for integration within different cross-scale case studies. The experiment shows that MPG can restore the spatial structure of the image at a fine spatial resolution given the training image and conditioning data. The super-resolution image can be predicted using the proposed method, which cannot be realised using most data integration methods. The results show that the ATPCK-MPG approach can achieve greater accuracy than methods which do not account for the change-of-support issue.
THINGS: THE H I NEARBY GALAXY SURVEY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walter, Fabian; Bigiel, Frank; Leroy, Adam
2008-12-15
We present 'The H I Nearby Galaxy Survey (THINGS)', a high spectral (≤5.2 km s^-1) and spatial (~6'') resolution survey of H I emission in 34 nearby galaxies obtained using the NRAO Very Large Array (VLA). The overarching scientific goal of THINGS is to investigate fundamental characteristics of the interstellar medium (ISM) related to galaxy morphology, star formation, and mass distribution across the Hubble sequence. Unique characteristics of the THINGS database are the homogeneous sensitivity as well as spatial and velocity resolution of the H I data, which is at the limit of what can be achieved with the VLA for a significant number of galaxies. A sample of 34 objects at distances 2 ≲ D ≲ 15 Mpc (resulting in linear resolutions of ~100 to 500 pc) are targeted in THINGS, covering a wide range of star formation rates (~10^-3 to 6 M_sun yr^-1), total H I masses M_HI (0.01 to 14 x 10^9 M_sun), absolute luminosities M_B (-11.5 to -21.7 mag), and metallicities (7.5 to 9.2 in units of 12+log[O/H]). We describe the setup of the VLA observations, the data reduction procedures, and the creation of the final THINGS data products. We present an atlas of the integrated H I maps, the velocity fields, the second moment (velocity dispersion) maps and individual channel maps of each THINGS galaxy. The THINGS data products are made publicly available through a dedicated webpage. Accompanying THINGS papers (in this issue of the Astronomical Journal) address issues such as the small-scale structure of the ISM, the (dark) matter distribution in THINGS galaxies, and the processes leading to star formation.
Street Level Hydrology: An Urban Application of the WRF-Hydro Framework in Denver, Colorado
NASA Astrophysics Data System (ADS)
Read, L.; Hogue, T. S.; Salas, F. R.; Gochis, D.
2015-12-01
Urban flood modeling at the watershed scale carries unique challenges in routing complexity, data resolution, social and political issues, and land surface - infrastructure interactions. The ability to accurately trace and predict the flow of water through the urban landscape enables better emergency response management, floodplain mapping, and data for future urban infrastructure planning and development. These services are of growing importance as urban population is expected to continue increasing by 1.84% per year for the next 25 years, increasing the vulnerability of urban regions to damages and loss of life from floods. Although a range of watershed-scale models have been applied in specific urban areas to examine these issues, there is a trend towards national scale hydrologic modeling enabled by supercomputing resources to understand larger system-wide hydrologic impacts and feedbacks. As such it is important to address how urban landscapes can be represented in large scale modeling processes. The current project investigates how coupling terrain and infrastructure routing can improve flow prediction and flooding events over the urban landscape. We utilize the WRF-Hydro modeling framework and a high-resolution terrain routing grid with the goal of compiling standard data needs necessary for fine scale urban modeling and dynamic flood forecasting in the urban setting. The city of Denver is selected as a case study, as it has experienced several large flooding events in the last five years and has an urban annual population growth rate of 1.5%, one of the highest in the U.S. Our work highlights the hydro-informatic challenges associated with linking channel networks and drainage infrastructure in an urban area using the WRF-Hydro modeling framework and high resolution urban models for short-term flood prediction.
Canada in 3D - Toward a Sustainable 3D Model for Canadian Geology from Diverse Data Sources
NASA Astrophysics Data System (ADS)
Brodaric, B.; Pilkington, M.; Snyder, D. B.; St-Onge, M. R.; Russell, H.
2015-12-01
Many big science issues span large areas and require data from multiple heterogeneous sources, for example climate change, resource management, and hazard mitigation. Solutions to these issues can significantly benefit from access to a consistent and integrated geological model that would serve as a framework. However, such a model is absent for most large countries including Canada, due to the size of the landmass and the fragmentation of the source data into institutional and disciplinary silos. To overcome these barriers, the "Canada in 3D" (C3D) pilot project was recently launched by the Geological Survey of Canada. C3D is designed to be evergreen, multi-resolution, and inter-disciplinary: (a) it is to be updated regularly upon acquisition of new data; (b) portions vary in resolution and will initially consist of four layers (surficial, sedimentary, crystalline, and mantle) with intermediary patches of higher-resolution fill; and (c) a variety of independently managed data sources are providing inputs, such as geophysical, 3D and 2D geological models, drill logs, and others. Notably, scalability concerns dictate a decentralized and interoperable approach, such that only key control objects, denoting anchors for the modeling process, are imported into the C3D database while retaining provenance links to original sources. The resultant model is managed in the database, contains full modeling provenance as well as links to detailed information on rock units, and is to be visualized in desktop and online environments. It is anticipated that C3D will become the authoritative state of knowledge for the geology of Canada at a national scale.
30 CFR 281.13 - Joint State/Federal coordination.
Code of Federal Regulations, 2010 CFR
2010-07-01
... and development activities; facilitate the resolution of issues of mutual interest; and provide a... be offered for lease; (2) The potential for conflicts between the exploration and development of OCS mineral resources, other users and uses of the area, and means for resolution or mitigation of these...
Sen. Enzi, Michael B. [R-WY]
2011-11-30
Senate - 12/01/2011 Read the second time. Placed on Senate Legislative Calendar under General Orders. Calendar No. 242.
High-energy solar flare observations at the Y2K maximum
NASA Astrophysics Data System (ADS)
Emslie, A. Gordon
2000-04-01
Solar flares afford an opportunity to observe processes associated with the acceleration and propagation of high-energy particles at a level of detail not accessible in any other astrophysical source. I will review some key results from previous high-energy solar flare observations, including those from the Compton Gamma-Ray Observatory, and the problems that they pose for our understanding of energy release and particle acceleration processes in the astrophysical environment. I will then discuss a program of high-energy observations to be carried out during the upcoming 2000-2001 solar maximum that is aimed at addressing and resolving these issues. A key element in this observational program is the High Energy Solar Spectroscopic Imager (HESSI) spacecraft, which will provide imaging spectroscopic observations with spatial, temporal, and energy resolutions commensurate with the physical processes believed to be operating, and will in addition provide the first true gamma-ray spectroscopy of an astrophysical source.
Paracetamol degradation in aqueous solution by non-thermal plasma
NASA Astrophysics Data System (ADS)
Baloul, Yasmine; Aubry, Olivier; Rabat, Hervé; Colas, Cyril; Maunit, Benoît; Hong, Dunpin
2017-08-01
This study deals with paracetamol degradation in water using a non-thermal plasma (NTP) created by a dielectric barrier discharge (DBD). The effects of the NTP operating conditions on the degradation were studied, showing that the treatment efficiency of the process was highly dependent on the electrical parameters and working gas composition in the reactor containing the aqueous solution. A conversion rate higher than 99% was reached with an energy yield of 12 g/kWh. High resolution mass spectrometry (HRMS) measurements showed that the main species produced in water during the process were nitrogen compounds, carboxylic acids and aromatic compounds. Contribution to the topical issue "The 15th International Symposium on High Pressure Low Temperature Plasma Chemistry (HAKONE XV)", edited by Nicolas Gherardi and Tomáš Hoder
Fast neutron-gamma discrimination on neutron emission profile measurement on JT-60U.
Ishii, K; Shinohara, K; Ishikawa, M; Baba, M; Isobe, M; Okamoto, A; Kitajima, S; Sasao, M
2010-10-01
A digital signal processing (DSP) system is applied to stilbene scintillation detectors of the multichannel neutron emission profile monitor in JT-60U. Automatic analysis of the neutron-γ pulse shape discrimination is a key issue in reducing the processing time in the DSP system, and it has been implemented using a two-dimensional (2D) map. A linear discriminant function is used to determine the dividing line between neutron events and γ-ray events on the 2D map. In order to verify the validity of the dividing line determination, the pulse shape discrimination quality is evaluated. As a result, the γ-ray contamination in most of the beam heating phase was negligible compared with the statistical error at 10 ms time resolution.
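The dividing-line determination described above can be illustrated with a Fisher linear discriminant on synthetic 2D pulse-shape data. This is a hedged reconstruction of the general technique, not the JT-60U analysis code; the axes, cluster positions, and function names are assumptions for illustration.

```python
import numpy as np

def fisher_line(neutrons, gammas):
    """Fisher linear discriminant for two event classes on a 2D
    pulse-shape map (e.g. total charge vs slow-component charge).
    Returns (w, c): classify an event x as a neutron when w @ x > c."""
    neutrons, gammas = np.asarray(neutrons), np.asarray(gammas)
    mu_n, mu_g = neutrons.mean(axis=0), gammas.mean(axis=0)
    # pooled within-class scatter matrix
    Sw = np.cov(neutrons.T) + np.cov(gammas.T)
    w = np.linalg.solve(Sw, mu_n - mu_g)
    c = 0.5 * (w @ mu_n + w @ mu_g)  # midpoint between projected means
    return w, c
```

Projecting every event onto `w` reduces the 2D discrimination to a single threshold `c`, which is what makes the automatic, low-latency analysis in a DSP pipeline practical.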
On the Initiation Mechanism in Exploding Bridgewire and Laser Detonators
NASA Astrophysics Data System (ADS)
Stewart, D. Scott; Thomas, Keith A.; Clarke, S.; Mallett, H.; Martin, E.; Martinez, M.; Munger, A.; Saenz, Juan
2006-07-01
Since its invention by Los Alamos during the Manhattan Project era the exploding bridgewire detonator (EBW) has seen tremendous use and study. Recent development of a laser-powered device with detonation properties similar to an EBW is reviving interest in the basic physics of the deflagration-to-detonation (DDT) process in both of these devices. Cutback experiments using both laser interferometry and streak camera observations are providing new insight into the initiation mechanism in EBWs. These measurements are being correlated to a DDT model of compaction to detonation and shock to detonation developed previously by Xu and Stewart. The DDT model is incorporated into a high-resolution, multi-material model code for simulating the complete process. Model formulation and the modeling issues required to describe the test data will be discussed.
Unstructured grid research and use at NASA Lewis Research Center
NASA Technical Reports Server (NTRS)
Potapczuk, Mark G.
1993-01-01
Computational fluid dynamics applications of grid research at LRC include inlets, nozzles, and ducts; turbomachinery; propellers - ducted and unducted; and aircraft icing. Some issues related to internal flow grid generation are resolution requirements on several boundaries, shock resolution vs. grid periodicity, grid spacing at blade/shroud gap, grid generation in turbine blade passages, and grid generation for inlet/nozzle geometries. Aircraft icing grid generation issues include (1) small structures relative to airfoil chord must be resolved; (2) excessive number of grid points in far-field using structured grid; and (3) grid must be recreated as ice shape grows.
Predictive displays for a process-control schematic interface.
Yin, Shanqing; Wickens, Christopher D; Helander, Martin; Laberge, Jason C
2015-02-01
Our objective was to examine the extent to which increasing precision of predictive (rate of change) information in process control will improve performance on a simulated process-control task. Predictive displays have been found to be useful in process control (as well as aviation and maritime industries). However, authors of prior research have not examined the extent to which predictive value is increased by increasing predictor resolution, nor has such research tied potential improvements to changes in process control strategy. Fifty nonprofessional participants each controlled a simulated chemical mixture process (honey mixer simulation) that simulated the operations found in process control. Participants in each of five groups controlled with either no predictor or a predictor ranging in the resolution of prediction of the process. Increasing detail resolution generally increased the benefit of prediction over the control condition although not monotonically so. The best overall performance, combining quality and predictive ability, was obtained by the display of intermediate resolution. The two displays with the lowest resolution were clearly inferior. Predictors with higher resolution are of value but may trade off enhanced sensitivity to variable change (lower-resolution discrete state predictor) with smoother control action (higher-resolution continuous predictors). The research provides guidelines to the process-control industry regarding displays that can most improve operator performance.
ERIC Educational Resources Information Center
Kolsto, Stein Dankert
2001-01-01
Describes a qualitative study in which 16-year-old Norwegian pupils dealt with a socio-scientific issue. Investigates aspects of students' decision-making concerning a local version of the well-known controversial issue of whether or not power transmission lines increase the risk for childhood leukemia. Some of the resolution strategies imply that…
NASA Technical Reports Server (NTRS)
Maynard, O. E.
1980-01-01
Progress in analysis and design of solid state approaches to the solar power satellite microwave power transmission system is reviewed with special emphasis on the Sandwich concept and the issues of maintenance of low junction temperatures for amplifiers to assure acceptable lifetime. Ten specific issues or considerations are discussed and their resolution or status is presented.
Spatial Modeling of Geometallurgical Properties: Techniques and a Case Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deutsch, Jared L., E-mail: jdeutsch@ualberta.ca; Palmer, Kevin; Deutsch, Clayton V.
High-resolution spatial numerical models of metallurgical properties constrained by geological controls and more extensively by measured grade and geomechanical properties constitute an important part of geometallurgy. Geostatistical and other numerical techniques are adapted and developed to construct these high-resolution models accounting for all available data. Important issues that must be addressed include unequal sampling of the metallurgical properties versus grade assays, measurements at different scales, and complex nonlinear averaging of many metallurgical parameters. This paper establishes techniques to address each of these issues with the required implementation details and also demonstrates geometallurgical mineral deposit characterization for a copper-molybdenum deposit in South America. High-resolution models of grades and comminution indices are constructed, checked, and rigorously validated. The workflow demonstrated in this case study is applicable to many other deposit types.
Advanced Reactor Technologies - Regulatory Technology Development Plan (RTDP)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moe, Wayne L.
This DOE-NE Advanced Small Modular Reactor (AdvSMR) regulatory technology development plan (RTDP) will link critical DOE nuclear reactor technology development programs to important regulatory and policy-related issues likely to impact a “critical path” for establishing a viable commercial AdvSMR presence in the domestic energy market. Accordingly, the regulatory considerations that are set forth in the AdvSMR RTDP will not be limited to any one particular type or subset of advanced reactor technology(s) but rather broadly consider potential regulatory approaches and the licensing implications that accompany all DOE-sponsored research and technology development activity that deal with commercial non-light water reactors. However, it is also important to remember that certain “minimum” levels of design and safety approach knowledge concerning these technology(s) must be defined and available to an extent that supports appropriate pre-licensing regulatory analysis within the RTDP. Final resolution to advanced reactor licensing issues is most often predicated on the detailed design information and specific safety approach as documented in a facility license application and submitted for licensing review. Because the AdvSMR RTDP is focused on identifying and assessing the potential regulatory implications of DOE-sponsored reactor technology research very early in the pre-license application development phase, the information necessary to support a comprehensive regulatory analysis of a new reactor technology, and the resolution of resulting issues, will generally not be available. As such, the regulatory considerations documented in the RTDP should be considered an initial “first step” in the licensing process which will continue until a license is issued to build and operate the said nuclear facility.
Because a facility license application relies heavily on the data and information generated by technology development studies, the anticipated regulatory importance of key DOE reactor research initiatives should be assessed early in the technology development process. Quality assurance requirements supportive of later licensing activities must also be attached to important research activities to ensure resulting data is usable in that context. Early regulatory analysis and licensing approach planning thus provides a significant benefit to the formulation of research plans and also enables the planning and development of a compatible AdvSMR licensing framework, should significant modification be required.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Held, Isaac; V. Balaji; Fueglistaler, Stephan
We have constructed and analyzed a series of idealized models of tropical convection interacting with large-scale circulations, at 25-50 km resolution and at cloud-resolving 1-2 km resolution, to set the stage for rigorous tests of convection closure schemes in high-resolution global climate models. Much of the focus has been on the climatology of tropical cyclogenesis in rotating systems and the related problem of the spontaneous aggregation of convection in non-rotating systems. The PI (Held) will be delivering the honorary Bjerknes lecture at the Fall 2016 AGU meeting in December on this work. We have also provided new analyses of long-standing issues related to the interaction between convection and the large-scale circulation: Kelvin waves in the upper troposphere and lower stratosphere, water vapor transport into the stratosphere, and upper tropospheric temperature trends. The results of these analyses help to improve our understanding of processes and provide tests for future high-resolution global modeling. Our final goal of testing new convection schemes in next-generation global atmospheric models at GFDL has been left for future work, due both to the complexity of the idealized model results uncovered in this work, which were meant as tests for these models, and to computational resource limitations. Eleven papers have been published with support from this grant, two are in review, and another major summary paper is in preparation.
NASA Astrophysics Data System (ADS)
Bermudez, L. E.; Percivall, G.; Idol, T. A.
2015-12-01
Experts in climate modeling, remote sensing of the Earth, and cyberinfrastructure must work together to make climate predictions available to decision makers. Such experts and decision makers worked together in the Open Geospatial Consortium's (OGC) Testbed 11 to address a scenario of population displacement by coastal inundation due to predicted sea level rise. In a Policy Fact Sheet, "Harnessing Climate Data to Boost Ecosystem & Water Resilience," issued by the White House Office of Science and Technology Policy (OSTP) in December 2014, OGC committed to increasing access to climate change information using open standards. In July 2015, the OGC Testbed 11 Urban Climate Resilience activity delivered on that commitment with open-standards-based support for climate-change preparedness. Using open standards such as the OGC Web Coverage Service and Web Processing Service and the NetCDF and GMLJP2 encoding standards, Testbed 11 deployed an interoperable high-resolution flood model to bring climate model outputs together with global change assessment models and other remote sensing data for decision support. Methods to confirm model predictions and to allow "what-if" scenarios included in-situ sensor webs and crowdsourcing. The scenario was set in two locations: the San Francisco Bay Area and Mozambique. The scenarios demonstrated the interoperation and capabilities of open geospatial specifications in supporting data services and processing services. The resultant High Resolution Flood Information System addressed access and control of simulation models and high-resolution data in an open, worldwide, collaborative Web environment. The scenarios examined the feasibility and capability of existing OGC geospatial Web service specifications in supporting the on-demand, dynamic serving of flood information from models with forecasting capacity. Results of this testbed included identification of standards and best practices that help researchers and cities deal with climate-related issues.
Results of the testbed will now be deployed in pilot applications. The testbed also identified areas of additional development needed to help identify scientific investments and cyberinfrastructure approaches for improving the application of climate science research results to urban climate resilience.
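As a hedged illustration of the kind of open-standards request such a system serves, the sketch below assembles an OGC WCS 2.0 GetCoverage URL in key-value-pair form for a latitude/longitude subset. The endpoint and coverage identifier are invented; an actual Testbed 11 service would publish its own.

```python
from urllib.parse import urlencode

def wcs_getcoverage_url(endpoint, coverage_id, lat, lon, fmt="application/netcdf"):
    """Build an OGC WCS 2.0 KVP GetCoverage request for a lat/lon subset.

    `subset` appears twice (one trim per axis), so parameters are kept as a
    list of tuples rather than a dict.
    """
    params = [
        ("service", "WCS"),
        ("version", "2.0.1"),
        ("request", "GetCoverage"),
        ("coverageId", coverage_id),
        ("subset", f"Lat({lat[0]},{lat[1]})"),
        ("subset", f"Long({lon[0]},{lon[1]})"),
        ("format", fmt),
    ]
    return endpoint + "?" + urlencode(params)

# Hypothetical flood-depth coverage over the San Francisco Bay Area
url = wcs_getcoverage_url("https://example.org/wcs", "flood_depth",
                          (37.2, 38.2), (-123.1, -121.6))
```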
Vapor characterization of Tank 241-C-103
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huckaby, J.L.; Story, M.S.
The Westinghouse Hanford Company Tank Vapor Issue Resolution Program has developed, in cooperation with Northwest Instrument Systems, Inc., Oak Ridge National Laboratory, Oregon Graduate Institute of Science and Technology, Pacific Northwest Laboratory, and Sandia National Laboratory, the equipment and expertise to characterize gases and vapors in the high-level radioactive waste storage tanks at the Hanford Site in south-central Washington State. This capability has been demonstrated by the characterization of the tank 241-C-103 headspace. This tank headspace is the first, and for many reasons is expected to be the most problematic, that will be characterized (Osborne 1992). Results from the most recent and comprehensive sampling event, sample job 7B, are presented for the purpose of providing scientific bases for resolution of vapor issues associated with tank 241-C-103. This report is based on the work of Clauss et al. 1994, Jenkins et al. 1994, Ligotke et al. 1994, Mahon et al. 1994, and Rasmussen and Einfeld 1994. No attempt has been made in this report to evaluate the implications of the data presented, such as the potential impact of headspace gases and vapors on tank farm workers' health. That and other issues will be addressed elsewhere. Key to the resolution of worker health issues is the quantitation of compounds of toxicological concern. The Toxicology Review Panel, a panel of Pacific Northwest Laboratory experts in various areas of toxicology, has chosen 19 previously identified compounds as being of potential toxicological concern. During sample job 7B, the sampling and analytical methodology was validated for this preliminary list of compounds of toxicological concern. Validation was performed according to guidance provided by the Tank Vapor Conference Committee, a group of analytical chemists from academic institutions and national laboratories assembled and commissioned by the Tank Vapor Issue Resolution Program.
29 CFR 33.12 - Complaint handling procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... that resolution of the complaint would require a fundamental alteration of the program or undue... Management (Deputy ASAM). (i) If informal resolution is not achieved, the Deputy ASAM shall issue a... Administration and Management (ASAM). (j)(1) An appeal of the Deputy ASAM's determination may be filed with the...
Specialized ADR To Settle Faculty Employment Disputes.
ERIC Educational Resources Information Center
DiNardo, Lawrence C.; Sherrill, John A.; Palmer, Anna R.
2001-01-01
Presents an innovative approach to resolution of faculty employment disputes at institutions of higher education. Discusses the framework in which faculty employment issues arise, the current state of alternative dispute resolution (ADR) as it is relevant to employment disputes, and the substantial benefits which could be achieved by developing a…
18 CFR 385.604 - Alternative means of dispute resolution (Rule 604).
Code of Federal Regulations, 2011 CFR
2011-04-01
... final written agreement or arbitral award reached as a result of a dispute resolution proceeding, is not..., unless the decisional authority, upon motion or otherwise, orders a different procedure. (b) Definitions... arbitration, or any combination thereof; (2) Award means any decision by an arbitrator resolving the issues in...
High density processing electronics for superconducting tunnel junction x-ray detector arrays
NASA Astrophysics Data System (ADS)
Warburton, W. K.; Harris, J. T.; Friedrich, S.
2015-06-01
Superconducting tunnel junctions (STJs) are excellent soft x-ray (100-2000 eV) detectors, particularly for synchrotron applications, because of their ability to obtain energy resolutions below 10 eV at count rates approaching 10 kcps. In order to achieve useful solid detection angles with these very small detectors, they are typically deployed in large arrays - currently with 100+ elements, but with 1000 elements being contemplated. In this paper we review a 5-year effort to develop compact, computer controlled low-noise processing electronics for STJ detector arrays, focusing on the major issues encountered and our solutions to them. Of particular interest are our preamplifier design, which can set the STJ operating points under computer control and achieve 2.7 eV energy resolution; our low noise power supply, which produces only 2 nV/√Hz noise at the preamplifier's critical cascode node; our digital processing card that digitizes and digitally processes 32 channels; and an STJ I-V curve scanning algorithm that computes noise as a function of offset voltage, allowing an optimum operating point to be easily selected. With 32 preamplifiers laid out on a custom 3U EuroCard, and the 32 channel digital card in a 3U PXI card format, electronics for a 128 channel array occupy only two small chassis, each the size of a National Instruments 5-slot PXI crate, and allow full array control with simple extensions of existing beam line data collection packages.
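The I-V scanning idea described above — measure noise as a function of offset voltage and select the minimum — can be sketched as follows. This is a toy illustration with invented scan values, not the actual instrument firmware.

```python
import numpy as np

def optimal_bias(voltages, noise):
    """Return the offset voltage (and its noise figure) with the lowest
    measured noise across an I-V scan."""
    i = int(np.argmin(noise))
    return float(voltages[i]), float(noise[i])

# Hypothetical scan: 9 bias points (mV) and measured noise (eV FWHM) at each
v = np.linspace(0.0, 0.4, 9)
n = np.array([9.1, 7.4, 5.2, 4.0, 3.6, 3.9, 4.8, 6.5, 8.8])
best_v, best_n = optimal_bias(v, n)
```

Selecting the operating point this way lets an array of 100+ junctions be tuned automatically rather than channel by channel.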
New photon-counting detectors for single-molecule fluorescence spectroscopy and imaging
Michalet, X.; Colyer, R. A.; Scalia, G.; Weiss, S.; Siegmund, Oswald H. W.; Tremsin, Anton S.; Vallerga, John V.; Villa, F.; Guerrieri, F.; Rech, I.; Gulinatti, A.; Tisa, S.; Zappa, F.; Ghioni, M.; Cova, S.
2013-01-01
Solution-based single-molecule fluorescence spectroscopy is a powerful new experimental approach with applications in all fields of natural sciences. Two typical geometries can be used for these experiments: point-like and widefield excitation and detection. In point-like geometries, the basic concept is to excite and collect light from a very small volume (typically femtoliter) and work in a concentration regime resulting in rare burst-like events corresponding to the transit of a single-molecule. Those events are accumulated over time to achieve proper statistical accuracy. Therefore the advantage of extreme sensitivity is somewhat counterbalanced by a very long acquisition time. One way to speed up data acquisition is parallelization. Here we will discuss a general approach to address this issue, using a multispot excitation and detection geometry that can accommodate different types of novel highly-parallel detector arrays. We will illustrate the potential of this approach with fluorescence correlation spectroscopy (FCS) and single-molecule fluorescence measurements. In widefield geometries, the same issues of background reduction and single-molecule concentration apply, but the duration of the experiment is fixed by the time scale of the process studied and the survival time of the fluorescent probe. Temporal resolution on the other hand, is limited by signal-to-noise and/or detector resolution, which calls for new detector concepts. We will briefly present our recent results in this domain. PMID:24729836
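The parallelization argument can be made concrete with a sketch: each detector element in a multispot geometry yields its own intensity trace, and an independent correlation curve is computed per channel, cutting total acquisition time roughly by the channel count. The normalization below is a simplified stand-in for a full FCS estimator, and all data are simulated.

```python
import numpy as np

def autocorrelation(counts):
    """Normalized autocorrelation of one channel's mean-subtracted intensity
    trace (lag 0 .. N-1); a simplified stand-in for a full FCS estimator."""
    x = np.asarray(counts, dtype=float)
    x = x - x.mean()
    acf = np.correlate(x, x, mode="full")[x.size - 1:]
    return acf / acf[0]

# 8 hypothetical spots, each producing an independent photon-count trace
rng = np.random.default_rng(0)
channels = rng.poisson(5.0, size=(8, 1000))
curves = [autocorrelation(c) for c in channels]
```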
ERIC Educational Resources Information Center
Harris, Geoff
2008-01-01
This article commences with an explanation of some of the technical terms in the field of conflict resolution. It then examines the common ways which parties to a conflict use in an effort to deal with it and concludes that, on a number of criteria, collaborative conflict resolution is the superior method. Using some representative examples of…
Performance Analysis of Satellite Missions for Multi-Temporal SAR Interferometry
Belmonte, Antonella; Nutricato, Raffaele; Nitti, Davide O.; Chiaradia, Maria T.
2018-01-01
Multi-temporal InSAR (MTI) applications pose challenges related to the availability of coherent scattering from the ground surface, the complexity of the ground deformations, atmospheric artifacts, and visibility problems related to ground elevation. Nowadays, several satellite missions are available providing interferometric SAR data at different wavelengths, spatial resolutions, and revisit time. A new and interesting opportunity is provided by Sentinel-1, which has a spatial resolution comparable to that of previous ESA C-band sensors, and revisit times improved by up to 6 days. According to these different SAR space-borne missions, the present work discusses current and future opportunities of MTI applications in terms of ground instability monitoring. Issues related to coherent target detection, mean velocity precision, and product geo-location are addressed through a simple theoretical model assuming backscattering mechanisms related to point scatterers. The paper also presents an example of a multi-sensor ground instability investigation over Lesina Marina, a village in Southern Italy lying over a gypsum diapir, where a hydration process, involving the underlying anhydride, causes a smooth uplift and the formation of scattered sinkholes. More than 20 years of MTI SAR data have been processed, coming from both legacy ERS and ENVISAT missions, and latest-generation RADARSAT-2, COSMO-SkyMed, and Sentinel-1A sensors. Results confirm the presence of a rather steady uplift process, with limited to null variations throughout the whole monitored time-period. PMID:29702588
NASA Technical Reports Server (NTRS)
Kwong, Victor H. S.
2003-01-01
The laser ablation/ion storage facility at the UNLV Physics Department has been dedicated to the study of atomic and molecular processes in low temperature plasmas. Our program focuses on the charge transfer (electron capture) of multiply charged ions and neutrals important in astrophysics. The electron transfer reactions with atoms and molecules is crucial to the ionization condition of neutral rich photoionized plasmas. With the successful deployment of the Far Ultraviolet Spectroscopic Explorer (FUSE) and the Chandra X-ray Observatory by NASA high resolution VUV and X-ray emission spectra fiom various astrophysical objects have been collected. These spectra will be analyzed to determine the source of the emission and the chemical and physical environment of the source. The proper interpretation of these spectra will require complete knowledge of all the atomic processes in these plasmas. In a neutral rich environment, charge transfer can be the dominant process. The rate coefficients need to be known accurately. We have also extended our charge transfer measurements to KeV region with a pulsed ion beam. The inclusion of this facility into our current program provides flexibility in extending the measurement to higher energies (KeV) if needed. This flexibility enables us to address issues of immediate interest to the astrophysical community as new observations are made by high resolution space based observatories.
Gradation (approx. 10 size states) of synaptic strength by quantal addition of structural modules
2017-01-01
Memory storage involves activity-dependent strengthening of synaptic transmission, a process termed long-term potentiation (LTP). The late phase of LTP is thought to encode long-term memory and involves structural processes that enlarge the synapse. Hence, understanding how synapse size is graded provides fundamental information about the information storage capability of synapses. Recent work using electron microscopy (EM) to quantify synapse dimensions has suggested that synapses may structurally encode as many as 26 functionally distinct states, which correspond to a series of proportionally spaced synapse sizes. Other recent evidence using super-resolution microscopy has revealed that synapses are composed of stereotyped nanoclusters of α-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid (AMPA) receptors and scaffolding proteins; furthermore, synapse size varies linearly with the number of nanoclusters. Here we have sought to develop a model of synapse structure and growth that is consistent with both the EM and super-resolution data. We argue that synapses are composed of modules consisting of matrix material and potentially one nanocluster. LTP induction can add a trans-synaptic nanocluster to a module, thereby converting a silent module to an AMPA functional module. LTP can also add modules by a linear process, thereby producing an approximately 10-fold gradation in synapse size and strength. This article is part of the themed issue ‘Integrating Hebbian and homeostatic plasticity’. PMID:28093559
Legal Issues in Anonymity and Pseudonymity.
ERIC Educational Resources Information Center
Froomkin, A. Michael
1999-01-01
Regulation of anonymous and pseudonymous communications is one of the important and contentious Internet-related issues of the 21st century. Resolution of this controversy will affect freedom of speech, the nature of electronic commerce, and the capabilities of law enforcement. The legal constraints on anonymous communication, and the constitutional constraints on…
Review of Congressional Issues. News from Capitol Hill.
ERIC Educational Resources Information Center
Heinz, Ann Simeo
2002-01-01
Focuses on U.S. congressional issues in two categories: (1) enacted legislation, and (2) proposed legislation. Addresses topics such as the resolution related to Iraq, the Department of Homeland Security, Pledge of Allegiance, social security protection, elder justice, and women's rights. Includes learning activities. (CMK)
RELAP5-3D Resolution of Known Restart/Backup Issues
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mesina, George L.; Anderson, Nolan A.
2014-12-01
The state-of-the-art nuclear reactor system safety analysis computer program developed at the Idaho National Laboratory (INL), RELAP5-3D, continues to adapt to changes in computer hardware and software and to develop to meet the ever-expanding needs of the nuclear industry. To continue at the forefront, code testing must evolve with both code and industry developments, and it must work correctly. To best ensure this, the processes of Software Verification and Validation (V&V) are applied. Verification compares coding against its documented algorithms and equations and compares its calculations against analytical solutions and the method of manufactured solutions. A form of this, sequential verification, checks code specifications against coding only when originally written, then applies regression testing, which compares code calculations between consecutive updates or versions on a set of test cases to check that the performance does not change. A sequential verification testing system was specially constructed for RELAP5-3D to both detect errors with extreme accuracy and cover all nuclear-plant-relevant code features. Detection is provided through a “verification file” that records double-precision sums of key variables. Coverage is provided by a test suite of input decks that exercise code features and capabilities necessary to model a nuclear power plant. A matrix of test features and short-running cases that exercise them is presented. This testing system is used to test base cases (called null testing) as well as restart and backup cases. It can test RELAP5-3D performance in both standalone and coupled (through PVM to other codes) runs. Application of verification testing revealed numerous restart and backup issues in both standalone and coupled modes. This document reports the resolution of these issues.
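A regression check against a verification file of double-precision sums might look like the following sketch. The variable names, tolerance, and file format (here just a dict) are assumptions for illustration, not RELAP5-3D's actual implementation.

```python
def compare_verification_files(baseline, candidate, rel_tol=1e-12):
    """Compare per-variable double-precision checksums from two code versions.

    Each input maps a variable name to its summed value; any relative
    difference above rel_tol (or a missing variable) is reported as a
    potential regression.
    """
    regressions = {}
    for name, base in baseline.items():
        new = candidate.get(name)
        if new is None:
            regressions[name] = "missing"
            continue
        denom = max(abs(base), 1e-300)  # guard against division by zero
        if abs(new - base) / denom > rel_tol:
            regressions[name] = (base, new)
    return regressions

# Hypothetical checksums from consecutive code versions on one test case
base = {"pressure_sum": 1.0e7, "voidf_sum": 532.25}
cand = {"pressure_sum": 1.0e7, "voidf_sum": 532.25000001}
regressions_found = compare_verification_files(base, cand)
```

The tight tolerance reflects the point of the technique: double-precision sums detect even last-digit drifts between consecutive versions.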
Moving It Along: A study of healthcare professionals' experience with ethics consultations.
Crigger, Nancy; Fox, Maria; Rosell, Tarris; Rojjanasrirat, Wilaiporn
2017-05-01
Ethics consultation is the traditional way of resolving challenging ethical questions raised about patient care in the United States. Little research has been published on the resolution process used during ethics consultations and on how this experience affects healthcare professionals who participate in them. The purpose of this qualitative research was to uncover the basic process that occurs in consultation services through study of the perceptions of healthcare professionals. The researchers in this study used a constructivist grounded theory approach that represents how one group of professionals experienced ethics consultations in their hospital in the United States. The results were sufficient to develop an initial theory that has been named after the core concept: Moving It Along. Three process stages emerged from data interpretation: moral questioning, seeing the big picture, and coming together. It is hoped that this initial work stimulates additional research in describing and understanding the complex social process that occurs for healthcare professionals as they address the difficult moral issues that arise in clinical practice.
NASA Astrophysics Data System (ADS)
Ruin, Isabelle
2014-05-01
How do people respond to heavy precipitation and flood warnings? How do they adapt their daily schedules and activities to the fast evolution of environmental circumstances? More generally, how do social processes interact with physical ones? Such questions address the dynamical interactions between hydro-meteorological variables, human perception and representation of the environment, and actual individual and social behavioral responses. They also pose the question of scale and hierarchy through seamless interactions between smaller and larger scales. These questions are relevant for both social and physical scientists. They are increasingly and pertinently addressed in the Global Environmental Change perspective through the concepts of Coupled Human And Natural Systems (CHANS), resilience, or panarchy, developed in the context of interdisciplinary collaborations. Nevertheless, those concepts are complex and not easy to handle, especially when facing operational goals. One of the main difficulties in advancing these integrated approaches is access to empirical data informing the processes at various scales. In fact, while physical and social processes are well studied by distinct disciplines, they are rarely jointly explored at similar spatial and temporal resolutions. Such coupled observation and analysis poses methodological challenges, especially when dealing with responses to short-fuse and extreme weather events. Indeed, while such a coupled approach is quite common for studying large-scale phenomena like global change (for instance, using historical data on greenhouse gas emissions and the evolution of temperatures worldwide), it is rarer for studying smaller nested sets of scales of human-nature systems where finer-resolution data are sparse. Another problem arises from the need to produce comparable analyses of different case studies where social, physical, and even cultural contexts may be diverse.
Generic and robust frameworks for data collection, modeling, and analysis are needed to allow cross-comparison and deeper understanding of the processes across scales. This presentation will address these issues based on concrete examples from empirical studies of past flash-flood events across Europe and the USA.
34 CFR 300.510 - Resolution process.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 34 Education 2 2012-07-01 2012-07-01 false Resolution process. 300.510 Section 300.510 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF SPECIAL EDUCATION AND... DISABILITIES Procedural Safeguards Due Process Procedures for Parents and Children § 300.510 Resolution process...
The new frontiers of multimodality and multi-isotope imaging
NASA Astrophysics Data System (ADS)
Behnam Azad, Babak; Nimmagadda, Sridhar
2014-06-01
Technological advances in imaging systems and the development of target-specific imaging tracers have grown rapidly over the past two decades. Recent progress in "all-in-one" imaging systems that allow automated image coregistration has significantly added to the growth of this field. These developments include ultra-high-resolution PET and SPECT scanners that can be integrated with CT or MR, resulting in PET/CT, SPECT/CT, SPECT/PET and PET/MRI scanners for simultaneous high-resolution, high-sensitivity anatomical and functional imaging. These technological developments have also resulted in drastic enhancements in image quality and acquisition time while eliminating cross-compatibility issues between modalities. Furthermore, the most cutting-edge technology, though mostly preclinical, also allows for simultaneous multimodality, multi-isotope image acquisition and image reconstruction based on radioisotope decay characteristics. These scientific advances, in conjunction with the explosion in the development of highly specific multimodality molecular imaging agents, may aid in realizing simultaneous imaging of multiple biological processes and pave the way towards more efficient diagnosis and improved patient care.
Modeling lakes and reservoirs in the climate system
MacKay, M.D.; Neale, P.J.; Arp, C.D.; De Senerpont Domis, L. N.; Fang, X.; Gal, G.; Jo, K.D.; Kirillin, G.; Lenters, J.D.; Litchman, E.; MacIntyre, S.; Marsh, P.; Melack, J.; Mooij, W.M.; Peeters, F.; Quesada, A.; Schladow, S.G.; Schmid, M.; Spence, C.; Stokes, S.L.
2009-01-01
Modeling studies examining the effect of lakes on regional and global climate, as well as studies on the influence of climate variability and change on aquatic ecosystems, are surveyed. Fully coupled atmosphere-land surface-lake climate models that could be used for both of these types of study simultaneously do not presently exist, though there are many applications that would benefit from such models. It is argued here that current understanding of physical and biogeochemical processes in freshwater systems is sufficient to begin to construct such models, and a path forward is proposed. The largest impediment to fully representing lakes in the climate system lies in the handling of lakes that are too small to be explicitly resolved by the climate model, and that make up the majority of the lake-covered area at the resolutions currently used by global and regional climate models. Ongoing development within the hydrological sciences community and continual improvements in model resolution should help ameliorate this issue.
Femtoelectron-Based Terahertz Imaging of Hydration State in a Proton Exchange Membrane Fuel Cell
NASA Astrophysics Data System (ADS)
Buaphad, P.; Thamboon, P.; Kangrang, N.; Rhodes, M. W.; Thongbai, C.
2015-08-01
Imbalanced water management in a proton exchange membrane (PEM) fuel cell significantly reduces cell performance and durability. Visualization of water distribution and transport can provide greater insight toward optimization of the PEM fuel cell. In this work, we are interested in water flooding issues that occur in the flow channels on the cathode side of the PEM fuel cell. The sample cell was fabricated with a transparent acrylic window that allows light access, so the process of flooding formation could be observed in situ via a CCD camera. We then explore the potential use of terahertz (THz) imaging, consisting of a femtoelectron-based THz source and off-angle reflective-mode imaging, to identify water presence in the sample cell. We present simulations of two hydration states (water and non-water areas), which are in agreement with the THz image results. A line-scan plot is used for quantitative analysis and for defining the spatial resolution of the image. Implementing metal-mesh filtering can improve the spatial resolution of our THz imaging system.
Problem Based Learning and the scientific process
NASA Astrophysics Data System (ADS)
Schuchardt, Daniel Shaner
This research project was developed to inspire students to use problem-based learning and the scientific process constructively to learn middle school science content. The student population in this study consisted of male and female seventh-grade students. Students were presented with authentic problems connected to physical and chemical properties of matter. The intent of the study was to have students use the scientific process of examining existing knowledge, generating learning issues or questions about the problems, and then developing a course of action to research and design experiments modeling resolutions to the authentic problems. It was expected that students would improve their ability to actively engage with others in a problem-solving process to achieve a deeper understanding of Michigan's 7th Grade Level Content Expectations, the Next Generation Science Standards, and a scientific process. Problem-based learning was statistically effective for students' learning of the scientific process: students showed statistically significant improvement from pretest to posttest scores. The teaching method of problem-based learning was effective for seventh-grade science students at Dowagiac Middle School.
NASA Astrophysics Data System (ADS)
Marzolff, Irene
2014-05-01
One hundred years after the first publication on aerial photography taken from unmanned aerial platforms (Arthur Batut 1890), small-format aerial photography (SFAP) became a distinct niche within remote sensing during the 1990s. Geographers, plant biologists, archaeologists and other researchers with geospatial interests re-discovered the usefulness of unmanned platforms for taking high-resolution, low-altitude photographs that could then be digitized and analysed with geographical information systems, (softcopy) photogrammetry and image processing techniques originally developed for digital satellite imagery. Even before the ubiquity of digital consumer-grade cameras and 3D analysis software accessible to the photogrammetric layperson, do-it-yourself remote sensing using kites, blimps, drones and micro air vehicles literally enabled the questing researcher to get their own pictures of the world. As a flexible, cost-effective method, SFAP offered images with high spatial and temporal resolutions that could be ideally adapted to the scales of landscapes, forms and distribution patterns to be monitored. During the last five years, this development has been significantly accelerated by the rapid technological advancements of GPS navigation, autopiloting and revolutionary softcopy-photogrammetry techniques. State-of-the-art unmanned aerial systems (UAS) now allow automatic flight planning, autopilot-controlled aerial surveys, ground control-free direct georeferencing and DEM plus orthophoto generation with centimeter accuracy, all within the space of one day. The ease of use of current UAS and processing software for the generation of high-resolution topographic datasets and spectacular visualizations is tempting and has spurred the number of publications on these issues - but which advancements in our knowledge and understanding of geomorphological processes have we seen and can we expect in the future? 
This presentation traces the development of the last two decades by presenting and discussing examples for geomorphological research using UAS, mostly from the field of soil erosion monitoring.
Does more mean less? The value of information for conservation planning under sea level rise.
Runting, Rebecca K; Wilson, Kerrie A; Rhodes, Jonathan R
2013-02-01
Many studies have explored the benefits of adopting more sophisticated modelling techniques or spatial data in terms of our ability to accurately predict ecosystem responses to global change. However, we currently know little about whether the improved predictions will actually lead to better conservation outcomes once the costs of gaining improved models or data are accounted for. This severely limits our ability to make strategic decisions for adaptation to global pressures, particularly in landscapes subject to dynamic change such as the coastal zone. In such landscapes, the global phenomenon of sea level rise is a critical consideration for preserving biodiversity. Here, we address this issue in the context of making decisions about where to locate a reserve system to preserve coastal biodiversity with a limited budget. Specifically, we determined the cost-effectiveness of investing in high-resolution elevation data and process-based models for predicting wetland shifts in a coastal region of South East Queensland, Australia. We evaluated the resulting priority areas for reserve selection to quantify the cost-effectiveness of investment in better quantifying biological and physical processes. We show that, in this case, it is considerably more cost effective to use a process-based model and high-resolution elevation data, even if this requires a substantial proportion of the project budget to be expended (up to 99% in one instance). The less accurate model and data set failed to identify areas of high conservation value, reducing the cost-effectiveness of the resultant conservation plan. This suggests that when developing conservation plans in areas where sea level rise threatens biodiversity, investing in high-resolution elevation data and process-based models to predict shifts in coastal ecosystems may be highly cost effective. A future research priority is to determine how this cost-effectiveness varies among different regions across the globe. 
© 2012 Blackwell Publishing Ltd.
New low noise CCD cameras for Pi-of-the-Sky project
NASA Astrophysics Data System (ADS)
Kasprowicz, G.; Czyrkowski, H.; Dabrowski, R.; Dominik, W.; Mankiewicz, L.; Pozniak, K.; Romaniuk, R.; Sitek, P.; Sokolowski, M.; Sulej, R.; Uzycki, J.; Wrochna, G.
2006-10-01
Modern research trends require observation of ever fainter astronomical objects over large areas of the sky. This implies the use of systems with high temporal and optical resolution together with computer-based data acquisition and processing, which is why charge-coupled devices (CCDs) have become so popular: they offer quick picture conversion with much better quality than film-based technologies. This work is a theoretical and practical study of a CCD-based picture acquisition system. The system was optimized for the "Pi of the Sky" project but can be adapted to other professional astronomical research. The work covers the issues of picture conversion, signal acquisition, data transfer, and the mechanical construction of the device.
Generation Algorithm of Discrete Line in Multi-Dimensional Grids
NASA Astrophysics Data System (ADS)
Du, L.; Ben, J.; Li, Y.; Wang, R.
2017-09-01
The Discrete Global Grid System (DGGS) is a digital multi-resolution earth reference model whose structure is conducive to the integration and mining of geospatial big data. Vectors are one of the important types of spatial data; only after discretization can they be processed and analyzed in a grid system. Based on a set of constraint conditions, this paper puts forward a strict definition of discrete lines and builds a mathematical model of them using a base-vector combination method. By means of a hyperplane, the problem of mesh discrete lines in n-dimensional grids is transformed into the problem of an optimal deviated path in n-1 dimensions, thereby realizing a dimension-reduction process in the expression of mesh discrete lines. On this basis, we designed a simple and efficient dimension-reduction algorithm for generating discrete lines. The experimental results show that our algorithm can be applied not only to the two-dimensional rectangular grid but also to the two-dimensional hexagonal grid and the three-dimensional cubic grid. Moreover, when applied to a two-dimensional rectangular grid, it produces a discrete line that is closer to the corresponding line in Euclidean space.
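The paper's base-vector and dimension-reduction method is not reproduced in the abstract, but the underlying task (discretizing a line segment onto a 2D rectangular grid) can be illustrated with the classic Bresenham algorithm; this is a minimal sketch of the standard technique, not the authors' algorithm:

```python
def discrete_line(x0, y0, x1, y1):
    """Rasterize the segment (x0, y0)-(x1, y1) onto a 2D rectangular grid
    using Bresenham's algorithm; returns the list of grid cells visited."""
    cells = []
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy  # accumulated deviation from the ideal Euclidean line
    while True:
        cells.append((x0, y0))
        if x0 == x1 and y0 == y1:
            break
        e2 = 2 * err
        if e2 >= dy:  # step in x when the error allows it
            err += dy
            x0 += sx
        if e2 <= dx:  # step in y when the error allows it
            err += dx
            y0 += sy
    return cells
```

The error term plays the role of the "deviation" that the paper's optimal deviated path minimizes: each step chooses the neighboring cell closest to the ideal line.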
Exploring scaling issues by using NASA Cold Land Processes Experiment(CLPX-1, IOP3) radiometric data
NASA Technical Reports Server (NTRS)
Tedesco, Marco; Kim, Edward J.; Cline, Don; Graf, Tobias; Koike, Toshio; Armstrong, Richard; Brodzik, Mary; Stankov, Boba; Gasiewski, Al; Klein, Marian
2004-01-01
The NASA Cold Land Processes Field Experiment-1 (CLPX-1) involved several instruments in order to acquire data at different spatial resolutions. Indeed, one of the main tasks of CLPX-1 was to explore scaling issues associated with microwave remote sensing of snowpacks. To achieve this task, microwave brightness temperatures collected at 18.7, 36.5, and 89 GHz at the LSOS test site by the University of Tokyo's Ground-Based Microwave Radiometer-7 (GBMR-7) were compared with brightness temperatures recorded by the NOAA Polarimetric Scanning Radiometer (PSR/A) and by the SSM/I and AMSR-E radiometers. Differences between observations at the different scales were found; they may be due to the topography of the terrain and to the observed footprints. For satellite and airborne data, it is indeed necessary to consider the heterogeneity of the terrain, and the presence of trees inside the observed scene becomes a very important factor. Differences were also found when comparing data acquired only by the two satellites. Different acquisition times and footprint positions, together with different calibration and validation procedures, can be responsible for the observed differences.
28 CFR 542.14 - Initial filing.
Code of Federal Regulations, 2010 CFR
2010-07-01
... place a single complaint or a reasonable number of closely related issues on the form. If the inmate includes on a single form multiple unrelated issues, the submission shall be rejected and returned without... of informal resolution and submission of a formal written Administrative Remedy Request, on the...
2009-03-31
failure in some cases to ensure that its exported products meet U.S. health and safety standards. Further complicating the bilateral economic...resolution cases against China in the WTO, and continuing pressure on China to appreciate its currency. Others have warned against using...
Multi-Resolution Indexing for Hierarchical Out-of-Core Traversal of Rectilinear Grids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pascucci, V.
2000-07-10
The real-time processing of very large volumetric meshes introduces specific algorithmic challenges due to the impossibility of fitting the input data in the main memory of a computer. The basic assumption (RAM computational model) of uniform, constant-time access to each memory location is not valid because part of the data is stored out of core or in external memory. The performance of most algorithms does not scale well in the transition from in-core to out-of-core processing conditions. The performance degradation is due to the high frequency of I/O operations, which may start dominating the overall running time. Out-of-core computing [28] addresses specifically the issues of algorithm redesign and data-layout restructuring to enable data access patterns with minimal performance degradation in out-of-core processing. Results in this area are also valuable in parallel and distributed computing, where one has to deal with the similar issue of balancing processing time with data migration time. The solution of the out-of-core processing problem is typically divided into two parts: (i) analysis of a specific algorithm to understand its data access patterns and, when possible, redesign of the algorithm to maximize their locality; and (ii) storage of the data in secondary memory with a layout consistent with the access patterns of the algorithm, to amortize the cost of each I/O operation over several memory access operations. In the case of a hierarchical visualization algorithm for volumetric data, the 3D input hierarchy is traversed to build derived geometric models with adaptive levels of detail. The shape of the output models is then modified dynamically with incremental updates of their level of detail. The parameters that govern this continuous modification of the output geometry depend on runtime user interaction, making it impossible to determine a priori which levels of detail are going to be constructed.
For example, they can depend on external parameters like the viewpoint of the current display window, or on internal parameters like the isovalue of an isocontour or the position of an orthogonal slice. The structure of the access pattern can be summarized in two main points: (i) the input hierarchy is traversed level by level, so that data in the same level of resolution, or in adjacent levels, is traversed at the same time; and (ii) within each level of resolution, the data is mostly traversed at the same time in regions that are geometrically close. In this paper I introduce a new static indexing scheme that induces a data layout satisfying both requirements (i) and (ii) for the hierarchical traversal of n-dimensional regular grids. In one particular implementation, the scheme exploits in a new way the recursive construction of the Z-order space-filling curve. The standard indexing that maps the input nD data onto a 1D sequence for the Z-order curve is based on a simple bit-interleaving operation that merges the n input indices into one index n times longer. This helps in grouping the data for geometric proximity, but only for a specific level of detail. In this paper I show how this indexing can be transformed into an alternative index that allows one to group the data first per level of resolution and then, within each level, per geometric proximity. This yields a data layout that is appropriate for hierarchical out-of-core processing of large grids.
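The standard bit-interleaving step described above is easy to sketch. The following is a minimal illustration of plain 2D Morton (Z-order) encoding, not the paper's level-reordered variant:

```python
def morton_encode_2d(x: int, y: int, bits: int = 16) -> int:
    """Merge the bits of two grid indices into one Z-order (Morton) index:
    bit i of x lands at position 2*i, bit i of y at position 2*i + 1."""
    z = 0
    for i in range(bits):
        z |= ((x >> i) & 1) << (2 * i)
        z |= ((y >> i) & 1) << (2 * i + 1)
    return z

# Sorting cells by this index groups geometrically close cells together,
# which is the locality property the standard Z-order layout provides.
cells = sorted(((x, y) for x in range(4) for y in range(4)),
               key=lambda c: morton_encode_2d(*c))
```

The paper's contribution is a permutation of this index so that cells are grouped first by resolution level and only then by Morton proximity; the interleaving itself is the common starting point.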
Supernova feedback in numerical simulations of galaxy formation: separating physics from numerics
NASA Astrophysics Data System (ADS)
Smith, Matthew C.; Sijacki, Debora; Shen, Sijing
2018-07-01
While feedback from massive stars exploding as supernovae (SNe) is thought to be one of the key ingredients regulating galaxy formation, theoretically it is still unclear how the available energy couples to the interstellar medium and how galactic scale outflows are launched. We present a novel implementation of six sub-grid SN feedback schemes in the moving-mesh code AREPO, including injections of thermal and/or kinetic energy, two parametrizations of delayed cooling feedback and a `mechanical' feedback scheme that injects the correct amount of momentum depending on the relevant scale of the SN remnant resolved. All schemes make use of individually time-resolved SN events. Adopting isolated disc galaxy set-ups at different resolutions, with the highest resolution runs reasonably resolving the Sedov-Taylor phase of the SN, we aim to find a physically motivated scheme with as few tunable parameters as possible. As expected, simple injections of energy overcool at all but the highest resolution. Our delayed cooling schemes result in overstrong feedback, destroying the disc. The mechanical feedback scheme is efficient at suppressing star formation, agrees well with the Kennicutt-Schmidt relation, and leads to converged star formation rates and galaxy morphologies with increasing resolution without fine-tuning any parameters. However, we find it difficult to produce outflows with high enough mass loading factors at all but the highest resolution, indicating that we have oversimplified the evolution of unresolved SN remnants, that other stellar feedback processes need to be included, that a better star formation prescription is required, or, most likely, some combination of these issues.
Moving on or digging deeper: Regulatory mode and interpersonal conflict resolution.
Webb, Christine E; Coleman, Peter T; Rossignac-Milon, Maya; Tomasulo, Stephen J; Higgins, E Tory
2017-04-01
Conflict resolution, in its most basic sense, requires movement and change between opposing motivational states. Although scholars and practitioners have long acknowledged this point, research has yet to investigate whether individual differences in the motivation for movement from state to state influence conflict resolution processes. Regulatory Mode Theory (RMT) describes this fundamental motivation as locomotion. RMT simultaneously describes an orthogonal motivational emphasis on assessment, a tendency for critical evaluation and comparison. We argue that this tendency, in the absence of a stronger motivation for locomotion, can obstruct people's propensity to reconcile. Five studies, using diverse measures and methods, found that the predominance of an individual's locomotion over assessment facilitates interpersonal conflict resolution. The first two studies present participants with hypothetical conflict scenarios to examine how chronic (Study 1) and experimentally induced (Study 2) individual differences in locomotion predominance influence the motivation to reconcile. The next two studies investigate this relation by way of participants' own conflict experiences, both through essay recall of previous conflict events (Study 3) and verbal narratives of ongoing conflict issues (Study 4). We then explore this association in the context of real-world conflict discussions between roommates (Study 5). Lastly, we examine results across these studies meta-analytically (Study 6). Overall, locomotion and assessment can inform lay theories of individual variation in the motivation to "move on" or "dig deeper" in conflict situations. We conclude by emphasizing the importance of using RMT to go beyond instrumental approaches to conflict resolution to understand fundamental individual motivations underlying its occurrence. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Conflict management: importance and implications.
McKibben, Laurie
2017-01-26
Conflict is a consistent and unavoidable issue within healthcare teams. Despite training of nurse leaders and managers in areas of conflict resolution, problems with staff relations, stress, sickness and retention remain. Conflict arises from issues with interpersonal relationships, change and poor leadership. New members of staff entering an already established healthcare team should be supported and integrated, to encourage mutual role respect between all team members and establish positive working relationships, in order to maximise patient care. This paper explores the concept of conflict, the importance of addressing causes of conflict, effective management, and the relevance of positive approaches to conflict resolution. Good leadership, nurturing positive team dynamics and communication, encourages shared problem solving and acceptance of change. Furthermore, mutual respect fosters a more positive working environment for those in healthcare teams. As conflict has direct implications for patients, positive resolution is essential, to promote safe and effective delivery of care, whilst encouraging therapeutic relationships between colleagues and managers.
It’s just a matter of time before we see global climate models increasing their spatial resolution to that now typical of regional models. This encroachment brings in an urgent need for making regional NWP and climate models applicable at certain finer resolutions. One of the hin...
USDA-ARS?s Scientific Manuscript database
Genotyping by sequencing (GBS) provides opportunities to generate high-resolution genetic maps at a low per-sample genotyping cost, but missing data and under-calling of heterozygotes complicate the creation of GBS linkage maps for highly heterozygous species. To overcome these issues, we developed ...
Texts Adopted at Meetings of the European Ministers Responsible for Sport. 1975-86.
ERIC Educational Resources Information Center
Council of Europe, Strasbourg (France).
This monograph presents resolutions passed by conferences of European ministers responsible for sport in five meetings from 1975 to 1986. The meetings were held in Brussels, London, Palma de Majorca, Malta, and Dublin. Reported also are declarations, press communiques, and resolutions issued by informal working parties and informal meetings of…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-26
... International Law (ACPIL)--Online Dispute Resolution (ODR) Study Group The Office of the Assistant Legal Adviser... guidelines and minimum requirements for online dispute resolution providers and arbitrators, substantive... that the working group is addressing is the identification of security issues relating to use of the...
Imaging Mass Spectrometry on the Nanoscale with Cluster Ion Beams
2015-01-01
Imaging with cluster secondary ion mass spectrometry (SIMS) is reaching a mature level of development. Using a variety of molecular ion projectiles to stimulate desorption, 3-dimensional imaging with the selectivity of mass spectrometry can now be achieved with submicrometer spatial resolution and <10 nm depth resolution. In this Perspective, stock is taken regarding what it will require to routinely achieve these remarkable properties. Issues include the chemical nature of the projectile, topography formation, differential erosion rates, and perhaps most importantly, ionization efficiency. Shortcomings of existing instrumentation are also noted. Speculation about how to successfully resolve these issues is a key part of the discussion. PMID:25458665
Nouri, Hamideh; Anderson, Sharolyn; Sutton, Paul; Beecham, Simon; Nagler, Pamela L.; Jarchow, Christopher J.; Roberts, Dar A.
2017-01-01
This research addresses the question as to whether or not the Normalised Difference Vegetation Index (NDVI) is scale invariant (i.e. constant over spatial aggregation) for pure pixels of urban vegetation. It has been long recognized that there are issues related to the modifiable areal unit problem (MAUP) pertaining to indices such as NDVI and images at varying spatial resolutions. These issues are relevant to using NDVI values in spatial analyses. We compare two different methods of calculation of a mean NDVI: 1) using pixel values of NDVI within feature/object boundaries and 2) first calculating the mean red and mean near-infrared across all feature pixels and then calculating NDVI. We explore the nature and magnitude of these differences for images taken from two sensors, a 1.24 m resolution WorldView-3 and a 0.1 m resolution digital aerial image. We apply these methods over an urban park located in the Adelaide Parklands of South Australia. We demonstrate that the MAUP is not an issue for calculation of NDVI within a sensor for pure urban vegetation pixels. This may prove useful for future rule-based monitoring of the ecosystem functioning of green infrastructure.
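The two aggregation orders compared in the study can be sketched in a few lines of NumPy; the band values below are made-up reflectances, used only to show that the two methods generally disagree:

```python
import numpy as np

def mean_of_ndvi(red, nir):
    """Method 1: compute NDVI per pixel, then average the NDVI values."""
    ndvi = (nir - red) / (nir + red)
    return ndvi.mean()

def ndvi_of_means(red, nir):
    """Method 2: average each band over the feature first, then compute
    a single NDVI from the band means."""
    return (nir.mean() - red.mean()) / (nir.mean() + red.mean())

# Hypothetical per-pixel reflectances for one vegetation feature.
red = np.array([0.1, 0.2, 0.3])
nir = np.array([0.5, 0.6, 0.9])
```

Because NDVI is a nonlinear (ratio) index, averaging before or after the ratio yields different values whenever the pixels are heterogeneous, which is exactly the aggregation sensitivity the MAUP question concerns.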
Interactive entity resolution in relational data: a visual analytic tool and its evaluation.
Kang, Hyunmo; Getoor, Lise; Shneiderman, Ben; Bilgic, Mustafa; Licamele, Louis
2008-01-01
Databases often contain uncertain and imprecise references to real-world entities. Entity resolution, the process of reconciling multiple references to underlying real-world entities, is an important data cleaning process required before accurate visualization or analysis of the data is possible. In many cases, in addition to noisy data describing entities, there is data describing the relationships among the entities. This relational data is important during the entity resolution process; it is useful both for the algorithms which determine likely database references to be resolved and for visual analytic tools which support the entity resolution process. In this paper, we introduce a novel user interface, D-Dupe, for interactive entity resolution in relational data. D-Dupe effectively combines relational entity resolution algorithms with a novel network visualization that enables users to make use of an entity's relational context for making resolution decisions. Since resolution decisions often are interdependent, D-Dupe facilitates understanding this complex process through animations which highlight combined inferences and a history mechanism which allows users to inspect chains of resolution decisions. An empirical study with 12 users confirmed the benefits of the relational context visualization on the performance of entity resolution tasks in relational data in terms of time as well as users' confidence and satisfaction.
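D-Dupe itself is a visual tool, but the idea of using relational context in resolution decisions can be sketched with a toy rule: merge two author records only when their names are string-similar and they share a co-author. The records, names, and threshold below are all hypothetical:

```python
from difflib import SequenceMatcher

def similar(a, b, threshold=0.7):
    """Crude string similarity between two names (0..1 ratio)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

def resolve(records):
    """Naive entity resolution: each record is (name, co-author set).
    A record merges into an existing group only if the name is similar
    AND at least one co-author is shared (the relational context)."""
    merged = []
    for name, coauthors in records:
        for group in merged:
            gname, gco = group
            if similar(name, gname) and coauthors & gco:
                gco.update(coauthors)
                break
        else:
            merged.append([name, set(coauthors)])
    return merged

records = [("J. Smith", {"A. Jones"}),
           ("John Smith", {"A. Jones", "B. Lee"}),   # merges with record 1
           ("J. Smythe", {"C. Wu"})]                  # similar name, no shared
                                                      # co-author: kept separate
```

The relational check is what blocks the false merge of "J. Smythe": name similarity alone would have collapsed all three records.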
Introduction to the virtual special issue on super-resolution imaging techniques
NASA Astrophysics Data System (ADS)
Cao, Liangcai; Liu, Zhengjun
2017-12-01
Until quite recently, the resolution of optical imaging instruments, including telescopes, cameras and microscopes, was considered to be limited by the diffraction of light and by image sensors. In the past few years, many exciting super-resolution approaches have emerged that demonstrate intriguing ways to bypass the classical limit in optics and detectors. More and more research groups are engaged in the study of advanced super-resolution schemes, devices, algorithms, systems, and applications [1-6]. Super-resolution techniques involve new methods in science and engineering of optics [7,8], measurements [9,10], chemistry [11,12] and information [13,14]. Promising applications, particularly in biomedical research and semiconductor industry, have been successfully demonstrated.
75 FR 57537 - Sunshine Act; Notice of Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-21
..., 2010 9:30 a.m. Briefing on Security Issues (Closed--Ex. 1). * * * * * * The schedule for Commission..., 2010 1 p.m. Briefing on Resolution of Generic Safety Issue (GSI)--191, Assessment of Debris... 18, 2010 1:30 p.m. NRC All Employees Meeting (Public Meeting) Marriott Bethesda North Hotel, 5701...
Nuclear Issues: Strategies and Worksheets. Health Education.
ERIC Educational Resources Information Center
Lantieri, Linda; And Others
This document is designed to provide students with an opportunity to share feelings and clarify their own values given the facts about nuclear technology. Using this short curriculum, students should be able to: (1) explore their associations with nuclear issues; (2) analyze guidelines for conflict resolution in personal situations; (3) analyze…
ERIC Educational Resources Information Center
National Conference of State Legislatures, Denver, CO.
This summary of legislation, with a special focus on maternal and child health and primary care, describes nearly 600 laws and resolutions pertinent to these issues passed by the 50 states, the District of Columbia, and Puerto Rico in the 1996 legislative sessions. The summary includes health care reform and access issues, managed care and…
Adolescent Health Issues: State Actions 1992-1994.
ERIC Educational Resources Information Center
Savage, Melissa Hough
This publication summarizes approximately 300 laws and resolutions concerning adolescent health and related issues passed by the 50 states, and U.S. commonwealths and territories between 1992 and 1994. The state legislation reflects the health problems that, on the whole, involve preventable behavior. Brief descriptions of laws are provided…
Adolescent Health Issues: State Actions 1995.
ERIC Educational Resources Information Center
Savage, Melissa Hough; Ourada, Joanne
Many adolescents need basic health care and other services that address risky behaviors such as sexual activity, violence, alcohol and other drug abuse, and the consequences of those behaviors. This publication summarizes approximately 250 laws and resolutions concerning adolescent health and related issues passed by the 50 states and the District…
Adolescent Health Issues: State Actions 1997.
ERIC Educational Resources Information Center
Kendell, Nicole
Many adolescents need basic health care and other services that address risky behaviors such as sexual activity, violence, alcohol and drug abuse, and the consequences of these behaviors. This publication summarizes laws and resolutions on adolescent health issues passed in 1997 state and territory legislative sessions. No 1997 legislative session…
A Stock Approach to Value Debate.
ERIC Educational Resources Information Center
Colbert, Kent R.
Existing theories of value debating (resolutions dealing with values rather than policy) may be more effectively applied and developed when viewed as stock issues paradigms for debating values in competitive situations. Issues are vital to an advocate's cause because they are essential to the meaning of a proposition and can also provide an…
12 CFR 390.466 - Risk-based capital credit risk-weight categories.
Code of Federal Regulations, 2014 CFR
2014-01-01
...-sector entity; (J) Bonds issued by the Financing Corporation or the Resolution Funding Corporation; (K... country. (iii) 50 percent Risk Weight (Category 3). (A) Revenue bonds issued by any public-sector entity....g., industrial development bonds; (J) Debt securities not otherwise described in this section; (K...
Navy Ford (CVN-78) Class Aircraft Carrier Program: Background and Issues for Congress
2015-12-17
AP funding for the ship. Oversight issues for Congress for the CVN-78 program include the following: the potential impact on the CVN-78 program... Potential Impact of Continuing Resolution (CR) for FY2016... Overview... Impact on CVN-78 Program
Environmental Science Misconceptions--Resolution of an Anomaly.
ERIC Educational Resources Information Center
Groves, Fred H.; Pugh, Ava F.
This document reports on research on the ability of a short-term intervention to substantially increase elementary pre-service teacher knowledge of major environmental science issues. The study was conducted each semester over seven years. Student understanding of such issues as global warming, ozone depletion, and local groundwater problems was…
Exploration into technical procedures for vertical integration. [information systems
NASA Technical Reports Server (NTRS)
Michel, R. J.; Maw, K. D.
1979-01-01
Issues in the design and use of a digital geographic information system incorporating landuse, zoning, hazard, LANDSAT, and other data are discussed. An eleven layer database was generated. Issues in spatial resolution, registration, grid versus polygonal structures, and comparison of photointerpreted landuse to LANDSAT land cover are examined.
2009-06-23
failure in some cases to ensure that its exported products meet U.S. health and safety standards. Further complicating the bilateral economic...resolution cases against China to the WTO; and continuing pressure on China to appreciate its currency. Others have warned against using...Implementation Issues... Pending U.S. Cases Against China
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-29
... resolution of issues and challenges involving air transportation concepts, requirements, operational... impact the future Air Traffic Management System. This charter renewal will take effect on April 1, 2013... operational and technological issues that impact the Next Generation Air Transportation System (NextGen...
From atoms to layers: in situ gold cluster growth kinetics during sputter deposition
NASA Astrophysics Data System (ADS)
Schwartzkopf, Matthias; Buffet, Adeline; Körstgens, Volker; Metwalli, Ezzeldin; Schlage, Kai; Benecke, Gunthard; Perlich, Jan; Rawolle, Monika; Rothkirch, André; Heidmann, Berit; Herzog, Gerd; Müller-Buschbaum, Peter; Röhlsberger, Ralf; Gehrke, Rainer; Stribeck, Norbert; Roth, Stephan V.
2013-05-01
The adjustment of size-dependent catalytic, electrical and optical properties of gold cluster assemblies is a very significant issue in modern applied nanotechnology. We present a real-time investigation of the growth kinetics of gold nanostructures from small nuclei to a complete gold layer during magnetron sputter deposition with high time resolution by means of in situ microbeam grazing incidence small-angle X-ray scattering (μGISAXS). We specify the four-stage growth including their thresholds with sub-monolayer resolution and identify phase transitions monitored in Yoneda intensity as a material-specific characteristic. An innovative and flexible geometrical model enables the extraction of morphological real space parameters, such as cluster size and shape, correlation distance, layer porosity and surface coverage, directly from reciprocal space scattering data. This approach enables a large variety of future investigations of the influence of different process parameters on the thin metal film morphology. Furthermore, our study allows for deducing the wetting behavior of gold cluster films on solid substrates and provides a better understanding of the growth kinetics in general, which is essential for optimization of manufacturing parameters, saving energy and resources. Electronic supplementary information (ESI) available: The full GISAXS image sequence of the experiment, the model-based IsGISAXS-simulation sequence as movie files for comparison and detailed information about sample cleaning, XRR, FESEM, IsGISAXS, comparison μGIWAXS/μGISAXS, and sampling statistics. See DOI: 10.1039/c3nr34216f
The spiral ganglion: connecting the peripheral and central auditory systems
Nayagam, Bryony A; Muniak, Michael A; Ryugo, David K
2011-01-01
In mammals, the initial bridge between the physical world of sound and perception of that sound is established by neurons of the spiral ganglion. The cell bodies of these neurons give rise to peripheral processes that contact acoustic receptors in the organ of Corti, and the central processes collect together to form the auditory nerve that projects into the brain. In order to better understand hearing at this initial stage, we need to know the following about spiral ganglion neurons: (1) their cell biology, including cytoplasmic, cytoskeletal, and membrane properties; (2) their peripheral and central connections, including synaptic structure; (3) the nature of their neural signaling; and (4) their capacity for plasticity and rehabilitation. In this report, we update the progress on these topics and indicate important issues still awaiting resolution. PMID:21530629
NASA Astrophysics Data System (ADS)
Huang, Wei; Zhang, Xingnan; Li, Chenming; Wang, Jianying
Management of group decision-making is an important issue in water resource management and development. To overcome the lack of effective communication and cooperation in existing decision-making models, this paper proposes a multi-layer dynamic coordination model for group decision making in water resource allocation and scheduling. By introducing a scheme-recognized cooperative satisfaction index and a scheme-adjusted rationality index, the proposed model addresses the poor convergence of multi-round decision-making processes in water resource allocation and scheduling. Furthermore, conflicts in group decision making over limited resources can be resolved using a distance-based measure of conflict-resolution effectiveness. The simulation results show that the proposed model converges better than the existing models.
Army Command and Control Study-82 (ACCS-82). Volume III. Annexes.
1979-09-30
specific issues identified by the group as requiring resolution in order to accomplish the study objective (Vol I, Chap 7). Recommendations are organized...Volume I, have been approved with the following modifications: a. Organizational Issue 1. The Army Readiness and Mobilization Region concept, described...in Volume IV, is the approved organizational alternative. b. Organizational Issue 3. The activation of one additional CONUS headquarters is
Syntactic Constraints and Individual Differences in Native and Non-Native Processing of Wh-Movement
Johnson, Adrienne; Fiorentino, Robert; Gabriele, Alison
2016-01-01
There is a debate as to whether second language (L2) learners show qualitatively similar processing profiles as native speakers or whether L2 learners are restricted in their ability to use syntactic information during online processing. In the realm of wh-dependency resolution, research has examined whether learners, similar to native speakers, attempt to resolve wh-dependencies in grammatically licensed contexts but avoid positing gaps in illicit contexts such as islands. Also at issue is whether the avoidance of gap filling in islands is due to adherence to syntactic constraints or whether islands simply present processing bottlenecks. One approach has been to examine the relationship between processing abilities and the establishment of wh-dependencies in islands. Grammatical accounts of islands do not predict such a relationship as the parser should simply not predict gaps in illicit contexts. In contrast, a pattern of results showing that individuals with more processing resources are better able to establish wh-dependencies in islands could conceivably be compatible with certain processing accounts. In a self-paced reading experiment which examines the processing of wh-dependencies, we address both questions, examining whether native English speakers and Korean learners of English show qualitatively similar patterns and whether there is a relationship between working memory, as measured by counting span and reading span, and processing in both island and non-island contexts. The results of the self-paced reading experiment suggest that learners can use syntactic information on the same timecourse as native speakers, showing qualitative similarity between the two groups. Results of regression analyses did not reveal a significant relationship between working memory and the establishment of wh-dependencies in islands but we did observe significant relationships between working memory and the processing of licit wh-dependencies. 
As the contexts in which these relationships emerged differed for learners and native speakers, our results call for further research examining individual differences in dependency resolution in both populations. PMID:27148152
Data Mining and Optimization Tools for Developing Engine Parameters Tools
NASA Technical Reports Server (NTRS)
Dhawan, Atam P.
1998-01-01
This project was awarded for understanding the problem and developing a plan for data mining tools for use in designing and implementing an Engine Condition Monitoring System. From the total budget of $5,000, Tricia and I studied the problem domain for developing an Engine Condition Monitoring system using the sparse and non-standardized datasets to be made available through a consortium at NASA Lewis Research Center. We visited NASA three times to discuss additional issues related to the dataset which was not made available to us. We discussed and developed a general framework of data mining and optimization tools to extract useful information from sparse and non-standard datasets. These discussions led to the training of Tricia Erhardt to develop Genetic Algorithm (GA) based search programs, written in C++, which were used to demonstrate the capability of a GA to search for an optimal solution in noisy datasets. From the study and discussions with NASA LeRC personnel, we then prepared a proposal, being submitted to NASA, for future work on the development of data mining algorithms for engine condition monitoring. The proposed set of algorithms uses wavelet processing to create a multi-resolution pyramid of the data for GA-based multi-resolution optimal search. Wavelet processing is proposed to create a coarse-resolution representation of the data, providing two advantages in a GA-based search: 1. We will have less data to begin with to form search sub-spaces. 2. The search will be robust against noise, because at every level of wavelet-based decomposition the signal is decomposed through low-pass and high-pass filters.
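The coarse-to-fine idea behind the proposal (search a wavelet-smoothed, downsampled version of the data first, then refine) can be sketched as follows. This is a simplified illustration, not the proposed C++ implementation: a Haar-style pair-averaging low-pass stands in for full wavelet decomposition, and a plain window search stands in for the GA at each level; all names are hypothetical.

```python
import random

def haar_lowpass(signal):
    """One level of Haar-style low-pass: average adjacent pairs."""
    return [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]

def coarse_to_fine_argmax(signal, levels=3, radius=2):
    """Locate the maximum via a multi-resolution pyramid.

    The coarsest level smooths noise and shrinks the search space;
    each finer level re-searches only a small window around the
    coarse-level hit (where the proposal would run its GA per level).
    """
    pyramid = [signal]
    for _ in range(levels):
        pyramid.append(haar_lowpass(pyramid[-1]))
    idx = max(range(len(pyramid[-1])), key=pyramid[-1].__getitem__)
    for level in reversed(pyramid[:-1]):
        lo = max(0, 2 * idx - radius)
        hi = min(len(level), 2 * idx + radius + 2)
        idx = max(range(lo, hi), key=level.__getitem__)
    return idx

random.seed(0)
# noisy signal with a broad peak centered near index 300
data = [random.gauss(0, 0.3) + max(0.0, 1 - abs(i - 300) / 50) for i in range(1024)]
peak = coarse_to_fine_argmax(data)
print(250 <= peak <= 350)
```

The two advantages listed in the abstract show up directly: the coarsest level has one eighth of the samples, and the pair averaging suppresses the additive noise before any search is attempted.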
SMAP RADAR Calibration and Validation
NASA Astrophysics Data System (ADS)
West, R. D.; Jaruwatanadilok, S.; Chaubel, M. J.; Spencer, M.; Chan, S. F.; Chen, C. W.; Fore, A.
2015-12-01
The Soil Moisture Active Passive (SMAP) mission launched on Jan 31, 2015. The mission employs L-band radar and radiometer measurements to estimate soil moisture with 4% volumetric accuracy at a resolution of 10 km, and freeze-thaw state at a resolution of 1-3 km. Immediately following launch, there was a three month instrument checkout period, followed by six months of level 1 (L1) calibration and validation. In this presentation, we will discuss the calibration and validation activities and results for the L1 radar data. Early SMAP radar data were used to check commanded timing parameters, and to work out issues in the low- and high-resolution radar processors. From April 3-13 the radar collected receive only mode data to conduct a survey of RFI sources. Analysis of the RFI environment led to a preferred operating frequency. The RFI survey data were also used to validate noise subtraction and scaling operations in the radar processors. Normal radar operations resumed on April 13. All radar data were examined closely for image quality and calibration issues which led to improvements in the radar data products for the beta release at the end of July. Radar data were used to determine and correct for small biases in the reported spacecraft attitude. Geo-location was validated against coastline positions and the known positions of corner reflectors. Residual errors at the time of the beta release are about 350 m. Intra-swath biases in the high-resolution backscatter images are reduced to less than 0.3 dB for all polarizations. Radiometric cross-calibration with Aquarius was performed using areas of the Amazon rain forest. Cross-calibration was also examined using ocean data from the low-resolution processor and comparing with the Aquarius wind model function. Using all a-priori calibration constants provided good results with co-polarized measurements matching to better than 1 dB, and cross-polarized measurements matching to about 1 dB in the beta release. 
During the second half of the L1 cal/val period, the RFI removal algorithm will be tuned for optimal performance, and the Faraday rotation corrections used in radar processing will be further developed and validated. This work is supported by the SMAP project at the Jet Propulsion Laboratory, California Institute of Technology.
NASA Astrophysics Data System (ADS)
Beskardes, G. D.; Hole, J. A.; Wang, K.; Wu, Q.; Chapman, M. C.; Davenport, K. K.; Michaelides, M.; Brown, L. D.; Quiros, D. A.
2016-12-01
Back-projection imaging has recently become a practical method for local earthquake detection and location due to the deployment of densely sampled, continuously recorded, local seismograph arrays. Back-projection is scalable to earthquakes with a wide range of magnitudes from very tiny to very large. Local dense arrays provide the opportunity to capture very tiny events for a range of applications, such as tectonic microseismicity, source scaling studies, wastewater injection-induced seismicity, hydraulic fracturing, CO2 injection monitoring, volcano studies, and mining safety. While back-projection sometimes utilizes the full seismic waveform, the waveforms are often pre-processed to overcome imaging issues. We compare the performance of back-projection using four previously used data pre-processing methods: full waveform, envelope, short-term averaging / long-term averaging (STA/LTA), and kurtosis. The goal is to identify an optimized strategy for an entirely automated imaging process that is robust in the presence of real-data issues, has the lowest signal-to-noise thresholds for detection and for location, has the best spatial resolution of the energy imaged at the source, preserves magnitude information, and considers computational cost. Real data issues include aliased station spacing, low signal-to-noise ratio (to <1), large noise bursts and spatially varying waveform polarity. For evaluation, the four imaging methods were applied to the aftershock sequence of the 2011 Virginia earthquake as recorded by the AIDA array with 200-400 m station spacing. These data include earthquake magnitudes from -2 to 3 with highly variable signal to noise, spatially aliased noise, and large noise bursts: realistic issues in many environments. Each of the four back-projection methods has advantages and disadvantages, and a combined multi-pass method achieves the best of all criteria. Preliminary imaging results from the 2011 Virginia dataset will be presented.
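Of the pre-processing methods compared, STA/LTA is the most widely used in event detection. A minimal sliding-window version (an illustrative sketch, not the authors' implementation) looks like this:

```python
import random

def sta_lta(trace, nsta, nlta):
    """Sliding-window STA/LTA characteristic function.

    Ratio of the short-term to long-term average of squared amplitude;
    a transient burst drives the ratio well above its background of ~1.
    """
    power = [x * x for x in trace]
    cf = [0.0] * len(trace)
    for i in range(nlta, len(trace)):
        sta = sum(power[i - nsta:i]) / nsta
        lta = sum(power[i - nlta:i]) / nlta
        cf[i] = sta / lta if lta > 0 else 0.0
    return cf

random.seed(1)
trace = [random.gauss(0, 1) for _ in range(500)]
for i in range(300, 320):  # inject a small transient "event" into the noise
    trace[i] += 6.0
cf = sta_lta(trace, nsta=10, nlta=100)
peak_i = cf.index(max(cf))
print(300 <= peak_i <= 340)  # the characteristic function peaks at the event
```

Back-projecting a characteristic function like this, rather than the raw waveform, is insensitive to the polarity variations mentioned above, at the cost of broadening the imaged energy at the source.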
Issues and Strategies in Solving Multidisciplinary Optimization Problems
NASA Technical Reports Server (NTRS)
Patnaik, Surya
2013-01-01
Optimization research at NASA Glenn Research Center has addressed the design of structures, aircraft and airbreathing propulsion engines. The accumulated multidisciplinary design activity is collected under a testbed entitled COMETBOARDS. Several issues were encountered during the solution of the problems. Four issues and the strategies adopted for their resolution are discussed. This is followed by a discussion on analytical methods that is limited to structural design applications. An optimization process can lead to an inefficient local solution. This deficiency was encountered during design of an engine component. The limitation was overcome through an augmentation of animation into optimization. Optimum solutions obtained were infeasible for aircraft and airbreathing propulsion engine problems. Alleviation of this deficiency required a cascading of multiple algorithms. Profile optimization of a beam produced an irregular shape. Engineering intuition restored the regular shape for the beam. The solution obtained for a cylindrical shell by a subproblem strategy converged to a design that can be difficult to manufacture. Resolution of this issue remains a challenge. The issues and resolutions are illustrated through a set of problems: Design of an engine component, Synthesis of a subsonic aircraft, Operation optimization of a supersonic engine, Design of a wave-rotor-topping device, Profile optimization of a cantilever beam, and Design of a cylindrical shell. This chapter provides a cursory account of the issues. Cited references provide detailed discussion on the topics. A design can also be generated by the traditional method and by the stochastic design concept. Merits and limitations of the three methods (traditional method, optimization method and stochastic concept) are illustrated. In the traditional method, the constraints are manipulated to obtain the design, and the weight is back-calculated.
In design optimization, the weight of a structure becomes the merit function, with constraints imposed on failure modes, and an optimization algorithm is used to generate the solution. The stochastic design concept accounts for uncertainties in loads, material properties, and other parameters; the solution is obtained by solving a design optimization problem for a specified reliability. Acceptable solutions can be produced by all three methods. The variation in the weight calculated by the methods was found to be modest. Some variation was noticed in designs calculated by the methods. The variation may be attributed to structural indeterminacy. It is prudent to develop a design by all three methods prior to its fabrication. The traditional design method can be improved when the simplified sensitivities of the behavior constraint are used. Such sensitivity can reduce design calculations and may have the potential to unify the traditional and optimization methods. Weight versus reliability traced out an inverted-S-shaped graph. The center of the graph corresponded to the mean-valued design. A heavy design with weight approaching infinity could be produced for a near-zero rate of failure. Weight can be reduced to a small value for the most failure-prone design. Probabilistic modeling of load and material properties remained a challenge.
Opposite effects of capacity load and resolution load on distractor processing.
Zhang, Weiwei; Luck, Steven J
2015-02-01
According to the load theory of attention, an increased perceptual load reduces distractor processing whereas an increased working memory load facilitates distractor processing. Here we raise the possibility that the critical distinction may instead be between an emphasis on resolution and an emphasis on capacity. That is, perceptual load manipulations typically emphasize resolution (fine-grained discriminations), whereas working memory load manipulations typically emphasize capacity (simultaneous processing of multiple relevant stimuli). To test the plausibility of this hypothesis, we used a visual working memory task that emphasized either the number of items to be stored (capacity load, retaining 2 vs. 4 colors) or the precision of the representations (resolution load, detecting small vs. large color changes). We found that an increased capacity load led to increased flanker interference (a measure of distractor processing), whereas an increased resolution load led to reduced flanker interference. These opposite effects of capacity load and resolution load on distractor processing mirror the previously described opposite effects of perceptual load and working memory load.
Antommaria, Armand H Matheny
2004-01-01
The American Society for Bioethics and Humanities debated for several years about whether it should adopt positions and, if so, on what range of issues. The membership recently approved an amendment to its bylaws permitting the Society to adopt positions on matters related to academic freedom and professionalism but not on substantive moral and policy issues. This resolution is problematic for a number of reasons, including the lack of a categorical difference between these types of claims and the Society's inability to speak on behalf of patients and research subjects. The implementation of the amendment also raises several issues. The Society will need to refrain from speaking too specifically and to articulate the responsibilities of its members. If the Society fails to address these concerns, it runs the risk of denigrating its public image and that of the profession.
Chiarello, Elizabeth
2013-12-01
Social science studies of bioethics demonstrate that ethics are highly contextual, functioning differently across local settings as actors make daily decisions "on the ground." Sociological studies that demonstrate the key role organizations play in shaping ethical decision-making have disproportionately focused on physicians and nurses working in hospital settings where they contend with life and death issues. This study broadens our understanding of the contexts of ethical decision-making by empirically examining understudied healthcare professionals - pharmacists - working in two organizational settings, retail and hospital, where they act as gatekeepers to regulated goods and services as they contend with ethical issues ranging from the serious to the mundane. This study asks: How do organizations shape pharmacists' identification, negotiation, and resolution of ethical challenges; in other words, how do organizations shape pharmacists' gatekeeping processes? Based on 95 semi-structured interviews with U.S. pharmacists practicing in retail and hospital pharmacies conducted between September 2009 and May 2011, this research finds that organizations influence ethical decision-making by shaping how pharmacists construct four gatekeeping processes: medical, legal, fiscal, and moral. Each gatekeeping process manifests differently across organizations due to how these settings structure inter-professional power dynamics, proximity to patients, and means of accessing information. Findings suggest new directions for theorizing about ethical decision-making in medical contexts by drawing attention to new ethical actors, new organizational settings, an expanded definition of ethical challenges, and a broader conceptualization of gatekeeping.
Classification of vegetation types in military region
NASA Astrophysics Data System (ADS)
Gonçalves, Miguel; Silva, Jose Silvestre; Bioucas-Dias, Jose
2015-10-01
In the decision-making process regarding planning and execution of military operations, the terrain is a determining factor. Aerial photographs are a source of vital information for the success of an operation in a hostile region, namely when the cartographic information behind enemy lines is scarce or non-existent. The objective of the present work is the development of a tool capable of processing aerial photos. The methodology implemented starts with feature extraction, followed by the application of an automatic feature selector. The next step, using the k-fold cross-validation technique, estimates the input parameters for the following classifiers: Sparse Multinomial Logistic Regression (SMLR), K Nearest Neighbor (KNN), Linear Classifier using Principal Component Expansion on the Joint Data (PCLDC) and Multi-Class Support Vector Machine (MSVM). These classifiers were used in two different studies with distinct objectives: discrimination of vegetation density and identification of vegetation's main components. It was found that the best classifier in the first approach is the Sparse Multinomial Logistic Regression (SMLR). In the second approach, the implemented methodology applied to high resolution images showed that the better performance was achieved by the KNN classifier and PCLDC. Comparing the two approaches reveals a multiscale issue: at different resolutions, the best solution to the problem requires different classifiers and the extraction of different features.
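The k-fold cross-validation step used here to estimate classifier parameters can be sketched generically. The snippet below is an illustrative stand-in, not the authors' pipeline: synthetic two-class "vegetation" data and a plain kNN majority vote, with held-out accuracy averaged over interleaved folds to pick k.

```python
import random
from collections import Counter

def knn_predict(train, x, k):
    """Majority vote among the k nearest training points (Euclidean)."""
    nearest = sorted(train, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))
    return Counter(label for _, label in nearest[:k]).most_common(1)[0][0]

def kfold_accuracy(data, k_neighbors, folds=5):
    """Mean held-out accuracy over `folds` interleaved splits."""
    correct = 0
    for f in range(folds):
        test = data[f::folds]  # every folds-th sample is held out
        train = [d for i, d in enumerate(data) if i % folds != f]
        correct += sum(knn_predict(train, x, k_neighbors) == y for x, y in test)
    return correct / len(data)

random.seed(2)
# two synthetic, well-separated "vegetation" classes in a 2-D feature space
data = [((random.gauss(0, 1), random.gauss(0, 1)), "sparse") for _ in range(60)]
data += [((random.gauss(3, 1), random.gauss(3, 1)), "dense") for _ in range(60)]
random.shuffle(data)

best_k = max([1, 3, 5, 7], key=lambda k: kfold_accuracy(data, k))
print(kfold_accuracy(data, best_k) > 0.9)
```

The same loop structure applies whichever classifier is plugged in; only the candidate parameter grid changes (e.g. regularization strength for SMLR or the kernel parameters for MSVM).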
Nasica-Labouze, Jessica; Meli, Massimiliano; Derreumaux, Philippe; Colombo, Giorgio; Mousseau, Normand
2011-01-01
The self-organization of peptides into amyloidogenic oligomers is one of the key events for a wide range of molecular and degenerative diseases. Atomic-resolution characterization of the mechanisms responsible for the aggregation process and the resulting structures is thus a necessary step to improve our understanding of the determinants of these pathologies. To address this issue, we combine the accelerated sampling properties of replica exchange molecular dynamics simulations based on the OPEP coarse-grained potential with the atomic-resolution description of interactions provided by all-atom MD simulations, and investigate the oligomerization process of the GNNQQNY peptide for three system sizes: 3-mers, 12-mers and 20-mers. Results from our integrated simulations show a rich variety of structural arrangements for aggregates of all sizes. Elongated fibril-like structures can form transiently in the 20-mer case, but they are not stable and easily interconvert into more globular and disordered forms. Our extensive characterization of the intermediate structures and their physico-chemical determinants points to a high degree of polymorphism for the GNNQQNY sequence that can be reflected at the macroscopic scale. Detailed mechanisms and structures that underlie amyloid aggregation are also provided. PMID:21625573
Ambiguity of Quality in Remote Sensing Data
NASA Technical Reports Server (NTRS)
Lynnes, Christopher; Leptoukh, Greg
2010-01-01
This slide presentation reviews some of the issues in quality of remote sensing data. Data "quality" is used in several different contexts in remote sensing data, with quite different meanings. At the pixel level, quality typically refers to a quality control process exercised by the processing algorithm, not an explicit declaration of accuracy or precision. File-level quality is usually a statistical summary of the pixel-level quality but is of doubtful use for scenes covering large areal extents. Quality at the dataset or product level, on the other hand, usually refers to how accurately the dataset is believed to represent the physical quantities it purports to measure. This assessment often bears, at best, only an indirect relationship to pixel-level quality. In addition to ambiguity at different levels of granularity, ambiguity is endemic within levels. Pixel-level quality terms vary widely, as do recommendations for use of these flags. At the dataset/product level, quality for low-resolution gridded products is often extrapolated from validation campaigns using high spatial resolution swath data, a suspect practice at best. Making use of quality at all levels is complicated by its dependence on application needs. We will present examples of the various meanings of quality in remote sensing data and possible ways forward toward a more unified and usable quality framework.
The Bermuda Bio-Optics Program (BBOP). Chapter 16
NASA Technical Reports Server (NTRS)
Siegel, David A.
2001-01-01
The Bermuda Bio-Optics Project (BBOP) is a collaborative effort between the Institute for Computational Earth System Science (ICESS) at the University of California at Santa Barbara (UCSB) and the Bermuda Biological Station for Research (BBSR). This research program is designed to characterize light availability and utilization in the Sargasso Sea, and to provide an optical link by which biogeochemical observations may be used to evaluate bio-optical models for pigment concentration, primary production, and sinking particle fluxes from satellite-based ocean color sensors. The BBOP time-series was initiated in 1992, and is carried out in conjunction with the US JGOFS Bermuda Atlantic Time-series Study (BATS) at the Bermuda Biological Station for Research. The BATS program itself has been observing biogeochemical processes (primary productivity, particle flux, and elemental cycles) in the mesotrophic waters of the Sargasso Sea since 1988. Closely affiliated with BBOP and BATS is a separate NASA-funded study of the spatial variability of biogeochemical processes in the Sargasso Sea using high-resolution Advanced Very High Resolution Radiometer (AVHRR) and Sea-Viewing Wide Field-of-view Sensor (SeaWiFS) data collected at Bermuda. The collaboration between BATS and BBOP measurements has resulted in a unique data set that addresses not only the SIMBIOS goals but also the broader issues of important factors controlling the carbon cycle.
ERIC Educational Resources Information Center
Danielson, Leon E.; Garber, Simon K.
The extension educator in public policy education and alternative dispute resolution (ADR) has many roles from which to choose. These include information provider, technical advisor, convener, facilitator and program developer. The increased importance of issues programming and the increased priority given to measurement of results are creating…
Session of the General Assembly of IUCN (15th, Christchurch, New Zealand, October 11-23, 1981).
ERIC Educational Resources Information Center
International Union for Conservation of Nature and Natural Resources, Morges, (Switzerland).
Resolutions adopted by the 15th session of the General Assembly of the International Union for Conservation of Nature and Natural Resources (IUCN) are provided in this document. These resolutions focus on areas/issues related to: (1) world conservation strategy; (2) conservation and peace; (3) people, resources, and environment; (4) environmental…
Resolution of seven-axis manipulator redundancy: A heuristic issue
NASA Technical Reports Server (NTRS)
Chen, I.
1990-01-01
An approach is presented for the resolution of the redundancy of a seven-axis manipulator arm from the AI and expert systems point of view. The approach is heuristic and analytical, and it resolves the redundancy globally at the position level. When compared with other approaches, this approach offers several improved performance capabilities, including singularity avoidance, repeatability, stability, and simplicity.
ERIC Educational Resources Information Center
Wagner, David L.
Designed to serve as a framework from which high school debate students, coaches, and judges can evaluate the issues, arguments, and evidence present in sustaining and reforming the U.S. justice system, this booklet provides debaters with guidelines for research on the 1983-84 debate resolutions selected by the National University Continuing…
ERIC First Analysis: Agricultural Policy. 1986-87 National High School Debate Resolutions.
ERIC Educational Resources Information Center
Wagner, David L.; Fraleigh, Douglas
Designed to serve as a framework in which high school debate students, coaches, and judges can evaluate the issues, arguments, and evidence concerning which agricultural policies best serve the United States, this booklet provides guidelines for research on the 1986-87 debate resolutions selected by the National Federation of State High School…
The Professional Educator: Why Supporting Latino Children and Families Is Union Work
ERIC Educational Resources Information Center
Fortino, Catalina R.
2017-01-01
This article discusses the creation and promise of the American Federation of Teachers (AFT) resolution "¡Si Se Puede!: Improving Outcomes for Latino Children and Youth and Addressing the Needs of the Latino Community." The resolution affirms the AFT's commitment to elevating the importance of Latino issues. As a union committed to…
ERIC First Analysis: Water Resources; 1985-86 National High School Debate Resolutions.
ERIC Educational Resources Information Center
Wagner, David L.; Fraleigh, Douglas
Designed to serve as a framework from which high school debate students, coaches, and judges can evaluate the issues, arguments and evidence present in the availability and quality of water resources in the United States, this booklet provides guidelines for research on the 1985-86 debate resolutions selected by the National Federation of State…
75 FR 57233 - 340B Drug Pricing Program Administrative Dispute Resolution Process
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-20
... Dispute Resolution Process AGENCY: Health Resources and Services Administration, HHS. ACTION: Advance...) to promulgate regulations to establish and implement an administrative dispute resolution process for... does not currently refer to HRSA's plan on how it will resolve any decision made through the new...
NASA Astrophysics Data System (ADS)
Castagnetti, C.; Dubbini, M.; Ricci, P. C.; Rivola, R.; Giannini, M.; Capra, A.
2017-05-01
The new era of designing in architecture and civil engineering applications lies in the Building Information Modeling (BIM) approach, based on a 3D geometric model including a 3D database. This is easier for new constructions; when dealing with existing buildings, the creation of the BIM relies on accurate knowledge of the as-built construction. Such knowledge is provided by a 3D survey, often carried out with laser scanning technology or modern photogrammetry, which can guarantee an adequate point cloud in terms of resolution and completeness while balancing time consumption and cost against the required final accuracy. The BIM approach for existing buildings, and even more so for historical buildings, is not yet a well-known and deeply discussed process. There are still several choices to be addressed in the process from the survey to the model, and critical issues to be discussed in the modeling step, particularly when dealing with unconventional elements such as deformed geometries or historical elements. The paper describes a comprehensive workflow that goes through the survey and the modeling, allowing one to focus on critical issues and key points for obtaining a reliable BIM of an existing monument. The case study employed to illustrate the workflow is the Basilica of St. Stefano in Bologna (Italy), a large monumental complex with great religious, historical and architectural assets.
NEET In-Pile Ultrasonic Sensor Enablement-FY 2012 Status Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
JE Daw; JL Rempe; BR Tittmann
2012-09-01
Several Department of Energy-Nuclear Energy (DOE-NE) programs, such as the Fuel Cycle Research and Development, Advanced Reactor Concepts, Light Water Reactor Sustainability, and Next Generation Nuclear Plant programs, are investigating new fuels and materials for advanced and existing reactors. A key objective of such programs is to understand the performance of these fuels and materials when irradiated. The Nuclear Energy Enabling Technology (NEET) Advanced Sensors and Instrumentation (ASI) in-pile instrumentation development activities are focused upon addressing cross-cutting needs for DOE-NE irradiation testing by providing higher fidelity, real-time data, with increased accuracy and resolution, from smaller, compact sensors that are less intrusive. Ultrasonic technologies offer the potential to measure a range of parameters, including geometry changes, temperature, crack initiation and growth, gas pressure and composition, and microstructural changes, under harsh irradiation test conditions. There are two primary issues associated with in-pile deployment of ultrasonic sensors. The first is transducer survivability. The ability of ultrasonic transducer materials to maintain their useful properties during an irradiation must be demonstrated. The second issue is signal processing. Ultrasonic testing is typically performed in a lab or field environment, where the sensor and sample are accessible. Due to the harsh nature of in-pile testing, and the range of measurements that are desired, an enhanced signal processing capability is needed to make in-pile ultrasonic sensors viable. This project addresses these technology deployment issues.
Interference Resolution in Emotional Working Memory as a Function of Alexithymia.
Colligan, Sean M; Koven, Nancy S
2015-01-01
Although alexithymia is recognized as a set of traitlike deficits in emotion processing, research suggests there are concomitant cognitive issues as well, including what appears to be an unusual pattern of enhanced working memory (WM) despite broader executive dysfunction. It is unknown whether this enhancement includes the executive elements of WM and whether executive control of WM in alexithymia differs for emotional and neutral stimuli. This study examined how alexithymia moderates patterns of interference resolution in WM with valenced and nonvalenced stimuli. Participants (N = 93) completed the Toronto Alexithymia Scale and a recency probes WM task containing positive, negative, and neutral stimuli, with some trials containing proactive interference from previous trials. The reaction time difference between interference and noninterference trials indexed degree of interference resolution. Toronto Alexithymia Scale score moderated a within-subject effect such that, when valenced probes were used, there was less proactive interference in the positive relative to negative valence condition; this valence-based interference discrepancy was significant for a subset of highly alexithymic participants. Alexithymia did not moderate proactive interference to negative or neutral stimuli or accuracy of responses. These results suggest that, although alexithymia does not influence executive control in WM for nonemotional items, alexithymic people demonstrate an idiosyncratic response to positive stimuli that might indicate blunted reactivity.
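The interference index used here is simply the mean reaction-time difference between interference and no-interference trials, with larger values indicating poorer interference resolution. A minimal sketch, with invented reaction times in milliseconds:

```python
# Compute the proactive-interference index described in the abstract:
# mean RT on interference trials minus mean RT on matched control trials.
def interference_index(rt_interference, rt_control):
    mean = lambda xs: sum(xs) / len(xs)
    return mean(rt_interference) - mean(rt_control)
```

In the study this difference score is computed per valence condition (positive, negative, neutral) and then related to Toronto Alexithymia Scale scores.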
Resist characteristics with direct-write electron beam and SCALPEL exposure system
NASA Astrophysics Data System (ADS)
Sato, Mitsuru; Omori, Katsumi; Ishikawa, Kiyoshi; Nakayama, Toshimasa; Novembre, Anthony E.; Ocola, Leonidas E.
1999-06-01
High acceleration voltage electron beam exposure is one of the possible candidates for post-optical lithography. The use of electrons, instead of photons, avoids optics-related problems such as standing-wave issues. However, resists must conform to certain needs of the SCALPEL system, such as exposure in a vacuum chamber with 100 kV electron beams. Taking into account the challenging requirements of high resolution, high sensitivity, low bake dependency and no outgassing, TOK has been able to develop resists that meet most of the SCALPEL system needs. However, due to the nature of chemical amplification and the PEB dependency, which varies for different features as is the case with DUV resists, we must recommend different resists for different feature types such as dense lines, isolated lines and contact holes. TOK has designed an electron beam negative resist, EN-009, which demonstrates 100 nm pattern resolution. The dose to print on the SCALPEL system is 5.0 µC/cm2. The electron beam positive resist, EP-004M, has been designed for line-and-space patterns. The dose to print on the SCALPEL system is 8.25 µC/cm2. The processing conditions are standard, using 0.26 N developer. These are the lowest exposure energies reported to date for similar resolution on this exposure tool.
Optimum Image Formation for Spaceborne Microwave Radiometer Products.
Long, David G; Brodzik, Mary J
2016-05-01
This paper considers some of the issues of radiometer brightness image formation and reconstruction for use in the NASA-sponsored Calibrated Passive Microwave Daily Equal-Area Scalable Earth Grid 2.0 Brightness Temperature Earth System Data Record project, which generates a multisensor multidecadal time series of high-resolution radiometer products designed to support climate studies. Two primary reconstruction algorithms are considered: the Backus-Gilbert (BG) approach and the radiometer form of the scatterometer image reconstruction (SIR) algorithm. These are compared with the conventional drop-in-the-bucket (DIB) gridded image formation approach. Tradeoff study results for the various algorithm options are presented to select optimum values for the grid resolution, the number of SIR iterations, and the BG gamma parameter. We find that although both approaches are effective in improving the spatial resolution of the surface brightness temperature estimates compared to DIB, SIR requires significantly less computation. The sensitivity of the reconstruction to the accuracy of the measurement spatial response function (MRF) is explored. The partial reconstruction used by these methods can tolerate errors in the description of the sensor measurement response function, which simplifies the processing of historic sensor data, for which the MRF is not as well known as it is for modern sensors. Simulation tradeoff results are confirmed using actual data.
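The baseline drop-in-the-bucket (DIB) approach the paper compares against is straightforward: each measurement is averaged into whichever grid cell contains it. A minimal sketch, with an invented grid geometry and made-up brightness temperatures:

```python
# Drop-in-the-bucket gridding: average every (x, y, Tb) measurement into
# the grid cell containing it. Grid layout and sample values are invented.
def dib_grid(measurements, cell_size, nx, ny):
    """measurements: iterable of (x, y, brightness_temperature) tuples."""
    total = [[0.0] * nx for _ in range(ny)]
    count = [[0] * nx for _ in range(ny)]
    for x, y, tb in measurements:
        i, j = int(y // cell_size), int(x // cell_size)  # row, column indices
        if 0 <= i < ny and 0 <= j < nx:
            total[i][j] += tb
            count[i][j] += 1
    # Cells with no measurements are left as None.
    return [[total[i][j] / count[i][j] if count[i][j] else None
             for j in range(nx)] for i in range(ny)]
```

Reconstruction methods such as SIR and Backus-Gilbert instead exploit the measurement spatial response function to sharpen the gridded estimate; DIB ignores the MRF entirely, which is why it serves as the baseline.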
G.A.M.E.: GPU-accelerated mixture elucidator.
Schurz, Alioune; Su, Bo-Han; Tu, Yi-Shu; Lu, Tony Tsung-Yu; Lin, Olivia A; Tseng, Yufeng J
2017-09-15
GPU acceleration is useful in solving complex chemical information problems. Identifying unknown structures from the mass spectra of natural product mixtures has been a desirable yet unresolved issue in metabolomics. However, this elucidation process has been hampered by complex experimental data and the inability of instruments to completely separate different compounds. Fortunately, with current high-resolution mass spectrometry, one feasible strategy is to define this problem as extending a scaffold database with sidechains of different probabilities to match the high-resolution mass obtained from a high-resolution mass spectrum. By introducing a dynamic programming (DP) algorithm, it is possible to solve this NP-complete problem in pseudo-polynomial time. However, the running time of the DP algorithm grows by orders of magnitude as the number of mass decimal digits increases, thus limiting the boost in structural prediction capabilities. By harnessing the heavily parallel architecture of modern GPUs, we designed a "compute unified device architecture" (CUDA)-based GPU-accelerated mixture elucidator (G.A.M.E.) that considerably improves the performance of the DP, allowing up to five decimal digits for input mass data. As exemplified by four testing datasets with verified constitutions from natural products, G.A.M.E. allows for efficient and automatic structural elucidation of unknown mixtures for practical procedures.
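The pseudo-polynomial behaviour described here can be illustrated with a toy version of the dynamic program: scale masses by 10^digits into integers, then fill a coin-change-style table counting sidechain combinations that reach the residual mass. The masses below are invented, and the table size (and hence running time) grows with the number of decimal digits kept, which is exactly the bottleneck the GPU parallelism addresses:

```python
# Toy pseudo-polynomial DP: count combinations of sidechain masses that
# sum to the residual mass (observed mass minus scaffold mass), after
# scaling all masses to integers at a chosen decimal precision.
def count_matches(sidechain_masses, residual_mass, digits):
    scale = 10 ** digits
    coins = [round(m * scale) for m in sidechain_masses]
    target = round(residual_mass * scale)
    ways = [0] * (target + 1)  # table size, and running time, grow with `digits`
    ways[0] = 1
    for c in coins:            # unbounded, order-insensitive combinations
        for t in range(c, target + 1):
            ways[t] += ways[t - c]
    return ways[target]
```

G.A.M.E.'s contribution is parallelizing this kind of table fill on the GPU so that up to five decimal digits (a table five orders of magnitude larger per unit mass) remains tractable.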
Essential methodological considerations when using grounded theory.
Achora, Susan; Matua, Gerald Amandu
2016-07-01
The aim of this paper is to suggest important methodological considerations when using grounded theory. Grounded theory, at the centre of which is theory construction, is a research method widely used in nursing research. However, researchers still struggle with some of its methodological issues. Although grounded theory is widely used to study and explain issues in nursing practice, many researchers are still failing to adhere to its rigorous standards. Researchers should articulate the focus of their investigations - the substantive area of interest as well as the focal population. This should be followed by a succinct explanation of the strategies used to collect and analyse data, supported by clear coding processes. Finally, the resolution of the core issues, including the core category and related categories, should be explained to advance readers' understanding. Researchers should endeavour to understand the tenets of grounded theory. This enables 'neophytes' in particular to make methodological decisions that will improve their studies' rigour and fit with grounded theory. This paper complements the current dialogue on improving the understanding of grounded theory methodology in nursing research. The paper also suggests important procedural decisions researchers need to make to preserve their studies' scientific merit and fit with grounded theory.
Principles of Safety Pharmacology
Pugsley, M K; Authier, S; Curtis, M J
2008-01-01
Safety Pharmacology is a rapidly developing discipline that uses the basic principles of pharmacology in a regulatory-driven process to generate data to inform risk/benefit assessment. The aim of Safety Pharmacology is to characterize the pharmacokinetic/pharmacodynamic (PK/PD) relationship of a drug's adverse effects using continuously evolving methodology. Unlike toxicology, Safety Pharmacology includes within its remit a regulatory requirement to predict the risk of rare lethal events. This gives Safety Pharmacology its unique character. The key issues for Safety Pharmacology are detection of an adverse effect liability, projection of the data into safety margin calculation and finally clinical safety monitoring. This article sets out to explain the drivers for Safety Pharmacology so that the wider pharmacology community is better placed to understand the discipline. It concludes with a summary of principles that may help inform future resolution of unmet needs (especially establishing model validation for accurate risk assessment). Subsequent articles in this issue of the journal address specific aspects of Safety Pharmacology, exploring the issues of model choice and the burden of proof, and highlighting areas of intensive activity, such as testing for drug-induced rare-event liability and the challenge of testing the safety of so-called biologics (antibodies, gene therapy and so on). PMID:18604233
Immersion lithography defectivity analysis at DUV inspection wavelength
NASA Astrophysics Data System (ADS)
Golan, E.; Meshulach, D.; Raccah, N.; Yeo, J. Ho.; Dassa, O.; Brandl, S.; Schwarz, C.; Pierson, B.; Montgomery, W.
2007-03-01
Significant effort has been directed in recent years towards the realization of immersion lithography at 193nm wavelength. Immersion lithography is likely a key enabling technology for the production of critical layers for 45nm and 32nm design rule (DR) devices. In spite of the significant progress in immersion lithography technology, several key technology issues remain, a critical one being immersion lithography process-induced defects. The benefits in optical resolution and depth of focus made possible by immersion lithography are well understood. Yet these benefits cannot come at the expense of increased defect counts and decreased production yield. Understanding the impact of the immersion lithography process parameters on wafer defect formation and defect counts, together with the ability to monitor, control and minimize the defect counts down to acceptable levels, is imperative for successful introduction of immersion lithography for production of advanced DRs. In this report, we present experimental results of immersion lithography defectivity analysis focused on topcoat layer thickness parameters and resist bake temperatures. Wafers were exposed on the 1150i-α-immersion scanner and 1200B Scanner (ASML); defect inspection was performed using a DUV inspection tool (UVision, Applied Materials). Higher sensitivity was demonstrated at DUV through detection of small defects not detected at the visible wavelength, indicating the potential high-sensitivity benefits of DUV inspection for this layer. The analysis indicates that certain types of defects are associated with different immersion process parameters. This type of analysis at DUV wavelengths would enable the optimization of immersion lithography processes, thus enabling the qualification of immersion processes for volume production.
NASA Astrophysics Data System (ADS)
Newchurch, M.; Al-Saadi, J. A.; Alvarez, R. J.; Burris, J.; Cantrell, W.; Chen, G.; De Young, R.; Hardesty, R.; Hoff, R. M.; Kaye, J. A.; Kuang, S.; Langford, A. O.; LeBlanc, T.; McDermid, I. S.; McGee, T. J.; Pierce, R.; Senff, C. J.; Sullivan, J. T.; Szykman, J.; Tonnesen, G.; Wang, L.
2012-12-01
An interagency research initiative for ground-based ozone and aerosol lidar profiling recently funded by NASA has important applications to air-quality studies in addition to the goal of serving the GEO-CAPE and other air-quality missions. Ozone is a key trace-gas species, a greenhouse gas, and an important pollutant in the troposphere. High spatial and temporal variability of ozone affected by various physical and photochemical processes motivates the high spatio-temporal lidar profiling of tropospheric ozone for improving the simulation and forecasting capability of the photochemical/air-quality models, especially in the boundary layer where the resolution and precision of satellite retrievals are fundamentally limited. It is well known that there are large discrepancies between the surface and upper-air ozone due to titration, surface deposition, diurnal processes, free-tropospheric transport, and other processes. Near-ground ozone profiling has been technically challenging for lidars due to some engineering difficulties, such as near-range saturation, field-of-view overlap, and signal processing issues. This initiative provides an opportunity for us to solve those engineering issues and redesign the lidars aimed at long-term, routine ozone/aerosol observations from the near surface to the top of the troposphere at multiple stations (i.e., NASA/GSFC, NASA/LaRC, NASA/JPL, NOAA/ESRL, UAHuntsville) for addressing the needs of NASA, NOAA, EPA and State/local AQ agencies. We will present the details of the science investigations, current status of the instrumentation development, data access/protocol, and the future goals of this lidar network. Ozone lidar/RAQMS comparison of laminar structures.
Massive black hole and gas dynamics in galaxy nuclei mergers - I. Numerical implementation
NASA Astrophysics Data System (ADS)
Lupi, Alessandro; Haardt, Francesco; Dotti, Massimo
2015-01-01
Numerical effects are known to plague adaptive mesh refinement (AMR) codes when treating massive particles, e.g. those representing massive black holes (MBHs). In an evolving background, they can experience strong, spurious perturbations and then follow unphysical orbits. We study by means of numerical simulations the dynamical evolution of a pair of MBHs in the rapidly and violently evolving gaseous and stellar background that follows a galaxy major merger. We confirm that spurious numerical effects alter the MBH orbits in AMR simulations, and show that the numerical issues are ultimately due to a drop in the spatial resolution during the simulation, which drastically reduces the accuracy of the gravitational force computation. We therefore propose a new refinement criterion suited for massive particles, able to solve in a fast and precise way for their orbits in highly dynamical backgrounds. The new refinement criterion forces the region around each massive particle to remain at the maximum allowed resolution, independently of the local gas density. These maximally resolved regions then follow the MBHs along their orbits, effectively avoiding all spurious effects caused by resolution changes. Our suite of high-resolution AMR hydrodynamic simulations, including different prescriptions for the sub-grid gas physics, shows that the new refinement implementation does not alter the physical evolution of the MBHs, and accounts for all the non-trivial physical processes taking place in violent dynamical scenarios, such as the final stages of a galaxy major merger.
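The proposed criterion can be caricatured as a simple refinement predicate: refine on gas density as the standard AMR rule does, but also force refinement for any cell within a fixed radius of a massive particle, regardless of density. The thresholds and geometry below are invented for illustration:

```python
# Sketch of a refinement predicate in the spirit of the paper's criterion:
# the usual density test, plus forced refinement near massive particles
# (MBHs) so the resolution never drops along their orbits.
def needs_refinement(cell_center, gas_density, mbh_positions,
                     density_threshold=1.0, mbh_radius=0.5):
    if gas_density > density_threshold:          # standard density-based criterion
        return True
    cx, cy, cz = cell_center
    for bx, by, bz in mbh_positions:             # massive-particle criterion
        if (cx - bx) ** 2 + (cy - by) ** 2 + (cz - bz) ** 2 <= mbh_radius ** 2:
            return True
    return False
```

Because the second test is independent of gas density, the maximally refined region travels with each MBH, which is the mechanism the paper credits for removing the spurious force errors.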
The effect of federal health policy on occupational medicine.
McCunney, R J; Cikins, W
1990-01-01
All three branches of the federal government affect occupational medicine. Notable examples include: 1) the Department of Transportation ruling (1988) requiring drug testing in diverse areas of the transportation industry (executive branch); 2) the Workplace Drug Act (1988) calling for organizations to have a policy towards drug and alcohol abuse (legislative branch); and 3) the Supreme Court ruling on the constitutionality of drug testing in the transportation industry (1989) and that infectious diseases are a handicap in accordance with the 1973 Federal Rehabilitation Act (1987). The executive branch plays a major role in occupational medicine primarily through the Occupational Safety and Health Administration (OSHA), which issues standards based on a rule making process; the executive branch can also affect occupational medicine indirectly, as evidenced by President Reagan's Executive Order 12291 calling for Office of Management and Budget oversight of regulatory initiatives. The legislative branch enacts laws, conducts hearings, and requests reports on the operations of federal agencies. The judicial branch addresses occupational health issues when people affected by an executive ruling want to challenge the ruling; or in the case of the Supreme Court, when deliberating an issue over which two circuit courts of appeal have come to divergent opinions. The Occupational Medicine profession can participate in the political process through awareness of proposed legislation and by responding accordingly with letters, resolutions, or testimony. Similar options exist within the executive branch by participating in the rule-making process. A representative of the Governmental Affairs Committee, through periodic visits with key Washington representatives, can keep members of the American College of Occupational Medicine informed about federal legislative and regulatory activities. In appropriate cases, the organization can then take a formal position on governmental activities that affect the speciality.
Automating the conflict resolution process
NASA Technical Reports Server (NTRS)
Wike, Jeffrey S.
1991-01-01
The purpose is to initiate a discussion of how the conflict resolution process at the Network Control Center can be made more efficient. Described here are the current methods of resolving resource conflicts, as well as the impacts of automating conflict resolution in the ATDRSS era. A variety of conflict resolution strategies are presented.
An Overview of the Space Shuttle Orbiter's Aging Aircraft Program
NASA Technical Reports Server (NTRS)
Russell, Richard W.
2007-01-01
The Space Shuttle Orbiter has well exceeded its original design life of 10 years or 100 missions. The Orbiter Project Office (OPO) has sponsored several activities to address aging vehicle concerns, including a Corrosion Control Review Board (CCRB), a mid-life certification program, and most recently the formation of the Aging Orbiter Working Group (AOWG). The AOWG was chartered in 2004 as a proactive group which provides the OPO oversight for aging issues such as corrosion, non-destructive inspection, non-metallics, wiring and subsystems. The core team consists of mainly representatives from the Materials and Processes Problem Resolution Team (M&P PRT) and Safety and Mission Assurance (S&MA). Subsystem engineers and subject matter experts are called in as required. The AOWG has functioned by forming issues based sub-teams. Examples of completed sub-teams include adhesives, wiring and wing leading edge metallic materials. Current sub-teams include Composite Over-Wrapped Pressure Vessels (COPV), elastomeric materials and mechanisms.
Optical control and study of biological processes at the single-cell level in a live organism
NASA Astrophysics Data System (ADS)
Feng, Zhiping; Zhang, Weiting; Xu, Jianmin; Gauron, Carole; Ducos, Bertrand; Vriz, Sophie; Volovitch, Michel; Jullien, Ludovic; Weiss, Shimon; Bensimon, David
2013-07-01
Living organisms are made of cells that are capable of responding to external signals by modifying their internal state and subsequently their external environment. Revealing and understanding the spatio-temporal dynamics of these complex interaction networks is the subject of a field known as systems biology. To investigate these interactions (a necessary step before understanding or modelling them) one needs to develop means to control or interfere spatially and temporally with these processes and to monitor their response on a fast timescale (<1 minute) and with single-cell resolution. In 2012, an EMBO workshop on 'single-cell physiology' (organized by some of us) was held in Paris to discuss those issues in the light of recent developments that allow for precise spatio-temporal perturbations and observations. This review will be largely based on the investigations reported there. We will first present a non-exhaustive list of examples of cellular interactions and developmental pathways that could benefit from these new approaches. We will review some of the novel tools that have been developed for the observation of cellular activity and then discuss the recent breakthroughs in optical super-resolution microscopy that allow for optical observations beyond the diffraction limit. We will review the various means to photo-control the activity of biomolecules, which allow for local perturbations of physiological processes. We will end this review with a report on the current status of optogenetics: the use of photo-sensitive DNA-encoded proteins as sensitive reporters and efficient actuators to perturb and monitor physiological processes.
20 CFR 667.510 - What is the Grant Officer resolution process?
Code of Federal Regulations, 2010 CFR
2010-04-01
... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false What is the Grant Officer resolution process? 667.510 Section 667.510 Employees' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF... From Monitoring and Oversight Reviews § 667.510 What is the Grant Officer resolution process? (a...
20 CFR 667.510 - What is the Grant Officer resolution process?
Code of Federal Regulations, 2014 CFR
2014-04-01
... 20 Employees' Benefits 4 2014-04-01 2014-04-01 false What is the Grant Officer resolution process? 667.510 Section 667.510 Employees' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF... Findings From Monitoring and Oversight Reviews § 667.510 What is the Grant Officer resolution process? (a...
State-Tribal Legislation: 1992 and 1993 Summaries.
ERIC Educational Resources Information Center
White-Tail Feather, Alex; And Others
This report summarizes state legislative activity in 1992 and 1993 pertaining to Native American issues. An overview of each year is followed by state-by-state summaries. In 1993, of 238 bills, resolutions, and memorials introduced, 116 were enacted, with 31 pending. During 1993, education issues were important and included the integration of…
Cases in Bioethics from the Hastings Center Report.
ERIC Educational Resources Information Center
Levine, Carol, Ed.; Veatch, Robert M.
Case studies of ethical issues based on real events are followed by comments illustrating how people from various ethical traditions and frameworks and from different academic and professional disciplines analyze the issues and work toward a resolution of the conflict posed. The cases are intended to help the public and professional persons pursue…
Applied Physics Lab Kennedy Space Center: Recent Contributions
NASA Technical Reports Server (NTRS)
Starr, Stan; Youngquist, Robert
2006-01-01
The mission of the Applied Physics Lab is: (1) Develop and deliver novel sensors and devices to support KSC mission operations. (2) Analyze operational issues and recommend or deliver practical solutions. (3) Apply physics to the resolution of long term space flight issues that affect space port operation on Earth or on other planets.
Alaska Board of Forestry
Natural Resources / Division of Forestry
The nine-member Alaska Board of Forestry advises the state on forest practices issues and provides a forum for discussion and resolution of forest management issues on state land. The board also reviews all proposed changes to the Alaska Forest Resources
2008-10-07
practices. In response, it filed a number of trade dispute resolution cases against China in the WTO, including China’s failure to protect IPR and... in China. In addition, the Administration reversed a long-standing policy that countervailing cases (dealing with government subsidies) could not be...
[Report contents include: WTO Implementation Issues; U.S. WTO Cases Against China]
Imaging Mass Spectrometry on the Nanoscale with Cluster Ion Beams
Winograd, Nicholas
2014-12-02
Imaging with cluster secondary ion mass spectrometry (SIMS) is reaching a mature level of development. Using a variety of molecular ion projectiles to stimulate desorption, 3-dimensional imaging with the selectivity of mass spectrometry can now be achieved with submicrometer spatial resolution and <10 nm depth resolution. In this Perspective, stock is taken of what it will require to routinely achieve these remarkable properties. Open issues include the chemical nature of the projectile, topography formation, differential erosion rates and, perhaps most importantly, ionization efficiency. Shortcomings of existing instrumentation are also noted. One key part of this discussion involves speculation on how best to resolve these issues.
Global-scale Ionospheric Outflow: Major Processes and Unresolved Problems
NASA Astrophysics Data System (ADS)
Liemohn, M. W.; Welling, D. T.; Ilie, R.; Khazanov, G. V.; Jahn, J. M.; Zou, S.; Ganushkina, N. Y.; Valek, P. W.; Elliott, H. A.; Gilchrist, B. E.; Hoegy, W. R.; Glocer, A.
2016-12-01
Outflow from the ionosphere is a major source of plasma for the magnetosphere. Its presence, especially that of ions heavier than He+, mass loads the magnetosphere and changes reconnection rates, current system configurations, plasma wave excitation and wave-particle interactions. It even impacts the propagation of information. We present a brief overview of the major processes and scientific history of this field. There are still major gaps, however, in our understanding of the global-scale nature of ionospheric outflow. We discuss these unresolved problems, highlighting the leading questions still outstanding on this topic. First and foremost, since measurements of ionospheric outflow have largely come from individual satellites and sounding rockets, the processes are best known at the local level, while the spatial distribution of outflow has never been simultaneously measured on more global scales. The spatial coherence and correlation of outflow across time and space have not been quantified. Furthermore, the composition of the outflow is often only measured at the coarse level of H+, He+, and O+, neglecting other species such as N+ or molecular ions. However, resolving O+ from N+, as is customary in planetary research, aids in revealing the physics and altitude dependence of the energization processes in the ionosphere. Similarly, fine-resolution velocity-space measurements of ionospheric outflow have been limited, yet such observations can also reveal the energization processes driving the outflow. A final unresolved issue is magnetically conjugate outflow and the full extent of hemispherically asymmetric outflow fluxes or fluence. Each of these open questions has substantial ramifications for magnetospheric physics; their resolution could yield sweeping changes in our understanding of nonlinear feedback and cross-scale physical interactions, magnetosphere-ionosphere coupling, and geospace system-level science.
Consistent detection and identification of individuals in a large camera network
NASA Astrophysics Data System (ADS)
Colombo, Alberto; Leung, Valerie; Orwell, James; Velastin, Sergio A.
2007-10-01
In the wake of an increasing number of terrorist attacks, counter-terrorism measures are now a main focus of many research programmes. An important issue for the police is the ability to track individuals and groups reliably through underground stations and, in the case of post-event analysis, to be able to ascertain whether specific individuals have been at the station previously. While many motion detection and tracking algorithms exist, their reliable deployment in a large network is still ongoing research. Specifically, to track individuals through multiple views, on multiple levels and between levels, consistent detection and labelling of individuals is crucial. In view of these issues, we have developed a change detection algorithm that works reliably in the presence of periodic movements, e.g. escalators and scrolling advertisements, as well as a content-based retrieval technique for identification. The change detection technique automatically extracts periodically varying elements in the scene using Fourier analysis and constructs a Markov model for the process. Training is performed online, and no manual intervention is required, making this system suitable for deployment in large networks. Experiments on real data show significant improvement over existing techniques. The content-based retrieval technique uses MPEG-7 descriptors to identify individuals. Given the environment under which the system operates, i.e. at relatively low resolution, this approach is suitable for short timescales. For longer timescales, other forms of identification, such as gait or, if the resolution allows, face recognition, will be required.
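The Fourier-analysis step of the change detection technique can be illustrated with a minimal, hypothetical sketch (not the authors' implementation, which also builds a Markov model): score each pixel's intensity time series by how much of its spectral power sits in a single non-DC frequency bin. A periodic element such as an escalator concentrates its power in one bin; noise or static background does not. All signal values and thresholds below are illustrative.

```python
import cmath
import math
import random

def periodic_score(series):
    """Fraction of the (zero-mean) signal's DFT power held by its single
    strongest non-DC frequency bin: close to 1 for periodic pixels
    (escalators, scrolling adverts), small for noise or static scenery."""
    n = len(series)
    mean = sum(series) / n
    x = [v - mean for v in series]
    power = []
    for k in range(1, n // 2 + 1):  # non-DC bins up to Nyquist
        coeff = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
        power.append(abs(coeff) ** 2)
    total = sum(power)
    return 0.0 if total == 0 else max(power) / total

# Hypothetical per-pixel intensity histories over 256 frames
frames = range(256)
escalator = [100 + 50 * math.sin(2 * math.pi * t / 16) for t in frames]
random.seed(0)
noisy = [random.gauss(100, 10) for _ in frames]

print(periodic_score(escalator))  # near 1.0: one bin dominates
print(periodic_score(noisy))      # well below 1.0: power is spread out
```

Pixels scoring above some threshold would be modelled as periodic background rather than flagged as motion, which is the property that lets the system run unattended next to escalators and scrolling advertisements.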
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Donald F.; Schulz, Carl; Konijnenburg, Marco
High-resolution Fourier transform ion cyclotron resonance (FT-ICR) mass spectrometry imaging enables the spatial mapping and identification of biomolecules from complex surfaces. The need for long time-domain transients, and thus large raw file sizes, results in a large amount of raw data (“big data”) that must be processed efficiently and rapidly. This can be compounded by large-area imaging and/or high spatial resolution imaging. For FT-ICR, data processing and data reduction must not compromise the high mass resolution afforded by the mass spectrometer. The continuous mode “Mosaic Datacube” approach allows high mass resolution visualization (0.001 Da) of mass spectrometry imaging data, but requires additional processing as compared to feature-based processing. We describe the use of distributed computing for processing of FT-ICR MS imaging datasets with generation of continuous mode Mosaic Datacubes for high mass resolution visualization. An eight-fold improvement in processing time is demonstrated using a Dutch nationally available cloud service.
An Evolving Worldview: Making Open Source Easy
NASA Astrophysics Data System (ADS)
Rice, Z.
2017-12-01
NASA Worldview is an interactive interface for browsing full-resolution, global satellite imagery. Worldview supports an open data policy so that academia, private industry and the general public can use NASA's satellite data to address Earth science related issues. Worldview was open sourced in 2014. By shifting to an open source approach, the Worldview application has evolved to better serve end-users. Project developers are able to discuss issues with end-users and community developers and to develop new features. Community developers are able to track upcoming features, collaborate on them and make their own contributions. Developers who discover issues are able to address them and submit a fix. This reduces the time it takes for a project developer to reproduce an issue or develop a new feature. Getting new developers to contribute has been one of the most important and difficult aspects of open sourcing Worldview. After witnessing potential outside contributors struggle, the team focused on making the installation of Worldview simple, to reduce the initial learning curve and make contributing code easy. One way we have addressed this is through a simplified setup process. Our setup documentation includes a set of prerequisites and a set of straightforward commands to clone, configure, install and run. This presentation will emphasize our focus on simplifying and standardizing Worldview's open source code so that more people are able to contribute. The more people who contribute, the better the application will become over time.
Prospects for improving the representation of coastal and shelf seas in global ocean models
NASA Astrophysics Data System (ADS)
Holt, Jason; Hyder, Patrick; Ashworth, Mike; Harle, James; Hewitt, Helene T.; Liu, Hedong; New, Adrian L.; Pickles, Stephen; Porter, Andrew; Popova, Ekaterina; Icarus Allen, J.; Siddorn, John; Wood, Richard
2017-02-01
Accurately representing coastal and shelf seas in global ocean models represents one of the grand challenges of Earth system science. They are regions of immense societal importance through the goods and services they provide, hazards they pose and their role in global-scale processes and cycles, e.g. carbon fluxes and dense water formation. However, they are poorly represented in the current generation of global ocean models. In this contribution, we aim to briefly characterise the problem, and then to identify the important physical processes, and their scales, needed to address this issue in the context of the options available to resolve these scales globally and the evolving computational landscape. We find barotropic and topographic scales are well resolved by the current state-of-the-art model resolutions, e.g. nominal 1/12°, and still reasonably well resolved at 1/4°; here, the focus is on process representation. We identify tides, vertical coordinates, river inflows and mixing schemes as four areas where modelling approaches can readily be transferred from regional to global modelling with substantial benefit. In terms of finer-scale processes, we find that a 1/12° global model resolves the first baroclinic Rossby radius for only ˜ 8 % of regions < 500 m deep, but this increases to ˜ 70 % for a 1/72° model, so resolving scales globally requires substantially finer resolution than the current state of the art. We quantify the benefit of improved resolution and process representation using 1/12° global- and basin-scale northern North Atlantic Nucleus for European Modelling of the Ocean (NEMO) simulations; the latter includes tides and a k-ɛ vertical mixing scheme. These are compared with global stratification observations and 19 models from CMIP5. In terms of correlation and basin-wide rms error, the high-resolution models outperform all these CMIP5 models. The model with tides shows improved seasonal cycles compared to the high-resolution model without tides.
The benefits of resolution are particularly apparent in eastern boundary upwelling zones. To explore the balance between the size of a globally refined model and that of multiscale modelling options (e.g. finite element, finite volume or a two-way nesting approach), we consider a simple scale analysis and a conceptual grid refining approach. We put this analysis in the context of evolving computer systems, discussing model turnaround time, scalability and resource costs. Using a simple cost model compared to a reference configuration (taken to be a 1/4° global model in 2011) and the increasing performance of the UK Research Councils' computer facility, we estimate an unstructured mesh multiscale approach, resolving process scales down to 1.5 km, would use a comparable share of the computer resource by 2021, the two-way nested multiscale approach by 2022, and a 1/72° global model by 2026. However, we also note that a 1/12° global model would not have a comparable computational cost to a 1° global model in 2017 until 2027. Hence, we conclude that for computationally expensive models (e.g. for oceanographic research or operational oceanography), resolving scales to ˜ 1.5 km would be routinely practical in about a decade given substantial effort on numerical and computational development. For complex Earth system models, this extends to about 2 decades, suggesting the focus here needs to be on improved process parameterisation to meet these challenges.
High-resolution CSR GRACE RL05 mascons
NASA Astrophysics Data System (ADS)
Save, Himanshu; Bettadpur, Srinivas; Tapley, Byron D.
2016-10-01
The determination of the gravity model for the Gravity Recovery and Climate Experiment (GRACE) is susceptible to modeling errors, measurement noise, and observability issues. The ill-posed GRACE estimation problem causes the unconstrained GRACE RL05 solutions to have north-south stripes. We discuss the development of global equal area mascon solutions to improve the GRACE gravity information for the study of Earth surface processes. These regularized mascon solutions are developed with a 1° resolution using Tikhonov regularization in a geodesic grid domain. These solutions are derived from GRACE information only, and no external model or data is used to inform the constraints. The regularization matrix is time-variable and will not bias or attenuate future regional signals toward past statistics from GRACE or other models. The resulting Center for Space Research (CSR) mascon solutions have no stripe errors and capture all the signals observed by GRACE within the measurement noise level. The solutions are not tailored for specific applications and are global in nature. This study discusses the solution approach and compares the resulting solutions with postprocessed results from the RL05 spherical harmonic solutions and other global mascon solutions for studies of Arctic ice sheet processes, ocean bottom pressure variation, and land surface total water storage change. This suite of comparisons leads to the conclusion that the mascon solutions presented here are an enhanced representation of the RL05 GRACE solutions and provide accurate surface-based gridded information that can be used without further processing.
Sparse signal representation and its applications in ultrasonic NDE.
Zhang, Guang-Ming; Zhang, Cheng-Zhong; Harvey, David M
2012-03-01
Many sparse signal representation (SSR) algorithms have been developed in the past decade. The advantages of SSR, such as compact representations and super resolution, lead to state-of-the-art performance of SSR for processing ultrasonic non-destructive evaluation (NDE) signals. Choosing a suitable SSR algorithm and designing an appropriate overcomplete dictionary are key to success. After a brief review of sparse signal representation methods and the design of overcomplete dictionaries, this paper addresses the recent accomplishments of SSR for processing ultrasonic NDE signals. The advantages and limitations of SSR algorithms and various overcomplete dictionaries widely used in ultrasonic NDE applications are explored in depth. Their performance improvement compared to conventional signal processing methods in many applications, such as ultrasonic flaw detection and noise suppression, echo separation and echo estimation, and ultrasonic imaging, is investigated. The challenging issues met in practical ultrasonic NDE applications, for example the design of a good dictionary, are discussed. Representative experimental results are presented for demonstration. Copyright © 2011 Elsevier B.V. All rights reserved.
Tools for Atmospheric Radiative Transfer: Streamer and FluxNet. Revised
NASA Technical Reports Server (NTRS)
Key, Jeffrey R.; Schweiger, Axel J.
1998-01-01
Two tools for the solution of radiative transfer problems are presented. Streamer is a highly flexible medium spectral resolution radiative transfer model based on the plane-parallel theory of radiative transfer. Capable of computing either fluxes or radiances, it is suitable for studying radiative processes at the surface or within the atmosphere and for the development of remote-sensing algorithms. FluxNet is a fast neural network-based implementation of Streamer for computing surface fluxes. It allows for a sophisticated treatment of radiative processes in the analysis of large data sets and potential integration into geophysical models where computational efficiency is an issue. Documentation and tools for the development of alternative versions of FluxNet are available. Collectively, Streamer and FluxNet solve a wide variety of problems related to radiative transfer: Streamer provides the detail and sophistication needed to perform basic research on most aspects of complex radiative processes, while the efficiency and simplicity of FluxNet make it ideal for operational use.
A machine vision system for micro-EDM based on linux
NASA Astrophysics Data System (ADS)
Guo, Rui; Zhao, Wansheng; Li, Gang; Li, Zhiyong; Zhang, Yong
2006-11-01
Due to the high precision and good surface quality that it can give, Electrical Discharge Machining (EDM) is potentially an important process for the fabrication of micro-tools and micro-components. However, a number of issues remain unsolved before micro-EDM becomes a reliable process with repeatable results. To deal with the difficulties of on-line fabrication of micro electrodes and tool wear compensation, a micro-EDM machine vision system is developed with a Charge Coupled Device (CCD) camera, with an optical resolution of 1.61 μm and an overall magnification of 113-729. Based on the Linux operating system, an image capturing program is developed with the V4L2 API, and an image processing program is implemented using OpenCV. The contour of micro electrodes can be extracted by means of the Canny edge detector. Through system calibration, the diameter of micro electrodes can be measured on-line. Experiments have been carried out to prove its performance, and the reasons for measurement error are also analyzed.
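The calibrated-measurement step can be sketched in a few lines (a hypothetical illustration, not the paper's code): once the electrode contour is found, its pixel width times the calibrated scale gives a physical diameter. Here the quoted 1.61 μm optical resolution is interpreted as μm per pixel, and the intensity profile and threshold are invented for the example.

```python
MICRONS_PER_PIXEL = 1.61  # optical resolution quoted for the CCD setup

def electrode_diameter_um(row, threshold):
    """Estimate diameter from one image row: span between the first and
    last pixel darker than the threshold (the electrode silhouette
    against a bright back-lit background)."""
    dark = [i for i, v in enumerate(row) if v < threshold]
    if not dark:
        return 0.0
    return (dark[-1] - dark[0] + 1) * MICRONS_PER_PIXEL

# Hypothetical back-lit intensity profile: bright background, dark electrode
row = [250] * 40 + [30] * 62 + [250] * 40
print(electrode_diameter_um(row, 128))  # 62 dark px * 1.61 um/px = 99.82 um
```

A real system would average over many rows of the Canny-extracted contour and correct for edge blur, which is one of the measurement-error sources the paper analyzes.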
ERIC Educational Resources Information Center
Turan, Selahattin; Taylor, Charles
This paper briefly introduces alternative dispute resolution (ADR) processes and their fundamental principles. The paper provides a review of the literature on ADR and discusses its applicability in educational settings. The concept of conflict is explained, along with analysis of the limitations of traditional conflict resolution processes. The…
Lessons Learned about Neurodegeneration from Microglia and Monocyte Depletion Studies
Lund, Harald; Pieber, Melanie; Harris, Robert A.
2017-01-01
While bone marrow-derived Ly6Chi monocytes can infiltrate the central nervous system (CNS), they are developmentally and functionally distinct from resident microglia. Our understanding of the relative importance of these two populations in the distinct processes of pathogenesis and resolution of inflammation during neurodegenerative disorders was limited by a lack of tools to specifically manipulate each cell type. During recent years, the development of experimental cell-specific depletion models has enabled this issue to be addressed. Herein we compare and contrast the different depletion approaches that have been used, focusing on the respective functionalities of microglia and monocyte-derived macrophages in a range of neurodegenerative disease states, and discuss their prospects for immunotherapy. PMID:28804456
NASA Technical Reports Server (NTRS)
Stewart, Mark E. M.; Moder, Jeffrey P.
2016-01-01
This paper presents ANSYS Fluent simulation results and analysis for self-pressurization of a flightweight, cryogenic, liquid hydrogen tank in 1-g. These results are compared with experimental data, in particular, pressure evolution and temperature measurements at a set of sensors. The simulations can be analyzed to identify and quantify heat flows in the tank. Heat flows change over time and influence the self-pressurization process. The initial rate of self-pressurization is sensitive to the initial temperature profile near the interface. Uncertainty in saturation pressure data and the accuracy of experimental measurements complicate simulation of self-pressurization. Numerical issues encountered, and their resolution, are also explained.
Development of noise emission measurement specifications for color printing multifunctional devices
NASA Astrophysics Data System (ADS)
Kimizuka, Ikuo
2005-09-01
Color printing (including copying) is becoming a more popular application in homes as well as in offices. Existing de jure and/or industrial standards (such as ISO 7779, ECMA-74, the ANSI S12.10 series, etc.), however, specify only monochrome patterns, which are mainly intended for acoustic noise testing of mechanical impact printers. This paper discusses the key issues, and corresponding resolutions, in developing color printing patterns for acoustic noise measurements. The results of this technical work will be published as JBMS-74 (a new JBMIA industry standard, within 2005) and will hopefully be the technical basis for updating the other standards mentioned above. This paper also shows the development process and key features of the proposed patterns.
Review of Waste Management Symposium 2007, Tucson, AZ, USA
Luna, Robert E.; Yoshimura, R. H.
2007-03-01
The Waste Management Symposium 2007 is the most recent in a long series that has been held at Tucson, Arizona. The meeting has become extremely popular as a venue for technical exchange, marketing, and networking involving upward of 1800 persons involved with various aspects of radioactive waste management. However, in a break with tradition, the symposium organizers reported that next year’s Waste Management Symposium would be held at the Phoenix, AZ convention center. Additionally, most of the WM07 sessions dealt with the technical and institutional issues relating to the resolution of waste disposal and processing challenges, including a number of sessions dealing with related transport activities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burke, F.; Donoghue, N.
1994-11-01
Vietnam's energy needs are clear and acute. Economic reforms have triggered a dynamic development process with a large and growing appetite for power. In view of the Vietnamese government's own shortages of capital, private international power companies have been identified as key problem-solvers in the country's efforts to meet a skyrocketing demand for energy resources. There are no restrictions on the nature of projects in which non-Vietnamese investors may participate. A number of legal issues need resolution before independent power producers can take advantage of the Republic's recently created Builder-Operator-Transfer Contracts (the BOT Regulations). This paper discusses these regulations and how they affect independent power producers.
Enhancing Decision-Making in STSE Education by Inducing Reflection and Self-Regulated Learning
NASA Astrophysics Data System (ADS)
Gresch, Helge; Hasselhorn, Marcus; Bögeholz, Susanne
2017-02-01
Thoughtful decision-making to resolve socioscientific issues is central to science, technology, society, and environment (STSE) education. One approach for attaining this goal involves fostering students' decision-making processes. Thus, the present study explores whether the application of decision-making strategies, combined with reflections on the decision-making processes of others, enhances decision-making competence. In addition, this study examines whether this process is supported by elements of self-regulated learning, i.e., self-reflection regarding one's own performance and the setting of goals for subsequent tasks. A computer-based training program which involves the resolution of socioscientific issues related to sustainable development was developed in two versions: with and without elements of self-regulated learning. Its effects on decision-making competence were analyzed using a pretest-posttest follow-up control-group design (N = 242 high school students). Decision-making competence was assessed using an open-ended questionnaire that focused on three facets: consideration of advantages and disadvantages, metadecision aspects, and reflection on the decision-making processes of others. The findings suggest that students in both training groups incorporated aspects of metadecision into their statements more often than students in the control group. Furthermore, both training groups were more successful in reflecting on the decision-making processes of others. The students who received additional training in self-regulated learning showed greater benefits in terms of metadecision aspects and reflection, and these effects remained significant two months later. Overall, our findings demonstrate that the application of decision-making strategies, combined with reflections on the decision-making process and elements of self-regulated learning, is a fruitful approach in STSE education.
ERIC First Analysis: National Defense Commitments; 1982-83 National High School Debate Resolutions.
ERIC Educational Resources Information Center
Wagner, David L.
The purpose of this booklet is to provide a brief overview of some of the issues involved in the 1982-83 high school debate resolutions, which focus on the defense commitments of the United States. The first of the booklet's four chapters provides a review of information sources for use in researching the topic of defense commitments. The…
ERIC Educational Resources Information Center
Pennsylvania Joint State Government Commission, Harrisburg.
Pennsylvania's House Resolution 43 of 1995 directs the Joint State Government Commission to report to the General Assembly on the feasibility of creating a voluntary residential school program for disadvantaged children. The Commission assembled a Working Group to consider this issue, and the group focused on poor children living in high crime…
Nouri, Hamideh; Anderson, Sharolyn; Sutton, Paul; Beecham, Simon; Nagler, Pamela; Jarchow, Christopher J; Roberts, Dar A
2017-04-15
This research addresses the question of whether the Normalised Difference Vegetation Index (NDVI) is scale invariant (i.e. constant over spatial aggregation) for pure pixels of urban vegetation. It has long been recognized that there are issues related to the modifiable areal unit problem (MAUP) pertaining to indices such as NDVI and images at varying spatial resolutions. These issues are relevant to using NDVI values in spatial analyses. We compare two different methods of calculating a mean NDVI: 1) using pixel values of NDVI within feature/object boundaries and 2) first calculating the mean red and mean near-infrared across all feature pixels and then calculating NDVI. We explore the nature and magnitude of these differences for images taken from two sensors, a 1.24 m resolution WorldView-3 and a 0.1 m resolution digital aerial image. We apply these methods over an urban park located in the Adelaide Parklands of South Australia. We demonstrate that the MAUP is not an issue for calculation of NDVI within a sensor for pure urban vegetation pixels. This may prove useful for future rule-based monitoring of the ecosystem functioning of green infrastructure. Copyright © 2017 Elsevier B.V. All rights reserved.
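The two mean-NDVI computations being compared are easy to state concretely. NDVI = (NIR − Red) / (NIR + Red); method 1 averages per-pixel NDVI values, while method 2 aggregates the bands first and then computes NDVI once. The reflectance values below are hypothetical, purely to show that the two methods generally give close but not identical numbers for pure vegetation pixels:

```python
def ndvi(red, nir):
    """Normalised Difference Vegetation Index for one pixel or one mean."""
    return (nir - red) / (nir + red)

# Hypothetical reflectances for four pure-vegetation pixels in one feature
red = [0.04, 0.05, 0.06, 0.05]
nir = [0.50, 0.45, 0.55, 0.60]

# Method 1: mean of the per-pixel NDVI values
m1 = sum(ndvi(r, n) for r, n in zip(red, nir)) / len(red)

# Method 2: NDVI of the mean red and mean NIR (spatial aggregation first)
m2 = ndvi(sum(red) / len(red), sum(nir) / len(nir))

print(m1, m2)  # close, but not identical in general
```

Because NDVI is a nonlinear (ratio) function of the bands, the mean of the ratios need not equal the ratio of the means; the paper's finding is that for pure urban vegetation pixels within a sensor this discrepancy is negligible in practice.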
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ballard, James D.; Halstead, Robert J.; Dilger, Fred
2012-07-01
In 1999, the State of Nevada brought its concerns about physical protection of current spent nuclear fuel (SNF) shipments, and future SNF shipments to a federal repository, before the NRC in a petition for rulemaking (PRM-73-10). In October 2010, the NRC published a rulemaking decision that would significantly strengthen physical protection of SNF in transit. The newest articulation of the rule (10 CFR 73.37) incorporates regulatory clarifications and security enhancements requested in Nevada's 1999 petition for rulemaking, codifies the findings of the NRC and DOE consequence analyses into policy guidance documents, and brings forward into regulations the agency and licensee experience gained since the terrorist attacks of September 11, 2001. Although at present DOE SNF shipments would continue to be exempt from these NRC regulations, Nevada considers the rule to constitute a largely satisfactory resolution to stakeholder concerns raised in the original petition and in subsequent comments submitted to the NRC. This paper reviews the process of regulatory changes, assesses the specific improvements contained in the new rules, and briefly describes the significance of the new rule in the context of a future national nuclear waste management program. Nevada's petition for rulemaking led to a generally satisfactory resolution of the State's concerns. The decade-plus timeframe from petition to rulemaking conclusion saw a sea change in many aspects of the relevant issues - perhaps most importantly, the attacks of 9/11 led regulatory bodies to recognize that a new threat environment exists wherein shipments of SNF and HLW pose a viable target for human-initiated events. The State of Nevada has always considered security a critical concern for the transport of these highly radioactive materials. This was one of the primary reasons for the original rulemaking petition and subsequent advocacy by Nevada on related issues.
NRC decisions on the majority of the concerns expressed in the petition, additional developments by other regulatory bodies, and the change in how the United States sees threats to the homeland - all of these produced a satisfactory resolution through the rulemaking process. While not all of the concerns expressed by Nevada were addressed in the proposed rule, and significant challenges face any programmatic shipment campaign in the future, the lesson learned on this occasion is that stakeholder concerns can be resolved through rulemaking. If DOE were to engage with stakeholders on its role in the transport of SNF and HLW under the NWPA, these concerns could be better addressed. In particular, DOE's resistance to transportation and security regulations now considered necessary by the NRC for the adequate protection of shipments of highly radioactive materials seems ill advised. One clear lesson learned from this successful rulemaking petition process is that the system of stakeholder input can work to better the regulatory environment. (authors)
Le Pogam, Adrien; Hatt, Mathieu; Descourt, Patrice; Boussion, Nicolas; Tsoumpas, Charalampos; Turkheimer, Federico E.; Prunier-Aesch, Caroline; Baulieu, Jean-Louis; Guilloteau, Denis; Visvikis, Dimitris
2011-01-01
Purpose: Partial volume effects (PVE) are consequences of the limited spatial resolution in emission tomography, leading to underestimation of uptake in tissues of size similar to the point spread function (PSF) of the scanner as well as activity spillover between adjacent structures. Among PVE correction methodologies, a voxel-wise mutual multi-resolution analysis (MMA) was recently introduced. MMA is based on the extraction and transformation of high resolution details from an anatomical image (MR/CT) and their subsequent incorporation into a low resolution PET image using wavelet decompositions. Although this method allows the creation of PVE-corrected images, it is based on a 2D global correlation model, which may introduce artefacts in regions where no significant correlation exists between anatomical and functional details. Methods: A new model was designed to overcome these two issues (2D only and global correlation) using a 3D wavelet decomposition process combined with a local analysis. The algorithm was evaluated on synthetic, simulated and patient images, and its performance was compared to the original approach as well as the geometric transfer matrix (GTM) method. Results: Quantitative performance was similar to the 2D global model and GTM in correlated cases. In cases where mismatches between anatomical and functional information were present, the new model outperformed the 2D global approach, avoiding artefacts and significantly improving the quality of the corrected images and their quantitative accuracy. Conclusions: A new 3D local model was proposed for voxel-wise PVE correction based on the original mutual multi-resolution analysis approach. Its evaluation demonstrated improved and more robust qualitative and quantitative accuracy compared to the original MMA methodology, particularly in the absence of full correlation between anatomical and functional information. PMID:21978037
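The detail-transfer idea behind MMA can be illustrated with a toy sketch: decompose both images with a wavelet transform, keep the low-resolution approximation of the PET image, and inject the high-frequency detail bands of the co-registered anatomical image. This minimal single-level 2D Haar version (NumPy only) is a stand-in for the paper's 3D, locally weighted decomposition; `alpha` is a hypothetical global coupling weight, i.e. exactly the kind of global model the paper replaces with a local analysis.

```python
import numpy as np

def haar2d(img):
    """Single-level 2D Haar decomposition: approximation + 3 detail bands."""
    a = (img[0::2, 0::2] + img[1::2, 0::2] + img[0::2, 1::2] + img[1::2, 1::2]) / 4
    h = (img[0::2, 0::2] - img[1::2, 0::2] + img[0::2, 1::2] - img[1::2, 1::2]) / 4
    v = (img[0::2, 0::2] + img[1::2, 0::2] - img[0::2, 1::2] - img[1::2, 1::2]) / 4
    d = (img[0::2, 0::2] - img[1::2, 0::2] - img[0::2, 1::2] + img[1::2, 1::2]) / 4
    return a, h, v, d

def ihaar2d(a, h, v, d):
    """Exact inverse of haar2d."""
    out = np.empty((a.shape[0] * 2, a.shape[1] * 2))
    out[0::2, 0::2] = a + h + v + d
    out[1::2, 0::2] = a - h + v - d
    out[0::2, 1::2] = a + h - v - d
    out[1::2, 1::2] = a - h - v + d
    return out

def mma_sketch(pet, anat, alpha=1.0):
    """Keep the PET approximation band, inject anatomical detail bands.
    `alpha` is a hypothetical global weight; the published method instead
    estimates a local correlation model in 3D."""
    a_pet, _, _, _ = haar2d(pet)
    _, h_a, v_a, d_a = haar2d(anat)
    return ihaar2d(a_pet, alpha * h_a, alpha * v_a, alpha * d_a)
```

When the two images are identical, the sketch reconstructs the input exactly, which is a convenient sanity check of the transform pair.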
Stable clustering and the resolution of dissipationless cosmological N-body simulations
NASA Astrophysics Data System (ADS)
Benhaiem, David; Joyce, Michael; Sylos Labini, Francesco
2017-10-01
The determination of the resolution of cosmological N-body simulations, i.e. the range of scales in which quantities measured in them accurately represent the continuum limit, is an important open question. We address it here using scale-free models, for which self-similarity provides a powerful tool to control resolution. Such models also provide a robust testing ground for the so-called stable clustering approximation, which gives simple predictions for them. Studying large N-body simulations of such models with different force smoothing, we find that these two issues are in fact very closely related: our conclusion is that the accuracy of two-point statistics in the non-linear regime starts to degrade strongly around the scale at which their behaviour deviates from that predicted by the stable clustering hypothesis. Physically the association of the two scales is in fact simple to understand: stable clustering fails to be a good approximation when there are strong interactions of structures (in particular merging), and it is precisely such non-linear processes which are sensitive to fluctuations at the smaller scales affected by discretization. Resolution may be further degraded if the short-distance gravitational smoothing scale is larger than the scale to which stable clustering can propagate. We examine in detail the very different conclusions of studies by Smith et al. and Widrow et al. and find that the strong deviations from stable clustering reported by these works are the result of over-optimistic assumptions about the scales resolved accurately by the measured power spectra, and of the reliance on Fourier space analysis. We emphasize the much poorer resolution obtained with the power spectrum compared to the two-point correlation function.
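For readers unfamiliar with the real-space statistic favored here, a minimal estimator of the two-point correlation function for particles in a periodic box can be sketched as follows. This is a brute-force O(N²) illustration with an analytic Poisson pair count for the random term, not the estimator used on the paper's large simulations:

```python
import numpy as np

def pair_counts(pos, bins, box):
    """Histogram of unique pair separations using minimum-image distances."""
    n = len(pos)
    d = pos[:, None, :] - pos[None, :, :]
    d -= box * np.round(d / box)                  # periodic minimum image
    r = np.sqrt((d ** 2).sum(-1))[np.triu_indices(n, k=1)]
    return np.histogram(r, bins=bins)[0]

def xi(pos, bins, box):
    """Natural estimator xi(r) = DD/RR - 1; RR is computed analytically
    as the expected pair count for a uniform distribution in the box."""
    n = len(pos)
    dd = pair_counts(pos, bins, box)
    shell = 4.0 / 3.0 * np.pi * (bins[1:] ** 3 - bins[:-1] ** 3)
    rr = 0.5 * n * (n - 1) * shell / box ** 3     # expected Poisson pairs
    return dd / rr - 1.0
```

For an unclustered (uniform random) particle set, the estimator should fluctuate around zero, which is the elementary consistency check below.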
NGNP High Temperature Materials White Paper
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lew Lommers; George Honma
2012-08-01
This white paper is one in a series that addresses key generic issues of the combined construction and operating license (COL) pre-application program for the Next Generation Nuclear Plant reactor using prismatic block fuel technology. The purpose of the pre-application program interactions with the NRC staff is to reduce the time required for COL application review by identifying and addressing key regulatory issues and, if possible, obtaining agreements for their resolution.
Advanced manufacturing—A transformative enabling capability for fusion
Nygren, Richard E.; Dehoff, Ryan R.; Youchison, Dennis L.; ...
2018-05-24
Additive Manufacturing (AM) can create novel and complex engineered material structures. Features such as controlled porosity, micro-fibers and/or nano-particles, transitions in materials and integral robust coatings can be important in developing solutions for fusion subcomponents. A realistic understanding of this capability would be particularly valuable in identifying development paths. Major concerns for using AM processes with lasers or electron beams that melt powder to make refractory parts are the power required and the residual stresses arising in fabrication. A related issue is the required combination of lasers or e-beams to continue heating of deposited material (to reduce stresses) and to deposit new material at a reasonable build rate while providing adequate surface finish and resolution for meso-scale features. In conclusion, some Direct Write processes that can make suitable preforms and be cured to an acceptable density may offer another approach for PFCs.
A tool for simulating parallel branch-and-bound methods
NASA Astrophysics Data System (ADS)
Golubeva, Yana; Orlov, Yury; Posypkin, Mikhail
2016-01-01
The Branch-and-Bound method is known as one of the most powerful but very resource-consuming global optimization methods. Parallel and distributed computing can efficiently cope with this issue. The major difficulty in the parallel B&B method is the need for dynamic load redistribution; the design and study of load balancing algorithms is therefore a separate and very important research topic. This paper presents a tool for simulating the parallel Branch-and-Bound method. The simulator allows one to run load balancing algorithms with various numbers of processors, sizes of the search tree, and characteristics of the supercomputer's interconnect, thereby fostering deep study of load distribution strategies. The process of resolution of the optimization problem by the B&B method is replaced by a stochastic branching process. Data exchanges are modeled using the concept of logical time. The user-friendly graphical interface to the simulator provides efficient visualization and convenient performance analysis.
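The core substitution — replacing the actual B&B search by a stochastic branching process — can be illustrated with a minimal single-worker sketch. The parameter names and the subcritical branching probability are illustrative assumptions, not values or mechanisms from the paper (which also models multiple processors and inter-processor data exchange in logical time):

```python
import random
from collections import deque

def simulate_bnb(branch_prob=0.45, n_children=2, max_nodes=10000, seed=1):
    """Stochastic stand-in for B&B tree growth: each expanded node spawns
    `n_children` subproblems with probability `branch_prob` (a proxy for
    bounding/pruning), otherwise it is fathomed. Returns the number of
    nodes expanded (the logical time) and the maximum depth reached."""
    rng = random.Random(seed)
    queue = deque([0])                 # pending subproblems, stored by depth
    expanded, max_depth = 0, 0
    while queue and expanded < max_nodes:
        depth = queue.popleft()
        expanded += 1
        max_depth = max(max_depth, depth)
        if rng.random() < branch_prob:
            queue.extend([depth + 1] * n_children)
    return expanded, max_depth
```

With `branch_prob * n_children < 1` the branching process is subcritical, so the simulated tree terminates almost surely — a convenient property when sweeping load-balancing strategies over many synthetic runs.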
Shape measurement and vibration analysis of moving speaker cone
NASA Astrophysics Data System (ADS)
Zhang, Qican; Liu, Yuankun; Lehtonen, Petri
2014-06-01
Surface three-dimensional (3-D) shape information is needed for many fast processes such as structural testing of materials, standing waves on a loudspeaker cone, etc. Usually measurement is done from a limited number of points using electrical sensors or laser distance meters. Fourier Transform Profilometry (FTP) enables fast shape measurement of the whole surface. The method is based on angled sinusoidal fringe pattern projection and image capturing. FTP requires only one image of the deformed fringe pattern to restore the 3-D shape of the measured object, which makes real-time or dynamic data processing possible. In our experiment the method was used for loudspeaker cone distortion measurement in dynamic conditions. For sound quality it is important that the whole cone moves in the same phase and there are no partial waves. Our imaging resolution was 1280x1024 pixels and the frame rate was 200 fps. Using our setup we found unwanted spatial waves in our sample cone.
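The FTP recipe — capture one fringe image, isolate the carrier peak in the Fourier domain, and take the phase of the inverse transform — can be sketched in 1D. This is a minimal illustration, not the authors' processing chain; the carrier bin and filter half-width are assumptions, and a real pipeline would add unwrapping and a phase-to-height calibration:

```python
import numpy as np

def ftp_phase(row, carrier, halfwidth):
    """Recover the wrapped phase of one sinusoidal fringe line by
    band-passing the positive carrier peak in the Fourier domain."""
    spec = np.fft.fft(row)
    filt = np.zeros_like(spec)
    filt[carrier - halfwidth : carrier + halfwidth + 1] = \
        spec[carrier - halfwidth : carrier + halfwidth + 1]
    return np.angle(np.fft.ifft(filt))
```

Subtracting the phase of a reference (flat-surface) fringe line from that of the deformed line, then rewrapping, yields the phase shift that encodes surface height.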
A multi-component evaporation model for beam melting processes
NASA Astrophysics Data System (ADS)
Klassen, Alexander; Forster, Vera E.; Körner, Carolin
2017-02-01
In additive manufacturing using laser or electron beam melting technologies, evaporation losses and changes in chemical composition are known issues when processing alloys with volatile elements. In this paper, a recently described numerical model based on a two-dimensional free surface lattice Boltzmann method is further developed to incorporate the effects of multi-component evaporation. The model takes into account the local melt pool composition during heating and fusion of metal powder. For validation, the titanium alloy Ti-6Al-4V is melted by selective electron beam melting and analysed using mass loss measurements and high-resolution microprobe imaging. Numerically determined evaporation losses and spatial distributions of aluminium compare well with experimental data. Predictions of the melt pool formation in bulk samples provide insight into the competition between the loss of volatile alloying elements from the irradiated surface and their advective redistribution within the molten region.
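Although the paper's model is a free-surface lattice Boltzmann scheme, the evaporative loss term for a volatile species over a melt is commonly written with Langmuir's law using Raoultian partial pressures. A hedged sketch of that flux term follows; the constants and parameter names are generic physics, not quantities taken from the paper's implementation:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def langmuir_mass_flux(p_sat, x, gamma, m, T):
    """Langmuir evaporative mass flux (kg m^-2 s^-1) of one alloy component.
    p_sat: saturation vapour pressure of the pure element at T (Pa)
    x, gamma: mole fraction and activity coefficient in the melt
    m: atomic mass (kg); T: surface temperature (K)."""
    p_i = gamma * x * p_sat                       # Raoultian partial pressure
    return p_i * math.sqrt(m / (2 * math.pi * K_B * T))
```

The flux is linear in the local mole fraction, which is why coupling it to the advected melt-pool composition (as the model does) matters: as the surface is depleted of a volatile element such as aluminium, its evaporation rate falls unless advection replenishes it.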
Molybdenum-base cermet fuel development
NASA Astrophysics Data System (ADS)
Pilger, James P.; Gurwell, William E.; Moss, Ronald W.; White, George D.; Seifert, David A.
Development of a multimegawatt (MMW) space nuclear power system requires identification and resolution of several technical feasibility issues before selecting one or more promising system concepts. Demonstration of reactor fuel fabrication technology is required for cermet-fueled reactor concepts. The MMW reactor fuel development activity at Pacific Northwest Laboratory (PNL) is focused on producing a molybdenum-matrix uranium-nitride (UN) fueled cermet. This cermet is to have a high matrix density (greater than or equal to 95 percent) for high strength and high thermal conductance, coupled with a high particle (UN) porosity (approximately 25 percent) for retention of released fission gas at high burnup. Fabrication process development involves the use of porous TiN microspheres as surrogate fuel material until porous UN microspheres become available. Process development was conducted in the areas of microsphere synthesis, particle sealing/coating, and high-energy-rate forming (HERF) and vacuum hot press consolidation techniques. This paper summarizes the status of these activities.
Higher resolution satellite remote sensing and the impact on image mapping
Watkins, Allen H.; Thormodsgard, June M.
1987-01-01
Recent advances in spatial, spectral, and temporal resolution of civil land remote sensing satellite data are presenting new opportunities for image mapping applications. The U.S. Geological Survey's experimental satellite image mapping program is evolving toward larger scale image map products with increased information content as a result of improved image processing techniques and increased resolution. Thematic mapper data are being used to produce experimental image maps at 1:100,000 scale that meet established U.S. and European map accuracy standards. The availability of high quality, cloud-free, 30-meter ground resolution multispectral data from the Landsat thematic mapper sensor, along with 10-meter ground resolution panchromatic and 20-meter ground resolution multispectral data from the recently launched French SPOT satellite, presents new cartographic and image processing challenges. The need to fully exploit these higher resolution data increases the complexity of processing the images into large-scale image maps. The removal of radiometric artifacts and noise prior to geometric correction can be accomplished by using a variety of image processing filters and transforms. Sensor modeling and image restoration techniques allow maximum retention of spatial and radiometric information. An optimum combination of spectral information and spatial resolution can be obtained by merging different sensor types. These processing techniques are discussed and examples are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daling, P.M.; Marler, J.E.; Vo, T.V.
This study evaluates the values (benefits) and impacts (costs) associated with potential resolutions to Generic Issue 143, "Availability of HVAC and Chilled Water Systems." The study identifies vulnerabilities related to failures of HVAC, chilled water, and room cooling systems; develops estimates of room heatup rates and safety-related equipment vulnerabilities following losses of HVAC/room cooler systems; develops estimates of the core damage frequencies and public risks associated with failures of these systems; develops three proposed resolution strategies to this generic issue; and performs a value/impact analysis of the proposed resolutions. Existing probabilistic risk assessments for four representative plants, including one plant from each vendor, form the basis for the core damage frequency and public risk calculations. Both internal and external events were considered. It was concluded that all three proposed resolution strategies exceed the $1,000/person-rem cost-effectiveness ratio. Additional evaluations were performed to develop "generic" insights on potential design-related and configuration-related vulnerabilities and potential high-frequency (~1E-04/RY) accident sequences that involve failures of HVAC/room cooling functions. It was concluded that, although high-frequency accident sequences may exist at some plants, these high-frequency sequences are plant-specific in nature or have been resolved through hardware and/or operational changes. The plant-specific Individual Plant Examinations are an effective vehicle for identification and resolution of these plant-specific anomalies and hardware configurations.
A Hydrogen and He Isotope Nanoprobe
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doyle, Barney L.; Van Deusen, Stuart B.
Materials that incorporate hydrogen and helium isotopes are of great interest at Sandia and throughout the NNSA and DOE. The Ion Beam Lab at SNL-NM has invented techniques using micron- to mm-size MeV ion beams to recoil these light isotopes (Elastic Recoil Detection or ERD) that can make such measurements very accurately. However, there are many measurements that would benefit NW and DOE that require much better resolution, such as the distribution of H isotopes (and 3He) in individual grains of materials relevant to TPBARs, H and He embrittlement of weapon components important to Tritium Sustainment Programs, issues with GTSs, batteries… Higher resolution would also benefit the field of materials science in general. To address these and many other issues, nm-scale lateral resolution is required. This LDRD demonstrated that neutral H atoms could be recoiled through a thin film by 70 keV electrons and detected with a Channeltron electron multiplier (CEM). The electrons were steered away from the CEM by strong permanent magnets. This proved the feasibility that the high energy electrons from a transmission electron microscope (TEM) can potentially be used to recoil and subsequently detect (e-ERD), quantify and map the concentration of H and He isotopes with nm resolution. This discovery could lead to a TEM-based H/He-isotope nanoprobe with 1000x higher resolution than currently available.
The evolution of extreme precipitations in high resolution scenarios over France
NASA Astrophysics Data System (ADS)
Colin, J.; Déqué, M.; Somot, S.
2009-09-01
Over the past years, improving the modelling of extreme events and their variability at climatic time scales has become one of the challenging issues raised in the regional climate research field. This study shows the results of a high resolution (12 km) scenario run over France with the limited area model (LAM) ALADIN-Climat, regarding the representation of extreme precipitation. The runs were conducted in the framework of the ANR-SCAMPEI national project on high resolution scenarios over French mountains. As a first step, we attempt to quantify one of the uncertainties implied by the use of a LAM: the size of the area on which the model is run. In particular, we address the issue of whether a relatively small domain allows the model to develop its own small-scale processes. Indeed, high resolution scenarios cannot be run on large domains because of the computation time, so one needs to answer this preliminary question before producing and analyzing such scenarios. To do so, we worked in the framework of a "big brother" experiment. We performed a 23-year-long global simulation in present-day climate (1979-2001) with the ARPEGE-Climat GCM, at a resolution of approximately 50 km over Europe (stretched grid). This first simulation, named ARP50, constitutes the "big brother" reference of our experiment. It has been validated against the CRU climatology. Then we filtered the short waves (up to 200 km) from ARP50 in order to obtain the equivalent of coarse resolution lateral boundary conditions (LBC). We have carried out three ALADIN-Climat simulations at a 50 km resolution with these LBC, using different configurations of the model: FRA50, run over a small domain (2000 x 2000 km, centered over France); EUR50, run over a larger domain (5000 x 5000 km, also centered over France); and EUR50-SN, run over the large domain using spectral nudging.
Considering the facts that the ARPEGE-Climat and ALADIN-Climat models share the same physics and dynamics and that both regional and global simulations were run at the same resolution, ARP50 can be regarded as a reference with which FRA50, EUR50 and EUR50-SN should each be compared. After an analysis of the differences between the regional simulations and ARP50 in annual and seasonal mean, we focus on the representation of rainfall extremes, comparing two-dimensional fields of various indices inspired by STARDEX as well as quantile-quantile plots. The results show a good agreement with the ARP50 reference for all three regional simulations, and few differences are found between them. This result indicates that the use of small domains is not significantly detrimental to the modelling of extreme precipitation events. It also shows that the spectral nudging technique has no detrimental effect on the extreme precipitation. Therefore, high resolution scenarios performed on a relatively small domain, such as the ones run for SCAMPEI, can be regarded as good tools to explore the possible evolution of precipitation extremes in the future climate. Preliminary results on the response of precipitation extremes over South-East France are given.
NASA Astrophysics Data System (ADS)
Grant, G.; Gallaher, D. W.
2017-12-01
New methods for processing massive remotely sensed datasets are used to evaluate Antarctic land surface temperature (LST) extremes. Data from the MODIS/Terra sensor (Collection 6) provides a twice-daily look at Antarctic LSTs over a 17-year period, at a higher spatiotemporal resolution than past studies. Using a data condensation process that creates databases of anomalous values, our processes create statistical images of Antarctic LSTs. In general, the results find few significant trends in extremes; however, they do reveal a puzzling picture of inconsistent cloud detection and possible systemic errors, perhaps due to viewing geometry. Cloud discrimination shows a distinct jump in clear-sky detections starting in 2011, and LSTs around the South Pole exhibit a circular cooling pattern, which may also be related to cloud contamination. Possible root causes are discussed. Ongoing investigations seek to determine whether the results are a natural phenomenon or, as seems likely, the results of sensor degradation or processing artefacts. If the unusual LST patterns or cloud detection discontinuities are natural, they point to new, interesting processes on the Antarctic continent. If the data artefacts are artificial, MODIS LST users should be alerted to the potential issues.
ERIC Educational Resources Information Center
Yahaya, Jamil Mikhail; Nurulazam, Ahmad; Karpudewan, Mageswary
2016-01-01
A socioscientific issues-integrated instruction was used in the study to improve college students' attitudes towards sexually themed science content. Some 200 college students participated in the study as experimental and control groups. The former, consisting of 98 students from one college, was taught the content using the socioscientific issues…
Adoption and Assisted Reproduction. Adoption and Ethics, Volume 4.
ERIC Educational Resources Information Center
Freundlich, Madelyn
The controversies in adoption have extended across a spectrum of policy and practice issues, and although the issues have become clear, resolution has not been achieved nor has consensus developed regarding a framework on which to improve the quality of adoption policy and practice. This book is the fourth in a series to use an ethics-based…
The Market Forces in Adoption. Adoption and Ethics, Volume 2.
ERIC Educational Resources Information Center
Freundlich, Madelyn
The controversies in adoption have extended across a spectrum of policy and practice issues, and although the issues have become clear, resolution has not been achieved nor has consensus developed regarding a framework on which to improve the quality of adoption policy and practice. This book is the second in a series to use an ethics-based…
Technology-design-manufacturing co-optimization for advanced mobile SoCs
NASA Astrophysics Data System (ADS)
Yang, Da; Gan, Chock; Chidambaram, P. R.; Nallapadi, Giri; Zhu, John; Song, S. C.; Xu, Jeff; Yeap, Geoffrey
2014-03-01
How to maintain Moore's Law scaling beyond the 193 nm immersion lithography resolution limit is the key question the semiconductor industry needs to answer in the near future. Process complexity will undoubtedly increase at the 14nm node and beyond, which brings both challenges and opportunities for technology development. A vertically integrated design-technology-manufacturing co-optimization flow is desired to better address the complicated issues new process changes bring. In recent years smart mobile wireless devices have been the fastest growing consumer electronics market. Advanced mobile devices such as smartphones are complex systems with the overriding objective of providing the best user-experience value by harnessing all the technology innovations. The most critical system drivers are better system performance/power efficiency, cost effectiveness, and smaller form factors, which, in turn, drive the need for system design and solutions with More-than-Moore innovations. Mobile system-on-chips (SoCs) have become the leading driver for semiconductor technology definition and manufacturing. Here we highlight how the co-optimization strategy influenced architecture, device/circuit, process technology and package, in the face of growing process cost/complexity and variability as well as design rule restrictions.
NASA Astrophysics Data System (ADS)
Ikeshima, D.; Yamazaki, D.; Yoshikawa, S.; Kanae, S.
2015-12-01
The specification of worldwide water body distribution is important for understanding the hydrological cycle. The Global 3-second Water Body Map (G3WBM) is a global-scale map that indicates the distribution of water bodies at 90 m resolution (http://hydro.iis.u-tokyo.ac.jp/~yamadai/G3WBM/index.html). This dataset was mainly built to identify the width of river channels, which is one of the major uncertainties of continental-scale river hydrodynamics models. To survey the true width of the river channel, this water body map distinguishes permanent water bodies from temporary water bodies, which means separating the river channel from the flood plain. However, narrower rivers, which are the majority of rivers, could not be observed in this map. To overcome this problem, the goal of this research is to update the algorithm of G3WBM and enhance the resolution to 30 m. Although this 30 m-resolution water body map uses a similar algorithm to G3WBM, there are many technical issues attributable to the relatively high resolution, such as the lack of a matching high-resolution digital elevation map, the contamination of satellite images by sub-pixel-scale objects, and the invisibility of well-vegetated water bodies such as swamps. To manage those issues, this research used more than 30,000 satellite images from the Landsat Global Land Survey (GLS) and the recently distributed Shuttle Radar Topography Mission (SRTM) 1 arc-second (30 m) digital elevation map. The effect of aerosol, which scatters the sun's reflectance and disturbs the acquired image, was also considered. With these revisions, the global water body distribution was established at a more precise resolution.
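The permanent/temporary distinction can be sketched as a per-pixel water-occurrence frequency computed over a stack of cloud-free observations. The thresholds below are illustrative assumptions, not the values used in G3WBM or its 30 m successor:

```python
import numpy as np

def classify_water(water_masks, valid_masks, perm_thresh=0.9, temp_thresh=0.1):
    """Classify pixels from a stack of per-scene water detections.
    water_masks, valid_masks: (n_scenes, H, W) booleans, where valid means a
    usable (e.g. cloud-free) observation. Returns an (H, W) map with
    0 = land, 1 = temporary water, 2 = permanent water."""
    obs = valid_masks.sum(axis=0)                          # observations per pixel
    wet = (water_masks & valid_masks).sum(axis=0)          # water detections
    freq = np.divide(wet, obs, out=np.zeros(obs.shape, float), where=obs > 0)
    out = np.zeros(freq.shape, np.uint8)
    out[freq >= temp_thresh] = 1                           # occasionally wet
    out[freq >= perm_thresh] = 2                           # almost always wet
    return out
```

Separating the two classes this way is what lets a river channel (permanent) be measured independently of its flood plain (temporary).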
Foreword to the theme issue on geospatial computer vision
NASA Astrophysics Data System (ADS)
Wegner, Jan Dirk; Tuia, Devis; Yang, Michael; Mallet, Clement
2018-06-01
Geospatial Computer Vision has become one of the most prevalent emerging fields of investigation in Earth Observation in the last few years. In this theme issue, we aim at showcasing a number of works at the interface between remote sensing, photogrammetry, image processing, computer vision and machine learning. In light of recent sensor developments - both from the ground and from above - an unprecedented (and ever growing) quantity of geospatial data is available for tackling challenging and urgent tasks such as environmental monitoring (deforestation, carbon sequestration, climate change mitigation), disaster management, autonomous driving or the monitoring of conflicts. The new bottleneck for serving these applications is the extraction of relevant information from such large amounts of multimodal data. This includes sources stemming from multiple sensors that differ in physical nature, quality, and spatial, spectral and temporal resolution. They are as diverse as multi-/hyperspectral satellite sensors, color cameras on drones, laser scanning devices, existing open land-cover geodatabases and social media. Such core data processing is mandatory so as to generate semantic land-cover maps, accurate detection and trajectories of objects of interest, as well as by-products of superior added value: georeferenced data, images with enhanced geometric and radiometric qualities, or Digital Surface and Elevation Models.
Sizing up human health through remote sensing: uses and misuses.
Herbreteau, V; Salem, G; Souris, M; Hugot, J P; Gonzalez, J P
2005-03-01
Following the launch of new satellites, remote sensing (RS) has been increasingly applied in human health research for thirty years, with a growing availability of images at higher resolutions and wider spectral ranges. However, the scope of applications, beyond large theoretical potential, appears limited both by their technical nature and by the models developed. An exhaustive review of RS applications in human health highlights the real implementation thus far regarding the diversity and range of health issues, remotely sensed data, processes and interpretations. The place of RS is far below its expected potential, revealing fundamental barriers to its implementation for health applications. The selection of images is driven by practical considerations as trivial as price and availability, which are often not relevant to addressing health questions that require suitable resolutions and spatio-temporal range. The relationships between environmental variables derived from RS and geospatial data from other sources for health investigations are poorly addressed and usually simplified. A discussion covering the potential of RS for human health is developed here to assist health scientists in dealing with the spatial and temporal dynamics of health by finding the most relevant data and analysis procedures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, B.D.
The objective of this project is to advance lower cost solar cooling technology with the feasibility analysis, design and evaluation of proof-of-concept open cycle solar cooling concepts. The work is divided into three phases, with planned completion of each phase before proceeding with the following phase: Phase I - performance/economic/environmental related analysis and exploratory studies; Phase II - design and construction of an experimental system, including evaluative testing; Phase III - extended system testing during operation and engineering modifications as required. For Phase I, analysis and resolution of critical issues were completed with the objective of developing design specifications for an improved prototype OCA system.
Will embryonic stem cells change health policy?
Sage, William M
2010-01-01
Embryonic stem cells are actively debated in political and public policy arenas. However, the connections between stem cell innovation and overall health care policy are seldom elucidated. As with many controversial aspects of medical care, the stem cell debate bridges to a variety of social conversations beyond abortion. Some issues, such as translational medicine, commercialization, patient and public safety, health care spending, physician practice, and access to insurance and health care services, are core health policy concerns. Other issues, such as economic development, technologic progress, fiscal politics, and tort reform, are only indirectly related to the health care system but are frequently seen through a health care lens. These connections will help determine whether the stem cell debate reaches a resolution, and what that resolution might be.
NASA Astrophysics Data System (ADS)
Staple, Bevan; Earhart, R. P.; Slaymaker, Philip A.; Drouillard, Thomas F., II; Mahony, Thomas
2005-05-01
3D imaging LADARs have emerged as the key technology for producing high-resolution imagery of targets in three dimensions (X and Y spatial, and Z in the range/depth dimension). Ball Aerospace & Technologies Corp. continues to make significant investments in this technology to enable critical NASA, Department of Defense, and national security missions. As a consequence of rapid technology developments, two issues have emerged that need resolution. First, the terminology used to rate LADAR performance (e.g., range resolution) is inconsistently defined, is improperly used, and thus has become misleading. Second, the terminology does not include a metric of the system's ability to resolve the 3D depth features of targets. These two issues create confusion when translating customer requirements into hardware. This paper presents a candidate framework for addressing these issues. To address the consistency issue, the framework utilizes only those terminologies proposed and tested by leading LADAR research and standards institutions. We also provide suggestions for strengthening these definitions by linking them to the well-known Rayleigh criterion extended into the range dimension. To address the inadequate 3D image quality metrics, the framework introduces the concept of a Range/Depth Modulation Transfer Function (RMTF). The RMTF measures the impact of the spatial frequencies of a 3D target on its measured modulation in range/depth. It is determined using a new, Range-Based, Slanted Knife-Edge test. We present simulated results for two LADAR pulse detection techniques and compare them to a baseline centroid technique. Consistency in terminology plus a 3D image quality metric enable improved system standardization.
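The slanted knife-edge test named above follows the same logic as the classic spatial-domain slanted-edge MTF measurement: differentiate the edge spread function to get a line spread function, then take its normalized Fourier magnitude. A minimal sketch of that generic computation (not the paper's range-based implementation; the synthetic edge is illustrative):

```python
import numpy as np

def edge_mtf(esf):
    """Estimate an MTF from a 1-D edge spread function (ESF):
    differentiate to get the line spread function (LSF), taper,
    then take the normalized Fourier magnitude."""
    esf = np.asarray(esf, dtype=float)
    lsf = np.gradient(esf)             # line spread function
    lsf = lsf * np.hanning(lsf.size)   # window to limit spectral leakage
    mtf = np.abs(np.fft.rfft(lsf))
    mtf = mtf / mtf[0]                 # normalize to the DC response
    freqs = np.fft.rfftfreq(lsf.size)  # cycles per sample
    return freqs, mtf

# Synthetic soft knife edge (dark -> bright transition)
x = np.linspace(-5.0, 5.0, 101)
esf = 0.5 * (1.0 + np.tanh(x))
freqs, mtf = edge_mtf(esf)  # mtf starts at 1.0 and decays toward Nyquist
```

In the range dimension, the same recipe would be applied to depth samples measured across a slanted physical edge rather than to pixel intensities.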
Interpersonal conflict: strategies and guidelines for resolution.
Wolfe, D E; Bushardt, S C
1985-02-01
Historically, management theorists have recommended the avoidance or suppression of conflict. Modern management theorists recognize interpersonal conflict as an inevitable byproduct of growth and change. The issue is no longer avoidance of conflict but the strategy by which conflict is resolved. Various strategies of conflict resolution and the consequences of each are discussed in this article, along with guidelines for the effective use of confrontation strategy.
Machine Learning Based Single-Frame Super-Resolution Processing for Lensless Blood Cell Counting
Huang, Xiwei; Jiang, Yu; Liu, Xu; Xu, Hang; Han, Zhi; Rong, Hailong; Yang, Haiping; Yan, Mei; Yu, Hao
2016-01-01
A lensless blood cell counting system integrating a microfluidic channel and a complementary metal oxide semiconductor (CMOS) image sensor is a promising technique for miniaturizing conventional optical-lens-based imaging systems for point-of-care testing (POCT). However, such a system has limited resolution, making it imperative to improve resolution at the system level using super-resolution (SR) processing. Yet how to improve resolution towards better cell detection and recognition, at low cost in processing resources and without degrading system throughput, remains a challenge. In this article, two machine learning based single-frame SR processing approaches are proposed and compared for lensless blood cell counting: Extreme Learning Machine based SR (ELMSR) and Convolutional Neural Network based SR (CNNSR). Moreover, lensless blood cell counting prototypes using commercial CMOS image sensors and custom-designed backside-illuminated CMOS image sensors are demonstrated with ELMSR and CNNSR. Given a captured low-resolution lensless cell image as input, the system outputs an improved high-resolution cell image. The experimental results show that cell resolution is improved by 4×, with CNNSR yielding a 9.5% improvement over ELMSR in resolution enhancement. The cell counting results also match well with a commercial flow cytometer. ELMSR and CNNSR therefore have the potential for efficient resolution improvement in lensless blood cell counting systems towards POCT applications. PMID:27827837
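An extreme learning machine maps low-resolution patches to high-resolution patches through a fixed random hidden layer, with only the output weights solved in closed form. A hedged sketch of that general idea (the toy linear data and all parameter values are assumptions, not the authors' setup):

```python
import numpy as np

rng = np.random.default_rng(0)

def train_elm_sr(lr_patches, hr_patches, hidden=256, reg=1e-3):
    """Fit an ELM mapping from flattened LR patches to HR patches.
    Hidden-layer weights are random and fixed; only the output
    weights are solved, via ridge-regularized least squares."""
    d_lr = lr_patches.shape[1]
    w_in = 0.1 * rng.standard_normal((d_lr, hidden))  # fixed random input weights
    b = 0.1 * rng.standard_normal(hidden)
    h = np.tanh(lr_patches @ w_in + b)                # random feature map
    # Ridge solution: beta = (H^T H + reg*I)^-1 H^T Y
    beta = np.linalg.solve(h.T @ h + reg * np.eye(hidden), h.T @ hr_patches)
    return w_in, b, beta

def apply_elm_sr(lr_patches, w_in, b, beta):
    return np.tanh(lr_patches @ w_in + b) @ beta

# Toy data: HR patches related to LR patches by a fixed linear map
n, d_lr, d_hr = 500, 16, 64
lr = rng.random((n, d_lr))
true_map = rng.random((d_lr, d_hr))   # illustrative LR -> HR relation
hr = lr @ true_map
w_in, b, beta = train_elm_sr(lr, hr)
pred = apply_elm_sr(lr, w_in, b, beta)
mse = float(np.mean((pred - hr) ** 2))  # small training error on the toy data
```

Because training reduces to one linear solve, ELM-style SR is attractive where processing resources are tight, which is the trade-off the abstract highlights against the heavier CNN approach.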
Development of Time-Series Human Settlement Mapping System Using Historical Landsat Archive
NASA Astrophysics Data System (ADS)
Miyazaki, H.; Nagai, M.; Shibasaki, R.
2016-06-01
A methodology for automated human settlement mapping is much needed to exploit historical satellite data archives for urgent issues of global urban growth, such as disaster risk management, public health, food security, and urban management. While global datasets with spatial resolutions of 10-100 m have been produced by initiatives using ASTER, Landsat, and TerraSAR-X, the next goal is time-series data that can support studies of urban development in its socioeconomic context, including disaster risk management, public health, transport, and other development issues. We developed an automated algorithm that detects human settlement by classifying built-up and non-built-up areas in time-series Landsat images. A machine learning algorithm, Learning with Local and Global Consistency (LLGC), was applied with improvements for remote sensing data. The algorithm can use MCD12Q1, a MODIS-based global land cover map with 500-m resolution, as training data, so that no manual preparation of training data is required. In addition, we designed the method to composite multiple LLGC results into a single output to reduce uncertainty. Each LLGC result carries a confidence value ranging from 0.0 to 1.0, representing the probability of built-up or non-built-up. The median confidence over a period around a target time is expected to be a robust indicator of built-up or non-built-up areas against uncertainties in satellite data quality, such as cloud and haze contamination. Four Landsat scenes for each target year, 1990, 2000, 2005, and 2010, were chosen from the Landsat archive among data with less than 20% cloud contamination. We implemented the algorithms as a system on the Data Integration and Analysis System (DIAS) at the University of Tokyo and processed 5,200 Landsat scenes covering cities with more than one million people worldwide.
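The compositing step described above reduces per-scene noise by taking, per pixel, the median confidence over the scenes around the target year and thresholding it. A minimal sketch with hypothetical confidence maps:

```python
import numpy as np

# Hypothetical LLGC confidence maps (probability of built-up, 0.0-1.0)
# from four Landsat scenes around a target year; shape (scenes, rows, cols).
conf = np.stack([
    [[0.90, 0.20], [0.60, 0.10]],
    [[0.80, 0.30], [0.40, 0.00]],
    [[0.70, 0.10], [0.90, 0.20]],
    [[0.95, 0.40], [0.50, 0.30]],
])

median_conf = np.median(conf, axis=0)  # robust against cloudy/hazy scenes
built_up = median_conf >= 0.5          # final built-up mask
print(built_up)
# [[ True False]
#  [ True False]]
```

A single cloud-contaminated scene cannot flip the median, which is why the median is preferred here over a per-pixel mean.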
Dai, Ru H.; Chen, Hsueh-Chih; Chan, Yu C.; Wu, Ching-Lin; Li, Ping; Cho, Shu L.; Hu, Jon-Fan
2017-01-01
It is well accepted that humor comprehension involves incongruity detection and resolution, which then induce a feeling of amusement. However, this three-stage model of humor processing does not apply to absurd humor (so-called nonsense humor). Absurd humor contains an unresolvable incongruity but can still induce a feeling of mirth. In this study, we used functional magnetic resonance imaging (fMRI) to identify the neural mechanisms of absurd humor. Specifically, we aimed to investigate the neural substrates associated with the complete resolution of incongruity-resolution humor and the partial resolution of absurd humor. Based on the fMRI data, we propose a dual-path model of incongruity-resolution and absurd verbal humor. According to this model, the detection and resolution of the incongruity in incongruity-resolution humor activate the temporo-parietal junction (TPJ), implicated in the integration of multiple sources of information, and the precuneus, likely involved in perspective taking. The appreciation of incongruity-resolution humor activates regions including the posterior cingulate cortex (PCC), implicated in autobiographical or event memory retrieval, and the parahippocampal gyrus (PHG), reflecting the feeling of amusement. By contrast, the partial resolution of absurd humor elicits greater activation in the fusiform gyrus, which has been implicated in word processing, the inferior frontal gyrus (IFG) for incongruity resolution, and the superior temporal gyrus (STG) for pragmatic awareness. PMID:28484402
ISS Hygiene Activities - Issues and Resolutions
NASA Technical Reports Server (NTRS)
Prokhorov, Kimberlee S.; Feldman, Brienne; Walker, Stephanie; Bruce, Rebekah
2009-01-01
Hygiene is something that is usually taken for granted by those of us on Earth. The ability to perform hygiene satisfactorily during long-duration space flight is crucial to crew function. Besides preserving the basic health of the crew, crew members have expressed that the ability to clean up on-orbit is vital for mental health. Providing this functionality involves more than supplying hygiene items such as soap and toothpaste. On the International Space Station (ISS), the details of where and how to perform hygiene were left to the crew's discretion for the first seventeen increments. Without clear guidance, the methods implemented on-orbit have resulted in some unintended consequences for the ISS environment. This paper will outline the issues encountered regarding hygiene activities on-board the ISS and the lessons that have been learned in addressing those issues. Additionally, the paper will address the resolutions that have been put into place to protect the ISS environment while providing the crew sufficient means to perform hygiene.
Considerations for pattern placement error correction toward 5nm node
NASA Astrophysics Data System (ADS)
Yaegashi, Hidetami; Oyama, Kenichi; Hara, Arisa; Natori, Sakurako; Yamauchi, Shohei; Yamato, Masatoshi; Koike, Kyohei; Maslow, Mark John; Timoshkov, Vadim; Kiers, Ton; Di Lorenzo, Paolo; Fonseca, Carlos
2017-03-01
Multi-patterning has been widely adopted in high-volume manufacturing as an extension of 193-nm immersion lithography, and it has become a realistic solution for nanometer-order scaling. It is a key technology for single-directional (1D) layout design [1] in logic devices and a major option for further scaling in SAQP. The requirements for patterning fidelity control are becoming ever more severe, and stochastic fluctuation as well as line edge roughness (LER) must be observed at the microscopic scale. In our previous work, atomic-order controllability proved viable through a complementary technique of etching and deposition [2]. Overlay issues form a major portion of yield management; therefore, a complete solution is keenly needed, including alignment accuracy on the scanner and detectability on overlay measurement instruments. As edge placement error (EPE) is defined as the gap between the designed pattern and the contour of the actual pattern edge, pattern registration at each single process level must be considered. Complementary patterning to fabricate 1D layouts does mitigate process restrictions, but the multiple process steps, typified by LELE with 193-i, burden yield management and affordability. Recent progress in EUV technology is remarkable, and it is a major potential solution to these complicated technical issues. EUV has a robust resolution limit and should be a strong scaling driver for process simplification. On the other hand, its stochastic variation, such as shot noise due to limited light source power, must be resolved with additional complementary techniques. In this work, we examined nanometer-order CD and profile control of EUV resist patterns and introduce excellent results.
HESS Opinions: The need for process-based evaluation of large-domain hyper-resolution models
NASA Astrophysics Data System (ADS)
Melsen, Lieke A.; Teuling, Adriaan J.; Torfs, Paul J. J. F.; Uijlenhoet, Remko; Mizukami, Naoki; Clark, Martyn P.
2016-03-01
A meta-analysis on 192 peer-reviewed articles reporting on applications of the variable infiltration capacity (VIC) model in a distributed way reveals that the spatial resolution at which the model is applied has increased over the years, while the calibration and validation time interval has remained unchanged. We argue that the calibration and validation time interval should keep pace with the increase in spatial resolution in order to resolve the processes that are relevant at the applied spatial resolution. We identified six time concepts in hydrological models, which all impact the model results and conclusions. Process-based model evaluation is particularly relevant when models are applied at hyper-resolution, where stakeholders expect credible results both at a high spatial and temporal resolution.
HESS Opinions: The need for process-based evaluation of large-domain hyper-resolution models
NASA Astrophysics Data System (ADS)
Melsen, L. A.; Teuling, A. J.; Torfs, P. J. J. F.; Uijlenhoet, R.; Mizukami, N.; Clark, M. P.
2015-12-01
A meta-analysis on 192 peer-reviewed articles reporting applications of the Variable Infiltration Capacity (VIC) model in a distributed way reveals that the spatial resolution at which the model is applied has increased over the years, while the calibration and validation time interval has remained unchanged. We argue that the calibration and validation time interval should keep pace with the increase in spatial resolution in order to resolve the processes that are relevant at the applied spatial resolution. We identified six time concepts in hydrological models, which all impact the model results and conclusions. Process-based model evaluation is particularly relevant when models are applied at hyper-resolution, where stakeholders expect credible results both at a high spatial and temporal resolution.
Topography and Landforms of Ecuador
Chirico, Peter G.; Warner, Michael B.
2005-01-01
EXPLANATION The digital elevation model of Ecuador represented in this data set was produced from over 40 individual tiles of elevation data from the Shuttle Radar Topography Mission (SRTM). Each tile was downloaded, converted from its native Height file format (.hgt), and imported into a geographic information system (GIS) for additional processing. Processing of the data included data gap filling, mosaicking, and re-projection of the tiles to form one single seamless digital elevation model. For 11 days in February of 2000, NASA, the National Geospatial-Intelligence Agency (NGA), the German Aerospace Center (DLR), and the Italian Space Agency (ASI) flew X-band and C-band radar interferometry onboard the Space Shuttle Endeavour. The mission covered the Earth between 60°N and 57°S and will provide interferometric digital elevation models (DEMs) of approximately 80% of the Earth's land mass when processing is complete. The radar-pointing angle was approximately 55° at scene center. Ascending and descending orbital passes generated multiple interferometric data scenes for nearly all areas. Up to eight passes of data were merged to form the final processed SRTM DEMs. The effect of merging scenes averages elevation values recorded in coincident scenes and reduces, but does not completely eliminate, the amount of area with layover and terrain shadow effects. The most significant form of data processing for the Ecuador DEM was gap-filling areas where the SRTM data contained a data void. These void areas are a result of radar shadow, layover, standing water, and other effects of terrain, as well as technical radar interferometry phase unwrapping issues. To fill these gaps, topographic contours were digitized from 1:50,000-scale topographic maps which date from the mid-to-late 1980s (Souris, 2001). Digital contours were gridded to form elevation models for void areas and subsequently were merged with the SRTM data through GIS and remote sensing image-processing techniques.
The data contained in this publication includes a gap-filled, countrywide SRTM DEM of Ecuador projected in Universal Transverse Mercator (UTM) Zone 17 North projection, Provisional South American, 1956, Ecuador datum and a non-gap-filled SRTM DEM of the Galapagos Islands projected in UTM Zone 15 North projection. Both the Ecuador and Galapagos Islands DEMs are available as an ESRI Grid, stored as ArcInfo Export files (.e00), and in Erdas Imagine (IMG) file formats with a 90-meter pixel resolution. Also included in this publication are high- and low-resolution Adobe Acrobat (PDF) files of topography and landforms maps of Ecuador. The high-resolution map should be used for printing and display, while the lower-resolution map can be used for quick viewing and reference purposes.
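The void-filling merge described above amounts to substituting contour-derived elevations wherever the SRTM grid contains nodata. A minimal sketch (the nodata marker and array values are illustrative assumptions):

```python
import numpy as np

NODATA = -32768  # assumed void marker; actual SRTM distributions vary

# Toy 3x3 SRTM tile with two voids (elevations in meters, illustrative)
srtm = np.array([[1200, 1210, NODATA],
                 [1195, NODATA, 1230],
                 [1190, 1205, 1225]])

# Elevation grid interpolated from the digitized 1:50,000 contours
contour_dem = np.array([[1198, 1212, 1222],
                        [1196, 1215, 1228],
                        [1191, 1204, 1226]])

# Merge: keep SRTM where valid, fall back to contour elevations in voids
filled = np.where(srtm == NODATA, contour_dem, srtm)
print(filled)
# [[1200 1210 1222]
#  [1195 1215 1230]
#  [1190 1205 1225]]
```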
Negotiation From a Near and Distant Time Perspective
Henderson, Marlone D.; Trope, Yaacov; Carnevale, Peter J.
2011-01-01
Across 3 experiments, the authors examined the effects of temporal distance on negotiation behavior. They found that greater temporal distance from negotiation decreased preference for piecemeal, single-issue consideration over integrative, multi-issue consideration (Experiment 1). They also found that greater temporal distance from an event being negotiated increased interest in conceding on the lowest priority issue and decreased interest in conceding on the highest priority issue (Experiment 2). Lastly, they found increased temporal distance from an event being negotiated produced a greater proportion of multi-issue offers, a greater likelihood of conceding on the lowest priority issue in exchange for a concession on the highest priority issue, and greater individual and joint outcomes (Experiment 3). Implications for conflict resolution and construal level theory are discussed. PMID:17014295
NCAR Earth Observing Laboratory's Data Tracking System
NASA Astrophysics Data System (ADS)
Cully, L. E.; Williams, S. F.
2014-12-01
The NCAR Earth Observing Laboratory (EOL) maintains an extensive collection of complex, multi-disciplinary datasets from national and international, current and historical projects accessible through field project web pages (https://www.eol.ucar.edu/all-field-projects-and-deployments). Data orders are processed through the EOL Metadata Database and Cyberinfrastructure (EMDAC) system. Behind the scenes is the institutionally created EOL Computing, Data, and Software/Data Management Group (CDS/DMG) Data Tracking System (DTS) tool. The DTS is used to track the complete life cycle (from ingest to long term stewardship) of the data, metadata, and provenance for hundreds of projects and thousands of data sets. The DTS is an EOL internal only tool which consists of three subsystems: Data Loading Notes (DLN), Processing Inventory Tool (IVEN), and Project Metrics (STATS). The DLN is used to track and maintain every dataset that comes to the CDS/DMG. The DLN captures general information such as title, physical locations, responsible parties, high level issues, and correspondence. When the CDS/DMG processes a data set, IVEN is used to track the processing status while collecting sufficient information to ensure reproducibility. This includes detailed "How To" documentation, processing software (with direct links to the EOL Subversion software repository), and descriptions of issues and resolutions. The STATS subsystem generates current project metrics such as archive size, data set order counts, "Top 10" most ordered data sets, and general information on who has ordered these data. The DTS was developed over many years to meet the specific needs of the CDS/DMG, and it has been successfully used to coordinate field project data management efforts for the past 15 years. This paper will describe the EOL CDS/DMG Data Tracking System including its basic functionality, the provenance maintained within the system, lessons learned, potential improvements, and future developments.
The Benefits of Executive Control Training and the Implications for Language Processing
Hussey, Erika K.; Novick, Jared M.
2012-01-01
Recent psycholinguistics research suggests that the executive function (EF) skill known as conflict resolution – the ability to adjust behavior in the service of resolving conflicts among incompatible representations – is important for several language processing tasks such as lexical and syntactic ambiguity resolution, verbal fluency, and common-ground assessment. Here, we discuss work showing that various EF skills can be enhanced through consistent practice with working-memory tasks that tap these EFs, and, moreover, that improvements on the training tasks transfer across domains to novel tasks that may rely on shared underlying EFs. These findings have implications for language processing and could launch new research exploring whether EF training, within a “process-specific” framework, could be used as a remediation tool for improving general language use. Indeed, work in our lab demonstrates that EF training that increases conflict-resolution processing has selective benefits on an untrained sentence-processing task requiring syntactic ambiguity resolution, which relies on shared conflict-resolution functions. Given claims that conflict-resolution abilities contribute to a range of linguistic skills, EF training targeting this process could theoretically yield wider performance gains beyond garden-path recovery. We offer some hypotheses on the potential benefits of EF training as a component of interventions to mitigate general difficulties in language processing. However, there are caveats to consider as well, which we also address. PMID:22661962
The Impact of Adoption on Members of the Triad. Adoption and Ethics, Volume 3.
ERIC Educational Resources Information Center
Freundlich, Madelyn
The controversies in adoption have extended across a spectrum of policy and practice issues, and although the issues have become clear, resolution has not been achieved nor has consensus developed regarding a framework on which to improve the quality of adoption policy and practice. This book is the third in a series to use an ethics-based…
Public involvement and consensus building in the Verde River Watershed in central Arizona
Tom Bonomo
1996-01-01
Currently an organization called the Verde Watershed Association is the central point for consensus building and public involvement in water issues in the Verde River Watershed. The association is the outgrowth of efforts toward the resolution of watershed issues without passing new laws, initiating regulations, or entering the win-lose arena of litigation. The...
Complementarity of PALM and SOFI for super-resolution live-cell imaging of focal adhesions
Deschout, Hendrik; Lukes, Tomas; Sharipov, Azat; Szlag, Daniel; Feletti, Lely; Vandenberg, Wim; Dedecker, Peter; Hofkens, Johan; Leutenegger, Marcel; Lasser, Theo; Radenovic, Aleksandra
2016-01-01
Live-cell imaging of focal adhesions requires a sufficiently high temporal resolution, which remains a challenge for super-resolution microscopy. Here we address this important issue by combining photoactivated localization microscopy (PALM) with super-resolution optical fluctuation imaging (SOFI). Using simulations and fixed-cell focal adhesion images, we investigate the complementarity between PALM and SOFI in terms of spatial and temporal resolution. This PALM-SOFI framework is used to image focal adhesions in living cells, while obtaining a temporal resolution below 10 s. We visualize the dynamics of focal adhesions, and reveal local mean velocities around 190 nm min−1. The complementarity of PALM and SOFI is assessed in detail with a methodology that integrates a resolution and signal-to-noise metric. This PALM and SOFI concept provides an enlarged quantitative imaging framework, allowing unprecedented functional exploration of focal adhesions through the estimation of molecular parameters such as fluorophore densities and photoactivation or photoswitching kinetics. PMID:27991512
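At its core, second-order SOFI forms an image from the temporal variance of each pixel's intensity fluctuations, so that independently blinking emitters are sharpened while static background cancels. A simplified sketch of that idea (real SOFI implementations also use cross-correlations and time lags):

```python
import numpy as np

def sofi2(stack):
    """Second-order SOFI value per pixel: variance of the temporal
    intensity fluctuations (second-order auto-cumulant at zero lag)."""
    fluct = stack - stack.mean(axis=0)  # remove each pixel's temporal mean
    return np.mean(fluct ** 2, axis=0)

# Toy stack: a single blinking emitter on a static background
rng = np.random.default_rng(1)
stack = np.full((100, 4, 4), 10.0)                     # 100 constant frames
stack[:, 2, 2] += 50.0 * rng.integers(0, 2, size=100)  # on/off blinking
img = sofi2(stack)  # background pixels -> 0, blinking pixel stands out
```

This is why SOFI trades temporal resolution for spatial resolution: enough frames must be collected for the fluctuation statistics to converge, which is the tension with live-cell imaging that the PALM-SOFI combination addresses.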
Complementarity of PALM and SOFI for super-resolution live-cell imaging of focal adhesions
NASA Astrophysics Data System (ADS)
Deschout, Hendrik; Lukes, Tomas; Sharipov, Azat; Szlag, Daniel; Feletti, Lely; Vandenberg, Wim; Dedecker, Peter; Hofkens, Johan; Leutenegger, Marcel; Lasser, Theo; Radenovic, Aleksandra
2016-12-01
Live-cell imaging of focal adhesions requires a sufficiently high temporal resolution, which remains a challenge for super-resolution microscopy. Here we address this important issue by combining photoactivated localization microscopy (PALM) with super-resolution optical fluctuation imaging (SOFI). Using simulations and fixed-cell focal adhesion images, we investigate the complementarity between PALM and SOFI in terms of spatial and temporal resolution. This PALM-SOFI framework is used to image focal adhesions in living cells, while obtaining a temporal resolution below 10 s. We visualize the dynamics of focal adhesions, and reveal local mean velocities around 190 nm min-1. The complementarity of PALM and SOFI is assessed in detail with a methodology that integrates a resolution and signal-to-noise metric. This PALM and SOFI concept provides an enlarged quantitative imaging framework, allowing unprecedented functional exploration of focal adhesions through the estimation of molecular parameters such as fluorophore densities and photoactivation or photoswitching kinetics.
Complementarity of PALM and SOFI for super-resolution live-cell imaging of focal adhesions.
Deschout, Hendrik; Lukes, Tomas; Sharipov, Azat; Szlag, Daniel; Feletti, Lely; Vandenberg, Wim; Dedecker, Peter; Hofkens, Johan; Leutenegger, Marcel; Lasser, Theo; Radenovic, Aleksandra
2016-12-19
Live-cell imaging of focal adhesions requires a sufficiently high temporal resolution, which remains a challenge for super-resolution microscopy. Here we address this important issue by combining photoactivated localization microscopy (PALM) with super-resolution optical fluctuation imaging (SOFI). Using simulations and fixed-cell focal adhesion images, we investigate the complementarity between PALM and SOFI in terms of spatial and temporal resolution. This PALM-SOFI framework is used to image focal adhesions in living cells, while obtaining a temporal resolution below 10 s. We visualize the dynamics of focal adhesions, and reveal local mean velocities around 190 nm min−1. The complementarity of PALM and SOFI is assessed in detail with a methodology that integrates a resolution and signal-to-noise metric. This PALM and SOFI concept provides an enlarged quantitative imaging framework, allowing unprecedented functional exploration of focal adhesions through the estimation of molecular parameters such as fluorophore densities and photoactivation or photoswitching kinetics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karpius, Peter Joseph; Myers, Steven Charles
This presentation is a part of the DHS LSS spectroscopy course and provides an overview of the following concepts: detector system components, intrinsic and absolute efficiency, resolution and linearity, and operational issues and limits.
Scientific Synergy between LSST and Euclid
NASA Astrophysics Data System (ADS)
Rhodes, Jason; Nichol, Robert C.; Aubourg, Éric; Bean, Rachel; Boutigny, Dominique; Bremer, Malcolm N.; Capak, Peter; Cardone, Vincenzo; Carry, Benoît; Conselice, Christopher J.; Connolly, Andrew J.; Cuillandre, Jean-Charles; Hatch, N. A.; Helou, George; Hemmati, Shoubaneh; Hildebrandt, Hendrik; Hložek, Renée; Jones, Lynne; Kahn, Steven; Kiessling, Alina; Kitching, Thomas; Lupton, Robert; Mandelbaum, Rachel; Markovic, Katarina; Marshall, Phil; Massey, Richard; Maughan, Ben J.; Melchior, Peter; Mellier, Yannick; Newman, Jeffrey A.; Robertson, Brant; Sauvage, Marc; Schrabback, Tim; Smith, Graham P.; Strauss, Michael A.; Taylor, Andy; Von Der Linden, Anja
2017-12-01
Euclid and the Large Synoptic Survey Telescope (LSST) are poised to dramatically change the astronomy landscape early in the next decade. The combination of high-cadence, deep, wide-field optical photometry from LSST with high-resolution, wide-field optical photometry, and near-infrared photometry and spectroscopy from Euclid will be powerful for addressing a wide range of astrophysical questions. We explore Euclid/LSST synergy, ignoring the political issues associated with data access to focus on the scientific, technical, and financial benefits of coordination. We focus primarily on dark energy cosmology, but also discuss galaxy evolution, transient objects, solar system science, and galaxy cluster studies. We concentrate on synergies that require coordination in cadence or survey overlap, or would benefit from pixel-level co-processing that is beyond the scope of what is currently planned, rather than scientific programs that could be accomplished only at the catalog level without coordination in data processing or survey strategies. We provide two quantitative examples of scientific synergies: the decrease in photo-z errors (benefiting many science cases) when high-resolution Euclid data are used for LSST photo-z determination, and the resulting increase in weak-lensing signal-to-noise ratio from smaller photo-z errors. We briefly discuss other areas of coordination, including high-performance computing resources and calibration data. Finally, we address concerns about the loss of independence and potential cross-checks between the two missions and the potential consequences of not collaborating.
Temporal consistent depth map upscaling for 3DTV
NASA Astrophysics Data System (ADS)
Schwarz, Sebastian; Sjöström, Mårten; Olsson, Roger
2014-03-01
The ongoing success of three-dimensional (3D) cinema fuels increasing efforts to spread the commercial success of 3D to new markets. The possibilities of a convincing 3D experience at home, such as three-dimensional television (3DTV), have generated a great deal of interest within the research and standardization community. A central issue for 3DTV is the creation and representation of 3D content. Acquiring scene depth information is a fundamental task in computer vision, yet complex and error-prone. Dedicated range sensors, such as the Time-of-Flight (ToF) camera, can simplify the scene depth capture process and overcome shortcomings of traditional solutions, such as active or passive stereo analysis. Admittedly, currently available ToF sensors deliver only a limited spatial resolution. However, sophisticated depth upscaling approaches use texture information to match depth and video resolution. At Electronic Imaging 2012 we proposed an upscaling routine based on error energy minimization, weighted with edge information from an accompanying video source. In this article we develop our algorithm further. By adding temporal consistency constraints to the upscaling process, we reduce disturbing depth jumps and flickering artifacts in the final 3DTV content. Temporal consistency in depth maps enhances the 3D experience, leading to a wider acceptance of 3D media content. More content in better quality can boost the commercial success of 3DTV.
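The authors' method minimizes an edge-weighted error energy with temporal constraints; as a stand-in illustration of the temporal-consistency idea alone, a simple exponential blend of successive depth maps already damps frame-to-frame flicker (the blend weight and toy frames are assumptions, not the paper's algorithm):

```python
import numpy as np

def temporally_smooth(depth_maps, alpha=0.7):
    """Exponentially blend each depth map with the previous smoothed
    frame; alpha is the weight of the current frame (1.0 = no smoothing)."""
    smoothed = [np.asarray(depth_maps[0], dtype=float)]
    for d in depth_maps[1:]:
        smoothed.append(alpha * np.asarray(d, dtype=float)
                        + (1.0 - alpha) * smoothed[-1])
    return smoothed

# A flickering pixel: upscaled depth alternates between 2.0 m and 2.4 m
frames = [np.full((2, 2), 2.0 + 0.4 * (i % 2)) for i in range(6)]
out = temporally_smooth(frames)
raw_range = max(f[0, 0] for f in frames) - min(f[0, 0] for f in frames)
out_range = max(f[0, 0] for f in out) - min(f[0, 0] for f in out)
# out_range < raw_range: the flicker amplitude is reduced
```

The cost of naive smoothing is lag at true depth edges over time, which is why the full method couples the temporal term with edge-aware spatial weighting rather than blending uniformly.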
Human machine interface by using stereo-based depth extraction
NASA Astrophysics Data System (ADS)
Liao, Chao-Kang; Wu, Chi-Hao; Lin, Hsueh-Yi; Chang, Ting-Ting; Lin, Tung-Yang; Huang, Po-Kuan
2014-03-01
BOREAS TE-18, 60-m, Radiometrically Rectified Landsat TM Imagery
NASA Technical Reports Server (NTRS)
Hall, Forrest G. (Editor); Knapp, David
2000-01-01
The BOREAS TE-18 team used a radiometric rectification process to produce standardized DN values for a series of Landsat TM images of the BOREAS SSA and NSA in order to compare images that were collected under different atmospheric conditions. The images for each study area were referenced to an image that had very clear atmospheric qualities. The reference image for the SSA was collected on 02-Sep-1994, while the reference image for the NSA was collected on 21-Jun-1995. The 23 rectified images cover the period of 07-Jul-1985 to 18-Sep-1994 in the SSA and 22-Jun-1984 to 09-Jun-1994 in the NSA. Each of the reference scenes had coincident atmospheric optical thickness measurements made by RSS-11. The radiometric rectification process is described in more detail by Hall et al. (1991). The original Landsat TM data were received from CCRS for use in the BOREAS project. Due to the nature of the radiometric rectification process and copyright issues, the full-resolution (30-m) images may not be publicly distributed. However, this spatially degraded 60-m resolution version of the images may be openly distributed and is available on the BOREAS CD-ROM series. After the radiometric rectification processing, the original data were degraded to a 60-m pixel size from the original 30-m pixel size by averaging the data over a 2- by 2-pixel window. The data are stored in binary image-format files. The data files are available on a CD-ROM (see document number 20010000884), or from the Oak Ridge National Laboratory (ORNL) Distributed Activity Archive Center (DAAC).
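The spatial degradation step described above (30-m pixels averaged over a 2-by-2 window to produce 60-m pixels) can be sketched as follows. This is a generic block-averaging sketch, not the TE-18 production code; the array `dn_30m` and its values are illustrative.

```python
import numpy as np

def degrade_2x2(dn_30m):
    """Average each non-overlapping 2x2 block, halving both dimensions
    (e.g., 30-m pixels -> 60-m pixels)."""
    rows, cols = dn_30m.shape
    # Trim odd edges so the image tiles exactly into 2x2 blocks.
    trimmed = dn_30m[: rows - rows % 2, : cols - cols % 2]
    return trimmed.reshape(trimmed.shape[0] // 2, 2,
                           trimmed.shape[1] // 2, 2).mean(axis=(1, 3))

# Illustrative 4x4 "30-m" DN grid; the result is a 2x2 "60-m" grid.
dn_30m = np.array([[10, 20, 30, 40],
                   [10, 20, 30, 40],
                   [50, 60, 70, 80],
                   [50, 60, 70, 80]], dtype=float)
dn_60m = degrade_2x2(dn_30m)
```

Each output pixel is the mean of a 2x2 input block, so radiometry is preserved on average while the pixel size doubles.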
An Investigation of Synthetic Aperture Radar Autofocus,
1985-04-01
uniform straight line motion of the aircraft. Unknown aircraft motions alter the matched filter required for processing. Autofocussing involves determi... REFERENCES; APPENDIX 1, RESOLUTION OF SAR; APPENDIX 2, AIRCRAFT MOTION TOLERANCE; APPENDIX 3, INITIAL RESOLUTION FOR FOLLOW-DOWN PROCESSING; APPENDIX 4, DEPENDENCE OF... Range resolution is achieved using on-board pulse-compression techniques, while azimuth processing is currently done at RSRE on a Marconi hardware
A new regional RADAR network for nowcasting applications: the RESMAR achievements
NASA Astrophysics Data System (ADS)
Antonini, Andrea; Melani, Samantha; Mazza, Alessandro; Ortolani, Alberto; Gozzini, Bernardo; Corongiu, Manuela; Cristofori, Simone
2013-04-01
Monitoring weather phenomena by radar plays an essential role in nowcasting applications. As one of the most useful sources of quantitative precipitation estimation, radar rainfall analysis can be a very useful research tool in support of rainfall forecasting methods. Short-term rainfall prediction is needed in various meteorological and hydrological applications where accurate prediction is essential, from national-service and civil-protection forecasting to agricultural and urban issues. The Tuscany region (central Italy) has recently been equipped with two X-band radars with a maximum range of 108 km, a beam width of 3° and a high spatial resolution (radial resolution up to 90 m), located at the Livorno and Cima del Monte (Elba island) sites. The first system is the property of Livorno's Port Authority; the second belongs to Consorzio LaMMA (Laboratory of Monitoring and Environmental Modelling for the sustainable development), which installed it within the activities of "RESMAR - Environmental Resources in the MARitime Space", a strategic project financed under the European Cross-Border Cooperation Programme Italy-France "Maritime" and coordinated by the Liguria Region Administration. Both systems are managed by LaMMA. The cross-border sharing of such relevant meteorological observation instruments, and the integration of these data with existing tools and methodologies, is intended to improve operational regional weather services in nowcasting activities and their impacts on the territory, such as those related to LaMMA's daily forecasting duties. This sharing is widely promoted within the RESMAR project among the partner regions (ARPA-Sardinia, Meteo-France and Liguria). The integration of these data with other complementary and ancillary measurements is also needed to increase the reliability and accuracy of radar measurements, in view of both a better understanding of meteorological phenomena and quantitative precipitation estimation.
The use of satellite data greatly improves the spatial and temporal information on precipitation events, filling the gaps left by an uneven data distribution; in this area LaMMA has many years of experience in the acquisition and processing of geostationary and polar satellite data. The regional rain-gauge network and meteorological stations will instead be used to obtain information for both calibration (such as the radar reflectivity-rain rate relationships) and validation. The radar system and its mosaicking will be presented, as well as some preliminary products.
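The reflectivity-rain rate calibration mentioned above is conventionally expressed as a power law Z = a·R^b between linear reflectivity Z and rain rate R (mm/h). The sketch below inverts that relation using the standard Marshall-Palmer defaults a = 200, b = 1.6 as an assumption; an operational network like the one described would fit a and b against its own rain-gauge data rather than use these textbook values.

```python
# Hedged sketch of a Z-R conversion (not the RESMAR calibration itself).
def rain_rate(dbz, a=200.0, b=1.6):
    """Convert radar reflectivity in dBZ to rain rate in mm/h via Z = a*R**b.

    a, b default to the classic Marshall-Palmer coefficients (assumed)."""
    z = 10.0 ** (dbz / 10.0)       # dBZ -> linear reflectivity (mm^6/m^3)
    return (z / a) ** (1.0 / b)

r_light = rain_rate(20.0)          # light rain, well under 1 mm/h
r_heavy = rain_rate(45.0)          # convective rain, tens of mm/h
```

Gauge-based calibration then amounts to adjusting a and b until rain_rate applied to the radar mosaic best matches the gauge accumulations.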
NASA Astrophysics Data System (ADS)
Hall-Brown, Mary
The heterogeneity of Arctic vegetation can make land cover classification very difficult when using medium- to small-resolution imagery (Schneider et al., 2009; Muller et al., 1999). Using high radiometric and spatial resolution imagery, such as that from the SPOT 5 and IKONOS satellites, has helped Arctic land cover classification accuracies rise into the 80 and 90 percentiles (Allard, 2003; Stine et al., 2010; Muller et al., 1999). However, those increases usually come at a high price: high resolution imagery is very expensive and can often add tens of thousands of dollars to the cost of the research. The EO-1 satellite launched in 2000 carries two sensors that have high spectral and/or high spatial resolutions and can be an acceptable compromise in the resolution-versus-cost trade-off. The Hyperion is a hyperspectral sensor capable of collecting 242 spectral bands of information. The Advanced Land Imager (ALI) is an advanced multispectral sensor whose spatial resolution can be sharpened to 10 meters. This dissertation compares the accuracies of Arctic land cover classifications produced by the Hyperion and ALI sensors to the classification accuracies produced by the Système Pour l'Observation de la Terre (SPOT), Landsat Thematic Mapper (TM) and Landsat Enhanced Thematic Mapper Plus (ETM+) sensors. Hyperion and ALI images from August 2004 were collected over the Upper Kuparuk River Basin, Alaska. Image processing included the stepwise discriminant analysis of pixels that were positively classified from coinciding ground control points, geometric and radiometric correction, and principal component analysis. Finally, stratified random sampling was used to perform accuracy assessments on the satellite-derived land cover classifications. Accuracy was estimated from an error matrix (confusion matrix) that provided the overall, producer's and user's accuracies.
This research found that while the Hyperion sensor produced classification accuracies equivalent to those of the TM and ETM+ sensors (approximately 78%), it could not match the accuracy of the SPOT 5 HRV sensor. However, the land cover classifications derived from the ALI sensor exceeded most classification accuracies derived from the TM and ETM+ sensors and were even comparable to most SPOT 5 HRV classifications (87%). With the deactivation of the Landsat series satellites, the uninterrupted monitoring of remote locations throughout the world, such as the Arctic, is in jeopardy. The utilization of the Hyperion and ALI sensors is a way to keep that endeavor operational. Keeping the ALI sensor active at all times would allow uninterrupted observation of the entire Earth, while keeping the Hyperion as a "tasked" sensor can provide scientists with additional imagery and options for their studies without overburdening storage.
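The overall, producer's, and user's accuracies mentioned in the assessment above are all derived from the error (confusion) matrix. A minimal sketch of that computation follows; the 3x3 matrix is illustrative, not the study's data, and the convention assumed here is rows = classified (map) labels, columns = reference labels.

```python
import numpy as np

def accuracies(error_matrix):
    """Overall, producer's, and user's accuracies from an error matrix.

    Rows are map (classified) labels, columns are reference labels
    (an assumed but common convention)."""
    m = np.asarray(error_matrix, dtype=float)
    diag = np.diag(m)                  # correctly classified counts
    overall = diag.sum() / m.sum()     # fraction correct over all samples
    producers = diag / m.sum(axis=0)   # per reference class (omission view)
    users = diag / m.sum(axis=1)       # per map class (commission view)
    return overall, producers, users

# Illustrative 3-class error matrix (150 samples total).
m = [[45, 3, 2],
     [4, 40, 6],
     [1, 7, 42]]
overall, producers, users = accuracies(m)
```

Producer's accuracy answers "how often was this ground class mapped correctly?", while user's accuracy answers "how often is a pixel mapped to this class actually that class?".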
NASA Technical Reports Server (NTRS)
Van Dongen, Hans P A.; Dinges, David F.
2003-01-01
The two-process model of sleep regulation has been applied successfully to describe, predict, and understand sleep-wake regulation in a variety of experimental protocols such as sleep deprivation and forced desynchrony. A non-linear interaction between the homeostatic and circadian processes was reported when the model was applied to describe alertness and performance data obtained during forced desynchrony. This non-linear interaction could also be due to intrinsic non-linearity in the metrics used to measure alertness and performance, however. Distinguishing these possibilities would be of theoretical interest, but could also have important implications for the design and interpretation of experiments placing sleep at different circadian phases or varying the duration of sleep and/or wakefulness. Although to date no resolution to this controversy has been found, here we show that the issue can be addressed with existing data sets. The interaction between the homeostatic and circadian processes of sleep-wake regulation was investigated using neurobehavioural performance data from a laboratory experiment involving total sleep deprivation. The results provided evidence of an actual non-linear interaction between the homeostatic and circadian processes of sleep-wake regulation for the prediction of waking neurobehavioural performance.
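The homeostatic and circadian processes discussed above can be sketched in the classic Borbély-style form: a homeostatic pressure S that builds exponentially with time awake, and a sinusoidal circadian process C, combined either additively or with the multiplicative S·C term whose existence the abstract investigates. All time constants, amplitudes, and phases below are illustrative assumptions, not fitted values from the study.

```python
import math

# Illustrative parameters (assumed, not fitted to any data set).
TAU_RISE = 18.2   # h: time constant of homeostatic build-up during wake
C_AMP = 0.12      # circadian amplitude
PHASE = 16.8      # h: clock time of the circadian peak

def homeostatic_S(hours_awake):
    """Homeostatic pressure S, rising from 0 toward 1 with time awake."""
    return 1.0 - math.exp(-hours_awake / TAU_RISE)

def circadian_C(clock_hour):
    """Sinusoidal circadian process C with a 24-h period."""
    return C_AMP * math.cos(2 * math.pi * (clock_hour - PHASE) / 24.0)

def predicted_impairment(hours_awake, clock_hour, interaction=0.0):
    """Additive S - C prediction; interaction > 0 adds the non-linear
    S*C term whose reality the abstract's analysis addresses."""
    s, c = homeostatic_S(hours_awake), circadian_C(clock_hour)
    return s - c + interaction * s * c

early = predicted_impairment(2.0, 10.0)   # rested, mid-morning
late = predicted_impairment(22.0, 6.0)    # sleep-deprived, circadian trough
```

Comparing fits with interaction = 0 against interaction > 0 on sleep-deprivation performance data is one way to frame the linear-versus-non-linear question the abstract raises.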
NASA Astrophysics Data System (ADS)
Dennison, Andrew G.
Classification of the seafloor substrate can be done with a variety of methods: visual (dives, drop cameras), mechanical (cores, grab samples), and acoustic (statistical analysis of echosounder returns). Acoustic methods offer a more powerful and efficient means of collecting useful information about the bottom type. Due to the nature of an acoustic survey, larger areas can be sampled, and combining the collected data with visual and mechanical survey methods provides greater confidence in the classification of a mapped region. During a multibeam sonar survey, both bathymetric and backscatter data are collected. It is well documented that the statistical character of a sonar backscatter mosaic depends on bottom type. While classifying the bottom type on the basis of backscatter alone can accurately predict and map bottom type, i.e., distinguish a muddy area from a rocky area, it lacks the ability to resolve and capture fine textural details, an important factor in many habitat mapping studies. Statistical processing of high-resolution multibeam data can capture the pertinent details about the bottom type that are rich in textural information. Further multivariate statistical processing can then isolate characteristic features and provide the basis for an accurate classification scheme. The development of a new classification method is described here, based upon the analysis of textural features in conjunction with ground truth sampling. The processing and classification results for two geologically distinct nearshore areas of Lake Superior, off the Lester River, MN, and the Amnicon River, WI, are presented here, using the Minnesota Supercomputer Institute's Mesabi computing cluster for initial processing. Processed data are then calibrated using ground truth samples to conduct an accuracy assessment of the surveyed areas.
From analysis of the high-resolution bathymetry data collected at both survey sites it was possible to calculate a series of measures that describe textural information about the lake floor. Further processing suggests that the calculated features also capture a significant amount of statistical information about the lake floor terrain. Two sources of error, an anomalous heave and a refraction error, significantly deteriorated the quality of the processed data and the resulting validation results. Ground truth samples used to validate the classification methods nevertheless yielded accuracy values ranging from 5-30 percent at the Amnicon River and 60-70 percent at the Lester River. The final results suggest that this new processing methodology adequately captures textural information about the lake floor and provides an acceptable classification in the absence of significant data quality issues.
Photomask etch system and process for 10nm technology node and beyond
NASA Astrophysics Data System (ADS)
Chandrachood, Madhavi; Grimbergen, Michael; Yu, Keven; Leung, Toi; Tran, Jeffrey; Chen, Jeff; Bivens, Darin; Yalamanchili, Rao; Wistrom, Richard; Faure, Tom; Bartlau, Peter; Crawford, Shaun; Sakamoto, Yoshifumi
2015-10-01
While the industry is making progress in offering EUV lithography schemes to attain ultimate critical dimensions down to 20 nm half pitch, an interim optical lithography solution to address an immediate need for resolution is offered by various integration schemes using advanced PSM (Phase Shift Mask) materials including thin e-beam resist and hard mask. Using the 193nm wavelength to produce 10nm or 7nm patterns requires a range of optimization techniques, including immersion and multiple patterning, which place a heavy demand on photomask technologies. Mask schemes with hard mask certainly help attain better selectivity and hence better resolution, but pose integration challenges and defectivity issues. This paper presents a new photomask etch solution for attenuated phase shift masks that offers high selectivity (Cr:Resist > 1.5:1), tighter control of the CD uniformity with a 3sigma value approaching 1 nm, and controllable CD bias (5-20 nm) with excellent CD linearity performance (<5 nm) down to finer resolutions. The new system has successfully demonstrated capability to meet the 10 nm node photomask CD requirements without the use of more complicated hard mask phase shift blanks. Significant improvement in post wet clean recovery performance was demonstrated by the use of advanced chamber materials. Examples of CD uniformity, linearity, minimum feature size, and etch bias performance on 10 nm test-site and production mask designs will be shown.
Exponential Modelling for Mutual-Cohering of Subband Radar Data
NASA Astrophysics Data System (ADS)
Siart, U.; Tejero, S.; Detlefsen, J.
2005-05-01
Increasing resolution and accuracy is an important issue in almost any type of radar sensor application. However, both resolution and accuracy are strongly related to the available signal bandwidth and energy. Nowadays, several sensors operating in different frequency bands are often available on a single sensor platform. It is an attractive goal to exploit the potential of advanced signal modelling and optimization procedures by making proper use of information stemming from different frequency bands at the RF signal level. An important prerequisite for optimal use of signal energy is coherence between all contributing sensors; coherent multi-sensor platforms, however, are very expensive and thus not generally available. This paper presents an approach for accurately estimating object radar responses using subband measurements at different RF frequencies. An exponential model compensates for the lack of mutual coherence between independently operating sensors. Mutual coherence is recovered from the a priori information that both sensors have common scattering centers in view. Minimizing the total squared deviation between measured data and a full-range exponential signal model leads to more accurate pole angles and pole magnitudes than single-band optimization. The model parameters (range and magnitude of point scatterers) after this full-range optimization process are also more accurate than the parameters obtained from a commonly used super-resolution procedure (root-MUSIC) applied to the non-coherent subband data.
Power quality considerations for nuclear spectroscopy applications: Grounding
NASA Astrophysics Data System (ADS)
García-Hernández, J. M.; Ramírez-Jiménez, F. J.; Mondragón-Contreras, L.; López-Callejas, R.; Torres-Bribiesca, M. A.; Peña-Eguiluz, R.
2013-11-01
Traditionally, electrical installations are designed to supply power and to assure personnel safety. In nuclear analysis laboratories, additional grounding issues must also be considered for the proper operation of high resolution nuclear spectroscopy systems. This paper describes the traditional ways of grounding nuclear spectroscopy systems and, through different scenarios, shows the effects on the most sensitive parameter of these systems: the energy resolution. It also proposes the constant monitoring of a power quality parameter as a way to preserve or improve the resolution of the systems, avoiding the influence of excessive extrinsic noise.
The Power to Declare War: The Ultimate Check on Presidential Power
2012-03-07
If war only requires limited resources, then Congress authorizes the action through a resolution or by continued funding. When Congress does declare war, it grants sweeping powers to the Executive Branch that threaten... formally declare war. ... On the surface it may appear that Congress has not performed its constitutional obligations by authorizing military actions without issuing a formal
Micro-computed tomography pore-scale study of flow in porous media: Effect of voxel resolution
NASA Astrophysics Data System (ADS)
Shah, S. M.; Gray, F.; Crawshaw, J. P.; Boek, E. S.
2016-09-01
A fundamental understanding of flow in porous media at the pore-scale is necessary to be able to upscale average displacement processes from core to reservoir scale. The study of fluid flow in porous media at the pore-scale consists of two key procedures: imaging, the reconstruction of three-dimensional (3D) pore space images; and modelling, such as single- and two-phase flow simulations using Lattice-Boltzmann (LB) or Pore-Network (PN) methods. Here we analyse pore-scale results to predict petrophysical properties such as porosity, single-phase permeability and multi-phase properties at different length scales. The fundamental issue is to understand the image resolution dependency of transport properties, in order to up-scale the flow physics from pore to core scale. In this work, we use a high resolution micro-computed tomography (micro-CT) scanner to image and reconstruct three dimensional pore-scale images of five sandstones (Bentheimer, Berea, Clashach, Doddington and Stainton) and five complex carbonates (Ketton, Estaillades, Middle Eastern sample 3, Middle Eastern sample 5 and Indiana Limestone 1) at four different voxel resolutions (4.4 μm, 6.2 μm, 8.3 μm and 10.2 μm), scanning the same physical field of view. Implementing three phase segmentation (macro-pore phase, intermediate phase and grain phase) on pore-scale images helps to understand the importance of connected macro-porosity in the fluid flow for the samples studied. We then compute the petrophysical properties for all the samples using PN and LB simulations in order to study the influence of voxel resolution on petrophysical properties. We then introduce a numerical coarsening scheme which is used to coarsen a high voxel resolution image (4.4 μm) to lower resolutions (6.2 μm, 8.3 μm and 10.2 μm) and study the impact of coarsening data on macroscopic and multi-phase properties.
Numerical coarsening of high resolution data is found to be superior to using a lower resolution scan because it avoids the problem of partial volume effects and reduces the scaling effect by preserving the pore-space properties that influence transport. This is demonstrated in this study by comparing several pore network properties, such as the number of pores and throats, average pore and throat radius, and coordination number, between scan-based and numerically coarsened data.
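The numerical coarsening described above amounts, in its simplest form, to block-averaging a fine-resolution volume by an integer factor along each axis. The sketch below shows that idea on a tiny illustrative 4³ segmented volume (it is not the paper's scheme, which operates on real micro-CT data); note that block averaging preserves total porosity exactly when the volume tiles evenly.

```python
import numpy as np

def coarsen(volume, f):
    """Block-average a 3D array by integer factor f along each axis,
    mimicking a coarser scan without partial-volume noise."""
    # Trim edges so the volume tiles exactly into f x f x f blocks.
    v = volume[tuple(slice(0, s - s % f) for s in volume.shape)]
    nz, ny, nx = (s // f for s in v.shape)
    return v.reshape(nz, f, ny, f, nx, f).mean(axis=(1, 3, 5))

# Illustrative binary pore/grain volume: one pore octant in a 4^3 cube.
fine = np.zeros((4, 4, 4))
fine[:2, :2, :2] = 1.0
coarse = coarsen(fine, 2)          # 2x2x2 result

porosity_fine = fine.mean()
porosity_coarse = coarse.mean()    # averaging preserves the porosity
```

On real data, coarse voxels straddling a pore-grain boundary take fractional values, which is exactly the controlled analogue of the partial-volume effect the study discusses.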
Cost/benefit analysis for video security systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1997-01-01
Dr. Don Hush and Scott Chapman, in conjunction with the Electrical and Computer Engineering Department of the University of New Mexico (UNM), have been contracted by Los Alamos National Laboratory to perform research in the area of high security video analysis. The first phase of this research, presented in this report, is a cost/benefit analysis of various approaches to the problem in question. This discussion begins with a description of three architectures that have been used as solutions to the problem of high security surveillance. An overview of the relative merits and weaknesses of each of the proposed systems is included. These descriptions are followed directly by a discussion of the criteria chosen in evaluating the systems and the techniques used to perform the comparisons. The results are then given in graphical and tabular form, and their implications discussed. The project to this point has involved assessing hardware and software issues in image acquisition, processing and change detection. Future work is to leave these questions behind to consider the issues of change analysis - particularly the detection of human motion - and alarm decision criteria. The criteria for analysis in this report include: cost; speed; tradeoff issues in moving primitive operations from software to hardware; real time operation considerations; change image resolution; and computational requirements.
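The change-detection stage assessed in the report can be illustrated, in its most basic form, by frame differencing: flag pixels whose absolute difference from a reference frame exceeds a threshold. This is a hedged sketch of the generic technique, not any of the three architectures the report compares; frames and threshold are illustrative.

```python
import numpy as np

def change_mask(reference, current, threshold=25):
    """Boolean mask of pixels that changed by more than `threshold`
    between a reference frame and the current frame."""
    # Cast to int so the unsigned subtraction cannot wrap around.
    diff = np.abs(current.astype(int) - reference.astype(int))
    return diff > threshold

# Illustrative 4x4 greyscale frames: a uniform scene, then a bright region.
ref = np.full((4, 4), 100, dtype=np.uint8)
cur = ref.copy()
cur[1:3, 1:3] = 160                # simulated intrusion
mask = change_mask(ref, cur)
changed_pixels = int(mask.sum())
```

The hardware/software tradeoff the report weighs applies directly here: the subtraction and threshold are cheap enough for hardware, while the subsequent change *analysis* (e.g., deciding whether the flagged region is human motion) is where software flexibility matters.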
12th meeting of Asian Parliamentarians on Population and Development.
1999-01-01
At the 12th annual Asian Parliamentarians Meeting on Population and Development, co-sponsored by the Asian Population and Development Association (APDA) of Japan and the Philippine Legislative Committee on Population and Development (PLCPD), the adverse effect of population growth on economic development and the importance of improvements in women's status were central themes. Fukusaburo Maeda, President of APDA Japan, noted that an understanding of women's issues is key to solving global population problems. Numerous participants urged rapid implementation of plans outlined at recent conferences in Cairo and Beijing to empower women and involve them in all stages of the development process. Even issues of food security are linked to women's issues, since women are generally responsible for feeding their families. Participants voted to adopt the "Manila Resolution on Women, Gender, Population, and Development"--a call for social and economic empowerment of women and resources for gender-related programs. Dr. Patricia Licuanan, Chair of the UN Committee on the Status of Women, noted that men should not feel threatened by women holding positions of power; rather, they should welcome an equal partnership between men and women. In her closing address, Senator Leticia Ramos Shahani, Chair of PLCPD, stressed the importance of placing women's empowerment on various parliamentary agendas and commended APDA for its research and population-based surveys in Asia.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Graniere, R.J.
1993-06-15
A timeline of the past 20 years would characterize an American telecommunications policy revolution dominated by alternating periods of market structure and access. It also would reveal that this cycle is not a casual phenomenon but the result of procompetitive regulatory and judicial decisions that spawn equal and open access issues whose resolution is, in turn, a source of additional market structure issues. Passage of the Energy Policy Act of 1992 has started a similar cogenerative process in the electricity industry. How can electric utility executives and regulators use the lessons of the telecommunications industry to deal with emerging transmission issues in the electricity industry? They can begin by realizing that multiple forms of mandatory transmission access may be new to electric utilities, but they are second nature to telephone local exchange companies (LECs). For example, LECs have been providing local access services to equipment manufacturers and long-distance companies for over a decade. These firms also are deploying local access services for the enhanced and information-services providers under the rubric of open network architecture (ONA). This full range of access services might soon be commonplace in the electricity industry, too, as exempt wholesale generators (EWGs) enter the wholesale power markets.
Implementation issues of the nearfield equivalent source imaging microphone array
NASA Astrophysics Data System (ADS)
Bai, Mingsian R.; Lin, Jia-Hong; Tseng, Chih-Wen
2011-01-01
This paper revisits a nearfield microphone array technique termed nearfield equivalent source imaging (NESI) proposed previously. In particular, various issues concerning the implementation of the NESI algorithm are examined. The NESI can be implemented in both the time domain and the frequency domain. Acoustical variables including sound pressure, particle velocity, active intensity and sound power are calculated by using multichannel inverse filters. Issues concerning sensor deployment are also investigated for the nearfield array. The uniform array outperformed a random array previously optimized for far-field imaging, which contradicts the conventional wisdom in far-field arrays. For applications in which only a patch array with scarce sensors is available, a virtual microphone approach is employed to ameliorate edge effects using extrapolation and to improve imaging resolution using interpolation. To enhance the processing efficiency of the time-domain NESI, an eigensystem realization algorithm (ERA) is developed. Several filtering methods are compared in terms of computational complexity. Significant saving on computations can be achieved using ERA and the frequency-domain NESI, as compared to the traditional method. The NESI technique was also experimentally validated using practical sources including a 125 cc scooter and a wooden box model with a loudspeaker fitted inside, and proved effective in identifying the broadband and non-stationary noise produced by these sources.
Towards a physically-based multi-scale ecohydrological simulator for semi-arid regions
NASA Astrophysics Data System (ADS)
Caviedes-Voullième, Daniel; Josefik, Zoltan; Hinz, Christoph
2017-04-01
The use of numerical models as tools for describing and understanding complex ecohydrological systems has made it possible to test hypotheses and propose fundamental, process-based explanations of the behaviour of the system as a whole as well as of its internal dynamics. Reaction-diffusion equations have been used to describe and generate organized patterns such as bands, spots, and labyrinths using simple feedback mechanisms and boundary conditions. Alternatively, pattern-matching cellular automaton models have been used to generate vegetation self-organization in arid and semi-arid regions, also using simple descriptions of surface hydrological processes. A key question is: how much physical realism is needed in order to adequately capture the pattern formation processes in semi-arid regions while reliably representing the water balance dynamics at the relevant time scales? In fact, redistribution of water by surface runoff at the hillslope scale occurs at a temporal resolution of minutes, while vegetation development requires a much lower temporal resolution and longer time spans. This generates a fundamental spatio-temporal multi-scale problem, for which high resolution rainfall and surface topography are required. Accordingly, the objective of this contribution is to provide proof-of-concept that the governing processes can be described numerically at those multiple scales. The requirements for simulating ecohydrological processes and pattern formation with increased physical realism are, amongst others: (i) high resolution rainfall that adequately captures the triggers of growth, as the vegetation dynamics of arid regions respond as pulsed systems; (ii) complex, natural topography in order to accurately model drainage patterns, as surface water redistribution is highly sensitive to topographic features; (iii) microtopography and hydraulic roughness, as small scale variations do impact large scale hillslope behaviour; (iv) moisture dependent infiltration, as the temporal dynamics of infiltration affect water storage under vegetation and in bare soil. Despite the volume of research in this field, fundamental limitations still exist in the models regarding the aforementioned issues. Topography and hydrodynamics have been strongly simplified. Infiltration has been modelled as dependent on depth but independent of soil moisture. Temporal rainfall variability has only been addressed for seasonal rain. Spatial heterogeneity of the topography, as well as of roughness and infiltration properties, has not been fully and explicitly represented. We hypothesize that physical processes must be robustly modelled and the drivers of complexity must be present with as much resolution as possible in order to provide the necessary realism to improve transient simulations, perhaps leading the way to virtual laboratories and, arguably, predictive tools. This work provides a first approach to a model with explicit hydrological processes represented by physically-based hydrodynamic models, coupled with well-accepted vegetation models. The model aims to enable new possibilities relating to spatiotemporal variability, arbitrary topography and representation of spatial heterogeneity, including sub-daily (in fact, arbitrary) temporal variability of rain as the main forcing of the model, explicit representation of infiltration processes, and various feedback mechanisms between the hydrodynamics and the vegetation. Preliminary testing strongly suggests that the model is viable, has the potential of producing new information on the internal dynamics of the system, and allows many sources of complexity to be successfully aggregated. Initial benchmarking of the model also reveals strengths to be exploited, thus providing an interesting research outlook, as well as weaknesses to be addressed in the immediate future.
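The reaction-diffusion class of vegetation models that the abstract contrasts with its hydrodynamic approach can be sketched with one explicit time step of a minimal 1D water-biomass system: water diffuses and is taken up by vegetation, and biomass grows where water is available. All parameters, the periodic boundary, and the simple forward-Euler stepping are illustrative assumptions, not the proposed simulator.

```python
import numpy as np

def step(water, biomass, dt=0.1, dx=1.0, d_w=1.0, rain=0.2,
         uptake=0.5, growth=0.4, mortality=0.1):
    """One explicit Euler step of a toy 1D water-vegetation model
    (periodic boundaries; all coefficients are assumed)."""
    # Discrete Laplacian of the water field (diffusive redistribution).
    lap = (np.roll(water, 1) - 2 * water + np.roll(water, -1)) / dx ** 2
    w_new = water + dt * (rain - uptake * water * biomass + d_w * lap)
    b_new = biomass + dt * (growth * water * biomass - mortality * biomass)
    return w_new, b_new

# A vegetated patch in an otherwise bare 32-cell domain.
w = np.full(32, 0.4)
b = np.zeros(32)
b[10:14] = 0.5
for _ in range(50):
    w, b = step(w, b)
```

Even this toy version shows the feedback the abstract describes: biomass grows only where water sustains it, while the physically-based simulator replaces the diffusion term with explicit surface hydrodynamics on real topography.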
Shen, Kai; Lu, Hui; Baig, Sarfaraz; Wang, Michael R
2017-11-01
The multi-frame superresolution technique is introduced to significantly improve the lateral resolution and image quality of spectral domain optical coherence tomography (SD-OCT). Using several sets of low resolution C-scan 3D images with lateral sub-spot-spacing shifts between sets, multi-frame superresolution processing of these sets at each depth layer reconstructs a lateral image of higher resolution and quality. Layer-by-layer processing yields an overall high lateral resolution and quality 3D image. In theory, superresolution processing including deconvolution can jointly address the diffraction limit, lateral scan density and background noise problems. In experiment, an improvement of the lateral resolution by ~3 times, reaching 7.81 µm and 2.19 µm using sample arm optics of 0.015 and 0.05 numerical aperture respectively, as well as a doubling of image quality, has been confirmed by imaging a known resolution test target. Improved lateral resolution on in vitro skin C-scan images has been demonstrated. For in vivo 3D SD-OCT imaging of human skin, fingerprint and retina layers, we used a multi-modal volume registration method to effectively estimate the lateral image shifts among different C-scans due to random minor unintended live body motion. Further processing of these images generated high lateral resolution 3D images as well as high quality B-scan images of these in vivo tissues.
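The core idea behind multi-frame superresolution from sub-spot-spacing shifts can be sketched with the classic shift-and-add step: several low-resolution frames, each offset by a known fraction of a pixel, are interleaved onto a denser grid. This is only an idealized sketch with exact integer sub-pixel shifts; the paper additionally performs deconvolution and multi-modal volume registration, which are omitted here.

```python
import numpy as np

def shift_and_add(frames_with_shifts, factor=2):
    """Interleave low-resolution frames onto a grid `factor` times denser.

    frames_with_shifts: list of (lr_image, (dy, dx)) where dy, dx are the
    known shifts in high-resolution pixel units (0..factor-1)."""
    h, w = frames_with_shifts[0][0].shape
    hr = np.zeros((h * factor, w * factor))
    for lr, (dy, dx) in frames_with_shifts:
        hr[dy::factor, dx::factor] = lr   # each frame fills its phase
    return hr

# Simulate four half-pixel-shifted low-resolution acquisitions of a scene
# (decimation of a dense 4x4 grid by 2 in each direction):
hr_true = np.arange(16, dtype=float).reshape(4, 4)
frames = [(hr_true[dy::2, dx::2], (dy, dx))
          for dy in (0, 1) for dx in (0, 1)]
hr_rec = shift_and_add(frames)
```

With noiseless data and exact shifts the dense grid is recovered perfectly; in practice the shifts must first be estimated (here, via the paper's registration step) and a deconvolution stage compensates the optical spot size.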
Magnetic heating of stellar chromospheres and coronae
NASA Astrophysics Data System (ADS)
van Ballegooijen, A. A.
The theoretical discussion of magnetic heating focuses on heating by dissipation of field-aligned electric currents. Several mechanisms are set forth to account for the very high current densities needed to generate the heat, but observed radiative losses do not justify the resultant Ohmic heating rate. Tearing modes, 'turbulent resistivity', and 'hyper-resistivity' are considered to resolve the implied inefficiency of coronal heating. Because the mechanisms are not readily applicable to the sun, transverse magnetic energy flows and magnetic flare release are considered to account for the magnitude of observed radiative loss. High-resolution observations of the sun are concluded to be an efficient way to examine the issues of magnetic heating in spite of the very small spatial scales of the heating processes.
Science & Technology Review September/October 2008
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bearinger, J P
2008-07-21
This issue has the following articles: (1) Answering Scientists' Most Audacious Questions--Commentary by Dona Crawford; (2) Testing the Accuracy of the Supernova Yardstick--High-resolution simulations are advancing understanding of Type Ia supernovae to help uncover the mysteries of dark energy; (3) Developing New Drugs and Personalized Medical Treatment--Accelerator mass spectrometry is emerging as an essential tool for assessing the effects of drugs in humans; (4) Triage in a Patch--A painless skin patch and accompanying detector can quickly indicate human exposure to biological pathogens, chemicals, explosives, or radiation; and (5) Smoothing Out Defects for Extreme Ultraviolet Lithography--A process for smoothing mask defects helps move extreme ultraviolet lithography one step closer to creating smaller, more powerful computer chips.
An advanced lithium-air battery exploiting an ionic liquid-based electrolyte.
Elia, G A; Hassoun, J; Kwak, W-J; Sun, Y-K; Scrosati, B; Mueller, F; Bresser, D; Passerini, S; Oberhumer, P; Tsiouvaras, N; Reiter, J
2014-11-12
A novel lithium-oxygen battery exploiting PYR14TFSI-LiTFSI as ionic liquid-based electrolyte medium is reported. The Li/PYR14TFSI-LiTFSI/O2 battery was fully characterized by electrochemical impedance spectroscopy, capacity-limited cycling, field emission scanning electron microscopy, high-resolution transmission electron microscopy, and X-ray photoelectron spectroscopy. The results of this extensive study demonstrate that this new Li/O2 cell is characterized by a stable electrode-electrolyte interface and a highly reversible charge-discharge cycling behavior. Most remarkably, the charge process (oxygen oxidation reaction) is characterized by a very low overvoltage, enhancing the energy efficiency to 82%, thus, addressing one of the most critical issues preventing the practical application of lithium-oxygen batteries.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui Yunfeng; Galvin, James M.; Radiation Therapy Oncology Group, American College of Radiology, Philadelphia, Pennsylvania
2013-01-01
Purpose: To report the process and initial experience of remote credentialing of three-dimensional (3D) image guided radiation therapy (IGRT) as part of the quality assurance (QA) of submitted data for Radiation Therapy Oncology Group (RTOG) clinical trials; and to identify major issues resulting from this process and analyze the review results on patient positioning shifts. Methods and Materials: Image guided radiation therapy datasets, including in-room positioning CT scans and daily shifts applied, were submitted through the Image Guided Therapy QA Center from institutions for the IGRT credentialing process, as required by various RTOG trials. A centralized virtual environment was established at the RTOG Core Laboratory, containing analysis tools and database infrastructure for remote review by the Physics Principal Investigators of each protocol. The appropriateness of IGRT technique and volumetric image registration accuracy were evaluated. Registration accuracy was verified by repeat registration with a third-party registration software system. With the accumulated review results, registration differences between those obtained by the Physics Principal Investigators and those from the institutions were analyzed for different imaging sites, shift directions, and imaging modalities. Results: The remote review process was successfully carried out for 87 3D cases (out of 137 total cases, including 2-dimensional and 3D) during 2010. Frequent errors in submitted IGRT data and challenges in the review of image registration for some special cases were identified. Workarounds for these issues were developed. The average differences of registration results between reviewers and institutions ranged between 2 mm and 3 mm. Large discrepancies in the superior-inferior direction were found for megavoltage CT cases, owing to the low spatial resolution of megavoltage CT in that direction.
Conclusion: This first experience indicated that remote review of 3D IGRT as part of QA for RTOG clinical trials is feasible and effective. The magnitude of registration discrepancy between institution and reviewer was presented, and the major issues were investigated to further improve this remote evaluation process.
A Submillimeter Resolution PET Prototype Evaluated With an 18F Inkjet Printed Phantom
NASA Astrophysics Data System (ADS)
Schneider, Florian R.; Hohberg, Melanie; Mann, Alexander B.; Paul, Stephan; Ziegler, Sibylle I.
2015-10-01
This work presents a submillimeter resolution PET (Positron Emission Tomography) scanner prototype based on SiPM/MPPC arrays (Silicon Photomultiplier/Multi-Pixel Photon Counter). A 1 × 1 × 20 mm3 LYSO (Lutetium-Yttrium-Oxyorthosilicate) scintillator crystal is coupled one-to-one onto each active area. Two detector modules facing each other at a distance of 10.0 cm have been set up, with 64 channels in total digitized by SADCs (Sampling Analog-to-Digital Converters) at 80 MHz with 10-bit resolution and FPGA (Field Programmable Gate Array) based extraction of energy and time information. Since standard phantoms are not sufficient for testing submillimeter resolution, at which positron range becomes an issue, an 18F inkjet printed phantom has been used to explore the limit in spatial resolution. The phantom was successfully reconstructed with an iterative MLEM (Maximum Likelihood Expectation Maximization) algorithm and an analytically calculated system matrix based on the DRF (Detector Response Function) model. The system yields a coincidence time resolution of 4.8 ns FWHM, an energy resolution of 20%-30% FWHM, and a spatial resolution of 0.8 mm.
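The MLEM update used for reconstruction has a standard multiplicative form, which can be sketched as follows. This is a generic textbook MLEM, not the authors' code; the system matrix here is a placeholder for their analytically calculated DRF-based matrix, and the function name `mlem` is an assumption.

```python
import numpy as np

def mlem(system_matrix, sinogram, n_iter=20):
    """Maximum Likelihood Expectation Maximization reconstruction.

    system_matrix: (n_measurements, n_voxels) detector response model
    sinogram:      (n_measurements,) measured coincidence counts
    """
    A = system_matrix
    y = sinogram
    sensitivity = A.sum(axis=0)        # per-voxel sensitivity, A^T * 1
    x = np.ones(A.shape[1])            # uniform non-negative initial image
    for _ in range(n_iter):
        forward = A @ x                # expected counts given current image
        ratio = np.divide(y, forward, out=np.zeros_like(y), where=forward > 0)
        # Multiplicative update keeps the image non-negative.
        x *= (A.T @ ratio) / np.maximum(sensitivity, 1e-12)
    return x
```

On noiseless, consistent data the iterates converge to the exact activity distribution; with measured counts they converge to the maximum-likelihood estimate under the Poisson model.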
NASA Astrophysics Data System (ADS)
Kerridge, J. F.; McSween, H. Y., Jr.; Bunch, T. E.
1994-07-01
We wish to draw attention to a major controversy that has arisen in the area of CM-chondrite petrology. The problem is important because its resolution will have profound implications for ideas concerning nebular dynamics, gas-solid interactions in the nebula, and accretionary processes in the nebula, among other issues. On the one hand, cogent arguments have been presented that 'accretionary dust mantles' were formed in the solar nebula prior to accretion of the CM parent asteroid(s). On the other hand, no-less-powerful arguments have been advanced that a significant fraction of the CM lithology is secondary, produced by aqueous alteration in the near-surface regions of an asteroid-sized object. Because most, if not all, CM chondrites are breccias, these two views could coexist harmoniously, were it not for the fact that some of the coarse-grained lithologies surrounded by 'accretion dust mantles' are themselves of apparently secondary origin. Such an observation must clearly force a reassessment of one or both of the present schools of thought. Our objective here is to stimulate such a reassessment. Four possible resolutions of this conflict may be postulated. First, perhaps nature found a way of permitting such secondary alteration to take place in the nebula. Second, maybe dust mantles could form in a regolith, rather than a nebular, environment. Third, it is possible that dust mantles around secondary lithologies are different from those around primary lithologies. Finally, perhaps formation of CM chondrites involved a more complex sequence of events than visualized so far, so that some apparently 'primary' processes postdated certain 'secondary' processes.
Separation technologies for stem cell bioprocessing.
Diogo, Maria Margarida; da Silva, Cláudia Lobato; Cabral, Joaquim M S
2012-11-01
Stem cells have been the focus of intense research due to their potential in Regenerative Medicine, drug discovery, toxicology studies, as well as for fundamental studies on developmental biology and human disease mechanisms. To fully accomplish this potential, the successful application of separation processes for the isolation and purification of stem cells and stem cell-derived cells is a crucial issue. Although separation methods have been used over the past decades for the isolation and enrichment of hematopoietic stem/progenitor cells for transplantation in hemato-oncological settings, recent achievements in the stem cell field have created new challenges, including the need for novel scalable separation processes offering higher resolution and greater cost-effectiveness. Important examples are the need for high-resolution methods for the separation of heterogeneous populations of multipotent adult stem cells to study their differential biological features and clinical utility, as well as for the depletion of tumorigenic cells after pluripotent stem cell differentiation. Focusing on these challenges, this review presents a critical assessment of separation processes that have been used in the stem cell field, as well as their current and potential applications. The techniques are grouped according to the fundamental principles that govern cell separation, which are defined by the main physical, biophysical, and affinity properties of cells. A special emphasis is given to novel and promising approaches such as affinity-based methods that take advantage of the use of new ligands (e.g., aptamers, lectins), as well as to novel biophysical-based methods requiring no cell labeling and integrated with microscale technologies. Copyright © 2012 Wiley Periodicals, Inc.
Fundamentals of image acquisition and processing in the digital era.
Farman, A G
2003-01-01
To review the historic context for digital imaging in dentistry and to outline the fundamental issues related to digital imaging modalities. Digital dental X-ray images can be achieved by scanning analog film radiographs (secondary capture), with photostimulable phosphors, or using solid-state detectors (e.g. charge-coupled device and complementary metal oxide semiconductor). Four characteristics are basic to all digital image detectors: size of active area, signal-to-noise ratio, contrast resolution, and spatial resolution. To perceive structure in a radiographic image, there needs to be sufficient difference between contrasting densities. This primarily depends on the differences in the attenuation of the X-ray beam by adjacent tissues. It also depends on the signal received; therefore, contrast tends to increase with increased exposure. Given adequate signal and sufficient differences in radiodensity, contrast will be sufficient to differentiate between adjacent structures, irrespective of the recording modality and processing used. Where contrast is not sufficient, digital images can sometimes be post-processed to disclose details that would otherwise go undetected. For example, cephalogram isodensity mapping can improve soft tissue detail. It is concluded that it could be a further decade or two before three-dimensional digital imaging systems entirely replace two-dimensional analog films. Such systems need not only to produce prettier images, but also to provide a demonstrable evidence-based higher standard of care at a cost that is not economically prohibitive for the practitioner or society, and which allows efficient and effective workflow within the business of dental practice.
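The isodensity mapping mentioned in this abstract amounts to partitioning the grayscale range into discrete density bands so that subtle soft-tissue gradations become visible as contours. A minimal sketch of the idea follows; the function name `isodensity_bands` and the equal-width banding scheme are assumptions for illustration, not a description of any specific clinical software.

```python
import numpy as np

def isodensity_bands(image, n_bands):
    """Quantize a grayscale image into equal-width density bands.

    image:   2-D array of pixel intensities
    n_bands: number of output bands (band indices 0 .. n_bands-1)
    """
    lo, hi = image.min(), image.max()
    edges = np.linspace(lo, hi, n_bands + 1)   # band boundaries
    # Interior edges only; digitize assigns each pixel its band index.
    return np.clip(np.digitize(image, edges[1:-1]), 0, n_bands - 1)
```

Displaying the band map (or just the band boundaries) in place of the raw image highlights isodensity contours, which is one way low-contrast soft-tissue outlines on a cephalogram can be made easier to trace.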
High-Resolution Adaptive Optics Test-Bed for Vision Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilks, S C; Thompson, C A; Olivier, S S
2001-09-27
We discuss the design and implementation of a low-cost, high-resolution adaptive optics test-bed for vision research. It is well known that high-order aberrations in the human eye reduce optical resolution and limit visual acuity. However, the effects of aberration-free eyesight on vision are only now beginning to be studied using adaptive optics to sense and correct the aberrations in the eye. We are developing a high-resolution adaptive optics system for this purpose using a Hamamatsu Parallel Aligned Nematic Liquid Crystal Spatial Light Modulator. Phase-wrapping is used to extend the effective stroke of the device, and the wavefront sensing and wavefront correction are done at different wavelengths. Issues associated with these techniques will be discussed.
12 CFR 308.517 - Authority of the ALJ.
Code of Federal Regulations, 2010 CFR
2010-01-01
...) Administer oaths and affirmations; (5) Issue subpoenas requiring the attendance of witnesses and the... regarding the validity of federal statutes or regulations or of directives, rules, resolutions, policies...
Dynamically re-configurable CMOS imagers for an active vision system
NASA Technical Reports Server (NTRS)
Yang, Guang (Inventor); Pain, Bedabrata (Inventor)
2005-01-01
A vision system is disclosed. The system includes a pixel array, at least one multi-resolution window operation circuit, and a pixel averaging circuit. The pixel array has an array of pixels configured to receive light signals from an image having at least one tracking target. The multi-resolution window operation circuits are configured to process the image. Each of the multi-resolution window operation circuits processes each tracking target within a particular multi-resolution window. The pixel averaging circuit is configured to sample and average pixels within the particular multi-resolution window.
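The pixel-averaging operation within a multi-resolution window can be sketched in software as block averaging over a region of interest. This is an illustrative model of the concept only, not the patented circuit; the function name `window_average` and its parameters are assumptions for this sketch.

```python
import numpy as np

def window_average(pixels, window, block):
    """Average pixels in block-sized groups inside a window of the array.

    pixels: full 2-D pixel array from the imager
    window: (y0, y1, x0, x1) bounds of the multi-resolution window
    block:  integer block size; each block x block group of pixels is
            averaged to one value, giving a coarser-resolution view
            of the tracking target
    """
    y0, y1, x0, x1 = window
    roi = pixels[y0:y1, x0:x1]
    h = (roi.shape[0] // block) * block   # trim to whole blocks
    w = (roi.shape[1] // block) * block
    roi = roi[:h, :w]
    return roi.reshape(h // block, block, w // block, block).mean(axis=(1, 3))
```

Running this per tracking window mimics how the imager can deliver full resolution on a target while reading the rest of the array, or other windows, at reduced resolution and bandwidth.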
NASA Astrophysics Data System (ADS)
Wang, Lei
Natural and human-induced environmental changes have been altering the earth's surface and hydrological processes, and thus directly contribute to the severity of flood hazards. To understand these changes and their impacts, this research developed a GIS-based hydrological and hydraulic modeling system, which incorporates state-of-the-art remote sensing data to simulate flood under various scenarios. The conceptual framework and technical issues of incorporating multi-scale remote sensing data have been addressed. This research develops an object-oriented hydrological modeling framework. Compared with traditional lumped or cell-based distributed hydrological modeling frameworks, the object-oriented framework allows basic spatial hydrologic units to have various sizes and irregular shapes. This framework is capable of assimilating various GIS and remotely sensed data with different spatial resolutions. It ensures computational efficiency, while preserving sufficient spatial detail in input data and model outputs. Sensitivity analysis and comparison of a high resolution LiDAR DEM with a traditional USGS 30 m resolution DEM suggest that the use of LiDAR DEMs can greatly reduce uncertainty in calibration of flow parameters in the hydrologic model and hence increase the reliability of modeling results. In addition, subtle topographic features and hydrologic objects like surface depressions and detention basins can be extracted from the high resolution LiDAR DEMs. An innovative algorithm has been developed to efficiently delineate surface depressions and detention basins from LiDAR DEMs. Using a time series of Landsat images, a retrospective analysis of surface imperviousness has been conducted to assess the hydrologic impact of urbanization. The analysis reveals that with rapid urbanization the impervious surface increased from 10.1% to 38.4% in the case study area during 1974--2002.
As a result, the peak flow for a 100-year flood event has increased by 20% and the floodplain extent has expanded by about 21.6%. The quantitative analysis suggests that the large regional detention basins have effectively offset the adverse effect of increased impervious surface during the urbanization process. Based on the simulation and scenario analyses of land subsidence and potential climate changes, some planning measures and policy implications have been derived for guiding smart urban growth and sustainable resource development and management to minimize flood hazards.
NASA Astrophysics Data System (ADS)
Hernandez, Charles; Drobinski, Philippe; Turquety, Solène
2015-10-01
Wildfires alter land cover, creating changes in the dynamic, vegetative, radiative, thermal, and hydrological properties of the surface. However, how such drastic fire-induced changes, and the age of the burnt scar, affect small- and meso-scale atmospheric boundary layer dynamics is largely unknown. These questions are relevant for process analysis, meteorological and air quality forecasting, and regional climate analysis. They are addressed numerically in this study using the 2003 Portugal wildfires as a testbed. In order to study the effects of burnt scars, an ensemble of numerical simulations using the Weather Research and Forecasting modeling system (WRF) has been performed with different surface properties mimicking the surface state immediately after the fire, a few days after the fire, and a few months after the fire. In order to investigate this issue in a seamless approach, the same modeling framework has been used with various horizontal resolutions of the model grid and land use, ranging from 3.5 km, which can be considered the typical resolution of state-of-the-art regional numerical weather prediction models, to 14 km, which is now the typical target resolution of regional climate models. The study shows that the combination of high surface heat fluxes over the burnt area, large differential heating with respect to the preserved surroundings, and lower surface roughness produces very intense frontogenesis, with vertical velocities reaching a few meters per second. This powerful meso-scale circulation can pump more humid air from the surroundings not impacted by the wildfire and produce more cloudiness over the burnt area. The influence of soil temperature immediately after the wildfire ceases is mainly seen at night, as the boundary layer remains unstably stratified, and lasts only a few days. The intensity of the induced meso-scale circulation therefore decreases with time, even though it persists until full recovery of the vegetation.
Finally, all these effects are simulated regardless of land cover and model resolution, and are thus robust processes in both regional climate simulations and in process studies or short-term forecasts. However, the impact of burnt scars on the precipitation signal remains very uncertain, especially because low precipitation amounts are at stake.
Applications and Improvement of a Coupled, Global and Cloud-Resolving Modeling System
NASA Technical Reports Server (NTRS)
Tao, W.-K.; Chern, J.; Atlas, R.
2005-01-01
Recently Grabowski (2001) and Khairoutdinov and Randall (2001) have proposed the use of 2D CRMs (cloud-resolving models) as a "super parameterization" [or multi-scale modeling framework (MMF)] to represent cloud processes within atmospheric general circulation models (GCMs). In the MMF, a fine-resolution 2D CRM takes the place of the single-column parameterization used in conventional GCMs. A prototype Goddard MMF based on the 2D Goddard Cumulus Ensemble (GCE) model and the Goddard finite volume general circulation model (fvGCM) is now being developed. The prototype includes the fvGCM run at 2.5° x 2° horizontal resolution with 32 vertical layers from the surface to 1 mb and the 2D (x-z) GCE using 64 horizontal and 32 vertical grid points with 4 km horizontal resolution and a cyclic lateral boundary. The time step for the 2D GCE would be 15 seconds, and the fvGCM-GCE coupling frequency would be 30 minutes (i.e. the fvGCM physical time step). We have successfully developed an fvGCM-GCE coupler for this prototype. Because the vertical coordinate of the fvGCM (a terrain-following floating Lagrangian coordinate) is different from that of the GCE (a z coordinate), vertical interpolations between the two coordinates are needed in the coupler. In interpolating fields from the GCE to the fvGCM, we use an existing fvGCM finite-volume piecewise parabolic mapping (PPM) algorithm, which conserves mass, momentum, and total energy. A new finite-volume PPM algorithm, which conserves mass, momentum, and moist static energy in the z coordinate, is being developed for interpolating fields from the fvGCM to the GCE. In the meeting, we will discuss the major differences between the two MMFs (i.e., the CSU MMF and the Goddard MMF). We will also present performance and critical issues related to the MMFs.
In addition, we will present multi-dimensional cloud datasets (i.e., a cloud data library) generated by the Goddard MMF that will be provided to the global modeling community to help improve the representation and performance of moist processes in climate models and to improve our understanding of cloud processes globally (the software tools needed to produce cloud statistics and to identify various types of clouds and cloud systems from both high-resolution satellite and model data will be also presented).
The Histochemistry and Cell Biology pandect: the year 2014 in review.
Taatjes, Douglas J; Roth, Jürgen
2015-04-01
This review encompasses a brief synopsis of the articles published in 2014 in Histochemistry and Cell Biology. Out of the total of 12 issues published in 2014, two special issues were devoted to "Single-Molecule Super-Resolution Microscopy." The present review is divided into 11 categories, providing an easy format for readers to quickly peruse topics of particular interest to them.
Healthcare rationing: issues and implications.
Cypher, D P
1997-01-01
What methods, if any, should be used to practice healthcare rationing? This article looks at healthcare rationing in the United States, identifies ethical issues associated with implementing healthcare rationing, and addresses legal implications. The author utilizes sources from published literature and her own experience. Society must recognize that it does not have the resources available to fulfill all healthcare needs of all its members. Resolution will bring conflict and compromise.