Environmental Early Warning Systems (EEWS): Equation Writer’s Manual.
1986-07-01
Excerpted figure titles include Preliminary Equation Form in Non-EEWS Format, Completed EEWS Form for a Coastal Zone Consideration, and Data Ready to Be Loaded. Excerpted procedural steps include: divide the topic area into workable Subtopics; 4. formulate flowcharts to separate and show interrelationships between Subtopic Areas and their desired outputs; 5. ... Army demand criteria (Army background); 9. refine flowcharts for each specific Subtopic Area and estimate the difficulty of accomplishing each step.
ERIC Educational Resources Information Center
Demaray, Bryan
Five packets comprise the marine science component of an enrichment program for gifted elementary students. Considered in the introductory section are identification (pre/post measure) procedures. Remaining packets address the following topics (subtopics in parentheses): basic marine science laboratory techniques (microscope techniques and metric…
Medical School Admissions: The Insider's Guide.
ERIC Educational Resources Information Center
Zebala, John A.; Jones, Daniel B.
A handbook on the medical school admissions process is presented, offering a first hand account of what works. Six chapters discuss the following topics and subtopics: (1) premedical preparation (planning undergraduate study and picking the right college); (2) power techniques for higher grades (techniques for grade point success, improving grades…
Psychological and Environmental Treatment of Asthma: A Review.
ERIC Educational Resources Information Center
Gray, Steven G.; And Others
Seventy citations (1886-1980) on psychological and environmental treatment of asthma are reviewed. Information is analyzed for the following topics (sample subtopics in parentheses): assessment of asthma (self report, activity restriction, medical examination); behavior therapy (relaxation procedures, biofeedback, operant techniques); dynamic…
Overview of the TREC 2012 Web Track
2012-11-01
picture of the Last Supper painting by Leonardo da Vinci. </description> <subtopic number="…" type="nav"> Find a picture of the Last Supper painting by Leonardo da Vinci. </subtopic> <subtopic number="…" type="nav"> Are tickets available online to view da Vinci’s Last Supper in Milan, Italy…
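For readers working with topic files of the shape quoted above, a minimal parsing sketch follows. It assumes a file of <topic> elements containing <subtopic> children with number and type attributes, as in the snippet; the file name and the surrounding schema are assumptions, not the official TREC format specification.

```python
# Minimal sketch: parse a TREC Web Track style topic file with <subtopic>
# elements like those quoted above. The file name and full schema are
# assumptions; only the element/attribute names shown in the snippet are used.
import xml.etree.ElementTree as ET

def load_subtopics(path):
    topics = {}
    root = ET.parse(path).getroot()
    for topic in root.iter("topic"):
        number = topic.get("number")
        subs = [(st.get("number"), st.get("type"), (st.text or "").strip())
                for st in topic.iter("subtopic")]
        topics[number] = subs
    return topics

# Example usage (file name is hypothetical):
# for tid, subs in load_subtopics("wt2012-topics.xml").items():
#     for num, sub_type, text in subs:
#         if sub_type == "nav":
#             print(tid, num, text)
```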
1978 Annual Review of Child Abuse and Neglect Research.
ERIC Educational Resources Information Center
Martin, Mary Porter; Klaus, Susan L.
The review of research on child abuse and neglect presents brief abstracts of studies collected by the Clearinghouse of the National Center on Child Abuse and Neglect. Material is organized into five subject areas (sample subtopics in parentheses): definition of abuse and neglect; incidence (national and selected geographic estimates);…
ERIC Educational Resources Information Center
Bergerson, Peter J., Ed.
The 16 chapters of this book offer innovative instructional techniques used to train public managers. It presents public management concepts along with such subtopics as organizational theory and ethics, research skills, program evaluation, financial management, computers and communication skills in public administration, comparative public…
Composing Songs for Teaching Science to College Students
ERIC Educational Resources Information Center
Yee Pinn Tsin, Isabel
2015-01-01
Recent studies have shown that songs may enhance learning as they function as mnemonic devices to increase memorability. In this research, songs based on the more difficult subtopics in Chemistry were composed, encompassing many formulas, equations and facts to be remembered. This technique of song composition can be used in any subject, any point…
Water: A Topic for All Sciences
ERIC Educational Resources Information Center
Davies, Malonne I.; Seimears, C. Matt
2008-01-01
The authors illustrate an effective lesson-planning technique known as unpacking for the broad topic of water. Interconnections among science disciplines are shown for numerous possible subtopics. Two lesson sets are included, the first dealing with properties of water and the second dealing with water as a resource. (Contains 1 table and 4…
Information retrieval system utilizing wavelet transform
Brewster, Mary E.; Miller, Nancy E.
2000-01-01
A method for automatically partitioning an unstructured electronically formatted natural language document into its sub-topic structure. Specifically, the document is converted to an electronic signal and a wavelet transform is then performed on the signal. The resultant signal may then be used to graphically display and interact with the sub-topic structure of the document.
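The record above describes a signal-based idea: convert the text to a numeric signal, apply a wavelet transform, and read sub-topic boundaries from the result. The sketch below illustrates that general idea only, using PyWavelets; the per-sentence cue-term signal, the Haar wavelet, and the threshold are illustrative assumptions, not the patented method.

```python
# Minimal sketch of the general idea described above: turn a document into a
# numeric signal and look for sub-topic boundaries in its wavelet coefficients.
# This is NOT the patented algorithm; the signal definition, wavelet choice,
# and threshold below are illustrative assumptions.
import numpy as np
import pywt  # PyWavelets

def subtopic_boundaries(sentences, cue_term, wavelet="haar"):
    """Score each sentence by occurrences of a cue term, then flag positions
    where the detail (high-frequency) wavelet coefficients are large."""
    # 1. Document -> signal: per-sentence count of a cue term (a crude stand-in
    #    for whatever lexical signal the real system derives from the text).
    signal = np.array([s.lower().split().count(cue_term) for s in sentences],
                      dtype=float)
    # 2. Single-level discrete wavelet transform of the signal.
    approx, detail = pywt.dwt(signal, wavelet)
    # 3. Large detail coefficients suggest abrupt topical shifts; map each
    #    coefficient index back to an (approximate) sentence position.
    threshold = detail.std() * 2 if detail.size else 0.0
    return [2 * i for i, d in enumerate(detail) if abs(d) > threshold]

if __name__ == "__main__":
    doc = ["whales migrate north in spring", "whales feed on krill",
           "young whales stay close to shore", "satellites orbit the earth",
           "satellites relay telemetry", "ground stations track satellites"]
    print(subtopic_boundaries(doc, "whales"))  # flags the shift around sentence 2-3
```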
Interactive Book Reading with Expository Science Texts in Preschool Special Education Classrooms
ERIC Educational Resources Information Center
Breit-Smith, Allison; Busch, Jamie D.; Dinnesen, Megan Schneider; Guo, Ying
2017-01-01
Expository, or informational, text can be defined as a type of nonfiction that describes a topic categorically by moving from subtopic to subtopic with the intent to teach content or convey information (Maloch & Bomer, 2013). One vehicle for teaching the text structure and language of expository text to preschool-age children is through…
Information retrieval system utilizing wavelet transform
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brewster, M.E.; Miller, N.E.
A method is disclosed for automatically partitioning an unstructured electronically formatted natural language document into its sub-topic structure. Specifically, the document is converted to an electronic signal and a wavelet transform is then performed on the signal. The resultant signal may then be used to graphically display and interact with the sub-topic structure of the document.
2014-03-27
Technology (AFIT). Research at AFIT investigates the use of DSA for both civilian and military applications while advancing technology in the area of radio...other military platforms is vital for successful operations. Twelve core functions comprise the US Air Force: Nuclear Deterrence Operations, Special...problems. This Air Force report discusses “Frequency Agile Spectrum Utilization”, a sub-topic of DSA, as a potential capability area [3]. Military
NASA Astrophysics Data System (ADS)
Berger, Roland; Hänze, Martin
2015-01-01
We assessed the impact of expert students' instructional quality on the academic performance of novice students in 12th-grade physics classes organized in an expert model of cooperative learning ('jigsaw classroom'). The instructional quality of 129 expert students was measured by a newly developed rating system. As expected, when aggregating across all four subtopics taught, regression analysis revealed that the academic performance of novice students increases with the quality of expert students' instruction. The difficulty of subtopics, however, moderates this effect: higher instructional quality on more difficult subtopics did not lead to better academic performance of novice students. We interpret this finding in the light of Cognitive Load Theory: demanding tasks cause high intrinsic cognitive load and hinder the novice students' learning.
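A minimal sketch of the kind of moderated regression described above, with subtopic difficulty as a moderator of the instructional-quality effect. The column names, data values, and use of statsmodels are illustrative assumptions; this is not the study's analysis code.

```python
# Hedged sketch of a moderated regression: novice performance regressed on
# expert instructional quality, with subtopic difficulty as a moderator
# (interaction term). Data and variable names are illustrative only.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "novice_score":  [62, 71, 58, 80, 55, 67, 74, 60],
    "instr_quality": [3.1, 4.0, 2.8, 4.5, 2.5, 3.6, 4.2, 3.0],
    "difficulty":    [1, 1, 2, 1, 2, 2, 1, 2],   # 1 = easier, 2 = harder subtopic
})

# The instr_quality:difficulty interaction tests whether the quality effect
# weakens for more difficult subtopics, as the study reports.
model = smf.ols("novice_score ~ instr_quality * difficulty", data=df).fit()
print(model.summary())
```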
NASA Technical Reports Server (NTRS)
1985-01-01
The concept of a large disturbance bypass mechanism for the initiation of transition is reviewed and studied. This mechanism, or some manifestation thereof, is suspected to be at work in the boundary layers present in a turbine flow passage. Discussion is presented on four relevant subtopics: (1) the effect of upstream disturbances and wakes on transition; (2) transition prediction models, code development, and verification; (3) transition and turbulence measurement techniques; and (4) the hydrodynamic condition of low Reynolds number boundary layers.
Design of a Genomics Curriculum: Competencies for Practicing Pathologists.
Laudadio, Jennifer; McNeal, Jeffrey L; Boyd, Scott D; Le, Long Phi; Lockwood, Christina; McCloskey, Cindy B; Sharma, Gaurav; Voelkerding, Karl V; Haspel, Richard L
2015-07-01
The field of genomics is rapidly impacting medical care across specialties. To help guide test utilization and interpretation, pathologists must be knowledgeable about genomic techniques and their clinical utility. The technology allowing timely generation of genomic data is relatively new to patient care and the clinical laboratory, and therefore, many currently practicing pathologists have been trained without any molecular or genomics exposure. Furthermore, the exposure that current and recent trainees receive in this field remains inconsistent. The objectives were to assess pathologists' learning needs in genomics and to develop a curriculum to address these educational needs. A working group formed by the College of American Pathologists developed an initial list of genomics competencies (knowledge and skills statements) that a practicing pathologist needs to be successful. Experts in genomics were then surveyed to rate the importance of each competency. These data were used to create a final list of prioritized competencies. A subset of the working group defined subtopics and tasks for each competency. Appropriate delivery methods for the educational material were also proposed. A final list of 32 genomics competency statements was developed. A prioritized curriculum was created with designated subtopics and tasks associated with each competency. We present a genomics curriculum designed as a first step toward providing practicing pathologists with the competencies needed to practice successfully.
Investigating multiphoton phenomena using nonlinear dynamics
NASA Astrophysics Data System (ADS)
Huang, Shu
Many seemingly simple systems can display extraordinarily complex dynamics, which have been studied and uncovered through nonlinear dynamical theory. The leitmotif of this thesis is changing phase-space structures and their (linear or non-linear) stabilities by adding control functions (which act on the system as external perturbations) to the relevant Hamiltonians. These phase-space structures may be periodic orbits, invariant tori or their stable and unstable manifolds. One-electron systems and diatomic molecules are fundamental and important staging grounds for new discoveries in nonlinear dynamics. In recent years, increasing emphasis and effort have been put on the control or manipulation of these systems. Recent developments in nonlinear dynamical tools can provide efficient ways of doing so. In the first subtopic of the thesis, we add a control function to restore tori at prescribed locations in phase space. In the remainder of the thesis, a control function with parameters is used to change the linear stability of the periodic orbits which govern the processes in question. In this thesis, we report our theoretical analyses on multiphoton ionization of Rydberg atoms exposed to strong microwave fields and the dissociation of diatomic molecules exposed to bichromatic lasers using nonlinear dynamical tools. This thesis is composed of three subtopics. In the first subtopic, we employ local control theory to reduce the stochastic ionization of the hydrogen atom in a strong microwave field by adding a relatively small control term to the original Hamiltonian. In the second subtopic, we perform periodic orbit analysis to investigate multiphoton ionization driven by a bichromatic microwave field. Our results show quantitative and qualitative agreement with previous studies, and hence identify the mechanism through which short periodic orbits organize the dynamics in multiphoton ionization. In addition, we achieve substantial time savings with this approach. In the third subtopic, we extend our periodic orbit analysis to the dissociation of diatomic molecules driven by a bichromatic laser. In this problem, our results based on periodic orbit analysis again show good agreement with previous work, and hence promise more potential applications of this approach in molecular physics.
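The abstract above repeatedly refers to adding a small control term to the relevant Hamiltonian. As a purely schematic illustration (not the thesis's actual expressions), the controlled Hamiltonian can be written as an unperturbed part, a driving term, and a small control function:

\[
  H_{\mathrm{c}}(p, q, t) \;=\; H_0(p, q) \;+\; \varepsilon\, V(q, t) \;+\; f(q, t),
  \qquad \lVert f \rVert \ll \lVert \varepsilon V \rVert ,
\]

where \(H_0\) is the field-free atom or molecule, \(\varepsilon V\) the microwave or bichromatic laser driving, and \(f\) a small control function chosen either to restore invariant tori at prescribed locations or to alter the linear stability of the organizing periodic orbits.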
Small business innovation research: Program solicitation
NASA Technical Reports Server (NTRS)
1989-01-01
This, the seventh annual SBIR solicitation by NASA, describes the program, identifies eligibility requirements, outlines the required proposal format and content, states proposal preparation and submission requirements, describes the proposal evaluation and award selection process, and provides other information to assist those interested in participating in NASA's SBIR program. It also identifies the Technical Topics and Subtopics in which SBIR Phase 1 proposals are solicited in 1989. These Topics and Subtopics cover a broad range of current NASA interests, but do not necessarily include all areas in which NASA plans or currently conducts research. High-risk, high-payoff innovations are desired.
NASA SBIR Subtopic S2.04 "Advanced Optical Components"
NASA Technical Reports Server (NTRS)
Stahl, H. Philip
2009-01-01
The primary purpose of this subtopic is to develop and demonstrate technologies to manufacture ultra-low-cost precision optical systems for very large x-ray, UV/optical or infrared telescopes. Potential solutions include but are not limited to direct precision machining, rapid optical fabrication, slumping or replication technologies to manufacture 1 to 2 meter (or larger) precision quality mirror or lens segments (either normal incidence for UV/optical/infrared or grazing incidence for x-ray). An additional key enabling technology for UV/optical telescopes is a broadband (from 100 nm to 2500 nm) high-reflectivity mirror coating with extremely uniform amplitude and polarization properties which can be deposited on 1 to 3 meter class mirrors.
Small business innovation research program solicitation
NASA Technical Reports Server (NTRS)
1994-01-01
The National Aeronautics and Space Administration invites eligible small business concerns to submit Phase 1 proposals for its 1994 Small Business Innovation Research (SBIR) Program, which is described in this twelfth annual NASA SBIR Program Solicitation. The 1994 solicitation period for Phase 1 proposals begins April 4, 1994 and ends June 15, 1994. Eligible firms with research or research and development capabilities (R/R&D) in any of the listed topic and subtopic areas are encouraged to participate. Through SBIR, NASA seeks innovative concepts addressing the program needs described in the SBIR solicitation subtopics and offering commercial application potential. This document contains program background information, outlines eligibility requirements for SBIR participants, describes the three SBIR program phases, and provides the information qualified offerors need to prepare and submit responsive proposals.
Teaching Chemical Equilibrium with the Jigsaw Technique
NASA Astrophysics Data System (ADS)
Doymus, Kemal
2008-03-01
This study investigates the effect of cooperative learning (jigsaw) versus individual learning methods on students’ understanding of chemical equilibrium in a first-year general chemistry course. This study was carried out in two different classes in the department of primary science education during the 2005-2006 academic year. One of the classes was randomly assigned as the non-jigsaw group (control) and the other as the jigsaw group (cooperative). Students participating in the jigsaw group were divided into four “home groups” since the topic chemical equilibrium is divided into four subtopics (Modules A, B, C and D). Each of these home groups contained four students. The groups were as follows: (1) Home Group A (HGA), representing the equilibrium state and quantitative aspects of equilibrium (Module A), (2) Home Group B (HGB), representing the equilibrium constant and relationships involving equilibrium constants (Module B), (3) Home Group C (HGC), representing altering equilibrium conditions: Le Chatelier’s principle (Module C), and (4) Home Group D (HGD), representing calculations with equilibrium constants (Module D). The home groups then broke apart, like pieces of a jigsaw puzzle, and the students moved into jigsaw groups consisting of members from the other home groups who were assigned the same portion of the material. The jigsaw groups were then in charge of teaching their specific subtopic to the rest of the students in their learning group. The main data collection tool was a Chemical Equilibrium Achievement Test (CEAT), which was applied to both the jigsaw and non-jigsaw groups. The results indicated that the jigsaw group was more successful than the non-jigsaw group (individual learning method).
Terrestrial photovoltaic measurements, 2
NASA Technical Reports Server (NTRS)
1976-01-01
The following major topics are discussed; (1) Terrestrial solar irradiance; (2) Solar simulation and reference cell calibration; and (3) Cell and array measurement procedures. Numerous related subtopics are also discussed within each major topic area.
Small Business Innovation Research. Program solicitation. Closing date: July 21, 1992
NASA Technical Reports Server (NTRS)
1992-01-01
The National Aeronautics and Space Administration (NASA) invites small businesses to submit Phase 1 proposals in response to its Small Business Innovation Research (SBIR) Program Solicitation 92-1. Firms with research or research and development capabilities (R/R&D) in science or engineering in any of the areas listed are encouraged to participate. This, the tenth annual SBIR solicitation by NASA, describes the program, identifies eligibility requirements, describes the proposal evaluation and award selection process, and provides other information to assist those interested in participating in NASA's SBIR program. It also identifies, in Section 8.0, the technical topics and subtopics in which SBIR Phase 1 proposals are solicited in 1992. These topics and subtopics cover a broad range of current NASA interests but do not necessarily include all areas in which NASA plans or currently conducts research. The NASA SBIR program seeks innovative approaches that respond to the needs, technical requirements, and new opportunities described in the subtopics. The focus is on innovation through the use of emerging technologies, novel applications of existing technologies, exploitation of scientific breakthroughs, or new capabilities or major improvements to existing technologies. NASA plans to select about 320 high-quality research or research and development proposals for Phase 1 contract awards on the basis of this Solicitation. Phase 1 contracts are normally six months in duration and funded up to $50,000, including profit. Selections will be based on the competitive merits of the offers and on NASA needs and priorities.
ERIC Educational Resources Information Center
King, William, Comp.
1981-01-01
A collection of quotations drawn from research and opinion papers dealing with the impact of television viewing on children. Subtopics addressed are: television viewing statistics, effects of television violence, and the relationship of television to education. (JJD)
NASA Technical Reports Server (NTRS)
Smetana, Jerry (Editor); Mittra, Raj (Editor); Laprade, Nick; Edward, Bryan; Zaghloul, Amir
1987-01-01
The IEEE AP-S ADCOM is attempting to expand its educational, tutorial and information exchange activities as a further benefit to all members. To this end, ADCOM will be forming specialized workshops on topics of interest to its members. The first such workshop on Characterization and Packaging of MMIC Devices for Array Antennas was conceived. The workshop took place on June 13, 1986 as part of the 1986 International Symposium sponsored by IEEE AP-S and URSI in Philadelphia, PA, June 9-13, 1986. The workshop was formed to foster the interchange of ideas among MMIC device users and to provide a forum to collect and focus information among engineers experienced and interested in the topic. After brief presentations by the panelists and comments from attendees on several subtopics, the group was divided into working committees. Each committee evaluated and made recommendations on one of the subtopics.
Following the Social Media: Aspect Evolution of Online Discussion
NASA Astrophysics Data System (ADS)
Tang, Xuning; Yang, Christopher C.
With the advance of the Internet and Web 2.0 technologies, it is easy to extract thousands of threads about a topic of interest from an online forum, but it is nontrivial to capture the blueprint of the different aspects (i.e., subtopics or facets) associated with the topic. To better understand and analyze a forum discussion on a given topic, it is important to uncover the evolution relationships (temporal dependencies) between different topic aspects (i.e., how the discussion topic is evolving). Traditional Topic Detection and Tracking (TDT) techniques usually organize topics in a flat structure, which does not present the evolution relationships between topic aspects. In addition, the properties of short and sparse messages make it difficult for content-based TDT techniques to perform well in identifying evolution relationships. The contributions of this paper are twofold. We formally define a topic aspect evolution graph modeling framework and propose to utilize social network information, content similarity, and temporal proximity to model evolution relationships between topic aspects. The experimental results showed that, by incorporating social network information, our technique significantly outperformed the content-based technique in the task of extracting evolution relationships between topic aspects.
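A minimal sketch of how the three evidence sources named above (content similarity, social network overlap, and temporal proximity) could be combined into a single score for a candidate evolution edge between two topic aspects. The weights, the cosine/Jaccard choices, and the exponential time decay are illustrative assumptions, not the paper's model.

```python
# Hedged sketch: score a candidate evolution edge between two topic aspects by
# combining content similarity, overlap of participating users, and temporal
# proximity. Weights and similarity functions are illustrative assumptions.
import math
from collections import Counter

def cosine(text_a, text_b):
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    num = sum(a[w] * b[w] for w in set(a) & set(b))
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def jaccard(users_a, users_b):
    union = users_a | users_b
    return len(users_a & users_b) / len(union) if union else 0.0

def evolution_score(aspect_a, aspect_b, w=(0.4, 0.3, 0.3), decay=7.0):
    """aspect = {'text': ..., 'users': set(...), 'day': int}; higher score means
    aspect_b is more likely to have evolved from aspect_a."""
    content = cosine(aspect_a["text"], aspect_b["text"])
    social = jaccard(aspect_a["users"], aspect_b["users"])
    temporal = math.exp(-abs(aspect_b["day"] - aspect_a["day"]) / decay)  # closer in time -> higher
    return w[0] * content + w[1] * social + w[2] * temporal

a = {"text": "vaccine rollout schedule delays", "users": {"u1", "u2"}, "day": 3}
b = {"text": "side effects reported after vaccine rollout", "users": {"u2", "u3"}, "day": 5}
print(round(evolution_score(a, b), 3))
```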
Leroy, Gondy; Xu, Jennifer; Chung, Wingyan; Eggers, Shauna; Chen, Hsinchun
2007-01-01
Retrieving sufficient relevant information online is difficult for many people because they use too few keywords to search and search engines do not provide many support tools. To further complicate the search, users often ignore support tools when available. Our goal is to evaluate in a realistic setting when users use support tools and how they perceive these tools. We compared three medical search engines with support tools that require more or less effort from users to form a query and evaluate results. We carried out an end user study with 23 users who were asked to find information, i.e., subtopics and supporting abstracts, for a given theme. We used a balanced within-subjects design and report on the effectiveness, efficiency and usability of the support tools from the end user perspective. We found significant differences in efficiency but did not find significant differences in effectiveness between the three search engines. Dynamic user support tools requiring less effort led to higher efficiency. Fewer searches were needed and more documents were found per search when both query reformulation and result review tools dynamically adjust to the user query. The query reformulation tool that provided a long list of keywords, dynamically adjusted to the user query, was used most often and led to more subtopics. As hypothesized, the dynamic result review tools were used more often and led to more subtopics than static ones. These results were corroborated by the usability questionnaires, which showed that support tools that dynamically optimize output were preferred.
ERIC Educational Resources Information Center
Winston, Alan G., Ed.; Seekins, Nancy, Ed.
The manual is intended to provide guidelines for the planning and development of parks and recreation facilities which are accessible to everyone. Separate chapters present guidelines for the following topics (sample subtopics in parentheses): general information (space relationships and wheelchair functions); general site conditions (soil…
Planning Instruction for the Severely Handicapped.
ERIC Educational Resources Information Center
North Carolina State Dept. of Public Instruction, Raleigh. Div. for Exceptional Children.
The manual discusses legal and procedural guidelines established by North Carolina regarding educational services for severely handicapped students. Covered in separate sections are the following topics (sample subtopics in parentheses): definition; placement procedures (referral, screening, school-based committee, assessment, placement, and exit…
NASA Astrophysics Data System (ADS)
Campbell, Chad Edward
Over the past decade, hundreds of studies have introduced genomics and bioinformatics (GB) curricula and laboratory activities at the undergraduate level. While these publications have facilitated the teaching and learning of cutting-edge content, there has yet to be an evaluation of these assessment tools to determine if they are meeting the quality control benchmarks set forth by the educational research community. An analysis of these assessment tools indicated that <10% referenced any quality control criteria and that none of the assessments met more than one of the quality control benchmarks. In the absence of evidence that these benchmarks had been met, it is unclear whether these assessment tools are capable of generating valid and reliable inferences about student learning. To remedy this situation the development of a robust GB assessment aligned with the quality control benchmarks was undertaken in order to ensure evidence-based evaluation of student learning outcomes. Content validity is a central piece of construct validity, and it must be used to guide instrument and item development. This study reports on: (1) the correspondence of content validity evidence gathered from independent sources; (2) the process of item development using this evidence; (3) the results from a pilot administration of the assessment; (4) the subsequent modification of the assessment based on the pilot administration results and; (5) the results from the second administration of the assessment. Twenty-nine different subtopics within GB (Appendix B: Genomics and Bioinformatics Expert Survey) were developed based on preliminary GB textbook analyses. These subtopics were analyzed using two methods designed to gather content validity evidence: (1) a survey of GB experts (n=61) and (2) a detailed content analyses of GB textbooks (n=6). By including only the subtopics that were shown to have robust support across these sources, 22 GB subtopics were established for inclusion in the assessment. An expert panel subsequently developed, evaluated, and revised two multiple-choice items to align with each of the 22 subtopics, producing a final item pool of 44 items. These items were piloted with student samples of varying content exposure levels. Both Classical Test Theory (CTT) and Item Response Theory (IRT) methodologies were used to evaluate the assessment's validity, reliability and ability inferences, and its ability to differentiate students with different magnitudes of content exposure. A total of 18 items were subsequently modified and reevaluated by an expert panel. The 26 original and 18 modified items were once again piloted with student samples of varying content exposure levels. Both CTT and IRT methodologies were once again used to evaluate student responses in order to evaluate the assessment's validity and reliability inferences as well as its ability to differentiate students with different magnitudes of content exposure. Interviews with students from different content exposure levels were also performed in order to gather convergent validity evidence (external validity evidence) as well as substantive validity evidence. Also included are the limitations of the assessment and a set of guidelines on how the assessment can best be used.
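For readers unfamiliar with the Classical Test Theory side of the evaluation mentioned above, the following sketch computes standard CTT item statistics (item difficulty, corrected item-total discrimination, and Cronbach's alpha) on a toy response matrix. The data are made up; this is not the study's instrument or analysis code.

```python
# Hedged sketch of standard Classical Test Theory item statistics of the kind
# mentioned above: item difficulty (proportion correct), discrimination
# (corrected item-total correlation), and Cronbach's alpha. Toy data only.
import numpy as np

responses = np.array([          # rows = students, cols = items (1 = correct)
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
])

difficulty = responses.mean(axis=0)                     # proportion correct per item

totals = responses.sum(axis=1)
discrimination = np.array([
    np.corrcoef(responses[:, j], totals - responses[:, j])[0, 1]
    for j in range(responses.shape[1])
])                                                      # corrected item-total r

k = responses.shape[1]
item_var = responses.var(axis=0, ddof=1).sum()
total_var = totals.var(ddof=1)
alpha = (k / (k - 1)) * (1 - item_var / total_var)      # Cronbach's alpha

print("difficulty:", difficulty)
print("discrimination:", np.round(discrimination, 2))
print("alpha:", round(alpha, 2))
```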
NASA Technical Reports Server (NTRS)
Baldwin, Kenneth; Feeback, Daniel
1999-01-01
Presentations from the assembled group of investigators involved in specific research projects related to skeletal muscle in space flight can be categorized into thematic subtopics: regulation of contractile protein phenotypes; muscle growth and atrophy; muscle structure (injury, recovery, and regeneration); metabolism and fatigue; and motor control and loading factors.
ERIC Educational Resources Information Center
Marzuki, Ariffin Bin; Peng, J. Y.
A profile of Malaysia is sketched in this paper. Emphasis is placed on the nature, scope, and accomplishments of population activities in the country. Topics and sub-topics include: location and description of the country; population (size, growth patterns, age structure, urban/rural distribution, ethnic and religious composition, migration,…
Rules and Regulations for Education Programs for the Handicapped.
ERIC Educational Resources Information Center
Pace, R. Elwood; And Others
The manual presents Utah's rules and regulations for education programs serving handicapped students. Regulations touch upon the following topics (sample subtopics in parentheses): responsibilities of the State Office of Education (authority to make policy); child identification (child find and screening, referral, evaluation/classification…
ICTNET at Web Track 2009 Diversity task
2009-11-01
performance. On the World Wide Web, there exist many documents which represent several implicit subtopics. We used commercial search engines to gather those...documents. In this task, our work can be divided into five steps. First, we collect documents returned by commercial search engines, and considered
Power Sprayers, Power Dusters, and Aerial Equipment for Pesticide Application.
ERIC Educational Resources Information Center
Cole, Herbert, Jr.
This agriculture extension service publication from Pennsylvania State University discusses agricultural pesticide application equipment. The three sections of the publication are Power Sprayers, Power Dusters, and Aerial Equipment. In the section discussing power sprayers, subtopics include hydraulic sprayers, component parts, multi-purpose farm…
ERIC Educational Resources Information Center
Population Council, New York, NY.
A profile of Jamaica is sketched in this paper. Emphasis is placed on the nature, scope, and accomplishments of population activities in the country. Topics and sub-topics include: location and description of the island; population - size, growth patterns, age structure, rural/urban distribution, ethnic and religious composition, literacy, future…
Legal Rights & Intellectual Disability: A Short Guide.
ERIC Educational Resources Information Center
Hall, Julia, Ed.; And Others
The book examines actions that may be taken to redress wrongs illegally perpetrated against people with intellectual disabilities in New South Wales, Australia. Ten topic areas are addressed (sample subtopics in parentheses): protecting rights (complaints to government departments, use of the ombudsman); discrimination (legal aid); personal…
ERIC Educational Resources Information Center
Population Council, New York, NY.
A profile of Indonesia is sketched in this paper. Emphasis is placed on the nature, scope, and accomplishments of population activities in the country. Topics and sub-topics include: location and description of the country; population - size, growth patterns, age structure, urban/rural distribution, ethnic and religious composition, migration,…
ERIC Educational Resources Information Center
Keeny, S. M.; And Others
A profile of Taiwan is sketched in this paper. Emphasis is placed on the nature, scope, and accomplishments of population activities in the country. Topics and sub-topics include: location and description of the country; population (size, growth patterns, age structure, urban/rural distribution, ethnic and religious composition, migration,…
Selling to Industry for Sheltered Workshops.
ERIC Educational Resources Information Center
Rehabilitation Services Administration (DHEW), Washington, DC.
Intended for staffs of sheltered workshops for handicapped individuals, the guide presents a plan for selling the workshop idea to industry, hints on meeting obstacles, and ideas for expanding and upgrading workshop contract promotion. Brief sections cover the following topics (example subtopics are in parentheses): finding work contract prospects…
Developing a Research Agenda for Assisted Living
ERIC Educational Resources Information Center
Kane, Rosalie A.; Wilson, Keren Brown; Spector, William
2007-01-01
Purpose: We describe an approach to identifying knowledge gaps, research questions, and methodological issues for assisted living (AL) research. Design and Methods: We undertook an inventory of AL literature and research in progress and commissioned background papers critiquing knowledge on selected subtopics. With an advisory committee, we…
ERIC Educational Resources Information Center
Council for Exceptional Children, Reston, VA. Center for Special Education Technology.
This set of 10 resource inventories provides listings of information and service resources organized by state or by subtopic. Listings typically include name, address, phone, and a contact person. The first inventory lists the 39 Alliance for Technology Access Centers which are community-based resources providing specific areas of expertise for…
Trip Leaders Guide. Outdoor Expeditions and Classes.
ERIC Educational Resources Information Center
Leister, Bob
Written to help teachers or leaders plan and lead field trips, excursions, or expeditions which stimulate a motivation to positive action, this pamphlet provides assistance in conducting learning experiences outside the classroom. Topics and subtopics discussed include: (1) Campsites: selection; firebuilding; knives, axes, saws; neat campsites;…
ERIC Educational Resources Information Center
Mahoney, Joyce; And Others
1988-01-01
Evaluates 16 commercially available courseware packages covering topics for introductory physics. Discusses the price, sub-topics, program type, interaction, time, calculus required, graphics, and comments of each program. Recommends two packages in measurement and vectors, and one-dimensional motion respectively. (YP)
Human Welfare and Technological Innovation. Open Grants Papers No. 2.
ERIC Educational Resources Information Center
Hayashi, Yujiro
This publication on human welfare and technological innovation contains two sections. The first section examines the objectives and functions of technological innovation while the second section discusses the direction and analysis of technology transfer between Japan and other nations. Subtopics within the first section include: (1)…
Nursery Production, A Student Handbook.
ERIC Educational Resources Information Center
Buckey, Sylvia; And Others
Developed by a group of university faculty members and graduate students, this textbook is designed for high school, technical school, and associate degree agricultural programs in the northeast section of the United States that study the nursery industry. Chapter topics, which include 84 subtopics, are: (1) Kinds of Nurseries, (2) Occupation in…
Guidelines for Teachers and Parents of Visually Handicapped Children with Additional Handicaps.
ERIC Educational Resources Information Center
Eustis, E. M.; Tierney, B.
Intended for parents and teachers of blind, multihandicapped children in special schools, the booklet outlines practical suggestions for teaching children with varying degrees of handicap. Sections cover the following areas (subtopics in parentheses): visual handicap (degrees of blindness); motor development and mobility (suggestions for…
In-Service Training Materials for Teachers of the Educable Mentally Retarded. Session III.
ERIC Educational Resources Information Center
Meyen, Edward L.; Carr, Donald L.
Supplementing language arts for the educable mentally handicapped, the guide provides a representative unit on newspapers with core area activities, vocabulary, and 33 lesson plans. Sub-topics include community orientation, occupations, leisure time and recreation, weather, local history, money management, homemaking and home repair,…
Igniting Creativity and Planning for Your Gifted Students.
ERIC Educational Resources Information Center
Russell, Don W., Ed.
The collection of instructional plans is designed to offer samples of strategies and ideas to teachers involved with gifted students. Approximately 30 plans are presented for the following areas (sample subtopics in parentheses): science (atomic fusion), social studies (mores and folkways), mathematics (spatial relations), health and physiology, philosophy, and…
New Trends in Mathematics Teaching, Volume III.
ERIC Educational Resources Information Center
United Nations Educational, Scientific, and Cultural Organization, Paris (France).
Each of the ten chapters in this volume is intended to present an objective analysis of the trends of some important subtopic in mathematics education and each includes a bibliography for fuller study. The chapters cover primary school mathematics, algebra, geometry, probability and statistics, analysis, logic, applications of mathematics, methods…
An Audiovisual Program in Cell Biology
ERIC Educational Resources Information Center
Fedoroff, Sergey; Opel, William
1978-01-01
A subtopic of cell biology, the structure and function of cell membranes, has been developed as a series of seven self-instructional slide-tape units and tested in five medical schools. Organization of advisers, analysis and definition of objectives and content, and development and evaluation of scripts and storyboards are discussed. (Author/LBH)
Aging Americans: Trends and Projections. 1985-86 Edition.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. Senate Special Committee on Aging.
Analyzed statistics relevant to and about older Americans are contained in this document. Chapter topics (with some subtopics) include the following: (1) size and growth of the older population (age distribution, life expectancy); (2) geographic distribution and mobility (mobility, countermigration); (3) economic status (median cash income, sex,…
ERIC Educational Resources Information Center
Brantlinger, Ellen; And Others
The first of two documents designed to provide training for paraprofessionals working with moderately and severely/profoundly handicapped students, the sourcebook provides information on preservice and inservice education. Narrative is presented and activities are described for the following preservice topics (sample subtopics in parentheses):…
Mainstreaming: Merging Regular and Special Education.
ERIC Educational Resources Information Center
Hasazi, Susan E.; And Others
The booklet on mainstreaming looks at the merging of special and regular education as a process rather than as an end. Chapters address the following topics (sample subtopics in parentheses): what is mainstreaming; pros and cons of mainstreaming; forces influencing change in special education (educators, parents and advocacy groups, the courts,…
Guidelines for the Use of Behavioral Procedures in State Programs for Retarded Persons.
ERIC Educational Resources Information Center
May, Jack G., Jr.; And Others
One of six publications of the Research Advisory Committee of the National Association for Retarded Citizens, the monograph presents guidelines for using behavioral procedures with retarded individuals in residential settings, group living homes, sheltered workshops, or other settings. Addressed are the following topics (sample subtopics in…
ERIC Educational Resources Information Center
Moody, Sidney B.; Miller, L. E.
The handbook is designed to assist youth leaders in the Future Farmers of America (FFA). It is organized into nine sections of varying length which consider the following facets of FFA (with sample sub-topics in parentheses): FFA members (things to know to become an effective member, membership policy); FFA officers (duties and qualifications of…
Deinstitutionalization and Residential Services: A Literature Survey. Project Report No. 1.
ERIC Educational Resources Information Center
Thurlow, Martha L.; And Others
The monograph reviews literature on issues related to deinstitutionalization and residential services for the developmentally disabled. Six main topics are addressed in the review (sample subtopics in parentheses): planning for deinstitutionalization (use of institutional facilities no longer in operation, training and job placement of displaced…
76 FR 71968 - Proposed Agency Information Collection Activities; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-21
... thereby be included in regulatory capital. Although the accounting for capital contributions is not... contributions in their FR Y-9C report in accordance with generally accepted accounting principles (GAAP). In.... The accounting for capital contributions in the form of notes receivable is set forth in ASC Subtopic...
The Best of Challenge. Volume III.
ERIC Educational Resources Information Center
American Alliance for Health, Physical Education, and Recreation, Washington, DC.
Provided are reprints of 56 articles on physical education and recreation for the mentally retarded originally published between September/October 1973 and April/May 1976. Articles are grouped according to the following major topics (sample subtopics in parentheses): activities (arts, crafts, and games; camping and canoeing; drama and music; and…
The World of Man: A Curriculum Guide.
ERIC Educational Resources Information Center
Peters, Richard O.
This one semester, ecology-oriented, eleventh or twelfth grade elective course exposes students to the problems of environmental degradation and makes them aware of man's attempts to remedy crisis situations. The curriculum guide is divided into three major topics, each comprised of several subtopics which include content, objectives, and…
Physiological Response to Physical Activity in Children.
ERIC Educational Resources Information Center
Gilliam, Thomas B.
This report reviews research on children's physical responses to strenuous activity. The paper is divided into three subtopics: (1) peak performance measures in children; (2) training effects on children; and (3) the importance of physical activity for children. Measurements used are oxygen consumption, ventilation, heart rate, cardiac…
Higher Education: A Bibliographic Handbook, Volume II.
ERIC Educational Resources Information Center
Halstead, D. Kent, Ed.
Higher education topics that pertain to the individual institution are addressed in this annotated bibliography, which primarily covers publications issued during 1968-1980. In addition, introductory descriptions of each topic and outlines of subtopics are provided. The 20 major topics and the compilers for each topic are as follows:…
Real-time filtering and detection of dynamics for compression of HDTV
NASA Technical Reports Server (NTRS)
Sauer, Ken D.; Bauer, Peter
1991-01-01
The preprocessing of video sequences for data compression is discussed. The end goal is a compression system for HDTV capable of transmitting perceptually lossless sequences at under one bit per pixel. Two subtopics were emphasized to prepare the video signal for more efficient coding: (1) nonlinear filtering to remove noise and shape the signal spectrum to take advantage of insensitivities of human viewers; and (2) segmentation of each frame into temporally dynamic/static regions for conditional frame replenishment. The latter technique operates best under the assumption that the sequence can be modelled as a superposition of an active foreground and a static background. The considerations were restricted to monochrome data, since the standard luminance/chrominance decomposition, which concentrates most of the bandwidth requirements in the luminance, was expected to be used. Similar methods may be applied to the two chrominance signals.
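A minimal sketch of the second preprocessing step described above: labeling blocks of a monochrome frame as temporally dynamic or static by comparison with the previous frame, as a basis for conditional frame replenishment. The block size and change threshold are illustrative assumptions, not the paper's parameters.

```python
# Hedged sketch: label each block of a monochrome frame as "dynamic" (changed
# vs. the previous frame) or "static", so only dynamic blocks need to be
# re-transmitted (conditional frame replenishment). Block size and threshold
# are illustrative assumptions only.
import numpy as np

def dynamic_mask(prev_frame, cur_frame, block=8, threshold=6.0):
    """Return a boolean mask of shape (blocks_y, blocks_x); True = block changed."""
    h, w = cur_frame.shape
    by, bx = h // block, w // block
    mask = np.zeros((by, bx), dtype=bool)
    for i in range(by):
        for j in range(bx):
            a = prev_frame[i*block:(i+1)*block, j*block:(j+1)*block].astype(float)
            b = cur_frame[i*block:(i+1)*block, j*block:(j+1)*block].astype(float)
            mask[i, j] = np.abs(a - b).mean() > threshold   # mean absolute difference
    return mask

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    prev = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
    cur = prev.copy()
    cur[16:32, 16:32] = 255              # simulate a moving foreground object
    m = dynamic_mask(prev, cur)
    print(f"{m.sum()} of {m.size} blocks marked dynamic")
```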
Exercises to Accompany Mathematics 301. Curriculum Support Series.
ERIC Educational Resources Information Center
Manitoba Dept. of Education, Winnipeg.
These sample problems, exercises, questions, and projects were compiled to supplement the guide for the Manitoba course Mathematics 301 in order to assist teachers in implementing the program. Arranged according to the modules of the course guide, they are coded to the objectives of the program. Review exercises follow either the subtopics within…
Designing a Virtual Grand Tour
ERIC Educational Resources Information Center
Hansen, Per Skafte
2004-01-01
The Virtual Grand Tour (VGT) is a paradigm for integrating a presentation of an overview of a larger subject with the possibility of launching at any time an exploratory study of a given sub-topic. The name derives from the paradigm's emulation of those 18th-century travels intended to educate (especially) young, affluent British men; today, with…
Context-Based Questions: Optics in Animal Eyes
ERIC Educational Resources Information Center
Kaltakci, Derya; Eryilmaz, Ali
2011-01-01
Context is important as a motivational factor for student involvement with physics. The diversity in the types and the functions of animal eyes is an excellent context in which to achieve this goal. There exists a range of subtopics in optics including pinhole, reflection, refraction, and superposition that can be discussed in the context of the…
ERIC Educational Resources Information Center
Indiana State Dept. of Education, Indianapolis. Center for School Improvement and Performance.
This guide uses a thematic approach to show the integration of subjects (reading, mathematics, language arts, science/fine arts) and skills to create a context for learning. The contents of this guide are presented in a holistic format. There are six major topics in the guide, each with subtopics: (1) "Getting Your Feet Wet--An Introduction to…
ERIC Educational Resources Information Center
Montgomery County Public Schools, Rockville, MD.
The bibliography includes references to approximately 1500 instructional materials for use with gifted and talented students. Entries usually include author's name, title, availability information and a brief annotation for the following areas (sample subtopics in parentheses): aesthetic education (recommended art and music textbooks); English…
ERIC Educational Resources Information Center
Brantlinger, Ellen; And Others
The second of two documents designed for training paraprofessionals to work with moderately and severely/profoundly handicapped students, the teacher's guide presents information on preservice and inservice education. Preservice information to be read by the paraprofessional touches on the following topics (sample subtopics in parentheses):…
Preparing Food for Preschoolers: A Guide for Food Service Personnel.
ERIC Educational Resources Information Center
Lundin, Janet, Ed.; O'Malley, Edward T., Ed.
Guidelines and suggestions to help food service workers in children's day care centers plan, prepare, and serve a variety of nutritious, tasty, and attractive meals and snacks are presented. The following topics are included (subtopics are listed in parentheses): (1) preparation of food (seasoning foods; preparing meat, fish, vegetables, and…
Spencer, Nicholas D
2012-01-01
The 156th Faraday Discussion covered the field of tribology, focussing on the subtopics of biotribology, predictive modelling, smart surfaces, and future lubricated systems. The papers themselves covered topics that drew on the fields of biology, medicine, chemistry, physics, materials science and mechanical engineering, providing a challenging and fascinating insight into the current state of the field of tribology.
Inside Television: A Guide To Critical Viewing.
ERIC Educational Resources Information Center
White, Ned
This course is divided into seven units, each focusing on a particular aspect of television. The unit topics and some of the subtopics included are: (1) television and the American viewer; (2) the television industry (the networks, the role of the Federal Communications Commission, public television, and the business of television); (3) programs…
ERIC Educational Resources Information Center
National Accreditation Council for Agencies Serving the Blind and Visually Handicapped, New York, NY.
The guide provides accreditation standards for programs which transcribe printed matter into alternate media for blind and visually handicapped persons. Presented in a self study format, the booklet touches upon six aspects of the production of reading materials (sample subtopics in parentheses): planning and organization (administration,…
Health and Fitness Through Physical Activity.
ERIC Educational Resources Information Center
Pollock, Michael L.; And Others
A synthesis of research findings in exercise and physical fitness is presented to provide the general public with insights into establishing an individualized exercise program. The material is divided into seven subtopics: (1) a general overview of the need for exercise and fitness and how it is an integral part of preventive medicine programs;…
The Future of Government Funding for Persons with Disabilities: Some Key Factors.
ERIC Educational Resources Information Center
Ross, E. Clarke
1980-01-01
The paper identifies and discusses key factors associated with government funding for disabled individuals. An introductory section traces the growth of public expenditures in recent years. Five key factors affecting government funding are examined (sample subtopics in parentheses): state government tax and spending limits (Proposition 13 and the…
Evaluating and Using Literature Including People with Disabilities in All Classrooms
ERIC Educational Resources Information Center
Oslick, Mary Ellen; Pearson, Mary
2016-01-01
To help students see their worlds differently and to expand those views beyond their own backyards, educators can expose them to quality multicultural children's literature. In this article, we focus on a subtopic within the genre of multicultural children's literature: literature including people with disabilities. We chose seven recent texts…
Bright Promise for Your Child with Cleft Lip and Cleft Palate. Revised Edition.
ERIC Educational Resources Information Center
McDonald, Eugene T.; Berlin, Asa J.
Intended for parents of children with cleft lip and cleft palate, the booklet provides an overview of the condition. Addressed are the following topics (sample subtopics in parentheses): prenatal development and birth defects (facial development); possible causes of cleft lip/cleft palate (common misconceptions, genetic factors, environmental…
ERIC Educational Resources Information Center
Mahoney, Joyce; And Others
1988-01-01
Evaluates 10 courseware packages covering topics for introductory physics. Discusses the price; sub-topics; program type; interaction; possible hardware; time; calculus required; graphics; and comments on each program. Recommends two packages in projectile and circular motion, and three packages in statics and rotational dynamics. (YP)
ERIC Educational Resources Information Center
Grable-Wallace, Lisa; And Others
1989-01-01
Evaluates 5 courseware packages covering the topics of simple harmonic motion, 7 packages for wave motion, and 10 packages for sound. Discusses the price range, sub-topics, program type, interaction, time, calculus required, graphics, and comments of each courseware. Selects several packages based on the criteria. (YP)
Technical Writing Resources. A Handbook for Engineering and Technology Faculty at Purdue.
ERIC Educational Resources Information Center
Cheek, Madelon
Ideas for technical writing assistance and resources that are available to Purdue University faculty who incorporate a writing component into their courses are presented in this guide. Following an introduction containing the purpose, background, and scope of the guide, three main topics and their subtopics form the guide's structure: (1)…
What's in a Name? Denotation, Connotation, and "A Boy Named Sue"
ERIC Educational Resources Information Center
Lawton, Bessie
2011-01-01
Language choice--specifically word choice--is an important topic on a basic communication or public speaking course. One sub-topic under "Language" involves understanding the difference between denotation and connotation. Denotation refers to a word's definition, while connotation refers to the emotions associated with the word. Speakers need to…
Topics for Mathematics Clubs. Second Edition.
ERIC Educational Resources Information Center
Dalton, LeRoy C., Ed.; Snyder, Henry D., Ed.
One of the main purposes of a mathematics club is to provide the opportunity for students to study exciting topics in mathematics not ordinarily discussed in the classroom. Each of the 10 chapters in this booklet is a collection of related subtopics. Each idea is presented and discussed; bibliographies then suggest in-depth reading. The chapters…
Mental Health and Mental Retardation Services in Nevada.
ERIC Educational Resources Information Center
Kakalik, J. S.; And Others
Summarized are the findings and recommendations of a 2-year study of all major mental health, mental retardation, alcohol, and drug abuse services and programs in Nevada. Fourteen chapters are given to the following topics (sample subtopics are in parentheses): description of the survey (scope of the project); summary and recommendations…
Learning Qualitative and Quantitative Reasoning in a Microworld for Elastic Impacts.
ERIC Educational Resources Information Center
Ploetzner, Rolf; And Others
1990-01-01
Discusses the artificial-intelligence-based microworld DiBi and MULEDS, a multilevel diagnosis system, developed to adapt tutoring style to the individual learner. Explains that DiBi sets up a learning environment and simulates elastic impacts as a subtopic of classical mechanics, supporting reasoning on different levels of mental domain…
Assessing information transfer in full mission flight simulations
NASA Technical Reports Server (NTRS)
Lee, Alfred T.
1990-01-01
Considerable attention must be given to the important topic of aircrew situation awareness in any discussion of aviation safety and flight deck design. Reliable means of assessing this important aspect of crew behavior without simultaneously interfering with the behavior are difficult to develop. Unobtrusive measurement of crew situation awareness is particularly important in the conduct of full mission simulations where considerable effort and cost is expended to achieve a high degree of operational fidelity. An unobtrusive method of assessing situational awareness is described here which employs a topical analysis of intra-crew communications. The communications were taken from videotapes of crew behavior prior to, during, and following an encounter with a microburst/windshear event. The simulation scenario re-created an actual encounter with an event during an approach into Denver Stapleton Airport. The analyses were conducted on twelve experienced airline crews with the objective of determining the effect on situation awareness of uplinking ground-based information of the crew during the approach. The topical analysis of crew communication was conducted on all references to weather or weather-related topics. The general weather topic was further divided into weather subtopical references such as surface winds, windshear, precipitation, etc., thereby allowing for an assessment of the relative frequency of subtopic reference during the scenario. Reliable differences were found between the relative frequency of subtopic references when comparing the communications of crews receiving a cockpit display of ground-based information to the communications of a control group. The findings support the utility of this method of assessing situation awareness and information value in full mission simulations. A limiting factor in the use of this measure is that crews vary in the amount of intra-crew communications that may take place due to individual differences and other factors associated with crew coordination. This factor must be taken into consideration when employing this measure. Viewgraphs are given.
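A minimal sketch of the topical tally described above: counting references to weather subtopics in transcribed crew utterances and reporting their relative frequencies per group. The keyword lists and sample utterances are illustrative assumptions, not the study's coding scheme.

```python
# Hedged sketch of the topical analysis described above: tally references to
# weather subtopics in crew utterances and compare relative frequencies between
# groups. Keyword lists and utterances are illustrative assumptions only.
from collections import Counter

SUBTOPIC_KEYWORDS = {
    "surface_winds": ["surface wind", "wind check", "gusting"],
    "windshear":     ["windshear", "wind shear", "microburst"],
    "precipitation": ["rain", "precip", "virga"],
}

def subtopic_frequencies(utterances):
    counts = Counter()
    for u in utterances:
        text = u.lower()
        for subtopic, keywords in SUBTOPIC_KEYWORDS.items():
            counts[subtopic] += sum(text.count(k) for k in keywords)
    total = sum(counts.values())
    if total == 0:
        return {s: 0.0 for s in SUBTOPIC_KEYWORDS}
    return {s: counts[s] / total for s in SUBTOPIC_KEYWORDS}

datalink_crews = ["microburst alert on the datalink, surface wind check please",
                  "heavy rain ahead, watch for wind shear on final"]
control_crews  = ["tower reports gusting surface wind", "any precip on the radar?"]

print("datalink:", subtopic_frequencies(datalink_crews))
print("control:", subtopic_frequencies(control_crews))
```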
Commemorating the End of World War II: How World War II Is Taught in American Classrooms.
ERIC Educational Resources Information Center
Barth, James L.
1995-01-01
Explains and presents the results of a survey that asked teachers to rank in importance, and provide time spent on, broad topics (rise of fascism) and related subtopics (Hitler's approach to power). Two-hundred four K-12 teachers responded and provided personal information such as gender and class period length. (MJP)
Speech Deficits in Persons with Autism: Etiology and Symptom Presentation
ERIC Educational Resources Information Center
Matson, Johnny L.; Kozlowski, Alison M.; Matson, Michael M.
2012-01-01
Speech and other communication deficits are core features of the autism spectrum. This topic has become one of the most heavily studied in the child health/mental health field. Even within this group of disorders, considerable variability in symptoms is evident. A variety of subtopics within this area have been studied. Topics include types of…
Spaceship Earth. Social Studies Interim Grade Guide for Grade Seven.
ERIC Educational Resources Information Center
Manitoba Dept. of Education, Winnipeg. Curriculum Development Branch.
Seventh graders in Manitoba will gain a better understanding of the highly interdependent and interconnected world in which they live when they complete these supplementary units of study. Units and subtopics are: (1) Planet Earth--how it resembles a spaceship, its relationship to the universe and to the solar system, and how its motions and…
ERIC Educational Resources Information Center
George-Nichols, Nancy; And Others
The guide is intended to provide information on appropriate programing for elementary and secondary pupils with either perceptual/communicative or emotional/behavioral disorders. The guide, which is patterned after regular education objectives, offers comprehensive task analysis in four content areas (subtopics in parentheses): (1) readiness…
COURSE OUTLINE FOR THIRD SIX WEEKS OF SCIENCE-LEVEL II, TALENT PRESERVATION CLASSES.
ERIC Educational Resources Information Center
Houston Independent School District, TX.
UNIT III (SIX WEEKS) CONCERNS PLANT LIFE, AND DEALS WITH THALLUS PLANTS, MOSSES, FERNS, AND SEED PLANTS. UNIT IV (SIX WEEKS) COVERS AIR AND SPACE, WITH SUBTOPICS ON ASTRONOMY AND WEATHER. "THE CHANGING EARTH," DEALING WITH GEOLOGY AND CONSERVATION, COMPRISES UNIT V (6 WEEKS). THE LAST, UNIT VI (6 WEEKS), DEALS WITH CONSUMER…
The Citizen Bee Guide to American Studies [with] Student Answer Key. Third Edition.
ERIC Educational Resources Information Center
Close Up Foundation, Arlington, VA.
Designed for students, this survey of American history, culture, government, economics, and geography tests their knowledge in these areas through a variety of questions. The questions are organized into 12 subtopics divided among 4 major categories: 5 topics under History, 5 under Government, 1 under Economics, and 1 under Geography. The topic…
Class Discussions: Locating Social Class in Novels for Children and Young Adults
ERIC Educational Resources Information Center
McLeod, Cynthia Anne
2008-01-01
Few studies on representations of social class in children's literature have been published in the United States. As a language arts teacher and media specialist in a high poverty school, the author describes children's novels that directly address social class and the subtopic of the labor movement and considers the continued relevance of social…
People Through the Ages. Social Studies Interim Grade Guide for Grade Eight.
ERIC Educational Resources Information Center
Manitoba Dept. of Education, Winnipeg. Curriculum Development Branch.
Supplementary units of study help eighth graders in Manitoba explore the ways people lived within selected societies of the past and realize that life today is closely related to developments which have occurred through the ages. Units and subtopics are: (1) Life during Prehistoric and Early Historic Times--prehistoric times, life in early river…
Discipline and the Section 504 Student: Your Quick-Reference Guide to Best Practices.
ERIC Educational Resources Information Center
Caruso, Brian, Ed.
This document is intended to provide guidance to schools in the discipline of students with disabilities in compliance with regulations under Section 504 of the Rehabilitation Act of 1973. Chapters address the following topics (sample sub-topics in parentheses): (1) basics of discipline under Section 504 (common mistakes districts make when…
Science Syllabus for Middle and Junior High Schools. Living Systems: Block C, Micro-Organisms.
ERIC Educational Resources Information Center
New York State Education Dept., Albany. Bureau of General Education Curriculum Development.
This syllabus begins with a list of program objectives and performance criteria for seven general topic areas related to the study of microorganisms and a list of 23 science processes. Following this information, concepts and understandings for subtopics within the general topic areas are listed as follows: (1) introduction; (2) the cell (basic…
ERIC Educational Resources Information Center
Cui, Weili; Jones, Wayne E., Jr.; Klotzkin, David; Myers, Greta L.; Wagoner, Shawn; White, Bruce
2015-01-01
Microfabrication is a critical area to many branches of science and engineering. However, to many students accustomed to seeing transistors as things that come in a lab kit, it is an obscure subtopic of their discipline. Beginning in 2009, the authors undertook a broad multidisciplinary approach to bring microfabrication into all aspects of the…
ERIC Educational Resources Information Center
PATRICK, JOHN J.
A REVIEW OF EXISTING RESEARCH WAS MADE ON THE TOPIC OF POLITICAL SOCIALIZATION OF AMERICAN YOUTH. THE AUTHOR POSED THE FOLLOWING QUESTIONS AS SUBTOPICS TO THE OVERALL RESEARCH REVIEW--(1) WHAT IS POLITICAL SOCIALIZATION, (2) WHAT DO YOUNG AMERICANS BELIEVE ABOUT POLITICS, (3) HOW DO YOUNG AMERICANS ACQUIRE POLITICAL BELIEFS, AND (4) HOW IMPORTANT…
A New Look at Reading in the Social Studies. Perspectives in Reading No. 12.
ERIC Educational Resources Information Center
Preston, Ralph C., Ed.
Five papers given at the International Reading Association conference held in conjunction with a National Council for the Social Studies conference in 1968 are presented in this work. Although each paper is addressed to a different subtopic, there is the common theme of the role that reading can play in social studies instruction. The subtopics…
ERIC Educational Resources Information Center
Nyroos, Mikaela; Wiklund-Hornqvist, Carola
2012-01-01
The aim of this study was to examine the relationship between working memory capacity and mathematical performance measured by the national curriculum assessment in third-grade children (n = 40). The national tests concerned six subareas within mathematics. One-way ANOVA, two-tailed Pearson correlation and multiple regression analyses were…
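A minimal sketch of the correlation and regression analyses named in the abstract above, using made-up scores; the variable names, values, and the second predictor are assumptions, not the study's data.

# Correlating a working memory score with a mathematics subarea score and
# fitting a simple least-squares multiple regression on invented data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
working_memory = rng.normal(100, 15, size=40)          # n = 40 children
subarea_score = 0.4 * working_memory + rng.normal(0, 10, size=40)

r, p = stats.pearsonr(working_memory, subarea_score)   # two-tailed by default
print(f"Pearson r = {r:.2f}, p = {p:.3f}")

# Multiple regression with two predictors (the age predictor is hypothetical).
age_months = rng.normal(110, 4, size=40)
X = np.column_stack([np.ones(40), working_memory, age_months])
coef, *_ = np.linalg.lstsq(X, subarea_score, rcond=None)
print("intercept, WM slope, age slope:", np.round(coef, 3))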
ERIC Educational Resources Information Center
Pierangelo, Roger; Crane, Rochelle
This book is intended to provide a comprehensive guide to the transition of students from special education programs into adulthood. The 13 chapters address the following specific issues, with sample sub-topics indicated in parentheses: (1) fundamentals of transition services (self-determination, importance of keeping records); (2) transitional…
ERIC Educational Resources Information Center
Richardson, Larry S.
Circular reasoning is often employed in comparative advantage debate cases when only a plan and advantages are articulated without adequate reference to the resolution which inspired the proposal. The advancing of such subtopical analyses as debate cases is deleterious to the long-range interests of educational debate because the practice…
ERIC Educational Resources Information Center
National Association for Retarded Citizens, Arlington, TX. Research and Demonstration Inst.
Guidelines are presented which were developed to aid federal, state, and local agencies in preparing regulations concerning the use of mentally retarded subjects in biomedical and pharmacological research projects. Guidelines are set forth for the following topic areas (sample subtopics in parentheses): the formation of a Professional Review Committee…
Science Syllabus for Middle and Junior High Schools. Block D, The Earth's Changing Surface.
ERIC Educational Resources Information Center
New York State Education Dept., Albany. Bureau of General Education Curriculum Development.
This syllabus begins with a list of program objectives and performance criteria for the study of three general topic areas in earth science and a list of 22 science processes. Following this information is a listing of concepts and understandings for subtopics within the general topic areas: (1) the earth's surface--surface features, rock…
The Fourth Amendment in the Public Schools: Issues for the 1990's and Beyond. Presentation Outline.
ERIC Educational Resources Information Center
Schreck, Myron
In 1985, the United States Supreme Court, in "New Jersey v. T.L.O.," held that the Fourth Amendment applies to searches and seizures conducted by public school administrators. This paper discusses the current state of Fourth Amendment law with regard to public school searches and seizures. Among the subtopics discussed are the following:…
The Beautiful Brain: A Unit for Grades 5-9 with Further Explorations for Gifted and Talented.
ERIC Educational Resources Information Center
Struve, Nancy
The unit provides information on the study of the human brain for students in grades 5-9 with suggestions for extending the lessons for gifted and talented students. Learning activities are offered for ten lessons (sample subtopics in parentheses): introduction to the unit (student pretest and posttest); brain growth; medulla-oblongata-reptilian…
ERIC Educational Resources Information Center
Perikos, Isidoros; Grivokostopoulou, Foteini; Hatzilygeroudis, Ioannis
2017-01-01
Logic as a knowledge representation and reasoning language is a fundamental topic of an Artificial Intelligence (AI) course and includes a number of sub-topics. One of them, which brings difficulties to students to deal with, is converting natural language (NL) sentences into first-order logic (FOL) formulas. To assist students to overcome those…
Kamau, Paul; Aloo-Obudho, Penina; Kabiru, Ephantus; Ombacho, Kepha; Langat, Bernard; Mucheru, Obadiah; Ireri, Laban
2012-03-01
Most intestinal parasites are cosmopolitan, with the highest prevalence in the tropics and subtropics. Rural-to-urban migration rapidly increases the number of food eating places in towns and their environs. Some of these eating establishments have poor sanitation and are overcrowded, facilitating disease transmission, especially through food-handling. Our investigations in Nairobi were therefore designed to determine the presence of intestinal parasites in food-handlers with valid medical certificates. Direct and concentrated stool processing techniques were used. Chi-square test and ANOVA were used for data analysis. The parasites Ascaris lumbricoides, Entamoeba histolytica and Giardia lamblia were observed in certified food-handlers. Significant difference was found in parasite frequency by eating classes and gender (χ² = 9.49, P = 0.73), (F = 1.495, P = 0.297), but not in parasite occurrence between age brackets (χ² = 6.99, P = 0.039). The six-month medical certificate validity period may contribute significantly to the presence of intestinal parasites in certified food-handlers.
Zinc hazards to plants and animals with emphasis on fishery and wildlife resources
Eisler, R.; Cheremisinoff, Paul N.
1997-01-01
Ecological and toxicological aspects of zinc in the environment are reviewed with emphasis on natural resources. Subtopics include sources and uses; chemical and biochemical properties; carcinogenicity, mutagenicity, teratogenicity; background concentrations in biological and nonbiological compartments; effects of zinc deficiency; toxic and sublethal effects on terrestrial plants and invertebrates, aquatic organisms, birds, and mammals; and recommendations for the protection of sensitive resources.
Scanning Radar Investigations to Characterize Cloud and Precipitation Processes for ASR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Venkatachalam, Chandrasekar
2016-12-17
The project conducted investigations in the following areas related to scanning radar retrievals: a) development of cloud-drizzle separation studies for the ENA site based on Doppler spectra; b) advanced radar retrievals for the SGP site; c) characterization of falling snow using multifrequency dual-polarization measurements; and d) the BAECC field experiment. More details about these investigations can be found under each subtopic within the report.
Molybdenum Hazards to Fish, Wildlife, and Invertebrates: A Synoptic Review
Eisler, R.
1989-01-01
Ecological and toxicological aspects of molybdenum (Mo) in the environment are briefly reviewed, with emphasis on fish and wildlife. Subtopics include sources and uses, chemical properties, mode of action, background concentrations in biological and nonbiological samples, and lethal and sublethal effects on terrestrial plants and invertebrates, aquatic organisms, birds, and mammals. Current recommendations for Mo and the protection of sensitive living resources are presented.
ERIC Educational Resources Information Center
Vavougios, Dionisios; Verevi, Alkistis; Papalexopoulos, Panagiotis F.; Verevi, Crystallia-Ioanna; Panagopoulou, Athanasia
2016-01-01
This article reviews 24 years of research focused on science education for students with learning and other disabilities. Our results are based on 53 articles from 2 relevant databases. We hereby present and discuss the results of the most popular topics investigated, which include: constructivism, exploratory learning, hands-on activities,…
A core curriculum for clinical fellowship training in pathology informatics
McClintock, David S.; Levy, Bruce P.; Lane, William J.; Lee, Roy E.; Baron, Jason M.; Klepeis, Veronica E.; Onozato, Maristela L.; Kim, JiYeon; Dighe, Anand S.; Beckwith, Bruce A.; Kuo, Frank; Black-Schaffer, Stephen; Gilbertson, John R.
2012-01-01
Background: In 2007, our healthcare system established a clinical fellowship program in Pathology Informatics. In 2010 a core didactic course was implemented to supplement the fellowship research and operational rotations. In 2011, the course was enhanced by a formal, structured core curriculum and reading list. We present and discuss our rationale and development process for the Core Curriculum and the role it plays in our Pathology Informatics Fellowship Training Program. Materials and Methods: The Core Curriculum for Pathology Informatics was developed, and is maintained, through the combined efforts of our Pathology Informatics Fellows and Faculty. The curriculum was created with a three-tiered structure, consisting of divisions, topics, and subtopics. Primary (required) and suggested readings were selected for each subtopic in the curriculum and incorporated into a curated reading list, which is reviewed and maintained on a regular basis. Results: Our Core Curriculum is composed of four major divisions, 22 topics, and 92 subtopics that cover the wide breadth of Pathology Informatics. The four major divisions include: (1) Information Fundamentals, (2) Information Systems, (3) Workflow and Process, and (4) Governance and Management. A detailed, comprehensive reading list for the curriculum is presented in the Appendix to the manuscript and contains 570 total readings (current as of March 2012). Discussion: The adoption of a formal, core curriculum in a Pathology Informatics fellowship has significant impacts on both fellowship training and the general field of Pathology Informatics itself. For a fellowship, a core curriculum defines a basic, common scope of knowledge that the fellowship expects all of its graduates will know, while at the same time enhancing and broadening the traditional fellowship experience of research and operational rotations. For the field of Pathology Informatics itself, a core curriculum defines to the outside world, including departments, companies, and health systems considering hiring a pathology informatician, the core knowledge set expected of a person trained in the field and, more fundamentally, it helps to define the scope of the field within Pathology and healthcare in general. PMID:23024890
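The three-tiered structure described above (divisions containing topics containing subtopics) maps naturally onto nested dictionaries. The sketch below uses the four division names given in the abstract but fills in placeholder topics and subtopics that are not from the published curriculum.

# A minimal sketch of a divisions -> topics -> subtopics curriculum structure,
# with a helper that tallies subtopics per division. Entries are placeholders.
curriculum = {
    "Information Fundamentals": {
        "Data representation": ["Character encodings", "Medical imaging formats"],
    },
    "Information Systems": {
        "Laboratory information systems": ["Interfaces", "Result reporting"],
    },
    "Workflow and Process": {
        "Specimen tracking": ["Barcoding"],
    },
    "Governance and Management": {
        "Project management": ["Budgeting"],
    },
}

def count_subtopics(curr):
    """Return the number of subtopics under each division."""
    return {division: sum(len(subs) for subs in topics.values())
            for division, topics in curr.items()}

print(count_subtopics(curriculum))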
ERIC Educational Resources Information Center
Iowa Univ., Iowa City. Audiovisual Center.
This summary report of the annual conference includes statements of participant concerns prepared for the planning committee, the keynote address on the impact of mass media on education by Nicholas Johnson with responses by Minaruth Galey and Jon Dunn, and reports from the groups which studied the seven subtopics identified in general sessions:…
Dental cements for definitive luting: a review and practical clinical considerations.
Hill, Edward E
2007-07-01
Dental cement used to attach an indirect restoration to a prepared tooth is called a luting agent. A clinically relevant discussion of conventional and contemporary definitive luting agents is presented in this article. Physical properties are listed in table form to assist in comparison and decision-making. Additional subtopics include luting agent requirements, classifications, retention and bonding, cement considerations for implant-supported teeth, and fatigue failure.
ERIC Educational Resources Information Center
Rehabilitation Services Administration (DHEW), Washington, DC.
The annual report discusses the FY 1979 administration of the Rehabilitation Act of 1973. Covered are five aspects (sample subtopics in parentheses): program operations (basic vocational rehabilitation program, services to the blind and visually handicapped, rehabilitation for American Indians); program development activities (special projects for…
ERIC Educational Resources Information Center
Nesbitt, John A., Ed.
Fifty-seven papers on new models of community recreation for the handicapped comprise the third report in the series (EC 114 401-409). Papers deal with the following topics (sample subtopics in parentheses): administration (management by objectives); advocacy; areas and equipment (outdoor playground equipment); attitudes; barriers (an analysis of…
Small Business Innovation Research. Program solicitation. Closing date: July 22, 1988
NASA Technical Reports Server (NTRS)
1988-01-01
The sixth annual Small Business Innovation Research (SBIR) solicitation by NASA describes the program, identifies eligibility requirements, outlines proposal preparation and submission requirements, describes the proposal evaluation and award selection process, and provides other information to assist those interested in participating in the SBIR program. It also identifies, in Section 8.0 and Appendix D, the specific technical topics and subtopics in which SBIR Phase 1 proposals are solicited in 1988.
Technology transfer within the government
NASA Technical Reports Server (NTRS)
Russell, John
1992-01-01
The report of a workshop panel concerned with technology transfer within the government is presented. The presentation is made in vugraph form. The assigned subtopics for this panel are as follows: (1) transfer from non-NASA US government technology developers to NASA space missions/programs; and (2) transfer from NASA to other US government space mission programs. A specific area of inquiry was Technology Maturation Milestones. Three areas were investigated: technology development; advanced development; and flight hardware development.
ERIC Educational Resources Information Center
Hansen, John R.
The intent of this investigation was to design a resource unit to be used by junior high school science teachers to teach the concept of the kinetic theory of gases. The document was prepared to aid teachers with minimal preparation in physics. The research design consisted of three main subproblems: (1) the identification of the subtopics of the…
Solar Energy Education. Renewable energy: a background text. [Includes glossary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1985-01-01
Some of the most common forms of renewable energy are presented in this textbook for students. The topics include solar energy, wind power, hydroelectric power, biomass, ocean thermal energy, and tidal and geothermal energy. The main emphasis of the text is on the sun and the solar energy that it yields. Discussions of the sun's composition and the relationship between the earth, sun and atmosphere are provided. Insolation, active and passive solar systems, and solar collectors are the subtopics included under solar energy. (BCS)
Pentachlorophenol Hazards to Fish, Wildlife, and Invertebrates: A Synoptic Review
Eisler, R.
1989-01-01
Pentachlorophenol (PCP) is now widely used as a wood preservative, and this has contributed to the detection of PCP residues in air, rain, groundwaters, surface waters, fish and aquatic invertebrates, and in human urine, blood, and milk of nursing mothers. This report briefly reviews the technical literature on ecological and toxicological aspects of PCP in the environment, with emphasis on fishery and wildlife resources. Subtopics include sources and uses, chemical properties, fate, background concentrations, lethal and sublethal effects, and current recommendations for resource protection.
Coordinating Council. Sixth Meeting: Who Are Our Key Users?
NASA Technical Reports Server (NTRS)
1991-01-01
This NASA Scientific and Technical Information Program Coordinating Council meeting deals with the topic 'Who are our key users?' Presentations were made on the following subtopics: Key users: Who uses the system the most, Who orders the most documents, Users: What do we know about them?, NASA/DOD Aerospace Knowledge Diffusion research project on 'Potential key users', How we meet the user's needs, and STI Council user requirements update. Summaries of discussions after the presentations are included along with visuals for the presentations.
Brazilian recommendations of mechanical ventilation 2013. Part 2
2014-01-01
Perspectives on invasive and noninvasive ventilatory support for critically ill patients are evolving, as much evidence indicates that ventilation may have positive effects on patient survival and the quality of the care provided in intensive care units in Brazil. For those reasons, the Brazilian Association of Intensive Care Medicine (Associação de Medicina Intensiva Brasileira - AMIB) and the Brazilian Thoracic Society (Sociedade Brasileira de Pneumologia e Tisiologia - SBPT), represented by the Mechanical Ventilation Committee and the Commission of Intensive Therapy, respectively, decided to review the literature and draft recommendations for mechanical ventilation with the goal of creating a document for bedside guidance as to the best practices on mechanical ventilation available to their members. The document was based on the available evidence regarding 29 subtopics selected as the most relevant for the subject of interest. The project was developed in several stages, during which the selected topics were distributed among experts recommended by both societies with recent publications on the subject of interest and/or significant teaching and research activity in the field of mechanical ventilation in Brazil. The experts were divided into pairs that were charged with performing a thorough review of the international literature on each topic. All the experts met at the Forum on Mechanical Ventilation, which was held at the headquarters of AMIB in São Paulo on August 3 and 4, 2013, to collaboratively draft the final text corresponding to each sub-topic, which was presented to, appraised, discussed and approved in a plenary session that included all 58 participants and aimed to create the final document. PMID:25410835
Brazilian recommendations of mechanical ventilation 2013. Part I
Barbas, Carmen Sílvia Valente; Ísola, Alexandre Marini; Farias, Augusto Manoel de Carvalho; Cavalcanti, Alexandre Biasi; Gama, Ana Maria Casati; Duarte, Antonio Carlos Magalhães; Vianna, Arthur; Serpa, Ary; Bravim, Bruno de Arruda; Pinheiro, Bruno do Valle; Mazza, Bruno Franco; de Carvalho, Carlos Roberto Ribeiro; Toufen, Carlos; David, Cid Marcos Nascimento; Taniguchi, Corine; Mazza, Débora Dutra da Silveira; Dragosavac, Desanka; Toledo, Diogo Oliveira; Costa, Eduardo Leite; Caser, Eliana Bernardete; Silva, Eliezer; Amorim, Fabio Ferreira; Saddy, Felipe; Galas, Filomena Regina Barbosa Gomes; Silva, Gisele Sampaio; de Matos, Gustavo Faissol Janot; Emmerich, João Claudio; Valiatti, Jorge Luis dos Santos; Teles, José Mario Meira; Victorino, Josué Almeida; Ferreira, Juliana Carvalho; Prodomo, Luciana Passuello do Vale; Hajjar, Ludhmila Abrahão; Martins, Luiz Cláudio; Malbouisson, Luiz Marcelo Sá; Vargas, Mara Ambrosina de Oliveira; Reis, Marco Antonio Soares; Amato, Marcelo Brito Passos; Holanda, Marcelo Alcântara; Park, Marcelo; Jacomelli, Marcia; Tavares, Marcos; Damasceno, Marta Cristina Paulette; Assunção, Murillo Santucci César; Damasceno, Moyzes Pinto Coelho Duarte; Youssef, Nazah Cherif Mohamad; Teixeira, Paulo José Zimmermann; Caruso, Pedro; Duarte, Péricles Almeida Delfino; Messeder, Octavio; Eid, Raquel Caserta; Rodrigues, Ricardo Goulart; de Jesus, Rodrigo Francisco; Kairalla, Ronaldo Adib; Justino, Sandra; Nemer, Sérgio Nogueira; Romero, Simone Barbosa; Amado, Verônica Moreira
2014-01-01
Perspectives on invasive and noninvasive ventilatory support for critically ill patients are evolving, as much evidence indicates that ventilation may have positive effects on patient survival and the quality of the care provided in intensive care units in Brazil. For those reasons, the Brazilian Association of Intensive Care Medicine (Associação de Medicina Intensiva Brasileira - AMIB) and the Brazilian Thoracic Society (Sociedade Brasileira de Pneumologia e Tisiologia - SBPT), represented by the Mechanical Ventilation Committee and the Commission of Intensive Therapy, respectively, decided to review the literature and draft recommendations for mechanical ventilation with the goal of creating a document for bedside guidance as to the best practices on mechanical ventilation available to their members. The document was based on the available evidence regarding 29 subtopics selected as the most relevant for the subject of interest. The project was developed in several stages, during which the selected topics were distributed among experts recommended by both societies with recent publications on the subject of interest and/or significant teaching and research activity in the field of mechanical ventilation in Brazil. The experts were divided into pairs that were charged with performing a thorough review of the international literature on each topic. All the experts met at the Forum on Mechanical Ventilation, which was held at the headquarters of AMIB in São Paulo on August 3 and 4, 2013, to collaboratively draft the final text corresponding to each sub-topic, which was presented to, appraised, discussed and approved in a plenary session that included all 58 participants and aimed to create the final document. PMID:25028944
Brazilian recommendations of mechanical ventilation 2013. Part 2
Barbas, Carmen Sílvia Valente; Ísola, Alexandre Marini; Farias, Augusto Manoel de Carvalho; Cavalcanti, Alexandre Biasi; Gama, Ana Maria Casati; Duarte, Antonio Carlos Magalhães; Vianna, Arthur; Serpa Neto, Ary; Bravim, Bruno de Arruda; Pinheiro, Bruno do Valle; Mazza, Bruno Franco; de Carvalho, Carlos Roberto Ribeiro; Toufen Júnior, Carlos; David, Cid Marcos Nascimento; Taniguchi, Corine; Mazza, Débora Dutra da Silveira; Dragosavac, Desanka; Toledo, Diogo Oliveira; Costa, Eduardo Leite; Caser, Eliana Bernadete; Silva, Eliezer; Amorim, Fabio Ferreira; Saddy, Felipe; Galas, Filomena Regina Barbosa Gomes; Silva, Gisele Sampaio; de Matos, Gustavo Faissol Janot; Emmerich, João Claudio; Valiatti, Jorge Luis dos Santos; Teles, José Mario Meira; Victorino, Josué Almeida; Ferreira, Juliana Carvalho; Prodomo, Luciana Passuello do Vale; Hajjar, Ludhmila Abrahão; Martins, Luiz Claudio; Malbouisson, Luis Marcelo Sá; Vargas, Mara Ambrosina de Oliveira; Reis, Marco Antonio Soares; Amato, Marcelo Brito Passos; Holanda, Marcelo Alcântara; Park, Marcelo; Jacomelli, Marcia; Tavares, Marcos; Damasceno, Marta Cristina Paulette; Assunção, Murillo Santucci César; Damasceno, Moyzes Pinto Coelho Duarte; Youssef, Nazah Cherif Mohamed; Teixeira, Paulo José Zimmermann; Caruso, Pedro; Duarte, Péricles Almeida Delfino; Messeder, Octavio; Eid, Raquel Caserta; Rodrigues, Ricardo Goulart; de Jesus, Rodrigo Francisco; Kairalla, Ronaldo Adib; Justino, Sandra; Nemer, Sergio Nogueira; Romero, Simone Barbosa; Amado, Verônica Moreira
2014-01-01
Perspectives on invasive and noninvasive ventilatory support for critically ill patients are evolving, as much evidence indicates that ventilation may have positive effects on patient survival and the quality of the care provided in intensive care units in Brazil. For those reasons, the Brazilian Association of Intensive Care Medicine (Associação de Medicina Intensiva Brasileira - AMIB) and the Brazilian Thoracic Society (Sociedade Brasileira de Pneumologia e Tisiologia - SBPT), represented by the Mechanical Ventilation Committee and the Commission of Intensive Therapy, respectively, decided to review the literature and draft recommendations for mechanical ventilation with the goal of creating a document for bedside guidance as to the best practices on mechanical ventilation available to their members. The document was based on the available evidence regarding 29 subtopics selected as the most relevant for the subject of interest. The project was developed in several stages, during which the selected topics were distributed among experts recommended by both societies with recent publications on the subject of interest and/or significant teaching and research activity in the field of mechanical ventilation in Brazil. The experts were divided into pairs that were charged with performing a thorough review of the international literature on each topic. All the experts met at the Forum on Mechanical Ventilation, which was held at the headquarters of AMIB in São Paulo on August 3 and 4, 2013, to collaboratively draft the final text corresponding to each sub-topic, which was presented to, appraised, discussed and approved in a plenary session that included all 58 participants and aimed to create the final document. PMID:25295817
Brazilian recommendations of mechanical ventilation 2013. Part I
2014-01-01
Perspectives on invasive and noninvasive ventilatory support for critically ill patients are evolving, as much evidence indicates that ventilation may have positive effects on patient survival and the quality of the care provided in intensive care units in Brazil. For those reasons, the Brazilian Association of Intensive Care Medicine (Associação de Medicina Intensiva Brasileira - AMIB) and the Brazilian Thoracic Society (Sociedade Brasileira de Pneumologia e Tisiologia - SBPT), represented by the Mechanical Ventilation Committee and the Commission of Intensive Therapy, respectively, decided to review the literature and draft recommendations for mechanical ventilation with the goal of creating a document for bedside guidance as to the best practices on mechanical ventilation available to their members. The document was based on the available evidence regarding 29 subtopics selected as the most relevant for the subject of interest. The project was developed in several stages, during which the selected topics were distributed among experts recommended by both societies with recent publications on the subject of interest and/or significant teaching and research activity in the field of mechanical ventilation in Brazil. The experts were divided into pairs that were charged with performing a thorough review of the international literature on each topic. All the experts met at the Forum on Mechanical Ventilation, which was held at the headquarters of AMIB in São Paulo on August 3 and 4, 2013, to collaboratively draft the final text corresponding to each sub-topic, which was presented to, appraised, discussed and approved in a plenary session that included all 58 participants and aimed to create the final document. PMID:25210957
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baldock, Nick; Sevilla, Fernando; Redfern, Robin
The United States Department of Energy (DOE) awarded a grant to GL Garrad Hassan (GL GH) to investigate the logistics, opportunities, and costs associated with existing and emerging installation and operation and maintenance (O&M) activities at offshore wind projects as part of the DOE’s program to reduce barriers facing offshore wind project development in the United States (U.S.). This report (the Report) forms part of Subtopic 5.3 “Optimized Installation, Operation and Maintenance Strategies Study” which in turn is part of the “Removing Market Barriers in U.S. Offshore Wind” set of projects for the DOE. The purpose of Subtopic 5.3 is to aid and facilitate informed decision-making regarding installation and O&M during the development, installation, and operation of offshore wind projects in order to increase efficiency and reduce the levelized cost of energy (LCoE). Given the large area of U.S. territorial waters, the generally higher mean wind speeds offshore, and the proximity to the coast of many large U.S. cities, offshore wind power has the potential to become a significant contributor of energy to U.S. markets. However, for the U.S. to ensure that the development of offshore wind energy projects is carried out in an efficient and cost-effective manner, it is important to be cognizant of the current and emerging practices in both the domestic and international offshore wind energy industries. The U.S. can harness the experience gained globally and combine this with the skills and assets of an already sizeable onshore wind industry, as well as the resources of a mature offshore oil and gas industry, to develop a strong offshore wind sector. The work detailed in this report is aimed at assisting with that learning curve, particularly in terms of offshore specific installation and O&M activities. This Report and the Installation and O&M LCoE Analysis Tool, which were developed together by GL GH as part of this study, allow readers to identify, model and probe the economic merits and sensitivities of various approaches to construction and O&M practices, using illustrative offshore projects across a wide range of alternative offshore development areas located in U.S. waters. The intention is to assist decision-makers in clearly understanding the relative economic benefits of both conventional and novel construction installation methodologies and maintenance techniques within the critical parameters of a Project’s LCoE.
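For orientation, a levelized cost of energy is conventionally computed as discounted lifetime costs divided by discounted lifetime energy. The Python sketch below is a generic illustration of that calculation, not the report's Installation and O&M LCoE Analysis Tool; all figures are invented.

# Generic LCoE sketch: discounted costs over discounted energy, assuming
# constant annual O&M cost and annual energy yield. Numbers are hypothetical.
def lcoe(capex, annual_opex, annual_energy_mwh, lifetime_years, discount_rate):
    """Return LCoE in $/MWh."""
    disc_costs = capex + sum(annual_opex / (1 + discount_rate) ** t
                             for t in range(1, lifetime_years + 1))
    disc_energy = sum(annual_energy_mwh / (1 + discount_rate) ** t
                      for t in range(1, lifetime_years + 1))
    return disc_costs / disc_energy

# Example: 500 MW project, $2.5B capex, $100M/yr O&M, 40% capacity factor, 25 years, 8% discount rate.
energy = 500 * 8760 * 0.40
print(f"LCoE = {lcoe(2.5e9, 100e6, energy, 25, 0.08):.0f} $/MWh")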
NASA Astrophysics Data System (ADS)
Chan, Chi Keung
The aim of this study was to examine the contribution of students' meta-conceptual awareness and modelling skills to their conceptual change when learning atomic-molecular theory. Instructional materials used in the intervention covered three sub-topics: atomic structure, chemical bonding, and structures and properties. Glynn's (1991) Teaching with Analogy model and Chambliss's (2002) guidelines for constructing scientific texts were used as the frameworks for designing and implementing instructional materials for the intervention. Forty-five Secondary 4 chemistry students from two classes at a secondary school in Hong Kong participated in the study. The two classes were taught by the same teacher. The study consisted of two phases. During Phase I, which lasted for 6 weeks, Class A (n = 13) used the above-mentioned instructional materials to learn the three sub-topics, whereas Class B (n = 32) learned the same sub-topics using traditional textbook materials. To further examine the effects of the intervention, a 2-week switching-replication treatment was implemented in Phase II. Class A used traditional textbook materials for revision whereas Class B used the tailor-made instructional materials. A mixed-methods design was used to assess the effectiveness of the intervention. Based on the student misconceptions documented in the literature, a written test of the three sub-topics was developed. The test comprised 33 two-tier multiple-choice items. The test was administered three times: before Phase I (T1), just after Phase I and before Phase II (T2), and 2 weeks after Phase II (T3). Qualitative data were gathered from semi-structured interviews with five students. Three students from Class A and two students from Class B were interviewed individually after Phase I and Phase II, respectively, to assess students' understanding of the essential theoretical concepts and to assess students' modelling skills. The results of paired-samples t-test showed that there was a significant difference between scores at T1 and T2 in both classes, but the difference between scores at T2 and T3 in both classes was insignificant. The results of independent-samples t-test showed that there were no significant differences in scores between Classes A and B at T1, T2 and T3. The results indicate that explicit presentation of misconceptions and scientific concepts, as suggested by Chambliss (2002), was found to be roughly as effective as traditional textbook instruction in terms of students' meta-conceptual awareness. The semi-structured interviews revealed that the modelling skills of the three students from Class A and the two students from Class B had improved after receiving the interventional treatment in Phase I and the switching-replication treatment in Phase II, respectively. They had achieved modelling skills with the general characteristics of level-2 modelling according to Grosslight et al.'s (1991) framework of epistemological views of models and their use in science. They recognised that each model had its own presentation purposes and its own strengths and limitations. They understood that models are not physical copies of reality. In addition, they were aware that a chemical bond is a force rather than a material. They also knew that ionic bonds are present throughout the whole lattice of sodium chloride. These skills enabled the students to avoid forming or retaining some important misconceptions about atomic-molecular theory. 
They were able to distinguish between microscopic and macroscopic properties. However, some students retained their original misconceptions such as misinterpretation of electron shells as fixed orbits. Possible reasons to account for the results are suggested. The significance and implications of the findings for chemistry education in secondary school are discussed.
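The within-class (T1 vs. T2) and between-class comparisons reported in this abstract correspond to paired-samples and independent-samples t-tests. A minimal Python sketch with randomly generated scores (not the study's data) follows.

# Paired-samples t-test for change within a class across administrations, and
# independent-samples t-test between classes at one administration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
class_a_t1 = rng.normal(15, 4, size=13)             # n = 13, 33-item test
class_a_t2 = class_a_t1 + rng.normal(3, 2, size=13)
class_b_t2 = rng.normal(17, 4, size=32)             # n = 32

t_paired, p_paired = stats.ttest_rel(class_a_t1, class_a_t2)   # within class
t_ind, p_ind = stats.ttest_ind(class_a_t2, class_b_t2)         # between classes

print(f"paired: t = {t_paired:.2f}, p = {p_paired:.3f}")
print(f"independent: t = {t_ind:.2f}, p = {p_ind:.3f}")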
Research priorities in the field of HIV and AIDS in Iran
Haghdoost, AliAkbar; Sadeghi, Masoomeh; Nasirian, Maryam; Mirzazadeh, Ali; Navadeh, Soodabeh
2012-01-01
Background: HIV is a multidimensional problem. Therefore, prioritization of research topics in this field is a serious challenge. We decided to prioritize the major areas of research on HIV/AIDS in Iran. Materials and Methods: In a brain-storming session with the main national and provincial stakeholders and experts from different relevant fields, the direct and indirect dimensions of HIV/AIDS and its related research issues were explored. Afterward, using the Delphi method, we sent questionnaires to 20 experts (13 respondents) from different sectors. In this electronic-based questionnaire, we asked experts to evaluate main topics and their subtopics. The range of scores was between 0 and 100. Results: The priority scores of the main themes were preventive activities (43.2), large-scale planning (25.4), the estimation of the HIV/AIDS burden (20.9), and basic scientific research (10.5). The most important priority in each main theme was education, particularly in high-risk groups (52.5), developing the national strategy to address the epidemic (31.8), estimation of the incidence and prevalence among high-risk groups (59.5), and developing new preventive methods (66.7), respectively. Conclusions: The most important priorities for research on HIV/AIDS were preventive activities and developing a national strategy. As high-risk groups are the most involved people in the epidemic, and they are also the most hard-to-reach sub-populations, a well-designed, comprehensive national strategy is essential. However, we believe that, with a very specific and directed scheme, special attention to research in basic sciences is necessary, at least in a limited number of institutes. PMID:23626616
Webster, Joseph B
2009-03-01
To determine the performance and change over time when incorporating questions in the core competency domains of practice-based learning and improvement (PBLI), systems-based practice (SBP), and professionalism (PROF) into the national PM&R Self-Assessment Examination for Residents (SAER). Prospective, longitudinal analysis. The national Self-Assessment Examination for Residents (SAER) in Physical Medicine and Rehabilitation, which is administered annually. Approximately 1100 PM&R residents who take the examination annually. Inclusion of progressively more challenging questions in the core competency domains of PBLI, SBP, and PROF. Individual test item level of difficulty (P value) and discrimination (point-biserial index). Compared with the overall test, questions in the subtopic areas of PBLI, SBP, and PROF were relatively easier and less discriminating (correlation of resident performance on these domains compared with that on the total test). These differences became smaller during the 3-year time period. The difficulty level of the questions in each of the subtopic domains was raised during the 3-year period to a level close to that of the overall exam. Discrimination of the test items improved or remained stable. This study demonstrates that, with careful item writing and review, multiple-choice items in the PBLI, SBP, and PROF domains can be successfully incorporated into an annual, national self-assessment examination for residents. The addition of these questions had value in assessing competency while not compromising the overall validity and reliability of the exam. It is yet to be determined whether resident performance on these questions corresponds to performance on other measures of competency in the areas of PBLI, SBP, and PROF.
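The two item statistics used as outcome measures above have standard definitions: difficulty is the proportion of examinees answering an item correctly (the P value), and discrimination is commonly indexed by the point-biserial correlation between the item score and the total test score. A Python sketch with a made-up response matrix follows.

# Item difficulty (proportion correct) and point-biserial discrimination for
# each item of a small, invented 0/1 response matrix.
import numpy as np
from scipy import stats

# Rows are examinees, columns are items; 1 = correct, 0 = incorrect.
responses = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
])
total_scores = responses.sum(axis=1)

for item in range(responses.shape[1]):
    p_value = responses[:, item].mean()                        # difficulty
    r_pb, _ = stats.pointbiserialr(responses[:, item], total_scores)
    print(f"item {item + 1}: P = {p_value:.2f}, point-biserial = {r_pb:.2f}")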
Molecular Fluorescence, Phosphorescence, and Chemiluminescence Spectrometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Powe, Aleeta; Das, Susmita; Lowry, Mark
This review covers the 2 year period since our last review (1) from January 2008 through December 2009. A computer search of Chemical Abstracts provided most of the references for this review. A search for documents written in English containing the terms fluorescence or phosphorescence or chemiluminescence published in 2008-2009 resulted in more than 100 000 hits. An initial screening reduced this number to approximately 23 000 publications that were considered for inclusion in this review. Key word searches of this subset provided subtopics of manageable size. Other citations were found through individual searches by the various authors who wrote a particular section of this review.
Cognitive considerations for helmet-mounted display design
NASA Astrophysics Data System (ADS)
Francis, Gregory; Rash, Clarence E.
2010-04-01
Helmet-mounted displays (HMDs) are designed as a tool to increase performance. To achieve this, there must be an accurate transfer of information from the HMD to the user. Ideally, an HMD would be designed to accommodate the abilities and limitations of users' cognitive processes. It is not enough for the information (whether visual, auditory, or tactual) to be displayed; the information must be perceived, attended, remembered, and organized in a way that guides appropriate decision-making, judgment, and action. Following a general overview, specific subtopics of cognition, including perception, attention, memory, knowledge, decision-making, and problem solving are explored within the context of HMDs.
Copper hazards to fish, wildlife and invertebrates: a synoptic review
Eisler, Ronald
1998-01-01
Selective review and synthesis of the technical literature on copper and copper salts in the environment and their effects primarily on fishes, birds, mammals, terrestrial and aquatic invertebrates, and other natural resources. The subtopics include copper sources and uses; chemical and biochemical properties; concentrations of copper in field collections of abiotic materials and living organisms; effects of copper deficiency; lethal and sublethal effects on terrestrial plants and invertebrates, aquatic organisms, birds and mammals, including effects on survival, growth, reproduction, behavior, metabolism, carcinogenicity, mutagenicity, and teratogenicity; proposed criteria for the protection of human health and sensitive natural resources; and recommendations for additional research.
Nickel Hazards to Fish, Wildlife and Invertebrates: A Synoptic Review
Eisler, R.
1998-01-01
This account is a selective review and synthesis of the technical literature on nickel and nickel salts in the environment and their effects on terrestrial plants and invertebrates, aquatic plants and animals, avian and mammalian wildlife, and other natural resources. The subtopics include nickel sources and uses; physical, chemical, and metabolic properties of nickel; nickel concentrations in field collections of abiotic materials and living organisms; nickel deficiency effects; lethal and sublethal effects, including effects on survival, growth, reproduction, metabolism, mutagenicity, teratogenicity, and carcinogenicity; currently proposed nickel criteria for the protection of human health and sensitive natural resources; and recommendations for additional research.
On the Topical Structure of Medical Charts
Archbold, Armar A.; Evans, David A.
1989-01-01
In a study of 55 H&P sections of hospital charts, we tested the hypothesis that topic-sub-topic sequencing is sufficiently regular to provide ‘missing’ information in the construction of explicit propositions from elliptical text. ‘Propositions’ were taken to be frames with the slots topic, sub-topic, method, site, attribute, value, and qualifier. Topic was identifiable in 96% of all cases; attribute-value pairs were uniquely recoverable from topics in 69% of all cases; site was co-determined by topic, method, and attribute. Our results suggest that uncertainties in the automated processing of H&P statements can be overcome by appealing to knowledge about the topical structure of medical charts.
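A frame with the seven slots listed above can be represented directly as a small record type. The Python sketch below, including the example filling, is illustrative only and does not reproduce the study's frames.

# A minimal proposition frame with the slots named in the abstract; any slot
# not recoverable from the chart text is left unfilled (None).
from dataclasses import dataclass
from typing import Optional

@dataclass
class Proposition:
    topic: Optional[str] = None
    sub_topic: Optional[str] = None
    method: Optional[str] = None
    site: Optional[str] = None
    attribute: Optional[str] = None
    value: Optional[str] = None
    qualifier: Optional[str] = None

# Elliptical chart text such as "Lungs: clear to auscultation" might be
# expanded into an explicit proposition like this (hypothetical example):
p = Proposition(topic="lungs", method="auscultation",
                attribute="breath sounds", value="clear")
print(p)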
The EXPERT project: part of the Super-FRS Experiment Collaboration
NASA Astrophysics Data System (ADS)
Chudoba, V.; EXPERT project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reynolds, K.M.; Holsten, E.H.; Werner, R.A.
1995-03-01
SBexpert version 1.0 is a knowledge-based decision-support system for management of spruce beetle developed for use in Microsoft Windows. The user's guide provides detailed instructions on the use of all SBexpert features. SBexpert has four main subprograms: introduction, analysis, textbook, and literature. The introduction is the first of the five subtopics in the SBexpert help system. The analysis topic is an advisory system for spruce beetle management that provides recommendations for reducing spruce beetle hazard and risk to spruce stands and is the main analytical topic in SBexpert. The textbook and literature topics provide complementary decision support for analysis.
Postmortem aviation forensic toxicology: an overview.
Chaturvedi, Arvind K
2010-05-01
An overview of aviation combustion toxicology, a subtopic of the field of aerospace toxicology, has been published. In a continuation of that overview, the findings associated with postmortem aviation forensic toxicology are summarized in the present overview. A literature search for the period of 1960-2007 was performed. The important findings related to postmortem toxicology were evaluated. In addition to a brief introduction, this overview is divided into the sections of analytical methods; carboxyhemoglobin and blood cyanide ion; ethanol; drugs; result interpretation; glucose and hemoglobin A1c; and references. Specific details of the subject matter were discussed. It is anticipated that this overview will be an outline source for aviation forensic toxicology within the field of aerospace toxicology.
Student perception of group dynamics predicts individual performance: Comfort and equity matter
Theobald, Elli J.; Eddy, Sarah L.; Grunspan, Daniel Z.; Wiggins, Benjamin L.
2017-01-01
Active learning in college classes and participation in the workforce frequently hinge on small group work. However, group dynamics vary, ranging from equitable collaboration to dysfunctional groups dominated by one individual. To explore how group dynamics impact student learning, we asked students in a large-enrollment university biology class to self-report their experience during in-class group work. Specifically, we asked students whether there was a friend in their group, whether they were comfortable in their group, and whether someone dominated their group. Surveys were administered after students participated in two different types of intentionally constructed group activities: 1) a loosely structured activity wherein students worked together for an entire class period (termed the ‘single-group’ activity), or 2) a highly structured ‘jigsaw’ activity wherein students first independently mastered different subtopics, then formed new groups to peer-teach their respective subtopics. We measured content mastery by the change in score on identical pre-/post-tests. We then investigated whether activity type or student demographics predicted the likelihood of reporting working with a dominator, being comfortable in their group, or working with a friend. We found that students who more strongly agreed that they worked with a dominator were 17.8% less likely to answer an additional question correctly on the 8-question post-test. Similarly, when students were comfortable in their group, content mastery increased by 27.5%. Working with a friend was the single biggest predictor of student comfort, although working with a friend did not impact performance. Finally, we found that students were 67% less likely to agree that someone dominated their group during the jigsaw activities than during the single-group activities. We conclude that group activities that rely on positive interdependence, include turn-taking, and have explicit prompts for students to explain their reasoning, such as our jigsaw, can help reduce the negative impact of inequitable groups. PMID:28727749
A human factors methodology for real-time support applications
NASA Technical Reports Server (NTRS)
Murphy, E. D.; Vanbalen, P. M.; Mitchell, C. M.
1983-01-01
A general approach to the human factors (HF) analysis of new or existing projects at NASA/Goddard is delineated. Because the methodology evolved from HF evaluations of the Mission Planning Terminal (MPT) and the Earth Radiation Budget Satellite Mission Operations Room (ERBS MOR), it is directed specifically to the HF analysis of real-time support applications. Major topics included for discussion are the process of establishing a working relationship between the Human Factors Group (HFG) and the project, orientation of HF analysts to the project, human factors analysis and review, and coordination with major cycles of system development. Sub-topics include specific areas for analysis and appropriate HF tools. Management support functions are outlined. References provide a guide to sources of further information.
Assessment of polytechnic students' understanding of basic algebra
NASA Astrophysics Data System (ADS)
Mokmin, Nur Azlina Mohamed; Masood, Mona
2015-12-01
It is important for engineering students to excel in algebra. Previous studies show that the algebraic fraction is the subtopic of algebra found to be the most challenging for engineering students. This study was done with 191 first-semester engineering students enrolled in engineering programs at a Malaysian polytechnic. The respondents were divided into Group 1 (Distinction) and Group 2 (Credit) based on their Mathematics SPM result. A computer application was developed for this study to collect student information and assess understanding of the algebraic fraction topic. The results were analyzed using SPSS and Microsoft Excel. The test results show that there are significant differences between Group 1 and Group 2 and that most of the students scored below the minimum requirement.
Damage Arresting Composites for Shaped Vehicles
NASA Technical Reports Server (NTRS)
Velicki, Alex
2009-01-01
This report describes the development of a novel structural solution that addresses the demanding fuselage loading requirements for the Hybrid Wing or Blended Wing Body configurations that are described in NASA NRA subtopic A2A.3, "Materials and Structures for Wing Components and Non-Circular Fuselage." The phase I portion of this task includes a comprehensive finite element model-based structural sizing exercise performed using the BWB airplane configuration to generate internal loads and fuselage panel weights for an advanced Pultruded Rod Stitched Efficient Unitized Structure (PRSEUS) structural concept. An accompanying element-level test program is also described which substantiates the analytical results and calculation methods used in the trade study. The phase II plan for the continuation of this research is also included herein.
[Sex differences and anesthesiology: preface and comments].
Nishno, Takashi
2009-01-01
In this special issue, the topic of sex difference in the field of anesthesiology is featured. Eight subtopics are discussed including 1) sex differences in cardiovascular medicine, 2) perioperative cardiovascular management, 3) sex differences in the respiratory functions of the upper airway, 4) sex differences in the anesthetic management, 5) sex differences in pain, 6) sex differences in laboratory medicine, 7) sex differences in pharmacokinetics of anesthetics, and 8) postoperative nausea and vomiting. Although recent clinical and experimental studies have shown the existence of sex and/or gender differences in many fields of medicine, our knowledge of sex differences in anesthesiology is apparently insufficient. I believe that anesthesiologists should pay more attention to this topic to improve our daily practice of anesthesia.
Analysis on the application of background parameters on remote sensing classification
NASA Astrophysics Data System (ADS)
Qiao, Y.
Mapping crop cultivation acreage accurately, monitoring crop growth dynamically, and forecasting yield are important applications of remote sensing to agriculture. During the 8th Five-Year Plan period, yield estimation using remote sensing technology for the main crops in major production regions in China was a subtopic of the national research task titled "Study on Application of Remote Sensing Technology". In the 21st century, in a movement launched by the Chinese Ministry of Agriculture to bring high technology into farming production, remote sensing has been applied extensively to crop growth monitoring and yield forecasting. In 2001 the Chinese Ministry of Agriculture entrusted the Northern China Center of Agricultural Remote Sensing with forecasting the yield of main crops such as wheat, maize and rice on short notice, to supply information for government decision makers. The present paper is a report on this task. It describes the application of background parameters in image recognition, classification and mapping, with a focus on geoscience theory, ecological features and their cartographic objects or scale; the study of phenology to select the optimal image time for classifying ground objects; the analysis of optimal waveband composition; and the application of a background database to spatial information recognition. Research based on knowledge of background parameters is indispensable for improving the accuracy of image classification and the quality of mapping, and this work won a second-class science and technology achievement award from the Chinese Ministry of Agriculture. Keywords: Spatial image; Classification; Background parameter
Leong, T Y; Kaiser, K; Miksch, S
2007-01-01
Guideline-based clinical decision support is an emerging paradigm to help reduce error, lower cost, and improve quality in evidence-based medicine. The free and open source (FOS) approach is a promising alternative for delivering cost-effective information technology (IT) solutions in health care. In this paper, we survey the current FOS enabling technologies for patient-centric, guideline-based care, and discuss the current trends and future directions of their role in clinical decision support. We searched PubMed, major biomedical informatics websites, and the web in general for papers and links related to FOS health care IT systems. We also relied on our background and knowledge for specific subtopics. We focused on the functionalities of guideline modeling tools, and briefly examined the supporting technologies for terminology, data exchange and electronic health record (EHR) standards. To effectively support patient-centric, guideline-based care, the computerized guidelines and protocols need to be integrated with existing clinical information systems or EHRs. Technologies that enable such integration should be accessible, interoperable, and scalable. A plethora of FOS tools and techniques for supporting different knowledge management and quality assurance tasks involved are available. Many challenges, however, remain in their implementation. There are active and growing trends of deploying FOS enabling technologies for integrating clinical guidelines, protocols, and pathways into the main care processes. The continuing development and maturation of such technologies are likely to make increasingly significant contributions to patient-centric, guideline-based clinical decision support.
Pre-service Elementary Teachers Understanding on Force and Motion
NASA Astrophysics Data System (ADS)
Anggoro, S.; Widodo, A.; Suhandi, A.
2017-09-01
This research investigates pre-service elementary teachers' understanding of the subtopic of Force and Motion. The participants were Elementary Teacher Study Program students at a private university: 71 in their 6th semester and 77 in their 2nd semester. The research instrument consisted of respondents' background information, a belief-of-preconception item, and 8 four-alternative multiple-choice questions on Force and Motion with written explanations. Descriptive statistics such as percentages and bar charts were used to analyze the data. The findings show that many participants hold misconceptions, especially about free-falling objects, objects at rest, buoyant force and gravitation. The research recommends a learning progression in which pre-service teachers are exposed to cognitive-conflict strategies to promote conceptual change in science.
Small business innovation research program solicitation: Closing date July 16, 1990
NASA Technical Reports Server (NTRS)
1990-01-01
This is the eighth annual solicitation by NASA addressed to small business firms, inviting them to submit proposals for research, or research and development, activities in some of the science and engineering areas of interest to NASA. The solicitation describes the Small Business Innovative Research (SBIR) program, identifies eligibility requirements, outlines the required proposal format and content, states proposal preparation and submission requirements, describes the proposal evaluation and award selection process, and provides other information to assist those interested in participating in NASA's SBIR program. It also identifies the technical topics and subtopics for which SBIR proposals are solicited. These cover a broad range of current NASA interests, but do not necessarily include all areas in which NASA plans or currently conducts research. High-risk high pay-off innovations are desired.
Lecuyer, Lou; White, Rehema M; Schmook, Birgit; Calmé, Sophie
2018-08-15
Conservation biology faces critical challenges that require collaborative approaches, including novel strategies to support interactions among actors in biodiversity conflicts. The goals of this study were to investigate the concept of common ground across multiple issues and to explore its practical application for the support of environmental management. We conceptually defined common ground as the areas of relevance underlying the suite of issues expressed by people regarding environmental management in a particular context. We then empirically tested this in the Calakmul region of Mexico, where the complex socio-historical context and high biodiversity have created environmental management challenges that are now being addressed by a local, multi-stakeholder management board. We conducted 26 open interviews with members of the board and a further round of quantitative prioritisation of issues raised. Using a coding process designed to reveal common ground, we categorized the issues at four levels ranging from coarse to fine (themes, topics, sub-topics and perspectives). We then analysed two levels, topics (n = 14 issues) and sub-topics (n = 51 issues). To do so, we built common ground matrices to identify and analyze common ground among actors and across issues. First, cluster and non-metrical data analyses revealed the diversity of actor positions and the lack of consistent grouping among actors by occupational activity. This demonstrated that focusing on actors' differences might be misleading, and that actors' views were not closely aligned with their roles. Second, we located issues according to their levels of common ground and importance among actors. We showed that by not focusing on single issue conflicts, the identification of common ground across multiple issues can pinpoint synergies. We then proposed a framework for collaboration that prioritizes issues of high importance with greater common ground (e.g. sustainable resource use activities), to support the development of trust and norms of reciprocity among actors, strengthening the potential for future cooperation. By adopting this approach, environmental managers could support the initial stages of collaborative conservation strategies, engaging with other actors to seek common ground, avoid the creation of polarised groups and help effectively manage biodiversity conflicts. Copyright © 2018 Elsevier Ltd. All rights reserved.
PIBAS FedSPARQL: a web-based platform for integration and exploration of bioinformatics datasets.
Djokic-Petrovic, Marija; Cvjetkovic, Vladimir; Yang, Jeremy; Zivanovic, Marko; Wild, David J
2017-09-20
There are a huge variety of data sources relevant to chemical, biological and pharmacological research, but these data sources are highly siloed and cannot be queried together in a straightforward way. Semantic technologies offer the ability to create links and mappings across datasets and manage them as a single, linked network so that searching can be carried out across datasets, independently of the source. We have developed an application called PIBAS FedSPARQL that uses semantic technologies to allow researchers to carry out such searching across a vast array of data sources. PIBAS FedSPARQL is a web-based query builder and result set visualizer of bioinformatics data. As an advanced feature, our system can detect similar data items identified by different Uniform Resource Identifiers (URIs), using a text-mining algorithm that processes named entities for use in a vector space model with cosine similarity measures. To our knowledge, PIBAS FedSPARQL is unique among the systems we found in that it allows detection of similar data items. As a query builder, our system allows researchers to intuitively construct and run Federated SPARQL queries across multiple data sources, including global initiatives such as Bio2RDF, Chem2Bio2RDF and EMBL-EBI, one local initiative called CPCTAS, and additional user-specified data sources. From the input topic, subtopic, template and keyword, a corresponding initial Federated SPARQL query is created and executed. Based on the data obtained, end users can choose the most appropriate data sources in their area of interest and exploit their Resource Description Framework (RDF) structure, which allows them to select certain properties of the data to enhance query results. The developed system is flexible and allows intuitive creation and execution of queries for an extensive range of bioinformatics topics. Also, the novel "similar data items detection" algorithm can be particularly useful for suggesting new data sources and for cost optimization of new experiments. PIBAS FedSPARQL can be expanded with new topics, subtopics and templates on demand, rendering information retrieval more robust.
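As a rough illustration of the vector-space/cosine-similarity step described above, the sketch below compares the labels of two data items identified by different URIs; the URIs, labels and similarity threshold are hypothetical and do not reflect PIBAS FedSPARQL internals.

```python
# Illustrative sketch (not the PIBAS FedSPARQL implementation): flag likely-
# equivalent data items by comparing named-entity labels with a simple
# term-frequency vector space model and cosine similarity.
from collections import Counter
from math import sqrt

def tf_vector(text):
    """Bag-of-words term-frequency vector for a label or description."""
    return Counter(text.lower().split())

def cosine(u, v):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(u[t] * v[t] for t in u if t in v)
    norm = sqrt(sum(c * c for c in u.values())) * sqrt(sum(c * c for c in v.values()))
    return dot / norm if norm else 0.0

# Hypothetical URIs and labels drawn from two different data sources
items = {
    "http://bio2rdf.org/drugbank:DB00945": "acetylsalicylic acid aspirin",
    "http://chem2bio2rdf.org/compound/2244": "aspirin 2-acetyloxybenzoic acid",
}
uris = list(items)
vecs = {uri: tf_vector(label) for uri, label in items.items()}
score = cosine(vecs[uris[0]], vecs[uris[1]])
if score > 0.3:  # the threshold is an assumption for this sketch
    print(f"Possible match ({score:.2f}): {uris[0]} ~ {uris[1]}")
```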
An Overview of GIS-Based Modeling and Assessment of Mining-Induced Hazards: Soil, Water, and Forest
Kim, Sung-Min; Yi, Huiuk; Choi, Yosoon
2017-01-01
In this study, current geographic information system (GIS)-based methods and their application for the modeling and assessment of mining-induced hazards were reviewed. Various types of mining-induced hazard, including soil contamination, soil erosion, water pollution, and deforestation were considered in the discussion of the strength and role of GIS as a viable problem-solving tool in relation to mining-induced hazards. The various types of mining-induced hazard were classified into two or three subtopics according to the steps involved in the reclamation procedure, or elements of the hazard of interest. Because GIS is appropriated for the handling of geospatial data in relation to mining-induced hazards, the application and feasibility of exploiting GIS-based modeling and assessment of mining-induced hazards within the mining industry could be expanded further. PMID:29186922
Recent progress in research on tungsten materials for nuclear fusion applications in Europe
NASA Astrophysics Data System (ADS)
Rieth, M.; Dudarev, S. L.; Gonzalez de Vicente, S. M.; Aktaa, J.; Ahlgren, T.; Antusch, S.; Armstrong, D. E. J.; Balden, M.; Baluc, N.; Barthe, M.-F.; Basuki, W. W.; Battabyal, M.; Becquart, C. S.; Blagoeva, D.; Boldyryeva, H.; Brinkmann, J.; Celino, M.; Ciupinski, L.; Correia, J. B.; De Backer, A.; Domain, C.; Gaganidze, E.; García-Rosales, C.; Gibson, J.; Gilbert, M. R.; Giusepponi, S.; Gludovatz, B.; Greuner, H.; Heinola, K.; Höschen, T.; Hoffmann, A.; Holstein, N.; Koch, F.; Krauss, W.; Li, H.; Lindig, S.; Linke, J.; Linsmeier, Ch.; López-Ruiz, P.; Maier, H.; Matejicek, J.; Mishra, T. P.; Muhammed, M.; Muñoz, A.; Muzyk, M.; Nordlund, K.; Nguyen-Manh, D.; Opschoor, J.; Ordás, N.; Palacios, T.; Pintsuk, G.; Pippan, R.; Reiser, J.; Riesch, J.; Roberts, S. G.; Romaner, L.; Rosiński, M.; Sanchez, M.; Schulmeyer, W.; Traxler, H.; Ureña, A.; van der Laan, J. G.; Veleva, L.; Wahlberg, S.; Walter, M.; Weber, T.; Weitkamp, T.; Wurster, S.; Yar, M. A.; You, J. H.; Zivelonghi, A.
2013-01-01
The current magnetic confinement nuclear fusion power reactor concepts going beyond ITER are based on assumptions about the availability of materials with extreme mechanical, heat, and neutron load capacity. In Europe, the development of such structural and armour materials together with the necessary production, machining, and fabrication technologies is pursued within the EFDA long-term fusion materials programme. This paper reviews the progress of work within the programme in the area of tungsten and tungsten alloys. Results, conclusions, and future projections are summarized for each of the programme's main subtopics, which are: (1) fabrication, (2) structural W materials, (3) W armour materials, and (4) materials science and modelling. It gives a detailed overview of the latest results on materials research, fabrication processes, joining options, high heat flux testing, plasticity studies, modelling, and validation experiments.
An Overview of GIS-Based Modeling and Assessment of Mining-Induced Hazards: Soil, Water, and Forest.
Suh, Jangwon; Kim, Sung-Min; Yi, Huiuk; Choi, Yosoon
2017-11-27
In this study, current geographic information system (GIS)-based methods and their application for the modeling and assessment of mining-induced hazards were reviewed. Various types of mining-induced hazard, including soil contamination, soil erosion, water pollution, and deforestation were considered in the discussion of the strength and role of GIS as a viable problem-solving tool in relation to mining-induced hazards. The various types of mining-induced hazard were classified into two or three subtopics according to the steps involved in the reclamation procedure, or elements of the hazard of interest. Because GIS is appropriated for the handling of geospatial data in relation to mining-induced hazards, the application and feasibility of exploiting GIS-based modeling and assessment of mining-induced hazards within the mining industry could be expanded further.
NASA Scope and Subject Category Guide
NASA Technical Reports Server (NTRS)
2011-01-01
This guide provides a simple, effective tool to assist aerospace information analysts and database builders in the high-level subject classification of technical materials. Each of the 76 subject categories comprising the classification scheme is presented with a description of category scope, a listing of subtopics, cross references, and an indication of particular areas of NASA interest. The guide also includes an index of nearly 3,000 specific research topics cross referenced to the subject categories. The portable document format (PDF) version of the guide contains links in the index from each input subject to its corresponding categories. In addition to subject classification, the guide can serve as an aid to searching databases that use the classification scheme, and is also an excellent selection guide for those involved in the acquisition of aerospace literature. The CD-ROM contains both HTML and PDF versions.
National Air Space (NAS) Data Exchange Environment Through 2060
NASA Technical Reports Server (NTRS)
Roy, Aloke
2015-01-01
NASA's NextGen Concepts and Technology Development (CTD) Project focuses on capabilities to improve safety, capacity and efficiency of the National Air Space (NAS). In order to achieve those objectives, NASA sought industry-Government partnerships to research and identify solutions for traffic flow management, dynamic airspace configuration, separation assurance, super density operations, airport surface operations and similar forward-looking air-traffic modernization (ATM) concepts. Data exchanges over NAS being the key enabler for most of these ATM concepts, the Sub-Topic area 3 of the CTD project sought to identify technology candidates that can satisfy air-to-air and air/ground communications needs of the NAS in the year 2060 timeframe. Honeywell, under a two-year contract with NASA, is working on this communications technology research initiative. This report summarizes Honeywell's research conducted during the second year of the study task.
NASA Technical Reports Server (NTRS)
Pinelli, Thomas E. (Editor); Sullivan, Shannon (Editor); Sanchez, Alicia (Editor)
2008-01-01
This NASA Conference Publication features select papers and PowerPoint presentations from the Education and Training Track of the MODSIM World 2007 Conference and Expo. Invited speakers and panelists of national and international renown, representing academia, industry and government, discussed how modeling and simulation (M&S) technology can be used to accelerate learning in the K-16 classroom, especially when using M&S technology as a tool for integrating science, technology, engineering and mathematics (STEM) classes. The presenters also addressed the application of M&S technology to learning and training outside of the classroom. Specific sub-topics of the presentations included: learning theory; curriculum development; professional development; tools/user applications; implementation/infrastructure/issues; and workforce development. The conference also included a session devoted to student M&S competitions in Virginia, as well as a poster session.
Weight loss in combat sports: physiological, psychological and performance effects.
Franchini, Emerson; Brito, Ciro José; Artioli, Guilherme Giannini
2012-12-13
The present article briefly reviews weight loss processes in combat sports. We aimed to discuss the most relevant aspects of rapid weight loss (RWL) in combat sports. This review was performed in the MedLine, Lilacs, PubMed and SciELO databases and organized into sub-topics: (1) prevalence, magnitude and procedures; (2) psychological, physiological and performance effects; (3) possible strategies to avoid decreased performance; and (4) organizational strategies to avoid such practices. There was a high prevalence (50%) of RWL, regardless of the specific combat discipline. The methods used, such as laxatives, diuretics, plastic or rubber suits, and saunas, are harmful to performance and health. RWL affects physical and cognitive capacities and may increase the risk of death. Recommendations for different training phases, together with educational and organizational approaches, are presented to deal with or avoid RWL.
A Critical Review of Dental Implant Materials with an Emphasis on Titanium versus Zirconia
Osman, Reham B.; Swain, Michael V.
2015-01-01
The goal of the current publication is to provide a comprehensive literature review on the topic of dental implant materials. The paper focuses on conventional titanium implants and the more recently introduced and increasingly popular zirconia implants. Major subtopics include the materials science and clinical considerations involving both implant materials and the influence of their physical properties on treatment outcome. Titanium remains the gold standard for the fabrication of oral implants; titanium sensitivity does occur, but its clinical relevance is not yet clear. Zirconia implants may prove promising in the future; however, further in vitro and well-designed in vivo clinical studies are needed before such a recommendation can be made. Special considerations and technical experience are needed when dealing with zirconia implants to minimize the incidence of mechanical failure. PMID:28787980
Technology transfer within the government
NASA Technical Reports Server (NTRS)
Christensen, Carissa Bryce
1992-01-01
The report of a workshop panel concerned with technology transfer within the government is presented. The suggested subtopics for the panel were as follows: (1) transfer from non-NASA U.S. government technology developers to NASA space missions/programs; and (2) transfer from NASA to other U.S. government civil space mission programs. Two presentations were made to the panel: Roles/Value of Early Strategic Planning Within the Space Exploration Initiative (SEI) to Facilitate Later Technology Transfer To and From Industry; and NOAA Satellite Programs and Technology Requirements. The panel discussion addressed the following major issues: DOD/NASA cooperation; alternative mechanisms for interagency communication and interactions; current technology transfer relationships among federal research agencies, and strategies for improving this transfer; technology transfer mechanisms appropriate to intragovernment transfer; the importance of industry as a technology transfer conduit; and measures of merit.
Two biased estimation techniques in linear regression: Application to aircraft
NASA Technical Reports Server (NTRS)
Klein, Vladislav
1988-01-01
Several ways for detection and assessment of collinearity in measured data are discussed. Because data collinearity usually results in poor least squares estimates, two estimation techniques which can limit the damaging effect of collinearity are presented. These two techniques, principal components regression and mixed estimation, belong to a class of biased estimation techniques. Detection and assessment of data collinearity and the two biased estimation techniques are demonstrated in two examples using flight test data from longitudinal maneuvers of an experimental aircraft. The eigensystem analysis and parameter variance decomposition appeared to be promising tools for collinearity evaluation. The biased estimators had far better accuracy than the results from the ordinary least squares technique.
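The sketch below is a minimal illustration of one of the two biased estimators mentioned, principal components regression: the regressors are rotated onto their principal components and only the well-conditioned components are retained before fitting. The data are synthetic, not the flight-test data used in the report.

```python
# Minimal sketch of principal components regression (PCR) on nearly collinear data.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)          # nearly collinear regressor
X = np.column_stack([x1, x2])
y = 1.0 * x1 + 1.0 * x2 + 0.1 * rng.normal(size=n)

# Standardize, then rotate onto the principal components of X'X
Xc = (X - X.mean(0)) / X.std(0)
eigval, eigvec = np.linalg.eigh(Xc.T @ Xc)
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

k = 1                                        # retain only the well-conditioned component
Z = Xc @ eigvec[:, :k]
gamma = np.linalg.lstsq(Z, y - y.mean(), rcond=None)[0]
beta_pcr = eigvec[:, :k] @ gamma             # biased, but low-variance coefficients
beta_ols = np.linalg.lstsq(Xc, y - y.mean(), rcond=None)[0]
print("OLS:", beta_ols, " PCR:", beta_pcr)
```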
NASA Astrophysics Data System (ADS)
Matthews, Kelly E.; Adams, Peter; Goos, Merrilyn
2016-07-01
Application of mathematical and statistical thinking and reasoning, typically referred to as quantitative skills, is essential for university bioscience students. First, this study developed an assessment task intended to gauge graduating students' quantitative skills. The Quantitative Skills Assessment of Science Students (QSASS) was the result, which examined 10 mathematical and statistical sub-topics. Second, the study established an evidential baseline of students' quantitative skills performance and confidence levels by piloting the QSASS with 187 final-year biosciences students at a research-intensive university. The study is framed within the planned-enacted-experienced curriculum model and contributes to science reform efforts focused on enhancing the quantitative skills of university graduates, particularly in the biosciences. The results found, on average, weak performance and low confidence on the QSASS, suggesting divergence between academics' intentions and students' experiences of learning quantitative skills. Implications for curriculum design and future studies are discussed.
Publication ethics from the perspective of PhD students of health sciences: a limited experience.
Arda, Berna
2012-06-01
Publication ethics, an important subtopic of science ethics, deals with determining misconduct in the performance of research or in the dissemination of ideas, data and products. Science, the main features of which are secure, reliable and ethically obtained data, plays a major role in shaping society; as long as science maintains its quality by being based on reliable and ethically obtained data, it will be able to maintain that role. This article presents the opinions of PhD candidate students in the health sciences in Ankara concerning publication ethics. The data obtained from 143 PhD students in medicine, dentistry, pharmacy and veterinary medicine reveal limited but unique experiences. They also show that plagiarism is one of the worst issues in publication ethics from the perspective of these young academics.
Westergaard, David; Stærfeldt, Hans-Henrik; Tønsberg, Christian; Jensen, Lars Juhl; Brunak, Søren
2018-02-01
Across academia and industry, text mining has become a popular strategy for keeping up with the rapid growth of the scientific literature. Text mining of the scientific literature has mostly been carried out on collections of abstracts, due to their availability. Here we present an analysis of 15 million English scientific full-text articles published during the period 1823-2016. We describe the development in article length and publication sub-topics during these nearly 250 years. We showcase the potential of text mining by extracting published protein-protein, disease-gene, and protein subcellular associations using a named entity recognition system, and quantitatively report on their accuracy using gold standard benchmark data sets. We subsequently compare the findings to corresponding results obtained on 16.5 million abstracts included in MEDLINE and show that text mining of full-text articles consistently outperforms using abstracts only.
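As a toy illustration of the association-mining idea described above, the sketch below tags entities with a hand-made dictionary and counts sentence-level co-occurrences; the entity lists and text are hypothetical, and this is not the named entity recognition system or the gold-standard benchmarking used in the study.

```python
# Toy sketch of association extraction from full text: dictionary-based entity
# tagging plus sentence-level co-occurrence counting.
import itertools
import re
from collections import Counter

proteins = {"tp53", "mdm2", "brca1"}                 # hypothetical entity dictionaries
diseases = {"breast cancer", "li-fraumeni syndrome"}

text = ("TP53 interacts with MDM2 in many tumours. "
        "Germline TP53 mutations cause Li-Fraumeni syndrome. "
        "BRCA1 loss is frequent in breast cancer.")

pairs = Counter()
for sentence in re.split(r"(?<=[.!?])\s+", text.lower()):
    found_p = [p for p in proteins if p in sentence]
    found_d = [d for d in diseases if d in sentence]
    for p, q in itertools.combinations(sorted(found_p), 2):
        pairs[("protein-protein", p, q)] += 1
    for p, d in itertools.product(found_p, found_d):
        pairs[("disease-gene", p, d)] += 1

print(pairs.most_common())
```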
Westergaard, David; Stærfeldt, Hans-Henrik
2018-01-01
Across academia and industry, text mining has become a popular strategy for keeping up with the rapid growth of the scientific literature. Text mining of the scientific literature has mostly been carried out on collections of abstracts, due to their availability. Here we present an analysis of 15 million English scientific full-text articles published during the period 1823–2016. We describe the development in article length and publication sub-topics during these nearly 250 years. We showcase the potential of text mining by extracting published protein–protein, disease–gene, and protein subcellular associations using a named entity recognition system, and quantitatively report on their accuracy using gold standard benchmark data sets. We subsequently compare the findings to corresponding results obtained on 16.5 million abstracts included in MEDLINE and show that text mining of full-text articles consistently outperforms using abstracts only. PMID:29447159
Kirk, Megan A; Rhodes, Ryan E
2011-07-01
Preschoolers with developmental delay (DD) are at risk for poor fundamental movement skills (FMS), but a paucity of early FMS interventions exists. The purpose of this review was to critically appraise the existing interventions to establish direction for future trials targeting preschoolers with DD. A total of 11 studies met the inclusion criteria. Major findings were summarized based on common subtopics of overall intervention effect, locomotor skill outcomes, object-control outcomes, and gender differences. Trials ranged from 8 to 24 weeks and offered 540-1700 min of instruction. The majority of trials (n = 9) significantly improved FMS of preschoolers with DD, with a large intervention effect (η² = 0.57-0.85). This review supports the utility of interventions to improve FMS of preschoolers with DD. Future researchers are encouraged to include more robust designs, a theoretical framework, and involvement of parents and teachers in the delivery of the intervention.
Weight loss in combat sports: physiological, psychological and performance effects
2012-01-01
Background The present article briefly reviews weight loss processes in combat sports. We aimed to discuss the most relevant aspects of rapid weight loss (RWL) in combat sports. Methods This review was performed in the MedLine, Lilacs, PubMed and SciELO databases and organized into sub-topics: (1) prevalence, magnitude and procedures; (2) psychological, physiological and performance effects; (3) possible strategies to avoid decreased performance; and (4) organizational strategies to avoid such practices. Results There was a high prevalence (50%) of RWL, regardless of the specific combat discipline. The methods used, such as laxatives, diuretics, plastic or rubber suits, and saunas, are harmful to performance and health. RWL affects physical and cognitive capacities and may increase the risk of death. Conclusion Recommendations for different training phases, together with educational and organizational approaches, are presented to deal with or avoid RWL. PMID:23237303
Boundary methods for mode estimation
NASA Astrophysics Data System (ADS)
Pierson, William E., Jr.; Ulug, Batuhan; Ahalt, Stanley C.
1999-08-01
This paper investigates the use of Boundary Methods (BMs), a collection of tools used for distribution analysis, as a method for estimating the number of modes associated with a given data set. Model order information of this type is required by several pattern recognition applications. The BM technique provides a novel approach to this parameter estimation problem and is comparable, in terms of both accuracy and computation, to other popular mode estimation techniques found in the literature and in automatic target recognition applications. This paper explains the methodology used in the BM approach to mode estimation. It also briefly reviews other common mode estimation techniques and describes the empirical investigation used to explore the relationship of the BM technique to them. Specifically, the accuracy and computational efficiency of the BM technique are compared quantitatively to a mixture-of-Gaussians (MOG) approach and a k-means approach to model order estimation. The stopping criterion for the MOG and k-means techniques is the Akaike Information Criterion (AIC).
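For context, the sketch below shows the mixture-of-Gaussians baseline with AIC as the stopping criterion, one of the comparison techniques named above; the data are synthetic and this is not the Boundary Method itself.

```python
# Sketch of MOG-based mode estimation with AIC model selection (scikit-learn assumed).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-3, 1, (300, 2)),
               rng.normal(+3, 1, (300, 2))])   # two true modes

aic = {}
for k in range(1, 6):
    gmm = GaussianMixture(n_components=k, random_state=0).fit(X)
    aic[k] = gmm.aic(X)

best_k = min(aic, key=aic.get)                 # smallest AIC wins
print("AIC by k:", aic, "-> estimated number of modes:", best_k)
```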
NASA Astrophysics Data System (ADS)
Sutliff, T. J.; Otero, A. M.; Urban, D. L.
2002-01-01
The Physical Sciences Research Program of NASA has chartered a broad suite of peer-reviewed research investigating both fundamental combustion phenomena and applied combustion research topics. Fundamental research provides insights to develop accurate simulations of complex combustion processes and allows developers to improve the efficiency of combustion devices, to reduce the production of harmful emissions, and to reduce the incidence of accidental uncontrolled combustion (fires, explosions). The applied research benefits humans living and working in space through its fire safety program. The Combustion Science Discipline is implementing a structured flight research program utilizing the International Space Station (ISS) and two of its premier facilities, the Combustion Integrated Rack of the Fluids and Combustion Facility and the Microgravity Science Glovebox, to conduct this space-based research. This paper reviews the current vision of Combustion Science research planned for International Space Station implementation from 2003 through 2012. A variety of research efforts in droplets and sprays, solid-fuels combustion, and gaseous combustion have been independently selected and critiqued through a series of peer-review processes. During this period, while both the ISS carrier and its research facilities are under development, the Combustion Science Discipline has synergistically combined research efforts into sub-topical areas. To conduct this research aboard ISS in the most cost effective and resource efficient manner, the sub-topic research areas are implemented via a multi-user hardware approach. This paper also summarizes the multi-user hardware approach and recaps the progress made in developing these research hardware systems. A balanced program content has been developed to maximize the production of fundamental and applied combustion research results within the current budgetary and ISS operational resource constraints. Decisions on utilizing the Combustion Integrated Rack and the Microgravity Science Glovebox are made based on facility capabilities and research requirements. To maximize research potential, additional research objectives are specified as desires a priori during the research design phase. These expanded research goals, which are designed to be achievable even with late addition of operational resources, allow additional research of a known, peer-endorsed scope to be conducted at marginal cost. Additional operational resources such as upmass, crewtime, data downlink bandwidth, and stowage volume may be presented by the ISS planners late in the research mission planning process. The Combustion Discipline has put in place plans to be prepared to take full advantage of such opportunities.
NASA Astrophysics Data System (ADS)
Favia, Andrej
My Ph.D. research examines the persistence of 215 common misconceptions in astronomy. Each misconception is based on an incorrect belief commonly held by college students taking introductory astronomy. At the University of Maine, the course is taught in alternating semesters by Prof. Neil F. Comins and Prof. David J. Batuski. In this dissertation, I examine the persistence of common astronomy misconceptions by administering a retrospective survey. The survey is a new instrument in that it permits the student to indicate either endorsement or rejection of each misconception at various stages in the student's life. I analyze data from a total of 639 students over six semesters. I compare the survey data to the results of exams taken by the students and to additional instruments that assess students' misconceptions prior to instruction. I show that the consistency of the students' recollection of their own misconceptions is on par with the consistency of responses between prelims and the final exam. I also find that students who report greater childhood interest in astronomy are more likely to have accurate recall of their own past beliefs. I then discuss the use of principal components analysis as a technique for describing the extent to which misconceptions are correlated with each other. The analysis yields logical groupings of subtopics from which to teach. I then present a brief overview of item response theory, the methodology of which calculates relative difficulties of the items. My analysis reveals orderings of the associated topics that are most effective at dispelling misconceptions during instruction. I also find that the best order in which to teach the associated concepts is often different for high school and college level courses.
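The sketch below illustrates the principal-components idea referred to above: given binary endorsement data (students by misconception items), correlated items load on the same component and suggest subtopic groupings. The data, item counts and loading threshold are synthetic assumptions, not the dissertation's survey data.

```python
# Hedged sketch: PCA-style grouping of correlated misconception items.
import numpy as np

rng = np.random.default_rng(2)
n_students, n_items = 300, 6
latent = rng.normal(size=(n_students, 2))            # two underlying "subtopics"
loadings = np.array([[1, 1, 1, 0, 0, 0],
                     [0, 0, 0, 1, 1, 1]], float)
responses = (latent @ loadings + rng.normal(scale=0.5, size=(n_students, n_items))) > 0

R = np.corrcoef(responses.astype(float), rowvar=False)  # item correlation matrix
eigval, eigvec = np.linalg.eigh(R)
top = eigvec[:, np.argsort(eigval)[::-1][:2]]            # two leading components
for c in range(2):
    group = np.where(np.abs(top[:, c]) > 0.4)[0]         # 0.4 loading cut is arbitrary
    print(f"component {c}: misconception items {group.tolist()}")
```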
ISO Key Project: Exploring the Full Range of Quasar/Agn Properties
NASA Technical Reports Server (NTRS)
Wilkes, Belinda; Oliversen, Ronald J. (Technical Monitor)
2003-01-01
While most of the work on this program has been completed, as previously reported, the portion of the program dealing with the subtopic of ISO LWS data analysis and reduction for the LWS Extragalactic Science Team and its leader, Dr. Howard Smith, is still active. This program in fact continues to generate results, and newly available computer modeling has extended the value of the datasets. As a result the team requests a one-year no-cost extension to this program, through 31 December 2004. The essence of the proposal is to perform ISO spectroscopic studies, including data analysis and modeling, of star-formation regions using an ensemble of archival space-based data from the Infrared Space Observatory's Long Wavelength Spectrometer and Short Wavelength Spectrometer, but including as well some other spectroscopic databases. Four kinds of regions are considered in the studies: (1) disks around more evolved objects; (2) young, low or high mass pre-main sequence stars in star-formation regions; (3) star formation in external, bright IR galaxies; and (4) the galactic center. One prime focus of the program is the OH lines in the far infrared. The program has the following goals: 1) Refine the data analysis of ISO observations to obtain deeper and better SNR results on selected sources. The ISO data itself underwent 'pipeline 10' reductions in early 2001, and additional 'hands-on data reduction packages' were supplied by the ISO teams in 2001. The Fabry-Perot database is particularly sensitive to noise and slight calibration errors; 2) Model the atomic and molecular line shapes, in particular the OH lines, using revised Monte-Carlo techniques developed by the SWAS team at the Center for Astrophysics; 3) Attend scientific meetings and workshops; 4) Perform E&PO activities related to infrared astrophysics and/or spectroscopy.
Evaluation of gravimetric techniques to estimate the microvascular filtration coefficient
Dongaonkar, R. M.; Laine, G. A.; Stewart, R. H.
2011-01-01
Microvascular permeability to water is characterized by the microvascular filtration coefficient (Kf). Conventional gravimetric techniques to estimate Kf rely on data obtained from either transient or steady-state increases in organ weight in response to increases in microvascular pressure. Both techniques result in considerably different estimates and neither account for interstitial fluid storage and lymphatic return. We therefore developed a theoretical framework to evaluate Kf estimation techniques by 1) comparing conventional techniques to a novel technique that includes effects of interstitial fluid storage and lymphatic return, 2) evaluating the ability of conventional techniques to reproduce Kf from simulated gravimetric data generated by a realistic interstitial fluid balance model, 3) analyzing new data collected from rat intestine, and 4) analyzing previously reported data. These approaches revealed that the steady-state gravimetric technique yields estimates that are not directly related to Kf and are in some cases directly proportional to interstitial compliance. However, the transient gravimetric technique yields accurate estimates in some organs, because the typical experimental duration minimizes the effects of interstitial fluid storage and lymphatic return. Furthermore, our analytical framework reveals that the supposed requirement of tying off all draining lymphatic vessels for the transient technique is unnecessary. Finally, our numerical simulations indicate that our comprehensive technique accurately reproduces the value of Kf in all organs, is not confounded by interstitial storage and lymphatic return, and provides corroboration of the estimate from the transient technique. PMID:21346245
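The sketch below is a highly simplified numerical illustration of the transient gravimetric idea only (not the authors' comprehensive model with interstitial storage and lymphatic return): after a step increase in venous pressure, the slow phase of weight gain is extrapolated back to the time of the pressure step and its rate is divided by the capillary pressure change to estimate Kf. All numbers, units and time constants are hypothetical.

```python
# Hedged sketch of the transient gravimetric estimate of Kf from a simulated
# organ-weight transient following a venous pressure step.
import numpy as np

t = np.linspace(0, 20, 201)            # minutes after the pressure step
dP = 10.0                              # assumed capillary pressure step, mmHg
true_kf = 0.02                         # assumed Kf, g/min/mmHg
# Fast vascular filling plus a slowly decaying filtration component, plus noise
weight = 0.5 * (1 - np.exp(-t / 0.5)) + true_kf * dP * 40 * (1 - np.exp(-t / 40))
weight += np.random.default_rng(3).normal(scale=0.001, size=t.size)

# Use the slow phase (after fast vascular filling is complete), fit the log of
# the weight-gain rate, and extrapolate that rate back to t = 0.
rate = np.gradient(weight, t)
slow = t > 3.0
slope, log_r0 = np.polyfit(t[slow], np.log(np.clip(rate[slow], 1e-6, None)), 1)
kf_est = np.exp(log_r0) / dP
print(f"estimated Kf ~ {kf_est:.3f} g/min/mmHg (assumed true value {true_kf})")
```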
NASA Technical Reports Server (NTRS)
Morris, A. Terry
1999-01-01
This paper examines various sources of error in MIT's improved top oil temperature rise over ambient temperature model and estimation process. The sources of error are the current parameter estimation technique, quantization noise, and post-processing of the transformer data. Results from this paper will show that an output error parameter estimation technique should be selected to replace the current least squares estimation technique. The output error technique obtained accurate predictions of transformer behavior, revealed the best error covariance, obtained consistent parameter estimates, and provided for valid and sensible parameters. This paper will also show that the output error technique should be used to minimize errors attributed to post-processing (decimation) of the transformer data. Models used in this paper are validated using data from a large transformer in service.
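To make the distinction concrete, the sketch below shows a generic output-error fit (not MIT's actual top-oil model or parameters): a first-order thermal model is simulated from the input load, and the parameters are adjusted to minimize the difference between the simulated and "measured" outputs. The model form, gains and time constant are assumptions for illustration; scipy is assumed to be available.

```python
# Generic output-error parameter estimation for a first-order thermal model.
import numpy as np
from scipy.optimize import least_squares

dt, n = 1.0, 300
rng = np.random.default_rng(4)
load = 0.7 + 0.3 * (np.arange(n) > 150)          # hypothetical per-unit load step

def simulate(gain, tau, load, dt):
    """First-order response: tau * dT/dt = gain * load^2 - T."""
    T = np.zeros(load.size)
    for k in range(1, load.size):
        T[k] = T[k - 1] + dt / tau * (gain * load[k - 1] ** 2 - T[k - 1])
    return T

measured = simulate(40.0, 90.0, load, dt) + rng.normal(scale=0.3, size=n)

def output_error(params):
    gain, tau = params
    return simulate(gain, tau, load, dt) - measured   # residual between model and data

fit = least_squares(output_error, x0=[20.0, 30.0], bounds=([1, 1], [100, 500]))
print("estimated gain, tau:", fit.x)                  # true values were 40.0 and 90.0
```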
Bayesian techniques for surface fuel loading estimation
Kathy Gray; Robert Keane; Ryan Karpisz; Alyssa Pedersen; Rick Brown; Taylor Russell
2016-01-01
A study by Keane and Gray (2013) compared three sampling techniques for estimating surface fine woody fuels. Known amounts of fine woody fuel were distributed on a parking lot, and researchers estimated the loadings using different sampling techniques. An important result was that precise estimates of biomass required intensive sampling for both the planar intercept...
Estimation of correlation functions by stochastic approximation.
NASA Technical Reports Server (NTRS)
Habibi, A.; Wintz, P. A.
1972-01-01
Consideration of the autocorrelation function of a zero-mean stationary random process. The techniques are applicable to processes with nonzero mean provided the mean is estimated first and subtracted. Two recursive techniques are proposed, both of which are based on the method of stochastic approximation and assume a functional form for the correlation function that depends on a number of parameters that are recursively estimated from successive records. One technique uses a standard point estimator of the correlation function to provide estimates of the parameters that minimize the mean-square error between the point estimates and the parametric function. The other technique provides estimates of the parameters that maximize a likelihood function relating the parameters of the function to the random process. Examples are presented.
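The sketch below illustrates the first recursive technique in spirit: assume a parametric correlation function R(tau) = sigma^2 exp(-a|tau|) and update the parameter a by a Robbins-Monro-style stochastic approximation so that the model tracks point estimates computed from successive records. The process model, step sizes and record length are assumptions, not the paper's exact algorithm.

```python
# Stochastic-approximation update of a parametric autocorrelation function.
import numpy as np

rng = np.random.default_rng(5)
true_a, sigma2, lags = 0.5, 1.0, np.arange(1, 6)

a_hat = 1.0
for k in range(1, 201):
    # One new record of an AR(1) process whose ACF is sigma2 * exp(-true_a * |tau|)
    x = np.zeros(500)
    phi = np.exp(-true_a)
    for i in range(1, x.size):
        x[i] = phi * x[i - 1] + rng.normal(scale=np.sqrt(sigma2 * (1 - phi ** 2)))
    r_hat = np.array([np.mean(x[:-l] * x[l:]) for l in lags])   # point estimates of R(tau)

    # Gradient of the squared error between parametric and point estimates
    model = sigma2 * np.exp(-a_hat * lags)
    grad = np.sum(2 * (model - r_hat) * (-lags) * model)
    a_hat -= (1.0 / k) * grad                                   # decaying step size ~ 1/k

print("estimated decay parameter:", round(a_hat, 3), "(true value", true_a, ")")
```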
Sim, Kok Swee; NorHisham, Syafiq
2016-11-01
A technique based on a linear least squares regression (LSR) model is applied to estimate the signal-to-noise ratio (SNR) of scanning electron microscope (SEM) images. To test the accuracy of this technique for SNR estimation, a number of SEM images are first corrupted with white noise. The autocorrelation functions (ACF) of the original and corrupted SEM images are formed and serve as the reference for estimating the SNR of the corrupted image. The LSR technique is then compared with three existing techniques: nearest neighbourhood, first-order interpolation, and the combination of nearest neighbourhood and first-order interpolation. The actual and estimated SNR values for all these techniques are calculated for comparison. The LSR technique attains the highest accuracy of the four, as the absolute difference between the actual and estimated SNR values is smallest. SCANNING 38:771-782, 2016. © 2016 Wiley Periodicals, Inc.
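The sketch below captures the general autocorrelation-extrapolation idea behind this family of SNR estimators (not the authors' exact formulation or SEM data): white noise raises only the zero-lag value of the ACF, so a line fitted through the nearby non-zero lags and extrapolated back to lag 0 estimates the noise-free signal power.

```python
# Generic sketch: SNR of a noisy 1-D "image row" from its autocorrelation function.
import numpy as np

rng = np.random.default_rng(6)
row = np.convolve(rng.normal(size=2000), np.ones(15) / 15, mode="same")  # smooth signal
noisy = row + rng.normal(scale=0.05, size=row.size)                      # add white noise

x = noisy - noisy.mean()
acf = np.array([np.mean(x[: x.size - l] * x[l:]) for l in range(0, 6)])

lags = np.arange(1, 6)
slope, intercept = np.polyfit(lags, acf[1:], 1)   # linear LSR over lags 1..5
signal_power = intercept                          # extrapolated noise-free peak at lag 0
noise_power = acf[0] - signal_power
print("estimated SNR:", signal_power / noise_power,
      "  true SNR:", row.var() / 0.05 ** 2)
```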
As-built design specification for proportion estimate software subsystem
NASA Technical Reports Server (NTRS)
Obrien, S. (Principal Investigator)
1980-01-01
The Proportion Estimate Processor evaluates four estimation techniques in order to get an improved estimate of the proportion of a scene that is planted in a selected crop. The four techniques to be evaluated were provided by the techniques development section and are: (1) random sampling; (2) proportional allocation, relative count estimate; (3) proportional allocation, Bayesian estimate; and (4) sequential Bayesian allocation. The user is given two options for computation of the estimated mean square error. These are referred to as the cluster calculation option and the segment calculation option. The software for the Proportion Estimate Processor is operational on the IBM 3031 computer.
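The sketch below illustrates the flavour of the proportional-allocation estimate relative to naive pooling of labeled samples; the cluster sizes, sample sizes and crop proportions are hypothetical, and this is not the original subsystem (which ran on an IBM 3031 against segment data).

```python
# Illustrative comparison: unweighted pooling vs. proportional allocation
# (relative-count) estimation of the crop proportion in a scene.
import numpy as np

rng = np.random.default_rng(7)
cluster_sizes = np.array([5000, 3000, 2000])    # hypothetical spectral cluster sizes
true_props = np.array([0.8, 0.4, 0.1])          # true crop proportion within each cluster

# Analyst-labeled pixels: an equal-size sample drawn from each cluster
samples = [rng.random(60) < p for p in true_props]

# (1) Naive pooling of the samples ignores the differing cluster sizes
p_pooled = np.concatenate(samples).mean()

# (2) Proportional allocation: weight each cluster's sample proportion by its size
weights = cluster_sizes / cluster_sizes.sum()
p_prop = np.sum(weights * np.array([s.mean() for s in samples]))

true_scene = np.sum(weights * true_props)
print(f"pooled: {p_pooled:.3f}  proportional allocation: {p_prop:.3f}  truth: {true_scene:.3f}")
```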
Steinauer, Jody; LaRochelle, Flynn; Rowh, Marta; Backus, Lois; Sandahl, Yarrow; Foster, Angel
2009-07-01
This study evaluates the inclusion of sexual and reproductive health (SRH) topics in preclinical US and Canadian medical education. Between 2002 and 2005, we sent surveys to the student coordinators of active Medical Students for Choice chapters at 122 US and Canadian medical schools. Students reported on the preclinical curricular inclusion of 50 specific SRH topics in the broad categories of pregnancy, contraception, infertility, elective abortion, ethical and social issues, and other topics. We received 77 completed surveys, for an overall response rate of 63%. Coverage of pregnancy physiology and STIs/HIV was uniformly high. In contrast, inclusion of contraceptive methods and elective abortion procedures greatly varied by subtopic and geographic region. Thirty-three percent of respondents reported no coverage of elective abortion-related topics. Inclusion of contraception and elective abortion in preclinical medical school courses varies widely. As critical components of women's lives and health, we recommend that medical schools work to integrate comprehensive family planning content into their standard curricula.
Pharmacy curriculum outcomes assessment for individual student assessment and curricular evaluation.
Scott, Day M; Bennett, Lunawati L; Ferrill, Mary J; Brown, Daniel L
2010-12-15
The Pharmacy Curriculum Outcomes Assessment (PCOA) is a standardized examination for assessing the academic progress of pharmacy students. Although it is the only benchmarking tool available at a national level, the PCOA has not been adopted by all colleges and schools of pharmacy. Palm Beach Atlantic University (PBAU) compared 2008-2010 PCOA results of its P1, P2, and P3 students to their current grade point average (GPA) and to the results of a national cohort. The reliability coefficient of the PCOA was 0.91, 0.90, and 0.93 for the 3 years, respectively. PBAU results showed a positive correlation between GPA and PCOA scale score. A comparison of subtopic results helped to identify areas of strength and weakness in the curriculum. The PCOA provides useful comparative data that can facilitate individual student assessment as well as programmatic evaluation, and no other standardized assessment tool of this kind is available. Despite its limitations, the PCOA warrants consideration by colleges and schools of pharmacy. Expanded participation could enhance its utility as a meaningful benchmark.
Pharmacy Curriculum Outcomes Assessment for Individual Student Assessment and Curricular Evaluation
Bennett, Lunawati L.; Ferrill, Mary J.; Brown, Daniel L.
2010-01-01
The Pharmacy Curriculum Outcomes Assessment (PCOA) is a standardized examination for assessing the academic progress of pharmacy students. Although it is the only benchmarking tool available at a national level, the PCOA has not been adopted by all colleges and schools of pharmacy. Palm Beach Atlantic University (PBAU) compared 2008-2010 PCOA results of its P1, P2, and P3 students to their current grade point average (GPA) and to the results of a national cohort. The reliability coefficient of the PCOA was 0.91, 0.90, and 0.93 for the 3 years, respectively. PBAU results showed a positive correlation between GPA and PCOA scale score. A comparison of subtopic results helped to identify areas of strength and weakness in the curriculum. The PCOA provides useful comparative data that can facilitate individual student assessment as well as programmatic evaluation, and no other standardized assessment tool of this kind is available. Despite its limitations, the PCOA warrants consideration by colleges and schools of pharmacy. Expanded participation could enhance its utility as a meaningful benchmark. PMID:21436924
Gurvitz, Michelle; Burns, Kristin M.; Brindis, Ralph; Broberg, Craig S.; Daniels, Curt J.; Fuller, Stephanie M.P.N.; Honein, Margaret A.; Khairy, Paul; Kuehl, Karen S.; Landzberg, Michael J.; Mahle, William T.; Mann, Douglas L.; Marelli, Ariane; Newburger, Jane W.; Pearson, Gail D.; Starling, Randall C.; Tringali, Glenn R.; Valente, Anne Marie; Wu, Joseph C.; Califf, Robert M.
2016-01-01
Congenital heart disease (CHD) is the most common birth defect, affecting about 0.8% of live births. Advances in recent decades have allowed >85% of children with CHD to survive to adulthood, creating a growing population of adults with CHD. Little information exists regarding survival, demographics, late outcomes, and comorbidities in this emerging group, and multiple barriers impede research in adult CHD (ACHD). The National Heart, Lung, and Blood Institute and the Adult Congenital Heart Association convened a multidisciplinary Working Group to identify high-impact research questions in ACHD. This report summarizes the meeting discussions in the broad areas of CHD-related heart failure, vascular disease and multisystem complications. High-priority subtopics identified included heart failure in tetralogy of Fallot, mechanical circulatory support/transplantation, sudden cardiac death, vascular outcomes in coarctation of the aorta, late outcomes in single ventricle disease, cognitive and psychiatric issues, and pregnancy. PMID:27102511
NASA Technical Reports Server (NTRS)
Kojiro, Daniel R.; Lee, Geoffrey S.
2006-01-01
The purposes of the SBIR Program are to: stimulate technological innovation in the private sector; strengthen the role of Small Business Concerns (SBCs) in meeting Federal research and development needs; increase the commercial application of these research results; and encourage participation of socially and economically disadvantaged persons and women-owned small businesses. The process can be highly rewarding, providing the small business with resources to pursue research and development with a focus on providing NASA with new and advanced capabilities. We present two examples of how the NASA Ames SBIR Program has addressed these purposes, nurturing innovative ideas from small businesses into commercially viable products that also address analytical needs in space research. These examples, from the Science Instruments for Conducting Solar System Exploration subtopic, describe the journey from innovative concept to analytical instrument, one successful and one hampered by numerous roadblocks (including some international intrigue).
Students’ conceptions analysis on several electricity concepts
NASA Astrophysics Data System (ADS)
Saputro, D. E.; Sarwanto, S.; Sukarmin, S.; Ratnasari, D.
2018-05-01
This research analyses students' conceptions of several electricity concepts. It is a descriptive study whose subjects were new students at Sebelas Maret University: 279 students from several departments, namely science education, physics education, chemistry education, biology education and mathematics education, in the 2017/2018 academic year. The instrument used was a multiple-choice test with arguments. Based on the results and analysis, it can be concluded that many students still hold misconceptions or do not understand electricity concepts in sub-topics such as the behaviour of electric current in series and parallel arrangements, the value of capacitor capacitance, the influence of capacitor charging and discharging on the loads, and the equivalent capacitance of capacitors in series. Future research is suggested to improve students' conceptual understanding with appropriate learning methods and assessment instruments, because electricity is a physics topic closely related to students' daily lives.
Network Analysis of Publications on Topological Indices from the Web of Science.
Bodlaj, Jernej; Batagelj, Vladimir
2014-08-01
In this paper we analyze a collection of bibliographic networks constructed from Web of Science data on works (papers, books, etc.) on the topic of topological indices and on related scientific fields. We present a general outlook and more specific findings about authors, works and journals, subtopics and keywords, and important relations between them, based on scientometric approaches such as the strongest and main citation paths, the main themes on citation paths based on keywords, and the results of co-authorship analysis in the form of the most prominent islands of citing authors, groups of collaborating authors, and two-mode cores of authors and works. We investigate citation patterns among authors, important journals and the citations between them, the journals preferred by authors, and a hierarchy of similar collaborating authors based on the keywords they use. We also perform a temporal analysis of one important journal. Overall, we give a comprehensive scientometric insight into the field of topological indices. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Swanson, James M; Wigal, Timothy; Jensen, Peter S; Mitchell, John T; Weisner, Thomas S; Murray, Desiree; Arnold, L Eugene; Hechtman, Lily; Molina, Brooke S G; Owens, Elizabeth B; Hinshaw, Stephen P; Belendiuk, Katherine; Howard, Andrea; Wigal, Sharon B; Sorensen, Page; Stehli, Annamarie
2017-10-01
To evaluate participants' perceptions about frequent use and reasons for substance use (SU) in the qualitative interview study, an add-on to the multimodal treatment study of ADHD (MTA). Using the longitudinal MTA database, 39 ADHD cases and 19 peers with Persistent SU, and 86 ADHD cases and 39 peers without Persistent SU were identified and recruited. In adulthood, an open-ended interview was administered, and SU excerpts were indexed and classified to create subtopics (frequent use and reasons for use of alcohol, marijuana, and other drugs). For marijuana, the Persistent compared with Nonpersistent SU group had a significantly higher percentage of participants describing frequent use and giving reasons for use, and the ADHD group compared with the group of peers had a significantly higher percentage giving "stability" as a reason for use. Motivations for persistent marijuana use may differ for adults with and without a history of ADHD.
The Mental Health Team: Evaluation From a Professional Viewpoint. A Qualitative Study.
Pileño, María Elena; Morillo, Javier; Morillo, Andrea; Losa-Iglesias, Marta
2018-04-01
Health care institutions include workers who must operate in accordance with the requirements of their positions, even though psychosocial influences can affect workers' stability. To analyze the organizational culture of the team of professionals who work in the mental health network, a qualitative methodology was used to assess a sample of 55 mental health professionals who had been practicing for at least 5 years. "Team" was the overall topic. The subtopics within "Team" were: getting along in the unit, getting along with the patient, personal resources for dealing with patients, adaptive resources of team members, and resources that the team uses in its group activities. It was observed that the team does not work toward a common objective and needs an accepted leader to manage the group. The definition and acceptance of roles can result in conflict. By increasing the skill level of each worker, the multidisciplinary team would become more collaborative. Copyright © 2017 Elsevier Inc. All rights reserved.
Use of Empirical Estimates of Shrinkage in Multiple Regression: A Caution.
ERIC Educational Resources Information Center
Kromrey, Jeffrey D.; Hines, Constance V.
1995-01-01
The accuracy of four empirical techniques to estimate shrinkage in multiple regression was studied through Monte Carlo simulation. None of the techniques provided unbiased estimates of the population squared multiple correlation coefficient, but the normalized jackknife and bootstrap techniques demonstrated marginally acceptable performance with…
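To make the shrinkage idea concrete, the sketch below shows a generic bootstrap (optimism-style) correction of the apparent squared multiple correlation; it is an illustration on synthetic data, not one of the study's specific estimators or its simulation design.

```python
# Bootstrap "optimism" correction of the apparent R^2 in multiple regression.
import numpy as np

rng = np.random.default_rng(8)
n, p = 60, 8
X = rng.normal(size=(n, p))
y = 0.5 * X[:, 0] + rng.normal(size=n)               # only one truly useful predictor

def fit(X, y):
    Xd = np.column_stack([np.ones(len(y)), X])
    coef = np.linalg.lstsq(Xd, y, rcond=None)[0]
    return coef[0], coef[1:]

def r_squared(X, y, intercept, beta):
    resid = y - (intercept + X @ beta)
    return 1 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)

b0, beta = fit(X, y)
r2_apparent = r_squared(X, y, b0, beta)

optimism = []
for _ in range(500):
    idx = rng.integers(0, n, n)                      # bootstrap resample of the rows
    bb0, bb = fit(X[idx], y[idx])
    optimism.append(r_squared(X[idx], y[idx], bb0, bb) - r_squared(X, y, bb0, bb))

r2_shrunken = r2_apparent - np.mean(optimism)        # shrinkage-corrected estimate
print(f"apparent R^2 = {r2_apparent:.3f}, bootstrap-corrected R^2 = {r2_shrunken:.3f}")
```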
NASA Technical Reports Server (NTRS)
Suit, W. T.; Cannaday, R. L.
1979-01-01
The longitudinal and lateral stability and control parameters for a high-wing general aviation airplane are examined. Estimates obtained from flight data at various flight conditions within the normal operating range of the aircraft are presented. The estimation techniques, an output error technique (maximum likelihood) and an equation error technique (linear regression), are described. The longitudinal static parameters are estimated from climbing, descending, and quasi-steady-state flight data. The lateral excitations involve a combination of rudder and aileron inputs. The sensitivity of the aircraft modes of motion to variations in the parameter estimates is discussed.
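For contrast with the output-error sketch earlier in this listing, the example below shows the equation-error (linear regression) idea: a measured moment coefficient is regressed directly on the states and control deflection. The model structure, coefficient names and data are assumptions for illustration, not the report's flight records.

```python
# Equation-error estimation of pitching-moment derivatives by ordinary least squares.
import numpy as np

rng = np.random.default_rng(9)
n = 400
alpha = rng.normal(0.05, 0.02, n)            # angle of attack, rad (hypothetical)
q_hat = rng.normal(0.0, 0.01, n)             # nondimensional pitch rate
delta_e = rng.normal(-0.02, 0.03, n)         # elevator deflection, rad

true = dict(Cm0=0.02, Cma=-0.8, Cmq=-12.0, Cmde=-1.1)
Cm = (true["Cm0"] + true["Cma"] * alpha + true["Cmq"] * q_hat
      + true["Cmde"] * delta_e + rng.normal(scale=0.002, size=n))

A = np.column_stack([np.ones(n), alpha, q_hat, delta_e])   # regression matrix
est, *_ = np.linalg.lstsq(A, Cm, rcond=None)
print(dict(zip(["Cm0", "Cma", "Cmq", "Cmde"], np.round(est, 3))))
```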
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-03-01
The module provides an overview of general techniques that owners and operators of reporting facilities may use to estimate their toxic chemical releases. It explains the basic release estimation techniques used to determine the chemical quantities reported on the Form R and uses those techniques, along with fundamental chemical and physical principles and properties, to estimate releases of listed toxic chemicals. It also covers conversion of units of mass, volume, and time; states the rules governing significant figures and rounding; and references general and industry-specific estimation documents.
Comparing Three Estimation Methods for the Three-Parameter Logistic IRT Model
ERIC Educational Resources Information Center
Lamsal, Sunil
2015-01-01
Different estimation procedures have been developed for the unidimensional three-parameter item response theory (IRT) model. These techniques include the marginal maximum likelihood estimation, the fully Bayesian estimation using Markov chain Monte Carlo simulation techniques, and the Metropolis-Hastings Robbin-Monro estimation. With each…
Deep learning ensemble with asymptotic techniques for oscillometric blood pressure estimation.
Lee, Soojeong; Chang, Joon-Hyuk
2017-11-01
This paper proposes a deep learning based ensemble regression estimator with asymptotic techniques, and offers a method that can decrease uncertainty in oscillometric blood pressure (BP) measurements using bootstrap and Monte-Carlo approaches. While the former is used to estimate SBP and DBP, the latter attempts to determine confidence intervals (CIs) for SBP and DBP based on the oscillometric measurements. This work employs deep belief networks (DBN)-deep neural networks (DNN) to estimate BPs from oscillometric measurements. However, there are some inherent problems with these methods. First, it is not easy to determine the best DBN-DNN estimator, and worthwhile information might be omitted when one DBN-DNN estimator is selected and the others discarded. Additionally, the input feature vectors, obtained from only five measurements per subject, represent a very small sample size; this is a critical weakness when using the DBN-DNN technique and can cause overfitting or underfitting, depending on the structure of the algorithm. To address these problems, an ensemble with an asymptotic approach (combining the bootstrap with the DBN-DNN technique) is used to generate the pseudo features needed to estimate the SBP and DBP. In the first stage, the bootstrap-aggregation technique is used to create ensemble parameters. Afterward, the AdaBoost approach is employed for the second-stage SBP and DBP estimation. In the third stage, the bootstrap and Monte-Carlo techniques are used to determine the CIs based on the target BP estimated with the DBN-DNN ensemble regression estimator. The proposed method mitigates estimation uncertainty such as a large standard deviation of error (SDE): comparing the proposed DBN-DNN ensemble regression estimator with the single DBN-DNN regression estimator, the SDEs of the SBP and DBP are reduced by 0.58 and 0.57 mmHg, respectively, corresponding to performance improvements of 9.18% and 10.88% over the single estimator. The proposed methodology improves the accuracy of BP estimation and reduces its uncertainty. Copyright © 2017 Elsevier B.V. All rights reserved.
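The sketch below isolates the bootstrap/Monte-Carlo confidence-interval step in its simplest form (without the DBN-DNN ensemble): the few per-subject oscillometric estimates are resampled with replacement, each resample is averaged, and percentiles of the resampled means serve as the CI. The readings and interval level are hypothetical.

```python
# Simplified bootstrap percentile CI for a per-subject blood pressure estimate.
import numpy as np

rng = np.random.default_rng(10)
sbp_readings = np.array([121.0, 118.0, 124.0, 120.0, 119.0])   # five hypothetical readings

boot_means = np.array([
    rng.choice(sbp_readings, size=sbp_readings.size, replace=True).mean()
    for _ in range(5000)
])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"SBP estimate {sbp_readings.mean():.1f} mmHg, 95% CI ({lo:.1f}, {hi:.1f})")
```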
NASA Astrophysics Data System (ADS)
Rajshekhar, G.; Gorthi, Sai Siva; Rastogi, Pramod
2010-04-01
For phase estimation in digital holographic interferometry, a high-order instantaneous moments (HIM) based method was recently developed which relies on piecewise polynomial approximation of phase and subsequent evaluation of the polynomial coefficients using the HIM operator. A crucial step in the method is mapping the polynomial coefficient estimation to single-tone frequency determination for which various techniques exist. The paper presents a comparative analysis of the performance of the HIM operator based method in using different single-tone frequency estimation techniques for phase estimation. The analysis is supplemented by simulation results.
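The sketch below shows one standard single-tone frequency estimator of the kind that can be plugged into the coefficient-estimation step described above: a zero-padded FFT peak search on a noisy complex exponential. It is a generic illustration, not the authors' estimator or the HIM operator itself.

```python
# Single-tone frequency estimation by zero-padded FFT peak search.
import numpy as np

n, f_true = 256, 0.123                      # normalized frequency, cycles/sample
rng = np.random.default_rng(11)
t = np.arange(n)
signal = np.exp(1j * 2 * np.pi * f_true * t) + 0.1 * (
    rng.normal(size=n) + 1j * rng.normal(size=n))

pad = 16 * n                                # zero-padding refines the frequency grid
spectrum = np.abs(np.fft.fft(signal, pad))
f_est = np.argmax(spectrum[: pad // 2]) / pad
print(f"estimated frequency {f_est:.5f}, true {f_true:.5f}")
```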
Positional estimation techniques for an autonomous mobile robot
NASA Technical Reports Server (NTRS)
Nandhakumar, N.; Aggarwal, J. K.
1990-01-01
Techniques for positional estimation of a mobile robot navigating in an indoor environment are described. A comprehensive review of the various positional estimation techniques studied in the literature is first presented. The techniques are divided into four different types and each of them is discussed briefly. Two different kinds of environments are considered for positional estimation: mountainous natural terrain and an urban, man-made environment with polyhedral buildings. In both cases, the robot is assumed to be equipped with a single visual camera that can be panned and tilted, and a 3-D description (world model) of the environment is given. Such a description could be obtained from a stereo pair of aerial images or from the architectural plans of the buildings. Techniques for positional estimation using the camera input and the world model are presented.
Estimation for bilinear stochastic systems
NASA Technical Reports Server (NTRS)
Willsky, A. S.; Marcus, S. I.
1974-01-01
Three techniques for the solution of bilinear estimation problems are presented. First, finite-dimensional optimal nonlinear estimators are presented for certain bilinear systems evolving on solvable and nilpotent Lie groups. Then the use of harmonic analysis for estimation problems evolving on spheres and other compact manifolds is investigated. Finally, an approximate estimation technique utilizing cumulants is discussed.
Comparison of five canopy cover estimation techniques in the western Oregon Cascades.
Anne C.S. Fiala; Steven L. Garman; Andrew N. Gray
2006-01-01
Estimates of forest canopy cover are widely used in forest research and management, yet methods used to quantify canopy cover and the estimates they provide vary greatly. Four commonly used ground-based techniques for estimating overstory cover - line-intercept, spherical densiometer, moosehorn, and hemispherical photography - and cover estimates generated from crown...
Simulations of motor unit number estimation techniques
NASA Astrophysics Data System (ADS)
Major, Lora A.; Jones, Kelvin E.
2005-06-01
Motor unit number estimation (MUNE) is an electrodiagnostic procedure used to evaluate the number of motor axons connected to a muscle. All MUNE techniques rely on assumptions that must be fulfilled to produce a valid estimate. As there is no gold standard to compare the MUNE techniques against, we have developed a model of the relevant neuromuscular physiology and have used this model to simulate various MUNE techniques. The model allows for a quantitative analysis of candidate MUNE techniques that will hopefully contribute to consensus regarding a standard procedure for performing MUNE.
Quantitative CT: technique dependence of volume estimation on pulmonary nodules
NASA Astrophysics Data System (ADS)
Chen, Baiyu; Barnhart, Huiman; Richard, Samuel; Colsher, James; Amurao, Maxwell; Samei, Ehsan
2012-03-01
Current estimation of lung nodule size typically relies on uni- or bi-dimensional techniques. While new three-dimensional volume estimation techniques using MDCT have improved size estimation of nodules with irregular shapes, the effect of acquisition and reconstruction parameters on accuracy (bias) and precision (variance) of the new techniques has not been fully investigated. To characterize the volume estimation performance dependence on these parameters, an anthropomorphic chest phantom containing synthetic nodules was scanned and reconstructed with protocols across various acquisition and reconstruction parameters. Nodule volumes were estimated by a clinical lung analysis software package, LungVCAR. Precision and accuracy of the volume assessment were calculated across the nodules and compared between protocols via a generalized estimating equation analysis. Results showed that the precision and accuracy of nodule volume quantifications were dependent on slice thickness, with different dependences for different nodule characteristics. Other parameters including kVp, pitch, and reconstruction kernel had lower impact. Determining these technique dependences enables better volume quantification via protocol optimization and highlights the importance of consistent imaging parameters in sequential examinations.
Nonparametric probability density estimation by optimization theoretic techniques
NASA Technical Reports Server (NTRS)
Scott, D. W.
1976-01-01
Two nonparametric probability density estimators are considered. The first is the kernel estimator. The problem of choosing the kernel scaling factor based solely on a random sample is addressed. An interactive mode is discussed and an algorithm proposed to choose the scaling factor automatically. The second nonparametric probability estimate uses penalty function techniques with the maximum likelihood criterion. A discrete maximum penalized likelihood estimator is proposed and is shown to be consistent in the mean square error. A numerical implementation technique for the discrete solution is discussed and examples displayed. An extensive simulation study compares the integrated mean square error of the discrete and kernel estimators. The robustness of the discrete estimator is demonstrated graphically.
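A minimal sketch of the first estimator discussed, a Gaussian kernel density estimator, is given below. The automatic choice of the scaling factor here is Silverman's rule of thumb, used purely as an illustrative assumption rather than the interactive algorithm of the report.

```python
import numpy as np

def kde(x_eval, sample, h=None):
    """Gaussian kernel density estimate; h defaults to Silverman's rule of thumb."""
    sample = np.asarray(sample, dtype=float)
    if h is None:
        h = 1.06 * sample.std(ddof=1) * len(sample) ** (-1 / 5)   # data-driven bandwidth
    u = (np.asarray(x_eval)[:, None] - sample[None, :]) / h
    return np.exp(-0.5 * u ** 2).sum(axis=1) / (len(sample) * h * np.sqrt(2 * np.pi))

rng = np.random.default_rng(1)
sample = rng.normal(size=500)
grid = np.linspace(-4, 4, 201)
density = kde(grid, sample)
print(np.trapz(density, grid))   # should be close to 1
```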
Development and application of the maximum entropy method and other spectral estimation techniques
NASA Astrophysics Data System (ADS)
King, W. R.
1980-09-01
This summary report is a collection of four separate progress reports prepared under three contracts, all sponsored by the Office of Naval Research in Arlington, Virginia. The report contains the results of investigations into the application of the maximum entropy method (MEM), a high resolution frequency and wavenumber estimation technique. It also contains, in the final report section, a description of two new, stable, high resolution spectral estimation techniques. Many examples of wavenumber spectral patterns for all investigated techniques are included throughout the report. The maximum entropy method is also known as the maximum entropy spectral analysis (MESA) technique, and both names are used in the report. Many MEM wavenumber spectral patterns are demonstrated using both simulated and measured radar signal and noise data. Methods for obtaining stable MEM wavenumber spectra are discussed, broadband signal detection using the MEM prediction error transform (PET) is discussed, and Doppler radar narrowband signal detection is demonstrated using the MEM technique. It is also shown that MEM cannot be applied to randomly sampled data. The two new, stable, high resolution spectral estimation techniques discussed in the final report section are named the Wiener-King and the Fourier spectral estimation techniques. The two new techniques have a similar derivation based upon the Wiener prediction filter, but are otherwise quite different. Further development of the techniques and measurement of their spectral characteristics are recommended for subsequent investigation.
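As a rough illustration of the flavor of such estimators, the sketch below fits an autoregressive model via the Yule-Walker equations and evaluates its spectrum; for Gaussian processes this coincides with the maximum entropy spectrum, but it is not the report's MEM/MESA implementation, and all signal parameters are invented.

```python
import numpy as np
from scipy.linalg import toeplitz

def ar_spectrum(x, order, nfreq=512):
    """Yule-Walker AR fit and its power spectrum (frequencies in cycles/sample)."""
    x = np.asarray(x, float) - np.mean(x)
    r = np.correlate(x, x, mode="full")[len(x) - 1:] / len(x)   # sample autocovariance
    a = np.linalg.solve(toeplitz(r[:order]), r[1:order + 1])    # AR coefficients
    sigma2 = r[0] - a @ r[1:order + 1]                          # innovation variance
    f = np.linspace(0.0, 0.5, nfreq)
    z = np.exp(-2j * np.pi * np.outer(f, np.arange(1, order + 1)))
    return f, sigma2 / np.abs(1.0 - z @ a) ** 2

rng = np.random.default_rng(2)
n = np.arange(1024)
x = np.sin(2 * np.pi * 0.12 * n) + 0.5 * rng.normal(size=n.size)
f, p = ar_spectrum(x, order=12)
print(f[np.argmax(p)])   # the spectral peak should sit near 0.12 cycles/sample
```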
Improved Pulse Wave Velocity Estimation Using an Arterial Tube-Load Model
Gao, Mingwu; Zhang, Guanqun; Olivier, N. Bari; Mukkamala, Ramakrishna
2015-01-01
Pulse wave velocity (PWV) is the most important index of arterial stiffness. It is conventionally estimated by non-invasively measuring central and peripheral blood pressure (BP) and/or velocity (BV) waveforms and then detecting the foot-to-foot time delay between the waveforms wherein wave reflection is presumed absent. We developed techniques for improved estimation of PWV from the same waveforms. The techniques effectively estimate PWV from the entire waveforms, rather than just their feet, by mathematically eliminating the reflected wave via an arterial tube-load model. In this way, the techniques may be more robust to artifact while revealing the true PWV in absence of wave reflection. We applied the techniques to estimate aortic PWV from simultaneously and sequentially measured central and peripheral BP waveforms and simultaneously measured central BV and peripheral BP waveforms from 17 anesthetized animals during diverse interventions that perturbed BP widely. Since BP is the major acute determinant of aortic PWV, especially under anesthesia wherein vasomotor tone changes are minimal, we evaluated the techniques in terms of the ability of their PWV estimates to track the acute BP changes in each subject. Overall, the PWV estimates of the techniques tracked the BP changes better than those of the conventional technique (e.g., diastolic BP root-mean-squared-errors of 3.4 vs. 5.2 mmHg for the simultaneous BP waveforms and 7.0 vs. 12.2 mmHg for the BV and BP waveforms (p < 0.02)). With further testing, the arterial tube-load model-based PWV estimation techniques may afford more accurate arterial stiffness monitoring in hypertensive and other patients. PMID:24263016
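For contrast with the model-based approach, a crude conventional transit-time estimate can be sketched as follows; it uses whole-waveform cross-correlation on synthetic waveforms, which is only a simplified baseline and not the paper's foot-to-foot or tube-load algorithm.

```python
import numpy as np

fs = 1000.0                                        # sampling rate in Hz (assumed)
t = np.arange(0.0, 2.0, 1.0 / fs)
central = np.sin(2 * np.pi * 1.2 * t) ** 2         # toy central pressure waveform
true_delay = 0.05                                  # 50 ms transit time
peripheral = np.interp(t - true_delay, t, central) # delayed copy as the "peripheral" waveform

xc = np.correlate(peripheral - peripheral.mean(),
                  central - central.mean(), mode="full")
lag = (np.argmax(xc) - (len(t) - 1)) / fs          # transit-time estimate in seconds
distance = 0.5                                     # assumed path length in metres
print("PWV estimate:", distance / lag, "m/s")      # 0.5 m / 0.05 s = 10 m/s here
```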
Accuracy of selected techniques for estimating ice-affected streamflow
Walker, John F.
1991-01-01
This paper compares the accuracy of selected techniques for estimating streamflow during ice-affected periods. The techniques are classified into two categories - subjective and analytical - depending on the degree of judgment required. Discharge measurements were made at three streamflow-gauging sites in Iowa during the 1987-88 winter and used to establish a baseline streamflow record for each site. Using data based on a simulated six-week field-trip schedule, selected techniques are used to estimate discharge during the ice-affected periods. For the subjective techniques, three hydrographers have independently compiled each record. Three measures of performance are used to compare the estimated streamflow records with the baseline streamflow records: the average discharge for the ice-affected period, and the mean and standard deviation of the daily errors. Based on average ranks for the three performance measures and the three sites, the analytical and subjective techniques are essentially comparable. For two of the three sites, Kruskal-Wallis one-way analysis of variance detects significant differences among the three hydrographers for the subjective methods, indicating that the subjective techniques are less consistent than the analytical techniques. The results suggest analytical techniques may be viable tools for estimating discharge during periods of ice effect, and should be developed further and evaluated for sites across the United States.
Choosing a DIVA: a comparison of emerging digital imagery vegetation analysis techniques
Jorgensen, Christopher F.; Stutzman, Ryan J.; Anderson, Lars C.; Decker, Suzanne E.; Powell, Larkin A.; Schacht, Walter H.; Fontaine, Joseph J.
2013-01-01
Question: What is the precision of five methods of measuring vegetation structure using ground-based digital imagery and processing techniques? Location: Lincoln, Nebraska, USA Methods: Vertical herbaceous cover was recorded using digital imagery techniques at two distinct locations in a mixed-grass prairie. The precision of five ground-based digital imagery vegetation analysis (DIVA) methods for measuring vegetation structure was tested using a split-split plot analysis of covariance. Variability within each DIVA technique was estimated using coefficient of variation of mean percentage cover. Results: Vertical herbaceous cover estimates differed among DIVA techniques. Additionally, environmental conditions affected the vertical vegetation obstruction estimates for certain digital imagery methods, while other techniques were more adept at handling various conditions. Overall, percentage vegetation cover values differed among techniques, but the precision of four of the five techniques was consistently high. Conclusions: DIVA procedures are sufficient for measuring various heights and densities of standing herbaceous cover. Moreover, digital imagery techniques can reduce measurement error associated with multiple observers' standing herbaceous cover estimates, allowing greater opportunity to detect patterns associated with vegetation structure.
Development of a technique for estimating noise covariances using multiple observers
NASA Technical Reports Server (NTRS)
Bundick, W. Thomas
1988-01-01
Friedland's technique for estimating the unknown noise variances of a linear system using multiple observers has been extended by developing a general solution for the estimates of the variances, developing the statistics (mean and standard deviation) of these estimates, and demonstrating the solution on two examples.
An adaptive technique for estimating the atmospheric density profile during the AE mission
NASA Technical Reports Server (NTRS)
Argentiero, P.
1973-01-01
A technique is presented for processing accelerometer data obtained during the AE missions in order to estimate the atmospheric density profile. A minimum variance, adaptive filter is utilized. The trajectory of the probe and probe parameters are in a consider mode where their estimates are unimproved but their associated uncertainties are permitted an impact on filter behavior. Simulations indicate that the technique is effective in estimating a density profile to within a few percentage points.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrews, A H; Kerr, L A; Cailliet, G M
2007-11-04
Canary rockfish (Sebastes pinniger) have long been an important part of recreational and commercial rockfish fishing from southeast Alaska to southern California, but localized stock abundances have declined considerably. Based on age estimates from otoliths and other structures, lifespan estimates vary from about 20 years to over 80 years. For the purpose of monitoring stocks, age composition is routinely estimated by counting growth zones in otoliths; however, age estimation procedures and lifespan estimates remain largely unvalidated. Typical age validation techniques have limited application for canary rockfish because they are deep dwelling and may be long lived. In this study, the unaged otolith of the pair from fish aged at the Department of Fisheries and Oceans Canada was used in one of two age validation techniques: (1) lead-radium dating and (2) bomb radiocarbon (14C) dating. Age estimate accuracy and the validity of age estimation procedures were evaluated based on the results from each technique. Lead-radium dating proved successful in determining a minimum estimate of lifespan of 53 years and provided support for age estimation procedures up to about 50-60 years. These findings were further supported by Δ14C data, which indicated a minimum estimate of lifespan of 44 ± 3 years. Both techniques validate, to differing degrees, age estimation procedures and provide support for inferring that canary rockfish can live more than 80 years.
NASA Astrophysics Data System (ADS)
Gorthi, Sai Siva; Rajshekhar, Gannavarpu; Rastogi, Pramod
2010-06-01
Recently, a high-order instantaneous moments (HIM)-operator-based method was proposed for accurate phase estimation in digital holographic interferometry. The method relies on piece-wise polynomial approximation of phase and subsequent evaluation of the polynomial coefficients from the HIM operator using single-tone frequency estimation. The work presents a comparative analysis of the performance of different single-tone frequency estimation techniques, like Fourier transform followed by optimization, estimation of signal parameters by rotational invariance technique (ESPRIT), multiple signal classification (MUSIC), and iterative frequency estimation by interpolation on Fourier coefficients (IFEIF) in HIM-operator-based methods for phase estimation. Simulation and experimental results demonstrate the potential of the IFEIF technique with respect to computational efficiency and estimation accuracy.
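A baseline single-tone frequency estimator of the kind being compared can be sketched as follows; this FFT-peak-plus-parabolic-interpolation version is only illustrative and is not the IFEIF, ESPRIT, or MUSIC implementation evaluated in the paper.

```python
import numpy as np

def estimate_tone(x, fs):
    """Estimate the frequency of a single tone: FFT peak plus parabolic refinement."""
    X = np.abs(np.fft.rfft(x * np.hanning(len(x))))
    k = int(np.argmax(X))
    if 0 < k < len(X) - 1:                      # quadratic interpolation around the peak
        a, b, c = np.log(X[k - 1:k + 2])
        k = k + 0.5 * (a - c) / (a - 2 * b + c)
    return k * fs / len(x)

fs, n = 1000.0, 4096
t = np.arange(n) / fs
x = np.cos(2 * np.pi * 123.4 * t) + 0.1 * np.random.default_rng(3).normal(size=n)
print(estimate_tone(x, fs))    # should be close to 123.4 Hz
```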
NASA Astrophysics Data System (ADS)
Sehad, Mounir; Lazri, Mourad; Ameur, Soltane
2017-03-01
In this work, a new rainfall estimation technique based on the high spatial and temporal resolution of the Spinning Enhanced Visible and Infra Red Imager (SEVIRI) aboard the Meteosat Second Generation (MSG) satellite is presented. This work proposes an efficient rainfall estimation scheme based on two multiclass support vector machine (SVM) algorithms: SVM_D for daytime and SVM_N for nighttime rainfall estimation. Both SVM models are trained using relevant rainfall parameters based on optical, microphysical and textural cloud properties. The cloud parameters are derived from the spectral channels of the SEVIRI MSG radiometer. The 3-hourly and daily accumulated rainfall are derived from the 15-min rainfall estimates given by the SVM classifiers for each MSG observation image pixel. The SVMs were trained with ground meteorological radar precipitation scenes recorded from November 2006 to March 2007 over the north of Algeria, located in the Mediterranean region. Further, the SVM_D and SVM_N models were used to estimate 3-hourly and daily rainfall using a data set gathered from November 2010 to March 2011 over northern Algeria. The results were validated against collocated rainfall observed by a rain gauge network. Indeed, the statistical scores given by the correlation coefficient, bias, root mean square error and mean absolute error showed good accuracy of rainfall estimates by the present technique. Moreover, rainfall estimates of our technique were compared with two high-accuracy rainfall estimation methods based on MSG SEVIRI imagery, namely a random forests (RF) based approach and an artificial neural network (ANN) based technique. The findings of the present technique indicate a higher correlation coefficient (3-hourly: 0.78; daily: 0.94), and lower mean absolute error and root mean square error values. The results show that the new technique assigns 3-hourly and daily rainfall with better accuracy than the ANN technique and the RF model.
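A schematic of the multiclass SVM classification step might look like the following; the synthetic features, class labels, and kernel settings are assumptions and do not reproduce the SVM_D/SVM_N training setup or the SEVIRI-derived predictors.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n = 600
features = rng.normal(size=(n, 5))    # stand-ins for optical/microphysical/textural predictors
rain_class = ((features[:, 0] + 0.5 * features[:, 1] > 0).astype(int)
              + (features[:, 2] > 1).astype(int))          # three synthetic rain classes

X_tr, X_te, y_tr, y_te = train_test_split(features, rain_class, random_state=0)
clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_tr, y_tr)   # multiclass SVM
print("held-out accuracy:", clf.score(X_te, y_te))
```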
NASA Astrophysics Data System (ADS)
Shrivastava, Akash; Mohanty, A. R.
2018-03-01
This paper proposes a model-based method to estimate single plane unbalance parameters (amplitude and phase angle) in a rotor using a Kalman filter and recursive least squares based input force estimation technique. The Kalman filter based input force estimation technique requires a state-space model and response measurements. A modified system equivalent reduction expansion process (SEREP) technique is employed to obtain a reduced-order model of the rotor system so that limited response measurements can be used. The method is demonstrated using numerical simulations on a rotor-disk-bearing system. Results are presented for different measurement sets including displacement, velocity, and rotational response. Effects of measurement noise level, filter parameters (process noise covariance and forgetting factor), and modeling error are also presented, and it is observed that the unbalance parameter estimation is robust with respect to measurement noise.
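The recursive least squares building block mentioned above can be sketched in a few lines; the regressors, forgetting factor, and signals below are purely illustrative and are not the rotor model or force-estimation filter of the paper.

```python
import numpy as np

def rls(Phi, y, lam=0.98, delta=100.0):
    """Recursive least squares with forgetting factor lam; returns the final parameter estimate."""
    n = Phi.shape[1]
    theta = np.zeros(n)
    P = delta * np.eye(n)
    for phi, yk in zip(Phi, y):
        k = P @ phi / (lam + phi @ P @ phi)     # gain
        theta = theta + k * (yk - phi @ theta)  # parameter update
        P = (P - np.outer(k, phi @ P)) / lam    # covariance update
    return theta

rng = np.random.default_rng(5)
Phi = rng.normal(size=(500, 3))
true_theta = np.array([1.0, -2.0, 0.5])
y = Phi @ true_theta + 0.05 * rng.normal(size=500)
print(rls(Phi, y))    # should approach [1.0, -2.0, 0.5]
```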
Parameter Estimation in Atmospheric Data Sets
NASA Technical Reports Server (NTRS)
Wenig, Mark; Colarco, Peter
2004-01-01
In this study the structure tensor technique is used to estimate dynamical parameters in atmospheric data sets. The structure tensor is a common tool for estimating motion in image sequences. This technique can be extended to estimate other dynamical parameters such as diffusion constants or exponential decay rates. A general mathematical framework was developed for the direct estimation of the physical parameters that govern the underlying processes from image sequences. This estimation technique can be adapted to the specific physical problem under investigation, so it can be used in a variety of applications in trace gas, aerosol, and cloud remote sensing. As a test scenario this technique will be applied to modeled dust data. In this case vertically integrated dust concentrations were used to derive wind information. Those results can be compared to the wind vector fields which served as input to the model. Based on this analysis, a method to compute atmospheric data parameter fields will be presented.
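A toy version of gradient-based motion estimation in this spirit is shown below; it estimates a single global translation between two synthetic frames from the spatiotemporal gradients (normal equations built from the structure tensor), which is a strong simplification of the framework described.

```python
import numpy as np

x, y = np.meshgrid(np.arange(128), np.arange(128))
frame0 = np.sin(2 * np.pi * x / 40.0) + np.cos(2 * np.pi * y / 50.0)              # smooth pattern
frame1 = np.sin(2 * np.pi * (x - 1) / 40.0) + np.cos(2 * np.pi * (y - 1) / 50.0)  # shifted by (1, 1)

Iy, Ix = np.gradient(frame0)            # spatial derivatives
It = frame1 - frame0                    # temporal derivative
# structure-tensor / normal equations for one global velocity (vx, vy)
A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
              [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
print(np.linalg.solve(A, b))            # approximately [1, 1] pixels per frame
```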
Space Vehicle Pose Estimation via Optical Correlation and Nonlinear Estimation
NASA Technical Reports Server (NTRS)
Rakoczy, John M.; Herren, Kenneth A.
2008-01-01
A technique for 6-degree-of-freedom (6DOF) pose estimation of space vehicles is being developed. This technique draws upon recent developments in implementing optical correlation measurements in a nonlinear estimator, which relates the optical correlation measurements to the pose states (orientation and position). For the optical correlator, the use of both conjugate filters and binary, phase-only filters in the design of synthetic discriminant function (SDF) filters is explored. A static neural network is trained a priori and used as the nonlinear estimator. New commercial animation and image rendering software is exploited to design the SDF filters and to generate a large filter set with which to train the neural network. The technique is applied to pose estimation for rendezvous and docking of free-flying spacecraft and to terrestrial surface mobility systems for NASA's Vision for Space Exploration. Quantitative pose estimation performance will be reported. Advantages and disadvantages of the implementation of this technique are discussed.
Accuracy of Noninvasive Estimation Techniques for the State of the Cochlear Amplifier
NASA Astrophysics Data System (ADS)
Dalhoff, Ernst; Gummer, Anthony W.
2011-11-01
Estimation of the function of the cochlea in humans is possible only by deduction from indirect measurements, which may be subjective or objective. Therefore, for basic research as well as diagnostic purposes, it is important to develop methods to deduce and analyse error sources of cochlear-state estimation techniques. Here, we present a model of technical and physiologic error sources contributing to the estimation accuracy of hearing threshold and the state of the cochlear amplifier, and deduce from measurements in humans that the estimated standard deviation can be considerably below 6 dB. Experimental evidence is drawn from two partly independent objective estimation techniques for the auditory signal chain based on measurements of otoacoustic emissions.
Darmawan, M F; Yusuf, Suhaila M; Kadir, M R Abdul; Haron, H
2015-02-01
Sex estimation is used in forensic anthropology to assist the identification of individual remains. However, the estimation techniques tend to be unique and applicable only to a certain population. This paper analyzed sex estimation in living individuals below 19 years of age using the lengths of 19 bones of the left hand with three classification techniques: Discriminant Function Analysis (DFA), Support Vector Machine (SVM) and Artificial Neural Network (ANN) multilayer perceptron. These techniques were carried out on X-ray images of the left hand taken from an Asian population data set. All 19 bones of the left hand were measured using Free Image software, and all the techniques were performed using MATLAB. The "16-19" and "7-9" year age groups could be used for sex estimation, as their average accuracy percentages were above 80%. The ANN model was the best classification technique, with the highest average accuracy percentage in the two age groups compared to the other classification techniques. The results show that each classification technique has its best accuracy percentage in a different age group. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Dave Gartner; Gregory A. Reams
2001-01-01
As Forest Inventory and Analysis changes from a periodic survey to a multipanel annual survey, a transition will occur where only some of the panels have been resurveyed. Several estimation techniques use data from the periodic survey in addition to the data from the partially completed multipanel data. These estimation techniques were compared using data from two...
USDA-ARS's Scientific Manuscript database
Recently, an instrument (TEMPO™) has been developed to automate the Most Probable Number (MPN) technique and reduce the effort required to estimate some bacterial populations. We compared the automated MPN technique to traditional microbiological plating methods or Petrifilm™ for estimating the t...
Ariyama, Kaoru; Kadokura, Masashi; Suzuki, Tadanao
2008-01-01
Techniques to determine the geographic origin of foods have been developed for various agricultural and fishery products, and they have used various principles. Some of these techniques are already in use for checking the authenticity of the labeling. Many are based on multielement analysis and chemometrics. We have developed such a technique to determine the geographic origin of onions (Allium cepa L.). This technique, which determines whether an onion is from outside Japan, is designed for onions labeled as having a geographic origin of Hokkaido, Hyogo, or Saga, the main onion production areas in Japan. However, estimations of discrimination errors for this technique have not been fully conducted; they have been limited to those for discrimination models and do not include analytical errors. Interlaboratory studies were conducted to estimate the analytical errors of the technique. Four collaborators each determined 11 elements (Na, Mg, P, Mn, Zn, Rb, Sr, Mo, Cd, Cs, and Ba) in 4 test materials of fresh and dried onions. Discrimination errors in this technique were estimated by summing (1) individual differences within lots, (2) variations between lots from the same production area, and (3) analytical errors. The discrimination errors for onions from Hokkaido, Hyogo, and Saga were estimated to be 2.3, 9.5, and 8.0%, respectively. Those for onions from abroad in determinations targeting Hokkaido, Hyogo, and Saga were estimated to be 28.2, 21.6, and 21.9%, respectively.
NASA Astrophysics Data System (ADS)
Mahaboob, B.; Venkateswarlu, B.; Sankar, J. Ravi; Balasiddamuni, P.
2017-11-01
This paper uses matrix calculus techniques to obtain the Nonlinear Least Squares Estimator (NLSE), the Maximum Likelihood Estimator (MLE) and a linear pseudo model for a nonlinear regression model. David Pollard and Peter Radchenko [1] explained analytic techniques to compute the NLSE. However, the present research paper introduces an innovative method to compute the NLSE using principles in multivariate calculus. This study is concerned with very new optimization techniques used to compute the MLE and NLSE. Anh [2] derived the NLSE and MLE of a heteroscedastic regression model. Lemcoff [3] discussed a procedure to get a linear pseudo model for a nonlinear regression model. In this research article a new technique is developed to get the linear pseudo model for a nonlinear regression model using multivariate calculus. The linear pseudo model of Edmond Malinvaud [4] has been explained in a very different way in this paper. David Pollard et al. used empirical process techniques to study the asymptotics of the LSE (least-squares estimator) for the fitting of a nonlinear regression function in 2006. In Jae Myung [13] provided a conceptual guide to maximum likelihood estimation in his work "Tutorial on maximum likelihood estimation".
Two ground-based canopy closure estimation techniques, the Spherical Densitometer (SD) and the Vertical Tube (VT), were compared for the effect of deciduous understory on dominant/co-dominant crown closure estimates in even-aged loblolly (Pinus taeda) pine stands located in the N...
On using sample selection methods in estimating the price elasticity of firms' demand for insurance.
Marquis, M Susan; Louis, Thomas A
2002-01-01
We evaluate a technique based on sample selection models that has been used by health economists to estimate the price elasticity of firms' demand for insurance. We demonstrate that this technique produces inflated estimates of the price elasticity. We show that alternative methods lead to valid estimates.
NASA Technical Reports Server (NTRS)
Zimmerman, G. A.; Olsen, E. T.
1992-01-01
Noise power estimation in the High-Resolution Microwave Survey (HRMS) sky survey element is considered as an example of a constant false alarm rate (CFAR) signal detection problem. Order-statistic-based noise power estimators for CFAR detection are considered in terms of required estimator accuracy and estimator dynamic range. By limiting the dynamic range of the value to be estimated, the performance of an order-statistic estimator can be achieved by simpler techniques requiring only a single pass of the data. Simple threshold-and-count techniques are examined, and it is shown how several parallel threshold-and-count estimation devices can be used to expand the dynamic range to meet HRMS system requirements with minimal hardware complexity. An input/output (I/O) efficient limited-precision order-statistic estimator with wide but limited dynamic range is also examined.
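The order-statistic and threshold-and-count ideas can be illustrated on exponentially distributed power samples (square-law detected Gaussian noise), as in the sketch below; the threshold, sample size, and noise power are arbitrary assumptions, not HRMS system values.

```python
import numpy as np

rng = np.random.default_rng(7)
sigma2 = 3.0                                   # true noise power
power = rng.exponential(scale=sigma2, size=10000)

est_median = np.median(power) / np.log(2)      # order-statistic (median) estimator

threshold = 2.0                                # single-pass threshold-and-count estimator
frac_below = np.mean(power < threshold)        # P(power < T) = 1 - exp(-T / sigma2)
est_count = -threshold / np.log(1 - frac_below)

print(est_median, est_count)                   # both should be near 3.0
```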
Three-dimensional ultrasound strain imaging of skeletal muscles
NASA Astrophysics Data System (ADS)
Gijsbertse, K.; Sprengers, A. M. J.; Nillesen, M. M.; Hansen, H. H. G.; Lopata, R. G. P.; Verdonschot, N.; de Korte, C. L.
2017-01-01
In this study, a multi-dimensional strain estimation method is presented to assess local relative deformation in three orthogonal directions in 3D space of skeletal muscles during voluntary contractions. A rigid translation and compressive deformation of a block phantom, that mimics muscle contraction, is used as experimental validation of the 3D technique and to compare its performance with respect to a 2D based technique. Axial, lateral and (in case of 3D) elevational displacements are estimated using a cross-correlation based displacement estimation algorithm. After transformation of the displacements to a Cartesian coordinate system, strain is derived using a least-squares strain estimator. The performance of both methods is compared by calculating the root-mean-squared error of the estimated displacements with the calculated theoretical displacements of the phantom experiments. We observe that the 3D technique delivers more accurate displacement estimations compared to the 2D technique, especially in the translation experiment where out-of-plane motion hampers the 2D technique. In vivo application of the 3D technique in the musculus vastus intermedius shows good resemblance between measured strain and the force pattern. Similarity of the strain curves of repetitive measurements indicates the reproducibility of voluntary contractions. These results indicate that 3D ultrasound is a valuable imaging tool to quantify complex tissue motion, especially when there is motion in three directions, which results in out-of-plane errors for 2D techniques.
NASA Astrophysics Data System (ADS)
Röbke, B. R.; Vött, A.
2017-12-01
With human activity increasingly concentrating on coasts, tsunamis (from Japanese tsu = harbour, nami = wave) are a major natural hazard to today's society. Stimulated by disastrous tsunami impacts in recent years, for instance in south-east Asia (2004) or in Japan (2011), tsunami science has significantly flourished, which has brought great advances in hazard assessment and mitigation plans. Based on tsunami research of the last decades, this paper provides a thorough treatise on the tsunami phenomenon from a geoscientific point of view. Starting with the wave features, tsunamis are introduced as long shallow water waves or wave trains crossing entire oceans without major energy loss. At the coast, tsunamis typically show wave shoaling, funnelling and resonance effects as well as a significant run-up and backflow. Tsunami waves are caused by a sudden displacement of the water column due to a number of various trigger mechanisms. Such are earthquakes as the main trigger, submarine and subaerial mass wastings, volcanic activity, atmospheric disturbances (meteotsunamis) and cosmic impacts, as is demonstrated by giving corresponding examples from the past. Tsunamis are known to have a significant sedimentary and geomorphological off- and onshore response. So-called tsunamites form allochthonous high-energy deposits that are left at the coast during tsunami landfall. Tsunami deposits show typical sedimentary features, as basal erosional unconformities, fining-upward and -landward, a high content of marine fossils, rip-up clasts from underlying units and mud caps, all reflecting the hydrodynamic processes during inundation. The on- and offshore behaviour of tsunamis and related sedimentary processes can be simulated using hydro- and morphodynamic numerical models. The paper provides an overview of the basic tsunami modelling techniques, including discretisation, guidelines for appropriate temporal and spatial resolution as well as the nesting method. Furthermore, the Boussinesq approximation-a simplification of the three-dimensional Navier-Stokes equations-is presented as a basic theory behind numerical tsunami models, which adequately reflects the non-linear, dispersive wave behaviour of tsunamis. The fully non-linear Boussinesq equations allow the simulation of tsunamis e.g. in the form of N-waves. Based on the various subtopics presented, recommendations for future multidisciplinary tsunami research are made. It is especially discussed how the combination of sedimentary and geomorphological tsunami field traces and numerical modelling techniques can contribute to derive locally relevant tsunami sources and to improve the assessment of tsunami hazards considering the individual pre-/history and physiogeographical setting of a specific region.
McAllister, James P; Williams, Michael A; Walker, Marion L; Kestle, John R W; Relkin, Norman R; Anderson, Amy M; Gross, Paul H; Browd, Samuel R
2015-12-01
Building on previous National Institutes of Health-sponsored symposia on hydrocephalus research, "Opportunities for Hydrocephalus Research: Pathways to Better Outcomes" was held in Seattle, Washington, July 9-11, 2012. Plenary sessions were organized into four major themes, each with two subtopics: Causes of Hydrocephalus (Genetics and Pathophysiological Modifications); Diagnosis of Hydrocephalus (Biomarkers and Neuroimaging); Treatment of Hydrocephalus (Bioengineering Advances and Surgical Treatments); and Outcome in Hydrocephalus (Neuropsychological and Neurological). International experts gave plenary talks, and extensive group discussions were held for each of the major themes. The conference emphasized patient-centered care and translational research, with the main objective to arrive at a consensus on priorities in hydrocephalus that have the potential to impact patient care in the next 5 years. The current state of hydrocephalus research and treatment was presented, and the following priorities for research were recommended for each theme. 1) Causes of Hydrocephalus-CSF absorption, production, and related drug therapies; pathogenesis of human hydrocephalus; improved animal and in vitro models of hydrocephalus; developmental and macromolecular transport mechanisms; biomechanical changes in hydrocephalus; and age-dependent mechanisms in the development of hydrocephalus. 2) Diagnosis of Hydrocephalus-implementation of a standardized set of protocols and a shared repository of technical information; prospective studies of multimodal techniques including MRI and CSF biomarkers to test potential pharmacological treatments; and quantitative and cost-effective CSF assessment techniques. 3) Treatment of Hydrocephalus-improved bioengineering efforts to reduce proximal catheter and overall shunt failure; external or implantable diagnostics and support for the biological infrastructure research that informs these efforts; and evidence-based surgical standardization with longitudinal metrics to validate or refute implemented practices, procedures, or tests. 4) Outcome in Hydrocephalus-development of specific, reliable batteries with metrics focused on the hydrocephalic patient; measurements of neurocognitive outcome and quality-of-life measures that are adaptable, trackable across the growth spectrum, and applicable cross-culturally; development of comparison metrics against normal aging and sensitive screening tools to diagnose idiopathic normal pressure hydrocephalus against appropriate normative age-based data; better understanding of the incidence and prevalence of hydrocephalus within both pediatric and adult populations; and comparisons of aging patterns in adults with hydrocephalus against normal aging patterns.
Simulation studies of wide and medium field of view earth radiation data analysis
NASA Technical Reports Server (NTRS)
Green, R. N.
1978-01-01
A parameter estimation technique is presented to estimate the radiative flux distribution over the earth from radiometer measurements at satellite altitude. The technique analyzes measurements from a wide field of view (WFOV), horizon to horizon, nadir pointing sensor with a mathematical technique to derive the radiative flux estimates at the top of the atmosphere for resolution elements smaller than the sensor field of view. A computer simulation of the data analysis technique is presented for both earth-emitted and reflected radiation. Zonal resolutions are considered as well as the global integration of plane flux. An estimate of the equator-to-pole gradient is obtained from the zonal estimates. Sensitivity studies of the derived flux distribution to directional model errors are also presented. In addition to the WFOV results, medium field of view results are presented.
A photographic technique for estimating egg density of the white pine weevil, Pissodes strobi (Peck)
Roger T. Zerillo
1975-01-01
Compares a photographic technique with visual and dissection techniques for estimating egg density of the white pine weevil, Pissodes strobi (Peck). The relatively high correlations (.67 and .79) between counts from photographs and those obtained by dissection indicate that the non-destructive photographic technique could be a useful tool for...
Comparative evaluation of workload estimation techniques in piloting tasks
NASA Technical Reports Server (NTRS)
Wierwille, W. W.
1983-01-01
Techniques to measure operator workload in a wide range of situations and tasks were examined. The sensitivity and intrusion of a wide variety of workload assessment techniques in simulated piloting tasks were investigated. Four different piloting tasks, psychomotor, perceptual, mediational, and communication aspects of piloting behavior were selected. Techniques to determine relative sensitivity and intrusion were applied. Sensitivity is the relative ability of a workload estimation technique to discriminate statistically significant differences in operator loading. High sensitivity requires discriminable changes in score means as a function of load level and low variation of the scores about the means. Intrusion is an undesirable change in the task for which workload is measured, resulting from the introduction of the workload estimation technique or apparatus.
Michael E. Goerndt; Vicente J. Monleon; Hailemariam Temesgen
2011-01-01
One of the challenges often faced in forestry is the estimation of forest attributes for smaller areas of interest within a larger population. Small-area estimation (SAE) is a set of techniques well suited to estimation of forest attributes for small areas in which the existing sample size is small and auxiliary information is available. Selected SAE methods were...
Improved Estimates of Thermodynamic Parameters
NASA Technical Reports Server (NTRS)
Lawson, D. D.
1982-01-01
Techniques were refined for estimating heat of vaporization and other parameters from molecular structure. Using a parabolic equation with three adjustable parameters, the heat of vaporization can be used to estimate the boiling point, and vice versa. Boiling points and vapor pressures for some nonpolar liquids were estimated by the improved method and compared with previously reported values. The technique for estimating thermodynamic parameters should make it easier for engineers to choose among candidate heat-exchange fluids for thermochemical cycles.
Combining graphene with silicon carbide: synthesis and properties - a review
NASA Astrophysics Data System (ADS)
Shtepliuk, Ivan; Khranovskyy, Volodymyr; Yakimova, Rositsa
2016-11-01
Being a true two-dimensional crystal, graphene possesses a lot of exotic properties that would enable unique applications. Integration of graphene with inorganic semiconductors, e.g. silicon carbide (SiC) promotes the birth of a class of hybrid materials which are highly promising for development of novel operations, since they combine the best properties of two counterparts in the frame of one hybrid platform. As a specific heterostructure, graphene on SiC performs strongly, dependent on the synthesis method and the growth modes. In this article, a comprehensive review of the most relevant studies of graphene growth methods and mechanisms on SiC substrates has been carried out. The aim is to elucidate the basic physical processes that are responsible for the formation of graphene on SiC. First, an introduction is made covering some intriguing and not so often discussed properties of graphene. Then, we focus on integration of graphene with SiC, which is facilitated by the nature of SiC to assume graphitization. Concerning the synthesis methods, we discuss thermal decomposition of SiC, chemical vapor deposition and molecular beam epitaxy, stressing that the first technique is the most common one when SiC substrates are used. In addition, we briefly appraise graphene synthesis via metal mediated carbon segregation. We address in detail the main aspects of the substrate effect, such as substrate face polarity, off-cut, kind of polytype and nonpolar surfaces on the growth of graphene layers. A comparison of graphene grown on the polar faces is made. In particular, growth of graphene on Si-face SiC is critically analyzed concerning growth kinetics and growth mechanisms taking into account the specific characteristics of SiC (0001) surfaces, such as the step-terrace structure and the unavoidable surface reconstruction upon heating. In all subtopics obstacles and solutions are featured. We complete the review with a short summary and concluding remarks.
A comparison of minimum distance and maximum likelihood techniques for proportion estimation
NASA Technical Reports Server (NTRS)
Woodward, W. A.; Schucany, W. R.; Lindsey, H.; Gray, H. L.
1982-01-01
The estimation of mixing proportions p_1, p_2, ..., p_m in the mixture density f(x) = Σ_{i=1}^{m} p_i f_i(x) is often encountered in agricultural remote sensing problems, in which case the p_i's usually represent crop proportions. In these remote sensing applications, component densities f_i(x) have typically been assumed to be normally distributed, and parameter estimation has been accomplished using maximum likelihood (ML) techniques. Minimum distance (MD) estimation is examined as an alternative to ML where, in this investigation, both procedures are based upon normal components. Results indicate that ML techniques are superior to MD when component distributions actually are normal, while MD estimation provides better estimates than ML under symmetric departures from normality. When component distributions are not symmetric, however, it is seen that neither of these normal based techniques provides satisfactory results.
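For the special case of known normal components, the ML estimate of the mixing proportions can be computed with a short EM iteration, as sketched below; the two-component setup and parameters are illustrative assumptions, and the minimum distance alternative is not shown.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(8)
true_p = 0.3
x = np.concatenate([rng.normal(0, 1, int(true_p * 5000)),
                    rng.normal(3, 1, int((1 - true_p) * 5000))])

f = np.column_stack([norm.pdf(x, 0, 1), norm.pdf(x, 3, 1)])   # known component densities
p = np.array([0.5, 0.5])
for _ in range(100):
    resp = f * p
    resp /= resp.sum(axis=1, keepdims=True)    # posterior component probabilities
    p = resp.mean(axis=0)                      # ML (EM) update of the proportions
print(p)                                       # close to [0.3, 0.7]
```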
Psychological first aid training for the faith community: a model curriculum.
McCabe, O Lee; Lating, Jeffrey M; Everly, George S; Mosley, Adrian M; Teague, Paula J; Links, Jonathan M; Kaminsky, Michael J
2007-01-01
Traditionally faith communities have served important roles in helping survivors cope in the aftermath of public health disasters. However, the provision of optimally effective crisis intervention services for persons experiencing acute or prolonged emotional trauma following such incidents requires specialized knowledge, skills, and abilities. Supported by a federally-funded grant, several academic health centers and faith-based organizations collaborated to develop a training program in Psychological First Aid (PFA) and disaster ministry for members of the clergy serving urban minorities and Latino immigrants in Baltimore, Maryland. This article describes the one-day training curriculum composed of four content modules: Stress Reactions of Mind-Body-Spirit, Psychological First Aid and Crisis Intervention, Pastoral Care and Disaster Ministry, and Practical Resources and Self Care for the Spiritual Caregiver. Detailed descriptions of each module are provided, including its purpose; rationale and background literature; learning objectives; topics and sub-topics; and educational methods, materials and resources. The strengths, weaknesses, and future applications of the training template are discussed from the vantage points of participants' subjective reactions to the training.
Digging into the Public's Astronomy Interests
NASA Astrophysics Data System (ADS)
Miller, Scott; Simpson, R.; Gay, P.
2009-05-01
The astronomy community is good at sharing what we feel is most important or most interesting to the public via press releases, and we can get a sense of what the media wants from what they select to publish. Understanding exactly what the public most enjoys, however, was until recently mostly a matter of guesswork. Social networking sites now provide a place for people to publicly indicate what they like. For instance, the site http://www.digg.com allows people to submit links to interesting content on any subject, and people can add "Diggs" to the linked page. As articles gain more and more Diggs, they rise through the ranks, with the overall highest ranked items appearing on the Digg homepage, and the highest ranked in a variety of topics, including both "Science" and the sub-topic "Space." In this poster we look at one month of data from Digg and study what astronomy subjects the public selects to Digg, what items have the most staying power, and compare what is Dugg to what is released via press releases during the same period.
Availability of online educational content concerning topics of animal welfare.
Petervary, Nicolette; Allen, Tim; Stokes, William S; Banks, Ron E
2016-05-01
Animal welfare is an important area of study for professionals in fields of animal care and use, and many turn to self-learning resources to gain a better understanding of topics in this area. We assessed the state of these self-learning resources by evaluating open access, freely available resources on the internet with respect to their content and the reliability of their information. We categorized content using a modified list of the topics described in the American College of Animal Welfare's Role Delineation Document, and we identified subject areas that are underrepresented among freely available resources. We identified that the field needs more content describing practical information on subtopics of animal transportation, humane education and economic issues in animal welfare. We also suggest a targeted approach to improve and increase particular aspects of content that concerns the impacts of human, animal and environment interactions on animal welfare. We recommend that veterinary societies place more emphasis on welfare policies in their websites. Additionally, the field of animal welfare would benefit from more available and authoritative information on certain species and uses of animals that are presently underrepresented.
Remote Sensing of Aircraft Contrails Using a Field Portable Digital Array Scanned Interferometer
NASA Technical Reports Server (NTRS)
Smith, William Hayden
1997-01-01
With a Digital Array Scanned Interferometer (DASI), we have obtained proof-of-concept observations with which we demonstrate DASI capabilities for the determination of contrail properties. These include the measurement of cloud and soot microphysical parameters, as well as the abundances of specific pollutant species such as SO_x or NO_x. From high quality hyperspectral data, and using radiative transfer methods and atmospheric chemistry analysis in the data reduction and interpretation, powerful inferences concerning cloud formation, evolution and dissipation can be made. Under this sub-topic, we will integrate DASI with computer controlled scanning of the field-of-view to direct the sensor towards contrails and exhaust plumes for tracking the emitting vehicles. The optimum DASI wavelength sensitivity range for sensing contrails is 0.35 - 2.5 micron. DASI deploys on the ground or from aircraft to observe contrails in the vicinity. This enables rapid, accurate measurement of the temporal, spatial, and chemical evolution of contrails (or other plumes or exhaust sources) with a low cost, efficient sensor.
Development of a Two-Wheel Contingency Mode for the MAP Spacecraft
NASA Technical Reports Server (NTRS)
Starin, Scott R.; ODonnell, James R., Jr.; Bauer, Frank H. (Technical Monitor)
2002-01-01
In the event of a failure of one of MAP's three reaction wheel assemblies (RWAs), it is not possible to achieve three-axis, full-state attitude control using the remaining two wheels. Hence, two of the attitude control algorithms implemented on the MAP spacecraft will no longer be usable in their current forms: Inertial Mode, used for slewing to and holding inertial attitudes, and Observing Mode, which implements the nominal dual-spin science mode. This paper describes the effort to create a complete strategy for using software algorithms to cope with an RWA failure. The discussion of the design process will be divided into three main subtopics: performing orbit maneuvers to reach and maintain an orbit about the second Earth-Sun libration point in the event of an RWA failure, completing the mission using a momentum-bias two-wheel science mode, and developing a new thruster-based mode for adjusting the inertially fixed momentum bias. In this summary, the philosophies used in designing these changes are shown; the full paper will supplement these with algorithm descriptions and testing results.
NASA Astrophysics Data System (ADS)
GonzáLez, Pablo J.; FernáNdez, José
2011-10-01
Interferometric Synthetic Aperture Radar (InSAR) is a reliable technique for measuring crustal deformation. However, despite its long application to geophysical problems, its error estimation has been largely overlooked. Currently, the largest problem with InSAR is still the atmospheric propagation errors, which is why multitemporal interferometric techniques using a series of interferograms have been successfully developed. However, none of the standard multitemporal interferometric techniques, namely PS or SB (Persistent Scatterers and Small Baselines, respectively), provides an estimate of its precision. Here, we present a method to compute reliable estimates of the precision of the deformation time series. We implement it for the SB multitemporal interferometric technique (a favorable technique for natural terrains, the most usual target of geophysical applications). We describe the method, which uses a properly weighted scheme that allows us to compute estimates for all interferogram pixels, enhanced by a Monte Carlo resampling technique that properly propagates the interferogram errors (variance-covariances) into the unknown parameters (estimated errors for the displacements). We apply the multitemporal error estimation method to Lanzarote Island (Canary Islands), where no active magmatic activity has been reported in the last decades. We detect deformation around Timanfaya volcano (lengthening of the line of sight, i.e., subsidence), where the last eruption occurred in 1730-1736. Deformation closely follows the surface temperature anomalies, indicating that magma crystallization (cooling and contraction) of the 300-year-old shallow magmatic body under Timanfaya volcano is still ongoing.
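A toy single-pixel version of the weighted least-squares inversion with Monte Carlo propagation of interferogram errors is sketched below; the interferogram network, noise levels, and displacement history are invented for illustration and do not correspond to the Lanzarote data set.

```python
import numpy as np

rng = np.random.default_rng(9)
dates = np.arange(6)                                  # 6 acquisitions
pairs = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (2, 4), (4, 5), (3, 5)]
true_disp = 0.4 * dates                               # linear LOS motion (cm), date 0 fixed to 0

A = np.zeros((len(pairs), len(dates) - 1))            # unknowns: displacement at dates 1..5
for k, (i, j) in enumerate(pairs):
    if j > 0: A[k, j - 1] += 1
    if i > 0: A[k, i - 1] -= 1
sigma = rng.uniform(0.05, 0.2, len(pairs))            # per-interferogram noise std (cm)
obs = A @ true_disp[1:] + rng.normal(0, sigma)

W = np.diag(1 / sigma**2)                             # properly weighted least squares
At_W = A.T @ W
d_hat = np.linalg.solve(At_W @ A, At_W @ obs)

samples = []                                          # Monte Carlo resampling of the noise
for _ in range(500):
    obs_mc = obs + rng.normal(0, sigma)
    samples.append(np.linalg.solve(At_W @ A, At_W @ obs_mc))
print(d_hat)                                          # displacement estimates
print(np.std(samples, axis=0))                        # their propagated standard errors
```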
Noise estimation for hyperspectral imagery using spectral unmixing and synthesis
NASA Astrophysics Data System (ADS)
Demirkesen, C.; Leloglu, Ugur M.
2014-10-01
Most hyperspectral image (HSI) processing algorithms assume a signal-to-noise ratio model in their formulation, which makes them dependent on accurate noise estimation. Many techniques have been proposed to estimate the noise. A very comprehensive comparative study on the subject was done by Gao et al. [1]. In a nutshell, most techniques are based on the idea of calculating the standard deviation from assumed-to-be homogeneous regions in the image. Some of these algorithms work on a regular grid parameterized with a window size w, while others make use of image segmentation in order to obtain homogeneous regions. This study focuses not only on the statistics of the noise but on the estimation of the noise itself. A noise estimation technique motivated by a recent HSI de-noising approach [2] is proposed in this study. The de-noising algorithm is based on estimation of the end-members and their fractional abundances using the non-negative least squares method. The end-members are extracted using the well-known simplex volume optimization technique called NFINDR after manual selection of the number of end-members, and the image is reconstructed using the estimated end-members and abundances. Actually, image de-noising and noise estimation are two sides of the same coin: once we de-noise an image, we can estimate the noise by calculating the difference between the de-noised image and the original noisy image. In this study, the noise is estimated as described above. To assess the accuracy of this method, the methodology in [1] is followed, i.e., synthetic images are created by mixing end-member spectra and noise. Since the best performing method for noise estimation was spectral and spatial de-correlation (SSDC), originally proposed in [3], the proposed method is compared to SSDC. The results of the experiments conducted with synthetic HSIs suggest that the proposed noise estimation strategy outperforms the existing techniques in terms of the mean and standard deviation of the absolute error of the estimated noise. Finally, it is shown that the proposed technique demonstrates robust behavior with respect to changes in its single parameter, namely the number of end-members.
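The unmix-and-reconstruct residual idea can be condensed into the following sketch, which assumes the end-members are already known (the NFINDR extraction step is omitted) and uses synthetic spectra and noise rather than real HSI data.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(10)
bands, n_end, n_pix = 50, 3, 200
E = np.abs(rng.normal(1.0, 0.3, size=(bands, n_end)))       # synthetic end-member spectra
ab = rng.dirichlet(np.ones(n_end), size=n_pix)               # true fractional abundances
noise_true = rng.normal(0, 0.02, size=(n_pix, bands))
Y = ab @ E.T + noise_true                                     # noisy pixel spectra

residuals = np.empty_like(Y)
for i, y in enumerate(Y):
    a_hat, _ = nnls(E, y)                                     # non-negative abundance estimate
    residuals[i] = y - E @ a_hat                              # reconstruction residual = noise estimate

print(noise_true.std(), residuals.std())                      # should be of comparable magnitude
```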
Boyle, John J.; Kume, Maiko; Wyczalkowski, Matthew A.; Taber, Larry A.; Pless, Robert B.; Xia, Younan; Genin, Guy M.; Thomopoulos, Stavros
2014-01-01
When mechanical factors underlie growth, development, disease or healing, they often function through local regions of tissue where deformation is highly concentrated. Current optical techniques to estimate deformation can lack precision and accuracy in such regions due to challenges in distinguishing a region of concentrated deformation from an error in displacement tracking. Here, we present a simple and general technique for improving the accuracy and precision of strain estimation and an associated technique for distinguishing a concentrated deformation from a tracking error. The strain estimation technique improves accuracy relative to other state-of-the-art algorithms by directly estimating strain fields without first estimating displacements, resulting in a very simple method and low computational cost. The technique for identifying local elevation of strain enables for the first time the successful identification of the onset and consequences of local strain concentrating features such as cracks and tears in a highly strained tissue. We apply these new techniques to demonstrate a novel hypothesis in prenatal wound healing. More generally, the analytical methods we have developed provide a simple tool for quantifying the appearance and magnitude of localized deformation from a series of digital images across a broad range of disciplines. PMID:25165601
Estimation of Dynamical Parameters in Atmospheric Data Sets
NASA Technical Reports Server (NTRS)
Wenig, Mark O.
2004-01-01
In this study a new technique is used to derive dynamical parameters from atmospheric data sets. This technique, called the structure tensor technique, can be used to estimate dynamical parameters such as motion, source strengths, diffusion constants or exponential decay rates. A general mathematical framework was developed for the direct estimation of the physical parameters that govern the underlying processes from image sequences. This estimation technique can be adapted to the specific physical problem under investigation, so it can be used in a variety of applications in trace gas, aerosol, and cloud remote sensing. The fundamental algorithm will be extended to the analysis of multi-channel (e.g. multiple trace gas) image sequences and to provide solutions to the extended aperture problem. In this study sensitivity studies have been performed to determine the usability of this technique for data sets with different resolutions in time and space and different dimensions.
Estimation variance bounds of importance sampling simulations in digital communication systems
NASA Technical Reports Server (NTRS)
Lu, D.; Yao, K.
1991-01-01
In practical applications of importance sampling (IS) simulation, two basic problems are encountered, that of determining the estimation variance and that of evaluating the proper IS parameters needed in the simulations. The authors derive new upper and lower bounds on the estimation variance which are applicable to IS techniques. The upper bound is simple to evaluate and may be minimized by the proper selection of the IS parameter. Thus, lower and upper bounds on the improvement ratio of various IS techniques relative to the direct Monte Carlo simulation are also available. These bounds are shown to be useful and computationally simple to obtain. Based on the proposed technique, one can readily find practical suboptimum IS parameters. Numerical results indicate that these bounding techniques are useful for IS simulations of linear and nonlinear communication systems with intersymbol interference in which bit error rate and IS estimation variances cannot be obtained readily using prior techniques.
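As a concrete point of reference for the discussion above, the short Python sketch below contrasts direct Monte Carlo with a mean-shifted importance sampling estimator for a BPSK bit-error probability in additive white Gaussian noise; the shift c stands in for the IS parameter whose selection the derived bounds are meant to guide. The setup and numbers are illustrative and are not taken from the paper.

# Minimal sketch (not the authors' bounds): direct MC vs. importance sampling
# for a BPSK bit-error probability in AWGN, with empirical estimator variances.
import numpy as np

rng = np.random.default_rng(1)
snr_db, n = 8.0, 200_000
sigma = np.sqrt(1.0 / (2 * 10 ** (snr_db / 10)))   # noise std for unit-energy bits

# Direct Monte Carlo: an error occurs when noise pushes a +1 bit below zero.
noise = rng.normal(0.0, sigma, n)
mc_indicator = (1.0 + noise < 0).astype(float)
p_mc, var_mc = mc_indicator.mean(), mc_indicator.var(ddof=1) / n

# Importance sampling: draw noise from N(-c, sigma^2) and reweight by f/g.
c = 1.0                                             # IS shift parameter
noise_is = rng.normal(-c, sigma, n)
weights = np.exp((-noise_is**2 + (noise_is + c) ** 2) / (2 * sigma**2))
is_sample = (1.0 + noise_is < 0) * weights
p_is, var_is = is_sample.mean(), is_sample.var(ddof=1) / n

print(f"MC estimate {p_mc:.2e}  IS estimate {p_is:.2e}")
print(f"variance improvement ratio ~ {var_mc / var_is:.1f}")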
Novel Application of Density Estimation Techniques in Muon Ionization Cooling Experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mohayai, Tanaz Angelina; Snopok, Pavel; Neuffer, David
The international Muon Ionization Cooling Experiment (MICE) aims to demonstrate muon beam ionization cooling for the first time and constitutes a key part of the R&D towards a future neutrino factory or muon collider. Beam cooling reduces the size of the phase space volume occupied by the beam. Non-parametric density estimation techniques allow very precise calculation of the muon beam phase-space density and its increase as a result of cooling. These density estimation techniques are investigated in this paper and applied in order to estimate the reduction in muon beam size in MICE under various conditions.
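A minimal illustration of the non-parametric approach mentioned above: kernel density estimation applied to a simulated two-dimensional phase space, with the core density compared before and after an artificial reduction in beam size. SciPy's Gaussian KDE with Scott's-rule bandwidth is an assumption of this sketch, not necessarily the estimator used in MICE.

# Sketch: kernel density estimation of a 2D beam phase space (x, x').
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)

def peak_density(emittance, n=5000):
    # Sample an uncorrelated Gaussian beam with the given RMS size.
    pts = rng.normal(0.0, np.sqrt(emittance), size=(2, n))
    kde = gaussian_kde(pts)               # bandwidth chosen by Scott's rule
    return kde(np.zeros((2, 1)))[0]       # estimated density at the beam core

before, after = peak_density(1.0), peak_density(0.8)
print(f"core density before {before:.3f}, after cooling {after:.3f}")
print(f"relative increase {100 * (after / before - 1):.0f}%")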
Development of the One-Sided Nonlinear Adaptive Doppler Shift Estimation
NASA Technical Reports Server (NTRS)
Beyon, Jeffrey Y.; Koch, Grady J.; Singh, Upendra N.; Kavaya, Michael J.; Serror, Judith A.
2009-01-01
The new development of a one-sided nonlinear adaptive Doppler shift estimation technique (NADSET) is introduced. The background of the algorithm and a brief overview of NADSET are presented. The new technique is applied to the wind parameter estimates from a 2-micron wavelength coherent Doppler lidar system called VALIDAR located at NASA Langley Research Center in Virginia. The new technique enhances wind parameters such as Doppler shift and power estimates in low Signal-To-Noise-Ratio (SNR) regimes using the estimates in high SNR regimes as the algorithm scans the range bins from low to high altitude. The original NADSET utilizes the statistics in both the lower and the higher range bins to refine the wind parameter estimates in between. The results of the two different approaches of NADSET are compared.
An angle-dependent estimation of CT x-ray spectrum from rotational transmission measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Yuan, E-mail: yuan.lin@duke.edu; Samei, Ehsan; Ramirez-Giraldo, Juan Carlos
2014-06-15
Purpose: Computed tomography (CT) performance as well as dose and image quality is directly affected by the x-ray spectrum. However, the current assessment approaches of the CT x-ray spectrum require costly measurement equipment and complicated operational procedures, and are often limited to the spectrum corresponding to the center of rotation. In order to address these limitations, the authors propose an angle-dependent estimation technique, where the incident spectra across a wide range of angular trajectories can be estimated accurately with only a single phantom and a single axial scan in the absence of the knowledge of the bowtie filter. Methods: The proposed technique uses a uniform cylindrical phantom, made of ultra-high-molecular-weight polyethylene and positioned in an off-centered geometry. The projection data acquired with an axial scan have a twofold purpose. First, they serve as a reflection of the transmission measurements across different angular trajectories. Second, they are used to reconstruct the cross sectional image of the phantom, which is then utilized to compute the intersection length of each transmission measurement. With each CT detector element recording a range of transmission measurements for a single angular trajectory, the spectrum is estimated for that trajectory. A data conditioning procedure is used to combine information from hundreds of collected transmission measurements to accelerate the estimation speed, to reduce noise, and to improve estimation stability. The proposed spectral estimation technique was validated experimentally using a clinical scanner (Somatom Definition Flash, Siemens Healthcare, Germany) with spectra provided by the manufacturer serving as the comparison standard. Results obtained with the proposed technique were compared against those obtained from a second conventional transmission measurement technique with two materials (i.e., Cu and Al). After validation, the proposed technique was applied to measure spectra from the clinical system across a range of angular trajectories [−15°, 15°] and spectrum settings (80, 100, 120, 140 kVp). Results: At 140 kVp, the proposed technique was comparable to the conventional technique in terms of the mean energy difference (MED, −0.29 keV) and the normalized root mean square difference (NRMSD, 0.84%) from the comparison standard compared to 0.64 keV and 1.56%, respectively, with the conventional technique. The average absolute MEDs and NRMSDs across kVp settings and angular trajectories were less than 0.61 keV and 3.41%, respectively, which indicates a high level of estimation accuracy and stability. Conclusions: An angle-dependent estimation technique of CT x-ray spectra from rotational transmission measurements was proposed. Compared with the conventional technique, the proposed method simplifies the measurement procedures and enables incident spectral estimation for a wide range of angular trajectories. The proposed technique is suitable for rigorous research objectives as well as routine clinical quality control procedures.
Estimating propagation velocity through a surface acoustic wave sensor
Xu, Wenyuan; Huizinga, John S.
2010-03-16
Techniques are described for estimating the propagation velocity through a surface acoustic wave sensor. In particular, techniques which measure and exploit a proper segment of phase frequency response of the surface acoustic wave sensor are described for use as a basis of bacterial detection by the sensor. As described, use of velocity estimation based on a proper segment of phase frequency response has advantages over conventional techniques that use phase shift as the basis for detection.
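For illustration only, and not the patented procedure itself: if the phase response over a well-behaved segment follows phi(f) = 2*pi*f*L/v, with L the acoustic path length and v the propagation velocity, then v can be recovered from the slope of the unwrapped phase versus frequency. The path length, velocity, frequency segment, and noise level below are all assumed values.

# Sketch: propagation velocity from the slope of phase vs. frequency.
import numpy as np

L_path = 8e-3                               # acoustic path length [m] (assumed)
v_true = 3980.0                             # SAW propagation velocity [m/s] (assumed)
f = np.linspace(95e6, 105e6, 401)           # frequency segment [Hz]
rng = np.random.default_rng(3)

true_phase = 2 * np.pi * f * L_path / v_true
measured = np.angle(np.exp(1j * true_phase)) + rng.normal(0, 0.02, f.size)  # wrapped, noisy

slope = np.polyfit(f, np.unwrap(measured), 1)[0]    # d(phi)/df over the segment
v_est = 2 * np.pi * L_path / slope
print(f"estimated velocity: {v_est:.0f} m/s (true {v_true:.0f} m/s)")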
We developed a technique for assessing the accuracy of sub-pixel derived estimates of impervious surface extracted from LANDSAT TM imagery. We utilized spatially coincident sub-pixel derived impervious surface estimates, high-resolution planimetric GIS data, vector-to-r...
Forest inventory and stratified estimation: a cautionary note
John Coulston
2008-01-01
The Forest Inventory and Analysis (FIA) Program uses stratified estimation techniques to produce estimates of forest attributes. Stratification must be unbiased and stratification procedures should be examined to identify any potential bias. This note explains simple techniques for identifying potential bias, discriminating between sample bias and stratification bias,...
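The sketch below shows a generic post-stratified estimator of the kind the note discusses (it is not FIA production code): plot observations are grouped into strata, stratum means are combined with areal weights, and a deliberately biased set of weights illustrates how a flawed stratification shifts the estimate. All values are synthetic.

# Sketch: stratified mean and standard error, with unbiased vs. biased weights.
import numpy as np

def stratified_estimate(y_by_stratum, weights):
    means = np.array([np.mean(y) for y in y_by_stratum])
    vars_ = np.array([np.var(y, ddof=1) / len(y) for y in y_by_stratum])
    est = np.sum(weights * means)                 # stratified mean
    se = np.sqrt(np.sum(weights**2 * vars_))      # its standard error
    return est, se

rng = np.random.default_rng(4)
forest = rng.normal(120.0, 30.0, 60)     # e.g. volume on "forest" plots
nonforest = rng.normal(5.0, 4.0, 40)     # volume on "non-forest" plots

good_w = np.array([0.55, 0.45])          # unbiased stratum area weights
bad_w = np.array([0.70, 0.30])           # biased stratification
print("unbiased weights:", stratified_estimate([forest, nonforest], good_w))
print("biased weights:  ", stratified_estimate([forest, nonforest], bad_w))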
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeng, L., E-mail: zeng@fusion.gat.com; Doyle, E. J.; Rhodes, T. L.
2016-11-15
A new model-based technique for fast estimation of the pedestal electron density gradient has been developed. The technique uses ordinary mode polarization profile reflectometer time delay data and does not require direct profile inversion. Because of its simple data processing, the technique can be readily implemented via a Field-Programmable Gate Array, so as to provide a real-time density gradient estimate, suitable for use in plasma control systems such as envisioned for ITER, and possibly for DIII-D and Experimental Advanced Superconducting Tokamak. The method is based on a simple edge plasma model with a linear pedestal density gradient and low scrape-off-layer density. By measuring reflectometer time delays for three adjacent frequencies, the pedestal density gradient can be estimated analytically via the new approach. Using existing DIII-D profile reflectometer data, the estimated density gradients obtained from the new technique are found to be in good agreement with the actual density gradients for a number of dynamic DIII-D plasma conditions.
USDA-ARS?s Scientific Manuscript database
Spatial frequency domain imaging technique has recently been developed for determination of the optical properties of food and biological materials. However, accurate estimation of the optical property parameters by the technique is challenging due to measurement errors associated with signal acquis...
Spring Small Grains Area Estimation
NASA Technical Reports Server (NTRS)
Palmer, W. F.; Mohler, R. J.
1986-01-01
SSG3 automatically estimates acreage of spring small grains from Landsat data. Report describes development and testing of a computerized technique for using Landsat multispectral scanner (MSS) data to estimate acreage of spring small grains (wheat, barley, and oats). Application of technique to analysis of four years of data from United States and Canada yielded estimates of accuracy comparable to those obtained through procedures that rely on trained analysis.
NASA Astrophysics Data System (ADS)
Winiarek, Victor; Bocquet, Marc; Duhanyan, Nora; Roustan, Yelva; Saunier, Olivier; Mathieu, Anne
2014-01-01
Inverse modelling techniques can be used to estimate the amount of radionuclides and the temporal profile of the source term released in the atmosphere during the accident of the Fukushima Daiichi nuclear power plant in March 2011. In Winiarek et al. (2012b), the lower bounds of the caesium-137 and iodine-131 source terms were estimated with such techniques, using activity concentration measurements. The importance of an objective assessment of prior errors (the observation errors and the background errors) was emphasised for a reliable inversion. In such critical context where the meteorological conditions can make the source term partly unobservable and where only a few observations are available, such prior estimation techniques are mandatory, the retrieved source term being very sensitive to this estimation. We propose to extend the use of these techniques to the estimation of prior errors when assimilating observations from several data sets. The aim is to compute an estimate of the caesium-137 source term jointly using all available data about this radionuclide, such as activity concentrations in the air, but also daily fallout measurements and total cumulated fallout measurements. It is crucial to properly and simultaneously estimate the background errors and the prior errors relative to each data set. A proper estimation of prior errors is also a necessary condition to reliably estimate the a posteriori uncertainty of the estimated source term. Using such techniques, we retrieve a total released quantity of caesium-137 in the interval 11.6-19.3 PBq with an estimated standard deviation range of 15-20% depending on the method and the data sets. The “blind” time intervals of the source term have also been strongly mitigated compared to the first estimations with only activity concentration data.
Estimating Crop Growth Stage by Combining Meteorological and Remote Sensing Based Techniques
NASA Astrophysics Data System (ADS)
Champagne, C.; Alavi-Shoushtari, N.; Davidson, A. M.; Chipanshi, A.; Zhang, Y.; Shang, J.
2016-12-01
Estimations of seeding, harvest and phenological growth stage of crops are important sources of information for monitoring crop progress and crop yield forecasting. Growth stage has been traditionally estimated at the regional level through surveys, which rely on field staff to collect the information. Automated techniques to estimate growth stage have included agrometeorological approaches that use temperature and day length information to estimate accumulated heat and photoperiod, with thresholds used to determine when these stages are most likely. These approaches, however, are crop and hybrid dependent, and can give widely varying results depending on the method used, particularly if the seeding date is unknown. Methods to estimate growth stage from remote sensing have progressed greatly in the past decade, with time series information from the Normalized Difference Vegetation Index (NDVI) the most common approach. Time series NDVI provide information on growth stage through a variety of techniques, including fitting functions to a series of measured NDVI values or smoothing these values and using thresholds to detect changes in slope that are indicative of rapidly increasing or decreasing 'greenness' in the vegetation cover. The key limitations of these techniques for agriculture are frequent cloud cover in optical data that lead to errors in estimating local features in the time series function, and the incongruity between changes in greenness and traditional agricultural growth stages. There is great potential to combine both meteorological approaches and remote sensing to overcome the limitations of each technique. This research will examine the accuracy of both meteorological and remote sensing approaches over several agricultural sites in Canada, and look at the potential to integrate these techniques to provide improved estimates of crop growth stage for common field crops.
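The NDVI-threshold idea described above can be illustrated with a few lines of Python: a noisy seasonal NDVI series is smoothed and green-up is flagged at the first crossing of a fixed fraction of the seasonal amplitude. The Savitzky-Golay window and the 50% amplitude threshold are illustrative choices, not values from this work.

# Sketch: smooth an NDVI time series and flag green-up with an amplitude threshold.
import numpy as np
from scipy.signal import savgol_filter

doy = np.arange(90, 300, 8)                              # 8-day composites
rng = np.random.default_rng(5)
ndvi = 0.15 + 0.55 * np.exp(-((doy - 200) / 45.0) ** 2)  # idealized seasonal curve
ndvi += rng.normal(0, 0.04, doy.size)                    # cloud/sensor noise

smooth = savgol_filter(ndvi, window_length=9, polyorder=2)
amp_thresh = smooth.min() + 0.5 * (smooth.max() - smooth.min())
green_up = doy[np.argmax(smooth >= amp_thresh)]          # first crossing of the threshold
print(f"estimated green-up around day of year {green_up}")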
Lorenz, David L.; Sanocki, Chris A.; Kocian, Matthew J.
2010-01-01
Knowledge of the peak flow of floods of a given recurrence interval is essential for regulation and planning of water resources and for design of bridges, culverts, and dams along Minnesota's rivers and streams. Statistical techniques are needed to estimate peak flow at ungaged sites because long-term streamflow records are available at relatively few places. Because of the need to have up-to-date peak-flow frequency information in order to estimate peak flows at ungaged sites, the U.S. Geological Survey (USGS) conducted a peak-flow frequency study in cooperation with the Minnesota Department of Transportation and the Minnesota Pollution Control Agency. Estimates of peak-flow magnitudes for 1.5-, 2-, 5-, 10-, 25-, 50-, 100-, and 500-year recurrence intervals are presented for 330 streamflow-gaging stations in Minnesota and adjacent areas in Iowa and South Dakota based on data through water year 2005. The peak-flow frequency information was subsequently used in regression analyses to develop equations relating peak flows for selected recurrence intervals to various basin and climatic characteristics. Two statistically derived techniques-regional regression equation and region of influence regression-can be used to estimate peak flow on ungaged streams smaller than 3,000 square miles in Minnesota. Regional regression equations were developed for selected recurrence intervals in each of six regions in Minnesota: A (northwestern), B (north central and east central), C (northeastern), D (west central and south central), E (southwestern), and F (southeastern). The regression equations can be used to estimate peak flows at ungaged sites. The region of influence regression technique dynamically selects streamflow-gaging stations with characteristics similar to a site of interest. Thus, the region of influence regression technique allows use of a potentially unique set of gaging stations for estimating peak flow at each site of interest. Two methods of selecting streamflow-gaging stations, similarity and proximity, can be used for the region of influence regression technique. The regional regression equation technique is the preferred technique as an estimate of peak flow in all six regions for ungaged sites. The region of influence regression technique is not appropriate for regions C, E, and F because the interrelations of some characteristics of those regions do not agree with the interrelations throughout the rest of the State. Both the similarity and proximity methods for the region of influence technique can be used in the other regions (A, B, and D) to provide additional estimates of peak flow. The peak-flow-frequency estimates and basin characteristics for selected streamflow-gaging stations and regional peak-flow regression equations are included in this report.
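As a rough illustration of how such regional regression equations are typically fitted (the actual equations and explanatory variables come from the report itself), the sketch below fits a log-linear relation Q = a * A^b * P^c to synthetic gaged-basin data, with A the drainage area and P the mean annual precipitation, and applies it at a hypothetical ungaged site.

# Sketch: fit a regional peak-flow regression in log space and apply it.
import numpy as np

rng = np.random.default_rng(6)
n = 40
area = 10 ** rng.uniform(0.5, 3.0, n)          # drainage area, mi^2 (synthetic)
precip = rng.uniform(20.0, 40.0, n)            # mean annual precipitation, inches
q100 = 12.0 * area**0.75 * precip**0.60 * np.exp(rng.normal(0, 0.2, n))

# Ordinary least squares: log10 Q = log10 a + b log10 A + c log10 P
X = np.column_stack([np.ones(n), np.log10(area), np.log10(precip)])
coef, *_ = np.linalg.lstsq(X, np.log10(q100), rcond=None)
a, b, c = 10 ** coef[0], coef[1], coef[2]
print(f"fitted: Q100 = {a:.1f} * A^{b:.2f} * P^{c:.2f}")
print("estimate for a 250 mi^2, 28 in basin:", round(a * 250**b * 28**c))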
Application of an Optimal Tuner Selection Approach for On-Board Self-Tuning Engine Models
NASA Technical Reports Server (NTRS)
Simon, Donald L.; Armstrong, Jeffrey B.; Garg, Sanjay
2012-01-01
An enhanced design methodology for minimizing the error in on-line Kalman filter-based aircraft engine performance estimation applications is presented in this paper. It specifically addresses the under-determined estimation problem, in which there are more unknown parameters than available sensor measurements. This work builds upon an existing technique for systematically selecting a model tuning parameter vector of appropriate dimension to enable estimation by a Kalman filter, while minimizing the estimation error in the parameters of interest. While the existing technique was optimized for open-loop engine operation at a fixed design point, in this paper an alternative formulation is presented that enables the technique to be optimized for an engine operating under closed-loop control throughout the flight envelope. The theoretical Kalman filter mean squared estimation error at a steady-state closed-loop operating point is derived, and the tuner selection approach applied to minimize this error is discussed. A technique for constructing a globally optimal tuning parameter vector, which enables full-envelope application of the technology, is also presented, along with design steps for adjusting the dynamic response of the Kalman filter state estimates. Results from the application of the technique to linear and nonlinear aircraft engine simulations are presented and compared to the conventional approach of tuner selection. The new methodology is shown to yield a significant improvement in on-line Kalman filter estimation accuracy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cool, Richard, M.; Hudon, Thomas, J.; Basco, David, R.
2009-12-10
On April 15, 2008, the Department of Energy (DOE) issued a Funding Opportunity Announcement for Advanced Water Power Projects which included a Topic Area for Marine and Hydrokinetic Renewable Energy Market Acceleration Projects. Within this Topic Area, DOE identified potential navigational impacts of marine and hydrokinetic renewable energy technologies and measures to prevent adverse impacts on navigation as a sub-topic area. DOE defines marine and hydrokinetic technologies as those capable of utilizing one or more of the following resource categories for energy generation: ocean waves; tides or ocean currents; free flowing water in rivers or streams; and energy generation from the differentials in ocean temperature. PCCI was awarded Cooperative Agreement DE-FC36-08GO18177 from the DOE to identify the potential navigational impacts and mitigation measures for marine hydrokinetic technologies, as summarized herein. The contract also required cooperation with the U.S. Coast Guard (USCG) and two recipients of awards (Pacific Energy Ventures and reVision) in a sub-topic area to develop a protocol to identify streamlined, best-siting practices. Over the period of this contract, PCCI and our sub-consultants, David Basco, Ph.D., and Neil Rondorf of Science Applications International Corporation, met with USCG headquarters personnel, with U.S. Army Corps of Engineers headquarters and regional personnel, with U.S. Navy regional personnel and other ocean users in order to develop an understanding of existing practices for the identification of navigational impacts that might occur during construction, operation, maintenance, and decommissioning. At these same meetings, “standard” and potential mitigation measures were discussed so that guidance could be prepared for project developers. Concurrently, PCCI reviewed navigation guidance published by the USCG and international community. This report summarizes the results of this effort, provides guidance in the form of a checklist for assessing the navigational impacts of potential marine and hydrokinetic projects, and provides guidance for improving the existing navigational guidance promulgated by the USCG in Navigation Vessel Inspection Circular 02 07. At the request of the USCG, our checklist and mitigation guidance was written in a generic nature so that it could be equally applied to offshore wind projects. PCCI teleconferenced on a monthly basis with DOE, Pacific Energy Ventures and reVision in order to share information and review work products. Although the focus of our effort was on marine and hydrokinetic technologies, as defined above, this effort drew upon earlier work by the USCG on offshore wind renewable energy installations. The guidance provided herein can be applied equally to marine and hydrokinetic technologies and to offshore wind, which are collectively referred to by the USCG as Renewable Energy Installations.
Gao, Mingwu; Cheng, Hao-Min; Sung, Shih-Hsien; Chen, Chen-Huan; Olivier, Nicholas Bari; Mukkamala, Ramakrishna
2017-07-01
Pulse transit time (PTT) varies with blood pressure (BP) throughout the cardiac cycle, yet, because of wave reflection, only one PTT value at the diastolic BP level is conventionally estimated from proximal and distal BP waveforms. The objective was to establish a technique to estimate multiple PTT values at different BP levels in the cardiac cycle. A technique was developed for estimating PTT as a function of BP (to indicate the PTT value for every BP level) from proximal and distal BP waveforms. First, a mathematical transformation from one waveform to the other is defined in terms of the parameters of a nonlinear arterial tube-load model accounting for BP-dependent arterial compliance and wave reflection. Then, the parameters are estimated by optimally fitting the waveforms to each other via the model-based transformation. Finally, PTT as a function of BP is specified by the parameters. The technique was assessed in animals and patients in several ways including the ability of its estimated PTT-BP function to serve as a subject-specific curve for calibrating PTT to BP. The calibration curve derived by the technique during a baseline period yielded bias and precision errors in mean BP of 5.1 ± 0.9 and 6.6 ± 1.0 mmHg, respectively, during hemodynamic interventions that varied mean BP widely. The new technique may permit, for the first time, estimation of PTT values throughout the cardiac cycle from proximal and distal waveforms. The technique could potentially be applied to improve arterial stiffness monitoring and help realize cuff-less BP monitoring.
A-posteriori error estimation for second order mechanical systems
NASA Astrophysics Data System (ADS)
Ruiner, Thomas; Fehr, Jörg; Haasdonk, Bernard; Eberhard, Peter
2012-06-01
One important issue for the simulation of flexible multibody systems is the reduction of the flexible bodies' degrees of freedom. As far as safety questions are concerned, knowledge about the error introduced by the reduction of the flexible degrees of freedom is helpful and very important. In this work, an a-posteriori error estimator for linear first order systems is extended for error estimation of mechanical second order systems. Due to the special second order structure of mechanical systems, an improvement of the a-posteriori error estimator is achieved. A major advantage of the a-posteriori error estimator is that the estimator is independent of the used reduction technique. Therefore, it can be used for moment-matching based, Gramian matrices based or modal based model reduction techniques. The capability of the proposed technique is demonstrated by the a-posteriori error estimation of a mechanical system, and a sensitivity analysis of the parameters involved in the error estimation process is conducted.
Estimates of air emissions from asphalt storage tanks and truck loading
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trumbore, D.C.
1999-12-31
Title V of the 1990 Clean Air Act requires the accurate estimation of emissions from all US manufacturing processes, and places the burden of proof for that estimate on the process owner. This paper is published as a tool to assist in the estimation of air emission from hot asphalt storage tanks and asphalt truck loading operations. Data are presented on asphalt vapor pressure, vapor molecular weight, and the emission split between volatile organic compounds and particulate emissions that can be used with AP-42 calculation techniques to estimate air emissions from asphalt storage tanks and truck loading operations. Since current AP-42 techniques are not valid in asphalt tanks with active fume removal, a different technique for estimation of air emissions in those tanks, based on direct measurement of vapor space combustible gas content, is proposed. Likewise, since AP-42 does not address carbon monoxide or hydrogen sulfide emissions that are known to be present in asphalt operations, this paper proposes techniques for estimation of those emissions. Finally, data are presented on the effectiveness of fiber bed filters in reducing air emissions in asphalt operations.
Estimation of Heavy Metals Contamination in the Soil of Zaafaraniya City Using the Neural Network
NASA Astrophysics Data System (ADS)
Ghazi, Farah F.
2018-05-01
The aim of this paper is to estimate heavy metal contamination in soils, which can be used to determine the rate of environmental contamination, by using a new technique based on a feedback neural network design as an alternative, accurate technique. The network is designed to estimate the concentrations of Cadmium (Cd), Nickel (Ni), Lead (Pb), Zinc (Zn) and Copper (Cu). To show the accuracy and efficiency of the suggested design, the technique was applied in Al-Zafaraniyah in Baghdad city. The results of this paper show that the suggested networks can be successfully applied to the rapid and accurate estimation of heavy metal concentrations.
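For orientation only, the sketch below trains a small feed-forward regression network (scikit-learn's MLPRegressor, a stand-in rather than the authors' network design) to map a handful of soil covariates to metal concentrations; the features, targets, and values are entirely synthetic.

# Sketch: multi-output neural network regression of metal concentrations.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 400
X = rng.uniform(0, 1, size=(n, 4))              # e.g. pH, organic matter, distance, depth
true = np.column_stack([                        # synthetic Cd, Ni, Pb responses
    5 + 20 * X[:, 0] - 10 * X[:, 2],
    30 + 15 * X[:, 1],
    40 + 25 * X[:, 0] * X[:, 1],
])
Y = true + rng.normal(0, 1.0, true.shape)

Xtr, Xte, Ytr, Yte = train_test_split(X, Y, test_size=0.25, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
net.fit(Xtr, Ytr)
print("R^2 on held-out samples:", round(net.score(Xte, Yte), 3))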
An Empirical State Error Covariance Matrix for the Weighted Least Squares Estimation Method
NASA Technical Reports Server (NTRS)
Frisbee, Joseph H., Jr.
2011-01-01
State estimation techniques effectively provide mean state estimates. However, the theoretical state error covariance matrices provided as part of these techniques often suffer from a lack of confidence in their ability to describe the uncertainty in the estimated states. By a reinterpretation of the equations involved in the weighted least squares algorithm, it is possible to directly arrive at an empirical state error covariance matrix. This proposed empirical state error covariance matrix will contain the effect of all error sources, known or not. Results based on the proposed technique will be presented for a simple, two observer, measurement error only problem.
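The sketch below is a generic weighted least squares example, not the paper's specific reinterpretation: it solves y = Hx + v with assumed weights, reports the usual theoretical covariance (H^T W H)^-1, and contrasts it with an empirical covariance formed from repeated trials in which the actual noise is larger than modeled, the situation in which the theoretical matrix understates the uncertainty.

# Sketch: theoretical vs. empirical state error covariance for WLS.
import numpy as np

rng = np.random.default_rng(8)
H = rng.normal(size=(30, 2))
x_true = np.array([1.0, -2.0])
sigma_assumed, sigma_actual = 0.5, 1.0          # modeled vs. real measurement noise
W = np.eye(30) / sigma_assumed**2

A = H.T @ W @ H
theoretical_cov = np.linalg.inv(A)

estimates = []
for _ in range(2000):
    y = H @ x_true + rng.normal(0, sigma_actual, 30)   # unmodeled extra error
    estimates.append(np.linalg.solve(A, H.T @ W @ y))
empirical_cov = np.cov(np.array(estimates).T)

print("theoretical covariance:\n", theoretical_cov)
print("empirical covariance:\n", empirical_cov)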
ERIC Educational Resources Information Center
Recchia, Gabriel L.; Louwerse, Max M.
2016-01-01
Computational techniques comparing co-occurrences of city names in texts allow the relative longitudes and latitudes of cities to be estimated algorithmically. However, these techniques have not been applied to estimate the provenance of artifacts with unknown origins. Here, we estimate the geographic origin of artifacts from the Indus Valley…
A Comparison of a Bayesian and a Maximum Likelihood Tailored Testing Procedure.
ERIC Educational Resources Information Center
McKinley, Robert L.; Reckase, Mark D.
A study was conducted to compare tailored testing procedures based on a Bayesian ability estimation technique and on a maximum likelihood ability estimation technique. The Bayesian tailored testing procedure selected items so as to minimize the posterior variance of the ability estimate distribution, while the maximum likelihood tailored testing…
Genie M. Fleming; Joseph M. Wunderle; David N. Ewert; Joseph O' Brien
2014-01-01
Aim: Non-destructive methods for quantifying above-ground plant biomass are important tools in many ecological studies and management endeavours, but estimation methods can be labour intensive and particularly difficult in structurally diverse vegetation types. We aimed to develop a low-cost, but reasonably accurate, estimation technique within early-successional...
Exploratory Study for Continuous-time Parameter Estimation of Ankle Dynamics
NASA Technical Reports Server (NTRS)
Kukreja, Sunil L.; Boyle, Richard D.
2014-01-01
Recently, a parallel pathway model to describe ankle dynamics was proposed. This model provides a relationship between ankle angle and net ankle torque as the sum of a linear and nonlinear contribution. A technique to identify parameters of this model in discrete-time has been developed. However, these parameters are a nonlinear combination of the continuous-time physiology, making insight into the underlying physiology impossible. The stable and accurate estimation of continuous-time parameters is critical for accurate disease modeling, clinical diagnosis, robotic control strategies, development of optimal exercise protocols for long-term space exploration, sports medicine, etc. This paper explores the development of a system identification technique to estimate the continuous-time parameters of ankle dynamics. The effectiveness of this approach is assessed via simulation of a continuous-time model of ankle dynamics with typical parameters found in clinical studies. The results show that although this technique improves estimates, it does not provide robust estimates of continuous-time parameters of ankle dynamics. Due to this we conclude that alternative modeling strategies and more advanced estimation techniques should be considered for future work.
NASA Astrophysics Data System (ADS)
Shi, Lei; Guo, Lianghui; Ma, Yawei; Li, Yonghua; Wang, Weilai
2018-05-01
The technique of teleseismic receiver function H-κ stacking is popular for estimating the crustal thickness and Vp/Vs ratio. However, it has large uncertainty or ambiguity when the Moho multiples in receiver function are not easy to be identified. We present an improved technique to estimate the crustal thickness and Vp/Vs ratio by joint constraints of receiver function and gravity data. The complete Bouguer gravity anomalies, composed of the anomalies due to the relief of the Moho interface and the heterogeneous density distribution within the crust, are associated with the crustal thickness, density and Vp/Vs ratio. According to their relationship formulae presented by Lowry and Pérez-Gussinyé, we invert the complete Bouguer gravity anomalies by using a common algorithm of likelihood estimation to obtain the crustal thickness and Vp/Vs ratio, and then utilize them to constrain the receiver function H-κ stacking result. We verified the improved technique on three synthetic crustal models and evaluated the influence of selected parameters, the results of which demonstrated that the novel technique could reduce the ambiguity and enhance the accuracy of estimation. Real data test at two given stations in the NE margin of Tibetan Plateau illustrated that the improved technique provided reliable estimations of crustal thickness and Vp/Vs ratio.
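For context, the sketch below implements conventional H-kappa stacking, the baseline technique that the gravity-constrained approach above improves upon: receiver-function amplitudes are stacked at the predicted Ps, PpPs and PpSs+PsPs arrival times over a grid of crustal thickness H and Vp/Vs ratio kappa. The phase weights, the average crustal Vp, and the synthetic receiver function are assumptions of this illustration.

# Sketch: conventional H-kappa stacking of a single receiver function.
import numpy as np

def hk_stack(rf, t, p, vp=6.3, H_grid=None, k_grid=None, w=(0.6, 0.3, 0.1)):
    H_grid = np.arange(25.0, 65.0, 0.25) if H_grid is None else H_grid
    k_grid = np.arange(1.6, 2.0, 0.005) if k_grid is None else k_grid
    amp = lambda tau: np.interp(tau, t, rf)         # RF amplitude at a delay time
    S = np.zeros((H_grid.size, k_grid.size))
    qa = np.sqrt(1.0 / vp**2 - p**2)                # vertical P slowness
    for i, H in enumerate(H_grid):
        for j, k in enumerate(k_grid):
            qb = np.sqrt((k / vp) ** 2 - p**2)      # vertical S slowness (Vs = Vp/kappa)
            t_ps, t_ppps, t_ppss = H * (qb - qa), H * (qb + qa), 2 * H * qb
            S[i, j] = w[0] * amp(t_ps) + w[1] * amp(t_ppps) - w[2] * amp(t_ppss)
    i, j = np.unravel_index(np.argmax(S), S.shape)
    return H_grid[i], k_grid[j]

# Tiny synthetic check: pulses placed at the arrival times predicted for H0, k0.
t = np.linspace(0, 30, 1500)
p = 0.06                                            # ray parameter, s/km
H0, k0 = 45.0, 1.75
qa = np.sqrt(1 / 6.3**2 - p**2); qb = np.sqrt((k0 / 6.3) ** 2 - p**2)
rf = np.exp(-((t - H0 * (qb - qa)) / 0.3) ** 2)            # Ps
rf += 0.5 * np.exp(-((t - H0 * (qb + qa)) / 0.3) ** 2)     # PpPs
rf -= 0.3 * np.exp(-((t - 2 * H0 * qb) / 0.3) ** 2)        # PpSs+PsPs (negative polarity)
print("recovered (H, kappa):", hk_stack(rf, t, p))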
Quantum-classical boundary for precision optical phase estimation
NASA Astrophysics Data System (ADS)
Birchall, Patrick M.; O'Brien, Jeremy L.; Matthews, Jonathan C. F.; Cable, Hugo
2017-12-01
Understanding the fundamental limits on the precision to which an optical phase can be estimated is of key interest for many investigative techniques utilized across science and technology. We study the estimation of a fixed optical phase shift due to a sample which has an associated optical loss, and compare phase estimation strategies using classical and nonclassical probe states. These comparisons are based on the attainable (quantum) Fisher information calculated per number of photons absorbed or scattered by the sample throughout the sensing process. We find that for a given number of incident photons upon the unknown phase, nonclassical techniques in principle provide less than a 20 % reduction in root-mean-square error (RMSE) in comparison with ideal classical techniques in multipass optical setups. Using classical techniques in a different optical setup that we analyze, which incorporates additional stages of interference during the sensing process, the achievable reduction in RMSE afforded by nonclassical techniques falls to only ≃4 % . We explain how these conclusions change when nonclassical techniques are compared to classical probe states in nonideal multipass optical setups, with additional photon losses due to the measurement apparatus.
NASA Astrophysics Data System (ADS)
Kumar, Shashi; Khati, Unmesh G.; Chandola, Shreya; Agrawal, Shefali; Kushwaha, Satya P. S.
2017-08-01
The regulation of the carbon cycle is a critical ecosystem service provided by forests globally. It is, therefore, necessary to have robust techniques for speedy assessment of forest biophysical parameters at the landscape level. It is arduous and time taking to monitor the status of vast forest landscapes using traditional field methods. Remote sensing and GIS techniques are efficient tools that can monitor the health of forests regularly. Biomass estimation is a key parameter in the assessment of forest health. Polarimetric SAR (PolSAR) remote sensing has already shown its potential for forest biophysical parameter retrieval. The current research work focuses on the retrieval of forest biophysical parameters of tropical deciduous forest, using fully polarimetric spaceborne C-band data with Polarimetric SAR Interferometry (PolInSAR) techniques. PolSAR based Interferometric Water Cloud Model (IWCM) has been used to estimate aboveground biomass (AGB). Input parameters to the IWCM have been extracted from the decomposition modeling of SAR data as well as PolInSAR coherence estimation. The technique of forest tree height retrieval utilized PolInSAR coherence based modeling approach. Two techniques - Coherence Amplitude Inversion (CAI) and Three Stage Inversion (TSI) - for forest height estimation are discussed, compared and validated. These techniques allow estimation of forest stand height and true ground topography. The accuracy of the forest height estimated is assessed using ground-based measurements. PolInSAR based forest height models showed enervation in the identification of forest vegetation and as a result height values were obtained in river channels and plain areas. Overestimation in forest height was also noticed at several patches of the forest. To overcome this problem, coherence and backscatter based threshold technique is introduced for forest area identification and accurate height estimation in non-forested regions. IWCM based modeling for forest AGB retrieval showed R2 value of 0.5, RMSE of 62.73 (t ha-1) and a percent accuracy of 51%. TSI based PolInSAR inversion modeling showed the most accurate result for forest height estimation. The correlation between the field measured forest height and the estimated tree height using TSI technique is 62% with an average accuracy of 91.56% and RMSE of 2.28 m. The study suggested that PolInSAR coherence based modeling approach has significant potential for retrieval of forest biophysical parameters.
The Extended-Image Tracking Technique Based on the Maximum Likelihood Estimation
NASA Technical Reports Server (NTRS)
Tsou, Haiping; Yan, Tsun-Yee
2000-01-01
This paper describes an extended-image tracking technique based on the maximum likelihood estimation. The target image is assumed to have a known profile covering more than one element of a focal plane detector array. It is assumed that the relative position between the imager and the target is changing with time and the received target image has each of its pixels disturbed by an independent additive white Gaussian noise. When a rotation-invariant movement between imager and target is considered, the maximum likelihood based image tracking technique described in this paper is a closed-loop structure capable of providing iterative update of the movement estimate by calculating the loop feedback signals from a weighted correlation between the currently received target image and the previously estimated reference image in the transform domain. The movement estimate is then used to direct the imager to closely follow the moving target. This image tracking technique has many potential applications, including free-space optical communications and astronomy where accurate and stabilized optical pointing is essential.
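A much-simplified, open-loop stand-in for the idea above (the paper's closed-loop iterative update is not reproduced): the shift of a known extended target profile in additive white Gaussian noise is estimated from the peak of a cross-correlation computed in the transform domain.

# Sketch: estimate a 2D image shift via FFT-based cross-correlation.
import numpy as np

rng = np.random.default_rng(9)
N = 64
y, x = np.mgrid[0:N, 0:N]
profile = np.exp(-(((x - N / 2) ** 2 + (y - N / 2) ** 2) / 30.0))   # known target profile

true_shift = (5, -3)                                                # (rows, cols)
received = np.roll(profile, true_shift, axis=(0, 1)) + rng.normal(0, 0.1, (N, N))

# Circular cross-correlation via FFT; the peak location gives the shift estimate.
xcorr = np.fft.ifft2(np.fft.fft2(received) * np.conj(np.fft.fft2(profile))).real
peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
est = [(int(p) + N // 2) % N - N // 2 for p in peak]                # wrap to signed shifts
print("estimated shift:", est, "true:", true_shift)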
NASA Astrophysics Data System (ADS)
Hamaguchi, Nana; Yamamoto, Keiko; Iwai, Daisuke; Sato, Kosuke
We investigate ambient sensing techniques that recognize writer's psychological states by measuring vibrations of handwriting on a desk panel using a piezoelectric contact sensor attached to its underside. In particular, we describe a technique for estimating the subjective difficulty of a question for a student as the ratio of the time duration of thinking to the total amount of time spent on the question. Through experiments, we confirm that our technique correctly recognizes whether or not a person writes something down on paper by measured vibration data at the accuracy of over 80 %, and that the order of computed subjective difficulties of three questions is coincident with that reported by the subject in 60 % of experiments. We also propose a technique to estimate a writer's psychological stress by using the standard deviation of the spectrum of the measured vibration. Results of a proof-of-concept experiment show that the proposed technique correctly estimates whether or not the subject feels stress at least 90 % of the time.
High-resolution bottom-loss estimation using the ambient-noise vertical coherence function.
Muzi, Lanfranco; Siderius, Martin; Quijano, Jorge E; Dosso, Stan E
2015-01-01
The seabed reflection loss (shortly "bottom loss") is an important quantity for predicting transmission loss in the ocean. A recent passive technique for estimating the bottom loss as a function of frequency and grazing angle exploits marine ambient noise (originating at the surface from breaking waves, wind, and rain) as an acoustic source. Conventional beamforming of the noise field at a vertical line array of hydrophones is a fundamental step in this technique, and the beamformer resolution in grazing angle affects the quality of the estimated bottom loss. Implementation of this technique with short arrays can be hindered by their inherently poor angular resolution. This paper presents a derivation of the bottom reflection coefficient from the ambient-noise spatial coherence function, and a technique based on this derivation for obtaining higher angular resolution bottom-loss estimates. The technique, which exploits the (approximate) spatial stationarity of the ambient-noise spatial coherence function, is demonstrated on both simulated and experimental data.
Hocalar, A; Türker, M; Karakuzu, C; Yüzgeç, U
2011-04-01
In this study, five previously developed state estimation methods are examined and compared for estimation of biomass concentrations in a production-scale fed-batch bioprocess. These methods are i. estimation based on kinetic model of overflow metabolism; ii. estimation based on metabolic black-box model; iii. estimation based on observer; iv. estimation based on artificial neural network; v. estimation based on differential evaluation. Biomass concentrations are estimated from available measurements and compared with experimental data obtained from large scale fermentations. The advantages and disadvantages of the presented techniques are discussed with regard to accuracy, reproducibility, number of primary measurements required and adaptation to different working conditions. Among the various techniques, the metabolic black-box method seems to have advantages although the number of measurements required is more than that for the other methods. However, the required extra measurements are based on commonly employed instruments in an industrial environment. This method is used for developing a model based control of fed-batch yeast fermentations. Copyright © 2010 ISA. Published by Elsevier Ltd. All rights reserved.
UAV State Estimation Modeling Techniques in AHRS
NASA Astrophysics Data System (ADS)
Razali, Shikin; Zhahir, Amzari
2017-11-01
An autonomous unmanned aerial vehicle (UAV) system depends on state estimation feedback to control flight operations. Estimating the correct state improves navigation accuracy and allows the flight mission to be achieved safely. One sensor configuration used for UAV state estimation is the Attitude Heading and Reference System (AHRS) with application of an Extended Kalman Filter (EKF) or a feedback controller. The results of these two different techniques in estimating UAV states in the AHRS configuration are displayed through position and attitude graphs.
The effects of missing data on global ozone estimates
NASA Technical Reports Server (NTRS)
Drewry, J. W.; Robbins, J. L.
1981-01-01
The effects of missing data and model truncation on estimates of the global mean, zonal distribution, and global distribution of ozone are considered. It is shown that missing data can introduce biased estimates with errors that are not accounted for in the accuracy calculations of empirical modeling techniques. Data-fill techniques are introduced and used for evaluating error bounds and constraining the estimate in areas of sparse and missing data. It is found that the accuracy of the global mean estimate is more dependent on data distribution than model size. Zonal features can be accurately described by 7th order models over regions of adequate data distribution. Data variance accounted for by higher order models appears to represent climatological features of columnar ozone rather than pure error. Data-fill techniques can prevent artificial feature generation in regions of sparse or missing data without degrading high order estimates over dense data regions.
Tumor response estimation in radar-based microwave breast cancer detection.
Kurrant, Douglas J; Fear, Elise C; Westwick, David T
2008-12-01
Radar-based microwave imaging techniques have been proposed for early stage breast cancer detection. A considerable challenge for the successful implementation of these techniques is the reduction of clutter, or components of the signal originating from objects other than the tumor. In particular, the reduction of clutter from the late-time scattered fields is required in order to detect small (subcentimeter diameter) tumors. In this paper, a method to estimate the tumor response contained in the late-time scattered fields is presented. The method uses a parametric function to model the tumor response. A maximum a posteriori estimation approach is used to evaluate the optimal values for the estimates of the parameters. A pattern classification technique is then used to validate the estimation. The ability of the algorithm to estimate a tumor response is demonstrated by using both experimental and simulated data obtained with a tissue sensing adaptive radar system.
Oberg, Kevin A.; Mades, Dean M.
1987-01-01
Four techniques for estimating generalized skew in Illinois were evaluated: (1) a generalized skew map of the US; (2) an isoline map; (3) a prediction equation; and (4) a regional-mean skew. Peak-flow records at 730 gaging stations having 10 or more annual peaks were selected for computing station skews. Station skew values ranged from -3.55 to 2.95, with a mean of -0.11. Frequency curves computed for 30 gaging stations in Illinois using the variations of the regional-mean skew technique are similar to frequency curves computed using a skew map developed by the US Water Resources Council (WRC). Estimates of the 50-, 100-, and 500-yr floods computed for 29 of these gaging stations using the regional-mean skew techniques are within the 50% confidence limits of frequency curves computed using the WRC skew map. Although the three variations of the regional-mean skew technique were slightly more accurate than the WRC map, there is no appreciable difference between flood estimates computed using the variations of the regional-mean technique and flood estimates computed using the WRC skew map. (Peters-PTT)
Biomagnetic techniques for evaluating gastric emptying, peristaltic contraction and transit time
la Roca-Chiapas, Jose María De; Cordova-Fraga, Teodoro
2011-01-01
Biomagnetic techniques were used to measure motility in various parts of the gastrointestinal (GI) tract, particularly a new technique for detecting magnetic markers and tracers. A coil was used to enhance the signal from a magnetic tracer in the GI tract and the signal was detected using a fluxgate magnetometer or a magnetoresistor in an unshielded room. Estimates of esophageal transit time were affected by the position of the subject. The reproducibility of estimates derived using the new biomagnetic technique was greater than 85% and it yielded estimates similar to those obtained using scintigraphy. This technique is suitable for studying the effect of emotional state on GI physiology and for measuring GI transit time. The biomagnetic technique can be used to evaluate digesta transit time in the esophagus, stomach and colon, peristaltic frequency and gastric emptying and is easy to use in the hospital setting. PMID:22025978
A visual training tool for the Photoload sampling technique
Violet J. Holley; Robert E. Keane
2010-01-01
This visual training aid is designed to provide Photoload users a tool to increase the accuracy of fuel loading estimations when using the Photoload technique. The Photoload Sampling Technique (RMRS-GTR-190) provides fire managers a sampling method for obtaining consistent, accurate, inexpensive, and quick estimates of fuel loading. It is designed to require only one...
NASA Technical Reports Server (NTRS)
Daly, J. K.
1974-01-01
The programming techniques used to implement the equations and mathematical techniques of the Houston Operations Predictor/Estimator (HOPE) orbit determination program on the UNIVAC 1108 computer are described. Detailed descriptions are given of the program structure, the internal program tables and program COMMON, modification and maintenance techniques, and individual subroutine documentation.
NASA Technical Reports Server (NTRS)
1980-01-01
A plan is presented for a supplemental experiment to evaluate a sample allocation technique for selecting picture elements from remotely sensed multispectral imagery for labeling in connection with a new crop proportion estimation technique. The method of evaluating an improved allocation and proportion estimation technique is also provided.
NASA Technical Reports Server (NTRS)
Sheffner, E. J.; Hlavka, C. A.; Bauer, E. M.
1984-01-01
Two techniques have been developed for the mapping and area estimation of small grains in California from Landsat digital data. The two techniques are Band Ratio Thresholding, a semi-automated version of a manual procedure, and LCLS, a layered classification technique which can be fully automated and is based on established clustering and classification technology. Preliminary evaluation results indicate that the two techniques have potential for providing map products which can be incorporated into existing inventory procedures and automated alternatives to traditional inventory techniques and those which currently employ Landsat imagery.
Mann, Michael P.; Rizzardo, Jule; Satkowski, Richard
2004-01-01
Accurate streamflow statistics are essential to water resource agencies involved in both science and decision-making. When long-term streamflow data are lacking at a site, estimation techniques are often employed to generate streamflow statistics. However, procedures for accurately estimating streamflow statistics often are lacking. When estimation procedures are developed, they often are not evaluated properly before being applied. Use of unevaluated or underevaluated flow-statistic estimation techniques can result in improper water-resources decision-making. The California State Water Resources Control Board (SWRCB) uses two key techniques, a modified rational equation and drainage basin area-ratio transfer, to estimate streamflow statistics at ungaged locations. These techniques have been implemented to varying degrees, but have not been formally evaluated. For estimating peak flows at the 2-, 5-, 10-, 25-, 50-, and 100-year recurrence intervals, the SWRCB uses the U.S. Geological Survey's (USGS) regional peak-flow equations. In this study, done cooperatively by the USGS and SWRCB, the SWRCB estimated several flow statistics at 40 USGS streamflow gaging stations in the north coast region of California. The SWRCB estimates were made without reference to USGS flow data. The USGS used the streamflow data provided by the 40 stations to generate flow statistics that could be compared with SWRCB estimates for accuracy. While some SWRCB estimates compared favorably with USGS statistics, results were subject to varying degrees of error over the region. Flow-based estimation techniques generally performed better than rain-based methods, especially for estimation of December 15 to March 31 mean daily flows. The USGS peak-flow equations also performed well, but tended to underestimate peak flows. The USGS equations performed within reported error bounds, but will require updating in the future as peak-flow data sets grow larger. Little correlation was discovered between estimation errors and geographic locations or various basin characteristics. However, for 25-percentile year mean-daily-flow estimates for December 15 to March 31, the greatest estimation errors were at east San Francisco Bay area stations with mean annual precipitation less than or equal to 30 inches, and estimated 2-year/24-hour rainfall intensity less than 3 inches.
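The drainage-basin area-ratio transfer mentioned above is simple enough to state in a few lines. In the sketch below, the exponent b = 1 is a common but assumption-laden default; regional studies often fit b from data, and the numbers in the example are invented.

# Sketch: area-ratio transfer of a flow statistic from a gaged to an ungaged site.
def area_ratio_transfer(q_gaged, area_gaged, area_ungaged, b=1.0):
    """Scale a flow statistic by the drainage-area ratio raised to exponent b."""
    return q_gaged * (area_ungaged / area_gaged) ** b

# Example: 85 cfs December-March mean daily flow at a 120 mi^2 gage,
# transferred to a hydrologically similar 45 mi^2 ungaged basin.
print(round(area_ratio_transfer(85.0, 120.0, 45.0), 1), "cfs")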
NASA Technical Reports Server (NTRS)
Tranter, W. H.; Turner, M. D.
1977-01-01
Techniques are developed to estimate power gain, delay, signal-to-noise ratio, and mean square error in digital computer simulations of lowpass and bandpass systems. The techniques are applied to analog and digital communications. The signal-to-noise ratio estimates are shown to be maximum likelihood estimates in additive white Gaussian noise. The methods are seen to be especially useful for digital communication systems where the mapping from the signal-to-noise ratio to the error probability can be obtained. Simulation results show the techniques developed to be accurate and quite versatile in evaluating the performance of many systems through digital computer simulation.
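The waveform-comparison idea can be illustrated as follows: align the system output with a scaled, delayed copy of the reference input, then read off gain, delay, signal-to-noise ratio, and mean square error from the fit. The integer-lag correlation scan below is a simplification and not the paper's exact maximum likelihood formulation.

# Sketch: estimate delay, gain, MSE and SNR by comparing output and reference waveforms.
import numpy as np

rng = np.random.default_rng(10)
n, true_gain, true_delay, noise_std = 4000, 0.8, 25, 0.2
x = rng.normal(0, 1.0, n)                          # reference (input) waveform
y = true_gain * np.roll(x, true_delay) + rng.normal(0, noise_std, n)   # system output

# Scan candidate delays and keep the one that best correlates with the input.
lags = np.arange(0, 100)
corr = [np.dot(y, np.roll(x, d)) for d in lags]
d_hat = int(lags[np.argmax(corr)])

x_d = np.roll(x, d_hat)
gain_hat = np.dot(y, x_d) / np.dot(x_d, x_d)       # least-squares amplitude gain
resid = y - gain_hat * x_d
mse_hat = np.mean(resid**2)                        # mean square error of the fit
snr_hat = gain_hat**2 * np.mean(x_d**2) / mse_hat  # signal power over residual power

print(f"delay {d_hat}, gain {gain_hat:.3f}, MSE {mse_hat:.4f}, SNR {10*np.log10(snr_hat):.1f} dB")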
Comparing capacity value estimation techniques for photovoltaic solar power
Madaeni, Seyed Hossein; Sioshansi, Ramteen; Denholm, Paul
2012-09-28
In this paper, we estimate the capacity value of photovoltaic (PV) solar plants in the western U.S. Our results show that PV plants have capacity values that range between 52% and 93%, depending on location and sun-tracking capability. We further compare more robust but data- and computationally-intense reliability-based estimation techniques with simpler approximation methods. We show that if implemented properly, these techniques provide accurate approximations of reliability-based methods. Overall, methods that are based on the weighted capacity factor of the plant provide the most accurate estimate. As a result, we also examine the sensitivity of PV capacity value to the inclusion of sun-tracking systems.
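The capacity-factor approximation referred to above can be sketched as follows: weight the plant's hourly capacity factor by the system's highest-risk hours. Here the top 100 load hours stand in for loss-of-load-probability weights, and both the load profile and the solar shape are crude synthetic stand-ins.

# Sketch: approximate PV capacity value from the capacity factor in top load hours.
import numpy as np

rng = np.random.default_rng(11)
hours = 8760
t = np.arange(hours)
load = 30_000 + 8_000 * np.sin(t * 2 * np.pi / 24 - 2.0) + rng.normal(0, 1_500, hours)  # MW
hour_of_day = t % 24
pv_cf = np.clip(np.sin((hour_of_day - 6) * np.pi / 12), 0, None)   # crude clear-sky solar shape

top = np.argsort(load)[-100:]             # 100 highest-load hours as the risk window
capacity_value = pv_cf[top].mean()        # equal-weight capacity factor over those hours
print(f"approximate capacity value: {100 * capacity_value:.0f}% of nameplate")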
A proposed technique for the Venus balloon telemetry and Doppler frequency recovery
NASA Technical Reports Server (NTRS)
Jurgens, R. F.; Divsalar, D.
1985-01-01
A technique is proposed to accurately estimate the Doppler frequency and demodulate the digitally encoded telemetry signal that contains the measurements from balloon instruments. Since the data are prerecorded, one can take advantage of noncausal estimators that are both simpler and more computationally efficient than the usual closed-loop or real-time estimators for signal detection and carrier tracking. Algorithms for carrier frequency estimation, subcarrier demodulation, and bit and frame synchronization are described. A Viterbi decoder algorithm using a branch indexing technique has been devised to decode the constraint length 6, rate 1/2 convolutional code that is being used by the balloon transmitter. These algorithms are memory efficient and can be implemented on microcomputer systems.
NASA Technical Reports Server (NTRS)
Smetana, F. O.; Summery, D. C.; Johnson, W. D.
1972-01-01
Techniques quoted in the literature for the extraction of stability derivative information from flight test records are reviewed. A recent technique developed at NASA's Langley Research Center was regarded as the most productive yet developed. Results of tests of the sensitivity of this procedure to various types of data noise and to the accuracy of the estimated values of the derivatives are reported. Computer programs for providing these initial estimates are given. The literature review also includes a discussion of flight test measuring techniques, instrumentation, and piloting techniques.
Use of high-order spectral moments in Doppler weather radar
NASA Astrophysics Data System (ADS)
di Vito, A.; Galati, G.; Veredice, A.
Three techniques to estimate the skewness and kurtosis of measured precipitation spectra are evaluated. These are: (1) an extension of the pulse-pair technique, (2) fitting the autocorrelation function with a least squares polynomial and differentiating it, and (3) autoregressive spectral estimation. The third technique provides the best results but has an exceedingly large computation burden. The first technique does not supply any useful results due to the crude approximation of the derivatives of the ACF. The second technique requires further study to reduce its variance.
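For reference, the quantities being estimated can be computed directly from a Doppler power spectrum by treating the normalized spectrum as a distribution over velocity, as in the Python sketch below; the spectral shape and noise floor are invented for illustration.

# Sketch: spectral mean, width, skewness and kurtosis from a Doppler power spectrum.
import numpy as np

v = np.linspace(-15, 15, 256)                       # velocity axis, m/s
rng = np.random.default_rng(12)
spec = np.exp(-((v - 2.0) / 2.5) ** 2) + 0.3 * np.exp(-((v - 7.0) / 1.5) ** 2)
spec += rng.uniform(0, 0.02, v.size)                # noise floor

p = spec / spec.sum()                               # normalize to a probability distribution
mean = np.sum(p * v)
var = np.sum(p * (v - mean) ** 2)
skew = np.sum(p * (v - mean) ** 3) / var**1.5
kurt = np.sum(p * (v - mean) ** 4) / var**2
print(f"mean {mean:.2f}  width {np.sqrt(var):.2f}  skewness {skew:.2f}  kurtosis {kurt:.2f}")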
Effect of random errors in planar PIV data on pressure estimation in vortex dominated flows
NASA Astrophysics Data System (ADS)
McClure, Jeffrey; Yarusevych, Serhiy
2015-11-01
The sensitivity of pressure estimation techniques from Particle Image Velocimetry (PIV) measurements to random errors in measured velocity data is investigated using the flow over a circular cylinder as a test case. Direct numerical simulations are performed for ReD = 100, 300 and 1575, spanning laminar, transitional, and turbulent wake regimes, respectively. A range of random errors typical for PIV measurements is applied to synthetic PIV data extracted from numerical results. A parametric study is then performed using a number of common pressure estimation techniques. Optimal temporal and spatial resolutions are derived based on the sensitivity of the estimated pressure fields to the simulated random error in velocity measurements, and the results are compared to an optimization model derived from error propagation theory. It is shown that the reductions in spatial and temporal scales at higher Reynolds numbers leads to notable changes in the optimal pressure evaluation parameters. The effect of smaller scale wake structures is also quantified. The errors in the estimated pressure fields are shown to depend significantly on the pressure estimation technique employed. The results are used to provide recommendations for the use of pressure and force estimation techniques from experimental PIV measurements in vortex dominated laminar and turbulent wake flows.
Ellison, L.E.; O'Shea, T.J.; Neubaum, D.J.; Neubaum, M.A.; Pearce, R.D.; Bowen, R.A.
2007-01-01
We compared conventional capture (primarily mist nets and harp traps) and passive integrated transponder (PIT) tagging techniques for estimating capture and survival probabilities of big brown bats (Eptesicus fuscus) roosting in buildings in Fort Collins, Colorado. A total of 987 female adult and juvenile bats were captured and marked by subdermal injection of PIT tags during the summers of 2001-2005 at five maternity colonies in buildings. Openings to roosts were equipped with PIT hoop-style readers, and exit and entry of bats were passively monitored on a daily basis throughout the summers of 2002-2005. PIT readers 'recaptured' adult and juvenile females more often than conventional capture events at each roost. Estimates of annual capture probabilities for all five colonies were on average twice as high when estimated from PIT reader data (P = 0.93-1.00) than when derived from conventional techniques (P = 0.26-0.66), and as a consequence annual survival was estimated more precisely when using PIT reader encounters. Short-term, daily capture estimates were also higher using PIT readers than conventional captures. We discuss the advantages and limitations of using PIT tags and passive encounters with hoop readers vs. conventional capture techniques for estimating these vital parameters in big brown bats. © Museum and Institute of Zoology PAS.
Optimizing focal plane electric field estimation for detecting exoplanets
NASA Astrophysics Data System (ADS)
Groff, T.; Kasdin, N. J.; Riggs, A. J. E.
Detecting extrasolar planets with angular separations and contrast levels similar to Earth requires a large space-based observatory and advanced starlight suppression techniques. This paper focuses on techniques employing an internal coronagraph, which is highly sensitive to optical errors and must rely on focal plane wavefront control techniques to achieve the necessary contrast levels. To maximize the available science time for a coronagraphic mission we demonstrate an estimation scheme using a discrete time Kalman filter. The state estimate feedback inherent to the filter allows us to minimize the number of exposures required to estimate the electric field. We also show progress including a bias estimate into the Kalman filter to eliminate incoherent light from the estimate. Since the exoplanets themselves are incoherent to the star, this has the added benefit of using the control history to gain certainty in the location of exoplanet candidates as the signal-to-noise between the planets and speckles improves. Having established a purely focal plane based wavefront estimation technique, we discuss a sensor fusion concept where alternate wavefront sensors feedforward a time update to the focal plane estimate to improve robustness to time varying speckle. The overall goal of this work is to reduce the time required for wavefront control on a target, thereby improving the observatory's planet detection performance by increasing the number of targets reachable during the lifespan of the mission.
Parrett, Charles; Hull, J.A.
1986-01-01
Once-monthly streamflow measurements were used to estimate selected percentile discharges on flow-duration curves of monthly mean discharge for 40 ungaged stream sites in the upper Yellowstone River basin in Montana. The estimation technique was a modification of the concurrent-discharge method previously described and used by H.C. Riggs to estimate annual mean discharge. The modified technique is based on the relationship of various mean seasonal discharges to the required discharges on the flow-duration curves. The mean seasonal discharges are estimated from the monthly streamflow measurements, and the percentile discharges are calculated from regression equations. The regression equations, developed from streamflow records at nine gaging stations, indicated a significant log-linear relationship between mean seasonal discharge and various percentile discharges. The technique was tested at two discontinued streamflow-gaging stations; the differences between estimated monthly discharges and those determined from the discharge record ranged from -31 to +27 percent at one site and from -14 to +85 percent at the other. The estimates at one site were unbiased, and the estimates at the other site were consistently larger than the recorded values. Based on the test results, the probable average error of the technique was ±30 percent for the 21 sites measured during the first year of the program and ±50 percent for the 19 sites measured during the second year. (USGS)
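The core of the modified concurrent-discharge technique is a log-linear regression between mean seasonal discharge and a percentile discharge. The sketch below fits such a relationship to hypothetical gaged-site data and applies it at an ungaged site; all numbers and variable names are illustrative, not the study's regression equations.

```python
import numpy as np

# Hypothetical gaged-site data: mean seasonal discharge (cfs) and the
# corresponding 50th-percentile discharge from each site's flow-duration curve.
seasonal_mean_q = np.array([12.0, 35.0, 80.0, 150.0, 420.0, 900.0, 60.0, 210.0, 55.0])
q50 = np.array([8.0, 22.0, 51.0, 95.0, 260.0, 610.0, 40.0, 140.0, 33.0])

# Fit log Q50 = a + b * log Qseasonal (a log-linear relationship, as in the study)
b, a = np.polyfit(np.log10(seasonal_mean_q), np.log10(q50), 1)

def estimate_q50(seasonal_mean):
    """Estimate the 50th-percentile discharge at an ungaged site from its
    mean seasonal discharge (itself estimated from monthly measurements)."""
    return 10 ** (a + b * np.log10(seasonal_mean))

print(estimate_q50(100.0))
```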
Using Deep Learning for Tropical Cyclone Intensity Estimation
NASA Astrophysics Data System (ADS)
Miller, J.; Maskey, M.; Berendes, T.
2017-12-01
Satellite-based techniques are the primary approach to estimating tropical cyclone (TC) intensity. Tropical cyclone warning centers worldwide still apply variants of the Dvorak technique for such estimations that include visual inspection of the satellite images. The National Hurricane Center (NHC) estimates about 10-20% uncertainty in its post analyses when only satellite-based estimates are available. The success of the Dvorak technique proves that spatial patterns in infrared (IR) imagery strongly relate to TC intensity. With the ever-increasing quality and quantity of satellite observations of TCs, deep learning techniques designed to excel at pattern recognition have become more relevant in this area of study. In our current study, we aim to provide a fully objective approach to TC intensity estimation by utilizing deep learning in the form of a convolutional neural network trained to predict TC intensity (maximum sustained wind speed) using IR satellite imagery. Large amounts of training data are needed to train a convolutional neural network, so we use GOES IR images from historical tropical storms from the Atlantic and Pacific basins spanning years 2000 to 2015. Images are labeled using a special subset of the HURDAT2 dataset restricted to time periods with airborne reconnaissance data available in order to improve the quality of the HURDAT2 data. Results and the advantages of this technique are to be discussed.
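A minimal sketch of the kind of convolutional regressor described above is given below in PyTorch. The architecture, 128x128 single-channel input, and training step are illustrative assumptions, not the authors' network or data pipeline.

```python
import torch
import torch.nn as nn

class IntensityCNN(nn.Module):
    """Toy convolutional regressor: single-channel IR image -> wind speed (kt).
    The layer sizes and 128x128 input are illustrative, not the authors' network."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 128), nn.ReLU(),
            nn.Linear(128, 1),            # maximum sustained wind speed
        )

    def forward(self, x):
        return self.regressor(self.features(x))

model = IntensityCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

# One illustrative training step on random stand-ins for IR images / HURDAT2 labels
images = torch.randn(8, 1, 128, 128)
winds = torch.rand(8, 1) * 120 + 20
optimizer.zero_grad()
loss = loss_fn(model(images), winds)
loss.backward()
optimizer.step()
```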
Accurate Estimation of Solvation Free Energy Using Polynomial Fitting Techniques
Shyu, Conrad; Ytreberg, F. Marty
2010-01-01
This report details an approach to improve the accuracy of free energy difference estimates using thermodynamic integration data (slope of the free energy with respect to the switching variable λ) and its application to calculating solvation free energy. The central idea is to utilize polynomial fitting schemes to approximate the thermodynamic integration data to improve the accuracy of the free energy difference estimates. Previously, we introduced the use of a polynomial regression technique to fit thermodynamic integration data (Shyu and Ytreberg, J Comput Chem 30: 2297–2304, 2009). In this report we introduce polynomial and spline interpolation techniques. Two systems with analytically solvable relative free energies are used to test the accuracy of the interpolation approach. We also use both interpolation and regression methods to determine a small molecule solvation free energy. Our simulations show that, using such polynomial techniques and non-equidistant λ values, the solvation free energy can be estimated with high accuracy without using soft-core scaling and separate simulations for Lennard-Jones and partial charges. The results from our study suggest these polynomial techniques, especially with use of non-equidistant λ values, improve the accuracy for ΔF estimates without demanding additional simulations. We also provide general guidelines for use of polynomial fitting to estimate free energy. To allow researchers to immediately utilize these methods, free software and documentation are provided via http://www.phys.uidaho.edu/ytreberg/software. PMID:20623657
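A minimal sketch of the polynomial-fitting idea is shown below: thermodynamic integration slope data at non-equidistant λ values are fit with a polynomial (and, alternatively, a cubic spline), and the fit is integrated over [0, 1] to give ΔF. The data values and the polynomial degree are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical thermodynamic integration data: non-equidistant lambda values and
# the corresponding ensemble-averaged dU/dlambda (the TI "slope" data).
lam = np.array([0.0, 0.05, 0.15, 0.3, 0.5, 0.7, 0.85, 0.95, 1.0])
dudl = np.array([-45.2, -38.7, -27.1, -14.9, -4.3, 3.8, 9.6, 12.8, 13.9])

# Free energy difference as the integral of dU/dlambda over [0, 1]
deg = 4                                    # polynomial degree (a modelling choice)
coeffs = np.polyfit(lam, dudl, deg)        # polynomial regression of the TI data
antideriv = np.polyint(coeffs)             # its antiderivative
delta_F_poly = np.polyval(antideriv, 1.0) - np.polyval(antideriv, 0.0)

# Cubic-spline interpolation of the same data, integrated over [0, 1]
delta_F_spline = CubicSpline(lam, dudl).integrate(0.0, 1.0)

print(delta_F_poly, delta_F_spline)
```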
Fusion-based multi-target tracking and localization for intelligent surveillance systems
NASA Astrophysics Data System (ADS)
Rababaah, Haroun; Shirkhodaie, Amir
2008-04-01
In this paper, we have presented two approaches addressing visual target tracking and localization in complex urban environments. The two techniques presented in this paper are: fusion-based multi-target visual tracking, and multi-target localization via camera calibration. For multi-target tracking, the data fusion concepts of hypothesis generation/evaluation/selection, target-to-target registration, and association are employed. An association matrix is implemented using RGB histograms for associated tracking of multiple targets of interest. Motion segmentation of targets of interest (TOI) from the background was achieved by a Gaussian Mixture Model. Foreground segmentation, on the other hand, was achieved by the Connected Components Analysis (CCA) technique. The tracking of individual targets was estimated by fusing two sources of information, the centroid with the spatial gating, and the RGB histogram association matrix. The localization problem is addressed through an effective camera calibration technique using edge modeling for grid mapping (EMGM). A two-stage image pixel to world coordinates mapping technique is introduced that performs coarse and fine location estimation of moving TOIs. In coarse estimation, an approximate neighborhood of the target position is estimated based on a nearest 4-neighbor method, and in fine estimation, we use Euclidean interpolation to localize the position within the estimated four neighbors. Both techniques were tested and showed reliable results for tracking and localization of targets of interest in complex urban environments.
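A simplified stand-in for the motion-segmentation and connected-components stages described above can be written with OpenCV, as sketched below. The file name, parameter values, and area threshold are illustrative; the association-matrix and calibration stages are only indicated in comments.

```python
import cv2
import numpy as np

# Gaussian-mixture background model for motion segmentation of targets of interest
back_sub = cv2.createBackgroundSubtractorMOG2(history=300, varThreshold=25,
                                               detectShadows=False)

cap = cv2.VideoCapture("surveillance.avi")   # hypothetical input video
while True:
    ok, frame = cap.read()
    if not ok:
        break
    fg_mask = back_sub.apply(frame)                      # motion segmentation (GMM)
    fg_mask = cv2.medianBlur(fg_mask, 5)                 # simple noise clean-up

    # Connected components analysis to turn the foreground mask into target blobs
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(fg_mask, connectivity=8)
    for i in range(1, n):                                # label 0 is background
        if stats[i, cv2.CC_STAT_AREA] < 50:
            continue                                     # discard tiny blobs
        # An RGB histogram of this blob could be computed here and matched against
        # stored target histograms to build the association matrix described above.
        x, y = stats[i, cv2.CC_STAT_LEFT], stats[i, cv2.CC_STAT_TOP]
        w, h = stats[i, cv2.CC_STAT_WIDTH], stats[i, cv2.CC_STAT_HEIGHT]
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
cap.release()
```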
Chaudhuri, Shomesh E; Merfeld, Daniel M
2013-03-01
Psychophysics generally relies on estimating a subject's ability to perform a specific task as a function of an observed stimulus. For threshold studies, the fitted functions are called psychometric functions. While fitting psychometric functions to data acquired using adaptive sampling procedures (e.g., "staircase" procedures), investigators have encountered a bias in the spread ("slope" or "threshold") parameter that has been attributed to the serial dependency of the adaptive data. Using simulations, we confirm this bias for cumulative Gaussian parametric maximum likelihood fits on data collected via adaptive sampling procedures, and then present a bias-reduced maximum likelihood fit that substantially reduces the bias without reducing the precision of the spread parameter estimate and without reducing the accuracy or precision of the other fit parameters. As a separate topic, we explain how to implement this bias reduction technique using generalized linear model fits as well as other numeric maximum likelihood techniques such as the Nelder-Mead simplex. We then provide a comparison of the iterative bootstrap and observed information matrix techniques for estimating parameter fit variance from adaptive sampling procedure data sets. The iterative bootstrap technique is shown to be slightly more accurate; however, the observed information technique executes in a small fraction (0.005 %) of the time required by the iterative bootstrap technique, which is an advantage when a real-time estimate of parameter fit variance is required.
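For reference, the sketch below shows a standard (not bias-reduced) maximum-likelihood fit of a cumulative-Gaussian psychometric function, the baseline procedure whose bias on adaptively sampled data motivates the correction described above. Variable names and the simulated data are illustrative.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def fit_psychometric(stimulus, response):
    """Maximum-likelihood fit of a cumulative-Gaussian psychometric function.
    stimulus : signed stimulus values presented on each trial
    response : 0/1 subject responses on each trial
    Returns (mu, sigma): the point of subjective equality and the spread."""
    def neg_log_lik(params):
        mu, log_sigma = params
        p = norm.cdf(stimulus, loc=mu, scale=np.exp(log_sigma))
        p = np.clip(p, 1e-9, 1 - 1e-9)                  # avoid log(0)
        return -np.sum(response * np.log(p) + (1 - response) * np.log(1 - p))
    res = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
    mu, log_sigma = res.x
    return mu, np.exp(log_sigma)

# Illustrative data from a (non-adaptive) simulated experiment
rng = np.random.default_rng(0)
stim = rng.uniform(-3, 3, size=400)
resp = (rng.random(400) < norm.cdf(stim, loc=0.2, scale=1.0)).astype(float)
print(fit_psychometric(stim, resp))
```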
Weighted image de-fogging using luminance dark prior
NASA Astrophysics Data System (ADS)
Kansal, Isha; Kasana, Singara Singh
2017-10-01
In this work, the weighted image de-fogging process based upon the dark channel prior is modified by using a luminance dark prior. The dark channel prior estimates the transmission by using three colour channels, whereas the luminance dark prior does the same by making use of only the Y component of the YUV colour space. For each pixel in a patch of ? size, the luminance dark prior uses ? pixels, rather than the ? pixels used in the DCP technique, which speeds up the de-fogging process. To estimate the transmission map, a weighted approach based upon a difference prior is used, which mitigates halo artefacts at the time of transmission estimation. The major drawback of the weighted technique is that it does not maintain the constancy of the transmission in a local patch even if there are no significant depth disruptions, due to which the de-fogged image looks over-smoothed and has low contrast. Apart from this, in some images, the weighted transmission still carries less visible halo artefacts. Therefore, a Gaussian filter is used to blur the estimated weighted transmission map, which enhances the contrast of de-fogged images. In addition to this, a novel approach is proposed to remove the pixels belonging to bright light source(s) during the atmospheric light estimation process based upon the histogram of the YUV colour space. To show the effectiveness, the proposed technique is compared with existing techniques. This comparison shows that the proposed technique performs better than the existing techniques.
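A rough sketch of the luminance-dark-prior transmission estimate is given below: the dark prior is computed on the Y (luminance) channel only and converted to a transmission map in the usual dark-channel-prior fashion. The patch size, omega weight, and the simple atmospheric-light choice are illustrative assumptions, not the paper's weighted scheme.

```python
import numpy as np
from scipy.ndimage import minimum_filter

def transmission_from_luminance(rgb, patch=15, omega=0.95):
    """Rough transmission map from a dark prior computed on luminance only.
    rgb : H x W x 3 float image in [0, 1]; patch size and omega are illustrative."""
    # Y (luminance) component of the YUV colour space
    y = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]

    # Atmospheric light: a simple stand-in, the brightest luminance value
    # (the paper instead screens out bright light sources via a YUV histogram)
    airlight = y.max()

    # Dark prior over a local patch of the single luminance channel
    dark = minimum_filter(y / max(airlight, 1e-6), size=patch)

    # Transmission estimate, as in dark-channel-prior de-fogging
    return 1.0 - omega * dark

img = np.random.rand(240, 320, 3)      # stand-in for a foggy frame
t = transmission_from_luminance(img)
```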
NASA Astrophysics Data System (ADS)
Bellili, Faouzi; Amor, Souheib Ben; Affes, Sofiène; Ghrayeb, Ali
2017-12-01
This paper addresses the problem of DOA estimation using uniform linear array (ULA) antenna configurations. We propose a new low-cost method of multiple DOA estimation from very short data snapshots. The new estimator is based on the annihilating filter (AF) technique. It is non-data-aided (NDA) and therefore does not impinge on the overall throughput of the system. The noise components are assumed temporally and spatially white across the receiving antenna elements. The transmitted signals are also temporally and spatially white across the transmitting sources. The new method is compared in performance to the Cramér-Rao lower bound (CRLB), the root-MUSIC algorithm, the deterministic maximum likelihood estimator and another Bayesian method developed precisely for the single snapshot case. Simulations show that the new estimator performs well over a wide SNR range. Prominently, the main advantage of the new AF-based method is that it succeeds in accurately estimating the DOAs from short data snapshots and even from a single snapshot, outperforming by far the state-of-the-art techniques in both DOA estimation accuracy and computational cost.
Lance R. Williams; Melvin L. Warren; Susan B. Adams; Joseph L. Arvai; Christopher M. Taylor
2004-01-01
Basin Visual Estimation Techniques (BVET) are used to estimate abundance for fish populations in small streams. With BVET, independent samples are drawn from natural habitat units in the stream rather than sampling "representative reaches." This sampling protocol provides an alternative to traditional reach-level surveys, which are criticized for their lack...
A field test of cut-off importance sampling for bole volume
Jeffrey H. Gove; Harry T. Valentine; Michael J. Holmes
2000-01-01
Cut-off importance sampling has recently been introduced as a technique for estimating bole volume to some point below the tree tip, termed the cut-off point. A field test of this technique was conducted on a small population of eastern white pine trees using dendrometry as the standard for volume estimation. Results showed that the differences in volume estimates...
Robert E. Keane; Laura J. Dickinson
2007-01-01
Fire managers need better estimates of fuel loading so they can more accurately predict the potential fire behavior and effects of alternative fuel and ecosystem restoration treatments. This report presents a new fuel sampling method, called the photoload sampling technique, to quickly and accurately estimate loadings for six common surface fuel components (1 hr, 10 hr...
NASA Astrophysics Data System (ADS)
Sharan, Maithili; Singh, Amit Kumar; Singh, Sarvesh Kumar
2017-11-01
Estimation of an unknown atmospheric release from a finite set of concentration measurements is considered an ill-posed inverse problem. Besides ill-posedness, the estimation process is influenced by the instrumental errors in the measured concentrations and model representativity errors. The study highlights the effect of minimizing model representativity errors on the source estimation. This is described in an adjoint modelling framework and followed in three steps. First, an estimation of point source parameters (location and intensity) is carried out using an inversion technique. Second, a linear regression relationship is established between the measured concentrations and those predicted using the retrieved source parameters. Third, this relationship is utilized to modify the adjoint functions. Further, source estimation is carried out using these modified adjoint functions to analyse the effect of such modifications. The process is tested for two well-known inversion techniques, renormalization and least-squares. The proposed methodology and inversion techniques are evaluated for a real scenario by using concentration measurements from the Idaho diffusion experiment in low-wind, stable conditions. With both inversion techniques, a significant improvement is observed in the retrieved source estimates after minimizing the representativity errors.
Vasanawala, Shreyas S; Yu, Huanzhou; Shimakawa, Ann; Jeng, Michael; Brittain, Jean H
2012-01-01
MRI imaging of hepatic iron overload can be achieved by estimating T(2) values using multiple-echo sequences. The purpose of this work is to develop and clinically evaluate a weighted least squares algorithm based on T(2) Iterative Decomposition of water and fat with Echo Asymmetry and Least-squares estimation (IDEAL) technique for volumetric estimation of hepatic T(2) in the setting of iron overload. The weighted least squares T(2) IDEAL technique improves T(2) estimation by automatically decreasing the impact of later, noise-dominated echoes. The technique was evaluated in 37 patients with iron overload. Each patient underwent (i) a standard 2D multiple-echo gradient echo sequence for T(2) assessment with nonlinear exponential fitting, and (ii) a 3D T(2) IDEAL technique, with and without a weighted least squares fit. Regression and Bland-Altman analysis demonstrated strong correlation between conventional 2D and T(2) IDEAL estimation. In cases of severe iron overload, T(2) IDEAL without weighted least squares reconstruction resulted in a relative overestimation of T(2) compared with weighted least squares. Copyright © 2011 Wiley-Liss, Inc.
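As a loose illustration of down-weighting noise-dominated late echoes, the sketch below fits a mono-exponential decay S(TE) = S0·exp(−TE·R2) to multi-echo magnitudes with per-echo weights tied to their signal-to-noise ratio. The weighting rule, echo times, and noise level are illustrative stand-ins, not the published weighted least squares T(2) IDEAL reconstruction.

```python
import numpy as np
from scipy.optimize import curve_fit

def fit_r2_weighted(te_ms, signal, noise_sigma):
    """Weighted least-squares fit of S(TE) = S0 * exp(-TE * R2) to multi-echo
    magnitude data. Each echo is weighted by its own signal-to-noise ratio, so
    later, noise-dominated echoes contribute little to the fit (a simple
    stand-in weighting rule, not the published implementation)."""
    def model(te, s0, r2):
        return s0 * np.exp(-te * r2)
    # curve_fit interprets `sigma` as per-point uncertainty: make low-SNR echoes
    # look uncertain by assigning them a large effective sigma.
    snr = np.maximum(signal / noise_sigma, 1e-3)
    eff_sigma = 1.0 / snr
    popt, _ = curve_fit(model, te_ms, signal, p0=[signal[0], 0.1],
                        sigma=eff_sigma, absolute_sigma=False, maxfev=10000)
    s0, r2 = popt
    return s0, r2                      # R2 in 1/ms; T2 = 1/R2

# Illustrative severe-iron-overload decay: T2 ~ 2 ms sampled at six echoes
te = np.array([0.9, 2.1, 3.3, 4.5, 5.7, 6.9])          # ms
true = 1000.0 * np.exp(-te / 2.0)
meas = true + np.random.default_rng(7).normal(0, 40.0, te.size)
print(fit_r2_weighted(te, np.abs(meas), noise_sigma=40.0))
```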
Ronald E. McRoberts; Erkki O. Tomppo; Andrew O. Finley; Heikkinen Juha
2007-01-01
The k-Nearest Neighbor (k-NN) technique has become extremely popular for a variety of forest inventory mapping and estimation applications. Much of this popularity may be attributed to the non-parametric, multivariate features of the technique, its intuitiveness, and its ease of use. When used with satellite imagery and forest...
USDA-ARS?s Scientific Manuscript database
Traditional microbiological techniques for estimating populations of viable bacteria can be laborious and time consuming. The Most Probable Number (MPN) technique is especially tedious as multiple series of tubes must be inoculated at several different dilutions. Recently, an instrument (TEMPO™) ...
C. Andrew Dolloff; Holly E. Jennings
1997-01-01
We compared estimates of stream habitat at the watershed scale using the basinwide visual estimation technique (BVET) and the representative reach extrapolation technique (RRET) in three small watersheds in the Appalachian Mountains. Within each watershed, all habitat units were sampled with the BVET; in contrast, three or four 100-m reaches were sampled with the RRET....
Three Different Methods of Estimating LAI in a Small Watershed
NASA Astrophysics Data System (ADS)
Speckman, H. N.; Ewers, B. E.; Beverly, D.
2015-12-01
Leaf area index (LAI) is a critical input of models that improve predictive understanding of ecology, hydrology, and climate change. Multiple techniques exist to quantify LAI, most of which are labor intensive, and all often fail to converge on similar estimates. Recent large-scale bark beetle-induced mortality greatly altered LAI, which is now dominated by younger and more metabolically active trees compared to the pre-beetle forest. Tree mortality increases error in optical LAI estimates due to the lack of differentiation between live and dead branches in dense canopy. Our study aims to quantify LAI using three different LAI methods, and then to compare the techniques to each other and to topographic drivers to develop an effective predictive model of LAI. This study focuses on quantifying LAI within a small (~120 ha) beetle-infested watershed in Wyoming's Snowy Range Mountains. The first technique estimated LAI using in-situ hemispherical canopy photographs that were then analyzed with Hemisfer software. The second LAI estimation technique was the use of the Kaufmann 1982 allometrics from forest inventories conducted throughout the watershed, accounting for stand basal area, species composition, and the extent of bark beetle driven mortality. The final technique used airborne light detection and ranging (LIDAR) first DMS returns, which were used to estimate canopy heights and crown area. LIDAR final returns provided topographical information and were then ground-truthed during forest inventories. Once data were collected, a fractural analysis was conducted comparing the three methods. Species composition was driven by slope position and elevation. Ultimately the three different techniques provided very different estimations of LAI, but each had its advantages: estimates from hemisphere photos were well correlated with SWE and snow depth measurements, forest inventories provided insight into stand health and composition, and LIDAR was able to quickly and efficiently cover a very large area.
D'Agnese, F. A.; Faunt, C.C.; Turner, A.K.
1996-01-01
The recharge and discharge components of the Death Valley regional groundwater flow system were defined by remote sensing and GIS techniques that integrated disparate data types to develop a spatially complex representation of near-surface hydrological processes. Image classification methods were applied to multispectral satellite data to produce a vegetation map. This map provided a basis for subsequent evapotranspiration and infiltration estimations. The vegetation map was combined with ancillary data in a GIS to delineate different types of wetlands, phreatophytes and wet playa areas. Existing evapotranspiration-rate estimates were then used to calculate discharge volumes for these areas. A previously used empirical method of groundwater recharge estimation was modified by GIS methods to incorporate data describing soil-moisture conditions, and a recharge potential map was produced. These discharge and recharge maps were readily converted to data arrays for numerical modelling codes. Inverse parameter estimation techniques also used these data to evaluate the reliability and sensitivity of estimated values.
Comparison study on disturbance estimation techniques in precise slow motion control
NASA Astrophysics Data System (ADS)
Fan, S.; Nagamune, R.; Altintas, Y.; Fan, D.; Zhang, Z.
2010-08-01
Precise low-speed motion control is important for the industrial applications of both micro-milling machine tool feed drives and electro-optical tracking servo systems. It calls for precise measurement of position and instantaneous velocity and for estimation of disturbances, which include direct-drive motor force ripple, guideway friction, and cutting forces. This paper presents a comparison study on the dynamic response and noise rejection performance of three existing disturbance estimation techniques: the time-delayed estimators, the state augmented Kalman Filters, and the conventional disturbance observers. The design essentials of these three disturbance estimators are introduced. For designing time-delayed estimators, it is proposed to substitute a Kalman Filter for the Luenberger state observer to improve noise suppression performance. The results show that the noise rejection performances of the state augmented Kalman Filters and the time-delayed estimators are much better than those of the conventional disturbance observers. These two estimators can give not only the estimation of the disturbance but also low-noise estimations of position and instantaneous velocity. The bandwidth of the state augmented Kalman Filters is wider than that of the time-delayed estimators. In addition, the state augmented Kalman Filters can give unbiased estimations of the slowly varying disturbance and the instantaneous velocity, while the time-delayed estimators cannot. The simulation and experiment conducted on the X axis of a 2.5-axis prototype micro-milling machine are provided.
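The usual way to build a state augmented Kalman Filter for disturbance estimation is to append the disturbance as a slowly varying extra state, as in the sketch below for a single-axis drive. The model matrices, noise covariances, and sample time are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

# Discrete double-integrator drive model with the disturbance force d appended
# to the state: x = [position, velocity, d]. The disturbance is modelled as a
# slowly varying (random-walk) state, which is the usual augmentation.
dt, m = 1e-3, 2.0                       # sample time [s], moving mass [kg] (illustrative)
A = np.array([[1, dt, 0],
              [0, 1, dt / m],           # the disturbance enters like an extra force
              [0, 0, 1]])
B = np.array([[0], [dt / m], [0]])      # control force input
H = np.array([[1.0, 0.0, 0.0]])         # only position is measured

Q = np.diag([1e-12, 1e-10, 1e-6])       # process noise (drives the disturbance walk)
R = np.array([[1e-12]])                 # position-measurement noise

x = np.zeros((3, 1))
P = np.eye(3) * 1e-6

def kf_step(x, P, u, z):
    """One predict/update cycle; x[2, 0] is the disturbance estimate."""
    x = A @ x + B * u                   # predict
    P = A @ P @ A.T + Q
    S = H @ P @ H.T + R                 # update with the position measurement z
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (np.array([[z]]) - H @ x)
    P = (np.eye(3) - K @ H) @ P
    return x, P
```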
NASA Technical Reports Server (NTRS)
Amis, M. L.; Martin, M. V.; Mcguire, W. G.; Shen, S. S. (Principal Investigator)
1982-01-01
Studies completed in fiscal year 1981 in support of the clustering/classification and preprocessing activities of the Domestic Crops and Land Cover project are summarized. The theme throughout the study was the improvement of subanalysis district (usually county level) crop hectarage estimates, as reflected in the following three objectives: (1) to evaluate the current U.S. Department of Agriculture Statistical Reporting Service regression approach to crop area estimation as applied to the problem of obtaining subanalysis district estimates; (2) to develop and test alternative approaches to subanalysis district estimation; and (3) to develop and test preprocessing techniques for use in improving subanalysis district estimates.
The augmented Lagrangian method for parameter estimation in elliptic systems
NASA Technical Reports Server (NTRS)
Ito, Kazufumi; Kunisch, Karl
1990-01-01
In this paper a new technique for the estimation of parameters in elliptic partial differential equations is developed. It is a hybrid method combining the output-least-squares and the equation error method. The new method is realized by an augmented Lagrangian formulation, and convergence as well as rate of convergence proofs are provided. Technically the critical step is the verification of a coercivity estimate of an appropriately defined Lagrangian functional. To obtain this coercivity estimate a seminorm regularization technique is used.
Advances in parameter estimation techniques applied to flexible structures
NASA Technical Reports Server (NTRS)
Maben, Egbert; Zimmerman, David C.
1994-01-01
In this work, various parameter estimation techniques are investigated in the context of structural system identification utilizing distributed parameter models and 'measured' time-domain data. Distributed parameter models are formulated using the PDEMOD software developed by Taylor. Enhancements made to PDEMOD for this work include the following: (1) a Wittrick-Williams based root solving algorithm; (2) a time simulation capability; and (3) various parameter estimation algorithms. The parameter estimation schemes will be contrasted using the NASA Mini-Mast as the focus structure.
NASA Astrophysics Data System (ADS)
Thoonsaengngam, Rattapol; Tangsangiumvisai, Nisachon
This paper proposes an enhanced method for estimating the a priori Signal-to-Disturbance Ratio (SDR) to be employed in the Acoustic Echo and Noise Suppression (AENS) system for full-duplex hands-free communications. The proposed a priori SDR estimation technique is modified based upon the Two-Step Noise Reduction (TSNR) algorithm to suppress the background noise while preserving speech spectral components. In addition, a practical approach to determine accurately the Echo Spectrum Variance (ESV) is presented based upon the linear relationship assumption between the power spectrum of far-end speech and acoustic echo signals. The ESV estimation technique is then employed to alleviate the acoustic echo problem. The performance of the AENS system that employs these two proposed estimation techniques is evaluated through the Echo Attenuation (EA), Noise Attenuation (NA), and two speech distortion measures. Simulation results based upon real speech signals guarantee that our improved AENS system is able to mitigate efficiently the problem of acoustic echo and background noise, while preserving the speech quality and speech intelligibility.
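For context, the sketch below shows the classical decision-directed a priori SNR estimator, the single-pass estimate that two-step schemes such as TSNR then refine; applied per frequency bin in the AENS setting, the "noise" PSD would stand for the combined echo-plus-noise disturbance. The smoothing factor and variable names are illustrative, and this is not the paper's proposed SDR estimator.

```python
import numpy as np

def decision_directed_snr(noisy_power, noisy_power_prev, noise_power, gain_prev, alpha=0.98):
    """Classical decision-directed a priori SNR estimate for one frame.

    noisy_power      : |Y(k)|^2 of the current frame, per frequency bin
    noisy_power_prev : |Y(k)|^2 of the previous frame
    noise_power      : estimated noise (or disturbance) PSD per bin
    gain_prev        : suppression gain applied to the previous frame
    """
    snr_post = noisy_power / np.maximum(noise_power, 1e-12)       # a posteriori SNR
    amp_prev_sq = (gain_prev ** 2) * noisy_power_prev             # |A_hat(k, l-1)|^2
    snr_prior = alpha * amp_prev_sq / np.maximum(noise_power, 1e-12) \
                + (1.0 - alpha) * np.maximum(snr_post - 1.0, 0.0)
    gain = snr_prior / (1.0 + snr_prior)                          # Wiener-type gain
    return snr_prior, gain

# Illustrative single-frame use on 257 FFT bins
rng = np.random.default_rng(8)
Y2, Y2_prev = rng.exponential(1.0, 257), rng.exponential(1.0, 257)
xi, G = decision_directed_snr(Y2, Y2_prev, np.ones(257), gain_prev=np.full(257, 0.5))
```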
Harding, Brian J; Gehrels, Thomas W; Makela, Jonathan J
2014-02-01
The Earth's thermosphere plays a critical role in driving electrodynamic processes in the ionosphere and in transferring solar energy to the atmosphere, yet measurements of thermospheric state parameters, such as wind and temperature, are sparse. One of the most popular techniques for measuring these parameters is to use a Fabry-Perot interferometer to monitor the Doppler shift and broadening of naturally occurring airglow emissions in the thermosphere. In this work, we present a technique for estimating upper-atmospheric winds and temperatures from images of Fabry-Perot fringes captured by a CCD detector. We estimate instrument parameters from fringe patterns of a frequency-stabilized laser, and we use these parameters to estimate winds and temperatures from airglow fringe patterns. A unique feature of this technique is the model used for the laser and airglow fringe patterns, which fits all fringes simultaneously and attempts to model the effects of optical defects. This technique yields accurate estimates for winds, temperatures, and the associated uncertainties in these parameters, as we show with a Monte Carlo simulation.
Langdon, Jonathan H; Elegbe, Etana; McAleavey, Stephen A
2015-01-01
Single Tracking Location (STL) Shear wave Elasticity Imaging (SWEI) is a method for detecting elastic differences between tissues. It has the advantage of intrinsic speckle bias suppression compared to Multiple Tracking Location (MTL) variants of SWEI. However, the assumption of a linear model leads to an overestimation of the shear modulus in viscoelastic media. A new reconstruction technique denoted Single Tracking Location Viscosity Estimation (STL-VE) is introduced to correct for this overestimation. This technique utilizes the same raw data generated in STL-SWEI imaging. Here, the STL-VE technique is developed by way of a Maximum Likelihood Estimation (MLE) for general viscoelastic materials. The method is then implemented for the particular case of the Kelvin-Voigt Model. Using simulation data, the STL-VE technique is demonstrated and the performance of the estimator is characterized. Finally, the STL-VE method is used to estimate the viscoelastic parameters of ex-vivo bovine liver. We find good agreement between the STL-VE results and the simulation parameters as well as between the liver shear wave data and the modeled data fit. PMID:26168170
Using the Delphi technique in economic evaluation: time to revisit the oracle?
Simoens, S
2006-12-01
Although the Delphi technique has been commonly used as a data source in medical and health services research, its application in economic evaluation of medicines has been more limited. The aim of this study was to describe the methodology of the Delphi technique, to present a case for using the technique in economic evaluation, and to provide recommendations to improve such use. The literature was accessed through MEDLINE focusing on studies discussing the methodology of the Delphi technique and economic evaluations of medicines using the Delphi technique. The Delphi technique can be used to provide estimates of health care resources required and to modify such estimates when making inter-country comparisons. The Delphi technique can also contribute to mapping the treatment process under investigation, to identifying the appropriate comparator to be used, and to ensuring that the economic evaluation estimates cost-effectiveness rather than cost-efficacy. Ideally, economic evaluations of medicines should be based on real-patient data. In the absence of such data, evaluations need to incorporate the best evidence available by employing approaches such as the Delphi technique. Evaluations based on this approach should state the limitations, and explore the impact of the associated uncertainty in the results.
Investigation of spectral analysis techniques for randomly sampled velocimetry data
NASA Technical Reports Server (NTRS)
Sree, Dave
1993-01-01
It is well known that laser velocimetry (LV) generates individual realization velocity data that are randomly or unevenly sampled in time. Spectral analysis of such data to obtain the turbulence spectra, and hence turbulence scales information, requires special techniques. The 'slotting' technique of Mayo et al., also described by Roberts and Ajmani, and the 'Direct Transform' method of Gaster and Roberts are well known in the LV community. The slotting technique is faster than the direct transform method in computation. There are practical limitations, however, as to how high a frequency can be accurately estimated for a given mean sampling rate. These high frequency estimates are important in obtaining the microscale information of turbulence structure. It was found from previous studies that reliable spectral estimates can be made up to about the mean sampling frequency (mean data rate) or less. If the data were evenly sampled, the frequency range would be half the sampling frequency (i.e. up to the Nyquist frequency); otherwise, an aliasing problem would occur. The mean data rate and the sample size (total number of points) basically limit the frequency range. Also, there are large variabilities or errors associated with the high frequency estimates from randomly sampled signals. Roberts and Ajmani proposed certain pre-filtering techniques to reduce these variabilities, but at the cost of low frequency estimates. The prefiltering acts as a high-pass filter. Further, Shapiro and Silverman showed theoretically that, for Poisson sampled signals, it is possible to obtain alias-free spectral estimates far beyond the mean sampling frequency. But the question is, how far? During his tenure under the 1993 NASA-ASEE Summer Faculty Fellowship Program, the author found from his studies of spectral analysis techniques for randomly sampled signals that the spectral estimates can be enhanced or improved up to about 4-5 times the mean sampling frequency by using a suitable prefiltering technique. But this increased bandwidth comes at the cost of the lower frequency estimates. The studies further showed that large data sets of the order of 100,000 points or more, high data rates, and Poisson sampling are very crucial for obtaining reliable spectral estimates from randomly sampled data, such as LV data. Some of the results of the current study are presented.
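A minimal sketch of the slotting technique is shown below: products of all sample pairs are accumulated into lag bins of fixed width to form an autocorrelation estimate from randomly (Poisson) sampled data, from which a spectrum could then be obtained by a cosine transform. The slot width, record length, and test signal are illustrative.

```python
import numpy as np

def slotted_autocorrelation(t, u, max_lag, slot_width):
    """'Slotting' estimate of the autocorrelation of randomly sampled data.
    Products of all sample pairs are accumulated into lag bins ('slots') of
    width slot_width; the spectrum can then be obtained by a cosine transform."""
    u = u - u.mean()                              # work with fluctuations
    n_slots = int(max_lag / slot_width) + 1
    acc = np.zeros(n_slots)
    cnt = np.zeros(n_slots)
    for i in range(len(t)):
        lags = t[i:] - t[i]                       # non-negative lags from sample i
        keep = lags <= max_lag
        k = np.rint(lags[keep] / slot_width).astype(int)
        np.add.at(acc, k, u[i] * u[i:][keep])
        np.add.at(cnt, k, 1)
    with np.errstate(invalid="ignore"):
        R = acc / cnt                              # slot-averaged correlation
    return R / R[0]                                # normalized autocorrelation

# Illustrative Poisson-sampled signal (mean rate ~ 1 kHz) carrying a 50 Hz component
rng = np.random.default_rng(1)
t = np.cumsum(rng.exponential(1e-3, size=20000))
u = np.sin(2 * np.pi * 50 * t) + 0.5 * rng.standard_normal(t.size)
R = slotted_autocorrelation(t, u, max_lag=0.05, slot_width=5e-4)
```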
Decision rules for unbiased inventory estimates
NASA Technical Reports Server (NTRS)
Argentiero, P. D.; Koch, D.
1979-01-01
An efficient and accurate procedure for estimating inventories from remote sensing scenes is presented. In place of the conventional and expensive full dimensional Bayes decision rule, a one-dimensional feature extraction and classification technique was employed. It is shown that this efficient decision rule can be used to develop unbiased inventory estimates and that for large sample sizes typical of satellite derived remote sensing scenes, resulting accuracies are comparable or superior to more expensive alternative procedures. Mathematical details of the procedure are provided in the body of the report and in the appendix. Results of a numerical simulation of the technique using statistics obtained from an observed LANDSAT scene are included. The simulation demonstrates the effectiveness of the technique in computing accurate inventory estimates.
Empirical State Error Covariance Matrix for Batch Estimation
NASA Technical Reports Server (NTRS)
Frisbee, Joe
2015-01-01
State estimation techniques effectively provide mean state estimates. However, the theoretical state error covariance matrices provided as part of these techniques often suffer from a lack of confidence in their ability to describe the uncertainty in the estimated states. By a reinterpretation of the equations involved in the weighted batch least squares algorithm, it is possible to directly arrive at an empirical state error covariance matrix. The proposed empirical state error covariance matrix will contain the effect of all error sources, known or not. This empirical error covariance matrix may be calculated as a side computation for each unique batch solution. Results based on the proposed technique will be presented for a simple, two-observer, measurement-error-only problem.
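For orientation, the sketch below sets up a weighted batch least squares solution and reports both the theoretical covariance and a simple residual-scaled covariance as one generic empirical check; the latter is a common stand-in, not the paper's specific empirical state error covariance construction.

```python
import numpy as np

def weighted_batch_ls(H, y, W):
    """Weighted batch least squares with two covariance assessments.
    Returns the state estimate, the theoretical covariance (H' W H)^-1, and a
    simple residual-scaled covariance as one generic 'empirical' check."""
    N = H.T @ W @ H
    P_theory = np.linalg.inv(N)
    x_hat = P_theory @ H.T @ W @ y
    r = y - H @ x_hat                                    # post-fit residuals
    dof = H.shape[0] - H.shape[1]
    var_factor = (r.T @ W @ r) / dof                     # unit-variance estimate
    P_scaled = var_factor * P_theory                     # residual-informed covariance
    return x_hat, P_theory, P_scaled

# Toy measurement-error-only problem: estimate a constant 2-state vector
rng = np.random.default_rng(2)
x_true = np.array([1.0, -0.5])
H = rng.standard_normal((40, 2))
y = H @ x_true + 0.05 * rng.standard_normal(40)
W = np.eye(40) / 0.05**2
print(weighted_batch_ls(H, y, W))
```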
A novel time of arrival estimation algorithm using an energy detector receiver in MMW systems
NASA Astrophysics Data System (ADS)
Liang, Xiaolin; Zhang, Hao; Lyu, Tingting; Xiao, Han; Gulliver, T. Aaron
2017-12-01
This paper presents a new time of arrival (TOA) estimation technique using an improved energy detection (ED) receiver based on the empirical mode decomposition (EMD) in an impulse radio (IR) 60 GHz millimeter wave (MMW) system. A threshold is determined by analyzing the characteristics of the received energy values with an extreme learning machine (ELM). The effect of the channel and integration period on the TOA estimation is evaluated. Several well-known ED-based TOA algorithms are compared with the proposed technique. It is shown that this ELM-based technique has lower TOA estimation error compared to other approaches and provides robust performance with the IEEE 802.15.3c channel models.
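A stripped-down energy-detection TOA estimator is sketched below: block energies are compared against a noise-based threshold and the first crossing gives the TOA. The fixed-factor threshold stands in for the learned (ELM-based) threshold described above, and the sample rate and pulse are illustrative.

```python
import numpy as np

def ed_toa(received, fs, block, threshold_factor=4.0):
    """Threshold-crossing TOA estimate from an energy-detection receiver.
    The signal is integrated over blocks of `block` samples; the TOA is taken
    as the start of the first block whose energy exceeds a noise-based threshold
    (a simple fixed-factor rule, standing in for the learned threshold above)."""
    n_blocks = len(received) // block
    energy = np.sum(received[:n_blocks * block].reshape(n_blocks, block) ** 2, axis=1)
    noise_floor = np.median(energy)                      # robust noise-energy estimate
    hits = np.nonzero(energy > threshold_factor * noise_floor)[0]
    if hits.size == 0:
        return None
    return hits[0] * block / fs                          # TOA in seconds

fs = 2e9                                                 # illustrative sample rate
t = np.arange(int(2e-6 * fs)) / fs
rx = 0.05 * np.random.default_rng(3).standard_normal(t.size)
rx[int(0.8e-6 * fs):int(0.8e-6 * fs) + 200] += 1.0       # pulse arriving at 0.8 us
print(ed_toa(rx, fs, block=100))
```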
Heidari, M.; Ranjithan, S.R.
1998-01-01
In using non-linear optimization techniques for estimation of parameters in a distributed ground water model, the initial values of the parameters and prior information about them play important roles. In this paper, the genetic algorithm (GA) is combined with the truncated-Newton search technique to estimate groundwater parameters for a confined steady-state ground water model. Use of prior information about the parameters is shown to be important in estimating correct or near-correct values of parameters on a regional scale. The amount of prior information needed for an accurate solution is estimated by evaluation of the sensitivity of the performance function to the parameters. For the example presented here, it is experimentally demonstrated that only one piece of prior information of the least sensitive parameter is sufficient to arrive at the global or near-global optimum solution. For hydraulic head data with measurement errors, the error in the estimation of parameters increases as the standard deviation of the errors increases. Results from our experiments show that, in general, the accuracy of the estimated parameters depends on the level of noise in the hydraulic head data and the initial values used in the truncated-Newton search technique.
A Rapid Screen Technique for Estimating Nanoparticle Transport in Porous Media
Quantifying the mobility of engineered nanoparticles in hydrologic pathways from point of release to human or ecological receptors is essential for assessing environmental exposures. Column transport experiments are a widely used technique to estimate the transport parameters of ...
Taxi-cabs as Subjects for a Population Study
ERIC Educational Resources Information Center
Bishop, J. A.; Bradley, J. S.
1972-01-01
Describes the use of capture-recapture techniques to estimate the population of taxis in Liverpool and demonstrates the points of similarity to animal population estimation. Considers advantages of studying taxis rather than organisms in introductory studies of the techniques. (AL)
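The underlying estimator is the classic capture-recapture (Lincoln-Petersen) formula; the sketch below uses Chapman's bias-corrected form with made-up numbers, not the Liverpool data.

```python
def lincoln_petersen(marked_first, caught_second, recaptured):
    """Chapman's bias-corrected Lincoln-Petersen population estimate.
    marked_first   : number marked in the first capture session
    caught_second  : number examined in the second session
    recaptured     : number in the second session already carrying a mark"""
    return ((marked_first + 1) * (caught_second + 1)) / (recaptured + 1) - 1

# Illustrative numbers only (not the Liverpool data): record 150 cab registrations,
# later note 200 cabs and find 60 of them already on the first list.
print(lincoln_petersen(150, 200, 60))   # roughly 500 cabs
```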
Estimating the cost of major ongoing cost plus hardware development programs
NASA Technical Reports Server (NTRS)
Bush, J. C.
1990-01-01
Approaches are developed for forecasting the cost of major hardware development programs while these programs are in the design and development C/D phase. Three approaches are developed: a schedule assessment technique for bottom-line summary cost estimation, a detailed cost estimation approach, and an intermediate cost element analysis procedure. The schedule assessment technique was developed using historical cost/schedule performance data.
Reiter, M.E.; Andersen, D.E.
2008-01-01
Both egg flotation and egg candling have been used to estimate incubation day (often termed nest age) in nesting birds, but little is known about the relative accuracy of these two techniques. We used both egg flotation and egg candling to estimate incubation day for Canada Geese (Branta canadensis interior) nesting near Cape Churchill, Manitoba, from 2000 to 2007. We modeled variation in the difference between estimates of incubation day using each technique as a function of true incubation day, as well as variation in error rates with each technique as a function of the true incubation day. We also evaluated the effect of error in the estimated incubation day on estimates of daily survival rate (DSR) and nest success using simulations. The mean difference between concurrent estimates of incubation day based on egg flotation minus egg candling at the same nest was 0.85 ± 0.06 (SE) days. The positive difference in favor of egg flotation and the magnitude of the difference in estimates of incubation day did not vary as a function of true incubation day. Overall, both egg flotation and egg candling overestimated incubation day early in incubation and underestimated incubation day later in incubation. The average difference between true hatch date and estimated hatch date did not differ from zero (days) for egg flotation, but egg candling overestimated true hatch date by about 1 d (true - estimated; days). Our simulations suggested that error associated with estimating the incubation day of nests and subsequently exposure days using either egg candling or egg flotation would have minimal effects on estimates of DSR and nest success. Although egg flotation was slightly less biased, both methods provided comparable and accurate estimates of incubation day and subsequent estimates of hatch date and nest success throughout the entire incubation period. © 2008 Association of Field Ornithologists.
Simon, Aaron B.; Dubowitz, David J.; Blockley, Nicholas P.; Buxton, Richard B.
2016-01-01
Calibrated blood oxygenation level dependent (BOLD) imaging is a multimodal functional MRI technique designed to estimate changes in cerebral oxygen metabolism from measured changes in cerebral blood flow and the BOLD signal. This technique addresses fundamental ambiguities associated with quantitative BOLD signal analysis; however, its dependence on biophysical modeling creates uncertainty in the resulting oxygen metabolism estimates. In this work, we developed a Bayesian approach to estimating the oxygen metabolism response to a neural stimulus and used it to examine the uncertainty that arises in calibrated BOLD estimation due to the presence of unmeasured model parameters. We applied our approach to estimate the CMRO2 response to a visual task using the traditional hypercapnia calibration experiment as well as to estimate the metabolic response to both a visual task and hypercapnia using the measurement of baseline apparent R2′ as a calibration technique. Further, in order to examine the effects of cerebral spinal fluid (CSF) signal contamination on the measurement of apparent R2′, we examined the effects of measuring this parameter with and without CSF-nulling. We found that the two calibration techniques provided consistent estimates of the metabolic response on average, with a median R2′-based estimate of the metabolic response to CO2 of 1.4%, and R2′- and hypercapnia-calibrated estimates of the visual response of 27% and 24%, respectively. However, these estimates were sensitive to different sources of estimation uncertainty. The R2′-calibrated estimate was highly sensitive to CSF contamination and to uncertainty in unmeasured model parameters describing flow-volume coupling, capillary bed characteristics, and the iso-susceptibility saturation of blood. The hypercapnia-calibrated estimate was relatively insensitive to these parameters but highly sensitive to the assumed metabolic response to CO2. PMID:26790354
Simon, Aaron B; Dubowitz, David J; Blockley, Nicholas P; Buxton, Richard B
2016-04-01
Calibrated blood oxygenation level dependent (BOLD) imaging is a multimodal functional MRI technique designed to estimate changes in cerebral oxygen metabolism from measured changes in cerebral blood flow and the BOLD signal. This technique addresses fundamental ambiguities associated with quantitative BOLD signal analysis; however, its dependence on biophysical modeling creates uncertainty in the resulting oxygen metabolism estimates. In this work, we developed a Bayesian approach to estimating the oxygen metabolism response to a neural stimulus and used it to examine the uncertainty that arises in calibrated BOLD estimation due to the presence of unmeasured model parameters. We applied our approach to estimate the CMRO2 response to a visual task using the traditional hypercapnia calibration experiment as well as to estimate the metabolic response to both a visual task and hypercapnia using the measurement of baseline apparent R2' as a calibration technique. Further, in order to examine the effects of cerebral spinal fluid (CSF) signal contamination on the measurement of apparent R2', we examined the effects of measuring this parameter with and without CSF-nulling. We found that the two calibration techniques provided consistent estimates of the metabolic response on average, with a median R2'-based estimate of the metabolic response to CO2 of 1.4%, and R2'- and hypercapnia-calibrated estimates of the visual response of 27% and 24%, respectively. However, these estimates were sensitive to different sources of estimation uncertainty. The R2'-calibrated estimate was highly sensitive to CSF contamination and to uncertainty in unmeasured model parameters describing flow-volume coupling, capillary bed characteristics, and the iso-susceptibility saturation of blood. The hypercapnia-calibrated estimate was relatively insensitive to these parameters but highly sensitive to the assumed metabolic response to CO2. Copyright © 2016 Elsevier Inc. All rights reserved.
Evaluation of Bayesian Sequential Proportion Estimation Using Analyst Labels
NASA Technical Reports Server (NTRS)
Lennington, R. K.; Abotteen, K. M. (Principal Investigator)
1980-01-01
The author has identified the following significant results. A total of ten Large Area Crop Inventory Experiment Phase 3 blind sites and analyst-interpreter labels were used in a study to compare proportional estimates obtained by the Bayes sequential procedure with estimates obtained from simple random sampling and from Procedure 1. The analyst error rate using the Bayes technique was shown to be no greater than that for the simple random sampling. Also, the segment proportion estimates produced using this technique had smaller bias and mean squared errors than the estimates produced using either simple random sampling or Procedure 1.
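One simple way to realize a Bayes sequential proportion estimate is a Beta-binomial update with a stopping rule on the posterior uncertainty, as sketched below. This is a generic illustration, not necessarily the LACIE procedure evaluated above, and the labels and stopping threshold are illustrative.

```python
import numpy as np

def beta_sequential(labels, alpha=1.0, beta=1.0, rel_precision=0.1):
    """Sequentially update a Beta(alpha, beta) prior on a crop proportion with
    analyst labels (1 = crop pixel, 0 = other) and stop when the posterior
    standard deviation falls below rel_precision of the posterior mean."""
    for n, y in enumerate(labels, start=1):
        alpha += y
        beta += 1 - y
        mean = alpha / (alpha + beta)
        sd = np.sqrt(alpha * beta / ((alpha + beta) ** 2 * (alpha + beta + 1)))
        if sd < rel_precision * mean:
            return mean, sd, n                 # estimate, uncertainty, labels used
    return mean, sd, n

rng = np.random.default_rng(4)
labels = (rng.random(500) < 0.35).astype(int)  # stand-in analyst labels, true p = 0.35
print(beta_sequential(labels))
```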
D'Agnese, F. A.; Faunt, C.C.; Keith, Turner A.
1996-01-01
The recharge and discharge components of the Death Valley regional groundwater flow system were defined by remote sensing and GIS techniques that integrated disparate data types to develop a spatially complex representation of near-surface hydrological processes. Image classification methods were applied to multispectral satellite data to produce a vegetation map. This map provided a basis for subsequent evapotranspiration and infiltration estimations. The vegetation map was combined with ancillary data in a GIS to delineate different types of wetlands, phreatophytes and wet playa areas. Existing evapotranspiration-rate estimates were then used to calculate discharge volumes for these areas. A previously used empirical method of groundwater recharge estimation was modified by GIS methods to incorporate data describing soil-moisture conditions, and a recharge potential map was produced. These discharge and recharge maps were readily converted to data arrays for numerical modelling codes. Inverse parameter estimation techniques also used these data to evaluate the reliability and sensitivity of estimated values.
Kalman filter approach for uncertainty quantification in time-resolved laser-induced incandescence.
Hadwin, Paul J; Sipkens, Timothy A; Thomson, Kevin A; Liu, Fengshan; Daun, Kyle J
2018-03-01
Time-resolved laser-induced incandescence (TiRe-LII) data can be used to infer spatially and temporally resolved volume fractions and primary particle size distributions of soot-laden aerosols, but these estimates are corrupted by measurement noise as well as uncertainties in the spectroscopic and heat transfer submodels used to interpret the data. Estimates of the temperature, concentration, and size distribution of soot primary particles within a sample aerosol are typically made by nonlinear regression of modeled spectral incandescence decay, or effective temperature decay, to experimental data. In this work, we employ nonstationary Bayesian estimation techniques to infer aerosol properties from simulated and experimental LII signals, specifically the extended Kalman filter and Schmidt-Kalman filter. These techniques exploit the time-varying nature of both the measurements and the models, and they reveal how uncertainty in the estimates computed from TiRe-LII data evolves over time. Both techniques perform better when compared with standard deterministic estimates; however, we demonstrate that the Schmidt-Kalman filter produces more realistic uncertainty estimates.
Ruíz, A; Ramos, A; San Emeterio, J L
2004-04-01
An estimation procedure to efficiently find approximate values of internal parameters in ultrasonic transducers intended for broadband operation would be a valuable tool to discover internal construction data. This information is necessary in the modelling and simulation of acoustic and electrical behaviour related to ultrasonic systems containing commercial transducers. There is not a general solution for this generic problem of parameter estimation in the case of broadband piezoelectric probes. In this paper, this general problem is briefly analysed for broadband conditions. The viability of application in this field of an artificial intelligence technique supported on the modelling of the transducer internal components is studied. A genetic algorithm (GA) procedure is presented and applied to the estimation of different parameters, related to two transducers which are working as pulsed transmitters. The efficiency of this GA technique is studied, considering the influence of the number and variation range of the estimated parameters. Estimation results are experimentally ratified.
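A bare-bones genetic algorithm of the kind described above is sketched below: a population of candidate parameter vectors is evolved by selection, uniform crossover, and Gaussian mutation against a fitness function. The fitness function here is a placeholder misfit; in the paper it would compare the modelled transducer response with the measured one, and all ranges and GA settings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

def fitness(params):
    """Placeholder objective: misfit between 'measured' and modelled responses.
    In the paper this would be built from the pulse-echo model of the transducer."""
    target = np.array([2.0, 0.5, 7.5])          # hypothetical true parameter values
    return -np.sum((params - target) ** 2)

bounds = np.array([[0.0, 5.0], [0.0, 2.0], [1.0, 10.0]])  # per-parameter search ranges
pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(40, 3))

for generation in range(200):
    scores = np.array([fitness(ind) for ind in pop])
    order = np.argsort(scores)[::-1]
    parents = pop[order[:20]]                    # truncation selection (keep best half)
    # Uniform crossover between random parent pairs
    idx = rng.integers(0, 20, size=(40, 2))
    mask = rng.random((40, 3)) < 0.5
    pop = np.where(mask, parents[idx[:, 0]], parents[idx[:, 1]])
    # Gaussian mutation, clipped back into the search ranges
    pop += rng.normal(0.0, 0.05, size=pop.shape) * (bounds[:, 1] - bounds[:, 0])
    pop = np.clip(pop, bounds[:, 0], bounds[:, 1])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print(best)
```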
Optical rangefinding applications using communications modulation technique
NASA Astrophysics Data System (ADS)
Caplan, William D.; Morcom, Christopher John
2010-10-01
A novel range detection technique combines optical pulse modulation patterns with signal cross-correlation to produce an accurate range estimate from low power signals. The cross-correlation peak is analyzed by a post-processing algorithm such that the phase delay is proportional to the range to target. This technique produces a stable range estimate from noisy signals. The advantage is higher accuracy obtained with relatively low optical power transmitted. The technique is useful for low cost, low power and low mass sensors suitable for tactical use. The signal coding technique allows applications including IFF and battlefield identification systems.
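A minimal sketch of the correlation-based ranging idea follows: the received signal is cross-correlated with the known modulation pattern, and the lag of the correlation peak is converted to range. The code pattern, sample rate, and noise level are illustrative, not the sensor's actual modulation scheme.

```python
import numpy as np

c = 3.0e8                                          # speed of light [m/s]
fs = 100e6                                         # illustrative sample rate [Hz]

rng = np.random.default_rng(6)
code = rng.integers(0, 2, size=1023) * 2 - 1       # +/-1 modulation pattern (illustrative)
tx = np.repeat(code, 4).astype(float)              # transmitted pulse train

true_range = 750.0                                 # metres
delay = int(round(2 * true_range / c * fs))        # round-trip delay in samples
rx = np.zeros(tx.size + delay + 2000)
rx[delay:delay + tx.size] += 0.05 * tx             # weak return from the target
rx += 0.3 * rng.standard_normal(rx.size)           # noise

# Cross-correlate the received signal with the known pattern and locate the peak;
# the peak lag (the phase delay of the correlation) is proportional to range.
corr = np.correlate(rx, tx, mode="valid")
lag = int(np.argmax(corr))
print(lag * c / (2 * fs))                          # recovered range estimate [m]
```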
NASA Technical Reports Server (NTRS)
Choudhary, Alok Nidhi; Leung, Mun K.; Huang, Thomas S.; Patel, Janak H.
1989-01-01
Several techniques to perform static and dynamic load balancing for vision systems are presented. These techniques are novel in the sense that they capture the computational requirements of a task by examining the data when it is produced. Furthermore, they can be applied to many vision systems because many algorithms in different systems are either the same or have similar computational characteristics. These techniques are evaluated by applying them on a parallel implementation of the algorithms in a motion estimation system on a hypercube multiprocessor system. The motion estimation system consists of the following steps: (1) extraction of features; (2) stereo match of images in one time instant; (3) time match of images from different time instants; (4) stereo match to compute final unambiguous points; and (5) computation of motion parameters. It is shown that the performance gains when these data decomposition and load balancing techniques are used are significant and the overhead of using these techniques is minimal.
A direct-measurement technique for estimating discharge-chamber lifetime. [for ion thrusters
NASA Technical Reports Server (NTRS)
Beattie, J. R.; Garvin, H. L.
1982-01-01
The use of short-term measurement techniques for predicting the wearout of ion thrusters resulting from sputter-erosion damage is investigated. The laminar-thin-film technique is found to provide high precision erosion-rate data, although the erosion rates are generally substantially higher than those found during long-term erosion tests, so that the results must be interpreted in a relative sense. A technique for obtaining absolute measurements is developed using a masked-substrate arrangement. This new technique provides a means for estimating the lifetimes of critical discharge-chamber components based on direct measurements of sputter-erosion depths obtained during short-duration (approximately 1 hr) tests. Results obtained using the direct-measurement technique are shown to agree with sputter-erosion depths calculated for the plasma conditions of the test. The direct-measurement approach is found to be applicable to both mercury and argon discharge-plasma environments and will be useful for estimating the lifetimes of inert gas and extended performance mercury ion thrusters currently under development.
Restoration of out-of-focus images based on circle of confusion estimate
NASA Astrophysics Data System (ADS)
Vivirito, Paolo; Battiato, Sebastiano; Curti, Salvatore; La Cascia, M.; Pirrone, Roberto
2002-11-01
In this paper a new method for fast out-of-focus blur estimation and restoration is proposed. It is suitable for CFA (Color Filter Array) images acquired by a typical CCD/CMOS sensor. The method is based on the analysis of a single image and consists of two steps: 1) out-of-focus blur estimation via Bayer pattern analysis; 2) image restoration. Blur estimation is based on a block-wise edge detection technique. This edge detection is carried out on the green pixels of the CFA sensor image, also called the Bayer pattern. Once the blur level has been estimated, the image is restored through the application of a new inverse filtering technique. This algorithm gives sharp images, reducing ringing and crisping artifacts while involving a wider region of frequencies. Experimental results show the effectiveness of the method, in both subjective and numerical terms, by comparison with other techniques found in the literature.
Optimal Tuner Selection for Kalman-Filter-Based Aircraft Engine Performance Estimation
NASA Technical Reports Server (NTRS)
Simon, Donald L.; Garg, Sanjay
2011-01-01
An emerging approach in the field of aircraft engine controls and system health management is the inclusion of real-time, onboard models for the inflight estimation of engine performance variations. This technology, typically based on Kalman-filter concepts, enables the estimation of unmeasured engine performance parameters that can be directly utilized by controls, prognostics, and health-management applications. A challenge that complicates this practice is the fact that an aircraft engine's performance is affected by its level of degradation, generally described in terms of unmeasurable health parameters such as efficiencies and flow capacities related to each major engine module. Through Kalman-filter-based estimation techniques, the level of engine performance degradation can be estimated, given that there are at least as many sensors as health parameters to be estimated. However, in an aircraft engine, the number of sensors available is typically less than the number of health parameters, presenting an under-determined estimation problem. A common approach to address this shortcoming is to estimate a subset of the health parameters, referred to as model tuning parameters. The problem/objective is to optimally select the model tuning parameters to minimize Kalman-filter-based estimation error. A tuner selection technique has been developed that specifically addresses the under-determined estimation problem, where there are more unknown parameters than available sensor measurements. A systematic approach is applied to produce a model tuning parameter vector of appropriate dimension to enable estimation by a Kalman filter, while minimizing the estimation error in the parameters of interest. Tuning parameter selection is performed using a multi-variable iterative search routine that seeks to minimize the theoretical mean-squared estimation error of the Kalman filter. This approach can significantly reduce the error in onboard aircraft engine parameter estimation applications such as model-based diagnostics, controls, and life usage calculations. The advantage of the innovation is the significant reduction in estimation errors that it can provide relative to the conventional approach of selecting a subset of health parameters to serve as the model tuning parameter vector. Because this technique needs only to be performed during the system design process, it places no additional computation burden on the onboard Kalman filter implementation. The technique has been developed for aircraft engine onboard estimation applications, as this application typically presents an under-determined estimation problem. However, this generic technique could be applied to other industries using gas turbine engine technology.
Estimating Mass of Inflatable Aerodynamic Decelerators Using Dimensionless Parameters
NASA Technical Reports Server (NTRS)
Samareh, Jamshid A.
2011-01-01
This paper describes a technique for estimating mass for inflatable aerodynamic decelerators. The technique uses dimensional analysis to identify a set of dimensionless parameters for inflation pressure, mass of inflation gas, and mass of flexible material. The dimensionless parameters enable scaling of an inflatable concept with geometry parameters (e.g., diameter), environmental conditions (e.g., dynamic pressure), inflation gas properties (e.g., molecular mass), and mass growth allowance. This technique is applicable for attached (e.g., tension cone, hypercone, and stacked toroid) and trailing inflatable aerodynamic decelerators. The technique uses simple engineering approximations that were developed by NASA in the 1960s and 1970s, as well as some recent important developments. The NASA Mars Entry and Descent Landing System Analysis (EDL-SA) project used this technique to estimate the masses of the inflatable concepts that were used in the analysis. The EDL-SA results compared well with two independent sets of high-fidelity finite element analyses.
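As a minimal illustration of the kind of scaling relation such a mass model rests on, the sketch below sizes the inflation-gas mass from the ideal gas law, assuming the inflation pressure is specified as a fixed multiple of the freestream dynamic pressure. All numerical inputs and the pressure-ratio rule are hypothetical placeholders, not values or relations taken from the paper.

```python
# Minimal sketch (not the EDL-SA mass model): inflation-gas mass for an inflatable
# decelerator from the ideal gas law, assuming the inflation pressure is set as a
# multiple of the freestream dynamic pressure. All inputs are placeholders.
R_UNIVERSAL = 8.314  # J/(mol K)

def inflation_gas_mass(enclosed_volume_m3, dynamic_pressure_pa,
                       pressure_ratio, molar_mass_kg_per_mol, gas_temperature_k):
    """Return the inflation-gas mass (kg) needed to hold the inflation pressure."""
    inflation_pressure = pressure_ratio * dynamic_pressure_pa  # assumed sizing rule
    return (inflation_pressure * enclosed_volume_m3 * molar_mass_kg_per_mol
            / (R_UNIVERSAL * gas_temperature_k))

# Example: 12 m^3 enclosed volume, q = 2 kPa, nitrogen at 250 K.
print(inflation_gas_mass(enclosed_volume_m3=12.0, dynamic_pressure_pa=2000.0,
                         pressure_ratio=5.0, molar_mass_kg_per_mol=0.028,
                         gas_temperature_k=250.0))
```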
High suspended sediment concentrations (SSCs) from natural and anthropogenic sources are responsible for biological impairments of many streams, rivers, lakes, and estuaries, but techniques to estimate sediment concentrations or loads accurately at the daily temporal resolution a...
Rapid estimation of nutritional elements on citrus leaves by near infrared reflectance spectroscopy.
Galvez-Sola, Luis; García-Sánchez, Francisco; Pérez-Pérez, Juan G; Gimeno, Vicente; Navarro, Josefa M; Moral, Raul; Martínez-Nicolás, Juan J; Nieves, Manuel
2015-01-01
Sufficient nutrient application is one of the most important factors in producing quality citrus fruits. One of the main guides in planning citrus fertilizer programs is direct monitoring of the plant nutrient content. However, this requires analysis of a large number of leaf samples using expensive and time-consuming chemical techniques. Over the last 5 years, it has been demonstrated that it is possible to quantitatively estimate certain nutritional elements in citrus leaves by using the spectral reflectance values obtained by near infrared reflectance spectroscopy (NIRS). This technique is rapid, non-destructive, cost-effective and environmentally friendly. Therefore, the estimation of macro and micronutrients in citrus leaves by this method would be beneficial in identifying the mineral status of the trees. However, to be used effectively, NIRS must be evaluated against the standard techniques across different cultivars. In this study, NIRS spectral analysis, and subsequent nutrient estimations for N, K, Ca, Mg, B, Fe, Cu, Mn, and Zn concentration, were performed using 217 leaf samples from different citrus tree species. Partial least squares regression and different pre-processing signal treatments were used to generate the best estimation against the current best practice techniques. A high proficiency was verified in the estimation of N (Rv = 0.99) and Ca (Rv = 0.98), and acceptable estimations were achieved for K, Mg, Fe, and Zn. However, no successful calibrations were obtained for the estimation of B, Cu, and Mn.
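A minimal sketch of the kind of partial least squares (PLS) calibration described above, using scikit-learn. The spectra and nitrogen values are synthetic stand-ins; the study's own data, preprocessing treatments, and component counts differ.

```python
# Hedged sketch: PLS regression of a nutrient concentration on leaf spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
spectra = rng.normal(size=(217, 300))          # 217 leaf samples x 300 wavelengths (synthetic)
nitrogen = spectra[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=217)

X_cal, X_val, y_cal, y_val = train_test_split(spectra, nitrogen, random_state=0)
pls = PLSRegression(n_components=10).fit(X_cal, y_cal)

# Validation correlation, analogous to the Rv statistic quoted in the abstract.
r_val = np.corrcoef(pls.predict(X_val).ravel(), y_val)[0, 1]
print(f"Rv = {r_val:.2f}")
```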
Shape and Spatially-Varying Reflectance Estimation from Virtual Exemplars.
Hui, Zhuo; Sankaranarayanan, Aswin C
2017-10-01
This paper addresses the problem of estimating the shape of objects that exhibit spatially-varying reflectance. We assume that multiple images of the object are obtained under a fixed view-point and varying illumination, i.e., the setting of photometric stereo. At the core of our techniques is the assumption that the BRDF at each pixel lies in the non-negative span of a known BRDF dictionary. This assumption enables a per-pixel surface normal and BRDF estimation framework that is computationally tractable and requires no initialization in spite of the underlying problem being non-convex. Our estimation framework first solves for the surface normal at each pixel using a variant of example-based photometric stereo. We design an efficient multi-scale search strategy for estimating the surface normal and subsequently, refine this estimate using a gradient descent procedure. Given the surface normal estimate, we solve for the spatially-varying BRDF by constraining the BRDF at each pixel to be in the span of the BRDF dictionary; here, we use additional priors to further regularize the solution. A hallmark of our approach is that it does not require iterative optimization techniques nor the need for careful initialization, both of which are endemic to most state-of-the-art techniques. We showcase the performance of our technique on a wide range of simulated and real scenes where we outperform competing methods.
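For contrast with the BRDF-dictionary framework above, the sketch below shows classical Lambertian photometric stereo, a much simpler special case: with a fixed viewpoint and known light directions, the scaled surface normal at a pixel follows from linear least squares. This is not the paper's method, only the baseline setting it generalizes.

```python
# Simplified per-pixel normal estimation by Lambertian photometric stereo:
# intensities I (m,) under light directions L (m x 3) satisfy I ~= L @ (albedo * n).
import numpy as np

def lambertian_normal(intensities, light_dirs):
    g, *_ = np.linalg.lstsq(light_dirs, intensities, rcond=None)
    albedo = np.linalg.norm(g)
    normal = g / albedo if albedo > 0 else g
    return normal, albedo

L = np.array([[0.0, 0.0, 1.0],
              [0.5, 0.0, 0.87],
              [0.0, 0.5, 0.87],
              [-0.5, 0.0, 0.87]])
true_n = np.array([0.1, 0.2, 0.97])
I = L @ (0.8 * true_n)                      # noiseless synthetic observations
print(lambertian_normal(I, L))
```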
Boron Hazards to Fish, Wildlife, and Invertebrates: A Synoptic Review
Eisler, R.
1990-01-01
Ecological and toxicological aspects of boron (B) in the environment are reviewed, with emphasis on natural resources. Subtopics covered include environmental chemistry, background concentrations, effects, and current recommendations for the protection of living resources. Boron is not now considered essential in mammalian nutrition, although low dietary levels protect against fluorosis and bone demineralization. Excessive consumption (i.e., >1,000 mg B/kg diet, >15 mg B/kg body weight daily, >1.0 mg B/L drinking water, or >210 mg B/kg body weight in a single dose) adversely affects growth, survival, or reproduction in sensitive mammals. Boron and its compounds are potent teratogens when applied directly to the mammalian embryo, but there is no evidence of mutagenicity or carcinogenicity. Boron's unique affinity for cancerous tissues has been exploited in neutron capture radiation therapy of malignant human brain tumors. Current boron criteria recommended for the protection of sensitive species include <0.3 mg B/L in crop irrigation waters, <1.0 mg B/L for aquatic life, <5.0 mg B/L in livestock drinking waters, <30 mg B/kg in waterfowl diets, and <100 mg B/kg in livestock diets.
Health professionals working with First Nations, Inuit, and Métis consensus guideline.
Wilson, Don; de la Ronde, Sandra; Brascoupé, Simon; Apale, Alisha Nicole; Barney, Lucy; Guthrie, Bing; Harrold, Elizabeth; Horn, Ojistoh; Johnson, Robin; Rattray, Darrien; Robinson, Nicole; Alainga-Kango, Natsiq; Becker, Gisela; Senikas, Vyta; Aningmiuq, Annie; Bailey, Geri; Birch, Darlene; Cook, Katsi; Danforth, Jessica; Daoust, Mary; Kitty, Darlene; Koebel, Jaime; Kornelsen, Judith; Tsatsa Kotwas, Ndakaitedzva; Lawrence, Audrey; Mudry, Amanda; Senikas, Vyta; Turner, Gail Theresa; Van Wagner, Vicki; Vides, Eduardo; Wasekeesikaw, Fjola Hart; Wolfe, Sara
2013-06-01
Our aim is to provide health care professionals in Canada with the knowledge and tools to provide culturally safe care to First Nations, Inuit, and Métis women and through them, to their families, in order to improve the health of First Nations, Inuit, and Métis. Published literature was retrieved through searches of PubMed, CINAHL, Sociological Abstracts, and The Cochrane Library in 2011 using appropriate controlled vocabulary (e.g., cultural competency, health services, indigenous, transcultural nursing) and key words (e.g., indigenous health services, transcultural health care, cultural safety). Targeted searches on subtopics (e.g., ceremonial rites and sexual coming of age) were also performed. The PubMed search was restricted to the years 2005 and later because of the large number of records retrieved on this topic. Searches were updated on a regular basis and incorporated in the guideline to May 2012. Grey (unpublished) literature was identified through searching the websites of selected related agencies (e.g., Campbell Collaboration, Social Care Online, Institute for Healthcare Improvement). The quality of evidence in this document was rated using the criteria described in the Report of the Canadian Task Force on Preventive Health Care (Table).
The neurocircuitry of addiction: an overview
Feltenstein, M W; See, R E
2008-01-01
Drug addiction presents as a chronic relapsing disorder characterized by persistent drug-seeking and drug-taking behaviours. Given the significant detrimental effects of this disease both socially and economically, a considerable amount of research has been dedicated to understanding a number of issues in addiction, including behavioural and neuropharmacological factors that contribute to the development, loss of control and persistence of compulsive addictive behaviours. In this review, we will give a broad overview of various theories of addiction, animal models of addiction and relapse, drugs of abuse, and the neurobiology of drug dependence and relapse. Although drugs of abuse possess diverse neuropharmacological profiles, activation of the mesocorticolimbic system, particularly the ventral tegmental area, nucleus accumbens, amygdala and prefrontal cortex via dopaminergic and glutamatergic pathways, constitutes a common pathway by which various drugs of abuse mediate their acute reinforcing effects. However, long-term neuroadaptations in this circuitry likely underlie the transition to drug dependence and cycles of relapse. As further elucidated in more comprehensive reviews of various subtopics on addiction in later sections of this special issue, it is anticipated that continued basic neuroscience research will aid in the development of effective therapeutic interventions for the long-term treatment of drug-dependent individuals. PMID:18311189
Contaminant Hazard Reviews. [Reports No. 1-28 on CD-ROM.
Eisler, R.
1998-01-01
This compact disc (CD) contains the first 28 reports in the Contaminant Hazard Reviews (CHR) that were published originally between 1985 and 1994 in the U.S. Department of the Interior Biological Report series. The CD was produced because printed supplies of these reviews--a total of 84,000--became exhausted and demand remained high. Each review was prepared at the request of environmental specialists of the U.S. Fish and Wildlife Service and each contained specific information on mirex, cadmium, carbofuran, toxaphene, selenium, chromium, polychlorinated biphenyls, dioxins, diazinon, mercury, polycyclic aromatic hydrocarbons, arsenic, chlorpyrifos, lead, tin, index issue, pentachlorophenol, atrazine, molybdenum, boron, chlordane, paraquat, cyanide, fenvalerate, diflubenzuron, zinc, famphur, or acrolein. Each report reviewed and synthesized the technical literature on a single contaminant and its effects on terrestrial plants and invertebrates, aquatic plants and animals, avian and mammalian wildlife, and other natural resources. The subtopics include contaminant sources and uses; physical, chemical, and metabolic properties; concentrations in field collections of abiotic materials and living organisms; deficiency effects, where appropriate; lethal and sublethal effects, including effects on survival, growth, reproduction, metabolism, mutagenicity, teratogenicity, and carcinogenicity; proposed criteria for the protection of human health and sensitive natural resources; and recommendations for additional research.
Honzíková, N; Závodná, E
2016-12-13
The increased prevalence of obesity in children and its complications have led to a greater interest in studying baroreflex sensitivity (BRS) in children. This review of BRS in children and adolescents includes subtopics on: 1. Resting values of BRS and their reproducibility, 2. Genetics of BRS, 3. The role of a primarily low BRS and obesity in the development of hypertension, and 4. Association of diabetes mellitus, BRS, and obesity. The conclusions specific to this age follow from this review: 1. The mean heart rate (HR) influences the measurement of BRS. Since the mean HR decreases during adolescence, HR should be taken into account. 2. A genetic dependency of BRS was found. 3. Low BRS values may precede pathological blood-pressure elevation in children with white-coat hypertension. We hypothesize that low BRS plays an active role in the emergence of hypertension in youth. A contribution of obesity to the development of hypertension was also found. We hypothesize that both factors, a primarily low BRS and obesity, are partially independent risk factors for hypertension in youths. 4. In diabetics, a low BRS compared to healthy children can be associated with insulin resistance. A reversibility of the BRS values could be possible after weight loss.
A Survey of Research Performed at NASA Langley Research Center's Impact Dynamics Research Facility
NASA Technical Reports Server (NTRS)
Jackson, K. E.; Fasanella, E. L.
2003-01-01
The Impact Dynamics Research Facility (IDRF) is a 240-ft-high gantry structure located at NASA Langley Research Center in Hampton, Virginia. The facility was originally built in 1963 as a lunar landing simulator, allowing the Apollo astronauts to practice lunar landings under realistic conditions. The IDRF was designated a National Historic Landmark in 1985 based on its significant contributions to the Apollo Program. In 1972, the facility was converted to a full-scale crash test facility for light aircraft and rotorcraft. Since that time, the IDRF has been used to perform a wide variety of impact tests on full-scale aircraft and structural components in support of the General Aviation (GA) aircraft industry, the US Department of Defense, the rotorcraft industry, and NASA in-house aeronautics and space research programs. The objective of this paper is to describe most of the major full-scale crash test programs that were performed at this unique, world-class facility since 1974. The past research is divided into six sub-topics: the civil GA aircraft test program, transport aircraft test program, military test programs, space test programs, basic research, and crash modeling and simulation.
Liu, Chiung-Ju; Rawl, Susan M
2012-01-01
Increasing readability of written cancer prevention information is a fundamental step to increasing awareness and knowledge of cancer screening. Instead of readability formulas, the present study focused on text cohesion, which is the degree to which the text content ties together. The purpose of this study was to examine the effect of text cohesion on reading times, comprehension, and retention of colorectal cancer prevention information. English-speaking adults (50 years of age or older) were recruited from local communities. Participants were randomly assigned to read colorectal cancer prevention subtopics presented at 2 levels of text cohesion: from higher cohesion to lower cohesion, or vice versa. Reading times, word recognition, text comprehension, and recall were assessed after reading. Two weeks later, text comprehension and recall were reassessed. Forty-two adults completed the study, but five were lost to follow-up. Higher text cohesion showed a significant effect on reading times and text comprehension but not on word recognition and recall. No effect of text cohesion was found on text comprehension and recall after 2 weeks. Increasing text cohesion facilitates reading speed and comprehension of colorectal cancer prevention information. Further research on the effect of text cohesion is warranted.
Recent advances in the management of neuropsychiatric symptoms in dementia.
Forlenza, Orestes V; Loureiro, Júlia Cunha; Pais, Marcos Vasconcelos; Stella, Florindo
2017-03-01
The present article addresses intriguing questions related to the clinical intervention in distinct neuropsychiatric syndromes of patients with dementia. We reviewed 154 articles published between 2015 and 2016 targeting psychopharmacological and nonpharmacological interventions, and safety-tolerability concerns. We selected 115 articles addressing the purpose of this study. Of these, 33 were chosen because they were dedicated to subtopics: agitation (42), depression (33), apathy (18), sleep disorders/anxiety (8), and psychosis (4). Clinical studies using both pharmacological (70) and nonpharmacological (37) interventions were considered; others were included for theoretical support. Regarding the methodological design, we found double-blind RCTs (17), single-blinded RCTs (4), open-label studies (18), case reports (5), cross-sectional or cohort studies (25), epidemiological papers (2), and expert reviews (44). This observation raises concerns about the overall methodological adequacy of a substantial proportion of studies in this field, which limits the potential of generalization of the findings. Finally, 18 studies were designed to determine safety-tolerability issues of psychotropic medications (6 were discussed). Effective and well tolerated treatment of neuropsychiatric syndromes in dementia remains a critically unsolved challenge. We understand that this is an extremely important area of research, and critically required to guide clinical decisions in geriatric neuropsychiatry.
NCI Think Tank Concerning the Identifiability of Biospecimens and “-Omic” Data
Weil, Carol J.; Mechanic, Leah E.; Green, Tiffany; Kinsinger, Christopher; Lockhart, Nicole C.; Nelson, Stefanie A.; Rodriguez, Laura L.; Buccini, Laura D.
2014-01-01
On June 11 and 12, 2012, the National Cancer Institute (NCI) hosted a think tank concerning the identifiability of biospecimens and “omic” data in order to explore challenges surrounding this complex and multifaceted topic. The think tank brought together forty-six leaders from several fields, including cancer genomics, bioinformatics, human subject protection, patient advocacy, and commercial genetics. The first day involved presentations regarding the state of the science of re-identification; current and proposed regulatory frameworks for assessing identifiability; developments in law, industry and biotechnology; and the expectations of patients and research participants. The second day was spent by think tank participants in small break-out groups designed to address specific sub-topics under the umbrella issue of identifiability, including considerations for the development of best practices for data sharing and consent, and targeted opportunities for further empirical research. We describe the outcomes of this two day meeting, including two complementary themes that emerged from moderated discussions following the presentations on Day 1, and ideas presented for further empirical research to discern the preferences and concerns of research participants about data sharing and individual identifiability. PMID:23579437
Development of advanced techniques for rotorcraft state estimation and parameter identification
NASA Technical Reports Server (NTRS)
Hall, W. E., Jr.; Bohn, J. G.; Vincent, J. H.
1980-01-01
An integrated methodology for rotorcraft system identification consists of rotorcraft mathematical modeling, three distinct data processing steps, and a technique for designing inputs to improve the identifiability of the data. These elements are as follows: (1) a Kalman filter smoother algorithm which estimates states and sensor errors from error-corrupted data. Gust time histories and statistics may also be estimated; (2) a model structure estimation algorithm for isolating a model which adequately explains the data; (3) a maximum likelihood algorithm for estimating the parameters, along with the variances of these estimates; and (4) an input design algorithm, based on a maximum likelihood approach, which provides inputs to improve the accuracy of parameter estimates. Each step is discussed with examples from both flight and simulated data.
Peeters, Frank; Atamanchuk, Dariia; Tengberg, Anders; Encinas-Fernández, Jorge; Hofmann, Hilmar
2016-01-01
Lake metabolism is a key factor for the understanding of turnover of energy and of organic and inorganic matter in lake ecosystems. Long-term time series on metabolic rates are commonly estimated from diel changes in dissolved oxygen. Here we present long-term data on metabolic rates based on diel changes in total dissolved inorganic carbon (DIC) utilizing an open-water diel CO2-technique. Metabolic rates estimated with this technique and the traditional diel O2-technique agree well in alkaline Lake Illmensee (pH of ~8.5), although the diel changes in molar CO2 concentrations are much smaller than those of the molar O2 concentrations. The open-water diel CO2- and diel O2-techniques provide independent measures of lake metabolic rates that differ in their sensitivity to transport processes. Hence, the combination of both techniques can help to constrain uncertainties arising from assumptions on vertical fluxes due to gas exchange and turbulent diffusion. This is particularly important for estimates of lake respiration rates because these are much more sensitive to assumptions on gradients in vertical fluxes of O2 or DIC than estimates of lake gross primary production. Our data suggest that it can be advantageous to estimate respiration rates assuming negligible gradients in vertical fluxes rather than including gas exchange with the atmosphere but neglecting vertical mixing in the water column. During two months in summer the average lake net production was close to zero suggesting at most slightly autotrophic conditions. However, the lake emitted O2 and CO2 during the entire time period suggesting that O2 and CO2 emissions from lakes can be decoupled from the metabolism in the near surface layer.
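A minimal sketch of the traditional open-water diel O2 technique referenced above: respiration is estimated from the mean nighttime O2 decline, gross primary production from the daytime change plus respiration, assuming respiration is constant over the day. Gas exchange and vertical mixing are deliberately neglected here, which is exactly the kind of assumption the abstract identifies as a dominant source of uncertainty in respiration estimates. The data are synthetic.

```python
# Hedged sketch of the diel O2 metabolism calculation (gas exchange neglected).
import numpy as np

def diel_o2_metabolism(o2_mmol_m3, is_daytime, dt_hours):
    """Return (GPP, R, NEP) in mmol O2 m^-3 d^-1 from one evenly sampled diel cycle."""
    do2_dt = np.diff(o2_mmol_m3) / dt_hours            # mmol m^-3 h^-1
    day = is_daytime[:-1]
    r_rate = -np.mean(do2_dt[~day])                    # hourly respiration (assumed constant)
    daylight_hours = day.sum() * dt_hours
    gpp = (np.mean(do2_dt[day]) + r_rate) * daylight_hours
    r_daily = r_rate * 24.0
    return gpp, r_daily, gpp - r_daily

# Synthetic hourly record over 24 h: net production by day, respiration by night.
hours = np.arange(25)
is_day = (hours % 24 >= 6) & (hours % 24 < 18)
rate = np.where(is_day[:-1], 2.0, -1.0)                # mmol m^-3 h^-1
o2 = 300.0 + np.concatenate(([0.0], np.cumsum(rate)))
print(diel_o2_metabolism(o2, is_day, dt_hours=1.0))
```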
Heer, D M; Passel, J F
1987-01-01
This article compares 2 different methods for estimating the number of undocumented Mexican adults in Los Angeles County. The 1st method, the survey-based method, uses a combination of 1980 census data and the results of a survey conducted in Los Angeles County in 1980 and 1981. A sample was selected from babies born in Los Angeles County who had a mother or father of Mexican origin. The survey included questions about the legal status of the baby's parents and certain other relatives. The resulting estimates of undocumented Mexican immigrants are for males aged 18-44 and females aged 18-39. The 2nd method, the residual method, involves comparison of census figures for aliens counted with estimates of legally-resident aliens developed principally with data from the Immigration and Naturalization Service (INS). For this study, estimates by age, sex, and period of entry were produced for persons born in Mexico and living in Los Angeles County. The results of this research indicate that it is possible to measure undocumented immigration with different techniques, yet obtain results that are similar. Both techniques presented here are limited in that they represent estimates of undocumented aliens based on the 1980 census. The number of additional undocumented aliens not counted remains a subject of conjecture. The fact that the proportions undocumented shown in the survey (228,700) are quite similar to the residual estimates (317,800) suggests that the number of undocumented aliens not counted in the census may not be an extremely large fraction of the undocumented population. The survey-based estimates have some significant advantages over the residual estimates. The survey provides tabulations of the undocumented population by characteristics other than the limited demographic information provided by the residual technique. On the other hand, the survey-based estimates require that a survey be conducted and, if national or regional estimates are called for, they may require a number of surveys. The residual technique, however, also requires a data source other than the census. However, the INS discontinued the annual registration of aliens after 1981. Thus, estimates of undocumented aliens based on the residual technique will probably not be possible for subnational areas using the 1990 census unless the registration program is reinstituted. Perhaps the best information on the undocumented population in the 1990 census will come from an improved version of the survey-based technique described here applied in selected local areas.
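The residual method described above reduces to simple arithmetic once the two inputs are in hand. The sketch below illustrates that arithmetic only; the figures are placeholders, not the study's counts.

```python
# Toy arithmetic of the residual method: undocumented count = census count of
# foreign-born residents minus an independent estimate of legally resident
# aliens built from registration data. Placeholder numbers, not study figures.
def residual_undocumented(census_count_aliens, estimated_legal_residents):
    return max(census_count_aliens - estimated_legal_residents, 0)

print(residual_undocumented(census_count_aliens=900_000,
                            estimated_legal_residents=580_000))
```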
Center of pressure based segment inertial parameters validation
Rezzoug, Nasser; Gorce, Philippe; Isableu, Brice; Venture, Gentiane
2017-01-01
By proposing efficient methods for estimating Body Segment Inertial Parameters (BSIP) and validating them with a force plate, it is possible to improve the inverse dynamic computations that are necessary in multiple research areas. Until today, a variety of studies have been conducted to improve BSIP estimation, but to our knowledge a real validation has never been completely successful. In this paper, we propose a validation method using both kinematic and kinetic parameters (contact forces) gathered from an optical motion capture system and a force plate, respectively. To compare BSIPs, we used the measured contact forces (force plate) as the ground truth, and reconstructed the displacements of the Center of Pressure (COP) using inverse dynamics from two different estimation techniques. Only minor differences were seen when comparing the estimated segment masses. Their influence on the COP computation, however, is large, and the results show very distinguishable patterns in the COP movements. Improving BSIP estimation techniques is crucial, as deviations from the estimations can result in large errors. This method could be used as a tool to validate BSIP estimation techniques. An advantage of this approach is that it facilitates the comparison between BSIP estimation methods and, more specifically, it shows the accuracy of those parameters. PMID:28662090
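A minimal sketch of the force-plate side of this validation: the measured centre of pressure that serves as ground truth follows from the plate's force and moment readings. Sign conventions and the surface offset z0 depend on the specific plate; the ones below are a common choice, not necessarily the study's.

```python
# Hedged sketch: centre of pressure from force-plate forces and moments.
import numpy as np

def center_of_pressure(forces, moments, z0=0.0):
    """forces, moments: arrays of shape (n, 3) in the plate frame; returns (n, 2)."""
    fx, fy, fz = forces.T
    mx, my, _ = moments.T
    cop_x = (-my - fx * z0) / fz
    cop_y = (mx - fy * z0) / fz
    return np.column_stack([cop_x, cop_y])

forces = np.array([[5.0, -3.0, 700.0]])      # N, single sample
moments = np.array([[20.0, -15.0, 1.0]])     # N m
print(center_of_pressure(forces, moments, z0=0.04))
```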
Adaptive neuro fuzzy inference system-based power estimation method for CMOS VLSI circuits
NASA Astrophysics Data System (ADS)
Vellingiri, Govindaraj; Jayabalan, Ramesh
2018-03-01
Recent advancements in very large scale integration (VLSI) technologies have made it feasible to integrate millions of transistors on a single chip. This greatly increases the circuit complexity, and hence there is a growing need for less tedious and low-cost power estimation techniques. The proposed work employs a Back-Propagation Neural Network (BPNN) and an Adaptive Neuro Fuzzy Inference System (ANFIS), which are capable of estimating the power precisely for complementary metal oxide semiconductor (CMOS) VLSI circuits, without requiring any knowledge of circuit structure and interconnections. The application of ANFIS to power estimation is relatively new. Power estimation using ANFIS is carried out by creating initial FIS models using hybrid optimisation and back-propagation (BP) techniques employing constant and linear methods. It is inferred that ANFIS with the hybrid optimisation technique employing the linear method produces better results than BPNN, with a testing error that varies from 0% to 0.86%, as it takes the initial fuzzy model and tunes it by means of a hybrid technique combining gradient-descent BP and least-squares optimisation algorithms. ANFIS is thus best suited for the power estimation application, with a low RMSE of 0.0002075 and a high coefficient of determination (R) of 0.99961.
A technique for estimating 4D-CBCT using prior knowledge and limited-angle projections.
Zhang, You; Yin, Fang-Fang; Segars, W Paul; Ren, Lei
2013-12-01
To develop a technique to estimate onboard 4D-CBCT using prior information and limited-angle projections for potential 4D target verification of lung radiotherapy. Each phase of onboard 4D-CBCT is considered as a deformation from one selected phase (prior volume) of the planning 4D-CT. The deformation field maps (DFMs) are solved using a motion modeling and free-form deformation (MM-FD) technique. In the MM-FD technique, the DFMs are estimated using a motion model which is extracted from planning 4D-CT based on principal component analysis (PCA). The motion model parameters are optimized by matching the digitally reconstructed radiographs of the deformed volumes to the limited-angle onboard projections (data fidelity constraint). Afterward, the estimated DFMs are fine-tuned using a FD model based on data fidelity constraint and deformation energy minimization. The 4D digital extended-cardiac-torso phantom was used to evaluate the MM-FD technique. A lung patient with a 30 mm diameter lesion was simulated with various anatomical and respirational changes from planning 4D-CT to onboard volume, including changes of respiration amplitude, lesion size and lesion average-position, and phase shift between lesion and body respiratory cycle. The lesions were contoured in both the estimated and "ground-truth" onboard 4D-CBCT for comparison. 3D volume percentage-difference (VPD) and center-of-mass shift (COMS) were calculated to evaluate the estimation accuracy of three techniques: MM-FD, MM-only, and FD-only. Different onboard projection acquisition scenarios and projection noise levels were simulated to investigate their effects on the estimation accuracy. For all simulated patient and projection acquisition scenarios, the mean VPD (±S.D.)∕COMS (±S.D.) between lesions in prior images and "ground-truth" onboard images were 136.11% (±42.76%)∕15.5 mm (±3.9 mm). Using orthogonal-view 15°-each scan angle, the mean VPD∕COMS between the lesion in estimated and "ground-truth" onboard images for MM-only, FD-only, and MM-FD techniques were 60.10% (±27.17%)∕4.9 mm (±3.0 mm), 96.07% (±31.48%)∕12.1 mm (±3.9 mm) and 11.45% (±9.37%)∕1.3 mm (±1.3 mm), respectively. For orthogonal-view 30°-each scan angle, the corresponding results were 59.16% (±26.66%)∕4.9 mm (±3.0 mm), 75.98% (±27.21%)∕9.9 mm (±4.0 mm), and 5.22% (±2.12%)∕0.5 mm (±0.4 mm). For single-view scan angles of 3°, 30°, and 60°, the results for MM-FD technique were 32.77% (±17.87%)∕3.2 mm (±2.2 mm), 24.57% (±18.18%)∕2.9 mm (±2.0 mm), and 10.48% (±9.50%)∕1.1 mm (±1.3 mm), respectively. For projection angular-sampling-intervals of 0.6°, 1.2°, and 2.5° with the orthogonal-view 30°-each scan angle, the MM-FD technique generated similar VPD (maximum deviation 2.91%) and COMS (maximum deviation 0.6 mm), while sparser sampling yielded larger VPD∕COMS. With equal number of projections, the estimation results using scattered 360° scan angle were slightly better than those using orthogonal-view 30°-each scan angle. The estimation accuracy of MM-FD technique declined as noise level increased. The MM-FD technique substantially improves the estimation accuracy for onboard 4D-CBCT using prior planning 4D-CT and limited-angle projections, compared to the MM-only and FD-only techniques. It can potentially be used for the inter/intrafractional 4D-localization verification.
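A small evaluation-metric sketch for the comparison above: the centre-of-mass shift (COMS) between a "ground-truth" and an estimated binary lesion mask, plus a volume percentage difference computed as (union minus intersection) relative to the ground-truth volume. The exact VPD normalisation is not spelled out in the abstract, so treat that part as an assumption; the masks are synthetic.

```python
# Hedged sketch: COMS and a VPD-style metric between two binary lesion masks.
import numpy as np

def coms_mm(mask_true, mask_est, voxel_size_mm):
    com = lambda m: np.array(np.nonzero(m)).mean(axis=1) * np.asarray(voxel_size_mm)
    return np.linalg.norm(com(mask_true) - com(mask_est))

def vpd_percent(mask_true, mask_est):
    union = np.logical_or(mask_true, mask_est).sum()
    inter = np.logical_and(mask_true, mask_est).sum()
    return 100.0 * (union - inter) / mask_true.sum()   # assumed normalisation

truth = np.zeros((40, 40, 40), dtype=bool)
truth[15:25, 15:25, 15:25] = True
est = np.zeros_like(truth)
est[17:27, 15:25, 15:25] = True                        # lesion shifted by 2 voxels
print(coms_mm(truth, est, (1.0, 1.0, 1.0)), vpd_percent(truth, est))
```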
Ronald E. McRoberts; Steen Magnussen; Erkki O. Tomppo; Gherardo Chirici
2011-01-01
Nearest neighbors techniques have been shown to be useful for estimating forest attributes, particularly when used with forest inventory and satellite image data. Published reports of positive results have been truly international in scope. However, for these techniques to be more useful, they must be able to contribute to scientific inference which, for sample-based...
2011-01-01
…makes remote sensing an attractive technique for estimating LAI. Many vegetation indices, such as the Normalized Difference Vegetation Index (NDVI), tend to saturate at… little or no improvement over NDVI. Furthermore, indirect ground-sampling techniques often used to evaluate the potential of vegetation indices also…
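For reference, the index discussed in the snippet above is a simple band ratio; it saturates at high leaf area index because the red reflectance stops decreasing once the canopy closes. The reflectance values below are placeholders.

```python
# NDVI = (NIR - Red) / (NIR + Red); values flatten as canopies close.
import numpy as np

def ndvi(nir, red):
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

print(ndvi([0.45, 0.50, 0.52], [0.08, 0.05, 0.04]))  # approaches ~0.85 and flattens
```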
Simplified Estimation and Testing in Unbalanced Repeated Measures Designs.
Spiess, Martin; Jordan, Pascal; Wendt, Mike
2018-05-07
In this paper we propose a simple estimator for unbalanced repeated measures design models where each unit is observed at least once in each cell of the experimental design. The estimator does not require a model of the error covariance structure. Thus, circularity of the error covariance matrix and estimation of correlation parameters and variances are not necessary. Together with a weak assumption about the reason for the varying number of observations, the proposed estimator and its variance estimator are unbiased. As an alternative to confidence intervals based on the normality assumption, a bias-corrected and accelerated bootstrap technique is considered. We also propose the naive percentile bootstrap for Wald-type tests where the standard Wald test may break down when the number of observations is small relative to the number of parameters to be estimated. In a simulation study we illustrate the properties of the estimator and the bootstrap techniques to calculate confidence intervals and conduct hypothesis tests in small and large samples under normality and non-normality of the errors. The results imply that the simple estimator is only slightly less efficient than an estimator that correctly assumes a block structure of the error correlation matrix, a special case of which is an equi-correlation matrix. Application of the estimator and the bootstrap technique is illustrated using data from a task switch experiment based on a within-subjects experimental design with 32 cells and 33 participants.
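A sketch of the naive percentile bootstrap mentioned above, applied to a simple cell-mean difference. This is not the paper's full repeated-measures estimator, only an illustration of how percentile limits are read off the resampled statistic; the data are synthetic.

```python
# Hedged sketch: naive percentile bootstrap confidence interval for a mean difference.
import numpy as np

def percentile_bootstrap_ci(x, y, stat=lambda a, b: a.mean() - b.mean(),
                            n_boot=5000, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    boot = np.array([stat(rng.choice(x, size=x.size, replace=True),
                          rng.choice(y, size=y.size, replace=True))
                     for _ in range(n_boot)])
    return np.quantile(boot, [alpha / 2, 1 - alpha / 2])

rng = np.random.default_rng(1)
x = rng.normal(1.0, 1.0, size=20)       # e.g. per-participant means in condition A
y = rng.normal(0.4, 1.0, size=20)       # condition B
print(percentile_bootstrap_ci(x, y))
```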
Digital synchronization and communication techniques
NASA Technical Reports Server (NTRS)
Lindsey, William C.
1992-01-01
Information on digital synchronization and communication techniques is given in viewgraph form. Topics covered include phase shift keying, modems, characteristics of open loop digital synchronizers, an open loop phase and frequency estimator, and a digital receiver structure using an open loop estimator in a decision directed architecture.
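A minimal sketch of an open-loop (feedforward) frequency and phase estimator of the kind listed above, for an unmodulated complex tone in noise: take the FFT, pick the peak bin for frequency, and read the phase from that bin's argument. A PSK signal would first need its modulation removed (e.g., by an M-th power nonlinearity), which is omitted here; the signal parameters are placeholders.

```python
# Hedged sketch: open-loop frequency/phase estimation via the periodogram peak.
import numpy as np

def open_loop_estimate(samples, fs):
    n = samples.size
    spectrum = np.fft.fft(samples)
    k = np.argmax(np.abs(spectrum))
    freq = np.fft.fftfreq(n, d=1.0 / fs)[k]   # coarse frequency estimate
    phase = np.angle(spectrum[k])             # phase estimate at the peak bin
    return freq, phase

fs = 1000.0
t = np.arange(1000) / fs
rng = np.random.default_rng(0)
signal = np.exp(1j * (2 * np.pi * 50.0 * t + 0.7))
signal = signal + 0.1 * (rng.standard_normal(1000) + 1j * rng.standard_normal(1000))
print(open_loop_estimate(signal, fs))          # ~ (50.0 Hz, 0.7 rad)
```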
NASA Astrophysics Data System (ADS)
Trirongjitmoah, Suchin; Iinaga, Kazuya; Sakurai, Toshihiro; Chiba, Hitoshi; Sriyudthsak, Mana; Shimizu, Koichi
2016-04-01
Quantification of small, dense low-density lipoprotein (sdLDL) cholesterol is clinically significant. We propose a practical technique to estimate the amount of sdLDL cholesterol using dynamic light scattering (DLS). An analytical solution in a closed form has newly been obtained to estimate the weight fraction of one species of scatterers in the DLS measurement of two species of scatterers. Using this solution, we can quantify the sdLDL cholesterol amount from the amounts of the low-density lipoprotein cholesterol and the high-density lipoprotein (HDL) cholesterol, which are commonly obtained through clinical tests. The accuracy of the proposed technique was confirmed experimentally using latex spheres with known size distributions. The applicability of the proposed technique was examined using samples of human blood serum. The possibility of estimating the sdLDL amount using the HDL data was demonstrated. These results suggest that the quantitative estimation of sdLDL amounts using DLS is feasible for point-of-care testing in clinical practice.
NASA Astrophysics Data System (ADS)
Piñero, G.; Vergara, L.; Desantes, J. M.; Broatch, A.
2000-11-01
The knowledge of the particle velocity fluctuations associated with acoustic pressure oscillation in the exhaust system of internal combustion engines may represent a powerful aid in the design of such systems, from the point of view of both engine performance improvement and exhaust noise abatement. However, usual velocity measurement techniques, even if applicable, are not well suited to the aggressive environment existing in exhaust systems. In this paper, a method to obtain a suitable estimate of velocity fluctuations is proposed, which is based on the application of spatial filtering (beamforming) techniques to instantaneous pressure measurements. Making use of simulated pressure-time histories, several algorithms have been checked by comparison between the simulated and the estimated velocity fluctuations. Then, problems related to the experimental procedure and associated with the proposed methodology are addressed, making application to measurements made in a real exhaust system. The results indicate that, if proper care is taken when performing the measurements, the application of beamforming techniques gives a reasonable estimate of the velocity fluctuations.
Estimation of Stratospheric Age Spectrum from Chemical Tracers
NASA Technical Reports Server (NTRS)
Schoeberl, Mark R.; Douglass, Anne R.; Polansky, Brian
2005-01-01
We have developed a technique to diagnose the stratospheric age spectrum and estimate the mean age of air using the distributions of at least four constituents with different photochemical lifetimes. We demonstrate that the technique works using a 3D CTM and then apply the technique to UARS CLAES January 1993 observations of CFC11, CFC12, CH4 and N2O. Our results are generally in agreement with mean age of air estimates from the chemical model and from observations of SF6 and CO2; however, the mean age estimates show an intrusion of very young tropical air into the mid-latitude stratosphere. This feature is consistent with mixing of high-N2O air out of the tropics during the westerly phase of the QBO.
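For contrast with the multi-tracer age-spectrum technique above, the simplest single-tracer mean-age estimate is sketched below: for a tracer whose tropical-entry mixing ratio grows linearly in time, the mean age is the time lag between the entry value and the observed value. The spectrum-width correction that the four-tracer method helps constrain is ignored here, and the numbers are placeholders.

```python
# Hedged sketch: mean age of air from a linearly increasing tracer (lag-time method).
def mean_age_years(entry_value_now, observed_value, entry_growth_per_year):
    return (entry_value_now - observed_value) / entry_growth_per_year

# e.g. an SF6-like tracer: 5.0 pptv at the tropical entry point today, 4.2 pptv
# observed in the mid-latitude stratosphere, growing 0.2 pptv per year.
print(mean_age_years(5.0, 4.2, 0.2))   # -> 4.0 years
```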
NASA Astrophysics Data System (ADS)
Berger, Lukas; Kleinheinz, Konstantin; Attili, Antonio; Bisetti, Fabrizio; Pitsch, Heinz; Mueller, Michael E.
2018-05-01
Modelling unclosed terms in partial differential equations typically involves two steps: First, a set of known quantities needs to be specified as input parameters for a model, and second, a specific functional form needs to be defined to model the unclosed terms by the input parameters. Both steps involve a certain modelling error, with the former known as the irreducible error and the latter referred to as the functional error. Typically, only the total modelling error, which is the sum of functional and irreducible error, is assessed, but the concept of the optimal estimator enables the separate analysis of the total and the irreducible errors, yielding a systematic modelling error decomposition. In this work, attention is paid to the techniques themselves required for the practical computation of irreducible errors. Typically, histograms are used for optimal estimator analyses, but this technique is found to add a non-negligible spurious contribution to the irreducible error if models with multiple input parameters are assessed. Thus, the error decomposition of an optimal estimator analysis becomes inaccurate, and misleading conclusions concerning modelling errors may be drawn. In this work, numerically accurate techniques for optimal estimator analyses are identified and a suitable evaluation of irreducible errors is presented. Four different computational techniques are considered: a histogram technique, artificial neural networks, multivariate adaptive regression splines, and an additive model based on a kernel method. For multiple input parameter models, only artificial neural networks and multivariate adaptive regression splines are found to yield satisfactorily accurate results. Beyond a certain number of input parameters, the assessment of models in an optimal estimator analysis even becomes practically infeasible if histograms are used. The optimal estimator analysis in this paper is applied to modelling the filtered soot intermittency in large eddy simulations using a dataset of a direct numerical simulation of a non-premixed sooting turbulent flame.
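A minimal sketch of the histogram (binning) variant of an optimal estimator analysis for a single input parameter: the conditional mean E[q | parameter] is approximated by averaging q within bins of the parameter, and the irreducible error is the mean-squared deviation of q about that conditional mean. With several input parameters this binning approach develops the spurious error contribution the paper warns about; the data here are synthetic.

```python
# Hedged sketch: irreducible error via conditional averaging in bins (1D case).
import numpy as np

def irreducible_error_1d(param, q, n_bins=32):
    edges = np.linspace(param.min(), param.max(), n_bins + 1)
    idx = np.clip(np.digitize(param, edges) - 1, 0, n_bins - 1)
    cond_mean = np.array([q[idx == b].mean() if np.any(idx == b) else 0.0
                          for b in range(n_bins)])
    return np.mean((q - cond_mean[idx]) ** 2)

rng = np.random.default_rng(0)
phi = rng.uniform(0, 1, 100_000)                          # model input parameter
q = np.sin(2 * np.pi * phi) + 0.1 * rng.standard_normal(phi.size)
print(irreducible_error_1d(phi, q))                       # close to the noise variance 0.01
```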
Importance of Geosat orbit and tidal errors in the estimation of large-scale Indian Ocean variations
NASA Technical Reports Server (NTRS)
Perigaud, Claire; Zlotnicki, Victor
1992-01-01
To improve the accuracy of estimates of large-scale meridional sea-level variations, Geosat ERM data over the Indian Ocean for a 26-month period were processed using two different techniques of orbit error reduction. The first technique removes an along-track polynomial of degree 1 over about 5000 km and the second technique removes an along-track once-per-revolution sine wave over about 40,000 km. Results obtained show that the polynomial technique produces stronger attenuation of both the tidal error and the large-scale oceanic signal. After filtering, the residual difference between the two methods represents 44 percent of the total variance and 23 percent of the annual variance. The sine-wave method yields a larger estimate of annual and interannual meridional variations.
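A minimal sketch of the two orbit-error reduction techniques compared above, applied to one pass of along-track sea-surface-height residuals: (a) remove a fitted degree-1 polynomial over the arc, (b) remove a fitted once-per-revolution sine wave. Both fits are ordinary least squares, and the data are synthetic placeholders.

```python
# Hedged sketch: along-track orbit-error removal by polynomial vs once-per-rev sine fit.
import numpy as np

def remove_polynomial_deg1(along_track_km, ssh):
    A = np.column_stack([np.ones_like(along_track_km), along_track_km])
    coeffs, *_ = np.linalg.lstsq(A, ssh, rcond=None)
    return ssh - A @ coeffs

def remove_once_per_rev(along_track_km, ssh, rev_length_km=40000.0):
    phase = 2 * np.pi * along_track_km / rev_length_km
    A = np.column_stack([np.ones_like(phase), np.sin(phase), np.cos(phase)])
    coeffs, *_ = np.linalg.lstsq(A, ssh, rcond=None)
    return ssh - A @ coeffs

x = np.linspace(0.0, 5000.0, 800)                           # km along one arc
ssh = 0.05 * np.sin(2 * np.pi * x / 40000.0) + 0.02 * np.sin(2 * np.pi * x / 1000.0)
print(remove_polynomial_deg1(x, ssh).std(), remove_once_per_rev(x, ssh).std())
```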
Congestion estimation technique in the optical network unit registration process.
Kim, Geunyong; Yoo, Hark; Lee, Dongsoo; Kim, Youngsun; Lim, Hyuk
2016-07-01
We present a congestion estimation technique (CET) to estimate the optical network unit (ONU) registration success ratio for the ONU registration process in passive optical networks. An optical line terminal (OLT) estimates the number of collided ONUs via the proposed scheme during the serial number state. The OLT can obtain congestion level among ONUs to be registered such that this information may be exploited to change the size of a quiet window to decrease the collision probability. We verified the efficiency of the proposed method through simulation and experimental results.
Software for the grouped optimal aggregation technique
NASA Technical Reports Server (NTRS)
Brown, P. M.; Shaw, G. W. (Principal Investigator)
1982-01-01
The grouped optimal aggregation technique produces minimum variance, unbiased estimates of acreage and production for countries, zones (states), or any designated collection of acreage strata. It uses yield predictions, historical acreage information, and direct acreage estimates from satellite data. The acreage strata are grouped in such a way that the ratio model over historical acreage provides a smaller variance than if the model were applied to each individual stratum. An optimal weighting matrix, based on historical acreages, provides the link between incomplete direct acreage estimates and the total current acreage estimate.
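A simplified sketch in the spirit of the grouped ratio model described above: strata are grouped, the ratio of satellite-derived to historical acreage is estimated from the sampled strata within each group, and that ratio is applied to the group's full historical acreage. The actual technique's optimal weighting matrix and variance bookkeeping are omitted, and the acreages are placeholders.

```python
# Hedged sketch: grouped ratio estimator for current acreage from partial satellite data.
import numpy as np

def grouped_ratio_estimate(historical, direct, sampled, groups):
    """historical, direct: per-stratum acreages; sampled: bool mask of strata with
    direct (satellite) estimates; groups: integer group label per stratum."""
    historical, direct = np.asarray(historical, float), np.asarray(direct, float)
    sampled, groups = np.asarray(sampled, bool), np.asarray(groups)
    total = 0.0
    for g in np.unique(groups):
        in_group = groups == g
        in_sample = in_group & sampled
        ratio = direct[in_sample].sum() / historical[in_sample].sum()
        total += ratio * historical[in_group].sum()
    return total

hist = [100.0, 120.0, 90.0, 200.0]
direct = [110.0, 0.0, 99.0, 0.0]          # only strata 0 and 2 were sampled
print(grouped_ratio_estimate(hist, direct, sampled=[1, 0, 1, 0], groups=[0, 0, 1, 1]))
```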
Kroll, Lars Eric; Schumann, Maria; Müters, Stephan; Lampert, Thomas
2017-12-01
Nationwide health surveys can be used to estimate regional differences in health. Using traditional estimation techniques, the spatial depth for these estimates is limited due to the constrained sample size. So far - without special refreshment samples - results have only been available for the more populous federal states of Germany. An alternative is regression-based small-area estimation techniques. These models can generate smaller-scale data, but are also subject to greater statistical uncertainties because of the model assumptions. In the present article, exemplary regionalized results based on the studies "Gesundheit in Deutschland aktuell" (GEDA studies) 2009, 2010 and 2012, are compared for the self-rated health status of the respondents. The aim of the article is to analyze the range of regional estimates in order to assess the usefulness of the techniques for health reporting more adequately. The results show that the estimated prevalence is relatively stable when using different samples. Important determinants of the variation of the estimates are the achieved sample size at the district level and the type of the district (cities vs. rural regions). Overall, the present study shows that small-area modeling of prevalence is associated with additional uncertainties compared to conventional estimates, which should be taken into account when interpreting the corresponding findings.
NASA Astrophysics Data System (ADS)
Lee, T. R.; Wood, W. T.; Dale, J.
2017-12-01
Empirical and theoretical models of sub-seafloor organic matter transformation, degradation and methanogenesis require estimates of initial seafloor total organic carbon (TOC). This subsurface methane, under the appropriate geophysical and geochemical conditions may manifest as methane hydrate deposits. Despite the importance of seafloor TOC, actual observations of TOC in the world's oceans are sparse and large regions of the seafloor yet remain unmeasured. To provide an estimate in areas where observations are limited or non-existent, we have implemented interpolation techniques that rely on existing data sets. Recent geospatial analyses have provided accurate accounts of global geophysical and geochemical properties (e.g. crustal heat flow, seafloor biomass, porosity) through machine learning interpolation techniques. These techniques find correlations between the desired quantity (in this case TOC) and other quantities (predictors, e.g. bathymetry, distance from coast, etc.) that are more widely known. Predictions (with uncertainties) of seafloor TOC in regions lacking direct observations are made based on the correlations. Global distribution of seafloor TOC at 1 x 1 arc-degree resolution was estimated from a dataset of seafloor TOC compiled by Seiter et al. [2004] and a non-parametric (i.e. data-driven) machine learning algorithm, specifically k-nearest neighbors (KNN). Built-in predictor selection and a ten-fold validation technique generated statistically optimal estimates of seafloor TOC and uncertainties. In addition, inexperience was estimated. Inexperience is effectively the distance in parameter space to the single nearest neighbor, and it indicates geographic locations where future data collection would most benefit prediction accuracy. These improved geospatial estimates of TOC in data deficient areas will provide new constraints on methane production and subsequent methane hydrate accumulation.
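A minimal sketch of the data-driven interpolation described above: a k-nearest-neighbour regressor trained on seafloor TOC observations with a few geophysical predictors, scored by ten-fold cross-validation. The predictor names and values are placeholders for the Seiter et al. compilation and the gridded covariates the study actually uses.

```python
# Hedged sketch: KNN regression of seafloor TOC on geophysical predictors, 10-fold CV.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(1500, 3))           # e.g. depth, distance to coast, surface productivity
toc = 1.0 + 0.5 * X[:, 1] - 0.3 * X[:, 0] + 0.1 * rng.standard_normal(1500)

model = make_pipeline(StandardScaler(), KNeighborsRegressor(n_neighbors=10))
scores = cross_val_score(model, X, toc, cv=10, scoring="r2")
print(scores.mean())
```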
NASA Technical Reports Server (NTRS)
Parada, N. D. J. (Principal Investigator); Deassuncao, G. V.; Moreira, M. A.; Novaes, R. A.
1984-01-01
The development of a methodology for annual estimates of irrigated rice crop in the State of Rio Grande do Sul, Brazil, using remote sensing techniques is proposed. The project involves interpretation, digital analysis, and sampling techniques of LANDSAT imagery. Results are discussed from a preliminary phase for identifying and evaluating irrigated rice crop areas in four counties of the State, for the crop year 1982/1983. This first phase involved just visual interpretation techniques of MSS/LANDSAT images.
NASA Technical Reports Server (NTRS)
Green, R. N.
1981-01-01
The shape factor, parameter estimation, and deconvolution data analysis techniques were applied to the same set of Earth-emitted radiation measurements to determine the effects of different techniques on the estimated radiation field. All three techniques are defined and their assumptions, advantages, and disadvantages are discussed. Their results are compared globally, zonally, regionally, and on a spatial spectrum basis. The standard deviations of the regional differences in the derived radiant exitance varied from 7.4 W/m2 to 13.5 W/m2.
Image enhancement and advanced information extraction techniques for ERTS-1 data
NASA Technical Reports Server (NTRS)
Malila, W. A. (Principal Investigator); Nalepka, R. F.; Sarno, J. E.
1975-01-01
The author has identified the following significant results. It was demonstrated and concluded that: (1) the atmosphere has significant effects on ERTS MSS data which can seriously degrade recognition performance; (2) the application of selected signature extension techniques serves to reduce the deleterious effects of both the atmosphere and changing ground conditions on recognition performance; and (3) a proportion estimation algorithm, developed to overcome problems in acreage estimation accuracy resulting from the coarse spatial resolution of the ERTS MSS, was able to significantly improve acreage estimation accuracy over that achievable by conventional techniques, especially for high contrast targets such as lakes and ponds.
Automatic Estimation of Verified Floating-Point Round-Off Errors via Static Analysis
NASA Technical Reports Server (NTRS)
Moscato, Mariano; Titolo, Laura; Dutle, Aaron; Munoz, Cesar A.
2017-01-01
This paper introduces a static analysis technique for computing formally verified round-off error bounds of floating-point functional expressions. The technique is based on a denotational semantics that computes a symbolic estimation of floating-point round-off errors along with a proof certificate that ensures its correctness. The symbolic estimation can be evaluated on concrete inputs using rigorous enclosure methods to produce formally verified numerical error bounds. The proposed technique is implemented in the prototype research tool PRECiSA (Program Round-off Error Certifier via Static Analysis) and used in the verification of floating-point programs of interest to NASA.
Weak value amplification considered harmful
NASA Astrophysics Data System (ADS)
Ferrie, Christopher; Combes, Joshua
2014-03-01
We show using statistically rigorous arguments that the technique of weak value amplification does not perform better than standard statistical techniques for the tasks of parameter estimation and signal detection. We show that using all data and considering the joint distribution of all measurement outcomes yields the optimal estimator. Moreover, we show that estimation using the maximum likelihood technique with weak values as small as possible produces better performance for quantum metrology. In doing so, we identify the optimal experimental arrangement to be the one which reveals the maximal eigenvalue of the square of system observables. We also show that these conclusions do not change in the presence of technical noise.
NASA Technical Reports Server (NTRS)
Cornish, C. R.
1983-01-01
Following reception and analog-to-digital (A/D) conversion, atmospheric radar backscatter echoes need to be processed so as to obtain desired information about atmospheric processes and to eliminate or minimize contaminating contributions from other sources. Various signal processing techniques have been implemented at mesosphere-stratosphere-troposphere (MST) radar facilities to estimate parameters of interest from received spectra. Such estimation techniques need to be both accurate and sufficiently efficient to be within the capabilities of the particular data-processing system. The various techniques used to parameterize the spectra of received signals are reviewed herein. Noise estimation, electromagnetic interference, data smoothing, correlation, and the Doppler effect are among the specific points addressed.
A three-dimensional muscle activity imaging technique for assessing pelvic muscle function
NASA Astrophysics Data System (ADS)
Zhang, Yingchun; Wang, Dan; Timm, Gerald W.
2010-11-01
A novel multi-channel surface electromyography (EMG)-based three-dimensional muscle activity imaging (MAI) technique has been developed by combining the bioelectrical source reconstruction approach and subject-specific finite element modeling approach. Internal muscle activities are modeled by a current density distribution and estimated from the intra-vaginal surface EMG signals with the aid of a weighted minimum norm estimation algorithm. The MAI technique was employed to minimally invasively reconstruct electrical activity in the pelvic floor muscles and urethral sphincter from multi-channel intra-vaginal surface EMG recordings. A series of computer simulations were conducted to evaluate the performance of the present MAI technique. With appropriate numerical modeling and inverse estimation techniques, we have demonstrated the capability of the MAI technique to accurately reconstruct internal muscle activities from surface EMG recordings. This MAI technique combined with traditional EMG signal analysis techniques is being used to study etiologic factors associated with stress urinary incontinence in women by correlating functional status of muscles characterized from the intra-vaginal surface EMG measurements with the specific pelvic muscle groups that generated these signals. The developed MAI technique described herein holds promise for eliminating the need to place needle electrodes into muscles to obtain accurate EMG recordings in some clinical applications.
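For readers unfamiliar with weighted minimum norm estimation, the following is a minimal sketch of a Tikhonov-regularized, depth-weighted minimum norm inverse of the general form used in such source reconstruction; the lead-field matrix, weighting, and regularization constant here are toy stand-ins, not the study's subject-specific finite element model.

```python
import numpy as np

def weighted_minimum_norm(L, v, W=None, alpha=1e-2):
    """Estimate a current density distribution j from surface EMG potentials v.

    Solves min_j ||v - L j||^2 + alpha * ||W j||^2, whose closed form is
    j = (L^T L + alpha W^T W)^{-1} L^T v.  L is the lead-field matrix mapping
    sources to electrodes (here assumed to come from a forward model).
    """
    if W is None:
        # A common depth-weighting choice: normalize by lead-field column norms.
        W = np.diag(np.linalg.norm(L, axis=0))
    A = L.T @ L + alpha * (W.T @ W)
    return np.linalg.solve(A, L.T @ v)

# Toy example with a random 16-electrode, 100-source lead field.
rng = np.random.default_rng(1)
L = rng.standard_normal((16, 100))
j_true = np.zeros(100)
j_true[40:45] = 1.0                              # a small active patch
v = L @ j_true + 0.01 * rng.standard_normal(16)  # noisy surface recordings
j_est = weighted_minimum_norm(L, v)
```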
NASA Astrophysics Data System (ADS)
Li, Dong; Cheng, Tao; Zhou, Kai; Zheng, Hengbiao; Yao, Xia; Tian, Yongchao; Zhu, Yan; Cao, Weixing
2017-07-01
Red edge position (REP), defined as the wavelength of the inflexion point in the red edge region (680-760 nm) of the reflectance spectrum, has been widely used to estimate foliar chlorophyll content from reflectance spectra. A number of techniques have been developed for REP extraction in the past three decades, but most of them require data-specific parameterization and the consistence of their performance from leaf to canopy levels remains poorly understood. In this study, we propose a new technique (WREP) to extract REPs based on the application of continuous wavelet transform to reflectance spectra. The REP is determined by the zero-crossing wavelength in the red edge region of a wavelet transformed spectrum for a number of scales of wavelet decomposition. The new technique is simple to implement and requires no parameterization from the user as long as continuous wavelet transforms are applied to reflectance spectra. Its performance was evaluated for estimating leaf chlorophyll content (LCC) and canopy chlorophyll content (CCC) of cereal crops (i.e. rice and wheat) and compared with traditional techniques including linear interpolation, linear extrapolation, polynomial fitting and inverted Gaussian. Our results demonstrated that WREP obtained the best estimation accuracy for both LCC and CCC as compared to traditional techniques. High scales of wavelet decomposition were favorable for the estimation of CCC and low scales for the estimation of LCC. The difference in optimal scale reveals the underlying mechanism of signature transfer from leaf to canopy levels. In addition, crop-specific models were required for the estimation of CCC over the full range. However, a common model could be built with the REPs extracted with Scale 5 of the WREP technique for wheat and rice crops when CCC was less than 2 g/m2 (R2 = 0.73, RMSE = 0.26 g/m2). This insensitivity of WREP to crop type indicates the potential for aerial mapping of chlorophyll content between growth seasons of cereal crops. The new REP extraction technique provides us a new insight for understanding the spectral changes in the red edge region in response to chlorophyll variation from leaf to canopy levels.
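A rough sketch of the zero-crossing idea behind a wavelet-based red edge position estimate is given below; the Mexican-hat wavelet, the single convolution scale, and the synthetic sigmoid spectrum are all illustrative assumptions and do not reproduce the exact WREP formulation in the paper.

```python
import numpy as np

def ricker(points, a):
    """Mexican-hat (Ricker) wavelet sampled at `points` samples with width `a`."""
    t = np.arange(points) - (points - 1) / 2.0
    return (1 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

def wrep_like(wavelengths, reflectance, scale=5):
    """Convolve the spectrum with a Mexican-hat wavelet at one scale and return
    the zero-crossing wavelength within the red edge region (680-760 nm)."""
    w = ricker(10 * scale + 1, scale)
    coeffs = np.convolve(reflectance, w, mode="same")
    idx = np.where((wavelengths >= 680) & (wavelengths <= 760))[0]
    sign_change = np.where(np.diff(np.sign(coeffs[idx])) != 0)[0]
    if sign_change.size == 0:
        return np.nan
    i = idx[sign_change[0]]
    # Linear interpolation between the two samples bracketing the zero crossing.
    w0, w1, c0, c1 = wavelengths[i], wavelengths[i + 1], coeffs[i], coeffs[i + 1]
    return w0 - c0 * (w1 - w0) / (c1 - c0)

# Synthetic sigmoid-like red edge with its inflection near 720 nm.
wl = np.arange(400, 900, 1.0)
refl = 0.05 + 0.45 / (1 + np.exp(-(wl - 720) / 10))
print(wrep_like(wl, refl, scale=5))
```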
Least-squares sequential parameter and state estimation for large space structures
NASA Technical Reports Server (NTRS)
Thau, F. E.; Eliazov, T.; Montgomery, R. C.
1982-01-01
This paper presents the formulation of simultaneous state and parameter estimation problems for flexible structures in terms of least-squares minimization problems. The approach combines an on-line order determination algorithm with least-squares algorithms for finding estimates of modal approximation functions, modal amplitudes, and modal parameters. The approach combines previous results on separable nonlinear least squares estimation with a regression analysis formulation of the state estimation problem. The technique makes use of sequential Householder transformations. This allows for sequential accumulation of matrices required during the identification process. The technique is used to identify the modal parameters of a flexible beam.
A solar energy estimation procedure using remote sensing techniques. [watershed hydrologic models
NASA Technical Reports Server (NTRS)
Khorram, S.
1977-01-01
The objective of this investigation is to design a remote sensing-aided procedure for daily location-specific estimation of solar radiation components over the watershed(s) of interest. This technique has been tested on the Spanish Creek Watershed, Northern California, with successful results.
A new slit lamp-based technique for anterior chamber angle estimation.
Gispets, Joan; Cardona, Genís; Tomàs, Núria; Fusté, Cèlia; Binns, Alison; Fortes, Miguel A
2014-06-01
To design and test a new noninvasive method for anterior chamber angle (ACA) estimation based on the slit lamp that is accessible to all eye-care professionals. A new technique (slit lamp anterior chamber estimation [SLACE]) that aims to overcome some of the limitations of the van Herick procedure was designed. The technique, which only requires a slit lamp, was applied to estimate the ACA of 50 participants (100 eyes) using two different slit lamp models, and results were compared with gonioscopy as the clinical standard. The Spearman nonparametric correlation between ACA values as determined by gonioscopy and SLACE were 0.81 (p < 0.001) and 0.79 (p < 0.001) for each slit lamp. Sensitivity values of 100 and 87.5% and specificity values of 75 and 81.2%, depending on the slit lamp used, were obtained for the SLACE technique as compared with gonioscopy (Spaeth classification). The SLACE technique, when compared with gonioscopy, displayed good accuracy in the detection of narrow angles, and it may be useful for eye-care clinicians without access to expensive alternative equipment or those who cannot perform gonioscopy because of legal constraints regarding the use of diagnostic drugs.
Recent Improvements in Estimating Convective and Stratiform Rainfall in Amazonia
NASA Technical Reports Server (NTRS)
Negri, Andrew J.
1999-01-01
In this paper we present results from the application of a satellite infrared (IR) technique for estimating rainfall over northern South America. Our main objectives are to examine the diurnal variability of rainfall and to investigate the relative contributions from the convective and stratiform components. We apply the technique of Anagnostou et al (1999). In simple functional form, the estimated rain area A_rain may be expressed as A_rain = f(A_mode, T_mode), where T_mode is the mode temperature of a cloud defined by 253 K, and A_mode is the area encompassed by T_mode. The technique was trained by a regression between coincident microwave estimates from the Goddard Profiling (GPROF) algorithm (Kummerow et al, 1996) applied to SSM/I data and GOES IR (11 microns) observations. The apportionment of the rainfall into convective and stratiform components is based on the microwave technique described by Anagnostou and Kummerow (1997). The convective area from this technique was regressed against an IR structure parameter (the Convective Index) defined by Anagnostou et al (1999). Finally, rain rates are assigned to A_mode in proportion to (253 - temperature), with different rates for the convective and stratiform components.
Technique for estimating depth of 100-year floods in Tennessee
Gamble, Charles R.; Lewis, James G.
1977-01-01
Preface: A method is presented for estimating the depth of the 100-year flood in four hydrologic areas in Tennessee. Depths at 151 gaging stations on streams that were not significantly affected by man-made changes were related to basin characteristics by multiple regression techniques. Equations derived from the analysis can be used to estimate the depth of the 100-year flood if the size of the drainage basin is known.
1998-03-01
benefit estimation techniques used to monetize the value of flood hazard reduction in the City of Roanoke. Each method was then used to estimate...behavior. This framework justifies interpreting people’s choices to infer and then monetize their preferences. If individuals have well-ordered and...Journal of Agricultural Economics. 68 (1986) 2: 280-290. Soule, Don M. and Claude M. Vaughn, "Flood Protection Benefits as Reflected in Property
Jacquemin, Bénédicte; Lepeule, Johanna; Boudier, Anne; Arnould, Caroline; Benmerad, Meriem; Chappaz, Claire; Ferran, Joane; Kauffmann, Francine; Morelli, Xavier; Pin, Isabelle; Pison, Christophe; Rios, Isabelle; Temam, Sofia; Künzli, Nino; Slama, Rémy; Siroux, Valérie
2013-09-01
Errors in address geocodes may affect estimates of the effects of air pollution on health. We investigated the impact of four geocoding techniques on the association between urban air pollution estimated with a fine-scale (10 m × 10 m) dispersion model and lung function in adults. We measured forced expiratory volume in 1 sec (FEV1) and forced vital capacity (FVC) in 354 adult residents of Grenoble, France, who were participants in two well-characterized studies, the Epidemiological Study on the Genetics and Environment on Asthma (EGEA) and the European Community Respiratory Health Survey (ECRHS). Home addresses were geocoded using individual building matching as the reference approach and three spatial interpolation approaches. We used a dispersion model to estimate mean PM10 and nitrogen dioxide concentrations at each participant's address during the 12 months preceding their lung function measurements. Associations between exposures and lung function parameters were adjusted for individual confounders and same-day exposure to air pollutants. The geocoding techniques were compared with regard to geographical distances between coordinates, exposure estimates, and associations between the estimated exposures and health effects. Median distances between coordinates estimated using the building matching and the three interpolation techniques were 26.4, 27.9, and 35.6 m. Compared with exposure estimates based on building matching, PM10 concentrations based on the three interpolation techniques tended to be overestimated. When building matching was used to estimate exposures, a one-interquartile range increase in PM10 (3.0 μg/m3) was associated with a 3.72-point decrease in FVC% predicted (95% CI: -0.56, -6.88) and a 3.86-point decrease in FEV1% predicted (95% CI: -0.14, -3.24). The magnitude of associations decreased when other geocoding approaches were used [e.g., for FVC% predicted -2.81 (95% CI: -0.26, -5.35) using NavTEQ, or 2.08 (95% CI -4.63, 0.47, p = 0.11) using Google Maps]. Our findings suggest that the choice of geocoding technique may influence estimated health effects when air pollution exposures are estimated using a fine-scale exposure model.
NASA Astrophysics Data System (ADS)
Prakash, Satya; Mahesh, C.; Gairola, Rakesh M.
2011-12-01
Large-scale precipitation estimation is very important for climate science because precipitation is a major component of the earth's water and energy cycles. In the present study, the GOES precipitation index technique has been applied to Kalpana-1 satellite infrared (IR) images acquired every three hours (0000, 0300, 0600, …, 2100 UTC) for rainfall estimation, in preparation for INSAT-3D. After the temperatures of all the pixels in a grid are known, they are distributed to generate a three-hourly 24-class histogram of brightness temperatures of IR (10.5-12.5 μm) images for a 1.0° × 1.0° latitude/longitude box. Daily, monthly, and seasonal rainfall have been estimated from these three-hourly rain estimates for the entire south-west monsoon period of 2009. To investigate the potential of these rainfall estimates, the monthly and seasonal estimates have been validated against the Global Precipitation Climatology Project and Global Precipitation Climatology Centre data. The validation results show that the present technique works well for large-scale precipitation estimation, both qualitatively and quantitatively. The results also suggest that this simple IR-based estimation technique can be used to estimate rainfall over tropical areas at larger temporal scales for climatological applications.
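The classical GOES Precipitation Index relation behind this approach reduces to a rain depth proportional to the fraction of IR pixels colder than about 235 K; a minimal sketch follows, with the threshold, rain rate, and example pixel counts as illustrative assumptions rather than the paper's histogram-based implementation.

```python
import numpy as np

def gpi_rainfall(tb_kelvin, hours=3.0, threshold_k=235.0, rate_mm_per_hr=3.0):
    """GOES Precipitation Index style estimate for one 1.0 deg x 1.0 deg box.

    tb_kelvin: IR brightness temperatures for all pixels in the box.
    Rain depth (mm) = rate * fractional coverage colder than threshold * hours.
    The 235 K threshold and 3 mm/h rate are the classical GPI constants.
    """
    frac_cold = np.mean(tb_kelvin < threshold_k)
    return rate_mm_per_hr * frac_cold * hours

# Example: a box where 40% of pixels are colder than 235 K over a 3-h window.
tb = np.concatenate([np.full(40, 220.0), np.full(60, 270.0)])
print(gpi_rainfall(tb))  # 3 mm/h * 0.4 * 3 h = 3.6 mm
```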
Halford, K.J.; Mayer, G.C.
2000-01-01
Ground water discharge and recharge frequently have been estimated with hydrograph-separation techniques, but the critical assumptions of the techniques have not been investigated. The critical assumptions are that the hydraulic characteristics of the contributing aquifer (recession index) can be estimated from stream-discharge records; that periods of exclusively ground water discharge can be reliably identified; and that stream-discharge peaks approximate the magnitude and timing of recharge events. The first assumption was tested by estimating the recession index from stream-discharge hydrographs, ground water hydrographs, and hydraulic diffusivity estimates from aquifer tests in basins throughout the eastern United States and Montana. The recession index frequently could not be estimated reliably from stream-discharge records alone because many of the estimates of the recession index were greater than 1000 days. The ratio of stream discharge during baseflow periods was two to 36 times greater than the maximum expected range of ground water discharge at 12 of the 13 field sites. The identification of the ground water component of stream-discharge records was ambiguous because drainage from bank storage, wetlands, surface water bodies, soils, and snowpacks frequently exceeded ground water discharge and also decreased exponentially during recession periods. The timing and magnitude of recharge events could not be ascertained from stream-discharge records at any of the sites investigated because recharge events were not directly correlated with stream peaks. When used alone, the recession-curve-displacement method and other hydrograph-separation techniques are poor tools for estimating ground water discharge or recharge because the major assumptions of the methods are commonly and grossly violated. Multiple, alternative methods of estimating ground water discharge and recharge should be used because of the uncertainty associated with any one technique.
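For concreteness, a minimal sketch of one quantity discussed here, the recession index (days per log cycle of streamflow decline), estimated by a straight-line fit to log discharge during an assumed recession period; the synthetic hydrograph and fitting choices are illustrative only.

```python
import numpy as np

def recession_index(days, discharge):
    """Estimate the recession index (days per log cycle of streamflow decline)
    by fitting a straight line to log10(discharge) during a recession period."""
    slope, _ = np.polyfit(days, np.log10(discharge), 1)
    return -1.0 / slope  # days for discharge to decline by one log cycle

# Hypothetical recession limb: flow decaying exponentially over 30 days.
t = np.arange(30.0)
q = 100.0 * 10 ** (-t / 60.0)  # one log cycle per 60 days
print(recession_index(t, q))   # ~60 days
```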
Alternative Strategies for Pricing Home Work Time.
ERIC Educational Resources Information Center
Zick, Cathleen D.; Bryant, W. Keith
1983-01-01
Discusses techniques for measuring the value of home work time. Estimates obtained using the reservation wage technique are contrasted with market alternative estimates derived with the same data set. Findings suggest that the market alternative cost method understates the true value of a woman's home time to the household. (JOW)
A NOVEL TECHNIQUE FOR QUANTITATIVE ESTIMATION OF UPTAKE OF DIESEL EXHAUST PARTICLES BY LUNG CELLS
While airborne particulates like diesel exhaust particulates (DEP) exert significant toxicological effects on lungs, quantitative estimation of accumulation of DEP inside lung cells has not been reported due to a lack of an accurate and quantitative technique for this purpose. I...
Estimation of Target Angular Position Under Mainbeam Jamming Conditions,
1995-12-01
technique, Multiple Signal Classification (MUSIC), is used to estimate the target Direction Of Arrival (DOA) from the processed data vectors. The model...used in the MUSIC technique takes into account the fact that the jammer has been cancelled in the target data vector. The performance of this algorithm
We compare biomass burning emissions estimates from four different techniques that use satellite based fire products to determine area burned over regional to global domains. Three of the techniques use active fire detections from polar-orbiting MODIS sensors and one uses detec...
A nonparametric clustering technique which estimates the number of clusters
NASA Technical Reports Server (NTRS)
Ramey, D. B.
1983-01-01
In applications of cluster analysis, one usually needs to determine the number of clusters, K, and the assignment of observations to each cluster. A clustering technique based on recursive application of a multivariate test of bimodality which automatically estimates both K and the cluster assignments is presented.
ERIC Educational Resources Information Center
Stapleton, Laura M.
2008-01-01
This article discusses replication sampling variance estimation techniques that are often applied in analyses using data from complex sampling designs: jackknife repeated replication, balanced repeated replication, and bootstrapping. These techniques are used with traditional analyses such as regression, but are currently not used with structural…
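As a point of reference for the resampling idea, a minimal i.i.d. bootstrap standard-error sketch follows; replication methods for complex survey designs (jackknife repeated replication, balanced repeated replication) instead resample or reweight whole strata and clusters, which is not shown here.

```python
import numpy as np

def bootstrap_se(sample, statistic, n_reps=1000, seed=0):
    """Bootstrap standard-error estimate of `statistic` (e.g., a mean).
    Simple i.i.d. version: resample the observations with replacement,
    recompute the statistic, and take the standard deviation of the replicates."""
    rng = np.random.default_rng(seed)
    n = len(sample)
    reps = np.array([statistic(rng.choice(sample, size=n, replace=True))
                     for _ in range(n_reps)])
    return reps.std(ddof=1)

data = np.array([2.1, 3.4, 2.8, 5.0, 4.2, 3.9, 2.5, 4.8])
print(bootstrap_se(data, np.mean))
```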
A Biomechanical Modeling Guided CBCT Estimation Technique
Zhang, You; Tehrani, Joubin Nasehi; Wang, Jing
2017-01-01
Two-dimensional-to-three-dimensional (2D-3D) deformation has emerged as a new technique to estimate cone-beam computed tomography (CBCT) images. The technique is based on deforming a prior high-quality 3D CT/CBCT image to form a new CBCT image, guided by limited-view 2D projections. The accuracy of this intensity-based technique, however, is often limited in low-contrast image regions with subtle intensity differences. The solved deformation vector fields (DVFs) can also be biomechanically unrealistic. To address these problems, we have developed a biomechanical modeling guided CBCT estimation technique (Bio-CBCT-est) by combining 2D-3D deformation with finite element analysis (FEA)-based biomechanical modeling of anatomical structures. Specifically, Bio-CBCT-est first extracts the 2D-3D deformation-generated displacement vectors at the high-contrast anatomical structure boundaries. The extracted surface deformation fields are subsequently used as the boundary conditions to drive structure-based FEA to correct and fine-tune the overall deformation fields, especially those at low-contrast regions within the structure. The resulting FEA-corrected deformation fields are then fed back into 2D-3D deformation to form an iterative loop, combining the benefits of intensity-based deformation and biomechanical modeling for CBCT estimation. Using eleven lung cancer patient cases, the accuracy of the Bio-CBCT-est technique has been compared to that of the 2D-3D deformation technique and the traditional CBCT reconstruction techniques. The accuracy was evaluated in the image domain, and also in the DVF domain through clinician-tracked lung landmarks. PMID:27831866
NASA Technical Reports Server (NTRS)
Tilton, J. C.; Swain, P. H. (Principal Investigator); Vardeman, S. B.
1981-01-01
A key input to a statistical classification algorithm, which exploits the tendency of certain ground cover classes to occur more frequently in some spatial context than in others, is a statistical characterization of the context: the context distribution. An unbiased estimator of the context distribution is discussed which, besides having the advantage of statistical unbiasedness, has the additional advantage over other estimation techniques of being amenable to an adaptive implementation in which the context distribution estimate varies according to local contextual information. Results from applying the unbiased estimator to the contextual classification of three real LANDSAT data sets are presented and contrasted with results from non-contextual classifications and from contextual classifications utilizing other context distribution estimation techniques.
Depth-estimation-enabled compound eyes
NASA Astrophysics Data System (ADS)
Lee, Woong-Bi; Lee, Heung-No
2018-04-01
Most animals that have compound eyes determine object distances by using monocular cues, especially motion parallax. In artificial compound eye imaging systems inspired by natural compound eyes, object depths are typically estimated by measuring optic flow; however, this requires mechanical movement of the compound eyes or additional acquisition time. In this paper, we propose a method for estimating object depths in a monocular compound eye imaging system based on the computational compound eye (COMPU-EYE) framework. In the COMPU-EYE system, acceptance angles are considerably larger than interommatidial angles, causing overlap between the ommatidial receptive fields. In the proposed depth estimation technique, the disparities between these receptive fields are used to determine object distances. We demonstrate that the proposed depth estimation technique can estimate the distances of multiple objects.
Technique for estimation of streamflow statistics in mineral areas of interest in Afghanistan
Olson, Scott A.; Mack, Thomas J.
2011-01-01
A technique for estimating streamflow statistics at ungaged stream sites in areas of mineral interest in Afghanistan using drainage-area-ratio relations of historical streamflow data was developed and is documented in this report. The technique can be used to estimate the following streamflow statistics at ungaged sites: (1) 7-day low flow with a 10-year recurrence interval, (2) 7-day low flow with a 2-year recurrence interval, (3) daily mean streamflow exceeded 90 percent of the time, (4) daily mean streamflow exceeded 80 percent of the time, (5) mean monthly streamflow for each month of the year, (6) mean annual streamflow, and (7) minimum monthly streamflow for each month of the year. Because they are based on limited historical data, the estimates of streamflow statistics at ungaged sites are considered preliminary.
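A minimal sketch of a drainage-area-ratio transfer of a streamflow statistic is shown below; the exponent of 1.0 and the example numbers are assumptions for illustration and are not the relations developed in the report.

```python
def drainage_area_ratio(q_gaged, area_gaged_km2, area_ungaged_km2, exponent=1.0):
    """Transfer a streamflow statistic from a gaged to an ungaged site using a
    drainage-area ratio.  An exponent of 1.0 is the simplest assumption."""
    return q_gaged * (area_ungaged_km2 / area_gaged_km2) ** exponent

# Example: mean annual flow of 12 m^3/s at a 400 km^2 gage, ungaged site of 250 km^2.
print(drainage_area_ratio(12.0, 400.0, 250.0))  # 7.5 m^3/s
```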
A New Approach to Estimate Forest Parameters Using Dual-Baseline Pol-InSAR Data
NASA Astrophysics Data System (ADS)
Bai, L.; Hong, W.; Cao, F.; Zhou, Y.
2009-04-01
In POL-InSAR applications using the ESPRIT technique, it is assumed that stable scattering centres exist in the forest. However, forest observations suffer severely from volume and temporal decorrelation, so the scatterers are not as stable as assumed and the obtained interferometric information is not as accurate as expected. Besides, ESPRIT techniques cannot identify which interferometric phases correspond to the ground and to the canopy, and they provide multiple estimates of the height between two scattering centers due to phase unwrapping. Therefore, estimation errors are introduced into the forest height results. To suppress these two types of errors, we use dual-baseline POL-InSAR data to estimate forest height. Dual-baseline coherence optimization is applied to obtain interferometric information of stable scattering centers in the forest. From the interferometric phases for the different baselines, estimation errors caused by phase unwrapping are resolved, and other estimation errors can be suppressed as well. Experiments were performed on E-SAR L-band POL-InSAR data, and the results show that the proposed method provides more accurate forest height than the ESPRIT technique.
An Adaptive Kalman Filter Using a Simple Residual Tuning Method
NASA Technical Reports Server (NTRS)
Harman, Richard R.
1999-01-01
One difficulty in using Kalman filters in real world situations is the selection of the correct process noise, measurement noise, and initial state estimate and covariance. These parameters are commonly referred to as tuning parameters. Multiple methods have been developed to estimate these parameters. Most of those methods such as maximum likelihood, subspace, and observer Kalman Identification require extensive offline processing and are not suitable for real time processing. One technique, which is suitable for real time processing, is the residual tuning method. Any mismodeling of the filter tuning parameters will result in a non-white sequence for the filter measurement residuals. The residual tuning technique uses this information to estimate corrections to those tuning parameters. The actual implementation results in a set of sequential equations that run in parallel with the Kalman filter. A. H. Jazwinski developed a specialized version of this technique for estimation of process noise. Equations for the estimation of the measurement noise have also been developed. These algorithms are used to estimate the process noise and measurement noise for the Wide Field Infrared Explorer star tracker and gyro.
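The following is a toy scalar illustration of residual-based tuning, assuming a random-walk state model and a simple moving-window variance match; it is not Jazwinski's specific algorithm nor the WIRE implementation, but it shows how residual statistics can feed back into the process-noise estimate.

```python
import numpy as np

def adaptive_kalman(z, r, q0=1e-4, window=20):
    """Scalar Kalman filter (random-walk state) with residual-based tuning of
    the process noise q: if the sample variance of recent measurement residuals
    exceeds the predicted innovation variance, q is inflated, and vice versa."""
    x, p, q = z[0], 1.0, q0
    residuals, estimates = [], []
    for zk in z:
        p = p + q                 # predict (identity dynamics, process noise q)
        nu = zk - x               # innovation (measurement residual)
        s = p + r                 # predicted innovation variance
        residuals.append(nu)
        if len(residuals) >= window:
            sample_var = np.var(residuals[-window:])
            # Nudge q toward the value that would explain the observed residuals.
            q = max(1e-8, q + 0.1 * (sample_var - s))
        k = p / s                 # Kalman gain
        x = x + k * nu            # update state estimate
        p = (1 - k) * p           # update estimate covariance
        estimates.append(x)
    return np.array(estimates), q

rng = np.random.default_rng(2)
truth = np.cumsum(0.05 * rng.standard_normal(300))  # slowly drifting state
z = truth + 0.5 * rng.standard_normal(300)          # noisy measurements
est, q_final = adaptive_kalman(z, r=0.25)
```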
Characterizing Detrended Fluctuation Analysis of multifractional Brownian motion
NASA Astrophysics Data System (ADS)
Setty, V. A.; Sharma, A. S.
2015-02-01
The Hurst exponent (H) is widely used to quantify long range dependence in time series data and is estimated using several well known techniques. Recognizing its ability to remove trends, Detrended Fluctuation Analysis (DFA) is used extensively to estimate a Hurst exponent in non-stationary data. Multifractional Brownian motion (mBm) broadly encompasses a set of models of non-stationary data exhibiting time-varying Hurst exponents H(t), as opposed to a constant H. Recently, there has been a growing interest in the time dependence of H(t), and sliding window techniques have been used to estimate a local time average of the exponent. This brought to the fore the ability of DFA to estimate scaling exponents in systems with time-varying H(t), such as mBm. This paper characterizes the performance of DFA on mBm data with linearly varying H(t) and further tests the robustness of the estimated time average with respect to data- and technique-related parameters. Our results serve as a benchmark for using DFA as a sliding window estimator to obtain H(t) from time series data.
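A compact sketch of standard first-order DFA is given below for reference; the scale grid, detrending order, and white-noise test signal are illustrative choices, and the sliding-window variant used to track H(t) is not shown.

```python
import numpy as np

def dfa(x, scales):
    """Detrended Fluctuation Analysis: returns the fluctuation function F(s)
    for each window size s; the Hurst-like exponent is the slope of
    log F(s) vs. log s.  Linear (order-1) detrending is used in each window."""
    y = np.cumsum(x - np.mean(x))           # integrated (profile) series
    fluct = []
    for s in scales:
        n_win = len(y) // s
        f2 = []
        for i in range(n_win):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)    # local linear trend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))
    return np.array(fluct)

rng = np.random.default_rng(3)
x = rng.standard_normal(4096)               # white noise: expected exponent ~0.5
scales = np.unique(np.logspace(4, 9, 12, base=2).astype(int))
F = dfa(x, scales)
H_est = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(H_est)
```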
Estimating Environmental Compliance Costs for Industry (1981)
The paper discusses the pros and cons of existing approaches to compliance cost estimation, such as ex post survey estimation and ex ante estimation techniques (input cost accounting methods, engineering process models, and econometric models).
Comparison of 2-D and 3-D estimates of placental volume in early pregnancy.
Aye, Christina Y L; Stevenson, Gordon N; Impey, Lawrence; Collins, Sally L
2015-03-01
Ultrasound estimation of placental volume (PlaV) between 11 and 13 wk has been proposed as part of a screening test for small-for-gestational-age babies. A semi-automated 3-D technique, validated against the gold standard of manual delineation, has been found at this stage of gestation to predict small-for-gestational-age at term. Recently, when used in the third trimester, an estimate obtained using a 2-D technique was found to correlate with placental weight at delivery. Given its greater simplicity, the 2-D technique might be more useful as part of an early screening test. We investigated if the two techniques produced similar results when used in the first trimester. The correlation between PlaV values calculated by the two different techniques was assessed in 139 first-trimester placentas. The agreement on PlaV and derived "standardized placental volume," a dimensionless index correcting for gestational age, was explored with the Mann-Whitney test and Bland-Altman plots. Placentas were categorized into five different shape subtypes, and a subgroup analysis was performed. Agreement was poor for both PlaV and standardized PlaV (p < 0.001 and p < 0.001), with the 2-D technique yielding larger estimates for both indices compared with the 3-D method. The mean difference in standardized PlaV values between the two methods was 0.007 (95% confidence interval: 0.006-0.009). The best agreement was found for regular rectangle-shaped placentas (p = 0.438 and p = 0.408). The poor correlation between the 2-D and 3-D techniques may result from the heterogeneity of placental morphology at this stage of gestation. In early gestation, the simpler 2-D estimates of PlaV do not correlate strongly with those obtained with the validated 3-D technique. Copyright © 2015 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
Estimating GATE rainfall with geosynchronous satellite images
NASA Technical Reports Server (NTRS)
Stout, J. E.; Martin, D. W.; Sikdar, D. N.
1979-01-01
A method of estimating GATE rainfall from either visible or infrared images of geosynchronous satellites is described. Rain is estimated from cumulonimbus cloud area by the equation R = a0 A + a1 dA/dt, where R is volumetric rainfall, A cloud area, t time, and a0 and a1 are constants. Rainfall, calculated from 5.3 cm ship radar, and cloud area are measured from clouds in the tropical North Atlantic. The constants a0 and a1 are fit to these measurements by the least-squares method. Hourly estimates by the infrared version of this technique correlate well (correlation coefficient of 0.84) with rain totals derived from composited radar for an area of 100,000 sq km. The accuracy of this method is described and compared to that of another technique using geosynchronous satellite images. It is concluded that this technique provides useful estimates of tropical oceanic rainfall on a convective scale.
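A minimal sketch of fitting the two constants in R = a0 A + a1 dA/dt by ordinary least squares is shown below, using synthetic cloud-area and rainfall series in place of the GATE radar and satellite measurements.

```python
import numpy as np

# Hypothetical hourly time series of cumulonimbus cloud area A (km^2) and
# radar-derived volumetric rainfall R; the constants a0 and a1 in
# R = a0*A + a1*dA/dt are fit by ordinary least squares.
t = np.arange(0, 12.0)                          # hours
A = 1e4 * np.exp(-0.5 * ((t - 6) / 2.5) ** 2)   # growing/decaying cloud area
dAdt = np.gradient(A, t)
R_obs = 0.02 * A + 0.5 * dAdt + np.random.default_rng(4).normal(0, 20, t.size)

X = np.column_stack([A, dAdt])
(a0, a1), *_ = np.linalg.lstsq(X, R_obs, rcond=None)
print(a0, a1)
```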
Adaptive Elastic Net for Generalized Methods of Moments.
Caner, Mehmet; Zhang, Hao Helen
2014-01-30
Model selection and estimation are crucial parts of econometrics. This paper introduces a new technique that can simultaneously estimate and select the model in the generalized method of moments (GMM) context. The GMM is particularly powerful for analyzing complex data sets such as longitudinal and panel data, and it has wide applications in econometrics. This paper extends the least-squares-based adaptive elastic net estimator of Zou and Zhang (2009) to nonlinear equation systems with endogenous variables. The extension is not trivial and involves a new proof technique because the estimators lack closed-form solutions. Compared to the Bridge-GMM of Caner (2009), we allow the number of parameters to diverge to infinity as well as collinearity among a large number of variables, and the redundant parameters are set to zero via a data-dependent technique. This method has the oracle property, meaning that we can estimate the nonzero parameters with their standard limit and the redundant parameters are dropped from the equations simultaneously. Numerical examples are used to illustrate the performance of the new method.
Tube-Load Model Parameter Estimation for Monitoring Arterial Hemodynamics
Zhang, Guanqun; Hahn, Jin-Oh; Mukkamala, Ramakrishna
2011-01-01
A useful model of the arterial system is the uniform, lossless tube with parametric load. This tube-load model is able to account for wave propagation and reflection (unlike lumped-parameter models such as the Windkessel) while being defined by only a few parameters (unlike comprehensive distributed-parameter models). As a result, the parameters may be readily estimated by accurate fitting of the model to available arterial pressure and flow waveforms so as to permit improved monitoring of arterial hemodynamics. In this paper, we review tube-load model parameter estimation techniques that have appeared in the literature for monitoring wave reflection, large artery compliance, pulse transit time, and central aortic pressure. We begin by motivating the use of the tube-load model for parameter estimation. We then describe the tube-load model, its assumptions and validity, and approaches for estimating its parameters. We next summarize the various techniques and their experimental results while highlighting their advantages over conventional techniques. We conclude the review by suggesting future research directions and describing potential applications. PMID:22053157
Estimation of hysteretic damping of structures by stochastic subspace identification
NASA Astrophysics Data System (ADS)
Bajrić, Anela; Høgsberg, Jan
2018-05-01
Output-only system identification techniques can estimate modal parameters of structures represented by linear time-invariant systems. However, the extension of the techniques to structures exhibiting non-linear behavior has not received much attention. This paper presents an output-only system identification method suitable for the random response of dynamic systems with hysteretic damping. The method applies the concept of Stochastic Subspace Identification (SSI) to estimate the model parameters of a dynamic system with hysteretic damping. The restoring force is represented by the Bouc-Wen model, for which an equivalent linear relaxation model is derived. Hysteretic properties can be encountered in engineering structures exposed to severe cyclic environmental loads, as well as in vibration mitigation devices, such as Magneto-Rheological (MR) dampers. The identification technique incorporates the equivalent linear damper model in the estimation procedure. Synthetic data representing the random vibrations of systems with hysteresis validate the system parameters estimated by the presented identification method at low and high levels of excitation amplitude.
Methods for Multiloop Identification of Visual and Neuromuscular Pilot Responses.
Olivari, Mario; Nieuwenhuizen, Frank M; Venrooij, Joost; Bülthoff, Heinrich H; Pollini, Lorenzo
2015-12-01
In this paper, identification methods are proposed to estimate the neuromuscular and visual responses of a multiloop pilot model. A conventional and widely used technique for simultaneous identification of the neuromuscular and visual systems makes use of cross-spectral density estimates. This paper shows that this technique requires a specific noninterference hypothesis, often implicitly assumed, that may be difficult to meet during actual experimental designs. A mathematical justification of the necessity of the noninterference hypothesis is given. Furthermore, two methods are proposed that do not have the same limitations. The first method is based on autoregressive models with exogenous inputs, whereas the second one combines cross-spectral estimators with interpolation in the frequency domain. The two identification methods are validated by offline simulations and contrasted to the classic method. The results reveal that the classic method fails when the noninterference hypothesis is not fulfilled; on the contrary, the two proposed techniques give reliable estimates. Finally, the three identification methods are applied to experimental data from a closed-loop control task with pilots. The two proposed techniques give comparable estimates, different from those obtained by the classic method. The differences match those found with the simulations. Thus, the two identification methods provide a good alternative to the classic method and make it possible to simultaneously estimate human's neuromuscular and visual responses in cases where the classic method fails.
Psychometric Evaluation of Lexical Diversity Indices: Assessing Length Effects.
Fergadiotis, Gerasimos; Wright, Heather Harris; Green, Samuel B
2015-06-01
Several novel techniques have been developed recently to assess the breadth of a speaker's vocabulary exhibited in a language sample. The specific aim of this study was to increase our understanding of the validity of the scores generated by different lexical diversity (LD) estimation techniques. Four techniques were explored: D, Maas, measure of textual lexical diversity, and moving-average type-token ratio. Four LD indices were estimated for language samples on 4 discourse tasks (procedures, eventcasts, story retell, and recounts) from 442 adults who are neurologically intact. The resulting data were analyzed using structural equation modeling. The scores for measure of textual lexical diversity and moving-average type-token ratio were stronger indicators of the LD of the language samples. The results for the other 2 techniques were consistent with the presence of method factors representing construct-irrelevant sources. These findings offer a deeper understanding of the relative validity of the 4 estimation techniques and should assist clinicians and researchers in the selection of LD measures of language samples that minimize construct-irrelevant sources.
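For illustration, a minimal moving-average type-token ratio function is sketched below; the 50-token default window and the toy sample are assumptions, not the settings used in the study.

```python
def mattr(tokens, window=50):
    """Moving-average type-token ratio: the mean of the type-token ratio
    computed over every consecutive window of `window` tokens."""
    if len(tokens) < window:
        return len(set(tokens)) / len(tokens)
    ratios = [len(set(tokens[i:i + window])) / window
              for i in range(len(tokens) - window + 1)]
    return sum(ratios) / len(ratios)

sample = ("the cat sat on the mat and the dog sat on the rug "
          "while the cat watched the dog").split()
print(mattr(sample, window=10))
```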
NASA Technical Reports Server (NTRS)
Chang, C. Y.; Curlander, J. C.
1992-01-01
Estimation of the Doppler centroid ambiguity is a necessary element of the signal processing for SAR systems with large antenna pointing errors. Without proper resolution of the Doppler centroid estimation (DCE) ambiguity, the image quality will be degraded in the system impulse response function and the geometric fidelity. Two techniques for resolution of DCE ambiguity for the spaceborne SAR are presented; they include a brief review of the range cross-correlation technique and presentation of a new technique using multiple pulse repetition frequencies (PRFs). For SAR systems, where other performance factors control selection of the PRF's, an algorithm is devised to resolve the ambiguity that uses PRF's of arbitrary numerical values. The performance of this multiple PRF technique is analyzed based on a statistical error model. An example is presented that demonstrates for the Shuttle Imaging Radar-C (SIR-C) C-band SAR, the probability of correct ambiguity resolution is higher than 95 percent for antenna attitude errors as large as 3 deg.
A comparison of techniques for assessing farmland bumblebee populations.
Wood, T J; Holland, J M; Goulson, D
2015-04-01
Agri-environment schemes have been implemented across the European Union in order to reverse declines in farmland biodiversity. To assess the impact of these schemes for bumblebees, accurate measures of their populations are required. Here, we compared bumblebee population estimates on 16 farms using three commonly used techniques: standardised line transects, coloured pan traps and molecular estimates of nest abundance. There was no significant correlation between the estimates obtained by the three techniques, suggesting that each technique captured a different aspect of local bumblebee population size and distribution in the landscape. Bumblebee abundance as observed on the transects was positively influenced by the number of flowers present on the transect. The number of bumblebees caught in pan traps was positively influenced by the density of flowers surrounding the trapping location and negatively influenced by wider landscape heterogeneity. Molecular estimates of the number of nests of Bombus terrestris and B. hortorum were positively associated with the proportion of the landscape covered in oilseed rape and field beans. Both direct survey techniques are strongly affected by floral abundance immediately around the survey site, potentially leading to misleading results if attempting to infer overall abundance in an area or on a farm. In contrast, whilst the molecular method suffers from an inability to detect sister pairs at low sample sizes, it appears to be unaffected by the abundance of forage and thus is the preferred survey technique.
Energy Measurement Studies for CO2 Measurement with a Coherent Doppler Lidar System
NASA Technical Reports Server (NTRS)
Beyon, Jeffrey Y.; Koch, Grady J.; Vanvalkenburg, Randal L.; Yu, Jirong; Singh, Upendra N.; Kavaya, Michael J.
2010-01-01
The accurate measurement of pulse energy is critical when applying a lidar system to CO2 measurement. Different techniques for estimating the energy of the online and offline pulses are investigated for post-processing of lidar returns. The cornerstone of these techniques is accurate estimation of the spectra of the lidar signal and the background noise. Since the background noise is not ideal white Gaussian noise, a simple average estimate of the noise level is not well suited to estimating the energy of the lidar signal and noise. A brief review of the methods is presented in this paper.
Techniques for estimating flood hydrographs for ungaged urban watersheds
Stricker, V.A.; Sauer, V.B.
1984-01-01
The Clark Method, modified slightly, was used to develop a synthetic, dimensionless hydrograph which can be used to estimate flood hydrographs for ungaged urban watersheds. Application of the technique results in a typical (average) flood hydrograph for a given peak discharge. The input necessary to apply the technique is an estimate of basin lagtime and the recurrence-interval peak discharge. Equations for this purpose were obtained from a recent nationwide study on flood frequency in urban watersheds. A regression equation was developed which relates flood volumes to drainage area size, basin lagtime, and peak discharge. This equation is useful where storage of floodwater may be part of the design of flood prevention. (USGS)
Peak-picking fundamental period estimation for hearing prostheses.
Howard, D M
1989-09-01
A real-time peak-picking fundamental period estimation device is described which is used in advanced hearing prostheses for the totally and profoundly deafened. The operation of the peak picker is compared with three well-established fundamental frequency estimation techniques: the electrolaryngograph, which is used as a "standard"; hardware implementations of the cepstral technique; and the Gold/Rabiner parallel processing algorithm. These comparisons illustrate and highlight some of the important advantages and disadvantages that characterize the operation of these techniques. The special requirements of the hearing prostheses are discussed with respect to the operation of each device, and the choice of the peak picker is found to be felicitous in this application.
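A crude offline sketch of the peak-picking idea, using SciPy's peak finder on a synthetic voiced-speech-like tone, is shown below; the real device operates in real time on speech, and its exact decision logic is not reproduced.

```python
import numpy as np
from scipy.signal import find_peaks

def peak_picking_f0(x, fs, f0_min=50.0, f0_max=400.0):
    """Crude peak-picking fundamental frequency estimate: find prominent
    positive peaks in the waveform and convert the median peak spacing to Hz."""
    min_dist = int(fs / f0_max)  # disallow peaks closer than one period at f0_max
    peaks, _ = find_peaks(x, distance=min_dist,
                          prominence=0.3 * np.max(np.abs(x)))
    if len(peaks) < 2:
        return None
    period = np.median(np.diff(peaks)) / fs
    f0 = 1.0 / period
    return f0 if f0_min <= f0 <= f0_max else None

fs = 8000
t = np.arange(0, 0.5, 1 / fs)
x = np.sin(2 * np.pi * 120 * t) + 0.3 * np.sin(2 * np.pi * 240 * t)  # 120 Hz tone
print(peak_picking_f0(x, fs))  # ~120 Hz
```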
Doppler centroid estimation ambiguity for synthetic aperture radars
NASA Technical Reports Server (NTRS)
Chang, C. Y.; Curlander, J. C.
1989-01-01
A technique for estimation of the Doppler centroid of an SAR in the presence of large uncertainty in antenna boresight pointing is described. Also investigated is the image degradation resulting from data processing that uses an ambiguous centroid. Two approaches for resolving ambiguities in Doppler centroid estimation (DCE) are presented: the range cross-correlation technique and the multiple-PRF (pulse repetition frequency) technique. Because other design factors control the PRF selection for SAR, a generalized algorithm is derived for PRFs not containing a common divisor. An example using the SIR-C parameters illustrates that this algorithm is capable of resolving the C-band DCE ambiguities for antenna pointing uncertainties of about 2-3 deg.
Bayesian sparse channel estimation
NASA Astrophysics Data System (ADS)
Chen, Chulong; Zoltowski, Michael D.
2012-05-01
In Orthogonal Frequency Division Multiplexing (OFDM) systems, the technique used to estimate and track the time-varying multipath channel is critical to ensure reliable, high data rate communications. It is recognized that wireless channels often exhibit a sparse structure, especially for wideband and ultra-wideband systems. In order to exploit this sparse structure to reduce the number of pilot tones and increase the channel estimation quality, the application of compressed sensing to channel estimation is proposed. In this article, to make the compressed channel estimation more feasible for practical applications, it is investigated from a perspective of Bayesian learning. Under the Bayesian learning framework, the large-scale compressed sensing problem, as well as large time delay for the estimation of the doubly selective channel over multiple consecutive OFDM symbols, can be avoided. Simulation studies show a significant improvement in channel estimation MSE and less computing time compared to the conventional compressed channel estimation techniques.
The Highly Adaptive Lasso Estimator
Benkeser, David; van der Laan, Mark
2017-01-01
Estimation of a regression function is a common goal of statistical learning. We propose a novel nonparametric regression estimator that, in contrast to many existing methods, does not rely on local smoothness assumptions nor is it constructed using local smoothing techniques. Instead, our estimator respects global smoothness constraints by virtue of falling in a class of right-hand continuous functions with left-hand limits that have variation norm bounded by a constant. Using empirical process theory, we establish a fast minimal rate of convergence of our proposed estimator and illustrate how such an estimator can be constructed using standard software. In simulations, we show that the finite-sample performance of our estimator is competitive with other popular machine learning techniques across a variety of data generating mechanisms. We also illustrate competitive performance in real data examples using several publicly available data sets. PMID:29094111
Sixth Annual Flight Mechanics/Estimation Theory Symposium
NASA Technical Reports Server (NTRS)
Lefferts, E. (Editor)
1981-01-01
Methods of orbital position estimation were reviewed. The problem of accuracy in orbital mechanics is discussed and various techniques in current use are presented along with suggested improvements. Of special interest is the compensation for bias in satelliteborne instruments due to attitude instabilities. Image processing and correctional techniques are reported for geodetic measurements and mapping.
NASA Technical Reports Server (NTRS)
Davis, P. A.; Penn, L. M. (Principal Investigator)
1981-01-01
A technique is developed for the estimation of total daily insolation on the basis of data derivable from operational polar-orbiting satellites. Although surface insolation and meteorological observations are used in the development, the algorithm is constrained in application by the infrequent daytime polar-orbiter coverage.
Development and evaluation of the photoload sampling technique
Robert E. Keane; Laura J. Dickinson
2007-01-01
Wildland fire managers need better estimates of fuel loading so they can accurately predict potential fire behavior and effects of alternative fuel and ecosystem restoration treatments. This report presents the development and evaluation of a new fuel sampling method, called the photoload sampling technique, to quickly and accurately estimate loadings for six common...
Estimation of Unsteady Aerodynamic Models from Dynamic Wind Tunnel Data
NASA Technical Reports Server (NTRS)
Murphy, Patrick; Klein, Vladislav
2011-01-01
Demanding aerodynamic modelling requirements for military and civilian aircraft have motivated researchers to improve computational and experimental techniques and to pursue closer collaboration in these areas. Model identification and validation techniques are key components for this research. This paper presents mathematical model structures and identification techniques that have been used successfully to model more general aerodynamic behaviours in single-degree-of-freedom dynamic testing. Model parameters, characterizing aerodynamic properties, are estimated using linear and nonlinear regression methods in both time and frequency domains. Steps in identification including model structure determination, parameter estimation, and model validation, are addressed in this paper with examples using data from one-degree-of-freedom dynamic wind tunnel and water tunnel experiments. These techniques offer a methodology for expanding the utility of computational methods in application to flight dynamics, stability, and control problems. Since flight test is not always an option for early model validation, time history comparisons are commonly made between computational and experimental results and model adequacy is inferred by corroborating results. An extension is offered to this conventional approach where more general model parameter estimates and their standard errors are compared.
Ghosal, Sayan; Gannepalli, Anil; Salapaka, Murti
2017-08-11
In this article, we explore methods that enable estimation of material properties with dynamic mode atomic force microscopy, suitable for soft matter investigation. The article presents the viewpoint of casting the system, comprising a flexure probe interacting with the sample, as an equivalent cantilever system, and compares a steady-state-analysis-based method with a recursive estimation technique for determining the parameters of the equivalent cantilever system in real time. The steady-state analysis of the equivalent cantilever model, which has been implicitly assumed in studies on material property determination, is validated analytically and experimentally. We show that the steady-state-based technique yields results that quantitatively agree with the recursive method in the domain of its validity. The steady-state technique is considerably simpler to implement, but slower than the recursive technique. The parameters of the equivalent system are utilized to interpret storage and dissipative properties of the sample. Finally, the article identifies key pitfalls that need to be avoided toward the quantitative estimation of material properties.
Tropical Cyclone Intensity Estimation Using Deep Convolutional Neural Networks
NASA Technical Reports Server (NTRS)
Maskey, Manil; Cecil, Dan; Ramachandran, Rahul; Miller, Jeffrey J.
2018-01-01
Estimating tropical cyclone intensity from satellite imagery alone is a challenging problem. The Dvorak technique has been applied successfully for more than 30 years and, with some modifications and improvements, is still used worldwide for tropical cyclone intensity estimation. A number of semi-automated techniques have been derived from the original Dvorak technique. However, these techniques suffer from subjective bias, as is evident from the estimations on October 10, 2017 at 1500 UTC for Tropical Storm Ophelia: the Dvorak intensity estimates ranged from T2.3/33 kt (Tropical Cyclone Number 2.3/33 knots) from UW-CIMSS (University of Wisconsin-Madison - Cooperative Institute for Meteorological Satellite Studies) to T3.0/45 kt from TAFB (the National Hurricane Center's Tropical Analysis and Forecast Branch) to T4.0/65 kt from SAB (NOAA/NESDIS Satellite Analysis Branch). In this particular case, two human experts at TAFB and SAB differed by 20 knots in their Dvorak analyses, and the automated version at the University of Wisconsin was 12 knots lower than either of them. The National Hurricane Center (NHC) estimates about 10-20 percent uncertainty in its post-analysis when only satellite-based estimates are available. The success of the Dvorak technique shows that spatial patterns in infrared (IR) imagery strongly relate to tropical cyclone intensity. This study aims to utilize deep learning, the current state of the art in pattern recognition and image recognition, to address the need for automated and objective tropical cyclone intensity estimation. Deep learning is a multi-layer neural network consisting of several layers of simple computational units. It learns discriminative features without relying on a human expert to identify which features are important. Our study mainly focuses on the convolutional neural network (CNN), a deep learning algorithm, to develop an objective tropical cyclone intensity estimation. CNN is a supervised learning algorithm requiring a large amount of training data. Since archives of intensity data and tropical-cyclone-centric satellite images are openly available for use, the training data are easily created by combining the two. Results, case studies, prototypes, and advantages of this approach will be discussed.
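A minimal PyTorch sketch of a CNN regressing intensity from a single-channel IR image is given below; the architecture, image size, and random stand-in data are illustrative assumptions and are not the network or training set used in this study.

```python
import torch
import torch.nn as nn

class IntensityCNN(nn.Module):
    """Small convolutional network sketch for regressing tropical cyclone
    intensity (knots) from a single-channel IR image; layer sizes are illustrative."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(4),
        )
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(32 * 4 * 4, 64),
                                  nn.ReLU(), nn.Linear(64, 1))

    def forward(self, x):
        return self.head(self.features(x))

# One training step on stand-in data: a batch of 8 "IR images" with known
# best-track intensities, optimizing mean-squared error.
model = IntensityCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
images = torch.randn(8, 1, 128, 128)        # stand-in for cyclone-centric IR images
intensities = torch.rand(8, 1) * 120 + 20   # stand-in intensities, 20-140 kt
loss = nn.functional.mse_loss(model(images), intensities)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```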
Estimation of submarine mass failure probability from a sequence of deposits with age dates
Geist, Eric L.; Chaytor, Jason D.; Parsons, Thomas E.; ten Brink, Uri S.
2013-01-01
The empirical probability of submarine mass failure is quantified from a sequence of dated mass-transport deposits. Several different techniques are described to estimate the parameters for a suite of candidate probability models. The techniques, previously developed for analyzing paleoseismic data, include maximum likelihood and Type II (Bayesian) maximum likelihood methods derived from renewal process theory and Monte Carlo methods. The estimated mean return time from these methods, unlike estimates from a simple arithmetic mean of the center age dates and standard likelihood methods, includes the effects of age-dating uncertainty and of open time intervals before the first and after the last event. The likelihood techniques are evaluated using Akaike’s Information Criterion (AIC) and Akaike’s Bayesian Information Criterion (ABIC) to select the optimal model. The techniques are applied to mass transport deposits recorded in two Integrated Ocean Drilling Program (IODP) drill sites located in the Ursa Basin, northern Gulf of Mexico. Dates of the deposits were constrained by regional bio- and magnetostratigraphy from a previous study. Results of the analysis indicate that submarine mass failures in this location occur primarily according to a Poisson process in which failures are independent and return times follow an exponential distribution. However, some of the model results suggest that submarine mass failures may occur quasiperiodically at one of the sites (U1324). The suite of techniques described in this study provides quantitative probability estimates of submarine mass failure occurrence, for any number of deposits and age uncertainty distributions.
Estimating index of refraction from polarimetric hyperspectral imaging measurements.
Martin, Jacob A; Gross, Kevin C
2016-08-08
Current material identification techniques rely on estimating reflectivity or emissivity, which vary with viewing angle. As off-nadir remote sensing platforms become increasingly prevalent, techniques robust to changing viewing geometries are desired. A technique leveraging polarimetric hyperspectral imaging (P-HSI) to estimate complex index of refraction, N̂(ν̃), an inherent material property, is presented. The imaginary component of N̂(ν̃) is modeled using a small number of "knot" points and interpolation at in-between frequencies ν̃. The real component is derived via the Kramers-Kronig relationship. P-HSI measurements of blackbody radiation scattered off of a smooth quartz window show that N̂(ν̃) can be retrieved to within 0.08 RMS error for 875 cm-1 ≤ ν̃ ≤ 1250 cm-1. P-HSI emission measurements of a heated smooth Pyrex beaker also enable successful N̂(ν̃) estimates, which are also invariant to object temperature.
Evaluation of a technique for satellite-derived area estimation of forest fires
NASA Technical Reports Server (NTRS)
Cahoon, Donald R., Jr.; Stocks, Brian J.; Levine, Joel S.; Cofer, Wesley R., III; Chung, Charles C.
1992-01-01
The advanced very high resolution radiometer (AVHRR) has been found useful for locating and monitoring both smoke and fires because of its daily observations, the large geographical coverage of the imagery, and the spectral characteristics and spatial resolution of the instrument. This paper discusses the application of AVHRR data to assess the geographical extent of burning. Methods have been developed to estimate the area burned by analyzing the surface area affected by fire with AVHRR imagery. Characteristics of the AVHRR instrument, its orbit, field of view, and archived data sets are discussed relative to the unique surface area of each pixel. The errors associated with this surface-area estimation technique are determined using AVHRR-derived area estimates of target regions with known sizes. This technique is used to evaluate the area burned during the Yellowstone fires of 1988.
Estimation of Soil Moisture with L-band Multi-polarization Radar
NASA Technical Reports Server (NTRS)
Shi, J.; Chen, K. S.; Kim, Chung-Li Y.; Van Zyl, J. J.; Njoku, E.; Sun, G.; O'Neill, P.; Jackson, T.; Entekhabi, D.
2004-01-01
Through analyses of the model-simulated database, we developed a technique to estimate surface soil moisture under the HYDROS radar sensor configuration (L-band multi-polarization and 40° incidence). This technique includes two steps. First, it decomposes the total backscattering signal into two components: the surface scattering component (the bare-surface backscattering signal attenuated by the overlying vegetation layer) and the sum of the direct volume scattering and surface-volume interaction components at different polarizations. On the model-simulated database, our decomposition technique works quite well in estimating the surface scattering components, with RMSEs of 0.12, 0.25, and 0.55 dB for VV, HH, and VH polarizations, respectively. Then, we use the decomposed surface backscattering signals to estimate the soil moisture and the combined surface-roughness and vegetation-attenuation correction factors with all three polarizations.
A Fourier approach to cloud motion estimation
NASA Technical Reports Server (NTRS)
Arking, A.; Lo, R. C.; Rosenfield, A.
1977-01-01
A Fourier technique is described for estimating cloud motion from pairs of pictures using the phase of the cross spectral density. The method allows motion estimates to be made for individual spatial frequencies, which are related to cloud pattern dimensions. Results obtained are presented and compared with those of a Fourier-domain cross-correlation scheme. Tests using both artificial and real cloud data show that the technique is relatively sensitive to the presence of mixtures of motions, changes in cloud shape, and edge effects.
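A compact sketch of the underlying idea, assuming NumPy: the phase of the cross spectral density of two frames yields the displacement between them (classical phase correlation). The per-spatial-frequency analysis described in the abstract is more detailed than this single-shift illustration.

```python
# Phase-correlation sketch: displacement between two frames from the phase of
# the cross spectral density (illustrative; not the authors' exact procedure).
import numpy as np

def phase_correlation_shift(frame1, frame2):
    F1 = np.fft.fft2(frame1)
    F2 = np.fft.fft2(frame2)
    cross_power = F1 * np.conj(F2)
    cross_power /= np.abs(cross_power) + 1e-12   # keep only the phase
    corr = np.fft.ifft2(cross_power).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Interpret peaks beyond the half-width as negative shifts.
    if dy > frame1.shape[0] // 2:
        dy -= frame1.shape[0]
    if dx > frame1.shape[1] // 2:
        dx -= frame1.shape[1]
    return dy, dx

rng = np.random.default_rng(0)
cloud = rng.random((64, 64))
moved = np.roll(np.roll(cloud, 3, axis=0), -5, axis=1)  # known shift (3, -5)
print(phase_correlation_shift(moved, cloud))
```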
WAATS: A computer program for Weights Analysis of Advanced Transportation Systems
NASA Technical Reports Server (NTRS)
Glatt, C. R.
1974-01-01
A historical weight estimating technique for advanced transportation systems is presented. The classical approach to weight estimation is discussed, and sufficient data are presented to estimate weights for a large spectrum of flight vehicles, including horizontal- and vertical-takeoff aircraft, boosters, and reentry vehicles. A computer program, WAATS (Weights Analysis for Advanced Transportation Systems), embracing the techniques discussed has been written, and user instructions are presented. The program was developed for use in the ODIN (Optimal Design Integration System).
Reconciling Estimates of Cell Proliferation from Stable Isotope Labeling Experiments
Drylewicz, Julia; Elemans, Marjet; Zhang, Yan; Kelly, Elizabeth; Reljic, Rajko; Tesselaar, Kiki; de Boer, Rob J.; Macallan, Derek C.; Borghans, José A. M.; Asquith, Becca
2015-01-01
Stable isotope labeling is the state of the art technique for in vivo quantification of lymphocyte kinetics in humans. It has been central to a number of seminal studies, particularly in the context of HIV-1 and leukemia. However, there is a significant discrepancy between lymphocyte proliferation rates estimated in different studies. Notably, deuterated 2H2-glucose (D2-glucose) labeling studies consistently yield higher estimates of proliferation than deuterated water (D2O) labeling studies. This hampers our understanding of immune function and undermines our confidence in this important technique. Whether these differences are caused by fundamental biochemical differences between the two compounds and/or by methodological differences in the studies is unknown. D2-glucose and D2O labeling experiments have never been performed by the same group under the same experimental conditions; consequently a direct comparison of these two techniques has not been possible. We sought to address this problem. We performed both in vitro and murine in vivo labeling experiments using identical protocols with both D2-glucose and D2O. This showed that intrinsic differences between the two compounds do not cause differences in the proliferation rate estimates, but that estimates made using D2-glucose in vivo were susceptible to difficulties in normalization due to highly variable blood glucose enrichment. Analysis of three published human studies made using D2-glucose and D2O confirmed this problem, particularly in the case of short term D2-glucose labeling. Correcting for these inaccuracies in normalization decreased proliferation rate estimates made using D2-glucose and slightly increased estimates made using D2O; thus bringing the estimates from the two methods significantly closer and highlighting the importance of reliable normalization when using this technique. PMID:26437372
NASA Astrophysics Data System (ADS)
Montzka, Carsten; Hendricks Franssen, Harrie-Jan; Moradkhani, Hamid; Pütz, Thomas; Han, Xujun; Vereecken, Harry
2013-04-01
An adequate description of soil hydraulic properties is essential for a good performance of hydrological forecasts. So far, several studies showed that data assimilation could reduce the parameter uncertainty by considering soil moisture observations. However, these observations and also the model forcings were recorded with a specific measurement error. It seems a logical step to base state updating and parameter estimation on observations made at multiple time steps, in order to reduce the influence of outliers at single time steps given measurement errors and unknown model forcings. Such outliers could result in erroneous state estimation as well as inadequate parameters. This has been one of the reasons to use a smoothing technique as implemented for Bayesian data assimilation methods such as the Ensemble Kalman Filter (i.e., the Ensemble Kalman Smoother). Recently, an ensemble-based smoother has been developed for state update with a SIR particle filter. However, this method has not been used for dual state-parameter estimation. In this contribution we present a Particle Smoother with sequential smoothing of particle weights for state and parameter resampling within a time window, as opposed to the single-time-step data assimilation used in filtering techniques. This can be seen as an intermediate variant between a parameter estimation technique using global optimization with estimation of single parameter sets valid for the whole period, and sequential Monte Carlo techniques with estimation of parameter sets evolving from one time step to another. The aims are (i) to improve the forecast of evaporation and groundwater recharge by estimating hydraulic parameters, and (ii) to reduce the impact of single erroneous model inputs/observations by a smoothing method. In order to validate the performance of the proposed method in a real world application, the experiment is conducted in a lysimeter environment.
Sim, K S; Yeap, Z X; Tso, C P
2016-11-01
An improvement to the existing technique for quantifying the signal-to-noise ratio (SNR) of scanning electron microscope (SEM) images using piecewise cubic Hermite interpolation (PCHIP) is proposed. The new technique applies adaptive tuning to the PCHIP and is thus named ATPCHIP. To test its accuracy, 70 images are corrupted with noise and their autocorrelation functions are then plotted. The ATPCHIP technique is applied to estimate the uncorrupted, noise-free zero-offset point from a corrupted image. Three existing methods, nearest-neighbor interpolation, first-order interpolation, and the original PCHIP, are compared with the proposed ATPCHIP method with respect to their calculated SNR values. Results show that ATPCHIP is an accurate and reliable method for estimating SNR values from SEM images. SCANNING 38:502-514, 2016. © 2015 Wiley Periodicals, Inc.
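The sketch below illustrates the general autocorrelation-based SNR idea with SciPy's plain PCHIP interpolator, extrapolating the first few lags back to lag zero to estimate a noise-free zero-offset point; the adaptive tuning of ATPCHIP and the exact SNR definition used in the paper are not reproduced here, so treat the formulation as an assumption.

```python
# Rough sketch of autocorrelation-based SNR estimation using plain PCHIP
# (the adaptive tuning of ATPCHIP is not reproduced here).
import numpy as np
from scipy.interpolate import PchipInterpolator

rng = np.random.default_rng(1)
clean = np.sin(np.linspace(0, 8 * np.pi, 512)) * 50 + 100   # stand-in image line
noisy = clean + rng.normal(0, 10, clean.size)

x = noisy - noisy.mean()
acf = np.correlate(x, x, mode="full")[x.size - 1:] / x.size  # one-sided ACF

# The zero-lag value contains signal + noise variance; estimate the noise-free
# zero-lag value by extrapolating lags 1..5 back to lag 0 with PCHIP.
lags = np.arange(1, 6)
noise_free_zero = PchipInterpolator(lags, acf[1:6], extrapolate=True)(0.0)

noise_var = acf[0] - noise_free_zero
snr_db = 10 * np.log10(noise_free_zero / noise_var)
print(f"estimated SNR ~ {snr_db:.1f} dB")
```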
Downward longwave surface radiation from sun-synchronous satellite data - Validation of methodology
NASA Technical Reports Server (NTRS)
Darnell, W. L.; Gupta, S. K.; Staylor, W. F.
1986-01-01
An extensive study has been carried out to validate a satellite technique for estimating downward longwave radiation at the surface. The technique, mostly developed earlier, uses operational sun-synchronous satellite data and a radiative transfer model to provide the surface flux estimates. The satellite-derived fluxes were compared directly with corresponding ground-measured fluxes at four different sites in the United States for a common one-year period. This provided a study of seasonal variations as well as a diversity of meteorological conditions. Dome heating errors in the ground-measured fluxes were also investigated and were corrected prior to the comparisons. Comparison of the monthly averaged fluxes from the satellite and ground sources for all four sites for the entire year showed a correlation coefficient of 0.98 and a standard error of estimate of 10 W/sq m. A brief description of the technique is provided, and the results validating the technique are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wahl, D.E.; Jakowatz, C.V. Jr.; Ghiglia, D.C.
1991-01-01
Autofocus methods in SAR and self-survey techniques in SONAR have a common mathematical basis in that they both involve estimation and correction of phase errors introduced by sensor position uncertainties. Time delay estimation and correlation methods have been shown to be effective in solving the self-survey problem for towed SONAR arrays. Since it can be shown that platform motion errors introduce similar time-delay estimation problems in SAR imaging, the question arises as to whether such techniques could be effectively employed for autofocus of SAR imagery. With a simple mathematical model for motion errors in SAR, we will show why such correlation/time-delay techniques are not nearly as effective as established SAR autofocus algorithms such as phase gradient autofocus or sub-aperture based methods. This analysis forms an important bridge between signal processing methodologies for SAR and SONAR. 5 refs., 4 figs.
A quantitative investigation of the fracture pump-in/flowback test
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plahn, S.V.; Nolte, K.G.; Thompson, L.G.
1997-02-01
Fracture-closure pressure is an important parameter for fracture treatment design and evaluation. The pump-in/flowback (PIFB) test is frequently used to estimate its magnitude. The test is attractive because bottomhole pressures (BHPs) during flowback develop a distinct and repeatable signature. This is in contrast to the pump-in/shut-in test, where strong indications of fracture closure are rarely seen. Various techniques are used to extract closure pressure from the flowback-pressure response. Unfortunately, these techniques give different estimates for closure pressure, and their theoretical bases are not well established. The authors present results that place the PIFB test on a firmer foundation. A numerical model is used to simulate the PIFB test and glean physical mechanisms contributing to the response. On the basis of their simulation results, they propose interpretation techniques that give better estimates of closure pressure than existing techniques.
NASA Technical Reports Server (NTRS)
Doneaud, Andre A.; Miller, James R., Jr.; Johnson, L. Ronald; Vonder Haar, Thomas H.; Laybe, Patrick
1987-01-01
The use of the area-time-integral (ATI) technique, based only on satellite data, to estimate convective rain volume over a moving target is examined. The technique is based on the correlation between the radar echo area coverage integrated over the lifetime of the storm and the radar-estimated rain volume. The processing of the GOES and radar data collected in 1981 is described. The radar and satellite parameters for six convective clusters from storm events occurring on June 12 and July 2, 1981 are analyzed and compared in terms of time steps and cluster lifetimes. Rain volume is calculated by first using regression analysis to relate the ATI to rain volume; the resulting ATI-versus-rain-volume relation is then employed to compute rain volume. The data reveal that the ATI technique using satellite data is applicable to the calculation of rain volume.
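A minimal sketch of the regression step, assuming a log-log linear relation between ATI and rain volume and hypothetical radar-derived calibration pairs; the actual regression coefficients and units from the study are not reproduced here.

```python
# Illustrative ATI regression: fit log(rain volume) against log(ATI) on
# radar-derived pairs, then apply to a satellite-derived ATI (hypothetical numbers).
import numpy as np

ati = np.array([1.2e3, 3.4e3, 8.0e3, 1.5e4, 4.1e4, 9.6e4])          # km^2 h
rain_volume = np.array([0.9e6, 2.8e6, 7.1e6, 1.2e7, 3.6e7, 8.9e7])  # m^3

slope, intercept = np.polyfit(np.log(ati), np.log(rain_volume), 1)

def rain_volume_from_ati(ati_value):
    """Apply the fitted ATI-rain volume relation to a new cluster."""
    return np.exp(intercept + slope * np.log(ati_value))

print(f"{rain_volume_from_ati(2.0e4):.3e} m^3 for a satellite ATI of 2.0e4 km^2 h")
```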
Wavelet Analyses of Oil Prices, USD Variations and Impact on Logistics
NASA Astrophysics Data System (ADS)
Melek, M.; Tokgozlu, A.; Aslan, Z.
2009-07-01
This paper is related with temporal variations of historical oil prices and Dollar and Euro in Turkey. Daily data based on OECD and Central Bank of Turkey records beginning from 1946 has been considered. 1D-continuous wavelets and wavelet packets analysis techniques have been applied on data. Wavelet techniques help to detect abrupt changing's, increasing and decreasing trends of data. Estimation of variables has been presented by using linear regression estimation techniques. The results of this study have been compared with the small and large scale effects. Transportation costs of track show a similar variation with fuel prices. The second part of the paper is related with estimation of imports, exports, costs, total number of vehicles and annual variations by considering temporal variation of oil prices and Dollar currency in Turkey. Wavelet techniques offer a user friendly methodology to interpret some local effects on increasing trend of imports and exports data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butler, J.J. Jr.; Hyder, Z.
The Nguyen and Pinder method is one of four techniques commonly used for analysis of response data from slug tests. Limited field research has raised questions about the reliability of the parameter estimates obtained with this method. A theoretical evaluation of this technique reveals that errors were made in the derivation of the analytical solution upon which the technique is based. Simulation and field examples show that the errors result in parameter estimates that can differ from actual values by orders of magnitude. These findings indicate that the Nguyen and Pinder method should no longer be a tool in the repertoire of the field hydrogeologist. If data from a slug test performed in a partially penetrating well in a confined aquifer need to be analyzed, recent work has shown that the Hvorslev method is the best alternative among the commonly used techniques.
NASA Astrophysics Data System (ADS)
Asfahani, Jamal
2017-08-01
An alternative approach using nuclear neutron-porosity and electrical resistivity well logging with long (64 inch) and short (16 inch) normal techniques is proposed to estimate the porosity and the hydraulic conductivity (K) of the basaltic aquifers in Southern Syria. The method is applied to the available logs of the Kodana well in Southern Syria. The K value obtained by applying this technique is reasonable and comparable with the hydraulic conductivity value of 3.09 m/day obtained from the pumping test carried out at the Kodana well. The proposed alternative well logging methodology seems promising and could be applied in basaltic environments for the estimation of hydraulic conductivity. However, more detailed research is still required before the proposed technique can be routinely applied in basaltic environments.
NASA Astrophysics Data System (ADS)
Nakamura, T. K. M.; Nakamura, R.; Varsani, A.; Genestreti, K. J.; Baumjohann, W.; Liu, Y.-H.
2018-05-01
A remote sensing technique to infer the local reconnection electric field based on in situ multipoint spacecraft observation at the reconnection separatrix is proposed. In this technique, the increment of the reconnected magnetic flux is estimated by integrating the in-plane magnetic field during the sequential observation of the separatrix boundary by multipoint measurements. We tested this technique by applying it to virtual observations in a two-dimensional fully kinetic particle-in-cell simulation of magnetic reconnection without a guide field and confirmed that the estimated reconnection electric field indeed agrees well with the exact value computed at the X-line. We then applied this technique to an event observed by the Magnetospheric Multiscale mission when crossing an energetic plasma sheet boundary layer during an intense substorm. The estimated reconnection electric field for this event is nearly 1 order of magnitude higher than a typical value of magnetotail reconnection.
Application of remote sensing techniques for identification of irrigated crop lands in Arizona
NASA Technical Reports Server (NTRS)
Billings, H. A.
1981-01-01
Satellite imagery was used in a project developed to demonstrate remote sensing methods of determining irrigated acreage in Arizona. The Maricopa water district, west of Phoenix, was chosen as the test area. Band ratioing and unsupervised categorization were used to perform the inventory. For both techniques, the irrigation district boundaries and section lines were digitized, and irrigated acreage was calculated and displayed by section. Both estimation techniques were quite accurate in estimating irrigated acreage in the 1979 growing season.
NASA Astrophysics Data System (ADS)
Jin, Minquan; Delshad, Mojdeh; Dwarakanath, Varadarajan; McKinney, Daene C.; Pope, Gary A.; Sepehrnoori, Kamy; Tilburg, Charles E.; Jackson, Richard E.
1995-05-01
In this paper we present a partitioning interwell tracer test (PITT) technique for the detection, estimation, and remediation performance assessment of the subsurface contaminated by nonaqueous phase liquids (NAPLs). We demonstrate the effectiveness of this technique by examples of experimental and simulation results. The experimental results are from partitioning tracer experiments in columns packed with Ottawa sand. Both the method of moments and inverse modeling techniques for estimating NAPL saturation in the sand packs are demonstrated. In the simulation examples we use UTCHEM, a comprehensive three-dimensional, chemical flood compositional simulator developed at the University of Texas, to simulate a hypothetical two-dimensional aquifer with properties similar to the Borden site contaminated by tetrachloroethylene (PCE), and we show how partitioning interwell tracer tests can be used to estimate the amount of PCE contaminant before remedial action and as the remediation process proceeds. Tracer tests results from different stages of remediation are compared to determine the quantity of PCE removed and the amount remaining. Both the experimental (small-scale) and simulation (large-scale) results demonstrate that PITT can be used as an innovative and effective technique to detect and estimate the amount of residual NAPL and for remediation performance assessment in subsurface formations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jin, M.; Delshad, M.; Dwarakanath, V.
1995-05-01
In this paper we present a partitioning interwell tracer test (PITT) technique for the detection, estimation, and remediation performance assessment of the subsurface contaminated by nonaqueous phase liquids (NAPLs). We demonstrate the effectiveness of this technique by examples of experimental and simulation results. The experimental results are from partitioning tracer experiments in columns packed with Ottawa sand. Both the method of moments and inverse modeling techniques for estimating NAPL saturation in the sand packs are demonstrated. In the simulation examples we use UTCHEM, a comprehensive three-dimensional, chemical flood compositional simulator developed at the University of Texas, to simulate a hypothetical two-dimensional aquifer with properties similar to the Borden site contaminated by tetrachloroethylene (PCE), and we show how partitioning interwell tracer tests can be used to estimate the amount of PCE contaminant before remedial action and as the remediation process proceeds. Tracer test results from different stages of remediation are compared to determine the quantity of PCE removed and the amount remaining. Both the experimental (small-scale) and simulation (large-scale) results demonstrate that PITT can be used as an innovative and effective technique to detect and estimate the amount of residual NAPL and for remediation performance assessment in subsurface formations. 43 refs., 10 figs., 1 tab.
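The method-of-moments step can be illustrated as below, using hypothetical breakthrough curves; the saturation relation S_N = (R - 1)/(R - 1 + K) is the standard partitioning-tracer formulation and is assumed for illustration, not quoted from the paper.

```python
# Method-of-moments sketch for a partitioning tracer pair (hypothetical data).
import numpy as np

t = np.linspace(0, 50, 501)                    # time, days
c_nonpart = np.exp(-(t - 10) ** 2 / 8.0)       # non-partitioning tracer breakthrough
c_part = np.exp(-(t - 14) ** 2 / 10.0)         # partitioning tracer (retarded)

def first_moment(time, conc):
    """Mean residence time from the normalized first temporal moment."""
    return np.trapz(time * conc, time) / np.trapz(conc, time)

K = 30.0                                       # tracer/NAPL partition coefficient (assumed)
R = first_moment(t, c_part) / first_moment(t, c_nonpart)
S_n = (R - 1.0) / (R - 1.0 + K)
print(f"retardation R = {R:.2f}, estimated NAPL saturation = {S_n:.3f}")
```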
Piovesan, Davide; Pierobon, Alberto; DiZio, Paul; Lackner, James R
2012-01-01
This study presents and validates a Time-Frequency technique for measuring 2-dimensional multijoint arm stiffness throughout a single planar movement as well as during static posture. It is proposed as an alternative to current regressive methods, which require numerous repetitions to obtain average stiffness on a small segment of the hand trajectory. The method is based on the analysis of the reassigned spectrogram of the arm's response to impulsive perturbations and can estimate arm stiffness on a trial-by-trial basis. Analytic and empirical methods are first derived and tested through modal analysis on synthetic data. The technique's accuracy and robustness are assessed by modeling the estimation of stiffness time profiles changing at different rates and affected by different noise levels. Our method obtains results comparable with two well-known regressive techniques. We also test how the technique can identify the viscoelastic component of non-linear and higher than second order systems with a non-parametrical approach. The technique proposed here is highly robust to noise and can be used easily for both postural and movement tasks. Estimations of stiffness profiles are possible with only one perturbation, making our method a useful tool for estimating limb stiffness during motor learning and adaptation tasks, and for understanding the modulation of stiffness in individuals with neurodegenerative diseases.
Kate, Rohit J.; Swartz, Ann M.; Welch, Whitney A.; Strath, Scott J.
2016-01-01
Wearable accelerometers can be used to objectively assess physical activity. However, the accuracy of this assessment depends on the underlying method used to process the time series data obtained from accelerometers. Several methods have been proposed that use these data to identify the type of physical activity and estimate its energy cost. Most of the newer methods employ some machine learning technique along with suitable features to represent the time series data. This paper experimentally compares several of these techniques and features on a large dataset of 146 subjects performing eight different physical activities while wearing an accelerometer on the hip. Besides features based on statistics, distance-based features and simple discrete features taken straight from the time series were also evaluated. On the physical activity type identification task, the results show that using more features significantly improves results. Choice of machine learning technique was also found to be important. However, on the energy cost estimation task, choice of features and machine learning technique were found to be less influential. On that task, separate energy cost estimation models trained specifically for each type of physical activity were found to be more accurate than a single model trained for all types of physical activities. PMID:26862679
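A small sketch of the feature-plus-classifier pipeline for activity type identification, assuming scikit-learn, synthetic accelerometer windows, and simple per-window statistics; the features, classifier, and data are illustrative stand-ins for those compared in the paper.

```python
# Sketch of a feature-plus-classifier pipeline for activity type identification
# (synthetic data; features and classifier are illustrative choices).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def window_features(window):
    # Basic statistical features of one accelerometer window.
    return [window.mean(), window.std(), window.min(), window.max(),
            np.percentile(window, 25), np.percentile(window, 75)]

# Stand-in data: 200 ten-second windows at 30 Hz, two activity labels.
windows = rng.normal(size=(200, 300)) + rng.integers(0, 2, 200)[:, None]
labels = (windows.mean(axis=1) > 0.5).astype(int)

X = np.array([window_features(w) for w in windows])
clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```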
A Comparison of Anthropogenic Carbon Dioxide Emissions Datasets: UND and CDIAC
NASA Astrophysics Data System (ADS)
Gregg, J. S.; Andres, R. J.
2005-05-01
Using data from the Department of Energy's Energy Information Administration (EIA), a technique is developed to estimate the monthly consumption of solid, liquid and gaseous fossil fuels for each state in the union. This technique employs monthly sales data to estimate the relative monthly proportions of the total annual carbon dioxide emissions from fossil-fuel use for all states in the union. The University of North Dakota (UND) results are compared to those published by Carbon Dioxide Information Analysis Center (CDIAC) at the Oak Ridge National Laboratory (ORNL). Recently, annual emissions per U.S. state (Blasing, Broniak, Marland, 2004a) as well as monthly CO2 emissions for the United States (Blasing, Broniak, Marland, 2004b) have been added to the CDIAC website. To determine the success of this technique, the individual state results are compared to the annual state totals calculated by CDIAC. In addition, the monthly country totals are compared with those produced by CDIAC. In general, the UND technique produces estimates that are consistent with those available on the CDIAC Trends website. Comparing the results from these two methods permits an improved understanding of the strengths and shortcomings of both estimation techniques. The primary advantages of the UND approach are its ease of implementation, the improved spatial and temporal resolution it can produce, and its universal applicability.
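The apportionment idea can be sketched as below: an annual state emissions total is distributed across months in proportion to monthly fuel-sales shares. The numbers are hypothetical, and the actual UND procedure treats solid, liquid, and gaseous fuels separately.

```python
# Sketch of apportioning an annual state emissions total across months in
# proportion to monthly fuel-sales data (numbers are hypothetical).
import numpy as np

annual_emissions_ktC = 25_000.0                          # annual total for one state
monthly_sales = np.array([9.8, 8.7, 8.1, 6.9, 6.2, 6.0,
                          6.4, 6.6, 6.1, 6.8, 7.9, 9.5])  # fuel sales, arbitrary units

monthly_share = monthly_sales / monthly_sales.sum()
monthly_emissions = annual_emissions_ktC * monthly_share
for month, value in zip(range(1, 13), monthly_emissions):
    print(f"month {month:2d}: {value:8.1f} ktC")
```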
NASA Technical Reports Server (NTRS)
Moes, Timothy R.; Smith, Mark S.; Morelli, Eugene A.
2003-01-01
Near real-time stability and control derivative extraction is required to support flight demonstration of Intelligent Flight Control System (IFCS) concepts being developed by NASA, academia, and industry. Traditionally, flight maneuvers would be designed and flown to obtain stability and control derivative estimates using a postflight analysis technique. The goal of the IFCS concept is to be able to modify the control laws in real time for an aircraft that has been damaged in flight. In some IFCS implementations, real-time parameter identification (PID) of the stability and control derivatives of the damaged aircraft is necessary for successfully reconfiguring the control system. This report investigates the usefulness of Prescribed Simultaneous Independent Surface Excitations (PreSISE) to provide data for rapidly obtaining estimates of the stability and control derivatives. Flight test data were analyzed using both equation-error and output-error PID techniques. The equation-error PID technique is known as Fourier Transform Regression (FTR) and is a frequency-domain real-time implementation. Selected results were compared with a time-domain output-error technique. The real-time equation-error technique combined with the PreSISE maneuvers provided excellent derivative estimation in the longitudinal axis. However, the PreSISE maneuvers as presently defined were not adequate for accurate estimation of the lateral-directional derivatives.
NASA Astrophysics Data System (ADS)
Tran, H.; Mansfield, M. L.; Lyman, S. N.; O'Neil, T.; Jones, C. P.
2015-12-01
Emissions from produced-water treatment ponds are poorly characterized sources in oil and gas emission inventories that play a critical role in studying elevated winter ozone events in the Uintah Basin, Utah, U.S. Information gaps include un-quantified amounts and compositions of gases emitted from these facilities. The emitted gases are often known as volatile organic compounds (VOCs) which, besides nitrogen oxides (NOx), are major precursors for ozone formation in the near-surface layer. Field measurement campaigns using the flux-chamber technique have been performed to measure VOC emissions from a limited number of produced-water ponds in the Uintah Basin of eastern Utah. Although the flux chamber provides accurate measurements at the point of sampling, it covers just a limited area of the ponds and is prone to altering environmental conditions (e.g., temperature, pressure). This fact raises the need to validate flux-chamber measurements. In this study, we apply an inverse-dispersion modeling technique with evacuated canister sampling to validate the flux-chamber measurements. This modeling technique applies an initial and arbitrary emission rate to estimate pollutant concentrations at pre-defined receptors, and adjusts the emission rate until the estimated pollutant concentrations approximate the measured concentrations at the receptors. The derived emission rates are then compared with flux-chamber measurements and differences are analyzed. Additionally, we investigate the applicability of the WATER9 wastewater emission model for the estimation of VOC emissions from produced-water ponds in the Uintah Basin. WATER9 estimates the emission of each gas based on properties of the gas, its concentration in the waste water, and the characteristics of the influent and treatment units. Results of VOC emission estimations using inverse-dispersion and WATER9 modeling techniques will be reported.
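A minimal sketch of the inverse-dispersion scaling, assuming receptor concentrations scale linearly with the emission rate (as in standard Gaussian-plume formulations); the placeholder dispersion model, dilution factors, and measurements below are illustrative, not values from the study.

```python
# Sketch of the inverse-dispersion idea: run a dispersion model with an
# arbitrary trial emission rate, then rescale it so modeled receptor
# concentrations best match the canister measurements (least squares).
import numpy as np

def modeled_concentration(emission_rate, dilution_factors):
    # Placeholder dispersion model: concentration = Q / dilution at each receptor.
    return emission_rate / dilution_factors

trial_rate = 1.0                                   # g/s, arbitrary starting value
dilution = np.array([2.0e4, 5.5e4, 1.2e5])         # hypothetical receptor dilution factors
measured = np.array([4.1e-5, 1.6e-5, 0.8e-5])      # g/m^3 at the canister receptors

modeled = modeled_concentration(trial_rate, dilution)
scale = np.sum(measured * modeled) / np.sum(modeled ** 2)   # least-squares scaling
print(f"estimated emission rate ~ {trial_rate * scale:.2f} g/s")
```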
NASA Astrophysics Data System (ADS)
Yamazaki, Takaharu; Futai, Kazuma; Tomita, Tetsuya; Sato, Yoshinobu; Yoshikawa, Hideki; Tamura, Shinichi; Sugamoto, Kazuomi
2011-03-01
To achieve 3D kinematic analysis of total knee arthroplasty (TKA), 2D/3D registration techniques, which use X-ray fluoroscopic images and a computer-aided design (CAD) model of the knee implant, have attracted attention in recent years. These techniques can provide information regarding the movement of the radiopaque femoral and tibial components but not of the radiolucent polyethylene insert, because the insert silhouette does not appear clearly on the X-ray image. It has therefore been difficult to obtain 3D kinematics of the polyethylene insert, particularly mobile-bearing inserts that move on the tibial component. This study presents a technique, and its accuracy, for 3D kinematic analysis of the mobile-bearing insert in TKA using X-ray fluoroscopy, and finally presents clinical applications. For 3D pose estimation of the mobile-bearing insert using X-ray fluoroscopy, tantalum beads and a CAD model containing those beads are utilized, and the 3D pose of the insert model is estimated using a feature-based 2D/3D registration technique. To validate the accuracy of the present technique, experiments including a computer simulation test were performed. The results showed that the pose estimation accuracy was sufficient for analyzing mobile-bearing TKA kinematics (RMS error: about 1.0 mm, 1.0 degree). In the clinical applications, seven patients with mobile-bearing TKA were studied and analyzed during deep knee bending motion. Consequently, the present technique enables us to better understand mobile-bearing TKA kinematics, and this type of evaluation is thought to be helpful for improving implant design and optimizing TKA surgical techniques.
England, M L; Broderick, G A; Shaver, R D; Combs, D K
1997-11-01
Ruminally undegraded protein (RUP) values of blood meal (n = 2), hydrolyzed feather meal (n = 2), fish meal (n = 2), meat and bone meal, and soybean meal were estimated using an in situ method, an inhibitor in vitro method, and an inhibitor in vitro technique applying Michaelis-Menten saturation kinetics. Degradation rates for in situ and inhibitor in vitro methods were calculated by regression of the natural log of the proportion of crude protein (CP) remaining undegraded versus time. Nonlinear regression analysis of the integrated Michaelis-Menten equation was used to determine maximum velocity, the Michaelis constant, and degradation rate (the ratio of maximum velocity to the Michaelis constant). A ruminal passage rate of 0.06/h was assumed in the calculation of RUP. The in situ and inhibitor in vitro techniques yielded similar estimates of ruminal degradation. Mean RUP estimated for soybean meal, blood meal, hydrolyzed feather meal, fish meal, and meat and bone meal were, respectively, 28.6, 86.0, 77.4, 52.9, and 52.6% of CP by the in situ method and 26.4, 86.1, 76.0, 59.6, and 49.5% of CP by the inhibitor in vitro technique. The Michaelis-Menten inhibitor in vitro technique yielded more rapid CP degradation rates and decreased estimates of RUP. The inhibitor in vitro method required less time and labor than did the other two techniques to estimate the RUP values of animal by-product proteins. Results from in vitro incubations with pepsin.HCl suggested that low postruminal digestibility of hydrolyzed feather meal may impair its value as a source of RUP.
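The degradation-rate regression and the escape calculation can be sketched as below; the simplified first-order form RUP = kp/(kp + kd), with the assumed passage rate kp = 0.06/h, is an illustrative assumption and may differ from the exact calculation used in the paper.

```python
# Sketch of the regression step (kd from ln fraction remaining vs time) and a
# simplified first-order escape calculation for RUP (assumption, see lead-in).
import numpy as np

hours = np.array([0, 2, 4, 8, 16, 24], dtype=float)
fraction_cp_remaining = np.array([1.00, 0.82, 0.68, 0.47, 0.24, 0.12])  # hypothetical

# Degradation rate kd from the slope of ln(fraction remaining) versus time.
kd = -np.polyfit(hours, np.log(fraction_cp_remaining), 1)[0]

kp = 0.06  # assumed ruminal passage rate, per hour
rup_percent = 100.0 * kp / (kp + kd)
print(f"kd = {kd:.3f}/h, estimated RUP = {rup_percent:.1f}% of CP")
```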
Improving Focal Depth Estimates: Studies of Depth Phase Detection at Regional Distances
NASA Astrophysics Data System (ADS)
Stroujkova, A.; Reiter, D. T.; Shumway, R. H.
2006-12-01
The accurate estimation of the depth of small, regionally recorded events continues to be an important and difficult explosion monitoring research problem. Depth phases (free surface reflections) are the primary tool that seismologists use to constrain the depth of a seismic event. When depth phases from an event are detected, an accurate source depth is easily found by using the delay times of the depth phases relative to the P wave and a velocity profile near the source. Cepstral techniques, including cepstral F-statistics, represent a class of methods designed for depth-phase detection and identification; however, they offer only a moderate level of success at epicentral distances less than 15°. This is due to complexities in the Pn coda, which can lead to numerous false detections in addition to the true phase detection. Therefore, cepstral methods cannot be used independently to reliably identify depth phases. Other evidence, such as apparent velocities, amplitudes and frequency content, must be used to confirm whether the phase is truly a depth phase. In this study we used a variety of array methods to estimate apparent phase velocities and arrival azimuths, including beam-forming, semblance analysis, MUltiple SIgnal Classification (MUSIC) (e.g., Schmidt, 1979), and cross-correlation (e.g., Cansi, 1995; Tibuleac and Herrin, 1997). To facilitate the processing and comparison of results, we developed a MATLAB-based processing tool, which allows application of all of these techniques (i.e., augmented cepstral processing) in a single environment. The main objective of this research was to combine the results of three focal-depth estimation techniques and their associated standard errors into a statistically valid unified depth estimate. The three techniques include: 1. Direct focal depth estimate from the depth-phase arrival times picked via augmented cepstral processing. 2. Hypocenter location from direct and surface-reflected arrivals observed on sparse networks of regional stations using a Grid-search, Multiple-Event Location method (GMEL; Rodi and Toksöz, 2000; 2001). 3. Surface-wave dispersion inversion for event depth and focal mechanism (Herrmann and Ammon, 2002). To validate our approach and provide quality control for our solutions, we applied the techniques to moderate-sized events (mb between 4.5 and 6.0) with known focal mechanisms. We illustrate the techniques using events observed at regional distances from the KSAR (Wonju, South Korea) teleseismic array and other nearby broadband three-component stations. Our results indicate that the techniques can produce excellent agreement between the various depth estimates. In addition, combining the techniques into a "unified" estimate greatly reduced location errors and improved robustness of the solution, even if results from the individual methods yielded large standard errors.
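One standard way to fuse independent depth estimates with their standard errors is inverse-variance weighting, sketched below with hypothetical values; the study's unified estimate may use a more elaborate statistical combination.

```python
# Inverse-variance weighting as a simple way to fuse independent depth estimates
# (illustrative; not necessarily the exact combination used in the study).
import numpy as np

depths_km = np.array([12.0, 9.5, 11.0])   # cepstral, GMEL, surface-wave estimates (hypothetical)
sigmas_km = np.array([3.0, 2.0, 4.0])     # associated standard errors (hypothetical)

weights = 1.0 / sigmas_km ** 2
unified_depth = np.sum(weights * depths_km) / np.sum(weights)
unified_sigma = np.sqrt(1.0 / np.sum(weights))
print(f"unified depth = {unified_depth:.1f} +/- {unified_sigma:.1f} km")
```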
NASA Astrophysics Data System (ADS)
Gaona Garcia, J.; Lewandowski, J.; Bellin, A.
2017-12-01
Groundwater-stream water interactions in rivers determine water balances, but also chemical and biological processes in the streambed at different spatial and temporal scales. Because gaining, neutral and losing conditions are difficult to identify and quantify, it is necessary to combine techniques with complementary capabilities and scale ranges. We applied this concept to a study site at the River Schlaube, East Brandenburg, Germany, a sand-bed stream with intense sediment heterogeneity and complex environmental conditions. In our approach, point techniques such as temperature profiles of the streambed together with vertical hydraulic gradients provide data for the estimation of fluxes between groundwater and surface water with the numerical model 1DTempPro. Among distributed techniques, fiber-optic distributed temperature sensing identifies the spatial patterns of neutral, down- and up-welling areas by analysis of the changes in the thermal patterns at the streambed interface under given flow conditions. The study finally links point and surface temperatures to provide a method for upscaling of fluxes. Point techniques provide point flux estimates with essential depth detail to infer streambed structures, but the results hardly represent the spatial distribution of fluxes caused by the heterogeneity of streambed properties. Fiber optics proved capable of providing spatial thermal patterns with enough resolution to observe distinct hyporheic thermal footprints at multiple scales. Relating the thermal footprint patterns and their temporal behavior to flux results from point techniques enabled the use of methods for spatial flux estimates. The lack of detailed information on the spatial distribution of the physical drivers restricts the spatial flux estimation to the application of the T-proxy method, whose highly uncertain results mainly provide coarse spatial flux estimates. The study concludes that the upscaling of groundwater-stream water interactions using thermal measurements with combined point and distributed techniques requires the integration of physical drivers because of the heterogeneity of the flux patterns. Combined experimental and modeling approaches may help to obtain a more reliable understanding of groundwater-surface water interactions at multiple scales.
An extended stochastic method for seismic hazard estimation
NASA Astrophysics Data System (ADS)
Abd el-aal, A. K.; El-Eraki, M. A.; Mostafa, S. I.
2015-12-01
In this contribution, we developed an extended stochastic technique for seismic hazard assessment purposes. This technique builds on the stochastic technique of Boore (2003) "Simulation of ground motion using the stochastic method. Pure Appl. Geophys. 160:635-676". The essential characteristic of the extended stochastic technique is to obtain and simulate ground motion in order to minimize future earthquake consequences. The first step of this technique is defining the seismic sources which mostly affect the study area. Then, the maximum expected magnitude is defined for each of these seismic sources. It is followed by estimating the ground motion using an empirical attenuation relationship. Finally, the site amplification is implemented in calculating the peak ground acceleration (PGA) at each site of interest. We tested and applied this developed technique at Cairo, Suez, Port Said, Ismailia, Zagazig and Damietta cities to predict the ground motion. Also, it is applied at Cairo, Zagazig and Damietta cities to estimate the maximum peak ground acceleration at actual soil conditions. In addition, 0.5, 1, 5, 10 and 20 % damping median response spectra are estimated using the extended stochastic simulation technique. The calculated highest acceleration values at bedrock conditions are found at Suez city with a value of 44 cm s-2. However, these acceleration values decrease towards the north of the study area to reach 14.1 cm s-2 at Damietta city. This agrees with the results of previous seismic hazard studies in northern Egypt. This work can be used for seismic risk mitigation and earthquake engineering purposes.
The Inverse Problem for Confined Aquifer Flow: Identification and Estimation With Extensions
NASA Astrophysics Data System (ADS)
Loaiciga, Hugo A.; Mariño, Miguel A.
1987-01-01
The contributions of this work are twofold. First, a methodology for estimating the elements of parameter matrices in the governing equation of flow in a confined aquifer is developed. The estimation techniques for the distributed-parameter inverse problem pertain to linear least squares and generalized least squares methods. The linear relationship among the known heads and unknown parameters of the flow equation provides the background for developing criteria for determining the identifiability status of unknown parameters. Under conditions of exact or overidentification it is possible to develop statistically consistent parameter estimators and their asymptotic distributions. The estimation techniques, namely, two-stage least squares and three stage least squares, are applied to a specific groundwater inverse problem and compared between themselves and with an ordinary least squares estimator. The three-stage estimator provides the closer approximation to the actual parameter values, but it also shows relatively large standard errors as compared to the ordinary and two-stage estimators. The estimation techniques provide the parameter matrices required to simulate the unsteady groundwater flow equation. Second, a nonlinear maximum likelihood estimation approach to the inverse problem is presented. The statistical properties of maximum likelihood estimators are derived, and a procedure to construct confidence intervals and do hypothesis testing is given. The relative merits of the linear and maximum likelihood estimators are analyzed. Other topics relevant to the identification and estimation methodologies, i.e., a continuous-time solution to the flow equation, coping with noise-corrupted head measurements, and extension of the developed theory to nonlinear cases are also discussed. A simulation study is used to evaluate the methods developed in this study.
Longitudinal Factor Score Estimation Using the Kalman Filter.
ERIC Educational Resources Information Center
Oud, Johan H.; And Others
1990-01-01
How longitudinal factor score estimation--the estimation of the evolution of factor scores for individual examinees over time--can profit from the Kalman filter technique is described. The Kalman estimates change more cautiously over time, have lower estimation error variances, and reproduce the LISREL program latent state correlations more…
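A toy scalar Kalman filter conveys the flavor of the approach: the filtered factor score changes cautiously between occasions and carries an explicit error variance. The random-walk state model and the variances below are assumptions for illustration, not the LISREL-based setup of the article.

```python
# Minimal scalar Kalman filter for a latent factor score observed with error at
# several occasions (toy sketch of the idea, not the article's procedure).
import numpy as np

observed_scores = np.array([0.2, 0.5, 0.4, 0.9, 1.1])  # hypothetical factor indicators
process_var, measurement_var = 0.05, 0.20               # assumed variances

estimate, estimate_var = 0.0, 1.0                        # diffuse-ish prior
for y in observed_scores:
    # Predict: latent score follows a random walk between occasions.
    estimate_var += process_var
    # Update: blend prediction and new observation by the Kalman gain.
    gain = estimate_var / (estimate_var + measurement_var)
    estimate += gain * (y - estimate)
    estimate_var *= (1.0 - gain)
    print(f"filtered score = {estimate:.3f} (error variance {estimate_var:.3f})")
```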
A spline-based parameter and state estimation technique for static models of elastic surfaces
NASA Technical Reports Server (NTRS)
Banks, H. T.; Daniel, P. L.; Armstrong, E. S.
1983-01-01
Parameter and state estimation techniques for an elliptic system arising in a developmental model for the antenna surface in the Maypole Hoop/Column antenna are discussed. A computational algorithm based on spline approximations for the state and elastic parameters is given and numerical results obtained using this algorithm are summarized.
B. Lane Rivenbark; C. Rhett Jackson
2004-01-01
Regional average evapotranspiration estimates developed by water balance techniques are frequently used to estimate average discharge in ungaged streams. However, the lower stream size range for the validity of these techniques has not been explored. Flow records were collected and evaluated for 16 small streams in the Southern Appalachians to test whether the...
Estimating abundance of Sitka black-tailed deer using DNA from fecal pellets
Todd J. Brinkman; David K. Person; F. Stuart Chapin; Winston Smith; Kris J. Hundertmark
2011-01-01
Densely vegetated environments have hindered collection of basic population parameters on forest-dwelling ungulates. Our objective was to develop a mark-recapture technique that used DNA from fecal pellets to overcome constraints associated with estimating abundance of ungulates in landscapes where direct observation is difficult. We tested our technique on Sitka black...
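As a simple illustration of DNA-based mark-recapture abundance estimation, the sketch below uses Chapman's version of the Lincoln-Petersen estimator with hypothetical genotype counts; the authors' estimator may differ.

```python
# Chapman's version of the Lincoln-Petersen estimator as a simple illustration of
# DNA-based mark-recapture abundance estimation (the authors' model may differ).
def chapman_estimate(n1, n2, m2):
    """n1: unique genotypes in session 1; n2: in session 2; m2: recaptured genotypes."""
    n_hat = (n1 + 1) * (n2 + 1) / (m2 + 1) - 1
    var = ((n1 + 1) * (n2 + 1) * (n1 - m2) * (n2 - m2)
           / ((m2 + 1) ** 2 * (m2 + 2)))
    return n_hat, var ** 0.5

n_hat, se = chapman_estimate(n1=42, n2=38, m2=11)   # hypothetical pellet-DNA counts
print(f"estimated abundance = {n_hat:.0f} deer (SE ~ {se:.0f})")
```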
Two above-ground forest biomass estimation techniques were evaluated for the United States Territory of Puerto Rico using predictor variables acquired from satellite based remotely sensed data and ground data from the U.S. Department of Agriculture Forest Inventory Analysis (FIA)...
Comparing techniques for estimating flame temperature of prescribed fires
Deborah K. Kennard; Kenneth W. Outcalt; David Jones; Joseph J. O'Brien
2005-01-01
A variety of techniques that estimate temperature and/or heat output during fires are available. We assessed the predictive ability of metal and tile pyrometers, calorimeters of different sizes, and fuel consumption to time-temperature metrics derived from thick and thin thermocouples at 140 points distributed over 9 management-scale burns in a longleaf pine forest in...
NASA Astrophysics Data System (ADS)
Luu, Gia Thien; Boualem, Abdelbassit; Duy, Tran Trung; Ravier, Philippe; Butteli, Olivier
Muscle Fiber Conduction Velocity (MFCV) can be calculated from the time delay between the surface electromyographic (sEMG) signals recorded by electrodes aligned with the fiber direction. In order to account for the non-stationarity of the data during dynamic contraction (the most common situation in daily life), the developed methods have to consider that the MFCV changes over time, which induces time-varying delays (TVD), and that the data are non-stationary (changes in the power spectral density, PSD). In this paper, the problem of TVD estimation is considered using a parametric method. First, a polynomial model of the TVD is proposed. Then, the TVD model parameters are estimated using a maximum likelihood estimation (MLE) strategy solved by a deterministic optimization technique (Newton) and a stochastic optimization technique called simulated annealing (SA). The performance of the two techniques is also compared. We also derive two appropriate Cramer-Rao Lower Bounds (CRLBs) for the estimated TVD model parameters and for the TVD waveforms. Monte Carlo simulation results show that the estimation of both the model parameters and the TVD function is unbiased and that the variance obtained is close to the derived CRLBs. A comparison with non-parametric approaches to TVD estimation is also presented and shows the superiority of the proposed method.
Satellite angular velocity estimation based on star images and optical flow techniques.
Fasano, Giancarmine; Rufino, Giancarlo; Accardo, Domenico; Grassi, Michele
2013-09-25
An optical flow-based technique is proposed to estimate spacecraft angular velocity based on sequences of star-field images. It does not require star identification and can thus also be used to deliver angular rate information when attitude determination is not possible, as during platform de-tumbling or slewing. Region-based optical flow calculation is carried out on successive star images preprocessed to remove background. Sensor calibration parameters, the Poisson equation, and a least-squares method are then used to estimate the angular velocity vector components in the sensor rotating frame. A theoretical error budget is developed to estimate the expected angular rate accuracy as a function of camera parameters and star distribution in the field of view. The effectiveness of the proposed technique is tested by using star field scenes generated by a hardware-in-the-loop testing facility and acquired by a commercial-off-the-shelf camera sensor. Simulated cases comprise rotations at different rates. Experimental results are presented which are consistent with theoretical estimates. In particular, very accurate angular velocity estimates are generated at lower slew rates, while in all cases the achievable accuracy in the estimation of the angular velocity component along boresight is about one order of magnitude worse than for the other two components.
Satellite Angular Velocity Estimation Based on Star Images and Optical Flow Techniques
Fasano, Giancarmine; Rufino, Giancarlo; Accardo, Domenico; Grassi, Michele
2013-01-01
An optical flow-based technique is proposed to estimate spacecraft angular velocity based on sequences of star-field images. It does not require star identification and can thus also be used to deliver angular rate information when attitude determination is not possible, as during platform de-tumbling or slewing. Region-based optical flow calculation is carried out on successive star images preprocessed to remove background. Sensor calibration parameters, the Poisson equation, and a least-squares method are then used to estimate the angular velocity vector components in the sensor rotating frame. A theoretical error budget is developed to estimate the expected angular rate accuracy as a function of camera parameters and star distribution in the field of view. The effectiveness of the proposed technique is tested by using star field scenes generated by a hardware-in-the-loop testing facility and acquired by a commercial-off-the-shelf camera sensor. Simulated cases comprise rotations at different rates. Experimental results are presented which are consistent with theoretical estimates. In particular, very accurate angular velocity estimates are generated at lower slew rates, while in all cases the achievable accuracy in the estimation of the angular velocity component along boresight is about one order of magnitude worse than for the other two components. PMID:24072023
AMT-200S Motor Glider Parameter and Performance Estimation
NASA Technical Reports Server (NTRS)
Taylor, Brian R.
2011-01-01
Parameter and performance estimation of an instrumented motor glider was conducted at the National Aeronautics and Space Administration Dryden Flight Research Center in order to provide the necessary information to create a simulation of the aircraft. An output-error technique was employed to generate estimates from doublet maneuvers, and performance estimates were compared with results from a well-known flight-test evaluation of the aircraft in order to provide a complete set of data. Aircraft specifications are given along with information concerning instrumentation, flight-test maneuvers flown, and the output-error technique. Discussion of Cramer-Rao bounds based on both white noise and colored noise assumptions is given. Results include aerodynamic parameter and performance estimates for a range of angles of attack.
Computerized technique for recording board defect data
R. Bruce Anderson; R. Edward Thomas; Charles J. Gatchell; Neal D. Bennett
1993-01-01
A computerized technique for recording board defect data has been developed that is faster and more accurate than manual techniques. The lumber database generated by this technique is a necessary input to computer simulation models that estimate potential cutting yields from various lumber breakdown sequences. The technique allows collection of detailed information...
Czarnecki, John B.; Stannard, David I.
1997-01-01
Franklin Lake playa is one of the principal discharge areas of the ground-water-flow system associated with Yucca Mountain, Nevada, the potential site of a high-level nuclear-waste repository. Evapotranspiration estimates made between June 1983 and April 1984 using the energy-budget eddy-correlation technique ranged from 0.1 centimeter per day during winter months to about 0.3 centimeter per day during summer months; the annual average was 0.16 centimeter per day. These estimates were compared with evapotranspiration estimates calculated from six other methods.
Estimating pixel variances in the scenes of staring sensors
Simonson, Katherine M. [Cedar Crest, NM]; Ma, Tian J. [Albuquerque, NM]
2012-01-24
A technique for detecting changes in a scene perceived by a staring sensor is disclosed. The technique includes acquiring a reference image frame and a current image frame of a scene with the staring sensor. A raw difference frame is generated based upon differences between the reference image frame and the current image frame. Pixel error estimates are generated for each pixel in the raw difference frame based at least in part upon spatial error estimates related to spatial intensity gradients in the scene. The pixel error estimates are used to mitigate effects of camera jitter in the scene between the current image frame and the reference image frame.
Predicting the long tail of book sales: Unearthing the power-law exponent
NASA Astrophysics Data System (ADS)
Fenner, Trevor; Levene, Mark; Loizou, George
2010-06-01
The concept of the long tail has recently been used to explain the phenomenon in e-commerce where the total volume of sales of the items in the tail is comparable to that of the most popular items. In the case of online book sales, the proportion of tail sales has been estimated using regression techniques on the assumption that the data obeys a power-law distribution. Here we propose a different technique for estimation based on a generative model of book sales that results in an asymptotic power-law distribution of sales, but which does not suffer from the problems related to power-law regression techniques. We show that the proportion of tail sales predicted is very sensitive to the estimated power-law exponent. In particular, if we assume that the power-law exponent of the cumulative distribution is closer to 1.1 rather than to 1.2 (estimates published in 2003, calculated using regression by two groups of researchers), then our computations suggest that the tail sales of Amazon.com, rather than being 40% as estimated by Brynjolfsson, Hu and Smith in 2003, are actually closer to 20%, the proportion estimated by its CEO.
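A toy calculation, assuming a pure Zipf-type sales-versus-rank law and illustrative rank cutoffs, shows how strongly the tail share depends on the assumed exponent; it glosses over the exact correspondence with the cumulative-distribution exponent discussed in the paper.

```python
# Toy calculation of the tail-sales share under a Zipf-type sales-versus-rank
# model (cutoffs and exponents are illustrative, not values from the paper).
import numpy as np

def tail_share(exponent, head_titles=100_000, total_titles=2_300_000):
    ranks = np.arange(1, total_titles + 1, dtype=float)
    sales = ranks ** (-exponent)
    return sales[head_titles:].sum() / sales.sum()

for b in (0.8, 0.9, 1.0):
    print(f"rank exponent {b:.1f}: tail share = {100 * tail_share(b):.1f}%")
```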
Winter bird population studies and project prairie birds for surveying grassland birds
Twedt, D.J.; Hamel, P.B.; Woodrey, M.S.
2008-01-01
We compared 2 survey methods for assessing winter bird communities in temperate grasslands: Winter Bird Population Study surveys are area searches that have long been used in a variety of habitats, whereas Project Prairie Birds surveys employ active-flushing techniques on strip transects and are intended for use in grasslands. We used both methods to survey birds on 14 herbaceous reforested sites and 9 coastal pine savannas during winter and compared the resultant estimates of species richness and relative abundance. These techniques did not yield similar estimates of avian populations. We found Winter Bird Population Studies consistently produced higher estimates of species richness, whereas Project Prairie Birds produced higher estimates of avian abundance for some species. When it is important to identify all species within the winter bird community, Winter Bird Population Studies should be the survey method of choice. If estimates of the abundance of relatively secretive grassland bird species are desired, the use of Project Prairie Birds protocols is warranted. However, we suggest that both survey techniques, as currently employed, are deficient and recommend that distance-based survey methods that provide species-specific estimates of detection probabilities be incorporated into these survey methods.
De Tobel, J; Phlypo, I; Fieuws, S; Politis, C; Verstraete, K L; Thevissen, P W
2017-12-01
The development of third molars can be evaluated with medical imaging to estimate age in subadults. The appearance of third molars on magnetic resonance imaging (MRI) differs greatly from that on radiographs; therefore a specific staging technique is necessary to classify third molar development on MRI and apply it for age estimation. The aim of this study was to develop a specific staging technique to register third molar development on MRI and to evaluate its performance for age estimation in subadults. Using 3T MRI in three planes, all third molars were evaluated in 309 healthy Caucasian participants from 14 to 26 years old. According to the appearance of the developing third molars on MRI, descriptive criteria and schematic representations were established to define a specific staging technique. Two observers, with different levels of experience, staged all third molars independently with the developed technique. Intra- and inter-observer agreement were calculated. The data were imported into a Bayesian model for age estimation as described by Fieuws et al. (2016). This approach adequately handles correlation between age indicators and missing age indicators. It was used to calculate a point estimate and a prediction interval of the estimated age. Observed age minus predicted age was calculated, reflecting the error of the estimate. One hundred and sixty-six third molars were agenetic. Five percent (51/1096) of upper third molars and 7% (70/1044) of lower third molars were not assessable. Kappa for inter-observer agreement ranged from 0.76 to 0.80. For intra-observer agreement, kappa ranged from 0.80 to 0.89. However, two-stage differences between observers or between staging sessions occurred in up to 2.2% (20/899) of assessments, probably due to a learning effect. Using the Bayesian model for age estimation, a mean absolute error of 2.0 years in females and 1.7 years in males was obtained. Root mean squared error equalled 2.38 years and 2.06 years, respectively. The performance in discerning minors from adults was better for males than for females, with specificities of 96% and 73%, respectively. Age estimations based on the proposed staging method for third molars on MRI showed reproducibility and performance comparable to the established methods based on radiographs.
Nagwani, Naresh Kumar; Deo, Shirish V
2014-01-01
Understanding the compressive strength of concrete is important for activities such as construction arrangement, prestressing operations, proportioning of new mixtures, and quality assurance. Regression techniques are the most widely used methods for prediction tasks in which the relationship between the independent variables and the dependent (prediction) variable is identified. The accuracy of regression for prediction can be improved if clustering is used along with regression, since clustering ensures a more accurate curve fit between the dependent and independent variables. In this work a cluster-regression technique is applied to estimate the compressive strength of concrete, and a novel approach for predicting concrete compressive strength is proposed. The objective of this work is to demonstrate that clustering along with regression produces smaller prediction errors when estimating concrete compressive strength. The proposed technique consists of two major stages: in the first stage, clustering is used to group concrete data with similar characteristics, and in the second stage regression techniques are applied over these clusters (groups) to predict the compressive strength within each cluster. Experiments show that clustering along with regression gives minimum errors for predicting the compressive strength of concrete, and that the fuzzy C-means clustering algorithm performs better than the K-means algorithm.
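A minimal sketch of the two-stage idea described above, assuming a feature matrix X of mix-proportion variables and a vector y of measured compressive strengths (names hypothetical); scikit-learn's K-means is used here for simplicity, whereas the study found fuzzy C-means to perform even better.

```python
# Sketch: stage 1 clusters similar concrete mixes, stage 2 fits one
# regression model per cluster and predicts strength from the matching model.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

def fit_cluster_regression(X, y, n_clusters=3):
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X)
    models = {c: LinearRegression().fit(X[km.labels_ == c], y[km.labels_ == c])
              for c in range(n_clusters)}
    return km, models

def predict_strength(km, models, X_new):
    labels = km.predict(X_new)
    return np.array([models[c].predict(row.reshape(1, -1))[0]
                     for c, row in zip(labels, np.asarray(X_new))])
```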
Poisson and negative binomial item count techniques for surveys with sensitive question.
Tian, Guo-Liang; Tang, Man-Lai; Wu, Qin; Liu, Yin
2017-04-01
Although the item count technique is useful in surveys with sensitive questions, the privacy of respondents who possess the sensitive characteristic of interest may not be well protected due to a defect in its original design. In this article, we propose two new survey designs (namely, the Poisson item count technique and the negative binomial item count technique) which replace the several independent Bernoulli random variables required by the original item count technique with a single Poisson or negative binomial random variable, respectively. The proposed models not only provide a closed-form variance estimate and a confidence interval within [0, 1] for the sensitive proportion, but also simplify the survey design of the original item count technique. Most importantly, the new designs do not leak respondents' privacy. Empirical results show that the proposed techniques perform satisfactorily in the sense that they yield accurate parameter estimates and confidence intervals.
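A moment-based sketch of how a Poisson item count survey can be read, assuming a control group that answers only the innocuous Poisson count and a treatment group whose answer is incremented by 1 when the sensitive trait is present; the authors' closed-form likelihood-based estimator is not reproduced here.

```python
# Difference-in-means reading of a Poisson item count survey (sketch only).
import numpy as np

def estimate_sensitive_proportion(treatment_counts, control_counts):
    t = np.asarray(treatment_counts, float)
    c = np.asarray(control_counts, float)
    pi_hat = t.mean() - c.mean()
    se = np.sqrt(t.var(ddof=1) / t.size + c.var(ddof=1) / c.size)
    ci = np.clip([pi_hat - 1.96 * se, pi_hat + 1.96 * se], 0.0, 1.0)
    return pi_hat, tuple(ci)
```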
Krishan, Kewal; Chatterjee, Preetika M; Kanchan, Tanuj; Kaur, Sandeep; Baryah, Neha; Singh, R K
2016-04-01
Sex estimation is considered one of the essential parameters in forensic anthropology casework and requires foremost consideration in the examination of skeletal remains. Forensic anthropologists frequently employ morphologic and metric methods for sex estimation of human remains. These methods remain important in the identification process despite the advent and success of molecular techniques. A steady increase in the use of imaging techniques in forensic anthropology research has helped to derive, as well as revise, the available population data. These methods, however, are less reliable owing to high variance and indistinct landmark details. The present review discusses the reliability and reproducibility of various analytical approaches (morphological, metric, molecular, and radiographic) for sex estimation of skeletal remains. Numerous studies have shown a higher reliability and reproducibility of measurements taken directly on the bones; hence, such direct methods of sex estimation are considered more reliable than the others. The geometric morphometric (GM) method and the Diagnose Sexuelle Probabiliste (DSP) method are emerging as valid and widely used techniques in forensic anthropology in terms of accuracy and reliability. In addition, newer 3D methods have been shown to exhibit specific sexual dimorphism patterns not readily revealed by traditional methods. Development of newer and better methodologies for sex estimation, as well as re-evaluation of the existing ones, will continue as forensic researchers strive for more accurate results. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong
2016-01-12
The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis.
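A common geometric relation used when reading fiber inclination from an elliptical cross-section, included only to make the quantity concrete; the paper's image analysis technique adds steps beyond this simple formula.

```python
# For a cylindrical fiber cut by a plane, the footprint is an ellipse whose
# minor axis equals the fiber diameter; the inclination to the section
# normal follows from cos(theta) = minor_axis / major_axis.
import numpy as np

def fiber_inclination_deg(major_axis, minor_axis):
    ratio = np.clip(np.asarray(minor_axis, float) / np.asarray(major_axis, float), 0.0, 1.0)
    return np.degrees(np.arccos(ratio))

print(fiber_inclination_deg(0.32, 0.30))  # roughly 20 degrees
```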
NASA Astrophysics Data System (ADS)
Zvietcovich, Fernando; Yao, Jianing; Chu, Ying-Ju; Meemon, Panomsak; Rolland, Jannick P.; Parker, Kevin J.
2016-03-01
Optical Coherence Elastography (OCE) is a widely investigated noninvasive technique for estimating the mechanical properties of tissue. In particular, vibrational OCE methods aim to estimate the shear wave velocity generated by an external stimulus in order to calculate the elastic modulus of tissue. In this study, we compare the performance of five acquisition and processing techniques for estimating the shear wave speed in simulations and experiments using tissue-mimicking phantoms. Accuracy, contrast-to-noise ratio, and resolution are measured for all cases. The first two techniques make use of one piezoelectric actuator for generating a continuous shear wave propagation (SWP) and a tone-burst propagation (TBP) of 400 Hz over the gelatin phantom. The other techniques make use of one additional actuator located on the opposite side of the region of interest in order to create an interference pattern. When both actuators have the same frequency, a standing wave (SW) pattern is generated. Otherwise, when there is a frequency difference df between both actuators, a crawling wave (CrW) pattern is generated and propagates with less speed than a shear wave, which makes it suitable for detection by 2D cross-sectional OCE imaging. If df is not small compared to the operational frequency, the CrW travels faster and a sampled version of it (SCrW) is acquired by the system. Preliminary results suggest that the TBP (error < 4.1%) and SWP (error < 6%) techniques are more accurate when compared against mechanical measurement test results.
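A generic phase-gradient estimate of wave speed from a line of vibration phases, given only to make the quantity being compared concrete; it is not one of the five acquisition schemes evaluated in the study, and the variable names are assumptions.

```python
# Wave speed from the slope of unwrapped phase versus lateral position:
# c = 2*pi*f / |d(phase)/dx|.
import numpy as np

def wave_speed_m_per_s(phase_rad, x_m, freq_hz):
    slope = np.polyfit(np.asarray(x_m, float), np.unwrap(phase_rad), 1)[0]  # rad/m
    return 2.0 * np.pi * freq_hz / abs(slope)
```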
Efficient, adaptive estimation of two-dimensional firing rate surfaces via Gaussian process methods.
Rad, Kamiar Rahnama; Paninski, Liam
2010-01-01
Estimating two-dimensional firing rate maps is a common problem, arising in a number of contexts: the estimation of place fields in hippocampus, the analysis of temporally nonstationary tuning curves in sensory and motor areas, the estimation of firing rates following spike-triggered covariance analyses, etc. Here we introduce methods based on Gaussian process nonparametric Bayesian techniques for estimating these two-dimensional rate maps. These techniques offer a number of advantages: the estimates may be computed efficiently, come equipped with natural errorbars, adapt their smoothness automatically to the local density and informativeness of the observed data, and permit direct fitting of the model hyperparameters (e.g., the prior smoothness of the rate map) via maximum marginal likelihood. We illustrate the method's flexibility and performance on a variety of simulated and real data.
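A simplified stand-in for this approach, assuming binned positions and spike counts (names hypothetical): scikit-learn's Gaussian-likelihood GP regressor smooths crude empirical rates and fits its hyperparameters by maximizing the marginal likelihood, whereas the paper couples the GP prior to a point-process spiking likelihood.

```python
# Gaussian-noise GP smoother for a 2-D firing-rate map (simplified sketch).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def fit_rate_map(positions_xy, spike_counts, bin_duration_s):
    rates = np.asarray(spike_counts, float) / bin_duration_s   # crude empirical rates
    kernel = 1.0 * RBF(length_scale=5.0) + WhiteKernel(noise_level=1.0)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(np.asarray(positions_xy, float), rates)             # hyperparameters by marginal likelihood
    return gp   # gp.predict(grid_xy, return_std=True) gives the map with error bars
```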
Montuno, Michael A; Kohner, Andrew B; Foote, Kelly D; Okun, Michael S
2013-01-01
Deep brain stimulation (DBS) is an effective technique that has been utilized to treat advanced and medication-refractory movement and psychiatric disorders. In order to avoid implanted pulse generator (IPG) failure and consequent adverse symptoms, a better understanding of IPG battery longevity and management is necessary. Existing methods for battery estimation lack the specificity required for clinical incorporation. Technical challenges prevent higher accuracy longevity estimations, and a better approach to managing end of DBS battery life is needed. The literature was reviewed and DBS battery estimators were constructed by the authors and made available on the web at http://mdc.mbi.ufl.edu/surgery/dbs-battery-estimator. A clinical algorithm for management of DBS battery life was constructed. The algorithm takes into account battery estimations and clinical symptoms. Existing methods of DBS battery life estimation utilize an interpolation of averaged current drains to calculate how long a battery will last. Unfortunately, this technique can only provide general approximations. There are inherent errors in this technique, and these errors compound with each iteration of the battery estimation. Some of these errors cannot be accounted for in the estimation process, and some of the errors stem from device variation, battery voltage dependence, battery usage, battery chemistry, impedance fluctuations, interpolation error, usage patterns, and self-discharge. We present web-based battery estimators along with an algorithm for clinical management. We discuss the perils of using a battery estimator without taking into account the clinical picture. Future work will be needed to provide more reliable management of implanted device batteries; however, implementation of a clinical algorithm that accounts for both estimated battery life and for patient symptoms should improve the care of DBS patients. © 2012 International Neuromodulation Society.
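To make the critique concrete, this is the kind of back-of-envelope interpolation the existing estimators perform: remaining capacity divided by an averaged current drain. The numbers and names are hypothetical, and none of the error sources listed above (device variation, voltage dependence, impedance fluctuations, self-discharge) are modeled.

```python
# Naive longevity estimate: remaining capacity / averaged current drain.
def battery_years_remaining(capacity_mAh_remaining, avg_current_drain_uA):
    hours = capacity_mAh_remaining / (avg_current_drain_uA / 1000.0)  # mAh / mA
    return hours / (24.0 * 365.25)

print(battery_years_remaining(1800.0, 75.0))  # about 2.7 years
```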
Survival of European mouflon (Artiodactyla: Bovidae) in Hawai'i based on tooth cementum lines
Hess, S.C.; Stephens, R.M.; Thompson, T.L.; Danner, R.M.; Kawakami, B.
2011-01-01
Reliable techniques for estimating age of ungulates are necessary to determine population parameters such as age structure and survival. Techniques that rely on dentition, horn, and facial patterns have limited utility for European mouflon sheep (Ovis gmelini musimon), but tooth cementum lines may offer a useful alternative. Cementum lines may not be reliable outside temperate regions, however, because lack of seasonality in diet may affect annulus formation. We evaluated the utility of tooth cementum lines for estimating age of mouflon in Hawai'i in comparison to dentition. Cementum lines were present in mouflon from Mauna Loa, island of Hawai'i, but were less distinct than in North American sheep. The two age-estimation methods provided similar estimates for individuals aged ≤3 yr by dentition (the maximum age estimable by dentition), with exact matches in 51% (18/35) of individuals, and an average difference of 0.8 yr (range 0-4). Estimates of age from cementum lines were higher than those from dentition in 40% (14/35) and lower in 9% (3/35) of individuals. Discrepancies in age estimates between techniques and between paired tooth samples estimated by cementum lines were related to certainty categories assigned by the clarity of cementum lines, reinforcing the importance of collecting a sufficient number of samples to compensate for samples of lower quality, which, in our experience, comprised approximately 22% of teeth. Cementum lines appear to provide relatively accurate age estimates for mouflon in Hawai'i, allow estimating age beyond 3 yr, and offer more precise estimates than tooth eruption patterns. After constructing an age distribution, we estimated annual survival with a log-linear model to be 0.596 (95% CI 0.554-0.642) for this heavily controlled population. © 2011 by University of Hawai'i Press.
Results and Error Estimates from GRACE Forward Modeling over Greenland, Canada, and Alaska
NASA Astrophysics Data System (ADS)
Bonin, J. A.; Chambers, D. P.
2012-12-01
Forward modeling using a weighted least squares technique allows GRACE information to be projected onto a pre-determined collection of local basins. This decreases the impact of spatial leakage, allowing estimates of mass change to be better localized. The technique is especially valuable where models of current-day mass change are poor, such as over Greenland and Antarctica. However, the accuracy of the forward model technique has not been determined, nor is it known how the distribution of the local basins affects the results. We use a "truth" model composed of hydrology and ice-melt slopes as an example case, to estimate the uncertainties of this forward modeling method and expose those design parameters which may result in an incorrect high-resolution mass distribution. We then apply these optimal parameters in a forward model estimate created from RL05 GRACE data. We compare the resulting mass slopes with the expected systematic errors from the simulation, as well as GIA and basic trend-fitting uncertainties. We also consider whether specific regions (such as Ellesmere Island and Baffin Island) can be estimated reliably using our optimal basin layout.
NASA Astrophysics Data System (ADS)
Qarib, Hossein; Adeli, Hojjat
2015-12-01
In this paper the authors introduce a new adaptive signal processing technique for feature extraction and parameter estimation in noisy exponentially damped signals. The iterative three-stage method is based on the adroit integration of the strengths of parametric and nonparametric methods such as the multiple signal classification, matrix pencil, and empirical mode decomposition algorithms. The first stage is a new adaptive filtration or noise removal scheme. The second stage is a hybrid parametric-nonparametric signal parameter estimation technique based on an output-only system identification technique. The third stage is optimization of the estimated parameters using a combination of a primal-dual path-following interior point algorithm and a genetic algorithm. The methodology is evaluated using a synthetic signal and a signal obtained experimentally from transverse vibrations of a steel cantilever beam. The method is successful in estimating the frequencies accurately and also estimates the damping exponents. The proposed adaptive filtration method does not include any frequency domain manipulation; consequently, the time domain signal is not affected by frequency domain and inverse transformations.
Robb, Matthew L; Böhning, Dankmar
2011-02-01
Capture-recapture techniques have been used for a considerable time to predict population size. Estimators usually rely on frequency counts for numbers of trappings; however, these may not be available for a particular problem, for example if the original data set has been lost and only a summary table is available. Here, we investigate techniques for specific examples; the motivating example is an epidemiology study by Mosley et al., which focussed on a cholera outbreak in East Pakistan. To demonstrate the wider range of the technique, we also look at a study predicting the long-term outlook of the AIDS epidemic using information on the number of sexual partners. A new estimator is developed here which uses the EM algorithm to impute unobserved values and then uses these values in a similar way to the existing estimators. The results show that a truncated approach, mimicking the Chao lower bound approach, gives an improved estimate when population homogeneity is violated.
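For reference, the classical Chao lower bound that the truncated approach mimics, computed from the summary frequency counts f1 (units captured exactly once) and f2 (units captured exactly twice); the paper's EM-based imputation estimator itself is not reproduced here.

```python
# Chao lower-bound population size estimate from capture frequency counts.
def chao_lower_bound(n_observed, f1, f2):
    if f2 == 0:
        return n_observed + f1 * (f1 - 1) / 2.0   # bias-corrected form when f2 = 0
    return n_observed + f1 ** 2 / (2.0 * f2)

print(chao_lower_bound(n_observed=310, f1=120, f2=45))  # 470.0
```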
Parametric Model Based On Imputations Techniques for Partly Interval Censored Data
NASA Astrophysics Data System (ADS)
Zyoud, Abdallah; Elfaki, F. A. M.; Hrairi, Meftah
2017-12-01
The term 'survival analysis' has been used in a broad sense to describe a collection of statistical procedures for data analysis in which the outcome variable of interest is the time until an event occurs; the time to failure of a specific experimental unit may be censored, and the censoring can be right, left, interval, or partly interval censored (PIC). In this paper, the analysis was conducted with a parametric Cox model for PIC data. Moreover, several imputation techniques were used: midpoint, left and right point, random, mean, and median. Maximum likelihood estimation was used to obtain the estimated survival function. These estimates were then compared with existing models, such as the Turnbull and Cox models, based on clinical trial data (breast cancer data), which showed the validity of the proposed model. Results for the data set indicated that the parametric Cox model was superior in terms of estimation of survival functions, likelihood ratio tests, and their P-values. Moreover, among the imputation techniques, the midpoint, random, mean, and median showed better results with respect to the estimation of the survival function.
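A sketch of two of the imputation rules named above for partly interval-censored times (array layout assumed): an exactly observed time has L equal to R and passes through unchanged, while an interval (L, R) is replaced by its midpoint or by a uniform random draw within it.

```python
# Midpoint and random imputation for partly interval-censored observations.
import numpy as np

def impute_midpoint(left, right):
    left, right = np.asarray(left, float), np.asarray(right, float)
    return np.where(left == right, left, (left + right) / 2.0)

def impute_random(left, right, seed=0):
    left, right = np.asarray(left, float), np.asarray(right, float)
    draws = np.random.default_rng(seed).uniform(left, right)
    return np.where(left == right, left, draws)
```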
Sonko, Bakary J; Miller, Leland V; Jones, Richard H; Donnelly, Joseph E; Jacobsen, Dennis J; Hill, James O; Fennessey, Paul V
2003-12-15
Reducing water to hydrogen gas by zinc or uranium metal for determining D/H ratio is both tedious and time consuming. This has forced most energy metabolism investigators to use the "two-point" technique instead of the "Multi-point" technique for estimating total energy expenditure (TEE). Recently, we purchased a new platinum (Pt)-equilibration system that significantly reduces both time and labor required for D/H ratio determination. In this study, we compared TEE obtained from nine overweight but healthy subjects, estimated using the traditional Zn-reduction method to that obtained from the new Pt-equilibration system. Rate constants, pool spaces, and CO2 production rates obtained from use of the two methodologies were not significantly different. Correlation analysis demonstrated that TEEs estimated using the two methods were significantly correlated (r=0.925, p=0.0001). Sample equilibration time was reduced by 66% compared to those of similar methods. The data demonstrated that the Zn-reduction method could be replaced by the Pt-equilibration method when TEE was estimated using the "Multi-Point" technique. Furthermore, D equilibration time was significantly reduced.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lambie, F.W.; Yee, S.N.
The purpose of this and a previous project was to examine the feasibility of estimating intermediate grade uranium (0.01 to 0.05% U3O8) on the basis of existing, sparsely drilled holes. All data are from the Powder River Basin in Wyoming. DOE makes preliminary estimates of endowment by calculating an Average Area of Influence (AAI) based on densely drilled areas, multiplying that by the thickness of the mineralization, and then dividing by a tonnage factor. The resulting tonnage of ore is then multiplied by the average grade of the interval to obtain the estimate of U3O8 tonnage. Total endowment is the sum of these values over all mineralized intervals in all wells in the area. In regions where wells are densely drilled and approximately regularly spaced, this technique approaches the classical polygonal estimation technique used to estimate ore reserves and should be fairly reliable. The method is conservative because: (1) in sparsely drilled regions a large fraction of the area is not considered to contribute to endowment; (2) there is a bias created by the different distributions of point grades and mining block grades. A conservative approach may be justified for purposes of ore reserve estimation, where large investments may hinge on local forecasts. But for estimates of endowment over areas as large as 1° by 2° quadrangles, or the nation as a whole, errors in local predictions are not critical as long as they tend to cancel, and a less conservative estimation approach may be justified. One candidate, developed for this study and described here, is called the contoured thickness technique. A comparison of estimates based on the contoured thickness approach with DOE calculations for five areas of Wyoming roll-fronts in the Powder River Basin is presented. The sensitivity of the technique to well density is examined and the question of predicting intermediate grade endowment from data on higher grades is discussed.
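The arithmetic behind the DOE-style preliminary estimate described above, with purely hypothetical inputs: ore tons come from the area of influence times the mineralized thickness divided by a tonnage factor, and the U3O8 tons from ore tons times average grade.

```python
# Endowment contribution of one mineralized interval (hypothetical numbers).
def interval_endowment_tons(aai_ft2, thickness_ft, tonnage_factor_ft3_per_ton, grade_fraction):
    ore_tons = aai_ft2 * thickness_ft / tonnage_factor_ft3_per_ton
    return ore_tons * grade_fraction

# e.g. 40,000 ft^2 area of influence, 6 ft thick, 15 ft^3/ton, 0.03% U3O8 grade
print(interval_endowment_tons(40000.0, 6.0, 15.0, 0.0003))  # 4.8 tons U3O8
```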
Comparative assessment of bone pose estimation using Point Cluster Technique and OpenSim.
Lathrop, Rebecca L; Chaudhari, Ajit M W; Siston, Robert A
2011-11-01
Estimating the position of the bones from optical motion capture data is a challenge associated with human movement analysis. Bone pose estimation techniques such as the Point Cluster Technique (PCT) and simulations of movement through software packages such as OpenSim are used to minimize soft tissue artifact and estimate skeletal position; however, using different methods for analysis may produce differing kinematic results which could lead to differences in clinical interpretation such as a misclassification of normal or pathological gait. This study evaluated the differences present in knee joint kinematics as a result of calculating joint angles using various techniques. We calculated knee joint kinematics from experimental gait data using the standard PCT, the least squares approach in OpenSim applied to experimental marker data, and the least squares approach in OpenSim applied to the results of the PCT algorithm. Maximum and resultant RMS differences in knee angles were calculated between all techniques. We observed differences in flexion/extension, varus/valgus, and internal/external rotation angles between all approaches. The largest differences were between the PCT results and all results calculated using OpenSim. The RMS differences averaged nearly 5° for flexion/extension angles with maximum differences exceeding 15°. Average RMS differences were relatively small (< 1.08°) between results calculated within OpenSim, suggesting that the choice of marker weighting is not critical to the results of the least squares inverse kinematics calculations. The largest difference between techniques appeared to be a constant offset between the PCT and all OpenSim results, which may be due to differences in the definition of anatomical reference frames, scaling of musculoskeletal models, and/or placement of virtual markers within OpenSim. Different methods for data analysis can produce largely different kinematic results, which could lead to the misclassification of normal or pathological gait. Improved techniques to allow non-uniform scaling of generic models to more accurately reflect subject-specific bone geometries and anatomical reference frames may reduce differences between bone pose estimation techniques and allow for comparison across gait analysis platforms.
Rath, Hemamalini; Rath, Rachna; Mahapatra, Sandeep; Debta, Tribikram
2017-01-01
The age of an individual can be assessed by a plethora of widely available tooth-based techniques, among which radiological methods prevail. Demirjian's technique of age assessment based on tooth development stages has been extensively investigated in different populations of the world. The present study assesses the applicability of Demirjian's modified 8-teeth technique for age estimation in a population of East India (Odisha), utilizing Acharya's Indian-specific cubic functions. One hundred and six pretreatment orthodontic radiographs of patients aged 7-23 years, with representation from both genders, were assessed for the eight left mandibular teeth and scored as per Demirjian's 9-stage criteria for tooth development. Age was calculated on the basis of Acharya's Indian formula. Statistical analysis was performed to compare the estimated and actual age. All data were analyzed using SPSS 20.0 (SPSS Inc., Chicago, Illinois, USA) and the MS Excel package. The results revealed that the mean absolute error (MAE) in age estimation for the entire sample was 1.3 years, with 50% of the cases having an error within ±1 year. The MAE in males and females (7-16 years) was 1.8 and 1.5 years, respectively. Likewise, the MAE in males and females (16.1-23 years) was 1.1 and 1.3 years, respectively. The low error rate in estimating age justifies the application of this modified technique and Acharya's Indian formulas in the present East Indian population.
Simultaneous multiple non-crossing quantile regression estimation using kernel constraints
Liu, Yufeng; Wu, Yichao
2011-01-01
Quantile regression (QR) is a very useful statistical tool for learning the relationship between the response variable and covariates. For many applications, one often needs to estimate multiple conditional quantile functions of the response variable given covariates. Although one can estimate multiple quantiles separately, it is of great interest to estimate them simultaneously. One advantage of simultaneous estimation is that multiple quantiles can share strength among them to gain better estimation accuracy than individually estimated quantile functions. Another important advantage of joint estimation is the feasibility of incorporating simultaneous non-crossing constraints of QR functions. In this paper, we propose a new kernel-based multiple QR estimation technique, namely simultaneous non-crossing quantile regression (SNQR). We use kernel representations for QR functions and apply constraints on the kernel coefficients to avoid crossing. Both unregularised and regularised SNQR techniques are considered. Asymptotic properties such as asymptotic normality of linear SNQR and oracle properties of the sparse linear SNQR are developed. Our numerical results demonstrate the competitive performance of our SNQR over the original individual QR estimation. PMID:22190842
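For contrast with the joint approach, a sketch of the individual-estimation baseline: separate linear quantile regressions (statsmodels) followed by a crude sort across quantiles to remove any crossings after the fact; SNQR instead enforces non-crossing during estimation through constraints on the kernel coefficients, which this sketch does not reproduce.

```python
# Individually fitted quantile regressions with a post-hoc monotone sort.
import numpy as np
import statsmodels.api as sm

def fit_individual_quantiles(X, y, taus=(0.1, 0.25, 0.5, 0.75, 0.9)):
    Xc = sm.add_constant(np.asarray(X, float))
    fitted = np.column_stack([sm.QuantReg(y, Xc).fit(q=t).predict(Xc) for t in taus])
    return np.sort(fitted, axis=1)   # crude fix for crossing, unlike SNQR
```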
1982-02-01
For these data elements, Initial Milestone 11 values were established as the Planning Estimate (PE) with the Development Estimate (DE) to be based ...development of improved forensic collection techniques for Naval Investigative Agents on ships and overseas bases. As this is a continuing program, the above...overseas bases), and continue development of improved forensic collection techniques for Naval Investigative Agents on ships and overseas bases. 4. (U) FY
Surface albedo from bidirectional reflectance
NASA Technical Reports Server (NTRS)
Ranson, K. J.; Irons, J. R.; Daughtry, C. S. T.
1991-01-01
The validity of integrating over discrete wavelength bands is examined to estimate total shortwave bidirectional reflectance of vegetated and bare soil surfaces. Methods for estimating albedo from multiple angle, discrete wavelength band radiometer measurements are studied. These methods include a numerical integration technique and the integration of an empirically derived equation for bidirectional reflectance. It is concluded that shortwave albedos estimated through both techniques agree favorably with the independent pyranometer measurements. Absolute rms errors are found to be 0.5 percent or less for both grass sod and bare soil surfaces.
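The numerical integration amounts to a weighted sum of band reflectances, with each weight equal to that band's share of incoming shortwave irradiance; the weights below are hypothetical placeholders, not the study's values.

```python
# Shortwave albedo as an irradiance-weighted sum of band reflectances.
import numpy as np

def shortwave_albedo(band_reflectances, band_irradiance_weights):
    w = np.asarray(band_irradiance_weights, float)
    r = np.asarray(band_reflectances, float)
    return float(np.dot(w / w.sum(), r))

print(shortwave_albedo([0.06, 0.32, 0.28, 0.20], [0.45, 0.30, 0.15, 0.10]))  # ~0.19
```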
Quality assessment and control of finite element solutions
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.; Babuska, Ivo
1987-01-01
Status and some recent developments in the techniques for assessing the reliability of finite element solutions are summarized. Discussion focuses on a number of aspects including: the major types of errors in the finite element solutions; techniques used for a posteriori error estimation and the reliability of these estimators; the feedback and adaptive strategies for improving the finite element solutions; and postprocessing approaches used for improving the accuracy of stresses and other important engineering data. Also, future directions for research needed to make error estimation and adaptive movement practical are identified.
Theoretical and simulated performance for a novel frequency estimation technique
NASA Technical Reports Server (NTRS)
Crozier, Stewart N.
1993-01-01
A low complexity, open-loop, discrete-time, delay-multiply-average (DMA) technique for estimating the frequency offset for digitally modulated MPSK signals is investigated. A nonlinearity is used to remove the MPSK modulation and generate the carrier component to be extracted. Theoretical and simulated performance results are presented and compared to the Cramer-Rao lower bound (CRLB) for the variance of the frequency estimation error. For all signal-to-noise ratios (SNR's) above threshold, it is shown that the CRLB can essentially be achieved with linear complexity.
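A bare-bones sketch of a delay-multiply-average estimator for complex baseband MPSK samples: the M-th power strips the modulation, lag-D products are averaged, and the mean phase gives the frequency offset. The paper's exact processing (filtering, thresholds, performance near the CRLB) is not reproduced, and the variable names are assumptions.

```python
# Open-loop DMA frequency-offset estimate for MPSK at complex baseband.
import numpy as np

def dma_frequency_estimate(samples, M, fs, D=1):
    z = np.asarray(samples) ** M                 # remove M-ary PSK modulation
    prods = z[D:] * np.conj(z[:-D])              # delay-multiply at lag D
    return np.angle(np.mean(prods)) * fs / (2.0 * np.pi * M * D)
```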
NASA Technical Reports Server (NTRS)
Kotoda, K.; Nakagawa, S.; Kai, K.; Yoshino, M. M.; Takeda, K.; Seki, K.
1985-01-01
In a humid region like Japan, it seems that the radiation term in the energy balance equation plays a more important role for evapotranspiration than does the vapor pressure difference between the surface and the lower atmospheric boundary layer. A Priestley-Taylor type equation (equilibrium evaporation model) is used to estimate evapotranspiration. Net radiation, soil heat flux, and surface temperature data are obtained. Only the temperature data are obtained by remote sensing techniques.
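A sketch of a Priestley-Taylor (equilibrium-type) estimate of the kind referred to above; the Tetens slope, psychrometric constant, latent-heat value, and alpha = 1.26 are standard textbook numbers rather than values from this study.

```python
# Equilibrium-type evapotranspiration: ET = alpha * [Delta/(Delta+gamma)] * (Rn - G) / lambda.
import numpy as np

def priestley_taylor_et_mm_day(Rn, G, T_air_C, alpha=1.26, gamma_kPa_C=0.066):
    """Rn and G in MJ m-2 day-1; latent heat of vaporization taken as 2.45 MJ/kg."""
    es = 0.6108 * np.exp(17.27 * T_air_C / (T_air_C + 237.3))     # saturation vapor pressure, kPa
    delta = 4098.0 * es / (T_air_C + 237.3) ** 2                  # slope of the curve, kPa per degC
    return alpha * delta / (delta + gamma_kPa_C) * (Rn - G) / 2.45

print(priestley_taylor_et_mm_day(Rn=14.0, G=1.0, T_air_C=22.0))   # roughly 4.7 mm/day
```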
ESTIMATING CHLOROFORM BIOTRANSFORMATION IN F-344 RAT LIVER USING IN VITRO TECHNIQUES AND PHARMACOKINETIC MODELING
Linskey, C.F. (1); Harrison, R.A. (2); Zhao, G. (3); Barton, H.A.; Lipscomb, J.C. (4); Evans, M.V. (2). Affiliations: (1) UNC, ESE, Chapel Hill, NC; (2) USEPA, ORD, NHEERL, RTP, NC; (3) UN...
Regional distribution of forest height and biomass from multisensor data fusion
Yifan Yu; Sassan Saatch; Linda S. Heath; Elizabeth LaPoint; Ranga Myneni; Yuri Knyazikhin
2010-01-01
Elevation data acquired from radar interferometry at C-band from SRTM are used in data fusion techniques to estimate regional scale forest height and aboveground live biomass (AGLB) over the state of Maine. Two fusion techniques have been developed to perform post-processing and parameter estimations from four data sets: 1 arc sec National Elevation Data (NED), SRTM...
Estimation of Biochemical Constituents From Fresh, Green Leaves By Spectrum Matching Techniques
NASA Technical Reports Server (NTRS)
Goetz, A. F. H.; Gao, B. C.; Wessman, C. A.; Bowman, W. D.
1990-01-01
Estimation of biochemical constituents in vegetation such as lignin, cellulose, starch, sugar and protein by remote sensing methods is an important goal in ecological research. The spectral reflectances of dried leaves exhibit diagnostic absorption features which can be used to estimate the abundance of important constituents. Lignin and nitrogen concentrations have been obtained from canopies by use of imaging spectrometry and multiple linear regression techniques. The difficulty in identifying individual spectra of leaf constituents in the region beyond 1 micrometer is that liquid water contained in the leaf dominates the spectral reflectance of leaves in this region. By use of spectrum matching techniques, originally used to quantify whole column water abundance in the atmosphere and equivalent liquid water thickness in leaves, we have been able to remove the liquid water contribution to the spectrum. The residual spectra resemble spectra for cellulose in the 1.1 micrometer region, lignin in the 1.7 micrometer region, and starch in the 2.0-2.3 micrometer region. In the entire 1.0-2.3 micrometer region each of the major constituents contributes to the spectrum. Quantitative estimates will require using unmixing techniques on the residual spectra.
NASA Technical Reports Server (NTRS)
Schkolnik, Gerard S.
1993-01-01
The application of an adaptive real-time measurement-based performance optimization technique is being explored for a future flight research program. The key technical challenge of the approach is parameter identification, which uses a perturbation-search technique to identify changes in performance caused by forced oscillations of the controls. The controls on the NASA F-15 highly integrated digital electronic control (HIDEC) aircraft were perturbed using inlet cowl rotation steps at various subsonic and supersonic flight conditions to determine the effect on aircraft performance. The feasibility of the perturbation-search technique for identifying integrated airframe-propulsion system performance effects was successfully shown through flight experiments and postflight data analysis. Aircraft response and control data were analyzed postflight to identify gradients and to determine the minimum drag point. Changes in longitudinal acceleration as small as 0.004 g were measured, and absolute resolution was estimated to be 0.002 g or approximately 50 lbf of drag. Two techniques for identifying performance gradients were compared: a least-squares estimation algorithm and a modified maximum likelihood estimator algorithm. A complementary filter algorithm was used with the least squares estimator.
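A sketch of the least-squares side of such a perturbation search, assuming recorded cowl-angle steps and the corresponding longitudinal accelerations (names and data layout hypothetical, not HIDEC flight data): a quadratic fit yields the local performance gradient and the perturbation setting at the extremum of the fitted curve.

```python
# Quadratic least-squares fit of acceleration versus cowl perturbation.
import numpy as np

def performance_gradient(cowl_angle_deg, longitudinal_accel_g):
    c2, c1, c0 = np.polyfit(cowl_angle_deg, longitudinal_accel_g, 2)
    optimum_deg = -c1 / (2.0 * c2)          # vertex of the fitted parabola
    return c1, optimum_deg                   # local gradient at zero, extremum setting
```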
Strategies for Estimating Discrete Quantities.
ERIC Educational Resources Information Center
Crites, Terry W.
1993-01-01
Describes the benchmark and decomposition-recomposition estimation strategies and presents five techniques to develop students' estimation ability. Suggests situations involving quantities of candy and popcorn in which the teacher can model those strategies for the students. (MDH)
Thevissen, Patrick W; Fieuws, Steffen; Willems, Guy
2013-03-01
Multiple third molar development registration techniques exist. The aim of this study was therefore to determine which third molar development registration technique is most promising as a tool for subadult age estimation. On a collection of 1199 panoramic radiographs, the development of all present third molars was registered following nine different registration techniques [Gleiser, Hunt (GH); Haavikko (HV); Demirjian (DM); Raungpaka (RA); Gustafson, Koch (GK); Harris, Nortje (HN); Kullman (KU); Moorrees (MO); Cameriere (CA)]. Regression models with age as the response and the third molar registration as the predictor were developed for each registration technique separately. The MO technique showed the highest R2 values (F 51%, M 45%) and the lowest root mean squared errors (F 3.42 years; M 3.67 years), but the differences from the other techniques were small in magnitude. The number of stages used in the explored staging techniques slightly influenced the age predictions. © 2013 American Academy of Forensic Sciences.
A method for nonlinear exponential regression analysis
NASA Technical Reports Server (NTRS)
Junkin, B. G.
1971-01-01
A computer-oriented technique is presented for performing a nonlinear exponential regression analysis on decay-type experimental data. The technique involves the least squares procedure wherein the nonlinear problem is linearized by expansion in a Taylor series. A linear curve fitting procedure for determining the initial nominal estimates for the unknown exponential model parameters is included as an integral part of the technique. A correction matrix was derived and then applied to the nominal estimate to produce an improved set of model parameters. The solution cycle is repeated until some predetermined criterion is satisfied.
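A compact sketch of the iteration described, for a single-term decay model y = A*exp(-k*t): the model is linearized with a first-order Taylor expansion about the current estimates and a least-squares correction is applied until the update is negligible (the initial estimates would come from the linear curve fit mentioned above).

```python
# Gauss-Newton fit of y = A*exp(-k*t) by repeated linearization.
import numpy as np

def fit_exponential(t, y, A0, k0, tol=1e-8, max_iter=50):
    A, k = float(A0), float(k0)
    for _ in range(max_iter):
        resid = y - A * np.exp(-k * t)                              # current residuals
        J = np.column_stack([np.exp(-k * t), -A * t * np.exp(-k * t)])  # d(model)/d(A, k)
        delta, *_ = np.linalg.lstsq(J, resid, rcond=None)           # correction step
        A, k = A + delta[0], k + delta[1]
        if np.max(np.abs(delta)) < tol:
            break
    return A, k
```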