DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, J; Wang, J; Peng, J
Purpose: To implement an entire-workflow quality assurance (QA) process in the radiotherapy department and to reduce radiotherapy error rates through entire-workflow management in a developing country. Methods: The entire-workflow QA process runs from patient registration to the end of the last treatment, covering every step of the radiotherapy process. The error rate from chart checks was used to evaluate the entire-workflow QA process. Two to three qualified senior medical physicists checked the documents before each patient's first treatment fraction. Random checks of the treatment history during treatment were also performed. Treatment data from around 6,000 patients before and after implementing the entire-workflow QA process were compared from May 2014 to December 2015. Results: A systematic checklist was established. It mainly includes patient registration, treatment plan QA, information export to the Oncology Information System (OIS), documentation of treatment QA, and QA of the treatment history. The error rate derived from chart checks decreased from 1.7% to 0.9% after implementing the entire-workflow QA process. All errors found before the first treatment fraction were corrected as soon as the oncologist re-confirmed them, and reinforced staff training followed to prevent recurrence. Conclusion: The entire-workflow QA process improved the safety and quality of radiotherapy in our department, and we consider our QA experience applicable to heavily loaded radiotherapy departments in developing countries.
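As a rough check on the reported improvement, the decrease from 1.7% to 0.9% can be tested with a standard two-proportion z-test. The sketch below is illustrative only: it assumes an even split of about 3,000 charts per period (the abstract gives only the ~6,000 total), and the counts 51 and 27 are derived from those assumed denominators and the reported rates.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test using the pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                     # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))    # two-sided normal tail
    return z, p_value

# Assumed split: ~3000 charts before and after; 1.7% and 0.9% error rates
# give roughly 51 and 27 errors respectively.
z, p = two_proportion_z(51, 3000, 27, 3000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

Under these assumed denominators the decrease is statistically significant at the 1% level, which is consistent with (but not stated in) the abstract.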
50 CFR 679.30 - General CDQ regulations.
Code of Federal Regulations, 2011 CFR
2011-10-01
... visual representation of the qualified applicant's entire organizational structure, including all... narrative description of how the CDQ group intends to harvest and process its CDQ allocations, including a...
Automated drug identification system
NASA Technical Reports Server (NTRS)
Campen, C. F., Jr.
1974-01-01
System speeds up analysis of blood and urine and is capable of identifying 100 commonly abused drugs. System includes computer that controls entire analytical process by ordering various steps in specific sequences. Computer processes data output and has readout of identified drugs.
Methodological challenges when doing research that includes ethnic minorities: a scoping review.
Morville, Anne-Le; Erlandsson, Lena-Karin
2016-11-01
There are challenging methodological issues in obtaining valid and reliable results on which to base occupational therapy interventions for ethnic minorities. The aim of this scoping review is to describe the methodological problems in occupational therapy research when ethnic minorities are included. A thorough literature search yielded 21 articles from the scientific databases PubMed, CINAHL, Web of Science and PsycINFO. Analysis followed Arksey and O'Malley's framework for scoping reviews, applying content analysis. The results showed methodological issues spanning the entire research process: defining and recruiting samples, conceptual understanding, the lack of appropriate instruments, data collection using interpreters, and data analysis. To avoid excluding ethnic minorities from adequate occupational therapy research and interventions, methods must be developed for the entire research process. This is a costly and time-consuming process, but the results will be valid and reliable, and therefore more applicable in clinical practice.
40 CFR 63.7882 - What site remediation sources at my facility does this subpart affect?
Code of Federal Regulations, 2014 CFR
2014-07-01
... source is the entire group of process vents associated with the in-situ and ex-situ remediation processes used at your site to remove, destroy, degrade, transform, or immobilize hazardous substances in the remediation material subject to remediation. Examples of such in-situ remediation processes include, but are...
40 CFR 63.7882 - What site remediation sources at my facility does this subpart affect?
Code of Federal Regulations, 2013 CFR
2013-07-01
... source is the entire group of process vents associated with the in-situ and ex-situ remediation processes used at your site to remove, destroy, degrade, transform, or immobilize hazardous substances in the remediation material subject to remediation. Examples of such in-situ remediation processes include, but are...
40 CFR 63.7882 - What site remediation sources at my facility does this subpart affect?
Code of Federal Regulations, 2011 CFR
2011-07-01
... source is the entire group of process vents associated with the in-situ and ex-situ remediation processes used at your site to remove, destroy, degrade, transform, or immobilize hazardous substances in the remediation material subject to remediation. Examples of such in-situ remediation processes include, but are...
40 CFR 63.7882 - What site remediation sources at my facility does this subpart affect?
Code of Federal Regulations, 2012 CFR
2012-07-01
... source is the entire group of process vents associated with the in-situ and ex-situ remediation processes used at your site to remove, destroy, degrade, transform, or immobilize hazardous substances in the remediation material subject to remediation. Examples of such in-situ remediation processes include, but are...
Evaluating Amtrak's S2S: Are Recorded Injury Rates Showing Actual Injury Rates?
DOT National Transportation Integrated Search
2017-08-01
Since 2009, Amtrak has been engaged in unprecedented efforts to advance its safety processes and improve the safety culture of the entire corporation, including establishing a peer-to-peer feedback process, known as the Safe-2-Safer program. FRA is c...
21 CFR 892.1715 - Full-field digital mammography system.
Code of Federal Regulations, 2012 CFR
2012-04-01
... planar digital x-ray images of the entire breast. This generic type of device may include digital mammography acquisition software, full-field digital image receptor, acquisition workstation, automatic exposure control, image processing and reconstruction programs, patient and equipment supports, component...
21 CFR 892.1715 - Full-field digital mammography system.
Code of Federal Regulations, 2013 CFR
2013-04-01
... planar digital x-ray images of the entire breast. This generic type of device may include digital mammography acquisition software, full-field digital image receptor, acquisition workstation, automatic exposure control, image processing and reconstruction programs, patient and equipment supports, component...
21 CFR 892.1715 - Full-field digital mammography system.
Code of Federal Regulations, 2014 CFR
2014-04-01
... planar digital x-ray images of the entire breast. This generic type of device may include digital mammography acquisition software, full-field digital image receptor, acquisition workstation, automatic exposure control, image processing and reconstruction programs, patient and equipment supports, component...
21 CFR 892.1715 - Full-field digital mammography system.
Code of Federal Regulations, 2011 CFR
2011-04-01
... planar digital x-ray images of the entire breast. This generic type of device may include digital mammography acquisition software, full-field digital image receptor, acquisition workstation, automatic exposure control, image processing and reconstruction programs, patient and equipment supports, component...
Working group organizational meeting
NASA Technical Reports Server (NTRS)
1982-01-01
Scene radiation and atmospheric effects, mathematical pattern recognition and image analysis, information evaluation and utilization, and electromagnetic measurements and signal handling are considered. Research issues in sensors and signals, including radar (SAR) reflectometry, SAR processing speed, registration, including overlay of SAR and optical imagery, entire system radiance calibration, and lack of requirements for both sensors and systems, etc. were discussed.
ERIC Educational Resources Information Center
Benoit, Gerald
2002-01-01
Discusses data mining (DM) and knowledge discovery in databases (KDD), taking the view that KDD encompasses the entire process, with DM emphasizing the cleaning, warehousing, mining, and visualization stages of knowledge discovery. Highlights include algorithms; users; the Internet; text mining; and information extraction.…
NOSS altimeter algorithm specifications
NASA Technical Reports Server (NTRS)
Hancock, D. W.; Forsythe, R. G.; Mcmillan, J. D.
1982-01-01
A description of all algorithms required for altimeter processing is given. Each description includes title, description, inputs/outputs, general algebraic sequences and data volume. All required input/output data files are described and the computer resources required for the entire altimeter processing system were estimated. The majority of the data processing requirements for any radar altimeter of the Seasat-1 type are scoped. Additions and deletions could be made for the specific altimeter products required by other projects.
The Coastal Ocean Prediction Systems program: Understanding and managing our coastal ocean
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eden, H.F.; Mooers, C.N.K.
1990-06-01
The goal of COPS is to couple a program of regular observations to numerical models, through techniques of data assimilation, in order to provide a predictive capability for the US coastal ocean including the Great Lakes, estuaries, and the entire Exclusive Economic Zone (EEZ). The objectives of the program include: determining the predictability of the coastal ocean and the processes that govern the predictability; developing efficient prediction systems for the coastal ocean based on the assimilation of real-time observations into numerical models; and coupling the predictive systems for the physical behavior of the coastal ocean to predictive systems for biological, chemical, and geological processes to achieve an interdisciplinary capability. COPS will provide the basis for effective monitoring and prediction of coastal ocean conditions by optimizing the use of increased scientific understanding, improved observations, advanced computer models, and computer graphics to make the best possible estimates of sea level, currents, temperatures, salinities, and other properties of entire coastal regions.
Bacci, Jennifer L; Berenbrok, Lucas A
2018-06-07
The scope of community pharmacy practice has expanded beyond the provision of drug product to include the provision of patient care services. Likewise, the community pharmacist's approach to patient safety must also expand beyond prevention of errors during medication dispensing to include optimization of medications and prevention of adverse events throughout the entire medication use process. Connectivity to patient data and other healthcare providers has been a longstanding challenge in community pharmacy with implications for the delivery and safety of patient care. Here, we describe three innovative advances in connectivity in community pharmacy practice that enhance patient safety in the provision of community pharmacist patient care services across the entire medication use process. Specifically, we discuss the growing use of immunization information systems, quality improvement platforms, and health information exchanges in community pharmacy practice and their implications for patient safety. This article is protected by copyright. All rights reserved.
Extra Cheese, Please! Mozzarella's Journey from Cow to Pizza [and] Teaching Guide.
ERIC Educational Resources Information Center
Peterson, Chris
This book traces Annabelle the dairy cow's milk from the farm to the top of a Friday night pizza. The book relates that when Annabelle gives birth to her calf she also begins to produce milk; the milk is then processed into cheese, and from the cheese, pizza is made (recipe included). The book features color photographs of the entire process which…
Enabling the Public to Experience Science from Beginning to End (Invited)
NASA Astrophysics Data System (ADS)
Trouille, L.; Chen, Y.; Lintott, C.; Lynn, S.; Simmons, B.; Smith, A.; Tremonti, C.; Whyte, L.; Willett, K.; Zevin, M.; Science Team; Moderator Team, G.
2013-12-01
In this talk we present the results of an experiment in collaborative research and article writing within the citizen science context. During July-September 2013, astronomers and the Zooniverse team ran Galaxy Zoo Quench (quench.galaxyzoo.org), investigating the mechanism(s) that recently and abruptly shut off star formation in a sample of post-quenched galaxies. Through this project, the public had the opportunity to experience the entire process of science, including galaxy classification, reading background literature, data analysis, discussion, debate, drawing conclusions, and writing an article to submit to a professional journal. The context was galaxy evolution; however, the lessons learned are applicable across the disciplines. The discussion will focus on how to leverage online tools to authentically engage the public in the entire process of science.
Advances in Polyhydroxyalkanoate (PHA) Production.
Koller, Martin
2017-11-02
This editorial paper provides a synopsis of the contributions to the Bioengineering special issue "Advances in Polyhydroxyalkanoate (PHA) Production". It illustrates the embedding of the issue's individual research articles in the current global research and development landscape related to polyhydroxyalkanoates (PHA). The article shows how these articles are interrelated to each other, reflecting the entire PHA process chain including strain selection, metabolic and genetic considerations, feedstock evaluation, fermentation regimes, process engineering, and polymer processing towards high-value marketable products.
Microbiological Impact on Carbon Capture and Sequestration: Biotic Processes in Natural CO2 Analogue
Multiple ground-water based microbial community analyses including membrane lipids assays for phospholipid fatty acid and DNA analysis were performed from hydraulically isolated zones. DGGE results from DNA extracts from vertical profiling of the entire depth of aquifer sampled a...
Wake County Public School System Design Guidelines.
ERIC Educational Resources Information Center
Wake County Public School System, Raleigh, NC.
The Wake County Public School System has published its guidelines for planning and design of functional, cost effective, and durable educational facilities that are attractive and enhance the students' educational experience. The guidelines present basic planning requirement and design criteria for the entire construction process, including: codes…
DSN telemetry system data records
NASA Technical Reports Server (NTRS)
Gatz, E. C.
1976-01-01
The DSN telemetry system now includes the capability to provide a complete magnetic tape record, within 24 hours of reception, of all telemetry data received from a spacecraft. This record, the intermediate data record, is processed and generated almost entirely automatically, and provides a detailed accounting of any missing data.
Trends in Process Analytical Technology: Present State in Bioprocessing.
Jenzsch, Marco; Bell, Christian; Buziol, Stefan; Kepert, Felix; Wegele, Harald; Hakemeyer, Christian
2017-08-04
Process analytical technology (PAT), the regulatory initiative for incorporating quality in pharmaceutical manufacturing, is an area of intense research and interest. If PAT is effectively applied to bioprocesses, this can increase process understanding and control, and mitigate the risk from substandard drug products to both manufacturer and patient. To optimize the benefits of PAT, the entire PAT framework must be considered and each element of PAT must be carefully selected, including sensor and analytical technology, data analysis techniques, control strategies and algorithms, and process optimization routines. This chapter discusses the current state of PAT in the biopharmaceutical industry, including several case studies demonstrating the degree of maturity of various PAT tools. Graphical Abstract: Hierarchy of QbD components.
High-throughput imaging of heterogeneous cell organelles with an X-ray laser (CXIDB ID 25)
Hantke, Max F.
2014-11-17
Preprocessed detector images that were used for the paper "High-throughput imaging of heterogeneous cell organelles with an X-ray laser". The CXI file contains the entire recorded data - including both hits and blanks. It also includes down-sampled images and LCLS machine parameters. Additionally, the Cheetah configuration file is attached that was used to create the pre-processed data.
Planetary image conversion task
NASA Technical Reports Server (NTRS)
Martin, M. D.; Stanley, C. L.; Laughlin, G.
1985-01-01
The Planetary Image Conversion Task group processed 12,500 magnetic tapes containing raw imaging data from JPL planetary missions and produced an image data base in consistent format on 1200 fully packed 6250-bpi tapes. The output tapes will remain at JPL. A copy of the entire tape set was delivered to US Geological Survey, Flagstaff, Ariz. A secondary task converted computer datalogs, which had been stored in project-specific MARK IV File Management System data types and structures, to flat-file, text format that is processable on any modern computer system. The conversion processing took place at JPL's Image Processing Laboratory on an IBM 370-158 with existing software modified slightly to meet the needs of the conversion task. More than 99% of the original digital image data was successfully recovered by the conversion task. However, processing data tapes recorded before 1975 was destructive. This discovery is of critical importance to facilities responsible for maintaining digital archives since normal periodic random sampling techniques would be unlikely to detect this phenomenon, and entire data sets could be wiped out in the act of generating seemingly positive sampling results. Recommended follow-on activities are also included.
Martin, A K; Mowry, B; Reutens, D; Robinson, G A
2015-10-01
Patients with schizophrenia often display deficits on tasks thought to measure "executive" processes. Recently, it has been suggested that reductions in fluid intelligence test performance entirely explain deficits reported for patients with focal frontal lesions on classical executive tasks. For patients with schizophrenia, it is unclear whether deficits on executive tasks are entirely accounted for by fluid intelligence and representative of a common general process, or best accounted for by distinct contributions to the cognitive profile of schizophrenia. In the current study, 50 patients with schizophrenia and 50 controls matched for age, sex, and premorbid intelligence were assessed using a broad neuropsychological battery, including tasks considered sensitive to executive abilities, namely the Hayling Sentence Completion Test (HSCT), word fluency, Stroop test, digit-span backwards, and spatial working memory. Fluid intelligence was measured using both the Matrix reasoning subtest from the Wechsler Abbreviated Scale of Intelligence (WASI) and a composite score derived from a number of cognitive tests. Patients with schizophrenia were impaired on all cognitive measures compared with controls, except smell identification and the optimal betting and risk-taking measures from the Cambridge Gambling Task. After introducing fluid intelligence as a covariate, significant differences remained for HSCT suppression errors, and classical executive function tests such as the Stroop test and semantic/phonemic word fluency, regardless of which fluid intelligence measure was included. Fluid intelligence does not entirely explain impaired performance on all tests considered as reflecting "executive" processes. For schizophrenia, these measures should remain part of a comprehensive neuropsychological assessment alongside a measure of fluid intelligence. Copyright © 2015 Elsevier Inc. All rights reserved.
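The covariate-adjustment step described above (testing whether a group difference survives controlling for fluid intelligence) can be sketched as an ANCOVA-style regression. The data below are synthetic and the effect sizes invented for illustration; only the analysis pattern reflects the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50  # per group, matching the study's group sizes

# Synthetic data: a group effect on an executive score that persists
# beyond what the fluid-intelligence covariate explains.
fluid = np.concatenate([rng.normal(100, 15, n), rng.normal(95, 15, n)])
group = np.concatenate([np.zeros(n), np.ones(n)])   # 1 = patient group
exec_score = 0.4 * fluid - 5.0 * group + rng.normal(0, 3, size=2 * n)

# Design matrix: intercept, covariate, group indicator.
X = np.column_stack([np.ones(2 * n), fluid, group])
beta, *_ = np.linalg.lstsq(X, exec_score, rcond=None)

# beta[2] estimates the group difference *after* adjusting for fluid
# intelligence; a clearly nonzero value mirrors the finding that the
# deficit is not entirely explained by the covariate.
print(f"adjusted group effect: {beta[2]:.2f}")
```

In practice one would also test beta[2] against its standard error; the point here is only that the covariate enters the same linear model as the group indicator.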
JWST Wavefront Sensing and Control: Operations Plans, Demonstrations, and Status
NASA Astrophysics Data System (ADS)
Perrin, Marshall; Acton, D. Scott; Lajoie, Charles-Philippe; Knight, J. Scott; Myers, Carey; Stark, Chris; JWST Wavefront Sensing & Control Team
2018-01-01
After JWST launches and unfolds in space, its telescope optics will be aligned through a complex series of wavefront sensing and control (WFSC) steps to achieve diffraction-limited performance. This iterative process will comprise about half of the observatory commissioning time (~ 3 out of 6 months). We summarize the JWST WFSC process, schedule, and expectations for achieved performance, and discuss our team’s activities to prepare for an effective & efficient telescope commissioning. During the recently-completed OTIS cryo test at NASA JSC, WFSC demonstrations showed the flight-like operation of the entire JWST active optics and WFSC system from end to end, including all hardware and software components. In parallel, the same test data were processed through the JWST Mission Operations Center at STScI to demonstrate the readiness of ground system components there (such as the flight operations system, data pipelines, archives, etc). Moreover, using the Astronomer’s Proposal Tool (APT), the entire telescope commissioning program has been implemented, reviewed, and is ready for execution. Between now and launch our teams will continue preparations for JWST commissioning, including further rehearsals and testing, to ensure a successful alignment of JWST’s telescope optics.
A Comprehensive Three-Dimensional Cortical Map of Vowel Space
ERIC Educational Resources Information Center
Scharinger, Mathias; Idsardi, William J.; Poe, Samantha
2011-01-01
Mammalian cortex is known to contain various kinds of spatial encoding schemes for sensory information including retinotopic, somatosensory, and tonotopic maps. Tonotopic maps are especially interesting for human speech sound processing because they encode linguistically salient acoustic properties. In this study, we mapped the entire vowel space…
16 CFR 1616.4 - Sampling and acceptance procedures.
Code of Federal Regulations, 2014 CFR
2014-01-01
... a suitable thread and stitch. The specimen shall include each of the components over its entire... fabric in Tightened Sampling must be discontinued until that part of the process or component which is... otherwise attaching the trim shall be done with thread or fastening material of the same composition and...
Production Techniques for Computer-Based Learning Material.
ERIC Educational Resources Information Center
Moonen, Jef; Schoenmaker, Jan
Experiences in the development of educational software in the Netherlands have included the use of individual and team approaches, the determination of software content and how it should be presented, and the organization of the entire development process, from experimental programs to prototype to final product. Because educational software is a…
The beneficial effects of berry fruit on cognitive and neuronal function in aging
USDA-ARS's Scientific Manuscript database
Research has demonstrated, in both human and animals, that cognition decreases with age, to include deficits in processing speed, executive function, memory, and spatial learning. The cause of these functional declines is not entirely understood; however, neuronal losses and the associated changes i...
Farley Three-Dimensional-Braiding Machine
NASA Technical Reports Server (NTRS)
Farley, Gary L.
1991-01-01
Process and device known as Farley three-dimensional-braiding machine conceived to fabricate dry continuous fiber-reinforced preforms of complex three-dimensional shapes for subsequent processing into composite structures. Robotic fiber supply dispenses yarn as it traverses braiding surface. Combines many attributes of weaving and braiding processes with other attributes and capabilities. Other applications include decorative cloths, rugs, and other domestic textiles. Concept could lead to large variety of fiber layups and to entirely new products as well as new fiber-reinforcing applications.
Parameter estimation for terrain modeling from gradient data. [navigation system for Martian rover
NASA Technical Reports Server (NTRS)
Dangelo, K. R.
1974-01-01
A method is developed for modeling terrain surfaces for use on an unmanned Martian roving vehicle. The modeling procedure employs a two-step process which uses gradient as well as height data in order to improve the accuracy of the model's gradient. Least square approximation is used in order to stochastically determine the parameters which describe the modeled surface. A complete error analysis of the modeling procedure is included which determines the effect of instrumental measurement errors on the model's accuracy. Computer simulation is used as a means of testing the entire modeling process which includes the acquisition of data points, the two-step modeling process and the error analysis. Finally, to illustrate the procedure, a numerical example is included.
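The idea of fitting a surface to both height and gradient measurements can be illustrated as a stacked least-squares system. This is a simplified single-step sketch, not the paper's actual two-step procedure: a quadratic surface is assumed, and its basis functions and their derivatives supply the rows for height and gradient observations respectively.

```python
import numpy as np

# Quadratic surface model: z = a + b*x + c*y + d*x^2 + e*x*y + f*y^2.
# Height observations use the basis functions; gradient observations use
# their x- and y-derivatives, so all measurements share one linear system.
def height_row(x, y):
    return [1.0, x, y, x * x, x * y, y * y]

def grad_x_row(x, y):           # d z / d x
    return [0.0, 1.0, 0.0, 2 * x, y, 0.0]

def grad_y_row(x, y):           # d z / d y
    return [0.0, 0.0, 1.0, 0.0, x, 2 * y]

true = np.array([2.0, 0.5, -0.3, 0.1, 0.05, -0.2])   # hypothetical surface
pts = [(0, 0), (1, 0), (0, 1), (1, 1), (2, 1), (1, 2)]

rows, obs = [], []
for x, y in pts:
    for row_fn in (height_row, grad_x_row, grad_y_row):
        r = row_fn(x, y)
        rows.append(r)
        obs.append(np.dot(r, true))   # noise-free synthetic measurement

coef, *_ = np.linalg.lstsq(np.array(rows), np.array(obs), rcond=None)
```

With noise-free data the fit recovers the surface parameters exactly; with noisy measurements the same system gives the least-squares estimate, and the gradient rows constrain the model's slope directly, which is the abstract's stated motivation for including gradient data.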
Cameras for semiconductor process control
NASA Technical Reports Server (NTRS)
Porter, W. A.; Parker, D. L.
1977-01-01
The application of X-ray topography to semiconductor process control is described, considering the novel features of the high-speed camera and the difficulties associated with this technique. The most significant results on the effects of material defects on device performance are presented, including results obtained using wafers processed entirely within this institute. Defects were identified using the X-ray camera and correlations were made with probe data. Temperature-dependent effects of material defects are also included. Recent applications and improvements of X-ray topographs of silicon-on-sapphire and gallium arsenide are presented with a description of a real-time TV system prototype and of the most recent vacuum chuck design. Our promotion of the camera's use by various semiconductor manufacturers is also discussed.
Peres Penteado, Alissa; Fábio Maciel, Rafael; Erbs, João; Feijó Ortolani, Cristina Lucia; Aguiar Roza, Bartira; Torres Pisa, Ivan
2015-01-01
The entire kidney transplantation process in Brazil is defined through laws, decrees, ordinances, and resolutions, but there is no defined theoretical map describing this process. From such a representation it is possible to perform analyses, such as identifying bottlenecks and the information and communication technologies (ICTs) that support the process. The aim of this study was to analyze and represent the kidney transplantation workflow using business process modeling notation (BPMN) and then to identify the ICTs involved in the process. This study was conducted in eight steps, including document analysis and professional evaluation. The results include the BPMN model of the kidney transplantation process in Brazil and the identification of ICTs. We found significant delays in the process, owing to the many different ICTs involved, which can leave information poorly integrated.
Pekala, Katarzyna; Jurczakowski, Rafał; Lewera, Adam; Orlik, Marek
2007-05-10
The oscillatory oxidation of thiocyanate ions with hydrogen peroxide, catalyzed by Cu2+ ions in alkaline media, had so far been observed only as occurring simultaneously in the entire space of the batch or flow reactor. We performed this reaction for the first time in a thin-layer reactor and observed the spatiotemporal course of the above process in the presence of luminol as the chemiluminescent indicator. A series of luminescent patterns periodically starting from a random reaction center and spreading throughout the entire solution layer was reported. For a batch-stirred system, the bursts of luminescence were found to correlate with the steep decreases of the oscillating Pt electrode potential. These novel results open possibilities for further experimental and theoretical investigations of these spatiotemporal patterns, including studies of the mechanism of this chemically complex process.
Accelerating Molecular Dynamic Simulation on Graphics Processing Units
Friedrichs, Mark S.; Eastman, Peter; Vaidyanathan, Vishal; Houston, Mike; Legrand, Scott; Beberg, Adam L.; Ensign, Daniel L.; Bruns, Christopher M.; Pande, Vijay S.
2009-01-01
We describe a complete implementation of all-atom protein molecular dynamics running entirely on a graphics processing unit (GPU), including all standard force field terms, integration, constraints, and implicit solvent. We discuss the design of our algorithms and important optimizations needed to fully take advantage of a GPU. We evaluate its performance, and show that it can be more than 700 times faster than a conventional implementation running on a single CPU core. PMID:19191337
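The per-atom integration step that such GPU implementations parallelize can be illustrated on a CPU with the standard velocity-Verlet scheme. This is a one-particle harmonic-oscillator sketch under assumed units (k = m = 1), not the paper's GPU code; `force` stands in for a full force-field evaluation.

```python
# Velocity-Verlet integration for a 1-D harmonic oscillator: the same
# update rule an MD engine applies to every atom at every timestep.
def force(x):
    return -x  # F = -k*x with k = 1 (illustrative stand-in for a force field)

def velocity_verlet(x, v, dt, steps):
    f = force(x)
    for _ in range(steps):
        x += v * dt + 0.5 * f * dt * dt   # position update
        f_new = force(x)                  # recompute forces at new position
        v += 0.5 * (f + f_new) * dt       # velocity update (averaged forces)
        f = f_new
    return x, v

x, v = velocity_verlet(1.0, 0.0, dt=0.01, steps=10_000)
energy = 0.5 * v * v + 0.5 * x * x   # total energy; should stay near 0.5
```

Velocity Verlet is symplectic, so the energy stays bounded near its initial value over long runs; on a GPU the position/velocity updates and, far more expensively, the force evaluations are performed for all atoms in parallel.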
Ethanol or Biodiesel? A Systems-Analysis Decision
ERIC Educational Resources Information Center
Dinan, Frank; Stabler, Tom
2008-01-01
This case study stresses the need to broadly consider an entire system, including all of the energy inputs and outputs involved, to determine the real efficiency of that system. It also asks its student audience to consider the role that scientific input plays in policy decision-making processes. It emphasizes that, despite the importance of this…
HIPAA Readiness Collaborative in Hawaii.
Chun, Marva; Forbes, Susan; Gose, Steven; Kumabe, Brenda; Loo, Jeffrey; Nichols, Lorraine; Rosa, Luis; Sherrill, Laura; Turner, Jim
2002-01-01
The vision of Hawaii's HIPAA Readiness Collaborative (HRC) effort is to realize the positive potential of HIPAA through a collaborative process that engages the entire healthcare delivery system. Goals include reducing the cost of healthcare through streamlining, reducing the cost of HIPAA implementation for HRC participants, and improving the interoperability between facilities through use of standard technologies.
Quasicrystals and Quantum Computing
NASA Astrophysics Data System (ADS)
Berezin, Alexander A.
1997-03-01
In Quantum (Q) Computing qubits form Q-superpositions for macroscopic times. One scheme for ultra-fast (Q) computing can be based on quasicrystals. Ultrafast processing in Q-coherent structures (and the very existence of durable Q-superpositions) may be 'consequence' of presence of entire manifold of integer arithmetic (A0, aleph-naught of Georg Cantor) at any 4-point of space-time, furthermore, at any point of any multidimensional phase space of (any) N-particle Q-system. The latter, apart from quasicrystals, can include dispersed and/or diluted systems (Berezin, 1994). In such systems such alleged centrepieces of Q-Computing as ability for fast factorization of long integers can be processed by sheer virtue of the fact that entire infinite pattern of prime numbers is instantaneously available as 'free lunch' at any instant/point. Infinitely rich pattern of A0 (including pattern of primes and almost primes) acts as 'independent' physical effect which directly generates Q-dynamics (and physical world) 'out of nothing'. Thus Q-nonlocality can be ultimately based on instantaneous interconnectedness through ever-the-same structure of A0 ('Platonic field' of integers).
Yoon, Hyejin; Leitner, Thomas
2014-12-17
Analyses of entire viral genomes or mtDNA require comprehensive design of many primers across their genomes. In addition, simultaneous optimization of several DNA primer design criteria may improve overall experimental efficiency and downstream bioinformatic processing. To achieve these goals, we developed PrimerDesign-M. It includes several options for multiple-primer design, allowing researchers to efficiently design walking primers that cover long DNA targets, such as entire HIV-1 genomes, and that are optimized simultaneously for the genetic diversity in multiple alignments and for experimental design constraints given by the user. PrimerDesign-M can also design primers that include DNA barcodes and minimize primer dimerization. PrimerDesign-M finds optimal primers for highly variable DNA targets and facilitates design flexibility by suggesting alternative designs to adapt to experimental conditions.
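The dimer-minimization criterion mentioned in the abstract can be illustrated with a toy 3'-end complementarity screen. This is only a sketch of one common heuristic, not PrimerDesign-M's actual algorithm; the function names and the 4-base threshold are hypothetical.

```python
# Minimal sketch of a primer-dimer screen: count contiguous Watson-Crick
# matches when the 3' ends of two primers are aligned antiparallel.
# Illustrative heuristic only, not PrimerDesign-M's actual algorithm.

COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def three_prime_dimer_run(primer_a: str, primer_b: str) -> int:
    """Length of the contiguous complementary run walking inward
    from the 3' end of each primer (both given 5'->3')."""
    run = 0
    for base_a, base_b in zip(reversed(primer_a), reversed(primer_b)):
        if COMPLEMENT.get(base_a) == base_b:
            run += 1
        else:
            break
    return run

def likely_dimer(primer_a: str, primer_b: str, threshold: int = 4) -> bool:
    """Flag pairs whose 3' ends pair for `threshold` or more bases."""
    return three_prime_dimer_run(primer_a, primer_b) >= threshold
```

For example, `"AAGCAT"` and `"TTCGTA"` pair base-for-base from their 3' ends and would be flagged, while `"ACGT"` against itself would not.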
Second Aerospace Environmental Technology Conference
NASA Technical Reports Server (NTRS)
Whitaker, A. F. (Editor); Clark-Ingram, M. (Editor)
1997-01-01
The mandated elimination of CFC's, Halons, TCA, and other ozone depleting chemicals and specific hazardous materials has required changes and new developments in aerospace materials and processes. The aerospace industry has been involved for several years in providing product substitutions, redesigning entire production processes, and developing new materials that minimize or eliminate damage to the environment. These activities emphasize replacement cleaning solvents and their application, verification, compliant coatings including corrosion protection systems, and removal techniques, chemical propulsion effects on the environment, and the initiation of modifications to relevant processing and manufacturing specifications and standards.
Second Aerospace Environmental Technology Conference
NASA Technical Reports Server (NTRS)
Whitaker, A. F.; Clark-Ingram, M.; Hessler, S. L.
1997-01-01
The mandated elimination of CFC's, Halons, TCA, and other ozone depleting chemicals and specific hazardous materials has required changes and new developments in aerospace materials and processes. The aerospace industry has been involved for several years in providing product substitutions, redesigning entire production processes, and developing new materials that minimize or eliminate damage to the environment. These activities emphasize replacement cleaning solvents and their application verifications, compliant coatings including corrosion protection systems, and removal techniques, chemical propulsion effects on the environment, and the initiation of modifications to relevant processing and manufacturing specifications and standards.
Evaluation of work posture and quantification of fatigue by Rapid Entire Body Assessment (REBA)
NASA Astrophysics Data System (ADS)
Rizkya, I.; Syahputri, K.; Sari, R. M.; Anizar; Siregar, I.
2018-02-01
Work-related musculoskeletal disorders (MSDs), poor body postures, and low back injuries are among the most common problems in many industries, including small and medium industries. This study presents an assessment and evaluation of the ergonomic postures of material handling workers, carried out using REBA (Rapid Entire Body Assessment). REBA is a technique to quantify the fatigue experienced by a worker while manually lifting loads; fatigue due to abnormal work posture leads to complaints of perceived pain. The REBA method was applied to assess working postures in the existing process through a procedural analysis of the body postures involved. This study shows that the body parts at highest risk are the back, neck, and upper arms, with a REBA score of 9, so action should be taken as soon as possible. Control actions were then implemented for those processes, and a substantial risk reduction was achieved.
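The "score 9, act as soon as possible" conclusion follows the published REBA action-level bands, which can be sketched as below. The posture scoring tables (trunk/neck/legs and arm/wrist groups) that produce the final score are not reproduced here.

```python
# Mapping a final REBA score (1-15) to its standard action level.
# The score itself comes from REBA's posture tables, not shown here.

def reba_action_level(score: int) -> tuple[int, str]:
    """Return (action level, risk description) for a final REBA score."""
    if score == 1:
        return 0, "negligible risk; no action required"
    if 2 <= score <= 3:
        return 1, "low risk; change may be needed"
    if 4 <= score <= 7:
        return 2, "medium risk; investigate further, change soon"
    if 8 <= score <= 10:
        return 3, "high risk; investigate and implement change"
    if 11 <= score <= 15:
        return 4, "very high risk; implement change immediately"
    raise ValueError("REBA scores range from 1 to 15")
```

The study's score of 9 falls in the 8-10 band (action level 3), consistent with its recommendation that action be taken as soon as possible.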
Challenges facing the finance reform of the health system in Chile.
Herrera, Tania
2014-05-28
Financing is one of the key functions of health systems, which includes the processes of revenue collection, fund pooling and acquisitions in order to ensure access to healthcare for the entire population. The article analyzes the financing model of the Chilean health system in terms of the first two processes, confirming low public spending on healthcare and high out-of-pocket expenditure, in addition to an appropriation of public resources by private insurers and providers. Insofar as pooling, there is lack of solidarity and risk sharing leading to segmentation of the population that is not consistent with the concept of social security, undermines equity and reduces system-wide efficiency. There is a pressing need to jumpstart reforms that address these issues. Treatments must be considered together with public health concerns and primary care in order to ensure the right to health of the entire population.
Magagna, Federico; Guglielmetti, Alessandro; Liberto, Erica; Reichenbach, Stephen E; Allegrucci, Elena; Gobino, Guido; Bicchi, Carlo; Cordero, Chiara
2017-08-02
This study investigates chemical information of volatile fractions of high-quality cocoa (Theobroma cacao L. Malvaceae) from different origins (Mexico, Ecuador, Venezuela, Colombia, Java, Trinidad, and São Tomé) produced for fine chocolate. This study explores the evolution of the entire pattern of volatiles in relation to cocoa processing (raw, roasted, steamed, and ground beans). Advanced chemical fingerprinting (e.g., combined untargeted and targeted fingerprinting) with comprehensive two-dimensional gas chromatography coupled with mass spectrometry allows advanced pattern recognition for classification, discrimination, and sensory-quality characterization. The entire data set is analyzed for 595 reliable two-dimensional peak regions, including 130 known analytes and 13 potent odorants. Multivariate analysis with unsupervised exploration (principal component analysis) and simple supervised discrimination methods (Fisher ratios and linear regression trees) reveal informative patterns of similarities and differences and identify characteristic compounds related to sample origin and manufacturing step.
Zhao, Zhi -Jian; Kulkarni, Ambarish; Vilella, Laia; ...
2016-05-02
Selective oxidation of methane to methanol is one of the most difficult chemical processes to perform. A potential group of catalysts to achieve CH4 partial oxidation are Cu-exchanged zeolites mimicking the active structure of the enzyme methane monooxygenase. However, the details of this conversion, including the structure of the active site, are still under debate. In this contribution, periodic density functional theory (DFT) methods were employed to explore the molecular features of the selective oxidation of methane to methanol catalyzed by Cu-exchanged mordenite (Cu-MOR). We focused on two types of previously suggested active species, CuOCu and CuOOCu. Our calculations indicate that the formation of CuOCu is more feasible than that of CuOOCu. In addition, a much lower C–H dissociation barrier is located on the former active site, indicating that C–H bond activation is easily achieved with CuOCu. We calculated the energy barriers of all elementary steps for the entire process, including catalyst activation, CH4 activation, and CH3OH desorption. Finally, our calculations are in agreement with experimental observations and present the first theoretical study examining the entire process of selective oxidation of methane to methanol.
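As a hedged illustration of how DFT barriers like these are commonly compared, the Eyring equation converts a free-energy barrier into a rate constant. The 80 kJ/mol barrier and 200 °C temperature below are placeholders, not values reported in this study.

```python
import math

# Illustrative only: converting a hypothetical activation barrier into a
# rate constant with the Eyring equation (transmission coefficient = 1),
# as is commonly done when comparing computed C-H activation barriers.

K_B = 1.380649e-23    # Boltzmann constant, J/K
H = 6.62607015e-34    # Planck constant, J*s
R = 8.314462618       # gas constant, J/(mol*K)

def eyring_rate(barrier_j_per_mol: float, temperature_k: float) -> float:
    """Rate constant (1/s) for a unimolecular elementary step with the
    given free-energy barrier."""
    prefactor = K_B * temperature_k / H
    return prefactor * math.exp(-barrier_j_per_mol / (R * temperature_k))

k_473 = eyring_rate(80e3, 473.15)  # placeholder 80 kJ/mol barrier at 200 C
```

A lower barrier yields an exponentially larger rate constant at the same temperature, which is why the lower C–H dissociation barrier on CuOCu implies easier activation.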
The Nasa-Isro SAR Mission Science Data Products and Processing Workflows
NASA Astrophysics Data System (ADS)
Rosen, P. A.; Agram, P. S.; Lavalle, M.; Cohen, J.; Buckley, S.; Kumar, R.; Misra-Ray, A.; Ramanujam, V.; Agarwal, K. M.
2017-12-01
The NASA-ISRO SAR (NISAR) Mission is currently in the development phase and in the process of specifying its suite of data products and algorithmic workflows, responding to inputs from the NISAR Science and Applications Team. NISAR will provide raw data (Level 0), full-resolution complex imagery (Level 1), and interferometric and polarimetric image products (Level 2) for the entire data set, in both natural radar and geocoded coordinates. NASA and ISRO are coordinating the formats, meta-data layers, and algorithms for these products, for both the NASA-provided L-band radar and the ISRO-provided S-band radar. Higher level products will also be generated for the purpose of calibration and validation, over large areas of Earth, including tectonic plate boundaries, ice sheets and sea-ice, and areas of ecosystem disturbance and change. This level of comprehensive product generation has been unprecedented for SAR missions in the past, and leads to storage and processing challenges for the production system and the archive center. Further, recognizing the potential to support applications that require low latency product generation and delivery, the NISAR team is optimizing the entire end-to-end ground data system for such response, including exploring the advantages of cloud-based processing, algorithmic acceleration using GPUs, and on-demand processing schemes that minimize computational and transport costs, but allow rapid delivery to science and applications users. This paper will review the current products and workflows, and discuss the scientific and operational trade-space of mission capabilities.
ERIC Educational Resources Information Center
Center for Applied Linguistics, Washington, DC. Refugee Service Center.
This resettlement guide, entirely in Spanish, describes the initial stage of resettlement and the processes that refugees undergo as new arrivals. Subjects covered in this guide include pre-arrival procedures, admissions criteria, immigrant's statement of understanding, travel costs and U.S. Customs; resettlement procedures, immigrants'…
ERIC Educational Resources Information Center
Raabe, Richard; Gentile, Lisa
2008-01-01
A number of institutions have been, or are in the process of, modifying their biochemistry major to include some emphasis on the quantitative physical chemistry of biomolecules. Sometimes this is done as a replacement for part or all of the physical chemistry requirement, while at other institutions this is incorporated as a component into the…
ERIC Educational Resources Information Center
Coursen, David
Modern educators and playground designers are increasingly recognizing that play is a part, perhaps the decisive part, of the entire learning process. Theories of playground equipment design, planning the playground, financial considerations, and equipment suggestions are featured in this review. Examples of playgrounds include innovative…
Patient Activities Planning and Progress Noting a Humanistic Integrated-Team Approach.
ERIC Educational Resources Information Center
Muilenburg, Ted
This document outlines a system for planning recreation therapy, documenting progress, and relating the entire process to a team approach which includes patient assessment and involvement. The recreation program is seen as therapeutic, closely related to the total medical treatment program. The model is designed so that it can be adapted to almost…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This document is a review journal that covers significant developments in the field of nuclear safety. Its scope includes the analysis and control of hazards associated with nuclear energy, operations involving fissionable materials, and the products of nuclear fission and their effects on the environment. Primary emphasis is on safety in reactor design, construction, and operation; however, the safety aspects of the entire fuel cycle, including fuel fabrication, spent-fuel processing, nuclear waste disposal, handling of radioisotopes, and environmental effects of these operations, are also treated.
Method and apparatus for implementing material thermal property measurement by flash thermal imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Jiangang
A method and apparatus are provided for implementing measurement of material thermal properties, including measurement of the thermal effusivity of a coating and/or film or of a bulk material of uniform properties. The test apparatus includes an infrared camera; a data acquisition and processing computer coupled to the infrared camera for acquiring and processing thermal image data; and a flash lamp providing an input of heat onto the surface of a two-layer sample, with an enhanced optical filter covering the flash lamp to attenuate an entire infrared wavelength range. A series of thermal images is taken of the surface of the two-layer sample.
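Flash measurements of effusivity rest on the standard one-dimensional result for a semi-infinite solid, where the surface temperature rise after an instantaneous pulse Q decays as Q/(e√(πt)) with effusivity e = √(kρc). A minimal sketch follows; the material and pulse values are illustrative, not taken from the apparatus description above.

```python
import math

# Sketch of the 1D flash-thermography relation underlying effusivity
# measurement: dT(t) = Q / (e * sqrt(pi * t)) for a semi-infinite solid,
# with effusivity e = sqrt(k * rho * c). Values below are illustrative.

def effusivity(k: float, rho: float, c: float) -> float:
    """Thermal effusivity, W*s^0.5/(m^2*K), from conductivity k (W/m/K),
    density rho (kg/m^3), and specific heat c (J/kg/K)."""
    return math.sqrt(k * rho * c)

def surface_temp_rise(q: float, e: float, t: float) -> float:
    """Surface temperature rise (K) at time t (s) after an instantaneous
    absorbed pulse q (J/m^2)."""
    return q / (e * math.sqrt(math.pi * t))

e_steel = effusivity(k=45.0, rho=7850.0, c=490.0)  # typical carbon steel
dT = surface_temp_rise(q=5000.0, e=e_steel, t=0.1)
```

Fitting the measured dT(t) decay against this relation yields e; departures from the t^(-1/2) decay reveal the coating/substrate interface in a two-layer sample.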
Gebremikael, Mesfin Tsegaye; Steel, Hanne; Bert, Wim; Maenhout, Peter; Sleutel, Steven; De Neve, Stefaan
2015-01-01
To understand the roles of nematodes in organic matter (OM) decomposition, experimental setups should include the entire nematode community, the native soil microflora, and their food sources. Yet, published studies are often based either on simplified experimental setups, using only a few selected species of nematode and their respective prey, despite the multitude of species present in natural soil, or on indirect estimation of the mineralization process using O2 consumption and the fresh weight of nematodes. We set up a six-month incubation experiment to quantify the contribution of the entire free-living nematode community to carbon (C) mineralization under realistic conditions. The following treatments were compared with and without grass-clover amendment: defaunated soil reinoculated with the entire free-living nematode community (+Nem) and defaunated soil that was not reinoculated (-Nem). We also included untreated fresh soil as a control (CTR). Nematode abundance and diversity in +Nem were comparable to the CTR, showing the success of the reinoculation. No significant differences in C mineralization were found between +Nem and -Nem treatments of the amended and unamended samples at the end of incubation. Other related parameters such as microbial biomass C and enzymatic activities did not show significant differences between +Nem and -Nem treatments in either amended or unamended samples. These findings show that the collective contribution of the entire nematode community to C mineralization is small. Previous reports in the literature based on simplified experimental setups and indirect estimations contrast with the findings of the current study, and further investigations are needed to elucidate the extent and the mechanisms of nematode involvement in C mineralization. PMID:26393517
Efficient material decomposition method for dual-energy X-ray cargo inspection system
NASA Astrophysics Data System (ADS)
Lee, Donghyeon; Lee, Jiseoc; Min, Jonghwan; Lee, Byungcheol; Lee, Byeongno; Oh, Kyungmin; Kim, Jaehyun; Cho, Seungryong
2018-03-01
Dual-energy X-ray inspection systems are widely used today because they provide both X-ray attenuation contrast of the imaged object and its material information. Material decomposition capability allows a higher detection sensitivity for potential targets, for example purposely loaded impurities in agricultural product inspections and threats in security scans. Dual-energy X-ray transmission data can be transformed into two basis-material thickness data, and the accuracy of this transformation relies heavily on the calibration of the material decomposition process. The calibration process in general can be laborious and time consuming. Moreover, a conventional calibration method is often challenged by the nonuniform spectral characteristics of the X-ray beam across the entire field-of-view (FOV). In this work, we developed an efficient material decomposition calibration process for a linear accelerator (LINAC) based high-energy X-ray cargo inspection system. We also proposed a multi-spot calibration method to improve the decomposition performance throughout the entire FOV. Experimental validation of the proposed method was demonstrated using a cargo inspection system that supports 6 MV and 9 MV dual-energy imaging.
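The transformation from two log-attenuation measurements to two basis-material thicknesses can, in the idealized monoenergetic case, be sketched as a 2×2 linear solve. The attenuation coefficients below are hypothetical; a real system such as the one described calibrates for polychromatic, spatially varying spectra rather than using fixed coefficients.

```python
# Minimal sketch of dual-energy basis-material decomposition: two log
# attenuation measurements inverted for two basis-material thicknesses.
# Idealized monoenergetic model with hypothetical coefficients.

def decompose(p_low, p_high, mu):
    """Solve [p_low, p_high] = mu @ [t1, t2] for thicknesses (t1, t2).

    mu[i][j]: effective attenuation coefficient (1/cm) of basis material j
    at energy i (0 = low, 1 = high).
    """
    (a, b), (c, d) = mu
    det = a * d - b * c
    if abs(det) < 1e-12:
        raise ValueError("basis materials are not separable at these energies")
    t1 = (p_low * d - b * p_high) / det
    t2 = (a * p_high - p_low * c) / det
    return t1, t2

# Hypothetical coefficients for a plastic-like and a steel-like basis pair.
MU = ((0.20, 3.0),
      (0.18, 1.2))
p_low = 2.0 * 0.20 + 0.5 * 3.0    # simulated: 2 cm plastic + 0.5 cm steel
p_high = 2.0 * 0.18 + 0.5 * 1.2
t_plastic, t_steel = decompose(p_low, p_high, MU)
```

The multi-spot calibration idea in the abstract amounts to fitting such a (nonlinear, in practice) inversion separately at several positions across the FOV instead of assuming one beam spectrum everywhere.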
Watt, Stuart; Jiao, Wei; Brown, Andrew M K; Petrocelli, Teresa; Tran, Ben; Zhang, Tong; McPherson, John D; Kamel-Reid, Suzanne; Bedard, Philippe L; Onetto, Nicole; Hudson, Thomas J; Dancey, Janet; Siu, Lillian L; Stein, Lincoln; Ferretti, Vincent
2013-09-01
Using sequencing information to guide clinical decision-making requires coordination of a diverse set of people and activities. In clinical genomics, the process typically includes sample acquisition, template preparation, genome data generation, analysis to identify and confirm variant alleles, interpretation of clinical significance, and reporting to clinicians. We describe a software application developed within a clinical genomics study, to support this entire process. The software application tracks patients, samples, genomic results, decisions and reports across the cohort, monitors progress and sends reminders, and works alongside an electronic data capture system for the trial's clinical and genomic data. It incorporates systems to read, store, analyze and consolidate sequencing results from multiple technologies, and provides a curated knowledge base of tumor mutation frequency (from the COSMIC database) annotated with clinical significance and drug sensitivity to generate reports for clinicians. By supporting the entire process, the application provides deep support for clinical decision making, enabling the generation of relevant guidance in reports for verification by an expert panel prior to forwarding to the treating physician. Copyright © 2013 Elsevier Inc. All rights reserved.
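The stage tracking described can be sketched as a simple ordered-workflow record. This is an illustrative reconstruction of the idea, not the paper's actual software; the stage names and class below are hypothetical.

```python
# Illustrative sketch of per-sample stage tracking across a clinical
# genomics workflow (not the paper's actual application).

from dataclasses import dataclass, field

STAGES = [
    "sample_acquisition",
    "template_preparation",
    "genome_data_generation",
    "variant_analysis",
    "clinical_interpretation",
    "reporting",
]

@dataclass
class SampleRecord:
    patient_id: str
    completed: list = field(default_factory=list)

    def advance(self, stage: str) -> None:
        """Mark the next workflow stage complete; stages must run in order."""
        expected = STAGES[len(self.completed)]
        if stage != expected:
            raise ValueError(f"expected stage {expected!r}, got {stage!r}")
        self.completed.append(stage)

    @property
    def current_stage(self) -> str:
        if len(self.completed) < len(STAGES):
            return STAGES[len(self.completed)]
        return "done"

record = SampleRecord("PT-001")
record.advance("sample_acquisition")
```

Enforcing stage order per sample is what lets such a system monitor cohort progress and send reminders when a sample stalls between stages.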
Bayridge Secondary School: A Case Study of the Planning and Implementation of Educational Change.
ERIC Educational Resources Information Center
Eastabrook, Glen; And Others
This is an account of the planning and implementation processes of a new secondary school (Bayridge Secondary School), located in a suburban area of a medium-sized city in Ontario, Canada. This report traces the planning and development of the school's goals, which included involvement of the entire school community, from 1970 through 1974. The…
Aerospace Environmental Technology Conference
NASA Technical Reports Server (NTRS)
Whitaker, A. F. (Editor)
1995-01-01
The mandated elimination of CFC's, Halons, TCA, and other ozone depleting chemicals and specific hazardous materials has required changes and new developments in aerospace materials and processes. The aerospace industry has been involved for several years in providing product substitutions, redesigning entire production processes, and developing new materials that minimize or eliminate damage to the environment. These activities emphasize replacement cleaning solvents and their application verifications, compliant coatings including corrosion protection systems, and removal techniques, chemical propulsion effects on the environment, and the initiation of modifications to relevant processing and manufacturing specifications and standards. The Executive Summary of this Conference is published as NASA CP-3297.
Process engineering concerns in the lunar environment
NASA Technical Reports Server (NTRS)
Sullivan, T. A.
1990-01-01
The paper discusses the constraints on a production process imposed by the lunar or Martian environment on the space transportation system. A proposed chemical route to produce oxygen from iron oxide bearing minerals (including ilmenite) is presented in three different configurations which vary in complexity. A design for thermal energy storage is presented that could both provide power during the lunar night and act as a blast protection barrier for the outpost. A process to release carbon from the lunar regolith as methane is proposed, capitalizing on the greater abundance and favorable physical properties of methane relative to hydrogen to benefit the entire system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Silver, E G
This document is a review journal that covers significant developments in the field of nuclear safety. Its scope includes the analysis and control of hazards associated with nuclear energy, operations involving fissionable materials, and the products of nuclear fission and their effects on the environment. Primary emphasis is on safety in reactor design, construction, and operation; however, the safety aspects of the entire fuel cycle, including fuel fabrication, spent-fuel processing, nuclear waste disposal, handling of radioisotopes, and environmental effects of these operations, are also treated.
NASA Astrophysics Data System (ADS)
Aminatun, Putri, N. S. Efinda; Indriani, Arista; Himawati, Umi; Hikmawati, Dyah; Suhariningsih
2014-09-01
Cobalt-based alloys are widely used as total hip and knee replacements because of their excellent properties, such as corrosion resistance, fatigue strength, and biocompatibility. In this work, cobalt alloys with varied Cr content (28.5, 30, 31.5, 33, and 34.5 wt%) were synthesized by a smelting method, beginning with compaction followed by smelting in a Tri Arc Melting Furnace at 200 A. This was followed by a homogenization process at the recrystallization temperature (1250 °C) for 3 hours to allow the atoms to diffuse and transform into the γ phase. The next step was a rolling process accompanied by heating at 1200 °C for about 15 minutes, followed by quenching; this process was repeated until a thickness of about 1 mm was obtained. The evaluated material properties included microstructure, surface morphology, and hardness. The microstructure of the cobalt alloys across the Cr variations was dominated by the γ phase, giving all of the cobalt alloys high hardness. The surface morphology of all cobalt alloy samples indicated that the synthesis process achieved good solubility over the flat surface area. Hardness tests showed that all cobalt alloy samples had high hardness, but only the 33% Cr variation fell within the ASTM F75 range, at 345.24 VHN, making it a potential candidate for an implant prosthesis.
The intricate mechanisms of neurodegeneration in prion diseases
Soto, Claudio; Satani, Nikunj
2010-01-01
Prion diseases are a group of infectious neurodegenerative diseases with an entirely novel mechanism of transmission, involving a protein-only infectious agent that propagates the disease by transmitting protein conformational changes. The disease results from extensive and progressive brain degeneration. The molecular mechanisms involved in neurodegeneration are not entirely known but involve multiple processes operating simultaneously and synergistically in the brain, including spongiform degeneration, synaptic alterations, brain inflammation, neuronal death and the accumulation of protein aggregates. Here, we review the pathways implicated in prion-induced brain damage and put the pieces together into a possible model of neurodegeneration in prion disorders. A more comprehensive understanding of the molecular basis of brain degeneration is essential to develop a much needed therapy for these devastating diseases. PMID:20889378
Space processing applications rocket project. SPAR 8
NASA Technical Reports Server (NTRS)
Chassay, R. P. (Editor)
1984-01-01
The Space Processing Applications Rocket Project (SPAR) VIII Final Report contains the engineering report prepared at the Marshall Space Flight Center (MSFC) as well as the three reports from the principal investigators. These reports also describe pertinent portions of ground-based research leading to the ultimate selection of the flight sample composition, including design, fabrication, and testing, all of which are expected to contribute immeasurably to an improved comprehension of materials processing in space. This technical memorandum is directed entirely to the payload manifest flown in the eighth of a series of SPAR flights conducted at the White Sands Missile Range (WSMR) and includes the experiments entitled Glass Formation Experiment SPAR 74-42/1R, Glass Fining Experiment in Low-Gravity SPAR 77-13/1, and Dynamics of Liquid Bubbles SPAR Experiment 77-18/2.
Design and performance study of an orthopaedic surgery robotized module for automatic bone drilling.
Boiadjiev, George; Kastelov, Rumen; Boiadjiev, Tony; Kotev, Vladimir; Delchev, Kamen; Zagurski, Kazimir; Vitkov, Vladimir
2013-12-01
Many orthopaedic operations involve drilling and tapping before the insertion of screws into a bone. This drilling is usually performed manually, thus introducing many problems. These include attaining a specific drilling accuracy, preventing blood vessels from breaking, and minimizing drill oscillations that would widen the hole. Bone overheating is the most important problem. To avoid such problems and reduce the subjective factor, automated drilling is recommended. Because numerous parameters influence the drilling process, this study examined some experimental methods. These concerned the experimental identification of technical drilling parameters, including the bone resistance force and temperature in the drilling process. During the drilling process, the following parameters were monitored: time, linear velocity, angular velocity, resistance force, penetration depth, and temperature. Specific drilling effects were revealed during the experiments. The accuracy was improved at the starting point of the drilling, and the error for the entire process was less than 0.2 mm. The temperature deviations were kept within tolerable limits. The results of various experiments with different drilling velocities, drill bit diameters, and penetration depths are presented in tables, as well as the curves of the resistance force and temperature with respect to time. Real-time digital indications of the progress of the drilling process are shown. Automatic bone drilling could entirely solve the problems that usually arise during manual drilling. An experimental setup was designed to identify bone drilling parameters such as the resistance force arising from variable bone density, appropriate mechanical drilling torque, linear speed of the drill, and electromechanical characteristics of the motors, drives, and corresponding controllers. Automatic drilling guarantees greater safety for the patient. 
Moreover, the robot presented is user-friendly because it is simple to set robot tasks, and process data are collected in real time. Copyright © 2013 John Wiley & Sons, Ltd.
Reference Architecture Model Enabling Standards Interoperability.
Blobel, Bernd
2017-01-01
Advanced health and social services paradigms are supported by a comprehensive set of domains managed by different scientific disciplines. Interoperability has to evolve beyond information and communication technology (ICT) concerns, including the real world business domains and their processes, but also the individual context of all actors involved. So, the system must properly reflect the environment in front of and around the computer as an essential and even defining part of the health system. This paper introduces an ICT-independent system-theoretical, ontology-driven reference architecture model allowing the representation and harmonization of all domains involved, including the transformation into an appropriate ICT design and implementation. The entire process is completely formalized and can therefore be fully automated.
Website Redesign: A Case Study.
Wu, Jin; Brown, Janis F
2016-01-01
A library website redesign is a complicated and at times arduous task, requiring many different steps including determining user needs, analyzing past user behavior, examining other websites, defining design preferences, testing, marketing, and launching the site. Many different types of expertise are required over the entire process. Lessons learned from the Norris Medical Library's experience with the redesign effort may be useful to others undertaking a similar project.
NASA Astrophysics Data System (ADS)
Wright, Ashley J.; Walker, Jeffrey P.; Pauwels, Valentijn R. N.
2017-08-01
Floods are devastating natural hazards. To provide accurate, precise, and timely flood forecasts, there is a need to understand the uncertainties associated with an entire rainfall time series, even when rainfall was not observed. The estimation of an entire rainfall time series and model parameter distributions from streamflow observations in complex dynamic catchments adds skill to current areal rainfall estimation methods, allows the uncertainty of the entire rainfall input time series to be considered when estimating model parameters, and provides the ability to improve rainfall estimates from poorly gauged catchments. Current methods to estimate entire rainfall time series from streamflow records are unable to adequately invert complex nonlinear hydrologic systems. This study aims to explore the use of wavelets in the estimation of rainfall time series from streamflow records. Using the Discrete Wavelet Transform (DWT) to reduce rainfall dimensionality for the catchment of Warwick, Queensland, Australia, it is shown that model parameter distributions and an entire rainfall time series can be estimated. Including rainfall in the estimation process improves streamflow simulations by a factor of up to 1.78. This is achieved while estimating an entire rainfall time series, inclusive of days when none was observed. It is shown that the choice of wavelet can have a considerable impact on the robustness of the inversion. Combining the use of a likelihood function that considers rainfall and streamflow errors with the use of the DWT as a model data reduction technique allows the joint inference of hydrologic model parameters along with rainfall.
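The dimensionality-reduction role of the DWT can be illustrated with a single Haar level, which halves the number of unknowns while remaining exactly invertible. The study used more general wavelets; this pure-Python Haar version is only a minimal stand-in under that simplification.

```python
# One Haar DWT level as a data-reduction sketch: a rainfall series of n
# values becomes n/2 approximation + n/2 detail coefficients; dropping
# the details keeps a coarse, still-invertible summary of the series.

def haar_forward(series):
    """One Haar level: (approximation, detail). len(series) must be even."""
    approx = [(series[i] + series[i + 1]) / 2 for i in range(0, len(series), 2)]
    detail = [(series[i] - series[i + 1]) / 2 for i in range(0, len(series), 2)]
    return approx, detail

def haar_inverse(approx, detail):
    """Exact reconstruction from one Haar level."""
    series = []
    for a, d in zip(approx, detail):
        series.extend([a + d, a - d])
    return series

rain = [0.0, 0.0, 12.5, 7.5, 0.0, 3.0, 1.0, 0.0]   # toy daily rainfall, mm
approx, detail = haar_forward(rain)                # 8 unknowns -> 4 + 4
reduced = haar_inverse(approx, [0.0] * 4)          # details dropped: smoothed
```

Estimating only the retained coefficients (rather than every daily value) is what makes inverting rainfall from streamflow tractable, at the cost of the smoothing seen in `reduced`.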
NASA Technical Reports Server (NTRS)
Tilmes, Curt A.; Fleig, Albert J.
2008-01-01
NASA's traditional science data processing systems have focused on specific missions, providing data access, processing, and services to the funded science teams of those specific missions. Recently NASA has been modifying this stance, changing the focus from Missions to Measurements. Where a specific Mission has a discrete beginning and end, the Measurement considers long term data continuity across multiple missions. Total Column Ozone, a critical measurement of atmospheric composition, has been monitored for decades on a series of Total Ozone Mapping Spectrometer (TOMS) instruments. Some important European missions also monitor ozone, including the Global Ozone Monitoring Experiment (GOME) and SCIAMACHY. With the U.S./European cooperative launch of the Dutch Ozone Monitoring Instrument (OMI) on NASA's Aura satellite, and the GOME-2 instrument on MetOp, the ozone monitoring record has been further extended. In conjunction with the U.S. Department of Defense (DoD) and the National Oceanic and Atmospheric Administration (NOAA), NASA is now preparing to evaluate data and algorithms for the next generation Ozone Mapping and Profiler Suite (OMPS), which will launch on the National Polar-orbiting Operational Environmental Satellite System (NPOESS) Preparatory Project (NPP) in 2010. NASA is constructing the Science Data Segment (SDS), which is comprised of several elements to evaluate the various NPP data products and algorithms. The NPP SDS Ozone Product Evaluation and Test Element (PEATE) will build on the heritage of the TOMS and OMI mission-based processing systems. The overall measurement-based system that will encompass these efforts is the Atmospheric Composition Processing System (ACPS). We have extended the system to include access to publicly available data sets from other instruments where feasible, including non-NASA missions as appropriate.
The heritage system was largely monolithic, providing a very controlled processing flow from ingest of satellite data to the ultimate archive of specific operational data products. The ACPS allows more open access with standard protocols including HTTP, SOAP/XML, RSS, and various REST incarnations. External entities can be granted access to various modules within the system, including an extended data archive, metadata searching, production planning, and processing. Data access is provided with very fine-grained access control. It is possible to easily designate certain datasets as being available to the public, restricted to groups of researchers, or limited strictly to the originator. This can be used, for example, to release one's best validated data to the public, but restrict the "new version" of data processed with a new, unproven algorithm until it is ready. Similarly, the system can provide access to algorithms, both as modifiable source code (where possible) and as fully integrated executable Algorithm Plugin Packages (APPs). This enables researchers to download publicly released versions of the processing algorithms and easily reproduce the processing remotely, while interacting with the ACPS. The algorithms can be modified, allowing better experimentation and rapid improvement. The modified algorithms can be easily integrated back into the production system for large-scale bulk processing to evaluate improvements. The system includes complete provenance tracking of algorithms, data, and the entire processing environment. The origin of any data or algorithms is recorded, and the entire history of the processing chains is stored so that a researcher can understand the entire data flow. Provenance is captured in a form suitable for the system to guarantee scientific reproducibility of any data product it distributes, even in cases where the physical data products themselves have been deleted due to space constraints.
We are currently working on Semantic Web ontologies for representing the various provenance information. A new web site focusing on consolidating information about the measurement, processing system, and data access has been established to encourage interaction with the overall scientific community. We will describe the system, its data processing capabilities, and the methods the community can use to interact with its standard interfaces.
Reengineering outcomes management: an integrated approach to managing data, systems, and processes.
Neuman, K; Malloch, K; Ruetten, V
1999-01-01
The integration of outcomes management into organizational reengineering projects is often overlooked or marginalized in proportion to the entire project. Incorporation of an integrated outcomes management program strengthens the overall quality of reengineering projects and enhances their sustainability. This article presents a case study in which data, systems, and processes were reengineered to form an effective Outcomes Management program as a component of the organization's overall project. The authors describe eight steps to develop and monitor an integrated outcomes management program. An example of an integrated report format is included.
Development and implementation of an interdisciplinary plan of care.
Lewis, Cynthia; Hoffmann, Mary Lou; Gard, Angela; Coons, Jacqueline; Bichinich, Pat; Euclid, Jeff
2005-01-01
In January 2002 Aurora Health Care Metro Region chartered an interdisciplinary team to develop a process and structure for patient-centered interdisciplinary care planning. This unique endeavor created a process that includes the patient, family, and all clinical disciplines involved in planning and providing care to patients from system point of entry throughout the entire acute care episode. The interdisciplinary plan of care (IPOC) demonstrates the integration of prioritized problems, outcomes, and measurement toward goal attainment. This article focuses on the journey of this team to the successful implementation of an IPOC.
Aerospace Environmental Technology Conference: Executive summary
NASA Technical Reports Server (NTRS)
Whitaker, A. F. (Editor)
1995-01-01
The mandated elimination of CFCs, Halons, TCA, and other ozone-depleting chemicals and specific hazardous materials has required changes and new developments in aerospace materials and processes. The aerospace industry has been involved for several years in providing product substitutions, redesigning entire production processes, and developing new materials that minimize or eliminate damage to the environment. These activities emphasize replacement cleaning solvents and their application verifications; compliant coatings, including corrosion protection systems, and removal techniques; chemical propulsion effects on the environment; and the initiation of modifications to relevant processing and manufacturing specifications and standards. The papers from this conference are being published in a separate volume as NASA CP-3298.
NASA Technical Reports Server (NTRS)
Sewell, James S.; Bozada, Christopher A.
1994-01-01
Advanced radar and communication systems rely heavily on state-of-the-art microelectronics. Systems such as phased-array radar require many transmit/receive (T/R) modules, which are made up of many millimeter-wave/microwave integrated circuits (MMICs). The heart of an MMIC chip is the Gallium Arsenide (GaAs) field-effect transistor (FET). The transistor gate length is the critical feature that determines the operating frequency of the radar system. A smaller gate length will typically result in a higher frequency. In order to make a phased-array radar system economically feasible, manufacturers must be capable of producing very large quantities of small-gate-length MMIC chips at a relatively low cost per chip. This requires the processing of a large number of wafers with a large number of chips per wafer, minimum processing time, and a very high chip yield. One of the bottlenecks in the fabrication of MMIC chips is the transistor gate definition. The definition of sub-half-micron gates for GaAs-based field-effect transistors is generally performed by direct-write electron beam lithography (EBL). Because of the throughput limitations of EBL, the gate-layer fabrication is conventionally divided into two lithographic processes, where EBL is used to generate the gate fingers and optical lithography is used to generate the large-area gate pads and interconnects. As a result, two complete sequences of resist application, exposure, development, metallization, and lift-off are required for the entire gate structure. We have baselined a hybrid process, referred to as EBOL (electron beam/optical lithography), in which a single application of a multi-level resist is used for both exposures. The entire gate structure (gate fingers, interconnects, and pads) is then formed with a single metallization and lift-off process.
The EBOL process thus retains the advantages of the high-resolution E-beam lithography and the high throughput of optical lithography while essentially eliminating an entire lithography/metallization/lift-off process sequence. This technique has been proven to be reliable for both trapezoidal and mushroom gates and has been successfully applied to metal-semiconductor and high-electron-mobility field-effect transistor (MESFET and HEMT) wafers containing devices with gate lengths down to 0.10 micron and 75 x 75 micron gate pads. The yields and throughput of these wafers have been very high with no loss in device performance. We will discuss the entire EBOL process technology including the multilayer resist structure, exposure conditions, process sensitivities, metal edge definition, device results, comparison to the standard gate-layer process, and its suitability for manufacturing.
Distraction and mind-wandering under load.
Forster, Sophie
2013-01-01
Attention research over the last several decades has provided rich insights into the determinants of distraction, including distractor characteristics, task features, and individual differences. Load Theory represented a particularly important breakthrough, highlighting the critical role of the level and nature of task load in determining both the efficiency of distractor rejection and the stage of processing at which this occurs. However, until recently, studies of distraction were restricted to those measuring rather specific forms of distraction by external stimuli which, I argue, although intended to be irrelevant, were in fact task-relevant. In daily life, attention may be distracted by a wide range of stimuli, which may often be entirely unrelated to any task being performed, and may include not only external stimuli but also internally generated stimuli such as task-unrelated thoughts. This review outlines recent research examining these more general, entirely task-irrelevant forms of distraction within the framework of Load Theory. I discuss the relation between different forms of distraction, and the universality of load effects across different distractor types and individuals.
[Improving blood safety: errors management in transfusion medicine].
Bujandrić, Nevenka; Grujić, Jasmina; Krga-Milanović, Mirjana
2014-01-01
The concept of blood safety includes the entire transfusion chain, starting with the collection of blood from the blood donor and ending with blood transfusion to the patient. The concept involves a quality management system with systematic monitoring of adverse reactions and incidents regarding the blood donor or patient. Monitoring of near-miss errors shows the critical points in the working process and increases transfusion safety. The aim of the study was to present the analysis results of adverse and unexpected events in transfusion practice with a potential risk to the health of blood donors and patients. A one-year retrospective study was based on the collection, analysis, and interpretation of written reports on medical errors in the Blood Transfusion Institute of Vojvodina. Errors were distributed according to the type, frequency, and part of the working process where they occurred. Possible causes and corrective actions were described for each error. The study showed that there were no errors with potential health consequences for the blood donor/patient. Errors with potentially damaging consequences for patients were detected throughout the entire transfusion chain. Most of the errors were identified in the preanalytical phase. The human factor was responsible for the largest number of errors. The error reporting system has an important role in error management and in reducing the transfusion-related risk of adverse events and incidents. The ongoing analysis reveals the strengths and weaknesses of the entire process and indicates the necessary changes. Errors in transfusion medicine can be avoided in a large percentage of cases, and prevention is cost-effective, systematic, and applicable.
Gao, Yi; Wei, Jiankai; Yuan, Jianbo; Zhang, Xiaojun; Li, Fuhua; Xiang, Jianhai
2017-04-24
Exoskeleton construction is an important issue in shrimp. To better understand the molecular mechanism of exoskeleton formation, development, and reconstruction, the transcriptome of the entire developmental process in Litopenaeus vannamei, including nine early developmental stages and eight adult-moulting stages, was sequenced and analysed using Illumina RNA-seq technology. A total of 117,539 unigenes were obtained, with 41.2% of unigenes predicted to contain the full-length coding sequence. Gene Ontology, Clusters of Orthologous Groups (COG), and Kyoto Encyclopedia of Genes and Genomes (KEGG) analysis and functional annotation of all unigenes gave a better understanding of the exoskeleton developmental process in L. vannamei. As a result, more than six hundred unigenes related to exoskeleton development were identified both in the early developmental stages and in adult-moulting. A cascade of sequential expression events of exoskeleton-related genes was summarized, including exoskeleton formation, regulation, synthesis, degradation, mineral absorption/reabsorption, calcification, and hardening. This new insight into major transcriptional events provides a deep understanding of exoskeleton formation and reconstruction in L. vannamei. In conclusion, this is the first study to characterize integrated transcriptomic profiles covering the entire exoskeleton development from zygote to adult-moulting in a crustacean, and these findings will serve as significant references for exoskeleton developmental biology and aquaculture research.
The Need for V&V in Reuse-Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward A.
1997-01-01
V&V is currently performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment, addressing the entire domain or product line rather than a specific application. This framework includes the activities of traditional application-level V&V, and extends these activities into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.
Platform for Post-Processing Waveform-Based NDE
NASA Technical Reports Server (NTRS)
Roth, Don J.
2010-01-01
Signal- and image-processing methods are commonly needed to extract information from waveforms, improve the resolution of an image, and highlight defects in it. Since some similarity exists among all waveform-based nondestructive evaluation (NDE) methods, a common software platform containing multiple signal- and image-processing techniques makes sense where multiple techniques, scientists, engineers, and organizations are involved. NDE Wave & Image Processor Version 2.0 software provides a single, integrated signal- and image-processing and analysis environment for total NDE data processing and analysis. It brings some of the most useful algorithms developed for NDE over the past 20 years into a commercial-grade product. The software can import signal/spectroscopic data, image data, and image series data. It offers the user hundreds of basic and advanced signal- and image-processing capabilities, including 1D and 2D wavelet-based de-noising, de-trending, and filtering. Batch processing is included for signal- and image-processing capability so that an optimized sequence of processing operations can be applied to entire folders of signals, spectra, and images. Additionally, an extensive interactive model-based curve-fitting facility has been included to allow fitting of spectroscopy data, such as that from Raman spectroscopy. An extensive joint time-frequency module is included for analysis of non-stationary or transient data such as that from acoustic emission, vibration, or earthquake measurements.
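The wavelet de-noising capability mentioned above can be sketched as follows, assuming the PyWavelets library; the universal-threshold rule, wavelet choice, and synthetic test signal are illustrative assumptions, not details of the NDE Wave & Image Processor itself.

```python
import numpy as np
import pywt

def denoise(signal, wavelet="db4", level=3):
    """Soft-threshold wavelet de-noising with a universal threshold."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Noise level estimated from the finest detail band (robust MAD estimator).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(signal.size))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)

rng = np.random.default_rng(1)
clean = np.sin(np.linspace(0, 8 * np.pi, 512))
noisy = clean + 0.3 * rng.standard_normal(512)
smoothed = denoise(noisy)
print(f"noisy MSE {np.mean((noisy - clean) ** 2):.4f} -> "
      f"denoised MSE {np.mean((smoothed[:512] - clean) ** 2):.4f}")
```

In a batch-processing setting, the same `denoise` call would simply be applied to every file in a folder as one step of a saved processing sequence.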
Monte Carlo simulation of efficient data acquisition for an entire-body PET scanner
NASA Astrophysics Data System (ADS)
Isnaini, Ismet; Obi, Takashi; Yoshida, Eiji; Yamaya, Taiga
2014-07-01
Conventional PET scanners image the whole body using many bed positions. An entire-body PET scanner with an extended axial field-of-view (FOV), which can trace whole-body uptake dynamically and improve sensitivity, has therefore been desired. Such a scanner must process a large amount of data effectively; as a result, it has high dead time at the multiplexed detector grouping process, and it has many oblique lines-of-response. In this work, we study efficient data acquisition for the entire-body PET scanner using Monte Carlo simulation. The simulated entire-body PET scanner, based on depth-of-interaction detectors, has a 2016-mm axial FOV and an 80-cm ring diameter. Since the entire-body PET scanner has higher single-data loss than a conventional PET scanner at the grouping circuits, its noise-equivalent count rate (NECR) decreases. However, single-data loss is mitigated by separating the axially arranged detectors into multiple parts. Our choice of 3 groups of axially arranged detectors was shown to increase the peak NECR by 41%. An appropriate choice of maximum ring difference (MRD) also maintains high sensitivity and high peak NECR while reducing the data size. The extremely oblique lines of response for a large axial FOV do not contribute much to the performance of the scanner: the total sensitivity with full MRD increased by only 15% over that with about half MRD, and the peak NECR was saturated at about half MRD. The entire-body PET scanner promises to provide a large axial FOV and sufficient performance without using the full data.
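The NECR figure of merit cited in the abstract can be computed from simulated coincidence rates; a minimal sketch, using the common delayed-window form NECR = T^2 / (T + S + 2R), is shown below. The count-rate values are hypothetical, not from this study.

```python
def necr(trues, scatters, randoms):
    """Noise-equivalent count rate (counts/s) from true, scattered and
    random coincidence rates, delayed-window randoms correction assumed."""
    total = trues + scatters + 2.0 * randoms
    return trues ** 2 / total if total > 0 else 0.0

# Hypothetical rates (counts/s) at one activity level:
t, s, r = 120e3, 40e3, 60e3
print(f"NECR = {necr(t, s, r) / 1e3:.1f} kcps")
```

In a Monte Carlo study such as this one, `necr` would be evaluated across activity levels for each detector-grouping and MRD configuration, and the peak of each curve compared.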
Crane 55 at Drydock No. 2. View includes entire bone. Building 43 is in background - Puget Sound Naval Shipyard, Portal Gantry Crane No. 55, Central Industrial Area, Farragut Avenue, Bremerton, Kitsap County, WA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patel, Amit, E-mail: amitrp@iitrpr.ac.in; Faculty of Technology and Engineering, The Maharaja Sayajirao University of Baroda, Vadodara 390001, Gujarat; Sarkar, Prabir
The environmental impact assessment of a process over its entire operational lifespan is an important issue. Estimation of life cycle emission helps in predicting the contribution of a given process to abate (or to pollute) the environmental emission scenario. Considering the diminishing and time-dependent effect of emission, assessment of the overall effect of emissions is very complex. The paper presents a generalized methodology for arriving at a single emission discounting number for a process option, using the concept of the time value of carbon emission flow. This number incorporates the effect of the emission resulting from the process over its entire operational lifespan. The advantage of this method is its quantitative aspect as well as its flexible nature; it can be applied to any process. The method is demonstrated with the help of an Intermediate Pyrolysis process used to generate off-grid electricity, opting for the biochar route for disposing of straw residue. Scenarios ranging from very high net emission to very high net carbon sequestration were generated by careful selection of process parameters. For these different scenarios, the process discounting rate was determined and its outcome is discussed. The paper also proposes a process-specific eco-label that mentions the discounting rates. - Highlights: • A methodology to obtain the emission discounting rate for a process is proposed. • The method converts all components of life cycle emission into a time-dependent discounting number. • A case study of Intermediate Pyrolysis is used to obtain such numbers for a range of processes. • The method is useful to determine whether the operation of a process will lead to a net absorption or a net accumulation of emission in the environment.
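The "time value of emission" idea can be sketched by analogy with an internal rate of return: discount a yearly net-emission stream (positive = emission, negative = sequestration) and solve for the rate at which it balances to zero. This is a minimal illustration of the concept only; the flow values and the bisection solver are assumptions, not the paper's actual methodology.

```python
def net_present_emission(flows, rate):
    """Discounted sum of a yearly net-emission stream (year 0 first)."""
    return sum(e / (1.0 + rate) ** t for t, e in enumerate(flows))

def discounting_rate(flows, lo=0.0, hi=1.0):
    """Bisection for the rate at which the discounted stream nets to zero.
    Assumes an early emission followed by later sequestration, so the
    discounted sum increases monotonically with the rate."""
    for _ in range(100):
        mid = (lo + hi) / 2.0
        if net_present_emission(flows, mid) > 0.0:
            hi = mid  # too much weight on the early emission: lower the rate
        else:
            lo = mid
    return (lo + hi) / 2.0

# Hypothetical process: 100 t emitted up front, 30 t/yr sequestered for 5 years.
flows = [100.0, -30.0, -30.0, -30.0, -30.0, -30.0]
rate = discounting_rate(flows)
print(f"balancing discount rate: {rate:.3%}")
```

A process whose stream nets negative even at high rates would be a net carbon sink over its lifespan; the single rate plays the role of the paper's emission discounting number.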
A Theory for the Function of the Spermaceti Organ of the Sperm Whale (Physeter Catodon L.)
NASA Technical Reports Server (NTRS)
Norris, K. S.; Harvey, G. W.
1972-01-01
The function of the spermaceti organ of the sperm whale is studied using a model of its acoustic system. Suggested functions of the system include: (1) action as an acoustic resonating and sound-focussing chamber to form and process burst-pulsed clicks; (2) use of nasal passages in the forehead for repeated recycling of air for phonation during dives and to provide mirrors for sound reflection and signal processing; and (3) use of the entire system to allow sound signal production especially useful for long-range echolocation in the deep sea.
Applying simulation to optimize plastic molded optical parts
NASA Astrophysics Data System (ADS)
Jaworski, Matthew; Bakharev, Alexander; Costa, Franco; Friedl, Chris
2012-10-01
Optical injection molded parts are used in many different industries including electronics, consumer, medical and automotive due to their cost and performance advantages compared to alternative materials such as glass. The injection molding process, however, induces elastic (residual stress) and viscoelastic (flow orientation stress) deformation into the molded article which alters the material's refractive index to be anisotropic in different directions. Being able to predict and correct optical performance issues associated with birefringence early in the design phase is a huge competitive advantage. This paper reviews how to apply simulation analysis of the entire molding process to optimize manufacturability and part performance.
Performing Verification and Validation in Reuse-Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward A.
1999-01-01
The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.
NASA Technical Reports Server (NTRS)
Orr, James K.
2010-01-01
This presentation has shown the accomplishments of the PASS project over three decades and highlighted the lessons learned. Over the entire time, our goal has been to continuously improve our process, implement automation for both quality and increased productivity, and identify and remove all defects due to prior execution of a flawed process in addition to improving our processes following identification of significant process escapes. Morale and workforce instability have been issues, most significantly during 1993 to 1998 (period of consolidation in aerospace industry). The PASS project has also consulted with others, including the Software Engineering Institute, so as to be an early evaluator, adopter, and adapter of state-of-the-art software engineering innovations.
Synthetic Genome Recoding: New genetic codes for new features
Kuo, James; Stirling, Finn; Lau, Yu Heng; Shulgina, Yekaterina; Way, Jeffrey C.; Silver, Pamela A.
2018-01-01
Full genome recoding, or rewriting codon meaning, through chemical synthesis of entire bacterial chromosomes has become feasible in the past several years. Recoding an organism can impart new properties including non-natural amino acid incorporation, virus resistance, and biocontainment. The estimated cost of construction that includes DNA synthesis, assembly by recombination, and troubleshooting, is now comparable to costs of early stage development of drugs or other high-tech products. Here we discuss several recently published assembly methods and provide some thoughts on the future, including how synthetic efforts might benefit from analysis of natural recoding processes and organisms that use alternative genetic codes. PMID:28983660
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bradley, E.C.; Killough, S.M.; Rowe, J.C.
The purpose of the Smart Crane Ammunition Transfer System (SCATS) project is to demonstrate robotic/telerobotic controls technology for a mobile articulated crane for missile/munitions handling, delivery, and reload. Missile resupply and reload have been manually intensive operations up to this time. Currently, reload missiles are delivered by truck to the site of the launcher. A crew of four to five personnel reloads the missiles from the truck to the launcher using a hydraulic-powered crane. The missiles are handled carefully for the safety of the missiles and personnel. Numerous steps are required in the reload process, and the entire reload operation can take over 1 h for some missile systems. Recent U.S. Army directives require the entire operation to be accomplished in a fraction of that time. Current requirements for the development of SCATS are based primarily on reloading Patriot missiles. The planned development approach will integrate robotic control and sensor technology with a commercially available hydraulic articulated crane. SCATS is being developed with commercially available hardware as much as possible. Development plans include adding a 3-D.F. end effector with a grapple to the articulating crane; closed-loop position control for the crane and end effector; digital microprocessor control of crane functions; a simplified operator interface; and operating modes that include rectilinear movement, obstacle avoidance, and partially automated operation. The planned development will include progressive technology demonstrations. Ultimate plans are for this technology to be transferred and utilized in the military fielding process.
Business Models for Cost Sharing & Capability Sustainment
2012-08-18
...digital technology into existing mechanical products and their supporting processes can only work correctly if the firm carrying it out changes its entire...
Why Things Are So Bad for the Computer-Naive User
1975-03-01
knowledge of human communication that we need. Many of the things that people do in communication, including the entire list indicated above, are not...making direct use of computers feasible and comfortable for broad classes of people, research in modeling human communication processes deserves a far...higher national priority than it currently has. There are a few active research projects that are building the right kinds of models of human
Alon, Sigal
2015-07-01
This study demonstrates the analytical leverage gained from considering the entire college pipeline, including the application, admission, and graduation stages, in examining the economic position of various groups upon labor market entry. The findings, based on data from three elite universities in Israel, reveal that the process that shapes economic inequality between different ethnic and immigrant groups is not necessarily cumulative. Field-of-study stratification does not expand systematically from stage to stage, and the position of groups in the field-of-study hierarchy at each stage is not entirely explained by academic preparation. Differential selection and attrition processes, as well as ambition and aspirations, also shape the position of ethnic groups in the earnings hierarchy and generate a non-cumulative pattern. These findings suggest that a cross-sectional assessment of field-of-study inequality at the graduation stage can generate misleading conclusions about group-based economic inequality among workers with a bachelor's degree. Copyright © 2015 Elsevier Inc. All rights reserved.
48 CFR 852.236-83 - Payments under fixed-price construction contracts (including NAS).
Code of Federal Regulations, 2011 CFR
2011-10-01
... (CPM) network. (4) The CPM network shall include a separate cost loaded activity for adjusting and... (cold, constant temperature) 5 Entire air-conditioning system (Specified under 600 Sections) 5 Entire... equipment as are approved by the resident engineer for storage will be included. (3) Such materials and/or...
2014-05-01
There are several types of planning processes and plans, including strategic, operational, tactical, and contingency. For this document, operational planning includes tactical planning. This chapter examines the strategic planning process and includes an introduction into disaster response plans. "A strategic plan is an outline of steps designed with the goals of the entire organisation as a whole in mind, rather than with the goals of specific divisions or departments". Strategic planning includes all measures taken to provide a broad picture of what must be achieved and in which order, including how to organise a system capable of achieving the overall goals. Strategic planning often is done pre-event, based on previous experience and expertise. The strategic planning for disasters converts needs into a strategic plan of action. Strategic plans detail the goals that must be achieved. The process of converting needs into plans has been deconstructed into its components and includes consideration of: (1) disaster response plans; (2) interventions underway or planned; (3) available resources; (4) current status vs. pre-event status; (5) history and experience of the planners; and (6) access to the affected population. These factors are tempered by the local: (a) geography; (b) climate; (c) culture; (d) safety; and (e) practicality. The planning process consumes resources (costs). All plans must be adapted to the actual conditions--things never happen exactly as planned.
An assessment of space shuttle flight software development processes
NASA Technical Reports Server (NTRS)
1993-01-01
In early 1991, the National Aeronautics and Space Administration's (NASA's) Office of Space Flight commissioned the Aeronautics and Space Engineering Board (ASEB) of the National Research Council (NRC) to investigate the adequacy of the current process by which NASA develops and verifies changes and updates to the Space Shuttle flight software. The Committee for Review of Oversight Mechanisms for Space Shuttle Flight Software Processes was convened in Jan. 1992 to accomplish the following tasks: (1) review the entire flight software development process from the initial requirements definition phase to final implementation, including object code build and final machine loading; (2) review and critique NASA's independent verification and validation process and mechanisms, including NASA's established software development and testing standards; (3) determine the acceptability and adequacy of the complete flight software development process, including the embedded validation and verification processes through comparison with (1) generally accepted industry practices, and (2) generally accepted Department of Defense and/or other government practices (comparing NASA's program with organizations and projects having similar volumes of software development, software maturity, complexity, criticality, lines of code, and national standards); (4) consider whether independent verification and validation should continue. An overview of the study, independent verification and validation of critical software, and the Space Shuttle flight software development process are addressed. Findings and recommendations are presented.
Ponce, David A.
1997-01-01
Gravity data for the entire state of Nevada and adjacent parts of California, Utah, and Arizona are available on this CD-ROM. About 80,000 gravity stations were compiled primarily from the National Geophysical Data Center and the U.S. Geological Survey. Gravity data were reduced to the Geodetic Reference System of 1967 and adjusted to the Gravity Standardization Net 1971 gravity datum. Data were processed to complete Bouguer and isostatic gravity anomalies by applying standard gravity corrections, including terrain and isostatic corrections. Selected principal fact references and a list of sources for data from the National Geophysical Data Center are included.
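The standard gravity corrections mentioned above can be illustrated with a minimal sketch of a simple Bouguer reduction. This is not the dataset's actual processing chain (which also applied terrain and isostatic corrections); it uses the conventional free-air gradient and infinite-slab approximations, and the station values are hypothetical.

```python
# Sketch of a simple Bouguer reduction (illustrative only; the actual
# compilation also applied terrain and isostatic corrections).

def free_air_correction_mgal(elev_m):
    # Conventional free-air gradient: about 0.3086 mGal per meter of elevation.
    return 0.3086 * elev_m

def bouguer_slab_correction_mgal(elev_m, density_g_cm3=2.67):
    # Infinite-slab correction 2*pi*G*rho*h, approximately
    # 0.04193 * rho * h in mGal (rho in g/cm^3, h in m).
    return 0.04193 * density_g_cm3 * elev_m

def simple_bouguer_anomaly(g_obs_mgal, g_theoretical_mgal, elev_m,
                           density_g_cm3=2.67):
    # Observed minus theoretical gravity, corrected for station elevation
    # (free-air) and for the mass of rock between station and datum (slab).
    return (g_obs_mgal - g_theoretical_mgal
            + free_air_correction_mgal(elev_m)
            - bouguer_slab_correction_mgal(elev_m, density_g_cm3))

# Hypothetical Basin and Range station at 1500 m elevation.
anomaly = simple_bouguer_anomaly(g_obs_mgal=979000.0,
                                 g_theoretical_mgal=979100.0,
                                 elev_m=1500.0)
print(round(anomaly, 2))  # 194.97
```

The standard reduction density of 2.67 g/cm³ is the conventional crustal value used in most regional compilations.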
NASA Astrophysics Data System (ADS)
McCook, L. J.; Almany, G. R.; Berumen, M. L.; Day, J. C.; Green, A. L.; Jones, G. P.; Leis, J. M.; Planes, S.; Russ, G. R.; Sale, P. F.; Thorrold, S. R.
2009-06-01
The global decline in coral reefs demands urgent management strategies to protect resilience. Protecting ecological connectivity, within and among reefs, and between reefs and other ecosystems is critical to resilience. However, connectivity science is not yet able to clearly identify the specific measures for effective protection of connectivity. This article aims to provide a set of principles or practical guidelines that can be applied currently to protect connectivity. These ‘rules of thumb’ are based on current knowledge and expert opinion, and on the philosophy that, given the urgency, it is better to act with incomplete knowledge than to wait for detailed understanding that may come too late. The principles, many of which are not unique to connectivity, include: (1) allow margins of error in extent and nature of protection, as insurance against unforeseen or incompletely understood threats or critical processes; (2) spread risks among areas; (3) aim for networks of protected areas which are: (a) comprehensive and spread—protect all biotypes, habitats and processes, etc., to capture as many possible connections, known and unknown; (b) adequate—maximise extent of protection for each habitat type, and for the entire region; (c) representative—maximise likelihood of protecting the full range of processes and spatial requirements; (d) replicated—multiple examples of biotypes or processes enhances risk spreading; (4) protect entire biological units where possible (e.g. whole reefs), including buffers around core areas. Otherwise, choose bigger rather than smaller areas; (5) provide for connectivity at a wide range of dispersal distances (within and between patches), emphasising distances <20-30 km; and (6) use a portfolio of approaches, including but not limited to MPAs. 
Three case studies illustrating the application of these principles to coral reef management in the Bohol Sea (Philippines), the Great Barrier Reef (Australia) and Kimbe Bay (Papua New Guinea) are described.
Cima, Robert R; Brown, Michael J; Hebl, James R; Moore, Robin; Rogers, James C; Kollengode, Anantha; Amstutz, Gwendolyn J; Weisbrod, Cheryl A; Narr, Bradly J; Deschamps, Claude
2011-07-01
Operating rooms (ORs) are resource-intensive and costly hospital units. Maximizing OR efficiency is essential to maintaining an economically viable institution. OR efficiency projects often focus on a limited number of ORs or cases. Efforts across an entire OR suite have not been reported. Lean and Six Sigma methodologies were developed in the manufacturing industry to increase efficiency by eliminating non-value-added steps. We applied Lean and Six Sigma methodologies across an entire surgical suite to improve efficiency. A multidisciplinary surgical process improvement team constructed a value stream map of the entire surgical process from the decision for surgery to discharge. Each process step was analyzed in 3 domains, ie, personnel, information processed, and time. Multidisciplinary teams addressed 5 work streams to increase value at each step: minimizing volume variation; streamlining the preoperative process; reducing nonoperative time; eliminating redundant information; and promoting employee engagement. Process improvements were implemented sequentially in surgical specialties. Key performance metrics were collected before and after implementation. Across 3 surgical specialties, process redesign resulted in substantial improvements in on-time starts and reduction in number of cases past 5 pm. Substantial gains were achieved in nonoperative time, staff overtime, and ORs saved. These changes resulted in substantial increases in margin/OR/day. Use of Lean and Six Sigma methodologies increased OR efficiency and financial performance across an entire operating suite. Process mapping, leadership support, staff engagement, and sharing performance metrics are keys to enhancing OR efficiency. The performance gains were substantial, sustainable, positive financially, and transferable to other specialties. Copyright © 2011 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
The Neurolab mission and biomedical engineering: a partnership for the future.
Liskowsky, D R; Frey, M A; Sulzman, F M; White, R J; Likowsky, D R
1996-01-01
Over the last five years, with the advent of flights of U.S. Shuttle/Spacelab missions dedicated entirely to life sciences research, the opportunities for conducting serious studies that use a fully outfitted space laboratory to better understand basic biological processes have increased. The last of this series of Shuttle/Spacelab missions, currently scheduled for 1998, is dedicated entirely to neuroscience and behavioral research. The mission, named Neurolab, includes a broad range of experiments that build on previous research efforts, as well as studies related to less mature areas of space neuroscience. The Neurolab mission provides the global scientific community with the opportunity to use the space environment for investigations that exploit microgravity to increase our understanding of basic processes in neuroscience. The results from this premier mission should lead to a significant advancement in the field as a whole and to the opening of new lines of investigation for future research. Experiments under development for this mission will utilize human subjects as well as a variety of other species. The capacity to carry out detailed experiments on both human and animal subjects in space allows a diverse complement of studies that investigate functional changes and their underlying molecular, cellular, and physiological mechanisms. In order to conduct these experiments, a wide array of biomedical instrumentation will be used, including some instruments and devices being developed especially for the mission.
Exploring the Solid Rocket Boosters and Properties of Matter
NASA Technical Reports Server (NTRS)
Moffett, Amy
2007-01-01
I worked for the United Space Alliance, LLC (USA) with the Solid Rocket Booster (SRB) Materials and Process engineers (M&P). I was assigned a project in which I needed to research and collect chemical and physical properties information, material safety data sheets (MSDS), and other product information from the vendors' websites and existing "in-house" files for a select group of materials used in building and refurbishing the SRBs. This information was then compiled in a report that summarized the information collected. My work site was at the Kennedy Space Center (KSC). This allowed for many opportunities to visit and tour sites operated by NASA, by USA, and by the Air Force. This included the vehicle assembly building (VAB), orbital processing facilities (OPF), the crawler with the mobile launch pad (MLP), and the SRB assembly and refurbishment facility (ARF), to name a few. In addition, the launch of STS-117 took place within the first week of employment, allowing day-by-day following of that mission, including post-flight operations for the SRBs. Two Delta II rockets were also launched during these 7 weeks. The sights were incredible and the operations witnessed were amazing. I learned so many things I never knew about the entire program and the shuttle itself. The entire experience, especially my work with the SRB materials, inspired my plan for implementation into the classroom.
Distraction and Mind-Wandering Under Load
Forster, Sophie
2013-01-01
Attention research over the last several decades has provided rich insights into the determinants of distraction, including distractor characteristics, task features, and individual differences. Load Theory represented a particularly important breakthrough, highlighting the critical role of the level and nature of task load in determining both the efficiency of distractor rejection and the stage of processing at which this occurs. However, until recently studies of distraction were restricted to those measuring rather specific forms of distraction by external stimuli which, I argue, although intended to be irrelevant, were in fact task-relevant. In daily life, attention may be distracted by a wide range of stimuli, which may often be entirely unrelated to any task being performed, and may include not only external stimuli but also internally generated stimuli such as task-unrelated thoughts. This review outlines recent research examining these more general, entirely task-irrelevant, forms of distraction within the framework of Load Theory. I discuss the relation between different forms of distraction, and the universality of load effects across different distractor types and individuals. PMID:23734138
One-Dimensional Signal Extraction Of Paper-Written ECG Image And Its Archiving
NASA Astrophysics Data System (ADS)
Zhang, Zhi-ni; Zhang, Hong; Zhuang, Tian-ge
1987-10-01
A method for converting paper-written electrocardiograms to one-dimensional (1-D) signals for archival storage on floppy disk is presented here. Appropriate image processing techniques were employed to remove the background noise inherent to ECG recorder charts and to reconstruct the ECG waveform. The entire process consists of (1) digitization of paper-written ECGs with an image processing system via a TV camera; (2) image preprocessing, including histogram filtering and binary image generation; (3) ECG feature extraction and ECG wave tracing; and (4) transmission of the processed ECG data to IBM-PC compatible floppy disks for storage and retrieval. The algorithms employed here may also be used in the recognition of paper-written EEG or EMG and may be useful in robotic vision.
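The core of steps (2) and (3) above, binarizing the scanned chart and tracing the waveform column by column into a 1-D signal, can be sketched in a few lines. This is a minimal illustration on a synthetic image, not the paper's algorithm: real charts need the noise filtering the abstract describes, and the column-mean trace extractor here is an assumed simplification.

```python
import numpy as np

def binarize(img, threshold=128):
    # Dark trace pixels (ink) become True; white paper becomes False.
    return img < threshold

def extract_trace(binary):
    # One sample per column: mean row index of trace pixels,
    # NaN where a column contains no trace at all.
    rows, cols = binary.shape
    signal = np.full(cols, np.nan)
    for c in range(cols):
        idx = np.nonzero(binary[:, c])[0]
        if idx.size:
            signal[c] = idx.mean()
    return signal

# Synthetic "paper ECG": white page with a dark sinusoidal trace.
h, w = 100, 200
img = np.full((h, w), 255, dtype=np.uint8)
trace_rows = (50 + 30 * np.sin(np.linspace(0, 4 * np.pi, w))).astype(int)
img[trace_rows, np.arange(w)] = 0

signal = extract_trace(binarize(img))
print(signal.shape)  # (200,)
```

In practice the extracted row indices would then be rescaled by the chart's mm-per-mV and mm-per-second calibration to recover amplitude and time axes.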
Converting customer expectations into achievable results.
Landis, G A
1999-11-01
It is not enough in today's environment just to meet customers' expectations--we must exceed them. Therefore, one must learn what constitutes those expectations. These needs have expanded during the past few years, from simply manufacturing the product and viewing the outcome from a provincial standpoint. Now we must understand and satisfy the entire supply chain. To manage this process and satisfy the customer, the effort now involves the supplier, the manufacturer, and the entire distribution system.
NASA Technical Reports Server (NTRS)
Crisp, David; Komar, George (Technical Monitor)
2001-01-01
Advancement of our predictive capabilities will require new scientific knowledge, improvement of our modeling capabilities, and new observation strategies to generate the complex data sets needed by coupled modeling networks. New observation strategies must support remote sensing from a variety of vantage points and will include "sensorwebs" of small satellites in low Earth orbit, large aperture sensors in Geostationary orbits, and sentinel satellites at L1 and L2 to provide day/night views of the entire globe. Onboard data processing and high speed computing and communications will enable near real-time tailoring and delivery of information products (i.e., predictions) directly to users.
Scandurra, Isabella; Hägglund, Maria
2009-01-01
Introduction: Integrated care involves different professionals, belonging to different care provider organizations, and requires immediate and ubiquitous access to patient-oriented information, supporting an integrated view of the care process [1]. Purpose: To present a method for development of usable and work-process-oriented information and communication technology (ICT) systems for integrated care. Theory and method: Based on Human-Computer Interaction science, and in particular Participatory Design [2], we present a new collaborative design method in the context of health information systems (HIS) development [3]. This method implies a thorough analysis of the entire interdisciplinary cooperative work and a transformation of the results into technical specifications, via user-validated scenarios, prototypes, and use cases, ultimately leading to the development of appropriate ICT for the variety of occurring work situations for different user groups, or professions, in integrated care. Results and conclusions: Application of the method in homecare of the elderly resulted in an HIS that was well adapted to the intended user groups. Conducted in multi-disciplinary seminars, the method captured and validated user needs and system requirements for different professionals, work situations, and environments, not only for current work; it also aimed to improve collaboration in future (ICT-supported) work processes. A holistic view of the entire care process was obtained and supported through different views of the HIS for different user groups, resulting in improved work in the entire care process as well as for each collaborating profession [4].
Viewing The Entire Sun With STEREO And SDO
NASA Astrophysics Data System (ADS)
Thompson, William T.; Gurman, J. B.; Kucera, T. A.; Howard, R. A.; Vourlidas, A.; Wuelser, J.; Pesnell, D.
2011-05-01
On 6 February 2011, the two Solar Terrestrial Relations Observatory (STEREO) spacecraft were at 180 degrees separation. This allowed the first-ever simultaneous view of the entire Sun. Combining the STEREO data with corresponding images from the Solar Dynamics Observatory (SDO) allows this full-Sun view to continue for the next eight years. We show how the data from the three viewpoints are combined into a single heliographic map. Processing of the STEREO beacon telemetry allows these full-Sun views to be created in near-real-time, allowing tracking of solar activity even on the far side of the Sun. This is a valuable space-weather tool, not only for anticipating activity before it rotates onto the Earth-view, but also for deep space missions in other parts of the solar system. Scientific use of the data includes the ability to continuously track the entire lifecycle of active regions, filaments, coronal holes, and other solar features. There is also a significant public outreach component to this activity. The STEREO Science Center produces products from the three viewpoints used in iPhone/iPad and Android applications, as well as time sequences for spherical projection systems used in museums, such as Science-on-a-Sphere and Magic Planet.
Method for sequentially processing a multi-level interconnect circuit in a vacuum chamber
NASA Technical Reports Server (NTRS)
Routh, D. E.; Sharma, G. C. (Inventor)
1982-01-01
The processing of wafer devices to form multilevel interconnects for microelectronic circuits is described. The method is directed to performing the sequential steps of etching the via, removing the photoresist pattern, back-sputtering the entire wafer surface, and depositing the next layer of interconnect material under common vacuum conditions without exposure to atmospheric conditions. Apparatus for performing the method includes a vacuum system having a vacuum chamber in which wafers are processed on rotating turntables. The vacuum chamber is provided with an RF sputtering system and a DC magnetron sputtering system. A gas inlet is provided in the chamber for the introduction of various gases to the vacuum chamber and the creation of various gas plasmas during the sputtering steps.
An investigation into creative design methodologies for textiles and fashion
NASA Astrophysics Data System (ADS)
Gault, Alison
2017-10-01
Understanding market intelligence, trends, influences, and personal approaches is an essential tool for design students developing their ideas in textiles and fashion. Identifying different personal approaches, including visual, process-led, or concept-driven, by employing creative methodologies is key to developing a brief. A series of ideas or themes starts to emerge and, through the design process, serves to underpin and inform an entire collection. These investigations ensure that the design collections are able to produce a diverse range of outcomes. Following key structures and coherent stages in the design process creates authentic collections in textiles and fashion. A range of undergraduate students presented their design portfolios (180), and the methodologies employed were mapped against success at module level, industry response, and graduate employment.
Digital Signal Processing and Generation for a DC Current Transformer for Particle Accelerators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zorzetti, Silvia
2013-01-01
The thesis topic, digital signal processing and generation for a DC current transformer, focuses on the most fundamental beam diagnostics in the field of particle accelerators, the measurement of the beam intensity, or beam current. The technology of a DC current transformer (DCCT) is well known, and used in many areas, including particle accelerator beam instrumentation, as a non-invasive (shunt-free) method to monitor the DC current in a conducting wire, or in our case, the current of charged particles travelling inside an evacuated metal pipe. So far, custom and commercial DCCTs are entirely based on analog technologies and signal processing, which makes them inflexible, sensitive to component aging, and difficult to maintain and calibrate.
Broadband omnidirectional antireflection coating based on subwavelength surface Mie resonators
Spinelli, P.; Verschuuren, M.A.; Polman, A.
2012-01-01
Reflection is a natural phenomenon that occurs when light passes the interface between materials with different refractive index. In many applications, such as solar cells or photodetectors, reflection is an unwanted loss process. Many ways to reduce reflection from a substrate have been investigated so far, including dielectric interference coatings, surface texturing, adiabatic index matching and scattering from plasmonic nanoparticles. Here we present an entirely new concept that suppresses the reflection of light from a silicon surface over a broad spectral range. A two-dimensional periodic array of subwavelength silicon nanocylinders designed to possess strongly substrate-coupled Mie resonances yields almost zero total reflectance over the entire spectral range from the ultraviolet to the near-infrared. This new antireflection concept relies on the strong forward scattering that occurs when a scattering structure is placed in close proximity to a high-index substrate with a high optical density of states. PMID:22353722
48 CFR 852.236-83 - Payments under fixed-price construction contracts (including NAS).
Code of Federal Regulations, 2010 CFR
2010-10-01
... (cold, constant temperature) 5 Entire air-conditioning system (Specified under 600 Sections) 5 Entire... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Payments under fixed-price construction contracts (including NAS). 852.236-83 Section 852.236-83 Federal Acquisition Regulations System...
38 CFR 16.107 - IRB membership.
Code of Federal Regulations, 2010 CFR
2010-07-01
... diversity of the members, including consideration of race, gender, and cultural backgrounds and sensitivity... entirely of women, including the institution's consideration of qualified persons of both sexes, so long as no selection is made to the IRB on the basis of gender. No IRB may consist entirely of members of one...
45 CFR 690.107 - IRB membership.
Code of Federal Regulations, 2011 CFR
2011-10-01
... diversity of the members, including consideration of race, gender, and cultural backgrounds and sensitivity... entirely of women, including the institution's consideration of qualified persons of both sexes, so long as no selection is made to the IRB on the basis of gender. No IRB may consist entirely of members of one...
Entire Photodamaged Chloroplasts Are Transported to the Central Vacuole by Autophagy
2017-01-01
Turnover of dysfunctional organelles is vital to maintain homeostasis in eukaryotic cells. As photosynthetic organelles, plant chloroplasts can suffer sunlight-induced damage. However, the process for turnover of entire damaged chloroplasts remains unclear. Here, we demonstrate that autophagy is responsible for the elimination of sunlight-damaged, collapsed chloroplasts in Arabidopsis thaliana. We found that vacuolar transport of entire chloroplasts, termed chlorophagy, was induced by UV-B damage to the chloroplast apparatus. This transport did not occur in autophagy-defective atg mutants, which exhibited UV-B-sensitive phenotypes and accumulated collapsed chloroplasts. Use of a fluorescent protein marker of the autophagosomal membrane allowed us to image autophagosome-mediated transport of entire chloroplasts to the central vacuole. In contrast to sugar starvation, which preferentially induced a distinct type of chloroplast-targeted autophagy that transports a part of the stroma via the Rubisco-containing body (RCB) pathway, photooxidative damage induced chlorophagy without prior activation of RCB production. We further showed that chlorophagy is induced by chloroplast damage caused by either artificial visible light or natural sunlight. Thus, this report establishes that an autophagic process eliminates entire chloroplasts in response to light-induced damage. PMID:28123106
1987-03-01
environment. Actions within the process loop are initiated by a perceived divergence between a desired state and the sensed environmental state. Definitions of the... initiated environmental effect. An action by our own forces, as well as by the enemy forces, can create an alteration to the overall environment. The DESIRED... Additionally, the model would accommodate the entire C2 system, including physical entities, structure, and its environment. The objective was to
Sodano, M J
1991-01-01
The author describes an innovative "work unit compensation" system that acts as an adjunct to existing personnel payment structures. The process, developed as a win-win alternative for both employees and their institution, includes a reward system for the entire department and ensures a team atmosphere. The Community Medical Center in Toms River, New Jersey developed the plan, which sets four basic goals: to be fair, economical, lasting, and transferable (FELT). The plan has proven to be a useful tool in retention and recruitment of qualified personnel.
Nuclear Safety. Technical progress journal, April--June 1996: Volume 37, No. 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muhlheim, M D
1996-01-01
This journal covers significant issues in the field of nuclear safety. Its primary scope is safety in the design, construction, operation, and decommissioning of nuclear power reactors worldwide and the research and analysis activities that promote this goal, but it also encompasses the safety aspects of the entire nuclear fuel cycle, including fuel fabrication, spent-fuel processing and handling, nuclear waste disposal, the handling of fissionable materials and radioisotopes, and the environmental effects of all these activities.
Nuclear Safety. Technical progress journal, January--March 1994: Volume 35, No. 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Silver, E G
1994-01-01
This is a journal that covers significant issues in the field of nuclear safety. Its primary scope is safety in the design, construction, operation, and decommissioning of nuclear power reactors worldwide and the research and analysis activities that promote this goal, but it also encompasses the safety aspects of the entire nuclear fuel cycle, including fuel fabrication, spent-fuel processing and handling, nuclear waste disposal, the handling of fissionable materials and radioisotopes, and the environmental effects of all these activities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2011-02-25
There are many voices calling for a future of abundant clean energy. The choices are difficult and the challenges daunting. How will we get there? The National Renewable Energy Laboratory integrates the entire spectrum of innovation including fundamental science, market relevant research, systems integration, testing and validation, commercialization and deployment. The innovation process at NREL is interdependent and iterative. Many scientific breakthroughs begin in our own laboratories, but new ideas and technologies come to NREL at any point along the innovation spectrum to be validated and refined for commercial use.
A CAD approach to magnetic bearing design
NASA Technical Reports Server (NTRS)
Jeyaseelan, M.; Anand, D. K.; Kirk, J. A.
1988-01-01
A design methodology has been developed at the Magnetic Bearing Research Laboratory for designing magnetic bearings using a CAD approach. This is used in the algorithm of an interactive design software package. The package is a design tool developed to enable the designer to simulate the entire process of design and analysis of the system. Its capabilities include interactive input/modification of geometry, finding any possible saturation at critical sections of the system, and the design and analysis of a control system that stabilizes and maintains magnetic suspension.
DOE Office of Scientific and Technical Information (OSTI.GOV)
VOLTTRON is an agent execution platform providing services to its agents that allow them to easily communicate with physical devices and other resources. VOLTTRON delivers an innovative distributed control and sensing software platform that supports modern control strategies, including agent-based and transaction-based controls. It enables mobile and stationary software agents to perform information gathering, processing, and control actions. VOLTTRON can independently manage a wide range of applications, such as HVAC systems, electric vehicles, distributed energy or entire building loads, leading to improved operational efficiency.
Command History. 1967. Volume 1. Sanitized
1967-01-01
the Politburo of the Lao Dong Party. This top political control center included Le Duan, General Vo Nguyen Giap, Truong Chinh, Le Duc Tho, and until... the support area Naval Support Activity (NSA) Da Nang's vital intra-coastal trans-shipment route from the deep water port at Da Nang northward to... under the energetic guidance of MG Nguyen Duc Thang, reviewed the entire process of RD and set to the task of revitalizing the
Mapping permafrost change hot-spots with Landsat time-series
NASA Astrophysics Data System (ADS)
Grosse, G.; Nitze, I.
2016-12-01
Recent and projected future climate warming strongly affects permafrost stability over large parts of the terrestrial Arctic, with local, regional and global scale consequences. The monitoring and quantification of permafrost and associated land surface changes in these areas is crucial for the analysis of hydrological and biogeochemical cycles as well as vegetation and ecosystem dynamics. However, detailed knowledge of the spatial distribution and the temporal dynamics of these processes is scarce, and key locations of permafrost landscape dynamics may remain unnoticed. As part of the ERC-funded PETA-CARB and ESA GlobPermafrost projects, we developed an automated processing chain based on data from the entire Landsat archive (excluding MSS) for the detection of permafrost change related processes and hotspots. The automated method enables us to analyze thousands of Landsat scenes, which allows for a multi-scaled spatio-temporal analysis at 30-meter spatial resolution. All necessary processing steps are carried out automatically with minimal user interaction, including data extraction, masking, reprojection, subsetting, data stacking, and calculation of multi-spectral indices. These indices, e.g. Landsat Tasseled Cap and NDVI among others, are used as proxies for land surface conditions, such as vegetation status, moisture or albedo. Finally, a robust trend analysis is applied to each multi-spectral index and each pixel over the entire observation period of up to 30 years, from 1985 to 2015, depending on data availability. Large transects of around 2 million km² across different permafrost types in Siberia and North America have been processed. Permafrost related or influencing landscape dynamics were detected within the trend analysis, including thermokarst lake dynamics, fires, thaw slumps, and coastal dynamics. The produced datasets will be distributed to the community as part of the ERC PETA-CARB and ESA GlobPermafrost projects.
Users are encouraged to provide feedback and ground truth data for a continuous improvement of our methodology and datasets, which will lead to a better understanding of the spatial and temporal distribution of changes within the vulnerable permafrost zone.
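The per-pixel robust trend step described above can be sketched as follows. This is an illustrative reconstruction, not the project's actual processing chain: it assumes a Theil-Sen estimator (one common choice for robust trend analysis) applied to an NDVI stack, and the function name and array layout are hypothetical.

```python
import numpy as np
from scipy.stats import theilslopes

def ndvi_trend(ndvi_stack, years):
    """Robust per-pixel trend of an NDVI time series.

    ndvi_stack: array of shape (n_years, n_pixels); years: 1-D array of
    observation years. Returns the Theil-Sen slope for each pixel
    (NDVI change per year), which is resistant to outliers such as
    cloud-contaminated observations.
    """
    slopes = np.empty(ndvi_stack.shape[1])
    for p in range(ndvi_stack.shape[1]):
        slopes[p], _, _, _ = theilslopes(ndvi_stack[:, p], years)
    return slopes
```

In the real chain each Landsat pixel would carry one such slope per multi-spectral index, and hot-spots would be mapped from the magnitude and significance of these slopes.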
NASA Astrophysics Data System (ADS)
Wilson, Katherine E.; Henke, E.-F. Markus; Slipher, Geoffrey A.; Anderson, Iain A.
2017-04-01
Electromechanically coupled dielectric elastomer actuators (DEAs) and dielectric elastomer switches (DESs) may form digital logic circuitry made entirely of soft and flexible materials. The expansion in planar area of a DEA exerts force across a DES, which is a soft electrode with strain-dependent resistivity. When compressed, the DES drops steeply in resistance and changes state from non-conducting to conducting. Logic operators may be achieved with different arrangements of interacting DE actuators and switches. We demonstrate combinatorial logic elements, including the fundamental Boolean logic gates, as well as sequential logic elements, including latches and flip-flops. With both data storage and signal processing abilities, the necessary calculating components of a soft computer are available. A noteworthy advantage of a soft computer with mechanosensitive DESs is the potential for responding to environmental strains while locally processing information and generating a reaction, like a muscle reflex.
7 CFR 51.3416 - Classification of defects.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Maximum allowed for U.S. No. 2 processing Occurring outside of or not entirely confined to the vascular ring Internal Black Spot, Internal Discoloration, Vascular Browning, Fusarium Wilt, Net Necrosis, Other Necrosis, Stem End Browning 5% waste 10% waste. Occurring entirely within the vascular ring Hollow Heart or...
Theoretical study of optical pump process in solid gain medium based on four-energy-level model
NASA Astrophysics Data System (ADS)
Ma, Yongjun; Fan, Zhongwei; Zhang, Bin; Yu, Jin; Zhang, Hongbo
2018-04-01
A semiclassical algorithm is applied to a four-energy-level model, aiming to identify the factors that affect the dynamic behavior during the pump process. The impacts of the pump intensity Ωp, the non-radiative transition rate γ43 and the decay rate of the electric dipole δ14 are discussed in detail. The calculation results show that large γ43, small δ14, and strong pumping Ωp are beneficial to establishing population inversion. Under strong pumping conditions, the entire pump process can be divided into four different phases, tentatively named the far-from-equilibrium process, Rabi oscillation process, quasi dynamic equilibrium process and ‘equilibrium’ process. The Rabi oscillation can slow the pumping process and cause some instability. Moreover, the duration of the entire process is negatively related to Ωp and γ43, whereas it is positively related to δ14.
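The qualitative pumping behaviour can be illustrated with a toy rate-equation model. Note that this is a deliberate simplification of the semiclassical four-level treatment described above: it ignores coherence (and hence the Rabi oscillation phase), collapses the lower laser level, and all rate values are hypothetical.

```python
import numpy as np
from scipy.integrate import solve_ivp

def pump_dynamics(Wp=1.0, g43=50.0, g32=0.1, t_end=20.0):
    """Toy four-level rate equations (lower laser level assumed to empty
    instantly). n = [n1, n4, n3] are normalized populations.
    Wp ~ pump rate, g43 ~ fast non-radiative 4->3 rate, g32 ~ upper-level
    decay; all values illustrative, not from the paper."""
    def rhs(t, n):
        n1, n4, n3 = n
        return [-Wp * n1 + g32 * n3,   # ground level: pumped out, refilled
                Wp * n1 - g43 * n4,    # pump level: filled, drains fast to 3
                g43 * n4 - g32 * n3]   # upper laser level accumulates
    return solve_ivp(rhs, (0.0, t_end), [1.0, 0.0, 0.0], dense_output=True)
```

Even this crude model reproduces the trend that a large γ43 and strong pumping drive most of the population into the upper laser level, i.e. establish inversion.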
Digital Light Processing update: status and future applications
NASA Astrophysics Data System (ADS)
Hornbeck, Larry J.
1999-05-01
Digital Light Processing (DLP) projection displays based on the Digital Micromirror Device (DMD) were introduced to the market in 1996. Less than 3 years later, DLP-based projectors are found in such diverse applications as mobile, conference room, video wall, home theater, and large-venue. They provide high-quality, seamless, all-digital images that have exceptional stability as well as freedom from both flicker and image lag. Marked improvements have been made in the image quality of DLP-based projection displays, including brightness, resolution, contrast ratio, and border image. DLP-based mobile projectors that weighed about 27 pounds in 1996 now weigh only about 7 pounds. This weight reduction has been responsible for the definition of an entirely new projector class, the ultraportable. New applications are being developed for this important new projection display technology; these include digital photofinishing for high process speed minilab and maxilab applications and DLP Cinema for the digital delivery of films to audiences around the world. This paper describes the status of DLP-based projection display technology, including its manufacturing, performance improvements, and new applications, with emphasis on DLP Cinema.
Probabilistic TSUnami Hazard MAPS for the NEAM Region: The TSUMAPS-NEAM Project
NASA Astrophysics Data System (ADS)
Basili, Roberto; Babeyko, Andrey Y.; Hoechner, Andreas; Baptista, Maria Ana; Ben Abdallah, Samir; Canals, Miquel; El Mouraouah, Azelarab; Bonnevie Harbitz, Carl; Ibenbrahim, Aomar; Lastras, Galderic; Lorito, Stefano; Løvholt, Finn; Matias, Luis Manuel; Omira, Rachid; Papadopoulos, Gerassimos A.; Pekcan, Onur; Nmiri, Abdelwaheb; Selva, Jacopo; Yalciner, Ahmet C.; Thio, Hong K.
2017-04-01
As global awareness of tsunami hazard and risk grows, the North-East Atlantic, the Mediterranean, and connected Seas (NEAM) region still lacks a thorough probabilistic tsunami hazard assessment. The TSUMAPS-NEAM project aims to fill this gap in the NEAM region by 1) producing the first region-wide long-term homogenous Probabilistic Tsunami Hazard Assessment (PTHA) from earthquake sources, and by 2) triggering a common tsunami risk management strategy. The specific objectives of the project are tackled by the following four consecutive actions: 1) Conduct a state-of-the-art, standardized, and updatable PTHA with full uncertainty treatment; 2) Review the entire process with international experts; 3) Produce the PTHA database, with documentation of the entire hazard assessment process; and 4) Publicize the results through an awareness raising and education phase, and a capacity building phase. This presentation will illustrate the project layout, summarize its current status of advancement, including the first preliminary release of the assessment, and outline its connections with similar initiatives in the international context. The TSUMAPS-NEAM Project (http://www.tsumaps-neam.eu/) is co-financed by the European Union Civil Protection Mechanism, Agreement Number: ECHO/SUB/2015/718568/PREV26.
NASA Astrophysics Data System (ADS)
Zhu, Ming; Liu, Tingting; Wang, Shu; Zhang, Kesheng
2017-08-01
Existing two-frequency reconstructive methods can only capture primary (single) molecular relaxation processes in excitable gases. In this paper, we present a reconstructive method based on the novel decomposition of frequency-dependent acoustic relaxation spectra to capture the entire molecular multimode relaxation process. This decomposition of acoustic relaxation spectra is developed from the frequency-dependent effective specific heat, indicating that a multi-relaxation process is the sum of the interior single-relaxation processes. Based on this decomposition, we can reconstruct the entire multi-relaxation process by capturing the relaxation times and relaxation strengths of N interior single-relaxation processes, using the measurements of acoustic absorption and sound speed at 2N frequencies. Experimental data for the gas mixtures CO2-N2 and CO2-O2 validate our decomposition and reconstruction approach.
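The decomposition underlying the method — a multi-relaxation spectrum expressed as the sum of interior single-relaxation terms — can be sketched numerically. This sketch only synthesizes the forward spectrum from given relaxation strengths and times (it does not implement the paper's 2N-frequency inversion), and the Debye-type functional form and all values are illustrative assumptions.

```python
import numpy as np

def relaxation_spectrum(omega, strengths, taus):
    """Dimensionless multi-relaxation absorption spectrum: a sum of
    single-relaxation terms eps_i * (w*tau_i) / (1 + (w*tau_i)^2),
    each peaking at w*tau_i = 1 with height eps_i / 2."""
    omega = np.asarray(omega, dtype=float)[:, None]
    wt = omega * np.asarray(taus, dtype=float)[None, :]
    terms = np.asarray(strengths, dtype=float)[None, :] * wt / (1.0 + wt**2)
    return terms.sum(axis=1)
```

The additivity of the terms is exactly what lets 2N absorption/sound-speed measurements pin down the N relaxation times and strengths.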
Development of hydrogen peroxide technique for bioburden reduction
NASA Astrophysics Data System (ADS)
Rohatgi, N.; Schwartz, L.; Stabekis, P.; Barengoltz, J.
In order to meet the National Aeronautics and Space Administration (NASA) Planetary Protection microbial reduction requirements for Mars in-situ life detection and sample return missions, entire planetary spacecraft (including planetary entry probes and planetary landing capsules) may have to be exposed to a qualified sterilization process. Presently, dry heat is the only NASA approved sterilization technique available for spacecraft application. However, with the increasing use of various man-made materials, highly sophisticated electronic circuit boards, and sensors in a modern spacecraft, compatibility issues may render this process unacceptable to design engineers and thus impractical to achieve terminal sterilization of the entire spacecraft. An alternative vapor phase hydrogen peroxide sterilization process, which is currently used in various industries, has been selected for further development. Strategic Technology Enterprises, Incorporated (STE), a subsidiary of STERIS Corporation, under a contract from the Jet Propulsion Laboratory (JPL) is developing systems and methodologies to decontaminate spacecraft using vaporized hydrogen peroxide (VHP) technology. The VHP technology provides an effective, rapid and low temperature means for inactivation of spores, mycobacteria, fungi, viruses and other microorganisms. The VHP application is a dry process affording excellent material compatibility with many of the components found in spacecraft such as polymers, paints and electronic systems. Furthermore, the VHP process has innocuous residuals as it decomposes to water vapor and oxygen. This paper will discuss the approach that is being used to develop this technique and will present lethality data that have been collected to establish deep vacuum VHP sterilization cycles. In addition, the application of this technique to meet planetary protection requirements will be addressed.
NASA Astrophysics Data System (ADS)
Gigan, Olivier; Chen, Hua; Robert, Olivier; Renard, Stephane; Marty, Frederic
2002-11-01
This paper is dedicated to the fabrication and technological aspects of a silicon microresonator sensor. The entire project includes the fabrication processes, the system modelling/simulation, and the electronic interface. The mechanical model of such a resonator is presented, including a description of the frequency stability and hysteresis behaviour of the electrostatically driven resonator. A numerical model and FEM simulations are used to simulate the system's dynamic behaviour. The complete fabrication process is based on standard microelectronics technology with specific MEMS technological steps. The key steps are described: micromachining on SOI by Deep Reactive Ion Etching (DRIE), specific release processes to prevent sticking (resist and HF-vapour release processes) and collective vacuum encapsulation by Silicon Direct Bonding (SDB). The complete process has been validated and prototypes have been fabricated. An ASIC was designed to interface the sensor and to control the vibration amplitude. This electronics was simulated and designed to work up to 200°C and implemented in a standard 0.6 μm CMOS technology. Characterizations of sensor prototypes were done both mechanically and electrostatically. These measurements showed good agreement with theory and FEM simulations.
Oxygen production processes on the Moon: An overview
NASA Technical Reports Server (NTRS)
Taylor, Lawrence A.; Carrier, W. David, III
1991-01-01
The production of oxygen on the Moon utilizing indigenous material is paramount to a successful lunar colonization. Several processes were put forth to accomplish this. The lunar liquid oxygen (LLOX) generation schemes which have received the most study to date are those involving: (1) the reduction of ilmenite (FeTiO3) by H2, C, CO, CH4, or CO-Cl2 plasma; (2) magma electrolysis, both unadulterated and fluoride-fluxed; and (3) several others, including carbo-chlorination, HF acid leaching, fluorine extraction, magma oxidation, and vapor pyrolysis. The H2 reduction of ilmenite and magma electrolysis processes have received the most study to date. At this stage of development, both appear to be feasible schemes with various pros and cons. However, all processes should be addressed at least at the onset of the considerations. It is ultimately the energy requirements of the entire process, including the acquisition of feedstock, that will determine the mode of oxygen production. There is an obvious need for considerably more experimentation and study. Some of these requisite studies are in progress, and several of the most studied and feasible processes for winning oxygen from lunar materials are reviewed.
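As a rough illustration of the feedstock-side arithmetic for the ilmenite route, the theoretical oxygen yield per kilogram of ilmenite follows directly from stoichiometry (FeTiO3 + H2 → Fe + TiO2 + H2O, with the water then electrolyzed to recover the oxygen and recycle the hydrogen). This back-of-envelope sketch is not from the paper; it only shows why feedstock acquisition dominates the mass budget.

```python
# Standard atomic masses (g/mol); only one of the three oxygens in
# FeTiO3 is liberated per formula unit by hydrogen reduction.
M_Fe, M_Ti, M_O = 55.85, 47.87, 16.00
M_FeTiO3 = M_Fe + M_Ti + 3 * M_O

def oxygen_yield_per_kg_ilmenite():
    """kg of recoverable oxygen per kg of pure ilmenite feedstock."""
    return M_O / M_FeTiO3
```

The result is only about 0.105 kg of oxygen per kg of pure ilmenite, i.e. roughly 10 tonnes of beneficiated feedstock per tonne of LLOX, before accounting for ore grade and process losses.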
New controls spark boiler efficiency
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engels, T.
1993-09-01
Monsanto's NutraSweet plant in University Park, IL, produces aspartame, the patented NutraSweet artificial sweetener product. Until recently, boiler control was managed by a '60s-era Fireye jackshaft system in which air and natural gas were mechanically linked with an offset to compensate for oxygen trim. The interlocking devices on the Fireye system were becoming obsolete, and the boiler needed a new front end retrofitted for low emissions. In order to improve boiler control efficiency, we decided to modernize and automate the entire boiler control system. We replaced the original jackshaft system and installed a Gordon-Piet burner system, including gas valves, air dampers, blowers, and burner. The upgrade challenges included developing a control strategy and selecting and implementing a process control system. Since our plant has standardized on the PROVOX process management information system from Fisher Controls (now Fisher-Rosemount Systems) to support most of our processes, it was a natural and logical choice for boiler controls as well.
Mission Life Thermal Analysis and Environment Correlation for the Lunar Reconnaissance Orbiter
NASA Technical Reports Server (NTRS)
Garrison, Matthew B.; Peabody, Hume
2012-01-01
Standard thermal analysis practices include stacking worst-case conditions, including environmental heat loads, thermo-optical properties and orbital beta angles. This results in the design being driven by a few bounding thermal cases, although those cases may represent only a very small portion of the actual mission life. The NASA Goddard Space Flight Center Thermal Branch developed a procedure to predict the flight temperatures over the entire mission life, assuming a known beta angle progression, variation in the thermal environment, and a degradation rate in the coatings. This was applied to the Global Precipitation Measurement core spacecraft. In order to assess the validity of this process, this work applies a similar process to the Lunar Reconnaissance Orbiter. A flight-correlated thermal model was exercised to give predictions of the thermal performance over the mission life. These results were then compared against flight data from the first two years of the spacecraft's use. This is used to validate the process and to suggest possible improvements for future analyses.
Analysis of design characteristics of a V-type support using an advanced engineering environment
NASA Astrophysics Data System (ADS)
Gwiazda, A.; Banaś, W.; Sękala, A.; Cwikla, G.; Topolska, S.; Foit, K.; Monica, Z.
2017-08-01
A modern mining support is, for the entire period of its use, an important part of the mining complex, which includes all the devices in the excavation during its normal use. Therefore, during the design of the support, it is an important task to choose the shape and to select the dimensions of the support as well as its strength characteristics. According to the rules, the design process of a support must take into account, inter alia, the type and the dimensions of the expected means of transport, the number and size of pipelines, and the type of additional equipment used in the excavation area. The support design must ensure the functionality of the excavation process and work safety, while maintaining the economic viability of the entire project. Among other things, it should ensure the selection of a support for specific natural conditions. It is also important to take into consideration the economic characteristics of the project. The article presents an algorithm of an integrative approach and its formalized description in the form of integrating the areas of optimization of different construction characteristics of a V-type mining support. The paper includes an example of its application in developing the construction of this support. The paper also describes the results of the characteristics analysis and the changes that were introduced afterwards. The support models were prepared in a CAD-class computer environment (Siemens NX PLM), and the analyses were conducted in this graphical design environment.
Software Development Technologies for Reactive, Real-Time, and Hybrid Systems
NASA Technical Reports Server (NTRS)
Manna, Zohar
1996-01-01
The research is directed towards the design and implementation of a comprehensive deductive environment for the development of high-assurance systems, especially reactive (concurrent, real-time, and hybrid) systems. Reactive systems maintain an ongoing interaction with their environment, and are among the most difficult to design and verify. The project aims to provide engineers with a wide variety of tools within a single, general, formal framework in which the tools will be most effective. The entire development process is considered, including the construction, transformation, validation, verification, debugging, and maintenance of computer systems. The goal is to automate the process as much as possible and reduce the errors that pervade hardware and software development.
[Biotechnology in perspective].
Brand, A
1990-06-15
Biotechnology is a collective term for a large number of manipulations of biological material. Fields of importance in stock-keeping include: (1) manipulation of reproductive processes; (2) genetic manipulation of macro-(farm) animals and micro-organisms; and (3) manipulation of metabolism. Integrating biotechnological findings into breeding-stock farming has repercussions in several fields, such as the relationships between producers and the ancillary and processing industries, service industries, consumers and society as a whole. The use of biotechnological findings will also require further automation and adaptation of farm management. Biotechnology opens up a new area and new prospects for farm animal husbandry. These can only be regarded as positive when they take the sustainable development of the entire sector into account.
MBE growth of VCSELs for high volume applications
NASA Astrophysics Data System (ADS)
Jäger, Roland; Riedl, Michael C.
2011-05-01
Mass-market applications like laser computer mice or optical data transmission based on vertical-cavity surface-emitting laser (VCSEL) chips need a high overall yield including epitaxy, processing, dicing, mounting and testing. One yield limitation for VCSEL structures is the emission wavelength variation across the substrate surface area, which determines the fraction of laser chips that fall below or above the specification limits. For most 850 nm VCSEL products a resonator wavelength variation of ±2 nm is common. This represents an average resonator thickness variation of much less than 1%, which is quite challenging to fulfil over the entire processed wafer surface area. A high overall yield is demonstrated on MBE-grown VCSEL structures.
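The relationship between wavelength spread and yield can be sketched with a simple model. Assuming (hypothetically) a Gaussian distribution of resonator wavelengths centered on the target, the fraction of chips inside a symmetric ±spec window is an error function of the spec-to-sigma ratio; the function and parameters below are illustrative, not the paper's yield model.

```python
import math

def in_spec_fraction(spec_halfwidth_nm, sigma_nm):
    """Fraction of chips whose emission wavelength lies within
    +/- spec_halfwidth_nm of target, for a Gaussian spread sigma_nm."""
    return math.erf(spec_halfwidth_nm / (sigma_nm * math.sqrt(2.0)))
```

For example, a ±2 nm spec with a 1 nm standard deviation across the wafer would pass about 95% of chips, so tightening the epitaxial thickness uniformity directly drives wavelength yield.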
Temperature distribution of thick thermoset composites
NASA Astrophysics Data System (ADS)
Guo, Zhan-Sheng; Du, Shanyi; Zhang, Boming
2004-05-01
The development of the temperature distribution in thick polymeric matrix laminates during an autoclave vacuum bag process was measured and compared with numerically calculated results. The finite element formulation of the transient heat transfer problem was carried out for polymeric matrix composite materials from the heat transfer differential equations, including the internal heat generation produced by exothermic chemical reactions. Software based on a general finite-element package was developed for numerical simulation of the entire composite process. From the experimental and numerical results, it was found that the measured temperature profiles were in good agreement with the numerical ones, and that conventional cure cycles recommended by prepreg manufacturers for thin laminates should be modified to prevent temperature overshoot.
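A minimal sketch of the kind of calculation involved — transient 1-D heat conduction with an internal heat-generation term — is shown below. It uses explicit finite differences rather than the paper's finite-element formulation, treats the exotherm as a constant source instead of a cure-kinetics model, and all material numbers are illustrative only.

```python
import numpy as np

def heat_1d(n=51, L=0.05, alpha=1e-7, q=0.5, dt=0.5, steps=2000, T0=20.0):
    """Explicit FTCS scheme for dT/dt = alpha * d2T/dx2 + q through a
    laminate of thickness L (m); q is heating rate (K/s) from the
    exothermic reaction, boundaries held at tool temperature T0."""
    dx = L / (n - 1)
    r = alpha * dt / dx**2
    assert r <= 0.5, "explicit scheme unstable: reduce dt or refine dx"
    T = np.full(n, T0)
    for _ in range(steps):
        T[1:-1] += r * (T[2:] - 2 * T[1:-1] + T[:-2]) + q * dt
        T[0] = T[-1] = T0   # tool/bag surfaces follow the autoclave
    return T
```

Because the low through-thickness diffusivity cannot carry the reaction heat away fast enough, the mid-plane overshoots the boundary temperature, which is exactly why thin-laminate cure cycles need modification for thick parts.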
The electrical properties of zero-gravity processed immiscibles
NASA Technical Reports Server (NTRS)
Lacy, L. L.; Otto, G. H.
1974-01-01
When dispersed or mixed immiscibles are solidified on earth, a large amount of separation of the constituents takes place due to differences in densities. However, when the immiscibles are dispersed and solidified in zero-gravity, density separation does not occur, and unique composite solids can be formed with many new and promising electrical properties. By measuring the electrical resistivity and superconducting critical temperature, Tc, of zero-g processed Ga-Bi samples, it has been found that the electrical properties of such materials are entirely different from the basic constituents and the ground control samples. Our results indicate that space processed immiscible materials may form an entirely new class of electronic materials.
Access NASA Satellite Global Precipitation Data Visualization on YouTube
NASA Technical Reports Server (NTRS)
Liu, Z.; Su, J.; Acker, J.; Huffman, G.; Vollmer, B.; Wei, J.; Meyer, D.
2017-01-01
Since the satellite era began, NASA has collected a large volume of Earth science observations for research and applications around the world. The collected and archived satellite data at 12 NASA data centers can also be used for STEM education and activities such as disaster events, climate change, etc. However, accessing satellite data can be a daunting task for non-professional users such as teachers and students because of unfamiliarity of terminology, disciplines, data formats, data structures, computing resources, processing software, programming languages, etc. Over the years, many efforts including tools, training classes, and tutorials have been developed to improve satellite data access for users, but barriers still exist for non-professionals. In this presentation, we will present our latest activity that uses a very popular online video sharing Web site, YouTube (https://www.youtube.com/), for accessing visualizations of our global precipitation datasets at the NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC). With YouTube, users can access and visualize a large volume of satellite data without the necessity to learn new software or download data. The dataset in this activity is a one-month animation for the GPM (Global Precipitation Measurement) Integrated Multi-satellite Retrievals for GPM (IMERG). IMERG provides precipitation on a near-global (60 deg. N-S) coverage at half-hourly time interval, providing more details on precipitation processes and development compared to the 3-hourly TRMM (Tropical Rainfall Measuring Mission) Multisatellite Precipitation Analysis (TMPA, 3B42) product. When the retro-processing of IMERG during the TRMM era is finished in 2018, the entire video will contain more than 330,000 files and will last 3.6 hours. Future plans include development of flyover videos for orbital data for an entire satellite mission or project. 
All videos, including the one-month animation, will be uploaded and available at the GES DISC site on YouTube (https://www.youtube.com/user/NASAGESDISC).
Space Processing Applications Rocket (SPAR) project: SPAR 10
NASA Technical Reports Server (NTRS)
Poorman, R. (Compiler)
1986-01-01
The Space Processing Applications Rocket Project (SPAR) X Final Report contains the compilation of the post-flight reports from each of the Principal Investigators (PIs) on the four selected science payloads, in addition to the engineering report as documented by the Marshall Space Flight Center (MSFC). This combined effort also describes pertinent portions of ground-based research leading to the ultimate selection of the flight sample composition, including design, fabrication and testing, all of which are expected to contribute to an improved comprehension of materials processing in space. The SPAR project was coordinated and managed by MSFC as part of the Microgravity Science and Applications (MSA) program of the Office of Space Science and Applications (OSSA) of NASA Headquarters. This technical memorandum is directed entirely to the payload manifest flown in the tenth of a series of SPAR flights conducted at the White Sands Missile Range (WSMR) and includes the experiments entitled, Containerless Processing Technology, SPAR Experiment 76-20/3; Directional Solidification of Magnetic Composites, SPAR Experiment 76-22/3; Comparative Alloy Solidification, SPAR Experiment 76-36/3; and Foam Copper, SPAR Experiment 77-9/1R.
Xu, Lifeng; Henke, Michael; Zhu, Jun; Kurth, Winfried; Buck-Sorlin, Gerhard
2011-04-01
Although quantitative trait loci (QTL) analysis of yield-related traits for rice has developed rapidly, crop models using genotype information have been proposed only relatively recently. As a first step towards a generic genotype-phenotype model, we present here a three-dimensional functional-structural plant model (FSPM) of rice, in which some model parameters are controlled by functions describing the effect of main-effect and epistatic QTLs. The model simulates the growth and development of rice based on selected ecophysiological processes, such as photosynthesis (source process) and organ formation, growth and extension (sink processes). It was devised using GroIMP, an interactive modelling platform based on the Relational Growth Grammar formalism (RGG). RGG rules describe the course of organ initiation and extension resulting in final morphology. The link between the phenotype (as represented by the simulated rice plant) and the QTL genotype was implemented via a data interface between the rice FSPM and the QTLNetwork software, which computes predictions of QTLs from map data and measured trait data. Using plant height and grain yield, it is shown how QTL information for a given trait can be used in an FSPM, computing and visualizing the phenotypes of different lines of a mapping population. Furthermore, we demonstrate how modification of a particular trait feeds back on the entire plant phenotype via the physiological processes considered. We linked a rice FSPM to a quantitative genetic model, thereby employing QTL information to refine model parameters and visualizing the dynamics of development of the entire phenotype as a result of ecophysiological processes, including the trait(s) for which genetic information is available. Possibilities for further extension of the model, for example for the purposes of ideotype breeding, are discussed.
Processing Satellite Images on Tertiary Storage: A Study of the Impact of Tile Size on Performance
NASA Technical Reports Server (NTRS)
Yu, JieBing; DeWitt, David J.
1996-01-01
Before raw data from a satellite can be used by an Earth scientist, it must first undergo a number of processing steps including basic processing, cleansing, and geo-registration. Processing actually expands the volume of data collected by a factor of 2 or 3, and the original data is never deleted. Thus processing and storage requirements can exceed 2 terabytes/day. Once processed data is ready for analysis, a series of algorithms (typically developed by the Earth scientists) is applied to a large number of images in a data set. The focus of this paper is how best to handle such images stored on tape using the following assumptions: (1) all images of interest to a scientist are stored on a single tape, (2) images are accessed and processed in the order that they are stored on tape, and (3) the analysis requires access to only a portion of each image and not the entire image.
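The tile-size trade-off studied here can be illustrated with a toy cost model (the overhead and per-pixel costs below are hypothetical, not the paper's measurements): small tiles pay per-tile overhead many times over, while large tiles force reading pixels outside the region of interest.

```python
import math

def tiles_touched(roi_w, roi_h, tile):
    """Number of tiles overlapping a roi_w x roi_h region (aligned case)."""
    return math.ceil(roi_w / tile) * math.ceil(roi_h / tile)

def read_cost(roi_w, roi_h, tile, per_tile_overhead=1.0, per_pixel=0.001):
    """Cost of reading the region: each touched tile pays a fixed
    positioning/decode overhead plus the cost of all its pixels,
    whether or not they fall inside the region of interest."""
    n = tiles_touched(roi_w, roi_h, tile)
    return n * (per_tile_overhead + per_pixel * tile * tile)
```

Under this model an intermediate tile size minimizes cost for partial-image access, which is the behaviour the tile-size study sets out to quantify on real tertiary storage.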
SkyMapper Filter Set: Design and Fabrication of Large-Scale Optical Filters
NASA Astrophysics Data System (ADS)
Bessell, Michael; Bloxham, Gabe; Schmidt, Brian; Keller, Stefan; Tisserand, Patrick; Francis, Paul
2011-07-01
The SkyMapper Southern Sky Survey will be conducted from Siding Spring Observatory with u, v, g, r, i, and z filters that comprise glued glass combination filters with dimensions of 309 × 309 × 15 mm. In this article we discuss the rationale for our bandpasses and physical characteristics of the filter set. The u, v, g, and z filters are entirely glass filters, which provide highly uniform bandpasses across the complete filter aperture. The i filter uses glass with a short-wave pass coating, and the r filter is a complete dielectric filter. We describe the process by which the filters were constructed, including the processes used to obtain uniform dielectric coatings and optimized narrowband antireflection coatings, as well as the technique of gluing the large glass pieces together after coating using UV transparent epoxy cement. The measured passbands, including extinction and CCD QE, are presented.
St. Fergus terminal gets turboexpanders for critical service
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lillard, J.K.; Nicol, G.
1994-09-05
To expand the St. Fergus gas-reception terminal for the Scottish Area Gas Evacuation (SAGE) system, Mobil North Sea Ltd. is adding a second separation train and two treatment trains. To meet pipeline-gas specifications over a wide range of flow rates and feed-gas compositions, single-stage turboexpander chilling was selected over Joule-Thomson valve expansion. Four turboexpanders (two per process train) will operate in parallel to achieve the required performance over the entire flow range of 90-575 MMscfd per process train. Unusual operating conditions for the turboexpanders include dense-phase inlet gas, expansion near the cricondenbar, and high equilibrium liquid content at the exhaust (up to 50 wt %). The two turboexpanders in each train share common suction and discharge facilities, as does their associated brake compressor. Details of the more-than-£400-million Phase B discussed here include commissioning, start-up, and operation.
Localized saddle-point search and application to temperature-accelerated dynamics
NASA Astrophysics Data System (ADS)
Shim, Yunsic; Callahan, Nathan B.; Amar, Jacques G.
2013-03-01
We present a method for speeding up temperature-accelerated dynamics (TAD) simulations by carrying out a localized saddle-point (LSAD) search. In this method, instead of using the entire system to determine the energy barriers of activated processes, the calculation is localized by including only a small chunk of atoms around those directly involved in the transition. Using this method, we have obtained N-independent scaling of the computational cost of the saddle-point search as a function of system size N. The error arising from localization is analyzed using a variety of model systems, including activated processes on Ag(100) and Cu(100) surfaces, as well as multiatom moves in Cu radiation damage and metal heteroepitaxial growth. Our results show significantly improved performance of TAD with the LSAD method for Ag/Ag(100) annealing and Cu/Cu(100) growth, while maintaining a negligibly small error in energy barriers.
Design of structure and simulation of the three-zone gasifier of dense layer of the inverted process
NASA Astrophysics Data System (ADS)
Zagrutdinov, R. Sh; Negutorov, V. N.; Maliykhin, D. G.; Nikishanin, M. S.; Senachin, P. K.
2017-11-01
Experts of LLC “New Energy Technologies” have developed gasifier designs implementing the three-zone gasification method that satisfy the following conditions: 1) the generated gas must be free of tar, soot and hydrocarbons, with a given CO/H2 ratio; 2) the fuel source can be drawn from a wide range of low-grade, low-value solid fuels, including biomass and various kinds of carbonaceous wastes; 3) the units must be highly reliable in operation, must not require qualified operating personnel, must be relatively inexpensive to produce, and must use steam-air blowing instead of expensive steam-oxygen blowing; 4) the line of standard sizes should be sufficiently wide (with a single-unit fuel capacity from 1 to 50-70 MW). Two models of gas generators of the inverted gasification process with three combustion zones operating under pressure have been adopted for design: 1) a gas generator with a remote combustion chamber, type GOP-VKS (two-block version), and 2) a gas generator with a common combustion chamber, type GOP-OK (single-block version), which is an almost ideal model for increasing the unit capacity. Various schemes have been worked out for preparing briquettes from practically the entire spectrum of low-grade fuels: high-ash and high-moisture coals, peat and biomass, including all types of waste (solid household waste, crop, livestock, poultry, etc.). The gas generators gasify cylindrical briquettes 20-25 mm in diameter and 25-35 mm long.
A mathematical model and computer code have been developed for numerical simulation of synthesis-gas generation in a dense-layer gasifier of the inverted process under steam-air blast, including: continuity equations for the 8 gas-phase components and for the solid phase; the heat-balance equation for the entire heterogeneous system; the Darcy-law equation (for porous media); equations of state for the 8 gas-phase components; equations for the rates of 3 gas-phase and 4 heterogeneous reactions; the macrokinetic law of coke combustion; and other equations and boundary conditions.
Expectation, information processing, and subjective duration.
Simchy-Gross, Rhimmon; Margulis, Elizabeth Hellmuth
2018-01-01
In research on psychological time, it is important to examine the subjective duration of entire stimulus sequences, such as those produced by music (Teki, Frontiers in Neuroscience, 10, 2016). Yet research on the temporal oddball illusion (according to which oddball stimuli seem longer than standard stimuli of the same duration) has examined only the subjective duration of single events contained within sequences, not the subjective duration of the sequences themselves. Does the finding that oddballs seem longer than standards translate to entire sequences, such that sequences containing oddballs seem longer than those that do not? Is this potential translation influenced by the mode of information processing, that is, whether people are engaged in direct or indirect temporal processing? Two experiments aimed to answer both questions using different manipulations of information processing. In both experiments, musical sequences either did or did not contain oddballs (auditory sliding tones). To manipulate information processing, we varied the task (Experiment 1), the sequence event structure (Experiments 1 and 2), and the sequence familiarity (Experiment 2) independently within subjects. Overall, in both experiments, the sequences that contained oddballs seemed shorter than those that did not when people were engaged in direct temporal processing, but longer when people were engaged in indirect temporal processing. These findings support the dual-process contingency model of time estimation (Zakay, Attention, Perception & Psychophysics, 54, 656-664, 1993). Theoretical implications for attention-based and memory-based models of time estimation, the pacemaker accumulator and coding efficiency hypotheses of time perception, and dynamic attending theory are discussed.
Vasan, S N Swetadri; Ionita, Ciprian N; Titus, A H; Cartwright, A N; Bednarek, D R; Rudin, S
2012-02-23
We present the image processing upgrades implemented on a Graphics Processing Unit (GPU) in the Control, Acquisition, Processing, and Image Display System (CAPIDS) for the custom Micro-Angiographic Fluoroscope (MAF) detector. Most of the image processing currently implemented in the CAPIDS system is pixel-independent; that is, the operation on each pixel is the same, and the operation on one pixel does not depend upon the result of the operation on any other, allowing the entire image to be processed in parallel. GPU hardware was developed for exactly this kind of massively parallel processing. Thus, for an algorithm with a high degree of parallelism, a GPU implementation is much faster than a CPU implementation. The image processing algorithm upgrades implemented on the CAPIDS system include flat-field correction, temporal filtering, image subtraction, roadmap mask generation, and display windowing and leveling. A comparison between the previous and the upgraded version of CAPIDS is presented to demonstrate how the improvement was achieved. By performing the image processing on a GPU, significant improvements in timing and frame rate have been achieved, including stable operation of the system at 30 fps during a fluoroscopy run, a DSA run, and a roadmap procedure, with automatic image windowing and leveling during each frame.
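As an illustration of the pixel-independence the abstract relies on, here is a minimal flat-field correction sketch (the formula is the common dark-subtracted gain normalization, not necessarily the exact CAPIDS algorithm; NumPy stands in for the GPU, since the same elementwise map is what a CUDA kernel would parallelize):

```python
import numpy as np

def flat_field_correct(raw, dark, flat, eps=1e-6):
    """Pixel-independent flat-field correction.

    Each output pixel depends only on the same pixel of the three input
    frames, so the whole image can be processed in parallel -- the
    property that makes the GPU mapping natural.
    """
    gain = np.mean(flat - dark)                       # scalar normalization
    return (raw - dark) * gain / np.maximum(flat - dark, eps)

# Synthetic frames: uniform dark and flat fields (invented values).
rng = np.random.default_rng(0)
raw = rng.uniform(100.0, 200.0, (4, 4))
dark = np.full((4, 4), 10.0)
flat = np.full((4, 4), 110.0)
out = flat_field_correct(raw, dark, flat)
```

With a perfectly uniform flat field the correction reduces to dark subtraction, which makes the sketch easy to check.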
Benefits to blood banks of a sales and operations planning process.
Keal, Donald A; Hebert, Phil
2010-12-01
A formal sales and operations planning (S&OP) process is a decision-making and communication process that balances supply and demand, integrates all business operational components with customer-focused business plans, and links high-level strategic plans to day-to-day operations. Furthermore, S&OP can assist in managing change across the organization, as it provides the opportunity to be proactive in the face of problems and opportunities while establishing a plan for everyone to follow. Key outcomes of a robust S&OP process in blood banking would include: higher customer satisfaction (donors and health care providers), balanced inventory across product lines and customers, more stable production rates and higher productivity, more cooperation across the entire operation, and timely updates to the business plan, resulting in better forecasting and fewer surprises that negatively impact the bottom line. © 2010 American Association of Blood Banks.
Laboratory-scale integrated ARP filter test
DOE Office of Scientific and Technical Information (OSTI.GOV)
Poirier, M.; Burket, P.
2016-03-01
The Savannah River Site (SRS) is currently treating radioactive liquid waste with the Actinide Removal Process (ARP) and the Modular Caustic Side Solvent Extraction Unit (MCU). Recently, the low filter flux through the ARP of approximately 5 gallons per minute has limited the rate at which radioactive liquid waste can be treated. Salt Batch 6 had a lower processing rate and required frequent filter cleaning. There is a desire to understand the causes of the low filter flux and to increase ARP/MCU throughput. This task attempted to simulate the entire ARP process, including multiple batches (5), washing, chemical cleaning, and blending the feed with heels and recycle streams. The objective of the tests was to determine whether one of these processes causes excessive fouling of the crossflow or secondary filter. The authors conducted the tests with feed solutions containing 6.6 M sodium Salt Batch 6 simulant supernate with no MST.
Transmission ultrasonography. [time delay spectrometry for soft tissue transmission imaging]
NASA Technical Reports Server (NTRS)
Heyser, R. C.; Le Croissette, D. H.
1973-01-01
Review of the results of the application of an advanced signal-processing technique, called time delay spectrometry, in obtaining soft tissue transmission images by transmission ultrasonography, both in vivo and in vitro. The presented results include amplitude ultrasound pictures and phase ultrasound pictures obtained by this technique. While amplitude ultrasonographs of tissue are closely analogous to X-ray pictures in that differential absorption is imaged, phase ultrasonographs represent an entirely new source of information based on differential time of propagation. Thus, a new source of information is made available for detailed analysis.
NASA Technical Reports Server (NTRS)
Holanda, Raymond; Kim, Walter S.; Pencil, Eric; Groth, Mary; Danzey, Gerald A.
1990-01-01
Parallel gap resistance welding was used to attach lead wires to sputtered thin film sensors. Ranges of optimum welding parameters to produce an acceptable weld were determined. The thin film sensors were Pt13Rh/Pt thermocouples; they were mounted on substrates of MCrAlY-coated superalloys, aluminum oxide, silicon carbide and silicon nitride. The entire sensor system is designed to be used on aircraft engine parts. These sensor systems, including the thin-film-to-lead-wire connectors, were tested to 1000 C.
Operating a petabyte class archive at ESO
NASA Astrophysics Data System (ADS)
Suchar, Dieter; Lockhart, John S.; Burrows, Andrew
2008-07-01
The challenges of setting up and operating a Petabyte Class Archive will be described in terms of computer systems within a complex Data Centre environment. The computer systems, including the ESO Primary and Secondary Archive and the associated computational environments such as relational databases will be explained. This encompasses the entire system project cycle, including the technical specifications, procurement process, equipment installation and all further operational phases. The ESO Data Centre construction and the complexity of managing the environment will be presented. Many factors had to be considered during the construction phase, such as power consumption, targeted cooling and the accumulated load on the building structure to enable the smooth running of a Petabyte class Archive.
GR@PPA 2.8: Initial-state jet matching for weak-boson production processes at hadron collisions
NASA Astrophysics Data System (ADS)
Odaka, Shigeru; Kurihara, Yoshimasa
2012-04-01
The initial-state jet matching method introduced in our previous studies has been applied to the event generation of single W and Z production processes and diboson (WW, WZ and ZZ) production processes at hadron collisions in the framework of the GR@PPA event generator. The generated events reproduce the transverse momentum spectra of weak bosons continuously in the entire kinematical region. The matrix elements (ME) for hard interactions are still at the tree level. As in previous versions, the decays of weak bosons are included in the matrix elements. Therefore, spin correlations and phase-space effects in the decay of weak bosons are exact at the tree level. The program package includes custom-made parton shower programs as well as ME-based hard interaction generators in order to achieve self-consistent jet matching. The generated events can be passed to general-purpose event generators to make the simulation proceed down to the hadron level. Catalogue identifier: ADRH_v3_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADRH_v3_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 112 146 No. of bytes in distributed program, including test data, etc.: 596 667 Distribution format: tar.gz Programming language: Fortran; with some included libraries coded in C and C++ Computer: All Operating system: Any UNIX-like system RAM: 1.6 Mega bytes at minimum Classification: 11.2 Catalogue identifier of previous version: ADRH_v2_0 Journal reference of previous version: Comput. Phys. Comm. 175 (2006) 665 External routines: Bash and Perl for the setup, and CERNLIB, ROOT, LHAPDF, PYTHIA according to the user's choice. Does the new version supersede the previous version?: No, this version supports only a part of the processes included in the previous versions. 
Nature of problem: In order to simulate weak-boson production processes in the entire kinematical region, the 0-jet and 1-jet processes in the matrix elements must be combined using an appropriate matching method. Solution method: The leading logarithmic components to be included in parton distribution functions and parton showers are subtracted from the 1-jet matrix elements. Custom-made parton shower programs are provided to ensure satisfactory performance of the matching method. Reasons for new version: An initial-state jet matching method has been implemented. Summary of revisions: Weak-boson production processes associated with 0 jets and 1 jet can be consistently merged using the matching method. Restrictions: The built-in parton showers are not compatible with the PYTHIA new PS and the HERWIG PS. Unusual features: A large number of particles may be produced by the parton showers and passed to general-purpose event generators. Running time: About 10 min for initialization plus 25 s per 1k-event generation for W production under LHC conditions, on a 3.0-GHz Intel Xeon processor with the default settings.
Choy, G.L.; Boatwright, J.
2007-01-01
The rupture process of the Mw 9.1 Sumatra-Andaman earthquake lasted for approximately 500 sec, nearly twice as long as the teleseismic time windows between the P and PP arrival times generally used to compute radiated energy. In order to measure the P waves radiated by the entire earthquake, we analyze records that extend from the P-wave to the S-wave arrival times from stations at distances Δ > 60°. These 8- to 10-min windows contain the PP, PPP, and ScP arrivals, along with other multiply reflected phases. To gauge the effect of including these additional phases, we form the spectral ratio of the source spectrum estimated from extended windows (between TP and TS) to the source spectrum estimated from normal windows (between TP and TPP). The extended windows are analyzed as though they contained only the P-pP-sP wave group. We analyze four smaller earthquakes that occurred in the vicinity of the Mw 9.1 mainshock, with similar depths and focal mechanisms. These smaller events range in magnitude from an Mw 6.0 aftershock of 9 January 2005 to the Mw 8.6 Nias earthquake that occurred to the south of the Sumatra-Andaman earthquake on 28 March 2005. We average the spectral ratios for these four events to obtain a frequency-dependent operator for the extended windows. We then correct the source spectrum estimated from the extended records of the 26 December 2004 mainshock to obtain a complete or corrected source spectrum for the entire rupture process (~600 sec) of the great Sumatra-Andaman earthquake. Our estimate of the total seismic energy radiated by this earthquake is 1.4 × 10^17 J. When we compare the corrected source spectrum for the entire earthquake to the source spectrum from the first ~250 sec of the rupture process (obtained from normal teleseismic windows), we find that the mainshock radiated much more seismic energy in the first half of the rupture process than in the second half, especially over the period range from 3 sec to 40 sec.
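The spectral-ratio correction described above can be sketched as follows (synthetic spectra and invented numbers throughout; the real analysis operates on measured source spectra of the four calibration events and the mainshock):

```python
import numpy as np

freqs = np.linspace(0.01, 1.0, 50)        # Hz, purely illustrative band

def window_operator(extended_spectra, normal_spectra):
    """Average spectral ratio (extended / normal window) over calibration events."""
    ratios = [e / n for e, n in zip(extended_spectra, normal_spectra)]
    return np.mean(ratios, axis=0)

# Synthetic calibration events: extended windows inflate each spectrum by
# a common frequency-dependent factor (the quantity we want to estimate).
rng = np.random.default_rng(1)
true_op = 1.0 + 0.5 * np.exp(-freqs)
normals = [rng.uniform(0.9, 1.1, freqs.size) for _ in range(4)]
extendeds = [true_op * n for n in normals]

op = window_operator(extendeds, normals)

# Correct a mainshock spectrum measured only with an extended window.
mainshock_extended = true_op * np.ones_like(freqs)
corrected = mainshock_extended / op
```

Because the calibration events share the same window operator, averaging their ratios and dividing it out of the mainshock's extended-window spectrum recovers the normal-window estimate.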
The Defense Life Cycle Management System as a Working Model for Academic Application
ERIC Educational Resources Information Center
Burian, Philip E.; Keffel, Leslie M.; Maffei, Francis R., III
2011-01-01
Performing the review and assessment of masters' level degree programs can be an overwhelming and challenging endeavor. Getting organized and mapping out the entire review and assessment process can be extremely helpful and more importantly provide a path for successfully accomplishing the review and assessment of the entire program. This paper…
Some Memories Are Odder than Others: Judgments of Episodic Oddity Violate Known Decision Rules
ERIC Educational Resources Information Center
O'Connor, Akira R.; Guhl, Emily N.; Cox, Justin C.; Dobbins, Ian G.
2011-01-01
Current decision models of recognition memory are based almost entirely on one paradigm, single item old/new judgments accompanied by confidence ratings. This task results in receiver operating characteristics (ROCs) that are well fit by both signal-detection and dual-process models. Here we examine an entirely new recognition task, the judgment…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lim, Yong Chae; Sanderson, Samuel; Mahoney, Murray
Friction stir welding (FSW) has recently attracted attention as an alternative construction process for gas/oil transportation applications due to advantages over fusion welding techniques. A significant advantage is the ability of FSW to weld the entire, or nearly the entire, wall thickness in a single pass, while fusion welding requires multiple passes. However, when FSW is applied to a pipe or tube geometry, an internal back-support anvil is required to resist the plunging forces exerted during FSW. Unfortunately, it may not be convenient or economical to use internal backing support, due to limited access in some applications. To overcome this issue, ExxonMobil recently developed a new concept combining root arc welding and FSW: a root arc weld is made prior to FSW that supports the normal loads associated with FSW. In the present work, mechanical properties of a FSW + root-arc-welded pipe steel are reported, including microstructure and microhardness.
GPU Optimizations for a Production Molecular Docking Code*
Landaverde, Raphael; Herbordt, Martin C.
2015-01-01
Modeling molecular docking is critical to both understanding life processes and designing new drugs. In previous work we created the first published GPU-accelerated docking code (PIPER), which achieved a roughly 5× speed-up over a contemporaneous 4-core CPU. Advances in GPU architecture and in the CPU code, however, have since reduced this relative performance by a factor of 10. In this paper we describe the upgrade of GPU PIPER. This required an entire rewrite, including algorithm changes and moving most remaining non-accelerated CPU code onto the GPU. The result is a 7× improvement in GPU performance and a 3.3× speedup over the CPU-only code. We find that this difference in time is almost entirely due to the difference in run times of the 3D FFT library functions on the CPU (MKL) and GPU (cuFFT), respectively. The GPU code has been integrated into the ClusPro docking server, which has over 4000 active users. PMID:26594667
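The 3D FFT correlation at the heart of docking codes like PIPER can be sketched as follows (a generic FFT cross-correlation on a toy grid, not PIPER's actual scoring function; swapping NumPy's FFT for MKL or cuFFT bindings is where the CPU/GPU runtime gap discussed above arises):

```python
import numpy as np

def fft_correlation(receptor, ligand):
    """Score every translational offset of ligand against receptor at once.

    Cross-correlation via the convolution theorem: O(N log N) instead of
    O(N^2), which is why the FFT library dominates the runtime.
    """
    R = np.fft.fftn(receptor)
    L = np.fft.fftn(ligand)
    return np.real(np.fft.ifftn(R * np.conj(L)))

# Toy grid with a single "atom"; correlating it with itself should peak
# at zero offset.
grid = np.zeros((8, 8, 8))
grid[2, 3, 4] = 1.0
scores = fft_correlation(grid, grid)
best = np.unravel_index(np.argmax(scores), scores.shape)
```

In a real docking code the same correlation is evaluated for thousands of ligand rotations, one FFT pair per rotation, which is the workload moved onto the GPU.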
Atomization and vaporization characteristics of airblast fuel injection inside a venturi tube
NASA Technical Reports Server (NTRS)
Sun, H.; Chue, T.-H.; Lai, M.-C.; Tacina, R. R.
1993-01-01
This paper describes the experimental and numerical characterization of capillary fuel injection, atomization, dispersion, and vaporization of liquid fuel in a coflowing air stream inside a single venturi tube. The experimental techniques used are all laser-based. A phase Doppler analyzer was used to characterize the atomization and vaporization process. Planar laser-induced fluorescence visualizations give a good qualitative picture of the fuel droplet and vapor distributions. Limited quantitative capabilities of the technique are also demonstrated. A modified version of KIVA-II was used to simulate the entire spray process, including breakup and vaporization. The advantage of the venturi nozzle is demonstrated in terms of better atomization, more uniform fuel/air distribution, and lower pressure drop. Multidimensional spray calculations can be used as a design tool only if care is taken with the breakup model and the wall-impingement process.
Larval Transport on the Atlantic Continental Shelf of North America: a Review
NASA Astrophysics Data System (ADS)
Epifanio, C. E.; Garvine, R. W.
2001-01-01
This review considers transport of larval fish and crustaceans on the continental shelf. Previous reviews have contained only limited treatments of the physical processes involved. The present paper provides a physical background that is considerably more comprehensive. It includes a discussion of three principal forcing agents: (1) wind stress; (2) tides propagating from the deep ocean; and (3) differences in density associated with the buoyant outflow of estuaries, surface heat flux, or the interaction of coastal and oceanic water masses at the seaward margin of the shelf. The authors discuss the effects of these forcing agents on transport of larvae in the Middle Atlantic and South Atlantic Bights along the east coast of North America. The discussion concentrates on three species (blue crab, menhaden, bluefish) that have been the subject of a very recent multi-disciplinary study. Taken as a whole, the reproductive activities of these three species span the entire year and utilize the entire shelf, from the most seaward margin to the estuarine nursery. The blue crab is representative of species affected by physical processes occurring during summer and early autumn on the inner and mid-shelf. Menhaden are impacted by processes occurring in winter on the outer and mid-shelf. Bluefish are influenced primarily by processes occurring during early spring at the outer shelf margin near the western boundary current. The authors conclude that alongshore wind stress and density differences, i.e. buoyancy-driven flow, are the primary agents of larval transport in the region. Circulation associated with the western boundary current is only important at the shelf margin and tidally driven processes are generally inconsequential.
NASA Astrophysics Data System (ADS)
Sobolev, Stephan; Muldashev, Iskander
2016-04-01
The key achievement of the geodynamic modelling community, to which the work of Evgenii Burov and his students greatly contributed, is the application of "realistic", mineral-physics-based non-linear rheological models to simulate deformation processes in the crust and mantle. Subduction, a type example of such a process, is an essentially multi-scale phenomenon, with time-scales spanning from the geological scale to the earthquake scale, with the seismic cycle in between. In this study we test the possibility of simulating the entire subduction process, from rupture (about 1 min) to geological time (millions of years), with a single cross-scale thermomechanical model that employs elasticity, mineral-physics-constrained non-linear transient viscous rheology, and rate-and-state friction plasticity. First we generate a thermomechanical model of a subduction zone at the geological time-scale, including a narrow subduction channel with "wet-quartz" visco-elasto-plastic rheology and low static friction. We next introduce into the same model the classic rate-and-state friction law in the subduction channel, leading to stick-slip instability. This model generates a spontaneous earthquake sequence. In order to follow the deformation process in detail through an entire seismic cycle, and over multiple seismic cycles, we use an adaptive time-step algorithm that changes the step from 40 sec during an earthquake to minutes-to-5 years during the postseismic and interseismic periods. We observe many interesting deformation patterns and demonstrate that, contrary to conventional ideas, this model predicts that postseismic deformation is controlled by visco-elastic relaxation in the mantle wedge as early as an hour to a day after great (M > 9) earthquakes. We demonstrate that our results are consistent with the postseismic surface displacements after the Great Tohoku Earthquake over the day-to-4-year time range.
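The adaptive time-step idea can be sketched as follows (a toy scheme in which the step is keyed to the current slip rate; the velocity threshold, step bounds, and function names are invented, and the paper's actual criterion may differ):

```python
def adaptive_steps(total_time, velocity, dt_min=40.0, dt_max=5 * 3.15e7,
                   v_seismic=1e-2):
    """Choose each time step from the current slip rate.

    Coseismic slip (high velocity) forces short ~40 s steps, while slow
    interseismic creep allows steps of minutes to years. `velocity` maps
    time (s) to slip rate (m/s); the thresholds are illustrative only.
    """
    t, steps = 0.0, []
    while t < total_time:
        v = velocity(t)
        if v > v_seismic:                              # earthquake in progress
            dt = dt_min
        else:                                          # post/interseismic phase
            dt = min(dt_max, dt_min * v_seismic / max(v, 1e-30))
        dt = min(dt, total_time - t)                   # do not overshoot
        steps.append(dt)
        t += dt
    return steps

# A 2-minute "earthquake" followed by slow creep for the rest of the run.
quake_then_creep = lambda t: 1.0 if t < 120.0 else 1e-10
steps = adaptive_steps(1.0e6, quake_then_creep)
```

The payoff is that a single simulation resolves both the rupture (seconds) and the interseismic loading (years) without paying the small-step cost everywhere.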
Sar, Taner; Seker, Gamze; Erman, Ayse Gokce; Stark, Benjamin C.; Yesilcimen Akbas, Meltem
2017-01-01
This study describes an efficient and reusable process for ethanol production from medium containing whey powder, using alginate-immobilized ethanologenic E. coli strains either expressing (TS3) or not expressing (FBR5) Vitreoscilla hemoglobin. Reusabilities of the FBR5 and TS3 strains were investigated with respect to their ethanol production capacities over the course of 15 successive 96-h batch fermentations. The ethanol production was fairly stable over the entire duration of the experiment, with strain TS3 maintaining a substantial advantage over strain FBR5. Storage of both strains in 2 different solutions for up to 60 d resulted in only a modest loss of ethanol production, with strain TS3 consistently outperforming strain FBR5 by a substantial amount. Strains stored for 15 or 30 d maintained their abilities to produce ethanol without diminution over the course of 8 successive batch fermentations; again strain TS3 maintained a substantial advantage over strain FBR5 throughout the entire experiment. Thus, immobilization is a useful strategy for maintaining the advantage in ethanol productivity afforded by expression of Vitreoscilla hemoglobin over long periods of time and large numbers of repeated batch fermentations, including, as in this case, using media with food-processing wastes as the carbon source. PMID:28394725
[Emissions from dairy industry and the influence of herd management].
Dämmgen, Ulrich; Brade, Wilfried; Haenel, Hans-Dieter; Rösemann, Claus; Dämmgen, Jürgen; Meyer, Ulrich
2017-08-11
The purpose of this paper is to identify specific emission-reduction opportunities in dairy herds, arising from sound herd management, that lie within the scope of veterinary activities. In future, it might be one of a veterinarian's advisory capacities to address climate and environmental protection in animal husbandry. The models involved are similar to those of the national agricultural emission inventory. They allow quantification of the impacts of improved animal health, extended productive lifespan, and grazing of an entire dairy herd (cows, calves, heifers and bulls) on emissions from the herd itself, in addition to those originating from the entire production chain, including the provision of primary energy, water, and feed production and processing. Ammonia emissions are the main focus. The reductions achieved here are not huge, though noticeable, and they do not create extra costs. As can be shown, improved animal health and welfare are also environmentally beneficial. The reduction of greenhouse gas and air pollutant (eutrophying and acidifying gases and particles) emissions is an acknowledged political goal. If Germany wants to achieve the emission ceilings it has agreed to, agriculture will have to contribute, and planning will have to precede action if agriculture is to keep control of the processes itself.
The Effect of Radiation on Selected Photographic Film
NASA Technical Reports Server (NTRS)
Slater, Richard; Kinard, John; Firsov, Ivan
2000-01-01
We conducted this film test to evaluate several manufacturers' photographic films for their ability to acquire imagery on the International Space Station. We selected 25 motion picture, photographic slide, and negative films from three different film manufacturers. We based this selection on the fact that their films ranked highest in other similar film tests, and on their general acceptance by the international community. This test differed from previous tests in that the entire evaluation process leading up to the final selection was based on information derived after the original flight film was scanned to a digital file. Previously conducted tests were evaluated entirely on the basis of 8 × 10 prints produced from the film either directly or through the internegative process. The new evaluation procedure provided accurate quantitative data on granularity and contrast from the digital data. This test did not try to define which film was best visually, as that is too often based on personal preference. However, the test results did group the films as good, marginal, or unacceptable. We developed, and included in this report, a template containing quantitative, graphical, and visual information for each film. These templates should be sufficient for comparing the different films tested and subsequently selecting a film or films to be used for experiments and general documentation on the International Space Station.
The Danish test battery for auditory processing disorder evaluated with patient and control data.
Raben Pedersen, Ellen
2018-06-10
This study evaluates the Danish test battery for auditory processing disorder (APD). The battery consists of four behavioural tests: two speech and two non-speech stimuli tests. The evaluation includes determination of: (1) new cut-off values (pass-fail criteria), (2) the sensitivity and the specificity of the entire test battery and (3) the failure rate of different test combinations. For each test in the battery, cut-off values were determined using the weighted Youden index. Applying the newly derived cut-off values, the distribution of failing specific test combinations was determined. Participants comprised a group of 112 children diagnosed with APD (57 boys, 55 girls, aged 6-16 years) and a control group of 158 children without auditory problems (75 boys, 83 girls, aged 6-16 years). Cut-off values for different weights of the sensitivity and the specificity have been determined. Using the criterion that at least two tests have to be failed for APD to be suspected, the sensitivity and the specificity of the entire test battery were 95.3% and 91.6%, respectively. Some test combinations were found to have higher failure rates than others. Due to the high sensitivity and specificity, the test battery has good predictive value in APD assessment.
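The cut-off selection described above can be illustrated with a short sketch of the weighted Youden index. Everything below is hypothetical (synthetic scores, the weight value, and the convention that low scores indicate failure); it is an illustration of the technique, not the study's data or code:

```python
import numpy as np

def weighted_youden_cutoff(scores_apd, scores_control, w=0.5):
    """Pick the cutoff that maximizes the weighted Youden index
    J_w = 2 * (w * sensitivity + (1 - w) * specificity) - 1,
    which reduces to the classic Youden index Se + Sp - 1 at w = 0.5.
    Assumes lower scores indicate a failed test (illustrative only)."""
    candidates = np.unique(np.concatenate([scores_apd, scores_control]))
    best_cut, best_j = None, -np.inf
    for c in candidates:
        sens = np.mean(scores_apd <= c)     # APD children failing at this cutoff
        spec = np.mean(scores_control > c)  # control children passing
        j = 2 * (w * sens + (1 - w) * spec) - 1
        if j > best_j:
            best_cut, best_j = c, j
    return best_cut, best_j

# Synthetic percent-correct scores, sized like the study's groups
rng = np.random.default_rng(0)
apd = rng.normal(55, 10, 112)    # hypothetical APD group scores
ctrl = rng.normal(75, 8, 158)    # hypothetical control group scores
cut, j = weighted_youden_cutoff(apd, ctrl, w=0.5)
```

Varying w trades sensitivity against specificity, which is how different pass-fail criteria can be derived from the same normative data.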
Optimal control solutions to sodic soil reclamation
NASA Astrophysics Data System (ADS)
Mau, Yair; Porporato, Amilcare
2016-05-01
We study the reclamation process of a sodic soil by irrigation with water amended with calcium cations. In order to explore the entire range of time-dependent strategies, this task is framed as an optimal control problem, where the amendment rate is the control and the total rehabilitation time is the quantity to be minimized. We use a minimalist model of vertically averaged soil salinity and sodicity, in which the main feedback controlling the dynamics is the nonlinear coupling of soil water and exchange complex, given by the Gapon equation. We show that the optimal solution is a bang-bang control strategy, where the amendment rate is discontinuously switched along the process from a maximum value to zero. The solution enables a reduction in remediation time of about 50%, compared with the continuous use of good-quality irrigation water. Because of its general structure, the bang-bang solution is also shown to work for the reclamation of other soil conditions, such as saline-sodic soils. The novelty in our modeling approach is the capability of searching the entire "strategy space" for optimal time-dependent protocols. The optimal solutions found for the minimalist model can be then fine-tuned by experiments and numerical simulations, applicable to realistic conditions that include spatial variability and heterogeneities.
The Painting-Sponging Analogy for Chemical Equilibrium
NASA Astrophysics Data System (ADS)
Gamitz, Adoni
1997-05-01
An analogy for chemical equilibrium is presented, in which high school or younger students can follow the advance towards equilibrium and its final dynamic nature. The relative opposition between forward and backward processes in a real chemical reaction is exemplified by the length of a road line that is painted by one person and erased by another, each with different skills and working speeds. The graph of the progress of the line length is entirely similar to the increase of product concentration in a chemical reaction starting from the reactants. In the analogy, the final equilibrium position is independent of the starting point, just as in a real chemical process. A simple BASIC program is included for interactive learning purposes.
Pascual-Leone, A; Yeryomenko, N; Sawashima, T; Warwar, S
2017-05-04
Pascual-Leone and Greenberg's sequential model of emotional processing has been used to explore therapy process in over 24 studies. This line of research shows that emotional processing in good psychotherapy often follows a sequential order, supporting a saw-toothed pattern of change within individual sessions (progressing "2-steps-forward, 1-step-back"). However, one cannot assume that local in-session patterns are scalable across an entire course of therapy. Thus, the primary objective of this exploratory study was to consider how the sequential patterns identified by Pascual-Leone may apply across entire courses of treatment. Intensive emotion coding in two separate single-case designs was submitted for quantitative analyses of longitudinal patterns. Comprehensive coding in these cases involved recording observations for every emotional event in an entire course of treatment (using the Classification of Affective-Meaning States), which were then treated as a 9-point ordinal scale. Applying multilevel modeling to each of the two cases showed significant patterns of change over a large number of sessions; those patterns were either nested at the within-session level or observed at the broader session-by-session level of change. Examining successful treatment cases showed several theoretically coherent kinds of temporal patterns, although not always in the same case. Clinical or methodological significance of this article: This is the first paper to demonstrate systematic temporal patterns of emotion over the course of an entire treatment. (1) The study offers a proof of concept that longitudinal patterns in the micro-processes of emotion can be objectively derived and quantified. (2) It also shows that patterns in emotion may be identified on the within-session level, as well as the session-by-session level of analysis. (3) Finally, observed processes over time support the ordered pattern of emotional states hypothesized in Pascual-Leone and Greenberg's (2007) model of emotional processing.
Expanded Processing Techniques for EMI Systems
2012-07-01
...possible to perform better target detection using physics-based algorithms and the entire data set, rather than simulating a simpler data set and mapping... Figure 4.25: Plots of simulated MetalMapper data for two oblate spheroidal targets.
An Optimization of Manufacturing Systems using a Feedback Control Scheduling Model
NASA Astrophysics Data System (ADS)
Ikome, John M.; Kanakana, Grace M.
2018-03-01
In complex production systems that involve multiple processes, unplanned disruptions often make the entire production system vulnerable to a number of problems, which leads to customer dissatisfaction. This has been an ongoing problem that requires research and methods to streamline the entire process, or a model that addresses it. In response, we have developed a feedback scheduling model that can minimize some of these problems; a number of experiments show that some of them can be eliminated if the correct remedial actions are implemented on time.
SEPARATION OF INORGANIC SALTS FROM ORGANIC SOLUTIONS
Katzin, L.I.; Sullivan, J.C.
1958-06-24
A process is described for recovering the nitrates of uranium and plutonium from solution in oxygen-containing organic solvents such as ketones or ethers. The solution of such salts dissolved in an oxygen-containing organic compound is contacted with an ion exchange resin whereby sorption of the entire salt on the resin takes place and then the salt-depleted liquid and the resin are separated from each other. The reaction seems to be based on an anion formation of the entire salt by complexing with the anion of the resin. Strong base or quaternary ammonium type resins can be used successfully in this process.
Garbers, Samantha; Flandrick, Kathleen; Bermudez, Dayana; Meserve, Allison; Chiasson, Mary Ann
2014-11-01
Interventions to reduce unintended pregnancy through improved contraceptive use are a public health priority. A comprehensive process evaluation of a contraceptive assessment module intervention with demonstrated efficacy was undertaken. The 12-month process evaluation goal was to describe the extent to which the intervention was implemented as intended over time, and to identify programmatic adjustments to improve implementation fidelity. Quantitative and qualitative methods included staff surveys, electronic health record data, usage monitoring, and observations. Fidelity of implementation was low overall (<10% of eligible patients completed the entire module [dose received]). Although a midcourse correction making the module available in clinical areas led to increased dose delivered (23% vs. 30%, chi-square test p = .006), dose received did not increase significantly after this adjustment. Contextual factors including competing organizational and staff priorities and staff buy-in limited the level of implementation and precluded adoption of some strategies such as adjusting patient flow. Using a process evaluation framework enabled the research team to identify and address complexities inherent in effectiveness studies and facilitated the alignment of program and context. © 2014 Society for Public Health Education.
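The before/after comparison of dose delivered (23% vs. 30%, p = .006) is a chi-square test on a 2x2 contingency table. A sketch of that computation is below; the patient counts are hypothetical, since the abstract reports only percentages, so the resulting p-value will not match the paper's exactly:

```python
from scipy.stats import chi2_contingency

# Hypothetical eligible-patient counts before/after the midcourse correction
# (true denominators are not given in the abstract)
before_delivered, before_total = 230, 1000   # ~23% dose delivered
after_delivered, after_total = 300, 1000     # ~30% dose delivered

table = [
    [before_delivered, before_total - before_delivered],
    [after_delivered, after_total - after_delivered],
]
chi2, p, dof, expected = chi2_contingency(table)  # dof = 1 for a 2x2 table
```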
GREENSCOPE Technical User’s Guide
GREENSCOPE’s methodology has been developed and its software tool designed such that it can be applied to an entire process, to a piece of equipment or process unit, or at the investigatory bench scale.
Toward a virtual platform for materials processing
NASA Astrophysics Data System (ADS)
Schmitz, G. J.; Prahl, U.
2009-05-01
Any production is based on materials that eventually become components of a final product. Material properties, being determined by the microstructure of the material, are thus of utmost importance both for the productivity and reliability of processing during production and for the application and reliability of the product components. A sound prediction of materials properties is therefore highly important. Such a prediction requires tracking of microstructure and properties evolution along the entire component life cycle, starting from a homogeneous, isotropic and stress-free melt and eventually ending in failure under operational load. This article outlines ongoing activities at RWTH Aachen University aiming at establishing a virtual platform for materials processing, comprising a virtual, integrative numerical description of processes and of the microstructure evolution along the entire production chain, and even extending further toward microstructure and properties evolution under operational conditions.
Modelling Coastal Cliff Recession Based on the GIM-DDD Method
NASA Astrophysics Data System (ADS)
Gong, Bin; Wang, Shanyong; Sloan, Scott William; Sheng, Daichao; Tang, Chun'an
2018-04-01
The unpredictable and instantaneous collapse behaviour of coastal rocky cliffs may cause damage that extends significantly beyond the area of failure. Gravitational movements that occur during coastal cliff recession involve two major stages: the small deformation stage and the large displacement stage. In this paper, a method of simulating the entire progressive failure process of coastal rocky cliffs is developed based on the gravity increase method (GIM), the rock failure process analysis method and the discontinuous deformation analysis method, and it is referred to as the GIM-DDD method. The small deformation stage, which includes crack initiation, propagation and coalescence processes, and the large displacement stage, which includes block translation and rotation processes during the rocky cliff collapse, are modelled using the GIM-DDD method. In addition, acoustic emissions, stress field variations, crack propagation and failure mode characteristics are further analysed to provide insights that can be used to predict, prevent and minimize potential economic losses and casualties. The calculation and analytical results are consistent with previous studies, which indicate that the developed method provides an effective and reliable approach for performing rocky cliff stability evaluations and coastal cliff recession analyses and has considerable potential for improving the safety and protection of seaside cliff areas.
Mistakes in a stat laboratory: types and frequency.
Plebani, M; Carraro, P
1997-08-01
Application of Total Quality Management concepts to laboratory testing requires that the total process, including preanalytical and postanalytical phases, be managed so as to reduce or, ideally, eliminate all defects within the process itself. Indeed a "mistake" can be defined as any defect during the entire testing process, from ordering tests to reporting results. We evaluated the frequency and types of mistakes found in the "stat" section of the Department of Laboratory Medicine of the University-Hospital of Padova by monitoring four different departments (internal medicine, nephrology, surgery, and intensive care unit) for 3 months. Among a total of 40490 analyses, we identified 189 laboratory mistakes, a relative frequency of 0.47%. The distribution of mistakes was: preanalytical 68.2%, analytical 13.3%, and postanalytical 18.5%. Most of the laboratory mistakes (74%) did not affect patients' outcome. However, in 37 patients (19%), laboratory mistakes were associated with further inappropriate investigations, thus resulting in an unjustifiable increase in costs. Moreover, in 12 patients (6.4%) laboratory mistakes were associated with inappropriate care or inappropriate modification of therapy. The promotion of quality control and continuous improvement of the total testing process, including pre- and postanalytical phases, seems to be a prerequisite for an effective laboratory service.
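The reported frequencies follow directly from the counts above; a quick arithmetic check (all figures taken from the abstract):

```python
# Overall relative frequency of laboratory mistakes
total_analyses = 40490
mistakes = 189
rel_freq_pct = 100 * mistakes / total_analyses   # ~0.47%

# Distribution of the 189 mistakes by testing phase (percentages from the abstract)
phase_pct = {"preanalytical": 68.2, "analytical": 13.3, "postanalytical": 18.5}
phase_counts = {k: v / 100 * mistakes for k, v in phase_pct.items()}
```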
Three-dimensional architecture for solid state radiation detectors
Parker, S.
1999-03-30
A radiation-damage resistant radiation detector is formed on a substrate formed of a material doped with a first conductivity type dopant. The detector includes at least one first electrode formed of first conductivity type dopant, and at least one second electrode that is spaced-apart from the first electrode and formed of a second conductivity type dopant. Each first and second electrode penetrates into the substrate from a substrate surface, and one or more electrodes may penetrate entirely through the substrate, that is traversing from one surface to the other surface. Particulate and/or electromagnetic radiation penetrating at least a surface of the substrate releases electrons and holes in substrate regions. Because the electrodes may be formed entirely through the substrate thickness, the released charges will be a relatively small distance from at least a portion of such an electrode, e.g., a distance less than the substrate thickness. The electrons and/or holes traverse the small distance and are collected by said electrodes, thus promoting rapid detection of the radiation. By providing one or more electrodes with a dopant profile radially graded in a direction parallel to a substrate surface, an electric field results that promotes rapid collection of released electrons and said holes. Monolithic combinations of such detectors may be fabricated including CMOS electronics to process radiation signals. 45 figs.
Enhanced intelligence through optimized TCPED concepts for airborne ISR
NASA Astrophysics Data System (ADS)
Spitzer, M.; Kappes, E.; Böker, D.
2012-06-01
Current multinational operations show an increased demand for high-quality actionable intelligence for different operational levels and users. In order to achieve sufficient availability, quality and reliability of information, various ISR assets are orchestrated within operational theatres. Airborne Intelligence, Surveillance and Reconnaissance (ISR) assets in particular provide - due to their endurance, non-intrusiveness, robustness, wide spectrum of sensors and flexibility to mission changes - significant intelligence coverage of areas of interest. An efficient and balanced utilization of airborne ISR assets calls for advanced concepts for the entire ISR process framework, including Tasking, Collection, Processing, Exploitation and Dissemination (TCPED). Beyond this, the employment of current visualization concepts, shared information bases and information customer profiles, as well as an adequate combination of ISR sensors with different information ages and dynamic (online) retasking process elements, enables the optimization of interlinked TCPED processes towards higher process robustness, shorter process duration, more flexibility between ISR missions and, finally, adequate "entry points" for information requirements by operational users and commands. In addition, relevant trade-offs of distributed and dynamic TCPED processes are examined and future trends are depicted.
UnPAKing the class differences among p21-activated kinases.
Eswaran, Jeyanthy; Soundararajan, Meera; Kumar, Rakesh; Knapp, Stefan
2008-08-01
The p21-activated kinases (PAKs) are signal transducers, central to many vital cellular processes, including cell morphology, motility, survival, gene transcription and hormone signalling. The mammalian PAK family contains six serine/threonine kinases divided into two subgroups, group I (PAK 1-3) and group II (PAK4-6), based on their domain architecture and regulation. PAKs functioning as dynamic signalling nodes present themselves as attractive therapeutic targets in tumours, neurological diseases and infection. The recent findings across all PAKs, including newly reported structures, shed light on the cellular functions of PAKs, highlighting molecular mechanisms of activation, catalysis and substrate specificity. We believe that a comprehensive understanding of the entire PAK family is essential for developing strategies towards PAK-targeted therapeutics.
Future Aeronautical Communication Infrastructure Technology Investigation
NASA Technical Reports Server (NTRS)
Gilbert, Tricia; Jin, Jenny; Berger, Jason; Henriksen, Steven
2008-01-01
This National Aeronautics and Space Administration (NASA) Contractor Report summarizes and documents the work performed to investigate technologies that could support long-term aeronautical mobile communications operating concepts for air traffic management (ATM) in the timeframe of 2020 and beyond, and includes the associated findings and recommendations made by ITT Corporation and NASA Glenn Research Center to the Federal Aviation Administration (FAA). The work was completed as the final phase of a multiyear NASA contract in support of the Future Communication Study (FCS), a cooperative research and development program of the United States FAA, NASA, and EUROCONTROL. This final report focuses on an assessment of final five candidate technologies, and also provides an overview of the entire technology assessment process, including final recommendations.
Current strategies with 1-stage prosthetic breast reconstruction
2015-01-01
Background: 1-stage prosthetic breast reconstruction is gaining traction as a preferred method of breast reconstruction in select patients who undergo mastectomy for cancer or prevention. Methods: Critical elements of the procedure, including patient selection, technique, surgical judgment, and postoperative care, were reviewed. Results: Outcome series reveal that in properly selected patients, direct-to-implant (DTI) reconstruction has similarly low rates of complications and high rates of patient satisfaction compared to traditional 2-stage reconstruction. Conclusions: 1-stage prosthetic breast reconstruction may be the procedure of choice in select patients undergoing mastectomy. Advantages include the potential for the entire reconstructive process to be completed in one surgery, the quick return to normal activities, and the lack of donor site morbidity. PMID:26005643
Valuing the Accreditation Process
ERIC Educational Resources Information Center
Bahr, Maria
2018-01-01
The value of the National Association for Developmental Education (NADE) accreditation process is far-reaching. Not only do students and programs benefit from the process, but also the entire institution. Through data collection of student performance, analysis, and resulting action plans, faculty and administrators can work cohesively towards…
Mars Pathfinder and Mars Global Surveyor Outreach Compilation
NASA Astrophysics Data System (ADS)
1999-09-01
This videotape is a compilation of the best NASA JPL (Jet Propulsion Laboratory) videos of the Mars Pathfinder and Mars Global Surveyor missions. The mission is described using animation and narration as well as some actual footage of the entire sequence of mission events. Included within these animations are the spacecraft orbit insertion; descent to the Mars surface; deployment of the airbags and instruments; and exploration by Sojourner, the Mars rover. JPL activities at spacecraft control during significant mission events are also included at the end. The spacecraft cameras pan the surrounding Mars terrain and film Sojourner traversing the surface and inspecting rocks. A single, brief, processed image of the Cydonia region (Mars face) at an oblique angle from the Mars Global Surveyor is presented. A description of the Mars Pathfinder mission, instruments, landing and deployment process, Mars approach, spacecraft orbit insertion, rover operation are all described using computer animation. Actual color footage of Sojourner as well as a 360 deg pan of the Mars terrain surrounding the spacecraft is provided. Lower quality black and white photography depicting Sojourner traversing the Mars surface and inspecting Martian rocks also is included.
Artifacts in magnetic spirals retrieved by transport of intensity equation (TIE)
NASA Astrophysics Data System (ADS)
Cui, J.; Yao, Y.; Shen, X.; Wang, Y. G.; Yu, R. C.
2018-05-01
The artifacts in magnetic structures reconstructed from Lorentz transmission electron microscopy (LTEM) images with the TIE method have been analyzed in detail. Processing of simulated images of Bloch and Neel spirals indicated that improper parameters in TIE may overestimate the high-frequency information and induce false features in the retrieved images. Specimen tilting further complicates the analysis of the images, because the LTEM image contrast is not the result of the magnetization distribution within the specimen but rather the integral projection of the magnetic induction filling the entire space, including the specimen.
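For illustration, the TIE phase retrieval this analysis concerns can be sketched as an FFT-based inverse Laplacian with a Tikhonov regularization constant; a poorly chosen constant is one concrete example of an "improper parameter" that distorts spatial frequencies in the retrieved image. This is a generic textbook-style sketch under a uniform in-focus intensity assumption, not the authors' implementation, and all parameter values are illustrative:

```python
import numpy as np

def tie_phase(d_i_dz, i0, k, dx, alpha=1e-3):
    """FFT-based solution of the transport of intensity equation for a
    specimen with uniform in-focus intensity i0:
        laplacian(phi) = -(k / i0) * dI/dz,   with k = 2*pi/lambda.
    The inverse Laplacian 1/|q|^2 is regularized as 1/(|q|^2 + alpha).
    Generic sketch; not the implementation used in the paper."""
    ny, nx = d_i_dz.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    q2 = (2 * np.pi) ** 2 * (FX ** 2 + FY ** 2)   # |q|^2 for the Laplacian
    phi_hat = (k / i0) * np.fft.fft2(d_i_dz) / (q2 + alpha)
    return np.real(np.fft.ifft2(phi_hat))
```

The choice of alpha trades noise amplification at particular spatial frequencies against fidelity of the retrieved phase, which is the kind of parameter sensitivity the paper analyzes.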
Exact solution of some linear matrix equations using algebraic methods
NASA Technical Reports Server (NTRS)
Djaferis, T. E.; Mitter, S. K.
1979-01-01
Algebraic methods are used to construct the exact solution P of the linear matrix equation PA + BP = - C, where A, B, and C are matrices with real entries. The emphasis is on the use of finite algebraic procedures which are easily implemented on a digital computer and which lead to an explicit solution of the problem. The paper is divided into six sections, which include the proof of the basic lemma, the Liapunov equation, and the computer implementation of the rational, integer and modular algorithms. Two numerical examples are given and the entire calculation process is depicted.
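The equation PA + BP = -C is a Sylvester equation; a unique solution exists when A and -B share no eigenvalues. While the paper's contribution is an exact finite algebraic construction, a quick numerical cross-check can be sketched with SciPy (the matrices here are hypothetical examples, not taken from the paper):

```python
import numpy as np
from scipy.linalg import solve_sylvester

# Hypothetical matrices with real entries; eigenvalues of A ({2, 3}) and
# -B ({-4, -5}) are disjoint, so P A + B P = -C has a unique solution.
A = np.array([[2.0, 1.0], [0.0, 3.0]])
B = np.array([[4.0, 0.0], [1.0, 5.0]])
C = np.array([[1.0, 2.0], [3.0, 4.0]])

# scipy's solve_sylvester(a, b, q) solves a X + X b = q,
# so taking a = B, b = A, q = -C yields X = P with P A + B P = -C.
P = solve_sylvester(B, A, -C)
residual = P @ A + B @ P + C  # should be numerically zero
```

Unlike the paper's symbolic procedures, this Schur-decomposition-based solver returns a floating-point approximation rather than an exact rational solution.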
NASA Technical Reports Server (NTRS)
Gonzalez, Guillermo A.; Lucy, Melvin H.; Massie, Jeffrey J.
2013-01-01
The NASA Langley Research Center, Engineering Directorate, Electronic System Branch, is responsible for providing pyrotechnic support capabilities to Langley Research Center unmanned flight and ground test projects. These capabilities include device selection, procurement, testing, and problem solving; firing system design, fabrication, and testing; ground support equipment design, fabrication, and testing; and checkout procedures and procedures training for pyro technicians. This technical memorandum will serve as a guideline for the design, fabrication, and testing of electropyrotechnic firing systems. The guidelines discuss the entire process, beginning with requirements definition and ending with development and execution.
On the Violence of High Explosive Reactions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tarver, C M; Chidester, S K
High explosive reactions can be caused by three general energy deposition processes: impact ignition by frictional and/or shear heating; bulk thermal heating; and shock compression. The violence of the subsequent reaction varies from benign slow combustion to catastrophic detonation of the entire charge. The degree of violence depends on many variables, including the rate of energy delivery, the physical and chemical properties of the explosive, and the strength of the confinement surrounding the explosive charge. The current state of experimental and computer modeling research on the violence of impact, thermal, and shock-induced reactions is reviewed.
Improved ADM1 model for anaerobic digestion process considering physico-chemical reactions.
Zhang, Yang; Piccard, Sarah; Zhou, Wen
2015-11-01
The "Anaerobic Digestion Model No. 1" (ADM1) was modified in the study by improving the bio-chemical framework and integrating a more detailed physico-chemical framework. Inorganic carbon and nitrogen balance terms were introduced to resolve the discrepancies in the original bio-chemical framework between the carbon and nitrogen contents in the degraders and substrates. More inorganic components and solids precipitation processes were included in the physico-chemical framework of ADM1. The modified ADM1 was validated with the experimental data and used to investigate the effects of calcium ions, magnesium ions, inorganic phosphorus and inorganic nitrogen on anaerobic digestion in batch reactor. It was found that the entire anaerobic digestion process might exist an optimal initial concentration of inorganic nitrogen for methane gas production in the presence of calcium ions, magnesium ions and inorganic phosphorus. Copyright © 2015 Elsevier Ltd. All rights reserved.
Medicare+Choice: what lies ahead?
Layne, R Jeffrey
2002-03-01
Health plans have continued to exit the Medicare+Choice program in recent years, despite efforts of Congress and the Centers for Medicare and Medicaid Services (CMS) to reform the program. Congress and CMS therefore stand poised to make additional, substantial reforms to the program. CMS has proposed to consolidate its oversight of the program, extend the due date for Medicare+Choice plans to file their adjusted community rate proposals, revise risk-adjustment processes, streamline the marketing review process, enhance quality-improvement requirements, institute results-based performance assessment audits, coordinate policy changes to coincide with contracting cycles, expand its fall advertising campaign for the program, provide better employer-based Medicare options for beneficiaries, and take steps to minimize beneficiary costs. Congressional leaders have proposed various legislative remedies to improve the program, including creation of an entirely new pricing structure for the program based on a competitive bidding process.
AVE-SESAME program for the REEDA System
NASA Technical Reports Server (NTRS)
Hickey, J. S.
1981-01-01
The REEDA system software was modified and improved to process the AVE-SESAME severe storm data. A random access file system for the AVE storm data was designed, tested, and implemented. The AVE/SESAME software was modified to incorporate random access file input and to interface with new graphics hardware/software now available on the REEDA system. Software was developed to graphically display the AVE/SESAME data in the convention normally used by severe storm researchers, and it was interfaced with the existing graphics hardware/software available on the REEDA System. Documentation was provided for existing AVE/SESAME programs, outlining functional flow charts and interactive questions. All AVE/SESAME data sets were converted to random access format so that the developed software could access the entire AVE/SESAME data base. The existing software was also modified to allow processing of different AVE/SESAME data set types, including satellite, surface, and radar data.
NASA Astrophysics Data System (ADS)
Inisheva, L. I.; Szajdak, L.; Sergeeva, M. A.
2016-04-01
The biological activity in oligotrophic peatlands at the margins of the Vasyugan Mire has been studied. It is shown that differently directed biochemical processes manifest themselves in the entire peat profile down to the underlying mineral substrate. Their activity is highly variable. It is argued that the notion of active and inert layers in peat soils is only applicable to the description of their water regime. The degree of biochemical activity is governed by the physical soil properties. As a result of the biochemical processes, a micromosaic aerobic-anaerobic medium develops under the surface waterlogged layer of peat deposits. This layer contains the gas phase, including oxygen. It is concluded that the organic and mineral parts of peat bogs represent a single functional system of a genetic peat profile with a clear record of the history of its development.
Kaur, Gagan Deep
2017-05-01
The design process in Kashmiri carpet weaving is distributed over a number of actors and artifacts and is mediated by a weaving notation called talim. The script encodes the entire design in practice-specific symbols. This encoded script is decoded and interpreted via design-specific conventions by weavers to weave the design embedded in it. The cognitive properties of this notational system are described in the paper employing the cognitive dimensions (CDs) framework of Green (People and computers, Cambridge University Press, Cambridge, 1989) and Blackwell et al. (Cognitive technology: instruments of mind-CT 2001, LNAI 2117, Springer, Berlin, 2001). After an introduction to the practice, the design process is described in 'The design process' section, which includes coding and decoding of talim. In the 'Cognitive dimensions of talim' section, after briefly discussing the CDs framework, the specific cognitive dimensions possessed by talim are described in detail.
Long term trending of engineering data for the Hubble Space Telescope
NASA Technical Reports Server (NTRS)
Cox, Ross M.
1993-01-01
A major goal in spacecraft engineering analysis is the detection of component failures before the fact. Trending is the process of monitoring subsystem states to discern unusual behaviors. This involves reducing vast amounts of data about a component or subsystem into a form that helps humans discern underlying patterns and correlations. A long term trending system has been developed for the Hubble Space Telescope. Besides processing the data for 988 distinct telemetry measurements each day, it produces plots of 477 important parameters for the entire 24 hours. Daily updates to the trend files also produce 339 thirty-day trend plots each month. The total system combines command procedures to control the execution of the C-based data processing program, user-written FORTRAN routines, and commercial off-the-shelf plotting software. This paper includes a discussion of the performance of the trending system and of its limitations.
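The reduction step the abstract describes, collapsing many raw telemetry samples per parameter into one daily trend record, can be sketched as follows. This is a hypothetical stand-in, not the HST system's actual code: the parameter names and the min/mean/max summary are illustrative assumptions.

```python
# Hypothetical sketch of daily trend reduction: collapse raw telemetry
# samples for each (day, parameter) pair into one (min, mean, max) summary.
# Parameter names and summary statistics are illustrative assumptions.
from collections import defaultdict

def daily_trend(samples):
    """samples: iterable of (day, parameter, value) tuples."""
    grouped = defaultdict(list)
    for day, param, value in samples:
        grouped[(day, param)].append(value)
    return {
        key: (min(vals), sum(vals) / len(vals), max(vals))
        for key, vals in grouped.items()
    }

telemetry = [
    ("1993-001", "battery_temp", 20.0),
    ("1993-001", "battery_temp", 24.0),
    ("1993-001", "bus_voltage", 28.1),
]
trends = daily_trend(telemetry)
```

Accumulating only the summaries is what makes long-term trend files tractable compared to retaining every raw sample.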
User's manual SIG: a general-purpose signal processing program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lager, D.; Azevedo, S.
1983-10-25
SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. However, it has been designed to ultimately accommodate other representations for data such as multiplexed signals and complex matrices. Many of the basic operations one would perform on digitized data are contained in the core SIG package. Out of these core commands, more powerful signal processing algorithms may be built. Many different operations on time- and frequency-domain signals can be performed by SIG. They include operations on the samples of a signal, such as adding a scalar to each sample, operations on the entire signal such as digital filtering, and operations on two or more signals such as adding two signals. Signals may be simulated, such as a pulse train or a random waveform. Graphics operations display signals and spectra.
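The three classes of operations the manual names can be illustrated with a pure-Python stand-in (this is not SIG's actual command set; the moving-average filter is an assumed placeholder for SIG's digital filtering):

```python
# Pure-Python stand-in for the three operation classes described above:
# per-sample ops (add a scalar), whole-signal ops (a crude moving-average
# filter standing in for digital filtering), and ops on two signals
# (sample-wise addition). Not SIG's actual API.
import math

def add_scalar(signal, c):
    return [s + c for s in signal]            # operation on each sample

def moving_average(signal, width=3):
    half = width // 2                          # operation on the entire signal
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - half):i + half + 1]
        out.append(sum(window) / len(window))
    return out

def add_signals(a, b):
    return [x + y for x, y in zip(a, b)]       # operation on two signals

# SIG can also simulate signals such as sine waves or pulse trains:
sine = [math.sin(2 * math.pi * i / 16) for i in range(64)]
smoothed = moving_average(add_scalar(sine, 1.0))
```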
Ioannidis, Vassilios; van Nimwegen, Erik; Stockinger, Heinz
2016-01-01
ISMARA (ismara.unibas.ch) automatically infers the key regulators and regulatory interactions from high-throughput gene expression or chromatin state data. However, given the large sizes of current next-generation sequencing (NGS) datasets, data uploading times are a major bottleneck. Additionally, for proprietary data, users may be uncomfortable with uploading entire raw datasets to an external server. Both of these problems could be alleviated by providing a means by which users could pre-process their raw data locally, transferring only a small summary file to the ISMARA server. We developed a stand-alone client application that pre-processes large input files (RNA-seq or ChIP-seq data) on the user's computer for performing ISMARA analysis in a completely automated manner, including uploading of small processed summary files to the ISMARA server. This reduces file sizes by up to a factor of 1000, and upload times from many hours to mere seconds. The client application is available from ismara.unibas.ch/ISMARA/client. PMID:28232860
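The size reduction comes from aggregating raw per-read data into small summary counts locally, so only the summary crosses the network. A minimal sketch of that idea (not the actual ISMARA client; region labels are assumptions):

```python
# Illustrative sketch, not the ISMARA client: shrink raw NGS-style data to a
# compact summary by counting reads per genomic region locally before upload.
from collections import Counter

def summarize_reads(reads):
    """reads: iterable of genomic region labels, one per sequencing read."""
    return Counter(reads)

raw_reads = ["promoter_A"] * 1000 + ["promoter_B"] * 250
summary = summarize_reads(raw_reads)   # 1250 raw entries -> 2 count lines
```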
Mid-Atlantic Microtidal Barrier Coast Classification.
1983-05-01
... subregions A through F. Appendix A: BIGDAT data file for the 800 sample sites along the coast, and strike-parallel plots of these data. ... Data sets were derived from this data set as follows: 1) BIGDAT - the entire coast at 1-km intervals, including areas peripheral to inlets and capes (n = 800); 2) INLETR2 - the ... in Table 5. The Entire Coast at 1-km Intervals (BIGDAT and INLETR2). Correlation analysis of the 15 variables for the entire coast at 1-km intervals ...
Bringing Standardized Processes in Atom-Probe Tomography: I Establishing Standardized Terminology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Ian M; Danoix, F; Forbes, Richard
2011-01-01
Defining standardized methods requires careful consideration of the entire field and its applications. The International Field Emission Society (IFES) has elected a Standards Committee, whose task is to determine the steps needed to establish atom-probe tomography as an accepted metrology technique. Specific tasks include developing protocols or standards for: terminology and nomenclature; metrology and instrumentation, including specifications for reference materials; test methodologies; modeling and simulations; and science-based health, safety, and environmental practices. The Committee is currently working on defining terminology related to atom-probe tomography, with the goal of including these terms in a document published by the International Organization for Standardization (ISO). Many terms also used in other disciplines have already been defined and will be discussed for adoption in the context of atom-probe tomography.
Fonseca, Luciana Mara Monti; Dias, Danielle Monteiro Vilela; Góes, Fernanda Dos Santos Nogueira; Seixas, Carlos Alberto; Scochi, Carmen Gracinda Silvan; Martins, José Carlos Amado; Rodrigues, Manuel Alves
2014-09-01
The present study aimed to describe the development process of a serious game that enables users to evaluate the respiratory process in a preterm infant based on an emotional design model. The e-Baby serious game was built to feature the simulated environment of an incubator, in which the user performs a clinical evaluation of the respiratory process in a virtual preterm infant. The user learns about the preterm baby's history, chooses the tools for the clinical evaluation, evaluates the baby, and determines whether his/her evaluation is appropriate. The e-Baby game presents phases that contain respiratory process impairments of higher or lower complexity in the virtual preterm baby. Included links give the user the option of recording the entire evaluation procedure and sharing his/her performance on a social network. e-Baby integrates a Clinical Evaluation of the Preterm Baby course in the Moodle virtual environment. This game, which evaluates the respiratory process in preterm infants, could support a more flexible, attractive, and interactive teaching and learning process that includes simulations with features very similar to neonatal unit realities, thus allowing more appropriate training for clinical oxygenation evaluations in at-risk preterm infants. e-Baby allows advanced user-technology-educational interactions because it requires active participation in the process and is emotionally integrated.
NASA Astrophysics Data System (ADS)
Dombeck, J. P.; Cattell, C. A.; Prasad, N.; Sakher, A.; Hanson, E.; McFadden, J. P.; Strangeway, R. J.
2016-12-01
Field-aligned currents (FACs) provide a fundamental driver and means of Magnetosphere-Ionosphere (M-I) coupling. These currents need to be supported by local physics along the entire field line, generally with quasi-static potential structures, but also by the time evolution of the structures and currents, producing Alfvén waves and Alfvénic electron acceleration. In regions of upward current, precipitating auroral electrons are accelerated earthward. These processes can result in ion outflow and changes in ionospheric conductivity, and affect the particle distributions on the field line, affecting the M-I coupling processes supporting the individual FACs and potentially the entire FAC system. The FAST mission was well suited to study both the FACs and the electron auroral acceleration processes. We present the results of comparisons between meso- and small-scale FACs determined from FAST using the method of Peria et al. (2000), and our FAST auroral acceleration mechanism study when such identification is possible, for the entire ~13-year FAST mission. We also present the latest results of the electron energy (and number) flux ionospheric input based on acceleration mechanism (and FAC characteristics) from our FAST auroral acceleration mechanism study.
Code of Federal Regulations, 2010 CFR
2010-10-01
... preparation of the product from its raw state through each step in the entire process; or observe conditions... under the regulations in this part which has been preserved by any recognized commercial process..., or by fermentation. Quality. “Quality” means the inherent properties of any processed product which...
Integrated testing and verification system for research flight software design document
NASA Technical Reports Server (NTRS)
Taylor, R. N.; Merilatt, R. L.; Osterweil, L. J.
1979-01-01
The NASA Langley Research Center is developing the MUST (Multipurpose User-oriented Software Technology) program to cut the cost of producing research flight software through a system of software support tools. The HAL/S language is the primary subject of the design. Boeing Computer Services Company (BCS) has designed an integrated verification and testing capability as part of MUST. Documentation, verification, and test options are provided, with special attention to real-time, multiprocessing issues. The needs of the entire software production cycle have been considered, with effective management and reduced lifecycle costs as foremost goals. Capabilities have been included in the design for static detection of data flow anomalies involving communicating concurrent processes. Some types of ill-formed process synchronization and deadlock are also detected statically.
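One classic static check of the kind mentioned above, flagging a data-flow anomaly where a variable is read before any assignment, can be sketched in a few lines. This is a toy illustration, not the MUST tooling; the straight-line operation format is an assumption:

```python
# Toy illustration of a static data-flow anomaly check: flag any variable
# that is read before any prior definition in a straight-line sequence.
# The (target, sources) operation format is an assumption for the sketch.
def find_read_before_write(ops):
    """ops: list of (target, sources) tuples in program order."""
    defined, anomalies = set(), []
    for target, sources in ops:
        for src in sources:
            if src not in defined:
                anomalies.append(src)      # read with no prior definition
        defined.add(target)
    return anomalies

program = [
    ("a", []),           # a := constant
    ("b", ["a", "c"]),   # b := a + c  ('c' is never assigned first)
]
issues = find_read_before_write(program)
```

Real tools such as the one described extend this idea across branches and communicating concurrent processes, where the analysis is substantially harder.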
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stein, Joshua; Burnham, Laurie; Jones, Christian Birk
The U.S. DOE Regional Test Center for Solar Technologies program was established to validate photovoltaic (PV) technologies installed in a range of different climates. The program is funded by the Energy Department's SunShot Initiative. The initiative seeks to make solar energy cost competitive with other forms of electricity by the end of the decade. Sandia National Laboratories currently manages four different sites across the country. The National Renewable Energy Laboratory manages a fifth site in Colorado. The entire PV portfolio currently includes 20 industry partners and almost 500 kW of installed systems. The program follows a defined process that outlines tasks, milestones, agreements, and deliverables. The process is broken out into four main parts: 1) planning and design, 2) installation, 3) operations, and 4) decommissioning. This operations manual defines the various elements of each part.
Computation of Unsteady Flow in Flame Trench For Prediction of Ignition Overpressure Waves
NASA Technical Reports Server (NTRS)
Kwak, Dochan; Kris, Cetin
2010-01-01
Computational processes/issues for supporting mission tasks are discussed using an example from launch environment simulation. The entire CFD process is discussed using an existing code. STS-124 conditions were revisited to support the wall repair effort for the STS-125 flight. When water bags were not included, computed results indicate that IOP waves with the peak values were reflected from the SRB's own exhaust hole. ARES-1X simulations show that there is a shock wave going through the unused exhaust hole; however, it plays a secondary role. All three ARES-1X cases and the STS-1 simulations showed very similar IOP magnitudes and patterns on the vehicle. With the addition of water bags and water injection, the IOP effects will be further diminished.
NASA Technical Reports Server (NTRS)
Belon, A. E. (Principal Investigator); Miller, J. M.
1973-01-01
The author has identified the following significant results. The objective of this project is to provide a focus for the entire University of Alaska ERTS-1 effort (12 projects covering 10 disciplines and involving 8 research institutes and science departments). Activities have been concentrated on the implementation of the project's three primary functions: (1) coordination and management of the U of A ERTS-1 program, including management of the flow of data and data products; (2) acquisition, installation, test, operation, and maintenance of centralized facilities for processing ERTS-1, aircraft, and ground truth data; and (3) development of photographic and digital techniques for processing and interpreting ERTS-1 and aircraft data. With minor exceptions these three functions are now well-established and working smoothly.
A high performance, ad-hoc, fuzzy query processing system for relational databases
NASA Technical Reports Server (NTRS)
Mansfield, William H., Jr.; Fleischman, Robert M.
1992-01-01
Database queries involving imprecise or fuzzy predicates are currently an evolving area of academic and industrial research. Such queries place severe stress on the indexing and I/O subsystems of conventional database environments since they involve the search of large numbers of records. The Datacycle architecture and research prototype is a database environment that uses filtering technology to perform an efficient, exhaustive search of an entire database. It has recently been modified to include fuzzy predicates in its query processing. The approach obviates the need for complex index structures, provides unlimited query throughput, permits the use of ad-hoc fuzzy membership functions, and provides a deterministic response time largely independent of query complexity and load. This paper describes the Datacycle prototype implementation of fuzzy queries and some recent performance results.
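The core idea, an exhaustive scan of every record scored by an ad-hoc fuzzy membership function, can be sketched as follows. This is an illustration of the technique, not the Datacycle implementation; the triangular membership function and the record fields are assumptions:

```python
# Sketch of fuzzy query processing by exhaustive scan (the technique the
# Datacycle prototype applies in hardware): every record is scored by an
# ad-hoc membership function. Fields and the triangular shape are assumed.
def triangular(x, low, peak, high):
    """Membership in [0, 1] for a fuzzy predicate like 'around peak'."""
    if x <= low or x >= high:
        return 0.0
    if x <= peak:
        return (x - low) / (peak - low)
    return (high - x) / (high - peak)

records = [{"id": 1, "price": 18}, {"id": 2, "price": 35}, {"id": 3, "price": 21}]

# Fuzzy query: price "around 20" -- score every record, keep nonzero matches.
scored = sorted(
    ((triangular(r["price"], 10, 20, 30), r) for r in records),
    reverse=True, key=lambda t: t[0],
)
hits = [(score, r["id"]) for score, r in scored if score > 0]
```

Because every record is scored, no index structure is needed and response time is independent of query complexity, which is the property the paper highlights.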
Wojas-Pelc, Anna; Sułowicz, Joanna; Nastałek, Magdalena
2008-01-01
Aging affects the whole human body, including the skin, where it is usually most visible to others and repeatedly burdens quality of life. There are many theories explaining the process of human aging, but its causes, irrespective of their criteria, are numerous and affect one another. Skin aging, like that of the entire body, depends on the influence of genetic, environmental, and hormonal factors. Ultraviolet radiation and tobacco smoking have a confirmed influence on skin aging. The role of hormonal disorders, particularly estrogens, is also underlined. The mechanisms of skin aging induced by UV radiation, tobacco smoke, and estrogens are similar and include the unfavourable effects of oxidative stress (free radicals) as well as disturbances of the TGF-beta pathway. Data from many clinical studies show that avoiding sun and smoking, a diet rich in nucleic acids, antioxidant supplementation, everyday use of UV filters, moisturizers, topical antioxidants, retinoid derivatives, and flavonoids have a protective influence on the multidirectional process of skin aging.
Pragmatics as Metacognitive Control
Kissine, Mikhail
2016-01-01
The term “pragmatics” is often used to refer without distinction, on one hand, to the contextual selection of interpretation norms and, on the other hand, to the context-sensitive processes guided by these norms. Pragmatics in the first acception depends on language-independent contextual factors that can, but need not, involve Theory of Mind; in the second acception, pragmatics is a language-specific metacognitive process, which may unfold at an unconscious level without involving any mental state (meta-)representation. Distinguishing between these two kinds of ways context drives the interpretation of communicative stimuli helps dissolve the dispute between proponents of an entirely Gricean pragmatics and those who claim that some pragmatic processes do not depend on mind-reading capacities. According to the model defended in this paper, the typology of pragmatic processes is not entirely determined by a hierarchy of meanings, but by contextually set norms of interpretation. PMID:26834671
Pragmatics as Metacognitive Control.
Kissine, Mikhail
2015-01-01
The term "pragmatics" is often used to refer without distinction, on one hand, to the contextual selection of interpretation norms and, on the other hand, to the context-sensitive processes guided by these norms. Pragmatics in the first acception depends on language-independent contextual factors that can, but need not, involve Theory of Mind; in the second acception, pragmatics is a language-specific metacognitive process, which may unfold at an unconscious level without involving any mental state (meta-)representation. Distinguishing between these two kinds of ways context drives the interpretation of communicative stimuli helps dissolve the dispute between proponents of an entirely Gricean pragmatics and those who claim that some pragmatic processes do not depend on mind-reading capacities. According to the model defended in this paper, the typology of pragmatic processes is not entirely determined by a hierarchy of meanings, but by contextually set norms of interpretation.
Water displacement mercury pump
Nielsen, Marshall G.
1985-01-01
A water displacement mercury pump has a fluid inlet conduit and diffuser, a valve, a pressure cannister, and a fluid outlet conduit. The valve has a valve head which seats in an opening in the cannister. The entire assembly is readily insertable into a process vessel which produces mercury as a product. As the mercury settles, it flows into the opening in the cannister displacing lighter material. When the valve is in a closed position, the pressure cannister is sealed except for the fluid inlet conduit and the fluid outlet conduit. Introduction of a lighter fluid into the cannister will act to displace a heavier fluid from the cannister via the fluid outlet conduit. The entire pump assembly penetrates only a top wall of the process vessel, and not the sides or the bottom wall of the process vessel. This insures a leak-proof environment and is especially suitable for processing of hazardous materials.
Water displacement mercury pump
Nielsen, M.G.
1984-04-20
A water displacement mercury pump has a fluid inlet conduit and diffuser, a valve, a pressure cannister, and a fluid outlet conduit. The valve has a valve head which seats in an opening in the cannister. The entire assembly is readily insertable into a process vessel which produces mercury as a product. As the mercury settles, it flows into the opening in the cannister displacing lighter material. When the valve is in a closed position, the pressure cannister is sealed except for the fluid inlet conduit and the fluid outlet conduit. Introduction of a lighter fluid into the cannister will act to displace a heavier fluid from the cannister via the fluid outlet conduit. The entire pump assembly penetrates only a top wall of the process vessel, and not the sides or the bottom wall of the process vessel. This insures a leak-proof environment and is especially suitable for processing of hazardous materials.
Rigor + Results = Impact: Measuring Impact with Integrity (Invited)
NASA Astrophysics Data System (ADS)
Davis, H. B.; Scalice, D.
2013-12-01
Are you struggling to measure and explain the impact of your EPO efforts? The NASA Astrobiology Institute (NAI) is using an evaluation process to determine the impact of its 15 EPO projects with over 200 activities. What is the current impact? How can it be improved in the future? We have developed a process that preserves autonomy at the project implementation level while still painting a picture of the entire portfolio. The impact evaluation process looks at an education/public outreach activity through its entire project cycle. Working with an external evaluator, education leads: 1) rate the quality/health of an activity in each stage of its cycle, and 2) determine the impact based on the results of the evaluation and the rigor of the methods used. The process has created a way to systematically codify a project's health and its impact, while offering support for improving both impact and how it is measured.
Signal Processing, Analysis, & Display
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lager, Darrell; Azevado, Stephen
1986-06-01
SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. However, it has been designed to ultimately accommodate other representations for data such as multiplexed signals and complex matrices. Two user interfaces are provided in SIG - a menu mode for the unfamiliar user and a command mode for more experienced users. In both modes errors are detected as early as possible and are indicated by friendly, meaningful messages. An on-line HELP package is also included. A variety of operations can be performed on time- and frequency-domain signals, including operations on the samples of a signal, operations on the entire signal, and operations on two or more signals. Signal processing operations that can be performed are digital filtering (median, Bessel, Butterworth, and Chebyshev), ensemble average, resample, auto and cross spectral density, transfer function and impulse response, trend removal, convolution, Fourier transform and inverse, window functions (Hamming, Kaiser-Bessel), simulation (ramp, sine, pulse train, random), and read/write signals. User-definable signal processing algorithms are also featured. SIG has many options including multiple commands per line, command files with arguments, commenting lines, defining commands, and automatic execution for each item in a repeat sequence. Graphical operations on signals and spectra include: x-y plots of time signals; real, imaginary, magnitude, and phase plots of spectra; scaling of spectra for continuous or discrete domain; cursor zoom; families of curves; and multiple viewports.
SIG. Signal Processing, Analysis, & Display
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hernandez, J.; Lager, D.; Azevedo, S.
1992-01-22
SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. However, it has been designed to ultimately accommodate other representations for data such as multiplexed signals and complex matrices. Two user interfaces are provided in SIG - a menu mode for the unfamiliar user and a command mode for more experienced users. In both modes errors are detected as early as possible and are indicated by friendly, meaningful messages. An on-line HELP package is also included. A variety of operations can be performed on time- and frequency-domain signals, including operations on the samples of a signal, operations on the entire signal, and operations on two or more signals. Signal processing operations that can be performed are digital filtering (median, Bessel, Butterworth, and Chebyshev), ensemble average, resample, auto and cross spectral density, transfer function and impulse response, trend removal, convolution, Fourier transform and inverse, window functions (Hamming, Kaiser-Bessel), simulation (ramp, sine, pulse train, random), and read/write signals. User-definable signal processing algorithms are also featured. SIG has many options including multiple commands per line, command files with arguments, commenting lines, defining commands, and automatic execution for each item in a repeat sequence. Graphical operations on signals and spectra include: x-y plots of time signals; real, imaginary, magnitude, and phase plots of spectra; scaling of spectra for continuous or discrete domain; cursor zoom; families of curves; and multiple viewports.
40 CFR Appendix Viii to Part 86 - Aging Bench Equipment and Procedures
Code of Federal Regulations, 2010 CFR
2010-07-01
.... Exhaust System Installation a. The entire catalyst(s)-plus-oxygen-sensor(s) system, together with all... catalysts, the entire catalyst system including all catalysts, all oxygen sensors and the associated exhaust... first catalyst at its longitudinal axis). Alternatively, the feed gas temperature just before the...
40 CFR Appendix Viii to Part 86 - Aging Bench Equipment and Procedures
Code of Federal Regulations, 2011 CFR
2011-07-01
.... Exhaust System Installation a. The entire catalyst(s)-plus-oxygen-sensor(s) system, together with all... catalysts, the entire catalyst system including all catalysts, all oxygen sensors and the associated exhaust... first catalyst at its longitudinal axis). Alternatively, the feed gas temperature just before the...
Genomic imprinting in Drosophila has properties of both mammalian and insect imprinting.
Anaka, Matthew; Lynn, Audra; McGinn, Patrick; Lloyd, Vett K
2009-02-01
Genomic imprinting is a process that marks DNA, causing a change in gene or chromosome behavior, depending on the sex of the transmitting parent. In mammals, most examples of genomic imprinting affect the transcription of individual or small clusters of genes whereas in insects, genomic imprinting tends to silence entire chromosomes. This has been interpreted as evidence of independent evolutionary origins for imprinting. To investigate how these types of imprinting are related, we performed a phenotypic, molecular, and cytological analysis of an imprinted chromosome in Drosophila melanogaster. Analysis of this chromosome reveals that the imprint results in transcriptional silencing. Yet, the domain of transcriptional silencing is very large, extending at least 1.2 Mb and encompassing over 100 genes, and is associated with decreased somatic polytenization of the entire chromosome. We propose that repression of somatic replication in polytenized cells, as a secondary response to the imprint, acts to extend the size of the imprinted domain to an entire chromosome. Thus, imprinting in Drosophila has properties of both typical mammalian and insect imprinting which suggests that genomic imprinting in Drosophila and mammals is not fundamentally different; imprinting is manifest as transcriptional silencing of a few genes or silencing of an entire chromosome depending on secondary processes such as differences in gene density and polytenization.
Howard, Barbara J; Sturner, Raymond
2017-12-01
To describe benefits and problems with screening and addressing developmental and behavioral problems in primary care and using an online clinical process support system as a solution. Screening has been found to have various implementation barriers including time costs, accuracy, workflow and knowledge of tools. In addition, training of clinicians in dealing with identified issues is lacking. Patients disclose more to and prefer computerized screening. An online clinical process support system (CHADIS) shows promise in addressing these issues. Use of a comprehensive panel of online pre-visit screens; linked decision support to provide moment-of-care training; and post-visit activities and resources for patient-specific education, monitoring and care coordination is an efficient way to make the entire process of screening and follow up care feasible in primary care. CHADIS fulfills these requirements and provides Maintenance of Certification credit to physicians as well as added income for screening efforts.
The search for a topographic signature of life.
Dietrich, William E; Perron, J Taylor
2006-01-26
Landscapes are shaped by the uplift, deformation and breakdown of bedrock and the erosion, transport and deposition of sediment. Life is important in all of these processes. Over short timescales, the impact of life is quite apparent: rock weathering, soil formation and erosion, slope stability and river dynamics are directly influenced by biotic processes that mediate chemical reactions, dilate soil, disrupt the ground surface and add strength with a weave of roots. Over geologic time, biotic effects are less obvious but equally important: biota affect climate, and climatic conditions dictate the mechanisms and rates of erosion that control topographic evolution. Apart from the obvious influence of humans, does the resulting landscape bear an unmistakable stamp of life? The influence of life on topography is a topic that has remained largely unexplored. Erosion laws that explicitly include biotic effects are needed to explore how intrinsically small-scale biotic processes can influence the form of entire landscapes, and to determine whether these processes create a distinctive topography.
Resource allocation processes at multilateral organizations working in global health
Chi, Y-Ling; Bump, Jesse B
2018-01-01
International institutions provide well over US$10 billion in development assistance for health (DAH) annually, and between 1990 and 2014 DAH disbursements totaled $458 billion. But how do they decide who gets what, and for what purpose? In this article, we explore how allocation decisions were made by the nine convening agencies of the Equitable Access Initiative. We provide clear, plain-language descriptions of the complete process from resource mobilization to allocation for the nine multilateral agencies with prominent agendas in global health. Then, through a comparative analysis, we illuminate the choices and strategies employed in the nine international institutions. We find that resource allocation in all reviewed institutions follows a similar pattern, which we categorized in a framework of five steps: strategy definition, resource mobilization, eligibility of countries, support type and funds allocation. All the reviewed institutions generate resource allocation decisions through well-structured and fairly complex processes. Variations in those processes seem to reflect differences in institutional principles and goals. However, these processes have serious shortcomings. Technical problems include inadequate flexibility to account for or meet country needs. Although aid effectiveness and value for money are commonly referenced, we find that neither performance nor impact is a major criterion for allocating resources. We found very little formal consideration of the incentives generated by allocation choices. Political issues include non-transparent influence on allocation processes by donors and bureaucrats, and the common practice of earmarking funds to bypass the normal allocation process entirely. Ethical deficiencies include low accountability and transparency at international institutions, and limited participation by affected citizens or their representatives.
We find that recipient countries have low influence on allocation processes themselves, although within these processes they have some influence in relatively narrow areas. PMID:29415239
Bone as a source of organism vitality and regeneration.
Mackiewicz, Zygmunt; Niklińska, Wiesława Ewa; Kowalewska, Jolanta; Chyczewski, Lech
2011-01-01
The most important features that determine the vital role of bone include: a) a continuous supply of calcium, which is indispensable for every cell of the entire organism at all times, and b) the delivery of circulating blood cells and some adult stem cells to keep the body vigorous, ready for self-reparation, and continuously rebuilding throughout life. These functions of bones are no less important than protecting the body cavities, serving as mechanical levers connected to the muscles, and determining the shape and dimensions of the entire organism. The aim of this review was to address some basic cellular and molecular knowledge to better understand the complex interactions of bone structural components. Understanding of osteoblast differentiation and its local regulation has substantially increased in recent years. It has been suggested that osteocytes, cells within the bone matrix, act as regulatory mechanosensors. Therefore immobility as well as limited activity has a dramatic effect on bone structure and influences a broad spectrum of bone physiology-related functions as well as the functions of many other organs. Lifelong bone rebuilding is modulated through several pathways, including the Wnt pathway that regulates bone formation and resorption. In the adult skeleton, bone is continuously renewed in response to a variety of stimuli, such as the specific process of remodeling dependent on RANK/RANKL/OPG interactions. Better understanding of bone biology provides opportunities for the development of more effective prevention and treatment modalities for a variety of bone diseases, including new approaches to adult stem cell-based therapies.
Intrinsically organized network for word processing during the resting state.
Zhao, Jizheng; Liu, Jiangang; Li, Jun; Liang, Jimin; Feng, Lu; Ai, Lin; Lee, Kang; Tian, Jie
2011-01-03
Neural mechanisms underlying word processing have been extensively studied. It has been revealed that when individuals are engaged in active word processing, a complex network of cortical regions is activated. However, it is entirely unknown whether the word-processing regions are intrinsically organized without any explicit processing tasks during the resting state. The present study investigated the intrinsic functional connectivity between word-processing regions during the resting state with the use of fMRI methodology. The low-frequency fluctuations were observed between the left middle fusiform gyrus and a number of cortical regions. They included the left angular gyrus, left supramarginal gyrus, bilateral pars opercularis, and left pars triangularis of the inferior frontal gyrus, which have been implicated in phonological and semantic processing. Additionally, the activations were also observed in the bilateral superior parietal lobule and dorsal lateral prefrontal cortex, which have been suggested to provide top-down monitoring on the visual-spatial processing of words. The findings of our study indicate an intrinsically organized network during the resting state that likely prepares the visual system to anticipate the highly probable word input for ready and effective processing. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
The Careful Puppet Master: Reducing risk and fortifying acceptance testing with Jenkins CI
NASA Astrophysics Data System (ADS)
Smith, Jason A.; Richman, Gabriel; DeStefano, John; Pryor, James; Rao, Tejas; Strecker-Kellogg, William; Wong, Tony
2015-12-01
Centralized configuration management, including the use of automation tools such as Puppet, can greatly increase provisioning speed and efficiency when configuring new systems or making changes to existing systems, reduce duplication of work, and improve automated processes. However, centralized management also brings with it a level of inherent risk: a single change in just one file can quickly be pushed out to thousands of computers and, if that change is not properly and thoroughly tested and contains an error, could result in catastrophic damage to many services, potentially bringing an entire computer facility offline. Change management procedures can—and should—be formalized in order to prevent such accidents. However, like the configuration management process itself, if such procedures are not automated, they can be difficult to enforce strictly. Therefore, to reduce the risk of merging potentially harmful changes into our production Puppet environment, we have created an automated testing system, which includes the Jenkins CI tool, to manage our Puppet testing process. This system includes the proposed changes and runs Puppet on a pool of dozens of RedHat Enterprise Virtualization (RHEV) virtual machines (VMs) that replicate most of our important production services for the purpose of testing. This paper describes our automated test system and how it hooks into our production approval process for automatic acceptance testing. All pending changes that have been pushed to production must pass this validation process before they can be approved and merged into production.
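The gating logic described above, in which a proposed change must pass automated runs on the entire test VM pool before it can be approved and merged into production, can be sketched in a few lines. All names below are illustrative stand-ins, not part of the Jenkins or Puppet APIs; in the real system each check would be a no-op Puppet run on a RHEV VM:

```python
# Toy model of the acceptance-testing gate: a change is merged into
# production only if it succeeds on every VM in the test pool.
# All function and data names are hypothetical.

def noop_run_succeeds(change, vm):
    """Stand-in for a test Puppet run of the change on one VM."""
    return vm not in change.get("breaks", set())

def can_merge(change, vm_pool):
    """A change is approved only if it passes on the entire pool."""
    return all(noop_run_succeeds(change, vm) for vm in vm_pool)

vm_pool = ["web01", "db01", "batch01"]
good_change = {"id": 1, "breaks": set()}
bad_change = {"id": 2, "breaks": {"db01"}}  # would damage one service

print(can_merge(good_change, vm_pool))  # True
print(can_merge(bad_change, vm_pool))   # False
```

The point of the design is captured by `all(...)`: one failing service is enough to block the merge, which is how a single bad change is kept from reaching thousands of machines.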
Vertical electrostatic actuator with extended digital range via tailored topology
NASA Astrophysics Data System (ADS)
Zhang, Yanhang; Dunn, Martin L.
2002-07-01
We describe the design, fabrication, and testing of an electrostatic vertical actuator that exhibits a range of motion covering the entire initial gap between the actuator and substrate and provides controllable digital output motion. This is obtained by spatially tailoring the electrode arrangement and the stiffness characteristics of the microstructure to control the voltage-deflection characteristics. The concept is based on the electrostatic pull-down of bimaterial beams via a series of electrodes attached to the beams by flexures with tailored stiffness characteristics. The range of travel of the actuator is defined by the post-release deformed shape of the bilayer beams, and can be controlled by a post-release heat-treatment process combined with a tailored actuator topology (material distribution and geometry, including spatial geometrical patterning of the individual layers of the bilayer beams). Not only does this allow an increase in the range of travel to cover the entire initial gap, but it also permits digital control of the tip of the actuator, which can be designed to yield linear displacement versus pull-in step characteristics. We fabricated these actuators using the MUMPs surface micromachining process and packaged them in-house. Using an interferometric microscope, we measured full-field deformed shapes of the actuator at each pull-in step. The measurements compare well with companion simulation results, both qualitatively and quantitatively.
Whole-system carbon balance for a regional temperate forest in Northern Wisconsin, USA
NASA Astrophysics Data System (ADS)
Peckham, S. D.; Gower, S. T.
2010-12-01
The whole-system (biological + industrial) carbon (C) balance was estimated for the Chequamegon-Nicolet National Forest (CNNF), a temperate forest covering 600,000 ha in Northern Wisconsin, USA. The biological system was modeled using a spatially-explicit version of the ecosystem process model Biome-BGC. The industrial system was modeled using life cycle inventory (LCI) models for wood and paper products. Biome-BGC was used to estimate net primary production, net ecosystem production (NEP), and timber harvest (H) over the entire CNNF. The industrial carbon budget (Ci) was estimated by applying LCI models of CO2 emissions resulting from timber harvest and production of specific wood and paper products in the CNNF region. In 2009, simulated NEP of the CNNF averaged 3.0 tC/ha and H averaged 0.1 tC/ha. Despite model uncertainty, the CNNF region is likely a carbon sink (NEP - Ci > 0), even when CO2 emissions from timber harvest and production of wood and paper products are included in the calculation of the entire forest system C budget.
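The sink criterion quoted above, NEP - Ci > 0, is simple arithmetic over the per-hectare figures. The sketch below uses the 2009 values reported in the abstract (NEP of 3.0 tC/ha over 600,000 ha); the industrial emission rate Ci is not reported per hectare, so the value here is an assumption for illustration only:

```python
# Whole-system carbon balance check: the region is a sink if NEP - Ci > 0.
# nep_per_ha comes from the abstract; ci_per_ha is an ASSUMED
# illustrative value, not a reported result.

area_ha = 600_000        # CNNF area from the abstract
nep_per_ha = 3.0         # simulated net ecosystem production, tC/ha (2009)
ci_per_ha = 0.05         # assumed industrial emissions, tC/ha

net_per_ha = nep_per_ha - ci_per_ha
total_sink_tC = net_per_ha * area_ha

print(net_per_ha > 0)    # True -> net carbon sink under these assumptions
print(total_sink_tC)
```

Under these assumed numbers the whole-system balance stays positive, illustrating the abstract's conclusion that industrial emissions reduce, but do not eliminate, the biological sink.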
Design Evolution and Performance Characterization of the GTX Air-Breathing Launch Vehicle Inlet
NASA Technical Reports Server (NTRS)
DeBonis, J. R.; Steffen, C. J., Jr.; Rice, T.; Trefny, C. J.
2002-01-01
The design and analysis of a second version of the inlet for the GTX rocket-based combined-cycle launch vehicle is discussed. The previous design did not achieve its predicted performance levels due to excessive turning of low-momentum corner flows and local over-contraction due to asymmetric end-walls. This design attempts to remove these problems by reducing the spike half-angle from 12 to 10 degrees and by implementing true plane-of-symmetry end-walls. Axisymmetric Reynolds-Averaged Navier-Stokes simulations using both perfect-gas and real-gas, finite-rate chemistry assumptions were performed to aid in the design process and to create a comprehensive database of inlet performance. The inlet design, which operates over the entire air-breathing Mach number range from 0 to 12, and the performance database are presented. The performance database, for use in cycle analysis, includes predictions of mass capture, pressure recovery, throat Mach number, drag force, and heat load for the entire Mach range. Results of the computations are compared with experimental data to validate the performance database.
Kuok, Sin-Chi; Yuen, Ka-Veng
2013-01-01
The goal of this study is to investigate the structural performance of a reinforced concrete building under the influence of a severe typhoon. For this purpose, full-scale monitoring of a 22-story reinforced concrete building was conducted during the entire passage of the severe typhoon Vicente. Vicente was the eighth tropical storm to develop in the Western North Pacific Ocean and the South China Sea in 2012, and it was the strongest and most devastating typhoon to strike Macao since 1999. The typhoon-affected period lasted more than 70 hours, and the typhoon eye region covered Macao for around one hour. Wind and structural response measurements were acquired throughout the entire typhoon-affected period. The wind characteristics were analyzed using the measured wind data, including the wind speed and wind direction time histories. In addition, the structural response measurements of the monitored building were utilized for modal identification using the Bayesian spectral density approach. Detailed analysis of the field data and the typhoon-generated effects on the structural performance are discussed.
Dorn, Stan; Shang, Baoping
2012-02-01
Fewer than one-third of eligible Medicare beneficiaries enroll in Medicare savings programs, which pay premiums and, in some cases, eliminate out-of-pocket cost sharing for poor and near-poor enrollees. Many beneficiaries don't participate in savings programs because they must complete a cumbersome application process, including a burdensome asset test. We demonstrate that a streamlined alternative to the asset test, allowing seniors to qualify for Medicare savings programs by providing evidence of limited assets or by showing a lack of investment income, would permit 78 percent of currently eligible seniors to bypass the asset test entirely. This simplified approach would increase the number of beneficiaries who qualify for Medicare savings programs from the current 3.6 million seniors to 4.6 million. Such an alternative would keep benefits targeted to people with low assets, eliminate costly administrative expenses and obstacles to enrollment associated with the asset test, and avoid the much larger influx of seniors that would occur if the asset test were eliminated entirely.
The dishonest dean's letter: an analysis of 532 dean's letters from 99 U.S. medical schools.
Edmond, M; Roberson, M; Hasan, N
1999-09-01
To quantify the censure of potentially negative information in dean's letters. Concordance between 532 dean's letters and the corresponding transcripts was determined for six variables (failing grade in a preclinical course, marginal preclinical course grade, failing grade for a clinical rotation, marginal clinical rotation grade, leave of absence, and requirement to repeat an entire year of medical school). The evaluated variables were not found in the dean's letters 27% to 50% of the time that they were present on the transcripts. In three of nine instances (33%), a failing grade in a clinical rotation was not included. Four students had been required to repeat an entire year, but this was noted in only two cases. In toto, 35 of 104 (34%) of the variables identified on the transcripts were not reported. In addition, deans were significantly less likely to report a student's USMLE 1 score if the score was at or below the 20th percentile (p = .03). Some deans suppress negative information in their letters and potentially obfuscate the residency selection process.
Is the perception of 3D shape from shading based on assumed reflectance and illumination?
Todd, James T; Egan, Eric J L; Phillips, Flip
2014-01-01
The research described in the present article was designed to compare three types of image shading: one generated with a Lambertian BRDF and homogeneous illumination such that image intensity was determined entirely by local surface orientation irrespective of position; one that was textured with a linear intensity gradient, such that image intensity was determined entirely by local surface position irrespective of orientation; and another that was generated with a Lambertian BRDF and inhomogeneous illumination such that image intensity was influenced by both position and orientation. A gauge figure adjustment task was used to measure observers' perceptions of local surface orientation on the depicted surfaces, and the probe points included 60 pairs of regions that both had the same orientation. The results show clearly that observers' perceptions of these three types of stimuli were remarkably similar, and that probe regions with similar apparent orientations could have large differences in image intensity. This latter finding is incompatible with any process for computing shape from shading that assumes any plausible reflectance function combined with any possible homogeneous illumination.
NASA Technical Reports Server (NTRS)
Carnahan, Richard S., Jr.; Corey, Stephen M.; Snow, John B.
1989-01-01
Applications of rapid prototyping and Artificial Intelligence techniques to problems associated with Space Station-era information management systems are described. In particular, the work centers on issues related to: (1) intelligent man-machine interfaces applied to scientific data user support, and (2) the requirement that intelligent information management systems (IIMS) be able to efficiently process metadata updates concerning the types of data handled. The advanced IIMS represents functional capabilities driven almost entirely by the needs of potential users. The volume of scientific data projected to be generated in the Space Station era is likely to be significantly greater than the data volume currently processed and analyzed. Information about scientific data must be presented clearly, concisely, and with support features that allow users at all levels of expertise efficient and cost-effective data access. Additionally, mechanisms for more efficient IIMS metadata update processes must be addressed. The work reported covers the following IIMS design aspects: IIMS data and metadata modeling, including the automatic updating of IIMS-contained metadata; IIMS user-system interface considerations, including significant problems associated with remote access, user profiles, and on-line tutorial capabilities; and development of an IIMS query and browse facility, including the capability to deal with spatial information. A working prototype has been developed and is being enhanced.
Certification of vapor phase hydrogen peroxide sterilization process for spacecraft application
NASA Technical Reports Server (NTRS)
Rohatgi, N.; Schubert, W.; Koukol, R.; Foster, T. L.; Stabekis, P. D.
2002-01-01
This paper describes the selection process and research activities JPL is planning to conduct for certification of hydrogen peroxide as a NASA approved technique for sterilization of various spacecraft parts/components and entire modern spacecraft.
Variable dynamic testbed vehicle : safety plan
DOT National Transportation Integrated Search
1997-02-01
This safety document covers the entire safety process from inception to delivery of the Variable Dynamic Testbed Vehicle. In addition to addressing the process of safety on the vehicle , it should provide a basis on which to build future safety proce...
NASA Astrophysics Data System (ADS)
Richardson, M.; Kumar, P.
2016-12-01
The critical zone (CZ) includes the biophysical processes occurring from the top of the vegetation canopy to the weathering zone below the groundwater table. CZ services provide a measure for the goods and benefits derived from CZ processes. In intensively managed landscapes (IML), the provisioning, supporting, and regulating services are altered through anthropogenic energy inputs to derive more productivity, as agricultural products, from these landscapes than would be possible under natural conditions. However, the energy or cost equivalents of alterations to CZ functions within landscape profiles are unknown. The valuation of CZ services in energy or monetary terms provides a more concrete tool for characterizing seemingly abstract environmental damages from agricultural production systems. A multi-layer canopy-root-soil model is combined with nutrient and water flux models to simulate the movement of nutrients throughout the soil system. These data enable the measurement of agricultural anthropogenic impacts to the CZ's nutrient cycling supporting services and atmospheric stabilizing regulating services defined by the flux of carbon and nutrients. Such measurements include soil carbon storage, soil carbon respiration, nitrate leaching, and nitrous oxide flux into the atmosphere. Additionally, the socioeconomic values of corn feed and ethanol define the primary productivity supporting services of each crop use. In the debate between feed production and corn-based ethanol production, measured nutrient CZ services can cost up to four times more than traditionally estimated CO2 equivalences for the entire bioenergy production system. Energy efficiency, in addition to environmental impacts, demonstrates why the inclusion of CZ services is necessary in accounting for the entire life cycle of agricultural production systems. These results indicate that feed production systems are more energy efficient and less environmentally costly than corn-based ethanol systems.
Brusletto, Birgit; Torp, Steffen; Ihlebæk, Camilla Martha; Vinje, Hege Forbech
2018-06-01
We investigated persons who survived cancer (PSC) and their experiences in returning to sustainable work. Videotaped, qualitative, in-depth interviews with previous cancer patients were analyzed directly using "Interpretative Phenomenological Analysis" (IPA). Four men and four women aged 42-59 years participated. Mean time since last treatment was nine years. All participants had worked for more than 3 years when interviewed. An advisory team of seven members with diverse cancer experiences contributed as co-researchers. The entire trajectory from cancer diagnosis until achievement of sustainable work was analogous to a journey, and a process model comprising five phases was developed, including personal situations, treatments, and work issues. The theme "return-to-work" (RTW) turned out to be difficult to separate from the entire journey that started at the time of diagnosis. PSCs were mainly concerned about fighting for life in phases 1 and 2. In phases 3 and 4, some participants had to adjust and make changes at work more than once over a period of 1-10 years before reaching sustainable work in phase 5. Overall, the ability to adapt to new circumstances, take advantage of emerging opportunities, and finding meaningful occupational activities were crucial. Our process model may be useful as a tool when discussing the future working life of PSCs. Every individual's journey towards sustainable work was unique, and contained distinct and long-lasting efforts and difficulties. The first attempt to RTW after cancer may not be persistent. Copyright © 2018 Elsevier Ltd. All rights reserved.
Processes of contaminant accumulation in an Arctic beluga whale population
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hickie, B.E.; Muir, D.; Kingsley, M.
1995-12-31
As long-lived top predators in marine food chains, marine mammals accumulate high levels of persistent organic contaminants. While arctic marine mammal contaminant concentrations are lower than those from temperate regions, levels are sufficiently high to be a health concern to people who rely on marine mammals as food. Monitoring programs developed to address this problem and to define spatial and temporal trends often are difficult to interpret since tissue contaminant concentrations vary with species, age, sex, reproductive effort, and condition (i.e., blubber thickness). It can be difficult to relate contaminant concentrations in other environmental compartments to those in marine mammals since their residues reflect exposure over their entire life, often 20 to 30 years. Contaminant accumulation models for marine mammals enable us to better understand the importance of, and interaction between, factors affecting contaminant accumulation, and can provide a dynamic framework for interpreting contaminant monitoring data. The authors developed two models for the beluga whale (Delphinapterus leucas): one provides a detailed view of processes at the individual level, the other examines population-based processes. The models quantify uptake, release, and disposition of organic contaminants over the entire lifespan by incorporating all aspects of life-history. These models are used together to examine the impact of a variety of factors on patterns and variability of PCBs found in the West Greenland beluga population (sample size: 696, 729). Factors examined include: energetics, growth, birth rate, lactation, contaminant assimilation and clearance rates, and dietary contaminant concentrations. Results are discussed in relation to the use of marine mammals for monitoring contaminant trends.
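The accumulation dynamics such models quantify, dietary uptake balanced against clearance over the animal's lifespan, are often approximated by a one-compartment, first-order toxicokinetic model. The sketch below uses that standard textbook form with invented parameter values; it is not the beluga model described in the abstract:

```python
import math

# One-compartment, first-order bioaccumulation sketch:
#     dC/dt = uptake_rate - k * C
# stepped exactly year by year over a lifespan. Parameter values
# are illustrative only, not taken from the beluga models.

def body_burden(uptake_rate, k, years):
    """Contaminant concentration after `years` of exposure, from zero."""
    c = 0.0
    steady_state = uptake_rate / k
    for _ in range(years):
        # exact one-year solution of dC/dt = uptake_rate - k*C
        c = c * math.exp(-k) + steady_state * (1.0 - math.exp(-k))
    return c

# Concentration approaches the steady state uptake_rate / k with age,
# which is why residues reflect exposure over the entire 20-30 year life.
print(round(body_burden(uptake_rate=2.0, k=0.1, years=25), 3))
```

With these assumed rates a 25-year-old animal sits at about 92% of its steady-state burden, illustrating why age must be accounted for when interpreting monitoring data.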
PACS archive upgrade and data migration: clinical experiences
NASA Astrophysics Data System (ADS)
Liu, Brent J.; Documet, Luis; Sarti, Dennis A.; Huang, H. K.; Donnelly, John
2002-05-01
Saint John's Health Center PACS data volumes have increased dramatically since the hospital became filmless in April of 1999. This is due in part to continuous image accumulation and to the integration of a new multi-slice detector CT scanner into PACS. The original PACS archive would not be able to handle the distribution and archiving load and capacity in the near future. Furthermore, there is no secondary copy backup of all the archived PACS image data for disaster recovery purposes. The purpose of this paper is to present a clinical and technical process template to upgrade and expand the PACS archive, migrate existing PACS image data to the new archive, and provide a backup and disaster recovery function not currently available. Discussion of the technical and clinical pitfalls and challenges involved in this process will be presented as well. The server hardware configuration was upgraded and a secondary backup implemented for disaster recovery. The upgrade includes new software versions, database reconfiguration, and installation of a new tape jukebox to replace the current MOD jukebox. Upon completion, all PACS image data from the original MOD jukebox was migrated to the new tape jukebox and verified. The migration was performed continuously in the background during clinical operation. Once the data migration was completed, the MOD jukebox was removed. All newly acquired PACS exams are now archived to the new tape jukebox. All PACS image data residing on the original MOD jukebox have been successfully migrated into the new archive. In addition, a secondary backup of all PACS image data has been implemented for disaster recovery and has been verified using disaster scenario testing. No PACS image data was lost during the entire process and there was very little clinical impact during the entire upgrade and data migration.
Some of the pitfalls and challenges during this upgrade process included hardware reconfiguration for the original archive server, clinical downtime involved with the upgrade, and data migration planning to minimize impact on clinical workflow. The impact was minimized with a downtime contingency plan.
NASA Technical Reports Server (NTRS)
Gauthier, M. K.; Miller, E. L.; Shumka, A.
1980-01-01
Laser-Scanning System pinpoints imperfections in solar cells. Entire solar panels containing large numbers of cells can be scanned. Although technique is similar to use of scanning electron microscope (SEM) to locate microscopic imperfections, it differs in that large areas may be examined, including entire solar panels, and it is not necessary to remove cover glass or encapsulants.
78 FR 23837 - Cranes and Derricks in Construction: Underground Construction and Demolition
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-23
... report here the results for the entire heavy-and-civil engineering sector and the entire site... in this final rule because employers included in the heavy-and-civil engineering sector, or the site... final rule. This final rule affects two construction sectors: NAICS 237990 (Other Heavy and Civil...
Expanding the chemical information science gateway.
Bajorath, Jürgen
2017-01-01
Broadly defined, chemical information science (CIS) covers chemical structure and data analysis including biological activity data as well as processing, organization, and retrieval of any form of chemical information. The CIS Gateway (CISG) of F1000Research was created to communicate research involving the entire spectrum of chemical information, including chem(o)informatics. CISG provides a forum for high-quality publications and a meaningful alternative to conventional journals. This gateway is supported by leading experts in the field recognizing the need for open science and a flexible publication platform enabling off-the-beaten path contributions. This editorial aims to further rationalize the scope of CISG, position it within its scientific environment, and open it up to a wider audience. Chemical information science is an interdisciplinary field with high potential to interface with experimental work.
Weck, Florian; Grikscheit, Florian; Höfling, Volkmar; Stangier, Ulrich
2014-07-01
The evaluation of treatment integrity (therapist adherence and competence) is a necessary condition to ensure the internal and external validity of psychotherapy research. However, the evaluation process is associated with high costs, because therapy sessions must be rated by experienced clinicians. It is debatable whether rating session segments is an adequate alternative to rating entire sessions. Four judges evaluated treatment integrity (i.e., therapist adherence and competence) in 84 randomly selected videotapes of cognitive-behavioral therapy for major depressive disorder, social anxiety disorder, and hypochondriasis (from three different treatment outcome studies). In each case, two judges provided ratings based on entire therapy sessions and two on session segments only (i.e., the middle third of the entire sessions). Interrater reliability of adherence and competence evaluations proved satisfactory for ratings based on segments and the level of reliability did not differ from ratings based on entire sessions. Ratings of treatment integrity that were based on entire sessions and session segments were strongly correlated (r=.62 for adherence and r=.73 for competence). The relationship between treatment integrity and outcome was comparable for ratings based on session segments and those based on entire sessions. However, significant relationships between therapist competence and therapy outcome were only found in the treatment of social anxiety disorder. Ratings based on segments proved to be adequate for the evaluation of treatment integrity. The findings demonstrate that session segments are an adequate and cost-effective alternative to entire sessions for the evaluation of therapist adherence and competence. Copyright © 2014. Published by Elsevier Ltd.
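The segment-versus-full-session comparison above rests on correlating two sets of ratings (the abstract reports r = .62 for adherence and r = .73 for competence). A minimal Pearson correlation over invented ratings, not data from the study, looks like this:

```python
from math import sqrt

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two rating lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical adherence ratings for eight sessions, scored twice:
# once from the entire session and once from its middle third.
full_session = [5.0, 3.5, 4.0, 2.5, 4.5, 3.0, 5.0, 2.0]
segment_only = [4.5, 3.0, 4.5, 2.0, 4.0, 3.5, 5.0, 2.5]

r = pearson(full_session, segment_only)
print(round(r, 2))  # 0.9 for these made-up ratings
```

A correlation of this size between segment-based and full-session scores is the kind of evidence the study uses to argue that rating the middle third is an adequate, cheaper substitute for rating whole sessions.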
Levenson, Steven A; Desai, Abhilash K
2017-04-01
Despite much attention including national initiatives, concerns remain about the approaches to managing behavior symptoms and psychiatric conditions across all settings, including in long-term care settings such as nursing homes and assisted living facilities. One key reason why problems persist is because most efforts to "reform" and "correct" the situation have failed to explore or address root causes and instead have promoted inadequate piecemeal "solutions." Further improvement requires jumping off the bandwagon and rethinking the entire issue, including recognizing and applying key concepts of clinical reasoning and the care delivery process to every situation. The huge negative impact of cognitive biases and rote approaches on related clinical problem solving and decision making and patient outcomes also must be addressed. Copyright © 2017 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.
Technical Writing: Process and Product. Third Edition.
ERIC Educational Resources Information Center
Gerson, Sharon J.; Gerson, Steven M.
This book guides students through the entire writing process--prewriting, writing, and rewriting--developing an easy-to-use, step-by-step technique for writing the types of documents they will encounter on the job. It engages students in the writing process and encourages hands-on application as well as discussions about ethics, audience…
A PROCESS FOR DEVELOPING AND EVALUATING INDICES OF FISH ASSEMBLAGE INTEGRITY
We describe a general process for developing an index of fish assemblage integrity, using the Willamette Valley of Oregon, U.S.A., as an example. Such an index is useful for assessing the effects of humans on entire fish assemblages, and the general process can be applied to any ...
Theory-Driven Process Evaluation of a Complementary Feeding Trial in Four Countries
ERIC Educational Resources Information Center
Newman, Jamie E.; Garces, Ana; Mazariegos, Manolo; Hambidge, K. Michael; Manasyan, Albert; Tshefu, Antoinette; Lokangaka, Adrien; Sami, Neelofar; Carlo, Waldemar A.; Bose, Carl L.; Pasha, Omrana; Goco, Norman; Chomba, Elwyn; Goldenberg, Robert L.; Wright, Linda L.; Koso-Thomas, Marion; Krebs, Nancy F.
2014-01-01
We conducted a theory-driven process evaluation of a cluster randomized controlled trial comparing two types of complementary feeding (meat versus fortified cereal) on infant growth in Guatemala, Pakistan, Zambia and the Democratic Republic of Congo. We examined process evaluation indicators for the entire study cohort (N = 1236) using chi-square…
NEAT1 Scaffolds RNA Binding Proteins and the Microprocessor to Globally Enhance Pri-miRNA Processing
Jiang, Li; Shao, Changwei; Wu, Qi-Jia; Chen, Geng; Zhou, Jie; Yang, Bo; Li, Hairi; Gou, Lan-Tao; Zhang, Yi; Wang, Yangming; Yeo, Gene W.; Zhou, Yu; Fu, Xiang-Dong
2018-01-01
MicroRNA biogenesis is known to be modulated by a variety of RNA binding proteins (RBPs), but in most cases, individual RBPs appear to influence the processing of a small subset of target miRNAs. We herein report that the RNA binding NONO/PSF heterodimer binds a large number of expressed pri-miRNAs in HeLa cells to globally enhance pri-miRNA processing by the Drosha/DGCR8 Microprocessor. Because NONO/PSF are key components of paraspeckles organized by the lncRNA NEAT1, we further demonstrate that NEAT1 also has a profound effect on global pri-miRNA processing. Mechanistic dissection reveals that NEAT1 broadly interacts with NONO/PSF as well as many other RBPs, and that multiple RNA segments in NEAT1, including a “pseudo pri-miRNA” near its 3′ end, help attract the Microprocessor. These findings suggest a bird nest model for a large non-coding RNA to orchestrate efficient processing of almost an entire class of small non-coding RNAs in the nucleus. PMID:28846091
Jiang, Li; Shao, Changwei; Wu, Qi-Jia; Chen, Geng; Zhou, Jie; Yang, Bo; Li, Hairi; Gou, Lan-Tao; Zhang, Yi; Wang, Yangming; Yeo, Gene W; Zhou, Yu; Fu, Xiang-Dong
2017-10-01
MicroRNA (miRNA) biogenesis is known to be modulated by a variety of RNA-binding proteins (RBPs), but in most cases, individual RBPs appear to influence the processing of a small subset of target miRNAs. Here, we report that the RNA-binding NONO-PSF heterodimer binds a large number of expressed pri-miRNAs in HeLa cells to globally enhance pri-miRNA processing by the Drosha-DGCR8 Microprocessor. NONO and PSF are key components of paraspeckles organized by the long noncoding RNA (lncRNA) NEAT1. We further demonstrate that NEAT1 also has a profound effect on global pri-miRNA processing. Mechanistic dissection reveals that NEAT1 broadly interacts with the NONO-PSF heterodimer as well as many other RBPs and that multiple RNA segments in NEAT1, including a 'pseudo pri-miRNA' near its 3' end, help attract the Microprocessor. These findings suggest a 'bird nest' model in which an lncRNA orchestrates efficient processing of potentially an entire class of small noncoding RNAs in the nucleus.
Cognitive Design for Learning: Cognition and Emotion in the Design Process
ERIC Educational Resources Information Center
Hasebrook, Joachim
2016-01-01
We are so used to accepting new technologies as the driver of change and innovation in human computer interfaces (HCI). In our research we focus on the development of innovations as a design process--or design, for short. We also refer to the entire process of creating innovations and putting them to use as "cognitive processes"--or…
NASA Astrophysics Data System (ADS)
Ohnaka, M.
2004-12-01
For the past four decades, great progress has been made in understanding earthquake source processes. In particular, recent progress in the field of the physics of earthquakes has contributed substantially to unraveling the earthquake generation process in quantitative terms. Yet, a fundamental problem remains unresolved in this field. The constitutive law that governs the behavior of earthquake ruptures is the basis of earthquake physics, and the governing law plays a fundamental role in accounting for the entire process of an earthquake rupture, from its nucleation to its dynamic propagation to its arrest, quantitatively in a unified and consistent manner. Therefore, without establishing the rational constitutive law, the physics of earthquakes cannot be a quantitative science in a true sense, and hence it is urgent to establish it. However, what the constitutive law for earthquake ruptures ought to be, and how it should be formulated, has been controversial for the past two decades and remains so. Resolving the controversy is a necessary step towards a more complete, unified theory of earthquake physics, and now the time is ripe to do so. Because of its fundamental importance, we have to discuss thoroughly and rigorously what the constitutive law ought to be from the standpoint of the physics of rock friction and fracture on the basis of solid evidence. There are prerequisites for the constitutive formulation. The brittle, seismogenic layer and individual faults therein are characterized by inhomogeneity, and fault inhomogeneity has profound implications for earthquake ruptures. In addition, rupture phenomena including earthquakes are inherently scale dependent; indeed, some of the physical quantities inherent in rupture exhibit scale dependence.
To treat scale-dependent physical quantities inherent in the rupture over a broad scale range quantitatively in a unified and consistent manner, it is critical to formulate the governing law properly so as to incorporate the scaling property. Thus, the properties of fault inhomogeneity and physical scaling are indispensable prerequisites to be incorporated into the constitutive formulation. Thorough discussion in this context necessarily leads to the consistent conclusion that the constitutive law must be formulated in such a manner that the shear traction is a primary function of the slip displacement, with the secondary effect of slip rate or stationary contact time. This constitutive formulation makes it possible to account for the entire process of an earthquake rupture over a broad scale range quantitatively in a unified and consistent manner.
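The abstract's central claim, that shear traction should be formulated as a primary function of slip displacement, can be illustrated with a generic slip-weakening law. The exponential form and all parameter values below are a textbook-style sketch, not Ohnaka's specific formulation, and the secondary slip-rate effect mentioned in the abstract is omitted:

```python
import math

def shear_traction(D, tau_p=6.0, tau_r=1.0, D_c=0.4):
    """Exponential slip-weakening: shear traction (MPa, illustrative) decays
    from a peak value tau_p to a residual value tau_r as slip displacement D (m)
    accumulates, over a characteristic slip distance D_c."""
    return tau_r + (tau_p - tau_r) * math.exp(-D / D_c)

# Traction is a primary, monotonically weakening function of slip displacement.
taus = [shear_traction(D) for D in (0.0, 0.2, 1.0)]
```

In such a formulation the scale dependence discussed in the abstract enters through the characteristic slip D_c, which grows with the size of the rupture.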
General intelligence predicts memory change across sleep.
Fenn, Kimberly M; Hambrick, David Z
2015-06-01
Psychometric intelligence (g) is often conceptualized as the capability for online information processing but it is also possible that intelligence may be related to offline processing of information. Here, we investigated the relationship between psychometric g and sleep-dependent memory consolidation. Participants studied paired-associates and were tested after a 12-hour retention interval that consisted entirely of wake or included a regular sleep phase. We calculated the number of word-pairs that were gained and lost across the retention interval. In a separate session, participants completed a battery of cognitive ability tests to assess g. In the wake group, g was not correlated with either memory gain or memory loss. In the sleep group, we found that g correlated positively with memory gain and negatively with memory loss. Participants with a higher level of general intelligence showed more memory gain and less memory loss across sleep. Importantly, the correlation between g and memory loss was significantly stronger in the sleep condition than in the wake condition, suggesting that the relationship between g and memory loss across time is specific to time intervals that include sleep. The present research suggests that g not only reflects the capability for online cognitive processing, but also reflects capability for offline processes that operate during sleep.
Kenngott, H G; Wagner, M; Preukschas, A A; Müller-Stich, B P
2016-12-01
Modern operating room (OR) suites are mostly digitally connected but until now the primary focus has been on the presentation, transfer and distribution of images. Device information and processes within the operating theaters are barely considered. Cognitive assistance systems have triggered a fundamental rethinking in the automotive industry as well as in logistics. In principle, tasks in the OR, some of which are highly repetitive, also have great potential to be supported by automated cognitive assistance via a self-thinking system. This includes the coordination of the entire workflow in the perioperative process in both the operating theater and the whole hospital. With corresponding data from hospital information systems, medical devices and appropriate models of the surgical process, intelligent systems could optimize the workflow in the operating theater in the near future and support the surgeon. Preliminary results on the use of device information and automatically controlled OR suites are already available. Such systems include, for example, the guidance of laparoscopic camera systems. Nevertheless, cognitive assistance systems that make use of knowledge about patients, processes and other pieces of information to improve surgical treatment are not yet available in the clinical routine but are urgently needed in order to automatically assist the surgeon in situation-related activities and thus substantially improve patient care.
Kassiopeia: a modern, extensible C++ particle tracking package
DOE Office of Scientific and Technical Information (OSTI.GOV)
Furse, Daniel; Groh, Stefan; Trost, Nikolaus
The Kassiopeia particle tracking framework is an object-oriented software package using modern C++ techniques, written originally to meet the needs of the KATRIN collaboration. Kassiopeia features a new algorithmic paradigm for particle tracking simulations which targets experiments containing complex geometries and electromagnetic fields, with high priority put on calculation efficiency, customizability, extensibility, and ease-of-use for novice programmers. To solve Kassiopeia's target physics problem the software is capable of simulating particle trajectories governed by arbitrarily complex differential equations of motion, continuous physics processes that may in part be modeled as terms perturbing that equation of motion, stochastic processes that occur in flight such as bulk scattering and decay, and stochastic surface processes occurring at interfaces, including transmission and reflection effects. This entire set of computations takes place against the backdrop of a rich geometry package which serves a variety of roles, including initialization of electromagnetic field simulations and the support of state-dependent algorithm-swapping and behavioral changes as a particle's state evolves. Thanks to the very general approach taken by Kassiopeia it can be used by other experiments facing similar challenges when calculating particle trajectories in electromagnetic fields. It is publicly available at https://github.com/KATRIN-Experiment/Kassiopeia.
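The core computation the abstract describes, integrating a trajectory governed by an equation of motion through electromagnetic fields, can be sketched as follows. This is not the Kassiopeia API (which is C++); it is a minimal classical RK4 integration of the Lorentz force in an assumed uniform magnetic field, with purely illustrative field values:

```python
import numpy as np

Q, M = -1.602176634e-19, 9.1093837015e-31  # electron charge (C) and mass (kg)

def lorentz_deriv(state, E=np.array([0., 0., 0.]), B=np.array([0., 0., 1e-3])):
    """Time derivative of state = (position, velocity) under the Lorentz force."""
    v = state[3:]
    a = (Q / M) * (E + np.cross(v, B))
    return np.concatenate([v, a])

def rk4_step(state, dt):
    """One classical fourth-order Runge-Kutta step of the equation of motion."""
    k1 = lorentz_deriv(state)
    k2 = lorentz_deriv(state + 0.5 * dt * k1)
    k3 = lorentz_deriv(state + 0.5 * dt * k2)
    k4 = lorentz_deriv(state + dt * k3)
    return state + (dt / 6.0) * (k1 + 2*k2 + 2*k3 + k4)

# Track an electron through a uniform magnetic field along z.
state = np.array([0., 0., 0., 1e5, 0., 0.])   # starts at origin, 100 km/s along x
for _ in range(1000):
    state = rk4_step(state, 1e-12)
speed = np.linalg.norm(state[3:])             # a magnetic field does no work
```

Kassiopeia layers onto this kind of integrator the stochastic in-flight and surface processes and the geometry-dependent algorithm swapping described above.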
Regenerative medicine in kidney disease: where we stand and where to go.
Borges, Fernanda T; Schor, Nestor
2017-07-22
The kidney is a complex organ with more than 20 types of specialized cells that play an important role in maintaining the body's homeostasis. The epithelial tubular cell is formed during embryonic development and has little proliferative capacity under physiological conditions, but after acute injury the kidney does have regenerative capacity. However, after repetitive or severe lesions, it may undergo a maladaptation process that predisposes it to chronic kidney injury. Regenerative medicine includes various repair and regeneration techniques, and these have gained increasing attention in the scientific literature. In the future, not only will these techniques contribute to the repair and regeneration of the human kidney, but probably also to the construction of an entire organ. New mechanisms studied for kidney regeneration and repair include circulating stem cells as mesenchymal stromal/stem cells and their paracrine mechanisms of action; renal progenitor stem cells; the leading role of tubular epithelial cells in the tubular repair process; the study of zebrafish larvae to understand the process of nephron development, kidney scaffold and its repopulation; and, finally, the development of organoids. This review elucidates where we are in terms of current scientific knowledge regarding these mechanisms and the promises of future scientific perspectives.
Mukherji, Sutapa
2018-03-01
In this paper, we study a one-dimensional totally asymmetric simple exclusion process with position-dependent hopping rates. Under open boundary conditions, this system exhibits boundary-induced phase transitions in the steady state. Similarly to totally asymmetric simple exclusion processes with uniform hopping, the phase diagram consists of low-density, high-density, and maximal-current phases. In various phases, the shape of the average particle density profile across the lattice including its boundary-layer parts changes significantly. Using the tools of boundary-layer analysis, we obtain explicit solutions for the density profile in different phases. A detailed analysis of these solutions under different boundary conditions helps us obtain the equations for various phase boundaries. Next, we show how the shape of the entire density profile including the location of the boundary layers can be predicted from the fixed points of the differential equation describing the boundary layers. We discuss this in detail through several examples of density profiles in various phases. The maximal-current phase appears to be an especially interesting phase where the boundary layer flows to a bifurcation point on the fixed-point diagram.
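The model studied in this abstract can be sketched with a direct Monte Carlo simulation. The lattice size, boundary rates, and the particular position-dependent rate profile below are illustrative choices, not the paper's parameters:

```python
import random

def simulate_tasep(L=50, alpha=0.3, beta=0.3, steps=100000, burn=20000, seed=1):
    """Random-sequential TASEP with open boundaries: a particle at site i hops
    right with rate p[i]; particles enter at the left with rate alpha and
    leave at the right with rate beta."""
    random.seed(seed)
    # Position-dependent hopping rates (illustrative: rates grow along the lattice).
    p = [0.5 + 0.5 * i / (L - 1) for i in range(L)]
    occ = [0] * L
    density = [0.0] * L
    for t in range(steps):
        i = random.randrange(-1, L)  # -1 selects the left boundary (injection)
        if i == -1:
            if occ[0] == 0 and random.random() < alpha:
                occ[0] = 1
        elif i == L - 1:
            if occ[i] == 1 and random.random() < beta:
                occ[i] = 0  # extraction at the right boundary
        elif occ[i] == 1 and occ[i + 1] == 0 and random.random() < p[i]:
            occ[i], occ[i + 1] = 0, 1
        if t >= burn:  # accumulate the average density profile after burn-in
            for j in range(L):
                density[j] += occ[j]
    return [d / (steps - burn) for d in density]

profile = simulate_tasep()
```

Varying alpha and beta moves such a simulation between the low-density, high-density, and maximal-current phases, and the measured profile can be compared against the boundary-layer solutions the paper derives.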
Kassiopeia: a modern, extensible C++ particle tracking package
NASA Astrophysics Data System (ADS)
Furse, Daniel; Groh, Stefan; Trost, Nikolaus; Babutzka, Martin; Barrett, John P.; Behrens, Jan; Buzinsky, Nicholas; Corona, Thomas; Enomoto, Sanshiro; Erhard, Moritz; Formaggio, Joseph A.; Glück, Ferenc; Harms, Fabian; Heizmann, Florian; Hilk, Daniel; Käfer, Wolfgang; Kleesiek, Marco; Leiber, Benjamin; Mertens, Susanne; Oblath, Noah S.; Renschler, Pascal; Schwarz, Johannes; Slocum, Penny L.; Wandkowsky, Nancy; Wierman, Kevin; Zacher, Michael
2017-05-01
The Kassiopeia particle tracking framework is an object-oriented software package using modern C++ techniques, written originally to meet the needs of the KATRIN collaboration. Kassiopeia features a new algorithmic paradigm for particle tracking simulations which targets experiments containing complex geometries and electromagnetic fields, with high priority put on calculation efficiency, customizability, extensibility, and ease-of-use for novice programmers. To solve Kassiopeia's target physics problem the software is capable of simulating particle trajectories governed by arbitrarily complex differential equations of motion, continuous physics processes that may in part be modeled as terms perturbing that equation of motion, stochastic processes that occur in flight such as bulk scattering and decay, and stochastic surface processes occurring at interfaces, including transmission and reflection effects. This entire set of computations takes place against the backdrop of a rich geometry package which serves a variety of roles, including initialization of electromagnetic field simulations and the support of state-dependent algorithm-swapping and behavioral changes as a particle’s state evolves. Thanks to the very general approach taken by Kassiopeia it can be used by other experiments facing similar challenges when calculating particle trajectories in electromagnetic fields. It is publicly available at https://github.com/KATRIN-Experiment/Kassiopeia.
An efficient framework for Java data processing systems in HPC environments
NASA Astrophysics Data System (ADS)
Fries, Aidan; Castañeda, Javier; Isasi, Yago; Taboada, Guillermo L.; Portell de Mora, Jordi; Sirvent, Raül
2011-11-01
Java is a commonly used programming language, although its use in High Performance Computing (HPC) remains relatively low. One of the reasons is a lack of libraries offering specific HPC functions to Java applications. In this paper we present a Java-based framework, called DpcbTools, designed to provide a set of functions that fill this gap. It includes a set of efficient data communication functions based on message-passing, thus providing, when a low latency network such as Myrinet is available, higher throughputs and lower latencies than standard solutions used by Java. DpcbTools also includes routines for the launching, monitoring and management of Java applications on several computing nodes by making use of JMX to communicate with remote Java VMs. The Gaia Data Processing and Analysis Consortium (DPAC) is a real case where scientific data from the ESA Gaia astrometric satellite will be entirely processed using Java. In this paper we describe the main elements of DPAC and its usage of the DpcbTools framework. We also assess the usefulness and performance of DpcbTools through its performance evaluation and the analysis of its impact on some DPAC systems deployed in the MareNostrum supercomputer (Barcelona Supercomputing Center).
Digital Twin concept for smart injection molding
NASA Astrophysics Data System (ADS)
Liau, Y.; Lee, H.; Ryu, K.
2018-03-01
The injection molding industry has evolved over decades and become the most common method of manufacturing plastic parts. Monitoring and improvement in the injection molding industry are usually performed separately at each stage, i.e. mold design, mold making and the injection molding process itself. However, in order to make a breakthrough and survive the industrial revolution, all stages of injection molding need to be linked and to communicate with each other: a change in one stage affects the other stages because they are correlated. Hence, simulation should be based not only on historical data but also on the current condition of the equipment and predictions of future events in other stages, so that responsive decisions can be made. This can be achieved by implementing the concept of a Digital Twin, which models the entire process as a virtual model and enables bidirectional control with the physical process. This paper presents the types of data and the technology required to build a Digital Twin for the injection molding industry. The concept includes a Digital Twin of each stage and the integration of these Digital Twin models into a comprehensive model of the injection molding industry.
Functional mapping of the primate auditory system.
Poremba, Amy; Saunders, Richard C; Crane, Alison M; Cook, Michelle; Sokoloff, Louis; Mishkin, Mortimer
2003-01-24
Cerebral auditory areas were delineated in the awake, passively listening, rhesus monkey by comparing the rates of glucose utilization in an intact hemisphere and in an acoustically isolated contralateral hemisphere of the same animal. The auditory system defined in this way occupied large portions of cerebral tissue, an extent probably second only to that of the visual system. Cortically, the activated areas included the entire superior temporal gyrus and large portions of the parietal, prefrontal, and limbic lobes. Several auditory areas overlapped with previously identified visual areas, suggesting that the auditory system, like the visual system, contains separate pathways for processing stimulus quality, location, and motion.
Oxygen ion-beam microlithography
Tsuo, Y.S.
1991-08-20
A method of providing and developing a resist on a substrate for constructing integrated circuit (IC) chips includes the following steps: depositing a thin film of amorphous silicon or hydrogenated amorphous silicon on the substrate and exposing portions of the amorphous silicon to low-energy oxygen ion beams to oxidize the amorphous silicon at those selected portions. The nonoxidized portions are then removed by etching with RF-excited hydrogen plasma. Components of the IC chip can then be constructed through the removed portions of the resist. The entire process can be performed in an in-line vacuum production system having several vacuum chambers. Nitrogen or carbon ion beams can also be used. 5 figures.
BOREAS AFM-04 Twin Otter Aircraft Flux Data
NASA Technical Reports Server (NTRS)
MacPherson, J. Ian; Hall, Forrest G. (Editor); Knapp, David E. (Editor); Desjardins, Raymond L.; Smith, David E. (Technical Monitor)
2000-01-01
The BOREAS AFM-05 team collected and processed data from the numerous radiosonde flights during the project. The goals of the AFM-05 team were to provide large-scale definition of the atmosphere by supplementing the existing AES aerological network, both temporally and spatially. This data set includes basic upper-air parameters collected from the network of upper-air stations during the 1993, 1994, and 1996 field campaigns over the entire study region. The data are contained in tabular ASCII files. The data files are available on a CD-ROM (see document number 20010000884) or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).
NASA Astrophysics Data System (ADS)
Frahm, K. M.; Shepelyansky, D. L.
2012-10-01
We construct the Google matrix of the entire Twitter network, dated July 2009, and analyze its spectrum and eigenstate properties, including the PageRank and CheiRank vectors and the 2DRanking of all nodes. Our studies show much stronger inter-connectivity between top PageRank nodes for the Twitter network compared to the networks of Wikipedia and British Universities studied previously. Our analysis allows us to locate the top Twitter users who control the information flow on the network. We argue that this small fraction of all users, which can be viewed as the social network elite, plays the dominant role in the process of opinion formation on the network.
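The Google matrix construction and PageRank computation underlying this analysis can be sketched on a toy directed graph (the actual Twitter network has tens of millions of nodes; the damping value 0.85 is the conventional choice, not necessarily the paper's):

```python
import numpy as np

def google_matrix(adj, alpha=0.85):
    """Column-stochastic Google matrix G = alpha*S + (1-alpha)/N, where
    adj[i, j] = 1 means node i links to node j."""
    N = adj.shape[0]
    S = adj.astype(float).T.copy()        # column i of S holds the outlinks of i
    col_sums = S.sum(axis=0)
    for i in range(N):
        if col_sums[i] == 0:
            S[:, i] = 1.0 / N             # dangling node: link uniformly to all
        else:
            S[:, i] /= col_sums[i]
    return alpha * S + (1 - alpha) / N

def pagerank(G, tol=1e-10):
    """Power iteration: the PageRank vector is the leading eigenvector of G."""
    N = G.shape[0]
    v = np.full(N, 1.0 / N)
    while True:
        v_new = G @ v
        if np.abs(v_new - v).sum() < tol:
            return v_new
        v = v_new

adj = np.array([[0, 1, 1, 0],             # node 0 links to nodes 1 and 2
                [0, 0, 1, 0],
                [1, 0, 0, 0],
                [0, 0, 1, 0]])
ranks = pagerank(google_matrix(adj))
```

The CheiRank vector mentioned in the abstract is obtained the same way from the graph with all links reversed, and 2DRanking combines the two orderings.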
Pathogenesis of Taenia solium taeniasis and cysticercosis.
Gonzales, I; Rivera, J T; Garcia, H H
2016-03-01
Taenia solium infections (taeniasis/cysticercosis) are a major scourge to most developing countries. Neurocysticercosis, the infection of the human nervous system by the cystic larvae of this parasite, has a protean array of clinical manifestations varying from entirely asymptomatic infections to aggressive, lethal courses. The diversity of clinical manifestations reflects a series of contributing factors which include the number, size and location of the invading parasites, and particularly the inflammatory response of the host. This manuscript reviews the different presentations of T. solium infections in the human host with a focus on the mechanisms or processes responsible for their clinical expression. © 2016 John Wiley & Sons Ltd.
Improving the Endoscopic Detection Rate in Patients with Early Gastric Cancer
2015-01-01
Endoscopists should ideally possess both sufficient knowledge of endoscopic findings of gastrointestinal disease and an appropriate attitude. Before performing endoscopy, the endoscopist must identify several risk factors for gastric cancer, including the patient's age, comorbidities, drug history, family history of gastric cancer, previous endoscopic findings of atrophic gastritis or intestinal metaplasia, and history of previous endoscopic treatments. During endoscopic examination, the macroscopic appearance is very important for the diagnosis of early gastric cancer; therefore, the endoscopist should have a consistent and organized endoscope processing technique and the ability to comprehensively investigate the entire stomach, even blind spots. PMID:26240801
Oxygen ion-beam microlithography
Tsuo, Y. Simon
1991-01-01
A method of providing and developing a resist on a substrate for constructing integrated circuit (IC) chips includes the following steps: depositing a thin film of amorphous silicon or hydrogenated amorphous silicon on the substrate and exposing portions of the amorphous silicon to low-energy oxygen ion beams to oxidize the amorphous silicon at those selected portions. The nonoxidized portions are then removed by etching with RF-excited hydrogen plasma. Components of the IC chip can then be constructed through the removed portions of the resist. The entire process can be performed in an in-line vacuum production system having several vacuum chambers. Nitrogen or carbon ion beams can also be used.
Space Flight Operations Center local area network
NASA Technical Reports Server (NTRS)
Goodman, Ross V.
1988-01-01
The existing Mission Control and Computer Center at JPL will be replaced by the Space Flight Operations Center (SFOC). One part of the SFOC is the LAN-based distribution system. The purpose of the LAN is to distribute the processed data among the various elements of the SFOC. The SFOC LAN will provide a robust subsystem that will support the Magellan launch configuration and future project adaptation. Its capabilities include (1) a proven cable medium as the backbone for the entire network; (2) hardware components that are reliable, varied, and follow OSI standards; (3) accurate and detailed documentation for fault isolation and future expansion; and (4) proven monitoring and maintenance tools.
NASA Technical Reports Server (NTRS)
Archer, G. T.
1974-01-01
The model presents a systems analysis of human circulatory regulation based almost entirely on experimental data and the cumulative present knowledge of the many facets of the circulatory system. The model itself consists of eighteen different major systems that enter into circulatory control. These systems are grouped into sixteen distinct subprograms that are melded together to form the total model. The model develops circulatory and fluid regulation in a simultaneous manner. Thus, the effects of hormonal and autonomic control, electrolyte regulation, and excretory dynamics are all important and are all included in the model.
Model selection using cosmic chronometers with Gaussian Processes
NASA Astrophysics Data System (ADS)
Melia, Fulvio; Yennapureddy, Manoj K.
2018-02-01
The use of Gaussian Processes with a measurement of the cosmic expansion rate based solely on the observation of cosmic chronometers provides a completely cosmology-independent reconstruction of the Hubble parameter H(z) suitable for testing different models. The corresponding dispersion σ_H is smaller than ~9% over the entire redshift range (0 ≲ z ≲ 2) of the observations, rivaling many kinds of cosmological measurements available today. We use the reconstructed H(z) function to test six different cosmologies, and show that it favours the R_h = ct universe, which has only one free parameter (i.e., H_0), over other models, including Planck ΛCDM. The parameters of the standard model may be re-optimized to improve the fits to the reconstructed H(z) function, but the results have smaller p-values than one finds with R_h = ct.
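The Gaussian Process reconstruction described here can be sketched with a minimal zero-mean GP regression. The squared-exponential kernel is one common choice (the paper's kernel and hyperparameters may differ), and the data points below are synthetic placeholders, not actual chronometer measurements:

```python
import numpy as np

def sq_exp_kernel(x1, x2, sigma_f=100.0, ell=2.0):
    """Squared-exponential covariance between two 1-D sets of redshifts."""
    d = x1[:, None] - x2[None, :]
    return sigma_f**2 * np.exp(-0.5 * (d / ell)**2)

def gp_reconstruct(z_obs, H_obs, H_err, z_star):
    """Posterior mean and standard deviation of H(z) at the test redshifts."""
    K = sq_exp_kernel(z_obs, z_obs) + np.diag(H_err**2)  # noisy data covariance
    K_s = sq_exp_kernel(z_star, z_obs)
    K_ss = sq_exp_kernel(z_star, z_star)
    mean = K_s @ np.linalg.solve(K, H_obs)
    cov = K_ss - K_s @ np.linalg.solve(K, K_s.T)
    return mean, np.sqrt(np.clip(np.diag(cov), 0, None))

# Synthetic "measurements" scattered around a smooth expansion history.
z_obs = np.array([0.1, 0.4, 0.7, 1.0, 1.3, 1.8])
H_obs = np.array([69., 79., 92., 107., 123., 150.])   # km/s/Mpc, illustrative
H_err = np.full_like(H_obs, 5.0)
mean, std = gp_reconstruct(z_obs, H_obs, H_err, np.linspace(0, 2, 21))
```

The reconstructed mean and its dispersion band can then be compared against each candidate cosmology's predicted H(z) without assuming any cosmology in the reconstruction itself.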
Damage Instability and Transition From Quasi-Static to Dynamic Fracture
NASA Technical Reports Server (NTRS)
Davila, Carlos G.
2015-01-01
In a typical mechanical test, the loading phase is intended to be a quasi-static process, while the failure and collapse is usually a dynamic event. The structural strength and modes of damage can seldom be predicted without accounting for these two aspects of the response. For a proper prediction, it is therefore essential to use tools and methodologies that are capable of addressing both aspects of responses. In some cases, implicit quasi-static models have been shown to be able to predict the entire response of a structure, including the unstable path that leads to fracture. However, is it acceptable to ignore the effect of inertial forces in the formation of damage? In this presentation we examine aspects of the damage processes that must be simulated for an accurate prediction of structural strength and modes of failure.
Interstellar Travel and Galactic Colonization: Insights from Percolation Theory and the Yule Process
NASA Astrophysics Data System (ADS)
Lingam, Manasvi
2016-06-01
In this paper, percolation theory is employed to place tentative bounds on the probability p of interstellar travel and the emergence of a civilization (or panspermia) that colonizes the entire Galaxy. The ensuing ramifications with regard to the Fermi paradox are also explored. In particular, it is suggested that the correlation function of inhabited exoplanets can be used to observationally constrain p in the near future. It is shown, by using a mathematical evolution model known as the Yule process, that the probability distribution for civilizations with a given number of colonized worlds is likely to exhibit a power-law tail. Some of the dynamical aspects of this issue, including the question of timescales and generalizations of percolation theory, are also studied. The limitations of these models, and other avenues for future inquiry, are also outlined.
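A Yule-type process of the kind invoked here can be simulated directly: a civilization gains a new world with probability proportional to the number of worlds it already holds, while new single-world civilizations arise at some innovation rate. The rate m and event counts below are illustrative, not the paper's parameters:

```python
import random

def yule_process(steps=10000, m=0.1, seed=7):
    """Return the list of civilization sizes after `steps` colonization events."""
    random.seed(seed)
    sizes = [1]                       # one founding single-world civilization
    total = 1                         # total number of colonized worlds
    for _ in range(steps):
        if random.random() < m:       # a new single-world civilization emerges
            sizes.append(1)
        else:                         # preferential attachment: an existing
            r = random.randrange(total)  # civilization grows in proportion
            acc = 0                      # to its current size
            for i, s in enumerate(sizes):
                acc += s
                if r < acc:
                    sizes[i] += 1
                    break
        total += 1
    return sizes

sizes = yule_process()
# The resulting size distribution is heavy-tailed: a few civilizations hold
# many worlds while most hold only one or a few.
```

Plotting the complementary cumulative distribution of `sizes` on log-log axes exhibits the approximately straight power-law tail the paper derives analytically.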
Strickland, Michelle; Tudorica, Victor; Řezáč, Milan; Thomas, Neil R; Goodacre, Sara L
2018-06-01
Spiders produce multiple silks with different physical properties that allow them to occupy a diverse range of ecological niches, including the underwater environment. Despite this functional diversity, past molecular analyses show a high degree of amino acid sequence similarity between C-terminal regions of silk genes that appear to be independent of the physical properties of the resulting silks; instead, this domain is crucial to the formation of silk fibers. Here, we present an analysis of the C-terminal domain of all known types of spider silk and include silk sequences from the spider Argyroneta aquatica, which spins the majority of its silk underwater. Our work indicates that spiders have retained a highly conserved mechanism of silk assembly, despite the extraordinary diversification of species, silk types and applications of silk over 350 million years. Sequence analysis of the silk C-terminal domain across the entire gene family shows the conservation of two uncommon amino acids that are implicated in the formation of a salt bridge, a functional bond essential to protein assembly. This conservation extends to the novel sequences isolated from A. aquatica. This finding is relevant to research regarding the artificial synthesis of spider silk, suggesting that synthesis of all silk types will be possible using a single process.
Ben-David, Jonathan; Chipman, Ariel D
2010-10-01
The early embryo of the milkweed bug, Oncopeltus fasciatus, appears as a single cell layer - the embryonic blastoderm - covering the entire egg. It is at this blastoderm stage that morphological domains are first determined, long before the appearance of overt segmentation. Central to the process of patterning the blastoderm into distinct domains are a group of transcription factors known as gap genes. In Drosophila melanogaster these genes form a network of interactions, and maintain sharp expression boundaries through strong mutual repression. Their restricted expression domains define specific areas along the entire body. We have studied the expression domains of the four trunk gap gene homologues in O. fasciatus and have determined their interactions through dsRNA gene knockdown experiments, followed by expression analyses. While the blastoderm in O. fasciatus includes only the first six segments of the embryo, the expression domains of the gap genes within these segments are broadly similar to those in Drosophila where the blastoderm includes all 15 segments. However, the interactions between the gap genes are surprisingly different from those in Drosophila, and mutual repression between the genes seems to play a much less significant role. This suggests that the well-studied interaction pattern in Drosophila is evolutionarily derived, and has evolved from a less strongly interacting network. Copyright © 2010 Elsevier Inc. All rights reserved.
STAKEHOLDER INVOLVEMENT IN THE HEALTH TECHNOLOGY ASSESSMENT PROCESS IN LATIN AMERICA.
Pichon-Riviere, Andres; Soto, Natalie; Augustovski, Federico; Sampietro-Colom, Laura
2018-06-11
Latin American countries are taking important steps to expand and strengthen universal health coverage, and health technology assessment (HTA) has an increasingly prominent role in this process. Participation of all relevant stakeholders has become a priority in this effort. Key issues in this area were discussed during the 2017 Latin American Health Technology Assessment International (HTAi) Policy Forum. The Forum included forty-one participants from Latin American HTA agencies; public, social security, and private insurance sectors; and the pharmaceutical and medical device industry. A background paper and presentations by invited experts and Forum members supported discussions. This study presents a summary of these discussions. Stakeholder involvement in HTA remains inconsistently implemented in the region and few countries have established formal processes. Participants agreed that stakeholder involvement is key to improve the HTA process, but the form and timing of such improvements must be adapted to local contexts. The legitimization of both HTA and decision-making processes was identified as one of the main reasons to promote stakeholder involvement; but to be successful, the entire system of assessment and decision making must be properly staffed and organized, and certain basic conditions must be met, including transparency in the HTA process and a clear link between HTA and decision making. Participants suggested a need for establishing clear rules of participation in HTA that would protect HTA producers and decision makers from potentially distorting external influences. Such rules and mechanisms could help foster trust and credibility among stakeholders, supporting actual involvement in HTA processes.
Foerster, Rebecca M.; Poth, Christian H.; Behler, Christian; Botsch, Mario; Schneider, Werner X.
2016-01-01
Neuropsychological assessment of human visual processing capabilities strongly depends on visual testing conditions, including room lighting, stimuli, and viewing distance. This limits standardization, threatens reliability, and prevents the assessment of core visual functions such as visual processing speed. Increasingly available virtual reality devices make it possible to address these problems. One such device is the portable, light-weight, and easy-to-use Oculus Rift. It is head-mounted and covers the entire visual field, thereby shielding and standardizing the visual stimulation. A fundamental prerequisite for using the Oculus Rift in neuropsychological assessment is sufficient test-retest reliability. Here, we compare the test-retest reliabilities of Bundesen's visual processing components (visual processing speed, threshold of conscious perception, capacity of visual working memory) as measured with the Oculus Rift and a standard CRT computer screen. Our results show that the Oculus Rift allows the processing components to be measured as reliably as the standard CRT. This means that the Oculus Rift is suitable for standardized and reliable assessment and diagnosis of elementary cognitive functions in laboratory and clinical settings. The Oculus Rift thus provides the opportunity to compare visual processing components between individuals and institutions and to establish statistical norm distributions. PMID:27869220
Influence of process parameters on the effectiveness of photooxidative treatment of pharmaceuticals.
Markic, Marinko; Cvetnic, Matija; Ukic, Sime; Kusic, Hrvoje; Bolanca, Tomislav; Bozic, Ana Loncaric
2018-03-21
In this study, UV-C/H2O2 and UV-C/[Formula: see text] processes, both photooxidative advanced oxidation processes, were applied to the treatment of seven pharmaceuticals, either already included in the Directive 2013/39/EU "watch list" (17α-ethynylestradiol, 17β-estradiol) or with the potential to be added in the near future owing to their environmental properties and increasing consumption (azithromycin, carbamazepine, dexamethasone, erythromycin and oxytetracycline). The influence of process parameters (pH, oxidant concentration and type) on the degradation of the pharmaceuticals was studied using a response surface modelling approach. It was established that degradation obeys a first-order kinetic regime regardless of structural differences and over the entire range of studied process parameters. The results revealed that the effectiveness of the UV-C/H2O2 process is highly dependent on both initial pH and oxidant concentration. It was found that the UV-C/[Formula: see text] process, exhibiting several times faster degradation of the studied pharmaceuticals, is less sensitive to pH changes, providing a practical benefit to its utilization. The influence of the water matrix on the degradation kinetics of the studied pharmaceuticals was studied through natural organic matter effects on single-component and mixture systems.
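The first-order kinetic regime reported above means ln(C0/C) = k·t, so the rate constant k is the slope of a through-the-origin least-squares fit. The sketch below illustrates this with invented concentration data (not the study's measurements):

```python
import math

# Illustrative, synthetic concentration-time data approximating C = C0*exp(-k t)
times = [0.0, 5.0, 10.0, 15.0, 20.0]    # min
conc  = [1.00, 0.61, 0.37, 0.22, 0.14]  # normalized C/C0

# First-order kinetics: ln(C0/C) = k t, so k is the slope of a
# through-the-origin least-squares fit of y = ln(C0/C) against t.
y = [math.log(conc[0] / c) for c in conc]
k = sum(t * yi for t, yi in zip(times, y)) / sum(t * t for t in times)
half_life = math.log(2) / k   # time for the concentration to halve
```

For these synthetic data k comes out near 0.1 min⁻¹, i.e. a half-life of roughly 7 minutes; comparing k values fitted at different pH and oxidant doses is how the process-parameter influence above would be quantified.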
IEC 61511 and the capital project process--a protective management system approach.
Summers, Angela E
2006-03-17
This year, the process industry has reached an important milestone in process safety-the acceptance of an internationally recognized standard for safety instrumented systems (SIS). This standard, IEC 61511, documents good engineering practice for the assessment, design, operation, maintenance, and management of SISs. The foundation of the standard is established by several requirements in Part 1, Clauses 5-7, which cover the development of a management system aimed at ensuring that functional safety is achieved. The management system includes a quality assurance process for the entire SIS lifecycle, requiring the development of procedures, identification of resources and acquisition of tools. For maximum benefit, the deliverables and quality control checks required by the standard should be integrated into the capital project process, addressing safety, environmental, plant productivity, and asset protection. Industry has become inundated with a multitude of programs focusing on safety, quality, and cost performance. This paper introduces a protective management system, which builds upon the work process identified in IEC 61511. Typical capital project phases are integrated with the management system to yield one comprehensive program to efficiently manage process risk. Finally, the paper highlights areas where internal practices or guidelines should be developed to improve program performance and cost effectiveness.
Conjoint Analysis of the Surface and Atmospheric Water Balances of the Andes-Amazon System
NASA Astrophysics Data System (ADS)
Builes-Jaramillo, Alejandro; Poveda, Germán
2017-04-01
Acknowledging the interrelation between the two branches of the hydrological cycle, we perform a comprehensive analysis of the long-term mean surface and atmospheric water balances in the Amazon-Andes river basin system. We estimate the closure of the water budgets based on the long-term approximation of the water balance equations, and estimate the imbalance between the atmospheric and surface budgets. The analysis was performed with observational and reanalysis datasets for the entire basin, for several sub-catchments within the Amazon River basin, and for two physically and geographically distinctive subsystems of the basin, namely the upper Andean region and the low-lying Amazon River basin. Our results show that for the entire Amazon River basin the surface water balance can be considered closed (P = 2225 mm yr⁻¹, ET = 1062 mm yr⁻¹, R = 965 mm yr⁻¹), whereas for the separate subsystems closure is not so clear, with large discrepancies between observational and reanalysis datasets. In turn, the atmospheric budget does not close regardless of dataset or geographical disaggregation. Our results indicate that the magnitude of the imbalance of the atmospheric branch of the water balance depends on the evaporation data source used. The imbalance, calculated as I = (C/R) - 1, where C is the net moisture convergence (C = -∇·Q, with ∇·Q the net vertically integrated moisture divergence) and R is the runoff, represents the difference between the two branches of the hydrological cycle. For the entire Amazon River basin we found a consistent negative imbalance driven by higher values of runoff, and when calculated at monthly time scales the imbalance shows a strong dependence on the Amazon dry season.
The separate analysis of the Andes and low-lying Amazonia subsystems unveils two shortcomings of the available data: the poor representation of surface processes (including precipitation and evapotranspiration) in the reanalysis models, and the limitations that high altitudes and scarcity of information impose on capturing the dynamics of hydrological processes over the Andean region. Our results confirm the paramount importance of a joint analysis of the atmospheric and surface water budgets at the river basin level, in order to achieve a complete understanding of the hydrologic dynamics.
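The imbalance metric defined in the abstract, I = (C/R) - 1, is straightforward to compute. The sketch below uses the basin-wide runoff quoted above; the convergence value is invented for illustration and is not from the study:

```python
def imbalance(C, R):
    """Atmosphere-surface imbalance I = (C / R) - 1, where C is the net
    vertically integrated moisture convergence (C = -div Q) and R is the
    runoff, both in mm/yr. I = 0 means the two branches close exactly;
    I < 0 means runoff exceeds atmospheric moisture convergence."""
    return C / R - 1.0

R = 965.0   # long-term Amazon basin runoff, mm/yr (from the abstract)
C = 900.0   # illustrative convergence value, NOT from the study
I = imbalance(C, R)   # negative: a runoff-driven imbalance, as reported
```

A perfectly closed pair of budgets gives I = 0; the consistently negative basin-wide values reported above correspond to C < R in this convention.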
Understanding the ignition mechanism of high-pressure spray flames
Dahms, Rainer N.; Paczko, Günter A.; Skeen, Scott A.; ...
2016-10-25
A conceptual model for turbulent ignition in high-pressure spray flames is presented. The model is motivated by first-principles simulations and optical diagnostics applied to the Sandia n-dodecane experiment. The Lagrangian flamelet equations are combined with full LLNL kinetics (2755 species; 11,173 reactions) to resolve all time and length scales and chemical pathways of the ignition process at engine-relevant pressures and turbulence intensities unattainable using classic DNS. The first-principles value of the flamelet equations is established by a novel chemical explosive mode-diffusion time scale analysis of the fully-coupled chemical and turbulent time scales. Contrary to conventional wisdom, this analysis reveals that the high Damköhler number limit, a key requirement for the validity of the flamelet derivation from the reactive Navier–Stokes equations, applies during the entire ignition process. Corroborating Rayleigh-scattering and formaldehyde-PLIF measurements, with simultaneous schlieren imaging of mixing and combustion, are presented. Our combined analysis establishes a characteristic temporal evolution of the ignition process. First, a localized first-stage ignition event consistently occurs in the highest-temperature mixture regions. Owing to the intense scalar dissipation, this initiates a turbulent cool flame wave propagating from the ignition spot through the entire flow field. This wave significantly decreases the ignition delay of lower-temperature mixture regions in comparison to their homogeneous reference. This explains the experimentally observed formaldehyde formation across the entire spray head prior to high-temperature ignition, which consistently occurs first in a broad range of rich mixture regions. There, the combination of the first-stage ignition delay, shortened by the cool flame wave, and the subsequent delay until second-stage ignition becomes minimal. 
A turbulent flame subsequently propagates rapidly through the entire mixture over time scales consistent with experimental observations. As a result, we demonstrate that neglecting turbulence-chemistry interactions fundamentally fails to capture the key features of this ignition process.
Annotating images by mining image search results.
Wang, Xin-Jing; Zhang, Lei; Li, Xirong; Ma, Wei-Ying
2008-11-01
Although it has been studied for years by the computer vision and machine learning communities, image annotation is still far from practical. In this paper, we propose a novel attempt at model-free image annotation, which is a data-driven approach that annotates images by mining their search results. Some 2.4 million images with their surrounding text are collected from a few photo forums to support this approach. The entire process is formulated in a divide-and-conquer framework where a query keyword is provided along with the uncaptioned image to improve both the effectiveness and efficiency. This is helpful when the collected data set is not dense everywhere. Accordingly, our approach consists of three steps: 1) the search process to discover visually and semantically similar search results, 2) the mining process to identify salient terms from textual descriptions of the search results, and 3) the annotation rejection process to filter out noisy terms yielded by Step 2. To ensure real-time annotation, two key techniques are leveraged: one is to map the high-dimensional image visual features into hash codes, and the other is to implement it as a distributed system, in which the search and mining processes are provided as Web services. As a typical result, the entire process finishes in less than 1 second. Since no training data set is required, our approach enables annotating with unlimited vocabulary and is highly scalable and robust to outliers. Experimental results on both real Web images and a benchmark image data set show the effectiveness and efficiency of the proposed algorithm. It is also worth noting that, although the entire approach is illustrated within the divide-and-conquer framework, a query keyword is not crucial to our current implementation. We provide experimental results to prove this.
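The mining step (step 2) can be illustrated with a tf-idf-style scoring of terms drawn from the search results' surrounding text, with low-scoring terms dropped as a crude stand-in for the annotation-rejection step. This is a simplified sketch; the function name, scoring formula, and example data are all assumptions, not the paper's method:

```python
import math
from collections import Counter

def salient_terms(result_texts, background_df, n_docs, top_k=5):
    """Score terms from the textual descriptions of image search results:
    term frequency within the results, discounted by how common the term
    is in the whole collection (a tf-idf-style weight)."""
    tf = Counter(w for text in result_texts for w in text.lower().split())
    scores = {
        w: count * math.log(n_docs / (1 + background_df.get(w, 0)))
        for w, count in tf.items()
    }
    return [w for w, _ in sorted(scores.items(), key=lambda kv: -kv[1])[:top_k]]

# Toy example: surrounding text of three search results, plus invented
# collection-wide document frequencies for each term.
results = ["sunset over the beach", "beach sunset photo", "golden sunset sea"]
df = {"the": 900, "photo": 800, "over": 700, "sunset": 40, "beach": 30, "sea": 25}
print(salient_terms(results, df, n_docs=1000))
```

Frequent-but-generic words ("the", "photo") score low and are rejected, while terms that recur in the results yet are rare collection-wide ("sunset", "beach") surface as annotations.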
Kegeles, Susan M; Rebchook, Gregory; Tebbetts, Scott; Arnold, Emily
2015-04-17
Since the scale-up of HIV/AIDS prevention evidence-based interventions (EBIs) has not been simple, it is important to examine processes that occur in the translation of the EBIs into practice that affect successful implementation. The goal of this paper is to examine facilitators and barriers to effective implementation that arose among 72 community-based organizations (CBOs) as they moved into practice a multilevel HIV prevention intervention EBI, the Mpowerment Project (MP), for young gay and bisexual men. CBOs that were implementing the Mpowerment Project participated in this study and were assessed at baseline, and 6 months, 1 year, and 2 years post-baseline. Semi-structured telephone interviews were conducted separately with individuals at each CBO. Study data came from 647 semi-structured interviews and extensive notes and commentaries from technical assistance providers. Framework Analysis guided the analytic process. "Barriers and facilitators to implementation" was the overarching thematic framework used across all the cases in our analysis. Thirteen themes emerged regarding factors that influence the successful implementation of the MP. These were organized into three overarching themes: HIV Prevention System Factors, Community Factors, and Intervention Factors. The entire HIV Prevention System, including coordinators, supervisors, executive directors, funders, and national HIV prevention policies, all influenced implementation success. Other Prevention System Factors that affected the effective translation of the EBI into practice include Knowledge About Intervention, Belief in the Efficacy of the Intervention, Desire to Change Existing Prevention Approach, Planning for Intervention Before Implementation, Accountability, Appropriateness of Individuals for Coordinator Positions, Evaluation of Intervention, and Organizational Stability. Community Factors included Geography and Sociopolitical Climate. 
Intervention Factors included Intervention Characteristics and Adaptation Issues. The entire ecological system in which an EBI occurs affects implementation. It is imperative to focus capacity-building efforts on getting individuals at different levels of the HIV Prevention System into alignment regarding understanding and believing in the program's goals and methods. For a Prevention Support System to be maximally useful, it must address facilitators or barriers to implementation, address the right people, and use modalities to convey information that are acceptable for users of the system.
Kim, Kkotbong; Yang, Jinhyang
2017-06-01
After being diagnosed with breast cancer, women must make a number of decisions about their treatment and management. When the decision-making process among breast cancer patients is ineffective, it results in harm to their health. Little is known about the decision-making process of breast cancer patients during the entire course of treatment and management. We investigated women with breast cancer to explore the decision-making processes related to treatment and management. Eleven women participated, all of whom were receiving treatment or management in Korea. The average participant age was 43.5 years. For data collection and analysis, a grounded theory methodology was used. Through constant comparative analyses, a core category emerged that we referred to as "finding the right individualized healthcare trajectory." The decision-making process occurred in four phases: turmoil, exploration, balance, and control. The turmoil phase included weighing the credibility of information and lowering the anxiety level. The exploration phase included assessing the expertise/promptness of medical treatment and evaluating the effectiveness of follow-up management. The balance phase included performing analyses from multiple angles and rediscovering value as a human being. The control phase included constructing an individualized management system and following prescribed and other management options. It is important to provide patients with accurate information related to the treatment and management of breast cancer so that they can make effective decisions. Healthcare providers should engage with patients on issues related to their disease, understand the burden placed on patients because of issues related to their sex, and ensure that the patient has a sufficient support system. The results of this study can be used to develop phase-specific, patient-centered, and tailored interventions for breast cancer patients. Copyright © 2017 Elsevier Inc. All rights reserved.
The Million-Body Problem: Particle Simulations in Astrophysics
Rasio, Fred
2018-05-21
Computer simulations using particles play a key role in astrophysics. They are widely used to study problems across the entire range of astrophysical scales, from the dynamics of stars, gaseous nebulae, and galaxies, to the formation of the largest-scale structures in the universe. The 'particles' can be anything from elementary particles to macroscopic fluid elements, entire stars, or even entire galaxies. Using particle simulations as a common thread, this talk will present an overview of computational astrophysics research currently done in our theory group at Northwestern. Topics will include stellar collisions and the gravothermal catastrophe in dense star clusters.
LANDSAT D data transmission and dissemination study
NASA Technical Reports Server (NTRS)
1976-01-01
An assessment of the quantity of data processed by the system is presented, investigating the various methods for transmission within the system. Various methods of data storage are considered. It is concluded that the entire processing system should be located at White Sands, New Mexico.
Mathematical model of whole-process calculation for bottom-blowing copper smelting
NASA Astrophysics Data System (ADS)
Li, Ming-zhou; Zhou, Jie-min; Tong, Chang-ren; Zhang, Wen-hai; Li, He-song
2017-11-01
The distribution law of materials in smelting products is key to cost accounting and contaminant control. However, this distribution law is difficult to determine quickly and accurately by sampling and analysis alone. Mathematical models for material and heat balance in bottom-blowing smelting, converting, anode furnace refining, and electrolytic refining were established based on the principles of material (element) conservation, energy conservation, and control index constraints in copper bottom-blowing smelting. A simulation of the entire bottom-blowing copper smelting process was implemented on the self-developed MetCal software platform, and a whole-process simulation for an enterprise in China was then conducted. The results indicated that the quantities and compositions of unknown materials, as well as heat balance information, can be quickly calculated using the model. Comparison with production data revealed that the model can broadly reflect the distribution law of the materials in bottom-blowing copper smelting. This finding provides theoretical guidance for mastering the performance of the entire process.
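In its simplest form, the material-balance core of such a model reduces to solving a small linear system built from element-conservation constraints: each element in the feed must be accounted for by the products. The sketch below is an illustration only; the product compositions and feed values are invented, not plant data from the study:

```python
# Element conservation: for each element e, sum over products of
# (mass fraction of e in product) * (product mass) = mass of e in the feed.
# Unknowns: masses of matte, slag, and off-gas dust. All numbers invented.

#                matte  slag   dust   (mass fractions)
comp = {
    "Cu": [0.60, 0.02, 0.05],
    "Fe": [0.15, 0.35, 0.10],
    "S":  [0.22, 0.01, 0.30],
}
feed = {"Cu": 289.5, "Fe": 292.5, "S": 150.0}   # kg of each element in the feed

# Solve the 3x3 system A x = b by Gaussian elimination with partial pivoting.
A = [list(comp["Cu"]), list(comp["Fe"]), list(comp["S"])]
b = [feed["Cu"], feed["Fe"], feed["S"]]
n = 3
for i in range(n):
    pivot = max(range(i, n), key=lambda r: abs(A[r][i]))
    A[i], A[pivot] = A[pivot], A[i]
    b[i], b[pivot] = b[pivot], b[i]
    for r in range(i + 1, n):
        f = A[r][i] / A[i][i]
        for c in range(i, n):
            A[r][c] -= f * A[i][c]
        b[r] -= f * b[i]
x = [0.0] * n
for i in reversed(range(n)):
    x[i] = (b[i] - sum(A[i][c] * x[c] for c in range(i + 1, n))) / A[i][i]
matte, slag, dust = x   # product masses (kg) consistent with the element balance
```

A production model adds heat-balance equations and control-index constraints in the same way, enlarging the system; the principle of one conservation equation per element (or per energy stream) is unchanged.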
Invaginating Structures in Mammalian Synapses
Petralia, Ronald S.; Wang, Ya-Xian; Mattson, Mark P.; Yao, Pamela J.
2018-01-01
Invaginating structures at chemical synapses in the mammalian nervous system exist in presynaptic axon terminals, postsynaptic spines or dendrites, and glial processes. These invaginating structures can be divided into three categories. The first category includes slender protrusions invaginating into axonal terminals, postsynaptic spines, or glial processes. The best-known examples of this category are spinules extending from postsynaptic spines into presynaptic terminals in forebrain synapses. Other examples of this category are protrusions from inhibitory presynaptic terminals invaginating into postsynaptic neuronal somas. Regardless of the direction and location, the invaginating structures of the first category do not have synaptic active zones within the invagination. The second category includes postsynaptic spines invaginating into presynaptic terminals, whereas the third category includes presynaptic terminals invaginating into postsynaptic spines or dendrites. Unlike the first category, the second and third categories have active zones within the invagination. An example of the second category are mossy terminal synapses of the hippocampal CA3 region, in which enlarged spine-like structures invaginate partly or entirely into mossy terminals. An example of the third category is the neuromuscular junction (NMJ), where the presynaptic terminals invaginate substantially into the muscle fibers. In the retina, rod and cone synapses have invaginating processes from horizontal and bipolar cells. Because horizontal cells act as both post- and presynaptic structures, their invaginating processes represent both the second and third categories. These invaginating structures likely play broad yet specialized roles in modulating neuronal cell signaling. PMID:29674962
SIG. Signal Processing, Analysis, & Display
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hernandez, J.; Lager, D.; Azevedo, S.
1992-01-22
SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. However, it has been designed to ultimately accommodate other representations for data such as multiplexed signals and complex matrices. Two user interfaces are provided in SIG: a menu mode for the unfamiliar user and a command mode for more experienced users. In both modes errors are detected as early as possible and are indicated by friendly, meaningful messages. An on-line HELP package is also included. A variety of operations can be performed on time- and frequency-domain signals, including operations on the samples of a signal, operations on the entire signal, and operations on two or more signals. Signal processing operations that can be performed are digital filtering (median, Bessel, Butterworth, and Chebychev), ensemble average, resample, auto and cross spectral density, transfer function and impulse response, trend removal, convolution, Fourier transform and inverse, window functions (Hamming, Kaiser-Bessel), simulation (ramp, sine, pulsetrain, random), and read/write signals. User-definable signal processing algorithms are also featured. SIG has many options, including multiple commands per line, command files with arguments, commenting lines, defining commands, and automatic execution for each item in a `repeat` sequence. Graphical operations on signals and spectra include: x-y plots of time signals; real, imaginary, magnitude, and phase plots of spectra; scaling of spectra for continuous or discrete domain; cursor zoom; families of curves; and multiple viewports.
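As a small illustration of one of the operations listed above (median filtering; this is a generic sketch, not SIG's own code), a sliding-window median removes impulsive spikes from a time-domain signal while preserving slower structure:

```python
def median_filter(signal, window=3):
    """Sliding-window median of a sequence; near the edges the window
    shrinks to fit, so the output has the same length as the input."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        w = sorted(signal[lo:hi])
        out.append(w[len(w) // 2])
    return out

# A slowly varying signal corrupted by two impulsive spikes:
noisy = [1.0, 1.1, 9.0, 1.2, 1.0, 1.1, -7.0, 1.0]
print(median_filter(noisy))  # → [1.1, 1.1, 1.2, 1.2, 1.1, 1.0, 1.0, 1.0]
```

Unlike a linear (e.g. Butterworth) filter, the median is nonlinear: it rejects outliers entirely rather than smearing them into neighboring samples, which is why packages like SIG offer both families.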
Got (the Right) Milk? How a Blended Quality Improvement Approach Catalyzed Change.
Luton, Alexandra; Bondurant, Patricia G; Campbell, Amy; Conkin, Claudia; Hernandez, Jae; Hurst, Nancy
2015-10-01
The expression, storage, preparation, fortification, and feeding of breast milk are common ongoing activities in many neonatal intensive care units (NICUs) today. Errors in breast milk administration are a serious issue that should be prevented to preserve the health and well-being of NICU babies and their families. This paper describes how a program to improve processes surrounding infant feeding was developed, implemented, and evaluated. The project team used a blended quality improvement approach that included the Model for Improvement, Lean and Six Sigma methodologies, and principles of High Reliability Organizations to identify and drive short-term, medium-term, and long-term improvement strategies. Through its blended quality improvement approach, the team strengthened the entire dispensation system for both human milk and formula and outlined a clear vision and plan for further improvements as well. The NICU reduced feeding errors by 83%. Be systematic in the quality improvement approach, and apply proven methods to improving processes surrounding infant feeding. Involve expert project managers with nonclinical perspective to guide work in a systematic way and provide unbiased feedback. Create multidisciplinary, cross-departmental teams that include a vast array of stakeholders in NICU feeding processes to ensure comprehensive examination of current state, identification of potential risks, and "outside the box" potential solutions. As in the realm of pharmacy, the processes involved in preparing feedings for critically ill infants should be carried out via predictable, reliable means including robust automated verification that integrates seamlessly into existing processes. The use of systems employed in pharmacy for medication preparation should be considered in the human milk and formula preparation setting.
The Drug Reimbursement Decision-Making System in Iran.
Ansaripour, Amir; Uyl-de Groot, Carin A; Steenhoek, Adri; Redekop, William K
2014-05-01
Previous studies of health policies in Iran have not focused exclusively on the drug reimbursement process. The aim of this study was to describe the entire drug reimbursement process and the stakeholders, and discuss issues faced by policymakers. Review of documents describing the administrative rules and directives of stakeholders, supplemented by published statistics and interviews with experts and policymakers. Iran has a systematic process for the assessment, appraisal, and judgment of drug reimbursements. The two most important organizations in this process are the Food and Drug Organization, which considers clinical effectiveness, safety, and economic issues, and the Supreme Council of Health Insurance, which considers various criteria, including budget impact and cost-effectiveness. Ultimately, the Iranian Cabinet approves a drug and recommends its use to all health insurance organizations. Reimbursed drugs account for about 53.5% of all available drugs and 77.3% of drug expenditures. Despite its strengths, the system faces various issues, including conflicting stakeholder aims, lengthy decision-making duration, limited access to decision-making details, and rigidity in the assessment process. The Iranian drug reimbursement system uses decision-making criteria and a structured approach similar to those in other countries. Important shortcomings in the system include out-of-pocket contributions due to lengthy decision making, lack of transparency, and conflicting interests among stakeholders. Iranian policymakers should consider a number of ways to remedy these problems, such as case studies of individual drugs and closer examination of experiences in other countries. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Cell-accurate optical mapping across the entire developing heart.
Weber, Michael; Scherf, Nico; Meyer, Alexander M; Panáková, Daniela; Kohl, Peter; Huisken, Jan
2017-12-29
Organogenesis depends on orchestrated interactions between individual cells and morphogenetically relevant cues at the tissue level. This is true for the heart, whose function critically relies on well-ordered communication between neighboring cells, which is established and fine-tuned during embryonic development. For an integrated understanding of the development of structure and function, we need to move from isolated snap-shot observations of either microscopic or macroscopic parameters to simultaneous and, ideally, continuous cell-to-organ scale imaging. We introduce cell-accurate three-dimensional Ca2+-mapping of all cells in the entire electro-mechanically uncoupled heart during the looping stage of live embryonic zebrafish, using high-speed light sheet microscopy and tailored image processing and analysis. We show how myocardial region-specific heterogeneity in cell function emerges during early development and how structural patterning goes hand-in-hand with functional maturation of the entire heart. Our method opens the way to systematic, scale-bridging, in vivo studies of vertebrate organogenesis by cell-accurate structure-function mapping across entire organs.
NASA Technical Reports Server (NTRS)
Vu, Duc; Sandor, Michael; Agarwal, Shri
2005-01-01
CSAM Metrology Software Tool (CMeST) is a computer program for analysis of false-color CSAM images of plastic-encapsulated microcircuits. (CSAM signifies C-mode scanning acoustic microscopy.) The colors in the images indicate areas of delamination within the plastic packages. Heretofore, the images have been interpreted by human examiners. Hence, interpretations have not been entirely consistent and objective. CMeST processes the color information in image-data files to detect areas of delamination without incurring inconsistencies of subjective judgement. CMeST can be used to create a database of baseline images of packages acquired at given times for comparison with images of the same packages acquired at later times. Any area within an image can be selected for analysis, which can include examination of different delamination types by location. CMeST can also be used to perform statistical analyses of image data. Results of analyses are available in a spreadsheet format for further processing. The results can be exported to any data-base-processing software.
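The abstract does not specify CMeST's detection algorithm, so as a rough, hedged illustration of the general idea (classifying pixels of a false-color image by color to quantify delaminated area without subjective judgement), a minimal sketch might look like the following. The function name, the RGB thresholds, and the assumption that delamination is rendered in near-pure red are all hypothetical, not taken from CMeST:

```python
import numpy as np

def delamination_fraction(image, red_min=200, other_max=80):
    """Estimate the delaminated fraction of a false-color C-SAM image.

    Hypothetical sketch: assumes delaminated regions are rendered in
    near-pure red; the real CMeST color mapping is not described in
    the abstract. image is an H x W x 3 uint8 RGB array.
    """
    r, g, b = image[..., 0], image[..., 1], image[..., 2]
    mask = (r >= red_min) & (g <= other_max) & (b <= other_max)
    return mask.mean()  # fraction of pixels flagged as delaminated

# Synthetic 4x4 image: 4 of 16 pixels "delaminated" (pure red).
img = np.zeros((4, 4, 3), dtype=np.uint8)
img[:2, :2, 0] = 255
print(delamination_fraction(img))  # → 0.25
```

A per-region statistic like this could then be compared between baseline and later images of the same package, in the spirit of the database comparison the abstract describes.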
Creating the future: IAIMS planning premises at the University of Washington.
Fuller, S S
1992-01-01
In September 1990, the University of Washington (UW) received a Phase I IAIMS Planning Grant from the National Library of Medicine and embarked upon a planning process involving the entire health sciences center. As a result of our relatively late entry into IAIMS planning, we have been able to learn from the experiences of other health sciences centers and to leverage our existing institutional efforts. Consequently, our progress has been rapid, and in a little over a year, we drafted a long-range plan and embarked on several related research and development projects. The hallmarks of our planning process include careful study of both the UW institutional environment and the experiences of other IAIMS institutions throughout the United States; broad, interdisciplinary participation of faculty, librarians, and administrators; an intensive educational process; a focus on people rather than technology; and, above all, leveraging of existing institutional and research projects that support our vision for the future. PMID:1326372
Comment on "falsification of the Atmospheric CO2 Greenhouse Effects Within the Frame of Physics"
NASA Astrophysics Data System (ADS)
Halpern, Joshua B.; Colose, Christopher M.; Ho-Stuart, Chris; Shore, Joel D.; Smith, Arthur P.; Zimmermann, Jörg
In this journal, Gerhard Gerlich and Ralf D. Tscheuschner claim to have falsified the existence of an atmospheric greenhouse effect [1]. Here, we show that their methods, logic, and conclusions are in error. Their most significant errors include trying to apply the Clausius statement of the Second Law of Thermodynamics to only one side of a heat transfer process rather than the entire process, and systematically ignoring most non-radiative heat flows applicable to the Earth's surface and atmosphere. They claim that radiative heat transfer from a colder atmosphere to a warmer surface is forbidden, ignoring the larger transfer in the other direction which makes the complete process allowed. Further, by ignoring heat capacity and non-radiative heat flows, they claim that radiative balance requires that the surface cool by 100 K or more at night, an obvious absurdity induced by an unphysical assumption. This comment concentrates on these two major points, while also taking note of some of Gerlich and Tscheuschner's other errors and misunderstandings.
Plan for Quality to Improve Patient Safety at the Point of Care
Ehrmeyer, Sharon S.
2011-01-01
The U.S. Institute of Medicine's (IOM) much-publicized report "To Err is Human" (2000, National Academy Press) stated that as many as 98,000 hospitalized patients in the U.S. die each year due to preventable medical errors. This revelation about medical error and patient safety focused the public's and the medical community's attention on errors in healthcare delivery, including laboratory and point-of-care testing (POCT). Errors introduced anywhere in the POCT process can clearly impact quality and place patients' safety at risk. While POCT performed by or near the patient reduces the potential for some errors, the process presents many challenges to quality with its multiple test sites, test menus, testing devices, and non-laboratory analysts, who often have little understanding of quality testing. Inconsistent or absent regulation and the rapid availability of test results for immediate clinical intervention can further amplify errors. System planning and management of the entire POCT process are essential to reduce errors and improve quality and patient safety. PMID:21808107
Input-output characterization of an ultrasonic testing system by digital signal analysis
NASA Technical Reports Server (NTRS)
Karaguelle, H.; Lee, S. S.; Williams, J., Jr.
1984-01-01
The input/output characteristics of an ultrasonic testing system used for stress wave factor measurements were studied. The fundamentals of digital signal processing are summarized. The inputs and outputs are digitized and processed in a microcomputer using digital signal processing techniques. The entire ultrasonic test system, including transducers and all electronic components, is modeled as a discrete-time linear shift-invariant system. Then the impulse response and frequency response of the continuous time ultrasonic test system are estimated by interpolating the defining points in the unit sample response and frequency response of the discrete time system. It is found that the ultrasonic test system behaves as a linear phase bandpass filter. Good results were obtained for rectangular pulse inputs of various amplitudes and durations and for tone burst inputs whose center frequencies are within the passband of the test system and for single cycle inputs of various amplitudes. The input/output limits on the linearity of the system are determined.
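The identification step described above (treating the entire test chain, transducers and electronics included, as a discrete-time linear shift-invariant system and estimating its frequency response from digitized input/output records) can be sketched generically as follows. This is an illustration, not the authors' code; the naive spectral division shown here would normally be replaced by averaged cross-spectral estimates when the measurements are noisy:

```python
import numpy as np

def estimate_frequency_response(x, y):
    """Estimate the frequency response H of a discrete-time linear
    shift-invariant system from one input/output pair as Y/X in the
    frequency domain (naive deconvolution)."""
    return np.fft.rfft(y) / np.fft.rfft(x)

# Sanity check with a known 3-tap FIR "test system" h = [0.5, 0.3, 0.2].
rng = np.random.default_rng(0)
x = rng.standard_normal(256)
h = np.array([0.5, 0.3, 0.2])
# Circular convolution via the FFT keeps the deconvolution exact here.
y = np.fft.irfft(np.fft.rfft(x) * np.fft.rfft(h, 256), n=256)
H = estimate_frequency_response(x, y)
h_est = np.fft.irfft(H, n=256)[:3]  # recovers [0.5, 0.3, 0.2]
```

Interpolating the resulting discrete frequency samples, as the paper describes, then approximates the continuous-time response of the physical system.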
Implementation of the Pan-STARRS Image Processing Pipeline
NASA Astrophysics Data System (ADS)
Fang, Julia; Aspin, C.
2007-12-01
Pan-STARRS, or Panoramic Survey Telescope and Rapid Response System, is a wide-field imaging facility that combines small mirrors with gigapixel cameras. It surveys the entire available sky several times a month, which ultimately requires large amounts of data to be processed and stored immediately. Accordingly, the Image Processing Pipeline (IPP) is a collection of software tools responsible for the primary image analysis for Pan-STARRS. It includes data registration, basic image analysis such as obtaining master images and detrending the exposures, mosaic calibration when applicable, and lastly, image summing and differencing. In this paper I present my work on the installation of IPP 2.1 and 2.2 on a Linux machine, running the Simtest (simulated data used to verify the installation), and finally applying the IPP to two different sets of UH 2.2m Tek data. This work was conducted through a Research Experience for Undergraduates (REU) position at the University of Hawaii's Institute for Astronomy and funded by the NSF.
Terahertz spectroscopic investigation of gallic acid and its monohydrate
NASA Astrophysics Data System (ADS)
Zhang, Bo; Li, Shaoping; Wang, Chenyang; Zou, Tao; Pan, Tingting; Zhang, Jianbing; Xu, Zhou; Ren, Guanhua; Zhao, Hongwei
2018-02-01
The low-frequency spectra of gallic acid (GA) and its monohydrate were investigated by terahertz time-domain spectroscopy (THz-TDS) in the range of 0.5 to 4.5 THz. The dehydration process of GA monohydrate was monitored on-line. The kinetic mechanism of the dehydration process was analyzed based on the THz spectral change at different temperatures. The results indicate that the diffusion of water molecules dominates the speed of the entire dehydration process. Solid-state density functional theory (DFT) calculations of the vibrational modes of both GA and its monohydrate were performed based on their crystalline structures to better interpret the experimental THz spectra. The results demonstrate that the characteristic features of GA mainly originate from the collective vibrations of molecules, and that the interactions between GA and water molecules are responsible for the THz fingerprint of GA monohydrate. Complementary techniques, including differential scanning calorimetry and thermogravimetry (DSC-TG) and powder X-ray diffraction (PXRD), were also employed to further investigate GA and its monohydrate.
3. Elevation view of entire midsection using ultrawide angle lens. ...
3. Elevation view of entire midsection using ultrawide angle lens. Note opened south doors and closed north doors. The following photo WA-203-C-4 is similar except the camera position was moved right to include the slope of the south end. - Puget Sound Naval Shipyard, Munitions Storage Bunker, Naval Ammunitions Depot, South of Campbell Trail, Bremerton, Kitsap County, WA
Monitored Natural Attenuation (MNA) is unique among remedial technologies in relying entirely on natural processes to achieve site-specific objectives. Site characterization is essential to provide site-specific data and interpretations for the decision-making process (i.e., to ...
Why Can't a Computer Be More Like a Brain?
ERIC Educational Resources Information Center
Lerner, Eric J.
1984-01-01
Engineers seeking to develop intelligent computers have looked to studies of the human brain in hope of imitating its processes. A theory (known as cooperative action) that the brain processes information with electromagnetic waves may inspire engineers to develop entirely new types of computers. (JN)
The evolution and future of minimalism in neurological surgery.
Liu, Charles Y; Wang, Michael Y; Apuzzo, Michael L J
2004-11-01
The evolution of the field of neurological surgery has been marked by a progressive minimalism. This has been evident in the development of an entire arsenal of modern neurosurgical enterprises, including microneurosurgery, neuroendoscopy, stereotactic neurosurgery, endovascular techniques, radiosurgical systems, intraoperative and navigational devices, and in the last decade, cellular and molecular adjuvants. In addition to reviewing the major developments and paradigm shifts in the cyclic reinvention of the field as it currently stands, this paper attempts to identify forces and developments that are likely to fuel the irresistible escalation of minimalism into the future. These forces include discoveries in computational science, imaging, molecular science, biomedical engineering, and information processing as they relate to the theme of minimalism. These areas are explained in the light of future possibilities offered by the emerging field of nanotechnology with molecular engineering.
NASA Astrophysics Data System (ADS)
Mainzer, A.; Bauer, J.; Grav, T.; Masiero, J.; Cutri, R. M.; Dailey, J.; Eisenhardt, P.; McMillan, R. S.; Wright, E.; Walker, R.; Jedicke, R.; Spahr, T.; Tholen, D.; Alles, R.; Beck, R.; Brandenburg, H.; Conrow, T.; Evans, T.; Fowler, J.; Jarrett, T.; Marsh, K.; Masci, F.; McCallon, H.; Wheelock, S.; Wittman, M.; Wyatt, P.; DeBaun, E.; Elliott, G.; Elsbury, D.; Gautier, T., IV; Gomillion, S.; Leisawitz, D.; Maleszewski, C.; Micheli, M.; Wilkins, A.
2011-04-01
The Wide-field Infrared Survey Explorer (WISE) has surveyed the entire sky at four infrared wavelengths with greatly improved sensitivity and spatial resolution compared to its predecessors, the Infrared Astronomical Satellite and the Cosmic Background Explorer. NASA's Planetary Science Division has funded an enhancement to the WISE data processing system called "NEOWISE" that allows detection and archiving of moving objects found in the WISE data. NEOWISE has mined the WISE images for a wide array of small bodies in our solar system, including near-Earth objects (NEOs), Main Belt asteroids, comets, Trojans, and Centaurs. By the end of survey operations in 2011 February, NEOWISE identified over 157,000 asteroids, including more than 500 NEOs and ~120 comets. The NEOWISE data set will enable a panoply of new scientific investigations.
NASA Technical Reports Server (NTRS)
1981-01-01
In the development of the business system for the SRB automated production control system, special attention had to be paid to the unique environment posed by the space shuttle. The issues posed by this environment, and the means by which they were addressed, are reviewed. The change in management philosophy which will be required as NASA switches from one-of-a-kind launches to multiple launches is discussed. The implications of the assembly process on the business system are described. These issues include multiple missions, multiple locations and facilities, maintenance and refurbishment, multiple sources, and multiple contractors. The implications of these aspects on the automated production control system are reviewed, including an assessment of the six major subsystems as well as four other subsystems. Some general system requirements which flow through the entire business system are described.
Remotely Monitored Sealing Array Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-09-12
The Remotely Monitored Sealing Array (RMSA) utilizes the Secure Sensor Platform (SSP) framework to establish the fundamental operating capabilities for communication, security, power management, and cryptography. In addition to the SSP framework, the RMSA software has unique capabilities to support monitoring a fiber optic seal. Fiber monitoring includes open and closed states as well as parametric monitoring to detect tampering attacks. The fiber monitoring techniques, using the SSP power management processes, allow the seals to last for years while maintaining the security requirements of the monitoring application. The seal is enclosed in a tamper-resistant housing with software to support active tamper monitoring. New features include LED notification of fiber closure, the ability to retrieve the entire fiber optic history via translator command, separate memory storage for fiber optic events, and a more robust method for tracking and resending failed messages.
NASA Astrophysics Data System (ADS)
Polewski, Przemyslaw; Yao, Wei; Heurich, Marco; Krzystek, Peter; Stilla, Uwe
2018-06-01
In this study, we present a method for improving the quality of automatic single fallen tree stem segmentation in ALS data by applying a specialized constrained conditional random field (CRF). The entire processing pipeline is composed of two steps. First, short stem segments of equal length are detected and a subset of them is selected for further processing, while in the second step the chosen segments are merged to form entire trees. The first step is accomplished using the specialized CRF defined on the space of segment labelings, capable of finding segment candidates which are easier to merge subsequently. To achieve this, the CRF considers not only the features of every candidate individually, but incorporates pairwise spatial interactions between adjacent segments into the model. In particular, pairwise interactions include a collinearity/angular deviation probability which is learned from training data as well as the ratio of spatial overlap, whereas unary potentials encode a learned probabilistic model of the laser point distribution around each segment. Each of these components enters the CRF energy with its own balance factor. To process previously unseen data, we first calculate the subset of segments for merging on a grid of balance factors by minimizing the CRF energy. Then, we perform the merging and rank the balance configurations according to the quality of their resulting merged trees, obtained from a learned tree appearance model. The final result is derived from the top-ranked configuration. We tested our approach on 5 plots from the Bavarian Forest National Park using reference data acquired in a field inventory. Compared to our previous segment selection method without pairwise interactions, an increase in detection correctness and completeness of up to 7 and 9 percentage points, respectively, was observed.
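As a hedged illustration of the kind of energy minimization described above (unary potentials per segment candidate plus weighted pairwise interaction terms between adjacent segments), a toy binary CRF with exhaustive minimization is sketched below. The actual model uses learned collinearity, overlap, and laser-point-distribution potentials with tuned balance factors; the numbers here are invented caricatures of those terms:

```python
import itertools
import numpy as np

def crf_energy(labels, unary, pairs, pairwise, w_pair=1.0):
    """Energy of a binary labeling of segment candidates: a sum of
    unary potentials plus weighted pairwise terms on adjacent pairs
    (illustrative stand-in for the paper's learned potentials)."""
    e = sum(unary[i][l] for i, l in enumerate(labels))
    e += w_pair * sum(pairwise[k][labels[i], labels[j]]
                      for k, (i, j) in enumerate(pairs))
    return e

def minimize_exhaustive(unary, pairs, pairwise, w_pair=1.0):
    """Brute-force energy minimization; real pipelines use graph-based
    or message-passing solvers for larger label spaces."""
    n = len(unary)
    return min(itertools.product((0, 1), repeat=n),
               key=lambda lab: crf_energy(lab, unary, pairs, pairwise, w_pair))

# Toy example: 3 segments; segment 2 is weakly supported on its own,
# but a collinearity reward with segment 1 pulls it into the selection.
unary = np.array([[1.0, 0.0], [1.0, 0.0], [0.0, 0.2]])
pairs = [(1, 2)]
pairwise = [np.array([[0.0, 0.5], [0.5, -0.4]])]  # reward joint selection
print(minimize_exhaustive(unary, pairs, pairwise))  # → (1, 1, 1)
```

The point of the toy is the one made in the abstract: with pairwise interactions, a segment that is unconvincing in isolation can still be selected when its neighbors support it.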
Seidelmann, Katrin; Melzer, Björn; Speck, Thomas
2012-11-01
Monkey's comb (Amphilophium crucigerum) is a widely spread neotropical leaf climber that develops attachment pads for anchorage. A single complex leaf of the species comprises a basal pair of foliate, assimilating leaflets and apical, attaching leaflet tendrils. This study aims to analyze these leaves and their ontogenetic development for a better understanding of the attachment process, the form-structure-function relationships involved, and the overall maturation of the leaves. Thorough morphometrical, morphological, and anatomical analyses incorporated high-resolution microscopy, various staining techniques, SEM, and photographic recordings over the entire ontogenetic course of leaf development. The foliate, assimilating leaflets and the anchorage of the more apical leaflet tendrils acted independently of each other. Attachment was achieved by coiling of the leaflet tendrils and/or development of attachment pads at the tendril apices that grow opportunistically into gaps and fissures of the substrate. In contact zones with the substrate, the cells of the pads differentiate into a vessel element-like tissue. During the entire attachment process of the plant, no glue was excreted. The complex leaves of monkey's comb are highly differentiated organs with specialized leaf parts whose functions (photosynthesis or attachment) work independently of each other. The attachment function includes the coiling and maturation of the leaflet tendrils and the formation of attachment pads, resulting in a biomechanically sound and persistent anchorage of the plant without the need for glue excretion. This kind of glue-less attachment is of interest not only in the framework of analyzing the functional variety of attachment structures evolved in climbing plants, but also for the development of innovative biomimetic attachment structures for manifold technical applications.
NASA Astrophysics Data System (ADS)
Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke
2017-08-01
In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupled with surface-enhanced Raman spectroscopy (SERS) was developed for rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consisted of a specially designed sampling cell, including a rigid sample container and a flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The LVCC sampling technique allowed a large, alterable, and well-controlled sampling volume, which kept the concentration of the gas target in the headspace phase constant during the entire sampling process and made the sampling result more representative. Moreover, absorption and derivatization of the gas target during the LVCC sampling process were efficiently merged into one step using bromine-thiourea and OPA-NH4+ strategies for ethylene and SO2, respectively, which made the LVCC sampling technique conveniently adaptable to subsequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied to the rapid analysis of trace ethylene and SO2 from fruits. Trace ethylene and SO2 from real fruit samples could be accurately quantified by this method. The minor concentration fluctuations of ethylene and SO2 during the entire LVCC sampling process were shown to be below 4.3% and 2.1%, respectively. Good recoveries for ethylene and sulfur dioxide from fruit samples were achieved, in the range of 95.0-101% and 97.0-104%, respectively. It is expected that the portable LVCC sampling technique will pave the way for rapid on-site analysis of accurate concentrations of trace gas targets from real samples by SERS.
Intragenome Diversity of Gene Families Encoding Toxin-like Proteins in Venomous Animals.
Rodríguez de la Vega, Ricardo C; Giraud, Tatiana
2016-11-01
The evolution of venoms is the story of how toxins arise and of the processes that generate and maintain their diversity. For animal venoms these processes include recruitment for expression in the venom gland, neofunctionalization, paralogous expansions, and functional divergence. The systematic study of these processes requires the reliable identification of the venom components involved in antagonistic interactions. High-throughput sequencing has the potential to uncover the entire set of toxins in a given organism, yet the existence of non-venom toxin paralogs and the misleading effects of a partial census of the molecular diversity of toxins make it necessary to collect complementary evidence to distinguish true toxins from their non-venom paralogs. Here, we analyzed the whole genomes of two scorpions, one spider and one snake, aiming at the identification of the full repertoires of genes encoding toxin-like proteins. We classified the entire set of protein-coding genes into paralogous groups and monotypic genes, identified genes encoding toxin-like proteins based on known toxin families, and quantified their expression in both venom glands and pooled tissues. Our results confirm that genes encoding toxin-like proteins are part of multigene families, and that these families arise by recruitment events from non-toxin genes followed by limited expansions of the toxin-like protein coding genes. We also show that failing to account for sequence similarity with non-toxin proteins has a considerable misleading effect that can be greatly reduced by comparative transcriptomics. Our study overall contributes to the understanding of the evolutionary dynamics of proteins involved in antagonistic interactions. © The Author 2016. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology. All rights reserved. For permissions please email: journals.permissions@oup.com.
Powering the planet: Chemical challenges in solar energy utilization
Lewis, Nathan S.; Nocera, Daniel G.
2006-01-01
Global energy consumption is projected to increase, even in the face of substantial declines in energy intensity, at least 2-fold by midcentury relative to the present because of population and economic growth. This demand could be met, in principle, from fossil energy resources, particularly coal. However, the cumulative nature of CO2 emissions in the atmosphere demands that holding atmospheric CO2 levels to even twice their preanthropogenic values by midcentury will require invention, development, and deployment of schemes for carbon-neutral energy production on a scale commensurate with, or larger than, the entire present-day energy supply from all sources combined. Among renewable energy resources, solar energy is by far the largest exploitable resource, providing more energy in 1 hour to the earth than all of the energy consumed by humans in an entire year. In view of the intermittency of insolation, if solar energy is to be a major primary energy source, it must be stored and dispatched on demand to the end user. An especially attractive approach is to store solar-converted energy in the form of chemical bonds, i.e., in a photosynthetic process at a year-round average efficiency significantly higher than current plants or algae, to reduce land-area requirements. Scientific challenges involved with this process include schemes to capture and convert solar energy and then store the energy in the form of chemical bonds, producing oxygen from water and a reduced fuel such as hydrogen, methane, methanol, or other hydrocarbon species. PMID:17043226
Moore, Amy Lawson; Miller, Terissa M
2018-01-01
The purpose of the current study is to evaluate the validity and reliability of the revised Gibson Test of Cognitive Skills, a computer-based battery of tests measuring short-term memory, long-term memory, processing speed, logic and reasoning, visual processing, as well as auditory processing and word attack skills. This study included 2,737 participants aged 5-85 years. A series of studies was conducted to examine the validity and reliability using the test performance of the entire norming group and several subgroups. The evaluation of the technical properties of the test battery included content validation by subject matter experts, item analysis and coefficient alpha, test-retest reliability, split-half reliability, and analysis of concurrent validity with the Woodcock Johnson III Tests of Cognitive Abilities and Tests of Achievement. Results indicated strong sources of evidence of validity and reliability for the test, including internal consistency reliability coefficients ranging from 0.87 to 0.98, test-retest reliability coefficients ranging from 0.69 to 0.91, split-half reliability coefficients ranging from 0.87 to 0.91, and concurrent validity coefficients ranging from 0.53 to 0.93. The Gibson Test of Cognitive Skills-2 is a reliable and valid tool for assessing cognition in the general population across the lifespan.
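One of the reported reliability measures, coefficient alpha (internal consistency), can be computed directly from an examinee-by-item score matrix. The sketch below uses invented toy data, not the study's, and assumes the standard Cronbach's alpha formula:

```python
import numpy as np

def cronbach_alpha(items):
    """Coefficient alpha for an (examinees x items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()   # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
    return k / (k - 1) * (1.0 - item_var / total_var)

# Toy data: 5 examinees x 4 items with consistent rank ordering,
# so internal consistency should be high (alpha near 0.98 here).
scores = np.array([[4, 5, 4, 5],
                   [3, 4, 3, 4],
                   [2, 2, 3, 2],
                   [1, 2, 1, 1],
                   [0, 1, 0, 1]])
print(round(cronbach_alpha(scores), 3))
```

Test-retest and split-half coefficients like those reported would be computed analogously, as correlations between repeated administrations or between half-test scores.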
Park, Jong Hyuk; Nagpal, Prashant; McPeak, Kevin M; Lindquist, Nathan C; Oh, Sang-Hyun; Norris, David J
2013-10-09
The template-stripping method can yield smooth patterned films without surface contamination. However, the process is typically limited to coinage metals such as silver and gold because other materials cannot be readily stripped from silicon templates due to strong adhesion. Herein, we report a more general template-stripping method that is applicable to a larger variety of materials, including refractory metals, semiconductors, and oxides. To address the adhesion issue, we introduce a thin gold layer between the template and the deposited materials. After peeling off the combined film from the template, the gold layer can be selectively removed via wet etching to reveal a smooth patterned structure of the desired material. Further, we demonstrate template-stripped multilayer structures that have potential applications for photovoltaics and solar absorbers. An entire patterned device, which can include a transparent conductor, semiconductor absorber, and back contact, can be fabricated. Since our approach can also produce many copies of the patterned structure with high fidelity by reusing the template, a low-cost and high-throughput process in micro- and nanofabrication is provided that is useful for electronics, plasmonics, and nanophotonics.
Space Shuttle propulsion parameter estimation using optimal estimation techniques, volume 1
NASA Technical Reports Server (NTRS)
1983-01-01
The mathematical developments and their computer program implementation for the Space Shuttle propulsion parameter estimation project are summarized. The estimation approach chosen is extended Kalman filtering with a modified Bryson-Frazier smoother, motivated by the objective of obtaining better estimates than filtering alone provides and of eliminating the lag associated with filtering. The estimation technique uses as the dynamical process the six-degree-of-freedom equations of motion, resulting in twelve state vector elements. In addition to these, mass and solid propellant burn depth serve as "system" state elements. The "parameter" state elements can include deviations from reference values of aerodynamic coefficients, inertia, center of gravity, atmospheric winds, etc. Propulsion parameter state elements are included not as options like those just discussed, but as the main parameter states to be estimated. The mathematical developments were completed for all these parameters. Since the system dynamics and measurement processes are nonlinear functions of the states, the mathematical developments are taken up almost entirely by the linearization of these equations as required by the estimation algorithms.
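For readers unfamiliar with the filter itself, a minimal scalar extended-Kalman-filter step (a generic textbook sketch, not the report's Shuttle implementation, and omitting the Bryson-Frazier smoother) looks like:

```python
def ekf_step(x, P, z, f, F, h, H, Q, R):
    """One predict/update cycle of a scalar extended Kalman filter.
    f, h are the nonlinear process/measurement models; F, H their Jacobians,
    evaluated at the current estimate (the linearization the report refers to)."""
    # Predict: propagate the estimate and covariance through linearized dynamics
    x_pred = f(x)
    P_pred = F(x) * P * F(x) + Q
    # Update: blend in the measurement z via the Kalman gain
    Hp = H(x_pred)
    S = Hp * P_pred * Hp + R          # innovation variance
    K = P_pred * Hp / S               # Kalman gain
    x_new = x_pred + K * (z - h(x_pred))
    P_new = (1.0 - K * Hp) * P_pred
    return x_new, P_new

# Linear special case: f(x)=x, h(x)=x reduces to the ordinary Kalman filter
x, P = ekf_step(0.0, 1.0, 2.0,
                f=lambda x: x, F=lambda x: 1.0,
                h=lambda x: x, H=lambda x: 1.0, Q=0.0, R=1.0)
print(x, P)  # 1.0 0.5
```

A smoother such as the modified Bryson-Frazier pass would then run backward over the stored filter outputs to remove the lag mentioned above.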
Applications of remote-sensing data in Alaska
NASA Technical Reports Server (NTRS)
Miller, J. M. (Principal Investigator)
1977-01-01
Public and private agencies were introduced to the use of remotely sensed data obtained by both satellite and aircraft, and benefited from facilities for data processing, enhancement, and interpretation as well as from the institute's data library. Cooperative ventures involving the performance of operational activities included assistance to the Bureau of Land Management in the suppression of wildfires; the selection of sites for power line rights-of-way; the mapping of leads in sea ice; determination of portions of public lands to be allocated for small scale farming; the identification of areas for large scale farming of barley; the observation of coastal processes and sediment transport near Prudhoe Bay; the establishment of a color infrared file of the entire state; and photomapping for geological surveys. Monitoring of the outer continental shelf environment and reindeer herds was also conducted. Institutional constraints to full utilization of satellite remote sensing in the state are explored, and plans for future activities include the generation of awareness by government agencies, the training of state personnel, and improved coordination and communication with users.
Understanding and reduction of defects on finished EUV masks
NASA Astrophysics Data System (ADS)
Liang, Ted; Sanchez, Peter; Zhang, Guojing; Shu, Emily; Nagpal, Rajesh; Stivers, Alan
2005-05-01
To reduce the risk of EUV lithography adoption for the 32nm technology node in 2009, Intel has operated an EUV mask Pilot Line since early 2004. The Pilot Line integrates all the necessary process modules, including common tool sets shared with current photomask production as well as EUV-specific tools. This integrated endeavor ensures a comprehensive understanding of any issues and the development of solutions for the eventual fabrication of defect-free EUV masks. Two enabling modules for "defect-free" masks are pattern inspection and repair, which have been integrated into the Pilot Line. This is the first time we are able to look at real defects, originating from multilayer blanks and the patterning process, on finished masks over the entire mask area. In this paper, we describe our efforts in the qualification of DUV pattern inspection and electron beam mask repair tools for Pilot Line operation, including inspection tool sensitivity, defect classification and characterization, and defect repair. We discuss the origins of each of the five classes of defects as seen by the DUV pattern inspection tool on finished masks, and present solutions for eliminating and mitigating them.
Distinct brain networks for adaptive and stable task control in humans
Dosenbach, Nico U. F.; Fair, Damien A.; Miezin, Francis M.; Cohen, Alexander L.; Wenger, Kristin K.; Dosenbach, Ronny A. T.; Fox, Michael D.; Snyder, Abraham Z.; Vincent, Justin L.; Raichle, Marcus E.; Schlaggar, Bradley L.; Petersen, Steven E.
2007-01-01
Control regions in the brain are thought to provide signals that configure the brain's moment-to-moment information processing. Previously, we identified regions that carried signals related to task-control initiation, maintenance, and adjustment. Here we characterize the interactions of these regions by applying graph theory to resting state functional connectivity MRI data. In contrast to previous, more unitary models of control, this approach suggests the presence of two distinct task-control networks. A frontoparietal network included the dorsolateral prefrontal cortex and intraparietal sulcus. This network emphasized start-cue and error-related activity and may initiate and adapt control on a trial-by-trial basis. The second network included dorsal anterior cingulate/medial superior frontal cortex, anterior insula/frontal operculum, and anterior prefrontal cortex. Among other signals, these regions showed activity sustained across the entire task epoch, suggesting that this network may control goal-directed behavior through the stable maintenance of task sets. These two independent networks appear to operate on different time scales and affect downstream processing via dissociable mechanisms. PMID:17576922
Shraiki, Mario; Arba-Mosquera, Samuel
2011-06-01
To evaluate ablation algorithms and temperature changes in laser refractive surgery. The model (virtual laser system [VLS]) simulates the physical effects of an entire surgical procedure, representing the shot-by-shot ablation process with a modeled beam profile. The model is comprehensive and directly considers the applied correction; corneal geometry, including astigmatism; laser beam characteristics; and ablative spot properties. Pulse lists collected from actual treatments were used to simulate the temperature increase during the ablation process. Reduced ablation efficiency in the periphery resulted in a lower peripheral temperature increase. Steep corneas had smaller temperature increases than flat ones. The maximum rise in temperature depends on the spatial density of the ablation pulses. For the same number of ablative pulses, myopic corrections showed the highest temperature increase, followed by myopic astigmatism, mixed astigmatism, phototherapeutic keratectomy (PTK), hyperopic astigmatism, and hyperopic treatments. The proposed model can be used, at relatively low cost, for calibration, verification, and validation of the laser systems used for ablation processes and would directly improve the quality of the results.
Title TBA: Revising the Abstract Submission Process.
Tibon, Roni; Open Science Committee, Cbu; Henson, Richard
2018-04-01
Academic conferences are among the most prolific scientific activities, yet the current abstract submission and review process has serious limitations. We propose a revised process that would address these limitations, achieve some of the aims of Open Science, and stimulate discussion throughout the entire lifecycle of the scientific work. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-13
... Vehicle Access Element of the CDCA Plan for the WEMO area; and (2) Alternative processes for designating.... Identification of the process and decision criteria that should be used to designate routes in the sub-regional... analysis, and guide the entire process from plan decision-making to route designation review in order to...
Information Transparency in Education: Three Sides of a Two-Sided Process
ERIC Educational Resources Information Center
Mertsalova, T. A.
2015-01-01
Information transparency is the result of informational globalization and the avalanche of information and communication technologies: thus, these processes are natural for the whole modern society. Statistics show that during the past several years the transparency situation not just in education but in the entire society has expanded…
Fetal programming and environmental exposures: Implications for prenatal care and preterm birth
Fetal programming is an enormously complex process that relies on numerous environmental inputs from uterine tissue, the placenta, the maternal blood supply, and other sources. Recent evidence has made clear that the process is not based entirely on genetics, but rather on a deli...
Paths for Future Population Aging.
ERIC Educational Resources Information Center
Grigsby, Jill S.
Population aging refers to an entire age structure becoming older. The age structure of a population is the result of three basic processes: fertility, mortality, and migration. Age structures reflect both past effects and current patterns of these processes. At the town, city, or regional level, migration becomes an important factor in raising…
Code of Federal Regulations, 2013 CFR
2013-01-01
... of air from a retort before the start of process timing. (x) Water activity. The ratio of the water vapor pressure of the product to the vapor pressure of pure water at the same temperature. ... throughout the entire thermal process. (d) Canned product. A poultry food product with a water activity above...
Students Matter: Quality Measurements in Online Courses
ERIC Educational Resources Information Center
Unal, Zafer; Unal, Aslihan
2016-01-01
Quality Matters (QM) is a peer review process designed to certify the quality of online courses and online components. It has generated widespread interest and received national recognition for its peer-based approach to quality assurance and continuous improvement in online education. While the entire QM online course design process is…
The Core Flight System (cFS) Community: Providing Low Cost Solutions for Small Spacecraft
NASA Technical Reports Server (NTRS)
McComas, David; Wilmot, Jonathan; Cudmore, Alan
2016-01-01
In February 2015 the NASA Goddard Space Flight Center (GSFC) completed the open source release of the entire Core Flight Software (cFS) suite. After the open source release a multi-NASA center Configuration Control Board (CCB) was established that has managed multiple cFS product releases. The cFS was developed and is being maintained in compliance with the NASA Class B software development process requirements and the open source release includes all Class B artifacts. The cFS is currently running on three operational science spacecraft and is being used on multiple spacecraft and instrument development efforts. While the cFS itself is a viable flight software (FSW) solution, we have discovered that the cFS community is a continuous source of innovation and growth that provides products and tools that serve the entire FSW lifecycle and future mission needs. This paper summarizes the current state of the cFS community, the key FSW technologies being pursued, the development/verification tools and opportunities for the small satellite community to become engaged. The cFS is a proven high quality and cost-effective solution for small satellites with constrained budgets.
[Fatigue damage analysis of porcelain in all-ceramic crowns].
Liu, Yi-hong; Feng, Hai-lan; Liu, Guang-hua; Shen, Zhi-jian
2010-02-18
To investigate the fatigue damage mechanism of porcelain and its relation to microscopic defects in clinically failed all-ceramic crowns. Bilayered all-ceramic crowns that had failed in vivo were collected. The fractured and occlusal surfaces of the failed crowns were examined with an optical microscope, followed by detailed fractography using a field emission scanning electron microscope. Where chemical impurities were of concern, energy-dispersive X-ray spectroscopy was performed to examine chemical composition. A standard practice for fractographic failure analysis of advanced ceramics was applied to disclose the fracture mode and damage character. Three types of fracture features were defined: breakdown of the entire crown, porcelain chipping-off, and porcelain delamination. Alumina crowns were usually characterized by breakdown of the entire crown, and zirconia crowns by porcelain chipping-off and delamination. The fatigue damage of porcelain was classified into surface wear, cone cracking, and porcelain delamination. The microscopic defects observed in this study included air bubbles and impurity particles. Multi-point occlusal contacts are recommended clinically in all-ceramic restorations. The thickness of the porcelain is important for its fatigue resistance, and care must be taken to avoid contamination during the veneering process.
Design of a control system for the LECR3
NASA Astrophysics Data System (ADS)
Zhou, Wen-Xiong; Wang, Yan-Yu; Zhou, De-Tai; Lin, Fu-Yuan; Luo, Jin-Fu; Yu, Yan-Juan; Feng, Yu-Cheng; Lu, Wang
2013-11-01
The Lanzhou Electron Cyclotron Resonance Ion Source No. 3 (LECR3) plays an important role in supplying many kinds of ion beams to the Heavy Ion Research Facility in Lanzhou (HIRFL). In this paper, we provide a detailed description of a new remote control system for the LECR3 that we designed and implemented. This system uses typical distributed control for both the LECR3 and the newly-built Lanzhou All Permanent Magnet ECR Ion Source No. 1 (LAPECR1). The entire project, including the construction of the hardware and the software, was completed in September 2012. The hardware consists of an industrial PC (IPC), an intranet built around a switch, and various controllers with Ethernet access. The software is written in C++ and controls all of the equipment through the intranet, storing the useful information in a database for later analysis. The entire system can acquire the necessary data from the equipment three times per second and store it in the database, and it completes the interlock protection and alarm process within one second.
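The paper does not publish its C++ source; as a language-neutral illustration of the acquire-and-store pattern it describes (poll each device a few times per second, persist every reading for later analysis), a small Python sketch with a hypothetical `read_channel` callable standing in for the controller interface:

```python
import sqlite3
import time

def log_readings(read_channel, db_path=":memory:", n_samples=3, period=1.0 / 3):
    """Poll a device channel at ~3 Hz (period = 1/3 s) and persist each
    reading to a database, mirroring the acquisition rate described above."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS readings (t REAL, value REAL)")
    for _ in range(n_samples):
        con.execute("INSERT INTO readings VALUES (?, ?)",
                    (time.time(), read_channel()))
        con.commit()          # flush so alarm/analysis readers see data at once
        time.sleep(period)
    return con

con = log_readings(lambda: 42.0, n_samples=3, period=0.0)
print(con.execute("SELECT COUNT(*) FROM readings").fetchone()[0])  # 3
```

A production system would additionally check each reading against interlock thresholds before committing, which is the part the paper reports completing within one second.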
A comprehensive three-dimensional cortical map of vowel space.
Scharinger, Mathias; Idsardi, William J; Poe, Samantha
2011-12-01
Mammalian cortex is known to contain various kinds of spatial encoding schemes for sensory information including retinotopic, somatosensory, and tonotopic maps. Tonotopic maps are especially interesting for human speech sound processing because they encode linguistically salient acoustic properties. In this study, we mapped the entire vowel space of a language (Turkish) onto cortical locations by using the magnetic N1 (M100), an auditory-evoked component that peaks approximately 100 msec after auditory stimulus onset. We found that dipole locations could be structured into two distinct maps, one for vowels produced with the tongue positioned toward the front of the mouth (front vowels) and one for vowels produced in the back of the mouth (back vowels). Furthermore, we found spatial gradients in lateral-medial, anterior-posterior, and inferior-superior dimensions that encoded the phonetic, categorical distinctions between all the vowels of Turkish. Statistical model comparisons of the dipole locations suggest that the spatial encoding scheme is not entirely based on acoustic bottom-up information but crucially involves featural-phonetic top-down modulation. Thus, multiple areas of excitation along the unidimensional basilar membrane are mapped into higher dimensional representations in auditory cortex.
Chung-Davidson, Yu-Wen; Davidson, Peter J.; Scott, Anne M.; Walaszczyk, Erin J.; Brant, Cory O.; Buchinger, Tyler; Johnson, Nicholas S.; Li, Weiming
2014-01-01
Biliary atresia is a rare disease of infancy, with an estimated 1 in 15,000 frequency in the southeast United States, but more common in East Asian countries, with a reported frequency of 1 in 5,000 in Taiwan. Although much is known about the management of biliary atresia, its pathogenesis is still elusive. The sea lamprey (Petromyzon marinus) provides a unique opportunity to examine the mechanism and progression of biliary degeneration. Sea lamprey develop through three distinct life stages: larval, parasitic, and adult. During the transition from larvae to parasitic juvenile, sea lamprey undergo metamorphosis with dramatic reorganization and remodeling in external morphology and internal organs. In the liver, the entire biliary system is lost, including the gall bladder and the biliary tree. A newly-developed method called “CLARITY” was modified to clarify the entire liver and the junction with the intestine in metamorphic sea lamprey. The process of biliary degeneration was visualized and discerned during sea lamprey metamorphosis by using laser scanning confocal microscopy. This method provides a powerful tool to study biliary atresia in a unique animal model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nimbalkar, Sachin U.; Guo, Wei; Wenning, Thomas J.
Smart manufacturing and advanced data analytics can help the manufacturing sector unlock energy efficiency from the equipment level to the entire manufacturing facility and the whole supply chain. These technologies can make manufacturing industries more competitive, with intelligent communication systems, real-time energy savings, and increased energy productivity. Smart manufacturing can give all employees in an organization the actionable information they need, when they need it, so that each person can contribute to the optimal operation of the corporation through informed, data-driven decision making. This paper examines smart technologies and data analytics approaches for improving energy efficiency and reducing energy costs in process-supporting energy systems. It dives into energy-saving improvement opportunities through smart manufacturing technologies and sophisticated data collection and analysis. The energy systems covered in this paper include those with motors and drives, fans, pumps, air compressors, steam, and process heating.
Switching dynamics of TaOx-based threshold switching devices
NASA Astrophysics Data System (ADS)
Goodwill, Jonathan M.; Gala, Darshil K.; Bain, James A.; Skowronski, Marek
2018-03-01
Bi-stable volatile switching devices are being used as access devices in solid-state memory arrays and as the active part of compact oscillators. Such structures exhibit two stable resistance states and switch between them at a critical value of voltage or current. A typical resistance transient under a constant-amplitude voltage pulse starts with a slow decrease, followed by a rapid drop and leveling off at a low steady-state value. This behavior prompted the interpretation of the initial delay and the fast transition as due to two different processes. Here, we show that the entire transient in TaOx-based switching, including the incubation time, transition time, and final resistance values, can be explained by one process, namely Joule heating, with the rapid transition due to thermal runaway. The time required for the device in the conducting state to relax back to the stable high-resistance state is also consistent with the proposed mechanism.
Reading Aloud: Discrete Stage(s) Redux
Robidoux, Serje; Besner, Derek
2017-01-01
Interactive activation accounts of processing have had a broad and deep influence on cognitive psychology, particularly so in the context of computational accounts of reading aloud at the single word level. Here we address the issue of whether such a framework can simulate the joint effects of stimulus quality and word frequency (which have been shown to produce both additive and interactive effects depending on the context). We extend previous work on this question by considering an alternative implementation of a stimulus quality manipulation, and the role of interactive activation. Simulations with a version of the Dual Route Cascaded model (a model with interactive activation dynamics along the lexical route) demonstrate that the model is unable to simulate the entire pattern seen in human performance. We discuss how a hybrid interactive activation model that includes some context dependent staged processing could accommodate these data. PMID:28289395
Poissonian steady states: from stationary densities to stationary intensities.
Eliazar, Iddo
2012-10-01
Markov dynamics are the most elemental and omnipresent form of stochastic dynamics in the sciences, with applications ranging from physics to chemistry, from biology to evolution, and from economics to finance. Markov dynamics can be either stationary or nonstationary. Stationary Markov dynamics represent statistical steady states and are quantified by stationary densities. In this paper, we generalize the notion of steady state to the case of general Markov dynamics. Considering an ensemble of independent motions governed by common Markov dynamics, we establish that the entire ensemble attains Poissonian steady states which are quantified by stationary Poissonian intensities and which hold valid also in the case of nonstationary Markov dynamics. The methodology is applied to a host of Markov dynamics, including Brownian motion, birth-death processes, random walks, geometric random walks, renewal processes, growth-collapse dynamics, decay-surge dynamics, Ito diffusions, and Langevin dynamics.
Incorporating learning goals about modeling into an upper-division physics laboratory experiment
NASA Astrophysics Data System (ADS)
Zwickl, Benjamin M.; Finkelstein, Noah; Lewandowski, H. J.
2014-09-01
Implementing a laboratory activity involves a complex interplay among learning goals, available resources, feedback about the existing course, best practices for teaching, and an overall philosophy about teaching labs. Building on our previous work, which described a process of transforming an entire lab course, we now turn our attention to how an individual lab activity on the polarization of light was redesigned to include a renewed emphasis on one broad learning goal: modeling. By using this common optics lab as a concrete case study of a broadly applicable approach, we highlight many aspects of the activity development and show how modeling is used to integrate sophisticated conceptual and quantitative reasoning into the experimental process through the various aspects of modeling: constructing models, making predictions, interpreting data, comparing measurements with predictions, and refining models. One significant outcome is a natural way to integrate an analysis and discussion of systematic error into a lab activity.
Lingam, Manasvi
2016-06-01
In this paper, percolation theory is employed to place tentative bounds on the probability p of interstellar travel and the emergence of a civilization (or panspermia) that colonizes the entire Galaxy. The ensuing ramifications with regard to the Fermi paradox are also explored. In particular, it is suggested that the correlation function of inhabited exoplanets can be used to observationally constrain p in the near future. It is shown, by using a mathematical evolution model known as the Yule process, that the probability distribution for civilizations with a given number of colonized worlds is likely to exhibit a power-law tail. Some of the dynamical aspects of this issue, including the question of timescales and generalizing percolation theory, were also studied. The limitations of these models, and other avenues for future inquiry, are also outlined. Complex life-Extraterrestrial life-Panspermia-Life detection-SETI. Astrobiology 16, 418-426.
NASA Technical Reports Server (NTRS)
Magness, E. R. (Principal Investigator)
1980-01-01
The success of the Transition Year procedure to separate and label barley and the other small grains was assessed. It was decided that developers of the procedure would carry out the exercise in order to prevent compounding procedural problems with implementation problems. The evaluation proceeded by labeling the spring small grains first. The accuracy of this labeling was, on the average, somewhat better than that in the Transition Year operations. Other departures from the original procedure included a regionalization of the labeling process, the use of trend analysis, and the removal of time constraints from the actual processing. Segment selection, ground truth derivation, and data available for each segment in the analysis are discussed. Labeling accuracy is examined for North Dakota, South Dakota, Minnesota, and Montana as well as for the entire four-state area. Errors are characterized.
Cyclic electron flow is redox-controlled but independent of state transition.
Takahashi, Hiroko; Clowez, Sophie; Wollman, Francis-André; Vallon, Olivier; Rappaport, Fabrice
2013-01-01
Photosynthesis is the biological process that feeds the biosphere with reduced carbon. The assimilation of CO2 requires the fine tuning of two co-existing functional modes: linear electron flow, which provides NADPH and ATP, and cyclic electron flow, which only sustains ATP synthesis. Although the importance of this fine tuning is appreciated, its mechanism remains equivocal. Here we show that cyclic electron flow as well as formation of supercomplexes, thought to contribute to the enhancement of cyclic electron flow, are promoted in reducing conditions with no correlation with the reorganization of the thylakoid membranes associated with the migration of antenna proteins towards Photosystems I or II, a process known as state transition. We show that cyclic electron flow is tuned by the redox power and this provides a mechanistic model applying to the entire green lineage including the vast majority of the cases in which state transition only involves a moderate fraction of the antenna.
Energetic particle influences in Earth's atmosphere
NASA Astrophysics Data System (ADS)
Aplin, Karen; Harrison, R. Giles; Nicoll, Keri; Rycroft, Michael; Briggs, Aaron
2016-04-01
Energetic particles from outer space, known as galactic cosmic rays, constantly ionise the entire atmosphere. During strong solar storms, solar energetic particles can also reach the troposphere and enhance ionisation. Atmospheric ionisation generates cluster ions. These facilitate current flow in the global electric circuit, which arises from charge separation in thunderstorms driven by meteorological processes. Energetic particles, whether solar or galactic in origin, may influence the troposphere and stratosphere through a range of different mechanisms, each probably contributing a small amount. Some of the suggested processes potentially acting over a wide spatial area in the troposphere include enhanced scavenging of charged aerosol particles, modification of droplet or droplet-droplet behavior by charging, and the direct absorption of infra-red radiation by the bending and stretching of hydrogen bonds inside atmospheric cluster-ions. As well as reviewing the proposed mechanisms by which energetic particles modulate atmospheric properties, we will also discuss new instrumentation for measurement of energetic particles in the atmosphere.
Open Science CBS Neuroimaging Repository: Sharing ultra-high-field MR images of the brain.
Tardif, Christine Lucas; Schäfer, Andreas; Trampel, Robert; Villringer, Arno; Turner, Robert; Bazin, Pierre-Louis
2016-01-01
Magnetic resonance imaging at ultra high field opens the door to quantitative brain imaging at sub-millimeter isotropic resolutions. However, novel image processing tools to analyze these new rich datasets are lacking. In this article, we introduce the Open Science CBS Neuroimaging Repository: a unique repository of high-resolution and quantitative images acquired at 7 T. The motivation for this project is to increase interest for high-resolution and quantitative imaging and stimulate the development of image processing tools developed specifically for high-field data. Our growing repository currently includes datasets from MP2RAGE and multi-echo FLASH sequences from 28 and 20 healthy subjects respectively. These datasets represent the current state-of-the-art in in-vivo relaxometry at 7 T, and are now fully available to the entire neuroimaging community. Copyright © 2015 Elsevier Inc. All rights reserved.
Navigating Institutions and Institutional Leadership to Address Sexual Violence
ERIC Educational Resources Information Center
Sisneros, Kathy; Rivera, Monica
2018-01-01
Using an institutional example, this chapter offers strategies to effectively navigate institutional culture, processes, and structures to engage the entire campus community in addressing sexual violence.
ERIC Educational Resources Information Center
Journal of Chemical Education, 2000
2000-01-01
This activity takes students through the process of fermentation. Requires an entire month for the full reaction to take place. The reaction, catalyzed by bacterial enzymes, produces lactic acid from glucose. (SAH)
Kutzin, Joseph; Ibraimova, Ainura; Jakab, Melitta; O'Dougherty, Sheila
2009-07-01
Options for health financing reform are often portrayed as a choice between general taxation (known as the Beveridge model) and social health insurance (known as the Bismarck model). Ten years of health financing reform in Kyrgyzstan, since the introduction of its compulsory health insurance fund in 1997, provide an excellent example of why it is wrong to reduce health financing policy to a choice between the Beveridge and Bismarck models. Rather than fragment the system according to the insurance status of the population, as many other low- and middle-income countries have done, the Kyrgyz reforms were guided by the objective of having a single system for the entire population. Key features include the role and gradual development of the compulsory health insurance fund as the single purchaser of health-care services for the entire population using output-based payment methods, the complete restructuring of pooling arrangements from the former decentralized budgetary structure to a single national pool, and the establishment of an explicit benefit package. Central to the process was the transformation of the role of general budget revenues - the main source of public funding for health - from directly subsidizing the supply of services to subsidizing the purchase of services on behalf of the entire population by redirecting them into the health insurance fund. Through their approach to health financing policy, and pooling in particular, the Kyrgyz health reformers demonstrated that different sources of funds can be used in an explicitly complementary manner to enable the creation of a unified, universal system.
Development of High Throughput Process for Constructing 454 Titanium and Illumina Libraries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deshpande, Shweta; Hack, Christopher; Tang, Eric
2010-05-28
We have developed two processes with the Biomek FX robot to construct 454 Titanium and Illumina libraries in order to meet increasing library demands. All modifications in the library construction steps were made to adapt the entire processes to the 96-well plate format. The key modifications include shearing of DNA with the Covaris E210 and enzymatic reaction cleanup and fragment size selection with SPRI beads and magnetic plate holders. The construction of 96 Titanium libraries takes about 8 hours from sheared DNA to ssDNA recovery; processing 96 Illumina libraries takes less time than the Titanium library process. Although both processes still require manual transfer of plates from the robot to other workstations such as thermocyclers, these robotic processes represent about a 12- to 24-fold increase in library capacity compared to the manual processes. To enable the sequencing of many libraries in parallel, we have also developed sets of molecular barcodes for both library types. The requirements for the 454 library barcodes include 10 bases, 40-60% GC content, no consecutive identical bases, and no fewer than 3 base differences between barcodes. We used 96 of the resulting 270 barcodes to construct and pool libraries to test the ability to assign reads accurately to the right samples. When allowing one base error in the 10-base barcodes, we could assign 99.6% of the total reads, and 100% of them were uniquely assigned. As for the Illumina barcodes, the requirements include 4 bases, balanced GC content, and at least 2 base differences between barcodes. We have begun to assess the ability to assign reads after pooling different numbers of libraries. We will discuss the progress and the challenges of these scale-up processes.
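The stated 454 barcode constraints (10 bases, 40-60% GC, no two consecutive identical bases, pairwise distance of at least 3) are straightforward to check programmatically. The sketch below is illustrative only; the function names and example barcodes are assumptions of mine, not material from the record:

```python
from itertools import combinations

def is_valid_454_barcode(bc):
    """Check one barcode against the stated 454 design rules."""
    if len(bc) != 10:
        return False
    gc = sum(b in "GC" for b in bc) / len(bc)
    if not 0.40 <= gc <= 0.60:
        return False
    # no two consecutive identical bases
    if any(a == b for a, b in zip(bc, bc[1:])):
        return False
    return True

def hamming(a, b):
    """Number of positions at which two equal-length barcodes differ."""
    return sum(x != y for x, y in zip(a, b))

def is_valid_set(barcodes, min_dist=3):
    """Every pair in the set must differ in at least min_dist positions."""
    return all(hamming(a, b) >= min_dist for a, b in combinations(barcodes, 2))
```

A minimum pairwise distance of 3 is what allows the single-base-error decoding described in the abstract: one error still leaves a read closer to its true barcode than to any other.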
Resource allocation processes at multilateral organizations working in global health.
Chi, Y-Ling; Bump, Jesse B
2018-02-01
International institutions provide well over US$10 billion in development assistance for health (DAH) annually, and between 1990 and 2014 DAH disbursements totaled $458 billion. But how do they decide who gets what, and for what purpose? In this article, we explore how allocation decisions were made by the nine convening agencies of the Equitable Access Initiative. We provide clear, plain-language descriptions of the complete process from resource mobilization to allocation for the nine multilateral agencies with prominent agendas in global health. Then, through a comparative analysis, we illuminate the choices and strategies employed in the nine international institutions. We find that resource allocation in all reviewed institutions follows a similar pattern, which we categorized in a framework of five steps: strategy definition, resource mobilization, eligibility of countries, support type and funds allocation. All the reviewed institutions generate resource allocation decisions through well-structured and fairly complex processes. Variations in those processes seem to reflect differences in institutional principles and goals. However, these processes have serious shortcomings. Technical problems include inadequate flexibility to account for or meet country needs. Although aid effectiveness and value for money are commonly referenced, we find that neither performance nor impact is a major criterion for allocating resources. We found very little formal consideration of the incentives generated by allocation choices. Political issues include non-transparent influence on allocation processes by donors and bureaucrats, and the common practice of earmarking funds to bypass the normal allocation process entirely. Ethical deficiencies include low accountability and transparency at international institutions, and limited participation by affected citizens or their representatives.
We find that recipient countries have low influence on allocation processes themselves, although within these processes they have some influence in relatively narrow areas. © The Author(s) 2018. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine.
Algorithmic commonalities in the parallel environment
NASA Technical Reports Server (NTRS)
Mcanulty, Michael A.; Wainer, Michael S.
1987-01-01
The ultimate aim of this project was to analyze procedures from substantially different application areas to discover what is either common or peculiar in the process of conversion to the Massively Parallel Processor (MPP). Three areas were identified: molecular dynamic simulation, production systems (rule systems), and various graphics and vision algorithms. To date, only selected graphics procedures have been investigated. They are the most readily available, and produce the most visible results. These include simple polygon patch rendering, raycasting against a constructive solid geometric model, and stochastic or fractal based textured surface algorithms. Only the simplest of conversion strategies, mapping a major loop to the array, has been investigated so far. It is not entirely satisfactory.
3D ultrasound-based patient positioning for radiotherapy
NASA Astrophysics Data System (ADS)
Wang, Michael H.; Rohling, Robert N.; Archip, Neculai; Clark, Brenda G.
2006-03-01
A new 3D ultrasound-based patient positioning system for target localisation during radiotherapy is described. Our system incorporates the use of tracked 3D ultrasound scans of the target anatomy acquired using a dedicated 3D ultrasound probe during both the simulation and treatment sessions, fully automatic 3D ultrasound-to-ultrasound registration, and OPTOTRAK IRLEDs for registering simulation CT to ultrasound data. The accuracy of the entire radiotherapy treatment process resulting from the use of our system, from simulation to the delivery of radiation, has been validated on a phantom. The overall positioning error is less than 5 mm, which includes errors from estimation of the irradiated region location in the phantom.
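An overall bound like the reported 5 mm typically reflects several independent per-step errors combining in quadrature. A minimal sketch under that independence assumption; the individual step values below are illustrative placeholders, not figures from the paper:

```python
import math

def combined_error(component_errors_mm):
    """Root-sum-square of independent, zero-mean error components (mm)."""
    return math.sqrt(sum(e * e for e in component_errors_mm))

# Illustrative (not measured) per-step errors: probe tracking,
# ultrasound-to-ultrasound registration, CT-to-ultrasound
# registration, and patient setup.
steps = [2.0, 2.5, 2.0, 2.5]
total = combined_error(steps)  # about 4.5 mm, under the reported 5 mm bound
```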
Web-Based Collaborative Publications System: R&Tserve
NASA Technical Reports Server (NTRS)
Abrams, Steve
1997-01-01
R&Tserve is a publications system based on 'commercial, off-the-shelf' (COTS) software that provides a persistent, collaborative workspace for authors and editors to support the entire publication development process from initial submission, through iterative editing in a hierarchical approval structure, and on to 'publication' on the WWW. It requires no specific knowledge of the WWW (beyond basic use) or HyperText Markup Language (HTML). Graphics and URLs are automatically supported. The system includes a transaction archive, a comments utility, help functionality, automated graphics conversion, automated table generation, and an email-based notification system. It may be configured and administered via the WWW and can support publications ranging from single page documents to multiple-volume 'tomes'.
Chen, Chang; Zhang, Jinhu; Dong, Guofeng; Shao, Hezhu; Ning, Bo-Yuan; Zhao, Li; Ning, Xi-Jing; Zhuang, Jun
2014-01-01
In fabrication of nano- and quantum devices, it is sometimes critical to position individual dopants at certain sites precisely to obtain the specific or enhanced functionalities. With first-principles simulations, we propose a method for substitutional doping of individual atom at a certain position on a stepped metal surface by single-atom manipulation. A selected atom at the step of Al (111) surface could be extracted vertically with an Al trimer-apex tip, and then the dopant atom will be positioned to this site. The details of the entire process including potential energy curves are given, which suggests the reliability of the proposed single-atom doping method.
Mathematical Modeling Of Life-Support Systems
NASA Technical Reports Server (NTRS)
Seshan, Panchalam K.; Ganapathi, Balasubramanian; Jan, Darrell L.; Ferrall, Joseph F.; Rohatgi, Naresh K.
1994-01-01
Generic hierarchical model of life-support system developed to facilitate comparisons of options in design of system. Model represents combinations of interdependent subsystems supporting microbes, plants, fish, and land animals (including humans). Generic model enables rapid configuration of variety of specific life support component models for tradeoff studies culminating in single system design. Enables rapid evaluation of effects of substituting alternate technologies and even entire groups of technologies and subsystems. Used to synthesize and analyze life-support systems ranging from relatively simple, nonregenerative units like aquariums to complex closed-loop systems aboard submarines or spacecraft. Model, called Generic Modular Flow Schematic (GMFS), coded in such chemical-process-simulation languages as Aspen Plus and expressed as three-dimensional spreadsheet.
A parallel finite-difference method for computational aerodynamics
NASA Technical Reports Server (NTRS)
Swisshelm, Julie M.
1989-01-01
A finite-difference scheme for solving complex three-dimensional aerodynamic flow on parallel-processing supercomputers is presented. The method consists of a basic flow solver with multigrid convergence acceleration, embedded grid refinements, and a zonal equation scheme. Multitasking and vectorization have been incorporated into the algorithm. Results obtained include multiprocessed flow simulations from the Cray X-MP and Cray-2. Speedups as high as 3.3 for the two-dimensional case and 3.5 for segments of the three-dimensional case have been achieved on the Cray-2. The entire solver attained a factor of 2.7 improvement over its unitasked version on the Cray-2. The performance of the parallel algorithm on each machine is analyzed.
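The whole-solver speedup of 2.7 can be read through Amdahl's law. A hedged sketch, assuming the Cray-2's four processors (the processor count is my assumption; the abstract does not state it):

```python
def amdahl_speedup(f, n):
    """Amdahl's law: speedup for parallel fraction f on n processors."""
    return 1.0 / ((1.0 - f) + f / n)

def parallel_fraction(speedup, n):
    """Invert Amdahl's law to recover the parallel fraction."""
    return (1.0 - 1.0 / speedup) / (1.0 - 1.0 / n)

# Observed whole-solver speedup of 2.7 on an assumed 4 processors
f = parallel_fraction(2.7, 4)  # roughly 0.84 of the work parallelized
```

Under this reading, roughly 84% of the solver's work ran in parallel; the remaining serial fraction caps the achievable speedup well below 4.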
Specifying and protecting germ cell fate
Strome, Susan; Updike, Dustin
2015-01-01
Germ cells are the special cells in the body that undergo meiosis to generate gametes and subsequently entire new organisms after fertilization, a process that continues generation after generation. Recent studies have expanded our understanding of the factors and mechanisms that specify germ cell fate, including the partitioning of maternally supplied ‘germ plasm’, inheritance of epigenetic memory and expression of transcription factors crucial for primordial germ cell (PGC) development. Even after PGCs are specified, germline fate is labile and thus requires protective mechanisms, such as global transcriptional repression, chromatin state alteration and translation of only germline-appropriate transcripts. Findings from diverse species continue to provide insights into the shared and divergent needs of these special reproductive cells. PMID:26122616
A 3D Model to Compute Lightning and HIRF Coupling Effects on Avionic Equipment of an Aircraft
NASA Astrophysics Data System (ADS)
Perrin, E.; Tristant, F.; Guiffaut, C.; Terrade, F.; Reineix, A.
2012-05-01
This paper describes the 3D FDTD model of an aircraft developed to compute the lightning and HIRF (High Intensity Radiated Fields) coupling effects on avionic equipment and all the associated wire harnesses. This virtual prototype aims at assisting the aircraft manufacturer during the lightning and HIRF certification processes. The model presented here covers a frequency range from the lightning spectrum to the low-frequency HIRF domain, i.e. 0 to 100 MHz. Moreover, the entire aircraft, including the frame, the skin, the wire harnesses and the equipment, is taken into account in a single model. Results obtained are compared to measurements on a real aircraft.
Biospecimen Core Resource - TCGA
The Cancer Genome Atlas (TCGA) Biospecimen Core Resource centralized laboratory reviews and processes blood and tissue samples and their associated data using optimized standard operating procedures for the entire TCGA Research Network.
Coupling of snow and permafrost processes using the Basic Modeling Interface (BMI)
NASA Astrophysics Data System (ADS)
Wang, K.; Overeem, I.; Jafarov, E. E.; Piper, M.; Stewart, S.; Clow, G. D.; Schaefer, K. M.
2017-12-01
We developed a permafrost modeling tool by implementing the Kudryavtsev empirical permafrost active layer depth model (the so-called "Ku" component). The model is specifically set up with a Basic Model Interface (BMI), which enhances its potential coupling to other earth surface process model components. This model is accessible through the Web Modeling Tool in the Community Surface Dynamics Modeling System (CSDMS). The Kudryavtsev model has been applied to the entire state of Alaska to model permafrost distribution at high spatial resolution, and model predictions have been verified against Circumpolar Active Layer Monitoring (CALM) in-situ observations. The Ku component uses monthly meteorological forcing, including air temperature, snow depth, and snow density, and predicts active layer thickness (ALT) and the temperature at the top of permafrost (TTOP), which are important factors in snow-hydrological processes. BMI provides an easy approach to coupling models with each other. Here, we present a case of coupling the Ku component to snow process components, including the Snow-Degree-Day (SDD) method and the Snow-Energy-Balance (SEB) method, which are existing components in the hydrological model TOPOFLOW. The workflow is: (1) get variables from the meteorology component, pass them to the snow process component, and advance the snow process component; (2) get variables from the meteorology and snow components, provide these to the Ku component and advance it; (3) get variables from the snow process component, pass them back to the meteorology component, and advance the meteorology component. The next phase is to couple the permafrost component with the fully BMI-compliant TOPOFLOW hydrological model, which could provide a useful tool to investigate permafrost hydrological effects.
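The three-step coupling workflow can be sketched with BMI-style calls. The classes below are minimal stand-ins, not the actual CSDMS components; only the get_value/set_value/update pattern reflects the BMI convention, and all variable names and values are illustrative assumptions:

```python
class ToyBmi:
    """Minimal stand-in for a BMI component: named values, stepwise advance."""
    def __init__(self, **values):
        self.values = dict(values)
        self.time = 0.0
    def get_value(self, name):
        return self.values[name]
    def set_value(self, name, value):
        self.values[name] = value
    def update(self):
        self.time += 1.0  # advance one (monthly) time step

# Hypothetical stand-ins for the meteorology, snow, and Ku components.
met = ToyBmi(air_temperature=-5.0, snow_depth=0.0)
snow = ToyBmi(snow_depth=0.0, snow_density=200.0)
ku = ToyBmi(active_layer_thickness=0.0)

for _ in range(12):  # one model year of monthly coupling
    # (1) meteorology -> snow component, then advance snow
    snow.set_value("air_temperature", met.get_value("air_temperature"))
    snow.update()
    # (2) meteorology + snow -> Ku permafrost component, then advance Ku
    ku.set_value("air_temperature", met.get_value("air_temperature"))
    ku.set_value("snow_depth", snow.get_value("snow_depth"))
    ku.update()
    # (3) snow feedback -> meteorology component, then advance meteorology
    met.set_value("snow_depth", snow.get_value("snow_depth"))
    met.update()
```

The point of the BMI pattern is exactly this: each component only exposes get/set/advance, so the coupling loop owns the data flow and the components stay interchangeable.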
3D-printing porosity: A new approach to creating elevated porosity materials and structures.
Jakus, A E; Geisendorfer, N R; Lewis, P L; Shah, R N
2018-05-01
We introduce a new process that enables 3D-printing of high porosity materials and structures by combining the newly introduced 3D-Painting process with traditional salt-leaching. The synthesis and resulting properties of three 3D-printable inks comprised of varying volume ratios (25:75, 50:50, 70:30) of CuSO4 salt and polylactide-co-glycolide (PLGA), as well as their as-printed and salt-leached counterparts, are discussed. The resulting materials are composed entirely of PLGA (F-PLGA), but exhibit porosities proportional to the original CuSO4 content. The three distinct F-PLGA materials exhibit average porosities of 66.6-94.4%, elastic moduli of 112.6-2.7 MPa, and absorbency of 195.7-742.2%. Studies with adult human mesenchymal stem cells (hMSCs) demonstrated that elevated porosity substantially promotes cell adhesion, viability, and proliferation. F-PLGA can also act as carriers for weak, naturally or synthetically-derived hydrogels. Finally, we show that this process can be extended to other materials including graphene, metals, and ceramics. Porosity plays an essential role in the performance and function of biomaterials, tissue engineering, and clinical medicine. For the same material chemistry, the level of porosity can dictate whether a material is cell, tissue, or organ friendly, with low porosity materials being far less favorable than high porosity materials. Despite its importance, it has been difficult to create three-dimensionally printed structures that are comprised of materials with extremely high levels of internal porosity yet are surgically friendly (able to handle and utilize during surgical operations). In this work, we extend a new materials-centric approach to 3D-printing, 3D-Painting, to 3D-printing structures made almost entirely out of water-soluble salt. The structures are then washed in a specific way that not only extracts the salt but causes the structures to increase in size.
With the salt removed, the resulting medical polymer structures are almost entirely porous and contain very little solid material, but they maintain their 3D-printed form, are highly compatible with adult human stem cells, are mechanically robust enough to use in surgical manipulations, and can be filled with and act as carriers for biologically active liquids and gels. We can also extend this process to three-dimensionally printing other porous materials, such as graphene, metals, and even ceramics. Copyright © 2018 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
Practical robotic self-awareness and self-knowledge
NASA Astrophysics Data System (ADS)
Gage, Douglas W.
2011-05-01
The functional software components of an autonomous robotic system express behavior via commands to its actuators, based on processed inputs from its sensors; we propose an additional set of "cognitive" capabilities for robotic systems of all types, based on the comprehensive logging of all available data, including sensor inputs, behavioral states, and outputs sent to actuators. A robot should maintain a "sense" of its own (piecewise) continuous existence through time and space; it should in some sense "get a life," providing a level of self-awareness and self-knowledge. Self-awareness includes the ability to survive and work through unexpected power glitches while executing a task or mission. Self-knowledge includes an extensive world model, including a model of self and the purpose context in which it is operating (deontics). Our system must support proactive self-test, monitoring, and calibration, and maintain a "personal" health/repair history, supporting system test and evaluation by continuously measuring performance throughout the entire product lifecycle. It will include episodic memory and a system "lifelog," and will also participate in multiple modes of Human-Robot Interaction (HRI).
Access NASA Satellite Global Precipitation Data Visualization on YouTube
NASA Astrophysics Data System (ADS)
Liu, Z.; Su, J.; Acker, J. G.; Huffman, G. J.; Vollmer, B.; Wei, J.; Meyer, D. J.
2017-12-01
Since the satellite era began, NASA has collected a large volume of Earth science observations for research and applications around the world. Satellite data at 12 NASA data centers can also be used for STEM activities such as studying disaster events, climate change, etc. However, accessing satellite data can be a daunting task for non-professional users such as teachers and students because of unfamiliar terminology, disciplines, data formats, data structures, computing resources, processing software, programming languages, etc. Over the years, many efforts have been made to improve satellite data access, but barriers still exist for non-professionals. In this presentation, we will present our latest activity that uses the popular online video sharing web site, YouTube, to access visualization of global precipitation datasets at the NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC). With YouTube, users can access and visualize a large volume of satellite data without needing to learn new software or download data. The dataset in this activity is the 3-hourly TRMM (Tropical Rainfall Measuring Mission) Multi-satellite Precipitation Analysis (TMPA). The video consists of over 50,000 data files collected from 1998 onwards, covering a zone between 50°N-S. The YouTube video will last 36 minutes for the entire dataset record (over 19 years). Since the time stamp is on each frame of the video, users can begin at any time by dragging the time progress bar. This precipitation animation will allow viewing precipitation events and processes (e.g., hurricanes, fronts, atmospheric rivers, etc.) on a global scale. The next plan is to develop a similar animation for the GPM (Global Precipitation Measurement) Integrated Multi-satellitE Retrievals for GPM (IMERG). 
The IMERG provides precipitation on a near-global (60°N-S) coverage at half-hourly time interval, showing more details on precipitation processes and development, compared to the 3-hourly TMPA product. The entire video will contain more than 330,000 files and will last 3.6 hours. Future plans include development of fly-over videos for orbital data for an entire satellite mission or project. All videos will be uploaded and available at the GES DISC site on YouTube (https://www.youtube.com/user/NASAGESDISC).
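The quoted running times are roughly consistent with one data file per video frame at a standard frame rate. A small check, assuming 25 fps (the frame rate is my assumption; the abstract does not state it):

```python
def video_minutes(n_frames, fps=25.0):
    """Running time in minutes, assuming one data file per video frame."""
    return n_frames / fps / 60.0

# TMPA: >50,000 3-hourly files -> about 33 min (quoted: 36 min)
tmpa_min = video_minutes(50_000)
# IMERG: >330,000 half-hourly files -> about 3.7 h (quoted: 3.6 h)
imerg_hr = video_minutes(330_000) / 60.0
```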
PROBA-V, the small satellite for global vegetation monitoring
NASA Astrophysics Data System (ADS)
Deronde, Bart; Benhadj, Iskander; Clarijs, Dennis; Dierckx, Wouter; Dries, Jan; Sterckx, Sindy; van Roey, Tom; Wolters, Erwin
2015-04-01
PROBA-V, the small satellite for global vegetation monitoring Bart Deronde, Iskander Benhadj, Dennis Clarijs, Wouter Dierckx, Jan Dries, Sindy Sterckx, Tom Van Roey, Erwin Wolters (VITO NV) Exactly one year ago, in December 2013, VITO (Flemish Institute for Technological Research) started up the real-time operations of PROBA-V. This miniaturised ESA (European Space Agency) satellite was launched by ESA's Vega rocket from Kourou, French Guiana on May 7th, 2013. After six months of commissioning, the mission was taken into operations. Since mid-December 2013, PROBA-V products have been processed on an operational basis and distributed to a worldwide user community. PROBA-V is tasked with a full-scale mission: to map land cover and vegetation growth across the entire planet every two days. It is flying a lighter but fully functional redesign of the 'VEGETATION' imaging instruments previously flown on France's full-sized SPOT-4 and SPOT-5 satellites, which have been observing Earth since 1998. PROBA-V, entirely built by a Belgian consortium, continues this valuable and uninterrupted time series with daily products at 300 m and 1 km resolution. 100 m products will even become available in early 2015, delivering global coverage every 5 days. The blue, red, near-infrared and mid-infrared wavebands allow PROBA-V to distinguish between different types of land cover/use and plant species, including crops. Vital uses of these data include day-by-day tracking of vegetation development, alerting authorities to crop failures, monitoring inland water resources and tracing the steady spread of deserts and deforestation. As such, the data are also highly valuable for studying climate change and the global carbon cycle. In this presentation we will discuss the in-flight results, one year after launch, from the User Segment (i.e. the processing facility) point of view. The focus will be on geometric and radiometric accuracy and stability. 
Furthermore, we will elaborate on the lessons learnt from the operational day-to-day activities. Data acquisition, input data quality, instrument programming, image processing and data distribution are some of the topics that will be highlighted. Finally, the synergy with other European missions like the Copernicus Sentinel 3 satellite will be handled.
Urgesi, Cosimo; Bricolo, Emanuela; Aglioti, Salvatore M
2005-08-01
Cerebral dominance and hemispheric metacontrol were investigated by testing the ability of healthy participants to match chimeric, entire, or half faces presented tachistoscopically. The two hemi-faces compounding chimeric or entire stimuli were presented simultaneously or asynchronously at different exposure times. Participants did not consciously detect chimeric faces for simultaneous presentations lasting up to 40 ms. Interestingly, a 20 ms separation between each half-chimera was sufficient to induce detection of conflicts at a conscious level. Although the presence of chimeric faces was not consciously perceived, performance on chimeric faces was poorer than on entire- and half-face stimuli, thus indicating an implicit processing of perceptual conflicts. Moreover, the precedence of hemispheric stimulation overruled the right hemisphere dominance for face processing, insofar as the hemisphere stimulated last appeared to influence the response. This dynamic reversal of cerebral dominance, however, was not caused by a shift in hemispheric specialization, since the level of performance always reflected the right hemisphere specialization for face recognition. Thus, the dissociation between hemispheric dominance and specialization found in the present study hints at the existence of hemispheric metacontrol in healthy individuals.
NASA Astrophysics Data System (ADS)
Hu, Jiafei; Pan, Mengchun; Xin, Jianguang; Chen, Dixiang
2008-12-01
The magnetostrictive transducer is the most important part of the optic-fiber magnetic field sensor, and the optic-fiber/giant magnetostrictive (GMS) film coupled structure is a novel coupling form of the magnetostrictive transducer. Previous analyses of the coupled structure have assumed that the entire structure is sputtered with GMS material and has no tail-fibers. In practical applications, however, the coupled structure has tail-fibers without films at the two ends. When the entire coupled structure is immersed in the detected magnetic field, the field strains the GMS film, which in turn strains the optic fiber. This strain transmission process differs from that in a coupled structure entirely covered with GMS films and without tail-fibers. The strain transmission relationship can be calculated theoretically for the coupled structure without tail-fibers, but it is complicated to calculate theoretically for the structure with tail-fibers. Through extensive calculations and analyses with ANSYS software, we establish the relationships between the two strain transmission processes in the respective structures and the stress distribution in the tail-fibers. These results are helpful for the practical application of the optic-fiber/GMS film coupled structure.
Augmenting SCA project management and automation framework
NASA Astrophysics Data System (ADS)
Iyapparaja, M.; Sharma, Bhanupriya
2017-11-01
In daily life we need to keep records of things in order to manage them more efficiently and properly. Our company manufactures semiconductor chips and sells them to buyers. Sometimes it manufactures the entire product, sometimes only part of it, and sometimes it sells the intermediate product obtained during manufacturing; so for better management of the entire process there is a need to keep track of all the entities involved. Materials and Methods: To address this problem, the need arose to develop a framework for project maintenance and for automation testing. The project management framework provides an architecture which supports managing the project by maintaining records of all requirements, the test cases created for testing each unit of the software, and defects raised in past years. Through this, the quality of the project can be maintained. Results: The automation framework provides an architecture which supports the development and implementation of automation test scripts for the software testing process. Conclusion: For implementing the project management framework, the HP product Application Lifecycle Management is used, which provides a central repository to maintain the project.
Atomically Traceable Nanostructure Fabrication
Ballard, Josh B.; Dick, Don D.; McDonnell, Stephen J.; Bischof, Maia; Fu, Joseph; Owen, James H. G.; Owen, William R.; Alexander, Justin D.; Jaeger, David L.; Namboodiri, Pradeep; Fuchs, Ehud; Chabal, Yves J.; Wallace, Robert M.; Reidy, Richard; Silver, Richard M.; Randall, John N.; Von Ehr, James
2015-01-01
Reducing the scale of etched nanostructures below the 10 nm range eventually will require an atomic scale understanding of the entire fabrication process being used in order to maintain exquisite control over both feature size and feature density. Here, we demonstrate a method for tracking atomically resolved and controlled structures from initial template definition through final nanostructure metrology, opening up a pathway for top-down atomic control over nanofabrication. Hydrogen depassivation lithography is the first step of the nanoscale fabrication process followed by selective atomic layer deposition of up to 2.8 nm of titania to make a nanoscale etch mask. Contrast with the background is shown, indicating different mechanisms for growth on the desired patterns and on the H passivated background. The patterns are then transferred into the bulk using reactive ion etching to form 20 nm tall nanostructures with linewidths down to ~6 nm. To illustrate the limitations of this process, arrays of holes and lines are fabricated. The various nanofabrication process steps are performed at disparate locations, so process integration is discussed. Related issues are discussed including using fiducial marks for finding nanostructures on a macroscopic sample and protecting the chemically reactive patterned Si(100)-H surface against degradation due to atmospheric exposure. PMID:26274555
NASA Astrophysics Data System (ADS)
Wade, Mark T.; Shainline, Jeffrey M.; Orcutt, Jason S.; Ram, Rajeev J.; Stojanovic, Vladimir; Popovic, Milos A.
2014-03-01
We present the spoked-ring microcavity, a nanophotonic building block enabling energy-efficient, active photonics in unmodified, advanced CMOS microelectronics processes. The cavity is realized in the IBM 45nm SOI CMOS process - the same process used to make many commercially available microprocessors including the IBM Power7 and Sony Playstation 3 processors. In advanced SOI CMOS processes, no partial etch steps and no vertical junctions are available, which limits the types of optical cavities that can be used for active nanophotonics. To enable efficient active devices with no process modifications, we designed a novel spoked-ring microcavity which is fully compatible with the constraints of the process. As a modulator, the device leverages the sub-100nm lithography resolution of the process to create radially extending p-n junctions, providing high optical fill factor depletion-mode modulation and thereby eliminating the need for a vertical junction. The device is made entirely in the transistor active layer, low-loss crystalline silicon, which eliminates the need for a partial etch commonly used to create ridge cavities. In this work, we present the full optical and electrical design of the cavity, including rigorous mode solver and FDTD simulations to design the Q-limiting electrical contacts and the coupling/excitation. We address the layout of active photonics within the mask set of a standard advanced CMOS process and show that high-performance photonic devices can be seamlessly monolithically integrated alongside electronics on the same chip. The present designs enable monolithically integrated optoelectronic transceivers on a single advanced CMOS chip, without requiring any process changes, enabling the penetration of photonics into the microprocessor.
Array automated assembly task low cost silicon solar array project, phase 2
NASA Technical Reports Server (NTRS)
Olson, C.
1980-01-01
Analyses of solar cell and module process steps for throughput rate, cost effectiveness, and reproducibility are reported. In addition to the concentration on cell and module processing sequences, an investigation was made into the capability of using microwave energy in the diffusion, sintering, and thick film firing steps of cell processing. Although the entire process sequence was integrated, the steps are treated individually with test and experimental data, conclusions, and recommendations.
Validation Test Report for the Automated Optical Processing System (AOPS) Version 4.8
2013-06-28
be familiar with UNIX; BASH shell programming; and remote sensing, particularly regarding computer processing of satellite data. The system memory ... and storage requirements are difficult to gauge. The amount of memory needed is dependent upon the amount and type of satellite data you wish to ... process; the larger the area, the larger the memory requirement. For example, the entire Atlantic Ocean will require more processing power than the
Zivojnovic, Marija; Delbos, Frédéric; Girelli Zubani, Giulia; Julé, Amélie; Alcais, Alexandre; Weill, Jean-Claude; Reynaud, Claude-Agnès; Storck, Sébastien
2014-06-01
A/T mutations at immunoglobulin loci are introduced by DNA polymerase η (Polη) during an Msh2/6-dependent repair process which results in A's being mutated 2-fold more often than T's. This patch synthesis is initiated by a DNA incision event whose origin is still obscure. We report here the analysis of A/T oligonucleotide mutation substrates inserted at the heavy chain locus, including or not including internal C's or G's. Surprisingly, the template composed of only A's and T's was highly mutated over its entire 90-bp length, with a 2-fold decrease in mutation from the 5' to the 3' end and a constant A/T ratio of 4. These results imply that Polη synthesis was initiated from a break in the 5'-flanking region of the substrate and proceeded over its entire length. The A/T bias was strikingly altered in an Ung(-/-) background, which provides the first experimental evidence supporting a concerted action of Ung and Msh2/6 pathways to generate mutations at A/T bases. New analysis of Pms2(-/-) animals provided a complementary picture, revealing an A/T mutation ratio of 4. We therefore propose that Ung and Pms2 may exert a mutual backup function for the DNA incision that promotes synthesis by Polη, each with a distinct strand bias. Copyright © 2014, American Society for Microbiology. All Rights Reserved.
Integrated Knowledge Translation: illustrated with outcome research in mental health.
Preyde, Michele; Carter, Jeff; Penney, Randy; Lazure, Kelly; Vanderkooy, John; Chevalier, Pat
2015-01-01
Through this article the authors present a case summary of the early phases of research conducted with an Integrated Knowledge Translation (iKT) approach utilizing four factors: research question, research approach, feasibility, and outcome. iKT refers to an approach for conducting research in which community partners, referred to as knowledge users, are engaged in the entire research process. In this collaborative approach, knowledge users and researchers jointly devise the entire research agenda beginning with the development of the research question(s), determination of a feasible research design and feasible methods, interpretation of the results, dissemination of the findings, and the translation of knowledge into practice or policy decisions. Engaging clinical or community partners in the research enterprise can enhance the utility of the research results and facilitate their uptake. This collaboration can be a complex arrangement and flexibility may be required to accommodate the various configurations that the collaboration can take. For example, the research question can be jointly determined and refined; however, one person must take the responsibility for orchestrating the project, including preparing the proposal and application to the Research Ethics Board. This collaborative effort also requires the simultaneous navigation of barriers and facilitators to the research enterprise. Navigating these elements becomes part of the conduct of research with the potential for rewarding results, including an enriched work experience for clinical partners and investigators. One practice implication is that iKT may be considered of great utility to service providers due to its field-friendly nature.
Image processing system performance prediction and product quality evaluation
NASA Technical Reports Server (NTRS)
Stein, E. K.; Hammill, H. B. (Principal Investigator)
1976-01-01
The author has identified the following significant results. A new technique for image processing system performance prediction and product quality evaluation was developed. It was entirely objective, quantitative, and general, and should prove useful in system design and quality control. The technique and its application to determination of quality control procedures for the Earth Resources Technology Satellite NASA Data Processing Facility are described.
Material quality development during the automated tow placement process
NASA Astrophysics Data System (ADS)
Tierney, John Joseph
Automated tow placement (ATP) of thermoplastic composites builds on the existing industrial base for equipment, robotics and kinematic placement of material with the aim of further cost reduction by eliminating the autoclave entirely. During ATP processing, thermoplastic composite tows are deposited on a preconsolidated substrate at rates ranging from 10-100 mm/s and consolidated using the localized application of heat and pressure by a tow placement head mounted on a robot. The process is highly non-isothermal, subjecting the material to multiple heating and cooling rates approaching 1000 °C/s. The requirement for the ATP process is to achieve the same quality in seconds (low void content, full translation of mechanical properties and degree of bonding and minimal warpage) as the autoclave process achieves in hours. The scientific challenge was to first understand and then model the relationships between processing, material response, microstructure and quality. The important phenomena affecting quality investigated in this study include a steady state heat transfer simulation, consolidation and deconsolidation (void dynamics), intimate contact and polymer interdiffusion (degree of bonding/mechanical properties) and residual stress and warpage (crystallization and viscoelastic response). A fundamental understanding of the role of materials related to these mechanisms and their relationship to final quality is developed and applied towards a method of process control and optimization.
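The non-isothermal heating and cooling described above can be illustrated with a minimal explicit finite-difference conduction sketch. The grid spacing, time step, diffusivity, and temperatures below are illustrative placeholders, not measured thermoplastic-composite properties.

```python
import numpy as np

# Minimal explicit 1-D conduction sketch of a localized heating pass
# (all numbers are illustrative, not real tow-placement parameters).
alpha = 4e-7         # thermal diffusivity, m^2/s (hypothetical)
dx, dt = 1e-4, 5e-3  # grid spacing (m) and time step (s)
r = alpha * dt / dx**2
assert r < 0.5       # explicit-scheme stability limit

T = np.full(100, 25.0)   # substrate initially at 25 C
T[48:52] = 400.0         # localized nip-point heating zone
for _ in range(200):
    # forward-Euler update of interior nodes; ends held at 25 C
    T[1:-1] = T[1:-1] + r * (T[2:] - 2 * T[1:-1] + T[:-2])
```

After the loop the hot zone has partially diffused into the substrate, giving a crude picture of the rapid, spatially localized cooling the tow experiences.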
NASA Technical Reports Server (NTRS)
Ganguly, Sangram; Kalia, Subodh; Li, Shuang; Michaelis, Andrew; Nemani, Ramakrishna R.; Saatchi, Sassan A
2017-01-01
Uncertainties in input land cover estimates contribute to a significant bias in modeled above ground biomass (AGB) and carbon estimates from satellite-derived data. The resolution of most currently used passive remote sensing products is not sufficient to capture tree canopy cover of less than ca. 10-20 percent, limiting their utility to estimate canopy cover and AGB for trees outside of forest land. In our study, we created a first of its kind Continental United States (CONUS) tree cover map at a spatial resolution of 1-m for the 2010-2012 epoch using the USDA NAIP imagery to address the present uncertainties in AGB estimates. The process involves different tasks including data acquisition/ingestion to pre-processing and running a state-of-art encoder-decoder based deep convolutional neural network (CNN) algorithm for automatically generating a tree/non-tree map for almost a quarter million scenes. The entire processing chain including generation of the largest open source existing aerial/satellite image training database was performed at the NEX supercomputing and storage facility. We believe the resulting forest cover product will substantially contribute to filling the gaps in ongoing carbon and ecological monitoring research and help quantify the errors and uncertainties in derived products.
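The per-scene inference step described above can be sketched as a tile-classify-stitch loop. The patch size, scene shape, and the simple intensity threshold standing in for the actual encoder-decoder CNN are all hypothetical.

```python
import numpy as np

def tile(scene, size):
    """Split a 2-D scene into non-overlapping size x size tiles
    (edges are cropped here for brevity; production code would pad)."""
    h, w = scene.shape
    return [scene[r:r + size, c:c + size]
            for r in range(0, h - size + 1, size)
            for c in range(0, w - size + 1, size)]

def stitch(tiles, h, w, size):
    """Reassemble per-tile masks into a full-scene mask."""
    out = np.zeros((h // size * size, w // size * size), dtype=tiles[0].dtype)
    i = 0
    for r in range(0, out.shape[0], size):
        for c in range(0, out.shape[1], size):
            out[r:r + size, c:c + size] = tiles[i]
            i += 1
    return out

def classify(patch):
    """Stand-in for the encoder-decoder CNN: threshold on intensity."""
    return (patch > 0.5).astype(np.uint8)

scene = np.random.rand(256, 256)              # hypothetical NAIP band
masks = [classify(t) for t in tile(scene, 64)]
tree_map = stitch(masks, 256, 256, 64)        # binary tree/non-tree map
```

At scale, the same loop is simply distributed across scenes on the supercomputing facility, one scene per worker.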
Method of manufacturing iron aluminide by thermomechanical processing of elemental powders
Deevi, Seetharama C.; Lilly, Jr., A. Clifton; Sikka, Vinod K.; Hajaligol, Mohammed R.
2000-01-01
A powder metallurgical process of preparing iron aluminide useful as electrical resistance heating elements having improved room temperature ductility, electrical resistivity, cyclic fatigue resistance, high temperature oxidation resistance, low and high temperature strength, and/or resistance to high temperature sagging. The iron aluminide has an entirely ferritic microstructure which is free of austenite and can include, in weight %, 20 to 32% Al, and optional additions such as ≤1% Cr, ≥0.05% Zr or ZrO₂ stringers extending perpendicular to an exposed surface of the heating element, ≤2% Ti, ≤2% Mo, ≤1% Zr, ≤1% C, ≤0.1% B, ≤30% oxide dispersoid and/or electrically insulating or electrically conductive covalent ceramic particles, ≤1% rare earth metal, ≤1% oxygen, and/or ≤3% Cu. The process includes forming a mixture of aluminum powder and iron powder, shaping the mixture into an article such as by cold rolling the mixture into a sheet, and sintering the article at a temperature sufficient to react the iron and aluminum powders and form iron aluminide. The sintering can be followed by hot or cold rolling to reduce porosity created during the sintering step and optional annealing steps in a vacuum or inert atmosphere.
The science and practice of river restoration
NASA Astrophysics Data System (ADS)
Wohl, Ellen; Lane, Stuart N.; Wilcox, Andrew C.
2015-08-01
River restoration is one of the most prominent areas of applied water-resources science. From an initial focus on enhancing fish habitat or river appearance, primarily through structural modification of channel form, restoration has expanded to incorporate a wide variety of management activities designed to enhance river process and form. Restoration is conducted on headwater streams, large lowland rivers, and entire river networks in urban, agricultural, and less intensively human-altered environments. We critically examine how contemporary practitioners approach river restoration and challenges for implementing restoration, which include clearly identified objectives, holistic understanding of rivers as ecosystems, and the role of restoration as a social process. We also examine challenges for scientific understanding in river restoration. These include: how physical complexity supports biogeochemical function, stream metabolism, and stream ecosystem productivity; characterizing response curves of different river components; understanding sediment dynamics; and increasing appreciation of the importance of incorporating climate change considerations and resiliency into restoration planning. Finally, we examine changes in river restoration within the past decade, such as increasing use of stream mitigation banking; development of new tools and technologies; different types of process-based restoration; growing recognition of the importance of biological-physical feedbacks in rivers; increasing expectations of water quality improvements from restoration; and more effective communication between practitioners and river scientists.
Hilliard, Mark; Alley, William R; McManus, Ciara A; Yu, Ying Qing; Hallinan, Sinead; Gebler, John; Rudd, Pauline M
Glycosylation is an important attribute of biopharmaceutical products to monitor from development through production. However, glycosylation analysis has traditionally been a time-consuming process with long sample preparation protocols and manual interpretation of the data. To address the challenges associated with glycan analysis, we developed a streamlined analytical solution that covers the entire process from sample preparation to data analysis. In this communication, we describe the complete analytical solution that begins with a simplified and fast N-linked glycan sample preparation protocol that can be completed in less than 1 hr. The sample preparation includes labelling with RapiFluor-MS tag to improve both fluorescence (FLR) and mass spectral (MS) sensitivities. Following HILIC-UPLC/FLR/MS analyses, the data are processed and a library search based on glucose units has been included to expedite the task of structural assignment. We then applied this total analytical solution to characterize the glycosylation of the NIST Reference Material mAb 8761. For this glycoprotein, we confidently identified 35 N-linked glycans and all three major classes, high mannose, complex, and hybrid, were present. The majority of the glycans were neutral and fucosylated; glycans featuring N-glycolylneuraminic acid and those with two galactoses connected via an α1,3-linkage were also identified.
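The glucose-unit (GU) library search mentioned above rests on calibrating chromatographic retention times against a dextran ladder. A minimal sketch follows, with entirely hypothetical retention times and piecewise-linear interpolation standing in for the polynomial or spline fits real glycan software typically uses.

```python
import numpy as np

# Hypothetical dextran-ladder calibration: retention times (min) of
# glucose homopolymer peaks at GU = 1, 2, 3, ... (values illustrative).
ladder_rt = np.array([4.1, 6.0, 8.2, 10.6, 13.1, 15.5, 17.8, 19.9])
ladder_gu = np.arange(1, len(ladder_rt) + 1)

def to_glucose_units(rt):
    """Convert a glycan retention time to glucose units by interpolating
    against the ladder; the GU value is then matched to library entries."""
    return np.interp(rt, ladder_rt, ladder_gu)

gu = to_glucose_units(9.4)   # falls between GU 3 (8.2 min) and GU 4 (10.6 min)
```

A structural assignment step would then search the GU library for entries within a small tolerance of the interpolated value.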
Basic study of entire whole-body PET scanners based on the OpenPET geometry
NASA Astrophysics Data System (ADS)
Yoshida, Eiji; Yamaya, Taiga; Nishikido, Fumihiko; Inadama, Naoko; Murayama, Hideo
2010-09-01
A conventional PET scanner has a 15-25 cm axial field-of-view (FOV) and images a whole body using about six bed positions. An OpenPET geometry can extend the axial FOV with a limited number of detectors. An entire whole-body PET scanner must be able to process a large amount of data effectively. In this work, we study the feasibility of a fully 3D entire whole-body PET scanner using the GATE simulation. The OpenPET has 12 block detector rings with a ring diameter of 840 mm, and each block detector ring consists of 48 depth-of-interaction (DOI) detectors. The OpenPET has an axial length of 895.95 mm with five 58.95 mm open gaps. The OpenPET suffers higher single-event data loss at the grouping circuits than a conventional PET scanner, and its noise equivalent count rate (NECR) decreases accordingly; this loss is mitigated by separating the axially arranged detectors into two parts. Multiple coincidences are also found to be important for the entire whole-body PET scanner. The entire whole-body PET scanner with the OpenPET geometry promises to provide a large axial FOV with open space and to achieve sufficient performance, but single-event data loss at the grouping circuits and multiple coincidences limit the peak NECR.
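The NECR degradation discussed above can be made concrete with the standard noise-equivalent count rate figure of merit, NECR = T²/(T + S + kR). The count rates below are hypothetical, and the randoms weighting k depends on the randoms-correction scheme used.

```python
def necr(trues, scatter, randoms, k=1.0):
    """Noise equivalent count rate (counts/s).

    One common form of the NEMA-style figure of merit:
        NECR = T^2 / (T + S + k*R)
    where k is 1 or 2 depending on the randoms-correction scheme.
    """
    return trues ** 2 / (trues + scatter + k * randoms)

# Hypothetical rates (counts/s) illustrating how lost events reduce NECR.
full = necr(trues=200e3, scatter=80e3, randoms=60e3)
# A uniform 10% event loss scales every rate (illustrative only):
lossy = necr(trues=180e3, scatter=72e3, randoms=54e3)
assert lossy < full
```

With a uniform loss fraction f, T² scales by (1-f)² while the denominator scales by (1-f), so NECR itself drops by the factor (1-f).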
Chen, P P; Tsui, N Tk; Fung, A Sw; Chiu, A Hf; Wong, W Cw; Leong, H T; Lee, P Sf; Lau, J Yw
2017-08-01
The implementation of a new clinical service is associated with anxiety and challenges that may prevent smooth and safe execution of the service. Unexpected issues may not be apparent until the actual clinical service commences. We present a novel approach to test the new clinical setting before actual implementation of our endovascular aortic repair service. In-situ simulation at the new clinical location would enable identification of potential process and system issues prior to implementation of the service. After preliminary planning, a simulation test utilising a case scenario with actual simulation of the entire care process was carried out to identify any logistic, equipment, settings or clinical workflow issues, and to trial a contingency plan for a surgical complication. All patient care, including anaesthetic, surgical, and nursing procedures and processes, was simulated and tested. Overall, 17 vital process and system issues were identified during the simulation as potential clinical concerns. They included difficult patient positioning, draping pattern, unsatisfactory equipment setup, inadequate critical surgical instruments, blood products logistics, and inadequate nursing support during crisis. In-situ simulation provides an innovative method to identify critical deficiencies and unexpected issues before implementation of a new clinical service. Life-threatening and serious practical issues can be identified and corrected before formal service commences. This article describes our experience with the use of simulation in pre-implementation testing of a clinical process or service. We found the method useful and would recommend it to others.
NASA Technical Reports Server (NTRS)
Dundas, T. R.
1981-01-01
The development and capabilities of the Montana geodata system are discussed. The system is entirely dependent on the state's central data processing facility which serves all agencies and is therefore restricted to batch mode processing. The computer graphics equipment is briefly described along with its application to state lands and township mapping and the production of water quality interval maps.
Facing Global Challenges: A European University Perspective. Policy Perspectives
ERIC Educational Resources Information Center
Swail, Watson Scott
2014-01-01
This EPI Policy Perspectives covers a presentation given at the European University Association Annual Convention (March 20, 2009, in Prague, Czech Republic) that addresses the Bologna process in the European Union. The process raised many questions regarding the role of the university, and the entire tertiary/postsecondary system of education.…
Code of Federal Regulations, 2010 CFR
2010-10-01
... shall be prepared for each process and welding position to be employed in the fabrication. (1) Girth...) processes. Classes I, I-L, and II-L piping are required to have the inside of the pipe machined for good fit...). (1) Where seal welding of threaded joints is performed, threads shall be entirely covered by the seal...
Watershed rehabilitation: a process view
Robert R. Ziemer
1981-01-01
Abstract - The most effective control of erosion, in both physical and economic terms, is through prevention because once natural erosion is accelerated, corrective action is not only expensive but seldom entirely successful. To control erosion it is important to understand the forces that cause material to move or resist movement. Once the forces and processes of...
Share Your Voice: Online Community Building during Reaffirmation of Accreditation
ERIC Educational Resources Information Center
Kruse, Brenda; Bonura, Kimberlee Bethany; James, Suzanne G.; Potler, Shelley
2013-01-01
Walden University recently underwent a successful reaffirmation of accreditation process with The Higher Learning Commission of the North Central Association of Colleges and Schools. As part of the 3-year process, a committee, named the Education and Communication working group, was formed to inform and engage with the entire Walden community. The…
ERIC Educational Resources Information Center
Lin, Shih-Yin; Singh, Chandralekha
2015-01-01
It is well known that introductory physics students often have alternative conceptions that are inconsistent with established physical principles and concepts. Invoking alternative conceptions in the quantitative problem-solving process can derail the entire process. In order to help students solve quantitative problems involving strong…
ERIC Educational Resources Information Center
Zentel, Peter; Bett, Katja; Meister, Dorothee M.; Rinn, Ulrike; Wedekind, Joachim
2004-01-01
In this article, we describe the current situation of virtual universities in Germany and pursue the question of whether innovation processes are taking place throughout the entire higher education landscape. Our study shows that the integration of ICT [information and communication technologies] not only changes the medial characteristics of the…
Duplicated genes evolve independently in allopolyploid cotton.
Richard C. Cronn; Randall L. Small; Jonathan F. Wendel
1999-01-01
Of the many processes that generate gene duplications, polyploidy is unique in that entire genomes are duplicated. This process has been important in the evolution of many eukaryotic groups, and it occurs with high frequency in plants. Recent evidence suggests that polyploidization may be accompanied by rapid genomic changes, but the evolutionary fate of discrete loci...
40 CFR 63.2840 - What emission requirements must I meet?
Code of Federal Regulations, 2014 CFR
2014-07-01
... entire calendar month in which the source operated under an initial startup period subject to § 63.2850(c... operating months, as determined in § 63.2853. Oilseed = Tons of each oilseed type “i” processed during the... Loss Factors for Determining Allowable HAP Loss Type of oilseed process A source that... Oilseed...
Cooperative Education Is a Superior Strategy for Using Basic Learning Processes.
ERIC Educational Resources Information Center
Reed, V. Gerald
Cooperative education is a learning strategy that fits very well with basic laws of learning. In fact, several basic important learning processes are far better adapted to the cooperative education strategy than to methods that lean entirely on classroom instruction. For instance, cooperative education affords more opportunities for reinforcement,…
An Interactive Virtual Tour of a Milk Powder Plant
ERIC Educational Resources Information Center
Herritsch, Alfred; Rahim, Elin Abdul; Fee, Conan J.; Morison, Ken R.; Gostomski, Peter A.
2013-01-01
Immersive learning applications in chemical and process engineering are creating the opportunity to bring entire process plants to the student. While meant to complement field trips, in some cases, this is the only opportunity for students to engage with certain industrial sites due to site regulations (health and safety, hygiene, intellectual…
Distributed Group Design Process: Lessons Learned.
ERIC Educational Resources Information Center
Eseryel, Deniz; Ganesan, Radha
A typical Web-based training development team consists of a project manager, an instructional designer, a subject-matter expert, a graphic artist, and a Web programmer. The typical scenario involves team members working together in the same setting during the entire design and development process. What happens when the team is distributed, that is…
Management of major system programs and projects. Handbook
NASA Technical Reports Server (NTRS)
1993-01-01
This Handbook establishes the detailed policies and processes for implementing NMI 7120.4, 'Management of Major System Programs and Projects'. It constitutes a comprehensive source of the specific policies and processes governing management of major development programs/projects and is intended as a resource to the entire program/project management (PPM) community.
NASA Astrophysics Data System (ADS)
Zhu, Ming; Liu, Tingting; Zhang, Xiangqun; Li, Caiyun
2018-01-01
Recently, a decomposition method of acoustic relaxation absorption spectra was used to capture the entire molecular multimode relaxation process of gas. In this method, the acoustic attenuation and phase velocity were measured jointly based on the relaxation absorption spectra. However, fast and accurate measurements of the acoustic attenuation remain challenging. In this paper, we present a method of capturing the molecular relaxation process by only measuring acoustic velocity, without the necessity of obtaining acoustic absorption. The method is based on the fact that the frequency-dependent velocity dispersion of a multi-relaxation process in a gas is the serial connection of the dispersions of interior single-relaxation processes. Thus, one can capture the relaxation times and relaxation strengths of N decomposed single-relaxation dispersions to reconstruct the entire multi-relaxation dispersion using the measurements of acoustic velocity at 2N + 1 frequencies. The reconstructed dispersion spectra are in good agreement with experimental data for various gases and mixtures. The simulations also demonstrate the robustness of our reconstructive method.
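The reconstruction idea (decomposing the measured multi-relaxation dispersion into single-relaxation contributions) can be sketched as follows. For simplicity the relaxation times are assumed known and only the strengths and the low-frequency velocity are fit linearly; the paper's method recovers the times as well, and every number here is synthetic.

```python
import numpy as np

def single_relaxation(f, tau):
    """Normalized dispersion shape of one single-relaxation process."""
    x = (2 * np.pi * f * tau) ** 2
    return x / (1 + x)

def reconstruct(freqs, v2, taus):
    """Fit v^2(f) = v0^2 + sum_i eps_i * S(f, tau_i) by least squares.

    freqs : measurement frequencies (2N+1 of them for N processes)
    v2    : measured phase velocity squared at those frequencies
    taus  : assumed relaxation times (fixed here for simplicity)
    """
    A = np.column_stack([np.ones_like(freqs)] +
                        [single_relaxation(freqs, t) for t in taus])
    coef, *_ = np.linalg.lstsq(A, v2, rcond=None)
    return coef[0], coef[1:]   # v0^2 and the relaxation strengths

# Synthetic two-process gas: baseline v0^2 plus two dispersion steps.
taus = [1e-6, 1e-4]
f = np.logspace(2, 7, 5)       # 2N+1 = 5 measurement frequencies
v2_true = (340.0 ** 2
           + 50.0 * single_relaxation(f, taus[0])
           + 20.0 * single_relaxation(f, taus[1]))
v0sq, eps = reconstruct(f, v2_true, taus)
```

Once v0², the strengths, and the times are in hand, the full dispersion curve can be evaluated at any frequency, which is what allows comparison with the measured spectra.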
Non-crosslinked, amorphous, block copolymer electrolyte for batteries
Mayes, Anne M.; Ceder, Gerbrand; Chiang, Yet-Ming; Sadoway, Donald R.; Aydinol, Mehmet K.; Soo, Philip P.; Jang, Young-Il; Huang, Biying
2006-04-11
Solid battery components are provided. A block copolymeric electrolyte is non-crosslinked and non-glassy through the entire range of typical battery service temperatures, that is, through the entire range of at least from about 0 °C to about 70 °C. The chains of which the copolymer is made each include at least one ionically-conductive block and at least one second block immiscible with the ionically-conductive block. The chains form an amorphous association and are arranged in an ordered nanostructure including a continuous matrix of amorphous ionically-conductive domains and amorphous second domains that are immiscible with the ionically-conductive domains. A compound is provided that has a formula of Li_xM_yN_zO_2. M and N are each metal atoms or main group elements, and x, y and z are each numbers from about 0 to about 1. y and z are chosen such that a formal charge on the M_yN_z portion of the compound is (4-x). In certain embodiments, these compounds are used in the cathodes of rechargeable batteries. The present invention also includes methods of predicting the potential utility of metal dichalcogenide compounds for use in lithium intercalation compounds. It also provides methods for processing lithium intercalation oxides with the structure and compositional homogeneity necessary to realize the increased formation energies of said compounds. An article is made of a dimensionally-stable, interpenetrating microstructure of a first phase including a first component and a second phase, immiscible with the first phase, including a second component. The first and second phases define interphase boundaries between them, and at least one particle is positioned between a first phase and a second phase at an interphase boundary. When the first and second phases are electronically-conductive and ionically-conductive polymers, respectively, and the particles are ion host particles, the arrangement is an electrode of a battery.
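The neutrality constraint quoted above, a formal charge of (4-x) on the M_yN_z portion of Li_xM_yN_zO_2, is simple charge bookkeeping: lithium contributes +x and the two oxygens contribute -4. A one-line sketch of that arithmetic:

```python
def required_charge(x):
    """Formal charge the M_y N_z portion must carry in Li_x M_y N_z O_2
    for overall neutrality: Li gives +x, O_2 gives -4, so M_yN_z = 4 - x."""
    return 4 - x

# e.g. a fully lithiated LiMO2-type layered oxide (x = 1) needs M at +3.
assert required_charge(1.0) == 3.0
```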
Code of Federal Regulations, 2012 CFR
2012-07-01
... entire vapor processing system except the exhaust port(s) or stack(s). Flare means a thermal oxidation...(ee). Thermal oxidation system means a combustion device used to mix and ignite fuel, air pollutants...
Prescribing medical cannabis in Canada: Are we being too cautious?
Lake, Stephanie; Kerr, Thomas; Montaner, Julio
2015-04-30
There has been much recent discussion and debate surrounding cannabis in Canada, including the prescribing of medical cannabis for therapeutic purposes. Certain commentators - including the Canadian Medical Association (CMA) - have denounced the prescribing of cannabis for medical purposes due to a perceived lack of evidence related to the drug's efficacy, harms, and mechanism of action. In this commentary, we present arguments in favour of prescribing medical cannabis in Canada. We believe the anti-cannabis position taken by CMA and other commentators is not entirely evidence-based. Using the example of neuropathic pain, we present and summarize the clinical evidence surrounding smoked or vapourized cannabis, including recent evidence pertaining to the effectiveness of cannabis in comparison to existing standard pharmacotherapies for neuropathy. Further, we outline how the concerns expressed regarding cannabis' mechanism of action are inconsistent with current decision-making processes related to the prescribing of many common pharmaceuticals. Finally, we discuss potential secondary public health benefits of prescribing cannabis for pain-related disorders in Canada and North America.
Paths of Adoption: Routes to Continuous Process Improvement
2014-07-01
To obtain a process the team will use: The Team Lead has worked on teams with good processes and wants their new team to start out on the right foot ...eventually going to have to eat the entire process-improvement elephant . So, how do you get the “Never- Adopters” to undertake the effort? The key is to...Air Warfare Center Abstract. This paper covers the different types of teams the authors have en- countered as NAVAIR Internal Process Coaches and how
21 CFR 56.107 - IRB membership.
Code of Federal Regulations, 2014 CFR
2014-04-01
... of its members, and the diversity of the members, including consideration of race, gender, cultural... gender. No IRB may consist entirely of members of one profession. (c) Each IRB shall include at least one...
NASA Technical Reports Server (NTRS)
Hooker, Stanford B.; McClain, Charles R.; Mannino, Antonio
2007-01-01
The primary objective of this planning document is to establish a long-term capability for calibrating and validating oceanic biogeochemical satellite data. It is a pragmatic solution to a practical problem based primarily on the lessons learned from prior satellite missions. All of the plan's elements are seen to be interdependent, so a horizontal organizational scheme is anticipated wherein the overall leadership comes from the NASA Ocean Biology and Biogeochemistry (OBB) Program Manager and the entire enterprise is split into two components of equal stature: calibration and validation plus satellite data processing. The detailed elements of the activity are based on the basic tasks of the two main components plus the current objectives of the Carbon Cycle and Ecosystems Roadmap. The former is distinguished by an internal core set of responsibilities and the latter is facilitated through an external connecting-core ring of competed or contracted activities. The core elements for the calibration and validation component include a) publish protocols and performance metrics; b) verify uncertainty budgets; c) manage the development and evaluation of instrumentation; and d) coordinate international partnerships. The core elements for the satellite data processing component are e) process and reprocess multisensor data; f) acquire, distribute, and archive data products; and g) implement new data products. Both components have shared responsibilities for initializing and temporally monitoring satellite calibration. Connecting-core elements include (but are not restricted to) atmospheric correction and characterization, standards and traceability, instrument and analysis round robins, field campaigns and vicarious calibration sites, in situ database, bio-optical algorithm (and product) validation, satellite characterization and vicarious calibration, and image processing software.
The plan also includes an accountability process, creating a Calibration and Validation Team (to help manage the activity), and a discussion of issues associated with the plan's scientific focus.
NASA Astrophysics Data System (ADS)
Landry, Blake J.; Hancock, Matthew J.; Mei, Chiang C.; García, Marcelo H.
2012-09-01
The ability to determine wave heights and phases along a spatial domain is vital to understanding a wide range of littoral processes. The software tool presented here employs established Stokes wave theory and sampling methods to calculate parameters for the incident and reflected components of a field of weakly nonlinear waves, monochromatic at first order in wave slope and propagating in one horizontal dimension. The software calculates wave parameters over an entire wave tank and accounts for reflection, weak nonlinearity, and a free second harmonic. Currently, no publicly available program has such functionality. The included MATLAB®-based open source code has also been compiled for Windows®, Mac® and Linux® operating systems. An additional companion program, VirtualWave, is included to generate virtual wave fields for WaveAR. Together, the programs serve as ideal analysis and teaching tools for laboratory water wave systems.
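WaveAR itself is a MATLAB tool that additionally handles weak nonlinearity and a free second harmonic; the core first-order idea of decomposing a wave field into incident and reflected components can be sketched from two gauge records (a Goda-Suzuki-style two-gauge separation; all names and values below are illustrative assumptions, not WaveAR's implementation):

```python
import cmath

def separate_incident_reflected(C1, C2, k, dx):
    """Two-gauge separation of linear incident/reflected wave amplitudes.

    C1, C2 : complex surface-elevation amplitudes at gauges x = 0 and x = dx
             (e.g. the fundamental-frequency FFT component of each record).
    Model:   C(x) = a_i*exp(i*k*x) + a_r*exp(-i*k*x)
    Solving the 2x2 linear system for (a_i, a_r). Singular when k*dx = n*pi,
    so the gauge spacing must avoid half-wavelength multiples.
    """
    ep = cmath.exp(1j * k * dx)
    em = cmath.exp(-1j * k * dx)
    a_i = (C2 - C1 * em) / (ep - em)
    a_r = (C1 * ep - C2) / (ep - em)
    return a_i, a_r

# Synthetic check: build gauge signals from known amplitudes, then recover them.
k, dx = 2.0, 0.4
ai, ar = 1.0 + 0j, 0.3 + 0j   # incident and reflected amplitudes [m]
C1 = ai + ar
C2 = ai * cmath.exp(1j * k * dx) + ar * cmath.exp(-1j * k * dx)
ri, rr = separate_incident_reflected(C1, C2, k, dx)
assert abs(ri - ai) < 1e-12 and abs(rr - ar) < 1e-12
```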
Setup, Validation and Quality Control of a Centralized WGS laboratory - Lessons Learned.
Arnold, Cath; Edwards, Kirstin; Desai, Meeta; Platt, Steve; Green, Jonathan; Conway, David
2018-04-25
Routine use of Whole Genome analysis for infectious diseases can be used to enlighten various scenarios pertaining to Public Health, including identification of microbial pathogens; relating individual cases to an outbreak of infectious disease; establishing an association between an outbreak of food poisoning and a specific food vehicle; inferring drug susceptibility; source tracing of contaminants; and study of how variations in the genome affect pathogenicity/virulence. We describe the setup, validation and ongoing verification of a centralised WGS laboratory to carry out the sequencing for these public health functions for the National Infection Services, Public Health England in the UK. The performance characteristics and Quality Control metrics measured during validation and verification of the entire end-to-end process (accuracy, precision, reproducibility and repeatability) are described and include information regarding the automated pass and release of data to service users without intervention. © Crown copyright 2018.
Persily, Gail L.; Butter, Karen A.
2010-01-01
The University of California, San Francisco, is an academic health sciences campus that is part of a state public university system. Space is very limited at this urban campus, and the library building's 90,000 square feet represent extremely valuable real estate. A planning process spanning several years initially proposed creating new teaching space utilizing 10,000 square feet of the library. A collaborative campus-wide planning process eventually resulted in the design of a new teaching and learning center that integrates clinical skills, simulation, and technology-enhanced education facilities on one entire floor of the building (21,000 square feet). The planning process resulted in a project that serves the entire campus and strengthens the library's role in the education mission. The full impact of the project is yet unknown as construction is not complete. PMID:20098654
A new look at low-energy nuclear reaction research.
Krivit, Steven B; Marwan, Jan
2009-10-01
This paper presents a new look at low-energy nuclear reaction research, a field that has developed from one of the most controversial subjects in science, "cold fusion." Early in the history of this controversy, beginning in 1989, a strong polarity existed; many scientists fiercely defended the claim of new physical effects as well as a new process in which like-charged atomic nuclei overcome the Coulomb barrier at normal temperatures and pressures. Many other scientists considered the entire collection of physical observations, along with the "cold fusion" hypothesis, entirely a mistake. Twenty years later, some people who had dismissed the field in its entirety are considering the validity of at least some of the reported experimental phenomena. As well, some researchers in the field are wondering whether the underlying phenomena may be not a fusion process but a neutron capture/absorption process. In 2002, a related tabletop form of thermonuclear fusion was discovered in the field of acoustic inertial confinement fusion. We briefly review some of this work as well.
de Araújo Padilha, Carlos Eduardo; Fortunato Dantas, Paulo Victor; de Sousa, Francisco Canindé; de Santana Souza, Domingos Fabiano; de Oliveira, Jackson Araújo; de Macedo, Gorete Ribeiro; Dos Santos, Everaldo Silvino
2016-12-15
In this study, a general rate model was applied to the entire process of expanded bed adsorption chromatography (EBAC) for the chitosanases purification protocol from unclarified fermentation broth produced by Paenibacillus ehimensis using the anionic adsorbent Streamline® DEAE. For the experiments performed using the expanded bed, a homemade column (2.6 cm × 30.0 cm) was specially designed. The proposed model predicted the entire EBA process adequately, giving R² values higher than 0.85 and χ² as low as 0.351 for the elution step. Using the validated model, a 3³ factorial design was used to investigate other non-tested conditions as input. It was observed that the superficial velocity during loading and washing steps, as well as the settled bed height, has a strong positive effect on the F objective function used to evaluate the production of the purified chitosanases. Copyright © 2016 Elsevier B.V. All rights reserved.
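Goodness-of-fit figures like the R² and χ² quoted above can be computed from observed versus model-predicted concentrations; a minimal stdlib sketch (the data here are illustrative, not the study's):

```python
def r_squared(observed, predicted):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

def chi_squared(observed, predicted):
    """Pearson chi-square of observations against model predictions."""
    return sum((o - p) ** 2 / p for o, p in zip(observed, predicted))

obs = [1.0, 2.0, 3.1, 4.2]    # illustrative elution-profile samples
pred = [1.1, 1.9, 3.0, 4.3]   # illustrative model predictions
assert r_squared(obs, pred) > 0.85   # same acceptance threshold as the study
assert chi_squared(obs, pred) < 1.0
```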
Scandurra, I; Hägglund, M; Koch, S
2008-08-01
This paper presents a new multi-disciplinary method for user needs analysis and requirements specification in the context of health information systems based on established theories from the fields of participatory design and computer supported cooperative work (CSCW). Whereas conventional methods imply a separate, sequential needs analysis for each profession, the "multi-disciplinary thematic seminar" (MdTS) method uses a collaborative design process. Application of the method in elderly homecare resulted in prototypes that were well adapted to the intended user groups. Vital information in the points of intersection between different care professions was elicited and a holistic view of the entire care process was obtained. Health informatics-usability specialists and clinical domain experts are necessary to apply the method. Although user needs acquisition can be time-consuming, MdTS was perceived to efficiently identify in-context user needs, and transformed these directly into requirements specifications. Consequently the method was perceived to expedite the entire ICT implementation process.
Bang, Seungmin; Park, Jeong Youp; Jeong, Seok; Kim, Young Ho; Shim, Han Bo; Kim, Tae Song; Lee, Don Haeng; Song, Si Young
2009-02-01
We developed a capsule endoscope (CE), "MiRo," with the novel transmission technology of electric-field propagation. The technology uses the human body as a conductive medium for data transmission. Specifications of the prototype include the ability to receive real-time images; size, 10.8 × 24 mm; weight, 3.3 g; field of view, 150 degrees; resolution, 320 × 320 pixels; and transmittal speed, 2 frames per second. To evaluate the clinical safety and diagnostic feasibility of the prototype MiRo, we conducted a multicenter clinical trial. All volunteers underwent baseline examinations, including EGD and electrocardiography for the screening of GI obstructive and cardiovascular diseases, before the trial. In the first 10 cases, 24-hour Holter monitoring was also performed. To evaluate the diagnostic feasibility, the transmission rate of the captured images, the inspection rate of the entire small bowel, and the quality of transmitted images (graded as outstanding, excellent, good/average, below average, and poor) were analyzed. Of the 49 healthy volunteers, 45 were included in the trial, and 4 were excluded because of baseline abnormalities. No adverse effects were noted. All CEs were expelled within 2 days, and the entire small bowel could be explored in all cases. The transmission rates of the captured image in the stomach, small bowel, and colon were 99.5%, 99.6%, and 97.2%, respectively. The mean total duration of image transmission was 9 hours, 51 minutes, and the mean transit time of the entire small bowel was 4 hours, 33 minutes. Image quality was graded as good or better in 41 cases (91.1%). Details of the villi and vascular structures of the entire small bowel were clearly visualized in 31 cases (68.9%). MiRo is safe and effective for exploring the entire small bowel, with good image quality and real-time feasibility. This novel transmission technology may have applications beyond the field of capsule endoscopy.
The central nervous system is composed of the brain and spinal cord. Your brain and spinal cord serve as the main "processing center" for your entire nervous system. They control all the workings of your body.
NASA Astrophysics Data System (ADS)
Bevilacqua, Andrea; Neri, Augusto; Bisson, Marina; Esposti Ongaro, Tomaso; Flandoli, Franco; Isaia, Roberto; Rosi, Mauro; Vitale, Stefano
2017-09-01
This study presents a new method for producing long-term hazard maps for pyroclastic density currents (PDC) originating at Campi Flegrei caldera. The method is based on a doubly stochastic approach and is able to combine the uncertainty assessments on the spatial location of the volcanic vent, the size of the flow and the expected time of such an event. The results are obtained by using a Monte Carlo approach and adopting a simplified invasion model based on the box model integral approximation. Temporal assessments are modelled through a Cox-type process including self-excitement effects, based on the eruptive record of the last 15 kyr. Mean and percentile maps of PDC invasion probability are produced, exploring their sensitivity to some sources of uncertainty and to the effects of the dependence between PDC scales and the caldera sector where they originated. Conditional maps representative of PDC originating inside limited zones of the caldera, or of PDC with a limited range of scales are also produced. Finally, the effect of assuming different time windows for the hazard estimates is explored, also including the potential occurrence of a sequence of multiple events. Assuming that the last eruption of Monte Nuovo (A.D. 1538) marked the beginning of a new epoch of activity similar to the previous ones, results of the statistical analysis indicate a mean probability of PDC invasion above 5% in the next 50 years on almost the entire caldera (with a probability peak of 25% in the central part of the caldera). In contrast, probability values reduce by a factor of about 3 if the entire eruptive record is considered over the last 15 kyr, i.e. including both eruptive epochs and quiescent periods.
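The Monte Carlo structure of such a hazard estimate, sampling a vent location and a flow size and counting invasions of a target point, can be sketched crudely as follows. The Gaussian vent field, log-uniform runout and circular footprint below are placeholder assumptions for illustration only, not the published box-model simulator:

```python
import math
import random

def invasion_probability(target_km, n_trials=20000, seed=1):
    """Crude Monte Carlo sketch of PDC invasion probability at one point.

    Each trial samples a vent location (Gaussian field, km) and a flow
    runout (log-uniform, ~0.3-10 km), then tests whether a circular
    footprint of that runout reaches the target."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        vx, vy = rng.gauss(0.0, 2.0), rng.gauss(0.0, 2.0)  # vent [km]
        runout = 10.0 ** rng.uniform(-0.5, 1.0)            # runout [km]
        if math.hypot(target_km[0] - vx, target_km[1] - vy) <= runout:
            hits += 1
    return hits / n_trials

# Invasion probability should fall with distance from the vent field.
p_center = invasion_probability((0.0, 0.0))
p_edge = invasion_probability((6.0, 0.0))
assert 0.0 < p_edge < p_center < 1.0
```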
Ojeda-May, Pedro; Nam, Kwangho
2017-08-08
The strategy and implementation of scalable and efficient semiempirical (SE) QM/MM methods in CHARMM are described. The serial version of the code was first profiled to identify routines that required parallelization. Afterward, the code was parallelized and accelerated with three approaches. The first approach was the parallelization of the entire QM/MM routines, including the Fock matrix diagonalization routines, using the CHARMM message passing interface (MPI) machinery. In the second approach, two different self-consistent field (SCF) energy convergence accelerators were implemented using density and Fock matrices as targets for their extrapolations in the SCF procedure. In the third approach, the entire QM/MM and MM energy routines were accelerated by implementing the hybrid MPI/open multiprocessing (OpenMP) model in which both the task- and loop-level parallelization strategies were adopted to balance loads between different OpenMP threads. The present implementation was tested on two solvated enzyme systems (including <100 QM atoms) and an SN2 symmetric reaction in water. The MPI version exceeded existing SE QM methods in CHARMM, which include the SCC-DFTB and SQUANTUM methods, by at least 4-fold. The use of SCF convergence accelerators further accelerated the code by ∼12-35% depending on the size of the QM region and the number of CPU cores used. Although the MPI version displayed good scalability, the performance was diminished for large numbers of MPI processes due to the overhead associated with MPI communications between nodes. This issue was partially overcome by the hybrid MPI/OpenMP approach which displayed a better scalability for a larger number of CPU cores (up to 64 CPUs in the tested systems).
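The simplest member of the SCF convergence-accelerator family is damped (linearly mixed) fixed-point iteration; the paper's density- and Fock-matrix extrapolators are more elaborate, but the control flow is the same. A toy scalar sketch (the function names and the cos(x) test problem are illustrative assumptions):

```python
import math

def scf_fixed_point(g, x0, mix=0.5, tol=1e-10, max_iter=200):
    """Damped fixed-point iteration x <- (1-mix)*x + mix*g(x).

    Linear mixing is the simplest SCF convergence accelerator; real SCF
    codes extrapolate density/Fock matrices instead of a scalar."""
    x = x0
    for i in range(max_iter):
        x_new = (1.0 - mix) * x + mix * g(x)
        if abs(x_new - x) < tol:
            return x_new, i + 1
        x = x_new
    return x, max_iter

# Toy self-consistency problem x = cos(x): damping converges quickly.
root, iters = scf_fixed_point(math.cos, 0.0)
assert abs(root - math.cos(root)) < 1e-8
assert iters < 200
```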
BLENDING ANALYSIS FOR RADIOACTIVE SALT WASTE PROCESSING FACILITY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, S.
2012-05-10
Savannah River National Laboratory (SRNL) evaluated methods to mix and blend the contents of the blend tanks to ensure the contents are properly blended before they are transferred from a blend tank such as Tank 21 or Tank 24 to the Salt Waste Processing Facility (SWPF) feed tank. The tank contents consist of three forms: dissolved salt solution, other waste salt solutions, and sludge containing settled solids. This paper focuses on developing the computational model and estimating the submersible slurry pump operation time required for the tank contents to be adequately blended prior to their transfer to the SWPF facility. A three-dimensional computational fluid dynamics approach was taken by using the full scale configuration of SRS Type-IV tank, Tank 21H. Major solid obstructions such as the tank wall boundary, the transfer pump column, and three slurry pump housings including one active and two inactive pumps were included in the mixing performance model. Basic flow pattern results predicted by the computational model were benchmarked against the SRNL test results and literature data. Tank 21 is a waste tank that is used to prepare batches of salt feed for SWPF. The salt feed must be a homogeneous solution satisfying the acceptance criterion of the solids entrainment during transfer operation. The work scope described here consists of two modeling areas. They are the steady state flow pattern calculations before the addition of acid solution for tank blending operation and the transient mixing analysis during miscible liquid blending operation. The transient blending calculations were performed by using the 95% homogeneity criterion for the entire liquid domain of the tank. The initial conditions for the entire modeling domain were based on the steady-state flow pattern results with zero second phase concentration. The performance model was also benchmarked against the SRNL test results and literature data.
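One plausible reading of a "95% homogeneity" acceptance test is that every sampled cell concentration lies within 5% of the tank-average value; a minimal sketch under that assumption (not the SRNL definition, which is not given here):

```python
def is_blended(concentrations, criterion=0.95):
    """Declare the tank blended when every sampled cell concentration is
    within (1 - criterion) of the tank-average value.

    This is one plausible reading of a '95% homogeneity' criterion,
    assumed for illustration."""
    mean = sum(concentrations) / len(concentrations)
    tol = (1.0 - criterion) * mean
    return all(abs(c - mean) <= tol for c in concentrations)

# Well-mixed cells pass; a tank with strong spatial gradients does not.
assert is_blended([1.00, 1.02, 0.99, 1.01])
assert not is_blended([1.0, 1.5, 0.6, 0.9])
```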
NASA Astrophysics Data System (ADS)
Lau, Rita
2018-02-01
In this paper, we investigate the sensitivities of positron decays on a one-zone model of type-I X-ray bursts. Most existing studies have multiplied or divided entire beta decay rates (electron captures and beta decay rates) by 10. Instead of using the standard Fuller & Fowler (FFNU) rates, we used the most recently developed weak library rates [1], which include rates from Langanke et al.'s table (the LMP table) (2000) [2], Langanke et al.'s table (the LMSH table) (2003) [3], and Oda et al.'s table (1994) [4] (all shell model rates). We then compared these table rates with the old FFNU rates [5] to study differences within the final abundances. Both positron decays and electron capture rates were included in the tables. We also used pn-QRPA rates [6,7] to study the differences within the final abundances. Many of the positron rates from the nuclei's ground states and initial excited energy states along the rapid proton capture (rp) process have been measured in existing studies. However, because temperature affects the rates of excited states, these studies should have also acknowledged the half-lives of the nuclei's excited states. Thus, instead of multiplying or dividing entire rates by 10, we studied how the half-lives of sensitive nuclei in excited states affected the abundances by dividing the half-lives of the ground states by 10, which allowed us to set the half-lives of the excited states. Interestingly, we found that the peak of the final abundance shifted when we modified the rates from the excited states of the 105Sn positron decay rates. Furthermore, the abundance of 80Zr also changed due to usage of pn-QRPA rates instead of weak library rates (the shell model rates).
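The half-life arithmetic described above reduces to decay rates λ = ln 2 / t½ and a Boltzmann-weighted average over thermally populated states; a two-level sketch (all numbers illustrative, not measured ¹⁰⁵Sn data):

```python
import math

def effective_decay_rate(lam_gs, lam_ex, e_ex_kev, kT_kev, g_gs=1.0, g_ex=1.0):
    """Boltzmann-weighted decay rate of a two-level nucleus.

    A thermally populated excited state at energy e_ex_kev, with its own
    decay rate lam_ex, contributes with weight (g_ex/g_gs)*exp(-E/kT)."""
    w = (g_ex / g_gs) * math.exp(-e_ex_kev / kT_kev)
    return (lam_gs + w * lam_ex) / (1.0 + w)

half_life = 10.0                      # s, ground state (illustrative)
lam_gs = math.log(2) / half_life      # lambda = ln 2 / t_half
lam_ex = 10.0 * lam_gs                # excited state assumed 10x faster

# Cold plasma: the excited state is unpopulated, rate -> ground-state rate.
assert abs(effective_decay_rate(lam_gs, lam_ex, 300.0, 10.0) - lam_gs) < 1e-9
# Hot X-ray-burst plasma: the effective rate rises toward the excited-state rate.
hot = effective_decay_rate(lam_gs, lam_ex, 300.0, 150.0)
assert lam_gs < hot < lam_ex
```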
Statistical Methods for Identifying Sequence Motifs Affecting Point Mutations
Zhu, Yicheng; Neeman, Teresa; Yap, Von Bing; Huttley, Gavin A.
2017-01-01
Mutation processes differ between types of point mutation, genomic locations, cells, and biological species. For some point mutations, specific neighboring bases are known to be mechanistically influential. Beyond these cases, numerous questions remain unresolved, including: what are the sequence motifs that affect point mutations? How large are the motifs? Are they strand symmetric? And, do they vary between samples? We present new log-linear models that allow explicit examination of these questions, along with sequence logo style visualization to enable identifying specific motifs. We demonstrate the performance of these methods by analyzing mutation processes in human germline and malignant melanoma. We recapitulate the known CpG effect, and identify novel motifs, including a highly significant motif associated with A→G mutations. We show that major effects of neighbors on germline mutation lie within ±2 of the mutating base. Models are also presented for contrasting the entire mutation spectra (the distribution of the different point mutations). We show the spectra vary significantly between autosomes and X-chromosome, with a difference in T→C transition dominating. Analyses of malignant melanoma confirmed reported characteristic features of this cancer, including statistically significant strand asymmetry, and markedly different neighboring influences. The methods we present are made freely available as a Python library https://bitbucket.org/pycogent3/mutationmotif. PMID:27974498
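The first step of any neighbor-motif analysis is tallying the bases flanking each mutated site; a stdlib sketch of that count table (the real method then contrasts these counts against matched controls with log-linear models, which this sketch does not attempt):

```python
from collections import Counter

def neighbor_counts(sequences, positions, flank=1):
    """Tally bases at offsets -flank..+flank around each mutated position.

    sequences : list of strings; positions : 0-based mutated index in each.
    Returns Counter keyed by (offset, base)."""
    counts = Counter()
    for seq, pos in zip(sequences, positions):
        for off in range(-flank, flank + 1):
            if off == 0:
                continue  # skip the mutating base itself
            j = pos + off
            if 0 <= j < len(seq):
                counts[(off, seq[j])] += 1
    return counts

# Two toy sequences, mutated base at index 1 and 2 respectively.
c = neighbor_counts(["ACGT", "CCGA"], [1, 2])
assert c[(-1, "A")] == 1 and c[(1, "G")] == 1   # neighbors of "ACGT"[1]
assert c[(-1, "C")] == 1 and c[(1, "A")] == 1   # neighbors of "CCGA"[2]
```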
Using Automation to Improve the Flight Software Testing Process
NASA Technical Reports Server (NTRS)
ODonnell, James R., Jr.; Andrews, Stephen F.; Morgenstern, Wendy M.; Bartholomew, Maureen O.; McComas, David C.; Bauer, Frank H. (Technical Monitor)
2001-01-01
One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, attitude control, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on previous missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the perceived benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.
NASA Technical Reports Server (NTRS)
Szuszczewicz, E. P.; Bateman, T. T.
1996-01-01
We have conducted a laboratory investigation into the physics of plasma expansions and their associated energization processes. We studied single- and multi-ion plasma processes in self-expansions, and included light and heavy ions and heavy/light mixtures to encompass the phenomenological regimes of the solar and polar winds and the AMPTE and CRRES chemical release programs. The laboratory experiments provided spatially-distributed time-dependent measurements of total plasma density, temperature, and density fluctuation power spectra, with the data confirming the long-theorized electron energization process in an expanding cloud - a result that was impossible to determine in spaceborne experiments (as e.g., in the CRRES program). These results provided the missing link in previous laboratory and spaceborne programs, confirming important elements in our understanding of such solar-terrestrial processes as manifested in expanding plasmas in the solar wind (e.g., CMEs) and in ionospheric outflow in plasmaspheric refilling after a storm. The energization signatures were seen in an entire series of runs that varied the ion species (Ar+, Xe+, Kr+ and Ne+), and correlative studies included spectral analyses of electrostatic waves collocated with the energized electron distributions. In all cases wave energies were most intense during the times in which the suprathermal populations were present, with wave intensity increasing with the intensity of the suprathermal electron population. This is consistent with theoretical expectations wherein the energization process is directly attributable to wave particle interactions. No resonance conditions were observed, in an overall framework in which the general wave characteristics were broadband with power decreasing with increasing frequency.
Using Automation to Improve the Flight Software Testing Process
NASA Technical Reports Server (NTRS)
ODonnell, James R., Jr.; Morgenstern, Wendy M.; Bartholomew, Maureen O.
2001-01-01
One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, knowledge of attitude control and attitude control hardware, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on other missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.
Using the nursing process to implement a Y2K computer application.
Hobbs, C F; Hardinge, T T
2000-01-01
Because of the coming year 2000, the need was assessed to upgrade the order entry system at many hospitals. At Somerset Medical Center, a training team divided the transition into phases and used a modified version of the nursing process to implement the new program. The entire process required fewer than 6 months and was relatively problem-free. This successful transition was aided by the nursing process, training team, and innovative educational techniques.
NASA Astrophysics Data System (ADS)
Escriva-Bou, A.; Lund, J. R.; Pulido-Velazquez, M.; Medellin-Azuara, J.
2015-12-01
Most of the individual processes underlying the interdependence of water and energy have been assessed in many different ways over the last decade. It is time to step up and bring the results of these studies into management by providing a tool that integrates these processes into decision-making, so that the tradeoffs between water and energy under different management options and scenarios can be understood effectively. A simple but powerful decision support system (DSS) for water management is described that includes water-related energy use and GHG emissions not solely from the water operations, but also from final water end uses, including demands from cities, agriculture, environment and the energy sector. Because one of the main drivers of energy use and GHG emissions is water pumping from aquifers, the DSS combines a surface water management model with a simple groundwater model, accounting for their interrelationships. The model also explicitly includes economic data to optimize water use across sectors during shortages and calculate return flows from different uses. Capabilities of the DSS are demonstrated on a case study over California's intertied water system. Results show that urban end uses account for most GHG emissions of the entire water cycle, but large water conveyance produces significant peaks over the summer season. Also the development of more efficient water application on the agricultural sector has increased the total energy consumption and the net water use in the basins.
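The core arithmetic linking groundwater pumping to energy and GHG is E = ρ·g·h·V / η, times a grid emission factor; a minimal sketch (the efficiency and emission-factor values are assumptions for illustration):

```python
RHO_WATER = 1000.0   # kg/m3
G = 9.81             # m/s2

def pumping_energy_kwh(volume_m3, lift_m, efficiency=0.7):
    """Energy to lift groundwater: E = rho*g*h*V / eta, converted J -> kWh.

    The 70% wire-to-water efficiency is an illustrative assumption."""
    joules = RHO_WATER * G * lift_m * volume_m3 / efficiency
    return joules / 3.6e6

def pumping_ghg_kg(volume_m3, lift_m, kg_co2_per_kwh=0.35, efficiency=0.7):
    """GHG from that pumping for an assumed grid emission factor."""
    return pumping_energy_kwh(volume_m3, lift_m, efficiency) * kg_co2_per_kwh

# 1000 m3 lifted 50 m: roughly 195 kWh at 70% efficiency.
e = pumping_energy_kwh(1000.0, 50.0)
assert abs(e - 1000.0 * 9.81 * 50.0 * 1000.0 / 0.7 / 3.6e6) < 1e-9
assert pumping_ghg_kg(1000.0, 50.0) > 0.0
```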
New principle of chemotherapy resistance
A laboratory study has revealed an entirely unexpected process for acquiring drug resistance that bypasses the need to re-establish DNA damage repair in breast cancers that have mutant BRCA1 or BRCA2 genes.
43 CFR 3203.10 - How are lands included in a competitive sale?
Code of Federal Regulations, 2011 CFR
2011-10-01
... within the legal subdivision, section, township, and range; (2) For unsurveyed lands, describe the lands..., include an entire section, township, and range. Do not divide protracted sections into aliquot parts; (4...
Azevedo, Stephen; Grangeat, Pierre; Rizo, Philippe
1995-01-01
Process and installation making it possible to reconstitute precise images of an area of interest (2) of an object (1) by reducing the errors produced by the contribution of the complement of the object. A first series of measurements is carried out, where a conical beam (10) only takes in the area of interest of the object (2), and this is followed by a second series of measurements in which the beam takes in the entire object. A combination of the measurements of the two series is carried out in order to make them compatible and obtain a more accurate image of the area of interest (2).
Statistical Features of the 2010 Beni-Ilmane, Algeria, Aftershock Sequence
NASA Astrophysics Data System (ADS)
Hamdache, M.; Peláez, J. A.; Gospodinov, D.; Henares, J.
2018-03-01
The aftershock sequence of the 2010 Beni-Ilmane (Mw 5.5) earthquake is studied in depth to analyze the spatial and temporal variability of the seismicity parameters of the relationships modeling the sequence. The b value of the frequency-magnitude distribution is examined rigorously. A magnitude of completeness equal to 2.1, using the maximum curvature procedure or the changing-point algorithm, and a b value equal to 0.96 ± 0.03 have been obtained for the entire sequence. Two clusters have been identified and characterized by their faulting type, exhibiting b values equal to 0.99 ± 0.05 and 1.04 ± 0.05. Additionally, the temporal decay of the aftershock sequence was examined using a stochastic point process. The analysis was done through the restricted epidemic-type aftershock sequence (RETAS) stochastic model, which makes it possible to identify the prevailing clustering pattern of the relaxation process in the examined area. The analysis selected the epidemic-type aftershock sequence (ETAS) model as offering the most appropriate description of the temporal distribution, which presumes that all events in the sequence can cause secondary aftershocks. Finally, the fractal dimensions are estimated using the correlation integral. The obtained D2 values are 2.15 ± 0.01, 2.23 ± 0.01 and 2.17 ± 0.02 for the entire sequence and for the first and second clusters, respectively. An analysis of the temporal evolution of the fractal dimensions D−2, D0, D2 and the spectral slope has also been performed to derive and characterize the different clusters included in the sequence.
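The b value quoted above comes from a Gutenberg-Richter frequency-magnitude fit. A widely used estimator for such fits is the Aki-Utsu maximum-likelihood form, sketched below; the abstract does not state that this exact estimator was used, and the magnitude binning width `dm` is an assumption.

```python
import math

def b_value_aki(mags, mc, dm=0.1):
    """Aki-Utsu maximum-likelihood b-value estimate.

    mags: catalogue magnitudes
    mc:   magnitude of completeness (events below mc are discarded)
    dm:   magnitude binning width (assumed 0.1 here)
    b = log10(e) / (mean(M) - (mc - dm/2)), the half-bin term correcting
    for binned magnitudes.
    """
    m = [x for x in mags if x >= mc]
    mean_m = sum(m) / len(m)
    return math.log10(math.e) / (mean_m - (mc - dm / 2.0))
```

With mc = 2.1 as in the sequence above, a mean magnitude near 2.55 would reproduce a b value close to the reported 0.96.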
Stagnation-point heat-transfer rate predictions at aeroassist flight conditions
NASA Technical Reports Server (NTRS)
Gupta, Roop N.; Jones, Jim J.; Rochelle, William C.
1992-01-01
The results are presented for the stagnation-point heat-transfer rates used in the design process of the Aeroassist Flight Experiment (AFE) vehicle over its entire aeropass trajectory. The prediction methods used in this investigation demonstrate the application of computational fluid dynamics (CFD) techniques to a wide range of flight conditions and their usefulness in a design process. The heating rates were computed by a viscous-shock-layer (VSL) code at the lower altitudes and by a Navier-Stokes (N-S) code for the higher-altitude cases. For both methods, a finite-rate chemically reacting gas was considered, and a temperature-dependent wall-catalysis model was used. The wall temperature for each case was assumed to be the radiative equilibrium temperature, based on total heating. The radiative heating was estimated by using a correlation equation. Wall slip was included in the N-S calculation method, and this method implicitly accounts for shock slip. The N-S/VSL combination of prediction methods was established by comparison with results from the published benchmark flow-field code LAURA at the lower altitudes, and with direct simulation Monte Carlo results for the higher-altitude cases. To obtain the design heating rate over the entire forward face of the vehicle, a boundary-layer method (BLIMP code) that employs reacting chemistry and surface catalysis was used. The ratio of the VSL or N-S method prediction to that obtained from the boundary-layer method code at the stagnation point is used to define an adjustment factor, which accounts for the errors involved in using the boundary-layer method.
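The adjustment-factor step described at the end is simple arithmetic: the boundary-layer heating distribution over the forebody is scaled by the ratio of the higher-fidelity stagnation-point prediction to the boundary-layer one. A minimal sketch (variable names and sample numbers are illustrative, not values from the study):

```python
def adjusted_heating(q_bl_distribution, q_stag_hifi, q_stag_bl):
    """Scale a boundary-layer heating distribution by the stagnation-point
    adjustment factor.

    q_bl_distribution: heating rates from the boundary-layer (BLIMP-style) code
    q_stag_hifi:       stagnation-point rate from the VSL or N-S method
    q_stag_bl:         stagnation-point rate from the boundary-layer method
    """
    factor = q_stag_hifi / q_stag_bl
    return [factor * q for q in q_bl_distribution]
```

The factor corrects the cheap full-surface solution with the more accurate, but stagnation-point-only, CFD result.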
Challenges in Soft Computing: Case Study with Louisville MSD CSO Modeling
NASA Astrophysics Data System (ADS)
Ormsbee, L.; Tufail, M.
2005-12-01
The principal constituents of soft computing include fuzzy logic, neural computing, evolutionary computation, machine learning, and probabilistic reasoning. There are numerous applications of these constituents (both individually and in combinations of two or more) in the area of water resources and environmental systems. These range from the development of data-driven models to optimal control strategies that assist in a more informed and intelligent decision-making process. Availability of data is critical to such applications, and scarce data may lead to models that do not represent the response function over the entire domain. At the same time, too much data has a tendency to lead to over-constraining of the problem. This paper will describe the application of a subset of these soft computing techniques (neural computing and genetic algorithms) to the Beargrass Creek watershed in Louisville, Kentucky. The applications include the development of inductive models as substitutes for more complex process-based models to predict the water quality of key constituents (such as dissolved oxygen) and their use in an optimization framework for optimal load reductions. Such a process will facilitate the development of total maximum daily loads for the impaired water bodies in the watershed. Some of the challenges faced in this application include 1) uncertainty in data sets, 2) model application, and 3) development of cause-and-effect relationships between water quality constituents and watershed parameters through the use of inductive models. The paper will discuss these challenges and how they affect the desired goals of the project.
Planetary Evolution, Habitability and Life
NASA Astrophysics Data System (ADS)
Tilman, Spohn; Breuer, Doris; de Vera, Jean-Pierre; Jaumann, Ralf; Kuehrt, Ekkehard; Möhlmann, Diedrich; Rauer, Heike; Richter, Lutz
A Helmholtz Alliance has been established to study the interactions between life and the evolution of planets. The approach goes beyond current studies in Earth-system sciences by including the entire planet from the atmosphere to the deep interior; by going beyond Earth to include other Earth-like planets such as Mars and Venus, and satellites in the solar system where ecosystems may exist underneath thick ice shells; and by considering other solar systems. The approach includes studies of the importance of plate tectonics and other tectonic regimes, such as single-plate tectonics, for the development and sustaining of life, and asks the question: If life can adapt to a planet, can a planet adapt to life? Can life be seen as a geological process, and if so, can life shape the conditions on a planet such that life can flourish? The vision goes beyond the solar system by including the challenges that life would face in other solar systems. The Alliance uses theoretical modelling of feedback cycles and coupled planetary atmosphere and interior processes. These models are based on the results of remote sensing of planetary surfaces and atmospheres, laboratory studies on (meteorite) samples from other planets, and studies of life under extreme conditions. The Alliance uses its unique capabilities in remote sensing and in-situ exploration to prepare for empirical studies of the parameters affecting habitability. The Alliance aims to establish a network infrastructure in Germany to enable the most advanced research in planetary evolution studies by including life as a planetary process. Finding extraterrestrial life is a task of fundamental importance to mankind, and its fulfilment will be philosophically profound. Evaluating the interactions between planetary evolution and life will help to put the evolution of our home planet (even anthropogenic effects) into perspective.
QuakeSim: a Web Service Environment for Productive Investigations with Earth Surface Sensor Data
NASA Astrophysics Data System (ADS)
Parker, J. W.; Donnellan, A.; Granat, R. A.; Lyzenga, G. A.; Glasscoe, M. T.; McLeod, D.; Al-Ghanmi, R.; Pierce, M.; Fox, G.; Grant Ludwig, L.; Rundle, J. B.
2011-12-01
The QuakeSim science gateway environment includes a visually rich portal interface, web service access to data and data processing operations, and the QuakeTables ontology-based database of fault models and sensor data. The integrated tools and services are designed to assist investigators by covering the entire earthquake cycle of strain accumulation and release. The Web interface now includes Drupal-based access to diverse and changing content, with new ability to access data and data processing directly from the public page, as well as the traditional project management areas that require password access. The system is designed to make initial browsing of fault models and deformation data particularly engaging for new users. Popular data and data processing include GPS time series with data mining techniques to find anomalies in time and space, experimental forecasting methods based on catalogue seismicity, faulted deformation models (both half-space and finite element), and model-based inversion of sensor data. The fault models include the CGS and UCERF 2.0 faults of California and are easily augmented with self-consistent fault models from other regions. The QuakeTables deformation data include the comprehensive set of UAVSAR interferograms as well as a growing collection of satellite InSAR data. Fault interaction simulations are also being incorporated in the web environment based on Virtual California. A sample usage scenario is presented which follows an investigation of UAVSAR data from viewing as an overlay in Google Maps, to selection of an area of interest via a polygon tool, to fast extraction of the relevant correlation and phase information from large data files, to a model inversion of fault slip followed by calculation and display of a synthetic model interferogram.
Manipulation Capabilities with Simple Hands
2010-01-01
allowing it to interpret online kinesthetic data, addressing two objectives: • Grasp classification: Distinguish between successful and unsuccessful... determining the grasp outcome before the grasping process is complete, by using the entire time series or kinesthetic signature of the grasping process. As... the grasp proceeds and additional kinesthetic data accumulates, the confidence also increases.
Re-Engineering the Curriculum at a Rural Institution: Reflection on the Process of Development
ERIC Educational Resources Information Center
Naude, A.; Wium, A. M.; du Plessis, S.
2011-01-01
The Department of Speech-Language Pathology and Audiology at the University of Limpopo (Medunsa Campus) redesigned their curriculum at the beginning of 2010. The template that was developed shows the horizontal and vertical integration of outcomes. Although the outcomes of the entire process met the requirements of the Health Professions Council…
NASA Technical Reports Server (NTRS)
Steurer, Wolfgang
1992-01-01
This process employs a thermal plasma for the separation and production of oxygen and metals. It is a continuous process that requires no consumables and relies entirely on space resources. The almost complete absence of waste renders it relatively clean. It can be turned on or off without any undesirable side effects or residues. The prime disadvantage is its high power consumption.
ERIC Educational Resources Information Center
Xiaobing, Sun
2012-01-01
This paper starts out by describing the research and drafting processes of the "National Medium- and Long-Term Educational Reform and Development Guideline" (2010-20) (hereafter abbreviated as the "Guideline") and analyzes a series of core concepts that ran through the entire process of researching and drafting the…
ERIC Educational Resources Information Center
al Mahmud, Abdullah
2013-01-01
The gist of the entire constructivist learning theory is that learners are self-builders of their learning that occurs through a mental process in a social context or communication setting, and teachers as facilitators generate learning by creating the expected environment and/or utilizing the process. This article theoretically proves…
The Real-Time Processing of Sluiced Sentences
ERIC Educational Resources Information Center
Poirier, Josee; Wolfinger, Katie; Spellman, Lisa; Shapiro, Lewis P.
2010-01-01
Ellipsis refers to an element that is absent from the input but whose meaning can nonetheless be recovered from context. In this cross-modal priming study, we examined the online processing of Sluicing, an ellipsis whose antecedent is an entire clause: "The handyman threw a book to the programmer but I don't know which book" the handyman threw to…
The Equity Effects of Restraints on Taxing and Spending.
ERIC Educational Resources Information Center
Menchik, Mark David; Pascal, Anthony H.
The passage of California's Proposition 13 is the best known incident in the process that can be labeled "fiscal containment." This process, resulting from a shift in the mood and the demands of the entire nation's electorate, involves a moderation of rapid growth in government and means a less prominent role for government in the…
Sketching in Design Journals: An Analysis of Visual Representations in the Product Design Process
ERIC Educational Resources Information Center
Lau, Kimberly; Oehlberg, Lora; Agogino, Alice
2009-01-01
This paper explores the sketching behavior of designers and the role of sketching in the design process. Observations from a descriptive study of sketches provided in design journals, characterized by a protocol measuring sketching activities, are presented. A distinction is made between journals that are entirely tangible and those that contain…
Alternative Vocabularies in the Test Validity Literature
ERIC Educational Resources Information Center
Markus, Keith A.
2016-01-01
Justification of testing practice involves moving from one state of knowledge about the test to another. Theories of test validity can (a) focus on the beginning of the process, (b) focus on the end, or (c) encompass the entire process. Analyses of four case studies test and illustrate three claims: (a) restrictions on validity entail a supplement…
The microcomputer scientific software series 1: the numerical information manipulation system.
Harold M. Rauscher
1983-01-01
The Numerical Information Manipulation System extends the versatility provided by word processing systems for textual data manipulation to mathematical or statistical data in numeric matrix form. Numeric data, stored and processed in the matrix form, may be manipulated in a wide variety of ways. The system allows operations on single elements, entire rows, or columns...
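The element, row, and column operations the system provides can be sketched in a few lines. This is a conceptual illustration of matrix-form manipulation, not code from the original 1983 software.

```python
def scale_row(matrix, i, factor):
    """Scale an entire row of a numeric matrix (list of lists) in place."""
    matrix[i] = [factor * x for x in matrix[i]]
    return matrix

def column(matrix, j):
    """Extract an entire column as a list."""
    return [row[j] for row in matrix]

def set_element(matrix, i, j, value):
    """Operate on a single element."""
    matrix[i][j] = value
    return matrix
```

Storing numeric data in this matrix form is what lets one command act uniformly on a single element, a whole row, or a whole column.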
Reforming the University Sector: Effects on Teaching Efficiency--Evidence from Italy
ERIC Educational Resources Information Center
Agasisti, Tommaso; Dal Bianco, Antonio
2009-01-01
In this article, we analyse the effects of teaching reforms in Italy. These were introduced in 1999, and changed the entire organization of university courses, where the Bachelor-Master (BA-MA) structure was adopted. The first step is to define the production process of higher education (HE). This process consists of several inputs (professors,…
Green Schools as High Performance Learning Facilities
ERIC Educational Resources Information Center
Gordon, Douglas E.
2010-01-01
In practice, a green school is the physical result of a consensus process of planning, design, and construction that takes into account a building's performance over its entire 50- to 60-year life cycle. The main focus of the process is to reinforce optimal learning, a goal very much in keeping with the parallel goals of resource efficiency and…
STUDIES ON A-AVITAMINOSIS IN CHICKENS
Seifried, Oskar
1930-01-01
1. The principal tissue changes in the respiratory tract of chickens caused by a vitamin A deficiency in the food are, first, an atrophy and degeneration of the lining mucous membrane epithelium as well as of the epithelium of the mucous membrane glands. This process is followed or accompanied by a replacement or substitution of the degenerating original epithelium of these parts by a squamous stratified keratinizing epithelium. This newly formed epithelium develops from the primitive columnar epithelium and divides and grows very rapidly. The process appears to be one of substitution rather than a metaplasia, and resembles the normal keratinization of the skin or even more closely the incomplete keratinization of the mucous membranes (e.g., the esophagus or certain parts of the tongue of chickens). In this connection findings have been described which not only afford an interesting insight into the complicated mechanism of keratinization, but also show probable relations between keratinization and the development of Guarnieri's inclusion bodies. Balloon and reticular degeneration of the upper layers of the new stratified epithelium has been frequently observed. All parts of the respiratory tract are about equally involved in the process; and the olfactory region as well, so that the sense of smell may be lost. The lesions, which first take place on the surface epithelium and then in the glands, show only minor differences. 2. The protective mechanism inherent in the mucous membranes of the entire respiratory tract is seriously damaged or even entirely destroyed by the degeneration of the ciliated cells at the surface and the lack of secretion with bactericidal properties. Secondary infections are frequently found, and nasal discharge and various kinds of inflammatory processes are common, including purulent ones, especially in the upper respiratory tract, communicating sinuses, eyes and trachea.
The development of the characteristic histological process is not dependent upon the presence of these infections, since it also takes place in the absence of infection. 3. The specific histological lesions make it possible to differentiate between A-avitaminosis and some infectious diseases of the respiratory tract. These studies we hope will serve as a basis for further investigations on the relationship between A-avitaminosis and infection in general. PMID:19869784
Hospital cost structure in the USA: what's behind the costs? A business case.
Chandra, Charu; Kumar, Sameer; Ghildayal, Neha S
2011-01-01
Hospital costs in the USA are a large part of the national GDP. Medical billing and supplies processes are significant and growing contributors to hospital operations costs in the USA. This article aims to identify cost drivers associated with these processes and to suggest improvements to reduce hospital costs. A Monte Carlo simulation model that uses @Risk software facilitates cost analysis and captures variability associated with the medical billing process (administrative) and medical supplies process (variable). The model produces estimated savings for implementing new processes. Significant waste exists across the entire medical supply process that needs to be eliminated. Annual savings, by implementing the improved process, have the potential to save several billion dollars annually in US hospitals. The other analysis in this study is related to hospital billing processes. Increased spending on hospital billing processes is not entirely due to hospital inefficiency. The study lacks concrete data for accurately measuring cost savings, but there is obviously room for improvement in the two US healthcare processes. This article only looks at two specific costs associated with medical supply and medical billing processes, respectively. This study facilitates awareness of escalating US hospital expenditures. Cost categories, namely, fixed, variable and administrative, are presented to identify the greatest areas for improvement. The study will be valuable to US Congress policy makers and US healthcare industry decision makers. Medical billing process, part of a hospital's administrative costs, and hospital supplies management processes are part of variable costs. These are the two major cost drivers of US hospitals' expenditures that were examined and analyzed.
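The study's Monte Carlo approach can be sketched without the @Risk software: draw uncertain per-claim costs before and after a process improvement, multiply by an uncertain annual claim volume, and average the resulting savings. All distributions and dollar figures below are illustrative assumptions, not the authors' model parameters.

```python
import random

def simulate_billing_savings(n_trials=10_000, seed=1):
    """Hypothetical Monte Carlo sketch of billing-process savings.

    Per-claim costs follow triangular distributions (low, high, mode) and
    annual claim volume is uniform; every number here is an assumption
    made for illustration only.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trials):
        cost_before = rng.triangular(6.0, 14.0, 9.0)   # $ per claim, assumed
        cost_after = rng.triangular(4.0, 10.0, 6.0)    # $ per claim, assumed
        claims = rng.uniform(80_000, 120_000)          # claims/year, assumed
        total += (cost_before - cost_after) * claims
    return total / n_trials
```

Running many trials turns the per-claim cost uncertainty into a distribution of annual savings, which is the kind of output the article uses to argue the business case.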
Waldau, Susanne; Lindholm, Lars; Wiechel, Anna Helena
2010-08-01
In the Västerbotten County Council in Sweden a priority setting process was undertaken to reallocate existing resources for funding of new methods and activities. Resources were created by limiting low priority services. A procedure for priority setting was constructed and fully tested by engaging the entire organisation. The procedure included priority setting within and between departments and political decision making. Participants' views and experiences were collected as a basis for future improvement of the process. Results indicate that participants appreciated the overall approach and methodology and wished to engage in their improvement. Among the improvement proposals is prolongation of the process in order to improve the knowledge base quality. The procedure for identification of new items for funding also needs to be revised. The priority setting process was considered an overall success because it fulfilled its political goals. Factors considered crucial for success are a wish among managers for an economic strategy that addresses existing internal resource allocation; process management characterized by goal orientation and clear leadership; an elaborate communications strategy integrated early in the process and its management; political unity in support of the procedure, and a strong political commitment throughout the process. Generalizability has already been demonstrated by several health care organisations that performed processes founded on this working model. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
Staels, Eva; Van den Broeck, Wim
2017-05-01
Recently, a general implicit sequence learning deficit was proposed as an underlying cause of dyslexia. This new hypothesis was investigated in the present study by including a number of methodological improvements, for example, the inclusion of appropriate control conditions. The second goal of the study was to explore the role of attentional functioning in implicit and explicit learning tasks. In a 2 × 2 within-subjects design 4 tasks were administered in 30 dyslexic and 38 control children: an implicit and explicit serial reaction time (RT) task and an implicit and explicit contextual cueing task. A measure of attentional functioning was also administered. The entire learning curves of all tasks were analyzed using latent growth curve modeling in order to compare performances between groups and to examine the role of attentional functioning on the learning curves. The amount of implicit learning was similar for both groups. However, the dyslexic group showed slower RTs throughout the entire task. This group difference reduced and became nonsignificant after controlling for attentional functioning. Both implicit learning tasks, but none of the explicit learning tasks, were significantly affected by attentional functioning. Dyslexic children do not suffer from a specific implicit sequence learning deficit. The slower RTs of the dyslexic children throughout the entire implicit sequence learning process are caused by their comorbid attention problems and overall slowness. A key finding of the present study is that, in contrast to what was assumed for a long time, implicit learning relies on attentional resources, perhaps even more than explicit learning does. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
NASA Astrophysics Data System (ADS)
Lin, W.; Noormets, A.; domec, J.; King, J. S.; Sun, G.; McNulty, S.
2012-12-01
Wood stable isotope ratios (δ13C and δ18O) offer insight to water source and plant water use efficiency (WUE), which in turn provide a glimpse to potential plant responses to changing climate, particularly rainfall patterns. The synthetic pathways of cell wall deposition in wood rings differ in their discrimination ratios between the light and heavy isotopes, and α-cellulose is broadly seen as the best indicator of plant water status due to its local and temporal fixation and to its high abundance within the wood. To use the effects of recent severe droughts on the WUE of loblolly pine (Pinus taeda) throughout Southeastern USA as a harbinger of future changes, an effort has been undertaken to sample the entire range of the species and to sample the isotopic composition in a consistent manner. To be able to accommodate the large number of samples required by this analysis, we have developed a new high-throughput method for α-cellulose extraction, which is the rate-limiting step in such an endeavor. Although an entire family of methods has been developed, and they perform well, their throughput in a typical research lab setting is limited to 16-75 samples per week with intensive labor input. The resin exclusion step in conifers is particularly time-consuming. We have combined recent advances in α-cellulose extraction from plant ecology and wood science, including a high-throughput extraction device developed in the Potsdam Dendro Lab and a simple chemical-based resin exclusion method. Transferring the entire extraction process to a multiport-based system allows throughputs of up to several hundred samples in two weeks, while minimizing labor requirements to 2-3 days per batch of samples.
3D Airborne Electromagnetic Inversion: A case study from the Musgrave Region, South Australia
NASA Astrophysics Data System (ADS)
Cox, L. H.; Wilson, G. A.; Zhdanov, M. S.; Sunwall, D. A.
2012-12-01
Geophysicists know and accept that geology is inherently 3D, and results from complex, overlapping processes related to genesis, metamorphism, deformation, alteration, weathering, and/or hydrogeology. Yet, the geophysics community has long relied on qualitative analysis, conductivity depth imaging (CDIs), 1D inversion, and/or plate modeling. There are many reasons for this deficiency, not the least of which has been the lack of capacity for historic 3D AEM inversion algorithms to invert entire surveys so as to practically affect exploration decisions. Our recent introduction of a moving sensitivity domain (footprint) methodology has been a paradigm shift in AEM interpretation. The basis of this method is that one needs only to calculate the responses and sensitivities for that part of the 3D earth model that is within the AEM system's sensitivity domain (footprint), and then superimpose all sensitivity domains into a single, sparse sensitivity matrix for the entire 3D earth model, which is then updated in a regularized inversion scheme. This has made it practical to rigorously invert entire surveys with thousands of line kilometers of AEM data to mega-cell 3D models in hours using multi-processor workstations. Since 2010, over eighty individual projects have been completed for Aerodat, AEROTEM, DIGHEM, GEOTEM, HELITEM, HoisTEM, MEGATEM, RepTEM, RESOLVE, SkyTEM, SPECTREM, TEMPEST, and VTEM data from Australia, Brazil, Canada, Finland, Ghana, Peru, Tanzania, the US, and Zambia. Examples of 3D AEM inversion have been published for a variety of applications, including mineral exploration, oil sands exploration, salinity, permafrost, and bathymetry mapping. In this paper, we present a comparison of 3D inversions for SkyTEM, SPECTREM, TEMPEST and VTEM data acquired over the same area in the Musgrave region of South Australia for exploration under cover.
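The footprint superposition at the heart of the method can be sketched abstractly: each sounding contributes sensitivities only for the model cells inside its footprint, and the local rows are superimposed into one global sparse matrix. This is a conceptual illustration (a dict-of-dicts standing in for a real sparse format), not the authors' implementation.

```python
def assemble_footprint_sensitivity(footprints, n_cells):
    """Superimpose per-sounding footprint sensitivities into a global
    sparse matrix.

    footprints: dict sounding id -> {global cell index: sensitivity},
                each local map covering only cells inside that footprint
    n_cells:    total cells in the 3D earth model
    """
    rows = {}
    for sounding, local in footprints.items():
        rows[sounding] = {c: s for c, s in local.items() if 0 <= c < n_cells}
    return rows

def stored_fraction(rows, n_cells):
    """Fraction of matrix entries actually stored; small footprints keep
    this tiny even for mega-cell models."""
    stored = sum(len(r) for r in rows.values())
    return stored / (len(rows) * n_cells)
```

Because each row touches only its footprint's cells, the storage and update cost scale with footprint size rather than with the full model, which is what makes whole-survey inversion tractable.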
Incorporation of Fiber Bragg Sensors for Shape Memory Polyurethanes Characterization.
Alberto, Nélia; Fonseca, Maria A; Neto, Victor; Nogueira, Rogério; Oliveira, Mónica; Moreira, Rui
2017-11-11
Shape memory polyurethanes (SMPUs) are thermally activated shape memory materials, which can be used as actuators or sensors in applications including aerospace, aeronautics, automobiles or the biomedical industry. The accurate characterization of the memory effect of these materials is therefore mandatory for the technology's success. The shape memory characterization is normally accomplished using mechanical testing coupled with a heat source, where a detailed knowledge of the heat cycle and its influence on the material properties is paramount but difficult to monitor. In this work, fiber Bragg grating (FBG) sensors were embedded into SMPU samples aiming to study and characterize its shape memory effect. The samples were obtained by injection molding, and the entire processing cycle was successfully monitored, providing a process global quality signature. Moreover, the integrity and functionality of the FBG sensors were maintained during and after the embedding process, demonstrating the feasibility of the technology chosen for the purpose envisaged. The results of the shape memory effect characterization demonstrate a good correlation between the reflected FBG peak with the temperature and induced strain, proving that this technology is suitable for this particular application.
The control of voluntary eye movements: new perspectives.
Krauzlis, Richard J
2005-04-01
Primates use two types of voluntary eye movements to track objects of interest: pursuit and saccades. Traditionally, these two eye movements have been viewed as distinct systems that are driven automatically by low-level visual inputs. However, two sets of findings argue for a new perspective on the control of voluntary eye movements. First, recent experiments have shown that pursuit and saccades are not controlled by entirely different neural pathways but are controlled by similar networks of cortical and subcortical regions and, in some cases, by the same neurons. Second, pursuit and saccades are not automatic responses to retinal inputs but are regulated by a process of target selection that involves a basic form of decision making. The selection process itself is guided by a variety of complex processes, including attention, perception, memory, and expectation. Together, these findings indicate that pursuit and saccades share a similar functional architecture. These points of similarity may hold the key for understanding how neural circuits negotiate the links between the many higher order functions that can influence behavior and the singular and coordinated motor actions that follow.
Development of an automated ammunition processing system for battlefield use
DOE Office of Scientific and Technical Information (OSTI.GOV)
Speaks, D.M.; Chesser, J.B.; Lloyd, P.D.
1995-03-01
The Future Armored Resupply Vehicle (FARV) will be the companion ammunition resupply vehicle to the Advanced Field Artillery System (AFAS). These systems are currently being investigated by the US Army for future acquisition. The FARV will sustain the AFAS with ammunition and fuel and will significantly increase capabilities over current resupply vehicles. Currently ammunition is transferred to field artillery almost entirely by hand. The level of automation to be included in the FARV is still under consideration. At the request of the US Army's Project Manager, AFAS/FARV, Oak Ridge National Laboratory (ORNL) identified and evaluated various concepts for the automated upload, processing, storage, and delivery equipment for the FARV. ORNL, working with the sponsor, established basic requirements and assumptions for concept development and the methodology for concept selection. A preliminary concept has been selected, and the associated critical technologies have been identified. ORNL has provided technology demonstrations of many of these critical technologies. A technology demonstrator which incorporates all individual components into a total process demonstration is planned for late FY 1995.
Evaluating All-Metal Valves for Use in a Tritium Environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houk, L.; Payton, A.
In the tritium gas processing system, it is desired to minimize polymer components due to their degradation from tritium exposure (beta decay). One source of polymers in the tritium process is valve components. A vendor has been identified that manufactures a valve marketed as being of all-metal construction. This manufacturer, Ham-Let Group, manufactures a diaphragm valve (3LE series) that is claimed to be made entirely of metal. SRNL procured twelve (12) Ham-Let diaphragm valves for characterization and evaluation. The characterization tests include identification of the maximum pressure of these valves by performing pressure and burst tests. Leak tests were performed to ensure the valves do not exceed the acceptable leak rate for tritium service. These valves were then cycled in a nitrogen gas and/or vacuum environment to ensure they would be durable in a process environment. They were subsequently leak tested per ASTM protocol to ensure that the valves maintained their leak-tight integrity. A detailed material analysis was also conducted to determine hydrogen and tritium compatibility.
Incorporation of Fiber Bragg Sensors for Shape Memory Polyurethanes Characterization
Nogueira, Rogério; Moreira, Rui
2017-01-01
Shape memory polyurethanes (SMPUs) are thermally activated shape memory materials, which can be used as actuators or sensors in applications including aerospace, aeronautics, automobiles or the biomedical industry. The accurate characterization of the memory effect of these materials is therefore mandatory for the technology’s success. The shape memory characterization is normally accomplished using mechanical testing coupled with a heat source, where a detailed knowledge of the heat cycle and its influence on the material properties is paramount but difficult to monitor. In this work, fiber Bragg grating (FBG) sensors were embedded into SMPU samples to study and characterize their shape memory effect. The samples were obtained by injection molding, and the entire processing cycle was successfully monitored, providing a global quality signature of the process. Moreover, the integrity and functionality of the FBG sensors were maintained during and after the embedding process, demonstrating the feasibility of the technology chosen for the purpose envisaged. The results of the shape memory effect characterization demonstrate a good correlation between the reflected FBG peak, the temperature, and the induced strain, proving that this technology is suitable for this particular application. PMID:29137136
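The correlation between the reflected Bragg peak, temperature, and strain noted above follows the standard FBG response model. A minimal sketch in Python, using typical textbook constants for silica fiber (the paper's actual calibration values are not given here):

```python
# Standard FBG response model (textbook form, not taken from this paper):
#   d(lambda)/lambda = (1 - p_e) * strain + (alpha + xi) * dT
P_E = 0.22       # effective photo-elastic coefficient (typical silica fiber)
ALPHA = 0.55e-6  # thermal-expansion coefficient of silica [1/K]
XI = 6.7e-6      # thermo-optic coefficient [1/K]

def bragg_shift_nm(wl_nm, strain, dT):
    """Bragg wavelength shift (nm) for a grating centred at wl_nm
    under a given axial strain and temperature change dT (K)."""
    return wl_nm * ((1 - P_E) * strain + (ALPHA + XI) * dT)

# e.g. a 1550 nm grating heated by 50 K with no applied strain:
print(round(bragg_shift_nm(1550.0, 0.0, 50.0), 3))  # → 0.562
```

With these constants the thermal sensitivity comes out near 11 pm/K at 1550 nm, which is why the FBG peak tracks the heat cycle so directly.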
The effects of anandamide and oleamide on cognition depend on diurnal variations.
Rueda-Orozco, Pavel E; Montes-Rodriguez, Corinne J; Ruiz-Contreras, Alejandra E; Mendez-Diaz, Monica; Prospero-Garcia, Oscar
2017-10-01
Cannabinergic receptor 1 (CB1r) is highly expressed in almost the entire brain; hence, its activation affects diverse functions, including cognitive processes such as learning and memory. On the other hand, it has been demonstrated that CB1r expression fluctuates along the light-dark cycle. In this context, the objective of this work was to characterize the cannabinergic influence over cognitive processes and its relationship with the light-dark cycle. To this aim we studied the effects of two endogenous cannabinoids, anandamide (AEA) and oleamide (ODA), on the consolidation of memory and event-related potentials (ERPs) depending on the light-dark cycle. Our results indicate that AEA and ODA impair the consolidation of spatial and emotional memories and reduce the amplitude of several components of the ERP complex, depending on the phase of the light-dark cycle. This study further supports the notion that endocannabinoids participate in the regulation of cognitive processes with strong influence of environmental variables such as the light-dark cycle. Copyright © 2017 Elsevier B.V. All rights reserved.
Fiber-Optic Surface Temperature Sensor Based on Modal Interference.
Musin, Frédéric; Mégret, Patrice; Wuilpart, Marc
2016-07-28
Spatially-integrated surface temperature sensing is highly useful when it comes to controlling processes, detecting hazardous conditions or monitoring the health and safety of equipment and people. Fiber-optic sensing based on modal interference has shown great sensitivity to temperature variation, by means of cost-effective image processing of few-mode interference patterns. New developments in the field of sensor configuration, as described in this paper, include an innovative cooling and heating phase discrimination functionality and more precise measurements, based entirely on the image processing of interference patterns. The proposed technique was applied to the measurement of the integrated surface temperature of a hollow cylinder and compared with a conventional measurement system, consisting of an infrared camera and a precision temperature probe. As a result, the optical technique agrees with the reference system. Compared with conventional surface temperature probes, the optical technique has the following advantages: low heat-capacity temperature measurement errors, easier spatial deployment, replacement of multi-angle infrared camera shooting, and continuous monitoring of surfaces that are not visually accessible.
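The measurement principle rests on comparing few-mode interference (speckle-like) patterns as temperature changes. One common image-processing metric for such comparisons is zero-mean normalized cross-correlation; the sketch below is illustrative only (the synthetic fringe pattern and the choice of metric are assumptions, not the authors' algorithm):

```python
import numpy as np

def zncc(ref, img):
    """Zero-mean normalized cross-correlation between two
    equal-sized interference-pattern images (2-D arrays)."""
    a = ref - ref.mean()
    b = img - img.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom)

# Toy example: a phase-shifted fringe pattern decorrelates from the
# reference, mimicking the effect of a temperature change.
x = np.linspace(0, 8 * np.pi, 200)
ref = np.outer(np.ones(50), np.sin(x))
warm = np.outer(np.ones(50), np.sin(x + 0.5))  # 0.5 rad phase shift
print(round(zncc(ref, ref), 6))   # → 1.0
print(round(zncc(ref, warm), 3))  # < 1: pattern has decorrelated
```

A calibrated sensor would map such correlation changes (or tracked fringe motion) back to an integrated surface temperature.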
Organ transplantation and magical thinking.
Vamos, Marina
2010-10-01
Organ transplantation can provide important treatment benefits in a variety of situations. While a number of live donor procedures are now possible, procurement of organs from dead donors remains the mainstay of transplant programmes. However, cadaveric donation rates remain much lower than anticipated, and some patients who receive organs struggle to adapt to their new body. The reasons for this are not entirely explained by rational or logical means. This paper uses concepts drawn from magical thinking to try to explain some of the less apparent issues at play within the process of cadaveric organ transplantation, including both the donation and receiving of organs. Three themes are explored as potentially relevant: superstitions and rituals around death and the dead body, incorporation and the meanings attached to the transplanted organ, and survivor guilt. All three are shown to be relevant for some part of the transplantation process in at least a minority of cases. It is therefore suggested that focusing not only on the logical and scientific, but also on the ambiguous and magical may enhance the organ donation process and thus increase donation rates and the psychological adjustment of transplant recipients.
40 CFR 426.115 - Standards of performance for new sources.
Code of Federal Regulations, 2010 CFR
2010-07-01
... (the fluoride and lead limitations are applicable to the abrasive polishing and acid polishing waste water streams while the TSS, oil, and pH limitations are applicable to the entire process waste water...
Fink, Paul; Skeen, James
2007-01-01
The decision to outsource the manufacture of an entire device can be difficult, yet the advantages of this business strategy are huge. The important elements of the process are examined here so that companies can adopt this approach with confidence.
Extending the Marine Microcosm Laboratory
ERIC Educational Resources Information Center
Ryswyk, Hal Van; Hall, Eric W.; Petesch, Steven J.; Wiedeman, Alice E.
2007-01-01
The traditional range of marine microcosm laboratory experiments is presented as an ideal environment to teach the entire analysis process. The microcosm lab provides a student-centered approach with opportunities for collaborative learning and for developing critical communication skills.
Applications of Geomatics in Surface Mining
NASA Astrophysics Data System (ADS)
Blachowski, Jan; Górniak-Zimroz, Justyna; Milczarek, Wojciech; Pactwa, Katarzyna
2017-12-01
In terms of the method of extracting mineral from a deposit, mining can be classified into surface, underground, and borehole mining. Surface mining is a form of mining in which the soil and the rock covering the mineral deposits are removed. Types of surface mining mainly include strip and open-cast methods, as well as quarrying. Tasks associated with surface mining of minerals include: resource estimation and deposit documentation, mine planning and deposit access, mine plant development, extraction of minerals from deposits, mineral and waste processing, and reclamation of former mining grounds. At each stage of mining, geodata describing changes occurring in space during the entire life cycle of a surface mining project should be taken into consideration, i.e. collected, analysed, processed, examined, and distributed. These data result from direct (e.g. geodetic) and indirect (i.e. remote or relative) measurements and observations, including airborne and satellite methods, geotechnical, geological and hydrogeological data, and data from other types of sensors, e.g. located on mining equipment and infrastructure, mine plans and maps. Management of such vast sources and sets of geodata, as well as of the information resulting from processing, integrated analysis and examination of such data, can be facilitated with geomatic solutions. Geomatics is a discipline of gathering, processing, interpreting, storing and delivering spatially referenced information. Thus, geomatics integrates methods and technologies used for collecting, managing, processing, visualizing and distributing spatial data. In other words, its meaning covers practically every method and tool from spatial data acquisition to distribution. In this work, examples of the application of geomatic solutions in surface mining are presented on representative case studies at various stages of mine operation.
These applications include: prospecting and documenting mineral deposits, assessment of land accessibility for a potential large-scale surface mining project, modelling mineral deposit (granite) management, concept of a system for management of conveyor belt network technical condition, project of a geoinformation system of former mining terrains and objects, and monitoring and control of impact of surface mining on mine surroundings with satellite radar interferometry.
The fabrication of a multi-spectral lens array and its application in assisting color blindness
NASA Astrophysics Data System (ADS)
Di, Si; Jin, Jian; Tang, Guanrong; Chen, Xianshuai; Du, Ruxu
2016-01-01
This article presents a compact multi-spectral lens array and describes its application in assisting color blindness. The lens array consists of nine microlenses, each coated with a different color filter. Thus, it can capture different light bands, including red, orange, yellow, green, cyan, blue, violet, near-infrared, and the entire visible band. First, the fabrication process is described in detail. Second, an imaging system is set up and a color-blindness testing card is selected as the sample. With this system, the vision results of normal people and of people with color blindness can be captured simultaneously. Based on the imaging results, the system can potentially be used to help people with color blindness approximate normal color vision.
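Since each of the nine microlenses forms its own sub-image on a shared sensor, the raw frame can be split into per-band views before further processing. A minimal sketch, assuming a simple 3x3 tiling (the actual optical geometry and band ordering are not specified in the abstract):

```python
import numpy as np

def split_channels(mosaic, rows=3, cols=3):
    """Split a mosaic frame from a rows x cols lens array into
    per-band sub-images, keyed by (row, col) tile position.
    The 3x3 layout is an assumption for illustration."""
    h, w = mosaic.shape
    sh, sw = h // rows, w // cols
    return {(r, c): mosaic[r * sh:(r + 1) * sh, c * sw:(c + 1) * sw]
            for r in range(rows) for c in range(cols)}

frame = np.arange(36).reshape(6, 6)  # toy 6x6 "sensor" frame
bands = split_channels(frame)
print(len(bands))              # → 9
print(bands[(0, 0)].shape)     # → (2, 2)
```

Each extracted tile would then correspond to one filtered band (e.g. red, near-infrared), which downstream processing can recombine for the assistive display.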
Polar process and world climate /A brief overview/
NASA Technical Reports Server (NTRS)
Goody, R.
1980-01-01
A review is presented of events relating polar regions to the world climate, of the mechanisms of sea ice and polar ice sheets, and of two theories of the Pleistocene Ice Ages. The sea ice, which varies over time scales of one or two years, and the polar ice sheets, with time changes measured in tens or hundreds of thousands of years, introduce two distinct time constants into global climate change: the yearly Arctic sea ice variations affect northern Europe and have some effect over the entire Northern Hemisphere, while the ice-albedo coupling in the polar ice sheets is involved in major climatic events such as the Pleistocene ice ages. It is concluded that climate problems require a global approach including the atmosphere, the oceans, and the cryosphere.
Building bridges, breaking barriers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gonzalez, R.G.
1996-05-01
Communication is one of the most important factors in determining the success of any enterprise. It is certain to rise to the forefront for the international refining and petrochemical industries as companies seek continuous improvement in their operations. Most industry participants have recently undergone a wave of business process re-engineering, integrating new people with new practices in more areas than ever before. Companies have broken down the old functional organizations within the plant in favor of asset teams, placed renewed emphasis on optimizing profits across the entire downstream value chain and extended their global businesses to include joint venture partners. The goal is to empower people to make better decisions in a more timely manner to positively impact performance.
The economics of new drugs: can we afford to make progress in a common disease?
Hirsch, Bradford R; Schulman, Kevin A
2013-01-01
The concept of personalized medicine is beginning to come to fruition, but the cost of drug development is untenable today. To identify new initiatives that would support a more sustainable business model, the economics of drug development are analyzed, including the cost of drug development, cost of capital, target market size, returns to innovators at the product and firm levels, and, finally, product pricing. We argue that a quick fix is not available. Instead, a rethinking of the entire pharmaceutical development process is needed from the way that clinical trials are conducted, to the role of biomarkers in segmenting markets, to the use of grant support, and conditional approval to decrease the cost of capital. In aggregate, the opportunities abound.
Importance of databases of nucleic acids for bioinformatic analysis focused to genomics
NASA Astrophysics Data System (ADS)
Jimenez-Gutierrez, L. R.; Barrios-Hernández, C. J.; Pedraza-Ferreira, G. R.; Vera-Cala, L.; Martinez-Perez, F.
2016-08-01
Recently, bioinformatics has become a new field of science, indispensable in the analysis of the millions of nucleic acid sequences currently deposited in international databases (public or private); these databases contain information on genes, RNA, ORFs, proteins, and intergenic regions, including entire genomes of some species. The analysis of this information requires computer programs, which have been renewed through new mathematical methods and the introduction of artificial intelligence, along with the constant creation of supercomputing facilities capable of withstanding the heavy workload of sequence analysis. However, innovation is still needed in platforms that allow faster and more effective genomic analyses, with a technological understanding of all biological processes.
Getting the Bigger Picture With Digital Surveillance
NASA Technical Reports Server (NTRS)
2002-01-01
Through a Space Act Agreement, Diebold, Inc., acquired the exclusive rights to Glenn Research Center's patented video observation technology, originally designed to accelerate video image analysis for various ongoing and future space applications. Diebold implemented the technology into its AccuTrack digital, color video recorder, a state-of-the-art surveillance product that uses motion detection for around-the-clock monitoring. AccuTrack captures digitally signed images and transaction data in real-time. This process replaces the onerous tasks involved in operating a VCR-based surveillance system, and subsequently eliminates the need for central viewing and tape archiving locations altogether. AccuTrack can monitor an entire bank facility, including four automated teller machines, multiple teller lines, and new account areas, all from one central location.
Planning the data transition of a VLDB: a case study
NASA Astrophysics Data System (ADS)
Finken, Shirley J.
1997-02-01
This paper describes the technical and programmatic plans for moving and checking certain data from the IDentification Automated Services (IDAS) system to the new Interstate Identification Index/Federal Bureau of Investigation (III/FBI) Segment database, one of the three components of the Integrated Automated Fingerprint Identification System (IAFIS) being developed by the Federal Bureau of Investigation, Criminal Justice Information Services Division. Transitioning IDAS to III/FBI includes putting the data into an entirely new target database structure (i.e. from IBM VSAM files to ORACLE7 RDBMS tables). Only four IDAS files were transitioned (CCN, CCR, CCA, and CRS), but their total size is estimated at 500 GB. Transitioning of this Very Large Database is planned as two processes.
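The core of such a transition is parsing flat-file (VSAM-style) records and loading them into relational tables with basic row-count verification. A minimal sketch, with a hypothetical fixed-width field layout and SQLite standing in for the ORACLE7 RDBMS (the real IDAS record formats and table schemas are not given in this abstract):

```python
import sqlite3

# Hypothetical fixed-width record layout for one legacy file;
# field names and offsets are invented for illustration.
FIELDS = [("subject_id", 0, 9), ("name", 9, 39), ("dob", 39, 47)]

def parse_record(line):
    """Slice one fixed-width record into a tuple of stripped fields."""
    return tuple(line[a:b].strip() for _, a, b in FIELDS)

def migrate(lines, conn):
    """Load parsed records into a relational table and verify the
    loaded row count matches the input -- a basic transition check."""
    conn.execute("CREATE TABLE ccn (subject_id TEXT, name TEXT, dob TEXT)")
    rows = [parse_record(ln) for ln in lines]
    conn.executemany("INSERT INTO ccn VALUES (?, ?, ?)", rows)
    conn.commit()
    (n,) = conn.execute("SELECT COUNT(*) FROM ccn").fetchone()
    assert n == len(rows), "row-count mismatch after load"
    return n

legacy = ["000000001" + "JOHN Q PUBLIC".ljust(30) + "19610403",
          "000000002" + "JANE R CITIZEN".ljust(30) + "19751122"]
print(migrate(legacy, sqlite3.connect(":memory:")))  # → 2
```

At 500 GB the real transition would be batched and checkpointed, but the parse-load-verify shape of each pass stays the same.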
BOREAS AFM-5 Level-2 Upper Air Network Standard Pressure Level Data
NASA Technical Reports Server (NTRS)
Barr, Alan; Hrynkiw, Charmaine; Hall, Forrest G. (Editor); Newcomer, Jeffrey A. (Editor); Smith, David E. (Technical Monitor)
2000-01-01
The BOREAS AFM-5 team collected and processed data from the numerous radiosonde flights during the project. The goals of the AFM-05 team were to provide large-scale definition of the atmosphere by supplementing the existing AES aerological network, both temporally and spatially. This data set includes basic upper-air parameters interpolated at 0.5 kilopascal (kPa) increments of atmospheric pressure from data collected from the network of upper-air stations during the 1993, 1994, and 1996 field campaigns over the entire study region. The data are contained in tabular ASCII files. The data files are available on a CD-ROM (see document number 20010000884) or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).
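Regridding a sounding onto regular 0.5 kPa pressure levels can be sketched as follows. Linear interpolation in log-pressure is a common convention for upper-air data and is an assumption here; the AFM-5 team's exact scheme is not described in this abstract:

```python
import numpy as np

def to_standard_levels(p_obs, t_obs, step_kpa=0.5):
    """Interpolate a sounding (pressure in kPa, temperature in deg C)
    onto regular pressure levels spaced step_kpa apart, within the
    observed pressure range. Linear in log-pressure (assumed)."""
    p_obs = np.asarray(p_obs, float)
    t_obs = np.asarray(t_obs, float)
    lo = np.ceil(p_obs.min() / step_kpa) * step_kpa
    hi = np.floor(p_obs.max() / step_kpa) * step_kpa
    p_std = np.arange(lo, hi + step_kpa / 2, step_kpa)
    # np.interp needs increasing x, so sort the sounding by pressure.
    order = np.argsort(p_obs)
    t_std = np.interp(np.log(p_std), np.log(p_obs[order]), t_obs[order])
    return p_std, t_std

# Toy four-level sounding from the surface (~101.3 kPa) up to 50 kPa:
p, t = to_standard_levels([101.3, 85.0, 70.0, 50.0],
                          [15.0, 5.0, -5.0, -20.0])
print(len(p))  # → 103 levels between 50.0 and 101.0 kPa
```

The same interpolation would be applied per parameter (temperature, humidity, winds) to produce the tabular standard-pressure-level files.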
Integration of Metal Oxide Nanowires in Flexible Gas Sensing Devices
Comini, Elisabetta
2013-01-01
Metal oxide nanowires are very promising active materials for different applications, especially in the field of gas sensors. Advances in fabrication technologies now allow the preparation of nanowires on flexible substrates, expanding the potential market of the resulting sensors. The critical steps for the large-scale preparation of reliable sensing devices are the elimination of high-temperature processes and the stretchability of the entire final device, including the active material. Direct growth on flexible substrates and post-growth procedures have been successfully used for the preparation of gas sensors. The paper summarizes the procedures used for the preparation of flexible and wearable gas sensor prototypes, with an overview of the challenges and future perspectives in this field. PMID:23955436