Customer-centered careflow modeling based on guidelines.
Huang, Biqing; Zhu, Peng; Wu, Cheng
2012-10-01
In contemporary society, customer-centered health care, which stresses customer participation and long-term tailored care, is becoming an inevitable trend. Compared with the hospital- or physician-centered healthcare process, the customer-centered healthcare process requires more knowledge, and modeling such a process is extremely complex. Thus, building a care process model for an individual customer is cost-prohibitive. In addition, during the execution of a care process model, the information system should be flexible enough to modify the model so that it adapts to changes in the healthcare process. Therefore, supporting the process in a flexible, cost-effective way is a key challenge for information technology. To meet this challenge, first, we analyze the various kinds of knowledge used in process modeling, illustrate their characteristics, and detail their roles and effects in careflow modeling. Second, we propose a methodology to manage the lifecycle of healthcare process modeling, with which models can be built gradually, conveniently, and efficiently. In this lifecycle, different levels of process models are established based on the kinds of knowledge involved, and a diffusion strategy for these process models is designed. Third, the architecture and a prototype of the system supporting the process modeling and its lifecycle are given. This careflow system also considers compatibility with legacy systems and authority problems. Finally, an example is provided to demonstrate implementation of the careflow system.
Wang, Ximing; Liu, Brent J; Martinez, Clarisa; Zhang, Xuejun; Winstein, Carolee J
2015-01-01
Imaging-based clinical trials can benefit from a solution to efficiently collect, analyze, and distribute multimedia data at various stages within the workflow. Currently, the data management needs of these trials are typically addressed with custom-built systems. However, software development of custom-built systems for versatile workflows can be resource-consuming. To address these challenges, we present a system with a workflow engine for imaging-based clinical trials. The system enables a project coordinator to build a data collection and management system specifically related to the study protocol workflow without programming. A Web Access to DICOM Objects (WADO) module with novel features is integrated to further facilitate imaging-related studies. The system was initially evaluated in an imaging-based rehabilitation clinical trial. The evaluation shows that the cost of system development can be much reduced compared with a custom-built system. By providing a solution to customize a system and automate the workflow, the system will save on development time and reduce errors, especially for imaging clinical trials. PMID:25870169
Software Acquisition Improvement in the Aeronautical Systems Center
2008-09-01
software fielded, a variety of different methods were suggested by the interviewees. These included blocks, suites, and other tailored processes developed... Selection of Research Method... DoD look to the commercial market to buy tools, methods, environments, and application software, instead of custom-built software (DSB: 1987). These
Code of Federal Regulations, 2010 CFR
2010-04-01
... 19 Customs Duties 2 2010-04-01 2010-04-01 false Procedure. 191.142 Section 191.142 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) DRAWBACK Foreign-Built Jet Aircraft Engines Processed in the United States § 191.142 Procedure...
Petrakaki, Dimitra; Klecun, Ela
2015-01-01
This paper explores how national Electronic Patient Record (EPR) systems are customized in local settings and, in particular, how the context of their origin plays out with the context of their use. It shows how representations of healthcare organizations and of local clinical practice are built into EPR systems within a complex context whereby different stakeholder groups negotiate to produce an EPR package that aims to meet both local and generic needs. The paper draws from research into the implementation of the National Care Record Service, a part of the National Programme for Information Technology (NPfIT), in the English National Health Service (NHS). The paper makes two arguments. First, customization of national EPR is a distributed process that involves cycles of 'translation', which span across geographical, cultural and professional boundaries. Second, 'translation' is an inherently political process during which hybrid technology gets consolidated. The paper concludes that hybrid technology opens up possibilities for standardization of healthcare. Copyright © 2014 Elsevier Ltd. All rights reserved.
Evaluation of a mass flow sensor at a gin
USDA-ARS?s Scientific Manuscript database
As part of a system to optimize the cotton ginning process, a custom-built mass flow sensor was evaluated at the USDA-ARS Cotton Ginning Research Unit at Stoneville, Mississippi. The mass flow sensor was fabricated based on the principle of the sensor patented by Thomasson and Sui. The optical and ele...
19 CFR 191.144 - Refund of duties.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 19 Customs Duties 2 2010-04-01 2010-04-01 false Refund of duties. 191.144 Section 191.144 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) DRAWBACK Foreign-Built Jet Aircraft Engines Processed in the United States § 191.144 Refund of...
19 CFR 191.143 - Drawback entry.
Code of Federal Regulations, 2010 CFR
2010-04-01
... (CONTINUED) DRAWBACK Foreign-Built Jet Aircraft Engines Processed in the United States § 191.143 Drawback entry. (a) Filing of entry. Drawback entries covering these foreign-built jet aircraft engines shall be filed on Customs Form 7551, modified to show that the entry covers jet aircraft engines processed under...
Evaluation of an experimental mass-flow sensor of cotton-lint at the gin
USDA-ARS?s Scientific Manuscript database
As part of a system to optimize the cotton ginning process, a custom-built mass-flow sensor was evaluated at the USDA-ARS Cotton Ginning Research Unit at Stoneville, Mississippi. The mass-flow sensor was fabricated based on the principle of the sensor patented by Thomasson and Sui (2004). The optical a...
Design and Demonstration of a Miniature Lidar System for Rover Applications
NASA Technical Reports Server (NTRS)
Robinson, Benjamin
2010-01-01
A basic small and portable lidar system for rover applications has been designed. It uses a 20 Hz Nd:YAG pulsed laser, a 4-inch diameter telescope receiver, a custom-built power distribution unit (PDU), and a custom-built 532 nm photomultiplier tube (PMT) to measure the lidar signal. The receiving optics have been designed, but not constructed yet. LabVIEW and MATLAB programs have also been written to control the system, acquire data, and analyze data. The proposed system design, along with some measurements, is described. Future work to be completed is also discussed.
Software Reliability Issues Concerning Large and Safety Critical Software Systems
NASA Technical Reports Server (NTRS)
Kamel, Khaled; Brown, Barbara
1996-01-01
This research was undertaken to provide NASA with a survey of state-of-the-art techniques used in industry and academia to provide safe, reliable, and maintainable software to drive large systems. Such systems must match the complexity and strict safety requirements of NASA's shuttle system. In particular, the Launch Processing System (LPS) is being considered for replacement. The LPS is responsible for monitoring and commanding the shuttle during test, repair, and launch phases. NASA built this system in the 1970s using mostly hardware techniques to provide for increased reliability, but it did so often using custom-built equipment, which has not been able to keep up with current technologies. This report surveys the major techniques used in industry and academia to ensure reliability in large and critical computer systems.
A Simple Rate Law Experiment Using a Custom-Built Isothermal Heat Conduction Calorimeter
ERIC Educational Resources Information Center
Wadso, Lars; Li, Xi.
2008-01-01
Most processes (whether physical, chemical, or biological) produce or consume heat: measuring thermal power (the heat production rate) is therefore a typical method of studying processes. Here we describe the design of a simple isothermal heat conduction calorimeter built for use in teaching; we also provide an example of its use in simultaneously…
NASA Technical Reports Server (NTRS)
Nappier, Jennifer M.; Tokars, Roger P.; Wroblewski, Adam C.
2016-01-01
The Integrated Radio and Optical Communications (iROC) project at the National Aeronautics and Space Administration's (NASA) Glenn Research Center is investigating the feasibility of a hybrid radio frequency (RF) and optical communication system for future deep space missions. As a part of this investigation, a test bed for a radio frequency (RF) and optical software defined radio (SDR) has been built. Receivers and modems for the NASA deep space optical waveform are not commercially available, so a custom ground optical receiver system has been built. This paper documents the ground optical receiver, which is used to test the RF and optical SDR in a free-space optical communications link.
NASA Astrophysics Data System (ADS)
Dachyar, M.; Christy, E.
2014-04-01
To maintain its position as a major milk producer, the Indonesian milk industry should pursue business development aimed at increasing the customer service level. One strategy is to achieve on-time release of finished goods distributed to customers and distributors. To achieve this condition, the management information system supporting on-time release of finished goods needs to be improved. The focus of this research is to conduct business process improvement using Business Process Reengineering (BPR). The key deliverable of this study is a comprehensive business strategy that addresses the root problems. To achieve the goal, evaluation, reengineering, and improvement of the ERP system are conducted. To visualize the predicted implementation, a simulation model is built in Oracle BPM. The output of this simulation showed that the proposed solution could effectively reduce the process lead time and increase the number of quality releases.
NASA Technical Reports Server (NTRS)
1992-01-01
Silver ionization water purification technology was originally developed for Apollo spacecraft. It was later used to cleanse swimming pools and has now been applied to industrial cooling towers and process coolers. Sensible Technologies, Inc. has added two other technologies to the system, which occupies only six square feet. It is manufactured in three capacities, and larger models are custom built on request. The system eliminates scale, corrosion, algae, bacteria and debris, and because of the NASA technology, viruses and waterborne bacteria are also destroyed. Applications include a General Motors cooling tower, amusement parks, ice manufacture and a closed-loop process cooling system.
NASA Technology Transfer System
NASA Technical Reports Server (NTRS)
Tran, Peter B.; Okimura, Takeshi
2017-01-01
NTTS is the IT infrastructure for the Agency's Technology Transfer (T2) program, containing a portfolio of more than 60,000 technologies and supporting all ten NASA field centers and HQ. It is the enterprise IT system for facilitating the Agency's technology transfer process, which includes reporting of new technologies (e.g., technology invention disclosures, NF1679), protecting intellectual property (e.g., patents), and commercializing technologies through various technology licenses, software releases, spinoffs, and success stories, using custom-built workflow, reporting, data consolidation, integration, and search engines.
Key ingredients needed when building large data processing systems for scientists
NASA Technical Reports Server (NTRS)
Miller, K. C.
2002-01-01
Why is building a large science software system so painful? Weren't teams of software engineers supposed to make life easier for scientists? Does it sometimes feel as if it would be easier to write the million lines of code in Fortran 77 yourself? The cause of this dissatisfaction is that many of the needs of the science customer remain hidden in discussions with software engineers until after a system has already been built. In fact, many of the hidden needs of the science customer conflict with stated needs and are therefore very difficult to meet unless they are addressed from the outset in a system's architectural requirements. What's missing is the consideration of a small set of key software properties in initial agreements about the requirements, the design and the cost of the system.
A Multi-Component Automated Laser-Origami System for Cyber-Manufacturing
NASA Astrophysics Data System (ADS)
Ko, Woo-Hyun; Srinivasa, Arun; Kumar, P. R.
2017-12-01
Cyber-manufacturing systems can be enhanced by an integrated network architecture that is easily configurable, reliable, and scalable. We consider a cyber-physical system for an origami-type, laser-based custom manufacturing machine that folds and cuts sheet material to manufacture 3D objects. We have developed such a system for a laser-based autonomous custom manufacturing machine equipped with real-time sensing and control. The basic elements in the architecture are built around the laser processing machine. They include a sensing system to estimate the state of the workpiece, a control system determining control inputs for a laser system based on the estimated data and the user's job requests, a robotic arm manipulating the workpiece in the work space, and middleware, named Etherware, supporting communication among the systems. We demonstrate automated 3D laser cutting and bending to fabricate a 3D product as an experimental result.
Automated Reporting of DXA Studies Using a Custom-Built Computer Program.
England, Joseph R; Colletti, Patrick M
2018-06-01
Dual-energy x-ray absorptiometry (DXA) scans are a critical population health tool and relatively simple to interpret, but they can be time consuming to report, often requiring manual transfer of bone mineral density and associated statistics into commercially available dictation systems. We describe here a custom-built computer program for automated reporting of DXA scans using Pydicom, an open-source package built in the Python computer language, and regular expressions to mine DICOM tags for patient information and bone mineral density statistics. This program, easy to emulate by any novice computer programmer, has doubled our efficiency at reporting DXA scans and has eliminated dictation errors.
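The abstract names Pydicom and regular expressions as the mining tools but gives no implementation detail. The sketch below shows only the regex side on an invented report layout; the field names, layout, and function are assumptions for illustration, not the authors' code.

```python
import re

# Hypothetical DXA report text; the layout is an assumption, not the paper's.
SAMPLE = """Patient Name: DOE^JANE
Region: Lumbar Spine L1-L4  BMD: 0.912 g/cm2  T-score: -1.8  Z-score: -0.6
Region: Femoral Neck        BMD: 0.701 g/cm2  T-score: -2.6  Z-score: -1.1"""

ROW = re.compile(
    r"Region:\s*(?P<region>.+?)\s+BMD:\s*(?P<bmd>[\d.]+)\s*g/cm2"
    r"\s+T-score:\s*(?P<t>-?[\d.]+)\s+Z-score:\s*(?P<z>-?[\d.]+)"
)

def parse_dxa(text):
    """Return one dict per measured region with BMD and T/Z scores."""
    return [
        {"region": m["region"], "bmd": float(m["bmd"]),
         "t_score": float(m["t"]), "z_score": float(m["z"])}
        for m in ROW.finditer(text)
    ]

rows = parse_dxa(SAMPLE)
print(rows[1]["t_score"])  # → -2.6
```

In the real program, the same kind of extraction would run over strings pulled from DICOM tags via Pydicom rather than over a plain-text report.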
PScan 1.0: flexible software framework for polygon based multiphoton microscopy
NASA Astrophysics Data System (ADS)
Li, Yongxiao; Lee, Woei Ming
2016-12-01
Multiphoton laser scanning microscopes exhibit highly localized nonlinear optical excitation and are powerful instruments for in-vivo deep tissue imaging. Customized multiphoton microscopy has significantly superior performance for in-vivo imaging because of precise control over the scanning and detection system. To date, several flexible software platforms have catered to custom-built microscopy systems, i.e., ScanImage, HelioScan, and MicroManager, which perform at imaging speeds of 30-100 fps. In this paper, we describe a flexible software framework for high speed imaging systems capable of operating from 5 fps to 1600 fps. The software is based on the MATLAB image processing toolbox. It has the capability to communicate directly with a high-performance imaging card (Matrox Solios eA/XA), thus retaining high speed acquisition. The program is also designed to communicate with LabVIEW and Fiji for instrument control and image processing. PScan 1.0 can handle high imaging rates and contains sufficient flexibility for users to adapt it to their high speed imaging systems.
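Frameworks of this kind typically decouple high-rate frame acquisition from display and processing so that the acquisition side never stalls. The sketch below illustrates that producer-consumer idea in generic Python; it is not PScan's internals (PScan is MATLAB-based), and the names and frame counts are invented.

```python
import queue
import threading

# Toy stand-in for a high-rate frame grabber: an acquisition thread pushes
# frames into a bounded queue; the processing side drains it independently.
def acquire(frame_queue, n_frames):
    for i in range(n_frames):
        frame_queue.put(f"frame-{i}")
    frame_queue.put(None)  # sentinel: acquisition finished

frames = queue.Queue(maxsize=64)  # bound gives back-pressure, not data loss
t = threading.Thread(target=acquire, args=(frames, 5))
t.start()

received = []
while (item := frames.get()) is not None:
    received.append(item)
t.join()
print(len(received))  # → 5
```

The bounded queue is the key design choice: if processing falls behind, the acquisition thread blocks rather than silently dropping frames.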
Development of a Medical Cyclotron Production Facility
NASA Astrophysics Data System (ADS)
Allen, Danny R.
2003-08-01
Development of a Cyclotron manufacturing facility begins with a business plan. Geographics, the size and activity of the medical community, the growth potential of the modality being served, and other business connections are all considered. This business used the customer base established by NuTech, Inc., an independent centralized nuclear pharmacy founded by Danny Allen. With two pharmacies in operation in Tyler and College Station and a customer base of 47 hospitals and clinics, the existing delivery system and pharmacist staff are used for the cyclotron facility. We then added cyclotron products to contracts with these customers to guarantee a supply. We partnered with a company in the process of developing PET imaging centers. We then built an independent imaging center attached to the cyclotron facility to allow for the use of short-lived isotopes.
Support for User Interfaces for Distributed Systems
NASA Technical Reports Server (NTRS)
Eychaner, Glenn; Niessner, Albert
2005-01-01
An extensible Java(TradeMark) software framework supports the construction and operation of graphical user interfaces (GUIs) for distributed computing systems typified by ground control systems that send commands to, and receive telemetric data from, spacecraft. Heretofore, such GUIs have been custom built for each new system at considerable expense. In contrast, the present framework affords generic capabilities that can be shared by different distributed systems. Dynamic class loading, reflection, and other run-time capabilities of the Java language and JavaBeans component architecture enable the creation of a GUI for each new distributed computing system with a minimum of custom effort. By use of this framework, GUI components in control panels and menus can send commands to a particular distributed system with a minimum of system-specific code. The framework receives, decodes, processes, and displays telemetry data; custom telemetry data handling can be added for a particular system. The framework supports saving and later restoration of users configurations of control panels and telemetry displays with a minimum of effort in writing system-specific code. GUIs constructed within this framework can be deployed in any operating system with a Java run-time environment, without recompilation or code changes.
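The framework's central trick is run-time resolution of GUI components by name, so the shared code needs no compile-time knowledge of any mission-specific panel. The abstract describes this in Java (dynamic class loading, reflection, JavaBeans); the sketch below shows the same idea in Python via `importlib`, with a standard-library class standing in for a GUI panel. The `module:ClassName` spec format is an invented convention, not the framework's.

```python
import importlib

def load_component(spec):
    """Resolve a class by name at run time; spec is 'module:ClassName'."""
    module_name, _, class_name = spec.partition(":")
    module = importlib.import_module(module_name)
    return getattr(module, class_name)

# A stdlib class stands in for a mission-specific telemetry panel.
PanelClass = load_component("collections:Counter")
panel = PanelClass(telemetry=3)
print(panel["telemetry"])  # → 3
```

A configuration file listing such specs is all a new distributed system would need to supply; the generic framework never imports mission code directly.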
High-Resolution, Ultra-Low-Power, Integrated Aftershock and Microzonation System
NASA Astrophysics Data System (ADS)
Passmore, P.; Zimakov, L. G.
2012-12-01
Rapid Aftershock Mobilization plays an essential role in the understanding of both focal mechanism and rupture propagation caused by strong earthquakes. A quick assessment of the data provides a unique opportunity to study the dynamics of the entire earthquake process in-situ. Aftershock study also provides practical information for local authorities regarding the post-earthquake activity, which is very important in order to conduct the necessary actions for public safety in the area affected by the strong earthquake. Refraction Technology, Inc. has developed a self-contained, fully integrated Aftershock System, model 160-03, providing the customer simple and quick deployment during aftershock emergency mobilization and microzonation studies. The 160-03 has no external cables or peripheral equipment for command/control and operation in the field. The 160-03 contains three major components integrated in one case: a) a 24-bit resolution, state-of-the-art, low power ADC with CPU and Lid interconnect boards; b) a power source; and c) three-component 2 Hz sensors (two horizontal and one vertical), and a built-in ±4g accelerometer. Optionally, 1 Hz sensors can be built into the 160-03 system at the customer's request. The self-contained rechargeable battery pack provides power autonomy up to 7 days during data acquisition at 200 sps on three continuous weak-motion and three triggered strong-motion recording channels. For longer power autonomy, the 160-03 Aftershock System battery pack can be charged from an external source (solar power system). The data in the field is recorded to a built-in swappable USB flash drive. The 160-03 configuration is fixed based on a configuration file stored on the system, so no external command/control interface is required for parameter setup in the field.
For visual control of the system performance in the field, the 160-03 has a built-in LED display which indicates the system's recording status as well as the status of the hot-swappable USB drive and battery. The detailed specifications and performance are presented and discussed.
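The abstract mentions triggered strong-motion channels but does not describe the trigger algorithm. A short-term/long-term average (STA/LTA) detector is the conventional choice in seismic recorders, so a minimal sketch of that standard technique is shown below; the window lengths, threshold, and signal are all invented for illustration.

```python
def sta_lta_trigger(samples, sta_len=5, lta_len=20, threshold=3.0):
    """Return indices where the STA/LTA ratio of signal energy crosses threshold."""
    energy = [s * s for s in samples]
    triggers = []
    for i in range(lta_len, len(energy)):
        sta = sum(energy[i - sta_len:i]) / sta_len   # short-term average
        lta = sum(energy[i - lta_len:i]) / lta_len   # long-term average
        if lta > 0 and sta / lta >= threshold:
            triggers.append(i)
    return triggers

# Quiet background followed by a burst starting at sample 40:
signal = [0.1] * 40 + [2.0] * 10 + [0.1] * 10
print(sta_lta_trigger(signal)[0])  # → 41, just after the burst begins
```

The short window reacts quickly to an arriving phase while the long window tracks background noise, so the ratio spikes at event onset regardless of the absolute noise level.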
Solar process water heat for the IRIS images custom color photo lab
NASA Technical Reports Server (NTRS)
1980-01-01
The solar facility located at a custom photo laboratory in Mill Valley, California, is described. It was designed to provide 59 percent of the hot water requirements for developing photographic film and for domestic hot water use. The design load is 6 gallons of hot water per minute for 8 hours per working day at 100 °F. It has 640 square feet of flat plate collectors and 360 gallons of hot water storage. The auxiliary backup system is a conventional gas-fired water heater. Site and building description, subsystem description, as-built drawings, cost breakdown and analysis, performance analysis, lessons learned, and the operation and maintenance manual are presented.
Swank, Cynthia Karen
2003-10-01
Jefferson Pilot Financial, a life insurance and annuities firm, like many U.S. service companies at the end of the 1990s, was looking for new ways to grow. Its top managers recognized that JPF needed to differentiate itself in the eyes of its customers, the independent life-insurance advisers who sell and service policies. To establish itself as these advisers' preferred partner, it set out to reduce the turnaround time on policy applications, simplify the submission process, and reduce errors. JPF's managers looked to the "lean production" practices that U.S. manufacturers adopted in response to competition from Japanese companies. Lean production is built around the concept of continuous-flow processing--a departure from traditional production systems, in which large batches are processed at each step. JPF appointed a "lean team" to reengineer its New Business unit's operations, beginning with the creation of a "model cell"--a fully functioning microcosm of JPF's entire process. This approach allowed managers to experiment and smooth out the kinks while working toward an optimal design. The team applied lean-manufacturing practices, including placing linked processes near one another, balancing employees' workloads, posting performance results, and measuring performance and productivity from the customer's perspective. Customer-focused metrics helped erode the employees' "My work is all that matters" mind-set. The results were so impressive that JPF is rolling out similar systems across many of its operations. To convince employees of the value of lean production, the lean team introduced a simulation in which teams compete to build the best paper airplane based on invented customer specifications. This game drives home lean production's basic principles, establishing a foundation for deep and far-reaching changes in the production system.
A multiarchitecture parallel-processing development environment
NASA Technical Reports Server (NTRS)
Townsend, Scott; Blech, Richard; Cole, Gary
1993-01-01
A description is given of the hardware and software of a multiprocessor test bed - the second generation Hypercluster system. The Hypercluster architecture consists of a standard hypercube distributed-memory topology, with multiprocessor shared-memory nodes. By using standard, off-the-shelf hardware, the system can be upgraded to use rapidly improving computer technology. The Hypercluster's multiarchitecture nature makes it suitable for researching parallel algorithms in computational field simulation applications (e.g., computational fluid dynamics). The dedicated test-bed environment of the Hypercluster and its custom-built software allows experiments with various parallel-processing concepts such as message passing algorithms, debugging tools, and computational 'steering'. Such research would be difficult, if not impossible, to achieve on shared, commercial systems.
Data acquisition and readout system for the LUX dark matter experiment
Akerib, D. S.; Bai, X.; Bedikian, S.; ...
2011-11-28
LUX is a two-phase (liquid/gas) xenon time projection chamber designed to detect nuclear recoils from interactions with dark matter particles. Signals from the LUX detector are processed by custom-built analog electronics which provide properly shaped signals for the trigger and data acquisition (DAQ) systems. The DAQ is comprised of commercial digitizers with firmware customized for the LUX experiment. Data acquisition systems in rare-event searches must accommodate high rate and large dynamic range during precision calibrations involving radioactive sources, while also delivering low threshold for maximum sensitivity. The LUX DAQ meets these challenges using real-time baseline suppression that allows for a maximum event acquisition rate in excess of 1.5 kHz with virtually no deadtime. This work describes the LUX DAQ and the novel acquisition techniques employed in the LUX experiment.
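Baseline suppression is what makes the high event rate affordable: instead of recording every digitized sample, only samples that deviate significantly from the quiescent baseline are kept, along with their positions. The LUX implementation lives in digitizer firmware; the sketch below is a minimal software analogue with invented ADC values and thresholds.

```python
def suppress_baseline(samples, baseline, threshold):
    """Keep only (index, value) pairs deviating from baseline by more than threshold."""
    return [(i, v) for i, v in enumerate(samples)
            if abs(v - baseline) > threshold]

# A flat baseline near ADC count 100 with one pulse: only the pulse survives.
waveform = [100, 101, 99, 100, 140, 180, 130, 100, 100]
sparse = suppress_baseline(waveform, baseline=100, threshold=10)
print(sparse)  # → [(4, 140), (5, 180), (6, 130)]
```

Keeping the indices lets the full waveform timing be reconstructed offline while the data volume scales with pulse activity rather than with acquisition time.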
Integrating technology into complex intervention trial processes: a case study.
Drew, Cheney J G; Poile, Vincent; Trubey, Rob; Watson, Gareth; Kelson, Mark; Townson, Julia; Rosser, Anne; Hood, Kerenza; Quinn, Lori; Busse, Monica
2016-11-17
Trials of complex interventions are associated with high costs and burdens in terms of paperwork, management, data collection, validation, and intervention fidelity assessment occurring across multiple sites. Traditional data collection methods rely on paper-based forms, where processing can be time-consuming and error rates high. Electronic source data collection can potentially address many of these inefficiencies, but has not routinely been used in complex intervention trials. Here we present the use of an online system for managing all aspects of data handling and for monitoring trial processes in a multicentre trial of a complex intervention. We custom built a web-accessible software application for the delivery of ENGAGE-HD, a multicentre trial of a complex physical therapy intervention. The software incorporated functionality for participant randomisation, data collection and assessment of intervention fidelity. It was accessible to multiple users with differing levels of access depending on required usage or to maintain blinding. Each site was supplied with a 4G-enabled iPad for accessing the system. The impact of this system was quantified through review of data quality and collation of feedback from site coordinators and assessors through structured process interviews. The custom-built system was an efficient tool for collecting data and managing trial processes. Although the set-up time required was significant, using the system resulted in an overall data completion rate of 98.5% with a data query rate of 0.1%, the majority of which were resolved in under a week. Feedback from research staff indicated that the system was highly acceptable for use in a research environment. This was a reflection of the portability and accessibility of the system when using the iPad and its usefulness in aiding accurate data collection, intervention fidelity and general administration.
A combination of commercially available hardware and a bespoke online database designed to support data collection, intervention fidelity and trial progress provides a viable option for streamlining trial processes in a multicentre complex intervention trial. There is scope to further extend the system to cater for larger trials and to add further functionality, such as automatic reporting facilities and participant management support. ISRCTN65378754, registered on 13 March 2014.
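The software's participant randomisation feature is named but not specified in the abstract. Permuted-block allocation is a common scheme in multicentre trials because it keeps arm sizes balanced as recruitment proceeds; the sketch below shows that standard technique with invented arm names and block size, not the ENGAGE-HD algorithm.

```python
import random

def block_randomise(n_participants, arms=("intervention", "control"),
                    block_size=4, seed=0):
    """Permuted-block allocation: each block holds equal numbers of each arm."""
    assert block_size % len(arms) == 0
    rng = random.Random(seed)  # fixed seed makes the allocation reproducible
    allocation = []
    while len(allocation) < n_participants:
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)      # randomise order within the block
        allocation.extend(block)
    return allocation[:n_participants]

groups = block_randomise(8)
print(groups.count("control"))  # → 4: arms stay balanced after every full block
```

In a real trial system the allocation list is concealed from assessors, which is why the described application enforces differing access levels to maintain blinding.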
Measurement of Clathrate Hydrate Thermodynamic Stability in the Presence of Ammonia
NASA Technical Reports Server (NTRS)
Dunham, Marc
2012-01-01
There is a lack of data available for the stability of clathrate hydrates in the presence of ammonia for low-to-moderate pressures in the 0-10 MPa range. Providing such data will allow for a better understanding of natural mass transfer processes on celestial bodies like Titan and Enceladus, on which destabilization of clathrates may be responsible for replenishment of gases in the atmosphere. The experimental process utilizes a custom-built gas handling system (GHS) and a cryogenic calorimeter to allow for the efficient testing of samples under varying pressures and gas species.
Gopakumar, Gopalakrishna Pillai; Swetha, Murali; Sai Siva, Gorthi; Sai Subrahmanyam, Gorthi R K
2018-03-01
The present paper introduces a focus stacking-based approach for automated quantitative detection of Plasmodium falciparum malaria from blood smears. For the detection, a custom-designed convolutional neural network (CNN) operating on a focus stack of images is used. The cell counting problem is addressed as a segmentation problem, and we propose a 2-level segmentation strategy. The use of a CNN operating on a focus stack for the detection of malaria is the first of its kind; it not only improved detection accuracy (both in terms of sensitivity [97.06%] and specificity [98.50%]) but also favored processing on cell patches and avoided the need for hand-engineered features. The slide images are acquired with a custom-built portable slide scanner made from low-cost, off-the-shelf components, which is suitable for point-of-care diagnostics. The proposed approach of employing sophisticated algorithmic processing together with inexpensive instrumentation can potentially help clinicians diagnose malaria. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
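The abstract gives no implementation detail, but the focus-stacking idea it builds on can be sketched in a few lines of plain Python: per pixel, keep the value from whichever slice of the stack has the strongest local Laplacian (sharpness) response. The function names and the tiny 3x3 example are illustrative assumptions, not the authors' implementation.

```python
def laplacian(img, r, c):
    # 4-neighbour Laplacian magnitude as a local sharpness score;
    # borders are handled by replicating edge pixels.
    h, w = len(img), len(img[0])
    def px(i, j):
        return img[min(max(i, 0), h - 1)][min(max(j, 0), w - 1)]
    return abs(4 * px(r, c) - px(r - 1, c) - px(r + 1, c)
               - px(r, c - 1) - px(r, c + 1))

def focus_stack(slices):
    # Per pixel, keep the value from whichever slice is sharpest there.
    h, w = len(slices[0]), len(slices[0][0])
    return [[slices[max(range(len(slices)),
                        key=lambda k: laplacian(slices[k], r, c))][r][c]
             for c in range(w)] for r in range(h)]
```

On a toy stack of one flat (out-of-focus) slice and one slice with an in-focus feature, the composite picks the sharp value at the feature and the flat value elsewhere.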
Requirement Assurance: A Verification Process
NASA Technical Reports Server (NTRS)
Alexander, Michael G.
2011-01-01
Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.
Simscape Modeling Verification in the Simulink Development Environment
NASA Technical Reports Server (NTRS)
Volle, Christopher E. E.
2014-01-01
The purpose of the Simulation Product Group of the Control and Data Systems division of the NASA Engineering branch at Kennedy Space Center is to provide a real-time model and simulation of the ground subsystems participating in vehicle launching activities. The simulation software is part of the Spaceport Command and Control System (SCCS) and is designed to support integrated launch operation software verification and console operator training. Using MathWorks Simulink tools, modeling engineers currently build models from custom-built blocks to accurately represent ground hardware. This is time-consuming and costly because of the rigorous testing and peer reviews required for each custom-built block. Using MathWorks Simscape tools, modeling time can be reduced since no custom code would be developed. After careful research, the group concluded that it is feasible to use Simscape blocks within MATLAB Simulink. My project this fall was to verify the accuracy of the Crew Access Arm model developed using Simscape tools running in the Simulink development environment.
Custom sample environments at the ALBA XPEEM.
Foerster, Michael; Prat, Jordi; Massana, Valenti; Gonzalez, Nahikari; Fontsere, Abel; Molas, Bernat; Matilla, Oscar; Pellegrin, Eric; Aballe, Lucia
2016-12-01
A variety of custom-built sample holders offer users a wide range of non-standard measurements at the ALBA synchrotron PhotoEmission Electron Microscope (PEEM) experimental station. Some of the salient features are: an ultrahigh vacuum (UHV) suitcase compatible with many offline deposition and characterization systems, built-in electromagnets for uni- or biaxial in-plane (IP) and out-of-plane (OOP) fields, as well as the combination of magnetic fields with electric fields or current injection. Electronics providing a synchronized sinusoidal signal for sample excitation enable time-resolved measurements at the 500 MHz storage ring RF frequency. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Buckman, S. M.; Ius, D.
1996-02-01
This paper reports on the development of a digital coincidence-counting system which comprises a custom-built data acquisition card and associated PC software. The system has been designed to digitise the pulse-trains from two radiation detectors at a rate of 20 MSamples/s with 12-bit resolution. Through hardware compression of the data, the system can continuously record both individual pulse-shapes and the time intervals between pulses. Software-based circuits are used to process the stored pulse trains. These circuits are constructed simply by linking together icons representing various components such as coincidence mixers, time delays, single-channel analysers, deadtimes and scalers. This system enables a pair of pulse trains to be processed repeatedly using any number of different methods. Some preliminary results are presented in order to demonstrate the versatility and efficiency of this new method.
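As a rough illustration of what one of the software-based circuits might do, the sketch below counts coincidences between two stored pulse trains using a linear two-pointer sweep; the resolving-time parameter and the timestamp-list representation are assumptions made for the example, not details taken from the paper.

```python
def coincidences(train_a, train_b, tau):
    # Count event pairs whose arrival times differ by at most the
    # resolving time tau, via a linear two-pointer sweep over the
    # time-sorted pulse trains.
    a, b = sorted(train_a), sorted(train_b)
    j = count = 0
    for t in a:
        while j < len(b) and b[j] < t - tau:
            j += 1            # b[j] can never match this or any later event
        k = j
        while k < len(b) and b[k] <= t + tau:
            count += 1
            k += 1
    return count
```

Because the trains are stored rather than processed in analog hardware, the same data can be re-run with any number of different resolving times, which is the versatility the paper highlights.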
Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools
NASA Technical Reports Server (NTRS)
Bis, Rachael; Maul, William A.
2015-01-01
Functional fault models (FFMs) are a directed-graph representation of the failure-effect propagation paths within a system's physical architecture and are used to support the development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error-prone, owing to the intensive, customized work needed to verify each individual component model, and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.
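One way such an automated check can work is to confirm that every expected failure-effect pair is actually reachable in the directed graph. The sketch below does this with a breadth-first search; the graph encoding and the node names are invented for illustration and are not NASA's tooling.

```python
from collections import deque

def reaches(graph, src, dst):
    # Breadth-first search over a failure-propagation digraph,
    # given as {node: [successor nodes]}.
    seen, queue = {src}, deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            return True
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

def verify_ffm(graph, expected):
    # Report every (failure, effect) pair the model fails to propagate.
    return [(s, d) for s, d in expected if not reaches(graph, s, d)]
```

An empty report means every expected propagation path exists in the model; any returned pair points at a missing edge or node to investigate.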
NASA Astrophysics Data System (ADS)
Foley, Andrew; Alam, Khan; Lin, Wenzhi; Wang, Kangkang; Chinchore, Abhijit; Corbett, Joseph; Savage, Alan; Chen, Tianjiao; Shi, Meng; Pak, Jeongihm; Smith, Arthur
2014-03-01
A custom low-temperature (4.2 K) scanning tunneling microscope system has been developed which is combined directly with a custom molecular beam epitaxy facility (also including pulsed laser epitaxy) for the purpose of studying the surface nanomagnetism of complex spintronic materials down to the atomic scale. To permit spin-polarized STM measurements, the microscope is built into a split-coil, 4.5 Tesla superconducting magnet system in which the magnetic field can be applied normal to the sample surface. Since, as a result, the microscope does not include eddy-current damping, vibration isolation is achieved using a unique combination of two stages of pneumatic isolators along with an acoustical noise shield, in addition to the use of a highly stable and modular 'Pan'-style STM design with a high Q factor. First 4.2 K results reveal, with clear atomic resolution, various reconstructions on wurtzite GaN c-plane surfaces grown by MBE, including the c(6x12) on N-polar GaN(0001). Details of the system design and functionality will be presented.
OpinionSeer: interactive visualization of hotel customer feedback.
Wu, Yingcai; Wei, Furu; Liu, Shixia; Au, Norman; Cui, Weiwei; Zhou, Hong; Qu, Huamin
2010-01-01
The rapid development of Web technology has resulted in an increasing number of hotel customers sharing their opinions on hotel services. Effective visual analysis of online customer opinions is needed, as it has a significant impact on building a successful business. In this paper, we present OpinionSeer, an interactive visualization system that can visually analyze a large collection of online hotel customer reviews. The system is built on a new visualization-centric opinion mining technique that considers uncertainty in order to faithfully model and analyze customer opinions. A new visual representation is developed to convey customer opinions by augmenting well-established scatterplots and radial visualization. To provide multiple-level exploration, we introduce subjective logic to handle and organize subjective opinions with degrees of uncertainty. Several case studies illustrate the effectiveness and usefulness of OpinionSeer in analyzing relationships among multiple data dimensions and comparing opinions of different groups. Aside from hotel customer feedback, OpinionSeer can also be applied to visually analyze customer opinions on other products or services.
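Subjective logic represents an opinion as a (belief, disbelief, uncertainty) triple summing to one. A minimal sketch of the standard cumulative fusion operator, which a system of this kind can use to combine opinions drawn from multiple reviewers, is shown below; the function name and the sample triples are ours, since the paper does not give this code.

```python
def fuse(op_a, op_b):
    # Cumulative fusion of two subjective-logic opinions, each a
    # (belief, disbelief, uncertainty) triple that sums to 1.
    b1, d1, u1 = op_a
    b2, d2, u2 = op_b
    k = u1 + u2 - u1 * u2      # assumes the opinions are not both certain (u=0)
    return ((b1 * u2 + b2 * u1) / k,
            (d1 * u2 + d2 * u1) / k,
            (u1 * u2) / k)
```

Fusing two equally confident opinions lowers the combined uncertainty, matching the intuition that agreeing evidence accumulates.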
NASA Astrophysics Data System (ADS)
Wang, Ximing; Martinez, Clarisa; Wang, Jing; Liu, Ye; Liu, Brent
2014-03-01
Clinical trials usually need to collect, track, and analyze multimedia data according to a workflow. Currently, clinical trial data management requirements are normally addressed with custom-built systems. Challenges arise in designing the workflow for different trials: the traditional pre-defined, custom-built system is usually limited to a specific clinical trial and normally requires time-consuming, resource-intensive software development. To provide a solution, we present a user-customizable, imaging-informatics-based intelligent workflow engine system for managing stroke rehabilitation clinical trials. The intelligent workflow engine provides flexibility in building and tailoring the workflow at various stages of a clinical trial. By providing a means to tailor and automate the workflow, the system will save time and reduce errors in clinical trials. Although our system is designed for rehabilitation clinical trials, it may be extended to other imaging-based clinical trials as well.
Custom-built tools for the study of deer antler biology.
Chu, Wenhui; Zhao, Haiping; Li, Junde; Li, Chunyi
2017-06-01
Deer antlers can be developed into multiple novel models for studying the growth and development of tissues in biomedical research. To facilitate this process, we have invented and refined five custom-built tools over three decades of antler research. These are: 1. The pedicle growth detector, to pinpoint the timing of pedicle growth initiation, so that stimuli for pedicle and first antler formation can be investigated and identified. 2. The thin periosteum slice cutter, to thinly slice (0.2 mm or 0.7 mm thick) a whole piece of antlerogenic periosteum (AP) or pedicle periosteum (PP), which facilitates gene delivery into cells resident in these tissues, making transgenic antlers possible. 3. The porous periosteum multi-needle punch, to effectively loosen the dense AP or PP tissue. This allows most cells of the periosteum to come into direct contact with treatment solutions, making artificial manipulation of antler development possible. 4. The intra-dermal pocket maker, to cut the thin dermal tissue (less than 2 mm in thickness) of a male deer calf horizontally into two layers, forming an intra-dermal pocket. This allows AP tissue to be loaded intra-dermally to test the theory of the "antler stem cell niche" in vivo. 5. The sterile periosteum sampling system, to allow aseptic on-farm collection of AP, PP or antler growth centre tissue, so that antler generation, regeneration and rapid growth can be investigated in vitro. Overall, we believe the application of contemporary cellular and molecular biological techniques, coupled with these custom-built tools, will greatly promote the establishment of this unique and novel model for the benefit of biomedical research and, hence, human health.
Bottom-feeding for blockbuster businesses.
Rosenblum, David; Tomlinson, Doug; Scott, Larry
2003-03-01
Marketing experts tell companies to analyze their customer portfolios and weed out buyer segments that don't generate attractive returns. Loyalty experts stress the need to aim retention programs at "good" customers--profitable ones--and encourage the "bad" ones to buy from competitors. And customer-relationship-management software provides ever more sophisticated ways to identify and eliminate poorly performing customers. On the surface, the movement to banish unprofitable customers seems reasonable. But writing off a customer relationship simply because it is currently unprofitable is at best rash and at worst counterproductive. Executives shouldn't be asking themselves, How can we shun unprofitable customers? They need to ask, How can we make money off the customers that everyone else is shunning? When you look at apparently unattractive segments through this lens, you often see opportunities to serve those segments in ways that fundamentally change customer economics. Consider Paychex, a payroll-processing company that built a nearly billion-dollar business by serving small companies. Established players had ignored these customers on the assumption that small companies couldn't afford the service. When founder Tom Golisano couldn't convince his bosses at Electronic Accounting Systems that they were missing a major opportunity, he started a company that now serves 390,000 U.S. customers, each employing around 14 people. In this article, the authors look closely at bottom-feeders--companies that assessed the needs of supposedly unattractive customers and redesigned their business models to turn a profit by fulfilling those needs. And they offer lessons other executives can use to do the same.
Conception of Self-Construction Production Scheduling System
NASA Astrophysics Data System (ADS)
Xue, Hai; Zhang, Xuerui; Shimizu, Yasuhiro; Fujimura, Shigeru
With the rapid innovation of information technology, many production scheduling systems have been developed. However, a great deal of customization according to each individual production environment is required, and a large investment in development and maintenance is therefore indispensable. The direction taken in constructing scheduling systems should thus change. The final objective of this research is to develop a system that builds itself by automatically extracting scheduling techniques from daily production scheduling work, so that this investment is reduced. For interoperability, the extraction mechanism should be applicable to various production processes. Using the master information extracted by the system, production scheduling operators can be supported in carrying out the scheduling work quickly and accurately, without any restriction on scheduling operations. With this extraction mechanism installed, a scheduling system can be introduced without large customization expenses. In this paper, a model for expressing a scheduling problem is first proposed. Then a guideline for extracting and using the scheduling information is presented, and some applied functions based on it are also proposed.
Meta-expert system for cargo container screening
NASA Astrophysics Data System (ADS)
Alberts, David S.
1994-02-01
This paper reports upon improvements and extensions of rule-based expert systems and related technologies in the context of their application to the cargo container screening problem. These innovations have been incorporated into a system built for and deployed by U.S. Customs with funding provided by the DCI's Counter Narcotics Committee. Given the serious nature of the drug smuggling threat and the low probability of intercept, the ability to target the extremely limited inspectional resources available to U.S. Customs is a prerequisite for success in fighting the 'Drug War.'
A Daytime Aspect Camera for Balloon Altitudes
NASA Technical Reports Server (NTRS)
Dietz, Kurt L.; Ramsey, Brian D.; Alexander, Cheryl D.; Apple, Jeff A.; Ghosh, Kajal K.; Swift, Wesley R.; Six, N. Frank (Technical Monitor)
2001-01-01
We have designed, built, and flight-tested a new star camera for daytime guiding of pointed balloon-borne experiments at altitudes around 40 km. The camera and lens are commercially available, off-the-shelf components, but require a custom-built baffle to reduce stray light, especially near the sunlit limb of the balloon. This new camera, which operates in the 600-1000 nm region of the spectrum, successfully provided daytime aspect information of approximately 10 arcsecond resolution for two distinct star fields near the galactic plane. The detected scattered-light backgrounds show good agreement with the Air Force MODTRAN models, but the daytime stellar magnitude limit was lower than expected due to dispersion of red light by the lens. Replacing the commercial lens with a custom-built lens should allow the system to track stars in any arbitrary area of the sky during the daytime.
Daytime Aspect Camera for Balloon Altitudes
NASA Technical Reports Server (NTRS)
Dietz, Kurt L.; Ramsey, Brian D.; Alexander, Cheryl D.; Apple, Jeff A.; Ghosh, Kajal K.; Swift, Wesley R.
2002-01-01
We have designed, built, and flight-tested a new star camera for daytime guiding of pointed balloon-borne experiments at altitudes around 40 km. The camera and lens are commercially available, off-the-shelf components, but require a custom-built baffle to reduce stray light, especially near the sunlit limb of the balloon. This new camera, which operates in the 600- to 1000-nm region of the spectrum, successfully provides daytime aspect information of approx. 10 arcsec resolution for two distinct star fields near the galactic plane. The detected scattered-light backgrounds show good agreement with the Air Force MODTRAN models used to design the camera, but the daytime stellar magnitude limit was lower than expected due to longitudinal chromatic aberration in the lens. Replacing the commercial lens with a custom-built lens should allow the system to track stars in any arbitrary area of the sky during the daytime.
Aerial imaging with manned aircraft for precision agriculture
USDA-ARS's Scientific Manuscript database
Over the last two decades, numerous commercial and custom-built airborne imaging systems have been developed and deployed for diverse remote sensing applications, including precision agriculture. More recently, unmanned aircraft systems (UAS) have emerged as a versatile and cost-effective platform f...
Dates fruits classification using SVM
NASA Astrophysics Data System (ADS)
Alzu'bi, Reem; Anushya, A.; Hamed, Ebtisam; Al Sha'ar, Eng. Abdelnour; Vincy, B. S. Angela
2018-04-01
In this paper, we used an SVM to classify various types of dates from their images. Dates have distinctive characteristics that can be valuable for distinguishing and determining a particular date type, including shape, texture, and color. A system achieving 100% accuracy was built to classify dates as edible or inedible. The system helps the food industry and customers classify dates according to specific quality measures, giving its best performance with specific types of dates.
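The paper's classifier is an SVM over shape, texture, and color features; as a self-contained sketch of the underlying idea, here is a minimal pure-Python linear SVM trained by Pegasos-style stochastic sub-gradient descent on toy 2-D feature vectors. The feature values, hyperparameters, and training loop are illustrative assumptions, not the authors' pipeline.

```python
import random

def train_linear_svm(X, y, lam=0.01, epochs=200):
    # Pegasos-style stochastic sub-gradient descent for a linear SVM;
    # labels must be in {-1, +1}.
    random.seed(0)
    w, b, t = [0.0] * len(X[0]), 0.0, 0
    for _ in range(epochs):
        for i in random.sample(range(len(X)), len(X)):
            t += 1
            eta = 1.0 / (lam * t)
            margin = y[i] * (sum(wj * xj for wj, xj in zip(w, X[i])) + b)
            w = [(1 - eta * lam) * wj for wj in w]      # regularisation shrink
            if margin < 1:                              # hinge-loss update
                w = [wj + eta * y[i] * xj for wj, xj in zip(w, X[i])]
                b += eta * y[i]
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1
```

In practice one would extract shape, texture, and color descriptors per date image and feed those vectors to the trainer; a kernel SVM from a library would replace this linear sketch for non-separable data.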
Design and Testing of a Transcutaneous RF Recharging System for a Fetal Micropacemaker.
Vest, Adriana N; Zhou, Li; Huang, Xuechen; Norekyan, Viktoria; Bar-Cohen, Yaniv; Chmait, Ramen H; Loeb, Gerald Eli
2017-04-01
We have developed a rechargeable fetal micropacemaker in order to treat severe fetal bradycardia with comorbid hydrops fetalis. The necessarily small form factor of the device, small patient population, and fetal anatomy put unique constraints on the design of the recharging system. To overcome these constraints, a custom high power field generator was built and the recharging process was controlled by utilizing pacing rate as a measure of battery state, a feature of the relaxation oscillator used to generate stimuli. The design and in vitro and in vivo verification of the recharging system is presented here, showing successful generation of recharging current in a fetal lamb model.
The instrument control software package for the Habitable-Zone Planet Finder spectrometer
NASA Astrophysics Data System (ADS)
Bender, Chad F.; Robertson, Paul; Stefansson, Gudmundur Kari; Monson, Andrew; Anderson, Tyler; Halverson, Samuel; Hearty, Frederick; Levi, Eric; Mahadevan, Suvrath; Nelson, Matthew; Ramsey, Larry; Roy, Arpita; Schwab, Christian; Shetrone, Matthew; Terrien, Ryan
2016-08-01
We describe the Instrument Control Software (ICS) package that we have built for The Habitable-Zone Planet Finder (HPF) spectrometer. The ICS controls and monitors instrument subsystems, facilitates communication with the Hobby-Eberly Telescope facility, and provides user interfaces for observers and telescope operators. The backend is built around the asynchronous network software stack provided by the Python Twisted engine, and is linked to a suite of custom hardware communication protocols. This backend is accessed through Python-based command-line and PyQt graphical frontends. In this paper we describe several of the customized subsystem communication protocols that provide access to and help maintain the hardware systems that comprise HPF, and show how asynchronous communication benefits the numerous hardware components. We also discuss our Detector Control Subsystem, built as a set of custom Python wrappers around a C-library that provides native Linux access to the SIDECAR ASIC and Hawaii-2RG detector system used by HPF. HPF will be one of the first astronomical instruments on sky to utilize this native Linux capability through the SIDECAR Acquisition Module (SAM) electronics. The ICS we have created is very flexible, and we are adapting it for NEID, NASA's Extreme Precision Doppler Spectrometer for the WIYN telescope; we will describe this adaptation, and describe the potential for use in other astronomical instruments.
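The HPF ICS backend is built on Twisted; to illustrate why an asynchronous stack suits many slow, independent hardware subsystems, here is a comparable sketch using the standard-library asyncio instead. The subsystem names and the status-reading callables are invented for the example and are not part of the HPF software.

```python
import asyncio

async def poll(name, read_status, cycles=3, interval=0.01):
    # Poll one hardware subsystem; `await asyncio.sleep` yields control
    # so the other subsystems are serviced while this one idles.
    readings = []
    for _ in range(cycles):
        readings.append((name, read_status()))
        await asyncio.sleep(interval)
    return readings

async def monitor(subsystems):
    # Monitor every subsystem concurrently on a single event loop,
    # a stand-in for the Twisted reactor used by the real ICS.
    logs = await asyncio.gather(*(poll(n, f) for n, f in subsystems.items()))
    return dict(zip(subsystems, logs))
```

Because each poll spends most of its time waiting on hardware, one event loop can interleave dozens of subsystems without threads, which is the benefit the paper attributes to its asynchronous design.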
Gopalakrishnan, V; Subramanian, V; Baskaran, R; Venkatraman, B
2015-07-01
Wireless based custom built aerosol sampling network is designed, developed, and implemented for environmental aerosol sampling. These aerosol sampling systems are used in field measurement campaign, in which sodium aerosol dispersion experiments have been conducted as a part of environmental impact studies related to sodium cooled fast reactor. The sampling network contains 40 aerosol sampling units and each contains custom built sampling head and the wireless control networking designed with Programmable System on Chip (PSoC™) and Xbee Pro RF modules. The base station control is designed using graphical programming language LabView. The sampling network is programmed to operate in a preset time and the running status of the samplers in the network is visualized from the base station. The system is developed in such a way that it can be used for any other environment sampling system deployed in wide area and uneven terrain where manual operation is difficult due to the requirement of simultaneous operation and status logging.
Diverse task scheduling for individualized requirements in cloud manufacturing
NASA Astrophysics Data System (ADS)
Zhou, Longfei; Zhang, Lin; Zhao, Chun; Laili, Yuanjun; Xu, Lida
2018-03-01
Cloud manufacturing (CMfg) has emerged as a new manufacturing paradigm that provides ubiquitous, on-demand manufacturing services to customers through networks and CMfg platforms. In a CMfg system, task scheduling, as an important means of finding suitable services for specific manufacturing tasks, plays a key role in enhancing system performance. Customers' requirements in CMfg are highly individualized, which leads to diverse manufacturing tasks in terms of execution flows and users' preferences. We focus on diverse manufacturing tasks and aim to address their scheduling issue in CMfg. First, a mathematical model of task scheduling is built based on an analysis of the scheduling process in CMfg. To solve this scheduling problem, we propose a scheduling method aimed at diverse tasks, which enables each service demander to obtain the desired manufacturing services. Candidate service sets are generated according to subtask directed graphs, and an improved genetic algorithm is applied to search for optimal task scheduling solutions. The effectiveness of the proposed scheduling method is verified by a case study with individualized customers' requirements. The results indicate that the proposed task scheduling method achieves better performance than common algorithms such as simulated annealing and pattern search.
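The paper's improved genetic algorithm is not specified in the abstract, but the basic GA formulation for service selection can be sketched as follows: each gene picks one candidate service for a subtask, and fitness is the total cost of the assignment. The population size, operators, and cost matrix below are illustrative assumptions, not the authors' algorithm.

```python
import random

def ga_schedule(costs, pop_size=30, gens=60, p_mut=0.2):
    # Toy genetic algorithm: gene i selects one candidate service for
    # subtask i; fitness is the total cost, lower is better.
    random.seed(1)
    n = len(costs)
    cost = lambda ind: sum(costs[i][g] for i, g in enumerate(ind))
    pop = [[random.randrange(len(costs[i])) for i in range(n)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost)
        elite = pop[: pop_size // 2]              # truncation selection
        while len(elite) < pop_size:
            a, b = random.sample(elite[: pop_size // 2], 2)
            cut = random.randrange(1, n)          # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < p_mut:           # point mutation
                i = random.randrange(n)
                child[i] = random.randrange(len(costs[i]))
            elite.append(child)
        pop = elite
    best = min(pop, key=cost)
    return best, cost(best)
```

A real CMfg scheduler would replace the scalar cost with a multi-objective fitness (time, cost, quality, user preference) and constrain crossover by the subtask directed graph.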
An information based approach to improving overhead imagery collection
NASA Astrophysics Data System (ADS)
Sourwine, Matthew J.; Hintz, Kenneth J.
2011-06-01
Recent growth in commercial imaging satellite development has resulted in a complex and diverse set of systems. To simplify this environment for both customer and vendor, an information based sensor management model was built to integrate tasking and scheduling systems. By establishing a relationship between image quality and information, tasking by NIIRS can be utilized to measure the customer's required information content. Focused on a reduction in uncertainty about a target of interest, the sensor manager finds the best sensors to complete the task given the active suite of imaging sensors' functions. This is done through determination of which satellite will meet customer information and timeliness requirements with low likelihood of interference at the highest rate of return.
Innovating Big Data Computing Geoprocessing for Analysis of Engineered-Natural Systems
NASA Astrophysics Data System (ADS)
Rose, K.; Baker, V.; Bauer, J. R.; Vasylkivska, V.
2016-12-01
Big data computing and analytical techniques offer opportunities to improve predictions about subsurface systems while quantifying and characterizing the associated uncertainties. Spatial analyses of subsurface natural and engineered systems, big data and otherwise, are based on variable-resolution, discontinuous, and often point-driven data used to represent continuous phenomena. We will present examples from two spatio-temporal methods that have been adapted for use with big datasets and big data geo-processing capabilities. The first approach uses regional earthquake data to evaluate spatio-temporal trends associated with natural and induced seismicity. The second algorithm, the Variable Grid Method (VGM), is a flexible approach that presents spatial trends and patterns, such as those resulting from interpolation methods, while simultaneously visualizing and quantifying uncertainty in the underlying spatial datasets. In this presentation we will show how we utilize Hadoop to store large geospatial data and to perform spatial analyses efficiently within custom analytical algorithms, through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate ESRI ArcMap with the team's probabilistic VGM approach. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region.
Finally, we will share our approach for implementation of data reduction and topology generation via custom multi-step Hadoop applications, performance benchmarking comparisons, and Hadoop-centric opportunities for greater parallelization of geospatial operations.
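The VGM itself is published elsewhere, but its central idea, letting cell size convey the confidence behind each estimate, can be approximated with a quadtree-style subdivision: dense regions get small cells while sparse (more uncertain) regions keep large ones. The thresholds and data layout below are assumptions made for illustration, not the published algorithm.

```python
def variable_grid(points, bounds, cap=4, min_size=1.0):
    # Quadtree-style variable grid: cells holding more than `cap`
    # samples are subdivided, so cell size itself conveys how much
    # data (and hence how much certainty) underlies each estimate.
    x0, y0, x1, y1 = bounds
    inside = [(x, y, v) for x, y, v in points if x0 <= x < x1 and y0 <= y < y1]
    if len(inside) <= cap or (x1 - x0) <= min_size:
        mean = sum(v for *_, v in inside) / len(inside) if inside else None
        return [(bounds, len(inside), mean)]
    mx, my = (x0 + x1) / 2, (y0 + y1) / 2
    cells = []
    for sub in ((x0, y0, mx, my), (mx, y0, x1, my),
                (x0, my, mx, y1), (mx, my, x1, y1)):
        cells.extend(variable_grid(inside, sub, cap, min_size))
    return cells
```

Each returned cell carries its bounds, sample count, and mean value; in a MapReduce setting the subdivision and per-cell aggregation map naturally onto the multi-step data-reduction job the abstract describes.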
NASA Astrophysics Data System (ADS)
Tracey, Emily; Smith, Nichola; Lawrie, Ken
2017-04-01
The principles behind, and the methods of, digital data capture can be applied across many scientific, and other, disciplines, as can be demonstrated by the use of a custom modified version of the British Geological Survey's System for Integrated Geoscience Mapping, (BGS·SIGMA), for the capture of data for use in the conservation of Scottish built heritage. Historic Environment Scotland (HES), an executive agency of the Scottish Government charged with safeguarding the nation's historic environment, is directly responsible for 345 sites of national significance, most of which are built from stone. In common with many other heritage organisations, HES needs a system that can capture, store and present conservation, maintenance and condition indicator information for single or multiple historic sites; this system would then be used to better target and plan effective programmes of maintenance and repair. To meet this need, the British Geological Survey (BGS) has worked with HES to develop an integrated digital site assessment system that provides a refined survey process for stone-built (and other) historic sites. Based on BGS·SIGMA—an integrated workflow underpinned by a geo-spatial platform for data capture and interpretation—the new system is built on top of ESRI's ArcGIS software, and underpinned by a relational database. Users can, in the field or in the office, populate custom-built data entry forms to record maintenance issues and repair specifications for architectural elements ranging from individual blocks of stone to entire building elevations. Photographs, sketches, and digital documents can be linked to architectural elements to enhance the usability of the data. Predetermined data fields and supporting dictionaries constrain the input parameters, ensuring a high degree of standardisation in the datasets and, therefore, enabling highly consistent data extraction and querying. 
The GIS presentation of the data provides a powerful and versatile planning tool for scheduling works, specifying materials, identifying the skills needed for repairs, and allocating resources more effectively and efficiently. Physical alterations and changes in the overall condition of a single site or a group of sites can be monitored accurately over time by repeating the original survey (e.g. every 5 years). Other datasets can be linked to the database, and other geospatially referenced datasets can be superimposed in GIS, adding considerably to the scope and utility of the system. The system can be applied to any geospatially referenced object in a wide range of situations, thus providing many potential applications in conservation, archaeology and other related fields.
NASA Astrophysics Data System (ADS)
Boyle, P.; Chen, D.; Christ, N.; Clark, M.; Cohen, S.; Cristian, C.; Dong, Z.; Gara, A.; Joo, B.; Jung, C.; Kim, C.; Levkova, L.; Liao, X.; Liu, G.; Li, S.; Lin, H.; Mawhinney, R.; Ohta, S.; Petrov, K.; Wettig, T.; Yamaguchi, A.
2005-03-01
The QCDOC project has developed a supercomputer optimised for the needs of Lattice QCD simulations. It provides a very competitive price-to-sustained-performance ratio of around US$1 per sustained Megaflop/s, in combination with outstanding scalability. Thus very large systems delivering over 5 TFlop/s of performance on the evolution of a single lattice are possible. Large prototypes have been built and are functioning correctly. The software environment raises the state of the art in such custom supercomputers. It is based on a lean custom node operating system that eliminates many unnecessary overheads that plague other systems. Despite its custom nature, the operating system implements a standards-compliant UNIX-like programming environment, easing the porting of software from other systems. The SciDAC QMP interface adds internode communication in a fashion that provides a uniform cross-platform programming environment.
Value Creation Through Integrated Networks and Convergence
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Martini, Paul; Taft, Jeffrey D.
2015-04-01
Customer adoption of distributed energy resources and public policies are driving changes in the uses of the distribution system. A system originally designed and built for one-way energy flows from central generating facilities to end-use customers is now experiencing injections of energy from customers anywhere on the grid and frequent reversals in the direction of energy flow. In response, regulators and utilities are re-thinking the design and operations of the grid to create more open and transactive electric networks. This evolution has the opportunity to unlock significant value for customers and utilities. Alternatively, failure to seize this potential may instead lead to an erosion of value if customers seek to defect and disconnect from the system. This paper will discuss how current grid modernization investments may be leveraged to create open networks that increase value through the interaction of intelligent devices on the grid and prosumerization of customers. Moreover, even greater value can be realized through the synergistic effects of convergence of multiple networks. This paper will highlight examples of the emerging nexus of non-electric networks with electricity.
Data Quality Monitoring System for New GEM Muon Detectors for the CMS Experiment Upgrade
NASA Astrophysics Data System (ADS)
King, Robert; CMS Muon Group Team
2017-01-01
The Gas Electron Multiplier (GEM) detectors are novel detectors designed to improve the muon trigger and tracking performance of the CMS experiment for the high-luminosity upgrade of the LHC. Partial installation of GEM detectors is planned during the 2016-2017 technical stop. Before the GEM system is installed underground, its data acquisition (DAQ) electronics must be thoroughly tested. The DAQ system includes several commercial and custom-built electronic boards running custom firmware. The front-end electronics are radiation-hard and communicate via optical fibers. The data quality monitoring (DQM) software framework has been designed to provide online verification of the integrity of the data produced by the detector electronics, and to promptly identify potential hardware or firmware malfunctions in the system. Local hit reconstruction and clustering algorithms allow quality control of the data produced by each GEM chamber. Once the new detectors are installed, the DQM will monitor the stability and performance of the system during normal data-taking operations. We discuss the design of the DQM system, the software being developed to read out and process the detector data, and the methods used to identify and report hardware and firmware malfunctions of the system.
Jones, J.W.; Desmond, G.B.; Henkle, C.; Glover, R.
2012-01-01
Accurate topographic data are critical to restoration science and planning for the Everglades region of South Florida, USA. They are needed to monitor and simulate water level, water depth and hydroperiod and are used in scientific research on hydrologic and biologic processes. Because large wetland environments and data acquisition challenge conventional ground-based and remotely sensed data collection methods, the United States Geological Survey (USGS) adapted a classical data collection instrument to global positioning system (GPS) and geographic information system (GIS) technologies. Data acquired with this instrument were processed using geostatistics to yield sub-water level elevation values with centimetre accuracy (±15 cm). The developed database framework, modelling philosophy and metadata protocol allow for continued, collaborative model revision and expansion, given additional elevation or other ancillary data. © 2012 Taylor & Francis.
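The abstract describes the geostatistical processing only at a high level. As an illustrative stand-in, a minimal inverse-distance-weighting interpolator (a far simpler scheme than the kriging usually meant by "geostatistics") might look like the sketch below; all names and parameters are hypothetical.

```python
import numpy as np

def idw(points, values, query, power=2.0):
    """Estimate elevation at `query` from scattered (x, y) samples.

    Illustrative stand-in for the geostatistical interpolation in the text;
    weights each sample by 1 / distance**power."""
    d = np.linalg.norm(points - query, axis=1)
    if np.any(d == 0):                     # query coincides with a sample
        return float(values[np.argmin(d)])
    w = 1.0 / d ** power
    return float(np.sum(w * values) / np.sum(w))
```

Unlike kriging, this scheme models no spatial covariance and gives no error estimate; it only conveys the shape of the interpolation problem.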
Halim, Zahid; Abbas, Ghulam
2015-01-01
Sign language provides hearing- and speech-impaired individuals with an interface to communicate with other members of society. Unfortunately, sign language is not understood by most people. Here, a gadget based on image processing and pattern recognition can provide a vital aid for detecting and translating sign language into a vocal language. This work presents a system for detecting and understanding sign language gestures with a custom-built software tool and later translating each gesture into a vocal language. To recognize a particular gesture, the system employs a Dynamic Time Warping (DTW) algorithm, and an off-the-shelf software tool is employed for vocal language generation. A Microsoft® Kinect is the primary tool used to capture the video stream of a user. The proposed method is capable of successfully detecting gestures stored in the dictionary with an accuracy of 91%. The proposed system has the ability to define and add custom-made gestures. Based on an experiment in which 10 individuals with impairments used the system to communicate with 5 people with no disability, 87% agreed that the system was useful.
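As a rough illustration of the gesture-matching step, here is the standard dynamic-programming form of DTW with a nearest-template classifier. The 1-D feature sequences and gesture names are simplified assumptions for the sketch, not the authors' actual Kinect feature pipeline.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-programming DTW between two 1-D feature sequences."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])          # local distance
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def classify(query, dictionary):
    """Return the name of the dictionary template nearest to `query` under DTW."""
    return min(dictionary, key=lambda name: dtw_distance(query, dictionary[name]))
```

In a real system the sequences would be multi-dimensional joint trajectories from the Kinect skeleton rather than scalars, but the recurrence is the same.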
The LUX experiment - trigger and data acquisition systems
NASA Astrophysics Data System (ADS)
Druszkiewicz, Eryk
2013-04-01
The Large Underground Xenon (LUX) detector is a two-phase xenon time projection chamber designed to detect interactions of dark matter particles with xenon nuclei. Signals from the detector PMTs are processed by custom-built analog electronics which provide properly shaped signals for the trigger and data acquisition (DAQ) systems. During calibrations, both systems must be able to handle high rates and have large dynamic ranges; during dark matter searches, maximum sensitivity requires low thresholds. The trigger system uses eight-channel 64-MHz digitizers (DDC-8) connected to a Trigger Builder (TB). The FPGA cores on the digitizers perform real-time pulse identification (discriminating between S1- and S2-like signals) and event localization. The TB uses hit patterns, hit maps, and maximum response detection to make trigger decisions, which are reached within a few microseconds of an event of interest. The DAQ system comprises commercial digitizers with customized firmware. Its real-time baseline suppression allows for a maximum event acquisition rate in excess of 1.5 kHz, which results in virtually no deadtime. The performance of the trigger and DAQ systems during the commissioning runs of LUX will be discussed.
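The abstract does not give the firmware's actual algorithm, but the general idea behind real-time baseline suppression, keeping only samples near above-threshold excursions and discarding the quiet baseline, can be sketched in software. The threshold and padding values here are illustrative, not LUX parameters.

```python
import numpy as np

def suppress_baseline(wave, threshold, pad=4):
    """Zero-suppress a digitized waveform.

    Returns the indices and values of samples within `pad` samples of any
    excursion whose magnitude reaches `threshold`; everything else is dropped."""
    hot = np.abs(wave) >= threshold
    keep = np.zeros_like(hot)
    for i in np.flatnonzero(hot):
        # pad each hot sample so pulse edges survive suppression
        keep[max(0, i - pad):i + pad + 1] = True
    idx = np.flatnonzero(keep)
    return idx, wave[idx]
```

Storing only `(idx, values)` pairs is what makes kHz-scale acquisition feasible: a mostly quiet waveform compresses to a few short runs around each pulse.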
Research on networked manufacturing system for reciprocating pump industry
NASA Astrophysics Data System (ADS)
Wu, Yangdong; Qi, Guoning; Xie, Qingsheng; Lu, Yujun
2005-12-01
Networked manufacturing is a trend in the reciprocating pump industry. According to the enterprises' requirements, an architecture of a networked manufacturing system for the reciprocating pump industry was proposed, composed of an infrastructure layer, a system management layer, an application service layer and a user layer. Its main functions included product data management, ASP service, business management, and customer relationship management; its physical framework was a multi-tier internet-based model. The concept of ASP service integration was put forward and its process model was also established. As a result, a networked manufacturing system aimed at the characteristics of the reciprocating pump industry was built. By implementing this system, the reciprocating pump industry can obtain a new way to fully utilize its own resources and enhance its capability to respond to the global market quickly.
Design of the first optical system for real-time tomographic holography (RTTH)
NASA Astrophysics Data System (ADS)
Galeotti, John M.; Siegel, Mel; Rallison, Richard D.; Stetten, George
2008-08-01
The design of the first Real-Time-Tomographic-Holography (RTTH) optical system for augmented-reality applications is presented. RTTH places a viewpoint-independent real-time (RT) virtual image (VI) of an object into its actual location, enabling natural hand-eye coordination to guide invasive procedures, without requiring tracking or a head-mounted device. The VI is viewed through a narrow-band Holographic Optical Element (HOE) with built-in power that generates the largest possible near-field, in-situ VI from a small display chip without noticeable parallax error or obscuring direct view of the physical world. Rigidly fixed upon a medical-ultrasound probe, RTTH could show the scan in its actual location inside the patient, because the VI would move with the probe. We designed the image source along with the system optics, allowing us to ignore both planar geometric distortions and field curvature, respectively compensated by using RT pre-processing software and attaching a custom-surfaced fiber-optic faceplate (FOFP) to our image source. Focus in our fast, non-axial system was achieved by placing correcting lenses near the FOFP and custom-optically-fabricating our volume-phase HOE using a recording beam that was specially shaped by extra lenses. By simultaneously simulating and optimizing the system's playback performance across variations in both the total playback and HOE-recording optical systems, we derived and built a design that projects a 104 × 112 mm planar VI 1 m from the HOE using a laser-illuminated 19 × 16 mm LCD+FOFP image source. The VI appeared fixed in space and well focused. Viewpoint-induced location errors were <3 mm, and unexpected first-order astigmatism produced 3 cm (3% of 1 m) ambiguity in depth, typically unnoticed by human observers.
Renaissance architecture for Ground Data Systems
NASA Technical Reports Server (NTRS)
Perkins, Dorothy C.; Zeigenfuss, Lawrence B.
1994-01-01
The Mission Operations and Data Systems Directorate (MO&DSD) has embarked on a new approach for developing and operating Ground Data Systems (GDS) for flight mission support. This approach is driven by the goals of minimizing cost and maximizing customer satisfaction. Achievement of these goals is realized through the use of a standard set of capabilities which can be modified to meet specific user needs. This approach, which is called the Renaissance architecture, stresses the engineering of integrated systems, based upon workstation/local area network (LAN)/fileserver technology and reusable hardware and software components called 'building blocks.' These building blocks are integrated with mission specific capabilities to build the GDS for each individual mission. The building block approach is key to the reduction of development costs and schedules. Also, the Renaissance approach allows the integration of GDS functions that were previously provided via separate multi-mission facilities. With the Renaissance architecture, the GDS can be developed by the MO&DSD or all, or part, of the GDS can be operated by the user at their facility. Flexibility in operation configuration allows both selection of a cost-effective operations approach and the capability for customizing operations to user needs. Thus the focus of the MO&DSD is shifted from operating systems that we have built to building systems and, optionally, operations as separate services. Renaissance is actually a continuous process. Both the building blocks and the system architecture will evolve as user needs and technology change. Providing GDS on a per user basis enables this continuous refinement of the development process and product and allows the MO&DSD to remain a customer-focused organization. 
This paper will present the activities and results of the MO&DSD initial efforts toward the establishment of the Renaissance approach for the development of GDS, with a particular focus on both the technical and process implications posed by Renaissance to the MO&DSD.
The Gem Infrasound Logger and Custom-Built Instrumentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Jacob F.; Johnson, Jeffrey B.; Bowman, Daniel C.
2017-11-22
Here, we designed, built, and recorded data with a custom infrasound logger (referred to as the Gem) that is inexpensive, portable, and easy to use. We also describe its design process, qualities, and applications in this article. Field instrumentation is a key element of geophysical data collection, and the quantity and quality of data that can be recorded is determined largely by the characteristics of the instruments used. Geophysicists tend to rely on commercially available instruments, which suffice for many important types of fieldwork. However, commercial instrumentation can fall short in certain roles, which motivates the development of custom sensors and data loggers. In particular, we found existing data loggers to be expensive and inconvenient for infrasound campaigns, and developed the Gem infrasound logger in response. In this article, we discuss development of this infrasound logger and the various uses found for it, including projects on volcanoes, high-altitude balloons, and rivers. Further, we demonstrate that when needed, scientists can feasibly design and build their own specialized instruments, and that doing so can enable them to record more and better data at a lower cost.
Ion plating studies for high temperature applications
NASA Technical Reports Server (NTRS)
Davis, J. H.
1980-01-01
An experimental project was undertaken to ion plate, by electron beam evaporation, Al films onto 4340 steel substrates using (and at the time troubleshooting) the custom-built V.T.A. 7375 electron beam ion plating system. A careful survey of the recent literature and of commercial vendors indicates possible means of improving the trouble-plagued V.T.A. system.
Nie, Min; Ren, Jie; Li, Zhengjun; Niu, Jinhai; Qiu, Yihong; Zhu, Yisheng; Tong, Shanbao
2009-01-01
Without visual information, blind people face hardships with shopping, reading, finding objects, and other daily tasks. We therefore developed a portable auditory guide system, called SoundView, for visually impaired people. This prototype system consists of a mini-CCD camera, a digital signal processing unit and an earphone, working with built-in customizable auditory coding algorithms. Employing environment-understanding techniques, SoundView processes the images from the camera and detects objects tagged with barcodes. The recognized objects in the environment are then encoded into stereo speech signals delivered through the earphone. The user is thus able to recognize the type, motion state and location of objects of interest with the help of SoundView. Compared with other visual assistant techniques, SoundView is object-oriented and has the advantages of low cost, small size, light weight, low power consumption and easy customization.
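The exact auditory coding used by SoundView is not specified in the abstract. One common way to convey an object's direction in a stereo signal is constant-power panning of a cue sound by azimuth; the sketch below illustrates that idea with hypothetical function names and parameters.

```python
import numpy as np

def encode_object(tone_hz, azimuth, duration=0.5, rate=16000):
    """Render a mono cue tone into a stereo buffer, panned by object azimuth.

    azimuth: -1.0 (far left) .. +1.0 (far right). Uses constant-power panning,
    so perceived loudness stays roughly uniform across positions."""
    t = np.arange(int(duration * rate)) / rate
    mono = np.sin(2 * np.pi * tone_hz * t)
    theta = (azimuth + 1) * np.pi / 4            # map azimuth to 0..pi/2
    left, right = np.cos(theta), np.sin(theta)   # constant-power channel gains
    return np.stack([left * mono, right * mono], axis=1)
```

A fuller encoding could also vary pitch with distance or use speech synthesis for the object type, as the text implies SoundView does.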
Portable system for temperature monitoring in all phases of wine production.
Boquete, Luciano; Cambralla, Rafael; Rodríguez-Ascariz, J M; Miguel-Jiménez, J M; Cantos-Frontela, J J; Dongil, J
2010-07-01
This paper presents a low-cost and highly versatile temperature-monitoring system applicable to all phases of wine production, from grape cultivation through to delivery of bottled wine to the end customer. Monitoring is performed by a purpose-built electronic system comprising a digital memory that stores temperature data and a ZigBee communication system that transmits it to a Control Centre for processing and display. The system has been tested under laboratory conditions and in real-world operational applications. One of the system's advantages is that it can be applied to every phase of wine production. Moreover, with minimal modification, other variables of interest (pH, humidity, etc.) could also be monitored and the system could be applied to other similar sectors, such as olive-oil production. © 2010 ISA. Published by Elsevier Ltd. All rights reserved.
SIRIUS - A new 6 MV accelerator system for IBA and AMS at ANSTO
NASA Astrophysics Data System (ADS)
Pastuovic, Zeljko; Button, David; Cohen, David; Fink, David; Garton, David; Hotchkis, Michael; Ionescu, Mihail; Long, Shane; Levchenko, Vladimir; Mann, Michael; Siegele, Rainer; Smith, Andrew; Wilcken, Klaus
2016-03-01
The Centre for Accelerator Science (CAS) facility at ANSTO has been expanded with a new 6 MV tandem accelerator system supplied by the National Electrostatic Corporation (NEC). The beamlines, end-stations and data acquisition software for accelerator mass spectrometry (AMS) were custom built by NEC for rare isotope mass spectrometry, while the beamlines with end-stations for ion beam analysis (IBA) are largely custom designed at ANSTO. An overview of the 6 MV system and its performance during the testing and commissioning phase is given, with emphasis on the IBA end-stations and their applications for materials modification and characterisation.
The Cold Dark Matter Search test stand warm electronics card
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hines, Bruce (Colorado U., Denver); Hansen, Sten
A card that performs the signal processing for four SQUID amplifiers and two charge-sensitive channels is described. The card performs the same functions as are presently done with two custom 9U x 280 mm Eurocard modules, a commercial multi-channel VME digitizer, a PCI-to-GPIB interface, a PCI-to-VME interface and a custom-built linear power supply. By integrating these functions onto a single card and using the Power over Ethernet standard, the infrastructure requirements for instrumenting a Cold Dark Matter Search (CDMS) detector test stand are significantly reduced.
ICE: A Scalable, Low-Cost FPGA-Based Telescope Signal Processing and Networking System
NASA Astrophysics Data System (ADS)
Bandura, K.; Bender, A. N.; Cliche, J. F.; de Haan, T.; Dobbs, M. A.; Gilbert, A. J.; Griffin, S.; Hsyu, G.; Ittah, D.; Parra, J. Mena; Montgomery, J.; Pinsonneault-Marotte, T.; Siegel, S.; Smecher, G.; Tang, Q. Y.; Vanderlinde, K.; Whitehorn, N.
2016-03-01
We present an overview of the ‘ICE’ hardware and software framework that implements large arrays of interconnected field-programmable gate array (FPGA)-based data acquisition, signal processing and networking nodes economically. The system was conceived for application to radio, millimeter and sub-millimeter telescope readout systems that have requirements beyond typical off-the-shelf processing systems, such as careful control of interference signals produced by the digital electronics, and clocking of all elements in the system from a single precise observatory-derived oscillator. A new generation of telescopes operating at these frequency bands and designed with a vastly increased emphasis on digital signal processing to support their detector multiplexing technology or high-bandwidth correlators — data rates exceeding a terabyte per second — is becoming common. The ICE system is built around a custom FPGA motherboard that makes use of a Xilinx Kintex-7 FPGA and an ARM-based co-processor. The system is specialized for specific applications through software, firmware and custom mezzanine daughter boards that interface to the FPGA through the industry-standard FPGA mezzanine card (FMC) specifications. For high-density applications, the motherboards are packaged in 16-slot crates with ICE backplanes that implement a low-cost passive full-mesh network between the motherboards in a crate, allow high-bandwidth interconnection between crates and enable data offload to a computer cluster. A Python-based control software library automatically detects and operates the hardware in the array.
Examples of specific telescope applications of the ICE framework are presented, namely the frequency-multiplexed bolometer readout systems used for the South Pole Telescope (SPT) and Simons Array and the digitizer, F-engine, and networking engine for the Canadian Hydrogen Intensity Mapping Experiment (CHIME) and Hydrogen Intensity and Real-time Analysis eXperiment (HIRAX) radio interferometers.
Customizing Laboratory Information Systems: Closing the Functionality Gap.
Gershkovich, Peter; Sinard, John H
2015-09-01
Highly customizable laboratory information systems help to address great variations in laboratory workflows, typical in Pathology. Often, however, built-in customization tools are not sufficient to add all of the desired functionality and improve systems interoperability. Emerging technologies and advances in medicine often create a void in functionality that we call a functionality gap. These gaps have distinct characteristics—a persuasive need to change the way a pathology group operates, the general availability of technology to address the missing functionality, the absence of this technology from your laboratory information system, and inability of built-in customization tools to address it. We emphasize the pervasive nature of these gaps, the role of pathology informatics in closing them, and suggest methods on how to achieve that. We found that a large number of the papers in the Journal of Pathology Informatics are concerned with these functionality gaps, and an even larger proportion of electronic posters and abstracts presented at the Pathology Informatics Summit conference each year deal directly with these unmet needs in pathology practice. A rapid, continuous, and sustainable approach to closing these gaps is critical for Pathology to provide the highest quality of care, adopt new technologies, and meet regulatory and financial challenges. The key element of successfully addressing functionality gaps is gap ownership—the ability to control the entire pathology information infrastructure with access to complementary systems and components. In addition, software developers with detailed domain expertise, equipped with right tools and methodology can effectively address these needs as they emerge.
ICE-Based Custom Full-Mesh Network for the CHIME High Bandwidth Radio Astronomy Correlator
NASA Astrophysics Data System (ADS)
Bandura, K.; Cliche, J. F.; Dobbs, M. A.; Gilbert, A. J.; Ittah, D.; Mena Parra, J.; Smecher, G.
2016-03-01
New-generation radio interferometers encode signals from thousands of antenna feeds across large bandwidth. Channelizing and correlating this data requires networking capabilities that can handle unprecedented data rates at reasonable cost. The Canadian Hydrogen Intensity Mapping Experiment (CHIME) correlator processes 8 bits from N = 2,048 digitizer inputs across 400 MHz of bandwidth. Measured in N² × bandwidth, it is the largest radio correlator currently being commissioned. Its digital back-end must exchange and reorganize the 6.6 terabit/s produced by its 128 digitizing and channelizing nodes, and feed it to the 256-node graphics processing unit (GPU) spatial correlator in such a way that each node obtains data from all digitizer inputs but across a small fraction of the bandwidth (i.e. a ‘corner-turn’). In order to maximize performance and reliability of the corner-turn system while minimizing cost, a custom networking solution has been implemented. The system makes use of Field Programmable Gate Array (FPGA) transceivers to implement direct, passive copper, full-mesh, high-speed serial connections between sixteen circuit boards in a crate, to exchange data between crates, and to offload the data to a cluster of 256 GPU nodes using standard 10 Gbit/s Ethernet links. The GPU nodes complete the corner-turn by combining data from all crates and then computing visibilities. Eye diagrams and frame error counters confirm error-free operation of the corner-turn network in both the currently operating CHIME Pathfinder telescope (a prototype for the full CHIME telescope) and a representative fraction of the full CHIME hardware, providing an end-to-end system validation. An analysis of an equivalent corner-turn system built with Ethernet switches instead of custom passive data links is provided.
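Viewed in memory, the corner-turn is essentially a distributed transpose: gather every digitizer input, then re-slice by frequency. A scaled-down NumPy sketch of that reorganization follows; the node counts and frame shapes are illustrative stand-ins, not CHIME's real packet format.

```python
import numpy as np

# Hypothetical scaled-down sizes (CHIME uses 128 F-engine and 256 GPU nodes).
N_FENGINE, N_GPU = 4, 8
INPUTS_PER_FENGINE, N_FREQ = 16, 64

def corner_turn(fengine_frames):
    """Redistribute channelized data (the 'corner-turn').

    Input: one array per F-engine node, shape (INPUTS_PER_FENGINE, N_FREQ),
    i.e. a few digitizer inputs across the whole band.
    Output: one array per GPU node, shape (all inputs, N_FREQ // N_GPU),
    i.e. every input across a narrow slice of the band."""
    full = np.concatenate(fengine_frames, axis=0)  # gather all inputs
    return np.split(full, N_GPU, axis=1)           # re-slice by frequency
```

In the real system this exchange happens over the full-mesh backplane and Ethernet links rather than in one address space; the sketch only shows the data reorganization each GPU node must end up with.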
User engineering: A new look at system engineering
NASA Technical Reports Server (NTRS)
Mclaughlin, Larry L.
1987-01-01
User Engineering is a new System Engineering perspective responsible for defining and maintaining the user view of the system. Its elements are a process to guide the project and customer, a multidisciplinary team including hard and soft sciences, rapid prototyping tools to build user interfaces quickly and modify them frequently at low cost, and a prototyping center for involving users and designers in an iterative way. The main consideration is reducing the risk that the end user will not or cannot effectively use the system. The process begins with user analysis to produce cognitive and work style models, and task analysis to produce user work functions and scenarios. These become major drivers of the human computer interface design which is presented and reviewed as an interactive prototype by users. Feedback is rapid and productive, and user effectiveness can be measured and observed before the system is built and fielded. Requirements are derived via the prototype and baselined early to serve as an input to the architecture and software design.
Framework for Development of Object-Oriented Software
NASA Technical Reports Server (NTRS)
Perez-Poveda, Gus; Ciavarella, Tony; Nieten, Dan
2004-01-01
The Real-Time Control (RTC) Application Framework is a high-level software framework written in C++ that supports the rapid design and implementation of object-oriented application programs. This framework provides built-in functionality that solves common software development problems within distributed client-server, multi-threaded, and embedded programming environments. When using the RTC Framework to develop software for a specific domain, designers and implementers can focus entirely on the details of the domain-specific software rather than on creating custom solutions, utilities, and frameworks for the complexities of the programming environment. The RTC Framework was originally developed as part of a Space Shuttle Launch Processing System (LPS) replacement project called Checkout and Launch Control System (CLCS). As a result of the framework's development, CLCS software development time was reduced by 66 percent. The framework is generic enough for developing applications outside of the launch-processing system domain. Other applicable high-level domains include command and control systems and simulation/training systems.
Volcano Gas Measurements from UAS - Customization of Sensors and Platforms
NASA Astrophysics Data System (ADS)
Werner, C. A.; Dahlgren, R. P.; Kern, C.; Kelly, P. J.; Fladeland, M. M.; Norton, K.; Johnson, M. S.; Sutton, A. J.; Elias, T.
2015-12-01
Volcanic eruptions threaten not only the lives and property of local populations, but also aviation worldwide. Volcanic gas release is a key driving force in eruptive activity, and monitoring gas emissions is critical to assessing volcanic hazards, yet most volcanoes are not monitored for volcanic gas emission. Measuring volcanic gas emissions with manned aircraft has been standard practice for many years during eruptive crises, but such measurements are quite costly. As a result, measurements are typically made only every week or two at most during periods of unrest or eruption, whereas eruption dynamics change much more rapidly. Furthermore, very few measurements are made between eruptions to establish baseline emissions. Unmanned aerial system (UAS) measurements of volcanic plumes hold great promise both for improving the temporal resolution of measurements during volcanic unrest and for reducing the exposure of personnel to potentially hazardous conditions. Here we present the results of a new collaborative effort between the US Geological Survey and NASA Ames Research Center to develop a UAS specifically for volcano gas monitoring using miniaturized gas sensing systems and a custom airframe. Two miniaturized sensing systems are being built and tested: a microDOAS system to quantify SO2 emission rates, and a miniature MultiGAS system for measuring in-situ concentrations of CO2, SO2, and H2S. The instruments are being built into pods that will be flown on a custom airframe built from a surplus Raven RQ-11. The Raven is one of the smallest UAS (a SUAS) and has the potential to support global rapid response when eruptions occur because it requires a smaller crew for operations. A test mission is planned for fall 2015 or spring 2016 at the Crows Landing Airfield in central California. Future measurement locations might include Kilauea Volcano in Hawaii or Pagan Volcano in the Marianas.
Positioning laboratory automation for today's dynamic climate
Vogt, D. G.
1994-01-01
Laboratory automation has existed and matured at Eli Lilly and Company for well over a decade. The author's section serves as a developer of laboratory automation systems for customers within Lilly and embodies ‘robotic friendly’ laboratories with highly technical and experienced personnel. With several systems showing signs of age, second-generation ‘smart systems’ have been developed and delivered during the last three years. These systems were built with an ideology different from that of previous systems. Upon their delivery, the ‘smart systems’ met the customers' functional requirements, but the overall acceptance of this ideology is still being debated due to a perception of failure. Much of this perception can be attributed to the delivery of a system heavily dependent on system maintenance, something totally unexpected by the customer. This paper discusses the ‘smart systems’ ideology and the results following implementation. The events that led to the review and subsequent departure from the ‘smart systems’ ideology are also described. PMID:18924995
Elliott, Jonathan T; Dsouza, Alisha V; Marra, Kayla; Pogue, Brian W; Roberts, David W; Paulsen, Keith D
2016-09-01
Fluorescence guided surgery has the potential to positively impact surgical oncology; current operating microscopes and stand-alone imaging systems are too insensitive or too cumbersome to maximally take advantage of new tumor-specific agents developed through the microdose pathway. To this end, a custom-built illumination and imaging module enabling picomolar-sensitive near-infrared fluorescence imaging on a commercial operating microscope is described. The limits of detection and system specifications are characterized, and in vivo efficacy of the system in detecting ABY-029 is evaluated in a rat orthotopic glioma model following microdose injections, showing the suitability of the device for microdose phase 0 clinical trials.
Design and construction of a photobioreactor for hydrogen production, including status in the field.
Skjånes, Kari; Andersen, Uno; Heidorn, Thorsten; Borgvang, Stig A
Several species of microalgae and phototrophic bacteria are able to produce hydrogen under certain conditions. A range of different photobioreactor systems have been used by different research groups for lab-scale hydrogen production experiments, and a few attempts have been made to scale up the hydrogen production process. Even though a photobioreactor system for hydrogen production requires special construction properties (e.g., it must be hydrogen-tight and mixed by means other than bubbling with air), only very few attempts have been made to design photobioreactors specifically for the purpose of hydrogen production. We have constructed a flat panel photobioreactor system that can be used in two modes: either for the cultivation of phototrophic microorganisms (upright and bubbling) or for the production of hydrogen or other anaerobic products (mixing by "rocking motion"). Special care has been taken to avoid hydrogen leakage, through both construction and material choices. The flat plate photobioreactor system is controlled by a custom-built control system that can log and control temperature, pH, and optical density, and additionally log the amount of produced gas and the dissolved oxygen concentration. This paper summarizes the status of the field of photobioreactors for hydrogen production and describes in detail the design and construction of a purpose-built flat panel photobioreactor system, optimized for hydrogen production in terms of structural functionality, durability, performance, and selection of materials. The motivations for the choices made during the design process and the advantages/disadvantages of previous designs are discussed.
A new humane method of stunning broilers using low atmospheric pressure
USDA-ARS?s Scientific Manuscript database
This research project evaluated an alternative method of controlled atmosphere stunning of commercial broilers to induce anoxia, utilizing a vacuum pump to reduce the oxygen tension: low atmospheric pressure stun (LAPS). A custom-built 2 cage-module system (holding a total of 600 broilers each) with...
Vakili, Sharif; Pandit, Ravi; Singman, Eric L; Appelbaum, Jeffrey; Boland, Michael V
2015-10-29
Understanding how patients move through outpatient clinics is important for optimizing clinic processes. This study compares the costs, benefits, and challenges of two clinically important methods for measuring patient flow: (1) a commercial system using infrared (IR) technology that passively tracks patient movements and (2) a custom-built, low-cost, networked radio frequency identification (RFID) system that requires active swiping by patients at proximity card readers. Readers for both the IR and RFID systems were installed in the General Eye Service of the Wilmer Eye Institute. Participants were given both IR and RFID tags to measure the time they spent in various clinic stations. Simultaneously, investigators recorded the times at which patients moved between rooms; these manual measurements were considered the standard against which the other methods were compared. One hundred twelve patients generated a total of 252 events over the course of 6 days. The proportion of events successfully recorded by the RFID system (83.7%) was significantly greater than that obtained with the IR system (75.4%, p < 0.001). The cause of the missing events using the IR method was found to be a signal interruption between the patient tags and the check-in desk receiver. Excluding those data, the IR system successfully recorded 94.4% of events (p = 0.002; OR = 3.83 compared to the RFID system). There was no statistical difference between the IR, RFID, and manual time measurements (p > 0.05 for all comparisons). Both RFID and IR methods are effective at providing patient flow information. The custom-made RFID system was as accurate as IR and was installed at about 10% of the cost. Given its significantly lower cost, the RFID system may be an appealing option for smaller clinics with more limited budgets.
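As a hedged illustration of the capture-rate comparison above, the proportions and a simple odds ratio can be recomputed from event counts. The counts below are reconstructed from the published percentages of 252 total events; they are assumptions for illustration, not the authors' raw data.

```python
# Reconstructed (assumed) counts: 211/252 RFID events and 190/252 IR events,
# consistent with the reported 83.7% and 75.4% capture rates.
def capture_stats(captured, total):
    """Return (proportion captured, odds of capture)."""
    p = captured / total
    return p, p / (1 - p)

rfid_p, rfid_odds = capture_stats(211, 252)
ir_p, ir_odds = capture_stats(190, 252)
odds_ratio = rfid_odds / ir_odds  # odds of an RFID capture vs an IR capture
print(f"RFID {rfid_p:.1%}, IR {ir_p:.1%}, OR {odds_ratio:.2f}")
```

Note this simple odds ratio compares the two full-data capture rates; the paper's OR = 3.83 refers to the IR rate after excluding the check-in desk interference, so the numbers are not expected to match.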
Final Technical Report: Intensive Quenching Technology for Heat Treating and Forging Industries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aronov, Michael A.
2005-12-21
The intensive quenching (IQ) process is an alternative way of hardening (quenching) steel parts through the use of highly agitated water followed by still air. It was developed by IQ Technologies, Inc. (IQT) of Akron, Ohio. While conventional quenching is usually performed in environmentally unfriendly oil or water/polymer solutions, the IQ process uses highly agitated, environmentally friendly water or low concentration water/mineral salt solutions. The IQ method is characterized by extremely high cooling rates of steel parts. In contrast to conventional quenching, where parts cool down to the quenchant temperature and usually have tensile or neutral residual surface stresses at the end of quenching, the IQ process is interrupted while the part core is still hot and while there are maximum compressive stresses deep in the parts, thereby providing hard, ductile, more wear resistant parts. The project goal was to advance the patented IQ process from feasibility to commercialization in the heat-treating and forging industries, to significantly reduce energy consumption and environmental impact, to increase productivity, and to enhance the economic competitiveness of these industries as well as the Steel, Metal Casting, and Mining industries. To successfully introduce the IQ technology in the U.S. metal working industry, the project team completed the following work over the course of this project: A total of 33 manufacturers of steel products provided steel parts for IQ trials. IQT conducted IQ demonstrations for 34 different steel parts. Our customers tested intensively quenched parts in actual field conditions to evaluate the product service life and performance improvement. The data obtained from the field showed the following: Service life (number of holes punched) of cold-work punches (provided by an EHT customer and made of S5 shock-resisting steel) was improved by two to eight times.
Aluminum extrusion dies provided by GAM and made of hot work H-13 steel outperformed the standard dies by at least 50%. Dies provided by an AST customer, made of plain carbon 1045 steel and used for pellet manufacturing outperformed the standard dies by more than 100%. Concrete crusher liner wear plates provided by an EHT customer and made of 1045 steel, had the same surface hardness as the plates made of more expensive, pre-hardened high alloy HARDOX-500 material supplied by a Swedish company and used currently by the EHT customer. The 1045 material intensively quenched wear plates are currently in the field. Concrete block molding machine wear plates provided by an IQT customer and made of 8620 steel were processed at the AST production IQ system using a 40% reduced carburization cycle. An effective case depth in the intensively quenched wear plates was the same as in the standard, oil quenched parts. Base keys provided by an EHT customer and made of 8620 steel were processed using a 40% reduced carburization cycle. The intensively quenched parts showed the same performance as standard parts. IQT introduced the IQ process in heat treat practices of three commercial heat-treating shops: Akron Steel Treating Co., Summit Heat Treating Co. and Euclid Heat Treating Co. CWRU conducted a material characterization study for a variety of steels to develop a database to support changing/modification of recognized standards for quenching steel parts. IQT conducted a series of IQ workshops, published seven technical papers and participated in ASM Heat Treating Society conference and exposition and in Furnace North America Show. IQT designed and built a fully automated new IQ system installed at the Center for Intensive Quenching. This system includes the following major components: a stand-alone 1,900-gallon IQ water system, a 24'' x 24'' atmosphere pit furnace, and an automated load transfer mechanism. 
IQT established a ''Center for Intensive Quenching'' at the AST facilities. The 4,000 square foot Center includes the following equipment: a high-velocity single-part quenching IQ unit, developed and built previously under the EMTEC CT-65 project and equipped with a neutral salt bath furnace and a high-temperature, electric-fired, atmosphere box furnace; the new 1,900 gallon IQ system with a 24'' x 24'' atmosphere pit furnace and a load transfer mechanism; and a shaker hearth furnace equipped with an IQ water tank and a chiller to maintain the required water temperature. Potential savings for the USA heat treating industry: full elimination or a 30% reduction of the carburization cycle would save 1,800 billion Btu of energy, reduce costs by $600,000,000, and reduce CO2 emissions by 148,000 tons; a 5% part weight reduction would save $70,000,000 in material costs and 300 billion Btu of energy.
Evaluation of Antimicrobial Stewardship-Related Alerts Using a Clinical Decision Support System.
Ghamrawi, Riane J; Kantorovich, Alexander; Bauer, Seth R; Pallotta, Andrea M; Sekeres, Jennifer K; Gordon, Steven M; Neuner, Elizabeth A
2017-11-01
Background: Information technology, including clinical decision support systems (CDSS), has an increasingly important and growing role in identifying opportunities for antimicrobial stewardship-related interventions. Objective: The aim of this study was to describe and compare types and outcomes of CDSS-built antimicrobial stewardship alerts. Methods: Fifteen alerts were evaluated in the initial antimicrobial stewardship program (ASP) review. Preimplementation, alerts were reviewed retrospectively. Postimplementation, alerts were reviewed in real-time. Data collection included total number of actionable alerts, recommendation acceptance rates, and time spent on each alert. Time to de-escalation to narrower spectrum agents was collected. Results: In total, 749 alerts were evaluated. Overall, 306 (41%) alerts were actionable (173 preimplementation, 133 postimplementation). Rates of actionable alerts were similar for custom-built and prebuilt alert types (39% [53 of 135] vs 41% [253 of 614], P = .68). In the postimplementation group, an intervention was attempted in 97% of actionable alerts and 70% of interventions were accepted. The median time spent per alert was 7 minutes (interquartile range [IQR], 5-13 minutes; 15 [12-17] minutes for actionable alerts vs 6 [5-7] minutes for nonactionable alerts, P < .001). In cases where the antimicrobial was eventually de-escalated, the median time to de-escalation was 28.8 hours (95% confidence interval [CI], 10.0-69.1 hours) preimplementation vs 4.7 hours (95% CI, 2.4-22.1 hours) postimplementation, P < .001. Conclusions: CDSS have played an important role in ASPs to help identify opportunities to optimize antimicrobial use through prebuilt and custom-built alerts. As ASP roles continue to expand, focusing time on customizing institution-specific alerts will be of vital importance to help redistribute the time needed to manage other ASP tasks and opportunities.
Low SWaP multispectral sensors using dichroic filter arrays
NASA Astrophysics Data System (ADS)
Dougherty, John; Varghese, Ron
2015-06-01
The benefits of multispectral imaging are well established in a variety of applications including remote sensing, authentication, satellite and aerial surveillance, machine vision, biomedical, and other scientific and industrial uses. However, many of the potential solutions require more compact, robust, and cost-effective cameras to realize these benefits. The next generation of multispectral sensors and cameras needs to deliver improvements in size, weight, power, portability, and spectral band customization to support widespread deployment for a variety of purpose-built aerial, unmanned, and scientific applications. A novel implementation uses micro-patterning of dichroic filters into Bayer and custom mosaics, enabling true real-time multispectral imaging with simultaneous multi-band image acquisition. Consistent with color image processing, individual spectral channels are de-mosaiced, with each channel providing an image of the field of view. This approach can be implemented across a variety of wavelength ranges and on a variety of detector types including linear, area, silicon, and InGaAs. This dichroic filter array approach can also reduce payloads and increase range for unmanned systems, with the capability to support both handheld and autonomous systems. Recent examples and results of 4 band RGB + NIR dichroic filter arrays in multispectral cameras are discussed. Benefits and tradeoffs of multispectral sensors using dichroic filter arrays are compared with alternative approaches - including their passivity, spectral range, customization options, and scalable production.
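As a rough sketch of the channel-extraction step described above: in a repeating 2x2 filter mosaic, each spectral channel is simply the subgrid of pixels lying under the corresponding dichroic tile (full de-mosaicing would then interpolate each channel to full resolution). The band-to-position mapping below is an assumption for illustration, not an actual sensor layout.

```python
# Minimal sketch: split a raw frame with a repeating 2x2 RGB+NIR mosaic
# into its four spectral sub-images (no interpolation step shown).
def split_bands(raw):
    """raw: 2D list of pixel values; returns a dict of four sub-images."""
    # Assumed positions of each band within the 2x2 mosaic cell (row, col).
    offsets = {"R": (0, 0), "G": (0, 1), "B": (1, 0), "NIR": (1, 1)}
    return {
        band: [row[dx::2] for row in raw[dy::2]]
        for band, (dy, dx) in offsets.items()
    }

# Toy 4x4 frame where each pixel value encodes its mosaic position.
raw = [[1, 2, 1, 2],
       [3, 4, 3, 4],
       [1, 2, 1, 2],
       [3, 4, 3, 4]]
bands = split_bands(raw)
# bands["NIR"] → [[4, 4], [4, 4]]
```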
A network-based training environment: a medical image processing paradigm.
Costaridou, L; Panayiotakis, G; Sakellaropoulos, P; Cavouras, D; Dimopoulos, J
1998-01-01
The capability of interactive multimedia and Internet technologies is investigated with respect to the implementation of a distance learning environment. The system is built according to a client-server architecture, based on the Internet infrastructure, composed of server nodes conceptually modelled as WWW sites. Sites are implemented by customization of available components. The environment integrates network-delivered interactive multimedia courses, network-based tutoring, SIG support, information databases of professional interest, as well as course and tutoring management. This capability has been demonstrated by means of an implemented system, validated with digital image processing content, specifically image enhancement. Image enhancement methods are theoretically described and applied to mammograms. Emphasis is given to the interactive presentation of the effects of algorithm parameters on images. The system end-user access depends on available bandwidth, so high-speed access can be achieved via LAN or local ISDN connections. Network based training offers new means of improved access and sharing of learning resources and expertise, as promising supplements in training.
Li, Ya-Pin; Gao, Hong-Wei; Fan, Hao-Jun; Wei, Wei; Xu, Bo; Dong, Wen-Long; Li, Qing-Feng; Song, Wen-Jing; Hou, Shi-Ke
2017-12-01
The objective of this study was to build a database to collect infectious disease information at the scene of a disaster, using 128 epidemiological questionnaires and 47 types of options, with rapid acquisition of infectious disease information and rapid questionnaire customization at the scene of disaster relief by use of a personal digital assistant (PDA). SQL Server 2005 (Microsoft Corp, Redmond, WA) was used to create the option database for the infectious disease investigation, to develop a client application for the PDA, and to deploy the application on the server side. The users accessed the server for data collection and questionnaire customization with the PDA. A database with a set of comprehensive options was created, and an application system was developed for the Android operating system (Google Inc, Mountain View, CA). On this basis, an infectious disease information collection system was built for use at the scene of disaster relief. The system integrates computer and mobile communication technologies to achieve infectious disease information collection and rapid questionnaire customization at the scene of disaster relief. (Disaster Med Public Health Preparedness. 2017;11:668-673).
Software synthesis using generic architectures
NASA Technical Reports Server (NTRS)
Bhansali, Sanjay
1993-01-01
A framework for synthesizing software systems based on abstracting software system designs and the design process is described. The result of such an abstraction process is a generic architecture and the process knowledge for customizing the architecture. The customization process knowledge is used to assist a designer in customizing the architecture, as opposed to completely automating the design of systems. The approach is illustrated using an implemented example of a generic tracking architecture that was customized in two different domains. How the designs produced using KASE compare to the original designs of the two systems is discussed, along with current work and plans for extending KASE to other application areas.
CRADA opportunities in pressurized combustion research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maloney, D J; Norton, T S; Casleton, K H
1995-06-01
The Morgantown Energy Technology Center recently began operation of a Low Emissions Combustor Test and Research (LECTR) Facility. This facility was built to support the development of Advanced Gas Turbine Systems (ATS) by providing test facilities and engineering support to METC customers through the ATS University-Industry Consortium and through CRADA participation with industrial partners.
NASA Technical Reports Server (NTRS)
Rincon, Rafael F.
2008-01-01
The reconfigurable L-Band radar is an ongoing development at NASA/GSFC that exploits the capability inherent in phased array radar systems, combined with a state-of-the-art data acquisition and real-time processor, to enable multi-mode measurement techniques in a single radar architecture. The development leverages the L-Band Imaging Scatterometer, a radar system designed for the development and testing of new radar techniques, and the custom-built DBSAR processor, a highly reconfigurable, high speed data acquisition and processing system. The radar modes currently implemented include scatterometer, synthetic aperture radar, and altimetry; plans to add new modes such as radiometry and bi-static GNSS signals are being formulated. This development is aimed at enhancing the radar remote sensing capabilities for airborne and spaceborne applications in support of Earth Science and planetary exploration. This paper describes the design of the radar and processor systems, explains the operational modes, and discusses preliminary measurements and future plans.
NASA Astrophysics Data System (ADS)
Dayton, M.; Datte, P.; Carpenter, A.; Eckart, M.; Manuel, A.; Khater, H.; Hargrove, D.; Bell, P.
2017-08-01
The National Ignition Facility's (NIF) harsh radiation environment can cause electronics to malfunction during high-yield DT shots. Until now there has been little experience fielding electronic-based cameras in the target chamber under these conditions; hence, the performance of electronic components in NIF's radiation environment was unknown. It is possible to purchase radiation tolerant devices; however, they are usually qualified for radiation environments different from NIF's, such as space flight or nuclear reactors. This paper presents the results from a series of online experiments that used two different prototype camera systems built from non-radiation-hardened components and one commercially available camera that permanently failed at a relatively low total integrated dose. The custom design built in Livermore endured a 5 × 10¹⁵ neutron shot without upset, while the other custom design upset at 2 × 10¹⁴ neutrons. These results agreed with offline testing done with a flash x-ray source and a 14 MeV neutron source, which suggested a methodology for developing and qualifying electronic systems for NIF. Further work will likely lead to the use of embedded electronic systems in the target chamber during high-yield shots.
A Study of Novice Systems Analysis Problem Solving Behaviors Using Protocol Analysis
1992-09-01
conducted. Each subject was given the same task to perform. The task involved a case study (Appendix B) of a utility company’s customer order processing system...behavior (Ramesh, 1989). The task was to design a customer order processing system that utilized a centralized telephone answering service center...of the utility company’s customer order processing system that was developed based on information obtained by a large systems consulting firm during
Precision engineering for future propulsion and power systems: a perspective from Rolls-Royce.
Beale, Sam
2012-08-28
Rolls-Royce today is an increasingly global business, supplying integrated power systems to a wide variety of customers for use on land, at sea and in the air. Its reputation for 'delivering excellence' to these customers has been built largely on its gas turbine technology portfolio, and this reputation relies on the quality of the company's expertise in design, manufacture and delivery of services. This paper sets out to examine a number of examples, such as the high-pressure turbine blade, of the company's reliance on precision design and manufacture, highlighting how this precision contributes to customer satisfaction with its products. A number of measures the company is taking to accelerate its competitiveness in precision manufacture are highlighted, not least its extensive relationships with the academic research base. The paper finishes by looking briefly at the demands of the company's potential future product portfolio.
An arch-shaped intraoral tongue drive system with built-in tongue-computer interfacing SoC.
Park, Hangue; Ghovanloo, Maysam
2014-11-14
We present a new arch-shaped intraoral Tongue Drive System (iTDS) designed to occupy the buccal shelf in the user's mouth. The new arch-shaped iTDS, which will be referred to as the iTDS-2, incorporates a system-on-a-chip (SoC) that amplifies and digitizes the raw magnetic sensor data and sends it wirelessly to an external TDS universal interface (TDS-UI) via an inductive coil or a planar inverted-F antenna. A built-in transmitter (Tx) employs a dual-band radio that operates at either 27 MHz or 432 MHz band, according to the wireless link quality. A built-in super-regenerative receiver (SR-Rx) monitors the wireless link quality and switches the band if the link quality is below a predetermined threshold. An accompanying ultra-low power FPGA generates data packets for the Tx and handles digital control functions. The custom-designed TDS-UI receives raw magnetic sensor data from the iTDS-2, recognizes the intended user commands by the sensor signal processing (SSP) algorithm running in a smartphone, and delivers the classified commands to the target devices, such as a personal computer or a powered wheelchair. We evaluated the iTDS-2 prototype using center-out and maze navigation tasks on two human subjects, which proved its functionality. The subjects' performance with the iTDS-2 was improved by 22% over its predecessor, reported in our earlier publication.
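The dual-band fallback described above can be sketched as a simple decision rule: stay on the current band while the monitored link quality is acceptable, otherwise switch to the other band. The quality metric and threshold value here are assumptions for illustration; the paper only states that switching occurs below a predetermined threshold.

```python
# Hedged sketch of the iTDS-2 dual-band fallback: a 27 MHz / 432 MHz radio
# with a link-quality-driven band switch. Threshold and quality scale
# (0.0-1.0) are hypothetical, not from the paper.
BANDS_MHZ = (27, 432)

def next_band(current_mhz, link_quality, threshold=0.5):
    """Return the band to use: keep the current band if quality is
    acceptable, otherwise switch to the other band."""
    if link_quality >= threshold:
        return current_mhz
    return BANDS_MHZ[1] if current_mhz == BANDS_MHZ[0] else BANDS_MHZ[0]

print(next_band(27, 0.8))  # good link: stay on 27 MHz
print(next_band(27, 0.2))  # degraded link: switch to 432 MHz
```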
19 CFR 151.30 - Sugar closets.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 19 Customs Duties 2 2010-04-01 2010-04-01 false Sugar closets. 151.30 Section 151.30 Customs... (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Sugars, Sirups, and Molasses § 151.30 Sugar closets. Sugar closets for samples shall be substantially built and secured by locks furnished by Customs...
19 CFR 151.30 - Sugar closets.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 19 Customs Duties 2 2011-04-01 2011-04-01 false Sugar closets. 151.30 Section 151.30 Customs... (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Sugars, Sirups, and Molasses § 151.30 Sugar closets. Sugar closets for samples shall be substantially built and secured by locks furnished by Customs...
19 CFR 151.30 - Sugar closets.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 19 Customs Duties 2 2014-04-01 2014-04-01 false Sugar closets. 151.30 Section 151.30 Customs... (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Sugars, Sirups, and Molasses § 151.30 Sugar closets. Sugar closets for samples shall be substantially built and secured by locks furnished by Customs...
19 CFR 151.30 - Sugar closets.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 19 Customs Duties 2 2013-04-01 2013-04-01 false Sugar closets. 151.30 Section 151.30 Customs... (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Sugars, Sirups, and Molasses § 151.30 Sugar closets. Sugar closets for samples shall be substantially built and secured by locks furnished by Customs...
19 CFR 151.30 - Sugar closets.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 19 Customs Duties 2 2012-04-01 2012-04-01 false Sugar closets. 151.30 Section 151.30 Customs... (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Sugars, Sirups, and Molasses § 151.30 Sugar closets. Sugar closets for samples shall be substantially built and secured by locks furnished by Customs...
Ultra Small Aperture Terminal: System Design and Test Results
NASA Technical Reports Server (NTRS)
Sohn, Philip Y.; Reinhart, Richard C.
1996-01-01
The Ultra Small Aperture Terminal (USAT) has been developed to test and demonstrate remote and broadcast satcom applications via the Advanced Communications Technology Satellite (ACTS). The design of these ground stations emphasizes small size, low power consumption, and portable, rugged terminals. Each ground station includes several custom-designed parts, such as a 35 cm diameter antenna, a 1/4 Watt transmitter with built-in upconverter, and a 4.0 dB Noise Figure (NF) receiver with built-in downconverter. In addition, state-of-the-art commercial parts such as highly stable ovenized crystal oscillators and dielectric resonator oscillators are used in the ground station design. Presented in this paper are a system-level design description, performance, and sample applications.
The effect of requirements prioritization on avionics system conceptual design
NASA Astrophysics Data System (ADS)
Lorentz, John
This dissertation will provide a detailed approach and analysis of a new collaborative requirements prioritization methodology that has been used successfully on four Coast Guard avionics acquisition and development programs valued at $400M+. A statistical representation of participant study results will be discussed and analyzed in detail. Many technically compliant projects fail to deliver levels of performance and capability that the customer desires. Some of these systems completely meet "threshold" levels of performance; however, the distribution of resources in the process devoted to the development and management of the requirements does not always represent the voice of the customer. This is especially true for technically complex projects such as modern avionics systems. A simplified facilitated process for prioritization of system requirements will be described. The collaborative prioritization process, and resulting artifacts, aids the systems engineer during early conceptual design. All requirements are not the same in terms of customer priority. While there is a tendency to have many thresholds inside of a system design, there is usually a subset of requirements and system performance that is of the utmost importance to the design. These critical capabilities and critical levels of performance typically represent the reason the system is being built. The systems engineer needs processes to identify these critical capabilities, the associated desired levels of performance, and the risks associated with the specific requirements that define the critical capability. The facilitated prioritization exercise is designed to collaboratively draw out these critical capabilities and levels of performance so they can be emphasized in system design. Developing the purpose, scheduling and process for prioritization events are key elements of systems engineering and modern project management. 
The benefits of early collaborative prioritization flow throughout the project schedule, resulting in greater success during system deployment and operational testing. This dissertation will discuss the data and findings from participant studies, present a literature review of systems engineering and design processes, and test the hypothesis that the prioritization process had no effect on stakeholder sentiment related to the conceptual design. In addition, the "Requirements Rationalization" process will be discussed in detail. Avionics, like many other systems, has transitioned from a discrete electronics engineering, hard engineering discipline to incorporate software engineering as a core process of the technology development cycle. As with other software-based systems, avionics now has significant soft system attributes that must be considered in the design process. The boundless opportunities that exist in software design demand prioritization to focus effort onto the critical functions that the software must provide. This has been a well documented and understood phenomenon in the software development community for many years. This dissertation will attempt to link the effect of software integrated avionics to the benefits of prioritization of requirements in the problem space and demonstrate the sociological and technical benefits of early prioritization practices.
2013-05-23
This monograph borrows from multiple disciplines to argue for an organizational shift from process reengineering to system design to improve...government customer-service delivery. Specifically, the monograph proposes a transformation in claims processing within the Veterans Benefits Administration...required. The proposed system design is an attempt to place the disability claims process within a larger environment encompassing multiple dimensions of customers.
Realtime system for GLAS on WHT
NASA Astrophysics Data System (ADS)
Skvarč, Jure; Tulloch, Simon; Myers, Richard M.
2006-06-01
The new ground layer adaptive optics system (GLAS) on the William Herschel Telescope (WHT) on La Palma will be based on the existing natural guide star adaptive optics system called NAOMI. Part of the new development is a new control system for the tip-tilt mirror. Instead of the existing system, built around a custom-built multiprocessor computer made of C40 DSPs, this system uses an ordinary PC running the Linux operating system. It is equipped with a high sensitivity L3 CCD camera with an effective readout noise of nearly zero. The software for the tip-tilt system is being completely redeveloped in order to make use of object-oriented design, which should facilitate easier integration with the rest of the observing system at the WHT. The modular design of the system allows incorporation of different centroiding and loop control methods. To test the system off-sky, we have built a laboratory bench using an artificial light source and a tip-tilt mirror. We present results on tip-tilt correction quality using different centroiding algorithms and different control loop methods at different light levels. This system will serve as a testing ground for a transition to a completely PC-based real-time control system.
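As a generic illustration of one of the simplest centroiding algorithms such a tip-tilt loop might incorporate (the actual GLAS/NAOMI algorithms are not specified in the abstract, so this is a sketch, not their implementation), an intensity-weighted center-of-mass centroid of a guide-star spot can be computed as:

```python
# Center-of-mass centroid of a 2D intensity image: the spot position fed to
# the tip-tilt loop is the intensity-weighted mean of pixel coordinates.
def centroid(image):
    """image: 2D list of pixel intensities; returns (x, y) centroid."""
    total = sum(sum(row) for row in image)
    cx = sum(x * v for row in image for x, v in enumerate(row)) / total
    cy = sum(y * sum(row) for y, row in enumerate(image)) / total
    return cx, cy

# A symmetric toy spot centered on the middle pixel.
spot = [[0, 1, 0],
        [1, 4, 1],
        [0, 1, 0]]
print(centroid(spot))  # → (1.0, 1.0)
```

At low light levels this estimator is biased by readout and background noise, which is one reason alternative centroiding algorithms (e.g. thresholded or windowed variants) are compared in such systems.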
The design, status and performance of the ZEUS central tracking detector electronics
NASA Astrophysics Data System (ADS)
Cussans, D. G.; Fawcett, H. F.; Foster, B.; Gilmore, R. S.; Heath, G. P.; Llewellyn, T. J.; Malos, J.; Morgado, C. J. S.; Tapper, R. J.; Gingrich, D. M.; Harnew, N.; Hallam-Baker, P.; Nash, J.; Khatri, T.; Shield, P. D.; McArthur, I.; Topp-Jorgensen, S.; Wilson, F. F.; Allen, D.; Baird, S. A.; Carter, R.; Galagardera, S.; Gibson, M. D.; Hatley, R. S.; Jeffs, M.; Milborrow, R.; Morissey, M.; Quinton, S. P. H.; White, D. J.; Lane, J.; Nixon, G.; Postranecky, M.; Jamdagni, A. K.; Marcou, C.; Miller, D. B.; Toudup, L.
1992-05-01
The readout system developed for the ZEUS central tracking detector (CTD) is described. The CTD is required to provide an accurate measurement of the sagitta and energy loss of charged particles as well as fast trigger information. This must be carried out in the HERA environment, in which beams cross every 96 ns. The first two aims are achieved by digitizing chamber pulses using a pipelined 104 MHz FADC system. The trigger uses a fast determination of the difference in the arrival times of a pulse at each end of the CTD. It processes these data and gives information to the ZEUS global first level trigger. The modules are housed in custom-built racks and crates and read out using a DAQ system based on Transputer readout controllers. These also monitor data quality and produce data for the ZEUS second level trigger.
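The time-difference trigger described above rests on a simple relation: a pulse propagates from the hit point to both wire ends, so the arrival-time difference is proportional to the hit position along the wire. A minimal sketch of that relation, with an assumed propagation speed rather than the ZEUS calibration:

```python
def z_from_time_difference(t_plus_ns, t_minus_ns, v_mm_per_ns=250.0):
    """Estimate the z coordinate (mm, from the chamber midpoint) of a hit
    from the arrival-time difference of its pulse at the two wire ends.

    v_mm_per_ns is the signal propagation speed along the wire; the value
    here is an illustrative assumption, not the detector's calibration.
    """
    return 0.5 * v_mm_per_ns * (t_plus_ns - t_minus_ns)

# A pulse arriving 2 ns earlier at the +z end originated at z = +250 mm:
print(z_from_time_difference(0.0, 2.0))  # -> -250.0
print(z_from_time_difference(2.0, 0.0))  # -> 250.0
```

The factor of one half accounts for the difference being twice the offset from the midpoint, since one path lengthens exactly as the other shortens.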
NASA Astrophysics Data System (ADS)
Huang, Jinxin; Yuan, Qun; Tankam, Patrice; Clarkson, Eric; Kupinski, Matthew; Hindman, Holly B.; Aquavella, James V.; Rolland, Jannick P.
2015-03-01
In biophotonics imaging, one important and quantitative task is layer-thickness estimation. In this study, we investigate the approach of combining optical coherence tomography and a maximum-likelihood (ML) estimator for layer thickness estimation in the context of tear film imaging. The motivation of this study is to extend our understanding of tear film dynamics, which is the prerequisite to advance the management of Dry Eye Disease, through the simultaneous estimation of the thickness of the tear film lipid and aqueous layers. The estimator takes into account the different statistical processes associated with the imaging chain. We theoretically investigated the impact of key system parameters, such as the axial point spread functions (PSF) and various sources of noise on measurement uncertainty. Simulations show that an OCT system with a 1 μm axial PSF (FWHM) allows unbiased estimates down to nanometers with nanometer precision. In implementation, we built a customized Fourier domain OCT system that operates in the 600 to 1000 nm spectral window and achieves 0.93 micron axial PSF in corneal epithelium. We then validated the theoretical framework with physical phantoms made of custom optical coatings, with layer thicknesses from tens of nanometers to microns. Results demonstrate unbiased nanometer-class thickness estimates in three different physical phantoms.
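The estimator above accounts for the full statistics of the imaging chain; under the simplifying assumption of additive Gaussian noise, maximum-likelihood estimation reduces to least squares against a forward model. The sketch below illustrates that reduced form with a toy Gaussian-PSF forward model and an exhaustive search over candidate thicknesses; all numbers are illustrative assumptions, not the paper's system model:

```python
import numpy as np

def ml_thickness(measured, forward_model, candidates):
    """Maximum-likelihood thickness estimate by exhaustive search.

    Under additive Gaussian noise, maximizing the likelihood is equivalent
    to minimizing the sum of squared residuals between the measured signal
    and the forward model evaluated at each candidate thickness.
    `forward_model(d)` must return a signal the same length as `measured`.
    """
    sse = [np.sum((measured - forward_model(d)) ** 2) for d in candidates]
    return candidates[int(np.argmin(sse))]

# Toy forward model: a Gaussian axial PSF centered at the layer interface.
z = np.linspace(0.0, 10.0, 501)  # depth axis, microns
psf = lambda d: np.exp(-((z - d) ** 2) / (2 * 0.4 ** 2))

rng = np.random.default_rng(0)
truth = 3.2
noisy = psf(truth) + 0.05 * rng.standard_normal(z.size)
grid = np.arange(0.0, 10.0, 0.01)
print(ml_thickness(noisy, psf, grid))  # close to 3.2
```

In the paper's setting the forward model and noise statistics are considerably richer (shot noise, spectrometer response), but the estimator's structure, comparing measured data against a parameterized model, is the same.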
Integrated Component-based Data Acquisition Systems for Aerospace Test Facilities
NASA Technical Reports Server (NTRS)
Ross, Richard W.
2001-01-01
The Multi-Instrument Integrated Data Acquisition System (MIIDAS), developed by the NASA Langley Research Center, uses commercial off-the-shelf (COTS) products, integrated with custom software, to provide a broad range of capabilities at a low cost throughout the system's entire life cycle. MIIDAS combines data acquisition capabilities with online and post-test data reduction computations. COTS products lower purchase and maintenance costs by reducing the level of effort required to meet system requirements. Object-oriented methods are used to enhance modularity, encourage reusability, and promote adaptability, reducing software development costs. Using only COTS products and custom software supported on multiple platforms reduces the cost of porting the system to other platforms. The post-test data reduction capabilities of MIIDAS have been installed at four aerospace testing facilities at NASA Langley Research Center. The systems installed at these facilities provide a common user interface, reducing the training time required for personnel who work across multiple facilities. The techniques employed by MIIDAS enable NASA to build a system with a lower initial purchase price and reduced sustaining maintenance costs. With MIIDAS, NASA has built a highly flexible next-generation data acquisition and reduction system for aerospace test facilities that meets customer expectations.
Nelson, E C; Caldwell, C; Quinn, D; Rose, R
1991-03-01
Customer knowledge is an essential feature of hospitalwide quality improvement. All systems and processes have customers. The aim is to use customer knowledge and voice of the customer measurement to plan, design, improve, and monitor these systems and processes continuously. In this way, the hospital stands the best chance of meeting customers' needs and, hopefully, delivering services that are so outstanding that customers will be surprised and delighted. There are many methods, both soft and hard, that can be used to increase customer knowledge. One useful strategy is to use a family of quality measures that reflect the voice of the customer. These measures can generate practical and powerful customer knowledge information that is essential to performing strategic planning, deploying quality policy, designing new services, finding targets for improvements, and monitoring those continuous improvements based on customers' judgments.
V&V Plan for FPGA-based ESF-CCS Using System Engineering Approach.
NASA Astrophysics Data System (ADS)
Maerani, Restu; Mayaka, Joyce; El Akrat, Mohamed; Cheon, Jung Jae
2018-02-01
Instrumentation and Control (I&C) systems play an important role in maintaining the safety of Nuclear Power Plant (NPP) operation. However, most current I&C safety systems are based on Programmable Logic Controller (PLC) hardware, which is difficult to verify and validate and is susceptible to software common cause failure. Therefore, a plan is needed for the replacement of PLC-based safety systems, such as the Engineered Safety Feature - Component Control System (ESF-CCS), with Field Programmable Gate Arrays (FPGAs). By using a systems engineering approach, which ensures traceability in every phase of the life cycle, from system requirements through design implementation to verification and validation, the system development is guaranteed to be in line with regulatory requirements. The verification process will ensure that the customer's and stakeholders' needs are satisfied in a high-quality, trustworthy, cost-efficient, and schedule-compliant manner throughout the system's entire life cycle. The benefit of the V&V plan is to ensure that the FPGA-based ESF-CCS is built correctly, and that performance-indicator measurements provide positive feedback that "we are doing the right thing" during the re-engineering of the FPGA-based ESF-CCS.
Fischmeister, Florian Ph.S.; Leodolter, Ulrich; Windischberger, Christian; Kasess, Christian H.; Schöpf, Veronika; Moser, Ewald; Bauer, Herbert
2010-01-01
Throughout recent years there has been an increasing interest in studying unconscious visual processes. Such conditions of unawareness are typically achieved either by a sufficient reduction of the stimulus presentation time or by visual masking. However, there are growing concerns about the reliability of the presentation devices used. As all these devices show great variability in presentation parameters, the processing of visual stimuli becomes dependent on the display device; e.g., minimal changes in the physical stimulus properties may have an enormous impact on stimulus processing by the sensory system and on the actual experience of the stimulus. Here we present a custom-built three-way LC-shutter tachistoscope which allows experimental setups with both precise and reliable stimulus delivery and millisecond resolution. This tachistoscope consists of three LCD projectors equipped with zoom lenses to enable stimulus presentation via a built-in mirror system onto a back-projection screen from an adjacent room. Two high-speed liquid crystal shutters are mounted serially in front of each projector to control the stimulus duration. To verify the intended properties empirically, different sequences of presentation times were performed while changes in optical power were measured using a photoreceiver. The obtained results demonstrate that interfering variabilities in stimulus parameters and stimulus rendering are markedly reduced. Together with the possibility to collect external signals and to send trigger signals to other devices, this tachistoscope represents a highly flexible and easy-to-set-up research tool not only for the study of unconscious processing in the brain but for vision research in general. PMID:20122963
Multi-spectral confocal microendoscope for in-vivo imaging
NASA Astrophysics Data System (ADS)
Rouse, Andrew Robert
The concept of in-vivo multi-spectral confocal microscopy is introduced. A slit-scanning multi-spectral confocal microendoscope (MCME) was built to demonstrate the technique. The MCME employs a flexible fiber-optic catheter coupled to a custom-built slit-scan confocal microscope fitted with a custom-built imaging spectrometer. The catheter consists of a fiber-optic imaging bundle linked to a miniature objective and focus assembly. The design and performance of the miniature objective and focus assembly are discussed. The 3 mm diameter catheter may be used on its own or routed through the instrument channel of a commercial endoscope. The confocal nature of the system provides optical sectioning with 3 μm lateral resolution and 30 μm axial resolution. The prism-based multi-spectral detection assembly is typically configured to collect 30 spectral samples over the visible chromatic range. The spectral sampling rate varies from 4 nm/pixel at 490 nm to 8 nm/pixel at 660 nm, and the minimum resolvable wavelength difference varies from 7 nm to 18 nm over the same spectral range. Each of these characteristics is primarily dictated by the dispersive power of the prism. The MCME is designed to examine cellular structures during optical biopsy and to exploit the diagnostic information contained within the spectral domain. The primary applications for the system include diagnosis of disease in the gastro-intestinal tract and female reproductive system. Recent data from the grayscale imaging mode are presented. Preliminary multi-spectral results from phantoms, cell cultures, and excised human tissue are presented to demonstrate the potential of in-vivo multi-spectral imaging.
Progress on development of SPIDER diagnostics
NASA Astrophysics Data System (ADS)
Pasqualotto, R.; Agostini, M.; Barbisan, M.; Bernardi, M.; Brombin, M.; Cavazzana, R.; Croci, G.; Palma, M. Dalla; Delogu, R. S.; Gorini, G.; Lotto, L.; Muraro, A.; Peruzzo, S.; Pimazzoni, A.; Pomaro, N.; Rizzolo, A.; Serianni, G.; Spolaore, M.; Tardocchi, M.; Zaniol, B.; Zaupa, M.
2017-08-01
The SPIDER experiment, the full-size prototype of the beam source for the ITER heating neutral beam injector, has to demonstrate extraction and acceleration to 100 kV of a large negative ion hydrogen or deuterium beam with a co-extracted electron fraction e-/D- < 1 and beam uniformity within 10%, for beam pulses of up to one hour. The main RF source plasma and beam parameters are measured with different complementary techniques to exploit the combination of their specific features. While the SPIDER plant systems are being installed, the different diagnostic systems are in the procurement phase. Their final design is described here with a focus on some key solutions and the most original and cost-effective implementations. Thermocouples used to measure the power load distribution in the source and over the beam dump front surface will be efficiently fixed with a proven technique and acquired through commercial and custom electronics. Spectroscopy needs to use well-collimated lines of sight and will employ novel-design spectrometers with higher efficiency and resolution, and filtered detectors with custom-built amplifiers. The electrostatic probes will be operated through electronics specifically developed to cope with the challenging environment of the RF source. The instrumented calorimeter STRIKE will use new CFC tiles, still under development. Two linear cameras, one built in house, have been tested as suitable for optical beam tomography. Some diagnostic components are off the shelf, others are custom developed: some of these are being prototyped or are under test before final production and installation, which will be completed before the start of SPIDER operation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aldrich, Robb; Butterfield, Karla
Kaplan Thompson Architects (KTA) has specialized in sustainable, energy-efficient buildings, and they have designed several custom, zero-energy homes in New England. These zero-energy projects have generally been high-end, custom homes with budgets that could accommodate advanced energy systems. In an attempt to make zero-energy homes more affordable and accessible to a larger demographic, KTA explored modular construction as a way to provide high-quality homes at lower costs. In mid-2013, KTA formalized this concept when they launched BrightBuilt Home (BBH). The BBH mission is to offer a line of architect-designed, high-performance homes that are priced to offer substantial savings off the lifetime cost of a typical home and can be delivered in less time. For the past two years, CARB has worked with BBH and Keiser Homes (the primary modular manufacturer for BBH) to discuss challenges related to wall systems, HVAC, and quality control. In spring of 2014, CARB and BBH began looking in detail at a home to be built in Lincolnville, ME by Black Bros. Builders. This report details the solution package specified for this modular plan and the challenges that arose during the project.
Process for Managing and Customizing HPC Operating Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, David ML
2014-04-02
A process for maintaining a custom HPC operating system was developed at the Environmental Molecular Sciences Laboratory (EMSL) over the past ten years. The process is generic and flexible: it manages continuous change, keeps systems updated, and manages communication through well-defined pieces of software.
Konrad, Peter E.; Neimat, Joseph S.; Yu, Hong; Kao, Chris C.; Remple, Michael S.; D'Haese, Pierre-François; Dawant, Benoit M.
2011-01-01
Background The microTargeting™ platform (MTP) stereotaxy system (FHC Inc., Bowdoin, Me., USA) was FDA-approved in 2001; it utilizes rapid-prototyping technology to create custom platforms for human stereotaxy procedures. It has also been called the STarFix (surgical targeting fixture) system, since it is based on the concept of a patient- and procedure-specific surgical fixture. This is an alternative stereotactic method by which planned trajectories are incorporated into custom-built, miniature stereotactic platforms mounted onto bone fiducial markers. Our goal is to report the clinical experience with this system over a 6-year period. Methods We present the largest reported series of patients who underwent deep brain stimulation (DBS) implantations using customized rapidly prototyped stereotactic frames (MTP). Clinical experience and technical features for the use of this stereotactic system are described. Final lead location analysis using postoperative CT was performed to measure the clinical accuracy of the stereotactic system. Results Our series included 263 patients who underwent 284 DBS implantation surgeries at one institution over a 6-year period. The clinical targeting error in this series, without accounting for brain shift, was found to be 1.99 mm (SD 0.9). Operating room time was reduced by 2 h per case through earlier incision times. Conclusion Customized, miniature stereotactic frames, namely STarFix platforms, are an acceptable and efficient alternative method for DBS implantation. Their clinical accuracy and outcomes are comparable to those associated with traditional stereotactic frame systems. PMID:21160241
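The clinical targeting error reported above is, in essence, the Euclidean distance between the planned stereotactic target and the final lead position localized on postoperative CT. A minimal sketch with illustrative coordinates (not patient data):

```python
import math

def targeting_error(planned, actual):
    """Euclidean distance (mm) between a planned stereotactic target and
    the measured final lead position, both in the same coordinate frame."""
    return math.dist(planned, actual)

# Planned target vs. lead tip localized on post-op CT (illustrative values, mm):
print(targeting_error((12.0, -3.0, 4.0), (13.0, -2.0, 4.5)))  # -> 1.5
```

In practice both points must first be registered into a common frame (e.g. via the bone fiducial markers), which is where most of the methodological care goes; the distance itself is the easy part.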
NASA Astrophysics Data System (ADS)
Faizah, Arbiati; Syafei, Wahyul Amien; Isnanto, R. Rizal
2018-02-01
This research proposed a model combining Total Quality Management (TQM) and a fuzzy Service Quality (SERVQUAL) method to assess service quality. TQM was implemented as quality management oriented toward customer satisfaction and involving all stakeholders. The SERVQUAL model was used to measure service quality along five dimensions: tangibles, reliability, responsiveness, assurance, and empathy. Fuzzy set theory was used to accommodate the subjectivity and ambiguity of quality assessment. The input data consisted of indicator data and quality-assessment aspects, which were processed into service quality assessment questionnaires for the Pesantren using the fuzzy method to obtain service quality scores. This process consisted of the following steps: entering dimension and questionnaire data into the database system; filling in questionnaires through the system; having the system calculate fuzzification, defuzzification, and the gap between the quality expected and received by service recipients; and computing a rating for each dimension showing the priority for quality refinement. The rating of each quality dimension was then displayed on a dashboard to enable users to see the information. From the system that was built, the tangible dimension was found to have the largest gap, -0.399, so it should be prioritized for prompt evaluation and refinement.
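The gap computation at the heart of the fuzzy SERVQUAL step can be sketched with triangular fuzzy numbers and centroid defuzzification. The dimension names, ratings, and membership shapes below are illustrative assumptions, not the study's data:

```python
def defuzzify(tfn):
    """Centroid of a triangular fuzzy number (a, b, c)."""
    a, b, c = tfn
    return (a + b + c) / 3.0

def servqual_gap(perceptions, expectations):
    """Per-dimension SERVQUAL gap: defuzzified perception minus
    defuzzified expectation. A negative gap marks a dimension whose
    service falls short of what respondents expect."""
    return {dim: defuzzify(perceptions[dim]) - defuzzify(expectations[dim])
            for dim in perceptions}

# Illustrative triangular ratings on a 1-5 linguistic scale:
perceived = {"tangible": (2, 3, 4), "reliability": (3, 4, 5)}
expected  = {"tangible": (3, 4, 5), "reliability": (3, 4, 5)}
print(servqual_gap(perceived, expected))
# -> {'tangible': -1.0, 'reliability': 0.0}
```

Ranking dimensions by their (most negative) gap then yields the refinement priority order that the dashboard displays.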
Distributed data collection for a database of radiological image interpretations
NASA Astrophysics Data System (ADS)
Long, L. Rodney; Ostchega, Yechiam; Goh, Gin-Hua; Thoma, George R.
1997-01-01
The National Library of Medicine, in collaboration with the National Center for Health Statistics and the National Institute for Arthritis and Musculoskeletal and Skin Diseases, has built a system for collecting radiological interpretations for a large set of x-ray images acquired as part of the data gathered in the second National Health and Nutrition Examination Survey. This system is capable of delivering 5- and 10-megabyte x-ray images across the Internet to Sun workstations equipped with X Window-based 2048 × 2560 image displays, for the purpose of having these images interpreted for the degree of presence of particular osteoarthritic conditions in the cervical and lumbar spines. The collected interpretations can then be stored in a database at the National Library of Medicine, under control of the Illustra DBMS. This system is a client/server database application which integrates (1) distributed server processing of client requests, (2) a customized image transmission method for faster Internet data delivery, (3) distributed client workstations with high-resolution displays, image processing functions, and an on-line digital atlas, and (4) relational database management of the collected data.
Investigating electron spin resonance spectroscopy of a spin-½ compound in a home-built spectrometer
NASA Astrophysics Data System (ADS)
Sarkar, Jit; Roy, Subhadip; Singh, Jitendra Kumar; Singh, Sourabh; Chakraborty, Tanmoy; Mitra, Chiranjib
2018-05-01
In this work we report electron spin resonance (ESR) measurements performed on NH4CuPO4·H2O, a Heisenberg spin-½ dimer compound. We carried out the experiments both at room temperature and at 78 K, temperatures well above the antiferromagnetic ordering temperature of the system, where the paramagnetic spins have a dominant role in determining its magnetic behavior. We performed the measurements in a home-built, custom-designed continuous-wave electron spin resonance (CW-ESR) spectrometer. By analyzing the experimental data, we were able to quantify the Landé g-factor and the ESR line-width of the sample.
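Extracting the Landé g-factor from a CW-ESR spectrum reduces to the resonance condition hν = g·μB·B at the observed resonance field. A sketch with illustrative X-band numbers (an assumption, not the paper's data):

```python
# Physical constants (CODATA values)
H = 6.62607015e-34       # Planck constant, J*s
MU_B = 9.2740100783e-24  # Bohr magneton, J/T

def lande_g(freq_hz, b_res_tesla):
    """Lande g-factor from the ESR resonance condition h*f = g*mu_B*B."""
    return H * freq_hz / (MU_B * b_res_tesla)

# Illustrative X-band numbers: a 9.4 GHz microwave field resonating near
# 0.336 T gives g close to the free-electron value of ~2:
print(round(lande_g(9.4e9, 0.3355), 3))  # -> 2.002
```

For Cu(II) compounds the measured g typically deviates from 2.0023 because of spin-orbit coupling, which is part of what such a measurement quantifies.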
Creating value-focused healthcare delivery systems: Part three--Core competencies.
Beveridge, R N
1997-01-01
Value is created through the delivery of high-quality, cost-effective healthcare services. The ability to create value from the providers' perspective is facilitated through the development and implementation of essential, customer-focused core competencies. These core competencies include customer relationship management, payer/provider relationship management, disease management, outcomes management, financial/cost management, and information management. Customer relationship management is the foundation upon which all core competencies must be built. All of the core competencies must focus on the needs of the customers, both internal and external. Structuring all processes involved in the core competencies from the perspective of the customer will ensure that value is created throughout the system. Payer/provider relationship management will become a crucial pillar for healthcare providers in the future. As more vertical integration among providers occurs, the management of the relationships among providers and with payers will become more important. Many of the integration strategies being implemented across the country involve the integration of hospitals, physicians, and payers to form accountable health plans. The relationships must be organized to form "win/win" situations, where all parties are focused on a shared vision of creating value and none of the parties benefits at the expense of the others. Creating value through disease management requires that we begin examining the disease process along the entire continuum. Not only must providers be able to provide high-quality acute and chronic care, but they must also begin to focus more heavily on programs of prevention. Value is created throughout the system through reducing the prevalence and incidence of disease. Only through managing the full continuum of health will value be created throughout the healthcare delivery system.
Outcomes management ensures that the outcomes are the highest quality at a cost-effective price. Outcomes must not only be compared to best practices, but to what is possible. Providers must constantly strive to enhance the quality of the services. Financial/cost management ensures that care is cost-effective and that a marginal profit is maintained to allow continued investment in new technology and continuing medical education to enhance the quality of care and lifestyles for all stakeholders. Information management is the binding element, or keystone, in providing value-focused care. Through the collection, storing, transfer, manipulation, sorting, and reporting of data, more effective decision-making can occur. Integrated MIS allows information to be generated about the cost-effectiveness of treatment regimens, employee productivity, physician cost-effectiveness, supply utilization, and clinical outcomes, as well as patient information to be readily available throughout the healthcare system. Having this information available will allow providers to become more cost-effective in the delivery of care, which results in perceived higher value for the services. Customers demand value. Value is created by meeting the needs and demands of the customers through the delivery of cost-effective, high-quality healthcare services that are easily accessible and meet with high patient satisfaction. Providers who can demonstrate their ability to provide the services in this manner will create a competitive advantage in the marketplace and will be perceived as the value provider of choice by loyal customers.
Klingvall Ek, Rebecca; Hong, Jaan; Thor, Andreas; Bäckström, Mikael; Rännar, Lars-Erik
This study aimed to evaluate how as-built electron beam melting (EBM) surface properties affect the onset of blood coagulation. The properties of EBM-manufactured implant surfaces for placement have, until now, remained largely unexplored in the literature. Implants with conventional designs and custom-made implants have been manufactured using EBM technology and later placed into the human body. Many of the conventional implants used today, such as dental implants, display modified surfaces to optimize bone ingrowth, whereas custom-made implants, by and large, have machined surfaces. However, titanium in itself demonstrates good material properties for the purpose of bone ingrowth. Specimens manufactured using EBM were selected according to their surface roughness and process parameters. EBM-produced specimens, conventional machined titanium surfaces, and PVC surfaces for control were evaluated using the slide chamber model. A significant increase in activation was found, in all factors evaluated, between the machined samples and the EBM-manufactured samples. The results show that EBM-manufactured implants with as-built surfaces augment thrombogenic properties. EBM using Ti6Al4V powder appears to be a good manufacturing solution for load-bearing implants with bone anchorage. The as-built surfaces can be used "as is" for direct bone contact, although any surface treatment available for conventional implants can be performed on EBM-manufactured implants with a conventional design.
High-performance electronics for time-of-flight PET systems
NASA Astrophysics Data System (ADS)
Choong, W.-S.; Peng, Q.; Vu, C. Q.; Turko, B. T.; Moses, W. W.
2013-01-01
We have designed and built a high-performance readout electronics system for time-of-flight positron emission tomography (TOF PET) cameras. The electronics architecture is based on the electronics for a commercial whole-body PET camera (Siemens/CPS Cardinal electronics), modified to improve the timing performance. The fundamental contributions in the electronics that can limit the timing resolution include the constant fraction discriminator (CFD), which converts the analog electrical signal from the photo-detector to a digital signal whose leading edge is time-correlated with the input signal, and the time-to-digital converter (TDC), which provides a time stamp for the CFD output. Coincident events are identified by digitally comparing the values of the time stamps. In the Cardinal electronics, the front-end processing electronics are performed by an Analog subsection board, which has two application-specific integrated circuits (ASICs), each servicing a PET block detector module. The ASIC has a built-in CFD and TDC. We found that a significant degradation in the timing resolution comes from the ASIC's CFD and TDC. Therefore, we have designed and built an improved Analog subsection board that replaces the ASIC's CFD and TDC with a high-performance CFD (made with discrete components) and TDC (using the CERN high-performance TDC ASIC). The improved Analog subsection board is used in a custom single-ring LSO-based TOF PET camera. The electronics system achieves a timing resolution of 60 ps FWHM. Prototype TOF detector modules are read out with the electronics system and give coincidence timing resolutions of 259 ps FWHM and 156 ps FWHM for detector modules coupled to LSO and LaBr3 crystals respectively.
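Coincidence identification by digital comparison of time stamps, as described above, can be sketched as a linear two-pointer sweep over sorted stamp lists; the coincidence window and the values below are illustrative assumptions, not the camera's parameters:

```python
def find_coincidences(stamps_a, stamps_b, window_ps=500):
    """Pair time stamps from two detector modules whose difference falls
    inside the coincidence window. Both lists must be sorted ascending;
    the two-pointer sweep then runs in linear time."""
    pairs, j = [], 0
    for i, ta in enumerate(stamps_a):
        # Skip stamps in b that are too early to pair with ta.
        while j < len(stamps_b) and stamps_b[j] < ta - window_ps:
            j += 1
        if j < len(stamps_b) and abs(stamps_b[j] - ta) <= window_ps:
            pairs.append((i, j))
    return pairs

# Time stamps in picoseconds (illustrative):
a = [1000, 5000, 9000]
b = [1200, 7400, 9100]
print(find_coincidences(a, b))  # -> [(0, 0), (2, 2)]
```

In a TOF camera the residual within-window time difference of each accepted pair is then used to constrain the annihilation point along the line of response, which is why the 60 ps electronics resolution matters.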
de Castro, Alberto; Rosales, Patricia; Marcos, Susana
2007-03-01
To measure tilt and decentration of intraocular lenses (IOLs) with Scheimpflug and Purkinje imaging systems, in physical model eyes with known amounts of tilt and decentration and in patients. Instituto de Optica Daza de Valdés, Consejo Superior de Investigaciones Científicas, Madrid, Spain. Measurements of IOL tilt and decentration were obtained using a commercial Scheimpflug system (Pentacam, Oculus), custom algorithms, and a custom-built Purkinje imaging apparatus. Twenty-five Scheimpflug images of the anterior segment of the eye were obtained at different meridians. Custom algorithms were used to process the images (correction of geometrical distortion, edge detection, and curve fitting). Intraocular lens tilt and decentration were estimated by fitting sinusoidal functions to the projections of the pupillary axis and IOL axis in each image. The Purkinje imaging system captures pupil images showing reflections of light from the anterior corneal surface and the anterior and posterior lens surfaces. Custom algorithms were used to detect the Purkinje image locations and estimate IOL tilt and decentration based on a linear system equation and computer eye models with individual biometry. Both methods were validated with a physical model eye in which IOL tilt and decentration can be set nominally. Twenty-one eyes of 12 patients with IOLs were measured with both systems. Measurements of the physical model eye showed an absolute discrepancy between nominal and measured values of 0.279 degree (Purkinje) and 0.243 degree (Scheimpflug) for tilt, and 0.094 mm (Purkinje) and 0.228 mm (Scheimpflug) for decentration. In patients, the mean tilt was less than 2.6 degrees and the mean decentration less than 0.4 mm. Both techniques showed mirror symmetry between right and left eyes for tilt around the vertical axis and for decentration along the horizontal axis. Both systems showed high reproducibility.
Validation experiments on physical model eyes showed slightly higher accuracy with the Purkinje method than with the Scheimpflug imaging method. Horizontal measurements of patients with both techniques were highly correlated. The IOLs tended to be tilted and decentered nasally in most patients.
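The sinusoidal fitting step described above can be illustrated with a minimal least-squares sketch. The model y = a·cos(t) + b·sin(t) + c, the synthetic meridian data, and the parameter names are assumptions for illustration, not the authors' actual algorithm:

```python
import math

def fit_sinusoid(thetas, values):
    """Least-squares fit of y = a*cos(t) + b*sin(t) + c via the normal
    equations. Returns (amplitude, phase, offset); in this illustrative
    model the amplitude plays the role of the tilt magnitude and the
    phase that of the tilt orientation."""
    # accumulate normal equations M x = v for x = (a, b, c)
    M = [[0.0] * 3 for _ in range(3)]
    v = [0.0] * 3
    for t, y in zip(thetas, values):
        row = (math.cos(t), math.sin(t), 1.0)
        for i in range(3):
            v[i] += row[i] * y
            for j in range(3):
                M[i][j] += row[i] * row[j]
    # solve the 3x3 system by Gaussian elimination with partial pivoting
    for col in range(3):
        p = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[p] = M[p], M[col]
        v[col], v[p] = v[p], v[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c2 in range(col, 3):
                M[r][c2] -= f * M[col][c2]
            v[r] -= f * v[col]
    x = [0.0] * 3
    for r in (2, 1, 0):
        x[r] = (v[r] - sum(M[r][c2] * x[c2] for c2 in range(r + 1, 3))) / M[r][r]
    a, b, c = x
    return math.hypot(a, b), math.atan2(b, a), c

# synthetic projections at 25 meridians: amplitude 2.0, phase 0.3, offset 0.5
thetas = [2 * math.pi * k / 25 for k in range(25)]
values = [2.0 * math.cos(t - 0.3) + 0.5 for t in thetas]
amp, phase, off = fit_sinusoid(thetas, values)
print(round(amp, 3), round(phase, 3), round(off, 3))  # 2.0 0.3 0.5
```

With 25 evenly spaced meridians, as in the study, the normal equations become nearly diagonal, which is why the recovered parameters are exact here.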
WMS and WFS Standards Implementation of Weather Data
NASA Astrophysics Data System (ADS)
Armstrong, M.
2005-12-01
CustomWeather is a private weather company that delivers global weather data products. CustomWeather has built a mapping platform according to OGC standards. Currently, both a Web Mapping Service (WMS) and a Web Feature Service (WFS) are supported by CustomWeather. Supporting open geospatial standards has led to a number of positive internal changes to CustomWeather's processes, along with those of the clients accessing the data. Quite a number of challenges surfaced during this process, particularly with respect to combining a wide variety of raw modeling and sensor data into a single delivery platform. Open standards have, however, made the delivery of very different data products rather seamless. The discussion will address the issues faced in building an OGC-based mapping platform along with the limitations encountered. While the availability of these data products through open standards is still very young, there have already been many adopters in the utility and navigation industries. The discussion will take a closer look at the different approaches taken by these two industries as they utilize interoperability standards with existing data. Insight will be given into applications already taking advantage of this new technology and how it is affecting decision-making processes. CustomWeather has observed considerable interest in and potential benefit from this technology in developing countries. Weather data is a key element in disaster management. Interoperability is literally opening up a world of data and has the potential to quickly enable functionality that would otherwise take considerable time to implement. The discussion will briefly touch on our experience.
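A WMS GetMap request is just a parameterized HTTP query defined by the OGC standard; a minimal sketch of building one follows, with a hypothetical endpoint and layer name (not CustomWeather's actual service):

```python
from urllib.parse import urlencode

def wms_getmap_url(base, layer, bbox, size=(512, 512)):
    """Build a WMS 1.1.1 GetMap URL. The parameter set below is the
    standard required set; the endpoint and layer are placeholders."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(c) for c in bbox),  # minx,miny,maxx,maxy
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

url = wms_getmap_url("https://example.com/wms", "temperature",
                     (-10.0, 35.0, 5.0, 45.0))
print(url)
```

Because every conforming server accepts the same parameter set, a client written against one WMS works against any other, which is the interoperability benefit the abstract describes.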
Calcium neuroimaging in behaving zebrafish larvae using a turn-key light field camera
NASA Astrophysics Data System (ADS)
Cruz Perez, Carlos; Lauri, Antonella; Symvoulidis, Panagiotis; Cappetta, Michele; Erdmann, Arne; Westmeyer, Gil Gregor
2015-09-01
Reconstructing a three-dimensional scene from multiple simultaneously acquired perspectives (the light field) is an elegant scanless imaging concept that can exceed the temporal resolution of currently available scanning-based imaging methods for capturing fast cellular processes. We tested the performance of commercially available light field cameras on a fluorescent microscopy setup for monitoring calcium activity in the brain of awake and behaving reporter zebrafish larvae. The plenoptic imaging system could volumetrically resolve diverse neuronal response profiles throughout the zebrafish brain upon stimulation with an aversive odorant. Behavioral responses of the reporter fish could be captured simultaneously together with depth-resolved neuronal activity. Overall, our assessment showed that with some optimizations for fluorescence microscopy applications, commercial light field cameras have the potential of becoming an attractive alternative to custom-built systems to accelerate molecular imaging research on cellular dynamics.
NASA Astrophysics Data System (ADS)
Ong, Mingwei; Watanuki, Keiichi
Recently, as consumers increasingly prefer buying products that reflect their own personality, some consumers wish to be involved in the product design process. In parallel with the popularization of e-business, many manufacturers have utilized the Internet to promote their products, and some have even built websites that enable consumers to select their desired product specifications. Nevertheless, this method has not been applied to complicated mechanical products, because such products have a large number of specifications that inter-relate with one another. In such a case, ordinary consumers, who lack design knowledge, are not capable of determining these specifications. In this paper, a prototype framework called the Internet-based consumer-oriented product ordering system has been developed, which enables ordinary consumers to have large freedom in determining complicated mechanical product specifications while ensuring that the manufacturing of the specified product is feasible.
Liu, Jessica; Oakley, Clyde; Shandas, Robin
2009-01-01
The objective of this work is to construct capacitive micromachined ultrasound transducers (cMUTs) using the multi-user MEMS (MicroElectroMechanical Systems) process (MUMPs) and to analyze the capability of this process relative to the customized processes commonly in use. The MUMPs process has the advantages of low cost and accessibility to general users, since it is not necessary to have access to customized fabrication capabilities such as wafer-bonding and sacrificial release processes. While other researchers have reported fabricating cMUTs using the MUMPs process, none has reported the limitations in the process that arise from the use of standard design rules, which place restrictions on the material thicknesses, gap thicknesses, and materials that may be used. In this paper we explain these limitations, and analyze the capabilities using 1D modeling, finite element analysis, and experimental devices. We show that one of the limitations is that collapse voltage and center frequency cannot be controlled independently. However, center frequencies up to 9 MHz can be achieved with collapse voltages of less than 200 volts, making such devices suitable for medical and non-destructive evaluation imaging applications. Since the membrane and base electrodes are made of polysilicon, there is a larger series resistance than that resulting from processes that use metal electrodes. We show that the series resistance is not a significant problem. The conductive polysilicon can also destroy the cMUT if the top membrane is pulled into the bottom electrode. As a solution we propose the application of an additional dielectric layer. Finally, we demonstrate a device built with a novel beam construction that produces a transmitted pressure pulse into air with 6% bandwidth and agrees reasonably well with the 1D model. We conclude that cMUTs made with the MUMPs process have some limitations that are not present in customized processes.
However, these limitations may be overcome with the proper design considerations that we have presented, putting a low-cost, highly accessible means of making cMUT devices into the hands of academic and industrial researchers. PMID:19640557
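The coupling between collapse voltage and center frequency noted in the abstract is visible already in the standard lumped parallel-plate model, sketched below. The pull-in formula is the textbook first-order estimate, not the paper's FEA, and all numeric values are illustrative assumptions:

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def pull_in_voltage(k, gap, area):
    """Lumped parallel-plate estimate of the collapse (pull-in) voltage:
    V_pi = sqrt(8*k*g0^3 / (27*eps0*A)), with k the effective spring
    constant [N/m], g0 the gap [m], and A the electrode area [m^2]."""
    return math.sqrt(8 * k * gap**3 / (27 * EPS0 * area))

def center_frequency(k, mass):
    """f0 = (1/2pi)*sqrt(k/m) for the same lumped model. Both f0 and
    V_pi grow with k, so with gap and thickness fixed by the MUMPs
    design rules the two cannot be tuned independently."""
    return math.sqrt(k / mass) / (2 * math.pi)

# illustrative values for a stiff polysilicon membrane (assumed, not measured)
k, gap, area, mass = 5000.0, 0.75e-6, (40e-6)**2, 2e-12
print(f"V_pi ~ {pull_in_voltage(k, gap, area):.0f} V, "
      f"f0 ~ {center_frequency(k, mass) / 1e6:.1f} MHz")
```

With these assumed numbers the model lands near the regime the abstract reports (a few hundred volts at several MHz); raising k to raise f0 inevitably raises V_pi as well, which is the stated trade-off.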
Infrastructure for Big Data in the Intensive Care Unit.
Zelechower, Javier; Astudillo, José; Traversaro, Francisco; Redelico, Francisco; Luna, Daniel; Quiros, Fernan; San Roman, Eduardo; Risk, Marcelo
2017-01-01
The Big Data paradigm can be applied in the intensive care unit (ICU) in order to improve the treatment of patients, with the aim of customized decisions. This poster is about the infrastructure necessary to build a Big Data system for the ICU. Together with the infrastructure, the formation of a multidisciplinary team is essential to develop Big Data for use in critical care medicine.
ISO 9000 and/or Systems Engineering Capability Maturity Model?
NASA Technical Reports Server (NTRS)
Gholston, Sampson E.
2002-01-01
For businesses and organizations to remain competitive today, they must have processes and systems in place that will allow them to first identify customer needs and then develop products/processes that will meet or exceed the customers' needs and expectations. Customer needs, once identified, are normally stated as requirements. Designers can then develop products/processes that will meet these requirements. Several functions, such as quality management and systems engineering management, are used to assist product development teams in the development process. Both functions exist in all organizations and both have a similar objective, which is to ensure that developed processes will meet customer requirements. Are efforts in these organizations being duplicated? Are both functions needed by organizations? What are the similarities and differences between the functions listed above? ISO 9000 is an international standard for goods and services. It sets broad requirements for the assurance of quality and for management's involvement. It requires organizations to document their processes and to follow these documented processes. ISO 9000 gives customers assurance that suppliers have control of the process for product development. Systems engineering can broadly be defined as a discipline that seeks to ensure that all requirements for a system are satisfied throughout the life of the system by preserving their interrelationships. The key activities of systems engineering include requirements analysis, functional analysis/allocation, design synthesis and verification, and system analysis and control. The systems engineering process, when followed properly, will lead to higher quality products, lower cost products, and shorter development cycles. The Systems Engineering Capability Maturity Model (SE-CMM) allows companies to measure their systems engineering capability and continuously improve those capabilities.
ISO 9000 and SE-CMM seem to have a similar objective, which is to document the organization's processes and certify to potential customers the capability of a supplier to control the processes that determine the quality of the product or services being produced. The remaining sections of this report examine the differences and similarities between ISO 9000 and SE-CMM and make recommendations for implementation.
Mobile Timekeeping Application Built on Reverse-Engineered JPL Infrastructure
NASA Technical Reports Server (NTRS)
Witoff, Robert J.
2013-01-01
Every year, non-exempt employees cumulatively waste over one man-year tracking their time and using the timekeeping Web page to save those times. This app eliminates this waste. The innovation is a native iPhone app. Libraries were built around a reverse-engineered JPL API. It represents a punch-in/punch-out paradigm for timekeeping. It is accessible natively via iPhones, and features ease of access. Any non-exempt employee can natively punch in and out, as well as save and view their JPL timecard. This app is built on custom libraries created by reverse-engineering the standard timekeeping application. Communication is through custom libraries that re-route traffic through BrowserRAS (remote access service). This has value at any center where employees track their time.
USC orthogonal multiprocessor for image processing with neural networks
NASA Astrophysics Data System (ADS)
Hwang, Kai; Panda, Dhabaleswar K.; Haddadi, Navid
1990-07-01
This paper presents the architectural features and imaging applications of the Orthogonal MultiProcessor (OMP) system, which is under construction at the University of Southern California with research funding from NSF and assistance from several industrial partners. The prototype OMP is being built with 16 Intel i860 RISC microprocessors and 256 parallel memory modules using custom-designed spanning buses, which are 2-D interleaved and orthogonally accessed without conflicts. The 16-processor OMP prototype is targeted to achieve 430 MIPS and 600 Mflops, which have been verified by simulation experiments based on the design parameters used. The prototype OMP machine will be initially applied for image processing, computer vision, and neural network simulation applications. We summarize important vision and imaging algorithms that can be restructured with neural network models. These algorithms can efficiently run on the OMP hardware with linear speedup. The ultimate goal is to develop a high-performance Visual Computer (Viscom) for integrated low- and high-level image processing and vision tasks.
NASA Astrophysics Data System (ADS)
Calderisi, Marco; Ulrici, Alessandro; Pigani, Laura; Secchi, Alberto; Seeber, Renato
2012-09-01
The EU FP7 project CUSTOM (Drugs and Precursor Sensing by Complementing Low Cost Multiple Techniques) aims at developing a new sensing system for the detection of drug precursors in gaseous samples, which includes an External Cavity-Quantum Cascade Laser Photo-Acoustic Sensor (EC-QCLPAS) that is in the final stage of realisation. Thus, a simulation based on FT-IR literature spectra has been carried out, where the development of a proper strategy for designing the composition of the environment, as realistic and representative of different scenarios as possible, is of key importance. To this aim, an approach based on the combination of signal processing and experimental design techniques has been developed. The gaseous mixtures were built by adding the four drug precursor (target) species considered to the gases typically found in the atmosphere, also taking into account possible interfering species. These last chemicals were selected considering custom environments (20 interfering chemical species), whose concentrations were inferred from literature data. The spectra were first denoised by means of a Fast Wavelet Transform-based algorithm; then, a procedure based on a sigmoidal transfer function was developed to multiply the pure component spectra by the respective concentration values, so as to correctly preserve background intensity and shape and to operate only on the absorption bands. The noise structure of the EC-QCLPAS was studied using sample spectra measured with a prototype instrument, and added to the simulated mixtures. Finally, a matrix containing 5000 simulated spectra of gaseous mixtures was built.
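The sigmoidal-transfer step (scaling only the absorption bands while preserving the background) might look like the following sketch; the function form, parameter names, and values are assumptions for illustration, not the authors' actual procedure:

```python
import math

def scale_bands(spectrum, baseline, conc, steepness=50.0, threshold=0.02):
    """Illustrative sigmoidal transfer: points well above the baseline
    (absorption bands) are scaled by the concentration weight, while
    near-baseline points are left essentially untouched, preserving
    background intensity and shape."""
    out = []
    for y, b in zip(spectrum, baseline):
        band = y - b
        # sigmoid weight: ~0 near the baseline, ~1 on strong bands
        w = 1.0 / (1.0 + math.exp(-steepness * (band - threshold)))
        out.append(b + band * ((1 - w) + w * conc))
    return out

baseline = [1.0] * 5
spectrum = [1.0, 1.0, 1.5, 1.0, 1.0]  # one absorption band at index 2
scaled = scale_bands(spectrum, baseline, conc=0.3)
print([round(v, 3) for v in scaled])  # [1.0, 1.0, 1.15, 1.0, 1.0]
```

Only the band point is attenuated toward the 0.3 concentration weight; the flat background passes through unchanged, which is the property the abstract emphasizes.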
Comparative study of age estimation using dentinal translucency by digital and conventional methods.
Bommannavar, Sushma; Kulkarni, Meena
2015-01-01
Estimating age using the dentition plays a significant role in identification of the individual in forensic cases. Teeth are among the most durable and strongest structures in the human body. The morphology and arrangement of teeth vary from person to person and are unique to an individual, as are fingerprints. Therefore, the use of the dentition is the method of choice in the identification of the unknown. Root dentin translucency is considered to be one of the best parameters for dental age estimation. Traditionally, root dentin translucency was measured using calipers. Recently, the use of custom-built software programs has been proposed for the same purpose. The present study describes a method to measure root dentin translucency on sectioned teeth using Adobe Photoshop 7.0 (Adobe Systems Inc, Mountain View, California), a commercially available and widely used image-editing program. A total of 50 single-rooted teeth were sectioned longitudinally to a uniform 0.25 mm thickness, and the root dentin translucency was measured using digital and caliper methods and compared. The Gustafson morphohistologic approach is used in this study. Correlation coefficients of translucency measurements to age were statistically significant for both methods (P < 0.125), and linear regression equations derived from both methods revealed a better ability of the digital method to assess age. Furthermore, the digital method is easy to use and less time consuming, and the measurements obtained with it are more precise, helping in more accurate age estimation. Considering these benefits, the present study recommends the use of the digital method to assess translucency for age estimation.
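The linear regression equations the study derives reduce to an ordinary least-squares fit of age on translucency; the sketch below uses invented numbers purely for illustration (the study's actual coefficients are not reproduced here):

```python
def linear_fit(x, y):
    """Ordinary least-squares fit y = m*x + c; returns (m, c)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# hypothetical translucency lengths (mm) and known ages (years)
translucency_mm = [3.0, 5.2, 6.8, 8.1, 9.5]
age_years = [25, 38, 47, 55, 64]
m, c = linear_fit(translucency_mm, age_years)
estimate = m * 7.0 + c  # predicted age for a 7.0 mm translucent zone
print(round(m, 2), round(c, 2), round(estimate, 1))  # 5.96 6.91 48.7
```

In practice each measurement method (digital vs. caliper) yields its own regression line, and the method whose line fits the known ages more tightly, here the digital one, is preferred.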
Manipulation and handling processes off-line programming and optimization with use of K-Roset
NASA Astrophysics Data System (ADS)
Gołda, G.; Kampa, A.
2017-08-01
Contemporary trends in the development of efficient, flexible manufacturing systems require practical implementation of modern “Lean production” concepts for maximizing customer value by minimizing all wastes in manufacturing and logistics processes. Every FMS is built on the basis of automated and robotized production cells. Besides flexible CNC machine tools and other equipment, industrial robots are the primary elements of the system. In this study, the authors look for wastes of time and cost in real robot tasks during manipulation processes. For the optimization of handling and manipulation processes performed by robots, the application of modern off-line programming methods and computer simulation is the best solution and the only way to minimize unnecessary movements and other instructions. The modelling of a robotized production cell and the off-line programming of Kawasaki robots in AS-Language will be described. The simulation of the robotized workstation will be realized with the K-Roset virtual reality software. The authors show the process of improving and optimizing industrial robot programs in terms of minimizing the number of useless manipulator movements and unnecessary instructions, in order to shorten production cycle times. This will also reduce the costs of handling, manipulation, and the technological process.
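One simple form of the useless-movement elimination discussed above is pruning intermediate waypoints that lie on a straight line between their neighbours in a programmed path; this is a generic sketch, not K-Roset's or AS-Language's actual optimization:

```python
def prune_waypoints(path, tol=1e-6):
    """Drop intermediate 2-D waypoints that are collinear with their
    original neighbours -- a stand-in for the kind of useless-movement
    elimination done during off-line program optimization."""
    if len(path) < 3:
        return list(path)
    out = [path[0]]
    for prev, cur, nxt in zip(path, path[1:], path[2:]):
        # cross product of (cur - prev) and (nxt - prev); zero => collinear
        cross = ((cur[0] - prev[0]) * (nxt[1] - prev[1])
                 - (cur[1] - prev[1]) * (nxt[0] - prev[0]))
        if abs(cross) > tol:
            out.append(cur)
    out.append(path[-1])
    return out

path = [(0, 0), (1, 1), (2, 2), (3, 2), (4, 2)]
print(prune_waypoints(path))  # [(0, 0), (2, 2), (4, 2)]
```

Fewer waypoints mean fewer motion instructions in the generated robot program, which shortens cycle time without changing the tool trajectory.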
Building automatic customer complaints filtering application based on Twitter in Bahasa Indonesia
NASA Astrophysics Data System (ADS)
Gunawan, D.; Siregar, R. P.; Rahmat, R. F.; Amalia, A.
2018-03-01
Twitter has become a medium for communication between a company and its customers. The average number of monthly active Twitter users is 330 million. A lot of companies realize the potential of Twitter to establish good relationships with their customers. Therefore, they usually have one official Twitter account to act as a customer care division. In Indonesia, one of the companies that utilizes the potential of Twitter to reach its customers is PT Telkom. PT Telkom has an official customer service account (called @TelkomCare) to receive customers' problems. However, because this account is open to the public, Twitter users might post all kinds of messages (not only complaints) to the Telkom Care account. As a result, the Telkom Care account contains not only customer complaints but also compliments and ordinary messages. Furthermore, the complaints should be distributed to the relevant division, such as "Indihome", "Telkomsel", "UseeTV", or "Telepon", based on the content of the message. This research built an application that automatically filters Twitter messages into several pre-defined categories (based on the existing divisions) using the Naïve Bayes algorithm. The research was done by collecting Twitter messages, cleaning and pre-processing the data, training and testing on the data, and evaluating the classification results. The classifier yields 97% accuracy in assigning Twitter messages to the categories mentioned earlier.
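A minimal multinomial Naïve Bayes classifier of the kind the study applies can be written with the standard library alone; the training tweets and category names below are invented for illustration:

```python
import math
from collections import Counter, defaultdict

class NaiveBayes:
    """Multinomial Naive Bayes with Laplace (add-one) smoothing."""
    def fit(self, texts, labels):
        self.word_counts = defaultdict(Counter)
        self.class_counts = Counter(labels)
        for text, label in zip(texts, labels):
            self.word_counts[label].update(text.lower().split())
        self.vocab = {w for c in self.word_counts.values() for w in c}
        return self

    def predict(self, text):
        def log_score(label):
            counts = self.word_counts[label]
            total = sum(counts.values())
            s = math.log(self.class_counts[label] / sum(self.class_counts.values()))
            for w in text.lower().split():
                # add-one smoothing keeps unseen words from zeroing the score
                s += math.log((counts[w] + 1) / (total + len(self.vocab)))
            return s
        return max(self.class_counts, key=log_score)

clf = NaiveBayes().fit(
    ["internet indihome down", "indihome slow connection",
     "no signal telkomsel", "telkomsel data not working"],
    ["Indihome", "Indihome", "Telkomsel", "Telkomsel"])
print(clf.predict("my indihome connection is down"))  # Indihome
```

Real pipelines add the cleaning and pre-processing steps the abstract lists (stop-word removal, stemming for Bahasa Indonesia, etc.) before the word counts are taken.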
Flexible forecasts: a key to better customer service.
Neuhaus, C A
1997-05-01
Good customer service requires companies to keep their fingers on their customers' pulse and develop intelligent forecasts with their needs built in. As even the smallest factories today are placing at least some emphasis on lead time reductions to improve flexibility and the speed of response to customer requirements, the role of the forecast, now more than ever, is to provide at all times the best, most recent, and most accurate picture of what exactly will be required and when.
Zhou, Zhi; de Bedout, Juan Manuel; Kern, John Michael; Biyik, Emrah; Chandra, Ramu Sharat
2013-01-22
A system for optimizing customer utility usage in a utility network of customer sites, each having one or more utility devices, where customer site information is communicated between each of the customer sites and an optimization server having software for optimizing customer utility usage over one or more networks, including private and public networks. A customer site model for each of the customer sites is generated based upon the customer site information, and the customer utility usage is optimized based upon the customer site information and the customer site model. The optimization server can be hosted by an external source or within the customer site. In addition, the optimization processing can be partitioned between the customer site and an external source.
Development of new data acquisition system for COMPASS experiment
NASA Astrophysics Data System (ADS)
Bodlak, M.; Frolov, V.; Jary, V.; Huber, S.; Konorov, I.; Levit, D.; Novy, J.; Salac, R.; Virius, M.
2016-04-01
This paper presents the development and recent status of the new data acquisition system of the COMPASS experiment at CERN, with a trigger rate of up to 50 kHz and a 36 kB average event size during a 10-second period with beam followed by an approximately 40-second period without beam. In the original DAQ, event building is performed by software deployed on a switched computer network, and the data readout is based on the deprecated PCI technology; the new system replaces the event-building network with custom FPGA-based hardware. The custom cards are introduced and the advantages of FPGA technology for DAQ-related tasks are discussed. In this paper, we focus on the software part, which is mainly responsible for control and monitoring. Most of the system can run as slow control; only the readout process has real-time requirements. The design of the software is built on state machines that are implemented using the Qt framework; communication between the remote nodes that form the software architecture is based on the DIM library and IPBus technology. Furthermore, the PHP and JS languages are used to maintain the system configuration; the MySQL database was selected as storage for both the configuration of the system and system messages. The system has been designed with a maximum throughput of 1500 MB/s and a large buffering capability used to spread the load on readout computers over a longer period of time. Great emphasis is put on data latency, data consistency, and timing checks, which are done at each stage of event assembly. The system collects the results of these checks, which, together with a special data format, allow the software to localize the origin of problems in the data transmission process. A prototype version of the system has already been developed and tested; it fulfills all given requirements. It is expected that the full-scale version of the system will be finalized in June 2014 and deployed in September, provided that tests with a cosmic run succeed.
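The state-machine-based control design mentioned above can be sketched as a table-driven transition map; the states and commands here are generic run-control examples, not the actual COMPASS states:

```python
# (current state, command) -> next state; any pair not listed is invalid
TRANSITIONS = {
    ("idle",       "configure"): "configured",
    ("configured", "start"):     "running",
    ("running",    "stop"):      "configured",
    ("configured", "reset"):     "idle",
}

class RunControl:
    """Tiny table-driven finite state machine for run control."""
    def __init__(self):
        self.state = "idle"

    def handle(self, command):
        key = (self.state, command)
        if key not in TRANSITIONS:
            raise ValueError(f"command {command!r} invalid in state {self.state!r}")
        self.state = TRANSITIONS[key]
        return self.state

rc = RunControl()
for cmd in ("configure", "start", "stop", "reset"):
    print(cmd, "->", rc.handle(cmd))
```

Keeping the transitions in a table makes illegal command sequences fail loudly, which is the main operational benefit of state-machine-based control software.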
NASA Astrophysics Data System (ADS)
Fitriana, Rina; Kurniawan, Wawan; Barlianto, Anung; Adriansyah Putra, Rizki
2016-02-01
AC is a small and medium enterprise (SME) engaged in the field of crafts. This SME did not have an integrated information system for managing sales. This research aims to design an online marketing information system as a web-based application. The integrated system is made to manage sales and expand market share. This study uses structured analysis and design in its approach to building the system, and also applies the marketing frameworks of STP (Segmentation, Targeting, Positioning) and the 4Ps (Price, Product, Place, Promotion) to obtain a market analysis. AC's main target customers are women aged 13 to 35 years. The products produced by AC are shoes and brooches that are typical of the archipelago. Prices range from Rp. 2.000 to Rp. 400.000. The online marketing information system can be used to document sales transactions, promote goods, and let customers order products.
Analyzing Strategic Business Rules through Simulation Modeling
NASA Astrophysics Data System (ADS)
Orta, Elena; Ruiz, Mercedes; Toro, Miguel
Service Oriented Architecture (SOA) holds promise for business agility since it allows business processes to change to meet new customer demands or market needs without causing a cascade of changes in the underlying IT systems. Business rules are the instrument chosen to help business and IT collaborate. In this paper, we propose the utilization of simulation models to model and simulate strategic business rules that are then disaggregated at different levels of an SOA architecture. Our proposal is aimed at helping find a good configuration for strategic business objectives and IT parameters. The paper includes a case study where a simulation model is built to help business decision-making in a context where finding a good configuration for different business parameters and performance is too complex to analyze by trial and error.
COMPASS: A general purpose computer aided scheduling tool
NASA Technical Reports Server (NTRS)
Mcmahon, Mary Beth; Fox, Barry; Culbert, Chris
1991-01-01
COMPASS is a generic scheduling system developed by McDonnell Douglas under the direction of the Software Technology Branch at JSC. COMPASS is intended to illustrate the latest advances in scheduling technology and provide a basis from which custom scheduling systems can be built. COMPASS was written in Ada to promote readability and to conform to potential NASA Space Station Freedom standards. COMPASS has some unique characteristics that distinguish it from commercial products. These characteristics are discussed and used to illustrate some differences between scheduling tools.
COMPASS: An Ada based scheduler
NASA Technical Reports Server (NTRS)
Mcmahon, Mary Beth; Culbert, Chris
1992-01-01
COMPASS is a generic scheduling system developed by McDonnell Douglas and funded by the Software Technology Branch of NASA Johnson Space Center. The motivation behind COMPASS is to illustrate scheduling technology and provide a basis from which custom scheduling systems can be built. COMPASS was written in Ada to promote readability and to conform to DOD standards. COMPASS has some unique characteristics that distinguish it from commercial products. This paper discusses these characteristics and uses them to illustrate some differences between scheduling tools.
DEAP-3600 Data Acquisition System
NASA Astrophysics Data System (ADS)
Lindner, Thomas
2015-12-01
DEAP-3600 is a dark matter experiment using liquid argon to detect Weakly Interacting Massive Particles (WIMPs). The DEAP-3600 Data Acquisition (DAQ) has been built using a combination of commercial and custom electronics, organized using the MIDAS framework. The DAQ system needs to suppress a high rate of background events from 39Ar beta decays. This suppression is implemented using a combination of online firmware and software-based event filtering. We will report on progress commissioning the DAQ system, as well as the development of the web-based user interface.
Life Cycle Analysis of Dedicated Nano-Launch Technologies
NASA Technical Reports Server (NTRS)
Zapata, Edgar; McCleskey, Carey (Editor); Martin, John; Lepsch, Roger; Ternani, Tosoc
2014-01-01
Recent technology advancements have enabled the development of small cheap satellites that can perform useful functions in the space environment. Currently, the only low cost option for getting these payloads into orbit is through ride share programs - small satellites awaiting the launch of a larger satellite, and then riding along on the same launcher. As a result, these small satellite customers await primary payload launches and a backlog exists. An alternative option would be dedicated nano-launch systems built and operated to provide more flexible launch services, higher availability, and affordable prices. The potential customer base that would drive requirements or support a business case includes commercial, academia, civil government and defense. Further, NASA technology investments could enable these alternative game changing options. With this context, in 2013 the Game Changing Development (GCD) program funded a NASA team to investigate the feasibility of dedicated nano-satellite launch systems with a recurring cost of less than $2 million per launch for a 5 kg payload to low Earth orbit. The team products would include potential concepts, technologies and factors for enabling the ambitious cost goal, exploring the nature of the goal itself, and informing the GCD program technology investment decision making process. This paper provides an overview of the life cycle analysis effort that was conducted in 2013 by an inter-center NASA team. This effort included the development of reference nano-launch system concepts, developing analysis processes and models, establishing a basis for cost estimates (development, manufacturing and launch) suitable to the scale of the systems, and especially, understanding the relationship of potential game changing technologies to life cycle costs, as well as other factors, such as flights per year.
Steele, James; Bruce-Low, Stewart; Smith, Dave; Jessop, David; Osborne, Neil
2016-03-01
Indirect measurement of disc hydration can be obtained through measures of spinal height using stadiometry. However, specialised stadiometers for this purpose are often custom-built and expensive, whereas generic wall-mounted stadiometers are common in clinics and laboratories. This study examined the reliability of a custom set-up for measuring spinal height that combines a wall-mounted stadiometer with custom-built wall-mounted postural rods. Twelve participants with non-specific chronic low back pain (CLBP; females n = 5, males n = 7) underwent measurement of spinal height on three separate consecutive days at the same time of day, with 10 measurements taken at 20 s intervals. Comparisons were made using repeated measures analysis of variance for 'trial' and 'gender'. There were no significant effects of trial or interaction effects of trial × gender. Intra-individual absolute standard error of measurement (SEM) was calculated for spinal height using the first of the 10 measures, the average of the 10 measures, the total shrinkage, and the rate of shrinkage across the 10 measures, the last examined as the slope of a fitted linear regression. SEMs were 3.1 mm, 2.8 mm, 2.6 mm and 0.212, respectively. The absence of significant differences between trials and the reported SEMs suggest this custom set-up for measuring spinal height changes is suitable for use as an outcome measure in either research or clinical practice in participants with CLBP. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
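The two derived quantities in the abstract, an absolute SEM from repeated trials and a shrinkage rate taken as a regression slope, can be sketched in a few lines. The data below are synthetic stand-ins (the study's raw heights are not given), and "SEM as between-trial SD" is one common definition, assuming NumPy:

```python
import numpy as np

# Synthetic example: 3 trials x 10 timed measures of spinal height (mm)
# for one participant, with a gentle downward (shrinkage) trend plus noise.
rng = np.random.default_rng(0)
heights = 680.0 - 0.3 * np.arange(10) + rng.normal(0.0, 1.5, size=(3, 10))

# Absolute SEM using the first of the 10 measures: SD of that measure
# across the three trial days (one common operationalization).
sem_first = np.std(heights[:, 0], ddof=1)

# Rate of shrinkage: slope of a linear fit over the 10 measures
# (taken at 20 s intervals), averaged across trials.
t = np.arange(10)  # measurement index
slopes = [np.polyfit(t, trial, 1)[0] for trial in heights]
mean_slope = float(np.mean(slopes))
```

With real data, `heights` would hold one participant's repeated measures, and the SEMs for the averaged measure and total shrinkage follow the same pattern.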
Active vibration attenuating seat suspension for an armored helicopter crew seat
NASA Astrophysics Data System (ADS)
Sztein, Pablo Javier
An Active Vibration Attenuating Seat Suspension (AVASS) for an MH-60S helicopter crew seat is designed to protect the occupants from harmful whole-body vibration (WBV). Magnetorheological (MR) suspension units are designed, fabricated and installed in a helicopter crew seat. These MR isolators are built to work in series with existing Variable Load Energy Absorbers (VLEAs), have minimal increase in weight, and maintain crashworthiness for the seat system. Refinements are discussed, based on testing, to minimize friction observed in the system. These refinements include the addition of roller bearings to replace friction bearings in the existing seat. Additionally, semi-active control of the MR dampers is achieved using purpose-built custom electronics integrated into the seat system. Experimental testing shows that an MH-60S retrofitted with AVASS provides up to 70.65% more vibration attenuation than the existing seat configuration, as well as up to 81.1% reduction in vibration from the floor.
Deployment of a Fast-GCMS System to Measure C2 to C5 Carbonyls, Methanol and Ethanol Aboard Aircraft
NASA Technical Reports Server (NTRS)
Apel, Eric C.
2004-01-01
Through funding of this proposal, a fast response gas chromatograph/mass spectrometer (FGCMS) instrument to measure less than or equal to C4 carbonyl compounds and methanol was developed for the NASA GTE TRACE-P (Global Tropospheric Experiment, Transport And Chemical Evolution Over The Pacific) mission. The system consists of four major components: sample inlet, preconcentration system, gas chromatograph (GC), and detector. The preconcentration system is a custom-built cryogen-conservative system. The GC is a compact, custom-built unit that can be temperature programmed and rapidly cooled. Detection is accomplished with an Agilent Technologies 5973 mass spectrometer. The FGCMS instrument provides positive identification because the compounds are chromatographically separated and mass selected. During TRACE-P, a sample was analyzed every 5 minutes. The FGCMS limit of detection was between 5 and 75 pptv, depending on the compound. The entire instrument package is contained in a standard NASA instrument rack (106 cm x 61 cm x 135 cm), consumes less than 1200 watts, and is fully automated with LabVIEW 6i. Methods were developed for producing highly accurate gas-phase standards for the target compounds and for testing the system in the presence of potential interferents. This report presents data on these tests and on the general overall performance of the system in the laboratory and aboard the DC-8 aircraft during the mission. Vertical profiles for acetaldehyde, methanol, acetone, propanal, methyl ethyl ketone, and butanal from FGCMS data collected over the entire mission are also presented.
A system verification platform for high-density epiretinal prostheses.
Chen, Kuanfu; Lo, Yi-Kai; Yang, Zhi; Weiland, James D; Humayun, Mark S; Liu, Wentai
2013-06-01
Retinal prostheses have restored light perception to people worldwide who have poor or no vision as a consequence of retinal degeneration. To advance the quality of visual stimulation for retinal implant recipients, a higher number of stimulation channels is expected in the next-generation retinal prostheses, which poses a great challenge to system design and verification. This paper presents a system verification platform dedicated to the development of retinal prostheses. The system includes primary processing, dual-band power and data telemetry, a high-density stimulator array, and two methods for output verification. End-to-end system validation and individual functional block characterization can be achieved with this platform through visual inspection and software analysis. Custom-built software running on the computers also provides a convenient way to test new features before they are realized in the ICs. Real-time visual feedback through the video displays makes it easy to monitor and debug the system. The characterization of the wireless telemetry and the demonstration of the visual display are reported in this paper using a 256-channel retinal prosthetic IC as an example.
Mahato, Niladri K; Montuelle, Stephane; Cotton, John; Williams, Susan; Thomas, James; Clark, Brian
2016-05-18
Single or biplanar video radiography and Roentgen stereophotogrammetry (RSA) techniques used for the assessment of in-vivo joint kinematics involve the application of ionizing radiation, which is a limitation for clinical research involving human subjects. To overcome this limitation, our long-term goal is to develop a magnetic resonance imaging (MRI)-only, three-dimensional (3-D) modeling technique that permits dynamic imaging of joint motion in humans. Here, we present our initial findings, as well as reliability data, for an MRI-only protocol and modeling technique. We developed a morphology-based motion-analysis technique that uses MRI of custom-built solid-body objects to animate and quantify experimental displacements between them. The technique involved four major steps. First, the imaging volume was calibrated using a custom-built grid. Second, 3-D models were segmented from axial scans of two custom-built solid-body cubes. Third, these cubes were positioned at pre-determined relative displacements (translation and rotation) in the magnetic resonance coil and scanned with a T1 and a fast contrast-enhanced pulse sequence. The digital imaging and communications in medicine (DICOM) images were then processed for animation. The fourth step involved importing these processed images into animation software, where they were displayed as background scenes. In the same step, 3-D models of the cubes were imported into the animation software, where the user manipulated the models to match their outlines in the scene (rotoscoping) and registered the models into an anatomical joint system. Measurements of displacements obtained from two different rotoscoping sessions were tested for reliability using coefficients of variation (CV), intraclass correlation coefficients (ICC), Bland-Altman plots, and Limits of Agreement analyses. Between-session reliability was high for both the T1 and the contrast-enhanced sequences.
Specifically, the average CVs for translation were 4.31 % and 5.26 % for the two pulse sequences, respectively, while the ICCs were 0.99 for both. For rotation measures, the CVs were 3.19 % and 2.44 % for the two pulse sequences with the ICCs being 0.98 and 0.97, respectively. A novel biplanar imaging approach also yielded high reliability with mean CVs of 2.66 % and 3.39 % for translation in the x- and z-planes, respectively, and ICCs of 0.97 in both planes. This work provides basic proof-of-concept for a reliable marker-less non-ionizing-radiation-based quasi-dynamic motion quantification technique that can potentially be developed into a tool for real-time joint kinematics analysis.
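The between-session statistics quoted above, CV and ICC, follow from standard formulas. A minimal sketch using the two-way random-effects, absolute-agreement, single-measures ICC(2,1) and invented displacement values (the study's raw data are not given):

```python
import numpy as np

def icc_2_1(x):
    """Two-way random effects, absolute agreement, single measures ICC(2,1).
    x: (n_subjects, k_sessions) array of measurements."""
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)   # per-subject means
    col_means = x.mean(axis=0)   # per-session means
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)  # between subjects
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)  # between sessions
    sse = np.sum((x - grand) ** 2) - (n - 1) * msr - (k - 1) * msc
    mse = sse / ((n - 1) * (k - 1))                       # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

def mean_cv_percent(x):
    """Average within-subject coefficient of variation, in percent."""
    return float(np.mean(x.std(axis=1, ddof=1) / x.mean(axis=1)) * 100)

# Hypothetical displacements (mm) from two rotoscoping sessions:
sessions = np.array([[2.0, 2.1], [4.9, 5.0], [8.1, 7.9], [10.0, 10.2]])
```

Close agreement between sessions drives the ICC toward 1 and the CV toward 0, matching the pattern of the reported values.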
A multi-purpose electromagnetic actuator for magnetic resonance elastography.
Feng, Yuan; Zhu, Mo; Qiu, Suhao; Shen, Ping; Ma, Shengyuan; Zhao, Xuefeng; Hu, Chun-Hong; Guo, Liang
2018-04-19
An electromagnetic actuator was designed for magnetic resonance elastography (MRE). The actuator is unique in that it is simple, portable, and capable of brain, abdomen, and phantom imaging. A custom-built control unit was used for controlling the vibration frequency and synchronizing the trigger signals. An actuation unit was built and mounted on specifically designed clamps and holders for the different imaging applications. MRE experiments on gel phantoms, the brain, and the liver showed that the actuator could produce stable and consistent mechanical waves. Shear moduli estimated using the local frequency estimation method demonstrated that the measurement results were in line with those from MRE studies using different actuation systems. The relatively easy setup procedure and simple design indicate that the actuator system has the potential to be applied in many different clinical studies. Copyright © 2018 Elsevier Inc. All rights reserved.
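The local frequency estimation step recovers a local wavelength of the propagating shear wave; under the usual linear-elastic, locally homogeneous assumption, shear modulus then follows from G = ρ(fλ)². A minimal numeric check (the tissue values below are illustrative, not from the paper):

```python
# For a locally planar shear wave of actuation frequency f (Hz) with an
# estimated local wavelength lam (m) in tissue of density rho (kg/m^3),
# linear elasticity gives shear modulus G = rho * (f * lam)^2.
def shear_modulus(rho, f, lam):
    return rho * (f * lam) ** 2

# Example: 60 Hz actuation, ~25 mm local wavelength, density ~1000 kg/m^3
G = shear_modulus(1000.0, 60.0, 0.025)  # ~2.25 kPa, in the soft-tissue range
```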
Precision Cleaning - Path to Premier
NASA Technical Reports Server (NTRS)
Mackler, Scott E.
2008-01-01
ITT Space Systems Division's new Precision Cleaning facility provides critical cleaning and packaging of aerospace flight hardware and optical payloads to meet customer performance requirements. The Precision Cleaning Path to Premier Project was a 2007 capital project and is a key element in the approved Premier Resource Management - Integrated Supply Chain Footprint Optimization Project. Formerly, precision cleaning was located offsite in a leased building. A new facility equipped with modern precision cleaning equipment, including advanced process analytical technology and improved capabilities, was designed and built after outsourcing solutions were investigated and found lacking in their ability to meet quality specifications and schedule needs. SSD cleans parts that range in size from a single threaded fastener all the way up to large composite structures. Materials that can be processed include optics, composites, metals, and various high-performance coatings. We are required to provide verification to our customers that we have met their particulate and molecular cleanliness requirements, and we have that analytical capability in the new facility. The new facility footprint is approximately half the size of the former leased operation and provides double the throughput. Process improvements and new cleaning equipment are projected to increase first-pass yield from 78% to 98%, avoiding $300K+/yr in rework costs. Cost avoidance of $350K/yr will result from the elimination of rent, IT services, and transportation, and from decreased utility costs. Savings due to reduced staff are expected to net $400-500K/yr.
The Integration of COTS/GOTS within NASA's HST Command and Control System
NASA Technical Reports Server (NTRS)
Pfarr, Thomas; Reis, James E.; Obenschain, Arthur F. (Technical Monitor)
2001-01-01
NASA's mission-critical Hubble Space Telescope (HST) command and control system has been re-engineered with COTS/GOTS and minimal custom code. This paper focuses on the design of this new HST Control Center System (CCS) and the lessons learned throughout its development. CCS currently utilizes 31 COTS/GOTS products with an additional 12 million lines of custom glueware code; the new CCS exceeds the capabilities of the original system while reducing the lines of custom code by more than 50%. The lifecycle of COTS/GOTS products will be examined, including the package selection process, evaluation process, and integration process. The advantages, disadvantages, issues, concerns, and lessons learned from integrating COTS/GOTS into NASA's mission-critical HST CCS will be examined in detail. Command and control systems designed with traditional custom code development efforts will be compared with command and control systems designed with new development techniques relying heavily on COTS/GOTS integration. This paper will reveal the many hidden costs of COTS/GOTS solutions when compared to traditional custom code development efforts, including training expenses, consulting fees, and long-term maintenance expenses.
Building customer capital through knowledge management processes in the health care context.
Liu, Sandra S; Lin, Carol Yuh-Yun
2007-01-01
Customer capital is a value generated and an asset developed from customer relationships. Successfully managing these relationships is enhanced by knowledge management (KM) infrastructure that captures and transfers customer-related knowledge. The execution of such a system relies on the vision and determination of the top management team (TMT). The health care industry in today's knowledge economy encounters consumerism challenges similar to those of the business sector. Developing customer capital is critical for hospitals to remain competitive in the market. This study aims to provide a taxonomy for cultivating market-based organizational learning that leads to the building of customer capital and the attainment of desirable financial performance in health care. With the advancement of technology, the KM system plays an important moderating role in the entire process. The customer capital issue has not been fully explored in either the business or the health care industry, and the exploratory nature of such a pursuit calls for a qualitative approach. This study examines the proposed taxonomy with the case hospital; the lessons learned are also reflected against three US-based health networks. The TMT incorporated the knowledge processes of conceptualization and transformation in their organizational mission. The market-oriented learning approach promoted by the TMT helps with the accumulation and sharing of knowledge that prepares the hospital for the dynamics of the marketplace. Their key knowledge advancement relies on both the professional arena and the feedback of customers. The institutionalization of the KM system and organizational culture expands the hospital's customer capital.
The implication is twofold: (1) the TMT is imperative for the success of building customer capital through KM process; and (2) the team effort should be enhanced with a learning culture and sharing spirit, in particular, active nurse participation in decision making and frontline staff's role in providing a delightfully surprising patient experience.
Quality Function Deployment for Large Systems
NASA Technical Reports Server (NTRS)
Dean, Edwin B.
1992-01-01
Quality Function Deployment (QFD) is typically applied to small subsystems. This paper describes efforts to extend QFD to large scale systems. It links QFD to the system engineering process, the concurrent engineering process, the robust design process, and the costing process. The effect is to generate a tightly linked project management process of high dimensionality which flushes out issues early to provide a high quality, low cost, and, hence, competitive product. A pre-QFD matrix linking customers to customer desires is described.
Emailing Drones: From Design to Test Range to ARS Offices and into the Field
NASA Astrophysics Data System (ADS)
Fuka, D. R.; Singer, S.; Rodriguez, R., III; Collick, A.; Cunningham, A.; Kleinman, P. J. A.; Manoukis, N. C.; Matthews, B.; Ralston, T.; Easton, Z. M.
2017-12-01
Unmanned aerial vehicles (UAVs or `drones') are one of the newest tools available for collecting geo- and biological-science data in the field, though today's commercial drones come in only a small range of options. While scientific research has benefitted from the enhanced topographic and surface characterization data that UAVs can provide through traditional image-based remote sensing techniques, drones have significantly greater mission-specific potential than is currently utilized. The reasons for this under-utilization are twofold: 1) with their broad capabilities comes the need for care in implementation, and as such FAA and other regulatory agencies around the world have blanket regulations that can inhibit new designs from being implemented; and 2) current multi-mission, multi-payload commercial drones have to be over-designed to compensate for the fact that they are very difficult to stabilize for multiple payloads, leading to a much higher cost than necessary. For this project, we explore and demonstrate a workflow to optimize the design, testing, approval, and implementation of embarrassingly inexpensive mission-specific drones, with two use cases. The first will follow the process from design (at VTech and UH Hilo) to field implementation (by USDA-ARS in PA and Extension in VA) of several custom water quality monitoring drones, printed on demand at ARS and Extension offices after testing at the Pan-Pacific UAS Test Range Complex (PPUTRC). This type of customized drone can allow for an increased understanding of the transition from non-point source to point source agri-chemical and pollutant transport in watershed systems. The second use case will follow the same process, resulting in customized drones with pest-specific traps built into the design. This class of customized drone can facilitate IPM pest monitoring programs nationwide, decreasing the intensive and costly quarantine and population elimination measures that currently exist.
This multi-institutional project works toward an optimized workflow where scientists can quickly 1) customize drones to meet specific purposes, 2) have them tested in FAA Test Ranges, and 3) get them certified and working in the field, while 4) cutting their cost to significantly less than what is currently available.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Costeux, Stephane; Bunker, Shanon
The objective of this project was to explore and potentially develop high-performing insulation with increased R/inch and low impact on climate change that would help design highly insulating building envelope systems with more durable performance and lower overall system cost than envelopes with equivalent performance made with materials available today. The proposed technical approach relied on insulation foams with nanoscale pores (about 100 nm in size) in which heat transfer is decreased. Through the development of new foaming methods, new polymer formulations, and new analytical techniques, and by advancing the understanding of how cells nucleate, expand, and stabilize at the nanoscale, Dow successfully invented and developed methods to produce foams with 100 nm cells and 80% porosity by batch foaming at the laboratory scale. Measurements of the gas conductivity on small nanofoam specimens confirmed quantitatively the benefit of nanoscale cells (the Knudsen effect) for increasing insulation value, which was the key technical hypothesis of the program. To bring this technology closer to a viable semi-continuous/continuous process, the project team modified an existing continuous extrusion foaming process and designed and built a custom system to produce 6" x 6" foam panels. Dow demonstrated for the first time that nanofoams can be produced in both processes. However, due to technical delays, foam characteristics achieved so far fall short of the 100 nm target set for optimal insulation foams. In parallel with the technology development, effort was directed to determining the most promising applications for nanocellular insulation foam. A Voice of Customer (VOC) exercise confirmed that demand for high-R-value products will rise due to increased building code requirements in the near future, but that acceptance of novel products by the building industry may be slow. Partnerships with green builders, initial launches in smaller markets (e.g. EIFS), and efforts to drive cost down will help acceptance in residential and commercial retrofit and new construction.
Characterization of a phantom setup for breast conserving cancer surgery
NASA Astrophysics Data System (ADS)
Chadwell, Jacob T.; Conley, Rebekah H.; Collins, Jarrod A.; Meszoely, Ingrid M.; Miga, Michael I.
2016-03-01
The purpose of this work is to develop an anatomically and mechanically representative breast phantom for the validation of breast conserving surgical therapies, specifically, in this case, image-guided surgeries. Using three patients scheduled for lumpectomy and four healthy volunteers in mock surgical presentations, the magnitude, direction, and location of breast deformations were analyzed. A phantom setup was then designed to approximate such deformations in a mock surgical environment. Specifically, commercially available and custom-built polyvinyl alcohol (PVA) phantoms were used to mimic breast tissue during surgery. A custom-designed deformation apparatus was then created to reproduce deformations seen in typical clinical setups of the pre- and intra-operative breast geometry. Quantitative analysis of the human subjects yielded a positive correlation between breast volume and amount of breast deformation. Phantom results reflected similar behavior, with the custom-built PVA phantom outperforming the commercial phantom.
ERIC Educational Resources Information Center
Burlington County Coll., Pemberton, NJ.
Prepared for use by staff in development workshops at Burlington County College (BCC), in New Jersey, this handbook offers college-wide guidelines for improving the quality of service provided to internal and external customers, and reviews key elements of BCC's Customer Service System (CSS), a computerized method of recording and following-up on…
NASA Astrophysics Data System (ADS)
Chen, Ruey-Shun; Tsai, Yung-Shun; Tu, Arthur
In this study we propose a manufacturing control framework based on radio-frequency identification (RFID) technology and a distributed information system to construct a mass-customization production process in a loosely coupled shop-floor control environment. On the basis of this framework, we developed RFID middleware and an integrated information system for tracking and controlling the manufacturing process flow. A bicycle manufacturer was used to demonstrate the prototype system. The findings of this study were that the proposed framework can improve the visibility and traceability of the manufacturing process as well as enhance process quality control and real-time production pedigree access. Using this framework, an enterprise can easily integrate an RFID-based system into its manufacturing environment to facilitate mass customization and a just-in-time production model.
Proposal of custom made wrist orthoses based on 3D modelling and 3D printing.
Abreu de Souza, Mauren; Schmitz, Cristiane; Marega Pinhel, Marcelo; Palma Setti, Joao A; Nohama, Percy
2017-07-01
Accessibility of three-dimensional (3D) technologies, such as 3D scanning systems and additive manufacturing (like 3D printers), allows a variety of 3D applications. For medical applications in particular, these modalities are gaining a lot of attention, enabling several opportunities for healthcare. The literature reports several cases applying both technologies, but none focuses on spreading awareness of how this technology could benefit the health sector. This paper proposes a new methodology, which employs both 3D modelling and 3D printing for building orthoses that could better fit the demands of different patients. Additionally, there is an opportunity for sharing expertise, as it represents a trend in the maker movement. As a result of the proposed approach, we present a case study based on a volunteer who needs an immobilization orthosis, built as an exemplification of the whole process. This proposal also employs freely available 3D models and software, giving it a strong social impact. As a result, it enables the implementation and effective usability of a variety of built-to-fit solutions, harnessing useful and smarter technologies for the healthcare sector.
Advanced active quenching circuit for ultra-fast quantum cryptography.
Stipčević, Mario; Christensen, Bradley G; Kwiat, Paul G; Gauthier, Daniel J
2017-09-04
Commercial photon-counting modules based on actively quenched solid-state avalanche photodiode sensors are used in a wide variety of applications. Manufacturers characterize their detectors by specifying a small set of parameters, such as detection efficiency, dead time, dark count rate, afterpulsing probability, and single-photon arrival-time resolution (jitter). However, they usually do not specify the range of conditions over which these parameters are constant, or present a sufficient description of the characterization process. In this work, we perform a few novel tests on two commercial detectors and identify an additional set of imperfections that must be specified to sufficiently characterize their behavior. These include rate-dependence of the dead time and jitter, detection delay shift, and "twilighting". We find that these additional non-ideal behaviors can lead to unexpected effects or a strong deterioration of the performance of a system using these devices. We explain their origin by an in-depth analysis of the active quenching process. To mitigate the effects of these imperfections, a custom-built detection system is designed using a novel active quenching circuit. Its performance is compared against two commercial detectors in a fast quantum key distribution system with hyper-entangled photons and a random number generator.
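One reason rate-dependence matters is that even a fixed dead time makes the measured count rate understate the true rate at high fluxes. The textbook non-paralyzable correction (a standard detector model, not a result from this paper) is:

```python
def true_rate_nonparalyzable(measured_rate, dead_time):
    """Textbook non-paralyzable dead-time correction: n = m / (1 - m * tau),
    where m is the measured rate (counts/s) and tau the dead time (s)."""
    return measured_rate / (1.0 - measured_rate * dead_time)

# Example: 1e6 counts/s measured with a 50 ns dead time
r = true_rate_nonparalyzable(1e6, 50e-9)  # ~1.053e6 counts/s true rate
```

If the dead time itself varies with rate, as the paper reports, a single tau in this formula is no longer adequate, which is exactly why the additional characterization matters.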
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-12
... generally favorable feedback concerning its proposed rule change, given the built-in customer protections in... State or Straddle State, the Exchange believes that providing market participants time to re-evaluate a... circumstances. Broker-dealers also should be mindful of their obligations to customers that may or may not be...
Teaching the Concept of Risk: Blended Learning Using a Custom-Built Prediction Market
ERIC Educational Resources Information Center
Garvey, John; Buckley, Patrick
2010-01-01
There has been much research into the role of technology in promoting student engagement and learning activity in third-level education. This article documents an innovative application of technology in a large, undergraduate business class in risk management. The students' learning outcomes are reinforced by activity in a custom-designed…
NASA Astrophysics Data System (ADS)
Yang, Byungkuen; Cho, Jee-Hyun; Song, Simon
2016-11-01
For clinical use, magnetic resonance velocimetry (MRV) is a versatile flow visualization technique: it handles opaque flows and complex geometries, requires no tracer particles, and allows facile, fast, non-invasive measurement of three-dimensional, three-component velocity vectors. However, the spatial resolution of a commercial MR machine is lower than that of optics-based techniques like PIV, while clinical applications such as cardiovascular flow visualization require accurate measurement or estimation of wall shear stress (WSS) at high spatial resolution. We developed a custom-built solenoid RF coil for phase-contrast (PC) MRV to improve its resolution. We compared signal-to-noise ratio, WSS estimates, and partial-volume effects near the wall between the custom RF coil and a commercial coil. A Hagen-Poiseuille flow was also analyzed with the custom RF coil. This work was supported by the National Research Foundation of Korea (NRF) Grant funded by the Korea government (MSIP) (No. 2016R1A2B3009541).
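WSS estimation is resolution-sensitive because it requires the velocity gradient at the wall. For the Hagen-Poiseuille flow mentioned above there is an analytic reference value, so a coarse finite-difference estimate can be checked against it; the fluid and geometry values below are illustrative, not the paper's:

```python
import numpy as np

# Hagen-Poiseuille flow: u(r) = u_max * (1 - (r/R)^2). The analytic wall
# shear stress is tau_w = 2 * mu * u_max / R, a reference for judging how
# voxel size biases gradient-based WSS estimates from PC-MRV data.
mu, u_max, R = 3.5e-3, 0.5, 5e-3   # blood-like viscosity (Pa*s), m/s, m
tau_analytic = 2 * mu * u_max / R  # 0.7 Pa

# Finite-difference estimate from a coarse 6-point "voxel" profile:
r = np.linspace(0.0, R, 6)
u = u_max * (1 - (r / R) ** 2)
tau_fd = mu * (u[-2] - u[-1]) / (r[-1] - r[-2])  # one-sided gradient at wall
# tau_fd underestimates tau_analytic; finer spatial resolution narrows the gap.
```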
Risk-based analysis and decision making in multi-disciplinary environments
NASA Technical Reports Server (NTRS)
Feather, Martin S.; Cornford, Steven L.; Moran, Kelly
2003-01-01
A risk-based decision-making process, conceived of and developed at JPL and NASA, has been used to help plan and guide novel technology applications for use on spacecraft. These applications exemplify key challenges inherent in the multi-disciplinary design of novel technologies deployed in mission-critical settings. 1) Cross-disciplinary concerns are numerous (e.g., spacecraft involve navigation, propulsion, telecommunications). These concerns are cross-coupled and interact in multiple ways (e.g., electromagnetic interference, heat transfer). 2) Time and budget pressures constrain development, and operational resources constrain the resulting system (e.g., mass, volume, power). 3) Spacecraft are critical systems that must operate correctly the first time in only partially understood environments, with no chance for repair. 4) Past experience provides only a partial guide: new mission concepts are enhanced and enabled by new technologies for which past experience is lacking. The decision-making process rests on quantitative assessments of the relationships between three classes of information: objectives (the things the system is to accomplish and constraints on its operation and development), risks (whose occurrence detracts from objectives), and mitigations (options for reducing the likelihood and/or severity of risks). The process successfully guides experts to pool their knowledge, using custom-built software to support information gathering and decision-making.
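The quantitative core of such a process can be sketched as simple bookkeeping: each risk detracts likelihood × severity from the objectives, and each mitigation scales a risk's likelihood down at some cost. This is a simplified illustration (names and numbers invented), not the actual JPL software:

```python
# Each risk: (likelihood, severity); each mitigation: (risk, reduction, cost).
risks = {"EMI": (0.30, 8.0), "thermal": (0.10, 9.0)}
mitigations = {
    "shielding": ("EMI", 0.5, 2.0),
    "heat_pipe": ("thermal", 0.7, 3.0),
}

def residual_detriment(selected):
    """Total expected detriment to objectives after applying mitigations."""
    like = {r: l for r, (l, s) in risks.items()}
    for m in selected:
        r, reduction, _cost = mitigations[m]
        like[r] *= (1.0 - reduction)
    return sum(like[r] * risks[r][1] for r in risks)

base = residual_detriment([])                    # 0.3*8 + 0.1*9 = 3.3
with_shield = residual_detriment(["shielding"])  # 0.15*8 + 0.9 = 2.1
```

In practice the software would search for the mitigation subset that minimizes residual detriment subject to a cost budget; this sketch only shows the scoring step.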
Application of the airborne ocean color imager for commercial fishing
NASA Technical Reports Server (NTRS)
Wrigley, Robert C.
1993-01-01
The objective of the investigation was to develop a commercial remote sensing system for providing near-real-time data (within one day) in support of commercial fishing operations. The Airborne Ocean Color Imager (AOCI) had been built for NASA by Daedalus Enterprises, Inc., but it needed certain improvements, data processing software, and a delivery system to make it into a commercial system for fisheries. Two products were developed to support this effort: the AOCI with its associated processing system, and an information service for both commercial and recreational fisheries to be created by Spectro Scan, Inc. The investigation achieved all technical objectives: improving the AOCI, creating software for atmospheric correction and bio-optical output products, georeferencing the output products, and creating a delivery system to get those products into the hands of commercial and recreational fishermen in near-real-time. The first set of business objectives involved Daedalus Enterprises and was also achieved: they have an improved AOCI and new data processing software, with a set of example data products for fisheries applications to show their customers. Daedalus' marketing activities showed the need for simplification of the product for fisheries, but they successfully marketed the current version to an Italian consortium. The second set of business objectives tasked Spectro Scan with providing an information service; these objectives could not be achieved because Spectro Scan was unable to obtain the venture capital needed to start up operations.
Wei, L; Chen, H; Zhou, Y S; Sun, Y C; Pan, S X
2017-02-18
To compare the technician fabrication time and clinical working time of custom trays fabricated using two different methods, three-dimensional printed custom trays and conventional custom trays, and to assess the feasibility of computer-aided design/computer-aided manufacturing (CAD/CAM) custom trays in clinical use from the perspective of clinical time cost. Twenty edentulous patients were recruited into this prospective, single-blind, randomized, self-controlled clinical trial. Two custom trays were fabricated for each participant: one using the functional suitable denture (FSD) system through a CAD/CAM process, and the other manually, using conventional methods. Final impressions were then taken with both custom trays, and the final impressions were used to fabricate complete dentures. The technician production time for the custom trays and the clinical working time for taking the final impression were recorded. The average times spent fabricating the three-dimensional printed custom trays using the FSD system and fabricating the conventional custom trays manually were (28.6±2.9) min and (31.1±5.7) min, respectively. The average times spent making the final impression with the three-dimensional printed custom trays and with the conventional custom trays were (23.4±11.5) min and (25.4±13.0) min, respectively. There were significant differences in the technician fabrication time and the clinical working time between the three-dimensional printed custom trays and the conventional custom trays fabricated manually (P<0.05).
Both the fabrication of the three-dimensional printing custom trays using the FSD system and the final impressions taken with them required less time than the conventional manually fabricated custom trays, which shows that the FSD three-dimensional printing custom trays are less time-consuming in both the clinical and the laboratory process. In addition, when custom trays are manufactured by three-dimensional printing, there is no need to pour a preliminary cast after taking the primary impression, which saves impression and model material. For complete denture restoration, manufacturing custom trays with the FSD system is worth popularizing.
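The significance claim (P<0.05) in a self-controlled design of this kind rests on a paired-samples comparison. The sketch below shows a generic paired t statistic; the timing data are made up for illustration and are not the trial's measurements.

```python
import math

# Paired-samples t statistic (H0: mean within-pair difference = 0).
# Generic textbook formula; sample data are hypothetical.
def paired_t_statistic(x, y):
    d = [a - b for a, b in zip(x, y)]                # within-pair differences
    n = len(d)
    mean = sum(d) / n
    var = sum((v - mean) ** 2 for v in d) / (n - 1)  # sample variance of d
    return mean / math.sqrt(var / n)

# Hypothetical fabrication times (min) for 4 patients, printed vs. manual.
printed = [28.0, 29.0, 27.5, 30.0]
manual = [31.0, 30.5, 32.0, 31.5]
t = paired_t_statistic(printed, manual)  # negative: printed trays are faster
```

The resulting t would be compared against the t distribution with n-1 degrees of freedom to obtain the P value.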
Towards a Web-Enabled Geovisualization and Analytics Platform for the Energy and Water Nexus
NASA Astrophysics Data System (ADS)
Sanyal, J.; Chandola, V.; Sorokine, A.; Allen, M.; Berres, A.; Pang, H.; Karthik, R.; Nugent, P.; McManamay, R.; Stewart, R.; Bhaduri, B. L.
2017-12-01
Interactive data analytics are playing an increasingly vital role in the generation of new, critical insights regarding the complex dynamics of the energy/water nexus (EWN) and its interactions with climate variability and change. Integration of impacts, adaptation, and vulnerability (IAV) science with emerging, and increasingly critical, data science capabilities offers promising potential to meet the needs of the EWN community. To enable the exploration of pertinent research questions, a web-based geospatial visualization platform is being built that integrates a data analysis toolbox with advanced data fusion and data visualization capabilities to create a knowledge discovery framework for the EWN. The system, when fully built out, will offer several geospatial visualization capabilities, including statistical visual analytics, clustering, principal-component analysis, and dynamic time warping; it will also support uncertainty visualization, the exploration of data provenance, and machine learning discoveries to render diverse types of geospatial data and facilitate interactive analysis. Key components in the system architecture include NASA's WebWorldWind, the Globus toolkit, PostgreSQL, and other custom-built software modules.
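Of the analysis capabilities listed above, dynamic time warping is compact enough to sketch. The following is a generic textbook implementation of the DTW distance between two one-dimensional series, not code from the platform itself.

```python
# Classic dynamic time warping (DTW) distance between two 1-D series,
# illustrating the technique named in the abstract above.
def dtw_distance(a, b):
    n, m = len(a), len(b)
    inf = float("inf")
    # cost[i][j]: best cumulative cost aligning a[:i] with b[:j]
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # repeat b[j-1]
                                 cost[i][j - 1],      # repeat a[i-1]
                                 cost[i - 1][j - 1])  # step both
    return cost[n][m]
```

Unlike Euclidean distance, DTW tolerates local stretching, so `dtw_distance([1, 2, 3], [1, 2, 2, 3])` is zero even though the series have different lengths.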
Sloane, Elliot; Rosow, Eric; Adam, Joe; Shine, Dave
2005-01-01
The Clinical Engineering (a.k.a. Biomedical Engineering) Department has heretofore lagged in adoption of some of the leading-edge information system tools used in other industries. This present application is part of a DOD-funded SBIR grant to improve the overall management of medical technology, and describes the capabilities that Strategic Graphical Dashboards (SGDs) can afford. This SGD is built on top of an Oracle database, and uses custom-written graphic objects like gauges, fuel tanks, and Geographic Information System (GIS) maps to improve and accelerate decision making.
Endomicroscopy imaging of epithelial structures using tissue autofluorescence
NASA Astrophysics Data System (ADS)
Lin, Bevin; Urayama, Shiro; Saroufeem, Ramez M. G.; Matthews, Dennis L.; Demos, Stavros G.
2011-04-01
We explore autofluorescence endomicroscopy as a potential tool for real-time visualization of epithelial tissue microstructure and organization in a clinical setting. The design parameters are explored using two experimental systems--an Olympus Medical Systems Corp. stand-alone clinical prototype probe, and a custom built bench-top rigid fiber conduit prototype. Both systems entail ultraviolet excitation at 266 nm and/or 325 nm using compact laser sources. Preliminary results using ex vivo animal and human tissue specimens suggest that this technology can be translated toward in vivo application to address the need for real-time histology.
A new type of tri-axial accelerometers with high dynamic range MEMS for earthquake early warning
NASA Astrophysics Data System (ADS)
Peng, Chaoyong; Chen, Yang; Chen, Quansheng; Yang, Jiansi; Wang, Hongti; Zhu, Xiaoyi; Xu, Zhiqiang; Zheng, Yu
2017-03-01
Earthquake Early Warning Systems (EEWS) have shown their efficiency for earthquake damage mitigation. With the progress of low-cost Micro-Electro-Mechanical Systems (MEMS), many types of MEMS-based accelerometers have been developed and widely used in deploying large-scale, dense seismic networks for EEWS. However, the noise performance of these commercially available MEMS devices is still insufficient for weak seismic signals, leading to large scatter in the estimation of early-warning parameters. In this study, we developed a new type of tri-axial accelerometer for EEWS based on high-dynamic-range MEMS with a low noise level. It is a MEMS-integrated data logger with built-in seismological processing. The device is built on a custom-tailored Linux 2.6.27 operating system and uses the STA/LTA algorithm for automatic seismic event detection. When a seismic event is detected, peak ground parameters of all data components are calculated at an interval of 1 s, and τc-Pd values are evaluated using the initial 3 s of the P wave. These values are then organized as a trigger packet and actively sent to the processing center for combined event detection. The output data of all three components are calibrated to a sensitivity of 500 counts/(cm/s²). Several tests and a real field deployment were performed to evaluate the performance of this device. The results show that the dynamic range reaches 98 dB for the vertical component and 99 dB for the horizontal components, and the majority of bias temperature coefficients are lower than 200 μg/°C. In addition, the results of event detection and the field deployment demonstrate its suitability for EEWS and rapid intensity reporting.
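The STA/LTA trigger named above compares a short-term average of signal amplitude with a long-term average and declares an event when their ratio exceeds a threshold. A minimal single-pass sketch follows; the window lengths and threshold are illustrative, not the device's actual settings.

```python
# Generic STA/LTA event trigger: returns the sample index at which the
# short-term/long-term amplitude ratio first crosses the threshold.
# Windows and threshold here are illustrative defaults, not the device's.
def sta_lta(samples, sta_len, lta_len, threshold):
    for i in range(lta_len, len(samples) + 1):
        sta = sum(abs(x) for x in samples[i - sta_len:i]) / sta_len
        lta = sum(abs(x) for x in samples[i - lta_len:i]) / lta_len
        if lta > 0 and sta / lta >= threshold:
            return i - 1  # trigger index
    return None  # no event detected

# Quiet background followed by a sudden strong arrival:
trace = [1.0] * 100 + [10.0] * 10
trigger = sta_lta(trace, sta_len=5, lta_len=50, threshold=3.0)
```

A real implementation would use recursive averages rather than recomputing both windows at every sample, but the detection logic is the same.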
Patterson, Olga V; Freiberg, Matthew S; Skanderson, Melissa; J Fodeh, Samah; Brandt, Cynthia A; DuVall, Scott L
2017-06-12
In order to investigate the mechanisms of cardiovascular disease in HIV-infected and uninfected patients, an analysis of echocardiogram reports is required for a large longitudinal multi-center study. A natural language processing system using a dictionary lookup, rules, and patterns was developed to extract heart function measurements that are typically recorded in echocardiogram reports as measurement-value pairs. Curated semantic bootstrapping was used to create a custom dictionary that extends existing terminologies based on terms that actually appear in the medical record. A novel disambiguation method based on semantic constraints was created to identify and discard erroneous alternative definitions of the measurement terms. The system was built on a scalable framework, making it available for processing large datasets. The system was developed for and validated on notes from three sources: general clinic notes, echocardiogram reports, and radiology reports. The system achieved F-scores of 0.872, 0.844, and 0.877, with precision of 0.936, 0.982, and 0.969 for each dataset respectively, averaged across all extracted values. Left ventricular ejection fraction (LVEF) is the most frequently extracted measurement. The precision of extraction of the LVEF measure ranged from 0.968 to 1.0 across different document types. This system illustrates the feasibility and effectiveness of large-scale information extraction from clinical data. New clinical questions can be addressed in the domain of heart failure using retrospective clinical data analysis because key heart function measurements can be successfully extracted using natural language processing.
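Measurement-value pairs such as LVEF readings are a natural fit for pattern-based extraction. The sketch below is a generic regular-expression illustration with hypothetical term and report text; it is not the actual dictionary, rules, or patterns of the system described above.

```python
import re

# Hypothetical measurement-value pattern: a term for ejection fraction
# followed by an optional linking word and a percentage value.
MEASUREMENT_PATTERN = re.compile(
    r"(?P<term>ejection fraction|LVEF|EF)\s*(?:is|of|:|=)?\s*"
    r"(?P<value>\d{1,3}(?:\.\d+)?)\s*%",
    re.IGNORECASE,
)

def extract_lvef(report_text):
    """Return (term, value) pairs found in an echocardiogram report."""
    return [(m.group("term"), float(m.group("value")))
            for m in MEASUREMENT_PATTERN.finditer(report_text)]

pairs = extract_lvef("Left ventricle: LVEF is 55 %. Ejection fraction of 60% noted.")
```

The production system layers dictionary lookup and semantic disambiguation on top of this kind of matching to reject terms used in other senses.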
NASA Astrophysics Data System (ADS)
Boadi, J.; Sangwal, V.; MacNeil, S.; Matcher, S. J.
2015-03-01
The prevailing hypothesis for the existence and healing of the avascular corneal epithelium is that this layer of cells is continually produced by stem cells in the limbus and transported onto the cornea to mature into corneal epithelium. Limbal Stem Cell Deficiency (LSCD), in which the stem cell population is depleted, can lead to blindness. LSCD can be caused by chemical and thermal burns to the eye. A popular treatment, especially in emerging economies such as India, is the transplantation of limbal stem cells onto the damaged limbus with the hope of repopulating the region and hence regenerating the corneal epithelium. In order to gain insights into the success rates of this treatment, new imaging technologies are needed to track the transplanted cells. Optical Coherence Tomography (OCT) is well known for its high-resolution in vivo images of the retina. A custom OCT system has been built to image the corneal surface and investigate the fate of transplanted limbal stem cells. We evaluate two methods to label and track transplanted cells: melanin labelling and magneto-labelling. To evaluate melanin labelling, stem cells are loaded with melanin and then transplanted onto a rabbit cornea denuded of its epithelium. The melanin displays strongly enhanced backscatter relative to normal cells. To evaluate magneto-labelling, the stem cells are loaded with magnetic nanoparticles (20-30 nm in size) and then imaged with a custom-built, magneto-motive OCT system.
Semantic Enhancement for Enterprise Data Management
NASA Astrophysics Data System (ADS)
Ma, Li; Sun, Xingzhi; Cao, Feng; Wang, Chen; Wang, Xiaoyuan; Kanellos, Nick; Wolfson, Dan; Pan, Yue
Taking customer data as an example, the paper presents an approach to enhance the management of enterprise data by using Semantic Web technologies. Customer data is the most important kind of core business entity a company uses repeatedly across many business processes and systems, and customer data management (CDM) is becoming critical for enterprises because it keeps a single, complete and accurate record of customers across the enterprise. Existing CDM systems focus on integrating customer data from all customer-facing channels and front and back office systems through multiple interfaces, as well as publishing customer data to different applications. To make the effective use of the CDM system, this paper investigates semantic query and analysis over the integrated and centralized customer data, enabling automatic classification and relationship discovery. We have implemented these features over IBM Websphere Customer Center, and shown the prototype to our clients. We believe that our study and experiences are valuable for both Semantic Web community and data management community.
The design of a petabyte archive and distribution system for the NASA ECS project
NASA Technical Reports Server (NTRS)
Caulk, Parris M.
1994-01-01
The NASA EOS Data and Information System (EOSDIS) Core System (ECS) will contain one of the largest data management systems ever built - the ECS Science and Data Processing System (SDPS). SDPS is designed to support long term Global Change Research by acquiring, producing, and storing earth science data, and by providing efficient means for accessing and manipulating that data. The first two releases of SDPS, Release A and Release B, will be operational in 1997 and 1998, respectively. Release B will be deployed at eight Distributed Active Archiving Centers (DAACs). Individual DAACs will archive different collections of earth science data, and will vary in archive capacity. The storage and management of these data collections is the responsibility of the SDPS Data Server subsystem. It is anticipated that by the year 2001, the Data Server subsystem at the Goddard DAAC must support a near-line data storage capacity of one petabyte. The development of SDPS is a system integration effort in which COTS products will be used in favor of custom components in every possible way. Some software and hardware capabilities required to meet ECS data volume and storage management requirements beyond 1999 are not yet supported by available COTS products. The ECS project will not undertake major custom development efforts to provide these capabilities. Instead, SDPS and its Data Server subsystem are designed to support initial implementations with current products, and provide an evolutionary framework that facilitates the introduction of advanced COTS products as they become available. This paper provides a high-level description of the Data Server subsystem design from a COTS integration standpoint, and discusses some of the major issues driving the design. The paper focuses on features of the design that will make the system scalable and adaptable to changing technologies.
NASA Astrophysics Data System (ADS)
Heeager, Lise Tordrup; Tjørnehøj, Gitte
Quality assurance technology is a formal control mechanism aiming at increasing the quality of the product exchanged between vendors and customers. Studies of the adoption of this technology in the field of system development rarely focus on the role of the relationship between the customer and vendor in the process. We have studied how the process of adopting quality assurance technology by a small Danish IT vendor developing pharmacy software for a customer in the public sector was influenced by the relationship with the customer. The case study showed that the adoption process was shaped to a high degree by the relationship and vice versa. The prior high level of trust and mutual knowledge helped the parties negotiate mutually feasible solutions throughout the adoption process. We thus advise enhancing trust-building processes to strengthen the relationships and to balance formal control and social control to increase the likelihood of a successful outcome of the adoption of quality assurance technology in a customer-vendor relationship.
2008-12-01
...forwarded to other users. D. DISCUSSION BOARD. Discussion boards are commonly referred to as "forums" and are used for asynchronous communications... This technology allows for open-ended communications in written format. A user can start a discussion by adding a posting to a community forum... and deployed. Some of the built-in features include forums, discussion boards, custom lists, calendar, to-do lists, wiki technology, email...
Color structured light imaging of skin
NASA Astrophysics Data System (ADS)
Yang, Bin; Lesicko, John; Moy, Austin; Reichenberg, Jason; Sacks, Michael; Tunnell, James W.
2016-05-01
We illustrate wide-field imaging of skin using a structured light (SL) approach that highlights the contrast from superficial tissue scattering. Setting the spatial frequency of the SL in a regime that limits the penetration depth effectively gates the image for photons that originate from the skin surface. Further, rendering the SL images in a color format provides an intuitive format for viewing skin pathologies. We demonstrate this approach in skin pathologies using a custom-built handheld SL imaging system.
Wiebrands, Michael; Malajczuk, Chris J; Woods, Andrew J; Rohl, Andrew L; Mancera, Ricardo L
2018-06-21
Molecular graphics systems are visualization tools which, upon integration into a 3D immersive environment, provide a unique virtual reality experience for research and teaching of biomolecular structure, function and interactions. We have developed a molecular structure and dynamics application, the Molecular Dynamics Visualization tool, that uses the Unity game engine combined with large scale, multi-user, stereoscopic visualization systems to deliver an immersive display experience, particularly with a large cylindrical projection display. The application is structured to separate the biomolecular modeling and visualization systems. The biomolecular model loading and analysis system was developed as a stand-alone C# library and provides the foundation for the custom visualization system built in Unity. All visual models displayed within the tool are generated using Unity-based procedural mesh building routines. A 3D user interface was built to allow seamless dynamic interaction with the model while being viewed in 3D space. Biomolecular structure analysis and display capabilities are exemplified with a range of complex systems involving cell membranes, protein folding and lipid droplets.
NASA Technical Reports Server (NTRS)
Georgieva, E. M.; Huang, W.; Heaps, W. S.
2012-01-01
A portable remote sensing system for precision column measurements of methane has been developed, built and tested at NASA GSFC. The sensor covers the spectral range from 1.636 micrometers to 1.646 micrometers, employs an air-gapped Fabry-Perot filter and a CCD camera, and has the potential to operate from a variety of platforms. The detector is an XS-1.7-320 camera unit from Xenics Infrared Solutions, which incorporates an uncooled InGaAs detector array sensitive up to 1.7 micrometers. Custom software was developed, in addition to the basic graphical user interface X-Control provided by the company, to help save and process the data. The technique and setup can be used to measure other trace gases in the atmosphere with minimal changes to the etalon and the prefilter. In this paper we describe the calibration of the system using several different approaches.
Social customer relationship management: taking advantage of Web 2.0 and Big Data technologies.
Orenga-Roglá, Sergio; Chalmeta, Ricardo
2016-01-01
The emergence of Web 2.0 and Big Data technologies has allowed a new customer relationship strategy based on interactivity and collaboration called Social Customer Relationship Management (Social CRM) to be created. This enhances customer engagement and satisfaction. The implementation of Social CRM is a complex task that involves different organisational, human and technological aspects. However, there is a lack of methodologies to assist companies in these processes. This paper shows a novel methodology that helps companies to implement Social CRM, taking into account different aspects such as social customer strategy, the Social CRM performance measurement system, the Social CRM business processes, or the Social CRM computer system. The methodology was applied to one company in order to validate and refine it.
Paek, Janghyun; Ahn, Hyo-Won; Jeong, Do-Min; Shim, Jeong-Seok; Kim, Seong-Hun; Chung, Kyu-Rhim
2015-03-25
This article presents the application of a laser welding technique to fabricate an orthodontic mini-implant provisional restoration in an edentulous area after limited orthodontic treatment. The case of a 15-year-old boy is presented. A two-piece orthodontic C-implant was placed after regaining space for a missing right mandibular central incisor. Due to the angular deviation of the implant, a customized abutment was required. The ready-made head part was milled, and the lingual part of the customized abutment was made of non-precious metal. The two parts were then laser-welded (Master 1000, Elettrolaser Italy, Verona, Italy) and an indirect lab composite (3M ESPE Sinfony, St. Paul, MN, USA) was built up. The treatment was successful, as confirmed by clinical and radiographic examinations. This provisional restoration will be used until the patient is ready for a permanent restoration. This case shows that a two-piece orthodontic C-implant system can be used to maintain a small edentulous space after orthodontic treatment.
The Mobile Bark Blower: An Evaluation of Performance and Costs
Raymond L. Sarles; David M. Emanuel
1977-01-01
A custom-built bark blower truck (MOBLOW) developed in Oregon was tested for its effectiveness in applying bark mulches, sawdust, and shavings in the eastern United States. Tests determined the bark blower's performance and cost in mulching grass-legume seedings and shrub beds with 10 bark products or wood residues. Bark blower trucks built to MOBLOW...
An Array Library for Microsoft SQL Server with Astrophysical Applications
NASA Astrophysics Data System (ADS)
Dobos, L.; Szalay, A. S.; Blakeley, J.; Falck, B.; Budavári, T.; Csabai, I.
2012-09-01
Today's scientific simulations produce output on the 10-100 TB scale. This unprecedented amount of data requires data handling techniques that are beyond what is used for ordinary files. Relational database systems have been successfully used to store and process scientific data, but the new requirements constantly generate new challenges. Moving terabytes of data among servers on a timely basis is a tough problem, even with the newest high-throughput networks. Thus, moving the computations as close to the data as possible and minimizing the client-server overhead are absolutely necessary. At least data subsetting and preprocessing have to be done inside the server process. Out-of-the-box commercial database systems perform very well in scientific applications from the perspective of data storage optimization, data retrieval, and memory management, but lack basic functionality like handling scientific data structures or enabling advanced math inside the database server. The most important gap in Microsoft SQL Server is the lack of a native array data type. Fortunately, the technology exists to extend the database server with custom-written code that enables us to address these problems. We present the prototype of a custom-built extension to Microsoft SQL Server that adds array handling functionality to the database system. With our Array Library, fixed-size arrays of all basic numeric data types can be created and manipulated efficiently. The library is also designed to integrate seamlessly with the most common math libraries, such as BLAS, LAPACK, FFTW, etc. With the help of these libraries, complex operations, such as matrix inversions or Fourier transformations, can be done on-the-fly, from SQL code, inside the database server process. 
We are currently testing the prototype with two different scientific data sets: The Indra cosmological simulation will use it to store particle and density data from N-body simulations, and the Milky Way Laboratory project will use it to store galaxy simulation data.
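The core idea, packing fixed-size numeric arrays into database values and registering computation next to the data, can be sketched outside SQL Server. The following uses Python's sqlite3 purely as a stand-in database engine; the function names and schema are hypothetical and this is not the Array Library's API.

```python
import sqlite3
import struct

# Conceptual sketch: fixed-size float64 arrays packed into BLOBs, with a
# user-defined SQL function doing the math inside the database process.
def pack_array(values):
    return struct.pack(f"<{len(values)}d", *values)

def unpack_array(blob):
    return list(struct.unpack(f"<{len(blob) // 8}d", blob))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE particles (id INTEGER PRIMARY KEY, pos BLOB)")
conn.execute("INSERT INTO particles VALUES (1, ?)",
             (pack_array([1.0, 2.0, 2.0]),))

# Register a scalar function so the query evaluates the vector norm
# server-side, next to the stored data, instead of shipping the array out.
conn.create_function("vec_norm", 1,
                     lambda b: sum(v * v for v in unpack_array(b)) ** 0.5)
norm = conn.execute("SELECT vec_norm(pos) FROM particles WHERE id = 1").fetchone()[0]
```

In SQL Server the equivalent mechanism is SQLCLR user-defined types and functions, which is the extension point the Array Library prototype builds on.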
Field emission study of carbon nanostructures
NASA Astrophysics Data System (ADS)
Zhao, Xin
Recently, carbon nanosheets (CNS), a novel nanostructure, were developed in our laboratory as a field emission source for high emission current. To characterize, understand and improve the field emission properties of CNS, an ultra-high vacuum surface analysis system was customized to conduct relevant experimental research in four distinct areas. The system includes Auger electron spectroscopy (AES), field emission energy spectroscopy (FEES), field emission I-V testing, and thermal desorption spectroscopy (TDS). Firstly, commercial Mo single tips were studied to calibrate the customized system. AES and FEES experiments indicate that a pyramidal nanotip of Ca and O elements formed on the Mo tip surface by field-induced surface diffusion. Secondly, field emission I-V testing on CNS indicates that the field emission properties of pristine nanosheets are impacted by adsorbates. For instance, in pristine samples, field emission sources can be built up instantaneously and are characterized by prominent noise levels and significant current variations. However, when CNS are processed via conditioning (run at high current), their emission properties are greatly improved and stabilized. Furthermore, only H2 desorbed from the conditioned CNS, which indicates that only H adsorbates affect emission. Thirdly, the TDS study on nanosheets revealed that the predominant locations of H residing in CNS are sp2-hybridized C on the surface and in the bulk. Fourthly, a fabrication process was developed to coat low-work-function ZrC on nanosheets for field emission enhancement. The carbide triple-peak in the AES spectra indicated that Zr carbide formed, but oxygen was not completely removed. The Zr(CxOy) coating was dispersed as nanobeads on the CNS surface. Although the work function was reduced, the coated CNS emission properties were not improved due to an increased beta factor. 
Further analysis suggests that at low emission currents (<1 μA), the H adsorbates affect emission by altering the work function. At high emission currents (>10 μA), thermal, ionic or electronic transition effects may occur, which affect the field emission process differently.
Computers for Manned Space Applications Based on Commercial Off-the-Shelf Components
NASA Astrophysics Data System (ADS)
Vogel, T.; Gronowski, M.
2009-05-01
Similar to the consumer markets, there has been an ever increasing demand in processing power, signal processing capabilities and memory space also for computers used for science data processing in space. An important driver of this development has been the payload developers for the International Space Station, requesting high-speed data acquisition and fast control loops in increasingly complex systems. Current experiments now even perform video processing and compression with their payload controllers. Nowadays the requirements for a space-qualified computer are often far beyond the capabilities of, for example, the classic SPARC architecture that is found in ERC32 or LEON CPUs. An increase in performance usually demands costly and power-consuming application-specific solutions. Continuous developments over the last few years have now led to an alternative approach that is based on complete electronics modules manufactured for commercial and industrial customers. Computer modules used in industrial environments with a high demand for reliability under harsh environmental conditions, such as chemical reactors, electrical power plants or manufacturing lines, are entered into a selection procedure. Promising candidates then undergo a detailed characterisation process developed by Astrium Space Transportation. After thorough analysis and some modifications, these modules can replace fully qualified custom-built electronics in specific, although not safety-critical, applications in manned space. This paper focuses on the benefits of COTS-based electronics modules and the necessary analyses and modifications for their utilisation in manned space applications on the ISS. Some considerations regarding overall systems architecture will also be included. Furthermore this paper will also pinpoint issues that render such modules unsuitable for specific tasks, and justify the reasons. 
Finally, the conclusion of this paper will advocate the implementation of COTS-based electronics for a range of applications within specifically adapted systems. The findings in this paper are extrapolated from two reference computer systems, both launched in 2008. One was a LEON-2 based computer installed onboard the Columbus Orbital Facility, while the other consisted mainly of a commercial PowerPC module that was modified for launch mounted on the ICC pallet in the Space Shuttle's cargo bay. Both systems are currently being upgraded and extended for future applications.
Long-range strategy for remote sensing: an integrated supersystem
NASA Astrophysics Data System (ADS)
Glackin, David L.; Dodd, Joseph K.
1995-12-01
Present large space-based remote sensing systems, and those planned for the next two decades, remain dichotomous and custom-built. An integrated architecture might reduce total cost without limiting system performance. An example of such an architecture, developed at The Aerospace Corporation, explores the feasibility of reducing overall space systems costs by forming a 'super-system' which will provide environmental, earth resources and theater surveillance information to a variety of users. The concept involves integration of programs, sharing of common spacecraft bus designs and launch vehicles, use of modular components and subsystems, integration of command and control and data capture functions, and establishment of an integrated program office. Smart functional modules that are easily tested and replaced are used wherever possible in the space segment. Data is disseminated to systems such as NASA's EOSDIS, and data processing is performed at established centers of expertise. This concept is advanced for potential application as a follow-on to currently budgeted and planned space-based remote sensing systems. We hope that this work will serve to engender discussion that may be of assistance in leading to multinational remote sensing systems with greater cost effectiveness at no loss of utility to the end user.
Approaching Error-Free Customer Satisfaction through Process Change and Feedback Systems
ERIC Educational Resources Information Center
Berglund, Kristin M.; Ludwig, Timothy D.
2009-01-01
Employee-based errors result in quality defects that can often impact customer satisfaction. This study examined the effects of a process change and feedback system intervention on error rates of 3 teams of retail furniture distribution warehouse workers. Archival records of error codes were analyzed and aggregated as the measure of quality. The…
Training the Next Generation in Space Situational Awareness Research
NASA Astrophysics Data System (ADS)
Colpo, D.; Reddy, V.; Arora, S.; Tucker, S.; Jeffries, L.; May, D.; Bronson, R.; Hunten, E.
Traditional academic SSA research has relied on commercial off the shelf (COTS) systems for collecting metric and lightcurve data. COTS systems have several advantages over a custom-built system, including cost, easy integration, technical support and short deployment timescales. We at the University of Arizona took an alternative approach to develop a sensor system for space object characterization. Five engineering students designed and built two 0.6-meter F/4 electro-optical (EO) systems for collecting lightcurve and spectral data. All the design and fabrication work was carried out over the course of two semesters as part of their senior design project, which is mandatory for the completion of their bachelor's in engineering degree. The students designed over 200 individual parts using three-dimensional modeling software (SolidWorks), and conducted detailed optical design analysis using raytracing software (ZEMAX), with oversight and advice from the faculty sponsor and Starizona, a local small business in Tucson. The components of the design were verified by test, analysis, inspection, or demonstration, per the process that the University of Arizona requires for each of its design projects. Methods used to complete this project include mechanical FEA, optical testing methods (Foucault Knife Edge Test and Couder Mask Test), tests to verify the function of the thermometers, and a final pointing model test. A surprise outcome of our exercise is that the entire cost of the design and fabrication of these two EO systems was significantly lower than a COTS alternative. With careful planning and coordination we were also able to reduce the deployment times to those of a commercial system. Our experience shows that development of hardware and software for SSA research can be accomplished in an academic environment, enabling the training of the next generation with active support from local small businesses.
A Federated Digital Identity Management Approach for Business Processes
NASA Astrophysics Data System (ADS)
Bertino, Elisa; Ferrini, Rodolfo; Musci, Andrea; Paci, Federica; Steuer, Kevin J.
Business processes have gained a lot of attention because of the pressing need for integrating existing resources and services to better fulfill customer needs. A key feature of business processes is that they are built from composable services, referred to as component services, that may belong to different domains. In such a context, flexible multi-domain identity management solutions are crucial for increased security and user-convenience. In particular, it is important that during the execution of a business process the component services be able to verify the identity of the client to check that it has the required permissions for accessing the services. To address the problem of multi-domain identity management, we propose a multi-factor identity attribute verification protocol for business processes that assures clients privacy and handles naming heterogeneity.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-13
... and Customs Enforcement, Customs and Border Protection--001 Alien File, Index, and National File... Services, Immigration and Customs Enforcement, and Customs and Border Protection--001 Alien File, Index... border protection processes. The Alien File (A-File), Index, and National File Tracking System of Records...
Investigation into some characteristics of the mass-customized production paradigm
NASA Astrophysics Data System (ADS)
Tapper, Jerome; Sundar, Pratap S.; Kamarthi, Sagar V.
2000-10-01
In recent times, as markets reach their saturation limits and customers become more demanding, a paradigm shift has been taking place from mass production to mass-customized production (MCP). The concept of mass customization (MC) focuses on satisfying a customer's unique needs with the help of new technologies such as the Internet, digital product realization, and re-configurable production facilities. In MC, the needs of an individual customer are translated into a design, the product is produced accordingly, and it is delivered to the customer. In this research, three hypotheses related to MCP are investigated using data and information collected from ten companies engaged in MCP. The three hypotheses are: (1) mass-customized production systems can be classified into make-to-stock MCP, assemble-to-order MCP, make-to-order MCP, engineer-to-order MCP, and develop-to-order MCP; (2) in mass-customized production systems, the process of customization eliminates customer sacrifice; and (3) mass-customized production systems can deliver products at mass-production cost. The preliminary study indicates that while the first hypothesis is valid, MCP companies rarely fulfill the other two.
Customers for life: marketing oral health care to older adults.
Niessen, L C
1999-09-01
Respect for and awareness of the needs of older patients from dental office staff will help such patients feel welcome in a practice. Marketing to older patients is built upon this foundation. In addition, there are other strategies for internal and external marketing aimed at older people. This article addresses the concept of turning aging patients into "customers for life."
Modernization of software quality assurance
NASA Technical Reports Server (NTRS)
Bhaumik, Gokul
1988-01-01
Customer satisfaction depends not only on functional performance but also on the quality characteristics of software products. An examination of this quality aspect of software products provides a clear, well-defined framework for quality assurance functions, which improve the life-cycle activities of software development. Software developers must be aware of the following points, which have been expressed by many quality experts: quality cannot be added on; the level of quality built into a program is a function of the quality attributes employed during the development process; and, finally, quality must be managed. These concepts guided our development of the following definition of a Software Quality Assurance function: Software Quality Assurance is a formal, planned approach of actions designed to evaluate the degree to which an identifiable set of quality attributes is present in all software systems and their products. This paper explains how this definition was developed and how it is used.
van der Linden, Helma; Talmon, Jan; Tange, Huibert; Grimson, Jane; Hasman, Arie
2005-03-01
The PropeR EHR system (PropeRWeb) is a multidisciplinary electronic health record (EHR) system for extramural care of stroke patients. The system is built from existing open-source components and is based on open standards. It is implemented as a web application using servlets and Java Server Pages (JSPs), with a CORBA connection to database servers based on the OMG HDTF specifications. PropeRWeb is a generic system that can be readily customized for use in a variety of clinical domains. The system proved to be stable and flexible, although some aspects (among others, user friendliness) could be improved. These improvements are currently under development for a second version.
NASA Astrophysics Data System (ADS)
Hsu, Jen-Feng; Dhingra, Shonali; D'Urso, Brian
2017-01-01
Mirror galvanometer systems (galvos) are commonly employed in research and commercial applications involving laser imaging, laser machining, laser light shows, and more. Here, we present a robust, moderate-speed, and cost-efficient home-built galvo system. The mechanical part of this design consists of one mirror that is tilted around two axes by multiple surface transducers. We demonstrate the capability of this galvo by scanning the mirror from a computer via a custom driver circuit. The performance of the galvo, including scan range, noise, linearity, and scan speed, is characterized. As an application, we show that this galvo system can be used in a confocal scanning microscopy system.
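As a rough illustration of how such a two-axis mirror might be driven from a computer, the sketch below generates raster-scan voltage pairs for the fast (X) and slow (Y) axes. The function name, point counts, and ±1 V range are placeholder assumptions, not parameters of the system described in the abstract.

```python
import numpy as np

def raster_scan(n_lines=4, pts_per_line=8, v_max=1.0):
    """Generate X (fast axis) and Y (slow axis) drive voltages for a
    raster scan: X sweeps -v_max..+v_max on every line, while Y steps
    once per line. Real voltage limits depend on the driver circuit
    and transducer response."""
    x_line = np.linspace(-v_max, v_max, pts_per_line)
    x = np.tile(x_line, n_lines)                              # repeat sweep per line
    y = np.repeat(np.linspace(-v_max, v_max, n_lines),
                  pts_per_line)                               # hold Y within a line
    return x, y

x, y = raster_scan()
```

Sending these sample pairs at a fixed rate to a two-channel DAC feeding the driver circuit would trace the scan pattern.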
Clean water billing monitoring system using flow liquid meter sensor and SMS gateway
NASA Astrophysics Data System (ADS)
Fahmi, F.; Hizriadi, A.; Khairani, F.; Andayani, U.; Siregar, B.
2018-03-01
A public clean water company (PDAM) is a public service designed and organized to meet the needs of the community. The number of PDAM subscribers is already very large and will continue to grow, but service to customers is still handled conventionally, by visiting the customer's home to record the last meter reading. One problem for PDAM customers is the lack of visibility into their invoices, which are issued only monthly; this makes it difficult for customers to keep track of the payment date for their water bill, and efficiency suffers as a result. The purpose of this research is to make it easy for PDAM customers to see the details of their water usage and the time of payment of their water bills, and to provide information on water discharge data, payment rates, and payment grace periods via an SMS gateway. In this study, a flow liquid meter sensor was used to capture data on the water flowing through the piping system. The sensor relies on a Hall-effect element that measures the rate of water discharge and is placed in a pipe whose diameter matches that of the sensor; as water passes through, the sensor reports the number of turns of its impeller, from which the discharge rate is derived. The test results show that the built system works well in helping customers know in detail their monthly water usage and the bill to be paid.
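The pulse-counting arithmetic behind such a meter can be sketched as follows. The calibration constant (pulses per litre) and the tariff figures are assumptions for illustration only, not values from this study; a real deployment would calibrate the installed sensor and use the utility's actual rates.

```python
def pulses_to_litres(pulse_count, k_factor=450.0):
    """Convert a Hall-sensor impeller pulse count to volume in litres.
    k_factor (pulses per litre) is an assumed calibration constant."""
    return pulse_count / k_factor

def monthly_bill(litres, rate_per_m3=500.0, fixed_charge=10000.0):
    """Estimate a bill as a volumetric charge plus a fixed charge.
    Both tariff figures are placeholders, not actual PDAM rates."""
    return (litres / 1000.0) * rate_per_m3 + fixed_charge

litres = pulses_to_litres(90000)   # 90,000 pulses over the billing period
bill = monthly_bill(litres)        # amount to report via the SMS gateway
```

The microcontroller would accumulate `pulse_count` from the sensor interrupt and periodically push `litres` and `bill` to the customer over SMS.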
NASA Astrophysics Data System (ADS)
Jackson, Christopher Robert
"Lucky-region" fusion (LRF) is a synthetic imaging technique that has proven successful in enhancing the quality of images distorted by atmospheric turbulence. The LRF algorithm selects sharp regions of an image obtained from a series of short-exposure frames and fuses the sharp regions into a final, improved image. In previous research, the LRF algorithm had been implemented on a PC using the C programming language. However, the PC did not have sufficient sequential processing power to handle the real-time extraction, processing, and reduction required when the LRF algorithm was applied to real-time video from fast, high-resolution image sensors. This thesis describes two hardware implementations of the LRF algorithm that achieve real-time image processing. The first was created with a VIRTEX-7 field-programmable gate array (FPGA); the other was developed using the graphics processing unit (GPU) of an NVIDIA GeForce GTX 690 video card. The novelty of the FPGA approach is the creation of a "black box" LRF video processing system with a generic Camera Link input, a user control interface, and a Camera Link video output. We also describe a custom hardware simulation environment we built to test the FPGA LRF implementation. The advantages of the GPU approach are significantly improved development time, integration of image stabilization into the system, and comparable atmospheric turbulence mitigation.
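A minimal sketch of the lucky-region selection idea (not the thesis's FPGA/GPU implementation) is shown below, using the variance of a discrete Laplacian as an assumed sharpness metric: for each tile position, the frame whose tile scores sharpest contributes that region to the fused output.

```python
import numpy as np

def sharpness(tile):
    """Local sharpness score: variance of a discrete Laplacian
    (an assumed metric; the actual LRF metric may differ)."""
    lap = (np.roll(tile, 1, 0) + np.roll(tile, -1, 0) +
           np.roll(tile, 1, 1) + np.roll(tile, -1, 1) - 4.0 * tile)
    return lap.var()

def lucky_fuse(frames, tile=8):
    """Fuse short-exposure frames: at each tile location, keep the
    region from the frame with the highest sharpness score."""
    h, w = frames[0].shape
    fused = np.zeros((h, w))
    for r in range(0, h, tile):
        for c in range(0, w, tile):
            best = max(frames, key=lambda f: sharpness(f[r:r+tile, c:c+tile]))
            fused[r:r+tile, c:c+tile] = best[r:r+tile, c:c+tile]
    return fused

# Toy frames: a flat ("blurred") frame vs a checkerboard ("sharp") frame.
blurred = np.full((16, 16), 0.5)
sharp = (np.indices((16, 16)).sum(axis=0) % 2).astype(float)
fused = lucky_fuse([blurred, sharp])
```

The hardware versions pipeline exactly this per-region select-and-copy over streaming video rather than whole frames in memory.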
Wong, Kevin S K; Jian, Yifan; Cua, Michelle; Bonora, Stefano; Zawadzki, Robert J; Sarunic, Marinko V
2015-02-01
Wavefront sensorless adaptive optics optical coherence tomography (WSAO-OCT) is a novel imaging technique for in vivo high-resolution depth-resolved imaging that mitigates some of the challenges encountered with the use of sensor-based adaptive optics designs. This technique replaces the Hartmann Shack wavefront sensor used to measure aberrations with a depth-resolved image-driven optimization algorithm, with the metric based on the OCT volumes acquired in real-time. The custom-built ultrahigh-speed GPU processing platform and fast modal optimization algorithm presented in this paper was essential in enabling real-time, in vivo imaging of human retinas with wavefront sensorless AO correction. WSAO-OCT is especially advantageous for developing a clinical high-resolution retinal imaging system as it enables the use of a compact, low-cost and robust lens-based adaptive optics design. In this report, we describe our WSAO-OCT system for imaging the human photoreceptor mosaic in vivo. We validated our system performance by imaging the retina at several eccentricities, and demonstrated the improvement in photoreceptor visibility with WSAO compensation.
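The image-driven modal optimization at the heart of WSAO can be caricatured as a sequential search over aberration-mode coefficients, scoring each candidate with an image-quality metric computed from the acquired volume. The candidate grid and toy metric below are assumptions for illustration, not the paper's actual fast modal algorithm.

```python
import numpy as np

def optimize_modes(metric, n_modes=5, candidates=(-0.2, -0.1, 0.0, 0.1, 0.2)):
    """Mode-by-mode search: for each aberration mode, try a small set of
    coefficients on the deformable element and keep the one that maximizes
    the image metric. `metric(coeffs)` stands in for acquiring an OCT
    volume with those corrections applied and scoring its quality."""
    coeffs = np.zeros(n_modes)
    for m in range(n_modes):
        best = max(candidates,
                   key=lambda c: metric(np.concatenate([coeffs[:m], [c], coeffs[m+1:]])))
        coeffs[m] = best
    return coeffs

# Toy metric: image quality peaks when the correction cancels a fixed aberration.
true_ab = np.array([0.1, -0.2, 0.0, 0.2, -0.1])
metric = lambda c: -np.sum((c + true_ab) ** 2)
found = optimize_modes(metric)   # should approach -true_ab
```

In the real system each `metric` evaluation costs a volume acquisition, which is why the GPU processing platform and a fast search order matter.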
NASA Astrophysics Data System (ADS)
Flores, Federico; Rondanelli, Roberto; Abarca, Accel; Diaz, Marcos; Querel, Richard
2012-09-01
Our group has designed, sourced, and constructed a radiosonde/ground-station pair using inexpensive open-source hardware. Based on the Arduino platform, the easy-to-build radiosonde allows the atmospheric science community to test and deploy instrumentation packages that can be fully customized to individual sensing requirements. This sensing/transmitter package has been successfully deployed on a tethered balloon, a weather balloon, and a UAV airplane, and is currently being integrated into a UAV quadcopter and a student-built rocket. In this paper, the system, field measurements, and potential applications are described, as are the science drivers of having full control of and open access to a measurement system in an age when commercial solutions have become popular but are restrictive in terms of proprietary sensor specifications, "black-box" calibration operations, data-handling routines, and so on. The ability to modify and experiment with both the hardware and software tools is an essential part of the scientific process. Without an understanding of the intrinsic biases and limitations of your instruments and system, it becomes difficult to improve them or to advance knowledge in any given field.
Integrating Cache Performance Modeling and Tuning Support in Parallelization Tools
NASA Technical Reports Server (NTRS)
Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)
1998-01-01
With the resurgence of distributed shared memory (DSM) systems based on cache-coherent non-uniform memory access (ccNUMA) architectures and the increasing disparity between memory and processor speeds, data-locality overheads are becoming the greatest bottleneck in the way of realizing the potential high performance of these systems. While parallelization tools and compilers help users port their sequential applications to a DSM system, considerable time and effort are needed to tune the memory performance of these applications to achieve reasonable speedup. In this paper, we show that integrating cache performance modeling and tuning support within a parallelization environment can alleviate this problem. The Cache Performance Modeling and Prediction Tool (CPMP) employs trace-driven simulation techniques without the overhead of generating and managing detailed address traces. CPMP predicts the cache-performance impact of source-code-level "what-if" modifications in a program to assist the user in the tuning process. CPMP is built on top of a customized version of the Computer Aided Parallelization Tools (CAPTools) environment. Finally, we demonstrate how CPMP can be applied to tune a real computational fluid dynamics (CFD) application.
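A toy trace-driven cache model of the kind such "what-if" analysis builds on can be sketched as follows. The cache geometry and the two access traces are illustrative assumptions; CPMP's actual modeling is far more detailed, but even this sketch shows how a source-level change (access stride) flips the predicted miss rate.

```python
class DirectMappedCache:
    """Minimal trace-driven model of a direct-mapped cache
    (illustrative only, not CPMP's model)."""
    def __init__(self, n_lines=64, line_bytes=64):
        self.n_lines, self.line_bytes = n_lines, line_bytes
        self.tags = [None] * n_lines
        self.hits = self.misses = 0

    def access(self, addr):
        block = addr // self.line_bytes    # which memory block
        idx = block % self.n_lines         # which cache line it maps to
        if self.tags[idx] == block:
            self.hits += 1
        else:                              # cold or conflict miss; fill the line
            self.tags[idx] = block
            self.misses += 1

    def miss_rate(self):
        total = self.hits + self.misses
        return self.misses / total if total else 0.0

# "What-if": stride-1 vs large-stride traversal of the same data.
unit, strided = DirectMappedCache(), DirectMappedCache()
for i in range(4096):
    unit.access(i * 8)                 # sequential 8-byte accesses: 1 miss per line
    strided.access((i * 64) % 32768)   # one access per line, conflict-evicted: all miss
```

Here the sequential trace misses once per 64-byte line (rate 1/8), while the strided trace cycles through 512 blocks over 64 lines and misses every time.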
Zooglider - an Autonomous Vehicle for Optical and Acoustic Sensing of Marine Zooplankton
NASA Astrophysics Data System (ADS)
Ohman, M. D.; Davis, R. E.; Sherman, J. T.; Grindley, K.; Whitmore, B. M.
2016-02-01
We will present results from early sea trials of the Zooglider, an autonomous zooplankton glider designed and built by the Instrument Development Group at Scripps. The Zooglider is built upon a modified Spray glider and includes a low power camera with telecentric lens and a custom dual frequency sonar (200/1000 kHz). The imaging system quantifies zooplankton as they flow through a sampling tunnel within a well-defined sampling volume. The maximum operating depth is 500 m. Other sensors include a pumped CTD and Chl-a fluorometer. The Zooglider permits in situ measurements of mesozooplankton distributions and three dimensional orientation in relation to other biotic and physical properties of the ocean water column. Zooglider development is supported by the Gordon and Betty Moore Foundation.
An analytical approach to customer requirement information processing
NASA Astrophysics Data System (ADS)
Zhou, Zude; Xiao, Zheng; Liu, Quan; Ai, Qingsong
2013-11-01
'Customer requirements' (CRs) management is a key component of customer relationship management (CRM). By processing customer-focused information, CRs management plays an important role in enterprise systems (ESs). Although two main CRs analysis methods, quality function deployment (QFD) and Kano model, have been applied to many fields by many enterprises in the past several decades, the limitations such as complex processes and operations make them unsuitable for online businesses among small- and medium-sized enterprises (SMEs). Currently, most SMEs do not have the resources to implement QFD or Kano model. In this article, we propose a method named customer requirement information (CRI), which provides a simpler and easier way for SMEs to run CRs analysis. The proposed method analyses CRs from the perspective of information and applies mathematical methods to the analysis process. A detailed description of CRI's acquisition, classification and processing is provided.
An Indoor Positioning-Based Mobile Payment System Using Bluetooth Low Energy Technology.
Yohan, Alexander; Lo, Nai-Wei; Winata, Doni
2018-03-25
The development of information technology has paved the way for faster and more convenient payment process flows and new methodology for the design and implementation of next-generation payment systems. The growth of smartphone usage has fostered a new and popular mobile payment environment. Most current-generation smartphones support Bluetooth Low Energy (BLE) technology to communicate with nearby BLE-enabled devices, so it is plausible to construct an over-the-air, BLE-based mobile payment system as one of the payment methods for people living in modern societies. In this paper, a secure indoor positioning-based mobile payment authentication protocol with BLE technology and the corresponding mobile payment system design are proposed. The proposed protocol consists of three phases: an initialization phase, a session key construction phase, and an authentication phase. When a customer moves toward the POS counter area, the proposed mobile payment system automatically detects the position of the customer to confirm whether the customer is ready for the checkout process. Once the system has identified that the customer is standing within the payment-enabled area, the payment system invokes an authentication process between the POS and the customer's smartphone through the BLE communication channel to generate a secure session key and establish an authenticated communication session over which the payment transaction is performed. A prototype is implemented to assess the performance of the proposed mobile payment system design. In addition, security analysis is conducted to evaluate the security strength of the proposed protocol.
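The session-key construction phase could be sketched along the following lines. This HMAC-based derivation from a shared secret and fresh nonces from both sides is an illustrative stand-in: the abstract does not give the protocol's actual cryptographic construction, and all names here are hypothetical.

```python
import hashlib
import hmac
import secrets

def derive_session_key(shared_secret, nonce_pos, nonce_phone):
    """Derive a per-transaction session key from a shared secret plus
    fresh nonces from the POS and the phone, using HMAC-SHA256 as a KDF.
    (Illustrative stand-in, not the paper's actual construction.)"""
    return hmac.new(shared_secret, nonce_pos + nonce_phone,
                    hashlib.sha256).digest()

def auth_tag(session_key, message):
    """Authenticate a payment message under the session key."""
    return hmac.new(session_key, message, hashlib.sha256).hexdigest()

# Both sides hold the shared secret (e.g., established at initialization)
# and exchange nonces over BLE, so they derive the same session key.
secret = secrets.token_bytes(32)
n_pos, n_phone = secrets.token_bytes(16), secrets.token_bytes(16)
k_pos = derive_session_key(secret, n_pos, n_phone)     # POS side
k_phone = derive_session_key(secret, n_pos, n_phone)   # phone side
```

Fresh nonces per transaction prevent replay of an old checkout session even though the long-term secret is reused.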
The role of complaint management in the service recovery process.
Bendall-Lyon, D; Powers, T L
2001-05-01
Patient satisfaction and retention can be influenced by the development of an effective service recovery program that can identify complaints and remedy failure points in the service system. Patient complaints provide organizations with an opportunity to resolve unsatisfactory situations and to track complaint data for quality improvement purposes. Service recovery is an important and effective customer retention tool. One way an organization can ensure repeat business is by developing a strong customer service program that includes service recovery as an essential component. The concept of service recovery involves the service provider taking responsive action to "recover" lost or dissatisfied customers and convert them into satisfied customers. Service recovery has proven to be cost-effective in other service industries. The complaint management process involves six steps that organizations can use to influence effective service recovery: (1) encourage complaints as a quality improvement tool; (2) establish a team of representatives to handle complaints; (3) resolve customer problems quickly and effectively; (4) develop a complaint database; (5) commit to identifying failure points in the service system; and (6) track trends and use information to improve service processes. Customer retention is enhanced when an organization can reclaim disgruntled patients through the development of effective service recovery programs. Health care organizations can become more customer oriented by taking advantage of the information provided by patient complaints, increasing patient satisfaction and retention in the process.
Chung, King
2004-01-01
This is the second part of a review on the challenges and recent developments in hearing aids. Feedback and the occlusion effect pose great challenges in hearing aid design and usage. Yet, conventional solutions to feedback and the occlusion effect often create a dilemma: the solution to one often leads to the other. This review discusses advanced signal processing strategies to reduce feedback and some new approaches to reduce the occlusion effect. Specifically, the causes of three types of feedback (acoustic, mechanical, and electromagnetic) are discussed. The strategies currently used to reduce acoustic feedback (i.e., adaptive feedback reduction algorithms using adaptive gain reduction, notch filtering, and phase cancellation strategies) and the design of new receivers built to reduce mechanical and electromagnetic feedback are explained. In addition, various new strategies (i.e., redesigned sound delivery devices and the receiver-in-the-ear-canal hearing aid configuration) to reduce the occlusion effect are reviewed. Many manufacturers have recently adopted laser shell-manufacturing technologies to overcome problems associated with manufacturing custom hearing aid shells. The mechanisms of selective laser sintering and stereolithography apparatus and the properties of custom shells produced by these two processes are reviewed. Further, various new developments in hearing aid transducers, telecoils, channel-free amplification, open-platform programming options, rechargeable hearing aids, ear-level frequency-modulated (FM) receivers, wireless Bluetooth FM systems, and wireless programming options are briefly explained and discussed. Finally, the applications of advanced hearing aid technologies to enhance other devices such as cochlear implants, hearing protectors, and cellular phones are discussed. PMID:15735871
He, Zilong; Zhang, Huangkai; Gao, Shenghan; Lercher, Martin J; Chen, Wei-Hua; Hu, Songnian
2016-07-08
Evolview is an online visualization and management tool for customized and annotated phylogenetic trees. It allows users to visualize phylogenetic trees in various formats, customize the trees through built-in functions and user-supplied datasets and export the customization results to publication-ready figures. Its 'dataset system' contains not only the data to be visualized on the tree, but also 'modifiers' that control various aspects of the graphical annotation. Evolview is a single-page application (like Gmail); its carefully designed interface allows users to upload, visualize, manipulate and manage trees and datasets all in a single webpage. Developments since the last public release include a modern dataset editor with keyword highlighting functionality, seven newly added types of annotation datasets, collaboration support that allows users to share their trees and datasets and various improvements of the web interface and performance. In addition, we included eleven new 'Demo' trees to demonstrate the basic functionalities of Evolview, and five new 'Showcase' trees inspired by publications to showcase the power of Evolview in producing publication-ready figures. Evolview is freely available at: http://www.evolgenius.info/evolview/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
ATM Coastal Topography-Alabama 2001
Nayegandhi, Amar; Yates, Xan; Brock, John C.; Sallenger, A.H.; Bonisteel, Jamie M.; Klipp, Emily S.; Wright, C. Wayne
2009-01-01
These remotely sensed, geographically referenced elevation measurements of Lidar-derived first surface (FS) topography were produced collaboratively by the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL, and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. This project provides highly detailed and accurate datasets of the Alabama coastline, acquired October 3-4, 2001. The datasets are made available for use as a management tool to research scientists and natural resource managers. An innovative scanning Lidar instrument originally developed by NASA, and known as the Airborne Topographic Mapper (ATM), was used during data acquisition. The ATM system is a scanning Lidar system that measures high-resolution topography of the land surface, and incorporates a green-wavelength laser operating at pulse rates of 2 to 10 kilohertz. Measurements from the laser ranging device are coupled with data acquired from inertial navigation system (INS) attitude sensors and differentially corrected global positioning system (GPS) receivers to measure topography of the surface at accuracies of +/-15 centimeters. The nominal ATM platform is a Twin Otter or P-3 Orion aircraft, but the instrument may be deployed on a range of light aircraft. Elevation measurements were collected over the survey area using the ATM system, and the resulting data were then processed using the Airborne Lidar Processing System (ALPS), a custom-built processing system developed in a NASA-USGS collaboration. ALPS supports the exploration and processing of Lidar data in an interactive or batch mode. Modules for pre-survey flight line definition, flight path plotting, Lidar raster and waveform investigation, and digital camera image playback have been developed. Processing algorithms have been developed to extract the range to the first and last significant return within each waveform. 
ALPS is routinely used to create maps that represent submerged or first surface topography.
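The first/last significant-return extraction can be illustrated with a simple threshold rule over a digitized waveform. The noise-floor estimate and the synthetic waveform below are assumptions for illustration; the actual ALPS algorithms are more involved.

```python
import numpy as np

def first_last_returns(waveform, noise_floor=None, k=3.0):
    """Return sample indices of the first and last significant returns
    in a digitized lidar waveform, using a simple threshold rule
    (illustrative; not the actual ALPS algorithm).

    If no threshold is given, estimate it as mean + k*std of the
    leading samples, assumed to contain only noise."""
    waveform = np.asarray(waveform, dtype=float)
    if noise_floor is None:
        noise = waveform[:10]
        noise_floor = noise.mean() + k * noise.std()
    above = np.flatnonzero(waveform > noise_floor)
    if above.size == 0:
        return None, None            # no significant return detected
    return int(above[0]), int(above[-1])

# Synthetic waveform: flat noise plus two return pulses.
wf = np.full(100, 2.0)
wf[40:43] = [30.0, 80.0, 25.0]       # first (surface) return
wf[70:72] = [20.0, 15.0]             # last (e.g., bottom) return
first, last = first_last_returns(wf)
```

Converting these sample indices to ranges (via the digitizer rate and the speed of light) yields the first- and last-surface elevations that ALPS maps.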
ATM Coastal Topography-Florida 2001: Eastern Panhandle
Yates, Xan; Nayegandhi, Amar; Brock, John C.; Sallenger, A.H.; Bonisteel, Jamie M.; Klipp, Emily S.; Wright, C. Wayne
2009-01-01
These remotely sensed, geographically referenced elevation measurements of Lidar-derived first surface (FS) topography were produced collaboratively by the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL, and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. This project provides highly detailed and accurate datasets of the eastern Florida panhandle coastline, acquired October 2, 2001. The datasets are made available for use as a management tool to research scientists and natural resource managers. An innovative scanning Lidar instrument originally developed by NASA, and known as the Airborne Topographic Mapper (ATM), was used during data acquisition. The ATM system is a scanning Lidar system that measures high-resolution topography of the land surface and incorporates a green-wavelength laser operating at pulse rates of 2 to 10 kilohertz. Measurements from the laser-ranging device are coupled with data acquired from inertial navigation system (INS) attitude sensors and differentially corrected global positioning system (GPS) receivers to measure topography of the surface at accuracies of +/-15 centimeters. The nominal ATM platform is a Twin Otter or P-3 Orion aircraft, but the instrument may be deployed on a range of light aircraft. Elevation measurements were collected over the survey area using the ATM system, and the resulting data were then processed using the Airborne Lidar Processing System (ALPS), a custom-built processing system developed in a NASA-USGS collaboration. ALPS supports the exploration and processing of Lidar data in an interactive or batch mode. Modules for presurvey flight line definition, flight path plotting, Lidar raster and waveform investigation, and digital camera image playback have been developed. Processing algorithms have been developed to extract the range to the first and last significant return within each waveform. 
ALPS is routinely used to create maps that represent submerged or first surface topography.
Wireless Communications in Smart Grid
NASA Astrophysics Data System (ADS)
Bojkovic, Zoran; Bakmaz, Bojan
Communication networks play a crucial role in the smart grid, as the intelligence of this complex system is built on information exchange across the power grid. Wireless communications and networking are among the most economical ways to build the essential parts of a scalable communication infrastructure for the smart grid. In particular, wireless networks will be deployed widely in the smart grid for automatic meter reading, remote system and customer-site monitoring, and equipment fault diagnosis. With increasing interest from both the academic and industrial communities, this chapter systematically investigates recent advances in wireless communication technology for the smart grid.
Energy Storage Systems Are Coming: Are You Ready
DOE Office of Scientific and Technical Information (OSTI.GOV)
Conover, David R.
2015-12-05
Energy storage systems (batteries) are not a new concept, but the technology being developed and introduced today, with an increasing emphasis on energy storage, is new. The increased focus on energy, environmental, and economic issues in the built environment is spurring increased application of renewables as well as reductions in peak energy use, both of which create a need for energy storage. This article provides an overview of current and anticipated energy-storage technology, focusing on ensuring the safe application and use of energy storage on both the grid and customer sides of the utility meter.
Control of a Quadcopter Aerial Robot Using Optic Flow Sensing
NASA Astrophysics Data System (ADS)
Hurd, Michael Brandon
This thesis focuses on the motion control of a custom-built quadcopter aerial robot using optic-flow sensing. Optic-flow sensing is a vision-based approach that can give a robot the ability to fly in global positioning system (GPS)-denied environments, such as indoors. In this work, optic-flow sensors are used to stabilize the motion of the quadcopter, with an optic-flow algorithm providing odometry measurements to the quadcopter's central processing unit to monitor the flight heading. The optic-flow sensor and algorithm are capable of gathering and processing images at 250 frames/sec; the sensor package weighs 2.5 g and has a footprint of 6 cm2. The odometry value from the optic-flow sensor is then used as feedback in a simple proportional-integral-derivative (PID) controller on the quadcopter. Experimental results are presented to demonstrate the effectiveness of using optic flow for controlling the motion of the quadcopter aerial robot. The technique presented herein can be applied to other types of aerial robotic systems and unmanned aerial vehicles (UAVs), as well as unmanned ground vehicles (UGVs).
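Using optic-flow odometry as PID feedback can be sketched as below. The gains, the 250 Hz loop rate, and the toy double-integrator vehicle model are assumptions for illustration, not the thesis's actual flight parameters or dynamics.

```python
class PID:
    """Discrete PID controller. Gains below are illustrative,
    not the thesis quadcopter's actual tuning."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Hold lateral position at 0 m; the optic-flow odometry reading stands in
# for `pos`, and a crude double integrator stands in for the vehicle.
pid = PID(kp=2.0, ki=0.1, kd=0.5, dt=1.0 / 250.0)   # 250 Hz loop rate
pos, vel = 0.3, 0.0                                 # drifted 0.3 m off target
for _ in range(5000):                               # 20 s of simulated flight
    cmd = pid.update(0.0, pos)                      # control from odometry feedback
    vel += cmd * pid.dt                             # toy vehicle dynamics
    pos += vel * pid.dt
```

After the simulated run, the position error has been damped out, which is the stabilizing effect the odometry feedback provides in flight.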
The Integration of COTS/GOTS within NASA's HST Command and Control System
NASA Technical Reports Server (NTRS)
Pfarr, Thomas; Reis, James E.
2001-01-01
NASA's mission-critical Hubble Space Telescope (HST) command and control system has been re-engineered with commercial and government off-the-shelf (COTS/GOTS) products and minimal custom code. This paper focuses on the design of the new HST Control Center System (CCS) and the lessons learned throughout its development. CCS currently utilizes more than 30 COTS/GOTS products with an additional half million lines of custom glueware code; the new CCS exceeds the capabilities of the original system while reducing the amount of custom code by more than 50%. The lifecycle of COTS/GOTS products is examined, including the package selection, evaluation, and integration processes. The advantages, disadvantages, issues, concerns, and lessons learned from integrating COTS/GOTS into NASA's mission-critical HST CCS are examined in detail. The paper also reveals the many hidden costs of COTS/GOTS solutions compared to traditional custom-code development efforts, including training expenses, consulting fees, and long-term maintenance expenses.
Innovative Product Design Based on Comprehensive Customer Requirements of Different Cognitive Levels
Zhao, Wu; Zheng, Yake; Wang, Rui; Wang, Chen
2014-01-01
To improve customer satisfaction in innovative product design, a topology structure of customer requirements is established and an innovative product design approach is proposed. The topology structure provides designers with reasonable guidance for capturing customer requirements comprehensively. With the aid of the analytic hierarchy process (AHP), the importance of the customer requirements is evaluated. Quality function deployment (QFD) is used to translate customer requirements into product and process design demands and to pick out the technical requirements in urgent need of improvement. In this way, the product is developed in a more targeted way to satisfy customers. The theory of inventive problem solving (TRIZ) is used to help designers produce innovative solutions. Finally, a case study of an automobile steering system illustrates the application of the proposed approach. PMID:25013862
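The AHP step, deriving importance weights for customer requirements from pairwise comparisons, can be sketched with the row-geometric-mean approximation to the principal eigenvector. The comparison values below (on Saaty's 1-9 scale) are illustrative, not taken from the paper.

```python
import numpy as np

def ahp_weights(pairwise):
    """Approximate AHP priority weights from a pairwise comparison
    matrix using the row geometric mean (a standard stand-in for
    the principal-eigenvector method)."""
    n = pairwise.shape[0]
    gm = np.prod(pairwise, axis=1) ** (1.0 / n)   # geometric mean of each row
    return gm / gm.sum()                          # normalize to sum to 1

# Hypothetical pairwise comparisons for three customer requirements:
# A[i, j] = how much more important requirement i is than requirement j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w = ahp_weights(A)
```

The resulting weights then scale the requirement rows of the QFD matrix, so the technical requirements tied to the heaviest customer needs surface first.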
Li, Xiaolong; Zhao, Wu; Zheng, Yake; Wang, Rui; Wang, Chen
2014-01-01
To improve customer satisfaction in innovative product design, a topology structure of customer requirements is established and an innovative product design approach is proposed. The topology structure provides designers with reasonable guidance for capturing customer requirements comprehensively. With the aid of the analytic hierarchy process (AHP), the importance of the customer requirements is evaluated. Quality function deployment (QFD) is used to translate customer requirements into product and process design demands and to pick out the technical requirements in urgent need of improvement. In this way, the product is developed in a more targeted way to satisfy customers. The theory of inventive problem solving (TRIZ) is used to help designers produce innovative solutions. Finally, a case study of an automobile steering system illustrates the application of the proposed approach.
Method for visualization and presentation of priceless old prints based on precise 3D scan
NASA Astrophysics Data System (ADS)
Bunsch, Eryk; Sitnik, Robert
2014-02-01
Graphic prints and manuscripts constitute a major part of the cultural heritage objects created by most known civilizations. Their presentation has always been a problem due to their high sensitivity to light and to changes in external conditions (temperature, humidity). Today it is possible to use advanced digitization techniques for the documentation and visualization of such objects. When presentation of the original heritage object is impossible, there is a need for a method that allows documentation, and then presentation to the audience, of all the aesthetic features of the object. During the course of the project, scans of several pages of one of the most valuable books in the collection of the Museum of Warsaw Archdiocese were performed. The book, known as the "Great Dürer Trilogy," consists of three series of woodcuts by Albrecht Dürer. The measurement system used consists of a custom-designed, structured-light-based, high-resolution measurement head with an automated digitization system mounted on an industrial robot. This device was custom built to meet conservators' requirements, especially the absence of ultraviolet or infrared radiation emission toward the measured object. Documentation of one page from the book requires about 380 directional measurements, which constitute about 3 billion sample points; the distance between points in the cloud is 20 μm. A measurement sampling density (MSD) of 2500 points makes it possible to show the public the spatial structure of this graphic print. An important aspect is the complexity of the software environment created for data processing, in which massive data sets can be automatically processed and visualized. A very important advantage of the software, which operates directly on point clouds, is the ability to manipulate the virtual light source freely.
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Leary, Conlan
Over the project, Sighten built a comprehensive software-as-a-service (SaaS) platform to automate and streamline the residential solar financing workflow. Before the project period, significant time and money were spent by companies on front-end tools related to system design and proposal creation, but comparatively few resources were available to support the many back-end calculations and data management processes that underpin third-party financing. Without a tool like Sighten, the solar financing process involved passing information from the homeowner prospect into separate tools for system design and financing, and then later into reporting tools including Microsoft Excel, CRM software, in-house software, outside software, and offline, manual processes. Passing data between tools and attempting to connect disparate systems results in inefficiency and inaccuracy for the industry. Sighten was built to consolidate all financial and solar-related calculations in a single software platform. It significantly improves the accuracy of these calculations and exposes sophisticated new analysis tools, resulting in a rigorous, efficient, and cost-effective toolset for scaling residential solar.
Widely deploying a platform like Sighten’s significantly and immediately impacts the residential solar space in several important ways: 1) standardizing and improving the quality of all quantitative calculations involved in the residential financing process, most notably project finance, system production, and reporting calculations; 2) representing a true step change in terms of reporting and analysis capabilities by maintaining more accurate data and exposing sophisticated tools around simulation, tranching, and financial reporting, among others, to all stakeholders in the space; 3) allowing a broader group of developers/installers/finance companies to access the capital markets by providing an out-of-the-box toolset that handles the execution of running investor capital through a rooftop solar financing program. Standardizing and improving all calculations, improving data quality, and exposing new analysis tools previously unavailable affect investment in the residential space in several important ways: 1) lowering the cost of capital for existing capital providers by mitigating uncertainty and de-risking the solar asset class; 2) attracting new, lower-cost investors to the solar asset class as reporting and data quality come to resemble the standards of more mature asset classes; 3) increasing the prevalence of liquidity options for investors through back leverage, securitization, or secondary sale by providing the tools necessary for lenders, ratings agencies, etc. to properly understand a portfolio of residential solar assets. During the project period, Sighten successfully built and scaled a commercially ready tool for the residential solar market. The software solution built by Sighten has been deployed with the key target customer segments identified in the award deliverables: solar installers, solar developers/channel managers, and solar financiers, including lenders. Each of these segments greatly benefits from the availability of the Sighten toolset.
Discovering system requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bahill, A.T.; Bentz, B.; Dean, F.F.
1996-07-01
Cost and schedule overruns are often caused by poor requirements that are produced by people who do not understand the requirements process. This report provides a high-level overview of the system requirements process, explaining types, sources, and characteristics of good requirements. System requirements, however, are seldom stated by the customer. Therefore, this report shows ways to help you work with your customer to discover the system requirements. It also explains terminology commonly used in the requirements development field, such as verification, validation, technical performance measures, and the various design reviews.
NASA Astrophysics Data System (ADS)
Esch, T.; Asamer, H.; Boettcher, M.; Brito, F.; Hirner, A.; Marconcini, M.; Mathot, E.; Metz, A.; Permana, H.; Soukop, T.; Stanek, F.; Kuchar, S.; Zeidler, J.; Balhar, J.
2016-06-01
The Sentinel fleet will provide so-far unique coverage with Earth observation data and thereby new opportunities for implementing methodologies that generate innovative geo-information products and services. This is where the TEP Urban project aims to initiate a step change by providing an open and participatory platform, based on modern ICT technologies and services, that enables any interested user to easily exploit Earth observation data pools, in particular those of the Sentinel missions, and derive thematic information on the status and development of the built environment from these data. A key component of the TEP Urban project is the implementation of a web-based platform employing distributed high-level computing infrastructures and providing key functionalities for i) high-performance access to satellite imagery and derived thematic data, ii) modular and generic state-of-the-art pre-processing, analysis, and visualization techniques, iii) customized development and dissemination of algorithms, products, and services, and iv) networking and communication. This contribution introduces the main facts about the TEP Urban project, including a description of the general objectives, the platform's system design and functionalities, and the preliminary portfolio of products and services available on the TEP Urban platform.
Duke, Jon D; Morea, Justin; Mamlin, Burke; Martin, Douglas K; Simonaitis, Linas; Takesue, Blaine Y; Dixon, Brian E; Dexter, Paul R
2014-03-01
Regenstrief Institute developed one of the seminal computerized order entry systems, the Medical Gopher, for implementation at Wishard Hospital nearly three decades ago. Wishard Hospital and Regenstrief remain committed to homegrown software development, and over the past 4 years we have fully rebuilt Gopher with an emphasis on usability, safety, leveraging open source technologies, and the advancement of biomedical informatics research. Our objective in this paper is to summarize the functionality of this new system and highlight its novel features. Applying a user-centered design process, the new Gopher was built upon a rich-internet application framework using an agile development process. The system incorporates order entry, clinical documentation, result viewing, decision support, and clinical workflow. We have customized its use for the outpatient, inpatient, and emergency department settings. The new Gopher is now in use by over 1100 users a day, including an average of 433 physicians caring for over 3600 patients daily. The system includes a wizard-like clinical workflow, dynamic multimedia alerts, and a familiar 'e-commerce'-based interface for order entry. Clinical documentation is enhanced by real-time natural language processing and data review is supported by a rapid chart search feature. As one of the few remaining academically developed order entry systems, the Gopher has been designed both to improve patient care and to support next-generation informatics research. It has achieved rapid adoption within our health system and suggests continued viability for homegrown systems in settings of close collaboration between developers and providers. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Kassis, Timothy; Weiler, Michael J.; Dixon, J. Brandon
2012-03-01
All dietary lipids are transported to the venous circulation through the lymphatic system, yet the underlying mechanisms that regulate this process remain unclear. Understanding how the lymphatics functionally respond to changes in lipid load is important in the diagnosis and treatment of lipid- and lymphatic-related diseases such as obesity, hypercholesterolemia, and lymphedema. Therefore, we sought to develop an in situ imaging system to quantify and correlate lymphatic function as it relates to lipid transport. A custom-built optical set-up provides us with the capability of simultaneous dual-channel imaging of both high-speed bright-field video and fluorescence. This is achieved by dividing the light path into two optical bands. Utilizing high-speed and back-illuminated CCD cameras and post-acquisition image processing algorithms, we have the potential to quantify correlations between vessel contraction, lymph flow, and lipid concentration of mesenteric lymphatic vessels in situ. Local flow velocity is measured through lymphocyte tracking, vessel contraction through measurements of the vessel walls, and lipid uptake through fluorescence intensity tracking of a fluorescent long-chain fatty acid analogue, Bodipy FL C16. This system will prove to be an invaluable tool both for scientists studying lymphatic function in health and disease and for those investigating strategies for targeting the lymphatic system with orally delivered drugs.
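The velocity measurement described (tracking lymphocytes from frame to frame) can be sketched in outline. This is a minimal illustration under stated assumptions, not the authors' implementation: a single bright tracer, an intensity threshold, and a known pixel scale and frame rate are all assumed.

```python
from math import hypot

def centroid(frame, threshold):
    """Intensity-weighted centroid (row, col) of pixels above threshold.

    `frame` is a 2-D list of intensities; returns None if nothing is bright.
    """
    total = wr = wc = 0.0
    for r, row in enumerate(frame):
        for c, v in enumerate(row):
            if v > threshold:
                total += v
                wr += v * r
                wc += v * c
    return (wr / total, wc / total) if total else None

def mean_speed(frames, threshold, um_per_px, fps):
    """Mean tracer speed (um/s) from consecutive-frame centroid displacements."""
    cents = [centroid(f, threshold) for f in frames]
    speeds = [hypot(r1 - r0, c1 - c0) * um_per_px * fps
              for (r0, c0), (r1, c1) in zip(cents, cents[1:])]
    return sum(speeds) / len(speeds)
```

A production tracker would also need segmentation of overlapping cells and association of multiple tracers across frames; the per-frame displacement-times-frame-rate conversion shown here is the core of the velocity estimate.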
Kim, Jong Bae; Brienza, David M
2006-01-01
A Remote Accessibility Assessment System (RAAS) that uses three-dimensional (3-D) reconstruction technology is being developed; it enables clinicians to assess the wheelchair accessibility of users' built environments from a remote location. The RAAS uses commercial software to construct 3-D virtualized environments from photographs. We developed custom screening algorithms and instruments for analyzing accessibility. Characteristics of the camera and 3-D reconstruction software chosen for the system significantly affect its overall reliability. In this study, we performed an accuracy assessment to verify that commercial hardware and software can construct accurate 3-D models by analyzing the accuracy of dimensional measurements in a virtual environment and a comparison of dimensional measurements from 3-D models created with four cameras/settings. Based on these two analyses, we were able to specify a consumer-grade digital camera and PhotoModeler (EOS Systems, Inc, Vancouver, Canada) software for this system. Finally, we performed a feasibility analysis of the system in an actual environment to evaluate its ability to assess the accessibility of a wheelchair user's typical built environment. The field test resulted in an accurate accessibility assessment and thus validated our system.
Software handlers for process interfaces
NASA Technical Reports Server (NTRS)
Bercaw, R. W.
1976-01-01
The principles involved in the development of software handlers for custom interfacing problems are discussed. Handlers for the CAMAC standard are examined in detail. The types of transactions that must be supported have been established by standards groups, eliminating conflicting requirements arising out of different design philosophies and applications. Implementation of the standard handlers has been facilitated by standardization of hardware. The necessary local processors can be placed in the handler when it is written or at run time by means of input/output directives, or they can be built into a high-performance input/output processor. The full benefits of these process interfaces will only be realized when software requirements are incorporated uniformly into the hardware.
LDRD Final Report 15-ERD-037 Matthews
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthews, Manyalibo J.
2017-10-26
The physics and materials science involved in laser materials processing of metals were studied experimentally using custom-built test beds and in situ diagnostics. Special attention was given to laser-based powder bed fusion additive manufacturing processes, a technology critically important to the stockpile stewardship program in NNSA. New light has been shed on several phenomena, such as laser-driven spatter, material displacement, and morphology changes. The results presented here and in publications generated by this work have proven impactful and useful to both internal and external communities. New directions in additive manufacturing research at LLNL have been enabled, along with new scientific capabilities that can serve future program needs.
Advantages of utilizing DMD based rapid manufacturing systems in mass customization applications
NASA Astrophysics Data System (ADS)
El-Siblani, A.
2010-02-01
The use of DMD-based rapid manufacturing systems has proven very advantageous in the production of highly accurate plastic components for mass customization markets such as the hearing aid and dental markets. The voxelization process currently afforded by DLP technology eliminates the layering effect associated with existing additive rapid manufacturing technologies. The smooth, accurate surfaces produced in an additive process utilizing DLP technology, through the voxelization approach, allow for the production of custom finished products. The implementation of DLP technology in rapid prototyping and rapid manufacturing systems allows the use of highly viscous photopolymer-based liquid and paste composites that could not be used in any other additive process prior to the implementation of DLP technology in RP and RM systems. It also allows for greater production throughput without sacrificing quality and accuracy.
Queueing system analysis of multi server model at XYZ insurance company in Tasikmalaya city
NASA Astrophysics Data System (ADS)
Muhajir, Ahmad; Binatari, Nikenasih
2017-08-01
Queueing theory, or waiting line theory, deals with the queue process from the moment a customer arrives, queues to be served, is served, and leaves the service facility. Queues occur because of a mismatch between the number of customers to be served and the available number of servers, as for example at the XYZ insurance company in Tasikmalaya. This research aims to determine the characteristics of the queueing system and then to optimize the number of servers in terms of total cost. The results show that the queue can be represented by the model (M/M/4):(GD/∞/∞), where arrivals are Poisson distributed and service times follow an exponential distribution. The probability that customer service is idle is 2.39% of the working time, the average number of customers in the queue is 3, the average number of customers in the system is 6, the average time a customer spends in the queue is 15.9979 minutes, the average time a customer spends in the system is 34.4141 minutes, and the average number of busy servers is 3. The optimal number of servers is 5, at which the operational cost reaches its minimum of Rp 4.323.
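The reported metrics follow from the standard steady-state formulas for an (M/M/c):(GD/∞/∞) queue. A sketch of those formulas (the branch's actual arrival and service rates are not restated in the abstract, so the function takes them as inputs):

```python
from math import factorial

def mmc_metrics(lam, mu, c):
    """Steady-state metrics of an (M/M/c):(GD/inf/inf) queue.

    lam: arrival rate, mu: service rate per server, c: number of servers.
    Returns (P0, Lq, L, Wq, W). Requires utilization rho = lam/(c*mu) < 1.
    """
    r = lam / mu    # offered load (Erlangs)
    rho = r / c     # per-server utilization
    assert rho < 1, "queue is unstable"
    # P0: probability the system is empty
    p0 = 1.0 / (sum(r**n / factorial(n) for n in range(c))
                + r**c / (factorial(c) * (1 - rho)))
    lq = p0 * r**c * rho / (factorial(c) * (1 - rho) ** 2)  # mean queue length
    wq = lq / lam          # mean wait in queue (Little's law)
    w = wq + 1 / mu        # mean time in system
    l = lam * w            # mean number in system
    return p0, lq, l, wq, w
```

With the branch's estimated rates plugged in for each candidate server count, the expected waiting and service costs can be compared to find the cost-minimizing number of servers, as the study does for c = 4 versus c = 5.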
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-23
... Regarding Securities Delivered to or From Participant Accounts Through the Automated Customer Account... Corporation (``NSCC'') concerning Automated Customer Account Transfer Service (``ACATS'') transfers processed... (July 2, 2010). NSCC's ACATS system enables members to effect automated transfers of customer accounts...
Code of Federal Regulations, 2011 CFR
2011-04-01
... 19 Customs Duties 2 2011-04-01 2011-04-01 false Application. 143.2 Section 143.2 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY... description of the computer hardware, communications and entry processing systems to be used and the estimated...
Code of Federal Regulations, 2014 CFR
2014-04-01
... 19 Customs Duties 2 2014-04-01 2014-04-01 false Application. 143.2 Section 143.2 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY... description of the computer hardware, communications and entry processing systems to be used and the estimated...
Code of Federal Regulations, 2012 CFR
2012-04-01
... 19 Customs Duties 2 2012-04-01 2012-04-01 false Application. 143.2 Section 143.2 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY... description of the computer hardware, communications and entry processing systems to be used and the estimated...
Code of Federal Regulations, 2013 CFR
2013-04-01
... 19 Customs Duties 2 2013-04-01 2013-04-01 false Application. 143.2 Section 143.2 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY... description of the computer hardware, communications and entry processing systems to be used and the estimated...
Code of Federal Regulations, 2010 CFR
2010-04-01
... 19 Customs Duties 2 2010-04-01 2010-04-01 false Application. 143.2 Section 143.2 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY... description of the computer hardware, communications and entry processing systems to be used and the estimated...
2003-09-01
A team of NASA researchers from Marshall Space Flight Center (MSFC) and Dryden Flight Research Center has proven that beamed light can be used to power an aircraft, a first-in-the-world accomplishment to the best of their knowledge. Using an experimental, custom-built, radio-controlled model aircraft, the team demonstrated a system that beams enough light energy from the ground to power the propeller of an aircraft and sustain it in flight. Special photovoltaic arrays on the plane, similar to solar cells, receive the light energy and convert it to electric current to drive the propeller motor. In a series of indoor flights this week at MSFC, a lightweight, custom-built laser was aimed at the airplane's solar panels. The laser tracks the plane, maintaining power on its cells until the end of the flight, when the laser is turned off and the airplane glides to a landing. The laser source demonstration represents the capability to beam more power to a plane so that it can reach higher altitudes and have a greater flight range without having to carry fuel or batteries, enabling an indefinite flight time. The demonstration was a collaborative effort between the Dryden center at Edwards, California, where the aircraft was designed and built, and MSFC, where integration and testing of the laser and photovoltaic cells were done. Laser power beaming is a promising technology for consideration in new aircraft design and operation, and supports NASA's goals in the development of revolutionary aerospace technologies. Photographed with their invention are (from left to right): David Bushman and Tony Frackowiak, both of Dryden; and MSFC's Robert Burdine.
Broadband Vibration Detection in Tissue Phantoms Using a Fiber Fabry-Perot Cavity.
Barnes, Jack; Li, Sijia; Goyal, Apoorv; Abolmaesumi, Purang; Mousavi, Parvin; Loock, Hans-Peter
2018-04-01
A fiber optic vibration sensor is developed and characterized with an ultrawide dynamic sensing range, from less than 1 Hz to clinical ultrasound frequencies near 6 MHz. The vibration sensor consists of a matched pair of fiber Bragg gratings coupled to a custom-built signal processing circuit. The wavelength of a laser diode is locked to one of the many cavity resonances using the Pound-Drever-Hall scheme. A calibrated piezoelectric vibration element was used to characterize the sensor's strain, temperature, and noise responses. To demonstrate its sensing capability, an ultrasound phantom with built-in low-frequency vibration actuation was constructed. The fiber optic sensor was shown to simultaneously capture the low-frequency vibration and the clinical ultrasound transmission waveforms with nanostrain sensitivity. This miniaturized and sensitive vibration sensor can provide comprehensive information regarding strain response and the resultant ultrasound waveforms.
Solazyme Integrated Biorefinery (SzIBR): Diesel Fuels from Heterotrophic Algae
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brinkmann, David
2014-12-23
Under Department of Energy Award Number DE-EE0002877 (the “DOE Award”), Solazyme, Inc. (“Solazyme”) has built a demonstration-scale “Solazyme Integrated Biorefinery (SzIBR).” The SzIBR was built to provide integrated scale-up of Solazyme’s novel heterotrophic algal oil biomanufacturing process, validate the projected commercial-scale economics of producing multiple algal oils, and enable Solazyme to collect the data necessary to complete the design of its first commercial-scale facility. Solazyme’s technology enables it to convert a range of low-cost plant-based sugars into high-value oils. Solazyme’s renewable products replace or enhance oils derived from the world’s three existing sources: petroleum, plants, and animal fats. Solazyme tailors the composition of its oils to address specific customer requirements, offering superior performance characteristics and value. This report summarizes the history and results of the project.
5 CFR 850.101 - Purpose and scope.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Employees' Retirement System (FERS) by using contemporary, automated business processes and supporting... employing more efficient and effective business systems to respond to increased customer demand for higher levels of customer service and online self-service tools. (b) The provisions of this part authorize...
Converting customer expectations into achievable results.
Landis, G A
1999-11-01
It is not enough in today's environment just to meet customers' expectations--we must exceed them. Therefore, one must learn what constitutes those expectations. These needs have expanded during the past few years beyond just manufacturing the product and looking at the outcome from a provincial standpoint. Now we must understand and satisfy the entire supply chain. To manage this process and satisfy the customer, the process now involves the supplier, the manufacturer, and the entire distribution system.
Human factors involvement in bringing the power of AI to a heterogeneous user population
NASA Technical Reports Server (NTRS)
Czerwinski, Mary; Nguyen, Trung
1994-01-01
The Human Factors involvement in developing COMPAQ QuickSolve, an electronic problem-solving and information system for Compaq's line of networked printers, is described. Empowering customers with expert system technology so they could solve advanced networked printer problems on their own was a major goal in designing this system. This would minimize customer down-time, reduce the number of phone calls to the Compaq Customer Support Center, improve customer satisfaction, and, most importantly, differentiate Compaq printers in the marketplace by providing the best, and most technologically advanced, customer support. This represents a re-engineering of Compaq's customer support strategy and implementation. In the first-generation system, SMART, the objective was to provide expert knowledge to Compaq's help desk operation to answer customer questions and problems more quickly and correctly. QuickSolve is a second-generation system in that customer support is put directly in the hands of consumers. As a result, the design of QuickSolve presented a number of challenging issues. Because the product would be used by a diverse and heterogeneous set of users, a significant amount of human factors research and analysis was required while designing and implementing the system. This research also shaped the organization and design of the expert system component.
Workflows for microarray data processing in the Kepler environment.
Stropp, Thomas; McPhillips, Timothy; Ludäscher, Bertram; Bieda, Mark
2012-05-17
Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. 
These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or R/BioConductor scripting approaches to pipeline design. Finally, we suggest that microarray data processing task workflows may provide a basis for future example-based comparison of different workflow systems. We provide a set of tools and complete workflows for microarray data analysis in the Kepler environment, which has the advantages of offering graphical, clear display of conceptual steps and parameters and the ability to easily integrate other resources such as remote data and web services.
Photon collider: a four-channel autoguider solution
NASA Astrophysics Data System (ADS)
Hygelund, John C.; Haynes, Rachel; Burleson, Ben; Fulton, Benjamin J.
2010-07-01
The "Photon Collider" uses a compact array of four off-axis autoguider cameras positioned with independent filtering and focus. The photon collider is two-way symmetric and robustly mounted, with the off-axis light crossing the science field, which allows the compact single-frame construction to have extremely small relative deflections between guide and science CCDs. The photon collider provides four independent guiding signals with a total of 15 square arcminutes of sky coverage. These signals allow for simultaneous altitude, azimuth, field rotation, and focus guiding. Guide cameras read out without exposure overhead, increasing the tracking cadence. The independent focus allows the photon collider to maintain in-focus guide stars when the main science camera is taking defocused exposures, as well as to track telescope focus changes. Independent filters allow autoguiding in the science camera's wavelength bandpass. The four cameras are controlled through a custom web services interface from a single Linux-based industrial PC, and the autoguider mechanism and telemetry are built around a uClinux-based Analog Devices Blackfin embedded microprocessor. Off-axis light is corrected with a custom meniscus correcting lens. Guide CCDs are cooled with ethylene glycol, with an advanced leak detection system. The photon collider was built for use on Las Cumbres Observatory's 2-meter Faulkes telescopes and is currently used to guide the alt-az mount.
Measurement framework for product service system performance of generator set distributors
NASA Astrophysics Data System (ADS)
Sofianti, Tanika D.
2017-11-01
In the B2B market for generator sets (gensets), distributors assist manufacturers in selling products. This is caused by the limited resources available to the manufacturer for adding service elements, which are needed to enhance the competitiveness of the generator sets. Genset distributors often sell products together with support for their customers. Industrial distributors develop services to meet the needs of the customer, and genset distributors support the machines and equipment produced by the manufacturer. The services delivered by the distributors can enhance the value customers obtain from the equipment. Services are provided to customers in the bidding process, in ordering the equipment from the manufacturer, in equipment delivery and installation, and in the after-sales stage. This paper proposes a framework to measure the Product Service System (PSS) performance of generator set distributors in delivering their products and services to customers. The methodology of this research adopts the perspectives of both providers and customers and takes into account tangible and intangible products. This research leads to ideas for improving the current Product Service System of a genset distributor. Further study is needed on more detailed measures and on the implementation of measurement tools.
DOE Office of Scientific and Technical Information (OSTI.GOV)
none,
The first Challenge Home built in New England features cool-roof shingles, HERS 20–42, and walls densely packed with blown fiberglass. This house won a 2013 Housing Innovation Award in the custom builder category.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-23
... Enhance the Process for Transfers Through the Automated Customer Account Transfer Service August 16, 2010... Transfer Service (``ACATS'') system enables Members to effect automated transfers of customer accounts... transfer services and to effect customer account transfers within specified time frames. \\4\\ CNS is an...
Standard Systems Group (SSG) Technology Adoption Planning Workshop
2004-04-01
Figure 2: Map of SEI Technologies Against SSG (Cluster Focused on Customer Issues)... them could be consolidated. The objectives were grouped into three categories (customer focused, internal operations, and innovation & learning)... customers! • Streamlined organization with agile processes • Recognized expertise in exploring and exploiting leading IT technologies • Enterprise
Built-In Diagnostics (BID) Of Equipment/Systems
NASA Technical Reports Server (NTRS)
Granieri, Michael N.; Giordano, John P.; Nolan, Mary E.
1995-01-01
Diagnostician(TM)-on-Chip (DOC) technology identifies faults and commands system reconfiguration. Smart microcontrollers, operating in conjunction with other system-control circuits, command self-correcting system/equipment actions in real time. The DOC microcontroller generates commands for associated built-in test equipment to stimulate the unit of equipment being diagnosed, collects and processes the response data obtained by the built-in test equipment, and performs diagnostic reasoning on the response data using a diagnostic knowledge base derived from design data.
Universal Payload Information Management
NASA Technical Reports Server (NTRS)
Elmore, Ralph B.
2003-01-01
As the overall manager and integrator of International Space Station (ISS) science payloads, the Payload Operations Integration Center (POIC) at Marshall Space Flight Center has a critical need to provide an information management system for the exchange and control of ISS payload files as well as to coordinate ISS payload-related operational changes. The POIC's information management system has a fundamental requirement to provide secure operational access not only to users physically located at the POIC, but also to remote experimenters and International Partners physically located in different parts of the world. The Payload Information Management System (PIMS) is a ground-based electronic document configuration management and collaborative workflow system that was built to serve the POIC's information management needs. This paper discusses the application components that comprise the PIMS system, the challenges that influenced its design and architecture, and the selected technologies it employs. This paper will also touch on the advantages of the architecture, details of the user interface, and lessons learned along the way to a successful deployment. With PIMS, a sophisticated software solution has been built that is not only universally accessible for POIC customers' information management needs, but also universally adaptable in implementation and application as a generalized information management system.
Low cost light-sheet microscopy for whole brain imaging
NASA Astrophysics Data System (ADS)
Kumar, Manish; Nasenbeny, Jordan; Kozorovitskiy, Yevgenia
2018-02-01
Light-sheet microscopy has evolved as an indispensable tool in imaging biological samples. It can image 3D samples at fast speed, with high-resolution optical sectioning, and with reduced photobleaching effects. These properties make light-sheet microscopy ideal for imaging fluorophores in a variety of biological samples and organisms, e.g. zebrafish, drosophila, cleared mouse brains, etc. While most commercial turnkey light-sheet systems are expensive, the existing lower cost implementations, e.g. OpenSPIM, are focused on achieving high-resolution imaging of small samples or organisms like zebrafish. In this work, we substantially reduce the cost of a light-sheet microscope system while targeting much larger samples, i.e. cleared mouse brains, at single-cell resolution. The expensive components of a light-sheet system - excitation laser, water-immersion objectives, and translation stage - are replaced with an incoherent laser diode, dry objectives, and a custom-built Arduino-controlled translation stage. A low-cost CUBIC protocol is used to clear fixed mouse brain samples. The open-source platforms of μManager and Fiji support image acquisition, processing, and visualization. Our system can easily be extended to multi-color light-sheet microscopy.
Design of a spreader bar crane-mounted gamma-ray radiation detection system
NASA Astrophysics Data System (ADS)
Grypp, Matthew D.; Marianno, Craig M.; Poston, John W.; Hearn, Gentry C.
2014-04-01
Over 95% of imports entering the United States from outside North America arrive by sea at 329 ports of entry, packaged in more than 11 million cargo containers. Radiation portal monitors routinely scan cargo containers leaving port on specially-designed trucks. To accelerate the process, some commercial entities have placed detection systems on the spreader-bar cranes (SBCs) used to offload containers. Little is known about the radiation background profiles of systems operating on these cranes. To better understand the operational characteristics of these radiation detection systems, a research team from Texas A&M University (TAMU) mounted three thallium-doped sodium iodide [NaI(Tl)] detectors on an SBC at the Domestic Nuclear Detection Office's (DNDO) test track facility at the Port of Tacoma (PoT). These detectors were used to monitor background radiation levels and continuously recorded data during crane operations using a custom-built software package. Count rates and spectral data were recorded for various crane heights over both land and water. The research produced a background profile in which the count rate was heavily dependent on position, demonstrating how detector readings change in the operational environment.
A new data acquisition system for the CMS Phase 1 pixel detector
NASA Astrophysics Data System (ADS)
Kornmayer, A.
2016-12-01
A new pixel detector will be installed in the CMS experiment during the extended technical stop of the LHC at the beginning of 2017. The new pixel detector, built from four layers in the barrel region and three layers on each end of the forward region, is equipped with upgraded front-end readout electronics, specifically designed to handle the high particle hit rates created in the LHC environment. The DAQ back-end was entirely redesigned to handle the increased number of readout channels, the higher data rates per channel and the new digital data format. Based entirely on the microTCA standard, new front-end controller (FEC) and front-end driver (FED) cards have been developed, prototyped and produced with custom optical link mezzanines mounted on the FC7 AMC and custom firmware. At the same time as the new detector is being assembled, the DAQ system is set up and its integration into the CMS central DAQ system tested by running the pilot blade detector already installed in CMS. This work describes the DAQ system, integration tests and gives an outline for the activities up to commissioning the final system at CMS in 2017.
NASA Astrophysics Data System (ADS)
Venugopal, Vivek; Park, Minho; Ashitate, Yoshitomo; Neacsu, Florin; Kettenring, Frank; Frangioni, John V.; Gangadharan, Sidhu P.; Gioux, Sylvain
2013-12-01
We report the design, characterization, and validation of an optimized simultaneous color and near-infrared (NIR) fluorescence rigid endoscopic imaging system for minimally invasive surgery. This system is optimized for illumination and collection of NIR wavelengths allowing the simultaneous acquisition of both color and NIR fluorescence at frame rates higher than 6.8 fps with high sensitivity. The system employs a custom 10-mm diameter rigid endoscope optimized for NIR transmission. A dual-channel light source compatible with the constraints of an endoscope was built and includes a plasma source for white light illumination and NIR laser diodes for fluorescence excitation. A prism-based 2-CCD camera was customized for simultaneous color and NIR detection with a highly efficient filtration scheme for fluorescence imaging of both 700- and 800-nm emission dyes. The performance characterization studies indicate that the endoscope can efficiently detect fluorescence signal from both indocyanine green and methylene blue in dimethyl sulfoxide at the concentrations of 100 to 185 nM depending on the background optical properties. Finally, we performed the validation of this imaging system in vivo during a minimally invasive procedure for thoracic sentinel lymph node mapping in a porcine model.
NASA Astrophysics Data System (ADS)
Chen, Yen-Luan; Chang, Chin-Chih; Sheu, Dwan-Fang
2016-04-01
This paper proposes the generalised random and age replacement policies for a multi-state system composed of multi-state elements. The degradation of the multi-state element is assumed to follow the non-homogeneous continuous time Markov process which is a continuous time and discrete state process. A recursive approach is presented to efficiently compute the time-dependent state probability distribution of the multi-state element. The state and performance distribution of the entire multi-state system is evaluated via the combination of the stochastic process and the Lz-transform method. The concept of customer-centred reliability measure is developed based on the system performance and the customer demand. We develop the random and age replacement policies for an aging multi-state system subject to imperfect maintenance in a failure (or unacceptable) state. For each policy, the optimum replacement schedule which minimises the mean cost rate is derived analytically and discussed numerically.
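The recursive computation of time-dependent state probabilities can be illustrated for the homogeneous special case. Below is a minimal sketch that integrates the Kolmogorov forward equations for an illustrative three-state degradation process; the paper treats the more general non-homogeneous process and combines element distributions with the Lz-transform, which this sketch omits:

```python
import numpy as np

def state_probabilities(Q, p0, t, steps=10_000):
    """Time-dependent state distribution of a Markov degradation process.

    Integrates the Kolmogorov forward equations dp/dt = p Q with a simple
    Euler scheme. Q is the generator matrix (rows sum to zero); in the
    non-homogeneous case Q would additionally depend on t.
    """
    p = np.asarray(p0, dtype=float).copy()
    dt = t / steps
    for _ in range(steps):
        p = p + dt * (p @ Q)
    return p

# Illustrative 3-state element: perfect -> degraded -> failed.
Q = np.array([[-0.2,  0.2, 0.0],
              [ 0.0, -0.1, 0.1],
              [ 0.0,  0.0, 0.0]])
p = state_probabilities(Q, [1.0, 0.0, 0.0], t=5.0)
print(p.round(3))  # probabilities of perfect / degraded / failed at t=5
```

The failure-state probability computed this way feeds directly into a customer-centred reliability measure once a demand threshold classifies states as acceptable or unacceptable.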
Full 3-D OCT-based pseudophakic custom computer eye model
Sun, M.; Pérez-Merino, P.; Martinez-Enriquez, E.; Velasco-Ocana, M.; Marcos, S.
2016-01-01
We compared measured wave aberrations in pseudophakic eyes implanted with aspheric intraocular lenses (IOLs) with simulated aberrations from numerical ray tracing on customized computer eye models, built using quantitative 3-D OCT-based patient-specific ocular geometry. Experimental and simulated aberrations show high correlation (R = 0.93; p<0.0001) and similarity (RMS discrepancies for high-order aberrations within 23.58%). This study shows that full OCT-based pseudophakic custom computer eye models make it possible to understand the relative contributions of optical, geometrical, and surgically-related factors to image quality, and are an excellent tool for characterizing and improving cataract surgery. PMID:27231608
NASA Technical Reports Server (NTRS)
Campbell, R. H.; Essick, Ray B.; Johnston, Gary; Kenny, Kevin; Russo, Vince
1987-01-01
Project EOS is studying the problems of building adaptable real-time embedded operating systems for the scientific missions of NASA. Choices (A Class Hierarchical Open Interface for Custom Embedded Systems) is an operating system designed and built by Project EOS to address the following specific issues: the software architecture for adaptable embedded parallel operating systems, the achievement of high-performance and real-time operation, the simplification of interprocess communications, the isolation of operating system mechanisms from one another, and the separation of mechanisms from policy decisions. Choices is written in C++ and runs on a ten processor Encore Multimax. The system is intended for use in constructing specialized computer applications and research on advanced operating system features including fault tolerance and parallelism.
SkySat-1: very high-resolution imagery from a small satellite
NASA Astrophysics Data System (ADS)
Murthy, Kiran; Shearn, Michael; Smiley, Byron D.; Chau, Alexandra H.; Levine, Josh; Robinson, M. Dirk
2014-10-01
This paper presents details of the SkySat-1 mission, which is the first microsatellite-class commercial earth-observation system to generate sub-meter resolution panchromatic imagery, in addition to sub-meter resolution 4-band pan-sharpened imagery. SkySat-1 was built and launched for an order of magnitude lower cost than similarly performing missions. The low-cost design enables the deployment of a large imaging constellation that can provide imagery with both high temporal resolution and high spatial resolution. One key enabler of the SkySat-1 mission was simplifying the spacecraft design and instead relying on ground-based image processing to achieve high performance at the system level. The imaging instrument consists of a custom-designed high-quality optical telescope and commercially-available high frame rate CMOS image sensors. While each individually captured raw image frame shows moderate quality, ground-based image processing algorithms improve the raw data by combining data from multiple frames to boost image signal-to-noise ratio (SNR) and decrease the ground sample distance (GSD) in a process Skybox calls "digital TDI". Careful quality assessment and tuning of the spacecraft, payload, and algorithms was necessary to generate high-quality panchromatic, multispectral, and pan-sharpened imagery. Furthermore, the framing sensor configuration enabled the first commercial High-Definition full-frame rate panchromatic video to be captured from space, with approximately 1 meter ground sample distance. Details of the SkySat-1 imaging instrument and ground-based image processing system are presented, as well as an overview of the work involved with calibrating and validating the system. Examples of raw and processed imagery are shown, and the raw imagery is compared to pre-launch simulated imagery used to tune the image processing algorithms.
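The SNR benefit behind frame stacking of this kind can be sketched numerically. The simulation below is illustrative, not Skybox's pipeline: it assumes perfectly co-registered frames with independent noise, whereas the real "digital TDI" system also performs sub-pixel registration to reduce GSD:

```python
import numpy as np

rng = np.random.default_rng(0)

def stack_frames(frames):
    """Average a stack of co-registered frames.

    Averaging N frames with independent noise boosts SNR by sqrt(N),
    which is the core idea behind stacking-based "digital TDI".
    """
    return np.mean(np.asarray(frames, dtype=float), axis=0)

# Synthetic scene plus independent read noise in each of 16 frames.
scene = rng.uniform(50, 200, size=(64, 64))
frames = [scene + rng.normal(0, 10, scene.shape) for _ in range(16)]

stacked = stack_frames(frames)
noise_single = np.std(frames[0] - scene)
noise_stacked = np.std(stacked - scene)
print(noise_single / noise_stacked)  # close to sqrt(16) = 4
```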
Enhanced Product Generation at NASA Data Centers Through Grid Technology
NASA Technical Reports Server (NTRS)
Barkstrom, Bruce R.; Hinke, Thomas H.; Gavali, Shradha; Seufzer, William J.
2003-01-01
This paper describes how grid technology can support the ability of NASA data centers to provide customized data products. A combination of grid technology and commodity processors is proposed to provide the bandwidth necessary to perform customized processing of data, with customized data subsetting providing the initial example. This customized subsetting engine can be used to support a new type of subsetting, called phenomena-based subsetting, where data is subsetted based on its association with some phenomenon, such as mesoscale convective systems or hurricanes. This concept is expanded to allow the phenomenon to be detected in one type of data, with the subsetting requirements transmitted to the subsetting engine to subset a different type of data. The subsetting requirements are generated by a data mining system and transmitted to the subsetter in the form of an XML feature index that describes the spatial and temporal extent of the phenomenon. For this work, a grid-based mining system called the Grid Miner is used to identify the phenomena and generate the feature index. This paper discusses the value of grid technology in facilitating the development of high-performance customized product processing and the coupling of a grid mining system to support phenomena-based subsetting.
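The shape of such an XML feature index can be sketched. The element and attribute names below are illustrative assumptions, not the actual Grid Miner schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical feature index describing one detected phenomenon and the
# spatial/temporal bounds a subsetter would apply to a different dataset.
FEATURE_INDEX = """\
<featureIndex>
  <phenomenon type="mesoscale_convective_system" id="mcs-001">
    <temporalExtent start="2003-06-01T12:00:00Z" end="2003-06-01T18:00:00Z"/>
    <spatialExtent latMin="30.0" latMax="36.5" lonMin="-102.0" lonMax="-94.0"/>
  </phenomenon>
</featureIndex>
"""

def parse_feature_index(xml_text):
    """Extract per-phenomenon subsetting bounds from a feature index."""
    root = ET.fromstring(xml_text)
    requests = []
    for p in root.findall("phenomenon"):
        t = p.find("temporalExtent").attrib
        s = {k: float(v) for k, v in p.find("spatialExtent").attrib.items()}
        requests.append({"id": p.get("id"), "type": p.get("type"),
                         "start": t["start"], "end": t["end"], **s})
    return requests

reqs = parse_feature_index(FEATURE_INDEX)
print(reqs[0]["latMax"])  # 36.5
```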
Xu, Jing; Wong, Kevin; Jian, Yifan; Sarunic, Marinko V
2014-02-01
In this report, we describe a graphics processing unit (GPU)-accelerated processing platform for real-time acquisition and display of flow contrast images with Fourier domain optical coherence tomography (FDOCT) in mouse and human eyes in vivo. Motion contrast from blood flow is processed using the speckle variance OCT (svOCT) technique, which relies on the acquisition of multiple B-scan frames at the same location and tracking the change of the speckle pattern. Real-time mouse and human retinal imaging using two different custom-built OCT systems with processing and display performed on GPU are presented with an in-depth analysis of performance metrics. The display output included structural OCT data, en face projections of the intensity data, and the svOCT en face projections of retinal microvasculature; these results compare projections with and without speckle variance in the different retinal layers to reveal significant contrast improvements. As a demonstration, videos of real-time svOCT for in vivo human and mouse retinal imaging are included in our results. The capability of performing real-time svOCT imaging of the retinal vasculature may be a useful tool in a clinical environment for monitoring disease-related pathological changes in the microcirculation such as diabetic retinopathy.
Facilitating NASA Earth Science Data Processing Using Nebula Cloud Computing
NASA Technical Reports Server (NTRS)
Pham, Long; Chen, Aijun; Kempler, Steven; Lynnes, Christopher; Theobald, Michael; Asghar, Esfandiari; Campino, Jane; Vollmer, Bruce
2011-01-01
Cloud Computing has been implemented in several commercial arenas. The NASA Nebula Cloud Computing platform is an Infrastructure as a Service (IaaS) built in 2008 at NASA Ames Research Center and in 2010 at GSFC. Nebula is an open source Cloud platform intended to: a) Make NASA realize significant cost savings through efficient resource utilization, reduced energy consumption, and reduced labor costs. b) Provide an easier way for NASA scientists and researchers to efficiently explore and share large and complex data sets. c) Allow customers to provision, manage, and decommission computing capabilities on an as-needed basis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weigand, Steven J.; Keane, Denis T.
The DuPont-Northwestern-Dow Collaborative Access Team (DND-CAT) built and currently manages sector 5 at the Advanced Photon Source (APS), Argonne National Laboratory. One of the principal techniques supported by DND-CAT is Small and Wide-Angle X-ray Scattering (SAXS/WAXS), with an emphasis on simultaneous data collection over a wide azimuthal and reciprocal space range using a custom SAXS/WAXS detector system. A new triple detector system is now in development, and we describe the key parameters and characteristics of the new instrument, which will be faster, more flexible, more robust, and will improve q-space resolution in a critical reciprocal space regime between the traditional WAXS and SAXS ranges.
Beck, Christoph; Garreau, Guillaume; Georgiou, Julius
2016-01-01
Sand-scorpions and many other arachnids perceive their environment by using their feet to sense ground waves. Their neuronal anatomy lets them detect displacement amplitudes on the order of the size of an atom and localize acoustic stimuli to within 13°. We present here a prototype sound source localization system inspired by this impressive performance. The system utilizes custom-built hardware with eight MEMS microphones, one for each foot, to acquire the acoustic scene, and a spiking neural model to localize the sound source. The current implementation shows a smaller localization error than that observed in nature.
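A conventional way to frame the underlying problem is time-difference-of-arrival estimation between microphone pairs. The sketch below uses classic cross-correlation TDOA with an assumed sample rate and microphone spacing; it does not reproduce the paper's spiking neural model:

```python
import numpy as np

FS = 48_000   # sample rate (Hz), illustrative
C = 343.0     # speed of sound in air (m/s)
D = 0.10      # microphone spacing (m), illustrative

def tdoa_angle(sig_a, sig_b):
    """Estimate arrival angle from the inter-microphone delay.

    Cross-correlates the two signals, converts the peak lag to a time
    delay, and maps it to an angle via the far-field geometry
    sin(theta) = delay * c / d.
    """
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = (len(sig_b) - 1) - np.argmax(corr)   # samples by which b lags a
    delay = lag / FS
    sin_theta = np.clip(delay * C / D, -1.0, 1.0)
    return np.degrees(np.arcsin(sin_theta))

# Simulate a broadband click that reaches microphone B 3 samples late.
rng = np.random.default_rng(1)
click = rng.normal(size=256)
sig_a = np.concatenate([click, np.zeros(8)])
sig_b = np.concatenate([np.zeros(3), click, np.zeros(5)])
print(round(tdoa_angle(sig_a, sig_b), 1))  # ≈ 12.4 degrees
```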
Solar energy system performance evaluation: Seasonal report for Decade 80 House, Tucson, Arizona
NASA Technical Reports Server (NTRS)
1980-01-01
The operational and thermal performance of the Decade 80 solar energy system is described. The system was designed by the Copper Development Association, Inc. to provide space heating and space cooling to a one-story, single-family residence located in Tucson, Arizona. The Decade 80 House was designed and built in the mid-70's to be a showplace/workshop for solar energy utilization. Superior construction techniques, the use of quality materials, and a full-time maintenance staff have served to make the entire system an outstanding example of the application of solar energy for residential purposes. The luxury of a full-time, on-site maintenance person is perhaps the single most important aspect of this program. While most installations cannot support this level of maintenance, it was very useful in keeping all subsystems operating in top form and allowing a full season of data to be collected. Several conclusions were drawn from the long-term monitoring effort, among which are: (1) flat plate collectors will support cooling; (2) definite energy savings can be realized; and (3) more frequent periodic maintenance may be required on solar energy systems that are not custom built.
Black Box Thinking: Analysis of a Service Outsourcing Case in Insurance
ERIC Educational Resources Information Center
Witman, Paul D.; Njunge, Christopher
2016-01-01
Often, users of information systems (both automated and manual) must analyze those systems in a "black box" fashion, without being able to see the internals of how the system is supposed to work. In this case of business process outsourcing, an insurance industry customer encounters an ongoing stream of customer service issues, with both…
High-density fiber-optic DNA random microsphere array.
Ferguson, J A; Steemers, F J; Walt, D R
2000-11-15
A high-density fiber-optic DNA microarray sensor was developed to monitor multiple DNA sequences in parallel. Microarrays were prepared by randomly distributing DNA probe-functionalized 3.1-microm-diameter microspheres in an array of wells etched in a 500-microm-diameter optical imaging fiber. Registration of the microspheres was performed using an optical encoding scheme and a custom-built imaging system. Hybridization was visualized using fluorescent-labeled DNA targets with a detection limit of 10 fM. Hybridization times of seconds are required for nanomolar target concentrations, and analysis is performed in minutes.
NASA Astrophysics Data System (ADS)
Morgan, Christopher G.; Mitchell, A. C.; Murray, J. G.
1990-05-01
An imaging photon detector has been modified to incorporate fast timing electronics coupled to a custom-built photon correlator interfaced to a RISC computer. Using excitation with intensity-modulated light, fluorescence images can be readily obtained where contrast is determined by the decay time of emission, rather than by intensity. This technology is readily extended to multifrequency phase/demodulation fluorescence imaging or to differential polarised phase fluorometry. The potential use of the correlator for confocal imaging with a laser scanner is also briefly discussed.
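With intensity-modulated excitation, an apparent lifetime follows from the measured phase shift and demodulation at the modulation frequency. A minimal sketch of the standard single-frequency estimators (the 80 MHz frequency and 4 ns lifetime below are illustrative):

```python
import numpy as np

def lifetimes_from_phase_modulation(phase_rad, modulation, freq_hz):
    """Apparent fluorescence lifetimes from frequency-domain data.

    Standard single-frequency estimators:
      tau_phase = tan(phi) / omega
      tau_mod   = sqrt(1/m**2 - 1) / omega
    For a mono-exponential decay the two estimates agree.
    """
    omega = 2 * np.pi * freq_hz
    tau_phase = np.tan(phase_rad) / omega
    tau_mod = np.sqrt(1.0 / modulation**2 - 1.0) / omega
    return tau_phase, tau_mod

# Forward-simulate a 4 ns mono-exponential decay probed at 80 MHz.
tau, f = 4e-9, 80e6
omega = 2 * np.pi * f
phi = np.arctan(omega * tau)            # expected phase shift
m = 1.0 / np.sqrt(1.0 + (omega * tau) ** 2)  # expected demodulation
tp, tm = lifetimes_from_phase_modulation(phi, m, f)
print(round(tp * 1e9, 3), round(tm * 1e9, 3))  # both ≈ 4.0 ns
```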
NASA Astrophysics Data System (ADS)
Rimskog, Magnus; O'Loughlin, Brian J.
2007-02-01
Silex Microsystems handles a wide range of customized MEMS components. This talk describes Silex's MEMS foundry work model for providing customized MEMS solutions in a cost-effective and well-controlled manner. The key success factor is the capability to reformulate a customer's product concept into manufacturing processes in the wafer fab, using standard process modules and production equipment. A well-controlled system increases the likelihood of first-batch success and enables fast ramp-up into volume production. The following success factors can be listed: strong, enduring relationships with customers; highly qualified, well-experienced specialists working closely with the customer; process solutions and building blocks ready to use from a library; addressing manufacturing issues in the early design phase; in-house know-how to meet demands for volume manufacturing; access to a wafer fab with high capacity, good organization, high equipment availability, and short lead times; and process development done in the manufacturing environment on production equipment for easy ramp-up to volume production. The article covers a method of working that addresses these factors: maintaining long, enduring relationships with customers, applying MEMS expertise and working closely with them to translate their product ideas into MEMS components; maintaining stable process solutions for features such as low-ohmic vias, spiked electrodes, cantilevers, silicon optical mirrors, and micro needles, which can be reused and modified for customer needs; and using a structured development and design methodology to handle hundreds of process modules and set up standard run sheets. It is also very important to do real-time process development in the manufacturing line, which minimizes the lead time for production ramp-up, and to have access to a state-of-the-art wafer fab that is well organized, controlled, and flexible, with high capacity and short lead times for prototypes. It is crucial to have intimate control of processes, equipment, organization, production flow, and WIP; this has been addressed by using a fully computerized control and reporting system.
Customer Decision Making in Web Services with an Integrated P6 Model
NASA Astrophysics Data System (ADS)
Sun, Zhaohao; Sun, Junqing; Meredith, Grant
Customer decision making (CDM) is an indispensable factor for web services. This article examines CDM in web services with a novel P6 model, which consists of the 6 Ps: privacy, perception, propensity, preference, personalization and promised experience. This model integrates the existing 6 P elements of the marketing mix as the system environment of CDM in web services. The new integrated P6 model deals with the inner world of the customer and incorporates what the customer thinks during the decision-making process. The proposed approach will facilitate the research and development of web services and decision support systems.
NASA Astrophysics Data System (ADS)
Xie, Dengling; Xie, Yanjun; Liu, Peng; Tong, Lieshu; Chu, Kaiqin; Smith, Zachary J.
2017-02-01
Current flow-based blood counting devices require expensive and centralized medical infrastructure and are not appropriate for field use. In this paper we report a method to count red blood cells, white blood cells, and platelets with a low-cost, fully-automated blood counting system. The approach consists of using a compact, custom-built microscope with a large field-of-view to record bright-field and fluorescence images of samples that are diluted with a single, stable reagent mixture and counted using automatic algorithms. Sample collection is performed manually using a spring-loaded lancet and volume-metering capillary tubes. The capillaries are then dropped into a tube of pre-measured reagents and gently shaken for 10-30 seconds. The sample is loaded into a measurement chamber and placed on a custom 3D-printed platform. Sample translation and focusing is fully automated, and a user has only to press a button for the measurement and analysis to commence. The cost of the system is minimized through the use of custom-designed motorized components. We performed a series of comparative experiments by trained and untrained users on blood from adults and children. We compare the performance of our system, as operated by trained and untrained users, to the clinical gold standard using a Bland-Altman analysis, demonstrating good agreement of our system with the clinical standard. The system's low cost, complete automation, and good field performance indicate that it can be successfully translated for use in low-resource settings where central hematology laboratories are not accessible.
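The Bland-Altman comparison reduces to the mean difference between methods and its 95% limits of agreement. A minimal sketch with made-up counts (the numbers are illustrative, not the study's data):

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Bland-Altman agreement statistics between two measurement methods.

    Returns the mean difference (bias) and the 95% limits of agreement,
    bias ± 1.96 * SD of the pairwise differences.
    """
    a, b = np.asarray(method_a, float), np.asarray(method_b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Illustrative WBC counts (10^3 cells/uL): device vs. clinical analyzer.
device =   [6.1, 7.4, 5.2, 9.0, 4.8, 6.9, 8.1, 5.5]
analyzer = [6.0, 7.6, 5.1, 8.8, 4.9, 7.1, 8.0, 5.7]
bias, (lo, hi) = bland_altman(device, analyzer)
print(round(bias, 3), round(lo, 3), round(hi, 3))
```

In practice the differences are also plotted against the pairwise means to check that agreement does not drift with the magnitude of the count.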
Customized fiber glass posts. Fatigue and fracture resistance.
Costa, Rogério Goulart; De Morais, Eduardo Christiano Caregnatto; Campos, Edson Alves; Michel, Milton Domingos; Gonzaga, Carla Castiglia; Correr, Gisele Maria
2012-02-01
To evaluate the root fracture strength of human single-rooted premolars restored with customized fiberglass post-core systems after fatigue simulation. 40 human premolars had their crowns cut and the root length was standardized to 13 mm. The teeth were endodontically treated and embedded in acrylic resin. The specimens were distributed into four groups (n=10) according to the restorative material used: prefabricated fiber post (PFP), PFP+accessory fiber posts (PFPa), PFP+unidirectional fiberglass (PFPf), and unidirectional fiberglass customized post (CP). All posts were luted using resin cement and the cores were built up with a resin composite. The samples were stored for 24 hours at 37°C and 100% relative humidity and then submitted to mechanical cycling. The specimens were then compressive-loaded in a universal testing machine at a crosshead speed of 0.5 mm/minute until fracture. The failure patterns were analyzed and classified. Data were submitted to one-way ANOVA and Tukey's test (alpha = 0.05). The mean values of maximum load (N) were: PFP - 811.4 +/- 124.3; PFPa - 729.2 +/- 157.2; PFPf - 747.5 +/- 204.7; CP - 762.4 +/- 110. Statistical differences were not observed among the groups. All groups showed favorable restorable failures. The customized fiberglass post did not show improved fracture resistance or differences in failure patterns when compared to prefabricated glass fiber posts.
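The statistical workflow here (one-way ANOVA at alpha = 0.05) can be sketched from first principles. The data below are synthetic draws loosely matching the reported group means and SDs, not the study's measurements:

```python
import numpy as np

def one_way_anova(*groups):
    """One-way ANOVA F statistic: between-group MS / within-group MS."""
    groups = [np.asarray(g, dtype=float) for g in groups]
    k = len(groups)
    n_total = sum(len(g) for g in groups)
    grand = np.concatenate(groups).mean()
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n_total - k))

# Synthetic fracture loads (N), n=10 per group, mimicking the reported
# means/SDs for PFP, PFPa, PFPf, and CP (illustrative only).
rng = np.random.default_rng(3)
pfp, pfpa = rng.normal(811.4, 124.3, 10), rng.normal(729.2, 157.2, 10)
pfpf, cp = rng.normal(747.5, 204.7, 10), rng.normal(762.4, 110.0, 10)
F = one_way_anova(pfp, pfpa, pfpf, cp)
print(round(F, 2))  # F statistic for the four groups
```

Tukey's HSD would follow only if the F test rejected the null; with overlapping group distributions like these, it typically does not.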
Combining virtual reality and multimedia techniques for effective maintenance training
NASA Astrophysics Data System (ADS)
McLin, David M.; Chung, James C.
1996-02-01
This paper describes a virtual reality (VR) system developed for use as part of an integrated, low-cost, stand-alone, multimedia trainer. The trainer is used to train National Guard personnel in maintenance and trouble-shooting tasks for the M1A1 Abrams tank, the M2A2 Bradley fighting vehicle and the TOW II missile system. The VR system features a modular, extensible, object-oriented design which consists of a training monitor component, a VR run time component, a model loader component, and a set of domain-specific object behaviors which mimic the behavior of objects encountered in the actual vehicles. The VR system is built from a combination of off-the-shelf commercial software and custom software developed at RTI.
Shared Autonomy Manipulation Data with a Seabotix vLBV300
Hollinger, Geoffrey; Lawrance, Nicholas
2017-06-19
This report outlines marine field demonstrations for manipulation tasks with a semi-Autonomous Underwater Vehicle (sAUV). The vehicle is built off a Seabotix vLBV300 platform with custom software interfacing it with the Robot Operating System (ROS). The vehicle utilizes an inertial navigation system available from Greensea Systems, Inc. based on a Gladiator Landmark 40 IMU coupled with a Teledyne Explorer Doppler Velocity Log to perform station keeping at a desired location and orientation. We performed two marine trials with the vehicle: a near-shore shared autonomy manipulation trial and an offshore attempted intervention trial. These demonstrations were designed to show the capabilities of our sAUV system for inspection and basic manipulation tasks in real marine environments.
Anaerobic Digestion and its Applications
Anaerobic digestion is a natural biological process. The initials "AD" may refer to the process of anaerobic digestion, or the built systems of anaerobic digesters. While there are many kinds of digesters, the biology is basically the same for all. Anaerobic digesters are built...
Customer-driven outcomes: a patient and family perspective.
Weston, Marla J; Weston, Richard R
2006-01-01
Experiencing the healthcare system during an acute surgical event highlighted factors that contributed to customer-driven outcomes. Communicating intentions of and rationale for interventions increased the patient and family's confidence, and engaged the whole mind-body connection into the healing process. Utilizing the family as a repository of patient information incorporated their perspective, knowledge, and wisdom into the delivery and evaluation of patient care. Lastly, fostering the relationship between the nurse and the patient and family strengthened the therapeutic process, thus providing a foundation for customizing care.
An automatic system to study sperm motility and energetics.
Shi, Linda Z; Nascimento, Jaclyn M; Chandsawangbhuwana, Charlie; Botvinick, Elliot L; Berns, Michael W
2008-08-01
An integrated robotic laser and microscope system has been developed to automatically analyze individual sperm motility and energetics. The custom-designed optical system directs near-infrared laser light into an inverted microscope to create a single-point 3-D gradient laser trap at the focal spot of the microscope objective. A two-level computer structure is described that quantifies the sperm motility (in terms of swimming speed and swimming force) and energetics (measuring mid-piece membrane potential) using real-time tracking (done by the upper-level system) and fluorescent ratio imaging (done by the lower-level system). The communication between these two systems is achieved by a gigabit network. The custom-built image processing algorithm identifies the sperm swimming trajectory in real-time using phase contrast images, and then subsequently traps the sperm by automatically moving the microscope stage to relocate the sperm to the laser trap focal plane. Once the sperm is stably trapped (determined by the algorithm), the algorithm can also gradually reduce the laser power by rotating the polarizer in the laser path to measure the trapping power at which the sperm is capable of escaping the trap. To monitor the membrane potential of the mitochondria located in a sperm's mid-piece, the sperm is treated with a ratiometrically-encoded fluorescent probe. The proposed algorithm can relocate the sperm to the center of the ratio imaging camera and the average ratio value can be measured in real-time. The three parameters, sperm escape power, sperm swimming speed and ratio values of the mid-piece membrane potential of individual sperm can be compared with respect to time. This two-level automatic system to study individual sperm motility and energetics has not only increased experimental throughput by an order of magnitude but also has allowed us to monitor sperm energetics prior to and after exposure to the laser trap. 
This system should have application in both the human fertility clinic and in animal husbandry.
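The escape-power procedure described in the abstract above, stepping the trap power down until the tracked cell swims free, can be sketched as follows. This is a minimal illustration, not the authors' implementation; `get_position` and `set_power` are hypothetical stand-ins for the tracking camera and polarizer-rotation interfaces, and all units are arbitrary.

```python
def measure_escape_power(get_position, set_power, start_power=100.0,
                         step=2.0, escape_radius=5.0, trap_center=(0.0, 0.0)):
    """Lower trap power in steps until the tracked cell escapes.

    get_position and set_power are hypothetical hardware interfaces
    (tracking camera, polarizer rotation). Returns the power level at
    which the cell left the trap region, or 0.0 if it never escaped.
    """
    power = start_power
    while power > 0:
        set_power(power)                      # rotate polarizer to this power
        x, y = get_position()                 # current tracked position
        dx, dy = x - trap_center[0], y - trap_center[1]
        if (dx * dx + dy * dy) ** 0.5 > escape_radius:
            return power                      # cell swam out at this level
        power -= step
    return 0.0
```

In the real system the position check would run against the phase-contrast tracking algorithm each frame rather than once per power step.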
Real-time data acquisition and control system for the measurement of motor and neural data
Bryant, Christopher L.; Gandhi, Neeraj J.
2013-01-01
This paper outlines a powerful yet flexible real-time data acquisition and control system for use in the triggering and measurement of both analog and digital events. Built using the LabVIEW development architecture (version 7.1) and freely available, this system provides precisely timed auditory and visual stimuli to a subject while recording analog data and timestamps of neural activity retrieved from a window discriminator. The system utilizes the most recent real-time (RT) technology in order to provide not only a guaranteed data acquisition rate of 1 kHz, but also a much more difficult-to-achieve guaranteed system response time of 1 ms. The system interface is Windows-based and easy to use, providing a host of configurable options for end-user customization. PMID:15698659
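The 1 kHz acquisition rate with a 1 ms response budget is a hard guarantee that LabVIEW RT provides; the toy loop below only illustrates the deadline-accounting idea. Python is not a real-time environment, and the `sample` callback is an illustrative stand-in, not part of the described system.

```python
import time

def run_acquisition_loop(sample, n_samples=100, period_s=0.001):
    """Fixed-rate acquisition loop with deadline accounting.

    `sample` is a hypothetical data-acquisition callback. Overruns of
    the period are merely counted here; a true RT system would bound
    them by construction.
    """
    deadline_misses = 0
    samples = []
    next_tick = time.perf_counter()
    for _ in range(n_samples):
        samples.append(sample())              # acquire one sample
        next_tick += period_s                 # schedule the next 1 ms tick
        remaining = next_tick - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)             # wait out the rest of the period
        else:
            deadline_misses += 1              # handler overran its budget
    return samples, deadline_misses
```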
Multiphysical FE-analysis of a front-end bending phenomenon in a hot strip mill
NASA Astrophysics Data System (ADS)
Ilmola, Joonas; Seppälä, Oskari; Leinonen, Olli; Pohjonen, Aarne; Larkiola, Jari; Jokisaari, Juha; Putaansuu, Eero
2018-05-01
In hot steel rolling processes, a slab is generally rolled to a transfer bar in a roughing process and to a strip in a hot strip rolling process. Over several rolling passes the front-end may bend upward or downward due to asymmetrical rolling conditions, causing entry problems in the next rolling pass. Many different factors may affect the front-end bending phenomenon, and they are very challenging to measure. Thus, a customized finite element model is designed and built to simulate the front-end bending phenomenon in a hot strip rolling process. To simulate the functioning of the hot strip mill precisely, the automated control logic of the mill must be considered. In this paper we studied the effect of roll bite friction conditions and the amount of reduction on the front-end bending phenomenon in a hot strip rolling process.
A Lightweight, High-performance I/O Management Package for Data-intensive Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jun Wang
2007-07-17
File storage systems are playing an increasingly important role in high-performance computing as the performance gap between CPU and disk increases. It could take a long time to develop an entire system from scratch. Solutions will have to be built as extensions to existing systems. If new portable, customized software components are plugged into these systems, better sustained high I/O performance and higher scalability will be achieved, and the development cycle of the next generation of parallel file systems will be shortened. The overall research objective of this ECPI development plan aims to develop a lightweight, customized, high-performance I/O management package named LightI/O to extend and leverage current parallel file systems used by DOE. During this period, we developed a novel component in LightI/O, prototyped it in PVFS2, and evaluated the resulting extended PVFS2 system on data-intensive applications. The preliminary results indicate the extended PVFS2 delivers better performance and reliability to users. A strong collaborative effort between the PI at the University of Nebraska-Lincoln and the DOE collaborators, Drs. Rob Ross and Rajeev Thakur at Argonne National Laboratory, who lead the PVFS2 group, makes the project more promising.
Shedlock, James; Frisque, Michelle; Hunt, Steve; Walton, Linda; Handler, Jonathan; Gillam, Michael
2010-04-01
How can the user's access to health information, especially full-text articles, be improved? The solution is building and evaluating the Health SmartLibrary (HSL). The setting is the Galter Health Sciences Library, Feinberg School of Medicine, Northwestern University. The HSL was built on web-based personalization and customization tools: My E-Resources, Stay Current, Quick Search, and File Cabinet. Personalization and customization data were tracked to show user activity with these value-added, online services. Registration data indicated that users were receptive to personalized resource selection and that the automated application of specialty-based, personalized HSLs was more frequently adopted than manual customization by users. Those who did customize customized My E-Resources and Stay Current more often than Quick Search and File Cabinet. Most of those who customized did so only once. Users did not always take advantage of the services designed to aid their library research experiences. When personalization is available at registration, users readily accepted it. Customization tools were used less frequently; however, more research is needed to determine why this was the case.
Databases Are Not Toasters: A Framework for Comparing Data Warehouse Appliances
NASA Astrophysics Data System (ADS)
Trajman, Omer; Crolotte, Alain; Steinhoff, David; Nambiar, Raghunath Othayoth; Poess, Meikel
The success of Business Intelligence (BI) applications depends on two factors: the ability to analyze data ever more quickly and the ability to handle ever-increasing volumes of data. Data Warehouse (DW) and Data Mart (DM) installations that support BI applications have historically been built using traditional architectures, either designed from the ground up or based on customized reference system designs. The advent of Data Warehouse Appliances (DA) brings packaged software and hardware solutions that address performance and scalability requirements for certain market segments. The differences between DAs and custom installations make direct comparisons between them impractical and suggest the need for a targeted DA benchmark. In this paper we review data warehouse appliances by surveying thirteen products offered today. We assess the common characteristics among them and propose a classification for DA offerings. We hope our results will help define a useful benchmark for DAs.
Applications for General Purpose Command Buffers: The Emergency Conjunction Avoidance Maneuver
Scheid, Robert J; England, Martin
2016-01-01
A case study is presented for the use of Relative Operation Sequence (ROS) command buffers to quickly execute a propulsive maneuver to avoid a collision with space debris. In this process, a ROS is custom-built with a burn time and magnitude, uplinked to the spacecraft, and executed in 15 percent of the time of the previous method. This new process provides three primary benefits. First, the planning cycle can be delayed until it is certain a burn must be performed, reducing team workload. Second, changes can be made to the burn parameters almost up to the point of execution while still allowing the normal uplink product review process, reducing the risk of leaving the operational orbit because of outdated burn parameters, and minimizing the chance of accidents from human error, such as missed commands, in a high-stress situation. Third, the science impacts can be customized and minimized around the burn, and in the event of an abort can be eliminated entirely in some circumstances. The result is a compact burn process that can be executed in as few as four hours and can be aborted seconds before execution. Operational, engineering, planning, and flight dynamics perspectives are presented, as well as a functional overview of the code and workflow required to implement the process. Future expansions and capabilities are also discussed.
WAN Optimization: A Business Process Reengineering and Knowledge Value Added Approach
2011-03-01
processing is not affected. … Reliability: The Customer or Order systems are unavailable; if either fails, order processing halts and alerts are … online immediately, and sends a fax to the customer who orders the container. The whole order processing process can be completed in one day. IT plays … Messages build up in the OrderQ until the email server restarts. Messages are then sent by the SendEmail component to remove the backlog. Order …
Beholz, Sven; Konertz, Wolfgang
2006-01-01
The evaluation of customers' satisfaction is elementary for any quality management system. In our university cardiac surgery unit that has been certified according to DIN EN ISO 9001:2000 the influence of repeated evaluation of the referring physicians' satisfaction conducted in the course of three consecutive years on structures and processes in the scope of the quality management system was examined. Customers' satisfaction with the possibility of access to the department could be increased by targeted interventions. Further interventions in the field of documentation led to a measurable increase in satisfaction with postoperative communication. Repeated annual evaluation of the satisfaction of referring physicians has proved to be a valuable tool in the process of continuous quality improvement.
Structural Analysis Using Computer Based Methods
NASA Technical Reports Server (NTRS)
Dietz, Matthew R.
2013-01-01
The stiffness of a flex hose that will be used in the umbilical arms of the Space Launch System's mobile launcher needed to be determined in order to properly qualify ground umbilical plate behavior during vehicle separation post T-0. This data is also necessary to properly size and design the motors used to retract the umbilical arms. Therefore an experiment was created to determine the stiffness of the hose. Before the test apparatus for the experiment could be built, the structure had to be analyzed to ensure it would not fail under the given loading conditions. The design model was imported into the analysis software and optimized to decrease runtime while still providing accurate results and allowing for seamless meshing. Areas exceeding the allowable stresses in the structure were located and modified before submitting the design for fabrication. In addition, a mock-up of a deep space habitat and its support frame was designed and needed to be analyzed for structural integrity under different loading conditions. The load cases were provided by the customer and were applied to the structure after optimizing the geometry. Once again, weak points in the structure were located, recommended design changes were made to the customer, and the process was repeated until the load conditions were met without exceeding the allowable stresses. After the stresses met the required factors of safety the designs were released for fabrication.
1994-12-01
Order Cycle … 2. Order Processing and the Information System … 3. The Order Cycle at SCCB … order transmittal time, order processing time, order assembly time, stock availability, production time, and delivery time. … methods, inventory stocking policies, order processing procedures, transport modes, and scheduling methods [Ref. 15].
McDonald, Sandra A; Ryan, Benjamin J; Brink, Amy; Holtschlag, Victoria L
2012-02-01
Informatics systems, particularly those that provide capabilities for data storage, specimen tracking, retrieval, and order fulfillment, are critical to the success of biorepositories and other laboratories engaged in translational medical research. A crucial item—one easily overlooked—is an efficient way to receive and process investigator-initiated requests. A successful electronic ordering system should allow request processing in a maximally efficient manner, while also allowing streamlined tracking and mining of request data such as turnaround times and numerical categorizations (user groups, funding sources, protocols, and so on). Ideally, an electronic ordering system also facilitates the initial contact between the laboratory and customers, while still allowing for downstream communications and other steps toward scientific partnerships. We describe here the recently established Web-based ordering system for the biorepository at Washington University Medical Center, along with its benefits for workflow, tracking, and customer service. Because of the system's numerous value-added impacts, we think our experience can serve as a good model for other customer-focused biorepositories, especially those currently using manual or non-Web-based request systems. Our lessons learned also apply to the informatics developers who serve such biobanks. PMID:23386921
Modeling to Improve the Risk Reduction Process for Command File Errors
NASA Technical Reports Server (NTRS)
Meshkat, Leila; Bryant, Larry; Waggoner, Bruce
2013-01-01
The Jet Propulsion Laboratory has learned that even innocuous errors in the spacecraft command process can have significantly detrimental effects on a space mission. Consequently, such Command File Errors (CFEs), regardless of their effect on the spacecraft, are treated as significant events for which a root cause is identified and corrected. A CFE during space mission operations is often the symptom of imbalance or inadequacy within the system that encompasses the hardware and software used for command generation as well as the human experts and processes involved in this endeavor. As we move into an era of increased collaboration with other NASA centers and commercial partners, these systems become more and more complex. Consequently, the ability to thoroughly model and analyze CFEs formally in order to reduce the risk they pose is increasingly important. In this paper, we summarize the results of applying previously developed modeling techniques to the DAWN flight project. The original models were built with the input of subject matter experts from several flight projects. We have now customized these models to address specific questions for the DAWN flight project and formulated use cases to address its unique mission needs. The goal of this effort is to enhance the project's ability to meet commanding reliability requirements for operations and to assist in managing Command File Errors.
Recommendation Systems for Geoscience Data Portals Built by Analyzing Usage Patterns
NASA Astrophysics Data System (ADS)
Crosby, C.; Nandigam, V.; Baru, C.
2009-04-01
Since its launch five years ago, the National Science Foundation-funded GEON Project (www.geongrid.org) has been providing access to a variety of geoscience data sets such as geologic maps and other geographic information system (GIS)-oriented data, paleontologic databases, gravity and magnetics data, and LiDAR topography via its online portal interface. In addition to data, the GEON Portal also provides web-based tools and other resources that enable users to process and interact with data. Examples of these tools include functions to dynamically map and integrate GIS data, compute synthetic seismograms, and produce custom digital elevation models (DEMs) with user-defined parameters such as resolution. The GEON Portal, built on the GridSphere portal framework, allows us to capture user interaction with the system. In addition to the site access statistics captured by tools like Google Analytics (hits per unit time, search keywords, operating systems, browsers, and referring sites), we also record additional statistics such as which data sets are being downloaded and in what formats, processing parameters, and navigation pathways through the portal. With over four years of data now available from the GEON Portal, this record of usage is a rich resource for exploring how earth scientists discover and utilize online data sets. Furthermore, we propose that this data could ultimately be harnessed to optimize the way users interact with the data portal, design intelligent processing and data management systems, and make recommendations on algorithm settings and other available relevant data.
The paradigm of integrating popular and commonly used patterns to make recommendations to a user is well established in the world of e-commerce, where users receive suggestions on books, music and other products that they may find interesting based on their website browsing and purchasing history, as well as the patterns of fellow users who have made similar selections. However, this paradigm has not yet been explored for geoscience data portals. In this presentation we will present an initial analysis of user interaction and access statistics for the GEON OpenTopography LiDAR data distribution and processing system to illustrate what they reveal about users' spatial and temporal data access patterns, data processing parameter selections, and pathways through the data portal. We also demonstrate what these usage statistics can illustrate about aspects of the data sets that are of greatest interest. Finally, we explore how these usage statistics could be used to improve the user's experience in the data portal and to optimize how data access interfaces and tools are designed and implemented.
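A minimal sketch of the e-commerce-style recommendation idea discussed above, using co-access counts within portal sessions. The session data, function names, and scoring rule are illustrative assumptions, not the GEON implementation.

```python
from collections import Counter
from itertools import combinations

def build_cooccurrence(sessions):
    """Count how often two datasets are accessed in the same session.

    `sessions` is a list of lists of dataset identifiers (illustrative).
    """
    co = Counter()
    for items in sessions:
        for a, b in combinations(sorted(set(items)), 2):
            co[(a, b)] += 1          # store both directions so lookup
            co[(b, a)] += 1          # by either item is symmetric
    return co

def recommend(co, item, k=3):
    """Rank other datasets by co-access frequency with `item`."""
    scored = [(other, n) for (i, other), n in co.items() if i == item]
    # Sort by descending count, then alphabetically for a stable order.
    return [other for other, _ in sorted(scored, key=lambda t: (-t[1], t[0]))[:k]]
```

Real recommenders would normalize for item popularity and decay old sessions, but the co-occurrence table above is the core of the "users who accessed X also accessed Y" pattern.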
Affect Recognition through Facebook for Effective Group Profiling towards Personalized Instruction
ERIC Educational Resources Information Center
Troussas, Christos; Espinosa, Kurt Junshean; Virvou, Maria
2016-01-01
Social networks are increasingly being considered as a powerful medium for learning. In the research area of Intelligent Tutoring Systems in particular, they can support intuitive, adaptive, and customized e-learning systems which advance the learning process by revealing the capacities and shortcomings of every learner and by customizing the…
A configurable and low-power mixed signal SoC for portable ECG monitoring applications.
Kim, Hyejung; Kim, Sunyoung; Van Helleputte, Nick; Artes, Antonio; Konijnenburg, Mario; Huisken, Jos; Van Hoof, Chris; Yazicioglu, Refet Firat
2014-04-01
This paper describes a mixed-signal ECG System-on-Chip (SoC) that is capable of implementing configurable functionality with low power consumption for portable ECG monitoring applications. A low-voltage, high-performance analog front-end extracts 3-channel ECG signals and a single-channel electrode-tissue impedance (ETI) measurement with high signal quality, which can be used to evaluate the quality of the ECG measurement and to filter motion artifacts. A custom digital signal processor consisting of a 4-way SIMD processor provides the configurability and advanced functionality such as motion artifact removal and R-peak detection. A built-in 12-bit analog-to-digital converter (ADC) is capable of adaptive sampling, achieving a compression ratio of up to 7, and loop buffer integration reduces the power consumption of on-chip memory access. The SoC is implemented in a 0.18 μm CMOS process, consumes 32 μW from a 1.2 V supply while a heartbeat detection application is running, and is integrated in a wireless ECG monitoring system with the Bluetooth protocol.
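One common form of adaptive sampling, keeping a sample only when it deviates from the last kept value by more than a threshold, can be sketched as below. The paper does not spell out the chip's exact rule, so this delta-based scheme is an illustrative assumption; it shows how a quasi-flat ECG baseline compresses well while QRS complexes are kept.

```python
def adaptive_sample(signal, threshold):
    """Keep a sample only when it differs from the last kept value
    by more than `threshold` (a simplified stand-in for an
    adaptive-sampling ADC). Returns (kept, compression_ratio), where
    `kept` is a list of (index, value) pairs.
    """
    kept = [(0, signal[0])]                    # always keep the first sample
    for i, v in enumerate(signal[1:], start=1):
        if abs(v - kept[-1][1]) > threshold:   # significant change: keep it
            kept.append((i, v))
    ratio = len(signal) / len(kept)            # e.g. 7 means 7x compression
    return kept, ratio
```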
An Indoor Positioning-Based Mobile Payment System Using Bluetooth Low Energy Technology
Winata, Doni
2018-01-01
The development of information technology has paved the way for faster and more convenient payment process flows and new methodology for the design and implementation of next generation payment systems. The growth of smartphone usage nowadays has fostered a new and popular mobile payment environment. Most of the current generation smartphones support Bluetooth Low Energy (BLE) technology to communicate with nearby BLE-enabled devices. It is plausible to construct an Over-the-Air BLE-based mobile payment system as one of the payment methods for people living in modern societies. In this paper, a secure indoor positioning-based mobile payment authentication protocol with BLE technology and the corresponding mobile payment system design are proposed. The proposed protocol consists of three phases: initialization phase, session key construction phase, and authentication phase. When a customer moves toward the POS counter area, the proposed mobile payment system will automatically detect the position of the customer to confirm whether the customer is ready for the checkout process. Once the system has identified the customer is standing within the payment-enabled area, the payment system will invoke authentication process between POS and the customer’s smartphone through BLE communication channel to generate a secure session key and establish an authenticated communication session to perform the payment transaction accordingly. A prototype is implemented to assess the performance of the proposed design for mobile payment system. In addition, security analysis is conducted to evaluate the security strength of the proposed protocol. PMID:29587399
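The session-key construction and challenge-response authentication phases described above can be sketched with standard HMAC primitives. This is a simplified stand-in for the paper's protocol: the pre-shared secret, the nonce mixing, and the function names are all assumptions for illustration only.

```python
import hashlib
import hmac

def derive_session_key(shared_secret, pos_nonce, phone_nonce):
    """Illustrative session-key construction: both sides mix a
    pre-shared secret with fresh nonces from the POS and the phone,
    so each checkout session gets a distinct key.
    """
    return hmac.new(shared_secret, pos_nonce + phone_nonce,
                    hashlib.sha256).digest()

def authenticate(key, challenge):
    """Challenge-response tag proving possession of the session key,
    sent over the BLE channel instead of the key itself.
    """
    return hmac.new(key, challenge, hashlib.sha256).hexdigest()
```

Both endpoints derive the same key from the same inputs, and the POS verifies the phone by checking the tag it returns for a fresh challenge.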
Using focus groups and social marketing to strengthen promotion of group prenatal care.
Vonderheid, Susan C; Klima, Carrie S; Norr, Kathleen F; Grady, Mary Alice; Westdahl, Claire M
2013-01-01
Centering Pregnancy, an innovative group model of prenatal care, shows promise to reduce persistent adverse maternal-infant outcomes and contain costs. Because this innovation requires systemwide change, clinics reported needing support enrolling women into groups and obtaining organizational buy-in. This study used the 3-step social marketing communication strategy to help clinic staff identify key customers and customer-specific barriers to adopting or supporting Centering Pregnancy. They developed targeted information to reduce barriers and built skills in communicating with different customers through role-playing. Findings provide practical information for others to use this communication strategy to improve implementation of Centering Pregnancy.
NASA Astrophysics Data System (ADS)
Eckhardt, Matt
2014-03-01
Tunneling spectroscopy is an important technique used to measure the superconducting energy gap, a feature that is at the heart of the nature of superconductivity in various materials. In this presentation, we report the progress and results in developing high-resolution tunneling spectroscopy experimental platforms in a helium-3 cryostat, a 3 Kelvin cryocooler and a helium dip-tester. The experimental team, working in a liberal arts university, is a multi-disciplinary group consisting of one physics major, chemistry majors and a biology major. Students, including non-physics majors, learned and implemented current-voltage measurement techniques and vacuum system engineering, built electronic boxes and amplifier circuits from scratch, built custom multi-conductor cables for thermometry and current-voltage measurements, and performed conductance measurements. We report preliminary results. Acknowledgments: We acknowledge support from National Science Foundation Grant # DMR-1206561.
A portfolio of products from the rapid terrain visualization interferometric SAR
NASA Astrophysics Data System (ADS)
Bickel, Douglas L.; Doerry, Armin W.
2007-04-01
The Rapid Terrain Visualization interferometric synthetic aperture radar was designed and built at Sandia National Laboratories as part of an Advanced Concept Technology Demonstration (ACTD) to "demonstrate the technologies and infrastructure to meet the Army requirement for rapid generation of digital topographic data to support emerging crisis or contingencies." This sensor was built by Sandia National Laboratories for the Joint Programs Sustainment and Development (JPSD) Project Office to provide highly accurate digital elevation models (DEMs) for military and civilian customers, both inside and outside of the United States. The sensor achieved better than HRTe Level IV position accuracy in near real-time. The system was flown on a deHavilland DHC-7 Army aircraft. This paper presents a collection of images and data products from the Rapid Terrain Visualization interferometric synthetic aperture radar. The imagery includes orthorectified images and DEMs from the RTV interferometric SAR radar.
When times get tough, what happens to TQM? Case study.
Niven, D
1993-01-01
When Mueller Chemical Company's biggest customer, Ameriton, demanded that MCC install a total quality management system five years ago, the effort seemed worth it. Morale improved dramatically at the German company, as did quality and productivity. But now, in this fictional case study, Ameriton has gone bankrupt. As a result, MCC has had to cut its work force, and senior managers are meeting to decide whether TQM should be part of the downsized MCC. Horst Koblitz, director of TQM, and Division Manager Eva Stichen both vote yes. Stichen's division, which never supplied Ameriton, has turned its process-control system into the company's best thanks to TQM. The division is more cost-efficient, product defects are nearly nonexistent, and its safety record is spotless. As Koblitz notes, Ameriton's failure is no reason to abandon all that MCC has built. Furthermore, shareholders and customers would think that MCC was panicking. MCC just needs to tailor its TQM program to a smaller organization. But CFO Georg Becker doesn't think MCC has the time or resources for fine-tuning. And as he sees it, that might be just as well. The distractions that came with TQM took MCC away from its goal of becoming the chemicals market leader in Europe. While the company organized teams, developed measurement systems, and filled out quality reports, its competitors took away much of the market share MCC was after. TQM was a good long-term approach, but it didn't come with a plan for MCC's current situation. And CEO and Chairman Dieter Mueller won't compromise; TQM must either stay or go.(ABSTRACT TRUNCATED AT 250 WORDS)
NASA Technical Reports Server (NTRS)
Russell, Yvonne; Falsetti, Christine M.
1991-01-01
Customer requirements are presented through three viewgraphs. One graph presents the range of services, which include requirements management, network engineering, operations, and applications support. Another viewgraph presents the project planning process. The third viewgraph presents the programs and/or projects actively supported including life sciences, earth science and applications, solar system exploration, shuttle flight engineering, microgravity science, space physics, and astrophysics.
Robotic Enrichment Processing of Roche 454 Titanium Emulsion PCR at the DOE Joint Genome Institute
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamilton, Matthew; Wilson, Steven; Bauer, Diane
2010-05-28
Enrichment of emulsion PCR product is the most laborious and pipette-intensive step in the 454 Titanium process, posing the biggest obstacle for production-oriented scale-up. The Joint Genome Institute has developed a pair of custom-made robots based on the Microlab STAR liquid handling deck manufactured by Hamilton to mediate the complexity and ergonomic demands of the 454 enrichment process. The robot includes a custom-built centrifuge, magnetic deck positions, as well as heating and cooling elements. At present processing eight emulsion cup samples in a single 2.5 hour run, these robots are capable of processing up to 24 emulsion cup samples. Sample emulsions are broken using the standard 454 breaking process, transferred from a pair of 50 ml conical tubes to a single 2 ml tube, and loaded on the robot. The robot performs the enrichment protocol and produces beads in 2 ml tubes ready for counting. The robot follows the Roche 454 enrichment protocol with slight exceptions: beads are resuspended by pipette mixing rather than vortexing, and a set number of null bead removal washes is used. The robotic process is broken down into similar discrete steps: First Melt and Neutralization, Enrichment Primer Annealing, Enrichment Bead Incubation, Null Bead Removal, Second Melt and Neutralization, and Sequencing Primer Annealing. Data indicating our improvements in enrichment efficiency and total number of bases per run will also be shown.
High Available COTS Based Computer for Space
NASA Astrophysics Data System (ADS)
Hartmann, J.; Magistrati, Giorgio
2015-09-01
The availability and reliability factors of a system are central requirements of a target application. From a simple fuel injection system used in cars up to the flight control system of an autonomously navigating spacecraft, each application defines its specific availability factor under the target application boundary conditions. Increasing quality requirements on data processing systems used in space flight applications call for new architectures to fulfill the availability and reliability demands as well as the required increase in data processing power. Alongside these increased quality requirements, simplification and the use of COTS components to decrease costs, while keeping interface compatibility with currently used system standards, are clear customer needs. Data processing system design is mostly dominated by strict fulfillment of customer requirements, and reuse of available computer systems has not always been possible due to obsolescence of EEE parts, insufficient I/O capabilities, or the fact that available data processing systems did not provide the required scalability and performance.
NASA Astrophysics Data System (ADS)
Niranjan, S. P.; Chandrasekaran, V. M.; Indhira, K.
2018-04-01
This paper examines a bulk arrival and batch service queueing system with server failure during service and multiple vacations. Customers arrive at the system in bulk according to a Poisson process with rate λ. Arriving customers are served in batches of minimum size 'a' and maximum size 'b' according to the general bulk service rule. At a service completion epoch, if the queue length is less than 'a' the server leaves for a vacation (secondary job) of random length. After a vacation completion, if the queue length is still less than 'a' the server leaves for another vacation; the server keeps taking vacations until the queue length reaches 'a'. The server is not reliable at all times and may fail while serving customers. Even if the server fails, the service process is not interrupted; it continues for the current batch of customers at a lower service rate than the regular rate. The server is repaired after completing the service at the lower rate. The probability generating function of the queue size at an arbitrary time epoch is obtained for the modelled queueing system using the supplementary variable technique. Various performance characteristics are also derived, with suitable numerical illustrations.
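A small Monte Carlo sketch of the (a, b) general bulk service rule with multiple vacations described above. The working-failure mechanism is omitted for brevity, all parameter values are illustrative, and the simulation only approximates the model (the queue is held constant within each interval), so it is a sanity-check companion to the analytical PGF, not a substitute for it.

```python
import random

def simulate_bulk_queue(lam=1.0, a=3, b=5, mean_service=1.0,
                        mean_vacation=0.5, n_events=5000, seed=1):
    """Simulate the (a, b) bulk service rule with multiple vacations.

    If at least `a` customers wait, serve a batch of at most `b`;
    otherwise take a vacation of random (exponential) length and
    re-check on return. Returns the time-average queue length.
    """
    rng = random.Random(seed)
    queue, t, area = 0, 0.0, 0.0
    for _ in range(n_events):
        if queue >= a:
            served = min(queue, b)                     # general bulk service rule
            queue -= served
            dt = rng.expovariate(1.0 / mean_service)   # one batch service time
        else:
            dt = rng.expovariate(1.0 / mean_vacation)  # another vacation
        # Poisson(lam) arrivals during the interval dt
        arrivals, s = 0, rng.expovariate(lam)
        while s < dt:
            arrivals += 1
            s += rng.expovariate(lam)
        area += queue * dt           # queue held constant over dt (approximation)
        queue += arrivals
        t += dt
    return area / t
```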
Use of 3D Printing for Custom Wind Tunnel Fabrication
NASA Astrophysics Data System (ADS)
Gagorik, Paul; Bates, Zachary; Issakhanian, Emin
2016-11-01
Small-scale wind tunnels for the most part are fairly simple to produce with standard building equipment. However, the intricate bell housing and inlet shape of an Eiffel type wind tunnel, as well as the transition from diffuser to fan in a rectangular tunnel can present design and construction obstacles. With the help of 3D printing, these shapes can be custom designed in CAD models and printed in the lab at very low cost. The undergraduate team at Loyola Marymount University has built a custom benchtop tunnel for gas turbine film cooling experiments. 3D printing is combined with conventional construction methods to build the tunnel. 3D printing is also used to build the custom tunnel floor and interchangeable experimental pieces for various experimental shapes. This simple and low-cost tunnel is a custom solution for specific engineering experiments for gas turbine technology research.
[Business organization theory: its potential use in the organization of the operating room].
Bartz, H-J
2005-07-01
The paradigm of patient care in the German health system is changing. The introduction of German Diagnosis Related Groups (G-DRGs), a diagnosis-related coding system, has made process-oriented thinking increasingly important. The treatment process is viewed and managed as a whole, from the admission to the discharge of the patient. The interfaces between departments and sectors are diminished. A main objective of these measures is to render patient care more cost efficient. Within the hospital, the operating room (OR) is the most expensive factor, accounting for 25-50% of the costs of a surgical patient, and is also a bottleneck in surgical patient care. Therefore, controlling the perioperative treatment process is becoming increasingly important. Here, business organisation theory can be a very useful tool. Especially the concepts of process organisation and process management can be applied to hospitals. Process-oriented thinking uncovers and solves typical organisational problems. Competences, responsibilities and tasks are reorganised by process orientation, and the enterprise is gradually transformed into a process-oriented system. Process management includes objective-oriented controlling of the value chain of an enterprise with regard to quality, time, costs and customer satisfaction. The quality of the process is continuously improved using process-management techniques. The main advantage of process management is consistent customer orientation. Customer orientation means being aware of the customer's needs at any time during the daily routine. The performance is therefore always directed towards current market requirements. This paper presents the basics of business organisation theory and points out its potential use in the organisation of the OR.
Soysal, Ergin; Wang, Jingqi; Jiang, Min; Wu, Yonghui; Pakhomov, Serguei; Liu, Hongfang; Xu, Hua
2017-11-24
Existing general clinical natural language processing (NLP) systems such as MetaMap and Clinical Text Analysis and Knowledge Extraction System have been successfully applied to information extraction from clinical text. However, end users often have to customize existing systems for their individual tasks, which can require substantial NLP skills. Here we present CLAMP (Clinical Language Annotation, Modeling, and Processing), a newly developed clinical NLP toolkit that provides not only state-of-the-art NLP components, but also a user-friendly graphic user interface that can help users quickly build customized NLP pipelines for their individual applications. Our evaluation shows that the CLAMP default pipeline achieved good performance on named entity recognition and concept encoding. We also demonstrate the efficiency of the CLAMP graphic user interface in building customized, high-performance NLP pipelines with 2 use cases, extracting smoking status and lab test values. CLAMP is publicly available for research use, and we believe it is a unique asset for the clinical NLP community.
Data acquisition architecture and online processing system for the HAWC gamma-ray observatory
NASA Astrophysics Data System (ADS)
Abeysekara, A. U.; Alfaro, R.; Alvarez, C.; Álvarez, J. D.; Arceo, R.; Arteaga-Velázquez, J. C.; Ayala Solares, H. A.; Barber, A. S.; Baughman, B. M.; Bautista-Elivar, N.; Becerra Gonzalez, J.; Belmont-Moreno, E.; BenZvi, S. Y.; Berley, D.; Bonilla Rosales, M.; Braun, J.; Caballero-Lopez, R. A.; Caballero-Mora, K. S.; Carramiñana, A.; Castillo, M.; Cotti, U.; Cotzomi, J.; de la Fuente, E.; De León, C.; DeYoung, T.; Diaz-Cruz, J.; Diaz Hernandez, R.; Díaz-Vélez, J. C.; Dingus, B. L.; DuVernois, M. A.; Ellsworth, R. W.; Fiorino, D. W.; Fraija, N.; Galindo, A.; Garfias, F.; González, M. M.; Goodman, J. A.; Grabski, V.; Gussert, M.; Hampel-Arias, Z.; Harding, J. P.; Hui, C. M.; Hüntemeyer, P.; Imran, A.; Iriarte, A.; Karn, P.; Kieda, D.; Kunde, G. J.; Lara, A.; Lauer, R. J.; Lee, W. H.; Lennarz, D.; León Vargas, H.; Linares, E. C.; Linnemann, J. T.; Longo Proper, M.; Luna-García, R.; Malone, K.; Marinelli, A.; Marinelli, S. S.; Martinez, O.; Martínez-Castro, J.; Martínez-Huerta, H.; Matthews, J. A. J.; McEnery, J.; Mendoza Torres, E.; Miranda-Romagnoli, P.; Moreno, E.; Mostafá, M.; Nellen, L.; Newbold, M.; Noriega-Papaqui, R.; Oceguera-Becerra, T.; Patricelli, B.; Pelayo, R.; Pérez-Pérez, E. G.; Pretz, J.; Rivière, C.; Rosa-González, D.; Ruiz-Velasco, E.; Ryan, J.; Salazar, H.; Salesa Greus, F.; Sanchez, F. E.; Sandoval, A.; Schneider, M.; Silich, S.; Sinnis, G.; Smith, A. J.; Sparks Woodle, K.; Springer, R. W.; Taboada, I.; Toale, P. A.; Tollefson, K.; Torres, I.; Ukwatta, T. N.; Villaseñor, L.; Weisgarber, T.; Westerhoff, S.; Wisher, I. G.; Wood, J.; Yapici, T.; Yodh, G. B.; Younk, P. W.; Zaborov, D.; Zepeda, A.; Zhou, H.
2018-04-01
The High Altitude Water Cherenkov observatory (HAWC) is an air shower array devised for TeV gamma-ray astronomy. HAWC is located at an altitude of 4100 m a.s.l. in Sierra Negra, Mexico. HAWC consists of 300 Water Cherenkov Detectors, each instrumented with 4 photomultiplier tubes (PMTs). HAWC re-uses the Front-End Boards from the Milagro experiment to receive the PMT signals. These boards are used in combination with Time to Digital Converters (TDCs) to record the time and the amount of light in each PMT hit (light flash). A set of VME TDC modules (128 channels each) is operated in a continuous (dead time free) mode. The TDCs are read out via the VME bus by Single-Board Computers (SBCs), which in turn are connected to a gigabit Ethernet network. The complete system produces ≈500 MB/s of raw data. A high-throughput data processing system has been designed and built to enable real-time data analysis. The system relies on off-the-shelf hardware components, an open-source software technology for data transfers (ZeroMQ) and a custom software framework for data analysis (AERIE). Multiple trigger and reconstruction algorithms can be combined and run on blocks of data in a parallel fashion, producing a set of output data streams which can be analyzed in real time with minimal latency (<5 s). This paper provides an overview of the hardware set-up and an in-depth description of the software design, covering both the TDC data acquisition system and the real-time data processing system. The performance of these systems is also discussed.
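The block-parallel pattern described above (blocks of data fanned out to workers, each running a set of algorithms that produce independent output streams) can be sketched with the Python standard library. This is a pure-Python analogue under assumed names, not the AERIE/ZeroMQ implementation:

```python
import queue
import threading

def process_blocks(blocks, algorithms, n_workers=4):
    """Distribute data blocks to worker threads; each worker applies every
    configured algorithm to its block, and results are gathered into one
    output stream per algorithm, ordered by block index. Illustrative
    analogue of a block-parallel online processing design."""
    work = queue.Queue()
    out = {name: [] for name in algorithms}
    lock = threading.Lock()

    def worker():
        while True:
            item = work.get()
            if item is None:          # sentinel: no more blocks
                break
            idx, block = item
            for name, algo in algorithms.items():
                res = algo(block)
                with lock:
                    out[name].append((idx, res))

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for th in threads:
        th.start()
    for i, blk in enumerate(blocks):
        work.put((i, blk))
    for _ in threads:                 # one sentinel per worker
        work.put(None)
    for th in threads:
        th.join()
    # restore block order within each output stream
    return {name: [r for _, r in sorted(rs)] for name, rs in out.items()}
```

Usage might look like `process_blocks(data_blocks, {"trigger": trigger_fn, "recon": recon_fn})`; in the real system the workers would be separate processes connected by message-passing sockets rather than threads.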
P43-S Computational Biology Applications Suite for High-Performance Computing (BioHPC.net)
Pillardy, J.
2007-01-01
One of the challenges of high-performance computing (HPC) is user accessibility. At the Cornell University Computational Biology Service Unit, which is also a Microsoft HPC institute, we have developed a computational biology application suite that allows researchers from biological laboratories to submit their jobs to the parallel cluster through an easy-to-use Web interface. Through this system, we are providing users with popular bioinformatics tools including BLAST, HMMER, InterproScan, and MrBayes. The system is flexible and can be easily customized to include other software. It is also scalable; the installation on our servers currently processes approximately 8500 job submissions per year, many of them requiring massively parallel computations. It also has a built-in user management system, which can limit software and/or database access to specified users. TAIR, the major database of the plant model organism Arabidopsis, and SGN, the international tomato genome database, are both using our system for storage and data analysis. The system consists of a Web server running the interface (ASP.NET C#), Microsoft SQL server (ADO.NET), compute cluster running Microsoft Windows, ftp server, and file server. Users can interact with their jobs and data via a Web browser, ftp, or e-mail. The interface is accessible at http://cbsuapps.tc.cornell.edu/.
Large optics for the National Ignition Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baisden, P.
2015-01-12
The National Ignition Facility (NIF) laser is not only the world's largest laser, it is also the largest optical system ever built. With its 192 independent laser beams, the NIF requires a total of 7648 large-aperture (meter-sized) optics. One of the many challenges in designing and building NIF has been to carry out the research and development on optical materials, optics design, and optics manufacturing and metrology technologies needed to achieve NIF's high output energies and precision beam quality. This paper describes the multiyear, multi-supplier development effort that was undertaken to develop the advanced optical materials, coatings, fabrication technologies, and associated process improvements necessary to manufacture the wide range of NIF optics. The optics include neodymium-doped phosphate glass laser amplifiers; fused silica lenses, windows, and phase plates; mirrors and polarizers with multi-layer, high-reflectivity dielectric coatings deposited on BK7 substrates; and potassium di-hydrogen phosphate crystal optics for fast optical switches, frequency conversion, and polarization rotation. Also included is a discussion of optical specifications and custom metrology and quality-assurance tools designed, built, and fielded at supplier sites to verify compliance with the stringent NIF specifications. In addition, a brief description of the ongoing program to improve the operational lifetime (i.e., damage resistance) of optics exposed to high fluence at 351 nm (3ω) is provided.
Development of a C-scan photoacoustic imaging probe for prostate cancer detection
NASA Astrophysics Data System (ADS)
Valluru, Keerthi S.; Chinni, Bhargava K.; Rao, Navalgund A.; Bhatt, Shweta; Dogra, Vikram S.
2011-03-01
Prostate cancer is the second leading cause of cancer death in American men after lung cancer. The current screening procedures include the Digital Rectal Exam (DRE) and the Prostate Specific Antigen (PSA) test, along with Transrectal Ultrasound (TRUS). All suffer from low sensitivity and specificity in detecting prostate cancer in its early stages. There is a pressing need for a new imaging modality. We are developing a prototype transrectal photoacoustic imaging probe to detect prostate malignancies in vivo that promises high sensitivity and specificity. To generate photoacoustic (PA) signals, the probe utilizes a high-energy 1064 nm laser that delivers light pulses onto the prostate at 10 Hz with 10 ns duration through a fiber optic cable. The designed system generates focused C-scan planar images using acoustic lens technology. A 5 MHz custom-fabricated ultrasound sensor array located in the image plane acquires the focused PA signals, eliminating the need for any synthetic aperture focusing. The lens and sensor array design was optimized towards this objective. For fast acquisition times, a custom-built 16-channel simultaneous backend electronics PCB has been developed. It consists of a low-noise variable gain amplifier and a 16-channel ADC. Due to the unavailability of 2D ultrasound arrays, in the current implementation several B-scan (depth-resolved) datasets are first acquired by scanning a 1D array and then processed to reconstruct either 3D volumetric images or several C-scan planar images. Experimental results on excised tissue using an in-vitro prototype of this technology are presented to demonstrate the system's capability in terms of resolution and sensitivity.
Graphene integrated circuits: new prospects towards receiver realisation.
Saeed, Mohamed; Hamed, Ahmed; Wang, Zhenxing; Shaygan, Mehrdad; Neumaier, Daniel; Negra, Renato
2017-12-21
This work demonstrates a design approach which enables the fabrication of fully integrated radio frequency (RF) and millimetre-wave frequency direct-conversion graphene receivers by adapting the frontend architecture to exploit the state-of-the-art performance of the recently reported wafer-scale CVD metal-insulator-graphene (MIG) diodes. As a proof-of-concept, we built a fully integrated microwave receiver in the frequency range 2.1-2.7 GHz employing the strong nonlinearity and the high responsivity of MIG diodes to successfully receive and demodulate complex, digitally modulated communication signals at 2.45 GHz. In addition, the fabricated receiver uses zero-biased MIG diodes and consumes zero dc power. With the flexibility to be fabricated on different substrates, the prototype receiver frontend is fabricated on a low-cost, glass substrate utilising a custom-developed MMIC process backend which enables the high performance of passive components. The measured performance of the prototype makes it suitable for Internet-of-Things (IoT) and Radio Frequency Identification (RFID) systems for medical and communication applications.
Audio-based detection and evaluation of eating behavior using the smartwatch platform.
Kalantarian, Haik; Sarrafzadeh, Majid
2015-10-01
In recent years, smartwatches have emerged as a viable platform for a variety of medical and health-related applications. In addition to the benefits of a stable hardware platform, these devices have a significant advantage over other wrist-worn devices, in that user acceptance of watches is higher than of other custom hardware solutions. In this paper, we describe signal-processing techniques for identification of chews and swallows using a smartwatch device's built-in microphone. Moreover, we conduct a survey to evaluate the potential of the smartwatch as a platform for monitoring nutrition. The focus of this paper is to analyze the overall applicability of a smartwatch-based system for food-intake monitoring. Evaluation results confirm the efficacy of our technique; classification was performed between apple and potato chip bites, water swallows, talking, and ambient noise, with an F-measure of 94.5% based on 250 collected samples.
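For reference, the F-measure quoted above is the harmonic mean of precision and recall; a minimal implementation of the metric (the counts in the test are made-up numbers, not the study's data):

```python
def f_measure(true_pos, false_pos, false_neg):
    """F-measure (F1): harmonic mean of precision and recall,
    computed from raw classification counts."""
    precision = true_pos / (true_pos + false_pos)
    recall = true_pos / (true_pos + false_neg)
    return 2 * precision * recall / (precision + recall)
```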
Dynamic actuation of a novel laser-processed NiTi linear actuator
NASA Astrophysics Data System (ADS)
Pequegnat, A.; Daly, M.; Wang, J.; Zhou, Y.; Khan, M. I.
2012-09-01
A novel laser processing technique, capable of locally modifying the shape memory effect, was applied to enhance the functionality of a NiTi linear actuator. By altering local transformation temperatures, an additional memory was imparted into a monolithic NiTi wire to enable dynamic actuation via controlled resistive heating. Characterizations of the actuator load, displacement and cyclic properties were conducted using a custom-built spring-biased test set-up. Monotonic tensile testing was also implemented to characterize the deformation behaviour of the martensite phase. Observed differences in the deformation behaviour of laser-processed material were found to affect the magnitude of the active strain. Furthermore, residual strain during cyclic actuation testing was found to stabilize after 150 cycles while the recoverable strain remained constant. This laser-processed actuator will allow for the realization of new applications and improved control methods for shape memory alloys.
Upgrading Custom Simulink Library Components for Use in Newer Versions of Matlab
NASA Technical Reports Server (NTRS)
Stewart, Camiren L.
2014-01-01
The Spaceport Command and Control System (SCCS) at Kennedy Space Center (KSC) is a control system for monitoring and launching manned launch vehicles. Simulations of ground support equipment (GSE) and the launch vehicle systems are required throughout the life cycle of SCCS to test software, hardware, and procedures and to train the launch team. The simulations of the GSE at the launch site, in conjunction with off-line processing locations, are developed using Simulink, a piece of Commercial Off-The-Shelf (COTS) software. The simulations are then converted into code and run in a simulation engine called Trick, a Government Off-The-Shelf (GOTS) piece of software developed by NASA. In the world of hardware and software, it is not uncommon for the products in use to be upgraded and patched, or to eventually become obsolete. In the case of SCCS simulation software, MathWorks has released a number of stable versions of Simulink since the deployment of the software on the Development Work Stations in the Linux environment (DWLs). The upgraded versions of Simulink introduce a number of new tools and resources that, if utilized fully and correctly, will save time and resources during the overall development of the GSE simulations and their correlating documentation. Unfortunately, simply importing the already-built simulations into the new Matlab environment will not suffice, as it may produce results that differ from those of the version currently in use. Thus, an upgrade execution plan was developed and executed to fully upgrade the simulation environment to one of the latest versions of Matlab.
Process of Continual Improvement in a School of Nursing.
ERIC Educational Resources Information Center
Norman, Linda D.; Lutenbacher, Melanie
1996-01-01
Vanderbilt University School of Nursing used the Batalden model of systems improvement to change its program. The model analyzes services and products, customers, social community need, and customer knowledge to approach improvements in a systematic way. (JOW)
Pereira, Francilene Jane Rodrigues; dos Santos, Sérgio Ribeiro; da Silva, Cesar Cavalcanti
2011-01-01
This is a qualitative, descriptive, exploratory study conducted in Higher Education Institutions (HEI) offering a Nursing course in Joao Pessoa-PB. The study aimed to understand the managers' views on the need for organizational changes to attend to customers with special needs. Four managers participated in the study. A semi-structured interview with guiding questions was used to collect information, and the data were interpreted with the method of discourse analysis based on Fiorin. It was noticed that the managers are concerned with meeting the demands of inclusive policies, including the adequacy of physical spaces and the pedagogy adopted to meet the students' needs. However, some of them admitted to having little knowledge of how to deal with students with special needs and also mentioned that the institutions do not have efficient logistics in place to meet the current legislation on inclusion. We concluded that the process of structural and pedagogical change is built in a slow and gradual way and requires the involvement of qualified managers who are committed to executing policies for the inclusion of customers with special needs in a civil and legal way.
Configuration of electro-optic fire source detection system
NASA Astrophysics Data System (ADS)
Fabian, Ram Z.; Steiner, Zeev; Hofman, Nir
2007-04-01
The recent fighting activities in various parts of the world have highlighted the need for accurate fire source detection on the one hand and fast "sensor-to-shooter cycle" capabilities on the other. Both needs can be met by the SPOTLITE system, which dramatically enhances the capability to rapidly engage a hostile fire source with a minimum of casualties to friendly forces and to innocent bystanders. The modular system design makes it possible to meet each customer's specific requirements and provides excellent potential for future growth and upgrades. The design and build of a fire source detection system is governed by sets of requirements issued by the operators. These can be translated into the following design criteria: I) Long-range, fast and accurate fire source detection capability. II) Capability to detect and classify different threats. III) Threat investigation capability. IV) Fire source data distribution capability (location, direction, video image, voice). V) Man-portability. In order to meet these design criteria, an optimized concept was presented and exercised for the SPOTLITE system. Three major modular components were defined: I) Electro-Optical Unit - including FLIR camera, CCD camera, laser range finder and marker. II) Electronic Unit - including the system computer and electronics. III) Controller Station Unit - including the HMI of the system. This article discusses the definition and optimization processes of the system's components, and also shows how SPOTLITE's designers successfully managed to introduce excellent solutions for other system parameters.
Rise to the Challenge: A Business Guide to Creating a Workforce Investment System That Makes Sense.
ERIC Educational Resources Information Center
2000
This document explains how employers can participate in creating a new workforce investment system that is market driven, comprehensive, portable, accountable, customer focused, responsive, flexible, and customized. The guide details immediate and future steps employers can take at the state and local levels to influence the process of creating a…
Real time data acquisition of a countrywide commercial microwave link network
NASA Astrophysics Data System (ADS)
Chwala, Christian; Keis, Felix; Kunstmann, Harald
2015-04-01
Research in recent years has shown that data from commercial microwave link networks can provide very valuable precipitation information. Since these networks comprise the backbone of the cell phone network, they provide countrywide coverage. However, acquiring the necessary data from the network operators is still difficult. Data are usually made available for researchers with a large time delay and often on an irregular basis. This, of course, hinders the exploitation of commercial microwave link data in operational applications such as the QPE forecasts running at national meteorological services. To overcome this, we have developed custom software in joint cooperation with our industry partner Ericsson. The software is installed on a dedicated server at Ericsson and is capable of acquiring data from the countrywide microwave link network in Germany. In its current first operational testing phase, data from several hundred microwave links in southern Germany are recorded. All data are instantaneously sent to our server, where they are stored and organized in an emerging database. The time resolution for the Ericsson data is one minute. The custom acquisition software, however, is capable of processing higher sampling rates. Additionally, we acquire and manage 1 Hz data from four microwave links operated by the skiing resort in Garmisch-Partenkirchen. We will present the concept of the data acquisition and show details of the custom-built software. Additionally, we will showcase the accessibility and basic processing of real-time microwave link data via our database web frontend.
Multidimensional custom-made non-linear microscope: from ex-vivo to in-vivo imaging
NASA Astrophysics Data System (ADS)
Cicchi, R.; Sacconi, L.; Jasaitis, A.; O'Connor, R. P.; Massi, D.; Sestini, S.; de Giorgi, V.; Lotti, T.; Pavone, F. S.
2008-09-01
We have built a custom-made multidimensional non-linear microscope equipped with a combination of several non-linear laser imaging techniques involving fluorescence lifetime, multispectral two-photon and second-harmonic generation imaging. The optical system was mounted on a vertical honeycomb breadboard in an upright configuration, using two galvo-mirrors relayed by two spherical mirrors as scanners. A double detection system working in non-descanning mode has allowed both photon counting and a proportional regime. This experimental setup, offering high spatial (micrometric) and temporal (sub-nanosecond) resolution, has been used to image both ex-vivo and in-vivo biological samples, including cells, tissues, and living animals. Multidimensional imaging was used to spectroscopically characterize human skin lesions, such as malignant melanoma and naevi. Moreover, two-color detection of two-photon excited fluorescence was applied to in-vivo imaging of the intact neocortex of living mice, as well as to induce neuronal microlesions by femtosecond laser burning. The presented applications demonstrate the capability of the instrument to be used in a wide range of biological and biomedical studies.
Co-existence of a few and sub micron inhomogeneities in Al-rich AlGaN/AlN quantum wells
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iwata, Yoshiya; Oto, Takao; Banal, Ryan G.
2015-03-21
Inhomogeneity in Al-rich AlGaN/AlN quantum wells is directly observed using our custom-built confocal microscopy photoluminescence (μ-PL) apparatus with a reflective system. The μ-PL system can reach the AlN bandgap in the deep ultra-violet spectral range with a spatial resolution of 1.8 μm. In addition, cathodoluminescence (CL) measurements with a higher spatial resolution of about 100 nm are performed. A comparison of the μ-PL and CL measurements reveals that inhomogeneities, which have different spatial distributions of a few- and sub-micron scales that are superimposed, play key roles in determining the optical properties.
High rate tests of the photon detection system for the LHCb RICH Upgrade
NASA Astrophysics Data System (ADS)
Blago, M. P.; Keizer, F.
2017-12-01
The photon detection system for the LHCb RICH Upgrade consists of an array of multianode photomultiplier tubes (MaPMTs) read out by custom-built modular electronics. The behaviour of the whole chain was studied at CERN using a pulsed laser. Threshold scans were performed in order to study the MaPMT pulse-height spectra at high event rates and different photon intensities. The results show a reduction in photon detection efficiency at 900 V bias voltage, marked by a 20% decrease in the single-photon peak height, when increasing the event rate from 100 kHz to 20 MHz. This reduction was not observed at 1000 V bias voltage.
Mission Operations Planning and Scheduling System (MOPSS)
NASA Technical Reports Server (NTRS)
Wood, Terri; Hempel, Paul
2011-01-01
MOPSS is a generic framework that can be configured on the fly to support a wide range of planning and scheduling applications. It is currently used to support seven missions at Goddard Space Flight Center (GSFC) in roles that include science planning, mission planning, and real-time control. Prior to MOPSS, each spacecraft project built its own planning and scheduling capability to plan satellite activities and communications and to create the commands to be uplinked to the spacecraft. This approach required creating a data repository for storing planning and scheduling information, building user interfaces to display data, generating needed scheduling algorithms, and implementing customized external interfaces. Complex scheduling problems that involved reacting to multiple variable situations were analyzed manually. Operators then used the results to add commands to the schedule. Each architecture was unique to specific satellite requirements. MOPSS is an expert system that automates mission operations and frees the flight operations team to concentrate on critical activities. It is easily reconfigured by the flight operations team as the mission evolves. The heart of the system is a custom object-oriented data layer mapped onto an Oracle relational database. The combination of these two technologies allows a user or system engineer to capture any type of scheduling or planning data in the system's generic data storage via a GUI.
Look@NanoSIMS--a tool for the analysis of nanoSIMS data in environmental microbiology.
Polerecky, Lubos; Adam, Birgit; Milucka, Jana; Musat, Niculina; Vagner, Tomas; Kuypers, Marcel M M
2012-04-01
We describe an open-source freeware programme for high throughput analysis of nanoSIMS (nanometre-scale secondary ion mass spectrometry) data. The programme implements basic data processing and analytical functions, including display and drift-corrected accumulation of scanned planes, interactive and semi-automated definition of regions of interest (ROIs), and export of the ROIs' elemental and isotopic composition in graphical and text-based formats. Additionally, the programme offers new functions that were custom-designed to address the needs of environmental microbiologists. Specifically, it allows manual and automated classification of ROIs based on the information that is derived either from the nanoSIMS dataset itself (e.g. from labelling achieved by halogen in situ hybridization) or is provided externally (e.g. as a fluorescence in situ hybridization image). Moreover, by implementing post-processing routines coupled to built-in statistical tools, the programme allows rapid synthesis and comparative analysis of results from many different datasets. After validation of the programme, we illustrate how these new processing and analytical functions increase flexibility, efficiency and depth of the nanoSIMS data analysis. Through its custom-made and open-source design, the programme provides an efficient, reliable and easily expandable tool that can help a growing community of environmental microbiologists and researchers from other disciplines process and analyse their nanoSIMS data.
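The ROI export described above ultimately aggregates per-pixel ion counts over a region; a minimal sketch of one such statistic, the isotope ratio of a single ROI (illustrative only: the function name and data layout are assumptions, not the programme's actual code):

```python
def roi_isotope_ratio(minor, major, roi):
    """Sum the counts of a minor- and a major-isotope image over a
    region of interest (a collection of (row, col) pixel coordinates)
    and return their ratio; NaN if the ROI has no major-isotope counts."""
    num = sum(minor[y][x] for (y, x) in roi)
    den = sum(major[y][x] for (y, x) in roi)
    return num / den if den else float("nan")
```

In practice the images would be drift-corrected accumulations of many scanned planes, and the ratio would be reported with counting-statistics uncertainties.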
NASA Technical Reports Server (NTRS)
Vargas-Aburto, Carlos; Aron, Paul R.; Liff, Dale R.
1990-01-01
The design, construction, and initial use of an ion microprobe to carry out secondary ion mass spectrometry (SIMS) of solid samples is reported. The system is composed of a differentially pumped custom-made UHV (Ultra High Vacuum) chamber, a quadrupole mass spectrometer and a telefocus A-DIDA ion gun with the capability of producing beams of Cesium, as well as inert and reactive gases. The computer control and acquisition of the data were designed and implemented using a personal computer with plug-in boards, and external circuitry built as required to suit the system needs. The software is being developed by using a FORTH-like language. Initial tests aimed at characterizing the system, as well as preliminary surface and depth-profiling studies are presently underway.
ATM Coastal Topography - Louisiana, 2001: UTM Zone 16 (Part 2 of 2)
Yates, Xan; Nayegandhi, Amar; Brock, John C.; Sallenger, Asbury H.; Klipp, Emily S.; Wright, C. Wayne
2009-01-01
These remotely sensed, geographically referenced elevation measurements of lidar-derived first-surface (FS) topography were produced collaboratively by the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL, and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. This project provides highly detailed and accurate datasets of a portion of the Louisiana coastline beach face within UTM Zone 16, from Grand Isle to the Chandeleur Islands, acquired September 7 and 9, 2001. The datasets are made available for use as a management tool to research scientists and natural-resource managers. An innovative scanning lidar instrument originally developed by NASA, and known as the Airborne Topographic Mapper (ATM), was used during data acquisition. The ATM system is a scanning lidar system that measures high-resolution topography of the land surface and incorporates a green-wavelength laser operating at pulse rates of 2 to 10 kilohertz. Measurements from the laser-ranging device are coupled with data acquired from inertial navigation system (INS) attitude sensors and differentially corrected global positioning system (GPS) receivers to measure topography of the surface at accuracies of +/-15 centimeters. The nominal ATM platform is a Twin Otter or P-3 Orion aircraft, but the instrument may be deployed on a range of light aircraft. Elevation measurements were collected over the survey area using the ATM system, and the resulting data were then processed using the Airborne Lidar Processing System (ALPS), a custom-built processing system developed in a NASA-USGS collaboration. ALPS supports the exploration and processing of lidar data in an interactive or batch mode. Modules for presurvey flight-line definition, flight-path plotting, lidar raster and waveform investigation, and digital camera image playback have been developed. 
Processing algorithms have been developed to extract the range to the first and last significant return within each waveform. ALPS is used routinely to create maps that represent submerged or first-surface topography.
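The first/last-return extraction described above can be illustrated with a toy threshold scan over waveform samples. This is a hypothetical sketch, not ALPS code; the waveform values, threshold, and function name are invented for illustration:

```python
def significant_returns(waveform, threshold):
    """Indices of the first and last samples whose amplitude exceeds the
    noise threshold, or None when no significant return is present."""
    hits = [i for i, amp in enumerate(waveform) if amp > threshold]
    if not hits:
        return None
    return hits[0], hits[-1]

# A toy waveform with two distinct returns (e.g. canopy top and ground)
wf = [1, 2, 1, 9, 14, 6, 2, 1, 8, 11, 3, 1]
first, last = significant_returns(wf, threshold=5)  # (3, 9)
```

In a real processing system the sample indices would then be converted to ranges using the laser timing, and combined with the INS/GPS solution to produce georeferenced elevations.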
ATM Coastal Topography-Louisiana, 2001: UTM Zone 15 (Part 1 of 2)
Yates, Xan; Nayegandhi, Amar; Brock, John C.; Sallenger, A.H.; Klipp, Emily S.; Wright, C. Wayne
2010-01-01
These remotely sensed, geographically referenced elevation measurements of lidar-derived first-surface (FS) topography were produced collaboratively by the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL, and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. This project provides highly detailed and accurate datasets of a portion of the Louisiana coastline beach face within UTM Zone 15, from Isles Dernieres to Grand Isle, acquired September 7 and 10, 2001. The datasets are made available for use as a management tool to research scientists and natural-resource managers. An innovative scanning lidar instrument originally developed by NASA, and known as the Airborne Topographic Mapper (ATM), was used during data acquisition. The ATM system is a scanning lidar system that measures high-resolution topography of the land surface and incorporates a green-wavelength laser operating at pulse rates of 2 to 10 kilohertz. Measurements from the laser-ranging device are coupled with data acquired from inertial navigation system (INS) attitude sensors and differentially corrected global positioning system (GPS) receivers to measure topography of the surface at accuracies of +/-15 centimeters. The nominal ATM platform is a Twin Otter or P-3 Orion aircraft, but the instrument may be deployed on a range of light aircraft. Elevation measurements were collected over the survey area using the ATM system, and the resulting data were then processed using the Airborne Lidar Processing System (ALPS), a custom-built processing system developed in a NASA-USGS collaboration. ALPS supports the exploration and processing of lidar data in an interactive or batch mode. Modules for presurvey flight-line definition, flight-path plotting, lidar raster and waveform investigation, and digital camera image playback have been developed. 
Processing algorithms have been developed to extract the range to the first and last significant return within each waveform. ALPS is used routinely to create maps that represent submerged or first-surface topography.
ATM Coastal Topography-Texas, 2001: UTM Zone 14
Klipp, Emily S.; Nayegandhi, Amar; Brock, John C.; Sallenger, A.H.; Bonisteel, Jamie M.; Yates, Xan; Wright, C. Wayne
2009-01-01
These remotely sensed, geographically referenced elevation measurements of lidar-derived first-surface (FS) topography were produced collaboratively by the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL, and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. This project provides highly detailed and accurate datasets of a portion of the Texas coastline within UTM zone 14, acquired October 12-13, 2001. The datasets are made available for use as a management tool to research scientists and natural-resource managers. An innovative scanning lidar instrument originally developed by NASA, and known as the Airborne Topographic Mapper (ATM), was used during data acquisition. The ATM system is a scanning lidar system that measures high-resolution topography of the land surface and incorporates a green-wavelength laser operating at pulse rates of 2 to 10 kilohertz. Measurements from the laser-ranging device are coupled with data acquired from inertial navigation system (INS) attitude sensors and differentially corrected global positioning system (GPS) receivers to measure topography of the surface at accuracies of +/-15 centimeters. The nominal ATM platform is a Twin Otter or P-3 Orion aircraft, but the instrument may be deployed on a range of light aircraft. Elevation measurements were collected over the survey area using the ATM system, and the resulting data were then processed using the Airborne Lidar Processing System (ALPS), a custom-built processing system developed in a NASA-USGS collaboration. ALPS supports the exploration and processing of lidar data in an interactive or batch mode. Modules for presurvey flight-line definition, flight-path plotting, lidar raster and waveform investigation, and digital camera image playback have been developed. Processing algorithms have been developed to extract the range to the first and last significant return within each waveform. 
ALPS is used routinely to create maps that represent submerged or first-surface topography.
ATM Coastal Topography-Texas, 2001: UTM Zone 15
Klipp, Emily S.; Nayegandhi, Amar; Brock, John C.; Sallenger, A.H.; Bonisteel, Jamie M.; Yates, Xan; Wright, C. Wayne
2009-01-01
These remotely sensed, geographically referenced elevation measurements of lidar-derived first-surface (FS) topography were produced collaboratively by the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL, and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. This project provides highly detailed and accurate datasets of a portion of the Texas coastline within UTM zone 15, from Matagorda Peninsula to Galveston Island, acquired October 12-13, 2001. The datasets are made available for use as a management tool to research scientists and natural-resource managers. An innovative scanning lidar instrument originally developed by NASA, and known as the Airborne Topographic Mapper (ATM), was used during data acquisition. The ATM system is a scanning lidar system that measures high-resolution topography of the land surface and incorporates a green-wavelength laser operating at pulse rates of 2 to 10 kilohertz. Measurements from the laser-ranging device are coupled with data acquired from inertial navigation system (INS) attitude sensors and differentially corrected global positioning system (GPS) receivers to measure topography of the surface at accuracies of +/-15 centimeters. The nominal ATM platform is a Twin Otter or P-3 Orion aircraft, but the instrument may be deployed on a range of light aircraft. Elevation measurements were collected over the survey area using the ATM system, and the resulting data were then processed using the Airborne Lidar Processing System (ALPS), a custom-built processing system developed in a NASA-USGS collaboration. ALPS supports the exploration and processing of lidar data in an interactive or batch mode. Modules for presurvey flight-line definition, flight-path plotting, lidar raster and waveform investigation, and digital camera image playback have been developed. 
Processing algorithms have been developed to extract the range to the first and last significant return within each waveform. ALPS is used routinely to create maps that represent submerged or first-surface topography.
ATM Coastal Topography-Florida 2001: Western Panhandle
Yates, Xan; Nayegandhi, Amar; Brock, John C.; Sallenger, A.H.; Bonisteel, Jamie M.; Klipp, Emily S.; Wright, C. Wayne
2009-01-01
These remotely sensed, geographically referenced elevation measurements of Lidar-derived first surface (FS) topography were produced collaboratively by the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL, and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. This project provides highly detailed and accurate datasets of the western Florida panhandle coastline, acquired October 2-4 and 7-10, 2001. The datasets are made available for use as a management tool to research scientists and natural resource managers. An innovative scanning Lidar instrument originally developed by NASA, and known as the Airborne Topographic Mapper (ATM), was used during data acquisition. The ATM system is a scanning Lidar system that measures high-resolution topography of the land surface and incorporates a green-wavelength laser operating at pulse rates of 2 to 10 kilohertz. Measurements from the laser-ranging device are coupled with data acquired from inertial navigation system (INS) attitude sensors and differentially corrected global positioning system (GPS) receivers to measure topography of the surface at accuracies of +/-15 centimeters. The nominal ATM platform is a Twin Otter or P-3 Orion aircraft, but the instrument may be deployed on a range of light aircraft. Elevation measurements were collected over the survey area using the ATM system, and the resulting data were then processed using the Airborne Lidar Processing System (ALPS), a custom-built processing system developed in a NASA-USGS collaboration. ALPS supports the exploration and processing of Lidar data in an interactive or batch mode. Modules for presurvey flight line definition, flight path plotting, Lidar raster and waveform investigation, and digital camera image playback have been developed. Processing algorithms have been developed to extract the range to the first and last significant return within each waveform. 
ALPS is routinely used to create maps that represent submerged or first surface topography.
ATM Coastal Topography-Mississippi, 2001
Nayegandhi, Amar; Yates, Xan; Brock, John C.; Sallenger, A.H.; Klipp, Emily S.; Wright, C. Wayne
2009-01-01
These remotely sensed, geographically referenced elevation measurements of lidar-derived first-surface (FS) topography were produced collaboratively by the U.S. Geological Survey (USGS), Florida Integrated Science Center (FISC), St. Petersburg, FL, and the National Aeronautics and Space Administration (NASA), Wallops Flight Facility, VA. This project provides highly detailed and accurate datasets of the Mississippi coastline, from Lakeshore to Petit Bois Island, acquired September 9-10, 2001. The datasets are made available for use as a management tool to research scientists and natural-resource managers. An innovative scanning lidar instrument originally developed by NASA, and known as the Airborne Topographic Mapper (ATM), was used during data acquisition. The ATM system is a scanning lidar system that measures high-resolution topography of the land surface and incorporates a green-wavelength laser operating at pulse rates of 2 to 10 kilohertz. Measurements from the laser-ranging device are coupled with data acquired from inertial navigation system (INS) attitude sensors and differentially corrected global positioning system (GPS) receivers to measure topography of the surface at accuracies of +/-15 centimeters. The nominal ATM platform is a Twin Otter or P-3 Orion aircraft, but the instrument may be deployed on a range of light aircraft. Elevation measurements were collected over the survey area using the ATM system, and the resulting data were then processed using the Airborne Lidar Processing System (ALPS), a custom-built processing system developed in a NASA-USGS collaboration. ALPS supports the exploration and processing of lidar data in an interactive or batch mode. Modules for presurvey flight-line definition, flight-path plotting, lidar raster and waveform investigation, and digital camera image playback have been developed. Processing algorithms have been developed to extract the range to the first and last significant return within each waveform. 
ALPS is used routinely to create maps that represent submerged or first-surface topography.
New mission requirements methodologies for services provided by the Office of Space Communications
NASA Technical Reports Server (NTRS)
Holmes, Dwight P.; Hall, J. R.; Macoughtry, William; Spearing, Robert
1993-01-01
The Office of Space Communications, NASA Headquarters, has recently revised its methodology for receiving, accepting, and responding to customer requests for use of that office's tracking and communications capabilities. This revision responds to a process that had become overburdened by the size of the currently active and proposed mission set, by requirements reviews that focus on single missions rather than on mission sets, and by negotiations too often not completed early enough to effect needed additions to capacity or capability prior to launch. The requirements-coverage methodology described here is more responsive to project/program needs, provides integrated input into the NASA budget process early enough to effect change, and defines the mechanisms and tools in place to ensure a value-added process that will benefit both NASA and its customers. Key features of the methodology include a mechanism for early identification of, and systems trades with, new customers, and the delegation of review and approval of requirements documents to NASA centers in lieu of Headquarters, thus empowering the system design teams to establish and negotiate detailed requirements with the user. A Mission Requirements Request (MRR) is introduced to facilitate early customer interaction. The expected result is that the time to achieve an approved set of implementation requirements that meets the customer's needs can be greatly reduced. Finally, increased discipline in requirements management, through the use of baselining procedures, provides a tighter coupling between customer requirements and the budget. A twice-yearly projection of customer requirements accommodation, designated the Capacity Projection Plan (CPP), provides customer feedback allowing the entire mission set to be serviced.
A Mechanism of Modeling and Verification for SaaS Customization Based on TLA
NASA Astrophysics Data System (ADS)
Luan, Shuai; Shi, Yuliang; Wang, Haiyang
With the gradual maturation of SOA and the rapid development of the Internet, SaaS has become a popular software service mode. Customization of SaaS is usually subject to internal and external dependency relationships. This paper first introduces a method for modeling the customization process based on the Temporal Logic of Actions (TLA), and then proposes a verification algorithm to ensure that each customization step neither causes unpredictable effects on the system nor violates the rules defined by the SaaS provider.
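As an illustration only (a Python stand-in, not the paper's TLA-based formalism; all step names and rules below are hypothetical), a customization verifier must at minimum confirm that every step is permitted by the provider and that its dependencies are already satisfied:

```python
def verify_customization(steps, depends_on, allowed):
    """Check that each customization step is permitted by the SaaS provider
    and that every prerequisite step has already been executed."""
    done = set()
    for step in steps:
        if step not in allowed:
            return False, f"step '{step}' violates a provider rule"
        missing = set(depends_on.get(step, ())) - done
        if missing:
            return False, f"step '{step}' runs before {sorted(missing)}"
        done.add(step)
    return True, "ok"

# Hypothetical tenant customization: fields and billing depend on the tenant
deps = {"add_field": ["create_tenant"], "enable_billing": ["create_tenant"]}
ok, msg = verify_customization(
    ["create_tenant", "add_field", "enable_billing"],
    deps,
    allowed={"create_tenant", "add_field", "enable_billing"},
)
```

A TLA-based method would additionally verify temporal properties of the whole action sequence, which this dependency check does not attempt.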
The Impact of e-Customer Relationship Marketing in Hotel Industry
NASA Astrophysics Data System (ADS)
Samanta, Irene
The present research investigates the extent to which Greek hotels have developed electronic customer relationship marketing (e-CRM). The study examines whether the practices that frequently appear in the relationship marketing process are present within online operations, or whether hotels' Internet presence mainly consists of the basic functions of supplying information and taking reservations. It also investigates the effects of an e-CRM system on customer loyalty and satisfaction, as well as the impact of relationship marketing practices on customer retention and acquisition. The hotels studied have understood the importance of using electronic channels instead of traditional ones to implement their marketing strategies; e-CRM systems have helped them manage their reservations more effectively and serve their customers as quickly and effectively as possible. However, they did not appear to apply many of the relationship marketing strategies that emphasize customer retention and continual satisfaction, largely because of difficulties in staff training.
Service Quality Management Systems: An Annotated Bibliography
1992-05-01
customers, Fortune, 122, 38-48. Key words: consumer preferences, customer expectations. Abstract: Rice presents a profile of the 1990 U.S. consumers...
Cocaine self-administration in social dyads using custom-built operant conditioning chambers.
Lacy, Ryan T; Strickland, Justin C; Smith, Mark A
2014-10-30
Traditionally, the analysis of intravenous drug self-administration is limited to conditions in which subjects are tested in isolation. This limits the translational appeal of these studies because drug use in humans often occurs in the presence of others. We used custom-built operant conditioning chambers that allowed social dyads visual, olfactory, auditory, and limited tactile contact while concurrently self-administering cocaine. Male rats were trained to respond according to a fixed interval schedule of reinforcement (with a limited hold) in order to determine if patterns of cocaine (0.75 mg/kg/infusion) self-administration became more similar over time in social pairs. Cocaine self-administration was tested across five days according to a 10-min fixed interval schedule (with a 5-min limited hold). Quarter-life values (time at which 25% of responses were emitted per interval) were analyzed using intraclass correlations. The total number of reinforcers obtained did not vary across the five days of testing; however, quarter-life values became progressively more similar between individuals within the social dyads. Standard operant conditioning chambers are unable to assess responding in multiple animals due to their small size, the need to prevent subjects from responding on the lever of their partner, and the need to prevent infusion lines from entangling. By using custom-built social operant conditioning chambers, we assessed the effects of social contact on cocaine self-administration. Social operant conditioning chambers can be used as a preclinical method to examine social influences on drug self-administration under conditions that approximate human substance use. Copyright © 2014 Elsevier B.V. All rights reserved.
Processing, Cataloguing and Distribution of Uas Images in Near Real Time
NASA Astrophysics Data System (ADS)
Runkel, I.
2013-08-01
Why do UAS generate such hype? UAS make data capture flexible, fast, and easy. For many applications this is more important than a perfect photogrammetric aerial image block. To ensure that the advantage of fast data capture carries through to the end of the processing chain, all intermediate steps, such as data processing and data dissemination to the customer, need to be flexible and fast as well. GEOSYSTEMS has established the whole processing workflow as a server/client solution, which is the focus of this presentation. Depending on the image acquisition system, the image data can be downlinked during the flight to the data processing computer, or it is stored on a mobile device and connected to the data processing computer after the flight campaign. The image project manager reads the data from the device and georeferences the images according to the position data. The metadata is converted into an ISO-conformant format, and subsequently all georeferenced images are catalogued in the raster data management system ERDAS APOLLO. APOLLO provides the data, that is, the images, as OGC-conformant services to the customer. Within seconds the UAV images are ready to use for GIS applications, image processing, or direct interpretation via web applications, wherever you want. The whole processing chain is built in a generic manner and can be adapted to a multitude of applications. The UAV imagery can be processed and catalogued as single orthoimages or as an image mosaic. Furthermore, image data from various cameras can be fused. By using WPS (web processing services), image enhancement and image analysis workflows, such as change detection layers, can be calculated and provided to the image analysts. The WPS processing runs directly on the raster data management server; the image analyst has no data and no software on his local computer. This workflow has proven to be fast, stable, and accurate.
It is designed to support time-critical applications for security demands: the images can be checked and interpreted in near real time. For sensitive areas, it offers the possibility to inform remote decision makers or interpretation experts in order to provide them with situational awareness, wherever they are. For monitoring and inspection tasks it speeds up the process of data capture and data interpretation. The fully automated workflow of data pre-processing, data georeferencing, data cataloguing, and data dissemination in near real time was developed based on the Intergraph products ERDAS IMAGINE, ERDAS APOLLO, and GEOSYSTEMS METAmorph!IT. It is offered as an adaptable solution by GEOSYSTEMS GmbH.
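The georeference-then-catalogue step of such a chain can be sketched as pairing each downlinked image with its GPS position and emitting a minimal metadata record. This is a hypothetical illustration; the production chain uses ERDAS APOLLO and OGC services, and every name below is invented:

```python
from dataclasses import dataclass

@dataclass
class UASImage:
    filename: str
    lat: float
    lon: float
    alt_m: float

def georeference(filenames, positions):
    """Pair each downlinked image with its recorded (lat, lon, alt) position."""
    return [UASImage(f, *pos) for f, pos in zip(filenames, positions)]

def catalogue(images):
    """Emit a minimal ISO-style metadata record per image for the catalogue."""
    return [{"id": img.filename,
             "bbox": (img.lon, img.lat, img.lon, img.lat),  # point footprint
             "alt_m": img.alt_m} for img in images]

records = catalogue(georeference(["img_001.jpg"], [(48.10, 11.60, 120.0)]))
```

A real catalogue entry would carry full ISO 19115 metadata and an image footprint polygon rather than a point bounding box.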
Shedlock, James; Frisque, Michelle; Hunt, Steve; Walton, Linda; Handler, Jonathan; Gillam, Michael
2010-01-01
Question: How can the user's access to health information, especially full-text articles, be improved? The solution is building and evaluating the Health SmartLibrary (HSL). Setting: The setting is the Galter Health Sciences Library, Feinberg School of Medicine, Northwestern University. Method: The HSL was built on web-based personalization and customization tools: My E-Resources, Stay Current, Quick Search, and File Cabinet. Personalization and customization data were tracked to show user activity with these value-added, online services. Main Results: Registration data indicated that users were receptive to personalized resource selection and that the automated application of specialty-based, personalized HSLs was more frequently adopted than manual customization by users. Those who did customize most often chose My E-Resources and Stay Current over Quick Search and File Cabinet. Most of those who customized did so only once. Conclusion: Users did not always take advantage of the services designed to aid their library research experiences. When personalization was available at registration, users readily accepted it. Customization tools were used less frequently; however, more research is needed to determine why this was the case. PMID:20428276
Perspective: Rapid synthesis of complex oxides by combinatorial molecular beam epitaxy
A. T. Bollinger; Wu, J.; Bozovic, I.
2016-03-15
The molecular beam epitaxy (MBE) technique is well known for producing atomically smooth thin films as well as impeccable interfaces in multilayers of many different materials. In particular, MBE is well suited to the growth of complex oxides, materials that hold promise for many applications. Rapid synthesis and high-throughput characterization techniques are needed to tap into that potential most efficiently. We discuss our approach to doing so, leaving behind the traditional one-growth-one-compound scheme and instead implementing combinatorial oxide MBE in a custom-built system.
A discussion of refractive medical behavior from an experiential marketing viewpoint.
Ho, Yung-Ching; Li, Ye-Chuen; Su, Tzu-Hsin
2006-01-01
Since the launch of the National Health Insurance System, the funding available for hospital financing has been reduced. Meanwhile, more and more customers attach importance to their experience of the medical process. Our study adopts Schmitt's "strategic modules of experiential marketing" as its theoretical basis and uses in-depth interviews to examine the influence of medical behavior on customers' experiences. We interviewed 32 patients who had undergone refractive surgery. The results yield 10 propositions developed from the 5 experiential modules - SENSE, FEEL, THINK, ACT, and RELATE - of customers' medical experiences. This study clarifies customers' experiences during the refractive surgery process in order to give medical institutions a direction for experiential marketing and for considering how to use experience providers to reinforce customers' experiences.
INFIBRA: machine vision inspection of acrylic fiber production
NASA Astrophysics Data System (ADS)
Davies, Roger; Correia, Bento A. B.; Contreiras, Jose; Carvalho, Fernando D.
1998-10-01
This paper describes the implementation of INFIBRA, a machine vision system for the inspection of acrylic fiber production lines. The system was developed by INETI under a contract from Fisipe, Fibras Sinteticas de Portugal, S.A. At Fisipe there are ten production lines in continuous operation, each approximately 40 m in length. A team of operators used to perform periodic manual visual inspection of each line in conditions of high ambient temperature and humidity. It is not surprising that failures in the manual inspection process occurred with some frequency, with consequences that ranged from reduced fiber quality to production stoppages. The INFIBRA system architecture is a specialization of a generic, modular machine vision architecture based on a network of Personal Computers (PCs), each equipped with a low cost frame grabber. Each production line has a dedicated PC that performs automatic inspection, using specially designed metrology algorithms, via four video cameras located at key positions on the line. The cameras are mounted inside custom-built, hermetically sealed water-cooled housings to protect them from the unfriendly environment. The ten PCs, one for each production line, communicate with a central PC via a standard Ethernet connection. The operator controls all aspects of the inspection process, from configuration through to handling alarms, via a simple graphical interface on the central PC. At any time the operator can also view on the central PC's screen the live image from any one of the 40 cameras employed by the system.
Manufacturing Bms/Iso System Review
NASA Technical Reports Server (NTRS)
Gomez, Yazmin
2004-01-01
The Quality Management System (QMS) is one that recognizes the need to continuously change and improve an organization's products and services as determined by system feedback and corresponding management decisions. The purpose of a Quality Management System is to minimize quality variability of an organization's products and services. The optimal Quality Management System balances the need for an organization to maintain flexibility in the products and services it provides with the need to exercise the appropriate level of discipline and control over the processes used to provide them. The goal of a Quality Management System is to ensure the quality of the products and services while consistently (through minimizing quality variability) meeting or exceeding customer expectations. The GRC Business Management System (BMS) is the foundation of the Center's ISO 9001:2000 registered quality system. ISO 9001 is a quality system model developed by the International Organization for Standardization. BMS supports and promotes the Glenn Research Center Quality Policy and aims to ensure customer satisfaction while also meeting quality standards. My assignment this summer is to examine the manufacturing processes used to develop research hardware, which in most cases is one-of-a-kind hardware made with non-conventional equipment and materials. During this observation process I will determine, based on my observations of the hardware development processes, the best way to meet customer requirements while also achieving GRC quality standards. The purpose of my task is to review the manufacturing processes, identify opportunities to optimize the efficiency of the processes, and establish a plan for implementation and continuous improvement.
On decentralized design: Rationale, dynamics, and effects on decision-making
NASA Astrophysics Data System (ADS)
Chanron, Vincent
The focus of this dissertation is the design of complex systems, including engineering systems such as cars, airplanes, and satellites. Companies who design these systems are under constant pressure to design better products that meet customer expectations, and competition forces them to develop those products faster. One of the industry's responses to these conflicting challenges has been the decentralization of design responsibilities. The current lack of understanding of the dynamics of decentralized design processes is the main motivation for this research, and places value on the descriptive base. The research identifies the main reasons for, and the true benefits to, companies that decentralize the design of their products. It also demonstrates the limitations of this approach by listing the relevant issues and problems created by the decentralization of decisions. Based on these observations, a game-theoretic approach to decentralized design is proposed to model the decisions made during the design process. The dynamics are modeled using mathematical formulations inspired by control theory. Building upon this formalism, the issue of convergence in decentralized design is analyzed: the equilibrium points of the design space are identified, and convergent and divergent patterns are recognized. This rigorous investigation of the design process provides motivation and support for proposing new approaches to decentralized design problems. Two methods are developed, which aim at improving the design process in two ways: decreasing product development time and increasing the optimality of the final design. The frameworks of these methods are inspired by eigenstructure decomposition and set-based design, respectively. The value of the research detailed within this dissertation lies in the proposed methods, which are built upon the sound mathematical formalism developed.
The contribution of this work is twofold: a rigorous investigation of the design process, and practical support for decision-making in decentralized environments.
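The convergence question can be illustrated with a deliberately simplified stand-in (not the dissertation's actual formulation): two designers with linear best responses x1 = a + b*x2 and x2 = c + d*x1, iterated in turn, converge to the fixed point when |b*d| < 1 and diverge otherwise:

```python
def best_response_iteration(a, b, c, d, iters=50):
    """Alternate the two designers' best responses; converges to the
    equilibrium x1* = (a + b*c) / (1 - b*d) when |b*d| < 1."""
    x1 = x2 = 0.0
    for _ in range(iters):
        x1 = a + b * x2  # designer 1 reacts to designer 2's last decision
        x2 = c + d * x1  # designer 2 reacts to designer 1's new decision
    return x1, x2

x1, x2 = best_response_iteration(a=1.0, b=0.5, c=2.0, d=0.2)  # |b*d| = 0.1 < 1
```

With |b*d| = 0.1 the error shrinks by a factor of 10 each round trip, so 50 iterations land essentially on the equilibrium; setting |b*d| > 1 makes the same loop diverge, which is the pathology the dissertation's convergence analysis formalizes.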
A Generic Ground Framework for Image Expertise Centres and Small-Sized Production Centres
NASA Astrophysics Data System (ADS)
Sellé, A.
2009-05-01
Initiated by the Pléiades Earth observation program, CNES (the French space agency) has developed a generic collaborative framework for its image quality centre that is highly customisable for any upcoming expertise centre. This collaborative framework has been designed to be used by a group of experts or scientists who want to share data and processing chains and manage interfaces with external entities. Its flexible and scalable architecture complies with the core requirements: defining a user data model with no impact on the software (generic data access), integrating user processing chains with a GUI builder and built-in APIs, and offering a scalable architecture to fit any performance requirement and accompany growing projects. CNES has granted licences to two software companies that will be able to redistribute this framework to any customer.
Practical LCA for short shelf life products
NASA Astrophysics Data System (ADS)
Laurin, Lise; Goedkoop, Mark; Norris, Greg
2005-11-01
Manufacturers in many of today's industries face product shelf lives measured in months. Traditionally, this has made it very difficult to perform a life cycle assessment (LCA) of a product, since the product would be obsolete by the time the LCA was completed. A new concept in LCA, which allows specialists in fields other than LCA to rapidly create a model and generate "what-if" scenarios, lets even manufacturers of short-shelf-life products take advantage of the benefits of LCA. These industry-specific "wizards" are built around a manufacturing process and can be rapidly updated or customized for a particular manufacturer or process type. Results can be used internally for decision-making and can also enable manufacturers to submit information for environmentally preferable purchasing, eco-labels, etc.
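At its core, such a wizard reduces to a parameterized inventory whose impact score is a factor-weighted sum, so a "what-if" scenario is just a re-evaluation with modified amounts. A minimal sketch with invented flows and illustrative (not real) characterization factors:

```python
def impact(inventory, factors):
    """Life-cycle impact score: sum of each flow's amount times its
    characterization factor (e.g. kg CO2-eq per unit of flow)."""
    return sum(amount * factors[flow] for flow, amount in inventory.items())

# Illustrative numbers only, not real characterization factors
factors = {"electricity_kwh": 0.4, "steel_kg": 1.8}
base = {"electricity_kwh": 2.0, "steel_kg": 0.5}
baseline = impact(base, factors)
# What-if: halve the electricity use and re-evaluate instantly
what_if = impact({**base, "electricity_kwh": 1.0}, factors)
```

Because the model is just data plus a weighted sum, updating it for a new product variant means editing the inventory, which is what makes the wizard approach fast enough for short-shelf-life products.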
WE-D-BRE-01: A Sr-90 Irradiation Device for the Study of Cutaneous Radiation Injury
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dorand, JE; Bourland, JD; Burnett, LR
2014-06-15
Purpose: To determine the dosimetric characteristics of a custom-built Sr-90 beta irradiator designed for the study of Cutaneous Radiation Injury (CRI) in a porcine animal model. In the event of a radiological accident or terrorist event, Sr-90, a fission by-product, will likely be released. CRI is a principal concern due to the low energy and superficial penetration in tissue of beta particles from Sr-90. Seven 100 mCi plaque Sr-90 radiation sources within a custom-built irradiation device create a 40 mm diameter region of radiation-induced skin injury as part of a larger project to study the efficacy of a topical keratin-based product in CRI healing. Methods: A custom-built mobile irradiation device was designed and implemented for in vivo irradiations. Gafchromic™ EBT3 radiochromic film and a PTW Markus chamber type 23343 were utilized for dosimetric characterization of the beta fluence at the surface produced by this device. Films were used to assess the 2-dimensional dose distribution and percent depth dose characteristics of the radiation field. Ion chamber measurements provided dose-rate data within the field. Results: The radiation field produced by the irradiation device is homogeneous, with high uniformity (∼5%) and symmetry (∼3%) and a steep dose fall-off with depth from the surface. Dose rates were determined to be 3.8 Gy/min and 3.3 Gy/min for film and ion chamber measurements, respectively. A dose rate of 3.4 Gy/min was used to calculate irradiation times for in vivo irradiations. Conclusion: The custom-built irradiation device enables the use of seven Sr-90 beta sources in an array to deliver a 40 mm diameter area of homogeneous skin dose at a dose rate that is useful for research purposes and clinically relevant for the induction of CRI. Doses of 36 and 42 Gy successfully produce Grade III CRI and are used in the study of the efficacy of KeraStat™.
This project has been funded in whole or in part with Federal funds from the Biomedical Advanced Research and Development Authority, Office of the Assistant Secretary for Preparedness and Response, Office of the Secretary, Department of Health and Human Services, under Contract No. HHSO100201200007C.
Integrating Stakeholders and Users into the Geography Discipline's Research Process
Hermans, Caroline M.; Taketa, Richard
2006-01-01
Future research priorities of Geography emphasize the discipline's leadership role in the U.S. Geological Survey (USGS) in multidisciplinary and integrated research on human and environmental systems, how these systems are interrelated, and how they respond to change. Geography's research priorities also emphasize providing science that is usable to society and creating decision support products applicable to given customer problems. To achieve these goals, we must understand the relationship between our research and our customers, and how to integrate the customer into the research process. This report details the elements of the research process that help achieve the degree of stakeholder involvement necessary to ensure a successful end product. It offers suggestions that can help researchers better understand stakeholders and customers and involve them in the research process more effectively, while preserving the integrity of the science. Its aim is to help researchers understand the problems and challenges faced by our customers and communicate the ways in which Geography can help address their problems. Adopting these guidelines can improve the efficiency of the research process and lead to higher quality output. We will be able to conduct better research because we will have an improved understanding of the research problem and the stakeholders involved. This report covers a broad range of topics, from identifying and communicating with stakeholders and users, to the use of language, to how to effectively present scientific information to the user. It does not offer a 'one size fits all' method; instead, only specific sections may be suitable for a given project and its customers, depending on project scope and needs. This report is based on the objectives of Geography's strategic plan, the U.S. Geological Survey's strategic plan, and the Department of the Interior's strategic plan.
Section 2 of these guidelines describes the purpose of the research process in Geography and the need for better user involvement in the process. Section 3 explains how to conduct a stakeholder analysis. Section 4 explains how to conduct a user-needs assessment.
Operationalizing Space Weather Products - Process and Issues
NASA Astrophysics Data System (ADS)
Scro, K. D.; Quigley, S.
2006-12-01
Developing and transitioning operational products for any customer base is a complicated process. This is the case for operational space weather products and services for the USAF. This presentation will provide information on the current state of affairs regarding the process required to take an idea from the research field to the real-time application of 24-hour space weather operations support. General principles and specific issues are discussed and will include: customer requirements, organizations in-play, funding, product types, acquisition of engineering and validation data, security classification, version control, and various important changes that occur during the process. The author's viewpoint is as an individual developing space environmental system-impact products for the US Air Force: 1) as a member of its primary research organization (Air Force Research Laboratory), 2) working with its primary space environment technology transition organization (Technology Application Division of the Space and Missile Systems Center, SMC/WXT), and 3) delivering to the primary sponsor/customer of such system-impact products (Air Force Space Command). The experience and focus is obviously on specific military operationalization process and issues, but most of the paradigm may apply to other (commercial) enterprises as well.
2009-05-27
CAPE CANAVERAL, Fla. – An aerial view of the site in the Industrial Area of NASA's Kennedy Space Center in Florida where a solar power system will be built. The solar power systems are being constructed by NASA and Florida Power & Light Company as part of a public-private partnership that promotes a clean-energy future. A groundbreaking ceremony took place on May 27 at the Kennedy Space Center Visitor Complex. FPL, Florida's largest electric utility, will build and maintain two solar photovoltaic power generation systems at Kennedy. One will produce an estimated 10 megawatts of emissions-free power for FPL customers, which is enough energy to serve roughly 1,100 homes. The second, which will be built on the pictured location, is a one-megawatt solar power facility that will provide renewable energy directly to Kennedy. The FPL facilities at NASA will help provide Florida residents and America's space program with new sources of clean energy that will cut reliance on fossil fuels and improve the environment by reducing greenhouse gas emissions. The one megawatt facility also will help NASA meet its goal for use of power generated from renewable energy. Photo credit: NASA/Kim Shiflett
2009-05-27
CAPE CANAVERAL, Fla. – An aerial view of the site on S.R. 3 on NASA's Kennedy Space Center in Florida where a solar power system will be built. The solar power systems are being constructed by NASA and Florida Power & Light Company as part of a public-private partnership that promotes a clean-energy future. A groundbreaking ceremony took place on May 27 at the Kennedy Space Center Visitor Complex. FPL, Florida's largest electric utility, will build and maintain two solar photovoltaic power generation systems at Kennedy. One, which will be built on the pictured location, will produce an estimated 10 megawatts of emissions-free power for FPL customers, which is enough energy to serve roughly 1,100 homes. The second is a one-megawatt solar power facility that will provide renewable energy directly to Kennedy. The FPL facilities at NASA will help provide Florida residents and America's space program with new sources of clean energy that will cut reliance on fossil fuels and improve the environment by reducing greenhouse gas emissions. The one megawatt facility also will help NASA meet its goal for use of power generated from renewable energy. Photo credit: NASA/Kim Shiflett
NASA Technical Reports Server (NTRS)
Fishkind, Stanley; Harris, Richard N.; Pfeiffer, William A.
1996-01-01
The methodologies of the NASA requirements processing system, originally designed to enhance NASA's customer interface and response time, are reviewed. The response of NASA to the problems associated with the system is presented, and it is shown what was done to facilitate the process and to improve customer relations. The requirements generation system (RGS), a computer-supported client-server system, adopted by NASA is presented. The RGS system is configurable on a per-mission basis and can be structured to allow levels of requirements. The details provided concerning the RGS include the recommended configuration, information on becoming an RGS user and network connectivity worksheets for computer users.
EDUCATING MANAGERS ABOUT QUALITY THROUGH CUSTOMER-SUPPLIER UNDERSTANDING
The successful implementation of a Quality System depends largely on the commitment to Quality by managers and their participation in the quality management process. Today, an accepted definition of quality is largely based on the concept of customer and supplier partnerships in a...
A Time-Domain CMOS Oscillator-Based Thermostat with Digital Set-Point Programming
Chen, Chun-Chi; Lin, Shih-Hao
2013-01-01
This paper presents a time-domain CMOS oscillator-based thermostat with digital set-point programming [without a digital-to-analog converter (DAC) or external resistor] to achieve on-chip thermal management of modern VLSI systems. A time-domain delay-line-based thermostat with multiplexers (MUXs) was used to substantially reduce the power consumption and chip size, and can benefit from the performance enhancement due to the scaling down of fabrication processes. For further cost reduction and accuracy enhancement, this paper proposes a thermostat using two oscillators that are suitable for time-domain curvature compensation instead of longer linear delay lines. The final time comparison was achieved using a time comparator with a built-in custom hysteresis to generate the corresponding temperature alarm and control. The chip size of the circuit was reduced to 0.12 mm2 in a 0.35-μm TSMC CMOS process. The thermostat operates from 0 to 90 °C, and achieved a fine resolution better than 0.05 °C and an improved inaccuracy of ± 0.6 °C after two-point calibration for eight packaged chips. The power consumption was 30 μW at a sample rate of 10 samples/s. PMID:23385403
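The alarm-generation step above relies on a comparator with built-in hysteresis, which asserts and deasserts at different thresholds so the output does not chatter when the temperature hovers near the set point. A minimal software sketch of that behavior (the set point, hysteresis width, and temperature values are illustrative, not the chip's actual parameters):

```python
def make_hysteresis_comparator(set_point, hysteresis):
    """Return a stateful comparator: the alarm asserts when the input
    rises above set_point + hysteresis and deasserts only when it falls
    below set_point - hysteresis, preventing chatter near the threshold."""
    state = {"alarm": False}

    def compare(value):
        if not state["alarm"] and value > set_point + hysteresis:
            state["alarm"] = True
        elif state["alarm"] and value < set_point - hysteresis:
            state["alarm"] = False
        return state["alarm"]

    return compare

# Hypothetical 60 degC set point with a +/-0.5 degC hysteresis band
alarm = make_hysteresis_comparator(60.0, 0.5)
readings = [59.0, 60.2, 60.6, 60.4, 59.8, 59.4]
flags = [alarm(t) for t in readings]
# Note how 60.4 and 59.8 keep the alarm asserted: only crossing the
# lower threshold (59.5) releases it.
```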
Application programs written by using customizing tools of a computer-aided design system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, X.; Huang, R.; Juricic, D.
1995-12-31
Customizing tools of Computer-Aided Design Systems have been developed to such a degree as to become equivalent to powerful higher-level programming languages that are especially suitable for graphics applications. Two examples of application programs written by using AutoCAD's customizing tools are given in some detail to illustrate their power. One uses the AutoLISP list-processing language to develop an application program that produces four views of a given solid model. The other uses the AutoCAD Development System, based on program modules written in C, to produce an application program that renders a freehand sketch from a given CAD drawing.
Design and implementation of fishery rescue data mart system
NASA Astrophysics Data System (ADS)
Pan, Jun; Huang, Haiguang; Liu, Yousong
A novel data mart based system for the fishery rescue field was designed and implemented. The system runs an ETL process to deal with original data from various databases and data warehouses, and then reorganizes the data into the fishery rescue data mart. Next, online analytical processing (OLAP) is carried out and statistical reports are generated automatically. In particular, quick configuration schemes are designed to configure query dimensions and OLAP data sets. The configuration file is transformed into statistical interfaces automatically through a wizard-style process. The system provides various forms of reporting files, including Crystal Reports, Flash graphical reports, and two-dimensional data grids. In addition, a wizard-style interface was designed to guide users in customizing inquiry processes, making it possible for nontechnical staff to access customized reports. Characterized by quick configuration, safety, and flexibility, the system has been successfully applied in a city fishery rescue department.
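The configuration-driven OLAP step described above can be sketched as a declarative query config applied to flat records; the field names, dimensions, and sample data below are hypothetical stand-ins for the system's actual configuration format:

```python
from collections import defaultdict

def run_olap_query(records, config):
    """Aggregate flat records according to a declarative query config:
    'dimensions' lists the grouping keys and 'measure' names the field
    to sum, mirroring the idea of turning a configuration file into a
    statistical report without hand-written query code."""
    totals = defaultdict(float)
    for rec in records:
        key = tuple(rec[d] for d in config["dimensions"])
        totals[key] += rec[config["measure"]]
    return dict(totals)

# Hypothetical fishery-rescue incident records
records = [
    {"region": "North", "year": 2010, "rescues": 12},
    {"region": "North", "year": 2011, "rescues": 8},
    {"region": "South", "year": 2010, "rescues": 5},
]
report = run_olap_query(records, {"dimensions": ["region"], "measure": "rescues"})
```

Changing only the config (e.g. grouping by `["region", "year"]`) yields a different report from the same code, which is the point of the quick-configuration scheme.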
Research on an autonomous vision-guided helicopter
NASA Technical Reports Server (NTRS)
Amidi, Omead; Mesaki, Yuji; Kanade, Takeo
1994-01-01
Integration of computer vision with on-board sensors to autonomously fly helicopters was researched. The key components developed were custom designed vision processing hardware and an indoor testbed. The custom designed hardware provided flexible integration of on-board sensors with real-time image processing resulting in a significant improvement in vision-based state estimation. The indoor testbed provided convenient calibrated experimentation in constructing real autonomous systems.
Integrating Engineering Data Systems for NASA Spaceflight Projects
NASA Technical Reports Server (NTRS)
Carvalho, Robert E.; Tollinger, Irene; Bell, David G.; Berrios, Daniel C.
2012-01-01
NASA has a large range of custom-built and commercial data systems to support spaceflight programs. Some of the systems are re-used by many programs and projects over time. Management and systems engineering processes require integration of data across many of these systems, a difficult problem given the widely diverse nature of system interfaces and data models. This paper describes an ongoing project to use a central data model with a web services architecture to support the integration and access of linked data across engineering functions for multiple NASA programs. The work involves the implementation of a web service-based middleware system called Data Aggregator to bring together data from a variety of systems to support space exploration. Data Aggregator includes a central data model registry for storing and managing links between the data in disparate systems. Initially developed for NASA's Constellation Program needs, Data Aggregator is currently being repurposed to support the International Space Station Program and new NASA projects with processes that involve significant aggregating and linking of data. This change in user needs led to the development of a more streamlined data model registry for Data Aggregator, to simplify adding new project application data, and to standardization of the Data Aggregator query syntax to facilitate cross-application querying by client applications. This paper documents the transition from a set of stand-alone engineering systems, from which data are manually retrieved and integrated, to a web of engineering data systems from which the latest data are automatically retrieved and more quickly and accurately integrated. This paper includes the lessons learned through these efforts, including the design and development of a service-oriented architecture and the evolution of the data model registry approaches as the effort continues to evolve and adapt to support multiple NASA programs and priorities.
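The central idea of a registry that stores links between records held in disparate systems can be sketched as follows; the class, method names, and record keys are illustrative, since the paper does not publish Data Aggregator's actual API or query syntax:

```python
class DataRegistry:
    """Toy sketch of a central data-model registry: each record is
    registered under a (system, record_id) pair, and links connect
    records held in different engineering systems."""

    def __init__(self):
        self.records = {}  # (system, record_id) -> data
        self.links = []    # pairs of (system, record_id) keys

    def register(self, system, record_id, data):
        self.records[(system, record_id)] = data

    def link(self, key_a, key_b):
        self.links.append((key_a, key_b))

    def linked_to(self, system, record_id):
        """Return the data of every record linked to the given one,
        regardless of which source system holds it (the cross-application
        query the middleware standardizes)."""
        key = (system, record_id)
        out = []
        for a, b in self.links:
            if a == key:
                out.append(self.records[b])
            elif b == key:
                out.append(self.records[a])
        return out

# Hypothetical requirement linked to a verification test in another system
reg = DataRegistry()
reg.register("requirements", "REQ-1", {"title": "Vehicle mass limit"})
reg.register("test_db", "TST-9", {"title": "Mass verification test"})
reg.link(("requirements", "REQ-1"), ("test_db", "TST-9"))
related = reg.linked_to("requirements", "REQ-1")
```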
Angiuoli, Samuel V; Matalka, Malcolm; Gussman, Aaron; Galens, Kevin; Vangala, Mahesh; Riley, David R; Arze, Cesar; White, James R; White, Owen; Fricke, W Florian
2011-08-30
Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines that distribute pre-packaged, pre-configured software. We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing.
The Gulf of Mexico Coastal Ocean Observing System: A Gulf Science Portal
NASA Astrophysics Data System (ADS)
Howard, M.; Gayanilo, F.; Kobara, S.; Jochens, A. E.
2013-12-01
The Gulf of Mexico Coastal Ocean Observing System's (GCOOS) regional science portal (gcoos.org) was designed to aggregate data and model output from distributed providers and to offer these, and derived products, through a single access point in standardized ways to a diverse set of users. The portal evolved under the NOAA-led U.S. Integrated Ocean Observing System (IOOS) program, where automated, largely unattended machine-to-machine interoperability has always been a guiding tenet for system design. The web portal has a business unit where membership lists, news items, and reference materials are kept, a data portal where near real-time and historical data are held and served, and a products portal where data are fused into products tailored for specific or general stakeholder groups. The staff includes a system architect who built and maintains the data portal, a GIS expert who built and maintains the current product portal, the executive director who marshals resources to keep news items fresh, and a data manager who manages most of this. The business portal is built using WordPress, which was selected because it appeared to be the easiest content management system for non-web programmers to add content to, maintain, and enhance. The data portal is custom-built and uses a database, PHP, and web services based on the Open Geospatial Consortium's standards-based Sensor Observation Service (SOS) with Observations and Measurements (O&M) encodings. We employ a standards-based vocabulary, which we helped develop and which is registered at the Marine Metadata Interoperability Ontology Registry and Repository (http://mmisw.org). The registry is currently maintained by one of the authors. Products appearing in the products portal are primarily constructed using ESRI software by a Ph.D.-level geographer. Some products were built with other software, generally by graduate students over the years. We have been sensitive to the private sector when deciding which products to produce.
While science users want numbers, users of all types mainly want maps. We have tried to develop flexible capabilities to present products for a variety of output devices, from desktop screens to smartphones. Software maintenance is a continuing issue, and new initiatives from NOAA add to the workload but improve the system. We will discuss how our data management system has evolved against the backdrop of rapidly changing technologies and diverse community requirements.
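A request against an OGC Sensor Observation Service endpoint of the kind the data portal serves can be sketched as a key-value-pair GetObservation URL; the endpoint, offering, and observed-property names below are placeholders, not GCOOS's actual service addresses:

```python
from urllib.parse import urlencode

def sos_get_observation_url(endpoint, offering, observed_property,
                            version="1.0.0"):
    """Build a key-value-pair GetObservation request URL for an OGC
    Sensor Observation Service, asking for Observations & Measurements
    (O&M) encoded results."""
    params = {
        "service": "SOS",
        "version": version,
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
        "responseFormat": 'text/xml;subtype="om/1.0.0"',
    }
    return endpoint + "?" + urlencode(params)

# Placeholder endpoint and station; a real client would parse the
# returned O&M XML for observation values.
url = sos_get_observation_url(
    "https://example.org/sos", "station-42", "sea_water_temperature")
```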
Caffrey, J A; Higley, K A; Farsoni, A T; Smith, S; Menn, S
2012-09-01
A custom radiation monitoring system was developed by Oregon State University at the request of the Woods Hole Oceanographic Institute to measure radioactive cesium contaminants in the ocean waters near Fukushima Dai-ichi Nuclear Power Plant. The system was to be used on board the R/V Ka'imikai-O-Kanaloa during a 15 d research cruise to provide real-time approximations of radionuclide concentration and alert researchers to the possible occurrence of highly elevated radionuclide concentrations. A NaI(Tl) scintillation detector was coupled to a custom-built compact digital spectroscopy system and suspended within a sealed tank of continuously flowing seawater. A series of counts were acquired within an energy region corresponding to the main photopeak of (137)Cs. The system was calibrated using known quantities of radioactive (134)Cs and (137)Cs in a ratio equating to that present at the reactors' ocean outlet. The response between net count rate and concentration of (137)Cs was then used to generate temporal and geographic plots of (137)Cs concentration throughout the research cruise in Japanese coastal waters. The concentration of (137)Cs was low but detectable, reaching a peak of 3.8 ± 0.2 Bq/L. Copyright © 2011 Elsevier Ltd. All rights reserved.
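The linear response between net count rate in the (137)Cs photopeak window and activity concentration can be sketched as follows; the sensitivity and count-rate numbers are illustrative, not the instrument's actual calibration:

```python
def cs137_concentration(gross_cps, background_cps, cps_per_bq_per_l):
    """Convert a net count rate in the 137Cs photopeak window to an
    activity concentration (Bq/L) using a linear calibration factor
    obtained from known-activity standards. All numbers used below
    are illustrative, not the cruise's actual calibration."""
    net_cps = gross_cps - background_cps
    return net_cps / cps_per_bq_per_l

# Illustrative: 0.5 counts/s per Bq/L sensitivity
conc = cs137_concentration(gross_cps=2.4, background_cps=0.5,
                           cps_per_bq_per_l=0.5)
```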
Custom electronic subsystems for the laboratory telerobotic manipulator
NASA Technical Reports Server (NTRS)
Glassell, R. L.; Butler, P. L.; Rowe, J. C.; Zimmermann, S. D.
1990-01-01
The National Aeronautics and Space Administration (NASA) Space Station Program presents new opportunities for the application of telerobotic and robotic systems. The Laboratory Telerobotic Manipulator (LTM) is a highly advanced 7 degrees-of-freedom (DOF) telerobotic/robotic manipulator. It was developed and built for the Automation Technology Branch at NASA's Langley Research Center (LaRC) for research and to demonstrate ground-based telerobotic manipulator hardware and software systems for future NASA applications in the hazardous environment of space. The LTM manipulator uses an embedded wiring design with all electronics, motor power, and control and communication cables passing through the pitch-yaw differential joints. This design requires the number of cables passing through the pitch/yaw joint to be kept to a minimum. To eliminate the cables needed to carry each pitch-yaw joint's sensor data to the VME control computers, a custom-embedded electronics package for each manipulator joint was developed. The electronics package collects and sends the joint's sensor data to the VME control computers over a fiber optic cable. The electronics package consists of five individual subsystems: the VME Link Processor; the Joint Processor and the Joint Processor power supply in the joint module; the fiber optics communications system; and the electronics and motor power cabling.
NASA Astrophysics Data System (ADS)
Li, S. G.; Shi, L.
2014-10-01
The recommendation system for virtual items in massively multiplayer online role-playing games (MMORPGs) has aroused the interest of researchers. Of the many approaches to construct a recommender system, collaborative filtering (CF) has been the most successful one. However, traditional CFs just lure customers into the purchasing action and overlook customers' satisfaction; moreover, these techniques always suffer from low accuracy under cold-start conditions. Therefore, a novel collaborative filtering (NCF) method is proposed to identify like-minded customers according to the preference similarity coefficient (PSC), which captures the correlation between the similarity of customers' characteristics and the similarity of customers' satisfaction level for the product. Furthermore, the analytic hierarchy process (AHP) is used to determine the relative importance of each characteristic of the customer, and the improved ant colony optimisation (IACO) is adopted to generate the expression of the PSC. The IACO creates solutions using the Markov random walk model, which can accelerate the convergence of the algorithm and prevent premature convergence. For a target customer whose neighbours can be found, the NCF can predict his satisfaction level towards the suggested products and recommend the acceptable ones. Under cold-start conditions, the NCF will generate the recommendation list by excluding items that other customers prefer.
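The neighbour-based satisfaction prediction can be sketched as a similarity-weighted average; the exponential similarity form, the AHP weights, and the customer traits below are illustrative stand-ins for the PSC expression that the IACO actually learns:

```python
import math

def preference_similarity(a, b, weights):
    """Weighted similarity between two customers' characteristic vectors.
    The weighted-distance-with-exponential form is an illustrative
    stand-in for the paper's learned PSC expression."""
    dist = sum(w * abs(a[k] - b[k]) for k, w in weights.items())
    return math.exp(-dist)

def predict_satisfaction(target, neighbours, weights):
    """Predict the target customer's satisfaction with an item as the
    similarity-weighted mean of the neighbours' reported satisfaction."""
    sims = [preference_similarity(target, n["traits"], weights)
            for n in neighbours]
    total = sum(sims)
    return sum(s * n["satisfaction"] for s, n in zip(sims, neighbours)) / total

weights = {"level": 0.6, "playtime": 0.4}   # hypothetical AHP-derived weights
neighbours = [
    {"traits": {"level": 0.8, "playtime": 0.7}, "satisfaction": 4.5},
    {"traits": {"level": 0.2, "playtime": 0.3}, "satisfaction": 2.0},
]
score = predict_satisfaction({"level": 0.9, "playtime": 0.8},
                             neighbours, weights)
# The prediction is pulled toward the more similar neighbour's rating.
```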
Engineering performance metrics
NASA Astrophysics Data System (ADS)
Delozier, R.; Snyder, N.
1993-03-01
Implementation of a Total Quality Management (TQM) approach to engineering work required the development of a system of metrics which would serve as a meaningful management tool for evaluating effectiveness in accomplishing project objectives and in achieving improved customer satisfaction. A team effort was chartered with the goal of developing a system of engineering performance metrics which would measure customer satisfaction, quality, cost effectiveness, and timeliness. The approach to developing this system involved the normal systems design phases: conceptual design, detailed design, implementation, and integration. The lessons learned from this effort are explored in this paper and may provide a starting point for other large engineering organizations seeking to institute a performance measurement system. To facilitate this effort, a team consisting of customers and Engineering staff members was chartered, ensuring that the needs and views of the customers were considered in the development of performance measurements. The development of a system of metrics is no different from the development of any type of system. It includes the steps of defining performance measurement requirements, measurement process conceptual design, performance measurement and reporting system detailed design, and system implementation and integration.
Fang, Jing-Jing; Liu, Jia-Kuang; Wu, Tzu-Chieh; Lee, Jing-Wei; Kuo, Tai-Hong
2013-05-01
Computer-aided design has gained increasing popularity in clinical practice, and the advent of rapid prototyping technology has further enhanced the quality and predictability of surgical outcomes. It provides target guides for complex bony reconstruction during surgery, so surgeons can efficiently and precisely target fracture restorations. Based on three-dimensional models generated from a computed tomographic scan, precise preoperative planning and simulation on a computer are possible. Combining the interdisciplinary knowledge of surgeons and engineers, this study proposes a novel surgical guidance method that incorporates a built-in occlusal wafer that serves as the positioning reference. Two patients with complex facial deformity suffering from severe facial asymmetry were recruited. In vitro facial reconstruction was first rehearsed on physical models, where a customized surgical guide incorporating a built-in occlusal stent as the positioning reference was designed to implement the surgery plan. This study presents the authors' preliminary experience in a complex facial reconstruction procedure. It suggests that in resource-limited settings, where intraoperative computed tomographic scans or navigation systems are not available, our approach could be an effective, expedient, straightforward aid to enhance surgical outcomes in complex facial repair.
2005 5th Annual CMMI Technology Conference and User Group. Volume 4: Thursday
2005-11-17
Identification and Involvement in the CMMI, Mr. James R. Armstrong, Systems and Software Consortium Ensuring the Right Process is Deployed Right...Customer-Driven Organization Chart Marketing Management: Analysis, Planning, Implementation and Control Philip Kotler © Prentice Hall Being Customer
Rethinking Human-Centered Computing: Finding the Customer and Negotiated Interactions at the Airport
NASA Technical Reports Server (NTRS)
Wales, Roxana; O'Neill, John; Mirmalek, Zara
2003-01-01
The breakdown in the air transportation system over the past several years raises an interesting question for researchers: How can we help improve the reliability of airline operations? In offering some answers to this question, we make a statement about Human-Centered Computing (HCC). First we offer the definition that HCC is a multi-disciplinary research and design methodology focused on supporting humans as they use technology by including cognitive and social systems, computational tools and the physical environment in the analysis of organizational systems. We suggest that a key element in understanding organizational systems is that there are external cognitive and social systems (customers) as well as internal cognitive and social systems (employees) and that they interact dynamically to impact the organization and its work. The design of human-centered intelligent systems must take this outside-inside dynamic into account. In the past, the design of intelligent systems has focused on supporting the work and improvisation requirements of employees but has often assumed that customer requirements are implicitly satisfied by employee requirements.
Taking a customer-centric perspective provides a different lens for understanding this outside-inside dynamic, the work of the organization, and the requirements of both customers and employees. In this article we will: 1) Demonstrate how the use of ethnographic methods revealed the important outside-inside dynamic in an airline, specifically the consequential relationship between external customer requirements and perspectives and internal organizational processes and perspectives as they came together in a changing environment; 2) Describe how taking a customer-centric perspective identifies places where the impact of the outside-inside dynamic is most critical and requires technology that can be adaptive; 3) Define and discuss the place of negotiated interactions in airline operations, identifying how these interactions between customers and airline employees provided new insights into design problems in the airline system; 4) Show how taking a customer-centric perspective influences the HCC design of an airline system, and make recommendations for new architectures and intelligent devices that will enable airline systems to adapt flexibly to delay situations, supporting both customers and airline employees.
Simultaneous PET/MR imaging with a radio frequency-penetrable PET insert
Grant, Alexander M.; Lee, Brian J.; Chang, Chen-Ming; Levin, Craig S.
2017-01-01
Purpose: A brain-sized radio-frequency (RF)-penetrable PET insert has been designed for simultaneous operation with MRI systems. This system takes advantage of electro-optical coupling and battery power to electrically float the PET insert relative to the MRI ground, permitting RF signals to be transmitted through small gaps between the modules that form the PET ring. This design facilitates the use of the built-in body coil for RF transmission, and thus could be inserted into any existing MR site wishing to achieve simultaneous PET/MR imaging. The PET detectors employ non-magnetic silicon photomultipliers in conjunction with a compressed sensing signal multiplexing scheme, and optical fibers to transmit analog PET detector signals out of the MRI room for decoding, processing, and image reconstruction. Methods: The PET insert was first constructed and tested in a laboratory benchtop setting, where tomographic images of a custom resolution phantom were successfully acquired. The PET insert was then placed within a 3T body MRI system, and tomographic resolution/contrast phantom images were acquired both with only the B0 field present, and under continuous pulsing from different MR imaging sequences. Results: The resulting PET images have comparable contrast-to-noise ratios (CNR) under all MR pulsing conditions: the maximum percent CNR relative difference for each rod type among all four PET images acquired in the MRI system has a mean of 14.0±7.7%. MR images were successfully acquired through the RF-penetrable PET shielding using only the built-in MR body coil, suggesting that simultaneous imaging is possible without significant mutual interference. Conclusions: These results show promise for this technology as an alternative to costly integrated PET/MR scanners; a PET insert that is compatible with any existing clinical MRI system could greatly increase the availability, accessibility, and dissemination of PET/MR. PMID:28102949
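The contrast-to-noise ratio used to compare the phantom images is commonly computed as the rod/background mean difference divided by the background standard deviation; a minimal sketch with made-up pixel values (the paper does not state its exact CNR formula):

```python
import statistics

def contrast_to_noise(roi_values, background_values):
    """Contrast-to-noise ratio of a phantom rod: absolute difference of
    mean intensities divided by the background standard deviation. This
    is a common definition, assumed here rather than taken from the paper."""
    contrast = abs(statistics.mean(roi_values)
                   - statistics.mean(background_values))
    return contrast / statistics.stdev(background_values)

# Made-up rod and background pixel intensities
cnr = contrast_to_noise([10.0, 11.0, 10.5, 10.5], [2.0, 2.5, 1.5, 2.0])
```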
NASA Astrophysics Data System (ADS)
Kelb, Christian; Rother, Raimund; Schuler, Anne-Katrin; Hinkelmann, Moritz; Rahlves, Maik; Prucker, Oswald; Müller, Claas; Rühe, Jürgen; Reithmeier, Eduard; Roth, Bernhard
2016-03-01
We demonstrate the manufacturing of embedded multimode optical waveguides through linking of polymethylmethacrylate (PMMA) foils and cyclic olefin polymer (COP) filaments based on a lamination process. Since the two polymeric materials cannot be fused together through interdiffusion of polymer chains, we utilize a reactive lamination agent based on PMMA copolymers containing photoreactive 2-acryloyloxyanthraquinone units, which allows the creation of monolithic PMMA-COP substrates through C-H insertion reactions across the interface between the two materials. We elucidate the lamination process and evaluate the chemical link between filament and foils by carrying out extraction tests with a custom-built tensile testing machine. We also show attenuation measurements of the manufactured waveguides for different manufacturing parameters. The lamination process is in particular suited for large-scale and low-cost fabrication of board-level devices with optical waveguides or other micro-optical structures, e.g., optofluidic devices.
NASA Technical Reports Server (NTRS)
Laubenthal, N. A.; Bertsch, D.; Lal, N.; Etienne, A.; Mcdonald, L.; Mattox, J.; Sreekumar, P.; Nolan, P.; Fierro, J.
1992-01-01
The Energetic Gamma Ray Telescope Experiment (EGRET) on the Compton Gamma Ray Observatory has been in orbit for more than a year and is being used to map the full sky for gamma rays in a wide energy range from 30 to 20,000 MeV. Already these measurements have resulted in a wide range of exciting new information on quasars, pulsars, galactic sources, and diffuse gamma ray emission. The central part of the analysis is done with sky maps that typically cover an 80 x 80 degree section of the sky for an exposure time of several days. Specific software developed for this program generates the counts, exposure, and intensity maps. The analysis is done on a network of UNIX-based workstations and takes full advantage of a custom-built user interface called X-dialog. The maps that are generated are stored in the FITS format for a collection of energies. These, along with similar diffuse emission background maps generated from a model calculation, serve as input to a maximum likelihood program that produces maps of likelihood with optional contours that are used to evaluate regions for sources. The likelihood program also evaluates the background-corrected intensity at each location for each energy interval, from which spectra can be generated. Being in a standard FITS format permits all of the maps to be easily accessed by the full complement of tools available in several commercial astronomical analysis systems. In the EGRET case, IDL is used to produce graphics plots in two and three dimensions and to quickly implement any special evaluation that might be desired. Other custom-built software, such as the spectral and pulsar analyses, takes advantage of the XView toolkit for display and PostScript output for the color hard copy. This poster paper outlines the data flow and provides examples of the user interfaces and output products.
It stresses the advantages that are derived from the integration of the specific instrument-unique software and powerful commercial tools for graphics and statistical evaluation. This approach has several proven advantages including flexibility, a minimum of development effort, ease of use, and portability.
Tank Monitoring and Document control System (TMACS) As Built Software Design Document
DOE Office of Scientific and Technical Information (OSTI.GOV)
GLASSCOCK, J.A.
This document describes the software design for the Tank Monitor and Control System (TMACS). This document captures the existing as-built design of TMACS as of November 1999. It will be used as a reference document to the system maintainers who will be maintaining and modifying the TMACS functions as necessary. The heart of the TMACS system is the ''point-processing'' functionality where a sample value is received from the field sensors and the value is analyzed, logged, or alarmed as required. This Software Design Document focuses on the point-processing functions.
FPGA Based High Speed Data Acquisition System for Electrical Impedance Tomography
Khan, S; Borsic, A; Manwaring, Preston; Hartov, Alexander; Halter, Ryan
2014-01-01
Electrical Impedance Tomography (EIT) systems are used to image tissue bio-impedance. EIT provides a number of features making it attractive for use as a medical imaging device including the ability to image fast physiological processes (>60 Hz), to meet a range of clinical imaging needs through varying electrode geometries and configurations, to impart only non-ionizing radiation to a patient, and to map the significant electrical property contrasts present between numerous benign and pathological tissues. To leverage these potential advantages for medical imaging, we developed a modular 32 channel data acquisition (DAQ) system using National Instruments’ PXI chassis, along with FPGA, ADC, Signal Generator and Timing and Synchronization modules. To achieve high frame rates, signal demodulation and spectral characteristics of higher order harmonics were computed using dedicated FFT-hardware built into the FPGA module. By offloading the computing onto FPGA, we were able to achieve a reduction in throughput required between the FPGA and PC by a factor of 32:1. A custom designed analog front end (AFE) was used to interface electrodes with our system. Our system is wideband, and capable of acquiring data for input signal frequencies ranging from 100 Hz to 12 MHz. The modular design of both the hardware and software will allow this system to be flexibly configured for the particular clinical application. PMID:24729790
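The FFT-based demodulation offloaded to the FPGA can be mimicked on a host CPU to see why it compresses the data stream: only the FFT bins at the excitation frequency and its harmonics need to be kept. The sample rate, excitation frequency, and record length below are hypothetical, chosen so the excitation falls exactly on an FFT bin:

```python
import numpy as np

fs = 1_000_000   # sample rate in Hz (hypothetical)
f0 = 62_500      # excitation frequency: an integer number of cycles per record
n = 1024         # record length in samples

t = np.arange(n) / fs
# Synthetic electrode voltage: fundamental plus a second harmonic
signal = 1.5 * np.sin(2 * np.pi * f0 * t + 0.3) + 0.2 * np.sin(2 * np.pi * 2 * f0 * t)

spectrum = np.fft.rfft(signal) / (n / 2)   # scale so |bin| equals sinusoid amplitude
k = round(f0 * n / fs)                      # FFT bin index of the fundamental
kept = spectrum[[k, 2 * k]]                 # retain only fundamental and 2nd harmonic

amplitude = abs(kept[0])                    # amplitude of the fundamental
phase = np.angle(kept[0])                   # phase relative to a cosine reference
```

Here 1024 real samples reduce to two complex values per channel, the same kind of reduction the system achieves by computing the FFT inside the FPGA before transfer to the PC.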
Real-Time Interactive Facilities Associated With A 3-D Medical Workstation
NASA Astrophysics Data System (ADS)
Goldwasser, S. M.; Reynolds, R. A.; Talton, D.; Walsh, E.
1986-06-01
Biomedical workstations of the future will incorporate three-dimensional interactive capabilities which provide real-time response to most common operator requests. Such systems will find application in many areas of medicine including clinical diagnosis, surgical and radiation therapy planning, biomedical research based on functional imaging, and medical education. This paper considers the requirements of these future systems in terms of image quality, performance, and the interactive environment, and examines the relationship of workstation capabilities to specific medical applications. We describe a prototype physician's workstation that we have designed and built to meet many of these requirements (using conventional graphics technology in conjunction with a custom real-time 3-D processor), and give an account of the remaining issues and challenges that future designers of such systems will have to address.
The Small Aircraft Transportation System Project: An Update
NASA Technical Reports Server (NTRS)
Kemmerly, Guy T.
2006-01-01
To all peoples in all parts of the world throughout history, the ability to move about easily is a fundamental element of freedom. The American people have charged NASA to increase their freedom and that of their children, knowing that their quality of life will improve as our nation's transportation systems improve. In pursuit of this safe, reliable, and affordable personalized air transportation option, in 2000 NASA established the Small Aircraft Transportation System (SATS) Project. As the name suggests, personalized air transportation would be built on smaller aircraft than those used by the airlines. Of course, smaller aircraft can operate from smaller airports, and 96% of the American population is within thirty miles of a high-quality, underutilized community airport, as are the vast majority of their customers, family members, and favorite vacation destinations.
Fuzzy Evaluating Customer Satisfaction of Jet Fuel Companies
NASA Astrophysics Data System (ADS)
Cheng, Haiying; Fang, Guoyi
Based on the market characteristics of jet fuel companies, the paper proposes an evaluation index system for jet fuel company customer satisfaction with five dimensions: time, business, security, fee, and service. A multi-level fuzzy evaluation model combining the analytic hierarchy process (AHP) with the fuzzy evaluation approach is given. Finally, a case of one jet fuel company's customer satisfaction evaluation is studied; the evaluation results reflect the feelings of the company's customers, showing that the fuzzy evaluation model is effective and efficient.
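The model described above composes an AHP weight vector with a fuzzy membership matrix. A minimal sketch of that composition follows; the weights, comment grades, and membership values are invented for illustration and are not taken from the case study:

```python
import numpy as np

# AHP-derived weights for the five first-level indices
# (time, business, security, fee, service) -- illustrative values only.
W = np.array([0.15, 0.25, 0.20, 0.15, 0.25])

# Membership matrix R: each row maps one index onto the comment set
# (satisfied, neutral, dissatisfied), e.g. from customer questionnaires.
R = np.array([
    [0.6, 0.3, 0.1],
    [0.5, 0.4, 0.1],
    [0.7, 0.2, 0.1],
    [0.4, 0.4, 0.2],
    [0.6, 0.3, 0.1],
])

B = W @ R             # weighted-average fuzzy composition operator
verdict = B.argmax()  # maximum-membership principle picks the overall grade
```

With these numbers the "satisfied" grade dominates; in a real evaluation the weights come from pairwise AHP comparisons and the memberships from survey data.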
EarthTutor: An Interactive Intelligent Tutoring System for Remote Sensing
NASA Astrophysics Data System (ADS)
Bell, A. M.; Parton, K.; Smith, E.
2005-12-01
Earth science classes in colleges and high schools use a variety of satellite image processing software to teach earth science and remote sensing principles. However, current tutorials for image processing software are often paper-based or lecture-based and do not take advantage of the full potential of the computer context to teach, immerse, and stimulate students. We present EarthTutor, an adaptive, interactive Intelligent Tutoring System (ITS) being built for NASA (National Aeronautics and Space Administration) that is integrated directly with an image processing application. The system aims to foster the use of satellite imagery in classrooms and encourage inquiry-based, hands-on scientific study of earth science by providing students with an engaging imagery analysis learning environment. EarthTutor's software is available as a plug-in to ImageJ, a free image processing system developed by the NIH (National Institutes of Health). Since it is written in Java, it can be run on almost any platform and also as an applet from the Web. Labs developed for EarthTutor combine lesson content (such as HTML web pages) with interactive activities and questions. In each lab the student learns to measure, calibrate, color, slice, plot and otherwise process and analyze earth science imagery. During the activities, EarthTutor monitors students closely as they work, which allows it to provide immediate feedback that is customized to a particular student's needs. As the student moves through the labs, EarthTutor assesses the student and tailors the presentation of the content to the student's demonstrated skill level. EarthTutor's adaptive approach is based on emerging Artificial Intelligence (AI) research. Bayesian networks are employed to model a student's proficiency with different earth science and image processing concepts. Agent behaviors are used to track the student's progress through activities and provide guidance when a student encounters difficulty.
Through individual feedback and adaptive instruction, EarthTutor aims to offer the benefits of a one-on-one human instructor in a cost-effective, easy-to-use application. We are currently working with remote sensing experts to develop EarthTutor labs for diverse earth science subjects such as global vegetation, stratospheric ozone, oceanography, polar sea ice and natural hazards. These labs will be packaged with the first public release of EarthTutor in December 2005. Custom labs can be designed with the EarthTutor authoring tool. The tool is basic enough to allow teachers to construct tutorials to fit their classroom's curriculum and locale, but also powerful enough to allow advanced users to create highly interactive labs. Preliminary results from an ongoing pilot study demonstrate that the EarthTutor system is an effective and enjoyable teaching tool, relative to traditional satellite imagery teaching methods.
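The abstract does not detail EarthTutor's Bayesian networks, but the core idea of probabilistic proficiency modeling can be sketched with the related, simpler Bayesian knowledge tracing model; all parameter values here (slip, guess, and learn rates) are hypothetical:

```python
def bkt_update(p_mastery, correct, slip=0.1, guess=0.2, learn=0.15):
    # One Bayesian knowledge-tracing step: condition P(mastered) on the
    # observed response, then apply the chance of learning during practice.
    if correct:
        posterior = p_mastery * (1 - slip) / (
            p_mastery * (1 - slip) + (1 - p_mastery) * guess)
    else:
        posterior = p_mastery * slip / (
            p_mastery * slip + (1 - p_mastery) * (1 - guess))
    return posterior + (1 - posterior) * learn

# A student starts with low estimated mastery and answers four lab questions
p = 0.3
for answer in (True, True, False, True):
    p = bkt_update(p, answer)
```

A tutor can tailor content by thresholding the estimate, for example advancing the student to harder material once estimated mastery exceeds some cutoff.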
Newton, Joshua D; Klein, Ruth; Bauman, Adrian; Newton, Fiona J; Mahal, Ajay; Gilbert, Kara; Piterman, Leon; Ewing, Michael T; Donovan, Robert J; Smith, Ben J
2015-04-18
Physical activity is associated with a host of health benefits, yet many individuals do not perform sufficient physical activity to realise these benefits. One approach to rectifying this situation is through modifying the built environment to make it more conducive to physical activity, such as by building walking tracks or recreational physical activity facilities. Often, however, modifications to the built environment are not connected to efforts aimed at encouraging their use. The purpose of the Monitoring and Observing the Value of Exercise (MOVE) study is to evaluate the effectiveness of two interventions designed to encourage the ongoing use of a new, multi-purpose, community-based physical activity facility. A two-year, randomised controlled trial with yearly survey points (baseline, 12 months follow-up, 24 months follow-up) will be conducted among 1,300 physically inactive adult participants aged 18-70 years. Participants will be randomly assigned to one of three groups: control, intervention 1 (attendance incentives), or intervention 2 (attendance incentives and tailored support following a model based on customer relationship management). Primary outcome measures will include facility usage, physical activity participation, mental and physical wellbeing, community connectedness, social capital, friendship, and social support. Secondary outcome measures will include stages of change for facility usage and social cognitive decision-making variables. This study will assess whether customer relationship management systems, a tool commonly used in commercial marketing settings, can encourage the ongoing use of a physical activity facility. Findings may also indicate the population segments among which the use of such systems is most effective, as well as their cost-effectiveness. Australian New Zealand Clinical Trials Registry: ACTRN12615000012572 (registered 9 January 2015).
Fleischman, A; Parvari, U; Oron, Y; Geyer, O
2012-06-01
Electroretinography (ERG) is widely used in clinical work and research to assess the retinal function. We evaluated an easy to build ERG setup adapted for small animals comprising two contact lens electrodes with a built-in light-emitting diode and a custom-made amplification system. The system's sensitivity was tested by monitoring ERG in albino rat eyes subjected to mild ischemia. Flash ERG was recorded by two contact lens electrodes positioned on the rat's corneas and used alternately as test or reference. The a- and b-wave amplitudes, a-wave latency, b-wave implicit time and oscillatory potentials (OPs) were analyzed. Ischemia was achieved by elevating the intraocular pressure in the eye's anterior chamber. ERG was recorded on post-ischemia (PI) days -1, 1, 3 and 7. Morphological changes were analyzed on hematoxylin/eosin stained 5 µm sections of control 7d PI retinas. In control eyes, ERG exhibited a pattern similar to a standard recording. Retinas subjected to mild ischemia preserved ordered layered morphology, exhibiting approximately 30% loss of ganglion cells and no changes in gross morphology. By day 3 PI, ischemia caused an increase in the a-wave amplitude (from 34.9 ± 2.7 to 45.4 ± 4.3 µV), a decrease in the b-wave amplitude (from 248 ± 13 to 162 ± 8 µV), an increase in a-wave latency (from 11.1 ± 0.3 to 17.3 ± 1.4 ms) and b-wave implicit time (from 81.0 ± 1.6 to 90.0 ± 2.5 ms), and attenuation of OPs. The described setup proved sensitive and reliable for evaluating subtle changes in the retinal function in small animals.
First year in operating a mechanical detrasher system at a sugarcane factory in Louisiana
USDA-ARS?s Scientific Manuscript database
Over the past 2 years, a new prototype mechanical detrasher system was built at a Louisiana sugarcane factory by American Biocarbon LLC. It was built to remove sugarcane trash (top stalks and leaves) before processing, and for the manufacture and sale of value-added-products from the removed trash,...
GLobal Integrated Design Environment
NASA Technical Reports Server (NTRS)
Kunkel, Matthew; McGuire, Melissa; Smith, David A.; Gefert, Leon P.
2011-01-01
The GLobal Integrated Design Environment (GLIDE) is a collaborative engineering application built to resolve the design session issues of real-time passing of data between multiple discipline experts in a collaborative environment. Utilizing Web protocols and multiple programming languages, GLIDE allows engineers to use the applications to which they are accustomed (in this case, Excel) to send and receive datasets via the Internet to a database-driven Web server. Traditionally, a collaborative design session consists of one or more engineers representing each discipline meeting together in a single location. The discipline leads exchange parameters and iterate through their respective processes to converge on an acceptable dataset. In cases in which the engineers are unable to meet, their parameters are passed via e-mail, telephone, facsimile, or even postal mail. This slow process of data exchange could stretch a design session to weeks or even months. While the iterative process remains in place, software can now exchange parameters securely and efficiently, while at the same time allowing much more information about a design session to be made available. GLIDE is written in a compilation of several programming languages, including REALbasic, PHP, and Microsoft Visual Basic. GLIDE client installers are available to download for both Microsoft Windows and Macintosh systems. The GLIDE client software is compatible with Microsoft Excel 2000 or later on Windows systems, and with Microsoft Excel X or later on Macintosh systems. GLIDE follows the Client-Server paradigm, transferring encrypted and compressed data via standard Web protocols. Currently, the engineers use Excel as a front end to the GLIDE Client, as many of their custom tools run in Excel.
Computer imaging and workflow systems in the business office.
Adams, W T; Veale, F H; Helmick, P M
1999-05-01
Computer imaging and workflow technology automates many business processes that currently are performed using paper processes. Documents are scanned into the imaging system and placed in electronic patient account folders. Authorized users throughout the organization, including preadmission, verification, admission, billing, cash posting, customer service, and financial counseling staff, have online access to the information they need when they need it. Such streamlining of business functions can increase collections and customer satisfaction while reducing labor, supply, and storage costs. Because the costs of a comprehensive computer imaging and workflow system can be considerable, healthcare organizations should consider implementing parts of such systems that can be cost-justified or include implementation as part of a larger strategic technology initiative.
NASA Astrophysics Data System (ADS)
Zhao, Huangxuan; Wang, Guangsong; Lin, Riqiang; Gong, Xiaojing; Song, Liang; Li, Tan; Wang, Wenjia; Zhang, Kunya; Qian, Xiuqing; Zhang, Haixia; Li, Lin; Liu, Zhicheng; Liu, Chengbo
2018-04-01
For the diagnosis and evaluation of ophthalmic diseases, imaging and quantitative characterization of vasculature in the iris are very important. The recently developed photoacoustic imaging, which is ultrasensitive in imaging endogenous hemoglobin molecules, provides a highly efficient label-free method for imaging blood vasculature in the iris. However, the development of advanced vascular quantification algorithms is still needed to enable accurate characterization of the underlying vasculature. We have developed a vascular information quantification algorithm by adopting a three-dimensional (3-D) Hessian matrix and applied it to process iris vasculature images obtained with a custom-built optical-resolution photoacoustic imaging system (OR-PAM). For the first time, we demonstrate in vivo 3-D vascular structures of a rat iris with a label-free imaging method and also accurately extract quantitative vascular information, such as vessel diameter, vascular density, and vascular tortuosity. Our results indicate that the developed algorithm is capable of quantifying the vasculature in the 3-D photoacoustic images of the iris in vivo, thus enhancing the diagnostic capability of the OR-PAM system for vascular-related ophthalmic diseases.
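The 3-D Hessian approach rests on the eigenvalues of second derivatives: along a bright vessel the intensity is roughly constant (one eigenvalue near zero), while across it the intensity falls off (two strongly negative eigenvalues). The sketch below, with a hypothetical scale and threshold, illustrates that eigenvalue analysis; it is not the authors' algorithm:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def hessian_eigenvalues(volume, sigma=1.0):
    # Per-voxel eigenvalues of the Gaussian-smoothed 3-D Hessian,
    # sorted by magnitude: |l1| <= |l2| <= |l3|.
    H = np.empty(volume.shape + (3, 3))
    for i in range(3):
        for j in range(i, 3):
            order = [0, 0, 0]
            order[i] += 1
            order[j] += 1
            d = gaussian_filter(volume, sigma, order=order)
            H[..., i, j] = H[..., j, i] = d
    eig = np.linalg.eigvalsh(H)                  # ascending by value
    idx = np.argsort(np.abs(eig), axis=-1)       # re-sort by magnitude
    return np.take_along_axis(eig, idx, axis=-1)

def tube_mask(volume, sigma=1.0):
    # Bright tubular voxels: small |l1|, strongly negative l2 and l3.
    l1, l2, l3 = np.moveaxis(hessian_eigenvalues(volume, sigma), -1, 0)
    return (l2 < 0) & (l3 < 0) & (np.abs(l1) < 0.5 * np.abs(l2))
```

Vessel diameter, density, and tortuosity can then be measured on the resulting binary vessel map.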
Serrano-Gotarredona, Rafael; Oster, Matthias; Lichtsteiner, Patrick; Linares-Barranco, Alejandro; Paz-Vicente, Rafael; Gomez-Rodriguez, Francisco; Camunas-Mesa, Luis; Berner, Raphael; Rivas-Perez, Manuel; Delbruck, Tobi; Liu, Shih-Chii; Douglas, Rodney; Hafliger, Philipp; Jimenez-Moreno, Gabriel; Civit Ballcels, Anton; Serrano-Gotarredona, Teresa; Acosta-Jimenez, Antonio J; Linares-Barranco, Bernabé
2009-09-01
This paper describes CAVIAR, a massively parallel hardware implementation of a spike-based sensing-processing-learning-actuating system inspired by the physiology of the nervous system. CAVIAR uses the asynchronous address-event representation (AER) communication framework and was developed in the context of a European Union funded project. It has four custom mixed-signal AER chips, five custom digital AER interface components, 45k neurons (spiking cells), up to 5M synapses, performs 12G synaptic operations per second, and achieves millisecond object recognition and tracking latencies.
Valdez, Michelle M; Liwanag, Maureen; Mount, Charles; Rodriguez, Rechell; Avalos-Reyes, Elisea; Smith, Andrew; Collette, David; Starsiak, Michael; Green, Richard
2018-03-14
Inefficiencies in the command approval process for publications and/or presentations negatively impact DoD Graduate Medical Education (GME) residency programs' ability to meet ACGME scholarly activity requirements. A preliminary review of the authored works approval process at Naval Medical Center San Diego (NMCSD) disclosed significant inefficiency, variation in process, and a low level of customer satisfaction. In order to facilitate and encourage scholarly activity at NMCSD, and meet ACGME requirements, the Executive Steering Council (ESC) chartered an interprofessional team to lead a Lean Six Sigma (LSS) Rapid Improvement Event (RIE) project. Two major outcome metrics were identified: (1) the number of authored works submissions containing all required signatures and (2) customer satisfaction with the authored works process. Primary metric baseline data were gathered utilizing a Clinical Investigations database tracking publications and presentations. Secondary metric baseline data were collected via a customer satisfaction survey to GME faculty and residents. The project team analyzed pre-survey data and utilized LSS tools and methodology including a "gemba" (environment) walk, cause and effect diagram, critical to quality tree, voice of the customer, "muda" (waste) chart, and a pre- and post-event value stream map. The team selected an electronic submission system as the intervention most likely to positively impact the RIE project outcome measures. The number of authored works compliant with all required signatures improved from 52% to 100%. Customer satisfaction rated as "completely or mostly satisfied" improved from 24% to 97%. For both outcomes, signature compliance and customer satisfaction, statistical significance was achieved with a p < 0.0001. 
This RIE project utilized LSS methodology and tools to improve signature compliance and increase customer satisfaction with the authored works approval process, leading to 100% signature compliance, a comprehensive longitudinal repository of all authored work requests, and a 97% "completely or mostly satisfied" customer rating of the process.
Retinal angiography with real-time speckle variance optical coherence tomography.
Xu, Jing; Han, Sherry; Balaratnasingam, Chandrakumar; Mammo, Zaid; Wong, Kevin S K; Lee, Sieun; Cua, Michelle; Young, Mei; Kirker, Andrew; Albiani, David; Forooghian, Farzin; Mackenzie, Paul; Merkur, Andrew; Yu, Dao-Yi; Sarunic, Marinko V
2015-10-01
This report describes a novel, non-invasive and label-free optical imaging technique, speckle variance optical coherence tomography (svOCT), for visualising blood flow within human retinal capillary networks. This imaging system uses a custom-built swept source OCT system operating at a line rate of 100 kHz. Real-time processing and visualisation is implemented on a consumer grade graphics processing unit. To investigate the quality of microvascular detail acquired with this device we compared images of human capillary networks acquired with svOCT and fluorescein angiography. We found that the density of capillary microvasculature acquired with this svOCT device was visibly greater than that of fluorescein angiography. We also found that this svOCT device had the capacity to generate en face images of distinct capillary networks that are morphologically comparable with previously published histological studies. Finally, we found that this svOCT device has the ability to non-invasively illustrate the common manifestations of diabetic retinopathy and retinal vascular occlusion. The results of this study suggest that graphics processing unit accelerated svOCT has the potential to non-invasively provide useful quantitative information about human retinal capillary networks. Therefore svOCT may have clinical and research applications for the management of retinal microvascular diseases, which are a major cause of visual morbidity worldwide.
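Speckle variance contrast is computed from the inter-frame intensity variance of repeated B-scans at the same location: static tissue varies little between frames, while flowing blood decorrelates and produces high variance. A minimal sketch with an invented frame stack:

```python
import numpy as np

def speckle_variance(bscans):
    # bscans: N repeated B-scans at one location, shape (N, depth, width).
    # Per-pixel variance across frames highlights decorrelating (flowing) pixels.
    return np.var(bscans, axis=0)

# Hypothetical stack: uniform static tissue plus one fluctuating "vessel" pixel
rng = np.random.default_rng(1)
stack = np.full((8, 4, 4), 5.0)
stack[:, 2, 2] += rng.normal(0.0, 2.0, 8)   # flow-induced inter-frame fluctuation
sv = speckle_variance(stack)
```

Thresholding the variance map and projecting it over depth gives en face angiograms of the kind described above; the computation is per-pixel independent, which is why it maps well onto a GPU for real-time display.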
1994-09-01
Issue: Computers, information systems, and communication systems are being increasingly used in transportation, warehousing, order processing, materials... inventory levels, reduced order processing times, reduced order processing costs, and increased customer satisfaction. While purchasing and transportation... process, the speed with which orders are processed would increase significantly. Lowering the order processing time in turn lowers the lead time, which in
A knowledge based application of the extended aircraft interrogation and display system
NASA Technical Reports Server (NTRS)
Glover, Richard D.; Larson, Richard R.
1991-01-01
A family of multiple-processor ground support test equipment was used to test digital flight-control systems on high-performance research aircraft. A unit recently built for the F-18 high alpha research vehicle project is the latest model in a series called the extended aircraft interrogation and display system. Its primary feature is monitoring of the aircraft MIL-STD-1553B data buses, with real-time engineering-units displays of flight-control parameters. A customized software package was developed to provide real-time data interpretation based on rules embodied in a highly structured knowledge database. The configuration of this extended aircraft interrogation and display system is briefly described, and the evolution of the rule-based package and its application to failure modes and effects testing on the F-18 high alpha research vehicle is discussed.
The Anticounter System of the PAMELA Space Experiment
NASA Astrophysics Data System (ADS)
Pearce, M.; Carlson, P.; Lund, J.; Lundquist, J.; Orsi, S.; Rydstroem
2003-07-01
The PAMELA space experiment [4] will be launched on board a polar-orbiting Resurs DK1 satellite in 2004. The primary objective of PAMELA is to measure the flux of antiprotons (80 MeV–190 GeV) and positrons (50 MeV–270 GeV) in the cosmic radiation. PAMELA is built around a permanent-magnet silicon spectrometer which is surrounded by an anticounter system. The anticounter system uses sheets of plastic scintillator to identify particles which do not pass cleanly through the acceptance of the spectrometer but still give rise to coincidental energy deposits in the time-of-flight/trigger scintillators positioned at the entrance and exit of the spectrometer. The construction of the anticounter system is described in detail along with the custom read-out, data acquisition and calibration electronics. Results from qualification studies are also discussed.
FOAM: the modular adaptive optics framework
NASA Astrophysics Data System (ADS)
van Werkhoven, T. I. M.; Homs, L.; Sliepen, G.; Rodenhuis, M.; Keller, C. U.
2012-07-01
Control software for adaptive optics systems is mostly custom built and very specific in nature. We have developed FOAM, a modular adaptive optics framework for controlling and simulating adaptive optics systems in various environments. Portability is provided both for different control hardware and adaptive optics setups. To achieve this, FOAM is written in C++ and runs on standard CPUs. Furthermore, we use standard Unix libraries and compilation procedures and implemented a hardware abstraction layer in FOAM. We have successfully implemented FOAM on the adaptive optics system of ExPo - a high-contrast imaging polarimeter developed at our institute - in the lab and will test it on-sky in late June 2012. We also plan to implement FOAM on adaptive optics systems for microscopy and solar adaptive optics. FOAM is available under the GNU GPL license and is free to be used by anyone.
Measuring Effectiveness of TQM Training: An Indian Study.
ERIC Educational Resources Information Center
Palo, Sasmita; Padhi, Nayantara
2003-01-01
Responses from 372 employees of a steel manufacturer in India were analyzed to measure effectiveness of total quality management training. Training created awareness, built commitment to quality, facilitated teamwork, and enhanced professional standards. However, communication competencies and customer value training needed improvement. (Contains…
Ke, Bilian; Mao, Xinjie; Jiang, Hong; He, Jichang; Liu, Che; Li, Min; Yuan, Ying
2017-01-01
Purpose: This study investigated the anterior ocular anatomic origin of high-order aberration (HOA) components using optical coherence tomography and a Shack-Hartmann wavefront sensor. Methods: A customized system was built to simultaneously capture images of ocular wavefront aberrations and anterior ocular biometry. Relaxed, 2-diopter (D), and 4-D accommodative states were repeatedly measured in 30 young subjects. Custom software was used to correct optical distortions and measure biometric parameters from the images. Results: The anterior ocular biometry changed during 2-D accommodation, in which central lens thickness, ciliary muscle thickness at 1 mm posterior to the scleral spur (CMT1), and the maximum value of ciliary muscle thickness increased significantly, whereas anterior chamber depth, CMT3, radius of anterior lens surface curvature (RAL), and radius of posterior lens surface curvature (RPL) decreased significantly. The changes in the anterior ocular parameters during 4-D accommodation were similar to those for the 2-D accommodation.
Knowledge-based reasoning in the Paladin tactical decision generation system
NASA Technical Reports Server (NTRS)
Chappell, Alan R.
1993-01-01
A real-time tactical decision generation system for air combat engagements, Paladin, has been developed. A pilot's job in air combat includes tasks that are largely symbolic. These symbolic tasks are generally performed through the application of experience and training (i.e. knowledge) gathered over years of flying a fighter aircraft. Two such tasks, situation assessment and throttle control, are identified and broken out in Paladin to be handled by specialized knowledge based systems. Knowledge pertaining to these tasks is encoded into rule-bases to provide the foundation for decisions. Paladin uses a custom built inference engine and a partitioned rule-base structure to give these symbolic results in real-time. This paper provides an overview of knowledge-based reasoning systems as a subset of rule-based systems. The knowledge used by Paladin in generating results as well as the system design for real-time execution is discussed.
Software Management for the NOνA Experiment
NASA Astrophysics Data System (ADS)
Davies, G. S.; Davies, J. P.; C Group; Rebel, B.; Sachdev, K.; Zirnstein, J.
2015-12-01
The NOvA software (NOνASoft) is written in C++ and built on the Fermilab Computing Division's art framework that uses ROOT analysis software. NOνASoft makes use of more than 50 external software packages, is developed by more than 50 developers, and is used by more than 100 physicists from over 30 universities and laboratories on 3 continents. The software builds are handled by Fermilab's custom version of Software Release Tools (SRT), a UNIX-based software management system for large, collaborative projects that is used by several experiments at Fermilab. The system provides software version control with SVN configured in a client-server mode and is based on the code originally developed by the BaBar collaboration. In this paper, we present efforts towards distributing the NOvA software via the CernVM File System (CVMFS). We will also describe our recent work to use a CMake build system and Jenkins, the open source continuous integration system, for NOνASoft.
Wolfe, Benjamin; Rushmore, Richard J.; Valero-Cabré, Antoni
2010-01-01
While two-dimensional stimuli may be easily presented with any computer, an apparatus which allows a range of stimuli to be presented in three dimensions is not easily or cheaply available to researchers or clinicians. To fill this gap, we have developed the Realspace Testing System (RTS) which addresses the need for a flexible and multimodal stimulus presentation system capable of displaying stimuli in a three dimensional space with a high degree of temporal accuracy. The RTS is able to control twenty-six channels of visual or audio stimuli, to send trigger pulses during each trial to external devices, such as a Transcranial Magnetic Stimulator, and to record subject responses during the testing sessions. The RTS is flexible, portable and can be used in laboratory or clinical settings as required while being built at a low cost using off the shelf components. We have tested the RTS by performing an exploratory experiment on the role of right posterior parietal cortex in visuospatial processing in conjunction with online Transcranial Magnetic Stimulation (TMS) and verified that the system can accurately present stimuli as needed while triggering a TMS pulse during each trial at the required time. The RTS could be appealing and useful to a range of researchers or clinicians who may choose to use it much as we have designed it, or use it in its current state as a starting point to customize their stimulus control systems in real space. PMID:20079374
Dimensionless Analysis and Numerical Modeling of Rebalancing Phenomena During Levitation
NASA Astrophysics Data System (ADS)
Gao, Lei; Shi, Zhe; Li, Donghui; McLean, Alexander; Chattopadhyay, Kinnor
2016-06-01
Electromagnetic levitation (EML) has proved to be a powerful tool for research activities in areas pertaining to materials physics and engineering. Customized EML setups in various fields, ranging from solidification to nanomaterial manufacturing, require the design of stable levitation systems. Since the levitated droplet is opaque, the most effective way to study EML is mathematical modeling. In the present study, a 3D model was built to investigate the rebalancing phenomenon that causes instabilities during droplet melting. A mathematical model based on a modified Hooke's law (spring) was proposed to describe the levitation system. This was combined with dimensionless analysis to investigate the generation of levitation forces, since these significantly affect the behavior of the spring model.
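The spring analogy can be sketched as follows (a minimal textbook illustration under the usual small-displacement assumption, not the authors' exact formulation): near the equilibrium height z_0, where the lifting force balances gravity, the net force on the droplet is approximately linear in the displacement,

```latex
F_L(z_0) = m g, \qquad
m \ddot{z} = F_L(z) - m g \approx -k\,(z - z_0), \qquad
\omega_0 = \sqrt{k/m}
```

so the droplet oscillates about z_0 with an effective stiffness k set by the gradient of the levitation force; rebalancing during melting then corresponds to a shift of z_0 and k as the droplet's mass and geometry change.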
Low-power, low-cost urinalysis system with integrated dipstick evaluation and microscopic analysis.
Smith, Gennifer T; Li, Linkai; Zhu, Yue; Bowden, Audrey K
2018-06-21
We introduce a coupled dipstick and microscopy device for analyzing urine samples. The device is capable of accurately assessing urine dipstick results while simultaneously imaging the microscopic contents within the sample. We introduce a long working distance, cellphone-based microscope in combination with an oblique illumination scheme to accurately visualize and quantify particles within the urine sample. To facilitate accurate quantification, we couple the imaging set-up with a power-free filtration system. The proposed device is reusable, low-cost, and requires very little power. We show that results obtained with the proposed device and custom-built app are consistent with those obtained with the standard clinical protocol, suggesting the potential clinical utility of the device.
Single-molecule fluorescence study of the inhibition of the oncogenic functionality of STAT3
NASA Astrophysics Data System (ADS)
Liu, Baoxu; Badali, Daniel; Fletcher, Steven; Avadisian, Miriam; Gunning, Patrick; Gradinaru, Claudiu
2009-06-01
Signal-Transducer-and-Activator-of-Transcription 3 (STAT3) protein plays an important role in the onset of cancers such as leukemia and lymphoma. In this study, we aim to test the effectiveness of a novel peptide drug designed to tether STAT3 to the phospholipid bilayer of the cell membrane and thus inhibit unwanted transcription. As a first step, STAT3 proteins were successfully labelled with tetramethylrhodamine (TMR), a fluorescent dye with suitable photostability for single-molecule studies. The effectiveness of labelling was determined using fluorescence correlation spectroscopy in a custom-built confocal microscope, from which the diffusion times and hydrodynamic radii of individual proteins were determined. A newly developed fluorescein derivative label (F-NAc) has been designed to be incorporated into the structure of the peptide drug so that peptide-STAT3 interactions can be examined. This dye has been spectrally characterized and found to be well suited for this project, as well as for other single-molecule studies. Membrane localization via high-affinity cholesterol-bound small-molecule binding agents can be demonstrated by encapsulating TMR-labeled STAT3 and inhibitors within a model-cell vesicle system. To this end, unilamellar lipid vesicles were examined for size and encapsulation ability. Preliminary results on the efficiency and stability of STAT3 anchoring in lipid membranes, obtained via quantitative confocal imaging and single-molecule spectroscopy using a custom-built multiparameter fluorescence microscope, are reported here.
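The link between the measured diffusion times and the hydrodynamic radii follows from standard FCS analysis (a textbook sketch rather than the authors' exact treatment; w_xy, the lateral radius of the confocal detection volume, is an assumed symbol here):

```latex
\tau_D = \frac{w_{xy}^{2}}{4D}, \qquad
R_h = \frac{k_B T}{6 \pi \eta D}
    = \frac{2\, k_B T\, \tau_D}{3 \pi \eta\, w_{xy}^{2}}
```

so a longer diffusion time through the fixed confocal volume translates directly into a larger apparent hydrodynamic radius of the labelled protein.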
Durr, W
1998-01-01
Call centers are strategically and tactically important to many industries, including the healthcare industry, and play a key role in acquiring and retaining customers. The ability to deliver high-quality, timely customer service without great expense is the basis for the proliferation and expansion of call centers. Call centers are unique blends of people and technology, whose performance depends on combining appropriate technology tools with sound management practices built on key operational data. While the technology is fascinating, it is the people working in call centers and the skill of the management team that ultimately make the difference to their companies.
Real-Time, Wide Area Dispatch of Mobil Tank Trucks
1987-01-01
…human dispatchers it assists. Using CAD, Mobil has substantially reduced costs and staff while improving customer service. In the spring of 1985, a… process by establishing the Mobil order response center (MORC). To use MORC, the customer dials a toll-free number, available 24 hours a day, seven… Figure 3: Mobil light products order and dispatch information flow. Customers call an audio response computer system named MORC (Mobil order…
MetaNET--a web-accessible interactive platform for biological metabolic network analysis.
Narang, Pankaj; Khan, Shawez; Hemrom, Anmol Jaywant; Lynn, Andrew Michael
2014-01-01
Metabolic reactions have been extensively studied and compiled over the last century. These have provided a theoretical base for implementing models, simulations of which are used to identify drug targets and optimize metabolic throughput at a systemic level. While tools for the perturbation of metabolic networks are available, their applications are limited and restricted, as they require varied dependencies and often a commercial platform for full functionality. We have developed MetaNET, an open-source, user-friendly, platform-independent, and web-accessible resource consisting of several pre-defined workflows for metabolic network analysis. MetaNET incorporates a range of functions which can be combined to produce different simulations related to metabolic networks. These include: (i) optimization of an objective function for the wild-type strain and gene/catalyst/reaction knock-out/knock-down analysis using flux balance analysis; (ii) flux variability analysis; (iii) chemical species participation; (iv) identification of cycles and extreme paths; and (v) choke-point reaction analysis to facilitate identification of potential drug targets. The platform is built using custom scripts along with the open-source Galaxy workflow engine and the Systems Biology Research Tool as components. Pre-defined workflows are available for common processes, and an exhaustive list of over 50 functions is provided for user-defined workflows. MetaNET, available at http://metanet.osdd.net , provides a user-friendly, rich interface allowing the analysis of genome-scale metabolic networks under various genetic and environmental conditions. The framework permits the storage of previous results, the ability to repeat analyses and share results with other users over the internet, and the ability to run different tools simultaneously using pre-defined or user-created custom workflows.
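A minimal flux balance analysis of the kind wrapped by MetaNET's workflows can be sketched in a few lines (the toy two-reaction network and bounds below are invented for illustration, not taken from MetaNET):

```python
# Toy flux balance analysis (FBA): maximize a biomass flux subject to
# steady-state mass balance S v = 0 and capacity bounds on each reaction.
import numpy as np
from scipy.optimize import linprog

# Stoichiometric matrix S (rows = metabolites, cols = reactions).
# Toy network: R1 (uptake): -> A,  R2 (biomass): A ->
S = np.array([[1.0, -1.0]])           # balance of metabolite A
bounds = [(0, 10.0), (0, None)]       # uptake capped at 10 units
c = np.array([0.0, -1.0])             # maximize v2  <=>  minimize -v2

res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]),
              bounds=bounds, method="highs")
print(res.x)                          # optimal flux distribution
```

Knock-out analysis amounts to re-solving with the knocked-out reaction's bounds set to (0, 0); here, blocking the uptake reaction would drive the biomass flux to zero.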
Characteristics of a semi-custom library development system
NASA Technical Reports Server (NTRS)
Yancey, M.; Cannon, R.
1990-01-01
Standard cell and gate array macro libraries are in common use with workstation computer-aided design (CAD) tools for semi-custom application-specific integrated circuit (ASIC) design, and have resulted in significant improvements in overall design efficiency as contrasted with full-custom design methodologies. Similar design methodology enhancements that provide for the efficient development of the library cells themselves are an important factor in responding to the need for continuous technology improvement. The characteristics of a library development system that provides design flexibility and productivity enhancements for library development engineers as they deliver libraries in state-of-the-art process technologies are presented, along with an overview of Gould's library development system ('Accolade').
Payload/GSE/data system interface: Users guide for the VPF (Vertical Processing Facility)
NASA Technical Reports Server (NTRS)
1993-01-01
The Payload/GSE/Data System Interface users guide for the Vertical Processing Facility is presented. The purpose of the document is threefold. First, it describes the simulated Payload and Ground Support Equipment (GSE) Data System Interface, also known as the Payload T-0 (T-Zero) System. This simulated system is located with the Cargo Integration Test Equipment (CITE) in the Vertical Processing Facility (VPF) in the KSC Industrial Area. The actual Payload T-0 System consists of the Orbiter, the Mobile Launch Platforms (MLPs), and Launch Complex (LC) 39A and B; this is referred to as the Pad Payload T-0 System (refer to KSC-DL-116 for a description). Secondly, it informs the payload customer of the differences between the simulated system and the actual system. Thirdly, it provides a reference guide to the VPF Payload T-0 System for both KSC and payload customer personnel.
NASA Technical Reports Server (NTRS)
Wassil-Grimm, Andrew D.
1997-01-01
More effective electronic communication processes are needed to transfer contractor and international partner data into NASA and prime contractor baseline database systems. It is estimated that the International Space Station Alpha (ISSA) parts database will contain up to one million parts, each of which may require approximately one thousand bytes of data. The resulting gigabyte database must provide easy access to users who will be preparing multiple analyses and reports in order to verify as-designed, as-built, launch, on-orbit, and return configurations for up to 45 missions associated with the construction of the ISSA. Additionally, Internet access to this database is strongly indicated, to allow access by multiple clients located in many foreign countries. This summer's project involved familiarization with and evaluation of the ISSA Electrical, Electronic, and Electromechanical (EEE) parts data and the process of electronically managing these data. Particular attention was devoted to improving the interfaces among the many elements of the ISSA information system and its global customers and suppliers. Additionally, prototype queries were developed to facilitate the identification of data changes in the database, verification that designs used only approved parts, and certification that flight hardware containing EEE parts was ready for flight. The project also resulted in specific recommendations to NASA for further development in the area of EEE parts database development and usage.
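A data-change query of the kind prototyped here can be sketched with SQL set operations (the schema, table names, and part numbers below are hypothetical, chosen only to illustrate the idea):

```python
# Hypothetical sketch of a "data change" query for an EEE parts database:
# find parts present in the as-built baseline but absent from the as-designed
# baseline. The schema is invented for illustration.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE parts_asdesigned (part_no TEXT PRIMARY KEY, approved INTEGER);
CREATE TABLE parts_asbuilt   (part_no TEXT PRIMARY KEY, approved INTEGER);
INSERT INTO parts_asdesigned VALUES ('EEE-001', 1), ('EEE-002', 1);
INSERT INTO parts_asbuilt   VALUES ('EEE-001', 1), ('EEE-003', 0);
""")

# Parts added (or renumbered) between the two baselines
changed = con.execute("""
SELECT part_no FROM parts_asbuilt
EXCEPT
SELECT part_no FROM parts_asdesigned
""").fetchall()
print(changed)   # parts present as-built but not as-designed
```

The same pattern with `WHERE approved = 0` would support the "approved parts only" verification mentioned above.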
Study on tar generated from downdraft gasification of oil palm fronds.
Atnaw, Samson Mekbib; Kueh, Soo Chuan; Sulaiman, Shaharin Anwar
2014-01-01
One of the most challenging issues concerning the gasification of oil palm fronds (OPF) is the presence of tar and particulates formed during the process, given the high volatile matter content of OPF. In this study, a tar sampling train custom built according to standard tar sampling protocols was used to quantify the gravimetric concentration of tar (g/Nm3) in syngas produced from downdraft gasification of OPF. The amounts of char, ash, and solid tar produced by the gasification process were measured in order to account for the mass and carbon conversion efficiency. Elemental analysis of the char and solid tar samples was done using an ultimate analysis machine, while the relative concentrations of the different compounds in the liquid tar were determined using a gas chromatography (GC) unit. Average tar concentrations of 4.928 g/Nm3 and 1.923 g/Nm3 were obtained for the raw gas and cleaned gas samples, respectively. The tar concentration in the raw gas sample was found to be higher than results reported for other biomass materials, which could be attributed to the higher volatile matter percentage of OPF. An average cleaning efficiency of 61%, which is comparable to that of the sand bed filter and venturi scrubber cleaning systems reported in the literature, was obtained for the cleaning system proposed in the current study.
PMID:24526899
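The reported 61% cleaning efficiency follows directly from the two measured tar concentrations:

```python
# Cleaning efficiency from the tar concentrations reported in the abstract.
raw_tar     = 4.928   # g/Nm3, raw syngas
cleaned_tar = 1.923   # g/Nm3, after the gas cleaning system

efficiency = (raw_tar - cleaned_tar) / raw_tar * 100
print(round(efficiency))   # -> 61
```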
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Bingbing; Knopf, Daniel A.; China, Swarup
Heterogeneous ice nucleation is a physical chemistry process of critical relevance to a range of topics in the fundamental and applied sciences and technologies, yet it remains insufficiently understood. This is in part due to the lack of experimental methods capable of in situ visualization of ice formation over nucleating substrates with microscopically characterized morphology and composition. We present the development, validation, and first applications of a novel electron microscopy platform allowing observation of individual ice nucleation events at temperatures and relative humidities (RH) relevant for ice formation in a broad range of environmental and applied technology processes. The approach utilizes a custom-built ice nucleation cell interfaced with an Environmental Scanning Electron Microscope (the IN-ESEM system). The IN-ESEM system allows dynamic observation of individual ice formation events over particles of atmospheric relevance and determination of the ice nucleation mechanisms. Additional IN-ESEM experiments allow examination of the location of ice formation on the surface of individual particles and micro-spectroscopy analysis of the ice nucleating particles (INPs). This includes elemental composition detected by energy-dispersive X-ray analysis (EDX), speciation of the organic content in particles using scanning transmission X-ray microscopy with near-edge X-ray absorption fine structure spectroscopy (STXM/NEXAFS), and helium ion microscopy (HeIM). The capabilities of the IN-ESEM experimental platform are demonstrated first on laboratory standards and then by chemical imaging of INPs in a complex sample of ambient particles.
Requirements for Space Settlement Design
NASA Astrophysics Data System (ADS)
Gale, Anita E.; Edwards, Richard P.
2004-02-01
When large space settlements are finally built, inevitably the customers who pay for them will start the process by specifying requirements with a Request for Proposal (RFP). Although we are decades away from seeing the first of these documents, some of their contents can be anticipated now, and provide insight into the variety of elements that must be researched and developed before space settlements can happen. Space Settlement Design Competitions for High School students present design challenges in the form of RFPs, which predict basic requirements for space settlement attributes in the future, including structural features, infrastructure, living conveniences, computers, business areas, and safety. These requirements are generically summarized, and unique requirements are noted for specific space settlement locations and applications.
Management of Customer Service in Terms of Logistics Information Systems
NASA Astrophysics Data System (ADS)
Kampf, Rudolf; Ližbetinová, Lenka; Tišlerová, Kamila
2017-03-01
This paper focuses on logistic services as a competitive advantage in e-commerce. Customers consider their purchases in their full complexity, and logistic services should be designed to meet customer preferences as closely as possible. Our aim was to identify and evaluate customer perceptions of the sales proposals offered by e-shops. The collected research data were processed using cluster analysis. The aim of this paper is to present the results and conclusions of this research, with a focus on the elements of logistics services within e-commerce. These outputs can feed the knowledge bases of information systems through which enterprises evaluate their decisions and select among variants. For an enterprise, it is important that decisions about resource allocation and the design of the logistics service structure be based on real customer preferences.
A Decision Analysis Tool for Climate Impacts, Adaptations, and Vulnerabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Omitaomu, Olufemi A; Parish, Esther S; Nugent, Philip J
Climate change related extreme events (such as flooding, storms, and drought) are already impacting millions of people globally at a cost of billions of dollars annually. Hence, there is an urgent need for urban areas to develop adaptation strategies that will alleviate the impacts of these extreme events. However, the lack of appropriate decision support tools matched to local applications is limiting local planning efforts. In this paper, we present a quantitative analysis and optimization system with customized decision support modules built on a geographic information system (GIS) platform to bridge this gap. This platform is called the Urban Climate Adaptation Tool (Urban-CAT). For all Urban-CAT models, we divide a city into a grid with tens of thousands of cells and then compute a list of metrics for each cell from the GIS data. These metrics are used as independent variables to predict climate impacts, compute a vulnerability score, and evaluate adaptation options. Overall, the Urban-CAT system has three layers: a data layer (spatial data, socio-economic and environmental data, and analytic data), a middle layer (data processing, model management, and GIS operations), and an application layer (climate impact forecasts, adaptation optimization, and site evaluation). The Urban-CAT platform can guide city and county governments in identifying and planning effective climate change adaptation strategies.
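The per-cell scoring idea can be sketched as follows (a minimal illustration only; the metric names, weights, and weighted-sum scoring are assumptions, not Urban-CAT's actual models):

```python
# Sketch of a per-grid-cell vulnerability score: min-max normalize each
# metric across the grid, then combine with weights summing to one.
import numpy as np

rng = np.random.default_rng(0)
n_cells = 6
metrics = {                                  # one value per grid cell
    "imperviousness": rng.random(n_cells),
    "low_elevation":  rng.random(n_cells),   # higher = lower-lying, riskier
    "population":     rng.random(n_cells),
}
weights = {"imperviousness": 0.4, "low_elevation": 0.4, "population": 0.2}

def normalize(x):
    # rescale a metric to [0, 1] across all cells
    return (x - x.min()) / (x.max() - x.min())

score = sum(w * normalize(metrics[m]) for m, w in weights.items())
print(score)   # vulnerability score in [0, 1] for each cell
```

Because the weights sum to one and each normalized metric lies in [0, 1], the combined score stays in [0, 1], which makes cells directly comparable across the grid.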
A system for the rapid detection of bacterial contamination in cell-based therapeutics
NASA Astrophysics Data System (ADS)
Bolwien, Carsten; Erhardt, Christian; Sulz, Gerd; Thielecke, Hagen; Johann, Robert; Pudlas, Marieke; Mertsching, Heike; Koch, Steffen
2010-02-01
Monitoring the sterility of cell or tissue cultures is of major concern, particularly in the fields of regenerative medicine and tissue engineering, when implanting cells into the human body. Our sterility-control system is based on a Raman micro-spectrometer and is able to perform fast sterility testing on microliters of liquid sample. In conventional sterility control, samples are incubated for weeks to proliferate the contaminants to concentrations above the detection limit of conventional analysis. By contrast, our system filters particles from the liquid sample. The filter chip, fabricated in microsystem technology, comprises a silicon nitride membrane with millions of sub-micrometer holes to retain particles of critical sizes, and is embedded in a microfluidic cell specially suited for concomitant microscopic observation. After filtration, identification is carried out at the single-particle level: image processing detects possible contaminants and prepares them for Raman spectroscopic analysis. A custom-built Raman spectrometer attachment coupled to the commercial microscope uses 532 nm or 785 nm Raman excitation and records spectra up to 3400 cm-1. In the final step, the recorded spectrum of a single particle is compared to an extensive library of GMP-relevant organisms, and classification is carried out using a support vector machine.
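The library-comparison step can be illustrated with a simplified nearest-match classifier (the paper uses a support vector machine; the cosine-similarity matching and the synthetic spectra below are stand-ins for illustration only):

```python
# Simplified stand-in for spectral classification: match a measured Raman
# spectrum against a small reference library by cosine similarity.
import numpy as np

wavenumbers = np.linspace(400, 3400, 300)          # cm^-1 axis

def peak(center, width=40.0):
    # Gaussian band as a crude model of a Raman peak
    return np.exp(-((wavenumbers - center) ** 2) / (2 * width ** 2))

library = {                                        # invented reference spectra
    "organism A": peak(1004) + peak(1450),
    "organism B": peak(1160) + peak(1520),
}

def classify(spectrum):
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return max(library, key=lambda k: cos(spectrum, library[k]))

measured = peak(1004) + peak(1450) \
    + 0.05 * np.random.default_rng(1).random(wavenumbers.size)
print(classify(measured))   # -> organism A
```

A trained SVM replaces the similarity lookup with a learned decision boundary, which is more robust when spectra of different organisms overlap strongly.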
Windchill-201 - Custom Soft-Type Construction
NASA Technical Reports Server (NTRS)
Jones, Corey; LaPha, Steven
2013-01-01
This presentation will explain Windchill soft-types: what they are, how they work, and how to construct custom ones configured specifically for your system. The process and particulars of creating and implementing a WTDocument soft-type will be discussed, and the interaction between soft-types and Windchill objects will be shown.
cPath: open source software for collecting, storing, and querying biological pathways.
Cerami, Ethan G; Bader, Gary D; Gross, Benjamin E; Sander, Chris
2006-11-13
Biological pathways, including metabolic pathways, protein interaction networks, signal transduction pathways, and gene regulatory networks, are currently represented in over 220 diverse databases. These data are crucial for the study of specific biological processes, including human diseases. Standard exchange formats for pathway information, such as BioPAX, CellML, SBML and PSI-MI, enable convenient collection of this data for biological research, but mechanisms for common storage and communication are required. We have developed cPath, an open source database and web application for collecting, storing, and querying biological pathway data. cPath makes it easy to aggregate custom pathway data sets available in standard exchange formats from multiple databases, present pathway data to biologists via a customizable web interface, and export pathway data via a web service to third-party software, such as Cytoscape, for visualization and analysis. cPath is software only, and does not include new pathway information. Key features include: a built-in identifier mapping service for linking identical interactors and linking to external resources; built-in support for PSI-MI and BioPAX standard pathway exchange formats; a web service interface for searching and retrieving pathway data sets; and thorough documentation. The cPath software is freely available under the LGPL open source license for academic and commercial use. cPath is a robust, scalable, modular, professional-grade software platform for collecting, storing, and querying biological pathways. It can serve as the core data handling component in information systems for pathway visualization, analysis and modeling.
Understanding customer experience.
Meyer, Christopher; Schwager, Andre
2007-02-01
Anyone who has signed up for cell phone service, attempted to claim a rebate, or navigated a call center has probably suffered from a company's apparent indifference to what should be its first concern: the customer experiences that culminate in either satisfaction or disappointment and defection. Customer experience is the subjective response customers have to direct or indirect contact with a company. It encompasses every aspect of an offering: customer care, advertising, packaging, features, ease of use, reliability. Customer experience is shaped by customers' expectations, which largely reflect previous experiences. Few CEOs would argue against the significance of customer experience or against measuring and analyzing it. But many don't appreciate how those activities differ from CRM or just how illuminating the data can be. For instance, the majority of the companies in a recent survey believed they have been providing "superior" experiences to customers, but most customers disagreed. The authors describe a customer experience management (CEM) process that involves three kinds of monitoring: past patterns (evaluating completed transactions), present patterns (tracking current relationships), and potential patterns (conducting inquiries in the hope of unveiling future opportunities). Data are collected at or about touch points through such methods as surveys, interviews, focus groups, and online forums. Companies need to involve every function in the effort, not just a single customer-facing group. The authors go on to illustrate how a cross-functional CEM system is created. With such a system, companies can discover which customers are prospects for growth and which require immediate intervention.
Geo-hazard harmonised data a driven process to environmental analysis system
NASA Astrophysics Data System (ADS)
Cipolloni, Carlo; Iadanza, Carla; Pantaloni, Marco; Trigila, Alessandro
2015-04-01
In the last decade, an increase in damage caused by natural disasters has been recorded in Italy. Supporting environmental safety and human protection, by reducing the vulnerability of exposed elements and improving the resilience of the communities involved, requires access to harmonized and customized data; this is one of several steps towards delivering adequate support for risk assessment, reduction, and management. In this context, SEIS and Copernicus-GMES have been developed as infrastructures based on web services for environmental analysis, integrating specifications and results from INSPIRE. Two landslide risk scenarios developed in different European projects have driven the data harmonization process, which is the basic requirement for interoperable web services in an environmental analysis system. From these two perspectives, we have built a common methodology to analyse datasets and transform them into an INSPIRE-compliant format, following the INSPIRE Data Specifications on Geology and on Natural Risk Zones. To maximize the results and re-usability of the data, we have also applied to the landslide and geological datasets a wider data model standard, GeoSciML, which represents a natural extension of the INSPIRE data model and provides richer information. The aim of this work is to present the first results of the two projects concerning the data harmonisation process, in which an important role is played by semantic harmonisation using ontology services and/or hierarchical vocabularies available as Linked Data or Linked Open Data via URIs directly in the spatial data services. We will show how the harmonised web services can add value in a risk scenario analysis system, presenting the first results of the landslide environmental analysis developed by the eENVplus and LIFE+IMAGINE projects.
Motion camera based on a custom vision sensor and an FPGA architecture
NASA Astrophysics Data System (ADS)
Arias-Estrada, Miguel
1998-09-01
A digital camera for custom focal plane arrays was developed. The camera allows the test and development of analog or mixed-mode arrays for focal plane processing, and is used here with a custom motion-detection sensor to implement a motion computation system. The custom focal plane sensor detects moving edges at the pixel level using analog VLSI techniques and communicates motion events using the event-address protocol, associated with a temporal reference. In a second stage, a coprocessing architecture based on a field programmable gate array (FPGA) computes the time-of-travel between adjacent pixels. The FPGA allows rapid prototyping and flexible architecture development. Furthermore, the FPGA interfaces the sensor to a compact PC, which is used for high-level control and data communication to the local network. The camera could be used in applications such as self-guided vehicles, mobile robotics, and smart surveillance systems. The programmability of the FPGA allows the exploration of further signal processing, such as spatial edge detection or image segmentation tasks. The article details the motion algorithm, the sensor architecture, the use of the event-address protocol for velocity vector computation, and the FPGA architecture used in the motion camera system.
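The time-of-travel computation can be sketched as follows (the pixel pitch, addresses, and timestamps below are invented for illustration): each moving-edge event carries a pixel address and a timestamp, and the edge speed follows from the pixel spacing divided by the time between events at adjacent pixels.

```python
# Edge speed from event-address data: pixel pitch / time-of-travel.
pixel_pitch_um = 30.0                      # center-to-center pixel spacing

# (address, timestamp_us) events for one edge crossing adjacent pixels
events = [(10, 1000.0), (11, 1250.0), (12, 1500.0)]

speeds = []
for (a0, t0), (a1, t1) in zip(events, events[1:]):
    dt_us = t1 - t0                        # time-of-travel between neighbors
    speeds.append((a1 - a0) * pixel_pitch_um / dt_us)   # um/us == m/s

print(speeds)   # -> [0.12, 0.12]
```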
Gamma ray imager on the DIII-D tokamak
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pace, D. C., E-mail: pacedc@fusion.gat.com; Taussig, D.; Eidietis, N. W.
2016-04-15
A gamma ray camera is built for the DIII-D tokamak [J. Luxon, Nucl. Fusion 42, 614 (2002)] that provides spatial localization and energy resolution of gamma flux by combining a lead pinhole camera with custom-built detectors and optimized viewing geometry. This diagnostic system is installed on the outer midplane of the tokamak such that its 123 collimated sightlines extend across the tokamak radius while also covering most of the vertical extent of the plasma volume. A set of 30 bismuth germanate detectors can be secured in any of the available sightlines, allowing for customizable coverage in experiments with runaway electrons in the energy range of 1–60 MeV. Commissioning of the gamma ray imager includes the quantification of electromagnetic noise sources in the tokamak machine hall and a measurement of the energy spectrum of background gamma radiation. First measurements of gamma rays coming from the plasma provide a suitable testbed for implementing pulse height analysis that provides the energy of detected gamma photons.
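The pulse height analysis step can be sketched as follows (the linear calibration constants and pulse data are assumptions for illustration, not DIII-D values): each detector pulse height is mapped to a photon energy via a calibration, and the energies are histogrammed into a spectrum.

```python
# Pulse height analysis sketch: linear energy calibration plus histogramming.
import numpy as np

gain_mev_per_v = 12.0          # assumed energy per volt of pulse height
offset_mev = 0.5               # assumed calibration offset

pulse_heights_v = np.array([0.5, 1.0, 1.0, 2.0, 4.5])
energies_mev = gain_mev_per_v * pulse_heights_v + offset_mev

counts, edges = np.histogram(energies_mev, bins=[0, 10, 20, 30, 60])
print(energies_mev)   # -> [ 6.5 12.5 12.5 24.5 54.5]
print(counts)         # -> [1 2 1 1]
```

In practice the gain and offset come from known calibration sources; the resulting spectrum is what constrains the runaway electron energy distribution.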
Gamma ray imager on the DIII-D tokamak
Pace, D. C.; Cooper, C. M.; Taussig, D.; ...
2016-04-13
A gamma ray camera is built for the DIII-D tokamak [J. Luxon, Nucl. Fusion 42, 614 (2002)] that provides spatial localization and energy resolution of gamma flux by combining a lead pinhole camera with custom-built detectors and optimized viewing geometry. This diagnostic system is installed on the outer midplane of the tokamak such that its 123 collimated sightlines extend across the tokamak radius while also covering most of the vertical extent of the plasma volume. A set of 30 bismuth germanate detectors can be secured in any of the available sightlines, allowing for customizable coverage in experiments with runaway electrons in the energy range of 1–60 MeV. Commissioning of the gamma ray imager includes the quantification of electromagnetic noise sources in the tokamak machine hall and a measurement of the energy spectrum of background gamma radiation. In conclusion, first measurements of gamma rays coming from the plasma provide a suitable testbed for implementing pulse height analysis that provides the energy of detected gamma photons.
Remote Sensing of Air Pollution from Geo with GEMS and TEMPO
NASA Astrophysics Data System (ADS)
Lasnik, J.; Nicks, D. K., Jr.; Baker, B.; Canova, B.; Chance, K.; Liu, X.; Suleiman, R. M.; Pennington, W. F.; Flittner, D. E.; Al-Saadi, J. A.; Rosenbaum, D. M.
2017-12-01
The Geostationary Environmental Monitoring System (GEMS) and Tropospheric Emissions: Monitoring of Pollution (TEMPO) instruments will provide a new capability for understanding air quality and pollution. Ball Aerospace is the instrument developer. The GEMS and TEMPO instruments use well-proven remote sensing techniques and take advantage of geostationary orbit to take hourly measurements of the same geographical area. The high spatial and temporal resolution of these instruments will allow measurements of the complex diurnal cycle of pollution driven by the combination of photochemistry, chemical composition, and the dynamic nature of the atmosphere. The status of the manufacturing, test, and calibration efforts will be presented. The GEMS instrument is being built for the Korea Aerospace Research Institute and their customer, the National Institute of Environmental Research (NIER). The TEMPO instrument is being built for NASA under the Earth Venture Instrument (EVI) Program. NASA Langley Research Center (LaRC) is the managing center, and the Principal Investigator (PI) is Kelly Chance of the Smithsonian Astrophysical Observatory (SAO).
Flow-cytometric identification of vinegars using a multi-parameter analysis optical detection module
NASA Astrophysics Data System (ADS)
Verschooten, T.; Ottevaere, H.; Vervaeke, M.; Van Erps, J.; Callewaert, M.; De Malsche, W.; Thienpont, H.
2015-09-01
We show a proof-of-concept demonstration of a multi-parameter analysis low-cost optical detection system for the flow-cytometric identification of vinegars. This multi-parameter analysis system can simultaneously measure laser-induced fluorescence, absorption and scattering excited by two time-multiplexed lasers of different wavelengths. To our knowledge no other polymer optofluidic chip based system offers more simultaneous measurements. The design of the optofluidic channels is aimed at countering the effects that viscous fingering, air bubbles, and emulsion samples can have on the correct operation of such a detection system. Unpredictable variations in viscosity and refractive index of the channel content can be turned into a source of information. The sample is excited by two laser diodes that are driven by custom-made low-cost laser drivers. The optofluidic chip is built to be robust and easy to handle and is reproducible using hot embossing. We show a custom optomechanical holder for the optofluidic chip that ensures correct alignment and automatic connection to the external fluidic system. We show an experiment in which 92 samples of vinegar are measured. We are able to identify 9 different kinds of vinegar with an accuracy of 94%. Thus we show an alternative approach to the classic optical spectroscopy solution at a lowered cost. Furthermore, we have shown the possibility of predicting the viscosity and turbidity of vinegars with a goodness-of-fit R² over 0.947.
ERIC Educational Resources Information Center
Gibbs, Hope J.
2005-01-01
This article relates the experiences of Jeff Fischer, an instructor in the Computer Integrated Machining department at South Central College (SCC) in North Mankato, Minnesota. Facing dwindling student enrollment and possible departmental budget cuts, Fischer was able to turn his passion for custom-built cycles and the intricate machining that…
DOE Office of Scientific and Technical Information (OSTI.GOV)
none,
This LEED Platinum home was built on the site of a 60-year-old bungalow that was demolished. It boasts views of Candlewood Lake, a great deal of daylight, and projected annual energy savings of almost $3,000. This home was awarded a 2013 Housing Innovation Award in the custom builder category.
Miro, Alice; Perrotta, Kim; Evans, Heather; Kishchuk, Natalie A; Gram, Claire; Stanwick, Richard S; Swinkels, Helena M
2014-08-06
The main objective of the Healthy Canada by Design CLASP Initiative in British Columbia (BC) was to develop, implement and evaluate a capacity-building project for health authorities. The desired outcomes of the project were as follows: 1) increased capacity of the participating health authorities to productively engage in land use and transportation planning processes; 2) new and sustained relationships or collaborations among the participating health authorities and among health authorities, local governments and other built environment stakeholders; and 3) indication of health authority influence and/or application of health evidence and tools in land use and transportation plans and policies. This project was designed to enhance the capacity of three regional health authorities, namely Fraser Health, Island Health and Vancouver Coastal Health, and their staff. These were considered the project's participants. The BC regions served by the three health authorities cover the urban, suburban and rural spectrum across relatively large and diverse geographic areas. The populations have broad ranges in socio-economic status, demographic profiles and cultural and political backgrounds. The Initiative provided the three health authorities with a consultant who had several years of experience working on land use and transportation planning. The consultant conducted situational assessments to understand the baseline knowledge and skill gaps, assets and objectives for built environment work for each of the participating health authorities. On the basis of this information, the consultant developed customized capacity-building work plans for each of the health authorities and assisted them with implementation. 
Capacity-building activities were as follows: researching health and built environment strategies, policies and evidence; transferring health evidence and promising policies and practices from other jurisdictions to local planning contexts; providing training and support with regard to health and the built environment to health authority staff; bringing together public health staff with local planners for networking; and participating in land use planning processes. The project helped to expand the capacity of participating health authorities to influence land use and transportation planning decisions by increasing the content and process expertise of public health staff. The project informed structural changes within health authorities, such as staffing reallocations to advance built environment work after the project. Health authorities also forged new relationships within and across sectors, which facilitated knowledge exchange and access of the public health sector to opportunities to influence built environment decisions. By the end of the project, there was emerging evidence of a health presence in land use policy documents. The project helped to prioritize, accelerate and formalize the participating health authorities' involvement in land use and transportation planning processes. In the long term, this is expected to lead to health policies and programs that consider the built environment, and to built environment policies and practices that integrate population health goals, thereby reducing the risk of chronic diseases.
High frame-rate MR-guided near-infrared tomography system to monitor breast hemodynamics
NASA Astrophysics Data System (ADS)
Li, Zhiqiu; Jiang, Shudong; Krishnaswamy, Venkataramanan; Davis, Scott C.; Srinivasan, Subhadra; Paulsen, Keith D.; Pogue, Brian W.
2011-02-01
A near-infrared (NIR) tomography system with spectrally encoded sources at two wavelength bands was built to quantify the temporal contrast at 20 Hz bandwidth while imaging breast tissue. The NIR system was integrated with a magnetic resonance (MR) machine through a custom breast coil interface, and both NIR data and MR images were acquired simultaneously. MR images provided breast tissue structural information for the NIR reconstruction. Acquisition of a finger pulse oximeter (PO) plethysmogram was synchronized with the NIR system in the experiment to offer a frequency-locked reference. The recovered absorption coefficients of the breast at the two wavelengths showed the same temporal frequency as the PO output, proving that this multi-modality design can recover the small pulsatile variation of the absorption property in breast tissue related to the heartbeat. It also demonstrated the system's ability to perform novel contrast imaging of fast flow signals in deep tissue.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zauls, A. Jason; Ashenafi, Michael S.; Onicescu, Georgiana
2011-11-15
Purpose: To report our dosimetric results using a novel push-button seed delivery system that constructs custom links of seeds intraoperatively. Methods and Materials: From 2005 to 2007, 43 patients underwent implantation using a gun applicator (GA), and from 2007 to 2008, 48 patients underwent implantation with a novel technique allowing creation of intraoperatively built custom links of seeds (IBCL). Specific endpoint analyses were prostate D90% (pD90%), rV100% > 1.3 cc, and overall time under anesthesia. Results: Final analyses included 91 patients, 43 GA and 48 IBCL. Absolute change in pD90% (ΔpD90%) between intraoperative and postoperative plans was evaluated. Using the GA method, the ΔpD90% was -8.1 Gy and -12.8 Gy for I-125 and Pd-103 implants, respectively. Similarly, the IBCL technique resulted in a ΔpD90% of -8.7 Gy and -9.8 Gy for I-125 and Pd-103 implants, respectively. No statistically significant difference in ΔpD90% was found between the methods. The GA method had two intraoperative and 10 postoperative plans with rV100% > 1.3 cc. For IBCL, five intraoperative and eight postoperative plans had rV100% > 1.3 cc. For GA, the mean time under anesthesia was 75 min and 87 min for Pd-103 and I-125 implants, respectively. For IBCL, the mean times were 86 and 98 min for Pd-103 and I-125. There was a statistically significant difference between the methods when comparing mean time under anesthesia. Conclusions: Dosimetrically relevant endpoints were equivalent between the two methods. Currently, time under anesthesia is longer using the IBCL technique but has decreased over time. IBCL is a straightforward brachytherapy technique that can be implemented into clinical practice as an alternative to gun applicators.
A systems engineering approach to AIS accreditation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, L.M.; Hunteman, W.J.
1994-04-01
The systems engineering model provides the vehicle for communication between the developer and the customer by presenting system facts and demonstrating the system in an organized form. The same model provides implementors with views of the system's function and capability. The authors contend that the process of obtaining accreditation for a classified Automated Information System (AIS) adheres to the typical systems engineering model. The accreditation process is modeled as a "roadmap" with the customer represented by the Designated Accrediting Authority. The "roadmap" model reduces the amount of accreditation knowledge required of an AIS developer and maximizes the effectiveness of participation in the accreditation process by making the understanding of accreditation a natural consequence of applying the model. This paper identifies ten "destinations" on the "road" to accreditation. The significance of each "destination" is explained, as are the potential consequences of its exclusion. The "roadmap," which has been applied to a range of information systems throughout the DOE community, establishes a paradigm for the certification and accreditation of classified AISs.
Using Additive Manufacturing to Print a CubeSat Propulsion System
NASA Technical Reports Server (NTRS)
Marshall, William M.
2015-01-01
CubeSats are increasingly being utilized for missions traditionally ascribed to larger satellites. A CubeSat unit (1U) is defined as 10 cm x 10 cm x 11 cm, and spacecraft have been built up to 6U sizes. CubeSats are typically built up from commercially available off-the-shelf components, but have limited capabilities. By using additive manufacturing, mission-specific capabilities (such as propulsion) can be built into a system. This effort is part of the STMD Small Satellite program Printing the Complete CubeSat. Interest in propulsion concepts for CubeSats is rapidly growing, and numerous concepts exist for CubeSat-scale propulsion. The focus of this effort is how to incorporate propulsion into the structure using additive manufacturing. The end use of the propulsion system dictates which type of system to develop: a pulse-mode RCS would require a different system than a delta-V orbital maneuvering system. The team chose an RCS system based on available propulsion systems and the feasibility of printing using a material extrusion process. A cold-gas propulsion system for RCS applications was initially investigated, but the material extrusion process did not permit adequate sealing of the part to make this a functional approach.
Investigation of transient dynamics of capillary assisted particle assembly yield
NASA Astrophysics Data System (ADS)
Virganavičius, D.; Juodėnas, M.; Tamulevičius, T.; Schift, H.; Tamulevičius, S.
2017-06-01
In this paper, the transient behavior of the particle assembly yield dynamics when switching from low-yield to high-yield deposition at different velocity and thermal regimes is investigated. Capillary force assisted particle assembly (CAPA) using a colloidal suspension of green fluorescent 270 nm diameter polystyrene beads was performed on patterned poly(dimethylsiloxane) substrates using a custom-built deposition setup. Two types of patterns with different trapping site densities were used to assess CAPA process dynamics and the influence of pattern density and geometry on the deposition yield transitions. Closely packed 300 nm diameter circular pits ordered in a hexagonal arrangement with 300 nm pitch, and 2 × 2 mm² square pits with 2 μm spacing were used. 2-D regular structures of the deposited particles were investigated by means of optical fluorescence and scanning electron microscopy. The fluorescence micrographs were analyzed using a custom algorithm enabling identification of particles and calculation of the deposition efficiency at different regimes. The relationship between the spatial distribution of particles in the transition zone and the ambient conditions was evaluated and quantified by approximating the yield profile with a logistic function.
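The logistic approximation of a yield profile can be sketched as follows. This is a minimal illustration with assumed parameter values, not the paper's fitted coefficients: the deposition yield across the transition zone is modeled as y(x) = L / (1 + exp(-k(x - x0))).

```python
import math

# Logistic model of the assembly-yield transition (illustrative
# parameters; the study fits such a curve to measured yield profiles).
def logistic_yield(x, max_yield=1.0, steepness=2.0, midpoint=0.0):
    """Yield as a function of position x across the transition zone."""
    return max_yield / (1.0 + math.exp(-steepness * (x - midpoint)))

# Far below the midpoint the yield is near 0, far above it is near
# max_yield, and exactly at the midpoint it is half of max_yield.
low = logistic_yield(-10.0)
mid = logistic_yield(0.0)
high = logistic_yield(10.0)
```

The steepness parameter k quantifies how sharply the deposition switches from the low-yield to the high-yield regime, which is exactly the transient behavior the study characterizes.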
Scotti, Dennis J; Harmon, Joel; Behson, Scott J
2009-01-01
This study assesses the importance of customer-contact intensity at the service encounter level as a determinant of service quality assessments. Using data from the U.S. Department of Veterans Affairs, it shows that performance-driven human resources practices play an important role as determinants of employee customer orientation and service capability in both high-contact (outpatient healthcare) and low-contact (benefits claim processing) human service contexts. However, there existed significant differences across service delivery settings in the salience of customer orientation and the congruence between employee and customer perceptions of service quality, depending on the intensity of customer contact. In both contexts, managerial attention to high-performance work systems and customer-orientation has the potential to favorably impact perceptions of service quality, amplify consumer satisfaction, and enhance operational efficiency.
NASA Astrophysics Data System (ADS)
Rose, K.; Rowan, C.; Rager, D.; Dehlin, M.; Baker, D. V.; McIntyre, D.
2015-12-01
Multi-organizational research teams working jointly on projects often encounter problems with discovery, access to relevant existing resources, and data sharing due to large file sizes, inappropriate file formats, or other inefficient options that make collaboration difficult. The Energy Data eXchange (EDX) from the Department of Energy's (DOE) National Energy Technology Laboratory (NETL) is an evolving online research environment designed to overcome these challenges in support of DOE's fossil energy goals while offering improved access to data-driven products of fossil energy R&D such as datasets, tools, and web applications. In 2011, development of NETL's Energy Data eXchange (EDX) was initiated; it offers i) a means for better preserving NETL's research and development products for future access and re-use, ii) efficient, discoverable access to authoritative, relevant, external resources, and iii) an improved approach and tools to support secure, private collaboration and coordination between multi-organizational teams to meet DOE mission and goals. EDX presently supports fossil energy and SubTER Crosscut research activities, with an ever-growing user base. EDX is built on a heavily customized instance of the open source platform, Comprehensive Knowledge Archive Network (CKAN). EDX connects users to externally relevant data and tools by connecting to external data repositories built on different platforms and other CKAN platforms (e.g., Data.gov). EDX does not download and repost data or tools that already have an online presence, since reposting leads to redundancy and even error. If a relevant resource already has an online instance hosted by another entity, EDX points users to that external host using web services, inventoried URLs, and other methods. EDX also offers users private, secure collaboration capabilities custom built into the system.
The team is presently working on version 3 of EDX which will incorporate big data analytical capabilities amongst other advanced features.
Intelligent diagnosis and prescription for a customized physical fitness and healthcare system.
Huang, Chung-Chi; Liu, Hsiao-Man; Huang, Chung-Lin
2015-01-01
With the advent of the era of global high-tech industry and commerce and its associated sedentary lifestyle, opportunities for physical activity are reduced. People's physical fitness and health is deteriorating. Therefore, it is necessary to develop a system that can enhance people's physical fitness and health. However, it is difficult for general physical fitness and healthcare systems to meet individualized needs. The main purpose of this research is to develop a method of intelligent diagnosis and prescription for a customized physical fitness and healthcare system. The proposed system records all processes of the physical fitness and healthcare system via a wireless sensor network and the results of the diagnosis and prescription will be generated by fuzzy logic inference. It will improve individualized physical fitness and healthcare. Finally, we demonstrate the advantages of intelligent diagnosis and prescription for a customized physical fitness and healthcare system.
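As a rough illustration of the kind of fuzzy logic inference such a system might apply (the membership functions, rules, and variable names below are assumptions for illustration, not the paper's actual rule base): a heart-rate reading is fuzzified into "low" and "high" sets, and two rules are combined by weighted-average (centroid-style) defuzzification into an exercise-intensity prescription.

```python
# Minimal fuzzy inference sketch (illustrative membership functions and
# rules only -- not the actual rule base of the proposed system).
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def prescribe_intensity(heart_rate):
    """Map a resting heart rate to an exercise-intensity score in [0, 1]."""
    low = tri(heart_rate, 40, 60, 80)    # rule: low HR -> higher intensity (0.8)
    high = tri(heart_rate, 70, 90, 110)  # rule: high HR -> lower intensity (0.3)
    # Weighted-average (centroid-style) defuzzification over the two rules.
    num = low * 0.8 + high * 0.3
    den = low + high
    return num / den if den else 0.5     # neutral default outside both sets

score_fit = prescribe_intensity(60)    # fully "low" membership
score_tired = prescribe_intensity(90)  # fully "high" membership
```

Readings between the two sets yield intermediate prescriptions, which is the graded, individualized behavior that motivates fuzzy inference over crisp thresholds.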
Towards a Tropical Pacific Observing System for 2020 and Beyond.
NASA Astrophysics Data System (ADS)
Hill, K. L.; Kessler, W. S.; Smith, N.
2016-02-01
The international TPOS 2020 Project arose out of a review workshop in January 2014, following challenges sustaining the TAO/TRITON array in 2012, with the aim of rethinking the tropical Pacific arrays in light of new scientific understanding and new ocean technology since the original design in the 1980s-90s. Observing and understanding ENSO remains a fundamental motivation, extending to biogeochemical phenomena, to processes on smaller scales that rectify into the low frequency, and to the interaction of the coupled boundary layers of the upper ocean and lower atmosphere. Our primary customers remain the operational prediction centers, and we will design an array to support research into physical processes, especially those not well represented in current-generation models. Current-generation forecast systems (data assimilation and model physics) do not make effective enough use of observations; thus the modeling centers are well represented in the TPOS 2020 structure, and our sampling is targeted to where the forecast systems need guidance for improvement. While we advocate evolution of the present arrays, the long climate records built up at mooring sites, repeated ship surveys, and island stations are fundamental to detecting and diagnosing natural climate variability and detecting climate change signatures. Task teams have been established in specific topic areas. These will report in mid-2016, when a plan for the revised arrays will be presented to the agencies and governments, for completion of the evolution by 2020. This presentation will discuss the motivation, guiding principles, and potential changes of direction for the tropical Pacific observing system.
Life Cycle Analysis of Dedicated Nano-Launch Technologies
NASA Technical Reports Server (NTRS)
Zapata, Edgar; McCleskey, Carey; Martin, John; Lepsch, Roger; Hernani, Tosoc
2014-01-01
Recent technology advancements have enabled the development of small, cheap satellites that can perform useful functions in the space environment. Currently, the only low-cost option for getting these payloads into orbit is through ride share programs. As a result, these launch opportunities await primary payload launches and a backlog exists. An alternative option would be dedicated nano-launch systems built and operated to provide more flexible launch services, higher availability, and affordable prices. The potential customer base that would drive requirements or support a business case includes commercial, academia, civil government and defense. Further, NASA technology investments could enable these alternative game-changing options. With this context, in 2013 the Game Changing Development (GCD) program funded a NASA team to investigate the feasibility of dedicated nano-satellite launch systems with a recurring cost of less than $2 million per launch for a 5 kg payload to low Earth orbit. The team products would include potential concepts, technologies and factors for enabling the ambitious cost goal, exploring the nature of the goal itself, and informing the GCD program technology investment decision making process. This paper provides an overview of the life cycle analysis effort that was conducted in 2013 by an inter-center NASA team. This effort included the development of reference nano-launch system concepts, developing analysis processes and models, establishing a basis for cost estimates (development, manufacturing and launch) suitable to the scale of the systems, and especially, understanding the relationship of potential game-changing technologies to life cycle costs, as well as other factors, such as flights per year.
ERIC Educational Resources Information Center
Carlson, Scott
2008-01-01
This article features the Cliffs Cottage, a "showcase home" at Furman University which demonstrates the use of green technology in residential building and teaches about sustainability. Custom-built for the shelter-magazine dreams of "Southern Living," a sponsor of the home, the house seems better suited for a tony subdivision.…
Trust and Relationship Building in Electronic Commerce.
ERIC Educational Resources Information Center
Papadopoulou, Panagiota; Andreou, Andreas; Kanellis, Panagiotis; Martakos, Drakoulis
2001-01-01
Discussion of the need for trust in electronic commerce to build customer relationships focuses on a model drawn from established theoretical work on trust and relationship marketing that highlights differences between traditional and electronic commerce. Considers how trust can be built into virtual environments. (Contains 50 references.)…
Hourd, Paul; Medcalf, Nicholas; Segal, Joel; Williams, David J
2015-01-01
Computer-aided 3D printing approaches to the industrial production of customized 3D functional living constructs for restoration of tissue and organ function face significant regulatory challenges. Using the manufacture of a customized, 3D-bioprinted nasal implant as a well-informed but hypothetical exemplar, we examine how these products might be regulated. Existing EU and USA regulatory frameworks do not account for the differences between 3D printing and conventional manufacturing methods or the ability to create individual customized products using mechanized rather than craft approaches. Already subject to extensive regulatory control, issues related to control of the computer-aided design to manufacture process and the associated software system chain present additional scientific and regulatory challenges for manufacturers of these complex 3D-bioprinted advanced combination products.
Inhibition of Oncogenic functionality of STAT3 Protein by Membrane Anchoring
NASA Astrophysics Data System (ADS)
Liu, Baoxu; Fletcher, Steven; Gunning, Patrick; Gradinaru, Claudiu
2009-03-01
Signal Transducer and Activator of Transcription 3 (STAT3) protein plays an important role in oncogenic processes. A novel molecular therapeutic approach to inhibit the oncogenic functionality of STAT3 is to design a prenylated small peptide sequence which could sequester STAT3 to the plasma membrane. We have also developed a novel fluorescein derivative label (F-NAc), which is much more photostable compared to the popular fluorescein label FITC. Remarkably, the new dye shows fluorescent properties that are invariant over a wide pH range, which is advantageous for our application. We have shown that F-NAc is suitable for single-molecule measurements and its properties are not affected by ligation to biomolecules. The membrane localization via high-affinity prenylated small-molecule binding agents is studied by encapsulating FNAc-labeled STAT3 and inhibitors within a liposome model cell system. The dynamics of the interaction between the protein and the prenylated ligands is investigated at single molecule level. The efficiency and stability of the STAT3 anchoring in lipid membranes are addressed via quantitative confocal imaging and single-molecule spectroscopy using a custom-built multiparameter fluorescence microscope.
Stolyarov, Alexander M; Gumennik, Alexander; McDaniel, William; Shapira, Ofer; Schell, Brent; Sorin, Fabien; Kuriki, Ken; Benoit, Gilles; Rose, Aimee; Joannopoulos, John D; Fink, Yoel
2012-05-21
We demonstrate an in-fiber gas phase chemical detection architecture in which a chemiluminescent (CL) reaction is spatially and spectrally matched to the core modes of hollow photonic bandgap (PBG) fibers in order to enhance detection efficiency. A peroxide-sensitive CL material is annularly shaped and centered within the fiber's hollow core, thereby increasing the overlap between the emission intensity and the intensity distribution of the low-loss fiber modes. This configuration improves the sensitivity by 0.9 dB/cm compared to coating the material directly on the inner fiber surface, where coupling to both higher loss core modes and cladding modes is enhanced. By integrating the former configuration with a custom-built optofluidic system designed for concomitant controlled vapor delivery and emission measurement, we achieve a limit-of-detection of 100 parts per billion (ppb) for hydrogen peroxide vapor. The PBG fibers are produced by a new fabrication method whereby external gas pressure is used as a control knob to actively tune the transmission bandgaps through the entire visible range during the thermal drawing process.
Additive manufacturing of microfluidic glass chips
NASA Astrophysics Data System (ADS)
Kotz, F.; Helmer, D.; Rapp, B. E.
2018-02-01
Additive manufacturing has gained great interest in the microfluidic community due to the numerous channel designs which can be tested in the early phases of lab-on-a-chip device development. High-resolution additive manufacturing like microstereolithography is largely associated with polymers. Polymers are at a disadvantage compared to other materials due to their softness and low chemical resistance. Whenever high chemical and thermal resistance combined with high optical transparency is needed, glasses become the material of choice. However, glasses are difficult to structure at the microscale, requiring hazardous chemicals for etching processes. In this work we present additive manufacturing and high-resolution patterning of microfluidic chips in transparent fused silica glass using stereolithography and microlithography. We print an amorphous silica nanocomposite at room temperature using benchtop stereolithography printers and a custom-built microlithography system based on a digital mirror device. Using microlithography we printed structures with a resolution of tens of microns. The printed part is then converted to transparent fused silica glass using thermal debinding and sintering. Printing of a microfluidic chip can be done within 30 minutes. The heat treatment can be done within two days.
42 CFR 423.128 - Dissemination of Part D plan information.
Code of Federal Regulations, 2011 CFR
2011-10-01
... plan sponsor's toll free customer service line or by accessing the plan sponsor's internet Web site. (8... redetermination processes via an Internet Web site; and (iii) A system that transmits codes to network pharmacies...— (1) A toll-free customer call center that— (i) Is open during usual business hours. (ii) Provides...
Using data mining to build a customer-focused organization.
Okasha, A
1999-08-01
Data mining is a new buzz word in managed care. More than simply a method of unlocking a vault of useful information in MCO data banks and warehouses, the author believes that it can help steer an organization through the reengineering process, leading to the health system's transformation toward a customer-focused organization.
Muroff, Jordana; Amodeo, Maryann; Larson, Mary Jo; Carey, Margaret; Loftin, Ralph D
2011-01-01
This article describes a data management system (DMS) developed to support a large-scale randomized study of an innovative web-course that was designed to improve substance abuse counselors' knowledge and skills in applying a substance abuse treatment method (i.e., cognitive behavioral therapy; CBT). The randomized trial compared the performance of web-course-trained participants (intervention group) and printed-manual-trained participants (comparison group) to determine the effectiveness of the web-course in teaching CBT skills. A single DMS was needed to support all aspects of the study: web-course delivery and management, as well as randomized trial management. The authors briefly reviewed several other systems that were described as built either to handle randomized trials or to deliver and evaluate web-based training. However it was clear that these systems fell short of meeting our needs for simultaneous, coordinated management of the web-course and the randomized trial. New England Research Institute's (NERI) proprietary Advanced Data Entry and Protocol Tracking (ADEPT) system was coupled with the web-programmed course and customized for our purposes. This article highlights the requirements for a DMS that operates at the intersection of web-based course management systems and randomized clinical trial systems, and the extent to which the coupled, customized ADEPT satisfied those requirements. Recommendations are included for institutions and individuals considering conducting randomized trials and web-based training programs, and seeking a DMS that can meet similar requirements.
Technical Leadership Development Program - Year 2
2012-02-01
Syllabus excerpt: Lecture: Why Projects Fail (Pennotti); Project: AR2D2 RFP (Robinson). Segment 12, Customer Expectation-1: Lecture: Why Systems Fail; Segment 13, Customer Expectation-2: Case Study: Process Automation. Syllabus Segment 12: Why Systems Fail (Lecture), 1.5 hours; responsible: Mike Pennotti. Contract Number: H98230-08-D-0171 DO 02.
Engineering specification and system design for CAD/CAM of custom shoes: UMC project effort
NASA Technical Reports Server (NTRS)
Bao, Han P.
1991-01-01
The goal of this project is to supplement the footwear design system of North Carolina State University (NCSU) with a software module to design and manufacture a combination sole. The four areas of concentration were: customization of NASCAD (NASA Computer Aided Design) to the footwear project; use of CENCIT data; computer aided manufacturing activities; and beginning work for the bottom elements of shoes. The task of generating a software module for producing a sole was completed with a demonstrated product realization. The software written in C was delivered to NCSU for inclusion in their design system for custom footwear known as LASTMOD. The machining process of the shoe last was improved using a spiral tool path approach.
The General Mission Analysis Tool (GMAT): Current Features And Adding Custom Functionality
NASA Technical Reports Server (NTRS)
Conway, Darrel J.; Hughes, Steven P.
2010-01-01
The General Mission Analysis Tool (GMAT) is a software system for trajectory optimization, mission analysis, trajectory estimation, and prediction developed by NASA, the Air Force Research Lab, and private industry. GMAT's design and implementation are based on four basic principles: open source visibility for both the source code and design documentation; platform independence; modular design; and user extensibility. The system, released under the NASA Open Source Agreement, runs on Windows, Mac and Linux. User extensions, loaded at run time, have been built for optimization, trajectory visualization, force model extension, and estimation, by parties outside of GMAT's development group. The system has been used to optimize maneuvers for the Lunar Crater Observation and Sensing Satellite (LCROSS) and ARTEMIS missions and is being used for formation design and analysis for the Magnetospheric Multiscale Mission (MMS).
Automatic user customization for improving the performance of a self-paced brain interface system.
Fatourechi, Mehrdad; Bashashati, Ali; Birch, Gary E; Ward, Rabab K
2006-12-01
Customizing the parameter values of brain interface (BI) systems by a human expert has the advantage of being fast and computationally efficient. However, as the number of users and EEG channels grows, this process becomes increasingly time consuming and exhausting. Manual customization also introduces inaccuracies in the estimation of the parameter values. In this paper, the performance of a self-paced BI system whose design parameter values were automatically user customized using a genetic algorithm (GA) is studied. The GA automatically estimates the shapes of movement-related potentials (MRPs), whose features are then extracted to drive the BI. Offline analysis of the data of eight subjects revealed that automatic user customization improved the true positive (TP) rate of the system by an average of 6.68% over that whose customization was carried out by a human expert, i.e., by visually inspecting the MRP templates. On average, the best improvement in the TP rate (an average of 9.82%) was achieved for four individuals with spinal cord injury. In this case, the visual estimation of the parameter values of the MRP templates was very difficult because of the highly noisy nature of the EEG signals. For four able-bodied subjects, for whom the MRP templates were less noisy, the automatic user customization led to an average improvement of 3.58% in the TP rate. The results also show that the inter-subject variability of the TP rate is reduced compared to the case when user customization is carried out by a human expert. These findings provide preliminary evidence that automatic user customization leads to beneficial results in the design of a self-paced BI for individuals with spinal cord injury.
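The paper's GA tunes MRP-template parameters against real EEG data; as a generic illustration of GA-based parameter customization, here is a minimal sketch. The fitness function is a toy surrogate (a quadratic peak at a made-up target parameter pair), not an actual TP-rate computation, and all names are illustrative:

```python
import random

def genetic_search(fitness, bounds, pop_size=20, generations=40, seed=0):
    """Minimal real-valued GA: keep the best half, breed the rest by
    blend crossover plus clipped Gaussian mutation."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        elite = sorted(pop, key=fitness, reverse=True)[: pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]          # blend crossover
            child = [min(hi, max(lo, g + rng.gauss(0, 0.1 * (hi - lo))))
                     for g, (lo, hi) in zip(child, bounds)]      # mutate, clip
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

# Toy surrogate for a TP-rate objective: peaks when the two hypothetical
# parameters (say, an amplitude and a latency) hit a made-up target.
target = (1.2, 0.35)
def tp_rate(params):
    return -sum((p - t) ** 2 for p, t in zip(params, target))

bounds = [(0.0, 3.0), (0.0, 1.0)]
best = genetic_search(tp_rate, bounds)
```

Because the elite half survives unchanged each generation, the best candidate found never regresses, which is the property that makes such automatic tuning reliable enough to replace visual inspection.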
Brand strengthening decision making delved from brand-contacts in health services organizations.
Takayanagi, Kazue; Hagihara, Yukiko
2007-01-01
Under the Japanese Government's strong push to reduce national medical costs, only hospitals that emphasize patient values, and build brands around them, can survive. This study extracted patients' expectations as brand elements from Campbell's Brand-Contact lists. The authors also proposed brand-strengthening strategies, both short-term (where no large improvement is required) and long-term (restructuring hardware and systems). This method would enable hospitals to collect customers' underlying expectations and create high-value brands. Trustworthy medical service would yield mutual, synergistic care effects. Conducting qualitative patient-satisfaction interviews about current medical services with current customers is already considered out of date; the only way to survive is for hospitals to create their own original brands that increase patient loyalty and customer satisfaction. In the process, customer value should be reconsidered from both the quality of clinical care and that of other medically related services. Hospitals would then be able to satisfy both customers' output expectations and their process expectations.
Embedded Web Technology: Applying World Wide Web Standards to Embedded Systems
NASA Technical Reports Server (NTRS)
Ponyik, Joseph G.; York, David W.
2002-01-01
Embedded Systems have traditionally been developed in a highly customized manner. The user interface hardware and software along with the interface to the embedded system are typically unique to the system for which they are built, resulting in extra cost to the system in terms of development time and maintenance effort. World Wide Web standards have been developed in the past ten years with the goal of allowing servers and clients to interoperate seamlessly. The client and server systems can consist of differing hardware and software platforms but the World Wide Web standards allow them to interface without knowing about the details of the system at the other end of the interface. Embedded Web Technology is the merging of Embedded Systems with the World Wide Web. Embedded Web Technology decreases the cost of developing and maintaining the user interface by allowing the user to interface to the embedded system through a web browser running on a standard personal computer. Embedded Web Technology can also be used to simplify an Embedded System's internal network.
Goals Analysis Procedure Guidelines for Applying the Goals Analysis Process
NASA Technical Reports Server (NTRS)
Motley, Albert E., III
2000-01-01
One of the key elements to successful project management is the establishment of the "right set of requirements", requirements that reflect the true customer needs and are consistent with the strategic goals and objectives of the participating organizations. A viable set of requirements implies that each individual requirement is a necessary element in satisfying the stated goals and that the entire set of requirements, taken as a whole, is sufficient to satisfy the stated goals. Unfortunately, it is the author's experience that during project formulation phases, many of the Systems Engineering customers do not conduct a rigorous analysis of the goals and objectives that drive the system requirements. As a result, the Systems Engineer is often provided with requirements that are vague, incomplete, and internally inconsistent. To complicate matters, most systems development methodologies assume that the customer provides unambiguous, comprehensive and concise requirements. This paper describes the specific steps of a Goals Analysis process applied by Systems Engineers at the NASA Langley Research Center during the formulation of requirements for research projects. The objective of Goals Analysis is to identify and explore all of the influencing factors that ultimately drive the system's requirements.
Integrating MRP (materiel requirements planning) into modern business.
Lunn, T
1994-05-01
Time is the commodity of the '90s. Therefore, we all must learn how to use our manufacturing systems to shorten lead time and increase customer satisfaction. The objective of this article is to discuss practical ways people integrate the techniques of materiel requirements planning (MRP) systems with just-in-time (JIT) execution systems to increase customer satisfaction. Included are examples of new ways people use MRP systems to exemplify the process of continuous improvement--multiple items on work orders, consolidated routings, flexing capacity, and other new developments. Ways that successful companies use MRP II for planning and JIT for execution are discussed. There are many examples of how to apply theory to real life situations and a discussion of techniques that work to keep companies in the mode of continuous improvement. Also included is a look at hands-on, practical methods people use to achieve lead time reduction and simplify bills of material. Total quality management concepts can be applied to the MRP process itself. This in turn helps people improve schedule adherence, which leads to customer satisfaction.
Preparing the Direct Broadcast Community for GOES-R
NASA Astrophysics Data System (ADS)
Dubey, K. F.; Baptiste, E.; Prasad, K.; Shin, H.
2012-12-01
The first satellite in the United States' next-generation weather satellite program, GOES-R, will be launched in 2015. SeaSpace Corporation is using its recent experience and lessons learned from bringing Suomi NPP-capable direct reception systems online to similarly bring direct reception solutions to future GOES-R users. This includes earlier outreach to customers, due to the advance budgeting deadline for procurement in many agencies. With the cancellation of eGRB, all current GOES GVAR customers will need a new direct readout system, with a new receiver, high-powered processing subsystem, and a larger antenna in some locations. SeaSpace's preparations have also included communicating with program leaders in NOAA and NASA regarding direct readout specifications and the development of the borrowing process for the government-procured GRB emulator. At the request of NASA, SeaSpace has offered input towards the emulator check-out process, which is expected to begin in spring 2013. After the launch of Suomi NPP, SeaSpace found a need by non-traditional customers (such as customers with non-SeaSpace ground stations or those getting data via the NOAA archive) for a processing-only subsystem. In response to this need, SeaSpace developed such a solution for Suomi NPP users, and plans to do the same for GOES-R. This presentation will cover the steps that SeaSpace is undertaking to prepare the members of the direct reception community for reception and processing of GOES-R satellite data, and detail the solutions offered.
EvolView, an online tool for visualizing, annotating and managing phylogenetic trees.
Zhang, Huangkai; Gao, Shenghan; Lercher, Martin J; Hu, Songnian; Chen, Wei-Hua
2012-07-01
EvolView is a web application for visualizing, annotating and managing phylogenetic trees. First, EvolView is a phylogenetic tree viewer and customization tool; it visualizes trees in various formats, customizes them through built-in functions that can link information from external datasets, and exports the customized results to publication-ready figures. Second, EvolView is a tree and dataset management tool: users can easily organize related trees into distinct projects, add new datasets to trees and edit and manage existing trees and datasets. To make EvolView easy to use, it is equipped with an intuitive user interface. With a free account, users can save data and manipulations on the EvolView server. EvolView is freely available at: http://www.evolgenius.info/evolview.html.
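A tree viewer such as EvolView must first parse the trees it displays. A common interchange format is Newick; the following is a minimal sketch of a recursive-descent parser for a Newick subset (node names and nesting only, no branch lengths or quoting), unrelated to EvolView's actual implementation:

```python
def parse_newick(text):
    """Recursive-descent parser for a Newick subset.
    Returns nested (name, children) tuples."""
    pos = 0
    def node():
        nonlocal pos
        children = []
        if text[pos] == "(":
            pos += 1
            children.append(node())
            while text[pos] == ",":
                pos += 1
                children.append(node())
            assert text[pos] == ")", "unbalanced parentheses"
            pos += 1
        start = pos
        while pos < len(text) and text[pos] not in "(),;":
            pos += 1                      # consume the node label, if any
        return (text[start:pos], children)
    tree = node()
    assert text[pos] == ";", "missing trailing semicolon"
    return tree

tree = parse_newick("((A,B)ab,C);")
# -> ('', [('ab', [('A', []), ('B', [])]), ('C', [])])
```

Once a tree is in this nested form, annotation datasets can be joined onto it by matching leaf names, which is essentially what "built-in functions that can link information from external datasets" implies.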
Using Additive Manufacturing to Print a CubeSat Propulsion System
NASA Technical Reports Server (NTRS)
Marshall, William M.; Zemba, Michael; Shemelya, Corey; Wicker, Ryan; Espalin, David; MacDonald, Eric; Keif, Craig; Kwas, Andrew
2015-01-01
Small satellites, such as CubeSats, are increasingly being called upon to perform missions traditionally ascribed to larger satellite systems. However, the market of components and hardware for small satellites, particularly CubeSats, still falls short of providing the necessary capabilities required by ever increasing mission demands. One way to overcome this shortfall is to develop the ability to customize every build. By utilizing fabrication methods such as additive manufacturing, mission specific capabilities can be built into a system, or into the structure, that commercial off-the-shelf components may not be able to provide. A partnership between the University of Texas at El Paso, COSMIAC at the University of New Mexico, Northrop Grumman, and the NASA Glenn Research Center is looking into using additive manufacturing techniques to build a complete CubeSat, under the Small Spacecraft Technology Program. The W. M. Keck Center at the University of Texas at El Paso has previously demonstrated the ability to embed electronics and wires into the additively manufactured structures. Using this technique, features such as antennas and propulsion systems can be included into the CubeSat structural body. Of interest to this paper, the team is investigating the ability to take a commercial micro pulsed plasma thruster and embed it into the printing process. Tests demonstrating the dielectric strength of the printed material and proof-of-concept demonstration of the printed thruster will be shown.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pharhizgar, K.D.; Lunce, S.E.
1994-12-31
Development of knowledge-based technological acquisition techniques and customers' information profiles are known as assimilative integrated discovery systems (AIDS) in modern organizations. These systems have access through processing to both deep and broad domains of information in modern societies. Through these systems organizations and individuals can predict future trend probabilities and events concerning their customers. AIDSs are new techniques which produce new information which informants can use without the help of the knowledge sources because of the existence of highly sophisticated computerized networks. This paper has analyzed the danger and side effects of misuse of information through the illegal, unethical and immoral access to the database in an integrated and assimilative information system as described above. Cognivistic mapping, pragmatistic informational design gathering, and holistic classifiable and distributive techniques are potentially abusive systems whose outputs can be easily misused by businesses when researching the firm's customers.
The As-Cu-Ni System: A Chemical Thermodynamic Model for Ancient Recycling
NASA Astrophysics Data System (ADS)
Sabatini, Benjamin J.
2015-12-01
This article is the first thermodynamically reasoned ancient metal system assessment intended for use by archaeologists and archaeometallurgists to aid in the interpretation of remelted/recycled copper alloys composed of arsenic and copper, and arsenic, copper, and nickel. These models are meant to fulfill two main purposes: first, to be applied toward the identification of progressive and regressive temporal changes in artifact chemistry that would have occurred due to recycling, and second, to provide thermodynamic insight into why such metal combinations existed in antiquity. Built on well-established thermodynamics, these models were created using a combination of custom-written software and published binary thermodynamic systems data adjusted to within the boundary conditions of 1200°C and 1 atm. Using these parameters, the behavior of each element and its likelihood of loss in the binary As-Cu, As-Ni, and Cu-Ni systems and the ternary As-Cu-Ni system, under assumed ancient furnace conditions, was determined.
Software for minimalistic data management in large camera trap studies
Krishnappa, Yathin S.; Turner, Wendy C.
2014-01-01
The use of camera traps is now widespread and their importance in wildlife studies well understood. Camera trap studies can produce millions of photographs and there is a need for software to help manage photographs efficiently. In this paper, we describe a software system that was built to successfully manage a large behavioral camera trap study that produced more than a million photographs. We describe the software architecture and the design decisions that shaped the evolution of the program over the study’s three year period. The software system has the ability to automatically extract metadata from images, and add customized metadata to the images in a standardized format. The software system can be installed as a standalone application on popular operating systems. It is minimalistic, scalable and extendable so that it can be used by small teams or individual researchers for a broad variety of camera trap studies. PMID:25110471
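The system described "automatically extract[s] metadata from images, and add[s] customized metadata to the images in a standardized format." A minimal sketch of that idea follows; the field names ('station', 'species') and the input shape are illustrative assumptions, not the paper's actual schema:

```python
import datetime

def standardize(raw_records):
    """Normalize heterogeneous per-photo records into one standardized
    schema, sorted by capture time."""
    out = []
    for r in raw_records:
        # EXIF stores capture time as 'YYYY:MM:DD HH:MM:SS'
        ts = datetime.datetime.strptime(r["DateTimeOriginal"], "%Y:%m:%d %H:%M:%S")
        out.append({
            "station": r.get("station", "unknown"),
            "timestamp": ts.isoformat(),
            "species": r.get("species", ""),
        })
    return sorted(out, key=lambda rec: rec["timestamp"])

photos = [
    {"DateTimeOriginal": "2013:07:02 18:30:01", "station": "S2", "species": "zebra"},
    {"DateTimeOriginal": "2013:07:01 06:12:45", "station": "S1"},
]
records = standardize(photos)
```

Pushing every photo through one normalization step like this is what makes a million-image archive queryable by small teams later, regardless of which cameras produced the files.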
Recent trends in print portals and Web2Print applications
NASA Astrophysics Data System (ADS)
Tuijn, Chris
2009-01-01
For quite some time now, the printing business has been under heavy pressure because of overcapacity, dropping prices and the delocalization of the production to low income countries. To survive in this competitive world, printers have to invest in tools that, on one hand, reduce the production costs and, on the other hand, create additional value for their customers (print buyers). The creation of customer portals on top of prepress production systems allowing print buyers to upload their content, approve the uploaded pages based on soft proofs (rendered by the underlying production system) and further follow up the generation of the printed material, has been illustrative in this respect. These developments resulted in both automation for the printer and added value for the print buyer. Many traditional customer portals assume that the printed products have been identified before they are presented to the print buyer in the portal environment. The products are, in this case, typically entered by the printing organization in a so-called MIS system after the official purchase order has been received from the print buyer. Afterwards, the MIS system then submits the product to the customer portal. Some portals, however, also support the initiation of printed products by the print buyer directly. This workflow creates additional flexibility but also makes things much more complex. We here have to distinguish between special products that are defined ad-hoc by the print buyer and standardized products that are typically selected out of catalogs. Special products are most of the time defined once and the level of detail required in terms of production parameters is quite high. Systems that support such products typically have a built-in estimation module, or, at least, a direct connection to an MIS system that calculates the prices and adds a specific mark-up to calculate a quote.
Often, the markup is added by an account manager on a customer by customer basis; in this case, the ordering process is, of course, not fully automated. Standardized products, on the other hand, are easily identified and the cost charged to the print buyer can be retrieved from predefined price lists. Typically, higher volumes will result in more attractive prices. An additional advantage of this type of products is that they are often defined such that they can be produced in bulk using conventional printing techniques. If one wants to automate the ganging, a connection must be established between the on-line ordering and the production planning system. (For digital printing, there typically is no need to gang products since they can be produced more effectively separately.) Many of the on-line print solutions support additional features also available in general purpose e-commerce sites. We here think of the availability of virtual shopping baskets, the connectivity with payment gateways and the support of special facilities for interfacing with courier services (bar codes, connectivity to courier web sites for tracking shipments etc.). Supporting these features also assumes an intimate link with the print production system. Another development that goes beyond the on-line ordering of printed material and the submission of full pages and/or documents, is the interactive, on-line definition of the content itself. Typical applications in this respect are, e.g., the creation of business cards, leaflets, letter heads etc. On a more professional level, we also see that more and more publishing organizations start using on-line publishing platforms to organize their work. These professional platforms can also be connected directly to printing portals and thus enable extra automation. 
In this paper, we will discuss for each of the different applications presented above (traditional Print Portals, Web2Print applications and professional, on-line publishing platforms) how they interact with prepress and print production systems and how they contribute to the improvement of the overall operations of a printing organization.
26 CFR 1.263A-10 - Unit of property.
Code of Federal Regulations, 2010 CFR
2010-04-01
... property may be treated as not included in the accumulated production expenditures for the unit starting.... B, an individual, is in the trade or business of constructing custom-built houses for sale. B owns a... the accumulated production expenditures of the house unit starting with the first measurement period...
Bus transit operational efficiency resulting from passenger boardings at park-and-ride facilities.
DOT National Transportation Integrated Search
2016-08-01
In order to save time and money by not driving to an ultimate destination, some urban commuters drive themselves a few miles to specially designated parking lots built for transit customers and located where trains or buses stop. The focus of this pa...
Dependency Tree Annotation Software
2015-11-01
formats, and it provides numerous options for customizing how dependency trees are displayed. Built entirely in Java, it can run on a wide range of...tree can be saved as an image, .mxe (a mxGraph editing file), a .conll file, and several other file formats. DTE uses the open source Java version
You catch more flies with sugar...marketing RIM
DOE Office of Scientific and Technical Information (OSTI.GOV)
KEENEN,MARTHA JANE
There is a difference between marketing and selling. Marketing is finding out what the customer wants and/or needs and showing that customer how a product meets those needs. Modifying or repackaging the product may be required to make its utility clear to the customer. When it is, they'll buy because they, on their own, want it. Selling is pushing a product on the customer for reasons of profit, compliance, the way things have always been done here, or any others. When one markets, a relationship is built. This isn't about a one-time sale, it's about getting those records into safekeeping and customers trusting us to give them back, retrieve them, the way that customer needs them, when and how that customer needs them. This is a trust building exercise that has long-term as well as short-term actions and reactions all aligned toward that interdependent relationship between customers and us, the recorded information managers. Marketing works better than selling because human beings don't like to be pushed...think door-to-door sales people and evaluate emotions. Are they positive? Go a step further. No one likes to be told to do what's good for them. Which brings us to the fundamental marketing, as opposed to sales, principle: What's In It For Me? Commonly called the WIIFM or "Wiff-em" principle in marketing and entrepreneurship texts and classes.
Methods and devices for determining quality of services of storage systems
Seelam, Seetharami R [Yorktown Heights, NY; Teller, Patricia J [Las Cruces, NM
2012-01-17
Methods and systems for allowing access to computer storage systems. Multiple requests from multiple applications can be received and processed efficiently to allow traffic from multiple customers to access the storage system concurrently.
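The abstract does not specify the mechanism, but round-robin fair queueing over per-customer queues is one standard way to let multiple customers access a storage system concurrently without one starving the rest. A minimal sketch, with all names illustrative:

```python
from collections import deque

class FairScheduler:
    """Round-robin fair queueing over per-customer request queues."""
    def __init__(self):
        self.queues = {}          # customer -> deque of pending requests
        self.order = deque()      # round-robin cycle of known customers
    def submit(self, customer, request):
        if customer not in self.queues:
            self.queues[customer] = deque()
            self.order.append(customer)
        self.queues[customer].append(request)
    def next(self):
        # Scan at most one full cycle; skip customers with empty queues.
        for _ in range(len(self.order)):
            customer = self.order[0]
            self.order.rotate(-1)             # move to the end of the cycle
            if self.queues[customer]:
                return customer, self.queues[customer].popleft()
        return None

sched = FairScheduler()
sched.submit("A", "a1")
sched.submit("A", "a2")
sched.submit("B", "b1")
served = [sched.next() for _ in range(3)]
# -> [('A', 'a1'), ('B', 'b1'), ('A', 'a2')]
```

Note how customer B's single request is served between A's two, even though A submitted first; that interleaving is the quality-of-service property the scheduler exists to provide.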
Operating experience of the southwire high-temperature superconducting cable project
NASA Astrophysics Data System (ADS)
Hughey, R. L.; Lindsay, D.
2002-01-01
Southwire Company of Carrollton, Georgia in cooperation with Oak Ridge National Laboratory has designed, built, installed and is operating the world's first field installation of a High Temperature Superconducting (HTS) cable system. The cables supply power to three Southwire manufacturing facilities and part of the corporate headquarters building in Carrollton, GA. The system consists of three 30-m single phase cables rated at 12.4 kV, 1250 Amps, liquid nitrogen cooling system, and the computer-based control system. The cables are built using BSCCO-2223 powder-in-tube HTS tapes and a proprietary cryogenic dielectric material called Cryoflex™. The cables are fully shielded with a second layer of HTS tapes to eliminate any external magnetic fields. The Southwire HTS cables were first energized on January 6, 2000. Since that time they have logged over 8,500 hours of operation while supplying 100% of the required customer load. To date, the cables have worked without failure and operations are continuing. The cable design has passed requisite testing for this class of conventional cables including 10× overcurrent to 12,500 Amps and BIL testing to 110 kV. Southwire has also successfully designed and tested a cable splice. System heat loads and AC losses have been measured and compared to calculated values. On June 1, 2001 on-site monitoring was ceased and the system was changed to unattended operation to further prove the reliability of the HTS cable system.
Reconfigurable HIL Testing of Earth Satellites
NASA Technical Reports Server (NTRS)
2008-01-01
In recent years, hardware-in-the-loop (HIL) testing has carved a strong niche in several industries, such as automotive, aerospace, telecom, and consumer electronics. As desktop computers have realized gains in speed, memory size, and data storage capacity, hardware/software platforms have evolved into high performance, deterministic HIL platforms, capable of hosting the most demanding applications for testing components and subsystems. Using simulation software to emulate the digital and analog I/O signals of system components, engineers of all disciplines can now test new systems in realistic environments to evaluate their function and performance prior to field deployment. Within the Aerospace industry, space-borne satellite systems are arguably some of the most demanding in terms of their requirement for custom engineering and testing. Typically, spacecraft are built one or a few at a time to fulfill a space science or defense mission. In contrast to other industries that can amortize the cost of HIL systems over thousands, even millions of units, spacecraft HIL systems have been built as one-of-a-kind solutions, expensive in terms of schedule, cost, and risk, to assure satellite and spacecraft systems reliability. The focus of this paper is to present a new approach to HIL testing for spacecraft systems that takes advantage of a highly flexible hardware/software architecture based on National Instruments PXI reconfigurable hardware and virtual instruments developed using LabVIEW. This new approach to HIL is based on a multistage/multimode spacecraft bus emulation development model called Reconfigurable Hardware In-the-Loop or RHIL.
2009-11-19
CAPE CANAVERAL, Fla. – NASA's first large-scale solar power generation facility is unveiled at NASA's Kennedy Space Center in Florida. Representatives from NASA, Florida Power & Light Company, or FPL, and SunPower Corporation formally commissioned the one-megawatt facility and announced plans to pursue a new research, development and demonstration project at Kennedy to advance America's use of renewable energy. The facility is the first element of a major renewable energy project currently under construction at Kennedy. The completed system features a fixed-tilt, ground-mounted solar power system designed and built by SunPower, along with SunPower solar panels. A 10-megawatt solar farm, which SunPower is building on nearby Kennedy property, will supply power to FPL's customers when it is completed in April 2010. Photo credit: NASA/Jim Grossmann
2011-01-01
Background Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines that are distributed pre-packaged with pre-configured software. Results We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. Conclusion The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high throughput data processing. PMID:21878105
Design of a superconducting volume coil for magnetic resonance microscopy of the mouse brain
NASA Astrophysics Data System (ADS)
Nouls, John C.; Izenson, Michael G.; Greeley, Harold P.; Johnson, G. Allan
2008-04-01
We present the design process of a superconducting volume coil for magnetic resonance microscopy of the mouse brain at 9.4 T. The yttrium barium copper oxide coil has been designed through an iterative process of three-dimensional finite-element simulations and validation against room temperature copper coils. Compared to previous designs, the Helmholtz pair provides substantially higher B1 homogeneity over an extended volume of interest sufficiently large to image biologically relevant specimens. A custom-built cryogenic cooling system maintains the superconducting probe at 60 ± 0.1 K. Specimen loading and probe retuning can be carried out interactively with the coil at operating temperature, enabling much higher throughput. The operation of the probe is a routine, consistent procedure. Signal-to-noise ratio in a mouse brain increased by a factor ranging from 1.1 to 2.9 as compared to a room-temperature solenoid coil optimized for mouse brain microscopy. We demonstrate images encoded at 10 × 10 × 20 μm for an entire mouse brain specimen with signal-to-noise ratio of 18 and a total acquisition time of 16.5 h, revealing neuroanatomy unseen at lower resolution. Phantom measurements show an effective spatial resolution better than 20 μm.
Ultra-modular 500m2 heliostat field for high flux/high temperature solar-driven processes
NASA Astrophysics Data System (ADS)
Romero, Manuel; González-Aguilar, José; Luque, Salvador
2017-06-01
The main objective of the European Project SUN-to-LIQUID is the scale-up and experimental demonstration of the complete process chain from H2O and CO2 to solar liquid fuels. This implies moving from a 4 kW laboratory setup to a pre-commercial plant including a heliostat field. The small power and high irradiance required at the focal spot force the optical design to behave halfway between a large solar furnace and an extremely small central receiver system. The customized heliostat field makes use of the most recent developments in small-size heliostats and a tower with reduced optical height (15 m) to minimize visual impact. A heliostat field of 250 kWth (500 m2 of reflective surface) has been built adjacent to the IMDEA Energy premises at the Technology Park of Móstoles, Spain, and consists of 169 small-size heliostats (1.9 m × 1.6 m). In spite of the small size and compactness of the field, when all heliostats are aligned it is possible to fulfil the specified flux above 2500 kW/m2 for at least 50 kW through a 16 cm aperture, with a peak flux of 3000 kW/m2.
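As a quick sanity check on the quoted figures, the roughly 50 kW delivered through the 16 cm aperture follows directly from the specified mean flux. The sketch below (plain Python, not project code) reproduces the arithmetic; the function name is our own.

```python
import math

def aperture_power_kw(mean_flux_kw_m2, aperture_diameter_m):
    """Radiative power through a circular aperture at a given mean flux."""
    area_m2 = math.pi * (aperture_diameter_m / 2.0) ** 2
    return mean_flux_kw_m2 * area_m2

# 2500 kW/m2 across a 16 cm diameter aperture comes to just over 50 kW
power_kw = aperture_power_kw(2500.0, 0.16)
```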
Design of a superconducting volume coil for magnetic resonance microscopy of the mouse brain.
Nouls, John C; Izenson, Michael G; Greeley, Harold P; Johnson, G Allan
2008-04-01
We present the design process of a superconducting volume coil for magnetic resonance microscopy of the mouse brain at 9.4T. The yttrium barium copper oxide coil has been designed through an iterative process of three-dimensional finite-element simulations and validation against room temperature copper coils. Compared to previous designs, the Helmholtz pair provides substantially higher B(1) homogeneity over an extended volume of interest sufficiently large to image biologically relevant specimens. A custom-built cryogenic cooling system maintains the superconducting probe at 60+/-0.1K. Specimen loading and probe retuning can be carried out interactively with the coil at operating temperature, enabling much higher through-put. The operation of the probe is a routine, consistent procedure. Signal-to-noise ratio in a mouse brain increased by a factor ranging from 1.1 to 2.9 as compared to a room-temperature solenoid coil optimized for mouse brain microscopy. We demonstrate images encoded at 10x10x20mum for an entire mouse brain specimen with signal-to-noise ratio of 18 and a total acquisition time of 16.5h, revealing neuroanatomy unseen at lower resolution. Phantom measurements show an effective spatial resolution better than 20mum.
Zhu, Xiaodong; Wang, Jing; Tang, Juan
2017-01-01
Environmentally friendly handling and efficient recycling of waste electrical and electronic equipment (WEEE) has grown to be a global social problem. As holders of WEEE, consumers have a significant effect on the recycling process. A consideration of and attention to the influence of consumer behavior in the recycling process can help achieve more effective recycling of WEEE. In this paper, we built a dual-channel closed-loop supply chain model composed of manufacturers, retailers, and network recycling platforms. Based on the influence of customer bargaining behavior, we studied several different scenarios of centralized decision-making, decentralized decision-making, and contract coordination, using Stackelberg game theory. The results show that retailers and network recycling platforms will reduce the direct recovery prices to maintain their own profit when considering the impact of consumer bargaining behavior, while remanufacturers will improve the transfer payment price by surrendering part of the profit under a revenue- and expense-sharing contract. Using this contract, we can achieve supply chain coordination and eliminate the effect of consumer bargaining behavior on supply chain performance. The parameter sensitivity analysis shows that when an appropriate sharing coefficient is selected, the closed-loop supply chain can achieve the same system performance as under a centralized decision. PMID:29244778
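The Stackelberg structure described above can be illustrated with a deliberately simple numeric sketch. The linear return function q(p) = h·(p − p0), the parameter values, and the function names are all hypothetical stand-ins and are not taken from the paper's model: the leader (remanufacturer) chooses a transfer price, and the follower (recycling platform) sets its recovery price as a best response.

```python
def follower_best_price(t, h, p0):
    """Recycler's profit-maximizing recovery price for transfer price t,
    assuming consumers return q(p) = h * max(0, p - p0) units (illustrative)."""
    return max(p0, (t + p0) / 2.0)

def leader_profit(t, h, p0, v):
    """Remanufacturer's profit: (value per unit - transfer price) * quantity."""
    p = follower_best_price(t, h, p0)
    q = h * max(0.0, p - p0)
    return (v - t) * q

# leader anticipates the follower's response and grid-searches its price
h, p0, v = 2.0, 2.0, 20.0
best_t = max((i * 0.01 for i in range(2001)), key=lambda t: leader_profit(t, h, p0, v))
```

With these toy parameters the leader's profit (v − t)·h·(t − p0)/2 is maximized at t = (v + p0)/2.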
Novel Applications of Rapid Prototyping in Gamma-ray and X-ray Imaging
Miller, Brian W.; Moore, Jared W.; Gehm, Michael E.; Furenlid, Lars R.; Barrett, Harrison H.
2010-01-01
Advances in 3D rapid-prototyping printers, 3D modeling software, and casting techniques allow for the fabrication of cost-effective, custom components in gamma-ray and x-ray imaging systems. Applications extend to new fabrication methods for custom collimators, pinholes, calibration and resolution phantoms, mounting and shielding components, and imaging apertures. Details of the fabrication process for these components are presented, specifically the 3D printing process, cold casting with a tungsten epoxy, and lost-wax casting in platinum. PMID:22984341
Customer relationship management implementation in the small and medium enterprise
NASA Astrophysics Data System (ADS)
Nugroho, Agus; Suharmanto, Agus; Masugino
2018-03-01
To win in global competition and sustain their business, small and medium enterprises must implement reliable information technology applications to support their customer database, production, sales, and marketing management. This paper addresses the implementation of Customer Relationship Management (CRM) in a small and medium enterprise, CV. Densuko Jaya, based in Semarang, Central Java, Republic of Indonesia, which operates in the rubber processing industry supply chain. The ADDIE model was utilized in this study to set up the CRM functionality at the enterprise. The aim of the authors is to present the benefits resulting from the application of CRM technologies at the enterprise, solving its chronic issues in the fields of integrated customer database, production management process, and sales automation in order to boost its business in the near future. Training and coaching were delivered to the enterprise's staff and management to ensure that they can operate the system.
FPGA-based gating and logic for multichannel single photon counting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pooser, Raphael C; Earl, Dennis Duncan; Evans, Philip G
2012-01-01
We present results characterizing multichannel InGaAs single photon detectors utilizing gated passive quenching circuits (GPQC), self-differencing techniques, and field programmable gate array (FPGA)-based logic for both diode gating and coincidence counting. Utilizing FPGAs for the diode gating frontend and the logic counting backend has the advantage of low cost compared to custom-built logic circuits and current off-the-shelf detector technology. Further, FPGA logic counters have been shown to work well in quantum key distribution (QKD) test beds. Our setup combines multiple independent detector channels in a reconfigurable manner via an FPGA backend and post-processing in order to perform coincidence measurements between any two or more detector channels simultaneously. Using this method, states from a multi-photon polarization-entangled source are detected and characterized via coincidence counting on the FPGA. Photon detection events are also processed by the quantum information toolkit for application testing (QITKAT).
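The coincidence-counting step can be pictured with a small host-side sketch; this is illustrative Python over software timestamps, whereas the system described runs the equivalent logic in FPGA fabric. Two channels' sorted event timestamps are scanned with a sliding pointer, and pairs falling within a coincidence window are counted.

```python
def coincidences(ch_a, ch_b, window):
    """Count events on channel A that have a channel-B event within the
    coincidence window. Both timestamp lists are assumed sorted
    (arbitrary time units)."""
    count, j = 0, 0
    for t in ch_a:
        # advance the channel-B pointer past events too early to coincide with t
        while j < len(ch_b) and ch_b[j] < t - window:
            j += 1
        if j < len(ch_b) and abs(ch_b[j] - t) <= window:
            count += 1
    return count
```

The single pass keeps the cost linear in the number of events, which is the same reason hardware implementations use streaming comparisons rather than all-pairs checks.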
NASA Astrophysics Data System (ADS)
Kraljić, K.; Strüngmann, L.; Fimmel, E.; Gumbel, M.
2018-01-01
The genetic code is degenerate and it is assumed that this redundancy provides error detection and correction mechanisms in the translation process. However, the biological meaning of the code's structure is still under current research. This paper presents the Genetic Code Analysis Toolkit (GCAT), which provides workflows and algorithms for the analysis of the structure of nucleotide sequences. In particular, sets or sequences of codons can be transformed and tested for circularity, comma-freeness, dichotomic partitions and other properties. GCAT comes with an editor custom-built for working with the genetic code and a batch mode for multi-sequence processing. With the ability to read FASTA files or load sequences from GenBank, the tool can be used for the mathematical and statistical analysis of existing sequence data. GCAT is Java-based and provides a plug-in concept for extensibility. Availability: open source. Homepage: http://www.gcat.bio/
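One of the properties GCAT tests, comma-freeness, is simple to state and check. The sketch below uses the standard definition rather than GCAT's actual (Java) API: a codon set is comma-free if no codon of the set appears at an out-of-frame position inside any concatenation of two codons from the set.

```python
from itertools import product

def is_comma_free(codons):
    """A codon set X is comma-free if no codon of X appears at a shifted
    (out-of-frame) position inside any concatenation of two codons from X."""
    X = set(codons)
    for c1, c2 in product(X, repeat=2):
        joined = c1 + c2
        for i in (1, 2):  # the two out-of-frame reading positions in the 6-mer
            if joined[i:i + 3] in X:
                return False
    return True
```

For example, {"AAA"} is not comma-free because "AAAAAA" read one base out of frame still yields "AAA".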
NASA Astrophysics Data System (ADS)
De Leon, Marlene M.; Estuar, Maria Regina E.; Lim, Hadrian Paulo; Victorino, John Noel C.; Co, Jerelyn; Saddi, Ivan Lester; Paelmo, Sharlene Mae; Dela Cruz, Bon Lemuel
2017-09-01
Environment and agriculture related applications have been gaining ground for the past several years and have been the context for research in ubiquitous and pervasive computing. This study is part of a bigger study that uses artificial intelligence in developing models to detect, monitor, and forecast the spread of Fusarium oxysporum cubense TR4 (FOC TR4) on Cavendish bananas cultivated in the Philippines. To implement an Intelligent Farming system, 1) wireless sensor nodes (WSNs) are deployed in Philippine banana plantations to collect soil parameter data that is considered to affect the health of Cavendish bananas, 2) a custom-built smartphone application is used for collecting, storing, and transmitting soil data, plant images and plant status data to a cloud storage, and 3) a custom-built web application is used to load and display results of physico-chemical analysis of soil, analysis of data models, and geographic locations of plants being monitored. This study discusses the issues, considerations, and solutions implemented in the development of an asynchronous communication channel to ensure that all data collected by WSNs and smartphone applications are transmitted with a high degree of accuracy and reliability. From a design standpoint, standard API documentation on data-type usage is required to avoid inconsistencies in parameter passing. From a technical standpoint, there is a need to include error-handling mechanisms, especially for delays in the transmission of data, as well as a generalized method for parsing through multidimensional arrays of data. Strategies are presented in the paper.
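A minimal sketch of the kind of error handling recommended above, assuming nothing about the apps' real upload API: failed transmissions are retried with exponential backoff. The `transmit` callable and the payload field are hypothetical stand-ins for the study's actual cloud-upload call.

```python
import time

def send_with_retry(payload, transmit, max_attempts=5, base_delay=0.5):
    """Send one sensor payload, retrying with exponential backoff on failure.
    `transmit` is any callable that raises on a failed transmission."""
    for attempt in range(max_attempts):
        try:
            return transmit(payload)
        except Exception:
            if attempt == max_attempts - 1:
                raise  # give up after the final attempt
            time.sleep(base_delay * (2 ** attempt))  # 0.5 s, 1 s, 2 s, ...
```

Backoff of this shape keeps intermittent connectivity drops from losing data while avoiding a flood of immediate retries against an unreachable server.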
Brettin, Thomas; Davis, James J.; Disz, Terry; ...
2015-02-10
The RAST (Rapid Annotation using Subsystem Technology) annotation engine was built in 2008 to annotate bacterial and archaeal genomes. It works by offering a standard software pipeline for identifying genomic features (i.e., protein-encoding genes and RNA) and annotating their functions. Recently, in order to make RAST a more useful research tool and to keep pace with advancements in bioinformatics, it has become desirable to build a version of RAST that is both customizable and extensible. In this paper, we describe the RAST tool kit (RASTtk), a modular version of RAST that enables researchers to build custom annotation pipelines. RASTtk offers a choice of software for identifying and annotating genomic features as well as the ability to add custom features to an annotation job. RASTtk also accommodates the batch submission of genomes and the ability to customize annotation protocols for batch submissions. This is the first major software restructuring of RAST since its inception.
Partners in quality: managing your suppliers.
Conway, B A
1991-05-01
Just expecting more from your supplier is not what partnership is about. We have had the experience where the quality improvement and partnership banner has been waved but the tone and spirit of the meeting did not encourage or support a joint quality improvement effort. Benefits will not be achieved until the wall truly begins to come apart and the relationship is built on mutual respect and trust. Data collection and open answers to questions often reveal embarrassing errors and obvious needs for improvements. As stated before, blame and finger-pointing must be replaced with a mutual commitment to asking and answering the question, "How can we improve?" As Dr. W. Edwards Deming has stated, "End the practice of awarding business on the basis of price tag. Instead, minimize total cost. Move toward a single supplier for any one item on a long-term relationship of loyalty and trust." The structured approach of a quality improvement process and the application of quality methods and techniques have proven useful in removing emotion and helping the team focus on the process rather than the people and the issues involved. Quality improvement methods are focused on achieving both customer and supplier goals--customer satisfaction, employee satisfaction, and operational efficiency and effectiveness. Our experience with Partners in Quality as well as our experience with the quality leadership process supports a recent quote in the Harvard Business Review: "Quality is not just a slogan...(it is) the most profitable way to run a business."
NASA Astrophysics Data System (ADS)
Lindholm, D. M.; Wilson, A.
2010-12-01
The Laboratory for Atmospheric and Space Physics at the University of Colorado has developed an open-source, OPeNDAP-compliant, Java-servlet-based RESTful web service to serve time series data. In addition to handling OPeNDAP-style requests and returning standard responses, existing modules for alternate output formats can be reused or customized. It is also simple to reuse or customize modules to directly read various native data sources and even to perform some processing on the server. The server is built around a common data model based on the Unidata Common Data Model (CDM), which merges the NetCDF, HDF, and OPeNDAP data models. The server framework features a modular architecture that supports pluggable Readers, Writers, and Filters via the common interface to the data, enabling a workflow that reads data from its native form, performs some processing on the server, and presents the results to the client in its preferred form. The service is currently being used operationally to serve time series data for the LASP Interactive Solar Irradiance Data Center (LISIRD, http://lasp.colorado.edu/lisird/) and as part of the Time Series Data Server (TSDS, http://tsds.net/). I will present the data model and how it enables reading, writing, and processing concerns to be separated into loosely coupled components. I will also share thoughts on evolving beyond the time series abstraction and providing a general-purpose data service that can be orchestrated into larger workflows.
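The Reader → Filter → Writer separation can be sketched as follows. The sample type and the toy plug-ins are illustrative stand-ins for the idea of a shared data interface, not the server's actual (Java) interfaces.

```python
from typing import Callable, Iterable, List, Tuple

Sample = Tuple[float, float]  # (time, value) record in a shared data model

def run_pipeline(reader: Callable[[], Iterable[Sample]],
                 filters: List[Callable[[Iterable[Sample]], Iterable[Sample]]],
                 writer: Callable[[Iterable[Sample]], List[str]]) -> List[str]:
    """Chain a Reader, any number of Filters, and a Writer through one
    common sample interface, mirroring the pluggable design described."""
    samples = reader()
    for f in filters:
        samples = f(samples)  # each filter consumes and re-emits samples
    return writer(samples)

# minimal plug-ins: an in-memory reader, a spike-dropping filter,
# and a CSV-line writer
reader = lambda: iter([(0.0, 1.2), (1.0, 9.9), (2.0, 1.4)])
drop_spikes = lambda s: ((t, v) for t, v in s if v < 5.0)
to_csv = lambda s: ["%g,%g" % (t, v) for t, v in s]
lines = run_pipeline(reader, [drop_spikes], to_csv)
```

Because every stage speaks the same sample interface, readers for new native formats or writers for new output formats can be swapped in without touching the rest of the chain, which is the design point the abstract makes.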
Microwave-based navigation of femtosatellites using on-off keying
NASA Astrophysics Data System (ADS)
Kamte, Namrata Jagdish
The objective of this research is to validate that a custom-built microchip-scale satellite transmitting a signal modulated with a Pseudo Random Noise code using On-Off Keying can be tracked. The weak GPS satellite signal is modulated with a Pseudo Random Noise (PRN) code that provides a mathematical gain. Our signal is modulated with the same PRN code using On-Off Keying (OOK), unlike the Phase Shift Keying used in GPS satellites. Our goal is to obtain timing and positioning information from the microchip-scale satellite via a ground station using the concepts of PRN encoding and the OOK modulation technique. Decimeter-scale satellites, with a mass of 2--6 kilograms, referred to as picosatellites, have been tracked successfully by ground stations. The microchip-scale satellite, called the femtosatellite, is smaller with even less mass, at most 100 grams. At this size the satellite can take advantage of small-scale physics to perform maneuvers, exploiting forces such as solar pressure that only slightly perturb large spacecraft. Additionally, the reduced size decreases the cost of launch as compared to the picosatellites. A swarm of such femtosatellites could serve as environmental probes, interplanetary chemists or in-orbit inspectors of the parent spacecraft. In May 2011, NASA's last space shuttle mission STS-134 carried femtosatellites developed by Cornell researchers called "Sprites". The Sprites were deployed from the International Space Station but ground stations on Earth failed to track them. In an effort to develop an alternative femtosatellite design, we have built our own femtosatellite prototype. Our femtosatellite prototype contains an AVR microcontroller on an Arduino board. This assembly is connected to a radio transmitter and a custom antenna transmitting a 433 MHz radio frequency signal. The prototype transmits a PRN code modulated onto the signal using OOK.
Our ground station consists of a Universal Software Radio Peripheral (USRP) with a custom antenna for reception of the 433 MHz signal. The USRP is driven by an open-source software-defined radio application called GNU Radio. The required components of the signal are extracted from GNU Radio and processed in order to plot the received data. Benchtop testing of these OOK signals has yielded a reception sensitivity of up to 1 microsecond, which translates into a ranging capability similar to that of GPS satellites. We have correlated the received and replica PRN sequences and demonstrated that they match. The correlation can be used to obtain the identity and position of the femtosatellite prototype. This demonstrates the ability to track a femtosatellite signal that is lower than ambient noise, just like the signals broadcast from GPS satellites. Further, we have performed a system analysis and identified key system behavioral problems. Thus we have ultimately developed an optimized femtosatellite prototype and designed a novel positioning signal, providing a stepping-stone in the journey toward successful femtosatellite communication.
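The correlation step described above can be sketched in a few lines. This is an illustrative circular correlation over a toy 0/1 on-off sequence, not the GNU Radio processing chain used in the actual ground station: the lag at which the correlation peaks identifies the code phase, and hence the timing, of the received PRN code.

```python
def circular_correlation(received, replica):
    """Circular correlation of a received on-off (0/1) sequence against a
    replica PRN code; the peak lag gives the code-phase (timing) estimate."""
    n = len(replica)
    scores = [sum(received[(i + lag) % n] * replica[i] for i in range(n))
              for lag in range(n)]
    peak_lag = max(range(n), key=lambda k: scores[k])
    return peak_lag, scores
```

Because the correlation sums energy over the whole code, the peak stands out even when individual chips are buried in noise, which is the same mathematical gain the abstract attributes to GPS-style PRN codes.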
Space Mission Operations Ground Systems Integration Customer Service
NASA Technical Reports Server (NTRS)
Roth, Karl
2014-01-01
The facility, which is now the Huntsville Operations Support Center (HOSC) at Marshall Space Flight Center in Huntsville, AL, has provided continuous space mission and related services for the space industry since 1961, from Mercury Redstone through the International Space Station (ISS). Throughout the long history of the facility and mission support teams, the HOSC has developed a stellar customer support and service process. In this era of cost cutting and of providing more capability and results with fewer resources, space missions are looking for the most efficient way to accomplish their objectives. One of the first services provided by the facility was fax transmission of documents to, then, Cape Canaveral in Florida. The headline in the Marshall Star, the newspaper for the newly formed Marshall Space Flight Center, read "Exact copies of Documents sent to Cape in 4 minutes." The customer was Dr. Wernher von Braun. Currently at the HOSC we are supporting, or have recently supported, missions ranging from simple ISS payloads requiring little more than "bent-pipe" telemetry access, to a low-cost free-flyer, the Fast, Affordable, Science and Technology Satellite (FASTSAT), to a full-service ISS payload, the Alpha Magnetic Spectrometer 2 (AMS2), supporting 24/7 operations at three operations centers around the world with an investment of over 2 billion dollars. The HOSC has more need and desire than ever to provide fast and efficient customer service to support these missions. Here we will outline how our customer-centric service approach reduces the cost of providing services, makes it faster and easier than ever for new customers to get started with HOSC services, and show what the future holds for our space mission operations customers.
We will discuss our philosophy concerning our responsibility and accessibility to a mission customer as well as how we deal with the following issues: initial contact with a customer, reducing customer cost, changing regulations and security, and cultural differences, to ensure an efficient response to customer issues using a small Customer Service Team (CST) and adaptability, constant communication with customers, technical expertise and knowledge of services, and dedication to customer service. The HOSC Customer Support Team has implemented a variety of processes, and procedures that help to mitigate the potential problems that arise when integrating ground system services for a variety of complex missions and the lessons learned from this experience will lead the future of customer service in the space operations industry.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haurykiewicz, John Paul; Dinehart, Timothy Grant; Parker, Robert Young
2016-05-12
The purpose of this process analysis was to analyze the Badge Offices' current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered primarily through Badge Office Subject Matter Experts (SMEs) and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to the factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.
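A process simulation of the kind described can be sketched with a few lines of discrete-event logic. The single-window, exponential-arrival queue below is an illustrative stand-in; the report's actual simulation tool, station layout, and parameters are not specified here.

```python
import random

def simulate_queue(n_customers, arrival_rate, service_rate, seed=1):
    """Single-window badge-office sketch: exponential interarrival and
    service times; returns the mean wait before service begins."""
    rng = random.Random(seed)
    clock, free_at, total_wait = 0.0, 0.0, 0.0
    for _ in range(n_customers):
        clock += rng.expovariate(arrival_rate)   # next customer arrives
        start = max(clock, free_at)              # wait if the window is busy
        total_wait += start - clock
        free_at = start + rng.expovariate(service_rate)
    return total_wait / n_customers
```

Even a toy model like this lets one compare scenarios: raising the effective service rate (e.g., by adding stations or shortening transactions) drives mean wait down sharply as utilization falls, which is the sort of what-if assessment the analysis used its model for.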
Development and fabrication of a solar cell junction processing system
NASA Technical Reports Server (NTRS)
1984-01-01
A processing system capable of producing solar cell junctions by ion implantation followed by pulsed electron beam annealing was developed and constructed. The machine was to be capable of processing 4-inch diameter single-crystal wafers at a rate of 10^7 wafers per year. A microcomputer-controlled pulsed electron beam annealer with a vacuum-interlocked wafer transport system was designed, built and demonstrated to produce solar cell junctions on 4-inch wafers with an AM1 efficiency of 12%. Experiments showed that a non-mass-analyzed (NMA) ion beam could implant 10 keV phosphorus dopant to form solar cell junctions which were equivalent to mass-analyzed implants. An NMA ion implanter, compatible with the pulsed electron beam annealer and wafer transport system, was designed in detail but was not built because of program termination.
Phaeton Mast Dynamics: On-Orbit Characterization of Deployable Masts
NASA Technical Reports Server (NTRS)
Michaels, Darren J.
2011-01-01
The PMD instrument is a set of three custom-designed triaxial accelerometer systems designed specifically to detect and characterize the modal dynamics of deployable masts in orbit. The instrument was designed and built as a payload for the NuSTAR spacecraft, but it is now sponsored by the Air Force Research Laboratory's DSX project. It can detect acceleration levels from 1 μg to 0.12 g over a frequency range of 0.1 Hz to 30 Hz, the results of which can support future modeling and design of deployable mast structures for space. This paper details the hardware architecture and design, calibration tests and results, and the current status of the PMD instrument.
Origins of 1/f noise in nanostructure inclusion polymorphous silicon films
2011-01-01
In this article, we report that the origins of 1/f noise in pm-Si:H film resistors are inhomogeneity and defective structure. The results obtained are consistent with Hooge's formula, where the noise parameter, αH, is independent of doping ratio. The 1/f noise power spectral density and noise parameter αH are proportional to the squared value of temperature coefficient of resistance (TCR). The resistivity and TCR of pm-Si:H film resistor were obtained through linear current-voltage measurement. The 1/f noise, measured by a custom-built noise spectroscopy system, shows that the power spectral density is a function of both doping ratio and temperature. PMID:21711802
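Hooge's empirical relation referenced above, S_V(f)/V² = α_H/(N·f), makes the extraction of the noise parameter a one-line computation once a PSD point is measured. The sketch below uses hypothetical values purely for illustration; no numbers are taken from the article.

```python
def hooge_alpha(psd_v2_per_hz, freq_hz, v_bias, n_carriers):
    """Hooge parameter from one 1/f-noise PSD point:
    S_V(f) / V^2 = alpha_H / (N * f)  =>  alpha_H = S_V * N * f / V^2."""
    return psd_v2_per_hz * n_carriers * freq_hz / v_bias ** 2
```

In practice α_H is fitted over many (f, S_V) points rather than one, but each point reduces to exactly this rearrangement.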
The Human Communication Research Centre dialogue database.
Anderson, A H; Garrod, S C; Clark, A; Boyle, E; Mullin, J
1992-10-01
The HCRC dialogue database consists of over 700 transcribed and coded dialogues from pairs of speakers aged from seven to fourteen. The speakers are recorded while tackling co-operative problem-solving tasks and the same pairs of speakers are recorded over two years tackling 10 different versions of our two tasks. In addition there are over 200 dialogues recorded between pairs of undergraduate speakers engaged on versions of the same tasks. Access to the database, and to its accompanying custom-built search software, is available electronically over the JANET system by contacting liz@psy.glasgow.ac.uk, from whom further information about the database and a user's guide to the database can be obtained.
Hologram interferometry in automotive component vibration testing
NASA Astrophysics Data System (ADS)
Brown, Gordon M.; Forbes, Jamie W.; Marchi, Mitchell M.; Wales, Raymond R.
1993-02-01
An ever increasing variety of automotive component vibration testing is being pursued at Ford Motor Company, U.S.A. The driving force for use of hologram interferometry in these tests is the continuing need to design component structures to meet more stringent functional performance criteria. Parameters such as noise and vibration, sound quality, and reliability must be optimized for the lightest weight component possible. Continually increasing customer expectations and regulatory pressures on fuel economy and safety mandate that vehicles be built from highly optimized components. This paper includes applications of holographic interferometry for powertrain support structure tuning, body panel noise reduction, wiper system noise and vibration path analysis, and other vehicle component studies.
A planning support system to optimize approval of private housing development projects
NASA Astrophysics Data System (ADS)
Hussnain, M. Q.; Wakil, K.; Waheed, A.; Tahir, A.
2016-06-01
Of Pakistan's population of 182 million, 38% reside in urban areas with an average growth rate of 1.6%, raising the urban housing demand significantly. The state's poor response to housing needs has resulted in mushroom growth of private housing schemes (PHS) over the years. Consequently, in only five major cities of Punjab there are 383 legal and 150 illegal private housing development projects against 120 public sector housing schemes. A major factor behind the cancerous growth of unapproved PHS is the prolonged and delayed approval process in the concerned approval authorities, requiring 13 months on average. Currently, manual and paper-based approaches are used for vetting and for granting permission, which is highly subjective and non-transparent. This study aims to design a flexible planning support system (PSS) to optimize the vetting process of PHS projects under any development authority in Pakistan by reducing the time and cost required for site and document investigations. Relying on a review of regulatory documents and interviews with professional planners and land developers, this study describes the structure of a PSS developed using open-source geospatial tools such as OpenGeo Suite, PHP, and PostgreSQL. It highlights the development of a Knowledge Module (based on regulatory documents) containing equations related to scheme type, size (area), location, access road, components of the layout plan, planning standards and other related approval checks. Furthermore, it presents the architecture of the database module and the system data requirements, categorized as base datasets (a built-in part of the PSS) and input datasets (related to the housing project under approval). It is practically demonstrated that developing a customized PSS to optimize the PHS approval process in Pakistan is achievable with geospatial technology.
With the provision of such a system, the approval process for private housing schemes not only becomes quicker and more user-friendly but also more transparent.
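The Knowledge Module's regulatory checks can be pictured as a table of declarative rules run against a scheme's attributes. The rule names and thresholds below are hypothetical placeholders, not values from Pakistan's actual regulatory documents; the point is only the shape of the vetting step.

```python
def vet_scheme(scheme, rules):
    """Run declarative approval checks (a stand-in for the Knowledge
    Module's regulatory equations) and return the failed rule names."""
    return [name for name, check in rules.items() if not check(scheme)]

# illustrative rules only; real thresholds come from the regulatory documents
rules = {
    "min_area":    lambda s: s["area_acres"] >= 100,
    "access_road": lambda s: s["road_width_ft"] >= 40,
    "open_space":  lambda s: s["open_space_pct"] >= 7,
    "layout_plan": lambda s: s["has_layout_plan"],
}
failed = vet_scheme(
    {"area_acres": 120, "road_width_ft": 30,
     "open_space_pct": 8, "has_layout_plan": True},
    rules,
)
```

Keeping the rules as data, rather than hard-coded logic, is what lets a single PSS serve "any development authority": each authority supplies its own rule table.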
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carnegie Mellon University
2008-09-30
Carnegie Mellon University (CMU), under contract from the Department of Energy/National Energy Technology Laboratory (DoE/NETL) and with co-funding from the Northeast Gas Association (NGA), has completed the overall system design, field-trial and Magnetic Flux Leakage (MFL) sensor evaluation program for the next-generation Explorer-II (X-II) live gas main Non-destructive Evaluation (NDE) and visual inspection robot platform. The design is based on the Explorer-I prototype, which was built and field-tested under a prior (also DoE- and NGA-co-funded) program and served as validation that self-powered robots under wireless control could access and navigate live natural gas distribution mains. The X-II system design (approx. 8 ft and 66 lbs) was heavily based on the X-I design, yet was substantially expanded to allow the addition of NDE sensor systems (while retaining its visual inspection capability), making it a modular system and expanding its ability to operate at pressures up to 750 psig (high-pressure and unpiggable steel-pipe distribution mains). A new electronics architecture and on-board software kernel were added to further improve system performance. A locating sonde system was integrated to allow for absolute position-referencing during inspection (coupled with external differential GPS) and emergency locating. The power system was upgraded to utilize lithium-based battery cells for an increase in mission time. The resulting robot-train system is shown with CAD renderings of the individual modules. The system architecture now relies on a dual set of end camera-modules to house the 32-bit processors (Single-Board Computer or SBC) as well as the imaging and wireless (off-board) and CAN-based (on-board) communication hardware and software systems (as well as the sonde coil and electronics). The drive modules (2 ea.) are still responsible for bracing (and centering) in order to drive, in push/pull fashion, the robot train into and through the pipes and obstacles.
The steering modules and their arrangement still allow the robot to configure itself to perform any-angle (up to 90 deg) turns in any orientation (incl. vertical), and enable the live launching and recovery of the system using custom fittings and a (to be developed) launch chamber/tube. The battery modules are used to power the system by providing power to the robot's bus. The support modules perform the functions of centration for the rest of the train as well as odometry pickups using incremental encoding schemes. The electronics architecture is based on a distributed (8-bit) microprocessor architecture (at least one in each module) communicating with one of two 32-bit SBCs, which manages all video processing, posture and motion control as well as CAN and wireless communications. The operator controls the entire system from an off-board (laptop) controller, which is in constant wireless communication with the robot train in the pipe. The sensor modules collect data and forward it, via the CAN-wireless communications chain, to the robot operator's computer, which then transfers it to a dedicated NDE data-storage and post-processing computer for further (real-time or off-line) analysis. The prototype robot system was built and tested indoors and outdoors, outfitted with a Remote-Field Eddy Current (RFEC) sensor integrated as its main NDE sensor modality. An angled launcher, allowing for live launching and retrieval, was also built to suit custom angled launch fittings from TDW. The prototype vehicle and launcher systems are shown. The complete system, including the in-pipe robot train, launcher, integrated NDE sensor, real-time video and control console, and NDE data collection, processing and real-time display, was demonstrated to all sponsors prior to proceeding into final field trials; the individual components and setting for said acceptance demonstration are shown.
The launcher tube was also used to verify that the vehicle system is capable of operating in high-pressure environments and is safely deployable, using proper evacuating/purging techniques, for operation in the potentially explosive natural gas environment. The test setting and environment for safety certification of the X-II robot platform and the launch and recovery procedures are shown. Field trials were successfully carried out in a live steel pipeline in northwestern Pennsylvania. The robot was launched and recovered multiple times, travelling thousands of feet and communicating in real time with video and command-and-control (C&C) data under remote operator control from a laptop, with NDE sensor data streaming to a second computer for storage, display and post-processing. Representative images of the activities and systems used in the week-long field trial are shown. CMU also evaluated the ability of the X-II design to integrate an MFL sensor by adding additional drive, battery, steering and support modules to extend the X-II train.
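The distributed electronics pattern the abstract describes, in which each module's 8-bit microcontroller reports status over CAN to a 32-bit SBC, can be sketched roughly as follows. This is a minimal Python illustration; the frame IDs, payload layout, and helper names (encode_odometry, MODULE_SUPPORT) are assumptions, not the actual CMU protocol.

```python
# Hypothetical sketch of per-module status reporting over classic CAN.
# IDs and payload layout are illustrative assumptions only.
import struct
from dataclasses import dataclass

@dataclass
class CanFrame:
    arb_id: int   # 11-bit arbitration ID (standard CAN)
    data: bytes   # up to 8 payload bytes (classic CAN limit)

    def __post_init__(self):
        if not 0 <= self.arb_id < 0x800:
            raise ValueError("standard CAN IDs are 11 bits")
        if len(self.data) > 8:
            raise ValueError("classic CAN payload is at most 8 bytes")

MODULE_SUPPORT = 0x120  # assumed ID block for support (odometry) modules

def encode_odometry(module_id, encoder_ticks, battery_mv):
    """Pack an odometry report: 32-bit tick count + 16-bit bus voltage (mV)."""
    payload = struct.pack(">IH", encoder_ticks & 0xFFFFFFFF, battery_mv & 0xFFFF)
    return CanFrame(MODULE_SUPPORT + module_id, payload)

def decode_odometry(frame):
    """Unpack the report on the SBC side."""
    ticks, mv = struct.unpack(">IH", frame.data)
    return ticks, mv
```

In a real deployment each module would publish such frames periodically, and the SBC would fuse the odometry pickups with the sonde and differential-GPS references for absolute positioning.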
An advanced software suite for the processing and analysis of silicon luminescence images
NASA Astrophysics Data System (ADS)
Payne, D. N. R.; Vargas, C.; Hameiri, Z.; Wenham, S. R.; Bagnall, D. M.
2017-06-01
Luminescence imaging is a versatile characterisation technique used for a broad range of research and industrial applications, particularly in the field of photovoltaics, where photoluminescence and electroluminescence imaging are routinely carried out for materials analysis and quality control. Luminescence imaging can reveal a wealth of material information, as detailed in extensive literature, yet these techniques are often used only qualitatively instead of being utilised to their full potential. Part of the reason for this is the time and effort required for image processing and analysis in order to convert image data into more meaningful results. In this work, a custom-built, Matlab-based software suite is presented which aims to dramatically simplify luminescence image processing and analysis. The suite includes four individual programs which can be used in isolation or in conjunction to achieve a broad array of functionality, including but not limited to point spread function determination and deconvolution, automated sample extraction, image alignment and comparison, minority carrier lifetime calibration, and iron impurity concentration mapping.
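As a rough illustration of one of the suite's functions, point spread function deconvolution of a luminescence image can be sketched in a few lines. The suite itself is Matlab-based; the Python/NumPy Wiener-filter sketch below, including the Gaussian PSF model and the regularisation constant k, is an assumption about the general technique, not the suite's actual algorithm.

```python
# Illustrative Wiener-filter PSF deconvolution; Gaussian PSF and the
# regularisation constant k are assumptions, not the suite's algorithm.
import numpy as np

def gaussian_psf(size, sigma):
    """Normalised 2D Gaussian point spread function on a size x size grid."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma ** 2))
    return psf / psf.sum()

def _psf_otf(psf, shape):
    """Zero-pad the PSF to the image size and centre it at the origin."""
    padded = np.zeros(shape, dtype=float)
    s = psf.shape[0]
    padded[:s, :s] = psf
    padded = np.roll(padded, (-(s // 2), -(s // 2)), axis=(0, 1))
    return np.fft.fft2(padded)

def fft_blur(image, psf):
    """Convolve an image with the PSF via the frequency domain."""
    return np.real(np.fft.ifft2(np.fft.fft2(image) * _psf_otf(psf, image.shape)))

def wiener_deconvolve(image, psf, k=1e-3):
    """Wiener deconvolution: restore frequencies the PSF passes, damp the
    rest (k regularises against noise amplification)."""
    H = _psf_otf(psf, image.shape)
    G = np.fft.fft2(image)
    return np.real(np.fft.ifft2(np.conj(H) * G / (np.abs(H) ** 2 + k)))
```

With a measured PSF in place of the Gaussian model, the same filter applies; a smaller k sharpens more aggressively at the cost of amplifying noise.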
Smartphones in ecology and evolution: a guide for the app-rehensive.
Teacher, Amber G F; Griffiths, David J; Hodgson, David J; Inger, Richard
2013-12-01
Smartphones and their apps (application software) are now used by millions of people worldwide and represent a powerful combination of sensors, information transfer, and computing power that deserves better exploitation by ecological and evolutionary researchers. We outline the development process for research apps, provide contrasting case studies for two new research apps, and scan the research horizon to suggest how apps can contribute to the rapid collection, interpretation, and dissemination of data in ecology and evolutionary biology. We emphasize that the usefulness of an app relies heavily on the development process, recommend that app developers are engaged with the process at the earliest possible stage, and commend efforts to create open-source software scaffolds on which customized apps can be built by nonexperts. We conclude that smartphones and their apps could replace many traditional handheld sensors, calculators, and data storage devices in ecological and evolutionary research. We identify their potential use in the high-throughput collection, analysis, and storage of complex ecological information.
Photoacoustic Imaging with a Commercial Ultrasound System and a Custom Probe
Wang, Xueding; Fowlkes, J. Brian; Cannata, Jonathan M.; Hu, Changhong; Carson, Paul L.
2010-01-01
Building photoacoustic imaging (PAI) systems by using stand-alone ultrasound (US) units makes it convenient to take advantage of the state-of-the-art ultrasonic technologies. However, the sometimes limited receiving sensitivity and the comparatively narrow bandwidth of commercial US probes may not be sufficient to acquire high quality photoacoustic images. In this work, a high-speed PAI system has been developed using a commercial US unit and a custom-built 128-element piezoelectric-polymer array (PPA) probe using a P(VDF-TrFE) film and flexible circuit to define the elements. Since the US unit supports simultaneous signal acquisition from 64 parallel receive channels, PAI data for synthetic image formation from a 64- or 128-element array aperture can be acquired after a single or dual laser firing, respectively. Therefore, 2D B-scan imaging can be achieved with a maximum frame rate of up to 10 Hz, limited only by the laser repetition rate. The unique properties of P(VDF-TrFE) facilitated a wide -6 dB receiving bandwidth of over 120% for the array. A specially designed 128-channel preamplifier board made the connection between the array and the system cable, which not only enabled element electrical impedance matching but also elevated the signal-to-noise ratio (SNR) to enhance the detection of weak photoacoustic signals. Through experiments on phantoms and rabbit ears, the good performance of this PAI system was demonstrated. PMID:21276653
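Synthetic image formation from the channel data described above is commonly a delay-and-sum operation; in photoacoustics the acoustic path is one-way (absorber to element), so each pixel's delay is simply distance divided by the sound speed. The sketch below is a simplified, unapodised Python/NumPy illustration under assumed geometry and sampling parameters, not the commercial unit's beamformer.

```python
# Illustrative one-way delay-and-sum reconstruction for photoacoustic data.
# Geometry, sound speed, and sampling rate are assumed example values.
import numpy as np

def delay_and_sum(rf, elem_x, pixels, c=1540.0, fs=40e6):
    """rf:     (n_elements, n_samples) received channel data
    elem_x: (n_elements,) lateral element positions in metres (array at z=0)
    pixels: (n_pixels, 2) array of (x, z) pixel positions in metres
    Returns one amplitude per pixel (no apodisation or interpolation)."""
    n_elem, n_samp = rf.shape
    image = np.zeros(len(pixels))
    for i, (px, pz) in enumerate(pixels):
        dist = np.sqrt((elem_x - px) ** 2 + pz ** 2)   # one-way path length
        idx = np.round(dist / c * fs).astype(int)      # sample index per element
        valid = idx < n_samp
        image[i] = rf[np.arange(n_elem)[valid], idx[valid]].sum()
    return image
```

A production beamformer would add apodisation, sub-sample interpolation, and vectorisation over pixels, but the delay law is the same.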
An Integrated Information System for Supporting Quality Management Tasks
NASA Astrophysics Data System (ADS)
Beyer, N.; Helmreich, W.
2004-08-01
In a competitive environment, well defined processes become the strategic advantage of a company. Hence, targeted Quality Management ensures efficiency, transparency and, ultimately, customer satisfaction. In the particular context of a Space Test Centre, a number of specific Quality Management standards have to be applied. According to the revision of ISO 9001 during 2000, and due to the adaptation of ECSS-Q20-07, process orientation and data analysis are key tasks for ensuring and evaluating the efficiency of a company's processes. In line with these requirements, an integrated management system for accessing the necessary information to support Quality Management and other processes has been established. Some of its test-related features are presented here. Easy access to the integrated management system from any work place at IABG's Space Test Centre is ensured by means of an intranet portal. It comprises a full set of quality-related process descriptions, information on test facilities, emergency procedures, and other relevant information. The portal's web interface provides direct access to a couple of external applications. Moreover, easy updating of all information and low-cost maintenance are features of this integrated information system. The timely and transparent management of non-conformances is covered by a dedicated NCR database which incorporates full documentation capability, electronic signature and e-mail notification of concerned staff. A search interface allows for queries across all documented non-conformances. Furthermore, print versions can be generated at any stage in the process, e.g. for distribution to customers. Feedback on customer satisfaction is sought through a web-based questionnaire. 
The process is initiated by the responsible test manager through submission of an e-mail that contains a hyperlink to a secure website, asking the customer to complete the brief online form, which is directly fed to a database for subsequent evaluation by the Quality Manager. All such information can be processed and presented in an appropriate manner for internal or external audits, as well as for regular management reviews.
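The NCR database and its cross-report search can be pictured with a minimal relational sketch. The schema, field names, and sample records below are illustrative assumptions, not IABG's actual system.

```python
# Minimal sketch of an NCR (non-conformance report) store with a search
# across all documented reports; schema and data are illustrative only.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE ncr (
    id INTEGER PRIMARY KEY,
    facility TEXT,
    description TEXT,
    status TEXT DEFAULT 'open',
    signed_by TEXT)""")
con.executemany(
    "INSERT INTO ncr (facility, description, status, signed_by) VALUES (?, ?, ?, ?)",
    [("vibration", "accelerometer channel 12 drift", "closed", "qm.huber"),
     ("thermal-vacuum", "chamber pressure sensor drift", "open", None),
     ("acoustic", "microphone calibration overdue", "open", None)])

def search_ncrs(term):
    """Query across all documented non-conformances, as the portal's
    search interface would."""
    cur = con.execute(
        "SELECT facility, description, status FROM ncr WHERE description LIKE ?",
        (f"%{term}%",))
    return cur.fetchall()
```

The electronic-signature and e-mail-notification features would sit on top of such a store, updating the signed_by and status columns and mailing the concerned staff on each state change.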
Shuttle operations era planning for flight operations
NASA Technical Reports Server (NTRS)
Holt, J. D.; Beckman, D. A.
1984-01-01
The Space Transportation System (STS) provides routine access to space for a wide range of customers in which cargos vary from single payloads on dedicated flights to multiple payloads that share Shuttle resources. This paper describes the flight operations planning process from payload introduction through flight assignment to execution of the payload objectives and the changes that have been introduced to improve that process. Particular attention is given to the factors that influence the amount of preflight preparation necessary to satisfy customer requirements. The partnership between the STS operations team and the customer is described in terms of their functions and responsibilities in the development of a flight plan. A description of the Mission Control Center (MCC) and payload support capabilities completes the overview of Shuttle flight operations.
NASA Astrophysics Data System (ADS)
Koon, Phillip L.; Greene, Scott
2002-07-01
Our aerospace customers are demanding that we drastically reduce the cost of operating and supporting our products. Our space customer in particular is looking for the next generation of reusable launch vehicle systems to support more aircraft-like operation. Achieving this goal requires more than an evolution in materials, processes and systems; what is required is a paradigm shift in the design of the launch vehicles and of the processing systems that support them. This paper describes the Automated Informed Maintenance System (AIM) we are developing for NASA's Space Launch Initiative (SLI) Second Generation Reusable Launch Vehicle (RLV). Our system includes an Integrated Health Management (IHM) system for the launch vehicles and ground support systems, which features model-based diagnostics and prognostics. Health Management data is used by our AIM decision support and process aids to automatically plan maintenance, generate work orders, and schedule maintenance activities along with the resources required to execute these processes. Our system will automate the ground processing for a spaceport handling multiple RLVs executing multiple missions. To accomplish this task we are applying the latest web-based distributed computing technologies and application development techniques.
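The step from IHM prognostics to automatically generated, prioritised work orders can be sketched as a simple planning rule. The thresholds, component names, and data shapes below are illustrative assumptions, not the actual AIM logic.

```python
# Hypothetical sketch of turning IHM prognostics into work orders;
# thresholds and field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class HealthReading:
    component: str
    remaining_life_hours: float  # prognostic estimate from the IHM model

@dataclass
class WorkOrder:
    component: str
    action: str
    priority: int  # 1 = most urgent

def plan_maintenance(readings, mission_hours=100.0):
    """Components whose predicted remaining life does not cover the next
    mission get a work order; less margin means higher priority."""
    orders = []
    for r in sorted(readings, key=lambda r: r.remaining_life_hours):
        if r.remaining_life_hours < mission_hours:
            priority = 1 if r.remaining_life_hours < 0.5 * mission_hours else 2
            orders.append(WorkOrder(r.component, "inspect/replace", priority))
    return orders
```

A real implementation would also schedule the orders against shared spaceport resources (bays, crews, ground support equipment) across multiple vehicles and missions, which is where the decision-support aids come in.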