Verloock, Leen; Joseph, Wout; Gati, Azeddine; Varsier, Nadège; Flach, Björn; Wiart, Joe; Martens, Luc
2013-06-01
An experimental validation of a low-cost method for extrapolation and estimation of the maximal electromagnetic-field exposure from long-term evolution (LTE) radio base station installations is presented. The low-cost method requires no knowledge of downlink band occupation or service characteristics and is applicable in situ. It requires only a basic spectrum analyser with appropriate field probes, without the need for expensive dedicated LTE decoders. The method is validated both in the laboratory and in situ, for a single-input single-output LTE antenna system and a 2×2 multiple-input multiple-output system, with low deviations in comparison with signals measured using dedicated LTE decoders.
Coates, Jennifer; Colaiezzi, Brooke; Fiedler, John L; Wirth, James; Lividini, Keith; Rogers, Beatrice
2012-09-01
Dietary assessment data are essential for designing, monitoring, and evaluating food fortification and other food-based nutrition programs. Planners and managers must understand the validity, usefulness, and cost tradeoffs of employing alternative dietary assessment methods, but little guidance exists. To identify and apply criteria to assess the tradeoffs of using alternative dietary methods for meeting fortification programming needs. Twenty-five semistructured expert interviews were conducted and literature was reviewed for information on the validity, usefulness, and cost of using 24-hour recalls, Food Frequency Questionnaires/Fortification Rapid Assessment Tool (FFQ/FRAT), Food Balance Sheets (FBS), and Household Consumption and Expenditures Surveys (HCES) for program stage-specific information needs. Criteria were developed and applied to construct relative rankings of the four methods. Needs assessment: HCES offers the greatest suitability at the lowest cost for estimating the risk of inadequate intakes, but relative to 24-hour recall compromises validity. HCES should be used to identify vehicles and to estimate coverage and likely impact due to its low cost and moderate-to-high validity. Baseline assessment: 24-hour recall should be applied using a representative sample. Monitoring: A simple, low-cost FFQ can be used to monitor coverage. Impact evaluation: 24-hour recall should be used to assess changes in nutrient intakes. FBS have low validity relative to other methods for all programmatic purposes. Each dietary assessment method has strengths and weaknesses that vary by context and purpose. Method selection must be driven by the program's data needs, the suitability of the methods for the purpose, and a clear understanding of the tradeoffs involved.
Benchmark tests for a Formula SAE Student car prototyping
NASA Astrophysics Data System (ADS)
Mariasiu, Florin
2011-12-01
Aerodynamic characteristics are important elements in a vehicle's design and construction. A low drag coefficient brings significant fuel savings and increased engine power efficiency. In designing and developing vehicles, dedicated CFD (Computational Fluid Dynamics) software packages are used in the computer simulation process to determine a vehicle's aerodynamic characteristics. However, the results obtained by this faster and cheaper method are validated by experiments in wind tunnels, which are expensive and in which complex testing equipment is used at relatively high cost. Therefore, the emergence and development of new low-cost testing methods to validate CFD simulation results would bring great economic benefits to the vehicle prototyping process. This paper presents the initial development process of a Formula SAE Student race-car prototype using CFD simulation, and also presents a measurement system based on low-cost sensors through which the CFD simulation results were experimentally validated. The CFD software package used for simulation was SolidWorks with the FloXpress add-on, and the experimental measurement system was built using four FlexiForce piezoresistive force sensors.
Romero-Franco, Natalia; Jiménez-Reyes, Pedro; Montaño-Munuera, Juan A
2017-11-01
Lower limb isometric strength is a key parameter to monitor the training process or recognise muscle weakness and injury risk. However, valid and reliable methods to evaluate it often require high-cost tools. The aim of this study was to analyse the concurrent validity and reliability of a low-cost digital dynamometer for measuring isometric strength in lower limb. Eleven physically active and healthy participants performed maximal isometric strength for: flexion and extension of ankle, flexion and extension of knee, flexion, extension, adduction, abduction, internal and external rotation of hip. Data obtained by the digital dynamometer were compared with the isokinetic dynamometer to examine its concurrent validity. Data obtained by the digital dynamometer from 2 different evaluators and 2 different sessions were compared to examine its inter-rater and intra-rater reliability. Intra-class correlation (ICC) for validity was excellent in every movement (ICC > 0.9). Intra and inter-tester reliability was excellent for all the movements assessed (ICC > 0.75). The low-cost digital dynamometer demonstrated strong concurrent validity and excellent intra and inter-tester reliability for assessing isometric strength in the main lower limb movements.
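Intra-class correlations like those reported above (validity ICC > 0.9, reliability ICC > 0.75) are typically computed from a subjects-by-raters table. As an illustrative sketch only (not the study's analysis code), a minimal Python implementation of ICC(2,1) — two-way random effects, absolute agreement, single measures, a common choice for inter-rater reliability — might look like:

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random, absolute agreement, single measures.
    ratings: n_subjects x k_raters array of scores."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    mean_subj = ratings.mean(axis=1)   # per-subject means
    mean_rater = ratings.mean(axis=0)  # per-rater means
    ss_total = ((ratings - grand) ** 2).sum()
    ss_rows = k * ((mean_subj - grand) ** 2).sum()   # between subjects
    ss_cols = n * ((mean_rater - grand) ** 2).sum()  # between raters
    ss_err = ss_total - ss_rows - ss_cols            # residual
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)
```

Perfectly agreeing raters yield an ICC of 1.0; disagreement drives it toward 0.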
Chang, Jasper O; Levy, Susan S; Seay, Seth W; Goble, Daniel J
2014-05-01
Recent guidelines advocate that sports medicine professionals use balance tests to assess sensorimotor status in the management of concussions. The present study sought to determine whether a low-cost balance board could provide a valid, reliable, and objective means of performing this balance testing. Criterion validity testing relative to a gold standard and 7-day test-retest reliability. University biomechanics laboratory. Thirty healthy young adults. Balance ability was assessed on 2 days separated by 1 week using (1) a gold standard measure (ie, a scientific-grade force plate), (2) a low-cost Nintendo Wii Balance Board (WBB), and (3) the Balance Error Scoring System (BESS). Validity of the WBB center-of-pressure path length and BESS scores was determined relative to the force plate data. Test-retest reliability was established based on intraclass correlation coefficients. Composite scores for the WBB had excellent validity (r = 0.99) and test-retest reliability (R = 0.88). Both the validity (r = 0.10-0.52) and test-retest reliability (r = 0.61-0.78) were lower for the BESS. These findings demonstrate that a low-cost balance board can provide improved balance testing accuracy/reliability compared with the BESS. This approach provides a potentially more valid/reliable, yet affordable, means of assessing sports-related concussion compared with current methods.
Montgomery, Jill D; Hensler, Heather R; Jacobson, Lisa P; Jenkins, Frank J
2008-07-01
The aim of the present study was to determine whether the Alpha DigiDoc RT system would be an effective method of quantifying immunohistochemical staining compared with a manual counting method, which is considered the gold standard. Two readers counted 31 samples by both methods. Bland-Altman analysis of concordance showed no statistically significant difference between the 2 methods. Thus, the Alpha DigiDoc RT system is an effective, low-cost method for quantifying immunohistochemical data.
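The Bland-Altman approach assesses concordance between two measurement methods by examining the bias (mean of the paired differences) and its 95% limits of agreement. A minimal sketch in Python, using made-up example values rather than the study's data:

```python
import numpy as np

def bland_altman(a, b):
    """Bias and 95% limits of agreement between paired measurements
    from two methods. Returns (bias, (lower_loa, upper_loa))."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)  # sample standard deviation of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

Methods agree well when the bias is near zero and the limits of agreement are narrow relative to the measurement scale.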
Rebouças, Camila Tavares; Kogawa, Ana Carolina; Salgado, Hérida Regina Nunes
2018-05-18
Background: A green analytical chemistry method was developed for quantification of enrofloxacin in tablets. The drug, a second-generation fluoroquinolone, was first introduced in veterinary medicine for the treatment of various bacterial species. Objective: This study proposed to develop, validate, and apply a reliable, low-cost, fast, and simple IR spectroscopy method for quantitative routine determination of enrofloxacin in tablets. Methods: The method was completely validated according to the International Conference on Harmonisation guidelines, showing accuracy, precision, selectivity, robustness, and linearity. Results: It was linear over the concentration range of 1.0-3.0 mg with correlation coefficients >0.9999 and LOD and LOQ of 0.12 and 0.36 mg, respectively. Conclusions: Now that this IR method has met performance qualifications, it can be adopted and applied for the analysis of enrofloxacin tablets for production process control. The validated method can also be utilized to quantify enrofloxacin in tablets and thus is an environmentally friendly alternative for the routine analysis of enrofloxacin in quality control. Highlights: A new green method for the quantitative analysis of enrofloxacin by Fourier-Transform Infrared spectroscopy was validated. It is a fast, clean and low-cost alternative for the evaluation of enrofloxacin tablets.
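LOD and LOQ figures in such validations are commonly derived, per ICH guidance, as 3.3·s/S and 10·s/S, where s is the residual standard deviation of the calibration line and S its slope. A hedged Python sketch (the calibration data below are illustrative, not the paper's):

```python
import numpy as np

def lod_loq(conc, response):
    """ICH-style LOD = 3.3*s/S and LOQ = 10*s/S from a linear calibration,
    with s the residual std of the fit and S its slope."""
    conc = np.asarray(conc, float)
    response = np.asarray(response, float)
    slope, intercept = np.polyfit(conc, response, 1)
    resid = response - (slope * conc + intercept)
    s = np.std(resid, ddof=2)  # ddof=2: two fitted parameters
    return 3.3 * s / slope, 10 * s / slope
```

By construction LOQ is about three times LOD, matching the 0.12/0.36 mg ratio reported above.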
USDA-ARS?s Scientific Manuscript database
Current methods for assessing children's dietary intake, such as interviewer-administered 24-h dietary recall (24-h DR), are time consuming and resource intensive. Self-administered instruments offer a low-cost diet assessment method for use with children. The present study assessed the validity of ...
Are Education Cost Functions Ready for Prime Time? An Examination of Their Validity and Reliability
ERIC Educational Resources Information Center
Duncombe, William; Yinger, John
2011-01-01
This article makes the case that cost functions are the best available methodology for ensuring consistency between a state's educational accountability system and its education finance system. Because they are based on historical data and well-known statistical methods, cost functions are a particularly flexible and low-cost way to forecast what…
Estimation of viscoelastic surface wave parameters using a low cost optical deflection method
NASA Astrophysics Data System (ADS)
Brum, J.; Balay, G.; Arzúa, A.; Núñez, I.; Negreira, C.
2010-01-01
In this work an optical deflection method was used to study surface vibrations created by a low-frequency source placed on the sample's surface. The optical method consists in directing a laser beam perpendicular to the sample's surface (a gelatine-based phantom). A beam-splitter placed between the laser and the sample projects the reflected beam onto a screen. As the surface moves under the action of the low-frequency source, the laser spot on the screen also moves. Recording this movement with a digital camera allows us to reconstruct the surface motion using the law of light reflection. If the scattering of the surface is very strong (such as in biological tissue), a lens is placed between the surface and the beam-splitter to collect the scattered light. As a validation method, the surface movement was measured using a 10 MHz ultrasonic transducer placed normal to the surface in pulse-echo mode. The optical measurements were in complete agreement with the acoustical measurements. The optical measurement has the following advantages over the acoustic one: two-dimensional motion can be recorded, and it is low cost. Since the acquisition was synchronized and the source-to-laser-beam distance is known, measuring the time of flight yields an estimate of the surface wave velocity, from which the elasticity of the sample can be measured. The authors conclude that a reliable, low-cost optical method for obtaining surface wave parameters of biological tissue was developed and successfully validated.
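The velocity estimate described above reduces to distance over time of flight, from which a rough elasticity can be inferred. The sketch below is our own simplification, not the authors' procedure: it uses the common elastography approximation μ = ρ·cs², taking the shear speed cs as roughly the surface (Rayleigh) speed divided by 0.95, and all numbers are hypothetical.

```python
def surface_wave_speed(distance_m, time_s):
    """Wave speed from source-to-beam distance and measured time of flight."""
    return distance_m / time_s

def shear_modulus(speed_m_s, density_kg_m3=1000.0, rayleigh_factor=0.95):
    """Rough shear modulus estimate mu = rho * cs^2, approximating the
    shear speed cs from the surface (Rayleigh) speed via cs ~ cR / 0.95."""
    cs = speed_m_s / rayleigh_factor
    return density_kg_m3 * cs ** 2
```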
INTERMINGLED SKIN GRAFTING: A VALID TRANSPLANTATION METHOD AT LOW COST
Domres, B.; Kistler, D.; Rutczynska, J.
2007-01-01
Summary The almost forgotten method of intermingled skin grafting of allogeneic material with small autogeneic islets, once developed in the People's Republic of China, proves the feasibility of permanent healing of even extensive burn wounds at low cost, and is therefore an effective treatment option in poorer countries as well as under the conditions of a burn disaster. Intermingled skin grafting achieves better elasticity of the reconditioned skin, as elastic fibres of the allodermis survive, and this results in fewer contractures. From the cosmetic point of view, the transplantation of autologous keratinocytes results in a better, aesthetically homogeneous texture. PMID:21991087
Detecting and treating occlusal caries lesions: a cost-effectiveness analysis.
Schwendicke, F; Stolpe, M; Meyer-Lueckel, H; Paris, S
2015-02-01
The health gains and costs resulting from using different caries detection strategies might depend not only on the accuracy of the method used but also on the treatment emanating from its use in different populations. We compared combinations of visual-tactile, radiographic, or laser-fluorescence-based detection methods with 1 of 3 treatments (non-, micro-, and invasive treatment) initiated at different cutoffs (treating all or only dentinal lesions) in populations with low or high caries prevalence. A Markov model was constructed to follow an occlusal surface in a permanent molar in an initially 12-y-old male German patient over his lifetime. Prevalence data and transition probabilities were extracted from the literature, while validity parameters of different methods were synthesized or obtained from systematic reviews. Microsimulations were performed to analyze the model, assuming a German health care setting and a mixed public-private payer perspective. Radiographic and fluorescence-based methods led to more overtreatments, especially in populations with low prevalence. For the latter, combining visual-tactile or radiographic detection with microinvasive treatment retained teeth longest (mean 66 y) at lowest costs (329 and 332 Euro, respectively), while combining radiographic or fluorescence-based detection with invasive treatment was the least cost-effective (<60 y, >700 Euro). In populations with high prevalence, combining radiographic detection with microinvasive treatment was most cost-effective (63 y, 528 Euro), while sensitive detection methods combined with invasive treatments were again the least cost-effective (<59 y, >690 Euro). The suitability of detection methods differed significantly between populations, and the cost-effectiveness was greatly influenced by the treatment initiated after lesion detection. The accuracy of a detection method relative to a "gold standard" did not automatically translate into better health or reduced costs.
Detection methods should be evaluated not only against their criterion validity but also the long-term effects resulting from their use in different populations. © International & American Associations for Dental Research 2014.
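A Markov microsimulation of a single tooth surface, of the kind used in the study above, repeatedly samples yearly state transitions and accumulates retained years and costs. The following is a toy sketch of the technique only: the states, transition probabilities, and costs are illustrative placeholders, not the paper's calibrated values.

```python
import random

# Illustrative yearly transition probabilities between tooth-surface states.
P = {
    "sound":     {"sound": 0.97, "lesion": 0.03},
    "lesion":    {"lesion": 0.90, "restored": 0.10},
    "restored":  {"restored": 0.95, "extracted": 0.05},
    "extracted": {"extracted": 1.0},
}
# Illustrative cost (Euro) incurred on entering each state.
COST = {"sound": 0, "lesion": 0, "restored": 40, "extracted": 120}

def simulate(years=70, seed=None):
    """One microsimulation run: returns (years tooth retained, total cost)."""
    rng = random.Random(seed)
    state, retained, cost = "sound", 0, 0
    for _ in range(years):
        nxt = rng.choices(list(P[state]), weights=list(P[state].values()))[0]
        if nxt != state:
            cost += COST[nxt]  # pay on state change
        state = nxt
        if state != "extracted":
            retained += 1
    return retained, cost
```

A real analysis would average many such runs per strategy and compare mean retained years against mean costs.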
USDA-ARS?s Scientific Manuscript database
Sunflower and soybean are summer annuals that can be grown as an alternative to corn and may be particularly useful in organic production systems. Rapid and low cost methods of analyzing plant quality would be helpful for crop management. We developed and validated calibration models for Near-infrar...
The U.S. EPA’s in vitro bioaccessibility (IVBA) method 9200.1-86 defines a validated analytical procedure for the determination of lead bioaccessibility in contaminated soils. The method requires the use of a custom-fabricated extraction device that uses a heated water bath for ...
NASA Astrophysics Data System (ADS)
Lee, Peter; Calvo, Conrado J.; Alfonso-Almazán, José M.; Quintanilla, Jorge G.; Chorro, Francisco J.; Yan, Ping; Loew, Leslie M.; Filgueiras-Rama, David; Millet, José
2017-02-01
Panoramic optical mapping is the primary method for imaging electrophysiological activity from the entire outer surface of Langendorff-perfused hearts. To date, it is the only method of simultaneously measuring multiple key electrophysiological parameters, such as transmembrane voltage and intracellular free calcium, at high spatial and temporal resolution. Despite the impact it has already had on the fields of cardiac arrhythmias and whole-heart computational modeling, present-day system designs preclude its adoption by the broader cardiovascular research community because of their high costs. Taking advantage of recent technological advances, we developed and validated low-cost optical mapping systems for panoramic imaging using Langendorff-perfused pig hearts, a clinically relevant model in basic research and bioengineering. By significantly lowering financial thresholds, this powerful cardiac electrophysiology imaging modality may gain wider use in research and, even, teaching laboratories, which we substantiated using the lower-cost Langendorff-perfused rabbit heart model.
Validation of a Low-Cost Paper-Based Screening Test for Sickle Cell Anemia
Piety, Nathaniel Z.; Yang, Xiaoxi; Kanter, Julie; Vignes, Seth M.; George, Alex; Shevkoplyas, Sergey S.
2016-01-01
Background The high childhood mortality and life-long complications associated with sickle cell anemia (SCA) in developing countries could be significantly reduced with effective prophylaxis and education if SCA is diagnosed early in life. However, conventional laboratory methods used for diagnosing SCA remain prohibitively expensive and impractical in this setting. This study describes the clinical validation of a low-cost paper-based test for SCA that can accurately identify sickle trait carriers (HbAS) and individuals with SCA (HbSS) among adults and children over 1 year of age. Methods and Findings In a population of healthy volunteers and SCA patients in the United States (n = 55) the test identified individuals whose blood contained any HbS (HbAS and HbSS) with 100% sensitivity and 100% specificity for both visual evaluation and automated analysis, and detected SCA (HbSS) with 93% sensitivity and 94% specificity for visual evaluation and 100% sensitivity and 97% specificity for automated analysis. In a population of post-partum women (with a previously unknown SCA status) at a primary obstetric hospital in Cabinda, Angola (n = 226) the test identified sickle cell trait carriers with 94% sensitivity and 97% specificity using visual evaluation (none of the women had SCA). Notably, our test permits instrument- and electricity-free visual diagnostics, requires minimal training to be performed, can be completed within 30 minutes, and costs about $0.07 in test-specific consumable materials. Conclusions Our results validate the paper-based SCA test as a useful low-cost tool for screening adults and children for sickle trait and disease and demonstrate its practicality in resource-limited clinical settings. PMID:26735691
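Sensitivity and specificity figures such as the 93%/94% quoted above follow directly from confusion-matrix counts. A minimal sketch (the counts below are chosen purely to reproduce those percentages for illustration; they are not the study's raw data):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN), specificity = TN/(TN+FP)
    from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity
```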
Technique for Very High Order Nonlinear Simulation and Validation
NASA Technical Reports Server (NTRS)
Dyson, Rodger W.
2001-01-01
Finding the sources of sound in large nonlinear fields via direct simulation currently requires excessive computational cost. This paper describes a simple technique for efficiently solving the multidimensional nonlinear Euler equations that significantly reduces this cost and demonstrates a useful approach for validating high-order nonlinear methods. Methods of up to 15th-order accuracy in space and time were compared, and it is shown that an algorithm with a fixed design accuracy approaches its maximal utility and then its usefulness decays exponentially unless higher accuracy is used. It is concluded that at least a 7th-order method is required to efficiently propagate a harmonic wave using the nonlinear Euler equations to a distance of 5 wavelengths while maintaining an overall error tolerance low enough to capture both the mean flow and the acoustics.
Automated extraction and validation of children's gait parameters with the Kinect.
Motiian, Saeid; Pergami, Paola; Guffey, Keegan; Mancinelli, Corrie A; Doretto, Gianfranco
2015-12-02
Gait analysis for therapy regimen prescription and monitoring requires patients to physically access clinics with specialized equipment. The timely availability of such infrastructure at the right frequency is especially important for small children. Besides being very costly, this is a challenge for many children living in rural areas. This is why this work develops a low-cost, portable, and automated approach for in-home gait analysis, based on the Microsoft Kinect. A robust and efficient method for extracting gait parameters is introduced, which copes with the high variability of noisy Kinect skeleton tracking data experienced across the population of young children. This is achieved by temporally segmenting the data with an approach based on coupling a probabilistic matching of stride template models, learned offline, with the estimation of their global and local temporal scaling. A preliminary study conducted on healthy children between 2 and 4 years of age analyzes the accuracy, precision, repeatability, and concurrent validity of the proposed method against the GAITRite when measuring several spatial and temporal children's gait parameters. The method has excellent accuracy and good precision in segmenting temporal sequences of body joint locations into stride and step cycles. Also, the spatial and temporal gait parameters, estimated automatically, exhibit good concurrent validity with those provided by the GAITRite, as well as very good repeatability. In particular, on a range of nine gait parameters, the relative and absolute agreements were found to be good and excellent, and the overall agreements were found to be good and moderate. This work enables and validates the automated use of the Kinect for children's gait analysis in healthy subjects. In particular, the approach takes a step toward developing a low-cost, portable, parent-operated in-home tool for clinicians assisting young children.
Bitella, Giovanni; Rossi, Roberta; Bochicchio, Rocco; Perniola, Michele; Amato, Mariana
2014-10-21
Monitoring soil water content at high spatio-temporal resolution, coupled to other sensor data, is crucial for applications oriented towards water sustainability in agriculture, such as precision irrigation or phenotyping root traits for drought tolerance. The cost of instrumentation, however, limits measurement frequency and the number of sensors. The objective of this work was to design a low-cost "open hardware" platform for multi-sensor measurements including water content at different depths and air and soil temperatures. The system is based on an open-source ARDUINO microcontroller board, programmed in a simple integrated development environment (IDE). Low-cost, high-frequency dielectric probes were used in the platform and lab-tested on three non-saline soils (EC 1:2.5 < 0.1 mS/cm). Empirical calibration curves were subjected to cross-validation (leave-one-out method), and the normalized root mean square errors (NRMSE) were, respectively, 0.09 for the overall model, 0.09 for the sandy soil, 0.07 for the clay loam, and 0.08 for the sandy loam. The overall model (pooled soil data) fitted the data very well (R2 = 0.89), showing high stability, being able to generate very similar RMSEs during training and validation (RMSE(training) = 2.63; RMSE(validation) = 2.61). Data recorded on the card were automatically sent to a remote server, allowing repeated field-data quality checks. This work provides a framework for the replication and upgrading of a customized low-cost platform, consistent with the open-source approach whereby sharing information on equipment design and software facilitates the adoption and continuous improvement of existing technologies.
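Leave-one-out cross-validation of a calibration curve, as applied above, refits the model n times, each time predicting the single held-out point. A hedged one-dimensional linear sketch in Python (our own illustration; the study's probe calibration functions are not reproduced here):

```python
import numpy as np

def loo_rmse(x, y):
    """Leave-one-out cross-validated RMSE and range-normalized NRMSE
    for a 1-D linear calibration y ~ a + b*x."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    errs = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i        # hold out point i
        b, a = np.polyfit(x[mask], y[mask], 1)  # refit on the rest
        errs.append(y[i] - (a + b * x[i]))      # predict the held-out point
    rmse = float(np.sqrt(np.mean(np.square(errs))))
    nrmse = rmse / (y.max() - y.min())
    return rmse, nrmse
```

For perfectly linear data the cross-validated error vanishes; real calibration data yield the kind of NRMSE values (0.07-0.09) reported above.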
Can a low-cost webcam be used for a remote neurological exam?
Wood, Jeffrey; Wallin, Mitchell; Finkelstein, Joseph
2013-01-01
Multiple sclerosis (MS) is a demyelinating and axonal degenerative disease of the central nervous system. It is the most common progressive neurological disorder of young adults, affecting over 1 million persons worldwide. Despite the increased use of neuroimaging and other tools to measure MS morbidity, the neurological examination remains the primary method to document relapses and progression in the disease. The goal of this study was to demonstrate the feasibility and validity of using a low-cost webcam for remote neurological examination in the home setting for patients with MS. Using a cross-over design, 20 MS patients were evaluated in person and via remote televisit, and the results of the neurological evaluations were compared. Overall, we found that agreement between face-to-face and remote EDSS evaluation was sufficient to provide clinically valid information. Another important finding of this study was the high acceptance among patients and their providers of using remote televisits for conducting neurological examinations in MS patients' homes. The results of this study demonstrated the potential of using low-cost webcams for remote neurological exams in patients with MS.
Maschio, Federico; Pandya, Mirali; Olszewski, Raphael
2016-01-01
Background The objective of this study was to investigate the accuracy of 3-dimensional (3D) plastic (ABS) models generated using a low-cost 3D fused deposition modelling printer. Material/Methods Two human dry mandibles were scanned with a cone beam computed tomography (CBCT) Accuitomo device. Preprocessing consisted of 3D reconstruction with Maxilim software and STL file repair with Netfabb software. Then, the data were used to print 2 plastic replicas with a low-cost 3D fused deposition modeling printer (Up plus 2®). Two independent observers performed the identification of 26 anatomic landmarks on the 4 mandibles (2 dry and 2 replicas) with a 3D measuring arm. Each observer repeated the identifications 20 times. The comparison between the dry and plastic mandibles was based on 13 distances: 8 distances less than 12 mm and 5 distances greater than 12 mm. Results The mean absolute difference (MAD) was 0.37 mm, and the mean dimensional error (MDE) was 3.76%. The MDE decreased to 0.93% for distances greater than 12 mm. Conclusions Plastic models generated using the low-cost 3D printer UPplus2® provide dimensional accuracies comparable to other well-established rapid prototyping technologies. Validated low-cost 3D printers could represent a step toward the better accessibility of rapid prototyping technologies in the medical field. PMID:27003456
Dahlqvist, Camilla; Hansson, Gert-Åke; Forsman, Mikael
2016-07-01
Repetitive work and work in constrained postures are risk factors for developing musculoskeletal disorders. Low-cost, user-friendly technical methods to quantify these risks are needed. The aims were to validate inclination angles and velocities of one model of the new generation of accelerometers with integrated data loggers against a previously validated one, and to compare measurements obtained using a plain reference posture with those obtained using a standardized one. All mean (n = 12 subjects) angular RMS differences in 4 work tasks and 4 body parts were <2.5°, and all mean median angular velocity differences were <5.0°/s. The mean correlation between the inclination signal pairs was 0.996. This model of the new generation of triaxial accelerometers proved to be comparable to the validated accelerometer using a data logger. This makes it well suited, for both researchers and practitioners, to measuring postures and movements during work. Further work is needed for validation of the plain reference posture for the upper arms. Copyright © 2016 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Performance of low-cost monitors to assess household air pollution.
Curto, A; Donaire-Gonzalez, D; Barrera-Gómez, J; Marshall, J D; Nieuwenhuijsen, M J; Wellenius, G A; Tonne, C
2018-05-01
Exposure to household air pollution is a leading cause of morbidity and mortality globally. However, due to the lack of validated low-cost monitors with long-lasting batteries in indoor environments, most epidemiologic studies use self-reported data or short-term household air pollution assessments as proxies of long-term exposure. We evaluated the performance of three low-cost monitors measuring fine particulate matter (PM2.5) and carbon monoxide (CO) in a wood-combustion experiment conducted in one household in Spain for 5 days (including the co-location of 2 units of HAPEX and 3 units of TZOA-R for PM2.5 and 3 units of EL-USB-CO for CO; a total of 40 unit-days). We used Spearman correlation (ρ) and the Concordance Correlation Coefficient (CCC) to assess the accuracy of the low-cost monitors versus equivalent research-grade devices. We also conducted a field study in India for 1 week (including HAPEX in 3 households and EL-USB-CO in 4 households; a total of 49 unit-days). Correlation and agreement at 5-min resolution were moderate-high for one unit of HAPEX (ρ = 0.73 / CCC = 0.59), for one unit of TZOA-R (ρ = 0.89 / CCC = 0.62), and for three units of EL-USB-CO (ρ = 0.82-0.89 / CCC = 0.66-0.91) in Spain, although the failure or malfunction rate among low-cost units was high in both settings (60% of unit-days in Spain and 43% in India). The low-cost monitors tested here are not yet ready to replace more established exposure assessment methods in long-term household air pollution epidemiologic studies. More field validation is needed to assess evolving sensors and monitors with application to health studies. Copyright © 2018 Elsevier Inc. All rights reserved.
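Spearman's ρ captures monotonic association, while Lin's CCC additionally penalizes departures from the 1:1 line, which is why the two can diverge (e.g., ρ = 0.89 but CCC = 0.62 above). A minimal sketch of both, assuming untied data (no tie correction is applied):

```python
import numpy as np

def spearman_rho(a, b):
    """Spearman rank correlation; assumes distinct values (no tie handling)."""
    ra = np.argsort(np.argsort(a))  # ranks of a
    rb = np.argsort(np.argsort(b))  # ranks of b
    return float(np.corrcoef(ra, rb)[0, 1])

def lin_ccc(a, b):
    """Lin's concordance correlation coefficient: agreement with the 1:1 line."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    cov = ((a - a.mean()) * (b - b.mean())).mean()  # population covariance
    return float(2 * cov / (a.var() + b.var() + (a.mean() - b.mean()) ** 2))
```

A constant offset between two monitors leaves ρ at 1 but pulls the CCC below 1, reflecting the loss of absolute agreement.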
Reduced basis ANOVA methods for partial differential equations with high-dimensional random inputs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liao, Qifeng, E-mail: liaoqf@shanghaitech.edu.cn; Lin, Guang, E-mail: guanglin@purdue.edu
2016-07-15
In this paper we present a reduced basis ANOVA approach for partial differential equations (PDEs) with random inputs. The ANOVA method combined with stochastic collocation methods provides model reduction in high-dimensional parameter space through decomposing high-dimensional inputs into unions of low-dimensional inputs. In this work, to further reduce the computational cost, we investigate spatial low-rank structures in the ANOVA-collocation method, and develop efficient spatial model reduction techniques using hierarchically generated reduced bases. We present a general mathematical framework of the methodology, validate its accuracy, and demonstrate its efficiency with numerical experiments.
Kim, Jimin P; Xie, Zhiwei; Creer, Michael; Liu, Zhiwen; Yang, Jian
2017-01-01
Chloride is an essential electrolyte that maintains homeostasis within the body, where abnormal chloride levels in biological fluids may indicate various diseases such as Cystic Fibrosis. However, current analytical solutions for chloride detection fail to meet the clinical needs of both high performance and low material or labor costs, hindering translation into clinical settings. Here we present a new class of fluorescence chloride sensors derived from a facile citrate-based synthesis platform that utilize dynamic quenching mechanisms. Based on this low-cost platform, we demonstrate for the first time a selective sensing strategy that uses a single fluorophore to detect multiple halides simultaneously, promising both selectivity and automation to improve performance and reduce labor costs. We also demonstrate the clinical utility of citrate-based sensors as a new sweat chloride test method for the diagnosis of Cystic Fibrosis by performing analytical validation with sweat controls and clinical validation with sweat from individuals with or without Cystic Fibrosis. Lastly, molecular modeling studies reveal the structural mechanism behind chloride sensing, serving to expand this class of fluorescence sensors with improved chloride sensitivities. Thus citrate-based fluorescent materials may enable low-cost, automated multi-analysis systems for simpler, yet accurate, point-of-care diagnostics that can be readily translated into clinical settings. More broadly, a wide range of medical, industrial, and environmental applications can be achieved with such a facile synthesis platform, demonstrated in our citrate-based biodegradable polymers with intrinsic fluorescence sensing.
Toward an affordable and user-friendly visual motion capture system.
Bonnet, V; Sylla, N; Cherubini, A; Gonzáles, A; Azevedo Coste, C; Fraisse, P; Venture, G
2014-01-01
The present study aims at designing and evaluating a low-cost, simple and portable system for arm joint angle estimation during grasping-like motions. The system is based on a single RGB-D camera and three customized markers. The automatically detected and tracked marker positions were used as inputs to an offline inverse kinematic process based on biomechanical constraints to reduce noise effects and handle marker occlusion. The method was validated on 4 subjects with different motions. The joint angles were estimated both with the proposed low-cost system and with a stereophotogrammetric system. Comparative analysis shows good accuracy, with a high correlation coefficient (r = 0.92) and a low average RMS error (3.8 deg).
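The two validation metrics reported above (the Pearson correlation coefficient and the RMS error between the low-cost and reference joint-angle traces) can be sketched as follows; the angle signals below are synthetic stand-ins, not the study's recordings.

```python
import numpy as np

# Synthetic elbow-angle traces: a smooth reference motion and a noisy
# low-cost estimate with roughly the error magnitude reported (3.8 deg).
t = np.linspace(0, 5, 500)                      # 5 s of motion at 100 Hz
ref = 40 + 30 * np.sin(2 * np.pi * 0.5 * t)     # reference angle (deg)
rng = np.random.default_rng(0)
est = ref + rng.normal(0, 3.8, t.size)          # low-cost estimate

r = np.corrcoef(ref, est)[0, 1]                 # Pearson correlation
rmse = np.sqrt(np.mean((ref - est) ** 2))       # RMS error in degrees
```

With a signal amplitude much larger than the noise, r stays above 0.9 even though the per-sample error is several degrees, which is why both metrics are worth reporting together.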
Patterson, P Daniel; Suyama, Joe; Reis, Steven E; Weaver, Matthew D; Hostler, David
2013-01-01
Cardiac arrest is a leading cause of mortality among firefighters. We sought to develop a valid method for determining the costs of a workplace prevention program for firefighters. In 2012, we developed a draft framework using human resource accounting and in-depth interviews with experts in the firefighting and insurance industries. The interviews produced a draft cost model with 6 components and 26 subcomponents. In 2013, we randomly sampled 100 fire chiefs out of >7,400 affiliated with the International Association of Fire Chiefs. We used the Content Validity Index (CVI) to identify the content valid components of the draft cost model. This was accomplished by having fire chiefs rate the relevancy of cost components using a 4-point Likert scale (highly relevant to not relevant). We received complete survey data from 65 fire chiefs (65% response rate). We retained 5 components and 21 subcomponents based on CVI scores ≥0.70. The five main components are (1) investment costs, (2) orientation and training costs, (3) medical and pharmaceutical costs, (4) education and continuing education costs, and (5) maintenance costs. Data from a diverse sample of fire chiefs have produced a content valid method for calculating the cost of a prevention program among firefighters.
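The CVI screening described above reduces to simple arithmetic: each rater scores a component on the 4-point relevance scale, the CVI is the proportion of raters giving a 3 or 4, and components with CVI ≥ 0.70 are retained. The component names and ratings below are invented for illustration, not the study's data.

```python
def cvi(ratings):
    """Proportion of raters scoring the item as relevant (3 or 4)."""
    return sum(1 for r in ratings if r >= 3) / len(ratings)

# Hypothetical ratings from ten raters on the 4-point scale.
components = {
    "investment costs":    [4, 4, 3, 4, 2, 3, 4, 3, 4, 4],  # CVI = 0.9
    "unrelated overhead":  [1, 2, 2, 1, 3, 2, 1, 2, 2, 1],  # CVI = 0.1
}
retained = [name for name, r in components.items() if cvi(r) >= 0.70]
```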
A novel method for producing low cost dynamometric wheels based on harmonic elimination techniques
NASA Astrophysics Data System (ADS)
Gutiérrez-López, María D.; García de Jalón, Javier; Cubillo, Adrián
2015-02-01
A method for producing low-cost dynamometric wheels is presented in this paper. To carry out this method, the metallic part of a commercial wheel is instrumented with strain gauges, which must be grouped in at least three circumferences and in equidistant radial lines. The strain signals of the same circumference are linearly combined to obtain at least two new signals that depend only on the tyre/road contact forces and moments. The influence of factors such as the angle rotated by the wheel, the temperature, or the centrifugal forces is eliminated from these signals by removing the continuous component and the largest possible number of harmonics of the strain signals, except the first or the second one. The contact forces and moments are obtained from these new signals by solving two systems of linear equations with three unknowns each. This method is validated with theoretical and experimental examples.
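The harmonic-elimination step can be illustrated with an FFT: a strain signal sampled over one wheel revolution is decomposed, the continuous (DC) component attributable to temperature and centrifugal effects is dropped along with the higher harmonics, and only the load-dependent first harmonic is kept. This is a simplified sketch of the idea, not the authors' implementation.

```python
import numpy as np

# One revolution of a strain signal, sampled at 1-degree steps: a
# load-dependent first harmonic plus a DC offset and a higher harmonic
# standing in for temperature/centrifugal disturbances.
theta = np.linspace(0, 2 * np.pi, 360, endpoint=False)
force_part = 5.0 * np.cos(theta)            # contact-force contribution
disturb = 2.0 + 0.8 * np.cos(3 * theta)     # offset + 3rd harmonic
strain = force_part + disturb

spec = np.fft.rfft(strain)
clean_spec = np.zeros_like(spec)
clean_spec[1] = spec[1]                      # keep only the 1st harmonic
clean = np.fft.irfft(clean_spec, n=theta.size)
```

After the filtering, `clean` matches the force-dependent component alone; in the paper, several such cleaned signals feed the two 3x3 linear systems that recover the contact forces and moments.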
A flood map based DOI decoding method for block detector: a GATE simulation study.
Shi, Han; Du, Dong; Su, Zhihong; Peng, Qiyu
2014-01-01
Positron Emission Tomography (PET) systems using detectors with Depth of Interaction (DOI) capabilities can achieve higher spatial resolution and better image quality than those without DOI. To date, most DOI methods developed are not cost-efficient for a whole-body PET system. In this paper, we present a DOI decoding method based on the flood map for a low-cost conventional block detector with four-PMT readout. Using this method, the DOI information can be directly extracted from the DOI-related crystal spot deformation in the flood map. GATE simulations are then carried out to validate the method, confirming a DOI sorting accuracy of 85.27%. Therefore, we conclude that this method has the potential to be applied in conventional detectors to achieve a reasonable DOI measurement without dramatically increasing the complexity and cost of an entire PET system.
Validation of an effective, low cost, Free/open access 3D-printed stethoscope
Pavlosky, Alexander; Glauche, Jennifer; Chambers, Spencer; Al-Alawi, Mahmoud; Yanev, Kliment
2018-01-01
The modern acoustic stethoscope is a useful clinical tool used to detect subtle, pathological changes in cardiac, pulmonary and vascular sounds. Currently, brand-name stethoscopes are expensive despite limited innovations in design or fabrication in recent decades. Consequently, the high cost of high quality, brand name models serves as a barrier to clinicians practicing in various settings, especially in low- and middle-income countries. In this publication, we describe the design and validation of a low-cost open-access (Free/Libre) 3D-printed stethoscope which is comparable to the Littmann Cardiology III for use in low-access clinics. PMID:29538426
Jimenez, Krystal; Vargas, Cristina; Garcia, Karla; Guzman, Herlinda; Angulo, Marco; Billimek, John
2017-02-01
Purpose The purpose of this study was to examine the reliability and validity of a Spanish version of the Beliefs about Medicines Questionnaire (BMQ) as a measure to evaluate beliefs about medications and to differentiate adherent from nonadherent patients among low-income Latino patients with diabetes in the United States. Methods Seventy-three patients were administered the BMQ and surveyed for evidence of medication nonadherence. Internal consistency of the BMQ was assessed by Cronbach's alpha along with performing a confirmatory factor analysis. Criterion validity was assessed by comparing mean scores on 3 subscales of the BMQ (General Overuse, General Harm, and Specific Necessity-Concerns difference score) between adherent patients and patients reporting nonadherence for 3 different reasons (unintentional nonadherence, cost-related nonadherence, and nonadherence due to reasons other than cost) using independent samples t tests. Results The BMQ is a reliable instrument to examine beliefs about medications in this Spanish-speaking population. Construct validity testing shows nearly identical factor loading as the original construct map. General Overuse scores were significantly more negative for patients reporting each reason for nonadherence compared with their adherent counterparts. Necessity-Concerns difference scores were significantly more negative for patients reporting nonadherence for reasons other than cost compared with those who did not report this reason for nonadherence. Conclusion The Spanish version of the BMQ is appropriate to assess beliefs about medications in Latino patients with type 2 diabetes in the United States and may help identify patients who become nonadherent to medications for reasons other than out-of-pocket costs.
Balsalobre-Fernández, Carlos; Tejero-González, Carlos M; del Campo-Vecino, Juan; Bavaresco, Nicolás
2014-02-01
Flight time is the most accurate and frequently used variable when assessing the height of vertical jumps. The purpose of this study was to analyze the validity and reliability of an alternative method (i.e., the HSC-Kinovea method) for measuring the flight time and height of vertical jumping using a low-cost high-speed Casio Exilim FH-25 camera (HSC). To this end, 25 subjects performed a total of 125 vertical jumps on an infrared (IR) platform while simultaneously being recorded with a HSC at 240 fps. Subsequently, 2 observers with no experience in video analysis analyzed the 125 videos independently using the open-license Kinovea 0.8.15 software. The flight times obtained were then converted into vertical jump heights, and the intraclass correlation coefficient (ICC), Bland-Altman plot, and Pearson correlation coefficient were calculated for those variables. The results showed a perfect correlation agreement (ICC = 1, p < 0.0001) between both observers' measurements of flight time and jump height and a highly reliable agreement (ICC = 0.997, p < 0.0001) between the observers' measurements of flight time and jump height using the HSC-Kinovea method and those obtained using the IR system, thus explaining 99.5% (p < 0.0001) of the differences (shared variance) obtained using the IR platform. As a result, besides requiring no previous experience in the use of this technology, the HSC-Kinovea method can be considered to provide similarly valid and reliable measurements of flight time and vertical jump height as more expensive equipment (i.e., IR). As such, coaches from many sports could use the HSC-Kinovea method to measure the flight time and height of their athlete's vertical jumps.
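The conversion behind the method above follows from projectile motion: with total flight time t, the rise (or fall) lasts t/2, so jump height is h = g t²/8, and frame counting at 240 fps resolves t to about 4.2 ms. A minimal sketch, with an illustrative frame count:

```python
G = 9.81  # gravitational acceleration, m/s^2

def flight_time_from_frames(n_frames, fps=240):
    """Flight time (s) from the number of airborne video frames."""
    return n_frames / fps

def jump_height(flight_time_s):
    """Jump height (m) from total flight time: h = g * t^2 / 8."""
    return G * flight_time_s ** 2 / 8.0

# 120 airborne frames at 240 fps -> 0.5 s flight -> ~0.31 m jump
h = jump_height(flight_time_from_frames(120))
```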
NASA Technical Reports Server (NTRS)
Mcpherson, J.
1991-01-01
The topics presented are covered in viewgraph form. The objectives are to develop and validate technology, design tools and methodologies to enable the low cost commercial development and operational uses of hydrogen and hydrocarbon fueled liquid engines, low pressure booster engines and hybrid engines.
Potential of balloon payloads for in flight validation of direct and nulling interferometry concepts
NASA Astrophysics Data System (ADS)
Demangeon, Olivier; Ollivier, Marc; Le Duigou, Jean-Michel; Cassaing, Frédéric; Coudé du Foresto, Vincent; Mourard, Denis; Kern, Pierre; Lam Trong, Tien; Evrard, Jean; Absil, Olivier; Defrere, Denis; Lopez, Bruno
2010-07-01
While the question of low-cost / reduced-science precursors is raised to validate the concepts of direct and nulling interferometry space missions, balloon payloads offer a real opportunity thanks to their relatively low cost and reduced development plan. Taking into account the flight capabilities of various balloon types, we propose in this paper several payload concepts together with their flight plans. We also discuss the pros and cons of each concept in terms of technological and science demonstration power.
NASA Astrophysics Data System (ADS)
na ayudhaya, Paisarn Daungjak; Klinbumrung, Arrak; Jaroensutasinee, Krisanadej; Pratontep, Sirapat; Kerdcharoen, Teerakiat
2009-05-01
Natural and aromatic plant species originating from northern Thailand have sensory characteristics, especially odours, that serve as unique identifiers of herbs. Instrumental sensory analysis performed with an array of differential gas sensors, the so-called `electronic nose', provides a significant and rapid tool for chemometrics. The signal responses of the low-cost electronic nose were evaluated by principal component analysis (PCA). The aim of this paper was to evaluate various Thai herbs grown in northern Thailand using the data preprocessing tools of the low-cost electronic nose (enNU-PYO1). Essential-oil-rich Thai herbs such as garlic, lemongrass, shallot (potato onion), onion, Zanthoxylum limonella (Dennst.) Alston (Thai name: Makaen), and kaffir lime leaf were compared through the volatiles of selected fresh herbs. Principal component analysis of the original sensor responses clearly distinguished all samples. In all cases, more than 97% of cross-validated groups were classified correctly. The results demonstrate that it is possible to construct a low-cost electronic nose model for the measurement of odoriferous herbs.
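The PCA step described above can be sketched in a few lines: each row is an odour sample, each column a sensor response; the scores along the first two principal components are what separate the herb classes. The six-sensor responses below are synthetic illustrations, not enNU-PYO1 data.

```python
import numpy as np

# Two synthetic herb classes with distinct mean sensor-response patterns.
rng = np.random.default_rng(1)
garlic = rng.normal([5, 1, 3, 0, 2, 1], 0.1, size=(10, 6))
lemongrass = rng.normal([0, 4, 1, 5, 0, 2], 0.1, size=(10, 6))
X = np.vstack([garlic, lemongrass])

Xc = X - X.mean(axis=0)                  # mean-centre each sensor channel
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                   # projections onto PC1 and PC2
explained = s[:2] ** 2 / np.sum(s ** 2)  # variance fraction per component
```

With well-separated classes, PC1 captures almost all of the variance and the two herbs land on opposite sides of the PC1 axis, which is the visual separation a study like this reports.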
Vyas, A; Goel, G
2017-09-01
Minimally invasive surgery training requires extensive practice, and innovative tools are needed to develop methods for practising and training skills outside the operating room. Commercial devices are readily available, but cost-effectiveness and availability are major limiting factors in resource-limited settings. We present an innovative and cost-effective laparoscopic simulator that can be easily manufactured and used for laparoscopic surgery practice. Using a free Android application such as IP Webcam, video can be relayed to a laptop without the use of any cables; uniquely, the flash of a camera serves as the light source and a selfie stick allows movement of the camera. This type of setup can help reduce the cost of simulated learning in low-income countries and makes laparoscopic training facilities readily available.
Kupssinskü, Lucas S.; T. Guimarães, Tainá; Koste, Emilie C.; da Silva, Juarez M.; de Souza, Laís V.; Oliverio, William F. M.; Jardim, Rogélio S.; Koch, Ismael É.; de Souza, Jonas G.; Mauad, Frederico F.
2018-01-01
Water quality monitoring through remote sensing with UAVs is best conducted using multispectral sensors; however, these sensors are expensive. We aimed to predict multispectral bands from a low-cost sensor (R, G, B bands) using artificial neural networks (ANN). We studied a lake located on the campus of Unisinos University, Brazil, using a low-cost sensor mounted on a UAV. Simultaneously, we collected water samples during the UAV flight to determine total suspended solids (TSS) and dissolved organic matter (DOM). We correlated the three predicted bands with TSS and DOM. The results show that, in validation, the ANN predicted the three bands of the multispectral sensor from the three bands of the low-cost sensor with a low average error of 19%. The correlations with TSS and DOM resulted in R2 values of greater than 0.60, consistent with literature values. PMID:29315219
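The band-prediction idea can be sketched with a small one-hidden-layer regression network written from scratch: RGB readings in, three band values out. The "multispectral" targets below are a synthetic nonlinear function of RGB standing in for real paired sensor data, and the network size and learning rate are illustrative choices, not the study's.

```python
import numpy as np

rng = np.random.default_rng(2)
rgb = rng.uniform(0, 1, size=(500, 3))            # low-cost sensor bands
target = np.stack([rgb[:, 0] * rgb[:, 1],          # stand-in band 1
                   np.sqrt(rgb[:, 2]),             # stand-in band 2
                   0.5 * rgb.sum(axis=1)], axis=1) # stand-in band 3

# One hidden layer of 16 tanh units, trained by full-batch gradient descent.
W1 = rng.normal(0, 0.5, (3, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 3)); b2 = np.zeros(3)
lr, n = 0.2, len(rgb)
for _ in range(8000):
    h = np.tanh(rgb @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - target                            # dLoss/dpred (up to 1/n)
    gW2 = h.T @ err / n; gb2 = err.mean(axis=0)
    gh = (err @ W2.T) * (1 - h ** 2)               # backprop through tanh
    gW1 = rgb.T @ gh / n; gb1 = gh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mean_abs_err = np.abs(pred - target).mean()
```

In practice a framework (or scikit-learn's MLPRegressor) would replace this hand-rolled loop; the sketch just makes the forward and backward passes explicit.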
Davison, Kirsten K.; Austin, S. Bryn; Giles, Catherine; Cradock, Angie L.; Lee, Rebekka M.; Gortmaker, Steven L.
2017-01-01
Interest in evaluating and improving children’s diets in afterschool settings has grown, necessitating the development of feasible yet valid measures for capturing children’s intake in such settings. This study’s purpose was to test the criterion validity and cost of three unobtrusive visual estimation methods compared to a plate-weighing method: direct on-site observation using a 4-category rating scale and off-site rating of digital photographs taken on-site using 4- and 10-category scales. Participants were 111 children in grades 1–6 attending four afterschool programs in Boston, MA in December 2011. Researchers observed and photographed 174 total snack meals consumed across two days at each program. Visual estimates of consumption were compared to weighed estimates (the criterion measure) using intra-class correlations. All three methods were highly correlated with the criterion measure, ranging from 0.92–0.94 for total calories consumed, 0.86–0.94 for consumption of pre-packaged beverages, 0.90–0.93 for consumption of fruits/vegetables, and 0.92–0.96 for consumption of grains. For water, which was not pre-portioned, coefficients ranged from 0.47–0.52. The photographic methods also demonstrated excellent inter-rater reliability: 0.84–0.92 for the 4-point and 0.92–0.95 for the 10-point scale. The costs of the methods for estimating intake ranged from $0.62 per observation for the on-site direct visual method to $0.95 per observation for the criterion measure. This study demonstrates that feasible, inexpensive methods can validly and reliably measure children’s dietary intake in afterschool settings. Improving precision in measures of children’s dietary intake can reduce the likelihood of spurious or null findings in future studies. PMID:25596895
Cost model validation: a technical and cultural approach
NASA Technical Reports Server (NTRS)
Hihn, J.; Rosenberg, L.; Roust, K.; Warfield, K.
2001-01-01
This paper summarizes how JPL's parametric mission cost model (PMCM) has been validated using both formal statistical methods and a variety of peer and management reviews in order to establish organizational acceptance of the cost model estimates.
Bitella, Giovanni; Rossi, Roberta; Bochicchio, Rocco; Perniola, Michele; Amato, Mariana
2014-01-01
Monitoring soil water content at high spatio-temporal resolution and coupled to other sensor data is crucial for applications oriented towards water sustainability in agriculture, such as precision irrigation or phenotyping root traits for drought tolerance. The cost of instrumentation, however, limits measurement frequency and number of sensors. The objective of this work was to design a low cost “open hardware” platform for multi-sensor measurements including water content at different depths, air and soil temperatures. The system is based on an open-source ARDUINO microcontroller-board, programmed in a simple integrated development environment (IDE). Low cost high-frequency dielectric probes were used in the platform and lab tested on three non-saline soils (EC 1:2.5 < 0.1 mS/cm). Empirical calibration curves were subjected to cross-validation (leave-one-out method), and normalized root mean square errors (NRMSE) were respectively 0.09 for the overall model, 0.09 for the sandy soil, 0.07 for the clay loam and 0.08 for the sandy loam. The overall model (pooled soil data) fitted the data very well (R2 = 0.89) showing a high stability, being able to generate very similar RMSEs during training and validation (RMSEtraining = 2.63; RMSEvalidation = 2.61). Data recorded on the card were automatically sent to a remote server allowing repeated field-data quality checks. This work provides a framework for the replication and upgrading of a customized low cost platform, consistent with the open source approach whereby sharing information on equipment design and software facilitates the adoption and continuous improvement of existing technologies. PMID:25337742
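The leave-one-out cross-validation with NRMSE used above can be sketched as follows: each calibration point is held out in turn, a curve is fitted to the rest, and the held-out prediction errors accumulate into an RMSE that is normalised by the range of the reference values. The probe readings and linear calibration below are synthetic, not the study's data.

```python
import numpy as np

# Synthetic calibration set: raw dielectric readings vs. reference
# volumetric water content (%), with measurement noise.
raw = np.linspace(200, 800, 15)
theta = 0.05 * raw - 5 + np.random.default_rng(3).normal(0, 1.0, raw.size)

errors = []
for i in range(raw.size):
    mask = np.arange(raw.size) != i               # leave sample i out
    coef = np.polyfit(raw[mask], theta[mask], deg=1)
    errors.append(np.polyval(coef, raw[i]) - theta[i])

rmse = np.sqrt(np.mean(np.square(errors)))
nrmse = rmse / (theta.max() - theta.min())        # range-normalised RMSE
```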
Getting a Valid Survey Response From 662 Plastic Surgeons in the 21st Century.
Reinisch, John F; Yu, Daniel C; Li, Wai-Yee
2016-01-01
Web-based surveys save time and money. As electronic questionnaires have increased in popularity, telephone and mailed surveys have declined. With any survey, a response rate of 75% or greater is critical for validity. We wanted to determine which survey method achieved the highest response among academic plastic surgeons. All American Association of Plastic Surgeons members were surveyed regarding authorship issues. They were randomly assigned to receive the questionnaire through 1 of 4 methods: (A) email with a link to an online survey; (B) regular mail; (C) regular mail + $1 bill; and (D) regular mail + $5 bill. Two weeks after the initial mailing, the number of responses was collected, and nonresponders were contacted to remind them to participate. The study was closed after 10 weeks. Survey costs were calculated based on the actual cost of sending the initial survey, including stationery, printing, postage (groups B-D), labor, and cost of any financial incentives. The cost of reminders to nonresponders was calculated at $5 per reminder, giving a total survey cost. Of 662 surveys sent, 54 were returned because of incorrect address/email, retirement, or death. Four hundred seventeen of the remaining 608 surveys were returned and analyzed. The response rate was lowest in the online group and highest in those mailed with a monetary incentive. Despite the convenience and low initial cost of web-based surveys, this method generated the lowest response. We obtained statistically significant response rates (79% and 84%) only by using postal mail with monetary incentives and reminders. The inclusion of a $1 bill represented the greatest value and the most cost-effective survey method, based on cost per response.
A scalable correlator for multichannel diffuse correlation spectroscopy.
Stapels, Christopher J; Kolodziejski, Noah J; McAdams, Daniel; Podolsky, Matthew J; Fernandez, Daniel E; Farkas, Dana; Christian, James F
2016-02-01
Diffuse correlation spectroscopy (DCS) is a technique which enables powerful and robust non-invasive optical studies of tissue micro-circulation and vascular blood flow. The technique amounts to autocorrelation analysis of coherent photons after their migration through moving scatterers and subsequent collection by single-mode optical fibers. A primary cost driver of DCS instruments is the commercial hardware-based correlator, limiting the proliferation of multi-channel instruments for validation of perfusion analysis as a clinical diagnostic metric. We present the development of a low-cost scalable correlator enabled by microchip-based time-tagging, and a software-based multi-tau data analysis method. We discuss the capabilities of the instrument as well as the implementation and validation of 2- and 8-channel systems built for live animal and pre-clinical settings.
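The core of any software correlator is the normalised intensity autocorrelation g2(tau) = <I(t)·I(t+tau)> / <I>². The sketch below computes it at linear lags from a synthetic correlated intensity trace; a real multi-tau scheme additionally rebins the trace so that decades of lag times are covered cheaply, and that bookkeeping is omitted here.

```python
import numpy as np

# Synthetic intensity trace: smoothed noise around a mean level, mimicking
# speckle fluctuations with a correlation time of ~10 samples.
rng = np.random.default_rng(4)
raw = rng.normal(0, 1, 20000)
kernel = np.exp(-np.arange(50) / 10.0)
intensity = 10 + np.convolve(raw, kernel, mode="same")

def g2(trace, max_lag):
    """Normalised autocorrelation <I(t) I(t+tau)> / <I>^2 at linear lags."""
    mean_sq = trace.mean() ** 2
    return np.array([np.mean(trace[:-lag or None] * trace[lag:]) / mean_sq
                     for lag in range(max_lag)])

curve = g2(intensity, 200)   # decays from >1 toward 1 as correlations die off
```

In DCS the decay rate of this curve encodes scatterer motion, i.e. blood flow; the hardware correlators this paper replaces compute the same quantity in real time.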
Anderson, J.R.; Ackerman, J.J.H.; Garbow, J.R.
2015-01-01
Two semipermeable, hollow fiber phantoms for the validation of perfusion-sensitive magnetic resonance methods and signal models are described. Semipermeable hollow fibers harvested from a standard commercial hemodialysis cartridge serve to mimic tissue capillary function. Flow of aqueous media through the fiber lumen is achieved with a laboratory-grade peristaltic pump. Diffusion of water and solute species (e.g., Gd-based contrast agent) occurs across the fiber wall, allowing exchange between the lumen and the extralumenal space. Phantom design attributes include: i) small physical size, ii) easy and low-cost construction, iii) definable compartment volumes, and iv) experimental control over media content and flow rate. PMID:26167136
Quantitative analysis of Sudan dye adulteration in paprika powder using FTIR spectroscopy.
Lohumi, Santosh; Joshi, Ritu; Kandpal, Lalit Mohan; Lee, Hoonsoo; Kim, Moon S; Cho, Hyunjeong; Mo, Changyeun; Seo, Young-Wook; Rahman, Anisur; Cho, Byoung-Kwan
2017-05-01
As adulteration of foodstuffs with Sudan dye, especially paprika- and chilli-containing products, has been reported with some frequency, this issue has become one focal point for addressing food safety. FTIR spectroscopy has been used extensively as an analytical method for quality control and safety determination for food products. Thus, the use of FTIR spectroscopy for rapid determination of Sudan dye in paprika powder was investigated in this study. A net analyte signal (NAS)-based methodology, named HLA/GO (hybrid linear analysis in the literature), was applied to FTIR spectral data to predict Sudan dye concentration. The calibration and validation sets were designed to evaluate the performance of the multivariate method. The obtained results had a high determination coefficient (R2) of 0.98 and a low root mean square error (RMSE) of 0.026% for the calibration set, and an R2 of 0.97 and RMSE of 0.05% for the validation set. The model was further validated using a second validation set and through the figures of merit, such as sensitivity, selectivity, and limits of detection and quantification. The proposed technique of FTIR combined with HLA/GO is rapid, simple and low cost, making this approach advantageous when compared with the main alternative methods based on liquid chromatography (LC) techniques.
Carmo, Ana Paula Barbosa do; Borborema, Manoella; Ribeiro, Stephan; De-Oliveira, Ana Cecilia Xavier; Paumgartten, Francisco Jose Roma; Moreira, Davyson de Lima
2017-01-01
Primaquine (PQ) diphosphate is an 8-aminoquinoline antimalarial drug with unique therapeutic properties. It is the only drug that prevents relapses of Plasmodium vivax or Plasmodium ovale infections. In this study, a fast, sensitive, cost-effective, and robust method for the extraction and high-performance liquid chromatography with diode array ultraviolet detection (HPLC-DAD-UV) analysis of PQ in blood plasma was developed and validated. After plasma protein precipitation, PQ was obtained by liquid-liquid extraction and analyzed by HPLC-DAD-UV with a modified-silica cyanopropyl column (250 mm × 4.6 mm i.d. × 5 μm) as the stationary phase and a mixture of acetonitrile and 10 mM ammonium acetate buffer (pH 3.80) (45:55) as the mobile phase. The flow rate was 1.0 mL·min−1, the oven temperature was 50 °C, and absorbance was measured at 264 nm. The method was validated for linearity, intra-day and inter-day precision, accuracy, recovery, and robustness. The detection (LOD) and quantification (LOQ) limits were 1.0 and 3.5 ng·mL−1, respectively. The method was used to analyze the plasma of female DBA-2 mice treated with 20 mg·kg−1 (oral) PQ diphosphate. By combining a simple, low-cost extraction procedure with a sensitive, precise, accurate, and robust method, it was possible to analyze PQ in small volumes of plasma. The new method presents lower LOD and LOQ limits and requires a shorter analysis time and smaller plasma volumes than those of previously reported HPLC methods with DAD-UV detection. The new validated method is suitable for kinetic studies of PQ in small rodents, including mouse models for the study of malaria.
Fischedick, Justin T; Glas, Ronald; Hazekamp, Arno; Verpoorte, Rob
2009-01-01
Cannabis and cannabinoid based medicines are currently under serious investigation for legitimate development as medicinal agents, necessitating new low-cost, high-throughput analytical methods for quality control. The goal of this study was to develop and validate, according to ICH guidelines, a simple rapid HPTLC method for the quantification of Delta(9)-tetrahydrocannabinol (Delta(9)-THC) and qualitative analysis of other main neutral cannabinoids found in cannabis. The method was developed and validated with the use of pure cannabinoid reference standards and two medicinal cannabis cultivars. Accuracy was determined by comparing results obtained from the HPTLC method with those obtained from a validated HPLC method. Delta(9)-THC gives linear calibration curves in the range of 50-500 ng at 206 nm with a linear regression of y = 11.858x + 125.99 and r(2) = 0.9968. Results have shown that the HPTLC method is reproducible and accurate for the quantification of Delta(9)-THC in cannabis. The method is also useful for the qualitative screening of the main neutral cannabinoids found in cannabis cultivars.
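Quantification with a calibration line like the one reported (y = 11.858x + 125.99, x in ng of Delta(9)-THC on the plate, y the densitometric response at 206 nm) amounts to inverting the line for an unknown spot. The response value below is hypothetical, chosen only to land inside the validated 50-500 ng range.

```python
SLOPE, INTERCEPT = 11.858, 125.99   # reported calibration line

def thc_ng(response):
    """Nanograms of Delta(9)-THC on the plate from the measured response."""
    return (response - INTERCEPT) / SLOPE

amount = thc_ng(3000.0)             # hypothetical densitometric response
in_range = 50 <= amount <= 500      # quantification only valid in-range
```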
Toward cost-efficient sampling methods
NASA Astrophysics Data System (ADS)
Luo, Peng; Li, Yongli; Wu, Chong; Zhang, Guijie
2015-09-01
The sampling method has received much attention in the field of complex networks in general and statistical physics in particular. This paper proposes two new sampling methods based on the idea that a small set of vertices with high node degree can carry most of the structural information of a complex network. The two proposed sampling methods are efficient at sampling high-degree nodes, so they remain useful even when the sampling rate is low, which makes them cost-efficient. The first new sampling method is developed on the basis of the widely used stratified random sampling (SRS) method, and the second improves the well-known snowball sampling (SBS) method. To demonstrate the validity and accuracy of the two new sampling methods, we compare them with existing sampling methods in three commonly used simulation networks, namely a scale-free network, a random network, and a small-world network, and also in two real networks. The experimental results illustrate that the two proposed sampling methods perform much better than the existing sampling methods in terms of recovering the true network structure characteristics reflected by clustering coefficient, Bonacich centrality, and average path length, especially when the sampling rate is low.
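The premise above (that favouring high-degree nodes recovers a network's structural core even at low sampling rates) can be illustrated on a toy hub-and-spoke graph with a degree-prioritised snowball expansion. This is an illustration of the idea, not the authors' exact SRS or SBS variants.

```python
import random

random.seed(5)
# Build a graph: 10 mutually connected hubs plus 990 peripheral nodes,
# each peripheral node attached to one random hub.
adj = {v: set() for v in range(1000)}
for h in range(10):
    adj[h].update(other for other in range(10) if other != h)
for v in range(10, 1000):
    hub = random.randrange(10)
    adj[v].add(hub); adj[hub].add(v)

def snowball(seed, size):
    """Snowball sample that always expands the highest-degree frontier node."""
    sample, frontier = set(), [seed]
    while frontier and len(sample) < size:
        frontier.sort(key=lambda v: len(adj[v]), reverse=True)
        v = frontier.pop(0)
        if v in sample:
            continue
        sample.add(v)
        frontier.extend(adj[v] - sample)
    return sample

sample = snowball(0, 50)                              # 5% sampling rate
hubs_found = sum(1 for v in sample if len(adj[v]) >= 9)  # hubs recovered
```

Even at a 5% sampling rate, the degree-prioritised expansion captures every hub, whereas a uniform 5% sample would miss most of them, which is the cost-efficiency argument the paper makes.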
Lamorde, Mohammed; Fillekes, Quirine; Sigaloff, Kim; Kityo, Cissy; Buzibye, Allan; Kayiwa, Joshua; Merry, Concepta; Nakatudde-Katumba, Lillian; Burger, David; de Wit, Tobias F Rinke
2014-09-01
In resource-limited settings, access to laboratory monitoring of HIV treatment is limited and therapeutic drug monitoring is generally unavailable. This study aimed to evaluate nevirapine concentrations in saliva using low-cost thin-layer chromatography (TLC) and nevirapine concentrations in plasma and saliva using high-performance liquid chromatography (HPLC) methods, and to correlate nevirapine plasma concentrations with HIV treatment outcomes in Ugandan patients. Paired plasma and stimulated saliva samples were obtained from Ugandan, HIV-infected adults on nevirapine-based ART. Nevirapine concentrations were measured using a validated HPLC method and a novel TLC method. Plasma nevirapine concentrations <3.0 mg/L using HPLC were considered subtherapeutic. Negative/positive predictive values of different thresholds for subtherapeutic nevirapine concentrations in saliva were determined. Virologic testing and, if applicable, HIV drug resistance testing was performed. The median (interquartile range, IQR) age of the 297 patients was 39.1 (32.8-45.2) years. Three hundred saliva and 287 plasma samples were available for analysis. Attempts to determine nevirapine saliva concentrations by TLC failed. Using HPLC, median (IQR) nevirapine concentrations in saliva and plasma were 3.40 (2.59-4.47) mg/L and 6.17 (4.79-7.96) mg/L, respectively. The mean (coefficient of variation, %) nevirapine saliva/plasma ratio was 0.58 (62%). A cut-off value of 1.60 mg/L nevirapine in saliva was associated with a negative/positive predictive value of 0.99/0.72 and a sensitivity/specificity of 87%/98% for predicting subtherapeutic nevirapine plasma concentrations, respectively. Only 5% (15/287) of patients had subtherapeutic nevirapine plasma concentrations, of which 3 patients had viral load results >400 copies/mL. Patients with nevirapine concentrations in plasma <3.0 mg/L had an odds ratio of 3.29 (95% CI: 1.00-10.74) for virological failure (viral load >400 copies/mL).
The low-cost TLC technique for monitoring nevirapine in saliva was unsuccessful, but monitoring nevirapine saliva and plasma concentrations using HPLC was shown to be feasible in the research/specialist context in Uganda. Further optimization and validation are required for the low-cost TLC technique.
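The predictive values reported above follow from a standard 2×2 confusion matrix. A minimal sketch on invented saliva/plasma pairs (the 1.60 mg/L and 3.0 mg/L cut-offs come from the abstract; everything else is hypothetical):

```python
# Classify saliva nevirapine concentrations against a 1.60 mg/L cut-off and
# compare with "true" subtherapeutic status defined by plasma < 3.0 mg/L.
# All sample values below are invented for demonstration.

def threshold_metrics(saliva, plasma, saliva_cutoff=1.60, plasma_cutoff=3.0):
    """Return (sensitivity, specificity, ppv, npv) for predicting
    subtherapeutic plasma levels from a saliva cut-off."""
    tp = fp = tn = fn = 0
    for s, p in zip(saliva, plasma):
        predicted_sub = s < saliva_cutoff   # screening test positive
        actual_sub = p < plasma_cutoff      # condition actually present
        if predicted_sub and actual_sub:
            tp += 1
        elif predicted_sub and not actual_sub:
            fp += 1
        elif not predicted_sub and actual_sub:
            fn += 1
        else:
            tn += 1
    sens = tp / (tp + fn) if tp + fn else float("nan")
    spec = tn / (tn + fp) if tn + fp else float("nan")
    ppv = tp / (tp + fp) if tp + fp else float("nan")
    npv = tn / (tn + fn) if tn + fn else float("nan")
    return sens, spec, ppv, npv

saliva = [1.2, 1.4, 3.1, 3.4, 4.0, 1.8, 2.6]
plasma = [2.1, 2.8, 6.0, 6.2, 7.1, 5.0, 5.5]
print(threshold_metrics(saliva, plasma))  # all 1.0 for this tiny invented set
```

With real, noisier data the four metrics diverge, as in the study's reported NPV of 0.99 and PPV of 0.72.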
Jackson-Morris, Angela; Bleymann, Kayleigh; Lyall, Elaine; Aslam, Fouad; Bam, Tara Singh; Chowdhury, Ishrat; Daouda, Elhadj Adam; Espinosa, Mariana; Romo, Jonathan; Singh, Rana J; Semple, Sean
2016-05-01
Many low- and middle-income countries (LMICs) have enacted legislation banning smoking in public places, yet enforcement remains challenging. The aim of this study was to assess the feasibility of using a validated low-cost methodology (the Dylos DC1700) to provide objective evidence of smoke-free (SF) law compliance in hospitality venues in urban LMIC settings, where outdoor air pollution levels are generally high. Teams measured indoor fine particulate matter (PM2.5) concentrations and systematically observed smoking behavior and SF signage in a convenience sample of hospitality venues (bars, restaurants, cafes, and hotels) covered by existing SF legislation in Mexico, Pakistan, Indonesia, Chad, Bangladesh, and India. Outdoor air PM2.5 was also measured on each sampling day. Data were collected from 626 venues. Smoking was observed during almost one-third of visits, with substantial differences between countries, from 5% in India to 72% in Chad. After excluding venues where other combustion sources were observed, secondhand smoke (SHS) derived PM2.5 was calculated by subtracting outdoor ambient PM2.5 concentrations from indoor measurements and was, on average, 34 µg/m³ in venues with observed smoking, compared to an average value of 0 µg/m³ in venues where smoking was not observed (P < .001). In over one-quarter of venues where smoking was observed the difference between indoor and outdoor PM2.5 concentrations exceeded 64 µg/m³. This study suggests that low-cost air quality monitoring is a viable method for improving knowledge about environmental SHS and can provide indicative data on compliance with local and national SF legislation in hospitality venues in LMICs. Air quality monitoring can provide objective scientific data on SHS and air quality levels in venues to assess the effectiveness of SF laws and identify required improvements. Equipment costs and high outdoor air pollution levels have hitherto limited application in LMICs.
This study tested the feasibility of using a validated low-cost methodology in hospitality venues in six LMIC urban settings and suggests this is a viable method for improving knowledge about SHS exposure and can provide indicative data on compliance with SF legislation.
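The SHS estimate described above reduces to a subtraction of outdoor from indoor PM2.5. A sketch with invented readings; flooring at zero is an added assumption for venues whose indoor level falls below ambient:

```python
# Secondhand-smoke PM2.5 estimated as indoor minus outdoor ambient PM2.5,
# floored at zero. Readings below are invented (units: µg/m³).
def shs_pm25(indoor, outdoor):
    return max(indoor - outdoor, 0.0)

print(shs_pm25(98.0, 64.0))  # venue with observed smoking -> 34.0
print(shs_pm25(55.0, 64.0))  # indoor below ambient -> 0.0
```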
A low-cost contact system to assess load displacement velocity in a resistance training machine.
Buscà, Bernat; Font, Anna
2011-01-01
This study sought to determine the validity of a new system for assessing displacement and average velocity in machine-based resistance training exercises using the Chronojump System. The new design is based on a contact bar and a simple, low-cost mechanism that detects the conductivity of electrical potentials with a precision chronograph. This system allows coaches to assess velocity to control the strength training process. A validation study was performed by assessing the concentric phase parameters of a leg press exercise. Output time data from the Chronojump System in combination with the pre-established range of movement were compared with data from a position sensor connected to a Biopac System. A subset of 87 actions from 11 professional tennis players was recorded and, using the two methods, average velocity and displacement variables in the same action were compared. A t-test for dependent samples and a correlation analysis were undertaken. The r value derived from the correlation between the Biopac System and the contact Chronojump System was >0.94 for all measures of displacement and velocity on all loads (p < 0.01). The effect size (ES) was 0.18 in displacement and 0.14 in velocity and ranged from 0.09 to 0.31 and from 0.07 to 0.34, respectively. The magnitude of the difference between the two methods in all parameters and the correlation values provided evidence of the validity of the Chronojump System for assessing the average displacement velocity of loads in a resistance training machine.
Key points: The assessment of speed in resistance machines is a valuable source of information for strength training. Many commercial systems used to assess velocity, power and force are expensive, thereby preventing widespread use by coaches and athletes. The system is intended to be a low-cost device for assessing and controlling the velocity exerted on each repetition in any resistance training machine. The system could be easily adapted to any vertical displacement barbell exercise.
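The underlying computation is simple: given a pre-established range of movement and the chronograph's measured concentric-phase time, the average velocity is displacement over time. A minimal sketch with invented values:

```python
# Average bar velocity from a contact-timing system: the range of movement
# (displacement) is known in advance; the chronograph supplies the time.
# Values below are invented.
def average_velocity(displacement_m, time_s):
    if time_s <= 0:
        raise ValueError("time must be positive")
    return displacement_m / time_s

print(average_velocity(0.40, 0.80))  # 0.40 m over 0.80 s -> 0.5 m/s
```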
NASA Astrophysics Data System (ADS)
Gutzweiler, Ludwig; Stumpf, Fabian; Tanguy, Laurent; Roth, Guenter; Koltay, Peter; Zengerle, Roland; Riegger, Lutz
2016-04-01
Microfluidic systems fabricated in polydimethylsiloxane (PDMS) enable a broad variety of applications and are widespread in the field of Lab-on-a-Chip. Here we demonstrate semi-contact writing, a novel method for fabricating polymer-based molds for casting microfluidic PDMS chips in a highly flexible, time- and cost-efficient manner. The method is based on direct writing of an aqueous polymer solution on a planar glass substrate and substitutes conventional, time- and cost-consuming UV-lithography. This technique facilitates on-demand prototyping at low cost and is therefore ideally suited for rapid chip layout iterations. No cleanroom facilities are required, and less expertise is needed. Fabrication time from scratch to ready-to-use PDMS chip is less than 5 h. This polymer writing method enables structure widths down to 140 μm and controllable structure heights ranging from 5.5 μm for writing single layers up to 98 μm by stacking. As a unique property, freely selectable height variations across a substrate can be achieved by application of local stacking. Furthermore, the molds exhibit low surface roughness (Ra = 24 nm, RRMS = 28 nm) and high-fidelity edge sharpness. We validated the method by fabricating molds to cast PDMS chips for droplet-based flow-through PCR with single-cell sensitivity.
USDA-ARS's Scientific Manuscript database
Soybean and sunflower are summer annuals that can be grown as an alternative to corn and may be particularly useful in organic production systems for forage in addition to their traditional use as protein and/or oil yielding crops. Rapid and low cost methods of analyzing plant quality would be helpf...
NASA Astrophysics Data System (ADS)
San-Blas, A. A.; Roca, J. M.; Cogollos, S.; Morro, J. V.; Boria, V. E.; Gimeno, B.
2016-06-01
In this work, a full-wave tool for the accurate analysis and design of compensated E-plane multiport junctions is proposed. The implemented tool is capable of evaluating the undesired effects related to the use of low-cost manufacturing techniques, which are mostly due to the introduction of rounded corners in the cross section of the rectangular waveguides of the device. The obtained results show that, although stringent mechanical effects are imposed, it is possible to compensate for the impact of the cited low-cost manufacturing techniques by redesigning the matching elements considered in the original device. Several new designs concerning a great variety of E-plane components (such as right-angled bends, T-junctions and magic-Ts) are presented, and useful design guidelines are provided. The implemented tool, which is mainly based on the boundary integral-resonant mode expansion technique, has been successfully validated by comparing the obtained results to simulated data provided by a commercial software based on the finite element method.
A low-cost three-dimensional laser surface scanning approach for defining body segment parameters.
Pandis, Petros; Bull, Anthony Mj
2017-11-01
Body segment parameters are used in many different applications in ergonomics as well as in dynamic modelling of the musculoskeletal system. Body segment parameters can be defined using different methods, including techniques that involve time-consuming manual measurements of the human body, used in conjunction with models or equations. In this study, a scanning technique for measuring subject-specific body segment parameters in an easy, fast, accurate and low-cost way was developed and validated. The scanner can obtain the body segment parameters in a single scanning operation, which takes between 8 and 10 s. The results obtained with the system show a standard deviation of 2.5% in volumetric measurements of the upper limb of a mannequin and 3.1% difference between scanning volume and actual volume. Finally, the maximum mean error for the moment of inertia by scanning a standard-sized homogeneous object was 2.2%. This study shows that a low-cost system can provide quick and accurate subject-specific body segment parameter estimates.
Subramanian, Sujha; Tangka, Florence K.L.; Beebe, Maggie Cole; Trebino, Diana; Weir, Hannah K.; Babcock, Frances
2016-01-01
Background: Cancer registration data are vital for creating evidence-based policies and interventions. Quantifying the resources needed for cancer registration activities and identifying potential efficiencies are critically important to ensure sustainability of cancer registry operations. Methods: Using a previously validated web-based cost assessment tool, we collected activity-based cost data and report findings using 3 years of data from 40 National Program of Cancer Registries grantees. We stratified registries by volume: low-volume included fewer than 10,000 cases, medium-volume included 10,000–50,000 cases, and high-volume included >50,000 cases. Results: Low-volume cancer registries incurred an average of $93.11 to report a case (without in-kind contributions) compared with $27.70 incurred by high-volume registries. Across all registries, the highest costs per case were incurred for data collection and abstraction ($8.33), management ($6.86), and administration ($4.99). Low- and medium-volume registries have higher costs than high-volume registries for all key activities. Conclusions: Some cost differences by volume can be explained by the large fixed costs required for administering and performing registration activities, but other reasons may include the quality of the data initially submitted to the registries from reporting sources such as hospitals and pathology laboratories. Automation or efficiency improvements in data collection can potentially reduce overall costs. PMID:26702880
Nelson, Clay M; Gilmore, Thomas M; Harrington, M; Scheckel, Kirk G; Miller, Bradley W; Bradham, Karen D
2013-03-01
The U.S. EPA's in vitro bioaccessibility (IVBA) method 9200.1-86 defines a validated analytical procedure for the determination of lead bioaccessibility in contaminated soils. The method requires the use of a custom-fabricated extraction device that uses a heated water bath for sample incubation. In an effort to improve ease of use, increase sample throughput, and reduce equipment acquisition and maintenance costs, an alternative low-cost, commercially available extraction device capable of sample incubation via heated air and end-over-end rotation was evaluated. An intra-laboratory study was conducted to compare lead bioaccessibility values derived using the two extraction devices. IVBA values were not statistically different (α = 0.05) between the two extraction devices for any of the soils (n = 6) evaluated in this study, with an average difference in mean lead IVBA of 0.8% (s.d. = 0.5%). The commercially available extraction device was able to generate accurate lead IVBA data as compared to the U.S. EPA's expected value for a National Institute of Standards and Technology standard reference material soil. The relative percent differences between high and low IVBA values for each soil, a measure of instrument precision, were also not statistically different (α = 0.05) between the two extraction devices. The statistical agreement of lead IVBA values observed using the two extraction devices supports the use of a low-cost, commercially available extraction device as a reliable alternative to a custom-fabricated device as required by EPA method 9200.1-86.
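The precision measure used above, relative percent difference between duplicate IVBA extractions, can be computed as follows (the paired values are invented):

```python
# Relative percent difference (RPD) between duplicate measurements:
# absolute difference divided by the pair mean, expressed as a percentage.
# Values below are invented IVBA results (%).
def rpd(a, b):
    return abs(a - b) / ((a + b) / 2.0) * 100.0

print(round(rpd(42.0, 40.0), 2))  # -> 4.88
```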
Chen, Po-Yu
2014-01-01
The validity of the expiration dates (validity periods) that manufacturers provide on food product labels is a crucial food safety problem. Governments must study how to use their authority, by implementing fair awards and punishments, to prompt manufacturers to adopt rigorous considerations, such as the effect of adopting new storage methods for extending product validity periods on expected costs. Assuming that a manufacturer sells fresh food or drugs, this manufacturer must respond to current stochastic demands at each unit of time to determine the purchase amount of products for sale. If this decision maker is capable and an opportunity arises, new packaging methods (e.g., aluminum foil packaging, vacuum packaging, high-temperature sterilization after glass packaging, or packaging with various degrees of dryness) or storage methods (i.e., adding desiccants or various antioxidants) can be chosen to extend the validity periods of products. To minimize expected costs, this decision maker must be aware of the processing costs of new storage methods, inventory standards, inventory cycle lengths, and changes in relationships between factors such as stochastic demand functions in a cycle. Based on these changes in relationships, this study established a mathematical model as a basis for discussing the aforementioned topics.
Low-cost carbon thick-film strain sensors for implantable applications
NASA Astrophysics Data System (ADS)
Gutierrez, Christian A.; Meng, Ellis
2010-09-01
The suitability of low-cost carbon thick-film strain sensors embedded within a biomedical grade silicone rubber (Silastic® MDX4-4210) for implantable applications is investigated. These sensors address the need for robust cost-effective implantable strain sensing technology for the closed loop operation of function-restoring neural prosthetic systems. Design, fabrication and characterization of the sensors are discussed in the context of the application to strain/fullness measurements of the urinary bladder as part of the neuroprosthetic treatment of lower urinary tract dysfunction. The fabrication process, utilizing off-the-shelf screen-printing materials, is convenient and cost effective while achieving resolutions down to 75 µm. This method can also be extended to produce multilayer embedded devices by superposition of different screen-printable materials. Uniaxial loading performance, temperature dependence and long-term soak testing are used to validate suitability for implantation while proof-of-concept operation (up to 40% strain) is demonstrated on a bench-top latex balloon bladder model.
Gujral, Rajinder Singh; Haque, Sk Manirul
2010-01-01
A simple and sensitive UV spectrophotometric method was developed and validated for the simultaneous determination of potassium clavulanate (PC) and amoxicillin trihydrate (AT) in bulk, pharmaceutical formulations and in human urine samples. The method was linear in the range of 0.2–8.5 μg/ml for PC and 6.4–33.6 μg/ml for AT. The absorbance was measured at 205 and 271 nm for PC and AT, respectively. The method was validated with respect to accuracy, precision, specificity, ruggedness, robustness, limit of detection and limit of quantitation. This method was used successfully for the quality assessment of four PC and AT drug products and in human urine samples with good precision and accuracy. It was found to be a simple, specific, precise, accurate, reproducible and low-cost UV spectrophotometric method. PMID:23675211
Implementation of a low-cost Interim 21CFR11 compliance solution for laboratory environments.
Greene, Jack E
2003-01-01
In the recent past, compliance with 21CFR11 has become a major buzzword within the pharmaceutical and biotechnology industries. While commercial solutions exist, implementation and validation are expensive and cumbersome. Frequent implementation of new features via point releases further complicates purchasing decisions by making it difficult to weigh the risk of non-compliance against the costs of too frequent upgrades. This presentation discusses a low-cost interim solution to the problem. While this solution does not address 100% of the issues raised by 21CFR11, it does implement and validate: (1) computer system security; (2) backup and restore ability on the electronic records store; and (3) an automated audit trail mechanism that captures the date, time and user identification whenever electronic records are created, modified or deleted. When coupled with enhanced procedural controls, this solution provides an acceptable level of compliance at extremely low cost.
Crowdtruth validation: a new paradigm for validating algorithms that rely on image correspondences.
Maier-Hein, Lena; Kondermann, Daniel; Roß, Tobias; Mersmann, Sven; Heim, Eric; Bodenstedt, Sebastian; Kenngott, Hannes Götz; Sanchez, Alexandro; Wagner, Martin; Preukschas, Anas; Wekerle, Anna-Laura; Helfert, Stefanie; März, Keno; Mehrabi, Arianeb; Speidel, Stefanie; Stock, Christian
2015-08-01
Feature tracking and 3D surface reconstruction are key enabling techniques in computer-assisted minimally invasive surgery. One of the major bottlenecks related to training and validation of new algorithms is the lack of large amounts of annotated images that fully capture the wide range of anatomical/scene variance in clinical practice. To address this issue, we propose a novel approach to obtaining large numbers of high-quality reference image annotations at low cost in an extremely short period of time. The concept is based on outsourcing the correspondence search to a crowd of anonymous users from an online community (crowdsourcing) and comprises four stages: (1) feature detection, (2) correspondence search via crowdsourcing, (3) merging multiple annotations per feature by fitting Gaussian finite mixture models, (4) outlier removal using the result of the clustering as input for a second annotation task. On average, 10,000 annotations were obtained within 24 h at a cost of $100. The annotation of the crowd after clustering and before outlier removal was of expert quality with a median distance of about 1 pixel to a publicly available reference annotation. The threshold for the outlier removal task directly determines the maximum annotation error, but also the number of points removed. Our concept is a novel and effective method for fast, low-cost and highly accurate correspondence generation that could be adapted to various other applications related to large-scale data annotation in medical image computing and computer-assisted interventions.
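Stages (3) and (4) above can be approximated with a much simpler merge for illustration. The paper fits Gaussian finite mixture models; this sketch substitutes a median estimate plus a distance threshold, on invented click coordinates:

```python
# Simplified stand-in for merging multiple crowd annotations per feature
# and removing outliers: take the per-axis median, drop clicks farther than
# a pixel threshold, then re-estimate from the inliers. This is NOT the
# authors' GMM procedure, only an illustrative approximation.
import statistics

def merge_annotations(points, outlier_px=10.0):
    """points: list of (x, y) crowd clicks for one feature."""
    mx = statistics.median(p[0] for p in points)
    my = statistics.median(p[1] for p in points)
    inliers = [p for p in points
               if ((p[0] - mx) ** 2 + (p[1] - my) ** 2) ** 0.5 <= outlier_px]
    # re-estimate the feature location from inliers only
    return (statistics.median(p[0] for p in inliers),
            statistics.median(p[1] for p in inliers))

clicks = [(100, 200), (101, 199), (99, 201), (160, 240)]  # last one an outlier
print(merge_annotations(clicks))  # -> (100, 200)
```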
Validation of a low-cost EEG device for mood induction studies.
Rodríguez, Alejandro; Rey, Beatriz; Alcañiz, Mariano
2013-01-01
New electroencephalography (EEG) devices, more portable and cheaper, are appearing on the market. Studying the reliability of these EEG devices for emotional studies would be interesting, as these devices could be more economical and compatible with Virtual Reality (VR) settings. Therefore, the aim in this work was to validate a low-cost EEG device (Emotiv Epoc) to monitor brain activity during a positive emotional induction procedure. Emotional pictures (IAPS) were used to induce a positive mood in sixteen participants. Changes in the brain activity of subjects were compared between positive induction and neutral conditions. Obtained results were in accordance with previous scientific literature regarding frontal EEG asymmetry, which supports the possibility of using this low-cost EEG device in future mood induction studies combined with VR.
Development and Validation of a Kit to Measure Drink Antioxidant Capacity Using a Novel Colorimeter.
Priftis, Alexandros; Stagos, Dimitrios; Tzioumakis, Nikolaos; Konstantinopoulos, Konstantinos; Patouna, Anastasia; Papadopoulos, Georgios E; Tsatsakis, Aristides; Kouretas, Dimitrios
2016-08-30
Measuring the antioxidant capacity of foods is essential as a means of quality control, to ensure that the final product reaching the consumer will be of high standards. Despite the already existing assays with which antioxidant activity is estimated, new, faster and lower-cost methods are always sought. Therefore, we have developed a novel colorimeter and combined it with a slightly modified DPPH assay, thus creating a kit that can assess the antioxidant capacity of liquids (e.g., different types of coffee, beer, wine, juices) in a fast and low-cost manner. The accuracy of the colorimeter was ensured by comparing it to a fully validated Hitachi U-1900 spectrophotometer, and a coefficient was calculated to eliminate the observed differences. In addition, new, user-friendly software was developed, in order to render the procedure as easy as possible, while allowing central monitoring of the obtained results. Overall, a novel kit was developed with which the antioxidant activity of liquids can be measured, firstly to ensure their quality and secondly to assess the amount of antioxidants consumed with the respective food.
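The abstract mentions calculating a coefficient to reconcile the colorimeter with the reference spectrophotometer but does not state its form. One plausible, purely hypothetical version is a least-squares scale factor through the origin, k = Σxy / Σx², on paired absorbance readings; the data below are invented:

```python
# Hypothetical calibration coefficient aligning a low-cost colorimeter with
# a reference spectrophotometer: least-squares scale factor through the
# origin. Paired absorbance readings below are invented.
def calibration_coefficient(device, reference):
    num = sum(d * r for d, r in zip(device, reference))
    den = sum(d * d for d in device)
    return num / den

device    = [0.10, 0.20, 0.40, 0.80]
reference = [0.12, 0.24, 0.48, 0.96]
k = calibration_coefficient(device, reference)
print(round(k, 3))  # multiply colorimeter readings by k (~1.2 here)
```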
Laparoscopic Common Bile Duct Exploration Four-Task Training Model: Construct Validity
Otaño, Natalia; Rodríguez, Omaira; Sánchez, Renata; Benítez, Gustavo; Schweitzer, Michael
2012-01-01
Background: Training models in laparoscopic surgery allow the surgical team to practice procedures in a safe environment. We have proposed the use of a 4-task, low-cost inert model to practice critical steps of laparoscopic common bile duct exploration. Methods: The performance of 3 groups with different levels of expertise in laparoscopic surgery, novices (A), intermediates (B), and experts (C), was evaluated using a low-cost inert model in the following tasks: (1) intraoperative cholangiography catheter insertion, (2) transcystic exploration, (3) T-tube placement, and (4) choledochoscope management. Kruskal-Wallis and Mann-Whitney tests were used to identify differences among the groups. Results: A total of 14 individuals were evaluated: 5 novices (A), 5 intermediates (B), and 4 experts (C). The results involving intraoperative cholangiography catheter insertion were similar among the 3 groups. As for the other tasks, the expert group had better results than the other 2 groups, between which no significant differences occurred. The proposed model is able to discriminate among individuals with different levels of expertise, indicating that the abilities the model evaluates are relevant to the surgeon's performance in CBD exploration. Conclusions: Construct validity for tasks 2 and 3 was demonstrated. However, task 1 was not capable of distinguishing between groups, and task 4 was not statistically validated. PMID:22906323
Rahman, Md. Sayedur; Sathasivam, Kathiresan V.
2015-01-01
Biosorption process is a promising technology for the removal of heavy metals from industrial wastes and effluents using low-cost and effective biosorbents. In the present study, adsorption of Pb2+, Cu2+, Fe2+, and Zn2+ onto dried biomass of red seaweed Kappaphycus sp. was investigated as a function of pH, contact time, initial metal ion concentration, and temperature. The experimental data were evaluated by four isotherm models (Langmuir, Freundlich, Temkin, and Dubinin-Radushkevich) and four kinetic models (pseudo-first-order, pseudo-second-order, Elovich, and intraparticle diffusion models). The adsorption process was feasible, spontaneous, and endothermic in nature. Functional groups in the biomass involved in metal adsorption process were revealed as carboxylic and sulfonic acids and sulfonate by Fourier transform infrared analysis. A total of nine error functions were applied to validate the models. We strongly suggest the analysis of error functions for validating adsorption isotherm and kinetic models using linear methods. The present work shows that the red seaweed Kappaphycus sp. can be used as a potentially low-cost biosorbent for the removal of heavy metal ions from aqueous solutions. Further study is warranted to evaluate its feasibility for the removal of heavy metals from the real environment. PMID:26295032
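As one concrete example of the isotherm fitting and error analysis described above, the Langmuir model q = qmax·KL·C / (1 + KL·C) can be fitted through its linearized form C/q = C/qmax + 1/(KL·qmax) and then scored with a sum-of-squared-errors function. The equilibrium data below are synthetic, not the paper's measurements:

```python
# Fit the Langmuir isotherm via its linearized form and score the fit with
# a sum-of-squared-errors (SSE) error function. Data are synthetic.
def fit_langmuir_linear(C, q):
    # linear regression of y = C/q on x = C; slope = 1/qmax,
    # intercept = 1/(KL*qmax)
    y = [c / qi for c, qi in zip(C, q)]
    n = len(C)
    mx, my = sum(C) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(C, y))
             / sum((xi - mx) ** 2 for xi in C))
    intercept = my - slope * mx
    qmax = 1.0 / slope
    KL = slope / intercept
    return qmax, KL

def sse(C, q, qmax, KL):
    # squared residuals against the fitted Langmuir curve
    return sum((qi - qmax * KL * c / (1 + KL * c)) ** 2
               for c, qi in zip(C, q))

qmax_true, KL_true = 50.0, 0.1              # synthetic "ground truth"
C = [5.0, 10.0, 20.0, 40.0]                 # equilibrium concentrations (mg/L)
q = [qmax_true * KL_true * c / (1 + KL_true * c) for c in C]
qmax, KL = fit_langmuir_linear(C, q)
print(round(qmax, 3), round(KL, 4), sse(C, q, qmax, KL) < 1e-12)
```

On noise-free synthetic data the linearized fit recovers the true parameters; the paper's point about error functions is that on real, noisy data different error criteria can rank isotherm models differently.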
Sebrow, Dov; Lavery, Hugh J; Brajtbord, Jonathan S; Hobbs, Adele; Levinson, Adam W; Samadi, David B
2012-02-01
To describe a novel, low-cost, online health-related quality of life (HRQOL) survey that allows for automated follow-up and convenient access for patients in geographically diverse locations. Clinicians and investigators have been encouraged to use validated HRQOL instruments when reporting outcomes after radical prostatectomy. The institutional review board approved our protocol and the use of a secure web site (http://www.SurveyMonkey.com) to send patients a collection of validated postprostatectomy HRQOL instruments by electronic mail. To assess compliance with the electronic mail format, a pilot study of cross-sectional surveys was sent to patients who presented for follow-up after robotic-assisted laparoscopic prostatectomy. The response data were transmitted in secure fashion in compliance with the Health Insurance Portability and Accountability Act. After providing written informed consent, 514 patients who presented for follow-up after robotic-assisted laparoscopic prostatectomy from March 2010 to February 2011 were sent the online survey. A total of 293 patients (57%) responded, with an average age of 60 years and a median interval from surgery of 12 months. Of the respondents, 75% completed the survey within 4 days of receiving the electronic mail, with a median completion time of 15 minutes. The total survey administration costs were limited to the web site's $200 annual fee-for-service. An online survey can be a low-cost, efficient, and confidential modality for assessing validated HRQOL outcomes in patients who undergo treatment of localized prostate cancer. This method could be especially useful for those who cannot return for follow-up because of geographic reasons.
A Low-Cost Energy-Efficient Cableless Geophone Unit for Passive Surface Wave Surveys.
Dai, Kaoshan; Li, Xiaofeng; Lu, Chuan; You, Qingyu; Huang, Zhenhua; Wu, H Felix
2015-09-25
The passive surface wave survey is a practical, non-invasive seismic exploration method that has increasingly been used in geotechnical engineering. However, in situ deployment of traditional wired geophones is labor intensive for a dense sensor array. Alternatively, stand-alone seismometers can be used, but they are bulky, heavy, and expensive because they are usually designed for long-term monitoring. To better facilitate field applications of the passive surface wave survey, a low-cost energy-efficient geophone system was developed in this study. The hardware design is presented in this paper. To validate the system's functionality, both laboratory and field experiments were conducted. The unique feature of this newly-developed cableless geophone system allows for rapid field applications of the passive surface wave survey with dense array measurements.
Measuring Flow Rate in Crystalline Bedrock Wells Using the Dissolved Oxygen Alteration Method.
Vitale, Sarah A; Robbins, Gary A
2017-07-01
Determination of vertical flow rates in a fractured bedrock well can aid in planning and implementing hydraulic tests, water quality sampling, and improving interpretations of water quality data. Although flowmeters are highly accurate in flow rate measurement, the high cost and logistics may be limiting. In this study the dissolved oxygen alteration method (DOAM) is expanded upon as a low-cost alternative to determine vertical flow rates in crystalline bedrock wells. The method entails altering the dissolved oxygen content in the wellbore through bubbler aeration, and monitoring the vertical advective movement of the dissolved oxygen over time. Measurements were taken for upward and downward flows, and under ambient and pumping conditions. Vertical flow rates from 0.06 to 2.30 Lpm were measured. To validate the method, flow rates determined with the DOAM were compared to pump discharge rates and found to be in agreement within 2.5%.
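The DOAM's core arithmetic converts the tracked dissolved-oxygen front velocity into a volumetric rate via the wellbore cross-section, Q = v·A. A sketch with invented numbers (the well diameter and timings below are hypothetical, not from the study):

```python
# Vertical flow rate from the advecting dissolved-oxygen front: front
# velocity (displacement / elapsed time) times the wellbore cross-sectional
# area, converted to litres per minute. All numbers are invented.
import math

def vertical_flow_lpm(front_displacement_m, elapsed_min, well_diameter_m):
    area_m2 = math.pi * (well_diameter_m / 2.0) ** 2
    velocity_m_per_min = front_displacement_m / elapsed_min
    return velocity_m_per_min * area_m2 * 1000.0  # m^3/min -> L/min

# DO front moves 0.5 m in 10 min in a 152 mm (6-inch) well
print(round(vertical_flow_lpm(0.5, 10.0, 0.152), 2))  # -> 0.91
```

A rate of about 0.91 Lpm sits comfortably inside the 0.06 to 2.30 Lpm range the study reports.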
Sheng, Yanghao; Zhou, Boting
2017-05-26
Therapeutic drug monitoring (TDM) is one of the most important services of clinical laboratories. Two main techniques are commonly used: the immunoassay and the chromatography method. We have developed a cost-effective system of two-dimensional liquid chromatography with ultraviolet detection (2D-LC-UV) for high-throughput determination of vancomycin in human plasma that combines the automation and low start-up costs of the immunoassay with the high selectivity and sensitivity of liquid chromatography coupled with mass spectrometric detection, without incurring their disadvantages, achieving high cost-effectiveness. This 2D-LC system offers large-volume injection to provide sufficient sensitivity and uses simulated gradient peak compression technology to control peak broadening and to improve peak shape. A middle column was added to reduce the analysis cycle time and make it suitable for high-throughput routine clinical assays. The analysis cycle time was 4 min and the peak width was 0.8 min. Compared with other chromatographic methods that have been developed, the analysis cycle time and peak width for vancomycin were reduced significantly. The lower limit of quantification was 0.20 μg/mL for vancomycin, which is the same as certain LC-MS/MS methods that have been recently developed and validated. The method is rapid, automated, and low-cost and has high selectivity and sensitivity for the quantification of vancomycin in human plasma, thus making it well-suited for use in hospital clinical laboratories.
Schneider, Philipp; Castell, Nuria; Vogt, Matthias; Dauge, Franck R; Lahoz, William A; Bartonova, Alena
2017-09-01
The recent emergence of low-cost microsensors measuring various air pollutants has significant potential for carrying out high-resolution mapping of air quality in the urban environment. However, the data obtained by such sensors are generally less reliable than that from standard equipment and they are subject to significant data gaps in both space and time. In order to overcome this issue, we present here a data fusion method based on geostatistics that allows for merging observations of air quality from a network of low-cost sensors with spatial information from an urban-scale air quality model. The performance of the methodology is evaluated for nitrogen dioxide in Oslo, Norway, using both simulated datasets and real-world measurements from a low-cost sensor network for January 2016. The results indicate that the method is capable of producing realistic hourly concentration fields of urban nitrogen dioxide that inherit the spatial patterns from the model and adjust the prior values using the information from the sensor network. The accuracy of the data fusion method is dependent on various factors including the total number of observations, their spatial distribution, their uncertainty (both in terms of systematic biases and random errors), as well as the ability of the model to provide realistic spatial patterns of urban air pollution. A validation against official data from air quality monitoring stations equipped with reference instrumentation indicates that the data fusion method is capable of reproducing city-wide averaged official values with an R² of 0.89 and a root mean squared error of 14.3 μg m⁻³. It is further capable of reproducing the typical daily cycles of nitrogen dioxide. Overall, the results indicate that the method provides a robust way of extracting useful information from uncertain sensor data using only a time-invariant model dataset and the knowledge contained within an entire sensor network.
Published by Elsevier Ltd.. All rights reserved.
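The core idea of the fusion step (model field as prior, sensor residuals spread spatially to correct it) can be sketched as follows. This is a simplified stand-in for the geostatistical (kriging-based) method the abstract describes, using inverse-distance weighting instead of a fitted variogram; all function and argument names are illustrative.

```python
import numpy as np

def fuse_with_sensors(model_field, grid_xy, sensor_xy, sensor_val, power=2.0):
    """Adjust a prior model concentration field using sparse sensor data.

    Residuals between each sensor and the co-located model value are
    spread over the grid by inverse-distance weighting and added back
    to the prior, so the fused field keeps the model's spatial pattern
    while matching the observations near the sensors.
    """
    sensor_xy = np.asarray(sensor_xy, dtype=float)
    sensor_val = np.asarray(sensor_val, dtype=float)
    grid_xy = np.asarray(grid_xy, dtype=float)
    field = np.asarray(model_field, dtype=float)

    # residual of each observation relative to the nearest grid cell
    residuals = np.empty(len(sensor_val))
    for j, (sx, sy) in enumerate(sensor_xy):
        i = np.argmin(np.hypot(grid_xy[:, 0] - sx, grid_xy[:, 1] - sy))
        residuals[j] = sensor_val[j] - field[i]

    # spread residuals over the grid with inverse-distance weights
    fused = field.copy()
    for k, (gx, gy) in enumerate(grid_xy):
        d = np.hypot(sensor_xy[:, 0] - gx, sensor_xy[:, 1] - gy)
        w = 1.0 / np.maximum(d, 1e-9) ** power
        fused[k] += np.sum(w * residuals) / np.sum(w)
    return fused
```

At a sensor location the fused field reproduces the observation; far from all sensors it relaxes toward the prior plus the mean residual, which mirrors how the paper's method "adjusts the prior values using the information from the sensor network".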
VDA, a Method of Choosing a Better Algorithm with Fewer Validations
Kluger, Yuval
2011-01-01
The multitude of bioinformatics algorithms designed for performing a particular computational task presents end-users with the problem of selecting the most appropriate computational tool for analyzing their biological data. The choice of the best available method is often based on expensive experimental validation of the results. We propose an approach to design validation sets for method comparison and performance assessment that are effective in terms of cost and discrimination power. Validation Discriminant Analysis (VDA) is a method for designing a minimal validation dataset to allow reliable comparisons between the performances of different algorithms. Implementation of our VDA approach achieves this reduction by selecting predictions that maximize the minimum Hamming distance between algorithmic predictions in the validation set. We show that VDA can be used to correctly rank algorithms according to their performances. These results are further supported by simulations and by realistic algorithmic comparisons in silico. VDA is a novel, cost-efficient method for minimizing the number of validation experiments necessary for reliable performance estimation and fair comparison between algorithms. Our VDA software is available at http://sourceforge.net/projects/klugerlab/files/VDA/ PMID:22046256
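The selection rule at the heart of the abstract (choose validation items that maximize the minimum Hamming distance between algorithmic predictions) can be sketched greedily. This is an illustrative reading of the rule, not the authors' VDA software; the dict-based interface is hypothetical.

```python
from itertools import combinations

def select_validation_set(predictions, k):
    """Greedily pick k items so the minimum pairwise Hamming distance
    between the algorithms' call vectors, restricted to the selected
    items, is as large as possible.

    predictions: dict mapping algorithm name -> list of 0/1 calls,
    one per candidate item (hypothetical interface).
    """
    algos = list(predictions)
    n_items = len(next(iter(predictions.values())))

    def min_pairwise_hamming(items):
        # smallest disagreement count over all algorithm pairs
        return min(
            sum(predictions[a][i] != predictions[b][i] for i in items)
            for a, b in combinations(algos, 2)
        )

    selected, remaining = [], list(range(n_items))
    for _ in range(k):
        best = max(remaining, key=lambda i: min_pairwise_hamming(selected + [i]))
        selected.append(best)
        remaining.remove(best)
    return sorted(selected)
```

Items on which all algorithms agree contribute nothing to discrimination, so the greedy step favors items where the algorithms disagree in complementary ways.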
A Low-Cost Method for Multiple Disease Prediction.
Bayati, Mohsen; Bhaskar, Sonia; Montanari, Andrea
Recently, in response to the rising costs of healthcare services, employers that are financially responsible for the healthcare costs of their workforce have been investing in health improvement programs for their employees. A main objective of these so-called "wellness programs" is to reduce the incidence of chronic illnesses such as cardiovascular disease, cancer, diabetes, and obesity, with the goal of reducing future medical costs. The majority of these wellness programs include an annual screening to detect individuals with the highest risk of developing chronic disease. Once these individuals are identified, the company can invest in interventions to reduce their risk. However, capturing many biomarkers per employee creates a costly screening procedure. We propose a statistical data-driven method to address this challenge by minimizing the number of biomarkers in the screening procedure while maximizing the predictive power over a broad spectrum of diseases. Our solution uses multi-task learning and group dimensionality reduction from machine learning and statistics. We provide empirical validation of the proposed solution using data from two different electronic medical records systems, with comparisons to a statistical benchmark.
NASA Technical Reports Server (NTRS)
Howard, J. W.; Kim, H.; Berg, M.; LaBel, K. A.; Stansberry, S.; Friendlich, M.; Irwin, T.
2006-01-01
A viewgraph presentation on the development of a low-cost, high-speed tester for reconfigurable Field Programmable Gate Arrays (FPGAs) is shown. The topics include: 1) Introduction; 2) Objectives; 3) Tester Descriptions; 4) Tester Validations and Demonstrations; 5) Future Work; and 6) Summary.
Design of New Procedures for Diagnosing Prevalent Diseases Using a Low-Cost Telemicroscopy System.
Prieto-Egido, Ignacio; González-Escalada, Alba; García-Giganto, Víctor; Martínez-Fernández, Andrés
2016-11-01
Currently, the diagnosis of prevalent diseases such as malaria, tuberculosis, or diarrheal diseases in rural areas of developing countries requires the displacement of the patient from their community health post to their reference health center or the shipping of a sample. This delays diagnosis and the treatment of disease. Conduct research to develop a new method for rapid low-cost diagnosis of prevalent diseases in rural areas of developing countries (malaria, tuberculosis, parasitic infections, vaginal infections, and cervical cancer). The study was divided into three phases. The first related to the drafting and validating of new protocols for the preparation of samples that should be adapted to be carried out in areas without power and with little trained personnel. The second phase consisted of developing a telemicroscopy system looking for low cost, software compatibility, and technical quality. Finally, the third phase evaluated the system as a diagnostic tool using direct observation with a conventional microscope as the gold standard. The validation of the new protocols showed that 100% of the vaginal swabs were processed correctly when using direct smear, while they were only 86.3% correct when using Gram stain; 68.3% of fecal samples were correctly processed using Kinyoun stain; 61.7% of blood samples when using thin film; and 83.8% when using thick film. Phase 2 permitted the development of a low-cost (<$250) and low-power (<15 W) telemicroscopy system that allows real-time consultation between health technicians and specialists. Finally, phase 3 proved that there was no difference between the diagnoses obtained by direct observation with a conventional microscope and those obtained through the new telemicroscopy system. This study has verified the effectiveness of the telemicroscopy system as a diagnostic tool, given the complete agreement between the diagnoses made with it and those made with the gold standard.
Jerrett, Michael; Donaire-Gonzalez, David; Popoola, Olalekan; Jones, Roderic; Cohen, Ronald C; Almanza, Estela; de Nazelle, Audrey; Mead, Iq; Carrasco-Turigas, Glòria; Cole-Hunter, Tom; Triguero-Mas, Margarita; Seto, Edmund; Nieuwenhuijsen, Mark
2017-10-01
Low-cost, personal air pollution sensors may reduce exposure measurement errors in epidemiological investigations and contribute to citizen science initiatives. Here we assess the validity of a low-cost personal air pollution sensor. Study participants were drawn from two ongoing epidemiological projects in Barcelona, Spain. Participants repeatedly wore the pollution sensor, which measured carbon monoxide (CO), nitric oxide (NO), and nitrogen dioxide (NO2). We also compared personal sensor measurements to those from more expensive instruments. Our personal sensors had moderate to high correlations with government monitors with averaging times of 1-h and 30-min epochs (r ~ 0.38-0.8) for NO and CO, but had low to moderate correlations with NO2 (~0.04-0.67). Correlations between the personal sensors and more expensive research instruments were higher than with the government monitors. The sensors were able to detect high and low air pollution levels in agreement with expectations (e.g., high levels on or near busy roadways and lower levels in background residential areas and parks). Our findings suggest that the low-cost, personal sensors have potential to reduce exposure measurement error in epidemiological studies and provide valid data for citizen science studies. Copyright © 2017 Elsevier Inc. All rights reserved.
Shifted Transversal Design smart-pooling for high coverage interactome mapping
Xin, Xiaofeng; Rual, Jean-François; Hirozane-Kishikawa, Tomoko; Hill, David E.; Vidal, Marc; Boone, Charles; Thierry-Mieg, Nicolas
2009-01-01
“Smart-pooling,” in which test reagents are multiplexed in a highly redundant manner, is a promising strategy for achieving high efficiency, sensitivity, and specificity in systems-level projects. However, previous applications relied on low redundancy designs that do not leverage the full potential of smart-pooling, and more powerful theoretical constructions, such as the Shifted Transversal Design (STD), lack experimental validation. Here we evaluate STD smart-pooling in yeast two-hybrid (Y2H) interactome mapping. We employed two STD designs and two established methods to perform ORFeome-wide Y2H screens with 12 baits. We found that STD pooling achieves similar levels of sensitivity and specificity as one-on-one array-based Y2H, while the costs and workloads are divided by three. The screening-sequencing approach is the most cost- and labor-efficient, yet STD identifies about twofold more interactions. Screening-sequencing remains an appropriate method for quickly producing low-coverage interactomes, while STD pooling appears as the method of choice for obtaining maps with higher coverage. PMID:19447967
Mass-flow-rate-controlled fluid flow in nanochannels by particle insertion and deletion.
Barclay, Paul L; Lukes, Jennifer R
2016-12-01
A nonequilibrium molecular dynamics method to induce fluid flow in nanochannels, the insertion-deletion method (IDM), is introduced. IDM inserts and deletes particles within distinct regions in the domain, creating locally high and low pressures. The benefits of IDM are that it directly controls a physically meaningful quantity, the mass flow rate, allows for pressure and density gradients to develop in the direction of flow, and permits treatment of complex aperiodic geometries. Validation of IDM is performed, yielding good agreement with the analytical solution of Poiseuille flow in a planar channel. Comparison of IDM to existing methods indicates that it is best suited for gases, both because it intrinsically accounts for compressibility effects on the flow and because the computational cost of particle insertion is lowest for low-density fluids.
Pavlovich, Mark; Greene, Brandon F.
1984-01-01
In this study, we describe the development and evaluation of a self-instructional program for installing 10 low-cost/no-cost weatherization materials (e.g., weatherstripping, caulking). This program was a weatherization and retrofit manual (WARM) providing step-by-step instructions and illustrations. Boy and Girl Scouts participated and used either the WARM or existing product instructions (EPI) to apply the materials. Scouts installed the materials properly only when they used the WARM. PMID:16795671
Parametric study of different contributors to tumor thermal profile
NASA Astrophysics Data System (ADS)
Tepper, Michal; Gannot, Israel
2014-03-01
Treating cancer is one of the major challenges of modern medicine. There is great interest in assessing tumor development in in vivo animal and human models, as well as in in vitro experiments. Existing methods are either limited by cost and availability or by their low accuracy and reproducibility. Thermography holds the potential of being a noninvasive, low-cost, non-irradiative and easy-to-use method for tumor monitoring. Tumors can be detected in thermal images due to their relatively higher or lower temperature compared to the temperature of the healthy skin surrounding them. Extensive research has been performed to show the validity of thermography as an efficient method for tumor detection and the possibility of extracting tumor properties from thermal images, showing promising results. However, deducing from one type of experiment to others is difficult due to the differences in tumor properties, especially between different types of tumors or different species. There is a need for research linking different types of tumor experiments. In this research, a parametric analysis of possible contributors to tumor thermal profiles was performed. The effect of tumor geometric, physical and thermal properties was studied, both independently and together, in phantom model experiments and computer simulations. Theoretical and experimental results were cross-correlated to validate the models used and increase the accuracy of simulated complex tumor models. The contribution of different parameters in various tumor scenarios was estimated and the implication of these differences on the observed thermal profiles was studied. The correlation between animal and human models is discussed.
Electron Beam-Cure Polymer Matrix Composites: Processing and Properties
NASA Technical Reports Server (NTRS)
Wrenn, G.; Frame, B.; Jensen, B.; Nettles, A.
2001-01-01
Researchers from NASA and Oak Ridge National Laboratory are evaluating a series of electron beam curable composites for application in reusable launch vehicle airframe and propulsion systems. Objectives are to develop electron beam curable composites that are useful at cryogenic to elevated temperatures (-217 C to 200 C), validate key mechanical properties of these composites, and demonstrate cost-saving fabrication methods at the subcomponent level. Electron beam curing of polymer matrix composites is an enabling capability for production of aerospace structures in a non-autoclave process. Payoffs of this technology will be fabrication of composite structures at room temperature, reduced tooling cost and cure time, and improvements in component durability. This presentation covers the results of material property evaluations for electron beam-cured composites made with either unidirectional tape or woven fabric architectures. Resin systems have been evaluated for performance in ambient, cryogenic, and elevated temperature conditions. Results for electron beam composites and similar composites cured in conventional processes are reviewed for comparison. Fabrication demonstrations were also performed for electron beam-cured composite airframe and propulsion piping subcomponents. These parts have been built to validate manufacturing methods with electron beam composite materials, to evaluate electron beam curing processing parameters, and to demonstrate lightweight, low-cost tooling options.
A Low-Cost Contact System to Assess Load Displacement Velocity in a Resistance Training Machine
Buscà, Bernat; Font, Anna
2011-01-01
This study sought to determine the validity of a new system for assessing the displacement and average velocity within machine-based resistance training exercise using the Chronojump System. The new design is based on a contact bar and a simple, low-cost mechanism that detects the conductivity of electrical potentials with a precision chronograph. This system allows coaches to assess velocity to control the strength training process. A validation study was performed by assessing the concentric phase parameters of a leg press exercise. Output time data from the Chronojump System in combination with the pre-established range of movement was compared with data from a position sensor connected to a Biopac System. A subset of 87 actions from 11 professional tennis players was recorded and, using the two methods, average velocity and displacement variables in the same action were compared. A t-test for dependent samples and a correlation analysis were undertaken. The r value derived from the correlation between the Biopac System and the contact Chronojump System was >0.94 for all measures of displacement and velocity on all loads (p < 0.01). The Effect Size (ES) was 0.18 in displacement and 0.14 in velocity and ranged from 0.09 to 0.31 and from 0.07 to 0.34, respectively. The magnitude of the difference between the two methods in all parameters and the correlation values provided evidence of the validity of the Chronojump System to assess the average displacement velocity of loads in a resistance training machine. Key points: The assessment of speed in resistance machines is a valuable source of information for strength training. Many commercial systems used to assess velocity, power and force are expensive, thereby preventing widespread use by coaches and athletes. The system is intended to be a low-cost device for assessing and controlling the velocity exerted on each repetition in any resistance training machine. The system could be easily adapted to any vertical-displacement barbell exercise. PMID:24150620
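The computation behind the system is simple: the contact mechanism records only the time the load takes to cover a pre-established range of movement, so mean velocity is displacement over time, and validity against the criterion device is summarized by a Pearson correlation. A minimal sketch (the numbers and names are illustrative, not the study's data):

```python
import math

def mean_velocity(displacement_m, contact_time_s):
    # mean concentric velocity over the pre-established range of movement
    return displacement_m / contact_time_s

def pearson_r(x, y):
    """Pearson correlation between two series of per-repetition
    measurements, e.g. contact-system vs. position-sensor velocities."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

For example, a 0.40 m leg-press concentric phase timed at 0.80 s gives a mean velocity of 0.50 m/s; computing `pearson_r` over paired per-repetition velocities from the two devices yields the r values the study reports.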
A Low-Cost Energy-Efficient Cableless Geophone Unit for Passive Surface Wave Surveys
Dai, Kaoshan; Li, Xiaofeng; Lu, Chuan; You, Qingyu; Huang, Zhenhua; Wu, H. Felix
2015-01-01
The passive surface wave survey is a practical, non-invasive seismic exploration method that has increasingly been used in geotechnical engineering. However, in situ deployment of traditional wired geophones is labor intensive for a dense sensor array. Alternatively, stand-alone seismometers can be used, but they are bulky, heavy, and expensive because they are usually designed for long-term monitoring. To better facilitate field applications of the passive surface wave survey, a low-cost energy-efficient geophone system was developed in this study. The hardware design is presented in this paper. To validate the system’s functionality, both laboratory and field experiments were conducted. The unique feature of this newly-developed cableless geophone system allows for rapid field applications of the passive surface wave survey with dense array measurements. PMID:26404270
Software design and implementation of ship heave motion monitoring system based on MBD method
NASA Astrophysics Data System (ADS)
Yu, Yan; Li, Yuhan; Zhang, Chunwei; Kang, Won-Hee; Ou, Jinping
2015-03-01
Marine transportation plays a significant role in the modern transport sector owing to its low cost and large capacity, and it receives enormous attention worldwide; the related areas of product development have become a research hot spot. DSP signal processors feature small size, low cost, high precision, and fast processing speed, and have been widely used in all kinds of monitoring systems. However, the traditional DSP code development process is time-consuming, inefficient, costly, and difficult. MathWorks proposed Model-Based Design (MBD) to overcome these defects: target-board modules from the Simulink library are called to compile and generate the corresponding code for the target processor, and the DSP integrated development environment CCS is then invoked automatically for algorithm validation on the target processor. This paper uses MBD to design the algorithm for the ship heave motion monitoring system; the algorithm runs successfully on the processor, demonstrating the effectiveness of MBD.
Paper-based chromatic toxicity bioassay by analysis of bacterial ferricyanide reduction.
Pujol-Vila, F; Vigués, N; Guerrero-Navarro, A; Jiménez, S; Gómez, D; Fernández, M; Bori, J; Vallès, B; Riva, M C; Muñoz-Berbel, X; Mas, J
2016-03-03
Water quality assessment requires a continuous and strict analysis of samples to guarantee compliance with established standards. Nowadays, the increasing number of pollutants and their synergistic effects lead to the development of general toxicity bioassays capable of analysing water pollution as a whole. Current general toxicity methods, e.g. Microtox(®), rely on long operation protocols, the use of complex and expensive instrumentation, and sample pre-treatment, with samples transported to the laboratory for analysis. These requirements delay sample analysis and hence the response needed to avoid an environmental catastrophe. To address this, a fast (15 min) and low-cost toxicity bioassay based on the chromatic changes associated with bacterial ferricyanide reduction is presented here. E. coli cells (used as model bacteria) were stably trapped on low-cost paper matrices (cellulose-based paper discs, PDs) and remained viable for long times (1 month at -20 °C). Apart from serving as bacterial carrier, the paper matrices also acted as a fluidic element, allowing fluid management without the need for external pumps. Bioassay evaluation was performed using copper as a model toxic agent. Chromatic changes associated with bacterial ferricyanide reduction were determined by three different transduction methods, i.e. (i) optical reflectometry (as reference method), (ii) image analysis, and (iii) visual inspection. In all cases, bioassay results (in terms of half-maximal effective concentrations, EC50) were in agreement with already-reported data, confirming the good performance of the bioassay. The validation of the bioassay was performed by analysis of real samples from natural sources, which were analysed and compared with a reference method (i.e. Microtox). The results showed agreement for about 70% of toxic samples and 80% of non-toxic samples, which may validate the use of this simple and quick protocol in the determination of general toxicity. The minimum instrumentation requirements and the simplicity of the bioassay open the possibility of in-situ water toxicity assessment with a fast and low-cost protocol. Copyright © 2016 Elsevier B.V. All rights reserved.
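The EC50 figures the bioassay reports come from a dose-effect series. A minimal sketch of the idea, using linear interpolation on a monotone series (real assays usually fit a sigmoid model to the data; the values below are illustrative, not the paper's):

```python
def ec50(concentrations, effects):
    """Half-maximal effective concentration by linear interpolation
    on a monotone dose-effect series (effects scaled 0-1)."""
    pairs = list(zip(concentrations, effects))
    for (c0, e0), (c1, e1) in zip(pairs, pairs[1:]):
        if e0 < 0.5 <= e1:
            # interpolate the concentration where the effect crosses 50%
            return c0 + (0.5 - e0) / (e1 - e0) * (c1 - c0)
    raise ValueError("50% effect is not bracketed by the data")
```

For a series where the effect rises from 0.3 at 2 mg/l to 0.6 at 4 mg/l, the 50% crossing falls at about 3.3 mg/l.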
Cherny, N I; Sullivan, R; Torode, J; Saar, M; Eniu, A
2017-01-01
Background: The availability and affordability of safe, effective, high-quality anticancer therapies are a core requirement for effective national cancer control plans. Method: Online survey based on a previously validated approach. The aims of the study were to evaluate (i) the availability on national formulary of licensed antineoplastic medicines across the globe, (ii) patient out-of-pocket costs for the medications, (iii) the actual availability of the medication for a patient with a valid prescription, (iv) information relating to possible factors adversely impacting the availability of antineoplastic agents and (v) the impact of the country’s level of economic development on these parameters. A total of 304 field reporters from 97 countries were invited to participate. The preliminary set of data was posted on the ESMO website for open peer review and amendments have been incorporated into the final report. Results: Surveys were submitted by 135 reporters from 63 countries and additional peer-review data were submitted by 54 reporters from 19 countries. There are substantial differences in the formulary availability, out-of-pocket costs and actual availability for many anticancer medicines. The most substantial issues are in lower-middle- and low-income countries. Even among medications on the WHO Model List of Essential Medicines (EML) the discrepancies are profound and relate to high out-of-pocket costs (in low-middle-income countries 32.0% of EML medicines are available only at full cost and 5.2% are not available at all; for low-income countries, the corresponding figures are even worse, at 57.7% and 8.3%, respectively). Conclusions: There is wide global variation in formulary availability, out-of-pocket expenditures and actual availability for most licensed anticancer medicines. Low- and low-middle-income countries have a significant lack of availability and high out-of-pocket expenditures for cancer medicines on the WHO EML, with much less availability of new, more expensive targeted agents compared with high-income countries. PMID:28950323
Hammer, K A; Janes, F R
1995-01-01
The objectives for developing the participative method of subject definition were to gain all the relevant information to a high level of fidelity in the earliest stages of the work and so be able to build a realistic model at reduced labour cost. In order to better integrate the two activities (information acquisition and mathematical modelling), a procedure was devised using the methods of interactive management to facilitate teamwork. This procedure provided the techniques to create suitable working relationships between the two groups, the informants and the modellers, so as to maximize their free and accurate intercommunication, both during the initial definition of the linen service and during the monitoring of the accuracy and reality of the draft models. The objectives of this project were met in that the final model was quickly validated and approved, at a low labour cost.
Diefenbach, Angela K.; Crider, Juliet G.; Schilling, Steve P.; Dzurisin, Daniel
2012-01-01
We describe a low-cost application of digital photogrammetry using commercially available photogrammetric software and oblique photographs taken with an off-the-shelf digital camera to create sequential digital elevation models (DEMs) of a lava dome that grew during the 2004–2008 eruption of Mount St. Helens (MSH) volcano. Renewed activity at MSH provided an opportunity to devise and test this method, because it could be validated against other observations of this well-monitored volcano. The datasets consist of oblique aerial photographs (snapshots) taken from a helicopter using a digital single-lens reflex camera. Twelve sets of overlapping digital images of the dome taken during 2004–2007 were used to produce DEMs and to calculate lava dome volumes and extrusion rates. Analyses of the digital images were carried out using photogrammetric software to produce three-dimensional coordinates of points identified in multiple photos. The evolving morphology of the dome was modeled by comparing successive DEMs. Results were validated by comparison to volume measurements derived from traditional vertical photogrammetric surveys by the US Geological Survey Cascades Volcano Observatory. Our technique was significantly less expensive and required less time than traditional vertical photogrammetric techniques; yet, it consistently yielded volume estimates within 5% of the traditional method. This technique provides an inexpensive, rapid assessment tool for tracking lava dome growth or other topographic changes at restless volcanoes.
Estimation of kernels mass ratio to total in-shell peanuts using low-cost RF impedance meter
USDA-ARS?s Scientific Manuscript database
In this study estimation of percentage of total kernel mass within a given mass of in-shell peanuts was determined nondestructively using a low-cost RF impedance meter. Peanut samples were divided into two groups one the calibration and the other the validation group. Each group contained 25 samples...
Cabrera, Carlos; Chang, Lei; Stone, Mars; Busch, Michael; Wilson, David H
2015-11-01
Nucleic acid testing (NAT) has become the standard for high sensitivity in detecting low levels of virus. However, adoption of NAT can be cost prohibitive in low-resource settings where access to extreme sensitivity could be clinically advantageous for early detection of infection. We report development and preliminary validation of a simple, low-cost, fully automated digital p24 antigen immunoassay with the sensitivity of quantitative NAT viral load (NAT-VL) methods for detection of acute HIV infection. We developed an investigational 69-min immunoassay for p24 capsid protein for use on a novel digital analyzer on the basis of single-molecule-array technology. We evaluated the assay for sensitivity by dilution of standardized preparations of p24, cultured HIV, and preseroconversion samples. We characterized analytical performance and concordance with 2 NAT-VL methods and 2 contemporary p24 Ag/Ab combination immunoassays with dilutions of viral isolates and samples from the earliest stages of HIV infection. Analytical sensitivity was 0.0025 ng/L p24, equivalent to 60 HIV RNA copies/mL. The limit of quantification was 0.0076 ng/L, and imprecision across 10 runs was <10% for samples as low as 0.09 ng/L. Clinical specificity was 95.1%. Sensitivity concordance vs NAT-VL on dilutions of preseroconversion samples and Group M viral isolates was 100%. The digital immunoassay exhibited >4000-fold greater sensitivity than contemporary immunoassays for p24 and sensitivity equivalent to that of NAT methods for early detection of HIV. The data indicate that NAT-level sensitivity for acute HIV infection is possible with a simple, low-cost digital immunoassay. © 2015 American Association for Clinical Chemistry.
Mbinze, J K; Sacré, P-Y; Yemoa, A; Mavar Tayey Mbay, J; Habyalimana, V; Kalenda, N; Hubert, Ph; Marini, R D; Ziemons, E
2015-01-01
Poor-quality antimalarial drugs are one of the major public health problems in Africa. The depth of this problem may be explained in part by the lack of effective enforcement and the lack of efficient local drug analysis laboratories. To tackle part of this issue, two spectroscopic methods with the ability to detect and to quantify quinine dihydrochloride in children's oral drops formulations were developed and validated. Raman and near-infrared (NIR) spectroscopy were selected for the drug analysis due to their low cost, non-destructive and rapid characteristics. Both of the methods developed were successfully validated using the total error approach in the range of 50-150% of the target concentration (20% w/v) within the 10% acceptance limits. Samples collected on the Congolese pharmaceutical market were analyzed by both techniques to detect potentially substandard drugs. After a comparison of the analytical performance of both methods, it was decided to implement the method based on NIR spectroscopy to perform the routine analysis of quinine oral drop samples in the Quality Control Laboratory of Drugs at the University of Kinshasa (DRC). Copyright © 2015 Elsevier B.V. All rights reserved.
Sloan, N L; Rosen, D; de la Paz, T; Arita, M; Temalilwa, C; Solomons, N W
1997-02-01
The prevalence of vitamin A deficiency has traditionally been assessed through xerophthalmia or biochemical surveys. The cost and complexity of implementing these methods limits the ability of nonresearch organizations to identify vitamin A deficiency. This study examined the validity of a simple, inexpensive food frequency method to identify areas with a high prevalence of vitamin A deficiency. The validity of the method was tested in 15 communities, 5 each from the Philippines, Guatemala, and Tanzania. Serum retinol concentrations of less than 20 micrograms/dL defined vitamin A deficiency. Weighted measures of vitamin A intake six or fewer times per week and unweighted measures of consumption of animal sources of vitamin A four or fewer times per week correctly classified seven of eight communities as having a high prevalence of vitamin A deficiency (i.e., 15% or more of preschool-aged children in the community had the deficiency) (sensitivity = 87.5%) and four of seven communities as having a low prevalence (specificity = 57.1%). This method correctly classified the vitamin A deficiency status of 73.3% of the communities but demonstrated a high false-positive rate (42.9%).
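The sensitivity and specificity figures above follow from a simple community-level cross-tabulation of the screening method against the serum-retinol gold standard. A sketch reproducing the arithmetic (the truth/test vectors are illustrative reconstructions consistent with the reported counts, not the study's raw data):

```python
def sens_spec(truth_high, test_high):
    """Sensitivity and specificity of a screening classification
    against a gold standard, given parallel boolean vectors
    (True = classified as high-prevalence)."""
    tp = sum(t and p for t, p in zip(truth_high, test_high))
    fn = sum(t and not p for t, p in zip(truth_high, test_high))
    tn = sum(not t and not p for t, p in zip(truth_high, test_high))
    fp = sum(not t and p for t, p in zip(truth_high, test_high))
    return tp / (tp + fn), tn / (tn + fp)
```

With 8 truly high-prevalence communities (7 detected) and 7 truly low-prevalence communities (4 correctly classified), this gives sensitivity 7/8 = 87.5% and specificity 4/7 ≈ 57.1%, matching the abstract.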
Berkman, Elliot T.; Dickenson, Janna; Falk, Emily B.; Lieberman, Matthew D.
2011-01-01
Objective: Understanding the psychological processes that contribute to smoking reduction will yield population health benefits. Negative mood may moderate smoking lapse during cessation, but this relationship has been difficult to measure in ongoing daily experience. We used a novel form of ecological momentary assessment to test a self-control model of negative mood and craving leading to smoking lapse. Design: We validated short message service (SMS) text as a user-friendly and low-cost option for ecologically measuring real-time health behaviors. We sent text messages to cigarette smokers attempting to quit eight times daily for the first 21 days of cessation (N-obs = 3,811). Main outcome measures: Approximately every two hours, we assessed cigarette count, mood, and cravings, and examined between- and within-day patterns and time-lagged relationships among these variables. Exhaled carbon monoxide was assessed pre- and posttreatment. Results: Negative mood and craving predicted smoking two hours later, but craving mediated the mood–smoking relationship. Also, this mediation relationship predicted smoking over the next two, but not four, hours. Conclusion: Results clarify conflicting previous findings on the relation between affect and smoking, validate a new low-cost and user-friendly method for collecting fine-grained health behavior assessments, and emphasize the importance of rapid, real-time measurement of smoking moderators. PMID:21401252
Simplified Life-Cycle Cost Estimation
NASA Technical Reports Server (NTRS)
Remer, D. S.; Lorden, G.; Eisenberger, I.
1983-01-01
Simple method for life-cycle cost (LCC) estimation avoids pitfalls inherent in formulations requiring separate estimates of inflation and interest rates. Method depends for validity on the observation that interest and inflation rates closely track each other.
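The observation can be made concrete: the present value of a cost incurred in year t is C(1+g)^t/(1+i)^t, where g is the inflation rate and i the interest (discount) rate; when the two rates track each other the factors cancel, so the LCC reduces to today's annual cost times the number of years, with no rate estimates needed. A minimal sketch (names illustrative, not the NASA method's notation):

```python
def lcc_present_value(annual_cost_today, years, inflation, interest):
    """Present value of a recurring annual cost that inflates at
    `inflation` and is discounted at `interest`. When the two rates
    are equal, every term equals today's cost, which is the point
    of the simplification."""
    return sum(
        annual_cost_today * (1 + inflation) ** t / (1 + interest) ** t
        for t in range(1, years + 1)
    )
```

With matched rates, a $100/year cost over 10 years has an LCC of exactly $1000; separate (and error-prone) rate estimates only matter to the extent the two rates diverge.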
A MODIFIED LOW COST COLOURIMETRIC METHOD FOR PARACETAMOL (ACETAMINOPHEN) MEASUREMENT IN PLASMA
Shihana, Fathima; Dissanayake, Dhammika Menike; Dargan, Paul Ivor; Dawson, Andrew Hamilton
2011-01-01
Background Despite a significant increase in the number of patients with paracetamol poisoning in the developing world, plasma paracetamol assays are not widely available. The purpose of this study was to assess a low cost modified colorimetric paracetamol assay that has the potential to be performed in small laboratories with restricted resources. Methods The paracetamol assay used in this study was based on the Glynn and Kendal colorimetric method with a few modifications in order to decrease the production of nitrous gas and thereby reduce infrastructure costs. Preliminary validation studies were performed using spiked aqueous samples with known concentrations of paracetamol. Subsequently, the results from the colorimetric method for 114 stored clinical samples from patients with paracetamol poisoning were compared with those from the current gold-standard high-performance liquid chromatography (HPLC) method. A prospective survey, assessing the clinical use of the paracetamol assay, was performed on all patients with paracetamol poisoning attending the Peradeniya General Hospital, Sri Lanka over a ten-month period. Results The recovery study showed an excellent correlation (r2 > 0.998) for paracetamol concentrations from 25-400 mg/l. The final yellow colour was stable for at least 10 minutes at room temperature. There was also excellent correlation with the HPLC method (r2 = 0.9758). In the clinical cohort study, use of the antidote N-acetylcysteine was avoided in over a third of patients who had a plasma paracetamol concentration measured. The cost of consumables used per assay was $0.50 (US). Conclusions This colorimetric paracetamol assay is reliable and accurate and can be performed rapidly, easily and economically. Use of this assay in resource poor clinical settings has the potential to have a significant clinical and economic impact on the management of paracetamol poisoning. PMID:20095813
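A recovery study like the one above rests on a standard calibration-curve step. As an illustrative sketch (the absorbance readings below are invented, not the paper's data), a least-squares line maps colour intensity back to concentration:

```python
# Illustrative calibration-curve step for a colorimetric assay: least-squares
# fit of absorbance vs. spiked concentration, then invert to read an unknown.
# These absorbance readings are invented, not the paper's data.
conc = [25.0, 50.0, 100.0, 200.0, 400.0]    # spiked standards, mg/l
absorb = [0.11, 0.21, 0.40, 0.79, 1.58]     # hypothetical readings

n = len(conc)
mx, my = sum(conc) / n, sum(absorb) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(conc, absorb))
         / sum((x - mx) ** 2 for x in conc))
intercept = my - slope * mx

def concentration(absorbance):
    """Invert the calibration line to estimate concentration in mg/l."""
    return (absorbance - intercept) / slope
```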
Wells, Dagan; Kaur, Kulvinder; Grifo, Jamie; Glassner, Michael; Taylor, Jenny C; Fragouli, Elpida; Munne, Santiago
2014-01-01
Background The majority of human embryos created using in vitro fertilisation (IVF) techniques are aneuploid. Comprehensive chromosome screening methods, applicable to single cells biopsied from preimplantation embryos, allow reliable identification and transfer of euploid embryos. Recently, randomised trials using such methods have indicated that aneuploidy screening improves IVF success rates. However, the high cost of testing has restricted the availability of this potentially beneficial strategy. This study aimed to harness next-generation sequencing (NGS) technology, with the intention of lowering the costs of preimplantation aneuploidy screening. Methods Embryo biopsy, whole genome amplification and semiconductor sequencing. Results A rapid (<15 h) NGS protocol was developed, with consumable cost only two-thirds that of the most widely used method for embryo aneuploidy detection. Validation involved blinded analysis of 54 cells from cell lines or biopsies from human embryos. Sensitivity and specificity were 100%. The method was applied clinically, assisting in the selection of euploid embryos in two IVF cycles, producing healthy children in both cases. The NGS approach was also able to reveal specified mutations in the nuclear or mitochondrial genomes in parallel with chromosome assessment. Interestingly, elevated mitochondrial DNA content was associated with aneuploidy (p<0.05), a finding suggestive of a link between mitochondria and chromosomal malsegregation. Conclusions This study demonstrates that NGS provides highly accurate, low-cost diagnosis of aneuploidy in cells from human preimplantation embryos and is rapid enough to allow testing without embryo cryopreservation. The method described also has the potential to shed light on other aspects of embryo genetics of relevance to health and viability. PMID:25031024
Digital Sequences and a Time Reversal-Based Impact Region Imaging and Localization Method
Qiu, Lei; Yuan, Shenfang; Mei, Hanfei; Qian, Weifeng
2013-01-01
To reduce the time and cost of damage inspection, on-line impact monitoring of aircraft composite structures is needed. A digital monitor based on an array of piezoelectric transducers (PZTs) is developed to record the region of impacts on-line. It is small in size, lightweight and has low power consumption, but there are two problems with the impact alarm region localization method of the digital monitor at the current stage. The first is that the accuracy rate of the impact alarm region localization is low, especially on complex composite structures. The second is that the area of the impact alarm region is large when a large-scale structure is monitored and the number of PZTs is limited, which increases the time and cost of damage inspections. To solve these two problems, an impact alarm region imaging and localization method based on digital sequences and time reversal is proposed. In this method, the frequency band of the impact response signals is first estimated from the digital sequences. Then, characteristic signals of the impact response signals are constructed by sinusoidal modulation signals. Finally, the phase synthesis time reversal impact imaging method is adopted to obtain the impact region image. Based on this image, an error ellipse is generated to give the final impact alarm region. A validation experiment is implemented on a complex composite wing box of a real aircraft. The validation results show that the accuracy rate of impact alarm region localization is approximately 100%. The area of the impact alarm region can be reduced, and the number of PZTs needed to cover the same impact monitoring region is reduced by more than half. PMID:24084123
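The paper's phase-synthesis time-reversal imaging is more involved, but the core delay-and-sum idea behind it can be sketched as follows. The sensor layout, wave speed and impact waveform here are all invented for illustration:

```python
import numpy as np

# Delay-and-sum sketch of the idea behind time-reversal impact imaging:
# shift each sensor signal back by a candidate travel time and sum; the
# image peaks near the true impact point. Geometry, wave speed and the
# impact waveform are all invented.
fs, c = 1e6, 5000.0                                  # sample rate (Hz), wave speed (m/s)
sensors = np.array([[0.0, 0.0], [0.4, 0.0], [0.0, 0.4], [0.4, 0.4]])  # PZT positions (m)
impact = np.array([0.1, 0.25])                       # true impact point (m)

t = np.arange(2000) / fs
pulse = np.exp(-(((t - 5e-5) / 1e-5) ** 2))          # narrow Gaussian burst

def received(sensor):
    """Signal at a sensor: the burst delayed by propagation time."""
    delay = np.linalg.norm(sensor - impact) / c
    return np.interp(t - delay, t, pulse, left=0.0)

signals = [received(s) for s in sensors]

def image_value(point):
    """Coherent sum after undoing each sensor's candidate delay."""
    total = np.zeros_like(t)
    for sig, s in zip(signals, sensors):
        delay = np.linalg.norm(s - point) / c
        total += np.interp(t + delay, t, sig, right=0.0)
    return float(total.max())

# The true impact point scores near the coherent maximum of 4.0,
# while a distant candidate scores much lower.
assert image_value(impact) > image_value(np.array([0.35, 0.05]))
```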
Development and validation of an automated unit for the extraction of radiocaesium from seawater.
Bokor, Ilonka; Sdraulig, Sandra; Jenkinson, Peter; Madamperuma, Janaka; Martin, Paul
2016-01-01
An automated unit was developed for the in-situ extraction of radiocaesium ((137)Cs and (134)Cs) from large volumes of seawater to achieve very low detection limits. The unit was designed for monitoring of Australian ocean and coastal waters, including at ports visited by nuclear-powered warships. The unit is housed within a robust case, and is easily transported and operated. It contains four filter cartridges connected in series. The first two cartridges are used to remove any suspended material that may be present in the seawater, while the last two cartridges are coated with potassium copper hexacyanoferrate for caesium extraction. Once the extraction is completed the coated cartridges are ashed. The ash is transferred to a small petri dish for counting of (137)Cs and (134)Cs by high resolution gamma spectrometry for a minimum of 24 h. The extraction method was validated for the following criteria: selectivity, trueness, precision, linearity, limit of detection and traceability. The validation showed the unit to be fit for purpose with the method capable of achieving low detection limits required for environmental samples. The results for the environmental measurements in Australian seawater correlate well with those reported in the Worldwide Marine Radioactivity Study (WOMARS). The cost of preparation and running the system is low and waste generation is minimal. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
Costing evidence for health care decision-making in Austria: A systematic review
Mayer, Susanne; Kiss, Noemi; Łaszewska, Agata
2017-01-01
Background With rising healthcare costs comes an increasing demand for evidence-informed resource allocation using economic evaluations worldwide. Furthermore, standardization of costing and reporting methods at both international and national levels is imperative to make economic evaluations a valid tool for decision-making. The aim of this review is to assess the availability and consistency of costing evidence that could be used for decision-making in Austria. It systematically describes the current landscape of economic evaluation and costing studies, focusing on the applied costing methods and their reporting standards. Findings are discussed in terms of their likely impacts on evidence-based decision-making, together with suggestions for areas of development. Methods A systematic literature review of English- and German-language peer-reviewed as well as grey literature (2004–2015) was conducted to identify Austrian economic analyses. The databases MEDLINE, EMBASE, SSCI, EconLit, NHS EED and Scopus were searched. Publication and study characteristics, costing methods, reporting standards and valuation sources were systematically synthesised and assessed. Results A total of 93 studies were included. 87% were journal articles and 13% were reports. 41% of all studies were full economic evaluations, mostly cost-effectiveness analyses. Based on relevant standards, the most commonly observed limitations were that 60% of the studies did not clearly state an analytical perspective, 25% did not provide the year of costing, 27% did not comprehensively list all valuation sources, and 38% did not report all applied unit costs. Conclusion There are substantial inconsistencies in the costing methods and reporting standards of economic analyses in Austria, which may contribute to low acceptance of and lack of interest in economic evaluation-informed decision-making. 
To improve comparability and quality of future studies, national costing guidelines should be updated with more specific methodological guidance and a national reference cost library should be set up to allow harmonisation of valuation methods. PMID:28806728
Process for Low Cost Domestic Production of LIB Cathode Materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thurston, Anthony
The objective of the research was to determine the best low-cost method for the large-scale production of the Nickel-Cobalt-Manganese (NCM) layered cathode materials. The research and development focused on scaling up the licensed technology from Argonne National Laboratory in BASF's battery material pilot plant in Beachwood, Ohio. Since BASF did not have experience with the large-scale production of the NCM cathode materials, there was a significant amount of development needed to support BASF's already existing research program. During the three-year period BASF was able to develop and validate production processes for the NCM 111, 523 and 424 materials, as well as begin development of the High Energy NCM. BASF also used this time period to provide free cathode material samples to numerous manufacturers, OEMs and research companies in order to validate the materials. The success of the project can be demonstrated by the construction of the production plant in Elyria, Ohio and the successful operation of that facility. The benefit of the project to the public will begin to be apparent as soon as material from the production plant is being used in electric vehicles.
High-Speed Friction-Stir Welding To Enable Aluminum Tailor-Welded Blanks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hovanski, Yuri; Upadhyay, Piyush; Carsley, John
Current joining technologies for automotive aluminum alloys are utilized in low-volume and niche applications, and have yet to be scaled for the high-volume vehicle market. This study targeted further weight reduction, part reduction, and cost savings by enabling tailor-welded blank technology for aluminum alloys at high volumes. While friction stir welding has been traditionally applied at linear velocities less than one meter per minute, high-volume production applications demand the process be extended to higher velocities more amenable to cost-sensitive production environments. Unfortunately, weld parameters and performance developed and characterized at low to moderate welding velocities do not directly translate to high-speed linear friction stir welding. Therefore, in order to facilitate production of high-volume aluminum welded components, parameters were developed with a minimum welding velocity of three meters per minute. With an emphasis on weld quality, welded blanks were evaluated for post-weld formability utilizing a combination of numerical and experimental methods. Evaluation across scales was ultimately validated by stamping full-size production door inner panels made from dissimilar-thickness aluminum tailor-welded blanks, which provided validation of the numerical and experimental analysis of laboratory-scale tests.
High-Speed Friction-Stir Welding to Enable Aluminum Tailor-Welded Blanks
NASA Astrophysics Data System (ADS)
Hovanski, Yuri; Upadhyay, Piyush; Carsley, John; Luzanski, Tom; Carlson, Blair; Eisenmenger, Mark; Soulami, Ayoub; Marshall, Dustin; Landino, Brandon; Hartfield-Wunsch, Susan
2015-05-01
Current welding technologies for production of aluminum tailor-welded blanks (TWBs) are utilized in low-volume and niche applications, and they have yet to be scaled for the high-volume vehicle market. This study targeted further weight reduction, part reduction, and cost savings by enabling tailor-welded blank technology for aluminum alloys at high volumes. While friction-stir welding (FSW) has been traditionally applied at linear velocities less than 1 m/min, high-volume production applications demand the process be extended to higher velocities more amenable to cost-sensitive production environments. Unfortunately, weld parameters and performance developed and characterized at low-to-moderate welding velocities do not directly translate to high-speed linear FSW. Therefore, to facilitate production of high-volume aluminum FSW components, parameters were developed with a minimum welding velocity of 3 m/min. With an emphasis on weld quality, welded blanks were evaluated for postweld formability using a combination of numerical and experimental methods. An evaluation across scales was ultimately validated by stamping full-size production door inner panels made from dissimilar thickness aluminum TWBs, which provided validation of the numerical and experimental analysis of laboratory-scale tests.
Erdodi, Laszlo A; Abeare, Christopher A; Lichtenstein, Jonathan D; Tyson, Bradley T; Kucharski, Brittany; Zuccato, Brandon G; Roth, Robert M
2017-02-01
Research suggests that select processing speed measures can also serve as embedded validity indicators (EVIs). The present study examined the diagnostic utility of Wechsler Adult Intelligence Scale-Fourth Edition (WAIS-IV) subtests as EVIs in a mixed clinical sample of 205 patients medically referred for neuropsychological assessment (53.3% female, mean age = 45.1). Classification accuracy was calculated against 3 composite measures of performance validity as criterion variables. A PSI ≤79 produced a good combination of sensitivity (.23-.56) and specificity (.92-.98). A Coding scaled score ≤5 resulted in good specificity (.94-1.00), but low and variable sensitivity (.04-.28). A Symbol Search scaled score ≤6 achieved a good balance between sensitivity (.38-.64) and specificity (.88-.93). A Coding-Symbol Search scaled score difference ≥5 produced adequate specificity (.89-.91) but consistently low sensitivity (.08-.12). A 2-tailed cutoff on the Coding/Symbol Search raw score ratio (≤1.41 or ≥3.57) produced acceptable specificity (.87-.93), but low sensitivity (.15-.24). Failing ≥2 of these EVIs produced variable specificity (.81-.93) and sensitivity (.31-.59). Failing ≥3 of these EVIs stabilized specificity (.89-.94) at a small cost to sensitivity (.23-.53). Results suggest that processing speed based EVIs have the potential to provide a cost-effective and expedient method for evaluating the validity of cognitive data. Given their generally low and variable sensitivity, however, they should not be used in isolation to determine the credibility of a given response set. They also produced unacceptably high rates of false positive errors in patients with moderate-to-severe head injury. Combining evidence from multiple EVIs has the potential to improve overall classification accuracy. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
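For readers unfamiliar with how such cutoffs are scored, a toy calculation shows how one embedded-validity cutoff yields sensitivity and specificity. The scaled scores and criterion labels below are invented, not the study's data:

```python
# Toy scoring of an embedded validity indicator (EVI) cutoff; the scaled
# scores and criterion labels are invented, not the study's data.
scores  = [4, 5, 6, 7, 8, 9, 5, 6, 10, 3]   # e.g. Coding scaled scores
invalid = [1, 1, 1, 0, 0, 0, 0, 0, 0, 1]    # criterion: 1 = invalid performance

cutoff = 5                                   # flag scaled scores <= 5
flagged = [int(s <= cutoff) for s in scores]

tp = sum(1 for f, i in zip(flagged, invalid) if f and i)
tn = sum(1 for f, i in zip(flagged, invalid) if not f and not i)
sensitivity = tp / sum(invalid)                      # flagged among invalid
specificity = tn / (len(invalid) - sum(invalid))     # passed among valid
```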
A low-cost acoustic permeameter
NASA Astrophysics Data System (ADS)
Drake, Stephen A.; Selker, John S.; Higgins, Chad W.
2017-04-01
Intrinsic permeability is an important parameter that regulates air exchange through porous media such as snow. Standard methods of measuring snow permeability are inconvenient to perform outdoors, are fraught with sampling errors, and require specialized equipment, while bringing intact samples back to the laboratory is also challenging. To address these issues, we designed, built, and tested a low-cost acoustic permeameter that allows computation of volume-averaged intrinsic permeability for a homogenous medium. In this paper, we validate acoustically derived permeability of homogenous, reticulated foam samples by comparison with results derived using a standard flow-through permeameter. Acoustic permeameter elements were designed for use in snow, but the measurement methods are not snow-specific. The electronic components - consisting of a signal generator, amplifier, speaker, microphone, and oscilloscope - are inexpensive and easily obtainable. The system is suitable for outdoor use when it is not precipitating, but the electrical components require protection from the elements in inclement weather. The permeameter can be operated with a microphone either internally mounted or buried a known depth in the medium. The calibration method depends on choice of microphone positioning. For an externally located microphone, calibration was based on a low-frequency approximation applied at 500 Hz that provided an estimate of both intrinsic permeability and tortuosity. The low-frequency approximation that we used is valid up to 2 kHz, but we chose 500 Hz because data reproducibility was maximized at this frequency. For an internally mounted microphone, calibration was based on attenuation at 50 Hz and returned only intrinsic permeability. We found that 50 Hz corresponded to a wavelength that minimized resonance frequencies in the acoustic tube and was also within the response limitations of the microphone. 
We used reticulated foam of known permeability (ranging from 2 × 10⁻⁷ to 3 × 10⁻⁹ m²) and estimated tortuosity of 1.05 to validate both methods. For the externally mounted microphone the mean normalized standard deviation was 6 % for permeability and 2 % for tortuosity. The mean relative error from known measurements was 17 % for permeability and 2 % for tortuosity. For the internally mounted microphone the mean normalized standard deviation for permeability was 10 % and the relative error was also 10 %. Permeability determination with an externally mounted microphone is less sensitive to environmental noise than with the internally mounted microphone and is therefore the recommended method. The approximation using the internally mounted microphone was developed as an alternative for circumstances in which placing the microphone in the medium is not feasible. Environmental noise degrades the precision of both methods and is recognizable as increased scatter for replicate data points.
Development and validation of a low-cost mobile robotics testbed
NASA Astrophysics Data System (ADS)
Johnson, Michael; Hayes, Martin J.
2012-03-01
This paper considers the design, construction and validation of a low-cost experimental robotic testbed, which allows for the localisation and tracking of multiple robotic agents in real time. The testbed system is suitable for research and education in a range of different mobile robotic applications, for validating theoretical as well as practical research work in the field of digital control, mobile robotics, graphical programming and video tracking systems. It provides a reconfigurable floor space for mobile robotic agents to operate within, while tracking the position of multiple agents in real-time using the overhead vision system. The overall system provides a highly cost-effective solution to the topical problem of providing students with practical robotics experience within severe budget constraints. Several problems encountered in the design and development of the mobile robotic testbed and associated tracking system, such as radial lens distortion and the selection of robot identifier templates are clearly addressed. The testbed performance is quantified and several experiments involving LEGO Mindstorm NXT and Merlin System MiaBot robots are discussed.
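Radial lens distortion, one of the problems the paper addresses, is commonly corrected with a one-parameter model. This sketch uses the division model with a made-up coefficient and image centre, not the testbed's actual calibration:

```python
# One-parameter "division model" for radial lens distortion, a common fix in
# overhead camera tracking; the coefficient k and image centre are made up,
# not the testbed's calibration.
def undistort(x, y, cx=320.0, cy=240.0, k=-2e-7):
    """Map a distorted pixel (x, y) to its corrected position."""
    dx, dy = x - cx, y - cy
    s = 1.0 / (1.0 + k * (dx * dx + dy * dy))
    return cx + dx * s, cy + dy * s
```

With a negative k (barrel distortion) points far from the centre are pushed back outward, while the centre itself is left untouched.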
Lave, Matthew; Stein, Joshua; Smith, Ryan
2016-07-28
To address the lack of knowledge of local solar variability, we have developed and deployed a low-cost solar variability datalogger (SVD). While most currently used solar irradiance sensors are expensive pyranometers with high accuracy (relevant for annual energy estimates), low-cost sensors display similar precision (relevant for solar variability) to high-cost pyranometers, even if they are not as accurate. In this work, we present an evaluation of various low-cost irradiance sensor types, describe the SVD, and present validation and comparison of the SVD collected data. In conclusion, the low cost and ease of use of the SVD will enable a greater understanding of local solar variability, which will reduce developer and utility uncertainty about the impact of solar photovoltaic (PV) installations and thus will encourage greater penetrations of solar energy.
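One common way to quantify the local variability such a datalogger captures (not necessarily the authors' own metric) is the standard deviation of irradiance ramp rates; the 1 Hz trace below is synthetic:

```python
import numpy as np

# A common variability metric such a datalogger supports (not necessarily
# the authors' own): the standard deviation of irradiance ramp rates.
# The 1 Hz irradiance trace is synthetic.
rng = np.random.default_rng(3)
ghi = 800.0 + np.cumsum(rng.normal(0.0, 20.0, size=600))  # W/m^2, random walk
ramps = np.diff(ghi)                                      # W/m^2 per second
variability = float(ramps.std())
```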
Maschio, Federico; Pandya, Mirali; Olszewski, Raphael
2016-03-22
The objective of this study was to investigate the accuracy of 3-dimensional (3D) plastic (ABS) models generated using a low-cost 3D fused deposition modelling printer. Two human dry mandibles were scanned with a cone beam computed tomography (CBCT) Accuitomo device. Preprocessing consisted of 3D reconstruction with Maxilim software and STL file repair with Netfabb software. Then, the data were used to print 2 plastic replicas with a low-cost 3D fused deposition modeling printer (Up plus 2®). Two independent observers performed the identification of 26 anatomic landmarks on the 4 mandibles (2 dry and 2 replicas) with a 3D measuring arm. Each observer repeated the identifications 20 times. The comparison between the dry and plastic mandibles was based on 13 distances: 8 distances less than 12 mm and 5 distances greater than 12 mm. The mean absolute difference (MAD) was 0.37 mm, and the mean dimensional error (MDE) was 3.76%. The MDE decreased to 0.93% for distances greater than 12 mm. Plastic models generated using the low-cost 3D printer UPplus2® provide dimensional accuracies comparable to other well-established rapid prototyping technologies. Validated low-cost 3D printers could represent a step toward the better accessibility of rapid prototyping technologies in the medical field.
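The two accuracy metrics can be illustrated with invented distances, under one plausible reading of MDE as the mean of per-distance relative errors:

```python
# Toy illustration of the two reported accuracy metrics: mean absolute
# difference (MAD, mm) and mean dimensional error (MDE, %), reading MDE as
# the mean of per-distance relative errors. All distances are invented.
dry     = [10.2, 25.4, 8.7, 40.1]   # reference distances on the dry mandible (mm)
printed = [10.5, 25.1, 9.2, 39.6]   # same distances measured on the replica (mm)

diffs = [abs(a - b) for a, b in zip(dry, printed)]
mad = sum(diffs) / len(diffs)                                      # mm
mde = 100.0 * sum(d / r for d, r in zip(diffs, dry)) / len(diffs)  # %
```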
Molecular diagnostics for low resource settings
NASA Astrophysics Data System (ADS)
Weigl, Bernhard H.
2010-03-01
As traditional high-quality diagnostic laboratories are not widely available or affordable in developing-country health care settings, microfluidics-based point-of-care diagnostics may be able to address the need to perform complex assays in under-resourced areas. Many instrument-based as well as non-instrumented microfluidic prototype diagnostics are currently being developed. In addition to various engineering challenges, the greatest remaining issue is the search for truly low-cost disposable manufacturing methods. Diagnostics for global health, and specifically microfluidics and molecular-based low-resource diagnostics, have become a very active research area over the last five years, thanks in part to new funding that became available from the Bill and Melinda Gates Foundation, the National Institutes of Health, and other sources. This has led to a number of interesting prototype devices that are now in advanced development or clinical validation. These devices include disposables and instruments that perform multiplexed PCR-based lab-on-a-chip assays for enteric, febrile, and vaginal diseases, as well as immunoassays for diseases such as malaria, HIV, and various sexually transmitted diseases. More recently, instrument-free diagnostic disposables based on isothermal nucleic acid amplification have been developed as well. Regardless of platform, however, the search for truly low-cost manufacturing methods that would result in a cost of goods per disposable of around US$1/unit at volume remains a big challenge. This talk will give an overview of existing platform development efforts as well as present some original research in this area at PATH.
NASA Astrophysics Data System (ADS)
Ginanjar, Irlandia; Pasaribu, Udjianna S.; Indratno, Sapto W.
2017-03-01
This article presents the application of the principal component analysis (PCA) biplot for the needs of data mining. It aims to simplify and objectify the methods for clustering objects in a PCA biplot. The novelty of this paper is a measure that can be used to objectify the clustering of objects in a PCA biplot. Orthonormal eigenvectors are the coefficients of a principal component model, representing the association between the principal components and the initial variables. The existence of this association is valid grounds for clustering objects based on their principal-axis values; thus, if m principal axes are used in the PCA, the objects can be classified into 2^m clusters. Inter-city buses are clustered based on maintenance-cost data using a two-principal-axes PCA biplot. The buses are clustered into four groups. The first group is the buses with high maintenance costs, especially for lube and brake canvas. The second group is the buses with high maintenance costs, especially for tire and filter. The third group is the buses with low maintenance costs, especially for lube and brake canvas. The fourth group is buses with low maintenance costs, especially for tire and filter.
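The sign-based clustering can be sketched directly: project centred data onto m principal axes and bin each object by the signs of its scores, giving 2^m clusters. The matrix below is a random stand-in for the buses' maintenance-cost data:

```python
import numpy as np

# Sign-based clustering on m principal axes: each object falls into one of
# 2^m clusters according to the signs of its principal-axis scores. The
# matrix is a random stand-in for the maintenance-cost data.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 4))          # rows = buses, cols = cost items

Xc = X - X.mean(axis=0)               # centre before PCA
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
m = 2
scores = Xc @ Vt[:m].T                # scores on the first m principal axes

# Encode the sign pattern as a cluster label in {0, ..., 2^m - 1}.
labels = (scores > 0).astype(int) @ (2 ** np.arange(m))
```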
Fast and low-cost method for VBES bathymetry generation in coastal areas
NASA Astrophysics Data System (ADS)
Sánchez-Carnero, N.; Aceña, S.; Rodríguez-Pérez, D.; Couñago, E.; Fraile, P.; Freire, J.
2012-12-01
Sea floor topography is key information in coastal area management. Nowadays, LiDAR and multibeam technologies provide accurate bathymetries in those areas; however, these methodologies are still too expensive for small customers (fishermen associations, small research groups) wishing to maintain periodic surveillance of environmental resources. In this paper, we analyse a simple methodology for vertical beam echosounder (VBES) bathymetric data acquisition and postprocessing, using low-cost means and free customizable tools such as ECOSONS and gvSIG (which is compared with the industry-standard ArcGIS). Echosounder data were filtered, resampled, and interpolated (using kriging or radial basis functions). Moreover, the presented methodology includes two data correction processes: Monte Carlo simulation, used to reduce GPS errors, and manually applied bathymetric line transformations, both improving the obtained results. As an example, we present the bathymetry of the Ría de Cedeira (Galicia, NW Spain), a good testbed area for coastal bathymetry methodologies given its extension and rich topography. The statistical analysis, performed by direct ground-truthing, rendered an upper bound of 1.7 m error at the 95% confidence level and 0.7 m r.m.s. (cross-validation provided 30 cm and 25 cm, respectively). The methodology presented is fast and easy to implement, accurate outside transects (accuracy can be estimated), and can be used as a low-cost periodic monitoring method.
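Of the two interpolators mentioned, radial basis functions are the simpler to sketch. This minimal Gaussian-RBF interpolation uses synthetic sounding positions and depths, not the Ría de Cedeira data:

```python
import numpy as np

# Minimal Gaussian radial-basis-function interpolation, one of the two
# interpolators mentioned; sounding positions and depths are synthetic,
# not the Ría de Cedeira survey data.
gx, gy = np.meshgrid(np.linspace(0.0, 100.0, 6), np.linspace(0.0, 100.0, 6))
xy = np.column_stack([gx.ravel(), gy.ravel()])   # 36 sounding positions (m)
z = np.sin(xy[:, 0] / 30.0) - xy[:, 1] / 200.0   # synthetic depths (m)

def rbf_interpolate(nodes, values, query, eps=12.0):
    """Fit Gaussian RBF weights at the nodes, then evaluate at query points."""
    d = np.linalg.norm(nodes[:, None, :] - nodes[None, :, :], axis=-1)
    w = np.linalg.solve(np.exp(-((d / eps) ** 2)), values)
    dq = np.linalg.norm(query[:, None, :] - nodes[None, :, :], axis=-1)
    return np.exp(-((dq / eps) ** 2)) @ w
```

An interpolant of this form reproduces the soundings exactly at the nodes and smooths between them; kriging differs mainly in deriving the kernel from a fitted variogram.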
Semi-physical Simulation Platform of a Parafoil Nonlinear Dynamic System
NASA Astrophysics Data System (ADS)
Gao, Hai-Tao; Yang, Sheng-Bo; Zhu, Er-Lin; Sun, Qing-Lin; Chen, Zeng-Qiang; Kang, Xiao-Feng
2013-11-01
Focusing on the problems in the process of simulation and experiment on a parafoil nonlinear dynamic system, such as limited methods, high cost and low efficiency, we present a semi-physical simulation platform. It is designed by connecting parts of physical objects to a computer, remedying the limitation that a pure computer simulation is completely divorced from the real environment. The main components of the platform and its functions, as well as simulation flows, are introduced. The feasibility and validity are verified through a simulation experiment. The experimental results show that the platform is significant for improving the quality of the parafoil fixed-point airdrop system, shortening the development cycle and saving cost.
Assessing culture via the Internet: methods and techniques for psychological research.
Barry, D T
2001-02-01
This study examines the acculturation experiences of Arabic immigrants and assesses the utility of the Internet as a data collection tool. Based on in-depth pilot interview data from 10 male Arabic immigrants and items selected from pre-existing measures, the Male Arabic Ethnic Identity Measure (MAEIM) was developed. Male Arab immigrants (115 males) were solicited through traditional methods in addition to the Internet. Satisfactory reliability and validity were reported for the MAEIM. No significant differences emerged between the Internet and Midwestern samples. The Internet proved to be an effective method for soliciting a relatively large, geographically dispersed sample of Arabic immigrants. The use of the Internet as a research tool is examined in the context of anonymity, networking, low-cost, perceived interactive control, methodological rigor, and external validity. The Internet was an effective vehicle for addressing concerns raised by prospective participants. It is suggested that the Internet may be an important method to assess culture-relevant variables in further research on Arab and other immigrant populations.
Huang, Qi; Yang, Dapeng; Jiang, Li; Zhang, Huajie; Liu, Hong; Kotani, Kiyoshi
2017-01-01
Performance degradation will be caused by a variety of interfering factors for pattern recognition-based myoelectric control methods in the long term. This paper proposes an adaptive learning method with low computational cost to mitigate this effect in unsupervised adaptive learning scenarios. We present a particle adaptive classifier (PAC), constructed from a particle adaptive learning strategy and a universal incremental least-squares support vector classifier (LS-SVC). We compared PAC performance with an incremental support vector classifier (ISVC) and a non-adapting SVC (NSVC) in a long-term pattern recognition task in both unsupervised and supervised adaptive learning scenarios. Retraining time cost and recognition accuracy were compared by validating the classification performance on both simulated and realistic long-term EMG data. The classification results on realistic long-term EMG data showed that PAC significantly decreased the performance degradation in unsupervised adaptive learning scenarios compared with NSVC (9.03% ± 2.23%, p < 0.05) and ISVC (13.38% ± 2.62%, p = 0.001), and reduced the retraining time cost compared with ISVC (2 ms per updating cycle vs. 50 ms per updating cycle). PMID:28608824
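The PAC itself is more elaborate, but the kind of low-cost incremental update that makes per-cycle retraining cheap can be illustrated with a rank-one (Sherman-Morrison) least-squares update, shown here on synthetic data rather than EMG recordings:

```python
import numpy as np

# Rank-one (Sherman-Morrison) update of a regularized least-squares
# classifier: the kind of O(d^2) incremental step that keeps per-cycle
# retraining cheap. Data and labels are synthetic, not EMG recordings.
rng = np.random.default_rng(2)
X = rng.normal(size=(100, 8))
y = np.sign(X @ rng.normal(size=8))

lam = 1e-2
P = np.linalg.inv(X.T @ X + lam * np.eye(8))   # inverse regularized Gram matrix
w = P @ X.T @ y                                # batch least-squares weights

# A new labelled sample arrives: update P and w without full retraining.
x_new = rng.normal(size=8)
y_new = 1.0
Px = P @ x_new
P -= np.outer(Px, Px) / (1.0 + x_new @ Px)     # Sherman-Morrison update of P
w = w + P @ x_new * (y_new - x_new @ w)        # gain times prediction error
```

The update reproduces the batch solution exactly, which is what makes a 2 ms updating cycle plausible for small feature dimensions.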
Auvinet, E; Multon, F; Manning, V; Meunier, J; Cobb, J P
2017-01-01
Gait asymmetry information is a key point in disease screening and follow-up. Continuous Relative Phase (CRP) has been used to quantify a within-stride asymmetry index, but it requires noise-free and accurate motion capture, which is difficult to obtain in clinical settings. This study explores a new index, the Longitudinal Asymmetry Index (ILong), which is derived using data from a low-cost depth camera (Kinect). ILong is based on depth images averaged over several gait cycles, rather than on derived joint positions or angles. This study aims to evaluate (1) the validity of CRP computed with Kinect, (2) the validity and sensitivity of ILong for measuring gait asymmetry based solely on data provided by a depth camera, (3) the clinical applicability of a posteriorly mounted camera system to avoid occlusion caused by standard front-fitted treadmill consoles, and (4) the number of strides needed to reliably calculate ILong. The gait of 15 subjects was recorded concurrently with a marker-based system (MBS) and Kinect, and asymmetry was artificially reproduced by attaching a 5 cm sole to one foot. CRP computed with Kinect was not reliable. ILong detected this disturbed gait reliably and could be computed from a posteriorly placed Kinect without loss of validity. A minimum of five strides was needed to achieve a correlation coefficient of 0.9 between the standard MBS and the low-cost depth camera based ILong. ILong provides a clinically pragmatic method for measuring gait asymmetry, with application to improved patient care through enhanced disease screening, diagnosis and monitoring.
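Although the published index is computed from Kinect depth images averaged over whole gait cycles, the flavour of a depth-image asymmetry index can be sketched in a few lines. Everything below (array shapes, the left/right mirror comparison) is an assumption for illustration, not the authors' implementation:

```python
import numpy as np

def longitudinal_asymmetry(depth_frames):
    """Toy asymmetry index: average the depth maps over many frames, then
    compare the left half of the averaged image with the mirrored right
    half. (Sketch only; the published ILong is built from Kinect depth
    images averaged over whole gait cycles.)"""
    mean_img = np.mean(depth_frames, axis=0)
    w = mean_img.shape[1]
    left = mean_img[:, : w // 2]
    right = np.fliplr(mean_img[:, w - w // 2:])
    return float(np.abs(left - right).mean())

# Symmetric frames give 0; a one-sided offset (think of the 5 cm sole used
# in the study) raises the index.
sym = np.ones((10, 8, 8))
asym = sym.copy()
asym[:, :, :4] += 0.05   # one side systematically closer to the camera
print(longitudinal_asymmetry(sym))       # 0.0
print(longitudinal_asymmetry(asym) > 0)  # True
```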
Cancer Detection, Diagnosis, and Treatment Technologies for Global Health: Supporting the development
The NCI Center for Global Health supports the development and validation of low-cost, portable technologies that can improve cancer detection, diagnosis, and treatment in low- and middle-income countries.
Wells, Dagan; Kaur, Kulvinder; Grifo, Jamie; Glassner, Michael; Taylor, Jenny C; Fragouli, Elpida; Munne, Santiago
2014-08-01
The majority of human embryos created using in vitro fertilisation (IVF) techniques are aneuploid. Comprehensive chromosome screening methods, applicable to single cells biopsied from preimplantation embryos, allow reliable identification and transfer of euploid embryos. Recently, randomised trials using such methods have indicated that aneuploidy screening improves IVF success rates. However, the high cost of testing has restricted the availability of this potentially beneficial strategy. This study aimed to harness next-generation sequencing (NGS) technology with the intention of lowering the costs of preimplantation aneuploidy screening. Embryo biopsy, whole genome amplification and semiconductor sequencing were used. A rapid (<15 h) NGS protocol was developed, with a consumable cost only two-thirds that of the most widely used method for embryo aneuploidy detection. Validation involved blinded analysis of 54 cells from cell lines or biopsies from human embryos. Sensitivity and specificity were 100%. The method was applied clinically, assisting in the selection of euploid embryos in two IVF cycles, producing healthy children in both cases. The NGS approach was also able to reveal specified mutations in the nuclear or mitochondrial genomes in parallel with chromosome assessment. Interestingly, elevated mitochondrial DNA content was associated with aneuploidy (p<0.05), a finding suggestive of a link between mitochondria and chromosomal malsegregation. This study demonstrates that NGS provides highly accurate, low-cost diagnosis of aneuploidy in cells from human preimplantation embryos and is rapid enough to allow testing without embryo cryopreservation. The method described also has the potential to shed light on other aspects of embryo genetics of relevance to health and viability.
Sample size determination for disease prevalence studies with partially validated data.
Qiu, Shi-Fang; Poon, Wai-Yin; Tang, Man-Lai
2016-02-01
Disease prevalence is an important topic in medical research, and its study is based on data that are obtained by classifying subjects according to whether a disease has been contracted. Classification can be conducted with high-cost gold-standard tests or low-cost screening tests, but the latter are subject to the misclassification of subjects. As a compromise between the two, many research studies use partially validated datasets in which all data points are classified by fallible tests, and some of the data points are validated in the sense that they are also classified by the completely accurate gold-standard test. In this article, we investigate the determination of sample sizes for disease prevalence studies with partially validated data, using two approaches. The first is to find sample sizes that achieve a pre-specified power of a statistical test at a chosen significance level, and the second is to find sample sizes that control the width of a confidence interval with a pre-specified confidence level. Empirical studies have been conducted to demonstrate the performance of various testing procedures with the proposed sample sizes. The applicability of the proposed methods is illustrated by a real-data example.
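For intuition, in the simple case of a single accurate classifier and simple random sampling, the second approach (controlling confidence-interval width) reduces to a textbook calculation. The sketch below assumes a Wald interval; it does not implement the paper's partially validated design:

```python
import math

def sample_size_ci_width(p, half_width, z=1.96):
    """Smallest n for which a Wald confidence interval for prevalence p
    has half-width at most `half_width` (simple random sampling, no
    misclassification; z = 1.96 gives a 95% interval)."""
    return math.ceil(z ** 2 * p * (1.0 - p) / half_width ** 2)

# Worst-case p = 0.5, interval of +/- 5 percentage points at 95% confidence:
print(sample_size_ci_width(0.5, 0.05))  # 385
```

The partially validated setting replaces the binomial variance term with one that accounts for misclassification rates and the validation fraction, which is what drives the paper's sample-size procedures.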
From functional structure to packaging: full-printing fabrication of a microfluidic chip.
Zheng, Fengyi; Pu, Zhihua; He, Enqi; Huang, Jiasheng; Yu, Bocheng; Li, Dachao; Li, Zhihong
2018-05-24
This paper presents a full-printing methodology aimed at convenient and fast fabrication of microfluidic devices. For the first time, we achieved a microfluidic biochemical sensor with all functional structures fabricated by inkjet printing, including electrodes, immobilized enzymes, microfluidic components and packaging. With its cost-effective and rapid process, this method makes quick model validation of a novel lab-on-chip system possible. In this study, a three-electrode electrochemical system was successfully integrated with glucose oxidase immobilization gel and sealed in an ice channel, forming a disposable microfluidic sensor for glucose detection. This fully printed chip was characterized and showed good sensitivity and a linear response at low glucose concentrations (0-10 mM). With the aid of automatic equipment, the fully printed sensor can be mass-produced at low cost.
Costing evidence for health care decision-making in Austria: A systematic review.
Mayer, Susanne; Kiss, Noemi; Łaszewska, Agata; Simon, Judit
2017-01-01
With rising healthcare costs comes an increasing demand for evidence-informed resource allocation using economic evaluations worldwide. Furthermore, standardization of costing and reporting methods at both international and national levels is imperative to make economic evaluations a valid tool for decision-making. The aim of this review is to assess the availability and consistency of costing evidence that could be used for decision-making in Austria. It systematically describes the current landscape of economic evaluation and costing studies, focusing on the applied costing methods and their reporting standards. Findings are discussed in terms of their likely impact on evidence-based decision-making and potential areas for development. A systematic literature review of English- and German-language peer-reviewed and grey literature (2004-2015) was conducted to identify Austrian economic analyses. The databases MEDLINE, EMBASE, SSCI, EconLit, NHS EED and Scopus were searched. Publication and study characteristics, costing methods, reporting standards and valuation sources were systematically synthesised and assessed. A total of 93 studies were included; 87% were journal articles and 13% were reports. 41% of all studies were full economic evaluations, mostly cost-effectiveness analyses. Based on relevant standards, the most commonly observed limitations were that 60% of the studies did not clearly state an analytical perspective, 25% did not provide the year of costing, 27% did not comprehensively list all valuation sources, and 38% did not report all applied unit costs. There are substantial inconsistencies in the costing methods and reporting standards of economic analyses in Austria, which may contribute to low acceptance of, and lack of interest in, economic evaluation-informed decision-making.
To improve comparability and quality of future studies, national costing guidelines should be updated with more specific methodological guidance and a national reference cost library should be set up to allow harmonisation of valuation methods.
Esmaeilzadeh, Sara; Valizadeh, Hadi; Zakeri-Milani, Parvin
2016-01-01
Purpose: The main goal of this study was the development of a reverse-phase high performance liquid chromatography (RP-HPLC) method for flutamide quantitation which is applicable to protein binding studies. Methods: The ultrafiltration method was used for the protein binding study of flutamide. For sample analysis, flutamide was extracted by a simple and low-cost extraction method using diethyl ether and then determined by HPLC/UV. Acetanilide was used as an internal standard. The chromatographic system consisted of a reversed-phase C8 column with a C8 pre-column, and a mobile phase of 29% (v/v) methanol, 38% (v/v) acetonitrile and 33% (v/v) potassium dihydrogen phosphate buffer (50 mM) with pH adjusted to 3.2. Results: Acetanilide and flutamide were eluted at 1.8 and 2.9 min, respectively. The linearity of the method was confirmed in the range of 62.5-16000 ng/ml (r2 > 0.99). The limit of quantification was shown to be 62.5 ng/ml. Precision and accuracy ranges were found to be (0.2-1.4%, 90-105%) and (0.2-5.3%, 86.7-98.5%), respectively. Acetanilide and flutamide capacity factor values of 1.35 and 2.87, tailing factor values of 1.24 and 1.07, and resolution values of 1.8 and 3.22 were obtained, in accordance with ICH guidelines. Conclusion: Based on the obtained results, a rapid, precise, accurate, sensitive and cost-effective analysis procedure was proposed for the quantitative determination of flutamide. PMID:27478788
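The figures of merit quoted (capacity factor, resolution) follow standard chromatographic definitions and can be recomputed directly. The void time and peak widths below are invented illustrative values, since the abstract does not report them:

```python
def capacity_factor(t_r, t_0):
    """Retention (capacity) factor k' = (tR - t0) / t0."""
    return (t_r - t_0) / t_0

def resolution(t_r1, t_r2, w1, w2):
    """Baseline resolution Rs = 2 * (tR2 - tR1) / (w1 + w2)."""
    return 2.0 * (t_r2 - t_r1) / (w1 + w2)

# Retention times from the abstract (1.8 and 2.9 min); the void time t0 and
# the peak widths are made-up values for the sake of the example.
print(round(capacity_factor(2.9, 1.0), 3))       # 1.9
print(round(resolution(1.8, 2.9, 0.5, 0.6), 3))  # 2.0
```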
Elshout, Mari; Webers, Carroll A B; van der Reis, Margriet I; Schouten, Jan S A G
2018-06-04
Ophthalmologists increasingly depend on new drugs to advance their treatment options. These options are limited by restraints on reimbursements for new and expensive drugs, put in place through health policy decisions based on cost-effectiveness analyses (CEAs). Cost-effectiveness analyses need to be valid and of good quality to support correct decisions and create new treatment opportunities. In this study, we report the quality, validity and usefulness of CEAs for therapies for neovascular age-related macular degeneration (nAMD). A systematic review in PubMed, EMBASE and Cochrane was performed to include CEAs. Quality and validity assessment was based on current general quality criteria and on elements that are specific to the field of ophthalmology. Forty-eight CEAs were included in the review. Forty-four CEAs did not meet four basic model quality and validity criteria specific to CEAs in the field of ophthalmology (both eyes analysed instead of one; a time horizon extending beyond 4 years; extrapolating VA and treatment intervals beyond trial data realistically; and including the costs of low vision). Four CEAs aligned with the quality and validity criteria. In two of these CEAs, bevacizumab as-needed (PRN) was more cost-effective than bevacizumab monthly, aflibercept (VIEW), or ranibizumab monthly or PRN. In two CEAs, ranibizumab (PRN or treat and extend) was dominant over aflibercept. In two other CEAs, aflibercept was either more cost-effective than or dominant over ranibizumab monthly or PRN. Two of the CEAs of sufficient quality and validity show that bevacizumab PRN is the most cost-effective treatment. Comparing ranibizumab and aflibercept, either treatment can be more cost-effective depending on the assumptions used for drug prices and treatment frequencies. The majority of the published CEAs are of insufficient quality and validity. They wrongly inform decision-makers at the cost of opportunities for ophthalmologists to treat patients.
As such, they may negatively influence overall patient outcomes and societal costs. For future ophthalmic treatments, CEAs need to be improved and published only when they are of sufficient quality and validity.
Microfluidic diagnostics for low-resource settings
NASA Astrophysics Data System (ADS)
Hawkins, Kenneth R.; Weigl, Bernhard H.
2010-02-01
Diagnostics for low-resource settings need foremost to be inexpensive, but also accurate, reliable, rugged and suited to the contexts of the developing world. Diagnostics for global health based on minimally instrumented, microfluidics-based platforms employing low-cost disposables have become a very active research area recently, thanks in part to new funding from the Bill & Melinda Gates Foundation, the National Institutes of Health, and other sources. This has led to a number of interesting prototype devices that are now in advanced development or clinical validation. These devices include disposables and instruments that perform multiplexed PCR-based assays for enteric, febrile, and vaginal diseases, as well as immunoassays for diseases such as malaria, HIV, and various sexually transmitted diseases. More recently, instrument-free diagnostic disposables based on isothermal nucleic-acid amplification have been developed. Regardless of platform, however, the search for truly low-cost manufacturing methods that would enable affordable systems (at volume, in the appropriate context) remains a significant challenge. Here we give an overview of existing platform development efforts, present some original research in this area at PATH, and reiterate a call to action for more.
ERIC Educational Resources Information Center
Paula, Cristiane S.; Cunha, Graccielle Rodrigues; Bordini, Daniela; Brunoni, Decio; Moya, Ana Claudia; Bosa, Cleonice Alves; Mari, Jair J.; Cogo-Moreira, Hugo
2018-01-01
Simple and low-cost observational tools to detect symptoms of Autism Spectrum Disorder (ASD) are still necessary. The OERA is a new assessment tool that screens children by eliciting observable behaviors, with no substantial knowledge of ASD required. The sample was 99 children aged 3-10: 76 with ASD and 23 without ASD (11/23 had intellectual…
Validation of a unique concept for a low-cost, lightweight space-deployable antenna structure
NASA Technical Reports Server (NTRS)
Freeland, R. E.; Bilyeu, G. D.; Veal, G. R.
1993-01-01
An experiment conducted in the framework of a NASA In-Space Technology Experiments Program, based on a concept of inflatable deployable structures, is described. The concept utilizes very low inflation pressure to maintain the required geometry on orbit, and gravity-induced deflection of the structure precludes any meaningful ground-based demonstration of functional performance. The experiment is aimed at validating and characterizing the mechanical functional performance of a 14-m-diameter inflatable deployable reflector antenna structure in the orbital operational environment. Results of the experiment are expected to significantly reduce the user risk associated with large space-deployable antennas by demonstrating the functional performance of a concept that meets the criteria for low-cost, lightweight, and highly reliable space-deployable structures.
Polarization digital holographic microscopy using low-cost liquid crystal polarization rotators
NASA Astrophysics Data System (ADS)
Dovhaliuk, Rostyslav Yu
2018-02-01
Polarization imaging methods are actively used to study anisotropic objects. A number of methods and systems, such as imaging polarimeters, have been proposed to measure the state of polarization of light that has passed through an object. Digital holographic and interferometric approaches can be used to quantitatively measure both the amplitude and phase of a wavefront. Using polarization modulation optics, the measurement capabilities of such interference-based systems can be extended to polarization-dependent parameters, such as phase retardation. Different kinds of polarization rotators can be used to alternate the polarization of the reference beam. Liquid crystals are used in a rapidly increasing number of optoelectronic devices. Twisted-nematic liquid crystals are widely used as amplitude modulators in electronic displays, light valves and shutter glasses. Such devices are of particular interest for polarization imaging, as they can be used as polarization rotators and, owing to large-scale manufacturing, have relatively low cost. A simple Mach-Zehnder polarization holographic setup that uses modified shutter glasses as a polarization rotator is demonstrated. The suggested approach is experimentally validated by measuring the retardation of a quarter-wave film.
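In Jones-calculus terms, the liquid-crystal cell rotates the reference-beam polarization, and the sample's retardation appears as a phase lag between the two field components. A minimal numerical sketch with generic Jones matrices (not the author's optical setup):

```python
import numpy as np

def rotator(theta):
    """Jones matrix of an ideal polarization rotator (the role played here
    by a repurposed shutter-glass liquid-crystal cell)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def retarder(delta):
    """Jones matrix of a linear retarder, fast axis horizontal."""
    return np.array([[1.0, 0.0], [0.0, np.exp(1j * delta)]])

# Reference beam rotated to 45 degrees, then passed through a quarter-wave
# sample; the retardation is the phase lag between the field components.
e_in = rotator(np.pi / 4) @ np.array([1.0, 0.0])
e_out = retarder(np.pi / 2) @ e_in
delta = np.angle(e_out[1]) - np.angle(e_out[0])
print(round(delta, 6))  # 1.570796 (i.e. pi/2)
```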
Walker, Damian
2003-03-01
Many donors and countries are striving to respond to the HIV/AIDS epidemic by implementing prevention programmes. However, the resources available for these activities are limited relative to needs. Hence, decision-makers must choose among various types of interventions. Cost information, both measures of cost and of cost-effectiveness, serves as a critical input into the processes of setting priorities and allocating resources efficiently. This paper reviews the cost and cost-effectiveness evidence base of HIV/AIDS prevention programmes in low- and middle-income countries (LMICs). None of the studies found has complete cost data for a full range of HIV/AIDS prevention programmes in any one country. However, the range of studies highlights the relative emphasis of different types of HIV/AIDS prevention strategies by region, reflecting the various modes of transmission and hence, to a certain extent, the stage of the epidemic. The costing methods applied and results obtained in this review give rise to questions of reliability, validity and transparency. First, not all of the studies report the methods used to calculate the costs, and/or do not provide all the necessary data inputs such that recalculation of the results is possible. Secondly, the methods that are documented vary widely, rendering different studies, even within the same country and programme setting, largely incomparable. Finally, even with consistent and replicable measurement, the results as presented are generally not comparable because of the lack of a common outcome measure. Therefore, the extent to which the available cost and cost-effectiveness evidence base on HIV/AIDS prevention strategies can provide guidance to decision-makers is limited, and there is an urgent need for the generation of this knowledge for planning and decision-making.
Two-stage atlas subset selection in multi-atlas based image segmentation.
Zhao, Tingting; Ruan, Dan
2015-06-01
Fast-growing access to large databases and cloud-stored data presents a unique opportunity for multi-atlas based image segmentation, but also presents challenges in heterogeneous atlas quality and computation burden. This work aims to develop a novel two-stage method tailored to the special needs that arise in the face of a large atlas collection of varied quality, so that high-accuracy segmentation can be achieved at low computational cost. An atlas subset selection scheme is proposed to substitute a low-cost alternative for a significant portion of the computationally expensive full-fledged registration in the conventional scheme. More specifically, the authors introduce a two-stage atlas subset selection method. In the first stage, an augmented subset is obtained based on a low-cost registration configuration and a preliminary relevance metric; in the second stage, the subset is further narrowed down to a fusion set of desired size, based on full-fledged registration and a refined relevance metric. An inference model is developed to characterize the relationship between the preliminary and refined relevance metrics, and a proper augmented subset size is derived to ensure that the desired atlases survive the preliminary selection with high probability. The performance of the proposed scheme has been assessed with cross-validation based on two clinical datasets consisting of manually segmented prostate and brain magnetic resonance images, respectively. The proposed scheme demonstrates end-to-end segmentation performance comparable to the conventional single-stage selection method, but with a significant computation reduction. Compared with the alternative computation reduction method, their scheme improves the mean and median Dice similarity coefficient values from (0.74, 0.78) to (0.83, 0.85) and from (0.82, 0.84) to (0.95, 0.95) for prostate and corpus callosum segmentation, respectively, with statistical significance.
The authors have developed a novel two-stage atlas subset selection scheme for multi-atlas based segmentation. It achieves good segmentation accuracy with significantly reduced computation cost, making it a suitable configuration in the presence of extensive heterogeneous atlases.
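The Dice similarity coefficient used to score segmentation overlap is straightforward to compute from two binary masks (a generic sketch, not the authors' code):

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient 2|A∩B| / (|A| + |B|) for binary masks."""
    a = np.asarray(a, bool)
    b = np.asarray(b, bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom

# Hypothetical 2 x 3 segmentations (not the paper's data):
auto   = np.array([[0, 1, 1], [0, 1, 0]])
manual = np.array([[0, 1, 0], [0, 1, 1]])
print(round(dice(auto, manual), 3))  # 0.667
```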
Measurement Techniques and Instruments Suitable for Life-prediction Testing of Photovoltaic Arrays
NASA Technical Reports Server (NTRS)
Noel, G. T.; Wood, V. E.; Mcginniss, V. D.; Hassell, J. A.; Richard, N. A.; Gaines, G. B.; Carmichael, D. C.
1979-01-01
The validation of a 20-year service life for low-cost photovoltaic arrays is a critical requirement of the Low-Cost Solar Array (LSA) Project. The validation is accomplished through accelerated life-prediction tests. A two-phase study was conducted to address the needs that must be met before such tests are carried out. The results and recommended techniques from the Phase 1 investigation are summarized in the appendix. Phase 2 of the study, covered in this report, consisted of experimental evaluations of three techniques selected from those recommended as a result of the Phase 1 findings. The three techniques evaluated were specular and nonspecular optical reflectometry, chemiluminescence measurements, and electric current noise measurements.
Electromagnetic field tapering using all-dielectric gradient index materials.
Yi, Jianjia; Piau, Gérard-Pascal; de Lustrac, André; Burokur, Shah Nawaz
2016-07-28
The concept of transformation optics (TO) is applied to control the flow of electromagnetic fields between two sections of different dimensions through a tapering device. The broadband performance of the field taper is numerically and experimentally validated. The taper device presents a graded permittivity profile and is fabricated through three-dimensional (3D) polyjet printing technology using low-cost all-dielectric materials. Calculated and measured near-field mappings are presented in order to validate the proposed taper. A good qualitative agreement is obtained between full-wave simulations and experimental tests. Such all-dielectric taper paves the way to novel types of microwave devices that can be easily fabricated through low-cost additive manufacturing processes.
NASA Technical Reports Server (NTRS)
Lewis, Pattie
2007-01-01
Stennis Space Center (SSC), Kennedy Space Center (KSC) and Air Force Space Command (AFSPC) identified particulate emissions and waste generated from the depainting process of steel structures as hazardous materials to be eliminated or reduced. A Potential Alternatives Report, Potential Alternatives Report for Validation of Alternative Low Emission Surface Preparation/Depainting Technologies for Structural Steel, provided technical analyses of identified alternatives to the current coating removal processes, the criteria used to select alternatives for further analysis, and a list of those alternatives recommended for testing. The initial coating removal alternatives list was compiled using literature searches and stakeholder recommendations. The involved project participants initially considered approximately 13 alternatives. In late 2003, core project members selected the following depainting processes for further evaluation: (1) Plastic Blast Media (Quickstrip(R)-A); (2) Hard Abrasive (Steel-Magic(R)); (3) Sponge Blasting (Sponge-Jet(R)); (4) Liquid Nitrogen (NitroJet(R)); (5) Mechanical Removal with Vacuum Attachment (DESCO and OCM Clean-Air); (6) Laser Coating Removal. Alternatives were tested in accordance with the Joint Test Protocol for Validation of Alternative Low-Emission Surface Preparation/Depainting Technologies for Structural Steel and the Field Evaluation Test Plan for Validation of Alternative Low-Emission Surface Preparation/Depainting Technologies for Structural Steel. Results of the testing are documented in the Joint Test Report. This Cost-Benefit Analysis (CBA) focuses on the three alternatives (Quickstrip(R)-A, Steel-Magic(R), and Sponge-Jet(R)) that were considered viable for large-area operations based on the results of the field demonstration and lab testing. This CBA was created to help participants determine whether implementation of the candidate alternatives is economically justified.
Each of the alternatives examined reduced Environmental Activity (EA) Costs, i.e. the costs associated with complying with environmental regulations. One alternative, Steel-Magic(R), also showed reduced Direct Costs and reduced total costs.
Design for low-power and reliable flexible electronics
NASA Astrophysics Data System (ADS)
Huang, Tsung-Ching (Jim)
Flexible electronics are emerging as an alternative to conventional Si electronics for large-area low-cost applications such as e-paper, smart sensors, and disposable RFID tags. By utilizing inexpensive manufacturing methods such as ink-jet printing and roll-to-roll imprinting, flexible electronics can be made on low-cost plastics just like printing a newspaper. However, the key elements of flexible electronics, thin-film transistors (TFTs), have slower operating speeds and less reliability than their Si counterparts. Furthermore, depending on the material properties, TFTs are usually mono-type devices, either p- or n-type. Making air-stable complementary TFT circuits is very challenging and not applicable to most TFT technologies. Existing design methodologies for Si electronics, therefore, cannot be directly applied to flexible electronics. Other inhibiting factors such as high supply voltage, large process variation, and lack of trustworthy device modeling also make designing larger-scale and robust TFT circuits a significant challenge. The major goal of this dissertation is to provide a viable solution for robust circuit design in flexible electronics. I will first introduce a reliability simulation framework that can predict degraded TFT circuit performance under bias stress. This framework has been validated using the amorphous-silicon (a-Si) TFT scan driver for TFT-LCD displays. To reuse the existing CMOS design flow for flexible electronics, I propose a Pseudo-CMOS cell library that makes TFT circuits operable under low supply voltage and offers post-fabrication tunability for reliability and performance enhancement. This cell library has been validated using 2V self-assembled-monolayer (SAM) organic TFTs with a low-cost shadow-mask deposition process.
I will also demonstrate a 3-bit 1.25KS/s Flash ADC in a-Si TFTs, which is based on the proposed Pseudo-CMOS cell library, and explore more possibilities in display, energy, and sensing applications.
A generic, cost-effective, and scalable cell lineage analysis platform
Biezuner, Tamir; Spiro, Adam; Raz, Ofir; Amir, Shiran; Milo, Lilach; Adar, Rivka; Chapal-Ilani, Noa; Berman, Veronika; Fried, Yael; Ainbinder, Elena; Cohen, Galit; Barr, Haim M.; Halaban, Ruth; Shapiro, Ehud
2016-01-01
Advances in single-cell genomics enable commensurate improvements in methods for uncovering lineage relations among individual cells. Current sequencing-based methods for cell lineage analysis depend on low-resolution bulk analysis or rely on extensive single-cell sequencing, which is not scalable and could be biased by functional dependencies. Here we show an integrated biochemical-computational platform for generic single-cell lineage analysis that is retrospective, cost-effective, and scalable. It consists of a biochemical-computational pipeline that inputs individual cells, produces targeted single-cell sequencing data, and uses it to generate a lineage tree of the input cells. We validated the platform by applying it to cells sampled from an ex vivo grown tree and analyzed its feasibility landscape by computer simulations. We conclude that the platform may serve as a generic tool for lineage analysis and thus pave the way toward large-scale human cell lineage discovery. PMID:27558250
Li, Jianwei; Zhang, Weimin; Zeng, Weiqin; Chen, Guolong; Qiu, Zhongchao; Cao, Xinyuan; Gao, Xuanyi
2017-01-01
Estimation of the stress distribution in ferromagnetic components is very important for evaluating the working status of mechanical equipment and implementing preventive maintenance. Eddy current testing technology is a promising method in this field because of its advantages of safety, no need for a coupling agent, etc. In order to reduce the cost of an eddy current stress measurement system and obtain the stress distribution in ferromagnetic materials without scanning, a low-cost eddy current stress measurement system based on an Archimedes spiral planar coil was established, and a method based on a BP neural network was proposed to obtain the stress distribution from the stress at several discrete test points. To verify the performance of the developed test system and the validity of the proposed method, an experiment was implemented using structural steel (Q235) specimens. Standard curves of the sensors at each test point were acquired, the calibrated data were used to establish the BP neural network model approximating the stress variation on the specimen surface, and the stress distribution curve of the specimen was obtained by interpolating with the established model. The results show that there is a good linear relationship between the change in signal modulus and the stress over most of the elastic range of the specimen, and the established system can detect the change in stress with a theoretical average sensitivity of -0.4228 mV/MPa. The obtained stress distribution curve agrees well with the theoretical analysis. Finally, possible causes of, and remedies for, problems observed in the results are discussed. This research has important significance for reducing the cost of eddy current stress measurement systems and advancing the engineering application of eddy current stress testing.
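The measurement chain described (a linear standard curve per test point, then interpolation of stress between discrete points) can be sketched as follows. The calibration numbers are invented for illustration, and np.interp stands in for the paper's BP neural network:

```python
import numpy as np

# Hypothetical calibration data: change in coil signal modulus (mV) measured
# at a few stress levels (MPa). The fitted slope plays the role of the
# sensitivity (the paper reports about -0.4228 mV/MPa; these numbers are
# invented for illustration).
stress = np.array([0.0, 50.0, 100.0, 150.0, 200.0])
signal = np.array([0.0, -21.0, -42.5, -63.0, -85.0])

slope, intercept = np.polyfit(stress, signal, 1)  # linear standard curve
print(round(slope, 3))                            # -0.424 (mV/MPa)

# Invert the curve to estimate stress at a few discrete test points, then
# interpolate the distribution between them (np.interp stands in for the
# paper's BP neural network model).
test_x = np.array([0.0, 0.25, 0.5, 0.75, 1.0])    # normalised positions
test_signal = np.array([0.0, -10.0, -30.0, -10.0, 0.0])
test_stress = (test_signal - intercept) / slope
profile = np.interp(np.linspace(0.0, 1.0, 9), test_x, test_stress)
print(profile.shape)                              # (9,)
```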
Li, Jianwei; Zeng, Weiqin; Chen, Guolong; Qiu, Zhongchao; Cao, Xinyuan; Gao, Xuanyi
2017-01-01
Estimation of the stress distribution in ferromagnetic components is very important for evaluating the working status of mechanical equipment and implementing preventive maintenance. Eddy current testing technology is a promising method in this field because of its advantages of safety, no need of coupling agent, etc. In order to reduce the cost of eddy current stress measurement system, and obtain the stress distribution in ferromagnetic materials without scanning, a low cost eddy current stress measurement system based on Archimedes spiral planar coil was established, and a method based on BP neural network to obtain the stress distribution using the stress of several discrete test points was proposed. To verify the performance of the developed test system and the validity of the proposed method, experiment was implemented using structural steel (Q235) specimens. Standard curves of sensors at each test point were achieved, the calibrated data were used to establish the BP neural network model for approximating the stress variation on the specimen surface, and the stress distribution curve of the specimen was obtained by interpolating with the established model. The results show that there is a good linear relationship between the change of signal modulus and the stress in most elastic range of the specimen, and the established system can detect the change in stress with a theoretical average sensitivity of -0.4228 mV/MPa. The obtained stress distribution curve is well consonant with the theoretical analysis result. At last, possible causes and improving methods of problems appeared in the results were discussed. This research has important significance for reducing the cost of eddy current stress measurement system, and advancing the engineering application of eddy current stress testing. PMID:29145500
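The signal-to-stress conversion described above is linear over most of the elastic range, so the two-step procedure (calibration, then interpolation between discrete test points) can be written as a minimal sketch. The numbers below are hypothetical, and plain linear interpolation stands in for the paper's BP neural network:

```python
# Step 1: convert the eddy-current signal-modulus change at each discrete
# test point to stress via the linear calibration (the abstract reports an
# average sensitivity of -0.4228 mV/MPa in the elastic range).
# Step 2: interpolate between test points to approximate the stress
# distribution (linear interpolation as a stand-in for the BP network).

SENSITIVITY_MV_PER_MPA = -0.4228  # from the abstract

def stress_from_signal(delta_modulus_mv):
    """Invert the linear calibration: returns stress in MPa."""
    return delta_modulus_mv / SENSITIVITY_MV_PER_MPA

def interpolate_stress(positions_mm, stresses_mpa, x_mm):
    """Piecewise-linear stress estimate at position x_mm."""
    for (x0, s0), (x1, s1) in zip(zip(positions_mm, stresses_mpa),
                                  zip(positions_mm[1:], stresses_mpa[1:])):
        if x0 <= x_mm <= x1:
            t = (x_mm - x0) / (x1 - x0)
            return s0 + t * (s1 - s0)
    raise ValueError("x_mm outside measured range")

# Hypothetical test points along a Q235 specimen
positions = [0.0, 10.0, 20.0, 30.0]
signals_mv = [-8.456, -21.14, -42.28, -21.14]  # measured modulus changes
stresses = [stress_from_signal(v) for v in signals_mv]
print(stresses[2])                                   # 100 MPa at 20 mm
print(interpolate_stress(positions, stresses, 15.0))  # between test points
```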
Esmaeilzadeh, Sara; Valizadeh, Hadi; Zakeri-Milani, Parvin
2016-06-01
The main goal of this study was the development of a reversed-phase high-performance liquid chromatography (RP-HPLC) method for flutamide quantitation that is applicable to protein-binding studies. Ultrafiltration was used for the protein-binding study of flutamide. For sample analysis, flutamide was extracted by a simple, low-cost extraction method using diethyl ether and then determined by HPLC/UV. Acetanilide was used as an internal standard. The chromatographic system consisted of a reversed-phase C8 column with a C8 pre-column, and a mobile phase of 29% (v/v) methanol, 38% (v/v) acetonitrile and 33% (v/v) potassium dihydrogen phosphate buffer (50 mM) adjusted to pH 3.2. Acetanilide and flutamide eluted at 1.8 and 2.9 min, respectively. The linearity of the method was confirmed in the range of 62.5-16000 ng/ml (r² > 0.99). The limit of quantification was 62.5 ng/ml. Precision and accuracy were in the ranges of (0.2-1.4%, 90-105%) and (0.2-5.3%, 86.7-98.5%), respectively. Acetanilide and flutamide capacity factors of 1.35 and 2.87, tailing factors of 1.24 and 1.07, and resolution values of 1.8 and 3.22 were obtained, in accordance with ICH guidelines. Based on these results, a rapid, precise, accurate, sensitive and cost-effective procedure was proposed for the quantitative determination of flutamide.
Lerche, L; Olsen, A; Petersen, K E N; Rostgaard-Hansen, A L; Dragsted, L O; Nordsborg, N B; Tjønneland, A; Halkjaer, J
2017-12-01
Valid assessments of physical activity (PA) and cardiorespiratory fitness (CRF) are essential in epidemiological studies to define dose-response relationships and to formulate thorough recommendations on the pattern of PA needed to maintain good health. The aim of this study was to validate the Danish step test, the physical activity questionnaire Active-Q, and self-rated fitness against directly measured maximal oxygen uptake (VO2max). A population-based subsample (n=125) was included from the "Diet, Cancer and Health-Next Generations" (DCH-NG) cohort, which is under establishment. Validity coefficients, which express the correlation between measured and "true" exposure, were calculated, and misclassification across categories was evaluated. The validity of the Danish step test was moderate (women: r=.66; men: r=.56); however, men were systematically underestimated (43% misclassification). When validating the questionnaire-derived measures of PA, leisure-time physical activity was not correlated with VO2max. Positive correlations were found for sports overall, but these were only significant for men: total hours per week of sports (r=.26), MET-hours per week of sports (r=.28) and vigorous sports (r=.28) alone were positively correlated with VO2max. Finally, the percentage of misclassification was low for self-rated fitness (women: 9%; men: 13%). Thus, self-rated fitness was found to be superior to the Danish step test, as well as less costly and more practical than the VO2max method. Finally, even though the correlations were low, they support the potential for questionnaire outcomes, particularly sports, vigorous sports, and self-rated fitness, to be used to estimate CRF. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Remote sensing validation through SOOP technology: implementation of Spectra system
NASA Astrophysics Data System (ADS)
Piermattei, Viviana; Madonia, Alice; Bonamano, Simone; Consalvi, Natalizia; Caligiore, Aurelio; Falcone, Daniela; Puri, Pio; Sarti, Fabio; Spaccavento, Giovanni; Lucarini, Diego; Pacci, Giacomo; Amitrano, Luigi; Iacullo, Salvatore; D'Andrea, Salvatore; Marcelli, Marco
2017-04-01
The development of low-cost instrumentation plays a key role in marine environmental studies and represents one of the most innovative aspects of marine research. The availability of low-cost technologies allows the realization of extended observatory networks for the study of marine phenomena through an integrated approach merging observations, remote sensing and operational oceanography. Marine services and practical applications depend critically on the availability of large amounts of data collected with sufficiently dense spatial and temporal sampling. This directly influences the robustness both of ocean forecasting models and of remote sensing observations through data assimilation and validation processes, particularly in the biological domain. For this reason it is necessary to develop cheap, small, integrated smart sensors that can serve both satellite data validation and forecasting-model data assimilation, as well as support early-warning systems for environmental pollution control and prevention. This is particularly true in coastal areas, which are subject to multiple anthropic pressures. Moreover, coastal waters can be classified as case 2 waters, where the optical properties of inorganic suspended matter and chromophoric dissolved organic matter must be considered and separated from the chlorophyll a contribution. Because of the high costs of mooring systems, research vessels, measurement platforms and instrumentation, a major effort was dedicated to the design, development and realization of a new low-cost mini-FerryBox system: Spectra. Thanks to its modularity and user-friendly operation, Spectra allows continuous in situ measurements of temperature, conductivity, turbidity, and chlorophyll a and chromophoric dissolved organic matter (CDOM) fluorescence to be acquired from voluntary vessels, even by non-specialized operators (Marcelli et al., 2014; 2016).
This work shows the preliminary application of this technology to remote sensing data validation.
Nonlinear information fusion algorithms for data-efficient multi-fidelity modelling.
Perdikaris, P; Raissi, M; Damianou, A; Lawrence, N D; Karniadakis, G E
2017-02-01
Multi-fidelity modelling enables accurate inference of quantities of interest by synergistically combining realizations of low-cost/low-fidelity models with a small set of high-fidelity observations. This is particularly effective when the low- and high-fidelity models exhibit strong correlations, and can lead to significant computational gains over approaches that solely rely on high-fidelity models. However, in many cases of practical interest, low-fidelity models can only be well correlated to their high-fidelity counterparts for a specific range of input parameters, and potentially return wrong trends and erroneous predictions if probed outside of their validity regime. Here we put forth a probabilistic framework based on Gaussian process regression and nonlinear autoregressive schemes that is capable of learning complex nonlinear and space-dependent cross-correlations between models of variable fidelity, and can effectively safeguard against low-fidelity models that provide wrong trends. This introduces a new class of multi-fidelity information fusion algorithms that provide a fundamental extension to the existing linear autoregressive methodologies, while still maintaining the same algorithmic complexity and overall computational cost. The performance of the proposed methods is tested in several benchmark problems involving both synthetic and real multi-fidelity datasets from computational fluid dynamics simulations.
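The autoregressive structure described above, f_high(x) ≈ g(x, f_low(x)), can be sketched in miniature. This is not the paper's method: the Gaussian process is replaced by ordinary linear least squares over the features (1, x, f_low(x)), and the model pair below is invented, purely to show how a cheap surrogate plus a handful of expensive samples yields a fused predictor:

```python
import math

# The high-fidelity model is "expensive"; the low-fidelity model is a
# cheap, well-correlated surrogate. Both are invented for this sketch.
def f_high(x): return 1.5 * math.sin(x) + 0.5
def f_low(x):  return math.sin(x)

def solve3(A, b):
    """Solve the small system A z = b by Gaussian elimination with
    partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for cc in range(col, n + 1):
                M[r][cc] -= factor * M[col][cc]
    z = [0.0] * n
    for r in range(n - 1, -1, -1):
        z[r] = (M[r][n] - sum(M[r][cc] * z[cc] for cc in range(r + 1, n))) / M[r][r]
    return z

# A handful of high-fidelity samples; fit f_high ~ a + b*x + c*f_low(x)
# via the normal equations of linear least squares.
xs = [0.3, 1.0, 1.7, 2.4, 3.1]
feats = [[1.0, x, f_low(x)] for x in xs]
ys = [f_high(x) for x in xs]
AtA = [[sum(f[i] * f[j] for f in feats) for j in range(3)] for i in range(3)]
Atb = [sum(f[i] * y for f, y in zip(feats, ys)) for i in range(3)]
a, b, c = solve3(AtA, Atb)

def fused(x):
    """Fused prediction: evaluate the cheap model everywhere, then apply
    the learned cross-fidelity correction."""
    return a + b * x + c * f_low(x)

print(round(fused(0.8), 4), round(f_high(0.8), 4))
```

Because the toy high-fidelity model lies in the span of the features, the fit recovers the cross-correlation exactly; the paper's GP-based scheme additionally handles nonlinear, space-dependent correlations and quantifies uncertainty.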
Validation of a Type 2 Diabetes Screening Tool in Rural Honduras
Milton, Evan C.; Herman, William H.; Aiello, Allison E.; Danielson, Kris R.; Mendoza-Avelarez, Milton O.; Piette, John D.
2010-01-01
OBJECTIVE To validate a low-cost tool for identifying diabetic patients in rural areas of Latin America. RESEARCH DESIGN AND METHODS A regression equation incorporating postprandial time and a random plasma glucose was used to screen 800 adults in Honduras. Patients with a probability of diabetes of ≥20% were asked to return for a fasting plasma glucose (FPG). A random fifth of those with a screener-based probability of diabetes <20% were also asked to return for follow-up. The gold standard was an FPG ≥126 mg/dl. RESULTS The screener had very good test characteristics (area under the receiver operating characteristic curve = 0.89). Using the screening criterion of ≥0.42, the equation had a sensitivity of 74.1% and specificity of 97.2%. CONCLUSIONS This screener is a valid measure of diabetes risk in Honduras and could be used to identify diabetic patients in poor clinics in Latin America. PMID:19918008
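The validation arithmetic behind the reported sensitivity and specificity is simple to reproduce in sketch form. The cohort below is hypothetical, and the published screener's regression coefficients are not reproduced here; only the thresholding logic is illustrated:

```python
# Given each subject's screener probability and gold-standard status
# (FPG >= 126 mg/dl), compute sensitivity and specificity at the
# abstract's 0.42 screening criterion. All data are made up.

def sens_spec(records, threshold=0.42):
    tp = sum(1 for p, d in records if d and p >= threshold)
    fn = sum(1 for p, d in records if d and p < threshold)
    tn = sum(1 for p, d in records if not d and p < threshold)
    fp = sum(1 for p, d in records if not d and p >= threshold)
    return tp / (tp + fn), tn / (tn + fp)

# (screener probability, has diabetes by FPG criterion)
cohort = [(0.90, True), (0.55, True), (0.30, True), (0.47, True),
          (0.10, False), (0.05, False), (0.44, False), (0.20, False)]
sensitivity, specificity = sens_spec(cohort)
print(sensitivity, specificity)  # 0.75 0.75
```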
A model of head-related transfer functions based on a state-space analysis
NASA Astrophysics Data System (ADS)
Adams, Norman Herkamp
This dissertation develops and validates a novel state-space method for binaural auditory display. Binaural displays seek to immerse a listener in a 3D virtual auditory scene with a pair of headphones. The challenge for any binaural display is to compute the two signals to supply to the headphones. The present work considers a general framework capable of synthesizing a wide variety of auditory scenes. The framework models collections of head-related transfer functions (HRTFs) simultaneously. This framework improves the flexibility of contemporary displays, but it also compounds the steep computational cost of the display. The cost is reduced dramatically by formulating the collection of HRTFs in the state-space and employing order-reduction techniques to design efficient approximants. Order-reduction techniques based on the Hankel-operator are found to yield accurate low-cost approximants. However, the inter-aural time difference (ITD) of the HRTFs degrades the time-domain response of the approximants. Fortunately, this problem can be circumvented by employing a state-space architecture that allows the ITD to be modeled outside of the state-space. Accordingly, three state-space architectures are considered. Overall, a multiple-input, single-output (MISO) architecture yields the best compromise between performance and flexibility. The state-space approximants are evaluated both empirically and psychoacoustically. An array of truncated FIR filters is used as a pragmatic reference system for comparison. For a fixed cost bound, the state-space systems yield lower approximation error than FIR arrays for D>10, where D is the number of directions in the HRTF collection. A series of headphone listening tests are also performed to validate the state-space approach, and to estimate the minimum order N of indiscriminable approximants. For D = 50, the state-space systems yield order thresholds less than half those of the FIR arrays. 
Depending upon the stimulus uncertainty, a minimum state-space order of 7≤N≤23 appears to be adequate. In conclusion, the proposed state-space method enables a more flexible and immersive binaural display with low computational cost.
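A back-of-the-envelope operation count makes the cost argument concrete. The cost model below is an assumption for illustration (per-sample multiplies only), not taken from the dissertation, but it shows why a shared MISO state-space can undercut an array of FIR filters as the number of directions D grows:

```python
# Rough per-sample multiply counts for a D-direction HRTF collection.
# These formulas are illustrative assumptions, not the dissertation's
# exact cost accounting.

def fir_cost(D, L):
    """D independent FIR filters, each of length L."""
    return D * L

def miso_ss_cost(D, N):
    """MISO state-space of order N with D inputs: N^2 for the state
    update, D*N for input mixing, N for the output read-out."""
    return N * N + D * N + N

L, N = 128, 24  # hypothetical FIR length and reduced state-space order
for D in (5, 10, 50):
    print(D, fir_cost(D, L), miso_ss_cost(D, N))
```

Under these assumed figures the FIR array is cheaper for small D, while the shared state-space wins as D grows, consistent with the dissertation's finding that the state-space approach pays off for larger direction counts.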
Manufacturing of Low Cost, Durable Membrane Electrode Assemblies Engineered for Rapid Conditioning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Busby, Colin
Over the past 20 years, significant progress in membrane-electrode assembly (MEA) technology development for polymer electrolyte fuel cells (PEMFCs) has brought PEMFC technology close to commercial reality for transportation applications. However, two primary technical challenges remain to be addressed in the MEA. First and foremost is meeting the automotive cost targets: producing a fuel cell stack cost-competitive with today’s internal combustion engine. In addition to material cost, MEA (and other component) and stack assembly production methods must be amenable to use in low-cost, high-speed automotive assembly lines. One impediment to this latter goal is that stack components must currently go through a long and tedious conditioning procedure before they produce optimal power. This so-called “break-in” can take many hours and can involve quite complex voltage, temperature and/or pressure steps. These break-in procedures must be simplified, and the time required reduced, if fuel cells are to become a viable automotive engine. The second challenge is to achieve the durability targets under real-world automotive duty-cycle operation. In this project, significant improvements were achieved in cost, break-in time, and durability for the key component of the fuel cell stack, the MEA. Advanced modeling was used to guide the design of the new MEA to maximize performance and durability. A new, innovative process and manufacturing approach, using direct in-line coating with scalable, cost-competitive, continuous high-volume 3-layer rolled-good manufacturing processes, was developed and validated by single-cell and short-stack testing. In addition, the direct coating methods employed were shown to reduce the cost of sacrificial films. Furthermore, Gore has demonstrated a 10 µm reinforced membrane that is used in the new low-cost process and can meet automotive power-density and durability targets.
Across a wide range of operating conditions, the direct-coated MEA outperformed the commercial baseline MEA, and did so through a process that delivers MEAs at $92.35/m2 at a volume of 500,000 systems per year, according to Strategic Analysis (SA) estimates.
Psychometric and cognitive validation of a social capital measurement tool in Peru and Vietnam.
De Silva, Mary J; Harpham, Trudy; Tuan, Tran; Bartolini, Rosario; Penny, Mary E; Huttly, Sharon R
2006-02-01
Social capital is a relatively new concept which has attracted significant attention in recent years. No consensus has yet been reached on how to measure social capital, resulting in a large number of different tools available. While psychometric validation methods such as factor analysis have been used by a few studies to assess the internal validity of some tools, these techniques rely on data already collected by the tool and are therefore not capable of eliciting what the questions are actually measuring. The Young Lives (YL) study includes quantitative measures of caregiver's social capital in four countries (Vietnam, Peru, Ethiopia, and India) using a short version of the Adapted Social Capital Assessment Tool (SASCAT). A range of different psychometric methods including factor analysis were used to evaluate the construct validity of SASCAT in Peru and Vietnam. In addition, qualitative cognitive interviews with 20 respondents from Peru and 24 respondents from Vietnam were conducted to explore what each question is actually measuring. We argue that psychometric validation techniques alone are not sufficient to adequately validate multi-faceted social capital tools for use in different cultural settings. Psychometric techniques show SASCAT to be a valid tool reflecting known constructs and displaying postulated links with other variables. However, results from the cognitive interviews present a more mixed picture with some questions being appropriately interpreted by respondents, and others displaying significant differences between what the researchers intended them to measure and what they actually do. Using evidence from a range of methods of assessing validity has enabled the modification of an existing instrument into a valid and low cost tool designed to measure social capital within larger surveys in Peru and Vietnam, with the potential for use in other developing countries following local piloting and cultural adaptation of the tool.
Quantitative imaging assay for NF-κB nuclear translocation in primary human macrophages
Noursadeghi, Mahdad; Tsang, Jhen; Haustein, Thomas; Miller, Robert F.; Chain, Benjamin M.; Katz, David R.
2008-01-01
Quantitative measurement of NF-κB nuclear translocation is an important research tool in cellular immunology. Established methodologies have a number of limitations, such as poor sensitivity, high cost or dependence on cell lines. Novel imaging methods to measure nuclear translocation of transcriptionally active components of NF-κB are being used but are also partly limited by the need for specialist imaging equipment or image analysis software. Herein we present a method for quantitative detection of NF-κB rel A nuclear translocation, using immunofluorescence microscopy and the public domain image analysis software ImageJ that can be easily adopted for cellular immunology research without the need for specialist image analysis expertise and at low cost. The method presented here is validated by demonstrating the time course and dose response of NF-κB nuclear translocation in primary human macrophages stimulated with LPS, and by comparison with a commercial NF-κB activation reporter cell line. PMID:18036607
Mohammad Abdulghani, Hamza; G. Ponnamperuma, Gominda; Ahmad, Farah; Amin, Zubair
2014-01-01
Objective: To evaluate the assessment system of the 'Research Methodology Course' using utility criteria (i.e. validity, reliability, acceptability, educational impact, and cost-effectiveness). This study demonstrates a comprehensive evaluation of an assessment system and suggests a framework for similar courses. Methods: Qualitative and quantitative methods were used to evaluate the course assessment components (50 MCQs, 3 short-answer questions (SAQs), and a research project) against the utility criteria. Results from multiple evaluation methods for all the assessment components were collected and interpreted together to arrive at holistic judgments, rather than judgments based on individual methods or individual assessments. Results: Face validity, evaluated using a self-administered questionnaire (response rate: 88.7%), disclosed that the students perceived an imbalance in the content covered by the assessment. This was confirmed by the assessment blueprint. Construct validity was affected by the low correlation between MCQ and SAQ scores (r=0.326). There was a higher correlation between the project and MCQ (r=0.466)/SAQ (r=0.463) scores. Construct validity was also affected by the presence of recall-type MCQs (70%; 35/50), item construction flaws and non-functioning distractors. High discrimination indices (>0.35) were found in MCQs with moderate difficulty indices (0.3-0.7). The reliability of the MCQs was 0.75, which could be improved to 0.8 by increasing the number of MCQs to at least 70. A positive educational impact was found in the form of the research project assessment driving students to present/publish their work in conferences/peer-reviewed journals. The cost per student to complete the course was US$164.50. Conclusions: The multi-modal evaluation of an assessment system is feasible and provides thorough, diagnostic information. The utility of the assessment system could be further improved by modifying the psychometrically inappropriate assessment items.
PMID:24772117
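The projection from a reliability of 0.75 with 50 MCQs to about 0.8 with 70 is consistent with the Spearman-Brown prophecy formula, which can be checked directly:

```python
# Spearman-Brown prophecy: predicted reliability when a test is
# lengthened by a factor k = n_new / n_old with comparable items.

def spearman_brown(r, n_old, n_new):
    k = n_new / n_old
    return k * r / (1 + (k - 1) * r)

print(round(spearman_brown(0.75, 50, 70), 2))  # 0.81
```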
Choosing MUSE: Validation of a Low-Cost, Portable EEG System for ERP Research
Krigolson, Olave E.; Williams, Chad C.; Norton, Angela; Hassall, Cameron D.; Colino, Francisco L.
2017-01-01
In recent years there has been an increase in the number of portable low-cost electroencephalographic (EEG) systems available to researchers. However, to date the validation of the use of low-cost EEG systems has focused on continuous recording of EEG data and/or the replication of large system EEG setups reliant on event-markers to afford examination of event-related brain potentials (ERP). Here, we demonstrate that it is possible to conduct ERP research without being reliant on event markers using a portable MUSE EEG system and a single computer. Specifically, we report the results of two experiments using data collected with the MUSE EEG system—one using the well-known visual oddball paradigm and the other using a standard reward-learning task. Our results demonstrate that we could observe and quantify the N200 and P300 ERP components in the visual oddball task and the reward positivity (the mirror opposite component to the feedback-related negativity) in the reward-learning task. Specifically, single sample t-tests of component existence (all p's < 0.05), computation of Bayesian credible intervals, and 95% confidence intervals all statistically verified the existence of the N200, P300, and reward positivity in all analyses. We provide with this research paper an open source website with all the instructions, methods, and software to replicate our findings and to provide researchers with an easy way to use the MUSE EEG system for ERP research. Importantly, our work highlights that with a single computer and a portable EEG system such as the MUSE one can conduct ERP research with ease thus greatly extending the possible use of the ERP methodology to a variety of novel contexts. PMID:28344546
A low-cost, tablet-based option for prehospital neurologic assessment
Chapman Smith, Sherita N.; Govindarajan, Prasanthi; Padrick, Matthew M.; Lippman, Jason M.; McMurry, Timothy L.; Resler, Brian L.; Keenan, Kevin; Gunnell, Brian S.; Mehndiratta, Prachi; Chee, Christina Y.; Cahill, Elizabeth A.; Dietiker, Cameron; Cattell-Gordon, David C.; Smith, Wade S.; Perina, Debra G.; Solenski, Nina J.; Worrall, Bradford B.
2016-01-01
Objectives: In this 2-center study, we assessed the technical feasibility and reliability of a low cost, tablet-based mobile telestroke option for ambulance transport and hypothesized that the NIH Stroke Scale (NIHSS) could be performed with similar reliability between remote and bedside examinations. Methods: We piloted our mobile telemedicine system in 2 geographic regions, central Virginia and the San Francisco Bay Area, utilizing commercial cellular networks for videoconferencing transmission. Standardized patients portrayed scripted stroke scenarios during ambulance transport and were evaluated by independent raters comparing bedside to remote mobile telestroke assessments. We used a mixed-effects regression model to determine intraclass correlation of the NIHSS between bedside and remote examinations (95% confidence interval). Results: We conducted 27 ambulance runs at both sites and successfully completed the NIHSS for all prehospital assessments without prohibitive technical interruption. The mean difference between bedside (face-to-face) and remote (video) NIHSS scores was 0.25 (1.00 to −0.50). Overall, correlation of the NIHSS between bedside and mobile telestroke assessments was 0.96 (0.92–0.98). In the mixed-effects regression model, there were no statistically significant differences accounting for method of evaluation or differences between sites. Conclusions: Utilizing a low-cost, tablet-based platform and commercial cellular networks, we can reliably perform prehospital neurologic assessments in both rural and urban settings. Further research is needed to establish the reliability and validity of prehospital mobile telestroke assessment in live patients presenting with acute neurologic symptoms. PMID:27281534
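The agreement statistics reported above reduce to simple arithmetic on paired scores. The sketch below uses hypothetical NIHSS pairs, and plain Pearson correlation stands in for the study's mixed-effects intraclass correlation:

```python
from math import sqrt

# Mean bedside-minus-remote difference and Pearson correlation of paired
# NIHSS scores. All scores below are invented for illustration.

def mean(v):
    return sum(v) / len(v)

def pearson_r(a, b):
    ma, mb = mean(a), mean(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den

bedside = [4, 12, 7, 21, 2, 15, 9, 11]
remote  = [4, 11, 8, 20, 2, 16, 9, 10]
diffs = [b - r for b, r in zip(bedside, remote)]
print(mean(diffs), round(pearson_r(bedside, remote), 3))
```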
Wang, Li-Ju; Naudé, Nicole; Demissie, Misganaw; Crivaro, Anne; Kamoun, Malek; Wang, Ping; Li, Lei
2018-07-01
Most mobile health (mHealth) diagnostic devices for laboratory tests analyze only one sample at a time, which is not suitable for large-volume serology testing, especially in low-resource settings with a shortage of health professionals. In this study, we developed an ultra-low-cost, clinically accurate mobile phone microplate reader (mReader), and clinically validated this optical device for 12 infectious disease tests. The mReader optically reads 96 samples on a microplate at one time. 771 de-identified patient samples were tested across 12 serology assays for bacterial/viral infections. The mReader and the clinical instrument blindly read and analyzed all tests in parallel. The analytical accuracy and diagnostic performance of the mReader were evaluated across the clinical reportable categories by comparison with clinical laboratory testing results. The mReader exhibited 97.59-99.90% analytical accuracy and <5% coefficient of variation (CV). The positive percent agreement (PPA) in all 12 tests reached 100%, negative percent agreement (NPA) was higher than 83% except for one test (42.86%), and overall percent agreement (OPA) ranged from 89.33% to 100%. We envision that the mReader can benefit underserved areas/populations and low-resource settings in rural clinics/hospitals at a low cost (~$50 USD) with clinical-level analytical quality. It has the potential to improve health access, speed up healthcare delivery, and reduce health and education disparities by providing access to a low-cost spectrophotometer. Copyright © 2018 Elsevier B.V. All rights reserved.
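The headline PPA/NPA/OPA figures are plain agreement rates against the clinical instrument; a sketch with made-up binary results for one assay:

```python
# Positive/negative/overall percent agreement between the mReader and the
# reference instrument. Results are invented booleans (True = positive).

def agreement(device, reference):
    pos = [m for m, r in zip(device, reference) if r]
    neg = [m for m, r in zip(device, reference) if not r]
    ppa = sum(1 for m in pos if m) / len(pos)        # positive % agreement
    npa = sum(1 for m in neg if not m) / len(neg)    # negative % agreement
    opa = sum(1 for m, r in zip(device, reference) if m == r) / len(device)
    return ppa, npa, opa

ref     = [True, True, True, False, False, False, False, False]
mreader = [True, True, True, False, False, True,  False, False]
print(agreement(mreader, ref))  # (1.0, 0.8, 0.875)
```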
2011-01-01
Background The lack of culturally adapted and validated instruments for child mental health and psychosocial support in low and middle-income countries is a barrier to assessing prevalence of mental health problems, evaluating interventions, and determining program cost-effectiveness. Alternative procedures are needed to validate instruments in these settings. Methods Six criteria are proposed to evaluate cross-cultural validity of child mental health instruments: (i) purpose of instrument, (ii) construct measured, (iii) contents of construct, (iv) local idioms employed, (v) structure of response sets, and (vi) comparison with other measurable phenomena. These criteria are applied to transcultural translation and alternative validation for the Depression Self-Rating Scale (DSRS) and Child PTSD Symptom Scale (CPSS) in Nepal, which recently suffered a decade of war including conscription of child soldiers and widespread displacement of youth. Transcultural translation was conducted with Nepali mental health professionals and six focus groups with children (n = 64) aged 11-15 years old. Because of the lack of child mental health professionals in Nepal, a psychosocial counselor performed an alternative validation procedure using psychosocial functioning as a criterion for intervention. The validation sample was 162 children (11-14 years old). The Kiddie-Schedule for Affective Disorders and Schizophrenia (K-SADS) and Global Assessment of Psychosocial Disability (GAPD) were used to derive indication for treatment as the external criterion. Results The instruments displayed moderate to good psychometric properties: DSRS (area under the curve (AUC) = 0.82, sensitivity = 0.71, specificity = 0.81, cutoff score ≥ 14); CPSS (AUC = 0.77, sensitivity = 0.68, specificity = 0.73, cutoff score ≥ 20). 
The DSRS items with significant discriminant validity were "having energy to complete daily activities" (DSRS.7), "feeling that life is not worth living" (DSRS.10), and "feeling lonely" (DSRS.15). The CPSS items with significant discriminant validity were nightmares (CPSS.2), flashbacks (CPSS.3), traumatic amnesia (CPSS.8), feelings of a foreshortened future (CPSS.12), and easily irritated at small matters (CPSS.14). Conclusions Transcultural translation and alternative validation feasibly can be performed in low clinical resource settings through task-shifting the validation process to trained mental health paraprofessionals using structured interviews. This process is helpful to evaluate cost-effectiveness of psychosocial interventions. PMID:21816045
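The AUC figures quoted above have a direct probabilistic reading: the chance that a randomly chosen case scores above a randomly chosen non-case (the Mann-Whitney formulation). A sketch with hypothetical symptom-scale totals:

```python
# AUC computed directly from its Mann-Whitney definition: the fraction of
# case/non-case pairs in which the case scores higher (ties count half).
# Scores below are invented, not from the Nepal validation sample.

def auc(case_scores, noncase_scores):
    wins = sum(1.0 if c > n else 0.5 if c == n else 0.0
               for c in case_scores for n in noncase_scores)
    return wins / (len(case_scores) * len(noncase_scores))

cases    = [16, 14, 19, 13, 22]
noncases = [8, 12, 15, 6, 10, 14]
print(round(auc(cases, noncases), 3))  # 0.883
```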
Lee, Robert H; Bott, Marjorie J; Forbes, Sarah; Redford, Linda; Swagerty, Daniel L; Taunton, Roma Lee
2003-01-01
Understanding how quality improvement affects costs is important. Unfortunately, low-cost, reliable ways of measuring direct costs are scarce. This article builds on the principles of process improvement to develop a costing strategy that meets both criteria. Process-based costing has 4 steps: developing a flowchart, estimating resource use, valuing resources, and calculating direct costs. To illustrate the technique, this article uses it to cost the care planning process in 3 long-term care facilities. We conclude that process-based costing is easy to implement; generates reliable, valid data; and allows nursing managers to assess the costs of new or modified processes.
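The four steps above collapse into simple arithmetic once the flowchart and resource estimates are in hand; a sketch with hypothetical figures for the care-planning example:

```python
# Process-based costing: flowchart the process, estimate resource use per
# step, value each resource, and sum to a direct cost. Steps, hours, and
# rates below are invented for illustration.

def direct_cost(steps):
    """steps: list of (step name, hours of staff time, hourly rate in $)."""
    return sum(hours * rate for _, hours, rate in steps)

care_planning = [
    ("chart review",                0.50, 32.0),
    ("interdisciplinary meeting",   1.00, 28.0),
    ("write care plan",             0.75, 32.0),
]
print(direct_cost(care_planning))  # 68.0 dollars per resident
```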
Accurate prediction of bond dissociation energies of large n-alkanes using ONIOM-CCSD(T)/CBS methods
NASA Astrophysics Data System (ADS)
Wu, Junjun; Ning, Hongbo; Ma, Liuhao; Ren, Wei
2018-05-01
Accurate determination of the bond dissociation energies (BDEs) of large alkanes is desirable but practically impossible with high-level ab initio methods because of their expensive computational cost. We developed a two-layer ONIOM-CCSD(T)/CBS method that treats the high layer with the CCSD(T) method and the low layer with a DFT method. The accuracy of this method was validated by comparing the calculated BDEs of n-hexane with those obtained at the CCSD(T)-F12b/aug-cc-pVTZ level of theory. On this basis, the C-C BDEs of C6-C20 n-alkanes were calculated systematically using the ONIOM[CCSD(T)/CBS(D-T):M06-2x/6-311++G(d,p)] method, showing good agreement with data available in the literature.
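The two-layer ONIOM extrapolation itself is a one-line energy combination, E(ONIOM2) = E_high(model) + E_low(real) - E_low(model); a sketch with placeholder energies (in hartree, not values from the paper):

```python
# Two-layer ONIOM: the small "model" region is treated at the high level,
# and the low-level description of the full ("real") system supplies the
# correction. All energies below are placeholder numbers.

def oniom2(e_high_model, e_low_real, e_low_model):
    """E(ONIOM2) = E_high(model) + E_low(real) - E_low(model)."""
    return e_high_model + e_low_real - e_low_model

HARTREE_TO_KCAL = 627.5095

# Hypothetical energies for a parent alkane and the two radicals from
# homolytic C-C cleavage; the BDE is the usual difference of totals.
e_parent   = oniom2(-79.6512, -395.4021, -79.6430)
e_radical1 = oniom2(-39.8010, -197.6905, -39.7968)
e_radical2 = oniom2(-39.8105, -197.5731, -39.8060)
bde_kcal = (e_radical1 + e_radical2 - e_parent) * HARTREE_TO_KCAL
print(round(bde_kcal, 1))  # 86.6
```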
Brewin, James; Tang, Jessica; Dasgupta, Prokar; Khan, Muhammad S; Ahmed, Kamran; Bello, Fernando; Kneebone, Roger; Jaye, Peter
2015-07-01
To evaluate the face, content and construct validity of the distributed simulation (DS) environment for technical and non-technical skills training in endourology. To evaluate the educational impact of DS for urology training. DS offers a portable, low-cost simulated operating room environment that can be set up in any open space. A prospective mixed-methods design using established validation methodology was conducted in this simulated environment with 10 experienced and 10 trainee urologists. All participants performed a simulated prostate resection in the DS environment. Outcome measures included surveys to evaluate the DS, as well as comparative analyses of experienced and trainee urologists' performance using real-time and 'blinded' video analysis and validated performance metrics. Non-parametric statistical methods were used to compare differences between groups. The DS environment demonstrated face, content and construct validity for both non-technical and technical skills. Kirkpatrick level 1 evidence for the educational impact of the DS environment was shown. Further studies are needed to evaluate the effect of simulated operating room training on real operating room performance. This study has shown the validity of the DS environment for non-technical, as well as technical, skills training. DS-based simulation appears to be a valuable addition to traditional classroom-based simulation training. © 2014 The Authors BJU International © 2014 BJU International Published by John Wiley & Sons Ltd.
Bourget, Philippe; Amin, Alexandre; Vidal, Fabrice; Merlette, Christophe; Troude, Pénélope; Baillet-Guffroy, Arlette
2014-08-15
The purpose of the study was to perform a comparative analysis of the technical performance, respective costs and environmental effect of two invasive analytical methods (HPLC and UV/visible-FTIR) as compared to a new non-invasive analytical technique (Raman spectroscopy). Three pharmacotherapeutic models were used to compare the analytical performances of the three techniques. Statistical inter-method correlation analysis was performed using non-parametric rank correlation tests. The study's economic component combined calculations relative to the depreciation of the equipment and the estimated cost of an AQC unit of work. In all cases, the analytical validation parameters of the three techniques were satisfactory, and strong correlations between the two spectroscopic techniques vs. HPLC were found. In addition, Raman spectroscopy was found to be superior to the other techniques on numerous key criteria, including complete safety for operators and their working environment, a non-invasive procedure, no need for consumables, and a low operating cost. Finally, Raman spectroscopy appears superior for technical, economic and environmental objectives, as compared with the other, invasive analytical methods. Copyright © 2014 Elsevier B.V. All rights reserved.
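The non-parametric rank-correlation analysis mentioned above can be sketched as follows. The paired HPLC/Raman values are invented for illustration, and Spearman's rho is computed with the tie-free formula:

```python
# Spearman rank correlation between a reference method (HPLC) and a candidate
# method (Raman) on the same samples. All concentration values are invented.

def rankdata(xs):
    # ranks 1..n, assuming no ties (true for the illustrative data below)
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0] * len(xs)
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    return ranks

def spearman(x, y):
    rx, ry = rankdata(x), rankdata(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))   # tie-free Spearman formula

hplc  = [0.98, 1.52, 2.01, 2.49, 3.05, 3.48, 4.02]
raman = [1.01, 1.49, 2.55, 2.10, 2.99, 3.52, 3.97]  # one pair of ranks swapped

rho = spearman(hplc, raman)
print(round(rho, 4))  # 0.9643: near-perfect rank agreement between methods
```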
Zhang, Mengliang; Zhao, Yang; Harrington, Peter de B; Chen, Pei
2016-03-01
Two simple fingerprinting methods, flow-injection coupled to ultraviolet spectroscopy and proton nuclear magnetic resonance, were used for discriminating between Aurantii fructus immaturus and Fructus ponciri trifoliatae immaturus. Both methods were combined with partial least-squares discriminant analysis. In the flow-injection method, four data representations were evaluated: total ultraviolet absorbance chromatograms, averaged ultraviolet spectra, absorbance at 193, 205, 225, and 283 nm, and absorbance at 225 and 283 nm. Prediction rates of 100% were achieved for all data representations by partial least-squares discriminant analysis using leave-one-sample-out cross-validation. The prediction rate for the proton nuclear magnetic resonance data by partial least-squares discriminant analysis with leave-one-sample-out cross-validation was also 100%. A new validation set of data was collected by flow-injection with ultraviolet spectroscopic detection two weeks later and predicted by the partial least-squares discriminant analysis models constructed from the initial data representations with no parameter changes. The classification rates were 95% with the total ultraviolet absorbance chromatogram datasets and 100% with the other three datasets. Flow-injection with ultraviolet detection and proton nuclear magnetic resonance are simple, high-throughput, and low-cost methods for discrimination studies.
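The leave-one-sample-out protocol used above can be sketched in a dependency-free way. A nearest-centroid rule stands in for the PLS-DA classifier here, and the two-band absorbance features and species labels are invented:

```python
import math

# Leave-one-sample-out cross-validation, as used for the fingerprinting models
# above. A nearest-centroid rule substitutes for PLS-DA so the sketch stays
# self-contained; the 2-band absorbance features (225 nm, 283 nm) are made up.

def nearest_centroid_predict(train_X, train_y, x):
    centroids = {}
    for label in set(train_y):
        rows = [v for v, l in zip(train_X, train_y) if l == label]
        centroids[label] = [sum(col) / len(rows) for col in zip(*rows)]
    return min(centroids, key=lambda l: math.dist(centroids[l], x))

def loocv_accuracy(X, y):
    hits = 0
    for i in range(len(X)):                 # hold out one sample at a time
        train_X = X[:i] + X[i + 1:]
        train_y = y[:i] + y[i + 1:]
        hits += nearest_centroid_predict(train_X, train_y, X[i]) == y[i]
    return hits / len(X)

X = [[0.9, 0.2], [1.0, 0.25], [0.95, 0.3],   # species A: high 225 nm band
     [0.2, 0.9], [0.25, 1.0], [0.3, 0.95]]   # species B: high 283 nm band
y = ["aurantii", "aurantii", "aurantii", "ponciri", "ponciri", "ponciri"]

print(loocv_accuracy(X, y))  # 1.0 on this well-separated toy set
```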
Validating Human Performance Models of the Future Orion Crew Exploration Vehicle
NASA Technical Reports Server (NTRS)
Wong, Douglas T.; Walters, Brett; Fairey, Lisa
2010-01-01
NASA's Orion Crew Exploration Vehicle (CEV) will provide transportation for crew and cargo to and from destinations in support of the Constellation Architecture Design Reference Missions. Discrete Event Simulation (DES) is one of the design methods NASA employs to evaluate crew performance of the CEV. During the early development of the CEV, NASA and its prime Orion contractor Lockheed Martin (LM) sought an effective, low-cost method for developing and validating human performance DES models. This paper focuses on the method developed while creating a DES model for the CEV Rendezvous, Proximity Operations, and Docking (RPOD) task to the International Space Station. Our approach to validation was to attack the problem on several fronts. First, we began developing the model early in the CEV design stage. Second, we adhered strictly to M&S development standards. Third, we involved the stakeholders, NASA astronauts, subject matter experts, and NASA's modeling and simulation development community throughout. Fourth, we applied standard and easy-to-conduct methods to ensure the model's accuracy. Lastly, we reviewed the data from an earlier human-in-the-loop RPOD simulation that had different objectives, which provided an additional means to estimate the model's confidence level. The results revealed that the majority of the DES model was a reasonable representation of the current CEV design.
Jank, Louise; Martins, Magda Targa; Arsand, Juliana Bazzan; Campos Motta, Tanara Magalhães; Hoff, Rodrigo Barcellos; Barreto, Fabiano; Pizzolato, Tânia Mara
2015-11-01
A fast and simple method for residue analysis of the antibiotic classes of macrolides (erythromycin, azithromycin, tylosin, tilmicosin and spiramycin) and lincosamides (lincomycin and clindamycin) was developed and validated for cattle, swine and chicken muscle and for bovine milk. Sample preparation consists of a liquid-liquid extraction (LLE) with acetonitrile, followed by liquid chromatography-electrospray-tandem mass spectrometry (LC-ESI-MS/MS) analysis, without the need for any additional clean-up steps. Chromatographic separation was achieved using a C18 column and a mobile phase composed of acidified acetonitrile and water. The method was fully validated according to the criteria of Commission Decision 2002/657/EC. Validation parameters such as limit of detection, limit of quantification, linearity, accuracy, repeatability, specificity, reproducibility, decision limit (CCα) and detection capability (CCβ) were evaluated. All calculated values met the established criteria. Reproducibility values, expressed as coefficients of variation, were all lower than 19.1%. Recoveries ranged from 60% to 107%. Limits of detection were from 5 to 25 µg kg(-1). The present method is suitable for routine analysis, with adequate analysis time, low cost and a simple sample preparation protocol. Copyright © 2015. Published by Elsevier B.V.
Jalink, M B; Goris, J; Heineman, E; Pierie, J P E N; ten Cate Hoedemaker, H O
2014-02-01
Virtual reality (VR) laparoscopic simulators have been around for more than 10 years and have proven to be cost- and time-effective in laparoscopic skills training. However, most simulators are, in our experience, considered less interesting by residents and are often poorly accessible. Consequently, these devices are rarely used in actual training. In an effort to make a low-cost and more attractive simulator, a custom-made Nintendo Wii game was developed. This game could ultimately be used to train the same basic skills as VR laparoscopic simulators do. Before such a video game can be implemented in a surgical training program, it has to be validated according to international standards. The main goal of this study was to test the construct and concurrent validity of the controls of a prototype of the game. In this study, the basic laparoscopic skills of experts (surgeons, urologists, and gynecologists, n = 15) were compared to those of complete novices (internists, n = 15) using the Wii Laparoscopy (construct validity). Scores were also compared to the Fundamentals of Laparoscopic Surgery (FLS) Peg Transfer test, an already established assessment method for measuring basic laparoscopic skills (concurrent validity). Results showed that experts were 111% faster (P = 0.001) on the Wii Laparoscopy task than novices. Also, scores on the FLS Peg Transfer test and the Wii Laparoscopy showed a significant, high correlation (r = 0.812, P < 0.001). The prototype setup of the Wii Laparoscopy possesses solid construct and concurrent validity.
Al Ali, Ahmad; Touboul, David; Le Caër, Jean-Pierre; Schmitz-Afonso, Isabelle; Flinois, Jean-Pierre; Marchetti, Catherine; De Waziers, Isabelle; Brunelle, Alain; Laprévote, Olivier; Beaune, Philippe
2014-08-01
Cytochromes P450 (CYPs) play critical roles in the oxidative metabolism of many endogenous and exogenous compounds. Protein expression levels of CYPs in liver provide relevant information for a better understanding of the importance of CYPs in pharmacology and toxicology. This work aimed at establishing a simple method to quantify six CYPs (CYP3A4, CYP3A5, CYP1A2, CYP2D6, CYP2C9, and CYP2J2) in various biological samples without isotopic labeling. The biological matrix was spiked with the standard peptides prior to the digestion step to achieve label-free quantification by mass spectrometry. The method was validated and applied to quantify these six isoforms not only in human liver microsomes and mitochondria, but also in recombinant expression systems such as baculosomes and the HepG2 cell line. The results showed intra-assay and interassay accuracy and precision within 16% and 5%, respectively, at the low quality control level, and demonstrated the advantages of the method in terms of reproducibility and cost.
NASA Astrophysics Data System (ADS)
Maimaiti, Maimaitirebike
Inkjet printing is an attractive patterning technology that has received tremendous interest as a mass fabrication method for a variety of electronic devices due to its manufacturing flexibility and low cost. However, the printing facilities that are typically used, especially the inkjet printer itself, are very expensive. This thesis introduces an extremely cost-friendly inkjet printing method using a printer that costs less than $100. To verify its reliability, linearly and circularly polarized (CP) planar and conformal microstrip antennas were fabricated using this printing method, and their measurement results were compared with copper microstrip antennas. The results show that the printed microstrip antennas perform similarly to the copper antennas except for lower efficiency. The effects of the conductivity and thickness of the ink layer on the antenna properties were studied; the conductivity was found to be the main factor affecting the radiation efficiency, though thicker ink yields more efficient antennas. This thesis also presents the detailed antenna design for a sub-payload. The sub-payload is a cylindrical structure with a diameter of six inches and a height of four inches. It has four booms extending from its surface, which are used to measure the variations of the energy flow into the upper atmosphere in and around the aurora. The sub-payload has two types of antennas: linearly polarized (LP) S-band antennas and right-hand circularly polarized (RHCP) GPS antennas. Each type of antenna has various requirements to be fully functional for specific research tasks. The thesis includes the design methods for each type of antenna, the challenges that were confronted, and the solutions that were proposed. As a practical application, the inkjet printing method was conveniently applied in validating some of the antenna designs.
COAL UTILITY ENVIRONMENTAL COST (CUECOST) WORKBOOK USER'S MANUAL
The document is a user's manual for the Coal Utility Environmental Cost (CUECost) workbook (an interrelated set of spreadsheets) and documents its development and the validity of the methods used to estimate installed capital and annualized costs. The CUECost workbook produces rough-or...
Estimation of Temporal Gait Parameters Using a Human Body Electrostatic Sensing-Based Method.
Li, Mengxuan; Li, Pengfei; Tian, Shanshan; Tang, Kai; Chen, Xi
2018-05-28
Accurate estimation of gait parameters is essential for obtaining quantitative information on motor deficits in Parkinson's disease and other neurodegenerative diseases, which helps determine disease progression and therapeutic interventions. Due to the demand for high accuracy, unobtrusive measurement methods such as optical motion capture systems, foot pressure plates, and other systems have been commonly used in clinical environments. However, the high cost of existing lab-based methods greatly hinders their wider usage, especially in developing countries. In this study, we present a low-cost, noncontact, and accurate temporal gait parameter estimation method that senses and analyzes the electrostatic field generated by human foot stepping. The proposed method achieved an average 97% accuracy on gait phase detection and was further validated by comparison to a foot pressure system in 10 healthy subjects. The two sets of results were compared using the Pearson coefficient r, showing excellent consistency (r = 0.99, p < 0.05). The between-day repeatability of the proposed method was assessed by intraclass correlation coefficients (ICC) and showed good test-retest reliability (ICC = 0.87, p < 0.01). The proposed method could be an affordable and accurate tool for measuring temporal gait parameters in hospital laboratories and in patients' home environments.
Amin Nili, Vahid; Mansouri, Ehsan; Kavehvash, Zahra; Fakharzadeh, Mohammad; Shabany, Mahdi; Khavasi, Amin
2018-01-01
In this paper, a closed-form two-dimensional reconstruction technique for hybrid frequency and mechanical scanning millimeter-wave (MMW) imaging systems is proposed. Although commercially implemented in many imaging systems as a low-cost real-time solution, the results of frequency scanning systems have so far been reconstructed numerically or reported only as captured raw data with no clear details. Furthermore, this paper proposes a new framework to utilize the captured data of different frequencies for three-dimensional (3D) reconstruction based on novel closed-form relations. The hybrid frequency and mechanical scanning structure, together with the proposed reconstruction method, yields a low-cost MMW imaging system with satisfactory performance. The extracted reconstruction formulations are validated through numerical simulations, which show image quality comparable to conventional MMW imaging systems, i.e., switched-array (SA) and phased-array (PA) structures. Extensive simulations are also performed in the presence of additive noise, demonstrating the acceptable robustness of the system against system noise compared to SA and comparable performance with PA. Finally, 3D reconstruction of the simulated data shows a depth resolution better than 10 cm with minimal degradation of lateral resolution in the 10 GHz frequency bandwidth.
NASA Astrophysics Data System (ADS)
Song, Seok-Jeong; Kim, Tae-Il; Kim, Youngmi; Nam, Hyoungsik
2018-05-01
Recently, a simple, sensitive, and low-cost fluorescent indicator has been proposed to determine water contents in organic solvents, drugs, and foodstuffs. A change in water content leads to a change in the indicator's fluorescence color under ultraviolet (UV) light. Whereas the water content values could be estimated from the spectrum obtained by a bulky and expensive spectrometer in the previous research, this paper demonstrates a simple and low-cost camera-based water content measurement scheme with the same fluorescent water indicator. Water content is calculated over the range of 0-30% by quadratic polynomial regression models with color information extracted from the captured images of samples. In particular, several color spaces such as RGB, xyY, L∗a∗b∗, u′v′, HSV, and YCbCr have been investigated to establish the optimal color information features over both linear and nonlinear RGB data given by a camera before and after gamma correction. In the end, a 2nd-order polynomial regression model on HSV in the linear domain achieves the minimum mean square error of 1.06% under 3-fold cross-validation. Additionally, the resulting water content estimation model is implemented and evaluated on an off-the-shelf Android-based smartphone.
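A quadratic-regression calibration with 3-fold cross-validation, as described above, might look like the following sketch. The single "hue" feature and the synthetic water contents are assumptions for illustration, not the paper's data:

```python
import numpy as np

# Quadratic polynomial calibration with 3-fold cross-validation, in the spirit
# of the camera-based water-content scheme above. The lone feature stands in
# for one color-space coordinate (e.g. hue); all numbers are synthetic.

rng = np.random.default_rng(0)
hue = np.linspace(0.1, 0.9, 30)
water = 5 + 40 * hue - 15 * hue**2 + rng.normal(0, 0.5, hue.size)  # % water

folds = np.arange(hue.size) % 3           # interleaved 3-fold split
sq_err = []
for k in range(3):
    tr, te = folds != k, folds == k
    coeffs = np.polyfit(hue[tr], water[tr], deg=2)   # fit on two folds
    pred = np.polyval(coeffs, hue[te])               # predict the held-out fold
    sq_err.extend((pred - water[te]) ** 2)

print(float(np.mean(sq_err)))  # cross-validated MSE, small on this clean data
```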
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karnowski, Thomas Paul; Giancardo, Luca; Li, Yaquin
2013-01-01
Automated retina image analysis has reached a high level of maturity in recent years, and thus the question of how validation is performed in these systems is beginning to grow in importance. One application of retina image analysis is in telemedicine, where an automated system could enable the automated detection of diabetic retinopathy and other eye diseases as a low-cost method for broad-based screening. In this work we discuss our experiences in developing a telemedical network for retina image analysis, including our progression from a manual diagnosis network to a more fully automated one. We pay special attention to how validations of our algorithm steps are performed, using both data from the telemedicine network and other public databases.
Two-stage atlas subset selection in multi-atlas based image segmentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, Tingting, E-mail: tingtingzhao@mednet.ucla.edu; Ruan, Dan, E-mail: druan@mednet.ucla.edu
2015-06-15
Purpose: Fast-growing access to large databases and cloud-stored data presents a unique opportunity for multi-atlas based image segmentation, and also presents challenges in heterogeneous atlas quality and computation burden. This work aims to develop a novel two-stage method tailored to the special needs that arise in the face of a large atlas collection of varied quality, so that high-accuracy segmentation can be achieved at low computational cost. Methods: An atlas subset selection scheme is proposed to substitute a significant portion of the computationally expensive full-fledged registration in the conventional scheme with a low-cost alternative. More specifically, the authors introduce a two-stage atlas subset selection method. In the first stage, an augmented subset is obtained based on a low-cost registration configuration and a preliminary relevance metric; in the second stage, the subset is further narrowed down to a fusion set of desired size, based on full-fledged registration and a refined relevance metric. An inference model is developed to characterize the relationship between the preliminary and refined relevance metrics, and a proper augmented subset size is derived to ensure that the desired atlases survive the preliminary selection with high probability. Results: The performance of the proposed scheme has been assessed by cross-validation on two clinical datasets consisting of manually segmented prostate and brain magnetic resonance images, respectively. The proposed scheme demonstrates end-to-end segmentation performance comparable to the conventional single-stage selection method, but with significant computation reduction. Compared with the alternative computation reduction method, the scheme improves the mean and median Dice similarity coefficient values from (0.74, 0.78) to (0.83, 0.85) and from (0.82, 0.84) to (0.95, 0.95) for prostate and corpus callosum segmentation, respectively, with statistical significance.
Conclusions: The authors have developed a novel two-stage atlas subset selection scheme for multi-atlas based segmentation. It achieves good segmentation accuracy with significantly reduced computation cost, making it a suitable configuration in the presence of extensive heterogeneous atlases.
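The two-stage selection logic can be sketched with simulated relevance scores. The metrics here are stand-ins: in the paper, the preliminary score comes from a cheap registration configuration and the refined score from full-fledged registration:

```python
import random

# Two-stage atlas subset selection, in the spirit of the scheme above:
# stage 1 ranks every atlas with a cheap relevance score and keeps an
# augmented subset; stage 2 evaluates an expensive refined score only on
# that subset and keeps the final fusion set. Scores here are simulated.

random.seed(1)
n_atlases, augmented_size, fusion_size = 100, 15, 5

cheap = [random.random() for _ in range(n_atlases)]          # preliminary metric

# Stage 1: keep the augmented subset by the cheap metric.
stage1 = sorted(range(n_atlases), key=lambda i: cheap[i], reverse=True)[:augmented_size]

# Stage 2: expensive refined metric is computed ONLY for the augmented subset
# (simulated as the cheap score plus noise, i.e. correlated but not identical).
refined = {i: cheap[i] + random.gauss(0, 0.1) for i in stage1}
stage2 = sorted(stage1, key=lambda i: refined[i], reverse=True)[:fusion_size]

print(stage2)  # final fusion set: only 15 expensive evaluations, not 100
```

The augmented subset size plays the role the paper's inference model assigns to it: large enough that the truly relevant atlases survive stage 1 with high probability.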
NASA Astrophysics Data System (ADS)
Saager, Rolf B.; Baldado, Melissa L.; Rowland, Rebecca A.; Kelly, Kristen M.; Durkin, Anthony J.
2018-04-01
With the recent proliferation of compact and/or low-cost clinical multispectral imaging approaches and commercially available components, questions remain as to whether they adequately capture the requisite spectral content of their applications. We present a method to emulate the spectral range and resolution of a variety of multispectral imagers, based on in-vivo data acquired from spatial frequency domain spectroscopy (SFDS). This approach simulates spectral responses over 400 to 1100 nm. Comparing emulated data with full SFDS spectra of in-vivo tissue affords the opportunity to evaluate whether the sparse spectral content of these imagers can (1) account for all sources of optical contrast present (completeness) and (2) robustly separate and quantify sources of optical contrast (crosstalk). We validate the approach over a range of tissue-simulating phantoms, comparing the SFDS-based emulated spectra against measurements from an independently characterized multispectral imager. Emulated results match the imager across all phantoms (<3% in absorption, <1% in reduced scattering). In-vivo test cases (burn wounds and photoaging) illustrate how SFDS can be used to evaluate different multispectral imagers. This approach provides an in-vivo measurement method to evaluate the performance of multispectral imagers specific to their targeted clinical applications and can assist in the design and optimization of new spectral imaging devices.
An Instantaneous Low-Cost Point-of-Care Anemia Detection Device
Punter-Villagrasa, Jaime; Cid, Joan; Páez-Avilés, Cristina; Rodríguez-Villarreal, Ivón; Juanola-Feliu, Esteve; Colomer-Farrarons, Jordi; Miribel-Català, Pere Ll.
2015-01-01
We present a small, compact, and portable device for instantaneous point-of-care early detection of anemia. The method is based on direct hematocrit measurement from whole-blood samples by means of impedance analysis. The device consists of custom electronic instrumentation and a plug-and-play disposable sensor. The electronics follow straightforward low-power design practices, yielding a robust, low-consumption device that is fully mobile with a long battery life; alternatively, the batteries could be eliminated by powering the system with indoor solar cells or other energy-harvesting solutions. The sensing system is based on a disposable, low-cost, label-free, three-gold-electrode commercial sensor for 50 μL blood samples. The device's capability for anemia detection was validated on 24 blood samples obtained from four patients hospitalized at Hospital Clínic. The response, effectiveness, and robustness of the portable point-of-care device for detecting anemia were demonstrated, with an accuracy error of 2.83% and a mean coefficient of variation of 2.57%, with no individual case above 5%. PMID:25690552
Design and Validation of a Low-Cost Portable Device to Quantify Postural Stability.
Zhu, Yong
2017-03-18
Measurement of the displacement of the center-of-pressure (COP) is an important tool used in biomechanics to assess postural stability and human balance. The goal of this research was to design and validate a low-cost portable device that can offer a quick indication of the state of postural stability and human balance related conditions. Approximate entropy (ApEn) values reflecting the amount of irregularity hiding in COP oscillations were used to calculate the index. The prototype adopted a portable design using the measurements of the load cells located at the four corners of a low-cost force platform. The test subject was asked to stand on the device in a quiet, normal, upright stance for 30 s with eyes open and subsequently for 30 s with eyes closed. Based on the COP displacement signals, the ApEn values were calculated. The results indicated that the prototype device was capable of capturing the increase in regularity of postural control in the visual-deprivation conditions. It was also able to decipher the subtle postural control differences along anterior-posterior and medial-lateral directions. The data analysis demonstrated that the prototype would enable the quantification of postural stability and thus provide a low-cost portable device to assess many conditions related to postural stability and human balance such as aging and pathologies.
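Approximate entropy itself is a standard statistic, so the index described above can be illustrated directly. A compact reference implementation follows, with the embedding length m and tolerance r chosen for unit-scale signals rather than taken from the study:

```python
import math
import random

# Approximate entropy (ApEn) of a 1-D signal, the regularity statistic used
# above for COP displacements. m = embedding length; r = tolerance (a common
# choice is 0.2 x the signal SD; here fixed at 0.2 for unit-scale signals).

def apen(signal, m=2, r=0.2):
    def phi(m):
        n = len(signal) - m + 1
        templates = [signal[i:i + m] for i in range(n)]
        logs = []
        for a in templates:
            # count templates within Chebyshev distance r (self-match included)
            c = sum(max(abs(x - y) for x, y in zip(a, b)) <= r for b in templates)
            logs.append(math.log(c / n))
        return sum(logs) / n
    return phi(m) - phi(m + 1)

random.seed(0)
regular = [math.sin(0.4 * i) for i in range(200)]       # smooth, predictable
noisy = [random.uniform(-1, 1) for _ in range(200)]     # irregular

print(apen(regular) < apen(noisy))  # True: more irregularity gives larger ApEn
```

Lower ApEn means more regular oscillations, which is why the study reads increased regularity under visual deprivation directly off this index.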
42 CFR 441.472 - Budget methodology.
Code of Federal Regulations, 2011 CFR
2011-10-01
... based utilizing valid, reliable cost data. (2) The State's method is applied consistently to participants. (3) The State's method is open for public inspection. (4) The State's method includes a...
42 CFR 441.472 - Budget methodology.
Code of Federal Regulations, 2013 CFR
2013-10-01
... based utilizing valid, reliable cost data. (2) The State's method is applied consistently to participants. (3) The State's method is open for public inspection. (4) The State's method includes a...
Sullivan, Clair J.
2016-01-01
The use of radiation detectors as elements in the so-called "Internet of Things" has recently become viable with the availability of low-cost, mobile radiation sensors capable of streaming geo-referenced data. New methods for fusing the data from multiple sensors on such a network are presented. The traditional simple and ordinary Kriging methods present a challenge for such a network, since the assumption of a constant mean is not valid in this application. Finally, a variety of Kalman filters is introduced in an attempt to address this variable and unknown mean. Results are presented on a deployed sensor network.
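A scalar Kalman filter for a drifting mean, the kind of tracker alluded to above, can be sketched as follows. The process and measurement variances and the count-rate data are illustrative only, not parameters from the work:

```python
# Scalar Kalman filter tracking the spatially varying mean count rate of a
# mobile radiation sensor, the setting above where constant-mean Kriging
# fails. q and r (process and measurement variances) are illustrative.

def kalman_track(measurements, q=0.5, r=4.0):
    """q: how fast the true mean is allowed to drift; r: measurement noise."""
    est, var = measurements[0], r
    track = [est]
    for z in measurements[1:]:
        var += q                      # predict: the mean may have drifted
        k = var / (var + r)           # Kalman gain
        est += k * (z - est)          # update toward the new reading
        var *= (1 - k)
        track.append(est)
    return track

# background of ~10 counts/s with a source encountered mid-route
readings = [10, 9, 11, 10, 25, 27, 26, 28, 12, 10, 9]
track = kalman_track(readings)
print(track[7])  # estimate has climbed well above background toward the plateau
```

A nonzero `q` is exactly what lets the filter follow a variable mean; setting `q = 0` would recover the constant-mean assumption that breaks the Kriging approach.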
Pediatric laryngeal simulator using 3D printed models: A novel technique.
Kavanagh, Katherine R; Cote, Valerie; Tsui, Yvonne; Kudernatsch, Simon; Peterson, Donald R; Valdez, Tulio A
2017-04-01
Simulation to acquire and test technical skills is an essential component of medical education and residency training in both surgical and nonsurgical specialties. High-quality simulation education relies on the availability, accessibility, and reliability of models. The objective of this work was to describe a practical pediatric laryngeal model for use in otolaryngology residency training. Ideally, this model would be low-cost, have tactile properties resembling human tissue, and be reliably reproducible. Pediatric laryngeal models were developed using two manufacturing methods: direct three-dimensional (3D) printing of anatomical models and casted anatomical models using 3D-printed molds. Polylactic acid, acrylonitrile butadiene styrene, and high-impact polystyrene (HIPS) were used for the directly printed models, whereas a silicone elastomer (SE) was used for the casted models. The models were evaluated for anatomic quality, ease of manipulation, hardness, and cost of production. A tissue likeness scale was created to validate the simulation model. Fleiss' Kappa rating was performed to evaluate interrater agreement, and analysis of variance was performed to evaluate differences among the materials. The SE provided the most anatomically accurate models, with the tactile properties allowing for surgical manipulation of the larynx. Direct 3D printing was more cost-effective than the SE casting method but did not possess the material properties and tissue likeness necessary for surgical simulation. The SE models of the pediatric larynx created from a casting method demonstrated high quality anatomy, tactile properties comparable to human tissue, and easy manipulation with standard surgical instruments. Their use in a reliable, low-cost, accessible, modular simulation system provides a valuable training resource for otolaryngology residents. N/A. Laryngoscope, 127:E132-E137, 2017. © 2016 The American Laryngological, Rhinological and Otological Society, Inc.
Soares, Karina Lotz; Cerqueira, Maristela Barnes Rodrigues; Caldas, Sergiane Souza; Primel, Ednei Gilberto
2017-09-01
This study describes the development, optimization and validation of a method for the extraction of 15 pesticides of different chemical classes from drinking water treatment sludge (DWTS) by vortex-assisted matrix solid-phase dispersion (MSPD), with determination by gas chromatography coupled to mass spectrometry. It focused on applying alternative solid supports in the extraction step of the MSPD. The main parameters influencing the extraction were studied in order to obtain better recoveries. Recoveries ranged from 70 to 120% with RSDs below 20% for all analytes. Limits of quantification (LOQ) of the method ranged from 5 to 500 μg kg -1, whereas the analytical curves showed correlation coefficients above 0.997. The method used a low volume of solvent (5 mL), a low sample mass (1.5 g) and a low mass of chitin (0.5 g), an environmentally friendly support. It has advantages, such as speed, simplicity and low-cost materials, over other methods. When the method was applied, 4 out of 15 pesticides were detected in the DWTS samples at concentrations below the LOQ. Copyright © 2017 Elsevier Ltd. All rights reserved.
Two-Method Planned Missing Designs for Longitudinal Research
ERIC Educational Resources Information Center
Garnier-Villarreal, Mauricio; Rhemtulla, Mijke; Little, Todd D.
2014-01-01
We examine longitudinal extensions of the two-method measurement design, which uses planned missingness to optimize cost-efficiency and validity of hard-to-measure constructs. These designs use a combination of two measures: a "gold standard" that is highly valid but expensive to administer, and an inexpensive (e.g., survey-based)…
Nathan, Dominic E; Johnson, Michelle J; McGuire, John R
2009-01-01
Hand and arm impairment is common after stroke. Robotic stroke therapy will be more effective if hand and upper-arm training is integrated to help users practice reaching and grasping tasks. This article presents the design, development, and validation of a low-cost, functional electrical stimulation grasp-assistive glove for use with task-oriented robotic stroke therapy. Our glove measures grasp aperture while a user completes simple-to-complex real-life activities, and when combined with an integrated functional electrical stimulator, it assists in hand opening and closing. A key function is a new grasp-aperture prediction model, which uses the position of the end-effectors of two planar robots to define the distance between the thumb and index finger. We validated the accuracy and repeatability of the glove and its capability to assist in grasping. Results from five nondisabled subjects indicated that the glove is accurate and repeatable for both static hand-open and -closed tasks when compared with goniometric measures and for dynamic reach-to-grasp tasks when compared with motion analysis measures. Results from five subjects with stroke showed that with the glove, they could open their hands but without it could not. We present a glove that is a low-cost solution for in vivo grasp measurement and assistance.
Goetzel, Ron Z; Henke, Rachel Mosher; Benevent, Richele; Tabrizi, Maryam J; Kent, Karen B; Smith, Kristyn J; Roemer, Enid Chung; Grossmeier, Jessica; Mason, Shawn T; Gold, Daniel B; Noeldner, Steven P; Anderson, David R
2014-02-01
To determine the ability of the Health Enhancement Research Organization (HERO) Scorecard to predict changes in health care expenditures. Individual employee health care insurance claims data for 33 organizations completing the HERO Scorecard from 2009 to 2011 were linked to employer responses to the Scorecard. Organizations were dichotomized into "high" versus "low" scoring groups and health care cost trends were compared. A secondary analysis examined the tool's ability to predict health risk trends. "High" scorers experienced significant reductions in inflation-adjusted health care costs (averaging an annual trend of -1.6% over 3 years) compared with "low" scorers whose cost trend remained stable. The risk analysis was inconclusive because of the small number of employers scoring "low." The HERO Scorecard predicts health care cost trends among employers. More research is needed to determine how well it predicts health risk trends for employees.
NASA Astrophysics Data System (ADS)
Trivailo, O.; Sippel, M.; Şekercioğlu, Y. A.
2012-08-01
The primary purpose of this paper is to review currently existing cost estimation methods, models, tools and resources applicable to the space sector. While key space sector methods are outlined, a specific focus is placed on hardware cost estimation at the system level, particularly for early mission phases during which specifications and requirements are not yet crystallised and information is limited. For the space industry, cost engineering within the systems engineering framework is an integral discipline. The cost of any space program now constitutes a stringent design criterion, which must be considered and carefully controlled during the entire program life cycle. A first step in any program budget is a representative cost estimate, which usually hinges on a particular estimation approach or methodology. Appropriate selection of specific cost models, methods and tools is therefore paramount, a difficult task given the highly variable nature, scope, and scientific and technical requirements of each program. Numerous methods, models and tools exist, but new ways are needed to address very early, pre-Phase 0 cost estimation during the initial program research and establishment phase, when system specifications are limited but the available research budget needs to be established and defined. For highly specific vehicles such as reusable launchers with a manned capability, the lack of historical data limits, by definition, both the classic heuristic approach (e.g., parametric cost estimation based on underlying CERs) and the analogy approach. This review identifies prominent cost estimation models applied to the space sector, and their underlying cost driving parameters and factors. Strengths, weaknesses, and suitability to specific mission types and classes are also highlighted.
Current approaches that strategically combine various cost estimation strategies for both the formulation and the validation of an estimate, together with techniques and/or methods for attaining representative and justifiable cost estimates, are consequently discussed. Ultimately, the aim of the paper is to establish a baseline for the development of a non-commercial, low-cost, transparent cost estimation methodology to be applied during very early program research phases at a complete vehicle system level, for largely unprecedented manned launch vehicles in the future. This paper takes the first step towards achieving this through the identification, analysis and understanding of established, existing techniques, models, tools and resources relevant within the space sector.
NASA Astrophysics Data System (ADS)
Sánchez, H. R.; Pis Diez, R.
2016-04-01
Based on the Aλ diagnostic for multireference effects recently proposed [U.R. Fogueri, S. Kozuch, A. Karton, J.M. Martin, Theor. Chem. Acc. 132 (2013) 1], a simple method for improving total atomization energies and reaction energies calculated at the CCSD level of theory is proposed. The method requires a CCSD calculation and two additional density functional theory calculations for the molecule. Two sets containing 139 and 51 molecules are used as training and validation sets, respectively, for total atomization energies. An appreciable decrease in the mean absolute error from 7-10 kcal mol-1 for CCSD to about 2 kcal mol-1 for the present method is observed. The present method provides atomization energies and reaction energies that compare favorably with relatively recent scaled CCSD methods.
Brooker, Simon; Kabatereine, Narcis B.; Myatt, Mark; Stothard, J. Russell; Fenwick, Alan
2007-01-01
Rapid and accurate identification of communities at highest risk of morbidity from schistosomiasis is key for sustainable control. Although school questionnaires can effectively and inexpensively identify communities with a high prevalence of Schistosoma haematobium, parasitological screening remains the preferred option for S. mansoni. To help reduce screening costs, we investigated the validity of Lot Quality Assurance Sampling (LQAS) in classifying schools according to categories of S. mansoni prevalence in Uganda, and explored its applicability and cost-effectiveness. First, we evaluated several sampling plans using computer simulation and then field tested one sampling plan in 34 schools in Uganda. Finally, cost-effectiveness of different screening and control strategies (including mass treatment without prior screening) was determined, and sensitivity analysis undertaken to assess the effect of infection levels and treatment costs. In identifying schools with prevalence ≥50%, computer simulations showed that LQAS had high levels of sensitivity and specificity (>90%) at sample sizes <20. The method also provides an ability to classify communities into three prevalence categories. Field testing showed that LQAS where 15 children were sampled had excellent diagnostic performance (sensitivity: 100%, specificity: 96.4%, positive predictive value: 85.7% and negative predictive value: 92.3%). Screening using LQAS was more cost-effective than mass treating all schools (US$ 218 vs. US$ 482 / high prevalence school treated). Threshold analysis indicated that parasitological screening and mass treatment would become equivalent for settings where prevalence exceeds 50% in 75% of schools and for treatment costs of US$ 0.19 per schoolchild. We conclude that, in Uganda, LQAS provides a rapid, valid, and cost-effective method for guiding decision makers in allocating finite resources for the control of schistosomiasis. PMID:15960703
Brooker, Simon; Kabatereine, Narcis B; Myatt, Mark; Russell Stothard, J; Fenwick, Alan
2005-07-01
Rapid and accurate identification of communities at highest risk of morbidity from schistosomiasis is key for sustainable control. Although school questionnaires can effectively and inexpensively identify communities with a high prevalence of Schistosoma haematobium, parasitological screening remains the preferred option for S. mansoni. To help reduce screening costs, we investigated the validity of Lot Quality Assurance Sampling (LQAS) in classifying schools according to categories of S. mansoni prevalence in Uganda, and explored its applicability and cost-effectiveness. First, we evaluated several sampling plans using computer simulation and then field tested one sampling plan in 34 schools in Uganda. Finally, cost-effectiveness of different screening and control strategies (including mass treatment without prior screening) was determined, and sensitivity analysis undertaken to assess the effect of infection levels and treatment costs. In identifying schools with prevalences ≥50%, computer simulations showed that LQAS had high levels of sensitivity and specificity (>90%) at sample sizes <20. The method also provides an ability to classify communities into three prevalence categories. Field testing showed that LQAS where 15 children were sampled had excellent diagnostic performance (sensitivity: 100%, specificity: 96.4%, positive predictive value: 85.7% and negative predictive value: 92.3%). Screening using LQAS was more cost-effective than mass treating all schools (US$218 vs. US$482/high prevalence school treated). Threshold analysis indicated that parasitological screening and mass treatment would become equivalent for settings where prevalence is ≥50% in 75% of schools and for treatment costs of US$0.19 per schoolchild. We conclude that, in Uganda, LQAS provides a rapid, valid and cost-effective method for guiding decision makers in allocating finite resources for the control of schistosomiasis.
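The LQAS classification logic described in the two records above is simple enough to simulate directly. The sketch below uses the field-tested sample size of 15 children, but the decision threshold of 8 positives and the prevalence scenarios are illustrative assumptions, not the study's actual sampling plan:

```python
import random

def lqas_flags_high(n_positive, threshold=8):
    """Classify a school as 'high prevalence' when the number of infected
    children in the sample reaches the decision threshold."""
    return n_positive >= threshold

def flag_probability(prevalence, sample_size=15, threshold=8,
                     trials=20000, seed=0):
    """Probability that LQAS flags a school with the given true prevalence,
    estimated by simulating repeated random samples of children."""
    rng = random.Random(seed)
    flagged = 0
    for _ in range(trials):
        positives = sum(rng.random() < prevalence for _ in range(sample_size))
        flagged += lqas_flags_high(positives, threshold)
    return flagged / trials

# Sensitivity against a clearly high-prevalence school (70% infected) and
# specificity against a clearly low-prevalence school (30% infected)
sensitivity = flag_probability(0.70)
specificity = 1.0 - flag_probability(0.30)
```

With well-separated prevalence scenarios like these, both operating characteristics exceed 0.9, matching the simulation findings reported in the abstract.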
Hessel, F P; Wittmann, M; Petro, W; Wasem, J
2000-07-01
Studies in health economics, especially economic evaluations of health care technologies and programmes, are becoming increasingly important. However, in Germany there are no established, validated and commonly used instruments for the costing process. For the economic evaluation of a rehabilitation programme for patients with chronic lung diseases such as asthma and chronic bronchitis, we developed methods for the identification, measurement and validation of resource use during the inpatient rehabilitation programme and during the outpatient follow-up period. These methods are based on methodological considerations as well as on practical experience from conducting a pilot study. With regard to the inpatient setting, all relevant diagnostic and therapeutic resource uses could be measured based on routine clinical documentation and validated against the cost accounting of the clinic. For measuring the use of resources during the follow-up period in an outpatient setting, no reliable administrative data are accessible. Hence, we compared a standardised retrospective patient questionnaire used in a 20-minute interview (n = 50) with a cost diary kept continuously by the patient over a period of 4 weeks (n = 50). Both tools were useful for measuring all relevant resource uses in sufficient detail, but because of higher participation rates and lower dropout the structured interview appears to be more suitable. Average total costs per month were 1591 DM for the interview and 1867 DM for the cost diary. Besides productivity loss, medication and GP visits accounted for the largest resource uses. Practicable instruments were developed for the costing process as part of an economic evaluation in a German rehabilitation setting for pulmonary diseases. After individual modification, these could also be used for different indications and in other institutional settings.
The effectiveness of physical models in teaching anatomy: a meta-analysis of comparative studies.
Yammine, Kaissar; Violato, Claudio
2016-10-01
There are various educational methods used in anatomy teaching. While three-dimensional (3D) visualization technologies are gaining ground due to their ever-increasing realism, physical models, as a low-cost traditional 3D method, remain the subject of considerable interest. The aim of this meta-analysis is to quantitatively assess the effectiveness of such models based on comparative studies. Eight studies (7 randomized trials; 1 quasi-experimental) including 16 comparison arms and 820 learners met the inclusion criteria. Primary outcomes were defined as factual, spatial and overall percentage scores. Educational methods using physical models yielded significantly better results than all other educational methods for the overall knowledge outcome (p < 0.001) and for spatial knowledge acquisition (p < 0.001). Significantly better results were also found for long-term knowledge retention (p < 0.01). No significant difference was found for the factual knowledge acquisition outcome. The evidence in the present systematic review was found to have high internal validity and at least acceptable strength. In conclusion, physical anatomical models offer a promising tool for teaching gross anatomy in 3D representation due to their easy accessibility and educational effectiveness. Such models could be a practical tool to raise learners' level of gross anatomy knowledge at low cost.
Zanelli, Ugo; Michna, Thomas; Petersson, Carl
2018-03-26
1. A novel method utilizing an internal standard in hepatocyte incubations has been developed and demonstrated to decrease the variability in the determination of intrinsic clearance (CLint) in this system. The reduced variability was shown to allow differentiation of lower elimination rate constants from noise. 2. The suggested method was able to compensate for a small but systematic error (0.5 µL/min/10^6 cells) caused by evaporation of approximately 15% of the volume during the incubation time. 3. The approach was validated using six commercial drugs (ketoprofen, tolbutamide, phenacetin, etodolac and quinidine) which were metabolized by different pathways. 4. The suggested internal standard, MSC1815677, was extensively characterized and the acquired data suggest that it fulfills the requirements of an internal standard present during the incubation. The proposed internal standard was stable during the incubation and showed a low potential to inhibit drug metabolizing enzymes and transporters. With MSC1815677 we propose a novel, simple, robust and cost-effective method to address the challenges in the estimation of low clearance in hepatocyte incubations.
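The record above does not give the authors' equations, but the arithmetic behind an in-incubation internal standard can be sketched: evaporation concentrates analyte and internal standard equally, so the analyte/IS response ratio tracks the analyte amount alone, and the slope of its log versus time gives the first-order elimination rate constant. A hedged illustration (the function names and volume convention are assumptions, not the published method):

```python
import math

def elimination_rate(times_min, analyte_resp, istd_resp):
    """Least-squares slope of ln(analyte/IS response ratio) versus time.
    The ratio cancels evaporation: both species are concentrated equally
    as volume is lost, so only metabolic depletion remains."""
    y = [math.log(a / s) for a, s in zip(analyte_resp, istd_resp)]
    n = len(times_min)
    t_mean = sum(times_min) / n
    y_mean = sum(y) / n
    slope = (sum((t - t_mean) * (yi - y_mean) for t, yi in zip(times_min, y))
             / sum((t - t_mean) ** 2 for t in times_min))
    return -slope  # first-order rate constant, 1/min

def intrinsic_clearance(k_per_min, incubation_vol_uL, million_cells):
    """CLint in uL/min/10^6 cells: rate constant times incubation volume
    per million hepatocytes."""
    return k_per_min * incubation_vol_uL / million_cells
```

Without the ratio, the raw analyte signal would rise as the volume shrinks, biasing the apparent clearance low, which is the systematic error the internal standard corrects.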
Evaluation on Cost Overrun Risks of Long-distance Water Diversion Project Based on SPA-IAHP Method
NASA Astrophysics Data System (ADS)
Yuanyue, Yang; Huimin, Li
2018-02-01
Large investment, long routes and many change orders are among the main causes of cost overrun in long-distance water diversion projects. Building on existing research, this paper constructs a full-process cost overrun risk evaluation index system for water diversion projects, applies the SPA-IAHP method to set up a cost overrun risk evaluation model, and calculates and ranks the weight of every risk evaluation index. Finally, the cost overrun risks are comprehensively evaluated by calculating the linkage measure, and a comprehensive risk level is obtained. The SPA-IAHP method can evaluate risks accurately and with high reliability. As verified by a case calculation, it can provide valid cost overrun decision-making information to construction companies.
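The IAHP weighting step mentioned above builds on classical AHP. As a hedged sketch (the interval extension and the SPA linkage measure are not shown, and the pairwise judgments below are invented for illustration), priority weights can be approximated from a reciprocal comparison matrix by row geometric means:

```python
def ahp_weights(pairwise):
    """Row geometric-mean approximation of AHP priority weights from a
    reciprocal pairwise-comparison matrix (pairwise[i][j] = importance of
    criterion i relative to criterion j)."""
    n = len(pairwise)
    geo_means = []
    for row in pairwise:
        prod = 1.0
        for a in row:
            prod *= a
        geo_means.append(prod ** (1.0 / n))
    total = sum(geo_means)
    return [g / total for g in geo_means]

# Three illustrative cost-overrun risk criteria, judged pairwise
weights = ahp_weights([
    [1.0, 2.0, 4.0],   # e.g. change orders
    [0.5, 1.0, 2.0],   # e.g. route length
    [0.25, 0.5, 1.0],  # e.g. investment scale
])
```

For a perfectly consistent matrix like this one, the geometric-mean weights coincide with the principal-eigenvector weights (here 4/7, 2/7, 1/7).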
Tensor-based dynamic reconstruction method for electrical capacitance tomography
NASA Astrophysics Data System (ADS)
Lei, J.; Mu, H. P.; Liu, Q. B.; Li, Z. H.; Liu, S.; Wang, X. Y.
2017-03-01
Electrical capacitance tomography (ECT) is an attractive visualization measurement method, in which the acquisition of high-quality images is beneficial for the understanding of the underlying physical or chemical mechanisms of the dynamic behaviors of the measurement objects. In real-world measurement environments, imaging objects are often in a dynamic process, and the exploitation of the spatial-temporal correlations related to the dynamic nature will contribute to improving the imaging quality. Different from existing imaging methods that are often used in ECT measurements, in this paper a dynamic image sequence is stacked into a third-order tensor that consists of a low rank tensor and a sparse tensor within the framework of the multiple measurement vectors model and the multi-way data analysis method. The low rank tensor models the similar spatial distribution information among frames, which is slowly changing over time, and the sparse tensor captures the perturbations or differences introduced in each frame, which is rapidly changing over time. With the assistance of the Tikhonov regularization theory and the tensor-based multi-way data analysis method, a new cost function, with the considerations of the multi-frames measurement data, the dynamic evolution information of a time-varying imaging object and the characteristics of the low rank tensor and the sparse tensor, is proposed to convert the imaging task in the ECT measurement into a reconstruction problem of a third-order image tensor. An effective algorithm is developed to search for the optimal solution of the proposed cost function, and the images are reconstructed via a batching pattern. The feasibility and effectiveness of the developed reconstruction method are numerically validated.
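The low-rank + sparse split described above is closely related to robust principal component analysis. The sketch below is a generic inexact augmented-Lagrangian scheme on a matricized frame stack, not the authors' Tikhonov-regularized tensor algorithm: it separates a pixels-by-frames matrix into a slowly varying low-rank part and a sparse per-frame perturbation part:

```python
import numpy as np

def lowrank_sparse_split(M, lam=None, iters=100, rho=1.5):
    """Split M (pixels x frames) into L (low rank: spatial structure shared
    across frames) plus S (sparse: per-frame perturbations) via an inexact
    augmented-Lagrangian robust-PCA iteration."""
    m, n = M.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))  # common default weight
    sigma1 = np.linalg.norm(M, 2)
    Y = M / max(sigma1, np.abs(M).max() / lam)  # scaled dual variable
    mu = 1.25 / sigma1
    mu_max = mu * 1e7
    S = np.zeros_like(M)
    for _ in range(iters):
        # low-rank update: singular-value thresholding
        U, s, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = U @ np.diag(np.maximum(s - 1.0 / mu, 0.0)) @ Vt
        # sparse update: entrywise soft thresholding
        R = M - L + Y / mu
        S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)
        # dual ascent on the constraint M = L + S
        Y = Y + mu * (M - L - S)
        mu = min(rho * mu, mu_max)
    return L, S
```

On synthetic data (a low-rank background plus a few large sparse spikes), this split recovers both components closely, which is the intuition behind exploiting spatial-temporal correlations in dynamic reconstruction.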
Mohammad Abdulghani, Hamza; G Ponnamperuma, Gominda; Ahmad, Farah; Amin, Zubair
2014-03-01
To evaluate the assessment system of the 'Research Methodology Course' using utility criteria (i.e., validity, reliability, acceptability, educational impact, and cost-effectiveness). This study demonstrates a comprehensive evaluation of an assessment system and suggests a framework for similar courses. Qualitative and quantitative methods were used to evaluate the course assessment components (50 MCQs, 3 short answer questions (SAQs) and a research project) against the utility criteria. Results of multiple evaluation methods for all the assessment components were collected and interpreted together to arrive at holistic judgments, rather than judgments based on individual methods or individual assessments. Face validity, evaluated using a self-administered questionnaire (response rate 88.7%), disclosed that the students perceived an imbalance in the contents covered by the assessment. This was confirmed by the assessment blueprint. Construct validity was affected by the low correlation between MCQ and SAQ scores (r=0.326). There was a higher correlation between the project and MCQ (r=0.466)/SAQ (r=0.463) scores. Construct validity was also affected by the presence of recall-type MCQs (70%; 35/50), item construction flaws and non-functioning distractors. High discrimination indices (>0.35) were found in MCQs with moderate difficulty indices (0.3-0.7). Reliability of the MCQs was 0.75, which could be improved to 0.8 by increasing the number of MCQs to at least 70. A positive educational impact was found in the form of the research project assessment driving students to present/publish their work in conferences/peer-reviewed journals. Cost per student to complete the course was US$164.50. The multi-modal evaluation of an assessment system is feasible and provides thorough and diagnostic information. Utility of the assessment system could be further improved by modifying the psychometrically inappropriate assessment items.
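The reliability projection in the abstract above (0.75 with 50 MCQs rising to about 0.8 with at least 70) is the standard Spearman-Brown prophecy calculation, sketched below for verification:

```python
def spearman_brown(reliability, length_factor):
    """Predicted reliability after lengthening a test by `length_factor`
    (Spearman-Brown prophecy formula)."""
    return (length_factor * reliability
            / (1.0 + (length_factor - 1.0) * reliability))

# 50 items at reliability 0.75, lengthened to 70 items (factor 1.4)
predicted = spearman_brown(0.75, 70 / 50)  # about 0.81
```

The result, roughly 0.81, is consistent with the abstract's claim that 70 items would bring reliability up to 0.8.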
Lip, Gregory Y H; Lanitis, Tereza; Kongnakorn, Thitima; Phatak, Hemant; Chalkiadaki, Corina; Liu, Xianchen; Kuznik, Andreas; Lawrence, Jack; Dorian, Paul
2015-11-01
The purpose of this analysis was to assess the cost-effectiveness of apixaban 5 mg BID versus high- and low-dose edoxaban (60 mg and 30 mg once daily) as intended starting dose strategies for stroke prevention in patients from a UK National Health Service perspective. A previously developed and validated Markov model was adapted to evaluate the lifetime clinical and economic impact of apixaban 5 mg BID versus edoxaban (high and low dose) in patients with nonvalvular atrial fibrillation. A pairwise indirect treatment comparison was conducted for clinical end points, and price parity was assumed between apixaban and edoxaban. Costs in 2012 British pounds, life-years, and quality-adjusted life-years (QALYs) gained, discounted at 3.5% per annum, were estimated. Apixaban was predicted to increase life expectancy and QALYs versus both low- and high-dose edoxaban. These gains were achieved at a cost saving versus low-dose edoxaban (making apixaban dominant) and at a nominal increase in costs versus high-dose edoxaban. The incremental cost-effectiveness ratio of apixaban versus high-dose edoxaban was £6763 per QALY gained. Apixaban was deemed to be dominant (less costly and more effective) versus low-dose edoxaban and a cost-effective alternative to high-dose edoxaban. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
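The dominance and ICER language in the abstract above reduces to simple incremental arithmetic; a sketch with invented numbers (not the study's cost or QALY inputs):

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained.
    Returns the string 'dominant' when the new strategy is both cheaper
    and more effective, in which case no ratio is reported."""
    d_cost = cost_new - cost_old
    d_qaly = qaly_new - qaly_old
    if d_cost <= 0 and d_qaly > 0:
        return "dominant"
    return d_cost / d_qaly

# Illustrative only: slightly costlier but more effective -> a ratio
ratio = icer(20500.0, 20000.0, 8.10, 8.02)
# Illustrative only: cheaper and more effective -> dominant
verdict = icer(19500.0, 20000.0, 8.10, 8.02)
```

A strategy like apixaban versus low-dose edoxaban in the study falls into the "dominant" branch, while the comparison against high-dose edoxaban yields a finite cost-per-QALY ratio.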
Validity of the Kinect for Gait Assessment: A Focused Review
Springer, Shmuel; Yogev Seligmann, Galit
2016-01-01
Gait analysis may enhance clinical practice. However, its use is limited due to the need for expensive equipment which is not always available in clinical settings. Recent evidence suggests that Microsoft Kinect may provide a low-cost gait analysis method. The purpose of this report is to critically evaluate the literature describing the concurrent validity of using the Kinect as a gait analysis instrument. An online search of PubMed, CINAHL, and ProQuest databases was performed. Included were studies in which walking was assessed with both the Kinect and another, gold-standard device, and which reported at least one numerical finding of spatiotemporal or kinematic measures. Our search identified 366 papers, from which 12 relevant studies were retrieved. The results demonstrate that the Kinect is valid only for some spatiotemporal gait parameters. Although the kinematic parameters measured by the Kinect followed the trend of the joint trajectories, they showed poor validity and large errors. In conclusion, the Kinect may have the potential to be used as a tool for measuring spatiotemporal aspects of gait, yet standardized methods should be established, and future examinations with both healthy subjects and clinical participants are required in order to integrate the Kinect as a clinical gait analysis tool. PMID:26861323
Evaluating the Reliability, Validity, and Usefulness of Education Cost Studies
ERIC Educational Resources Information Center
Baker, Bruce D.
2006-01-01
Recent studies that purport to estimate the costs of constitutionally adequate education have been described as either a "gold standard" that should guide legislative school finance policy design and judicial evaluation, or as pure "alchemy." Methods for estimating the cost of constitutionally adequate education can be roughly…
Validation of a RANS transition model using a high-order weighted compact nonlinear scheme
NASA Astrophysics Data System (ADS)
Tu, GuoHua; Deng, XiaoGang; Mao, MeiLiang
2013-04-01
A modified transition model is given based on the shear stress transport (SST) turbulence model and an intermittency transport equation. The energy gradient term in the original model is replaced by the flow strain rate to save computational cost. The model employs local variables only, so it can be conveniently implemented in modern computational fluid dynamics codes. The fifth-order weighted compact nonlinear scheme and the fourth-order staggered scheme are applied to discretize the governing equations in order to minimize discretization errors and thereby mitigate the confusion between numerical errors and transition model errors. The high-order package is compared with a second-order TVD method on simulating the transitional flow over a flat plate. Numerical results indicate that the high-order package gives better grid convergence properties than the second-order method. Validation of the transition model is performed for transitional flows ranging from low speed to hypersonic speed.
Multisensor system for toxic gases detection generated on indoor environments
NASA Astrophysics Data System (ADS)
Durán, C. M.; Monsalve, P. A. G.; Mosquera, C. J.
2016-11-01
This work describes a wireless multisensor system for the detection of toxic gases generated in indoor environments (e.g., underground coal mines). The artificial multisensor system proposed in this study was developed using a set of six low-cost chemical gas sensors (MQ series) with overlapping sensitivities to detect hazardous gases in the air. A statistical parameter was applied to the data set, and two pattern recognition methods, Principal Component Analysis (PCA) and Discriminant Function Analysis (DFA), were used for feature selection. The toxic gas categories were then classified with a Probabilistic Neural Network (PNN) in order to validate the results previously obtained. Tests were carried out to verify the feasibility of the application through a wireless communication model that allowed the sensor signals to be monitored and stored for the appropriate analysis. The success rate in discriminating the measurements was 100%, using an artificial neural network with leave-one-out as the cross-validation method.
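The leave-one-out validation reported above can be sketched generically. The nearest-centroid classifier below is a deliberately simple stand-in for the study's Probabilistic Neural Network, and the sensor-response vectors used to exercise it are invented:

```python
def leave_one_out_accuracy(samples, labels, classify):
    """Leave-one-out cross-validation: train on every sample but one,
    predict the held-out sample, and repeat for each sample in turn."""
    correct = 0
    for i in range(len(samples)):
        train_x = samples[:i] + samples[i + 1:]
        train_y = labels[:i] + labels[i + 1:]
        correct += classify(train_x, train_y, samples[i]) == labels[i]
    return correct / len(samples)

def nearest_centroid(train_x, train_y, query):
    """Stand-in for the PNN: assign the class whose mean sensor-response
    vector is closest (Euclidean distance) to the query vector."""
    grouped = {}
    for x, y in zip(train_x, train_y):
        grouped.setdefault(y, []).append(x)
    centroids = {y: [sum(col) / len(xs) for col in zip(*xs)]
                 for y, xs in grouped.items()}
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
    return min(centroids, key=lambda y: dist(centroids[y], query))
```

Because every sample is held out exactly once, leave-one-out uses the small data sets typical of gas-sensor experiments efficiently, at the cost of one training run per sample.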
A study of methods for lowering aerial environmental survey cost
NASA Technical Reports Server (NTRS)
Stansberry, J. R.
1973-01-01
The results of a study of methods for lowering the cost of environmental aerial surveys are presented. A wide range of low-cost techniques was investigated for possible application to current pressing urban and rural problems. The objectives of the study were to define the technical problems associated with conducting aerial surveys using various low-cost techniques, to survey equipment which may be used in low-cost systems, and to establish preliminary estimates of cost. A set of candidate systems was selected and described for the environmental survey tasks.
Miciak, Jeremy; Fletcher, Jack M.; Stuebing, Karla; Vaughn, Sharon; Tolar, Tammy D.
2014-01-01
Purpose Few empirical investigations have evaluated LD identification methods based on a pattern of cognitive strengths and weaknesses (PSW). This study investigated the reliability and validity of two proposed PSW methods: the concordance/discordance method (C/DM) and cross battery assessment (XBA) method. Methods Cognitive assessment data for 139 adolescents demonstrating inadequate response to intervention was utilized to empirically classify participants as meeting or not meeting PSW LD identification criteria using the two approaches, permitting an analysis of: (1) LD identification rates; (2) agreement between methods; and (3) external validity. Results LD identification rates varied between the two methods depending upon the cut point for low achievement, with low agreement for LD identification decisions. Comparisons of groups that met and did not meet LD identification criteria on external academic variables were largely null, raising questions of external validity. Conclusions This study found low agreement and little evidence of validity for LD identification decisions based on PSW methods. An alternative may be to use multiple measures of academic achievement to guide intervention. PMID:24274155
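The "low agreement" finding in the record above is typically quantified with Cohen's kappa, which corrects raw percent agreement for the agreement expected by chance. A minimal sketch for binary met/not-met identification decisions (the study's actual agreement statistic is not given in this record):

```python
def cohens_kappa(decisions_a, decisions_b):
    """Cohen's kappa for two methods' binary decisions (1 = meets LD
    criteria, 0 = does not): observed agreement corrected for the
    agreement expected by chance."""
    n = len(decisions_a)
    p_obs = sum(a == b for a, b in zip(decisions_a, decisions_b)) / n
    p_a = sum(decisions_a) / n
    p_b = sum(decisions_b) / n
    p_chance = p_a * p_b + (1 - p_a) * (1 - p_b)
    return (p_obs - p_chance) / (1 - p_chance)
```

Kappa is 1 for perfect agreement and 0 when the two methods agree no more often than chance, which is why it is a stricter criterion than raw percent agreement for comparing identification methods.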
Validation of Underwater Sensor Package Using Feature Based SLAM
Cain, Christopher; Leonessa, Alexander
2016-01-01
Robotic vehicles working in new, unexplored environments must be able to locate themselves in the environment while constructing a picture of the objects in the environment that could act as obstacles that would prevent the vehicles from completing their desired tasks. In enclosed environments, underwater range sensors based off of acoustics suffer performance issues due to reflections. Additionally, their relatively high cost make them less than ideal for usage on low cost vehicles designed to be used underwater. In this paper we propose a sensor package composed of a downward facing camera, which is used to perform feature tracking based visual odometry, and a custom vision-based two dimensional rangefinder that can be used on low cost underwater unmanned vehicles. In order to examine the performance of this sensor package in a SLAM framework, experimental tests are performed using an unmanned ground vehicle and two feature based SLAM algorithms, the extended Kalman filter based approach and the Rao-Blackwellized, particle filter based approach, to validate the sensor package. PMID:26999142
McGuffey, James E.; Wei, Binnian; Bernert, John T.; Morrow, John C.; Xia, Baoyun; Wang, Lanqing; Blount, Benjamin C.
2014-01-01
Tobacco use is a major contributor to premature morbidity and mortality. The measurement of nicotine and its metabolites in urine is a valuable tool for evaluating nicotine exposure and for nicotine metabolic profiling, i.e., metabolite ratios. In addition, the minor tobacco alkaloids anabasine and anatabine can be useful for monitoring compliance in smoking cessation programs that use nicotine replacement therapy. Because of an increasing demand for the measurement of urinary nicotine metabolites, we developed a rapid, low-cost method that uses isotope dilution liquid chromatography-tandem mass spectrometry (LC-MS/MS) for simultaneously quantifying nicotine, six nicotine metabolites, and two minor tobacco alkaloids in smokers' urine. This method enzymatically hydrolyzes conjugated nicotine (primarily glucuronides) and its metabolites. We then use acetone pretreatment to precipitate matrix components (endogenous proteins, salts, phospholipids, and exogenous enzyme) that may interfere with LC-MS/MS analysis. Subsequently, analytes (nicotine, cotinine, hydroxycotinine, norcotinine, nornicotine, cotinine N-oxide, nicotine 1′-N-oxide, anatabine, and anabasine) are chromatographically resolved within a cycle time of 13.5 minutes. The optimized assay produces linear responses across the analyte concentrations typically found in urine collected from daily smokers. Because matrix ion suppression may influence accuracy, we include a discussion of conventions employed in this procedure to minimize matrix interferences. Simplicity, low cost, and low maintenance, combined with high mean metabolite recovery (76–99%), specificity, accuracy (0–10% bias) and reproducibility (2–9% C.V.), make this method ideal for large high-throughput studies. PMID:25013964
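The quantification arithmetic behind isotope-dilution LC-MS/MS as described above can be sketched generically: the analyte's peak area is referenced to a stable-isotope internal standard, and concentration is back-calculated through a linear calibration. The calibration values below are invented for illustration, not the paper's validated parameters:

```python
def concentration_from_ratio(analyte_area, istd_area, slope, intercept=0.0):
    """Back-calculate analyte concentration from the analyte/internal-
    standard peak-area ratio using a linear calibration:
    ratio = slope * concentration + intercept."""
    ratio = analyte_area / istd_area
    return (ratio - intercept) / slope

# Illustrative: area ratio 2.0 against an assumed calibration slope of
# 0.5 (ratio units per ng/mL) gives 4.0 ng/mL
conc = concentration_from_ratio(2.0e6, 1.0e6, slope=0.5)
```

Referencing to the internal standard is what makes the method robust to matrix ion suppression: both analyte and labeled standard are suppressed proportionally, so their ratio is largely unaffected.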
NASA Astrophysics Data System (ADS)
Bravo, Mikel; Angulo-Vinuesa, Xabier; Martin-Lopez, Sonia; Lopez-Amo, Manuel; Gonzalez-Herraez, Miguel
2013-05-01
High-Q resonators have been widely used for sensing purposes. High Q factors normally lead to sharp spectral peaks, which accordingly provide strong sensitivity in spectral interrogation methods. In this work we employ a low-Q ring resonator to develop a high-sensitivity, sub-micrometric resolution displacement sensor. We use the slow-light effects occurring close to the critical coupling regime to achieve high sensitivity in the device. By tuning the losses in the cavity close to the critical coupling, extremely high group delay variations can be achieved, which in turn introduce strong enhancements of the absorption of the structure. We first validate the concept using an Optical Vector Analyzer (OVA) and then propose a simple functional scheme for achieving low-cost interrogation of this kind of sensor.
NASA Astrophysics Data System (ADS)
Korany, Mohamed A.; Mahgoub, Hoda; Haggag, Rim S.; Ragab, Marwa A. A.; Elmallah, Osama A.
2018-06-01
A green, simple and cost effective chemometric UV-Vis spectrophotometric method has been developed and validated for correcting interferences that arise during conducting biowaiver studies. Chemometric manipulation has been done for enhancing the results of direct absorbance, resulting from very low concentrations (high incidence of background noise interference) of earlier points in the dissolution timing in case of dissolution profile using first and second derivative (D1 & D2) methods and their corresponding Fourier function convoluted methods (D1/FF& D2/FF). The method applied for biowaiver study of Donepezil Hydrochloride (DH) as a representative model was done by comparing two different dosage forms containing 5 mg DH per tablet as an application of a developed chemometric method for correcting interferences as well as for the assay and dissolution testing in its tablet dosage form. The results showed that first derivative technique can be used for enhancement of the data in case of low concentration range of DH (1-8 μg mL-1) in the three different pH dissolution media which were used to estimate the low drug concentrations dissolved at the early points in the biowaiver study. Furthermore, the results showed similarity in phosphate buffer pH 6.8 and dissimilarity in the other 2 pH media. The method was validated according to ICH guidelines and USP monograph for both assays (HCl of pH 1.2) and dissolution study in 3 pH media (HCl of pH 1.2, acetate buffer of pH 4.5 and phosphate buffer of pH 6.8). Finally, the assessment of the method greenness was done using two different assessment techniques: National Environmental Method Index label and Eco scale methods. Both techniques ascertained the greenness of the proposed method.
The future of targeted peptidomics.
Findeisen, Peter
2013-12-01
Targeted MS is becoming increasingly important for sensitive and specific quantitative detection of proteins and respective PTMs. In this article, Ceglarek et al. [Proteomics Clin. Appl. 2013, 7, 794-801] present an LC-MS-based method for simultaneous quantitation of seven apolipoproteins in serum specimens. The assay fulfills many necessities of routine diagnostic applications, namely, low cost, high throughput, and good reproducibility. We anticipate that validation of new biomarkers will speed up with this technology and the palette of laboratory-based diagnostic tools will hopefully be augmented significantly in the near future. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Nema, Shubham; Hasan, Whidul; Bhargava, Anamika; Bhargava, Yogesh
2016-09-15
Behavioural neuroscience relies on software-driven methods for behavioural assessment, but the field lacks cost-effective, robust, open-source software for behavioural analysis. Here we propose a novel method, which we call ZebraTrack. It comprises a cost-effective imaging setup for distraction-free behavioural acquisition, automated tracking using the open-source ImageJ software, and a workflow for extraction of behavioural endpoints. Our ImageJ algorithm gives users control at key steps while maintaining automation in tracking, without requiring the installation of external plugins. We validated this method by testing novelty-induced anxiety behaviour in adult zebrafish. Our results, in agreement with established findings, showed that during state anxiety, zebrafish exhibited reduced distance travelled, increased thigmotaxis and more freezing events. Furthermore, we propose a way to represent both the spatial and temporal distribution of choice-based behaviour, which is currently not possible with simple videograms. The ZebraTrack method is simple and economical, yet robust enough to give results comparable to those obtained from costly proprietary software such as EthoVision XT. In summary, we have developed and validated a novel cost-effective method for behavioural analysis of adult zebrafish using open-source ImageJ software. Copyright © 2016 Elsevier B.V. All rights reserved.
Knowledge-based assistance in costing the space station DMS
NASA Technical Reports Server (NTRS)
Henson, Troy; Rone, Kyle
1988-01-01
The Software Cost Engineering (SCE) methodology developed over the last two decades at IBM Systems Integration Division (SID) in Houston is utilized to cost the NASA Space Station Data Management System (DMS). An ongoing project to capture this methodology, which is built on a foundation of experiences and lessons learned, has resulted in the development of an internal-use-only, PC-based prototype that integrates algorithmic tools with knowledge-based decision support assistants. This prototype Software Cost Engineering Automation Tool (SCEAT) is being employed to assist in the DMS costing exercises. At the same time, DMS costing serves as a forcing function and provides a platform for the continuing, iterative development, calibration, and validation and verification of SCEAT. The data that forms the cost engineering database is derived from more than 15 years of development of NASA Space Shuttle software, ranging from low criticality, low complexity support tools to highly complex and highly critical onboard software.
Ruger, Jennifer Prah; Emmons, Karen M; Kearney, Margaret H; Weinstein, Milton C
2009-01-01
Background: Economic theory provides the philosophical foundation for valuing costs in judging medical and public health interventions. When evaluating smoking cessation interventions, accurate data on costs are essential for understanding resource consumption. Smoking cessation interventions, for which prior data on resource costs are typically not available, present special challenges. We develop a micro-costing methodology for estimating the real resource costs of outreach motivational interviewing (MI) for smoking cessation and relapse prevention among low-income pregnant women, and report results from a randomized controlled trial (RCT) employing the methodology. Methodological standards in cost analysis are necessary for comparison and uniformity across interventions. Estimating the costs of outreach programs is critical for understanding the economics of reaching underserved and hard-to-reach populations. Methods: A randomized controlled trial (1997-2000) collected primary cost data for the intervention. A sample of 302 low-income pregnant women was recruited from multiple obstetrical sites in the Boston metropolitan area. MI delivered by outreach health nurses was compared with usual care (UC), with economic costs as the main outcome measures. Results: The total cost of the MI intervention for 156 participants was $48,672, or $312 per participant. This compares with a cost of $4.82 per participant for usual care, a difference of $307 (CI, $289.20 to $322.80). The total fixed costs of the MI were $3,930 and the total variable costs were $44,710. The total expected program cost for delivering MI to 500 participants would be $147,430, assuming no economies of scale in program delivery. The main cost components of outreach MI were intervention delivery, travel time, scheduling, and training.
Conclusion Grounded in economic theory, this methodology systematically identifies and measures resource utilization, using a process tracking system and calculates both component-specific and total costs of outreach MI. The methodology could help improve collection of accurate data on costs and estimates of the real resource costs of interventions alongside clinical trials and improve the validity and reliability of estimates of resource costs for interventions targeted at underserved and hard-to-reach populations. PMID:19775455
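The scaling logic in the results (total cost = fixed costs plus linear variable costs, with no economies of scale) can be sketched directly from the reported figures. The per-participant variable cost below is derived by simple division, so the 500-participant total differs slightly from the published $147,430, reflecting rounding in the reported components:

```python
FIXED = 3_930            # total fixed costs of MI (from the abstract)
VARIABLE_TOTAL = 44_710  # total variable costs for the 156 trial participants
N_TRIAL = 156

var_per_participant = VARIABLE_TOTAL / N_TRIAL  # ~$286.60

def program_cost(n):
    """Expected cost of delivering MI to n participants,
    assuming no economies of scale (variable costs scale linearly)."""
    return FIXED + var_per_participant * n

print(round(program_cost(N_TRIAL)))  # 48640 (abstract reports $48,672; components were rounded)
print(round(program_cost(500)))      # 147231 (abstract reports ~$147,430)
```

The small gaps against the published totals come from rounding in the published fixed/variable breakdown, not from a different model.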
Simões, Rodrigo Almeida; Bonato, Pierina Sueli; Mirnaghi, Fatemeh S; Bojko, Barbara; Pawliszyn, Janusz
2015-01-01
A high-throughput bioanalytical method using 96-blade thin film microextraction (TFME) and LC-MS/MS for the analysis of repaglinide (RPG) and two of its main metabolites was developed and used for an in vitro metabolism study. The target analytes were extracted from human microsomal medium by a 96-blade-TFME system employing the low-cost prototype 'SPME multi-sampler' using C18 coating. Method validation showed recoveries around 90% for all analytes and was linear over the concentration range of 2-1000 ng ml(-1) for RPG and of 2-500 ng ml(-1) for each RPG metabolite. The method was applied to an in vitro metabolism study of RPG employing human liver microsomes and proved to be very useful for this purpose.
Zhang, Zhijun; Ashraf, Muhammad; Sahn, David J; Song, Xubo
2014-05-01
Quantitative analysis of cardiac motion is important for evaluation of heart function. Three-dimensional (3D) echocardiography is among the most frequently used imaging modalities for motion estimation because it is convenient, real-time, low-cost, and nonionizing. However, motion estimation from 3D echocardiographic sequences is still a challenging problem due to low image quality and image corruption by noise and artifacts. The authors have developed a temporally diffeomorphic motion estimation approach in which the velocity field, instead of the displacement field, is optimized. The optimal velocity field optimizes a novel similarity function, which we call the intensity consistency error, defined over multiple consecutive frames evolving to each time point. The optimization problem is solved by using the steepest descent method. Experiments with simulated datasets, images of an ex vivo rabbit phantom, images of in vivo open-chest pig hearts, and healthy human images were used to validate the authors' method. Tests on simulated and real cardiac sequences showed that the authors' method is more accurate than other competing temporal diffeomorphic methods. Tests with sonomicrometry showed that the tracked crystal positions have good agreement with ground truth and that the authors' method has higher accuracy than the temporal diffeomorphic free-form deformation (TDFFD) method. Validation with an open-access human cardiac dataset showed that the authors' method has smaller feature tracking errors than both TDFFD and frame-to-frame methods. The authors proposed a diffeomorphic motion estimation method with temporal smoothness, obtained by constraining the velocity field to have maximum local intensity consistency within multiple consecutive frames. The estimated motion using the authors' method has good temporal consistency and is more accurate than other temporally diffeomorphic motion estimation methods.
Quinn, Casey W; Cate, David M; Miller-Lionberg, Daniel D; Reilly, Thomas; Volckens, John; Henry, Charles S
2018-03-20
Metal contamination of natural and drinking water systems poses hazards to public and environmental health. Quantifying metal concentrations in water typically requires sample collection in the field followed by expensive laboratory analysis that can take days to weeks to obtain results. The objective of this work was to develop a low-cost, field-deployable method to quantify trace levels of copper in drinking water by coupling solid-phase extraction/preconcentration with a microfluidic paper-based analytical device. This method has the advantages of being hand-powered (instrument-free) and using a simple "read by eye" quantification motif (based on color distance). Tap water samples collected across Fort Collins, CO, were tested with this method and validated against ICP-MS. We demonstrate the ability to quantify the copper content of tap water within 30% of a reference technique at levels ranging from 20 to 500 000 ppb. The application of this technology, which should be sufficient as a rapid screening tool, can lead to faster, more cost-effective detection of soluble metals in water systems.
Clarke, Philip M
2002-03-01
In this study, the convergent validity of the contingent valuation method (CVM) and travel cost method (TCM) is tested by comparing estimates of the willingness to pay (WTP) for improving access to mammographic screening in rural areas of Australia. It is based on a telephone survey of 458 women in 19 towns, in which they were asked about their recent screening behaviour and their WTP to have a mobile screening unit visit their nearest town. After eliminating missing data and other non-usable responses, the contingent valuation experiment and travel cost model were based on information from 372 and 319 women, respectively. Estimates of the maximum WTP for the use of mobile screening units were derived using both methods and compared. The highest mean WTP estimated using the TCM was $83.10 (95% C.I. $68.53-$99.06), which is significantly less than the estimate of $148.09 ($131.13-$166.60) using the CVM. This could be due to the CVM estimates also reflecting non-use values such as altruism, or a range of potential biases that are known to affect both methods. Further tests of validity are required in order to gain a greater understanding of the relationship between these two methods of estimating WTP. Copyright 2001 John Wiley & Sons, Ltd.
Tres, A; van der Veer, G; Perez-Marin, M D; van Ruth, S M; Garrido-Varo, A
2012-08-22
Organic products tend to retail at a higher price than their conventional counterparts, which makes them susceptible to fraud. In this study we evaluate the application of near-infrared spectroscopy (NIRS) as a rapid, cost-effective method to verify the organic identity of feed for laying hens. For this purpose a total of 36 organic and 60 conventional feed samples from The Netherlands were measured by NIRS. A binary classification model (organic vs conventional feed) was developed using partial least squares discriminant analysis. Models were developed using five different data preprocessing techniques, which were externally validated by a stratified random resampling strategy using 1000 realizations. Spectral regions related to the protein and fat content were among the most important ones for the classification model. The models based on data preprocessed using direct orthogonal signal correction (DOSC), standard normal variate (SNV), and first and second derivatives provided the most successful results in terms of median sensitivity (0.91 in external validation) and median specificity (1.00 for external validation of SNV models and 0.94 for DOSC and first and second derivative models). A previously developed model, which was based on fatty acid fingerprinting of the same set of feed samples, provided a higher sensitivity (1.00). This shows that the NIRS-based approach provides a rapid and low-cost screening tool, whereas the fatty acid fingerprinting model can be used for further confirmation of the organic identity of feed samples for laying hens. These methods provide additional assurance to the administrative controls currently conducted in the organic feed sector.
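The external-validation scheme described above (stratified random resampling with many realizations, scored by median sensitivity and specificity) can be sketched as follows. The data are synthetic and a nearest-class-mean classifier stands in for the paper's PLS-DA model, so only the resampling and scoring logic mirrors the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data (hypothetical, not the paper's NIR spectra):
# 36 "organic" (label 1) and 60 "conventional" (label 0) samples.
X = np.vstack([rng.normal(1.0, 0.5, (36, 20)),
               rng.normal(-1.0, 0.5, (60, 20))])
y = np.array([1] * 36 + [0] * 60)

def sensitivity_specificity(y_true, y_pred):
    tp = np.sum((y_true == 1) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    return tp / (tp + fn), tn / (tn + fp)

def nearest_mean_predict(X_train, y_train, X_test):
    # Placeholder for the paper's PLS-DA classifier.
    m1 = X_train[y_train == 1].mean(axis=0)
    m0 = X_train[y_train == 0].mean(axis=0)
    d1 = np.linalg.norm(X_test - m1, axis=1)
    d0 = np.linalg.norm(X_test - m0, axis=1)
    return (d1 < d0).astype(int)

sens, spec = [], []
for _ in range(1000):  # stratified random resampling, 1000 realizations
    test_idx = np.concatenate([
        rng.choice(np.where(y == c)[0],
                   size=len(np.where(y == c)[0]) // 4, replace=False)
        for c in (0, 1)])
    mask = np.zeros(len(y), bool)
    mask[test_idx] = True
    pred = nearest_mean_predict(X[~mask], y[~mask], X[mask])
    se, sp = sensitivity_specificity(y[mask], pred)
    sens.append(se)
    spec.append(sp)

print(np.median(sens), np.median(spec))
```

Reporting medians over the 1000 held-out splits, as the paper does, makes the performance summary robust to unlucky individual splits.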
Comparison of lifetime-based methods for 2D phosphor thermometry in high-temperature environment
NASA Astrophysics Data System (ADS)
Peng, Di; Liu, Yingzheng; Zhao, Xiaofeng; Kim, Kyung Chun
2016-09-01
This paper discusses the currently available techniques for 2D phosphor thermometry and compares the performance of two lifetime-based methods: high-speed imaging and the dual-gate method. High-speed imaging resolves the luminescent decay with a fast frame rate and has become a popular method for phosphor thermometry in recent years. But it has disadvantages such as high equipment cost and long data processing time, and it would fail at sufficiently high temperature due to a low signal-to-noise ratio and short lifetime. The dual-gate method only requires two images on the decay curve and therefore greatly reduces hardware cost and processing time. A dual-gate method for phosphor thermometry has been developed and compared with the high-speed imaging method through both calibration and a jet impingement experiment. Measurement uncertainty has been evaluated for a temperature range of 473-833 K. The effects of several key factors on uncertainty are discussed, including the luminescent signal level, the decay lifetime and the temperature sensitivity. The results show that both methods are valid for 2D temperature sensing within the given range. The high-speed imaging method shows less uncertainty at low temperatures, where the signal level and the lifetime are both sufficient, but its performance is degraded at higher temperatures due to a rapidly reduced signal and lifetime. For T > 750 K, the dual-gate method outperforms the high-speed imaging method thanks to its superiority in signal-to-noise ratio and temperature sensitivity. The dual-gate method has great potential for applications in high-temperature environments where the high-speed imaging method is not applicable.
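The two-image idea behind the dual-gate method can be made concrete with the standard two-point ratio formulation: assuming a single-exponential decay I(t) = I0·exp(-t/τ) and treating each gate as a point sample (a simplification; a real camera gate integrates over a window, and the paper's exact gating scheme may differ), the lifetime follows from the intensity ratio of the two gates:

```python
import numpy as np

def dual_gate_lifetime(I1, I2, t1, t2):
    """Two-gate lifetime estimate for a single-exponential decay
    I(t) = I0 * exp(-t / tau): tau = (t2 - t1) / ln(I1 / I2)."""
    return (t2 - t1) / np.log(I1 / I2)

# Round trip with a synthetic decay: tau = 2 microseconds.
tau = 2e-6
t1, t2 = 1e-6, 3e-6
I1, I2 = np.exp(-t1 / tau), np.exp(-t2 / tau)
print(abs(dual_gate_lifetime(I1, I2, t1, t2) - tau) < 1e-12)  # True
```

Temperature then follows from a calibration curve τ(T), which is where the method's sensitivity at high temperature (short lifetimes, large relative lifetime changes) comes from.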
Zhang, Jiawei; Song, Lirong; Pedersen, Steffen Hindborg; Yin, Hao; Hung, Le Thanh; Iversen, Bo Brummerstedt
2017-01-01
Widespread application of thermoelectric devices for waste heat recovery requires low-cost high-performance materials. The currently available n-type thermoelectric materials are limited either by their low efficiencies or by being based on expensive, scarce or toxic elements. Here we report a low-cost n-type material, Te-doped Mg3Sb1.5Bi0.5, that exhibits a very high figure of merit zT ranging from 0.56 to 1.65 at 300−725 K. Using combined theoretical prediction and experimental validation, we show that the high thermoelectric performance originates from the significantly enhanced power factor because of the multi-valley band behaviour dominated by a unique near-edge conduction band with a sixfold valley degeneracy. This makes Te-doped Mg3Sb1.5Bi0.5 a promising candidate for the low- and intermediate-temperature thermoelectric applications. PMID:28059069
Patient-centered technological assessment and monitoring of depression for low-income patients.
Wu, Shinyi; Vidyanti, Irene; Liu, Pai; Hawkins, Caitlin; Ramirez, Magaly; Guterman, Jeffrey; Gross-Schulman, Sandra; Sklaroff, Laura Myerchin; Ell, Kathleen
2014-01-01
Depression is a significant challenge for ambulatory care because it worsens health status and outcomes, increases health care utilization and costs, and elevates suicide risk. An automatic telephonic assessment (ATA) system that links with tasks and alerts to providers may improve the quality of depression care and increase provider productivity. We used the ATA system in a trial to assess and monitor depressive symptoms of 444 safety-net primary care patients with diabetes. We assessed system properties, evaluated preliminary clinical outcomes, and estimated cost savings. The ATA system is feasible, reliable, valid, safe, and likely cost-effective for depression screening and monitoring in a low-income primary care population.
Characterization of a Low-Cost Multi-Parameter Sensor for Resource Applications: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Habte, Aron M; Sengupta, Manajit; Andreas, Afshin M
Low-cost multi-parameter sensing and measurement devices enable cost-effective monitoring of the functional and operational reliability, efficiency, and resiliency of the electrical grid. The National Renewable Energy Laboratory (NREL) Solar Radiation Research Laboratory (SRRL), in collaboration with Arable Labs Inc., deployed Arable Labs' Mark multi-parameter sensor system. The unique suite of system sensors measures the down-welling and upwelling shortwave solar resource and longwave radiation, humidity, air temperature, and ground temperature. This study describes the shortwave calibration, characterization, and validation of the measurement accuracy of this instrument by comparison with existing instruments that are part of NREL-SRRL's Baseline Measurement System.
Optimization study on multiple train formation scheme of urban rail transit
NASA Astrophysics Data System (ADS)
Xia, Xiaomei; Ding, Yong; Wen, Xin
2018-05-01
The new organization method, represented by the mixed operation of trains with multiple formations, can adapt to unevenly distributed passenger flow, but research on this aspect is still limited. This paper introduces the passenger sharing rate and a congestion penalty coefficient for different train formations. On this basis, an optimization model is established with the minimization of passenger cost and operation cost as objectives, and operation frequency and passenger demand as constraints. The ideal point method is used to solve the model. Compared with the fixed-marshalling operation model, the proposed scheme saves 9.24% and 4.43% of the two costs, respectively. This result not only demonstrates the validity of the model but also illustrates the advantages of the multiple-train-formation scheme.
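The ideal point method used to solve the bi-objective model can be sketched as follows: form the "ideal point" from the per-objective minima over all candidates, then pick the candidate closest to it. The candidate formation schemes and their (passenger cost, operation cost) pairs below are hypothetical, and min-max normalisation is one common variant, not necessarily the paper's exact formulation:

```python
import numpy as np

def ideal_point_choice(objectives):
    """Pick the alternative closest (Euclidean distance) to the ideal
    point, i.e. the vector of per-objective minima. Objectives are
    min-max normalised so costs on different scales are comparable."""
    obj = np.asarray(objectives, float)
    ideal = obj.min(axis=0)
    span = obj.max(axis=0) - ideal
    dist = np.linalg.norm((obj - ideal) / span, axis=1)
    return int(np.argmin(dist))

# Hypothetical formation schemes: (passenger cost, operation cost), arbitrary units.
schemes = [(100, 80), (90, 95), (85, 110)]
print(ideal_point_choice(schemes))  # 1: the balanced scheme wins
```

Neither extreme scheme (cheapest for passengers, cheapest to operate) is selected; the method favours the compromise nearest the unattainable ideal.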
Aorta and pulmonary artery segmentation using optimal surface graph cuts in non-contrast CT
NASA Astrophysics Data System (ADS)
Sedghi Gamechi, Zahra; Arias-Lorza, Andres M.; Pedersen, Jesper Holst; de Bruijne, Marleen
2018-03-01
Accurate measurements of the size and shape of the aorta and pulmonary arteries are important as risk factors for cardiovascular diseases and for Chronic Obstructive Pulmonary Disease (COPD). The aim of this paper is to propose an automated method for segmenting the aorta and pulmonary arteries in low-dose, non-ECG-gated, non-contrast CT scans. Low contrast and the high noise level make automatic segmentation in such images a challenging task. In the proposed method, first, a minimum cost path tracking algorithm traces the centerline between user-defined seed points. The cost function is based on a multi-directional medialness filter and a lumen intensity similarity metric. The vessel radius is also estimated from the medialness filter. The extracted centerlines are then smoothed and dilated non-uniformly according to the extracted local vessel radius and subsequently used as initialization for a graph-cut segmentation. The algorithm is evaluated on 225 low-dose non-ECG-gated non-contrast CT scans from a lung cancer screening trial. Quantitatively analyzing 25 scans with full manual annotations, we obtain a Dice overlap of 0.94+/-0.01 for the aorta and 0.92+/-0.01 for the pulmonary arteries. Qualitative validation by visual inspection of 200 scans shows successful segmentation in 93% of all cases for the aorta and 94% for the pulmonary arteries.
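The Dice overlap used in the quantitative evaluation is twice the intersection of the two masks divided by the sum of their sizes; a minimal sketch on toy binary masks:

```python
import numpy as np

def dice(a, b):
    """Dice overlap between two binary segmentation masks:
    2 * |A intersect B| / (|A| + |B|)."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    return 2 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

auto   = np.array([[0, 1, 1], [0, 1, 1]])  # automatic segmentation (toy)
manual = np.array([[0, 1, 1], [1, 1, 0]])  # manual annotation (toy)
print(round(dice(auto, manual), 2))  # intersection 3, sizes 4 and 4 -> 0.75
```

A Dice of 1.0 means identical masks; the 0.94 and 0.92 reported above indicate near-complete agreement with the manual annotations.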
Drost, Ruben M W A; Paulus, Aggie T G; Ruwaard, Dirk; Evers, Silvia M A A
2017-02-01
There is a lack of knowledge about methods for valuing health intervention-related costs and monetary benefits in the education and criminal justice sectors, also known as 'inter-sectoral costs and benefits' (ICBs). The objective of this study was to develop methods for obtaining unit prices for the valuation of ICBs. By conducting an exploratory literature study and expert interviews, several generic methods were developed. The methods' feasibility was assessed through application in the Netherlands. Results were validated in an expert meeting, which was attended by policy makers, public health experts, health economists and HTA-experts, and discussed at several international conferences and symposia. The study resulted in four methods, including the opportunity cost method (A) and valuation using available unit prices (B), self-constructed unit prices (C) or hourly labor costs (D). The methods developed can be used internationally and are valuable for the broad international field of HTA.
Design optimization of space launch vehicles using a genetic algorithm
NASA Astrophysics Data System (ADS)
Bayley, Douglas James
The United States Air Force (USAF) continues to have a need for assured access to space. In addition to flexible and responsive spacelift, a reduction in the cost per launch of space launch vehicles is also desirable. For this purpose, an investigation of the design optimization of space launch vehicles has been conducted. Using a suite of custom codes, the performance aspects of an entire space launch vehicle were analyzed. A genetic algorithm (GA) was employed to optimize the design of the space launch vehicle. A cost model was incorporated into the optimization process with the goal of minimizing the overall vehicle cost. The other goals of the design optimization included obtaining the proper altitude and velocity to achieve a low-Earth orbit. Specific mission parameters particular to USAF space endeavors were specified at the start of the design optimization process. Solid propellant motors, liquid-fueled rockets, and air-launched systems in various configurations provided the propulsion systems for two-, three- and four-stage launch vehicles. Mass properties models, an aerodynamics model, and a six-degree-of-freedom (6DOF) flight dynamics simulator were all used to model the system. The results show the feasibility of this method in designing launch vehicles that meet mission requirements. Comparisons to existing real-world systems provide validation for the physical system models. However, the ability to obtain a truly minimized cost was elusive. The cost model uses an industry-standard approach; however, validation of this portion of the model was challenging due to the proprietary nature of cost figures and the dependence of many existing systems on surplus hardware.
NASA Technical Reports Server (NTRS)
Lin, S. S.; Lauer, M. S.; Asher, C. R.; Cosgrove, D. M.; Blackstone, E.; Thomas, J. D.; Garcia, M. J.
2001-01-01
OBJECTIVES: We sought to develop and validate a model that estimates the risk of obstructive coronary artery disease in patients undergoing operations for mitral valve degeneration and to demonstrate its potential clinical utility. METHODS: A total of 722 patients (67% men; age, 61 +/- 12 years) without a history of myocardial infarction, ischemic electrocardiographic changes, or angina who underwent routine coronary angiography before mitral valve prolapse operations between 1989 and 1996 were analyzed. A bootstrap-validated logistic regression model based on clinical risk factors was developed to identify low-risk (< or = 5%) patients. Obstructive coronary atherosclerosis was defined as 50% or more luminal narrowing in one or more major epicardial vessels, as determined by means of coronary angiography. RESULTS: One hundred thirty-nine (19%) patients had obstructive coronary atherosclerosis. Independent predictors of coronary artery disease included age, male sex, hypertension, diabetes mellitus, and hyperlipidemia. Two hundred twenty patients were designated as low risk according to the logistic model. Of these patients, only 3 (1.3%) had single-vessel disease, and none had multivessel disease. The model showed good discrimination, with an area under the receiver-operating characteristic curve of 0.84. Cost analysis indicated that application of this model could safely eliminate 30% of coronary angiograms, corresponding to cost savings of $430,000 per 1000 patients, without missing any case of high-risk coronary artery disease. CONCLUSION: A model with standard clinical predictors can reliably estimate the prevalence of obstructive coronary atherosclerosis in patients undergoing mitral valve prolapse operations. This model can identify low-risk patients in whom routine preoperative angiography may be safely avoided.
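Two quantities in this abstract are easy to make concrete: the area under the receiver-operating characteristic curve (computed here via the equivalent Mann-Whitney statistic) and the 5% low-risk cutoff applied to modelled probabilities. The predicted probabilities and outcomes below are hypothetical, not the study data:

```python
import numpy as np

def auc(y_true, risk):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a random diseased patient gets a higher
    risk score than a random disease-free patient (ties count 1/2)."""
    pos = risk[y_true == 1]
    neg = risk[y_true == 0]
    wins = ((pos[:, None] > neg[None, :]).sum()
            + 0.5 * (pos[:, None] == neg[None, :]).sum())
    return wins / (len(pos) * len(neg))

def low_risk(prob, threshold=0.05):
    """Flag patients whose modelled CAD probability is at most 5%,
    i.e. candidates for skipping routine preoperative angiography."""
    return prob <= threshold

# Hypothetical modelled probabilities and angiography outcomes.
prob = np.array([0.02, 0.04, 0.10, 0.30, 0.60, 0.03])
y    = np.array([0,    0,    0,    1,    1,    0])
print(auc(y, prob))    # 1.0 for this perfectly separated toy set
print(low_risk(prob))  # [ True  True False False False  True]
```

The study's AUC of 0.84 sits between 0.5 (no discrimination) and 1.0 (perfect separation), which is what "good discrimination" refers to.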
Estimating dietary costs of low-income women in California: a comparison of 2 approaches.
Aaron, Grant J; Keim, Nancy L; Drewnowski, Adam; Townsend, Marilyn S
2013-04-01
Currently, no simplified approach to estimating food costs exists for a large, nationally representative sample. The objective was to compare 2 approaches for estimating individual daily diet costs in a population of low-income women in California. Cost estimates based on time-intensive method 1 (three 24-h recalls and associated food prices on receipts) were compared with estimates made by using less intensive method 2 [a food-frequency questionnaire (FFQ) and store prices]. Low-income participants (n = 121) of USDA nutrition programs were recruited. Mean daily diet costs, both unadjusted and adjusted for energy, were compared by using Pearson correlation coefficients and the Bland-Altman 95% limits of agreement between methods. Energy and nutrient intakes derived by the 2 methods were comparable; where differences occurred, the FFQ (method 2) provided higher nutrient values than did the 24-h recall (method 1). The crude daily diet cost was $6.32 by the 24-h recall method and $5.93 by the FFQ method (P = 0.221). The energy-adjusted diet cost was $6.65 by the 24-h recall method and $5.98 by the FFQ method (P < 0.001). Although the agreement between methods was weaker than expected, both approaches may be useful. Additional research is needed to further refine a large national survey approach (method 2) to estimate daily dietary costs with the use of this minimal time-intensive method for the participant and moderate time-intensive method for the researcher.
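The Bland-Altman 95% limits of agreement used to compare the two costing methods are the mean of the per-person differences plus or minus 1.96 standard deviations of those differences; a minimal sketch with hypothetical cost pairs (not the study data):

```python
import numpy as np

def bland_altman_limits(m1, m2):
    """Bland-Altman 95% limits of agreement between two methods:
    mean difference +/- 1.96 * SD of the paired differences."""
    d = np.asarray(m1, float) - np.asarray(m2, float)
    bias = d.mean()
    sd = d.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical per-woman daily diet costs (dollars):
recall_cost = [6.1, 7.0, 5.8, 6.9, 6.4]  # method 1: 24-h recalls + receipts
ffq_cost    = [5.7, 6.5, 5.9, 6.2, 6.0]  # method 2: FFQ + store prices
bias, (lo, hi) = bland_altman_limits(recall_cost, ffq_cost)
print(round(bias, 2))  # 0.38: the recall method reads higher on average here
```

Wide limits relative to the bias indicate weak agreement between methods for individuals, even when the group means look similar, which matches the abstract's caveat.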
Wynwood, Sarah J.; Burns, Mary-Anne A.; Graham, Glenn C.; Weier, Steven L.; McKay, David B.; Craig, Scott B.
2015-01-01
A microsphere immunoassay (MIA) utilising Luminex xMAP technology, capable of determining leptospirosis IgG and IgM independently, was developed. The MIA was validated using 200 human samples submitted for routine leptospirosis serology testing. The traditional microscopic agglutination test (MAT) method (now 100 years old) suffers from a significant range of technical problems, including a dependence on antisera that is difficult to source and produce, false positive reactions due to auto-agglutination, and an inability to differentiate between IgG and IgM antibodies. A comparative validation of the MIA against the MAT was performed to determine the ability of the MIA to detect leptospiral antibodies. The assay was able to classify samples in the reactive, equivocal and non-reactive ranges when compared to the MAT and was able to differentiate leptospiral IgG antibodies from leptospiral IgM antibodies. The MIA is more sensitive than the MAT and, in true infections, was able to detect low levels of antibody in the later stages of the acute phase as well as detect higher levels of IgM antibody earlier in the immune phase of the infection. The relatively low cost, high-throughput platform and significantly reduced dependency on large volumes of rabbit antisera make this assay worthy of consideration for any microbiological assay that currently uses agglutination assays. PMID:25807009
Dyer, Bryce; Disley, B Xavier
2018-02-01
Lower-limb amputees typically require some form of prosthetic limb to ride a bicycle for recreation or when competing. At elite-level racing speeds, aerodynamic drag can represent the majority of the resistance acting against a cyclist's forward motion. As a result, reducing such resistance is beneficial to an amputee, as the form and function of the prosthetic limb can be optimized through engineering. To measure the performance of such limbs, field testing provides a cost-effective and context-specific method of aerodynamic drag measurement. However, few methods have been formally validated and none have been applied to cyclists with lower-limb amputations. In this paper, an elite-level para-cyclist wore two different prosthetic limb designs and had their total aerodynamic drag measured with a wind tunnel reference method and statistically correlated against a velodrome-based virtual elevation field test method. The calculated coefficient of variation was in the range of 0.7-0.9% for the wind tunnel method and 2-3% for the virtual elevation method. A 0.03 m² difference was identified in the absolute values recorded between the two methods. Ultimately, both methods exhibited high levels of precision, with results offset relative to each other. The virtual elevation method is proposed as a suitable technique to assess the aerodynamic drag of amputee para-cyclists. Implications for rehabilitation: This assessment method will provide practitioners a reliable means of assessing the impact of changes made to prosthetic design for cyclists with limb absence. The proposed method offers a low-cost and geographically accessible solution compared to others proposed in the past. This assessment method has significant potential for impact among prosthetic limb users looking to improve their cycling performance, an area where previous attention has been extremely limited.
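The coefficient of variation quoted for each protocol is the sample standard deviation of repeated measurements expressed as a percentage of their mean; a minimal sketch with hypothetical repeated drag-area (CdA) measurements, not the study's values:

```python
import numpy as np

def coefficient_of_variation(values):
    """CV = sample standard deviation / mean, as a percentage."""
    v = np.asarray(values, float)
    return 100 * v.std(ddof=1) / v.mean()

# Hypothetical repeated CdA measurements (m^2) from one protocol run:
runs = [0.230, 0.232, 0.229, 0.231]
print(round(coefficient_of_variation(runs), 2))  # 0.56
```

By this measure a CV of 0.7-0.9% (wind tunnel) versus 2-3% (virtual elevation) quantifies how tightly each method repeats its own result, independent of the fixed offset between the two methods.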
Okamoto, Nobuhiko; Nakashima, Mitsuko; Tsurusaki, Yoshinori; Miyake, Noriko; Saitsu, Hirotomo; Matsumoto, Naomichi
2013-01-01
Next-generation sequencing (NGS) combined with enrichment of target genes enables highly efficient and low-cost sequencing of multiple genes for genetic diseases. The aim of this study was to validate the accuracy and sensitivity of our method for comprehensive mutation detection in autism spectrum disorder (ASD). We assessed the performance of the bench-top Ion Torrent PGM and Illumina MiSeq platforms as optimized solutions for mutation detection, using microdroplet PCR-based enrichment of 62 ASD associated genes. Ten patients with known mutations were sequenced using NGS to validate the sensitivity of our method. The overall read quality was better with MiSeq, largely because of the increased indel-related error associated with PGM. The sensitivity of SNV detection was similar between the two platforms, suggesting they are both suitable for SNV detection in the human genome. Next, we used these methods to analyze 28 patients with ASD, and identified 22 novel variants in genes associated with ASD, with one mutation detected by MiSeq only. Thus, our results support the combination of target gene enrichment and NGS as a valuable molecular method for investigating rare variants in ASD. PMID:24066114
El-Masry, Amal A; Hammouda, Mohammed E A; El-Wasseef, Dalia R; El-Ashry, Saadia M
2018-02-15
Two simple, sensitive, rapid, validated and cost-effective spectroscopic methods were established for quantification of the antihistaminic drug azelastine (AZL) in bulk powder as well as in pharmaceutical dosage forms. In the first method (A), the absorbance difference between acidic and basic solutions was measured at 228 nm, whereas in the second method (B), the binary complex formed between AZL and Eosin Y in acetate buffer solution (pH 3) was measured at 550 nm. The criteria with a critical influence on the intensity of absorption were studied in depth and optimized to achieve the highest absorption. The proposed methods obeyed Beer's law in the concentration ranges of 2.0-20.0 μg·mL⁻¹ and 0.5-15.0 μg·mL⁻¹, with % recovery ± SD of 99.84 ± 0.87 and 100.02 ± 0.78 for methods (A) and (B), respectively. Furthermore, the proposed methods were easily applied for quality control of pharmaceutical preparations without any interference from co-formulated additives, and the analytical results were compatible with those obtained by the comparison method, with no significant difference as confirmed by Student's t-test and the variance ratio F-test. Validation of the proposed methods was performed according to the ICH guidelines in terms of linearity, limit of quantification, limit of detection, accuracy, precision and specificity, and the analytical results were persuasive. Copyright © 2017 Elsevier B.V. All rights reserved.
Computationally Efficient Adaptive Beamformer for Ultrasound Imaging Based on QR Decomposition.
Park, Jongin; Wi, Seok-Min; Lee, Jin S
2016-02-01
Adaptive beamforming methods for ultrasound imaging have been studied to improve image resolution and contrast. The most common approach is the minimum variance (MV) beamformer, which minimizes the power of the beamformed output while keeping the response from the direction of interest constant. The method achieves higher resolution and better contrast than the delay-and-sum (DAS) beamformer, but it suffers from high computational cost. This cost is mainly due to the computation of the spatial covariance matrix and its inverse, which requires O(L³) computations, where L denotes the subarray size. In this study, we propose a computationally efficient MV beamformer based on QR decomposition. The idea behind our approach is to transform the spatial covariance matrix into a scalar matrix σI, after which the apodization weights and the beamformed output are obtained without computing the matrix inverse. The QR decomposition itself can be executed at low cost, so the computational complexity is reduced to O(L²). In addition, our approach is mathematically equivalent to the conventional MV beamformer and therefore shows equivalent performance. The simulation and experimental results support the validity of our approach.
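The DAS/MV contrast drawn in this abstract can be sketched in a few lines of numpy. The example below is a simplified stand-in, not the paper's σI transformation: synthetic delay-aligned snapshots, diagonal loading for stability, and a QR solve in place of an explicit matrix inverse.

```python
import numpy as np

rng = np.random.default_rng(0)
L = 8                                          # subarray size
# Hypothetical delay-aligned snapshots: a coherent component plus noise
snapshots = 1.0 + rng.normal(size=(200, L))

# Sample spatial covariance -- the costly quantity in the abstract
R = snapshots.T @ snapshots / snapshots.shape[0]
R += 1e-3 * (np.trace(R) / L) * np.eye(L)      # diagonal loading for stability
a = np.ones(L)                                 # steering vector after delay alignment

w_das = a / L                                  # delay-and-sum: uniform apodization

# Minimum variance: w = R^-1 a / (a^T R^-1 a), via QR instead of inverting R
Q, Rq = np.linalg.qr(R)
x = np.linalg.solve(Rq, Q.T @ a)               # x = R^-1 a (triangular back-substitution)
w_mv = x / (a @ x)

# Both satisfy the distortionless constraint w^T a = 1,
# but MV attains lower output power w^T R w
print(round(w_das @ a, 6), round(w_mv @ a, 6))  # → 1.0 1.0
```

Once a factorization is available, the triangular solve costs O(L²), which is the flavor of saving the abstract describes; the paper's exact scalar-matrix construction is not reproduced here.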
Purposive Facebook Recruitment Endows Cost-Effective Nutrition Education Program Evaluation
Wamboldt, Patricia
2013-01-01
Background Recent legislation established a requirement for nutrition education in federal assistance programs to be evidence-based. Recruitment of low-income persons to participate and evaluate nutrition education activities can be challenging and costly. Facebook has been shown to be a cost-effective strategy to recruit this target audience to a nutrition program. Objective The purpose of our study was to examine Facebook as a strategy to recruit participants, especially Supplemental Nutrition Assistance Program Education (SNAP-Ed) eligible persons, to view and evaluate an online nutrition education program intended to be offered as having some evidence base for SNAP-Ed programming. Methods English-speaking, low-income Pennsylvania residents, 18-55 years with key profile words (eg, Supplemental Nutrition Assistance Program, Food bank), responded to a Facebook ad inviting participation in either Eating Together as a Family is Worth It (WI) or Everyone Needs Folic Acid (FA). Participants completed an online survey on food-related behaviors, viewed a nutrition education program, and completed a program evaluation. Facebook set-up functions considered were costing action, daily spending cap, and population reach. Results Respondents for both WI and FA evaluations were similar; the majority were white, <40 years, overweight or obese body mass index, and not eating competent. A total of 807 Facebook users clicked on the WI ad with 73 unique site visitors and 47 of them completing the program evaluation (ie, 47/807, 5.8% of clickers and 47/73, 64% of site visitors completed the evaluation). Cost per completed evaluation was US $25.48; cost per low-income completer was US $39.92. Results were similar for the FA evaluation; 795 Facebook users clicked on the ad with 110 unique site visitors, and 73 completing the evaluation (ie, 73/795, 9.2% of ad clickers and 73/110, 66% of site visitors completed the evaluation). 
Cost per valid completed survey with program evaluation was US $18.88; cost per low-income completer was US $27.53. Conclusions With Facebook we successfully recruited low-income Pennsylvanians to online nutrition program evaluations. Benefits of using Facebook as a recruitment strategy included real-time recruitment management, lower costs, and greater efficiency compared with traditional research recruitment strategies previously reported in the literature. Limitations prompted by repeated survey attempts need to be addressed to optimize this recruitment strategy. PMID:23948573
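The recruitment-funnel percentages reported in this abstract reduce to simple ratio arithmetic; a quick sketch follows. The implied total WI ad spend in the last line is an inference from the reported per-completion cost, not a figure stated in the abstract.

```python
def funnel(clicks, visitors, completers):
    """Percent of ad clickers and of site visitors who completed the evaluation."""
    return round(100 * completers / clicks, 1), round(100 * completers / visitors)

wi = funnel(807, 73, 47)    # Eating Together as a Family is Worth It
fa = funnel(795, 110, 73)   # Everyone Needs Folic Acid
print(wi, fa)               # → (5.8, 64) (9.2, 66)

# Implied total spend for WI: completers x cost per completed evaluation
print(round(47 * 25.48, 2))  # → 1197.56
```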
Cold-end Subsystem Testing for the Fission Power System Technology Demonstration Unit
NASA Technical Reports Server (NTRS)
Briggs, Maxwell; Gibson, Marc; Ellis, David; Sanzi, James
2013-01-01
The Fission Power System (FPS) Technology Demonstration Unit (TDU) consists of a pumped sodium-potassium (NaK) loop that provides heat to a Stirling Power Conversion Unit (PCU), which converts some of that heat into electricity and rejects the waste heat to a pumped water loop. Each of the TDU subsystems is being tested independently prior to full system testing at the NASA Glenn Research Center. The pumped NaK loop is being tested at NASA Marshall Space Flight Center; the Stirling PCU and electrical controller are being tested by Sunpower Inc.; and the pumped water loop is being tested at Glenn. This paper describes cold-end subsystem setup and testing at Glenn. The TDU cold end has been assembled in Vacuum Facility 6 (VF 6) at Glenn, the same chamber that will be used for TDU testing. Cold-end testing in VF 6 will demonstrate functionality; validate cold-end fill, drain, and emergency backup systems; and generate pump performance and system pressure drop data used to validate models. In addition, a low-cost proof-of-concept radiator has been built and tested at Glenn, validating the design and demonstrating the feasibility of using low-cost metal radiators as an alternative to high-cost composite radiators in an end-to-end TDU test.
High fidelity, low cost moulage as a valid simulation tool to improve burns education.
Pywell, M J; Evgeniou, E; Highway, K; Pitt, E; Estela, C M
2016-06-01
Simulation allows the opportunity for repeated practice in controlled, safe conditions. Moulage uses materials such as makeup to simulate clinical presentations. Moulage fidelity can be assessed by face validity (realism) and content validity (appropriateness). The aim of this project was to compare the fidelity of professional moulage to non-professional moulage in the context of a burns management course. Four actors were randomly assigned to a professional make-up artist or a course faculty member for moulage preparation such that two actors were in each group. Participants completed the actor-based burn management scenarios and answered a ten-question Likert-scale questionnaire on face and content validity. Mean scores and a linear mixed effects model were used to compare professional and non-professional moulage. Cronbach's alpha assessed internal consistency. Twenty participants experienced three out of four scenarios and at the end of the course completed a total of 60 questionnaires. Professional moulage had higher average ratings for face (4.30 vs 3.80; p=0.11) and content (4.30 vs 4.00; p=0.06) validity. Internal consistency of face (α=0.91) and content (α=0.85) validity questions was very good. The fidelity of professionally prepared moulage, as assessed by content validity, was higher than that of non-professionally prepared moulage. We have shown that using professional techniques and low-cost materials we can prepare high-quality, high-fidelity moulage simulations. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.
Lince-Deroche, Naomi; Phiri, Jane; Michelow, Pam; Smith, Jennifer S.; Firnhaber, Cindy
2015-01-01
Background South Africa has high rates of HIV and HPV and high incidence and mortality from cervical cancer. However, cervical cancer is largely preventable when early screening and treatment are available. We estimate the costs and cost-effectiveness of conventional cytology (Pap), visual inspection with acetic acid (VIA) and HPV DNA testing for detecting cases of CIN2+ among HIV-infected women currently taking antiretroviral treatment at a public HIV clinic in Johannesburg, South Africa. Methods Method effectiveness was derived from a validation study completed at the clinic. Costs were estimated from the provider perspective using micro-costing between June 2013-April 2014. Capital costs were annualized using a discount rate of 3%. Two different service volume scenarios were considered. Threshold analysis was used to explore the potential for reducing the cost of HPV DNA testing. Results VIA was least costly in both scenarios. In the higher volume scenario, the average cost per procedure was US$ 3.67 for VIA, US$ 8.17 for Pap and US$ 54.34 for HPV DNA. Colposcopic biopsies cost on average US$ 67.71 per procedure. VIA was least sensitive but most cost-effective at US$ 17.05 per true CIN2+ case detected. The cost per case detected for Pap testing was US$ 130.63 using a conventional definition for positive results and US$ 187.52 using a more conservative definition. HPV DNA testing was US$ 320.09 per case detected. Colposcopic biopsy costs largely drove the total and per case costs. A 71% reduction in HPV DNA screening costs would make it competitive with the conservative Pap definition. Conclusions Women need access to services which meet their needs and address the burden of cervical dysplasia and cancer in this region. Although most cost-effective, VIA may require more frequent screening due to low sensitivity, an important consideration for an HIV-positive population with increased risk for disease progression. PMID:26569487
Development and Validation of a Computational Model for Androgen Receptor Activity
Testing thousands of chemicals to identify potential androgen receptor (AR) agonists or antagonists would cost millions of dollars and take decades to complete using current validated methods. High-throughput in vitro screening (HTS) and computational toxicology approaches can mo...
A feasibility study of damage detection in beams using high-speed camera (Conference Presentation)
NASA Astrophysics Data System (ADS)
Wan, Chao; Yuan, Fuh-Gwo
2017-04-01
In this paper a method for damage detection in beam structures using a high-speed camera is presented. Traditional methods of damage detection in structures typically involve contact sensors (i.e., piezoelectric sensors or accelerometers) or non-contact sensors (i.e., laser vibrometers), which can make inspecting an entire structure costly and time-consuming. With the popularity of the digital camera and the development of computer vision technology, video cameras offer a viable measurement capability, including higher spatial resolution, remote sensing, and low cost. In this study, a damage detection method based on a high-speed camera was proposed. The system setup comprises a high-speed camera and a line laser, which can capture the out-of-plane displacement of a cantilever beam. The cantilever beam with an artificial crack was excited and the vibration process was recorded by the camera. A methodology called motion magnification, which can amplify subtle motions in a video, was used for modal identification of the beam. A finite element model was used for validation of the proposed method. Suggestions for applications of this methodology and challenges in future work are discussed.
Gorguluarslan, Recep M; Choi, Seung-Kyum; Saldana, Christopher J
2017-07-01
A methodology is proposed for uncertainty quantification and validation to accurately predict the mechanical response of lattice structures used in the design of scaffolds. Effective structural properties of the scaffolds are characterized using a developed multi-level stochastic upscaling process that propagates the quantified uncertainties at strut level to the lattice structure level. To obtain realistic simulation models for the stochastic upscaling process and minimize the experimental cost, high-resolution finite element models of individual struts were reconstructed from the micro-CT scan images of lattice structures which are fabricated by selective laser melting. The upscaling method facilitates the process of determining homogenized strut properties to reduce the computational cost of the detailed simulation model for the scaffold. Bayesian Information Criterion is utilized to quantify the uncertainties with parametric distributions based on the statistical data obtained from the reconstructed strut models. A systematic validation approach that can minimize the experimental cost is also developed to assess the predictive capability of the stochastic upscaling method used at the strut level and lattice structure level. In comparison with physical compression test results, the proposed methodology of linking the uncertainty quantification with the multi-level stochastic upscaling method enabled an accurate prediction of the elastic behavior of the lattice structure with minimal experimental cost by accounting for the uncertainties induced by the additive manufacturing process. Copyright © 2017 Elsevier Ltd. All rights reserved.
Hart, Teresa L; Brusseau, Timothy; Kulinna, Pamela Hodges; McClain, James J; Tudor-Locke, Catrine
2011-12-01
This study compared step counts detected by four, low-cost, objective, physical-activity-assessment instruments and evaluated their ability to detect moderate-to-vigorous physical activity (MVPA) compared to the ActiGraph accelerometer (AG). Thirty-six 10-11-year-old children wore the NL-1000, Yamax Digiwalker SW 200, Omron HJ-151, and Walk4Life MVP concurrently with the AG during school hours on a single day. AG MVPA was derived from activity count data using previously validated cut points. Two of the evaluated instruments provided similar group mean MVPA and step counts compared to AG (dependent on cut point). Low-cost instruments may be useful for measurement of both MVPA and steps in children's physical activity interventions and program evaluation.
The cost diary: a method to measure direct and indirect costs in cost-effectiveness research.
Goossens, M E; Rutten-van Mölken, M P; Vlaeyen, J W; van der Linden, S M
2000-07-01
From a societal perspective long-term clinical trials or follow-up studies should preferably not only include an evaluation of the health effect for the patient, but also an economic evaluation. In order to yield comprehensive medical and nonmedical resource use data, we at least partly depend on respondents' recall for collecting these costing data. A patient cost diary was developed in order to estimate total resource use, expenses, and lost production due to illness and treatment. We applied the cost diary in two randomized clinical trials evaluating the cost-effectiveness of behavioral rehabilitation in 205 fibromyalgia and chronic low back pain patients. The use of the diary was evaluated, studying the feasibility, the influence of the period of data collection on the results, and some aspects of validity. Eighty-five percent of the patients completed at least one diary and in total 68% of the diaries were returned. Although the results for the three alternative periods of data collection (keeping the diary 1 week every month, 2 weeks every 2 months, or a full year) were not significantly different, they were only moderately correlated. Finally, self-reported specialist care contacts were generally in agreement with data from an insurance company. However, for physiotherapy contacts there were differences between the self-reported and insurance data. This study shows how the cost diary might be used successfully in cost-effectiveness studies.
Solar-Diesel Hybrid Power System Optimization and Experimental Validation
NASA Astrophysics Data System (ADS)
Jacobus, Headley Stewart
As of 2008, 1.46 billion people, or 22 percent of the world's population, were without electricity. Many of these people live in remote areas where decentralized generation is the only method of electrification. Most mini-grids are powered by diesel generators, but new hybrid power systems are becoming a reliable method of incorporating renewable energy while also reducing total system cost. This thesis quantifies the measurable operational costs for an experimental hybrid power system in Sierra Leone. Two software programs, Hybrid2 and HOMER, are used during the system design and subsequent analysis. Experimental data from the installed system is used to validate the two programs and to quantify the savings created by each component within the hybrid system. This thesis bridges the gap between design optimization studies that frequently lack subsequent validation and experimental hybrid system performance studies.
HSR combustion analytical research
NASA Technical Reports Server (NTRS)
Nguyen, H. Lee
1992-01-01
Increasing the pressure and temperature of the engines of a new generation of supersonic airliners increases the emissions of nitrogen oxides (NOx) to a level that would have an adverse impact on the Earth's protective ozone layer. In the process of evolving and implementing low emissions combustor technologies, NASA LeRC has pursued a combustion analysis code program to guide combustor design processes, to identify potential concepts of the greatest promise, and to optimize them at low cost, with short turnaround time. The computational analyses are evaluated at actual engine operating conditions. The approach is to upgrade and apply advanced computer programs for gas turbine applications. Efforts were made in further improving the code capabilities for modeling the physics and the numerical methods of solution. Then test cases and measurements from experiments are used for code validation.
Reliability and validity of the AutoCAD software method in lumbar lordosis measurement
Letafatkar, Amir; Amirsasan, Ramin; Abdolvahabi, Zahra; Hadadnezhad, Malihe
2011-01-01
Objective The aim of this study was to determine the reliability and validity of the AutoCAD software method in lumbar lordosis measurement. Methods Fifty healthy volunteers with a mean age of 23 ± 1.80 years were enrolled. A lumbar lateral radiograph was taken of all participants, and the lordosis was measured according to the Cobb method. Afterward, the lumbar lordosis degree was measured via the AutoCAD software and flexible ruler methods. The study was conducted in 2 parts: intratester and intertester evaluations of reliability, as well as evaluation of the validity of the flexible ruler and software methods. Results Based on the intraclass correlation coefficient, AutoCAD's reliability and validity in measuring lumbar lordosis were 0.984 and 0.962, respectively. Conclusions AutoCAD was shown to be a reliable and valid method to measure lordosis. It is suggested that this method may replace those that are costly and involve health risks, such as radiography, in evaluating lumbar lordosis. PMID:22654681
Design and Validation of a 150 MHz HFFQCM Sensor for Bio-Sensing Applications
Fernández, Román; García, Pablo; García, María; Jiménez, Yolanda; Arnau, Antonio
2017-01-01
Acoustic wave resonators have become suitable devices for a broad range of sensing applications due to their sensitivity, low cost, and integration capability, which are all factors that meet the requirements for the resonators to be used as sensing elements for portable point of care (PoC) platforms. In this work, the design, characterization, and validation of a 150 MHz high fundamental frequency quartz crystal microbalance (HFF-QCM) sensor for bio-sensing applications are introduced. Finite element method (FEM) simulations of the proposed design are in good agreement with the electrical characterization of the manufactured resonators. The sensor is also validated for bio-sensing applications. For this purpose, a specific sensor cell was designed and manufactured that addresses the critical requirements associated with this type of sensor and application. Due to the small sensing area and the sensor’s fragility, these requirements include a low-volume flow chamber in the nanoliter range, and a system approach that provides the appropriate pressure control for assuring liquid confinement while maintaining the integrity of the sensor with a good base line stability and easy sensor replacement. The sensor characteristics make it suitable for consideration as the elemental part of a sensor matrix in a multichannel platform for point of care applications. PMID:28885551
Estimating dietary costs of low-income women in California: a comparison of 2 approaches
Aaron, Grant J; Keim, Nancy L; Drewnowski, Adam
2013-01-01
Background: Currently, no simplified approach to estimating food costs exists for a large, nationally representative sample. Objective: The objective was to compare 2 approaches for estimating individual daily diet costs in a population of low-income women in California. Design: Cost estimates based on time-intensive method 1 (three 24-h recalls and associated food prices on receipts) were compared with estimates made by using less intensive method 2 [a food-frequency questionnaire (FFQ) and store prices]. Low-income participants (n = 121) of USDA nutrition programs were recruited. Mean daily diet costs, both unadjusted and adjusted for energy, were compared by using Pearson correlation coefficients and the Bland-Altman 95% limits of agreement between methods. Results: Energy and nutrient intakes derived by the 2 methods were comparable; where differences occurred, the FFQ (method 2) provided higher nutrient values than did the 24-h recall (method 1). The crude daily diet cost was $6.32 by the 24-h recall method and $5.93 by the FFQ method (P = 0.221). The energy-adjusted diet cost was $6.65 by the 24-h recall method and $5.98 by the FFQ method (P < 0.001). Conclusions: Although the agreement between methods was weaker than expected, both approaches may be useful. Additional research is needed to further refine a large national survey approach (method 2) to estimate daily dietary costs with the use of this minimal time-intensive method for the participant and moderate time-intensive method for the researcher. PMID:23388658
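The Bland-Altman 95% limits of agreement used in the comparison above are just the mean paired difference ± 1.96 standard deviations of the differences. A minimal sketch on hypothetical paired diet-cost values follows; the numbers below are illustrative, not the study's data.

```python
import numpy as np

def bland_altman_limits(x, y):
    """Bias (mean difference) and 95% limits of agreement for paired measures."""
    d = np.asarray(x) - np.asarray(y)
    bias = d.mean()
    sd = d.std(ddof=1)          # sample SD of the paired differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired daily diet costs (USD): 24-h recall vs FFQ
recall = [6.10, 7.20, 5.40, 6.90, 6.30, 5.80]
ffq    = [5.70, 6.50, 5.60, 6.10, 6.00, 5.50]

bias, (lo, hi) = bland_altman_limits(recall, ffq)
print(round(bias, 2), round(lo, 2), round(hi, 2))  # → 0.38 -0.31 1.08
```

A positive bias with limits spanning zero mirrors the pattern reported above: the recall method tends to yield higher costs, while individual pairs can disagree in either direction.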
An organophosphonate strategy for functionalizing silicon photonic biosensors
Shang, Jing; Cheng, Fang; Dubey, Manish; Kaplan, Justin M.; Rawal, Meghana; Jiang, Xi; Newburg, David S.; Sullivan, Philip A.; Andrade, Rodrigo B.; Ratner, Daniel M.
2012-01-01
Silicon photonic microring resonators have established their potential for label-free and low-cost biosensing applications. However, the long-term performance of this optical sensing platform requires robust surface modification and biofunctionalization. Herein, we demonstrate a conjugation strategy based on an organophosphonate surface coating and vinyl sulfone linker to biofunctionalize silicon resonators for biomolecular sensing. To validate this method, a series of glycans, including carbohydrates and glycoconjugates, were immobilized on divinyl sulfone (DVS)/organophosphonate-modified microrings and used to characterize carbohydrate-protein and norovirus particle interactions. This biofunctional platform was able to orthogonally detect multiple specific carbohydrate-protein interactions simultaneously. Additionally, the platform was capable of reproducible binding after multiple regenerations by high-salt, high-pH or low-pH solutions and after 1-month storage in ambient conditions. This remarkable stability and durability of the organophosphonate immobilization strategy will facilitate the application of silicon microring resonators in various sensing conditions, prolong their lifetime, and minimize the cost for storage and delivery; these characteristics are requisite for developing biosensors for point-of-care and distributed diagnostics and other biomedical applications. In addition, the platform demonstrated its ability to characterize carbohydrate-mediated host-virus interactions, providing a facile method for discovering new anti-viral agents to prevent infectious disease. PMID:22220731
The Effects of Quality of Care on Costs: A Conceptual Framework
Nuckols, Teryl K; Escarce, José J; Asch, Steven M
2013-01-01
Context The quality of health care and the financial costs affected by receiving care represent two fundamental dimensions for judging health care performance. No existing conceptual framework appears to have described how quality influences costs. Methods We developed the Quality-Cost Framework, drawing from the work of Donabedian, the RAND/UCLA Appropriateness Method, reports by the Institute of Medicine, and other sources. Findings The Quality-Cost Framework describes how health-related quality of care (aspects of quality that influence health status) affects health care and other costs. Structure influences process, which, in turn, affects proximate and ultimate outcomes. Within structure, subdomains include general structural characteristics, circumstance-specific (e.g., disease-specific) structural characteristics, and quality-improvement systems. Process subdomains include appropriateness of care and medical errors. Proximate outcomes consist of disease progression, disease complications, and care complications. Each of the preceding subdomains influences health care costs. For example, quality improvement systems often create costs associated with monitoring and feedback. Providing appropriate care frequently requires additional physician visits and medications. Care complications may result in costly hospitalizations or procedures. Ultimate outcomes include functional status as well as length and quality of life; the economic value of these outcomes can be measured in terms of health utility or health-status-related costs. We illustrate our framework using examples related to glycemic control for type 2 diabetes mellitus or the appropriateness of care for low back pain. Conclusions The Quality-Cost Framework describes the mechanisms by which health-related quality of care affects health care and health status–related costs. Additional work will need to validate the framework by applying it to multiple clinical conditions. 
Applicability could be assessed by using the framework to classify the measures of quality and cost reported in published studies. Usefulness could be demonstrated by employing the framework to identify design flaws in published cost analyses, such as omitting the costs attributable to a relevant subdomain of quality. PMID:23758513
ERIC Educational Resources Information Center
Surapiboonchai, Kampol
2010-01-01
There is a lack of valid and reliable low cost observational instruments to measure moderate to vigorous physical activity (MVPA) in school physical education (PE). The participants in this study were third to tenth grade boys and girls from a south Texas school district. The SAM (Simple Activity Measurement) activity levels were compared with…
Convolutional Sparse Coding for RGB+NIR Imaging.
Hu, Xuemei; Heide, Felix; Dai, Qionghai; Wetzstein, Gordon
2018-04-01
Emerging sensor designs increasingly rely on novel color filter arrays (CFAs) to sample the incident spectrum in unconventional ways. In particular, capturing a near-infrared (NIR) channel along with conventional RGB color is an exciting new imaging modality. RGB+NIR sensing has broad applications in computational photography, such as low-light denoising; in computer vision, such as facial recognition and tracking; and it paves the way toward low-cost single-sensor RGB and depth imaging using structured illumination. However, cost-effective commercial CFAs suffer from severe spectral cross talk. This cross talk represents a major challenge in high-quality RGB+NIR imaging, rendering existing spatially multiplexed sensor designs impractical. In this work, we introduce a new approach to RGB+NIR image reconstruction using learned convolutional sparse priors. We demonstrate high-quality color and NIR imaging for challenging scenes, even including high-frequency structured NIR illumination. The effectiveness of the proposed method is validated on a large data set of experimental captures and on simulated benchmark results, which demonstrate that this work achieves unprecedented reconstruction quality.
Patient-Centered Technological Assessment and Monitoring of Depression for Low-Income Patients
Wu, Shinyi; Vidyanti, Irene; Liu, Pai; Hawkins, Caitlin; Ramirez, Magaly; Guterman, Jeffrey; Gross-Schulman, Sandra; Sklaroff, Laura Myerchin; Ell, Kathleen
2014-01-01
Depression is a significant challenge for ambulatory care because it worsens health status and outcomes, increases health care utilization and costs, and elevates suicide risk. An automatic telephonic assessment (ATA) system that links with tasks and alerts to providers may improve the quality of depression care and increase provider productivity. We used the ATA system in a trial to assess and monitor depressive symptoms of 444 safety-net primary care patients with diabetes. We assessed system properties, evaluated preliminary clinical outcomes, and estimated cost savings. The ATA system is feasible, reliable, valid, safe, and likely cost-effective for depression screening and monitoring in a low-income primary care population. PMID:24525531
Dong, Meili; Wu, Jiandong; Ma, Zimin; Peretz-Soroka, Hagit; Zhang, Michael; Komenda, Paul; Tangri, Navdeep; Liu, Yong; Rigatto, Claudio; Lin, Francis
2017-03-26
Traditional diagnostic tests for chronic diseases are expensive and require a specialized laboratory, therefore limiting their use for point-of-care (PoC) testing. To address this gap, we developed a method for rapid and low-cost C-reactive protein (CRP) detection from blood by integrating a paper-based microfluidic immunoassay with a smartphone (CRP-Chip). We chose CRP for this initial development because it is a strong biomarker of prognosis in chronic heart and kidney disease. The microfluidic immunoassay is realized by lateral flow and gold nanoparticle-based colorimetric detection of the target protein. The test image signal is acquired and analyzed using a commercial smartphone with an attached microlens and a 3D-printed chip-phone interface. The CRP-Chip was validated for detecting CRP in blood samples from chronic kidney disease patients and healthy subjects. The linear detection range of the CRP-Chip is up to 2 μg/mL and the detection limit is 54 ng/mL. The CRP-Chip test result yields high reproducibility and is consistent with the standard ELISA kit. A single CRP-Chip can perform the test in triplicate on a single chip within 15 min for less than 50 US cents of material cost. This CRP-Chip with attractive features of low-cost, fast test speed, and integrated easy operation with smartphones has the potential to enable future clinical PoC chronic disease diagnosis and risk stratification by parallel measurements of a panel of protein biomarkers.
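Figures like the 2 μg/mL linear range and 54 ng/mL detection limit typically come from a linear calibration fit. The sketch below uses entirely synthetic signal data and assumes the common 3.3·σ(blank)/slope criterion for the detection limit, which the abstract does not specify:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic colorimetric calibration: signal vs CRP concentration (ug/mL)
conc = np.array([0.0, 0.25, 0.5, 1.0, 1.5, 2.0])
signal = 120.0 * conc + 5.0 + rng.normal(0, 1.0, conc.size)  # hypothetical response

slope, intercept = np.polyfit(conc, signal, 1)   # least-squares calibration line

# Replicate blanks set the noise floor; LOD = 3.3 * sigma_blank / slope (assumed)
blanks = 5.0 + rng.normal(0, 1.0, 10)
lod_ng_per_ml = 1000 * 3.3 * blanks.std(ddof=1) / slope
print(round(slope, 1), round(lod_ng_per_ml, 1))
```

The chip's actual detection limit depends on the real blank variability and colorimetric response; the numbers here only illustrate the calculation.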
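Detection limits like the 54 ng/mL figure reported above are conventionally derived from a linear calibration curve and the blank noise floor (LOD = 3·SD_blank / slope). The sketch below illustrates that calculation; the calibration points and blank standard deviation are hypothetical, not the CRP-Chip's data.

```python
import numpy as np

# Hypothetical calibration points: colorimetric signal (a.u.) vs CRP (ug/mL).
conc = np.array([0.0, 0.25, 0.5, 1.0, 1.5, 2.0])
signal = np.array([0.02, 0.27, 0.52, 1.01, 1.53, 2.02])
blank_sd = 0.018  # assumed standard deviation of repeated blank readings

# Least-squares calibration line over the linear range.
slope, intercept = np.polyfit(conc, signal, 1)

# Conventional "3-sigma" detection limit: the concentration whose signal
# rises three blank standard deviations above the noise floor.
lod_ug_ml = 3.0 * blank_sd / slope
```

With these illustrative numbers the sketch lands near the reported figure; in practice the blank variance is estimated from many replicate blank strips.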
Blind system identification of two-thermocouple sensor based on cross-relation method.
Li, Yanfeng; Zhang, Zhijie; Hao, Xiaojian
2018-03-01
In dynamic temperature measurement, the dynamic characteristics of the sensor affect the accuracy of the measurement results. Thermocouples are widely used for temperature measurement in harsh conditions because of their low cost, robustness, and reliability, but their thermal inertia introduces a dynamic error into dynamic temperature measurements. To eliminate this error, a two-thermocouple sensor was used in this paper to measure dynamic gas temperature in constant-velocity flow environments. Blind system identification of the two-thermocouple sensor based on a cross-relation method was carried out. A particle swarm optimization algorithm was used to estimate the time constants of the two thermocouples and was compared with a grid-based search method. The method was validated on experimental equipment built around a high-temperature furnace, and the input dynamic temperature was reconstructed from the output of the thermocouple with the smaller time constant.
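The cross-relation idea above can be sketched concisely: if each thermocouple behaves as a first-order lag, then filtering sensor 1's output through sensor 2's model must match filtering sensor 2's output through sensor 1's model, and the time-constant pair that minimizes the mismatch identifies both sensors. The sketch below assumes first-order sensor models and uses the grid search (the paper's comparison baseline); the PSO variant minimizes the same cost function.

```python
import numpy as np

def first_order(u, tau, dt):
    # Discrete first-order sensor model: y[k] = a*y[k-1] + (1 - a)*u[k],
    # with a = exp(-dt/tau); unit DC gain, time constant tau.
    a = np.exp(-dt / tau)
    y = np.zeros(len(u))
    for k in range(1, len(u)):
        y[k] = a * y[k - 1] + (1 - a) * u[k]
    return y

def cross_relation_cost(y1, y2, tau1, tau2, dt):
    # If (tau1, tau2) are the true time constants, filtering y1 through
    # sensor 2's model equals filtering y2 through sensor 1's model.
    e = first_order(y1, tau2, dt) - first_order(y2, tau1, dt)
    return float(np.sum(e * e))

def identify_grid(y1, y2, dt, candidates):
    # Exhaustive search over candidate time-constant pairs.
    return min(
        ((cross_relation_cost(y1, y2, t1, t2, dt), t1, t2)
         for t1 in candidates for t2 in candidates)
    )[1:]
```

On noiseless synthetic data the grid search recovers the true pair exactly whenever it lies on the grid; with noise, the PSO search refines the estimate between grid points.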
2014-01-01
Background Untreated behavioral and mental health problems beginning in early childhood are costly, affecting the long-term health and wellbeing of children, their families, and society. Although parent training (PT) programs have been demonstrated to be a cost-effective intervention modality for treating childhood behavior problems, they have been less effective for children from low-income and underserved racial and ethnic populations. The purpose of this randomized trial is to compare the effectiveness, cost, and social validity of two manualized evidence-based PT programs that were developed and tested on different populations and employ different delivery models: (1) The Chicago Parent Program (CPP), a group-based program developed in collaboration with a community advisory board of African-American and Latino parents; and (2) Parent-Child Interaction Therapy (PCIT), an individualized parent-child coaching model considered to be ‘the gold standard’ for parents of children with externalizing behavior problems. Methods This trial uses an experimental design with randomization of parents seeking behavioral treatment for their 2- to 5-year-old children at a mental health clinic in Baltimore, MD (80% African-American or multi-racial; 97% receiving Medicaid). Using block randomization procedures, 262 parents are randomized to CPP or PCIT. Clinicians (n = 13) employed in the mental health clinic and trained in CPP or PCIT are also recruited to participate. Primary outcomes of interest are reductions in child behavior problems, improvements in parenting, perceived value of the interventions from the perspective of parents and clinicians, and cost. Parent distress and family social risk are assessed as modifiers of treatment effectiveness. We hypothesize that CPP will be at least as effective as PCIT for reducing child behavior problems and improving parenting, but that the programs will differ on cost and on their social validity as perceived by parents and clinicians.
Discussion This is the first study to compare the effectiveness of a PT program originally designed with and for parents from underserved racial and ethnic populations (CPP) against a well-established program considered to be ‘the gold standard’ (PCIT) with a high-risk population of parents. Challenges related to conducting a randomized trial in a fee-for-service mental health clinic serving urban, low-income families are discussed. Trial registration NCT01517867 PMID:24581245
A Low-Cost, In Situ Resistivity and Temperature Monitoring System
We present a low-cost, reliable method for long-term in situ autonomous monitoring of subsurface resistivity and temperature in a shallow, moderately heterogeneous subsurface. Probes, to be left in situ, were constructed at relatively low cost with close electrode spacing. Once i...
Reliability and validity of the AutoCAD software method in lumbar lordosis measurement.
Letafatkar, Amir; Amirsasan, Ramin; Abdolvahabi, Zahra; Hadadnezhad, Malihe
2011-12-01
The aim of this study was to determine the reliability and validity of the AutoCAD software method in lumbar lordosis measurement. Fifty healthy volunteers with a mean age of 23 ± 1.80 years were enrolled. A lumbar lateral radiograph was taken of all participants, and the lordosis was measured according to the Cobb method. Afterward, the lumbar lordosis angle was measured via the AutoCAD software and flexible ruler methods. The study was accomplished in two parts: intratester and intertester evaluations of reliability, as well as the validity of the flexible ruler and software methods. Based on the intraclass correlation coefficient, AutoCAD's reliability and validity in measuring lumbar lordosis were 0.984 and 0.962, respectively. AutoCAD was shown to be a reliable and valid method to measure lordosis. It is suggested that this method may replace those that are costly and involve health risks, such as radiography, in evaluating lumbar lordosis.
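For context, the Cobb measurement reduces to the angle between two digitized endplate lines, which is the quantity an AutoCAD angular dimension between two lines reports. A minimal sketch of that geometric core (line endpoints below are illustrative, not study data):

```python
import math

def cobb_angle(line_a, line_b):
    # Acute angle in degrees between two digitized endplate lines, each
    # given as ((x1, y1), (x2, y2)).
    (ax1, ay1), (ax2, ay2) = line_a
    (bx1, by1), (bx2, by2) = line_b
    ta = math.atan2(ay2 - ay1, ax2 - ax1)
    tb = math.atan2(by2 - by1, bx2 - bx1)
    d = abs(ta - tb) % math.pi  # lines have no direction: reduce mod pi
    return math.degrees(min(d, math.pi - d))
```

For lumbar lordosis the two lines are typically drawn along the superior endplates of L1 and S1 on the lateral radiograph.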
Ultrasonic Methods for Human Motion Detection
2006-10-01
contacts. The active method utilizes continuous-wave (CW) ultrasonic Doppler sonar. Human motions have unique Doppler signatures and their combination...The present article reports results of human motion investigations with the help of CW ultrasonic Doppler sonar. Low-cost, low-power ultrasonic motion...have been developed for operation in air [10]. Benefits of using ultrasonic CW Doppler sonar included low cost, low electric noise, and small size
André, Nuno Sequeira; Habel, Kai; Louchet, Hadrien; Richter, André
2013-11-04
We report experimental validations of an adaptive second-order Volterra equalization scheme for cost-effective IMDD OFDM systems. This equalization scheme was applied to both uplink and downlink transmission. Downlink settings were optimized for maximum bit rate, achieving 34 Gb/s over 10 km of SSMF using an EML with 10 GHz bandwidth. For the uplink, maximum reach was optimized, achieving 14 Gb/s using a low-cost DML with 2.5 GHz bandwidth.
Automated lung sound analysis for detecting pulmonary abnormalities.
Datta, Shreyasi; Dutta Choudhury, Anirban; Deshpande, Parijat; Bhattacharya, Sakyajit; Pal, Arpan
2017-07-01
Identification of pulmonary diseases comprises accurate auscultation as well as elaborate and expensive pulmonary function tests. Prior art has shown that pulmonary diseases lead to abnormal lung sounds such as wheezes and crackles. This paper introduces novel spectral and spectrogram features, which are further refined by the Maximal Information Coefficient, leading to the classification of healthy and abnormal lung sounds. A balanced lung sound dataset, consisting of publicly available data and data collected with a low-cost in-house digital stethoscope, is used. The performance of the classifier is validated over several randomly selected non-overlapping training and validation samples and tested on separate subjects for two separate test cases: (a) overlapping and (b) non-overlapping data sources in training and testing. The results reveal that the proposed method sustains an accuracy of 80% even for non-overlapping data sources in training and testing.
NASA Astrophysics Data System (ADS)
Elbakary, Mohamed I.; Iftekharuddin, Khan M.; Papelis, Yiannis; Newman, Brett
2017-05-01
Air Traffic Management (ATM) concepts are commonly tested in simulation to obtain preliminary results and validate the concepts before adoption. However, researchers have found that simulation alone is not enough because of the complexity associated with ATM concepts; full-scale tests must eventually take place to provide compelling performance evidence before full implementation is adopted. Testing with full-scale aircraft is a high-cost approach that yields high-confidence results, whereas simulation provides a low-risk, low-cost approach with reduced confidence in the results. One possible way to increase confidence in the results while simultaneously reducing risk and cost is to use unmanned sub-scale aircraft to test new ATM concepts. This paper presents simulation results for the use of unmanned sub-scale aircraft to implement ATM concepts, compared with full-scale aircraft. The simulation results show that the performance of the sub-scale aircraft is quite comparable to that of the full-scale aircraft, which validates the use of sub-scale aircraft in testing new ATM concepts.
Raffo, Antonio; Costanzo, Sandra; Di Massa, Giuseppe
2017-01-08
A vibration sensor based on the use of a Software-Defined Radio (SDR) platform is adopted in this work to provide a contactless and multipurpose solution for low-cost real-time vibration monitoring. In order to test the vibration detection ability of the proposed non-contact method, a 1 GHz Doppler radar sensor is simulated and successfully assessed on targets at various distances, with various oscillation frequencies and amplitudes. Furthermore, an SDR Doppler platform is practically realized, and preliminary experimental validations on a device able to produce a harmonic motion are illustrated to prove the effectiveness of the proposed approach.
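The sensing principle can be illustrated numerically: a target vibrating with displacement x(t) phase-modulates the CW carrier by φ(t) = 4πx(t)/λ, so the vibration frequency appears as a spectral line in the demodulated phase. A sketch with an ideal simulated baseband signal follows; the carrier wavelength, vibration amplitude, and frequency are assumptions for illustration, not the paper's test settings.

```python
import numpy as np

# Assumed setup (illustrative): 1 GHz carrier -> wavelength ~0.3 m,
# target vibrating at 15 Hz with 1 mm amplitude, sampled at 1 kHz.
fs = 1000.0
t = np.arange(0.0, 2.0, 1.0 / fs)
lam = 0.3
f_vib, amp = 15.0, 1e-3

# Target displacement phase-modulates the received carrier: phi = 4*pi*x/lam.
x = amp * np.sin(2 * np.pi * f_vib * t)
iq = np.exp(1j * 4 * np.pi * x / lam)  # ideal noiseless baseband I/Q

# Demodulate the phase and pick the strongest spectral line (excluding DC).
phase = np.angle(iq)
spec = np.abs(np.fft.rfft(phase * np.hanning(len(phase))))
freqs = np.fft.rfftfreq(len(phase), 1.0 / fs)
f_est = freqs[1:][np.argmax(spec[1:])]
```

With a millimetre-scale amplitude the phase excursion stays well below π, so no unwrapping is needed; larger motions would require `np.unwrap` before the FFT.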
CARVALHO, Suzana Papile Maciel; BRITO, Liz Magalhães; de PAIVA, Luiz Airton Saavedra; BICUDO, Lucilene Arilho Ribeiro; CROSATO, Edgard Michel; de OLIVEIRA, Rogério Nogueira
2013-01-01
Validation studies of physical anthropology methods in the different population groups are extremely important, especially in cases in which the population variations may cause problems in the identification of a native individual by the application of norms developed for different communities. Objective This study aimed to estimate the gender of skeletons by application of the method of Oliveira et al. (1995), previously used in a population sample from Northeast Brazil. Material and Methods The accuracy of this method was assessed for a population from Southeast Brazil and validated by statistical tests. The method used two mandibular measurements, namely the bigonial distance and the mandibular ramus height. The sample was composed of 66 skulls and the method was applied by two examiners. The results were statistically analyzed by the paired t test, logistic discriminant analysis and logistic regression. Results The results demonstrated that the application of the method of Oliveira et al. (1995) in this population achieved very different outcomes between genders, with 100% for females and only 11% for males, which may be explained by ethnic differences. However, statistical adjustment of measurement data for the population analyzed allowed accuracy of 76.47% for males and 78.13% for females, with the creation of a new discriminant formula. Conclusion It was concluded that methods involving physical anthropology present a high rate of accuracy for human identification, easy application, low cost, and simplicity; however, the methodologies must be validated for the different populations due to differences in ethnic patterns, which are directly related to the phenotypic aspects. In this specific case, the method of Oliveira et al.
(1995) presented good accuracy and may be used for gender estimation in Brazil in two geographic regions, namely Northeast and Southeast; however, for other regions of the country (North, Central West and South), previous methodological adjustment is recommended as demonstrated in this study. PMID:24037076
Kern, Simon; Meyer, Klas; Guhl, Svetlana; Gräßer, Patrick; Paul, Andrea; King, Rudibert; Maiwald, Michael
2018-05-01
Monitoring specific chemical properties is the key to chemical process control. Today, mainly optical online methods are applied, which require time- and cost-intensive calibration effort. NMR spectroscopy, with the advantage of being a direct comparison method that requires no calibration, has a high potential for enabling closed-loop process control while exhibiting short set-up times. Compact NMR instruments make NMR spectroscopy accessible in industrial and rough environments for process monitoring and advanced process control strategies. We present a fully automated data analysis approach based entirely on physically motivated spectral models as first-principles information (indirect hard modeling, IHM) and apply it to a given pharmaceutical lithiation reaction in the framework of the European Union's Horizon 2020 project CONSENS. Online low-field NMR (LF NMR) data was analyzed by IHM with low calibration effort, compared to a multivariate PLS-R (partial least squares regression) approach, and both were validated using online high-field NMR (HF NMR) spectroscopy. Graphical abstract: NMR sensor module for monitoring the aromatic coupling of 1-fluoro-2-nitrobenzene (FNB) with aniline to 2-nitrodiphenylamine (NDPA) using lithium bis(trimethylsilyl)amide (Li-HMDS) in continuous operation. Online 43.5 MHz low-field NMR (LF) was compared to 500 MHz high-field NMR spectroscopy (HF) as the reference method.
Global cost of child survival: estimates from country-level validation
van Ekdom, Liselore; Scherpbier, Robert W; Niessen, Louis W
2011-01-01
Abstract Objective To cross-validate the global cost of scaling up child survival interventions to achieve the fourth Millennium Development Goal (MDG4) as estimated by the World Health Organization (WHO) in 2007 by using the latest country-provided data and new assumptions. Methods After the main cost categories for each country were identified, validation questionnaires were sent to 32 countries with high child mortality. Publicly available estimates for disease incidence, intervention coverage, prices and resources for individual-level and programme-level activities were validated against local data. Nine updates to the 2007 WHO model were generated using revised assumptions. Finally, estimates were extrapolated to 75 countries and combined with cost estimates for immunization and malaria programmes and for programmes for the prevention of mother-to-child transmission of the human immunodeficiency virus (HIV). Findings Twenty-six countries responded. Adjustments were largest for system- and programme-level data and smallest for patient data. Country-level validation caused a 53% increase in original cost estimates (i.e. 9 billion 2004 United States dollars [US$]) for 26 countries owing to revised system and programme assumptions, especially surrounding community health worker costs. The additional effect of updated population figures was small; updated epidemiologic figures increased costs by US$ 4 billion (+15%). New unit prices in the 26 countries that provided data increased estimates by US$ 4.3 billion (+16%). Extrapolation to 75 countries increased the original price estimate by US$ 33 billion (+80%) for 2010–2015. Conclusion Country-level validation had a significant effect on the cost estimate. Price adaptations and programme-related assumptions contributed substantially. An additional US$ 74 billion (2005 dollars), representing a 12% increase in total health expenditure, would be needed between 2010 and 2015.
Given resource constraints, countries will need to prioritize health activities within their national resource envelope. PMID:21479091
Rothenhöfer, Martin; Scherübl, Rosmarie; Bernhardt, Günther; Heilmann, Jörg; Buschauer, Armin
2012-07-27
Purified oligomers of hyalobiuronic acid are indispensable tools to elucidate the physiological and pathophysiological role of hyaluronan degradation by various hyaluronidase isoenzymes. Therefore, we established and validated a novel sensitive, convenient, rapid, and cost-effective high performance thin layer chromatography (HPTLC) method for the qualitative and quantitative analysis of small saturated hyaluronan oligosaccharides consisting of 2-4 hyalobiuronic acid moieties. The use of amino-modified silica as stationary phase allows a simple reagent-free in situ derivatization by heating, resulting in a very low limit of detection (7-19 pmol per band, depending on the analyzed saturated oligosaccharide). This derivatization procedure enabled, for the first time, densitometric quantification of the analytes by HPTLC. The validated method showed a quantification limit of 37-71 pmol per band and was proven to be superior to conventional detection of hyaluronan oligosaccharides. The analytes were identified by hyphenation of normal phase planar chromatography to mass spectrometry (TLC-MS) using electrospray ionization. As an alternative to sequential techniques such as high performance liquid chromatography (HPLC) and capillary electrophoresis (CE), the validated HPTLC quantification method can easily be automated and is applicable to the analysis of multiple samples in parallel. Copyright © 2012 Elsevier B.V. All rights reserved.
Doubly stochastic radial basis function methods
NASA Astrophysics Data System (ADS)
Yang, Fenglian; Yan, Liang; Ling, Leevan
2018-06-01
We propose a doubly stochastic radial basis function (DSRBF) method for function recovery. Instead of a constant, we treat the RBF shape parameters as stochastic variables whose distribution is determined by a stochastic leave-one-out cross validation (LOOCV) estimation. A careful operation count is provided in order to determine the ranges of all the parameters in our method. The overhead cost for setting up the proposed DSRBF method is O(n²) for function recovery problems with n basis functions. Numerical experiments confirm that the proposed method outperforms not only the constant-shape-parameter formulation (in terms of accuracy at comparable computational cost) but also the optimal LOOCV formulation (in terms of both accuracy and computational cost).
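The idea in the abstract can be sketched as follows: build a Gaussian RBF interpolant whose shape parameter varies per center, draw candidate parameter sets at random, and score each draw with Rippa's leave-one-out formula (which yields all n leave-one-out residuals from a single matrix inversion). This is a minimal one-dimensional illustration of stochastic-LOOCV selection, not the authors' exact scheme; the sampling range and draw count are assumptions.

```python
import numpy as np

def rbf_matrix(x, centers, eps):
    # Gaussian kernel with one shape parameter per center (per column).
    return np.exp(-(eps[None, :] * (x[:, None] - centers[None, :])) ** 2)

def loocv_score(x, f, eps):
    # Rippa's formula: the i-th leave-one-out residual is c_i / (A^{-1})_{ii}.
    Ainv = np.linalg.inv(rbf_matrix(x, x, eps))
    c = Ainv @ f
    return float(np.linalg.norm(c / np.diag(Ainv)))

def fit_dsrbf(x, f, n_draws=100, lo=3.0, hi=30.0, seed=0):
    # Stochastic selection: sample random per-center shape parameters and
    # keep the draw with the smallest estimated leave-one-out error.
    rng = np.random.default_rng(seed)
    best_eps, best_score = None, np.inf
    for _ in range(n_draws):
        eps = rng.uniform(lo, hi, size=len(x))
        score = loocv_score(x, f, eps)
        if score < best_score:
            best_eps, best_score = eps, score
    coef = np.linalg.solve(rbf_matrix(x, x, best_eps), f)
    return best_eps, coef
```

Evaluating the fitted interpolant at new points is then `rbf_matrix(x_new, x, eps) @ coef`.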
Vassall, Anna; van Kampen, Sanne; Sohn, Hojoon; Michael, Joy S.; John, K. R.; den Boon, Saskia; Davis, J. Lucian; Whitelaw, Andrew; Nicol, Mark P.; Gler, Maria Tarcela; Khaliqov, Anar; Zamudio, Carlos; Perkins, Mark D.; Boehme, Catharina C.; Cobelens, Frank
2011-01-01
Background Xpert MTB/RIF (Xpert) is a promising new rapid diagnostic technology for tuberculosis (TB) that has characteristics that suggest large-scale roll-out. However, because the test is expensive, there are concerns among TB program managers and policy makers regarding its affordability for low- and middle-income settings. Methods and Findings We estimate the impact of the introduction of Xpert on the costs and cost-effectiveness of TB care using decision analytic modelling, comparing the introduction of Xpert to a base case of smear microscopy and clinical diagnosis in India, South Africa, and Uganda. The introduction of Xpert increases TB case finding in all three settings; from 72%–85% to 95%–99% of the cohort of individuals with suspected TB, compared to the base case. Diagnostic costs (including the costs of testing all individuals with suspected TB) also increase: from US$28–US$49 to US$133–US$146 and US$137–US$151 per TB case detected when Xpert is used “in addition to” and “as a replacement of” smear microscopy, respectively. The incremental cost-effectiveness ratios (ICERs) for using Xpert “in addition to” smear microscopy, compared to the base case, range from US$41–$110 per disability adjusted life year (DALY) averted. Likewise, the ICERs for using Xpert “as a replacement of” smear microscopy range from US$52–$138 per DALY averted. These ICERs are below the World Health Organization (WHO) willingness-to-pay threshold. Conclusions Our results suggest that Xpert is a cost-effective method of TB diagnosis, compared to a base case of smear microscopy and clinical diagnosis of smear-negative TB in low- and middle-income settings where, with its ability to substantially increase case finding, it has important potential for improving TB diagnosis and control. The extent of the cost-effectiveness gain to TB programmes from deploying Xpert is primarily dependent on current TB diagnostic practices.
Further work is required during scale-up to validate these findings. Please see later in the article for the Editors' Summary PMID:22087078
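The headline numbers above are incremental cost-effectiveness ratios, i.e., extra cost divided by extra DALYs averted relative to the base case. A minimal sketch of the calculation with hypothetical cohort figures (illustrative only, not the study's data):

```python
def icer(cost_new, cost_base, dalys_averted_new, dalys_averted_base):
    # Incremental cost-effectiveness ratio: extra dollars spent per extra
    # DALY averted when moving from the base strategy to the new one.
    d_cost = cost_new - cost_base
    d_effect = dalys_averted_new - dalys_averted_base
    if d_effect <= 0:
        raise ValueError("new strategy averts no additional DALYs")
    return d_cost / d_effect

# Hypothetical figures for a cohort of 1,000 individuals with suspected TB:
value = icer(cost_new=150_000, cost_base=40_000,
             dalys_averted_new=2_200, dalys_averted_base=1_000)
```

A strategy is then judged cost-effective when its ICER falls below the chosen willingness-to-pay threshold.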
How low can you go? The impact of reduced benefits and increased cost sharing.
Lee, Jason S; Tollen, Laura
2002-01-01
Amid escalating health care costs and a managed care backlash, employers are considering traditional cost control methods from the pre-managed care era. We use an actuarial model to estimate the premium-reducing effects of two such methods: increasing employee cost sharing and reducing benefits. Starting from a baseline plan with rich benefits and low cost sharing, estimated premium savings as a result of eliminating five specific benefits were about 22 percent. The same level of savings was also achieved by increasing cost sharing from a $15 copayment with no deductible to 20 percent coinsurance and a $250 deductible. Further increases in cost sharing produced estimated savings of up to 50 percent. We discuss possible market- and individual-level effects of the proliferation of plans with high cost sharing and low benefits.
NASA Astrophysics Data System (ADS)
Tubiana, Jerome; Kass, Alex J.; Newman, Maya Y.; Levitz, David
2015-07-01
Detecting pre-cancer in epithelial tissues such as the cervix is a challenging task in low-resource settings. In an effort to achieve a low-cost cervical cancer screening and diagnostic method for use in low-resource settings, mobile colposcopes that use a smartphone as their engine have been developed. Designing image analysis software suited for this task requires proper modeling of light propagation from the abnormalities inside tissues to the camera of the smartphone. Different simulation methods have been developed in the past, by solving light diffusion equations or running Monte Carlo simulations. Several algorithms exist for the latter, including MCML and the recently developed MCX. For imaging purposes, the observable parameter of interest is the reflectance profile of a tissue under a specific pattern of illumination and optical setup. Extensions of the MCX algorithm to simulate this observable under these conditions were developed. These extensions were validated against MCML and diffusion theory for the simple case of contact measurements, and reflectance profiles under colposcopy imaging geometry were also simulated. To validate this model, the diffuse reflectance profiles of tissue phantoms were measured with a spectrometer under several illumination and optical settings for various homogeneous tissue phantoms. The measured reflectance profiles showed a non-trivial deviation across the spectrum. Measurements of an added-absorber experiment on a series of phantoms showed that absorption of dye scales linearly when fit to both MCX and diffusion models. More work is needed to integrate a pupil into the experiment.
NASA Astrophysics Data System (ADS)
Stewart, R. D.; Abou Najm, M. R.; Rupp, D. E.; Selker, J. S.
2010-12-01
Shrinking/swelling soils are characterized by transient crack networks which function as dominant controls on the partitioning of surface and subsurface flow, the rate and depth of percolation, and evaporation rates. For such soils, understanding the dynamics of cracks is critical to accurately quantify their influence on groundwater recharge, stream-flow generation, and solute transport, among other components of a site’s hydrology. We propose a low-cost method for measuring transient crack volume using a sealed plastic bag connected by a hose to a PVC standpipe. The empty bag is placed into the crack, and then water is added via the standpipe until the bag has expanded to the boundaries of the crack and some water remains in the standpipe. As the crack shrinks or swells, its volume changes, causing water displacement within the bag, which is measured as a corresponding change in water level in the standpipe. An automated level logger within the standpipe is used to record changes in water level, which are converted to volumetric changes from the known internal cross-sectional area of the standpipe. The volume of water filling the bag is accurately measured at the start and completion of the experiment (to check for leakage). Adding the startup volume to the cumulative temporal volumetric change in the standpipe provides a simple and accurate method for monitoring transient crack volume. Currently, the design is undergoing preliminary testing at a field site in Ninhue, Chile, and field and laboratory testing in Corvallis, Oregon. Initial results from the Chilean field site suggest that the crack-o-meters are responding to the closing of cracks, but further effort is needed to calibrate and validate the results. We hope that these low-cost “crack-o-meters” will become useful and simple tools for researchers to quantify temporal changes in crack volume, with the objective of incorporating these results into hydrological modeling efforts.
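The volume conversion in this method is purely geometric: a level drop Δh in the standpipe means A_pipe·Δh of water has moved into the bag, i.e., the crack has widened by that volume. A sketch of the logger post-processing (function name and example values are hypothetical):

```python
import math

def crack_volume_series(levels_m, pipe_diameter_m, startup_volume_m3):
    # Convert logged standpipe water levels into total crack volume.
    # A falling level means water has moved into the bag, i.e. the crack
    # has widened by (standpipe cross-section) * (level drop).
    area = math.pi * (pipe_diameter_m / 2.0) ** 2
    h0 = levels_m[0]
    return [startup_volume_m3 + area * (h0 - h) for h in levels_m]

# Example: 5 cm inner-diameter standpipe, 2.0 L of water initially in the bag.
vols = crack_volume_series([1.00, 0.90, 0.95], 0.05, 2.0e-3)
```

Here a 10 cm level drop in the 5 cm pipe corresponds to roughly 0.2 L of crack expansion; a subsequent level rise indicates the crack closing again.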
Global Perspective for Protecting Intellectual Property - Patenting in USA and Poland
NASA Astrophysics Data System (ADS)
Grebski, Michalene Eva; Wolniak, Radosław
2018-06-01
The paper addresses the different methods for protecting intellectual property in modern knowledge-based economies. The focus of the paper is a comparison between the procedures for applying for patents in Poland and the United States. The comparison is made from the perspective of the cost of obtaining and maintaining a patent in Poland, the United States, and some other countries, and from the perspective of the procedures for applying for a patent in different countries under the Patent Cooperation Treaty. The paper also includes a comparison of the time needed to process a patent application. The low-cost, twelve-month provisional patent-pending protection available in the United States is also discussed. Finally, the paper provides guidance and recommendations for conducting a patent search in order to validate the originality of an invention.
Validation of extremes within the Perfect-Predictor Experiment of the COST Action VALUE
NASA Astrophysics Data System (ADS)
Hertig, Elke; Maraun, Douglas; Wibig, Joanna; Vrac, Mathieu; Soares, Pedro; Bartholy, Judith; Pongracz, Rita; Mares, Ileana; Gutierrez, Jose Manuel; Casanueva, Ana; Alzbutas, Robertas
2016-04-01
Extreme events are of widespread concern due to their damaging consequences for natural and anthropogenic systems. From science to applications, the statistical attributes of extremes (rare, infrequent occurrence and low probability) are coupled with their strong socio-economic impacts. Specific end-user needs regarding information about extreme events depend on the type of application, but the common element is the request for easily accessible climate change information with a clear description of its uncertainties and limitations. Within the Perfect-Predictor Experiment of the COST Action VALUE, extreme indices modelled by a wide range of downscaling methods are compared to reference indices calculated from observational data. The experiment uses reference data from a selection of 86 weather stations representative of the different climates in Europe. Results are presented for temperature and precipitation extremes and include aspects of the marginal distribution as well as spell-length related aspects.
Conget, Ignacio; Martín-Vaquero, Pilar; Roze, Stéphane; Elías, Isabel; Pineda, Cristina; Álvarez, María; Delbaere, Alexis; Ampudia-Blasco, Francisco Javier
2018-05-19
To compare the cost-effectiveness of sensor-augmented pump therapy (SAP) [continuous subcutaneous insulin infusion (CSII) plus real-time continuous glucose monitoring (RT-CGM)] with low glucose suspend (MiniMed™ Veo™) and CSII alone in patients with type 1 diabetes mellitus (T1DM) at high risk of hypoglycemia in Spain. The IQVIA CORE Diabetes Model was used to estimate healthcare outcomes as life-years gained (LYGs) and quality-adjusted life years (QALYs), and to project lifetime costs. Information about efficacy, resource utilization, and unit costs (€2016) was taken from published sources and validated by an expert panel. Analyses were performed from both the Spanish National Health System (NHS) perspective and the societal perspective. From the NHS perspective, SAP with low glucose suspend was associated to a €47,665 increase in direct healthcare costs and to increases of 0.19 LYGs and 1.88 QALYs, both discounted, which resulted in an incremental cost-effectiveness ratio (ICER) of €25,394/QALY. From the societal perspective, SAP with low glucose suspend increased total costs (including direct and indirect healthcare costs) by €41,036, with a resultant ICER of €21,862/QALY. Considering the willingness-to-pay threshold of €30,000/QALY in Spain, SAP with low glucose suspend represents a cost-effective option from both the NHS and societal perspectives. Sensitivity analyses confirmed the robustness of the model. From both the Spanish NHS perspective and the societal perspective, SAP with low glucose suspend is a cost-effective option for the treatment of T1DM patients at high risk of hypoglycemia. Copyright © 2018 SEEN y SED. Publicado por Elsevier España, S.L.U. All rights reserved.
NASA Astrophysics Data System (ADS)
Lippincott, M.; Lewis, E. S.; Gehrke, G. E.; Wise, A.; Pyle, S.; Sinatra, V.; Bland, G.; Bydlowski, D.; Henry, A.; Gilberts, P. A.
2016-12-01
Community groups are interested in low-cost sensors to monitor their environment. However, many new commercial sensors are unknown devices without peer-reviewed evaluations of data quality or pathways to regulatory acceptance, and the time to achieve these outcomes may be beyond a community's patience and attention. Rather than developing a device from scratch or validating a new commercial product, a workflow is presented whereby existing technologies, especially those that are out of patent, are replicated through open online collaboration between communities affected by environmental pollution, volunteers, academic institutions, and existing open hardware and open source software projects. Technology case studies will be presented, focusing primarily on a passive PM monitor based on the UNC Passive Monitor. Stages of the project will be detailed, moving from identifying community needs and reviewing existing technology through partnership development, technology replication, IP review and licensing, data quality assurance (in process), and field evaluation with community partners (in process), with special attention to partnership development and technology review. We have leveraged open hardware and open source software to lower the cost and access barriers of existing technologies for PM10-2.5 and other atmospheric measures that have already been validated through peer review. Existing validation of and regulatory familiarity with a technology enables a rapid pathway towards collecting data, shortening the time it takes for communities to leverage data in environmental management decisions. Online collaboration requires rigorous documentation, which aids in spreading research methods and promotes deep engagement by interested community researchers outside academia.
At the same time, careful choice of technology and the use of small-scale fabrication through laser cutting, 3D printing, and open, shared repositories of plans and software enables educational engagement that broadens a project's reach.
NASA Astrophysics Data System (ADS)
Yi, Xiaoqing; Hao, Liling; Jiang, Fangfang; Xu, Lisheng; Song, Shaoxiu; Li, Gang; Lin, Ling
2017-08-01
Synchronous acquisition of multi-channel biopotential signals, such as electrocardiograph (ECG) and electroencephalograph signals, has vital significance in health care and clinical diagnosis. In this paper, we propose a new method that uses a single-channel ADC to synchronously acquire multi-channel biopotential signals modulated by square waves. In this method, a specific modulation and demodulation scheme has been investigated that avoids complex signal processing. For each channel, the sampling rate does not decline as the number of signal channels increases. More specifically, the signal-to-noise ratio of each channel is n times that of the time-division method, an improvement of 3.01×log2(n) dB, where n represents the number of signal channels. A numerical simulation shows the feasibility and validity of this method. In addition, a newly developed 8-lead ECG based on the new method is introduced. These experiments illustrate that the method is practicable and thus has potential for low-cost medical monitors.
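The claimed SNR advantage is simple arithmetic: a factor of n in SNR corresponds to 10·log10(n) = 3.01·log2(n) dB. A minimal sketch of that conversion (the function name is illustrative, not from the paper):

```python
import math

def snr_gain_db(n_channels: int) -> float:
    """SNR improvement (in dB) over time-division sampling when the
    per-channel SNR is higher by a factor of n, as stated in the abstract."""
    return 10 * math.log10(n_channels)  # identical to 3.01 * log2(n)

# For the 8-lead ECG example (n = 8 channels):
print(round(snr_gain_db(8), 2))  # 9.03
```

For the 8-channel case this gives 3.01 × log2(8) = 3.01 × 3 ≈ 9.03 dB.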
Estimation of low back moments from video analysis: a validation study.
Coenen, Pieter; Kingma, Idsart; Boot, Cécile R L; Faber, Gert S; Xu, Xu; Bongers, Paulien M; van Dieën, Jaap H
2011-09-02
This study aimed to develop, compare and validate two versions of a video analysis method for the assessment of low back moments during occupational lifting tasks, since epidemiological studies and ergonomic practice need relatively cheap and easily applicable methods to assess low back loads. Ten healthy subjects participated in a protocol comprising 12 lifting conditions. Low back moments were assessed using two variants of a video analysis method and a lab-based reference method. Repeated measures ANOVAs showed no overall differences in peak moments between the two versions of the video analysis method and the reference method. However, two conditions showed a minor overestimation of the moments by one of the video analysis methods. Standard deviations were considerable, suggesting that errors in the video analysis were random. Furthermore, there was a small underestimation of the dynamic components and an overestimation of the static components of the moments. Intraclass correlation coefficients for peak moments showed high correspondence (>0.85) of the video analyses with the reference method. It is concluded that, when a sufficient number of measurements can be taken, the video analysis method for assessment of low back loads during lifting tasks provides valid estimates of low back moments in ergonomic practice and epidemiological studies, for lifts up to a moderate level of asymmetry. Copyright © 2011 Elsevier Ltd. All rights reserved.
Gabrilovich, Evgeniy
2013-01-01
Background: Postmarket drug safety surveillance largely depends on spontaneous reports by patients and health care providers; hence, less common adverse drug reactions, especially those caused by long-term exposure, multidrug treatments, or those specific to special populations, often elude discovery. Objective: Here we propose a low-cost, fully automated method for continuous monitoring of adverse drug reactions to single drugs and to combinations thereof, and demonstrate the discovery of heretofore-unknown ones. Methods: We used aggregated search data of large populations of Internet users to extract information related to drugs and adverse reactions to them, and correlated these data over time. We further extended our method to identify adverse reactions to combinations of drugs. Results: We validated our method by showing high correlations of our findings with known adverse drug reactions (ADRs). However, although acute early-onset drug reactions are more likely to be reported to regulatory agencies, we show that less acute later-onset ones are better captured in Web search queries. Conclusions: Our method is advantageous in identifying previously unknown adverse drug reactions. These ADRs should be considered as candidates for further scrutiny by medical regulatory authorities, for example, through phase 4 trials. PMID:23778053
Expert system verification and validation study. Delivery 3A and 3B: Trip summaries
NASA Technical Reports Server (NTRS)
French, Scott
1991-01-01
Key results are documented from attendance at the 4th workshop on verification, validation, and testing. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how these might affect approaches to VV&T of expert systems. The next part of the workshop focused on VV&T methods: applying mathematical techniques to the verification of rule bases and techniques for capturing information relating to the process of developing software. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation'. The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.
NASA Astrophysics Data System (ADS)
Rothmund, Sabrina; Niethammer, Uwe; Walter, Marco; Joswig, Manfred
2013-04-01
In recent years, high-resolution, multi-temporal 3D mapping of the Earth's surface using terrestrial laser scanning (TLS), ground-based optical images and especially low-cost UAV-based (Unmanned Aerial Vehicle) aerial images has grown in importance. This development resulted from the progressive technical improvement of imaging systems and the freely available multi-view stereo (MVS) software packages. These different methods of data acquisition for the generation of accurate, high-resolution digital surface models (DSMs) were applied as part of an eight-week field campaign at the Super-Sauze landslide (South French Alps). An area of approximately 10,000 m² with long-term average displacement rates greater than 0.01 m/day was investigated. The TLS-based point clouds were acquired at different viewpoints, with an average point spacing between 10 and 40 mm, and at different dates. On these days, more than 50 optical images were taken at points along a predefined line on the side of the landslide with a low-cost digital compact camera. Additionally, aerial images were taken by a radio-controlled mini quad-rotor UAV equipped with another low-cost digital compact camera. The flight altitude ranged between 20 m and 250 m, producing a corresponding ground resolution between 0.6 cm and 7 cm. DGPS measurements were carried out as well in order to geo-reference and validate the point cloud data. To generate unscaled photogrammetric 3D point clouds from a disordered and tilted image set, we use the widespread open-source software packages Bundler and PMVS2 (University of Washington). These multi-temporal DSMs are required, on the one hand, to determine the three-dimensional surface deformations and, on the other hand, for the differential correction needed for orthophoto production. Drawing on the example of the data acquired at the Super-Sauze landslide, we demonstrate the potential but also the limitations of photogrammetric point clouds.
To determine the quality of the photogrammetric point clouds, they are compared with the TLS-based DSMs. The comparison shows that photogrammetric point accuracies are in the range of cm to dm and therefore do not reach the quality of the high-resolution TLS-based DSMs. Furthermore, the validation of the photogrammetric point clouds reveals that some of them have internal curvature effects. The advantage of photogrammetric 3D data acquisition is the use of low-cost equipment and less time-consuming data collection in the field. While the accuracy of the photogrammetric point clouds is not as high as that of the TLS-based DSMs, the advantages of the former method are seen when it is applied in areas where dm-range accuracy is sufficient.
Bardaji, Raul; Sánchez, Albert-Miquel; Simon, Carine; Wernand, Marcel R; Piera, Jaume
2016-03-15
A critical parameter for assessing the environmental status of water bodies is the transparency of the water, as it is strongly affected by different water-quality-related components (such as the presence of phytoplankton, organic matter and sediment concentrations). One parameter to assess water transparency is the diffuse attenuation coefficient. However, the number of subsurface irradiance measurements obtained with conventional instrumentation is relatively low, due to instrument costs and the logistic requirements of providing regular and autonomous observations. In recent years, the citizen science concept has increased the number of environmental observations, both in time and space. Recent technological advances in embedded systems and sensors also enable volunteers (citizens) to create their own devices (known as Do-It-Yourself or DIY technologies). In this paper, a DIY instrument to measure irradiance at different depths and automatically calculate the diffuse attenuation coefficient Kd is presented. The instrument, named KdUINO, is based on an encapsulated low-cost photonic sensor and Arduino (an open-hardware platform for data acquisition). The whole instrument has been successfully operated, and the data have been validated by comparing the KdUINO measurements with those of commercial instruments. Workshops have been organized with high school students to validate its feasibility.
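The abstract does not detail how KdUINO computes Kd, but under the standard assumption of Beer-Lambert exponential decay of irradiance with depth, Kd is minus the least-squares slope of ln(E) versus depth. A hypothetical sketch under that assumption (not the instrument's actual firmware):

```python
import math

def estimate_kd(depths_m, irradiances):
    """Diffuse attenuation coefficient Kd (1/m), assuming Beer-Lambert
    decay E(z) = E0 * exp(-Kd * z): Kd is minus the least-squares slope
    of ln(E) versus depth z."""
    n = len(depths_m)
    mz = sum(depths_m) / n
    ml = sum(math.log(e) for e in irradiances) / n
    num = sum((z - mz) * (math.log(e) - ml)
              for z, e in zip(depths_m, irradiances))
    den = sum((z - mz) ** 2 for z in depths_m)
    return -num / den

# Synthetic readings with a true Kd of 0.5 per metre:
depths = [0.5, 1.0, 1.5, 2.0]
readings = [1000 * math.exp(-0.5 * z) for z in depths]
print(round(estimate_kd(depths, readings), 3))  # 0.5
```

With noisy field data the regression over several depths averages out sensor error, which is why multi-depth designs like this one are preferred over a two-point ratio.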
Buck, Ursula; Buße, Kirsten; Campana, Lorenzo; Schyma, Christian
2018-03-01
Three-dimensional (3D) measurement techniques are gaining importance in many areas. The latest developments have brought more cost-effective, user-friendly, and faster technologies onto the market. Which 3D techniques are suitable in the field of forensic medicine, and what are their advantages and disadvantages? This wide-ranging study evaluated and validated various 3D measurement techniques against forensic requirements. High-tech methods as well as low-budget systems were tested and compared in terms of accuracy, ease of use, expenditure of time, mobility, cost, necessary know-how, and their limitations. Within this study, various commercial measuring systems of the different techniques were tested. Based on the first results, one measuring system was selected for each technique that appeared to be the most suitable for forensic application or is already established in forensic medicine. The body of a deceased person, the face and an injury of a living person, and a shoe sole were recorded by 11 people with different professions and previous knowledge using the selected systems. The results were assessed and the personal experiences were evaluated using a questionnaire. In addition, precision investigations were carried out using test objects. The study shows that the hand-held scanner and photogrammetry are very suitable for the 3D documentation of forensic medical findings. Their moderate acquisition costs and easy operation could lead to more frequent application in forensic medicine in the future. For special applications, the stripe-light scanner still has its justification due to its high precision, flexible application area, and high reliability. The results show that, thanks to technological advances, 3D measurement technology will have an increasing impact on routine forensic medical examination.
Hertzog, Gabriel I; Soares, Karina L; Caldas, Sergiane S; Primel, Ednei G
2015-06-01
A procedure based on vortex-assisted matrix solid-phase dispersion (MSPD) for the extraction of 15 pharmaceuticals from fish samples, with determination by liquid chromatography-tandem mass spectrometry (LC-MS/MS), was validated. Florisil, C18, diatomaceous earth, chitin, and chitosan were evaluated as solid supports. Best results were obtained with 0.5 g of diatomaceous earth, 0.5 g of sodium sulfate, and 5 mL of methanol. Analytical recoveries ranged from 58 to 128% with relative standard deviations (RSD) lower than 15%. Limit of quantification (LOQ) values for the 15 compounds ranged from 5 to 1000 ng g(-1). The method under investigation proved to be a simple and fast extraction tool requiring minimal instrumentation and a low amount of reagent, resulting in a low-cost method. In addition, alternative materials such as chitin and chitosan, applied to the dispersion step for the first time, were found to be interesting alternatives.
NASA Astrophysics Data System (ADS)
Agogue, Romain; Chebil, Naziha; Deleglise-Lagardere, Mylène; Beauchene, Pierre; Park, Chung Hae
2017-10-01
We propose a new experimental method using a Hassler cell and air injection to measure the permeability of fiber preforms while avoiding the race-tracking effect. This method proved particularly efficient for measuring the very low through-thickness permeability of preforms fabricated by automated dry fiber placement. To validate the reliability of the permeability measurement, experiments of viscous liquid infusion into the preform, with or without a distribution medium, were performed. The experimental data on flow-front advancement were compared with numerical simulation results using the permeability values obtained from the Hassler cell permeability measurement set-up as well as from the liquid infusion experiments. To address the computational cost issue, a model for the equivalent permeability of the distribution medium was employed in the numerical simulation of liquid flow. The new concept of using air injection and a Hassler cell for fiber preform permeability measurement was shown to be reliable and efficient.
Carvalho, Suzana Papile Maciel; Brito, Liz Magalhães; Paiva, Luiz Airton Saavedra de; Bicudo, Lucilene Arilho Ribeiro; Crosato, Edgard Michel; Oliveira, Rogério Nogueira de
2013-01-01
Validation studies of physical anthropology methods in different population groups are extremely important, especially in cases in which population variation may cause problems in the identification of a native individual by the application of norms developed for different communities. This study aimed to estimate the gender of skeletons by application of the method of Oliveira et al. (1995), previously used in a population sample from Northeast Brazil. The accuracy of this method was assessed for a population from Southeast Brazil and validated by statistical tests. The method used two mandibular measurements, namely the bigonial distance and the mandibular ramus height. The sample was composed of 66 skulls, and the method was applied by two examiners. The results were statistically analyzed by the paired t test, logistic discriminant analysis and logistic regression. The results demonstrated that the application of the method of Oliveira et al. (1995) in this population achieved very different outcomes between genders, with 100% accuracy for females and only 11% for males, which may be explained by ethnic differences. However, statistical adjustment of the measurement data for the population analyzed allowed accuracy of 76.47% for males and 78.13% for females, with the creation of a new discriminant formula. It was concluded that methods involving physical anthropology present a high rate of accuracy for human identification, easy application, low cost and simplicity; however, the methodologies must be validated for different populations due to differences in ethnic patterns, which are directly related to phenotypic aspects. In this specific case, the method of Oliveira et al. (1995) presented good accuracy and may be used for gender estimation in two geographic regions of Brazil, namely the Northeast and Southeast; however, for other regions of the country (North, Central West and South), prior methodological adjustment is recommended, as demonstrated in this study.
Fong, Daniel Tik-Pui; Chan, Yue-Yan
2010-01-01
Wearable motion sensors consisting of accelerometers, gyroscopes and magnetic sensors are readily available nowadays. The small size and low production costs of motion sensors make them a very good tool for human motion analysis. However, data processing and the accuracy of the collected data are important issues for research purposes. In this paper, we aim to review the literature related to the usage of inertial sensors in human lower limb biomechanics studies. A systematic search was done in the following search engines: ISI Web of Knowledge, Medline, SportDiscus and IEEE Xplore. Thirty-nine full papers and conference abstracts with related topics were included in this review. The types of sensors involved, data collection methods, study design, validation methods and their applications were reviewed. PMID:22163542
NASA Astrophysics Data System (ADS)
Su, Yang; Zhou, Hua; Wang, Yiming; Shen, Huiping
2018-03-01
In this paper we propose a new design to demodulate pressure-induced polarization properties using a PBS (polarization beam splitter), which differs from the traditional polarimeter based on the four-detector polarization measurement approach. The theoretical model is established by the Mueller matrix method. Experimental results confirm the validity of our analysis. A proportional, linear relationship is found between the output signal and the applied pressure. A maximum sensitivity of 0.092182 mV/mV is experimentally achieved, and the frequency response exhibits a <0.14 dB variation across the measurement bandwidth. The dependence of sensitivity on the incident SOP (state of polarization) is investigated. The simple, all-fiber configuration, low cost and high-speed potential make it promising for fiber-based dynamic pressure sensing.
NASA Astrophysics Data System (ADS)
Luján, José M.; Bermúdez, Vicente; Guardiola, Carlos; Abbad, Ali
2010-10-01
In-cylinder pressure measurement has historically been used for off-line combustion diagnosis, but online application for real-time combustion control has become of great interest. This work considers low computing-cost methods for analysing the instantaneous variation of the chamber pressure, directly obtained from the electric signal provided by a traditional piezoelectric sensor. The presented methods are based on the detection of sudden changes in the chamber pressure, which are amplified by the pressure derivative and are due to thermodynamic phenomena within the cylinder. Signal analysis tools in both the time and time-frequency domains are used for detecting the start of combustion, the end of combustion and the heat release peak. Results are compared with classical thermodynamic analysis and validated on several turbocharged diesel engines.
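As a rough illustration of the core idea, sudden pressure changes stand out once amplified by the derivative, so a low-cost detector can simply threshold a finite-difference derivative. The sketch below is illustrative only; the threshold, sampling interval, and toy trace are assumptions, not the paper's calibrated method:

```python
def detect_onset(pressure, dt, threshold):
    """Return the index of the first sample whose forward-difference
    derivative exceeds `threshold`, used here as a crude
    start-of-combustion marker on a trace sampled every `dt` seconds."""
    for i in range(len(pressure) - 1):
        if (pressure[i + 1] - pressure[i]) / dt > threshold:
            return i
    return None

# Toy trace: slow compression rise, then a sharp combustion jump at index 5
trace = [1.0, 1.1, 1.2, 1.3, 1.4, 1.5, 3.0, 4.5, 5.0, 5.1]
print(detect_onset(trace, dt=0.001, threshold=500.0))  # 5
```

A real implementation would work on a band-limited derivative (or a time-frequency transform, as the paper does) to suppress sensor noise before thresholding.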
Center for Global Health announces grants to support portable technologies
NCI's Center for Global Health announced grants that will support the development and validation of low-cost, portable technologies. These technologies have the potential to improve early detection, diagnosis, and non-invasive or minimally invasive treatment.
OMI Total Ozone Column Product Validated Against UVMFR Retrievals
NASA Astrophysics Data System (ADS)
Ioannis, Raptis Panagiotis; Kazadziz, Stelios; Eleftherantos, Kostas; Kosmopoulos, Panagiotis; Amiridis, Vassilis
2015-11-01
The Ozone Monitoring Instrument (OMI) is a spectroradiometer on board NASA's Aura satellite, providing Total Ozone Column (TOC) almost globally, every day, with a spatial resolution of 13 km × 24 km, since July 2004. In the next few months Sentinel-5P will be launched, carrying TROPOMI, a spaceborne nadir-viewing spectrometer that will cover the same spectral range, narrowing the spatial resolution to 7 km × 7 km and extending the current data record. Studies that evaluated OMI's product using Brewer spectroradiometer measurements found average biases of less than 3%. The UVMFR (Ultraviolet Multifilter Radiometer) is an instrument designed to measure total and diffuse irradiance and to calculate direct solar irradiance at 7 wavelengths in the UV spectrum, with high accuracy and very high frequency. The main advantages of this instrument are its portability, automatic calibration procedure, simple operational use, unattended functionality, and relatively low cost. In that frame it could become a very effective solution for validating satellite products. A method was developed to retrieve TOC from UVMFR measurements combined with radiative transfer model calculations. Lookup tables of ratios of direct solar irradiance at 305 nm and 325 nm with respect to TOC, solar zenith angle, and aerosol optical depth were constructed and compared with UVMFR irradiance measurements in order to retrieve TOC. We used UVMFR measurements in Athens, Greece, during the period July 2009 to May 2014 to create a TOC time series with high temporal frequency (1 minute for cloudless conditions). The validity of the method was assessed using a Brewer spectroradiometer operating in parallel for the whole period. In order to compare OMI-based and ground-based TOC measurements, we calculated UVMFR daily TOC values by averaging measurements in a 2-hour window around the OMI overpass. This comparison revealed differences of up to 7%, with a mean difference of 4.2 DU and a standard deviation of 8.7%.
The same seasonal cycle was observed in both data sets, with minimum values in October-November and maximum values in April-May. A small seasonally dependent difference between the time series was also observed: the OMI retrieval consistently underestimated during spring months and overestimated during summer months. We investigated this behavior by examining the influence of ozone effective temperature, through its effect on the ozone absorption coefficient, and detected a relation of 0.9% TOC change per K. We applied a correction to the data set using stratospheric temperature climatological values. This method could be adopted to validate TROPOMI retrievals in places where Brewer instruments are not available, benefiting from the instrument's mobility, low cost, and portability.
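The lookup-table retrieval described above amounts to inverting a table of modelled irradiance ratios. A one-dimensional sketch is below; a real implementation would also index solar zenith angle and aerosol optical depth, and the table values here are purely illustrative:

```python
import bisect

def retrieve_toc(measured_ratio, lut_ratios, lut_tocs):
    """Retrieve total ozone column (Dobson units) by linearly
    interpolating a lookup table of modelled 305/325 nm direct
    irradiance ratios. `lut_ratios` must be sorted ascending."""
    i = bisect.bisect_left(lut_ratios, measured_ratio)
    if i == 0:
        return lut_tocs[0]          # below table range: clamp
    if i == len(lut_ratios):
        return lut_tocs[-1]         # above table range: clamp
    r0, r1 = lut_ratios[i - 1], lut_ratios[i]
    t0, t1 = lut_tocs[i - 1], lut_tocs[i]
    frac = (measured_ratio - r0) / (r1 - r0)
    return t0 + frac * (t1 - t0)

# Illustrative table: the 305/325 nm ratio rises as ozone falls
ratios = [0.10, 0.20, 0.30, 0.40]
tocs = [450.0, 380.0, 320.0, 270.0]  # Dobson units
print(round(retrieve_toc(0.25, ratios, tocs), 1))  # 350.0
```

The monotone relation between the ratio and TOC (more ozone absorbs more at 305 nm) is what makes this inversion well defined for fixed solar zenith angle and aerosol conditions.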
Albuquerque Filho, Alfredo Pereira Leite de; Araújo, Jéssica Guido de; Souza, Inacelli Queiroz de; Martins, Luciana Cardoso; Oliveira, Marta Iglis de; Silva, Maria Jesuíta Bezerra da; Montarroyos, Ulisses Ramos; Miranda Filho, Demócrito de Barros
2011-01-01
Leptospirosis is often mistaken for other acute febrile illnesses because of its nonspecific presentation. Bacteriologic, serologic, and molecular methods have several limitations for early diagnosis: technical complexity, low availability, low sensitivity in early disease, or high cost. This study aimed to validate a case definition, based on simple clinical and laboratory tests, that is intended for bedside diagnosis of leptospirosis among hospitalized patients. Adult patients admitted to two reference hospitals in Recife, Brazil, with a febrile illness of less than 21 days and a clinical suspicion of leptospirosis were included to test a case definition comprising ten clinical and laboratory criteria. Leptospirosis was confirmed or excluded by a composite reference standard (microscopic agglutination test, ELISA, and blood culture). Test properties were determined for each cutoff number of criteria from the case definition. Ninety-seven patients were included; 75 had confirmed leptospirosis and 22 did not. The mean number of criteria fulfilled was 7.8±1.2 for confirmed leptospirosis patients and 5.9±1.5 for non-leptospirosis patients (p<0.0001). The best combination of sensitivity (85.3%) and specificity (68.2%) was found with a cutoff of 7 or more criteria, reaching positive and negative predictive values of 90.1% and 57.7%, respectively; accuracy was 81.4%. The case definition, at a cutoff of at least 7 criteria, reached average sensitivity and specificity, but with a high positive predictive value. Its simplicity and low cost make it useful for rapid bedside diagnosis of leptospirosis in Brazilian hospitalized patients with acute severe febrile disease.
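The reported test properties follow from a standard 2×2 confusion matrix. With 75 confirmed and 22 non-leptospirosis patients, the published percentages imply approximately TP=64, FN=11, TN=15, FP=7 (counts inferred here, not stated in the abstract); a sketch that recomputes the figures:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard diagnostic-test metrics from 2x2 confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# Counts inferred from the abstract (cutoff of >= 7 of the 10 criteria):
m = diagnostic_metrics(tp=64, fp=7, tn=15, fn=11)
print({k: round(100 * v, 1) for k, v in m.items()})
# {'sensitivity': 85.3, 'specificity': 68.2, 'ppv': 90.1, 'npv': 57.7, 'accuracy': 81.4}
```

Note how the high prevalence in this referred population (75/97) drives the high PPV and the modest NPV, which is why such a bedside rule is better at ruling in than ruling out disease here.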
Validation of Pooled Whole-Genome Re-Sequencing in Arabidopsis lyrata.
Fracassetti, Marco; Griffin, Philippa C; Willi, Yvonne
2015-01-01
Sequencing pooled DNA of multiple individuals from a population instead of sequencing individuals separately has become popular due to its cost-effectiveness and simple wet-lab protocol, although some criticism of this approach remains. Here we validated a protocol for pooled whole-genome re-sequencing (Pool-seq) of Arabidopsis lyrata libraries prepared with low amounts of DNA (1.6 ng per individual). The validation was based on comparing single nucleotide polymorphism (SNP) frequencies obtained by pooling with those obtained by individual-based Genotyping By Sequencing (GBS). Furthermore, we investigated the effect of sample number, sequencing depth per individual and variant caller on population SNP frequency estimates. For Pool-seq data, we compared frequency estimates from two SNP callers, VarScan and Snape; the former employs a frequentist SNP calling approach while the latter uses a Bayesian approach. Results revealed concordance correlation coefficients well above 0.8, confirming that Pool-seq is a valid method for acquiring population-level SNP frequency data. Higher accuracy was achieved by pooling more samples (25 compared to 14) and working with higher sequencing depth (4.1× per individual compared to 1.4× per individual), which increased the concordance correlation coefficient to 0.955. The Bayesian-based SNP caller produced somewhat higher concordance correlation coefficients, particularly at low sequencing depth. We recommend pooling at least 25 individuals combined with sequencing at a depth of 100× to produce satisfactory frequency estimates for common SNPs (minor allele frequency above 0.05).
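The agreement statistic used here, the concordance correlation coefficient (Lin's CCC), can be computed directly from paired frequency estimates; a minimal sketch with illustrative example data:

```python
def lin_ccc(x, y):
    """Lin's concordance correlation coefficient between two paired
    series, e.g. Pool-seq vs. individual-based SNP frequency estimates."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2 * cov / (vx + vy + (mx - my) ** 2)

# Identical estimates give perfect concordance:
print(lin_ccc([0.1, 0.2, 0.4], [0.1, 0.2, 0.4]))  # 1.0
```

Unlike Pearson correlation, the CCC penalizes both scale and location shifts between the two series, which is why it is the appropriate measure when the two methods are supposed to estimate the same quantity.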
A lower bound to the social cost of CO2 emissions
NASA Astrophysics Data System (ADS)
van den Bergh, J. C. J. M.; Botzen, W. J. W.
2014-04-01
Many studies have estimated the social cost of carbon (SCC). We critically evaluate SCC estimates, focusing on omitted cost categories, discounting, uncertainties about damage costs and risk aversion. This allows for the calculation of a lower bound to the SCC. Dominant SCC values turn out to be gross underestimates, notably, but not only, for a low discount rate. The validity of this lower bound is supported by a precautionary approach to reflect risk aversion against extreme climate change. The results justify a more stringent climate policy than is suggested by most influential past studies.
Rater reliability and construct validity of a mobile application for posture analysis
Szucs, Kimberly A.; Brown, Elena V. Donoso
2018-01-01
[Purpose] Measurement of posture is important for those with a clinical diagnosis as well as researchers aiming to understand the impact of faulty postures on the development of musculoskeletal disorders. A reliable, cost-effective and low tech posture measure may be beneficial for research and clinical applications. The purpose of this study was to determine rater reliability and construct validity of a posture screening mobile application in healthy young adults. [Subjects and Methods] Pictures of subjects were taken in three standing positions. Two raters independently digitized the static standing posture image twice. The app calculated posture variables, including sagittal and coronal plane translations and angulations. Intra- and inter-rater reliability were calculated using the appropriate ICC models for complete agreement. Construct validity was determined through comparison of known groups using repeated measures ANOVA. [Results] Intra-rater reliability ranged from 0.71 to 0.99. Inter-rater reliability was good to excellent for all translations. ICCs were stronger for translations versus angulations. The construct validity analysis found that the app was able to detect the change in the four variables selected. [Conclusion] The posture mobile application has demonstrated strong rater reliability and preliminary evidence of construct validity. This application may have utility in clinical and research settings. PMID:29410561
Oueslati, Amel; Ollitrault, Frederique; Baraket, Ghada; Salhi-Hannachi, Amel; Navarro, Luis; Ollitrault, Patrick
2016-08-18
Chloroplast DNA is a primary source of molecular variations for phylogenetic analysis of photosynthetic eukaryotes. However, the sequencing and analysis of multiple chloroplastic regions is difficult to apply to large collections or large samples of natural populations. The objective of our work was to demonstrate that a molecular taxonomic key based on an easy, scalable and low-cost genotyping method could be developed from a set of Single Nucleotide Polymorphisms (SNPs) diagnostic of well-established clades. It was applied to the Aurantioideae subfamily, the largest group of the Rutaceae family, which includes the cultivated citrus species. The publicly available nucleotide sequences of eight plastid genomic regions were compared for 79 accessions of the Aurantioideae subfamily to search for SNPs revealing taxonomic differentiation at the inter-tribe, inter-subtribe, inter-genus and interspecific levels. Diagnostic SNPs (DSNPs) were found for 46 of the 54 clade levels analysed. Forty DSNPs were selected to develop KASPar markers and their taxonomic value was tested by genotyping 108 accessions of the Aurantioideae subfamily. Twenty-seven markers diagnostic of 24 clades were validated and they displayed a very high rate of transferability in the Aurantioideae subfamily (only 1.2% of missing data on average). The UPGMA from the validated markers produced a cladistic organisation that was highly coherent with the previous phylogenetic analysis based on the sequence data of the eight plastid regions. In particular, the monophyletic origin of the "true citrus" genera plus Oxanthera was validated. However, some clarification remains necessary regarding the organisation of the other wild species of the Citreae tribe.
We validated the concept that, with well-established clades, DSNPs can be selected and efficiently transformed into competitive allele-specific PCR markers (KASPar method), allowing cost-effective, highly efficient cladistic analysis in large collections at the subfamily level. The robustness of this genotyping method is an additional decisive advantage for network collaborative research. The availability of WGS data for the main "true citrus" species should soon make it possible to develop a set of DSNP markers allowing very fine resolution of this very important horticultural group.
Low-cost structured-light based 3D capture system design
NASA Astrophysics Data System (ADS)
Dong, Jing; Bengtson, Kurt R.; Robinson, Barrett F.; Allebach, Jan P.
2014-03-01
Most of the 3D capture products currently in the market are high-end and pricey. They are not targeted for consumers, but rather for research, medical, or industrial usage. Very few aim to provide a solution for home and small business applications. Our goal is to fill in this gap by only using low-cost components to build a 3D capture system that can satisfy the needs of this market segment. In this paper, we present a low-cost 3D capture system based on the structured-light method. The system is built around the HP TopShot LaserJet Pro M275. For our capture device, we use the 8.0 Mpixel camera that is part of the M275. We augment this hardware with two 3M MPro 150 VGA (640 × 480) pocket projectors. We also describe an analytical approach to predicting the achievable resolution of the reconstructed 3D object based on differentials and small signal theory, and an experimental procedure for validating that the system under test meets the specifications for reconstructed object resolution that are predicted by our analytical model. By comparing our experimental measurements from the camera-projector system with the simulation results based on the model for this system, we conclude that our prototype system has been correctly configured and calibrated. We also conclude that with the analytical models, we have an effective means for specifying system parameters to achieve a given target resolution for the reconstructed object.
Wang, Yijia; Zeinhom, Mohamed M A; Yang, Mingming; Sun, Rongrong; Wang, Shengfu; Smith, Jordan N; Timchalk, Charles; Li, Lei; Lin, Yuehe; Du, Dan
2017-09-05
Onsite rapid detection of herbicides and herbicide residuals in environmental and biological specimens is important for agriculture, environmental concerns, food safety, and health care. The traditional method for herbicide detection requires expensive laboratory equipment and a long turnaround time. In this work, we developed a single-stripe microliter plate smartphone-based colorimetric device for rapid and low-cost in-field tests. This portable smartphone platform is capable of screening eight samples in a single-stripe microplate. The device combines the advantages of small size (50 × 100 × 160 mm³) and low cost ($10). The platform was calibrated by using two different dye solutions, i.e., methyl blue (MB) and rhodamine B, for the red and green channels. The results showed good correlation with results attained from a traditional laboratory reader. We demonstrated the application of this platform for detection of the herbicide 2,4-dichlorophenoxyacetic acid in the range of 1 to 80 ppb. Spiked samples of tap water, rat serum, plasma, and human serum were tested by our device. Recoveries varied from 95.6% to 105.2% for all of the spiked samples using the microplate reader and from 93.7% to 106.9% for all of the samples using the smartphone device. This work validated that the smartphone optical-sensing platform is comparable to the commercial microplate reader; it is suitable for onsite, rapid, and low-cost detection of herbicides for environmental evaluation and biological monitoring.
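The spike-recovery figures quoted above follow from a linear calibration. A minimal sketch with made-up channel readings (the calibration numbers are illustrative, not the paper's):

```python
import numpy as np

# Hypothetical calibration: mean channel intensity vs. 2,4-D standard (ppb).
std_conc = np.array([1, 5, 10, 20, 40, 80], dtype=float)
intensity = np.array([0.021, 0.098, 0.195, 0.388, 0.770, 1.540])  # made-up readings

slope, intercept = np.polyfit(std_conc, intensity, 1)

def measured_conc(reading):
    """Invert the linear calibration to estimate concentration (ppb)."""
    return (reading - intercept) / slope

def percent_recovery(reading, spiked_ppb):
    """Recovery (%) = measured / spiked * 100, as in spike-recovery validation."""
    return 100.0 * measured_conc(reading) / spiked_ppb
```

A spiked sample at 20 ppb whose reading falls close to the calibration line recovers near 100%, which is how the 93.7% to 106.9% ranges in the abstract are computed.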
A low-cost, tablet-based option for prehospital neurologic assessment: The iTREAT Study.
Chapman Smith, Sherita N; Govindarajan, Prasanthi; Padrick, Matthew M; Lippman, Jason M; McMurry, Timothy L; Resler, Brian L; Keenan, Kevin; Gunnell, Brian S; Mehndiratta, Prachi; Chee, Christina Y; Cahill, Elizabeth A; Dietiker, Cameron; Cattell-Gordon, David C; Smith, Wade S; Perina, Debra G; Solenski, Nina J; Worrall, Bradford B; Southerland, Andrew M
2016-07-05
In this 2-center study, we assessed the technical feasibility and reliability of a low-cost, tablet-based mobile telestroke option for ambulance transport and hypothesized that the NIH Stroke Scale (NIHSS) could be performed with similar reliability between remote and bedside examinations. We piloted our mobile telemedicine system in 2 geographic regions, central Virginia and the San Francisco Bay Area, utilizing commercial cellular networks for videoconferencing transmission. Standardized patients portrayed scripted stroke scenarios during ambulance transport and were evaluated by independent raters comparing bedside to remote mobile telestroke assessments. We used a mixed-effects regression model to determine intraclass correlation of the NIHSS between bedside and remote examinations (95% confidence interval). We conducted 27 ambulance runs at both sites and successfully completed the NIHSS for all prehospital assessments without prohibitive technical interruption. The mean difference between bedside (face-to-face) and remote (video) NIHSS scores was 0.25 (-0.50 to 1.00). Overall, correlation of the NIHSS between bedside and mobile telestroke assessments was 0.96 (0.92-0.98). In the mixed-effects regression model, there were no statistically significant differences accounting for method of evaluation or differences between sites. Utilizing a low-cost, tablet-based platform and commercial cellular networks, we can reliably perform prehospital neurologic assessments in both rural and urban settings. Further research is needed to establish the reliability and validity of prehospital mobile telestroke assessment in live patients presenting with acute neurologic symptoms. © 2016 American Academy of Neurology.
False positive results using calcitonin as a screening method for medullary thyroid carcinoma.
Batista, Rafael Loch; Toscanini, Andrea Cecilia; Brandão, Lenine Garcia; Cunha-Neto, Malebranche Berardo C
2013-05-01
The role of serum calcitonin as part of the evaluation of thyroid nodules has been widely discussed in the literature. However, there is still no consensus on measuring calcitonin in the initial evaluation of a patient with a thyroid nodule. Problems concerning cost-benefit, laboratory methods, false positives, and the low prevalence of medullary thyroid carcinoma (MTC) are factors that limit this approach. We illustrate two cases where serum calcitonin was used in the evaluation of a thyroid nodule and levels proved to be high. A stimulation test was performed, using calcium as secretagogue, and calcitonin hyper-stimulation was confirmed, but anatomopathologic examination did not show medullary neoplasia. Anatomopathologic diagnosis detected Hashimoto thyroiditis in one case and adenomatous goiter plus an occult papillary thyroid carcinoma in the other. Routine use of serum calcitonin in the initial diagnostic evaluation of a thyroid nodule, followed by a confirmatory stimulation test if basal serum calcitonin is shown to be high, is the most commonly recommended approach, but questions concerning cost-benefit and the possibility of diagnostic error make the validity of this recommendation debatable.
Tang, Rongnian; Chen, Xupeng; Li, Chuang
2018-05-01
Near-infrared spectroscopy is an efficient, low-cost technology with potential as an accurate method for detecting the nitrogen content of natural rubber leaves. The successive projections algorithm (SPA) is a widely used variable selection method for multivariate calibration, which uses projection operations to select a variable subset with minimum multi-collinearity. However, due to fluctuations in the correlation between variables, high collinearity may still exist among non-adjacent variables of the subset obtained by basic SPA. Based on an analysis of the correlation matrix of the spectral data, this paper proposes a correlation-based SPA (CB-SPA) that applies the successive projections algorithm within regions of consistent correlation. The results show that CB-SPA selects variable subsets with more valuable variables and less multi-collinearity. Meanwhile, models established from the CB-SPA subset outperform those from basic SPA subsets in predicting nitrogen content in terms of both cross-validation and external prediction. Moreover, CB-SPA is more efficient, as the time cost of its selection procedure is one-twelfth that of basic SPA.
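Basic SPA, which CB-SPA builds on, can be sketched compactly: starting from an initial variable, each step picks the column whose projection onto the orthogonal complement of the already selected columns has the largest norm. A minimal sketch (the paper's CB-SPA additionally restricts this search to regions of consistent correlation):

```python
import numpy as np

def spa(X, start, n_select):
    """Basic successive projections algorithm (SPA).

    Greedily picks columns of X (samples x variables) with minimum
    collinearity: at each step the next variable is the one whose
    projection onto the orthogonal complement of the already selected
    columns has the largest norm.
    """
    X = np.asarray(X, dtype=float)
    selected = [start]
    P = X.copy()
    for _ in range(n_select - 1):
        v = P[:, selected[-1]]
        # Project every column onto the complement of the last pick.
        P = P - np.outer(v, v @ P) / (v @ v)
        norms = np.linalg.norm(P, axis=0)
        norms[selected] = -1.0            # never re-pick a chosen column
        selected.append(int(np.argmax(norms)))
    return selected
```

Columns that duplicate an already chosen variable project to near-zero and are skipped, which is exactly the multi-collinearity avoidance the abstract describes.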
Enhanced conformational sampling of carbohydrates by Hamiltonian replica-exchange simulation.
Mishra, Sushil Kumar; Kara, Mahmut; Zacharias, Martin; Koca, Jaroslav
2014-01-01
Knowledge of the structure and conformational flexibility of carbohydrates in an aqueous solvent is important to improving our understanding of how carbohydrates function in biological systems. In this study, we extend a variant of the Hamiltonian replica-exchange molecular dynamics (MD) simulation to improve the conformational sampling of saccharides in an explicit solvent. During the simulations, a biasing potential along the glycosidic-dihedral linkage between the saccharide monomer units in an oligomer is applied at various levels along the replica runs to enable effective transitions between various conformations. One reference replica runs under the control of the original force field. The method was tested on disaccharide structures and further validated on biologically relevant blood group B, Lewis X and Lewis A trisaccharides. The biasing potential-based replica-exchange molecular dynamics (BP-REMD) method provided a significantly improved sampling of relevant conformational states compared with standard continuous MD simulations, with modest computational costs. Thus, the proposed BP-REMD approach adds a new dimension to existing carbohydrate conformational sampling approaches by enhancing conformational sampling in the presence of solvent molecules explicitly at relatively low computational cost.
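The replica-swap step underlying Hamiltonian replica-exchange schemes like BP-REMD uses the standard Metropolis criterion at a common temperature. A minimal sketch (the function name and argument layout are illustrative, not from the paper):

```python
import math

def hrem_swap_accept(u_i_xi, u_i_xj, u_j_xi, u_j_xj, beta):
    """Metropolis acceptance probability for a Hamiltonian replica-exchange swap.

    u_a_xb is the potential of Hamiltonian a evaluated on the configuration
    currently held by replica b; all replicas share one temperature
    (beta = 1/kT), differing only in their biasing potentials.
    """
    delta = beta * ((u_i_xj + u_j_xi) - (u_i_xi + u_j_xj))
    return min(1.0, math.exp(-delta))
```

Swaps that lower the combined energy are always accepted, while uphill swaps are accepted with exponentially decreasing probability; the unbiased reference replica then yields Boltzmann-weighted statistics.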
Characterization of a Low-Cost Multiparameter Sensor for Solar Resource Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Habte, Aron M; Sengupta, Manajit; Andreas, Afshin M
Low-cost, multiparameter sensing and measurement devices enable cost-effective monitoring of the functional and operational reliability, efficiency, and resiliency of the electric grid. The National Renewable Energy Laboratory (NREL) Solar Radiation Research Laboratory (SRRL), in collaboration with Arable Labs, Inc., deployed Arable Labs' Mark multiparameter sensor system. The device measures the downwelling and upwelling shortwave solar resource and longwave radiation, humidity, air temperature, and ground temperature. The system is also equipped with six downward- and upward-facing narrowband spectrometer channels that measure spectral radiation and surface spectral reflectance. This study describes the shortwave calibration, characterization, and validation of the measurement accuracy of this instrument by comparison with existing instruments that are part of NREL-SRRL's Baseline Measurement System.
Sizing and economic analysis of stand alone photovoltaic system with hydrogen storage
NASA Astrophysics Data System (ADS)
Nordin, N. D.; Rahman, H. A.
2017-11-01
This paper proposes design steps for sizing a standalone photovoltaic system with hydrogen storage using the intuitive method. The main advantage of this method is that it uses a direct mathematical approach to find the system's size based on daily load consumption and average irradiation data. The keys of the system design are to satisfy a pre-determined load requirement and to maintain the hydrogen storage's state of charge during low solar irradiation periods. To test the effectiveness of the proposed method, a case study is conducted using generated meteorological data for Kuala Lumpur and a typical rural daily load profile of 2.215 kWh. In addition, an economic analysis is performed to appraise the proposed system's feasibility. The finding shows that the levelized cost of energy for the proposed system is RM 1.98/kWh. However, based on sizing results obtained using a published method with an AGM battery as back-up supply, that system's cost is lower and more economically viable. The feasibility of a PV system with hydrogen storage can be improved if the efficiency of hydrogen storage technologies increases significantly in the future. Hence, a sensitivity analysis is performed to verify the effect of electrolyzer and fuel cell efficiencies on the levelized cost of energy. Efficiencies of electrolyzers and fuel cells available in the current market are validated using laboratory experimental data. This finding is needed to envisage the applicability of photovoltaic systems with hydrogen storage as a future power supply source in Malaysia.
Low-Cost Aqueous Coal Desulfurization
NASA Technical Reports Server (NTRS)
Kalvinskas, J. J.; Vasilakos, N.; Corcoran, W. H.; Grohmann, K.; Rohatgi, N. K.
1982-01-01
Water-based process for desulfurizing coal not only eliminates need for costly organic solvent but removes sulfur more effectively than an earlier solvent-based process. New process could provide low-cost commercial method for converting high-sulfur coal into environmentally acceptable fuel.
Carvlin, Graeme N; Lugo, Humberto; Olmedo, Luis; Bejarano, Ester; Wilkie, Alexa; Meltzer, Dan; Wong, Michelle; King, Galatea; Northcross, Amanda; Jerrett, Michael; English, Paul B; Hammond, Donald; Seto, Edmund
2017-12-01
The Imperial County Community Air Monitoring Network was developed as part of a community-engaged research study to provide real-time particulate matter (PM) air quality information at a high spatial resolution in Imperial County, California. The network augmented the few existing regulatory monitors and increased monitoring near susceptible populations. Monitors were both calibrated and field validated, a key component of evaluating the quality of the data produced by the community monitoring network. This paper examines the performance of a customized version of the low-cost Dylos optical particle counter used in the community air monitors compared with both PM2.5 and PM10 (particulate matter with aerodynamic diameters <2.5 and <10 μm, respectively) federal equivalent method (FEM) beta-attenuation monitors (BAMs) and federal reference method (FRM) gravimetric filters at a collocation site in the study area. A conversion equation was developed that estimates particle mass concentrations from the native Dylos particle counts, taking into account relative humidity. The R² for converted hourly averaged Dylos mass measurements versus a PM2.5 BAM was 0.79 and that versus a PM10 BAM was 0.78. The performance of the conversion equation was evaluated at six other sites with collocated PM2.5 environmental beta-attenuation monitors (EBAMs) located throughout Imperial County. The agreement of the Dylos with the EBAMs was moderate to high (R² = 0.35-0.81). The performance of low-cost air quality sensors in community networks is currently not well documented. This paper provides a methodology for quantifying the performance of a next-generation Dylos PM sensor used in the Imperial County Community Air Monitoring Network. This air quality network provides data at a much finer spatial and temporal resolution than has previously been possible with government monitoring efforts.
Once calibrated and validated, these high-resolution data may provide more information on susceptible populations, assist in the identification of air pollution hotspots, and increase community awareness of air pollution.
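The count-to-mass conversion described above can be sketched as a humidity-aware regression against a collocated reference monitor. The paper's published equation is not reproduced here; the hygroscopic-growth term RH^2/(1-RH) is a common choice in the low-cost-sensor literature and is an assumption of this sketch.

```python
import numpy as np

def fit_conversion(counts, rh, ref_mass):
    """Fit mass ~ a*counts + b*counts*f(RH) + c by least squares,
    where f(RH) = RH^2 / (1 - RH) is an assumed hygroscopic-growth term
    and ref_mass comes from a collocated reference (e.g. BAM) monitor."""
    counts = np.asarray(counts, dtype=float)
    rh = np.asarray(rh, dtype=float)
    growth = rh ** 2 / (1.0 - rh)
    A = np.column_stack([counts, counts * growth, np.ones_like(counts)])
    coef, *_ = np.linalg.lstsq(A, np.asarray(ref_mass, float), rcond=None)
    return coef

def counts_to_mass(counts, rh, coef):
    """Apply the fitted conversion to new particle counts."""
    rh = np.asarray(rh, dtype=float)
    growth = rh ** 2 / (1.0 - rh)
    return coef[0] * counts + coef[1] * counts * growth + coef[2]
```

Goodness of fit against the reference monitor (the R² values the abstract reports) is then the standard way to judge whether such a conversion transfers to other sites.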
Weighted divergence correction scheme and its fast implementation
NASA Astrophysics Data System (ADS)
Wang, ChengYue; Gao, Qi; Wei, RunJie; Li, Tian; Wang, JinJun
2017-05-01
Forcing experimental volumetric velocity fields to satisfy mass conservation principles has proved beneficial for improving the quality of measured data. A number of correction methods, including the divergence correction scheme (DCS), have been proposed to remove divergence errors from measured velocity fields. For tomographic particle image velocimetry (TPIV) data, the measurement uncertainty for the velocity component along the light-thickness direction is typically much larger than for the other two components. Such biased measurement errors weaken the performance of traditional correction methods. The paper proposes a variant of the existing DCS that adds weighting coefficients to the three velocity components, termed the weighted DCS (WDCS). The generalized cross validation (GCV) method is employed to choose suitable weighting coefficients. A fast algorithm for DCS or WDCS is developed, making the correction process significantly cheaper to implement. WDCS has strong advantages when correcting velocity components with biased noise levels. Numerical tests validate the accuracy and efficiency of the fast algorithm, the effectiveness of the GCV method, and the advantages of WDCS. Lastly, DCS and WDCS are employed to process experimental velocity fields from a TPIV measurement of a turbulent boundary layer. This shows that WDCS achieves better performance than DCS in improving some flow statistics.
Validation of a spectrophotometric assay method for bisoprolol using picric acid.
Panainte, Alina-Diana; Bibire, Nela; Tântaru, Gladiola; Apostu, M; Vieriu, Mădălina
2013-01-01
Bisoprolol is a beta-blocker drug used primarily for the treatment of cardiovascular diseases. A spectrophotometric method for the quantitative determination of bisoprolol was developed, based on the formation of a complex between bisoprolol and picric acid. The bisoprolol-picric acid complex has an absorbance maximum at 420 nm. Optimum working conditions were established and the method was validated. The method presented good linearity in the concentration range 5-120 μg/mL (regression coefficient r² = 0.9992). The RSD for the precision of the method was 1.74 and for the intermediate precision 1.43, and recovery values ranged between 98.25-101.48%. The proposed and validated spectrophotometric method for the determination of bisoprolol is simple and cost-effective.
M & V Shootout: Setting the Stage For Testing the Performance of New Energy Baseline
DOE Office of Scientific and Technical Information (OSTI.GOV)
Touzani, Samir; Custodio, Claudine; Sohn, Michael
Trustworthy savings calculations are critical to convincing investors in energy efficiency projects of the benefit and cost-effectiveness of such investments and their ability to replace or defer supply-side capital investments. However, today's methods for measurement and verification (M&V) of energy savings constitute a significant portion of the total costs of efficiency projects. They also require time-consuming data acquisition and often do not deliver results until years after the program period has ended. A spectrum of savings calculation approaches is used, with some relying more heavily on measured data and others relying more heavily on estimated or modeled data, or stipulated information. The rising availability of "smart" meters, combined with new analytical approaches to quantifying savings, has opened the door to conducting M&V more quickly and at lower cost, with comparable or improved accuracy. Energy management and information systems (EMIS) technologies not only enable significant site energy savings, but are also beginning to offer M&V capabilities. This paper expands recent analyses of public-domain, whole-building M&V methods, focusing on more novel baseline modeling approaches that leverage interval meter data. We detail a testing procedure and metrics to assess the performance of these new approaches using a large test dataset. We also provide conclusions regarding the accuracy, cost, and time trade-offs between more traditional M&V and these emerging streamlined methods. Finally, we discuss the potential evolution of M&V to better support the energy efficiency industry through low-cost approaches, and the long-term agenda for validation of building energy analytics.
Extracting cardiac myofiber orientations from high frequency ultrasound images
NASA Astrophysics Data System (ADS)
Qin, Xulei; Cong, Zhibin; Jiang, Rong; Shen, Ming; Wagner, Mary B.; Kirshbom, Paul; Fei, Baowei
2013-03-01
Cardiac myofibers play an important role in the stress mechanics of the beating heart. The orientation of myofibers determines the stress distribution and the deformation of the whole heart. It is important to image and quantitatively extract these orientations for understanding cardiac physiological and pathological mechanisms and for the diagnosis of chronic diseases. Ultrasound has been widely used in cardiac diagnosis because of its ability to perform dynamic, noninvasive imaging and because of its low cost. An extraction method is proposed to automatically detect cardiac myofiber orientations from high frequency ultrasound images. First, heart walls containing myofibers are imaged by B-mode high frequency (<20 MHz) ultrasound imaging. Second, myofiber orientations are extracted from the ultrasound images using the proposed method, which combines a nonlinear anisotropic diffusion filter, a Canny edge detector, the Hough transform, and K-means clustering. The method is validated with ultrasound data from phantoms and pig hearts.
Social cost impact assessment of pipeline infrastructure projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthews, John C., E-mail: matthewsj@battelle.org; Allouche, Erez N., E-mail: allouche@latech.edu; Sterling, Raymond L., E-mail: sterling@latech.edu
A key advantage of trenchless construction methods compared with traditional open-cut methods is their ability to install or rehabilitate underground utility systems with limited disruption to the surrounding built and natural environments. The equivalent monetary values of these disruptions are commonly called social costs. Social costs are often ignored by engineers or project managers during project planning and design phases, partially because they cannot be calculated using standard estimating methods. In recent years some approaches for estimating social costs were presented. Nevertheless, the cost data needed for validation of these estimating methods is lacking. Development of such social cost databases can be accomplished by compiling relevant information reported in various case histories. This paper identifies the eight most important social cost categories, presents mathematical methods for calculating them, and summarizes the social cost impacts for two pipeline construction projects. The case histories are analyzed in order to identify trends for the various social cost categories. The effectiveness of the methods used to estimate these values is also discussed. These findings are valuable for pipeline infrastructure engineers making renewal technology selection decisions by providing a more accurate process for the assessment of social costs and impacts.
Cho, Sanghee; Grazioso, Ron; Zhang, Nan; Aykac, Mehmet; Schmand, Matthias
2011-12-07
The main focus of our study is to investigate how the performance of digital timing methods is affected by the sampling rate, anti-aliasing filters, and signal interpolation filters. We used the Nyquist sampling theorem to address some basic questions, such as: What is the minimum sampling frequency? How accurate will the signal interpolation be? How do we validate the timing measurements? The preferred sampling rate would be as low as possible, considering the high cost and power consumption of high-speed analog-to-digital converters. However, when the sampling rate is too low, the aliasing effect produces artifacts in the timing resolution estimates; the shape of the timing profile is distorted and the FWHM values of the profile fluctuate as the source location changes. Anti-aliasing filters are required in this case to avoid the artifacts, but the timing is degraded as a result. When the sampling rate is marginally above the Nyquist rate, proper signal interpolation is important. A sharp roll-off (higher order) filter is required to separate the baseband signal from its replicates to avoid aliasing, but in return the computational load will be higher. We demonstrated the analysis through a digital timing study using fast LSO scintillation crystals as used in time-of-flight PET scanners. From the study, we observed no significant timing resolution degradation down to a 1.3 GHz sampling frequency, and the computational requirement for the signal interpolation is reasonably low. A so-called sliding test is proposed as a validation tool, checking for constant timing resolution behavior of a given timing pick-off method regardless of source location changes. Lastly, a performance comparison of several digital timing methods is also shown.
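The sampling question the study poses can be illustrated directly: a tone sampled above its Nyquist rate is recoverable between sample points by Whittaker-Shannon (sinc) interpolation. The rates and signal below are illustrative, not the scanner's.

```python
import numpy as np

def sinc_interp(samples, fs, t):
    """Whittaker-Shannon reconstruction of a band-limited signal at time t
    from samples taken at rate fs (valid when fs exceeds twice the bandwidth)."""
    n = np.arange(len(samples))
    return np.sum(samples * np.sinc(fs * t - n))

fs = 8.0                        # sampling rate, Hz
f0 = 1.0                        # tone well below the 4 Hz Nyquist limit
ts = np.arange(64) / fs
samples = np.sin(2 * np.pi * f0 * ts)
# Reconstruction is exact at the sample instants and accurate between
# them away from the record edges, where truncation of the sinc sum bites.
```

Sampling the same tone below 2·f0 would alias it to a lower frequency, which is the digital analogue of the timing-profile distortion the abstract describes.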
Yu, H H; Bi, X; Liu, Y Y
2017-08-10
Objective: To evaluate the reliability and validity of the Chinese version of the COmprehensive Score for financial Toxicity (COST), based on patient-reported outcome measures. Methods: A total of 118 cancer patients were interviewed face-to-face by well-trained investigators. Cronbach's α and the Pearson correlation coefficient were used to evaluate reliability. The content validity index (CVI) and exploratory factor analysis (EFA) were used to evaluate content validity and construct validity, respectively. Results: The Cronbach's α coefficient was 0.889 for the whole questionnaire, with test-retest coefficients between 0.77 and 0.98. The scale-content validity index (S-CVI) was 0.82, with item-content validity indices (I-CVI) between 0.83 and 1.00. Two components were extracted by the exploratory factor analysis, with a cumulative variance contribution of 68.04% and loadings >0.60 on every item. Conclusion: The Chinese version of the COST scale showed high reliability and good validity, and can thus be applied to assess the financial situation of cancer patients.
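Cronbach's α, the reliability statistic reported above, has a standard closed form over an item-score matrix; a minimal sketch with illustrative data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)
```

Values of 0.7 or higher are conventionally read as acceptable internal consistency, so the 0.889 reported for the whole questionnaire counts as high.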
An ultraviolet-spectrophotometric method for the determination of glimepiride in solid dosage forms.
Afieroho, Ozadheoghene E; Okorie, Ogbonna; Okonkwo, Tochukwu J N
2011-06-01
Considering the cost of acquiring a liquid chromatographic instrument in underdeveloped economies, the rising incidence of diabetes mellitus, the need to evaluate the quality performance of glimepiride generics, and the need for less toxic processes, this research is imperative. Using 96% ethanol as solvent, a less toxic and cost-effective spectrophotometric method for the determination of glimepiride in solid dosage forms was developed and validated. The method was validated for linearity, recovery accuracy, intra- and inter-day precision, specificity in the presence of excipients, and inter-day stability under laboratory conditions. Student's t test at the 95% confidence limit was used for statistics. The results of the validated parameters showed a λ(max) of 231 nm, a linearity range of 0.5-22 μg/mL, precision with relative SD of <1.0%, recovery accuracy of 100.8%, a regression equation of y = 45.741x + 0.0202, R² = 0.999, a limit of detection of 0.35 μg/mL, and negligible interference from common excipients and colorants. The method was found to be accurate at the 95% confidence limit compared with the standard liquid chromatographic method, with comparable reproducibility when used to assay the formulated products Amaryl® (sanofi-aventis, Paris, France) and Mepyril® (May & Baker Nigeria PLC, Ikeja, Nigeria). The results obtained for the validated parameters were within allowable limits. This method is recommended for routine quality control analysis.
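Detection and quantitation limits in validations like this are conventionally computed ICH-style as LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the standard deviation of the response and S the calibration slope. The slope below is the reported 45.741; the σ value is illustrative, since the abstract does not give it.

```python
# Slope of the reported calibration line y = 45.741x + 0.0202.
slope = 45.741

def limit_of_detection(sd_response, slope):
    """ICH-style LOD = 3.3 * SD of the response / calibration slope."""
    return 3.3 * sd_response / slope

def limit_of_quantitation(sd_response, slope):
    """ICH-style LOQ = 10 * SD of the response / calibration slope."""
    return 10.0 * sd_response / slope
```

The steep slope is what makes the method sensitive: the smaller the response noise relative to the slope, the lower the concentration that can be reliably detected.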
Validation of a wireless modular monitoring system for structures
NASA Astrophysics Data System (ADS)
Lynch, Jerome P.; Law, Kincho H.; Kiremidjian, Anne S.; Carryer, John E.; Kenny, Thomas W.; Partridge, Aaron; Sundararajan, Arvind
2002-06-01
A wireless sensing unit for use in a Wireless Modular Monitoring System (WiMMS) has been designed and constructed. Drawing upon advanced technological developments in the areas of wireless communications, low-power microprocessors and micro-electro-mechanical system (MEMS) sensing transducers, the wireless sensing unit represents a high-performance yet low-cost solution for monitoring the short-term and long-term performance of structures. A sophisticated reduced instruction set computer (RISC) microcontroller is placed at the core of the unit to accommodate on-board computation, measurement filtering and data interrogation algorithms. The functionality of the wireless sensing unit is validated through various experiments involving multiple sensing transducers interfaced to the unit. In particular, MEMS-based accelerometers are used as the primary sensing transducer in this study's validation experiments. A five-degree-of-freedom scaled test structure mounted on a shaking table is employed for system validation.
Wootton, Richard; Vladzymyrskyy, Anton; Zolfo, Maria; Bonnardot, Laurent
2011-01-01
Background Telemedicine has been used for many years to support doctors in the developing world. Several networks provide services in different settings and in different ways. However, to draw conclusions about which telemedicine networks are successful requires a method of evaluating them. No general consensus or validated framework exists for this purpose. Objective To define a basic method of performance measurement that can be used to improve and compare teleconsultation networks; to employ the proposed framework in an evaluation of three existing networks; to make recommendations about the future implementation and follow-up of such networks. Methods Analysis based on the experience of three telemedicine networks (in operation for 7–10 years) that provide services to doctors in low-resource settings and which employ the same basic design. Findings Although there are many possible indicators and metrics that might be relevant, five measures for each of the three user groups appear to be sufficient for the proposed framework. In addition, from the societal perspective, information about clinical- and cost-effectiveness is also required. The proposed performance measurement framework was applied to three mature telemedicine networks. Despite their differences in terms of activity, size and objectives, their performance in certain respects is very similar. For example, the time to first reply from an expert is about 24 hours for each network. Although all three networks had systems in place to collect data from the user perspective, none of them collected information about the coordinator's time required or about ease of system usage. They had only limited information about quality and cost. Conclusion Measuring the performance of a telemedicine network is essential in understanding whether the network is working as intended and what effect it is having. 
Based on long-term field experience, the suggested framework is a practical tool that will permit organisations to assess the performance of their own networks and to improve them by comparison with others. All telemedicine systems should provide information about setup and running costs because cost-effectiveness is crucial for sustainability. PMID:22162965
Space processes for extended low-G testing
NASA Technical Reports Server (NTRS)
Steurer, W. H.; Kaye, S.; Gorham, D. J.
1973-01-01
Results of an investigation into verifying the capabilities of space processes in ground-based experiments with limited low-g periods are presented. Limited-duration experiments were conducted with the processes. A valid representation of the complete process cycle was achieved at low-g periods ranging from 40 to 390 seconds. A minimum equipment inventory is defined. A modular equipment design, adopted to ensure low cost and high program flexibility, is presented, as well as the procedures and data established for the synthesis and definition of dedicated and mixed rocket payloads.
NASA Astrophysics Data System (ADS)
Choi, Nack-Bong
Flexible electronics is an emerging next-generation technology that offers many advantages, such as light weight, durability, comfort, and flexibility. These unique features enable many new applications, such as flexible displays, flexible sensors, conformable electronics, and so forth. For decades, a variety of flexible substrates have been demonstrated for flexible electronics, most of them plastic films and metal foils. As the fundamental device of flexible circuits, thin film transistors (TFTs) using polysilicon, amorphous silicon, metal oxide and organic semiconductors have been successfully demonstrated. Depending on the application, low-cost and disposable flexible electronics will be required for convenience; it is therefore important to study inexpensive substrates and to explore simple processes such as printing technology. In this thesis, paper is introduced as a new possible substrate for flexible electronics owing to its low cost and renewability, and amorphous indium gallium zinc oxide (a-IGZO) TFTs are realized as a promising device on the paper substrate. The fabrication process and characterization of a-IGZO TFTs on the paper substrate are discussed. a-IGZO TFTs using a polymer gate dielectric on the paper substrate demonstrate excellent performance, with a field-effect mobility of ~20 cm2 V-1 s-1, an on/off current ratio of ~10^6, and low leakage current, showing the enormous potential for flexible electronics applications. In order to complement the n-channel a-IGZO TFTs and thereby enable complementary metal-oxide-semiconductor (CMOS) circuit architectures, cuprous oxide is studied as a candidate material for p-channel oxide TFTs. In this thesis, a printing process is also investigated as an alternative method for the fabrication of low-cost and disposable electronics. Among several printing methods, a modified offset roll printing that prints high-resolution patterns is presented.
A new method to fabricate a high-resolution printing plate is investigated, and the most favorable condition for transferring ink from a blanket to a cliché is studied. Consequently, a high-resolution cliché is demonstrated, and printed patterns of 10 μm width and 6 μm line spacing are presented. In addition, top-gate a-IGZO TFTs with a channel width/length of 12/6 μm are successfully demonstrated by printing etch resists. This work validates the compatibility of a-IGZO TFTs with paper substrates for disposable microelectronics applications and presents the potential of low-cost, high-resolution printing technology.
Ultrasound guided electrical impedance tomography for 2D free-interface reconstruction
NASA Astrophysics Data System (ADS)
Liang, Guanghui; Ren, Shangjie; Dong, Feng
2017-07-01
The free-interface detection problem is commonly encountered in industrial and biological processes. Electrical impedance tomography (EIT) is a non-invasive technique with the advantages of high speed and low cost, and is a promising solution for free-interface detection problems. However, due to the ill-posedness and nonlinearity of its inverse problem, the spatial resolution of EIT is low. To address this issue, an ultrasound-guided EIT is proposed to directly reconstruct the geometric configuration of the target free interface. In the method, the position of the central point of the target interface is measured by a pair of ultrasound transducers mounted on the opposite side of the object domain, and this position measurement is then used as prior information to guide the EIT-based free-interface reconstruction. During the process, a constrained least-squares framework is used to fuse the information from the different measurement modalities, and a Lagrange-multiplier-based Levenberg-Marquardt method is adopted to provide the iterative solution of the constrained optimization problem. The numerical results show that the proposed ultrasound-guided EIT method for free-interface reconstruction is more accurate than the single-modality method, especially when the number of valid electrodes is limited.
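The fusion step described above, a least-squares problem with the ultrasound measurement entering as an equality constraint, can be sketched generically via the Lagrange-multiplier (KKT) system; the matrices below are toy stand-ins, not the paper's EIT sensitivity model:

```python
import numpy as np

def constrained_lsq(A, b, C, d):
    """Solve min ||Ax - b||^2 subject to Cx = d via the KKT (Lagrange) system:
    [[A^T A, C^T], [C, 0]] [x; lam] = [A^T b; d]."""
    n = A.shape[1]
    m = C.shape[0]
    K = np.block([[A.T @ A, C.T],
                  [C, np.zeros((m, m))]])
    rhs = np.concatenate([A.T @ b, d])
    sol = np.linalg.solve(K, rhs)
    return sol[:n]

A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # toy measurement model
b = np.array([1.0, 2.0, 2.0])                        # toy EIT-like data
C = np.array([[1.0, 1.0]])   # constraint from the second modality
d = np.array([3.0])          # e.g. an ultrasound-measured interface point
x = constrained_lsq(A, b, C, d)
print(x)  # satisfies x[0] + x[1] = 3 exactly
```

The paper's Levenberg-Marquardt iteration plays the role of solving this kind of system repeatedly for a linearized, regularized model; the sketch shows only the single linear fusion step.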
The principles of quality-associated costing: derivation from clinical transfusion practice.
Trenchard, P M; Dixon, R
1997-01-01
As clinical transfusion practice works towards achieving cost-effectiveness, prescribers of blood and its derivatives must be certain that the prices of such products are based on real manufacturing costs and not market forces. Using clinical cost-benefit analysis as the context for the costing and pricing of blood products, this article identifies the following two principles: (1) the product price must equal the product cost (the "price = cost" rule) and (2) the product cost must equal the real cost of product manufacture. In addition, the article describes a new method of blood product costing, quality-associated costing (QAC), that will enable valid cost-benefit analysis of blood products.
Using Ensemble Decisions and Active Selection to Improve Low-Cost Labeling for Multi-View Data
NASA Technical Reports Server (NTRS)
Rebbapragada, Umaa; Wagstaff, Kiri L.
2011-01-01
This paper seeks to improve low-cost labeling in terms of training set reliability (the fraction of correctly labeled training items) and test set performance for multi-view learning methods. Co-training is a popular multi-view learning method that combines high-confidence example selection with low-cost (self) labeling. However, co-training with certain base learning algorithms significantly reduces training set reliability, causing an associated drop in prediction accuracy. We propose the use of ensemble labeling to improve reliability in such cases. We also discuss and show promising results on combining low-cost ensemble labeling with active (low-confidence) example selection. We unify these example selection and labeling strategies under collaborative learning, a family of techniques for multi-view learning that we are developing for distributed, sensor-network environments.
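Ensemble labeling in its simplest form assigns the label agreed on by a majority of view-specific learners and withholds a label on ties; a hedged sketch of that idea (the paper's actual selection criteria are more involved):

```python
from collections import Counter

def ensemble_label(votes):
    """Majority label from an ensemble of view-specific classifiers;
    ties return None so the item can be left unlabeled rather than
    added to the training set with an unreliable label."""
    count = Counter(votes)
    (top, n1), *rest = count.most_common()
    if rest and rest[0][1] == n1:
        return None   # no majority: withhold the low-cost label
    return top

print(ensemble_label(["pos", "pos", "neg"]))  # pos
print(ensemble_label(["pos", "neg"]))         # None
```

Withholding on ties is one way an ensemble improves training set reliability over a single self-labeling learner: only items the views agree on enter the training set.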
Kogkaki, Efstathia A; Sofoulis, Manos; Natskoulis, Pantelis; Tarantilis, Petros A; Pappas, Christos S; Panagou, Efstathios Z
2017-10-16
The purpose of this study was to evaluate the potential of FT-IR spectroscopy as a high-throughput method for rapid differentiation between the ochratoxigenic species Aspergillus carbonarius and the non-ochratoxigenic or low-toxigenic species of the Aspergillus niger aggregate, namely A. tubingensis and A. niger, previously isolated from grapes of Greek vineyards. A total of 182 isolates of A. carbonarius, A. tubingensis, and A. niger were analyzed using FT-IR spectroscopy. The first derivatives of specific spectral regions (3002-2801 cm-1, 1773-1550 cm-1, and 1286-952 cm-1) were chosen and evaluated with respect to absorbance values. The average spectra of 130 fungal isolates were used for model calibration based on discriminant analysis, and the remaining 52 spectra were used for external model validation. This methodology achieved a total classification accuracy of 98.8% in both model calibration and validation. The per-class accuracy for A. carbonarius was 95.3% and 100% for model calibration and validation, respectively, whereas for the A. niger aggregate the per-class accuracy amounted to 100% in both cases. The obtained results indicated that FT-IR could become a promising, fast, reliable and low-cost tool for the discrimination and differentiation of closely related fungal species. Copyright © 2017 Elsevier B.V. All rights reserved.
Hoorfar, J.; Hansen, F.; Christensen, J.; Mansdal, S.; Josefsen, M. H.
2016-01-01
ABSTRACT Salmonella is recognized as one of the most important foodborne bacteria and has wide health and socioeconomic impacts worldwide. Fresh pork meat is one of the main sources of Salmonella, and efficient and fast methods for detection are therefore necessary. Current methods for Salmonella detection in fresh meat usually include >16 h of culture enrichment, in a few cases <12 h, thus requiring at least two working shifts. Here, we report a rapid (<5 h) and high-throughput method for screening of Salmonella in samples from fresh pork meat, consisting of a 3-h enrichment in standard buffered peptone water and a real-time PCR-compatible sample preparation method based on filtration, centrifugation, and enzymatic digestion, followed by fast-cycling real-time PCR detection. The method was validated in an unpaired comparative study against the Nordic Committee on Food Analysis (NMKL) reference culture method 187. Pork meat samples (n = 140) were either artificially contaminated with Salmonella at 0, 1 to 10, or 10 to 100 CFU/25 g of meat or naturally contaminated. Cohen's kappa for the degree of agreement between the rapid method and the reference was 0.64, and the relative accuracy, sensitivity, and specificity for the rapid method were 81.4, 95.1, and 97.9%, respectively. The 50% limits of detection (LOD50) were 8.8 CFU/25 g for the rapid method and 7.7 CFU/25 g for the reference method. Implementation of this method will enable faster release of Salmonella low-risk meat, providing savings for meat producers, and it will contribute to improved food safety. IMPORTANCE While the cost of analysis and hands-on time of the presented rapid method were comparable to those of reference culture methods, the fast product release by this method can provide the meat industry with a competitive advantage. Not only will the abattoirs save costs for work hours and cold storage, but consumers and retailers will also benefit from fresher meat with a longer shelf life.
Furthermore, the presented sample preparation might be adjusted for application in the detection of other pathogenic bacteria in different sample types. PMID:27986726
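The agreement statistics quoted in the validation (Cohen's kappa, sensitivity, specificity) all derive from a 2×2 table of rapid-method versus reference-method results; a sketch with illustrative counts (the study's full contingency table is not given in the abstract, so these numbers do not reproduce its kappa):

```python
def agreement_stats(tp, fp, fn, tn):
    """Cohen's kappa, sensitivity and specificity from a 2x2 table
    comparing a candidate method against a reference method."""
    n = tp + fp + fn + tn
    po = (tp + tn) / n                          # observed agreement
    p_yes = ((tp + fp) / n) * ((tp + fn) / n)   # chance agreement on positives
    p_no = ((fn + tn) / n) * ((fp + tn) / n)    # chance agreement on negatives
    pe = p_yes + p_no
    kappa = (po - pe) / (1 - pe)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return kappa, sensitivity, specificity

# Illustrative counts for n = 140 samples (invented, not the study's table)
k, se, sp = agreement_stats(tp=39, fp=2, fn=2, tn=97)
print(round(k, 3), round(se, 3), round(sp, 3))
```

Kappa discounts the agreement expected by chance, which is why it can sit well below raw percent agreement when one class dominates, as in low-prevalence contamination screening.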
Fachmann, M S R; Löfström, C; Hoorfar, J; Hansen, F; Christensen, J; Mansdal, S; Josefsen, M H
2017-03-01
Salmonella is recognized as one of the most important foodborne bacteria and has wide health and socioeconomic impacts worldwide. Fresh pork meat is one of the main sources of Salmonella, and efficient and fast methods for detection are therefore necessary. Current methods for Salmonella detection in fresh meat usually include >16 h of culture enrichment, in a few cases <12 h, thus requiring at least two working shifts. Here, we report a rapid (<5 h) and high-throughput method for screening of Salmonella in samples from fresh pork meat, consisting of a 3-h enrichment in standard buffered peptone water and a real-time PCR-compatible sample preparation method based on filtration, centrifugation, and enzymatic digestion, followed by fast-cycling real-time PCR detection. The method was validated in an unpaired comparative study against the Nordic Committee on Food Analysis (NMKL) reference culture method 187. Pork meat samples (n = 140) were either artificially contaminated with Salmonella at 0, 1 to 10, or 10 to 100 CFU/25 g of meat or naturally contaminated. Cohen's kappa for the degree of agreement between the rapid method and the reference was 0.64, and the relative accuracy, sensitivity, and specificity for the rapid method were 81.4, 95.1, and 97.9%, respectively. The 50% limits of detection (LOD50) were 8.8 CFU/25 g for the rapid method and 7.7 CFU/25 g for the reference method. Implementation of this method will enable faster release of Salmonella low-risk meat, providing savings for meat producers, and it will contribute to improved food safety. IMPORTANCE While the cost of analysis and hands-on time of the presented rapid method were comparable to those of reference culture methods, the fast product release by this method can provide the meat industry with a competitive advantage. Not only will the abattoirs save costs for work hours and cold storage, but consumers and retailers will also benefit from fresher meat with a longer shelf life.
Furthermore, the presented sample preparation might be adjusted for application in the detection of other pathogenic bacteria in different sample types. Copyright © 2017 American Society for Microbiology.
Construction concepts and validation of the 3D printed UST_2 modular stellarator
NASA Astrophysics Data System (ADS)
Queral, V.
2015-03-01
High accuracy, geometric complexity and thus the high cost of stellarators tend to hinder the advance of stellarator research. Nowadays, new manufacturing methods might be developed for the production of small and middle-size stellarators. Such methods should demonstrate advantages with respect to common fabrication methods, like casting, cutting, forging and welding, for the construction of advanced, highly convoluted modular stellarators. UST_2 is a small modular three-period quasi-isodynamic stellarator of major radius 0.26 m and plasma volume 10 litres, currently being built to validate additive manufacturing (3D printing) for stellarator construction. The modular coils are wound in grooves defined on six 3D printed half-period frames designed as light truss structures filled by a strong filler. A geometrically simple assembling configuration has been devised for UST_2 so as to lower the cost of the device while keeping the positioning accuracy of the different elements. The paper summarizes the construction and assembling concepts developed, the devised positioning methodology, the design of the coil frames and positioning elements, and an initial validation of the assembly of the components.
Breilh, Jaime; Pagliccia, Nino; Yassi, Annalee
2012-01-01
Chronic pesticide poisoning is difficult to detect. We sought to develop a low-cost test battery for settings such as Ecuador's floriculture industry. First we had to develop a case definition; as with all occupational diseases, a case had to have both a sufficient effective dose and associated health effects. For the former, using canonical discriminant analysis, we found that adding measures of protection and overall environmental stressors to occupational category and duration of exposure was useful. For the latter, factor analysis suggested three distinct manifestations of pesticide poisoning. We then determined the sensitivity and specificity of various combinations of symptoms and simple neurotoxicity tests from the Pentox questionnaire, and found that doing so increased sensitivity and specificity compared to the use of acetylcholinesterase alone, the current screening standard. While sensitivity and specificity varied with different case definitions, our results support the development of a low-cost test battery for screening in such settings.
Llorente-Mirandes, Toni; Calderón, Josep; Centrich, Francesc; Rubio, Roser; López-Sánchez, José Fermín
2014-03-15
The present study arose from the need to determine inorganic arsenic (iAs) at low levels in cereal-based food. Validated methods with a low limit of detection (LOD) are required to analyse these kinds of food. An analytical method for the determination of iAs, methylarsonic acid (MA) and dimethylarsinic acid (DMA) in cereal-based food and infant cereals is reported. The method was optimised and validated to achieve low LODs. Ion chromatography-inductively coupled plasma mass spectrometry (IC-ICPMS) was used for arsenic speciation. The main quality parameters were established. To expand the applicability of the method, different cereal products were analysed: bread, biscuits, breakfast cereals, wheat flour, corn snacks, pasta and infant cereals. The total and inorganic arsenic content of 29 cereal-based food samples ranged between 3.7-35.6 and 3.1-26.0 μg As kg-1, respectively. The present method could be considered a valuable tool for assessing inorganic arsenic contents in cereal-based foods. Copyright © 2013 Elsevier Ltd. All rights reserved.
Low cost environmental sensors for Spaceflight : NMP Space Environmental Monitor (SEM) requirements
NASA Technical Reports Server (NTRS)
Garrett, Henry B.; Buehler, Martin G.; Brinza, D.; Patel, J. U.
2005-01-01
An outstanding problem in spaceflight is the lack of adequate sensors for monitoring the space environment and its effects on engineering systems. By adequate, we mean low cost in terms of mission impact (e.g., low price, low mass/size, low power, low data rate, and low design impact). The New Millennium Program (NMP) is investigating the development of such a low-cost Space Environmental Monitor (SEM) package for inclusion on its technology validation flights. This effort follows from the need by NMP to characterize the space environment during testing so that potential users can extrapolate the test results to end-use conditions. The immediate objective of this effort is to develop a small diagnostic sensor package that could be obtained from commercial sources. Environments being considered are: contamination, atomic oxygen, ionizing radiation, cosmic radiation, EMI, and temperature. This talk describes the requirements and rationale for selecting these environments and reviews a preliminary design that includes a micro-controller data logger with data storage and interfaces to the sensors and spacecraft. If successful, such a sensor package could be the basis of a unique, long-term program for monitoring the effects of the space environment on spacecraft systems.
A method for spectral DNS of low Rm channel flows based on the least dissipative modes
NASA Astrophysics Data System (ADS)
Kornet, Kacper; Pothérat, Alban
2015-10-01
We put forward a new type of spectral method for the direct numerical simulation of flows where anisotropy or very fine boundary layers are present. The main idea is to take advantage of the fact that such structures are dissipative and that their presence should reduce the number of degrees of freedom of the flow, when, paradoxically, their fine resolution incurs extra computational cost in most current methods. The principle of this method is to use a functional basis whose elements already include these fine structures, so as to avoid these extra costs. This leads us to develop an algorithm to implement a spectral method for arbitrary functional bases, and in particular non-orthogonal ones. We construct a basic implementation of this algorithm to simulate magnetohydrodynamic (MHD) channel flows with an externally imposed, transverse magnetic field, where very thin boundary layers are known to develop along the channel walls. In this case, the sought functional basis can be built out of the eigenfunctions of the dissipation operator, which incorporate these boundary layers, and it turns out to be non-orthogonal. We validate this new scheme against numerical simulations of freely decaying MHD turbulence based on a finite volume code, and it is found to provide accurate results. Its ability to fully resolve wall-bounded turbulence with a number of modes close to that required by the dynamics is demonstrated on a simple example. This opens the way to full-blown simulations of MHD turbulence under very high magnetic fields, which until now were too computationally expensive. In contrast to traditional methods, the computational cost of the proposed method does not depend on the intensity of the magnetic field.
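The distinctive step for a non-orthogonal functional basis is that expansion coefficients come from solving a Gram-matrix system rather than from plain projections; a generic sketch with a small monomial basis standing in for the paper's dissipation-operator eigenfunctions (illustrative only):

```python
import numpy as np

# Non-orthogonal basis on [0, 1]: monomials 1, x, x^2, a stand-in for the
# (non-orthogonal) dissipation-operator eigenfunctions used in the paper
x = np.linspace(0.0, 1.0, 2001)
dx = x[1] - x[0]
basis = [np.ones_like(x), x, x**2]

def inner(u, v):
    """Trapezoidal approximation of the L2 inner product on [0, 1]."""
    y = u * v
    return float(np.sum(y[:-1] + y[1:]) * 0.5 * dx)

def expand(f_vals):
    """Coefficients c solving G c = b, with Gram matrix G_ij = <phi_i, phi_j>.
    A plain projection c_i = <phi_i, f> would be wrong here, because the
    basis is not orthogonal."""
    G = np.array([[inner(bi, bj) for bj in basis] for bi in basis])
    b = np.array([inner(bi, f_vals) for bi in basis])
    return np.linalg.solve(G, b)

# f(x) = 2 + 3x^2 lies in the span, so it is recovered to quadrature accuracy
c = expand(2 + 3 * x**2)
print(np.round(c, 3))
```

In the paper's setting the same linear-algebra structure appears with eigenfunctions that already contain the Hartmann-layer profiles, so few modes are needed even at high magnetic field.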
Improved patch-based learning for image deblurring
NASA Astrophysics Data System (ADS)
Dong, Bo; Jiang, Zhiguo; Zhang, Haopeng
2015-05-01
Most recent image deblurring methods use only the valid information found in the input image as the clue for restoring the degraded region. These methods usually suffer from insufficient prior information and relatively poor adaptiveness. Patch-based methods use not only the valid information of the input image itself but also the prior information of sample images, which improves adaptiveness. However, the cost function of this approach is quite time-consuming to evaluate, and the method may also produce ringing artifacts. In this paper, we propose an improved non-blind deblurring algorithm based on learning patch likelihoods. On the one hand, we consider the effect of the Gaussian mixture model components with different weights and normalize the weight values, which optimizes the cost function and reduces running time. On the other hand, a post-processing method is proposed to remove the ringing artifacts produced by traditional patch-based methods. Extensive experiments were performed, and the results verify that our method can effectively reduce execution time, suppress ringing artifacts, and preserve the quality of the deblurred image.
A digital image method of spot tests for determination of copper in sugar cane spirits
NASA Astrophysics Data System (ADS)
Pessoa, Kenia Dias; Suarez, Willian Toito; dos Reis, Marina Ferreira; de Oliveira Krambeck Franco, Mathews; Moreira, Renata Pereira Lopes; dos Santos, Vagner Bezerra
2017-10-01
In this work, the development and validation of an analytical methodology for the determination of copper in sugarcane spirit samples is carried out. A digital image-based (DIB) method was applied along with a spot test of the colorimetric reaction, employing the RGB color model. For the determination of copper concentration, cuprizone, a bidentate organic reagent that forms a blue chelate with copper in alkaline medium, was used. A linear calibration curve over the concentration range from 0.75 to 5.00 mg L-1 (r² = 0.9988) was obtained, and limits of detection and quantification of 0.078 mg L-1 and 0.26 mg L-1, respectively, were achieved. In the accuracy studies, recoveries ranging from 98 to 104% were obtained. The comparison of copper concentration results in sugarcane spirits between the DIB method and flame atomic absorption spectrometry as the reference method showed no significant difference between the two methods (paired t-test, 95% confidence level). Thus, the spot test method associated with DIB allows the use of devices such as digital cameras and smartphones to evaluate colorimetric reactions with low waste generation, practicality, speed, accuracy, precision, high portability and low cost.
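The DIB signal processing can be sketched as extracting a channel intensity from the spot image and fitting a Beer-Lambert-style calibration line; the intensities below are invented for illustration (the paper's own calibration data and channel choice are not reproduced here):

```python
import numpy as np

# Hypothetical mean channel intensities from spot-test images of copper
# standards (invented values). The analytical signal is taken as
# -log10(I/I0) against a blank, by analogy with absorbance.
conc = np.array([0.75, 1.0, 2.0, 3.0, 4.0, 5.0])   # mg/L Cu standards
I0 = 250.0                                          # blank intensity
I = np.array([235.0, 230.0, 211.0, 194.0, 178.0, 163.0])
signal = -np.log10(I / I0)

slope, intercept = np.polyfit(conc, signal, 1)      # linear calibration fit

def copper_conc(intensity):
    """Convert a measured channel intensity to a copper concentration."""
    return (-np.log10(intensity / I0) - intercept) / slope

r2 = np.corrcoef(conc, signal)[0, 1] ** 2
print(round(r2, 4))
```

A camera pipeline would average the chosen RGB channel over the spot region before applying `copper_conc`; white-balance and illumination must be held fixed for the blank and the samples.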
López, Elena; García, Sergio; Barea, Rafael; Bergasa, Luis M.; Molinos, Eduardo J.; Arroyo, Roberto; Romera, Eduardo; Pardo, Samuel
2017-01-01
One of the main challenges of aerial robot navigation in indoor or GPS-denied environments is position estimation using only the available onboard sensors. This paper presents a Simultaneous Localization and Mapping (SLAM) system that remotely calculates the pose and environment map of different low-cost commercial aerial platforms, whose onboard computing capacity is usually limited. The proposed system adapts to the sensory configuration of the aerial robot by integrating different state-of-the-art SLAM methods based on vision, laser and/or inertial measurements using an Extended Kalman Filter (EKF). To do this, a minimum onboard sensory configuration is assumed, consisting of a monocular camera, an Inertial Measurement Unit (IMU) and an altimeter. This improves the results of well-known monocular visual SLAM methods (LSD-SLAM and ORB-SLAM are tested and compared in this work) by resolving scale ambiguity and providing additional information to the EKF. When payload and computational capabilities permit, a 2D laser sensor can be easily incorporated into the SLAM system, obtaining a local 2.5D map and a footprint estimation of the robot position that improves the 6D pose estimation through the EKF. We present experimental results with two different commercial platforms, and validate the system by applying it to their position control. PMID:28397758
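The scale-ambiguity correction mentioned above can be illustrated with a deliberately minimal one-state EKF that estimates the unknown monocular-SLAM scale from altimeter readings; all numbers are assumed for illustration, and this is not the paper's full 6D filter:

```python
import numpy as np

# Minimal 1-state EKF estimating the monocular-SLAM scale factor s from
# altimeter readings (sketch with assumed noise values)
s, P = 1.0, 1.0        # initial scale estimate and covariance
Q, R = 1e-6, 0.05**2   # process noise and altimeter noise (assumed)

def ekf_scale_update(s, P, z_alt, z_vslam):
    """Measurement model: z_alt = s * z_vslam + noise, so H = dh/ds = z_vslam."""
    P = P + Q                   # predict: scale assumed nearly constant
    H = z_vslam
    y = z_alt - s * z_vslam     # innovation
    S = H * P * H + R           # innovation covariance
    K = P * H / S               # Kalman gain
    return s + K * y, (1 - K * H) * P

true_scale = 2.5
rng = np.random.default_rng(0)
for _ in range(200):
    h_vslam = rng.uniform(0.5, 2.0)                    # unscaled SLAM altitude
    h_alt = true_scale * h_vslam + rng.normal(0, 0.05)  # noisy altimeter
    s, P = ekf_scale_update(s, P, h_alt, h_vslam)
print(round(s, 2))
```

Once s converges, the up-to-scale visual SLAM pose can be multiplied by s before fusion with the inertial measurements, which is the role the altimeter plays in the described configuration.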
Erçağ, Erol; Uzer, Ayşem; Apak, Reşat
2009-05-15
Because of the extremely heterogeneous distribution of explosives in contaminated soils, on-site colorimetric methods are efficient tools for assessing the nature and extent of contamination. To meet the need for rapid and low-cost chemical sensing of explosive traces or residues in soil and post-blast debris, a colorimetric absorption-based sensor for trinitrotoluene (TNT) determination has been developed. The charge-transfer (CT) reagent (dicyclohexylamine, DCHA) is entrapped in a polyvinylchloride (PVC) polymer matrix plasticised with dioctyl phthalate (DOP), and moulded into a transparent sensor membrane sliced into test strips capable of sensing TNT, with an absorption maximum at 530 nm when placed in a 1-mm spectrophotometer cell. The sensor gave a linear absorption response to 5-50 mg L-1 TNT solutions in 30% aqueous acetone, with a limit of detection (LOD) of 3 mg L-1. Among potential interferents, the sensor is affected only by tetryl, and not by RDX, pentaerythritol tetranitrate (PETN), dinitrotoluene (DNT), or picric acid. The proposed method was statistically validated for TNT assay against high-performance liquid chromatography (HPLC) using a standard sample of Comp B. The developed sensor was relatively resistant to air and water, was low-cost and highly specific, gave a rapid and reproducible response, and was suitable for field determination of TNT in both dry and humid soil and in groundwater with a portable colorimeter.
Control Design and Performance Analysis for Autonomous Formation Flight Experiments
NASA Astrophysics Data System (ADS)
Rice, Caleb Michael
Autonomous formation flight is a key approach for reducing greenhouse gas emissions and managing traffic in future high-density airspace. Unmanned Aerial Vehicles (UAVs) have made the physical demonstration and validation of autonomous formation flight concepts inexpensive and have eliminated the flight risk to human pilots. This thesis discusses the design, implementation, and flight testing of three different formation flight control methods, Proportional-Integral-Derivative (PID), Fuzzy Logic (FL), and Nonlinear Dynamic Inversion (NLDI), and their respective performance behavior. Experimental results show achievable autonomous formation flight and performance quality with a pair of low-cost unmanned fixed-wing research aircraft and also with a solo vertical takeoff and landing (VTOL) quadrotor.
The effect of time synchronization of wireless sensors on the modal analysis of structures
NASA Astrophysics Data System (ADS)
Krishnamurthy, V.; Fowler, K.; Sazonov, E.
2008-10-01
Driven by the need to reduce the installation cost and maintenance cost of structural health monitoring (SHM) systems, wireless sensor networks (WSNs) are becoming increasingly popular. Perfect time synchronization amongst the wireless sensors is a key factor enabling the use of low-cost, low-power WSNs for structural health monitoring applications based on output-only modal analysis of structures. In this paper we present a theoretical framework for analysis of the impact created by time delays in the measured system response on the reconstruction of mode shapes using the popular frequency domain decomposition (FDD) technique. This methodology directly estimates the change in mode shape values based on sensor synchronicity. We confirm the proposed theoretical model by experimental validation in modal identification experiments performed on an aluminum beam. The experimental validation was performed using a wireless intelligent sensor and actuator network (WISAN) which allows for close time synchronization between sensors (0.6-10 µs in the tested configuration) and guarantees lossless data delivery under normal conditions. The experimental results closely match theoretical predictions and show that even very small delays in output response impact the mode shapes.
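The mechanism analysed above can be illustrated directly: a synchronization delay τ in one sensor channel multiplies its spectral content at frequency f by exp(-2jπfτ), rotating the phase of that mode-shape component; the numbers below are assumed for illustration:

```python
import numpy as np

# Effect of a time-synchronization error on an FDD-identified mode shape:
# a delay tau in sensor 2 rotates its component's phase by 2*pi*f*tau
f_mode = 10.0    # Hz, modal frequency (assumed)
tau = 1e-3       # s, synchronization error of sensor 2 (assumed)
phi_true = np.array([1.0, 0.8])                 # true (real) mode shape
delay = np.exp(-2j * np.pi * f_mode * tau)      # spectral delay factor
phi_meas = phi_true * np.array([1.0, delay])    # measured complex shape

phase_err_deg = np.degrees(np.angle(phi_meas[1]))
print(round(phase_err_deg, 2))  # -3.6 degrees for 1 ms at 10 Hz
```

The magnitudes are unchanged; the error appears purely as spurious phase, which is why tight synchronization (microseconds, as in the tested WISAN configuration) keeps identified mode shapes effectively real-valued.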
The costly pursuit of self-esteem.
Crocker, Jennifer; Park, Lora E
2004-05-01
Researchers have recently questioned the benefits associated with having high self-esteem. The authors propose that the importance of self-esteem lies more in how people strive for it rather than whether it is high or low. They argue that in domains in which their self-worth is invested, people adopt the goal to validate their abilities and qualities, and hence their self-worth. When people have self-validation goals, they react to threats in these domains in ways that undermine learning; relatedness; autonomy and self-regulation; and over time, mental and physical health. The short-term emotional benefits of pursuing self-esteem are often outweighed by long-term costs. Previous research on self-esteem is reinterpreted in terms of self-esteem striving. Cultural roots of the pursuit of self-esteem are considered. Finally, the alternatives to pursuing self-esteem, and ways of avoiding its costs, are discussed. ((c) 2004 APA, all rights reserved)
Potential public sector cost-savings from over-the-counter access to oral contraceptives.
Foster, Diana G; Biggs, M Antonia; Phillips, Kathryn A; Grindlay, Kate; Grossman, Daniel
2015-05-01
This study estimates how making oral contraceptive pills (OCPs) available without a prescription may affect contraceptive use, unintended pregnancies and associated contraceptive and pregnancy costs among low-income women. Based on published figures, we estimate two scenarios [low over-the-counter (OTC) use and high OTC use] of the proportion of low-income women likely to switch to an OTC pill and predict adoption of OCPs according to the out-of-pocket costs per pill pack. We then estimate cost-savings of each scenario by comparing the total public sector cost of providing OCPs OTC and medical care for unintended pregnancy. Twenty-one percent of low-income women at risk for unintended pregnancy are very likely to use OCPs if they were available without a prescription. Women's use of OTC OCPs varies widely by the out-of-pocket pill pack cost. In a scenario assuming no out-of-pocket costs for the over-the-counter pill, an additional 11-21% of low-income women will use the pill, resulting in a 20-36% decrease in the number of women using no method or a method less effective than the pill, and a 7-25% decrease in the number of unintended pregnancies, depending on the level of use and any effect on contraceptive failure rates. If out-of-pocket costs for such pills are low, OTC access could have a significant effect on use of effective contraceptives and unintended pregnancy. Public health plans may reduce expenditures on pregnancy and contraceptive healthcare services by covering oral contraceptives as an OTC product. Interest in OTC access to oral contraceptives is high. Removing the prescription barrier, particularly if pill packs are available at low or zero out-of-pocket cost, could increase the use of effective methods of contraception and reduce unintended pregnancy and healthcare costs for contraceptive and pregnancy care. Copyright © 2015 Elsevier Inc. All rights reserved.
Medical cost analysis: application to colorectal cancer data from the SEER Medicare database.
Bang, Heejung
2005-10-01
Incompleteness is a key feature of most survival data. Numerous well-established statistical methodologies and algorithms exist for analyzing life or failure time data. However, induced censorship invalidates the use of those standard analytic tools for some survival-type data such as medical costs. In this paper, some valid methods currently available for analyzing censored medical cost data are reviewed. Some cautionary findings under different assumptions are illustrated through application to medical costs from colorectal cancer patients. Cost analysis should be suitably planned and carefully interpreted under various meaningful scenarios, even with judiciously selected statistical methods. This approach would be of great help to policy makers who seek to prioritize health care expenditures and to assess the elements of resource use.
Banknotes and unattended cash transactions
NASA Astrophysics Data System (ADS)
Bernardini, Ronald R.
2000-04-01
There is a 64 billion dollar annual unattended cash transaction business in the US with 10 to 20 million daily transactions. Even small problems with the machine readability of banknotes can quickly become a major problem for the machine manufacturer and consumer. Traditional note designs incorporate overt security features for visual validation by the public. Many of these features, such as fine-line engraving, microprinting and watermarks, are unsuitable as machine-readable features in low-cost note acceptors. Current machine-readable features, mostly covert, were designed and implemented with the central banks in mind. These features are usable only by the banks' large, high-speed currency sorting and validation equipment. New note designs should consider and provide for low-cost note acceptors, implementing features developed for inexpensive sensing technologies. Machine-readable features are only as good as their consistency. The quality of security features, as well as that of the overall printing process, must be maintained to ensure reliable and secure operation of note readers. Variations in printing and in the components used to make the note are among the major causes of poor performance in low-cost note acceptors. The involvement of machine manufacturers in new currency designs will aid note producers in designing a note that is machine friendly, helping to secure acceptance of the note by the public as well as acting as a deterrent to fraud.
NASA Astrophysics Data System (ADS)
Depernet, Daniel; Ba, Oumar; Berthon, Alain
2012-12-01
This paper presents a contribution to the implementation of hybrid power plants in rural areas of Senegal that lack electricity. Wind and photovoltaic generators are coupled to exploit the country's renewable energy resources. Lead-acid storage batteries are coupled with the generators to smooth electricity generation. This work focuses in particular on the development of a low-cost online impedance spectroscopy method to address the limited lifetime of batteries and the difficulty of maintaining them in isolated areas. The control of the static converter associated with the battery is adapted to integrate battery characterization by impedance spectroscopy. An experimental platform developed in the laboratory validated the method for online measurement of the battery impedance spectrum and initiated a data-monitoring phase.
A pore-scale numerical method for simulating low-salinity waterflooding in porous media
NASA Astrophysics Data System (ADS)
Jiang, F.; Yang, J.; Tsuji, T.
2017-12-01
Low-salinity (LS) water injection has attracted attention in recent years as a practical oil-recovery technique because of its low cost and high efficiency. Many researchers have conducted laboratory experiments and observed significant benefits compared with conventional high-salinity (HS) waterflooding. However, the fundamental mechanisms remain poorly understood. Different mechanisms, such as fines migration and wettability alteration, have been proposed to explain this low-salinity effect. Here, we focus on investigating the effect of wettability alteration on recovery efficiency. For this purpose, we propose a pore-scale numerical method to quantitatively evaluate the impact of salinity concentration on sweep efficiency. We first developed the pore-scale model by coupling a convection-diffusion model, which tracks the concentration change, with a lattice Boltzmann model for two-phase flow, assuming that a reduction in water salinity leads to localised wettability alteration. The model is then validated by simulating the contact-angle change of an oil droplet attached to a clay substrate. Finally, the method was applied to a real rock geometry extracted from micro-CT images of Berea sandstone. The results indicate that the initial wettability state of the system and the extent of wettability alteration are important in predicting the improvement in oil recovery due to LS brine injection. This work was supported by JSPS KAKENHI Grant Number 16K18331.
High-fidelity, low-cost, automated method to assess laparoscopic skills objectively.
Gray, Richard J; Kahol, Kanav; Islam, Gazi; Smith, Marshall; Chapital, Alyssa; Ferrara, John
2012-01-01
We sought to define the extent to which a motion analysis-based assessment system constructed with simple equipment could measure technical skill objectively and quantitatively. An "off-the-shelf" digital video system was used to capture the hand and instrument movement of surgical trainees (beginner level = PGY-1, intermediate level = PGY-3, and advanced level = PGY-5/fellows) while they performed a peg transfer exercise. The video data were passed through a custom computer vision algorithm that analyzed incoming pixels to measure movement smoothness objectively. The beginner-level group had the poorest performance, whereas those in the advanced group generated the highest scores. Intermediate-level trainees scored significantly (p < 0.04) better than beginner trainees. Advanced-level trainees scored significantly better than intermediate-level trainees and beginner-level trainees (p < 0.04 and p < 0.03, respectively). A computer vision-based analysis of surgical movements provides an objective basis for technical expertise-level analysis with construct validity. The technology to capture the data is simple, low cost, and readily available, and it obviates the need for expert human assessment in this setting. Copyright © 2012 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
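The abstract does not disclose the custom computer-vision algorithm, but movement smoothness from tracked hand or instrument trajectories is commonly scored with a dimensionless-jerk metric. The sketch below is one such illustrative formulation with a hypothetical frame rate and synthetic trajectories, not the authors' implementation:

```python
import numpy as np

def smoothness(positions, fs):
    """Negative log dimensionless jerk of a 2-D trajectory: higher = smoother."""
    dt = 1.0 / fs
    vel = np.gradient(positions, dt, axis=0)                       # 1st derivative
    jerk = np.gradient(np.gradient(vel, dt, axis=0), dt, axis=0)   # 3rd derivative
    duration = len(positions) * dt
    peak_speed = np.max(np.linalg.norm(vel, axis=1))
    # Dimensionless squared-jerk integral (Hogan/Sternad-style normalization)
    dj = duration**3 / peak_speed**2 * np.sum(np.linalg.norm(jerk, axis=1)**2) * dt
    return -np.log(dj)

fs = 30.0                                  # hypothetical video frame rate
t = np.arange(0, 5, 1/fs)
smooth = np.column_stack([np.sin(2*np.pi*0.5*t), np.cos(2*np.pi*0.5*t)])
jittery = smooth + 0.03*np.sin(2*np.pi*10*t)[:, None]   # tremor-like jitter
```

On these synthetic trajectories the jittery motion scores markedly lower, mirroring the expected beginner-versus-expert separation.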
A low-cost method for estimating energy expenditure during soccer refereeing.
Ardigò, Luca Paolo; Padulo, Johnny; Zuliani, Andrea; Capelli, Carlo
2015-01-01
This study aimed to apply a validated bioenergetics model of sprint running to recordings obtained from commercial basic high-sensitivity global positioning system receivers to estimate energy expenditure and physical-activity variables during soccer refereeing. We studied five Italian fifth-division referees during 20 official matches while they carried the receivers. By applying the model to the recorded speed and acceleration data, we calculated energy consumption during activity, mass-normalised total energy consumption, total distance, metabolically equivalent distance and their ratio over the entire match and the two halves. The main whole-match results were: energy consumption = 4729 ± 608 kJ, mass-normalised total energy consumption = 74 ± 8 kJ · kg(-1), total distance = 13,112 ± 1225 m, metabolically equivalent distance = 13,788 ± 1151 m and metabolically equivalent/total distance = 1.05 ± 0.05. By using a very low-cost device, it is possible to estimate the energy expenditure of soccer refereeing. The provided equation predicting mass-normalised total energy consumption from total distance can supply information about the energy demand of soccer refereeing.
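The abstract does not reproduce the bioenergetics model, but the equivalent-slope approach of di Prampero and colleagues, in the polynomial form popularized by Osgnach et al. for match analysis, is a common realization of it. The coefficients below are taken from that literature and should be treated as illustrative:

```python
import math

G = 9.81  # gravitational acceleration (m/s^2)

def energy_cost(acc):
    """Energy cost of accelerated running (J/kg/m) via the equivalent-slope model.

    acc: forward acceleration (m/s^2). Polynomial coefficients follow the
    Osgnach et al. formulation of di Prampero's sprint-running model.
    """
    es = acc / G                  # equivalent slope
    em = math.sqrt(es**2 + 1.0)  # equivalent body-mass factor
    return (155.4*es**5 - 30.4*es**4 - 43.3*es**3
            + 46.3*es**2 + 19.5*es + 3.6) * em

def metabolic_power(speed, acc):
    """Instantaneous metabolic power (W/kg) = energy cost x speed."""
    return energy_cost(acc) * speed
```

At zero acceleration the model reduces to roughly 3.6 J/kg/m, the cost of constant-speed level running; integrating metabolic power over GPS-recorded speed and acceleration traces yields match-level energy expenditure as in the study.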
Low-cost single-crystal turbine blades, volume 2
NASA Technical Reports Server (NTRS)
Strangman, T. E.; Dennis, R. E.; Heath, B. R.
1984-01-01
The overall objectives of Project 3 were to develop the exothermic casting process to produce uncooled single-crystal (SC) HP turbine blades in MAR-M 247 and higher-strength derivative alloys and to validate the materials, process, and components through extensive mechanical property testing, rig testing, and 200 hours of endurance engine testing. These program objectives were achieved. The exothermic casting process was successfully developed into a low-cost, nonproprietary method for producing single-crystal castings. Single-crystal MAR-M 247 and two derivative alloys developed during this project, NASAIR 100 and SC Alloy 3, were fully characterized through mechanical property testing. SC MAR-M 247 shows no significant improvement in strength over directionally solidified (DS) MAR-M 247, but the derivative alloys, NASAIR 100 and Alloy 3, show significant tensile and fatigue improvements. Firtree testing, holography, and strain-gauge rig testing were used to determine the effects of the anisotropic characteristics of single-crystal materials. No undesirable characteristics were found. In general, the single-crystal material behaved similarly to DS MAR-M 247. Two complete engine sets of SC HP turbine blades were cast using the exothermic casting process and fully machined. These blades were successfully engine-tested.
Kumar Thakur, Rupak; Anoop, C S
2015-08-01
Cardio-vascular health monitoring has gained considerable attention in recent years. The principle of non-contact capacitive electrocardiography (ECG) and its applicability as a valuable, low-cost, easy-to-use scheme for cardio-vascular health monitoring have been demonstrated in several recent research papers. In this paper, we develop a complete non-contact ECG system using a suitable front-end electronic circuit and a heart-rate (HR) measurement unit based on an enhanced Fourier interpolation technique. The front-end circuit is realized using low-cost, readily available components, and the proposed HR measurement unit is designed to achieve fairly accurate results. The entire system has been extensively tested to verify its efficacy, and test results show that the developed system can estimate HR with an accuracy of ±2 beats. Detailed tests have been conducted to validate the performance of the system for different cloth thicknesses on the subject. Some basic tests illustrating the application of the proposed system to heart-rate variability estimation have been conducted and the results reported. The developed system can be used as a portable, reliable, long-term cardiac health monitoring device and can be extended to human drowsiness detection.
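The abstract does not specify the "enhanced Fourier interpolation" technique; a common variant refines the FFT peak location by parabolic interpolation across the three bins around the maximum. A minimal sketch on a synthetic 72-bpm sinusoid standing in for the cardiac signal (illustrative only, not the authors' circuit or algorithm):

```python
import numpy as np

def heart_rate_bpm(sig, fs):
    """HR estimate from the dominant spectral peak, refined by parabolic interpolation."""
    sig = sig - np.mean(sig)
    spec = np.abs(np.fft.rfft(sig * np.hanning(len(sig))))
    freqs = np.fft.rfftfreq(len(sig), 1/fs)
    band = (freqs >= 0.7) & (freqs <= 3.5)        # 42-210 bpm search band
    k = int(np.argmax(np.where(band, spec, 0.0)))
    a, b, c = spec[k-1], spec[k], spec[k+1]        # parabolic peak refinement
    delta = 0.5*(a - c)/(a - 2*b + c)              # fractional-bin offset
    return (freqs[k] + delta*(freqs[1] - freqs[0])) * 60.0

fs = 250.0                                # hypothetical sampling rate
t = np.arange(0, 10, 1/fs)
synthetic = np.sin(2*np.pi*1.2*t)         # stand-in for a 72-bpm cardiac signal
```

The sub-bin refinement is what lets a short analysis window stay within a beats-per-minute-level error budget such as the ±2 beats reported.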
[A new low-cost webcam-based laparoscopic training model].
Langeron, A; Mercier, G; Lima, S; Chauleur, C; Golfier, F; Seffert, P; Chêne, G
2012-01-01
To validate a new laparoscopy home training model (GYN Trainer®) in order to practise and learn basic laparoscopic surgery. Ten junior surgical residents and six experienced operators were timed and assessed during six laparoscopic exercises performed on the home training model. Acquisition of skill was 35%. All the novices significantly improved performance in surgical skills despite an 8% partial loss of acquisition between two training sessions. Qualitative evaluation of the system was good (3.8/5). This low-cost personal laparoscopic model seems to be a useful tool to assist surgical novices in learning basic laparoscopic skills. Copyright © 2012 Elsevier Masson SAS. All rights reserved.
Horton, Keith A.; Williams-Jones, Glyn; Garbeil, Harold; Elias, Tamar; Sutton, A. Jeff; Mouginis-Mark, Peter J; Porter, John T.; Clegg, Steven
2006-01-01
A miniaturized, lightweight and low-cost UV correlation spectrometer, the FLYSPEC, has been developed as an alternative for the COSPEC, which has long been the mainstay for monitoring volcanic sulfur dioxide fluxes. Field experiments have been conducted with the FLYSPEC at diverse volcanic systems, including Masaya (Nicaragua), Poás (Costa Rica), Stromboli, Etna and Vulcano (Italy), Villarica (Chile) and Kilauea (USA). We present here those validation measurements that were made simultaneously with COSPEC at Kilauea between March 2002 and February 2003. These experiments, with source emission rates that ranged from 95 to 1,560 t d−1, showed statistically identical results from both instruments. SO2 path-concentrations ranged from 0 to >1,000 ppm-m with average correlation coefficients greater than r2=0.946. The small size and low cost create the opportunity for FLYSPEC to be used in novel deployment modes that have the potential to revolutionize the manner in which volcanic and industrial monitoring is performed.
Multi-time Scale Joint Scheduling Method Considering the Grid of Renewable Energy
NASA Astrophysics Data System (ADS)
Zhijun, E.; Wang, Weichen; Cao, Jin; Wang, Xin; Kong, Xiangyu; Quan, Shuping
2018-01-01
Prediction errors in renewable generation, such as wind and solar power, complicate power-system dispatch. In this paper, a multi-time-scale robust scheduling method is proposed to solve this problem. It reduces the impact of clean-energy prediction bias on the power grid by operating on multiple time scales (day-ahead, intraday, and real-time) and coordinating the dispatched output of various power sources such as hydropower, thermal power, wind power, and gas power. The method adopts robust scheduling to ensure the robustness of the resulting schedule. By calculating the cost of wind curtailment and load shedding, it converts robustness into a risk cost and selects the uncertainty set that minimizes the overall cost. The validity of the method is verified by simulation.
24 CFR 221.275 - Method of paying insurance benefits.
Code of Federal Regulations, 2010 CFR
2010-04-01
... AUTHORITIES LOW COST AND MODERATE INCOME MORTGAGE INSURANCE-SAVINGS CLAUSE Contract Rights and Obligations-Low Cost Homes § 221.275 Method of paying insurance benefits. If the application for insurance benefits is acceptable to the Commissioner, all of the insurance claim shall be paid in cash unless the mortgagee files a...
A new passive sampling method with rapid low-cost spectral detection has recently been developed. The method makes use of an ultraviolet (UV)-transparent polymer which serves as both a concentrator for dissolved compounds, and an optical cell for UV spectral detection. Because ...
Validation of GC and HPLC systems for residue studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, M.
1995-12-01
For residue studies, GC and HPLC system performance must be validated prior to and during use. One excellent measure of system performance is the standard curve and the associated chromatograms used to construct that curve. The standard curve is a model of system response to an analyte over a specific time period, and is prima facie evidence of system performance beginning at the autosampler and proceeding through the injector, column, detector, electronics, data-capture device, and printer/plotter. This tool measures the performance of the entire chromatographic system; its power negates most of the benefits associated with costly and time-consuming validation of individual system components. Other measures of instrument and method validation will be discussed, including quality control charts and experimental designs for method validation.
Tervonen, Tommi; Gelhorn, Heather; Sri Bhashyam, Sumitra; Poon, Jiat-Ling; Gries, Katharine S; Rentz, Anne; Marsh, Kevin
2017-12-01
Multiple criteria decision analysis swing weighting (SW) and discrete choice experiments (DCE) are appropriate methods for capturing patient preferences on treatment benefit-risk trade-offs. This paper presents a qualitative comparison of the 2 methods. We review and critically assess the similarities and differences of SW and DCE based on 6 aspects: comprehension by study participants, cognitive biases, sample representativeness, ability to capture heterogeneity in preferences, reliability and validity, and robustness of the results. The SW choice task can be more difficult, but the workshop context in which SW is conducted may provide more support to patients who are unfamiliar with the end points being evaluated or who have cognitive impairments. Both methods are similarly prone to a number of biases associated with preference elicitation, and DCE is prone to simplifying heuristics, which limits its application with a large number of attributes. The low cost per patient of the DCE means that it can be better at achieving a representative sample, though SW does not require such large sample sizes due to the exact nature of the collected preference data. This also means that internal validity is automatically enforced with SW, while the internal validity of DCE results needs to be assessed manually. The choice between the 2 methods depends on the characteristics of the benefit-risk assessment, especially on how difficult the trade-offs are for the patients to make and how many patients are available. Although empirical studies exist on many of the evaluation aspects, critical evidence gaps remain. Copyright © 2017 John Wiley & Sons, Ltd.
Salemi, Jason L; Comins, Meg M; Chandler, Kristen; Mogos, Mulubrhan F; Salihu, Hamisu M
2013-08-01
Comparative effectiveness research (CER) and cost-effectiveness analysis are valuable tools for informing health policy and clinical care decisions. Despite the increased availability of rich observational databases with economic measures, few researchers have the skills needed to conduct valid and reliable cost analyses for CER. The objectives of this paper are to (i) describe a practical approach for calculating cost estimates from hospital charges in discharge data using publicly available hospital cost reports, and (ii) assess the impact of using different methods for cost estimation in maternal and child health (MCH) studies by conducting economic analyses on gestational diabetes (GDM) and pre-pregnancy overweight/obesity. In Florida, we have constructed a clinically enhanced, longitudinal, encounter-level MCH database covering over 2.3 million infants (and their mothers) born alive from 1998 to 2009. Using this as a template, we describe a detailed methodology to use publicly available data to calculate hospital-wide and department-specific cost-to-charge ratios (CCRs), link them to the master database, and convert reported hospital charges to refined cost estimates. We then conduct an economic analysis as a case study on women by GDM and pre-pregnancy body mass index (BMI) status to compare the impact of using different methods on cost estimation. Over 60 % of inpatient charges for birth hospitalizations came from the nursery/labor/delivery units, which have very different cost-to-charge markups (CCR = 0.70) than the commonly substituted hospital average (CCR = 0.29). Using estimated mean, per-person maternal hospitalization costs for women with GDM as an example, unadjusted charges ($US14,696) grossly overestimated actual cost, compared with hospital-wide ($US3,498) and department-level ($US4,986) CCR adjustments. 
However, the refined cost estimation method, although more accurate, did not alter our conclusions that infant/maternal hospitalization costs were significantly higher for women with GDM than without, and for overweight/obese women than for those in a normal BMI range. Cost estimates, particularly among MCH-related services, vary considerably depending on the adjustment method. Our refined approach will be valuable to researchers interested in incorporating more valid estimates of cost into databases with linked hospital discharge files.
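The core calculation described above, converting reported charges to cost estimates with cost-to-charge ratios (CCRs), can be sketched with hypothetical figures. The charges and ratios below are illustrative, not the study's actual values, though the hospital-wide CCR of 0.29 and the nursery/labor/delivery CCR of 0.70 echo the magnitudes cited in the abstract:

```python
# Hypothetical charges for one birth hospitalization, by revenue department,
# and illustrative cost-to-charge ratios (CCRs) from a public cost report.
charges  = {"nursery_labor_delivery": 9000.0, "pharmacy": 2000.0, "laboratory": 1000.0}
dept_ccr = {"nursery_labor_delivery": 0.70,   "pharmacy": 0.25,   "laboratory": 0.30}
hospital_ccr = 0.29                      # hospital-wide average CCR

# Department-level adjustment: convert each department's charge separately.
cost_dept = sum(charges[d] * dept_ccr[d] for d in charges)      # 7100.0

# Hospital-wide adjustment: one average ratio applied to total charges.
cost_hosp = sum(charges.values()) * hospital_ccr                # 3480.0
```

Because birth hospitalizations are dominated by departments with high markups relative to the hospital average, the two adjustment methods diverge sharply, which is the paper's central methodological point.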
Zou, Yaming; Liao, Yu; Liu, Fengying; Chen, Lei; Shen, Hongcheng; Huang, Shujie; Zheng, Heping; Yang, Bin; Hao, Yuantao
2017-11-01
Syphilis has continuously posed a great challenge to China. However, very little data exist regarding the cost of syphilis. Taking the Guangdong Initiative for Comprehensive Control of Syphilis area as the research site, we aimed to comprehensively measure the annual economic burden of syphilis from a societal perspective. Newly diagnosed and follow-up outpatient cases were investigated by questionnaire. Reported tertiary syphilis cases and medical-institution costs were both collected. The direct economic burden was measured by the bottom-up approach, the productivity cost by the human capital method, and the intangible burden by the contingent valuation method. Three hundred five valid early syphilis cases and 13 valid tertiary syphilis cases were collected in the investigation to estimate the personal average cost. The total economic burden of syphilis was US $729,096.85 in the Guangdong Initiative for Comprehensive Control of Syphilis sites in the year 2014, with medical-institution costs accounting for 73.23% of the total. Household average direct cost of early syphilis was US $23.74. Average hospitalization cost of tertiary syphilis was US $2,749.93. Of the cost to medical institutions, screening and testing comprised the largest proportion (26%), followed by intervention and case management (22%) and operational cost (21%). Household average productivity cost of early syphilis was US $61.19. Household intangible cost of syphilis was US $15,810.54. Syphilis caused a substantial economic burden on patients, their families, and society in Guangdong. Household productivity and intangible costs both showed positive relationships with local economic levels. Strengthening the prevention and effective treatment of early syphilis could greatly help to lower the economic burden of syphilis.
Jiang, Jian; James, Christopher A; Wong, Philip
2016-09-05
An LC-MS/MS method has been developed and validated for the determination of glycine in human cerebrospinal fluid (CSF). The validated method used artificial cerebrospinal fluid as a surrogate matrix for calibration standards. The calibration curve range for the assay was 100-10,000 ng/mL and (13)C2, (15)N-glycine was used as an internal standard (IS). Pre-validation experiments were performed to demonstrate parallelism with the surrogate matrix and standard addition methods. The mean endogenous glycine concentration in pooled human CSF determined on three days by using artificial CSF as a surrogate matrix and by the method of standard addition was found to be 748±30.6 and 768±18.1 ng/mL, respectively. A percentage difference of -2.6% indicated that artificial CSF could be used as a surrogate calibration matrix for the determination of glycine in human CSF. Quality control (QC) samples, except the lower limit of quantitation (LLOQ) QC and low QC samples, were prepared by spiking glycine into aliquots of a pooled human CSF sample. The low QC sample was prepared from a separate pooled human CSF sample containing low endogenous glycine concentrations, while the LLOQ QC sample was prepared in artificial CSF. Standard addition was used extensively to evaluate matrix effects during validation. The validated method was used to determine the endogenous glycine concentrations in human CSF samples. Incurred sample reanalysis demonstrated the reproducibility of the method. Copyright © 2016 Elsevier B.V. All rights reserved.
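The method of standard addition used here recovers the endogenous concentration as the magnitude of the x-intercept of a response-versus-spike regression. A sketch with hypothetical data (the spike levels and responses below are invented, chosen only so the recovered value lands near the abstract's ~748 ng/mL):

```python
import numpy as np

# Hypothetical standard-addition series: glycine spiked into pooled CSF (ng/mL)
# versus instrument response (arbitrary units); not the paper's data.
spiked   = np.array([0.0, 500.0, 1000.0, 2000.0, 4000.0])
response = np.array([1.50, 2.49, 3.52, 5.48, 9.51])

# Linear fit: response = slope*spiked + intercept. Extrapolating to zero
# response gives x = -intercept/slope, so the endogenous concentration is
# the magnitude of that x-intercept.
slope, intercept = np.polyfit(spiked, response, 1)
endogenous = intercept / slope
```

Comparing this estimate against a surrogate-matrix calibration curve is exactly the parallelism check the pre-validation experiments describe.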
Cost-effective FITL technologies for small business and residential customers
NASA Astrophysics Data System (ADS)
Andersen, Niels E.; Woolnough, Peter; Seidenberg, Juergen; Ferreira, Mario F. S.
1995-02-01
FIRST is a RACE project in which 5 major European telecom operators, 4 equipment manufacturers and one university have joined up to define, and test in a field trial in Portugal, a cost-effective optical access network. The main design target has been a system that gives cost-effective provision of wideband services for small and medium business customers. The system, however, also provides telephony, ISDN, and analog and digital video for residential customers. Technologies have been chosen with the objective of providing a simple, robust and flexible system whose initial deployment costs are low and closely related to service take-up. The paper describes the main technical features of the system and network applications, showing how the system may be introduced in network planning. The system is based on passive optical network technology in which video is distributed in the 1550 nm window and telecom services are transmitted at 1300 nm in full duplex mode. The telecoms system provides high capacity, flexibility in loop length and robustness towards outside-plant performance. The Subcarrier Multiple Access (SCMA) method is used for upstream transmission of bi-directional telecom services. SCMA has advantages over the Time Division Multiple Access technology used in other systems: the bandwidth/cost tradeoff is better, and the lower requirements on the outside plant increase the overall cost benefit. Optical beat noise due to overlapping laser spectra, which can be a problem for this technology, has been addressed successfully through the use of a suitable modulation and control technique. This technology is further validated in the field trial. The video system provides cost-effective long-distance transmission on standard fiber with externally modulated lasers and cascaded amplifiers. Coexistence of analog and digital video on one fiber with different modulation schemes (i.e., BPSK, QPSK and 64-QAM) has been validated.
Total life cycle cost evaluations based on availability data, maintenance requirements and expectations for service development have been made. The field trial will be running for two years.
A portable low-cost long-term live-cell imaging platform for biomedical research and education.
Walzik, Maria P; Vollmar, Verena; Lachnit, Theresa; Dietz, Helmut; Haug, Susanne; Bachmann, Holger; Fath, Moritz; Aschenbrenner, Daniel; Abolpour Mofrad, Sepideh; Friedrich, Oliver; Gilbert, Daniel F
2015-02-15
Time-resolved visualization and analysis of slow dynamic processes in living cells has revolutionized many aspects of in vitro cellular studies. However, existing technology applied to time-resolved live-cell microscopy is often immobile, costly and requires a high level of skill to use and maintain. These factors limit its utility to field research and educational purposes. The recent availability of rapid prototyping technology makes it possible to quickly and easily engineer purpose-built alternatives to conventional research infrastructure which are low-cost and user-friendly. In this paper we describe the prototype of a fully automated low-cost, portable live-cell imaging system for time-resolved label-free visualization of dynamic processes in living cells. The device is light-weight (3.6 kg), small (22 × 22 × 22 cm) and extremely low-cost (<€1250). We demonstrate its potential for biomedical use by long-term imaging of recombinant HEK293 cells at varying culture conditions and validate its ability to generate time-resolved data of high quality allowing for analysis of time-dependent processes in living cells. While this work focuses on long-term imaging of mammalian cells, the presented technology could also be adapted for use with other biological specimen and provides a general example of rapidly prototyped low-cost biosensor technology for application in life sciences and education. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.
Lee, Kyoungyeul; Lee, Minho; Kim, Dongsup
2017-12-28
The identification of target molecules is important for understanding the mechanism of "target deconvolution" in phenotypic screening and the "polypharmacology" of drugs. Because conventional methods of identifying targets are time-consuming and costly, in-silico target identification has been considered an alternative solution. One of the well-known in-silico methods of identifying targets involves structure-activity relationships (SARs). SARs have advantages such as low computational cost and high feasibility; however, the data dependency of the SAR approach causes an imbalance of active data and ambiguity of inactive data across targets. We developed a ligand-based virtual screening model comprising 1121 target SAR models built using a random forest algorithm. The performance of each target model was tested by employing the ROC curve and the mean score using an internal five-fold cross-validation. Moreover, recall rates for top-k targets were calculated to assess the performance of target ranking. A benchmark model using an optimized sampling method and parameters was examined via an external validation set. The result shows recall rates of 67.6% and 73.9% for top-11 (1% of the total targets) and top-33, respectively. We provide a website for users to search the top-k targets for query ligands, available publicly at http://rfqsar.kaist.ac.kr . The target models that we built can be used both for predicting the activity of ligands toward each target and for ranking candidate targets for a query ligand using a unified scoring scheme. The scores are additionally fitted to probabilities so that users can estimate how likely a ligand-target interaction is to be active. The user interface of our web site is user friendly and intuitive, offering useful information and cross references.
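The top-k recall metric used to evaluate target ranking can be sketched directly: for each query ligand, check whether its known target appears among the k highest-scoring predictions. The toy target names below are hypothetical:

```python
def recall_at_k(rankings, truths, k):
    """Fraction of query ligands whose true target appears in the top-k ranking.

    rankings: list of ranked target-ID lists, one per query ligand.
    truths:   list of the known target ID for each query ligand.
    """
    hits = sum(1 for ranked, true in zip(rankings, truths) if true in ranked[:k])
    return hits / len(truths)

# Toy example: three query ligands, each with a ranked target list and one truth.
rankings = [["t1", "t2", "t3"], ["t2", "t1", "t3"], ["t3", "t1", "t2"]]
truths = ["t1", "t1", "t2"]
```

With 1121 targets, the paper's top-11 cutoff corresponds to roughly the top 1% of the ranking, so a 67.6% recall there is a strong result relative to the 1% random baseline.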
Cost-of-illness studies of atrial fibrillation: methodological considerations.
Becker, Christian
2014-10-01
Atrial fibrillation (AF) is the most common cardiac arrhythmia and has considerable economic consequences. This study aims to identify current cost-of-illness estimates of AF, with a focus on describing the studies' methodology. A literature review was conducted. Twenty-eight cost-of-illness studies were identified. Cost-of-illness estimates exist for health insurance members and for hospital and primary care populations. In addition, the cost of stroke in AF patients and the costs of postoperative AF have been calculated. The methods used were heterogeneous; most studies calculated excess costs. The identified annual excess costs varied, even among studies from the USA (∼US$1900 to ∼US$19,000). While pointing toward considerable costs, the relevance of cost-of-illness studies could be improved by focusing on subpopulations and treatment mixes. As possible starting points for subsequent economic studies, the methodology of cost-of-illness studies should be taken into account, allowing stakeholders to find suitable studies and to validate estimates.
NASA Astrophysics Data System (ADS)
Adams, Marc; Fromm, Reinhard; Bühler, Yves; Bösch, Ruedi; Ginzler, Christian
2016-04-01
Detailed information on the spatio-temporal distribution of seasonal snow in alpine terrain plays a major role in the hydrological cycle, natural hazard management, flora and fauna, as well as tourism. Current methods are mostly valid only on a regional scale or require a trade-off between data availability, cost and resolution. During a one-year pilot study, we investigated the potential of remotely piloted aerial systems (RPAS) and structure-from-motion photogrammetry for snow depth mapping. We employed multi-copter and fixed-wing RPAS, equipped with different low-cost, off-the-shelf sensors, at four test sites in Austria and Switzerland. Over 30 flights were performed during the winter of 2014/15, in which different camera settings, filters and lenses, as well as data collection routines, were tested. Orthophotos and digital surface models (DSMs) were calculated from the imagery using structure-from-motion photogrammetry software. Snow depth was derived by subtracting snow-free from snow-covered DSMs. The RPAS results were validated against data collected using a variety of well-established remote sensing (i.e., terrestrial laser scanning, large-frame aerial sensors) and in-situ measurement techniques. The results show that RPAS i) are able to map snow depth within accuracies of 0.07-0.15 m root mean square error (RMSE) when compared to traditional in-situ data; ii) can be operated at lower cost, with easier repeatability, fewer operational constraints and higher GSD than large-frame aerial sensors on board manned aircraft, while achieving significantly higher accuracies; and iii) are able to acquire meaningful data even under harsh environmental conditions above 2000 m a.s.l. (turbulence, low temperature and high irradiance, low air density).
While providing a first proof of concept, the study also revealed future challenges and limitations of RPAS-based snow depth mapping, including a strong dependency on correct co-registration of snow-free and snow-covered height measurements, as well as a significant impact of the underlying vegetation and of the illumination of the snow surface on the fidelity of the results.
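The core computation described above, differencing co-registered snow-free and snow-covered DSMs and validating the result against in-situ probes via RMSE, can be sketched as follows. The elevation grids and probe values are illustrative, not data from the study:

```python
import numpy as np

def snow_depth(dsm_snow_on, dsm_snow_free):
    """Per-cell snow depth (m) as the difference of co-registered DSMs;
    small negative values from noise or co-registration error are clipped."""
    return np.clip(np.asarray(dsm_snow_on) - np.asarray(dsm_snow_free), 0.0, None)

def rmse(predicted, observed):
    """Root mean square error against in-situ reference measurements."""
    d = np.asarray(predicted, float) - np.asarray(observed, float)
    return float(np.sqrt(np.mean(d ** 2)))

# Toy 3x3 elevation grids in metres (illustrative values only)
snow_on = np.array([[102.4, 101.9, 101.2],
                    [101.7, 101.5, 100.9],
                    [101.1, 100.8, 100.3]])
snow_free = np.array([[102.0, 101.5, 101.0],
                      [101.3, 101.2, 100.7],
                      [100.9, 100.6, 100.2]])
depth = snow_depth(snow_on, snow_free)

# Hypothetical probe measurements at the 9 cell centres (m)
probe = [0.41, 0.38, 0.19, 0.42, 0.31, 0.18, 0.21, 0.19, 0.12]
print(rmse(depth.ravel(), probe))
```

In practice the two DSMs come from separate photogrammetric reconstructions, which is why the co-registration step stressed above dominates the error budget.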
Sherrod, Brandon A.; Dew, Dustin A.; Rogers, Rebecca; Rimmer, James H.; Eberhardt, Alan W.
2017-01-01
Accessible high-capacity weighing scales are scarce in healthcare facilities, in part due to high device cost and weight. This shortage impairs weight monitoring and health maintenance for people with disabilities and/or morbid obesity. We conducted this study to design and validate a lighter, lower cost, high-capacity accessible weighing device. A prototype featuring 360 kg (800 lbs) weight capacity, a wheelchair-accessible ramp, and wireless data transmission was fabricated. Forty-five participants (20 standing, 20 manual wheelchair users, and 5 power wheelchair users) were weighed using the prototype and a calibrated scale. Participants were surveyed to assess perception of each weighing device and the weighing procedure. Weight measurements between devices demonstrated a strong linear correlation (R2=0.997) with absolute differences of 1.4±2.0% (mean±SD). Participant preference ratings showed no difference between devices. The prototype weighed 11 kg (38%) less than the next lightest high-capacity commercial device found by author survey. The prototype’s estimated commercial price range, $500–600, is approximately half the price of the least expensive commercial device found by author survey. Such low cost weighing devices may improve access to weighing instrumentation, which may in turn help eliminate current health disparities. Future work is needed to determine the feasibility of market transition. PMID:27450105
Bhat, Somanath; Polanowski, Andrea M; Double, Mike C; Jarman, Simon N; Emslie, Kerry R
2012-01-01
Recent advances in nanofluidic technologies have enabled the use of integrated fluidic circuits (IFCs) for high-throughput single nucleotide polymorphism (SNP) genotyping (GT). In this study, we implemented and validated a relatively low-cost nanofluidic system for SNP-GT with and without specific target amplification (STA). As proof of principle, we first evaluated the effect of input DNA copy number on genotype call rate using well-characterised, digital PCR (dPCR)-quantified human genomic DNA samples and then applied the validated method to genotype 45 SNPs in the nuclear genome of the humpback whale, Megaptera novaeangliae. When STA was not incorporated, for a homozygous human DNA sample, reaction chambers containing on average 9 to 97 copies showed 100% call rate and accuracy. Below 9 copies, the call rate decreased, and at one copy it was 40%. For a heterozygous human DNA sample, the call rate decreased from 100% to 21% when the predicted copies per reaction chamber decreased from 38 copies to one copy. The tightness of genotype clusters on a scatter plot also decreased. In contrast, when the same samples were subjected to STA prior to genotyping, a call rate and a call accuracy of 100% were achieved. Our results demonstrate that low input DNA copy number affects the quality of the data generated, in particular for a heterozygous sample. As with human genomic DNA, a call rate and a call accuracy of 100% were achieved with whale genomic DNA samples following multiplex STA using either 15 or 45 SNP-GT assays. These calls were 100% concordant with the true genotypes determined by an independent method, suggesting that the nanofluidic system is a reliable genotyping platform, delivering high call rates, accuracy and concordance for genomic sequences derived from biological tissue.
López-Blanco, José Ramón; Chacón, Pablo
2013-11-01
Here, we employed the collective motions extracted from Normal Mode Analysis (NMA) in internal coordinates (torsional space) for the flexible fitting of atomic-resolution structures into electron microscopy (EM) density maps. The proposed methodology was validated on a benchmark of simulated cases, highlighting its robustness over the full range of EM resolutions and even over coarse-grained representations. A systematic comparison with other methods further showcased the advantages of the proposed methodology, especially at medium to lower resolutions. Using this method, computational costs and potential overfitting problems are naturally reduced by constraining the search to the low-frequency NMA space, where covalent geometry is implicitly maintained. The method also effectively captures the macromolecular changes of a representative set of experimental test cases. We believe that this novel approach will extend the currently available EM hybrid methods to the atomic-level interpretation of large conformational changes and their functional implications.
Analyzing the Validity of the Adult-Adolescent Parenting Inventory for Low-Income Populations
ERIC Educational Resources Information Center
Lawson, Michael A.; Alameda-Lawson, Tania; Byrnes, Edward
2017-01-01
Objectives: The purpose of this study was to examine the construct and predictive validity of the Adult-Adolescent Parenting Inventory (AAPI-2). Methods: The validity of the AAPI-2 was evaluated using multiple statistical methods, including exploratory factor analysis, confirmatory factor analysis, and latent class analysis. These analyses were…
Intrathecal Drug Delivery Systems for Noncancer Pain: A Health Technology Assessment.
2016-01-01
Intrathecal drug delivery systems can be used to manage refractory or persistent chronic nonmalignant (noncancer) pain. We investigated the benefits, harms, cost-effectiveness, and budget impact of these systems compared with current standards of care for adult patients with chronic pain owing to nonmalignant conditions. We searched Ovid MEDLINE, Ovid Embase, the Cochrane Library, and the National Health Service's Economic Evaluation Database and Tufts Cost-Effectiveness Analysis Registry from January 1994 to April 2014 for evidence of effectiveness, harms, and cost-effectiveness. We used existing systematic reviews that had employed reliable search and screen methods and also searched for studies published after the search date reported in the latest systematic review to identify studies. Two reviewers screened records and assessed study validity. We found comparative evidence of effectiveness and harms in one cohort study at high risk of bias (≥ 3-year follow-up, N = 130). Four economic evaluations of low to very low quality were also included. Compared with oral opioid analgesia alone or a program of analgesia plus rehabilitation, intrathecal drug delivery systems significantly reduced pain (27% additional improvement) and morphine consumption. Despite these reductions, intrathecal drug delivery systems were not superior in patient-reported well-being or quality of life. There is no evidence of superiority of intrathecal drug delivery systems over oral opioids in global pain improvement and global treatment satisfaction. Comparative evidence of harms was not found. Cost-effectiveness evidence is of insufficient quality to assess the appropriateness of funding intrathecal drug delivery systems. Evidence comparing intrathecal drug delivery systems with standard care was of very low quality. Current evidence does not establish (or rule out) superiority or cost-effectiveness of intrathecal drug delivery systems for managing chronic refractory nonmalignant pain. 
The budget impact of funding intrathecal drug delivery systems would be between $1.5 and $5.0 million per year.
Personal exposure monitoring of PM2.5 in indoor and outdoor microenvironments.
Steinle, Susanne; Reis, Stefan; Sabel, Clive E; Semple, Sean; Twigg, Marsailidh M; Braban, Christine F; Leeson, Sarah R; Heal, Mathew R; Harrison, David; Lin, Chun; Wu, Hao
2015-03-01
Adverse health effects from exposure to air pollution are a global challenge and of widespread concern. Recent high ambient concentration episodes of air pollutants in European cities have highlighted the dynamic nature of human exposure and the gaps in data and knowledge about exposure patterns. In order to support health impact assessment, it is essential to develop a better understanding of individual exposure pathways in people's everyday lives by taking account of all environments in which people spend time. Here we describe the development, validation and results of an exposure method applied in a study conducted in Scotland. A low-cost particle counter based on light-scattering technology, the Dylos 1700, was used. Its performance was validated against equivalent instruments (TEOM-FDMS) at two national monitoring network sites (R2 = 0.9 at a rural background site, R2 = 0.7 at an urban background site). This validation also provided two functions for converting measured particle number concentrations (PNCs) into calculated particle mass concentrations, for direct comparison with equivalent monitoring instruments and air quality limit values. The study also used contextual and time-based activity data to define six microenvironments (MEs) in which to assess the everyday exposure of individuals to short-term PM2.5 concentrations. The Dylos was combined with a GPS receiver to track the movement and exposure of individuals across the MEs. Seventeen volunteers collected 35 profiles. Profiles may differ in overall duration and in structure with respect to the time spent in different MEs and the activities undertaken. The results indicate that, owing to the substantial variability within and between MEs, it is essential to measure near-complete exposure pathways to allow a comprehensive assessment of the exposure risk a person encounters on a daily basis.
Taking into account the information gained through personal exposure measurements, this work demonstrates the added value of data generated by the application of low-cost monitors.
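The conversion step described above, fitting a function that maps low-cost counter readings to reference mass concentrations during collocation, can be sketched as a least-squares linear calibration. The collocation data below are hypothetical, not the study's measurements:

```python
import numpy as np

def fit_pnc_to_mass(pnc, ref_mass):
    """Fit a linear conversion mass ~= a * PNC + b against a collocated
    reference instrument; returns (a, b) and the R^2 of the fit."""
    pnc = np.asarray(pnc, float)
    ref_mass = np.asarray(ref_mass, float)
    a, b = np.polyfit(pnc, ref_mass, 1)
    pred = a * pnc + b
    ss_res = float(np.sum((ref_mass - pred) ** 2))
    ss_tot = float(np.sum((ref_mass - ref_mass.mean()) ** 2))
    return (a, b), 1.0 - ss_res / ss_tot

# Hypothetical collocation data: particle counts vs reference PM2.5 (ug/m3)
pnc = [120, 300, 450, 800, 1200, 1500]
ref_pm25 = [4.0, 9.5, 14.0, 25.5, 37.0, 47.5]
(slope, intercept), r2 = fit_pnc_to_mass(pnc, ref_pm25)
estimate = slope * 1000 + intercept  # mass estimate for a new count reading
```

A fit of this kind is site-dependent, which is consistent with the study deriving separate functions at the rural and urban background sites.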
NASA Astrophysics Data System (ADS)
Sadighi, Kira; Coffey, Evan; Polidori, Andrea; Feenstra, Brandon; Lv, Qin; Henze, Daven K.; Hannigan, Michael
2018-03-01
Sensor networks are being more widely used to characterize and understand compounds in the atmosphere like ozone (O3). This study employs a measurement tool, called the U-Pod, constructed at the University of Colorado Boulder, to investigate spatial and temporal variability of O3 in a 200 km2 area of Riverside County near Los Angeles, California. This tool contains low-cost sensors to collect ambient data at non-permanent locations. The U-Pods were calibrated using a pre-deployment field calibration technique; all the U-Pods were collocated with regulatory monitors. After collocation, the U-Pods were deployed in the area mentioned. A subset of pods was deployed at two local regulatory air quality monitoring stations providing validation for the collocation calibration method. Field validation of sensor O3 measurements to minute-resolution reference observations resulted in R2 and root mean squared errors (RMSEs) of 0.95-0.97 and 4.4-5.9 ppbv, respectively. Using the deployment data, ozone concentrations were observed to vary on this small spatial scale. In the analysis based on hourly binned data, the median R2 values between all possible U-Pod pairs varied from 0.52 to 0.86 for ozone during the deployment. The medians of absolute differences were calculated between all possible pod pairs, 21 pairs total. The median values of those median absolute differences for each hour of the day varied between 2.2 and 9.3 ppbv for the ozone deployment. Since median differences between U-Pod concentrations during deployment are larger than the respective root mean square error values, we can conclude that there is spatial variability in this criteria pollutant across the study area. This is important because it means that citizens may be exposed to more, or less, ozone than they would assume based on current regulatory monitoring.
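The pairwise deployment analysis described above, R² and median absolute differences between every possible pod pair, can be sketched as follows. The pod names and hourly ozone values are hypothetical:

```python
from itertools import combinations
import numpy as np

def pairwise_pod_stats(pod_series):
    """R^2 and median absolute difference (ppbv) for every pod pair,
    mirroring the pairwise spatial-variability analysis described above."""
    stats = {}
    for a, b in combinations(sorted(pod_series), 2):
        x = np.asarray(pod_series[a], float)
        y = np.asarray(pod_series[b], float)
        r2 = float(np.corrcoef(x, y)[0, 1] ** 2)
        stats[(a, b)] = (r2, float(np.median(np.abs(x - y))))
    return stats

# Hypothetical hourly-mean ozone (ppbv) from three pods
pods = {
    "pod_A": [30, 42, 55, 61, 48],
    "pod_B": [28, 40, 57, 64, 50],
    "pod_C": [35, 47, 52, 58, 45],
}
stats = pairwise_pod_stats(pods)
```

With 7 pods this yields the 21 pairs mentioned in the abstract; comparing the median differences against the collocation RMSE is what supports the spatial-variability conclusion.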
Jacob-Tacken, Karin H M; Koopmanschap, Marc A; Meerding, Willem Jan; Severens, Johan L
2005-05-01
In the economic evaluation of health care programmes, productivity costs are often estimated using patients' wages for the period of absence. However, the use of such methods for short periods of absence is controversial. A previous study found that short-term absence is often compensated for during normal working hours and therefore does not lead to productivity losses. As such, the application of any such approach almost certainly overestimates productivity costs. In this study, we examined the productivity costs for five different patient populations and one employee population, using the classical method and by identifying when extra effort was needed. In general, the results showed that productivity costs based on identifying extra effort were 25-30% of the classical estimates. For absences of just one day, productivity costs were relevant in only 17-19% of cases. For absences of two weeks or longer, productivity costs were relevant in 35-39% of cases. Measurement of the compensating mechanisms seemed to be valid, since there is considerable agreement between the opinions of supervisors and their employees about whether compensation covers productivity costs. There was much less agreement between supervisors and their employees on specific compensating mechanisms, however. The measurement of compensating mechanisms also seemed to be valid because, as expected, different compensating mechanisms were reported for different occupations. In our study populations, compensating mechanisms appeared to differ with occupational characteristics, such as part-time work, managerial work and shift work.
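The contrast drawn above, between classical wage-based estimates and estimates that count costs only when the absence required extra compensating effort, can be sketched as follows. The absence records and daily wage are hypothetical:

```python
def productivity_costs(absences, daily_wage):
    """Classical (wage-for-days-absent) estimate vs. an estimate counting
    costs only when compensation required extra effort.

    absences: list of (days_absent, compensated_without_extra_effort) tuples.
    """
    classical = sum(days * daily_wage for days, _ in absences)
    extra_effort_only = sum(days * daily_wage
                            for days, compensated in absences if not compensated)
    return classical, extra_effort_only

# Hypothetical absence records: short absences are often absorbed during
# normal working hours (compensated=True) and then carry no productivity cost
records = [(1, True), (2, True), (5, False), (10, False), (1, True)]
classical, adjusted = productivity_costs(records, daily_wage=200.0)
```

The ratio adjusted/classical plays the role of the 25-30% figure reported in the study; here it depends entirely on the made-up records.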
Iwamoto, Masami; Nakahira, Yuko; Kimpara, Hideyuki
2015-01-01
Active safety devices such as automatic emergency braking (AEB) and precrash seat belts have the potential to further reduce the number of fatalities due to automotive accidents. However, their effectiveness should be investigated through more accurate estimation of their interaction with human bodies. Computational human body models are suitable for such investigation, especially considering the effects of muscular tone on occupant motions and injury outcomes. However, conventional modeling approaches such as multibody models and detailed finite element (FE) models have respective advantages and disadvantages in computational cost and in injury prediction considering muscular tone effects. The objective of this study is to develop and validate a human body FE model with whole-body muscles, which can be used for detailed investigation of the interaction between human bodies and vehicular structures, including some safety devices, precrash and during a crash, at relatively low computational cost. In this study, we developed a human body FE model called THUMS (Total HUman Model for Safety) with the body size of a 50th-percentile adult male (AM50) in a sitting posture. The model has anatomical structures of bones, ligaments, muscles, brain, and internal organs. The total number of elements is 281,260, which keeps computational costs relatively low. Deformable material models were assigned to all body parts. The muscle-tendon complexes were modeled by truss elements with Hill-type muscle material and seat belt elements with tension-only material. The THUMS was validated against 35 series of cadaver or volunteer test data on frontal, lateral, and rear impacts. Model validations against 15 series of cadaver test data associated with frontal impacts are presented in this article. The THUMS with a vehicle sled model was applied to investigate the effects of muscle activation on occupant kinematics and injury outcomes in specific frontal impact situations with AEB.
In the validations using 5 series of cadaver test data, force-time curves predicted by the THUMS were quantitatively evaluated using CORA (CORrelation and Analysis) ratings, which showed good or acceptable agreement with cadaver test data in most cases. The investigation of muscular effects showed that muscle activation levels and timing had significant effects on occupant kinematics and injury outcomes. Although further studies on accident injury reconstruction are needed, the THUMS has the potential to predict occupant kinematics and injury outcomes considering muscular tone effects with relatively low computational costs.
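A Hill-type muscle element of the kind assigned to the truss elements above scales a maximum isometric force by activation, force-length and force-velocity factors. This is a minimal sketch using common textbook shape functions, chosen for illustration only, not the THUMS material implementation:

```python
import math

def hill_active_force(activation, f_max, l_norm, v_norm, k=0.25):
    """Active force F = a * F_max * f_L(l) * f_V(v) of a Hill-type element.

    l_norm: fibre length / optimal length; v_norm: shortening velocity / v_max.
    The Gaussian force-length and concentric force-velocity curves below are
    generic illustrative forms."""
    f_l = math.exp(-((l_norm - 1.0) ** 2) / 0.45)        # force-length curve
    f_v = max(0.0, (1.0 - v_norm) / (1.0 + v_norm / k))  # concentric force-velocity
    return activation * f_max * f_l * f_v

# At optimal length, isometric (v = 0), fully activated: force equals F_max
print(hill_active_force(1.0, 1000.0, 1.0, 0.0))  # 1000.0
```

Varying the activation input over time is how such a model represents the muscular-tone effects investigated in the AEB simulations.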
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Zhenyu; Du, Pengwei; Kosterev, Dmitry
2013-05-01
Disturbance data recorded by phasor measurement units (PMUs) offer opportunities to improve the integrity of dynamic models. However, manually tuning parameters through play-back of events demands significant effort and engineering experience. In this paper, a calibration method using the extended Kalman filter (EKF) technique is proposed. The formulation of the EKF with parameter calibration is discussed. Case studies are presented to demonstrate its validity. The proposed calibration method is cost-effective and complementary to traditional equipment testing for improving dynamic model quality.
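State augmentation is a standard way to set up EKF-based parameter calibration: the unknown model parameter is appended to the state vector and estimated jointly from the recorded measurements. This is a minimal sketch on toy scalar dynamics, not the power-system model of the paper:

```python
import numpy as np

def ekf_calibrate(z, x0, theta0, q=1e-6, r=1e-4):
    """Estimate the parameter theta of the toy dynamics x[k+1] = theta * x[k]
    by running an EKF on the augmented state s = [x, theta]."""
    s = np.array([x0, theta0], float)
    P = np.eye(2)
    Q = np.diag([q, q])                      # small process noise; theta ~ constant
    H = np.array([[1.0, 0.0]])               # only x is measured
    for zk in z:
        x, th = s
        s = np.array([th * x, th])           # predict augmented state
        F = np.array([[th, x], [0.0, 1.0]])  # Jacobian of the augmented dynamics
        P = F @ P @ F.T + Q
        y = zk - s[0]                        # innovation
        S = (H @ P @ H.T)[0, 0] + r
        K = (P @ H.T).ravel() / S            # Kalman gain
        s = s + K * y
        P = (np.eye(2) - np.outer(K, H.ravel())) @ P
    return float(s[1])

# Synthetic "recorded disturbance": true theta = 0.9, initial state 10
true_theta = 0.9
xs = [10.0]
for _ in range(60):
    xs.append(true_theta * xs[-1])
theta_hat = ekf_calibrate(xs[1:], x0=10.0, theta0=0.5)
```

The filter pulls the parameter estimate from the deliberately wrong initial guess of 0.5 toward the true value as measurements are processed.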
Castiel, D; Herve, C
1992-01-01
In general, a large number of patients is needed to conclude whether the results of a therapeutic strategy are significant or not. This number can be lowered with a logit model. The method was proposed in a recently published article (Cost-utility analysis of early thrombolytic therapy, PharmacoEconomics, 1992). The present article is an attempt to validate the method from both the econometric and the ethical points of view.
A New Method for Single-Epoch Ambiguity Resolution with Indoor Pseudolite Positioning.
Li, Xin; Zhang, Peng; Guo, Jiming; Wang, Jinling; Qiu, Weining
2017-04-21
Ambiguity resolution (AR) is crucial for high-precision indoor pseudolite positioning. Because of the characteristics of the pseudolite positioning system (the geometry of the stationary pseudolites is invariant, the indoor signal is easily interrupted, and the first-order linear truncation error cannot be ignored), a new AR method based on the idea of the ambiguity function method (AFM) is proposed in this paper. The proposed method is a single-epoch, nonlinear method that is especially well suited to indoor pseudolite positioning. Considering the very low computational efficiency of conventional AFM, we adopt an improved particle swarm optimization (IPSO) algorithm to search for the best solution in the coordinate domain, and a least-squares adjustment with variance testing is conducted to ensure the reliability of the resolved ambiguities. Several experiments, including static and kinematic tests, were conducted to verify the validity of the proposed AR method. Numerical results show that the IPSO significantly improves the computational efficiency of AFM and has a more refined search capability than the conventional grid-searching method. For the indoor pseudolite system, which had an initial approximate coordinate precision better than 0.2 m, the AFM exhibited good performance in both static and kinematic tests. With the corrected ambiguities obtained by the proposed method, indoor pseudolite positioning can achieve centimeter-level precision using a low-cost single-frequency software receiver.
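The swarm-search component can be sketched generically: a particle swarm explores the coordinate domain for the point that optimizes an objective, in place of an exhaustive grid search. Below is a textbook PSO minimizing a toy cost surface, not the paper's IPSO variant or its ambiguity function:

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_minimize(f, bounds, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer over a box-bounded search domain."""
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    x = rng.uniform(lo, hi, size=(n_particles, lo.size))  # particle positions
    v = np.zeros_like(x)                                  # particle velocities
    pbest = x.copy()                                      # personal bests
    pbest_val = np.array([f(p) for p in x])
    g = pbest[pbest_val.argmin()].copy()                  # global best
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, f(g)

# Toy cost surface with a known minimum at (1.0, -2.0)
cost = lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2
best, val = pso_minimize(cost, bounds=([-5, -5], [5, 5]))
```

In the AFM setting, the objective would instead be the (negated) ambiguity function evaluated at candidate coordinates, and the search box would be centred on the approximate position.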
Control Theory based Shape Design for the Incompressible Navier-Stokes Equations
NASA Astrophysics Data System (ADS)
Cowles, G.; Martinelli, L.
2003-12-01
A design method for shape optimization in incompressible turbulent viscous flow has been developed and validated for inverse design. The gradient information is determined using a control theory based algorithm. With such an approach, the cost of computing the gradient is negligible. An additional adjoint system must be solved which requires the cost of a single steady state flow solution. Thus, this method has an enormous advantage over traditional finite-difference based algorithms. The method of artificial compressibility is utilized to solve both the flow and adjoint systems. An algebraic turbulence model is used to compute the eddy viscosity. The method is validated using several inverse wing design test cases. In each case, the program must modify the shape of the initial wing such that its pressure distribution matches that of the target wing. Results are shown for the inversion of both finite thickness wings as well as zero thickness wings which can be considered a model of yacht sails.
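The "one adjoint solve" property claimed above can be illustrated on a discrete linear toy system: the gradient of a terminal-cost objective with respect to all controls comes from a single backward sweep, independent of the number of design variables. This is a generic illustration, not the artificial-compressibility flow solver of the paper:

```python
import numpy as np

def adjoint_gradient(A, B, u, x0, target):
    """Gradient of J = 0.5 * ||x_N - target||^2 w.r.t. every control u[k]
    for x[k+1] = A x[k] + B u[k], via one forward and one backward sweep."""
    N = len(u)
    xs = [np.asarray(x0, float)]
    for k in range(N):                 # forward sweep
        xs.append(A @ xs[-1] + B * u[k])
    lam = xs[-1] - target              # terminal adjoint variable
    grad = np.empty(N)
    for k in reversed(range(N)):       # backward (adjoint) sweep
        grad[k] = B @ lam
        lam = A.T @ lam
    return grad

# Toy 2-state system with 3 controls (illustrative numbers only)
A = np.array([[0.9, 0.1], [0.0, 0.8]])
B = np.array([0.0, 1.0])
u = np.array([0.5, -0.2, 0.1])
x0 = np.array([1.0, 0.0])
target = np.zeros(2)
g = adjoint_gradient(A, B, u, x0, target)
```

A finite-difference gradient would instead need one extra forward solve per control, which is exactly the cost advantage the abstract attributes to the control-theory approach.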
Developing Biosensors in Developing Countries: South Africa as a Case Study
Fogel, Ronen; Limson, Janice
2016-01-01
A mini-review of the reported biosensor research occurring in South Africa evidences a strong emphasis on electrochemical sensor research, guided by the opportunities this transduction platform holds for low-cost and robust sensing of numerous targets. Many of the reported publications centre on fundamental research into the signal transduction method, using model biorecognition elements, in line with international trends. Other research in this field is spread across several areas including: the application of nanotechnology; the identification and validation of biomarkers; development and testing of biorecognition agents (antibodies and aptamers); and design of electro-catalysts, most notably metallophthalocyanines. Biosensor targets commonly featured were pesticides and metals. Areas of regional import to sub-Saharan Africa, such as HIV/AIDS and tuberculosis diagnosis, are also apparent in a review of the available literature. Irrespective of the targets, the challenge to the effective deployment of such sensors remains shaped by social and economic realities, such that the requirements are for low-cost and universally easy-to-operate devices for field settings. While it is difficult to disentangle the intertwined roles of national policy, grant funding availability and, certainly, global trends in shaping areas of emphasis in research, most notable is the strong role that nanotechnology, and to a certain extent biotechnology, plays in research regarding biosensor construction. Stronger emphasis on collaboration between scientists in theoretical modelling, nanomaterials application and/or relevant stakeholders in the specific field (e.g., food or health monitoring) and researchers in biosensor design may help evolve focused research efforts towards the development and deployment of low-cost biosensors. PMID:26848700
Xia, Dunzhu; Yao, Yanhong; Cheng, Limei
2017-06-15
In this paper, we aimed to achieve the indoor tracking control of a two-wheeled inverted pendulum (TWIP) vehicle. The attitude data are acquired from a low cost micro inertial measurement unit (IMU), and the ultra-wideband (UWB) technology is utilized to obtain an accurate estimation of the TWIP's position. We propose a dual-loop control method to realize the simultaneous balance and trajectory tracking control for the TWIP vehicle. A robust adaptive second-order sliding mode control (2-RASMC) method based on an improved super-twisting (STW) algorithm is investigated to obtain the control laws, followed by several simulations to verify its robustness. The outer loop controller is designed using the idea of backstepping. Moreover, three typical trajectories, including a circle, a trifolium and a hexagon, have been designed to prove the adaptability of the control combinations. Six different combinations of inner and outer loop control algorithms have been compared, and the characteristics of inner and outer loop algorithm combinations have been analyzed. Simulation results demonstrate its tracking performance and thus verify the validity of the proposed control methods. Trajectory tracking experiments in a real indoor environment have been performed using our experimental vehicle to further validate the feasibility of the proposed algorithm in practice.
NASA Astrophysics Data System (ADS)
Nurdiani, N.
2018-03-01
Low-cost flats in Jakarta, Indonesia, are provided by the government for low-income people in urban areas, in line with the program to redevelop or renew slum areas. Low-cost flats are built with minimum-standard building materials. The purpose of this study is to understand the efforts of occupants to change the building materials of their residential units in low-cost flats. The research was conducted by descriptive method at four low-cost housing estates in Jakarta: Rusuna Bendungan Hilir 1, Rusuna Tambora IIIA, Rusuna Bidara Cina, and Rusuna Sukapura. The results showed that the dominant physical changes made by residents in the four rusuna are aesthetic (changes of residence paint color) or improvements to the physical quality of the residential unit (changes of building material).
NASA Astrophysics Data System (ADS)
Fahey, D. W.; Gao, R.; Thornberry, T. D.; Rollins, D. W.; Schwarz, J. P.; Perring, A. E.
2013-12-01
In-situ sampling with particle size spectrometers is an important method to provide detailed size spectra for atmospheric aerosol in the troposphere and stratosphere. The spectra are essential for understanding aerosol sources and aerosol chemical evolution and removal, and for aerosol remote sensing validation. These spectrometers are usually bulky, heavy, and expensive, thereby limiting their application to specific airborne platforms. Here we report a new type of small and light-weight optical aerosol particle size spectrometer that is sensitive enough for many aerosol applications yet is inexpensive enough to be disposable. 3D printing is used for producing structural components for simplicity and low cost. Weighing less than 1 kg individually, we expect these spectrometers can be deployed successfully on small unmanned aircraft systems (UASs) and up to 25 km on weather balloons. Immediate applications include the study of Arctic haze using the Manta UAS, detection of the Asian Tropopause Aerosol Layer in the Asian monsoon system and SAGE III validation onboard weather balloons.
Conesa, Claudia; García-Breijo, Eduardo; Loeff, Edwin; Seguí, Lucía; Fito, Pedro; Laguarda-Miró, Nicolás
2015-01-01
Electrochemical Impedance Spectroscopy (EIS) has been used to develop a methodology able to identify and quantify fermentable sugars present in the enzymatic hydrolysis phase of second-generation bioethanol production from pineapple waste. Thus, a low-cost, non-destructive system was developed, consisting of a stainless-steel double-needle electrode connected to electronic equipment that implements EIS. In order to validate the system, different concentrations of glucose, fructose and sucrose were added to the pineapple waste and analyzed both individually and in combination. Next, statistical data treatment enabled the design of specific Artificial Neural Network-based mathematical models for each of the studied sugars and their respective combinations. The obtained prediction models are robust and reliable, and they are considered statistically valid (CCR% > 93.443%). These results allow us to introduce this EIS-based technique as an easy, fast, non-destructive, in-situ alternative to traditional laboratory methods for enzymatic hydrolysis monitoring. PMID:26378537
Llorca, David F; Sotelo, Miguel A; Parra, Ignacio; Ocaña, Manuel; Bergasa, Luis M
2010-01-01
This paper presents an analytical study of the depth estimation error of a stereo vision-based pedestrian detection sensor for automotive applications such as pedestrian collision avoidance and/or mitigation. The sensor comprises two synchronized and calibrated low-cost cameras. Pedestrians are detected by combining a 3D clustering method with Support Vector Machine-based (SVM) classification. The influence of the sensor parameters in the stereo quantization errors is analyzed in detail providing a point of reference for choosing the sensor setup according to the application requirements. The sensor is then validated in real experiments. Collision avoidance maneuvers by steering are carried out by manual driving. A real time kinematic differential global positioning system (RTK-DGPS) is used to provide ground truth data corresponding to both the pedestrian and the host vehicle locations. The performed field test provided encouraging results and proved the validity of the proposed sensor for being used in the automotive sector towards applications such as autonomous pedestrian collision avoidance.
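The stereo depth quantization error analyzed in this paper grows quadratically with range, which follows from the pinhole relation z = f*B/d. A first-order sketch (the focal length and baseline values used below are illustrative, not the sensor's calibration):

```python
def depth_quantization_error(z, focal_px, baseline_m, disparity_step=1.0):
    """First-order depth uncertainty caused by disparity quantization.

    A stereo rig gives z = f*B/d (f in pixels, B in meters, d in pixels),
    so |dz/dd| = z**2 / (f*B): the error for a one-pixel disparity step
    grows with the square of the range, which is why depth accuracy
    degrades quickly for distant pedestrians.
    """
    return (z ** 2) * disparity_step / (focal_px * baseline_m)
```

For example, with f = 800 px and B = 0.3 m, a one-pixel disparity step corresponds to about 0.42 m of depth uncertainty at 10 m but 1.67 m at 20 m.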
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, John E.; English, Christine M.; Gesick, Joshua C.
This report documents the validation process as applied to projects awarded through Funding Opportunity Announcements (FOAs) within the U.S. Department of Energy Bioenergy Technologies Office (DOE-BETO). It describes the procedures used to protect and verify project data, as well as the systematic framework used to evaluate and track performance metrics throughout the life of the project. This report also describes the procedures used to validate the proposed process design, cost data, analysis methodologies, and supporting documentation provided by the recipients.
NASA Technical Reports Server (NTRS)
Davis, Don; Bennett, Toby; Short, Nicholas M., Jr.
1994-01-01
The Earth Observing System (EOS), part of a cohesive national effort to study global change, will deploy a constellation of remote sensing spacecraft over a 15 year period. Science data from the EOS spacecraft will be processed and made available to a large community of earth scientists via NASA institutional facilities. A number of these spacecraft are also providing an additional interface to broadcast data directly to users. Direct broadcast of real-time science data from overhead spacecraft has valuable applications including validation of field measurements, planning science campaigns, and science and engineering education. The success and usefulness of EOS direct broadcast depends largely on the end-user cost of receiving the data. To extend this capability to the largest possible user base, the cost of receiving ground stations must be as low as possible. To achieve this goal, NASA Goddard Space Flight Center is developing a prototype low-cost transportable ground station for EOS direct broadcast data based on Very Large Scale Integration (VLSI) components and pipelined, multiprocessing architectures. The targeted reproduction cost of this system is less than $200K. This paper describes a prototype ground station and its constituent components.
Crespin, Oscar M; Okrainec, Allan; Kwong, Andrea V; Habaz, Ilay; Jimenez, Maria Carolina; Szasz, Peter; Weiss, Ethan; Gonzalez, Cecilia G; Mosko, Jeffrey D; Liu, Louis W C; Swanstrom, Lee L; Perretta, Silvana; Shlomovitz, Eran
2018-06-01
The Fundamentals of Laparoscopic Surgery (FLS) training box is a validated tool, already accessible to surgical trainees to hone their laparoscopic skills. We aim to investigate the feasibility of adapting the FLS box for the practice and assessment of endoscopic skills. This would allow for a highly available, reusable, low-cost, mechanical trainer. The design and development process was based on user-centered design, a combination of the design thinking method and cognitive task analysis. The process comprises four phases: empathy, cognitive, prototyping/adaptation, and end user testing. The underlying idea was to utilize as many of the existing components of FLS training as possible, to maintain simplicity and cost effectiveness while allowing for the practice of clinically relevant endoscopic skills. A sample size of 18 participants was calculated to be sufficient to detect performance differences between experts and trainees using a two-tailed t test with alpha set at 0.05, a standard deviation of 5.5, and a power of 80%. Adaptation of the FLS box included two fundamental attachments: a front panel with an insertion point for an endoscope and a shaft which provides additional support and limits movement of the scope. The panel also allows for mounting of retroflexion tasks. Six endoscopic tasks inspired by FLS were designed (two of which utilize existing FLS components). Pilot testing with 38 participants showed high user satisfaction and demonstrated that the trainer was robust and reliable. Task performance times were able to discriminate between trainees and experts for all six tasks. A mechanical, reusable, low-cost adaptation of the FLS training box for endoscopic skills is feasible and has high user satisfaction. Preliminary testing shows that the simulator is able to discriminate between trainees and experts. Following further validation, this adaptation may act as a supplement to the FES program.
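The quoted sample-size calculation (two-tailed t test, alpha 0.05, SD 5.5, 80% power, 18 participants) can be approximated with the standard normal-approximation formula. The mean difference of 7.3 assumed below is hypothetical, chosen only because the abstract does not state the effect size:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(delta, sd, alpha=0.05, power=0.80):
    """Normal-approximation sample size per arm for a two-sample,
    two-tailed test detecting a mean difference `delta` with common
    standard deviation `sd`: n = 2*((z_{a/2} + z_b) * sd / delta)^2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    return ceil(2 * ((z_alpha + z_beta) * sd / delta) ** 2)
```

With the hypothetical difference of 7.3, this yields 9 per arm, i.e. 18 participants in total as in the abstract.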
Cherny, N I; Sullivan, R; Torode, J; Saar, M; Eniu, A
2017-11-01
The availability and affordability of safe, effective, high-quality anticancer therapies are a core requirement for effective national cancer control plans. Online survey based on a previously validated approach. The aims of the study were to evaluate (i) the availability on national formulary of licensed antineoplastic medicines across the globe, (ii) patient out-of-pocket costs for the medications, (iii) the actual availability of the medication for a patient with a valid prescription, (iv) information relating to possible factors adversely impacting the availability of antineoplastic agents and (v) the impact of the country's level of economic development on these parameters. A total of 304 field reporters from 97 countries were invited to participate. The preliminary set of data was posted on the ESMO website for open peer review and amendments have been incorporated into the final report. Surveys were submitted by 135 reporters from 63 countries and additional peer-review data were submitted by 54 reporters from 19 countries. There are substantial differences in the formulary availability, out-of-pocket costs and actual availability for many anticancer medicines. The most substantial issues are in lower-middle- and low-income countries. Even among medications on the WHO Model List of Essential Medicines (EML) the discrepancies are profound and these relate to high out-of-pocket costs (in low-middle-income countries 32.0% of EML medicines are available only at full cost and 5.2% are not available at all, and for low-income countries, the corresponding figures are even worse at 57.7% and 8.3%, respectively). There is wide global variation in formulary availability, out-of-pocket expenditures and actual availability for most licensed anticancer medicines.
Low- and low-middle-income countries have significant lack of availability and high out-of-pocket expenditures for cancer medicines on the WHO EML, with much less availability of new, more expensive targeted agents compared with high-income countries. © The Author 2017. Published by Oxford University Press on behalf of the European Society for Medical Oncology.
Wilton, Paula; Smith, Richard; Coast, Joanna; Millar, Michael
2002-04-01
To conduct a systematic review of the literature to describe and critically appraise studies reporting on the cost and/or effectiveness of interventions proposed to control the emergence of antimicrobial resistance (AMR). The search for relevant studies encompassed consultation with world experts in AMR, and electronic bibliographic database search of: Medline (1960-2000); ISI (1981-2000); EMBASE (1988-2000); Grey Literature (1999-2000); Database of Reviews of Effectiveness (DARE) and the NHS Health Economic Evaluation Database (HEED) at York University's Centre for Reviews and Dissemination (CRD) (numerous years); OPAC (1975-2000); and the Cochrane Library Online (1990-2000). Only studies that concerned the effectiveness or cost-effectiveness of measures specifically designed to contain the emergence of AMR were reviewed. Standardised data extraction sheets, based on existing checklists for effectiveness and cost-effectiveness, were used to assess the validity of each study using the 'risk of bias criteria' suggested in the Cochrane Handbook. Only studies categorised as being at low or moderate risk of bias were reported fully. The reliability of the data review process was monitored by comparison of several, random, independent assessments by all authors. The mix of study methods (i.e. including studies based on non-randomised controlled trials) meant that formal meta-analysis was not possible, and thus a qualitative review was performed. In total, 43 studies were reviewed, with 21 classed as being at moderate or low risk of bias and therefore reported in the paper. 
These studies covered policies on: restricting the use of antimicrobials (five studies, suggesting that restriction policies can alter prescriber behaviour, although with limited evidence of subsequent effect on AMR); prescriber education, feedback and use of guidelines (six studies, with no clear conclusion); combination therapies (seven studies, showing the potential to lower drug-specific resistance, although for an indeterminate time period); vaccination (three studies showing cost/effectiveness). Most of these studies were: from the developed world, principally the USA; hospital-based, with few community level interventions; and concerned with effectiveness, not cost-effectiveness. Overall, there is an absence of good evidence concerning what is effective, and especially cost-effective, in reducing the emergence of AMR. However, in addition to more research concerning these forms of intervention, the paper highlights four specific areas for further investigation: validating intermediate or surrogate outcome measures to enable better use to be made of the literature on intermediate measures; development and evaluation of 'macro' strategies; research into specific aspects of AMR in developing countries; and empirical and methodological research concerning the economic evaluation of interventions.
Fabrication of low cost soft tissue prostheses with the desktop 3D printer
NASA Astrophysics Data System (ADS)
He, Yong; Xue, Guang-Huai; Fu, Jian-Zhong
2014-11-01
Soft tissue prostheses such as artificial ear, eye and nose are widely used in the maxillofacial rehabilitation. In this report we demonstrate how to fabricate soft prostheses mold with a low cost desktop 3D printer. The fabrication method used is referred to as Scanning Printing Polishing Casting (SPPC). Firstly the anatomy is scanned with a 3D scanner, then a tissue casting mold is designed on computer and printed with a desktop 3D printer. Subsequently, a chemical polishing method is used to polish the casting mold by removing the staircase effect and acquiring a smooth surface. Finally, the last step is to cast medical grade silicone into the mold. After the silicone is cured, the fine soft prostheses can be removed from the mold. Utilizing the SPPC method, soft prostheses with smooth surface and complicated structure can be fabricated at a low cost. Accordingly, the total cost of fabricating ear prosthesis is about $30, which is much lower than the current soft prostheses fabrication methods.
Low-cost, light-switched, forward-osmosis desalination system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warner, John C.
The looming water crisis is the second largest issue facing humanity after energy. In order to meet the increasing demand for clean water, new efficient and low-cost methods of water purification must be developed. A promising method for dry regions with sea borders is the desalination of seawater. While there remain many disadvantages to current desalination techniques, such as environmental pollution and high cost, there is a strong opportunity for new technology development in this area. In this Phase I program, the development of a light-switchable, low-cost desalination system was explored. The system requires photoselective switching of water solubility. Nine new light-switchable spiropyran-based small molecule and polymeric materials were synthesized, and methods to evaluate their desalination potential were developed and utilized. Several promising spiropyran analogues proved to be photoswitchable, but so far sufficient photoswitchability of solubility for a commercial desalination system has not been achieved. More development is required.
Production of ELZM mirrors: performance coupled with attractive schedule, cost, and risk factors
NASA Astrophysics Data System (ADS)
Leys, Antoine; Hull, Tony; Westerhoff, Thomas
2016-08-01
Extreme Lightweighted ZERODUR Mirrors (ELZM) have been developed to exploit the superb thermal characteristics of ZERODUR. Coupled with up-to-date mechanical and optical fabrication methods, this becomes an attractive technical approach. Moreover, the process of making the mirror substrates has been demonstrated to be unusually rapid and especially cost-effective. ELZM is aimed at the knee of the curve of cost as a function of lightweighting. ELZM mirrors are available at 88% lightweighting. Together with their low-risk, low-cost production methods, this is presented as a strong option for NASA Explorer and Probe class missions.
Space Technology 5 - A Successful Micro-Satellite Constellation Mission
NASA Technical Reports Server (NTRS)
Carlisle, Candace; Webb, Evan H.
2007-01-01
The Space Technology 5 (ST5) constellation of three micro-satellites was launched March 22, 2006. During the three-month flight demonstration phase, the ST5 team validated key technologies that will make future low-cost micro-sat constellations possible, demonstrated operability concepts for future micro-sat science constellation missions, and demonstrated the utility of a micro-satellite constellation to perform research-quality science. The ST5 mission was successfully completed in June 2006, demonstrating high-quality science and technology validation results.
Mitral Valve Clip for Treatment of Mitral Regurgitation: An Evidence-Based Analysis
Ansari, Mohammed T.; Ahmadzai, Nadera; Coyle, Kathryn; Coyle, Doug; Moher, David
2015-01-01
Background Many of the 500,000 North American patients with chronic mitral regurgitation may be poor candidates for mitral valve surgery. Objective The objective of this study was to investigate the comparative effectiveness, harms, and cost-effectiveness of percutaneous mitral valve repair using mitral valve clips in candidates at prohibitive risk for surgery. Data Sources We searched articles in MEDLINE, Embase, and the Cochrane Library published from 1994 to February 2014 for evidence of effectiveness and harms; for economic literature we also searched NHS EED and the Tufts CEA registry. Grey literature was also searched. Review Methods Primary studies were sought from existing systematic reviews that had employed reliable search and screening methods. Newer studies were sought by searching the period subsequent to the last search date of the review. Two reviewers screened records and assessed study validity. We used the Cochrane risk of bias tool for randomized studies, a generic assessment tool for non-randomized studies, and the Phillips checklist for economic studies. Results Ten studies including 1 randomized trial were included. The majority of the direct comparative evidence compared mitral valve clip repair with surgery in patients not particularly at prohibitive surgical risk. Irrespective of degenerative or functional chronic mitral regurgitation etiology, evidence of effectiveness and harms is inconclusive and of very low quality. Very-low-quality evidence indicates that percutaneous mitral valve clip repair may provide a survival advantage, at least during the first 1 to 2 years, particularly in medically managed chronic functional mitral regurgitation. Because of limitations in the design of studies, the cost-effectiveness of mitral valve clips in patients at prohibitive risk for surgery also could not be established. Limitations Because of serious concerns of risk of bias, indirectness, and imprecision, evidence is of very low quality.
Conclusions No meaningful conclusions can be drawn about the comparative effectiveness, harms, and cost-effectiveness of mitral valve clips in the population with chronic mitral regurgitation who are at prohibitive risk for surgery. PMID:26379810
Cross Validation Through Two-Dimensional Solution Surface for Cost-Sensitive SVM.
Gu, Bin; Sheng, Victor S; Tay, Keng Yeow; Romano, Walter; Li, Shuo
2017-06-01
Model selection plays an important role in cost-sensitive SVM (CS-SVM). It has been proven that the global minimum cross validation (CV) error can be efficiently computed based on the solution path for one-parameter learning problems. However, it is a challenge to obtain the global minimum CV error for CS-SVM based on a one-dimensional solution path and traditional grid search, because CS-SVM has two regularization parameters. In this paper, we propose a solution and error surfaces based CV approach (CV-SES). More specifically, we first compute a two-dimensional solution surface for CS-SVM based on a bi-parameter space partition algorithm, which can fit solutions of CS-SVM for all values of both regularization parameters. Then, we compute a two-dimensional validation error surface for each CV fold, which can fit validation errors of CS-SVM for all values of both regularization parameters. Finally, we obtain the CV error surface by superposing K validation error surfaces, which can find the global minimum CV error of CS-SVM. Experiments are conducted on seven datasets for cost-sensitive learning and on four datasets for imbalanced learning. Experimental results not only show that our proposed CV-SES has a better generalization ability than CS-SVM with various hybrids of grid search and solution path methods, and than a recently proposed cost-sensitive hinge loss SVM with three-dimensional grid search, but also show that CV-SES uses less running time.
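For contrast with the solution-surface approach, the traditional baseline it improves on is a grid search over the two regularization (cost) parameters with K-fold CV. A minimal sketch using an illustrative linear subgradient trainer (a stand-in, not the authors' CS-SVM solver or the bi-parameter space partition algorithm):

```python
import numpy as np

def fit_cs_svm(X, y, c_pos, c_neg, epochs=300, lr=0.05):
    """Linear cost-sensitive SVM via subgradient descent on
    ||w||^2/2 + sum_i c_i * hinge(y_i * w.x_i), with per-class costs."""
    w = np.zeros(X.shape[1])
    c = np.where(y > 0, c_pos, c_neg)
    for t in range(epochs):
        active = (y * (X @ w)) < 1.0          # points violating the margin
        grad = w - X[active].T @ (c[active] * y[active])
        w -= lr / (1.0 + t) * grad
    return w

def cv_error(X, y, c_pos, c_neg, k=5):
    """K-fold CV misclassification rate for one (c_pos, c_neg) pair."""
    folds = np.array_split(np.arange(len(y)), k)
    errs = []
    for f in folds:
        train = np.ones(len(y), dtype=bool)
        train[f] = False
        w = fit_cs_svm(X[train], y[train], c_pos, c_neg)
        errs.append(np.mean(np.sign(X[f] @ w) != y[f]))
    return float(np.mean(errs))

def grid_search(X, y, grid=(0.1, 1.0, 10.0)):
    """Baseline two-parameter grid search; returns best pair and all scores."""
    scored = {(cp, cn): cv_error(X, y, cp, cn) for cp in grid for cn in grid}
    return min(scored, key=scored.get), scored
```

This grid evaluates k folds at every grid point, which is exactly the cost the two-dimensional solution surface avoids.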
Zhang, Aiying; Yin, Chengzeng; Wang, Zhenshun; Zhang, Yonghong; Zhao, Yuanshun; Li, Ang; Sun, Huanqin; Lin, Dongdong; Li, Ning
2016-12-01
Objective To develop a simple, effective, time-saving and low-cost fluorescence protein microarray method for detecting serum alpha-fetoprotein (AFP) in patients with hepatocellular carcinoma (HCC). Method Non-contact piezoelectric print techniques were applied to fluorescence protein microarray to reduce the cost of prey antibody. Serum samples from patients with HCC and healthy control subjects were collected and evaluated for the presence of AFP using a novel fluorescence protein microarray. To validate the fluorescence protein microarray, serum samples were tested for AFP using an enzyme-linked immunosorbent assay (ELISA). Results A total of 110 serum samples from patients with HCC (n = 65) and healthy control subjects (n = 45) were analysed. When the AFP cut-off value was set at 20 ng/ml, the fluorescence protein microarray had a sensitivity of 91.67% and a specificity of 93.24% for detecting serum AFP. Serum AFP quantified via fluorescence protein microarray had a similar diagnostic performance compared with ELISA in distinguishing patients with HCC from healthy control subjects (area under receiver operating characteristic curve: 0.906 for fluorescence protein microarray; 0.880 for ELISA). Conclusion A fluorescence protein microarray method was developed for detecting serum AFP in patients with HCC.
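The reported sensitivity and specificity at the 20 ng/ml cut-off follow the standard confusion-matrix definitions; a minimal sketch with made-up example values (not the study's data):

```python
def sensitivity_specificity(values, diseased, cutoff=20.0):
    """Sensitivity and specificity of a single-threshold test.

    `values` are measured concentrations (e.g. AFP in ng/ml) and
    `diseased` the matching ground-truth labels (True = case).
    A positive test is value >= cutoff.
    """
    tp = sum(1 for v, d in zip(values, diseased) if d and v >= cutoff)
    fn = sum(1 for v, d in zip(values, diseased) if d and v < cutoff)
    tn = sum(1 for v, d in zip(values, diseased) if not d and v < cutoff)
    fp = sum(1 for v, d in zip(values, diseased) if not d and v >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)
```

Sensitivity is the fraction of cases the test flags; specificity is the fraction of controls it correctly clears.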
Low cost iodine intercalated graphene for fuel cells electrodes
NASA Astrophysics Data System (ADS)
Marinoiu, Adriana; Raceanu, Mircea; Carcadea, Elena; Varlam, Mihai; Stefanescu, Ioan
2017-12-01
Based on theoretical predictions, we report the synthesis of iodine-intercalated graphene for proton exchange membrane fuel cell (PEMFC) applications. The structure and morphology of the samples were characterized by X-ray photoelectron spectroscopy (XPS), specific surface area analysis by the BET method, and Raman spectroscopy. The presence of elemental iodine in the form of triiodide and pentaiodide was validated, suggesting that iodine was trapped between graphene layers, leading to interactions with C atoms. The electrochemical performance of the iodinated graphenes was tested and compared in a typical PEMFC configuration containing different Pt/C loadings (0.4 and 0.2 mg cm-2). When iodinated graphene is included as the microporous layer, the electrochemical performance of the fuel cell is higher in terms of power density than that of the typical fuel cell. Iodine-doped graphenes have been successfully obtained by a simple and cost-effective synthetic strategy and demonstrate new insights for the design of a high-performance metal-free ORR catalyst by a scalable technique.
Weir, David R.; Wallace, Robert B.; Langa, Kenneth M.; Plassman, Brenda L.; Wilson, Robert S.; Bennett, David A.; Duara, Ranjan; Loewenstein, David; Ganguli, Mary; Sano, Mary
2011-01-01
Establishing methods for ascertainment of dementia and cognitive impairment that are accurate and also cost effective is a challenging enterprise. Large population-based studies often using administrative data sets offer relatively inexpensive but reliable estimates of severe conditions including moderate to advanced dementia that are useful for public health planning, but they can miss less severe cognitive impairment which may be the most effective point for intervention. Clinical and epidemiological cohorts, intensively assessed, provide more sensitive detection of less severe cognitive impairment but are often costly. Here, several approaches to ascertainment are evaluated for validity, reliability, and cost. In particular, the methods of ascertainment from the Health and Retirement Study (HRS) are described briefly, along with those of the Aging, Demographics, and Memory Study (ADAMS). ADAMS, a resource-intense sub-study of the HRS, was designed to provide diagnostic accuracy among persons with more advanced dementia. A proposal to streamline future ADAMS assessments is offered. Also considered are decision tree, algorithmic, and web-based approaches to diagnosis that can reduce the expense of clinical expertise and, in some contexts, can reduce the extent of data collection. These approaches are intended for intensively assessed epidemiological cohorts. The goal is valid and reliable detection with efficient and cost-effective tools. PMID:21255747
NEW METHODS TO SCREEN FOR DEVELOPMENTAL NEUROTOXICITY.
The development of alternative methods for toxicity testing is driven by the need for scientifically valid data (i.e. predictive of a toxic effect) that can be obtained in a rapid and cost-efficient manner. These predictions will enable decisions to be made as to whether further ...
Systematic, Cooperative Evaluation.
ERIC Educational Resources Information Center
Nassif, Paula M.
Evaluation procedures based on a systematic evaluation methodology, decision-maker validity, new measurement and design techniques, low cost, and a high level of cooperation on the part of the school staff were used in the assessment of a public school mathematics program for grades 3-8. The mathematics curriculum was organized into Spirals which…
Organizing for low cost space operations - Status and plans
NASA Technical Reports Server (NTRS)
Lee, C.
1976-01-01
Design features of the Space Transportation System (vehicle reuse, low cost expendable components, simple payload interfaces, standard support systems) must be matched by economical operational methods to achieve low operating and payload costs. Users will be responsible for their own payloads and will be charged according to the services they require. Efficient use of manpower, simple documentation, simplified test, checkout, and flight planning are firm goals, together with flexibility for quick response to varying user needs. Status of the Shuttle hardware, plans for establishing low cost procedures, and the policy for user charges are discussed.
Can economic evaluation in telemedicine be trusted? A systematic review of the literature
Bergmo, Trine S
2009-01-01
Background Telemedicine has been advocated as an effective means to provide health care services over a distance. Systematic information on costs and consequences has been called for to support decision-making in this field. This paper provides a review of the quality, validity and generalisability of economic evaluations in telemedicine. Methods A systematic literature search in all relevant databases was conducted and forms the basis for addressing these issues. Only articles published in peer-reviewed journals and written in English in the period from 1990 to 2007 were analysed. The literature search identified 33 economic evaluations where both costs (resource use) and outcomes (non-resource consequences) were measured. Results This review shows that economic evaluations in telemedicine are highly diverse in terms of both the study context and the methods applied. The articles covered several medical specialities ranging from cardiology and dermatology to psychiatry. The studies analysed telemedicine in home care, and in primary and secondary care settings, using a variety of different technologies including videoconferencing, still-images and monitoring (store-and-forward telemedicine). Most studies used multiple outcome measures and analysed the effects using disaggregated cost-consequence frameworks. Objectives, study design, and choice of comparators were mostly well reported. The majority of the studies lacked information on perspective and costing method, few used general statistics and sensitivity analysis to assess validity, and even fewer used marginal analysis. Conclusion As this paper demonstrates, the majority of the economic evaluations reviewed were not in accordance with standard evaluation techniques. Further research is needed to explore the reasons for this and to address how economic evaluation in telemedicine can best take advantage of local constraints while producing valid and generalisable results. PMID:19852828
Intrathecal Drug Delivery Systems for Cancer Pain: A Health Technology Assessment
2016-01-01
Background Intrathecal drug delivery systems can be used to manage refractory or persistent cancer pain. We investigated the benefits, harms, cost-effectiveness, and budget impact of these systems compared with current standards of care for adult patients with chronic pain owing to cancer. Methods We searched Ovid MEDLINE, Ovid Embase, the Cochrane Library databases, National Health Service's Economic Evaluation Database, and Tufts Cost-Effectiveness Analysis Registry from January 1994 to April 2014 for evidence of effectiveness, harms, and cost-effectiveness. We used existing systematic reviews that had employed reliable search and screen methods, and searched for studies published after the search date reported in the latest systematic review. Two reviewers screened records and assessed study validity. The cost burden of publicly funding intrathecal drug delivery systems for cancer pain was estimated for a 5-year timeframe using a combination of published literature, information from the device manufacturer, administrative data, and expert opinion for the inputs. Results We included one randomized trial that examined effectiveness and harms, and one case series that reported an eligible economic evaluation. We found very low quality evidence that intrathecal drug delivery systems added to comprehensive pain management reduce overall drug toxicity; no significant reduction in pain scores was observed. Weak conclusions from economic evidence suggested that intrathecal drug delivery systems had the potential to be more cost-effective than high-cost oral therapy if administered for 7 months or longer. The cost burden of publicly funding this therapy is estimated to be $100,000 in the first year, increasing to $500,000 by the fifth year.
Conclusions Current evidence could not establish the benefit, harm, or cost-effectiveness of intrathecal drug delivery systems compared with current standards of care for managing refractory cancer pain in adults. Publicly funding intrathecal drug delivery systems for cancer pain would result in a budget impact of several hundred thousand dollars per year. PMID:27026796
Li, Liqi; Cui, Xiang; Yu, Sanjiu; Zhang, Yuan; Luo, Zhong; Yang, Hua; Zhou, Yue; Zheng, Xiaoqi
2014-01-01
Protein structure prediction is critical to functional annotation of the massively accumulated biological sequences, which prompts an imperative need for the development of high-throughput technologies. As a first and key step in protein structure prediction, protein structural class prediction becomes an increasingly challenging task. Among most homology-based approaches, the accuracies of protein structural class prediction are sufficiently high for high-similarity datasets, but still far from satisfactory for low-similarity datasets, i.e., below 40% in pairwise sequence similarity. Therefore, we present a novel method for accurate and reliable protein structural class prediction for both high- and low-similarity datasets. This method is based on a Support Vector Machine (SVM) in conjunction with integrated features from the position-specific score matrix (PSSM), PROFEAT and Gene Ontology (GO). A feature selection approach, SVM-RFE, is also used to rank the integrated feature vectors by recursively removing the feature with the lowest ranking score. The top features selected by SVM-RFE are input into the SVM engines to predict the structural class of a query protein. To validate our method, jackknife tests were applied to seven widely used benchmark datasets, reaching overall accuracies between 84.61% and 99.79%, which are significantly higher than those achieved by state-of-the-art tools. These results suggest that our method could serve as an accurate and cost-effective alternative to existing methods in protein structural classification, especially for low-similarity datasets.
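The SVM-RFE ranking step described in this abstract reduces to a short loop: fit a linear classifier, drop the feature with the smallest absolute weight, and repeat. The sketch below is a minimal illustration of that recursive-elimination idea, substituting an ordinary least-squares linear classifier for a linear SVM so that only numpy is needed; the synthetic data and the choice of two informative features are assumptions of this sketch, not details from the paper.

```python
import numpy as np

def rfe_rank(X, y, n_keep):
    """Recursive feature elimination: repeatedly fit a linear model and
    drop the feature with the smallest |weight|. SVM-RFE ranks by the
    weights of a linear SVM; a least-squares fit stands in here."""
    remaining = list(range(X.shape[1]))
    while len(remaining) > n_keep:
        Xs = X[:, remaining]
        # least-squares fit of labels in {-1, +1} (stand-in for linear SVM)
        w, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        drop = remaining[int(np.argmin(np.abs(w)))]
        remaining.remove(drop)
    return remaining

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
# synthetic labels: only features 0 and 3 carry signal
y = np.sign(2.0 * X[:, 0] - 1.5 * X[:, 3] + 0.1 * rng.normal(size=200))
print(sorted(rfe_rank(X, y, 2)))  # keeps the two informative features
```

On this synthetic data the noise features have near-zero weights and are eliminated first, leaving the two signal-carrying columns.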
Rossi, Michael R.; Tanaka, Daigo; Shimada, Kenji; Rabin, Yoed
2009-01-01
The current study focuses on experimentally validating a planning scheme based on the so-called bubble-packing method. This study is a part of an ongoing effort to develop computerized planning tools for cryosurgery, where bubble packing has been previously developed as a means to find an initial, uniform distribution of cryoprobes within a given domain; the so-called force-field analogy was then used to move cryoprobes to their optimum layout. However, due to the high quality of the cryoprobes’ distribution, suggested by bubble packing and its low computational cost, it has been argued that a planning scheme based solely on bubble packing may be more clinically relevant. To test this argument, an experimental validation is performed on a simulated cross-section of the prostate, using gelatin solution as a phantom material, proprietary liquid-nitrogen based cryoprobes, and a cryoheater to simulate urethral warming. Experimental results are compared with numerically simulated temperature histories resulting from planning. Results indicate an average disagreement of 0.8 mm in identifying the freezing front location, which is an acceptable level of uncertainty in the context of prostate cryosurgery imaging. PMID:19885373
Learning binary code via PCA of angle projection for image retrieval
NASA Astrophysics Data System (ADS)
Yang, Fumeng; Ye, Zhiqiang; Wei, Xueqi; Wu, Congzhong
2018-01-01
With the benefits of low storage costs and high query speeds, binary code representation methods are widely researched for efficiently retrieving large-scale data. In image hashing methods, learning a hashing function that embeds high-dimensional features into Hamming space is a key step for accurate retrieval. Principal component analysis (PCA) is widely used in compact hashing methods: most of these methods adopt PCA projection functions to project the original data onto several dimensions of real values, and each projected dimension is then quantized into one bit by thresholding. The variances of the projected dimensions differ, and the real-valued projection introduces substantial quantization error. To avoid the large quantization error of real-valued projection, in this paper we propose to use a cosine-similarity (angle) projection for each dimension; the angle projection preserves the original structure and yields more compact codes. We combined our method with the ITQ hashing algorithm, and extensive experiments on the public CIFAR-10 and Caltech-256 datasets validate the effectiveness of the proposed method.
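The PCA-projection-then-threshold baseline that this paper improves on can be sketched directly: centre the data, project onto the top principal directions, and quantize each real-valued projection to one bit at zero. The sketch below shows only that baseline on synthetic data; it does not implement the paper's cosine/angle projection or the ITQ rotation, and the data dimensions are arbitrary assumptions.

```python
import numpy as np

def pca_hash(X, n_bits):
    """PCA hashing baseline: project zero-centred data onto the top
    principal directions, then quantize each real-valued projection to
    one bit by thresholding at zero (the quantization-error step)."""
    Xc = X - X.mean(axis=0)
    # principal directions = top right-singular vectors of centred data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    proj = Xc @ Vt[:n_bits].T           # real-valued projections
    return (proj > 0).astype(np.uint8)  # one bit per projected dimension

def hamming(a, b):
    """Hamming distance between two binary codes."""
    return int(np.sum(a != b))

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 32))
codes = pca_hash(X, 16)
print(codes.shape)  # (100, 16): one 16-bit code per sample
```

Retrieval then reduces to ranking database codes by `hamming` distance to the query code, which is what makes the binary representation fast to search.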
Cost analysis of objective resident cataract surgery assessments.
Nandigam, Kiran; Soh, Jonathan; Gensheimer, William G; Ghazi, Ahmed; Khalifa, Yousuf M
2015-05-01
To compare 8 ophthalmology resident surgical training tools and determine which is most cost effective. University of Rochester Medical Center, Rochester, New York, USA. Retrospective evaluation of technology. A cost-analysis model was created to compile all relevant costs of running each tool in a medium-sized ophthalmology program. Quantitative cost estimates were obtained based on the cost of the tools, the cost of time spent in evaluations, and supply and maintenance costs. For wet laboratory simulation, Eyesi was the least expensive cataract surgery simulation method; however, it is only capable of evaluating simulated cataract surgery rehearsal and requires supplementation with other evaluative methods for operating room performance and for noncataract wet lab training and evaluation. The most expensive training tool was the Eye Surgical Skills Assessment Test (ESSAT). The 2 most affordable methods for evaluating resident operating room performance were the Objective Assessment of Skills in Intraocular Surgery (OASIS) and the Global Rating Assessment of Skills in Intraocular Surgery (GRASIS). Cost-based analysis of ophthalmology resident surgical training tools is needed so residency programs can implement tools that are valid, reliable, objective, and cost effective. There is no perfect training system at this time. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
VALUE - Validating and Integrating Downscaling Methods for Climate Change Research
NASA Astrophysics Data System (ADS)
Maraun, Douglas; Widmann, Martin; Benestad, Rasmus; Kotlarski, Sven; Huth, Radan; Hertig, Elke; Wibig, Joanna; Gutierrez, Jose
2013-04-01
Our understanding of global climate change is mainly based on General Circulation Models (GCMs) with a relatively coarse resolution. Since climate change impacts are mainly experienced on regional scales, high-resolution climate change scenarios need to be derived from GCM simulations by downscaling. Several projects have been carried out in recent years to validate the performance of statistical and dynamical downscaling, yet several aspects have not been systematically addressed: variability on sub-daily, decadal and longer time-scales, extreme events, spatial variability and inter-variable relationships. Different downscaling approaches, such as dynamical downscaling, statistical downscaling and bias correction, have not been systematically compared. Furthermore, collaboration between different communities, in particular regional climate modellers, statistical downscalers and statisticians, has been limited. To address these gaps, the EU Cooperation in Science and Technology (COST) action VALUE (www.value-cost.eu) has been established. VALUE is a research network with participants from currently 23 European countries, running from 2012 to 2015. Its main aim is to systematically validate and develop downscaling methods for climate change research in order to improve regional climate change scenarios for use in climate impact studies. Inspired by the co-design idea of the international research initiative "future earth", stakeholders of climate change information have been involved in defining the research questions to be addressed and are actively participating in the network. The key idea of VALUE is to identify the relevant weather and climate characteristics required as input for a wide range of impact models and to define an open framework to systematically validate these characteristics. Based on a range of benchmark data sets, in principle every downscaling method can be validated and compared with competing methods.
The results of this exercise will directly provide end users with important information about the uncertainty of regional climate scenarios, and will furthermore provide the basis for further developing downscaling methods. This presentation will provide background information on VALUE and discuss the identified characteristics and the validation framework.
Poss, Jeffrey W; Hirdes, John P; Fries, Brant E; McKillop, Ian; Chase, Mary
2008-04-01
The case-mix system Resource Utilization Groups version III for Home Care (RUG-III/HC) was derived using a modest data sample from Michigan, but to date no comprehensive large scale validation has been done. This work examines the performance of the RUG-III/HC classification using a large sample from Ontario, Canada. Cost episodes over a 13-week period were aggregated from individual level client billing records and matched to assessment information collected using the Resident Assessment Instrument for Home Care, from which classification rules for RUG-III/HC are drawn. The dependent variable, service cost, was constructed using formal services plus informal care valued at approximately one-half that of a replacement worker. An analytic dataset of 29,921 episodes showed a skewed distribution with over 56% of cases falling into the lowest hierarchical level, reduced physical functions. Case-mix index values for formal and informal cost showed very close similarities to those found in the Michigan derivation. Explained variance for a function of combined formal and informal cost was 37.3% (20.5% for formal cost alone), with personal support services as well as informal care showing the strongest fit to the RUG-III/HC classification. RUG-III/HC validates well compared with the Michigan derivation work. Potential enhancements to the present classification should consider the large numbers of undifferentiated cases in the reduced physical function group, and the low explained variance for professional disciplines.
Fawaz, Elyssa G; Salam, Darine A
2018-05-15
In this study, a method for assessing the costs of biodiesel production from waste frying oils in Beirut, Lebanon, was investigated with the aim of developing an economic evaluation of this alternative. A hundred restaurant and hotel enterprises in Beirut were surveyed to promote their participation in the biodiesel supply chain and to collect data on waste frying oil generation, disposal methods and frequency, and acquisition cost. In addition, waste frying oils were collected and converted into biodiesel using a one-step base-catalyzed transesterification process. The physicochemical characteristics of the produced biodiesel conformed to international standards. Data produced from laboratory-scale conversion of waste frying oils to biodiesel, as well as data collected from the only biodiesel plant in Lebanon, were used to determine the production cost of biodiesel. A Geographic Information System was used to propose a real-time vehicle routing model to establish the logistics costs associated with waste frying oil collection. Comparing scenarios for the configuration of the waste frying oil collection network, and using medium-duty commercial vehicles for collection, a logistics cost of US$/L 0.08 was optimally reached. For the calculation of the total cost of biodiesel production, the minimum, average, and maximum values of the non-fixed cost variables were considered, yielding 81 scenarios for possible biodiesel costs. These were compared with information on the commercialization of diesel in Lebanon for the years 2011 through 2017. Although competitive with petroleum diesel from 2011 to 2014, the total biodiesel cost showed less tolerance to declining diesel prices in recent years. Sensitivity analysis demonstrated that the acquisition cost of waste frying oils is the key factor affecting the overall cost of biodiesel production.
The results of this study validate the economic feasibility of waste frying oils' biodiesel production in the studied urban area upon enforcement of low waste frying oils' acquisition costs, and can help spur food service enterprises to become suppliers of biodiesel production feedstock and support a healthy development of the biodiesel industry in Lebanon. Copyright © 2017 Elsevier B.V. All rights reserved.
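The scenario construction described above, taking the minimum, average, and maximum values of each non-fixed cost variable, gives 3^4 = 81 combinations, which can be enumerated with a Cartesian product. The variable names and every cost figure in the sketch below are hypothetical placeholders, not values from the study.

```python
from itertools import product

# Hypothetical min/average/max levels (US$/L) for four non-fixed cost
# variables: 3 levels ** 4 variables = 81 scenarios.
levels = {
    "oil_acquisition": (0.00, 0.10, 0.25),
    "logistics":       (0.05, 0.08, 0.12),
    "conversion":      (0.15, 0.20, 0.28),
    "labour":          (0.03, 0.05, 0.08),
}
fixed_cost = 0.10  # hypothetical fixed cost per litre

# Each scenario's total cost = fixed cost + one level per variable
scenarios = [fixed_cost + sum(combo) for combo in product(*levels.values())]
print(len(scenarios), round(min(scenarios), 2), round(max(scenarios), 2))
```

Each scenario total can then be compared against the prevailing diesel price, as the study does for the years 2011 through 2017.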
Cold-End Subsystem Testing for the Fission Power System Technology Demonstration Unit
NASA Technical Reports Server (NTRS)
Briggs, Maxwell; Gibson, Marc; Ellis, David; Sanzi, James
2013-01-01
The Fission Power System (FPS) Technology Demonstration Unit (TDU) consists of a pumped sodium-potassium (NaK) loop that provides heat to a Stirling Power Conversion Unit (PCU), which converts some of that heat into electricity and rejects the waste heat to a pumped water loop. Each of the TDU subsystems is being tested independently prior to full system testing at the NASA Glenn Research Center. The pumped NaK loop is being tested at NASA Marshall Space Flight Center; the Stirling PCU and electrical controller are being tested by Sunpower Inc.; and the pumped water loop is being tested at Glenn. This paper describes cold-end subsystem setup and testing at Glenn. The TDU cold end has been assembled in Vacuum Facility 6 (VF 6) at Glenn, the same chamber that will be used for TDU testing. Cold-end testing in VF 6 will demonstrate functionality; validate cold-end fill, drain, and emergency backup systems; and generate pump performance and system pressure drop data used to validate models. In addition, a low-cost proof-of-concept radiator has been built and tested at Glenn, validating the design and demonstrating the feasibility of using low-cost metal radiators as an alternative to high-cost composite radiators in an end-to-end TDU test.
Biomedical microfluidic devices by using low-cost fabrication techniques: A review.
Faustino, Vera; Catarino, Susana O; Lima, Rui; Minas, Graça
2016-07-26
One of the most popular methods to fabricate biomedical microfluidic devices is soft lithography. However, the fabrication of the moulds used to produce microfluidic devices, such as SU-8 moulds, usually requires a cleanroom environment that can be quite costly. Therefore, many efforts have been made to develop low-cost alternatives for the fabrication of microstructures, avoiding the use of cleanroom facilities. Recently, low-cost techniques that avoid cleanroom facilities and achieve aspect ratios above 20 have been gaining popularity in the biomedical research community for fabricating those SU-8 moulds. In these techniques, the Ultraviolet (UV) exposure equipment commonly used in the Printed Circuit Board (PCB) industry replaces the more expensive and less available Mask Aligner that has been used over the last 15 years for SU-8 patterning. Alternatively, non-lithographic low-cost techniques, owing to their suitability for large-scale production, have increased the interest of the industrial and research communities in developing simple, rapid and low-cost microfluidic structures. These alternative techniques include Print and Peel (PAP) methods, laserjet, solid ink, cutting plotters and micromilling, which use equipment available in almost all laboratories and offices. An example is the xurography technique, which uses a cutting plotter machine and adhesive vinyl films to generate the master moulds used to fabricate microfluidic channels. In this review, we present a selection of the most recent lithographic and non-lithographic low-cost techniques to fabricate microfluidic structures, focusing on the features and limitations of each technique. Only microfabrication methods that do not require the use of cleanrooms are considered. Additionally, potential applications of these microfluidic devices in biomedical engineering are presented with some illustrative examples. Copyright © 2015 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Ansaloni, Luca; Catena, Fausto; Coccolini, Frederico; Ceresoli, Marco; Pinna, Antonio Daniele
2014-01-01
Objectives: Inguinal canal anatomy and hernia repair is difficult for medical students and surgical residents to comprehend. Methods: Using low-cost material, a 3-dimensional inexpensive model of the inguinal canal was created to allow students to learn anatomical details and landmarks and to perform their own simulated hernia repair. In order to…
A Programmable Plug & Play Sensor Interface for WSN Applications
Vera, Sergio D.; Bayo, Alberto; Medrano, Nicolás; Calvo, Belén; Celma, Santiago
2011-01-01
Cost reduction in wireless sensor networks (WSN) becomes a priority when extending their application to fields where a great number of sensors is needed, such as habitat monitoring, precision agriculture or diffuse greenhouse emission measurement. In these cases, the use of smart sensors is expensive, consequently requiring the use of low-cost sensors. The solution to convert such generic low-cost sensors into intelligent ones leads to the implementation of a versatile system with enhanced processing and storage capabilities to attain a plug and play electronic interface able to adapt to all the sensors used. This paper focuses on this issue and presents a low-voltage plug & play reprogrammable interface capable of adapting to different sensor types and achieving an optimum reading performance for every sensor. The proposed interface, which includes both electronic and software elements so that it can be easily integrated in WSN nodes, is described and experimental test results to validate its performance are given. PMID:22164118
Máthé, Koppány; Buşoniu, Lucian
2015-01-01
Unmanned aerial vehicles (UAVs) have gained significant attention in recent years. Low-cost platforms using inexpensive sensor payloads have been shown to provide satisfactory flight and navigation capabilities. In this report, we survey vision and control methods that can be applied to low-cost UAVs, and we list some popular inexpensive platforms and application fields where they are useful. We also highlight the sensor suites used where this information is available. We overview, among others, feature detection and tracking, optical flow and visual servoing, low-level stabilization and high-level planning methods. We then list popular low-cost UAVs, selecting mainly quadrotors. We discuss applications, restricting our focus to the field of infrastructure inspection. Finally, as an example, we formulate two use-cases for railway inspection, a less explored application field, and illustrate the usage of the vision and control techniques reviewed by selecting appropriate ones to tackle these use-cases. To select vision methods, we run a thorough set of experimental evaluations. PMID:26121608
Low cost, patterning of human hNT brain cells on parylene-C with UV & IR laser machining.
Raos, Brad J; Unsworth, C P; Costa, J L; Rohde, C A; Doyle, C S; Delivopoulos, E; Murray, A F; Dickinson, M E; Simpson, M C; Graham, E S; Bunting, A S
2013-01-01
This paper describes the use of 800 nm femtosecond infrared (IR) and 248 nm nanosecond ultraviolet (UV) laser radiation in performing ablative micromachining of parylene-C on SiO2 substrates for the patterning of human hNT astrocytes. Results are presented that support the validity of using IR laser ablative micromachining for patterning human hNT astrocytes, while UV laser radiation produces photo-oxidation of the parylene-C and destroys cell patterning. The findings demonstrate how IR laser ablative micromachining of parylene-C on SiO2 substrates can offer a low-cost, accessible alternative for rapid prototyping and high-yield cell patterning.
Towards a detailed anthropometric body characterization using the Microsoft Kinect.
Domingues, Ana; Barbosa, Filipa; Pereira, Eduardo M; Santos, Márcio Borgonovo; Seixas, Adérito; Vilas-Boas, João; Gabriel, Joaquim; Vardasca, Ricardo
2016-01-01
Anthropometry has been widely used in different fields, providing relevant information for medicine, ergonomics and biometric applications. However, existing solutions present marked disadvantages that limit the adoption of this type of evaluation. Studies have been conducted to easily determine anthropometric measures from data provided by low-cost sensors, such as the Microsoft Kinect. In this work, a methodology is proposed and implemented for estimating anthropometric measures from the information acquired with this sensor. The measures obtained with this method were compared with those from a validation system, Qualisys. Comparing the relative errors with state-of-the-art references, lower errors were verified for some of the estimated measures, and a more complete characterization of the whole body structure was achieved.
Muedra, Vicente; Llau, Juan V; Llagunes, José; Paniagua, Pilar; Veiras, Sonia; Fernández-López, Antonio R; Diago, Carmen; Hidalgo, Francisco; Gil, Jesús; Valiño, Cristina; Moret, Enric; Gómez, Laura; Pajares, Azucena; de Prada, Blanca
2013-04-01
To study the impact on postoperative costs of a patient's antithrombin levels associated with outcomes after cardiac surgery with extracorporeal circulation. An analytic decision model was designed to estimate costs and clinical outcomes after cardiac surgery in a typical patient with low antithrombin levels (<63.7%) compared with a patient with normal antithrombin levels (≥63.7%). The data used in the model were obtained from a literature review and subsequently validated by a panel of experts in cardiothoracic anesthesiology. Multi-institutional (14 Spanish hospitals). Consultant anesthesiologists. A sensitivity analysis of extreme scenarios was carried out to assess the impact of the major variables in the model results. The average cost per patient was €18,772 for a typical patient with low antithrombin levels and €13,881 for a typical patient with normal antithrombin levels. The difference in cost was due mainly to the longer hospital stay of a patient with low antithrombin levels compared with a patient with normal levels (13 v 10 days, respectively, representing a €4,596 higher cost) rather than to costs related to the management of postoperative complications (€215, mostly owing to transfusions). Sensitivity analysis showed a high variability range of approximately ±55% of the base case cost between the minimum and maximum scenarios, with the hospital stay contributing more significantly to the variation. Based on this analytic decision model, there could be a marked increase in the postoperative costs of patients with low antithrombin activity levels at the end of cardiac surgery, mainly ascribed to a longer hospitalization. Copyright © 2013 Elsevier Inc. All rights reserved.
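The headline figures of this decision model can be tied together with a one-line incremental-cost calculation: the extra cost of a low-antithrombin patient is the extra hospital-stay cost plus the extra complication-management cost. In the sketch below, the per-day cost is back-calculated from the reported €4,596 for three extra days and is an assumption of this sketch, not a figure stated by the model.

```python
def cost_difference(extra_days, daily_cost, extra_complication_cost):
    """Incremental postoperative cost of a low- vs normal-antithrombin
    patient: extra hospital stay plus extra complication-management cost."""
    return extra_days * daily_cost + extra_complication_cost

# 3 extra days (13 vs 10) reported to cost 4,596 EUR in total, plus
# 215 EUR of complication-related cost (mostly transfusions)
extra = cost_difference(extra_days=3, daily_cost=4596 / 3,
                        extra_complication_cost=215)
print(round(extra))  # 4811
```

The result, €4,811, sits close to the €4,891 gap between the reported average costs (€18,772 vs €13,881), consistent with the abstract's statement that the hospital stay, not complication management, drives the difference.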
Predictive wind turbine simulation with an adaptive lattice Boltzmann method for moving boundaries
NASA Astrophysics Data System (ADS)
Deiterding, Ralf; Wood, Stephen L.
2016-09-01
Operating horizontal axis wind turbines create large-scale turbulent wake structures that affect the power output of downwind turbines considerably. The computational prediction of this phenomenon is challenging as efficient low dissipation schemes are necessary that represent the vorticity production by the moving structures accurately and that are able to transport wakes without significant artificial decay over distances of several rotor diameters. We have developed a parallel adaptive lattice Boltzmann method for large eddy simulation of turbulent weakly compressible flows with embedded moving structures that considers these requirements rather naturally and enables first principle simulations of wake-turbine interaction phenomena at reasonable computational costs. The paper describes the employed computational techniques and presents validation simulations for the Mexnext benchmark experiments as well as simulations of the wake propagation in the Scaled Wind Farm Technology (SWIFT) array consisting of three Vestas V27 turbines in triangular arrangement.
NASA Technical Reports Server (NTRS)
Patten, William Neff
1989-01-01
There is an evident need to discover a means of establishing reliable, implementable controls for systems that are plagued by nonlinear and/or uncertain model dynamics. The development of a generic controller design tool for tough-to-control systems is reported. The method utilizes a moving grid, time infinite element based solution of the necessary conditions that describe an optimal controller for a system. The technique produces a discrete feedback controller. Real time laboratory experiments are now being conducted to demonstrate the viability of the method. The algorithm that results is being implemented in a microprocessor environment. Critical computational tasks are accomplished using a low cost, on-board, multiprocessor (INMOS T800 Transputers) and parallel processing. Progress to date validates the methodology presented. Applications of the technique to the control of highly flexible robotic appendages are suggested.
The Mesa Arizona Pupil Tracking System
NASA Technical Reports Server (NTRS)
Wright, D. L.
1973-01-01
A computer-based Pupil Tracking/Teacher Monitoring System was designed for Mesa Public Schools, Mesa, Arizona. The established objectives of the system were to: (1) facilitate the economical collection and storage of student performance data necessary to objectively evaluate the relative effectiveness of teachers, instructional methods, materials, and applied concepts; and (2) identify, on a daily basis, those students requiring special attention in specific subject areas. The system encompasses computer hardware/software and integrated curricula progression/administration devices. It provides daily evaluation and monitoring of performance as students progress at class or individualized rates. In the process, it notifies the student and collects information necessary to validate or invalidate subject presentation devices, methods, materials, and measurement devices in terms of direct benefit to the students. The system utilizes a small-scale computer (e.g., IBM 1130) to assure low-cost replicability, and may be used for many subjects of instruction.
An inexpensive technique for the time resolved laser induced plasma spectroscopy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahmed, Rizwan, E-mail: rizwan.ahmed@ncp.edu.pk; Ahmed, Nasar; Iqbal, J.
We present an efficient and inexpensive method for calculating the time resolved emission spectrum from the time integrated spectrum by monitoring the time evolution of neutral and singly ionized species in the laser produced plasma. To validate our assertion of extracting time resolved information from the time integrated spectrum, the time evolution data of the Cu II line at 481.29 nm and the molecular bands of AlO in the wavelength region (450–550 nm) have been studied. The plasma parameters were also estimated from the time resolved and time integrated spectra. A comparison of the results clearly reveals that the time resolved information about the plasma parameters can be extracted from the spectra registered with a time integrated spectrograph. Our proposed method will make laser induced plasma spectroscopy a robust and low cost technique that is attractive for industry and environmental monitoring.
A modified and cost-effective method for hair cortisol analysis.
Xiang, Lianbin; Sunesara, Imran; Rehm, Kristina E; Marshall, Gailen D
2016-01-01
Hair cortisol may hold potential as a biomarker for assessment of chronic psychological stress. We report a modified and cost-effective method to prepare hair samples for cortisol assay. Hair samples were ground using an inexpensive ball grinder - ULTRA-TURRAX tube drive. Cortisol was extracted from the powder under various defined conditions. The data showed that the optimal conditions for this method include cortisol extraction at room temperature and evaporation using a stream of room air. These findings should allow more widespread research using economical technology to validate the utility of hair cortisol as a biomarker for assessing chronic stress status.
Ahlawat, Sonika; Sharma, Rekha; Maitra, A.; Roy, Manoranjan; Tantia, M.S.
2014-01-01
New, quick, and inexpensive methods for genotyping novel caprine Fec gene polymorphisms through tetra-primer ARMS PCR were developed in the present investigation. Single nucleotide polymorphism (SNP) genotyping needs to be attempted to establish association between the identified mutations and traits of economic importance. In the current study, we have successfully genotyped three new SNPs identified in caprine fecundity genes viz. T(-242)C (BMPR1B), G1189A (GDF9) and G735A (BMP15). Tetra-primer ARMS PCR protocol was optimized and validated for these SNPs with short turn-around time and costs. The optimized techniques were tested on 158 random samples of Black Bengal goat breed. Samples with known genotypes for the described genes, previously tested in duplicate using the sequencing methods, were employed for validation of the assay. Upon validation, complete concordance was observed between the tetra-primer ARMS PCR assays and the sequencing results. These results highlight the ability of tetra-primer ARMS PCR in genotyping of mutations in Fec genes. Any associated SNP could be used to accelerate the improvement of goat reproductive traits by identifying high prolific animals at an early stage of life. Our results provide direct evidence that tetra-primer ARMS-PCR is a rapid, reliable, and cost-effective method for SNP genotyping of mutations in caprine Fec genes. PMID:25606428
A review of Cry protein detection with enzyme-linked immunosorbent assays
USDA-ARS?s Scientific Manuscript database
Several detection methods are available to monitor the fate of Cry proteins in the environment; enzyme-linked immunosorbent assays (ELISAs) have emerged as the preferred detection method due to their cost-effectiveness, ease of use, and rapid results. Validation of ELISAs is necessary to ensure acc...
Jenkins, Kathy J; Koch Kupiec, Jennifer; Owens, Pamela L; Romano, Patrick S; Geppert, Jeffrey J; Gauvreau, Kimberlee
2016-05-20
The National Quality Forum previously approved a quality indicator for mortality after congenital heart surgery developed by the Agency for Healthcare Research and Quality (AHRQ). Several parameters of the validated Risk Adjustment for Congenital Heart Surgery (RACHS-1) method were included, but others differed. As part of the National Quality Forum endorsement maintenance process, developers were asked to harmonize the 2 methodologies. Parameters that were identical between the 2 methods were retained. AHRQ's Healthcare Cost and Utilization Project State Inpatient Databases (SID) 2008 were used to select optimal parameters where differences existed, with a goal to maximize model performance and face validity. Inclusion criteria were not changed and included all discharges for patients <18 years with International Classification of Diseases, Ninth Revision, Clinical Modification procedure codes for congenital heart surgery or nonspecific heart surgery combined with congenital heart disease diagnosis codes. The final model includes procedure risk group, age (0-28 days, 29-90 days, 91-364 days, 1-17 years), low birth weight (500-2499 g), other congenital anomalies (Clinical Classifications Software 217, except for 758.xx), multiple procedures, and transfer-in status. Among 17 945 eligible cases in the SID 2008, the c statistic for model performance was 0.82. In the SID 2013 validation data set, the c statistic was 0.82. Risk-adjusted mortality rates by center ranged from 0.9% to 4.1% (5th-95th percentile). Congenital heart surgery programs can now obtain national benchmarking reports by applying AHRQ Quality Indicator software to hospital administrative data, based on the harmonized RACHS-1 method, with high discrimination and face validity. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
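The c statistic of 0.82 reported above is a standard discrimination measure: the probability that a randomly chosen death has a higher predicted risk than a randomly chosen survivor. A minimal pairwise implementation (the data below are hypothetical, not the SID cases):

```python
def c_statistic(risk, died):
    """Pairwise c statistic (AUC); ties in predicted risk count as 0.5."""
    pairs = 0
    concordant = 0.0
    for r1, d1 in zip(risk, died):
        for r2, d2 in zip(risk, died):
            if d1 == 1 and d2 == 0:     # one death, one survivor
                pairs += 1
                if r1 > r2:
                    concordant += 1.0
                elif r1 == r2:
                    concordant += 0.5
    return concordant / pairs

# Hypothetical predicted risks and outcomes (1 = died, 0 = survived).
risk = [0.9, 0.8, 0.3, 0.2, 0.1]
died = [1, 0, 1, 0, 0]
print(c_statistic(risk, died))  # 0.8333333333333334
```

Production risk-adjustment software computes the same quantity far more efficiently via rank statistics, but the pairwise definition is what the number means.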
Unsynchronized scanning with a low-cost laser range finder for real-time range imaging
NASA Astrophysics Data System (ADS)
Hatipoglu, Isa; Nakhmani, Arie
2017-06-01
Range imaging plays an essential role in many fields: 3D modeling, robotics, heritage, agriculture, forestry, and reverse engineering. One of the most popular range-measuring technologies is the laser scanner, due to its several advantages: long range, high precision, real-time measurement capabilities, and no dependence on lighting conditions. However, laser scanners are very costly, which prevents their widespread use. Thanks to the latest developments in technology, low-cost, reliable, fast, and lightweight 1D laser range finders (LRFs) are now available. A low-cost 1D LRF with a scanning mechanism that steers the laser beam across additional dimensions makes it possible to capture a depth map. In this work, we present unsynchronized scanning with a low-cost LRF to shorten the scanning period and to reduce the vibrations caused by the stop-scan motion of synchronized scanning. Moreover, we developed an algorithm for aligning the unsynchronized raw data and propose a range-image post-processing framework. The proposed technique yields a range-imaging system for a fraction of the price of its counterparts. The results show that the proposed method can fulfill the need for low-cost laser scanning of static environments; the most significant limitation is the scanning period, which is about 2 minutes for 55,000 range points (a 250×220 image), compared with around 4 minutes for synchronized scanning of the same image. Once faster, longer-range, narrow-beam LRFs become available, the methods proposed in this work will produce better results.
Egle, L; Rechberger, H; Krampe, J; Zessner, M
2016-11-15
Phosphorus (P) is an essential and limited resource. Municipal wastewater is a promising source of P via reuse and could be used to replace P derived from phosphate rocks. The agricultural use of sewage sludge is restricted by legislation or is not practiced in several European countries due to environmental risks posed by organic micropollutants and pathogens. Several technologies have been developed in recent years to recover wastewater P. However, these technologies target different P-containing flows in wastewater treatment plants (effluent, digester supernatant, sewage sludge, and sewage sludge ash), use diverse engineering approaches and differ greatly with respect to P recycling rate, potential of removing or destroying pollutants, product quality, environmental impact and cost. This work compares 19 relevant P recovery technologies by considering their relationships with existing wastewater and sludge treatment systems. A combination of different methods, such as material flow analysis, damage units, the reference soil method, the annuity method, integrated cost calculation and a literature study on solubility, fertilizing effects and handling of recovered materials, is used to evaluate the different technologies with respect to technical, ecological and economic aspects. Given the manifold origins of the data, an uncertainty concept that considers the validity of the data sources is applied. This analysis revealed that recovery from flows with dissolved P produces clean and plant-available materials. These techniques may even be beneficial from economic and technical perspectives under specific circumstances. However, the recovery rates (a maximum of 25%) relative to the wastewater treatment plant influent are relatively low. The approaches that recover P from sewage sludge apply complex technologies and generally achieve effective removal of heavy metals at moderate recovery rates (~40-50% relative to the WWTP input) and comparatively high costs.
Sewage sludge ash is the most promising P source, with recovery rates of 60-90% relative to the wastewater P. The costs highly depend on the purity requirements of the recycled products but can be kept comparatively low, especially if synergies with existing industrial processes are exploited. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Trandafir, Laura; Alexandru, Mioara; Constantin, Mihai; Ioniţă, Anca; Zorilă, Florina; Moise, Valentin
2012-09-01
EN ISO 11137 establishes the requirements for setting or substantiating the dose needed to achieve the desired sterility assurance level. Validation studies can be designed specifically for different types of products, and each product needs distinct protocols for bioburden determination and sterility testing. The Microbiological Laboratory of the Irradiation Processing Center (IRASM) deals with different types of products, mainly using the VDmax25 method. For microbiological evaluation, the most challenging product was cotton gauze. A special situation for establishing the sterilization validation method arises when cotton is packed in large quantities: the VDmax25 method cannot be applied to items with an average bioburden above 1000 CFU per pack, irrespective of the weight of the package. This is a limitation of the method and implies increased costs for the manufacturer, who must choose other methods. For microbiological tests, culture conditions should be selected for both bioburden and sterility testing. Details about the selection criteria are given.
Low-cost microwave radiometry for remote sensing of soil moisture
NASA Astrophysics Data System (ADS)
Chikando, Eric Ndjoukwe
2007-12-01
Remote sensing is now widely regarded as a dominant means of studying the Earth and its surrounding atmosphere. The science is based on blackbody theory, which states that all objects emit broadband electromagnetic radiation proportional to their temperature. This thermal emission is detectable by radiometers, highly sensitive receivers capable of measuring extremely low-power radiation across a continuum of frequencies. In the particular case of a soil surface, one important parameter affecting the emitted radiation is the amount of water content, or soil moisture. A high degree of precision is required when estimating soil moisture in order to yield accurate forecasting of precipitation and short-term climate variability such as storms and hurricanes. Rapid progress within the remote sensing community in tackling current limitations calls for greater public awareness of the benefits of the science. Remote sensing instrumentation and techniques remain inaccessible to many higher-education institutions because of the high cost of instrumentation and the general inaccessibility of the science. To draw more talent into the field, more affordable and reliable scientific instrumentation is needed. This dissertation introduces the first low-cost handheld microwave instrument fully capable of surface soil moisture studies. The framework of this research is two-fold. First, the development of a low-cost handheld microwave radiometer using the well-known Dicke configuration is examined. The instrument features a super-heterodyne architecture and is designed following a microwave integrated circuit (MIC) system approach. The instrument is validated by applying it to various soil targets and comparing the measurements to data obtained with the gravimetric technique, a proven scientific method for determining volumetric soil moisture content.
Second, the development of a fully functional receiver RF front-end is presented. This receiver module is designed in support of a digital radiometer effort under development by the Center of Microwave Satellite and RF Engineering (COMSARE) at Morgan State University. The topology of the receiver includes a low-noise amplifier, bandpass filters, and a three-stage gain amplifier. The design, characterization, and evaluation of these system blocks are detailed within the framework of this dissertation.
NASA Astrophysics Data System (ADS)
El-Masry, Amal A.; Hammouda, Mohammed E. A.; El-Wasseef, Dalia R.; El-Ashry, Saadia M.
2018-02-01
Two simple, sensitive, rapid, validated and cost-effective spectroscopic methods were established for the quantification of the antihistaminic drug azelastine (AZL) in bulk powder as well as in pharmaceutical dosage forms. In the first method (A), the absorbance difference between acidic and basic solutions was measured at 228 nm, whereas in the second method (B), the binary complex formed between AZL and Eosin Y in acetate buffer solution (pH 3) was measured at 550 nm. Different factors with a critical influence on the intensity of absorption were studied in depth and optimized to achieve the highest absorption. The proposed methods obeyed Beer's law in the concentration ranges of 2.0-20.0 μg·mL⁻¹ and 0.5-15.0 μg·mL⁻¹, with % recovery ± SD of 99.84 ± 0.87 and 100.02 ± 0.78 for methods (A) and (B), respectively. Furthermore, the proposed methods were easily applied to quality control of pharmaceutical preparations without interference from co-formulated additives, and the analytical results were compatible with those obtained by the comparison method, with no significant difference as confirmed by Student's t-test and the variance-ratio F-test. Validation of the proposed methods was performed according to the ICH guidelines in terms of linearity, limit of quantification, limit of detection, accuracy, precision and specificity, and the analytical results were persuasive. Graphical abstract: absorption spectra of AZL (16 μg·mL⁻¹) in 0.1 M HCl and in 0.1 M NaOH, their difference spectrum, and the absorption spectrum of the eosin binary complex with AZL (10 μg·mL⁻¹).
NASA Astrophysics Data System (ADS)
Hoffer, Nathan Von
Remote sensing has traditionally been done with satellites and manned aircraft. While these methods can yield useful scientific data, satellites and manned aircraft have limitations in data frequency, processing time, and real-time re-tasking. Small low-cost unmanned aerial vehicles (UAVs) provide greater possibilities for personal scientific research than traditional remote sensing platforms. Precision aerial data requires an accurate vehicle dynamics model for controller development, robust flight characteristics, and fault tolerance. One method of developing such a model is system identification (system ID). In this thesis, system ID of a small low-cost fixed-wing T-tail UAV is conducted. The linearized longitudinal equations of motion are derived from first principles. The foundations of Recursive Least Squares (RLS) are presented, along with RLS with an Error Filtering Online Learning (EFOL) scheme. Sensors, data collection, data consistency checking, and data processing are described. Batch least squares (BLS) and BLS with EFOL are used to identify the aerodynamic coefficients of the UAV. Results of these two methods with flight data are discussed.
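Batch least squares as used for coefficient identification can be sketched on a toy first-order model; the model, coefficients, and data below are synthetic assumptions for illustration, not the thesis's aircraft dynamics or flight logs.

```python
import numpy as np

# Toy discrete-time model x[k+1] = a*x[k] + b*u[k]; identify (a, b)
# from logged input/output data via batch least squares,
# theta_hat = argmin ||Phi @ theta - y||^2.
rng = np.random.default_rng(0)
a_true, b_true = 0.95, 0.10
u = rng.standard_normal(500)          # recorded input sequence
x = np.zeros(501)
for k in range(500):                  # simulate the "flight" data
    x[k + 1] = a_true * x[k] + b_true * u[k]

Phi = np.column_stack([x[:-1], u])    # regressor matrix, one row per sample
y = x[1:]                             # one-step-ahead outputs
theta, *_ = np.linalg.lstsq(Phi, y, rcond=None)
print(theta)                          # recovers a ~ 0.95, b ~ 0.10
```

With real flight data the regressors are built from measured states and control deflections, and consistency checking (as described in the abstract) matters because sensor bias and noise directly corrupt Phi.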
Application of low-cost adsorbents for dye removal--a review.
Gupta, V K; Suhas
2009-06-01
Dyes are an important class of pollutants that can even be identified by the human eye. Disposal of dyes into precious water resources must be avoided, and various treatment technologies are in use for that purpose. Among these methods, adsorption occupies a prominent place in dye removal. The growing demand for efficient and low-cost treatment methods and the importance of adsorption have given rise to low-cost alternative adsorbents (LCAs). This review highlights and provides an overview of these LCAs, comprising natural, industrial, as well as synthetic materials/wastes, and their application to dye removal. In addition, various other methods used for dye removal from water and wastewater are also compiled in brief. From a comprehensive literature review, it was found that some LCAs, in addition to wide availability, have fast kinetics and appreciable adsorption capacities. Advantages and disadvantages of adsorbents, favourable conditions for particular adsorbate-adsorbent systems, and adsorption capacities of various low-cost adsorbents and commercial activated carbons, as available in the literature, are presented. Conclusions have been drawn from the literature reviewed, and suggestions for future research are proposed.
NASA Astrophysics Data System (ADS)
Chowdhury, Md Mukul
With the increased practice of modularization and prefabrication, the construction industry has gained the benefits of quality management, improved completion time, reduced site disruption and vehicular traffic, and improved overall safety and security. Whereas industrialized construction methods, such as modular and manufactured buildings, have evolved over decades, the core techniques used in prefabrication plants vary only slightly from those employed in traditional site-built construction. With a focus on energy- and cost-efficient modular construction, this research presents the development of a simulation, measurement, and optimization system for energy consumption in the manufacturing process of modular construction. The system is based on Lean Six Sigma principles and loosely coupled system operation to identify non-value-adding tasks and possible causes of low energy efficiency. The proposed system also includes visualization functions for demonstrating energy consumption in modular construction. The benefits of implementing this system include reduced energy consumption and production cost in lean modular construction and increased profit. In addition, the visualization functions provide detailed information about energy efficiency and operational flexibility in modular construction. A case study is presented to validate the reliability of the system.
Gross, Kennen; Brenner, Jeffrey C; Truchil, Aaron; Post, Ernest M; Riley, Amy Henderson
2013-01-01
Developing data-driven local solutions to address rising health care costs requires valid and reliable local data. Traditionally, local public health agencies have relied on birth, death, and specific disease registry data to guide health care planning, but these data sets provide neither health information across the lifespan nor information on local health care utilization patterns and costs. Insurance claims data collected by local hospitals for administrative purposes can be used to create valuable population health data sets. The Camden Coalition of Healthcare Providers partnered with the 3 health systems providing emergency and inpatient care within Camden, New Jersey, to create a local population all-payer hospital claims data set. The combined claims data provide unique insights into the health status, health care utilization patterns, and hospital costs on the population level. The cross-systems data set allows for a better understanding of the impact of high utilizers on a community-level health care system. This article presents an introduction to the methods used to develop Camden's hospital claims data set, as well as results showing the population health insights obtained from this unique data set.
Pruna, Raquel; Palacio, Francisco; Baraket, Abdoullatif; Zine, Nadia; Streklas, Angelos; Bausells, Joan; Errachid, Abdelhamid; López, Manel
2018-02-15
Miniaturizing potentiostats while keeping their cost low and preserving full measurement characteristics (e.g. bandwidth, determination of the capacitive/inductive contribution to the sensor's impedance, and parallel screening) is still an unresolved challenge in bioelectronics. In this work, the combination of simple analogue circuitry with powerful microcontrollers and a digital filter implementation is presented as an alternative to the complex and incomplete architectures reported in the literature. A low-cost acquisition electronic system, fully integrated with a biosensor platform containing eight gold working microelectrodes and integrated reference and counter electrodes, was developed and validated. The manufacturing cost of the prototype was kept below 300 USD. The performance of the proposed device was benchmarked against a commercial impedance analyzer through the electrochemical analysis of a highly sensitive biosensor for the detection of tumor necrosis factor α (TNF-α) within the randomly chosen range of 266 pg/mL to 666 ng/mL in physiological medium (PBS). A strong correlation between the outputs of both devices was found in a critical range of frequencies (1-10 Hz), and several TNF-α cytokine concentrations were properly discriminated. These results are very promising for the development of low-cost, portable, and miniaturized electrochemical systems for point-of-care and environmental diagnosis. Copyright © 2017 Elsevier B.V. All rights reserved.
Periodic component analysis as a spatial filter for SSVEP-based brain-computer interface.
Kiran Kumar, G R; Reddy, M Ramasubba
2018-06-08
Traditional spatial filters used for steady-state visual evoked potential (SSVEP) extraction, such as minimum energy combination (MEC), require estimation of the background electroencephalogram (EEG) noise components. Even though this leads to improved performance in low signal-to-noise ratio (SNR) conditions, the additional computational cost makes such algorithms slow compared with standard detection methods like canonical correlation analysis (CCA). In this paper, periodic component analysis (πCA) is presented as an alternative spatial filtering approach that extracts the SSVEP component effectively without extensive modelling of the noise. πCA can separate out components corresponding to a given frequency of interest from the background EEG by capturing temporal information, and it does not generalize SSVEP based on rigid templates. Data from ten test subjects were used to evaluate the proposed method, and the results demonstrate that periodic component analysis acts as a reliable spatial filter for SSVEP extraction. Statistical tests were performed to validate the results. The experiments show that πCA provides a significant improvement in accuracy compared with standard CCA and MEC in low SNR conditions, and detection accuracy on par with that of MEC at a lower computational cost. Hence, πCA is a reliable and efficient alternative detection algorithm for SSVEP-based brain-computer interfaces (BCIs). Copyright © 2018. Published by Elsevier B.V.
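The standard CCA detector that this paper uses as a baseline can be sketched as follows: the largest canonical correlation between the multichannel EEG and a sine/cosine reference set is computed per candidate frequency, and the frequency with the highest correlation wins. The simulated EEG, channel count, and frequencies below are assumptions for illustration.

```python
import numpy as np

def cca_max_corr(X, Y):
    """Largest canonical correlation between X (samples x channels) and a
    reference set Y (samples x references), via QR + SVD."""
    Qx, _ = np.linalg.qr(X - X.mean(axis=0))
    Qy, _ = np.linalg.qr(Y - Y.mean(axis=0))
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0]

fs, duration = 250, 2.0
t = np.arange(int(fs * duration)) / fs
rng = np.random.default_rng(1)

# Simulated 4-channel EEG containing a 10 Hz SSVEP plus Gaussian noise.
eeg = np.outer(np.sin(2 * np.pi * 10 * t), rng.uniform(0.5, 1.0, 4))
eeg += 0.5 * rng.standard_normal(eeg.shape)

def ref(f):
    """Sine/cosine reference at frequency f (fundamental only, for brevity)."""
    return np.column_stack([np.sin(2 * np.pi * f * t),
                            np.cos(2 * np.pi * f * t)])

scores = {f: cca_max_corr(eeg, ref(f)) for f in (8, 10, 12)}
print(max(scores, key=scores.get))  # 10
```

Real SSVEP decoders usually add harmonics to the reference set; πCA differs in that it finds the spatial filter from the data's own periodic structure rather than correlating against templates.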
Efficient scatter model for simulation of ultrasound images from computed tomography data
NASA Astrophysics Data System (ADS)
D'Amato, J. P.; Lo Vercio, L.; Rubi, P.; Fernandez Vera, E.; Barbuzza, R.; Del Fresno, M.; Larrabide, I.
2015-12-01
Background and motivation: Real-time ultrasound simulation refers to the process of computationally creating fully synthetic ultrasound images instantly. Due to the high value of specialized low-cost training for healthcare professionals, there is growing interest in the use of this technology and in the development of high-fidelity systems that simulate the acquisition of echographic images. The objective is to create an efficient and reproducible simulator that can run on either notebooks or desktops using low-cost devices. Materials and methods: We present an interactive ultrasound simulator based on CT data. The simulator is based on ray-casting and provides real-time interaction capabilities. The simulation of scattering that is coherent with the transducer position in real time is also introduced. Such noise is produced using a simplified model of multiplicative noise and convolution with point spread functions (PSFs) tailored for this purpose. Results: The computational efficiency of scattering-map generation was revised, with improved performance. This allowed more efficient simulation of coherent scattering in the synthetic echographic images while providing highly realistic results. We describe quality and performance metrics to validate these results; a performance of up to 55 fps was achieved. Conclusion: The proposed technique for real-time scattering modeling provides realistic yet computationally efficient scatter distributions. The error between the original image and the simulated scattering image was compared for the proposed method and the state of the art, showing negligible differences in its distribution.
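The simplified scatter model described above (multiplicative noise followed by convolution with a PSF) can be sketched in a few lines; the Rayleigh noise distribution and the Gaussian PSF here are illustrative assumptions, not the paper's exact choices.

```python
import numpy as np
from scipy.signal import convolve2d

rng = np.random.default_rng(0)

# Echogenicity map derived (in the real system) from CT intensities;
# here, a uniform background with a brighter square inclusion.
echogenicity = np.ones((128, 128))
echogenicity[32:96, 32:96] = 2.0

# Multiplicative scatter: per-pixel Rayleigh noise scales the echogenicity.
noise = rng.rayleigh(scale=1.0, size=echogenicity.shape)
scattered = echogenicity * noise

# Convolve with an assumed separable Gaussian PSF to mimic the imaging system.
ax = np.arange(-3, 4)
g = np.exp(-ax**2 / 2.0)
g /= g.sum()
psf = np.outer(g, g)
image = convolve2d(scattered, psf, mode="same", boundary="symm")
print(image.shape)  # (128, 128)
```

In the paper's real-time setting the scatter map must also stay coherent as the transducer moves, which is the harder part; this sketch only shows the per-frame noise model.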
Lince-Deroche, Naomi; Phiri, Jane; Michelow, Pam; Smith, Jennifer S; Firnhaber, Cindy
2015-01-01
South Africa has high rates of HIV and HPV and high incidence and mortality from cervical cancer. However, cervical cancer is largely preventable when early screening and treatment are available. We estimate the costs and cost-effectiveness of conventional cytology (Pap), visual inspection with acetic acid (VIA) and HPV DNA testing for detecting cases of CIN2+ among HIV-infected women currently taking antiretroviral treatment at a public HIV clinic in Johannesburg, South Africa. Method effectiveness was derived from a validation study completed at the clinic. Costs were estimated from the provider perspective using micro-costing between June 2013-April 2014. Capital costs were annualized using a discount rate of 3%. Two different service volume scenarios were considered. Threshold analysis was used to explore the potential for reducing the cost of HPV DNA testing. VIA was least costly in both scenarios. In the higher volume scenario, the average cost per procedure was US$ 3.67 for VIA, US$ 8.17 for Pap and US$ 54.34 for HPV DNA. Colposcopic biopsies cost on average US$ 67.71 per procedure. VIA was least sensitive but most cost-effective at US$ 17.05 per true CIN2+ case detected. The cost per case detected for Pap testing was US$ 130.63 using a conventional definition for positive results and US$ 187.52 using a more conservative definition. HPV DNA testing was US$ 320.09 per case detected. Colposcopic biopsy costs largely drove the total and per case costs. A 71% reduction in HPV DNA screening costs would make it competitive with the conservative Pap definition. Women need access to services which meet their needs and address the burden of cervical dysplasia and cancer in this region. Although most cost-effective, VIA may require more frequent screening due to low sensitivity, an important consideration for an HIV-positive population with increased risk for disease progression.
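The cost-per-case-detected figures above follow from dividing total screening cost by the number of true positives found; a minimal sketch with assumed counts (the numbers below are hypothetical, not the study's data):

```python
def cost_per_case(unit_cost_usd, n_screened, true_positives):
    """Total screening cost divided by true cases detected."""
    return unit_cost_usd * n_screened / true_positives

# Hypothetical example: 1000 women screened at US$ 3.67 per procedure,
# 215 true CIN2+ cases detected.
print(round(cost_per_case(3.67, 1000, 215), 2))  # 17.07
```

A full analysis would also fold in confirmatory colposcopic biopsy costs and method sensitivity, which is why the study's Pap and HPV DNA figures are much higher per case than the per-procedure costs alone suggest.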
2013-01-01
Background Although patient satisfaction is a relevant outcome measure for health care providers, few satisfaction questionnaires have been generally available to physical therapists or have been validated in an Italian population for use in the outpatient setting. The aim of this study was to translate, culturally adapt, and validate the Italian version of the Physical Therapy Outpatient Satisfaction Survey (PTOPS). Methods The Italian version of the PTOPS (PTOPS-I) was developed through forward-backward translation, review, and field-testing a pre-final version. The reliability of the final questionnaire was measured by internal consistency and test-retest stability at 7 days. Factor analysis was also used to explore construct validity. Concurrent validity was measured by comparing PTOPS-I with a 5-point Likert-type scale measure assessing the Global Perceived Effect (GPE) of the treatment and with a Visual Analogue Scale (VAS). Results 354 outpatients completed the PTOPS-I, and 56 took the re-test. The internal consistency (Cronbach’s alpha) of the original domains (Enhancers, Detractors, Location, and Cost) was 0.758 for Enhancers, 0.847 for Detractors, 0.885 for Location, and 0.706 for Cost. The test-retest stability (Intra-class Correlation Coefficients) was 0.769 for Enhancers, 0.893 for Detractors, 0.862 for Location, and 0.862 for Cost. The factor analysis of the Italian version revealed a structure into four domains, named Depersonalization, Inaccessibility, Ambience, and Cost. Concurrent validity with GPE was significantly demonstrated for all domains except Inaccessibility. Irrelevant or non-significant correlations were observed with VAS. Conclusion The PTOPS-I showed good psychometric properties. Its use can be suggested for Italian-speaking outpatients who receive physical therapy. PMID:23560848
Casartelli, Nicola; Müller, Roland; Maffiuletti, Nicola A
2010-11-01
The aim of the present study was to verify the validity and reliability of the Myotest accelerometric system (Myotest SA, Sion, Switzerland) for the assessment of vertical jump height. Forty-four male basketball players (age range: 9-25 years) performed series of squat, countermovement and repeated jumps during 2 identical test sessions separated by 2-15 days. Flight height was simultaneously quantified with the Myotest system and validated photoelectric cells (Optojump). Two calculation methods were used to estimate the jump height from Myotest recordings: flight time (Myotest-T) and vertical takeoff velocity (Myotest-V). Concurrent validity was investigated by comparing Myotest-T and Myotest-V to the criterion method (Optojump), and test-retest reliability was also examined. As regards validity, Myotest-T overestimated jumping height compared to Optojump (p < 0.001) with a systematic bias of approximately 7 cm, even though random errors were low (2.7 cm) and intraclass correlation coefficients (ICCs) were high (>0.98), that is, excellent validity. Myotest-V overestimated jumping height compared to Optojump (p < 0.001), with high random errors (>12 cm), high limits-of-agreement ratios (>36%), and low ICCs (<0.75), that is, poor validity. As regards reliability, Myotest-T showed high ICCs (range: 0.92-0.96), whereas Myotest-V showed low ICCs (range: 0.56-0.89) and high random errors (>9 cm). In conclusion, Myotest-T is a valid and reliable method for the assessment of vertical jump height, and its use is legitimate for field-based evaluations, whereas Myotest-V is neither valid nor reliable.
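The flight-time approach (Myotest-T and Optojump alike) rests on simple projectile kinematics: assuming takeoff and landing occur at the same height, jump height is h = g·t²/8, where t is the measured flight time. A minimal sketch:

```python
G = 9.81  # gravitational acceleration, m/s^2

def jump_height_from_flight_time(t_flight_s):
    """Jump height (m) from flight time, h = g * t^2 / 8, assuming
    equal takeoff and landing heights."""
    return G * t_flight_s**2 / 8.0

# A 0.5 s flight time corresponds to roughly 30.7 cm.
print(round(100 * jump_height_from_flight_time(0.5), 1))  # 30.7
```

The equal-height assumption is also why flight-time devices can show a systematic bias like the ~7 cm reported above: tucking the legs on landing inflates flight time and hence the estimated height.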
Hundalani, Shilpa G; Richards-Kortum, Rebecca; Oden, Maria; Kawaza, Kondwani; Gest, Alfred; Molyneux, Elizabeth
2015-01-01
Background Low-cost bubble continuous positive airway pressure (bCPAP) systems have been shown to improve survival in neonates with respiratory distress, in developing countries including Malawi. District hospitals in Malawi implementing CPAP requested simple and reliable guidelines to enable healthcare workers with basic skills and minimal training to determine when treatment with CPAP is necessary. We developed and validated TRY (T: Tone is good, R: Respiratory Distress and Y=Yes) CPAP, a simple algorithm to identify neonates with respiratory distress who would benefit from CPAP. Objective To validate the TRY CPAP algorithm for neonates with respiratory distress in a low-resource setting. Methods We constructed an algorithm using a combination of vital signs, tone and birth weight to determine the need for CPAP in neonates with respiratory distress. Neonates admitted to the neonatal ward of Queen Elizabeth Central Hospital, in Blantyre, Malawi, were assessed in a prospective, cross-sectional study. Nurses and paediatricians-in-training assessed neonates to determine whether they required CPAP using the TRY CPAP algorithm. To establish the accuracy of the TRY CPAP algorithm in evaluating the need for CPAP, their assessment was compared with the decision of a neonatologist blinded to the TRY CPAP algorithm findings. Results 325 neonates were evaluated over a 2-month period; 13% were deemed to require CPAP by the neonatologist. The inter-rater reliability with the algorithm was 0.90 for nurses and 0.97 for paediatricians-in-training using the neonatologist's assessment as the reference standard. Conclusions The TRY CPAP algorithm has the potential to be a simple and reliable tool to assist nurses and clinicians in identifying neonates who require treatment with CPAP in low-resource settings. PMID:25877290
Ratanawongsa, Neda; Karter, Andrew J.; Quan, Judy; Parker, Melissa M.; Handley, Margaret; Sarkar, Urmimala; Schmittdiel, Julie A.; Schillinger, Dean
2015-01-01
Background With the expansion of Medicaid and low-cost health insurance plans among diverse patient populations, objective measures of medication adherence using pharmacy claims could advance clinical care and translational research for safety net care. However, safety net patients may experience fluctuating prescription drug coverage, affecting the performance of adherence measures. Objective To evaluate the performance of continuous medication gap (CMG) for diverse, low-income managed care members with diabetes. Methods We conducted this cross-sectional analysis using administrative and clinical data for 680 members eligible for a self-management support trial at a non-profit, government-sponsored managed care plan. We applied CMG methodology to cardiometabolic medication claims for English-, Cantonese-, or Spanish-speaking members with diabetes. We examined inclusiveness (the proportion with calculable CMG) and selectivity (sociodemographic and medical differences from members without CMG). To examine validity, we examined unadjusted associations of suboptimal adherence (CMG>20%) with suboptimal cardiometabolic control. Results 429 members (63%) had calculable CMG. Compared to members without CMG, members with CMG were younger; more likely employed; and had poorer glycemic control, but better blood pressure and lipid control. Suboptimal adherence occurred more frequently among members with poor cardiometabolic control than among members with optimal control (28% vs. 12%, p=0.02). Conclusions CMG demonstrated acceptable inclusiveness and validity in a diverse, low-income safety net population, comparable to its performance in studies among other insured populations. CMG may provide a useful tool to measure adherence among increasingly diverse Medicaid populations, complemented by other strategies to reach those not captured by CMG. Trial Registration NCT00683020 PMID:26233541
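CMG is essentially the share of days in the observation window left uncovered by dispensed supply, with CMG > 20% flagged as suboptimal adherence. A minimal sketch under simplifying assumptions (single drug, surplus supply carried forward; real claims-based implementations handle overlapping fills, switching and censoring in payer-specific ways):

```python
from datetime import date, timedelta

def continuous_medication_gap(fills, end_of_observation):
    """CMG: fraction of days between the first fill and the end of
    observation not covered by dispensed supply.  `fills` is a list
    of (fill_date, days_supplied) tuples."""
    fills = sorted(fills)
    start = fills[0][0]
    total_days = (end_of_observation - start).days
    covered_until = start
    gap_days = 0
    for fill_date, days_supplied in fills:
        if fill_date > covered_until:
            # Days with no supply on hand before this refill.
            gap_days += (fill_date - covered_until).days
            covered_until = fill_date
        covered_until += timedelta(days=days_supplied)
    if covered_until < end_of_observation:
        gap_days += (end_of_observation - covered_until).days
    return gap_days / total_days

# Hypothetical member: 30-day fills on Jan 1 and Feb 10, observed to Apr 1.
cmg = continuous_medication_gap(
    [(date(2024, 1, 1), 30), (date(2024, 2, 10), 30)], date(2024, 4, 1))
suboptimal = cmg > 0.20  # 31 gap days / 91 observed -> about 0.34, suboptimal
```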
Rogier, Eric; Plucinski, Mateusz; Lucchi, Naomi; Mace, Kimberly; Chang, Michelle; Lemoine, Jean Frantz; Candrinho, Baltazar; Colborn, James; Dimbu, Rafael; Fortes, Filomeno; Udhayakumar, Venkatachalam; Barnwell, John
2017-01-01
Detection of histidine-rich protein 2 (HRP2) from the malaria parasite Plasmodium falciparum provides evidence for active or recent infection, and is utilized for both diagnostic and surveillance purposes, but current laboratory immunoassays for HRP2 are hindered by low sensitivities and high costs. Here we present a new HRP2 immunoassay based on antigen capture through a bead-based system capable of detecting HRP2 at sub-picogram levels. The assay is highly specific and cost-effective, allowing fast processing and screening of large numbers of samples. We utilized the assay to assess results of HRP2-based rapid diagnostic tests (RDTs) in different P. falciparum transmission settings, generating estimates for true performance in the field. Through this method of external validation, HRP2 RDTs were found to perform well in the high-endemic areas of Mozambique and Angola with 86.4% and 73.9% of persons with HRP2 in their blood testing positive by RDTs, respectively, and false-positive rates of 4.3% and 0.5%. However, in the low-endemic setting of Haiti, only 14.5% of persons found to be HRP2 positive by the bead assay were RDT positive. Additionally, 62.5% of Haitians showing a positive RDT test had no detectable HRP2 by the bead assay, likely indicating that these were false positive tests. In addition to RDT validation, HRP2 biomass was assessed for the populations in these different settings, and may provide an additional metric by which to estimate P. falciparum transmission intensity and measure the impact of interventions. PMID:28192523
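The field estimates above reduce to two proportions computed against the bead assay as reference: sensitivity (RDT-positive among antigen-positives) and the false-positive rate (RDT-positive among antigen-negatives). A small sketch with hypothetical counts shaped like the Mozambique figures:

```python
def rdt_performance(tp, fn, fp, tn):
    """RDT performance against a reference antigen assay.
    tp/fn: RDT-positive/-negative among antigen-positives;
    fp/tn: RDT-positive/-negative among antigen-negatives."""
    sensitivity = tp / (tp + fn)
    false_positive_rate = fp / (fp + tn)
    return sensitivity, false_positive_rate

# Hypothetical counts giving ~86% sensitivity and ~4% false positives:
sens, fpr = rdt_performance(tp=864, fn=136, fp=43, tn=957)
```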
NASA Astrophysics Data System (ADS)
Mullan, Donal; Chen, Jie; Zhang, Xunchang John
2016-02-01
Statistical downscaling (SD) methods have become a popular, low-cost and accessible means of bridging the gap between the coarse spatial resolution at which climate models output climate scenarios and the finer spatial scale at which impact modellers require these scenarios, with various SD techniques used for a wide range of applications across the world. This paper compares the Generator for Point Climate Change (GPCC) model and the Statistical DownScaling Model (SDSM)—two contrasting SD methods—in terms of their ability to generate precipitation series under non-stationary conditions across ten contrasting global climates. The mean, maximum and a selection of distribution statistics as well as the cumulative frequencies of dry and wet spells for four different temporal resolutions were compared between the models and the observed series for a validation period. Results indicate that both methods can generate daily precipitation series that generally closely mirror observed series for a wide range of non-stationary climates. However, GPCC tends to overestimate higher precipitation amounts, whilst SDSM tends to underestimate these. This implies that GPCC is more likely to overestimate the effects of precipitation on a given impact sector, whilst SDSM is likely to underestimate the effects. GPCC performs better than SDSM in reproducing wet and dry day frequency, which is a key advantage for many impact sectors. Overall, the mixed performance of the two methods illustrates the importance of users performing a thorough validation in order to determine the influence of simulated precipitation on their chosen impact sector.
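One of the compared statistics, wet/dry-day frequency, is simple to compute from a daily precipitation series. A minimal sketch (the 1.0 mm wet-day threshold is an assumption; downscaling studies use various cutoffs):

```python
def wet_dry_day_frequencies(precip_mm, wet_threshold=1.0):
    """Fractions of wet and dry days in a daily precipitation series."""
    n = len(precip_mm)
    wet = sum(1 for p in precip_mm if p >= wet_threshold)
    return wet / n, (n - wet) / n

wet, dry = wet_dry_day_frequencies([0.0, 0.2, 5.1, 12.0, 0.0])
# 2 of 5 days at or above 1 mm: wet = 0.4, dry = 0.6
```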
Nunes, Chalder Nogueira; Pauluk, Lucas Ely; Dos Anjos, Vanessa Egéa; Lopes, Mauro Chierici; Quináia, Sueli Pércio
2015-08-01
Contaminants of emerging concern (CECs) are chemicals, including pharmaceutical and personal care products, not commonly monitored in the aquatic environment. Pharmaceuticals are now recognized as important environmental contaminants. Chromatographic methods, which require expensive equipment and complicated sample pretreatment, are typically used for the detection of CECs in natural water. Thus, in this study we proposed a simple, fast, and low-cost voltammetric method as a screening tool for the determination of CECs in natural water prior to chromatography. A case study was conducted with alprazolam (a benzodiazepine). The method was optimized and validated in-house. The limit of quantification was 0.4 μg L(-1) for a 120 s preconcentration time. The recoveries ranged from 93 to 120% in accuracy tests. A further aim was to determine for the first time the occurrence of alprazolam in Brazilian river water and to evaluate its potential use as a marker of contamination by wastewater.
Manlig, Erika; Wahlberg, Per
2017-01-01
Sodium bisulphite treatment of DNA combined with next-generation sequencing (NGS) is a powerful combination for the interrogation of genome-wide DNA methylation profiles. Library preparation for whole-genome bisulphite sequencing (WGBS) is challenging due to side effects of the bisulphite treatment, which leads to extensive DNA damage. Recently, a new generation of methods for bisulphite sequencing library preparation has been devised. They are based on initial bisulphite treatment of the DNA, followed by adaptor tagging of single-stranded DNA fragments, and enable WGBS using low quantities of input DNA. In this study, we present a novel approach for quick and cost-effective WGBS library preparation that is based on splinted adaptor tagging (SPLAT) of bisulphite-converted single-stranded DNA. Moreover, we validate SPLAT against three commercially available WGBS library preparation techniques, two of which are based on bisulphite treatment prior to adaptor tagging and one of which is a conventional WGBS method. PMID:27899585
Determination of Protein Content by NIR Spectroscopy in Protein Powder Mix Products.
Ingle, Prashant D; Christian, Roney; Purohit, Piyush; Zarraga, Veronica; Handley, Erica; Freel, Keith; Abdo, Saleem
2016-01-01
Protein is a principal component in commonly used dietary supplements and health food products. The analysis of these products, in their consumer package form, is of critical importance for the purpose of ensuring quality and supporting label claims. A rapid test method was developed using near-infrared (NIR) spectroscopy as a complement to current protein determination by the Dumas combustion method. The NIR method was found to be a rapid, low-cost, and green (no use of chemicals and reagents) complementary technique. The protein powder samples analyzed in this study were in the range of 22-90% protein. The samples were prepared as mixtures of soy protein, whey protein, and silicon dioxide ingredients, which are common in commercially sold protein powder drink-mix products in the market. An NIR regression model was developed with 17 samples within the constituent range and was validated with 20 independent samples of known protein levels (85-88%). The results show that the NIR method is capable of predicting the protein content with a bias of ±2% and a maximum bias of 3% between NIR and the external Dumas method.
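The reported validation reduces to mean and maximum bias between NIR predictions and the Dumas reference. A minimal sketch with hypothetical % protein values (not the study's data):

```python
def validation_bias(predicted, reference):
    """Mean bias and maximum absolute bias between a rapid method's
    predictions and a reference method, in the units of the inputs."""
    diffs = [p - r for p, r in zip(predicted, reference)]
    mean_bias = sum(diffs) / len(diffs)
    max_bias = max(abs(d) for d in diffs)
    return mean_bias, max_bias

# Hypothetical NIR vs. Dumas protein results (% protein):
mean_b, max_b = validation_bias([86.0, 87.5, 85.0, 88.0],
                                [87.0, 86.5, 86.0, 85.0])
# diffs are -1.0, +1.0, -1.0, +3.0 -> mean bias 0.5, max bias 3.0
```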
The Mt. Hood challenge: cross-testing two diabetes simulation models.
Brown, J B; Palmer, A J; Bisgaard, P; Chan, W; Pedula, K; Russell, A
2000-11-01
Starting from identical patients with type 2 diabetes, we compared the 20-year predictions of two computer simulation models, a 1998 version of the IMIB model and version 2.17 of the Global Diabetes Model (GDM). Primary measures of outcome were 20-year cumulative rates of: survival, first (incident) acute myocardial infarction (AMI), first stroke, proliferative diabetic retinopathy (PDR), macro-albuminuria (gross proteinuria, or GPR), and amputation. Standardized test patients were newly diagnosed males aged 45 or 75, with high and low levels of glycated hemoglobin (HbA(1c)), systolic blood pressure (SBP), and serum lipids. Both models generated realistic results and appropriate responses to changes in risk factors. Compared with the GDM, the IMIB model predicted much higher rates of mortality and AMI, and fewer strokes. These differences can be explained by differences in model architecture (Markov vs. microsimulation), different evidence bases for cardiovascular prediction (Framingham Heart Study cohort vs. Kaiser Permanente patients), and isolated versus interdependent prediction of cardiovascular events. Compared with IMIB, GDM predicted much higher lifetime costs, because of lower mortality and the use of a different costing method. It is feasible to cross-validate and explicate dissimilar diabetes simulation models using standardized patients. The wide differences in the model results that we observed demonstrate the need for cross-validation. We propose to hold a second 'Mt Hood Challenge' in 2001 and invite all diabetes modelers to attend.
Xing, Jida; Chen, Jie
2015-06-23
In therapeutic ultrasound applications, accurate ultrasound output intensities are crucial because the physiological effects of therapeutic ultrasound are very sensitive to the intensity and duration of these applications. Although radiation force balance is a benchmark technique for measuring ultrasound intensity and power, it is costly, difficult to operate, and compromised by noise vibration. To overcome these limitations, the development of a low-cost, easy-to-operate, and vibration-resistant alternative device is necessary for rapid ultrasound intensity measurement. Therefore, we proposed and validated a novel two-layer thermoacoustic sensor using an artificial neural network technique to accurately measure low ultrasound intensities between 30 and 120 mW/cm2. The first layer of the sensor design is a cylindrical absorber made of plexiglass, followed by a second layer composed of polyurethane rubber with a high attenuation coefficient to absorb extra ultrasound energy. The sensor determined ultrasound intensities according to a temperature elevation induced by heat converted from incident acoustic energy. Compared with our previous one-layer sensor design, the new two-layer sensor enhanced the ultrasound absorption efficiency to provide more rapid and reliable measurements. Using a three-dimensional model in the K-wave toolbox, our simulation of the ultrasound propagation process demonstrated that the two-layer design is more efficient than the single-layer design. We also integrated an artificial neural network algorithm to compensate for the large measurement offset. After obtaining multiple parameters of the sensor characteristics through calibration, the artificial neural network is built to correct temperature drifts and increase the reliability of our thermoacoustic measurements through iterative training of about ten seconds. The performance of the artificial neural network method was validated through a series of experiments.
Compared with our previous design, the new design reduced the sensing time from 20 s to 12 s and the sensor's average error from 3.97 mW/cm2 to 1.31 mW/cm2. PMID:26110412
Micro-Costing Quantity Data Collection Methods
Frick, Kevin D.
2009-01-01
Background Micro-costing studies collect detailed data on resources utilized and the value of those resources. Such studies are useful for estimating the cost of new technologies or new community-based interventions, for producing estimates in studies that include non-market goods, and for studying within-procedure cost variation. Objectives The objectives of this paper were to (1) describe basic micro-costing methods, focusing on quantity data collection; and (2) suggest a research agenda to improve methods in, and the interpretation of, micro-costing. Research Design Examples in the published literature were used to illustrate steps in the methods of gathering data (primarily quantity data) for a micro-costing study. Results Quantity data collection methods that were illustrated in the literature include the use of (1) administrative databases at single facilities, (2) insurer administrative data, (3) forms applied across multiple settings, (4) an expert panel, (5) surveys or interviews of one or more types of providers, (6) review of patient charts, (7) direct observation, (8) personal digital assistants, (9) program operation logs, and (10) diary data. Conclusions Future micro-costing studies are likely to improve if research is done to compare the validity and cost of different data collection methods; if a critical review is conducted of studies done to date; and if the combined results of the first two steps are used to develop guidelines that address common limitations, critical judgment points, and decisions that can reduce limitations and improve the quality of studies. PMID:19536026
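Whatever the quantity-collection method, the micro-costing total itself is the sum of resource quantities times unit values. A minimal sketch with hypothetical resources and prices:

```python
# Hypothetical quantities and unit values for one procedure.
resources = {
    "nurse time (hours)":     (1.5, 40.0),   # (quantity, unit value)
    "physician time (hours)": (0.5, 120.0),
    "disposable kit (units)": (1.0, 35.0),
    "facility time (hours)":  (2.0, 55.0),
}

def micro_cost(resources):
    """Total micro-cost: sum over resources of quantity * unit value."""
    return sum(qty * unit_value for qty, unit_value in resources.values())

total = micro_cost(resources)  # 60 + 60 + 35 + 110 = 265.0
```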
Cadieux, Geneviève; Tamblyn, Robyn; Buckeridge, David L; Dendukuri, Nandini
2017-08-01
Valid measurement of outcomes such as disease prevalence using health care utilization data is fundamental to the implementation of a "learning health system." Definitions of such outcomes can be complex, based on multiple diagnostic codes. The literature on validating such data demonstrates a lack of awareness of the need for a stratified sampling design and corresponding statistical methods. We propose a method for validating the measurement of diagnostic groups that have: (1) different prevalences of diagnostic codes within the group; and (2) low prevalence. We describe an estimation method whereby: (1) low-prevalence diagnostic codes are oversampled, and the positive predictive value (PPV) of the diagnostic group is estimated as a weighted average of the PPV of each diagnostic code; and (2) claims that fall within a low-prevalence diagnostic group are oversampled relative to claims that are not, and bias-adjusted estimators of sensitivity and specificity are generated. We illustrate our proposed method using an example from population health surveillance in which diagnostic groups are applied to physician claims to identify cases of acute respiratory illness. Failure to account for the prevalence of each diagnostic code within a diagnostic group leads to the underestimation of the PPV, because low-prevalence diagnostic codes are more likely to be false positives. Failure to adjust for oversampling of claims that fall within the low-prevalence diagnostic group relative to those that do not leads to the overestimation of sensitivity and underestimation of specificity.
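The paper's first correction can be sketched directly: the group PPV is the prevalence-weighted average of per-code PPVs, not the raw proportion of validated charts, which would overweight oversampled rare codes. The code names and numbers below are hypothetical:

```python
def group_ppv(code_stats):
    """PPV of a diagnostic group as a prevalence-weighted average of
    per-code PPVs.  `code_stats` maps each code to
    (claims_with_code, true_positives_in_sample, charts_sampled)."""
    total_claims = sum(n for n, _, _ in code_stats.values())
    ppv = 0.0
    for n_claims, true_pos, sampled in code_stats.values():
        weight = n_claims / total_claims      # code prevalence in the group
        ppv += weight * (true_pos / sampled)  # per-code PPV estimate
    return ppv

# Common code validated on 100 charts, rare code oversampled on 50:
stats = {"codeA": (900, 90, 100), "codeB": (100, 25, 50)}
ppv = group_ppv(stats)  # 0.9*0.90 + 0.1*0.50 = 0.86
```

Pooling the sampled charts naively would give (90 + 25)/150 ≈ 0.77, understating the PPV because the rare, error-prone code is overrepresented in the sample — the bias the authors describe.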
Naked-eye fingerprinting of single nucleotide polymorphisms on psoriasis patients
NASA Astrophysics Data System (ADS)
Valentini, Paola; Marsella, Alessandra; Tarantino, Paolo; Mauro, Salvatore; Baglietto, Silvia; Congedo, Maurizio; Paolo Pompa, Pier
2016-05-01
We report a low-cost test, based on gold nanoparticles, for the colorimetric (naked-eye) fingerprinting of a panel of single nucleotide polymorphisms (SNPs), relevant for the personalized therapy of psoriasis. Such pharmacogenomic tests are not routinely performed on psoriasis patients, due to the high cost of standard technologies. We demonstrated high sensitivity and specificity of our colorimetric test by validating it on a cohort of 30 patients, through a double-blind comparison with two state-of-the-art instrumental techniques, namely reverse dot blotting and sequencing, finding 100% agreement. This test offers high parallelization capabilities and can be easily generalized to other SNPs of clinical relevance, finding broad utility in diagnostics and pharmacogenomics. Electronic supplementary information (ESI) available. See DOI: 10.1039/c6nr02200f
Direct injection GC method for measuring light hydrocarbon emissions from cooling-tower water.
Lee, Max M; Logan, Tim D; Sun, Kefu; Hurley, N Spencer; Swatloski, Robert A; Gluck, Steve J
2003-12-15
A Direct Injection GC method for quantifying low levels of light hydrocarbons (C6 and below) in cooling water has been developed. It is intended to overcome the limitations of the currently available technology. The principle of this method is to use a stripper column in a GC to strip water from the hydrocarbons prior to entering the separation column. No sample preparation is required since the water sample is introduced directly into the GC. Method validation indicates that the Direct Injection GC method offers approximately 15 min analysis time with excellent precision and recovery. The calibration studies with ethylene and propylene show that both liquid and gas standards are suitable for routine calibration and calibration verification. The sampling method using zero-headspace traditional VOA (Volatile Organic Analysis) vials and a sample chiller has also been validated. It is apparent that the sampling method is sufficient to minimize the potential for losses of light hydrocarbons, and samples can be held at 4 degrees C for up to 7 days with more than 93% recovery. The Direct Injection GC method also offers <1 ppb (w/v) level method detection limits for ethylene, propylene, and benzene. It is superior to the existing El Paso stripper method. In addition to lower detection limits for ethylene and propylene, the Direct Injection GC method quantifies individual light hydrocarbons in cooling water, provides better recoveries, and requires less maintenance and lower setup costs. Since the instrumentation and supplies are readily available, this technique could easily be established as a standard or alternative method for routine emission monitoring and leak detection of light hydrocarbons in cooling-tower water.
Using neuromorphic optical sensors for spacecraft absolute and relative navigation
NASA Astrophysics Data System (ADS)
Shake, Christopher M.
We develop a novel attitude determination system (ADS) for use on nano spacecraft using neuromorphic optical sensors. The ADS intends to support nano-satellite operations by providing low-cost, low-mass, low-volume, low-power, and redundant attitude determination capabilities with quick and straightforward onboard programmability for real-time spacecraft operations. The ADS is experimentally validated with commercial-off-the-shelf optical devices that perform sensing and image processing on the same circuit board and are biologically inspired by insects' vision systems, which measure optical flow while navigating in the environment. The firmware on the devices is modified to both perform the additional biologically inspired task of tracking objects and communicate with a PC/104 form-factor embedded computer running Real Time Application Interface Linux used on a spacecraft simulator. Algorithms are developed for operations using optical flow, point tracking, and hybrid modes with the sensors, and the performance of the system in all three modes is assessed using a spacecraft simulator in the Advanced Autonomous Multiple Spacecraft (ADAMUS) laboratory at Rensselaer. An existing relative state determination method is identified to be combined with the novel ADS to create a self-contained navigation system for nano spacecraft. The performance of the method is assessed in simulation and found not to match the results reported by its authors when using only the conditions and equations already published. An improved target inertia tensor method is proposed as an update to the existing relative state method; although it did not perform as expected, it is presented for others to build upon.
ERIC Educational Resources Information Center
Kash, Bita A.; Hawes, Catherine; Phillips, Charles D.
2007-01-01
Purpose: This study had two goals: (a) to assess the validity of the Online Survey Certification and Reporting (OSCAR) staffing data by comparing them to staffing measures from audited Medicaid Cost Reports and (b) to identify systematic differences between facilities that over-report or underreport staffing in the OSCAR. Design and Methods: We…
Wootton, Richard; Vladzymyrskyy, Anton; Zolfo, Maria; Bonnardot, Laurent
2011-01-01
Telemedicine has been used for many years to support doctors in the developing world. Several networks provide services in different settings and in different ways. However, to draw conclusions about which telemedicine networks are successful requires a method of evaluating them. No general consensus or validated framework exists for this purpose. To define a basic method of performance measurement that can be used to improve and compare teleconsultation networks; to employ the proposed framework in an evaluation of three existing networks; to make recommendations about the future implementation and follow-up of such networks. Analysis based on the experience of three telemedicine networks (in operation for 7-10 years) that provide services to doctors in low-resource settings and which employ the same basic design. Although there are many possible indicators and metrics that might be relevant, five measures for each of the three user groups appear to be sufficient for the proposed framework. In addition, from the societal perspective, information about clinical- and cost-effectiveness is also required. The proposed performance measurement framework was applied to three mature telemedicine networks. Despite their differences in terms of activity, size and objectives, their performance in certain respects is very similar. For example, the time to first reply from an expert is about 24 hours for each network. Although all three networks had systems in place to collect data from the user perspective, none of them collected information about the coordinator's time required or about ease of system usage. They had only limited information about quality and cost. Measuring the performance of a telemedicine network is essential in understanding whether the network is working as intended and what effect it is having. 
Based on long-term field experience, the suggested framework is a practical tool that will permit organisations to assess the performance of their own networks and to improve them by comparison with others. All telemedicine systems should provide information about setup and running costs because cost-effectiveness is crucial for sustainability.
Novel Composites for Wing and Fuselage Applications. Task 1; Novel Wing Design Concepts
NASA Technical Reports Server (NTRS)
Suarez, J. A.; Buttitta, C.; Flanagan, G.; DeSilva, T.; Egensteiner, W.; Bruno, J.; Mahon, J.; Rutkowski, C.; Collins, R.; Fidnarick, R.;
1996-01-01
Design trade studies were conducted to arrive at advanced wing designs that integrated new material forms with innovative structural concepts and cost-effective fabrication methods. A representative spar was selected for design, fabrication, and test to validate the predicted performance. Textile processes, such as knitting, weaving and stitching, were used to produce fiber preforms that were later fabricated into composite spars through epoxy Resin Transfer Molding (RTM), Resin Film Infusion (RFI), and consolidation of commingled thermoplastic and graphite tows. The target design ultimate strain level for these innovative structural design concepts was 6000 μin./in. The spars were subjected to four-point beam bending to validate their structural performance. The various material-form/processing combination Y-spars were rated for their structural efficiency and acquisition cost. The acquisition cost elements were material, tooling, and labor.
CMOS array design automation techniques. [metal oxide semiconductors
NASA Technical Reports Server (NTRS)
Ramondetta, P.; Feller, A.; Noto, R.; Lombardi, T.
1975-01-01
A low-cost, quick-turnaround technique for generating custom metal oxide semiconductor arrays using the standard cell approach was developed, implemented, tested and validated. Basic cell design topology and guidelines are defined based on an extensive analysis that includes circuit, layout, process, array topology and required performance considerations, particularly high circuit speed.
NASA Astrophysics Data System (ADS)
Ali, M.; Supriyatman; Saehana, S.
2018-03-01
Low-cost science experiment apparatus was successfully designed from recycled materials. The instruments were produced to explain the concepts of thermal expansion and of hydrostatic pressure in liquids. They were calibrated and validated, and then implemented in science learning.
Development and validation of a low-density SNP panel related to prolificacy in sheep
USDA-ARS?s Scientific Manuscript database
High-density SNP panels (e.g., 50,000 and 600,000 markers) have been used in exploratory population genetic studies with commercial and minor breeds of sheep. However, routine genetic diversity evaluations of large numbers of samples with large panels are in general cost-prohibitive for gene banks. ...
Assembly and Validation of a Colorimeter
ERIC Educational Resources Information Center
Hughes, Bill
2013-01-01
A low-cost and portable colorimeter kit has been designed and developed as an educational tool at Penn State University by Dr. Daniel Sykes for K-12 schools' integrated STEM learning. This scientific instrument allows students to learn how scientists utilize light as a means to study the chemical and physical properties of materials with an…
NASA Astrophysics Data System (ADS)
Liu, Chengcheng; Zhu, Jianguo; Wang, Youhua; Guo, Youguang; Lei, Gang; Liu, Xiaojing
2015-05-01
This paper proposes a low-cost double-rotor axial flux motor (DRAFM) with a low-cost soft magnetic composite (SMC) core and ferrite permanent magnets (PMs). The topology and operating principle of the DRAFM and design considerations for the best use of magnetic materials are presented. A 905 W, 4800 rpm DRAFM is designed to replace the high-cost NdFeB permanent magnet synchronous motor (PMSM) in a refrigerator compressor. By using the finite element method, the electromagnetic parameters and performance of the DRAFM operated under the field-oriented control scheme are calculated. Through the analysis, it is shown that SMC and ferrite PM materials can be good candidates for low-cost electric motor applications.
Hugo, Cherie; Isenring, Elisabeth; Miller, Michelle; Marshall, Skye
2018-05-01
Background: observational studies have shown that nutritional strategies to manage malnutrition may be cost-effective in aged care, but more robust economic data are needed to support and encourage translation to practice. Objective: the aim of this systematic review is to compare the cost-effectiveness of implementing nutrition interventions targeting malnutrition in aged care homes versus usual care. Setting: residential aged care homes. Methods: systematic literature review of studies published between January 2000 and August 2017 across 10 electronic databases. The Cochrane Risk of Bias tool and GRADE were used to evaluate the quality of the studies. Results: eight included studies (3,098 studies initially screened) reported on 11 intervention groups, evaluating the effect of modifications to the dining environment (n = 1), supplements (n = 5) and food-based interventions (n = 5). Interventions had a low cost of implementation (<£2.30/resident/day) and provided clinical improvement for a range of outcomes including weight, nutritional status and dietary intake. Supplements and food-based interventions further demonstrated a low cost per quality-adjusted life year or unit of physical function improvement. GRADE assessment revealed that the quality of the body of evidence that malnutrition interventions, whether environmental, supplement- or food-based, are cost-effective in aged care homes was low. Conclusions: this review suggests that supplement and food-based nutrition interventions in the aged care setting are clinically effective, have a low cost of implementation and may be cost-effective at improving clinical outcomes associated with malnutrition. More studies using well-defined frameworks for economic analysis, stronger study designs with improved quality, and validated malnutrition measures are needed to confirm and increase confidence in these findings.
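Cost-per-QALY figures of the kind summarized above come from the incremental cost-effectiveness ratio (ICER). A minimal sketch; the £2.30/resident/day figure echoes the review's cost ceiling, but the QALY gain and annual totals below are invented for illustration:

```python
def icer(cost_intervention, cost_usual, qaly_intervention, qaly_usual):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return (cost_intervention - cost_usual) / (qaly_intervention - qaly_usual)

# Hypothetical: a food-based intervention at ~GBP 2.30/resident/day adds
# about GBP 840 over a year and yields 0.25 extra QALYs:
ratio = icer(10840.0, 10000.0, 1.25, 1.00)  # 840 / 0.25 = 3360.0 GBP per QALY
```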
Measuring Resource Utilization: A Systematic Review of Validated Self-Reported Questionnaires.
Leggett, Laura E; Khadaroo, Rachel G; Holroyd-Leduc, Jayna; Lorenzetti, Diane L; Hanson, Heather; Wagg, Adrian; Padwal, Raj; Clement, Fiona
2016-03-01
A variety of methods may be used to obtain costing data. Although administrative data are most commonly used, the data available in these datasets are often limited. An alternative method of obtaining costing data is through self-reported questionnaires. Currently, there are no systematic reviews that summarize self-reported resource utilization instruments from the published literature. The aim of the study was to identify validated self-report healthcare resource use instruments and to map their attributes. A systematic review was conducted. The search identified articles using terms like "healthcare utilization" and "questionnaire." All abstracts and full texts were considered in duplicate. For inclusion, studies had to assess the validity of a self-reported resource use questionnaire, report original data, include adult populations, and the questionnaire had to be publicly available. Data such as the type of resource utilization assessed by each questionnaire and validation findings were extracted from each study. In all, 2343 unique citations were retrieved; 2297 were excluded during abstract review. Forty-six studies were reviewed in full text, and 15 studies were included in this systematic review. Six assessed resource utilization of patients with chronic conditions; 5 assessed mental health service utilization; 3 assessed resource utilization by a general population; and 1 assessed utilization in older populations. The most frequently measured resources included visits to general practitioners and inpatient stays; nonmedical resources were least frequently measured. Self-reported questionnaires on resource utilization had good agreement with administrative data, although visits to general practitioners, outpatient days, and nurse visits had poorer agreement. Self-reported questionnaires are a valid method of collecting data on healthcare resource utilization.
Triantis, Theodoros; Tsimeli, Katerina; Kaloudis, Triantafyllos; Thanassoulias, Nicholas; Lytras, Efthymios; Hiskia, Anastasia
2010-05-01
A system of analytical processes has been developed to serve as a cost-effective scheme for the quantitative monitoring of cyanobacterial toxins in surface and drinking waters. Five cyclic peptide hepatotoxins, microcystin-LR, -RR, -YR, -LA and nodularin, were chosen as the target compounds. Two different enzyme-linked immunosorbent assays (ELISA) were validated to serve as primary quantitative screening tools. Validation results showed that the ELISA methods are sufficiently specific and sensitive, with limits of detection (LODs) around 0.1 µg/L; however, matrix effects should be considered, especially with surface water samples or bacterial mass methanolic extracts. A colorimetric protein phosphatase inhibition assay (PPIA) utilizing protein phosphatase 2A and p-nitrophenyl phosphate as substrate was applied in microplate format to serve as a quantitative screening method for the detection of the toxic activity associated with cyclic peptide hepatotoxins at concentration levels >0.2 µg/L of MC-LR equivalents. A fast HPLC/PDA method has been developed for the determination of microcystins using a short, 50 mm C18 column with 1.8 µm particle size. Using this method, a 10-fold reduction of sample run time was achieved and sufficient separation of microcystins was accomplished in less than 3 min. Finally, the analytical system includes an LC/MS/MS method that was developed for the determination of the 5 target compounds after SPE extraction. The method achieves extremely low limits of detection (<0.02 µg/L) in both surface and drinking waters and is used for identification and verification purposes as well as for determinations at the ppt level. An analytical protocol that includes the above methods has been designed and validated through the analysis of a number of real samples. Copyright 2009 Elsevier Ltd. All rights reserved.
Hazut, Koren; Romem, Pnina; Malkin, Smadar; Livshiz-Riven, Ilana
2016-12-01
The purpose of this study was to compare the predictive validity, economic efficiency, and faculty staff satisfaction of a computerized test versus a personal interview as admission methods for graduate nursing studies. A mixed method study was designed, including cross-sectional and retrospective cohorts, interviews, and cost analysis. One hundred and thirty-four students in the Master of Nursing program participated. The success of students in required core courses was similar in both admission method groups. The personal interview method was found to be a significant predictor of success, with cognitive variables the only significant contributors to the model. Higher satisfaction levels were reported with the computerized test compared with the personal interview method. The cost of the personal interview method, in annual hourly work, was 2.28 times higher than the computerized test. These findings may promote discussion regarding the cost benefit of the personal interview as an admission method for advanced academic studies in healthcare professions. © 2016 John Wiley & Sons Australia, Ltd.
Comparison of scoring approaches for the NEI VFQ-25 in low vision.
Dougherty, Bradley E; Bullimore, Mark A
2010-08-01
The aim of this study was to evaluate different approaches to scoring the National Eye Institute Visual Functioning Questionnaire-25 (NEI VFQ-25) in patients with low vision including scoring by the standard method, by Rasch analysis, and by use of an algorithm created by Massof to approximate the Rasch person measure. Subscale validity and use of a 7-item short form instrument proposed by Ryan et al. were also investigated. NEI VFQ-25 data from 50 patients with low vision were analyzed using the standard method of summing Likert-type scores and calculating an overall average, Rasch analysis using Winsteps software, and the Massof algorithm in Excel. Correlations between scores were calculated. Rasch person separation reliability and other indicators were calculated to determine the validity of the subscales and of the 7-item instrument. Scores calculated using all three methods were highly correlated, but evidence of floor and ceiling effects was found with the standard scoring method. None of the subscales investigated proved valid. The 7-item instrument showed acceptable person separation reliability and good targeting and item performance. Although standard scores and Rasch scores are highly correlated, Rasch analysis has the advantages of eliminating floor and ceiling effects and producing interval-scaled data. The Massof algorithm for approximation of the Rasch person measure performed well in this group of low-vision patients. The validity of the NEI VFQ-25 subscales should be reconsidered.
NASA Technical Reports Server (NTRS)
Mandl, Dan; Howard, Joseph
2000-01-01
The New Millennium Program's first Earth-observing mission (EO-1) is a technology validation mission. It is managed by the NASA Goddard Space Flight Center in Greenbelt, Maryland and is scheduled for launch in the summer of 2000. The purpose of this mission is to flight-validate revolutionary technologies that will contribute to reducing the cost and increasing the capabilities of future land imaging missions. In the EO-1 mission, there are five instrument technologies, five spacecraft technologies, and three supporting technologies to flight-validate during a year of operations. EO-1 operations and the accompanying ground system were intended to be simple in order to maintain low operational costs. For purposes of formulating operations, it was initially modeled as a small science mission. However, it quickly evolved into a more complex mission due to the difficulties in effectively integrating all of the validation plans of the individual technologies. As a consequence, more operational support was required to confidently complete the on-orbit validation of the new technologies. This paper will outline the issues and lessons learned applicable to future technology validation missions. Examples include the following: (1) operational complexity encountered in integrating all of the validation plans into a coherent operational plan, (2) the initial desire to run single-shift operations subsequently growing to "around-the-clock" operations, (3) managing changes in the technologies that ultimately affected operations, (4) the necessity for better team communications within the project to offset the effects of change on the Ground System Developers, Operations Engineers, Integration and Test Engineers, S/C Subsystem Engineers, and Scientists, and (5) the need for a more experienced Flight Operations Team to achieve the necessary operational flexibility.
The discussion will conclude by providing several cost comparisons for developing operations from previous missions to EO-1 and discuss some details that might be done differently for future technology validation missions.
Performance test of a low cost roof-mounted wind turbine
NASA Astrophysics Data System (ADS)
Figueroa-Espinoza, Bernardo; Quintal, Roberto; Gou, Clément; Aguilar, Alicia
2013-11-01
A low-cost wind turbine was implemented based on the ideas put forward by Hugh Piggot in his book "A Wind Turbine Recipe Book," where such a device is developed using materials and manufacturing processes available (as much as possible) in developing countries or isolated communities. The wind turbine is to be mounted on the roof of a two-story building in a coastal zone of Mexico. The velocity profiles and turbulence intensities for typical wind conditions on top of the building roof were analyzed using numerical simulations (RANS) in order to locate the turbine hub above any recirculation and near the maximum average speed. The coefficient of performance will be evaluated experimentally by measuring the electrical power generation and the wind characteristics that drive the wind turbine in the field. These experimental results will be applied to the improvement of the wind turbine design, as well as the validation of a numerical simulation model that couples the wind characteristics obtained through CFD with the Blade Element Method (BEM) and an electro-mechanical model of the turbine-shaft-generator ensemble. Special thanks to the Coordinación de Investigación Científica of the Universidad Michoacana de San Nicolás de Hidalgo for their support.
Bener, Mustafa; Ozyürek, Mustafa; Güçlü, Kubilay; Apak, Reşat
2010-05-15
A low-cost optical sensor using an immobilized chromogenic redox reagent was devised for measuring the total antioxidant level in a liquid sample without requiring sample pretreatment. The reagent, copper(II)-neocuproine (Cu(II)-Nc) complex, was immobilized onto a cation-exchanger film of Nafion, and the absorbance changes associated with the formation of the highly colored Cu(I)-Nc chelate as a result of reaction with antioxidants were measured at 450 nm. The sensor gave a linear response over a wide concentration range of standard antioxidant compounds. The trolox equivalent antioxidant capacity (TEAC) values of various antioxidants reported in this work using the optical sensor-based "cupric reducing antioxidant capacity" (CUPRAC) assay were comparable to those of the standard solution-based CUPRAC assay, showing that the immobilized Cu(II)-Nc reagent retained its reactivity toward antioxidants. Common food ingredients like oxalate, citrate, fruit acids, and reducing sugars did not interfere with the proposed sensing method. This assay was validated through linearity, additivity, precision, and recovery, demonstrating that the assay is reliable and robust. The developed optical sensor was used to screen the total antioxidant capacity (TAC) of some commercial fruit juices without preliminary treatment and showed promising potential for the preparation of antioxidant inventories of a wide range of food plants.
Citizen Science Air Monitoring in the Ironbound Community ...
The Environmental Protection Agency’s (EPA) mission is to protect human health and the environment. To move toward achieving this goal, EPA is facilitating identification of potential environmental concerns, particularly in vulnerable communities. This includes actively supporting citizen science projects and providing communities with the information and assistance they need to conduct their own air pollution monitoring efforts. The Air Sensor Toolbox for Citizen Scientists was developed as a resource to meet stakeholder needs. Examples of materials developed for the Toolbox and ultimately pilot tested in the Ironbound Community in Newark, New Jersey are reported here. The Air Sensor Toolbox for Citizen Scientists is designed as an online resource that provides information and guidance on new, low-cost compact technologies used for measuring air quality. The Toolbox features resources developed by EPA researchers that can be used by citizens to effectively collect, analyze, interpret, and communicate air quality data. The resources include information about sampling methods, how to calibrate and validate monitors, options for measuring air quality, data interpretation guidelines, and low-cost sensor performance information. This Regional Applied Research Effort (RARE) project provided an opportunity for the Office of Research and Development (ORD) to work collaboratively with EPA Region 2 to provide the Ironbound Community with a “Toolbox” specific for c
Deli, Roberto; Di Gioia, Eliana; Galantucci, Luigi Maria; Percoco, Gianluca
2010-01-01
To set up a three-dimensional photogrammetric scanning system for precise landmark measurements, without any physical contact, using a low-cost and noninvasive digital photogrammetric solution, to support several needs in clinical orthodontics and/or surgical diagnosis. Thirty coded targets were applied directly onto the soft tissue landmarks of the subject's face, and then 3 simultaneous photos were acquired using photogrammetry under room light conditions. For comparison, a dummy head was digitized both with a photogrammetric technique and with the laser scanner Minolta Vivid 910i (Konica Minolta, Tokyo, Japan). The precision of the landmark measurements ranged between 0.017 and 0.029 mm. The system automatically measures the spatial position of face landmarks, from which distances and angles can be obtained. The facial measurements were compared with those obtained using laser scanning and a manual caliper. The adopted method gives higher precision than the others (0.022-mm mean value on points and 0.038-mm mean value on linear distances on a dummy head), is simple, and can be used easily as a standard routine. The study demonstrated the validity of photogrammetry for accurate digitization of human face landmarks. This research points out the potential of this low-cost photogrammetry approach for medical digitization.
Replica Approach for Minimal Investment Risk with Cost
NASA Astrophysics Data System (ADS)
Shinzato, Takashi
2018-06-01
In the present work, the optimal portfolio minimizing the investment risk with cost is discussed analytically, where an objective function is constructed in terms of two negative aspects of investment, the risk and cost. We note the mathematical similarity between the Hamiltonian in the mean-variance model and the Hamiltonians in the Hopfield model and the Sherrington-Kirkpatrick model, show that we can analyze this portfolio optimization problem by using replica analysis, and derive the minimal investment risk with cost and the investment concentration of the optimal portfolio. Furthermore, we validate our proposed method through numerical simulations.
Pereiro, Iago; Bendali, Amel; Tabnaoui, Sanae; Alexandre, Lucile; Srbova, Jana; Bilkova, Zuzana; Deegan, Shane; Joshi, Lokesh; Viovy, Jean-Louis; Malaquin, Laurent; Dupuy, Bruno; Descroix, Stéphanie
2017-02-01
A microfluidic method to specifically capture and detect infectious bacteria based on immunorecognition and proliferative power is presented. It involves a microscale fluidized bed in which magnetic and drag forces are balanced to retain antibody-functionalized superparamagnetic beads in a chamber during sample perfusion. Captured cells are then cultivated in situ by infusing nutritionally-rich medium. The system was validated by the direct one-step detection of Salmonella Typhimurium in undiluted unskimmed milk, without pre-treatment. The growth of bacteria induces an expansion of the fluidized bed, mainly due to the volume occupied by the newly formed bacteria. This expansion can be observed with the naked eye, providing simple low-cost detection of only a few bacteria and in a few hours. The time to expansion can also be measured with a low-cost camera, allowing quantitative detection down to 4 cfu (colony forming units), with a dynamic range of 100 to 10^7 cfu ml^-1 in 2 to 8 hours, depending on the initial concentration. This mode of operation is an equivalent of quantitative PCR, with which it shares a high dynamic range and outstanding sensitivity and specificity, operating at the live cell rather than DNA level. Specificity was demonstrated by controls performed in the presence of a 500× excess of non-pathogenic Lactococcus lactis. The system's versatility was demonstrated by its successful application to the detection and quantitation of Escherichia coli O157:H15 and Enterobacter cloacae. This new technology allows fast, low-cost, portable and automated bacteria detection for various applications in food, environment, security and clinics.
Romero, Veronica; Amaral, Joseph; Fitzpatrick, Paula; Schmidt, R C; Duncan, Amie W; Richardson, Michael J
2017-04-01
Functionally stable and robust interpersonal motor coordination has been found to play an integral role in the effectiveness of social interactions. However, the motion-tracking equipment required to record and objectively measure the dynamic limb and body movements during social interaction has been very costly, cumbersome, and impractical within a non-clinical or non-laboratory setting. Here we examined whether three low-cost motion-tracking options (Microsoft Kinect skeletal tracking of either one limb or the whole body and a video-based pixel change method) can be employed to investigate social motor coordination. Of particular interest was the degree to which these low-cost methods of motion tracking could be used to capture and index the coordination dynamics that occurred between a child and an experimenter for three simple social motor coordination tasks in comparison to a more expensive, laboratory-grade motion-tracking system (i.e., a Polhemus Latus system). Overall, the results demonstrated that these low-cost systems cannot substitute for the Polhemus system in some tasks. However, the lower-cost Microsoft Kinect skeletal tracking and video pixel change methods were successfully able to index differences in social motor coordination in tasks that involved larger-scale, naturalistic whole-body movements, which can be cumbersome and expensive to record with a Polhemus. We found the Kinect to be particularly vulnerable to occlusion, and the pixel change method to movements that cross the video frame midline. Therefore, particular care needs to be taken in choosing the motion-tracking system that is best suited for the particular research.
Design of a Low-cost Oil Spill Tracking Buoy
NASA Astrophysics Data System (ADS)
Zhao, Y.; Hu, X.; Yu, F.; Dong, S.; Chen, G.
2017-12-01
With the rapid development of oil exploitation and transportation, oil spill accidents, such as the Prestige and Gulf of Mexico spills, have happened frequently in recent years, resulting in long-term damage to the environment and human life. Rescue operations would benefit if the oil slick diffusion area could be located in real time. Equipped with GNSS systems, current tracking buoys (CTB), such as the Lagrangian drifting buoy, the Surface Velocity Program (SVP) drifter, the iSLDMB (Iridium self-locating datum marker buoy) and the Argosphere buoy, have been used as oil tracking buoys in oil slick observation and as validation tools for oil spill simulation. However, surface wind affects the movement of an oil slick in a way that is not reflected by CTBs, so their oil-tracking performance is limited. Here, we propose a novel oil spill tracking buoy (OSTB) which has a low cost of less than $140 and is equipped with a Beidou positioning module and sails to track oil slicks. Based on a hydrodynamic equilibrium model and ocean dynamic analysis, the wind sails and water sails are designed to be adjustable according to different marine conditions to improve tracking efficiency. A quick-release device is designed to assure easy deployment from air or ship. A sea experiment was carried out in Jiaozhou Bay, Northern China. The OSTB, SVP, iSLDMB, Argosphere buoy and a piece of oil-simulating rubber sheet were deployed at the same time. Meanwhile, the oil spill simulation model GNOME (general NOAA operational modeling environment) was configured with the wind and current fields, which were collected by an unmanned surface vehicle (USV) mounted with acoustic Doppler current profilers (ADCP) and wind speed and direction sensors. Experimental results show that the OSTB has better agreement with the rubber sheet and the GNOME simulation results, which validates the oil tracking ability of the OSTB. 
With low cost and easy deployment, OSTB provides an effective way for oil spill numerical modeling validation and quick response to oil spill accidents.
Healthcare Cost and Impact of Persistent Orofacial Pain: The DEEP Study Cohort.
Durham, J; Shen, J; Breckons, M; Steele, J G; Araujo-Soares, V; Exley, C; Vale, L
2016-09-01
Few data are available on the healthcare costs of those suffering from persistent orofacial pain (POFP). This cohort and cost analysis study examined the direct costs of POFP from the perspective of the healthcare provider (specifically, the UK National Health Service) in 2012 pounds sterling and sought to identify whether dichotomized (high, IIb to IV; low, 0 to IIa) graded chronic pain scale (GCPS) status is predictive of the total cost of healthcare over the last 6 mo. The healthcare utilization data of 198 patients with POFP were collected using a structured interview and a validated "use of services and productivity" questionnaire. Unit costs were used with these utilization data to calculate direct healthcare costs in 3 categories: consultation, medication, and appliances and interventions. Consultation costs were a significant proportion of cumulative healthcare cost (P < 0.001). Dichotomized GCPS status was predictive of increased healthcare cost over the last 6 mo, accounting for an average increase of £366 (95% confidence interval, 135 to 598; P < 0.01) when moving from a low GCPS status to a high GCPS status. Given the predictive capability of dichotomized GCPS status and the success of stratified models of care for other persistent pain conditions, dichotomized GCPS status may offer an opportunity to help determine stratification of care for patients with POFP. © International & American Associations for Dental Research 2016.
Cost-effectiveness of diabetes case management for low-income populations.
Gilmer, Todd P; Roze, Stéphane; Valentine, William J; Emy-Albrecht, Katrina; Ray, Joshua A; Cobden, David; Nicklasson, Lars; Philis-Tsimikas, Athena; Palmer, Andrew J
2007-10-01
To evaluate the cost-effectiveness of Project Dulce, a culturally specific diabetes case management and self-management training program, in four cohorts defined by insurance status. Clinical and cost data on 3,893 persons with diabetes participating in Project Dulce were used as inputs into a diabetes simulation model. The Center for Outcomes Research Diabetes Model, a published, peer-reviewed and validated simulation model of diabetes, was used to evaluate life expectancy, quality-adjusted life expectancy (QALY), cumulative incidence of complications and direct medical costs over patient lifetimes (40-year time horizon) from a third-party payer perspective. Cohort characteristics, treatment effects, and case management costs were derived using a difference in difference design comparing data from the Project Dulce program to a cohort of historical controls. Long-term costs were derived from published U.S. sources. Costs and clinical benefits were discounted at 3.0 percent per annum. Sensitivity analyses were performed. Incremental cost-effectiveness ratios of $10,141, $24,584, $44,941, and $69,587 per QALY gained were estimated for Project Dulce participants versus control in the uninsured, County Medical Services, Medi-Cal, and commercial insurance cohorts, respectively. The Project Dulce diabetes case management program was associated with cost-effective improvements in quality-adjusted life expectancy and decreased incidence of diabetes-related complications over patient lifetimes. Diabetes case management may be particularly cost effective for low-income populations.
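The incremental cost-effectiveness ratios quoted above divide the extra cost of an intervention by the extra quality-adjusted life expectancy it produces. A minimal sketch of that calculation (the function name and the numbers are illustrative, not the study's data):

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)


# Illustrative numbers only: an intervention costing $2,000 more per
# patient that yields 0.2 additional QALYs over the time horizon:
print(round(icer(12000.0, 10000.0, 8.2, 8.0)))  # about $10,000 per QALY
```

A program is then judged by comparing its ICER against a willingness-to-pay threshold, which is why the four cohorts above can all be "cost-effective" at different dollar values.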
2010-01-01
Background Regional generalized cost-effectiveness estimates of prevention, screening and treatment interventions for colorectal cancer are presented. Methods Standardised WHO-CHOICE methodology was used. A colorectal cancer model was employed to provide estimates of screening and treatment effectiveness. Intervention effectiveness was determined via a population state-transition model (PopMod) that simulates the evolution of a sub-regional population accounting for births, deaths and disease epidemiology. Economic costs of procedures and treatment were estimated, including programme overhead and training costs. Results In regions characterised by high income, low mortality and high existing treatment coverage, the addition of screening to the current high treatment levels is very cost-effective, although no particular intervention stands out in cost-effectiveness terms relative to the others. In regions characterised by low income, low mortality with existing treatment coverage around 50%, expanding treatment with or without screening is cost-effective or very cost-effective. Abandoning treatment in favour of screening (no treatment scenario) would not be cost effective. In regions characterised by low income, high mortality and low treatment levels, the most cost-effective intervention is expanding treatment. Conclusions From a cost-effectiveness standpoint, screening programmes should be expanded in developed regions and treatment programmes should be established for colorectal cancer in regions with low treatment coverage. PMID:20236531
Sun, Lei; Jin, Hong-Yu; Tian, Run-Tao; Wang, Ming-Juan; Liu, Li-Na; Ye, Liu-Ping; Zuo, Tian-Tian; Ma, Shuang-Cheng
2017-01-01
Analysis of related substances in pharmaceutical chemicals and of multi-components in traditional Chinese medicines requires a large number of reference substances to identify chromatographic peaks accurately, but reference substances are costly. Thus, the relative retention (RR) method has been widely adopted in pharmacopoeias and the literature for characterizing the HPLC behavior of reference substances that are unavailable. The problem is that the RR is difficult to reproduce on different columns, due to the error between measured retention time (tR) and predicted tR in some cases. Therefore, it is useful to develop an alternative, simple method for accurate prediction of tR. In the present study, based on the thermodynamic theory of HPLC, a method named linear calibration using two reference substances (LCTRS) was proposed. The method includes three steps: two-point prediction, validation by multiple-point regression, and sequential matching. The tR of compounds on an HPLC column can be calculated from standard retention times and a linear relationship. The method was validated on two medicines and 30 columns. It was demonstrated that the LCTRS method is simple, but more accurate and more robust on different HPLC columns than the RR method. Hence, quality standards using the LCTRS method are easy to reproduce in different laboratories, with a lower cost of reference substances.
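The two-point prediction step of LCTRS can be sketched as follows: the measured retention times of two reference substances on the current column are fitted linearly against their standard retention times, and the resulting line predicts tR for compounds whose reference substances are unavailable. A minimal sketch (the function name and values are illustrative; the published procedure additionally validates the line by multiple-point regression and sequential matching):

```python
def lctrs_predict(std_t1, meas_t1, std_t2, meas_t2, std_t_unknown):
    """Predict retention time on the current column by linear
    calibration from two reference substances (LCTRS sketch).

    (std_t, meas_t) pairs are the standard and measured retention
    times of the two calibrants; the unknown compound's standard
    retention time is then mapped through the fitted line."""
    slope = (meas_t2 - meas_t1) / (std_t2 - std_t1)
    intercept = meas_t1 - slope * std_t1
    return slope * std_t_unknown + intercept


# Calibrants elute at 5.0 and 12.0 min nominally but at 5.5 and
# 13.1 min on this column; a compound with standard tR 8.0 min:
print(round(lctrs_predict(5.0, 5.5, 12.0, 13.1, 8.0), 2))  # -> 8.76
```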
Abujaber, Sumayeh; Gillispie, Gregory; Marmon, Adam; Zeni, Joseph
2015-01-01
Weight bearing asymmetry is common in patients with unilateral lower limb musculoskeletal pathologies. The Nintendo Wii Balance Board (WBB) has been suggested as a low-cost and widely-available tool to measure weight bearing asymmetry in a clinical environment; however no study has evaluated the validity of this tool during dynamic tasks. Therefore, the purpose of this study was to determine the concurrent validity of force measurements acquired from the WBB as compared to laboratory force plates. Thirty-five individuals before, or within 1 year of total joint arthroplasty performed a sit-to-stand and return-to-sit task in two conditions. First, subjects performed the task with both feet placed on a single WBB. Second, the task was repeated with each foot placed on an individual laboratory force plate. Peak vertical ground reaction force (VGRF) under each foot and the inter-limb symmetry ratio were calculated. Validity was examined using Intraclass Correlation Coefficients (ICC), regression analysis, 95% limits of agreement and Bland-Altman plots. Force plates and the WBB exhibited excellent agreement for all outcome measurements (ICC = 0.83–0.99). Bland-Altman plots showed no obvious relationship between the difference and the mean for the peak VGRF, but there was a consistent trend in which VGRF on the unaffected side was lower and VGRF on the affected side was higher when using the WBB. However, these consistent biases can be adjusted for by utilizing regression equations that estimate the force plate values based on the WBB force. The WBB may serve as a valid, suitable, and low-cost alternative to expensive, laboratory force plates for measuring weight bearing asymmetry in clinical settings. PMID:25715680
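The outcome measures described above (peak VGRF per limb, the inter-limb symmetry ratio, and a regression-based correction from WBB readings toward force-plate values) can be sketched as follows. Function names and all numeric values are illustrative; the study's actual regression coefficients are not reproduced here:

```python
def peak_vgrf(force_trace):
    """Peak vertical ground reaction force (N) from a sampled trace."""
    return max(force_trace)


def symmetry_ratio(vgrf_affected, vgrf_unaffected):
    """Inter-limb symmetry ratio; 1.0 means perfectly symmetric loading."""
    return vgrf_affected / vgrf_unaffected


def wbb_to_plate(vgrf_wbb, slope, intercept):
    """Adjust a WBB reading toward the force-plate value with a fitted
    linear regression; slope/intercept here are placeholders."""
    return slope * vgrf_wbb + intercept


affected = peak_vgrf([300.0, 410.0, 395.0])      # hypothetical trace, N
unaffected = peak_vgrf([320.0, 480.0, 455.0])
print(round(symmetry_ratio(affected, unaffected), 3))  # -> 0.854
```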
Lee, Jongseok; Jang, Sungok; Son, Heejeong
2016-01-01
Despite the importance of accurate assessment of low-density lipoprotein cholesterol (LDL-C), the Friedewald formula has primarily been used as a cost-effective method to estimate LDL-C when triglycerides are less than 400 mg/dL. In a recent study, an alternative to the formula was proposed to improve estimation of LDL-C. We evaluated the performance of the novel method versus the Friedewald formula using a sample of 5,642 Korean adults with LDL-C measured by an enzymatic homogeneous assay (LDL-CD). Friedewald LDL-C (LDL-CF) was estimated using a fixed factor of 5 for the ratio of triglycerides to very-low-density lipoprotein cholesterol (TG:VLDL-C ratio). The novel LDL-C (LDL-CN) estimates were instead calculated using N-strata-specific median TG:VLDL-C ratios: LDL-C5 and LDL-C25 from the respective ratios derived from our data set, and LDL-C180 from the 180-cell table reported by the original study. Compared with LDL-CF, each LDL-CN estimate exhibited a significantly higher overall concordance with LDL-CD in the NCEP-ATP III guideline classification (p < 0.001 for each comparison). Overall concordance was 78.2% for LDL-CF, 81.6% for LDL-C5, 82.3% for LDL-C25, and 82.0% for LDL-C180. Compared with LDL-C5, LDL-C25 significantly but slightly improved overall concordance (p = 0.008). LDL-C25 and LDL-C180 provided almost the same overall concordance; however, LDL-C180 achieved superior improvement in classifying LDL-C < 70 mg/dL compared with the other estimates. In subjects with triglycerides of 200 to 399 mg/dL, each LDL-CN estimate showed a significantly higher concordance than LDL-CF (p < 0.001 for each comparison). The novel method offers a significant improvement in LDL-C estimation compared with the Friedewald formula. However, it requires further modification and validation considering racial differences as well as the specific character of the measuring method applied. PMID:26824910
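For reference, the Friedewald formula estimates LDL-C (mg/dL) as total cholesterol minus HDL-C minus TG/5, and the novel method replaces the fixed divisor 5 with a strata-specific TG:VLDL-C ratio. A minimal sketch with illustrative lipid values; the strata factor of 6.2 is hypothetical (in practice it would be looked up in a strata table such as the 180-cell table):

```python
def ldl_friedewald(tc, hdl, tg):
    """Friedewald estimate (mg/dL); not valid when TG >= 400 mg/dL."""
    if tg >= 400:
        raise ValueError("Friedewald formula not valid for TG >= 400 mg/dL")
    return tc - hdl - tg / 5.0

def ldl_adjustable(tc, hdl, tg, factor):
    """Novel-style estimate: the fixed divisor 5 is replaced by a
    strata-specific median TG:VLDL-C ratio passed in as `factor`."""
    return tc - hdl - tg / factor

# Illustrative values: TC 200, HDL-C 50, TG 150 (all mg/dL)
tc, hdl, tg = 200.0, 50.0, 150.0
fried = ldl_friedewald(tc, hdl, tg)
novel = ldl_adjustable(tc, hdl, tg, 6.2)   # hypothetical strata factor
print(fried, round(novel, 1))
```

The difference between the two estimates grows with triglyceride level, which is why the abstract highlights the 200-399 mg/dL subgroup.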
Design, processing and testing of LSI arrays: Hybrid microelectronics task
NASA Technical Reports Server (NTRS)
Himmel, R. P.; Stuhlbarg, S. M.; Ravetti, R. G.; Zulueta, P. J.
1979-01-01
Mathematical cost factors were generated for both hybrid microcircuit and printed wiring board packaging methods. A mathematical cost model was created for analysis of microcircuit fabrication costs. The costing factors were refined and reduced to formulae for computerization. Efficient methods were investigated for low-cost packaging of LSI devices as a function of density and reliability. Technical problem areas such as wafer bumping, inner/outer lead bonding, testing on tape, and tape processing were investigated.
Groundwater Exploration for Rural Communities in Ghana, West Africa
NASA Astrophysics Data System (ADS)
McKay, W. A.
2001-05-01
Exploration for potable water in developing countries continues to be a major activity, as there are more than one billion people without access to safe drinking water. Exploration for groundwater becomes more critical in regions where groundwater movement and occurrence are controlled by secondary features such as fractures and faults. Drilling success rates in such geological settings are generally very low, but can be improved by integrating geological and hydrogeological information and aerial photo interpretation with land-based geophysical technology in the selection of drilling sites. To help alleviate water supply problems in West Africa, the Conrad N. Hilton Foundation and other donors have, since 1990, funded the World Vision Ghana Rural Water Project (GRWP) to drill wells for potable water supplies in the Greater Afram Plains (GAP) of Ghana. During the first two years of the program, drilling success rates using traditional methods ranged from 35 to 80 percent, depending on the area. The average drilling success rate for the program was approximately 50 percent. In an effort to increase the efficiency of drilling operations, the Desert Research Institute evaluated and developed techniques for application to well-siting strategies in the GAP area of Ghana. A critical project element was developing the technical capabilities of in-country staff to independently implement the new strategies. Simple cost-benefit relationships were then used to evaluate the economic advantages of developing water resources using advanced siting methods. The application of advanced methods in the GAP area reveals an increase of 10 to 15 percent in the success rate over traditional methods. Aerial photography has been found to be the most useful of the imagery products covering the GAP area. An effective approach to geophysical exploration for groundwater has been the combined use of EM and resistivity methods.
Economic analyses showed that the use of advanced methods is cost-effective when success rates with traditional methods are less than 70 to 90 percent. Finally, with the focus of GRWP activities shifting to Ghana's northern regions, new challenges in drilling success rates are being encountered. In certain districts, success rates as low as 35 percent are observed, raising questions about the efficacy of existing well-siting strategies in the current physical setting, and the validity of traditional cost-benefit analyses for assessing the economic aspects of water exploration in drought-stricken areas.
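The break-even reasoning behind such a cost-benefit analysis can be illustrated with a simple expected-cost calculation. All dollar figures below are hypothetical, chosen only to show the mechanics, not values from the project:

```python
def cost_per_successful_well(drill_cost, success_rate, siting_cost=0.0):
    """Expected cost of obtaining one successful well: each attempt
    incurs drilling plus siting costs, and on average 1/success_rate
    attempts are needed before one succeeds."""
    return (drill_cost + siting_cost) / success_rate

# Hypothetical per-attempt figures (USD), for illustration only:
drill = 10_000.0
traditional = cost_per_successful_well(drill, 0.50)                    # no siting survey
advanced = cost_per_successful_well(drill, 0.62, siting_cost=1_500.0)  # geophysics added
print(round(traditional), round(advanced))
```

With these made-up numbers the advanced siting more than pays for itself; as the traditional success rate rises toward the 70 to 90 percent range cited above, the advantage of paying for the extra survey work disappears.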
Patrick, Amanda R; Schousboe, John T; Losina, Elena; Solomon, Daniel H
2011-09-01
Adherence to osteoporosis treatment is low. Although new therapies and behavioral interventions may improve medication adherence, questions are likely to arise regarding their cost-effectiveness. Our objectives were to develop and validate a model to simulate the clinical outcomes and costs arising from various osteoporosis medication adherence patterns among women initiating bisphosphonate treatment and to estimate the cost-effectiveness of a hypothetical intervention to improve medication adherence. We constructed a computer simulation using estimates of fracture rates, bisphosphonate treatment effects, costs, and utilities for health states drawn from the published literature. Probabilities of transitioning on and off treatment were estimated from administrative claims data. Patients were women initiating bisphosphonate therapy from the general community. We evaluated a hypothetical behavioral intervention to improve medication adherence. Changes in 10-yr fracture rates and incremental cost-effectiveness ratios were evaluated. A hypothetical intervention with a one-time cost of $250 and reducing bisphosphonate discontinuation by 30% had an incremental cost-effectiveness ratio (ICER) of $29,571 per quality-adjusted life year in 65-yr-old women initiating bisphosphonates. Although the ICER depended on patient age, intervention effectiveness, and intervention cost, the ICERs were less than $50,000 per quality-adjusted life year for the majority of intervention cost and effectiveness scenarios evaluated. Results were sensitive to bisphosphonate cost and effectiveness and assumptions about the rate at which intervention and treatment effects decline over time. Our results suggest that behavioral interventions to improve osteoporosis medication adherence will likely have favorable ICERs if their efficacy can be sustained.
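The ICER reported here is the ratio of incremental cost to incremental quality-adjusted life years between the intervention and usual-care arms. A sketch with hypothetical per-patient simulation outputs (the $250 intervention cost comes from the abstract; every other number below is invented for illustration):

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per extra
    quality-adjusted life year when comparing two strategies."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical 10-year per-patient averages from a simulation:
usual_cost, usual_qaly = 12_000.0, 7.200
interv_cost, interv_qaly = 12_400.0, 7.212  # arm including a $250 adherence intervention
value = icer(interv_cost, interv_qaly, usual_cost, usual_qaly)
print(round(value))
```

A strategy is conventionally called cost-effective when this ratio falls below a willingness-to-pay threshold, such as the $50,000-per-QALY figure cited in the abstract.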
[Value-based medicine in ophthalmology].
Hirneiss, C; Neubauer, A S; Tribus, C; Kampik, A
2006-06-01
Value-based medicine (VBM) unifies the costs and the patient-perceived value (improvement in quality of life, length of life, or both) of an intervention. Value-based ophthalmology is of increasing importance for decisions in eye care. The methods of VBM are explained and definitions for the specific terminology in this field are given. The cost-utility analysis, as part of health care economic analyses, is explained. VBM exceeds evidence-based medicine by incorporating parameters of cost and benefit from an ophthalmological intervention. The benefit of the intervention is defined as an increase or maintenance of visual quality of life and can be determined by utility analysis. The time trade-off method is valid and reliable for utility analysis. The resources expended for the value gained in VBM are measured with cost-utility analysis in terms of cost per quality-adjusted life year gained (euros/QALY). Numerous cost-utility analyses of different ophthalmological interventions have been published. The fundamental instrument of VBM is cost-utility analysis. The results in cost per QALY allow estimation of the cost effectiveness of an ophthalmological intervention. Using the time trade-off method for utility analysis allows the comparison of ophthalmological cost-utility analyses with those of other medical interventions. VBM is important for individual medical decision making and for general health care.
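The two quantities named in the abstract, a time trade-off utility and a cost-utility ratio, are simple to compute. A minimal sketch with invented numbers (not values from the article):

```python
def tto_utility(years_traded, years_remaining):
    """Time trade-off utility: a patient willing to give up
    `years_traded` of `years_remaining` life years in the current
    health state in exchange for full health has utility (t - x)/t."""
    return (years_remaining - years_traded) / years_remaining

def cost_per_qaly(cost, utility_gain, years):
    """Cost-utility ratio in currency units per QALY gained."""
    return cost / (utility_gain * years)

# Hypothetical example: an intervention costing 3,000 euros raises
# utility from 0.80 to 0.86, sustained over 10 remaining life years.
u_before = tto_utility(2.0, 10.0)   # would trade 2 of 10 years
u_after = 0.86
ratio = cost_per_qaly(3000.0, u_after - u_before, 10.0)
print(round(ratio))
```

Expressing disparate interventions in euros per QALY this way is what makes the cross-specialty comparisons mentioned in the abstract possible.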
Hewel, Johannes A.; Liu, Jian; Onishi, Kento; Fong, Vincent; Chandran, Shamanta; Olsen, Jonathan B.; Pogoutse, Oxana; Schutkowski, Mike; Wenschuh, Holger; Winkler, Dirk F. H.; Eckler, Larry; Zandstra, Peter W.; Emili, Andrew
2010-01-01
Effective methods to detect and quantify functionally linked regulatory proteins in complex biological samples are essential for investigating mammalian signaling pathways. Traditional immunoassays depend on proprietary reagents that are difficult to generate and multiplex, whereas global proteomic profiling can be tedious and can miss low abundance proteins. Here, we report a target-driven liquid chromatography-tandem mass spectrometry (LC-MS/MS) strategy for selectively examining the levels of multiple low abundance components of signaling pathways which are refractory to standard shotgun screening procedures and hence appear limited in current MS/MS repositories. Our stepwise approach consists of: (i) synthesizing microscale peptide arrays, including heavy isotope-labeled internal standards, for use as high quality references to (ii) build empirically validated high density LC-MS/MS detection assays with a retention time scheduling system that can be used to (iii) identify and quantify endogenous low abundance protein targets in complex biological mixtures with high accuracy by correlation to a spectral database using new software tools. The method offers a flexible, rapid, and cost-effective means for routine proteomic exploration of biological systems including “label-free” quantification, while minimizing spurious interferences. As proof-of-concept, we have examined the abundance of transcription factors and protein kinases mediating pluripotency and self-renewal in embryonic stem cell populations. PMID:20467045
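The "correlation to a spectral database" step can be illustrated with the normalized dot product (cosine similarity) commonly used to score an observed MS/MS spectrum against a library entry. This is a generic sketch with invented peak lists, not the authors' software; real tools additionally weight intensities by m/z and tolerate small mass errors:

```python
import math

def spectral_similarity(obs, ref):
    """Normalized dot product between two fragment-ion spectra given
    as {m/z: intensity} dicts. Returns 1.0 for identical relative
    intensity patterns and approaches 0.0 for disjoint peak sets."""
    mzs = sorted(set(obs) | set(ref))
    a = [obs.get(mz, 0.0) for mz in mzs]
    b = [ref.get(mz, 0.0) for mz in mzs]
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Illustrative spectra (m/z rounded to integers, arbitrary intensities):
library  = {175: 100.0, 304: 60.0, 421: 35.0, 534: 20.0}
observed = {175: 90.0, 304: 70.0, 421: 30.0, 617: 5.0}
score = spectral_similarity(observed, library)
print(round(score, 3))
```

Scheduling acquisition by predicted retention time, as the paper describes, simply restricts when each target's similarity test needs to be run.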
Development of a wireless displacement measurement system using acceleration responses.
Park, Jong-Woong; Sim, Sung-Han; Jung, Hyung-Jo; Spencer, Billie F
2013-07-01
Displacement measurements are useful information for various engineering applications such as structural health monitoring (SHM), earthquake engineering, and system identification. Most existing displacement measurement methods are costly and labor-intensive, and have difficulties particularly when applied to full-scale civil structures because the methods require stationary reference points. Indirect estimation methods converting acceleration to displacement can be a good alternative, as acceleration transducers are generally cost-effective, easy to install, and have low noise. However, the application of acceleration-based methods to full-scale civil structures such as long span bridges is challenging due to the need to install cables to connect the sensors to a base station. This article proposes a low-cost wireless displacement measurement system using acceleration. Developed with smart sensors that are low-cost, wireless, and capable of on-board computation, the wireless displacement measurement system has significant potential to impact many applications that need displacement information at multiple locations of a structure. The system implements an FIR-filter type displacement estimation algorithm that can remove the low frequency drifts typically caused by numerical integration of discrete acceleration signals. To verify the accuracy and feasibility of the proposed system, laboratory tests are carried out using a shaking table and a three-storey shear building model, experimentally confirming the effectiveness of the proposed system.
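The core idea, double integration of acceleration followed by high-pass filtering to remove the resulting low-frequency drift, can be sketched as follows. This is a simplified illustration (trapezoidal integration plus a moving-average FIR drift estimate), not the authors' exact filter design:

```python
import math

def acc_to_disp(acc, dt, window):
    """Estimate displacement from sampled acceleration: trapezoidal
    double integration, then subtraction of a centered moving average
    (an FIR low-pass) to strip the drift that integration introduces."""
    def integrate(x):
        out = [0.0]
        for i in range(1, len(x)):
            out.append(out[-1] + 0.5 * (x[i] + x[i - 1]) * dt)
        return out

    raw = integrate(integrate(acc))
    half = window // 2
    filtered = []
    for i in range(len(raw)):
        j0, j1 = max(0, i - half), min(len(raw), i + half + 1)
        filtered.append(raw[i] - sum(raw[j0:j1]) / (j1 - j0))
    return filtered

# Synthetic check: a(t) = -w^2 sin(wt) corresponds to d(t) = sin(wt)
dt, w = 0.01, 2 * math.pi          # 1 Hz motion sampled at 100 Hz
acc = [-w ** 2 * math.sin(w * i * dt) for i in range(500)]
disp = acc_to_disp(acc, dt, window=101)   # ~1 s averaging window
print(round(disp[225], 2), round(disp[275], 2))
```

The synthetic check recovers the 1 Hz sinusoidal displacement to within a few percent away from the record edges, where the centered window is truncated.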
UV Spectrophotometric Method for Estimation of Polypeptide-K in Bulk and Tablet Dosage Forms
NASA Astrophysics Data System (ADS)
Kaur, P.; Singh, S. Kumar; Gulati, M.; Vaidya, Y.
2016-01-01
An analytical method for estimation of polypeptide-k using UV spectrophotometry has been developed and validated for bulk as well as tablet dosage form. The developed method was validated for linearity, precision, accuracy, specificity, robustness, detection, and quantitation limits. The method has shown good linearity over the range from 100.0 to 300.0 μg/ml with a correlation coefficient of 0.9943. The percentage recovery of 99.88% showed that the method was highly accurate. The precision demonstrated relative standard deviation of less than 2.0%. The LOD and LOQ of the method were found to be 4.4 and 13.33, respectively. The study established that the proposed method is reliable, specific, reproducible, and cost-effective for the determination of polypeptide-k.
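The reported validation figures follow the usual ICH-style definitions, LOD = 3.3σ/S and LOQ = 10σ/S, where S is the slope of the calibration curve and σ the standard deviation of the response. A sketch with invented calibration data (not the study's measurements):

```python
def linfit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    return slope, my - slope * mx

def lod_loq(sigma, slope):
    """ICH-style limits: LOD = 3.3*sigma/S, LOQ = 10*sigma/S."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Hypothetical calibration data: concentration (ug/ml) vs absorbance
conc = [100.0, 150.0, 200.0, 250.0, 300.0]
absorb = [0.210, 0.312, 0.415, 0.520, 0.618]
slope, intercept = linfit(conc, absorb)
lod, loq = lod_loq(sigma=0.0027, slope=slope)  # sigma: assumed SD of response
print(round(slope, 5), round(lod, 2), round(loq, 2))
```

Note that LOQ/LOD is fixed at 10/3.3 ≈ 3.03 by these definitions, consistent with the 4.4 and 13.33 values the abstract reports.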
A Cost Analysis Model for Army Sponsored Graduate Dental Education Programs.
1997-04-01
characteristics of a good measurement tool? Cooper and Emory, in their textbook Business Research Methods, state there are three major criteria for evaluating... a measurement tool: validity, reliability, and practicality (Cooper and Emory 1995). Validity can be compartmentalized into internal and external... tremendous expense? The AEGD-1 year program is used extensively as a recruiting tool to encourage senior dental students to join the Army Dental Corps. The
Experiences Using Formal Methods for Requirements Modeling
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David
1996-01-01
This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, the formal modeling provided a cost-effective enhancement of the existing verification and validation processes. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.
Key issues for low-cost FGD installations
DOE Office of Scientific and Technical Information (OSTI.GOV)
DePriest, W.; Mazurek, J.M.
1995-12-01
This paper will discuss various methods for installing low-cost FGD systems. The paper will include a discussion of various types of FGD systems available, both wet and dry, and will compare the relative cost of each type. Important design issues, such as use of spare equipment, materials of construction, etc., will be presented. An overview of various low-cost construction techniques (i.e., modularization) will be included. This paper will draw heavily from Sargent & Lundy's database of past and current FGD projects together with information we gathered for several Electric Power Research Institute (EPRI) studies on the subject.