Bosch, RJ & Ryan, LM 1998, 'Generalized Poisson models arising from Markov processes', STATISTICS & PROBABILITY LETTERS, vol. 39, no. 3, pp. 205-212.
We develop a family of distributions which allow for over- and underdispersion relative to the Poisson. This latter feature is particularly appealing since many existing methods only allow for overdispersion. These distributions arise from underlying continuous-time Markov processes in which event rates depend on how many events have already occurred. The results are illustrated with underdispersed count data from a polyspermy study and overdispersed data from the Canadian Sickness Survey. © 1998 Elsevier Science B.V. All rights reserved.
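The state-dependent Markov mechanism described in the abstract above can be illustrated with a small simulation (a sketch only: the rate functions used below are assumed examples, not the family fitted in the paper):

```python
import random

def simulate_count(rate_fn, t_end=1.0, rng=random):
    """Draw one count from a pure-birth process observed on [0, t_end]:
    after k events have occurred, the next event arrives at rate rate_fn(k)."""
    t, k = 0.0, 0
    while True:
        rate = rate_fn(k)
        if rate <= 0:          # absorbed: no further events possible
            return k
        t += rng.expovariate(rate)
        if t > t_end:
            return k
        k += 1

def mean_var(xs):
    m = sum(xs) / len(xs)
    return m, sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
```

With a constant rate the count is exactly Poisson; a rate that decays as events accumulate yields variance below the mean (underdispersion), while a rate that grows with the count inflates the variance above the mean (overdispersion).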
Cucchiara, R, Neri, G & Piccardi, M 1998, 'A real-time hardware implementation of the hough transform', Journal of Systems Architecture, vol. 45, no. 1, pp. 31-45.
The paper presents a hardware implementation of algorithms based on the Hough transform (HT) for real-time straight line detection, in particular the basic HT on edge points (EHT) and the Gradient-Weighted Hough transform (GWHT) for gray-level images.
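The voting scheme behind the basic HT can be sketched in a few lines (an illustrative software version; the paper's contribution is a hardware realization, and the GWHT would additionally weight each vote by gradient magnitude):

```python
import math
from collections import Counter

def hough_accumulator(points, n_theta=180, rho_res=1.0):
    """Vote every edge point (x, y) into a (theta, rho) accumulator using
    the normal form rho = x*cos(theta) + y*sin(theta); accumulator peaks
    correspond to straight lines passing through many edge points."""
    acc = Counter()
    for x, y in points:
        for i in range(n_theta):
            theta = math.pi * i / n_theta
            rho = x * math.cos(theta) + y * math.sin(theta)
            acc[(i, round(rho / rho_res))] += 1
    return acc

# Ten collinear points on the vertical line x = 3 produce a single
# dominant peak at theta = 0, rho = 3:
points = [(3, y) for y in range(0, 100, 10)]
(theta_idx, rho_bin), votes = max(hough_accumulator(points).items(),
                                  key=lambda kv: kv[1])
```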
Gaylor, D, Ryan, L, Krewski, D & Zhu, YL 1998, 'Procedures for calculating benchmark doses for health risk assessment', REGULATORY TOXICOLOGY AND PHARMACOLOGY, vol. 28, no. 2, pp. 150-164.
Safety assessment for noncancer health effects generally has been based upon dividing a no observed adverse effect level (NOAEL) by uncertainty (safety) factors to provide an acceptable daily intake (ADI) or reference dose (RfD). Since the NOAEL does not utilize all of the available dose-response data, allows a higher ADI from poorer experiments, and may carry an unknown, unacceptable level of risk, the benchmark dose (BD) with a specified, controlled low level of risk has become popular as an adjunct to the NOAEL or the lowest observed adverse effect level (LOAEL) in the safety assessment process. The purpose of this paper is to summarize statistical procedures available for calculating BDs and their confidence limits for noncancer endpoints. Procedures are presented and illustrated for quantal (binary), quasicontinuous (proportion), and continuous data. Quasicontinuous data arise in developmental studies where the measure of an effect for a fetus is quantal (normal or abnormal) but the experimental unit is the mother (litter), so that results can be expressed as the proportion of abnormal fetuses per litter. However, the correlation of effects among fetuses within a litter poses some additional statistical problems. Also, developmental studies usually include some continuous measures, such as fetal body weight or length. With continuous data there generally is not a clear demarcation between normal and adverse measurements. In such cases, extremely high and/or low measurements at some designated percentile(s) can be considered abnormal. Then the probability (risk) of abnormal individuals can be estimated as a function of dose. The procedure for estimating a BD with continuous data is illustrated using neurotoxicity data. When multiple measures of adverse effects are available, a BD can be estimated based on a selected endpoint or the appearance of any combination of endpoints. Multivariate procedures are illustrated using developmental and reproductive toxicity data.
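For quantal data, a benchmark dose under a fitted logistic model has a closed form. The sketch below uses illustrative parameter names and defines the BD via extra risk, one of the definitions discussed in this literature, solving (P(d) - P(0)) / (1 - P(0)) = BMR for d:

```python
import math

def logistic(z):
    return 1.0 / (1.0 + math.exp(-z))

def benchmark_dose(a, b, bmr=0.10):
    """Dose at which the extra risk (P(d) - P(0)) / (1 - P(0)) equals bmr,
    for a fitted quantal logistic model P(d) = logistic(a + b*d)."""
    p0 = logistic(a)                       # background response rate
    p_star = p0 + bmr * (1.0 - p0)         # target response at the BD
    return (math.log(p_star / (1.0 - p_star)) - a) / b
```

In practice the fitted parameters come with sampling error, and it is the lower confidence limit on this dose (the BMDL) that is used in risk assessment.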
Ibrahim, JG, Ryan, LM & Chen, MH 1998, 'Using historical controls to adjust for covariates in trend tests for binary data', JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, vol. 93, no. 444, pp. 1282-1293.
Historical data often play an important role in helping interpret the results of a current study. This article is motivated primarily by one specific application: the analysis of data from rodent carcinogenicity studies. By proposing a suitable informative prior distribution on the relationship between control outcome data and covariates, we derive modified trend test statistics that incorporate historical control information to adjust for covariate effects. Frequentist and fully Bayesian methods are presented, and novel computational techniques are developed to compute the test statistics. Several attractive theoretical and computational properties of the proposed priors are derived. In addition, a semiautomatic elicitation scheme for the priors is developed. Our approach is used to modify a widely used prevalence test for carcinogenicity studies. The proposed methodology is applied to data from a National Toxicology Program carcinogenicity experiment and is shown to provide helpful insight on the results of the analysis. © 1998 Taylor & Francis Group, LLC.
Keller, SM, Ryan, LM, Coia, LR, Dang, P, Vaught, DJ, Diggs, C, Weiner, LM & Benson, AB 1998, 'High dose chemoradiotherapy followed by esophagectomy for adenocarcinoma of the esophagus and gastroesophageal junction', Cancer, vol. 83, no. 9, pp. 1908-1916.
Lindsey, JC & Ryan, LM 1998, 'Tutorial in biostatistics - Methods for interval-censored data', STATISTICS IN MEDICINE, vol. 17, no. 2, pp. 219-238.
In standard time-to-event or survival analysis, occurrence times of the event of interest are observed exactly or are right-censored, meaning that it is only known that the event occurred after the last observation time. There are numerous methods available for estimating the survival curve and for testing and estimation of the effects of covariates in this context. In some situations, however, the times of the events of interest may only be known to have occurred within an interval of time. In clinical trials, for example, patients are often seen at pre-scheduled visits but the event of interest may occur in between visits. These data are interval-censored. Owing to the lack of well-known statistical methodology and available software, a common ad hoc approach is to assume that the event occurred at the end (or beginning or midpoint) of each interval, and then apply methods for standard time-to-event data. However, this approach can lead to invalid inferences, and in particular will tend to underestimate the standard errors of the estimated parameters. The purpose of this tutorial is to illustrate and compare available methods which correctly treat the data as being interval-censored. It is not meant to be a full review of all existing methods, but only those which are available in standard statistical software, or which can be easily programmed. All approaches will be illustrated on two data sets and compared with methods which ignore the interval-censored nature of the data. We hope this tutorial will allow those familiar with the application of standard survival analysis techniques the option of applying appropriate methods when presented with interval-censored data. © 1998 John Wiley & Sons, Ltd.
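The standard nonparametric alternative to midpoint imputation is the Turnbull self-consistency (EM) estimator. Below is a minimal sketch that assumes a user-supplied support grid covering every observation interval; the full estimator derives its support intervals from the data:

```python
def turnbull(intervals, support, n_iter=500):
    """Self-consistency (EM) estimate of the event-time mass p_j on the
    given support points, for events known only to lie in (L_i, R_i].
    Every interval must contain at least one support point."""
    member = [[l < s <= r for s in support] for l, r in intervals]
    p = [1.0 / len(support)] * len(support)
    for _ in range(n_iter):
        new = [0.0] * len(support)
        for row in member:
            # E-step: split each observation's unit mass over the support
            # points its interval contains, proportionally to current p.
            denom = sum(pj for pj, m in zip(p, row) if m)
            for j, m in enumerate(row):
                if m:
                    new[j] += p[j] / denom
        p = [x / len(intervals) for x in new]   # M-step: renormalise
    return p
```

Exactly observed or right-censored times fit the same framework as degenerate or half-open intervals, which is why interval censoring generalises the standard setting.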
Padungtod, C, Lasley, BL, Christiani, DC, Ryan, LM & Xu, XP 1998, 'Reproductive hormone profile among pesticide factory workers', JOURNAL OF OCCUPATIONAL AND ENVIRONMENTAL MEDICINE, vol. 40, no. 12, pp. 1038-1047.
Serum follicle-stimulating hormone (FSH), luteinizing hormone (LH), and testosterone levels, as well as urinary levels of FSH, LH, and E1C, a metabolite of testosterone, were measured to investigate the adverse reproductive effects of organophosphate pesticides.
Smith, TJ, Ryan, LM, Douglass, HO, Haller, DG, Dayal, Y, Kirkwood, J, Tormey, DC, Schutt, AJ, Hinson, J & Sischy, B 1998, 'Combined chemoradiotherapy vs. radiotherapy alone for early stage squamous cell carcinoma of the esophagus: a study of the eastern cooperative oncology group', International Journal of Radiation Oncology*Biology*Physics, vol. 42, no. 2, pp. 269-276.
Squamous carcinoma of the thoracic esophagus has an extremely poor prognosis. This study, EST-1282, was undertaken by the Eastern Cooperative Oncology Group (ECOG) to determine whether the combined use of 5-fluorouracil (5-FU), mitomycin C, and radiation improves outcomes relative to radiotherapy alone.
Stoler, JM, Huntington, KS, Peterson, CM, Peterson, KP, Daniel, P, Aboagye, KK, Lieberman, E, Ryan, L & Holmes, LB 1998, 'The prenatal detection of significant alcohol exposure with maternal blood markers', The Journal of Pediatrics, vol. 133, no. 3, pp. 346-352.
Objective: To examine the efficacy of a combination of 4 blood markers of alcohol use in detecting alcohol-abusing pregnant women. Study design: Two new markers of alcohol use, whole blood-associated acetaldehyde and carbohydrate-deficient transferrin, and 2 traditional markers of alcohol use, γ-glutamyl transpeptidase and mean red blood cell volume, were measured in the blood of pregnant women. Each woman was interviewed about alcohol and drug use, medical and obstetric histories, and nutrition. Each infant was examined by a clinician who was blinded to exposure status. Results: All of the women who reported drinking an average of 1 or more ounces of absolute alcohol per day had at least 1 positive blood marker. The infants of mothers with 2 or more positive markers had significantly smaller birth weights, lengths, and head circumferences than the infants with negative maternal screens. The presence of 2 or more positive markers was more predictive of infant outcome than any self-reporting measure. Conclusions: These markers, which detect more at-risk pregnant women than self-reporting methods, could lead to better efforts at detection and prevention of alcohol-induced fetal damage.
Weller, EA & Ryan, LM 1998, 'Testing for trend with count data', BIOMETRICS, vol. 54, no. 2, pp. 762-773.
Among the tests that can be used to detect dose-related trends in count data from toxicological studies are nonparametric tests such as the Jonckheere-Terpstra and likelihood-based tests, for example, based on a Poisson model. This paper was motivated by a data set of tumor counts in which conflicting conclusions were obtained using these two tests. To define situations where one test may be preferable, we compared the small and large sample performance of these two tests as well as a robust and conditional version of the likelihood-based test in the absence and presence of a dose-related trend for both Poisson and overdispersed Poisson data. Based on our results, we suggest using the Poisson test when little overdispersion is present in the data. For more overdispersed data, we recommend using the robust Poisson test for highly discrete data (response rates lower than 2 or 3) and the robust Poisson test or the Jonckheere-Terpstra test for moderately discrete or continuous data (average responses larger than 2 or 3). We also studied the effects of dose metameter misspecification. A clear effect on efficiency was seen when the 'wrong' dose metameter was used to compute the test statistic. In general, unless there is strong reason to do otherwise, we recommend the use of equally spaced dose levels when applying the Poisson or robust Poisson test for trend.
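A likelihood-based Poisson trend test of the kind compared here can be sketched as a score statistic against the pooled-rate null (an illustrative version; the paper's exact statistic and its robust and conditional variants may differ):

```python
import math

def poisson_trend_z(doses, counts):
    """Score-type Z statistic for a dose-related trend in Poisson counts.
    doses[i] is the dose metameter for group i; counts[i] is the list of
    per-animal counts in that group. Under H0 (common rate), Z ~ N(0, 1)."""
    n = [len(c) for c in counts]             # animals per group
    y = [sum(c) for c in counts]             # total count per group
    N, total = sum(n), sum(y)
    lam = total / N                          # pooled rate under H0
    dbar = sum(ni * di for ni, di in zip(n, doses)) / N
    u = sum(di * (yi - ni * lam) for di, yi, ni in zip(doses, y, n))
    var = lam * sum(ni * (di - dbar) ** 2 for ni, di in zip(n, doses))
    return u / math.sqrt(var)
```

A robust version would replace the Poisson variance `lam` with an empirical within-group variance estimate, which is what protects the test against overdispersion.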
Witte, RS, Ryan, LM, Schutt, AJ, Carbone, PP & Engstrom, PF 1998, 'PALA Versus Streptozotocin, Doxorubicin, and Methyl-CCNU in the Treatment of Patients With Advanced Pancreatic Carcinoma', Investigational New Drugs, vol. 16, no. 4, pp. 315-318.
Seventy-three eligible, chemotherapy-naive, ambulatory patients with advanced pancreatic carcinoma were allocated to one of two treatment regimens: 35 received PALA (1250 mg/m2 daily x 5 every 4 weeks) and 38 were given SAM (streptozotocin 400 mg/m2 IV daily x 5, doxorubicin 45 mg/m2 IV on days 1 and 22, and methyl CCNU 60 mg/m2 orally on days 1 and 22 every 6 weeks). Doses were modified for myelo-, GI-, or cardiotoxicity. Adequate organ, bone marrow and cardiac function; a measurable lesion; adequate caloric intake; and a life expectancy of 2 months were required for treatment on this trial. One patient on each regimen had a partial response, for response rates of 3% (95% confidence intervals, 0.08 to 17%). Median survival on the PALA arm was 5 months and median time to treatment failure was 2.6 months. SAM patients experienced median overall and progression-free survivals of 3.4 and 1.9 months, respectively. The severe toxicity observed was almost exclusively myelosuppression on both regimens. One patient receiving SAM had lethal leukopenic sepsis during the first cycle as the only treatment-related death. Neither PALA nor SAM offers any therapeutic utility to patients with advanced pancreatic cancer.
Xu, XP, Cho, SI, Sammel, M, You, LY, Cui, SC, Huang, YM, Ma, GH, Padungtod, C, Pothier, L, Niu, TH, Christiani, D, Smith, T, Ryan, L & Wang, LH 1998, 'Association of petrochemical exposure with spontaneous abortion', OCCUPATIONAL AND ENVIRONMENTAL MEDICINE, vol. 55, no. 1, pp. 31-36.
Barattin, M, Cucchiara, R & Piccardi, M 1970, 'A Rule-based Vehicular Traffic Tracking System', Proceedings of the Joint Conference on Information Sciences, pp. 334-337.
The paper presents a computer vision-based approach to the problem of vehicular traffic monitoring. The approach couples a high-level tracking system with a low-level system that detects moving vehicles. The high-level module is based on a large set of rules and keeps track of all moving or stopped vehicles along the image sequence.
Cucchiara, R & Piccardi, M 1998, 'Exploiting image processing locality in cache pre-fetching', Proceedings. Fifth International Conference on High Performance Computing (Cat. No. 98EX238), Proceedings. Fifth International Conference on High Performance Computing, IEEE Comput. Soc, CHENNAI, INDIA, pp. 466-472.
Egan, KM, Ryan, LM & Gragoudas, ES 1998, 'Survival implications of enucleation after definitive radiotherapy for choroidal melanoma - An example of regression on time-dependent covariates', ARCHIVES OF OPHTHALMOLOGY, 67th Annual Meeting of the Association for Research in Vision and Ophthalmology, Amer Medical Assoc, FT LAUDERDALE, FLORIDA, pp. 366-370.
Objective: To evaluate whether the removal of the eye after radiotherapy alters the rates of metastatic death in patients with melanoma of the choroid. Patients and Methods: Using an extension of the Cox model, we based our analysis on a cohort of 1541 cases.
Zhang, M & Zhang, C 1997, 'Investigations on solution synthesis in distributed expert systems', 1997 IEEE International Conference on Intelligent Processing Systems (Cat. No.97TH8335), 1997 IEEE International Conference on Intelligent Processing Systems, IEEE, pp. 1108-1112.
In this paper, a general procedure of distributed problem solving in distributed expert systems (DESs) is formally described, solution synthesis and solution composition in DESs are compared, and the relationship between solution synthesis in DESs and conflict resolution in DAI is analyzed. Furthermore, general methodologies used for solution synthesis in DESs are introduced and compared.
Xu, Y & Zhang, C 1998, 'An efficient and practical diagnosis model', Proceedings Tenth IEEE International Conference on Tools with Artificial Intelligence (Cat. No.98CH36294), 10th International Conference on Tools with Artificial Intelligence (ICTAI'98), IEEE, TAIPEI, TAIWAN, pp. 367-374.
Zhang, C & Li, Y 1998, 'An algorithm for plan verification in multiple agent systems', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Springer Berlin Heidelberg, pp. 149-163.
© Springer-Verlag Berlin Heidelberg 1998. In this paper, we propose an algorithm which improves Katz and Rosenschein’s plan verification algorithm. First, we represent the plan-like relations with adjacency lists and inverse adjacency lists in place of adjacency matrices. Then, we present a method to avoid generating useless sub-graphs while generating the compressed set. Last, we compare the two plan verification algorithms. We not only prove that our algorithm is correct, but also show that it outperforms Katz and Rosenschein’s algorithm in both time and space complexity.
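The representation change described above — adjacency lists plus inverse adjacency lists in place of an n x n matrix — can be sketched as follows (illustrative names, not the paper's code):

```python
from collections import defaultdict

def plan_relations(edges):
    """Store a plan-like precedence relation as an adjacency list and an
    inverse adjacency list: O(n + e) space for n actions and e relations,
    instead of the O(n^2) cells of an adjacency matrix."""
    succ = defaultdict(list)   # succ[u]: actions that must come after u
    pred = defaultdict(list)   # pred[v]: actions that must come before v
    for u, v in edges:
        succ[u].append(v)
        pred[v].append(u)
    return succ, pred
```

For the sparse relations typical of plans (e much smaller than n squared), both traversal in either direction and storage benefit from this layout.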
Zhang, C & Luo, X 1998, 'Transformation between the EMYCIN model and the Bayesian network', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Springer Berlin Heidelberg, pp. 205-219.
© Springer-Verlag Berlin Heidelberg 1998. If different expert systems in a distributed expert system use different uncertain reasoning models, it is necessary to transform the uncertainty of a proposition from one model to another when they cooperate to solve problems. This paper looks at ways to transform uncertainties between the EMYCIN model and the Bayesian network. In the past, the uncertainty management scheme employed most extensively in expert systems was the EMYCIN model; the field is now turning towards the Bayesian network. If pre-existing stand-alone expert systems that use these two models can be combined, by means of the Internet, into a distributed expert system, the usefulness of these individual expert systems in real applications will be greatly improved. The work described in this paper is an important step in this direction.
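One common bridge between EMYCIN certainty factors and probabilities is Heckerman's probabilistic interpretation; the sketch below uses that mapping purely for illustration and is not necessarily the exact transformation derived in the paper:

```python
def cf_from_prob(prior, posterior):
    """Heckerman-style certainty factor for hypothesis h given evidence e:
    positive CFs scale the remaining room above the prior P(h),
    negative CFs scale the prior itself."""
    if posterior >= prior:
        return (posterior - prior) / (1.0 - prior)
    return (posterior - prior) / prior

def prob_from_cf(prior, cf):
    """Invert the mapping: recover P(h|e) from the prior and a CF."""
    if cf >= 0:
        return prior + cf * (1.0 - prior)
    return prior * (1.0 + cf)
```

The two functions are inverses of each other for any prior strictly between 0 and 1, which is the round-trip property any such transformation between the models needs.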