Andersen, J, Goetghebeur, E & Ryan, L 1996, 'Missing cause of death information in the analysis of survival data', Statistics in Medicine, vol. 15, no. 20, pp. 2191-2201.
View/Download from: Publisher's site
View description>>
Goetghebeur and Ryan proposed a method for proportional hazards analyses of competing risks failure-time data when the failure type is missing for some cases. This paper evaluates the properties of the method using data from a clinical trial in Hodgkin's disease. We generated several patterns of missingness in the cause of death in 'pseudo-studies' derived from the study database. We found that, as the percentage of missing failure types increased, the proposed method provided regression coefficients and inferences that were less biased than those from other methods, whether missingness was random, depended on an important covariate, depended on failure type, or depended on follow-up time. We present suggestions for study design with planned missingness in the failure type.
Bosch, RJ, Wypij, D & Ryan, LM 1996, 'A semiparametric approach to risk assessment for quantitative outcomes', Risk Analysis, vol. 16, no. 5, pp. 657-665.
View/Download from: Publisher's site
View description>>
Characterizing the dose-effect relationship and estimating acceptable exposure levels are the primary goals of quantitative risk assessment. A semiparametric approach is proposed for risk assessment with continuously measured or quantitative outcomes; it has advantages over existing methods in requiring fewer assumptions. The approach is based on pairwise ranking between the response values in the control group and those in the exposed groups. The work generalizes the rank-based Wilcoxon-Mann-Whitney test, which for the two-group comparison is effectively a test of whether a response from the control group is different from (larger than) a response in an exposed group. We develop a regression framework that naturally extends this metric to model the dose effect in terms of a risk function. Parameters of the regression model can be estimated with standard software. However, inference requires an additional step to estimate the variance structure of the estimated parameters. An effective dose (ED) and associated lower confidence limit (LED) are easily calculated. The method is supported by a simulation study and is illustrated with a study on the effects of aconiazide. The method offers flexible modeling of the dose effect, and since it is rank-based, it is more resistant to outliers, nonconstant variance and other departures from normality than previously described approaches.
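The pairwise-ranking idea underlying the Wilcoxon-Mann-Whitney statistic described in this abstract can be sketched as follows. This is an illustrative sketch only, not the authors' regression method; the data and function name are hypothetical.

```python
# Sketch of the pairwise-ranking metric behind the Wilcoxon-Mann-Whitney
# test: estimate p = P(control response > exposed response) by comparing
# every (control, exposed) pair, counting ties as one half.

def pairwise_exceedance(control, exposed):
    """Fraction of (control, exposed) pairs where control > exposed."""
    wins = 0.0
    for x in control:
        for y in exposed:
            if x > y:
                wins += 1.0
            elif x == y:
                wins += 0.5
    return wins / (len(control) * len(exposed))

# Illustrative responses (e.g. body weights) in two dose groups.
control = [10.2, 11.5, 9.8, 10.9]
exposed = [8.1, 9.0, 10.0, 7.5]
p_hat = pairwise_exceedance(control, exposed)  # 15/16 = 0.9375
```

A value of `p_hat` near 0.5 indicates no dose effect; values near 1 indicate the exposed responses are systematically lower, which is the metric the paper's regression framework models as a function of dose.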
Catalano, PJ, Ryan, LM & Kaden, DA 1996, 'Statistical design aspects of the NTP/HEI collaborative study on the health effects of chronic ozone inhalation', Inhalation Toxicology, vol. 8, no. 3, pp. 229-249.
View/Download from: Publisher's site
View description>>
The purpose of the NTP/HEI Collaborative Study was to assess exposure- and concentration-related health effects associated with chronic exposure to ozone. Data were obtained from 164 animals specially dedicated to HEI from a standard ozone inhalation study conducted by Battelle Pacific Northwest Laboratories for the National Toxicology Program. The study involved a number of investigators, each interested in assessing a different type of ozone-related health effect, including respiratory function, as well as structural, cellular and biochemical changes in the nose, lungs, and airways. Designing and analyzing a study with multiple investigators raises many statistical challenges. The highest design priority was that each investigator's data be individually interpretable as an independent study. This means that each investigator had to be assigned an adequate number of animals, balanced with respect to concentration level and other factors such as gender and time of sacrifice. An additional feature of the collaborative study was the opportunity it provided to assess and quantify the effect of ozone exposure on a broad spectrum of outcomes, and to explore the relationship between the different types of effect. For example, the data allowed an assessment of whether the animals with the greatest degree of structural damage were also the ones with altered biochemistry. Maximizing the potential to assess these types of correlations required that investigators overlap as much as possible on measurements in individual animals. This aspect of the statistical design requires careful consideration of the compatibility between various investigators. Fortunately, the degree of compatibility was high. In most cases, for example, it was possible to assess respiratory function in the animals prior to their sacrifice, and then to divide the tissue between several different investigators. This article focuses on the statistical design of the collaborative project. Brief ...
Hansen, RM, Ryan, L, Anderson, T, Krzywda, B, Quebbeman, E, Benson, A, Haller, DG & Tormey, DC 1996, 'Phase III Study of Bolus Versus Infusion Fluorouracil With or Without Cisplatin in Advanced Colorectal Cancer', JNCI Journal of the National Cancer Institute, vol. 88, no. 10, pp. 668-674.
View/Download from: Publisher's site
View description>>
Background: Phase II studies of fluorouracil (5-FU) administered by protracted intravenous infusion have suggested an improved response rate and decreased toxicity profile when compared with 5-FU given by bolus injection in patients with metastatic colorectal cancer. Additional studies have suggested further enhancement of infusion 5-FU activity when it is combined with low-dose weekly cisplatin administration. Purpose: This phase III study in adults with metastatic colorectal cancer was planned as a comparison of objective response rates, toxicity, and survival in patients receiving bolus versus protracted-infusion 5-FU with or without cisplatin. Methods: Four hundred ninety-seven previously untreated patients with advanced, measurable metastatic colorectal cancer were randomly assigned to receive treatment A (bolus 5-FU at 500 mg/m2 for 5 days followed in 2 weeks by weekly bolus 5-FU at 600 mg/m2), treatment B (bolus 5-FU at 500 mg/m2 for 5 days followed in 2 weeks by weekly bolus 5-FU at 600 mg/m2, plus weekly cisplatin at 20 mg/m2), treatment C (5-FU at 300 mg/m2 per day by continuous infusion), or treatment D (5-FU at 300 mg/m2 per day by continuous infusion plus weekly cisplatin at 20 mg/m2). All drugs were administered intravenously. Enrollment in the trial occurred from August 1987 through December 1990, and follow-up was through September 1995. The Kaplan-Meier method was used to estimate overall and disease-free survival, and Cox regression models were used to assess the effects of patient characteristics on survival. All P values resulted from two-sided tests. Results: Objective tumor response was observed in 28 (18%) of 153 patients receiving treatment A, in 45 (28%) of 159 patients receiving treatment C (C versus A; P = .045), and in 47 (31%) of 153 patients receiving treatment D (D versus A; P = .016). Because of excessive toxicity, treatment B was discontinued after only 12 patients had begun treatment. Median time to disease progression...
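The Kaplan-Meier method mentioned in this abstract can be sketched in a few lines. This is a generic illustration of the product-limit estimator, not the study's analysis; the survival times and censoring flags below are invented.

```python
# Minimal Kaplan-Meier product-limit estimator: at each distinct event
# time t, multiply the running survival estimate by (1 - d/n), where d
# is the number of deaths at t and n is the number still at risk.

def kaplan_meier(times, events):
    """Return (time, S(t)) pairs at each distinct event time.
    events[i] is 1 for an observed death, 0 for censoring."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = 0
        n_here = 0
        # Group all subjects with the same observed time.
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            n_here += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= n_here
    return curve

# Illustrative follow-up times (months) with 1 = death, 0 = censored.
times  = [2, 3, 3, 5, 8, 8, 12]
events = [1, 1, 0, 1, 1, 0, 0]
curve = kaplan_meier(times, events)
```

By convention, censoring at the same time as a death is treated as occurring just after it, so the censored subject still counts in the risk set at that time.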
Kaden, DA, Warren, J, Ryan, L, Boorman, G & Mellick, P 1996, 'The NTP/HEI collaborative ozone project on the health effects of chronic ozone inhalation', Inhalation Toxicology, vol. 8, no. 3, pp. 213-227.
View/Download from: Publisher's site
View description>>
Although many people are exposed to ozone, the effects of chronic exposure to this ubiquitous pollutant, especially low-level chronic exposure, are not well understood. The U.S. Environmental Protection Agency's (EPA) current National Ambient Air Quality Standard (NAAQS) for ozone is exceeded in many communities, especially during the summer. The standard is attained when the number of days per calendar year with maximum hourly average concentrations above 0.12 ppm is equal to or less than 1. The U.S. EPA estimates that 67 million people in the United States, or slightly more than a quarter of the residents, lived in areas that were out of compliance with the NAAQS for ozone in 1989. Although there have been some studies of long-term exposure to ozone, many important questions remain about the health effects of chronic ozone exposure. The Health Effects Institute (HEI), in conjunction with the National Toxicology Program (NTP) carcinogenesis studies, has completed a major effort to help answer these questions. NTP included additional animals in its study for HEI investigators. Included in this effort is a set of studies examining histopathological, biochemical, morphological, and functional alterations in rats exposed to 0, 0.12, 0.5 or 1.0 ppm ozone for 20 months. This article describes several aspects of this effort. This project can serve as a model for other large toxicological studies for which cancer may not be the only endpoint of concern. The additional animals required for the NTP/HEI Collaborative Ozone Project represented only a modest incremental cost, yet provided information on a much broader range of potential effects of ozone than the basic NTP carcinogenesis studies.
Li, JY & Chow, TWS 1996, 'Functional Approximation of Higher-Order Neural Networks', Journal of Intelligent Systems, vol. 6, no. 3-4, pp. 239-260.
View/Download from: Publisher's site
Andersen, JW, Goetghebeur, EJ & Ryan, L 1996, 'Analysis of survival data under competing risks with missing cause of death information: Application and implications for study design', Lifetime Data: Models in Reliability and Survival Analysis, 1994 International Research Conference on Lifetime Data Models in Reliability and Survival Analysis, Kluwer Academic Publishers, Harvard University, Cambridge, MA, pp. 13-19.
Cucchiara, R & Piccardi, M 1996, 'DARPA benchmark image processing on SIMD parallel machines', Proceedings of the 1996 IEEE Second International Conference on Algorithms and Architectures for Parallel Processing (ICA³PP '96), IEEE, Singapore, pp. 171-178.
View/Download from: Publisher's site
Cucchiara, R & Piccardi, M 1996, 'Detection of luminosity profiles of elongated shapes', Proceedings of the 3rd IEEE International Conference on Image Processing, IEEE, Lausanne, Switzerland, pp. 635-638.
View/Download from: Publisher's site
Luo, X & Zhang, C 1996, 'A unified algebraic structure for uncertain reasonings', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Springer Berlin Heidelberg, pp. 459-470.
View/Download from: Publisher's site
View description>>
© Springer-Verlag Berlin Heidelberg 1996. This paper identifies an axiomatic foundation for uncertain reasoning in rule-based expert systems: a near topological algebra (NT-algebra for short), which captures some basic notions underlying the uncertain reasoning models in rule-based expert systems. In accordance with the basic ways of topological connection in an inference network, an NT-algebraic structure has five basic operators, i.e. AND, OR, NOT, sequential combination and parallel combination, which obey certain axioms. An NT-algebraic structure is defined on a near-degree space introduced by the authors, which is a special topological space. The continuity of real functions, of fuzzy functions, and of functions in other senses can be considered uniformly in the framework of a near-degree space. This paper also proves that EMYCIN's and PROSPECTOR's uncertain reasoning models each correspond to a good NT-algebra. Compared to other related work, the NT-algebra as an axiomatic foundation has the following characteristics: (1) various cases of assessment of the uncertainties of both evidence and rules are put into a unified algebraic structure; and (2) major emphasis is placed on the basic laws governing their propagation in an inference network.
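As a concrete instance of the five operators the NT-algebra axiomatizes, the well-known EMYCIN certainty-factor calculus can be sketched as follows. This is an illustration of EMYCIN's model (which the paper shows corresponds to a good NT-algebra), not of the NT-algebra's axioms themselves; function names are ours.

```python
# EMYCIN-style certainty-factor (CF) operators, one concrete example of
# the AND / OR / NOT / sequential / parallel combination operators.
# CFs lie in [-1, 1]: positive = belief, negative = disbelief.

def cf_and(a, b):
    """Conjunction of premises: the weakest link."""
    return min(a, b)

def cf_or(a, b):
    """Disjunction of premises: the strongest link."""
    return max(a, b)

def cf_not(a):
    """Negation flips belief into disbelief."""
    return -a

def cf_seq(rule_cf, premise_cf):
    """Sequential combination: propagate evidence through a rule.
    Rules only fire on positive premise certainty."""
    return rule_cf * max(0.0, premise_cf)

def cf_par(a, b):
    """Parallel combination: two rules supporting the same conclusion."""
    if a >= 0 and b >= 0:
        return a + b * (1 - a)
    if a < 0 and b < 0:
        return a + b * (1 + a)
    return (a + b) / (1 - min(abs(a), abs(b)))

# Two rules with CFs 0.8 and 0.6 both fire on fully certain evidence:
combined = cf_par(cf_seq(0.8, 1.0), cf_seq(0.6, 1.0))  # 0.8 + 0.6*0.2 = 0.92
```

Note that parallel combination of two positive CFs never exceeds 1, which is one of the propagation laws the algebraic treatment makes explicit.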
Yang, H & Zhang, C 1996, 'Definition and application of a comprehensive framework for distributed problem solving', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Springer Berlin Heidelberg, pp. 1-15.
View/Download from: Publisher's site
View description>>
© Springer-Verlag Berlin Heidelberg 1996. In the Distributed Artificial Intelligence community, the term “Distributed Problem Solving (DPS)” is widely used. However, different people refer to different types of DPS frameworks. In this paper, we define a comprehensive framework for Distributed Problem Solving. Firstly, we clarify four different types of basic DPS frameworks: DPS1 for task unique-allocation problems, DPS2 for task multi-allocation problems, DPS3 for task decomposition problems, and DPS4 for task division problems. Then, we define the comprehensive framework, which can be any combination of DPS1, DPS2, DPS3, and DPS4. In this framework, Solution Integration (SI) is a necessary component. Three different types of Solution Integration are identified: Solution Synthesis, Solution Composition, and Solution Construction, which correspond to DPS2, DPS3 and DPS4, respectively. The definition of the four basic DPS frameworks and the establishment of the comprehensive framework for DPS will hopefully lead to a better understanding and implementation of DPS systems.
Xu, Y & Zhang, C 1996, 'An improved critical diagnosis reasoning method', Proceedings of the Eighth IEEE International Conference on Tools with Artificial Intelligence, IEEE Computer Society Press, Toulouse, France, pp. 170-173.
View/Download from: Publisher's site
Zhang, C & Lukose, D 1996, 'Distributed Artificial Intelligence: Architecture and Modelling: First Australian Workshop on DAI, Canberra, ACT, Australia, November 13, 1995, Proceedings', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics).
Zhang, M & Zhang, C 1996, 'Neural network strategies for solving synthesis problems in non-conflict cases in distributed expert systems', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Springer Berlin Heidelberg, pp. 174-188.
View/Download from: Publisher's site
View description>>
© Springer-Verlag Berlin Heidelberg 1996. In this paper, two neural network mechanisms for the synthesis of solutions in non-conflict cases in distributed expert systems (DESs) are proposed. The idea is as follows: the inputs of the neural network are the different solutions for the same problem produced by the different expert systems in a DES, and its outputs are the final solutions for the problem after combining those inputs, which should match the human experts' final solutions. The first contribution is to set up the architecture of the neural network and train it by adjusting the link weights so that its outputs match the human experts' solutions for all training patterns. The second is that the proposed neural network mechanism can accommodate a variable number of inputs and outputs without changing the network architecture.
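The training idea described in this abstract can be sketched in miniature. This is a hypothetical illustration, not the paper's architecture: a single linear unit learns weights over the expert systems' numeric solutions so that the combined output matches the human experts' final solutions; the data and learning rate are invented.

```python
# Toy solution-synthesis combiner: gradient descent fits weights over
# the expert systems' outputs so the weighted combination reproduces
# the human experts' final solutions on the training patterns.

def train_combiner(solutions, targets, lr=0.1, epochs=500):
    """solutions: one list per problem, holding each expert system's
    numeric solution; targets: the human experts' final solution."""
    n = len(solutions[0])
    w = [1.0 / n] * n          # start from a simple average
    for _ in range(epochs):
        for x, t in zip(solutions, targets):
            y = sum(wi * xi for wi, xi in zip(w, x))
            err = y - t
            # Per-sample gradient step on squared error.
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
    return w

# Two expert systems rate four problems; the experts' final answers
# happen to equal 0.3 * ES1 + 0.7 * ES2, so the combiner should
# recover weights near (0.3, 0.7).
solutions = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5], [0.8, 0.2]]
targets   = [0.3, 0.7, 0.5, 0.38]
w = train_combiner(solutions, targets)
```

A fixed-size linear unit cannot by itself handle a variable number of expert systems; the paper's second contribution addresses exactly that limitation, which this sketch does not attempt to reproduce.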