Astuti, D, Latif, F, Wagner, K, Gentle, D, Cooper, WN, Catchpoole, D, Grundy, R, Ferguson-Smith, AC & Maher, ER 2005, 'Epigenetic alteration at the DLK1-GTL2 imprinted domain in human neoplasia: analysis of neuroblastoma, phaeochromocytoma and Wilms' tumour', British Journal of Cancer, vol. 92, no. 8, pp. 1574-1580.
View/Download from: Publisher's site
View description>>
Epigenetic alterations in the 11p15.5 imprinted gene cluster are frequent in human cancers and are associated with disordered imprinting of insulin-like growth factor (IGF)2 and H19. Recently, an imprinted gene cluster at 14q32 has been defined and includes two closely linked but reciprocally imprinted genes, DLK1 and GTL2, that have similarities to IGF2 and H19, respectively. Both GTL2 and H19 are maternally expressed RNAs with no protein product and display paternal allele promoter region methylation, and DLK1 and IGF2 are both paternally expressed. To determine whether methylation alterations within the 14q32 imprinted domain occur in human tumorigenesis, we investigated the status of the GTL2 promoter differentially methylated region (DMR) in 20 neuroblastoma tumours, 20 phaeochromocytomas and 40 Wilms' tumours. Hypermethylation of the GTL2 promoter DMR was detected in 25% of neuroblastomas, 10% of phaeochromocytomas and 2.5% of Wilms' tumours. Tumours with GTL2 promoter DMR hypermethylation also demonstrated hypermethylation at an upstream intergenic DMR thought to represent a germline imprinting control element. Analysis of neuroblastoma cell lines revealed that GTL2 DMR hypermethylation was associated with transcriptional repression of GTL2. These epigenetic findings are similar to those reported in Wilms' tumours, in which H19 repression and DMR hypermethylation are associated with loss of imprinting (LOI, biallelic expression) of IGF2. However, a neuroblastoma cell line with hypermethylation of the GTL2 promoter and intergenic DMR did not show LOI of DLK1, although treatment with a demethylating agent restored GTL2 expression and reduced DLK1 expression. As described for IGF2/H19, epigenetic changes at DLK1/GTL2 occur in human cancers. However, these changes are not associated with DLK1 LOI, highlighting differences in the imprinting control mechanisms operating in the IGF2-H19 and DLK1-GTL2 domains. GTL2 promoter and intergenic DMR hypermethylation is as...
Beydoun, G, Debenham, J & Hoffmann, A 2005, 'Using messaging structure to evolve agents roles in electronic markets', INTELLIGENT AGENTS AND MULTI-AGENT SYSTEMS, vol. 3371, pp. 18-28.
View description>>
Exogenous dynamics play a central role in survival and evolution of institutions. In this paper, we develop an approach to automate part of this evolution process for electronic market places which bring together many online buyers and suppliers. In part
Beydoun, G, Gonzalez-Perez, C, Low, G & Henderson-Sellers, B 2005, 'Synthesis of a generic MAS metamodel.', ACM SIGSOFT Softw. Eng. Notes, vol. 30, no. 4, pp. 1-5.
View/Download from: Publisher's site
Beydoun, G, Hoffmann, AG, Fernández-Breis, JT, Martínez-Béjar, R, Valencia-García, R & Aurum, A 2005, 'Cooperative Modelling Evaluated.', Int. J. Cooperative Inf. Syst., vol. 14, no. 1, pp. 45-71.
View/Download from: Publisher's site
Bilke, S, Chen, Q-R, Westerman, F, Schwab, M, Catchpoole, D & Khan, J 2005, 'Inferring a Tumor Progression Model for Neuroblastoma From Genomic Data', Journal of Clinical Oncology, vol. 23, no. 29, pp. 7322-7331.
View/Download from: Publisher's site
View description>>
Purpose: The knowledge of the key genomic events that are causal to cancer development and progression not only is invaluable for our understanding of cancer biology but also may have a direct clinical impact. The task of deciphering a model of tumor progression by requiring that it explains (or at least does not contradict) known clinical and molecular evidence can be very demanding, particularly for cancers with complex patterns of clinical and molecular evidence. Materials and Methods: We formalize the process of model inference and show how a progression model for neuroblastoma (NB) can be inferred from genomic data. The core idea of our method is to translate the model of clonal cancer evolution into mathematically testable rules of inheritance. Seventy-eight NB samples in stages 1, 4S, and 4 were analyzed with array-based comparative genomic hybridization. Results: The pattern of recurrent genomic alterations in NB is strongly stage dependent, and it is possible to identify traces of tumor progression in this type of data. Conclusion: A tumor progression model for neuroblastoma is inferred, which is in agreement with clinical evidence, explains part of the heterogeneity of the clinical behavior observed for NB, and is compatible with existing empirical models of NB progression.
Bremner, MJ, Bacon, D & Nielsen, MA 2005, 'Simulating Hamiltonian dynamics using many-qudit Hamiltonians and local unitary control', PHYSICAL REVIEW A, vol. 71, no. 5.
View/Download from: Publisher's site
Cao, YZ & Ying, MS 2005, 'Supervisory control of fuzzy discrete event systems', IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS, vol. 35, no. 2, pp. 366-371.
View/Download from: Publisher's site
View description>>
To cope with situations in which a plant's dynamics are not precisely known, we consider the problem of supervisory control for a class of discrete event systems modeled by fuzzy automata. The behavior of such discrete event systems is described by fuzzy
Cetindamar, D 2005, 'Policy issues for Turkish entrepreneurs', Int. J. of Entrepreneurship and Innovation Management, vol. 5, no. 3/4, pp. 187-205.
View/Download from: Publisher's site
View description>>
While it is becoming clear that there is a positive relationship between entrepreneurship and economic development, the topic of entrepreneurship in developing countries has been neglected in the literature. This paper assesses the problems and expectations of entrepreneurs in Turkey. Its main findings are as follows: Turkey underutilises youth and women entrepreneurial resources; there exists a large informal economy that tends to support self-employment rather than entrepreneurship per se; entrepreneurs do not have the kinds of ties with organisations that might be helpful when they are first starting out; entrepreneurs see as their main problems bureaucracy and unstable state policies. Based on these findings, the paper concludes with a policy discussion regarding the development of entrepreneurship in Turkey.
Cetindamar, D, Çatay, B & Serdar Basmaci, O 2005, 'Competition through collaboration: insights from an initiative in the Turkish textile supply chain', Supply Chain Management: An International Journal, vol. 10, no. 4, pp. 238-240.
View/Download from: Publisher's site
View description>>
Purpose: To gain an understanding of the benefits, bridges, and barriers associated with supply chain collaboration. Design/methodology/approach: Insights from extensive field research of a successful collaboration example in the Turkish dyeing and finishing industry. Findings: The competition among firms is increasingly shifting from company vs company to supply chain vs supply chain. The insights obtained from the collaborative model in this textile supply chain provide a good understanding of the benefits, bridges, and barriers associated with supply chain collaboration. Benefits can be grouped as customer-oriented benefits, productivity benefits, and innovation related benefits. Factors supporting collaboration are observed as trust, common goals for cooperation, and existence of cooperation mechanisms, while barriers are related to three factors: lack of trust, risk-benefit evaluation, and lack of common goals for cooperation. Research limitations/implications: Findings are based on interviews and questionnaires conducted with the managers of 3T, 30 dyeing and finishing firms (ten are partners) and six technology-supplying partner firms, from various regions in Turkey. Practical implications: Highlights the importance of trust and collaboration mechanisms in managing collaborations. As the case of 3T in the dyeing and finishing industry shows, collaborations might significantly contribute to the competitiveness of textile firms. Originality/val...
CHIN, C-L & LIN, C-T 2005, 'DETECTION AND COMPENSATION ALGORITHM FOR BACKLIGHT IMAGES WITH FUZZY LOGIC AND ADAPTIVE COMPENSATION CURVE', International Journal of Pattern Recognition and Artificial Intelligence, vol. 19, no. 08, pp. 1041-1057.
View/Download from: Publisher's site
View description>>
This paper presents a new algorithm for the detection and compensation of backlight images. The proposed technique addresses weaknesses of conventional backlight image processing methods, such as over-saturation and loss of contrast. The proposed algorithm consists of two operation phases: a detection phase and a compensation phase. In the detection phase, we use the spatial position characteristic and histogram of the backlight image to obtain two image indices, which can determine the backlight degree of an image. Fuzzy logic is then used to integrate these two indices into a final backlight index that determines the final backlight degree of an image precisely. In the compensation phase, to solve the over-saturation problem that usually exists in conventional image compensation methods, we propose an adaptive compensation-curve scheme to compensate and enhance the brightness of backlight images. The luminance of a backlight image is adjusted according to the compensation curve, which is adapted dynamically according to the backlight degree indicated by the backlight index estimated in the detection phase. The performance of the proposed technique is tested on 100 backlight images covering various kinds of backlight conditions and degrees. The experimental and comparison results clearly show the superiority of the proposed technique.
Chin-Teng Lin, Chun-Lung Chang & Jen-Feng Chung 2005, 'New horizon for CNN: with fuzzy paradigms for multimedia', IEEE Circuits and Systems Magazine, vol. 5, no. 2, pp. 20-35.
View/Download from: Publisher's site
View description>>
The cellular neural network (CNN) is a powerful technique to mimic the local function of biological neural circuits for real-time image and video processing. Recently, it has become widely accepted that using a set of CNNs in parallel can achieve higher-level information processing and reasoning functions, from either an application or a biological point of view. We introduce a novel framework for constructing a multiple-CNN integrated neural system called the recurrent fuzzy CNN (RFCNN). This system can automatically learn its proper network structure and parameters simultaneously. In the RFCNN, each learned fuzzy rule corresponds to a CNN. Hence, each CNN takes care of a fuzzily separated problem region, and the functions of all CNNs are integrated through the fuzzy inference mechanism. Some on-line clustering algorithms are introduced for the structure learning, and the ordered-derivative calculus is applied to derive the recurrent learning rules of CNN templates in the parameter-learning phase. The RFCNN provides a solution to the current dilemma on the decision of templates and/or fuzzy rules in the existing integrated (fuzzy) CNN systems. The capability of the RFCNN is demonstrated on real-world vision-based defect inspection and image descreening problems, proving that the RFCNN scheme is effective and promising. ©2005 IEEE.
Chin-Teng Lin, Ruei-Cheng Wu, Sheng-Fu Liang, Wen-Hung Chao, Yu-Jie Chen & Tzyy-Ping Jung 2005, 'EEG-based drowsiness estimation for safety driving using independent component analysis', IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 52, no. 12, pp. 2726-2738.
View/Download from: Publisher's site
View description>>
Preventing accidents caused by drowsiness has become a major focus of active safety driving in recent years. It requires an optimal technique to continuously detect drivers' cognitive state related to abilities in perception, recognition, and vehicle control in (near-) real-time. The major challenges in developing such a system include: 1) the lack of a significant index for detecting drowsiness and 2) complicated and pervasive noise interferences in a realistic and dynamic driving environment. In this paper, we develop a drowsiness-estimation system based on electroencephalogram (EEG) by combining independent component analysis (ICA), power-spectrum analysis, correlation evaluations, and a linear regression model to estimate a driver's cognitive state when he/she drives a car in a virtual reality (VR)-based dynamic simulator. The driving error is defined as deviations between the center of the vehicle and the center of the cruising lane in the lane-keeping driving task. Experimental results demonstrate the feasibility of quantitatively estimating drowsiness level using ICA-based multistream EEG spectra. The proposed ICA-based method applied to the power spectra of ICA components can successfully (1) remove most EEG artifacts, (2) suggest an optimal montage to place EEG electrodes, and (3) estimate the driver's drowsiness fluctuation indexed by the driving performance measure. Finally, we present a benchmark study in which the accuracy of ICA-component-based alertness estimates compares favorably to scalp-EEG-based estimates. © 2005 IEEE.
Chin-Teng Lin, Wen-Chang Cheng & Sheng-Fu Liang 2005, 'An on-line ICA-mixture-model-based self-constructing fuzzy neural network', IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 52, no. 1, pp. 207-221.
View/Download from: Publisher's site
Chonghui Song & Tianyou Chai 2005, 'Comment on 'Discrete-time optimal fuzzy controller design: global concept approach'', IEEE Transactions on Fuzzy Systems, vol. 13, no. 2, pp. 285-286.
View/Download from: Publisher's site
de Kort, Y, IJsselsteijn, W, Midden, C, Eggen, B & van den Hoven, E 2005, 'Persuasive Gerontechnology', Gerontechnology, vol. 4, no. 3, pp. 123-127.
View description>>
Gerontechnology is a domain that originated a few decades ago and has been developing steadily ever since. Its research focuses on a broad set of technologies that serve the ageing society. The present paper aims to connect this domain to a new but promising technology domain that holds great potential for the older population: persuasive technology.
de Raadt, M, Hamilton, M, Lister, RF, Tutty, J, Baker, B, Box, I, Cutts, QI, Fincher, S, Hamer, J, Haden, P, Petre, M, Robins, A, Sutton, K & Tolhurst, D 2005, 'Approaches to learning in computer programming students and their effect on success', Research and Development in Higher Education Series, vol. 28, pp. 407-414.
Devitt, SJ, Cole, JH & Hollenberg, LCL 2005, 'Scheme for direct measurement of a general two-qubit Hamiltonian', Phys. Rev. A., vol. 73, no. 5, p. 052317.
View/Download from: Publisher's site
View description>>
The construction of two-qubit gates appropriate for universal quantum computation is of enormous importance to quantum information processing. Building such gates is dependent on accurate knowledge of the interaction dynamics between two qubit systems. This letter will present a systematic method for reconstructing the full two-qubit interaction Hamiltonian through experimental measures of concurrence. This not only gives a convenient method for constructing two-qubit quantum gates, but can also be used to experimentally determine various Hamiltonian parameters in physical systems. We show explicitly how this method can be employed to determine the first- and second-order spin-orbit corrections to the exchange coupling in quantum dots.
Devitt, SJ, Greentree, AD & Hollenberg, LCL 2005, 'Information free quantum bus for generating stabiliser states', Quantum Information Processing, vol. 6, no. 4, pp. 229-242.
View/Download from: Publisher's site
View description>>
Efficient generation of spatially delocalised entangled states is at the heart of quantum information science. Generally flying qubits are proposed for long-range entangling interactions; however, here we introduce a bus-mediated alternative for this task. Our scheme permits efficient and flexible generation of deterministic two-qubit operator measurements and has links to the important concepts of mode-entanglement and repeat-until-success protocols. Importantly, unlike flying-qubit protocols, our bus particle never contains information about the individual quantum states of the particles, hence is information-free.
Dong, D-Y, Chen, C-L, Chen, Z-H & Zhang, C-B 2005, 'Quantum mechanics helps in learning for more intelligent robot', Chinese Physics Letters, vol. 23.
View description>>
A learning algorithm based on the state superposition principle is presented. The physical implementation analysis and simulated experiment results show that quantum mechanics can help in learning for more intelligent robots.
Dovey, K & Singhota, J 2005, 'Learning and knowing in teams', Development and Learning in Organizations: An International Journal, vol. 19, no. 3, pp. 18-20.
View/Download from: Publisher's site
View description>>
Purpose: To explore the collective means through which professional sports teams learn and generate new knowledge forms in order to remain competitive in challenging global arenas, and to examine the applicability of these means to business organizations. Design/methodology/approach: The objectives were achieved by drawing on the business and sporting experience of two executive coaches who have access to current elite-level sports coaches. Through unstructured interviews with sports coaches and business executives over a period of years, the research question of collective learning in sports teams has been explored and its relevance to business contexts, analyzed. Findings: Using social capital theory as an analytical lens, the research shows that organizational form is a critical determinant of the effectiveness of collective learning. This is the main reason why business teams are unable to emulate the successful learning that occurs in elite-level sports teams. The research shows that the hierarchical structure of most business organizations constrains the development of the social capital necessary for sustained learning and knowledge construction. Practical implications: The primary implication of the research findings is that business leaders need to view their role as that of creating and managing a social environment in which mission-pertinent learning and knowledge construction activities are nurtured. In practice, it means that the nature of business leadership and, in particular, power management practices in business organizations needs to be questioned and re-conceptualized. Originality/value: Using ...
Dovey, K & White, R 2005, 'Learning about learning in knowledge‐intense organizations', The Learning Organization, vol. 12, no. 3, pp. 246-260.
View/Download from: Publisher's site
View description>>
Purpose: This paper describes and analyses an attempt to engage in transformational learning, oriented to the development of a culture of innovation, at a medium-size software development organization in Australia. Design/methodology/approach: An action research methodology was used whereby continuous cycles of strategic social learning were collectively theorized, implemented, evaluated and renewed. Findings: The most important finding of this study is that of the influence of power relations and communication practices upon learning-for-innovation in organizations, and the need for the mediation of this influence through the creation of an organizational role that we have entitled an "external critic". The case also shows the central importance of the relational dimension of social capital generation to learning and the sensitivity of this dimension to power relations. Research limitations/implications: The research provides a rich analysis of one company's attempt to learn how to build and sustain a culture of innovation but, as with all case study research, the findings cannot be reliably generalized to other companies. Similarly, the case generates grounded theory that needs to be tested in other organizational contexts. Practical implications: The case raises the issue of power management in organizations and its relationship to social learning practices. In particular, it argues for the establishment of a "negotiated order" in organizations (through a mission, vision and core values that are collectively and meani...
Dovey, KA 2005, 'Leadership Education in the Era of Disruption: What Can Business Schools Offer?', International Journal of Leadership Education, vol. 1, no. 1, pp. 179-191.
Duan, RY, Feng, Y, Ji, ZF & Ying, MS 2005, 'Efficiency of deterministic entanglement transformation', PHYSICAL REVIEW A, vol. 71, no. 2, pp. 1-7.
View/Download from: Publisher's site
View description>>
We prove that sufficiently many copies of a bipartite entangled pure state can always be transformed into some copies of another one with certainty by local quantum operations and classical communication. The efficiency of such a transformation is charac
Duan, RY, Feng, Y, Li, X & Ying, MS 2005, 'Multiple-copy entanglement transformation and entanglement catalysis', PHYSICAL REVIEW A, vol. 71, no. 4, pp. 1-11.
View/Download from: Publisher's site
View description>>
We prove that any multiple-copy entanglement transformation [S. Bandyopadhyay, V. Roychowdhury, and U. Sen, Phys. Rev. A 65, 052315 (2002)] can be implemented by a suitable entanglement-assisted local transformation [D. Jonathan and M. B. Plenio, Phys. R
Duan, RY, Feng, Y, Li, X & Ying, MS 2005, 'Trade-off between multiple-copy transformation and entanglement catalysis', PHYSICAL REVIEW A, vol. 71, no. 6, pp. 1-7.
View/Download from: Publisher's site
View description>>
We demonstrate that multiple copies of a bipartite entangled pure state may serve as a catalyst for certain entanglement transformations while a single copy cannot. Such a state is termed a
Duan, RY, Feng, YA & Ying, MS 2005, 'Entanglement-assisted transformation is asymptotically equivalent to multiple-copy transformation', PHYSICAL REVIEW A, vol. 72, no. 2, pp. 1-5.
View/Download from: Publisher's site
View description>>
We show that two ways of manipulating quantum entanglement-namely, entanglement-assisted local transformation [D. Jonathan and M. B. Plenio, Phys. Rev. Lett. 83, 3566 (1999)] and multiple-copy transformation [S. Bandyopadhyay, V. Roychowdhury, and U. Sen
Feng, Y, Duan, R, Ji, Z & Ying, M 2005, 'Proof rules for purely quantum programs', CoRR, vol. abs/cs/0507043.
View description>>
We apply the notion of quantum predicate proposed by D'Hondt and Panangaden to analyze a purely quantum language fragment which describes the quantum part of a future quantum computer in Knill's architecture. The denotational semantics, weakest precondition semantics, and weakest liberal precondition semantics of this language fragment are introduced. To help reasoning about quantum programs involving quantum loops, we extend proof rules for classical probabilistic programs to our purely quantum programs.
Feng, Y, Duan, RY & Ying, MS 2005, 'Catalyst-assisted probabilistic entanglement transformation', IEEE TRANSACTIONS ON INFORMATION THEORY, vol. 51, no. 3, pp. 1090-1101.
View/Download from: Publisher's site
View description>>
We are concerned with catalyst-assisted probabilistic entanglement transformations. A necessary and sufficient condition is presented under which there exist partial catalysts that can increase the maximal transforming probability of a given entanglement
Ferguson, S & Cabrera, D 2005, 'Vertical localization of sound from multiway loudspeakers', AES Journal of the Audio Engineering Society, vol. 53, no. 3, pp. 163-173.
View description>>
Practical wide-range loudspeakers are usually implemented with multiple drivers, but the systematic effect of the signal frequency upon the vertical localization of sound is scarcely used for loudspeaker enclosure design. Tendencies in vertical localization for the frequency bands characteristic of woofers and tweeters in loudspeakers are shown. Using vertical arrays of individually controlled loudspeakers, synchronous and asynchronous bands of noise were presented to subjects. The frequency of the source affected the vertical position of the low- and high-frequency auditory image pairs significantly and systematically, in a manner broadly consistent with previous studies concerned with single auditory images. Lower frequency sources are localized below their physical positions whereas high-frequency sources are localized at their true positions. This effect is also shown to occur for musical signals. It is demonstrated that low-frequency sources are not localized well when presented in exact synchrony with high-frequency sources, or when they only include energy below 500 Hz.
Gervasi, V & Zowghi, D 2005, 'Reasoning about inconsistencies in natural language requirements', ACM TRANSACTIONS ON SOFTWARE ENGINEERING AND METHODOLOGY, vol. 14, no. 3, pp. 277-330.
View/Download from: Publisher's site
View description>>
The use of logic in identifying and analyzing inconsistency in requirements from multiple stakeholders has been found to be effective in a number of studies. Nonmonotonic logic is a theoretically well-founded formalism that is especially suited for suppo
Gwillim, D, Dovey, K & Wieder, B 2005, 'The politics of post-implementation reviews', INFORMATION SYSTEMS JOURNAL, vol. 15, no. 4, pp. 307-319.
View/Download from: Publisher's site
View description>>
The post-implementation review (PIR) literature emphasizes the benefits of ex post evaluations of information technology (IT) projects. However, empirical studies of actual practice show that few organizations undertake any substantive form of ex post ev
Hazzan, O, Impagliazzo, J, Lister, R & Schocken, S 2005, 'Using history of computing to address problems and opportunities', ACM SIGCSE Bulletin, vol. 37, no. 1, pp. 126-127.
View/Download from: Publisher's site
Hazzan, O, Lister, R, Impagliazzo, J & Schocken, S 2005, 'Using history of computing to address problems and opportunities in computer science education', Proceedings of the Thirty Sixth SIGCSE Technical Symposium on Computer Science Education SIGCSE 2005, vol. 37, no. 1, pp. 126-127.
View description>>
Like nations and peoples, professions have histories too. Similar reasons for teaching the history of nations and peoples may explain the importance of teaching prospective professionals the history of their profession. Indeed, much of K-16 education revolves around the teaching of history. Computing is no different in this respect. However, in the computing field, which often lacks attention to the societal impact of its products or an appreciation of the human side of the field, the inclusion of history of computing courses (or the incorporation of historical perspectives in computer science courses) is rare. In addition, the lack of formal education in computing history and the lack of relevant effective resources do not encourage faculty to incorporate history in their courses. Traditional historians often classify the history of computing as 'recent or contemporary history'. Indeed, the majority of undergraduate students currently in university and college programs were born after the personal computer, and their teachers were educated after the advent of email. Thus, though computers have strongly influenced their lives, they are generally unaware of the antecedents of the machines and tools they use every day. Hence, they usually do not build on the foundations of the field to explain a subject. Equally, myths and fallacies fill the field, including textbooks. The panel illustrates how teachers can integrate the history of computing into traditional computer science education. Open discussion with the audience will follow the panelists' short presentations.
Hsu, C-F, Lin, C-T, Huang, T-Y & Young, K-Y 2005, 'Development of multipurpose virtual-reality dynamic simulator with a force-reflection joystick', Proceedings of the Institution of Mechanical Engineers, Part I: Journal of Systems and Control Engineering, vol. 219, no. 2, pp. 187-195.
View/Download from: Publisher's site
View description>>
The objective of this paper is to develop a multipurpose virtual-reality (VR) dynamic simulation system to meet the requirements of public security in the training of human operators. In this way, the operator can feel that he or she is controlling a real machine or vehicle, achieving the objective of realistic training. The developed VR dynamic simulation system mainly consists of three elements: a six-degree-of-freedom motion platform (Stewart platform), a force-reflection joystick, and an interactive VR scene. In the developed system, the operator sits on the Stewart platform to feel the velocity and orientation of motion, and handles the force-reflection joystick to transfer commands to the VR scene. The operator then receives force feedback from the Stewart platform and the joystick. Finally, a flight simulation scene is applied to illustrate the effectiveness of the developed system. Experimental results demonstrate that the developed VR dynamic simulation system performs comparatively well.
Huang, CD, Liang, SF, Lin, CT & Wu, RC 2005, 'Machine learning with automatic feature selection for multi-class protein fold classification', Journal of Information Science and Engineering, vol. 21, no. 4, pp. 711-720.
View description>>
The use of a machine learning approach with automatic feature selection for multi-class protein fold classification is studied. Neural networks are used to complete the task of protein fold classification, where each node is associated with a gate. The results show that the proposed architecture is effective in reducing the dimensionality of the data and enhancing the classification performance. The proposed technique allows the processing of more features from amino acid sequences.
Huo, H, Hui, X, Wang, G, Wang, B & Han, D 2005, 'Document fragmentation for XML streams based on Hole-Filler model', Journal of Huazhong University of Science and Technology (Natural Science Edition), vol. 33, no. SUPPL., pp. 249-252.
View description>>
A document fragmentation policy was presented by taking advantage of document object model (DOM) for XML, and a corresponding fragmentation algorithm was designed according to the element fan-outs, to solve the problem of document fragmentation for XML streams based on Hole-Filler model. A tag-based document fragmentation algorithm built on DOM-based algorithm was then proposed to determine document filler points by fragmenting tag structure, so as to reduce the comparisons between element fan-outs and threshold. Finally, an optimized fragmentation policy was presented to avoid trivial pieces by binding XML sub-trees according to the ratio of element fan-outs and threshold. Our performance study shows that the document fragmentation algorithms perform well on execution time, granularity and other metrics.
Ji, ZF, Cao, HG & Ying, MS 2005, 'Optimal conclusive discrimination of two states can be achieved locally', PHYSICAL REVIEW A, vol. 71, no. 3, pp. 1-5.
View/Download from: Publisher's site
View description>>
This paper constructs a local operation and classical communication protocol that achieves the global optimality of conclusive discrimination of any two pure states with arbitrary a priori probability. This can be interpreted as meaning that there is no
Ji, ZF, Feng, YA & Ying, MS 2005, 'Local cloning of two product states', PHYSICAL REVIEW A, vol. 72, no. 3, pp. 1-5.
View/Download from: Publisher's site
View description>>
Local quantum operations and classical communication (LOCC) put considerable constraints on many quantum information processing tasks such as cloning and discrimination. Surprisingly, however, discrimination of any two pure states survives such constrain
Kuhnert, M, Voinov, A & Seppelt, R 2005, 'Comparing Raster Map Comparison Algorithms for Spatial Modeling and Analysis', Photogrammetric Engineering & Remote Sensing, vol. 71, no. 8, pp. 975-984.
View/Download from: Publisher's site
Li, SJ, Ying, MS & Li, YM 2005, 'On countable RCC models', FUNDAMENTA INFORMATICAE, vol. 65, no. 4, pp. 329-351.
View description>>
Region Connection Calculus (RCC) is the most widely studied formalism of Qualitative Spatial Reasoning. It has been known for some time that each connected regular topological space provides an RCC model. These 'standard' models are inevitable uncountabl
Li, Y, Li, S & Ying, M 2005, 'Relational reasoning in the region connection calculus', CoRR, vol. abs/cs/0505041.
View description>>
This paper is mainly concerned with the relation-algebraical aspects of the well-known Region Connection Calculus (RCC). We show that the contact relation algebra (CRA) of certain RCC model is not atomic complete and hence infinite. So in general an extensional composition table for the RCC cannot be obtained by simply refining the RCC8 relations. After having shown that each RCC model is a consistent model of the RCC11 CT, we give an exhaustive investigation about extensional interpretation of the RCC11 CT. More important, we show the complemented closed disk algebra is a representation for the relation algebra determined by the RCC11 table. The domain of this algebra contains two classes of regions, the closed disks and closures of their complements in the real plane.
Liang, SF, Lin, CT, Wu, RC, Chen, YC, Huang, TY & Jung, TP 2005, 'Monitoring Driver's Alertness Based on the Driving Performance Estimation and the EEG Power Spectrum Analysis', 2005 IEEE Engineering in Medicine and Biology 27th Annual Conference, vol. 10, pp. 5738-5741.
View/Download from: Publisher's site
Lin, C-J & Chen, C-H 2005, 'Identification and prediction using recurrent compensatory neuro-fuzzy systems', Fuzzy Sets and Systems, vol. 150, no. 2, pp. 307-330.
View/Download from: Publisher's site
View description>>
In this paper, a recurrent compensatory neuro-fuzzy system (RCNFS) for identification and prediction is proposed. The compensatory-based fuzzy method uses the adaptive fuzzy operations of neuro-fuzzy systems to make fuzzy logic systems more adaptive and effective. A recurrent network is embedded in the RCNFS by adding feedback connections in the second layer, where the feedback units act as memory elements. In this paper, the RCNFS model is proved to be a universal approximator. Also, an online learning algorithm is proposed which can automatically construct the RCNFS. There are no rules initially in the RCNFS. They are created and adapted as online learning proceeds through simultaneous structure and parameter learning. Structure learning is based on the degree measure and parameter learning is based on the ordered derivative algorithm. Finally, the RCNFS is used in several simulations. The simulation results of the dynamic system model have shown that (1) the RCNFS model converges quickly; (2) the RCNFS model requires a small number of tuning parameters; (3) the RCNFS model can solve temporal problems and approximate a dynamic system. © 2004 Elsevier B.V. All rights reserved.
Lin, C-T, Cheng, W-C & Liang, S-F 2005, 'A 3-D Surface Reconstruction Approach Based on Postnonlinear ICA Model', IEEE Transactions on Neural Networks, vol. 16, no. 6, pp. 1638-1650.
View/Download from: Publisher's site
Lin, C-T, Cheng, W-C & Liang, S-F 2005, 'Neural-Network-Based Adaptive Hybrid-Reflectance Model for 3-D Surface Reconstruction', IEEE Transactions on Neural Networks, vol. 16, no. 6, pp. 1601-1615.
View/Download from: Publisher's site
Lin, CT, Chung, JF & Pu, HC 2005, 'Pedestrian detection system', International Journal of Fuzzy Systems, vol. 7, no. 2, pp. 45-52.
View description>>
In this paper, we propose a new pedestrian detection algorithm and use it to develop a real-time pedestrian detection system. The algorithm can be functionally partitioned into two parts: moving object segmentation and pedestrian recognition. In moving object segmentation, we segment the moving objects in the scene by a modified temporal differencing method, which combines general temporal differencing with detection nets. In pedestrian recognition, we obtain multi-type wavelet templates from input images, extract interest points from the wavelet templates, and then extract feature points from the interest point templates by a statistical method. Finally, these feature points are fed into a trained multilayer back-propagation neural network whose output indicates the result: pedestrian or non-pedestrian. In experiments, the recognition accuracy reaches 95%. We implemented the algorithm in a real-time pedestrian detection system that detects pedestrians in the scene in real time, with a pedestrian detection rate of 92.58%. © 2005 TFSA.
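The temporal differencing step described in the abstract can be sketched in a few lines. This is an illustrative simplification (plain frame differencing with a fixed threshold), not the paper's modified method with detection nets; the function name and parameter values are assumptions.

```python
# Minimal sketch of temporal differencing for moving object segmentation.
# Assumption: frames are 2-D lists of grayscale intensities (0-255); the
# paper's "detection nets" refinement is not reproduced here.
def temporal_difference_mask(prev_frame, curr_frame, threshold=25):
    """Mark pixels whose frame-to-frame change exceeds the threshold."""
    return [
        [1 if abs(c - p) > threshold else 0
         for p, c in zip(prev_row, curr_row)]
        for prev_row, curr_row in zip(prev_frame, curr_frame)
    ]

# A pixel jumping from 0 to 200 is flagged as moving; static pixels are not.
mask = temporal_difference_mask([[0, 0], [0, 0]], [[0, 200], [0, 0]])
# mask == [[0, 1], [0, 0]]
```

Connected regions of the resulting binary mask would then be passed to the recognition stage.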
LIN, C-T, LIN, K-L, YANG, C-H, CHUNG, I-F, HUANG, C-D & YANG, Y-S 2005, 'PROTEIN METAL BINDING RESIDUE PREDICTION BASED ON NEURAL NETWORKS', International Journal of Neural Systems, vol. 15, no. 01n02, pp. 71-84.
View/Download from: Publisher's site
View description>>
Over one-third of protein structures contain metal ions, which are necessary elements in living systems. Traditionally, structural biologists investigated the properties of metalloproteins (proteins that bind metal ions) by physical means, interpreting the functional formation and reaction mechanisms of enzymes from their structures and from in vitro experimental observations. Most proteins, however, have only primary-structure (amino acid sequence) information available; their 3-dimensional structures are not always known. In this paper, a direct analysis method is proposed to predict a protein's metal-binding amino acid residues from its sequence information alone, using neural networks with sliding window-based feature extraction and biological feature encoding techniques. For four major bulk elements (Calcium, Potassium, Magnesium, and Sodium), the metal-binding residues are identified by the proposed method with higher than 90% sensitivity and very good accuracy under 5-fold cross validation. With such promising results, the method can be extended and used as a powerful methodology for metal-binding characterization of the rapidly increasing number of protein sequences in the future.
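As a rough illustration of the sliding window-based feature extraction mentioned in the abstract, the sketch below cuts a protein sequence into fixed-size windows centred on each residue. The window size and the 'X' padding symbol are assumptions for illustration, not the authors' actual settings.

```python
# Hypothetical sketch of sliding-window extraction over an amino acid
# sequence: each residue gets a window of itself plus its neighbours,
# padded with 'X' at the termini, ready for feature encoding and a
# neural-network classifier.
def sliding_windows(sequence, window=7):
    half = window // 2
    padded = "X" * half + sequence + "X" * half
    return [padded[i:i + window] for i in range(len(sequence))]

# With window=3, the 5-residue sequence ACDEF yields one window per residue.
windows = sliding_windows("ACDEF", window=3)
# windows == ["XAC", "ACD", "CDE", "DEF", "EFX"]
```

Each window would then be encoded (e.g. with per-residue biological features) before being fed to the network.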
Lin, C-T, Wu, R-C, Jung, T-P, Liang, S-F & Huang, T-Y 2005, 'Estimating Driving Performance Based on EEG Spectrum Analysis', EURASIP Journal on Advances in Signal Processing, vol. 2005, no. 19, pp. 3165-3174.
View/Download from: Publisher's site
Lin, C-T, Yeh, C-M, Chung, J-F, Liang, S-F & Pu, H-C 2005, 'Support-Vector-Based Fuzzy Neural Networks', International Journal of Computational Intelligence Research, vol. 1, no. 2.
View/Download from: Publisher's site
Lin, KL, Lin, CY, Huang, CD, Chang, HM, Yang, CY, Lin, CT, Tang, CY & Hsu, DF 2005, 'Methods of improving protein structure prediction based on HLA neural network and combinatorial fusion analysis', WSEAS Transactions on Information Science and Applications, vol. 2, no. 12, pp. 2146-2153.
View description>>
The accurate classification of protein structure is critical and essential for protein function determination in Bioinformatics and Proteomics. A reasonably high prediction accuracy has recently been achieved for coarse-grained protein class assignment from primary amino acid sequences, such as classifying proteins into the four classes of SCOP. However, fine-grained protein fold assignment remains a challenge, especially when the number of possible folding patterns, such as those defined in SCOP, is large. In our previous work, hierarchical learning architecture (HLA) neural networks were used to differentiate proteins according to their classes and folding patterns, achieving a prediction accuracy of 65.5% for 27 folding categories, which improved on the 56.5% previously reported by Ding and Dubchak. The success of protein structure classification depends heavily on the computational methods used and the features selected. Here, combinatorial fusion analysis (CFA) techniques are used to facilitate feature selection and combination for improving the prediction accuracy of protein structure classification. Applying CFA to our previous HLA neural network framework yields an overall prediction accuracy of 87.8% for the coarse-grained 4 classes and 70.9% for the fine-grained 27 folding categories. These results are significantly higher than both other reported results and our previous work, and further demonstrate that CFA techniques can greatly enhance machine learning methods (such as the neural networks used here) for the protein structure prediction problem.
Lister, R 2005, 'One Small Step Toward a Culture of Peer Review and Multi- Institutional Sharing of Educational Resources: A Multiple Choice Exam for First Semester Programming Students', Conferences in Research and Practice in Information Technology Series, vol. 42, no. 5, pp. 155-164.
View description>>
This paper presents a multiple choice question exam, used to test students completing their first semester of programming. Assumptions in the design of the exam are identified. A detailed analysis is performed on how students performed on the questions. The intent behind this exercise is to begin a community process of identifying the criteria that define an effective multiple-choice exam for testing novice programmers. The long term aim is to develop consensus on peer review criteria for such exams. This consensus is seen as a necessary precondition for any future public domain library of such multiple-choice questions. © 2005, Australian Computer Society, Inc.
Lister, RF 2005, 'Mixed methods: positivists are from Mars, constructivists are from Venus', ACM SIGCSE Bulletin Inroads, vol. 37, no. 4, pp. 18-19.
Lu, H & Song, Y 2005, 'Brief Introduction to the Development of Electric Power Industry in UK', Modern Electric power, vol. 22, no. 2, pp. 91-94.
Margetts, CDE, Astuti, D, Gentle, DC, Cooper, WN, Cascon, A, Catchpoole, D, Robledo, M, Neumann, HPH, Latif, F & Maher, ER 2005, 'Epigenetic analysis of HIC1, CASP8, FLIP, TSP1, DCR1, DCR2, DR4, DR5, KvDMR1, H19 and preferential 11p15.5 maternal-allele loss in von Hippel-Lindau and sporadic phaeochromocytomas', Endocrine-Related Cancer, vol. 12, no. 1, pp. 161-172.
View/Download from: Publisher's site
View description>>
Phaeochromocytoma is a neural-crest-derived tumour that may be a feature of several familial cancer syndromes including von Hippel-Lindau (VHL) disease, multiple endocrine neoplasia type 2 (MEN 2), neurofibromatosis type 1 (NF1) and germline succinate dehydrogenase subunit (SDHB and SDHD) mutations. However the somatic genetic and epigenetic events that occur in phaeochromocytoma tumourigenesis are not well defined. Epigenetic events including de novo promoter methylation of tumour-suppressor genes are frequent in many human neoplasms. As neuroblastoma and phaeochromocytoma are both neural-crest-derived tumours, we postulated that some epigenetic events might be implicated in both tumour types and wished to establish how somatic epigenetic alterations compared in VHL-associated and sporadic phaeochromocytomas. We identified frequent aberrant methylation of HIC1 (82%) and CASP8 (31%) in phaeochromocytoma, but both genes were significantly more methylated in VHL phaeochromocytomas than in sporadic cases. Of four tumour necrosis factor-related apoptosis-inducing ligand (TRAIL) receptors analysed, DR4 was most commonly methylated (41%; compared with DcR2 (26%), DcR1 (23%) and DR5 (10%)). Gene methylation patterns in phaeochromocytoma and neuroblastoma did not differ significantly suggesting overlapping mechanisms of tumourigenesis. We also investigated the role of 11p15.5-imprinted genes in phaeochromocytoma. We found that in 10 sporadic and VHL phaeochromocytomas with 11p15.5 allele loss, the patterns of methylation of 11p15.5-differentially methylated regions were consistent with maternal, rather than, paternal chromosome loss in all cases (P<0.001). This suggests that 11p15.5-imprinted genes may be implicated in the pathogenesis of both familial (germline VHL
Niazi, M, Wilson, D & Zowghi, D 2005, 'A framework for assisting the design of effective software process improvement implementation strategies', JOURNAL OF SYSTEMS AND SOFTWARE, vol. 78, no. 2, pp. 204-222.
View/Download from: Publisher's site
View description>>
A number of advances have been made in the development of software process improvement (SPI) standards and models, e.g. Capability Maturity Model (CMM), more recently CMMI, and ISO’s SPICE. However, these advances have not been matched by equal advances
Niazi, M, Wilson, D & Zowghi, D 2005, 'A maturity model for the implementation of software process improvement: an empirical study', JOURNAL OF SYSTEMS AND SOFTWARE, vol. 74, no. 2, pp. 155-172.
View/Download from: Publisher's site
Olafsen, RN & Cetindamar, D 2005, 'E‐learning in a competitive firm setting', Innovations in Education and Teaching International, vol. 42, no. 4, pp. 325-335.
View/Download from: Publisher's site
Ruta, D & Gabrys, B 2005, 'Classifier selection for majority voting', Information Fusion, vol. 6, no. 1, pp. 63-81.
View/Download from: Publisher's site
View description>>
Individual classification models have recently been challenged by combined pattern recognition systems, which often show better performance. In such systems the optimal set of classifiers is first selected and then combined by a specific fusion method. For a small number of classifiers, optimal ensembles can be found exhaustively, but the exponential complexity of such a search limits its practical applicability for larger systems. As a result, simpler search algorithms and/or selection criteria are needed to reduce the complexity. This work revises the classifier selection methodology and evaluates the practical applicability of diversity measures in the context of combining classifiers by majority voting. A number of search algorithms are proposed and adjusted to work properly with a number of selection criteria, including majority voting error and various diversity measures. Extensive experiments carried out with 15 classifiers on 27 datasets indicate that diversity measures are inappropriate as selection criteria, favouring search based directly on the combiner error. Furthermore, the results prompted a novel design of multiple classifier systems in which selection and fusion are recurrently applied to a population of the best combinations of classifiers rather than the single best one. The improvement in the generalisation performance of such a system is demonstrated experimentally. © 2004 Elsevier B.V. All rights reserved.
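The exhaustive ensemble search that the abstract notes is feasible only for small classifier pools can be sketched as follows. This is an illustrative toy, not the authors' implementation; the function names and the odd-subset restriction are assumptions made here to keep majority votes unambiguous.

```python
# Illustrative sketch: exhaustively search odd-sized classifier subsets for
# the one whose majority vote gives the lowest error on validation labels.
# Exponential in pool size, hence practical only for small pools.
from itertools import combinations

def majority_vote(predictions):
    """Fuse per-classifier label sequences by per-sample plurality vote."""
    return [max(set(votes), key=votes.count) for votes in zip(*predictions)]

def best_majority_ensemble(all_preds, truth):
    best_subset, best_err = None, float("inf")
    for k in range(1, len(all_preds) + 1, 2):          # odd sizes avoid ties
        for subset in combinations(range(len(all_preds)), k):
            fused = majority_vote([all_preds[i] for i in subset])
            err = sum(f != t for f, t in zip(fused, truth)) / len(truth)
            if err < best_err:
                best_subset, best_err = subset, err
    return best_subset, best_err
```

Diversity-based selection criteria, as discussed in the paper, would replace the direct error computation in the inner loop with a diversity score over the subset.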
Sanders, K, Fincher, S, Bouvier, D, Lewandowski, G, Morrison, B, Murphy, L, Petre, M, Richards, B, Tenenberg, J, Thomas, L, Anderson, R, Anderson, R, Fitzgerald, S, Gutschow, A, Haller, S, Lister, R, McCauley, R, McTaggart, J, Prasad, C, Scott, T, Shinners-Kennedy, D, Westbrook, S & Zander, C 2005, 'A multi-institutional, multinational study of programming concepts using card sort data', Expert Systems, vol. 22, no. 3, pp. 121-128.
View/Download from: Publisher's site
View description>>
This paper presents a case study of the use of a repeated single-criterion card sort with an unusually large, diverse participant group. The study, whose goal was to elicit novice programmers' knowledge of programming concepts, involved over 20 researche
Sheng-Che Hsu, Sheng-Fu Liang & Chin-Teng Lin 2005, 'A robust digital image stabilization technique based on inverse triangle method and background detection', IEEE Transactions on Consumer Electronics, vol. 51, no. 2, pp. 335-345.
View/Download from: Publisher's site
Shi, CG, Lu, J & Zhang, GQ 2005, 'An extended Kuhn-Tucker approach for linear bilevel programming', APPLIED MATHEMATICS AND COMPUTATION, vol. 162, no. 1, pp. 51-63.
View/Download from: Publisher's site
View description>>
The Kuhn-Tucker approach has been applied with remarkable success in linear bilevel programming (BLP). However, it is still to some extent unsatisfactory and incomplete. One principal challenge is that it cannot handle well a linear BLP problem when the co
Shi, CG, Zhang, GQ & Lu, J 2005, 'The Kth-best approach for linear bilevel multi-follower programming', JOURNAL OF GLOBAL OPTIMIZATION, vol. 33, no. 4, pp. 563-578.
View/Download from: Publisher's site
View description>>
The majority of research on bilevel programming has centered on the linear version of the problem in which only one leader and one follower are involved. This paper addresses linear bilevel multi-follower programming (BLMFP) problems in which there is no
Sun, XM, Duan, RY & Ying, MS 2005, 'The existence of quantum entanglement catalysts', IEEE TRANSACTIONS ON INFORMATION THEORY, vol. 51, no. 1, pp. 75-80.
View/Download from: Publisher's site
View description>>
Without additional resources, it is often impossible to transform one entangled quantum state into another with local quantum operations and classical communication. Jonathan and Plenio (Phys. Rev. Lett., vol. 83, p. 3566, 1999) presented an interesting
Wei, JS, Greer, BT, Westermann, F, Steinberg, SM, Son, CG, Chen, QR, Whiteford, CC, Bilke, S, Krasnoselsky, AL, Cenacchi, N, Catchpoole, D, Berthold, F, Schwab, M & Khan, J 2005, 'Erratum: Prediction of clinical outcome using gene expression profiling and artificial neural networks for patients with neuroblastoma (Cancer Research (October 1, 2004) 64 (6883-6891))', Cancer Research, vol. 65, no. 1, p. 374.
Wei, JS, Greer, BT, Westermann, F, Steinberg, SM, Son, CG, Chen, QR, Whiteford, CC, Bilke, S, Krasnoselsky, AL, Cenacchi, N, Catchpoole, D, Berthold, F, Schwab, M & Khan, J 2005, 'Prediction of clinical outcome using gene expression profiling and artificial neural networks for patients with neuroblastoma (vol 64, pg 6883, 2004)', CANCER RESEARCH, vol. 65, no. 1, pp. 374-374.
Wu, FJ, Lu, J & Zhang, GQ 2005, 'Development and implementation on a fuzzy multiple objective decision support system', KNOWLEDGE-BASED INTELLIGENT INFORMATION AND ENGINEERING SYSTEMS, PT 1, PROCEEDINGS, vol. 3681, pp. 261-267.
View description>>
A fuzzy-goal optimization-based method has been developed for solving fuzzy multiple objective linear programming (FMOLP) problems where fuzzy parameters in both objective functions and constraints and fuzzy goals of objectives can be in any form of membership function. Based on the method, a fuzzy multiple objective decision support system (FMODSS) is developed. This paper focuses on the development and use of FMODSS in detail. An example is presented for demonstrating how to solve a FMOLP problem by using the FMODSS interactively. © Springer-Verlag Berlin Heidelberg 2005.
Wu, SJ & Lin, CT 2005, 'Authors' reply', IEEE Transactions on Fuzzy Systems, vol. 13, no. 2, pp. 286-286.
View/Download from: Publisher's site
Ying, M 2005, 'A theory of computation based on quantum logic (I).', Theor. Comput. Sci., vol. 344, pp. 134-207.
View/Download from: Publisher's site
Ying, MS 2005, 'A theory of computation based on quantum logic (I)', THEORETICAL COMPUTER SCIENCE, vol. 344, no. 2-3, pp. 134-207.
View/Download from: Publisher's site
View description>>
The (meta)logic underlying the classical theory of computation is Boolean (two-valued) logic. Quantum logic was proposed by Birkhoff and von Neumann as a logic of quantum mechanics more than 60 years ago. It is currently understood as a logic whose truth values are taken from an orthomodular lattice. The major difference between Boolean logic and quantum logic is that the latter does not enjoy distributivity in general. The rapid development of quantum computation in recent years stimulates us to establish a theory of computation based on quantum logic. The present paper is the first step toward such a new theory and it focuses on the simplest models of computation, namely finite automata. We introduce the notion of orthomodular lattice-valued (quantum) automaton. Various properties of automata are carefully reexamined in the framework of quantum logic by employing an approach of semantic analysis. We define the class of regular languages accepted by orthomodular lattice-valued automata. The acceptance abilities of orthomodular lattice-valued nondeterministic automata and their various modifications (such as deterministic automata and automata with ε-moves) are compared. The closure properties of orthomodular lattice-valued regular languages are derived. The Kleene theorem about the equivalence of regular expressions and finite automata is generalized into quantum logic. We also present a pumping lemma for orthomodular lattice-valued regular languages. It is found that the universal validity of many properties (for example, the Kleene theorem, the equivalence of deterministic and nondeterministic automata) of automata depends heavily upon the distributivity of the underlying logic. This indicates that these properties do not universally hold in the realm of quantum logic. On the other hand, we show that a local validity of them can be recovered by imposing a certain commutativity on the (atomic) statements about the automata under consideration. This reveals a...
Ying, MS 2005, 'Knowledge transformation and fusion in diagnostic systems', ARTIFICIAL INTELLIGENCE, vol. 163, no. 1, pp. 1-45.
View/Download from: Publisher's site
View description>>
Diagnostic systems depend on knowledge bases specifying the causal, structural or functional interactions among components of the diagnosed objects. A diagnostic specification in a diagnostic system is a semantic interpretation of a knowledge base. We in
Ying, MS 2005, 'pi-calculus with noisy channels', ACTA INFORMATICA, vol. 41, no. 9, pp. 525-593.
View/Download from: Publisher's site
View description>>
It is assumed in the pi-calculus that communication channels are always noiseless. But this is usually not the case in the mobile systems that developers are faced with in real life. In this paper, we introduce an extension of pi, called pi(N), in whic
Zhang, C, Ying, M & Qiao, B 2005, 'Optimal universal programmable detectors for unambiguous discrimination', Phys. Rev. A, vol. 74, pp. 042308-042308.
View/Download from: Publisher's site
View description>>
We discuss the problem of designing unambiguous programmable discriminators for any n unknown quantum states in an m-dimensional Hilbert space. The discriminator is a fixed measurement that has two kinds of input registers: the program registers and the data register. The quantum state in the data register is what users want to identify, which is confirmed to be among the n states in program registers. The task of the discriminator is to tell the users which state stored in the program registers is equivalent to that in the data register. First, we give a necessary and sufficient condition for judging an unambiguous programmable discriminator. Then, if $m=n$, we present an optimal unambiguous programmable discriminator for them, in the sense of maximizing the worst-case probability of success. Finally, we propose a universal unambiguous programmable discriminator for arbitrary n quantum states.
Zhang, C-B, Dong, D-Y & Chen, Z-H 2005, 'Control of non-controllable quantum systems: A quantum control algorithm based on Grover iteration', J. Opt. B: Quantum Semiclass. Opt., vol. 7, pp. S313-S317.
View description>>
A new notion of controllability, eigenstate controllability, is defined for finite-dimensional bilinear quantum mechanical systems which are neither strongly completely controllable nor completely controllable. And a quantum control algorithm based on Grover iteration is designed to perform a quantum control task of steering a system, which is eigenstate controllable but may not be (strongly) completely controllable, from an arbitrary state to a target state.
Zowghi, D & Gervasi, V 2005, 'Automated tools for requirements engineering', COMPUTER SYSTEMS SCIENCE AND ENGINEERING, vol. 20, no. 1, pp. 3-4.
Beydoun, G, Gonzalez-Perez, C, Henderson-Sellers, B & Low, G 2005, 'Developing and Evaluating a Generic Metamodel for MAS Work Products.', SELMAS (LNCS), Springer, pp. 126-142.
Beydoun, G, Gonzalez-Perez, C, Low, G & Henderson-Sellers, B 2005, 'Synthesis of a generic MAS metamodel', Proceedings of the fourth international workshop on Software engineering for large-scale multi-agent systems - SELMAS '05, the fourth international workshop, ACM Press, St Louis, USA, pp. 1-1.
View/Download from: Publisher's site
View description>>
Method engineering, which focuses on project-specific methodology construction from existing method fragments, is an appealing approach to organize, appropriately access and effectively harness the software engineering knowledge of MAS methodologies. With the objective of applying method engineering for developing an MAS, in this paper we introduce a generic metamodel to serve as a representational infrastructure to unify existing MAS methodologies into a single specification. Our metamodel does not focus on any class of MAS, nor does it impose any restrictions on the format of the system requirements; rather, our metamodel is an abstraction of how any MAS is structured and behaves both at design time and run-time.
Beydoun, G, Gonzalez-Perez, C, Low, G & Henderson-Sellers, B 2005, 'Towards Method Engineering for Multi-Agent Systems: A preliminary validation of a Generic MAS Metamodel.', SEKE, International Conference on Software Engineering and Knowledge Engineering, Knowledge Systems Graduate School, Taipei, Taiwan, pp. 51-56.
View description>>
New Multi-Agent System (MAS) software development methodologies are published at an increasing pace. This is in part due to the accepted premise that no single methodology can be suitable for all MAS software projects. Method engineering, which focuses on project-specific methodology construction from existing method fragments, is an appealing approach to organize, appropriately access and effectively harness the software engineering knowledge of methodologies. Towards this, in this paper we present and validate a generic product-focussed metamodel to serve as a representational infrastructure to unify existing methodologies into a single specification. Our metamodel does not focus on any class of MAS, nor does it impose any restrictions on the format of system requirements; rather, our metamodel is an abstraction of how any MAS is structured and behaves both at design time and run-time. We analyze two well-known existing MAS metamodels. We sketch how they can be seen as subtypes of our generic metamodel. This constitutes early evidence to support the use of our metamodel towards the construction of situated MAS methodologies.
Beydoun, G, Tran, N, Low, G & Henderson-Sellers, B 2005, 'Preliminary basis for an ontology-based methodological approach for multi-agent systems', PERSPECTIVES IN CONCEPTUAL MODELING, International Conference on Conceptual Modelling, Springer, Klagenfurt, Austria, pp. 131-140.
View/Download from: Publisher's site
View description>>
The influence of ontologies in Knowledge Based Systems (KBS) methodologies extends well beyond the initial analysis phase, leading in the 1990s to domain-independent KBS methodologies. In this paper, we reflect on those lessons and on the roles of ontologies in KBS development. We analyse and identify which of those roles can be transferred towards an ontology-based MAS development methodology. We identify ontology-related inter-dependencies between the analysis and design phases. We produce a set of six recommendations towards creating a domain-independent MAS methodology that incorporates ontologies beyond its initial analysis phase. We identify its essential features and sketch the characteristic tasks within both its analysis and design phases.
Boyd, S, Zowghi, D & Farroukh, A 2005, 'Measuring the expressiveness of a constrained natural language: An empirical study', 13TH IEEE INTERNATIONAL CONFERENCE ON REQUIREMENTS ENGINEERING, PROCEEDINGS, IEEE International Requirements Engineering Conference, IEEE, Paris, France, pp. 339-349.
View/Download from: Publisher's site
View description>>
It has been suggested that constraining a natural language (NL) reduces the degree of ambiguity of requirement specifications written in that language. There is also a tendency to assume that an inescapable side effect of constraining a natural language is a subsequent reduction in its expressiveness. The primary objective of this paper is to describe a technique that we have developed for empirically measuring the expressiveness of a Constrained Natural Language (CNL) when used to specify the requirements in a particular application domain. Our simple yet practical and repeatable technique elucidates the individual contribution that each lexical entity of the CNL can make on the overall expressiveness of the CNL. This technique is particularly useful for designing new CNLs, as well as situations where tailoring or streamlining existing CNLs for particular application domains is needed.
Cabrera, D, Ferguson, S & Laing, G 2005, 'Development of auditory alerts for air traffic control consoles', Audio Engineering Society 119th Convention Fall Preprints 2005, pp. 1-21.
View description>>
This paper documents a project that developed a hierarchical auditory alert scheme for air traffic control consoles, replacing a basic system of auditory alerts. Alerts are designed to convey the level of urgency, not provoke annoyance, be easily distinguished, minimize speech interference, and be easily localized. User evaluations indicate that the new alert scheme is highly advantageous, especially when combined with improved visual coding of alerts. The alert scheme was implemented in Australian air traffic control centers in July 2005.
Chang, E, Dillon, TS & Hussain, FK 2005, 'Trust and reputation relationships in service-oriented environments', Third International Conference on Information Technology and Applications, Vol 1, Proceedings, International Conference on Information Technology and Applications, IEEE Computer Society, Sydney, Australia, pp. 4-14.
View/Download from: Publisher's site
View description>>
Trust and trustworthiness play a major role in conducting business on the Internet in service-oriented environments. In defining trust for service-oriented environments, one needs to capture the notions of service level, service agreement, context and timeslots. The same applies to reputation, which is the opinion of third-party agents and is used in determining trust and trustworthiness. Because of the complexity of the issues, and the fact that trust and reputation are essentially concerned with relationships, it is important to clearly define the notion of trust relationships and the notion of reputation relationships. In this paper, therefore, we provide these definitions and introduce a graphical notation for representing these relationships.
Chang, E, Hussain, FK & Dillon, T 2005, 'CCCI metrics for the measurement of quality of e-service', 2005 IEEE/WIC/ACM International Conference on Intelligent Agent Technology, Proceedings, IEEE/WIC/ACM International Conference on Intelligent Agent Technology, IEEE CS Press, Compiègne, France, pp. 603-610.
View/Download from: Publisher's site
View description>>
The growing development of web-based trust and reputation systems in the 21st century will have a powerful social and economic impact on all business entities, and will make transparent quality assessment and customer assurance realities in distributed web-based service-oriented environments. The growth of web-based trust and reputation systems will be the foundation for web intelligence in the future. Trust and reputation systems help capture business intelligence by establishing customer relationships, learning consumer behaviour, capturing market reaction to products and services, disseminating customer feedback, buyers' opinions and end-user recommendations, and revealing dishonest services, unfair trading, biased assessment, discriminatory actions, fraudulent behaviours and untrue advertising. The continuing development of these technologies will help improve professional business behaviour, sales, and the reputation of sellers, providers, products and services. In this paper, we present a new methodology, known as CCCI (Correlation, Commitment, Clarity, and Influence), for measuring trustworthiness in trust and reputation systems. The methodology is based on determining the correlation between the services originally committed to and the services actually delivered by a trusted agent in a business interaction over service-oriented networks, in order to determine the trustworthiness of that agent.
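A hedged sketch of a CCCI-style score: the paper defines the actual metric, while here each criterion of a service agreement simply carries a commitment-match value (how closely delivery matched the commitment) weighted by clarity and influence. The field names and weighting scheme are assumptions of this sketch:

```python
# Hedged CCCI-style sketch: trustworthiness as the weighted correlation
# between committed and delivered service criteria. 'commit_match',
# 'clarity' and 'influence' (all 0..1) are illustrative assumptions.

def ccci_score(criteria):
    """Clarity- and influence-weighted mean of commitment matches."""
    num = sum(c["commit_match"] * c["clarity"] * c["influence"]
              for c in criteria)
    den = sum(c["clarity"] * c["influence"] for c in criteria)
    return num / den if den else 0.0

agreement = [
    {"name": "delivery_time", "commit_match": 1.0, "clarity": 1.0, "influence": 0.8},
    {"name": "quality",       "commit_match": 0.5, "clarity": 0.8, "influence": 1.0},
]
score = ccci_score(agreement)   # partial quality shortfall pulls trust down
```

A clear, influential criterion that the agent fails to honour hurts the score far more than a vague or marginal one.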
Chang, E, Hussain, FK & Dillon, T 2005, 'Reputation ontology for reputation systems', ON THE MOVE TO MEANINGFUL INTERNET SYSTEMS 2005: OTM 2005 WORKSHOPS, PROCEEDINGS, The International Conference on Semantic Web and Web Services, Springer, New York, USA, pp. 957-966.
View/Download from: Publisher's site
View description>>
The growing development of web-based reputation systems in the 21st century will have a powerful social and economic impact on both business entities and individual customers, because it makes quality assessment of products and services transparent and so achieves customer assurance in distributed web-based reputation systems. Web-based reputation systems will be the foundation for web intelligence in the future. Trust and reputation help capture business intelligence by establishing customer trust relationships, learning consumer behavior, capturing market reaction to products and services, and disseminating customer feedback, buyers' opinions and end-user recommendations. They also reveal dishonest services, unfair trading, biased assessment, discriminatory actions, fraudulent behaviors and untrue advertising. The continuing development of these technologies will help improve professional business behavior, sales, and the reputation of sellers, providers, products and services. Given the importance of reputation, in this paper we propose an ontology for reputation. In the business world we can consider the reputation of a product, the reputation of a service or the reputation of an agent. We propose an ontology for these entities that helps unravel and conceptualize the components of reputation for each of them.
Chang, E, Hussain, FK & Dillon, TS 2005, 'Trustworthiness measure for e-service', PST 2005 - 3rd Annual Conference on Privacy, Security and Trust, Conference Proceedings, Annual Conference on Privacy, Security and Trust, University of New Brunswick, St Andrews, Canada, pp. 1-14.
View description>>
Traditionally, transactions were carried out face-to-face; now, they are carried out over the Internet. The infrastructure for this business and information exchange may be a client-server, peer-to-peer or mobile network environment, and users on the network often carry out interactions in one of three forms:
• Anonymous (no names are identified in the communication)
• Pseudo-anonymous (nicknames are used in the communication)
• Non-anonymous (real names are used in the communication)
Incapability or fraudulent practice can occur when the seller, business provider or buyer (the agents on the network) does not behave in the manner that was mutually agreed or understood, especially if published terms and conditions exist. This paper evaluates existing trustworthiness systems, points out that there is currently no standardized measurement system for Quality of Service, and outlines the methodology that we have developed to address this.
Chang, E, Thomson, P, Dillon, T & Hussain, F 2005, 'The Fuzzy and Dynamic Nature of Trust', Lecture Notes in Computer Science: Trust, Privacy, And Security In Digital Business, International Conference on Trust, Privacy and Security in Digital Business, Springer Berlin Heidelberg, Copenhagen, Denmark, pp. 161-174.
View/Download from: Publisher's site
View description>>
Trust is one of the most fuzzy, dynamic and complex concepts in both social and business relationships. The difficulty in measuring Trust and predicting Trustworthiness in service-oriented network environments leads to many questions. These include issue
Chang, EJ, Hussain, FK & Dillon, TS 2005, 'Fuzzy nature of trust and dynamic trust modeling in service oriented environments', Proceedings of the 2005 workshop on Secure web services, CCS05: 12th ACM Conference on Computer and Communications Security 2005, ACM, Fairfax, USA, pp. 75-83.
View/Download from: Publisher's site
View description>>
In this paper, we propose and describe six characteristics of trust. Based on these six characteristics, we determine why trust is fuzzy. The term 'fuzzy' is used in this paper not in the sense of the precise definitions given in the Fuzzy Systems literature, but to indicate a certain vagueness, complexity or ill-definition, and a qualitative characterization rather than a quantitative representation. We then determine why, due to its six characteristics, trust is dynamic. Finally, we propose a modeling language tool to model the fuzzy and dynamic nature of trust.
Cheng-Hung Chen & Chin-Teng Lin 2005, 'Identification of Chaotic System Using Recurrent Compensatory Neuro-Fuzzy Systems', 2005 9th International Workshop on Cellular Neural Networks and Their Applications, 2005 9th International Workshop on Cellular Neural Networks and Their Applications, IEEE, pp. 15-18.
View/Download from: Publisher's site
View description>>
In this paper, a Recurrent Compensatory Neuro-Fuzzy System (RCNFS) is proposed for identification and prediction. The compensatory-based fuzzy reasoning method uses adaptive fuzzy operations of neuro-fuzzy systems, which can make fuzzy logic systems more adaptive and effective. The recurrent network is embedded in the RCNFS by adding feedback connections in the second layer, where the feedback units act as memory elements. An on-line learning algorithm is also proposed to automatically construct the RCNFS: rules are created and adapted as on-line learning proceeds, via simultaneous structure and parameter learning. Finally, the RCNFS is applied in several simulations. The simulation results for dynamic system modeling show that 1) the RCNFS model converges quickly; 2) the RCNFS model requires a small number of tuning parameters; and 3) the RCNFS model can solve temporal problems and approximate a dynamic system.
Chin-Teng Lin & Chao-Hui Huang 2005, 'A Complex Texture Classification Algorithm based on Gabor-type Filtering Cellular Neural Networks and Self-Organized Fuzzy Inference Neural Networks', 2005 IEEE International Symposium on Circuits and Systems, 2005 IEEE International Symposium on Circuits and Systems, IEEE, Kobe, JAPAN, pp. 3942-3945.
View/Download from: Publisher's site
Chin-Teng Lin & Chao-Hui Huang 2005, 'Cellular Neural Networks for Hexagonal Image Processing', 2005 9th International Workshop on Cellular Neural Networks and Their Applications, 2005 9th International Workshop on Cellular Neural Networks and Their Applications, IEEE, pp. 81-84.
View/Download from: Publisher's site
View description>>
In this paper, a Cellular Neural Network (CNN) framework for Hexagonal Image Processing (HIP) is proposed. We combine two distinct lines of research: CNNs, which provide efficient computing abilities, and HIP, which offers the most compact structure. Both CNNs and HIP are biologically inspired: CNNs exhibit behaviour closely resembling that of the human retina, and HIP presents an architecture that closely matches the distribution of cells on the retina.
Chin-Teng Lin & Shi-An Chen 2005, 'Biological Visual Processing for Hybrid-Order Texture Boundary Detection with CNN-UM', 2005 9th International Workshop on Cellular Neural Networks and Their Applications, 2005 9th International Workshop on Cellular Neural Networks and Their Applications, IEEE, pp. 146-149.
View/Download from: Publisher's site
View description>>
This paper investigates a novel biologically inspired visual processing scheme for hybrid-order texture boundary detection. The texture boundary detection is based on first- and second-order features to model the pre-attentive stage of the human visual system. The system is implemented by a cellular neural network universal machine (CNN-UM) with 3×3 templates that approximate the desired filter transfer functions, and can process a 64×64 gray-scale image. The proposed algorithm can be performed successfully by the CNN-UM and detects the texture boundary in a given image.
Chin-Teng Lin & Tsung-Heng Tsai 2005, 'Biological-Inspired Model for Hybrid-Order Chromatic Texture Segregation', 2005 9th International Workshop on Cellular Neural Networks and Their Applications, 2005 9th International Workshop on Cellular Neural Networks and Their Applications, IEEE, pp. 214-218.
View/Download from: Publisher's site
View description>>
In this study, a computational model for chromatic texture segregation is developed. The model attempts to simulate visual processing characteristics by mimicking visual perception. Following Hering's opponent theory, colour information is handled in a colour space with three opponent axes. The algorithm extracts 1st-order features with a Gaussian filter and 2nd-order features with a set of Gabor filters, the so-called Gabor wavelets. The hybrid-order features are combined at a common site to detect the boundary. The model is intuitive and physiologically relevant, leaving room for further approaches.
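The opponent colour space from Hering's theory can be sketched with one common linear transform; the exact transform used in the paper is not given here, and the coefficients below are an illustrative choice, not the authors':

```python
# Illustrative opponent-colour transform (one common linear choice, an
# assumption of this sketch): an achromatic white-black axis plus
# red-green and blue-yellow opponent axes, per Hering's opponent theory.

def rgb_to_opponent(r, g, b):
    """Map an RGB triple (components in 0..1) to (WB, RG, BY) axes."""
    wb = (r + g + b) / 3.0        # achromatic (white-black) axis
    rg = r - g                    # red-green opponency
    by = b - (r + g) / 2.0        # blue-yellow opponency
    return wb, rg, by

pure_red = rgb_to_opponent(1.0, 0.0, 0.0)   # strongly positive on RG axis
```

Texture features (Gaussian and Gabor responses) are then computed per opponent channel rather than per RGB channel.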
Chin-Teng Lin, Kan-Wei Fan & Wen-Chang Cheng 2005, 'An Illumination Estimation Scheme for Color Constancy Based on Chromaticity Histogram and Neural Network', 2005 IEEE International Conference on Systems, Man and Cybernetics, 2005 IEEE International Conference on Systems, Man and Cybernetics, IEEE, Waikoloa, HI, pp. 2488-2494.
View/Download from: Publisher's site
Chin-Teng Lin, Wen-Hung Chao, Yu-Chieh Chen & Sheng-Fu Liang 2005, 'Adaptive Feature Extractions in an EEG-based Alertness Estimation System', 2005 IEEE International Conference on Systems, Man and Cybernetics, 2005 IEEE International Conference on Systems, Man and Cybernetics, IEEE, Waikoloa, HI, pp. 2096-2101.
View/Download from: Publisher's site
Chin-Teng Lin, Yu-Chieh Chen, Ruei-Cheng Wu, Sheng-Fu Liang & Teng-Yi Huang 2005, 'Assessment of Driver’s Driving Performance and Alertness Using EEG-based Fuzzy Neural Networks', 2005 IEEE International Symposium on Circuits and Systems, 2005 IEEE International Symposium on Circuits and Systems, IEEE, Kobe, JAPAN, pp. 152-155.
View/Download from: Publisher's site
Chun Yuan Lin, Ken-Li Lin, Chuen-Der Huang, Hsiu-Ming Chang, Chiao Yun Yang, Chin-Teng Lin, Chuan Yi Tang & Hsu, DF 2005, 'Feature Selection and Combination Criteria for Improving Predictive Accuracy in Protein Structure Classification', Fifth IEEE Symposium on Bioinformatics and Bioengineering (BIBE'05), Fifth IEEE Symposium on Bioinformatics and Bioengineering (BIBE'05), IEEE, Minneapolis, MN, pp. 311-315.
View/Download from: Publisher's site
Cornelis, C, Guo, X, Lu, J & Zhang, G 2005, 'A fuzzy relational approach to event recommendation', Proceedings of the 2nd Indian International Conference on Artificial Intelligence, IICAI 2005, Indian International Conference on Artificial Intelligence, IICAI, Pune, INDIA, pp. 2231-2242.
View description>>
Most existing recommender systems employ collaborative filtering (CF) techniques in making projections about which items an e-service user is likely to be interested in, i.e. they identify correlations between users and recommend items which similar users have liked in the past. Traditional CF techniques, however, have difficulties when confronted with sparse rating data, and cannot cope at all with time-specific items, like events, which typically receive their ratings only after they have finished. Content-based (CB) algorithms, which consider the internal structure of items and recommend items similar to those a user liked in the past, can partly make up for that drawback, but the collaborative feature is then lost entirely. In this paper, modelling user and item similarities as fuzzy relations, which allow the graded/uncertain information in the domain to be reflected flexibly, we develop a novel hybrid CF-CB approach whose rationale is concisely summed up as 'recommending future items if they are similar to past ones that similar users have liked', and which surpasses related work in the same spirit. Copyright © IICAI 2005.
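The rationale "recommend future items if they are similar to past ones that similar users have liked" can be sketched as a sup-min composition of fuzzy relations; the relation values and item/user names below are invented for illustration, not taken from the paper:

```python
# Hedged sketch: hybrid CF-CB as a sup-min composition of three fuzzy
# relations -- user similarity, past liking, and item similarity.
# All membership values below are illustrative assumptions.

def recommend(user, new_item, user_sim, liked, item_sim):
    """Degree to which new_item suits user, via sup-min composition."""
    return max(
        (min(user_sim[user].get(v, 0.0),          # how similar is user v?
             liked[v].get(old, 0.0),              # how much did v like old?
             item_sim[old].get(new_item, 0.0))    # how similar is the new item?
         for v in liked for old in item_sim),
        default=0.0)

user_sim = {"ann": {"ann": 1.0, "bob": 0.8}}
liked    = {"ann": {"concert_a": 0.9}, "bob": {"concert_b": 1.0}}
item_sim = {"concert_a": {"concert_c": 0.4}, "concert_b": {"concert_c": 0.7}}

score = recommend("ann", "concert_c", user_sim, liked, item_sim)
```

The best supporting chain here runs through bob (similarity 0.8) liking concert_b (1.0), which resembles concert_c (0.7), giving min(0.8, 1.0, 0.7) = 0.7.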
Coulin, C, Sahraoui, AEK & Zowghi, D 2005, 'Towards a collaborative and combinational approach to requirements elicitation within a systems engineering framework', 18th International Conference on Systems Engineering, Proceedings, International Conference on Systems Engineering, ICSEng, Las Vegas, USA, pp. 456-461.
View/Download from: Publisher's site
View description>>
Despite its critical importance to the process of systems development, requirements elicitation continues to be a major problem in both research and practice. This complex activity involving many different facets and issues is often performed badly and subsequently blamed for project failure and poor quality systems. In this paper we present a collaborative and combinational approach to requirements elicitation within a systems engineering framework, proposing the application of current research from other disciplines in areas related to requirements elicitation, such as software engineering and the social sciences, to a general systems engineering context. The work provides both researchers and practitioners with an approach to requirements elicitation for systems engineering that can be applied to real-world projects to improve both the process and its results, thereby increasing the overall chance of successful system development in terms of on-schedule and on-budget delivery, and satisfied customers.
Coulin, CR & Zowghi, D 2005, 'What do experts think about elicitation? - A state of practice survey', Proceedings of the 10th Australian workshop on requirements engineering, Australian Workshop on Requirements Engineering, Deakin University, Melbourne, Australia, pp. 1-10.
Devitt, SJ, Fowler, AG & Hollenberg, LCL 2005, 'Investigating the practical implementation of Shor's algorithm', SPIE Proceedings, Smart Materials, Nano-, and Micro-Smart Systems, SPIE, Sydney, AUSTRALIA, pp. 483-483.
View/Download from: Publisher's site
Dong, D, Chen, C & Chen, Z 2005, 'Quantum Reinforcement Learning', Springer Berlin Heidelberg, pp. 686-689.
View/Download from: Publisher's site
Dong, D, Chen, C, Zhang, C & Chen, Z 2005, 'An Autonomous Mobile Robot Based on Quantum Algorithm', Springer Berlin Heidelberg, pp. 393-398.
View/Download from: Publisher's site
Dong, DY & Chen, ZH 2005, 'Perturbation theory for quantum control with weak input', 2005 INTERNATIONAL CONFERENCE ON CONTROL AND AUTOMATION (ICCA), VOLS 1 AND 2, 5th International Conference on Control and Automation, IEEE, HUNGARY, Hungarian Acad Sci, Budapest, pp. 736-740.
Dunsire, K, O'Neill, T, Denford, M & Leaney, J 2005, 'The ABACUS architectural approach to computer-based system and enterprise evolution', 12th IEEE International Conference and Workshops on the Engineering of Computer-Based Systems, Proceedings, IEEE International Conference and Workshop on the Engineering of Computer Based Systems, IEEE, Maryland, USA, pp. 62-69.
View/Download from: Publisher's site
View description>>
The enterprise computer-based systems employed by the organisations of today can be extremely complex. Not only do they consist of countless hardware and software products from many varied sources, but they often span continents, piggybacking on public networks. These systems are essential for undertaking business and general operations in the modern environment, and yet the ability of organisations to control their evolution is questionable. The emerging practice of enterprise architecture seeks to control that complexity through the use of a holistic and top-down perspective. However, the toolsets already in use are very much bottom-up by nature. To overcome the limitations of current enterprise architecture practices, the authors propose the use of the ABACUS methodology and toolset. The authors conclude that by using ABACUS to analyse software and enterprise systems, architects can guide the design and evolution of architectures based on quantifiable non-functional requirements. Furthermore, hierarchical 3D visualisation provides a meaningful and intuitive means for conceiving and communicating complex architectures.
Dyson, LE & Underwood, J 2005, 'Indigenous People on the Web', Building Society Through E-Commerce, Collaborative Electronic Commerce Technology and Research, Universidad de Talca, Talca, Chile, pp. 1-11.
Fenech, BJ & Dovey, KA 2005, 'The role of structure in the failure of organisations to learn and transform', Proceedings of the 6th International conference on organisational learning and knowledge, International conference on organisational learning and knowledge, University of Trento, Trento, Italy, pp. 58-75.
Fincher, S, Lister, R, Clear, T, Robins, A, Tenenberg, J & Petre, M 2005, 'Multi-institutional, multi-national studies in CSEd Research', Proceedings of the 2005 international workshop on Computing education research - ICER '05, the 2005 international workshop, ACM Press, Seattle, USA, pp. 111-121.
View/Download from: Publisher's site
View description>>
One indication of the maturation of Computer Science Education as a research-based discipline is the recent emergence of several large-scale studies spanning multiple institutions. This paper examines a 'family' of these multi-institutional, multi-national studies, detailing core elements and points of difference in both study design and the organization of the research team, and highlighting the costs and benefits associated with the different approaches. Copyright 2005 ACM.
Fowler, AG, Devitt, SJ & Hollenberg, LCL 2005, 'Constructing Steane code fault-tolerant gates', SPIE Proceedings, Smart Materials, Nano-, and Micro-Smart Systems, SPIE, Sydney, AUSTRALIA, pp. 557-557.
View/Download from: Publisher's site
Goyal, ML, Lu, J & Zhang, G 2005, 'Negotiating Multi-Issue e-Market Auction through Fuzzy Attitudes.', CIMCA/IAWTIC, International Conference on Computational Intelligence for Modelling, Control and Automation, IEEE Computer Society, Vienna, Austria, pp. 922-927.
View/Download from: Publisher's site
View description>>
Online auctions are one of the most effective ways of negotiating the sale of goods over the Internet. To be successful in open multi-agent environments, agents must be capable of adapting different strategies and tactics to their prevailing circumstances. This paper presents a software test-bed for studying autonomous bidding strategies in simulated auctions for procuring goods. It shows that the agents' bidding strategies explore the attitudes and behaviours that help agents manage a dynamic assessment of alternative prices for goods under different scenario conditions.
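One way an attitude can shape a bidding strategy is through a time-dependent concession tactic; the formula, parameter names and values below are a classic illustration and an assumption of this sketch, not the paper's model:

```python
# Illustrative time-dependent bidding tactic: an 'attitude' exponent makes
# the agent eager (concede early) or patient (concede late). The formula
# and parameters are assumptions of this sketch, not the paper's.

def bid(t, deadline, reserve, limit, attitude):
    """Bid at time t, conceding from `reserve` toward `limit`.

    attitude < 1: eager, concedes quickly; attitude > 1: patient, holds out.
    """
    frac = min(t / deadline, 1.0) ** attitude
    return reserve + frac * (limit - reserve)

# Buying: open at 10, willing to pay up to 100 by the deadline (t = 10).
eager   = bid(5, 10, 10.0, 100.0, attitude=0.5)   # already well past halfway
patient = bid(5, 10, 10.0, 100.0, attitude=2.0)   # still close to the opening
```

Tuning the attitude per scenario (competition level, remaining time, urgency) is the kind of adaptation the test-bed is built to study.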
Guandong Xu, Yanchun Zhang & Xiaofang Zhou 2005, 'Using Probabilistic Latent Semantic Analysis for Web Page Grouping', 15th International Workshop on Research Issues in Data Engineering: Stream Data Mining and Applications (RIDE-SDMA'05), 15th International Workshop on Research Issues in Data Engineering: Stream Data Mining and Applications (RIDE-SDMA'05), IEEE, Tokyo, Japan, pp. 29-36.
View/Download from: Publisher's site
View description>>
The locality of web pages within a web site is initially determined by the designer's expectation. Web usage mining can discover patterns in the navigational behaviour of web visitors and, in turn, improve web site functionality and service design by taking users' actual opinions into account. Conventional web page clustering techniques are often utilized to reveal the functional similarity of web pages. However, a high-dimensional computation problem is incurred because user transactions are taken as dimensions. In this paper, we propose a new web page grouping approach based on the Probabilistic Latent Semantic Analysis (PLSA) model. An iterative algorithm based on the maximum likelihood principle is employed to overcome the aforementioned computational shortcoming. The web pages are classified into various groups according to user access patterns. Meanwhile, the semantic latent factors or tasks are characterized by extracting the content of the 'dominant' pages related to each factor. We demonstrate the effectiveness of our approach by conducting experiments on real-world data sets. © 2005 IEEE.
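A minimal EM sketch of the PLSA model on toy session-page counts can make the iterative maximum-likelihood idea concrete; the data, random initialisation and grouping rule below are illustrative assumptions, not the paper's experimental setup:

```python
# Minimal PLSA sketch: EM over (session, page) co-occurrence counts,
# fitting P(z), P(session|z) and P(page|z) for latent factors z.
import random

def plsa(counts, n_sessions, n_pages, n_factors, iters=50, seed=1):
    rng = random.Random(seed)

    def norm(v):
        s = sum(v)
        return [x / s for x in v]

    pz = norm([rng.random() + 0.1 for _ in range(n_factors)])
    ps_z = [norm([rng.random() + 0.1 for _ in range(n_sessions)])
            for _ in range(n_factors)]
    pp_z = [norm([rng.random() + 0.1 for _ in range(n_pages)])
            for _ in range(n_factors)]
    for _ in range(iters):
        # E-step: posterior P(z | session, page) for each observed pair.
        post = {}
        for (s, p), c in counts.items():
            w = [pz[z] * ps_z[z][s] * pp_z[z][p] for z in range(n_factors)]
            tot = sum(w) or 1.0
            post[(s, p)] = [x / tot for x in w]
        # M-step: re-estimate the three distributions from expected counts.
        for z in range(n_factors):
            ez = {sp: c * post[sp][z] for sp, c in counts.items()}
            pz[z] = sum(ez.values())
            ps_z[z] = norm([sum(v for (s, _), v in ez.items() if s == i) + 1e-9
                            for i in range(n_sessions)])
            pp_z[z] = norm([sum(v for (_, p), v in ez.items() if p == j) + 1e-9
                            for j in range(n_pages)])
        pz = norm(pz)
    return pz, ps_z, pp_z

# Toy usage: sessions 0-1 visit pages 0-1, sessions 2-3 visit pages 2-3.
counts = {(0, 0): 3, (0, 1): 2, (1, 0): 2, (1, 1): 3,
          (2, 2): 3, (2, 3): 2, (3, 2): 2, (3, 3): 3}
pz, ps_z, pp_z = plsa(counts, 4, 4, 2)

# Group each page under its dominant latent factor ("task").
groups = [max(range(2), key=lambda z: pz[z] * pp_z[z][p]) for p in range(4)]
```

Because the factors, not the user transactions, form the model's dimensions, the grouping avoids the high-dimensional computation of transaction-based clustering.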
Guo, D, Fong, A, Lail, A, O’Sullivan, M, Stone, G, Kiiveri, H, Henry, M, Stephan, D, Dalla-Pozza, L & Catchpoole, DR 2005, 'Simplifying Complex Microarray Data To Derive Gene Expression Profiles Which Identify Childhood Acute Lymphoblastic Leukaemia Patients at Risk of Relapse.', Blood, American Society of Hematology, pp. 4506-4506.
View/Download from: Publisher's site
View description>>
The optimal treatment of patients with childhood acute lymphoblastic leukaemia (ALL) depends on establishing accurate diagnosis. Our investigations seek to strategically develop the application of microarray gene expression profiling to identify ALL patients with clinically homogeneous presentations who may nevertheless respond differently to established treatment regimens. We have determined the gene expression profiles of ALL bone marrow (BM) samples taken from patients at diagnosis. Data analysis has focussed on the use of a novel and innovative statistical technology, Gene-RaVE. This series of patent-protected algorithms builds a multinomial regression model using Bayesian variable selection. Gene-RaVE leads to the generation of a parsimonious and simple diagnostic gene expression signature which nevertheless provides increased predictive ability over current analysis approaches. We describe our analysis of both Affymetrix (HU133A) and cDNA (10.5K) microarray gene expression profiles generated from diagnostic BM from >100 ALL patients covering the broad ALL subtypes, including T and B lineage as well as T cell lymphoma leukaemia. Comparison of gene expression data failed to identify clearly distinguishing profiles between patients identified as ‘standard risk’ and ‘medium risk’ according to BFM95 clinical criteria. Gene expression profiles from a cohort of ALL patients, identified as ‘standard risk’ at diagnosis, were compared on the basis of their overall clinical outcome: relapse within 2 yrs vs non-relapse. Using a range of analyses including t-test, Gene-RaVE, discriminant analysis approaches and principal component analysis, we have discovered that small subsets of genes (<10), all of which included Nedd4BP3 and Ribosomal Protein L38 (RPL38), can be used to distinguish the two outcome groups. Subsequent validation using real time PCR supports the increase in Nedd4BP3 expression in standard risk pat...
Guo, X, Zhang, G, Chew, E & Burdon, S 2005, 'A Hybrid Recommendation Approach for One-and-Only Items', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Australasian Joint Conference on Artificial Intelligence, Springer Berlin Heidelberg, Sydney, Australia, pp. 457-466.
View/Download from: Publisher's site
View description>>
Many mechanisms have been developed to deliver only relevant information to web users and prevent information overload. The most popular recent developments in the e-commerce domain are user-preference-based personalization and recommendation techniques. However, the existing techniques have a major drawback - poor accuracy of recommendation for one-and-only items - because most of them do not understand an item's semantic features and attributes. Thus, in this study, we propose a novel Semantic Product Relevance model and an attendant personalized recommendation approach to assist export businesses in selecting the right international trade exhibitions for market promotion. A recommender system, called Smart Trade Exhibition Finder (STEF), is developed to tailor relevant trade exhibition information to each particular business user. STEF significantly reduces the time, cost and risk faced by exporters in selecting, entering and developing international markets. In particular, the proposed model can be used to overcome the drawback of existing recommendation techniques. © Springer-Verlag Berlin Heidelberg 2005.
Guo-Li Zhang, Hai-Yan Lu, Geng-Yin Li & Guang-Quan Zhang 2005, 'Dynamic economic load dispatch using hybrid genetic algorithm and the method of fuzzy number ranking', 2005 International Conference on Machine Learning and Cybernetics, Proceedings of 2005 International Conference on Machine Learning and Cybernetics, IEEE, Guangzhou, China, pp. 2472-2477 Vol. 4.
View/Download from: Publisher's site
View description>>
This paper proposes a new economic load dispatch model that considers cost coefficients with uncertainties and ramp-rate constraints. The uncertainties are represented by fuzzy numbers, and the model is known as the fuzzy dynamic economic load dispatch (FDELD) model. A novel hybrid genetic algorithm with quasi-simplex techniques is proposed to handle the FDELD problem. The algorithm creates offspring by using genetic operations and quasi-simplex techniques in parallel. The quasi-simplex techniques consider two potentially optimal search directions when generating prospective offspring. One is the worst-opposite direction, as used in conventional simplex techniques; the other is the best-forward direction, a ray from the centroid of the polyhedron whose vertexes are all the points but the best one, towards the best point of the simplex. In addition, in order to preserve more fuzzy information, a fuzzy number ranking method is used to optimize the cost function, avoiding the loss of useful information that results from taking the λ-level set. The experimental study shows that FDELD is a more practical model and that the proposed algorithm and techniques are very efficient in solving the FDELD problem.
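The two quasi-simplex search directions can be sketched geometrically; the step coefficients and the toy fitness function below are illustrative assumptions, not the paper's parameter settings:

```python
# Sketch of the two quasi-simplex directions: the classic worst-opposite
# reflection, and a best-forward ray extended through the best point.
# Step coefficients and the test function are illustrative assumptions.

def quasi_simplex_points(points, fitness, step=1.0):
    """Return trial offspring along the worst-opposite and best-forward rays."""
    ranked = sorted(points, key=fitness)          # minimisation: best first
    best, worst = ranked[0], ranked[-1]
    n = len(points[0])
    # Centroid of all points except the worst, reflected away from the worst.
    others = [p for p in points if p is not worst]
    c_w = [sum(p[i] for p in others) / len(others) for i in range(n)]
    worst_opposite = [c_w[i] + step * (c_w[i] - worst[i]) for i in range(n)]
    # Centroid of all points except the best, extended through the best point.
    rest = [p for p in points if p is not best]
    c_b = [sum(p[i] for p in rest) / len(rest) for i in range(n)]
    best_forward = [c_b[i] + (1 + step) * (best[i] - c_b[i]) for i in range(n)]
    return worst_opposite, best_forward

# Minimise the sphere function over three 2-D vertices.
pts = [(0.0, 0.0), (2.0, 0.0), (0.0, 2.0)]
wo, bf = quasi_simplex_points(pts, fitness=lambda p: p[0]**2 + p[1]**2)
```

In the hybrid GA, candidates generated along both rays would compete with genetically produced offspring in the same generation.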
Hardy, V, Fung, HC, Xian, GS, Wu, JH, Zhang, XY & Dyson, LE 2005, 'Paper usage management and information technology: An environmental case study at an Australian University', Internet and Information Technology in Modern Organizations: Challenges and Answers, Proceedings of the 5th International Business Information Management Association Conference, IBIMA 2005, International Business Information Management, IBIMA, Cairo, Egypt, pp. 699-705.
View description>>
IT was supposed to lead to the paperless office; however, it has actually caused paper usage levels to increase. There are several social explanations for this trend. Printing technology has become more available, and people have more information to print. In addition, people often prefer printed matter to working off a screen. In this article we present a case study of paper usage at a university. It was found that, despite continual pressures escalating paper use, a range of strategies - quotas and charges, double-sided printing, reuse of scrap paper and policy changes - had been successful in reducing usage. Motivational factors included cost saving and a commitment to the environment.
Hazzan, O, Impagliazzo, J, Lister, R & Schocken, S 2005, 'Using history of computing to address problems and opportunities', Proceedings of the 36th SIGCSE technical symposium on Computer science education, SIGCSE05: Technical Symposium on Computer Science Education, ACM, pp. 126-127.
View/Download from: Publisher's site
Hendriks, M & Dyson, LE 2005, 'Motes; The New Privacy Invaders', 2005 Information Resources Management Association International Conference, International Conference on Information Resources Management, Idea Group Publishing, San Diego, USA, pp. 772-775.
Hintz, T, Piccardi, M & He, X 2005, 'Message from the Chairs', Proceedings - 3rd International Conference on Information Technology and Applications, ICITA 2005, pp. iii-iv.
View/Download from: Publisher's site
Huan Huo, Guoren Wang, Chuan Yang & Rui Zhou 2005, 'Signature-based Filtering Techniques for Structural Joins of XML Data', 21st International Conference on Data Engineering Workshops (ICDEW'05), 21st International Conference on Data Engineering Workshops (ICDEW'05), IEEE, pp. 1274-1274.
View/Download from: Publisher's site
Hussain, FK, Chang, E & Dillon, TS 2005, 'Formalizing a grammar for reputation in peer-to-peer communication', MoMM 2005 Proceedings, International Conference on Advances in Mobile Computing and Multimedia, Australian Computer Society, Kuala Lumpur, Malaysia, pp. 81-96.
Hussain, FK, Chang, E & Dillon, TS 2005, 'Markov model for modelling and managing dynamic trust', 2005 3rd IEEE International Conference on Industrial Informatics (INDIN), IEEE International Conference On Industrial Informatics, IEEE, Perth, Australia, pp. 725-733.
View/Download from: Publisher's site
View description>>
In this paper we propose a trust model for decision making that models both the context specific and the dynamic nature of trust. We propose to make use of the Markov Model for modelling the context specific and the dynamic nature of trust. Using the Mar
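The Markov-model idea can be sketched as a trust-state distribution propagated through a transition matrix; the states, matrix values and update rule below are illustrative assumptions, not the paper's calibration:

```python
# Hedged sketch of the Markov-chain idea: discrete trust levels as states,
# with a row-stochastic transition matrix (values invented) describing how
# trust in a peer evolves after each interaction in a given context.

STATES = ["untrusted", "neutral", "trusted"]
# P[i][j]: probability of moving from state i to state j after an interaction.
P = [
    [0.7, 0.2, 0.1],
    [0.2, 0.5, 0.3],
    [0.1, 0.2, 0.7],
]

def step(dist, P):
    """One interaction: propagate the trust-state distribution through P."""
    return [sum(dist[i] * P[i][j] for i in range(len(P)))
            for j in range(len(P))]

dist = [0.0, 1.0, 0.0]          # currently 'neutral' with certainty
for _ in range(3):
    dist = step(dist, P)        # belief about the peer's trust level drifts
```

In practice the matrix would be estimated per context from interaction history, which is what makes the model both context-specific and dynamic.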
Hussain, O, Chang, E, Hussain, F, Dillon, T & Soh, B 2005, 'Context Based Riskiness Assessment', TENCON 2005 - 2005 IEEE Region 10 Conference, TENCON 2005 - 2005 IEEE Region 10 Conference, IEEE, Melbourne, Australia, pp. 1-6.
View/Download from: Publisher's site
View description>>
In almost every interaction, the trusting peer may fear the likelihood of a loss of the resources involved in the transaction. This likelihood of loss is termed the Risk of the transaction. Hence, analysing the Risk involved in a transaction is important in deciding whether or not to proceed with it. If a trusting peer is unfamiliar with a trusted peer and has not interacted with it before in a specific context, it will ask for recommendations from other peers in order to determine the trusted peer's Riskiness value or reputation. In this paper we discuss the process of asking other peers for recommendations in a specific context, and of assimilating those recommendations according to the criteria of the interaction, in order to determine the correct Riskiness value of the trusted peer.
Hussain, O, Chang, E, Hussain, FK, Dillon, TS & Soh, B 2005, 'A Methodology for Determining Riskiness in Peer-to-Peer Communications', Proceedings of the IEEE International Conference of Industrial Informatics (INDIN 05), INDIN, IEEE, Perth, Australia, pp. 421-432.
View description>>
Every activity has some Risk involved in it. Analysing the Risk involved in a transaction is important in deciding whether or not to proceed with it. Similarly, in peer-to-peer communication, analysing the Risk involved in undertaking a transaction with another peer is important. It would be much easier for the trusting peer to decide whether to proceed with a transaction if it knew the level of Risk the trusted peer warrants. In this paper we develop and propose a methodology which allows the trusting peer to rate the trusted peer, in terms of the Risk it deserves, after the transaction is over.
Hussain, O, Chang, E, Soh, B, Hussain, FK & Dillon, TS 2005, 'Factors of Risk Variance in Decentralized Communications', EICAR 2005 Conference Best Paper Proceedings, European Institute for Computer Anti-Virus Research EICAR Conference, Computer Associates, Saint Julians, Malta, pp. 129-137.
View description>>
Decentralized transactions are becoming increasingly popular. These transactions resemble the early forms of the internet and in many ways are regarded as its next generation. The result will be that e-commerce transactions will shift to peer-to-peer communications rather than the client-server environment. However, these peer-to-peer communications, or decentralized transactions, suffer from some disadvantages, including the risk associated with each transaction. This paper focuses on the factors that influence Risk in a decentralized transaction.
Hussain, OK, Chang, E, Hussain, FK, Dillon, TS & Soh, B 2005, 'Risk in Trusted Decentralized Communications', 21st International Conference on Data Engineering Workshops (ICDEW'05), 21st International Conference on Data Engineering Workshops (ICDEW'05), IEEE, Tokyo, Japan, pp. 1198-1198.
View/Download from: Publisher's site
View description>>
Risk is associated with almost every activity undertaken in daily life. Risk is associated with Trust, Security and Privacy. Risk is associated with transactions, businesses, information systems, environments, networks, partnerships, etc. Generally speaking, risk signifies the likelihood of financial losses, human casualties, business destruction and environmental damage. A risk indicator gives early warning to the parties involved and helps avoid disasters. Until now, risk has been discussed extensively in the areas of investment, finance, health, environment, daily life activities and engineering. However, there is no systematic study of risk in decentralised communication, which involves e-business, computer networks and service-oriented environments. In this paper, we define the risk associated with trusted communication in e-business and e-transactions, provide risk indicator calculations and outline basic application areas.
Hussain, OK, Chang, E, Hussain, FK, Dillon, TS & Soh, B 2005, 'A methodology for determining riskiness in peer-to-peer communications', 2005 3rd IEEE International Conference on Industrial Informatics (INDIN), pp. 655-666.
View/Download from: Publisher's site
View description>>
Every activity has some Risk involved in it. Analyzing the Risk involved in a transaction is important in deciding whether or not to proceed with it. Similarly, in Peer-to-Peer communication, analyzing the Risk involved in undertaking a transaction with another peer is important. It would be much easier for the trusting peer to decide whether to proceed with a transaction if it knew the level of Risk that the trusted peer warrants. In this paper we develop and propose a methodology which allows the trusting peer to rate the trusted peer in terms of the Risk it deserves once the transaction is over. © 2005 IEEE.
Chung, J-F, Liu, D-J & Lin, C-T 2005, 'Multiband Room Effect Simulator for 5.1-Channel Sound System', 2005 IEEE International Symposium on Circuits and Systems, 2005 IEEE International Symposium on Circuits and Systems, IEEE, Kobe, Japan, pp. 2859-2862.
View/Download from: Publisher's site
Johnston, A, Amitani, S & Edmonds, E 2005, 'Amplifying reflective thinking in musical performance', Proceedings of the 5th conference on Creativity & cognition - C&C '05, the 5th conference, ACM Press, London, UK, pp. 166-166.
View/Download from: Publisher's site
View description>>
In this paper we report on the development of tools that encourage both a creative and reflective approach to music-making and musical skill development. A theoretical approach to musical skill development is outlined and previous work in the area of music visualisation is discussed. In addition the characterisation of music performance as a type of design problem is discussed and the implications of this position for the design of tools for musicians are outlined. Prototype tools, the design of which is informed by the theories and previous work, are described and some preliminary evaluation of their effectiveness is discussed. Future directions are outlined. Copyright 2005 ACM.
Johnston, A, Marks, B & Edmonds, E 2005, 'An artistic approach to designing visualisations to aid instrumental music learning', IADIS International Conference on Cognition and Exploratory Learning in Digital Age, CELDA 2005, Cognition and Exploratory Learning in Digital Age, IADIS Press, Porto, Portugal, pp. 175-182.
View description>>
This paper describes the development of a computer-based music visualisation to support the development of instrumental musical skills in advanced students and professional players. The underlying pedagogical philosophy, based on the 'Natural Learning Process' and the emergence of an artistic rather than engineering approach to software development, based on participatory design, are described.
Johnston, AJ, Marks, B & Edmonds, EA 2005, ''Spheres of Influence': An Interactive Musical Work', Proceedings of the second Australasian conference on Interactive entertainment, Interactive Entertainment, Creativity Cognition Studios Press, Sydney, Australia, pp. 97-103.
View description>>
In this paper we describe the development of an interactive artwork which incorporates both a musical composition and software which provides a visual and aural accompaniment. The system uses physical modeling to implement a type of virtual 'sonic sculpture' which responds to musical input in a way which appears naturalistic. This work forms part of a larger project to use art to explore the potential of computers to develop interactive tools which support the development of creative musical skills.
Ko, L-W, Kuo, B-C & Lin, C-T 2005, 'An Optimal Nonparametric Weighted System for Hyperspectral Data Classification', Knowledge-Based Intelligent Information and Engineering Systems, Pt 1, Proceedings, 9th International Conference on Knowledge-Based Intelligent Information and Engineering Systems, Springer Berlin Heidelberg, La Trobe University, Melbourne, Australia, pp. 866-872.
View/Download from: Publisher's site
Van, L-D, Yu, Y-C, Huang, C-M & Lin, C-T 2005, 'Low computation cycle and high speed recursive DFT/IDFT: VLSI algorithm and architecture', IEEE Workshop on Signal Processing Systems Design and Implementation, 2005, IEEE Workshop on Signal Processing Systems Design and Implementation, 2005, IEEE, Athens, Greece, pp. 579-584.
View/Download from: Publisher's site
Lee, S, Leaney, J, O'Neill, T & Hunter, M 2005, 'Open Service Access for QoS Control in Next Generation Networks – Improving the OSA/Parlay Connectivity Manager', Lecture Notes in Computer Science: Operations and Management in IP-Based Networks, Proceedings, IPOM, Springer Berlin Heidelberg, Heidelberg, Germany, pp. 29-38.
View/Download from: Publisher's site
View description>>
The need for providing applications with practical, manageable access to feature-rich capabilities of telecommunications networks has resulted in standardization of the OSA/Parlay APIs and more recently the Parlay X Web Services. Connectivity Manager is
Lee, S, Leaney, J, O'Neill, T & Hunter, M 2005, 'Performance Benchmark of a Parallel and Distributed Network Simulator', Workshop on Principles of Advanced and Distributed Simulation (PADS'05), Workshop on Principles of Advanced and Distributed Simulation (PADS'05), IEEE, Monterey, USA, pp. 101-108.
View/Download from: Publisher's site
View description>>
Simulation of large-scale networks requires enormous amounts of memory and processing time. One way of speeding up these simulations is to distribute the model over a number of connected workstations. However, this introduces inefficiencies caused by the need for synchronization and message passing between machines. In distributed network simulation, one of the factors affecting message passing overhead is the amount of cross-traffic between machines. We perform an independent benchmark of the Parallel/Distributed Network Simulator (PDNS) based on experimental results processed at the Australian Centre for Advanced Computing and Communications (ACS) supercomputing cluster. We measure the effect of cross-traffic on wall-clock time needed to complete a simulation for a set of basic network topologies by comparing the result with the wall-clock time needed on a single processor. Our results show that although efficiency is reduced with large amounts of cross-traffic, speedup can still be achieved with PDNS. With these results, we developed a performance model that can be used as a guideline for designing future simulations. © 2005 IEEE.
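The speedup and efficiency figures such a benchmark reports follow directly from the wall-clock times being compared; a minimal sketch of those standard definitions (the timing numbers below are hypothetical, not the paper's measurements):

```python
def speedup(t_single, t_distributed):
    """Speedup relative to the single-processor wall-clock time."""
    return t_single / t_distributed

def efficiency(t_single, t_distributed, n_machines):
    """Speedup per machine; 1.0 means perfectly linear scaling.
    Synchronization and cross-traffic between machines typically
    push this below 1, as the paper's results show."""
    return speedup(t_single, t_distributed) / n_machines

# Hypothetical wall-clock times: 600 s on one processor, 200 s on 4 machines
print(speedup(600.0, 200.0))        # 3.0
print(efficiency(600.0, 200.0, 4))  # 0.75
```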
Lee, S, Leaney, JR, O'Neill, T & Hunter, M 2005, 'Open API of QoS control in Next Generation Networks', Toward Managed Ubiquitous Information Society, Asia-Pacific Network Operations and Management Symposium, IEICE TM, KICS KNOM, IEEE CNOM, IEEE APB, IEEE COMSOC Japan Chapter and TMF, Okinawa, Japan, pp. 295-306.
Leong, T, Vetere, F & Howard, S 2005, 'The Serendipity Shuffle', Australia conference on Computer-Human Interaction, ACM Press, Canberra, pp. 1-4.
Lin, CT & Shou, YW 2005, 'Texture classification and representation by CNN-based feature extraction', Proceedings of the IEEE International Workshop on Cellular Neural Networks and their Applications, pp. 210-213.
View description>>
This paper proposes a novel approach to texture classification based on feature extraction with cellular neural networks (CNNs) and an intelligent arrangement in the design, or exact use, of the CNN templates. The paper also gives a two-way processing mechanism, including analysis of the features extracted from the CNN output mapping and a selective training step that obtains specific CNN templates by genetic algorithms (GAs) for more complicated texture patterns. We also introduce a one-dimensional feature curve that captures the characteristics of the original texture patterns from the CNN output mapping for the subsequent texture classification. The method introduced here can adaptively choose an appropriate processing procedure for specific texture patterns. We divide our experiments into two sections, covering simple and advanced texture classification problems. Our experimental results demonstrate valid template training and at the same time show satisfactory classification outcomes on both defined texture problems.
Lin, CT, Chin, CL, Fan, KW & Lin, CY 2005, 'A novel architecture for converting single 2D image into 3D effect image', Proceedings of the IEEE International Workshop on Cellular Neural Networks and their Applications, pp. 52-55.
View description>>
A novel 2D-to-3D effect image conversion architecture integrating an image segmentation system and depth estimation is presented. Its objective is to describe the technique used to generate stereo pair images starting from a single image source (a viewpoint) and its related depth map. The conjunction of the segmented image and the depth map allows the binocular view to be reconstructed artificially, producing a 3D effect that depends on screen dimension and image resolution. The system consists of an on-line ICA mixture model that performs image segmentation and a depth estimation method that obtains the depth information of the image. The proposed system is capable of performing automatic multilevel segmentation of images based on the depth information obtained by the image depth estimation method. The system does not reconstruct the real 3D coordinates of any object in the scene, but simply assigns the most comfortable shift to the source points to give a 3D effect to the viewer. A simple smoothing filtering technique adaptively resolves the occlusions generated by shifting of the source points without introducing visible and/or annoying artifacts.
Lin, CT, Liang, SF, Yeh, CM & Fan, KW 2005, 'Fuzzy neural network design using support vector regression for function approximation with outliers', Conference Proceedings - IEEE International Conference on Systems, Man and Cybernetics, IEEE International Conference on Systems, Man and Cybernetics, IEEE, Waikoloa, HI, pp. 2763-2768.
View description>>
A fuzzy neural network based on a support vector learning mechanism for function approximation is proposed in this paper. Support vector regression (SVR) is a novel method for tackling the problems of function approximation and regression estimation based on statistical learning theory, and has been shown to have robust properties against noise. A novel support-vector-regression-based fuzzy neural network (SVRFNN), integrating SVR technology into an FNN, is developed. The SVRFNN combines the high accuracy and robustness of SVR with the efficient human-like reasoning of the FNN for function approximation. Experimental results show that the proposed SVRFNN can achieve good approximation performance with a drastically reduced number of fuzzy kernel functions. © 2005 IEEE.
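The robustness to outliers that SVR contributes comes from its standard ε-insensitive loss, which is zero inside a tube around the regression function and grows only linearly outside it; a minimal sketch of that loss (the paper's SVRFNN itself is not reproduced here):

```python
def eps_insensitive_loss(residual, eps=0.1):
    """SVR's epsilon-insensitive loss: zero inside the eps-tube around
    the regression function, then linear in the residual. The linear
    (rather than squared) growth limits the influence of outliers."""
    return max(0.0, abs(residual) - eps)

# A residual of 3.0 costs 2.9 here, versus 9.0 under squared error
print(eps_insensitive_loss(0.05))  # 0.0 - inside the tube, no penalty
print(eps_insensitive_loss(3.0))   # 2.9
```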
Lin, P, MacArthur, A & Leaney, J 2005, 'Defining Autonomic Computing: A Software Engineering Perspective', 2005 Australian Software Engineering Conference, 2005 Australian Software Engineering Conference, IEEE, Brisbane, Australia, pp. 88-97.
View/Download from: Publisher's site
View description>>
As a rapidly growing field, Autonomic Computing is a promising new approach for developing large scale distributed systems. However, while the vision of achieving self-management in computing systems is well established, the field still lacks a commonly accepted definition of what an Autonomic Computing system is. Without a common definition to dictate the direction of development, it is not possible to know whether a system or technology is a part of Autonomic Computing, or if in fact an Autonomic Computing system has already been built. The purpose of this paper is to establish a standardised and quantitative definition of Autonomic Computing through the application of the Quality Metrics Framework described in IEEE Std 1061-1998 [1]. Through the application of this methodology, stakeholders were systematically analysed and evaluated to obtain a balanced and structured definition of Autonomic Computing. This definition allows for further development and implementation of quality metrics, which are project-specific, quantitative measurements that can be used to validate the success of future Autonomic Computing projects.
Lister, RF 2005, 'Methods for evaluating the appropriateness and effectiveness of summative assessment via multiple choice examinations for technology-focused disciplines', Making a Difference: 2005 Evaluations and Assessment Conference, Evaluations and Assessment Conference, UTS, Sydney, Australia, pp. 75-84.
Liu, W, Zhu, X, Xu, G, Zhang, Q & Gao, L 2005, 'A DNA Based Evolutionary Algorithm for the Minimal Set Cover Problem', Lecture Notes in Computer Science, International Conference on Intelligent Computing, ICIC 2005, Springer Berlin Heidelberg, Hefei, China, pp. 80-89.
View/Download from: Publisher's site
View description>>
With the birth of DNA computing, Paun et al. proposed an elegant algorithm for this problem based on the sticker model proposed by Roweis. However, the drawback of this algorithm is that the 'exponential curse' is hard to overcome, and therefore its application to large instances is limited. In this paper, we present a DNA-based evolutionary algorithm to solve this problem, which takes advantage of both massive parallelism and the evolution strategy of traditional EAs. The fitness of individuals is defined as the negative value of their length. Both crossover and mutation can be implemented in a reshuffle process. We also present a short discussion of population size, mutation probability, crossover probability, and genetic operations over multiple points. Finally, we present some problems that need to be considered further in future work. © Springer-Verlag Berlin Heidelberg 2005.
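The fitness stated in the abstract (the negative of the individual's length, so that shorter covers score higher) can be sketched for minimal set cover as follows; the infeasibility penalty is an assumption added only so the sketch is self-contained:

```python
def fitness(candidate, subsets, universe):
    """Fitness of a candidate cover, as in the abstract: the negative
    of its length (number of selected subsets). Candidates that fail
    to cover the universe get -inf (the penalty form is an assumption,
    not taken from the paper)."""
    covered = set()
    for i in candidate:
        covered |= subsets[i]
    if covered != universe:
        return float("-inf")
    return -len(candidate)

subsets = [{1, 2}, {2, 3}, {3}]
universe = {1, 2, 3}
print(fitness([0, 1], subsets, universe))     # -2: a valid 2-subset cover
print(fitness([0, 1, 2], subsets, universe))  # -3: valid but longer, so worse
print(fitness([0], subsets, universe))        # -inf: element 3 is uncovered
```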
Maxwell, C, Parakhine, A & Leaney, J 2005, 'Practical application of formal methods for specification and analysis of software architecture', 2005 Australian Software Engineering Conference, Proceedings, Australian Software Engineering Conference, IEEE, Brisbane, Australia, pp. 302-311.
View/Download from: Publisher's site
View description>>
With the ever-growing pace of technological advancement, computer software is required to become increasingly complex to meet the demands of today's leading edge technologies, and their applications. However, fulfilling this requirement creates new, previously unknown, problems pertaining to non-functional properties of software. Specifically, as the software complexity escalates, it becomes increasingly difficult to scale the software in order to cope with the sometimes overwhelming demand created by system growth. It is therefore essential to have processes for addressing the issues associated with scalability that arise due to the complexity in software systems. In this paper we describe an approach aimed at fulfilling the need for such processes. A combination of Object-Z and temporal logic is used to create an architectural description open to further analysis. We also demonstrate the practicality of this methodology within the context of the coordinated adaptive traffic system (CATS).
Maxwell, C, Parakhine, A, Leaney, J, O'Neill, T & Denford, M 2005, 'Heuristic-based architecture generation for complex computer system optimisation', 12th IEEE International Conference and Workshops on the Engineering of Computer-Based Systems, Proceedings, IEEE International Conference and Workshop on the Engineering of Computer Based Systems, IEEE, Greenbelt, USA, pp. 70-78.
View/Download from: Publisher's site
View description>>
Having come of age in the last decade, the use of architecture to describe complex systems, especially in software, is now maturing. With our current ability to describe, represent, analyse and evaluate architectures comes the next logical step in our application of architecture to system design and optimisation. Driven by the increasing scale and complexity of modern systems, the designers have been forced to find new ways of managing the difficult and complex task of balancing the quality trade-offs inherent in all architectures. Architecture-based optimisation has the potential to not only provide designers with a practical method for approaching this task, but also to provide a generic mechanism for increasing the overall quality of system design. In this paper we explore the issues that surround the development of architectural optimisation and present an example of heuristic-based optimisation of a system with respect to concrete goals.
McGregor, C, Heath, J & Wei, M 2005, 'A Web services based framework for the transmission of physiological data for local and remote neonatal intensive care', 2005 IEEE International Conference on e-Technology, e-Commerce and e-Service, Proceedings, The 2005 IEEE International Conference on e-Technology, e-Commerce and e-Service, IEEE, Hong Kong Baptist University, Hong Kong, China, pp. 496-501.
View/Download from: Publisher's site
McGregor, C, Kneale, B & Tracy, M 2005, 'Bush Babies Broadband: On-Demand Virtual Neonatal Intensive Care Unit Support for Regional Australia', Third International Conference on Information Technology and Applications (ICITA'05), Proceedings, Third International Conference on Information Technology and Applications, IEEE, Sydney, Australia, pp. 113-118.
View/Download from: Publisher's site
McGregor, C, Purdy, M & Kneale, B 2005, 'Compression of XML Physiological Data Streams to Support Neonatal Intensive Care Unit Web Services', 2005 IEEE International Conference on e-Technology, e-Commerce and e-Service, 2005 IEEE International Conference on e-Technology, e-Commerce and e-Service, IEEE, Hong Kong Baptist University, Hong Kong, China, pp. 486-489.
View/Download from: Publisher's site
Milton, J, Kennedy, P & Mitchell, H 2005, 'The effect of mutation on the accumulation of information in a genetic algorithm', AI 2005: Advances in Artificial Intelligence, Australasian Joint Conference on Artificial Intelligence, Springer, Sydney, Australia, pp. 360-368.
View/Download from: Publisher's site
View description>>
We use an information theory approach to investigate the role of mutation on Genetic Algorithms (GA). The concept of solution alleles representing information in the GA and the associated concept of information density, being the average frequency of solution alleles in the population, are introduced. Using these concepts, we show that mutation applied indiscriminately across the population has, on average, a detrimental effect on the accumulation of solution alleles within the population and hence the construction of the solution. Mutation is shown to reliably promote the accumulation of solution alleles only when it is targeted at individuals with a lower information density than the mutation source. When individuals with a lower information density than the mutation source are targeted for mutation, very high rates of mutation can be used. This significantly increases the diversity of alleles present in the population, while also increasing the average occurrence of solution alleles.
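The paper's notion of information density (the average frequency of solution alleles in the population) translates directly into code; the bitstring encoding below is an illustrative assumption:

```python
def information_density(population, solution):
    """Average frequency of solution alleles across the population:
    the fraction of loci, over all individuals, that already carry
    the allele found at that locus in the known solution."""
    matches = sum(a == s
                  for individual in population
                  for a, s in zip(individual, solution))
    return matches / (len(population) * len(solution))

def should_mutate(individual, solution, source_density):
    """Per the paper's finding, mutation reliably promotes solution
    alleles only when it targets individuals whose information density
    is below that of the mutation source."""
    return information_density([individual], solution) < source_density

# Half of all loci in this population carry the solution allele
print(information_density(["1100", "1111", "0000", "0011"], "1111"))  # 0.5
```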
Moscato, P, Mathieson, L, Mendes, A & Berretta, R 2005, 'The electronic primaries: Predicting the U.S. Presidency using feature selection with safe data reduction', Conferences in Research and Practice in Information Technology Series, pp. 371-380.
View description>>
The data-mining-inspired problem of finding the critical and most useful features with which to classify a data set, and constructing rules to predict the class of future examples, is an interesting and important problem. It is also one of the most useful problems, with applications in many areas such as microarray analysis, genomics, proteomics, pattern recognition, data compression and knowledge discovery. Expressed as the κ-Feature Set problem, it is also formally hard. In this paper we present a method for coping with this hardness using the combinatorial optimisation and parameterized complexity inspired technique of sound reduction rules. We apply our method to an interesting data set which is used to predict the winner of the popular vote in the U.S. presidential elections. We demonstrate the power and flexibility of the reductions, especially when used in the context of the (α, β)-κ-Feature Set variant problem. Copyright © 2005, Australian Computer Society, Inc.
Nataatmadja, I & Dyson, LE 2005, 'Managing the modern workforce: Cultural Diversity and Its Implications', Managing Modern Organisations with Information Technology, International Conference on Information Resources Management, Idea Group Publishing, San Diego, USA, pp. 580-583.
Nurmuliani, N, Zowghi, D & Williams, S 2005, 'Characterising requirements volatility: An empirical case study', Proceedings 2005 International Symposium on Empirical Software Engineering ISESE 2005, International Symposium on Empirical Software Engineering, IEEE, Noosa Heads, Australia, pp. 427-436.
Nurmuliani, N, Zowghi, D & Williams, SP 2005, 'Characterising requirements volatility: An empirical case study', 2005 International Symposium on Empirical Software Engineering (ISESE), Proceedings, 4th International Symposium on Empirical Software Engineering, IEEE, Noosa Heads, Australia, pp. 412-421.
Pattinson, HM & Sood, SC 2005, 'Deciphering storylines in B2B selling-buying interactions', Advances in Marketing: Managerial, Pedagogical, Theoretical - Proceedings of the Annual Meeting of the Society for Marketing Advances, Annual Meeting of the Society for Marketing Advances, Society for Marketing Advances, San Antonio, USA, pp. 199-201.
Pietroni, N & Ganovelli, AGF 2005, 'Robust segmentation of anatomical structures with deformable surfaces and marching cubes', Citeseer.
Riedel, S & Gabrys, B 2005, 'Hierarchical Multilevel Approaches of Forecast Combination', Operations Research Proceedings 2004, Annual International Conference of the German Operations Research Society, Springer Berlin Heidelberg, Tilburg, Netherlands, pp. 479-486.
View/Download from: Publisher's site
Sarosa, S & Zowghi, D 2005, 'Information technology adoption process within Indonesian SMEs: An empirical study', ACIS 2005 Proceedings - 16th Australasian Conference on Information Systems, Australasian Conference on Information Systems, Australasian Chapter of the Association for Information Systems, Sydney, Australia, pp. 1-9.
View description>>
IT adoption within SMEs has been covered extensively in the literature, most of which considers IT adoption from a narrow perspective such as the drivers of and barriers to adoption. IT adoption is better defined as a process which involves the organisation and its components, stakeholders external to the organisation, and interactions within the organisation and between the organisation and its stakeholders. This paper uses multiple perspectives on IT adoption to build a model of the adoption process. A field study involving 35 Indonesian SMEs was conducted in the form of semi-structured interviews. The results from this field study were analysed and used to refine the proposed model. © 2005.
Sarosa, S & Zowghi, D 2005, 'Recover from information system failure: An Indonesian case study', European and Mediterranean Conference on Information Systems, EMCIS 2005, European, Mediterranean and Middle Eastern Conference on Information Systems, Information Institute, Cairo, Egypt, pp. 1-11.
View description>>
Small and Medium Enterprises (SMEs) sometimes acquire information systems that fail to meet their original aims and objectives. In these circumstances, the project sponsors are forced to decide whether they should abandon the system they have paid for or improvise by finding a way around the problem. This paper presents a case study of two Indonesian SMEs who had to deal with information systems failure within their organizations. Although reports of these types of failure can be found within the information systems literature, little is known about the aftermath of failure within SMEs. This case study presents the actions taken by two Indonesian SMEs who had to face the failure of their web catalogue systems. The notion of IS failure used in this paper is a combination of 'expectation failure' and 'termination failure'.
Liang, S-F, Lin, C-T, Wu, R-C, Huang, T-Y & Chao, W-H 2005, 'Classification of Driver's Cognitive Responses From EEG Analysis', 2005 IEEE International Symposium on Circuits and Systems, 2005 IEEE International Symposium on Circuits and Systems, IEEE, Kobe, Japan, pp. 156-159.
View/Download from: Publisher's site
Sheridan-Smith, N, Leaney, J, O'Neill, T & Hunter, M 2005, 'A Policy-Driven Autonomous System for Evolutive and Adaptive Management of Complex Services and Networks', 12th IEEE International Conference and Workshops on the Engineering of Computer-Based Systems (ECBS'05), 12th IEEE International Conference and Workshops on the Engineering of Computer-Based Systems (ECBS'05), IEEE, Greenbelt, Maryland, USA, pp. 389-397.
View/Download from: Publisher's site
View description>>
Many existing management systems are not evolutive or adaptive, leading to multiplicity over time and increasing the management burden. Policy-based management approaches may assist in making networks less complex and more automated, but to date they have not been able to evolve to support new service sets or provide the capacity for differentiation. We present the architecture for a policy-based system named Pronto that helps to deal with these issues. Layered network and service models are built above an extensible virtual device model that supports heterogeneous management interfaces. Interchangeable management components provide the basic building blocks to construct logical services. The integrated policy-driven service definition language automates the management of the services in a manner that is adaptive, dynamic and reactive, to improve the user's overall service experience. © 2005 IEEE.
Sheridan-Smith, N, O'Neill, T, Leaney, J & Hunter, M 2005, 'Enhancements to Policy Distribution for Control Flow and Looping', Lecture Notes in Computer Science Vol 3775/2005, IFIP/IEEE International Workshop on Distributed Systems Operations and Management, Springer Berlin Heidelberg, Barcelona, Spain, pp. 269-280.
View/Download from: Publisher's site
View description>>
Our previous work proposed a simple algorithm for the distribution and coordination of network management policies across a number of autonomous management nodes by partitioning an Abstract Syntax Tree into different branches and specifying coordination points based on data and control flow dependencies. We now extend this work to support more complex policies containing control flow logic and looping, which are part of the PRONTO policy language. Early simulation results demonstrate the potential performance and scalability characteristics of this framework.
Sheridan-Smith, N, O'Neill, T, Leaney, J & Hunter, M 2005, 'Distribution and coordination of policies for large-scale service management', LANOMS 2005: 4th Latin American Network Operations and Management Symposium Proceedings, pp. 257-262.
View description>>
The distribution and coordination of policies is often overlooked but is crucial to the scalability of dynamic, personalised services. In this work we partition an Abstract Syntax Tree of the policies to determine the responsibility of different management nodes in a geographically segregated network (i.e. management by delegation). This partitioning is combined with IN/OUT set analysis to determine the required coordination for policy enforcement of complex policies with inter-dependencies. Our simulation results show that this approach is promising, as higher decision loads can be readily handled by further sub-dividing of the network.
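The IN/OUT-set analysis mentioned above can be sketched as a pairwise dependency check between AST branches assigned to different management nodes; the data layout and names below are illustrative assumptions, not the paper's implementation:

```python
def coordination_points(branches):
    """Given each branch's IN set (names it reads) and OUT set (names it
    writes), return the (producer, consumer) pairs that must coordinate:
    branch b depends on branch a whenever b reads something a writes."""
    pairs = []
    for a, (_, out_a) in branches.items():
        for b, (in_b, _) in branches.items():
            if a != b and out_a & in_b:
                pairs.append((a, b))
    return pairs

# Two management nodes: node1 computes a bandwidth limit that node2 enforces
branches = {
    "node1": (set(), {"bw_limit"}),
    "node2": ({"bw_limit", "user_class"}, set()),
}
print(coordination_points(branches))  # [('node1', 'node2')]
```

Branches whose IN/OUT sets do not intersect need no coordination point and can be enforced fully independently, which is what allows decision load to be spread by further subdividing the network.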
Sheridan-Smith, NB, O'Neill, T, Leaney, JR & Hunter, M 2005, 'Distribution and Coordination of Policies for Large-Scale Service Management', Proceedings of the IV Latin American Network Operations and Management Symposium, 4th Latin American Network Operations and Management Symposium (LANOMS), Porto Alegre, Brazil, pp. 1-12.
Shi, CG, Lu, J, Zhang, GQ & Zhou, H 2005, 'An extended Kuhn-Tucker approach for linear bilevel multifollower programming with partial shared variables among followers', International Conference on Systems, Man and Cybernetics, Vol 1-4, Proceedings, IEEE Conference on Systems, Man and Cybernetics, IEEE Publisher, Hawaii, USA, pp. 3350-3357.
View/Download from: Publisher's site
View description>>
In real-world bilevel decision-making, the lower level of a bilevel decision usually involves multiple decision units. This paper proposes an extended Kuhn-Tucker approach for linear bilevel multifollower programming problems with partial shared variables among followers. Finally, numerical examples are given to show how the extended Kuhn-Tucker approach works.
Sood, SC & Pattinson, HM 2005, 'Semantics in marketspace: emerging semantic marketing computer-mediated environments', Advances in Marketing: Managerial, Pedagogical, Theoretical - Proceedings of the Annual Meeting of the Society for Marketing Advances, Annual Meeting of the Society for Marketing Advances, Society for Marketing Advances, San Antonio, USA, pp. 198-198.
Sood, SC & Pattinson, HM 2005, 'Urban renewal in Asia-Pacific: A comparative analysis of 'brainports' for Sydney and Kuala Lumpur', Dealing with Dualities - 21st Annual IMP Conference, Annual IMP Conference, IMP Group, Rotterdam, Netherlands, pp. 1-6.
Subkey, A, Cabrera, D & Ferguson, S 2005, 'Localization and image size effects for low frequency sound', Audio Engineering Society 118th Convention Spring Preprints 2005, pp. 1974-1983.
View description>>
Using four subwoofers, this study investigates horizontal auditory image characteristics for one-third octave bands of pink noise in the frequency range 25 Hz to 100 Hz. The subwoofers were located at 90 degree intervals: 45 degrees to the left and right, and in front of and behind the subject. Single noise bands, coherent pairs, and incoherent pairs were subjectively assessed. Subjects drew the auditory image as an ellipse on a response sheet. Results indicate that left-right discrimination occurs even at the lowest frequencies of human hearing - a finding consistent with other recent research. Image width and depth are correlated, increasing at low frequencies for the stimuli tested, and for simultaneous presentation of coherent or incoherent signals. Like other recently published studies using multiple channels of low frequency sound, this study indicates that multiple subwoofers should be beneficial in multichannel audio systems.
van den Hoven, E & Eggen, B 2005, 'Personal souvenirs as ambient intelligent objects', Proceedings of the 2005 joint conference on Smart objects and ambient intelligence: innovative context-aware services: usages and technologies, sOc-EUSAI05: Smart Objects & Ambient Intelligence, ACM, Grenoble, France, pp. 123-128.
View/Download from: Publisher's site
View description>>
Recollecting memories is an important everyday activity, which can be supported in an Ambient Intelligent environment. For optimal support, cues are needed that help people reconstruct their memories. The cue category most suitable for an Ambient Intelligent environment is physical objects, more specifically souvenirs. This paper shows that personal souvenirs are suitable for use in an Ambient Intelligent recollecting application.
Voinov, AA 2005, 'Understanding and communicating sustainability: Global versus regional', AIChE Annual Meeting Conference Proceedings, p. 12970.
View description>>
Sustainability in its present connotation is a Western concept that has emerged in the West and largely represents the attitudes of the developed world. Systems in the developing countries are in transition that is further promoted by globalization. They are foreign to sustainability because by definition they are geared toward change rather than maintenance: they are either in the release or renewal stages, which hardly anybody wishes to sustain, or have just entered the growth stage. Sustainability is enticing for the developed economic systems, which have reached the conservation phase and would rather endure in this stage. In communicating the knowledge of sustainability it is essential to adapt to the local specifics and redefine sustainability accordingly. Local sustainability can be ensured only by borrowing energy, resources and adaptive potential from outside of the system, or by decreasing the sustainability of the global system. Sustainability of a subsystem is achieved at the expense of the supersystem or other subsystems. Therefore, institutions that are to maintain life support systems on this planet need to emphasize global priorities and test policies and strategies against the sustainability of the biosphere, rather than regional or local sustainability. We illustrate these ideas with our findings in the Kola Peninsula (Russia) and in the Mekong watershed.
Wang, C, Lu, J & Zhang, GQ 2005, 'A semantic classification approach for online product reviews', 2005 IEEE/WIC/ACM International Conference on Web Intelligence, Proceedings, IEEE/WIC/ACM International Conference on Web Intelligence and Intelligent Agent Technology, IEEE, France, pp. 276-279.
View/Download from: Publisher's site
View description>>
With the fast growth of e-commerce, product reviews on the Web have become an important information source for customers' decision making when they plan to buy products online. As the reviews are often too numerous for customers to go through, how to automatically classify them into different semantic orientations (i.e. recommend/not recommend) has become a research problem. Different from traditional approaches that treat a review as a whole, our approach performs semantic classification at the sentence level, recognising that reviews often contain mixed feelings or opinions. In this approach, a typical feature selection method based on sentence tagging is employed and a naïve Bayes classifier is used to create a base classification model, which is then combined with certain heuristic rules for review sentence classification. Experiments show that this approach achieves better results than using general naïve Bayes classifiers.
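The sentence-level pipeline described above pairs a naïve Bayes base classifier with heuristic rules. A minimal, self-contained sketch follows; plain word-count features and a majority-vote rule stand in for the paper's sentence tagging and heuristics, which are not reproduced here:

```python
from collections import Counter
import math

def train_nb(sentences, labels):
    """Train a word-count naive Bayes model (Laplace smoothing at classify time)."""
    word_counts = {c: Counter() for c in set(labels)}
    class_counts = Counter(labels)
    for sent, label in zip(sentences, labels):
        word_counts[label].update(sent.lower().split())
    vocab = {w for c in word_counts for w in word_counts[c]}
    return word_counts, class_counts, vocab

def classify(sentence, model):
    """Label one sentence by maximum log-posterior."""
    word_counts, class_counts, vocab = model
    total = sum(class_counts.values())
    best, best_lp = None, -math.inf
    for c in class_counts:
        lp = math.log(class_counts[c] / total)  # class prior
        denom = sum(word_counts[c].values()) + len(vocab)
        for w in sentence.lower().split():
            lp += math.log((word_counts[c][w] + 1) / denom)  # Laplace-smoothed likelihood
        if lp > best_lp:
            best, best_lp = c, lp
    return best

def review_orientation(review_sentences, model):
    """Combine sentence labels with a simple heuristic: majority vote."""
    votes = Counter(classify(s, model) for s in review_sentences)
    return votes.most_common(1)[0][0]
```

A review with mixed opinions is thus labelled by its dominant sentence-level orientation rather than by one global score.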
Weakley, AJ, Johnston, AJ, Edmonds, EA & Turner, GA 2005, 'Creative Collaboration: Communication Translation and Generation in the Development of a Computer-based Artwork', HCI International 2005 - 11th International Conference on Human-Computer Interaction, International Conference on Human-Computer Interaction, Lawrence Erlbaum Assoc, Las Vegas, Nevada, pp. 1-9.
Wen-Chang Cheng & Chin-Teng Lin 2005, 'A Novel Post-Nonlinear ICA-Based Reflectance Model for 3D Surface Reconstruction', 2005 IEEE International Symposium on Circuits and Systems, 2005 IEEE International Symposium on Circuits and Systems, IEEE, Kobe, JAPAN, pp. 3023-3026.
View/Download from: Publisher's site
Wu, F, Lu, J & Zhang, G 2005, 'A decision support system prototype for fuzzy multiple objective optimization', Proceedings of the 4th Conference of the European Society for Fuzzy Logic and Technology and 11th French Days on Fuzzy Logic and Applications (EUSFLAT-LFA 2005) Joint Conference, European Society for Fuzzy Logic and Technology and 11th Rencontres Francophones sur la Logique Floue et ses Applications, Universitat Politècnica de Catalunya, Barcelona, Spain, pp. 985-990.
View description>>
An interactive fuzzy multiple objective optimization method was proposed for solving fuzzy multiple objective linear programming problems in which the fuzzy parameters in both objective functions and constraints, and the fuzzy goals of objectives, can take any form of membership function. Based on this method, in this paper, a fuzzy multiple objective decision support system prototype is developed. A detailed description of the method and system is then supplied.
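Two common ingredients of such interactive fuzzy multi-objective methods are a membership function per fuzzy goal and max-min (Bellman-Zadeh) aggregation; a minimal sketch follows. The paper's method admits arbitrary membership forms, so the piecewise-linear functions here are an illustrative assumption:

```python
def linear_membership(value, worst, best):
    """Degree in [0, 1] to which an objective value satisfies its fuzzy goal.

    `worst`/`best` are the decision maker's reference levels; the direction
    of improvement is inferred from their order."""
    if best > worst:          # maximisation goal
        t = (value - worst) / (best - worst)
    else:                     # minimisation goal
        t = (worst - value) / (worst - best)
    return max(0.0, min(1.0, t))

def overall_satisfaction(values, goals):
    """Max-min aggregation: a solution is only as good as its worst objective."""
    return min(linear_membership(v, worst, best)
               for v, (worst, best) in zip(values, goals))
```

An interactive method would then ask the decision maker to revise reference levels and re-optimise the overall satisfaction at each round.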
Xu, G, Zhang, Y & Zhou, X 2005, 'A Web Recommendation Technique Based on Probabilistic Latent Semantic Analysis', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 6th International Conference on Web Information Systems Engineering, Springer Berlin Heidelberg, New York, NY, USA, pp. 15-28.
View/Download from: Publisher's site
View description>>
Web transaction data between Web visitors and Web functionalities usually convey users' task-oriented behavior patterns, and mining such click-stream data can capture usage pattern information. Nowadays Web usage mining has become one of the most widely used methods for Web recommendation, which customizes Web content to a user-preferred style. Traditional techniques of Web usage mining, such as Web user session or Web page clustering, association rule and frequent navigational path mining, can only discover usage patterns explicitly. They cannot, however, reveal the underlying navigational activities or identify the latent relationships associated with the patterns among Web users and Web pages. In this work, we propose a Web recommendation framework incorporating a Web usage mining technique based on the Probabilistic Latent Semantic Analysis (PLSA) model. The main advantage of this method is that it discovers not only usage-based access patterns but also the underlying latent factors. With the discovered user access patterns, we then present users with content of greater interest via collaborative recommendation. To validate the effectiveness of the proposed approach, we conduct experiments on real-world datasets and make comparisons with some existing traditional techniques. The preliminary experimental results demonstrate the usability of the proposed approach. © Springer-Verlag Berlin Heidelberg 2005.
Xu, G, Zhang, Y & Zhou, X 2005, 'Towards User Profiling for Web Recommendation', Lecture Notes in Computer Science, 18th Australian Joint Conference on Artificial Intelligence, Springer Berlin Heidelberg, Sydney, Australia, pp. 415-424.
View/Download from: Publisher's site
Xu, G, Zhang, Y, Ma, J & Zhou, X 2005, 'Discovering user access pattern based on probabilistic latent factor model', Conferences in Research and Practice in Information Technology Series, 16th Australasian Database Conference, Australian Computer Society, Newcastle, Australia, pp. 27-35.
View description>>
There has been an increased demand for characterizing user access patterns using web mining techniques, since the informative knowledge extracted from web server log files can offer benefits not only for web site structure improvement but also for a better understanding of user navigational behavior. In this paper, we present a web usage mining method which utilizes web usage and page linkage information to capture user access patterns based on the Probabilistic Latent Semantic Analysis (PLSA) model. A specific probabilistic model analysis algorithm, the EM algorithm, is applied to the integrated usage data to infer the latent semantic factors as well as generate user session clusters for revealing user access patterns. Experiments have been conducted on a real-world data set to validate the effectiveness of the proposed approach. The results have shown that the presented method is capable of characterizing the latent semantic factors and generating user profiles in terms of weighted page vectors, which may reflect the common access interest exhibited by users within the same session cluster. © 2005, Australian Computer Society, Inc.
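The PLSA-with-EM machinery behind this line of work can be sketched on a session-by-page co-occurrence matrix. This is an illustration only: the factor count, smoothing constant, and toy data are assumptions, not the authors' experimental setup:

```python
import numpy as np

def plsa(counts, k, iters=50, seed=0, eps=1e-12):
    """Fit P(s, p) = sum_z P(z) P(s|z) P(p|z) by EM on an
    n_sessions x n_pages co-occurrence matrix `counts`."""
    rng = np.random.default_rng(seed)
    n_s, n_p = counts.shape
    pz = np.full(k, 1.0 / k)
    ps_z = rng.random((k, n_s)); ps_z /= ps_z.sum(axis=1, keepdims=True)
    pp_z = rng.random((k, n_p)); pp_z /= pp_z.sum(axis=1, keepdims=True)
    for _ in range(iters):
        # E-step: posterior P(z | s, p), shape (k, n_s, n_p)
        joint = pz[:, None, None] * ps_z[:, :, None] * pp_z[:, None, :]
        post = joint / joint.sum(axis=0, keepdims=True)
        # M-step: re-estimate parameters from expected counts
        # (eps keeps the parameters strictly positive)
        weighted = post * counts[None, :, :]
        nz = weighted.sum(axis=(1, 2))
        pz = nz / nz.sum()
        ps_z = (weighted.sum(axis=2) + eps) / (nz[:, None] + eps * n_s)
        pp_z = (weighted.sum(axis=1) + eps) / (nz[:, None] + eps * n_p)
    return pz, ps_z, pp_z

def session_clusters(counts, k, **kw):
    """Assign each session to its dominant latent factor."""
    pz, ps_z, _ = plsa(counts, k, **kw)
    return np.argmax(pz[:, None] * ps_z, axis=0)
```

A user profile in the paper's sense would then be a weighted page vector per factor, read off the fitted `pp_z` rows.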
Ying Yan, Jianguo Zhu, Haiwei Lu, Youguang Guo & Shuhong Wang 2005, 'A PMSM model incorporating structural and saturation saliencies', 2005 International Conference on Electrical Machines and Systems, Proceedings of the Eighth International Conference on Electrical Machines and Systems, IEEE, Nanjing, China, pp. 194-199, Vol. 1.
View/Download from: Publisher's site
View description>>
Sensorless permanent magnet synchronous motor (PMSM) drive systems have become very attractive due to their advantages, such as reduced hardware complexity and hence reduced system cost and increased reliability. In order to accurately determine the rotor position required for correct electronic commutation, various methods have been proposed. Among them, the most versatile make use of the structural and/or magnetic saturation saliencies of the PMSM. This paper presents a non-linear model for PMSMs incorporating these saliencies. The phase inductances of a PMSM are measured and expressed as Fourier series at different rotor positions according to their patterns. The dynamic performance of the PMSM is simulated and compared with that of a model that does not consider saliency, to verify the effectiveness of the proposed model.
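Representing a measured phase inductance as a truncated Fourier series in rotor angle, as the abstract describes, can be sketched as follows. The harmonic orders and coefficient values below are illustrative assumptions, not the paper's measured data:

```python
import math

def phase_inductance(theta, l0=2.0e-3,
                     harmonics=((2, 0.4e-3, 0.0), (4, 0.05e-3, 0.0))):
    """L(theta) = L0 + sum_n Ln * cos(n*theta + phi_n), in henries.

    theta is the electrical rotor angle in radians. The dominant 2nd
    harmonic models the structural (rotor) saliency; higher even
    harmonics can capture magnetic saturation effects."""
    return l0 + sum(ln * math.cos(n * theta + phi)
                    for n, ln, phi in harmonics)
```

In a dynamic simulation this position-dependent inductance (and its derivative with respect to rotor angle) enters the phase voltage and torque equations, which is what distinguishes the salient model from a constant-inductance one.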
Ying, MS 2005, 'A theory of computation based on quantum logic (I)', 2005 IEEE International Conference on Granular Computing, Vols 1 and 2, IEEE International Conference on Granular Computing, IEEE, Beijing, China, pp. 91-91.
View/Download from: Publisher's site
View description>>
Summary form only given. The (meta) logic underlying classical theory of computation is Boolean (two-valued) logic. Quantum logic was proposed by Birkhoff and von Neumann as a logic of quantum mechanics. It is currently understood as a logic whose truth values are taken from an orthomodular lattice. The major difference between Boolean logic and quantum logic is that the latter does not enjoy distributivity in general. The rapid development of quantum computation in recent years stimulates us to establish a theory of computation based on quantum logic. The present paper is the first step toward such a new theory and it focuses on the simplest models of computation, namely finite automata. We introduce the notion of orthomodular lattice-valued (quantum) automaton. The Kleene theorem about equivalence of regular expressions and finite automata is generalized into quantum logic. We also present a pumping lemma for orthomodular lattice-valued regular languages.
Ying-Chang Cheng, Jen-Feng Chung, Chin-Teng Lin & Sheng-Che Hsu 2005, 'Local Motion Estimation Based on Cellular Neural Network Technology for Image Stabilization Processing', 2005 9th International Workshop on Cellular Neural Networks and Their Applications, 2005 9th International Workshop on Cellular Neural Networks and Their Applications, IEEE, pp. 286-289.
View/Download from: Publisher's site
View description>>
This paper presents a novel robust image stabilization (IS) technique to find local motion vectors in captured image sequences. Our technique is based on a Cellular Neural Network (CNN) algorithm, which tracks a small set of features to estimate the motion of the camera. Real-time, parallel analog computing elements are contained in the architecture of the CNN: it is a regular two-dimensional array in which each cell connects locally with its neighborhood. To implement this algorithm on VLSI CNN, an adaptive-minimized threshold method is proposed to quickly extract reliable motion vectors in plain images that lack features or contain large low-contrast areas. The size of each CNN is set to 1/120 of an image. A background evaluation model is also developed to deal with irregular images that contain large moving objects. The experimental results are available online to demonstrate the remarkable performance of the proposed CNN-based motion technique.
Youguang Guo, Jianguo Zhu & Haiwei Lu 2005, 'Design of SMC motors using hybrid optimization techniques and 3D FEA with increasing accuracy', 2005 International Conference on Electrical Machines and Systems, Proceedings of the Eighth International Conference on Electrical Machines and Systems, IEEE, Nanjing, China, pp. 2296-2301, Vol. 3.
View/Download from: Publisher's site
View description>>
This paper presents the design and analysis of a three-phase three-stack permanent magnet claw pole motor with soft magnetic composite (SMC) stator core. 3D finite element analysis (FEA) of magnetic field is performed to accurately calculate key motor parameters and performance. Combined optimization techniques and 3D FEA with increasing accuracy are applied to effectively reduce the computational time. The designed motor has been fabricated and tested. The theoretical calculations are validated by the experimental results on the prototype.
YouGuang Guo, Jian Guo Zhu & Haiwei Lu 2005, 'Design and Analysis of a Permanent Magnet Claw Pole/Transverse Flux Motor with SMC Core', 2005 International Conference on Power Electronics and Drives Systems, 2005 International Conference on Power Electronics and Drives Systems, IEEE, pp. 1413-1418.
View/Download from: Publisher's site
View description>>
This paper presents the design and analysis of a claw pole/transverse flux motor (CPTFM) with soft magnetic composite (SMC) core and permanent magnet flux-concentrating rotor. Three-dimensional magnetic field finite element analysis is conducted to accurately calculate key motor parameters such as winding flux, back electromotive force, winding inductance, and core loss. Equivalent electric circuit is derived under optimum brushless DC control condition for motor performance prediction, and computer search techniques are applied for design optimization. All these computations and analyses have been implemented in a commercial software ANSYS for development of the SMC CPTFM prototype.
Yu, S & Zhou, W 2005, 'A Mesh Based Anycast Routing Protocol for Ad Hoc Networks', Parallel and Distributed Processing and Applications, 3rd International Conference on Parallel and Distributed Processing and Applications, Springer Berlin Heidelberg, Nanjing, China, pp. 927-932.
View/Download from: Publisher's site
Yu, S & Zhou, W 2005, 'An Efficient Reliable Architecture for Application Layer Anycast Service', Distributed and Parallel Computing, 6th International Conference on Algorithms and Architectures for Parallel Processing, Springer Berlin Heidelberg, Melbourne, Australia, pp. 376-385.
View/Download from: Publisher's site
Zhang, Y, Xu, G & Zhou, X 2005, 'A Latent Usage Approach for Clustering Web Transaction and Building User Profile', Lecture Notes in Computer Science, First International Conference, ADMA 2005, Springer Berlin Heidelberg, Wuhan, China, pp. 31-42.
View/Download from: Publisher's site