Beydoun, G, Debenham, J & Hoffmann, A 2005, 'Using messaging structure to evolve agents roles in electronic markets', Intelligent Agents and Multi-Agent Systems, vol. 3371, pp. 18-28.
View description>>
Exogenous dynamics play a central role in survival and evolution of institutions. In this paper, we develop an approach to automate part of this evolution process for electronic market places which bring together many online buyers and suppliers. In part...
Beydoun, G, Hoffmann, AG, Fernández-Breis, JT, Martínez-Béjar, R, Valencia-García, R & Aurum, A 2005, 'Cooperative Modelling Evaluated', International Journal of Cooperative Information Systems, vol. 14, no. 1, pp. 45-71.
View/Download from: Publisher's site
Bremner, MJ, Bacon, D & Nielsen, MA 2005, 'Simulating Hamiltonian dynamics using many-qudit Hamiltonians and local unitary control', Physical Review A, vol. 71, no. 5.
View/Download from: Publisher's site
Cetindamar, D 2005, 'Policy issues for Turkish entrepreneurs', Int. J. of Entrepreneurship and Innovation Management, vol. 5, no. 3/4, pp. 187-205.
View/Download from: Publisher's site
View description>>
While it is becoming clear that there is a positive relationship between entrepreneurship and economic development, the topic of entrepreneurship in developing countries has been neglected in the literature. This paper assesses the problems and expectations of entrepreneurs in Turkey. Its main findings are as follows: Turkey underutilises youth and women entrepreneurial resources; there exists a large informal economy that tends to support self-employment rather than entrepreneurship per se; entrepreneurs do not have the kinds of ties with organisations that might be helpful when they are first starting out; entrepreneurs see as their main problems bureaucracy and unstable state policies. Based on these findings, the paper concludes with a policy discussion regarding the development of entrepreneurship in Turkey.
Cetindamar, D, Çatay, B & Serdar Basmaci, O 2005, 'Competition through collaboration: insights from an initiative in the Turkish textile supply chain', Supply Chain Management: An International Journal, vol. 10, no. 4, pp. 238-240.
View/Download from: Publisher's site
View description>>
Purpose: To gain an understanding of the benefits, bridges, and barriers associated with supply chain collaboration. Design/methodology/approach: Insights from extensive field research of a successful collaboration example in the Turkish dyeing and finishing industry. Findings: The competition among firms is increasingly shifting from company vs company to supply chain vs supply chain. The insights obtained from the collaborative model in this textile supply chain provide a good understanding of the benefits, bridges, and barriers associated with supply chain collaboration. Benefits can be grouped as customer‐oriented benefits, productivity benefits, and innovation related benefits. Factors supporting collaboration are observed as trust, common goals for cooperation, and existence of cooperation mechanisms, while barriers are related to three factors: lack of trust, risk‐benefit evaluation, and lack of common goals for cooperation. Research limitations/implications: Findings are based on interviews and questionnaires conducted with the managers of 3T, 30 dyeing and finishing firms (ten are partners) and six technology‐supplying partner firms, from various regions in Turkey. Practical implications: Highlights the importance of trust and collaboration mechanisms in managing collaborations. As the case of 3T in the dyeing and finishing industry shows, collaborations might significantly contribute to the competitiveness of textile firms. Originality/value: ...
Lin, C-T, Wu, R-C, Liang, S-F, Chao, W-H, Chen, Y-J & Jung, T-P 2005, 'EEG-based drowsiness estimation for safety driving using independent component analysis', IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 52, no. 12, pp. 2726-2738.
View/Download from: Publisher's site
View description>>
Preventing accidents caused by drowsiness has become a major focus of active safety driving in recent years. It requires an optimal technique to continuously detect drivers' cognitive state related to abilities in perception, recognition, and vehicle control in (near-) real-time. The major challenges in developing such a system include: 1) the lack of a significant index for detecting drowsiness and 2) complicated and pervasive noise interferences in a realistic and dynamic driving environment. In this paper, we develop a drowsiness-estimation system based on electroencephalogram (EEG) by combining independent component analysis (ICA), power-spectrum analysis, correlation evaluations, and a linear regression model to estimate a driver's cognitive state when he/she drives a car in a virtual reality (VR)-based dynamic simulator. The driving error is defined as deviations between the center of the vehicle and the center of the cruising lane in the lane-keeping driving task. Experimental results demonstrate the feasibility of quantitatively estimating drowsiness level using ICA-based multistream EEG spectra. The proposed ICA-based method applied to the power spectrum of ICA components can successfully (1) remove most EEG artifacts, (2) suggest an optimal montage to place EEG electrodes, and (3) estimate the driver's drowsiness fluctuation indexed by the driving performance measure. Finally, we present a benchmark study in which the accuracy of ICA-component-based alertness estimates compares favorably to scalp-EEG-based estimates. © 2005 IEEE.
Lin, C-T, Cheng, W-C & Liang, S-F 2005, 'An on-line ICA-mixture-model-based self-constructing fuzzy neural network', IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 52, no. 1, pp. 207-221.
View/Download from: Publisher's site
Song, C & Chai, T 2005, 'Comment on 'Discrete-time optimal fuzzy controller design: global concept approach'', IEEE Transactions on Fuzzy Systems, vol. 13, no. 2, pp. 285-286.
View/Download from: Publisher's site
de Kort, Y, IJsselsteijn, W, Midden, C, Eggen, B & van den Hoven, E 2005, 'Persuasive Gerontechnology', Gerontechnology, vol. 4, no. 3, pp. 123-127.
View description>>
Gerontechnology is a domain that originated a few decades ago and has been developing steadily ever since. Its research focuses on a broad set of technologies to serve the ageing society. The present paper aims to connect this domain to a new but promising technology domain that holds great potential for the older population: persuasive technology.
de Raadt, M, Hamilton, M, Lister, RF, Tutty, J, Baker, B, Box, I, Cutts, QI, Fincher, S, Hamer, J, Haden, P, Petre, M, Robins, A, Sutton, K & Tolhurst, D 2005, 'Approaches to learning in computer programming students and their effect on success', Research and Development in Higher Education Series, vol. 28, pp. 407-414.
Devitt, SJ, Cole, JH & Hollenberg, LCL 2005, 'Scheme for direct measurement of a general two-qubit Hamiltonian', Physical Review A, vol. 73, no. 5, p. 052317.
View/Download from: Publisher's site
View description>>
The construction of two-qubit gates appropriate for universal quantum computation is of enormous importance to quantum information processing. Building such gates is dependent on accurate knowledge of the interaction dynamics between two qubit systems. This letter will present a systematic method for reconstructing the full two-qubit interaction Hamiltonian through experimental measures of concurrence. This not only gives a convenient method for constructing two qubit quantum gates, but can also be used to experimentally determine various Hamiltonian parameters in physical systems. We show explicitly how this method can be employed to determine the first and second order spin-orbit corrections to the exchange coupling in quantum dots.
Devitt, SJ, Greentree, AD & Hollenberg, LCL 2005, 'Information free quantum bus for generating stabiliser states', Quantum Information Processing, vol. 6, no. 4, pp. 229-242.
View/Download from: Publisher's site
View description>>
Efficient generation of spatially delocalised entangled states is at the heart of quantum information science. Generally flying qubits are proposed for long range entangling interactions; however, here we introduce a bus-mediated alternative for this task. Our scheme permits efficient and flexible generation of deterministic two-qubit operator measurements and has links to the important concepts of mode-entanglement and repeat-until-success protocols. Importantly, unlike flying qubit protocols, our bus particle never contains information about the individual quantum states of the particles, hence is information-free.
Dovey, K & Singhota, J 2005, 'Learning and knowing in teams', Development and Learning in Organizations: An International Journal, vol. 19, no. 3, pp. 18-20.
View/Download from: Publisher's site
View description>>
Purpose: To explore the collective means through which professional sports teams learn and generate new knowledge forms in order to remain competitive in challenging global arenas, and to examine the applicability of these means to business organizations. Design/methodology/approach: The objectives were achieved by drawing on the business and sporting experience of two executive coaches who have access to current elite‐level sports coaches. Through unstructured interviews with sports coaches and business executives over a period of years, the research question of collective learning in sports teams has been explored and its relevance to business contexts, analyzed. Findings: Using social capital theory as an analytical lens, the research shows that organizational form is a critical determinant of the effectiveness of collective learning. This is the main reason why business teams are unable to emulate the successful learning that occurs in elite‐level sports teams. The research shows that the hierarchical structure of most business organizations constrains the development of the social capital necessary for sustained learning and knowledge construction. Practical implications: The primary implication of the research findings is that business leaders need to view their role as that of creating and managing a social environment in which mission‐pertinent learning and knowledge construction activities are nurtured. In practice, it means that the nature of business leadership and, in particular, power management practices in business organizations needs to be questioned and re‐conceptualized. Originality/value: Using ...
Dovey, K & White, R 2005, 'Learning about learning in knowledge‐intense organizations', The Learning Organization, vol. 12, no. 3, pp. 246-260.
View/Download from: Publisher's site
View description>>
Purpose: This paper describes and analyses an attempt to engage in transformational learning, oriented to the development of a culture of innovation, at a medium‐size software development organization in Australia. Design/methodology/approach: An action research methodology was used whereby continuous cycles of strategic social learning were collectively theorized, implemented, evaluated and renewed. Findings: The most important finding of this study is that of the influence of power relations and communication practices upon learning‐for‐innovation in organizations, and the need for the mediation of this influence through the creation of an organizational role that we have entitled an "external critic". The case also shows the central importance of the relational dimension of social capital generation to learning and the sensitivity of this dimension to power relations. Research limitations/implications: The research provides a rich analysis of one company's attempt to learn how to build and sustain a culture of innovation but, as with all case study research, the findings cannot be reliably generalized to other companies. Similarly, the case generates grounded theory that needs to be tested in other organizational contexts. Practical implications: The case raises the issue of power management in organizations and its relationship to social learning practices. In particular, it argues for the establishment of a "negotiated order" in organizations (through a mission, vision and core values that are collectively and meani...
Dovey, KA 2005, 'Leadership Education in the Era of Disruption: What Can Business Schools Offer?', International Journal of Leadership Education, vol. 1, no. 1, pp. 179-191.
Gervasi, V & Zowghi, D 2005, 'Reasoning about inconsistencies in natural language requirements', ACM Transactions on Software Engineering and Methodology, vol. 14, no. 3, pp. 277-330.
View/Download from: Publisher's site
View description>>
The use of logic in identifying and analyzing inconsistency in requirements from multiple stakeholders has been found to be effective in a number of studies. Nonmonotonic logic is a theoretically well-founded formalism that is especially suited for suppo...
Gwillim, D, Dovey, K & Wieder, B 2005, 'The politics of post-implementation reviews', Information Systems Journal, vol. 15, no. 4, pp. 307-319.
View/Download from: Publisher's site
View description>>
The post-implementation review (PIR) literature emphasizes the benefits of ex post evaluations of information technology (IT) projects. However, empirical studies of actual practice show that few organizations undertake any substantive form of ex post ev...
Hazzan, O, Lister, R, Impagliazzo, J & Schocken, S 2005, 'Using history of computing to address problems and opportunities in computer science education', Proceedings of the Thirty-Sixth SIGCSE Technical Symposium on Computer Science Education, SIGCSE 2005, vol. 37, no. 1, pp. 126-127.
View description>>
Like nations and peoples, professions have histories too. Similar reasons for teaching the history of nations and peoples may explain the importance of teaching prospective professionals the history of their profession. Indeed, much of K-16 education revolves around the teaching of history. Computing is no different in this respect. However, in the computing field, which often lacks attention to the societal impact of its products or an appreciation of the human side of the field, the inclusion of history of computing courses (or the incorporation of historical perspectives in computer science courses) is rare. In addition, the lack of formal education in computing history and the lack of relevant effective resources do not encourage faculty to incorporate history in their courses. Traditional historians often classify the history of computing as 'recent or contemporary history'. Indeed, the majority of undergraduate students currently in university and college programs were born after the personal computer, and their teachers were educated after the advent of email. Thus, though computers have strongly influenced their lives, they are generally unaware of the antecedents of the machines and tools they use every day. Hence, they usually do not build on the foundations of the field to explain a subject. Equally, myths and fallacies fill the field, including textbooks. The panel illustrates how teachers can integrate the history of computing into traditional computer science education. Open discussion with the audience will follow the panelists' short presentations.
Huo, H, Hui, X, Wang, G, Wang, B & Han, D 2005, 'Document fragmentation for XML streams based on Hole-Filler model', Huazhong Keji Daxue Xuebao (Ziran Kexue Ban)/Journal of Huazhong University of Science and Technology (Natural Science Edition), vol. 33, no. SUPPL., pp. 249-252.
View description>>
A document fragmentation policy was presented by taking advantage of document object model (DOM) for XML, and a corresponding fragmentation algorithm was designed according to the element fan-outs, to solve the problem of document fragmentation for XML streams based on Hole-Filler model. A tag-based document fragmentation algorithm built on DOM-based algorithm was then proposed to determine document filler points by fragmenting tag structure, so as to reduce the comparisons between element fan-outs and threshold. Finally, an optimized fragmentation policy was presented to avoid trivial pieces by binding XML sub-trees according to the ratio of element fan-outs and threshold. Our performance study shows that the document fragmentation algorithms perform well on execution time, granularity and other metrics.
Kuhnert, M, Voinov, A & Seppelt, R 2005, 'Comparing Raster Map Comparison Algorithms for Spatial Modeling and Analysis', Photogrammetric Engineering & Remote Sensing, vol. 71, no. 8, pp. 975-984.
View/Download from: Publisher's site
Liang, SF, Lin, CT, Wu, RC, Chen, YC, Huang, TY & Jung, TP 2005, 'Monitoring Driver's Alertness Based on the Driving Performance Estimation and the EEG Power Spectrum Analysis', 2005 IEEE Engineering in Medicine and Biology 27th Annual Conference, pp. 5738-5741.
View/Download from: Publisher's site
Lin, C-J & Chen, C-H 2005, 'Identification and prediction using recurrent compensatory neuro-fuzzy systems', Fuzzy Sets and Systems, vol. 150, no. 2, pp. 307-330.
View/Download from: Publisher's site
View description>>
In this paper, a recurrent compensatory neuro-fuzzy system (RCNFS) for identification and prediction is proposed. The compensatory-based fuzzy method uses the adaptive fuzzy operations of neuro-fuzzy systems to make fuzzy logic systems more adaptive and effective. A recurrent network is embedded in the RCNFS by adding feedback connections in the second layer, where the feedback units act as memory elements. In this paper, the RCNFS model is proved to be a universal approximator. Also, an online learning algorithm is proposed which can automatically construct the RCNFS. There are no rules initially in the RCNFS. They are created and adapted as online learning proceeds through simultaneous structure and parameter learning. Structure learning is based on the degree measure and parameter learning is based on the ordered derivative algorithm. Finally, the RCNFS is used in several simulations. The simulation results of the dynamic system model have shown that (1) the RCNFS model converges quickly; (2) the RCNFS model requires a small number of tuning parameters; (3) the RCNFS model can solve temporal problems and approximate a dynamic system. © 2004 Elsevier B.V. All rights reserved.
Lin, C-T, Wu, R-C, Jung, T-P, Liang, S-F & Huang, T-Y 2005, 'Estimating Driving Performance Based on EEG Spectrum Analysis', EURASIP Journal on Advances in Signal Processing, vol. 2005, no. 19, pp. 3165-3174.
View/Download from: Publisher's site
Lin, C-T, Yeh, C-M, Chung, J-F, Liang, S-F & Pu, H-C 2005, 'Support-Vector-Based Fuzzy Neural Networks', International Journal of Computational Intelligence Research, vol. 1, no. 2.
View/Download from: Publisher's site
Lister, R 2005, 'One Small Step Toward a Culture of Peer Review and Multi-Institutional Sharing of Educational Resources: A Multiple Choice Exam for First Semester Programming Students', Conferences in Research and Practice in Information Technology Series, vol. 42, no. 5, pp. 155-164.
View description>>
This paper presents a multiple choice question exam, used to test students completing their first semester of programming. Assumptions in the design of the exam are identified. A detailed analysis is performed on how students performed on the questions. The intent behind this exercise is to begin a community process of identifying the criteria that define an effective multiple-choice exam for testing novice programmers. The long term aim is to develop consensus on peer review criteria for such exams. This consensus is seen as a necessary precondition for any future public domain library of such multiple-choice questions. © 2005, Australian Computer Society, Inc.
Lister, RF 2005, 'Mixed methods: positivists are from Mars, constructivists are from Venus', ACM SIGCSE Bulletin Inroads, vol. 37, no. 4, pp. 18-19.
Lu, H & Song, Y 2005, 'Brief Introduction to the Development of Electric Power Industry in UK', Modern Electric Power, vol. 22, no. 2, pp. 91-94.
Miliszewska, I & Horwood, J 2005, 'An Architecture for a Federated Education System', International Journal of Distance Education Technologies, vol. 3, no. 1, pp. 97-106.
View/Download from: Publisher's site
View description>>
This article presents the development and software architecture of a conceptual and operational collaborative distance education model. Promotion of educational expertise, especially through expansion of specialization, is of increasing importance. Within the educational sphere, big universities tend to dominate the market at the expense of smaller ones. For small universities, the key to their survival could also be specialization within disciplines, coupled with collaboration among universities. Intra-discipline specialization would promote development of quality services, and interuniversity collaboration would enable a wide offering of these services. The proposed paradigm would require the development of a suitable model to support it. The model proposed in this article is a federation of independent universities that are loosely coupled to facilitate collaboration and the sharing and exchanging of information. The federated model, supported by agent-based communication over the Internet, can operate across geographical, cultural and organizational boundaries while promoting integration within those boundaries.
Niazi, M, Wilson, D & Zowghi, D 2005, 'A framework for assisting the design of effective software process improvement implementation strategies', Journal of Systems and Software, vol. 78, no. 2, pp. 204-222.
View/Download from: Publisher's site
View description>>
A number of advances have been made in the development of software process improvement (SPI) standards and models, e.g. Capability Maturity Model (CMM), more recently CMMI, and ISO's SPICE. However, these advances have not been matched by equal advances...
Niazi, M, Wilson, D & Zowghi, D 2005, 'A maturity model for the implementation of software process improvement: an empirical study', Journal of Systems and Software, vol. 74, no. 2, pp. 155-172.
View/Download from: Publisher's site
Olafsen, RN & Cetindamar, D 2005, 'E‐learning in a competitive firm setting', Innovations in Education and Teaching International, vol. 42, no. 4, pp. 325-335.
View/Download from: Publisher's site
Sanders, K, Fincher, S, Bouvier, D, Lewandowski, G, Morrison, B, Murphy, L, Petre, M, Richards, B, Tenenberg, J, Thomas, L, Anderson, R, Anderson, R, Fitzgerald, S, Gutschow, A, Haller, S, Lister, R, McCauley, R, McTaggart, J, Prasad, C, Scott, T, Shinners-Kennedy, D, Westbrook, S & Zander, C 2005, 'A multi-institutional, multinational study of programming concepts using card sort data', Expert Systems, vol. 22, no. 3, pp. 121-128.
View/Download from: Publisher's site
View description>>
This paper presents a case study of the use of a repeated single-criterion card sort with an unusually large, diverse participant group. The study, whose goal was to elicit novice programmers' knowledge of programming concepts, involved over 20 researche...
Shi, C, Lu, J & Zhang, G 2005, 'An extended Kth-best approach for linear bilevel programming', Applied Mathematics and Computation, vol. 164, no. 3, pp. 843-855.
View/Download from: Publisher's site
View description>>
The Kth-best approach is one of the three popular and workable approaches for linear bilevel programming. However, it cannot deal well with a linear bilevel programming problem when the constraint functions at the upper level are of arbitrary linear form...
Shi, C, Zhang, G & Lu, J 2005, 'On the definition of linear bilevel programming solution', Applied Mathematics and Computation, vol. 160, no. 1, pp. 169-176.
View/Download from: Publisher's site
View description>>
Linear bilevel programming theory has been studied for many years by a number of researchers from different aspects, yet it still remains to some extent unsatisfactory and incomplete. The main challenge is how to solve a linear bilevel programming problem when the upper-level constraint functions are of arbitrary linear form. This paper proposes a definition for the linear bilevel programming solution. Performance comparisons have demonstrated that the new model can solve a wider class of problems than current capabilities permit.
Shi, CG, Lu, J & Zhang, GQ 2005, 'An extended Kuhn-Tucker approach for linear bilevel programming', Applied Mathematics and Computation, vol. 162, no. 1, pp. 51-63.
View/Download from: Publisher's site
View description>>
The Kuhn-Tucker approach has been applied with remarkable success in linear bilevel programming (BLP). However, it remains to some extent unsatisfactory and incomplete. One principal challenge is that it cannot handle well a linear BLP problem when the co...
Shi, CG, Zhang, GQ & Lu, J 2005, 'The Kth-best approach for linear bilevel multi-follower programming', Journal of Global Optimization, vol. 33, no. 4, pp. 563-578.
View/Download from: Publisher's site
View description>>
The majority of research on bilevel programming has centered on the linear version of the problem in which only one leader and one follower are involved. This paper addresses linear bilevel multi-follower programming (BLMFP) problems in which there is no...
Wu, FJ, Lu, J & Zhang, GQ 2005, 'Development and implementation on a fuzzy multiple objective decision support system', Knowledge-Based Intelligent Information and Engineering Systems, Pt 1, Proceedings, vol. 3681, pp. 261-267.
View description>>
A fuzzy-goal optimization-based method has been developed for solving fuzzy multiple objective linear programming (FMOLP) problems where fuzzy parameters in both objective functions and constraints and fuzzy goals of objectives can be in any form of membership function. Based on the method, a fuzzy multiple objective decision support system (FMODSS) is developed. This paper focuses on the development and use of FMODSS in detail. An example is presented for demonstrating how to solve an FMOLP problem by using the FMODSS interactively. © Springer-Verlag Berlin Heidelberg 2005.
Yusop, N, Lowe, DB & Zowghi, D 2005, 'Impacts of Web Systems on Their Domain', Journal of Web Engineering (Online), vol. 4, pp. 313-338.
Zowghi, D & Gervasi, V 2005, 'Automated tools for requirements engineering', Computer Systems Science and Engineering, vol. 20, no. 1, pp. 3-4.
Beydoun, G, Gonzalez-Perez, C, Low, G & Henderson-Sellers, B 2005, 'Synthesis of a generic MAS metamodel', Proceedings of the fourth international workshop on Software engineering for large-scale multi-agent systems - SELMAS '05, the fourth international workshop, ACM Press, St. Louis, USA, pp. 27-31.
View/Download from: Publisher's site
View description>>
Method engineering, which focuses on project-specific methodology construction from existing method fragments, is an appealing approach to organize, appropriately access and effectively harness the software engineering knowledge of MAS methodologies. With the objective of applying method engineering for developing an MAS, in this paper we introduce a generic metamodel to serve as a representational infrastructure to unify existing MAS methodologies into a single specification. Our metamodel does not focus on any class of MAS, nor does it impose any restrictions on the format of the system requirements; rather, our metamodel is an abstraction of how any MAS is structured and behaves both at design time and run-time.
Beydoun, G, Gonzalez-Perez, C, Low, G & Henderson-Sellers, B 2005, 'Towards Method Engineering for Multi-Agent Systems: A preliminary validation of a Generic MAS Metamodel', SEKE, International Conference on Software Engineering and Knowledge Engineering, Knowledge Systems Graduate School, Taipei, Taiwan, pp. 51-56.
View description>>
New Multi-Agent System (MAS) software development methodologies are published at an increasing pace. This is in part due to the accepted premise that no single methodology can be suitable for all MAS software projects. Method engineering, which focuses on project-specific methodology construction from existing method fragments, is an appealing approach to organize, appropriately access and effectively harness the software engineering knowledge of methodologies. Towards this, in this paper we present and validate a generic product-focused metamodel to serve as a representational infrastructure to unify existing methodologies into a single specification. Our metamodel does not focus on any class of MAS, nor does it impose any restrictions on the format of system requirements; rather, our metamodel is an abstraction of how any MAS is structured and behaves both at design time and run-time. We analyze two well-known existing MAS metamodels. We sketch how they can be seen as subtypes of our generic metamodel. This constitutes early evidence to support the use of our metamodel towards the construction of situated MAS methodologies.
Beydoun, G, Tran, N, Low, G & Henderson-Sellers, B 2005, 'Preliminary basis for an ontology-based methodological approach for multi-agent systems', Perspectives in Conceptual Modeling, International Conference on Conceptual Modelling, Springer, Klagenfurt, Austria, pp. 131-140.
View/Download from: Publisher's site
View description>>
The influence of ontologies in Knowledge Based Systems (KBS) methodologies extends well beyond the initial analysis phase, leading in the 1990s to domain-independent KBS methodologies. In this paper, we reflect on those lessons and on the roles of ontologies in KBS development. We analyse and identify which of those roles can be transferred towards an ontology-based MAS development methodology. We identify ontology-related inter-dependencies between the analysis and design phases. We produce a set of six recommendations towards creating a domain-independent MAS methodology that incorporates ontologies beyond its initial analysis phase. We identify its essential features and sketch the characteristic tasks within both its analysis and design phases.
Boyd, S, Zowghi, D & Farroukh, A 2005, 'Measuring the expressiveness of a constrained natural language: An empirical study', 13th IEEE International Conference on Requirements Engineering, Proceedings, IEEE International Requirements Engineering Conference, IEEE, Paris, France, pp. 339-349.
View/Download from: Publisher's site
View description>>
It has been suggested that constraining a natural language (NL) reduces the degree of ambiguity of requirement specifications written in that language. There is also a tendency to assume that an inescapable side effect of constraining a natural language is a subsequent reduction in its expressiveness. The primary objective of this paper is to describe a technique that we have developed for empirically measuring the expressiveness of a Constrained Natural Language (CNL) when used to specify the requirements in a particular application domain. Our simple yet practical and repeatable technique elucidates the individual contribution that each lexical entity of the CNL can make on the overall expressiveness of the CNL. This technique is particularly useful for designing new CNLs, as well as situations where tailoring or streamlining existing CNLs for particular application domains is needed.
Chang, E, Dillon, TS & Hussain, FK 2005, 'Trust and reputation relationships in service-oriented environments', Third International Conference on Information Technology and Applications, Vol 1, Proceedings, International Conference on Information Technology and Applications, IEEE Computer Society, Sydney, Australia, pp. 4-14.
View/Download from: Publisher's site
View description>>
Trust and trustworthiness play a major role in conducting business on the Internet in service-oriented environments. In defining trust for service-oriented environments, one needs to capture the notions of service level, service agreement, context and timeslots. The same applies to reputation, which is the opinion of third-party agents and is used in determining trust and trustworthiness. Because of the complexity of the issues, and the fact that trust and reputation are essentially concerned with relationships, it is important to clearly define the notion of trust relationships and the notion of reputation relationships. In this paper, therefore, we clarify these definitions and introduce a graphical notation for representing these relationships.
Chang, E, Hussain, FK & Dillon, T 2005, 'CCCI metrics for the measurement of quality of e-service', 2005 IEEE/WIC/ACM International Conference on Intelligent Agent Technology, Proceedings, IEEE/WIC/ACM International Conference on Intelligent Agent Technology, IEEE CS Press, Compiègne, France, pp. 603-610.
View/Download from: Publisher's site
View description>>
The growing development of web-based trust and reputation systems in the 21st century will have a powerful social and economic impact on all business entities, and will make transparent quality assessment and customer assurance realities in distributed web-based service-oriented environments. The growth of web-based trust and reputation systems will be the foundation for web intelligence in the future. Trust and reputation systems help capture business intelligence through establishing customer relationships, learning consumer behaviour, capturing market reaction to products and services, disseminating customer feedback, buyers' opinions and end-user recommendations, and revealing dishonest services, unfair trading, biased assessment, discriminatory actions, fraudulent behaviours and untrue advertising. The continuing development of these technologies will help improve professional business behaviour and the sales and reputation of sellers, providers, products and services. In this paper, we present a new methodology for measuring trustworthiness, known as CCCI (Correlation, Commitment, Clarity, and Influence), for use in trust and reputation systems. The methodology is based on determining the correlation between the originally committed services and the services actually delivered by a Trusted Agent in a business interaction over service-oriented networks, in order to determine the trustworthiness of the Trusted Agent.
Chang, E, Hussain, FK & Dillon, T 2005, 'Reputation ontology for reputation systems', ON THE MOVE TO MEANINGFUL INTERNET SYSTEMS 2005: OTM 2005 WORKSHOPS, PROCEEDINGS, The International Conference on Semantic Web and Web Services, Springer, New York, USA, pp. 957-966.
View/Download from: Publisher's site
View description>>
The growing development of web-based reputation systems in the 21st century will have a powerful social and economic impact on both business entities and individual customers, because it makes quality assessment of products and services transparent and achieves customer assurance in distributed web-based reputation systems. Web-based reputation systems will be the foundation for web intelligence in the future. Trust and reputation help capture business intelligence through establishing customer trust relationships, learning consumer behavior, capturing market reaction to products and services, and disseminating customer feedback, buyers' opinions and end-user recommendations. They also reveal dishonest services, unfair trading, biased assessment, discriminatory actions, fraudulent behaviors and untrue advertising. The continuing development of these technologies will help improve professional business behavior and the sales and reputation of sellers, providers, products and services. Given the importance of reputation, in this paper we propose an ontology for reputation. In the business world we can consider the reputation of a product, the reputation of a service or the reputation of an agent. We propose an ontology for these entities that helps us unravel and conceptualize the components of the reputation of each entity.
Chang, E, Hussain, FK & Dillon, TS 2005, 'Trustworthiness measure for e-service', PST 2005 - 3rd Annual Conference on Privacy, Security and Trust, Conference Proceedings, Annual Conference on Privacy, Security and Trust, University of New Brunswick, St Andrews, Canada, pp. 1-14.
View description>>
Traditionally, transactions were carried out face-to-face; now they are carried out over the Internet. The infrastructure for this business and information exchange could be client-server, peer-to-peer or mobile network environments, and users on the network typically carry out interactions in one of three forms:
• Anonymous (no names are identified in the communication)
• Pseudo-anonymous (nicknames are used in the communication)
• Non-anonymous (real names are used in the communication)
Incapability or fraudulent practice can occur when the seller, business provider or buyer (the agents on the network) does not behave in the mutually agreed or understood manner, especially if published terms and conditions exist. This paper evaluates currently existing trustworthiness systems, points out that no standardized measurement system for Quality of Service currently exists, and outlines the methodology that we have developed for this purpose.
Chang, E, Thomson, P, Dillon, T & Hussain, F 2005, 'The Fuzzy and Dynamic Nature of Trust', Lecture Notes in Computer Science: Trust, Privacy, and Security in Digital Business, International Conference on Trust, Privacy and Security in Digital Business, Springer Berlin Heidelberg, Copenhagen, Denmark, pp. 161-174.
View/Download from: Publisher's site
View description>>
Trust is one of the most fuzzy, dynamic and complex concepts in both social and business relationships. The difficulty in measuring Trust and predicting Trustworthiness in service-oriented network environments leads to many questions. These include issue
Chang, EJ, Hussain, FK & Dillon, TS 2005, 'Fuzzy nature of trust and dynamic trust modeling in service oriented environments', Proceedings of the 2005 Workshop on Secure Web Services, CCS05: 12th ACM Conference on Computer and Communications Security 2005, ACM, Fairfax, USA, pp. 1-10.
View/Download from: Publisher's site
View description>>
In this paper, we propose and describe six characteristics of trust. Based on these six characteristics, we determine why trust is fuzzy. The term 'fuzzy' is used in this paper not in the sense of the precise definitions given in the Fuzzy Systems literature, but to indicate a certain vagueness, complexity or ill-definition, and a qualitative characterization rather than a quantitative representation. We then determine why, due to these six characteristics, trust is dynamic. Finally, we propose a modeling language tool to model the fuzzy and dynamic nature of trust.
Cornelis, C, Guo, X, Lu, J & Zhang, G 2005, 'A fuzzy relational approach to event recommendation', Proceedings of the 2nd Indian International Conference on Artificial Intelligence, IICAI 2005, Indian International Conference on Artificial Intelligence, IICAI, Pune, India, pp. 2231-2242.
View description>>
Most existing recommender systems employ collaborative filtering (CF) techniques in making projections about which items an e-service user is likely to be interested in, i.e. they identify correlations between users and recommend items which similar users have liked in the past. Traditional CF techniques, however, have difficulties when confronted with sparse rating data, and cannot cope at all with time-specific items, like events, which typically receive their ratings only after they have finished. Content-based (CB) algorithms, which consider the internal structure of items and recommend items similar to those a user liked in the past, can partly make up for that drawback, but the collaborative feature is totally lost on them. In this paper, modelling user and item similarities as fuzzy relations, which allow us to flexibly reflect the graded/uncertain information in the domain, we develop a novel, hybrid CF-CB approach whose rationale is concisely summed up as 'recommending future items if they are similar to past ones that similar users have liked', and which surpasses related work in the same spirit. Copyright © IICAI 2005.
Coulin, C, Sahraoui, AEK & Zowghi, D 2005, 'Towards a collaborative and combinational approach to requirements elicitation within a systems engineering framework', 18th International Conference on Systems Engineering, Proceedings, International Conference on Systems Engineering, ICSEng, Las Vegas, USA, pp. 456-461.
View/Download from: Publisher's site
View description>>
Despite its critical importance to the process of systems development, requirements elicitation continues to be a major problem in both research and practice. This complex activity, involving many different facets and issues, is often performed badly and subsequently blamed for project failure and poor quality systems. In this paper we present a collaborative and combinational approach to requirements elicitation within a systems engineering framework, proposing the application of current research from other disciplines in areas related to requirements elicitation, such as software engineering and the social sciences, to a general systems engineering context. The work provides both researchers and practitioners with an approach to requirements elicitation for systems engineering that can be applied to real-world projects to improve both the process and its results, thereby increasing the overall chance of successful system development in terms of on-schedule and on-budget delivery, and satisfied customers.
Coulin, CR & Zowghi, D 2005, 'What do experts think about elicitation? - A state of practice survey', Proceedings of the 10th Australian Workshop on Requirements Engineering, Australian Workshop on Requirements Engineering, Deakin University, Melbourne, Australia, pp. 1-10.
Coulin, CR, Zowghi, D & Sahraoui, A 2005, 'A Lightweight Workshop-centric Situational Approach for the Early Stages of Requirements Elicitation in Software Development', Situational Requirements Engineering Processes - SREP 05: The 1st International Workshop, Situational Requirements Engineering Processes, University of Limerick, Paris, France, pp. 136-151.
Devitt, SJ, Fowler, AG & Hollenberg, LCL 2005, 'Investigating the practical implementation of Shor's algorithm', SPIE Proceedings, Smart Materials, Nano-, and Micro-Smart Systems, SPIE, Sydney, Australia, pp. 483-494.
View/Download from: Publisher's site
Dunsire, K, O'Neill, T, Denford, M & Leaney, J 2005, 'The ABACUS architectural approach to computer-based system and enterprise evolution', 12th IEEE International Conference and Workshops on the Engineering of Computer-Based Systems, Proceedings, IEEE International Conference and Workshop on the Engineering of Computer Based Systems, IEEE, Maryland, USA, pp. 62-69.
View/Download from: Publisher's site
View description>>
The enterprise computer-based systems employed by the organisations of today can be extremely complex. Not only do they consist of countless hardware and software products from many varied sources, but they often span continents, piggybacking on public networks. These systems are essential for undertaking business and general operations in the modern environment, and yet the ability of organisations to control their evolution is questionable. The emerging practice of enterprise architecture seeks to control that complexity through the use of a holistic and top-down perspective. However, the toolsets already in use are very much bottom-up by nature. To overcome the limitations of current enterprise architecture practices, the authors propose the use of the ABACUS methodology and toolset. The authors conclude that by using ABACUS to analyse software and enterprise systems, architects can guide the design and evolution of architectures based on quantifiable non-functional requirements. Furthermore, hierarchical 3D visualisation provides a meaningful and intuitive means for conceiving and communicating complex architectures.
Dyson, LE & Underwood, J 2005, 'Indigenous People on the Web', Building Society Through E-Commerce, Collaborative Electronic Commerce Technology and Research, Universidad de Talca, Talca, Chile, pp. 1-11.
Fenech, BJ & Dovey, KA 2005, 'The role of structure in the failure of organisations to learn and transform', Proceedings of the 6th International Conference on Organisational Learning and Knowledge, International Conference on Organisational Learning and Knowledge, University of Trento, Trento, Italy, pp. 58-75.
Fincher, S, Lister, R, Clear, T, Robins, A, Tenenberg, J & Petre, M 2005, 'Multi-institutional, multi-national studies in CSEd Research', Proceedings of the 2005 International Workshop on Computing Education Research - ICER '05, the 2005 International Workshop, ACM Press, Seattle, USA, pp. 111-121.
View/Download from: Publisher's site
View description>>
One indication of the maturation of Computer Science Education as a research-based discipline is the recent emergence of several large-scale studies spanning multiple institutions. This paper examines a 'family' of these multi-institutional, multi-national studies, detailing core elements and points of difference in both study design and the organization of the research team, and highlighting the costs and benefits associated with the different approaches. Copyright 2005 ACM.
Fowler, AG, Devitt, SJ & Hollenberg, LCL 2005, 'Constructing Steane code fault-tolerant gates', SPIE Proceedings, Smart Materials, Nano-, and Micro-Smart Systems, SPIE, Sydney, Australia, pp. 557-568.
View/Download from: Publisher's site
Goyal, ML, Lu, J & Zhang, G 2005, 'Negotiating Multi-Issue e-Market Auction through Fuzzy Attitudes', CIMCA/IAWTIC, International Conference on Computational Intelligence for Modelling, Control and Automation, IEEE Computer Society, Vienna, Austria, pp. 922-927.
View/Download from: Publisher's site
View description>>
Online auctions are one of the most effective ways of negotiating the sale of goods over the Internet. To be successful in open multi-agent environments, agents must be capable of adapting different strategies and tactics to their prevailing circumstances. This paper presents a software test-bed for studying autonomous bidding strategies in simulated auctions for procuring goods. It shows that agents' bidding strategies explore the attitudes and behaviors that help agents manage the dynamic assessment of alternative prices of goods under different scenario conditions.
Guo, X, Zhang, G, Chew, E & Burdon, S 2005, 'A Hybrid Recommendation Approach for One-and-Only Items', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Australasian Joint Conference on Artificial Intelligence, Springer Berlin Heidelberg, Sydney, Australia, pp. 457-466.
View/Download from: Publisher's site
View description>>
Many mechanisms have been developed to deliver only relevant information to web users and prevent information overload. The most popular recent developments in the e-commerce domain are the user-preference-based personalization and recommendation techniques. However, the existing techniques have a major drawback, namely poor accuracy of recommendation on one-and-only items, because most of them do not understand an item's semantic features and attributes. Thus, in this study, we propose a novel Semantic Product Relevance model and its attendant personalized recommendation approach to assist export businesses in selecting the right international trade exhibitions for market promotion. A recommender system, called Smart Trade Exhibition Finder (STEF), is developed to tailor the relevant trade exhibition information to each particular business user. STEF significantly reduces the time, cost and risk faced by exporters in selecting, entering and developing international markets. In particular, the proposed model can be used to overcome the drawback of existing recommendation techniques. © Springer-Verlag Berlin Heidelberg 2005.
Zhang, G-L, Lu, H-Y, Li, G-Y & Zhang, G-Q 2005, 'Dynamic economic load dispatch using hybrid genetic algorithm and the method of fuzzy number ranking', 2005 International Conference on Machine Learning and Cybernetics, Proceedings of 2005 International Conference on Machine Learning and Cybernetics, IEEE, Guangzhou, China, pp. 2472-2477.
View/Download from: Publisher's site
View description>>
This paper proposes a new economic load dispatch model that considers cost coefficients with uncertainties and ramp-rate constraints. The uncertainties are represented by fuzzy numbers, and the model is known as the fuzzy dynamic economic load dispatch (FDELD) model. A novel hybrid genetic algorithm with quasi-simplex techniques is proposed to handle the FDELD problem. The algorithm creates offspring by using genetic operations and quasi-simplex techniques in parallel. The quasi-simplex techniques consider two potential optimal search directions in generating prospective offspring. One is the worst-opposite direction, which is used in conventional simplex techniques; the other is the best-forward direction, a ray from the centroid of the polyhedron whose vertexes are all the points but the best one, towards the best point of the simplex. In addition, in order to retain more fuzzy information, a fuzzy number ranking method is used to optimize the cost function, avoiding the loss of useful information that results from taking the λ-level set. The experimental study shows that FDELD is a more practical model, and that the proposed algorithm and techniques are very efficient in solving the FDELD problem.
Hardy, V, Fung, HC, Xian, GS, Wu, JH, Zhang, XY & Dyson, LE 2005, 'Paper usage management and information technology: An environmental case study at an Australian University', Internet and Information Technology in Modern Organizations: Challenges and Answers - Proceedings of the 5th International Business Information Management Association Conference, IBIMA 2005, International Business Information Management, IBIMA, Cairo, Egypt, pp. 699-705.
View description>>
IT was supposed to lead to the paperless office. However, it has actually caused paper usage levels to increase. There are several social explanations for this increasing trend. Printing technology has become more available, and people have more information to print. In addition, people often prefer printed matter to working off a screen. In this article we present a case study of paper usage at a university. It was found that, despite continual pressures escalating paper use, a range of strategies, including quotas and charges, double-sided printing, reuse of scrap paper and policy changes, had been successful in reducing usage. Motivational factors included cost saving and a commitment to the environment.
Hazzan, O, Impagliazzo, J, Lister, R & Schocken, S 2005, 'Using history of computing to address problems and opportunities', Proceedings of the 36th SIGCSE Technical Symposium on Computer Science Education, SIGCSE05: Technical Symposium on Computer Science Education, ACM.
View/Download from: Publisher's site
Hendriks, M & Dyson, LE 2005, 'Motes: The New Privacy Invaders', 2005 Information Resources Management Association International Conference, International Conference on Information Resources Management, Idea Group Publishing, San Diego, USA, pp. 772-775.
Hinchey, M, Rozenblit, J, Leaney, J & O'Neill, T 2005, 'Proceedings - 12th IEEE International Conference and Workshops on the Engineering of Computer-Based Systems, ECS: Foreword', Proceedings - 12th IEEE International Conference and Workshops on the Engineering of Computer-Based Systems, ECS 2005.
Hintz, T, Piccardi, M & He, X 2005, 'Message from the Chairs', Proceedings - 3rd International Conference on Information Technology and Applications, ICITA 2005, pp. iii-iv.
View/Download from: Publisher's site
Huo, H, Wang, G, Yang, C & Zhou, R 2005, 'Signature-based Filtering Techniques for Structural Joins of XML Data', 21st International Conference on Data Engineering Workshops (ICDEW'05), 21st International Conference on Data Engineering Workshops (ICDEW'05), IEEE.
View/Download from: Publisher's site
Hussain, FK, Chang, E & Dillon, TS 2005, 'Formalizing a grammar for reputation in peer-to-peer communication', MoMM 2005 Proceedings, International Conference on Advances in Mobile Computing and Multimedia, Australian Computer Society, Kuala Lumpur, Malaysia, pp. 81-96.
Hussain, O, Chang, E, Hussain, F, Dillon, T & Soh, B 2005, 'Context Based Riskiness Assessment', TENCON 2005 - 2005 IEEE Region 10 Conference, TENCON 2005 - 2005 IEEE Region 10 Conference, IEEE, Melbourne, Australia, pp. 1-6.
View/Download from: Publisher's site
View description>>
In almost every interaction the trusting peer may fear the likelihood of loss of the resources involved in the transaction. This likelihood of loss of resources is termed the Risk in the transaction. Hence, analyzing the Risk involved in a transaction is important in deciding whether or not to proceed with it. If a trusting peer is unfamiliar with a trusted peer and has not interacted with it before in a specific context, then it will ask for recommendations from other peers in order to determine the trusted peer's Riskiness value or reputation. In this paper we discuss the process of asking for recommendations from other peers in a specific context and assimilating those recommendations according to the criteria of the interaction in order to determine the correct Riskiness value of the trusted peer.
Hussain, O, Chang, E, Hussain, FK, Dillon, TS & Soh, B 2005, 'A Methodology for Determining Riskiness in Peer-to-Peer Communications', Proceedings of the IEEE International Conference on Industrial Informatics (INDIN 05), INDIN, IEEE, Perth, Australia, pp. 421-432.
View description>>
Every activity has some Risk involved in it. Analyzing the Risk involved in a transaction is important in deciding whether or not to proceed with it. Similarly, in Peer-to-Peer communication, analyzing the Risk involved in undertaking a transaction with another peer is important. It would be much easier for the trusting peer to decide whether to proceed with a transaction if it knew the Risk rating the trusted peer deserves. In this paper we develop and propose a methodology which allows the trusting peer to rate the trusted peer in terms of the Risk that it deserves after the transaction is over.
Hussain, O, Chang, E, Soh, B, Hussain, FK & Dillon, TS 2005, 'Factors of Risk Variance in Decentralized Communications', EICAR 2005 Conference Best Paper Proceedings, European Institute for Computer Anti-Virus Research (EICAR) Conference, Computer Associates, Saint Julians, Malta, pp. 129-137.
View description>>
Decentralized transactions are becoming increasingly popular. These transactions resemble the early forms of the Internet and in many ways are regarded as the next generation of the Internet. The result will be that e-commerce transactions will shift to peer-to-peer communications rather than a client-server environment. However, these peer-to-peer communications, or decentralized transactions, suffer from some disadvantages, which include the risk associated with each transaction. This paper focuses on the factors that influence Risk in a decentralized transaction.
Hussain, OK, Chang, E, Hussain, FK, Dillon, TS & Soh, B 2005, 'Risk in Trusted Decentralized Communications', 21st International Conference on Data Engineering Workshops (ICDEW'05), 21st International Conference on Data Engineering Workshops (ICDEW'05), IEEE, Tokyo, Japan, pp. 63-67.
View/Download from: Publisher's site
View description>>
Risk is associated with almost every activity undertaken in daily life. Risk is associated with Trust, Security and Privacy. Risk is associated with transactions, businesses, information systems, environments, networks, partnerships, etc. Generally speaking, risk signifies the likelihood of financial losses, human casualties, business destruction and environmental damage. A risk indicator gives early warning to the parties involved and helps avoid disasters. Until now, risk has been discussed extensively in the areas of investment, finance, health, environment, daily life activities and engineering. However, there has been no systematic study of risk in decentralised communication, which involves e-business, computer networks and service-oriented environments. In this paper, we define the risk associated with trusted communication in e-business and e-transactions, provide risk indicator calculations, and outline basic application areas.
Hussain, OK, Chang, E, Hussain, FK, Dillon, TS & Soh, B 2005, 'A methodology for determining riskiness in peer-to-peer communications', 2005 3rd IEEE International Conference on Industrial Informatics (INDIN), pp. 655-666.
View/Download from: Publisher's site
View description>>
Every activity has some Risk involved in it. Analyzing the Risk involved in a transaction is important in deciding whether or not to proceed with it. Similarly, in Peer-to-Peer communication, analyzing the Risk involved in undertaking a transaction with another peer is important. It would be much easier for the trusting peer to decide whether to proceed with a transaction if it knew the Risk rating the trusted peer deserves. In this paper we develop and propose a methodology which allows the trusting peer to rate the trusted peer in terms of the Risk that it deserves after the transaction is over. © 2005 IEEE.
Johnston, A, Amitani, S & Edmonds, E 2005, 'Amplifying reflective thinking in musical performance', Proceedings of the 5th Conference on Creativity & Cognition - C&C '05, the 5th conference, ACM Press, London, UK, pp. 166-175.
View/Download from: Publisher's site
View description>>
In this paper we report on the development of tools that encourage both a creative and reflective approach to music-making and musical skill development. A theoretical approach to musical skill development is outlined and previous work in the area of music visualisation is discussed. In addition the characterisation of music performance as a type of design problem is discussed and the implications of this position for the design of tools for musicians are outlined. Prototype tools, the design of which is informed by the theories and previous work, are described and some preliminary evaluation of their effectiveness is discussed. Future directions are outlined. Copyright 2005 ACM.
Johnston, A, Marks, B & Edmonds, E 2005, 'An artistic approach to designing visualisations to aid instrumental music learning', IADIS International Conference on Cognition and Exploratory Learning in Digital Age, CELDA 2005, Cognition and Exploratory Learning in Digital Age, IADIS Press, Porto, Portugal, pp. 175-182.
View description>>
This paper describes the development of a computer-based music visualisation to support the development of instrumental musical skills in advanced students and professional players. The underlying pedagogical philosophy, based on the 'Natural Learning Process' and the emergence of an artistic rather than engineering approach to software development, based on participatory design, are described.
Johnston, AJ, Marks, B & Edmonds, EA 2005, ''Spheres of Influence': An Interactive Musical Work', Proceedings of the Second Australasian Conference on Interactive Entertainment, Interactive Entertainment, Creativity & Cognition Studios Press, Sydney, Australia, pp. 97-103.
View description>>
In this paper we describe the development of an interactive artwork which incorporates both a musical composition and software which provides a visual and aural accompaniment. The system uses physical modeling to implement a type of virtual 'sonic sculpture' which responds to musical input in a way which appears naturalistic. This work forms part of a larger project to use art to explore the potential of computers to develop interactive tools which support the development of creative musical skills.
Lee, S, Leaney, J, O'Neill, T & Hunter, M 2005, 'Open Service Access for QoS Control in Next Generation Networks – Improving the OSA/Parlay Connectivity Manager', Lecture Notes in Computer Science: Operations and Management in IP-Based Networks, Proceedings, IPOM, Springer Berlin Heidelberg, Heidelberg, Germany, pp. 29-38.
View/Download from: Publisher's site
View description>>
The need for providing applications with practical, manageable access to feature-rich capabilities of telecommunications networks has resulted in standardization of the OSA/Parlay APIs and more recently the Parlay X Web Services. Connectivity Manager is
Lee, S, Leaney, J, O'Neill, T & Hunter, M 2005, 'Performance Benchmark of a Parallel and Distributed Network Simulator', Workshop on Principles of Advanced and Distributed Simulation (PADS'05), Workshop on Principles of Advanced and Distributed Simulation (PADS'05), IEEE, Monterey, USA, pp. 101-108.
View/Download from: Publisher's site
View description>>
Simulation of large-scale networks requires enormous amounts of memory and processing time. One way of speeding up these simulations is to distribute the model over a number of connected workstations. However, this introduces inefficiencies caused by the need for synchronization and message passing between machines. In distributed network simulation, one of the factors affecting message passing overhead is the amount of cross-traffic between machines. We perform an independent benchmark of the Parallel/Distributed Network Simulator (PDNS) based on experimental results processed at the Australian Centre for Advanced Computing and Communications (ACS) supercomputing cluster. We measure the effect of cross-traffic on wall-clock time needed to complete a simulation for a set of basic network topologies by comparing the result with the wall-clock time needed on a single processor. Our results show that although efficiency is reduced with large amounts of cross-traffic, speedup can still be achieved with PDNS. With these results, we developed a performance model that can be used as a guideline for designing future simulations. © 2005 IEEE.
Lee, S, Leaney, JR, O'Neill, T & Hunter, M 2005, 'Open API of QoS control in Next Generation Networks', Toward Managed Ubiquitous Information Society, Asia-Pacific Network Operations and Management Symposium, IEICE TM, KICS KNOM, IEEE CNOM, IEEE APB, IEEE COMSOC Japan Chapter and TMF, Okinawa, Japan, pp. 295-306.
Lin, P, MacArthur, A & Leaney, J 2005, 'Defining Autonomic Computing: A Software Engineering Perspective', 2005 Australian Software Engineering Conference, 2005 Australian Software Engineering Conference, IEEE, Brisbane, Australia, pp. 88-97.
View/Download from: Publisher's site
View description>>
As a rapidly growing field, Autonomic Computing is a promising new approach for developing large scale distributed systems. However, while the vision of achieving self-management in computing systems is well established, the field still lacks a commonly accepted definition of what an Autonomic Computing system is. Without a common definition to dictate the direction of development, it is not possible to know whether a system or technology is a part of Autonomic Computing, or if in fact an Autonomic Computing system has already been built. The purpose of this paper is to establish a standardised and quantitative definition of Autonomic Computing through the application of the Quality Metrics Framework described in IEEE Std 1061-1998 [1]. Through the application of this methodology, stakeholders were systematically analysed and evaluated to obtain a balanced and structured definition of Autonomic Computing. This definition allows for further development and implementation of quality metrics, which are project-specific, quantitative measurements that can be used to validate the success of future Autonomic Computing projects.
Lister, RF 2005, 'Methods for evaluating the appropriateness and effectiveness of summative assessment via multiple choice examinations for technology-focused disciplines', Making a Difference: 2005 Evaluations and Assessment Conference, Evaluations and Assessment Conference, UTS, Sydney, Australia, pp. 75-84.
Maxwell, C, Parakhine, A & Leaney, J 2005, 'Practical application of formal methods for specification and analysis of software architecture', 2005 Australian Software Engineering Conference, Proceedings, Australian Software Engineering Conference, IEEE, Brisbane, Australia, pp. 302-311.
View/Download from: Publisher's site
View description>>
With the ever-growing pace of technological advancement, computer software is required to become increasingly complex to meet the demands of today's leading edge technologies, and their applications. However, fulfilling this requirement creates new, previously unknown, problems pertaining to non-functional properties of software. Specifically, as the software complexity escalates, it becomes increasingly difficult to scale the software in order to cope with the sometimes overwhelming demand created by system growth. It is therefore essential to have processes for addressing the issues associated with scalability that arise due to the complexity in software systems. In this paper we describe an approach aimed at fulfilling the need for such processes. A combination of Object-Z and temporal logic is used to create an architectural description open to further analysis. We also demonstrate the practicality of this methodology within the context of the coordinated adaptive traffic system (CATS).
Maxwell, C, Parakhine, A, Leaney, J, O'Neill, T & Denford, M 2005, 'Heuristic-based architecture generation for complex computer system optimisation', 12TH IEEE INTERNATIONAL CONFERENCE AND WORKSHOPS ON THE ENGINEERING OF COMPUTER-BASED SYSTEMS, PROCEEDINGS, IEEE International Conference and Workshop on the Engineering of Computer Based Systems, IEEE, Greenbelt, USA, pp. 70-78.
View/Download from: Publisher's site
View description>>
Having come of age in the last decade, the use of architecture to describe complex systems, especially in software, is now maturing. With our current ability to describe, represent, analyse and evaluate architectures comes the next logical step in our application of architecture to system design and optimisation. Driven by the increasing scale and complexity of modern systems, designers have been forced to find new ways of managing the difficult and complex task of balancing the quality trade-offs inherent in all architectures. Architecture-based optimisation has the potential not only to provide designers with a practical method for approaching this task, but also to provide a generic mechanism for increasing the overall quality of system design. In this paper we explore the issues that surround the development of architectural optimisation and present an example of heuristic-based optimisation of a system with respect to concrete goals.
McGregor, C, Heath, J & Ming Wei 2005, 'A Web services based framework for the transmission of physiological data for local and remote neonatal intensive care', 2005 IEEE International Conference on e-Technology, e-Commerce and e-Service, Proceedings, The 2005 IEEE International Conference on e-Technology, e-Commerce and e-Service, IEEE, Hong Kong Baptist University, Hong Kong, China, pp. 496-501.
View/Download from: Publisher's site
McGregor, C, Kneale, B & Tracy, M 2005, 'Bush Babies Broadband: On-Demand Virtual Neonatal Intensive Care Unit Support for Regional Australia', Third International Conference on Information Technology and Applications (ICITA'05), Proceedings, Third International Conference on Information Technology and Applications, IEEE, Sydney, Australia, pp. 113-118.
View/Download from: Publisher's site
McGregor, C, Purdy, M & Kneale, B 2005, 'Compression of XML Physiological Data Streams to Support Neonatal Intensive Care Unit Web Services', 2005 IEEE International Conference on e-Technology, e-Commerce and e-Service, 2005 IEEE International Conference on e-Technology, e-Commerce and e-Service, IEEE, Hong Kong Baptist University, Hong Kong, China, pp. 486-489.
View/Download from: Publisher's site
Milton, J, Kennedy, P & Mitchell, H 2005, 'The effect of mutation on the accumulation of information in a genetic algorithm', AI 2005: ADVANCES IN ARTIFICIAL INTELLIGENCE, Australasian Joint Conference on Artificial Intelligence, Springer, Sydney, Australia, pp. 360-368.
View/Download from: Publisher's site
View description>>
We use an information theory approach to investigate the role of mutation on Genetic Algorithms (GA). The concept of solution alleles representing information in the GA and the associated concept of information density, being the average frequency of solution alleles in the population, are introduced. Using these concepts, we show that mutation applied indiscriminately across the population has, on average, a detrimental effect on the accumulation of solution alleles within the population and hence the construction of the solution. Mutation is shown to reliably promote the accumulation of solution alleles only when it is targeted at individuals with a lower information density than the mutation source. When individuals with a lower information density than the mutation source are targeted for mutation, very high rates of mutation can be used. This significantly increases the diversity of alleles present in the population, while also increasing the average occurrence of solution alleles.
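The targeted-mutation idea described above can be illustrated with a small, hypothetical one-max sketch. Everything here (the all-ones target, the population size, the 0.3 initial allele frequency and the 0.5 mutation rate) is our own toy choice for illustration, not the paper's experimental setup: when individuals carry less information than the mutation source (uniform random alleles have density 0.5), even a very high mutation rate raises the population's average information density.

```python
import random

def information_density(population, solution):
    """Average frequency of solution alleles across the population."""
    matches = sum(
        sum(a == b for a, b in zip(ind, solution)) for ind in population
    )
    return matches / (len(population) * len(solution))

def mutate(ind, rate, rng):
    """Flip each allele independently with probability `rate`."""
    return [1 - b if rng.random() < rate else b for b in ind]

rng = random.Random(1)
solution = [1] * 50                      # one-max: the all-ones target
# a low-information starting population: P(allele correct) ~ 0.3
pop = [[1 if rng.random() < 0.3 else 0 for _ in solution] for _ in range(40)]

before = information_density(pop, solution)
# targeted mutation: mutate only individuals whose own density is below
# that of the mutation source (uniform random bits carry density 0.5)
after_pop = [
    mutate(ind, 0.5, rng) if sum(ind) / len(ind) < 0.5 else ind
    for ind in pop
]
after = information_density(after_pop, solution)
```

Mutating indiscriminately would instead drag above-average individuals back toward 0.5, which is the detrimental effect the paper quantifies.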
Moscato, P, Mathieson, L, Mendes, A & Berretta, R 2005, 'The electronic primaries: Predicting the U.S. Presidency using feature selection with safe data reduction', Conferences in Research and Practice in Information Technology Series, pp. 371-380.
View description>>
The data mining inspired problem of finding the critical and most useful features with which to classify a data set, and constructing rules to predict the class of future examples, is an interesting and important problem, with applications in many areas such as microarray analysis, genomics, proteomics, pattern recognition, data compression and knowledge discovery. Expressed as κ-Feature Set it is also a formally hard problem. In this paper we present a method for coping with this hardness using the combinatorial optimisation and parameterized complexity inspired technique of sound reduction rules. We apply our method to an interesting data set which is used to predict the winner of the popular vote in the U.S. presidential elections. We demonstrate the power and flexibility of the reductions, especially when used in the context of the (α, β)κ-Feature Set variant problem. Copyright © 2005, Australian Computer Society, Inc.
Nataatmadja, I & Dyson, LE 2005, 'Managing the modern workforce: Cultural Diversity and Its Implications', Managing Modern Organisations with Information Technology, International Conference on Information Resources Management, Idea Group Publishing, San Diego, USA, pp. 580-583.
Nurmuliani, N, Zowghi, D & Williams, S 2005, 'Characterising requirements volatility: An empirical case study', Proceedings 2005 International Symposium on Empirical Software Engineering ISESE 2005, International Symposium on Empirical Software Engineering, IEEE, Noosa Heads, Australia, pp. 427-436.
Nurmuliani, N, Zowghi, D & Williams, SP 2005, 'Characterising requirements volatility: An empirical case study', 2005 International Symposium on Empirical Software Engineering (ISESE), Proceedings, 4th International Symposium on Empirical Software Engineering, IEEE, Noosa Heads, Australia, pp. 412-421.
Pattinson, HM & Sood, SC 2005, 'Deciphering storylines in B2B selling-buying interactions', Advances in Marketing: Managerial, Pedagogical, Theoretical - Proceedings of the Annual Meeting of the Society for Marketing Advances, Annual Meeting of the Society for Marketing Advances, Society for Marketing Advances, San Antonio, USA, pp. 199-201.
Pietroni, N & Ganovelli, AGF 2005, 'Robust segmentation of anatomical structures with deformable surfaces and marching cubes', Citeseer.
Sarosa, S & Zowghi, D 2005, 'Information technology adoption process within Indonesian SMEs: An empirical study', ACIS 2005 Proceedings - 16th Australasian Conference on Information Systems, Australasian Conference on Information Systems, Australasian Chapter of the Association for Information Systems, Sydney, Australia, pp. 1-9.
View description>>
IT adoption within SMEs has been covered extensively in the literature, most of which has considered IT adoption from a narrow perspective, such as the drivers and barriers of adoption. IT adoption is better defined as a process involving the organisation and its components, stakeholders external to the organisation, and the interactions within the organisation and between the organisation and its stakeholders. This paper uses multiple perspectives on IT adoption to build a model of the adoption process. A field study involving 35 Indonesian SMEs was conducted in the form of semi-structured interviews. The results from this field study were analysed and used to refine the proposed model. © 2005.
Sarosa, S & Zowghi, D 2005, 'Recover from information system failure: An Indonesian case study', European and Mediterranean Conference on Information Systems, EMCIS 2005, European, Mediterranean and Middle Eastern Conference on Information Systems, Information Institute, Cairo, Egypt, pp. 1-11.
View description>>
Small and Medium Enterprises (SMEs) sometimes acquire information systems that fail to meet their original aims and objectives. In these circumstances, the project sponsors are forced to decide whether they should abandon the system they have paid for or improvise by finding a way around the problem. This paper presents a case study of two Indonesian SMEs who had to deal with information systems failure within their organizations. Although reports of these types of failure can be found within the information systems literature, little is known about the aftermath of failure within SMEs. This case study presents the actions taken by two Indonesian SMEs faced with the failure of their web catalogue systems. The notion of IS failure used in this paper is a combination of 'expectation failure' and 'termination failure'.
Sheridan-Smith, N, Leaney, J, O'Neill, T & Hunter, M 2005, 'A Policy-Driven Autonomous System for Evolutive and Adaptive Management of Complex Services and Networks', 12th IEEE International Conference and Workshops on the Engineering of Computer-Based Systems (ECBS'05), 12th IEEE International Conference and Workshops on the Engineering of Computer-Based Systems (ECBS'05), IEEE, Greenbelt, Maryland, USA, pp. 389-397.
View/Download from: Publisher's site
View description>>
Many existing management systems are not evolutive or adaptive, leading to multiplicity over time and increasing the management burden. Policy-based management approaches may assist in making networks less complex and more automated, but to date they have not yet been able to evolve to support new service sets or provide the capacity for differentiation. We present the architecture for a policy-based system named Pronto that helps to deal with these issues. Layered network and service models are built above an extensible virtual device model that supports heterogeneous management interfaces. Interchangeable management components provide the basic building blocks to construct logical services. The integrated policy-driven service definition language automates the management of the services in a manner that is adaptive, dynamic and reactive to improve the user's overall service experience. © 2005 IEEE.
Sheridan-Smith, N, O'Neill, T, Leaney, J & Hunter, M 2005, 'Enhancements to Policy Distribution for Control Flow and Looping', Lecture Notes in Computer Science Vol 3775/2005, IFIP/IEEE International Workshop on Distributed Systems Operations and Management, Springer Berlin Heidelberg, Barcelona, Spain, pp. 269-280.
View/Download from: Publisher's site
View description>>
Our previous work proposed a simple algorithm for the distribution and coordination of network management policies across a number of autonomous management nodes by partitioning an Abstract Syntax Tree into different branches and specifying coordination points based on data and control flow dependencies. We now extend this work to support more complex policies containing control flow logic and looping, which are part of the PRONTO policy language. Early simulation results demonstrate the potential performance and scalability characteristics of this framework.
Sheridan-Smith, N, O'Neill, T, Leaney, J & Hunter, M 2005, 'Distribution and coordination of policies for large-scale service management', LANOMS 2005 - 4th Latin American Network Operations and Management Symposium, Proceedings, pp. 257-262.
View description>>
The distribution and coordination of policies is often overlooked but is crucial to the scalability of dynamic, personalised services. In this work we partition an Abstract Syntax Tree of the policies to determine the responsibility of different management nodes in a geographically segregated network (i.e. management by delegation). This partitioning is combined with IN/OUT set analysis to determine the coordination required to enforce complex policies with inter-dependencies. Our simulation results show that this approach is promising, as higher decision loads can be readily handled by further subdividing the network.
Sheridan-Smith, NB, O'Neill, T, Leaney, JR & Hunter, M 2005, 'Distribution and Coordination of Policies for Large-Scale Service Management', Proceedings of the IV Latin American Network Operations and Management Symposium, 4th Latin American Network Operations and Management Symposium (LANOMS), Porto Alegre, Brazil, pp. 1-12.
Shi, CG, Lu, J, Zhang, GQ & Zhou, H 2005, 'An extended Kuhn-Tucker approach for linear bilevel multifollower programming with partial shared variables among followers', INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS, VOL 1-4, PROCEEDINGS, IEEE Conference on Systems, Man and Cybernetics, IEEE Publisher, Hawaii, USA, pp. 3350-3357.
View/Download from: Publisher's site
View description>>
In real-world bilevel decision-making, the lower level of a bilevel decision usually involves multiple decision units. This paper proposes an extended Kuhn-Tucker approach for linear bilevel multifollower programming problems with partial shared variables among followers. Finally, numerical examples are given to show how the extended Kuhn-Tucker approach works.
Sood, SC & Pattinson, HM 2005, 'Semantics in marketspace: emerging semantic marketing computer-mediated environments', Advances in Marketing: Managerial, Pedagogical, Theoretical - Proceedings of the Annual Meeting of the Society for Marketing Advances, Annual Meeting of the Society for Marketing Advances, Society for Marketing Advances, San Antonio, USA, pp. 198-198.
Sood, SC & Pattinson, HM 2005, 'Urban renewal in Asia-Pacific: A comparative analysis of 'brainports' for Sydney and Kuala Lumpur', Dealing with Dualities - 21st Annual IMP Conference, Annual IMP Conference, IMP Group, Rotterdam, Netherlands, pp. 1-6.
van den Hoven, E & Eggen, B 2005, 'Personal souvenirs as ambient intelligent objects', Proceedings of the 2005 joint conference on Smart objects and ambient intelligence: innovative context-aware services: usages and technologies, sOc-EUSAI05: Smart Objects & Ambient Intelligence, ACM, Grenoble, France, pp. 123-128.
View/Download from: Publisher's site
View description>>
Recollecting memories is an important everyday activity, which can be supported in an Ambient Intelligent environment. For optimal support, cues are needed that help people reconstruct their memories. The cue category most suitable for an Ambient Intelligent environment is that of physical objects, more specifically souvenirs. This paper shows that personal souvenirs are suitable for use in an Ambient Intelligent recollecting application.
Voinov, AA 2005, 'Understanding and communicating sustainability: Global versus regional', AIChE Annual Meeting, Conference Proceedings, p. 12970.
View description>>
Sustainability in its present connotation is a Western concept that has emerged in the West and largely represents the attitudes of the developed world. Systems in the developing countries are in a transition that is further promoted by globalization. They are foreign to sustainability because by definition they are inclined toward change rather than maintenance; they are either in the release or renewal stages, which hardly anybody wishes to sustain, or have just entered the growth stage. Sustainability is enticing for the developed economic systems, which have reached the conservation phase and would rather endure in this stage. In communicating the knowledge of sustainability it is essential to adapt to local specifics and redefine sustainability accordingly. Local sustainability can be ensured only by borrowing energy, resources and adaptive potential from outside of the system, or by decreasing the sustainability of the global system. Sustainability of a subsystem is achieved at the expense of the supersystem or other subsystems. Therefore institutions that are to maintain life support systems on this planet need to emphasize global priorities and test policies and strategies against the sustainability of the biosphere, rather than regional or local sustainability. We illustrate these ideas with our findings in the Kola Peninsula (Russia) and in the Mekong watershed.
Wang, C, Lu, J & Zhang, G 2005, 'A framework for capturing domain knowledge via the web', AusWeb05: 11th Australasian World Wide Web Conference, Australian World Wide Web Conference, Southern Cross University, Gold Coast, Australia, pp. 248-255.
View description>>
Domain knowledge can be formalized and represented by ontologies, which play an important role in the realization of the Semantic Web. However, since the acquisition of knowledge from certain domains usually requires the deep involvement of qualified domain experts, construction of such ontologies is difficult and costly, even with the availability of dedicated languages and ontology editing tools. Some effort has been made to reduce this involvement by introducing a general paradigm of automatic domain knowledge learning from various sources. To make this paradigm more specific and practical, this paper proposes a framework for capturing domain knowledge from raw domain data available over the Web. This framework consists of three dedicated parts: data collection, pre-processing and mining, where the mining part performs the core task of the framework. Each part can be designed with specific optimized methods. The preliminary implementation of certain parts has shown that the framework is able to capture the knowledge of an electronic product taxonomy via the Web. © 2005. Chao Wang, Jie Lu & Guangquan Zhang.
Wang, C, Lu, J & Zhang, GQ 2005, 'A semantic classification approach for online product reviews', 2005 IEEE/WIC/ACM International Conference on Web Intelligence, Proceedings, IEEE/WIC/ACM International Conference on Web Intelligence and Intelligent Agent Technology, IEEE, France, pp. 276-279.
View/Download from: Publisher's site
View description>>
With the fast growth of e-commerce, product reviews on the Web have become an important information source for customers' decision making when they plan to buy products online. As the reviews are often too numerous for customers to go through, how to automatically classify them into different semantic orientations (i.e. recommend/not recommend) has become a research problem. Different from traditional approaches that treat a review as a whole, our approach performs semantic classification at the sentence level, recognizing that reviews often contain mixed feelings or opinions. In this approach, a feature selection method based on sentence tagging is employed and a naïve Bayes classifier is used to create a base classification model, which is then combined with certain heuristic rules for review sentence classification. Experiments show that this approach achieves better results than using general naïve Bayes classifiers.
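The general shape of sentence-level classification with a naïve Bayes base model can be sketched as follows. This is only an illustrative toy, not the paper's method: the training data, the 'pos'/'neg' labels and the majority-vote combination rule are our own assumptions standing in for the paper's sentence tagging and heuristic rules.

```python
import math
from collections import Counter

def train_nb(sentences, labels):
    """Count word occurrences per class (labels are 'pos'/'neg')."""
    word_counts = {"pos": Counter(), "neg": Counter()}
    class_counts = Counter(labels)
    vocab = set()
    for sent, lab in zip(sentences, labels):
        for w in sent.lower().split():
            word_counts[lab][w] += 1
            vocab.add(w)
    return word_counts, class_counts, vocab

def classify_sentence(sent, model):
    """Naive Bayes with Laplace smoothing, applied to one sentence."""
    word_counts, class_counts, vocab = model
    total = sum(class_counts.values())
    best, best_lp = None, float("-inf")
    for lab in class_counts:
        lp = math.log(class_counts[lab] / total)  # class prior
        denom = sum(word_counts[lab].values()) + len(vocab)
        for w in sent.lower().split():
            lp += math.log((word_counts[lab][w] + 1) / denom)
        if lp > best_lp:
            best, best_lp = lab, lp
    return best

def classify_review(sentences, model):
    """Toy combination rule: majority vote over sentence orientations."""
    votes = Counter(classify_sentence(s, model) for s in sentences)
    return "pos" if votes["pos"] > votes["neg"] else "neg"
```

Classifying per sentence before combining is what lets a mixed review ("great screen" but "terrible support") contribute evidence to both orientations instead of being forced whole into one class.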
Weakley, AJ, Johnston, AJ, Edmonds, EA & Turner, GA 2005, 'Creative Collaboration: Communication Translation and Generation in the Development of a Computer-based Artwork', HCI International 2005 - 11th International Conference on Human-Computer Interaction, International Conference on Human-Computer Interaction, Lawrence Erlbaum Assoc, Las Vegas, Nevada, pp. 1-9.
Xu, G, Zhang, Y & Zhou, X 2005, 'Towards User Profiling for Web Recommendation', Lecture Notes in Computer Science, 18th Australian Joint Conference on Artificial Intelligence, Springer Berlin Heidelberg, Sydney, Australia, pp. 415-424.
View/Download from: Publisher's site
Xu, G, Zhang, Y & Zhou, X 2005, 'Using probabilistic latent semantic analysis for web page grouping', Proceedings of the IEEE International Workshop on Research Issues in Data Engineering, 15th International Workshop on Research Issues in Data Engineering: Stream Data Mining and Applications, IEEE Computer Society, Tokyo, Japan, pp. 29-36.
View description>>
The locality of web pages within a web site is initially determined by the designer's expectations. Web usage mining can discover patterns in the navigational behaviour of web visitors and, in turn, improve web site functionality and service design by considering users' actual opinions. Conventional web page clustering techniques are often utilized to reveal the functional similarity of web pages. However, a high-dimensional computation problem is incurred because user transactions are taken as dimensions. In this paper, we propose a new web page grouping approach based on the Probabilistic Latent Semantic Analysis (PLSA) model. An iterative algorithm based on the maximum likelihood principle is employed to overcome the aforementioned computational shortcoming. The web pages are classified into various groups according to user access patterns. Meanwhile, the semantic latent factors or tasks are characterized by extracting the content of 'dominant' pages related to the factors. We demonstrate the effectiveness of our approach by conducting experiments on real world data sets. © 2005 IEEE.
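The core machinery here, PLSA fitted with EM over a page-session co-occurrence matrix and pages grouped by their most probable latent factor, can be sketched in a minimal form. This is a generic textbook PLSA sketch under our own assumptions (toy counts, fixed iteration budget), not the authors' implementation:

```python
import random

def plsa(counts, k, iters=100, seed=0):
    """Fit tiny PLSA by EM. counts[p][s] = visits to page p in session s.
    Returns p(z), p(page|z), p(session|z) for k latent factors."""
    rng = random.Random(seed)
    P, S = len(counts), len(counts[0])
    pz = [1.0 / k] * k
    p_pg = [[rng.random() for _ in range(P)] for _ in range(k)]
    p_ss = [[rng.random() for _ in range(S)] for _ in range(k)]
    for rows in (p_pg, p_ss):                      # normalise init
        for row in rows:
            t = sum(row)
            row[:] = [v / t for v in row]
    for _ in range(iters):
        # E-step: posterior p(z | page, session)
        post = [[None] * S for _ in range(P)]
        for p in range(P):
            for s in range(S):
                num = [pz[z] * p_pg[z][p] * p_ss[z][s] for z in range(k)]
                t = sum(num) or 1.0
                post[p][s] = [v / t for v in num]
        # M-step: re-estimate parameters from expected counts
        new_pz = [0.0] * k
        new_pg = [[0.0] * P for _ in range(k)]
        new_ss = [[0.0] * S for _ in range(k)]
        for p in range(P):
            for s in range(S):
                c = counts[p][s]
                for z in range(k):
                    w = c * post[p][s][z]
                    new_pz[z] += w
                    new_pg[z][p] += w
                    new_ss[z][s] += w
        total = sum(new_pz) or 1.0
        pz = [v / total for v in new_pz]
        for z in range(k):
            t = sum(new_pg[z]) or 1.0
            p_pg[z] = [v / t for v in new_pg[z]]
            t = sum(new_ss[z]) or 1.0
            p_ss[z] = [v / t for v in new_ss[z]]
    return pz, p_pg, p_ss

def group_pages(p_pg):
    """Assign each page to its most probable latent factor."""
    k, P = len(p_pg), len(p_pg[0])
    return [max(range(k), key=lambda z: p_pg[z][p]) for p in range(P)]
```

The point of the factorisation is that the model size grows with pages × factors rather than pages × sessions, which is the high-dimensionality relief the abstract refers to.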
Xu, G, Zhang, Y, Ma, J & Zhou, X 2005, 'Discovering user access pattern based on probabilistic latent factor model', Conferences in Research and Practice in Information Technology Series, 16th Australasian Database Conference, Australian Computer Society, Newcastle, Australia, pp. 27-35.
View description>>
There has been an increased demand for characterizing user access patterns using web mining techniques, since the informative knowledge extracted from web server log files can offer benefits not only for web site structure improvement but also for a better understanding of user navigational behavior. In this paper, we present a web usage mining method which utilizes web usage and page linkage information to capture user access patterns based on the Probabilistic Latent Semantic Analysis (PLSA) model. A specific probabilistic model analysis algorithm, the EM algorithm, is applied to the integrated usage data to infer the latent semantic factors as well as generate user session clusters for revealing user access patterns. Experiments have been conducted on a real world data set to validate the effectiveness of the proposed approach. The results have shown that the presented method is capable of characterizing the latent semantic factors and generating user profiles in terms of weighted page vectors, which may reflect the common access interest exhibited by users within the same session cluster. © 2005, Australian Computer Society, Inc.
Yan, Y, Zhu, J, Lu, H, Guo, Y & Wang, S 2005, 'A PMSM model incorporating structural and saturation saliencies', 2005 International Conference on Electrical Machines and Systems, Proceedings of the Eighth International Conference on Electrical Machines and Systems, IEEE, Nanjing, China, pp. 194-199.
View/Download from: Publisher's site
View description>>
Sensorless permanent magnet synchronous motor (PMSM) drive systems have become very attractive due to their advantages, such as reduced hardware complexity and hence reduced system cost and increased reliability. In order to accurately determine the rotor position required for correct electronic commutation, various methods have been proposed. Among them, the most versatile make use of the structural and/or magnetic saturation saliencies of the PMSM. This paper presents a non-linear model for PMSMs with these saliencies. The phase inductances of a PMSM are measured and expressed by Fourier series at different rotor positions according to their patterns. The dynamic performance of the PMSM is simulated and compared with that based on a model without saliency to verify the effectiveness of the proposed model.
Guo, Y, Zhu, J & Lu, H 2005, 'Design of SMC motors using hybrid optimization techniques and 3D FEA with increasing accuracy', 2005 International Conference on Electrical Machines and Systems, Proceedings of the Eighth International Conference on Electrical Machines and Systems, IEEE, Nanjing, China, pp. 2296-2301.
View/Download from: Publisher's site
View description>>
This paper presents the design and analysis of a three-phase three-stack permanent magnet claw pole motor with soft magnetic composite (SMC) stator core. 3D finite element analysis (FEA) of magnetic field is performed to accurately calculate key motor parameters and performance. Combined optimization techniques and 3D FEA with increasing accuracy are applied to effectively reduce the computational time. The designed motor has been fabricated and tested. The theoretical calculations are validated by the experimental results on the prototype.
Guo, Y, Zhu, J & Lu, H 2005, 'Design and Analysis of a Permanent Magnet Claw Pole/Transverse Flux Motor with SMC Core', 2005 International Conference on Power Electronics and Drives Systems, 2005 International Conference on Power Electronics and Drives Systems, IEEE, pp. 1413-1418.
View/Download from: Publisher's site
View description>>
This paper presents the design and analysis of a claw pole/transverse flux motor (CPTFM) with a soft magnetic composite (SMC) core and a permanent magnet flux-concentrating rotor. Three-dimensional magnetic field finite element analysis is conducted to accurately calculate key motor parameters such as winding flux, back electromotive force, winding inductance, and core loss. An equivalent electric circuit is derived under the optimum brushless DC control condition for motor performance prediction, and computer search techniques are applied for design optimization. All these computations and analyses have been implemented in the commercial software ANSYS for the development of the SMC CPTFM prototype.
Yu, S & Zhou, W 2005, 'An Efficient Reliable Architecture for Application Layer Anycast Service', DISTRIBUTED AND PARALLEL COMPUTING, 6th International Conference on Algorithms and Architectures for Parallel Processing, Springer Berlin Heidelberg, Melbourne, Australia, pp. 376-385.
View/Download from: Publisher's site
Zhang, Y, Xu, G & Zhou, X 2005, 'A Latent Usage Approach for Clustering Web Transaction and Building User Profile', Lecture Notes in Computer Science, First International Conference, ADMA 2005, Springer Berlin Heidelberg, Wuhan, China, pp. 31-42.
View/Download from: Publisher's site
Yu, ZX, Chen, JR & Zhu, TQ 2005, 'A novel adaptive intrusion detection system based on data mining', 2005 International Conference on Machine Learning and Cybernetics, Proceedings of 2005 International Conference on Machine Learning and Cybernetics, IEEE, pp. 2390-2395.
View/Download from: Publisher's site
View description>>
A Data Mining based Adaptive Intrusion Detection Model (DMAIDM) is presented in this paper. The DMAIDM applies a fast heuristic clustering algorithm for mixed data (FHCAM) to distinguish intrusions from legal behaviors efficiently, and an attribute-constrained based fuzzy mining algorithm (ACFMA) to construct the intrusion pattern database automatically. Verification tests were carried out using the 10% subset of the KDD Cup 1999 Data Set. The average detection rate is 71.67% and the average false detection rate is 0.92%, and the detection rate increases adaptively from 65.25% (the second subset) to 85.7% (the ninth subset). The experimental results reveal that the DMAIDM is successful in terms of not only accuracy but also efficiency in network intrusion detection. © 2005 IEEE.
Zowghi, D, Firesmith, D & Henderson-Sellers, B 2005, 'Using the OPEN process framework to produce a situation specific requirements engineering method', Situational Requirements Engineering Processes - SREP 05 The 1st International Workshop, International Workshop on Situational Requirements Engineering Processes, University of Limerick, Ireland, Paris, France, pp. 59-75.
View description>>
Since it is not possible to identify or create a single method that is appropriate for all situations, the need for a focussed requirements engineering method (REM) necessitates the search for a mechanism that will support the flexible creation of a number of tailored REMs from a single base. Using a repository of reusable method components, it is possible to use the techniques espoused by the method engineering community to construct an appropriate REM that is well suited to the particular system or application development endeavour under consideration. One particular example, that of the OPEN Process Framework (or OPF), is used to illustrate this approach.