University of Canberra, School of Information Sciences and Engineering
There has been an increasing focus internationally on the quality and impact of research outputs in recent years. Several countries, including the United Kingdom and New Zealand, have implemented schemes that base research funding on research quality. The Australian government plans to implement a Research Quality Framework (RQF) in the next few years that will greatly affect the funding of research in Australian universities. A key issue for Australian researchers is how the quality and impact of research are defined and measured in their discipline areas. Although peer review is widely used to assess the quality of research outputs, it is expensive and labour intensive, so surrogate quality measures are often used instead. This paper focuses on measuring the quality of research outputs in the information systems discipline. We argue that measures such as citation indexes are inappropriate for information systems and that the publication outlet is a more suitable indicator of quality. We present a ranking list of journals for the information systems discipline and explain the approach we have taken in developing it. We then discuss how the ranking list may be used to define and measure the quality of information systems research outputs, the limitations inherent in the approach, and the lessons we have learned in developing the list.
Unless expressly stated otherwise, the copyright for items in DRO is owned by the author, with all rights reserved.
Every reasonable effort has been made to ensure that permission has been obtained for items included in DRO.
If you believe that your rights have been infringed by this repository, please contact email@example.com.