
Four papers by Sherlock Licorish et al. accepted at EASE 2017

Monday, 22 May 2017

Details of the four Information Science papers are as follows:

Attributes that Predict which Features to Fix: Lessons for App Store Mining

By: Sherlock A. Licorish, Bastin Tony Roy Savarimuthu, Swetha Keertipati

Requirements engineering is often regarded as the most important phase of the software development process. Apps typically receive a large amount of feedback after they are released, and the main challenge is to assess the importance of all suggestions so that developers can prioritise what needs to be fixed or added. Previous work in this area has not fully addressed this prioritisation challenge, given the considerable number of app reviews that are often received. Our research explores the usefulness of each app review and the attributes that predict which app features to fix. We conclude that review mining and prioritisation challenges remain, given variances in app reviews’ content and structure. Findings also point to the need to redesign app review interfaces to consider how reviews are captured.
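
As a purely illustrative sketch (not the paper’s actual approach), the snippet below ranks hypothetical app reviews by a few simple attributes such as star rating, length, and bug-related keywords; all names and data here are invented for the example.

```python
# Hypothetical illustration only: triaging app reviews by a few simple
# attributes (star rating, review length, bug-related keywords). This is
# not the authors' method, just a sketch of the kind of attributes a
# review-mining pipeline might weigh when prioritising what to fix.

BUG_KEYWORDS = {"crash", "bug", "freeze", "error", "broken"}

def review_priority(review: dict) -> float:
    """Score a review; higher scores suggest earlier developer attention."""
    text = review["text"].lower()
    keyword_hits = sum(word in text for word in BUG_KEYWORDS)
    low_rating_penalty = 5 - review["rating"]          # 1-star reviews weigh most
    length_signal = min(len(text.split()) / 50, 1.0)   # longer reviews often carry detail
    return keyword_hits * 2 + low_rating_penalty + length_signal

reviews = [
    {"text": "App crashes every time I open the camera", "rating": 1},
    {"text": "Love the new design, great work!", "rating": 5},
]
for r in sorted(reviews, key=review_priority, reverse=True):
    print(round(review_priority(r), 2), r["text"])
```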


Interactive Posters: An Alternative to Collect Practitioners’ Experience

By: Philipp Diebold, Matthias Galster, Austen Rainer, Sherlock A. Licorish

The validity of survey-based research depends significantly on the number and quality of participants. Attempts to gather responses to online surveys often fail because the questions are too complex or because of a general lack of interest. Our team of researchers used a more interactive approach at practitioner-focussed software engineering events: we presented our questions on posters, and respondents placed individually coloured stickers on these to indicate their answers. Using this approach, we obtained a large number of responses, and it was also easier to follow up with study participants. We outline the advantages and disadvantages of using such posters to help those facing challenges in this area.


Investigating developers’ email discussions during decision-making in Python language evolution

By: Pankajeshwara Sharma, Bastin Tony Roy Savarimuthu, Nigel Stanger, Sherlock A. Licorish, Austen Rainer

Open Source Software (OSS) developers use mailing lists as their main forum for discussing the evolution of a project. However, how developers use mailing lists for decision-making has received little research attention. We have explored this area using a dataset of a little under 50,000 emails from relevant Python lists. Our outcomes point to several opportunities for improving the management of an OSS team based on the knowledge generated from discussions. We have also identified several interesting avenues for future work, such as identifying individuals or groups that present persuasive arguments during decision-making.
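
For readers curious about the kind of data involved, the sketch below uses Python’s standard mailbox module to read an mbox mailing-list archive and tally threads and senders; the file path is a placeholder, and this is not the study’s actual pipeline.

```python
# Illustrative sketch only: Python mailing-list archives are commonly
# distributed as mbox files, which the standard library's mailbox module
# can read. "python-dev.mbox" below is a placeholder path, not a dataset
# from the study.

import mailbox
from collections import Counter

threads = Counter()
senders = Counter()

for msg in mailbox.mbox("python-dev.mbox"):
    subject = (msg["subject"] or "").replace("Re: ", "").strip()
    threads[subject] += 1
    senders[msg["from"]] += 1

print("Most-discussed threads:", threads.most_common(5))
print("Most active senders:", senders.most_common(5))
```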


Crowdsourced Knowledge on Stack Overflow: A Systematic Mapping Study

By: Sarah Meldrum, Sherlock A. Licorish, Bastin Tony Roy Savarimuthu

Platforms such as Stack Overflow are available for software practitioners to solicit help and solutions to their challenges and knowledge needs. However, concerns have been raised about the quality of this community’s practices. We conducted a systematic mapping study involving nearly 300 papers from six relevant databases. Outcomes show that Stack Overflow has attracted increasing research interest, with topics spanning community dynamics, human factors, and technical issues. In addition, studies have largely been evaluative or have proposed solutions, though the latter tend to lack validation. This signals the need for future work exploring the nature and quality of Stack Overflow research contributions. We outline our research agenda for continuing such efforts.