The 5th International Workshop on Crowd-Based Requirements Engineering (CrowdRE'21)



21 September 2021
Virtual Conference
In conjunction with RE 2021

Overview

The rise of mobile, social, and cloud apps has required requirements engineering (RE) to adapt. Traditional RE methods are highly inefficient in situations involving thousands to millions of current and potential users of a (software) product. The crowd is an interesting source for RE because it produces user feedback in the form of texts and usage data. Being able to respond quickly, effectively, and iteratively to the requirements, problems, wishes, and needs identified in user feedback can increase a product’s success. Crowd-Based RE (CrowdRE) seeks to provide RE with suitable means for this crowd paradigm.

The 5th International Workshop on Crowd-Based Requirements Engineering (CrowdRE’21) focuses on CrowdRE in the era of the COVID-19 pandemic and on bridging the gap between CrowdRE and development.


CrowdRE’21 builds on the successes of its previous editions, which unified existing visions into a coherent RE approach (CrowdRE’15), established a roadmap and shared resources (CrowdRE’17), strengthened its ties to artificial intelligence techniques (CrowdRE as the special focal topic of AIRE’18), redefined its scope (CrowdRE’19), and expanded into digital transformation territory (CrowdRE’20).


Keynote Speaker


Fabiano Dalpiaz - On the Value of CrowdRE in Research and Practice

Abstract: User feedback is a key component of requirements elicitation, even more so in CrowdRE. Such feedback can be obtained either by asking the crowd to express their needs via a dedicated platform, or by analyzing spontaneous inputs such as user reviews in app stores. In either scenario, significant human effort is required to establish a crowd, to motivate the users, to process their inputs, and to combine them with the product's roadmap. Researchers have proposed (semi-)automated approaches that aim to reduce human effort when analyzing large quantities of user feedback. These approaches often employ machine and deep learning to classify and summarize thousands or millions of user reviews. The growing body of literature published in top research venues testifies to the academic relevance and value of CrowdRE.
However, considerably less attention has been paid to the value for practitioners. Does the value for the research community (i.e., published papers) lead to comparable value for a development team who wishes to adopt CrowdRE techniques?
Starting from research executed in collaboration with industrial partners, I will offer a personal, yet empirically grounded, perspective on the value of CrowdRE for research and practice. Based on these findings, I will put forward concrete directions for conducting CrowdRE research with value for RE practice.

Bio: Dr. Fabiano Dalpiaz is a tenured assistant professor in the Department of Information and Computing Sciences at Utrecht University, the Netherlands. He is the principal investigator of the department's Requirements Engineering lab. In his research, Fabiano blends artificial intelligence with information visualization to increase the quality of requirements engineering processes and artifacts, with the ultimate aim of delivering higher-quality software. His research is often validated in vivo through collaborations with the software industry. Fabiano served as program co-chair of REFSQ 2021, RCIS 2020, and the RE@Next! track of IEEE RE 2021. He was the organization chair of the REFSQ 2018 conference, and he is an associate editor of the Requirements Engineering Journal and the Business & Information Systems Engineering journal. He regularly serves on the program committees of conferences such as RE, CAiSE, REFSQ, ICSE, and AAMAS.


Important Dates
  • Paper Submission: 1 July 2021 (Extended!)
  • Notification: 23 July 2021
  • Camera Ready: 12 August 2021
  • Workshop: 21 September 2021

All deadlines are at 23:59:59 AoE (Anywhere on Earth).