BANDIT, the software tool used for the bird banding program at the Bird Banding Lab, was previously examined by a team of students, including myself, to better understand the faults in user-centric design plaguing the program. Prior to our involvement with the software, the lab had received complaints from users frustrated by BANDIT's lack of intuitiveness. Our work in the spring yielded substantial findings on how to increase the usability of BANDIT, based on preliminary research (see BANDIT Redesign Part 1). Starting in May 2018, I began to work individually with the Bird Banding Lab to identify specific task flows for particular features within the software. The goal was to increase user retention, participation, and data integrity through the integration of new user flows and paths.
To properly identify which issues were causing users the most frustration, along with which paths could be altered to maximize benefits for the Bird Banding Lab, research consisting of surveys and interviews was conducted. Banders (users who band birds and submit data to BANDIT) were given a survey, crafted by myself in collaboration with banding lab staff, to better understand their frustrations with the system. The survey asked a series of questions aimed at identifying the root causes of users' frustration and aversion to the software, while also looking toward the future to see what types of updates and system changes users could accommodate. While banders participated in the surveys, I conducted face-to-face interviews with members of the Bird Banding Lab staff. These interviews set out to establish what the lab staff was looking for from users and their data. Throughout the interviews, I focused on topics such as frustrations, areas of increased workload, and metrics of improved user satisfaction.
Once the interviews and surveys were completed, I compiled the results to determine which user tasks should be re-examined first. Based on the results and continued discussions with staff, two task flows were chosen for re-examination: the process for correcting errors within the software, and the batch upload process, which allows users to quickly import large sets of data. Research showed these two features to be vital to the goal of increasing user retention and data integrity. In particular, I learned how much the learning curve served as a barrier keeping new users from continuing to engage with the program. The complexity of the software required users to be trained in order to fully understand it; yet with most banders submitting data only once a year, the prospect of spending a few hours learning the software was very unappealing.
The issues with the error correction process identified through research mostly concerned the intuitiveness of the system. Users complained that errors were hard to identify and even harder to resolve. The current process lacked guidance and relied on users understanding a complex series of steps to identify and resolve errors. This often resulted in users submitting incorrect or incomplete data to the lab in the hope that lab staff would fix their mistakes. In the redesign, I aimed to simplify this process and provide a guided journey through the use of alert boxes, simpler and more directive language, and highlighted paths. These changes would not only increase usability for current users but also allow for an easier on-boarding process for new recruits. This was important since many banders rely on transitional staff, such as research assistants and graduate students, for data entry. Banding lab staff could also look forward to a decreased workload, as users would be better able to resolve their own issues.
Batch upload was the second journey chosen for re-examination, in part because of the feature's popularity and its ability to significantly increase user satisfaction when understood correctly. In the current edition of BANDIT, the batch upload process suffered from a lack of intuitiveness: users often became confused about how to import records when heading titles did not match the expected format exactly. In the past, the Bird Banding Lab had sought to fix this issue by creating a standardized template that could be used to import user data. Survey research showed that users of the template reported much higher satisfaction than those who did not use it; however, the template still saw low usage. In re-imagining this process, I sought to increase usage of the template and make it easier to access. Through research, I determined a simple solution: embedding an MS Excel Online feature within the BANDIT tool. This implementation preserved the two major advantages of the original batch upload, independence and familiarity. It ensured that users could access their data without being logged directly into the BANDIT portal, while providing an interface they were often accustomed to from previous projects. The implementation would not only promote and encourage use of the standard template but also allow for mobile upload, which is increasingly important as agencies switch to smart devices in an effort to cut costs. Other updates to the design included a guided entry to the batch upload section from the main screen, something that did not previously exist.
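The heading-mismatch problem described above can be illustrated with a small sketch. This is not part of BANDIT itself; the column names and the `validate_headers` function are hypothetical, simply showing how an import tool might check a user's file against a standardized template and report the exact mismatches before any records are loaded, rather than leaving users to guess why the import failed.

```python
import csv
import io

# Hypothetical standardized template headings (illustrative only;
# not the Bird Banding Lab's actual template columns).
TEMPLATE_HEADERS = ["Band Number", "Species", "Banding Date", "Location"]

def validate_headers(file_obj):
    """Compare a CSV file's heading row to the template.

    Returns (missing, unexpected) so the caller can tell the user
    exactly which columns to fix before the batch import runs.
    """
    reader = csv.reader(file_obj)
    headers = [h.strip() for h in next(reader, [])]
    missing = [h for h in TEMPLATE_HEADERS if h not in headers]
    unexpected = [h for h in headers if h not in TEMPLATE_HEADERS]
    return missing, unexpected

# Example: a user file whose headings have drifted from the template.
user_file = io.StringIO(
    "Band No,Species,Banding Date,Location\n"
    "1234-56789,AMRO,2018-05-01,MD\n"
)
missing, unexpected = validate_headers(user_file)
print(missing)     # ['Band Number']
print(unexpected)  # ['Band No']
```

A check like this, surfaced as a clear alert at upload time, is one way to make a template-based import self-correcting instead of silently rejecting records.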