Software bot

A software bot is a type of software agent in the service of software project management and software engineering. A software bot has an identity and potentially personified aspects in order to serve its stakeholders. Software bots often compose software services and provide an alternative user interface, which is sometimes, but not necessarily, conversational.

Software bots are typically used to execute tasks, suggest actions, engage in dialogue, and promote social and cultural aspects of a software project.

The term bot is derived from robot. However, robots act in the physical world, while software bots act only in digital spaces. Some software bots are designed and behave as chatbots, but not all chatbots are software bots. Erlenhov et al. discuss the past and future of software bots and show that software bots have been in use for many years.

Usage
Software bots are used to support development activities, such as communication among software developers and automation of repetitive tasks. Software bots have been adopted by several communities related to software development, such as open-source communities on GitHub and Stack Overflow.

GitHub bots have user accounts and can open, close, or comment on pull requests and issues. GitHub bots have been used to assign reviewers, ask contributors to sign the Contributor License Agreement, report continuous integration failures, review code and pull requests, welcome newcomers, run automated tests, merge pull requests, fix bugs and vulnerabilities, etc.
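As a hedged illustration of how such a bot can be structured, the sketch below shows a webhook handler for welcoming newcomers. It assumes payloads arrive as parsed JSON dicts following GitHub's `pull_request` webhook schema (the `author_association` field marks first-time contributors); the function name and greeting text are invented.

```python
# Minimal sketch of a GitHub "welcome newcomer" bot. Field names follow
# GitHub's pull_request webhook event; the greeting itself is hypothetical.

def handle_pull_request_event(payload):
    """Return a comment for the bot to post, or None if no action is needed."""
    if payload.get("action") != "opened":
        return None
    pr = payload["pull_request"]
    author = pr["user"]["login"]
    # GitHub marks first-time contributors with this association value.
    if pr.get("author_association") == "FIRST_TIME_CONTRIBUTOR":
        return (f"Welcome, @{author}! Thanks for your first pull request. "
                "A maintainer will review it shortly.")
    return None

# Example payload, abbreviated from the real webhook schema:
event = {
    "action": "opened",
    "pull_request": {
        "user": {"login": "newdev"},
        "author_association": "FIRST_TIME_CONTRIBUTOR",
    },
}
print(handle_pull_request_event(event))
```

In practice the returned text would be posted back through the GitHub REST API under the bot's own user account.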

Slack includes an API for developing software bots. There are Slack bots for keeping track of to-do lists, coordinating standup meetings, and managing support tickets. Chatbot companies' products further simplify the process of creating a custom Slack bot.
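A standup-coordination bot of this kind can be sketched with Slack's Web API method `chat.postMessage`. In the sketch below, the channel ID, member IDs, and token are placeholders; the request is only built, and would be sent solely with a real bot token.

```python
# Sketch of a Slack standup-reminder bot using the chat.postMessage Web API
# method. Token, channel, and user IDs are placeholders.
import json
import urllib.request

SLACK_API = "https://slack.com/api/chat.postMessage"

def build_standup_request(token, channel, members):
    """Build an HTTP request prompting each member for a standup update."""
    mentions = " ".join(f"<@{m}>" for m in members)
    payload = {
        "channel": channel,
        "text": f"Standup time! {mentions} please share: what you did "
                "yesterday, what you plan today, and any blockers.",
    }
    return urllib.request.Request(
        SLACK_API,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json; charset=utf-8",
        },
    )

req = build_standup_request("xoxb-placeholder", "C0123456789", ["U111", "U222"])
print(req.full_url)
# urllib.request.urlopen(req)  # send only with a real bot token
```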

On Wikipedia, Wikipedia bots automate a variety of tasks, such as creating stub articles, consistently updating the format of multiple articles, and so on. Bots like ClueBot NG are capable of recognizing vandalism and automatically removing disruptive content.
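To give a flavor of the kinds of signals such bots examine, the toy sketch below flags an edit using simple rules. Real anti-vandalism bots such as ClueBot NG use machine learning; the thresholds and rules here are invented for illustration.

```python
# Toy rule-based vandalism check. Real bots use ML classifiers; these
# heuristics and thresholds are invented.
import re

def looks_like_vandalism(old_text, new_text):
    """Flag an edit as suspicious based on simple heuristics."""
    # Large deletions (page blanking) are a classic vandalism signal.
    if len(new_text) < 0.2 * len(old_text):
        return True
    # Inspect only the appended portion when the edit is a pure addition.
    added = new_text[len(old_text):] if new_text.startswith(old_text) else new_text
    # Long runs of one character, or long all-caps words, are suspicious.
    if re.search(r"(.)\1{9,}", added) or re.search(r"\b[A-Z]{15,}\b", added):
        return True
    return False

print(looks_like_vandalism("A long article about software bots. " * 10, ""))
```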

Taxonomies and Classification Frameworks
Lebeuf et al. provide a faceted taxonomy to characterize bots based on a literature review. It is composed of three main facets: (i) properties of the environment that the bot was created in; (ii) intrinsic properties of the bot itself; and (iii) the bot's interactions within its environment. Each main facet is further detailed into a set of sub-facets.

Paikari and van der Hoek defined a set of dimensions, applied specifically to chatbots, to enable comparison of software bots. The analysis resulted in six dimensions:
 * Type: the main purpose of the bot (information, collaboration, or automation)
 * Direction of the "conversation" (input, output, or bi-directional)
 * Guidance (human-mediated, or autonomous)
 * Predictability (deterministic, or evolving)
 * Interaction style (dull, alternate vocabulary, relationship-builder, human-like)
 * Communication channel (text, voice, or both)
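The six dimensions above can be encoded as a small data structure for comparing bots side by side. In the sketch below, the enum and field values mirror the dimensions listed in the text, while the example classification of Dependabot is an illustrative guess rather than one taken from the paper.

```python
# Encoding the six comparison dimensions of Paikari and van der Hoek as a
# dataclass. The Dependabot classification is an illustrative guess.
from dataclasses import dataclass
from enum import Enum

class BotType(Enum):
    INFORMATION = "information"
    COLLABORATION = "collaboration"
    AUTOMATION = "automation"

class Direction(Enum):
    INPUT = "input"
    OUTPUT = "output"
    BIDIRECTIONAL = "bi-directional"

@dataclass
class ChatbotProfile:
    name: str
    type: BotType
    direction: Direction
    guidance: str         # "human-mediated" or "autonomous"
    predictability: str   # "deterministic" or "evolving"
    interaction_style: str  # "dull", "alternate vocabulary", etc.
    channel: str          # "text", "voice", or "both"

dependabot = ChatbotProfile(
    name="Dependabot",
    type=BotType.AUTOMATION,
    direction=Direction.OUTPUT,
    guidance="autonomous",
    predictability="deterministic",
    interaction_style="dull",
    channel="text",
)
print(dependabot.type.value)
```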

Erlenhov et al. raised the question of what distinguishes a bot from simple automation, since much research done in the name of software bots uses the term to describe a variety of tools, some of which are "just" plain old development tools. After interviewing and surveying over 100 developers, the authors found that not one but three definitions dominated the community. They created three personas based on these definitions; what distinguishes the personas' notions of a bot is mainly the association with different sets of human-like traits. The authors recommend that people doing research or writing about bots place their work in the context of one of the personas, since the personas have different expectations of, and problems with, the tools.
 * The chat bot persona (Charlie) primarily thinks of bots as tools that communicate with the developer through a natural language interface (typically voice or chat), caring little about what tasks the bot is used for or how it actually implements them.
 * The autonomous bot persona (Alex) thinks of bots as tools that work on their own (without requiring much input from a developer) on a task that would normally be done by a human.
 * The smart bot persona (Sam) separates bots from plain old development tools by how smart (technically sophisticated) a tool is. Sam cares less about how the tool communicates and more about whether it is unusually good or adaptive at executing a task.

Examples of notable bots

 * Dependabot and Renovatebot update software dependencies and detect vulnerabilities. (https://dependabot.com/)
 * Probot is an organization that creates and maintains bots for GitHub. Example bots built with Probot include:
 * Auto Assign (https://probot.github.io/apps/auto-assign/)
 * license bot (https://probot.github.io/)
 * Sentiment bot (https://probot.github.io/apps/sentiment-bot/)
 * Untrivializer bot (https://probot.github.io/apps/untrivializer/)
 * Refactoring-Bot: provides refactorings based on static code analysis
 * Looks good to me bot (LGTM) is a Semmle product that inspects pull requests on GitHub for code style and unsafe code practices.

Issues and threats
Software bots may not be well accepted by humans. A study from the University of Antwerp compared how developers active on Stack Overflow perceive answers generated by software bots. It found that developers rate the quality of bot-generated answers significantly worse when the identity of the software bot is made apparent; by contrast, answers from software bots with a human-like identity were better received. In practice, when software bots are used on platforms such as GitHub or Wikipedia, their username makes it clear that they are bots, e.g., DependaBot, RenovateBot, DatBot, SineBot.

Bots may be subject to special rules. For instance, the GitHub terms of service do not allow a `bot` but accept a `machine account`, where a `machine account` has two properties: (1) a human takes full responsibility for its actions, and (2) it cannot create other accounts.