User:GCSChris/sandbox

Google Crowdsource

Crowdsource, also known as Google Crowdsource, is a crowdsourcing platform developed by Google.

Crowdsource was released for Android on the Google Play store on August 29, 2016, and is also available on the web. Crowdsource offers users a variety of short tasks they can complete in order to improve many of Google's services. Such tasks include image label verification, sentiment evaluation, and translation validation. By completing these tasks, users provide Google with data it uses to improve services such as Google Maps and Google Translate, as well as Android. As users complete tasks, they earn achievements, including stats, badges, and certificates, which track their progress.

Crowdsource was released quietly on the Google Play store, with no marketing from Google. It received mixed reviews on release, with many reviewers finding its lack of monetary rewards strange, as similar apps often reward users with app store credit.

Features
Crowdsource presents users with different types of tasks, each of which supplies Google with training data for its machine learning algorithms. In the app's description on Google Play, Google refers to these tasks as "microtasks" which typically should take "no more than 5-10 seconds".

Tasks
At launch, the Crowdsource Android application presented users with five tasks: image transcription, handwriting recognition, translation, translation validation, and map translation validation.

The most recent version of the app includes a list of the following tasks in its description:


 * Translation: Help us translate phrases to different languages.
 * Translation validation: Select which phrases are translated correctly.
 * Maps Translation validation: Confirm that a business name is translated correctly.
 * Handwriting recognition: Look at the handwriting and type the text you see.
 * Sentiment evaluation: Evaluate a short sentence and decide if its sentiment is positive, neutral or negative.
 * Landmarks: Confirm if landmark “x” is visible in the picture.
 * Image transcription: Type the text that is seen in images of street signs, etc.

Translation-related tasks (translation and translation validation) are only shown to users who have selected more than one language they are fluent in. While Maps Translation validation is no longer a task in the Crowdsource Android app, users can still complete translation and translation validation tasks. Translation presents the user with a phrase in one of the languages they have listed as fluent, and asks them to translate it into another language they are fluent in. Translation validation presents users with a list of translations submitted by other users, and asks them to mark each as correct or incorrect. Both of these tasks help improve Google's translation capabilities, most notably in Google Translate, as well as any other Google app with translated content, including Google Maps.

Image Transcription allows users to select a topic from a list, such as "Baseball", and then presents the user with a picture, asking the question "Can Baseball be one of the labels for this image?". Users can select "Yes" or "No", or "Skip" if they are unsure. The data gained from this task helps Google read text within images for services like Google Street View.

Handwriting Recognition relies on users to read handwritten words and transcribe them to text. According to Google, completing this task helps improve Google Keyboard's handwriting feature.

Sentiment Evaluation presents the user with various reviews and comments, and asks them to label each statement as "positive", "neutral", or "negative". Users can skip a question if they are unsure. These evaluations help with various recommendation-based technologies that Google uses on platforms like Google Maps, the Google Play Store, and YouTube.

Achievements
In addition to its tasks, the Crowdsource app has an achievements section that shows the stats and badges users earn by completing different tasks.

Stats
When users contribute to Crowdsource by completing tasks, Crowdsource tracks their total number of contributions, as well as metrics like "upvotes" which show how many of a user's answers are "in agreement with answers from the Crowdsource community," and "accuracy" which shows the percentage of answers accepted as correct.
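As an illustration only (the data shapes and names here are assumptions, not Google's actual implementation), the stats described above could be derived from a user's answer history roughly as follows, where each record notes whether the answer agreed with the community and whether it was accepted as correct:

```python
# Illustrative sketch; field names and record format are assumptions,
# not taken from Google's Crowdsource implementation.

def compute_stats(answers):
    """Compute contribution stats from a list of answer records.

    Each record is a dict with boolean fields:
      - "agrees": answer matched the community consensus (an "upvote")
      - "accepted": answer was ultimately accepted as correct
    """
    total = len(answers)
    upvotes = sum(1 for a in answers if a["agrees"])
    accepted = sum(1 for a in answers if a["accepted"])
    # "Accuracy" as a percentage of answers accepted as correct.
    accuracy = (accepted / total * 100) if total else 0.0
    return {"contributions": total, "upvotes": upvotes, "accuracy": accuracy}

history = [
    {"agrees": True, "accepted": True},
    {"agrees": True, "accepted": False},
    {"agrees": False, "accepted": True},
    {"agrees": True, "accepted": True},
]
print(compute_stats(history))  # {'contributions': 4, 'upvotes': 3, 'accuracy': 75.0}
```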

Badges
As users complete tasks, they also receive badges. There are badges for each type of task, which track progress along that particular task (such as translation validation), as well as badges for other milestones, like completing a task while offline or completing a task given through a push notification.

Updates
An April 2018 update to Crowdsource added a new "Image capture" task in which users can take photos, tag them, and upload them to Crowdsource. Users can also choose to open-source their images, sharing them with researchers and developers outside Google. In an interview with Wired, Anurag Batra, a product manager at Google, said that the data gained from this image capture task could improve Google's image search, camera apps, and Google Lens.

Platforms
Crowdsource is also available as a web application. It offers many of the same tasks, such as image transcription, translation, and translation validation, and, like the Android application, includes a page for users to view their achievements. Unlike the Android version, the Crowdsource website includes a task for validating image captions, but lacks the handwriting recognition, sentiment evaluation, and image capture tasks. The website also offers a "Picture Quest Game".

Reception
Many reviewers at release found the app's lack of monetary rewards strange, because Google has a similar app, Google Opinion Rewards, which offers Google Play store credit for completing short surveys.

An August 2016 review from Android Pit noted that "this reliance on altruism [is] a little strange given that Google already has an app, Google Opinion Rewards, which has a financial incentive for user feedback. It works slightly differently, but I don't see why the same reward scheme could not be applied." The review also expressed concern with this model, arguing that users would be less likely to complete these tasks without more substantial rewards, writing that Crowdsource is "banking on the kind nature of its users," and asking, "How are users, who generally want everything free and without adverts, going to respond to this in the long-term? Why would they rate this app highly, when the results are so nebulous?" A review from Wired shared similar concerns, writing that "while Google is being open about its motivations, it will be difficult for users to know what difference their contributions make."

In Crowdsource's FAQ, Google addresses the question "Will I get paid for my answers?", answering, "No. Crowdsource is a community effort – we rely on the goodwill of community members to help improve the quality of services such as Google Maps, Google Translate, and others, so that everybody in the world can benefit".

In an August 2016 review, CNET noted that Google's claim in Crowdsource's description, "Every time you use it, you know that you've made the internet a better place for your community," is not accurate, as Google does not offer free access to Google Maps and Google Translate data. A review from TechCrunch also noted that Crowdsource is "solely focused on helping Google improve its own services," and compared it to Amazon Mechanical Turk, which handles tasks from third parties.

Applications
An April 2018 interview in Wired stated that Google's machine learning algorithms work best in the United States and Western Europe, but are less effective in less prosperous countries. In this interview Anurag Batra, a product manager at Google who leads the Crowdsource team, shared Google's motivations behind the Crowdsource app, stating that Google has a "very sparse training data set from parts of the world that are not the United States and Western Europe." According to Wired, Google has a team that promotes the Crowdsource app at colleges in India and throughout Asia, and will likely expand to Latin America later in 2018.

Google validates the answers provided by users of Crowdsource by showing them anonymously to other Crowdsource users. According to Google, once answers are validated, they are used to "train computer algorithms that run services such as Google Translate, Google Maps, Android Keyboard, and others."
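This anonymous cross-validation amounts to majority-vote label aggregation. A minimal sketch of how such validation might work (purely illustrative; the vote threshold and agreement ratio are assumptions, not Google's published parameters):

```python
# Illustrative majority-vote aggregation; not Google's actual pipeline.
from collections import Counter

def validate_answer(item_answers, min_votes=3, agreement=0.7):
    """Accept a crowd answer for one item when enough validators agree.

    item_answers: answers submitted by different users for the same item.
    Returns the consensus answer, or None if agreement is not yet strong
    enough for the answer to be used as training data.
    """
    if len(item_answers) < min_votes:
        return None  # not enough validators yet
    answer, count = Counter(item_answers).most_common(1)[0]
    if count / len(item_answers) >= agreement:
        return answer  # validated: usable as a training label
    return None

print(validate_answer(["cat", "cat", "cat", "dog"]))  # "cat" (75% agreement)
print(validate_answer(["cat", "dog"]))                # None (too few votes)
```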

In a blog post on Local Guides Connect, Batra explains why Crowdsource is helpful to Google, detailing that the questions Crowdsource asks users are designed to collect better samples of data to feed its machine learning algorithms.

A brief on CIO Dive stated that an "accurate data set is critical" to the success of new technologies such as voice assistants and autonomous vehicles. The brief also noted that companies like Google and IBM are well positioned in artificial intelligence and machine learning due to the volume of data available to them for training and developing advanced AI.

Article evaluation
The Google Keep article nicely lays out the top-level aspects of the Google Keep phone app in its introduction. It covers the key points: what the app is, who developed it, where it is available, and when it launched. It also briefly touches on the app's reception, including positive and negative feedback. Overall this gives me a great overview of the app. The intro also mirrors the structure of the rest of the article, which is broken into three major sections: Features, Platforms, and Reception, organizing the ideas presented in the introduction.

For most Wikipedia articles, I feel that the first two paragraphs are most important. If the reader is confused about the topic after reading the intro, the article can almost certainly be improved. Especially regarding articles about phone apps, there should be no fluff, and I think this intro does that well. There is nothing that distracts me from the main topic.

The article’s information is relatively up to date as well. It focuses on the years 2013, when the app launched, and 2016, when the app received a substantial update. One thing I would add is another subsection for any 2018 reviews. I would also likely expand the platforms section with more details regarding the iOS version of the app, and perhaps write briefly about the main differences, if any, between the two versions of the apps.

Overall, the article's tone is neutral, even when incorporating reviews. The author quotes each review and simply reports it without adding their own analysis. All quotes and reviews, as well as other facts about the app, are well sourced; some facts are cited to an APKMirror page for the app.

The talk page is not very active. The article is rated in the following ways for the corresponding WikiProjects:


 * WikiProject Google: B-Class quality, Low-Importance
 * WikiProject Computing: Start-Class quality, Low-Importance
 * WikiProject Apps: Start-Class quality, no importance rating

There is one post about edits that caused a dispute regarding a neutral point-of-view, but there is no follow up.

Article Selection
Most of the articles I am interested in improving are stubs:


 * Mike O’Brien (Stub)
 * This article, as a stub, is very brief. It does a good job of introducing who Mike O’Brien is, but offers only two sources to back up all of the information presented. Based on the introduction, there is much more to mention than just a list of the games he has worked on. In its current state, I would argue that the article should present the games O’Brien has worked on as a bullet-point list.
 * After adding more information, I would split up the article using headers like “Games”, “Career”, and/or “Personal Life” depending on the amount of additional information available.
 * Split screen (computer graphics) (Stub)
 * The first, glaring issue with this article is that it has no citations. However, I do think it is organized in a nice way. It details the advantages and disadvantages of split screen in video games from a user’s perspective, focusing mostly on the game design aspect of split screen. There are no citations to back this information up, but I think this section could be interesting and useful to readers if referenced properly.
 * The current article includes biased claims, such as split screen being “remarkably popular on consoles,” with no reference. I would overhaul this article, adding citations to the relevant points the original author included. Beyond that, the main change I would make is expanding the technical, computer-graphics perspective of the article, especially since it is currently titled “Split screen (computer graphics)”. I would offer an overview of how split screen is implemented, linking internally to any relevant articles such as vector graphics.
 * Google Crowdsource (New Article)
 * I found that there are red links to Google Crowdsource from various other pages such as the page for Google Translate.
 * This app is currently gaining popularity after being released a couple of years ago, so I think it is perfectly relevant to write about. I also found that it is listed on the Google WikiProject page as a topic that needs an article.

Article Citation
Translation-related tasks (translation and translation validation) are only shown to users who have selected more than one language they are fluent in. While Maps Translation validation is no longer a task in the Crowdsource Android app, users can still complete translation and translation validation tasks.

DYK Hook
Did you know that Google collects information from users around the world using Google Crowdsource to improve its services like Google Translate and Google Maps?

Note: I composed my “hook”, but did not submit it to the DYK section because I moved my article into mainspace too long before completing this WikiEd module (articles must have been created or expanded five-fold no more than seven days prior).

Peer Review Links

 * User talk:Elenadob/sandbox
 * User talk:Ethan McCue/sandbox