User:GCSChris/sandbox


GOOGLE CROWDSOURCE

Google Crowdsource
Developer(s): Google
Initial release: August 29, 2016
Operating system: Android, web
Type: Crowdsourcing platform

Crowdsource, also known as Google Crowdsource, is a crowdsourcing platform developed by Google.

Crowdsource was released for Android on the Google Play store on August 29, 2016, and is also available on the web. Crowdsource offers users a variety of short tasks they can complete in order to improve many of Google's services. Such tasks include image label verification, sentiment evaluation, and translation validation. By completing these tasks, users provide Google with data it uses to improve services such as Google Maps and Google Translate, as well as Android.[1] As users complete tasks, they earn achievements, including stats, badges, and certificates, which track their progress.

Crowdsource was released quietly on the Google Play store, with no marketing from Google.[2] It received mixed reviews on release, with many reviewers noting that its lack of monetary rewards was strange, since similar apps often reward users with app store credit.[3][2][4]

Features

Crowdsource presents users with different types of tasks, each of which supplies Google with training data for its machine learning algorithms. In the app's description on Google Play, Google refers to these tasks as "microtasks" which typically should take "no more than 5-10 seconds".[5]

Tasks

Upon launch, the Crowdsource Android application presented users with five different tasks: image transcription, handwriting recognition, translation, translation validation, and map translation validation.[4]

The most recent version of the app includes a list of the following tasks in its description:

  • Translation: Help us translate phrases to different languages.
  • Translation validation: Select which phrases are translated correctly.
  • Maps Translation validation: Confirm that a business name is translated correctly.
  • Handwriting recognition: Look at the handwriting and type the text you see.
  • Sentiment evaluation: Evaluate a short sentence and decide whether its sentiment is positive, neutral or negative.
  • Landmarks: Confirm if landmark “x” is visible in the picture.
  • Image transcription: Type the text that is seen in images of street signs, etc.[5]

Translation-related tasks (translation and translation validation) are only shown to users who have selected more than one language they are fluent in.[5][3][2][4] While Maps Translation validation is no longer a task in the Crowdsource Android app, users can still complete translation and translation validation tasks.[5] Translation presents the user with a phrase in one of the languages they listed themselves as fluent in, and asks them to translate it into another language that they are fluent in.[5] Translation validation presents users with a list of translations submitted by other users, and asks them to mark each as correct or incorrect.[5] Both of these tasks help improve Google's translation capabilities, most notably in Google Translate, as well as any other Google app with translated content, including Google Maps.[1][6]

Image Transcription allows users to select a word from a list of topics, such as "Baseball", and then presents the user with a picture, asking the question "Can Baseball be one of the labels for this image?". Users can select "Yes" or "No", or "Skip" if they are unsure.[5][1] The data gained from this task is used to help read text within images for services like Google Street View.[1]

Handwriting Recognition asks users to read handwritten words and transcribe them into text. According to Google, completing this task helps improve Google Keyboard's handwriting feature.[1]

Sentiment Evaluation presents the user with various reviews and comments, and asks them to label the statement as "positive", "neutral", or "negative".[7] Alternatively, users can skip a question if they are unsure.[1] This evaluation by Crowdsource users helps with various recommendation-based technologies that Google uses on platforms like Google Maps, the Google Play Store, and YouTube.[1][7]

Achievements

Beyond the tasks themselves, the Crowdsource app has an achievements section that shows users the stats and badges they earn by completing different tasks in Crowdsource.[5][1][8][9]

Stats

When users contribute to Crowdsource by completing tasks, Crowdsource tracks their total number of contributions, as well as metrics like "upvotes", which shows how many of a user's answers are "in agreement with answers from the Crowdsource community",[8] and "accuracy", which shows the percentage of answers accepted as correct.[8]

Badges

As users complete tasks, they also receive badges. There are badges for each type of task, which track progress in that particular task (such as translation validation), as well as badges for other milestones, like completing a task while offline or completing a task given through a push notification.[9][5]

Updates

An April 2018 update to Crowdsource included a new "Image capture" task in which users can take photos, tag them, and upload them to Crowdsource.[2] Users can also choose to open-source their images, sharing them with researchers and developers outside of Google.[10] In an interview with Wired, Anurag Batra, a product manager at Google, said that the data gained from users completing this image capture task could improve Google's image search, camera apps, and Google Lens.[11]

Platforms

Crowdsource is also available as a web application. It offers many of the same tasks, such as image transcription, translation, and translation validation, and includes a page for users to view their achievements, much like the Android application. Unlike the Android version, the Crowdsource website includes a task for validating image captions, but does not have the handwriting recognition, sentiment evaluation, and image capture tasks available.[5][12] The Crowdsource website also offers a "Picture Quest Game".[12]

Reception

Many reviewers on release found the app's lack of monetary rewards strange, since Google has a similar app, Google Opinion Rewards, which offers Google Play store credit for completing short surveys.[4][2][3]

An August 2016 review from AndroidPIT noted that "this reliance on altruism [is] a little strange given that Google already has an app, Google Opinion Rewards, which has a financial incentive for user feedback. It works slightly differently, but I don't see why the same reward scheme could not be applied." The review also expressed concern with this model, arguing that users would be less likely to complete these tasks without more substantial rewards, writing that Crowdsource is "banking on the kind nature of its users" and asking, "How are users, who generally want everything free and without adverts, going to respond to this in the long-term? Why would they rate this app highly, when the results are so nebulous?"[4] A review from Wired shared similar concerns, writing "while Google is being open about its motivations, it will be difficult for users to know what difference their contributions make."[11]

In Crowdsource's FAQ, Google addresses the question "Will I get paid for my answers?" by answering, "No. Crowdsource is a community effort – we rely on the goodwill of community members to help improve the quality of services such as Google Maps, Google Translate, and others, so that everybody in the world can benefit".[13]

In an August 2016 review, CNET noted that Google's statement in Crowdsource's description, "Every time you use it, you know that you've made the internet a better place for your community", is not accurate, as Google does not offer free access to Google Maps and Google Translate data.[14] A review from TechCrunch also noted that Crowdsource is "solely focused on helping Google improve its own services" and compared it to Amazon Mechanical Turk, which focuses on tasks from third parties.[2]

Applications

An April 2018 interview in Wired stated that Google's machine learning algorithms work best in the United States and Western Europe, but are less effective in less prosperous countries. In this interview, Anurag Batra, a product manager at Google who leads the Crowdsource team, shared Google's motivations behind the Crowdsource app, stating that Google has a "very sparse training data set from parts of the world that are not the United States and Western Europe".[11] According to Wired, Google has a team that promotes the Crowdsource app at colleges in India and throughout Asia, and will likely expand to Latin America later in 2018.[11]

Google uses the answers provided by users of Crowdsource, and validates them by showing them anonymously to other Crowdsource users.[15] According to Google, once answers are validated, they are used to "train computer algorithms that run services such as Google Translate, Google Maps, Android Keyboard, and others."[15]

In a blog post on Local Guides Connect, Batra explains why Crowdsource is helpful to Google, detailing that the questions Crowdsource asks users are designed to collect better samples of data to feed its machine learning algorithms.[1]

A brief on CIO Dive stated that an "accurate data set is critical" to the success of new technologies such as voice assistants and autonomous vehicles.[16] The brief also notes that companies like Google and IBM are well positioned in the fields of artificial intelligence and machine learning due to the volume of data available to them to train and develop advanced AI.[16]

See also

Amazon Mechanical Turk

Google Opinion Rewards

References

  1. ^ a b c d e f g h i AnuragB (February 6, 2017). "Try the new Crowdsource app for Android". Local Guides Connect. Retrieved October 24, 2018.
  2. ^ a b c d e f Perez, Sarah (August 29, 2016). "Google's new app Crowdsource asks users to help with translation, transcription & more". TechCrunch. Retrieved October 23, 2018.
  3. ^ a b c Whitwam, Ryan (August 29, 2016). "New Google Crowdsource app asks you to help with translation and text transcription a few seconds at a time". Android Police. Retrieved October 23, 2018.
  4. ^ a b c d e Gordon, Scott (September 6, 2016). "Google Crowdsource: what you need to know". AndroidPIT. Retrieved October 23, 2018.
  5. ^ a b c d e f g h i j "Crowdsource 1.2.0.215240185 APK Download by Google LLC". APKMirror. October 3, 2018. Retrieved October 23, 2018.
  6. ^ "Translate words & phrases - Crowdsource Help". support.google.com. Retrieved October 26, 2018.
  7. ^ a b "Evaluate sentiment - Crowdsource Help". support.google.com. Retrieved October 26, 2018.
  8. ^ a b c "Achievements (Stats) - Crowdsource Help". support.google.com. Retrieved October 23, 2018.
  9. ^ a b "Achievements (Badges) - Crowdsource Help". support.google.com. Retrieved October 23, 2018.
  10. ^ "Choose & tag images - Crowdsource Help". support.google.com. Retrieved October 23, 2018.
  11. ^ a b c d Simonite, Tom (April 5, 2018). "Google Turns to Users to Improve Its AI Chops Outside the US". WIRED. Retrieved October 23, 2018.
  12. ^ a b "crowdsource.google.com". crowdsource.google.com. Retrieved October 26, 2018.
  13. ^ "FAQ - Crowdsource Help". support.google.com. Retrieved October 23, 2018.
  14. ^ Shankland, Stephen (August 29, 2016). "New Crowdsource app lets you work for Google for free". CNET. Retrieved October 23, 2018.
  15. ^ a b "How does Google use my answers? - Crowdsource Help". support.google.com. Retrieved October 23, 2018.
  16. ^ a b Hickey, Alex (April 6, 2018). "Google counters western bias with global image 'crowdsourcing'". CIO Dive. Retrieved October 24, 2018.


Article evaluation

The Google Keep article nicely lays out the top-level aspects of the Google Keep phone app in its introduction. It goes through the key points: what it is, who developed it, where it is available, and when it launched. It also briefly touches on the reception of the app, including positive and negative feedback. Overall, this gives me a great overview of the app. The intro also nicely mirrors the format of the remainder of the article; the article is broken into three major sections: Features, Platforms, and Reception, which organizes the ideas presented in the introduction.

For most Wikipedia articles, I feel that the first two paragraphs are most important. If the reader is confused about the topic after reading the intro, the article can almost certainly be improved. Especially regarding articles about phone apps, there should be no fluff, and I think this intro does that well. There is nothing that distracts me from the main topic.

The article’s information is relatively up to date as well. It focuses on the years 2013, when the app launched, and 2016, when the app received a substantial update. One thing I would add is another subsection for any 2018 reviews. I would also likely expand the Platforms section with more details regarding the iOS version of the app, and perhaps write briefly about the main differences, if any, between the two versions of the app.

Overall, the article’s tone is neutral, even when incorporating reviews into the article. The author uses quotes and simply mentions the review, not analyzing it themselves. All quotes and reviews as well as other facts about the app are well sourced. Some facts about the app are cited with an APKMirror page for the app.

The talk page is not very active. The article is rated in the following ways for the corresponding WikiProjects:

  • WikiProject Google: B-Class quality, Low-Importance
  • WikiProject Computing: Start-Class quality, Low-Importance
  • WikiProject Apps: Start-Class quality, no importance rating

There is one post about edits that caused a dispute regarding a neutral point-of-view, but there is no follow up.

Article Selection

Most of the articles I am interested in improving are stubs:

  • Mike O’Brien (Stub)
    • This article, as a stub, is very brief. It does a good job of introducing who Mike O’Brien is, but only offers two sources to back up all of the information presented in the stub. Based on the introduction, there is much more to mention than just a list of the games he has worked on. In its current state, I would argue that the article should present the games O’Brien has worked on as a bullet-point list.
    • After adding more information, I would split up the article using headers like “Games”, “Career”, and/or “Personal Life” depending on the amount of additional information available.
  • Split screen (computer graphics) (Stub)
    • The first, glaring issue with this article is that it has no citations. However, I do think it is organized in a nice way. It details the advantages and disadvantages of split screen in video games from a user’s perspective, focusing mostly on the game design aspect of split screen. Of course, there are no citations to back this information up, but I think this section could be interesting and useful to readers if referenced properly.
    • The current article includes biased claims, such as split screen being “remarkably popular on consoles”, with no reference. I would overhaul this article, adding citations to the relevant points that the original author included. Beyond that, the main change I would make would be to expand the technical, computer-graphics perspective of the article, especially considering that the article is currently titled “Split screen (computer graphics)”. I would offer an overview of how split screen is implemented, linking internally to any relevant articles such as vector graphics.
  • Google Crowdsource (New Article)
    • I found that there are red links to Google Crowdsource from various other pages such as the page for Google Translate.
    • This app is currently gaining popularity after being released a couple of years ago, so I think it is a relevant topic to write about. I also found that it is listed on the Google WikiProject page as a topic that needs an article.

Article Citation

Translation-related tasks (translation and translation validation) are only shown to users who have selected more than one language they are fluent in.[1][2][3][4] While Maps Translation validation is no longer a task in the Crowdsource Android app, users can still complete translation and translation validation tasks.[1]

DYK Hook

Did you know that Google collects information from users around the world using Google Crowdsource to improve its services like Google Translate and Google Maps?

Note: I composed my “hook”, but did not submit it to the DYK section because I moved my article into mainspace too long before completing this WikiEd module (articles must have been created or expanded five-fold no more than 7 days prior to nomination).

Peer Review Links


  1. ^ a b "Crowdsource 1.2.0.215240185 APK Download by Google LLC". APKMirror. October 3, 2018. Retrieved October 23, 2018.
  2. ^ Whitwam, Ryan (August 29, 2016). "New Google Crowdsource app asks you to help with translation and text transcription a few seconds at a time". Android Police. Retrieved October 23, 2018.
  3. ^ Perez, Sarah (August 29, 2016). "Google's new app Crowdsource asks users to help with translation, transcription & more". TechCrunch. Retrieved October 23, 2018.
  4. ^ Gordon, Scott (September 6, 2016). "Google Crowdsource: what you need to know". AndroidPIT. Retrieved October 23, 2018.