Video-sharing platform owned by Google
From Wikipedia, the free encyclopedia
YouTube is an online video-sharing platform owned by Google. YouTube was founded on February 14, 2005, by Steve Chen, Chad Hurley, and Jawed Karim, three former employees of PayPal. Headquartered in San Bruno, California, United States, it is the second-most-visited website in the world, after Google Search. In January 2024, YouTube had more than 2.7 billion monthly active users, who collectively watched more than one billion hours of videos every day.[7] As of May 2019, videos were being uploaded to the platform at a rate of more than 500 hours of content per minute,[8][9] and as of 2021, there were approximately 14 billion videos in total.[9]
| Type of business | Subsidiary |
| --- | --- |
| Type of site | Online video platform |
| Founded | February 14, 2005 |
| Headquarters | 901 Cherry Avenue, San Bruno, California, United States |
| Area served | Worldwide (excluding blocked countries) |
| Owner | Google LLC |
| Founder(s) | Chad Hurley, Steve Chen, Jawed Karim |
| Key people | Neal Mohan (CEO) |
| Revenue | US$31.5 billion (2023)[1] |
| Parent | Google LLC (2006–present) |
| URL | youtube.com (see list of localized domain names) |
| Advertising | Google AdSense |
| Registration | Optional |
| Users | 2.7 billion MAU (January 2024)[2] |
| Launched | December 15, 2005 |
| Current status | Active |
| Content license | Uploader holds copyright (standard license); Creative Commons can be selected |
| Written in | Python (core/API),[3] C (through CPython), C++, Java (through Guice platform),[4][5] Go,[6] JavaScript (UI) |
On October 9, 2006, YouTube was purchased by Google for $1.65 billion (equivalent to $2.31 billion in 2023).[10] Google expanded YouTube's business model beyond generating revenue from advertisements alone to offering paid content, such as movies and exclusive content produced by and for YouTube. It also offers YouTube Premium, a paid subscription option for watching content without ads. YouTube incorporated Google's AdSense program, generating more revenue for both YouTube and approved content creators. In 2023, YouTube's advertising revenue totaled $31.7 billion, a 2% increase from the $31.1 billion reported in 2022.[11] From Q4 2023 to Q3 2024, YouTube's combined revenue from advertising and subscriptions exceeded $50 billion.[12]
Since its purchase by Google, YouTube has expanded beyond the core website into mobile apps, network television, and links with other platforms. Video categories on YouTube include music videos, video clips, news, short and feature films, songs, documentaries, movie trailers, teasers and TV spots, live streams, vlogs, and more. Most content is generated by individuals, including collaborations between "YouTubers" and corporate sponsors. Established media, news, and entertainment corporations have also created YouTube channels and expanded their visibility on the platform in order to reach greater audiences.
YouTube has had unprecedented social impact, influencing popular culture, shaping internet trends, and creating multimillionaire celebrities. Despite its growth and success, the platform has been criticized for facilitating the spread of misinformation and copyrighted content, routinely violating its users' privacy, engaging in excessive censorship, endangering children's safety and well-being, and implementing its platform guidelines inconsistently or incorrectly.
YouTube was founded by Steve Chen, Chad Hurley, and Jawed Karim. The trio were early employees of PayPal and were left enriched after the company was bought by eBay.[13] Hurley had studied design at the Indiana University of Pennsylvania, while Chen and Karim studied computer science together at the University of Illinois Urbana-Champaign.[14]
According to a story that has often been repeated in the media, Hurley and Chen developed the idea for YouTube during the early months of 2005, after they had experienced difficulty sharing videos that had been shot at a dinner party at Chen's apartment in San Francisco. Karim did not attend the party and denied that it had occurred, but Chen remarked that the idea that YouTube was founded after a dinner party "was probably very strengthened by marketing ideas around creating a story that was very digestible".[15]
Karim said the inspiration for YouTube came from the Super Bowl XXXVIII halftime show controversy, when Janet Jackson's breast was briefly exposed by Justin Timberlake during the halftime show. Karim could not easily find video clips of the incident and the 2004 Indian Ocean Tsunami online, which led to the idea of a video-sharing site.[16][17] Hurley and Chen said that the original idea for YouTube was a video version of an online dating service and had been influenced by the website Hot or Not.[15][18] They created posts on Craigslist asking attractive women to upload videos of themselves to YouTube in exchange for a $100 reward.[19] Difficulty in finding enough dating videos led to a change of plans, with the site's founders deciding to accept uploads of any video.[20]
YouTube began as a venture capital–funded technology startup. Between November 2005 and April 2006, the company raised money from various investors, with Sequoia Capital and Artis Capital Management being the largest two.[13][21] YouTube's early headquarters were situated above a pizzeria and a Japanese restaurant in San Mateo, California.[22] In February 2005, the company activated www.youtube.com.[23] The first video was uploaded on April 23, 2005. Titled "Me at the zoo", it shows co-founder Jawed Karim at the San Diego Zoo and can still be viewed on the site.[24][25] The same day, the company launched a public beta, and by November, a Nike ad featuring Ronaldinho became the first video to reach one million total views.[26][27] The site launched officially on December 15, 2005, by which time it was receiving 8 million views a day.[28][29] Clips at the time were limited to 100 megabytes, as little as 30 seconds of footage.[30]
YouTube was not the first video-sharing site on the Internet; Vimeo was launched in November 2004, though that site remained a side project of its developers from CollegeHumor.[31] The week of YouTube's launch, NBCUniversal's Saturday Night Live ran the skit "Lazy Sunday" by The Lonely Island. Besides helping to bolster ratings and long-term viewership for Saturday Night Live, "Lazy Sunday"'s status as an early viral video helped establish YouTube as an important website.[32] Unofficial uploads of the skit to YouTube drew more than five million collective views by February 2006, before being removed two months later at NBCUniversal's request over copyright concerns.[33] Despite eventually being taken down, these duplicate uploads of the skit helped popularize YouTube and led to the upload of more third-party content.[34][35] The site grew rapidly; in July 2006, the company announced that more than 65,000 new videos were being uploaded every day and that the site was receiving 100 million video views per day.[36]
The choice of the name www.youtube.com led to problems for a similarly named website, www.utube.com. That site's owner, Universal Tube & Rollform Equipment, filed a lawsuit against YouTube in November 2006, after being regularly overloaded by people looking for YouTube. Universal Tube subsequently changed its website to www.utubeonline.com.[37][38]
On October 9, 2006, Google announced that it had acquired YouTube for $1.65 billion in Google stock.[39][40] The deal was finalized on November 13, 2006.[41][42] Google's acquisition sparked newfound interest in video-sharing sites; IAC, which now owned Vimeo, focused on supporting content creators to distinguish itself from YouTube.[31] Around this time, YouTube adopted the slogan "Broadcast Yourself". The company experienced rapid growth. The Daily Telegraph wrote that in 2007, YouTube consumed as much bandwidth as the entire Internet in 2000.[43] By 2010, the company had reached a market share of around 43% and more than 14 billion video views, according to comScore.[44] That year, the company simplified its interface to increase the time users would spend on the site.[45] In 2011, more than three billion videos were being watched each day, with 48 hours of new videos uploaded every minute.[46][47][48] However, most of these views came from a relatively small number of videos; according to a software engineer at the time, 30% of videos accounted for 99% of views on the site.[49] That year, the company again changed its interface and, at the same time, introduced a new logo with a darker shade of red.[50][51] A subsequent interface change, designed to unify the experience across desktop, TV, and mobile, was rolled out in 2013.[52] By that point, more than 100 hours of video were being uploaded every minute, increasing to 300 hours by November 2014.[53][54]
During this time, the company also went through some organizational changes. In October 2006, YouTube moved to a new office in San Bruno, California.[55] Hurley announced that he would be stepping down as chief executive officer of YouTube to take an advisory role and that Salar Kamangar would take over as head of the company in October 2010.[56]
In December 2009, YouTube partnered with Vevo.[57] In April 2010, Lady Gaga's "Bad Romance" became the most viewed video, becoming the first video to reach 200 million views on May 9, 2010.[58]
YouTube faced a major lawsuit by Viacom International in 2011 that nearly resulted in the discontinuation of the website. The lawsuit was filed as a result of alleged copyright infringement of Viacom's material by YouTube. However, the United States Court of Appeals for the Second Circuit ruled that YouTube was not liable, and thus YouTube won the case in 2012.[59]
Susan Wojcicki was appointed CEO of YouTube in February 2014.[60] In January 2016, YouTube expanded its headquarters in San Bruno by purchasing an office park for $215 million. The complex has 51,468 square metres (554,000 square feet) of space and can house up to 2,800 employees.[61] In August 2017, YouTube officially launched the "Polymer" redesign of its user interface, based on the Material Design language, as the default, along with a redesigned logo built around the service's play button emblem.[62]
Through this period, YouTube tried several new ways to generate revenue beyond advertisements. In 2013, YouTube launched a pilot program for content providers to offer premium, subscription-based channels.[63][64] This effort was discontinued in January 2018 and relaunched in June, with US$4.99 channel subscriptions.[65][66] These channel subscriptions complemented the existing Super Chat ability, launched in 2017, which allows viewers to donate between $1 and $500 to have their comment highlighted.[67] In 2014, YouTube announced a subscription service known as "Music Key", which bundled ad-free streaming of music content on YouTube with the existing Google Play Music service.[68] The service continued to evolve in 2015 when YouTube announced YouTube Red, a new premium service that would offer ad-free access to all content on the platform (succeeding the Music Key service released the previous year), premium original series, and films produced by YouTube personalities, as well as background playback of content on mobile devices. YouTube also released YouTube Music, a third app oriented towards streaming and discovering the music content hosted on the YouTube platform.[69][70][71]
The company also attempted to create products appealing to specific viewers. YouTube released a mobile app known as YouTube Kids in 2015, designed to provide an experience optimized for children. It features a simplified user interface, curated selections of channels featuring age-appropriate content, and parental control features.[72] Also in 2015, YouTube launched YouTube Gaming—a video gaming-oriented vertical and app for videos and live streaming, intended to compete with the Amazon.com-owned Twitch.[73]
The company was attacked on April 3, 2018, when a shooting occurred at YouTube's headquarters in San Bruno, California, which wounded four and resulted in the death of the shooter.[74]
By February 2017, one billion hours of YouTube videos were being watched every day, and 400 hours' worth of videos were uploaded every minute.[7][75] Two years later, uploads had risen to more than 500 hours per minute.[8] During the COVID-19 pandemic, when most of the world was under stay-at-home orders, usage of services like YouTube increased significantly. One data firm[which?] estimated that YouTube was accounting for 15% of all internet traffic, twice its pre-pandemic level.[76] After EU officials requested that such services reduce bandwidth to ensure medical entities had sufficient capacity to share information, YouTube and Netflix stated they would reduce streaming quality for at least thirty days, cutting their services' bandwidth use by 25% to comply with the EU's request.[77] YouTube later announced that it would continue with this move worldwide: "We continue to work closely with governments and network operators around the globe to do our part to minimize stress on the system during this unprecedented situation."[78]
Following a 2018 complaint alleging violations of the Children's Online Privacy Protection Act (COPPA),[79] the company was fined $170 million by the FTC for collecting personal information from minors under the age of 13.[80] YouTube was also ordered to create systems to increase children's privacy.[81][82] Following criticisms of its implementation of those systems, YouTube began treating all videos designated as "made for kids" as subject to COPPA on January 6, 2020.[83][84] Joining the YouTube Kids app, the company created a supervised mode, designed more for tweens, in 2021.[85] Additionally, to compete with TikTok, YouTube released YouTube Shorts, a short-form video platform.[86]
During this period, YouTube entered disputes with other tech companies. For over a year, in 2018 and 2019, no YouTube app was available for Amazon Fire products.[87] In 2020, Roku removed the YouTube TV app from its streaming store after the two companies were unable to reach an agreement.[88]
After testing earlier in 2021, YouTube removed public display of dislike counts on videos in November 2021. The company said that, based on its internal research, users often used the dislike feature as a form of cyberbullying and brigading.[89] While some users praised the move as a way to discourage trolls, others felt that hiding dislikes would make it harder for viewers to recognize clickbait or unhelpful videos, and that other features already existed for creators to limit bullying. YouTube co-founder Jawed Karim called the update "a stupid idea" and said that the real reason behind the change was "not a good one, and not one that will be publicly disclosed." He felt that users' ability on a social platform to identify harmful content was essential, saying, "The process works, and there's a name for it: the wisdom of the crowds. The process breaks when the platform interferes with it. Then, the platform invariably declines."[90][91][92] Shortly after the announcement, software developer Dmitry Selivanov created Return YouTube Dislike, an open-source, third-party browser extension for Chrome and Firefox that allows users to see a video's number of dislikes.[93] In a letter published on January 25, 2022, then-YouTube CEO Susan Wojcicki acknowledged that removing public dislike counts was a controversial decision but reiterated that she stood by it, claiming that "it reduced dislike attacks."[94]
In 2022, YouTube launched an experiment where the company would show users who watched longer videos on TVs a long chain of short un-skippable adverts, intending to consolidate all ads into the beginning of a video. Following public outrage over the unprecedented amount of un-skippable ads, YouTube "ended" the experiment on September 19 of that year.[95] In October, YouTube announced that they would be rolling out customizable user handles in addition to channel names, which would also become channel URLs.[96]
On February 16, 2023, Wojcicki announced that she would step down as CEO, with Neal Mohan named as her successor. Wojcicki took on an advisory role for Google and parent company Alphabet.[97] Wojcicki died a year and a half later, on August 9, 2024.[98]
In late October 2023, YouTube began cracking down on the use of ad blockers on the platform. Users of ad blockers may be shown a pop-up warning stating "Video player will be blocked after 3 videos", along with a message asking them to allow ads or inviting them to subscribe to the ad-free YouTube Premium subscription plan. YouTube says that the use of ad blockers violates its terms of service.[99][100]
In April 2024, YouTube announced it would be "strengthening our enforcement on third-party apps that violate YouTube's Terms of Service, specifically ad-blocking apps".[101]
YouTube has been led by a CEO since its founding in 2005, beginning with Chad Hurley, who led the company until 2010. After Google's acquisition of YouTube, the CEO role was retained. Salar Kamangar took over Hurley's position and kept the job until 2014. He was replaced by Susan Wojcicki, who later resigned in 2023.[97] The current CEO is Neal Mohan, who was appointed on February 16, 2023.[97]
YouTube offers different features based on user verification. Standard or basic features, such as uploading videos, creating playlists, and using YouTube Music, carry limits based on daily activity; verification via phone number or channel history increases feature availability and daily usage limits. Intermediate or additional features include longer videos (over 15 minutes), live streaming, custom thumbnails, and creating podcasts. Advanced features include Content ID appeals, embedding live streams, applying for monetization, clickable links, adding chapters, and pinning comments on videos or posts.[102]
In January 2012, it was estimated that visitors to YouTube spent an average of 15 minutes a day on the site, in contrast to the four or five hours a day spent by a typical US citizen watching television.[103] In 2017, viewers on average watched YouTube on mobile devices for more than an hour every day.[104]
In December 2012, two billion views were removed from the view counts of Universal and Sony music videos on YouTube, prompting a claim by The Daily Dot that the views had been deleted due to a violation of the site's terms of service, which ban the use of automated processes to inflate view counts. This was disputed by Billboard, which said that the two billion views had been moved to Vevo, since the videos were no longer active on YouTube.[105][106] On August 5, 2015, YouTube removed the formerly notorious behavior in which a video's view count froze at "301" (later "301+") until the actual count was verified, a measure intended to prevent view-count fraud.[107] View counts have since updated in real time.[108]
Since September 2019, subscriber counts have been abbreviated: only the three leading digits of a channel's subscriber count are indicated publicly, compromising the function of third-party real-time indicators such as that of Social Blade. Exact counts remain available to channel operators inside YouTube Studio.[109]
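The three-leading-digit rule can be illustrated with a short sketch. This is an illustrative reconstruction, not YouTube's actual code; the function name and the choice to truncate rather than round are assumptions.

```python
# Illustrative sketch only: showing just the three leading (significant)
# digits of a subscriber count, as the public counter has done since 2019.
# Truncation (rather than rounding) is an assumption here.
def abbreviate(count: int) -> str:
    for threshold, suffix in ((10**9, "B"), (10**6, "M"), (10**3, "K")):
        if count >= threshold:
            value = count / threshold
            if value >= 100:    # e.g. 123M
                return f"{int(value)}{suffix}"
            elif value >= 10:   # e.g. 12.3M
                return f"{int(value * 10) / 10:.1f}{suffix}"
            else:               # e.g. 2.53K
                return f"{int(value * 100) / 100:.2f}{suffix}"
    return str(count)  # counts under 1,000 are shown exactly

print(abbreviate(12_345_678))  # 12.3M
print(abbreviate(2_534))       # 2.53K
```

Under this rule, a channel gaining its 12,345,679th subscriber shows no public change, which is why third-party real-time counters lost precision.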
On November 11, 2021, after testing the change in March of that year, YouTube announced it would start hiding dislike counts on videos, making them invisible to viewers. The company stated the decision was a response to experiments which confirmed that smaller YouTube creators were more likely to be targeted by dislike brigading and harassment. Creators can still see the number of likes and dislikes in the YouTube Studio dashboard tool, according to YouTube.[110][111][112]
YouTube has faced numerous challenges and criticisms in its attempts to deal with copyright, including the site's first viral video, "Lazy Sunday", which had to be taken down due to copyright concerns.[32] At the time of uploading a video, YouTube users are shown a message asking them not to violate copyright laws.[113] Despite this advice, many unauthorized clips of copyrighted material remain on YouTube. YouTube does not review videos before they are posted online, and it is left to copyright holders to issue a DMCA takedown notice pursuant to the terms of the Online Copyright Infringement Liability Limitation Act. Any successful complaint about copyright infringement results in a YouTube copyright strike. Three successful complaints for copyright infringement against a user account result in the account and all of its uploaded videos being deleted.[114][115] From 2007 to 2009, organizations including Viacom, Mediaset, and the English Premier League filed lawsuits against YouTube, claiming that it had done too little to prevent the uploading of copyrighted material.[116][117][118]
In August 2008, a US court ruled in Lenz v. Universal Music Corp. that copyright holders cannot order the removal of an online file without first determining whether the posting reflected fair use of the material.[119] YouTube's owner Google announced in November 2015 that they would help cover the legal cost in select cases where they believe fair use defenses apply.[120]
In the 2011 case of Smith v. Summit Entertainment LLC, professional singer Matt Smith sued Summit Entertainment for the wrongful use of copyright takedown notices on YouTube.[121] He asserted seven causes of action, and four were ruled in Smith's favor.[122] In April 2012, a court in Hamburg ruled that YouTube could be held responsible for copyrighted material posted by its users.[123] On November 1, 2016, the dispute with GEMA was resolved, with Google content ID being used to allow advertisements to be added to videos with content protected by GEMA.[124]
In April 2013, it was reported that Universal Music Group and YouTube have a contractual agreement that prevents content blocked on YouTube by a request from UMG from being restored, even if the uploader of the video files a DMCA counter-notice.[125][126] As part of YouTube Music, Universal and YouTube signed an agreement in 2017, followed by separate agreements with other major labels, which gave the company the right to advertising revenue when its music was played on YouTube.[127] By 2019, creators were having videos taken down or demonetized when Content ID identified even short segments of copyrighted music within a much longer video, with different levels of enforcement depending on the record label.[128] Experts noted that some of these clips would likely qualify as fair use.[128]
In June 2007, YouTube began trials of a system for the automatic detection of uploaded videos that infringe copyright. Google CEO Eric Schmidt regarded this system as necessary for resolving lawsuits such as the one from Viacom, which alleged that YouTube profited from content that it did not have the right to distribute.[129] The system, which was initially called "Video Identification"[130][131] and later became known as Content ID,[132] creates an ID file for copyrighted audio and video material and stores it in a database. When a video is uploaded, it is checked against the database, and the system flags the video as a copyright violation if a match is found.[133] When this occurs, the content owner has the choice of blocking the video to make it unviewable, tracking the viewing statistics of the video, or adding advertisements to the video.
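The register-then-match workflow described above can be sketched as follows. This is a toy illustration only: the real Content ID system builds perceptual fingerprints of audio and video that survive re-encoding, not exact chunk hashes, and every name and threshold here is invented.

```python
# Toy illustration of a register-then-match workflow (NOT Content ID's
# actual algorithm): fingerprint registered material by hashing fixed-size
# chunks, then flag uploads whose fingerprints sufficiently overlap.
import hashlib

CHUNK = 16  # bytes per fingerprinted chunk (toy value)

def fingerprint(data: bytes) -> set:
    """Hash each fixed-size chunk of the material."""
    return {
        hashlib.sha256(data[i:i + CHUNK]).hexdigest()
        for i in range(0, len(data) - CHUNK + 1, CHUNK)
    }

# 1. A rights holder registers material, creating an "ID file" in the database.
database = {"label_song": fingerprint(b"la la la, guitar solo, chorus!!!")}

# 2. Each upload is checked against the database; enough matching chunks
#    flags the video, and the owner can then block, track, or monetize it.
def check_upload(data: bytes, threshold: float = 0.5) -> list:
    upload_prints = fingerprint(data)
    return [
        owner
        for owner, prints in database.items()
        if len(upload_prints & prints) / len(prints) >= threshold
    ]

print(check_upload(b"la la la, guitar solo, chorus!!!"))     # flagged
print(check_upload(b"original home video, nothing copied"))  # no match
```

Exact hashing breaks under any re-encoding, which is why production systems use robust perceptual fingerprints instead; the sketch only conveys the database-lookup structure the text describes.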
An independent test in 2009 uploaded multiple versions of the same song to YouTube and concluded that while the system was "surprisingly resilient" in finding copyright violations in the audio tracks of videos, it was not infallible.[134] The use of Content ID to remove material automatically has led to controversy in some cases, as the videos have not been checked by a human for fair use.[135] If a YouTube user disagrees with a decision by Content ID, it is possible to fill in a form disputing the decision.[136]
Before 2016, videos were not monetized until the dispute was resolved. Since April 2016, videos continue to be monetized while the dispute is in progress, and the money goes to whoever won the dispute.[137] Should the uploader want to monetize the video again, they may remove the disputed audio in the "Video Manager".[138] YouTube has cited the effectiveness of Content ID as one of the reasons why the site's rules were modified in December 2010 to allow some users to upload videos of unlimited length.[139]
YouTube has a set of community guidelines aimed at reducing abuse of the site's features. The uploading of videos containing defamation, pornography, and material encouraging criminal conduct is forbidden by YouTube's "Community Guidelines".[140][better source needed] Generally prohibited material includes sexually explicit content, videos of animal abuse, shock videos, content uploaded without the copyright holder's consent, hate speech, spam, and predatory behavior.[140] YouTube relies on its users to flag the content of videos as inappropriate, and a YouTube employee will view a flagged video to determine whether it violates the site's guidelines.[140] Despite the guidelines, YouTube has faced criticism over aspects of its operations,[141] including its recommendation algorithms perpetuating videos that promote conspiracy theories and falsehoods,[142] hosting videos ostensibly targeting children but containing violent or sexually suggestive content involving popular characters,[143] videos of minors attracting pedophilic activity in their comment sections,[144] and fluctuating policies on the types of content that are eligible to be monetized with advertising.[141]
YouTube contracts companies to hire content moderators, who view content flagged as potentially violating YouTube's content policies and determine whether it should be removed. In September 2020, a class-action suit was filed by a former content moderator who reported developing post-traumatic stress disorder (PTSD) after an 18-month period on the job.[145][146][147]
Controversial moderation decisions have included material relating to Holocaust denial,[148] the Hillsborough disaster,[149] Anthony Bourdain's death,[150] and the Notre-Dame fire.[151] In July 2008, the Culture and Media Committee of the House of Commons of the United Kingdom stated that it was "unimpressed" with YouTube's system for policing its videos, and argued that "proactive review of content should be standard practice for sites hosting user-generated content".[152]
In June 2022, Media Matters, a media watchdog group, reported that homophobic and transphobic content calling LGBT people "predators" and "groomers" was becoming more common on YouTube.[153] The report also referred to common accusations in YouTube videos that LGBT people are mentally ill.[153] The report stated the content appeared to be in violation of YouTube's hate speech policy.[153]
An August 2022 report by the Center for Countering Digital Hate, a British think tank, found that harassment against women was flourishing on YouTube.[154] In his 2022 book Like, Comment, Subscribe: Inside YouTube's Chaotic Rise to World Domination, Bloomberg reporter Mark Bergen said that many female content creators were dealing with harassment, bullying, and stalking.[154]
YouTube has been criticized for using an algorithm that gives great prominence to videos that promote conspiracy theories, falsehoods and incendiary fringe discourse.[155][156][157][158] According to an investigation by The Wall Street Journal, "YouTube's recommendations often lead users to channels that feature conspiracy theories, partisan viewpoints and misleading videos, even when those users haven't shown interest in such content. When users show a political bias in what they choose to view, YouTube typically recommends videos that echo those biases, often with more-extreme viewpoints."[155][159] After YouTube drew controversy for giving top billing to videos promoting falsehoods and conspiracy when people made breaking-news queries during the 2017 Las Vegas shooting, YouTube changed its algorithm to give greater prominence to mainstream media sources.[155][160][161][162]
In 2017, it was revealed that advertisements were being placed on extremist videos, including videos by rape apologists, anti-Semites, and hate preachers who received ad payouts.[163] After firms started to stop advertising on YouTube in the wake of this reporting, YouTube apologized and said that it would give firms greater control over where ads got placed.[163]
University of North Carolina professor Zeynep Tufekci has referred to YouTube as "The Great Radicalizer", saying "YouTube may be one of the most powerful radicalizing instruments of the 21st century."[164] Jonathan Albright of the Tow Center for Digital Journalism at Columbia University described YouTube as a "conspiracy ecosystem".[157][165]
Before 2019, YouTube took steps to remove specific videos or channels related to supremacist content that had violated its acceptable use policies but otherwise did not have site-wide policies against hate speech.[166]
In the wake of the March 2019 Christchurch mosque attacks, YouTube and other sites that allowed user-submitted content, like Facebook and Twitter, drew criticism for doing little to moderate and control the spread of hate speech, which was considered to be a factor in the rationale for the attacks.[167][168] These platforms were pressured to remove such content, but in an interview with The New York Times, YouTube's then chief product officer Neal Mohan said that unlike content such as ISIS videos, which takes a particular format and is thus easy to detect through computer-aided algorithms, general hate speech was more difficult to recognize and handle, so the platform could not readily remove it without human interaction.[169]
In May 2019, YouTube joined an initiative led by France and New Zealand, together with other countries and tech companies, to develop tools to block online hate speech and to develop regulations, implemented at the national level, that would penalize technology firms failing to take steps to remove such speech, though the United States declined to participate.[170][171] Subsequently, on June 5, 2019, YouTube announced a major change to its terms of service, stating it would "remove content denying that well-documented violent events, like the Holocaust or the shooting at Sandy Hook Elementary, took place."[166][172]
In June 2020, YouTube was criticized for allowing white supremacist content on its platform for years after it announced it would be pledging $1 million to fight racial injustice.[173] Later that month, it banned several channels associated with white supremacy, including those of Stefan Molyneux, David Duke, and Richard B. Spencer, asserting these channels violated their policies on hate speech.[174]
Multiple research studies have investigated cases of misinformation on YouTube. In a July 2019 study based on ten YouTube searches using the Tor Browser related to climate and climate change, the majority of videos communicated views contrary to the scientific consensus on climate change.[175] A May 2023 study found that YouTube was monetizing and profiting from videos that included misinformation about climate change.[176] A 2019 BBC investigation of YouTube searches in ten different languages found that YouTube's algorithm promoted health misinformation, including fake cancer cures.[177] In Brazil, YouTube has been linked to the spread of pseudoscientific misinformation on health matters, as well as to elevating far-right fringe discourse and conspiracy theories.[178] In the Philippines, numerous channels disseminated misinformation related to the 2022 Philippine elections.[179] Additionally, research on the dissemination of Flat Earth beliefs in social media has shown that networks of YouTube channels form an echo chamber that polarizes audiences by appearing to confirm preexisting beliefs.[180]
In 2018, YouTube introduced a system that automatically adds information boxes to videos that its algorithms determine may present conspiracy theories or other fake news, filling the boxes with content from Encyclopædia Britannica and Wikipedia in an effort to curb the spread of misinformation without restricting freedom of speech.[181][182] In 2023, YouTube announced changes to its handling of content associated with eating disorders: the platform's Community Guidelines now prohibit content that could encourage emulation by at-risk users.[183]
In January 2019, YouTube said that it had introduced a new policy, starting in the United States, intended to stop recommending videos containing "content that could misinform users in harmful ways", giving flat earth theories, miracle cures, and 9/11 trutherism as examples.[184] Earlier efforts within YouTube engineering to stop recommending borderline extremist videos, those falling just short of forbidden hate speech, and to track their popularity had been rejected because they could interfere with viewer engagement.[185] In July 2022, YouTube announced policies to combat misinformation surrounding abortion, such as videos with instructions for abortion methods considered unsafe and videos containing misinformation about the safety of abortion.[186] Google and YouTube implemented policies in October 2021 to deny monetization to advertisers or content creators who promoted climate change denial.[187] In January 2024, the Center for Countering Digital Hate reported that climate change deniers were instead pushing other forms of climate change denial that had not yet been banned by YouTube.[188][189]
After misinformation disseminated via YouTube during the COVID-19 pandemic, claiming that 5G communications technology was responsible for the spread of coronavirus disease 2019, led to multiple 5G towers in the United Kingdom being attacked by arsonists, YouTube removed all videos linking 5G and the coronavirus in this manner.[190]
In September 2021, YouTube extended this policy to cover videos disseminating misinformation about any vaccine approved by local health authorities or the World Health Organization, including long-approved vaccines against measles or hepatitis B.[191][192] The platform proceeded to remove the accounts of anti-vaccine campaigners such as Robert F. Kennedy Jr. and Joseph Mercola.[192] YouTube has also extended this moderation to non-medical areas. In the weeks following the 2020 United States presidential election, the site added policies to remove or label videos promoting election fraud claims;[193][194] however, it reversed this policy in June 2023, citing the need to "openly debate political ideas, even those that are controversial or based on disproven assumptions".[195][196]
Leading into 2017, there was a significant increase in the number of videos related to children, driven both by the popularity of parents vlogging their families' activities and by content creators moving away from material that was often criticized or demonetized toward family-friendly content. In 2017, YouTube reported that time spent watching family vloggers had increased by 90%.[197][198] However, with the increase in videos featuring children, the site began to face several controversies related to child safety, including those involving the popular channels FamilyOFive and Fantastic Adventures.[199][200][201][202][203]
Later that year, YouTube came under criticism for showing inappropriate videos targeted at children and often featuring popular characters in violent, sexual or otherwise disturbing situations, many of which appeared on YouTube Kids and attracted millions of views. The term "Elsagate" was coined on the Internet and then used by various news outlets to refer to this controversy.[204][205][206][207] Following the criticism, YouTube announced it was strengthening site security to protect children from unsuitable content and the company started to mass delete videos and channels that made improper use of family-friendly characters. As part of a broader concern regarding child safety on YouTube, the wave of deletions also targeted channels that showed children taking part in inappropriate or dangerous activities under the guidance of adults.[208][209][210][211][212][213]
Even for content that appears to be aimed at children and to contain only child-friendly material, YouTube's system allows uploaders to remain anonymous. This has raised questions in the past, as YouTube has had to remove channels with children's content which, after becoming popular, suddenly included inappropriate content masked as children's content.[214] The anonymity of such channels raises concerns because of the lack of knowledge about what purpose they are trying to serve.[215] The difficulty of identifying who operates these channels "adds to the lack of accountability", according to Josh Golin of the Campaign for a Commercial-Free Childhood, and educational consultant Renée Chernow-O'Leary found the videos were designed to entertain with no intent to educate, all leading critics and parents to worry about their children becoming too enraptured by the content from these channels.[214] Content creators who earnestly make child-friendly videos have found it difficult to compete with larger channels, as they cannot produce content at the same rate and lack the same promotion through YouTube's recommendation algorithms that the larger animated channel networks share.[215]
In January 2019, YouTube officially banned videos containing "challenges that encourage acts that have an inherent risk of severe physical harm" (such as the Tide Pod Challenge) and videos featuring pranks that "make victims believe they're in physical danger" or cause emotional distress in children.[216]
In November 2017, it was revealed in the media that many videos featuring children—often uploaded by the minors themselves, and showing innocent content such as the children playing with toys or performing gymnastics—were attracting comments from pedophiles[217][218] with predators finding the videos through private YouTube playlists or typing in certain keywords in Russian.[218] Other child-centric videos originally uploaded to YouTube began propagating on the dark web, and uploaded or embedded onto forums known to be used by pedophiles.[219]
As a result of the controversy, which added to the concern about "Elsagate", several major advertisers whose ads had been running against such videos froze spending on YouTube.[207][220] In December 2018, The Times found more than 100 grooming cases in which children were manipulated into sexually implicit behavior (such as taking off clothes, adopting overtly sexual poses and touching other children inappropriately) by strangers.[221]
In February 2019, YouTube vlogger Matt Watson identified a "wormhole" that would cause the YouTube recommendation algorithm to draw users into this type of video content, until all of a user's recommended content featured only these types of videos.[222] Most of these videos carried comments from sexual predators noting timestamps of when the children were shown in compromising positions, or otherwise making indecent remarks.[223] In the wake of the controversy, the service reported that it had deleted over 400 channels and tens of millions of comments, and had reported the offending users to law enforcement and the National Center for Missing and Exploited Children.[224][225] Despite these measures, several large advertisers pulled their advertising from YouTube.[223][226]
Subsequently, YouTube began to demonetize and block advertising on the types of videos that have drawn these predatory comments.[227] YouTube also began to flag channels that predominantly feature children, and preemptively disable their comments sections.[228][229]
A related attempt to algorithmically flag videos containing references to the string "CP" (an abbreviation of child pornography) resulted in some prominent false positives involving unrelated topics using the same abbreviation. YouTube apologized for the errors and reinstated the affected videos.[230]
In June 2019, The New York Times cited researchers who found that users who watched erotic videos could be recommended seemingly innocuous videos of children.[231]
In September 2021, YouTube removed two channels linked to RT Deutsch, the German arm of the Russian network RT, for breaching its policies relating to COVID-19.[191] Russia subsequently threatened to ban YouTube over the deletions.[232]
Shortly after the Russian invasion of Ukraine in 2022, YouTube removed all channels funded by the Russian state.[233] It later expanded these removals to channels it described as "pro-Russian": in June 2022, the War Gonzo channel run by Russian military blogger and journalist Semyon Pegov was deleted.[234] In July 2023, YouTube removed the channel of British journalist Graham Phillips, who had covered the war in Donbas since 2014.[235]
In August 2023, a Moscow court fined Google 3 million rubles, around $35,000, for not deleting what it said was "fake news about the war in Ukraine".[236]
YouTube featured an April Fools prank on the site on April 1 of every year from 2008 to 2016. In 2008, all links to videos on the main page were redirected to Rick Astley's music video "Never Gonna Give You Up", a prank known as "rickrolling".[237][238] The next year, when clicking on a video on the main page, the whole page turned upside down, which YouTube claimed was a "new layout".[239] In 2010, YouTube temporarily released a "TEXTp" mode which rendered video imagery into ASCII art letters "in order to reduce bandwidth costs by $1 per second."[240]
The next year, the site celebrated its "100th anniversary" with a range of sepia-toned silent, early 1900s-style films, including a parody of Keyboard Cat.[241] In 2012, clicking on the image of a DVD next to the site logo led to a video about a purported option to order every YouTube video for home delivery on DVD.[242]
In 2013, YouTube teamed up with satirical newspaper company The Onion to claim in an uploaded video that the video-sharing website was launched as a contest which had finally come to an end, and would shut down for ten years before being re-launched in 2023, featuring only the winning video. The video starred several YouTube celebrities, including Antoine Dodson. A video of two presenters announcing the nominated videos streamed live for 12 hours.[243][244]
In 2014, YouTube announced that it was responsible for the creation of all viral video trends, and revealed previews of upcoming trends, such as "Clocking", "Kissing Dad", and "Glub Glub Water Dance".[245] The next year, YouTube added a music button to the video bar that played samples from "Sandstorm" by Darude.[246] In 2016, YouTube introduced an option to watch every video on the platform in 360-degree mode with Snoop Dogg.[247]
YouTube Premium (formerly YouTube Red) is YouTube's premium subscription service. It offers advertising-free streaming, access to original programming, and background and offline video playback on mobile devices.[248] YouTube Premium was originally announced on November 12, 2014, as "Music Key", a subscription music streaming service, and was intended to integrate with and replace the existing Google Play Music "All Access" service.[249][250][251] On October 28, 2015, the service was relaunched as YouTube Red, offering ad-free streaming of all videos and access to exclusive original content.[252][253][254] As of November 2016, the service had 1.5 million subscribers, with a further million on a free-trial basis.[255] As of June 2017, the first season of YouTube Originals had received 250 million views in total.[256]
YouTube Kids is an American children's video app developed by YouTube, a subsidiary of Google. The app was developed in response to parental and government scrutiny of the content available to children. It provides a version of the service oriented towards children, with curated selections of content, parental control features, and filtering of videos deemed inappropriate for children aged under 13, 8, or 5, depending on the age grouping chosen. First released on February 15, 2015, as an Android and iOS mobile app, it has since been released for LG, Samsung, and Sony smart TVs, as well as for Android TV. On May 27, 2020, it became available on Apple TV. As of September 2019, the app is available in 69 countries, including Hong Kong and Macau, and one province. YouTube launched a web-based version of YouTube Kids on August 30, 2019.
On September 28, 2016, YouTube named Lyor Cohen, the co-founder of 300 Entertainment and former Warner Music Group executive, the Global Head of Music.[257]
In early 2018, Cohen began hinting at the possible launch of YouTube's new subscription music streaming service, a platform that would compete with other services such as Spotify and Apple Music.[258] On May 22, 2018, the music streaming platform named "YouTube Music" was launched.[259][260]
YouTube Movies & TV is a video on demand service that offers movies and television shows for purchase or rental, depending on availability, along with a selection of movies (encompassing between 100 and 500 titles overall) that are free to stream, with interspersed ad breaks. YouTube began offering free-to-view movie titles to its users in November 2018; selections of new movies are added and others removed, unannounced each month.[261]
In March 2021, Google announced plans to gradually deprecate the Google Play Movies & TV app, and eventually migrate all users to the YouTube app's Movies & TV store to view, rent and purchase movies and TV shows (first affecting Roku, Samsung, LG, and Vizio smart TV users on July 15).[262][263] Google Play Movies & TV formally shut down on January 17, 2024, with the web version of that platform migrated to YouTube as an expansion of the Movies & TV store to desktop users. (Other functions of Google Play Movies & TV were integrated into the Google TV service.)[264]
On November 1, 2022, YouTube launched Primetime Channels, a channel store platform offering third-party subscription streaming add-ons sold à la carte through the YouTube website and app, competing with similar subscription add-on stores operated by Apple, Prime Video and Roku. The add-ons can be purchased through the YouTube Movies & TV hub or through the official YouTube channels of the available services; subscribers of YouTube TV add-ons that are sold through Primetime Channels can also access their content via the YouTube app and website. A total of 34 streaming services (including Paramount+, Showtime, Starz, MGM+, AMC+ and ViX+) were initially available for purchase.[265][266]
NFL Sunday Ticket, as part of a broader residential distribution deal with Google signed in December 2022 that also made it available to YouTube TV subscribers, was added to Primetime Channels as a standalone add-on on August 16, 2023.[267][268] The ad-free tier of Max was added to Primetime Channels on December 12, 2023, coinciding with YouTube TV converting its separate HBO (for base plan subscribers) and HBO Max (for all subscribers) linear/VOD add-ons into a single combined Max offering.[269][270][note 1]
On February 28, 2017, in a press announcement held at YouTube Space Los Angeles, YouTube announced YouTube TV, an over-the-top MVPD-style subscription service that would be available for United States customers at a price of US$65 per month. Initially launching in five major markets (New York City, Los Angeles, Chicago, Philadelphia and San Francisco) on April 5, 2017,[271][272] the service offers live streams of programming from the five major broadcast networks (ABC, CBS, The CW, Fox and NBC, along with selected MyNetworkTV affiliates and independent stations in certain markets), as well as approximately 60 cable channels owned by companies such as The Walt Disney Company, Paramount Global, Fox Corporation, NBCUniversal, Allen Media Group and Warner Bros. Discovery (including among others Bravo, USA Network, Syfy, Disney Channel, CNN, Cartoon Network, E!, Fox Sports 1, Freeform, FX and ESPN).[273][274]
Subscribers can also receive premium cable channels (including HBO (via a combined Max add-on that includes in-app and log-in access to the service), Cinemax, Showtime, Starz and MGM+) and other subscription services (such as NFL Sunday Ticket, MLB.tv, NBA League Pass, Curiosity Stream and Fox Nation) as optional add-ons for an extra fee, and can access YouTube Premium original content.[273][274] In September 2022, YouTube TV began allowing customers to purchase most of its premium add-ons (excluding certain services such as NBA League Pass and AMC+) without an existing subscription to its base package.[275]
In September 2016, YouTube Go was announced,[276] an Android app created to make YouTube easier to access on mobile devices in emerging markets. Distinct from the company's main Android app, it allowed users to preview videos, download them for offline viewing, and share downloaded videos with other users via Bluetooth, and it offered more options for mobile data control and video resolution.[277]
In February 2017, YouTube Go was launched in India, and expanded in November 2017 to 14 other countries, including Nigeria, Indonesia, Thailand, Malaysia, Vietnam, the Philippines, Kenya, and South Africa.[278][279] On February 1, 2018, it was rolled out in 130 countries worldwide, including Brazil, Mexico, Turkey, and Iraq. Before it shut down, the app was available to around 60% of the world's population.[280][281] In May 2022, Google announced that they would be shutting down YouTube Go in August 2022.[282]
In September 2020, YouTube announced that it would be launching a beta version of a new platform of 15-second videos, similar to TikTok, called YouTube Shorts.[283][284] The platform was first tested in India and, as of March 2021, had expanded to other countries including the United States, with videos now able to be up to one minute long.[285] The platform is not a standalone app but is integrated into the main YouTube app. Like TikTok, it gives users access to built-in creative tools, including the possibility of adding licensed music to their videos.[286] The platform had its global beta launch in July 2021.[287]
In 2018, YouTube started testing a feature initially called "YouTube Reels",[288] which was nearly identical to Instagram Stories and Snapchat Stories; YouTube later renamed it "YouTube Stories". It was available only to creators with more than 10,000 subscribers and could only be posted and viewed in the YouTube mobile app.[289] On May 25, 2023, YouTube announced that it would shut down the feature on June 26, 2023.[290][291]
In November 2016, YouTube released YouTube VR, a dedicated version with an interface for VR devices, for Google's Daydream mobile VR platform on Android.[292] In November 2018, YouTube VR was released on the Oculus Store for the Oculus Go headset.[292] YouTube VR has since been updated for compatibility with successive Quest devices and ported to the Pico 4.[293]
YouTube VR allows access to all YouTube-hosted videos, and particularly supports headset viewing of 360° and 180° video (both in 2D and stereoscopic 3D). Starting with the Oculus Quest, the app was updated for compatibility with mixed-reality passthrough modes on VR headsets. In April 2024, YouTube VR was updated to support 8K SDR video on the Meta Quest 3.[294]
Private individuals[295] and large production corporations[296] have used YouTube to grow their audiences. Indie creators have built grassroots followings numbering in the thousands at very little cost or effort, while mass retail and radio promotion proved problematic.[295] Concurrently, old media celebrities moved into the website at the invitation of a YouTube management that witnessed early content creators accruing substantial followings and perceived audience sizes potentially larger than that attainable by television.[296] While YouTube's revenue-sharing "Partner Program" made it possible to earn a substantial living as a video producer—its top five hundred partners each earning more than $100,000 annually[297] and its ten highest-earning channels grossing from $2.5 million to $12 million[298]—in 2012 a CMU business editor characterized YouTube as "a free-to-use ... promotional platform for the music labels."[299] In 2013 Forbes' Katheryn Thayer asserted that digital-era artists' work must not only be of high quality, but must elicit reactions on the YouTube platform and social media.[300] Videos of the 2.5% of artists categorized as "mega", "mainstream" and "mid-sized" received 90.3% of the relevant views on YouTube and Vevo in that year.[301] By early 2013, Billboard had announced that it was factoring YouTube streaming data into calculation of the Billboard Hot 100 and related genre charts.[302]
Observing that face-to-face communication of the type that online videos convey has been "fine-tuned by millions of years of evolution", TED curator Chris Anderson referred to several YouTube contributors and asserted that "what Gutenberg did for writing, online video can now do for face-to-face communication."[303] Anderson asserted that it is not far-fetched to say that online video will dramatically accelerate scientific advance, and that video contributors may be about to launch "the biggest learning cycle in human history."[303] In education, for example, the Khan Academy grew from YouTube video tutoring sessions for founder Salman Khan's cousin into what Forbes' Michael Noer called "the largest school in the world," with technology poised to disrupt how people learn.[304] YouTube was awarded a 2008 George Foster Peabody Award,[305] the website being described as a Speakers' Corner that "both embodies and promotes democracy."[306] The Washington Post reported that a disproportionate share of YouTube's most subscribed channels feature minorities, contrasting with mainstream television in which the stars are largely white.[307] A Pew Research Center study reported the development of "visual journalism", in which citizen eyewitnesses and established news organizations share in content creation.[308] The study also concluded that YouTube was becoming an important platform by which people acquire news.[309]
YouTube has enabled people to more directly engage with government, such as in the CNN/YouTube presidential debates (2007) in which ordinary people submitted questions to U.S. presidential candidates via YouTube video, with a techPresident co-founder saying that Internet video was changing the political landscape.[310] Describing the Arab Spring (2010–2012), sociologist Philip N. Howard quoted an activist's succinct description that organizing the political unrest involved using "Facebook to schedule the protests, Twitter to coordinate, and YouTube to tell the world."[311] In 2012, more than a third of the U.S. Senate introduced a resolution condemning Joseph Kony 16 days after the "Kony 2012" video was posted to YouTube, with resolution co-sponsor Senator Lindsey Graham remarking that the video "will do more to lead to (Kony's) demise than all other action combined."[312]
Conversely, YouTube has also allowed government to more easily engage with citizens, the White House's official YouTube channel being the seventh top news organization producer on YouTube in 2012[315] and in 2013 a healthcare exchange commissioned Obama impersonator Iman Crosson's YouTube music video spoof to encourage young Americans to enroll in the Affordable Care Act (Obamacare)-compliant health insurance.[316] In February 2014, U.S. President Obama held a meeting at the White House with leading YouTube content creators not only to promote awareness of Obamacare[317] but more generally to develop ways for government to better connect with the "YouTube Generation."[313] Whereas YouTube's inherent ability to allow presidents to directly connect with average citizens was noted, the YouTube content creators' new media savvy was perceived necessary to better cope with the website's distracting content and fickle audience.[313]
Some YouTube videos have themselves had a direct effect on world events, such as Innocence of Muslims (2012) which spurred protests and related anti-American violence internationally.[318] TED curator Chris Anderson described a phenomenon by which geographically distributed individuals in a certain field share their independently developed skills in YouTube videos, thus challenging others to improve their own skills, and spurring invention and evolution in that field.[303] Journalist Virginia Heffernan stated in The New York Times that such videos have "surprising implications" for the dissemination of culture and even the future of classical music.[319]
A 2017 article in The New York Times Magazine posited that YouTube had become "the new talk radio" for the far right.[320] Almost a year before YouTube's January 2019 announcement that it would begin a "gradual change" of "reducing recommendations of borderline content and content that could misinform users in harmful ways",[321] Zeynep Tufekci had written in The New York Times that, "(g)iven its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century".[322] Under YouTube's changes to its recommendation engine, the most recommended channel evolved from conspiracy theorist Alex Jones (2016) to Fox News (2019).[323] According to a 2020 study, "An emerging journalistic consensus theorizes the central role played by the video 'recommendation engine', but we believe that this is premature. Instead, we propose the 'Supply and Demand' framework for analyzing politics on YouTube."[324] A 2022 study found that "despite widespread concerns that YouTube's algorithms send people down 'rabbit holes' with recommendations to extremist videos, little systematic evidence exists to support this conjecture", that "exposure to alternative and extremist channel videos on YouTube is heavily concentrated among a small group of people with high prior levels of gender and racial resentment", and that "contrary to the 'rabbit holes' narrative, non-subscribers are rarely recommended videos from alternative and extremist channels and seldom follow such recommendations when offered."[325]
The Legion of Extraordinary Dancers[326] and the YouTube Symphony Orchestra[327] selected their membership based on individual video performances.[303][327] Further, the cyber-collaboration charity video "We Are the World 25 for Haiti (YouTube edition)" was formed by mixing performances of 57 globally distributed singers into a single musical work,[328] with The Tokyo Times noting the "We Pray for You" YouTube cyber-collaboration video as an example of a trend to use crowdsourcing for charitable purposes.[329] The anti-bullying It Gets Better Project expanded from a single YouTube video directed to discouraged or suicidal LGBT teens,[330] that within two months drew video responses from hundreds including U.S. President Barack Obama, Vice President Biden, White House staff, and several cabinet secretaries.[331] Similarly, in response to fifteen-year-old Amanda Todd's video "My story: Struggling, bullying, suicide, self-harm", legislative action was undertaken almost immediately after her suicide to study the prevalence of bullying and form a national anti-bullying strategy.[332] In May 2018, after London Metropolitan Police claimed that drill music videos glamorizing violence gave rise to gang violence, YouTube deleted 30 videos.[333]
Prior to 2020, Google did not provide detailed figures for YouTube's running costs, and YouTube's revenues in 2007 were noted as "not material" in a regulatory filing.[334] In June 2008, a Forbes magazine article projected the 2008 revenue at $200 million, noting progress in advertising sales.[335] In 2012, YouTube's revenue from its ads program was estimated at $3.7 billion.[336] In 2013, revenue nearly doubled, with e-Marketer estimating it would hit $5.6 billion,[336][337] while others estimated $4.7 billion.[336] The vast majority of videos on YouTube are free to view and supported by advertising.[63] In May 2013, YouTube introduced a trial scheme of 53 subscription channels with prices ranging from $0.99 to $6.99 a month.[338] The move was seen as an attempt to compete with other providers of online subscription services such as Netflix, Amazon Prime, and Hulu.[63]
Google first published exact revenue numbers for YouTube in February 2020 as part of Alphabet's 2019 financial report. According to Google, YouTube made US$15.1 billion in ad revenue in 2019, compared with US$8.1 billion in 2017 and US$11.1 billion in 2018. YouTube's revenues made up nearly 10% of total Alphabet revenue in 2019.[339][340] Alongside these figures, Google reported approximately 20 million subscribers combined across YouTube Premium and YouTube Music, and 2 million subscribers to YouTube TV.[341]
YouTube earned $29.2 billion in ad revenue in 2022, up $398 million from the prior year.[342] In Q2 2024, ad revenue rose to $8.66 billion, a 13% increase year-over-year.[343]
YouTube entered into a marketing and advertising partnership with NBC in June 2006.[344] In March 2007, it struck a deal with BBC for three channels with BBC content, one for news and two for entertainment.[345] In November 2008, YouTube reached an agreement with MGM, Lions Gate Entertainment, and CBS, allowing the companies to post full-length films and television episodes on the site, accompanied by advertisements in a section for U.S. viewers called "Shows". The move was intended to create competition with websites such as Hulu, which features material from NBC, Fox, and Disney.[346][347] In November 2009, YouTube launched a version of "Shows" available to UK viewers, offering around 4,000 full-length shows from more than 60 partners.[348] In January 2010, YouTube introduced an online film rentals service,[349] which is only available to users in the United States, Canada, and the UK as of 2010.[350][351][needs update] The service offers over 6,000 films.[352]
In March 2017, the government of the United Kingdom pulled its advertising campaigns from YouTube, after reports that its ads had appeared on videos containing extremist content. The government demanded assurances that its advertising would "be delivered safely and appropriately". The Guardian newspaper, as well as other major British and U.S. brands, similarly suspended their advertising on YouTube in response to their advertising appearing near offensive content. Google stated that it had "begun an extensive review of our advertising policies and have made a public commitment to put in place changes that give brands more control over where their ads appear".[353][354] In early April 2017, the YouTube channel h3h3Productions presented evidence claiming that a Wall Street Journal article had fabricated screenshots showing major brand advertising on an offensive video containing Johnny Rebel music overlaid on a Chief Keef music video, citing that the video itself had not earned any ad revenue for the uploader. The video was retracted after it was found that the ads had been triggered by the use of copyrighted content in the video.[355][356]
On April 6, 2017, YouTube announced that to "ensure revenue only flows to creators who are playing by the rules", it would change its practices to require that a channel undergo a policy compliance review and have at least 10,000 lifetime views before it may join the Partner Program.[357]
In May 2007, YouTube launched its Partner Program (YPP), a system based on AdSense which allows the uploader of the video to share the revenue produced by advertising on the site.[358] YouTube typically takes 45 percent of the advertising revenue from videos in the Partner Program, with 55 percent going to the uploader.[359][360]
There are over two million members of the YouTube Partner Program.[361] According to TubeMogul, in 2013 a pre-roll advertisement on YouTube (one that is shown before the video starts) cost advertisers on average $7.60 per 1000 views. Usually, no more than half of the eligible videos have a pre-roll advertisement, due to a lack of interested advertisers.[362]
YouTube's "advertiser-friendly content" policies, detailed below, restrict certain forms of content from being included in videos monetized with advertising; this extends even to videos whose user comments contain "inappropriate" content.[364]
In 2013, YouTube introduced an option for channels with at least a thousand subscribers to require a paid subscription in order for viewers to watch videos.[365][366] In April 2017, YouTube set an eligibility requirement of 10,000 lifetime views for monetization.[367] On January 16, 2018, the eligibility requirement for monetization was changed to 4,000 hours of watch time within the past 12 months and 1,000 subscribers.[367] The move was seen as an attempt to ensure that monetized videos did not lead to controversy, but was criticized for penalizing smaller YouTube channels.[368]
YouTube Play Buttons, a part of the YouTube Creator Rewards, are a recognition by YouTube of its most popular channels.[369] The trophies, made of nickel-plated copper–nickel alloy, gold-plated brass, silver-plated metal, ruby, and red-tinted crystal glass, are given to channels with at least one hundred thousand, one million, ten million, fifty million, and one hundred million subscribers, respectively.[370][371]
YouTube's policies on "advertiser-friendly content" restrict what may be incorporated into videos being monetized; this includes strong violence, strong language,[372] sexual content, and "controversial or sensitive subjects and events, including subjects related to war, political conflicts, natural disasters and tragedies, even if graphic imagery is not shown", unless the content is "usually newsworthy or comedic and the creator's intent is to inform or entertain".[373] In September 2016, after introducing an enhanced notification system to inform uploaders of such violations, YouTube was criticized over these policies by prominent users, including Philip DeFranco and Vlogbrothers. DeFranco argued that not being able to earn advertising revenue on such videos was "censorship by a different name". A YouTube spokesperson stated that while the policy itself was not new, the service had "improved the notification and appeal process to ensure better communication to our creators".[374][375][376] Boing Boing reported in 2019 that LGBT keywords resulted in demonetization.[377]
As of November 2020 in the United States, and June 2021 worldwide,[378] YouTube reserves the right to monetize any video on the platform, even if its uploader is not a member of the YouTube Partner Program. This occurs on channels whose content is deemed "advertiser-friendly"; all revenue goes directly to Google, with no share given to the uploader.[379]
The majority of YouTube's advertising revenue goes to the publishers and video producers who hold the rights to their videos; the company retains 45% of the ad revenue.[380] In 2010, it was reported that nearly a third of the videos with advertisements were uploaded without the permission of the copyright holders. YouTube gives copyright holders the option to locate and remove their videos or to have them continue running for revenue.[381] In May 2013, Nintendo began enforcing its copyright ownership and claiming the advertising revenue from video creators who posted screenshots of its games.[382] In February 2015, Nintendo agreed to share the revenue with video creators through the Nintendo Creators Program.[383][384][385] On March 20, 2019, Nintendo announced on Twitter that it would end the Creators Program, whose operations ceased the same day.[386][387]
YouTube has been censored, filtered, or banned in a number of countries for a variety of reasons.[388]
Access to specific videos is sometimes blocked due to copyright and intellectual-property laws (e.g. in Germany), laws against hate speech, and measures preventing access to videos judged inappropriate for youth,[389] a practice YouTube itself applies through the YouTube Kids app and its "restricted mode".[390] Businesses, schools, government agencies, and other private institutions often block social media sites, including YouTube, because of their bandwidth demands[391][392] and potential for distraction.[388][393]
As of 2018[update], public access to YouTube is blocked in many countries, including China, North Korea, Iran, Turkmenistan,[394] Uzbekistan,[395][396] Tajikistan, Eritrea, Sudan and South Sudan. In some countries, YouTube is blocked for more limited periods, such as during unrest, in the run-up to an election, or in response to upcoming political anniversaries. In cases where the entire site is banned because of one particular video, YouTube will often agree to remove or limit access to that video in order to restore service.[388]
Reports emerged that, since October 2019, comments containing Chinese phrases insulting the Chinese Communist Party (共匪, "communist bandit", or 五毛, "50 Cent Party", referring to state-sponsored commentators) were being automatically deleted within 15 seconds.[397]
Specific incidents where YouTube has been blocked include: