This story is part of CNET’s coverage of the run-up to voting in November.
Beginning on Election Day, Nov. 3, YouTube will link to information on election results in an effort to tamp down confusion in the aftermath of the contest, the company said Tuesday.
The world’s largest video platform, owned by Google, will link to an election results feature that Google will display in its search engine. The search giant will partner with the Associated Press to provide real-time election results, as it did during this year’s primaries and other past elections.
The link, part of what YouTube calls information panels, will appear below videos about the election, as well as certain search results pages for election-related queries. “Results may not be final,” the text of the panel will read. “See the latest on Google.”
The announcement comes exactly a week before Election Day. Because of the coronavirus pandemic and an increase in mail-in voting, experts anticipate results could be delayed as ballots continue to be tabulated in the days following the contest. Purveyors of misinformation could take advantage of that delay to sow confusion and spread conspiracy theories, experts say.
Twitter made a similar announcement on Monday. The company said it will start including prompts in its apps to educate people about the election, including reminders about possibly delayed results and misinformation.
More than 60 million Americans have already cast their ballots, and the number could balloon to up to 100 million by Election Day, according to NBC News.
As the contest approaches, Silicon Valley companies have been eager to prove they can avoid the pitfalls they encountered in 2016. That election was marred by interference from Russia, which exploited platforms from Google, Facebook and Twitter to try to influence the outcome of the contest.
In September, YouTube said it would show people information panels on mail-in voting when they watch videos that discuss the subject. The ballot-casting method has become fraught with misinformation as President Donald Trump has tried to discredit the process, while providing no evidence of security flaws in the time-tested system.
YouTube first introduced information panels two years ago, adding short blurbs that appear under false or misleading videos and aim to debunk misinformation by linking to accurate sources. Since then, the company has added the panels to videos about COVID-19, the moon landing and other subjects rife with conspiracy theories.
The panels haven’t always worked as planned. When the Notre Dame cathedral in Paris went up in flames last April, YouTube’s algorithm mistakenly displayed an information panel about the 9/11 terrorist attacks because the software misanalyzed the images in the video. After the fire, YouTube said its systems made the “wrong call.”