Google Wants to Rank Websites for Trustworthiness

They've always been trying to do this. Here's my perspective:

PageRank was designed to judge the authority of a particular page by weighing the authority of the pages that linked to it. If scienceis4sheep.com had 100 links from GeoCities pages, and actualscience.com had just 10 - but those came from MIT and NASA's most prominent "good links" sections - then actualscience.com would have come up higher in the results.
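
To make that concrete, here's a toy power-iteration sketch of the idea (the textbook algorithm, not Google's actual implementation - every site name and number in the graph is invented for illustration):

```python
# Toy PageRank sketch: rank flows along links, so a vote from a
# high-rank page is worth more than many votes from obscure pages.
DAMPING = 0.85   # damping factor from the original PageRank paper
STEPS = 50

# page -> pages it links out to (all names/counts made up)
links = {f"geocities.com/fan{i}": ["scienceis4sheep.com"] for i in range(100)}
# mit.edu and nasa.gov are themselves heavily linked, so their votes carry weight
links.update({f"univ{i}.edu": ["mit.edu", "nasa.gov"] for i in range(200)})
links["mit.edu"] = ["actualscience.com"]
links["nasa.gov"] = ["actualscience.com"]
links["scienceis4sheep.com"] = []
links["actualscience.com"] = []

pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}   # start uniform

for _ in range(STEPS):
    nxt = {p: (1.0 - DAMPING) / len(pages) for p in pages}
    for page, out in links.items():
        targets = out or pages        # dangling pages spread rank evenly
        for t in targets:
            nxt[t] += DAMPING * rank[page] / len(targets)
    rank = nxt

for site in ("actualscience.com", "scienceis4sheep.com"):
    print(site, round(rank[site], 5))
# actualscience.com comes out ahead: two links from high-rank pages
# beat a hundred links from pages nobody links to.
```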

This worked pretty well, until every website started accepting comments. Suddenly, every post on nasa.gov might have ten comments linking to scienceis4sheep.com. Google's solution was to strong-arm developers and publishers into adding the rel="nofollow" attribute to any links generated in user-content sections. That attribute basically tells Google not to count the link as a vote.
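
For context, it's just an attribute on the anchor tag. A comment system's sanitizer might tack it onto user-submitted links along these lines (a crude regex sketch - the function name is made up, and a real sanitizer would use a proper HTML parser and do far more than this):

```python
import re

def nofollow_user_links(comment_html: str) -> str:
    """Add rel="nofollow" to every <a> tag in user-submitted HTML.

    Hypothetical illustration only: a real implementation would parse
    the HTML properly, whitelist tags, strip scripts, and so on.
    """
    def add_rel(match: re.Match) -> str:
        attrs = match.group(1)
        if "rel=" in attrs:            # don't clobber an existing rel
            return match.group(0)
        return f'<a{attrs} rel="nofollow">'
    return re.sub(r"<a([^>]*)>", add_rel, comment_html, flags=re.IGNORECASE)

comment = '<a href="http://scienceis4sheep.com">wake up!</a>'
print(nofollow_user_links(comment))
# -> <a href="http://scienceis4sheep.com" rel="nofollow">wake up!</a>
```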

This worked out OK for a while, but one of the things Google did to enforce it was switch from page-level authority rank to domain-level authority rank. Domains that strictly "followed the rules" benefited; those that didn't languished.

This also had a few side effects. A big domain could capture searches for a specific keyword even if its page wasn't really the best resource on the topic. If some obscure forum had the ultimate writeup, you'd never know.

Anyway, we're almost in a third era now, where the official blogs have gone increasingly quiet and almost all of the activity and new publishing is happening in user-contributed areas. Celebrities and subject-matter experts can be found in various subreddits and Twitter feeds. Blanket "nofollow" stops working when all your PhDs and CEOs have also moved into the user-contributed sections.

So Google's idea for the last few years has been to build up some kind of author or contributor rank. They really wanted people to use G+ because it would have given them insight into how people are networked - who trusts whom, who interacts with whom, and so on.

As for the details, they aren't going to tell us exactly which algorithms accomplish this. Only after a new update rolls out and shuffles the results will people start running "scientific" tests to see how different actions are weighted under the current version.

Either way, Google needs to step it up on accuracy. They've spent so long fighting spammers that they've turned into Ahab chasing the white whale, or Quixote tilting at windmills. Spammers gonna spam, especially since most of it is automated. Google needs to get over the god complex, accept that they can't stop spam outright, and get back to focusing on delivering the best results.
