How to improve website rankings: Advice from Google and Bing at SXSW

South by Southwest attendees packed a large ballroom Monday to hear Matt Cutts of Google and Duane Forrester of Bing discuss how their search engines rank websites.

This has been a hot issue, especially in the past few months as some people have complained that spammy, low-quality sites are too prevalent in the top search results. In a story called “The Dirty Little Secrets of Search,” The New York Times described how J.C. Penney pages rose to the top of search results through questionable tactics. And last year, the Times described how a site that had consistently offered terrible customer service, and had been reviewed poorly on many sites, still ranked at the top when people searched for eyeglasses.

Google changed its algorithm recently to address those issues and others.

Here’s a quick rundown of the talk, moderated by Danny Sullivan of Search Engine Land.

If Google or Bing blocks your site, or it’s doing poorly in search, how do you appeal or notify them?

Both search engines enable people to submit reconsideration requests.

Both publish webmaster guidelines that describe which techniques are not allowed.

In many cases, the search engines will use their webmaster tools to notify people if they’ve penalized them.

Considering that the search engines frown on “link schemes,” what practices are acceptable? How much can you build out links without running afoul of the rules?

Cutts said that there isn’t anything necessarily wrong with paying someone to be listed in a link directory. However, you should think about what you’re paying for. Are you paying for the link itself or is there some kind of editorial judgment?

If you want to be linked on other sites, “there’s got to be something there [on your site] that people want,” Cutts said. If you create good content, and do so frequently, then people will link to you. “Those sorts of things tend to build up your reputation, build up your credibility. … It’s easier to get links.”

Forrester noted that link-building is simply a marketing tactic. If the links are relevant, then there’s nothing to worry about.

The situations you should avoid, Sullivan said, are link exchanges so convoluted that you have to think twice about them; those are probably questionable.

How important is the age of your site?

Age matters for Google and Bing. Older sites probably have a brand name and have accumulated lots of inbound links over time. For Google, age factors into the “reputation” aspect of the algorithm.

But age is just one of the many components that the search engines look at. (Cutts said there are a couple hundred factors used to calculate search results; Forrester said Bing weighs about 1,000 signals.)

Newer sites are indexed all the time, and if they’re relevant, they’ll rank well. That goes to the other part of Google’s algorithm – topicality, which looks at whether search keywords appear on the page, in the URL, in the title tag and next to each other.
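Google’s actual topicality scoring is proprietary, but as a rough, hypothetical illustration of the signals described above – keywords on the page, in the URL, in the title tag, and adjacent to one another – a toy scorer might look like this (everything here is illustrative, not Google’s algorithm):

```python
import re

def toy_topicality_score(query, url, title, body):
    """Toy illustration of topicality signals; not Google's algorithm.

    Awards a point when a query keyword appears in the body, in the
    URL, or in the title tag, plus a bonus when two different query
    keywords sit next to each other in the body text.
    """
    keywords = query.lower().split()
    words = re.findall(r"[a-z0-9]+", body.lower())
    score = 0
    for kw in keywords:
        if kw in words:
            score += 1            # keyword appears on the page
        if kw in url.lower():
            score += 1            # keyword appears in the URL
        if kw in title.lower():
            score += 1            # keyword appears in the title tag
    # Bonus for query keywords appearing next to each other
    for first, second in zip(words, words[1:]):
        if first in keywords and second in keywords and first != second:
            score += 1
    return score

print(toy_topicality_score(
    "cheap eyeglasses",
    "http://example.com/cheap-eyeglasses",
    "Cheap Eyeglasses and Frames",
    "We sell cheap eyeglasses online."))  # prints 7
```

A real ranker would weight these signals very differently and combine them with the hundreds of other factors mentioned above; the point is only that topicality can be checked mechanically against a page’s text, URL and title.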

Are Google and Bing going to bring in more results from social networks?

Google and Bing have social search functions, though Sullivan said they operate differently.

Google highlights results shared among your networks (Facebook is a notable exception). If you’re logged in to Twitter, for example, those social results are integrated with others. And links that are shared across all networks, even if they’re not from your friends, can influence search results.

But as Sullivan noted, these tools are still young. “We would love to use social signals more to help us be informed of exactly what’s breaking now,” Cutts said.

Forrester noted that links from social networks aren’t always the most relevant. If your friends are sharing a link on a particular topic that they know little about, “you could get a signal that’s disproportionate.”

How can webmasters better understand what links are moving traffic to their site?

Cutts suggested that you can learn a lot by downloading a CSV of your inbound links and researching them. Bing stores a list of thousands of inbound links that webmasters can use to look for patterns.
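As a sketch of what that research might look like: assuming a hypothetical CSV export with `source_url` and `anchor_text` columns (the exact columns vary by tool), you could tally which domains and anchor phrases dominate your inbound links:

```python
import csv
import io
from collections import Counter
from urllib.parse import urlparse

def summarize_backlinks(csv_text):
    """Tally linking domains and anchor text from a backlink CSV.

    Assumes hypothetical columns 'source_url' and 'anchor_text';
    real exports from webmaster tools will differ.
    """
    domains = Counter()
    anchors = Counter()
    for row in csv.DictReader(io.StringIO(csv_text)):
        domains[urlparse(row["source_url"]).netloc] += 1
        anchors[row["anchor_text"].strip().lower()] += 1
    return domains.most_common(5), anchors.most_common(5)

sample = """source_url,anchor_text
http://blog.example.org/post1,best poems
http://blog.example.org/post2,poetry archive
http://news.example.com/story,best poems
"""
top_domains, top_anchors = summarize_backlinks(sample)
print(top_domains)   # [('blog.example.org', 2), ('news.example.com', 1)]
print(top_anchors)   # [('best poems', 2), ('poetry archive', 1)]
```

Patterns like one domain supplying most of your links, or identical anchor text repeated across unrelated sites, are exactly the sort of thing the panelists suggested looking for.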

Cutts said a couple of times that Google almost always elects to use additional computing power to expand its search index rather than track backlinks and build up stores of data for webmasters.

Why does Google seem to be better at surfacing “long tail” content, such as the answer to a specific software problem that is posted in an online forum?

According to Forrester, when Bing looks at long pages with lots of content on different topics – such as a troubleshooting post on a forum – it may determine that nothing on the page is particularly relevant. Those pages may be discarded from the index pretty quickly. “We set the quality bar very high at Bing,” he said.

Users can help address this by writing thoughtful responses on forums that will help the search engine understand the importance of the content. “If we think we have even a whiff of quality content, we’re going to hold onto that page,” Forrester said.

How can a site with high-quality content, but less of it, get visibility when a site that catalogs much more low-quality, user-generated content has so many more links?

This question came from someone who works at the Poetry Foundation, which has about 10,000 high-quality poems compared to the millions of user-submitted poems posted on competitors’ sites.

Cutts suggested that a highly curated site encourage its authors to build their social networks and link to their content on the site, which will increase inbound links. The authors should link to the poetry site from their own author websites, too.

You often need just a couple of links to rank well, Forrester said. Sullivan noted that links from certain types of sites, such as news sites, carry more weight.

What’s the impact of semantic Web tools like microformats and “rich snippets,” which are used to add metadata to sites?

Cutts said Google uses these tools to some extent, but for them to become more of a ranking factor, there would have to be evidence that users find the markup useful.

Does the top-level domain (the .com or the .tv) matter in rankings?

Neither Google nor Bing looks at the top-level domain when indexing content. Cutts said it’s possible that could change if one domain seems to be particularly relevant for a particular kind of content.

How can you make sure your site is ranked high?

Two things, Forrester said: deep content and content research. “Go deep. I don’t think you can go deep enough.”

Cutts pointed people to two resources.
