With Mrs. Cutts out of town, Matt’s blogging like… like a… well, like an SEOmozzer. I don’t want to take up too much of Matt’s very precious time, but I do have some questions I’d love to get answered. So, since Matt couldn’t make it to Chicago to be on any panels, let’s see if he’s willing (or able) to run through some multiple choice questions:
(Matt – if you can’t answer any of these, we’ll be sad, but we understand. Oh, and feel free to add “E) Fill in your own answer” to any you’d like.)
UPDATE: Since Matt’s been kind enough to respond, I’ve gone through and highlighted his answers in bold and red:
#1 – Will Google offer full link data through Webmaster Central?
- A) Yes, within the next 90 days
- B) Yes, within the next 6 months
- C) Maybe, but not before next June
- D) Sorry, not gonna happen; use Yahoo! or MSN
- E) I’ve learned never to promise stuff in the future.
#2 – Is Google worried about the inherent link bias that sites such as Digg, Slashdot and Reddit provide (specifically, the “linkbait” effect whereby URLs mentioned on these sites typically pick up hundreds or thousands of new inbounds)?
- A) We have no plans to treat this phenomenon any differently than a site that picks up buzz via any other method (advertising, email, word-of-mouth, etc.)
- B) We are a bit concerned about the power of the so-called “linkbait effect” and may take steps to limit the value of all those new inbounds just because the Digg crowd loved it
- C) We already have in place measures that limit some of the value you see from this effect.
#3 – In my interview with Vanessa Fox, she mentioned that there really isn’t a penalty for having internal duplicate content issues (pages inside your own site that are copies of other pages on your domain). Would you agree with that statement? Would you advise site owners to attempt to fix internal dup content issues, or is it really OK to let Google sort it out?
- A) Vanessa was right – internal duplicate content isn’t too big of an issue, and we’re pretty good at sorting out which pages to rank. (he added: but it never hurts to help search engines with dupe content issues if it’s easy to help, e.g. in the webmaster console, tell us if you prefer www vs. non-www.) (There’s a quick, unofficial way to check your own www/non-www setup sketched just below the answers.)
- B) For those who believe they can do a better job than Google at sorting out which pages they want us to index and rank, I’d say that directing our spider away from dup content or eliminating it is advisable.
- C) Vanessa wasn’t necessarily wrong, but I’d probably advise folks to think hard about internal duplicate content – it can cause some troublesome issues.
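Since Matt brings up the www vs. non-www preference, here’s a rough, totally unofficial sketch of how you might check which flavor your server prefers – it’s just an illustration in Python, the example.com hostnames are placeholders, and none of it reflects how Google itself evaluates anything:

```python
# Rough, unofficial check: does the non-www hostname 301-redirect to the
# www version (or vice versa)? A consistent redirect keeps a page from
# living at two duplicate URLs. The example.com hostnames are placeholders.
import http.client

def check_redirect(host, path="/"):
    """Issue a HEAD request without following redirects and report the result."""
    conn = http.client.HTTPConnection(host, timeout=10)
    try:
        conn.request("HEAD", path)
        resp = conn.getresponse()
        return resp.status, resp.getheader("Location", "")
    finally:
        conn.close()

if __name__ == "__main__":
    for host in ("example.com", "www.example.com"):
        status, location = check_redirect(host)
        print(f"{host}: HTTP {status} -> {location or '(no redirect)'}")
```

If both hostnames answer 200 with no redirect, every page on the site effectively lives at two URLs – exactly the kind of easy-to-fix duplication Matt’s webmaster-console tip (or answer B’s advice) is aimed at.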
#4 – AdSense is responsible for monetizing a very large percentage of spam & scraper sites – Google’s always taken a policy of shutting down individual domain accounts, rather than banning users, which many believe would considerably lessen the problem. Is this a step Google’s willing to take?
- A) I think you’re wrong, Rand – AdSense isn’t the primary monetization method for spammy sites, and our account practices are effective and fair.
- B) There are some problems with AdSense funding spam sites, but it’s an issue that we’d rather address algorithmically than by banning users from AdSense.
- C) We’re moving towards banning users for running AdSense scrapers – watch out kiddies 🙂
- D) I wouldn’t be surprised if Google got stricter with reports of webspam in AdSense.
#5 – Which of the following best fits a description of your position at Google today:
- A) Responsible for the quality of Google’s algorithmic search results as a whole (from spam to relevancy to inclusion to crawling and ranking)
- B) In charge of Google’s spam police – finding manipulation and shutting it down
- C) Head of Google’s search team and able to make decisions about which “onebox” results to include, where search engineers should spend their time and Google’s ongoing search strategy
- D) A lowly engineer who has some influence on the results, but is generally assigned far more influence than he actually has
#6 – When large sites see lots of crawling activity with a low percentage of the crawled pages being included in the index (assuming they’ve already done the basics like sitemap submission through Webmaster Central and posting in the group), what’s the best course of action?
- A) Send me an email – that doesn’t sound right at all
- B) Wait for 40-60 days; sometimes Googlebot takes its sweet time, particularly this close to Hanukkah
- C) Watch out for issues like huge numbers of 301s, non-unique meta descriptions, low quality pages or pages without significant unique content – we often spider these, but don’t include them (a rough script for spotting repeated meta descriptions is sketched below the answers)
- D) Give us fewer pages to crawl – restrict spidering to 10-20K pages and once we pick those up, expand to a new set to help us index slow and steady
- E) I’d look for more high-quality, editorially chosen backlinks.
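On the non-unique meta descriptions point in C, here’s a quick-and-dirty, unofficial Python sketch – the URL list is made up, so substitute a sample of your own pages – that flags pages sharing the same description:

```python
# Quick-and-dirty duplicate meta description finder. The URLs below are
# hypothetical placeholders; point the list at a sample of your own pages.
# This only flags obvious repetition -- it says nothing about how Google
# actually evaluates pages.
import re
import urllib.request
from collections import defaultdict

META_RE = re.compile(
    r'<meta[^>]+name=["\']description["\'][^>]+content=["\']([^"\']*)["\']',
    re.IGNORECASE,
)

def meta_description(url):
    """Fetch a page and return its meta description ('' if none is found)."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    match = META_RE.search(html)
    return match.group(1).strip() if match else ""

if __name__ == "__main__":
    urls = [
        "http://www.example.com/page-one",
        "http://www.example.com/page-two",
        "http://www.example.com/page-three",
    ]
    pages_by_description = defaultdict(list)
    for url in urls:
        pages_by_description[meta_description(url)].append(url)
    for description, pages in pages_by_description.items():
        if len(pages) > 1:
            print(f"Shared description {description!r} on: {', '.join(pages)}")
```

It’s a blunt instrument – a crude regex, no crawl etiquette, no insight into Google’s scoring – but it will catch the obvious template-level repetition that answer C warns about.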
#7 – If you were to identify one ranking factor that SEOs don’t pay enough attention to, which of the following would it be:
- A) Temporal link data – have your new inbounds slowed to a snail’s pace… maybe you’re not so relevant any more?
- B) Relevance and quality of on-page content; we’re smarter than you think we are about detecting that stuff
- C) Internal linking – for goodness’ sake, one link from a deep page and you think it’s supposed to rank well? Come on!
- D) PageRank – you’ve taken my advice about ignoring the toolbar data too far and now your PR pixie dust meter is running on empty – fill that guy up.
- E) One ranking factor we don’t use currently is phase of the moon. But if that helped, I’d be open to using it. 🙂
#8 – SEOs sometimes get paranoid that Google is trying to take over their jobs. Would you like to see Google fill the role of optimization consultant, or do you think the profession should remain its own cottage industry?
- A) In all honesty, it would be great if the world didn’t need SEOs, but it does – Google doesn’t want your job, Rand, we’ve got plenty on our plates and the air of mystery / game of cat and mouse is good for both us and you.
- B) I would like to see Google do a better job of informing and helping folks to get their websites indexed and ranking, but there will always be a need for the high-quality, value-added consultancy that a search strategist can bring to the table.
- C) Watch your tail; we’re gunning for you – Google has thought seriously about providing SEO-style services on a paid or free basis to help eliminate the SEO industry.
#9 – Wikipedia has “nofollowed” and “un-nofollowed” their links over the past year, but Wikipedia spamming and editing have climbed ever higher. If Matt Cutts ran Wikipedia, would he:
- A) Keep the links followed – kudos to anyone who can build up a high profile at Wikipedia, buy off an editor or stay under their radars (ignore my seething bias in this answer, please).
- B) Nofollow the links – if you get listed in Wikipedia, you’re already getting the traffic and branding benefits and by their very nature, Wikipedia links fit the definition of what “nofollow” should be.
- C) Google already discounts Wikipedia links because we know they’re not an “editorial vote for quality,” so really, it doesn’t matter.
- D) In my ideal world, Wikipedia would add nofollow to their untrusted links, but work out ways to allow trusted links to remove the nofollow attribute.
#10 – The hordes of paparazzi-like drones that swarm you at conferences have already been given the moniker of Cuttlets. Many of these poor folks don’t end up getting their questions answered (as you’re restricted regarding what you can say and often simply don’t have time). What would you advise rejected or untimely Cuttlets to do in order to have the best chance of getting in front of a Google engineer’s eyes?
- A) Send an email to ______ (you’d have to tell me what address to put here)
- B) Leave a note in the Google Webmaster Central Groups site
- C) Ask at a forum like Cre8asite or HighRankings; use your best judgement, but you can often get better answers there than even I can provide
- D) Go to a smaller conference and push harder to get to the front of that line
- E) Mainly B, but C&D aren’t bad ideas either.
#11 – Just for fun; if you were unmarried and could have a date with any of the following ladies of SEO, who would it be (sorry, Mrs. Cutts, but your husband is a heartthrob):
- A) SEO Fangirl
- B) Jessie Stricchiola
- C) Heather Lloyd-Martin
- D) Rae Hoffman
- E) Lisa Barone
- F) My God, Rand; don’t make me choose!
- G) Wait, my husband sense is tingling! I sense danger!
It’s completely possible that Matt won’t have a chance to answer these, so feel free to go with the “wisdom of the crowds” phenomenon and have a guess at his answers.