
Sunday, January 26, 2014

Initial SEO Strategy: Tips You Must Follow Before Optimizing Your Site


Essential search engine optimization (SEO) is fundamental. SEO will help you position your site properly so it can be found at the most critical points in the buying process, or whenever someone needs your website.

What are search engines looking for? How can you build your site in a way that pleases both your visitors/customers and Google, Bing, and the other search engines? Most importantly, how can SEO make your web presence more profitable?

"Skipping the basics and spending all your time and money on social and 'fancy stuff' is the same as skipping brushing your teeth and showering, but buying white strips and wearing expensive cologne," Shelby (Director of SEO, Chicago Tribune/435 Digital) said.

#What is SEO, Basically?
The objective of foundational SEO isn't to cheat or "game" the search engines. The purpose of SEO is to:
  • Create a great, seamless visitor experience. 
  • Communicate your intentions to the search engines so they can recommend your site for relevant searches.
Your Website is Like a Cake
Your links, paid search, and social media act as the icing, but your content, information architecture, content management system, and infrastructure act as the sugar that makes the cake. Without them, your cake is flavorless, boring, and gets tossed in the garbage.


#What Are Search Engines Looking For?
Search engines want to do their job as well as possible by referring visitors to websites and content that are most relevant to what the user is searching for. So how is relevancy determined?

Content: Determined by the theme being presented, the text on the page, and the titles and descriptions that are given. 
Performance: How fast is your site, and does it work properly? 
Authority: Does your site have content good enough to link to, and do other authoritative sites use your site as a reference or cite the information that's available? 
User Experience: How does the site look? Is it easy to navigate? Does it look safe? Does it have a high bounce rate?


#What Search Engines Do NOT Want
Search engines have only a limited amount of data storage, so when you use shady tactics or try to mislead them, chances are you're going to hurt yourself in the long run. Things the search engines don't want are:

Keyword Stuffing: Overuse of keywords on your pages. 
Purchased Links: Buying links will get you nowhere when it comes to SEO, so be warned. 
Poor User Experience: Make it easy for visitors to get around. Too many ads, and making it too difficult for people to find the content they're looking for, will only increase your bounce rate. Knowing your bounce rate helps you judge other information about your site; for example, if it's 80 percent or higher and you do have content on your pages, chances are something is wrong.

#Know Your Business Goal:

While this seems self-evident, too many people fail to sit down and focus on what their main objectives are. Some questions you need to ask yourself are:

  • What defines a conversion for you? 
  • Are you selling eyeballs (impressions) or what users click on? 
  • What are your main objectives? 
  • Do you know your assets and liabilities?

#Don't Underestimate Other Optimization Tactics:


Keyword strategy is important to implement not only on your website, but it should also extend to other off-site platforms, which is why you should also be thinking about multi-channel optimization. These multi-channel platforms include:
  • Twitter
  • Facebook
  • LinkedIn
  • Email
  • Offline, such as radio and TV ads
Being consistent with keyword phrases across these platforms will not only help your branding efforts, but also train visitors to use the specific phrases you're optimizing for.

#Be Relevant with Domain Names: 
Domain naming is vital to your overall foundation, so as a best practice you're better off using sub-directories (example.com/awesome) rather than sub-domains (awesome.example.com). Some other best practices with domain names are:

Relevant Domains: If you type in www.example.com, but typing in just example.com does not redirect to www.example.com, that means the search engines are seeing two separate websites. This isn't good for your overall SEO efforts, as it dilutes your inbound links: external sites will be linking to both www.example.com and example.com (see the quick check below). 

Older Domains: Old domains are better than new ones, but if you're buying an old domain, first make sure the previous webmaster didn't do anything shady that caused the domain to be penalized. 

Keywords in URL: Having the keywords you're trying to rank for in your domain will only help your overall efforts.
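
A quick way to check the redirect point yourself is with a header request; this is just a sketch using a placeholder domain, so substitute your own hostname:

  curl -I http://example.com/
  # A healthy setup answers with a permanent redirect to the preferred host:
  # HTTP/1.1 301 Moved Permanently
  # Location: http://www.example.com/

If the bare domain returns a 200 instead of a 301, both versions are being served separately and your inbound links are being split between them.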

#Optimize for Other Devices Too:
In addition to optimizing for the desktop experience, make sure to focus on mobile and tablet optimization, as well as other media. 

  • Create rich media content like video and images, as it's often easier to get a video to rank on the first page than it is to get a plain text page to rank. 
  • Optimize your non-text content so search engines can see it. If your site uses Flash or PDFs, be sure to read up on the latest best practices so search crawlers can crawl that text and give your site credit for it.

#Concentrate on Your Meta Data Too:
The content on your site should have title tags and meta descriptions. 

  • Meta keywords are largely ignored by search engines these days, but if you still use them, make sure each one speaks specifically to that page and is formatted correctly. 
  • Your meta description must be unique and relevant to that particular page. Meta descriptions duplicated from page to page will not improve your rankings. 

Title tags must be unique too! Think of your title as a 4-8 word ad, and do your best to entice the reader so they want to click and read more.
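
As a rough illustration (the pages and wording here are invented), each page's head section should carry its own title and description, along these lines:

  <!-- Page: /running-shoes -->
  <title>Trail Running Shoes for Beginners | Example Store</title>
  <meta name="description" content="Compare lightweight trail running shoes for beginners, with sizing tips and current prices.">

  <!-- Page: /hiking-boots -->
  <title>Waterproof Hiking Boots | Example Store</title>
  <meta name="description" content="Find waterproof hiking boots built for rough terrain, with reviews from real hikers.">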

Summary 
You should always keep SEO at the forefront of your mind, and always follow best practices. Skipping the fundamentals of SEO will only leave your site's foundation a mess and prevent you from fully maximizing your revenue opportunities.

Wednesday, January 22, 2014

Google Webmaster Tools Crawl Errors Reports Now Showing Errors On Final Redirect URL

In the past, we have seen occasional confusion by webmasters regarding how crawl errors on redirecting pages were shown in Webmaster Tools. It's time to make this a bit clearer and easier to diagnose! While it used to be that we would report the error on the original - redirecting - URL, we'll now show the error on the final URL - the one that actually returns the error code.

Let's look at an example:

 
URL A redirects to URL B, which in turn returns an error. The type of redirect and the type of error are unimportant here.
In the past, we would have reported the error observed at the end under URL A. Now, we'll instead report it as URL B. This makes it much easier to diagnose the crawl errors as they're shown in Webmaster Tools. Using tools like cURL or your favorite online server header checker, you can now easily confirm that this error is actually taking place on URL B.
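For example, a quick header check along these lines (the URLs are placeholders) shows the redirect on URL A and the error that actually comes back from URL B:

  curl -I http://example.com/old-page   # URL A: 301 with Location: http://example.com/new-page
  curl -I http://example.com/new-page   # URL B: the real problem, e.g. HTTP/1.1 404 Not Found
  curl -sIL http://example.com/old-page # or follow the whole redirect chain in one call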
This change may also be visible in the total error counts for some websites. For example, if your site is moving to a new domain, you'll only see these errors for the new domain (assuming the old domain redirects correctly), which might result in noticeable changes in the total error counts for those sites.
Note that this change only affects how these crawl errors are shown in Webmaster Tools. Also, remember that having crawl errors for URLs that should be returning errors (e.g. they don't exist) does not negatively affect the rest of the website's indexing or ranking (also as discussed on Google+).
We hope this change makes it a bit easier to track down crawl errors, and to clean up the accidental ones that you weren't aware of! If you have any questions, feel free to post here, or drop by in the Google Webmaster Help Forum.

Google’s Matt Cutts: We Don’t Use Twitter Or Facebook Social Signals To Rank Pages

Google’s head of search spam, Matt Cutts, released a video today answering the question, “are Facebook and Twitter signals part of the ranking algorithm?” The short answer was no.
Matt said that Google does not give any special treatment to Facebook or Twitter pages. They are in fact, currently, treated like any other page, according to Matt Cutts.
Matt then answered if Google does special crawling or indexing for these sites, such as indexing the number of likes or tweets a specific page has. Matt said Google does not do that right now. Why?
They did at one point, and they were blocked. I believe Matt was referring to Google's real time search deal expiring with Twitter. Matt explained that they put a lot of engineering time into it and then they were blocked, and that work and effort was no longer useful. So for Google to put more engineering time into this and then be blocked again just doesn't pay.
Another reason: Google is worried about crawling identity information at one point and then having that information change without Google seeing the update until much later. Having outdated information can be harmful to some people.
However, Matt did add that he sees Google crawling, indexing and understanding more about identities on the web in the long term. He used Search Engine Land's Danny Sullivan as an example: when Danny writes a story there, the site is authoritative, so it ranks well. But if Danny posts a comment on a forum or on Twitter, it would be useful for Google to know that an authority posted on that specific site, and thus the post should carry more ranking weight in Google.
While Google doesn’t do this now, we know they are indeed working on a solution for this.
Here is the video:

Tuesday, January 21, 2014

Was Expedia Penalized By Google? SearchMetrics Says So.

Yesterday morning Patrick Altoft tweeted to me that SearchMetrics is reporting Expedia lost about 25% of their Google traffic overnight.
SearchMetrics is indeed reporting this, and I confirmed it with Marcus Tober from SearchMetrics via email. I posted the story on the traffic drop at Search Engine Land and asked if it was related to the link buying allegations Expedia was surrounded by last month.
A Hacker News thread that I've been following for the past couple of weeks has more details about how Expedia may have been involved in link schemes and may have received an unnatural link penalty from Google. Google has not yet responded to my inquiries about the penalty, so I have no confirmation from Google or Expedia about it.
SEMrush shows no drop-off in traffic, but SearchMetrics shows a huge decline.
To be fair, SearchMetrics is normally dead on.
SearchMetrics also lists the specific keywords Expedia dropped for.
In after-hours trading, Expedia's stock dropped about 3% on the news.
It is amazing how Google can have such an impact on another large public company. But it is even more amazing that companies of that size feel the need to participate in link schemes. Again, it seems they did participate in link schemes, but it is not 100% clear whether they were penalized by Google for it.

Keep Writing Quality Content: SEO Bloggers React To Matt Cutts’ Claim “Guest Blogging Is Dead”

Google’s head of webspam, Matt Cutts, caused an uproar in the SEO community yesterday when he published a blog post on his personal blog claiming guest blogging for SEO purposes is dead.
In his post, Cutts offered a history of how guest blogging has moved from being a reliable source of high-quality content to now being overrun with spam.
“Guest blogging is done; it’s just gotten too spammy,” wrote Cutts. “In general I wouldn’t recommend accepting a guest blog post unless you are willing to vouch for someone personally or know them well.”
As Cutts’ words spread across the web, many SEO bloggers took to their own blogs to offer their take on the demise of guest blogging.
With so much being said on the topic, we’ve put together a round-up of industry reactions, summarizing comments from a selection of popular SEO bloggers.

Christopher Penn: www.ChristopherSPenn.com

Blogger Christopher Penn wasn’t surprised that an “automated, low-quality, easy-to-outsource” SEO tactic like guest blogging is on its way out.
In his post titled, “Be Happy that Guest Blogging For SEO Is Dead,” Penn advises bloggers to continue as if there were no Google: “Would you still pursue guest blogging if there was no SEO, if Google wasn’t looking over your shoulder? Yes, absolutely.”
Penn writes that the death of guest blogging for SEO is a “good thing” for content marketers focused on creating quality content:
If you’ve been relying on spammy guest blogging practices for SEO purposes, then it’s time to move on. If you’re still bringing in guest bloggers who you know, trust and vouch for personally, then chances are Google isn’t going to hurt you.

Andy Beal: www.MarketingPilgrim.com

Andy Beal is another blogger who is happy about Cutts’ announcement. “I’ve grown tired of the gazillion guest post pitches I receive every day,” writes Beal, “Seriously, it’s become worse than the paid link spam emails.”
Beal claims he had already stopped taking guest posts unless he knew the author personally. He goes on to write that he believes guest blogging for exposure – a tactic Cutts vouched for when he updated his post – is still a valuable tactic, “There’s nothing like a quality guest post to get your name in front of a different audience and generate direct traffic back to your site.”
(Cutts later updated his post to clarify that guest blogging is still worthwhile when trying to gain exposure, branding, increased reach, and community.)

Kevin Phelps: www.GuestBlogPoster.com

Some weren’t as thrilled about the news and even took offense to Cutts’ post.
“Even after Matt Cutts’ well thought out response toward guest blogging, I believe they [Google] penalize link building techniques (as opposed to strategies) out of the sheer fact that Google obviously doesn’t even know what guest blogging is,” writes Kevin Phelps on GuestBlogPoster.com.
Phelps claims Cutts isn’t talking about actual guest blogging tactics when he says “Guest blogging is dead,” but is instead referring to manipulative, spammy techniques.
“Lumping those who legitimately contribute to real websites with spammers isn’t fair and it’s not something Google can even enforce unless you’re engaging in spammy techniques,” writes Phelps.

Ann Smarty: www.SEOSmarty.com

Ann Smarty, who also runs the MyBlogGuest.com service, wrote a post on SEOSmarty.com claiming she isn’t concerned with Google. “Google is NOT your friend or your partner,” writes Smarty, “If you grow big enough, Google is likely to become your competitor. Do you really want to depend on Google?”
Smarty tells beginning bloggers that if somebody likes your content and wants to publish it, there is nothing broken. “If someone wants to contribute to your blog and you LOVE what they have to say? Do you need to be on your own because Google wants you to be alone?” asks Smarty.
She later updated her post, clarifying bloggers should not go against Google guidelines. “What I was really trying to say,” writes Smarty, “Do marketing as if Google didn’t exist,” echoing Penn’s advice to continue as if there was no Google.

Jerod Morris: www.CopyBlogger.com

Copyblogger’s Jerod Morris cut to the chase, writing, “Guest blogging is not done, dead or destitute. Have standards, do right by your audience, and play to win in the long term. In short, don’t act like a spammer.”
Morris pointed out that Google will fail as a search engine if it begins penalizing sites with quality content in the form of a guest post. "Quality always wins," writes Morris.

Elisa Gabbert: www.WordStream.com

Search marketing software provider WordStream published a post on its company blog with similar sentiments. WordStream's Elisa Gabbert argues that any SEO tactic will fall victim to the same spammy charges over time, not just guest blogging.
She says publishers concerned with being penalized by Google should: 1.) Only publish good guest posts; 2.) Not label the posts as guest posts; and, 3.) Build relationships, not links.
“Google has always stressed that quality, unique, user-friendly content is the key to search engine rankings,” writes Gabbert, “My guess is, sites that publish content that meets all those criteria won’t be penalized, whether or not some of those content pieces are guest posts.”

Joost de Valk: www.Yoast.com

The founder and CEO of SEO consulting agency Yoast.com agrees with Cutts: “The latest tactic being hammered by Matt Cutts is guest blogging,” writes de Valk, “As the owner of a fairly popular blog, I can only agree with him: it’s gone too far.”
De Valk’s post focuses on the link building attributes of guest blogging, claiming, “Branding is the new link building.” He expands on the idea that guest blogging is out:
So SEOs have a choice: now that Matt has said guest blogging won’t work anymore, are they going to try and find the next disposable tactic? Will they remain tricksters? Or are they going to become real marketers? I think that as an industry we’ve been relying on crappy tactics enough by now.
De Valk goes on to emphasize how brands need to rely on a variety of tactics – and not just rely on one – to optimize their SEO efforts.

Ryan Jones: www.OutSpokenMedia.com

Ryan Jones shares de Valk's opinion on tactics versus strategies on the OutSpokenMedia.com blog.
“Guest blogging is a tactic, not a strategy,” writes Jones, “It’s time our industry took a step back from the ‘what’ and started taking a longer look at the ‘why’ of SEO tactics.”
He goes on to say:
It seems whenever an SEO tactic becomes popular, Google decides to take action on it…Google hates automated tactics that provide little value to actual website visitors such as creating links and content just to increase search rankings.
Jones believes that while spammy guest blogging is dead, quality guest posts are not. In his post, Jones outlines a number of SEO tactics that have fallen from grace, from link directories, to press releases, comments and infographics.
“You wouldn’t turn down a column on CNN or an editorial in the Huffington Post if they said you couldn’t have a dofollow link, would you?” asks Jones, “It’s about the audience, not the HTML.”

Monday, January 20, 2014

Google Removed 350 Million Ads & Rejected 3 Million Publishers

Google announced on Friday their efforts to keep their ad network safe, in-line and trustworthy. They shared some pretty crazy stats on what that means for their ad network.
  • Removed 350 million bad ads in 2013
  • Disabled 270,000 advertisers in 2013
  • Blacklisted more than 200,000 total publisher pages
  • Disapproved 3,000,000 attempts to join AdSense
  • Disabled 250,000 publisher accounts
Google published an infographic showing these efforts in a friendly way.
Forum discussion at WebmasterWorld.

Google Apologizes For The Hotel Listing Hijack In Google Places

I have to assume most of you have by now heard about the huge mess going on with Google Maps business listings in the hotel sector. Danny Sullivan at Search Engine Land, with his team, wrote up an awesome story explaining how Thousands Of Hotel Listings Were Hijacked In Google+ Local.
I can tell you this story was in the works for a few days, and Danny broke it just the other day. It is honestly shocking how something like this can happen to huge hotel chains. It is even more shocking how Google tries to sweep it under the rug. Yes, I know Google Maps is plagued with issues, especially on the Google Places business listing side. But this is a huge mess.
In short, somehow, spammers hijacked the listings of hotel chains across the world, replacing each hotel's URL with a URL to book the room through their own affiliate site. This likely ended up costing the hotels a tremendous amount in affiliate fees, and I wouldn't blame them if they refused to pay and ended up suing the affiliates that did this.
Here is an example showing one listing with a hijacked URL:
Google barely said anything, but now they have had their community manager, Jade Wang, respond in a Google Business Help thread that no one really looks at. She wrote:

Spam issue for some hotels in Places for Business:
We've identified a spam issue in Places for Business that is impacting a limited number of business listings in the hotel vertical. The issue is limited to changing the URLs for the business. The team is working to fix the problem as soon as possible and prevent it from happening again. We apologize for any inconvenience this may have caused.
At least they apologized for "any inconvenience" it may have caused "some hotels."
It is just amazing how bad Google Maps for business is and how many hacks and issues are within it.
Forum discussion at Google Business Help & Cre8asite Forums.

Thursday, December 26, 2013

When You Have Bad Links Impacting Your Google Rankings, Which Do You Remove?

I see this all the time: a forum thread where a webmaster knows his rankings are suffering in Google because he was hit by Penguin, since he has a lot of really bad, manipulative links. A WebmasterWorld thread sums up the predicament such a webmaster is in:
  1. They hired an SEO company.
  2. That SEO company ranked them well for years.
  3. Then Penguin smashed their links.
  4. They no longer rank as well.
  5. They are upset with the SEO company.
  6. They need to figure out how to rank again.
  7. Removing the links is the only option.
  8. But removing the links that were responsible for their initial good rankings won't help them rank immediately.
In this thread, the site owner sums it up as:
1) What is the sure proof way to make sure a link is 100% bad?
2) I don't want to remove all links cause I am worried my site will drop even more. I'm sure there are some semi-good links that might be helping.
3) After submitting disavow file, typically how long does it take to recover? We have two sites, one seems to be under penguin and panda updates and the other received a manual penalty for certain bad links for certain terms.
It is sad, indeed. But you need to disavow the links, that is for sure. Those links are not helping you and they are now hurting you. Remove the hurt. Then get people to link to you because they want to link to you.
But which links should you remove? Which links are actually hurting you? That is the hard question. One SEO offered his advice:

the best advice I think I can give is to disavow the "obviously bad" links, but keep the ones you think are "grey" or "borderline" and see if you recover -- Basically, in your situation, meaning you don't "know" what's good and what's bad for sure, I'd "start with the obviously bad" and then "keep going" if necessary.
Of course, there are tools like Link Detox, Majestic SEO, Ahrefs, Moz and others. But we are assuming you either have the tools already or will manually go through all your links within Google Webmaster Tools. And when you disavow, make sure to disavow on the domain level.
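If you do go the disavow route, the file you upload through Google's disavow tool is just a plain text list: lines beginning with "domain:" disavow at the domain level, full URLs disavow single pages, and lines beginning with "#" are comments. A hypothetical example:

  # Links placed by the old SEO company's blog network
  domain:spammy-directory-example.com
  domain:paid-links-example.net
  # A one-off bad page where a domain-level disavow isn't wanted
  http://forum-example.org/profile/seo-guy-1234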
Forum discussion at WebmasterWorld.

Black Hats Prepare To Spam Google's Author Authority Algorithm

For the past six months, Google has been working on an algorithm to promote authorities on topics. In short, Google is going to try to figure out which authors or individuals are authorities on a specific topic and promote their content across any site, in some way. You can read more about it in the links above.
This morning, I spotted a thread at Black Hat World where "black hats" are seeking ways to exploit this algorithm by "faking" author authority.
This is how one explained it:
So Google now allows you to "tag" an author in your content. Good authors who are popular get extra ranking bonuses for their articles. So it seems very simple to me. Find a popular author in your niche, and tag him in your links to your content.
Extra link juice off someone else's work.
Another added:
see i am thinking about using my own "fake authors" and starting to build authority around them by using them on all my press releases and article submissions... then later in time, anything posted by that author will be easier to rank?
I doubt, seriously doubt, it will be that easy. But hey, someone has to keep Google on their toes.

Sunday, December 22, 2013

Google December 19th Ranking Fluctuations Reported

So Google confirmed they reduced the authorship snippets showing in the results. We know that. Matt Cutts strongly implied that there was no update on December 17th, despite all the tracking tools lighting up on that date. That implication, that Google minimizes algorithm updates before the holidays, should also apply to what I am seeing today: a lot of chatter, in some niches, about a Google update.
The key indicator I use is webmaster/SEO chatter. I check the chatter at WebmasterWorld and dozens of other forums, and the chatter picked up yesterday. Martin Ice Web, a Senior Member at WebmasterWorld who is based in Germany, is the loudest in claiming an update today. Many agree and see major changes, and there are many threads in the Google forums with individual complaints.
That being said, one of the tools I rarely show you, because it often doesn't match the other tools, is the DigitalPoint Keyword Tracker average change report. It is something fairly new added to the forum's sidebar, and it reports changes in actual rankings for hundreds of thousands of sites and, I'd say, millions of keywords. It is based on what webmasters enter into the tracking tool.

It showed a bit of a change on the 17th but on the 19th, it really skyrocketed, like the forums did.
DigitalPoint Keyword Tracker averages
I emailed Google yesterday to find out if something specific is going on with rankings, but I have yet to hear back.
It could be a Panda refresh or something else, but I have no confirmation from Google.

Google's URL/Content Removal Tool Now A Wizard

Google has updated their URL removal tool to make it easier and smarter to remove content specifically from third-party web sites.
Google told us at Search Engine Land specifically that the tool is smarter by analyzing the content of the URL you submitted and letting you know what options you have based on the details of the cache result, search results and the actual live page.
You can remove complete pages if the page is actually not live or blocked to spiders. You can remove content from a page if the content shows in the Google cache and is not live on the page.
Here are some screenshots of Google's URL/Content Removal Tool Wizard:
Now, it may even work on soft-404s, so be careful. As WebmasterWorld's moderator said:

Saturday, December 21, 2013

Google’s Matt Cutts: Don’t Duplicate Your Meta Descriptions

Google’s Matt Cutts, the head of search spam, released a video today providing an SEO tip on meta descriptions. Matt said, do not have duplicate meta descriptions on your site.
Matt said it is better to have unique meta descriptions, or even no meta descriptions at all, than to show duplicate meta descriptions across pages.
In fact, Matt said he doesn't bother to write meta descriptions for his own blog.
In short, it is better to let Google auto-create snippets for your pages versus having duplicate meta descriptions.

Google Says It’s Now Working To ‘Promote Good Guys’

Google’s Matt Cutts says Google is “now doing work on how to promote good guys.”
More specifically, Google is working on changes to its algorithm that will make it better at promoting content from people who it considers authoritative on certain subjects.
You may recall earlier this year when Cutts put out the following video talking about things Google would be working on this year.


In that, he said, “We have also been working on a lot of ways to help regular webmasters. We’re doing a better job of detecting when someone is more of an authority on a specific space. You know, it could be medical. It could be travel. Whatever. And try to make sure that those rank a little more highly if you’re some sort of authority or a site, according to the algorithms, we think might be a little more appropriate for users.”
Apparently that’s something Google is working on right now.

Cutts appeared in a “This Week In Google” video (via Search Engine Land/Transcript via Craig Moore) in which he said:
We have been working on a lot of different stuff. We are actually now doing work on how to promote good guys. So if you are an authority in a space, if you search for podcasts, you want to return something like Twit.tv. So we are trying to figure out who are the authorities in the individual little topic areas and then how do we make sure those sites show up, for medical, or shopping or travel or any one of thousands of other topics. That is to be done algorithmically not by humans … So page rank is sort of this global importance. The New York times is important so if they link to you then you must also be important. But you can start to drill down in individual topic areas and say okay if Jeff Jarvis (Prof of journalism) links to me he is an expert in journalism and so therefore I might be a little bit more relevant in the journalistic field. We’re trying to measure those kinds of topics. Because you know you really want to listen to the experts in each area if you can.

For quite a while now, authorship has given Google an important signal about individuals as they relate to the content they’re putting out. Interestingly, Google is scaling authorship back a bit. 


Google: Authorship Works With Google+ Vanity URLs

Once you get your authorship working, you don't want to mess around and change URLs. Some don't like to touch it even if the authorship is not showing in the search results, for fear that changing it might break something. This is even more true now that there has been a reduction in authorship display in the results.
One question I get a lot from more advanced SEOs is whether to use the numeric or the custom/vanity URL from their Google+ page. Should you link to the root, the /about, or the /posts URL? The answer is: it does not matter.
Google's John Mueller went on record about this topic this morning on Google+ saying:
I've seen this come up more since custom/vanity URLs for Google+ profiles have become more popular. Authorship works fine with vanity profile URLs, it works fine with the numeric URLs, and it doesn't matter if you link to your "about" or "posts" page, or even just to the base profile URL. The type of redirect (302 vs 301) also doesn't matter here. If you want to have a bit of fun, you can even use one of the other Google ccTLDs and link to that.
So there you have it, concrete information from Google on one of the scary topics to touch on.
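
For reference, the usual way to tie content to a Google+ profile is a rel="author" link from the article to the profile (with a matching "Contributor to" link back from the profile), and per John's comment any of these URL variations should work. The profile URLs below are placeholders:

  <a href="https://plus.google.com/+YourVanityName" rel="author">Google+</a>
  <!-- or the numeric profile URL -->
  <a href="https://plus.google.com/112345678901234567890" rel="author">Google+</a>
  <!-- or a link element in the page head -->
  <link rel="author" href="https://plus.google.com/+YourVanityName/about">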

For more information, please see:
https://plus.google.com/authorship
https://support.google.com/webmasters/answer/1408986
https://support.google.com/plus/answer/2676340

Friday, December 20, 2013

Google: Your Various ccTLDs Will Probably Be Fine From The Same IP Address

Ever wondered if Google would mind if you had multiple ccTLD sites hosted from a single IP address? If you’re afraid they might not take kindly to that, you’re in for some good news. It’s not really that big a deal.
Google’s Matt Cutts may have just saved you some time and money with this one. He takes on the following submitted question in the latest Webmaster Help video:
For one customer we have about a dozen individual websites for different countries and languages, with different TLDs under one IP number. Is this okay for Google or do you prefer one IP number per country TLD?

“In an ideal world, it would be wonderful if you could have, for every different .co.uk, .com, .fr, .de, if you could have a different, separate IP address for each one of those, and have them each placed in the UK, or France, or Germany, or something like that,” says Cutts. “But in general, the main thing is, as long as you have different country code top level domains, we are able to distinguish between them. So it’s definitely not the end of the world if you need to put them all on one IP address. We do take the top-level domain as a very strong indicator.”

“So if it’s something where it’s a lot of money or it’s a lot of hassle to set that sort of thing up, I wouldn’t worry about it that much,” he adds. “Instead, I’d just go ahead and say, ‘You know what? I’m gonna go ahead and have all of these domains on one IP address, and just let the top-level domain give the hint about what country it’s in. I think it should work pretty well either way.”
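To picture the setup being asked about, here is a minimal sketch (hostnames and paths are hypothetical) of several ccTLD sites sharing one IP address as separate virtual hosts; in nginx, these server blocks would sit inside the http block of nginx.conf:

  # Three country sites answering on the same IP address
  server { listen 80; server_name example.co.uk; root /var/www/uk; }
  server { listen 80; server_name example.fr;    root /var/www/fr; }
  server { listen 80; server_name example.de;    root /var/www/de; }

Each top-level domain gives Google its country hint, even though all three answer on the same IP.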
While on the subject, you might want to listen to what Cutts had to say about location and ccTLDs earlier this year in another video.

Tuesday, December 17, 2013

Google Structured Data Dashboard Adds Error Reports

Since we launched the Structured Data dashboard last year, it has quickly become one of the most popular features in Webmaster Tools. We’ve been working to expand it and make it even easier to debug issues so that you can see how Google understands the marked-up content on your site.
Starting today, you can see items with errors in the Structured Data dashboard. This new feature is a result of a collaboration with webmasters, whom we invited in June to register as early testers of markup error reporting in Webmaster Tools. We’ve incorporated their feedback to improve the functionality of the Structured Data dashboard.
An “item” here represents one top-level structured data element (nested items are not counted) tagged in the HTML code. They are grouped by data type and ordered by number of errors:


We’ve added a separate scale for the errors on the right side of the graph in the dashboard, so you can compare items and errors over time. This can be useful to spot connections between changes you may have made on your site and markup errors that are appearing (or disappearing!).
Our data pipelines have also been updated for more comprehensive reporting, so you may initially see fewer data points in the chronological graph.

How to debug markup implementation errors

  1. To investigate an issue with a specific content type, click on it and we’ll show you the markup errors we’ve found for that type. You can see all of them at once, or filter by error type using the tabs at the top.
  2. Check to see if the markup meets the implementation guidelines for each content type. In our example case (events markup), some of the items are missing a startDate or name property (see the markup sketch after this list). We also surface missing properties for nested content types (e.g. a review item inside a product item) — in this case, the lowPrice property.
  3. Click on URLs in the table to see details about what markup we detected when we last crawled the page and what’s missing. You can also use the “Test live data” button to test your markup in the Structured Data Testing Tool. Often, when checking a bunch of URLs, you’re likely to spot a common issue that you can solve with a single change (e.g. by adjusting a setting or template in your content management system).
  4. Fix the issues and test the new implementation in the Structured Data Testing Tool. After the pages are recrawled and reprocessed, the changes will be reflected in the Structured Data dashboard.
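
To make step 2 concrete, here is a minimal sketch of events markup (microdata, with made-up values) that includes the name and startDate properties the dashboard would otherwise flag as missing:

  <div itemscope itemtype="http://schema.org/Event">
    <span itemprop="name">Winter Jazz Night</span>
    <meta itemprop="startDate" content="2014-02-15T20:00">
    <div itemprop="location" itemscope itemtype="http://schema.org/Place">
      <span itemprop="name">Example Hall, Chicago</span>
    </div>
  </div>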
We hope this new feature helps you manage the structured data markup on your site better. We will continue to add more error types in the coming months. Meanwhile, we look forward to your comments and questions here or in the dedicated Structured Data section of the Webmaster Help forum.

Backlinks.com: The Next Link Network Penalized By Google

Google's head of search spam, Matt Cutts, publicly outed on Twitter another link network that Google has penalized. Matt Cutts' new trend is to share a line from the link network's marketing material and then add a word or two saying the opposite: that Google caught them.
This came a week, literally a week, after Google outed Anglo Rank.
In fact, Matt joked on Twitter that Google "should start taking requests for which link networks to tackle next."
Meanwhile, the folks at Black Hat World are not happy for a few reasons. One said, "It's crazy how he can get away with ruining businesses. It's not like spamming the internet's illegal and Google doesn't own the internet." Well, it is spamming Google and I guess Google has the right to fight back?
That being said, guys - stop buying links unless you want to play the crash and burn game.
I received an email from someone who got one of these link penalties but swore they never paid for links. It turns out they did; their SEO company had bought the links for them.
Be careful, and don't mess up your 10-year-old website with these schemes.

Sunday, December 8, 2013

Google Busts Yet Another Link Network: Anglo Rank

Google’s head of search spam, Matt Cutts, just confirmed on Twitter that Google has targeted another “private link network” – this one is named Anglo Rank.
Matt’s tweet was pretty direct, he wrote:
“There are absolutely NO footprints linking the websites together” Oh, Anglo Rank.
That is a quote directly from Anglo Rank’s marketing material, and a dig from Cutts suggesting that indeed, Google was able to spot sites in the network.
In response, Search Engine Land’s editor-in-chief Matt McGee suggested on Twitter that those in the network were likely to find that it was “torched.” Cutts responded by saying “messages can take a few days to show up in [Google Webmaster Tools], so timing of when to post can be tricky to predict.”
In other words — yes, Cutts confirmed that Anglo Rank was penalized, and that those involved with it were getting penalty notifications, and since those were finally starting to appear in Google Webmaster Tools, Cutts felt it was OK to finally go more public with a tweet.


Cutts did say that Anglo Rank is not the only link network targeted in this effort. He responded to me that Google has “been rolling up a few” link networks in this specific target.
So if you get a message in Google Webmaster Tools about paid links in the next day or so, it may be related to this update.

Matt Cutts Talks Content Stitching In New Video

Google has a new Webmaster Help video out about content that takes text from other sources. Specifically, Matt Cutts responds to this question:
Hi Matt, can a site still do well in Google if I copy only a small portion of content from different websites and create my own article by combining it all, considering I will mention the source of that content (by giving their URLs in the article)?



“Yahoo especially used to really hate this particular technique,” says Cutts. “They called it ‘stitching’. If it was like two or three sentences from one article, and two or three sentences from another article, and two or three sentences from another article, they really considered that spam. If all you’re doing is just taking quotes from everybody else, that’s probably not a lot of added value. So I would really ask yourself: are you doing this automatically? Why are you doing this? Why? People don’t just like to watch a clip show on TV. They like to see original content.”

I don’t know. SportsCenter is pretty popular, and I don’t think it’s entirely for all the glowing commentary. It’s also interesting that he’s talking about this from Yahoo’s perspective.
“They don’t just want to see an excerpt and one line, and then an excerpt and one line, and that sort of thing,” Cutts continues. “Now it is possible to pull together a lot of different sources, and generate something really nice, but you’re usually synthesizing. For example, Wikipedia will have stuff that’s notable about a particular topic, and they’ll have their sources noted, and they cite all of their sources there, and they synthesize a little bit, you know. It’s not like they’re just copying the text, but they’re sort of summarizing or presenting as neutral of a case as they can. That’s something that a lot of people really enjoy, and if that’s the sort of thing that you’re talking about, that would probably be fine, but if you’re just wholesale copying sections from individual articles, that’s probably going to be a higher risk area, and I might encourage you to avoid that if you can.”
If you’re creating good content that serves a valid purpose for your users, my guess is that you’ll be fine, but you know Google hates anything automated when it comes to content.