Google Hummingbird Update Does Not Mean Your SEO Is Ruined

on Wednesday 9 October 2013
Hi,

With Hummingbird, Google's roughly 12-year-old core algorithm has been substantially upgraded to provide better results for complex search queries. Many people have been frustrated by Panda and Penguin, and they'll now see Hummingbird in a negative light. Don't fall into that trap. If you're the best at what you do, these updates Google has been rolling out are opportunities to separate yourself from your competition. Competitors may have been engaging in spammy tactics to get good rankings, but if you've been focusing on creating content that provides real value to potential customers, their days are numbered. These changes will help you rise above, and the good news, as mentioned above, is that if you've been doing the right things for your SEO, you don't need to change a thing.

"For your information, Google rolled this update out a few months ago, so there is no need to worry that it will suddenly impact you; the press announcement came only after the update had already been implemented."


Thanks

You may also check out some of the websites I am presently working on, such as Indian Tour Operators.


 
There is one more officially announced update: Google's spam team head Matt Cutts tweeted on 4 October that Penguin 2.1 was launching that day, affecting ~1% of searches to a noticeable degree. More info on Penguin: http://googlewebmastercentral.blogspot.in/2012/04/another-step-to-reward-high-quality.html


Thanks Again

SEO Is Not Dead - Stop Link Building, Start Link Earning

on Thursday 8 August 2013

Google on Guest Blogging: Stop Link Building, Start Link Earning


Hi,

Please have a look at the following summary of a highly entertaining and informative presentation about link building through guest blogging.

Guest Blogging Works, But You Have to do it Right

It’s a scenario that’s far too typical in the SEO world.  An awesome new tactic is discovered, word gets out, and it gets done to death, resulting in it not working anymore (sorry, but it’s true).

Link building through guest blogging has definitely suffered this fate.  It started out as an incredibly effective means of generating high-quality white hat links, but over-use and poor implementation have resulted in bloggers cringing at inboxes full of poorly written, self-serving pitch requests, and ultimately ignoring the vast majority of would-be guest posts.

However, guest blogging still works, and it works well, but it has to be done right.  The key is to stop thinking about it as link building, and start thinking about relationship building.  Build real relationships with the real people running the sites, and the links will come.

Follow these six tips to make your guest posting more effective than you ever thought possible:

#1 Upgrade Your Research

The old way to find potential link sources was a simple Google search for the kind of blog you want a link from.  While this technically still works, this is exactly what your competition is doing, and you want to stand out from the crowd.

A much better way to research sources is through social media channels, especially Twitter, LinkedIn, and Pinterest.  Try searching using relevant keywords with modifiers like:

    Blog
    Blogger
    Editor
    Critic
    Etc.

#2 Don’t be Too Direct

The underlying themes to most unsuccessful guest blog pitch attempts are desperation and laziness.

Something along the lines of:

Hi Blogger,

Here is some content.  Give me a link.

From,

XXXX

Maybe that’s an oversimplification, but far too many guest outreach emails follow this format, and it almost never works.  Remember that you are reaching out to a real person, and it’s a huge turnoff when you immediately demand something.

The first time you contact a blogger, don’t pitch to them; get to know them.  Most bloggers are happy to help out people they like, but the only way to get there is to focus on the relationship before the link.

#3 Approach Through Social Media

Better yet, skip email altogether for the first contact.  Instead, reach out through social channels, where you are much more likely to get a response.

Twitter is one of the best social networks for finding and connecting with bloggers, and should be your first step in reaching out.  Start by following, then tweet directly to them, but don’t ask for a link on the first tweet.

#4 Personalize the Pitch

Nothing will get your guest post denied quicker than sending a generic pitch.  Taking the time to personalize each pitch to the person you are sending it to will greatly increase your success rate, and is well worth the five extra minutes.

What if you don’t know enough about the blogger to make it personal?  Then it’s too soon to be pitching!

#5 Offer Value

The best way to get what you want is to give something back.  The primary value you should be offering is excellent content, but don’t stop there.

Some other great ways you can bring value:

    Promote and share their content on your social networks.
    Bring technical issues to their attention, such as dead links or broken forms.
    Leave comments and participate in discussions.

#6 Maintain the Relationship

Oftentimes when guest bloggers manage to get a placement, they pat themselves on the back, take the link, and are never heard from again.

You’ve put the effort into getting that link, but that’s just the beginning of the potential benefits you can gain from maintaining good relations.  If your content plays well, the blogger will be eager to publish more of your submissions in the future.  This is particularly good advice for agencies, who can leverage these relationships with multiple clients.

The key to effective guest blogging is to stop thinking in terms of links, and start thinking in terms of relationships.  Stop treating guest blogging as a numbers game, and the links will roll in.

NOTE: The content of this post has been compiled from suggestions by various online bloggers and is for reference only.

Google Panda Final Manual Update

on Sunday 17 March 2013

Google Panda Final Manual Update

Google spam team head Matt Cutts said that we should expect a refresh of Panda by Monday. As usual, the entire SEO and webmaster community is discussing the update and expecting ranking changes; existing rankings may fluctuate from worse to better or vice versa.
Apart from that, I would like to share a few points on this.

Google has not yet confirmed the update, nor denied it.

Matt Cutts announced that the Panda algorithm will become more of a rolling update, as opposed to the manually pushed-out update it has been.

So the update that is rolling out over the weekend might not be that noticeable to you and me.


"Thanks, please wait for the final announcement. SEO Tips Center, Kolkata"

Types of Robots.txt

on Tuesday 5 March 2013
Hi,

Good morning every one .

Today I am going to point out a few things regarding the robots.txt file. A robots.txt file defines the privileges of search engine crawlers: which parts of the website will be crawled and which will not.

Example: suppose the root folder of your website contains folders, such as anon_ftp or cgi-bin, that you don't want to give anyone or any robots (crawlers) access to. You can restrict these folders simply by adding a file called robots.txt to your root directory. Here is the format:

User-agent: *
Disallow: /anon_ftp/
Disallow: /cgi-bin/ 

The above example specifies that these two folders are restricted for all crawlers; they cannot index the contents of these folders. If you wish, you can give different crawlers different restrictions.
For example, if you want only Googlebot to skip indexing the two folders mentioned above, the robots.txt file would be:

User-Agent: Googlebot
Disallow: /anon_ftp/
 
The same applies to other crawlers.
 

Blocking user-agents

The Disallow line lists the pages you want to block. You can list a specific URL or a pattern. The entry should begin with a forward slash (/).
  • To block the entire site, use a forward slash.
    Disallow: /
     
  • To block a directory and everything in it, follow the directory name with a forward slash.
    Disallow: /junk-directory/
  • To block a page, list the page.
    Disallow: /private_file.html
     
  • To remove a specific image from Google Images, add the following:
    User-agent: Googlebot-Image
    Disallow: /images/dogs.jpg 
     
  • To remove all images on your site from Google Images:
    User-agent: Googlebot-Image
    Disallow: / 
     
  • To block files of a specific file type (for example, .gif), use the following:
    User-agent: Googlebot
    Disallow: /*.gif$
     
  • To prevent pages on your site from being crawled, while still displaying AdSense ads on those pages, disallow all bots other than Mediapartners-Google. This keeps the pages from appearing in search results, but allows the Mediapartners-Google robot to analyze the pages to determine the ads to show. The Mediapartners-Google robot doesn't share pages with the other Google user-agents. For example:
     
    User-agent: *
    Disallow: /

    User-agent: Mediapartners-Google
    Allow: /
Note that the values in directives are case-sensitive. For instance, Disallow: /junk_file.asp would block http://www.example.com/junk_file.asp, but would allow http://www.example.com/Junk_file.asp. Googlebot will ignore whitespace (in particular, empty lines) and unknown directives in the robots.txt.
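If you want to verify rules like the ones above before uploading them, Python's standard library ships a robots.txt parser. The sketch below checks the two restricted folders from this post against a placeholder site name (example.com is just an assumption for illustration):

```python
# Sanity-check robots.txt rules with Python's built-in parser.
# The rules mirror the anon_ftp / cgi-bin example in this post.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /anon_ftp/
Disallow: /cgi-bin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Both restricted folders are off-limits to any crawler...
print(parser.can_fetch("Googlebot", "http://example.com/anon_ftp/file.txt"))  # False
print(parser.can_fetch("Googlebot", "http://example.com/cgi-bin/form.cgi"))   # False
# ...but the rest of the site is still crawlable.
print(parser.can_fetch("Googlebot", "http://example.com/index.html"))         # True
```

This only tests the standard Disallow rules; Google-specific extensions such as the /*.gif$ wildcard above are not fully supported by this parser.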
 
 
Hope this is helpful. For suggestions, please write to me at akhi8601@gmail.com.


 
 
 

SEO TIPS 2013

on Thursday 7 February 2013

Targeted traffic is the central part of a good website. Search engines have become the leading way of bringing visitors to an online business. Traffic can easily turn into potential business and earnings for virtually any website owner.

Tip 1: Steer Clear of Bad Code.

Make sure web pages are W3C compliant. Search engine crawlers investigate websites by analyzing their source code. The idea is to produce clean, error-free HTML so that these crawlers are able to index every page of a website. Having errors can mean parts of your site won't get found by the crawlers, and in the long run may result in a lower page ranking.

Tip 2: Use Relevant Keywords.

If a website is about Belgian chocolate, then target specific key terms. For example, "gourmet Belgian chocolate" and "Belgian chocolate truffles" are good options because they are very specific. The more specific a keyword is, the less competition it will have in search engine rankings. This gives sites an easier time ranking, and a better position in Google.

Tip 3: Smart Keyword Placement.

Once the key terms have been determined, they should appear somewhere in the page title and header tags. The key terms should also appear in the body of the page, but avoid overdoing keyword density.

Tip 4: Obtain Backlinks.

Relevant inbound backlinks from other websites that are 'trusted' can improve a site's optimization. Try to avoid paid links and excessive reciprocal links, and steer clear of link schemes.


PubCon 2012 slides: disavow links tool

on Wednesday 6 February 2013

Webmaster level: Advanced

Today we’re introducing a tool that enables you to disavow links to your site. If you’ve been notified of a manual spam action based on “unnatural links” pointing to your site, this tool can help you address the issue. If you haven’t gotten this notification, this tool generally isn’t something you need to worry about.

First, a quick refresher. Links are one of the most well-known signals we use to order search results. By looking at the links between pages, we can get a sense of which pages are reputable and important, and thus more likely to be relevant to our users. This is the basis of PageRank, which is one of more than 200 signals we rely on to determine rankings. Since PageRank is so well-known, it’s also a target for spammers, and we fight linkspam constantly with algorithms and by taking manual action.

If you’ve ever been caught up in linkspam, you may have seen a message in Webmaster Tools about “unnatural links” pointing to your site. We send you this message when we see evidence of paid links, link exchanges, or other link schemes that violate our quality guidelines.

If you get this message, we recommend that you remove from the web as many spammy or low-quality links to your site as possible. This is the best approach because it addresses the problem at the root. By removing the bad links directly, you’re helping to prevent Google (and other search engines) from taking action again in the future. You’re also helping to protect your site’s image, since people will no longer find spammy links pointing to your site on the web and jump to conclusions about your website or business.

If you’ve done as much as you can to remove the problematic links, and there are still some links you just can’t seem to get down, that’s a good time to visit our new Disavow links page. When you arrive, you’ll first select your site.



You’ll then be prompted to upload a file containing the links you want to disavow.




The format is straightforward. All you need is a plain text file with one URL per line.
An excerpt of a valid file might look like the following:


# Contacted owner of spamdomain1.com on 7/1/2012 to

# ask for link removal but got no response

domain:spamdomain1.com

# Owner of spamdomain2.com removed most links, but missed these

http://www.spamdomain2.com/contentA.html
http://www.spamdomain2.com/contentB.html
http://www.spamdomain2.com/contentC.html

In this example, lines that begin with a pound sign (#) are considered comments and Google ignores them. The “domain:” keyword indicates that you’d like to disavow links from all pages on a particular site (in this case, “spamdomain1.com”). You can also request to disavow links on specific pages (in this case, three individual pages on spamdomain2.com). We currently support one disavowal file per site and the file is shared among site owners in Webmaster Tools. If you want to update the file, you’ll need to download the existing file, modify it, and upload the new one. The file size limit is 2MB.
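The rules above (comments start with "#", "domain:" lines disavow whole sites, everything else is an individual URL) are simple enough to sketch as a small parser. This is an illustrative helper for working with your own disavow file locally, not a Google tool:

```python
# Parse a disavow file into whole-site entries and individual URLs,
# following the format described above: "#" comments are ignored,
# "domain:" lines disavow an entire site, other lines are page URLs.
def parse_disavow(text):
    domains, urls = [], []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        if line.lower().startswith("domain:"):
            domains.append(line[len("domain:"):].strip())
        else:
            urls.append(line)
    return domains, urls

sample = """\
# Contacted owner of spamdomain1.com on 7/1/2012 to
# ask for link removal but got no response
domain:spamdomain1.com

# Owner of spamdomain2.com removed most links, but missed these
http://www.spamdomain2.com/contentA.html
http://www.spamdomain2.com/contentB.html
http://www.spamdomain2.com/contentC.html
"""

domains, urls = parse_disavow(sample)
print(domains)  # ['spamdomain1.com']
print(urls)     # the three spamdomain2.com page URLs
```

A check like this makes it easy to confirm you are not accidentally disavowing a whole domain when you meant to list a single page, before uploading the 2MB-maximum file.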

One great place to start looking for bad links is the “Links to Your Site” feature in Webmaster Tools. From the homepage, select the site you want, navigate to Traffic > Links to Your Site > Who links the most > More, then click one of the download buttons. This file lists pages that link to your site. If you click “Download latest links,” you’ll see dates as well. This can be a great place to start your investigation, but be sure you don’t upload the entire list of links to your site -- you don’t want to disavow all your links!



This content has been taken from the official Google blog.
 

Beginner's Guide to SEO

on Wednesday 30 January 2013

Get started right now by downloading your PDF copy: Powered By Seomoz

Google Update

on Tuesday 29 January 2013

Once Controversial, Google's Product Listing Ads Now A Big Hit

When Google revamped its shopping service last May, it said it would begin charging merchants to list their products. These Product Listing Ads, as they’re called, set off a firestorm of controversy because they represented the first time Google had eliminated a search service that had free listings and made it paid-only. Although Google posed the move as a way to improve the quality of listings, merchants squawked that this would raise their costs to appear in shopping search results. But even more broadly, the move raised doubts in some people’s minds about whether Google’s search engine could still be seen as fair and impartial if its results were affected by who paid it for shopping listings.

Well, so much for that issue. Not that it has gone away completely, but by several accounts, Product Listing Ads are a big hit with both advertisers and consumers. A study released this morning by ad management service Marin Software indicates that the share of clicks PLAs got as a portion of overall search ad clicks last year more than tripled. They were still a single-digit percentage of text search ad clicks, about 6%, but Google only rolled them out widely in October.

Marin also says that whether merchants wanted to pay or not, they did, increasing their investments in the ads sevenfold, to 3% of their search ad spend as the holiday shopping season peaked. Some retailers spent up to 30% of their search budgets on PLAs, to the tune of hundreds of millions of dollars overall. The ads also had a higher click-through rate and a lower average cost than text search ads, though Marin doesn’t get specific about those measures. “It appears Google’s bet has paid off,” says Matt Lawson, Marin’s VP of marketing. “This is a piece of how Google is competing with Amazon, which is becoming a commerce search engine.”

This isn’t the first report to document the success of Product Listing Ads. Adobe also said recently that by the peak of the holiday shopping season, in mid-December, PLAs accounted for 17% of all ad spending on Google. And search marketing firm RKG said PLAs accounted for 28% of non-brand paid search clicks in the fourth quarter.

Apparent success doesn’t make the ads less controversial, at least among search engine watchers like Danny Sullivan. It just looks like that controversy isn’t enough to keep merchants from using them to make more money.

Benefits of Nofollow & Dofollow Links

on Thursday 24 January 2013

The advocates of the "dofollow-only" camp have a fairly legitimate argument backing them up. In terms of SEO value ("link juice," to use the jargon), Google should theoretically only count dofollow links. Any link that is nofollow carries a direct and clear instruction telling web crawlers to ignore it. Think of the rel="nofollow" attribute as one of those temporary detour signs you occasionally see: the street is still there, you're just being told to pass it by.

The Nofollow Links Argument

On the flip side of the coin, some people believe that nofollow links really do provide some form of link benefit. Above and beyond this, they can provide excellent referral traffic from people clicking through your links. A classic example of useful nofollow links is blog comment links. The vast majority of blogs are nofollow these days, a measure put in place to discourage comment spam. However, a well-placed comment on a prominent blog can bring countless visitors in a very short space of time.

Wikipedia is another great illustration of the benefit of nofollow links. Back in the day, Wikipedia links were actually dofollow. But then link spammers caught on to the fact that this was a huge authority site that anyone could edit to add their own links, and nofollow links were introduced on Wikipedia specifically to combat this. Even so, a link from Wikipedia today still makes your site look more credible, and more to the point, you can get lots of referral traffic from it. One of my websites recently got a link from Wikipedia (it was credited as a source) on a fairly obscure article, yet that one small link still sends around 100 visitors a month. Not bad, if I do say so myself.

The Evidence 

There isn't really much solid evidence to back either side of this argument. Indeed, when it comes to SEO the whole thing is an inexact science. What works for one person may not necessarily work for you. That being said, I strongly urge you to take a balanced and holistic approach to link building.

Build a stable of quality, context-appropriate dofollow links to your site to help it climb the search engine rankings. Guest blogging, directory listings, and article marketing are just a few routes to get dofollow links to your site that will help deliver that all-important link juice. Check out the Affilorama SEO lessons for more great link-building ideas.

Then, once you've done that, start building some nofollow links from blog commenting, forums (certain forums are nofollow), and other places that are going to bring you high volumes of referral traffic. If you can presell effectively, then plenty of revenue awaits from quality referral traffic. Having an opt-in mailing list is even better, as you can capture that traffic and keep those visitors in your sales funnel over and over again.

The Conclusion 

The summary is simple: dofollow links provide better "link juice" for boosting your site up the search engine rankings. This can't be denied. On the other hand, nofollow links can often be extremely context-appropriate, and tend to come from sources that are likely to produce greater levels of referral traffic. Beyond this, search engines are more likely to regard a mixture of both kinds of links as a "natural" link profile. This is a good thing, as an unnatural link profile is much more likely to see you booted from the front page of Google for SEO malpractice.

Hence, it is prudent practice to balance dofollow links with nofollow ones. I wouldn't worry too much about whether the links you are getting are dofollow or nofollow. In fact, I wouldn't stress about it at all. Why? The reason is simple: if you are genuinely out there actively building links to your site, you are generating traffic, and that matters more than anything.

5 On-Page SEO Myths


Hi,

I am mentioning a few myths that have no basis in on-page SEO:

*  Meta keywords matter

 "Meta keywords have no impact on rankings."
 *  You should submit to search engines

 "Tons of snake oil SEO companies still tout search engine submission as an offering. However, submitting to search engines through general “submit here” URLs is futile and useless. Providing that your site has a link or people visiting it, it will get found – and indexed – by the search engines. And if it doesn’t have links or visitors, then you’re not going to rank in the algorithms of 2011 anyway, so being indexed in the engine won’t make much of a difference. "

*  Using the ‘nofollow’ tag on internal links helps you direct internal Pagerank

 "The nofollow attribute was originally created under the claim that it was a joint effort by the leading search engines of the day (in 2005) to help fight blog spam. The theory was that if easily spammable platforms like blogs employed the use of the nofollow tag, the value of the link to the spammer would be worthless. If blog spamming no longer worked as a “ranking technique” as a result, then the incentive to do it would be removed and spammers would attempt to move onto the next target."

*  Toolbar Pagerank matters

 "First, let’s acknowledge that long gone are the days when Google’s algorithm was based primarily on Pagerank. Second, I’d like to be clear that there is a difference between the real, internal Pagerank of a website that Google uses as a part of its algorithm and the green pixels you’re shown in the Google Toolbar (referred to as TBPR - Toolbar Pagerank).

Way back in the day, the higher your TBPR was, the more valuable Google considered your site and the more your outbound links were worth. So, webmasters who knew this decided to only target links from sites  that had a TBPR of 4 or more. While the original “truth” behind that method has long since become fiction, it unfortunately is still a “strategy” employed or suggested by some people who call themselves SEOs. Currently, Toolbar Pagerank seems to have no direct correlation to higher rankings or higher link value, though it is suspected to have some correlation to higher indexing rates. Matt Cutts has been quoted stating that he himself would like to see Toolbar Pagerank go away (I suspect as he may realize it’s mere existence prolongs the death of the TBPR related SEO “strategies.” )"

*  Google hates affiliate websites

 "Google doesn’t hate affiliate websites. Google hates crap affiliate websites. If the affiliate site is thin or contains the same duplicated information as a thousand other websites that are affiliates of a merchant (think affiliate datafeed sites) with no differentiation to it, then yes, Google is likely looking for your “type” of sites to suffer in upcoming search engine updates. But if you can find a way to create a value add and make your affiliate website into an affiliate brand and promote it via legitimate methods within the Google guidelines, then you’re not a specific target and you likely don’t need to fear every update. If you’re in the former category, learn how to survive the affiliate evolution and make defensible websites Google not only doesn’t hate but ones they actually want to rank."

Few more

On-page SEO, or search engine optimisation, means making sure that your website is as search-engine friendly as possible. If your website is not optimised, then you have less chance of getting good results in the search engines. Here is a quick guide to good on-page SEO:

Make sure that all of your web pages can be indexed by search engines - make sure that they all have at least one link from somewhere on your site.

Make sure that you have unique content on every single page.

Make sure that your meta tags are arranged correctly - your page title tags and description tags should describe the content of your different web pages. The page title tags should be less than 68 characters and the description tags more detailed but less than 148 characters.
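The length guideline above is easy to automate. A minimal sketch, using the limits this guide suggests (68 and 148 characters are this guide's rules of thumb, not official cutoffs; search engines actually truncate by pixel width):

```python
# Check a page title and meta description against the character
# limits suggested in this guide (title <= 68, description <= 148).
def check_meta(title, description, title_max=68, desc_max=148):
    problems = []
    if len(title) > title_max:
        problems.append(f"title is {len(title)} chars (max {title_max})")
    if len(description) > desc_max:
        problems.append(f"description is {len(description)} chars (max {desc_max})")
    return problems  # empty list means both tags are within limits

print(check_meta("Blue Widgets | Example Store",
                 "Hand-made blue widgets, shipped worldwide."))  # []
```

Running a check like this over every page is a quick way to catch titles and descriptions that will be cut off in search results.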

Make sure you label the different headers on your web pages using H tags.

Make sure that your web page URLs are SEO friendly; use mod_rewrite for Linux and Apache hosting or use IIS redirects for Windows. Ideally make it so that the URLs describe your content, i.e. use domain.com/blue-widgets.php as opposed to having something like domain.com/product.php?cat=146. Use hyphens or underscores to separate words in the URLs.
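Descriptive, hyphen-separated URL paths like the one above are usually generated from the page title. A minimal "slugify" helper, shown here as an illustration rather than part of any particular CMS:

```python
# Turn a page title into an SEO-friendly URL slug:
# lowercase, with runs of non-alphanumeric characters
# collapsed into single hyphens.
import re

def slugify(title):
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # spaces/punctuation -> hyphen
    return slug.strip("-")                   # drop leading/trailing hyphens

print(slugify("Blue Widgets"))                # blue-widgets
print(slugify("Gourmet Belgian Chocolate!"))  # gourmet-belgian-chocolate
```

The resulting slug ("blue-widgets") is exactly the style of descriptive URL the tip recommends over opaque query strings.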

Make sure that the links within your site are complete, i.e. if you are linking to the blue widgets page, link to domain.com/blue-widgets.php as opposed to just blue-widgets.php.

Make sure that you use descriptive names for your images, i.e. use blue-widget.jpg as opposed to a bunch of numbers and/or letters followed by .jpg.

Make sure that you label all of your images with descriptive alt attributes.

Make sure that you make good use of anchor text links within your content - if you have a page about blue widgets, use the phrase blue widgets in the text that links to it.

Make sure that there is only one version of your site - 301 redirect all non www. URLs to the www. ones or vice versa.

Make sure that there is only one version of your homepage - 301 redirect the index or default page back to domain.com.
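The "one version of your site" rule above boils down to mapping every request URL onto one canonical host. The 301 redirect itself lives in the web server configuration, but the mapping can be sketched in a few lines (www.example.com is a placeholder canonical host, not from the original text):

```python
# Given a request URL, return the canonical www equivalent it should
# be 301-redirected to, or None if it is already canonical.
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url, canonical_host="www.example.com"):
    parts = urlsplit(url)
    if parts.netloc == canonical_host:
        return None  # already canonical, no redirect needed
    return urlunsplit((parts.scheme, canonical_host, parts.path,
                       parts.query, parts.fragment))

print(canonical_url("http://example.com/blue-widgets.php"))
# http://www.example.com/blue-widgets.php
print(canonical_url("http://www.example.com/blue-widgets.php"))
# None
```

The same idea works in reverse if you prefer the non-www version as canonical; the important part is that every page has exactly one address.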

Use the rel="nofollow" attribute in links to websites that you do not trust, that you think may be using spamming techniques, or that you do not want to help in the search engines.

Make sure that your code is valid; in some instances bad code can lead to search engines not being able to properly read a page. Use the W3C validator to check your markup.

If you follow these guidelines you are on your way towards good on page SEO and to good rankings in the search engines.

Advanced Media - The Best Source in the Arena of Mass Media - SEO PPC Tips

on Monday 21 January 2013

With the arrival of the digital era, numerous kinds of digital media came into the limelight, for example the CD and DVD, which were regarded as the fundamental media for storage and playback after their respective inventions. Apart from these, you may also have come across other kinds of digital media designed for use in mass marketing, such as DAT and MiniDisc, among many more, yet over time only a few formats have taken a truly revolutionary approach to digital media. Today, however, the market is fully gripped by Blu-ray technology, which has produced innovative digital media known for its capacity and usefulness. You could say that digital media, together with the Blu-ray player, is positioned to deliver on that promise.



We offer a select range of highly dedicated backup services to provide lasting help, and you will probably appreciate the steady, modern means of accessing all your digital content, even from home. Digital media is rising in popularity through Netflix and iTunes, as consumers now pay mainly for rights to the content rather than owning physical copies of the media files. Apart from this, this medium also aims to make your entire music and DVD library portable, so your music and movies go wherever you go. You no longer need to carry DVDs around to enjoy the benefit of a digital library. Additionally, the worldwide catalogue entry for a piece of music or a picture can easily be identified by our digital media source, with the digital copy matched to the physical copies of pictures and music.



Apart from the aforementioned, there are services that preserve the attractive artwork of the music or film they hold. Many media players and digital stores give users the ability to attach album covers and film poster artwork to the appropriate digital files, allowing full use of the most cutting-edge technology. Meanwhile, Blu-ray media stands in the business as the most recent form of digital media, and with this wonderful advance the copying of a non-physical item can happen in a few seconds. Apart from this, remember that hard drives are considered the most capable form of storage.
Streaming media is the simplest option for the average user in terms of installation and setup. Last but not least, you can make the best use of your existing media rights to get the maximum value. Conclusion: digital media is an ideal tool that can offer you superior products at a very competitive cost, along with full-featured services. Our digital media platforms are scaled according to your needs.