Build relationships, not links - Content to Pace Up the Indian Digital Marketing Space

on Monday 26 December 2016
“Build relationships, not links,” says Scott Wyden Kivowitz, who would rather be known as a community and blog wrangler, a father and an educator than just an ace photographer – his principal vocation. Kivowitz believes his abilities are heterogeneous and hence suited to cross-channel consumption through a well-designed digital marketing approach. There is a robust lesson for Indian digital marketers here: if you have great digital content, reach out to as many potential customers as possible through the many arms of cross-channel marketing.

The country is on the cusp of exponential growth in the digital space, fueled by rising Internet usage via smartphones and other means of access. According to Internet Live Stats, India currently has over 46 crore Internet users, the second biggest base in the world after China. Given the major digital push by the Government, this is likely to cross 730 million by 2020, with much of the spike coming from rural areas, says a report by Nasscom and Akamai Technologies. Rural consumers are consuming data in local languages, allaying fears that marketing campaigns written in English would not resonate in India's remote areas.

Then comes the channelizing of disposable incomes: India is expected to have a mobile wallet market of Rs 30,000 crore by 2022, up from Rs 154 crore this fiscal. As the banking customer base grows, the number of credit and debit cards should rise significantly (from 2.70 crore credit cards and 7.6 crore debit cards in November 2016). India already has the second largest Internet population, at 462 million users, and is now the largest smartphone market in the world.

With the Government giving an aggressive push to projects like Digital India, more and more people are moving online for key activities, from reading news to banking and shopping with credit cards. As per Google data, 90% of online users switch between devices to complete a single task, using at least three devices per day. While such uneven behavior makes tracking and attribution difficult, it also hands marketers a huge opportunity to launch cost-effective campaigns built on inviting content and well-chosen channels.

Indian Campaigns Find Cross-Channel Favor


There are several positive examples of cross-channel use of digital marketing by corporates in India. Amazon India's 'Aur Dikhao' digital campaign was rolled out across a plethora of channels and complemented by a TV campaign during IPL 8. The much-lauded creative content rightly played on the Indian sensibility to 'see more and more before buying', and it was leveraged well across channels. On YouTube, for example, it gained over 1.2 million views in a few days as the buzz built around the hashtag #AurDikhao started showing conversions.

Land Rover did something more dramatic worldwide, picking up valuable inputs from social media, where car buyers were actively seeking advice. The company reached out to these potential shoppers on all their devices at every point in the purchase funnel through several optimal routes – from a homepage masthead takeover on YouTube to Lightbox ads across the Google Display Network. It added every available tool to maximise visibility on mobile phones. The result: 100 million impressions in no time and 15% of total sales from online leads.

There are several brilliant case studies vindicating cross-channel marketing, and Indian corporations, regardless of size, scale or budget, now prioritise cross-channel integration of marketing activities, followed by the selection of the channel mix. To build an effective digital marketing campaign, here are a few guidelines:

Narrative of Digital Marketing Expands


Digital marketing broadly describes the marketing of products or services using digital technologies – mainly on the Internet, but also through mobile phones, display advertising and other emerging digital media that can be clubbed under this definition. As the power of new media to deliver personal, relevant and timely messages grows, digital platforms – from SEO and SEM on the Net to call-back and on-hold ring tones on mobiles – are being incorporated into marketing plans.

Given such huge diversity, successful digital marketing strategies today need greater involvement and innovation at the cross-channel level. Marketers are under pressure to grasp, constantly and swiftly, which channels, which types of content and which delivery formats could thrive. Thanks to much-evolved cross-channel marketing, using data and learning gathered across consumer touchpoints, marketers can now carry digital marketing efforts to their logical conclusion. Let us examine the prerequisites for execution:

Build Content to be Fully Exploited by the Digital Strategy


One oft-repeated limitation of the traditional marketing route is that it 'talks at people', whereas digital content marketing 'talks with them'. Consultancy firm McKinsey envisages that digital advertising will be the fastest-growing global advertising segment over the next five years, at 15% a year till 2018, as against 5% for electronic media. The beauty of digital marketing lies in the simplicity of publishing a variety of content formats, though the quality of the content and its presentation decide its universal appeal.

Content is king if it's digital: As successful corporates will vouch, the challenge in building an integrated digital marketing plan lies in the creation, repurposing, amplification and syndication of content that will work across all your digital channels. Those who have made it big in the digital space have done it by pushing a resounding message through the following maze:
  • Organic search
  • Search engine marketing (SEM)
  • Email marketing
  • Display advertising
  • Social media
  • Videos
  • Events
  • Speaking engagements
  • Websites
  • Blogs
  • e-books
  • White papers
A successful campaign is also one that harmoniously integrates online and offline efforts to maximize reach and impact across PR, TV, radio and print as well. Combining digital and traditional advertising strategies is an even bigger task, but one with tremendous rewards.

Channel Optimization, Data Treatment to Influence Results


By definition, cross-channel marketing is the practice of using multiple channels to reach customers and making it easy for them to use (and convert on) whichever medium they are most comfortable with. Despite good content, many promising campaigns fail to pass muster when marketers do not run a well-integrated campaign, often because they do not understand which channels are optimal or fail to appreciate the requisite business impact. Campaigns across multiple channels can produce better results if the data they generate is monitored diligently.

Understand the power of each channel to convert: It is said that if content is king, then conversion is the queen; thus, it is critical to understand which channels would spearhead the strategy and lead to greater business impact. One should strive for a balance between the chosen technology and cross-channel attribution modelling to identify where, how and when different channels impact each other. Simultaneously, a smart marketer must understand the micro nuances of the target audience, using diverse sets of primary as well as secondary data. This data digestion – using both historical and real-time data – would identify not only the right channels but also the right type of digital content.

Tools to Measure Progress and Success


Technology, analytical integrations and greatly refined data points today assist marketers in gauging the actual results of digital marketing efforts, while content performance metrics help them sharpen or refashion the content. In fact, many CMOs seek accurate measurements from multiple customer touch points to shape a progressive cross-channel marketing strategy. The efficacy of campaigns can be gauged through a predefined set of tools and perspectives that follow a customer from the initial interaction with a brand to the final purchase, giving valuable insight into what customers expect from the brand.

Align Digital to Collective Marketing Ethos


Often we hear CMOs blaming the 'silo approach' in an organisation for the failure of campaigns. When departments within an organization work in isolation or do not share useful, actionable data with others, multi-channel promotions suffer unless they are duly integrated with the overall marketing effort. It is imperative to structure and optimise internal departments and digital talent to work towards a common goal. Managements must set up task centres to check and correct the effectiveness of multi-channel marketing efforts by compiling and sharing data from multiple channels and other touch points for future marketing efforts.

Credit: This article was written by Kunal Tomar.

How to get rid of referral spam

on Tuesday 22 November 2016
Unfortunately, there’s no way to get information OUT of Google Analytics once it’s in there. However, there are ways to keep this kind of spam out in the future, and to remove it from the reports on existing data.

How to block referral spam in Google Analytics

As the name suggests, we usually see this kind of spam in the Referrals section of Google Analytics. If Analytics isn't properly set up (which is the case far more often than not), you will see suspicious referrals (under Acquisition > Referrals), such as:

  • secret.google.com
  • 1-free-share-buttons.com
  • 100dollars-seo.com
  • copyrightclaims.org
  • darodar.com
  • Etc.
In the case of the Vote for Trump referral spam, the information is located in the language-tag (under Audience > Geo > Language).

To effectively block referral spam, do the following in Google Analytics:


  1.     Go to ADMIN > All Filters
  2.     Click on “ADD FILTER”
  3.     Give the filter a name (e.g. “include host only”)
  4.     Click on Filter Type “Custom”
  5.     Click “Include”
  6.     Click “Select Field” and select “Hostname”
  7.     Under “Filter Pattern”, add the domain name of your website (e.g. “searchify.ca”)
  8.     Apply the filter to any of your profiles.

This will ensure that only Analytics requests from your own domain are counted in Google Analytics. Automated, external requests from other domains will not be included in the data.
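To make the effect of this filter concrete, here is a minimal Python sketch (illustrative only, not part of Google Analytics) of what an include filter on Hostname does, using the example domain from step 7 and a few hypothetical hostnames: only hits whose hostname matches your own domain are kept, so ghost referrals that never touched your site disappear.

```python
import re

# Hypothetical hostnames attached to incoming hits.
hit_hostnames = [
    "searchify.ca",               # legitimate hit on your own site
    "www.searchify.ca",           # legitimate hit on a subdomain
    "1-free-share-buttons.com",   # ghost referral spam that never loaded your site
    "darodar.com",
]

# The "Filter Pattern" from step 7: your own domain (dot escaped, as in a regex).
include_pattern = re.compile(r"searchify\.ca")

# An INCLUDE filter on Hostname keeps only hits whose hostname matches the pattern.
kept = [host for host in hit_hostnames if include_pattern.search(host)]
print(kept)  # ['searchify.ca', 'www.searchify.ca']
```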

How to remove referral spam from Analytics reports

In order to remove referral spam from reports, add a similar filter to the reports. In the case of the Donald Trump referral spam, do the following to filter the referrer spam out of any Analytics report or Data View:
  1. Create a new Report or Segment
  2. Click on “Traffic Sources”
  3. Under “Source”, type the domain of the spammer you’d like to exclude (e.g. “webmonetizer.net”)
  4. If you have multiple spammers, choose “matches RegEx” and add a list of spammer domains, separated by a pipe sign and with a backslash before each dot (e.g. “webmonetizer\.net|wmasterlead\.com”); see the sketch after this list. Feel free to omit the top-level domains (e.g. “.com” or “.ru”) altogether.
  5. Save your report or segment.
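As a quick sanity check of the regex from step 4, the sketch below (plain Python, with a hypothetical list of referral sources) shows how the pipe-separated, dot-escaped pattern matches the spammer domains you want to exclude while leaving legitimate sources untouched.

```python
import re

# The pattern from step 4: dots escaped, domains separated by a pipe.
spam_pattern = re.compile(r"webmonetizer\.net|wmasterlead\.com")

# Hypothetical referral sources pulled from a report.
sources = ["webmonetizer.net", "wmasterlead.com", "google.com", "searchify.ca"]

# An exclude segment drops every source that matches the spam pattern.
clean_sources = [s for s in sources if not spam_pattern.search(s)]
print(clean_sources)  # ['google.com', 'searchify.ca']
```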


Google Released Penguin 4.0 - Filter Updates in Real time

on Sunday 2 October 2016

So it happened. Google finally released Penguin 4.0 – the last Penguin update of its kind, as it now runs in real time as part of Google's core ranking algorithm.

After a few weeks of turbulence in the SERPs, the announcement that many had predicted was finally made.

The Penguin 4.0 announcement had two key points:

  1. Penguin is now running in real time. This is really good news. There are lots of folks out there who have paid the price for low-quality SEO yet are still not seeing a recovery after removing or disavowing all of their spammy backlinks. Certainly, a house built on dodgy links will not spring back to a position of strength simply by removing those links; however, there are many businesses out there that seem to have been carrying algorithmic boulders around their digital ankles. Hopefully, these folks can now move on, their debt to a punitive algorithm update paid in full.
  2. Penguin is now more granular. This element is a little more curious, in that Penguin 2.0 seemed to add page-level and keyword-level penalties, making it more granular than the 1.0 release. However, we can only imagine that things have got much more advanced, and possibly individual links are considered rather than the seemingly aggregate approach that was taken historically. Only time will tell the degree to which this granular approach will impact sites, but I suspect it will be a good thing for those looking to play by the rules.
It will also be interesting to see how this fits in with the other 200 or so factors or “clues” that Google uses to rank websites. We now have both Panda and Penguin integrated into Google’s core ranking algorithm (though Panda does not run in real time), so it’s possible that the weight of the various known ranking factors may have changed as a result.

One other interesting nugget is that there will be no more notifications for Penguin updates. Penguin now constantly updates as Google crawls the web, so tweaks to the finer points of this system will no longer be announced.

This post first appeared on Search Engine Land (SEL).

Myths About Link Building - Rand Fishkin

on Saturday 10 September 2016
Video transcription of Rand Fishkin's Whiteboard Friday edition

1. Never get links from sites with a lower domain authority than your own

What? No, that is a terrible idea. Domain authority, just to be totally clear, is a machine learning system that we built here at Moz. It looks at all the link metrics and builds the best correlation it can against Google's rankings across a broad set of keywords, similar to the MozCast 10K. Then it tries to represent, all other things being equal and just based on raw link authority, how well a site would perform against other sites in Google's rankings for a random keyword. That does not in any way suggest whether it is a quality website that gives good editorial links, that Google is likely to count, that are going to give you great ranking ability, that are going to send good traffic to you. None of those things are taken into account with domain authority.

So when you're doing link building, I think DA can be a decent sorting function, just like Spam Score can. But those two metrics don't mean that something is necessarily a terrible place or a great place to get a link from. Yes, it tends to be the case that links from 80- or 90-plus DA sites tend to be very good, because those sites tend to give a lot of authority. It tends to be the case that links from sub-10 or sub-20 DA sites tend to not add that much value, and they may have a high Spam Score. You might want to look more closely at them before deciding whether you should get a link.

But new websites that have just popped up or sites that have very few links or local links, that is just fine. If they are high-quality sites that give out links editorially and they link to other good places, you shouldn't fret or worry that just because their DA is low, they're going to provide no value or low value or hurt you. None of those things are the case.

2. Never get links from any directories

I know where this one comes from. We have talked a bunch about how low-quality directories, SEO-focused directories, paid link directories tend to be very bad places to get links from. Google has penalized not just a lot of those directories, but many of the sites whose link profiles come heavily from those types of domains.

However, lots and lots of resource lists, link lists, and directories are also of great quality. For example, I searched for a list of Portland bars — Portland, Oregon, of course known for their amazing watering holes. I found PDX Monthly's list of Portland's best bars and taverns. What do you know? It's a directory. It's a total directory of bars and taverns in Portland. Would you not want to be on there if you were a bar in Portland? Of course, you would want to be on there. You definitely want those. There's no question. Give me that link, man. That is a great freaking link. I totally want it.

This is really about using your good judgment and about saying there's a difference between SEO and paid link directories and a directory that lists good, authentic sites because it's a resource. You should definitely get links from the latter, not so much from the former.

3. Don't get links too fast or you'll get penalized

Let's try and think about this. It's as if Google has some sort of penalty line where they look at, "Oh, well, look at that. We see in August, Rand got 17 links. He was at 15 in July, but then he got 17 links in August. That is too fast. We're going to penalize him."

No, this is definitely not the case. I think what is the case, and Google has filed some patent applications around this in the past with spam, is that a pattern of low-quality links or spammy-looking links coming at a certain pace may trigger Google to take a closer look at a site's link profile or its link practices, and could trigger a penalty.

Yes. If you are doing sketchy, grey hat/black hat link building with your private networks, your link buys, and your swapping schemes, and all these kinds of things, yeah, it's probably the case that if you get them too fast, you'll trip over some sort of filter that Google has got. But if you're doing the kind of link building that we generally recommend here on Whiteboard Friday and at Moz more broadly, you don't have risk here. I would not stress about this at all. So long as your links are coming from good places, don't worry about the pace of them. There's no such thing as too fast.

4. Don't link out to other sites, or you'll leak link equity, or link juice, or PageRank

...or whatever it is. I really like this illustration of the guys who are like, "My link juice. No!" This is just crap.
All right, again, it's a myth rooted in some fact. Historically, a long time ago, PageRank used to flow in a certain way, and it was the case that if a page had lots of links pointing out from it, that if I had four links, that a quarter each of the PageRank that this page could pass would go to each of them. So if I added one more, oh, now that's one-fifth, then that becomes one-fifth, and that becomes one-fifth. This is old, old, old-school SEO. This is not the way things are anymore.

PageRank is not the only piece of ranking algorithmic goodness that Google is using in their systems. You should not be afraid of linking out, even without a "nofollow" attribute. In fact, you should link out. Linking out is not only correlated with higher rankings; there have also been a bunch of studies and research suggesting that there's something causal going on, because when followed links were added to pages, those pages actually outranked their non-link-carrying brethren in a bunch of tests. I'll try and link to that test in the Whiteboard Friday post. But needless to say, don't stress about this.

5. Variations in anchor text should be kept to precise proportions

So this idea that there's essentially some magic formula for how many of your anchor texts and anchor phrases should be branded, partially branded, keyword-match links carrying anchor text specifically for the keywords you're trying to rank for, or random assorted anchor text – and that you need precise numbers like these – is also a crazy idea.

Again, rooted in some fact, the fact being if you are doing sketchy forms of link building of any kind, it's probably the case that Google will take a look at the anchor text. If they see that lots of things are kind of keyword-matchy and very few things contain your brand, that might be a trigger for them to look more closely. Or it might be a trigger for them to say, "Hey, there's some kind of problem. We need to do a manual review on this site." 

So yes, if you are in the grey/black hat world of link acquisition, sure, maybe you should pay some attention to how the anchor text looks. But again, if you're following the advice that you get here on Whiteboard Friday and at Moz, this is not a concern. 

6. Never ask for a link directly or you risk penalties

This one I understand, because there have been a bunch of cases where folks or organizations have sent out emails, for example, to their customers saying, "Hey, if you link to us from your website, we'll give you a discount," or, "Hey, we'd like you to link to this resource, and in exchange this thing will happen," something or other. I get that those penalties and that press around those types of activities has made certain people sketched out. I also get that a lot of folks use it as kind of blackmail against someone. That sucks. 

Google may take action against people who engage in manipulative link practices. But for example, let's say the press writes about you, but they don't link to you. Is asking for a link from that piece a bad practice? Absolutely not. Let's say there's a directory like the PDX Monthly, and they have a list of bars and you've just opened a new one. Is asking them for a link directly against the rules? No, certainly not. So there are a lot of good ways that you can directly ask for links and it is just fine. When it's appropriate and when you think there's a match, and when there's no sort of bribery or paid involvement, you're good. You're fine. Don't stress about it.

7. More than one link from the same website is useless

This one is rooted in the idea that, essentially, diversity of linking domains is an important metric. It tends to be the case that sites that have more unique domains linking to them tend to outrank their peers who have only a few sites linking to them, even if lots of pages on those individual sites are providing those links. 

But again, I'm delighted with my animation here of the guys like, "No, don't link to me a second time. Oh, my god, Smashing Magazine." If Smashing Magazine is going to link to you from 10 pages or 50 pages or 100 pages, you should be thrilled about that. Moz has several links from Smashing Magazine, because folks have written nice articles there and pointed to our tools and resources. That is great. I love it, and I also want more of those.

You should definitely not be saying "no." You shouldn't be stopping your link efforts around a site, especially if it's providing great traffic and high-quality visits from those links pointing to you. It's not just the case that links are there for SEO. They're also there for the direct traffic that they pass, and so you should definitely be investing in those.

8. Links from non-relevant sites, pages, or content outside your niche won't help you rank better

This one, I think, is rooted in that idea that Google is essentially looking and saying like, "Hey, we want to see that there's relevance and a real reason for Site A to link to Site B." But if a link is editorial, if it's coming from a high-quality place, if there's a reason for it to exist beyond just, "Hey, this looks like some sort of sketchy SEO ploy to boost rankings," Googlebot is probably going to count that link and count it well. 

So with that in mind, if you have other link ideas, link myths, or link facts that you think you've heard and you want to verify them, please, I invite you to leave them in the comments below. I'll jump in there, a bunch of our associates will jump in there, folks from the community will jump in, and we'll try and sort out what's myth versus reality in the link building world.

Questions You Must Ask a Prospective SEO Analyst

on Tuesday 30 August 2016
Are you in a dilemma over whether to hire an in-house SEO analyst or a consulting agency? If you’re looking to hire an in-house SEO analyst, I want to help you find the right one. That’s because our most successful SEO consulting happens when there’s a competent SEO manager working in-house. Choosing the right in-house SEO person is a daunting task: he or she needs to be a good communicator, because the right in-house SEO person communicates well with both the CMO and the consultant, follows through on our recommendations and fulfills the plans we’ve jointly made. A good in-house SEO analyst or manager makes the client-consultant relationship a real partnership, so the client wins.

I am sharing a list of 30 questions that you can pose to an aspiring SEO analyst for your firm. Find the right analyst by asking questions that allow candidates not only to talk about their SEO experience but also to reveal their digital marketing knowledge and strategy.

30 SEO Interview Questions


  1. What makes a website search engine-friendly?
  2. How do you define success when it comes to SEO?
  3. How do you stay updated on industry news and algorithm changes?
  4. What programming languages do you have experience with?
  5. Regarding your previous SEO job, what did an average day look like?
  6. How do you adapt to the needs of different clients?
  7. How often do you communicate with clients?
  8. How did you learn SEO?
  9. How do you approach keyword research?
  10. What is the relationship between SEO, SEM and social media marketing?
  11. What SEO tools do you regularly use?
  12. How do you stay organized when working on an SEO project?
  13. Who are Gary Illyes and John Mueller?
  14. What is your favorite website and why?
  15. What is your opinion on proper link building?
  16. How have you dealt with link penalties?
  17. What’s the ideal speed for a site to load a web page?
  18. What method do you use to redirect a page?
  19. What are your thoughts on accelerated mobile pages (AMP)?
  20. What is your process for helping a local business become more visible in search results?
  21. Are you aware of the latest changes to Google and the latest updates to Panda and Penguin?
  22. How has Hummingbird changed the landscape of search?
  23. What is Google’s preferred method of configuring a mobile site?
  24. What do you know about content building and content marketing?
  25. What has been your experience getting content featured in answer boxes?
  26. How have you utilized structured data to earn featured snippets?
  27. What are the three possible configurations for a mobile site? Which do you prefer and why?
  28. What’s your greatest digital marketing success story?
  29. How do you stay up-to-date on the near-constant search algorithm changes?
  30. What metrics do you use to measure SEO success?

Facebook's media arms race

on Tuesday 23 August 2016

Is the social platform protecting its rights or taking away others' in the new battle on ad blockers?

Image Courtesy: Al Jazeera


Facebook has declared war against ad blockers and says that protecting revenues for media outlets was a key motivating factor.

Recently, millions of Facebook users who have installed ad blockers found their feeds littered with advertisements they had already opted out of seeing.

Facebook was behind this action, blocking the ad blockers in order for these ads to appear once again on user feeds. 

With $6.4bn in revenue in the first quarter of 2016, it is clear to see why Facebook is invested in this cat-and-mouse game with the ad blockers and the open-source community that supports them.

Facebook released a statement regarding the ad blocker situation, citing "user experience" and not much else as the reason the blockers have been overruled.

However, some might argue that the lack of transparency regarding the platform's social media streams may have been echoed by the ad blockers themselves.

With ad blockers using "whitelisting" – that is, payment by publishers to keep their ads unblocked – is this media battle truly a matter of moral fabric or simply a series of business decisions?

We take a look at what the Facebook vs the ad blocker battle really means for users, publishers and for Facebook's own business model.

Talking us through the story are: Ben Williams, PR manager, Adblock Plus; Justin Schlosberg, lecturer in journalism and media at Birkbeck University; Lara O'Reilly, senior editor at Business Insider; and Raghav Bahl, founder of Quintillion Media.

Source: Al Jazeera

Black Hat SEO Tactics to avoid

on Tuesday 2 August 2016
In search engine optimization (SEO) terminology, black hat SEO refers to the use of aggressive SEO strategies, techniques and tactics that focus only on search engines, not a human audience, and that usually do not obey search engine guidelines.

Image Courtesy: positionly.com

These practices are against the search engines' terms of service and can result in the site being banned from the search engine and affiliate sites. The tactics and strategies employed by black hat SEO practitioners have been openly denounced in Google's Webmaster Guidelines and Bing's Webmaster Guidelines.

"Is the work that I'm doing adding value to the user or am I just doing this for search engines to see?" is a litmus test on whether an SEO tactic would go against a search engine's webmaster guideline. If no value is added to the user, but rankings are likely to increase, then your decisions are highly likely to be black hat. The same test can be applied to to paid search practices to determine whether an activity is considered black hat ppc.

Black Hat SEO Tactics

The following SEO tricks are considered black hat and should not be used at all if you want to stay in the SERPs with Google and other search engines.
  • Content Automation
  • Doorway Pages
  • Hidden Text or Links
  • Keyword Stuffing
  • Reporting a Competitor (or Negative SEO)
  • Sneaky Redirects 
  • Cloaking
  • Link Schemes
  • Guest Posting Networks
  • Link Manipulation (including buying links)
  • Article Spinning
  • Link Farms, Link Wheels or Link Networks
  • Rich Snippet Markup Spam
  • Automated Queries to Google
  • Creating pages, subdomains, or domains with duplicate content
  • Pages with malicious behavior, such as phishing, viruses, trojans, and other malware

Avoid Black Hat SEO Tactics

Black Hat SEO tactics can get your website banned from Google and other search engines.                  
Though there may be some short-term success through increased traffic to your site, Google penalties are getting more and more sophisticated and can have devastating effects on your rankings and traffic. With hundreds of millions of users searching on Google per day, can you really afford to be de-indexed? 

Expanded text ads - Google AdWords

on Friday 29 July 2016
Expanded text ads are the next generation of text ads, designed for a mobile-first world with both users and advertisers in mind. Expanded text ads give advertisers additional control over their messaging, and provide users with more information before they click your ad. Like text ads, expanded text ads are available on both the Google Search Network and the Google Display Network, and are supported by all the AdWords tools that currently support text ads. Ad extensions (both automatic and manual) are fully compatible with expanded text ads too.

This guide explains what expanded text ads are and how they differ from standard text ads. It also recommends ways to create new text ads in the expanded text ads format.

Image Courtesy: Google.com


Your new Display URL


  • Two headline fields (up to 30 characters each)
  • A single, expanded description field (up to 80 characters)
  • A Display URL that uses your Final URL's domain
  • Two optional "Path" fields, used in the ad's Display URL (up to 15 characters each)


Luxury River Cruise Holidays – Book Your Getaway Today
www.example.com/Cruises
Explore the world along rivers. Selected locations on sale!

With expanded text ads, you no longer need to enter a Display URL that matches your Final URL domain. Instead, AdWords will take the domain from your Final URL and show it as your ad’s Display URL. For example, if your Final URL is www.example.com/outdoor/hiking/shoes, your ad’s Display URL will show as www.example.com.

Expanded text ads also introduce two optional "Path" fields, which can hold up to 15 characters each. They are part of your Display URL and are placed after your website's domain. So if your Final URL is www.example.com/outdoor/hiking/shoes, you might want your path text to be "Hiking" and "Shoes" so that your ad's Display URL is www.example.com/Hiking/Shoes.

We recommend using path text to give people reading your ads a better sense of where they'll land on your website. The text that you put in the path fields doesn't necessarily have to be part of your website's URL, but it should be related to the content on your landing page.
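To illustrate how the pieces above fit together, here is a small Python sketch (not the AdWords API, just an illustration of the rules described in this guide): it derives the Display URL from the Final URL's domain plus the two optional path fields, and checks the documented character limits before you would upload an ad.

```python
from urllib.parse import urlparse

# Character limits documented above: 30 per headline, 80 for the description, 15 per path.
LIMITS = {"headline": 30, "description": 80, "path": 15}

def build_display_url(final_url, path1="", path2=""):
    """Display URL = domain of the Final URL, followed by any non-empty path fields."""
    domain = urlparse(final_url).netloc
    return "/".join([domain] + [p for p in (path1, path2) if p])

def over_limit_fields(headline1, headline2, description, path1="", path2=""):
    """Return the fields that exceed the documented character limits."""
    checks = [
        ("headline 1", headline1, LIMITS["headline"]),
        ("headline 2", headline2, LIMITS["headline"]),
        ("description", description, LIMITS["description"]),
        ("path 1", path1, LIMITS["path"]),
        ("path 2", path2, LIMITS["path"]),
    ]
    return [f"{name}: {len(text)}/{limit}" for name, text, limit in checks if len(text) > limit]

# The example from this guide: Final URL www.example.com/outdoor/hiking/shoes
print(build_display_url("http://www.example.com/outdoor/hiking/shoes", "Hiking", "Shoes"))
# -> www.example.com/Hiking/Shoes

print(over_limit_fields("Luxury River Cruise Holidays", "Book Your Getaway Today",
                        "Explore the world along rivers. Selected locations on sale!"))
# -> [] (the sample ad shown above fits within the limits)
```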

Transitioning to expanded text ads

Right now, AdWords supports the creation of both expanded text ads and standard text ads in both the new and previous versions of AdWords. When you create a new ad, you will default to the expanded text ad format. To switch back to standard text ads, click the "Switch back to standard text ad" link above the "Final URL" field.

Starting 26 October 2016, AdWords will no longer support the creation or editing of standard text ads; from that date, you will only be able to create and edit ads in the expanded text ad format. Existing standard text ads will continue to serve alongside expanded text ads.

AdWords tools that support experimenting with different ad text, like campaign drafts and experiments, are fully compatible with expanded text ads. We recommend trying out different variations on your expanded text ads to see which perform best for your business. When you create a new expanded text ad, we also recommend waiting for the ad to be approved before pausing your standard text ad.

Using bulk editing

Making bulk changes to expanded text ads is very similar to making bulk changes to standard text ads. You can multi-select ads and pause, enable, or remove them via the Edit menu. You can also use "Change ads" in the "Edit" menu to edit multiple text ads at once.

Bulk upload also supports expanded text ads, though with additional column headers that aren't available for standard text ads. You can make changes to both standard and expanded text ads within the same bulk upload spreadsheet.

Using AdWords Editor

You can also create and edit expanded text ads in bulk using AdWords Editor. AdWords Editor supports expanded text ads in Version 11.5.

Make the most of expanded text ads

Expanded text ads share many of the best practices that you're already familiar with for standard text ads. But with almost 50% more ad text available and an additional headline, expanded text ads give you more opportunity to connect with users and drive more qualified clicks to your website.
  • Rethink the messaging of your ad. It can be tempting, but don’t simply add a second headline to your existing ads. Consider the entire message that you want to present, taking into account all parts of your new ad.
  • Take advantage of your character limits. Expanded headline fields increase the clickable space of your ads, and allow you to communicate more with someone who’s searching before they decide whether to click through to your website.
  • Focus on optimising your headlines. People are most likely to read the headline of your ad. When viewed on the search results page, your ad’s headline fields are combined using a hyphen, “-”. On mobile, the headline may wrap beyond the first line. Consider the different ways in which your headline may display when writing your ad to make sure that your ad is compelling and easy to read on different devices.
  • Use ad extensions. Including information below your ad, such as additional deep links into your website or your business location, has been shown to increase your ad's performance.

Six SEO Tips & Rules for 2016

The rules for effective SEO have shifted seismically over the past few years. Experts offer tips on the current state of SEO and how you can use it to maximize your investment in content in 2016.


Marketers are beefing up their investments in content, but to leverage those investments, they’ll also have to put some time and effort into learning the new rules for SEO. The days of driving traffic to your site by packing headlines with keywords are long gone, experts say, and the new SEO strategy revolves around another big-money marketing focus: experience.

“Historically, the recommendations around SEO have been … to focus on keywords,” says Martin Laetsch, director of online marketing at an Ore.-based marketing automation company. “The reality is search engines are getting much smarter. The content creator is having a lot less control over how their pages are showing up and what words they’re showing up for.”

According to an August 2015 study on the future of SEO by Moz Inc., a Seattle-based SEO consulting company, the most important factors for SEO impact next year will be mobile-friendliness, which will increase in impact by 88%; analysis of a page’s perceived value (up 81%); usage data such as dwell time (up 67%); and readability and design (up 67%). SEO factors that the study reported will decrease in impact are the effectiveness of paid links (down 55%) and the influence of anchor text (down 49%).


Here, experts offer six tips on how to use SEO to maximize your content marketing investments.

Intention is everything

You no longer need an exact keyword to offer a relevant search result, says Cyrus Shepard, director of audience development at Moz. “In the old days, it was about getting the click. Now search engines are seeing how people are interacting with your website: Are they going back and clicking on results, or are they finding the answers they’re looking for when they’re on your site? Today it’s about the post-click activity. Not only do you have to get the clicks, but you have to satisfy user intent.”

Keywords aren’t the be-all and end-all

Including keywords in headlines is becoming less important, Shepard says. “Google has gotten better about interpreting meaning. It used to be that if you wanted to rank for ‘best restaurants,’ you had to say ‘best restaurants’ three or four times. It’s still helpful to mention ‘best restaurants,’ but the semantic meaning is becoming much more important. Now you can just talk about great dining experiences, and the search engines will pick up on it.”

Adds Laetsch: “Historically, we wanted to get a keyword in the body copy or in the meta description. Now that’s all gone out the window. As the search engines get smarter, they start to think about other words that you expect to be in that article, what will signal that this is an authoritative article on the topic. If you were writing an article about the Apple Watch, you might have the words ‘Apple,’ ‘iPhone,’ ‘Watch,’ ‘apps’ and ‘time.’ If those are in the body copy, it sends signals to the search engines that this is a pretty good article.” 

Seventy-five percent of search queries are between three and five words long, so you should write headlines accordingly, he adds. “The search engines are figuring out that if people search for the word ‘marketing,’ or any one- or two-word query, they don’t get the results they want. To get quality results that are most likely to answer their question, they have to go to three-, four- or five-word queries. As content creators, when you’re thinking about optimization, you have to think about that.”

Focus on the user experience

Thus, original content is becoming more important than ever: “The more original content that you can produce – whether it’s an image or a video, or long-form content, anything you can put together that’s going to justify someone wanting to read it or share it – the better.”

While articles with a “top five” list format are often clickable, create an editorial calendar that appeals to your customers’ interests; your content has to be original and targeted to your audience. If you curate content and take a paragraph from another article or site, give full credit and add an attribution, but also add a paragraph or two in your own voice: “Here’s why I think it’s relevant.” You’re adding a journalistic voice and making it your own.

Size matters

Longer articles, between 1,200 and 1,500 words, perform better in search, on average, Laetsch says. “It’s significantly different than it was two or three years ago, when 300 words was a pretty long page. Longer articles are getting more traffic, and they’re ranking higher in SEO, especially for competitive terms. The changes that Google is making, and the reason they’re making these changes, is to make sure they’re sending traffic to pages that delight humans.” He suggests breaking up long-form content with subheads, bullet points and images throughout the copy to make it easy for readers to more quickly scan and digest it.

Longer articles perform better in search results because there are more words and images to rank on the page, Shepard says. “People are sharing longer articles on social media more, and linking to them and citing them more. Shorter articles do well sometimes, but on average, longer articles tend to perform better.”

Optimize for mobile

More people are reading news on their smartphones, so it’s important to ensure that your content is searchable there, says Derek Edmond, managing partner and director of SEO and social media strategies at KoMarketing Associates, a Boston-based B-to-B SEO and social media marketing consultancy. “Making sure Google can understand the content that’s found within a mobile app, and leveraging the marketing of the app with respect to SEO, is an opportunity on the consumer and B-to-B marketing side.”

Use unique images

While images aren’t as big of a referral source in Google as they used to be, having unique images on your site is valuable, Shepard says. “The same image can show up in hundreds of places around the Web, but having unique content around those images is what makes it stand out. I’m not opposed to using stock images to illustrate a point, but any time you can create something that’s custom or use unique photography, that will pay off more in the long run.”

The most important SEO tip for 2016 is to focus on your audience, Shepard says. “In the past, it was about marketers trying to promote what they wanted people to see. Today it’s about delivering what people actually want to see – that is what will give you an SEO ranking boost.”

Adds Laetsch: “The reason we’re doing optimization and want to show up in Google, Bing or Yahoo is not because we make money because we show up No. 1 or No. 2. The reason we want to rank well in the search engine is so that our audience, the people we’re trying to reach, have a great experience. It doesn’t matter how high you rank if your target audience goes to your site and they’re not happy.”

This article was originally published in the November 2015 issue of Marketing News.

White hat SEO

on Friday 15 July 2016
White Hat SEO refers to the use of optimization strategies, techniques and tactics that focus on a human audience. To be more precise, a website that is optimized for search engines yet focuses on relevancy and organic ranking is considered to be optimized using White Hat SEO practices. White Hat SEO (also called Ethical SEO) is more frequently used by those who intend to make a long-term investment in their website. Some examples of White Hat SEO techniques include keyword research and analysis, backlinking, link building to improve link popularity, and writing content for human readers.

These tactics stay within the bounds as defined by Google. Examples include:
  • Offering quality content and services
  • Using descriptive, keyword-rich meta tags
  • Making your site easy to navigate

Implement White Hat SEO Methods

Implementing White Hat SEO practices is the best way to create an ethical, sustainably successful website and business. 

Keywords

Keywords still hold value but must be used correctly. Many SEO professionals don’t bother putting keywords in the meta information, as Google doesn’t look at them anymore. See the video below for Google’s Matt Cutts’ reasoning on why keywords in meta information are now more or less ignored by the search engine.

Keyword ‘stuffing’ used to be a very common practice and was a black hat method intended to ensure that the word associated with a company was picked up. We’ve all seen those articles which are barely legible because every other word is a keyword. Many good SEO content writers now refuse to write content where a client asks for a keyword density of 7%, as it lessens the quality of the piece considerably.
However, keywords still have their place in site content, including blogs, images, video and PPC/AdWords. These days, it’s better practice to use related words throughout a piece of writing as well as the main keyword. Key phrases are also good practice and should be used. It’s also important that these appear in titles and sub-headers as well as throughout the text.
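For anyone unsure what a keyword density figure like the 7% mentioned above actually measures, here is a rough Python sketch with a deliberately stuffed sample sentence; it simply counts keyword occurrences as a share of total words, which is why chasing a fixed percentage so quickly damages readability.

```python
import re

def keyword_density(text, keyword):
    """Share of the text's words taken up by occurrences of a (multi-word) keyword."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    key = keyword.lower().split()
    hits = sum(words[i:i + len(key)] == key for i in range(len(words) - len(key) + 1))
    return 100.0 * hits * len(key) / len(words) if words else 0.0

# A deliberately stuffed example: far denser than any natural piece of writing.
sample = ("Our running shoes are light. These running shoes suit trail running, "
          "and the shoes dry fast.")
print(round(keyword_density(sample, "running shoes"), 1))  # 25.0
```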

Content – ascended from King

Have you heard the phrase ‘content is king’? That no longer goes far enough: content is everything (well, pretty much). It’s no longer enough to stick a poorly written blog post up once a month – or worse, use text spinners to rejig old content.

Content must be:

  • Original and not infringe on the intellectual property rights of others
  • Highly relevant and useful to your industry and audience
  • Very well written with good grammar and spelling
  • For images and video, include a title, ALT tag and description and, if appropriate, credit the original artist (when using Creative Commons licensed images, for example)
  • Point to the source of research and quotes where applicable

Adding value to your audience

The best way to perform well in search is by giving value to your audience and, preferably, your industry too.

Grey Hat SEO: Should You Use It?

on Thursday 14 July 2016
Grey Hat SEO is difficult to define. In the words of SEO consultant John Andrews, Grey Hat SEO is not something between Black Hat and White Hat, but rather "the practice of tactics/techniques which remain ill-defined by published material coming out of Google, and for which reasonable people could disagree on how the tactics support or contrast with the spirit of Google’s published guidelines." In other words, it is what its name suggests.

It sits somewhere between white and black and, if used by a professional, can still be effective. However, it’s safe to say that taking a grey hat approach is playing with fire if you’re not 100% sure of what you’re doing, and since SEO is predominantly content-led now, it’s not something we recommend.

Grey hat consists of:


  • Cloaking
  • Purchasing old domains
  • Duplicate content
  • Link buying
  • Social media automation
  • Purchasing followers

What does all this mean?

Some SEO companies use grey hat tactics, but the best ones don’t. The short-term advantage, much like with black hat (and many of these tactics overlap into black hat), doesn’t make for a long-term business. While grey hat might gain you some traffic initially, it won’t last, especially if you get caught out.

In essence, with modern content marketing and SEO, white hat wins out. The web is highly competitive, we all know that, and so the best way to ‘beat the system’ is to work with it and invest in the future of your site.

Like anything in life, taking shortcuts to get where you want to be right now is often the path to failure. Great SEO, good content and a well-planned business – that’s what will give your site the competitive edge in the end.

Showcase Shopping Ads from Google: Major Change to PLAs for Broad Product Queries

on Wednesday 13 July 2016
As sellers are starting to prep their holiday campaigns, Google announced several new updates to Shopping Campaigns on Tuesday (which also happens to be Amazon Prime Day), including a whole new look for generic product queries.

Showcase Shopping Ads for broad queries

The biggest announcement is a new ad format for broad, non-brand product searches like “women’s dresses” or “patio furniture”. Google says 40 percent of product queries are for these kinds of broad terms. In the past, Google has often either not shown any product listing ads on broad queries or shown individual products (“patio furniture”, for example, might yield a mix of individual dining and lounge sets); going forward, Google will show what it’s calling Showcase Shopping ads. The ads appear with a main image and two smaller side images related to the product search. At the bottom of the ad is space for a promotional message or, for Local Inventory advertisers, the distance to the store location.

Google's "Showcase Shopping" ads, unveiled today to a small group of reporters in New York, are meant to help people find what they're looking for even when the search query they entered is quite vague. According to Google, more than 40 percent of shopping-related queries on Google are for broad terms, such as "summer dress," "women's athletic clothing," or "living room furniture."


With the new format, a retailer can choose to have a certain series of images appear in search results related to various search queries and keywords. If a user clicks an image, they'll be brought to another page with additional information about the products. According to Google, 44 percent of people mentioned using images to find ideas while shopping online, illustrating the role that images play in online shopping. Showcase Shopping ads will be available in the coming weeks to all merchants running campaigns in the U.S., U.K. and Australia.

"This is a different ad format for shopping that will put the retailer first and really help people explore and discover what they want to buy and where to buy it," said Jonathan Alferness, Google's vp of shopping and travel.

Mobile research and mobile shopping both continue to grow in prominence. According to Google, shopping and travel searches are up 30 percent year-over-year, and mobile search related to finding the "best" product has increased more than 50 percent.

Along with the Showcase Shopping ads, Google is also updating its TrueView ads. The format, which was first unveiled last spring, will soon allow marketers to include a banner companion next to a video ad so that users can scroll through product images and information while a video is playing. The latest updates will also let advertisers decide which products they want to highlight as a part of the campaign.

According to Alferness, the number of advertisers using the TrueView product has increased by 50 percent from January 2015 to January 2016, with one in three advertisers using the product on a weekly basis. While he wouldn't provide a benchmark for the growth, Alferness said Google is seeing "really, really great momentum."

"We're trying to find ways to better infuse the unique aspects of the retailer into the ad formats to really help that retailer come first to the consumer as a choice," he said.



Mobile is also playing an increasing role in travel research and booking. According to Google, visits to mobile travel sites comprised around 40 percent of all traffic to those websites during the first quarter of 2016. Mobile conversions have also grown, rising 10 percent. And now, nearly half of all referrals from Google Hotel Ads come from smartphones, growing nearly 2.4 times year-over-year. Brands like La Quinta are seeing mobile web traffic account for a third of all traffic to their websites, with mobile bookings increasing by a factor of four in three years.

Because of this, Google is adding features related to booking hotels and flights. For example, through the company's Hotel Smart Filters, users will be able to filter based on prices, ratings or preferences like whether a hotel is "pet-friendly."

Google is also adding features related to finding deals, pointing out when a hotel's price is lower than normal or when a hotel is running a discounted rate. It's also adding tips for when prices are lower – much like it already does with flights.

Speaking of flights, price tracking also is getting an update and will now allow consumers to skip checking back all the time for the latest prices and instead opt in to track fares for specific routes and dates.

All about the new Google RankBrain algorithm

on Tuesday 28 June 2016
Recently, news emerged that Google was using a machine-learning artificial intelligence system called “RankBrain” to help sort through its search results. Wondering how that works and fits in with Google’s overall ranking system? Here’s what we know about RankBrain.

The information covered below comes from three original sources and has been updated over time, with notes where updates have happened. Here are those sources:

First is the Bloomberg story that broke the news about RankBrain yesterday (See also our write-up of it). Second, additional information that Google has now provided directly to Search Engine Land. Third, our own knowledge and best assumptions in places where Google isn’t providing answers. We’ll make clear where these sources are used, when deemed necessary, apart from general background information.

What is RankBrain?


RankBrain is a machine-learning artificial intelligence system whose use Google confirmed on 26 October 2015. It helps Google process search queries and provide more relevant search results for users.

What is machine learning?


Machine learning is where a computer teaches itself how to do something, rather than being taught by humans or following detailed programming.

What is artificial intelligence?


True artificial intelligence, or AI for short, is where a computer can be as smart as a human being, at least in the sense of acquiring knowledge both from being taught and from building on what it knows and making new connections.

True AI exists only in science fiction novels, of course. In practice, AI is used to refer to computer systems that are designed to learn and make connections.

How’s AI different from machine learning? In terms of RankBrain, it seems to us they’re fairly synonymous. You may hear them both used interchangeably, or you may hear machine learning used to describe the type of artificial intelligence approach being employed.

So RankBrain is the new way Google ranks search results?


No. RankBrain is part of Google’s overall search “algorithm,” a computer program that’s used to sort through the billions of pages it knows about and find the ones deemed most relevant for particular queries.

So RankBrain is part of Google’s Hummingbird search algorithm?


That’s our understanding. Hummingbird is the overall search algorithm, just like a car has an overall engine in it. The engine itself may be made up of various parts, such as an oil filter, a fuel pump, a radiator and so on. In the same way, Hummingbird encompasses various parts, with RankBrain being one of the newest.

In particular, we know RankBrain is part of the overall Hummingbird algorithm because the Bloomberg article makes clear that RankBrain doesn’t handle all searches, as only the overall algorithm would.

Hummingbird also contains other parts with names familiar to those in the SEO space, such as Panda, Penguin and Payday (designed to fight spam), Pigeon (designed to improve local results), Top Heavy (designed to demote ad-heavy pages), Mobile Friendly (designed to reward mobile-friendly pages) and Pirate (designed to fight copyright infringement).

RankBrain is the third-most important signal?


That’s right. From out of nowhere, this new system has become what Google says is the third-most important factor for ranking webpages. From the Bloomberg article: “RankBrain is one of the ‘hundreds’ of signals that go into an algorithm that determines what results appear on a Google search page and where they are ranked, Corrado said. In the few months it has been deployed, RankBrain has become the third-most important signal contributing to the result of a search query, he said.”

What exactly does RankBrain do?


RankBrain is mainly used as a way to interpret the searches that people submit, in order to find pages that might not contain the exact words that were searched for.

Didn’t Google already have ways to find pages beyond the exact query entered?


Yes, Google has found pages beyond the exact terms someone enters for a very long time. For example, years and years ago, if you’d entered something like “shoe,” Google might not have found pages that said “shoes,” because those are technically two different words. But “stemming” allowed Google to get smarter, to understand that shoes is a variation of shoe, just like “running” is a variation of “run.”

Google also got synonym smarts, so that if you searched for “sneakers,” it might understand that you also meant “running shoes.” It even gained some conceptual smarts, to understand that there are pages about “Apple” the technology company versus “apple” the fruit.

How’s RankBrain helping refine queries?


The methods Google already uses to refine queries generally all flow back to some human being somewhere doing work, either having created stemming lists or synonym lists or making database connections between things. Sure, there’s some automation involved. But largely, it depends on human work.

The problem is that Google processes three billion searches per day. In 2007, Google said that 20 percent to 25 percent of those queries had never been seen before. In 2013, that figure had come down to 15 percent, which was used again in yesterday’s Bloomberg article and which Google reconfirmed to us. But 15 percent of three billion is still a huge number of queries Google has never seen before - 450 million per day.

Among those can be complex, multi-word queries, also called “long-tail” queries. RankBrain is designed to help better interpret those queries and effectively translate them, behind the scenes in a way, to find the best pages for the searcher.

As Google told us, it can see patterns between seemingly unconnected complex searches to understand how they’re actually similar to each other. This learning, in turn, allows it to better understand future complex searches and whether they’re related to particular topics. Most important, from what Google told us, it can then associate these groups of searches with results that it thinks searchers will like the most.

Google didn’t provide examples of these groups of searches or give details on how RankBrain decides which pages are best. The likely explanation for the latter is that if RankBrain can translate an ambiguous search into something more specific, it can then bring back better answers.

Does RankBrain really help?


While it is hard to point to compelling public examples as testimony to the greatness of RankBrain, I really do believe that it probably is making a big impact, as Google is claiming. The company is fairly conservative with what goes into its ranking algorithm. It does small tests all the time. But it only launches big changes when it has a great degree of confidence.

Integrating RankBrain, to the degree that it’s supposedly the third-most important signal, is a huge change. It’s not one that I think Google would do unless it really believed it was helping.

When did RankBrain start?

Google told us that there was a gradual rollout of RankBrain in early 2015 and that it’s been fully live and global for a few months now.

What queries are impacted?

In October 2015, Google told Bloomberg that a “very large fraction” of the 15 percent of queries it had never seen before was processed by RankBrain - in short, 15 percent of queries or less.
In June 2016, news emerged that RankBrain was being used for every query that Google handles.

NOTE: This story has been revised from when it was originally published in October 2015 to reflect the latest information. (It is republished here on the blog for reference only.)

SEO is Not “Search Engine Manipulation” that Google will ban you for

on Thursday 23 June 2016

SEO not equivalent to “search engine manipulation” 

Google is currently in a legal fight with a publisher that was banned from its search results for “search engine manipulation.” That term should not be misconstrued to mean that SEO itself is bad, is spam, or is something you could get banned for.

In its filing, Google explicitly describes “search engine manipulation” as something the company fights against and might ban anyone for and, importantly, defines it pretty much the way many people might define SEO.

Google: SEO is not spam

Given that, is Google trying to kill SEO, as the headline of the Entrepreneur article asks? Almost certainly not, as I’ll explain. But let’s start with the official statement that Google gave Search Engine Land on this:

While we can’t comment on ongoing litigation, in general, Google supports and encourages SEO practices that are within our guidelines and don’t consider that spam.

Got it? SEO - commonly accepted best practices - isn’t spam. And anyone encountering the newfangled term of “search engine manipulation” should view that, in my opinion, as equating with spam and not SEO.

What is the story behind the term “search engine manipulation”?

The recent lawsuit is between Google and e-ventures, the publisher whose sites were banned. A declaration by Brandon Falls, a Google search quality analyst, introduces the term “search engine manipulation” for the first time.

From Falls’ declaration, the first reference to “search engine manipulation” is as follows:

 An important part of providing valuable search results to users is Google’s protection of the integrity of its search results from those who seek to manipulate them for their own gain.

As noted, efforts to subvert or game the process by which search engines rank the relevance of websites are called “webspam” in the search industry.

Such search engine manipulation harms what is most valuable to users about search: the quality (i.e. relevance) of our search results for users.

Accordingly, Google considers search engine manipulation to be extremely serious and expends substantial resources to try to identify and eliminate it. These actions are critical to retain users’ trust in Google’s search results.

The passage above opens with Google saying that it tries to protect its search results from those who “seek to manipulate them for their own gain.” The problem with that statement is that anyone doing commonly accepted SEO could be read as manipulating the search results for their own gain; in fact, Google actively encourages and instructs publishers on how to “manipulate” its search results in that sense. Read carefully, though, the newly emerged term that Google calls “search engine manipulation” is equivalent to spam and does not include accepted SEO.

Now read the next statement in the declaration: “Google’s online Webmaster Guidelines include a discussion of ‘Quality Guidelines.’ The Quality Guidelines enumerate numerous manipulation techniques that violate Google’s Webmaster Guidelines.”

In this statement, Google is inherently acknowledging that there are many search engine manipulation techniques but that only some violate its guidelines. In other words, not all manipulation is bad or an actionable offense. SEO done within its guidelines should be OK. Outside the guidelines, that’s spam and it can potentially get you in trouble.

It’s a pity Google wasn’t much clearer and didn’t stick with the commonly used industry terms. As a result, it’s possible for people to fear-monger, or to genuinely fear, that doing accepted SEO might be a bannable offense.

Still, in reality, anyone who stays within Google’s guidelines should have nothing further to fear, as Google’s own statement today says. In particular, it’s important to note that the declaration introducing “search engine manipulation” as a term was made back in November 2014; it has only just been noticed now.

This update first appeared on SEL.

Embrace the new king of Marketing

on Wednesday 22 June 2016
Content creators ruled and reigned over marketing and advertising circles for years. But the growth of new media platforms is shifting the power and ad dollars away from content creators to content platform owners. After all, the real value lies in reaching the consumer through the medium and form of his choice. All hail the new king set to reign in the new digital age.

Traditional models of content monetisation such as subscriptions, ticket sales, and license fees used by media companies have been disrupted forever, with consumers moving online and demanding access to free online content. While the movement of ad spend from print to online channels has subsidised content costs and helped monetise content, that model is yet again threatened as content consumption moves from traditional online media to newer channels offered by internet companies. Ad dollars are following suit, proving to be both the biggest disruptor and the biggest opportunity for the industry.

Decelerating content-driven growth


“Build it and they will come” was the mantra of most media companies: provide good quality content and expect customers to buy tickets to see movies, subscribe to TV shows, pay affiliate fees to show content, or advertise on the content. This model has stood the test of time and proved greatly beneficial to customers over the past few decades, providing quality content at lower and lower costs. As long as ad revenues continued to flow in, media companies could afford to reduce their prices by subsidising them with ad revenues. As long as “content was king,” the advertising spend would follow and continue to fill their coffers – up until the new-age customer started viewing content on different media platforms – apps, social media, and peer-to-peer sharing networks. The ad dollars too began shifting to where these customers spend their time and, consequently, consume content – the likes of Google, Facebook and their spin-offs. The coffers are now running dry.

The numbers speak for themselves:



The ad revenues generated by Google and Facebook already exceed the entire ad spend on TV – and this is without the internet behemoths activating their entire portfolio of media channels (WhatsApp being a case in point).

So why is this happening? And what are media companies doing about it?


Well, not much! Some media companies are imposing punitive price increases on customers who are moving to internet media channels, and others are introducing data limits on content consumed. But both these measures are proving to be counter-productive, as they drive customers, eventually in greater numbers, into the arms of more “flexible” media channels such as apps and social media.

A case in point is the Indian online retailers who realised that they had huge unpaid advertising bills from advertisers who were just not paying up for ads placed on their sites. The lack of prepaid ads for smaller advertisers, or of software to run large ad accounts (like Amazon), led to these gaping holes in their ad revenues. It is this gap that a Google can easily bridge – by providing publishers a safe, reliable ad service to receive ads, advertisers an easy-to-use platform to launch their ads, and consumers access to content at a low cost. Media companies are severely restricted by their inability to control these new-age platforms from which customers are now consuming their content. As we now know, the new-age customer spends increasingly larger amounts of time online and demands his content online as well. With 43 percent of his time online spent on search (21 percent) and social media (22 percent), it makes sense for advertisers to spend their ad dollars on Google and Facebook to capture this piece of the customer’s time.
With the above kind of consumer mind share, we foresee that internet companies will morph into advertising firms raking in the ad dollars on different online channels while media companies will struggle with selling content to these internet channel providers.

A new reign and a new strategy


The future of advertising will certainly move online and into the realm of the internet companies. While media companies will continue to be the primary creators and custodians of content, content will certainly not be king, as most media firms are competing for space on the new internet channels by pitching lower quotes. All will need to bow to the new emperor on the block – the internet marketeers. And with the dethroning of content, a new strategy that factors in the shift in power and its impact on the bottom line will be needed. Being good at creating content is no longer enough; knowing how to distribute it on the web matters just as much.

Credit: This post first appeared on the Forbes blog.

7 Common Hreflang Mistakes

on Saturday 11 June 2016
The Purpose of Hreflang

Hreflang annotations are meant to cross-reference pages that are similar in content, but target different audiences. You can target different audiences with hreflang according to their language and/or their country. This ensures that the correct pages will be shown to the correct users when they search on the versions of Google search that you are targeting.

Here is an example of two hreflang tags that target English speakers in the USA and English speakers in Canada:

<link rel="alternate" hreflang="en-us" href="http://www.example.com/usa/" />
<link rel="alternate" hreflang="en-ca" href="http://www.example.com/ca/" />

Both of these tags would appear on both pages. This would ensure that Canadians searching on google.ca would find the page targeting Canadians, and Americans searching on google.com would find the page targeting Americans.

So that’s the purpose of hreflang, but often I come across mistakes and misconceptions about how to implement hreflang. Here are some of the most common:

Common Mistakes in Implementing Hreflang


Return Tag Errors

“Return tag errors” are the result of hreflang annotations that don’t cross-reference each other. These can be found within Google Search Console under the International Targeting tab. If your website has hreflang annotations, either via the page tagging method or the xml sitemaps method, there will be data reported on how many hreflang tags were found and how many hreflang errors were found. If there are errors, often those errors are “return tag errors” – for example, one site showed 412 hreflang tags with errors, all due to “no return tags.”


Your annotations must be confirmed by the pages they reference. If page A links to page B, page B must link back to page A; otherwise, your annotations may not be interpreted correctly.

Often, the “missing link” is that the hreflang tags do not include a reference to the page itself. Your annotations should be self-referential: page A should use a rel-alternate-hreflang annotation linking to itself.
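
For example (a minimal sketch reusing the example.com URLs from earlier), both the US page and the Canadian page would carry the same complete set of annotations, each one referencing itself as well as its alternate:

<!-- These two lines appear on BOTH http://www.example.com/usa/ and http://www.example.com/ca/ -->
<link rel="alternate" hreflang="en-us" href="http://www.example.com/usa/" />
<link rel="alternate" hreflang="en-ca" href="http://www.example.com/ca/" />

Because both tags are present on both pages, every annotation is reciprocated and self-referential, so no return tag errors should be reported.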

Using the Wrong Country or Language Codes

When you are adding hreflang codes to your webpages, you need to be absolutely sure that you are using the correct country and language codes. According to Google, “The value of the hreflang attribute must be in ISO 639-1 format for the language, and in ISO 3166-1 Alpha 2 format for the region. Specifying only the region is not supported.”

One of the most common mistakes is using “en-uk” to specify English speakers in the United Kingdom. However, the correct hreflang tag for the UK is actually “en-gb.”
You can use an hreflang tag generator tool to figure out which values you should be using.
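
As an illustration (using a hypothetical /uk/ page on the same example.com site), the first tag below uses the unsupported “en-uk” value and would not be recognised, while the second uses the correct “en-gb” value:

<link rel="alternate" hreflang="en-uk" href="http://www.example.com/uk/" /> <!-- invalid region code -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.com/uk/" /> <!-- correct -->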

Combining Hreflang Sitemaps and Page Tagging Methods

There is no need to use multiple methods of hreflang implementation; doing so would simply be redundant. You can use either method, as there is no clear advantage of one over the other. Here are some considerations for when you are deciding whether to use the xml sitemaps or the page tagging method:
  • Hreflang xml sitemaps can be difficult to create and update. You can use online tools or create them in Excel, but it is difficult to automate the process. If you have xml sitemaps that your CMS updates for you automatically, it would be better to continue to use those rather than create separate, static hreflang xml sitemaps (see the sample sitemap entries after this list).
  • Page tagging leads to code bloat, especially when you are targeting several countries/languages. That can mean an additional 10+ lines of code to each geo-targeted page.
  • Some content management systems, such as WordPress and Drupal, offer automatic hreflang page tagging solutions.
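
For reference, here is a minimal sketch of what an hreflang xml sitemap could look like for the example.com URLs used earlier. Note the xhtml namespace declaration on the urlset element, which the alternate-link entries require:

<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/usa/</loc>
    <xhtml:link rel="alternate" hreflang="en-us" href="http://www.example.com/usa/" />
    <xhtml:link rel="alternate" hreflang="en-ca" href="http://www.example.com/ca/" />
  </url>
  <url>
    <loc>http://www.example.com/ca/</loc>
    <xhtml:link rel="alternate" hreflang="en-us" href="http://www.example.com/usa/" />
    <xhtml:link rel="alternate" hreflang="en-ca" href="http://www.example.com/ca/" />
  </url>
</urlset>

As with page tagging, each URL entry lists the full set of alternates, including itself.
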
Believing That hreflang Annotations Will Consolidate Link Authority

This is another common misconception that can trip up even advanced SEO experts. There have been articles published that seem to show that, once hreflang is correctly implemented across multiple top-level domains or sub-domains, the most authoritative domain gains in link authority. This has not been verified with other international SEO experts, and I have no evidence to believe this is the case either.
The best way to build link authority and consolidate it across your geo-targeted web pages is to keep your content all on one domain. Use a generic top-level domain such as a .com, and use the sub-folder method to create your country- or language-targeted content. I have also discussed this subject with Gianluca Fiorelli and Martin Kura.

Believing That Hreflang Tags Will Fix Duplicate Content Issues

This is another tricky subject that deserves some nuance. Duplicate content itself is often misunderstood, and throwing hreflang into the mix makes it even more difficult to understand. Hreflang does not “fix” duplicate content issues, per se. For example, when you add hreflang tags to your site, they will appear in the International Targeting tab of Google Search Console (so Google does indeed recognize and understand them), but you will still continue to see Duplicate Title Tags and Duplicate Meta Description warnings in the HTML Improvements tab (if you have pages with duplicate titles and descriptions across your geo-targeted webpages).

So if you have two pages in the same language targeting different regions, such as English in the USA and Canada, the content of those two pages may be so similar that they are considered duplicates. Adding hreflang tags will not change that. It is still possible that your American page may outrank your Canadian page if the American page has significantly more link authority, and especially if it has links from Canadian sources.

Hreflang tags can help to alleviate this issue, but they are not enough on their own. They provide a technical structure that helps Google sort out and understand your content, but to run a full-fledged international site (or sites), you need a holistic international marketing strategy that includes building link authority to your site(s) from the relevant countries/languages that you are targeting. Hreflang is very effective at handling cross-annotations among different languages, but when it comes to the same language in different regions, you can get mixed results.

Not Using Canonical Tags and Hreflang Tags Together Correctly

The hreflang tag can also be used along with rel="canonical" annotations, but hreflang tags need to reference self-referential, canonical URLs. For example, page A should have a canonical tag pointing to page A, page B should have a canonical tag pointing to page B, and page C should have a canonical tag pointing to page C. All three pages should have hreflang tags that mention all three of the pages in the group. You do NOT want to canonicalize only one version of a page in a page grouping, as that would interfere with hreflang annotations.

Example:

On this page, http://www.example.com/usa/ the hreflang tags might say:
<link rel="alternate" hreflang="en-us" href="http://www.example.com/usa/" />
<link rel="alternate" hreflang="en-ca" href="http://www.example.com/ca/" />
So in this case, the canonical tag for this page would be:
<link rel="canonical" href="http://www.example.com/usa/" />
And on the Canadian page, the hreflang tags would remain the same, but the canonical tag would be:
<link rel="canonical" href="http://www.example.com/ca/" />

Not Using Absolute URLs

This one is a heart-breaker, because often everything is correct except for the simple fact that the hreflang link referenced is relative rather than absolute. There really is no margin for error with hreflang tags, so make sure you are always using absolute URLs. For example, here is what NOT to do:

<link rel="alternate" hreflang="en-us" href="/usa/" />
<link rel="alternate" hreflang="en-ca" href="/ca/" />
Google wants to be able to crawl the entire URL path, especially since many times hreflang tags reference separate ccTLDs or sub-domains.
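
For comparison, here is the corrected version of those same two tags, with fully qualified absolute URLs (reusing the example.com addresses from earlier):

<link rel="alternate" hreflang="en-us" href="http://www.example.com/usa/" />
<link rel="alternate" hreflang="en-ca" href="http://www.example.com/ca/" />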

Content credit: Kaitlin McMichael