
Latest SEO News Updates

Wednesday, 13 July 2016

Google is a lot of things, but in the world of social media it is still widely considered an incidental player. That might change with this week's announcement that Google's parent company Alphabet has acquired Kifi, a Mountain View, Calif.-based link-sharing startup. Kifi could help the tech giant beef up Google Spaces, its social platform that aims to be the app people use to share content in groups on any topic. Kifi's group-sharing app has Google Search, YouTube and Chrome built in.

The details of the acquisition haven't been released, but Kifi said that part of its team will immediately start working on Google Spaces. That dovetails with the general idea behind Kifi, which is to add context to collaboration efforts and ease the process of organizing and using information for group projects and interests. Kifi has also developed tools for contextualizing stored information, including a deep search engine for Twitter content and links.

But Rand Fishkin, founder of Seattle-based SEO consulting company Moz, told us that it's important to be cautious in ascribing a motive to any of Google's small acquisitions like this one. "It could be a pure 'acquihire,' a use of Kifi's technology for a function entirely separate from its original use, or even an experiment that will never see a public launch," he said. "I wouldn't make any assumptions that Google's going deeper in the content production or distribution space based solely on this move."

Team To Move Over

Launched on May 16, Google Spaces is a way to make group sharing easier for everyone from book clubs to work groups by letting them avoid jumping between apps to copy and paste links. With Google Spaces, group members can see whenever someone shares something new to a space, and the app's conversational view lets users immediately see what the group is talking about without having to wade through a long thread.

Neither company has revealed how many Kifi employees will be joining Google. However, on his Google Plus page, Google engineering director Eddie Kessler said the Kifi team's "great expertise in organizing shared content and conversations" will help Google. That could be helpful to Google Spaces, whose focus now is more on group chat capabilities.

The Latest Startup

Kifi was founded in 2012 by Dan Blumenfeld and Eishay Smith. The company's early goal was to provide a consumer-centric experience of group activities and collaboration. In a post on Medium announcing the transaction, Kifi said that some of its services will soon be phased out, but that even after the current iteration of Kifi is shut down, users will still have time to transfer their data to another platform of their choice.

"We see a lot of alignment to Google's mission to organize the world's information and make it universally accessible and useful," Kifi said. "Our team will be joining the Spaces team at Google to build solutions focused on improving group sharing, conversation, and content finding."

The transaction fits Google's recent acquisition strategy, which has focused on small startups whose novel ideas or technologies are easier for Google to buy than to develop on its own. Just in the past month, the company has bought Internet service provider Webpass, French image recognition company Moodstocks and cloud-based video service provider Anvato.

Saturday, 9 April 2016

SSL vs. SEO - HTTPS as a ranking signal

Security is a top priority for Google. We invest a lot in making sure that our services use industry-leading security, like strong HTTPS encryption by default. That means that people using Search, Gmail and Google Drive, for example, automatically have a secure connection to Google.

Beyond our own stuff, we’re also working to make the Internet safer more broadly. A big part of that is making sure that websites people access from Google are secure. For instance, we have created resources to help webmasters prevent and fix security breaches on their sites.

We want to go even further. At Google I/O a few months ago, we called for “HTTPS everywhere” on the web.

We’ve also seen more and more webmasters adopting HTTPS (also known as HTTP over TLS, or Transport Layer Security) on their websites, which is encouraging.

For these reasons, over the past few months we’ve been running tests taking into account whether sites use secure, encrypted connections as a signal in our search ranking algorithms. We've seen positive results, so we're starting to use HTTPS as a ranking signal. For now it's only a very lightweight signal — affecting fewer than 1% of global queries, and carrying less weight than other signals such as high-quality content — while we give webmasters time to switch to HTTPS. But over time, we may decide to strengthen it, because we’d like to encourage all website owners to switch from HTTP to HTTPS to keep everyone safe on the web.

SEO advantages of switching to HTTPS

It is clear that HTTPS offers security, so it is definitely the choice to put you in Google’s good graces. There are also some additional SEO benefits for you to consider.

Increased rankings

The obvious one. As stated, Google has confirmed the slight ranking boost of HTTPS sites. Like most ranking signals, it is very hard to isolate on its own, but this is still something to keep in mind. On the plus side, the value of switching to HTTPS is very likely to increase over time.

Referrer Data

When traffic passes to an HTTPS site, the referral information is preserved. By contrast, when traffic passes from an HTTPS site to an HTTP site, the referral data is stripped away and the traffic looks as though it is "direct."

Security and privacy

HTTPS adds security for your SEO goals and website in several ways:

It verifies that visitors are communicating with the intended website rather than an impostor.

It prevents tampering by third parties.

It makes your site more secure for visitors.

It encrypts all communication, including URLs, which protects things like browsing history and credit card numbers.

What type of SSL certificate works best?

Companies offer a confusing array of SSL certificates. The two primary types to pay attention to are:

Standard Validation SSL – The standard level of validation; free or low-cost certificates are widely available (e.g. https://www.comodo.co.in/ssl-certificates/).

Extended Validation SSL – Offers the highest level of validation and typically costs more.


Saturday, 30 January 2016

Google says Panda, now part of its core ranking algorithm, is consistent enough to run by itself without much worry that it won't work right.

The news that Panda is part of Google’s core ranking algorithm has gotten SEOs buzzing about what exactly being part of the core algorithm means. Now Google has shared some more information, saying that the algorithm is consistent enough not to require many changes in the future and can run with less hand-holding.

In the following video from the hangout, Andrey Lipattsev, a search quality senior strategist at Google, discusses what "core" means. Lipattsev explains that what Gary meant is that Panda is now part of the core ranking algorithm, and that Google no longer needs to watch over how it runs. It has been tested, it works, and it can now run by itself without much worry. Ammon Johns, in the hangout, then asked, "Once they forgot how it works, it is core?" To which Lipattsev replied, "That is exactly right."

Source: http://searchengineland.com/google-explains-what-it-means-to-be-part-of-the-core-algorithm-240681

Monday, 18 January 2016

There is a lot of talk on the Web regarding search engine optimization (SEO) and how, if you just do this one thing, you will be at the top of Google. If only it were that easy. In fact, I believe there are seven distinct skills that a search engine optimizer needs to possess. Most people possess one or maybe two of these skills; very rarely do people possess all seven. In truth, obtaining all seven of these skills takes time and effort, and if you are running your own business, do you really have the time to do this?

The golden rules that I believe are necessary for SEO work are:

1. Web Design – Producing a visually attractive page.
2. HTML coding – Developing search engine-friendly coding that sits behind the web design.
3. Copy writing – Producing the actual readable text on the page.
4. Marketing – Knowing what searches are actually being used, and which keywords actually bring more business to your company.
5. An eye for detail – Even the smallest errors can stop spiderbots from visiting your site.
6. Patience – There is a time lag on any change you make; waiting is a virtue.
7. IT skills – An appreciation of how search engine programs and their algorithms actually work.
1. Many website designers produce ever more eye-catching designs with animations and clever features, hoping to entice people onto their sites. This is the first big mistake: using designs like these may actually decrease your chances of a high Google rating. Yes, that's right; all that money you have paid for the website design could be wasted, because no one will ever find your site.

The reason for this is that before worrying about bringing people to your site, you need to get the spiderbots to like your site. Spiderbots are pieces of software used by the search engine companies to crawl the Internet, looking at all the websites; having reviewed the sites, they use complex algorithms to rank them. Some of the complex techniques used by Web designers cannot be crawled by spiderbots. They come to your site, look at the HTML code and exit stage right, without even bothering to rank your site, meaning you will not be found on any meaningful search.

I am amazed at how many times I look at websites and immediately know they are a waste of money. The trouble is that neither the Web designers nor the company that paid the money really wants to hear this. In fact, I have stopped playing the messenger of bad news (too many shootings!); I now work around the problem.
Optimizing a website to be Google-friendly is often a compromise between a visually attractive site and an easy-to-find site.
2. The second skill is optimizing the actual HTML code to be spiderbot-friendly. I treat this as distinct from the web design because you really do need to be "down and dirty" in the code rather than using an editor like FrontPage, which is fine for website design. This skill takes a lot of time and experience to develop, and just when you think you have cracked it, the search engine companies change the algorithms used to calculate how high your site will appear in the search results.

This is no place for even the most enthusiastic amateur. Results need to be constantly monitored, pieces of code added or removed, and a check kept on what the competition is doing. Many people who design their own website feel they will get searched because it looks good, and totally miss out on this step. Without a strong technical understanding of how spiderbots work, you will always struggle to get your company on the first results page in Google. We actually run seven test domains that are testing different theories with different search engines. Remember that different search engines use different criteria and algorithms to rank your site — one size does not fit all.
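To make that concrete, here is a minimal sketch of the kind of spiderbot-friendly markup this skill produces. The title, description and heading are hypothetical placeholders, not a guaranteed recipe:

    <!DOCTYPE html>
    <html lang="en">
    <head>
      <!-- A unique, descriptive title is one of the strongest on-page signals -->
      <title>Handmade Oak Furniture | Example Workshop</title>
      <!-- The meta description often becomes the snippet shown in search results -->
      <meta name="description" content="Handmade oak tables and chairs, built to order.">
    </head>
    <body>
      <!-- One clear h1 that matches what the page is actually about -->
      <h1>Handmade Oak Furniture</h1>
      <p>Readable, well-constructed text that humans and spiderbots can both follow.</p>
    </body>
    </html>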

3. Thirdly, I suggested that copywriting is a skill in its own right. This is the writing of the actual text that people coming to your site will read. The Googlebot and other spiderbots love text, but only when it is well written in properly constructed English. Some people try to stuff their site with keywords, while others put white text on a white background (so spiderbots can see it but humans cannot).

Spiderbots are very sophisticated; not only will they not fall for these tricks, they may actively penalize your site. In Google terms, this is sandboxing: Google takes new sites and "naughty" sites and effectively sin-bins them for three to six months. You can still be found, but not until results page 14, which is not very useful. As well as good English, the spiderbots are also reading the HTML code, so the copywriter needs an appreciation of the interplay between the two. My recommendation for anyone copywriting their own site is to write normal, well-constructed English sentences that can be read by machine and human alike.
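For illustration only, this is roughly what the "white on white" trick looks like in markup; it is a hypothetical example of what not to do, since spiderbots detect and penalize it:

    <!-- Do NOT do this: keyword text hidden from humans but readable by crawlers -->
    <p style="color: #ffffff; background-color: #ffffff;">
      cheap widgets best widgets buy widgets widget sale
    </p>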

4. The fourth skill is marketing. After all, this is what we are doing: marketing your site, and hence your company and its products/services, on the Web. The key here is to set the site up to be accessible to the searches that will bring the most business to you. I have seen many sites that can only be found by keying in the company name, which is of limited value, since anyone searching on your name already knows you exist. The marketing skill requires knowledge of a company's business, what it is really trying to sell, and an understanding of which actual searches may pay dividends.

5. The next skill is an eye for detail. Even a simple change to a Web page can create an error that stops the spiderbots from crawling your site. Recently, I put in a link to a page that didn't have www. at the front of the address. The link still worked, but the spiders stopped crawling, and it took my partner to find the error. We have recently invested in a very sophisticated HTML validator that picks up errors other validators simply fail to see. These errors do not stop the pages from displaying correctly to the human eye, but they cause massive problems for spiderbots. Almost all the code I examine on the Web with this validator flags major errors, even code from SEO companies.
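As a hypothetical reconstruction of that kind of slip (the actual site is not named here), the whole problem can come down to one link host not matching the rest of the site:

    <!-- The problem link: resolves fine in a browser, but uses a different host -->
    <a href="http://example.com/products.html">Products</a>
    <!-- The corrected link, matching the site's usual www host -->
    <a href="http://www.example.com/products.html">Products</a>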

6. The sixth skill is patience; it really is a virtue. Some people want to make daily changes and then think they can track the ranking results the next day. Unfortunately, it can take a week for perfectly correct changes to take effect, by which time you have made six other changes. Add to this Google's reluctance to let new sites straight onto its listings, imposing a waiting period of perhaps three months, and you have a largely uncontrollable situation. We tell all our clients that a piece of SEO work should be treated like a marketing campaign that runs for six months, since only after that time can a true judgment of its effectiveness be made.

7. The final and seventh skill is an appreciation of how search engines and their algorithms work; this is where both IT and math experience are useful. People who have programmed at a detailed systems level have a natural feel for how spiderbots will read a page: what they will search for, what tables they will set up, and what weightings they may give to different elements. All of this builds a picture of the database that will be created and how it will be accessed when a search is undertaken. Unfortunately, this skill is the most difficult to learn, because it relies on many years' experience of systems programming.

Written by - Richa Verma

Tuesday, 12 January 2016

Google pushed a core ranking algorithm update over Jan 8th and 9th, 2016.

Did you notice ranking changes with your websites?

Google has confirmed on Twitter that what webmasters were seeing over the weekend was not the Penguin update we have been expecting, but rather a core ranking algorithm update.

Google Ranking Algorithm Update 2016
On Friday, I noticed early signs of an update, and then, over the weekend of Jan 8th and 9th, 2016, I called it a "massive update." I asked Google for confirmation, and on Twitter they confirmed it was a core ranking algorithm update.

Reference: http://searchengineland.com/google-had-a-major-core-ranking-algorithm-update-this-past-weekend-240067

Google Panda Algorithm is Now Part Of Google’s Core Ranking Signals

Panda is now baked in as part of Google's core ranking algorithm. As Google put it: "Panda is an algorithm that’s applied to sites overall and has become one of our core ranking signals. It measures the quality of a site, which you can read more about in our guidelines. Panda allows Google to take quality into account and adjust ranking accordingly."

Reference: http://searchengineland.com/google-panda-is-now-part-of-googles-core-ranking-signals-240069

Thursday, 24 December 2015

SEO is an ever-changing industry, as search engines (Google in particular) evolve to some extent every single day. Google makes algorithm changes on a daily basis, and every now and then it makes major changes that cause massive shake-ups in search results as well as in SEO strategies.

What do you expect to change the most about optimizing for Google in 2016?

Mobile has been a major focal point for Google for years, and in 2015 it was as big a focus as ever. Early in the year, Google announced two significant ranking factors – app indexing and mobile-friendliness – both aimed at improving the mobile experience for users and getting them the content they want and need in the best way possible.
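On the mobile-friendliness side, the usual starting point is a viewport declaration, which Google's mobile-friendly test checks for; a minimal sketch:

    <head>
      <!-- Tells mobile browsers to render at device width instead of a zoomed-out desktop layout -->
      <meta name="viewport" content="width=device-width, initial-scale=1">
    </head>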

This will (unsurprisingly) continue to be a major focus for Google heading into 2016.

In a recent webmaster hangout on Google+, Google webmaster trends analyst John Mueller spoke a little about what to expect for SEO in the coming year (via Barry Schwartz).

The relevant portion of the video begins at about 26 minutes in, but you will probably get more out of it by watching the entire video.

Mueller answers a question about general SEO tips for 2016 (as transcribed by Schwartz):

Oh man… I don’t have any magical SEO tips for next year. I can’t tell you about that high ranking meta tag that we’ve been working on [sarcasm].

But in general, I think, next year you’ll probably hear a lot about from us about AMP, mobile friendly, we’ve been doing over the years. It is still a very big topic and we still see a lot of sites not doing that properly. Those are probably the bigger changes, but other things will definitely happen as well. More information about JavaScript in sites so that we can really figure out how to handle these better in search and make a better recommendation on what you should do or shouldn’t do.

But past that, of course, high quality content is something I’d focus on. I see lots and lots of SEO blogs talk about user experience, which I think is a great thing to focus on as well. Because that essentially kind of focuses on what we are trying to look at as well. We want to rank content that is useful for them and if your content is really useful for them, then we want to rank it.

We’ve covered mobile-friendliness a great deal throughout the year, so if this is something you’re still struggling with as Mueller implies, I’d encourage you to read back through the content found here.

AMP, of course, refers to Accelerated Mobile Pages, a new open source project and essentially Google's answer to Facebook's Instant Articles. It is supported by a number of other internet players, including Yahoo, Twitter, LinkedIn, Pinterest, WordPress.com, ChartBeat, Parse.ly, and Adobe Analytics.

Google recently said it will begin sending search traffic to AMP pages in late February. So that's one major change you can expect in 2016 (and early 2016 at that).
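For a sense of what AMP involves in practice, here is an abbreviated sketch of an AMP HTML page. The URLs are hypothetical placeholders, and the mandatory amp-boilerplate style block is omitted for brevity; the full required markup is documented by the AMP project:

    <!doctype html>
    <html amp>
    <head>
      <meta charset="utf-8">
      <!-- The AMP runtime, loaded asynchronously -->
      <script async src="https://cdn.ampproject.org/v0.js"></script>
      <!-- Points back to the regular (non-AMP) version of the page -->
      <link rel="canonical" href="https://www.example.com/article.html">
      <meta name="viewport" content="width=device-width,minimum-scale=1">
      <!-- Required <style amp-boilerplate> block omitted here for brevity -->
    </head>
    <body>
      <h1>Article headline</h1>
    </body>
    </html>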

Another big SEO change coming in early 2016 is Google's next Penguin update, which is supposed to update in real time moving forward.

Reference Source: http://www.webpronews.com/what-google-says-about-seo-in-2016-2015-12/

Wednesday, 8 July 2015

The movement to make the Internet more secure through HTTPS brings several useful advancements for webmasters. In addition to security improvements, HTTPS promises future technological advances and potential SEO benefits for marketers.

The share of HTTPS results in Google search is rising. Recent MozCast data from Dr. Pete shows nearly 20% of first-page Google results are now HTTPS.

Sadly, HTTPS also has its downsides.

Marketers run into their first challenge when they switch regular HTTP sites over to HTTPS. Technically challenging, the switch typically involves routing your site through a series of 301 redirects. Historically, these types of redirects are associated with a loss of link equity (thought to be around 15%), which can lead to a loss in rankings. That can offset any SEO advantage Google claims for switching.
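The 301s themselves live in the web server's configuration rather than in the page markup, but a common on-page complement during a migration is a canonical tag that tells search engines the HTTPS URL is the one to index. A minimal sketch, using a hypothetical page:

    <head>
      <!-- Consolidates ranking signals on the HTTPS version of this page -->
      <link rel="canonical" href="https://www.example.com/page.html">
    </head>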


Many SEOs have anecdotally shared stories of HTTPS sites performing well in Google search results (and our soon-to-be-published Ranking Factors data seems to support this). However, the short-term effect of a large migration can be hard to take. When Moz recently switched to HTTPS to provide better security to our logged-in users, we saw an 8-9% dip in our organic search traffic.

Problem number two is the subject of this post. It involves the loss of referral data. Typically, when one site sends traffic to another, information is sent that identifies the originating site as the source of traffic. This invaluable data allows people to see where their traffic is coming from, and helps spread the flow of information across the web.

SEOs have long used referrer data for a number of beneficial purposes. Oftentimes, people will link back or check out the site sending traffic when they see the referrer in their analytics data. Spammers know this works, as evidenced by the recent increase in referrer spam.

This process stops when traffic flows from an HTTPS site to a non-secure HTTP site. In this case, no referrer data is sent. Webmasters can't know where their traffic is coming from.

Here's how referral data to my personal site looked when Moz switched to HTTPS. I lost all visibility into where my traffic came from.

Enter the meta referrer tag

While we can't solve the ranking challenges imposed by switching a site to HTTPS, we can solve the loss of referral data, and it's actually super-simple.

Almost completely unknown to most marketers, the relatively new meta referrer tag (it's actually been around for a few years) was designed to help out in these situations.

Better yet, the tag allows you to control how your referrer information is passed.

The meta referrer tag works with most browsers to pass referrer information in a manner defined by the user. Traffic remains encrypted and all the benefits of using HTTPS remain in place, but now you can pass referrer data to all websites, even those that use HTTP.

How to use the meta referrer tag

What follows are extremely simplified instructions for using the meta referrer tag. For more in-depth understanding, we highly recommend referring to the W3C working draft of the spec.

The meta referrer tag is placed in the <head> section of your HTML and references one of five states, which control how browsers send referrer information from your site. (A markup sketch follows the list.) The five states are:

None: Never pass referral data

None When Downgrade: Sends referrer information to secure HTTPS sites, but not insecure HTTP sites

Origin Only: Sends only the scheme, host, and port (essentially, the bare domain) as the referrer, stripped of the full URL path; e.g. https://moz.com/example.html would simply send https://moz.com

Origin When Cross-Origin: Sends the full URL as the referrer when the target has the same scheme, host, and port (i.e. subdomain) regardless if it's HTTP or HTTPS, while sending origin-only referral information to external sites. (note: There is a typo in the official spec. Future versions should be "origin-when-cross-origin")

Unsafe URL: Always passes the URL string as a referrer. Note if you have any sensitive information contained in your URL, this isn't the safest option. By default, URL fragments, username, and password are automatically stripped out.
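Putting that together, the tag itself is a single line in the page's <head>. This sketch uses the "origin" state, since that is the one discussed below:

    <head>
      <!-- Ask browsers to send only scheme, host and port as the referrer,
           even when linking from HTTPS to HTTP -->
      <meta name="referrer" content="origin">
    </head>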

The meta referrer tag in action



We've set the meta referrer tag for Moz to "origin", which means that when we link out to another site, we pass only our scheme, host, and port. The end result is that you see https://moz.com as the referrer, stripped of the full URL path (/meta-referrer-tag).

My personal site typically receives several visits per day from Moz. Here's what my analytics data looked like before and after we implemented the meta referrer tag.

For simplicity and security, most sites may want to implement the "origin" state, but there are drawbacks.

One negative side effect was that as soon as we implemented the meta referrer tag, our AdRoll analytics, which we use for retargeting, stopped working. It turns out that AdRoll uses our referrer information for analytics, but the meta referrer tag "origin" state meant that the only URL they ever saw reported was https://moz.com.

Conclusion

We love the meta referrer tag because it keeps information flowing on the Internet. It's the way the web is supposed to work!

It helps marketers and webmasters see exactly where their traffic is coming from. It encourages engagement, communication, and even linking, which can lead to improvements in SEO.

Original source: https://moz.com/blog/meta-referrer-tag