First Click Free update

Google Webmaster Central Blog - Tue, 2015-09-29 05:55

Around ten years ago when we introduced a policy called “First Click Free,” it was hard to imagine that the always-on, multi-screen, multiple device world we now live in would change content consumption so much and so fast. The spirit of the First Click Free effort was - and still is - to help users get access to high quality news with a minimum of effort, while also ensuring that publishers with a paid subscription model get discovered in Google Search and via Google News.

In 2009, we updated the FCF policy to allow a limit of five articles per day, in order to protect publishers who felt some users were abusing the spirit of this policy. Recently we have heard from publishers about the need to revisit these policies to reflect the mobile, multiple-device world. Today we are announcing a change to the FCF policy that lowers the limit to three articles a day. This change applies to both Google Search and Google News.

Google wants to play its part in connecting users to quality news and in connecting publishers to users. We believe the FCF is important in helping achieve that goal, and we will periodically review and update these policies as needed so they continue to benefit users and publishers alike. We are listening and always welcome feedback.

Questions and answers about First Click Free

Q: Do the rest of the old guidelines still apply?
A: Yes, please check the guidelines for Google News as well as the guidelines for Web Search and the associated blog post for more information.

Q: Can I apply First Click Free to only a section of my site / only for Google News (or only for Web Search)?
A: Sure! Just make sure that both Googlebot and users from the appropriate search results can view the content as required. Keep in mind that showing Googlebot the full content of a page while showing users a registration page would be considered cloaking.

Q: Do I have to sign up to use First Click Free?
A: Please let us know about your decision to use First Click Free if you are using it for Google News. There's no need to inform us of the First Click Free status for Google Web Search.

Q: What is the preferred way to count a user's accesses?
A: Since there are many different site architectures, we believe it's best to leave this up to the publisher to decide.
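For illustration only, here is a minimal sketch of one possible counting approach: a per-day counter kept in a cookie. The Flask framework, the cookie name, and the referrer check are illustrative assumptions, not Google guidance, and a real implementation should sign the cookie to prevent tampering.

  from datetime import date
  from flask import Flask, request, make_response

  app = Flask(__name__)
  FREE_CLICKS_PER_DAY = 3  # the updated First Click Free limit

  @app.route("/article/<slug>")
  def article(slug):
      # Cookie format: "YYYY-MM-DD:count"; the counter resets each day.
      today = date.today().isoformat()
      day, _, count = request.cookies.get("fcf", "").partition(":")
      views = int(count) if day == today and count.isdigit() else 0

      from_google = "google." in (request.referrer or "")
      if from_google and views >= FREE_CLICKS_PER_DAY:
          return "Please subscribe to continue reading."  # subscription page

      resp = make_response("Full article: " + slug)
      if from_google:
          resp.set_cookie("fcf", "%s:%d" % (today, views + 1))
      return resp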

(Please see our related blog post for more information on First Click Free for Google News.)

Posted by John Mueller, Google Switzerland
Categories: sysadmin

Drive app installs through App Indexing

Google Webmaster Central Blog - Fri, 2015-09-25 13:36
You’ve invested time and effort into making your app an awesome experience, and we want to help people find the great content you’ve created. App Indexing has already been helping people engage with your Android app after they’ve installed it — we now have 30 billion links within apps indexed. Starting this week, people searching on Google can also discover your app if they haven’t installed it yet. If you’ve implemented App Indexing, when indexed content from your app is relevant to a search done on Google on Android devices, people may start to see app install buttons for your app in search results. Tapping these buttons will take them to the Google Play store where they can install your app, then continue straight on to the right content within it.

With the addition of these install links, we are starting to use App Indexing as a ranking signal for all users on Android, regardless of whether they have your app installed or not. We hope that Search will now help you acquire new users, as well as re-engage your existing ones. To get started, see our App Indexing documentation, and to learn more about the other ways you can integrate with Google Search, see our developer site.
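As a rough sketch of the web side of an App Indexing implementation, a page can declare its Android app counterpart with a rel="alternate" link element; the package name and paths below are hypothetical placeholders.

  <head>
    <!-- Deep link to the matching screen in a hypothetical Android app -->
    <link rel="alternate"
          href="android-app://com.example.petstore/http/example.com/toys" />
  </head>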

Posted by Lawrence Chang, Product Manager
Categories: sysadmin

FAQs about the April 21st mobile-friendly update

Google Webmaster Central Blog - Wed, 2015-09-23 19:41
We’d like to share answers to your frequently asked questions. For background, in February, we announced that the mobile-friendly update will boost the rankings of mobile-friendly pages -- pages that are legible and usable on mobile devices -- in mobile search results worldwide. (Conversely, pages designed for only large screens may see a significant decrease in rankings in mobile search results.) To get us all on the same page, here are the most frequently asked questions:

General FAQs
1. Will desktop and/or tablet ranking also be affected by this change?

No, this update has no effect on searches from tablets or desktops. It affects searches from mobile devices across all languages and locations.

2. Is it a page-level or site-level mobile ranking boost? 

It’s a page-level change. For instance, if ten of your site’s pages are mobile-friendly, but the rest of your pages aren’t, only the ten mobile-friendly pages can be positively impacted.

3. How do I know if Google thinks a page on my site is mobile-friendly?

Individual pages can be tested for “mobile-friendliness” using the Mobile-Friendly Test.

To review site-level information on mobile-friendliness, check out the Mobile Usability report in Webmaster Tools. This feature’s data is based on the last time we crawled and indexed your site’s pages.

4. Unfortunately, my mobile-friendly pages won’t be ready until after April 21st. How long before they can be considered mobile-friendly in ranking?
We determine whether a page is mobile-friendly every time it’s crawled and indexed -- you don’t have to wait for another update. Once a page is mobile-friendly, you can wait for Googlebot for smartphones to naturally (re-)crawl and index the page or you can expedite processing by using Fetch as Google with Submit to Index in Webmaster Tools. For a large volume of URLs, consider submitting a sitemap. In the sitemap, if your mobile content uses pre-existing URLs (such as with Responsive Web Design or dynamic serving), also include the lastmod tag.
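For reference, a sitemap entry with a lastmod date looks like this (the URL and date are placeholders):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>http://www.example.com/page.html</loc>
      <lastmod>2015-04-22</lastmod>
    </url>
  </urlset>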

5. Since the mobile ranking change rolls out on April 21st, if I see no drop in traffic on April 22nd, does that mean that my site’s rankings aren't impacted?
You won't be able to definitively determine whether your site’s rankings are impacted by the mobile-friendly update by April 22nd. While we begin rolling out the mobile-friendly update on April 21st, it’ll be a week or so before it makes its way to all pages in the index. 

6. I have a great mobile site, but the Mobile-Friendly Test tells me that my pages aren't mobile-friendly. Why?
If a page is designed to work well on mobile devices, but it’s not passing the Mobile-Friendly Test, the most common reason is that Googlebot for smartphones is blocked from crawling resources, like CSS and JavaScript, that are critical for determining whether the page is legible and usable on a mobile device (i.e., whether it’s mobile-friendly). To remedy:
  1. Check if the Mobile-Friendly Test shows blocked resources (often accompanied with a partially rendered image).
  2. Allow Googlebot to crawl the necessary files.
  3. Double-check that your page passes the Mobile-Friendly Test.
  4. Use Fetch as Google with Submit to Index and submit your updated robots.txt to Google to expedite the re-processing of the updated page (or just wait for Google to naturally re-crawl and index).

To reiterate, we recommend that site owners allow Googlebot to crawl all resources for a page (including CSS, JavaScript, and images), so that we can properly render, index, and in this case, assess whether the page is mobile-friendly.
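For example, if your robots.txt blocks your script and style directories, removing those rules or explicitly allowing the resources lets Googlebot render the page; the directory names here are hypothetical:

  User-agent: Googlebot
  # Remove rules like these, which prevent rendering:
  #   Disallow: /css/
  #   Disallow: /js/
  # Or explicitly allow the blocked resources:
  Allow: /css/
  Allow: /js/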

7. What if I link to a site that’s not mobile-friendly?
Your page can still be “mobile-friendly” even if it links to a page that’s not mobile-friendly, such as a page designed for larger screens, like desktops. It’s not the best experience for mobile visitors to go from a mobile-friendly page to a desktop-only page, but hopefully as more sites become mobile-friendly, this will become less of a problem.

8. Does Google give a stronger mobile-friendly ranking to pages using Responsive Web Design (which uses the same URL and the same HTML for the desktop and mobile versions) vs. hosting a separate mobile site (like a www subdomain for desktop and an m. subdomain for mobile)?
No, mobile-friendliness is assessed the same, whether you use responsive web design (RWD), separate mobile URLs, or dynamic serving for your configuration. If your site uses separate mobile URLs or dynamic serving, we recommend reviewing the Mobile SEO guide to make sure Google is properly crawling and indexing your mobile pages.
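For dynamic serving in particular, the Mobile SEO guide recommends sending the Vary HTTP header, which signals that the content differs by user agent and helps Googlebot for smartphones discover the mobile version:

  HTTP/1.1 200 OK
  Content-Type: text/html
  Vary: User-Agent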

9. Will my site / page disappear on mobile search results if it's not mobile-friendly?
While the mobile-friendly change is important, we still use a variety of signals to rank search results. The intent of the search query is still a very strong signal -- so even if a page with high quality content is not mobile-friendly, it could still rank high if it has great content for the query.

Specialized FAQs
10. What if my audience is desktop only? Then there’s no reason to have a mobile site, right?
Not exactly. Statistics show that more people are going “mobile only” -- either because they never had a desktop or because they won’t replace their existing desktop. Additionally, a non-mobile-friendly site may not see many mobile visitors precisely for that reason. 
The mobile-friendly update will apply to mobile searches conducted across all sites, regardless of the site’s target audiences’ language, region, or proportion of mobile to desktop traffic.

11. I have pages showing mobile usability errors because they embed a YouTube video. What can I do?
We suggest paying close attention to how the YouTube video is embedded. If you are using the “old-style” <object> embeds in the mobile page, convert to <iframe> embeds for broader compatibility. YouTube now uses the HTML5 player on the web by default, so it’s mobile-friendly to embed videos using the <iframe> tag from the “share” feature on the watch page or from the YouTube iFrame API. More complex integrations should also be mobile-friendly, since the <iframe> embed instructs the device to use its native playback support.
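For reference, the mobile-friendly embed generated by the “share” feature looks like this (the video ID is a placeholder):

  <iframe width="560" height="315"
          src="https://www.youtube.com/embed/VIDEO_ID"
          frameborder="0" allowfullscreen></iframe>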
For Flash content from sites other than YouTube, check if there is an equivalent HTML5 embed tag or code snippet to avoid using proprietary plugins.

12. Is there a clear standard for sizing tap targets?
Yes, we suggest a minimum of 7mm width/height for primary tap targets and a minimum margin of 5mm between secondary tap targets. The average width of an adult's finger pad is 10mm, and these dimensions can provide a usable interface while making good use of screen real estate.

13. To become mobile-friendly quickly, we’re thinking of creating a very stripped down version of our site (separate mobile pages) until our new responsive site is complete. Do you foresee any problems with this?
First, keep in mind that we support three mobile configurations and that your website doesn't have to be responsive to be mobile-friendly. In response to your question, please be cautious about creating a “stripped down” version of your site. While the page may be formatted for mobile, if it doesn’t allow your visitors to easily complete their common tasks or have an overall smooth workflow, it may frustrate your visitors and perhaps not be worth the effort. If you do create a temporary mobile site, be sure to migrate properly once the responsive site is live: update all links so they no longer reference the separate mobile URLs, and 301 redirect each mobile URL to its corresponding responsive version, as sketched below.
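On an Apache server, those redirects might be sketched with mod_rewrite as follows, assuming a hypothetical m.example.com mobile subdomain whose paths mirror the responsive site:

  <IfModule mod_rewrite.c>
    RewriteEngine On
    # Send visitors of the retired mobile subdomain to the responsive site
    RewriteCond %{HTTP_HOST} ^m\.example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301]
  </IfModule>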
If you’re totally new to building a mobile-friendly site, it’s not too late! Check out our Getting Started guide in the Mobile-Friendly Websites documentation.

If you already have a mobile site, investigate the Mobile Usability report in Webmaster Tools to make sure that Google detects your site’s pages as mobile-friendly. 
Still more questions? Please ask below or check out the Mobile Websites section of the Webmaster Forum.

Written by Maile Ohye, Developer Programs Tech Lead.
Categories: sysadmin

Helping hacked sites with reconsideration requests

Google Webmaster Central Blog - Wed, 2015-09-23 08:36

Thus far in 2015 we have seen a 180% increase in the number of sites getting hacked and a 300% increase in hacked site reconsideration requests. While we are working hard to help webmasters prevent hacks in the first place through efforts such as blog posts and #NoHacked campaigns, we recognize that our reconsideration process is an important part of making recovery from a hack faster and easier. Here's what we've been focusing on:

1) Improved communication
2) Better tools
3) Continuous feedback loop

1. Improving communications with webmasters of hacked sites

Last year we launched the "Note from your reviewer" feature in our reconsideration process. This feature enables us to give specific examples and advice tailored to each case in response to a reconsideration request. Thus far in 2015 we have sent a customized note to over 70% of webmasters whose hacked reconsideration request was rejected, with specific guidance on where and how to find the remaining hacked content. The results have been encouraging, as we've seen a 29% decrease in the average amount of time from when a site receives a hacked manual action to the time when the webmaster cleans up and the manual action is removed.

Example "note from your reviewer" with detailed guidance and a custom example of hacked text and a hacked page

We have also completed our second #NoHacked campaign, with more detailed help on preventing and recovering from hacks. In the campaign, we focused on ways to improve the security on your site as well as ways to fix your site if it was compromised. You can catch up by reading the first post.

2. Better tools including auto-removal of some hacked manual actions

Last year we launched the "Fetch and Render" feature to the Fetch as Google tool, which allows you to see the website exactly as Googlebot sees it. This functionality is useful in recovering from a hack, since many hackers inject cloaked content that's not visible to the normal user but obvious to search engine crawlers like Googlebot.

This year we also launched the Hacked Sites Troubleshooter in 23 languages which guides webmasters through some basic steps to recover from a hack. Let us know if you have found the troubleshooter useful as we're continuing to expand its features and impact.

Finally, we're beta testing the automated removal of some hacked manual actions. In Search Console if Google sees a "Hacked site" manual action under "Partial matches", and our systems detect that the hacked content is no longer present, in some cases we will automatically remove that manual action. We still recommend that you submit a reconsideration request if you see any manual actions, but don't be surprised if a "Hacked site" manual action disappears and saves you the trouble!

Example of a Hacked site manual action on a Partial match: if our systems detect that the hacked content is no longer present, in some cases we will automatically remove the manual action

3. Soliciting your feedback and taking action

Our improved communication and tools have come directly from feedback we've collected from webmasters of sites that have been hacked. For example, earlier this year we hosted webmasters who have been through the hacked reconsideration process in both Mountain View, USA and Dublin, Ireland for brainstorming sessions. We also randomly sampled webmasters that had been through a hacked reconsideration. We found that while only 15% of webmasters were dissatisfied with the process, the main challenges those webmasters faced were a lack of clear notification that their site had been hacked and a lack of clear guidance on how to resolve the hack. This feedback contributed directly to our more detailed blog post on hacked recovery, and to much of the content in our latest #NoHacked campaign.


Googlers in Dublin brainstorming ways to improve the hacked reconsideration process after meeting with local webmasters

We will continue to support webmasters of hacked sites through the methods detailed above, in addition to the Webmasters help for hacked sites portal and the security, malware & hacked sites section of our forum. And we'd love to hear your ideas in the comments below on how Google can better support webmasters recovering from a hacked website!

Posted by Josh Feira and Yuan Niu, Search Quality Team
Categories: sysadmin

Repeated violations of Webmaster Guidelines

Google Webmaster Central Blog - Fri, 2015-09-18 04:09

In order to protect the quality of our search results, we take automated and manual actions against sites that violate our Webmaster Guidelines. When a manual action is taken on your site, you can confirm on the [Manual Actions] page in Search Console which part of your site the action was taken on and why. After fixing the site, you can send a reconsideration request to Google. Many webmasters get their manual action revoked by going through this process.

However, some sites violate the Webmaster Guidelines repeatedly after successfully going through the reconsideration process. For example, a webmaster who received a Manual Action notification based on an unnatural link to another site may nofollow the link, submit a reconsideration request, then, after successfully being reconsidered, delete the nofollow for the link. Such repeated violations may make a successful reconsideration process more difficult to achieve. Especially when the repeated violation is done with a clear intention to spam, further action may be taken on the site.

To avoid such situations, we recommend that webmasters not violate our Webmaster Guidelines in the first place, let alone repeatedly. We, the Search Quality Team, will continue to protect users by removing spam from our search results.

Posted by Google Search Quality Team
Categories: sysadmin

Mobile-friendly web pages using app banners

Google Webmaster Central Blog - Tue, 2015-09-01 12:53

When it comes to search on mobile devices, users should get the most relevant answers, whether the answer lives in an app or a web page. We’ve recently made it easier for users to find and discover apps and mobile-friendly web pages. However, sometimes a user may tap on a search result on a mobile device and see an app install interstitial that hides a significant amount of content and prompts the user to install an app. Our analysis shows that this is not a good search experience and can be frustrating for users, because they are expecting to see the content of the web page.

Starting today, we’ll be updating the Mobile-Friendly Test to indicate that sites should avoid showing app install interstitials that hide a significant amount of content on the transition from the search result page. The Mobile Usability report in Search Console will show webmasters the number of pages across their site that have this issue.

After November 1, mobile web pages that show an app install interstitial that hides a significant amount of content on the transition from the search result page will no longer be considered mobile-friendly. This does not affect other types of interstitials. As an alternative to app install interstitials, browsers provide ways to promote an app that are more user-friendly.

App install interstitials that hide a significant amount of content provide a bad search experience

App install banners are less intrusive and preferred

App install banners are supported by Safari (as Smart Banners) and Chrome (as Native App Install Banners). Banners provide a consistent user interface for promoting an app and provide the user with the ability to control their browsing experience. Webmasters can also use their own implementations of app install banners as long as they don’t block searchers from viewing the page’s content.
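For instance, Safari’s Smart Banner is enabled with a single meta tag (the app ID below is a placeholder); Chrome’s Native App Install Banners are configured through a web app manifest instead.

  <meta name="apple-itunes-app" content="app-id=123456789">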

If you have any questions, we’re always happy to chat in the Webmaster Central Forum.

Posted by Daniel Bathgate, Software Engineer, Google Search.

Categories: sysadmin

An update on CSV download scripts

Google Webmaster Central Blog - Mon, 2015-08-31 09:09

With the new Search Analytics API, it's now time to gradually say goodbye to the old CSV download scripts for information on queries & rankings. We'll be turning off access to these downloads on October 20, 2015.

These download scripts have helped various sites & tools get information on queries, impressions, clicks, and rankings over the years. However, they didn't use the new Search Analytics data, and they relied on the deprecated ClientLogin API.

Farewell, CSV downloads, you've served us (and many webmasters!) well, but it's time to move on. We're already seeing lots of usage with the new API. Are you already doing something neat with the API? Let us know in the comments!

Posted by John Mueller, Webmaster Trends (and query, impression, & click trends) Analyst
Categories: sysadmin

#NoHacked: Fixing the Injected Gibberish URL Hack

Google Webmaster Central Blog - Mon, 2015-08-24 12:56
Today in our #NoHacked campaign, we’ll be discussing how to fix the injected gibberish URL hack we wrote about last week. Even if your site is not infected with this specific type of hack, many of these steps can be helpful for fixing other types of hacks. Follow along with discussions on Twitter and Google+ using the #NoHacked tag. (Part 1, Part 2, Part 3, Part 4)

Temporarily Take your Site Offline

Taking your site offline temporarily will prevent your site’s visitors from going to hacked pages and give you time to properly fix your site. If you keep your site online, you run the risk of getting compromised again as you clean up your site.

Treating your Site

The next few steps require you to be comfortable making technical changes to your site. If you aren’t familiar or comfortable enough with your site to make these changes, it might be best to consult with or hire someone who is. However, reading through these steps will still be helpful.

Before you start fixing your site, we advise that you back up your site. (This backed up version will still contain hacked content and should only be used if you accidentally remove a critical file.) If you’re unsure how to back up your site, ask your hosting provider for assistance or consult your content management system (CMS) documentation. As you work through the steps, any time you remove a file, make sure to keep a copy of the file as well.

Checking your .htaccess file

In order to manipulate your site, this type of hack creates or alters the contents of your .htaccess file. If you’re not sure where to find your .htaccess file, consult your server or CMS documentation.

Check the contents of your .htaccess file for any suspicious content. If you’re not sure how to interpret the contents of the .htaccess file, you can read about it in the documentation, ask in a help forum, or consult an expert. Here is an example of an .htaccess file modified by this hack:

  <IfModule mod_rewrite.c>
    RewriteEngine On
    # Visitors that visit your site from Google will be redirected
    RewriteCond %{HTTP_REFERER} google\.com
    # Visitors are redirected to a malicious PHP file called happypuppy.php
    RewriteRule (.*pf.*) /happypuppy.php?q=$1 [L]
  </IfModule>

Identifying other malicious files

The most common types of files that are modified or injected by this hack are JavaScript and PHP files. Hackers typically take two approaches: The first is to insert new PHP or JavaScript files on your server. The inserted files can sometimes be named something very similar to a legitimate file on your site like wp-cache.php versus the legitimate file wp_cache.php. The second approach is to alter legitimate files on your server and insert malicious content into these files. For example, if you have a template or plugin JavaScript file on your site, hackers might add malicious JavaScript to the file.

For example, in one case we saw, the malicious file named happypuppy.php, identified earlier in the .htaccess file, was injected into a folder on the site. The hackers also corrupted a legitimate JavaScript file called json2.js by appending malicious code to the very bottom of the file.

To effectively track down malicious files, you’ll need to understand the function of the JavaScript and PHP files on your site. You might need to consult your CMS documentation to help you. Once you know what the files do, you should have an easier time tracking down malicious files that don’t belong on your site.

Also, check your site for any recently modified files. Template files that have been modified recently should be thoroughly investigated. Tools that can help you interpret obfuscated PHP files can be found in the Appendix.

Removing malicious content

As mentioned previously, back up the contents of your site appropriately before you remove or alter any files. If you regularly make backups for your site, cleaning up your site might be as easy as restoring a clean backed-up version.

However, if you do not regularly back up your site, you have a few alternatives. First, delete any malicious files that have been inserted on your site; in the example above, you would delete the happypuppy.php file. For corrupted PHP or JavaScript files like json2.js, you’ll have to upload a clean version of those files to your site. If you use a CMS, consider reloading a fresh copy of the core CMS and plugin files on your site.

Identifying and Fixing the Vulnerability

Once you’ve removed the malicious files, you’ll want to track down and fix the vulnerability that allowed your site to be compromised, or you risk your site being hacked again. The vulnerability could be anything from a stolen password to outdated web software. Consult Google Webmaster Hacked Help for ways to identify and fix the vulnerability. If you’re unable to figure out how your site was compromised, you should change the passwords for all your login credentials, update all your web software, and seriously consider getting more help to make sure everything is OK.

Next Steps

Once you’re done cleaning your site, use the Fetch as Google tool to check if the hacked pages still appear to Google. Don’t forget to check your home page for hacked content as well. If the hacked content is gone, then, congratulations, your site should be clean! If the Fetch as Google tool is still seeing hacked content on those hacked pages, you still have work to do. Check again for any malicious PHP or JavaScript files you might have missed.

Bring your site back online as soon as you’re sure your site is clean and the vulnerability has been fixed. If there was a manual action on your site, you’ll want to file a reconsideration request in Search Console. Also, think about ways to protect your site from future attacks. You can read more about how to secure your site from future attacks in the Google Hacked Webmaster Help Center.

We hope this post has helped you gain a better understanding of how to fix your site from the injected gibberish URL hack. Be sure to follow our social campaigns and share any tips or tricks you might have about staying safe on the web with the #nohacked hashtag.

If you have any additional questions, you can post in the Webmaster Help Forums where a community of webmasters can help answer your questions. You can also join our Hangout on Air about Security on August 26.


Appendix

These are tools that may be useful. Google doesn't run or support them.

PHP Decoder, UnPHP: Hackers will often obfuscate PHP files to make them harder to read. Use these tools to clean up the PHP files so you can better understand what they are doing.

Posted by: Eric Kuan, Webmaster Relations Specialist & Yuan Niu, Webspam Analyst
Categories: sysadmin

#NoHacked: Identifying and Diagnosing Injected Gibberish URL Hacking

Google Webmaster Central Blog - Mon, 2015-08-17 13:08
Today in our #NoHacked campaign, we’ll be discussing how to identify and diagnose a trending hack. Even if your site is not infected with this specific type of hack, many of these steps can be helpful for other types of hacks. Next week, we’ll be following up with a post about fixing this hack. Follow along with discussions on Twitter and Google+ using the #NoHacked tag. (Part 1, Part 2, Part 3)

Identifying Symptoms
Gibberish pages
The hallmark of this type of hacking is spammy pages that appear to be added to the site. These pages contain keyword-rich gibberish text, links, and images designed to manipulate search engines, and the hack creates them at gibberish URLs across the site.

This hack often uses cloaking to prevent webmasters from detecting it. Cloaking refers to the practice of presenting different content or URLs to webmasters, visitors, and search engines. For example, the webmaster of the site might be shown an empty or HTTP 404 page, which would lead the webmaster to believe the hack is no longer present. However, users who visit the page from search results will still be redirected to spammy pages, and search engines that crawl the site will still be presented with gibberish content.

Monitoring your Site
Properly monitoring your site for hacking allows you to remedy the hack more quickly and minimize damage the hack might cause. There are several ways you can monitor your site for this particular hack.

Looking for a surge in website traffic

Because this hack creates many keyword-heavy URLs that are crawled by search engines, check to see if there were any recent, unexpected surges in traffic. If you do see a surge, use the Search Analytics tool in Search Console to investigate whether or not hacked pages are the source of the unusual website traffic.

Tracking your site appearance in search results

Periodically checking how your site appears in search results is good practice for all webmasters. It also allows you to spot symptoms of hacking. You can check your site in Google by using the site: operator (i.e., search for site: followed by your domain). If you see any gibberish links associated with your site or a label that says “This site may be hacked.”, your site might have been compromised.

Signing up for alerts from Google

We recommend you sign up for Search Console. In Search Console, you can check if Google has detected any hacked pages on your site by looking in the Manual Actions Viewer or Security Issues report. Search Console will also message you if Google has detected any hacked pages on your site.

Also, we recommend you set up Google Alerts for your site. Google Alerts will email you if Google finds new results for a search query. For example, you can set up a Google Alert that combines the site: operator on your domain with common spammy terms, like [site:your-domain.com cheap software]. If you receive an email that Google has returned new results for that query, you should immediately check which pages on your site are triggering the alert.

Diagnosing your Site
Gathering tools that can help

In Search Console, you have access to the Fetch as Google tool, which allows you to see a page as Google sees it. This will help you identify cloaked hacked pages. Additional tools from others, both paid and free, are listed in the appendix to this post.

Checking for hacked pages

If you’re not sure whether there is hacked content on your site, the Google Hacked Troubleshooter can walk you through some basic checks. For this type of hack, you’ll want to perform a site: search on your site. Look for suspicious pages and URLs loaded with strange keywords in the search results. If you have a large number of pages on your site, you might need to try a more targeted query: find common spam terms and append them to your site: search query, like [site:your-domain.com cheap software]. Try this with several spammy terms to see if any results show up.

Checking for cloaking on hacked pages

Because this type of hacking employs cloaking to prevent accurate detection, it’s very important that you use the Fetch as Google tool in Search Console to check the spammy pages you found in the previous step. Remember, cloaked pages can show you an HTTP 404 page that tricks you into thinking the hack is fixed even though the page is still live. You should also use Fetch as Google on your homepage, since this type of hack often adds text or links there.

We hope this post has given you a better idea of how to identify and diagnose hacks that inject gibberish URLs on your site. Tune in next week where we’ll be explaining how to remove this hack from your site. Be sure to follow our social campaigns and share any tips or tricks you might have about staying safe on the web with the #NoHacked hashtag.

If you have any additional questions, you can post in the Webmaster Help Forums where a community of webmasters can help answer your questions. You can also join our Hangout on Air about Security on August 26.

Appendix

These are tools that scan your site and may be able to find problematic content. Other than VirusTotal, Google doesn't run or support them.

VirusTotal, Sucuri Site Check, Wepawet: These are tools that may be able to scan your site for problematic content. Keep in mind that these scanners can’t guarantee that they will identify every type of problematic content.

Posted by Eric Kuan, Webmaster Relations Specialist & Yuan Niu, Webspam Analyst
Categories: sysadmin

#NoHacked: Using two-factor authentication to protect your site

Google Webmaster Central Blog - Mon, 2015-08-10 12:53
Today in our #nohacked campaign, we’ll be talking about two-factor authentication. Follow along with discussions on Twitter and Google+ using the #NoHacked tag. (Part 1, Part 2)
There was once a time when having a relatively strong password or answering a security question was a reasonable way to protect your online accounts. However, according to a study from Stop Badware, stolen credentials are a common way for hackers to compromise websites. Additionally, even reputable sites can fall victim to hacking, potentially exposing personal data such as your passwords to attackers.

Fortunately, two-factor authentication can help you keep your accounts safer. Two-factor authentication relies on an additional source of verification, in conjunction with your password, to access your account. You might have used two-factor authentication before if you have ever been prompted for a code from your phone when logging into a social media site or from a chip card reader when logging into a bank account. Two-factor authentication makes it more difficult for someone to log into your account even if they have stolen your password.

As a website owner, you should enable two-factor authentication on your accounts where possible. A compromised account can cost you important personal data and your site’s hard-won reputation. Two-factor authentication can give you peace of mind that your accounts and data are safer.

Google currently offers 2-Step Verification for all of its accounts, including accounts from Google Apps domains. You can use your phone, a hardware token like a Security Key, or the Google Authenticator app to verify your account. These options give you flexibility when traveling or when you don’t have access to the mobile network.

If your hosting provider, Content Management System (CMS), or any other platform you use for managing your site doesn’t offer two-factor authentication, ask their customer support to consider offering it in the future; they can build two-factor authentication into their own platforms using Google’s open source code. If your platform or host doesn’t provide strong protection against unauthorized access, consider hosting your content elsewhere. Lists of websites that support two-factor authentication, including the types of authentication options they offer, are available online.

If you have any additional questions, you can post in the Webmaster Help Forums where a community of webmasters can help answer your questions. You can also join our Hangout on Air about Security on August 26.

Posted by: Eric Kuan, Webmaster Relations Specialist & Yuan Niu, Webspam Analyst
Categories: sysadmin

Introducing the Search Analytics API

Google Webmaster Central Blog - Wed, 2015-08-05 10:46

With the great feedback from the Search Analytics feature in Google Search Console, we've decided to make this data accessible for developers via API. We hope that the Search Analytics API will help you to bake search performance data into your apps and tools.

If you've used any of Google’s other APIs, or maybe one of the existing Search Console APIs, then getting started will be easy! The how-to page has examples in Python that you can use as recipes for your own programs. For example, you can use the API to pull in your top queries and their clicks, impressions, and positions over time.
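As a minimal sketch, assuming you have already obtained OAuth credentials for the google-api-python-client library, a top-queries request looks roughly like this:

  from googleapiclient.discovery import build

  # creds: an authorized OAuth2 credentials object for the Search Console scope
  service = build('webmasters', 'v3', credentials=creds)

  response = service.searchanalytics().query(
      siteUrl='http://www.example.com/',
      body={
          'startDate': '2015-07-01',
          'endDate': '2015-07-31',
          'dimensions': ['query'],
          'rowLimit': 10,
      }).execute()

  for row in response.get('rows', []):
      print(row['keys'][0], row['clicks'], row['impressions'])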

What will you cook up with the new API? We're curious to see how new tools and apps that use this API will satisfy the hunger for even more information about your site's performance in Google Search! If you've integrated this API into a tool, we'd love to hear about it in the comments. If you've run into any questions about the API, feel free to drop by our webmaster help forum.

Posted by John Mueller, Webmaster Trends Analyst, Google Switzerland
Categories: sysadmin

#NoHacked: How to recognise and protect yourself against social engineering

Google Webmaster Central Blog - Mon, 2015-08-03 13:56
Today in our #NoHacked campaign, we’ll be talking about social engineering. Follow along with discussions on Twitter and Google+ using the #nohacked hashtag. (Part 1)
If you’ve spent some time on the web, you have more than likely encountered some form of social engineering. Social engineering attempts to extract confidential information from you by manipulating or tricking you in some way.


You might be familiar with phishing, one of the most common forms of social engineering. Phishing sites and emails mimic legitimate sites and trick you into entering confidential information like your username and password. A recent study from Google found that some phishing sites can trick victims 45% of the time! Once a phishing site has your information, the owners will either sell it or use it to manipulate your accounts.

Other Forms of Social Engineering 

As a site owner, phishing isn’t the only form of social engineering that you need to watch out for. One other form of social engineering comes from the software and tools used on your site. If you download or use any Content Management System (CMS), plug-ins, or add-ons, make sure that they come from reputable sources like directly from the developer’s site. Software from non-reputable sites can contain malicious exploits that allow hackers to gain access to your site.

For example, Webmaster Wanda was recently hired by Brandon’s Pet Palace to help create a site. After sketching some designs, Wanda starts compiling the software she needs to build the site. However, she finds out that Photo Frame Beautifier, one of her favorite plug-ins, has been taken off the official CMS plug-in site and that the developer has decided to stop supporting the plug-in. She does a quick search and finds a site that offers an archive of old plug-ins. She downloads the plug-in and uses it to finish the site. Two months later, a message in Search Console notifies Wanda that her client’s site has been hacked. She quickly scrambles to fix the hacked content and finds the source of the compromise. It turns out the Photo Frame Beautifier plug-in was modified by a third party to allow malicious parties to access the site. She removes the plug-in, fixes the hacked content, secures her site from future attacks, and files a reconsideration request in Search Console. As you can see, an inadvertent oversight by Wanda led to her client's site being compromised.

Protecting Yourself from Social Engineering Attacks

Social engineering is effective because it’s not obvious that there’s something wrong with what you’re doing. However, there are a few basic things you can do to protect yourself from social engineering.
  • Stay vigilant: Whenever you enter confidential information online or install website software, have a healthy dose of skepticism. Check URLs to make sure you’re not typing confidential information into malicious sites. When installing website software make sure the software is coming from known, reputable sources like the developer’s site. 
  • Use two-factor authentication: Two-factor authentication like Google’s 2-Step Verification adds another layer of security that helps protect your account even if your password has been stolen. You should use two-factor authentication on all accounts where possible. We’ll be talking more in-depth next week about the benefits of two-factor authentication. 

If you have any additional questions, you can post in the Webmaster Help Forums where a community of webmasters can help answer your questions. You can also join our Hangout on Air about Security on August 26.

Posted by: Eric Kuan, Webmaster Relations Specialist & Yuan Niu, Webspam Analyst
Categories: sysadmin

#NoHacked: How to avoid being the target of hackers

Google Webmaster Central Blog - Mon, 2015-07-27 14:49
If you publish anything online, one of your top priorities should be security. Getting hacked can negatively affect your online reputation and result in the loss of critical and private data. Over the past year Google has noticed a 180% increase in the number of sites getting hacked. While we are working hard to combat this hacking trend, there are steps you can take to protect your content on the web.
Today, we’ll be continuing our #NoHacked campaign. We’ll be focusing on how to protect your site from hacking and give you better insight into how some of these hacking campaigns work. You can follow along with #NoHacked on Twitter and Google+. We’ll also be wrapping up with a Google Hangout focused on security where you can ask our security experts questions.

We’re kicking off the campaign with some basic tips on how to keep your site safe on the web.

1. Strengthen your account security

Creating a password that’s difficult to guess or crack is essential to protecting your site. For example, your password might contain a mixture of letters, numbers, and symbols, or be a passphrase. Password length is important: the longer your password, the harder it will be to guess. There are many resources on the web that can test how strong your password is. Testing a similar password to yours (never enter your actual password on other sites) can give you an idea of how strong your password is.

Also, it’s important to avoid reusing passwords across services. Attackers often try known username and password combinations obtained from leaked password lists or hacked services to compromise as many accounts as possible.

You should also turn on 2-Factor Authentication for accounts that offer this service. This can greatly increase your account’s security and protect you from a variety of account attacks. We’ll be talking more about the benefits of 2-Factor Authentication in two weeks.

2. Keep your site’s software updated

One of the most common ways for a hacker to compromise your site is through insecure software on your site. Be sure to periodically check your site for any outdated software, especially updates that patch security holes. If you use a web server like Apache, nginx, or commercial web server software, make sure you keep your web server software patched. If you use a Content Management System (CMS) or any plug-ins or add-ons on your site, make sure to keep these tools updated with new releases. Also, sign up for the security announcement lists for your web server software and your CMS, if you use one. Consider completely removing any add-ons or software that you don't need on your website -- aside from creating possible risks, they might also slow down the performance of your site.

3. Research how your hosting provider handles security issues

Your hosting provider’s policy for security and cleaning up hacked sites is an important factor to consider when choosing a hosting provider. If you use a hosting provider, contact them to see if they offer on-demand support to clean up site-specific problems. You can also check online reviews to see if they have a track record of helping users with compromised sites clean up their hacked content.

If you control your own server or use Virtual Private Server (VPS) services, make sure that you’re prepared to handle any security issues that might arise. Server administration is very complex, and one of the core tasks of a server administrator is making sure your web server and content management software are patched and up to date. If you don't have a compelling reason to do your own server administration, you might find it well worth your while to see if your hosting provider offers a managed services option.

4. Use Google tools to stay informed of potential hacked content on your site

It’s important to have tools that can help you proactively monitor your site. The sooner you find out about a compromise, the sooner you can work on fixing your site.

We recommend you sign up for Search Console if you haven’t already. Search Console is Google’s way of communicating with you about issues on your site, including when we have detected hacked content. You can also set up Google Alerts for your site to notify you if there are any suspicious results for your site. For example, if you run a site selling pet accessories, you can set up an alert for a query that combines the site: operator on your domain with a spammy term, like [site:your-domain.com cheap software], to alert you if hacked content about cheap software suddenly starts appearing on your site. You can set up multiple alerts for your site for different spammy terms. If you’re unsure what spammy terms to use, you can use Google to search for common spammy terms.

We hope these tips will keep your site safe on the web. Be sure to follow our social campaigns and share any tips or tricks you might have about staying safe on the web with the #NoHacked hashtag.

If you have any additional questions, you can post in the Webmaster Help Forums where a community of webmasters can help answer your questions. You can also join our Hangout on Air about Security on August 26.

Posted by: Eric Kuan, Webmaster Relations Specialist and Yuan Niu, Webspam Analyst
Categories: sysadmin

Update on the Autocomplete API

Google Webmaster Central Blog - Fri, 2015-07-24 10:30
Google Search provides an autocomplete service that attempts to predict a query before a user finishes typing. For years, a number of developers have integrated the results of autocomplete within their own services using an unofficial, unpublished API that had no restrictions on it. Developers who discovered the autocomplete API were able to incorporate autocomplete services independent of Google Search.

There have been multiple times in which the developer community’s reverse-engineering of a Google service via an unpublished API has led to great things. The Google Maps API, for example, became a formally supported API months after we saw what creative engineers could do combining map data with other data sources. We currently support more than 80 APIs that developers can use to integrate Google services and data into their applications.

However, using an unsupported, unpublished API also carries the risk that the API will stop being available. This is one of those situations.

We built autocomplete as a complement to Search, and never intended for it to exist disconnected from the purpose of anticipating user search queries. Over time we’ve realized that while we can conceive of valuable uses for an autocomplete data feed outside of search results, the content of our automatic completions is optimized and intended to be used in conjunction with web search results, and outside the context of a web search it doesn’t provide a meaningful user benefit.

In the interest of maintaining the integrity of autocomplete as part of Search, we will be restricting unauthorized access to the unpublished autocomplete API as of August 10th, 2015. We want to ensure that users experience autocomplete as it was designed to be used -- as a service closely tied to Search. We believe this provides the best user experience for both services.

For publishers and developers who still want to use the autocomplete service for their site, we have an alternative. Google Custom Search Engine allows sites to maintain autocomplete functionality in connection with Search functionality. Any partner already using Google CSE will be unaffected by this change. For others, if you want autocomplete functionality after August 10th, 2015, please see our CSE sign-up page.

Posted by Peter Chiu on behalf of the Autocomplete team
Categories: sysadmin

Google+: A case study on App Download Interstitials

Google Webmaster Central Blog - Thu, 2015-07-23 10:10
Many mobile sites use promotional app interstitials to encourage users to download their native mobile apps. For some apps, native can provide richer user experiences and use features of the device that are currently not easy to access in a browser. Because of this, many app owners believe that they should encourage users to install the native version of their online property or service. However, it’s not clear how aggressively to promote the apps, and a full-page interstitial can interrupt the user from reaching their desired content.

On the Google+ mobile web, we decided to take a closer look at our own use of interstitials. Internal user experience studies identified them as poor experiences, and Jennifer Gove gave a great talk at Google I/O last year that highlights this user frustration.

Despite our intuition that we should remove the interstitial, we prefer to let data guide our decisions, so we set out to learn how the interstitial affected our users. Our analysis found that:
  • 9% of the visits to our interstitial page resulted in the ‘Get App’ button being pressed. (Note that some percentage of these users already have the app installed or may never follow through with the app store download.)
  • 69% of the visits abandoned our page. These users neither went to the app store nor continued to our mobile website.
While 9% sounds like a great CTR for any campaign, we were much more focused on the number of users who had abandoned our product due to the friction in their experience. With this data in hand, in July 2014, we decided to run an experiment and see how removing the interstitial would affect actual product usage. We added a Smart App Banner to continue promoting the native app in a less intrusive way, as recommended in the Avoid common mistakes section of our Mobile SEO Guide. The results were surprising:
  • 1-day active users on our mobile website increased by 17%.
  • G+ iOS native app installs were mostly unaffected (-2%). (We’re not reporting install numbers from Android devices since most come with Google+ installed.)
Based on these results, we decided to permanently retire the interstitial. We believe that the increase in users on our product makes this a net positive change, and we are sharing this with the hope that you will reconsider the use of promotional interstitials. Let’s remove friction and make the mobile web more useful and usable!
(Since this study, we launched a better mobile web experience that is currently without an app banner. The banner can still be seen on iOS 6 and below.)

Posted by David Morell, Software Engineer, Google+
Categories: sysadmin

Google's handling of new top level domains

Google Webmaster Central Blog - Tue, 2015-07-21 09:02
With the coming of many new generic top level domains (gTLDs), we'd like to give some insight into how these are handled in Google's search. We’ve heard and seen questions and misconceptions about the way we treat new top level domains (TLDs), like .guru, .how, or any of the .BRAND gTLDs, for example:

Q: How will new gTLDs affect search? Is Google changing the search algorithm to favor these TLDs? How important are they really in search? 
A: Overall, our systems treat new gTLDs like other gTLDs (like .com & .org). Keywords in a TLD do not give any advantage or disadvantage in search.

Q: What about IDN TLDs such as .みんな? Can Googlebot crawl and index them, so that they can be used in search?
A: Yes. These TLDs can be used the same as other TLDs (it's easy to check with a query like [site:みんな]). Google treats the Punycode version of a hostname as being equivalent to the unencoded version, so you don't need to redirect or canonicalize them separately. For the rest of the URL, remember to use UTF-8 for the path & query-string when using non-ASCII characters.
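You can check the Punycode equivalence yourself with Python's built-in idna codec:

  # Encode an IDN label to its ASCII (Punycode) form
  print("みんな".encode("idna"))  # expected output: b'xn--q9jyb4c'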

Q: Will a .BRAND TLD be given any more or less weight than a .com?
A: No. Those TLDs will be treated the same as other gTLDs. They will require the same geotargeting settings and configuration, and they won’t have more weight or influence in the way we crawl, index, or rank URLs.

Q: How are the new region or city TLDs (like .london or .bayern) handled?
A: Even if they look region-specific, we will treat them as gTLDs. This is consistent with our handling of regional TLDs like .eu and .asia. There may be exceptions at some point down the line, as we see how they're used in practice. See our help center for more information on multi-regional and multilingual sites, and set geotargeting in Search Console where relevant.

Q: What about real ccTLDs (country code top-level domains): will Google favor ccTLDs (like .uk, .ae, etc.) as a local domain for people searching in those countries?
A: By default, most ccTLDs (with exceptions) result in Google using these to geotarget the website; it tells us that the website is probably more relevant in the appropriate country. Again, see our help center for more information on multi-regional and multilingual sites.

Q: Will Google support my SEO efforts to move my domain from .com to a new TLD? How do I move my website without losing any search ranking or history?
A: We have extensive site move documentation in our Help Center. We treat these moves the same as any other site move. That said, domain changes can take time to be processed for search (and outside of search, users expect email addresses to remain valid over a longer period of time), so it's generally best to choose a domain that will fit your long-term needs.

We hope this gives you more information on how the new top level domains are handled. If you have any more questions, feel free to drop them here, or ask in our help forums.

Posted by John Mueller, Webmaster Trends Analyst
Categories: sysadmin