
[et_pb_section fb_built="1" _builder_version="3.22.3"][et_pb_row _builder_version="3.22.3" background_size="initial" background_position="top_left" background_repeat="repeat"][et_pb_column type="4_4" _builder_version="3.0.47"][et_pb_text _builder_version="3.0.106" header_2_font="Oswald||||||||" header_2_font_size="70px" header_2_font_size_last_edited="on|desktop"]

EzoicBot

EzoicBot is the generic name for Ezoic's web crawler. It comprises two different crawler types: a desktop crawler that simulates a user on a desktop browser, and a mobile crawler that simulates a user on a mobile device.

Your website will probably be crawled by both EzoicBot Desktop and EzoicBot Mobile. You can identify the subtype of EzoicBot by looking at the user-agent string in the request. However, both crawler types obey the same product token (user-agent token) in robots.txt, so you cannot selectively target either EzoicBot Mobile or EzoicBot Desktop using robots.txt.
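Since both subtypes carry the same bot marker, telling them apart has to happen at the log level rather than in robots.txt. A minimal sketch in Python — note that only the mobile-style user-agent string is shown in this article, so using the "Mobile" token as the desktop/mobile discriminator is an assumption:

```python
def classify_ezoicbot(user_agent: str) -> str:
    """Classify a request by its user-agent string.

    The "ezoic.com/bot.html" marker identifies EzoicBot; treating the
    presence of the "Mobile" token as the mobile/desktop discriminator
    is a heuristic assumption, since only the mobile-style string is
    documented here.
    """
    if "ezoic.com/bot.html" not in user_agent:
        return "not-ezoicbot"
    return "mobile" if "Mobile" in user_agent else "desktop"
```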

What is EzoicBot and why is it crawling my site?

Ezoic is a technology platform for digital publishers. You can learn more about what Ezoic does here.

EzoicBot is our web crawler, designed to extract valuable information about how the internet, search engines, and websites all work together. EzoicBot helps publishers better understand how their sites work, including how search engines such as Google index and rank their content.

How EzoicBot accesses your site

EzoicBot is polite: it follows robots.txt directives. EzoicBot shouldn't access your site more than once every few seconds on average, though due to delays the rate may appear slightly higher over short periods.

EzoicBot was designed to be run simultaneously by thousands of machines to improve performance and scale as the web grows. Also, to cut down on bandwidth usage, we run many crawlers on machines located near the sites that they might crawl. Therefore, your logs may show visits from several machines at ezoic.com, all with the user-agent EzoicBot. Our goal is to crawl pages from your site without overwhelming your server’s bandwidth.
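If you want to confirm that a visit claiming to be EzoicBot really came from one of Ezoic's machines, one common best-effort approach is a reverse-DNS lookup on the client IP. The article does not state that Ezoic guarantees reverse DNS for its crawl machines, so treat this as a sketch under that assumption:

```python
import socket

def is_from_ezoic(ip: str) -> bool:
    """Best-effort check: reverse-resolve the IP and see whether the
    hostname is under ezoic.com. Assumes Ezoic's crawl machines have
    reverse DNS configured (an assumption, not documented here)."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)
    except OSError:
        return False
    return host == "ezoic.com" or host.endswith(".ezoic.com")
```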

How can you tell if EzoicBot is visiting your site?

The EzoicBot crawler announces itself by including the user-agent string below in its requests to your website. Look for this user agent in your access.log files.

EzoicBot user-agent string:
Mozilla/5.0 (Linux; Android 8.0; Pixel 2 Build/OPD3.170816.012) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.121 Mobile Safari/537.36 (compatible; EzLynx/0.1; +http://www.ezoic.com/bot.html)

Generally, looking for the string "http://www.ezoic.com/bot.html" in your logs is sufficient.
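Because that marker appears verbatim in the user-agent string, a simple substring scan over your access log is enough. A minimal sketch in Python (the log path in the comment is an example, not something Ezoic prescribes):

```python
BOT_MARKER = "http://www.ezoic.com/bot.html"

def count_bot_hits(log_lines):
    """Count access-log lines that came from EzoicBot, identified by the
    bot-info URL it embeds in its user-agent string."""
    return sum(1 for line in log_lines if BOT_MARKER in line)

# Typical use (example path):
# with open("/var/log/nginx/access.log") as f:
#     print(count_bot_hits(f))
```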

Managing EzoicBot behavior on your site

It's almost impossible to keep a web server secret by not publishing links to it. For example, as soon as someone follows a link from your "secret" server to another web server, your "secret" URL may appear in the Referer header and can be stored and published by the other web server in its referrer log. Similarly, the web has many outdated and broken links. Whenever someone publishes an incorrect link to your site, or fails to update links to reflect changes on your server, EzoicBot may try to crawl that incorrect link.

As previously stated, EzoicBot follows the directives in your robots.txt file. We have encountered rare cases where a mistake in the robots.txt file prevented bots from following those directives. If you think EzoicBot is not respecting your directives, please reach out to Ezoic support.

If you want to manage what content EzoicBot can crawl on your site, you have a number of options. Be aware of the difference between blocking crawlers from a page and making the page inaccessible to users altogether: robots.txt only affects crawlers. Changes to your robots.txt file will usually allow or disallow EzoicBot from crawling your site or specific pages. If you want to set instructions for our bot in your robots.txt, please use the user-agent name "EzoicBot".
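For example, a robots.txt entry like the following would keep EzoicBot out of one directory while leaving the rest of the site crawlable (the /private/ path is hypothetical, chosen only for illustration):

```txt
# Applies to Ezoic's crawler (both the desktop and mobile variants)
User-agent: EzoicBot
Disallow: /private/
```

Because both subtypes obey the same product token, this rule applies to EzoicBot Desktop and EzoicBot Mobile alike.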

For further information on how robots.txt files work, please refer to Google's documentation, "Create a robots.txt file".

[/et_pb_text][/et_pb_column][/et_pb_row][/et_pb_section]