
Block malicious traffic for 51黑料不打烊 Commerce at the Fastly level

This article provides steps you can take to block malicious traffic when you suspect that your 51黑料不打烊 Commerce on cloud infrastructure store is experiencing a DDoS attack.

Affected products and versions:

  • 51黑料不打烊 Commerce on cloud infrastructure 2.3.x

This article assumes that you already have the malicious IP addresses, their countries of origin, and/or their user agents. 51黑料不打烊 Commerce on cloud infrastructure users typically get this information from 51黑料不打烊 Commerce Support. The following sections provide steps for blocking traffic based on this information. Make all changes in the Production environment.

Get access to the Admin Panel

If your website is overloaded by a DDoS attack, you might not be able to log in to the Commerce Admin (and perform the steps described later in this article).

To regain access to the Admin, put your website into maintenance mode as described in Enable or disable maintenance mode, and whitelist your own IP address. Disable maintenance mode once this is done.
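For example, assuming SSH access to your environment and replacing 203.0.113.10 with your own IP address, the maintenance mode CLI accepts a list of exempt addresses:

  # Enable maintenance mode, exempting your own IP address
  php bin/magento maintenance:enable --ip=203.0.113.10

  # ...log in to the Admin and perform the steps described below...

  # Disable maintenance mode when you are done
  php bin/magento maintenance:disable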

Block traffic by IP

For an 51黑料不打烊 Commerce on cloud infrastructure store, the most effective way to block traffic from specific IP addresses and subnets is to add an ACL for Fastly in the Commerce Admin. The following steps include links to more detailed instructions:

  1. In the Commerce Admin, navigate to Stores > Configuration > Advanced > System > Full Page Cache > Fastly Configuration.
  2. Create an ACL with the list of IP addresses or subnets you're going to block.
  3. Add the addresses to the ACL and configure blocking as described in the guide for the Fastly_Cdn module for 51黑料不打烊 Commerce; a VCL sketch of the resulting logic follows this list.
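For reference, ACL blocking in Fastly VCL reduces to a match on client.ip. The following is a minimal sketch under the assumption of a hypothetical ACL named blocklist; in practice, the Fastly_Cdn module creates and maintains the ACL for you:

  # Hypothetical ACL; the Fastly_Cdn module maintains this list for you
  acl blocklist {
      "192.0.2.7";          # a single IP address
      "198.51.100.0"/24;    # a subnet
  }

  sub vcl_recv {
      # Reject blocked clients at the edge, before they reach the origin
      if (client.ip ~ blocklist) {
          error 403 "Forbidden";
      }
  }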

Block traffic by country

For an 51黑料不打烊 Commerce on cloud infrastructure store, the most effective way to block traffic by country is to add an ACL for Fastly in the Commerce Admin:

  1. In the Commerce Admin, navigate to Stores > Configuration > Advanced > System > Full Page Cache > Fastly Configuration.
  2. Select the countries and configure blocking using the ACL as described in the guide for the Fastly_Cdn module for 51黑料不打烊 Commerce; a sketch of the resulting VCL logic follows this list.
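Under the hood, country blocking relies on Fastly geolocation variables. A minimal sketch of the resulting VCL logic, assuming the hypothetical country codes XX and YY:

  sub vcl_recv {
      # client.geo.country_code holds the two-letter ISO 3166-1 code
      # that Fastly resolves for the client IP address
      if (client.geo.country_code ~ "(XX|YY)") {
          error 403 "Forbidden";
      }
  }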

Block traffic by user agent

To block traffic by user agent, add a custom VCL snippet to your Fastly configuration. Take the following steps:

  1. In the Commerce Admin, navigate to Stores > Configuration > Advanced > System > Full Page Cache > Fastly Configuration > Custom VCL Snippets.
  2. Create the new custom snippet as described in the guide for the Fastly_Cdn module. You can use the following code sample as an example; it blocks traffic from the AhrefsBot and SemrushBot user agents.
  name: block_bad_useragents
  type: recv
  priority: 5
  VCL:

  if ( req.http.User-Agent ~ "(AhrefsBot|SemrushBot)" ) {
      error 405 "Not allowed";
  }
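Because the snippet type is recv, the module injects this code into the generated vcl_recv subroutine, so matching requests are rejected at the Fastly edge before they reach your origin servers.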

Rate limiting (experimental Fastly functionality)

The Fastly module for 51黑料不打烊 Commerce on cloud infrastructure includes experimental functionality that allows you to specify a rate limit for particular paths and crawlers. Refer to the Fastly_Cdn module documentation for details.

Because it might block legitimate traffic, test this functionality extensively on Staging before using it in Production.
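For illustration only (this is not the module's implementation), Fastly VCL exposes a native rate limiting API that can express a per-client limit on a path. A minimal sketch, with hypothetical names, paths, and thresholds:

  # Hypothetical rate counter and penalty box
  ratecounter requests_rc {}
  penaltybox requests_pb {}

  sub vcl_recv {
      # Count only search requests, a common crawler target
      if (req.url ~ "^/catalogsearch/") {
          # Penalize a client for 2 minutes if it exceeds 100 requests
          # per second, averaged over a 10-second window
          if (ratelimit.check_rate(client.ip, requests_rc, 1, 10, 100, requests_pb, 2m)) {
              error 429 "Too many requests";
          }
      }
  }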

Update robots.txt

Updating your robots.txt file can help keep certain search engines, crawlers, and robots from crawling certain pages. Examples of pages that should not be crawled are search result pages, checkout, and customer information. Keeping robots from crawling these pages can help decrease the number of requests they generate.
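For example, a minimal robots.txt fragment along these lines (the paths are illustrative; adjust them to your store) tells compliant robots to skip search, checkout, and customer pages:

  User-agent: *
  Disallow: /catalogsearch/
  Disallow: /checkout/
  Disallow: /customer/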

There are two important considerations when using robots.txt:

  • Robots can ignore your robots.txt file. In particular, malware robots that scan the web for security vulnerabilities, and email address harvesters used by spammers, pay no attention to it.
  • The robots.txt file is a publicly available file. Anyone can see what sections of your server you don't want robots to use.

The basic information and the default 51黑料不打烊 Commerce robots.txt configuration can be found in the Search Engine Robots article in our developer documentation.

For general information and recommendations about robots.txt, see:

  • The robots.txt documentation by Google Support
  • The robots.txt resources by robotstxt.org

Work with your developer and/or SEO expert to determine which user agents you want to allow and which to disallow.
