3. Site availability
Since Bing refers users to your website to view the documents, your webpages must be available to both users and crawlers at all times. The search robots will visit your webpages periodically to pick up updates and to verify that your URLs are still available. If the search robots are unable to fetch your webpages, e.g., due to server errors, misconfiguration, or an overly slow response from your website, then some or all of your articles could drop out of Bing and Bing Scholar.
- Use HTTP 5xx codes to indicate temporary errors that should be retried soon, such as a short-term shortage of backend capacity.
- Use HTTP 4xx codes to indicate permanent errors that should not be retried for a while, such as file not found.
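As a minimal sketch of this status-code policy, the helper below (the function name, the capacity flag, and the set of known paths are all hypothetical, not part of any real API) shows how a server might decide which code a crawler should receive:

```python
def choose_status(path: str, backend_overloaded: bool, known_paths: set) -> int:
    """Pick the HTTP status code to return for a crawler's request.

    `backend_overloaded` and `known_paths` are hypothetical stand-ins for
    whatever capacity check and document index a real server would use.
    """
    if backend_overloaded:
        return 503  # temporary error (5xx): the crawler should retry soon
    if path not in known_paths:
        return 404  # permanent error (4xx): the crawler should not retry for a while
    return 200      # document is available

# Example usage:
docs = {"/paper1.pdf", "/paper2.pdf"}
choose_status("/paper1.pdf", False, docs)   # available document
choose_status("/missing.pdf", False, docs)  # not found
choose_status("/paper1.pdf", True, docs)    # backend temporarily overloaded
```

A server under temporary strain can also send a `Retry-After` header alongside the 503 to suggest when crawlers should come back.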