
Add X-Robots-Tag header to HTTP tunnels to prevent search engine indexing


Problem

Google Search Console has flagged *.shellhub.cloud (our HTTP tunneling domain) as containing "deceptive pages" due to user-hosted content being indexed by search engines.

Since ShellHub provides HTTP tunneling services (similar to ngrok), the content served through *.shellhub.cloud subdomains is controlled by end-users and can include login pages, admin panels, or other interfaces that trigger false positives in Google Safe Browsing.

Example affected URLs:

  • https://*.shellhub.cloud/auth/login
  • https://*.shellhub.cloud/admin/users

These are legitimate self-hosted applications (e.g., Immich) from our users, not phishing attempts.

Solution

Add the `X-Robots-Tag: noindex, nofollow` HTTP header to all responses served through HTTP tunnels on *.shellhub.cloud.

This will:

  1. Prevent search engines from indexing user-hosted content
  2. Avoid future false positive security warnings
  3. Maintain clear separation between our control panel (cloud.shellhub.io) and user tunneling infrastructure

Implementation

The header should be added at the reverse proxy level for all HTTP tunnel responses:

```
X-Robots-Tag: noindex, nofollow
```

Additional Context

This is a common issue for HTTP tunneling providers. Similar services (ngrok, localtunnel, etc.) implement the same solution to prevent search engine crawlers from indexing user-generated content.

After implementation, we will submit a review request to Google Search Console explaining the nature of our service.

gustavosbarreto avatar Oct 20 '25 19:10 gustavosbarreto

Hey, can I do this? If so, just assign me.

ignorant05 avatar Nov 27 '25 19:11 ignorant05