CONTENTS OF THIS FILE
---------------------

 * Introduction
 * Installation
 * Frequently Asked Questions (FAQ)
 * Known Issues
 * How Can You Contribute?


INTRODUCTION
------------

Maintainer: hass <https://drupal.org/user/85918>
Project Page: https://drupal.org/project/robotstxt

Use this module when you are running multiple Drupal sites from a single code
base (multisite) and you need a different robots.txt file for each one. This
module generates the robots.txt file dynamically and lets you edit it on a
per-site basis.

Developers can automatically add paths to the generated robots.txt file by
implementing hook_robotstxt(). See robotstxt.api.php for more documentation.
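
As an illustration, here is a minimal sketch of such a hook in a hypothetical
custom module named "mymodule" (the hook returns an array of lines that are
appended to the dynamically generated robots.txt output):

  /**
   * Implements hook_robotstxt().
   */
  function mymodule_robotstxt() {
    // Each returned string becomes one line in the generated robots.txt.
    return array(
      'Disallow: /foo',
      'Disallow: /bar',
    );
  }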


INSTALLATION
------------

See https://drupal.org/getting-started/install-contrib for instructions on
how to install or update Drupal modules.

Once you have the RobotsTxt module installed, make sure to delete or rename
the robots.txt file in the root of your Drupal installation. Otherwise, the
module cannot intercept requests for the /robots.txt path.


FREQUENTLY ASKED QUESTIONS
--------------------------

Q: Can this module work if I have clean URLs disabled?
A: Yes, it can! In the .htaccess file in your Drupal root directory, add the
   following two lines to the mod_rewrite section, immediately after the line
   that reads "RewriteEngine on":

   # Add redirection for the robots.txt path for use with the RobotsTxt module.
   RewriteRule ^(robots.txt)$ index.php?q=$1

Q: Does this module work together with Drupal Core "Fast 404 pages" feature?
A: Yes, but you need to add robots.txt to the '404_fast_paths_exclude'
   setting in your settings.php.

   Drupal (default):
   $conf['404_fast_paths_exclude'] = '/\/(?:styles)|(?:system\/files)\//';

   Drupal with RobotsTxt module:
   $conf['404_fast_paths_exclude'] = '/\/(?:styles)|(?:system\/files)\/|(?:robots.txt)/';

Q: How can I install the module with a custom default robots.txt?
A: The module allows you to add a default.robots.txt to the site's default
   folder.

   1. Remove the robots.txt file from the site root.
   2. Save your custom robots.txt as "/sites/default/default.robots.txt".
   3. Run the module installation.


KNOWN ISSUES
------------

There are no known issues at this time.

To submit bug reports, feature requests, and support requests, visit
https://drupal.org/project/issues/robotstxt.


HOW CAN YOU CONTRIBUTE?
-----------------------

- Report any bugs, feature requests, etc. in the issue tracker.
  https://drupal.org/project/issues/robotstxt
