Details

    • Type: Quality Risk
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: None
    • Fix Version/s: Backlog
    • Component/s: Documentation
    • Labels: None
    • Sprint: DV Sprint 66
    • Story Points: 0.1

Description

We should do a better job of directing which pages are crawled. I added a robots.txt to prevent crawling of the legacy content. Something similar should be added for teiid.github.com so that only master is crawled.
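
A minimal robots.txt sketch for this, assuming the master documentation is served under a /master/ path at the site root (the actual path layout may differ):

    # Hypothetical robots.txt for teiid.github.com:
    # disallow everything by default, then allow only the master docs
    User-agent: *
    Disallow: /
    Allow: /master/

Allow directives and longest-match precedence are defined in RFC 9309 and honored by the major search crawlers, though not every bot respects them.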

People

    • Assignee: Unassigned
    • Reporter: Steven Hawkins (shawkins)
    • Votes: 0
    • Watchers: 1

Dates

    • Created:
    • Updated:

Time Tracking

    • Original Estimate: 1h
    • Remaining Estimate: 1h
    • Time Spent: Not Specified