boB Rudis
7 years ago
21 changed files with 131 additions and 129 deletions
@@ -1,14 +1,14 @@
-#' Tools to Parse and Test Robots Exclusion Protocol Files and Rules
+#' Parse and Test Robots Exclusion Protocol Files and Rules
 #'
 #' The 'Robots Exclusion Protocol' (<http://www.robotstxt.org/orig.html>) documents a set
 #' of standards for allowing or excluding robot/spider crawling of different areas of
-#' site content. Tools are provided which wrap The 'rep-cpp` <https://github.com/seomoz/rep-cpp>
+#' site content. Tools are provided which wrap The `rep-cpp` <https://github.com/seomoz/rep-cpp>
 #' C++ library for processing these `robots.txt`` files.
 #'
 #' @md
-#' @name rep
+#' @name spiderbar
 #' @docType package
 #' @author Bob Rudis (bob@@rud.is)
-#' @useDynLib rep, .registration=TRUE
+#' @useDynLib spiderbar, .registration=TRUE
 #' @importFrom Rcpp sourceCpp
 NULL
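The roxygen block above only describes what the renamed package wraps; as a minimal usage sketch (assuming spiderbar's exported `robxp()` and `can_fetch()` functions, which are not shown in this diff), parsing a `robots.txt` and testing a path might look like:

```r
library(spiderbar)

# Parse robots.txt content into a rules object (assumes robxp() accepts
# a character vector of robots.txt lines)
rt <- robxp(c(
  "User-agent: *",
  "Disallow: /private/"
))

# Ask whether a given path may be crawled by the default user agent
can_fetch(rt, "/public/index.html")   # expected TRUE
can_fetch(rt, "/private/data.html")   # expected FALSE
```

This mirrors the allow/deny checks the underlying `rep-cpp` C++ library performs.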
@ -1,15 +0,0 @@ |
|||
% Generated by roxygen2: do not edit by hand |
|||
% Please edit documentation in R/rep-package.R |
|||
\docType{package} |
|||
\name{rep} |
|||
\alias{rep} |
|||
\alias{rep-package} |
|||
\title{Tools to Parse and Test Robots Exclusion Protocol Files and Rules} |
|||
\description{ |
|||
The 'Robots Exclusion Protocol' (\url{http://www.robotstxt.org/orig.html}) documents a set |
|||
of standards for allowing or excluding robot/spider crawling of different areas of |
|||
site content. Tools are provided which wrap The 'rep-cpp\code{<https://github.com/seomoz/rep-cpp> C++ library for processing these}robots.txt`` files. |
|||
} |
|||
\author{ |
|||
Bob Rudis (bob@rud.is) |
|||
} |
@@ -0,0 +1,16 @@
+% Generated by roxygen2: do not edit by hand
+% Please edit documentation in R/spiderbar-package.R
+\docType{package}
+\name{spiderbar}
+\alias{spiderbar}
+\alias{spiderbar-package}
+\title{Parse and Test Robots Exclusion Protocol Files and Rules}
+\description{
+The 'Robots Exclusion Protocol' (\url{http://www.robotstxt.org/orig.html}) documents a set
+of standards for allowing or excluding robot/spider crawling of different areas of
+site content. Tools are provided which wrap The \code{rep-cpp} \url{https://github.com/seomoz/rep-cpp}
+C++ library for processing these `robots.txt`` files.
+}
+\author{
+Bob Rudis (bob@rud.is)
+}
@@ -1,3 +1,3 @@
 library(testthat)
 library(robotstxt)
-test_check("rep")
+test_check("spiderbar")