I am trying to create an .htaccess rule to redirect URLs containing a certain word, except for two pages.
Examples:
https://www.example.com/string
becomes:
https://www.example.com
https://www.example.com/string/some-page.html
becomes:
https://www.example.com/some-page.html
https://www.example.com/string/some-directory/some-page.html
becomes:
https://www.example.com/some-directory/some-page.html
https://www.example.com/string/some-directory/
becomes:
https://www.example.com/some-directory/
Except for these two pages, which must remain untouched:
https://www.example.com/string/checkout/cart/
stays:
https://www.example.com/string/checkout/cart/
and
https://www.example.com/string/checkout/onepage/#/steps
stays:
https://www.example.com/string/checkout/onepage/#/steps
I have tried every solution I could find on Google, but I can't seem to get even the basics working!
Any help is greatly appreciated. Thanks.
Edit: added my current rewrite rules
ModPagespeed on
AllowOverride None
AllowOverrideList None
Require all granted
DirectoryIndex index.php
ErrorDocument 401 default
ErrorDocument 401 "Authorization Required"
ErrorDocument 403 "Authorization Required"
ErrorDocument 500 /error-500.html
Options All -Indexes +FollowSymlinks
Options All
SSLOptions
RewriteEngine On
RewriteBase /
RewriteRule ^api/rest api.php?type=rest [QSA,L]
RewriteRule ^(.*)local.xml$ - [F,L]
RewriteCond %{THE_REQUEST} ^.*/index.php
RewriteRule ^(.*)index.php$ https://www.example.com$1 [R=301,L]
RewriteCond %{HTTP_HOST} ^example.com
RewriteRule (.*) https://www.example.com$1 [R=301,L]
RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]
RewriteCond %{REQUEST_URI} !^/(media|skin|js)/
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-l
RewriteRule .* index.php [L]
RewriteCond %{HTTP_HOST} ^11\.11\.11\.111
RewriteRule (.*) https://www.example.com$1 [R=301,L]
#RewriteCond %{HTTP:Accept-Language} (es) [NC]
#RewriteRule .* https://www.example.com/mx [L]
##### Block unwanted Crawler Bots that clog your server #####
RewriteCond %{HTTP_USER_AGENT} libwww-perl.*
RewriteRule .* ? [F,L]
RewriteCond %{HTTP_USER_AGENT} MJ12bot
RewriteRule .* - [F]
RewriteCond %{HTTP_USER_AGENT} 80legs [NC]
RewriteRule ^ - [F]
##############################################################
## Remove Bad Bots from crawling ##
# IF THE UA STARTS WITH THESE
# Block spambots
#
RewriteCond %{HTTP_USER_AGENT} ^(aesop_com_spiderman|alexibot|backweb|bandit|batchftp|bigfoot) [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^(black.?hole|blackwidow|blowfish|botalot|buddy|builtbottough|bullseye) [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^(cheesebot|cherrypicker|chinaclaw|collector|copier|copyrightcheck) [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^(cosmos|crescent|curl|custo|da|diibot|disco|dittospyder|dragonfly) [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^(drip|easydl|ebingbong|ecatch|eirgrabber|emailcollector|emailsiphon) [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^(emailwolf|erocrawler|exabot|eyenetie|filehound|flashget|flunky) [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^(frontpage|getright|getweb|go.?zilla|go-ahead-got-it|gotit|grabnet) [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^(grafula|harvest|hloader|hmview|httplib|httrack|humanlinks|ilsebot) [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^(infonavirobot|infotekies|intelliseek|interget|iria|jennybot|jetcar) [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^(joc|justview|jyxobot|kenjin|keyword|larbin|leechftp|lexibot|lftp|libweb) [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^(likse|linkscan|linkwalker|lnspiderguy|lwp|magnet|mag-net|markwatch) [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^(mata.?hari|memo|microsoft.?url|midown.?tool|miixpc|mirror|missigua) [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^(mister.?pix|moget|mozilla.?newt|nameprotect|navroad|backdoorbot|nearsite) [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^(net.?vampire|netants|netcraft|netmechanic|netspider|nextgensearchbot) [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^(attach|nicerspro|nimblecrawler|npbot|octopus|offline.?explorer) [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^(offline.?navigator|openfind|outfoxbot|pagegrabber|papa|pavuk) [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^(pcbrowser|php.?version.?tracker|pockey|propowerbot|prowebwalker) [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^(psbot|pump|queryn|recorder|realdownload|reaper|reget|true_robot) [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^(repomonkey|rma|internetseer|sitesnagger|siphon|slysearch|smartdownload) [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^(snake|snapbot|snoopy|sogou|spacebison|spankbot|spanner|sqworm|superbot) [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^(superhttp|surfbot|asterias|suzuran|szukacz|takeout|teleport) [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^(telesoft|the.?intraformant|thenomad|tighttwatbot|titan|urldispatcher) [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^(turingos|turnitinbot|urly.?warning|vacuum|vci|voideye|whacker) [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^(libwww-perl|widow|wisenutbot|wwwoffle|xaldon|xenu|zeus|zyborg|anonymouse) [NC,OR]
#
## STARTS WITH WEB
#
RewriteCond %{HTTP_USER_AGENT} ^web(zip|emaile|enhancer|fetch|go.?is|auto|bandit|clip|copier|master|reaper|sauger|site.?quester|whack) [NC,OR]
#
## ANYWHERE IN UA -- GREEDY REGEX
#
RewriteCond %{HTTP_USER_AGENT} ^.*(craftbot|download|extract|stripper|sucker|ninja|clshttp|webspider|leacher|collector|grabber|webpictures).*$ [NC]
#
RewriteRule . - [F,L]
#
## Useragents starting with
#
RewriteCond %{HTTP_USER_AGENT} ^atraxbot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Azureus [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^geohasher [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^PycURL [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Python-urllib [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^research-scan-bot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Sosospider [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Wget [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^xenu [NC,OR]
#
## User agents contains string
#
RewriteCond %{HTTP_USER_AGENT} ^.*casper [NC,OR]
RewriteCond %{REQUEST_METHOD} ^(HEAD|TRACE|DELETE|TRACK) [NC,OR]
#
## Block out use of illegal or unsafe characters in the HTTP Request
#
RewriteCond %{THE_REQUEST} ^.*(\\r|\\n|%0A|%0D).* [NC,OR]
#
## Block out use of illegal or unsafe characters in the Referer Variable of the HTTP Request
## RewriteCond %{HTTP_REFERER} ^(.*)(<|>|'|%0A|%0D|%27|%3C|%3E|%00).* [NC,OR]
## Block out use of illegal or unsafe characters in any cookie associated with the HTTP Request
#
RewriteCond %{HTTP_COOKIE} ^.*(<|>|'|%0A|%0D|%27|%3C|%3E|%00).* [NC,OR]
#
## Block out use of illegal characters in URI or use of malformed URI
#
RewriteCond %{REQUEST_URI} ^/(,|;|:|<|>|">|"<|/|\\\.\.\\).{0,9999}.* [NC,OR]
#
## Block out use of empty User Agent Strings
## NOTE - disable this rule if your site is integrated with Payment Gateways such as PayPal
## RewriteCond %{HTTP_USER_AGENT} ^$ [OR]
## Block out use of illegal or unsafe characters in the User Agent variable
#
RewriteCond %{HTTP_USER_AGENT} ^.*(<|>|'|%0A|%0D|%27|%3C|%3E|%00).* [NC,OR]
#
## Measures to block out SQL injection attacks
#
RewriteCond %{QUERY_STRING} ^.*(;|<|>|'|"|\)|%0A|%0D|%22|%27|%3C|%3E|%00).*(/\*|union|select|insert|cast|set|declare|drop|update|md5|benchmark).* [NC,OR]
#
## Block out reference to localhost/loopback/127.0.0.1 in the Query String
#
RewriteCond %{QUERY_STRING} ^.*(localhost|loopback|127\.0\.0\.1).* [NC,OR]
#
## Block out use of illegal or unsafe characters in the Query String variable
#
RewriteCond %{QUERY_STRING} ^.*(<|>|'|%0A|%0D|%27|%3C|%3E|%00).* [NC]
#
########## Begin - File injection protection, by SigSiu.net
RewriteCond %{REQUEST_METHOD} GET
RewriteCond %{QUERY_STRING} [a-zA-Z0-9_]=http:// [OR]
RewriteCond %{QUERY_STRING} [a-zA-Z0-9_]=(\.\.//?)+ [OR]
RewriteCond %{QUERY_STRING} [a-zA-Z0-9_]=/([a-z0-9_.]//?)+ [NC]
RewriteRule .* - [F]
########## End - File injection protection
Answer 0 (score: 0)
The Apache documentation for .htaccess can be hard to follow at first. .htaccess has been around since the first web servers and has morphed into what we fiddle with today. I have had to puzzle out things like this more than once. There are definitely several ways to achieve what you want, which makes it even more confusing. Here is a .htaccess file that should get you there:
RewriteEngine On
RewriteBase /
# allow my ln -s links
Options +FollowSymLinks
# pass everything with /string/checkout as-is
RewriteCond %{REQUEST_URI} ^/string/checkout/(.*)$
RewriteRule ^(.*)$ /string/checkout/%1 [END]
# remove "/string" from all other URIs
RewriteCond %{REQUEST_URI} ^/string/(.*)$
RewriteRule ^(.*)$ /%1 [R,L]
It would be possible to handle only the 2 pages you asked about, but you probably want to let everything under the /string/checkout URI path pass through unchanged.
You may not need the RewriteBase / and Options +FollowSymLinks lines, but I needed them when I spun this up on my dev server.
The [END] flag on the first rule means stop all further rule processing, which protects your /string/checkout URIs from being clobbered by the second rule, which strips the /string part from every matching URI that makes it that far.
The [R,L] flags on the second rule mean redirect [R] and make this the last [L] rule processed in this rule set.
The %1 is replaced by whatever matched the regular expression in the RewriteCond of each rule. There is a lot packed into these few lines, so if you need more background, work through the relevant Apache documentation:
https://httpd.apache.org/docs/2.4/mod/mod_rewrite.html
https://httpd.apache.org/docs/2.4/rewrite/flags.html
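To make the %1 backreference concrete, here is a minimal sketch of the same pattern, using a hypothetical request for /string/some-page.html:

```apache
# For a hypothetical request to /string/some-page.html:
# %1 holds "some-page.html", captured by the RewriteCond's group,
# while $1 in the RewriteRule would hold the full "string/some-page.html"
RewriteCond %{REQUEST_URI} ^/string/(.*)$
RewriteRule ^(.*)$ /%1 [R,L]
```

In short, $N refers back to the RewriteRule's own pattern and %N to the last matched RewriteCond, which is why the rules above capture in the condition and substitute with %1.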
I uploaded a source demo using simple HTML here:
https://github.com/ByteSlinger/Htaccess-Rewrite-Demo
Testing and demonstrating this can be tricky if you don't have a root website to play with, so there is also a live demo running from a subdirectory:
Answer 1 (score: 0)
This can be done in a single rule using a negative lookahead expression:
# Redirect /string/... to /..., except anything under /string/checkout/
RewriteRule ^string/((?!checkout/).*)$ /$1 [R=301,L]
Make sure this rule is your topmost rule, and test it in a fresh browser session to avoid stale cached redirects.
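If the lookahead feels opaque, the same exception can also be expressed with a separate condition instead of a single rule; this is a sketch assuming the same literal /string prefix:

```apache
# Skip the rewrite entirely for anything under the checkout path
RewriteCond %{REQUEST_URI} !^/string/checkout/
# Otherwise strip the leading /string and redirect permanently
RewriteRule ^string/(.*)$ /$1 [R=301,L]
```

The two forms behave the same for the URLs in the question; the RewriteCond version is often easier to extend later if more excluded paths turn up.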