To ask all robots that crawl your pages to skip a specific page, put this in your site's robots.txt:
Code:
User-agent: *
Disallow: /html/file.html
Change the highlighted path to the file you want hidden. Keep in mind that robots.txt is only a request - well-behaved crawlers honor it, but it does not actually block access.
When you say you want to ban them from seeing the file again after one viewing, that's hard to do reliably. There are two approaches I can think of, and I would use both. The first is storing visitor IPs in a database and checking it on each request to see whether that user has already viewed the page. You first have to decide between a flat-file database and a SQL database; I'd suggest SQL if you can get one. The following code uses a flat file.
Code:
<?php
// Check a list of banned IPs stored one per line in banned.txt.
function find($file, $ip){
    $exception = array('11.11.111.111'); // IPs that are never banned
    if(in_array($ip, $exception)){
        return; // whitelisted, do nothing
    }
    if(strpos($file, $ip) !== false){
        die("$ip is banned.");
    }
    // First visit: record the IP so the next visit is blocked.
    // FILE_APPEND keeps earlier entries instead of overwriting the file.
    file_put_contents('banned.txt', $ip . PHP_EOL, FILE_APPEND);
}
find(file_get_contents('banned.txt'), $_SERVER['REMOTE_ADDR']);
?>
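If you do get access to a SQL database, the same check can be sketched with PDO. This is only a sketch under assumptions: the table name `banned`, its single `ip` column, and the connection credentials are all placeholders for whatever you actually set up, and INSERT IGNORE is MySQL syntax.

Code:
<?php
// Hypothetical schema: CREATE TABLE banned (ip VARCHAR(45) PRIMARY KEY);
$pdo = new PDO('mysql:host=localhost;dbname=mysite', 'user', 'pass');
$ip  = $_SERVER['REMOTE_ADDR'];

// Has this IP been recorded already?
$stmt = $pdo->prepare('SELECT 1 FROM banned WHERE ip = ?');
$stmt->execute(array($ip));
if($stmt->fetchColumn()){
    die("$ip is banned.");
}

// First visit: record the IP. INSERT IGNORE avoids a duplicate-key
// error if two requests arrive at the same time (adjust for non-MySQL).
$pdo->prepare('INSERT IGNORE INTO banned (ip) VALUES (?)')->execute(array($ip));
?>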
The second suggestion is using cookies: set one on the first view and check for it on later requests. It's the weaker option, since the user can simply clear their cookies, which is why I'd combine it with the IP check.
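A minimal sketch of the cookie approach - the cookie name `seen_page` is just an example, and the check must run before any HTML output because setcookie() sends an HTTP header:

Code:
<?php
// Already visited? Block the page.
if(isset($_COOKIE['seen_page'])){
    die('You have already viewed this page.');
}
// Remember the visit for one year; path '/' makes it site-wide.
setcookie('seen_page', '1', time() + 365 * 24 * 60 * 60, '/');
?>
<p>Here is the one-time content.</p>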