This morning, there was an article where the EFF is claiming that just because you turn off cookies and JavaScript in your browser doesn’t mean you’re not giving away information. Unfortunately, they are very correct. Your browser will give away ALL kinds of information about your computer, such as the operating system, browser type and version number, browser plugins, and more.

I’ve used this exact same information for years to gather information about visitors on a site whose logs I couldn’t monitor directly. What I did was use a CGI script, written in Perl, to modify the HTTP header to point to a transparent image that was one pixel high and one pixel wide. It’s very easy to hide an image when it’s transparent and only a single pixel.
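Embedding the pixel is just an ordinary image tag pointing at the CGI script. The path and filename below are only examples, not from the original site:

```html
<!-- The src path is hypothetical; point it at wherever the tracking CGI script lives. -->
<img src="/cgi-bin/tracker.cgi" width="1" height="1" alt="">
```

Because the image is transparent and a single pixel, it can sit at the bottom of any page without being noticed.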

The information that this script grabbed was the IP address, the date and time the image was accessed, the browser user agent, and the referring URL. That’s enough information to get an idea of what content people are looking at and even to identify unique and repeat visitors.

Here is a sample script that I’ve used before.

#!/usr/bin/perl -w

use strict;
use DBI;

my $imgurl = "";

## Redirect the request to the transparent 1x1 image.
print "Cache-control: no-cache\n";
print "Content-type: image/gif\n";
print "Location: $imgurl\n\n";

## Grab the visitor details from the CGI environment.
my $refer   = $ENV{HTTP_REFERER}    || "";
my $ipaddr  = $ENV{REMOTE_ADDR}     || "";
my $browser = $ENV{HTTP_USER_AGENT} || "";

my $dbh = DBI->connect("DBI:mysql:database=mytracker;host=localhost", "username", "password", {'RaiseError' => 1});

## Bind values keep quotes in the referrer or user agent from breaking the INSERT.
my $rows = $dbh->do("INSERT INTO trackerlogs (id, date, referurl, ipaddress, useragent) VALUES ('', NOW(), ?, ?, ?)", undef, $refer, $ipaddr, $browser);

$dbh->disconnect;
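The script assumes a trackerlogs table already exists in the mytracker database. A minimal schema that matches the INSERT might look like the following; the column types are my guess, not taken from the original setup:

```sql
CREATE TABLE trackerlogs (
  id        INT UNSIGNED NOT NULL AUTO_INCREMENT,
  date      DATETIME NOT NULL,
  referurl  VARCHAR(255),
  ipaddress VARCHAR(45),
  useragent VARCHAR(255),
  PRIMARY KEY (id)
);
```

Inserting an empty string for id works in MySQL because the AUTO_INCREMENT column generates the next value on its own.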

Here’s an example of the information that the log generates:

2009-10-11 19:54:06 Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.0; Trident/4.0; FunWebProducts; Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1) ; SLCC1; .NET CLR 2.0.50727; Media Center PC 5.0; .NET CLR 3.5.30729; .NET CLR 3.0.30618; AskTB5.3)

Here’s a link to the article: EFF Browser Fingerprints Article


I was recently tasked with coming up with a backup solution for our Linux-based servers. My solution was to use rsync over SSH to pull over the data that we wanted and then use tar to create daily archives, which we can then move off the server to some other type of storage media or a remote server.

After creating a Linux server that I would use as the backup server, I set up SSH with a public key exchange.

To do this, I typed “ssh-keygen” on my Linux backup server.

root@linuxbackup:~# ssh-keygen
Generating public/private rsa key pair.
Enter file in which to save the key (/root/.ssh/id_rsa):
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
Your identification has been saved in /root/.ssh/id_rsa.
Your public key has been saved in /root/.ssh/
The key fingerprint is:
c3:81:ee:64:5b:d0:8c:8e:5a:ff:68:49:be:f4:ff:68 root@linuxbackup

After creating a public key on my Linux backup server, I moved the public key over to the servers that the backup server would be accessing.

root@linuxbackup:~# ssh-copy-id -i .ssh/ root@server01

To automate the process, I created a custom Perl script.


#!/usr/bin/perl -w

use strict;
use Time::localtime;

## Date and Time Configuration
my $tm = localtime;
my ($day, $month, $year) = ($tm->mday, $tm->mon, $tm->year);
$year  += 1900;
$month += 1;

## User Changeable Variables
my $archiveDir = "/data/archive";
my @server     = ("server01", "server02");

## The Nitty Gritty
my $args = $ARGV[0] || "";

if ($args eq "help") {
    print "\n$0 help | auto | list\n\n";
    print "help - Lists all available options.\n";
    print "auto - Automatically runs the backup functions on the servers in the \@server list.\n";
    print "list - Lists the servers that will be backed up.\n";
} elsif ($args eq "list") {
    print "$_\n" foreach @server;
} elsif ($args eq "auto") {
    foreach my $box (@server) {
        ## Pull the standard directories from every server.
        `rsync -ae ssh --delete $box:/root /data/$box`;
        `rsync -ae ssh --delete $box:/home /data/$box`;
        `rsync -ae ssh --delete $box:/etc /data/$box`;
        `rsync -ae ssh --delete $box:/var /data/$box`;
        ## server02 has an extra directory that the others don't.
        if ($box eq "server02") {
            `rsync -ae ssh --delete $box:/customdir /data/$box`;
        }
        ## Create the daily archive for this server.
        `tar -cpjf $archiveDir/$box-$month$day$year.tar.bz2 /data/$box/`;
    }
} else {
    print "Error: Invalid Option.\n";
    print "Type: $0 help\n";
}
You will notice that the perl script is pretty simple, but written in a way that it can be easily expanded upon. For example, you might get to the point where keeping up with the @server array is more maintenance than it’s worth. You could easily have the perl script access a MySQL database to pull a list of servers and the directories that needed to be pulled over via rsync. You could also add options so that it automatically put the tar.bz2 archive files onto remote storage or even tape.

To automate the script, save it in a place like /usr/sbin/ and then create a bash script in /etc/cron.daily/ that executes it with the “auto” argument. It’s really pretty simple.
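The cron.daily wrapper can be as small as the following. The wrapper name and the script path here are hypothetical; point it at wherever you actually saved the backup script:

```shell
#!/bin/sh
# /etc/cron.daily/serverbackup -- filename is just an example.
# Run the backup script (assumed saved as /usr/sbin/backup.pl) in auto mode.
/usr/sbin/backup.pl auto
```

Remember to make the wrapper executable (chmod +x) or cron will skip it.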
