Yaouh! out (update for tangogps maps)

William Kenworthy billk at iinet.net.au
Tue Jan 20 08:34:53 CET 2009


Ah, can't be bothered to fix it up too much :)
Cut-and-pasted below.

Watch out for line breaks introduced by email.

Script 1 does the updates.
No error protection/recovery (it hasn't thrown any errors anyway!) - I
suspended the system it was running on last night (forgot about it!) and
it continued on fine when resumed, without a problem :)
I got carried away chomp'ing; it could be done better.
Change the paths to suit your system.
This was proof of concept - now that I know it works, I want to
use LWP instead of curl,
thread it for 6 or so concurrent downloads, and
add error protection
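For the curious, the LWP version might start out something like this
(untested sketch; assumes LWP::UserAgent is installed, and the
etag_to_md5 helper name is just illustrative):

```perl
use strict;
use warnings;

# Strip the surrounding quotes from an HTTP ETag value so it can be
# compared directly against a local md5sum (the OSM tile server's
# ETags are md5 hashes of the tile).
sub etag_to_md5 {
    my ($etag) = @_;
    $etag =~ s/^"//;
    $etag =~ s/"$//;
    return $etag;
}

# Fetching the ETag with LWP instead of shelling out to curl would
# look roughly like this (needs the non-core LWP::UserAgent module):
#   use LWP::UserAgent;
#   my $ua  = LWP::UserAgent->new;
#   my $res = $ua->head('http://tile.openstreetmap.org/0/0/0.png');
#   my $remote_md5 = etag_to_md5($res->header('ETag') // '');

print etag_to_md5('"d41d8cd98f00b204e9800998ecf8427e"'), "\n";
# prints d41d8cd98f00b204e9800998ecf8427e
```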

Script 2 generates a symlinked directory tree based on the original,
which is left unchanged. It can take a few hours to run, but it was
dealing with 3+ GB of files and could be better optimised. Once
generated, sync it to the FR using rsync over wlan
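The rsync step could be driven from Perl along these lines (the
destination host and paths below are just examples - adjust for your
FR's address):

```perl
use strict;
use warnings;

# Build the rsync command as a list so the shell never sees the args.
# -a includes -l, so the symlinks created by script 2 are copied as
# symlinks and the space saving carries over to the FR; --delete
# drops tiles that were removed from the master tree.
sub rsync_cmd {
    my ($src, $dest) = @_;
    return ('rsync', '-a', '--delete', $src, $dest);
}

# Example destination only - substitute your FR's hostname or IP.
my @cmd = rsync_cmd('/home/wdk/Maps/OSM-new/',
                    'root@freerunner:/home/root/Maps/OSM/');
print "@cmd\n";
# prints: rsync -a --delete /home/wdk/Maps/OSM-new/ root@freerunner:/home/root/Maps/OSM/

# Uncomment to actually run it:
# system(@cmd) == 0 or die "rsync failed: $?\n";
```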



BillK

On Tue, 2009-01-20 at 09:16 +0200, Risto H. Kurppa wrote:
> On Tue, Jan 20, 2009 at 1:20 AM, William Kenworthy <billk at iinet.net.au> wrote:
...
> 
> Could you please share your perl script to update the maps on the
> desktop, that's exactly what I'd like after I noticed how long it will
> take. On desktop people usually have more online time, more power and
> better connection than on Freerunner.
> ...
> 
___________________________________________
Script 1
#!/usr/bin/perl -w

use strict;
use File::Basename;
use File::Copy;

my $MD5SUM="/usr/bin/md5sum -b ";
my $tiles="http://tile.openstreetmap.org/";
my $OSM="/home/wdk/Maps/OSM/";
my $curl='/usr/bin/curl -I ';
my $find='/usr/bin/find /home/wdk/Maps/OSM -name \*.png';

print "Finding files\n";
my @ALLfiles=`$find`;

my $tmp = scalar @ALLfiles; print "$tmp\n"; # number of files found

foreach my $LOCfile (@ALLfiles) {
	chomp($LOCfile);
	my $md5sum  =`$MD5SUM $LOCfile`; $md5sum=~s/ .*$//; chomp($md5sum); # clean
	print "$LOCfile :: $md5sum\n";
	my $OSMfile = $LOCfile; $OSMfile=~s!$OSM!$tiles!; # swap paths from local to http
	my $OSMmd5sum = `$curl $OSMfile | grep ETag | cut -d '"' -f 2`; chomp($OSMmd5sum);
	print "$OSMfile :: $OSMmd5sum\n";
	if ($md5sum ne $OSMmd5sum) {
		print "$md5sum != $OSMmd5sum\n";
		print `wget $OSMfile` . "\n";
		my $basename = basename $LOCfile;
		move($basename, $LOCfile) or warn "move failed for $LOCfile: $!\n";
	}
	print "\n\n";
}

print "Files: $tmp\n";

___________________________________________

Script2
#!/usr/bin/perl -w

# symlinks identical OSM png files in a directory structure
# 1. load all png files into an array
# 2. load all directories into an array
# 3. use 2 to create a parallel tree
# 4. One file in 1. at a time:
# 	4.1 create an MD5 hash for the file
# 	4.2 check hash store of md5sums for an identical hash
# 		4.2.1 if a match, create a symlink in the new tree
# 		4.2.2 if *NOT* a match, copy file and add to hash store


use strict;
use File::Path;
use File::Copy;

my $MD5SUM='/usr/bin/md5sum -b';
my @ALLfiles=`find /home/wdk/Maps/OSM/ -name '*.png'`;
my @ALLdirs=`find /home/wdk/Maps/OSM/ -type d`;
my %Ufiles;

&DirStruc; ## create tree

foreach my $item (@ALLfiles) {
	chomp($item);
	chomp(my $TmpHash = `$MD5SUM $item`);
	$TmpHash =~ s/ .*$//; # md5sum returns the md5 hash AND the file name
	
	## if file exists in the hash make a symlink in the new tree to its master
	## else add to hash and copy file to new tree
	if (exists $Ufiles{$TmpHash}) {
		# extract value
		my $LinkTo = $Ufiles{$TmpHash};
		&MakeLink($LinkTo, $item);
	} else {
		&AddHash($TmpHash, $item);
	}
}

print "\n\nNumber of png files: " . ($#ALLfiles + 1) . "\n";

my $tmp = keys %Ufiles; print "Number of unique files: $tmp\n";



sub DirStruc {
	foreach my $dir (@ALLdirs) {
		$dir =~ s/OSM/OSM-new/;
		chomp($dir);
		eval { mkpath($dir) }; if ($@) { print "Couldn't create $dir: $@"; }
	}
}

sub MakeLink {
	my @tmp = @_;
	chomp(@tmp);
	$tmp[1] =~ s/OSM/OSM-new/;
	symlink($tmp[0], $tmp[1]) or print "Couldn't create symlink $tmp[1]: $!\n";
	print "x";
}

sub AddHash {
	my @tmp = @_;
	chomp($tmp[1]);
	$tmp[2] = $tmp[1];
	$tmp[2] =~ s/OSM/OSM-new/;
	copy($tmp[1], $tmp[2]) or die "File $tmp[1] cannot be copied to $tmp[2]: $!";
	$Ufiles{$tmp[0]}=$tmp[2]; ## add to hash as new file
	print ".";
}









