Monday, May 7, 2012

lose dozens of MBs just by following this one weird old tip!

The other week the Apple blog Daring Fireball linked to an adactio blog entry about how, after a number of years of forgetting about dialup users and focusing on folks blessed with broadband (which, conveniently, is an experience closer to what the developers themselves enjoy on their desktops), "mobile" is making a whole host of old concepts in bandwidth and loading efficiency relevant once more.

My company Alleyoop recently had a sprint dedicated to improving site/app performance, but we missed one obvious thing: optimizing our images. Those aforementioned links recommended a specific tool: ImageOptim. I heartily concur; using this is a total no-brainer. (Unfortunately it's OS X only, and I tend to code on PCs... I think some of the several format-specific tools it front-ends are available on other platforms, but as far as I know only Macs boast this "one stop shopping" aspect.)
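If you're stuck on a PC, you can approximate the batch behavior yourself. Here's a rough sketch in Perl (assuming you've installed optipng and jpegtran, two of the lossless optimizers in that same family, and that they're on your PATH; the "images" folder name is just a placeholder) that spiders a folder and rewrites each PNG and JPEG in place:

# Rough stand-in for ImageOptim's batch mode on non-Mac platforms.
# Assumes optipng and jpegtran are installed and on the PATH.
use File::Find;

$dir = "images";   # folder to optimize, subfolders included

find(sub {
  return unless -f $_;
  if (/\.png$/i) {
    # optipng losslessly rewrites the PNG in place
    system("optipng", "-o2", "-quiet", $_);
  } elsif (/\.jpe?g$/i) {
    # jpegtran writes to a separate file, so swap it in afterwards
    if (system("jpegtran", "-copy", "none", "-optimize", "-outfile", "$_.tmp", $_) == 0 && -s "$_.tmp") {
      rename("$_.tmp", $_);
    }
  }
}, $dir);

It won't chain as many optimizers per file as ImageOptim does, but it's the same basic idea.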

Not being the trusting type, I copied a large folder of images as a test, made a backup copy, and let ImageOptim do its thing. (ImageOptim is great at updating images "in place", including spidering through subfolders and ignoring non-image files, though its multi-tool approach makes it a bit slow.) I then wrote a script to generate an HTML doc with the original and the optimized version of each image side by side, and invited our designer to take a peek at the result. The visual differences between the images were negligible (the most suspicious thing was some very, very faint color shifting on some PNGs), but the byte savings were immense: the 3.2 meg contents of the folder smushed down to about 1.6 megs.

Here's the quick and exceedingly dirty Perl script I concocted to do the A/B test. It reads all the filenames in "files.txt", then shows each file from the before/ directory next to the same file under after/. The files are sorted in decreasing order of bytes saved, and a little additional savings information is printed as well.

print "<center><div style='width:3000px'><nobr>";
open(FILES,"files.txt");
while(defined($line=<FILES>)){
  chomp $line;
  push (@files,$line);
  $sizediff{$line} = (-s "before/$line") - (-s "after/$line");
  $sizepercentdiff{$line} = (100 * (-s "after/$line")) / (-s "before/$line");
  $totalsave += $sizediff{$line};
  $totalpercent += $sizepercentdiff{$line};
  $totalpercentCount += 1;


}
close FILES;


@sortfiles = @files; #sort numSort @files;
foreach $line (sort{$sizediff{$b} <=> $sizediff{$a}} @sortfiles){
  print "$line ".int(($sizediff{$line}/1000)+.5)."Kb Diff  (".int($sizepercentdiff{$line}+.5)."%)<br>";
  print "<img src='before/$line'><img src='after/$line'><br><br>\n";
}
print "TOTAL SAVED: ".int(($totalsave /1000)+ .5)."Kb AVG PERCENT: ".int(($totalpercent/ $totalpercentCount )+.5);
print "</nobr></div></center>";

(PROTIP: to get the bare list of files this program uses on Windows, "dir /b > files.txt" is just the thing.)
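(Or, on any platform, you could skip files.txt entirely and have Perl build the list itself. A rough sketch, assuming before/ and after/ hold matching filenames; you'd then move the size bookkeeping into a loop over @files:)

# Build @files straight from the before/ directory instead of files.txt.
opendir(DIR, "before") or die "can't open before/: $!";
@files = sort grep { -f "before/$_" } readdir(DIR);
closedir DIR;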

Man, I used to love Perl so much, and it still shows its utility in every job I've had since graduation, but my street Perl from the mid-90s is showing its age; I never really absorbed its object model, and its API can be so barebones... (plus, on the cheap webspace I rent, it never seems to have the useful modules I want already installed...)
