3rd in my series of talks at SVPerl about Mojolicious; this one reviews using the server-side features for building bespoke mock servers, and adds a quick overview of the Mojo client features, which I had missed until last week. Color me corrected.
I was having trouble watching the Théâtre du Châtelet performance of Einstein on the Beach at home; my connection was stuttering and buffering, which makes listening to highly-pulsed minimalist music extremely unrewarding. Nothing like a hitch in the middle of the stream to throw you out of the zone that Glass is trying to establish. (This is a brilliant staging of this opera and you should go watch it Right Now.)
So I started casting around for a way to download the video and watch it at my convenience. (Public note: I would never redistribute the recording; this is solely to allow me to timeshift the recording such that I can watch it continuously.) I looked at the page and thought, “yeah, I could work this out, but isn’t there a better way?” I searched for a downloader for the site in question, and found it mentioned in a comment in the GitHub pages for youtube-dl.
I wasn’t 100% certain that this would work, but a quick perusal seemed to indicate that it was a nicely sophisticated Python script that ought to be able to do the job. I checked it out and tried a run; it needed a few things installed, most importantly ffmpeg. At this point I started getting a little excited, as I knew ffmpeg should be quite capable of doing any re-encoding the stream might need.
A quick brew install later, I had ffmpeg, and I asked for the download (this is where we’d gotten to while I’ve been writing this post):
$ youtube_dl/__main__.py http://culturebox.francetvinfo.fr/einstein-on-the-beach-au-theatre-du-chatelet-146813
[culturebox.francetvinfo.fr] einstein-on-the-beach-au-theatre-du-chatelet-146813: Downloading webpage
[culturebox.francetvinfo.fr] EV_6785: Downloading XML config
[download] Destination: Einstein on the beach au Théâtre du Châtelet-EV_6785.mp4
ffmpeg version 1.2.1 Copyright (c) 2000-2013 the FFmpeg developers
  built on Jan 12 2014 20:50:55 with Apple LLVM version 5.0 (clang-500.2.79) (based on LLVM 3.3svn)
  configuration: --prefix=/usr/local/Cellar/ffmpeg/1.2.1 --enable-shared --enable-pthreads --enable-gpl --enable-version3 --enable-nonfree --enable-hardcoded-tables --enable-avresample --enable-vda --cc=cc --host-cflags= --host-ldflags= --enable-libx264 --enable-libfaac --enable-libmp3lame --enable-libxvid
  libavutil      52. 18.100 / 52. 18.100
  libavcodec     54. 92.100 / 54. 92.100
  libavformat    54. 63.104 / 54. 63.104
  libavdevice    54.  3.103 / 54.  3.103
  libavfilter     3. 42.103 /  3. 42.103
  libswscale      2.  2.100 /  2.  2.100
  libswresample   0. 17.102 /  0. 17.102
  libpostproc    52.  2.100 / 52.  2.100
[h264 @ 0x7ffb5181ac00] non-existing SPS 0 referenced in buffering period
[h264 @ 0x7ffb5181ac00] non-existing SPS 15 referenced in buffering period
[h264 @ 0x7ffb5181ac00] non-existing SPS 0 referenced in buffering period
[h264 @ 0x7ffb5181ac00] non-existing SPS 15 referenced in buffering period
[mpegts @ 0x7ffb52deb000] max_analyze_duration 5000000 reached at 5013333 microseconds
[mpegts @ 0x7ffb52deb000] Could not find codec parameters for stream 2 (Unknown: none ([21][0][0][0] / 0x0015)): unknown codec
Consider increasing the value for the 'analyzeduration' and 'probesize' options
[mpegts @ 0x7ffb52deb000] Estimating duration from bitrate, this may be inaccurate
[h264 @ 0x7ffb51f9aa00] non-existing SPS 0 referenced in buffering period
[h264 @ 0x7ffb51f9aa00] non-existing SPS 15 referenced in buffering period
[hls,applehttp @ 0x7ffb51815c00] max_analyze_duration 5000000 reached at 5013333 microseconds
[hls,applehttp @ 0x7ffb51815c00] Could not find codec parameters for stream 2 (Unknown: none ([21][0][0][0] / 0x0015)): unknown codec
Consider increasing the value for the 'analyzeduration' and 'probesize' options
Input #0, hls,applehttp, from 'http://ftvodhdsecz-f.akamaihd.net/i/streaming-adaptatif/evt/pf-culture/2014/01/6785-1389114600-1-,320x176-304,512x288-576,704x400-832,1280x720-2176,k.mp4.csmil/index_2_av.m3u8':
  Duration: 04:36:34.00, start: 0.100667, bitrate: 0 kb/s
  Program 0
    Metadata:
      variant_bitrate : 0
    Stream #0:0: Video: h264 (Main) ([27][0][0][0] / 0x001B), yuv420p, 704x396, 12.50 fps, 25 tbr, 90k tbn, 50 tbc
    Stream #0:1: Audio: aac ([15][0][0][0] / 0x000F), 48000 Hz, stereo, fltp, 102 kb/s
    Stream #0:2: Unknown: none ([21][0][0][0] / 0x0015)
Output #0, mp4, to 'Einstein on the beach au Théâtre du Châtelet-EV_6785.mp4.part':
  Metadata:
    encoder : Lavf54.63.104
    Stream #0:0: Video: h264 ([33][0][0][0] / 0x0021), yuv420p, 704x396, q=2-31, 12.50 fps, 90k tbn, 90k tbc
    Stream #0:1: Audio: aac ([64][0][0][0] / 0x0040), 48000 Hz, stereo, 102 kb/s
Stream mapping:
  Stream #0:0 -> #0:0 (copy)
  Stream #0:1 -> #0:1 (copy)
Press [q] to stop, [?] for help
frame=254997 fps=352 q=-1.0 size= 1072839kB time=02:49:59.87 bitrate= 861.6kbits/s
Son of a gun. It works.
I’m waiting for the download to complete to be sure I got the whole video, but I am pretty certain this is going to work. Way better than playing screen-capture games. We’ll see how it looks when we’re all done, but I’m quite pleased to have it at all. The download appears to be happening at about 10x realtime, so I should have it all in about 24 minutes, give or take (it’s a four-hour, or 240 minute, presentation).
Update: Sadly, does not work for PBS videos, but you can actually buy those; I can live with that.
This is my Test::Routine slide deck for the presentation I ended up doing from memory at the last SVPerl.org meeting. I remembered almost all of it except for the Moose trigger and modifier demos – but since I didn’t have any written yet, we didn’t miss those either!
Update: My WordPress installation seems to have misplaced this file. I’ll look around for it and try to put it back soon.
[This was originally asked on Quora, but the result of figuring this out was interesting enough that I thought I’d make it a blog post.]
It’s a very interesting question, because there are so many differing kinds of computing capability in that one device: the parallel processing power of the GPU slinging bits to the display, and the straight-ahead FLOPS of the ARM processor.
Let’s try some back of the envelope calculations and comparisons.
The iPhone 5’s A6 processor is a dual-core, triple-GPU device. The first multiprocessor computer was the Burroughs D825 (defense-only, of course).
A D825 had 1 to 4 processors. Its slowest operation was divide, at ~0.070 s per operation (~14 FLOPS); add was the fastest, at ~166 FLOPS; and multiply came in at ~25 FLOPS. Let’s assume adds are 10x more frequent than multiplies and divides and call it an average speed of 35 FLOPS per processor, so 70 FLOPS for a 2-processor D825, handwaving CPU synchronization, etc.
Let’s take the worst number from the Geekbench stats via AnandTech for the iPhone 5’s processor: 322 MFLOPS doing a dot product, a pure-math operation reasonably similar to the calculations being done at the time in 1962. Note that’s MFLOPS. Millions. To meet the worst performance of the iPhone 5 with the most optimistic estimate of a 2-processor Burroughs D825’s performance, you’d need 4.6 million of them.
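This back-of-the-envelope math is easy to sanity-check. Here's the whole comparison in a few lines of Python; every figure is the rough estimate from the discussion above, not a measurement:

```python
# Rough sanity check of the D825-vs-iPhone 5 comparison.
d825_per_cpu = 35                  # estimated average FLOPS per D825 processor
d825_two_cpu = 2 * d825_per_cpu    # 70 FLOPS, handwaving synchronization
iphone5 = 322e6                    # Geekbench dot-product figure, in FLOPS

machines_needed = iphone5 / d825_two_cpu
print(f"{machines_needed:,.0f}")   # -> 4,600,000 two-processor D825s
```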
I can state confidently that there were not that many Burroughs D825s available in 1962, so there’s a hard lower bound at 1962. The top-end supercomputer at that point was probably the IBM 7090, at 0.1 MFLOPS.
We’d still have needed 3,200 of those. In 1960, there were in total about 6,000 computers worldwide (per IBM statistics; 4,000 of those were IBM machines), and very few in the 7090 range. Throwing in all other computers worldwide, let’s say we double that number for 1962 – we’re still way behind the iPhone.
Let’s move forward. The CDC 7600, in 1969, averaged 10 MFLOPS (with hand-compiled code, and could peak at 35 MFLOPS).
Let’s go with the 10 MFLOPS – to equal a single iPhone 5, you’d need 32 of them. Putting aside the once-a-day (and sometimes 4-5x a day) multi-hour breakdowns, we’re in the realm of possibility that the CDCs in existence alone at that time could equal or beat an iPhone 5 (assuming they were actually running), so the likelihood is that all computing in the world probably easily equalled or surpassed an iPhone 5 at that point in straight compute ability, making 1969 the top end of our range.
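The same arithmetic for the CDC 7600, again using only the rough figures quoted above:

```python
# How many 10-MFLOPS CDC 7600s equal one iPhone 5? (Text's figures, not data.)
cdc7600 = 10e6      # sustained FLOPS with hand-compiled code
iphone5 = 322e6     # Geekbench dot-product figure

print(round(iphone5 / cdc7600))  # -> 32
```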
So without a lot of complicated research, we can narrow it down to somewhere in the seven-ish years between 1962 and 1969, closer to the end than the start. (As a note, the Cray-1 didn’t make the scene till 1975, with a performance of 80 MFLOPS, a quarter of an iPhone; in 1982, the Cray X-MP hit 800 MFLOPS, or 2.5 iPhones.)
And we haven’t talked about the GPUs, which are massively parallel processors the likes of which were uncommon until the 1980s. Even the top-end graphics machines of the 1962-1969 era couldn’t have equalled the performance of the iPhone’s GPU given weeks or months to work on the rendering, and no output device of the time had the pixels per inch of the iPhone’s display, let alone the ability to respond in real time. But on the basis of raw compute power, we land somewhere after the Beatles and before the moon landing. Making a finer estimate, I’d guess somewhere in late 1966, so let’s call it around the last Gemini mission, or Doctor Who’s first regeneration.
On rereading the question I saw that the asker wanted the numbers for an iPhone 4 instead of a 5. Given the amount of handwaving I’m doing anyway, I’d say we’re still talking about close to the same period but a bit later. Without actual numbers as to the computers in use at the time, which I don’t think I can dig up without much more work than I’m willing to do for free, it’s difficult to be any closer than a couple years plus or minus. Definitely before Laugh-In (1968), definitely after the miniskirt (1964).
iPhone 5s update: the 5s is about 1.75 times faster than the 5, which puts us at a rough 530 MFLOPS. The computing-power estimate becomes much harder at this point, as minicomputers start up about 1969 (the PDP-11 and the Data General Nova). The Nova sold 50,000 units, which works out to about 130 MFLOPS in aggregate; total PDP-11s sold “during the 1970s” was 170,000, for a total of 11 GFLOPS (based on the 11/40 as my guess as to the most-often-sold machine). Divide that by ten, then take half of that for a rough estimate, and the PDP-11s by themselves are equivalent to one 5s. So I’ll say that the moon landing was probably about the equivalence point for the 5s, but the numbers are much shakier than they are for the 4 or 5, so call it around the first message sent over ARPANET at the end of October 1969. (Side note: this means that the average small startup in Silicon Valley today, 20 or so people, is carrying about the equivalent power of all the PDP-11s sold during the 1970s in their pockets and purses.)
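Spelling out that 5s arithmetic (the divide-by-ten and take-half discounts are the same handwaves as in the text, not measured utilization figures):

```python
# All 1970s PDP-11s vs. one iPhone 5s, using the post's rough figures.
iphone5s = 530e6                 # ~1.75x the iPhone 5's Geekbench number
pdp11_fleet = 11e9               # 170,000 units at 11/40-class throughput
in_service = pdp11_fleet / 10 / 2    # the text's discounting handwave

print(round(in_service / iphone5s, 2))  # -> 1.04, i.e. about one 5s
```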
Past this, world computing power is too hard to track without a whole lot of research, so take this as the likely last point where I can feel comfortable making an estimate.
A nice evening at SVPerl – we talked about the basic concepts of testing, and walked through some examples of using Test::Simple, Test::More, and Test::Exception to write tests. We did a fair amount of demo that’s not included in the slides – we’ll have to start recording these sometime – but you should be able to get the gist of the talk from the slides.
[wpdm_file id=2]
The situation: a friend had a MacBook Air whose motherboard went poof. Fortunately she had backups (almost up-to-date) in CrashPlan, so she did a restore of her home directory, which worked fine in that she had her files, but not so fine in that all the folder last-changed dates now ran from the current date to a couple days previous (it takes a couple days to recover ~60GB of data).
This was a problem for her, because she partly uses the last-changed date on her folders to help her keep organized. “When was the last time I did anything on project X?” (I should note: she uses Microsoft Word and a couple different photo library managers, so git or the equivalent doesn’t work well for her workflow. She is considering git or the like now for her future text-based work…)
A check with CrashPlan let us know that they did not track folder update dates and couldn’t restore them. We therefore needed to come up with a way to re-establish as best we could what the dates were before the crash.
Our original thought was simply to start at the bottom and recursively restore the folder last-used dates using touch -t, taking the most-recently-updated file in each folder as that folder’s last-updated date. Some research and thought turned up a wrinkle, though: a folder’s modification time changes only when entries are added to it, removed from it, or renamed in it, not when the contents of a file inside it change.
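The folder-date behavior we were up against can be demonstrated in a few lines of Python (a sketch of the standard POSIX semantics; exact behavior can vary a little by OS and filesystem):

```python
import os
import tempfile
import time

# Rewriting a file's contents does NOT touch its folder's mtime;
# only adding, removing, or renaming entries in the folder does.
top = tempfile.mkdtemp()
path = os.path.join(top, "file.txt")
with open(path, "w") as f:
    f.write("v1")

before = os.stat(top).st_mtime
time.sleep(1.1)
with open(path, "w") as f:       # modify the existing file in place
    f.write("v2")
assert os.stat(top).st_mtime == before   # folder mtime unchanged

open(os.path.join(top, "new.txt"), "w").close()  # add a new entry
assert os.stat(top).st_mtime > before            # now the folder updates
```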
This meant that we couldn’t guarantee that a folder’s last-updated date would accurately reflect the last update of its contents. We decided in the end that the best strategy for her was to “bubble up” the last-updated dates by checking both the files and the folders contained in a subject folder. This way, if a file deep in the hierarchy was updated while everything above it was not, the file’s last-updated date is applied to its containing folder, and then in turn to each folder above that; since we check both files and folders, there is always a folder at each level carrying the date that originated with the deeply-nested file. This seemed like the better choice for her, as she had no other records of what had been worked on when, and runs a very nested set of folders.
If you were running a flatter hierarchy, updating the folders only to the last-updated dates of the files might be a better choice. Since I was writing a script to do this anyway, it seemed reasonable to implement it so that you could choose to bubble up or not as you liked, and also to apply either behavior to just a single directory.
This was the genesis of date-fixer.pl. Here’s the script. A more detailed example of why neither approach to restoring the folder dates is perfect is contained in the POD.
[wpdm_file id=1]
use strict;
use warnings;
use 5.010;
=head1 NAME

date-fixer.pl - update folder dates to match newest contained file

=head1 SYNOPSIS

    date-fixer.pl --directory top_dir_to_fix
                  [--commit]
                  [--verbose]
                  [--includefolders]
                  [--single]

=head1 DESCRIPTION

date-fixer.pl is meant to be used after you've used something like CrashPlan
to restore your files. The restore process will put the files back with their
proper dates, but the folders containing those files will be updated to the
current date (the last time any operation was done in each folder -
specifically, putting the files back).

date-fixer.pl's default operation is to tell you what it would do; if you want
it to actually do anything, you need to add the --commit argument to force it
to actually execute the commands that change the folder dates.

If you supply the --verbose argument, date-fixer.pl will print all the commands
it is about to execute (and, if you didn't specify --includefolders, warn you
about younger contained folders - see below). You can capture these from STDOUT
and further process them if you like.

=head2 Younger contained folders and --includefolders

Consider the following:

    folder1 (created January 2010 - date is April 2011)
        veryoldfile1 (updated March 2011)
        oldfile2 (updated April 2011)
        folder2 (created June 2012 - date is July 2012)
            newfile (updated July 2012)

If we update folder1 to match only the files within it, we won't catch that
folder2's date could actually be much more recent than that of either of the
files directly contained by folder1. However, if we use contained folder dates
as well as contained file dates to calculate the "last updated" date of the
current folder, we may make the date of the current folder considerably more
recent than it actually was.

Example: veryoldfile1 and oldfile2 were updated in March and April 2011.
folder2 was created in June 2012, and newfile was added to it in July 2012.
The creation of folder2 updates the last-updated date of folder1 to June 2012;
the addition of newfile updates folder2's last-updated date to July 2012 --
but the last-updated date of folder1 does not change - it remains June 2012.

If we restore all the files and try to determine the "right" dates to set the
folder update dates to, we discover that there is no unambiguous way to decide
what the "right" dates are. If we use the file dates alone, we'll miss that
folder2 was created in June (and that folder1's date should therefore be June);
if we use both file and folder dates, we update folder1 to July 2012, which is
not accurate either.

date-fixer.pl takes a cautious middle road, defaulting to using only the files
within a folder to update that folder's last-modified date. If you prefer to
ensure that the newest date buried in a folder hierarchy always "bubbles up"
to the top, add the --includefolders option to the command.

date-fixer.pl will, in verbose mode, print a warning for every folder that
contains a folder younger than itself; you may choose to go back and adjust
the dates on those folders with

    date-fixer.pl --directory fixthisone --includefolders --single

This will, for this one folder, adjust the folder's last-updated date to the
most recent date of any of the items contained in it.

=head1 USAGE

To fix all the dates in a directory and all directories below it, "bubbling
up" dates from later files:

    date-fixer.pl --directory dir --commit --includefolders

To fix the dates in just one directory based on only the files in it,
ignoring the dates on any directories it contains:

    date-fixer.pl --directory dir --commit --single

To see in detail what date-fixer.pl is doing while recursively fixing dates,
"bubbling up" folder dates:

    date-fixer.pl --directory dir --commit --verbose --includefolders

=head1 NOTES

"Why didn't you use File::Find?"

I conceived the code as a simple recursion; it seemed much easier to read the
directories myself than to go through the mental exercise of transforming the
treewalk into the iteration I would need in order to use File::Find instead.

=head1 AUTHOR

Joe McMahon, mcmahon@cpan.org

=head1 LICENSE

This code is licensed under the same terms as Perl itself.

=cut
use Getopt::Long;
use Date::Format;

my ($commit, $start_dir, $verbose, $includefolders, $single);
GetOptions(
    'commit'         => \$commit,
    'directory=s'    => \$start_dir,
    'verbose|v'      => \$verbose,
    'includefolders' => \$includefolders,
    'single'         => \$single,
);

$start_dir or die "Must specify --directory\n";

set_date_from_contained_files($start_dir);

sub set_date_from_contained_files {
    my ($directory) = @_;
    return unless defined $directory;

    opendir my $dirhandle, $directory
        or die "Can't read $directory: $!\n";
    my @contents = readdir $dirhandle;
    closedir $dirhandle;

    @contents = grep { $_ ne '.' and $_ ne '..' } @contents;
    my @dirs = grep { -d "$directory/$_" } @contents;

    my %dirmap;
    @dirmap{@dirs} = ();
    my @files = grep { !exists $dirmap{$_} } @contents;

    # Recursively apply the same update criteria unless --single is on.
    unless ($single) {
        foreach my $dir (@dirs) {
            set_date_from_contained_files("$directory/$dir");
        }
    }

    my $most_recent_date;
    if (!$includefolders) {
        $most_recent_date = most_recent_date($directory, @files);
        my $most_recent_folder = most_recent_date($directory, @dirs);
        if ($verbose
            and defined $most_recent_folder
            and (!defined $most_recent_date
                 or $most_recent_folder gt $most_recent_date)) {
            warn "Folders in $directory are more recent ($most_recent_folder) "
               . "than the most-recent file (" . ($most_recent_date // 'none')
               . ")\n";
        }
    }
    else {
        $most_recent_date = most_recent_date($directory, @files, @dirs);
    }

    if (defined $most_recent_date) {
        # List-form system() passes arguments directly to touch,
        # so no shell quoting of $directory is needed.
        my @command = (qw(touch -t), $most_recent_date, $directory);
        print "@command\n" if $verbose;
        system @command if $commit;
    }
    else {
        warn "$directory unchanged because it is empty\n" if $verbose;
    }
}

sub most_recent_date {
    my ($directory, @items) = @_;
    my @dates = map { (stat "$directory/$_")[9] } @items;

    # touch -t wants CCYYMMDDhhmm.SS; the format is zero-padded,
    # so a descending string sort puts the newest date first.
    my @formatted = map { time2str("%Y%m%d%H%M.%S", $_) } @dates;
    my ($newest) = sort { $b cmp $a } @formatted;
    return $newest;
}
FOLLOWUP:
I followed these instructions, but alas, they do not work under iOS 6.1.3. I got an OK restore, but still couldn’t unlock Parental Restrictions. I used iExplorer to backup all my text messages and voicemails, and did a “set up as new device” to get a known passcode on Parental Restrictions again.
This is, not to put too fine a point on it, a real pain. Fortunately the nice folks at iphonebackupextractor.com had the steps laid out; I just needed to do a little command-line tinkering to make them work. So here’s the rundown:
<key>SBParentalControlsPIN</key>
<string>1234</string>
You should now be able to access parental controls again if you got all the steps right. If not, copy the two files you backed up to ~ back into the backup directory and restore the backup again. You should be in no worse shape than you were before.
tl;dr – I like it; looking forward to the GM.
So, like every other iPhone user, I was *very* curious about iOS 7. As a developer, even more so. (Particularly, was I going to have to scramble to get my app working again under iOS 7?)
So I took my backup and installed it. First impression is that it feels ever so much lighter, psychologically, than iOS 6. The “flattening” of the interface greatly enhances the experience; Microsoft was right on the money with that one. My experiences with Windows 8 only make me wish they could have committed even harder to it and gotten rid of the desktop altogether – but I digress.
Some bugs, as expected, and I’ll be filing radars about them. In general, working pretty well, but there are a few showstoppers for me in this beta related to my day job. If it were not for those, I’d stick with it. Even with the crashes and hiccups, it’s that much of an improvement.
My app does continue to work, and I’ve now, I think, spotted the problem that’s causing it to drop and resume streaming, so that was a benefit.
Today I DFU my phone and return it to iOS 6 so I have a dependable device again, but it’s definitely a wrench. I’d much rather stay in the brighter, smoother, lighter world of iOS 7.