Skype has been having network issues in the last few days. I don't like the fact that I've been using Skype, but I have. Why? Because it's the easiest way to talk to my girlfriend while she's away.
That is, it's easy while it works. It has many downsides, but to be fair to it, it's very good at dealing with firewalls and NAT, something that Ekiga really (really really) isn't.
With Skype broken, and wanting to talk to Ulrike, I suggested we try WengoPhone. She grabbed the Windows binary, I grabbed the Linux binary. After a few false starts we got connected and, for bonus points, it supports webcams on Windows and Linux, which dramatically improves the value of talking to her (and it's very cool that Ubuntu supports my tiny little Creative laptop webcam out of the box).
To my utter surprise, not only is Wengo GPL'd, it's also really just a SIP client. This is fantastic and it's just a bit of a shame that the interface uses Qt (the Wengo guys say they would love to see a Gtk port. So would I!). I was able to call her Wengo address (translated into the underlying SIP address) from Ekiga, but it took ages to arrange the call, even longer to connect the audio/video to it and even then it got the video wrong.
I know Ekiga has been making great strides in their development version and I very much look forward to being able to use it as a Swiss Army knife of VoIP. Having said that, I kind of suspect that I will be more tempted by some kind of Telepathy VoIP interface which hides all the tediousness of different forms of instant communication and just lets me type to, talk with, and look at people.
As a result of the Wengo transition, I have just been able to do this:
-(cmsj@waishou)-(~)- sudo dpkg -P skype
(Reading database ... 144578 files and directories currently installed.)
Removing skype ...
Purging configuration files for skype ...
-(cmsj@waishou)-(~)-
Terminator 0.3 is out, with the expected terminal-closing feature. This makes the splitting added in 0.2 much more useful. As usual, please head over to the Terminator page for the goodness. NOTE: There is a bug preventing zsh users from using Terminator. See the bug on Launchpad for full details.
Two days in a row! This is not going to be a continuous thing, but since it's the weekend I have been hacking on Terminator a lot. This is a big release for me: it finally brings in one of the crucial features required to make this more than just a script for putting 4 terminals in a window - you can now split terminals on demand. Right-click on one and you can turn it into two terminals, horizontally or vertically. My current roadmap is for 0.3 to allow you to remove terminals. 0.4 will then concentrate on loading/saving some kind of profile so you don't have to go through a complex splitting procedure each time you start Terminator. I'm not sure how many other features will get in between 0.4 and 1.0, because there is lots to do on the gconf and gnome-terminal emulation side of things. Head on over to the Terminator page for various links, including the download link.
Firefox is a very popular piece of software. Claims run as high as 100 million users, which is really good, and on the whole I think it's a very good browser. However. What Firefox isn't, is integrated. Sure, it renders using GTK (and Cairo, if not already then soon) and GNOME actions involving URLs spawn Firefox, but it's still trapped away in its own little universe - Marc Andreessen's gift to the world, a platform-agnostic application architecture. Clearly Mozilla has built itself a highly capable cross-platform application architecture, but that necessarily isolates it on every platform.

The trigger behind this post is the patches that recently appeared to let Epiphany use WebKit (Apple's fork of KHTML, as used in Safari). Epiphany isn't a bad browser, but it's not flexible like the fox (purely because there aren't enough extensions). The problem here is that if GNOME is going to achieve the online desktop integration they have been talking about, reliable HTML widgets seem quite vital. GtkMozEmbed (I say, having never used it) appears to be very painful to work with. A high quality GNOME widget based on WebKit that makes displaying HTML really easy would be extraordinarily useful to the project.

It would allow the browser to disappear into the desktop - want to visit a page? Click or press something, type an address or some search keywords, and out slides the appropriate web page. It gets rid of the need to go Applications->Internet->Firefox before typing a URL (and yes, I know things like deskbar can launch a browser in these circumstances). Mostly, it would massively lower the barrier to writing apps which partly rely on the internet, or HTML in general, which can only be a good thing for a more online world.

What's holding it back, though, is Firefox. It's a very popular piece of software, even on Windows. Maybe too popular: if Ubuntu were to drop Firefox by default in favour of an integrated future version of Epiphany, it could hurt Ubuntu - one of its selling points would no longer be that it uses the much vaunted Firefox thingy people have heard of. (I also wonder if GTK should support CSS ;)
I've just pushed out the first release ever of Terminator, a python script to make a window have multiple Terminals in it. It's still very rough around the edges. And the middle. But it's there! Rather than repeat myself here, just click over to the Terminator page for full details.
How would you shrink the root file system of a remote machine? The easy answer, of course, is to boot into a rescue environment and do it there (you can't shrink ext3 online). If you have a good KVM or iLO setup, you already have a rescue environment of sorts - the initramfs. Chuck "break=mount" on your kernel command line and the initramfs will drop out to a shell before it mounts the root filesystem. You can now mount the root fs manually and copy out the required tools/libs (e2fsck, resize2fs, fdisk and their libraries, in this case), then unmount the root fs. Now, with appropriate $LD_LIBRARY_PATH mangling, you can run the extracted binaries and operate on your root partition with impunity.
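For the record, here's roughly what that looks like from the break=mount shell. This is a sketch only: the device name, target size and library list are assumptions for illustration - check with ldd on the running system beforehand to see exactly what the tools need.

# Sketch only: assumes an ext3 root on /dev/sda1 being shrunk to 10G
mkdir /tmp/oldroot /tmp/tools
mount /dev/sda1 /tmp/oldroot
# Copy out the tools and (a guessed list of) their libraries; verify with ldd first,
# and you may also need the dynamic loader if the initramfs doesn't have it
cp /tmp/oldroot/sbin/e2fsck /tmp/oldroot/sbin/resize2fs /tmp/oldroot/sbin/fdisk /tmp/tools/
cp /tmp/oldroot/lib/libext2fs* /tmp/oldroot/lib/libcom_err* /tmp/oldroot/lib/libe2p* \
   /tmp/oldroot/lib/libblkid* /tmp/oldroot/lib/libuuid* /tmp/tools/
umount /tmp/oldroot
# Operate on the (now unmounted) root partition with the copied binaries
export LD_LIBRARY_PATH=/tmp/tools
/tmp/tools/e2fsck -f /dev/sda1
/tmp/tools/resize2fs /dev/sda1 10G
/tmp/tools/fdisk /dev/sda    # then shrink the partition itself to match, and reboot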
It's pretty much exactly a decade since I started using Linux, so it seems like a good time to look back at what I used to use before. Immediately prior to jumping into the FOSS world I was using Windows 98, but I don't really want to talk about that because I never really liked it and it hated my hardware, so it was a very brief partnership. The 7 or 8 years before that, though, were computing heaven, because I was a devoted Amiga user. Initially I was using an A500, which I added a second floppy drive to (I think the Cumana drive I bought cost me about £80!), as well as a couple of MB of Fast RAM (some of which I hacked into being Chip RAM for better graphics). Eventually the 500 was getting far too restrictive and even my 2-disk boot environment was getting hard to live with, so I got a job in a supermarket to earn some money to buy a shiny new A1200, which was a pretty big leap forward over the 500. After a while I put a much faster 68030 CPU in it (thanks to phase5's excellent 1230 IV expansion card), along with a 16MB SIMM and a 120MB 2.5" hard disk. Later I swapped the 030 card for an 040 card, for even more blazing performance.

Anyway, enough boring hardware reminiscing, on to the fun stuff! For a while now I've wanted to rescue everything on the last Amiga hard disk I owned (a Western Digital 1.2GB monster!), but since my A1200 had something of a small accident (here's a tip kids, never use the inside of a computer as a footrest) that wasn't going to be hugely easy. Had I not broken the 1200, things would have been fine - by the time I stopped using the Amiga it had an Ethernet interface and a fair whack of UNIX programs on it, like scp. A few months back I fished the disk out of the remains of the Amiga (now forever consigned to the past, as I took the carcass to the local dump), hooked it up to an external USB-IDE interface and took a raw image of the disk. I then bought Amiga Forever, a distribution of various Amiga emulators and a pretty much complete set of officially licensed ROMs and system disks (lacking working hardware, there was no way I could get dumps of my own ROMs or transfer the contents of the PC-incompatible floppy system disks). I briefly dallied with the included emulator for UNIX (the venerable UAE), but it was pretty unstable, and on further investigation it turns out that most of the development work these days goes into the Windows fork (WinUAE). This was quite disappointing and I never really looked into it any further.

That was, until last night, when I started tidying up all the crap on my desktop and got to the Amiga Forever folder. The pangs of nostalgia grabbed me again and I decided to have another stab at things. This time I used e-UAE, another fork of UAE, maintained by Richard Drummond (any Amiga user will recognise that name). He has been diligently pulling in the improvements from WinUAE, and it really shows - it's much more stable than vanilla UAE (although I can still provoke it into crashing). This was a good start, but I was still left with the problem of how to extract the data from the disk image I had. After battling with the UAE configs a little, I discovered that there was something wrong - I could only persuade the Amiga to see 1 of the 4 partitions. Fortunately it was the one with all my data on - except my old programming stuff, but the point of this exercise was not to rescue data, as I had copied the stuff I really cared about off before I stopped using it. The point was to get *my* Amiga running again, even if the hardware was now just some software.
I conversed with some of the long time Amiga stalwarts I still talk to on IRC, and one of them pointed me at some really simple code to extract partitions from an Amiga disk image. This proved to be part of the key to making everything Just Work™. The other part being that Linux can read AFFS formatted partitions. I quickly mounted them, pointed e-UAE at the mountpoints (roughly as sketched at the end of this post) and bam! off it went. OK, so I had to spend a few minutes hacking the various hardware hacks out of the Startup-sequence, but with that done I was left with a pretty much exact copy of what I used to use 10 years ago.

It's a very strange experience, leaping back in time like this. You look over your old code, email, pictures and so on, and while one part of you thinks "hey, I remember this!", another part thinks "damn, what was I thinking" ;) As Jamie Zawinski found when he tried to do a similar (but unfortunately for him, much more painful) operation a while back, the best way to keep data from being obsoleted is to keep it on a live computer. Sooner or later all hardware fails, but if you always transfer all of your data from one computer to your new one, you'll never have a huge gap to cross (this is exactly how speciation works, by the way). Emulation and FOSS suggest that there is no real reason why my Amiga can't now live on forever, virtually. That's hardly the hugest achievement of mankind, but it makes me happy. I'd like to say thank you to everyone who made the Amiga, everyone who made its community such a fantastic place, and everyone who still works on making it live on.

(As a side note, this all serves to make me think what a natural predecessor to the current Linux ecosystem the Amiga was. It had a powerful shell and a friendly GUI, but most crucially, an active and dedicated community.)
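For the curious, the mounting step looks roughly like this. The image name, mountpoint and offset variable are made up for illustration; the partition extraction code (or anything that parses the Rigid Disk Block) gives you the real offset of each partition within the image.

# Illustrative only: mount one AFFS partition out of a raw Amiga disk image,
# given its byte offset within that image
sudo mkdir -p /mnt/amiga
sudo mount -t affs -o loop,offset=$OFFSET_IN_BYTES amiga-disk.img /mnt/amiga
# ...then point e-UAE's filesystem/hardfile config at /mnt/amiga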
I usually have at least 4 terminals visible on my screen at once. Each one is running screen(1), and each screen has probably at least 3 or 4 different things going on in it (usually sshing to servers).
Once you are up to about a dozen or so shells spread across 4 terminals it can get quite interesting to remember where you left the one you are looking for.
Since screen can have a status line which lists its screens, and it is possible to change their names, I figured it ought to be possible to have ssh set the title of a screen when it connects to a remote machine. This would make things a lot easier to find, as well as being a cute hack.
It turns out that it is indeed possible... to a degree.
ssh lets you specify a command to be run on the local machine after a connection is established, which is the ideal place to do this kind of thing. Sadly it doesn't help you out by setting any useful environment variables (such as the machine you just ssh'd to). You're probably thinking "but you know which one it is, you just ssh'd there!" and while that is true, it's not very easy to handle programmatically, mainly because it means parsing the arguments to ssh, which is no fun at all.
So, rather than do that, I am making the blanket assumption that the final word on ssh's command line is the host you are sshing to. If that is not true (e.g. you are doing "ssh someserver rm /etc/foo") you will get whatever the last word actually is, sucks to be you.
Also, if you use ProxyCommand, you really don't want the second ssh to do this, because it will confuse the first one and you'll never establish a connection, so detecting the type of output ssh is connected to is necessary.
Thanks to the many, many people I've consulted in the process of figuring this out. It doesn't seem like anyone has done this before (at least I can't find an example on Google, although there are some very similar things), so after running out of ideas myself I started polling the community and got enough nuggets of inspiration back to produce a workable solution.
You will need to make sure screen is configured to show a status line (otherwise you won't see the screen names, except in a C-A-" or similar). Then drop this into ~/.ssh/config:
PermitLocalCommand yes
LocalCommand tty -s && cat /proc/$PPID/cmdline | xargs -0 | awk '{ printf ("\033k%s\033\\", $NF) }'
(yes, that is hacky and disgusting. I am tempted to look at patching ssh to provide the hostname to the spawned LocalCommand shell, but right now the above config seems to be the best way of doing this).
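For reference, this is one way (of many - the exact format and colours are purely an example, not something from the hack itself) to get screen showing a window list with titles; put something like it in ~/.screenrc:

caption always "%{= kw}%-w%{= BW}%n %t%{-}%+w"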
I came across a Python script called vepp, which aims to be a simple way of transcoding files for portable media devices. Why not also use it for very unportable media devices such as the PS3? :)
Initially I've just added a target for fairly high-bitrate 720p H.264/AVC; 1080 and MPG-SP targets are still to come.
If you want to track my development version, you can do so via Launchpad. You will need to use bzr thus: bzr branch http://bazaar.launchpad.net/~cmsj/+junk/ps3tools
You'll need a capable version of ffmpeg, as discussed previously. Output files will be written to the current directory (I'm looking at adapting the current behaviour to automatically direct the output either to attached media that is PS3-compatible (CF/SD/MS/USB), or straight to a directory you are sharing via UPnP - far more useful than ferrying things about on SD cards!).
Here is my current patch against vepp:
=== modified file 'vepp-2.0.1.py' (properties changed)
--- vepp-2.0.1.py 2007-06-09 01:01:48 +0000
+++ vepp-2.0.1.py 2007-06-09 03:12:21 +0000
@@ -4,8 +4,8 @@
 from math import sqrt
 
 # defaults
-remove = True
-target = 'psp-oe'
+remove = False
+target = 'ps3-avc-720p'
 vbr = True
 audio = None
 
@@ -85,6 +85,22 @@
         'qmax': 24,
         'channels': (2, 1),
     },
+    'ps3-avc-720p': { # Only tested with firmware 1.80
+        'maxx': 1280,
+        'maxy': 720,
+        'stepx': 8, # FIXME: lower?
+        'stepy': 8, # FIXME: lower?
+        'pixels': 1280 * 720,
+        'namedfiles': True,
+        'thumb': False, # FIXME: Can this be True?
+        'ext': "mp4",
+        'video': ["-vcodec", "h264", "-f", "mp4", "-bufsize", "14000k", "-maxrate", "14000k", "-coder", "1", "-level", "31", "-r", "24000/1001", "-g", "300"],
+        'audio': ["-acodec", "aac", "-ab", "160k"],
+        'bitrate': lambda x,y: "3072000",
+        'qscale': 18,
+        'qmax': 24,
+        'channels': (2, 1),
+    },
     's60': {
         'maxx': 352,
         'maxy': 288,
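To give an idea of what that target amounts to, the options above work out to roughly this sort of ffmpeg invocation (illustrative only - vepp assembles the real command line itself, and the input/output filenames here are made up):

# Roughly the ffmpeg command the ps3-avc-720p target implies (filenames are examples)
ffmpeg -i input.avi \
    -vcodec h264 -f mp4 -b 3072000 -bufsize 14000k -maxrate 14000k \
    -coder 1 -level 31 -r 24000/1001 -g 300 \
    -acodec aac -ab 160k \
    output.mp4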
It would be nice to be able to push content to the PS3 from a LAN, but I have no idea how they could do it sanely. Maybe I can push files via Bluetooth.
Of course, if the rumours are true, this is going to all be immaterial shortly...
I've yet to complete this, because I stopped my attempts last night when I reached an unusual situation. Specifically, I was doing the partitioning manually, but the two visible disks had no partition tables. Not wanting to trash the PS3 disk, I didn't let it create the tables, so I had to abort. After consulting with the very helpful Colin Watson, it turns out that the disk Ubuntu sees as sda is not the whole PS3 disk; it's the Other OS partition virtualised to look like a whole disk. It is therefore fine to create a partition table and proceed with the install, which I will do tonight. (I'm not sure yet what sdb is - it smells like the PS3's internal flash, and again I'm not sure if it's wise to mess with it.)

UPDATE: The disks that Linux sees are virtualised by the PS3 (they're actually just partitions made to look like whole disks), so it is fine to make the partition tables (or indeed let the installer do automatic partitioning). The bug where the installer hangs at 15% is due to the low RAM in the PS3. Stop some services (cupsys and hplip are good candidates; see the example at the end of this post) and remove some things from your session (update-manager and gnome-cups-icon, for example). Removing applets from the panel is not a bad idea either, and don't run anything else while you are installing. Of course you could plug in a disk of some kind and set up swap, but this bug makes that quite hard at the moment.
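For example, stopping those two services from a terminal before the installer gets going is just (init script names as mentioned above, in the usual Ubuntu-of-the-day style):

# Free up a little RAM before the installer runs
sudo /etc/init.d/cupsys stop
sudo /etc/init.d/hplip stop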