Monday, December 23, 2013

TPP and TAFTA

As much as the EU has its problems, it was an agreement among nations for nations, with nations retaining sovereignty (the Lisbon treaty introduced majority voting, so that's less true now). It has respect for human rights built in and its courts are, well, actual courts. It has a pretty good track record on delivering on human rights within the EU. I dread to think where Ireland and some others would be socially without it.

The TPP and TAFTA, on the other hand, are agreements by nations to hand over some of their sovereignty to corporations and opaque tribunals. The WTO is already this to some extent but it is at least actually about international trade. TPP and TAFTA go much further and impact the laws that can be passed inside countries, based on the impact those laws would have on international investors operating inside those countries.

There are already treaties like this and the results are not good. The Australian govt is attempting to regulate tobacco packaging and is being sued by Philip Morris because the law will hit their profits (http://www.bbc.co.uk/news/world-asia-15815311). But tobacco companies making less money is exactly the point of the law. If a government decides to hit tobacco companies and the existing courts are OK with that, then that should be the end of it.

The idea that someone has a right to continue to profit from a harmful industry or to be compensated for their reduced profits is nuts. It warps the investment market. Right now, investments like this are risky and capital is allocated elsewhere as a result. These treaties remove these risks by forcing governments to compensate investors who have chosen to invest in industries that end up being regulated. This makes antisocial investments more attractive.

TPP and TAFTA are more of the same, just bigger and stronger. They are being negotiated in secret with only corporate interests represented. Some parts have been leaked and they are not nice.

You can read an American perspective on TPP here:

http://www.nakedcapitalism.com/2013/11/the-tpp-if-passed-spells-the-end-of-popular-sovereignty-for-the-united-states.html

This article is about opposition in Japan:

http://www.globalresearch.ca/the-trans-pacific-partnership-and-its-critics/5355052

If either of these passes, the other will become much more likely ("we have to follow the international standards, don't you know").

They both need to be shot down.

Saturday, July 06, 2013

Looking for a backup system (maybe)

I have many static files that I care about, e.g. photos, downloaded media, media that I've ripped, archives of old email, important documents. They are mostly mirrored on 2 disks but it's a bit of a mess. For some I do extra backing up, e.g. photos are supposed to end up in the cloud too. A few other things are backed up to a virtual server but that's even less systematic. Some things are over-backed up: they exist in a RAID and also on my media player's disk and on a physical DVD. That's a waste of space. I don't want to have to think about any of this (at least not more than once, at the time I decide I want to be careful about a file).

I'm not trying to solve the general backup problem; this is pretty much for files that do not change once written (although their metadata or importance may change).

I want to be able to do something like

absorb --tag personal-documents --delete-when-safe tax-return-2013.pdf
and in a few seconds, there's a RAIDed copy on my home server, a copy in Google Drive, another in Dropbox, and the working copy is gone, not cluttering up my laptop anymore.
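If I did end up writing it myself, a crude first version would only be a few lines of bash. This is just a sketch, using scp for the home server copy and rclone for the cloud copies; the host name "homeserver" and the rclone remotes "gdrive" and "dropbox" are all made up for illustration.

#!/bin/bash
# absorb: copy a file to several backup destinations, then optionally
# delete the local working copy.
# Usage: absorb [--tag TAG] [--delete-when-safe] FILE
set -e
tag=untagged
delete=false
while [[ "$1" == --* ]]; do
  case "$1" in
    --tag) tag="$2"; shift 2 ;;
    --delete-when-safe) delete=true; shift ;;
    *) echo "unknown option: $1" >&2; exit 1 ;;
  esac
done
file="$1"
# RAIDed copy on the home server.
ssh homeserver "mkdir -p /backup/$tag"
scp "$file" "homeserver:/backup/$tag/"
# Cloud copies (the rclone remotes must be configured in advance).
rclone copy "$file" "gdrive:backup/$tag"
rclone copy "$file" "dropbox:backup/$tag"
# Because of set -e, we only get here if every copy succeeded.
if $delete; then
  rm "$file"
fi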

Does this exist already? It's not just backup, it's part "document management" and there's lots of stuff I'd like to be able to do with tagging, managing, presenting and sharing, but I'd settle for a partial solution that I don't have to write myself for now. I can't find anything suitable but thought I'd ask before putting any effort in.

Tuesday, June 25, 2013

Getting notified when a long-running command finishes.

My job often involves running a command to do something, realising that it's going to take longer than I expected and going off to do something else in the meantime. Conversely, I'll run something that should take a long time, go do something else and find that it stopped with an error after a very short time without me noticing.

For a long time I've wanted to set things up so that any command that takes more than 20s to run would send me an IM when it finished. What I didn't want was to have to prefix all my commands with some notification wrapper, because that's annoying and I would probably forget in exactly the cases where I need it.

It sounds pretty simple but getting it to work turned out to be quite fiddly.

Delivering an instant message.

There's a program called sendxmpp that will deliver an instant message over XMPP. It's written in Perl and definitely works on Linux, and probably also on Mac and Windows if you really want. Every few months I would try to make it work and when I finally got it working, it would stop working again the next day. Apparently Google's XMPP service is particularly finicky. I appear to have made it work for real now. I honestly don't know what I did differently; it may even be something that changed when Google deprecated XMPP in favour of Hangouts. As I write this, I see that it no longer works for one of my accounts but it's working fine with another! If there's a more reliable option, I'd love to know.

To make this work you'll need a second Google account because sending a message from yourself to yourself doesn't seem to work. So I created an account that I'm going to use only for sending IMs. After creating that account, I invited it to chat and made sure that the accounts can chat with each other through the normal Google chat interface.

Next I created a config file for it with the following commands (you'll need to replace some parts of the last one)
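Something like this (from memory, so treat it as a sketch; the rc file is one line of user@server password, and sendxmpp refuses to use the file unless only you can read it, hence the chmod):

touch ~/.sendxmpprc
chmod 600 ~/.sendxmpprc
echo 'my.new.im.username@gmail.com the-password' > ~/.sendxmpprc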

Now to send a notification to myself I just do

echo some message | sendxmpp -t -u my.new.im.username -o gmail.com other.username@gmail.com

and the message shows up. I also get a bunch of Perl warnings but such is life.

Finally I wrapped that up as a command called notify and I can just pipe a message into that.
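The wrapper is just a one-liner script on my PATH, something like this (account names as in the example above; the 2>/dev/null hides the Perl warnings):

#!/bin/bash
# notify: read a message on stdin and deliver it as an IM.
exec sendxmpp -t -u my.new.im.username -o gmail.com other.username@gmail.com 2>/dev/null

So now echo done | notify pops up an IM.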

Spotting that a command is taking a long time to finish

I hate unix shells; they're a horrible mishmash of special cases, obscure features and crap that's only that way for backwards compatibility. What I've done is a hack in bash and it's not perfect (it seems like it would be slightly less hacky in zsh but that wouldn't fix the oddities described below). The core is this:
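(a minimal sketch; this is the standard bash trick of combining a DEBUG trap with PROMPT_COMMAND)

trap 'bash_pre_cmd' DEBUG
PROMPT_COMMAND='bash_post_cmd'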

What that does is run bash_pre_cmd every time you hit enter to kick off a command and bash_post_cmd every time bash prints the command prompt (which indicates that the previous command has finished; there are various ways for this not to be true, hence the oddities, but in normal usage it's true). So all that remains is sensible definitions for those two commands:
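Something along these lines (a sketch; the 20 second threshold and the message format are just my preferences):

notify_threshold=20

bash_pre_cmd() {
  # The DEBUG trap also fires for PROMPT_COMMAND itself, so only record
  # the start time if we aren't already timing a command.
  if [ -z "$cmd_start_time" ]; then
    cmd_start_time=$SECONDS
    cmd_text=$(history 1 | sed 's/^ *[0-9]* *//')
  fi
}

bash_post_cmd() {
  if [ -n "$cmd_start_time" ]; then
    local elapsed=$(( SECONDS - cmd_start_time ))
    if [ "$elapsed" -ge "$notify_threshold" ]; then
      echo "finished in ${elapsed}s: $cmd_text" | notify
    fi
    cmd_start_time=""
  fi
}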

You'll probably want to customize the callback for your own preferences.

Put all that into your .bashrc and cross your fingers.

Oddities

If you suspend a process, that will cause the prompt to be printed and may cause a notification. If you resume that process with fg then, when it finishes, this code will see fg as the command that just completed. Also, we cannot notify for 0-second commands, otherwise just hitting enter would cause a notification and the IM would refer to the last command in the history. Interactive commands obviously take a long time and so cause useless notifications; I'll probably whitelist a few common ones like man and less, as sketched below. There are probably other problems too but for normal type-run-complete shell use it works just fine.
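The whitelist would just be a few lines at the top of bash_post_cmd, something like this (the list of commands is hypothetical):

# Don't notify for known interactive commands.
case "${cmd_text%% *}" in
  man|less|vi|vim|top|ssh) cmd_start_time=""; return ;;
esac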

Basically shells are shitty and not very flexible. Doing this correctly would involve being able to stash away some state along with each command that's started and then getting access to that state when the command finishes; that would require changes to the shell itself.

Friday, June 14, 2013

Julian Assange's "secret" meeting with Eric Schmidt

http://wikileaks.org/Transcript-Meeting-Assange-Schmidt.html has a transcript and a 3 hour recording of a meeting between Julian Assange and Eric Schmidt (and some others) as part of the research for Eric's recent book.

It's long but it's quite interesting (nothing terribly secret though).

I like Assange's goal of using content-based addressing for everything. It's not an original idea of his but his point about detecting newspaper articles that silently disappear was good. If he can help to popularise it, that's great.
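For example, if an article's address is a hash of its bytes then silent edits are mechanically detectable. A trivial stand-in for a real content-addressed system (the file name is made up):

sha256sum article.html > article.sha256  # record the address when you first read it
sha256sum -c article.sha256              # later: fails if the text was silently changed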

I completely agree with his call for "scientific journalism", that is, journalism where all claims are backed up with references. He says that anything else should be dismissed as not journalism at all. There used to be the excuse that there wasn't enough room on paper but that's gone now. George Monbiot (http://www.monbiot.com/) has been including full references in the web versions of his articles for years now but I haven't seen anyone else doing it.

I thought he didn't give a convincing response to the question of what happens if governments and corporations flood Wikileaks with thousands of computer-generated, fake but plausible leaked documents. That whole conversation got a bit messed up and he seemed to miss the core of the question.

Possibly the most shocking thing was that Eric Schmidt didn't know what simulated annealing was :)

Anyway, I've linked the MP3 to this post so if you want to listen to it conveniently through a podcast player, you can subscribe to my feed to get it (there are no other podcasts in my feed).

Saturday, May 04, 2013

ASUS RT-N66U, PPPoE and OCN

Quick note on this in case anyone runs into the same thing.

I got one of these

http://www.asus.com/Networking/RTN66U/

When it checked for firmware upgrades it found "3.0.0.4.260" but that would not connect to OCN no matter what I did. In the end I noticed that "3.0.0.4.270" was available, installed that and OCN worked immediately.

No special settings needed, just username and password.

Sunday, February 17, 2013

Nobody steals wood in Japan

This is a photo of a wood warehouse near my home on a Sunday afternoon. It's closed and has been since Saturday or maybe even Friday. Can you see anything to stop all that wood being stolen? No, you can't, because there's nothing. This has puzzled me for over a year.


There is a rickety wooden fence pulled in front of the trucks but the thousands of euros' worth of wood and who knows what else they sell is sitting right out there with no security whatsoever. How is this possible? It's not a busy street and they don't close it up any further at night time.

A few theories:

  • They are paying the right people and criminals know they'll lose a finger (or more) if they try stealing from it? Or maybe they are the right people.
  • There are actually no criminals in Japan. This can't be true, otherwise there'd be nothing to blame on immigrants :)
  • The police are incredibly vigilant and effective...
  • There is no market for stolen building materials.

Is there anywhere else in the world you could do this and expect to find everything still there on Monday morning? Switzerland?

Monday, January 07, 2013

How to extract vob subtitles from an mkv file and render them as png files

Posting this because I spent far too much time figuring it out from various scraps here and there. I got a bunch of Japanese DVDs recently and ripped them to mkv. As part of learning Japanese, I want to be able to see the subtitles side by side, and use them in flash cards etc. I found recipes for extracting subs from a DVD but they're not quite right for mkv. I also found tools that worked except that they output distorted images.

There is also the very impressive subs2srs, which will generate an Anki deck with images and audio snippets and works under Wine, but it's a little heavy for what I want.

First, you need to know which track has your subs; mkvinfo will tell you that. Then you can use the following bash script (you'll need to install mkvtoolnix and transcode).
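This is roughly it (a sketch rather than a guaranteed recipe: tcextract and subtitle2pgm come from transcode, 0x20 is the id of the first subtitle stream, the pgm-to-png step uses ImageMagick, and the colour values passed to subtitle2pgm may need tweaking for your discs):

#!/bin/bash
# extract.sh: render a vobsub track from an mkv file as png images.
# Usage: extract.sh file.mkv TRACK
set -e
file="$1"
track="$2"
dir="${file%.mkv}-sub-$track"
mkdir -p "$dir"
# Pull the subtitle track out of the mkv as vobsub (.idx + .sub).
mkvextract tracks "$file" "$track:$dir/subs.sub"
# Extract the raw subtitle stream and render each subtitle as a pgm image;
# this also writes a .srtx file containing the timing of each image.
tcextract -x ps1 -t vob -a 0x20 < "$dir/subs.sub" \
  | subtitle2pgm -o "$dir/sub" -c 255,255,0,255
# Convert the pgm images to png.
for f in "$dir"/*.pgm; do
  convert "$f" "${f%.pgm}.png" && rm "$f"
done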

You can invoke it as e.g. extract.sh foo.mkv 4 and you'll have a bunch of .png files generated from the 4th track of foo.mkv in the foo-sub-4/ directory. There will also be a .srtx file which gives the timings for all the subs.

My next step will be to use these timings to line the subs up and output an HTML page with them side by side.

Saturday, January 05, 2013

Ice skating

We went skating on Thu and Fri and the kids got some lessons in the mornings. Sean was able to skate but his technique is funny. He's not skating so much as walking while sliding. I knew when he was behind me because I could hear the stomping sound.

I saw several people watching him, a couple of people following him on the ice and videoing him(!) and at 1:05 in the longer video, there's someone mimicking him.

http://www.youtube.com/watch?v=Y8-fLwiJyRg

http://www.youtube.com/watch?v=z1A3dBq2WPc

Friday, January 04, 2013

Fuck you Irish newspapers

The story of how Irish newspapers are performing ballistic amputation of their own feet is gathering steam. They think they can charge sites for linking directly to articles on their websites. A good summary is here:

http://www.thejournal.ie/readme/newspapers-charges-linking-ireland-740093-Jan2013/

Of course the story (and their attempted change to the copyright law in 2012) has not been covered at all in these papers.

Meanwhile one of the major papers carries an article titled, "Venomous and toxic social media out of control" which includes this gem: "Free speech and democracy are far better served by a regulated system of commentary, which insists on basic civilities, foremost among which is that participants identify themselves before contributing."

Right, good one, this regulated system is right now demonstrating that it cannot be trusted to report fairly and accurately (or even at all) on certain subjects - the internet and copyright law being two rather important ones.

Of course I'm not linking to this article or even saying which newspaper, not because I'm afraid of a bill but because I have no intention of ever sending another bit of traffic to one of these newspapers if I can avoid it.