TBBle Scarry’s Busy, Busy Weekend

Often my weekends start out with grandiose plans of what I might try and get done.

This weekend (and the preceding evenings, I guess) saw me produce two Wine patches: one I was only playing with out of interest, but which turns out to affect Warhammer Online (although I didn’t know that until after I’d implemented it), and one I’ve been meaning to prototype for a while, using XInput 2 to fix a long-standing Wine bug which also affects Warhammer Online.

I also got back to watching Life On Mars, although I’ve only managed one episode and a bit. It’s pretty damned good.

I also decided to make gyoza, as I have fond, alcohol-supported memories of the last time I made them.

I managed to lazy my cooking even more than usual. I’m using a recipe I picked up last time I made them, from a site called The Food Palate by Deborah Rodrigo, who Google has since informed me is from Sydney; sadly, both that site and her personal blog appear to have fallen off the Internet. However, I distilled the ingredients (with the help of Kirky at work) down to this:

Ginger, chives, chili flakes, coriander, garlic, sesame seed oil, soy sauce for dumplings, and gyoza skins

Add half a kilo of lean pork mince and about half an hour, and you get:

30 gyoza, freezer-bound

So not as bad as the ugly cake I made recently, but still not spectacular. And unlike the cake, I don’t yet know whether these will turn out to be poisonous or not.

I expect that they’ll be delicious, and not even slightly poisonous. And unlike my cake, I’m not going to try to share them with anyone. ^_^

It could be worse: at least I seem not to have poisoned my housemate’s lizards, Prime and Grimlock, whom I’ve been feeding while he’s away this weekend. I’m not sure how I could get “put grasshoppers into the box” wrong, but I don’t think I did. I think they’re pretty neat names for lizards, reflecting both Mick’s inner geek and his outer geek, although Prime seems to be larger than Grimlock, which is, to the best of my knowledge, the wrong way ’round.

I was going to try to leverage in a rant about characters in children’s books with alliterative names at this point, and observe that one of my favourite authors as a young child, Richard Scarry, happened to avoid that, but upon actually looking him up, I realise the characters whose names I’d forgotten quite often had alliterative names. The characters I remembered still had non-alliterative names, so it’s not as bad as some authors I can’t be bothered remembering, but I’ll chalk that one up as being disappointed by a childhood memory.

A less disappointing childhood memory turns out to be Piers Anthony‘s Incarnations of Immortality series. I read the series when I was quite young, and I’m only re-reading the first one at the moment, but it reminds me how good a writer he is, and why I loved his books so much as a child. Also because he’s alphabetically early on the shelves. I don’t know why I seem to do that. I think when I’m picking a new series, I start at the beginning and go until I’ve chosen one. So that favours the alphabetically early.

I’ve managed to get a whole bunch of reading done recently, which is good. Sadly, Borders now wants me to pay $7 on a $14 book to order it in from overseas, and it turns out most of the series I’m following keenly enough to actually order books are on that list, so I may end up having to do an Amazon order. Which is annoying, because I’m also looking for some DS games: Ace Attorney: Trials and Tribulations appears to be discontinued in Australia and the US, and Impossible Mission never seems to have been released here at all. Along with wanting Race on DVD, I have a fair bit of overseas shopping to do, and the local financial climate is not exactly conducive to that. -_-

Anyway, the above is my documentation supporting why I should not be left alone for days at a time. ^_^

Edit: Fix images after changing hosting.

Grinding code in Warhammer Online

My original plan was to use the Windows XP 64 installed on my laptop only for video games (and then only when necessary due to a Wine disfeature) and Linux for everything else. My World of Warcraft days actually worked quite well for this, as it played very nicely under Wine. However, as interesting things (à la my previous blog post) sometimes crop up while I’m in Windows, and also as I’m now playing games that aren’t so nice under Linux, I’ve ended up being in Windows more than Linux. And now I’ve found myself distracted from playing games by, of all things, MMO UI addon programming, keeping me in Windows even more.

You’d think, with my strong awareness of the commercial nature of grind, I’d prolly be trying to get all the playtime I can out of my monthly subscription to Warhammer Online. Instead, I seem to be burrowing my head down into some UI programming in Lua. Like WoW, WAR (or WHO, as a friend of mine calls it) uses Lua to implement its user interface, and provides a way of adding in modules to modify, adjust or just plain futz with the interface. The big site for WAR addons (like WoW addons, in fact) is Curse Gaming, and they even provide a Sourceforge-like site for addon development called CurseForge.

Anyway, why am I doing this, given I managed to avoid WoW addon programming for my entire playing time? Apart from external reasons I’m not going to post here, WAR, being brand new, is missing a fair few addons. None that I can’t live without, but one it does lack is DrDamage, which enhances your ability tooltips with the actual effective values of the ability once gear and stats are taken into account.

Part of the issue is that WAR’s combat calculations are not fully understood yet. An excellent primer is available at Disquette’s Weblog and Warhammer Alliance has a Mechanic Analysis forum as well. I’ve posted some comments at the former, but the latter requires you to be a “WAR Soldier” before you can post, and I seem to still be a “WAR Recruit”, which means I haven’t contributed enough to the Warhammer Alliance forums. Ah well.

So anyway, my addon. LibCombatCalcs is my first MMO addon, basically intended to encapsulate the various combat-number mechanics of WAR so that I (or someone else) can write tools like DrDamage (or RatingsBuster) which magically continue working when the mechanics change, and which don’t need large hard-coded tables of information duplicated across each addon.

It also intends to tie together the separate sources of combat information into a single coherent stream for other addons to listen to.

Anyway, we’re not there yet. What it does do right now is record hits against monsters, and give you a little window with /lcc mobinfo which shows the calculated toughness of the monster (from an unambiguous, non-critical autoattack) and the calculated values for all the subsequent abilities you used, letting you see whether my calculations (and therefore my transcriptions of the community’s understanding) are correct, and/or where things need tweaks. I’ll be using this (and I hope others do too; I don’t want to build a level 40 of each class to do this…) to identify the sources of DPS that contribute to each ability.
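
To give a feel for the kind of back-solving involved, here’s a minimal sketch of the idea, in Python for brevity (the actual addon is Lua, of course), with a deliberately made-up linear mitigation model and made-up constants. The real formulas are exactly what LibCombatCalcs is trying to capture from the community’s research, so treat every number below as a placeholder:

# Toy model only: invented constants and an invented linear mitigation formula.
# The point is the shape of the calculation, not WAR's real numbers.

WEAPON_DPS = 30.0            # hypothetical character-sheet DPS
STRENGTH_BONUS = 0.2         # hypothetical damage added per point of Strength
TOUGHNESS_MITIGATION = 0.2   # hypothetical damage removed per point of Toughness

def expected_autoattack(strength, target_toughness, swing_time=2.0):
    """Expected damage of one non-critical autoattack under the toy model."""
    dps = WEAPON_DPS + strength * STRENGTH_BONUS - target_toughness * TOUGHNESS_MITIGATION
    return dps * swing_time

def solve_toughness(observed_hit, strength, swing_time=2.0):
    """Back-solve the target's toughness from one observed non-crit autoattack."""
    unmitigated = (WEAPON_DPS + strength * STRENGTH_BONUS) * swing_time
    return (unmitigated - observed_hit) / (TOUGHNESS_MITIGATION * swing_time)

if __name__ == "__main__":
    toughness = solve_toughness(observed_hit=52.0, strength=120.0)
    print("Calculated toughness:", toughness)                            # 140.0
    print("Predicted next hit:", expected_autoattack(120.0, toughness))  # 52.0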

Anyway, there it is. I’d love to hear feedback about it, preferably at Curse/CurseForge, but here is fine too if you hate those sorts of sites. You can clone the git repo from CurseForge, and it currently autopackages every commit I push, so you can also grab and install the zips.

By the by, this is my first time using msysgit (although I did contribute some work to a different msys git effort), and it, combined with Console and an updated Vim with some nice colour schemes (I’m using xterm16 at home and work now), makes me a much happier Windows programmer on my laptop.

On other fronts, I’ve recently been playing with Python-Ogre, hoping to knock out a 3D physics-based tech demo of some kind with it in the medium-term future. (It may end up being a Christmas break project…) After my disappointments with 64-bit Python and Pyglet under Windows, I may end up doing it under Linux; ideally it’d be cross-platform, of course. I’ve also done some more serious work on my book cataloguing software, using Elixir, SQLAlchemy and SQLite to turn my collection of text files into a real database. However, there’s not a particularly good way of dealing with schema changes that I can wrap my head around, so I’ve put that on hold while I think about how the data’s going to have to look in the long run. And then I got distracted, so it’s on the Christmas break pile too.
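
For the curious, the cataloguing side is roughly this shape. This is a minimal sketch using plain SQLAlchemy declarative rather than Elixir, with a made-up Book table and a made-up tab-separated books.txt as input, just to show how little it takes to get from text files to SQLite:

# A minimal sketch, not my real schema: plain SQLAlchemy declarative in place
# of Elixir, and a hypothetical tab-separated books.txt as the input.
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

Base = declarative_base()

class Book(Base):
    __tablename__ = "books"
    id = Column(Integer, primary_key=True)
    author = Column(String)
    title = Column(String, nullable=False)
    series = Column(String)

engine = create_engine("sqlite:///books.db")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

# One line per book in the hypothetical text file: "Author<TAB>Title<TAB>Series"
with open("books.txt") as catalogue:
    for line in catalogue:
        author, title, series = line.rstrip("\n").split("\t")
        session.add(Book(author=author, title=title, series=series))
session.commit()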

Tension in Debian changelogs

Holger Levsen wonders what tense people write their changelogs in. Andrew Pollock feels that his tendency is past tense.

Looking back over some of mine, FreeRADIUS from a long time ago and openjpeg more recently, it appears that my preference is to actually write them as untensed fragments. I think I’m answering the question “What does this change do?” from the perspective of the change. This would make sense, mirroring somewhat the comments I put in dpatches (and the overly verbose names that have been known to occur), which are usually the patch talking about itself in the plural. Unless that’s the patch _and_ I talking about ourselves in the plural?
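
For illustration, a made-up entry in that style (not from any of my actual changelogs, package and library names invented, and the maintainer trailer trimmed) reads something like:

example-package (1.0-2) unstable; urgency=low

  * Add dpatch to stop the build system ignoring CFLAGS.
  * Drop the obsolete build-dependency on libfoo-dev.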

First rule of karma: You don't talk about karma

OK, so I made it to LCA08 in Melbourne, eventually.

However, I managed to have the following happen on the way:

  • Got the time of my flight to Melbourne wrong, arrived at 5:30pm for a flight that left at 5pm.
  • Caught the wrong tram from Melbourne CBD to uni accommodation, had to walk from Royal Melbourne Hospital back to the university. This was precipitated by me misreading the tram timetable thingy.
  • Failed to wave at a tram outside the uni, meaning it sailed right on past me.
  • Locked myself out of my room, the third time I left it. (They’ve got those dumb swipe-card locks which are always locked except when you’ve just swiped from the outside, but are open from the inside.)
  • Asked on #linux.conf.au about the URL for Planet LCA 2008 while it was in the topic. (Unlike on #debian, not only was I not mocked for this, but no one noticed before I did, a while later.)

On the other hand, I caught up with Brad, Evelyn, Bek, Jason, Phil, Naoko, Geoff and Ange, all in the one day. That was fun: we had dinner, I stuck my sore feet in the ocean and felt better, and I managed to catch the right trams from the university _to_ the city. Well, lunch with Naoko, the rest with the others. (Actually, that’s in reverse chronological order.)

The actual conference first day was interesting. I was at the Debian Mini-conf all day, seeing a neat thing about using git for managing packages sensibly, which is something I was trying to figure out when I was packaging Second Life last year, as well as some cool stuff coming into Debian over the next year or so.

After the Debian Mini-conf all went over to the keysigning (I didn’t go again this year; I wasn’t organised in time), I went to see a presentation about Ingex, which is something the BBC have developed to try and take Digital Betacam out of the video production process (since Digital Betacam only works in real-time, as I understand it), with some success so far, and it’s pretty interesting.

Speaking of not being organised in time, I only thought today to look at the Tutorials, and both Wednesday’s tutorial about hooking up hardware to Second Life and Thursday’s tutorial about hacking on lguest require preparation. I was able to grab Jon Oxer at the Debian Mini-conf and get my name put on the one remaining spare development kit, and so now I’m down in the Junior Common Room of Trinity College (no wireless in the rooms yet) updating my blog instead of trying to get lguest running under qemu. I’ll have to go dig up Rusty’s and Robert Love’s instructions from LCA05, preparing for their kernel hacking tutorial that year. Wow. Archiving the old LCA websites kicks ass!

Edit: I actually was dumb on #linux.conf.au, not #debian. As an aside, I managed to lock myself out of my room again later that week.

It takes surprisingly little bad karma to get a good karma payout

Good news! Having worked for most of the traditional Christmas break, I’m now going to linux.conf.au 08 in Melbourne next week, and Game Developers Conference 2008 in San Francisco in late February.

CAPSLOCK CANNOT EXPRESS MY GIRLY DELIGHT

For those of you who don’t already realise, my dream job since age six was to be a video games programmer. Having now achieved that, you’d figure I was now in for karmic mortgage payments for a while. And sure enough, having an umbilical hernia become quite painful on Friday night, 28th of December (I was working that day) would certainly seem to be within reach. I’d actually had the hernia for a couple of months, I reckon, but hadn’t known what it was or what to do with it. (I thought I was just getting fatter. -_-) Anyway, a mix of Mentos, Coca-Cola, lifting a heavy TV that week and who knows what else ended up with me spending the night in hospital on morphine. (Well, I dunno if I was on morphine all night. They gave me some.) Thankfully, the surgical registrar was able to push the bits of bowel sticking out back in (before the morphine. -_-) without problem, and no problems appeared overnight, so I’m now waiting for the letter to let me join the waiting list for surgery, and occasionally stopping to push bits of my bowel back through my belly-button.

This means I’m no longer a hospital virgin (not that I really was: I went to hospital when I was three years old or so, to get my forehead stitched up after falling off the wall above our driveway in Oyster Bay, Sydney), but there was a scare that I wouldn’t be able to go to LCA this year, having already booked and paid for it, LCA being my main actual holiday each year.

Also, it was lucky my sister was in town, since when I told her where and how it hurt, her mind went straight to hernia, so she and my mother came over to check me out and took me to hospital, hours earlier than I would have gone myself.

Anyway, early last week I saw the consultant surgeon, and he said I’d be fine to travel, since the surgery was fairly far off in the future anyway (“several months”, I believe), and as long as I don’t put sustained lateral strain on my abdomen, I’ll be fine.

He also said to lose weight, of course.

So yeah, I reckon that the hernia prolly balances out LCA, GDC, my job, and maybe even my paying off of the ATO this year. I hope the universe agrees, ’cause if I’m still in the red for those good things, I’ll have to be sure to back up my new laptop before I travel.

Things that happen when my brain gets full

I was recently linked to CCG Workshop, which is a site where you can play collectable card games (CCGs) online. It’s interesting because they have this gatlingEngine software, which apparently runs the game for you using a set of rules in a gatlingML file.

I thought this would be a wonderful chance to document the rules for the Love Hina CCG, which I never finished translating as you can see, but the gatlingDevKit and all the developer documentation requires that you sign an NDA and suchlike.

Discussions on the forum (the developers talk openly on the public forum, so I have an idea what’s not under NDA ^_^) indicated the gatlingML files were XML, but when I got one while trying to play a game, it was quite clearly binary.

The first four bytes are !HZL which I thought looked really familiar, but it took a fair while before I clicked that that was “LZH!” backwards, LZH being the compression algorithm used in the LHA family of archivers. Of course, research indicated that none of the LHA family of archivers actually wrote a file with !HZL at the front.

Poking about some more, I noticed that the gatlingEngine is written in Delphi (and is legacy code anyway) and went looking for Delphi compression libraries. Thankfully, the vast majority do PKZIP-compatible compression, and the first one I tried that supported LZH compression was Tlzrw1. (Apologies for the quality of the link; the 1998 link in the readme file is dead, and the Wayback Machine record for it indicates that the author’s page didn’t mention the library anyway.) So I note that the library in question attributes its LZH code to LZHUF.C, which Google duly turns up for me. I change the code a bit to stop assuming a 16-bit word and to handle the header at the front, and suddenly I have a utility which can encode and decode files compressed with the LZH mode of Tlzrw1. (Which has been ported to C# and Delphi.NET, Google tells me.)
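
As a trivial illustration of the backwards magic, here’s a sketch (Python, with a hypothetical helper name) that just sniffs a gatlingML file before bothering the decompressor with it:

import sys

MAGIC = b"!HZL"   # "LZH!" written back-to-front

def looks_like_tlzrw1_lzh(path):
    """Return True if the file starts with Tlzrw1's reversed LZH magic."""
    with open(path, "rb") as handle:
        return handle.read(4) == MAGIC

if __name__ == "__main__":
    for path in sys.argv[1:]:
        verdict = "LZH-compressed" if looks_like_tlzrw1_lzh(path) else "something else"
        print(path + ": " + verdict)
    print("Magic reversed:", MAGIC[::-1].decode("ascii"))   # prints: Magic reversed: LZH!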

Now of course someone needs the interest, gumption and skills needed to produce an open-source program that can process gatlingML files and run games from them. ^_^

Oh, and a cool thing: progress bar for cp, courtesy of Chris Lamb via Planet Debian.

Edit: Missing quote put a whole whack of text inside an <a>-tag.

Miam: It's French for leaving a bad taste in your mouth

(Side note: Due to 410549, some kind of PHP4/Apache2 bug in Debian/Stable that WordPress 2.1 has triggered, this site’s not loading fully. It’s apparently only happening on Debian, and upgrading PHP4 to the Dotdeb 4.4 build reportedly fixes it. >_<)

Anyway, here’s an entry in my “Why everything that isn’t apt sucks” category.

[root@bookcase ~]# yum info kernel-2.6.19-1.2911.fc6.i686 kernel-devel-2.6.19-1.2911.fc6.i686
Loading "installonlyn" plugin
Setting up repositories
Reading repository metadata in from local files
Available Packages
Name   : kernel
Arch   : i686
Version: 2.6.19
Release: 1.2911.fc6
Size   : 16 M
Repo   : updates
Summary: The Linux kernel (the core of the Linux operating system)
Description:
The kernel package contains the Linux kernel (vmlinuz), the core of any
Linux operating system.  The kernel handles the basic functions
of the operating system:  memory allocation, process allocation, device
input and output, etc.


Name   : kernel-devel
Arch   : i686
Version: 2.6.19
Release: 1.2911.fc6
Size   : 4.7 M
Repo   : updates
Summary: Development package for building kernel modules to match the kernel.
Description:
This package provides kernel headers and makefiles sufficient to build modules
against the kernel package.

[root@bookcase ~]# yum install kernel-2.6.19-1.2911.fc6.i686 kernel-devel-2.6.19-1.2911.fc6.i686
Loading "installonlyn" plugin
Setting up Install Process
Setting up repositories
Reading repository metadata in from local files
Parsing package install arguments
Resolving Dependencies
--> Populating transaction set with selected packages. Please wait.
---> Package kernel-devel.i686 0:2.6.19-1.2911.fc6 set to be installed
--> Running transaction check
--> Populating transaction set with selected packages. Please wait.
---> Package kernel-devel.i686 0:2.6.18-1.2798.fc6 set to be erased
--> Running transaction check

Dependencies Resolved

=============================================================================
 Package                 Arch       Version          Repository        Size
=============================================================================
Installing:
 kernel-devel            i686       2.6.19-1.2911.fc6  updates           4.7 M
Removing:
 kernel-devel            i686       2.6.18-1.2798.fc6  installed          14 M

Transaction Summary
=============================================================================
Install      1 Package(s)
Update       0 Package(s)
Remove       1 Package(s)

Total download size: 4.7 M
Is this ok [y/N]: Y
Downloading Packages:
(1/1): kernel-devel-2.6.1 100% |=========================| 4.7 MB    00:21
Running Transaction Test
Finished Transaction Test
Transaction Test Succeeded
Running Transaction
  Installing: kernel-devel                 ######################### [1/2]
  Cleanup   : kernel-devel                 ######################### [2/2]

Removed: kernel-devel.i686 0:2.6.18-1.2798.fc6
Installed: kernel-devel.i686 0:2.6.19-1.2911.fc6
Complete!
[root@bookcase ~]# yum install kernel-2.6.19-1.2911.fc6.i686
Loading "installonlyn" plugin
Setting up Install Process
Setting up repositories
Reading repository metadata in from local files
Parsing package install arguments
Nothing to do
[root@bookcase ~]# rpm -q kernel-2.6.19-1.2911.fc6.i686
package kernel-2.6.19-1.2911.fc6.i686 is not installed
[root@bookcase ~]# wget http://mirror.aarnet.edu.au/pub/fedora/linux/core/updates/6/i386/kernel-2.6.19-1.2911.fc6.i686.rpm
...
11:36:50 (141 KB/s) - `kernel-2.6.19-1.2911.fc6.i686.rpm' saved [17169362/17169362]
[root@bookcase ~]# rpm -i kernel-2.6.19-1.2911.fc6.i686.rpm
[root@bookcase ~]# rpm -q kernel-2.6.19-1.2911.fc6.i686
kernel-2.6.19-1.2911.fc6
[root@bookcase ~]# yum info kernel-2.6.19-1.2911.fc6.i686 kernel-devel-2.6.19-1.2911.fc6.i686
Loading "installonlyn" plugin
Setting up repositories
Reading repository metadata in from local files
Installed Packages
Name   : kernel
Arch   : i686
Version: 2.6.19
Release: 1.2911.fc6
Size   : 46 M
Repo   : installed
Summary: The Linux kernel (the core of the Linux operating system)

Description:
The kernel package contains the Linux kernel (vmlinuz), the core of any
Linux operating system.  The kernel handles the basic functions
of the operating system:  memory allocation, process allocation, device
input and output, etc.


Name   : kernel-devel
Arch   : i686
Version: 2.6.19
Release: 1.2911.fc6
Size   : 14 M
Repo   : installed
Summary: Development package for building kernel modules to match the kernel.

Description:
This package provides kernel headers and makefiles sufficient to build modules
against the kernel package.

This all started when I tried to build a kernel module for the default Fedora Core 6 kernel on a fileserver at MF, only to find that the version magic didn’t match, as I had an i586 kernel but i686 headers. No matter the cajoling, I couldn’t get it to install an i586 set of headers, or an i686 version of the running kernel. I gave in and figured that, due to a security issue, the old 2.6.18 kernel had been retired and the new kernel (2911) was the only one in the repositories.

Which led me to try the above. Clearly, yum agrees there’s a kernel image RPM and a kernel headers RPM available, both i686, but bizarrely it is completely ignoring any requests to install the former. And I mean ignoring: no error, no failure, it’s as if I hadn’t listed the package.

Sure enough, grabbing the RPM directly from the mirror and installing it with rpm worked fine.

And just to keep the hate flowing, the default setup of Yum is awful. There are no Australian mirrors in the mirror rotation, so I was getting 20kB/s before thinking to take away its mirror list and force it to use mirror.aarnet, at which point I suddenly got the full effect of our two-megabit-per-second link. And before I did that, if I changed my mind about an operation that was busy fetching things from the network, control-c would kill the fetcher, and yum would then proceed to try the next mirror in the list. The default installation contains a huge list of mirrors (fetched from the Fedora website) which, now I look at it, does start with mirror.aarnet, although it also then tells me it couldn’t find any mirrors to match AU, despite having just given me one, and lists mirrors all over the shop. And it certainly never seemed to be using one when told to fetch something.
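
For anyone else stuck behind a thin pipe: the trick is just to comment out the mirror list and hard-code a baseurl in the repo definition. Something along these lines, although the exact file name and the original mirrorlist URL vary between releases, so treat this as a sketch rather than gospel:

# /etc/yum.repos.d/fedora-updates.repo (excerpt, hand-edited)
[updates]
name=Fedora Core $releasever - $basearch - Updates
#mirrorlist=...                    <- whatever was there originally
baseurl=http://mirror.aarnet.edu.au/pub/fedora/linux/core/updates/$releasever/$basearch/
enabled=1
gpgcheck=1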

In Yum’s defence, I will say that it survived being backgrounded and kill -9’d on several occasions. ^_^

Speaking of changing mirrors, it doesn’t notice when you tell it to use a different mirror, and won’t invalidate its cached metadata, meaning it’ll reject the downloaded primary.xml.gz. When this happens, it still doesn’t clear its metadata, meaning if you try it again, it’ll fail again.
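
The workaround, as far as I can tell, is to throw the cached metadata away by hand before retrying (see the yum man page for the other clean targets):

yum clean metadata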

I feel better, having vented that. And I can hardly wait until we can whack this server and make it a nice Debian box, like all the rest of the systems in here (bar one FC4 box which only has one task, but happens to be in the DMZ…).

OK, one more thing. The Yum instructions say you can upgrade Fedora Core using Yum, but advise against it. And it’ll only go one version at a time, and the box was an FC4 box in need of serious love. So I loved having to grab a four-gigabyte DVD to upgrade a server which is actually less than four gigabytes of system… It would have been quicker to image everything but our data and FTP that to someone who already had the DVD. Except that it had to come back too. And it turned out to have, for a server, an incredible amount of crap on it. (I’ve this afternoon removed kde, gnome, metacity, cups, evolution, firefox…) This machine is RAIDed, backed up, and was never ever going to be someone’s desktop machine. (I hope.)

Although I now understand why there are people who want to upgrade Sarge to Etch, and start by downloading the 8-CD weekly Etch image. And in fact I had someone two weeks ago who was going to install Sarge, didn’t have a good Internet connection, and was asking if there was a better way than grabbing two DVD images.

In case you’re wondering, the kernel module I wanted to build was ppscsi, for a HP ScanJet 5100C. I wouldn’t have had this problem under Debian. ^_^