Thursday, September 29, 2005

A change of pace

Well, finally all the high priority issues and most of the medium priority issues in the weather decoders are fixed. The rest of the stuff is pretty trivial. It's taken a good amount of effort to get to this stage. All changes are booked in and awaiting the project lead to process, test and include them.

This means I can move on to what I really want to do - game programming - specifically the space game that we've been working on for a while. Several things are happening to it: it's becoming cross-platform (Mac and Windows) via SDL, and it's becoming both Mac-PowerPC and Mac-x86. Lots of things to do.

Strangely the project lead for the weather program can't get to my decoder changes because he's also working on the space game. For some reason he's extremely gleeful at having a totally new machine with a completely new architecture to port the game to (Mac-x86). Haven't seen him this happy in years!

Anyway, must code not blog.

Sunday, September 25, 2005

A lack of multiplayer games?

This is a short ramble into multiplayer gaming containing a random and possibly unrelated set of ideas. It mainly applies to multiplayer games over the Internet. Some of the arguments are not fully fleshed out. If you have specific questions or comments let me know and I'll think something up. Or change my mind. Whatever is easier. And, although I refer to computers specifically in this article, a lot of the things here apply to console gaming as well - I think.

My primary concern: there aren't a lot of internet multiplayer games and, in my opinion, there certainly aren't as many as there should be. I'm not saying there shouldn't be any single player games - having a lone gaming session is still very important for reasons I won't go into here. But sometimes I think we certainly haven't explored the multiplayer game arena to any great extent at all.

For example: generally, the available shareware multiplayer games break down into simple games (card games, puzzle games, etc) and more immersive games. However, there are very few of either to be honest. I guess this is because doing multiplayer games isn't easy, especially with the latencies over the internet making things much more complicated. However, even the number of commercial games that are multiplayer is limited. Agreed, making a multiplayer game is totally different and changes the game and development dynamics - whether they are competitive or co-operative.

I'm mostly interested in 'immersive' games. For the current purposes, let's say there are two classes of immersive games:
1. Games like Doom, Quake, Unreal or Neverwinter Nights where a limited number of players game together. Let's call these team games.
2. Massively multiplayer games, usually MMORPGs (massively multiplayer online role-playing games), which usually have greater than 100 users. The main difference between recent MMORPGs and the MUDs of old is that they are fully graphical - usually some form of 3D or fake 3D.

Maintaining a MMORPG as shareware or freeware is a tall order - even for commercial companies it's difficult. From a running point of view I guess most MMORPG typically require a single server* that is constantly running with a fair amount of bandwidth. Although totally peer-based persistent worlds are possible, it's cutting edge stuff that's difficult even if we assume no one is going to cheat. (* I know quite a few commercial MMORPG run multiple servers)

(Perhaps another alternative for persistent worlds for shareware/freeware is lots of little servers with less bandwidth that only run a small section of the map. Think of a network of web pages where the whole web-site is run over multiple servers. Of course each system operator would need some sort of gate-keeping so that only certainly things pass over the borders of their land. Character advancement and object management would be a challenging problem. )

Even designing team games adds many different challenges. They require stories that suit multiple players for co-operative games, or equalisation of game play for competitive games (e.g. no greatly better weapon). On top of this nearly all team games have a single player mode - a story to play. This itself requires much tuning to make into a good game. Therefore you really have two games. One of the reasons for needing a single player mode is finding people to play with. Of course, there is a solution; quite a few games have lists of currently active games you can join. This solution is only half a solution: most of the games are full of very experienced players whom the average player has no chance against. Therefore it becomes no game at all. Perhaps some system to put similar players together? Additionally, these are people you don't know - and hence have no connection to. In a MMORPG you at least build up relationships with people as well.

The two types of games are obviously very different. MMORPGs have theoretically infinitely more variety than team games. But there are limitations. One of them is that the players have much less control over the game than you do when playing a single player game or a team game with friends. Play occurs constantly, and people you know who play for more time are at a serious advantage. Also, most games charge a fixed fee by the month - ruling out a lot of casual gamers getting their money's worth. Additionally a lot of MMORPGs rely on character advancement rather than story or even exploration as a pull. Constant character advancement is only a driver for so long. Of course, some players are good at (real) role-playing - usually without assistance from the game code or world at all. This provides an infinitely more convincing driver for players. When programmers learn to code that into a game they will have a winner.

Oh, a footnote: I'm focused here on internet multiplayer games over the other two modes - local connection (e.g. LAN) and multiplayer games on a single computer - because both are significantly inferior to internet multiplayer games. There are good games in this category, of course, e.g. Bloodwych, but generally it adds a whole extra dimension to the game when you don't have to be physically together with the other players all the time. Generally Internet games can be played over a local connection (or at least can easily be designed to be). Playing multiplayer games on a single computer can be good - but generally it's a bit of a pain - and has limitations - e.g. you can see what the other player is doing (although with a co-operative game that can be fun - like watching reality t.v. but better).

Anyhow, enough babbling.

Friday, September 23, 2005

Bizarre Future

From all accounts, it appears that Mac OS X on x86 pretty much rocks. Obviously results from the first real x86-Mac boxes will be the things to look for; chip-sets, memory and processor variants can make a massive amount of difference. But from what we have today Apple looks like it might be just fine on performance. Let's hope the power dissipation works out in the future as well.

I was confident that if Apple had made the decision to go x86, they had thought (and worked) long and hard about it. Some other coders I'd spoken to weren't so sure - but they seem to be changing their minds based on the, albeit early, evidence. Sure, I'll be sad seeing the PowerPC go from personal computers over the next few years. It's a great architecture. But at least we use (mostly) high-level languages nowadays.

The emulator (Rosetta) does the job well even for games according to my source - which is just what you want. There aren't too many holes in the development version of the operating system overall. All is looking good.

Porting seems easy and the problems minimal - the biggest difficulty is the lack of boxes for people to develop on. Certainly shareware authors can't afford the Dev machine rental. Hopefully the first Intel Macs won't be far away.

The only issue for shareware and freeware developers longer term is going to be testing both a PowerPC and an Intel build - I don't see how you can do it well without both boxes - though perhaps with an Intel box you can rely on an emulator to cover 'universal' builds? Then again, quite a few Mac shareware developers will have an old PowerPC box as well. Initially, I guess greater than 95% of PowerPC shareware and full commercial software will run just fine on Intel-based Macs.

I do wonder how Windows programs running in an emulator will perform (whose will be first?). It will be good for the occasional program. It could also provide another excuse for companies avoiding ports but, Mac users being Mac users, they will choose Mac-version alternatives (and why not - they chose the Mac, after all).

I'm sure this will encourage new developers - or give older Mac developers new life. That, in my opinion, is worth its weight in gold. Of course, this has always happened in the past for the Mac (and other 'alternative' platforms). Long may it continue - you can have too much commonality in the computer world.

Wednesday, September 21, 2005

About Debugging

Debugging is a strange thing. You do your design, write your code, and then the code you've written doesn't work. Of course you might not know it doesn't work. Sometimes you will take a chance, run the code and look at the results - occasionally I will, especially if I have only made a small change I'm confident in. But mostly I go straight into a debugging mode of some sort.

Usually I develop incrementally: write a bit, debug a bit and repeat. This means that I know straight away if I'm doing something stupid. You also know exactly where the problem must have been introduced. No complex debugging strategies are usually required. Of course, as the program as a whole gets larger there can be complex interactions. Nothing is a sure-fire way of debugging.

This style of debugging usually involves running test data through the program and, if things go wrong, adding some sort of print statement to show me the data or the program flow. This is a very simple and effective way of debugging.
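The kind of print-based tracing I mean is nothing fancier than this (a toy sketch - the function and data are invented for illustration):

```python
def average(values):
    # Trace the input before doing anything - if the result looks wrong,
    # the first question is always "what data did we actually get?"
    print("average: got %d values: %r" % (len(values), values))
    total = sum(values)
    print("average: total=%s" % total)
    return total / len(values)

print(average([10, 20, 30]))
```

Crude, but when something goes wrong the trace shows both the data and the flow, which is usually all you need.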

Sometimes you have to write a significant chunk of code where debugging incrementally is not possible. This is usually some complicated set of routines where writing test harnesses, or forcing data in, would produce a lot more work than debugging the thing as a whole. The code quite often has a lot of internal dependencies. I've no real problem with this - after all, I've found I'm not bad at writing code with a low error rate.

Often this demands a rather different approach, using a debugger so that you can see where the flow of the code is going and the data being operated on.

In Python I tend to avoid the debugger. It's not that it's bad, just that the one I normally use is the basic text debugger, which isn't as easy to use as a GUI-driven debugger. Additionally, with Python, you definitely need to test every code path. I agree it's a good strategy in any language - and my experience with Python is that if you don't, your code will not work. There will be some runtime error in some code path. This sounds a tad superstitious - but most of the checks in Python are done at runtime.

I know people have criticised Java for throwing up run-time errors - and perhaps Java's runtime errors are different. Also, I haven't written very large programs in Python (usually the leverage of the libraries means that's not necessary). However, at least in my experience, all you need to do is to make sure that you've at least run every line of code, with a reasonable set of data, before you declare it done. That's not all that bad - and the fact that you can get away with not running every single line of code in languages like C and C++ is not necessarily a benefit. I guess you can get away with it most of the time because the compile-time nature usually warns you of stupid syntax or type operations. However, in reality not running every line of code is a mistake, in my opinion - and leaves you open to not fully understanding the code you've written and the associated problems.
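To illustrate the every-code-path point (a contrived sketch, not from any real project): the typo below is perfectly legal Python right up until the broken branch actually executes.

```python
def classify(temperature):
    if temperature > 30:
        return "hot"
    else:
        # Deliberate typo (tempreture) - Python will not notice it
        # until this branch actually runs.
        return "cold: %d" % tempreture

print(classify(35))   # works fine - the broken path never ran
try:
    classify(10)      # only now does the NameError surface
except NameError as exc:
    print("runtime error:", exc)
```

A C compiler would have rejected the misspelt name at build time; in Python only running the line finds it, which is why coverage of every path matters so much.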

This brings me on to another point about C and C++. For some reason - and I guess it's a mixture of the language, the libraries, the way applications are written and the type of applications that I write in C and C++ - I find that a debugger is much more important, and a good debugger can save a lot of work. It's quite often harder to put a good range of test data through a C or C++ routine. One way round this is a test harness: a set of code (not part of the application) designed to test a specific routine, module or class.

Whilst test harnesses are definitely the right approach in some circumstances, they certainly aren't in all circumstances. For a start, the test harness for a specific piece of code can be significant because of the number of test cases. The module might not just need calling but might also need all the routines it calls replaced - so it might need a separate project with a lot of support code. There comes a point where the effort required to produce the harness - which includes debugging the test harness itself - can be way more than that for the code being tested. Remember, the object of the exercise is to know that a piece of code works - not to stick to one method. Faster (but effective) solutions should be seriously considered.
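The shape of a minimal harness, sketched in Python since that's quicker to show than C (all the names here are invented for illustration): the routine's real dependency is replaced with a stub, and a table of cases - including a boundary case - is driven through the routine under test.

```python
def fetch_rate(currency):
    # Stands in for an expensive or unavailable real dependency.
    return {"GBP": 1.0, "USD": 1.8}[currency]

def convert(amount, currency, rate_source=fetch_rate):
    # The routine under test.
    return amount * rate_source(currency)

def stub_rate(currency):
    # The harness's stub: fixed, predictable data, no real dependency.
    return 2.0

# The harness proper: a table of (input, expected) cases.
cases = [(0, 0.0), (1, 2.0), (100, 200.0)]
for amount, expected in cases:
    got = convert(amount, "GBP", rate_source=stub_rate)
    assert got == expected, (amount, got)
print("all harness cases passed")
```

Even in this toy, the harness (stub plus case table) is about as long as the routine itself - which is exactly the cost trade-off described above.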

The other way is to test in place, usually with a debugger to examine the state before, during and after. Stepping through each line once very often illuminates errors or problem cases (not necessarily ones happening in the current run). Get the program under test to perform actions that provide boundary conditions for the code under test. Sometimes this is via program input data (files, user actions, etc) but other times the rest of the program needs to be modified temporarily to 'stress' the module under test and get the target code to perform in all the required modes of operation. If it's an important test that needs to be repeated when the code is modified, the test code can be left in the source but conditionally compiled out. (Of course, if it's not required then take it out, since more lines of code = harder to change. These bits of code can always be re-made if the test needs to be repeated.)

Of course there are many more approaches to debugging (e.g. support programs). But these are the ones that I use the most.

Apple's Xcode debugger uses gdb underneath. They've added some nice features recently to the GUI front end. Breakpoints especially are getting powerful: you can do things like hit counts, and you can set breakpoints up to log. I guess this means that rather than sprinkling prints over the code you can just click and it will log. This seems neater somehow and I really fancy trying it out.
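Under the GUI these map onto plain gdb breakpoint commands. Roughly (the function and variable names here are made up for the example):

```gdb
# Log every call to update_score() without editing the source:
break update_score
commands
  silent
  printf "update_score called: score=%d\n", score
  continue
end

# A hit count: skip the first 99 hits of breakpoint 2.
break process_packet
ignore 2 99
```

The `commands`/`silent`/`continue` pattern is what turns a breakpoint into a non-stopping log point - the click-to-log feature, presumably, generates something like this for you.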

Additionally you can run a program when a breakpoint is hit. This is very cool - because you could set an iChat status, send an email, get another program to change one of the data files that the application is using, etc, etc.

Finally, I remember when debugging meant not running anything else on the computer. I think of all the things that have happened in debugging, this is my favourite change.

Wednesday, September 14, 2005

Programmer's Block

Sometimes I've heard this type of thing called 'programmer's block' - like writer's block. Things getting in the way of starting (or, usually, restarting) a project can be a real problem.

I never really have any difficulty coming up with ideas, and I think that's the typical view of writer's block. Another definition of writer's block is "A usually temporary psychological inability to begin or continue work on a piece of writing." It's the continuing that's the problem, after there has been a natural gap in the work. I'm not sure if it's temporary or permanent in my case. There are always lots of other things to do - especially for home projects.

Projects in teams - like office projects - tend to have a certain momentum because as one person hits a low, other team members are in their stride. This coupled with the fact that people are interested in the project day-to-day means that it's easier to avoid the something-better-to-do feeling and force yourself to start work.

I do enjoy programming. Once I've got the editor running and I'm actually coding then I can code for hours.

I know it's going to be a long process - therefore sometimes I do little jobs (not programming) first (in order to clear them off my 'to-do' list) and never get started. An old boss of mine used to want us to get little jobs finished before starting on big jobs. This meant we never started on big jobs. What we should have done is started on important jobs first (i.e. ones that made the company the most return on investment) and ignored the small unimportant jobs.

As I've said, with personal projects it can be difficult to find time, with other pretty good things competing for it: in my case my lovely wife and my beautiful daughter. There are some less impressive time sinks: things that need doing around the house, shopping (food, clothes, presents (it's always at least one person's birthday)), etc.

Another problem: tiredness. Perhaps because I do a full time job (part programming, part design, part meetings, part management), I'm usually pretty whacked. Once mundane but essential things are out of the way (cook tea, change baby x 4, eat tea, bath and put baby to bed), it's late and I'm after something with low mental effort that's entertaining.

On Saturdays tiredness and the fact I've been in the office all week usually means I need to go out. Since we usually haven't had a chance to go out for general things this usually leads to shopping.

Once in a while we also need to go out on a Saturday evening to get some socialising done - to make sure we don't turn into totally antisocial people. Play a few games, drink a few beers, etc. Of all the other distractions here (excluding the actual essential things like playing with your daughter, giving some attention to your wife and eating food), this I feel is essential.

Additionally, some sections of the code are non-trivial and require considerable concentration for extended periods of time. Obviously I try to make all sections simple at the design stage - but sometimes it's difficult to avoid difficult modules. These can easily become blocks in the work until a big chunk of continuous time exists. This is dangerous for getting started. It would be great if all coding tasks were small.

A friend of mine has, I believe, two interesting tricks. Firstly, he leaves himself an easy bug fix or a little feature enhancement, which can get you into the editor and coding. Once you are there it's not difficult to carry on. Secondly, he avoids spurious tasks (like invitations to non-specific meetings), which means he maximises the time he has to program. This is good because any little distraction or excuse can otherwise be used to avoid programming.

The Internet (or rather the web), even more than the TV, is an almost infinite universe of distractions. I say more than the TV because, apart from Sci-Fi, there are few programs that interest me. They are all targeted at non-geeks and generally poorly written with simple but silly stories (IMHO, YMMV). The web, however, has no end of such places. Additionally it has re-runs of TV programs like The Sky at Night, with links to a whole host of astronomy sites. Way too many. Dangerous stuff indeed.

All too often I'll be reading the comments posted in reply to an article. This is a massive way to waste hours.

Additionally someone - on a blog or web site - will post a really interesting programming/design/software engineering article. Now it might be related to what I do, or things I'm thinking about. But usually it won't, in reality, affect my programming. Once I've read that article I have a horrible habit of wanting to read everything to do with it - links off that site, other articles by the same author, searches off Google, what Wikipedia has to say. If I read an article at 8pm, that will be my night over!

Leaving a specific project for a while (over a week, for instance) usually means it goes off the boil. This means it's even more difficult to start up, remember where you are and actually do something on it.

Even blogging itself can get in the way. But the reason this is a coding blog is to talk about what I'm doing or things that are bugging me. I'm hoping this will have a beneficial effect. But we will see.

Tuesday, September 06, 2005

Humans vs. Machines

Humans win - especially if they set the rules. I'm trying to teach a computer (ok, write a program) to spot a Metar weather string. I've also got code to extract specific information out of a web page. Both are really quite difficult, specifically because both were really meant to be read by humans. "Web pages!" I hear you cry - "they were specifically designed to be read by computers!" Well, the web pages themselves were certainly meant to be read by computers, but the content was certainly not.
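For flavour, here's a toy sketch of what 'spotting a Metar string' even begins to involve (my own illustrative regex, not the project's decoder - and it only catches the report header, not the dozens of optional groups that follow it):

```python
import re

# Matches just the start of a METAR report: a 4-letter ICAO station
# code followed by a day/hour/minute group ending in Z, e.g.
# "EGLL 291150Z". Real reports arrive buried in free text, with
# variant and optional groups after this - that's where it gets hard.
METAR_HEADER = re.compile(r"\b([A-Z]{4})\s+(\d{6})Z\b")

def spot_metar(text):
    """Return the station code if text looks like it contains a METAR."""
    m = METAR_HEADER.search(text)
    return m.group(1) if m else None

print(spot_metar("METAR EGLL 291150Z 24015KT 9999 SCT030 18/12 Q1015"))
print(spot_metar("just some human prose about the weather"))
```

Even this trivial version has to search rather than match, because the interesting bit sits inside text written for human eyes.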

Additionally I've spotted that one of the other programs has somehow become broken. That's what you get when your program interacts with other people's programs and data.

Oh well, back to Metar weather strings and fixing broken programs.