Some Boston-based friends introduced me to The Cheesecake Factory a few years back when I was over there, and I was impressed. Not by the cheesecakes - oddly enough, both times I’ve been there we barely made it as far as the dessert. No, there was one particular dish on their menu that made the place memorable for me, and it was an appetiser - their Avocado Egg Rolls. Personally, I would never have picked that off a menu unprompted - I like avocado well enough, but given the cornucopia of options available it wouldn’t normally be my starter of choice. My ever-savvy hosts insisted I try them, though, as they were, and I believe I quote correctly, “to die for”.
It may be going on four years old now, but GPU Gems is still a fantastic resource - in fact, now that you can rely on being able to use the techniques it contains on a much larger array of hardware, it’s perhaps even more practically useful than it was on release. Graphical products outside the hardcore gaming space (and this is where Ogre gets used most) are increasingly catching up and using more advanced shader effects now, so a resource like this is actually maturing rather well.
Everyone loves high-level shader languages. Well, most people anyway - it takes a certain kind of person to enjoy writing assembly language. They do have one disadvantage though: compiling them is not ‘free’, it takes a measurable amount of time. For many things you might not notice, but if you have a lot of shaders, or particularly if you have long, complex shaders such as ‘uber shaders’ - where you have one enormous set of code with preprocessor options producing many variants - you might be surprised how long they take to compile.
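To see why uber shaders hurt compile times, here’s a minimal sketch (the option names are purely hypothetical) of how one source file plus preprocessor toggles fans out into many separate compilations - every combination of defines is effectively a distinct shader the compiler has to process from scratch:

```python
from itertools import product

def ubershader_variants(options):
    """Enumerate every preprocessor-define combination for an 'uber shader'.

    Each returned string would be prepended to the single shader source
    before compilation. N boolean options mean up to 2**N distinct
    compilations, which is why compile times balloon.
    """
    variants = []
    for flags in product([False, True], repeat=len(options)):
        # Build the '#define' block for this particular combination.
        defines = "".join(
            f"#define {opt}\n" for opt, on in zip(options, flags) if on
        )
        variants.append(defines)
    return variants

# Three toggleable features already mean 2**3 = 8 variants to compile;
# a realistic lighting uber shader can have far more switches than that.
variants = ubershader_variants(["USE_NORMAL_MAP", "USE_SPECULAR", "USE_SHADOWS"])
print(len(variants))  # 8
```

The exponential blow-up is the whole story: the cost isn’t any single compile, it’s that each extra option doubles the number of variants.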
Ogre’s material system is pretty smart already, and one of its many features is to automatically switch techniques based on whether the hardware supports the features you’ve tried to use in your material. Thus, you might provide an uber-fantastic Shader Model 3 lighting model for cards that can handle it, and provide simpler implementations that will be automatically used on lower-end hardware. Other uses of techniques include ‘material schemes’, where you can define alternate pathways for your materials and associate them with particular viewports, render targets or system options - useful for HDR paths, high/low detail options and the like - and level-of-detail support, where you can automatically drop the complexity of your materials in the distance. Hardware fallback is probably the most common use, however.
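As a rough illustration, a material script with several techniques might look something like the sketch below (material, program and texture names are made up) - Ogre picks the first supported technique, so ordering from fanciest to simplest gives you the hardware fallback, while `scheme` and `lod_index` mark out the other two uses:

```
material Example/FancyLighting
{
    // Tried first: only used if the hardware supports the SM3 programs.
    technique
    {
        pass
        {
            vertex_program_ref myFancyLighting_vs {}
            fragment_program_ref myFancyLighting_ps {}
        }
    }

    // Fallback for lower-end hardware: plain fixed-function texturing.
    technique
    {
        pass
        {
            texture_unit
            {
                texture base_diffuse.png
            }
        }
    }

    // Alternate pathway, selected only by viewports using this scheme.
    technique
    {
        scheme LowDetail
        pass
        {
            texture_unit
            {
                texture base_diffuse_small.png
            }
        }
    }

    // Cheaper technique used automatically at distance.
    technique
    {
        lod_index 1
        pass
        {
            texture_unit
            {
                texture base_diffuse_small.png
            }
        }
    }
}
```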
I have a confession to make - in my deep and distant past, I have an accountancy qualification - the legacy of a young man with too little focus on what he wanted to do in life, before it had occurred to him that someone might pay him to play with computers. Not that I think there’s anything particularly wrong with accountancy; on an island dominated by finance it was a safe bet for a mathematically/logically-minded individual with too many diverse interests and no particular clue which one to follow at the time - but I found out pretty quickly that it definitely wasn’t for me.
It hasn’t been that long since I last upgraded my main dev machine, but although the 320MB 8800 GTS seemed perfectly adequate for my needs late last summer, most of the people on the team I’m working on now bought their cards more recently, and of course the 8800 GTS 512MB is now the ‘sweet spot’ for NVIDIA cards. It didn’t really matter until I started running out of memory, and since I was always the first on the team to run out, it was a mite inconvenient.
Open Season is a podcast about open source issues, weighted towards the practical rather than the philosophical, and as such I tune into it regularly. Some episodes are better than others, but I found the latest, Episode 13, quite interesting for a number of reasons. They had an analyst from RedMonk on board this time, which was fascinating - RedMonk are (AFAIK) the only research firm that release their results openly rather than charging a few thousand for detailed papers, so they’re quite interesting.
When you’re talking with some programmers, particularly younger ones, you can’t help but run into the ‘great language debate’ at some point or another. That is, many programmers have a language which they feel is superior to all the others, and they’ll put up a ton of resistance should you suggest that they use something else. It happens in other areas too, of course - preferred operating systems, databases, apps, etc. - but as coders, the language issue always tends to come up most, closely followed by IDEs.
I like co-op games a lot - currently my wife and I are battling on-and-off through the triple-distilled gameplay that comprises the multiplayer co-op levels on N+ - and both ruthless and supremely entertaining they are. I’m glad that game designers are starting to give this mode more attention these days, and so when I initially saw Army of Two (peripherally), I was semi-interested given that it was clearly designed for co-op from the outset.