Everyone loves high-level shader languages. Well, most people anyway - it takes a certain kind of person to enjoy writing assembly language. They do have one disadvantage though - compiling them is not ‘free’; it takes a measurable amount of time. For many things you might not notice, but if you have a lot of shaders, or particularly if you have long, complex shaders such as ‘uber shaders’ - where one enormous set of code with preprocessor options produces many variants - you might be surprised how long they take to compile.
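To make the variant problem concrete, here’s a minimal sketch in Ogre material script syntax - the program names, source file and defines are all hypothetical. Each distinct combination of preprocessor defines is a separate compile of the same source:

```
// One uber shader source, compiled twice with different options -
// every combination of defines you reference is another compile
fragment_program Example/Uber_ShadowsFP glsl
{
    source uber.glsl
    preprocessor_defines USE_SHADOWS=1,NUM_LIGHTS=4
}

fragment_program Example/Uber_SimpleFP glsl
{
    source uber.glsl
    preprocessor_defines USE_SHADOWS=0,NUM_LIGHTS=1
}
```

With n independent boolean options the variant count can grow towards 2^n, which is exactly where compile times start to hurt.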
Ogre’s material system is pretty smart already, and one of its many features is to automatically switch techniques based on whether the hardware supports the features you’ve tried to use in your material. Thus, you might provide an uber-fantastic Shader Model 3 lighting model for cards that can handle it, and provide simpler implementations that will be used automatically on lower-end hardware. Other uses of techniques include ‘material schemes’ (where you can define alternate pathways for your materials and associate them with particular viewports, render targets or system options - useful for HDR paths, high / low detail options etc.) and level-of-detail support, where you can automatically reduce the complexity of your materials with distance - but hardware fallback is probably the most common use.
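As a sketch of how the hardware fallback looks in a material script (the material and program names here are made up), Ogre walks the techniques in order and uses the first one the current hardware can actually run:

```
material Example/PerPixelLighting
{
    // Preferred technique - only usable on Shader Model 3 hardware
    technique
    {
        pass
        {
            vertex_program_ref Example/SM3_VP
            {
            }
            fragment_program_ref Example/SM3_FP
            {
            }
        }
    }

    // Fallback - plain fixed-function, picked automatically on
    // hardware that can't run the technique above
    technique
    {
        pass
        {
            diffuse 1.0 1.0 1.0
        }
    }
}
```

The other uses mentioned above hang off the same structure: tag a technique with a ‘scheme’ name to tie it to particular viewports or quality settings, or give it a ‘lod_index’ so it takes over at distance.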
I have a confession to make - in my deep and distant past, I gained an accountancy qualification - the legacy of a young man with too little focus on what he wanted to do in life, before it had occurred to him that someone might pay him to play with computers. Not that I think there’s anything particularly wrong with accountancy; on an island dominated by finance it was a safe bet for a mathematically / logically-minded individual with too many diverse interests and no particular clue which one to follow at the time - but I found out pretty quickly that it definitely wasn’t for me.
It hasn’t been that long since I last upgraded my main dev machine, but as it happens, although the 320MB 8800 GTS seemed perfectly adequate for my needs late last summer, most of the people on the team I’m working with now bought their cards more recently, and of course the 8800 GTS 512MB is now the ‘sweet spot’ for NVIDIA cards. It didn’t really matter until I started running out of video memory, and since I was always the first on the team to run out, it was a mite inconvenient.
Open Season is a podcast about open source issues, weighted towards the practical rather than the philosophical, and as such I tune into it regularly. Some episodes are better than others, but I found the latest one, Episode 13, quite interesting for a number of reasons. They had an analyst from RedMonk on board this time, which was fascinating - RedMonk are (AFAIK) the only research firm that release their results openly rather than charging a few thousand for detailed papers, so they’re well worth following.
When you’re talking with some programmers, particularly younger ones, you can’t help but run into the ‘great language debate’ at some point or another. That is, many programmers have a language which they feel is superior to all the others, and they’ll put up a ton of resistance should you suggest that they use something else. It happens in other areas too of course - preferred operating systems, databases, apps etc. - but as coders the language issue always tends to come up most, closely followed by IDEs.
I like co-op games a lot - currently my wife and I are battling on-and-off through the triple-distilled gameplay that comprises the multiplayer co-op levels on N+ - and both ruthless and supremely entertaining they are. I’m glad that game designers are starting to give this mode more attention these days, and so when I initially saw Army of Two (peripherally), I was semi-interested given that it was clearly designed for co-op from the outset.
Yet more proof, if we needed it, that Mr Facebook has his head permanently lodged shoulder-deep in his own arse: allegedly, Facebook is now going to help rid the world of terrorism. Yep, that’s right - not content with running a one-trick, popularity-dependent company that, despite still scrabbling around for a viable business model, gets funded to a level that defies all rational analysis, nor with his could-you-get-any-more-pompous ‘100 years of media’ gaffe, good old Zucky is now taking credit for stopping terrorism too.
It’s easy for my wife to tell if I’m ‘properly’ sick; if I don’t have any interest in touching a keyboard for a while, it’s official - I’m ill. More reliable than any doctor’s diagnosis. I came down with proper ‘flu for the second time this winter, which sapped my energy for pretty much everything over the last few days. Sucks. I didn’t even have the energy to play any games until late in the weekend!