Archive for January, 2014

Linux Setup Notes, Part 2

Tuesday, January 28th, 2014

I’ve been using Linux for a month now, and while doing so I’ve been adding notes to this post:

/2013/12/29/linux-setup-notes/

Well, that post is extremely long now, and I ended up having to reinstall (a video driver fight), so it’s time for a fresh post!

I had some complaints about Ubuntu 13.10, so I decided to go ahead and install the Alpha version of Ubuntu 14.04 (2 months before release).

So far:
– Nautilus (File browser) is now fixed! I can type stuff, and it jumps to files instead of invoking a tree search! Huzzah!
– It has Mesa 10.0.1 with OpenGL 3 drivers stock! And they work!

The way things are going, I think Ubuntu 14.04 LTS is going to be a really good long-term update to Ubuntu. Out of the box, these annoyances were solved for me (video driver tomfoolery is what ruined my last install. No more!).

So far, no regrets.

Fun new keys and commands to remember

ProTip: If you’re using Ubuntu and something tells you to use Dpkg, just don’t do it!

It’s so easy to break a Linux install. Apt and Dpkg (Synaptic) are both package managers available to you. Either can be used, but the two don’t communicate with each other at all, so if you’re not careful you can break stuff (like I did).

So if you’re using Ubuntu, prefer Apt, i.e. something like this (swap in whatever package you’re after):
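    sudo apt-get install some-package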

For all your installation needs.

Or alternatively, make sure to NEVER use Dpkg on “drivers”. Let Apt manage this stuff, and you will save yourself a world of pain.

Backups and Restoring

Be sure to copy your entire user folder, especially its top level (that’s where the hidden dot-folders live)! Re-setting up applications becomes extremely simple, as you can just copy the hidden folders back into your new home folder (~).

I got scared that I would have to waste an UltraEdit activation (ugh DRM), but nope, all I had to do was copy my .idm folder. Huzzah!

Need Flash? Use Chrome

Download it directly from Google.

I’ve been a loyal Firefox user for a very long time. That said, as someone involved in games, Flash content is still kind of important to me. There was a debacle some time back where Mozilla refused to support Chrome’s PPAPI, a new sandboxed plugin API that replaces the legacy NPAPI used by all the other browsers. PPAPI has another advantage in that it’s not only sandboxed from the browser, but sandboxed from the OS. It’s the foundation for Native Client (NaCl).

Adobe declared that it would not support Linux anymore; the last version of Flash for Linux is 11.2. They just didn’t want to maintain the old NPAPI version on Linux. Curiously though, to better support the Chrome browser, Adobe makes a PPAPI port of Flash.

Recently, Google announced that they will be discontinuing NPAPI support in Chrome. To a lot of folks, this sounded like “no more Flash on Linux”, but most people seemed to have missed the memo that Flash supports PPAPI.

And since PPAPI is really only supported by Chrome, PPAPI Flash comes bundled with Chrome. Easy.

As of this writing, the Flash bundled in Chrome is version 12, the same as on Windows.

So really, the hardest part about Flash on Linux is switching away from Firefox.

VMware Player on Ubuntu 14.04 Alpha

The latest Ubuntu ships a brand new version of the Linux kernel, which broke the network bridge driver code that VMware Player 6.0.1 ships with. The solution, a patch, can be found here:

http://dandar3.blogspot.ca/2014/01/vmware-player-601-on-ubuntu-1404-alpha.html

Remapping keys on Ubuntu 14.04

Well it looks like the keyboard symbol files moved in the latest Ubuntu. Now they’re under /usr/share/X11/xkb/symbols/. Below is the modified snippet from my original post.

The Lenovo X220 has Web Forward/Back keys beside the arrow keys. I prefer that they act like alternative PageUp and PageDown keys.

Open up /usr/share/X11/xkb/symbols/inet (i.e. sudo gedit /usr/share/X11/xkb/symbols/inet)

Find a key named <I166>. Change “XF86Back” to “Prior“.

Find a key named <I167>. Change “XF86Forward” to “Next“.
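After the edits, the two entries should look roughly like this (the real file has extra whitespace and may list more keysyms):

    key <I166> { [ Prior ] };
    key <I167> { [ Next ] };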

Browse to /var/lib/xkb/ and delete the *.xkm files there. You need to do this to force a keyboard code refresh.
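In other words (assuming the same path):

    sudo rm /var/lib/xkb/*.xkm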

Logout, then Login to refresh. (Or reboot)

Power user GUI config tools

Install CompizConfig Settings Manager.

Install Unity Tweak Tool.

TODO: figure out the name of the tool I installed that made the taskbar work again. No, it wasn’t dconf-tools / dconf-editor.

Setting default file associations to any program

Source.

Emit Keypresses (when being clever)

Use XDoTool.
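For example, this would emit a PageDown keypress (the key name comes from the X keysym list):

    xdotool key Page_Down

It can also type whole strings, e.g. xdotool type "hello".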

On Windows, use nircmd.

Notable Patches

Minimize on Click (out of date, but a simple edit)
Disable Middle Button Paste
Taskbar Whitelisting. *sigh*… or a fix for XChat.

FUIbuntu/FUXbuntu

TODO: Start a repo/PPA that undoes/fixes some of the annoying UI “fixes” Mr Shuttleworth has introduced into Ubuntu and Unity. The F stands for what you think it stands for, and the UI is to polite’n up the U, and to be specific that the tweaks are UI related. Alternatively, FUXbuntu if feeling angry.

I want to still use a stock Ubuntu, but holy hell there are some frustrating UI decisions in Ubuntu.

UPDATE: Ubuntu 14.04 does fix the file browser.

SOURCE YOUR SHELL SCRIPTS!

Very important thing I *just* learned: Source!

If you write a shell script that (for example) sets environment variables, the variables are set for any commands run inside that script, but they will not propagate outside the script into your shell. The solution is to run the script inside the current shell using source, like so (setup.sh being a stand-in for your script):
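    source ./setup.sh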

or
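    . ./setup.sh

Note the leading dot: that’s the POSIX shorthand for source.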

Startup and Environment Variables

To run things on startup, add them to ~/.profile (not ~/.bashrc, which is only read by interactive shells).
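For example, to add a directory to your search path (~/bin is just an example):

    export PATH="$HOME/bin:$PATH"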

Simply log-out and log back in again for the changes to take effect.

More details.

C++ typeof vs decltype

Saturday, January 25th, 2014

GCC and compatible (Clang) compilers implement a feature known as typeof (or __typeof__).

http://gcc.gnu.org/onlinedocs/gcc/Typeof.html

This is a pre-C++11 feature, omitted from the standard, and unavailable in Visual C++.

For the most part it does what you’d expect. Given a variable, it returns the type. This lets you create another instance of a type without having to use its full name. This is helpful if you happen to be using templates.
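A minimal sketch of the idea (GCC/Clang extension spelling):

    #include <map>
    #include <string>

    std::map<std::string, int> Data;
    typeof(Data) MoreData;   // a second map, without spelling out the full type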

Regrettably, typeof seems to be somewhat flawed.

Generally speaking, there is no way to use typeof to read a type’s members (typedefs, structs, unions, classes). Something like this fails to compile:
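    typeof(Data)::iterator It = Data.begin();   // error: typeof can’t be used to name member types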

As of C++11, a new keyword decltype was introduced. It is functionally the same as typeof, but the case shown above works. It has been available since GCC 4.3 (2008) and Visual C++ 2010.
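With decltype, the failing line above becomes legal:

    decltype(Data)::iterator It = Data.begin();   // compiles fine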

In practice, you’re often using this in conjunction with an assignment. So if you have C++11 available, you may as well just use auto:
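    auto It = Data.begin();   // same result, less typing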

*sigh*… I wish a certain company didn’t use GHS Multi, so I could … you know, use decltype and auto. 😉

4K, HDMI, and Deep Color

Friday, January 10th, 2014

As of this writing (January 2014), there are 2 HDMI specifications that support 4K Video (3840×2160 16:9). HDMI 1.4 and HDMI 2.0. As far as I know, there are currently no HDMI 2.0 capable TVs available in the market (though many were announced at CES this week).

Comparing HDMI 1.4 (Black text) and 2.0 (Orange). Source: HDMI 2.0 FAQ on HDMI.org

A detail that tends to be neglected in all this 4K buzz is Chroma Subsampling. If you’ve ever compared an HDMI signal against something else (DVI, VGA) and the quality looked worse, one of the reasons is Chroma Subsampling (for the other reason, see xvYCC at the bottom of this post).

Chroma Subsampling is extremely common. Practically every video you’ve ever watched on a computer or other digital video player uses it. As does the JPEG file format. That’s why we GameDevs prefer formats like PNG that don’t subsample. We like our source data raw and pristine. We can ruin it later with subsampling or other forms of texture compression (DXT/S3TC).

In the land of Subsampling, a descriptor like 4:4:4 or 4:2:2 is used. Images are broken up into 4×2 pixel cells, and the descriptor says how much color (chroma) data is kept. 4:4:4 is the lossless case: full color for every pixel. Chroma Subsampling uses the YCbCr color space (sometimes called YCC) as opposed to the standard RGB color space.

Great subsampling diagram from Wikipedia, showing what the different encodings mean

Occasionally the term “4:4:4 RGB” or just “RGB” is used to describe the standard RGB color space. Also note, Component Video cables, though they are colored red, green, and blue, are actually YPbPr encoded (the Analog version of YCbCr).

Looking at the first diagram again, we can make a little more sense of it.

Comparing HDMI 1.4 (Black text) and 2.0 (Orange). Source: HDMI 2.0 FAQ on HDMI.org

In other words:

  • HDMI 1.4 supports 8bit RGB, 8bit 4:4:4 YCbCr, and 12bit 4:2:2 YCbCr, all at 24-30 FPS
  • HDMI 2.0 supports RGB and 4:4:4 in all color depths (8bit-16bit) at 24-30 FPS
  • HDMI 2.0 only supports 8bit RGB and 8bit 4:4:4 at 60 FPS
  • All other color depths require Chroma Subsampling at 60 FPS in HDMI 2.0
  • Peter Jackson’s 48 FPS (The Hobbit’s “High Frame Rate” HFR) is notably absent from the spec

Also worth noting, the most well supported color depths are 8bit and 12bit. The 12 bit over HDMI is referred to as Deep Color (as opposed to High Color).

The HDMI spec has supported only 4:4:4 and 4:2:2 since HDMI 1.0. As of HDMI 2.0, it also supports 4:2:0, which is available at HDMI 2.0’s 60 FPS framerates. Blu-ray movies are encoded in 4:2:0, so I’d assume that’s why it was added.

All this video signal butchering raises the question: which is the better trade-off? More color range per pixel, or more pixels sharing the color data?

I have no idea.

If I were to guess though, because TVs aren’t right in front of your face like a computer monitor, I’d expect 4K 4:2:2 might actually be better: greater luminance precision, with a bit of chroma fringing.

Some screens (certain LCDs, and many AMOLEDs) use something called a PenTile matrix arrangement of their red, green, and blue subpixels.

The AMOLED screen of the Nexus One. A green for every pixel, but every other pixel is a blue, switching red/blue order every line. Not all AMOLED screens are Pentile. The Super AMOLED Plus screen found in the PS Vita uses a standard RGB layout

So even if we wanted more color fidelity per individual pixel, it may not be physically there.

Deep Color

Me, my latest graphics fascination is Deep Color. Deep Color is the marketing name for more than 8 bits per color channel. It isn’t necessarily something we need in asset creation (not me, but some do want full 16bit color channels). But as we start running filters/shaders on our assets, stuff like HDR (but more than that), we end up losing the quality of the original assets as they are re-sampled to fit into an 8bit RGB color space.

This can result in banding, especially in near flat color gradients.

From Wikipedia, though it’s possible the banding shown may be exaggerated

Photographers have RAW and HDR file formats for dealing with this stuff. We have Deep Color, in all its 30bit (10 bits per channel), 36bit (12 bits per channel) and 48bit (16 bits per channel) glory. Or really, just 36bit, but 48bit can be used as a RAW format if we wanted.

So the point of this nerding: An ideal display would be 4K, support 12bit RGB or 12bit YCbCr, at 60 FPS.

The thing is, HDMI 2.0 doesn’t support it!

Perhaps that’s fine though. Again, HDMI is a television spec. Most television viewers are watching video, and practically all video is 4:2:0 encoded anyway (which is supported by the HDMI 2.0 spec). The problem is gaming, where our framerates can reach 60FPS.

The HDMI 2.0 spec isn’t up to spec. 😉

Again this is probably fine. Nobody is really pushing the now-current generation of consoles as 4K machines anyway. Sony may have 4K video playback support, but most high end games are still targeting 1080p and even 720p. 4K is 4x the pixels of 1080p. I suppose it’s an advantage that 4K maxes out at 30FPS right now, meaning you only need to push 2x the data to be a “4K game”, but still.

HDMI Bandwidth is rated in Gigabits per second.

  • HDMI 1.0->1.2: ~4 Gb
  • HDMI 1.3->1.4: ~8 Gb
  • HDMI 2.0: ~14 Gb (NEW)

Not surprisingly, 4K 8bit 60FPS is ~12 Gb of data per second, and 30FPS is ~6 Gb. Our good friend 4K 12bit 60FPS though is ~18 Gb, well above the limits of HDMI 2.0.
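If you want to check the math yourself (dividing by 1000³, not 1024³):

    3840 × 2160 pixels × 24 bits × 60 frames ≈ 11.9 Gb/s
    3840 × 2160 pixels × 36 bits × 60 frames ≈ 17.9 Gb/s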

To compare, DisplayPort:

  • DisplayPort 1.0 and 1.1: ~8 Gb
  • DisplayPort 1.2: ~17 Gb
  • DisplayPort 1.3: ~32 Gb (NEW)

They’re claiming 8K and 4K@120Hz (FPS) support with the latest standard, but 18 doesn’t divide that well into 32, so somebody has to have their numbers wrong (admittedly I divided mine by 1000, not 1024). Also, since 8K is 4x the resolution of 4K and the bandwidth only roughly doubled, practically speaking DisplayPort 1.3 can only support 8K 8bit 30FPS. And that 4K@120Hz is 4K 8bit 120FPS. Still, if you don’t want 120FPS, that leaves room for 4K 16bit 60FPS, which is more than needed (12bit suffices). I wonder if anybody will support 4K 12bit 90FPS over DisplayPort?

And that’s 4K.

1080p and 2K Deep Color

Today 1080p is the dominant high resolution: 1920×1080. To the film guys, true 2K is 2048×1080, but there are a wide variety of devices in the same range, such as 2560×1600 and 2560×1440 (4x 720p). These, including 1080p, are often grouped under the label 2K.

A second of 1080p 8bit 60FPS data requires ~3 Gb of bandwidth, well within the range supported by the original HDMI 1.0 spec (though why we even had to deal with 1080i is a good question, probably due to the inability to even meet the HDMI spec).

To compare, a second of 1080p 12bit 60FPS data requires ~4.5 Gb of bandwidth. Even 1080p 16bit 60FPS needed only ~6 Gb, well within the range supported by HDMI 1.3 (where Deep Color was introduced). Plenty of headroom still. Only when we push 2560×1440 12bit 60FPS (~8 Gb) do we hit the limits of HDMI 1.3.

So from a specs perspective, I just wanted to note this because Deep Color and 1080p are reasonable to support on now-current generation game consoles. Even the PlayStation 3, by specs, supported this. High end games probably didn’t have enough processing to spare for this, but it’s something to definitely consider supporting on PlayStation 4 and Xbox One. As for PC, many current GPUs support Deep Color in full-screen resolutions. Again, full-screen, not necessarily on your Desktop (i.e. windowed). From what I briefly read, Deep Color is only supported on the Desktop with specialty cards (FirePro, etc).

One more thing: YCbCr (YCC) and xvYCC

You may have noticed when watching a video file that the blacks don’t look very black.

Due to a horrible legacy thing (CRT displays), data encoded as YCbCr uses values from 16 to 235 for luma and 16 to 240 for chroma, instead of 0 to 255. That’s quite the loss, roughly 12% of the available data range, effectively lowering the precision below 8bit. The only reason it’s still done is compatibility with old CRT televisions, which can be really tough to find these days. Regrettably, that does mean both the original DVD and Blu-ray movie standards were forced to comply with this.

Sony proposed x.v.Color (xvYCC) as a way of finally forgetting this stupid limitation of old CRT displays, and using the full 0->255 range. As of HDMI 1.3 (June 2006), xvYCC and Deep Color are part of the HDMI spec.

Several months later (November 2006), the PlayStation 3 launched. So as a rule of thumb, only HDMI devices newer than the PlayStation 3 could potentially support xvYCC. This means televisions, audio receivers, other set top boxes, etc. It’s worth noting that some audio receivers may actually clip video signals to the limited 16-235 range, thus ruining the picture quality of an xvYCC source. Also, the PS3 was eventually updated to HDMI 1.4 via a software update, but the only 1.4 feature supported is Stereoscopic 3D.

Source: Wikipedia.

The point of bringing this up is to further emphasize the potential for color banding and terrible color reproduction over HDMI. An 8bit RGB framebuffer is potentially being compressed to fit within the limited YCbCr range before it gets sent over HDMI. The PlayStation 3 has a setting for enabling the full color range (I forget the name used), and other new devices probably do too (though likely not under the name xvYCC).

According to Wikipedia, all of the Deep Color modes supported by HDMI 1.3 are xvYCC, as they should be.

Linux Device Input Notes

Wednesday, January 8th, 2014

I can use the following to list all attached USB devices.
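    lsusb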

The output will be something like the following.
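    ...
    Bus 001 Device 009: ID 045e:02d1 Microsoft Corp.
    ...

(That ID is from memory, so don’t hold me to the exact numbers; the other devices are elided.)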

Take note of Device 009, i.e. “Microsoft Corp.”. That’s an Xbox One controller plugged into a USB port.

(Also FYI, the device known as “Logic3” is a RockCandy brand Xbox 360 controller)

I can retrieve some data about the controller as follows. One way to do this (not necessarily the only one) is to hex-dump the raw device file that udev exposes:
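    sudo xxd /dev/bus/usb/001/009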

Bus 001, Device 009.

It is not controller data though. Pressing buttons does not update any of the values.

Using ‘lsusb -v‘ provides an interpretation of all attached devices (i.e. if you follow along, you’ll note the data above is the same as the data below).

An article on creating Linux Drivers (kernel modules).

lsmod can be used to list all currently loaded kernel modules.

The other option is communicating using usbfs, but usbfs is legacy and no longer enabled on Ubuntu. Instead, Ubuntu uses udev. udev is what’s going on in the /dev/ folder: a dynamically generated file system of attached hardware.

If I do the following:
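    sudo cat /dev/input/js0 | xxd

(The exact device node varies; poke around /dev/input/ to find your controller.)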

I’ll get a realtime stream of raw input data.

That’s all. Just thought this was interesting.

Quadruped Animation Test

Monday, January 6th, 2014

Yeah, the list of things needed to make that animation better is extensive (it’s so floaty). Still, it’s the first time I’ve ever animated a quadruped, and the first time I’ve ever seriously animated something in Spine.

Hello 2014. The year of #MRK. Plus a 2013 retrospective

Wednesday, January 1st, 2014

Time to start the year right. After much folly and a good chunk of contract work last year, I started working on a new project. Some of the earliest efforts can be seen in the Quack tech demo video. No gameplay to see there, but you can see one of my prouder achievements in action: Live Coding/Live Content. Change a source file or an asset, and it updates in game immediately.

I really like HTML5, but JavaScript as a language is evolving way too slowly. That’s why I gravitated towards Squirrel. It has all the good parts of JavaScript, and few of the bad parts. It does a few little things differently, but there is a method to the madness. Overall I’m extremely happy with the language. There are some things that would make debugging easier, but it’s still very strong. Combined with the Live Coding workflow, you can make changes quickly to isolate issues.

In the grand scheme of things, I do feel somewhat horrible that my last real game came out in late 2008. I did spend a few years after that porting the game to other platforms, and I have released several smaller games (Ludum Dare Jam Games), but I am frustrated that I’m still best known for a casual game released 5 and a half years ago.

* * *

Actually that’s not true. I’m best known for Ludum Dare, which has been a lot of fun. But I’ve been developing games professionally since 1999; I left my job so many years ago to create something great, not to be a community manager.

That said, Ludum Dare is great. I certainly plan to keep running it, because it’s important to a lot of people, myself included. Even if all I ever do for Ludum Dare again is schedule events and push the buttons, it still makes an impact. The look of the website may be old, clumsy, ugly, and the rules may be unrefined and confusing, but the message and the intent are still there: inspiring people.

Earlier this year, I was discussing with the folks behind Global Game Jam and Molyjam the idea of putting together a GDC talk about Game Jams. That didn’t happen (partially my fault, as I was too busy with contract work and had very little free time to help out). An interesting point came up that, to be honest, I was a bit too speechless to respond to: apparently the other events don’t have enough “Pro” gamedevs participating.

To me that’s weird. *cough* It might be because all the “Pros” are doing our event instead (oh burn!). But seriously, I think maybe we just have different opinions of who the “Pros” are. As a developer that “did his time” at AAA game studios, and chose to go Indie, I umbrella everyone that does good work on both sides as a pro. I know a lot of gamedevs, and while it’s not a requirement that you have entered Ludum Dare to know me, I’d say most gamedevs I know *have* participated in a Ludum Dare event (and almost everyone has done at least one game jam). Almost everyone I know I consider a Pro. Maybe my expectations were just lower, but Ludum Dare has certainly achieved way more than I ever expected it to. We used to get excited about just 100 people participating. 😉

To me, Game Jams are about inspiration. We might not be aiming for Guinness World Records, but for damn sure we’re inspiring people.

* * *

Admittedly, I’m not putting much effort in to improving Ludum Dare. It’s more, if there are problems, I’ll fix them. Ludum Dare takes up a lot of my time. It is distracting though, no matter how much I like it, and hoo-boy I’m easily distracted. 😉

So, I try to think of Ludum Dare as my hobby.

One thing that’s kind-of popped up from Ludum Dare is the idea of Live Streaming game development. I think this is extremely cool. It’s something I personally want to do more of. And as far as the internet is concerned, aside from us there really is no hub of gamedev Live Streaming.

So me personally, hobby speaking, I would love to do more to facilitate the gamedev streaming community. The widget I made is somewhat small, only fitting 4 streams. That’s fine for us developers, but I would love to have a proper hub for viewers that want to see games made. So in the back of my mind, I have this idea of a tv.ludumdare.com website I would like to create. None of the fat of the compo website, just our livestreams. A nice easy URL you can give viewers that just want to watch games made.

In theory, it probably should be a combination of our Twitch Live Streams, and the various Youtube videos people have posted (Timelapses).

Honestly, it’s something I don’t know if I have the time to do, but it’s something I would love to have. Ludum Dare TV.

* * *

A couple other things I did, before we get to the meat of this post.

The first, a relatively minor thing: I moved this very website (my blog) to a different, cheaper webhost (Namecheap). I used to host it on Hostgator, where I was spending ~$10 a month. As it stands today, this is a very low-traffic website of mine. Again, nothing cool to show. So I decided to save myself a few dollars by moving. Back at the end of November, Namecheap had some flash-deals for Black Friday. Long story short, I bought 2 years (YEARS) of hosting for ~$21. Since it took time to migrate everything over, I won’t have my regular hosting costs covered until February (oh darn). But then ya, assuming this website doesn’t get busy, I basically just saved $220 over 2 years. Yes it was minor, perhaps not worth the time, but I’m cheap. I like saving money.

On the flipside, I bumped up my internet speed.

Before

After

So for a few dollars more per month, I now have 20x the upload I used to. Yes, I’m well aware some people have numbers that eclipse mine (damn you Jonny!), but having lived with 0.6 Mb for so long, it’s like a dream. I can now Live Stream in outstanding quality. 😀

And to throw a wrench into that Live Streaming, ha, I just switched to Linux.

Hey wait! That’s Windows 7! Oh weird, how’d you get that sidebar?

This was a long time coming. I’ve been a Windows user for gawd knows how long (20 years?). But at the same time, ironically, I’ve been a Linux/Unix developer. I’ve been using GCC since the late 90’s (DJGPP YO!), I prefer makefiles to project files, Cygwin and MSys, and I use a glorified text editor instead of an IDE (UltraEdit). Plus, for the past few years, I’ve always had an Ubuntu VM on my work machine. So it was bound to happen eventually. I just had to wait for all the stars to align.

Obviously, as an iPhone Dev, I tried the Mac route, but I just couldn’t do it. Both Microsoft and Apple are notorious for doing things that break things for developers. At least in Apple’s case, they stopped charging for the OS. I do like Apple though; I’ve made some pretty good money from them. It’s just that the Apple’isms don’t feel right to me. At my core, I’ve always been more of a Unix developer. That means scripts and madness. So switching to Linux feels right to me.

I’d be lying if I said the process of switching to Linux went smoothly. I have a whole super-long post dedicated to the steps I took to get various software working on this laptop of mine. But despite the difficulty researching and getting things working, it does feel oddly rewarding. Switching to Linux IS NOT for the faint of heart. But if you’ve been using computers as long as I have, and find them stagnant and uninteresting, it can be very refreshing to jump head-first into something like this.

Similarly, about 10 years ago I switched to the Dvorak keyboard layout (from QWERTY). It honestly took me a good year or so until I was comfortable typing again, but it had the added benefit of teaching me to type properly. Growing up, I created my own very strange freestyle typing style. I was fast, but I dunno, haha, something felt kinda wrong about it. My point though: it was an interesting challenge. Like me with Linux right now, it kept things interesting. A way of challenging myself to work differently. More efficiently? I have no clue. But it was definitely more interesting.

As things stand, I think I have everything figured out and working now on my Linux install. I’m using the latest OpenGL drivers (newer than what’s in Ubuntu 13.10). I can build for all my currently supported platforms (Linux, Windows, Android). I can make and export usable assets in both Blender and Spine. I’ve learned that GIMP is less terrible if you switch to Single Window Mode. I’ve gleefully learned that when your host OS is Linux and you run Windows in a VM, you can reboot that VM MOFO as much as you want, and your music doesn’t have to stop playing!! F yeah! It’s the little things like that which are making this process fun.

Things I have lost out on: being able to fullscreen a Youtube/Twitch video on a separate monitor while doing something else. Adobe basically stopped making Flash for Linux back in 2012, and it looks like this never got fixed. It’s unfortunate, but it’s mostly a slacking thing anyway. If I really want to work and watch, I do have another PC in the other room that I can always play something on.

Another thing: I have a large library of VST instruments. VSTs, for the most part, run on Windows. Linux isn’t really well known for its audio production skills, but things like Renoise do run on Linux. That said, I should still be able to run Renoise in my Windows VM. I haven’t tried it yet, but it is something I know won’t be as elegant now.

And finally, I have taken a GPU performance hit switching to Linux. On Windows I get all OpenGL 3.1 features, plus a good chunk of GL 3.2 and 3.3 features as well. On Linux, thanks to a driver released a month ago, I get OpenGL 3.0 features (plus a few more). Phoronix has been doing some outstanding work benchmarking and comparing driver performance. Thanks to them, I knew what to expect. And thanks to the SteamBox and SteamOS craze, this is only going to get better. I’m a little sad to have lost GPU features moving to Linux, but I’ll manage. This is probably the one thing that would have kept me from switching, but because what I get is still close to what I had, I went ahead.

Oh, and I guess the fonts don’t feel as nice as they did on Windows and Mac. Minor complaint, as this only really bothered me the first couple days.

That’s all that comes to mind.

* * *

Okay, on to the meat! Let’s start with this.

So yeah, a self-serving tweet basically declaring my intent. For me, this year is all about #MRK. It’s an abbreviation of the true name of the game I’m working on. Like I said, this is a brand new game I started just a few months ago.

For me, I’ve been trying to work on a game called “Alone, The” for the past few years. As Indie game developers, we often talk about the idea of creating very personal games. For me, “Alone, The” was this. It’s a concept I absolutely love, but to truly realize the vision of the game, there’s one big problem: I have to make it alone. It’s totally silly, but over the years “Alone, The” evolved into a game whose work I literally cannot share. There’s a distinct sound, vibe, mood, everything about it. There are strong inspirations from my childhood, nothing I’m afraid to share with people playing the game, but a very specific set of feelings I’m looking to create: the excitement of gaming to me as I was growing up. We as Indie Developers are directly tied to the games we create. The games we create define us. And I, somewhat proudly, found “me, the game”. Regrettably though, I’ve discovered I lack the time and the vocabulary to create “me, the game” right now.

So really, I had to shelve it. It’s the one constant these past few years that I could point to. I’ve come up with a lot of concepts for games I like, but my passion was always in “Alone, The”. It is time to move on though.

#MRK was born out of a look at my own gaming habits. I literally sat down and forced myself to play games, just to see what I even liked anymore. That’s something crazy that’s come up lately: despite how passionate I am about games, as the years go on I’m playing less and less of them. I needed a reboot. I thought I knew what I liked, but I had to sit down to find out if that was still true. Perhaps my tastes have changed?

After a few weeks of playing, I definitely wasn’t wrong about my tastes, but I’d forgotten about a lot of things. As it turns out, I’m a lot more forgiving and fanatical about games than I thought. I assumed I didn’t like grinding in RPGs, or working hard for something, but as it turns out I’ll do some really repetitive stuff if the reward is worthwhile. Heh, gaming masochism. 😉

As it also turns out, I really like tweaking (not twerking). I actually really like looking at my play style, changing something, either how I play or a piece of equipment, and finding my best balance. I like strategizing. I like little challenges that get me thinking. I couldn’t care less about puzzles (basically, pure mental challenges); what I care about is results, not the mental challenge itself. So again, in a small way, I like strategy.

That said, while I do really like games like StarCraft and Civilization, I don’t think that’s specifically my niche. I care a bit about the macro game, but I think I only really have the patience for the micro game. Tweak and adjust one thing, or a small group of things.

So these ideas helped shape #MRK. Also the fact that I really wanted to do a 2D platformer again. To compare, “Alone, The” is a top-down action adventure game (where part of my struggle was how 3D to make it). There’s definitely a lot more to it than this.

Most importantly though: It’s a game I can share.

About a month ago, I had lunch with my good friends HalfBot. I did something as simple as sharing the idea with them, and HOLY F*! Just one little offhand comment made a WORLD of difference. It took this weird piecemeal amalgamation of many ideas and refined it into an easy, concise idea. I’m not going to say the magic word just yet, but suffice to say, it makes so much sense. It’s an almost obvious idea, but the execution is somewhat … okay, very difficult.

So I am excited. “Alone, The” was a game, to be honest, I wasn’t even sure would excite people other than me. This one I know will excite people. Hell, I saw it excite some friends at the lunch table. The question is now: can I pull it off?

.. Fuuuahhh… well we’re going to find out!

Hello 2014, lets do this! #MRK 2014!