People cannot think unless they are free to think. Government rules and regulations literally prevent thought and prevent experimentation. A free market is a massive experiment in competing ideas, the most productive of which win out. Most of the experiments fail, but even failed experiments lead to better understanding. When the intellectual elitists stop the experiments because they are smarter than the rest of us and know what is in the “public good,” the learning stops – witness the Soviet Union. By now, the elitists should know better. Often, they use the public good as an excuse for their own lust for power. Those of us who have had to face government bureaucrats often see the lust for power as the true motivation and the “public good” as the bureaucrat’s rationalization.
—  John Allison, The Financial Crisis and the Free Market Cure

The son of a mechanic from Baie-Comeau, Prime Minister Brian Mulroney takes on Liberal leader John Turner in this head-to-head exchange in 1988 on CBC’s The National. The issue is free trade. Mulroney, representing the neo-Conservative movement of Thatcher and Reagan in Canadian political circles, had proposed free trade with the United States. John Turner, although a blue Liberal representing business within his own party, had taken a stance against free trade.

This is a little gem that you will never see in Canadian politics again (it happened a few times with Turner and Mulroney, for example, during the 1984 federal election debate). The Prime Minister lays out a passionate argument for his patriotism (which was subtly being challenged by Turner) and his economic case for why free trade with the US was good for Canada. Turner offers up an equal share of passion in defence of Canada and her free enterprises at home, fearing (and blatantly showing that fear of) the US giant to the south.

Voluntary Trade Is Never A Zero-Sum Game, Including The Purchase Of Movie Tickets

I’m not rewarding the producers of a movie by purchasing a ticket because trade is not a zero-sum game.  I value the experience I expect to receive when I buy that ticket more than the money I am paying for that ticket.  That is capitalism.  I am not buying a movie ticket to reward the producers of a movie as part of some altruistic gesture for making a movie.  I am buying a ticket because I expect to receive a benefit that I subjectively value more than the price of the ticket.  The first time I see a movie, I am making my purchase based on extremely limited information so I may or may not regret my decision after I have more information.  That is also part of capitalism.  The price system is the mechanism that rewards or punishes producers for what they produce in the free market. I focus on my own individual preferences to make my own decisions, not how the Invisible Hand will ultimately translate the decisions of all moviegoers, theater owners, and movie producers into a profit or loss for the producers of any single movie or the owners of any single theater.

People who dismiss the unemployed and dependent as ‘parasites’ fail to understand economics and parasitism. A successful parasite is one that is not recognized by its host, one that can make its host work for it without appearing as a burden. Such is the ruling class in a capitalist society.
—  Jason Read
Indie, Addendum (or: why I benefit from GamerGate's opposition)

I know, I know, I said I wasn’t going to talk about GamerGate anymore. So instead: let’s talk about something else. Let’s talk about Daggerfall.


This is Daggerfall. It was the second game in the Elder Scrolls series - the sequel to 1994’s Arena and predecessor to 1997’s Battlespire.

You don’t see people talking about Daggerfall much, but it had a lot of interesting concepts that set it apart from the series’ later titles. The scale was enormous - the width of Cyrodiil in Oblivion is just a bit shorter than the distance between two of Daggerfall's 15,000 towns. Dungeons were not made to be navigable - they were confusing, winding messes where there was a good chance you would have to give up without ever finding the artifact/person you were looking for. On top of this, quests were timed - waiting around to heal or getting roped into a sidequest wasn't viable, since you were only given a certain number of days to complete a mission and a good number of those would be dedicated to sleep or travel. If you failed a quest, it would reduce your reputation with factions or make the main storyline unwinnable.

Daggerfall was a game from a different era. Its production values weren’t high: it was made by twenty people, around half of them credited as just “additional art” (which in my own credits is codeword for “drew one sprite”). It was released one year after its predecessor - for comparison, I’ve been working on my 2013 Seven Day Roguelike entry for longer than that. The game was also very punishing - unless you had looked at the 100-page manual and gotten a good grasp on its mechanics, you probably weren’t going to make it out of the starting dungeon. You’d get killed by a rat.

But that’s the thing: Daggerfall was made for a certain kind of person. It came from an era where games were made for gamers. Like a child whose family could only afford one toy for Christmas, there was an expectation that the people who bought games were going to care about them. They were going to explore them, learn their nuances, use their imagination and find depth where others couldn’t. Games were meant to release their content over weeks, not hours. I once saw an interview with an early adventure game developer where he explained that, in his era, adventure games weren’t something you plowed through in one sitting - they were something you’d have to put down for a while, and think about the puzzles as you went through your day. They weren’t for a casual audience - much like Ulysses is only read by literature aficionados who care enough to dissect its nuances, games were only played by gamers.

With Daggerfall in particular, that dedication was required. It could take weeks or months of playing before you began to see the scope of the main plot. You’d have to save up hundreds of thousands of gold pieces to even speak to a Daedric Prince and get their quest. You were encouraged to do frivolous things for your own fun, like buy casual outfits for your character to wear when in town. I found this beautiful relic of a roleplaying guide, pointing out things like how you can be a “Knight of the Dragon” by wearing fullplate and refusing to attack dragonlings. It’s like watching a grown man play with dolls. It’s awesome.


At some point, games lost that. I’m not going to say games turned bad, because I love games like Skyrim - but I love them in the same way I love The Avengers or Gravity. They’re high-budget quick fun, but after a few hours they’ve shown all their cards. Developers aren’t going to put huge amounts of effort into some critical twist that comes after a month of playing because they know most of their playerbase won’t be that dedicated. And with a team of 100 people working on a game for three years, you need to target the biggest audience you can to recoup losses. A niche won’t do it.


This is how “gamers” became a demographic separate from “people who play games”. As games grew to be accessible enough that the average person could throw a few hours into them, gamers became a niche market of users who cared about games enough to explore their full depth. Mainstream games continued to rope them in with things like optional high difficulty levels or weapons that have a high skill ceiling that can only be reached after weeks of practice. Indie games, on the other hand, found a niche market that let them survive in the shadow of mainstream games: comically difficult platformers and heinously deep RPGs appealed uniquely to this demographic that wanted to care about their games. 

This is important to understand: indie games, in their earliest years, were not conceived as art. They were a business. We didn’t have these short three-hour rides showing a creative new mechanic that define indie games today: rather, we had games that survived by niche marketing to a small audience - typically gamers, since they were the only ones who would care enough to seek the game out. They were also the ones who would continue to support you - if you made a game that targeted their rare tastes, you could bet they’d be back for the sequel rather than simply moving on to the next popular FPS. Jeff Vogel has been surviving off this mentality since 1995, milking his tiny little squadron of fans - many of whom seem to remember the original Exile. In his latest blog post, he even spells it out: 

Indie is a type of business. It’s a type of funding. It’s a marketing term. In fact, the term ‘indie’ can mean everything but a type of game.

And yet, today indie has become something different. It became about art. With the internet and later Steam, niche marketing to gamers was no longer a necessity for indie devs. They could compete in a mainstream market, as long as they could keep their production expenses down. Thus, you got little games: short one-hour romps, gimmicky iPhone toys, and art games relying on the presentation of a single creative idea. You got what Vogel calls “the Indie Bubble” - this idea that the market has become completely diluted. It became a roulette game: people would make a creative thing and hope to hit it big. If you were friends with advertisers, reporters, or award judges, you could give yourself an edge.

And just like that, indies unknowingly reinvented the publisher. Success as an indie developer no longer pivoted on connecting with a niche audience, but on working your way into a group of advertisers. If the group thought your ideas were good - that they’d make a decent return - they would pay you to make games. It wasn’t about hitting the niche audience you had built or discovered, but rather about working with someone who could advertise you to the non-gamer mainstream. 

It’s what planted the seeds for “GamerGate”, and eventually this mass declaration by news sites that “gamers are dead”.

The Death of the Gamer (and why I benefit)


Imagine, for a moment, that the Coca-Cola company suddenly became very health-conscious. Obesity in America, they decided, is a big problem. That day, they announce that from now on they will be producing nothing but vegetable juice. They will now be a competitor to Campbell’s V8 Juice. Not only that, but they publicly announce that all soda-drinkers are trash, and if they want to be accepted anywhere they will need to learn about healthy eating habits. Health groups are overjoyed and commend Coca-Cola on the decision. But do you know who’s even happier about this?

Motherfucking Pepsi.

I was introduced to indie development through the Old Ways. I don’t talk about it much, but my first steps into game design came when I was copying Exile off one of my dad’s shareware disks and discovered that all its visual resources were stored externally. Summoning up my mad Kid Pix skills, I drew my own character overtop the default male fighter. When I saw him in the actual game’s party-builder screen, I was enthralled. 

I never had a game console growing up - all my money went to Very Important Legos. My introduction to gaming was through shareware games distributed by companies like Spiderweb Software or Epic MegaGames. I never grew up seeing games as something high-budget that you needed dozens of people and a publisher’s backing to produce. The biggest hurdle was that you needed this mythical device called a CD Burner that cost well over 1000 dollars - enough to buy nearly six copies of that awesome Unitron monorail I wanted for Christmas. I didn’t grow up seeing games as coming from these big companies like Nintendo or Sega, but from these smaller individuals like Jeff Vogel or Cliff Bleszinski. They didn’t put a dozen copies of their game on shelves, but rather seeded the world with free shareware to find that niche that liked their product for what it was. They could get away with making weird shit because they had us. They had gamers, who cared enough to fill out that order form. They reached out to us like friends because we were theirs, and they were ours.


"Gamers are dead" is a stupid-ass thing to say. I don’t mean "stupid" as in it’s wrong - though it is - but stupid in that it’s a self-defeating business move. 

Sure, it’s true that normal people play games. I was at a bar earlier today - to my left, a grizzled man in a Ravens jersey discussed the math behind how he was going to defeat his Fantasy Football rival, and to my right, a woman had her collection of riddle-emblazoned beer bottle caps (I helped her figure out one involving a picture of France, which I admittedly only recognized thanks to Francis the Talking France). These people probably wouldn’t consider themselves gamers, but like many other people in our culture they can enjoy the occasional game.

But at the same time, non-gamers are not the people reading gaming news sites, or buying short little art games because they won indie awards. They’re the people buying the latest popular shooter they see advertised on TV, and yelling “you cocksucking whore!” into their mic when a gamer headshots them.

Alienating gamers is a stupid-ass move. For a gaming news site, it consigns their audience to the people with the least investment in their content. For a developer of small indie games, it deprives them of the two demographics that would possibly hold them in higher regard than mainstream media - the remaining group being the notoriously broke artists. I’ve even seen an article declaring that, in addition to gamers being dead, games should no longer strive to be “fun” - and instead, developers’ primary goal should be to create games that can teach good lessons and encourage positive social behavior. The Coca-Cola-switching-to-vegetable-juice analogy runs deep. All of these people are completely fucked.


But you know who’s not fucked? I’m not fucked. The old-guard indie developers - the ones who knew how the industry worked before you could buy a game for a dollar while sitting on the toilet - are not fucked.

Let’s be honest here: when we stand up for GamerGate, we’re doing it for sites like Kotaku who don’t know how to be writers or people like Zoe Quinn and Tim Schafer who know shit all about being an indie. We’re trying to maintain an industry where these people are not automatically fucked. You ask a random person on the street if they’ve heard of Broken Age, you’re going to get nothing. When you finally find someone who says “yes”, go ahead and ask if they’re a gamer. These newbie indies don’t even realize who their audience is, and I doubt they’d last a second if their de-facto publishers ever pulled the tit from their mouth.

But you, me - the Jeff Vogels and the Rich Burlews and everyone in between - this doesn’t hurt us. If anything, we stand to gain from it - groups stigmatized by society become all the more likely to look up to or help those who treat them as equals. In the end, all we’re really doing with GamerGate is protecting idiots. And I mean, that’s fine! That is a just cause; nobody deserves to get hurt, even a creator who attacks their audience. But the more I think about it, the less crushed I’ll be if the opposition ultimately prevails. I’ll be sad, until I realize how much money I’ll be getting.


I ran into an old friend this week - one of the most chipper and upbeat people I had ever known. They’d had a bit too much to drink, and admitted to me that when we first became friends in high school they were being sexually abused and raped at home. It went on for years before they told someone, and even longer before it was actually taken care of.

I guess on some level it reminded me of why I originally started caring about GamerGate. I want to see Zoe Quinn at least respond to some of the more serious allegations against her. Maybe refute them, maybe apologize, but at least stop attacking people or painting them as villains for bringing it up. I want to see her friends and the gaming press acknowledge that the things they said in defense of her got pretty horrible at times.

But I don’t know if I’m actually scared. I mean, sure, I’m probably on Kotaku and Rock Paper Shotgun’s blacklists by now, but as someone whose work has been praised on those sites and was able to analyze the benefit incurred, I can say that it’s not that big of a deal. I mean, fuck, I’ll even show you:


Rock Paper Shotgun, after a praising review that told everyone to read my webcomic, sits down in a comfortable 10th place, making up 3.38% of my direct referral traffic since the review was posted. Ahead of it are three social media sites (Facebook, Tumblr, and Reddit), two communities I used to post in (MSPAForums, Something Awful), two art sites where I can only be linked in individuals’ journals and image descriptions (Deviantart, Furaffinity), a wiki (TVTropes), and a webcomic (Three Panel Soul). The 5000 clicks when the Rock Paper Shotgun article was first posted were nice, but the bulk of my traffic comes from communities. It comes from the people who respect me or my work, or the niches like MSPAForums or Furaffinity who it connects to specifically.

And you know who’s in a close 11th place behind Rock Paper Shotgun?

Motherfucking 4chan. A positive reputation with 4chan is worth almost as much as a glowing mention on a news site. And things that are worth even more than journalists: furries and webcomic authors. Welcome to indie.


So, yeah. GamerGate is a good thing; I want to see it succeed, gaming journalists get standards, bad people face justice, that kind of thing. But in the end we have to remember to put it in perspective: your reputation with Furaffinity is worth more than your reputation with gaming journalists. Rather than working my way into some nepotistic indie clique or standing up for some wannabe feminist icon, I’m just going to post this picture of a shirtless argonian. They can never blacklist me from cold, scaly pecs.

GamerGate failing will only make me and people like me stronger. The damage has already been done: people are offended; hundreds if not thousands of individuals are getting their gaming news from other sites and being more cautious about believing info from the big ones. A bunch of game designers have been complete assholes to their audience. And then, on the flipside of this, you’ve got me. And you’ve got the old indies. You’ve got the people who care about gamers. Not only will we make things you love, but we think you rock.

Toss your Coke-branded vegetable juice and crack a Pepsi. This is the beginning of a beautiful friendship.


Frameworks | Sam Laughlin

Across Europe one finds buildings that lie unfinished; some are skeletal in form and purpose. These concrete forms represent a stage in the architectural process that, in their case, may never be completed. Here we see architecture paused: construction has ceased and we are left with the bones of buildings in stasis.

In this state, an architectural lineage is revealed by their resemblance to the remains of classical structures. Incomplete for economic and political reasons, they become ruins of modernity.

The UK has always been an imperialist construct, set up to protect and further the interests of the rich. There was a brief period after the Second World War when it sought to be something more. So we had the emergence of a post war consensus and the welfare state.

I was a beneficiary of that consensus. I took evening classes at the local college for a pound, had my university fees paid, obtained a full student grant, and benefited from universal healthcare. For the social equivalent of me today, making this progress would be impossible without accruing a lifetime of debt and becoming no better than a slave – fuck that bullshit. All that has now gone, and the Labour Party will not be bringing it back. Tony Blair and Gordon Brown were no aberrations; they were the natural progression of a movement that has ‘evolved’ from its radical roots into a centre-right focus group-driven party of power. Now, on a policy level, they chase middle England votes, while lecturing working class people on their ‘duty to vote’ (Labour), in order to ‘keep the Tories out’.

—  Brave and honest words from Irvine Welsh, taken from this article (8 Sept 2014) on Scotland, Britain, and the upcoming independence referendum.
The fundamental law of capitalism is: When workers have more money, businesses have more customers. Which makes middle-class consumers—not rich businesspeople—the true job creators. A thriving middle class isn’t a consequence of growth—which is what the trickle-down advocates would tell you. A thriving middle class is the source of growth and prosperity in capitalist economies.

*sigh* Is it Explain Why Grossly Oversimplified Info-graphic is Grossly Oversimplified Time again? Ah well, here it goes. Ahem: 

Once again, the social media social justice brigade has tackled a symptom instead of the disease. Repeat after me: It’s not as simple as raising minimum wage.

The disease is inflation and the lowered value of the dollar. Twenty dollars could easily feed a large family 50 years ago; now it’ll barely get you a few groceries for one person. Finding sustainable sources rather than spending what we don’t have is the only way to tackle inflation, which will make any money you earn go farther without the unwanted side effect of making it harder for small businesses (the real heart of American industry) to hire people because of increased wage requirements. One possible idea (and I’m sure it has some issues I haven’t thought through, this is just off the top of my head) is to adjust minimum wage based on what a company earns. That way the laws wouldn’t hurt small businesses that want to be able to hire people but can’t afford it, while still requiring larger earners to give their employees fair wages.
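The off-the-top-of-my-head idea above, a minimum wage scaled to what a company earns, could be sketched something like this. To be clear: the post proposes no actual numbers, so every bracket and rate below is invented purely for illustration.

```python
# Hypothetical sketch of a revenue-tiered minimum wage.
# The brackets and rates are made up for illustration; the post
# suggests the mechanism without specifying any figures.

def tiered_minimum_wage(annual_revenue: float) -> float:
    """Return an hourly wage floor based on a company's annual revenue."""
    brackets = [
        (500_000, 7.25),        # small businesses keep the current federal floor
        (10_000_000, 10.00),    # mid-sized companies pay a higher floor
        (float("inf"), 15.00),  # the largest earners pay the highest floor
    ]
    for ceiling, wage in brackets:
        if annual_revenue <= ceiling:
            return wage
    return brackets[-1][1]

print(tiered_minimum_wage(250_000))      # a corner bakery
print(tiered_minimum_wage(50_000_000))   # a big-box chain
```

The design intent matches the paragraph: the wage floor binds hardest on the companies most able to pay it, and leaves the smallest employers closest to the status quo.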

Money has to come from somewhere.

We cannot just wave a magic wand and give people more money without consequences. Most businesses are going to cope with paying their employees more by raising their prices. Cost of living goes up, people will want to be paid more, and the cycle continues. Now certainly Wal-Mart can afford to pay its employees more than Frankie Joe’s Pizza Palace or Mama Jane’s bakery, but Jane and Joe won’t be able to keep up with an increasing minimum wage without raising their prices significantly and probably losing business to Wal-Mart.
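The cycle described above (wages rise, prices rise, which prompts the next round of wage demands) can be shown as a toy iteration. The pass-through rate here is an arbitrary assumption for illustration, not an empirical estimate of how real firms behave.

```python
# Toy model of the wage-price feedback loop described above.
# pass_through is a made-up parameter: what fraction of a 10% wage
# hike firms push into prices (and workers then demand back in wages).

def wage_price_spiral(wage, price, rounds, pass_through=0.5):
    """Simulate a few rounds of the wage-price cycle; returns (wage, price) per round."""
    history = []
    for _ in range(rounds):
        price *= 1 + pass_through * 0.10  # firms pass part of the hike into prices
        wage *= 1 + pass_through * 0.10   # workers demand wages that match living costs
        history.append((round(wage, 2), round(price, 2)))
    return history

for w, p in wage_price_spiral(wage=10.0, price=100.0, rounds=3):
    print(f"wage ${w}/hr, price index {p}")
```

Even in this crude sketch, neither side ends up ahead: wages and prices climb in lockstep, which is the "cycle continues" point the paragraph makes.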

PS: There always has been and always will be “income inequality”. Comparing the job of a CEO of a corporation to flipping burgers like they’re essentially the same and should be paid the same way is ridiculous. I don’t care how butthurt this makes you: Some jobs are worth more than others. We will always pay doctors more than we pay taxi drivers. Scientists and astronauts will always make more than the guy who buffs your car. Get over it. 

Why It’s Hard to Take Men’s Fashion Magazines Seriously

This New York Times Magazine piece features a brooding model wearing some beautiful leather jackets in a banal suburban setting. OK. Not a problem; even kind of cool. The average price of the five jackets shown? $5,980.60. And it’s not a number thrown off by an outlier; the bargain Coach jacket above is the cheapest at just under $1k.

According to the Bureau of Labor Statistics, the average American family spent $1,736 on clothing in 2012 (the subset of “men and boys” spent $408, a figure that may make PTO readers blush).
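Putting the two figures quoted above side by side makes the disconnect concrete. The numbers are taken directly from the post (the NYT Magazine jacket average and the 2012 BLS figure); the arithmetic is just a ratio.

```python
# Back-of-the-envelope arithmetic using the figures quoted in the post.

avg_jacket_price = 5980.60   # average price of the five featured jackets
mens_annual_spend = 408.00   # 2012 BLS average spend for "men and boys"

ratio = avg_jacket_price / mens_annual_spend
print(f"One average featured jacket costs {ratio:.1f}x "
      f"the average man's yearly clothing budget.")
```

In other words, a single featured jacket costs roughly fifteen times what the average American man spends on clothing in an entire year.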

I know that we, too, regularly recommend clothing with costs that would dwarf the average man’s yearly apparel budget, and we gawk admiringly at the wardrobes of the superwealthy, but the gulf between realistic spending habits and the cost of the clothing regularly featured in the pages of magazines raises the questions: what’s the point? To look at beautiful things, artfully arranged? To show us what’s current in loftier circles that we might aspire to? To placate advertisers? And where do we each choose to draw the line between what’s OK to spend vs. what is ridiculous?

I’d argue this economic disconnect is what drives men to seek out more value-oriented sources of information like PTO, independent blogs, and clothing forums.


Photo by Matthew Kristall.